Science.gov

Sample records for automatic vol analysis

  1. Automatic Microwave Network Analysis.

    DTIC Science & Technology

    A program and procedure are developed for the automatic measurement of microwave networks using a Hewlett-Packard network analyzer and programmable calculator. The program and procedure are used in the measurement of a simple microwave two-port network. These measurements are evaluated by comparison with measurements on the same network using other techniques. The programs...in the programmable calculator are listed in Appendix 1. The step-by-step procedure used is listed in Appendix 2. (Author)

  2. Automatic cephalometric analysis.

    PubMed

    Leonardi, Rosalia; Giordano, Daniela; Maiorana, Francesco; Spampinato, Concetto

    2008-01-01

    To describe the techniques used for automatic landmarking of cephalograms, highlighting the strengths and weaknesses of each one and reviewing the percentage of success in locating each cephalometric point. The literature survey was performed by searching the Medline, Institute of Electrical and Electronics Engineers, and ISI Web of Science Citation Index databases. The survey covered the period from January 1966 to August 2006. Abstracts that appeared to fulfill the initial selection criteria were selected by consensus. The original articles were then retrieved. Their references were also hand-searched for possible missing articles. The search strategy yielded 118 articles, of which eight met the inclusion criteria. Many articles were rejected for different reasons; among these, the most frequent was that results of accuracy for automatic landmark recognition were presented as a percentage of success. A marked difference in results was found between the included studies, consisting of heterogeneity in the performance of techniques detecting the same landmark. Overall, hybrid approaches detected cephalometric points with higher accuracy than the model-based, image-filtering-plus-knowledge-based landmark search, and "soft-computing" approaches. The systems described in the literature are not accurate enough to allow their use for clinical purposes. Errors in landmark detection were greater than those expected with manual tracing and, therefore, the scientific evidence supporting the use of automatic landmarking is low.

  3. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
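
    The paper's computations use INTLAB under MATLAB; as a rough illustration of the idea, the sketch below implements a toy interval type in Python and propagates measurement uncertainty through a formula. The Interval class and the example bounds are ours, not the authors'.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    @property
    def half_width(self):
        # Half the interval width bounds the worst-case propagated error.
        return (self.hi - self.lo) / 2

# Measured quantities with +/- tolerances become intervals:
x = Interval(2.0 - 0.01, 2.0 + 0.01)
y = Interval(3.0 - 0.02, 3.0 + 0.02)

# Evaluate the formula once in interval arithmetic; no derivative bookkeeping
# (as in standard error propagation) is needed.
z = x * y + x
print(z, z.half_width)
```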

  5. Automatic Linguistic Analysis.

    ERIC Educational Resources Information Center

    Coker, Pamela L.; Underwood, Mark A.

    Computer programs for linguistic analysis of language samples from bilingual children were surveyed in order to evaluate their usefulness. Eight programs which could be implemented on the UCLA IBM 370/3033 computer were considered. It was determined that the Computer Assisted Language Analysis System was the most promising in terms of capabilities…

  6. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

    Two problems must be solved if the finite element method is to become a reliable and affordable black-box engineering tool. Finite element meshes must be generated automatically from computer-aided design databases, and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  7. Integrating Automatic Genre Analysis into Digital Libraries.

    ERIC Educational Resources Information Center

    Rauber, Andreas; Muller-Kogler, Alexander

    With the number and types of documents in digital library systems increasing, tools for automatically organizing and presenting the content have to be found. While many approaches focus on topic-based organization and structuring, hardly any system incorporates automatic structural analysis and representation. Yet, genre information…

  8. Show me: automatic presentation for visual analysis.

    PubMed

    Mackinlay, Jock; Hanrahan, Pat; Stolte, Chris

    2007-01-01

    This paper describes Show Me, an integrated set of user interface commands and defaults that incorporate automatic presentation into a commercial visual analysis system called Tableau. A key aspect of Tableau is VizQL, a language for specifying views, which is used by Show Me to extend automatic presentation to the generation of tables of views (commonly called small multiple displays). A key research issue for the commercial application of automatic presentation is the user experience, which must support the flow of visual analysis. User experience has not been the focus of previous research on automatic presentation. The Show Me user experience includes the automatic selection of mark types, a command to add a single field to a view, and a pair of commands to build views for multiple fields. Although the use of these defaults and commands is optional, user interface logs indicate that Show Me is used by commercial users.

  9. Automatic emotional expression analysis from eye area

    NASA Astrophysics Data System (ADS)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed with artificial intelligence techniques. In the experimental studies, the 6 universal emotions (happiness, sadness, surprise, disgust, anger, and fear) were classified at a success rate of 84% using artificial neural networks.
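
    As a rough sketch of the pipeline described above, the following uses PyWavelets for the discrete wavelet transform and a small scikit-learn neural network as the classifier; the subband statistics and network size are illustrative assumptions, not the authors' exact parameters.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def eye_features(eye_patch):
    """Single-level 2-D DWT of a grayscale eye patch -> subband statistics."""
    cA, (cH, cV, cD) = pywt.dwt2(eye_patch, "db2")
    # Mean and standard deviation of each subband as the feature vector.
    return np.array([f(b) for b in (cA, cH, cV, cD) for f in (np.mean, np.std)])

# Hypothetical training data: one cropped eye patch per labeled emotion.
rng = np.random.default_rng(0)
patches = [rng.random((32, 64)) for _ in range(60)]
labels = rng.integers(0, 6, size=60)  # the 6 universal emotions

X = np.stack([eye_features(p) for p in patches])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)
print(clf.predict(X[:3]))
```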

  10. Automatic tools for microprocessor failure analysis

    NASA Astrophysics Data System (ADS)

    Conard, Didier; Laurent, J.; Velazco, Raoul; Ziade, Haissam; Cabestany, J.; Sala, F.

    A new approach for fault location when testing microprocessors is presented. The starting point for the backtracing analysis that converges to the failure is the automatic localization of a reduced area. Automatic image comparison based on pattern recognition is performed by means of an electron beam tester. The developed hardware and software tools allow large circuit areas to be covered, offering powerful diagnosis capabilities to the user. The validation of this technique was performed on faulty 68000 microprocessors. It shows the feasibility of automating the first and most important step of failure analysis: fault location at the chip surface.

  11. Automatic photointerpretation via texture and morphology analysis

    NASA Technical Reports Server (NTRS)

    Tou, J. T.

    1982-01-01

    Computer-based techniques for automatic photointerpretation based upon information derived from texture and morphology analysis of images are discussed. By automatic photointerpretation is meant the determination by computer of semantic descriptions of the content of images. To perform semantic analysis of morphology, a hierarchical structure of knowledge representation was developed. The simplest elements in a morphology are strokes, which are used to form alphabets. The alphabets are the elements for generating words, which are used to describe the function or property of an object or a region. The words are the elements for constructing sentences, which are used for semantic description of the content of the image. Photointerpretation based upon morphology is then augmented by textural information. Textural analysis is performed using a pixel-vector approach.

  12. Automatic subsystem identification in statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Díaz-Cereceda, Cristina; Poblet-Puig, Jordi; Rodríguez-Ferran, Antonio

    2015-03-01

    An automatic methodology for identifying SEA (statistical energy analysis) subsystems within a vibroacoustic system is presented. It consists in dividing the system into cells and grouping them into subsystems via a hierarchical cluster analysis based on the problem eigenmodes. The subsystem distribution corresponds to the optimal grouping of the cells, which is defined in terms of the correlation distance between them. The main advantages of this methodology are its automatic performance and its applicability both to vibratory and vibroacoustic systems. Moreover, the method allows the definition of more than one subsystem in the same geometrical region when required. This is the case of eigenmodes with a very different mechanical response (e.g. out-of-plane or in-plane vibration in shells).
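
    A minimal sketch of the clustering step, assuming each cell is described by its modal energy across the problem eigenmodes and grouped with SciPy's hierarchical clustering under a correlation distance; the data and cut threshold below are invented for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical data: 40 cells x 12 eigenmodes of per-cell modal energy.
rng = np.random.default_rng(1)
modal_energy = np.abs(rng.normal(size=(40, 12)))

# Correlation distance: cells whose energy varies the same way across the
# modes are "close" and therefore end up in the same SEA subsystem.
Z = linkage(modal_energy, method="average", metric="correlation")
subsystem_id = fcluster(Z, t=0.5, criterion="distance")
print(subsystem_id)  # one subsystem label per cell
```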

  13. Automatic Prosodic Analysis to Identify Mild Dementia

    PubMed Central

    Gonzalez-Moreira, Eduardo; Torres-Boza, Diana; Kairuz, Héctor Arturo; Ferrer, Carlos; Garcia-Zamora, Marlene; Espinoza-Cuadros, Fernando; Hernandez-Gómez, Luis Alfonso

    2015-01-01

    This paper describes an exploratory technique to identify mild dementia by assessing the degree of speech deficits. A total of twenty participants took part in this experiment: ten patients with a diagnosis of mild dementia and ten healthy controls. The audio session for each subject was recorded following a methodology developed for the present study. Prosodic features in patients with mild dementia and healthy elderly controls were measured using automatic prosodic analysis on a reading task. A novel method was used to gather twelve prosodic features from the speech samples. The best classification rate achieved was 85% accuracy using four prosodic features. The results show that the proposed computational speech analysis offers a viable alternative for automatic identification of dementia features in elderly adults. PMID:26558287
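
    The twelve prosodic features are not itemized in the abstract; as a hedged sketch, the following extracts a few standard prosodic measures from a recording with librosa. The file name and feature choices are hypothetical.

```python
import numpy as np
import librosa

y, sr = librosa.load("reading_task.wav", sr=16000)  # hypothetical recording

# Frame-wise fundamental frequency via probabilistic YIN.
f0, voiced, _ = librosa.pyin(y, fmin=60, fmax=400, sr=sr)
f0 = f0[~np.isnan(f0)]

features = {
    "f0_mean": float(np.mean(f0)),
    "f0_std": float(np.std(f0)),              # pitch variability
    "voiced_ratio": float(np.mean(voiced)),   # proportion of voiced frames
    # Onsets per second as a crude speech-rate proxy.
    "rate_proxy": len(librosa.onset.onset_detect(y=y, sr=sr)) / (len(y) / sr),
}
print(features)
```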

  14. Automatic analysis of D-partition

    NASA Astrophysics Data System (ADS)

    Bogaevskaya, V. G.

    2017-01-01

    The paper is dedicated to the automation of D-partition analysis. D-partition is one of the most common methods for determining solution stability in systems with time-delayed feedback control and its dependency on the values of control parameters. A transition from the analytical form of the D-partition to a plane graph has been investigated. An algorithm has been developed for determining the graph faces and counting the characteristic-equation roots with positive real part for the corresponding region of the D-partition. The algorithm keeps information about the analytical formulas for the edges of faces, which allows further analytical research based on the results of the computer analysis.

  15. Research on automatic human chromosome image analysis

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Tian, Jinwen; Liu, Jian

    2007-11-01

    Human chromosome karyotyping is one of the essential tasks in cytogenetics, especially in genetic syndrome diagnoses. In this paper, an automatic procedure is introduced for human chromosome image analysis. According to the different states of touching and overlapping chromosomes, several segmentation methods are proposed to achieve the best results. The medial axis is extracted by the middle-point algorithm. The chromosome band pattern is enhanced by an algorithm based on multiscale B-spline wavelets; band features are extracted from the average gray profile, gradient profile, and shape profile, and described by WDD (weighted density distribution) descriptors. A multilayer classifier is used for classification. Experimental results demonstrate that the algorithms perform well.
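
    As an illustration of the medial-axis step, the sketch below uses scikit-image's medial_axis on a thresholded chromosome mask; the paper's own middle-point algorithm, wavelet band enhancement, and WDD descriptors are not reproduced.

```python
import numpy as np
from skimage.morphology import medial_axis

mask = np.zeros((64, 64), dtype=bool)
mask[10:54, 28:36] = True                # crude stand-in for one chromosome

skeleton, distance = medial_axis(mask, return_distance=True)
# The skeleton approximates the chromosome's medial axis; the distance map
# along it gives the local half-width used when sampling band profiles.
print(skeleton.sum(), float(distance[skeleton].mean()))
```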

  16. Semi-automatic analysis of fire debris

    PubMed

    Touron; Malaquin; Gardebas; Nicolai

    2000-05-08

    Automated analysis of fire residues involves a strategy that deals with the wide variety of criminalistic samples received. Because the concentration of accelerant in a sample is unknown and the range of flammable products is wide, full attention from the analyst is required. Primary detection with a photoionization detector resolves the first problem by determining the right method to use: either the less sensitive classical head-space determination or adsorption on an activated-charcoal tube, a method better suited to low concentrations. The latter method is suitable for automatic thermal desorption (ATD400), avoiding any risk of cross contamination. A PONA column (50 m x 0.2 mm i.d.) allows the separation of volatile hydrocarbons from C(1) to C(15) and the updating of a database. A specific second column is used for heavy hydrocarbons. Heavy products (C(13) to C(40)) were extracted from residues using a very small amount of pentane, concentrated to 1 ml at 50 degrees C, and then placed on an automatic carousel. Comparison of flammables with reference chromatograms provided the expected identification, aided when necessary by mass spectrometry. This analytical strategy belongs to the IRCGN quality program, resulting in the analysis of 1500 samples per year by two technicians.

  17. A micropolariscope for automatic stress analysis

    NASA Astrophysics Data System (ADS)

    Fessler, H.; Marston, R. E.; Ollerton, E.

    1987-01-01

    A micropolariscope has been developed for the automatic analysis of photoelastic data. It will position frozen stress slices mounted on its stage to within ±0.002 mm and take readings of isoclinic angles and fractional fringe orders, repeatable to within ±0.08 degrees and ±0.001 fringes. A rectangular grid of up to 3 x 50 points can be read automatically, taking about 1.25 minutes per point; the readings are stored on a floppy disc and printed out. The original slice is itself sliced, and the subslice is viewed again in the orthogonal direction to produce a second set of readings. Software has been devised to analyze the two sets of readings. It makes use of Tesar's (1933) modification of the Frocht and Guernsey (1952) shear difference method to calculate five Cartesian stresses, which may be plotted and printed in tabular form. Flexible facilities are provided for editing, correcting, plotting, and printing intermediate stages in the analysis, and for storing results in data files.

  18. Automatic analysis of speckle photography fringes.

    PubMed

    Buendía, M; Cibrián, R; Salvador, R; Roldán, C; Iñesta, J M

    1997-04-10

    Speckle interferometry is a technique well suited to metrological problems such as the measurement of object deformation. An automatic system for analyzing such measurements is presented; it consists of a motorized x-y plate positioner controlled by computer, a CCD video camera, and software for image analysis. A fringe-recognition algorithm determines the spacing and orientation of the fringes and permits calculation of the magnitude and direction of the displacement of the analyzed object point in images with variable degrees of illumination. For a 256 x 256 pixel image resolution, the procedure allows one to analyze from three fringes up to a number of fringes corresponding to 3 pixels/fringe.

  19. Automatic dirt trail analysis in dermoscopy images.

    PubMed

    Cheng, Beibei; Joe Stanley, R; Stoecker, William V; Osterwise, Christopher T P; Stricklin, Sherea M; Hinton, Kristen A; Moss, Randy H; Oliviero, Margaret; Rabinovitz, Harold S

    2013-02-01

    Basal cell carcinoma (BCC) is the most common cancer in the US. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved a 0.902 area under a receiver operating characteristic curve using a leave-one-out approach. Results obtained from this study show that automatic detection of dirt trails in dermoscopic images of BCC is feasible. This is important because of the large number of these skin cancers seen every year and the challenge of detecting them earlier with instrumentation. © 2011 John Wiley & Sons A/S.

  20. Automatic cortical thickness analysis on rodent brain

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek

    2011-03-01

    Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the difference in scale; even adult rodent brains are 50 to 100 times smaller than human brains. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step provides the neocortex separation from other brain structures and thus is a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm, which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate-spline-based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood, performing t-tests between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
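
    A minimal 2-D sketch of the Laplace-based thickness measurement, assuming binary masks for the cortex interior and its inner and outer boundaries; the paper works in 3-D and adds a transport equation plus particle correspondence, none of which is reproduced here.

```python
import numpy as np

def laplace_thickness(cortex, inner, outer, n_iter=2000, h=0.5, max_len=100.0):
    """cortex/inner/outer: boolean masks of equal shape.
    Returns one streamline length per inner-boundary pixel."""
    u = np.where(outer, 1.0, 0.0)
    for _ in range(n_iter):  # Jacobi relaxation of Laplace's equation
        nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
              + np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        u = np.where(cortex, nb, u)  # relax interior points only
    gy, gx = np.gradient(u)
    norm = np.hypot(gx, gy) + 1e-12
    ty, tx = gy / norm, gx / norm  # unit field along "cortical columns"

    def at(a, y, x):  # clipped nearest-neighbour lookup
        return a[min(max(int(round(y)), 0), a.shape[0] - 1),
                 min(max(int(round(x)), 0), a.shape[1] - 1)]

    lengths = []
    for y, x in np.argwhere(inner).astype(float):
        L = 0.0
        while at(u, y, x) < 0.99 and L < max_len:  # march up the gradient
            y, x, L = y + h * at(ty, y, x), x + h * at(tx, y, x), L + h
        lengths.append(L)
    return np.array(lengths)

# Concentric-ring demo: a "cortex" between radii 10 and 20 pixels.
yy, xx = np.mgrid[0:64, 0:64]
r = np.hypot(yy - 32, xx - 32)
cortex, inner, outer = (r >= 10) & (r < 20), (r >= 9) & (r < 10), r >= 20
print(laplace_thickness(cortex, inner, outer).mean())  # close to 10
```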

  1. Application of automatic image analysis in wood science

    Treesearch

    Charles W. McMillin

    1982-01-01

    In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...

  2. AUTOMATIC DIRT TRAIL ANALYSIS IN DERMOSCOPY IMAGES

    PubMed Central

    Cheng, Beibei; Stanley, R. Joe; Stoecker, William V.; Osterwise, Christopher T.P.; Stricklin, Sherea M.; Hinton, Kristen A.; Moss, Randy H.; Oliviero, Margaret; Rabinovitz, Harold S.

    2011-01-01

    Basal cell carcinoma (BCC) is the most common cancer in the U.S. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved a 0.902 area under a receiver operating characteristic curve using a leave-one-out approach, demonstrating the potential of dirt trails for BCC lesion discrimination. PMID:22233099

  3. Automatic analysis of the corneal ulcer

    NASA Astrophysics Data System (ADS)

    Ventura, Liliane; Chiaradia, Caio; Faria de Sousa, Sidney J.

    1999-06-01

    A very common disease in agricultural countries is the corneal ulcer. Particularly in public hospitals, several patients present with this kind of pathology every week. One of the most important features for diagnosing the regression of the disease is determining the shrinkage of the affected area. An automatic system (optical system and software), attached to a slit lamp, has been developed to determine the area of the ulcer automatically and to follow up its regression. The clinical procedure to isolate the ulcer is still done, but the measurement is fast enough not to cause the discomfort to the patient that the traditional evaluation does. The system has been used for the last 6 months in a hospital that sees about 80 patients with corneal ulcer per week. Patient follow-up (an indispensable criterion for curing the disease) has been improved by the system and has helped guarantee treatment success.

  4. Automatic basal slice detection for cardiac analysis

    NASA Astrophysics Data System (ADS)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step in measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods at times identify the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice is detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. Of the 51 tested samples, 92% and 84% of detection results were accurate at the end-systolic and end-diastolic phases of the cardiac cycle, respectively.

  5. Semi-automatic aortic aneurysm analysis

    NASA Astrophysics Data System (ADS)

    Bodur, Osman; Grady, Leo; Stillman, Arthur; Setser, Randolph; Funka-Lea, Gareth; O'Donnell, Thomas

    2007-03-01

    Aortic aneurysms are the 13th leading cause of death in the United States. In standard clinical practice, assessing the progression of disease in the aorta, as well as the risk of aneurysm rupture, is based on measurements of aortic diameter. We propose an accurate and fast method for automatically segmenting the aortic vessel border and calculating aortic diameters on CTA acquisitions, allowing clinicians more time for their evaluations. While segmentation of the aortic lumen is straightforward in CTA, segmentation of the outer vessel wall (epithelial layer) in a diseased aorta is difficult; furthermore, no clinical tool currently exists to perform this task. The difficulties are due to the similarity in intensity of surrounding tissue (and of thrombus, due to lack of contrast agent uptake), as well as complications from bright calcium deposits. Our overall method makes use of a centerline to resample the image volume into slices orthogonal to the vessel path. This centerline is computed semi-automatically via a distance transform. The difficult task of automatically segmenting the aortic border on the orthogonal slices is performed via a novel variation of the isoperimetric algorithm that incorporates circular constraints (priors). Our method is embodied in a prototype which allows the loading and registration of two datasets simultaneously, facilitating longitudinal comparisons. Both the centerline and border segmentation algorithms were evaluated on four patients, each with two volumes acquired 6 months to 1.5 years apart, for a total of eight datasets. Results showed good agreement with clinicians' findings.

  6. Functional analysis screening for problem behavior maintained by automatic reinforcement.

    PubMed

    Querim, Angie C; Iwata, Brian A; Roscoe, Eileen M; Schlichenmeyer, Kevin J; Ortega, Javier Virués; Hurl, Kylee E

    2013-01-01

    A common finding in previous research is that problem behavior maintained by automatic reinforcement continues to occur in the alone condition of a functional analysis (FA), whereas behavior maintained by social reinforcement typically is extinguished. Thus, the alone condition may represent an efficient screening procedure when maintenance by automatic reinforcement is suspected. We conducted a series of 5-min alone (or no-interaction) probes for 30 cases of problem behavior and compared initial predictions of maintenance or extinction to outcomes obtained in subsequent FAs. Results indicated that data from the screening procedure accurately predicted that problem behavior was maintained by automatic reinforcement in 21 of 22 cases and by social reinforcement in 7 of 8 cases. Thus, results of the screening accurately predicted the function of problem behavior (social vs. automatic reinforcement) in 28 of 30 cases. © Society for the Experimental Analysis of Behavior.

  7. Automatism

    PubMed Central

    McCaldon, R. J.

    1964-01-01

    Individuals can carry out complex activity while in a state of impaired consciousness, a condition termed “automatism”. Consciousness must be considered from both an organic and a psychological aspect, because impairment of consciousness may occur in both ways. Automatism may be classified as normal (hypnosis), organic (temporal lobe epilepsy), psychogenic (dissociative fugue) or feigned. Often painstaking clinical investigation is necessary to clarify the diagnosis. There is legal precedent for assuming that all crimes must embody both consciousness and will. Jurists are loath to apply this principle without reservation, as this would necessitate acquittal and release of potentially dangerous individuals. However, with the sole exception of the defence of insanity, there is at present no legislation to prohibit release without further investigation of anyone acquitted of a crime on the grounds of “automatism”. PMID:14199824

  8. Automatic ionospheric layers detection: Algorithms analysis

    NASA Astrophysics Data System (ADS)

    Molina, María G.; Zuccheretti, Enrico; Cabrera, Miguel A.; Bianchi, Cesidio; Sciacca, Umberto; Baskaradas, James

    2016-03-01

    Vertical sounding is a widely used technique to obtain ionosphere measurements, such as an estimate of virtual height versus scanning frequency. It is performed by a high-frequency radar for geophysical applications called an "ionospheric sounder" (or "ionosonde"). Radar detection depends mainly on target characteristics. While the behavior of several targets and the corresponding echo-detection algorithms have been studied, a survey to identify a suitable algorithm for the ionospheric sounder still has to be carried out. This paper focuses on automatic echo-detection algorithms implemented specifically for an ionospheric sounder; target-specific characteristics were studied as well. Adaptive threshold detection algorithms are proposed, compared to the currently implemented algorithm, and tested using actual data obtained from the Advanced Ionospheric Sounder (AIS-INGV) at the Rome Ionospheric Observatory. Different cases of study were selected according to typical ionospheric and detection conditions.
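
    As a hedged sketch of an adaptive threshold detector, the following implements a cell-averaging, CFAR-style test on one synthetic range profile; the window sizes and scale factor are illustrative, not the values used with the AIS-INGV data.

```python
import numpy as np

def adaptive_detect(power, guard=2, train=8, scale=4.0):
    """power: 1-D received power vs. range gate. Returns a detection mask."""
    hits = np.zeros(len(power), dtype=bool)
    for i in range(len(power)):
        # Training cells on both sides, excluding guard cells around the
        # cell under test; their mean estimates the local noise floor.
        left = power[max(0, i - guard - train): max(0, i - guard)]
        right = power[i + guard + 1: i + guard + 1 + train]
        noise = np.concatenate([left, right])
        if noise.size and power[i] > scale * noise.mean():
            hits[i] = True
    return hits

rng = np.random.default_rng(2)
profile = rng.exponential(1.0, 500)  # noise-like background
profile[120] += 40.0                 # synthetic ionospheric echo
print(np.flatnonzero(adaptive_detect(profile)))  # expect gate 120
```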

  9. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adopted as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient, and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  10. A hierarchical structure for automatic meshing and adaptive FEM analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Saxena, Mukul; Perucchio, Renato

    1987-01-01

    A new algorithm is discussed for automatically generating, from solid models of mechanical parts, finite element meshes organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work). Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized, and some results are presented from an experimental closed-loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively. The implementation of 3-D work is briefly discussed.
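
    A minimal sketch of a spatially addressable quaternary tree of the kind the meshes are organized as, with a user-supplied predicate standing in for the solid-model and error-estimate queries that drive refinement in the actual system.

```python
from dataclasses import dataclass, field

@dataclass
class QuadCell:
    x: float
    y: float
    size: float
    depth: int
    children: list = field(default_factory=list)

    def refine(self, needs_refinement, max_depth=6):
        """Recursively split cells flagged by the predicate."""
        if self.depth < max_depth and needs_refinement(self):
            half = self.size / 2
            self.children = [
                QuadCell(self.x + dx * half, self.y + dy * half, half,
                         self.depth + 1)
                for dx in (0, 1) for dy in (0, 1)]
            for child in self.children:
                child.refine(needs_refinement, max_depth)

    def leaves(self):
        """Leaf cells are the elements handed to the FE analysis."""
        if not self.children:
            yield self
        else:
            for child in self.children:
                yield from child.leaves()

# Refine toward a feature of interest, e.g. a corner singularity at (0, 0).
root = QuadCell(0.0, 0.0, 1.0, 0)
root.refine(lambda c: (c.x ** 2 + c.y ** 2) ** 0.5 < c.size)
print(sum(1 for _ in root.leaves()), "leaf elements")
```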

  11. Environment for the automatic manipulation and analysis of morphological expressions

    NASA Astrophysics Data System (ADS)

    Richardson, Craig H.; Schafer, Ronald W.

    1990-11-01

    This paper describes a LISP-based environment for the automatic manipulation and analysis of morphological expressions. The foundation of this environment is an aggregation of morphological knowledge that includes signal and system property information, rule bases for representing morphological relationships, and inferencing mechanisms for using this collection of knowledge. The layers surrounding this foundation include representations of abstract signal and structuring-element classes as well as actual structuring elements, implementations of the morphological operators, and the ability to optimally decompose structuring elements (structels). The representational requirements for automatically manipulating expressions and determining their computational cost are described, and the capabilities of the environment are illustrated by examples of symbolic manipulation and expression analysis.

  12. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments, and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in Perl, C shell, and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  13. Neural network architecture for automatic chromosome analysis

    NASA Astrophysics Data System (ADS)

    Diez-Higuera, Jose F.; Diaz-Pernas, F. J.; Lopez-Coronado, Juan

    1996-03-01

    We are interested in designing a neural network system for automatic chromosome analysis. The goal of this approach is to make the chromosome regions more salient and more interpretable to skilled human technicians than they are in the original imagery. The proposed segmentation model is based upon the biologically derived boundary contour system (BCS) of Grossberg and Mingolla. The practical application of the model to real images raises an important problem: the boundaries generated by the BCS have a sizable thickness that is a function of the contrast gradient between two adjacent regions. In order to solve this problem, we propose the use of feedback diffusion. The image resulting from the diffusion is fed back to the simple cell layer. Furthermore, the boundary representation is also fed back to the boundary segmentation stage. In this way, the boundaries adapt to the variations produced by the feedback diffusion, achieving a gradual boundary thinning. We also propose a modified diffusive filling-in equation for obtaining better results in homogeneous regions, since the behavior of the Grossberg-Todorovic equation limits the homogenization of the regions contained inside the boundaries. To this end, we introduce a new parameter, rho, called the recovery parameter, which regulates the margin of variation of a node's activity with respect to its initial value. With a value of rho near zero, the resulting regions present a flat surface, making the separation of chromosome bands easy.

  14. Profiling School Shooters: Automatic Text-Based Analysis

    PubMed Central

    Neuman, Yair; Assaf, Dan; Cohen, Yochai; Knoll, James L.

    2015-01-01

    School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their various characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by 6 school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters’ texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology. PMID:26089804

  15. Automatic functional analysis of left ventricle in cardiac cine MRI.

    PubMed

    Lu, Ying-Li; Connelly, Kim A; Dick, Alexander J; Wright, Graham A; Radau, Perry E

    2013-08-01

    A fully automated left ventricle segmentation method for the functional analysis of cine short axis (SAX) magnetic resonance (MR) images was developed, and its performance evaluated with 133 studies of subjects with diverse pathology: ischemic heart failure (n=34), non-ischemic heart failure (n=30), hypertrophy (n=32), and healthy (n=37). The proposed automatic method locates the left ventricle (LV), then for each image detects the contours of the endocardium, epicardium, papillary muscles and trabeculations. Manually and automatically determined contours and functional parameters were compared quantitatively. There was no significant difference between automatically and manually determined end systolic volume (ESV), end diastolic volume (EDV), ejection fraction (EF) and left ventricular mass (LVM) for each of the four groups (paired sample t-test, α=0.05). The automatically determined functional parameters showed high correlations with those derived from manual contours, and the Bland-Altman analysis biases were small (1.51 mL, 1.69 mL, -0.02%, -0.66 g for ESV, EDV, EF and LVM, respectively). The proposed technique automatically and rapidly detects endocardial, epicardial, papillary muscles' and trabeculations' contours providing accurate and reproducible quantitative MRI parameters, including LV mass and EF.

  16. Automatic functional analysis of left ventricle in cardiac cine MRI

    PubMed Central

    Lu, Ying-Li; Connelly, Kim A.; Dick, Alexander J.; Wright, Graham A.

    2013-01-01

    Rationale and objectives A fully automated left ventricle segmentation method for the functional analysis of cine short axis (SAX) magnetic resonance (MR) images was developed, and its performance evaluated with 133 studies of subjects with diverse pathology: ischemic heart failure (n=34), non-ischemic heart failure (n=30), hypertrophy (n=32), and healthy (n=37). Materials and methods The proposed automatic method locates the left ventricle (LV), then for each image detects the contours of the endocardium, epicardium, papillary muscles and trabeculations. Manually and automatically determined contours and functional parameters were compared quantitatively. Results There was no significant difference between automatically and manually determined end systolic volume (ESV), end diastolic volume (EDV), ejection fraction (EF) and left ventricular mass (LVM) for each of the four groups (paired sample t-test, α=0.05). The automatically determined functional parameters showed high correlations with those derived from manual contours, and the Bland-Altman analysis biases were small (1.51 mL, 1.69 mL, –0.02%, –0.66 g for ESV, EDV, EF and LVM, respectively). Conclusions The proposed technique automatically and rapidly detects endocardial, epicardial, papillary muscles’ and trabeculations’ contours providing accurate and reproducible quantitative MRI parameters, including LV mass and EF. PMID:24040616

  17. Automatic identification of reticular pseudodrusen using multimodal retinal image analysis.

    PubMed

    van Grinsven, Mark J J P; Buitendijk, Gabriëlle H S; Brussee, Corina; van Ginneken, Bram; Hoyng, Carel B; Theelen, Thomas; Klaver, Caroline C W; Sánchez, Clara I

    2015-01-08

    To examine human performance and agreement on reticular pseudodrusen (RPD) detection and quantification by using single- and multimodality grading protocols and to describe and evaluate a machine learning system for the automatic detection and quantification of reticular pseudodrusen by using single- and multimodality information. Color fundus, fundus autofluorescence, and near-infrared images of 278 eyes from 230 patients with or without presence of RPD were used in this study. All eyes were scored for presence of RPD during single- and multimodality setups by two experienced observers and a developed machine learning system. Furthermore, automatic quantification of RPD area was performed by the proposed system and compared with human delineations. Observers obtained higher performance and better interobserver agreement for RPD detection with multimodality grading, achieving areas under the receiver operating characteristic (ROC) curve of 0.940 and 0.958, and a κ agreement of 0.911. The proposed automatic system achieved an area under the ROC of 0.941 with a multimodality setup. Automatic RPD quantification resulted in an intraclass correlation (ICC) value of 0.704, which was comparable with ICC values obtained between single-modality manual delineations. Observer performance and agreement for RPD identification improved significantly by using a multimodality grading approach. The developed automatic system showed similar performance as observers, and automatic RPD area quantification was in concordance with manual delineations. The proposed automatic system allows for a fast and accurate identification and quantification of RPD, opening the way for efficient quantitative imaging biomarkers in large data set analysis. Copyright 2015 The Association for Research in Vision and Ophthalmology, Inc.

  18. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  19. Trends of Science Education Research: An Automatic Content Analysis

    ERIC Educational Resources Information Center

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  1. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    PubMed

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
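
    As a rough sketch of the detection idea, the following estimates a heart rate from the dominant frequency of the mean-intensity trace of a cropped heart-region clip; the paper's full pipeline (embryo localization, beat-to-beat intervals, arrhythmicity) is not reproduced.

```python
import numpy as np

def heart_rate_bpm(frames, fps):
    """Dominant pulsation frequency of the mean-intensity trace, in beats/min."""
    trace = frames.reshape(len(frames), -1).mean(axis=1)
    trace = trace - trace.mean()          # remove the DC component
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    band = (freqs > 0.5) & (freqs < 6.0)  # plausible range: 30-360 bpm
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic 3 Hz "heartbeat" (180 bpm) in a 10 s clip at 30 fps:
t = np.arange(300) / 30.0
frames = np.ones((300, 8, 8)) + 0.1 * np.sin(2 * np.pi * 3.0 * t)[:, None, None]
print(heart_rate_bpm(frames, fps=30))     # ~180
```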

  2. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    PubMed Central

    Magalhaes, Fabrício A.; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia

    2013-01-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analyses that interest both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis. Key Points The availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports. An important feature of automatic tracking software is to require limited human intervention.
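
    DVP itself is not publicly specified here, but since it is based on the Kanade-Lucas-Tomasi tracker, a minimal OpenCV sketch of KLT marker tracking looks like the following; the video file and initial marker positions are hypothetical.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("video.avi")       # hypothetical recording
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Initial marker centers (x, y), e.g. from one manual annotation pass.
pts = np.array([[[320.0, 240.0]], [[400.0, 260.0]]], dtype=np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pyramidal Lucas-Kanade: follow each marker from the previous frame.
    pts_new, status, err = cv2.calcOpticalFlowPyrLK(
        prev, gray, pts, None, winSize=(21, 21), maxLevel=3)
    # An operator would correct points with status == 0 (lost tracks) here,
    # mirroring the manual-intervention step described in the study.
    pts, prev = pts_new, gray
    print(pts.reshape(-1, 2))             # marker coordinates per frame
cap.release()
```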

  3. Automatic analysis of double coronal mass ejections from coronagraph images

    NASA Astrophysics Data System (ADS)

    Jacobs, Matthew; Chang, Lin-Ching; Pulkkinen, Antti; Romano, Michelangelo

    2015-11-01

    Coronal mass ejections (CMEs) can have major impacts on man-made technology and humans, both in space and on Earth. These impacts have created a high interest in the study of CMEs in an effort to detect and track events and forecast the CME arrival time to provide time for proper mitigation. A robust automatic real-time CME processing pipeline is greatly desired to avoid laborious and subjective manual processing. Automatic methods have been proposed to segment CMEs from coronagraph images and estimate CME parameters such as their heliocentric location and velocity. However, existing methods suffered from several shortcomings such as the use of hard thresholding and an inability to handle two or more CMEs occurring within the same coronagraph image. Double-CME analysis is a necessity for forecasting the many CME events that occur within short time frames. Robust forecasts for all CME events are required to fully understand space weather impacts. This paper presents a new method to segment CME masses and pattern recognition approaches to differentiate two CMEs in a single coronagraph image. The proposed method is validated on a data set of 30 halo CMEs, with results showing comparable ability in transient arrival time prediction accuracy and the new ability to automatically predict the arrival time of a double-CME event. The proposed method is the first automatic method to successfully calculate CME parameters from double-CME events, making this automatic method applicable to a wider range of CME events.

  4. Development of an automatic identification algorithm for antibiogram analysis.

    PubMed

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) is proposed here to overcome some issues associated with the disc diffusion method; developing this algorithm is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested in susceptibility tests performed for 12 different antibiotics, for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA, and results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based readings. Agreement was observed in 88% of cases, and 89% of the tests showed no difference or a <4 mm difference between AIA and human analysis, exhibiting a correlation index of 0.85 for all images, 0.90 for standards, and 0.80 for oddities, with no significant difference between the automatic and manual methods. AIA resolved reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for the automated reading of antibiograms in diagnostic and microbiology laboratories.

  5. Automatic movie skimming with general tempo analysis

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Hung; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    In this research, story units are extracted by general tempo analysis, including the tempos of both audio and visual information. Although many schemes have been proposed to successfully segment video data into shots using basic low-level features, how to group shots into meaningful units called story units is still a challenging problem. By focusing on a certain type of video, such as sports or news, one can exploit models with specific application-domain knowledge. For movie content, many heuristic rules based on audiovisual clues have been proposed with limited success. We propose a method to extract story units using general tempo analysis. Experimental results are given to demonstrate the feasibility and efficiency of the proposed technique.

  6. Mytoe: automatic analysis of mitochondrial dynamics.

    PubMed

    Lihavainen, Eero; Mäkelä, Jarno; Spelbrink, Johannes N; Ribeiro, Andre S

    2012-04-01

    We present Mytoe, a tool for analyzing mitochondrial morphology and dynamics from fluorescence microscope images. The tool provides automated quantitative analysis of mitochondrial motion by optical flow estimation and of morphology by segmentation of individual branches of the network-like structure of the organelles. Mytoe quantifies several features of individual branches, such as length, tortuosity and speed, and of the macroscopic structure, such as mitochondrial area and degree of clustering. We validate the methods and apply them to the analysis of sequences of images of U2OS human cells with fluorescently labeled mitochondria. Availability: Source code, Windows software and manual are available at http://www.cs.tut.fi/%7Esanchesr/mito. Supplementary data are available at Bioinformatics online. Contact: eero.lihavainen@tut.fi; andre.ribeiro@tut.fi.

  7. The method of quantitative automatic metallographic analysis

    NASA Astrophysics Data System (ADS)

    Martyushev, N. V.; Skeeba, V. Yu

    2017-01-01

    A brief analysis of existing software for computer processing of microstructure photographs is presented, followed by a description of the software package developed by the author. This software product is intended for quantitative metallographic analysis of digital photographs of the microstructure of materials. It calculates the volume fraction and the average size of particles in the structure using several hundred secants (depending on the photograph's resolution) in one field of view. In addition, a special module built into the software assesses the degree to which the shapes of particles and impurities deviate from spherical. The article presents the main algorithms used in creating the software product and the formulae by which the software calculates the parameters of the microstructure. It is shown that the reliability of the calculations depends on the quality of preparation of the microstructure.
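
    A minimal sketch of the secant (lineal-intercept) measurement, assuming a pre-thresholded binary micrograph; the number of secants and the synthetic image are illustrative, and the author's actual shape-deviation module is not reproduced.

```python
import numpy as np

def secant_analysis(binary, n_secants=200):
    """Volume fraction and mean intercept length from horizontal secants."""
    rows = np.linspace(0, binary.shape[0] - 1, n_secants).astype(int)
    intercepts, hit_px = [], 0
    for r in rows:
        line = binary[r].astype(int)
        hit_px += line.sum()
        # Run lengths of consecutive particle pixels crossed by this secant.
        changes = np.flatnonzero(np.diff(np.pad(line, 1)))
        intercepts.extend(np.diff(changes)[::2])
    vol_fraction = hit_px / (len(rows) * binary.shape[1])
    mean_size = float(np.mean(intercepts)) if intercepts else 0.0
    return vol_fraction, mean_size

rng = np.random.default_rng(3)
img = rng.random((512, 512)) < 0.15   # synthetic ~15% second phase
print(secant_analysis(img))           # ~0.15 and a mean chord in pixels
```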

  8. Automatic recognition and analysis of synapses. [in brain tissue

    NASA Technical Reports Server (NTRS)

    Ungerleider, J. A.; Ledley, R. S.; Bloom, F. E.

    1976-01-01

    An automatic system for recognizing synaptic junctions would allow analysis of large samples of tissue for the possible classification of specific well-defined sets of synapses based upon structural morphometric indices. In this paper the three steps of our system are described: (1) cytochemical tissue preparation to allow easy recognition of the synaptic junctions; (2) transmitting the tissue information to a computer; and (3) analyzing each field to recognize the synapses and make measurements on them.

  10. Automatic Spot Identification for High Throughput Microarray Analysis

    PubMed Central

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

    High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noises. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
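
    As a hedged sketch of the correlation-and-convolution idea, the following matched-filters an image with a spot-sized disc template and takes local maxima as candidate spot centers; the template size and threshold are illustrative, not auto-spot's actual values.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def detect_spots(image, spot_radius=4, rel_threshold=0.5):
    yy, xx = np.mgrid[-spot_radius:spot_radius + 1, -spot_radius:spot_radius + 1]
    disc = (xx ** 2 + yy ** 2 <= spot_radius ** 2).astype(float)
    disc -= disc.mean()                   # zero-mean matched template
    score = fftconvolve(image - image.mean(), disc, mode="same")
    # Local maxima of the correlation score are candidate spot centers.
    peaks = score == maximum_filter(score, size=2 * spot_radius + 1)
    return np.argwhere(peaks & (score > rel_threshold * score.max()))

# Synthetic subgrid: a 4 x 4 lattice of Gaussian spots plus noise.
yy, xx = np.mgrid[0:100, 0:100]
img = np.random.default_rng(6).normal(0.0, 0.1, (100, 100))
for cy in range(15, 100, 24):
    for cx in range(15, 100, 24):
        img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 8.0)
print(detect_spots(img))                  # ~16 (row, col) centers
```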

  11. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.
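
    The core idea, symbolic differentiation of a strain energy function followed by emission of ready-to-paste code, fits in a few lines. The sketch below uses a 1-D Fung-type energy in sympy as an assumption-laden stand-in for the paper's MATHEMATICA generator and the full orthotropic UMAT.

```python
import sympy as sp

# Green strain E and material constants (names are illustrative).
E, a, c = sp.symbols("E a c", positive=True)
W = (c / 2) * (sp.exp(a * E**2) - 1)  # 1-D Fung-type strain energy
S = sp.diff(W, E)                     # 2nd Piola-Kirchhoff stress (1-D)
D = sp.diff(S, E)                     # material tangent, which ABAQUS also needs

print("stress  =", sp.simplify(S))
print("tangent =", sp.simplify(D))
print(sp.ccode(S))  # source text a generator would splice into a UMAT
```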

  12. Facilitator control as automatic behavior: A verbal behavior analysis

    PubMed Central

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is automatic when the speaker or writer is not stimulated by the behavior at the time of emission, the behavior is not edited, the products of behavior differ from what the person would produce normally, and the behavior is attributed to an outside source. All of these characteristics appear to be present in facilitator behavior. Other variables seem to account for the thematic content of the typed messages. These variables also are discussed. PMID:22477083

  13. Spectral analysis methods for automatic speech recognition applications

    NASA Astrophysics Data System (ADS)

    Parinam, Venkata Neelima Devi

    In this thesis, we evaluate the front-end of Automatic Speech Recognition (ASR) systems with respect to different types of widely used spectral processing methods. A filter bank approach is one of the most common methods for front-end spectral analysis. In this work we describe and evaluate spectral analysis based on Mel and Gammatone filter banks. These filtering methods are derived from auditory models and are thought to have some advantages for automatic speech recognition work. Experimentally, however, we show that direct use of FFT spectral values is just as effective as using either Mel or Gammatone filter banks, provided that the features extracted from the FFT spectral values take into account a Mel or Mel-like frequency scale. It is also shown that trajectory features based on a sliding block of spectral features, computed using either FFT or filter bank spectral analysis, are considerably more effective, in terms of ASR accuracy, than the delta and delta-delta terms often used for ASR. Although there is no major performance disadvantage to using a filter bank, simplicity of analysis is a reason to eliminate this step in speech processing. These assertions hold for both clean and noisy speech.
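
    To make the Mel-filter-bank versus direct-FFT comparison concrete, here is one common triangular-filter construction; the filter count, FFT size, and sampling rate are illustrative, and this is not the thesis's exact front end.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters=26, n_fft=512, sr=16000):
    """Triangular filters spaced evenly on the Mel scale."""
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        fb[i - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)  # rising edge
        fb[i - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)  # falling edge
    return fb

# One frame: raw log-FFT features versus log-Mel filter-bank energies.
sr, n_fft = 16000, 512
frame = np.random.randn(n_fft) * np.hamming(n_fft)
power = np.abs(np.fft.rfft(frame)) ** 2
log_fft = np.log(power + 1e-10)                                  # direct FFT features
log_mel = np.log(mel_filterbank(26, n_fft, sr) @ power + 1e-10)  # filter-bank features
print(log_fft.shape, log_mel.shape)  # (257,) (26,)
```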

  14. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    PubMed Central

    2012-01-01

    Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological mechanisms may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variations in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic analysis of the association between variations in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automation of the DMET-SNP analysis workflow, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and searches in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through searches in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies regarding
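
    The per-SNP association test at the heart of such a workflow can be as simple as a Fisher exact test on a 2 x 2 genotype-by-response contingency table. The counts below are invented for illustration; this is not DMET-Analyzer's code.

```python
from scipy.stats import fisher_exact

# Hypothetical counts relating one SNP to drug response.
#                 responder   non-responder
table = [[30, 10],  # variant allele present
         [15, 25]]  # variant allele absent

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```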

  15. Rapid automatic keyword extraction for information retrieval and analysis

    DOEpatents

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
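
    The abstract describes the scoring closely enough to sketch: candidate phrases come from splitting on delimiters and stop words, each word is scored by degree/frequency, and a phrase scores the sum of its word scores. The stop list below is a toy assumption, and this is a sketch of the idea rather than the patented implementation.

```python
import re
from collections import defaultdict

STOP = {"a", "an", "and", "the", "of", "for", "in", "on", "is", "are",
        "to", "by", "or", "both", "then", "based", "at", "least", "part"}

def rake(text, top_n=5):
    # 1. Candidate phrases: maximal runs of non-stop words.
    words = re.findall(r"[a-z']+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in STOP:
            if current:
                phrases.append(tuple(current))
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(tuple(current))

    # 2. Word scores: co-occurrence degree divided by frequency.
    freq, degree = defaultdict(int), defaultdict(int)
    for p in phrases:
        for w in p:
            freq[w] += 1
            degree[w] += len(p)  # w plus the words co-occurring with it
    score = {w: degree[w] / freq[w] for w in freq}

    # 3. Keyword score of a phrase = sum of its word scores.
    ranked = sorted(set(phrases), key=lambda p: -sum(score[w] for w in p))
    return [" ".join(p) for p in ranked[:top_n]]

print(rake("Methods and systems for rapid automatic keyword extraction "
           "for information retrieval and analysis of documents"))
```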

  16. Corpus analysis and automatic detection of emotion-inducing keywords

    NASA Astrophysics Data System (ADS)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

    Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), the words in use that convey emotion. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in Precision, Recall and F1-score.

  17. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses data from each honeypot. Results of this analysis, and also other statistical data about malicious activity, are simply accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  18. Entropy analysis of automatic sequences revisited: An entropy diagnostic for automaticity

    NASA Astrophysics Data System (ADS)

    Karamanos, Kostas

    2001-06-01

    We give a necessary entropy condition, valid for all automatic sequences read by lumping. We next establish new entropic decimation schemes for the Thue-Morse, the Rudin-Shapiro and the paperfolding sequences read by lumping.
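
    Reading by lumping means taking non-overlapping blocks rather than sliding a window. A minimal sketch of the block entropies such an analysis starts from, computed here for the Thue-Morse sequence; the block lengths and sequence length are arbitrary choices.

```python
import numpy as np

def thue_morse(n):
    """First n terms of the Thue-Morse sequence (a 2-automatic sequence)."""
    return [bin(i).count("1") % 2 for i in range(n)]

def block_entropy(seq, k, lumping=True):
    """Shannon entropy (nats) of length-k blocks; lumping reads the
    sequence in non-overlapping blocks, the alternative slides by one."""
    step = k if lumping else 1
    blocks = [str(seq[i:i + k]) for i in range(0, len(seq) - k + 1, step)]
    _, counts = np.unique(blocks, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

s = thue_morse(2 ** 14)
for k in (1, 2, 4, 8):
    print(k, round(block_entropy(s, k), 4))
```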

  19. Entropy analysis of OCT signal for automatic tissue characterization

    NASA Astrophysics Data System (ADS)

    Wang, Yahui; Qiu, Yi; Zaki, Farzana; Xu, Yiqing; Hubbi, Basil; Belfield, Kevin D.; Liu, Xuan

    2016-03-01

    Optical coherence tomography (OCT) signal can provide microscopic characterization of biological tissue and assist clinical decision making in real time. However, raw OCT data is noisy and complicated. It is challenging to extract information that is directly related to the pathological status of tissue through visual inspection of the huge volume of OCT signal streaming from the high-speed OCT engine. Therefore, it is critical to discover concise, comprehensible information from massive OCT data through novel strategies for signal analysis. In this study, we perform Shannon entropy analysis on OCT signal for automatic tissue characterization, which can be applied in intraoperative tumor margin delineation for surgical excision of cancer. The principle of this technique is based on the fact that normal tissue is usually more structured, with a higher entropy value, compared to pathological tissue such as cancer tissue. In this study, we develop high-speed software based on graphics processing units (GPU) for real-time entropy analysis of OCT signal.
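
    The underlying quantity is just the Shannon entropy of an intensity histogram over a signal region; a sketch follows, with the bin count and the synthetic "structured" versus "speckle" signals assumed for illustration (the paper's GPU pipeline is far more involved).

```python
import numpy as np

def shannon_entropy(signal, bins=64):
    """Shannon entropy (bits) of the intensity histogram of a 1-D signal
    segment, e.g. a region of an OCT A-line."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
structured = np.sin(np.linspace(0, 20 * np.pi, 2048)) + rng.normal(0, 0.05, 2048)
speckle = rng.normal(0, 1, 2048)
print(shannon_entropy(structured), shannon_entropy(speckle))
```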

  20. Independent component analysis for automatic note extraction from musical trills

    NASA Astrophysics Data System (ADS)

    Brown, Judith C.; Smaragdis, Paris

    2004-05-01

    The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.
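
    A toy version of the separation step using scikit-learn's FastICA: two sinusoidal "notes" mixed into two observed channels and recovered. The frequencies and mixing matrix are invented; real trill recordings need far more care.

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 8000)
s1 = np.sin(2 * np.pi * 440 * t)        # A4
s2 = np.sin(2 * np.pi * 494 * t)        # B4
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # unknown mixing matrix
X = S @ A.T                             # the two observed mixtures

recovered = FastICA(n_components=2, random_state=0).fit_transform(X)
for i in range(2):
    c1 = abs(np.corrcoef(recovered[:, i], s1)[0, 1])
    c2 = abs(np.corrcoef(recovered[:, i], s2)[0, 1])
    print(f"component {i}: |corr| A4 = {c1:.2f}, B4 = {c2:.2f}")
```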

  2. Spectral saliency via automatic adaptive amplitude spectrum analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing the amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations, via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also preserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. Performance is evaluated through quantitative and qualitative comparisons using three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
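
    A compact sketch in the spirit of amplitude-spectrum smoothing (essentially the classic spectral-residual formulation): smooth the log amplitude spectrum, keep the phase, and invert. The fixed scale parameter here stands in for the automatic scale selection that is the paper's actual contribution.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spectral_saliency(image, scale=3.0):
    f = np.fft.fft2(image)
    log_amp, phase = np.log(np.abs(f) + 1e-8), np.angle(f)
    residual = log_amp - gaussian_filter(log_amp, scale)  # suppress repeated patterns
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(sal, 2.0)  # smooth the saliency map

# A small salient square on a noisy, textured background.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.1, (64, 64))
img[24:32, 24:32] += 1.0
m = spectral_saliency(img)
print(m[24:32, 24:32].mean() > m.mean())  # True: the square stands out
```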

  3. [Automatic analysis pipeline of next-generation sequencing data].

    PubMed

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although there are many software tools for analyzing next-generation sequencing data, most of them are designed for one specific function (e.g., alignment, variant calling or annotation). Therefore, it is necessary to combine them for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data based on the Perl programming language and the SGE system. The pipeline takes original sequence data (fastq format) as input, calls the standard data processing software (e.g., BWA, Samtools, GATK, and Annovar), and finally outputs a list of annotated variants that researchers can further analyze. The pipeline simplifies manual operation and improves efficiency through automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking the graphical interface. Our work will facilitate research projects using sequencing technology.
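
    The wrap-the-standard-tools pattern the abstract describes reduces, in its simplest form, to a linear chain of subprocess calls. The file names and tool options below are placeholders, and the SGE-based parallelization is omitted.

```python
import shlex
import subprocess

# Alignment and BAM preparation stage of an Illumina pipeline; variant
# calling (e.g. GATK) and annotation (e.g. Annovar) would follow the
# same pattern. All paths are placeholders.
STEPS = [
    "bwa mem ref.fa sample_R1.fastq sample_R2.fastq -o sample.sam",
    "samtools sort sample.sam -o sample.bam",
    "samtools index sample.bam",
]

for cmd in STEPS:
    print("running:", cmd)
    subprocess.run(shlex.split(cmd), check=True)  # abort on the first failure
```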

  4. DWT features performance analysis for automatic speech recognition of Urdu.

    PubMed

    Ali, Hazrat; Ahmad, Nasir; Zhou, Xianwei; Iqbal, Khalid; Ali, Sahibzada Muhammad

    2014-01-01

    This paper presents work on Automatic Speech Recognition of the Urdu language, using a comparative analysis of Discrete Wavelet Transform (DWT) based features and Mel Frequency Cepstral Coefficients (MFCC). These features have been extracted for one hundred isolated words of Urdu, each word uttered by ten different speakers. The words have been selected from the most frequently used words of Urdu. A variety of ages and dialects has been covered by using a balanced corpus approach. After extraction of features, classification has been achieved by using Linear Discriminant Analysis. After the classification task, the confusion matrix obtained for the DWT features has been compared with the one obtained for Mel-Frequency Cepstral Coefficients based speech recognition. The framework has been trained and tested on speech data recorded under controlled environments. The experimental results are useful in determining the optimum features for the speech recognition task.

  5. Automatic analysis for neuron by confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Satou, Kouhei; Aoki, Yoshimitsu; Mataga, Nobuko; Hensh, Takao K.; Taki, Katuhiko

    2005-12-01

    The aim of this study is to develop a system that recognizes both the macro- and microscopic configurations of nerve cells and automatically performs the necessary 3-D measurements and functional classification of spines. The acquisition of 3-D images of cranial nerves has been enabled by the use of a confocal laser scanning microscope, although the highly accurate 3-D measurements of the microscopic structures of cranial nerves and their classification based on their configurations have not yet been accomplished. In this study, in order to obtain highly accurate measurements of the microscopic structures of cranial nerves, existing positions of spines were predicted by the 2-D image processing of tomographic images. Next, based on the positions that were predicted on the 2-D images, the positions and configurations of the spines were determined more accurately by 3-D image processing of the volume data. We report the successful construction of an automatic analysis system that uses a coarse-to-fine technique to analyze the microscopic structures of cranial nerves with high speed and accuracy by combining 2-D and 3-D image analyses.

  6. Dynamic Analysis of AN Automatic Dynamic Balancer for Rotating Mechanisms

    NASA Astrophysics Data System (ADS)

    CHUNG, J.; RO, D. S.

    1999-12-01

    The dynamic stability and behavior of an automatic dynamic balancer (ADB) are analyzed by a theoretical approach. Using Lagrange's equation, we derive the non-linear equations of motion for an autonomous system with respect to the polar co-ordinate system. From the equations of motion for the autonomous system, the equilibrium positions and the linear variational equations are obtained by the perturbation method. Based on the variational equations, the dynamic stability of the system in the neighborhood of the equilibrium positions is investigated by the Routh-Hurwitz criteria. The results of the stability analysis provide the design requirements for the ADB to achieve balancing of the system. In addition, in order to verify the stability of the system, time responses are computed by the generalized-α method. We also investigate the dynamic behavior of the system and the effects of damping on balancing.

  7. Baseline fetal heart rate analysis: eleven automatic methods versus expert consensus.

    PubMed

    de l'Aulnoit, Agathe Houze; Boudet, Samuel; Demailly, Romain; Peyrodie, Laurent; Beuscart, Regis; de l'Aulnoit, Denis Houze

    2016-08-01

    Visual analysis of fetal heart rate (FHR) during labor is subject to inter- and intra-observer variability that is particularly troublesome for anomalous recordings. Automatic FHR analysis has been proposed as a promising way to reduce this variability. The major difficulty with automatic analysis is determining the baseline from which accelerations and decelerations will be detected. Eleven methods for automatic FHR analysis were reprogrammed using descriptions from the literature and applied to 66 FHR recordings collected during the first stage of delivery. The FHR baselines produced by the automatic methods were compared with the baseline defined by agreement among a panel of three experts. The better performance of the automatic methods described by Mongelli, Lu, Wrobel and Pardey was noted, despite their different approaches to signal processing. Nevertheless, for several recordings, none of the studied automatic methods produced a baseline similar to that defined by the experts.

  8. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    SciTech Connect

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and finally, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  9. Automatic traffic real-time analysis system based on video

    NASA Astrophysics Data System (ADS)

    Ding, Liya; Liu, Jilin; Zhou, Qubo; Wang, Rengrong

    2003-05-01

    Automatic traffic analysis is very important in the modern world with its heavy traffic. It can be achieved in numerous ways; among them, detection and analysis through a video system, which can provide rich information with little disturbance to the traffic, is an ideal choice. The proposed traffic vision analysis system uses an image acquisition card to capture real-time images of the traffic scene through a video camera, and then exploits the traffic scene sequence and image processing and analysis techniques to detect the presence and movement of vehicles. After first removing the complex, constantly changing traffic background, the system segments each vehicle in the region of interest to the user. The system extracts features from each vehicle and tracks them through the image sequence. Combined with calibration, the system calculates traffic information such as vehicle speeds and types, the volume of flow, the traffic density, the waiting queue length of the lanes, the turning information of the vehicles, and so on. Traffic congestion and vehicles' shadows are disturbing problems for vehicle detection, segmentation and tracking, so we make a great effort to investigate methods of dealing with them.
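
    The background-removal step can be sketched with a median background model: the per-pixel median over recent frames approximates the empty road, and large deviations mark vehicles. The threshold, frame count, and synthetic "vehicle" below are assumptions; the paper's handling of a changing background is more sophisticated.

```python
import numpy as np

def vehicle_mask(frames, current, thresh=25.0):
    """Foreground mask by median-background subtraction.
    frames: recent grayscale frames, shape (T, H, W)."""
    background = np.median(frames, axis=0)
    return np.abs(current.astype(float) - background) > thresh

# Synthetic scene: a static gradient road and one moving bright block.
H, W = 48, 64
road = np.tile(np.linspace(0, 100, W), (H, 1))
frames = []
for t in range(10):
    f = road.copy()
    f[20:28, 5 + 5 * t: 13 + 5 * t] = 255.0  # the "vehicle"
    frames.append(f)
frames = np.stack(frames)

mask = vehicle_mask(frames[:-1], frames[-1])
print(mask.sum(), "foreground pixels")  # roughly the 8 x 8 vehicle
```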

  10. Differentiation of normal and disturbed sleep by automatic analysis.

    PubMed

    Hasan, J

    1983-01-01

    stage classification could be used for the differentiation between normal and disturbed sleep. In the present work only EEG waveform parameters and body movement activity were studied with this in mind. It was found that sleep can satisfactorily be classified in stages by automatic analysis if it is not markedly disturbed. The percentage agreement obtained for the three groups having practically normal sleep (young normals appr. 80%, older normals 77% and anonymous alcoholics 75%) was satisfactory and sufficient for clinical and experimental work.(ABSTRACT TRUNCATED AT 400 WORDS)

  11. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    PubMed

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and a specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure, as well as flow cytometric approaches, have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis, where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as the cytotoxicity parameter and for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The ability to analyse 24 slides within 65 h by automatic analysis over the weekend, and the high reproducibility of the results, make automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  12. Automatic adventitious respiratory sound analysis: A systematic review.

    PubMed

    Pramono, Renard Xaviero Adhi; Bowyer, Stuart; Rodriguez-Villegas, Esther

    2017-01-01

    Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained by references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles were included that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9 (11

  13. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.

  14. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
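
    A simplified version of the radial intensity sampling that Ganalyzer's spirality measure builds on: intensity along a ring of fixed radius around the centre, as a function of angle. The centred-galaxy assumption, single sampling radius, and synthetic spiral are all illustrative.

```python
import numpy as np

def radial_intensity(image, n_theta=360):
    """Intensity around a circle of radius min(H, W)/4 centred on the
    image, sampled at n_theta angles (a single-radius simplification)."""
    cy, cx = np.array(image.shape) / 2.0
    r = min(image.shape) / 4.0
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    ys = np.clip((cy + r * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
    xs = np.clip((cx + r * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
    return theta, image[ys, xs]

# Synthetic two-arm spiral: the angular positions of the peaks in this
# profile shift with radius, which is what a spirality measure detects.
yy, xx = np.mgrid[0:128, 0:128] - 64.0
ang, rad = np.arctan2(yy, xx), np.hypot(yy, xx) + 1e-6
galaxy = np.cos(2 * ang - 0.15 * rad) * np.exp(-rad / 40.0)
theta, profile = radial_intensity(galaxy)
print(theta[np.argmax(profile)])  # angle of the brightest arm crossing
```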

  15. Shape analysis for an automatic oyster grading system

    NASA Astrophysics Data System (ADS)

    Lee, Dah-Jye; Xu, Xiaoqian; Lane, Robert M.; Zhan, Pengcheng

    2004-12-01

    An overview of the oyster industry in the U.S., with emphasis on Virginia, shows that oyster grading occurs at harvest, wholesale and processing markets. Currently whole oysters, also called shellstock, are graded manually by screening and sorting based on diameter or weight. The majority of oysters harvested for the processing industry are divided into three to four main grades: small, medium, large, and selects. We have developed a shape analysis method for an automatic oyster grading system. The system first detects and removes poor quality oysters such as those with banana shapes, broken shells, and irregular shapes. Good quality oysters move on to be graded as small, medium or large. The contours of the oysters are extracted for shape analysis. Banana shapes and broken shells show a specific shape flaw (or difference) compared to good quality oysters. Global shape properties such as compactness, roughness, and elongation are suitable and useful for measuring the shape flaw. Image projection area or the length of the major axis is measured as a global property for sizing. Incorporating a machine vision system for grading, sorting and counting oysters reduces operating costs. The savings produced by reducing labor, increasing accuracy in size, grade and count, and providing real-time accurate data for accounting and billing would contribute to the profit of the oyster industry.
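
    Two of the named global shape properties, compactness and elongation, can be computed directly from a binary mask; a sketch follows, using a 4-neighbour perimeter estimate and moment-based elongation, which are common textbook choices rather than necessarily the system's own.

```python
import numpy as np

def shape_features(mask):
    """Compactness (perimeter^2 / 4*pi*area, smallest for round blobs)
    and elongation (major/minor axis ratio from second-order moments)."""
    area = mask.sum()
    # 4-neighbour exposed-edge count as a cheap perimeter estimate.
    perim = sum((mask & ~np.roll(mask, s, ax)).sum()
                for s in (1, -1) for ax in (0, 1))
    compactness = perim ** 2 / (4 * np.pi * area)

    ys, xs = np.nonzero(mask)
    evals = np.sort(np.linalg.eigvalsh(np.cov(np.vstack([xs, ys]).astype(float))))
    elongation = float(np.sqrt(evals[1] / max(evals[0], 1e-9)))
    return float(compactness), elongation

yy, xx = np.mgrid[0:64, 0:64] - 32.0
disc = (xx ** 2 + yy ** 2) < 20 ** 2
ellipse = (xx ** 2 / 900 + yy ** 2 / 100) < 1
print(shape_features(disc))     # round: low compactness, elongation ~1
print(shape_features(ellipse))  # "banana-ish": higher on both measures
```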

  16. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.
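
    The basic fit itself, a spatially invariant kernel of delta basis functions solved by linear least squares, plus an Akaike information criterion comparison across kernel sizes, can be sketched briefly. The shift-matrix construction and noise level are assumptions, and the regularized variants the paper also tests are omitted.

```python
import numpy as np

def solve_kernel(ref, tgt, half):
    """Least-squares fit of a (2*half+1)^2 delta-basis convolution kernel
    mapping ref onto tgt, returning the kernel and its AIC."""
    cols = [np.roll(np.roll(ref, dy, 0), dx, 1).ravel()
            for dy in range(-half, half + 1)
            for dx in range(-half, half + 1)]
    A = np.vstack(cols).T                     # one column per kernel pixel
    coeffs, *_ = np.linalg.lstsq(A, tgt.ravel(), rcond=None)
    n, k = tgt.size, len(coeffs)
    rss = np.sum((A @ coeffs - tgt.ravel()) ** 2)
    aic = n * np.log(rss / n) + 2 * k         # Akaike information criterion
    return coeffs.reshape(2 * half + 1, -1), aic

rng = np.random.default_rng(1)
ref = rng.normal(0, 1, (32, 32))
tgt = 0.9 * np.roll(ref, 1, axis=1) + rng.normal(0, 0.01, ref.shape)
for half in (1, 2, 3):
    print(half, round(solve_kernel(ref, tgt, half)[1], 1))
# AIC should be lowest for the smallest kernel that contains the true shift.
```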

  17. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  19. Automatic Video Analysis for Obstructive Sleep Apnea Diagnosis

    PubMed Central

    Abad, Jorge; Muñoz-Ferrer, Aida; Cervantes, Miguel Ángel; Esquinas, Cristina; Marin, Alicia; Martínez, Carlos; Morera, Josep; Ruiz, Juan

    2016-01-01

    Study Objectives: We investigated the diagnostic accuracy of a noninvasive technology based on image processing (SleepWise) for the identification of obstructive sleep apnea (OSA) and its severity. Methods: This is an observational, prospective study to evaluate the degree of agreement between polysomnography (PSG) and SleepWise. We recruited 56 consecutive subjects with suspected OSA who were referred as outpatients to the Sleep Unit of the Hospital Universitari Germans Trias i Pujol (HUGTiP) from January 2013 to January 2014. All patients underwent laboratory PSG and image processing with SleepWise simultaneously the same night. The PSG and SleepWise analyses were carried out independently and blindly. Results: We analyzed 50 of the 56 patients recruited. OSA was diagnosed through PSG in a total of 44 patients (88%) with a median apnea-hypopnea index (AHI) of 25.35 (24.9). According to SleepWise, 45 patients (90%) met the criteria for a diagnosis of OSA, with a median AHI of 22.8 (22.03). An analysis of the ability of PSG and SleepWise to classify patients by severity on the basis of their AHI shows that the two diagnostic systems distribute the different groups similarly. According to PSG, 23 patients (46%) had a diagnosis of severe OSA, 11 patients (22%) moderate OSA, and 10 patients (20%) mild OSA. According to SleepWise, 20, 13, and 12 patients (40%, 26%, and 24%, respectively) had a diagnosis of severe, moderate, and mild OSA, respectively. For OSA diagnosis, SleepWise was found to have a sensitivity of 100% and a specificity of 83% in relation to PSG. The positive predictive value was 97% and the negative predictive value was 100%. The Bland-Altman plot comparing the mean AHI values obtained through PSG and SleepWise shows very good agreement between the two diagnostic techniques, with a bias of −3.85, a standard error of 12.18, and a confidence interval of −0.39 to −7.31. Conclusions: SleepWise was reasonably accurate for noninvasive and automatic diagnosis

  20. Automatic analysis of the 2015 Gorkha earthquake aftershock sequence.

    NASA Astrophysics Data System (ADS)

    Baillard, C.; Lyon-Caen, H.; Bollinger, L.; Rietbrock, A.; Letort, J.; Adhikari, L. B.

    2016-12-01

    The Mw 7.8 Gorkha earthquake, which partially ruptured the Main Himalayan Thrust north of Kathmandu on 25 April 2015, was the largest and most catastrophic earthquake to strike Nepal since the great M8.4 1934 earthquake. This mainshock was followed by multiple aftershocks, among them two notable events that occurred on 12 May with magnitudes of 7.3 Mw and 6.3 Mw. Due to these recent events it became essential for the authorities and for the scientific community to better evaluate the seismic risk in the region through a detailed analysis of the earthquake catalog, amongst others the spatio-temporal distribution of the Gorkha aftershock sequence. Here we complement this first study with a microseismic study using seismic data from the eastern part of the Nepalese Seismological Center network, associated with one broadband station in Everest. Our primary goal is to deliver an accurate catalog of the aftershock sequence. Due to the exceptional number of events detected, we performed an automatic picking/locating procedure which can be split into 4 steps: 1) coarse picking of the onsets using a classical STA/LTA picker; 2) phase association of picked onsets to detect and declare seismic events; 3) Kurtosis pick refinement around theoretical arrival times to increase picking and location accuracy; and 4) local magnitude calculation based on the amplitude of the waveforms. This procedure is time efficient (about 1 sec/event), reduces the location uncertainties considerably (2 to 5 km errors) and increases the number of events detected compared to manual processing. Indeed, the automatic detection rate is 10 times higher than the manual detection rate. By comparing to the USGS catalog we were able to derive a new attenuation law to compute local magnitudes in the region. A detailed analysis of the seismicity shows a clear migration toward the east of the region and a sudden decrease of seismicity 100 km east of Kathmandu, which may reveal the presence of a tectonic
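
    Step 1, coarse STA/LTA onset detection, in its textbook form; the window lengths, trigger threshold, and synthetic trace below are illustrative (production picking would typically use ObsPy or similar).

```python
import numpy as np

def sta_lta(trace, sta_n, lta_n):
    """Ratio of short-term to long-term average energy, both trailing
    windows ending at the same sample."""
    csum = np.concatenate(([0.0], np.cumsum(trace ** 2)))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
    n = len(lta)
    return sta[-n:] / (lta + 1e-12)

# Synthetic trace: noise, then a stronger arrival at sample 3000.
rng = np.random.default_rng(0)
tr = rng.normal(0, 1, 6000)
tr[3000:] += rng.normal(0, 4, 3000)

ratio = sta_lta(tr, sta_n=50, lta_n=500)
trigger = int(np.argmax(ratio > 5.0)) + 500 - 1  # re-offset by the LTA window
print("trigger near sample", trigger)
```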

  1. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  2. Automatic analysis of ciliary beat frequency using optical flow

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, such as primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high speed video sequences, a tedious, observer dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied this to certain objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed us to compute the CBF. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, there remain difficulties in automatically identifying the cilia, and also in finding enough high contrast cilia in the image. Furthermore, some of the higher contrast cilia are lost (and sometimes found again) by the method; an easy way to identify the correct sub-path of a point's path has yet to be found for cases where the slope method doesn't work.
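
    Once a cilium point has been tracked, the CBF estimate is essentially the location of the peak in the Fourier amplitude spectrum of its displacement signal. The sketch below assumes a 500 fps camera and a clean 12 Hz oscillation, both invented for illustration.

```python
import numpy as np

def beat_frequency(displacement, fps):
    """Dominant oscillation frequency (Hz) of a tracked point's per-frame
    displacement signal, from the peak of its spectrum (DC excluded)."""
    d = displacement - displacement.mean()
    spec = np.abs(np.fft.rfft(d))
    freqs = np.fft.rfftfreq(len(d), d=1.0 / fps)
    return freqs[1:][np.argmax(spec[1:])]

fps, seconds = 500, 2
t = np.arange(fps * seconds) / fps
track = 0.8 * np.sin(2 * np.pi * 12 * t)  # a 12 Hz ciliary beat
track += np.random.default_rng(0).normal(0, 0.1, t.size)
print(beat_frequency(track, fps), "Hz")  # ~12.0
```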

  3. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  4. Trends of Science Education Research: An Automatic Content Analysis

    NASA Astrophysics Data System (ADS)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research, based on the articles published in the four journals International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education from 1990 to 2007. The multi-stage clustering technique was employed to investigate the topics, development trends, and contributors from which the journal publications constructed science education as a research field. This study found that the research topic of Conceptual Change & Concept Mapping was the most studied topic, although the number of publications declined slightly in the 2000s. Studies on the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  5. Discourse Analysis for Language Learners. Australian Review of Applied Linguistics, Vol. 1, No. 2.

    ERIC Educational Resources Information Center

    Hartmann, R. R. K.

    Discourse analysis, a field that reflects an interest in language as text and social interaction, is discussed. Discourse analysis deals with the way language varies from one communicative situation to another; textological analysis deals with the internal organization of such discourse in terms of grammar and vocabulary. Assumptions in…

  7. The Romanian-English Contrastive Analysis Project; Further Developments in Contrastive Studies, Vol. 5.

    ERIC Educational Resources Information Center

    Chitoran, Dumitru, Ed.

    The fifth volume in this series contains ten articles dealing with various aspects of Romanian-English contrastive analysis. They are: "Theoretical Interpretation and Methodological Consequences of 'REGULARIZATION'," by Tatiana Slama-Cazacu; "On Error Analysis," by Charles M. Carlton; "The Contrastive Hypothesis in Second Language Acquisition," by…

  9. Automatic analysis of stereoscopic satellite image pairs for determination of cloud-top height and structure

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Strong, J.; Woodward, R. H.; Pierce, H.

    1991-01-01

    Results are presented on an automatic stereo analysis of cloud-top heights from nearly simultaneous satellite image pairs from the GOES and NOAA satellites, using a massively parallel processor computer. Comparisons of computer-derived height fields and manually analyzed fields indicate that the automatic analysis technique shows promise for performing routine stereo analysis in a real-time environment, providing a useful forecasting tool by augmenting observational data sets of severe thunderstorms and hurricanes. Simulations using synthetic stereo data show that it is possible to automatically resolve small-scale features such as clouds of 4000 m diameter to about 1500 m in the vertical.

  10. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis

    DTIC Science & Technology

    1989-08-01

    Final technical report on automatic line network extraction from aerial imagery of urban areas through knowledge-based image analysis. Keywords: pattern recognition, blackboard oriented symbolic processing, knowledge based image analysis, image understanding, aerial imagery, urban area.

  11. Operational testing of system for automatic sleep analysis

    NASA Technical Reports Server (NTRS)

    Kellaway, P.

    1972-01-01

    Tables on the performance, under operational conditions, of an automatic sleep monitoring system are presented. Data were recorded from patients who were undergoing heart and great vessel surgery. This study resulted in cap, electrode, and preamplifier improvements. Children were used to test the sleep analyzer and the medical console write-out units. From these data, an automatic voltage control circuit for the analyzer was developed. Special circuitry for obviating the possibility of incorrect sleep staging due to the presence of movement artifacts was also developed as a result of the study.

  12. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  13. System for the Analysis of Global Energy Markets - Vol. II, Model Documentation

    EIA Publications

    2003-01-01

    The second volume provides a data implementation guide that lists all naming conventions and model constraints. In addition, Volume 1 has two appendixes that provide a schematic of the System for the Analysis of Global Energy Markets (SAGE) structure and a listing of the source code, respectively.

  16. A Distance Measure for Automatic Document Classification by Sequential Analysis.

    ERIC Educational Resources Information Center

    Kar, Gautam; White, Lee J.

    1978-01-01

    Investigates the feasibility of using a distance measure for automatic sequential document classification. This property of the distance measure is used to design a sequential classification algorithm which classifies key words and analyzes them separately in order to assign primary and secondary classes to a document. (VT)

  17. Improvement of Automatic Abstracts by the Use of Structural Analysis

    ERIC Educational Resources Information Center

    Mathis, Betty A.; And Others

    1973-01-01

    Results of an attempt to extend the capabilities of a previously-existing automatic abstracting system by adding a modification procedure designed to make system-produced abstracts more acceptable to readers are reported. A rationale for this modification phase is presented, along with several modification rules and methods for improving…

  18. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…
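
    The simplest corpus-free baseline such systems are compared against is bag-of-words cosine similarity; a sketch follows. The example sentences are invented, and the dissertation's knowledge-based and corpus-based measures go well beyond this.

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Bag-of-words cosine similarity between two short texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

teacher = "a stack is a last in first out data structure"
student = "a stack stores data so the last item in is the first item out"
print(round(cosine_similarity(teacher, student), 3))
```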

  1. Automatic audio signing. Volume 2: Review, analysis and design

    NASA Astrophysics Data System (ADS)

    1981-11-01

    Automatic Audio Signing, also referred to as an 'automatic highway advisory radio' (AHAR) system, provides appropriately equipped motor vehicles with one-way, non-commercial communications pertaining to traffic, road, and weather conditions, travel advisories, directions, tourist information, and other matters of interest to the traveling public. The automatic audio signing project reduces accidents by providing advance warning of hazardous traffic, weather, and road conditions; saves motorists' time and fuel and reduces motorist irritation by improving traffic control and providing route diversion information when justified by traffic congestion or road blockage; and provides directions, locations of tourist facilities, descriptions of points of interest, and other messages intended to enhance the convenience and enjoyment of the traveling public.

  2. Automatic Metadata Generation Through Analysis of Narration Within Instructional Videos.

    PubMed

    Rafferty, Joseph; Nugent, Chris; Liu, Jun; Chen, Liming

    2015-09-01

    Current activity recognition based assistive living solutions have adopted relatively rigid models of inhabitant activities, and these solutions have some deficiencies associated with the use of such models. To address this, a goal-oriented solution has been proposed, in which goal models offer a method of flexibly modelling inhabitant activity. The flexibility of these goal models can dynamically produce a large number of varying action plans that may be used to guide inhabitants. In order to provide illustrative, video-based instruction for these numerous action plans, a number of video clips would need to be associated with each variation. To address this, rich metadata may be used to automatically match appropriate video clips from a video repository to each specific, dynamically generated activity plan. This study introduces a mechanism for automatically generating suitable rich metadata representing actions depicted within video clips to facilitate such video matching. The performance of this mechanism was evaluated using eighteen video files; during this evaluation, metadata was automatically generated with a high level of accuracy.

  3. Automatic Match between Delimitation Line and Real Terrain Based on Least-Cost Path Analysis

    NASA Astrophysics Data System (ADS)

    Feng, C. Q.; Jiang, N.; Zhang, X. N.; Ma, J.

    2013-11-01

    Nowadays, during international negotiations on separating disputed areas, the match between the delimitation line and the real terrain is performed solely by manual adjustment, which not only consumes much time and labor but also cannot ensure high precision. This paper explores the automatic matching of the two and studies a general solution based on Least-Cost Path Analysis. First, under the guidelines of delimitation laws, the cost layer is acquired through special processing of the delimitation line and terrain feature lines. Second, a new delimitation line is constructed with the help of Least-Cost Path Analysis. Third, the whole automatic match model is built via Model Builder so that it can be shared and reused. Finally, the result of automatic matching is analyzed from several aspects, including delimitation laws and two-sided benefits, leading to the conclusion that the automatic match method is feasible and effective.
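
    A minimal sketch of the core step, least-cost path extraction over a rasterised cost surface, using Dijkstra's algorithm on a 4-connected grid; the toy cost values stand in for the cost layer derived from delimitation rules and terrain feature lines, and are assumptions, not details from the paper.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 2D cost raster (4-connected grid)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk back from goal to start to recover the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy cost raster: low values mark terrain features the line should follow.
raster = [[9, 9, 1, 9],
          [9, 1, 1, 9],
          [1, 1, 9, 9],
          [1, 9, 9, 9]]
print(least_cost_path(raster, (3, 0), (0, 2)))
```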

  4. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    DTIC Science & Technology

    2017-01-01

    Coastal and Hydraulics Laboratory; Brandan M. Scully; January 2017; approved for public release, distribution is unlimited. Presents tidal analysis and arrival process mining derived from Automatic Identification System (AIS) data, employing the methodology presented by Mitchell and Scully (2014) for inferring tidal…

  5. Automatic Crowd Analysis from Very High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Reinartz, P.

    2011-04-01

    Automatic detection of people crowds from images has recently become a very important research field, since it can provide crucial information, especially for police departments and crisis management teams. Owing to the importance of the topic, many researchers have tried to solve this problem using street cameras; however, such cameras cannot be used to monitor very large outdoor public events. To address this, we propose a novel approach to detect crowds automatically from remotely sensed images, especially very high resolution satellite images, using a local-feature-based probabilistic framework. We extract local features from the color components of the input image. In order to eliminate redundant local features coming from other objects in the scene, we apply a feature selection method that draws on three different types of information: a digital elevation model (DEM) of the region, generated automatically from stereo satellite images; possible street segments, obtained by segmentation; and shadow information. After eliminating redundant local features, the remaining features are used to detect individual persons, and their coordinates are treated as observations of the probability density function (pdf) of the crowds to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf, which provides information about dense-crowd and people locations. We test our algorithm using WorldView-2 satellite images over the cities of Cairo and Munich, and also provide test results on airborne images for comparison of detection accuracy. Our experimental results indicate the suitability of the proposed approach for real-life mass events.
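
    The density-estimation step can be sketched as follows: treat the selected local-feature coordinates as samples from the crowd pdf and estimate it with a Gaussian kernel. scipy's fixed-bandwidth gaussian_kde stands in for the paper's adaptive kernel method, and the coordinates are invented for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical (x, y) coordinates of local features that survived selection.
rng = np.random.default_rng(0)
cluster = rng.normal(loc=(50, 50), scale=5, size=(200, 2))  # a dense crowd
noise = rng.uniform(0, 100, size=(20, 2))                   # scattered persons
points = np.vstack([cluster, noise]).T                      # shape (2, N)

kde = gaussian_kde(points)  # fixed-bandwidth stand-in for adaptive KDE

# Evaluate the pdf on a grid and threshold it to flag dense-crowd regions.
xs, ys = np.mgrid[0:100:200j, 0:100:200j]
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
crowd_mask = density > 0.5 * density.max()
print("crowd pixels:", int(crowd_mask.sum()))
```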

  6. An analysis of automatic human detection and tracking

    NASA Astrophysics Data System (ADS)

    Demuth, Philipe R.; Cosmo, Daniel L.; Ciarelli, Patrick M.

    2015-12-01

    This paper presents an automatic method to detect and follow people in video streams. The method uses two techniques to determine the initial position of the person at the beginning of the video: one based on optical flow and the other on Histograms of Oriented Gradients (HOG). After the initial bounding box is defined, tracking is performed with four different trackers: the Median Flow tracker, the TLD tracker, the Mean Shift tracker, and a modified version of the Mean Shift tracker using the HSV color space. The results of these methods are then compared.

  7. Automatic interferometer with digital readout for refractometric analysis.

    PubMed

    Kinder, W; Neumann, J; Plesse, H; Torge, R

    1968-02-01

    The paper describes an interference refractometer for liquids and gases that operates automatically and reads out in digital or analog form. A white-light compensating technique is used for measurement. Zero adjustment is achieved by rotating the compensator and capturing the zero-order white-light fringe photoelectrically. Measurement of the path difference introduced by the compensator is based on electronic interpolation and counting of interference fringes by opto-interferometric means, with a time-division multiplex technique with pulse amplitude modulation used to obtain the electrical fringe signals.

  8. Automatic measurement of the sinus of Valsalva by image analysis.

    PubMed

    Mairesse, Fabrice; Blanchard, Cédric; Boucher, Arnaud; Sliwa, Tadeusz; Lalande, Alain; Voisin, Yvon

    2017-09-01

    Despite the importance of the morphology of the sinus of Valsalva for the behavior of heart valves and the proper irrigation of the coronary arteries, the study of these sinuses from medical imaging is still limited to manual radius measurements. This paper presents an automatic method to measure the sinuses of Valsalva in medical images, specifically cine MRI and X-ray CT. It introduces an enhanced method to automatically localize and extract each sinus of Valsalva edge and its relevant points. Compared with classical active contours, this approach improves the edge extraction of the sinus of Valsalva. The process allows not only image segmentation but also a comprehensive study of the considered region, including morphological classification, metrological characterization, valve tracking, and 2D modeling. The method was successfully used on single- or multi-plane cine MRI and aortic CT angiographies. The localization is robust, and the proposed edge extractor is more efficient than state-of-the-art methods (average success rate of 84% ± 24% for MRI examinations and 89% ± 11% for CT examinations). Moreover, the deduced measurements are close to manual ones. The software produces accurate measurements of the sinuses of Valsalva; the robustness and reproducibility of the results will support a better understanding of sinus of Valsalva pathologies and constitute a first step toward the design of complex prostheses adapted to each patient.

  9. Structuring Lecture Videos by Automatic Projection Screen Localization and Analysis.

    PubMed

    Li, Kai; Wang, Jue; Wang, Haoqian; Dai, Qionghai

    2015-06-01

    We present a fully automatic system for extracting the semantic structure of a typical academic presentation video, which captures the whole presentation stage with abundant camera motions such as panning, tilting, and zooming. Our system automatically detects and tracks both the projection screen and the presenter whenever they are visible in the video. By analyzing the image content of the tracked screen region, our system is able to detect slide progressions and extract a high-quality, non-occluded, geometrically-compensated image for each slide, resulting in a list of representative images that reconstruct the main presentation structure. Afterwards, our system recognizes text content and extracts keywords from the slides, which can be used for keyword-based video retrieval and browsing. Experimental results show that our system is able to generate more stable and accurate screen localization results than commonly-used object tracking methods. Our system also extracts more accurate presentation structures than general video summarization methods, for this specific type of video.

  10. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis is applied to the pumping speed measurement results. Based on the test principle and system structure, the influence of each component and test step on the final uncertainty is studied. Using the differential method, a mathematical model for the systematic uncertainty transfer function is established. Finally, in a case study, the combined uncertainties of manual and automatic operation are compared (6.11% and 5.87%, respectively). The results demonstrate the reasonableness and practicality of this newly developed automatic testing system.
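
    A sketch of the differential (error-propagation) method mentioned above: evaluate the partial derivatives of the measurement model numerically and combine the component uncertainties in quadrature. The pumping-speed model S = V / (t · p) and all numbers are illustrative assumptions, not values from the paper.

```python
import math

def combined_uncertainty(f, values, uncertainties, h=1e-6):
    """Gauss error propagation: u_c^2 = sum_i (df/dx_i * u_i)^2,
    with partial derivatives taken by central finite differences."""
    total = 0.0
    for i, u in enumerate(uncertainties):
        hi, lo = list(values), list(values)
        step = h * max(abs(values[i]), 1.0)
        hi[i] += step
        lo[i] -= step
        dfdx = (f(*hi) - f(*lo)) / (2 * step)
        total += (dfdx * u) ** 2
    return math.sqrt(total)

# Illustrative pumping-speed model: S = V / (t * p)  [volume, time, pressure].
S = lambda V, t, p: V / (t * p)
vals = (1.0e-3, 10.0, 2.0e-2)   # hypothetical V [m^3], t [s], p [Pa]
uncs = (2.0e-5, 0.1, 5.0e-4)    # hypothetical standard uncertainties
u_c = combined_uncertainty(S, vals, uncs)
print(f"S = {S(*vals):.4g}, u_c = {u_c:.3g} ({100 * u_c / S(*vals):.2f} %)")
```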

  11. Automatic computer aided analysis algorithms and system for adrenal tumors on CT images.

    PubMed

    Chai, Hanchao; Guo, Yi; Wang, Yuanyuan; Zhou, Guohui

    2017-08-04

    Adrenal tumors disturb the secretory function of adrenocortical cells, leading to many diseases, and different kinds of adrenal tumors require different therapeutic schedules. In current practice, diagnosis relies heavily on the doctor's experience in judging the tumor type from hundreds of CT images. This paper proposes an automatic computer-aided analysis method for adrenal tumor detection and classification. It consists of automatic segmentation algorithms, feature extraction, and classification algorithms, which were integrated into a system operated through a graphical interface built with the MATLAB graphical user interface (GUI) tools. The accuracy of the automatic computer-aided segmentation and classification reached 90% on 436 CT images. The experiments demonstrate the stability and reliability of this automatic computer-aided analytic system.

  12. Development of a System for Automatic Facial Expression Analysis

    NASA Astrophysics Data System (ADS)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

    Automatic recognition of facial expressions can be an important component of natural human-machine interactions. While many samples are desirable for accurately estimating the feelings of a person (e.g., likeness) about a machine interface, in real-world situations only a small number of samples can be obtained because of the high cost of collecting emotional responses from the observed person. This paper proposes a system that solves this problem while accommodating individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, the proposed method achieved better generalization performance than former HNN and Support Vector Machine (SVM) classifiers, while requiring less learning time than the SVM classifiers.

  13. Automatic Content Analysis; Part I of Scientific Report No. ISR-18, Information Storage and Retrieval...

    ERIC Educational Resources Information Center

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper, "Content Analysis in Information Retrieval" by S. F. Weiss, presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…

  14. Review of automatic detection of pig behaviours by using image analysis

    NASA Astrophysics Data System (ADS)

    Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Kong, Fantao

    2017-06-01

    Automatic detection of the lying, moving, feeding, drinking, and aggressive behaviours of pigs by means of image analysis can reduce the observation workload of staff, help staff detect diseases or injuries of pigs early during breeding, and improve the management efficiency of the swine industry. This study describes progress in pig behaviour detection based on image analysis, and advances in the segmentation of the pig body, the segmentation of adhering pigs, and the extraction of pig behaviour characteristic parameters. Challenges in achieving automatic detection of pig behaviours are summarized.

  15. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic, or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
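
    A minimal sketch of the LDA idea under stated assumptions: fit a topic model to tokenized network-log "documents" and flag events whose likelihood under the model is unusually low. scikit-learn's LatentDirichletAllocation stands in for the paper's implementation, and the log lines and one-sigma cutoff are invented for illustration.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical tokenized network events (one "document" per event window).
logs = [
    "dns query internal host resolved",
    "http get intranet page ok",
    "dns query internal host resolved",
    "http get intranet page ok",
    "ftp put large archive external host night",  # exfiltration-like outlier
]

X = CountVectorizer().fit_transform(logs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Approximate per-document log-likelihood; unusually low scores are
# candidate anomalies (simple one-sigma cutoff, purely for illustration).
scores = np.array([lda.score(X[i]) for i in range(X.shape[0])])
threshold = scores.mean() - scores.std()
for line, s in zip(logs, scores):
    print(f"{s:9.2f}  {'ANOMALY' if s < threshold else 'ok':7s}  {line}")
```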

  16. Phase Segmentation Methods for an Automatic Surgical Workflow Analysis

    PubMed Central

    Sakurai, Ryuhei; Yamazoe, Hirotake

    2017-01-01

    In this paper, we present robust methods for automatically segmenting the phases of a specified surgical workflow using latent Dirichlet allocation (LDA) and hidden Markov model (HMM) approaches. More specifically, our goal is to output an appropriate phase label for each given time point of a surgical workflow in an operating room. The fundamental idea behind our work lies in constructing an HMM based on observed values obtained via an LDA topic model covering optical-flow motion features of the general working context, including medical staff, equipment, and materials. We capture these working contexts using multiple synchronized cameras recording the surgical workflow. Further, we validate the robustness of our methods in experiments involving up to 12 phases of surgical workflows, with an average workflow length of 12.8 minutes. The maximum average accuracy achieved after leave-one-out cross-validation was 84.4%, which we consider a very promising result. PMID:28408921

  17. Automatic shape model building based on principal geodesic analysis bootstrapping.

    PubMed

    Dam, Erik B; Fletcher, P Thomas; Pizer, Stephen M

    2008-04-01

    We present a novel method for automatic shape model building from a collection of training shapes. The result is a shape model consisting of the mean model and the major modes of variation, with a dense correspondence map between individual shapes. The framework consists of iterations in which a medial shape representation is deformed into the training shapes, followed by computation of the shape mean and modes of shape variation. In the first iteration, a generic shape model is used as the starting point; in the following iterations of the bootstrap method, the resulting mean and modes from the previous iteration are used. Thereby, the shape variation in the training collection is captured progressively better. Convergence of the method is explicitly enforced. The method is evaluated on collections of artificial training shapes where the expected shape mean and modes of variation are known by design. Furthermore, collections of real prostates and cartilage sheets are used in the evaluation. The evaluation shows that the method is able to capture the training shapes close to the attainable accuracy already in the first iteration. Furthermore, the correspondence properties measured by generality, specificity, and compactness are improved during the shape model building iterations.

  18. Analysis of uncompensated phase error on automatic target recognition performance

    NASA Astrophysics Data System (ADS)

    Montagnino, Lee J.; Cassabaum, Mary L.; Halversen, Shawn D.; Rupp, Chad T.; Wagner, Gregory M.; Young, Matthew T.

    2009-05-01

    Performance of Automatic Target Recognition (ATR) algorithms for Synthetic Aperture Radar (SAR) systems relies heavily on the system performance and specifications of the SAR sensor. A representative multi-stage SAR ATR algorithm [1, 2] is analyzed across imagery containing phase errors in the down-range direction induced during the transmission of the radar's waveform. The degradation induced on the SAR imagery by the phase errors is measured in terms of peak phase error, Root-Mean-Square (RMS) phase error, and multiplicative noise. The ATR algorithm consists of three stages: a two-parameter CFAR, a discrimination stage to reduce false alarms, and a classification stage to identify targets in the scene. The end-to-end performance of the ATR algorithm is quantified as a function of the multiplicative noise present in the SAR imagery through Receiver Operating Characteristic (ROC) curves. Results indicate that the performance of the ATR algorithm presented is robust over a 3 dB change in multiplicative noise.

  19. Automatic classification for pathological prostate images based on fractal analysis.

    PubMed

    Huang, Po-Whei; Lee, Cheng-Hsiung

    2009-07-01

    Accurate grading of prostatic carcinoma in pathological images is important to prognosis and treatment planning. Since human grading is time-consuming and subjective, this paper presents a computer-aided system to automatically grade pathological images according to the Gleason grading system, the most widespread method for histological grading of prostate tissues. We propose two feature extraction methods based on fractal dimension to analyze variations of intensity and texture complexity in regions of interest. Each image can be classified into an appropriate grade by using Bayesian, k-NN, and support vector machine (SVM) classifiers, respectively. Leave-one-out and k-fold cross-validation procedures were used to estimate the correct classification rates (CCR). Experimental results show that 91.2%, 93.7%, and 93.7% CCR can be achieved by the Bayesian, k-NN, and SVM classifiers, respectively, for a set of 205 pathological prostate images. If our fractal-based feature set is optimized by the sequential floating forward selection method, the CCR can be improved to 94.6%, 94.2%, and 94.6%, respectively, using each of the above three classifiers. Experimental results also show that our feature set is better than the feature sets extracted from multiwavelets, Gabor filters, and gray-level co-occurrence matrix methods, because it has a much smaller size and still keeps the most powerful discriminating capability in grading prostate images.
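
    A sketch of the quantity underlying such features, the box-counting estimate of fractal dimension; the paper derives fractal dimensions from intensity and texture surfaces, whereas this toy version works on a binary image, and the synthetic test image is an assumption.

```python
import numpy as np

def box_counting_dimension(img):
    """Estimate fractal dimension of a binary image by box counting:
    count occupied boxes N(s) at box sizes s, then fit log N ~ -D log s."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1] and n & (n - 1) == 0  # power of two
    sizes, counts = [], []
    s = n // 2
    while s >= 1:
        occupied = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if img[i:i + s, j:j + s].any():
                    occupied += 1
        sizes.append(s)
        counts.append(occupied)
        s //= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Synthetic test: a filled square should give a dimension close to 2.
img = np.zeros((128, 128), dtype=bool)
img[32:96, 32:96] = True
print(f"estimated dimension: {box_counting_dimension(img):.2f}")
```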

  20. The tool for the automatic analysis of text cohesion (TAACO): Automatic assessment of local, global, and text cohesion.

    PubMed

    Crossley, Scott A; Kyle, Kristopher; McNamara, Danielle S

    2016-12-01

    This study introduces the Tool for the Automatic Analysis of Cohesion (TAACO), a freely available text analysis tool that is easy to use, works on most operating systems (Windows, Mac, and Linux), is housed on a user's hard drive (rather than having an Internet interface), allows for the batch processing of text files, and incorporates over 150 classic and recently developed indices related to text cohesion. The study validates TAACO by investigating how its indices related to local, global, and overall text cohesion can predict expert judgments of text coherence and essay quality. The findings of this study provide predictive validation of TAACO and support the notion that expert judgments of text coherence and quality are either negatively correlated or not predicted by local and overall text cohesion indices, but are positively predicted by global indices of cohesion. Combined, these findings provide supporting evidence that coherence for expert raters is a property of global cohesion and not of local cohesion, and that expert ratings of text quality are positively related to global cohesion.
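
    For flavor, a toy version of one family of measures such a tool computes: local cohesion as lexical overlap between adjacent sentences. The tokenization and the Jaccard overlap used here are simplified assumptions, not TAACO's actual indices.

```python
import re

def sentence_tokens(text):
    """Split text into sentences, each a set of lowercased word tokens."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [set(re.findall(r"[a-z']+", s.lower())) for s in sentences if s]

def local_cohesion(text):
    """Mean lexical overlap (Jaccard) between adjacent sentence pairs."""
    sents = sentence_tokens(text)
    pairs = list(zip(sents, sents[1:]))
    if not pairs:
        return 0.0
    overlaps = [len(a & b) / len(a | b) for a, b in pairs]
    return sum(overlaps) / len(overlaps)

text = ("Cohesion ties sentences together. Cohesive ties include repeated "
        "words and connectives. Connectives signal relations between ideas.")
print(f"local cohesion: {local_cohesion(text):.3f}")
```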

  1. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
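
    A minimal sketch of the forward-mode automatic differentiation idea behind tools like ADIFOR: propagate a derivative alongside each value through the computation. The dual-number class and the toy bar-displacement function are illustrative assumptions, not the paper's structural code (ADIFOR itself operates on Fortran source).

```python
class Dual:
    """Dual number a + b*eps with eps^2 = 0; b carries d/dx exactly."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val / o.val,
                    (self.der * o.val - self.val * o.der) / o.val ** 2)
    def __rtruediv__(self, o):
        return Dual(o) / self

# Toy "structural response": tip displacement of a bar, u = P*L / (E*A).
def displacement(area, P=1000.0, L=2.0, E=200e9):
    return P * L / (E * area)

A = Dual(1e-4, 1.0)  # seed dA/dA = 1 to get the sensitivity w.r.t. area
u = displacement(A)
print(f"u = {u.val:.6e} m, du/dA = {u.der:.6e} m/m^2")
```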

  2. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  3. Automatic Method of Supernovae Classification by Modeling Human Procedure of Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Módolo, Marcelo; Rosa, Reinaldo; Guimaraes, Lamartine N. F.

    2016-07-01

    The classification of a recently discovered supernova must be done as quickly as possible in order to define what information will be captured and analyzed in the following days. This classification is not trivial, and only a few expert astronomers are able to perform it. This paper proposes an automatic method that models the human classification procedure, using multilayer perceptron neural networks to analyze the supernova spectra. Experiments were performed using different pre-processing steps and multiple neural network configurations to identify the classic types of supernovae. Significant results were obtained, indicating the viability of using this method at sites that have no specialist or that require automatic analysis.

  4. System for Automatic Detection and Analysis of Targets in FMICW Radar Signal

    NASA Astrophysics Data System (ADS)

    Rejfek, Luboš; Mošna, Zbyšek; Urbář, Jaroslav; Koucká Knížová, Petra

    2016-01-01

    This paper presents an automatic system for processing signals from a frequency modulated interrupted continuous wave (FMICW) radar and describes methods for primary signal processing. Further, we present methods for detecting targets in strong noise, tested on both real and simulated signals. The real signals were measured using an experimental FMICW radar prototype with an operational frequency of 35.4 GHz, developed at the IAP CAS; the measurement campaign took place at TU Delft, the Netherlands. The results were used to develop the system for the automatic detection and analysis of targets measured by the FMICW radar.

  5. Automatic A-set selection for dynamics analysis

    NASA Technical Reports Server (NTRS)

    Allen, Tom

    1993-01-01

    A method for selecting optimum NASTRAN analysis set degrees of freedom for the dynamic eigenvalue problem is described. Theoretical development of the Guyan reduction procedure on which the method is based is first summarized. The algorithm used to select the analysis set degrees of freedom is then developed. Two example problems are provided to demonstrate the accuracy of the algorithm.

  6. Multiple Regression Analysis Approach To The Automatic Design Of Adaptive Image Processing Systems

    NASA Astrophysics Data System (ADS)

    Otsu, N.

    1984-01-01

    Multiple regression analysis for modeling the correspondence between a set of input variates and an output variate (or a set of variates) is one of the most promising and direct approaches to automatically designing adaptive (or learning) systems for image processing and computer vision. Some approaches are shown with experimental results, such as the automatic design of adaptive filters for image enhancement and restoration given an input image and the desired output image as a pair. The advantage of such an approach is the capability to simulate, in an automatic and general way, the functional "black boxes" (solutions) imposed by real problems regardless of their inner detail, whereas the usual approaches are based on trial-and-error methods in which any proposed method is repeatedly tried and its results checked.
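
    A compact sketch of the regression idea described above: learn a 3×3 linear filter by least squares from an (input image, desired output image) pair. The synthetic images and the purely linear filter model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
clean = rng.random((64, 64))
noisy = clean + rng.normal(scale=0.1, size=clean.shape)  # degraded input

# Build the regression design matrix: one row of 9 neighbours per pixel.
rows, targets = [], []
for i in range(1, 63):
    for j in range(1, 63):
        rows.append(noisy[i - 1:i + 2, j - 1:j + 2].ravel())
        targets.append(clean[i, j])
X, y = np.asarray(rows), np.asarray(targets)

# Least-squares estimate of the 3x3 kernel that best maps noisy -> clean.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
kernel = coeffs.reshape(3, 3)
print("learned kernel:\n", np.round(kernel, 3))
residual = np.sqrt(np.mean((X @ coeffs - y) ** 2))
print(f"training RMS error: {residual:.4f}")
```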

  7. CAD system for automatic analysis of CT perfusion maps

    NASA Astrophysics Data System (ADS)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). The methods perform both quantitative analysis (detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions) and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (deciding whether a lesion is ischemic or hemorrhagic, and whether the brain tissue is at risk of infarction) is performed by so-called cognitive inference processes, allowing reasoning about the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (in the C# programming language) and can be used on any standard PC with the .NET Framework installed.

  8. Content-based analysis of Ki-67 stained meningioma specimens for automatic hot-spot selection.

    PubMed

    Swiderska-Chadaj, Zaneta; Markiewicz, Tomasz; Grala, Bartlomiej; Lorent, Malgorzata

    2016-10-07

    Hot-spot based examination of immunohistochemically stained histological specimens is one of the most important procedures in pathomorphological practice. The development of image acquisition equipment and computational units allows for the automation of this process. Moreover, many technical problems occur in everyday histological material, which increases the complexity of the task, so a full context-based analysis of histological specimens is also needed in the quantification of immunohistochemically stained specimens. One of the most important reactions is the Ki-67 proliferation marker in meningiomas, the most frequent intracranial tumour. The aim of our study is to propose a context-based analysis of Ki-67 stained meningioma specimens for the automatic selection of hot-spots. The proposed solution is based on textural analysis, mathematical morphology, feature ranking, and classification, as well as on the proposed hot-spot gradual extinction algorithm, which allows for the proper detection of a set of hot-spot fields. The designed whole-slide image processing scheme eliminates artifacts such as hemorrhages, folds, or stained vessels from the region of interest. To validate the automatic results, a set of 104 meningioma specimens was selected and twenty hot-spots inside them were identified independently by two experts. The Spearman rho correlation coefficient was used to compare the results, which were also analyzed with the help of a Bland-Altman plot. The results show that most of the cases (84) were examined properly automatically, with at most two fields of view presenting a technical problem; a further 13 had three such fields, and only seven specimens did not meet the requirements for automatic examination. Generally, the automatic system identifies hot-spot areas, especially their maximum points, better. Analysis of the results confirms the very high concordance between an automatic Ki-67 examination and the experts' results, with a Spearman…

  9. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research focused on the computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators, which arise in the local analysis of nonlinear control systems. An initial design of the system architecture was completed for software to analyze nonlinear control systems using database computing.

  10. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  12. Automatic system for brain MRI analysis using a novel combination of fuzzy rule-based and automatic clustering techniques

    NASA Astrophysics Data System (ADS)

    Hillman, Gilbert R.; Chang, Chih-Wei; Ying, Hao; Kent, T. A.; Yen, John

    1995-05-01

    Analysis of magnetic resonance images (MRI) of the brain permits the identification and measurement of brain compartments. These compartments include normal subdivisions of brain tissue, such as gray matter, white matter, and specific structures, and also pathologic lesions associated with stroke or viral infection. A fuzzy system has been developed to analyze images of animal and human brain, segmenting the images into physiologically meaningful regions for display and measurement. This image segmentation system consists of two stages: a fuzzy rule-based system and the fuzzy c-means (FCM) algorithm. The first stage classifies most pixels in MR images into several known classes and one 'unclassified' group that fails to fit the predetermined rules. In the second stage, the system uses the result of the first stage as initial estimates for the properties of the compartments and applies FCM to classify all the previously unclassified pixels, with the initial prototypes estimated from the averages of the previously classified pixels. The combined processes constitute a fast, accurate, and robust image segmentation system that can be applied to many clinical image segmentation problems. While the rule-based portion of the system allows specialized knowledge about the images to be incorporated, the FCM allows the resolution of ambiguities that result from noise and artifacts in the image data. The volumes and locations of the compartments can easily be measured and reported quantitatively once they are identified. It is easy to adapt this approach to new imaging problems by introducing a new set of fuzzy rules and adjusting the number of expected compartments. However, for the purpose of building a practical, fully automatic system, a rule-learning mechanism may be necessary to improve the efficiency of modifying the fuzzy rules.

  13. Development of automatic movement analysis system for a small laboratory animal using image processing

    NASA Astrophysics Data System (ADS)

    Nagatomo, Satoshi; Kawasue, Kikuhito; Koshimoto, Chihiro

    2013-03-01

    Activity analysis of a small laboratory animal is an effective procedure in various bioscience fields. The simplest way to obtain animal activity data is manual observation and recording, although this is labor intensive and rather subjective, while automatic and objective analysis of animal movement usually requires expensive equipment. In the present study, we develop an animal activity analysis system based on a template matching method applied to video-recorded movements of laboratory animals, at low cost.
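
    A minimal sketch of template-matching-based tracking with OpenCV, in the spirit of the system described; the video filename, the initial bounding box, and the path-length activity measure are assumptions, not details from the paper.

```python
import cv2

cap = cv2.VideoCapture("animal.avi")   # hypothetical recording
ok, first = cap.read()
assert ok, "could not read video"
gray0 = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
x, y, w, h = 100, 100, 40, 40          # assumed initial animal box
template = gray0[y:y + h, x:x + w]

positions = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Normalised cross-correlation map; its peak is the best match.
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    positions.append(max_loc)

# Total path length of the matched position as a simple activity measure.
dist = sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
           for a, b in zip(positions, positions[1:]))
print(f"tracked {len(positions)} frames, path length {dist:.1f} px")
cap.release()
```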

  14. Automatic Detection of Military Targets Utilising Neural Networks and Scale Space Analysis

    DTIC Science & Technology

    2001-04-01

    A. Khashman, Chairman, Department of Computer Engineering, Near East University. Formalizes a new approach to edge detection combining scale space analysis and neural networks, based on the Laplacian of the Gaussian (FLoG) edge operator; the result is an automatic edge detection operator [1][2] that addresses, among other issues, high computational cost.

  15. Learning Enterprise Malware Triage from Automatic Dynamic Analysis

    DTIC Science & Technology

    2013-03-01

    (Thesis record; only front matter was captured.) Cited references include "Automated Malware Analysis - Cuckoo Sandbox", Nov 15 2012, http://www.cuckoosandbox.org/, and Hall, Mark; Eibe Frank; Geoffrey Holmes; Bernhard… Report keywords: 'awareness', 'feature generation', 'feature selection', 'malware instruction set', 'n-gram', 'q-gram'. Point of contact: Thomas E. Dube, Maj, USAF (ENG), (937) 255-3636 ext. 4613.

  16. Biosignal analysis to assess mental stress in automatic driving of trucks: palmar perspiration and masseter electromyography.

    PubMed

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-03-02

    Nowadays, insight into human-machine interaction is a critical topic given the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rationally practical use of automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate the mental stress of drivers during automatic driving of trucks, with vehicles set a close gap distance apart to reduce air resistance and save energy. Using two wearable sensor systems, continuous measurement of palmar perspiration and masseter electromyography was realized, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with a gap distance of about 25 m as a reference. Mental stress increased significantly as the gap distance decreased, and an abrupt increase in driver mental stress was also observed following a sudden change of the gap distance during automatic driving, corresponding to significantly higher ride discomfort according to subjective reports.

  18. Automatic quantification of neurite outgrowth by means of image analysis

    NASA Astrophysics Data System (ADS)

    Van de Wouwer, Gert; Nuydens, Rony; Meert, Theo; Weyn, Barbara

    2004-07-01

    A system for the quantification of neurite outgrowth in in-vitro experiments is described. The system is developed for routine use in a high-throughput setting and therefore needs to be fast, cheap, and robust. It relies on automated digital microscopic imaging of microtiter plates, with image analysis applied to extract features characterizing neurite outgrowth. The system is tested in a dose-response experiment on PC12 cells treated with Taxol, and its performance and ability to measure changes in neuronal morphology are studied.

  19. Automatic forensic analysis of automotive paints using optical microscopy.

    PubMed

    Thoonen, Guy; Nys, Bart; Vander Haeghen, Yves; De Roy, Gilbert; Scheunders, Paul

    2016-02-01

    The timely identification of vehicles involved in an accident, such as a hit-and-run situation, bears great importance in forensics. To this end, procedures have been defined for analyzing car paint samples that combine techniques such as visual analysis and Fourier transform infrared spectroscopy. This work proposes a new methodology to automate the visual analysis using image retrieval. Specifically, color and texture information is extracted from a microscopic image of a recovered paint sample, and this information is then compared with the same features for a database of paint types, resulting in a shortlist of candidate paints. To demonstrate the operation of the methodology, a test database was set up and two retrieval experiments were performed. The first experiment quantifies the performance of the procedure for retrieving exact matches, while the second emulates the real-life situation of paint samples that experience changes in color and texture over time.

  20. Automatic Determination of Bacterioplankton Biomass by Image Analysis

    PubMed Central

    Bjørnsen, Peter Koefoed

    1986-01-01

    Image analysis was applied to epifluorescence microscopy of acridine orange-stained plankton samples. A program was developed for discrimination and binary segmentation of digitized video images taken by an ultrasensitive video camera mounted on the microscope. Cell volumes were estimated from the area and perimeter of the objects in the binary image. The program was tested on fluorescent latex beads of known diameters. Biovolumes measured by image analysis were compared with directly determined carbon biomasses in batch cultures of estuarine and freshwater bacterioplankton. This calibration revealed an empirical conversion factor from biovolume to biomass of 0.35 pg of C μm−3 (±0.03, 95% confidence limit). The deviation of this value from the commonly used conversion factors of 0.086 to 0.121 pg of C μm−3 is discussed. The described system was capable of measuring 250 cells within 10 min, providing estimates of cell number, mean cell volume, and biovolume with a precision of 5%. PMID:16347077
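
    The biovolume-to-biomass conversion reported above is a one-line calculation; here is a small sketch using the paper's 0.35 pg C per cubic micrometre factor, with invented per-cell biovolumes standing in for image-analysis output.

```python
CARBON_PER_VOLUME = 0.35  # pg C per um^3, empirical factor from the paper

def biomass_estimate(cell_volumes_um3):
    """Total biomass and mean cell volume from per-cell biovolumes."""
    total_volume = sum(cell_volumes_um3)
    return {
        "cells": len(cell_volumes_um3),
        "mean_volume_um3": total_volume / len(cell_volumes_um3),
        "biomass_pgC": CARBON_PER_VOLUME * total_volume,
    }

# Invented biovolumes for a small sample of bacterioplankton cells.
volumes = [0.05, 0.08, 0.06, 0.12, 0.07]  # um^3
print(biomass_estimate(volumes))
```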

  1. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    PubMed Central

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by manual data acquisition and analysis, which are labor intensive and time consuming. Building on our original design of the biochemical reactions, we propose here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patterns into a microfluidic array. These function patterns provide quantitative information on the characteristic dimensions of the microfluidic array and mark its orientation and origin of coordinates. We used a computer program to perform automatic analysis of a high-throughput antigen/antibody interaction experiment in 10 s, more than 500 times faster than conventional manual processing. Our method is broadly applicable to many other microchannel-based immunoassays. PMID:24404030

  2. Automatic Fatigue Detection of Drivers through Yawning Analysis

    NASA Astrophysics Data System (ADS)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on video analysis of drivers, focusing on the detection of yawning, an important cue for determining driver fatigue. First, the face is located in a video frame using the Viola-Jones face detection method. Then, a mouth window is extracted from the face region, in which the lips are found using spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is computed from mouth features to determine the driver's yawning state. If the yawning state persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and warns the driver through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out with real data, recorded in day and night lighting conditions, and with users of different race and gender.

  3. Automatic analysis of quantitative NMR data of pharmaceutical compound libraries.

    PubMed

    Liu, Xuejun; Kolpak, Michael X; Wu, Jiejun; Leo, Gregory C

    2012-08-07

    In drug discovery, chemical library compounds are usually dissolved in DMSO at a nominal concentration and then distributed to biologists for target screening. Quantitative (1)H NMR (qNMR) is the preferred method for determining the actual concentrations of compounds, because the relative single-proton peak areas of two chemical species represent the relative molar concentrations of the two compounds, i.e., the compound of interest and a calibrant. Thus, an analyte concentration can be determined using a calibration compound at a known concentration. One particularly time-consuming step in the qNMR analysis of compound libraries is the manual integration of peaks. This report presents an automated method for performing this task without prior knowledge of compound structures, using an external calibration spectrum. The script for automated integration is fast and adaptable to large-scale data sets, eliminating the need for manual integration in ~80% of cases.
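
    The underlying qNMR arithmetic can be made concrete: with peak areas normalised per proton, c_analyte = c_cal × (A_analyte / n_analyte) / (A_cal / n_cal). A small worked sketch with invented numbers (the proton counts, areas, and concentration are illustrative, not values from the paper):

```python
def qnmr_concentration(area_analyte, n_h_analyte,
                       area_cal, n_h_cal, conc_cal):
    """Analyte molar concentration from single-proton-normalised areas:
    c_a = c_cal * (A_a / n_a) / (A_cal / n_cal)."""
    return conc_cal * (area_analyte / n_h_analyte) / (area_cal / n_h_cal)

# Invented example: a 3H analyte singlet vs a 2H calibrant peak.
c = qnmr_concentration(area_analyte=2.85, n_h_analyte=3,
                       area_cal=1.00, n_h_cal=2, conc_cal=5.0)  # mM
print(f"analyte concentration: {c:.2f} mM")  # -> 9.50 mM
```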

  4. IMPROMPTU: a system for automatic 3D medical image-analysis.

    PubMed

    Sundaramoorthy, G; Hoford, J D; Hoffman, E A; Higgins, W E

    1995-01-01

    The utility of three-dimensional (3D) medical imaging is hampered by difficulties in extracting anatomical regions and making measurements in 3D images. Presently, a user is generally forced to use time-consuming, subjective, manual methods, such as slice tracing and region painting, to define regions of interest. Automatic image-analysis methods can ameliorate the difficulties of manual methods. This paper describes a graphical user interface (GUI) system for constructing automatic image-analysis processes for 3D medical-imaging applications. The system, referred to as IMPROMPTU, provides a user-friendly environment for prototyping, testing and executing complex image-analysis processes. IMPROMPTU can stand alone or it can interact with an existing graphics-based 3D medical image-analysis package (VIDA), giving a strong environment for 3D image-analysis, consisting of tools for visualization, manual interaction, and automatic processing. IMPROMPTU links to a large library of 1D, 2D, and 3D image-processing functions, referred to as VIPLIB, but a user can easily link in custom-made functions. 3D applications of the system are given for left-ventricular chamber, myocardial, and upper-airway extractions.

  5. Automatic localization of cerebral cortical malformations using fractal analysis

    NASA Astrophysics Data System (ADS)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for voxel-level analysis based on fractal geometry, and define two similarity measures to detect lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs under two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%), and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves effective, making the algorithm suitable for the detection of both focal and diffuse malformations. Compared with existing algorithms, this method shows higher accuracy and sensitivity.

  6. Fully Automatic System for Accurate Localisation and Analysis of Cephalometric Landmarks in Lateral Cephalograms

    PubMed Central

    Lindner, Claudia; Wang, Ching-Wei; Huang, Cheng-Ta; Li, Chung-Hsing; Chang, Sheng-Wei; Cootes, Tim F.

    2016-01-01

    Cephalometric tracing is a standard analysis tool for orthodontic diagnosis and treatment planning. The aim of this study was to develop and validate a fully automatic landmark annotation (FALA) system for finding cephalometric landmarks in lateral cephalograms and its application to the classification of skeletal malformations. Digital cephalograms of 400 subjects (age range: 7–76 years) were available. All cephalograms had been manually traced by two experienced orthodontists with 19 cephalometric landmarks, and eight clinical parameters had been calculated for each subject. A FALA system to locate the 19 landmarks in lateral cephalograms was developed. The system was evaluated via comparison to the manual tracings, and the automatically located landmarks were used for classification of the clinical parameters. The system achieved an average point-to-point error of 1.2 mm, and 84.7% of landmarks were located within the clinically accepted precision range of 2.0 mm. The automatic landmark localisation performance was within the inter-observer variability between two clinical experts. The automatic classification achieved an average classification accuracy of 83.4% which was comparable to an experienced orthodontist. The FALA system rapidly and accurately locates and analyses cephalometric landmarks in lateral cephalograms, and has the potential to significantly improve the clinical work flow in orthodontic treatment. PMID:27645567

  7. Investigation of Ballistic Evidence through an Automatic Image Analysis and Identification System.

    PubMed

    Kara, Ilker

    2016-05-01

    Automated firearms identification (AFI) systems contribute to shedding light on criminal events by comparing different pieces of evidence on cartridge cases and bullets and by matching those fired from the same firearm. Ballistic evidence can be rapidly analyzed and classified by means of an automatic image analysis and identification system, which can also be used to narrow the range of possible matching evidence. In this study, conducted on cartridges ejected from the examined pistols, three imaging areas, namely the firing pin impression, the capsule traces, and the intersection of these traces, were compared automatically using the image analysis and identification system through the correlation ranking method, to determine numeric values that indicate the significance of the similarities. These numerical features, which signify the similarities and differences between pistol makes and models, can be used in groupings to distinguish between makes and models of pistols.

  8. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    NASA Astrophysics Data System (ADS)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for the automatic recognition of a user's emotional state. Compared with audio-visual emotion channels such as facial expression or speech, little attention has been paid so far to physiological signals for emotion recognition. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity, and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, and multiscale entropy, is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is demonstrated by emotion recognition results.
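
    A small sketch of the kind of feature extraction described: simple time-domain statistics plus subband spectral energies for one biosignal channel. The synthetic signal and the particular band limits are illustrative assumptions, not the paper's feature set.

```python
import numpy as np

def biosignal_features(x, fs):
    """Simple time/frequency features for one biosignal channel."""
    feats = {
        "mean": float(np.mean(x)),
        "std": float(np.std(x)),
        "rms": float(np.sqrt(np.mean(x ** 2))),
    }
    # Subband spectral energies from the one-sided power spectrum.
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    for lo, hi in [(0.5, 4), (4, 15), (15, 40)]:  # assumed bands, Hz
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        feats[f"energy_{lo}-{hi}Hz"] = float(band.sum())
    return feats

# Synthetic trace: slow drift plus a 10 Hz component, sampled at 100 Hz.
fs = 100
t = np.arange(0, 10, 1 / fs)
signal = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.1 * np.sin(2 * np.pi * 10 * t)
print(biosignal_features(signal, fs))
```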

  9. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors, and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes, consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis, greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. The application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
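
    A sketch of the kind of dynamic model such a tool generates from a biochemical map: mass-action ODEs integrated numerically. The two-species toy network and its rate constants are invented, and scipy stands in for CADLIVE's own simulator.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mass-action network: S -> P (conversion), P -> degraded.
k_cat, k_deg = 0.8, 0.1

def rhs(t, y):
    s, p = y
    ds = -k_cat * s               # substrate consumed
    dp = k_cat * s - k_deg * p    # product formed, then degraded
    return [ds, dp]

sol = solve_ivp(rhs, t_span=(0, 20), y0=[1.0, 0.0], dense_output=True)
for ti in np.linspace(0, 20, 5):
    s, p = sol.sol(ti)
    print(f"t={ti:5.1f}  S={s:.3f}  P={p:.3f}")
```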

  10. The Automatic Assessment and Reduction of Noise Using Edge Pattern Analysis in Nonlinear Image Enhancement

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-ur; Woodells, Glenn A.; Hines, Glenn D.

    2004-01-01

    Noise is the primary visibility limit in the process of non-linear image enhancement, and is no longer a statistically stable additive noise in the post-enhancement image. Therefore novel approaches are needed to both assess and reduce spatially variable noise at this stage in overall image processing. Here we will examine the use of edge pattern analysis both for automatic assessment of spatially variable noise and as a foundation for new noise reduction methods.

  11. Theoretical Analysis of the Longitudinal Behavior of an Automatically Controlled Supersonic Interceptor During the Attack Phase

    NASA Technical Reports Server (NTRS)

    Gates, Ordway B., Jr.; Woodling, C. H.

    1959-01-01

    A theoretical analysis of the longitudinal behavior of an automatically controlled supersonic interceptor during the attack phase against a nonmaneuvering target is presented. Control of the interceptor's flight path is obtained by use of a pitch rate command system. Topics include lift and pitching moment, effects of initial tracking errors, normal-acceleration limiting, limitations on control-surface rate and deflection, and the effects of neglecting forward velocity changes of the interceptor during the attack phase.

  12. Automatic Assessment and Reduction of Noise using Edge Pattern Analysis in Non-Linear Image Enhancement

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.; Hines, Glenn D.

    2004-01-01

    Noise is the primary visibility limit in the process of non-linear image enhancement, and is no longer a statistically stable additive noise in the post-enhancement image. Therefore novel approaches are needed to both assess and reduce spatially variable noise at this stage in overall image processing. Here we will examine the use of edge pattern analysis both for automatic assessment of spatially variable noise and as a foundation for new noise reduction methods.

  13. [The concordance of manual (visual) scoring and automatic analysis in sleep staging].

    PubMed

    Oztürk, Onder; Mutlu, Levent Cem; Sağcan, Gülseren; Deniz, Yüksel; Cuhadaroğlu, Cağlar

    2009-01-01

    Full-night polysomnography (PSG) remains the gold standard diagnostic test for the evaluation of sleep and the detection of sleep disorders. Computer-assisted scoring methods have been developed to accelerate scoring, and a concordance of up to 80% between these scoring software packages and manual scoring has been reported. In our experience, this claim does not hold. In this study, we examined whether the results of automatic analysis match manual (visual) evaluation. The PSG records of 30 cases with a diagnosis of obstructive sleep apnea syndrome (OSAS) were chosen randomly. We compared the results of automatic analysis with those of two scorers who have a concordance of 80-95% and experience of at least 1000 PSG scorings. We evaluated 21,060 epochs from 18 men aged 48.83 ± 13.51 years and 12 women aged 44.56 ± 14.28 years. In automatic analysis, total sleep time (p = 0.003) and sleep efficiency (p = 0.004) were low, while AHI (p = 0.802) and ODI (p = 0.193) values were high. 8819 epochs (41.88%) were scored differently. Stage I was the stage most often scored differently (88.43%), most frequently being reallocated to wake (572 epochs). Stage II and stage IV were scored as stage III in 2276 and 983 epochs, respectively. REM epochs were allocated to stage II (574 epochs). Differences in the recording times and sleep architecture of PSG tests examined by automatic analysis will affect all other parameters. Thus, we believe that automatic analysis can lead to mistakes in the diagnosis and treatment of sleep disorders.

  14. Automatic analysis of auditory nerve electrically evoked compound action potential with an artificial neural network.

    PubMed

    Charasse, Basile; Thai-Van, Hung; Chanal, Jean Marc; Berger-Vachon, Christian; Collet, Lionel

    2004-07-01

    The auditory nerve's electrically evoked compound action potential is recorded in deaf patients equipped with the Nucleus 24 cochlear implant using a reverse telemetry system (NRT). Since the threshold of the NRT response (NRT-T) is thought to reflect the psychophysics needed for programming cochlear implants, efforts have been made by specialized management teams to develop its use. This study aimed at developing a valid tool, based on artificial neural networks (ANN) technology, for automatic estimation of NRT-T. The ANN used was a single layer perceptron, trained with 120 NRT traces. Learning traces differed from data used for the validation. A total of 550 NRT traces from 11 cochlear implant subjects were analyzed separately by the system and by a group of physicians with expertise in NRT analysis. Both worked to determine 37 NRT-T values, using the response amplitude growth function (AGF) (linear regression of response amplitudes obtained at decreasing stimulus intensity levels). The validity of the system was assessed by comparing the NRT-T values automatically determined by the system with those determined by the physicians. A strong correlation was found between automatic and physician-obtained NRT-T values (Pearson r correlation coefficient >0.9). ANOVA statistics confirmed that automatic NRT-Ts did not differ from physician-obtained values (F = 0.08999, P = 0.03). Moreover, the average error between NRT-Ts predicted by the system and NRT-Ts measured by the physicians (3.6 stimulation units) did not differ significantly from the average error between NRT-Ts measured by each of the three physicians (4.2 stimulation units). In conclusion, the automatic system developed in this study was found to be as efficient as human experts for fitting the amplitude growth function and estimating NRT-T, with the advantage of considerable time-saving.
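
    The amplitude growth function (AGF) method named above reduces to a straight-line fit whose zero crossing gives the threshold estimate. As a rough illustration only (the function name and data below are hypothetical, not taken from the paper), a minimal sketch in Python:

    ```python
    import numpy as np

    def agf_threshold(levels, amplitudes):
        """Estimate a response threshold by extrapolating the amplitude
        growth function (AGF) to zero amplitude.

        levels     -- stimulus intensities (arbitrary stimulation units)
        amplitudes -- measured response amplitudes at those intensities
        """
        slope, intercept = np.polyfit(levels, amplitudes, 1)
        return -intercept / slope  # level where the fitted line crosses zero

    # Hypothetical data: amplitudes grow roughly linearly above threshold.
    levels = np.array([200, 190, 180, 170, 160])
    amps = np.array([95.0, 71.0, 52.0, 28.0, 6.0])
    print(f"Estimated threshold: {agf_threshold(levels, amps):.1f} units")
    ```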

  15. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    NASA Astrophysics Data System (ADS)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strikes a critical area, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effects. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  16. Sleep-monitoring, experiment M133. [Electronic recording system for automatic analysis of human sleep patterns]

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time Awake during first third of night, and decreased total sleep time) during the mission.

  17. SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments

    NASA Technical Reports Server (NTRS)

    Leonard, R. F.

    1977-01-01

    A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations by using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires locating the peaks in X-ray spectra, determining the intensities of the peaks, identifying the origins of the peaks, and determining the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.

  18. Automatic analysis of 2D polyacrylamide gels in the diagnosis of DNA polymorphisms.

    PubMed

    Koprowski, Robert; Wróbel, Zygmunt; Korzyńska, Anna; Chwiałkowska, Karolina; Kwaśniewski, Mirosław

    2013-07-08

    The analysis of polyacrylamide gels is currently carried out manually or automatically. The automatic methods have limitations related to the acceptable degree of distortion of lane and band continuity, and the available software cannot deal satisfactorily with such situations. Therefore, this paper presents an original image analysis method devoid of the aforementioned drawbacks. The paper examines polyacrylamide gel images from a Li-Cor DNA Sequencer 4300S resulting from the electrophoretic separation of DNA fragments. The acquired images have a resolution dependent on the length of the analysed DNA fragments, typically MG×NG = 3806×1027 pixels, and are saved in TIFF format with a grayscale resolution of 16 bits/pixel. The presented image analysis method was performed on gel images resulting from the analysis of DNA methylome profiling in plants exposed to drought stress, carried out with the MSAP (Methylation Sensitive Amplification Polymorphism) technique. The results of DNA polymorphism analysis were obtained in less than one second on an Intel Core™ 2 Quad CPU Q9300 @ 2.5 GHz with 8 GB RAM. In comparison with other known methods, specificity = 0.95, sensitivity = 0.94 and AUC (Area Under Curve) = 0.98. It is possible to carry out this method of DNA polymorphism analysis on distorted images of polyacrylamide gels. The method is fully automatic and does not require any operator intervention. Compared with other methods, it produces the best results and the resulting image is easy to interpret. The presented method of measurement is used in the practical analysis of polyacrylamide gels in the Department of Genetics at the University of Silesia in Katowice, Poland.

  19. Automatic analysis of 2D polyacrylamide gels in the diagnosis of DNA polymorphisms

    PubMed Central

    2013-01-01

    Introduction The analysis of polyacrylamide gels is currently carried out manually or automatically. The automatic methods have limitations related to the acceptable degree of distortion of lane and band continuity, and the available software cannot deal satisfactorily with such situations. Therefore, this paper presents an original image analysis method devoid of the aforementioned drawbacks. Material This paper examines polyacrylamide gel images from a Li-Cor DNA Sequencer 4300S resulting from the electrophoretic separation of DNA fragments. The acquired images have a resolution dependent on the length of the analysed DNA fragments, typically MG×NG = 3806×1027 pixels, and are saved in TIFF format with a grayscale resolution of 16 bits/pixel. The presented image analysis method was performed on gel images resulting from the analysis of DNA methylome profiling in plants exposed to drought stress, carried out with the MSAP (Methylation Sensitive Amplification Polymorphism) technique. Results The results of DNA polymorphism analysis were obtained in less than one second on an Intel Core™ 2 Quad CPU Q9300 @ 2.5 GHz with 8 GB RAM. In comparison with other known methods, specificity = 0.95, sensitivity = 0.94 and AUC (Area Under Curve) = 0.98. Conclusions It is possible to carry out this method of DNA polymorphism analysis on distorted images of polyacrylamide gels. The method is fully automatic and does not require any operator intervention. Compared with other methods, it produces the best results and the resulting image is easy to interpret. The presented method of measurement is used in the practical analysis of polyacrylamide gels in the Department of Genetics at the University of Silesia in Katowice, Poland. PMID:23835039

  20. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

    This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25 % correct classification rate, which is very promising and underpins the interest in this line of inquiry.
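
    For context on the GMM-based modelling mentioned above, here is a minimal, hypothetical sketch (synthetic feature frames stand in for the study's speech database, and the component count is an assumption) of two Gaussian Mixture Models used as a likelihood-ratio classifier:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Hypothetical MFCC-like feature frames for the two populations.
    healthy = rng.normal(0.0, 1.0, size=(500, 12))
    apnoea = rng.normal(0.6, 1.2, size=(500, 12))

    gmm_h = GaussianMixture(n_components=8, covariance_type="diag").fit(healthy)
    gmm_a = GaussianMixture(n_components=8, covariance_type="diag").fit(apnoea)

    def classify(frames):
        # Average per-frame log-likelihood ratio between the two models.
        llr = gmm_a.score(frames) - gmm_h.score(frames)
        return "apnoea" if llr > 0 else "healthy"

    test = rng.normal(0.6, 1.2, size=(200, 12))  # unseen apnoea-like frames
    print(classify(test))
    ```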

  1. Automatic computerized analysis of heart rate variability with digital filtering of ectopic beats.

    PubMed

    Storck, N; Ericson, M; Lindblad, L; Jensen-Urstad, M

    2001-01-01

    Analysis of heart rate variability (HRV) has been used in studies of autonomic function and risk assessment in different patient groups such as in patients with diabetes mellitus, after myocardial infarction (MI) and other cardiovascular disease. Ectopic beats can, however, interfere with HRV analysis and give erroneous results. We have therefore studied the impact of ectopic beats on HRV analysis and the ability of a filter algorithm to correct this. Power spectral analysis of synthetic data with an increasing proportion of ectopic beats and 24-h Holter recordings from 98 healthy subjects and 93 post MI patients was done with and without digital filtering and interpolation of errors in the data stream. The analysis of HRV was seriously hampered by less than 1% of ectopic beats. A filter algorithm based on detection and linear interpolation of ectopic beats and other noise in the data stream corrected effectively for this in the synthetic data employed. In the healthy subjects and the post MI patients, filtering markedly reduced the extra variability related to non-normal beats. The software could automatically analyse over one hundred 24-h files in one batch. HRV analysis should include filtering for ectopic beats even with a small number of such beats. It is possible to make a fast analysis automatically even in huge clinical series, which makes it possible to use the method both clinically and in epidemiological studies.
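
    The filtering step described above (detection of ectopic beats followed by linear interpolation over them) can be sketched in a few lines. The median-based detector and the tolerance value are illustrative assumptions, not the authors' algorithm:

    ```python
    import numpy as np

    def filter_ectopics(rr, tol=0.3):
        """Replace RR intervals deviating more than `tol` (fractional)
        from the series median with linearly interpolated values."""
        rr = np.asarray(rr, dtype=float)
        med = np.median(rr)
        bad = np.abs(rr - med) > tol * med          # flag ectopics/noise
        good_idx = np.flatnonzero(~bad)
        rr_clean = rr.copy()
        rr_clean[bad] = np.interp(np.flatnonzero(bad), good_idx, rr[good_idx])
        return rr_clean

    rr = [800, 810, 790, 400, 1200, 805, 795]  # ms; ectopic couplet inside
    print(filter_ectopics(rr))
    ```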

  2. Analysis of Social Variables when an Initial Functional Analysis Indicates Automatic Reinforcement as the Maintaining Variable for Self-Injurious Behavior

    ERIC Educational Resources Information Center

    Kuhn, Stephanie A. Contrucci; Triggs, Mandy

    2009-01-01

    Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…

  3. Automaticity in acute ischemia: Bifurcation analysis of a human ventricular model

    NASA Astrophysics Data System (ADS)

    Bouchard, Sylvain; Jacquemet, Vincent; Vinet, Alain

    2011-01-01

    Acute ischemia (restriction in blood supply to part of the heart as a result of myocardial infarction) induces major changes in the electrophysiological properties of the ventricular tissue. Extracellular potassium concentration ([K+]o) increases in the ischemic zone, leading to an elevation of the resting membrane potential that creates an “injury current” (IS) between the infarcted and the healthy zone. In addition, the lack of oxygen impairs the metabolic activity of the myocytes and decreases ATP production, thereby affecting ATP-sensitive potassium channels (IKatp). Frequent complications of myocardial infarction are tachycardia, fibrillation, and sudden cardiac death, but the mechanisms underlying their initiation are still debated. One hypothesis is that these arrhythmias may be triggered by abnormal automaticity. We investigated the effect of ischemia on myocyte automaticity by performing a comprehensive bifurcation analysis (fixed points, cycles, and their stability) of a human ventricular myocyte model [K. H. W. J. ten Tusscher and A. V. Panfilov, Am. J. Physiol. Heart Circ. Physiol. 291, H1088 (2006), doi:10.1152/ajpheart.00109.2006] as a function of three ischemia-relevant parameters: [K+]o, IS, and IKatp. In this single-cell model, we found that automatic activity was possible only in the presence of an injury current. Changes in [K+]o and IKatp significantly altered the bifurcation structure of IS, including the occurrence of early afterdepolarizations. The results provide a sound basis for studying higher-dimensional tissue structures representing an ischemic heart.

  4. Texture analysis for automatic segmentation of intervertebral disks of scoliotic spines from MR images.

    PubMed

    Chevrefils, Claudia; Cheriet, Farida; Aubin, Carl-Eric; Grimard, Guy

    2009-07-01

    This paper presents a unified framework for automatic segmentation of intervertebral disks of scoliotic spines from different types of magnetic resonance (MR) image sequences. The method exploits a combination of statistical and spectral texture features to discriminate closed regions representing intervertebral disks from background in MR images of the spine. Specific texture features are evaluated for three types of MR sequences acquired in the sagittal plane: 2-D spin echo, 3-D multiecho data image combination, and 3-D fast imaging with steady state precession. A total of 22 texture features (18 statistical and 4 spectral) are extracted from every closed region obtained from an automatic segmentation procedure based on the watershed approach. A feature selection step based on principal component analysis and clustering determines which of the extracted features result in the highest rate of correct classification. The proposed method is validated using a supervised k-nearest-neighbor classifier on 505 MR images from three different scoliotic patients and three different MR acquisition protocols. Results suggest that the selected texture features and classification can contribute to solving the problem of oversegmentation inherent in existing automatic segmentation methods by successfully discriminating intervertebral disks from the background in MRI of scoliotic spines.
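
    To make the pipeline concrete (dimensionality reduction on texture features followed by a supervised k-NN classifier), here is a hypothetical sketch with synthetic vectors standing in for the 22 texture features; it illustrates the classification stage only, not the watershed segmentation or the paper's exact feature selection:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    # Hypothetical 22-dimensional texture feature vectors per candidate
    # region, labelled 1 for "intervertebral disk" and 0 for "background".
    X = rng.normal(size=(505, 22))
    y = (X[:, :4].sum(axis=1) > 0).astype(int)  # synthetic separable labels

    clf = make_pipeline(PCA(n_components=8), KNeighborsClassifier(n_neighbors=5))
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```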

  5. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    NASA Astrophysics Data System (ADS)

    Fischer, Bernd; Hajian, Arsen; Knuth, Kevin; Schumann, Johann

    2004-04-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in the form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism that enables problem decomposition and guides the algorithm derivation. Program schemas are used as independently composable building blocks for the algorithm construction; they can encapsulate advanced algorithms and data structures. A symbolic-algebraic system is used to find closed-form solutions for problems and emerging subproblems. In this paper, we describe the application of AUTOBAYES to the analysis of planetary nebulae images taken by the Hubble Space Telescope. We explain the system architecture and present in detail the automatic derivation of the scientists' original analysis as well as a refined analysis using clustering models. This study demonstrates that AUTOBAYES is now mature enough to be applied to realistic scientific data analysis tasks.

  6. Automatic Segmentation and Quantitative Analysis of the Articular Cartilages From Magnetic Resonance Images of the Knee

    PubMed Central

    Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K.; Ourselin, Sébastien

    2010-01-01

    In this paper, we present a segmentation scheme that automatically and accurately segments all the cartilages from magnetic resonance (MR) images of nonpathological knees. Our scheme involves the automatic segmentation of the bones using a three-dimensional active shape model, the extraction of the expected bone-cartilage interface (BCI), and cartilage segmentation from the BCI using a deformable model that utilizes localization, patient specific tissue estimation and a model of the thickness variation. The accuracy of this scheme was experimentally validated using leave one out experiments on a database of fat suppressed spoiled gradient recall MR images. The scheme was compared to three state of the art approaches, tissue classification, a modified semi-automatic watershed algorithm and nonrigid registration (B-spline based free form deformation). Our scheme obtained an average Dice similarity coefficient (DSC) of (0.83, 0.83, 0.85) for the (patellar, tibial, femoral) cartilages, while (0.82, 0.81, 0.86) was obtained with a tissue classifier and (0.73, 0.79, 0.76) was obtained with nonrigid registration. The average DSC obtained for all the cartilages using a semi-automatic watershed algorithm (0.90) was slightly higher than our approach (0.89), however unlike this approach we segment each cartilage as a separate object. The effectiveness of our approach for quantitative analysis was evaluated using volume and thickness measures with a median volume difference error of (5.92, 4.65, 5.69) and absolute Laplacian thickness difference of (0.13, 0.24, 0.12) mm. PMID:19520633

  7. Automatic segmentation and quantitative analysis of the articular cartilages from magnetic resonance images of the knee.

    PubMed

    Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K; Ourselin, Sébastien

    2010-01-01

    In this paper, we present a segmentation scheme that automatically and accurately segments all the cartilages from magnetic resonance (MR) images of nonpathological knees. Our scheme involves the automatic segmentation of the bones using a three-dimensional active shape model, the extraction of the expected bone-cartilage interface (BCI), and cartilage segmentation from the BCI using a deformable model that utilizes localization, patient specific tissue estimation and a model of the thickness variation. The accuracy of this scheme was experimentally validated using leave one out experiments on a database of fat suppressed spoiled gradient recall MR images. The scheme was compared to three state of the art approaches, tissue classification, a modified semi-automatic watershed algorithm and nonrigid registration (B-spline based free form deformation). Our scheme obtained an average Dice similarity coefficient (DSC) of (0.83, 0.83, 0.85) for the (patellar, tibial, femoral) cartilages, while (0.82, 0.81, 0.86) was obtained with a tissue classifier and (0.73, 0.79, 0.76) was obtained with nonrigid registration. The average DSC obtained for all the cartilages using a semi-automatic watershed algorithm (0.90) was slightly higher than our approach (0.89), however unlike this approach we segment each cartilage as a separate object. The effectiveness of our approach for quantitative analysis was evaluated using volume and thickness measures with a median volume difference error of (5.92, 4.65, 5.69) and absolute Laplacian thickness difference of (0.13, 0.24, 0.12) mm.
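
    The Dice similarity coefficient reported in both records above is defined as DSC(A, B) = 2|A ∩ B| / (|A| + |B|). A minimal sketch for binary masks (the masks below are synthetic, for illustration only):

    ```python
    import numpy as np

    def dice(a, b):
        """Dice similarity coefficient between two binary masks."""
        a = a.astype(bool)
        b = b.astype(bool)
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    auto = np.zeros((64, 64), bool); auto[20:40, 20:40] = True
    manual = np.zeros((64, 64), bool); manual[22:42, 22:42] = True
    print(f"DSC = {dice(auto, manual):.3f}")
    ```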

  8. NGS-Trex: an automatic analysis workflow for RNA-Seq data.

    PubMed

    Boria, Ilenia; Boatti, Lara; Saggese, Igor; Mignone, Flavio

    2015-01-01

    RNA-Seq technology allows the rapid analysis of whole transcriptomes, taking advantage of next-generation sequencing platforms. Moreover, with the constant decrease in the cost of NGS analysis, RNA-Seq is becoming very popular and widespread. Unfortunately, data analysis is quite demanding in terms of the bioinformatic skills and infrastructure required, thus limiting the potential users of this method. Here we describe the complete analysis of sample data from raw sequences to data mining of results using the NGS-Trex platform, a low-user-interaction, fully automatic analysis workflow. Used through a web interface, NGS-Trex processes data and profiles the transcriptome of the samples, identifying expressed genes, transcripts, and new and known splice variants. It also detects differentially expressed genes and transcripts across different experiments.

  9. Fully automatic algorithm for the analysis of vessels in the angiographic image of the eye fundus.

    PubMed

    Koprowski, Robert; Teper, Sławomir Jan; Węglarz, Beata; Wylęgała, Edward; Krejca, Michał; Wróbel, Zygmunt

    2012-06-22

    The available scientific literature contains descriptions of manual, semi-automated and automated methods for analysing angiographic images. The presented algorithms segment vessels calculating their tortuosity or number in a given area. We describe a statistical analysis of the inclination of the vessels in the fundus as related to their distance from the center of the optic disc. The paper presents an automated method for analysing vessels which are found in angiographic images of the eye using a Matlab implemented algorithm. It performs filtration and convolution operations with suggested masks. The result is an image containing information on the location of vessels and their inclination angle in relation to the center of the optic disc. This is a new approach to the analysis of vessels whose usefulness has been confirmed in the diagnosis of hypertension. The proposed algorithm analyzed and processed the images of the eye fundus using a classifier in the form of decision trees. It enabled the proper classification of healthy patients and those with hypertension. The result is a very good separation of healthy subjects from the hypertensive ones: sensitivity - 83%, specificity - 100%, accuracy - 96%. This confirms a practical usefulness of the proposed method. This paper presents an algorithm for the automatic analysis of morphological parameters of the fundus vessels. Such an analysis is performed during fluorescein angiography of the eye. The presented algorithm automatically calculates the global statistical features connected with both tortuosity of vessels and their total area or their number.

  10. Fully automatic algorithm for the analysis of vessels in the angiographic image of the eye fundus

    PubMed Central

    2012-01-01

    Background The available scientific literature contains descriptions of manual, semi-automated and automated methods for analysing angiographic images. The presented algorithms segment vessels calculating their tortuosity or number in a given area. We describe a statistical analysis of the inclination of the vessels in the fundus as related to their distance from the center of the optic disc. Methods The paper presents an automated method for analysing vessels which are found in angiographic images of the eye using a Matlab implemented algorithm. It performs filtration and convolution operations with suggested masks. The result is an image containing information on the location of vessels and their inclination angle in relation to the center of the optic disc. This is a new approach to the analysis of vessels whose usefulness has been confirmed in the diagnosis of hypertension. Results The proposed algorithm analyzed and processed the images of the eye fundus using a classifier in the form of decision trees. It enabled the proper classification of healthy patients and those with hypertension. The result is a very good separation of healthy subjects from the hypertensive ones: sensitivity - 83%, specificity - 100%, accuracy - 96%. This confirms a practical usefulness of the proposed method. Conclusions This paper presents an algorithm for the automatic analysis of morphological parameters of the fundus vessels. Such an analysis is performed during fluorescein angiography of the eye. The presented algorithm automatically calculates the global statistical features connected with both tortuosity of vessels and their total area or their number. PMID:22727245

  11. Neural network approach for automatic image analysis of cutting edge wear

    NASA Astrophysics Data System (ADS)

    Mikołajczyk, T.; Nowicki, K.; Kłodowski, A.; Pimenov, D. Yu.

    2017-05-01

    This study describes an image processing system based on an artificial neural network to estimate tool wear. A Single Category-Based Classifier neural network was used to process tool image data. We present a method to determine the rate of tool wear based on image analysis, and discuss the evaluation of errors. Using the proposed algorithm, we implemented the Neural Wear software in Visual Basic for analysis of the worn part of the cutting edge. For an example image of a worn edge, the optimum settings of the Neural Wear software were determined to automatically indicate the wear area; the result of the analysis was the number of pixels belonging to the worn area. Using these settings, we performed an image analysis of edge wear for different working times and calculated the correlation between the number of pixels and the VB index. Our results show a good correlation between the new method and the commonly used optically measured VB index, with an absolute mean relative error of 6.7% over the tools' entire life range. Automatic detection of cutting edge wear can be useful in many applications; for example, in predicting tool life based on the current value of edge wear.

  12. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis

    PubMed Central

    Haderlein, Tino; Schwemmle, Cornelia; Döllinger, Michael; Matoušek, Václav; Ptok, Martin; Nöth, Elmar

    2015-01-01

    Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation of the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men; 48.7 ± 17.8 years) containing the German version of the text “The North Wind and the Sun” were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) of the Laryngograph and 33 features from a prosodic analysis module were used to model the listeners' ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx (r = 0.71, ρ = 0.57). These correlations were approximately the same as the interrater agreement among human raters (r = 0.65, ρ = 0.61). CQx was one of the substantial features of the hoarseness model. For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for a meaningful objective support for perceptual analysis. PMID:26136813
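
    A hypothetical sketch of the human-machine modelling step (Support Vector Regression from acoustic features to perceptual ratings, evaluated by Pearson's r and Spearman's rho), with synthetic data standing in for the Laryngograph measurements and prosodic features:

    ```python
    import numpy as np
    from scipy.stats import pearsonr, spearmanr
    from sklearn.model_selection import cross_val_predict
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)
    # Synthetic stand-ins: 58 samples, 7 features, target = mean rating.
    X = rng.normal(size=(58, 7))
    y = X @ rng.normal(size=7) + rng.normal(scale=0.5, size=58)

    # Cross-validated predictions approximate agreement with unseen ratings.
    pred = cross_val_predict(SVR(kernel="rbf", C=1.0), X, y, cv=10)
    print(f"r = {pearsonr(pred, y)[0]:.2f}, rho = {spearmanr(pred, y)[0]:.2f}")
    ```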

  13. Seismpol, a Visual Basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used in either interactive earthquake analysis or automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. The program uses a band-pass Butterworth filter to process the signals in the frequency domain, decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
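
    The Covariance Matrix Decomposition step lends itself to a compact sketch: over one time window, the eigenstructure of the 3×3 covariance matrix of the three components yields rectilinearity and azimuth. The formulas follow common Jurkevics-style definitions and the data are synthetic; this is an illustration, not the program's code:

    ```python
    import numpy as np

    def polarization(z, n, e):
        """Covariance Matrix Decomposition of one window of a
        three-component record (vertical, north, east)."""
        C = np.cov(np.vstack([z, n, e]))
        w, v = np.linalg.eigh(C)           # eigenvalues in ascending order
        l3, l2, l1 = w
        u = v[:, 2]                        # principal polarization direction
        rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
        azimuth = np.degrees(np.arctan2(u[2], u[1]))  # from north, toward east
        return rectilinearity, azimuth % 360

    t = np.linspace(0, 1, 200)
    z = np.sin(2 * np.pi * 10 * t)         # strongly rectilinear test motion
    n, e = 0.8 * z, 0.3 * z
    print(polarization(z, n, e))
    ```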

  14. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    SciTech Connect

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.; Jin, Shuangshuang; Lin, Jenglung; Hauer, Matthew L.

    2010-02-28

    Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner, so that mode estimation can be performed reliably. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
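
    As background, classical Prony analysis reduces to a linear-prediction least-squares problem whose characteristic roots give modal damping and frequency. A minimal, non-recursive sketch on noiseless synthetic ringdown data (the paper's recursive formulation and detection logic are not reproduced here):

    ```python
    import numpy as np

    def prony_modes(y, dt, order=4):
        """Estimate damped-oscillation modes from ringdown data via the
        classical Prony (linear prediction) method."""
        N = len(y)
        # Fit y[n] = c1*y[n-1] + ... + cp*y[n-p] in the least-squares sense.
        A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
        c = np.linalg.lstsq(A, y[order:], rcond=None)[0]
        roots = np.roots(np.concatenate(([1.0], -c)))
        s = np.log(roots) / dt                   # continuous-time poles
        return s.real, s.imag / (2 * np.pi)      # damping [1/s], freq [Hz]

    dt = 0.05
    t = np.arange(0, 10, dt)
    y = np.exp(-0.2 * t) * np.cos(2 * np.pi * 0.8 * t)  # one 0.8 Hz mode
    sigma, f = prony_modes(y, dt, order=2)
    print(sigma, f)   # expect damping near -0.2 and frequency near +/-0.8
    ```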

  15. Automatic generation of stop word lists for information retrieval and analysis

    DOEpatents

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
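
    The procedure above is easy to paraphrase in code. This sketch is a loose illustration of the adjacency-to-frequency ratio test; the tokenization, threshold and truncation policy are assumptions for the example, not the patent's claims:

    ```python
    from collections import Counter

    def stop_words(documents, keywords, min_ratio=0.5, max_size=100):
        """Keep terms that frequently occur adjacent to known keywords;
        terms whose adjacency-to-frequency ratio falls below `min_ratio`
        are excluded, and the remaining list is truncated."""
        keywords = set(keywords)
        term_freq, adj_freq = Counter(), Counter()
        for doc in documents:
            tokens = doc.lower().split()
            for i, tok in enumerate(tokens):
                term_freq[tok] += 1
                neighbours = tokens[max(i - 1, 0):i] + tokens[i + 1:i + 2]
                if any(nb in keywords for nb in neighbours):
                    adj_freq[tok] += 1
        candidates = [t for t in term_freq
                      if t not in keywords
                      and adj_freq[t] / term_freq[t] >= min_ratio]
        candidates.sort(key=lambda t: term_freq[t], reverse=True)
        return candidates[:max_size]

    # Tiny corpus, illustrative only.
    docs = ["the neutron flux in the reactor core",
            "a neutron source near the reactor vessel"]
    print(stop_words(docs, keywords={"neutron", "reactor"}))
    ```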

  16. Boron-10 concentration measurements using CR-39 and automatic image analysis

    SciTech Connect

    Blue, T.E.; Roberts, T.C.; Barth, R.F.

    1986-01-01

    As part of a study of the effectiveness of ¹⁰B as a capture agent for neutron capture therapy, the ¹⁰B concentrations of samples of tissue taken from brains, from tumors within the brains, and from the blood of rats are being measured following administration of pharmaceuticals. The measurements are made with the polycarbonate solid-state nuclear track detector (SSNTD) CR-39, using an autoradiographic procedure and an image analysis system for automatic track counting. Boron-10 concentrations in tissue have been measured by others using autoradiography and Lexan and cellulose nitrate SSNTDs. This paper reports our procedure for measuring ¹⁰B concentrations using CR-39.

  17. An integrated exhaust gas analysis system with self-contained data processing and automatic calibration

    NASA Technical Reports Server (NTRS)

    Anderson, R. C.; Summers, R. L.

    1981-01-01

    An integrated gas analysis system designed to operate in automatic, semiautomatic, and manual modes from a remote control panel is described. The system measures carbon monoxide, oxygen, water vapor, total hydrocarbons, carbon dioxide, and oxides of nitrogen. A pull-through design provides increased reliability and eliminates the need for manual flow rate adjustment and pressure correction. The system contains two microprocessors to range the analyzers, calibrate the system, process the raw data into units of concentration, and provide information to the facility research computer and to the operator through a terminal and the control panels. After initial setup, the system operates for several hours without significant operator attention.

  18. An automatic ocular artifacts removal method based on wavelet-enhanced canonical correlation analysis.

    PubMed

    Zhao, Chunyu; Qiu, Tianshuang

    2011-01-01

    In this paper, a new method for automatic ocular artifact (OA) removal in EEG recordings is proposed, based on wavelet-enhanced canonical correlation analysis (wCCA). Compared to three popular ocular artifact removal methods, wCCA offers two advantages. First, there is no need to identify the artifact components by subjective visual inspection, because the first canonical component found by CCA for each dataset (the component most common to the left and right hemispheres) is reliably related to artifacts. Second, quantitative evaluation of the corrected EEG signals demonstrates that wCCA removes the most ocular artifact with minimal loss of cerebral information.
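
    Leaving aside the wavelet enhancement stage, the CCA part of the idea can be sketched as follows: compute the first canonical pair between left- and right-hemisphere channel groups and regress it out of each group. The channel grouping and data here are hypothetical, and this plain-CCA sketch is not the authors' wCCA implementation:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def remove_first_canonical(left, right):
        """Remove the component most shared between the two channel
        groups (the first canonical pair), here treated as the artifact."""
        cca = CCA(n_components=1).fit(left, right)
        u, v = cca.transform(left, right)      # first canonical variates
        # Regress each channel on its group's canonical variate and
        # subtract the fitted contribution.
        left_clean = left - u @ np.linalg.lstsq(u, left, rcond=None)[0]
        right_clean = right - v @ np.linalg.lstsq(v, right, rcond=None)[0]
        return left_clean, right_clean

    rng = np.random.default_rng(3)
    blink = rng.normal(size=(1000, 1))          # shared artifact source
    left = rng.normal(size=(1000, 4)) + 3 * blink
    right = rng.normal(size=(1000, 4)) + 3 * blink
    l, r = remove_first_canonical(left, right)
    print(np.corrcoef(l.mean(1), blink.ravel())[0, 1])  # near 0 after cleaning
    ```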

  19. Semi-automatic detection of skin malformations by analysis of spectral images

    NASA Astrophysics Data System (ADS)

    Rubins, U.; Spigulis, J.; Valeine, L.; Berzina, A.

    2013-06-01

    A multispectral imaging technique to reveal skin malformations is described in this work. Four spectral images taken under polarized monochromatic LED illumination (450 nm, 545 nm, 660 nm and 940 nm), together with an image under polarized white LED light, captured by a CMOS sensor through a cross-oriented polarizing filter, were analyzed to calculate chromophore maps. The algorithm, based on skin color analysis and user-defined threshold selection, allows semi-automatic highlighting of skin areas with a predefined chromophore concentration. Preliminary results of clinical tests are presented.

  20. Comparison of an automatic analysis and a manual analysis of conjunctival microcirculation in a sheep model of haemorrhagic shock.

    PubMed

    Arnemann, Philip-Helge; Hessler, Michael; Kampmeier, Tim; Morelli, Andrea; Van Aken, Hugo Karel; Westphal, Martin; Rehberg, Sebastian; Ertmer, Christian

    2016-12-01

    Life-threatening diseases of critically ill patients are known to derange microcirculation. Automatic analysis of microcirculation would provide a bedside diagnostic tool for microcirculatory disorders and allow immediate therapeutic decisions based upon microcirculation analysis. After induction of general anaesthesia and instrumentation for haemodynamic monitoring, haemorrhagic shock was induced in ten female sheep by stepwise blood withdrawal of 3 × 10 mL per kilogram body weight. Before and after the induction of haemorrhagic shock, haemodynamic variables, samples for blood gas analysis, and videos of conjunctival microcirculation were obtained by incident dark field illumination microscopy. Microcirculatory videos were analysed (1) manually with AVA software version 3.2 by an experienced user and (2) automatically by AVA software version 4.2 for total vessel density (TVD), perfused vessel density (PVD) and proportion of perfused vessels (PPV). Correlation between the two analysis methods was examined by intraclass correlation coefficient and Bland-Altman analysis. The induction of haemorrhagic shock decreased the mean arterial pressure (from 87 ± 11 to 40 ± 7 mmHg; p < 0.001); stroke volume index (from 38 ± 14 to 20 ± 5 mL·m⁻²; p = 0.001) and cardiac index (from 2.9 ± 0.9 to 1.8 ± 0.5 L·min⁻¹·m⁻²; p < 0.001) and increased the heart rate (from 72 ± 9 to 87 ± 11 bpm; p < 0.001) and lactate concentration (from 0.9 ± 0.3 to 2.0 ± 0.6 mmol·L⁻¹; p = 0.001). Manual analysis showed no change in TVD (17.8 ± 4.2 to 17.8 ± 3.8 mm·mm⁻²; p = 0.993), whereas PVD (from 15.6 ± 4.6 to 11.5 ± 6.5 mm·mm⁻²; p = 0.041) and PPV (from 85.9 ± 11.8 to 62.7 ± 29.6%; p = 0.017) decreased significantly. Automatic analysis was not able to identify these changes. Correlation analysis showed a poor correlation between the analysis methods and a wide

  1. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    PubMed

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas and of the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are widely used. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software package, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10⁴. We show that the accuracy of isotope ratios is in fact proportional to SNR⁻¹. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios scales with the square root of the sample width Ts. Copyright © 2017 John Wiley & Sons, Ltd.
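
    The core "robust peak finding plus numerical integration" step can be illustrated with SciPy primitives. The parameters and the synthetic spectrum below are assumptions for the sketch, not the authors' algorithm:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def integrate_peaks(t, y, prominence=5.0, width=3):
        """Locate peaks in a spectrum and integrate each one numerically
        between the bases that the prominence calculation reports."""
        peaks, props = find_peaks(y, prominence=prominence, width=width)
        dt = t[1] - t[0]
        areas = np.array([y[lo:hi + 1].sum() * dt
                          for lo, hi in zip(props["left_bases"],
                                            props["right_bases"])])
        return peaks, areas

    t = np.linspace(0, 10, 2000)
    y = (100 * np.exp(-(t - 3) ** 2 / 0.001)        # two synthetic peaks
         + 50 * np.exp(-(t - 7) ** 2 / 0.001)
         + np.random.default_rng(6).normal(scale=0.5, size=t.size))
    peaks, areas = integrate_peaks(t, y)
    print(t[peaks], areas)
    ```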

  2. Urban land use of the Sao Paulo metropolitan area by automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Niero, M.; Foresti, C.

    1983-01-01

    The separability of urban land use classes in the metropolitan area of Sao Paulo was studied by means of automatic analysis of MSS/LANDSAT digital data. The data were analyzed using the K-means ("média K") and maximum likelihood (MAXVER) classification algorithms. The land use classes obtained were: CBD/vertical growth area, residential area, mixed area, industrial area, embankment area type 1, embankment area type 2, dense vegetation area and sparse vegetation area. The spectral analysis of representative samples of urban land use classes was done using the "Single Cell" analysis option. The classes CBD/vertical growth area, residential area and embankment area type 2 showed better spectral separability when compared to the other classes.

  3. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    This paper presents a signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective method for automatic bearing fault detection and gives a good basis for an integrated induction machine condition monitor.
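
    In the spirit of the kurtogram (scanning frequency bands for the most impulsive band-limited envelope), a crude fixed-grid scan can be written as below. The real kurtogram uses a dyadic filter bank, so this is only an illustration with assumed parameters and a synthetic fault signal:

    ```python
    import numpy as np
    from scipy.signal import butter, hilbert, sosfiltfilt
    from scipy.stats import kurtosis

    def best_band(x, fs, n_bands=8):
        """Band-pass the signal over a bank of bands and report the band
        whose envelope is most impulsive (highest kurtosis)."""
        edges = np.linspace(1e-3, 0.49, n_bands + 1) * fs
        scores = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
            env = np.abs(hilbert(sosfiltfilt(sos, x)))   # envelope
            scores.append(kurtosis(env))
        k = int(np.argmax(scores))
        return (edges[k], edges[k + 1]), scores[k]

    fs = 2000
    t = np.arange(0, 2, 1 / fs)
    # Repetitive bursts of a 600 Hz resonance, as a bearing-fault stand-in.
    impacts = np.sin(2 * np.pi * 600 * t) * (np.sin(2 * np.pi * 5 * t) > 0.99)
    x = impacts + 0.3 * np.random.default_rng(4).normal(size=t.size)
    print(best_band(x, fs))   # expect the band containing 600 Hz
    ```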

  4. Automatic network coupling analysis for dynamical systems based on detailed kinetic models.

    PubMed

    Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich

    2005-10-01

    We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
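
    The dominant-mode count at the heart of the method can be illustrated by a singular value decomposition of a sensitivity matrix, thresholding the normalized singular values. The matrix and tolerance below are synthetic assumptions, not from the paper:

    ```python
    import numpy as np

    def active_modes(S, tol=1e-3):
        """Count the locally active dynamical modes of a sensitivity
        matrix as the number of singular values above a relative cutoff."""
        sv = np.linalg.svd(S, compute_uv=False)   # descending order
        return int(np.sum(sv / sv[0] > tol)), sv

    # Hypothetical sensitivity matrix of 6 species w.r.t. 6 parameters in
    # which only two directions carry appreciable sensitivity.
    rng = np.random.default_rng(7)
    U = np.linalg.qr(rng.normal(size=(6, 6)))[0]
    S = U @ np.diag([5.0, 1.2, 1e-5, 1e-6, 1e-7, 1e-8]) @ U.T
    print(active_modes(S))   # expect 2 active modes
    ```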

  5. Automatic selection of a representative trial from multiple measurements using Principal Component Analysis.

    PubMed

    Schweizer, Katrin; Cattin, Philippe C; Brunner, Reinald; Müller, Bert; Huber, Cora; Romkes, Jacqueline

    2012-08-31

    Experimental data in human movement science commonly consist of repeated measurements under comparable conditions. One can face the question of how to identify a single trial, a set of trials, or erroneous trials from the entire data set. This study presents and evaluates a Selection Method for a Representative Trial (SMaRT) based on Principal Component Analysis. SMaRT was tested on 1841 data sets containing 11 joint angle curves of gait analysis. The automatically detected characteristic trials were compared with the choice of three independent experts. SMaRT required 1.4 s to analyse 100 data sets consisting of 8 ± 3 trials each. The robustness against outliers reached 98.8% (standard visual control). We conclude that SMaRT is a powerful tool to determine a representative, uncontaminated trial in movement analysis data sets with multiple parameters.
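
    A loose sketch of the idea (not the published SMaRT algorithm): project all trials into a low-dimensional PCA space and choose the trial closest to the ensemble mean. The data and component count are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def representative_trial(trials):
        """Return the index of the trial closest to the ensemble mean in
        a low-dimensional principal-component space."""
        X = np.array([t.ravel() for t in trials])       # one row per trial
        scores = PCA(n_components=min(3, len(trials) - 1)).fit_transform(X)
        # PCA centres the data, so the mean trial maps to the origin.
        return int(np.argmin(np.linalg.norm(scores, axis=1)))

    rng = np.random.default_rng(5)
    base = np.sin(np.linspace(0, 2 * np.pi, 101))       # nominal angle curve
    trials = [base + rng.normal(scale=0.05, size=101) for _ in range(8)]
    trials[3] = base + 0.8                              # a contaminated trial
    print("representative trial index:", representative_trial(trials))
    ```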

  6. Performance Analysis of Distributed Applications using Automatic Classification of Communication Inefficiencies

    SciTech Connect

    Vetter, J.

    1999-11-01

    We present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. Our method automatically classifies individual communication operations and it reveals the cause of communication inefficiencies in the application. This classification allows the developer to focus quickly on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, we trace the message operations of MPI applications and then classify each individual communication event using decision tree classification, a supervised learning technique. We train our decision tree using microbenchmarks that demonstrate both efficient and inefficient communication. Since our technique adapts to the target system's configuration through these microbenchmarks, we can simultaneously automate the performance analysis process and improve classification accuracy. Our experiments on four applications demonstrate that our technique can improve the accuracy of performance analysis, and dramatically reduce the amount of data that users must encounter.

  7. Automatic RGB-depth-pressure anthropometric analysis and individualised sleep solution prescription.

    PubMed

    Esquirol Caussa, Jordi; Palmero Cantariño, Cristina; Bayo Tallón, Vanessa; Cos Morera, Miquel Àngel; Escalera, Sergio; Sánchez, David; Sánchez Padilla, Maider; Serrano Domínguez, Noelia; Relats Vilageliu, Mireia

    2017-08-01

    Sleep surfaces must adapt to individual somatotypic features to maintain comfortable, convenient and healthy sleep, preventing diseases and injuries. Individually determining the most adequate rest surface can often be a complex and subjective question. The aim was to design and validate an automatic multimodal somatotype determination model to recommend an individually designed mattress-topper-pillow combination. Design and validation of an automated prescription model for an individualised sleep system was performed through single-image 2D-3D analysis and body pressure distribution, to objectively determine optimal individual sleep surfaces combining five different mattress densities, three different toppers and three cervical pillows. A final study (n = 151) and re-analysis (n = 117) defined and validated the model, showing high correlations between calculated and real data (>85% in height and body circumferences, 89.9% in weight, 80.4% in body mass index and more than 70% in morphotype categorisation). The somatotype determination model can accurately prescribe an individualised sleep solution. This can be useful for healthy people and for health centres that need to adapt sleep surfaces to people with special needs. Next steps will increase the model's accuracy and analyse whether the prescribed individualised sleep solution can improve sleep quantity and quality; future studies will also adapt the model to mattresses with technological improvements and tailor-made production, and will define interfaces for people with special needs.

  8. Intercellular fluorescence background on microscope slides: some problems and solutions for automatic analysis

    NASA Astrophysics Data System (ADS)

    Piper, Jim; Sudar, Damir; Peters, Don; Pinkel, Daniel

    1994-05-01

    Although high contrast between signal and the dark background is often claimed as a major advantage of fluorescence staining in cytology and cytogenetics, in practice this is not always the case and in some circumstances the inter-cellular or, in the case of metaphase preparations, the inter-chromosome background can be both brightly fluorescent and vary substantially across the slide or even across a single metaphase. Bright background results in low image contrast, making automatic detection of metaphase cells more difficult. The background correction strategy employed in automatic search must both cope with variable background and be computationally efficient. The method employed in a fluorescence metaphase finder is presented, and the compromises involved are discussed. A different set of problems arise when the analysis is aimed at accurate quantification of the fluorescence signal. Some insight into the nature of the background in the case of comparative genomic hybridization is obtained by image analysis of data obtained from experiments using cell lines with known abnormal copy numbers of particular chromosome types.

  9. Automatic analysis of image of surface structure of cell wall-deficient EVC.

    PubMed

    Li, S; Hu, K; Cai, N; Su, W; Xiong, H; Lou, Z; Lin, T; Hu, Y

    2001-01-01

    Some computer applications for cell characterization in medicine and biology, such as analysis of the surface structure of cell wall-deficient EVC (El Tor Vibrio of Cholera), operate with cell samples taken from very small areas of interest. To perform texture characterization in such an application, only a few texture operators can be employed: the operators should be insensitive to noise and image distortion and be reliable in estimating texture quality from images. We therefore apply wavelet theory and mathematical morphology to analyse cellular surface micro-area images obtained by SEM (Scanning Electron Microscope). To describe the quality of the surface structure of cell wall-deficient EVC, we propose a fully automatic computerized method in which image analysis is carried out in two steps. In the first, we decompose the given image by dyadic wavelet transform and form an image approximation with higher resolution, thereby performing edge detection efficiently. In the second, we apply operations of mathematical morphology to obtain quantitative morphological parameters of the surface structure of cell wall-deficient EVC. The obtained results prove that the method can eliminate noise, detect edges and extract feature parameters reliably. In this work, we have built automatic analysis software named "EVC.CELL".

  10. In-air PIXE set-up for automatic analysis of historical document inks

    NASA Astrophysics Data System (ADS)

    Budnar, Miloš; Simčič, Jure; Rupnik, Zdravko; Uršič, Mitja; Pelicon, Primož; Kolar, Jana; Strlič, Matija

    2004-06-01

    Iron gall inks were among the writing materials most widely applied in historical documents of western civilization. Due to the corrosive character of these inks, the documents face a danger of being seriously, and in some cases irreversibly, changed. The elemental composition of the inks is important information for taking adequate conservation action [Project InkCor, http://www.infosrvr.nuk.uni-lj.si/jana/Inkcor/index.htm, and references within]. Here, in-air PIXE analysis offers an indispensable tool due to its sensitivity and almost non-destructive character. An experimental approach developed for precise and automatic analysis of documents at the Jožef Stefan Institute Tandetron accelerator is presented. The selected documents were mounted, one at a time, on the positioning board, and the chosen ink spots on the sample were irradiated by 1.7 MeV protons. Data acquisition on the selected ink spots is done automatically according to the measuring pattern determined prior to the measurement. The chemical elements identified in the documents ranged from Si to Pb, and among them the significant iron gall ink components, such as Fe, S, K, Cu, Zn, Co, Mn and Ni, were determined with a precision of ±10%. The measurements were done non-destructively and no visible damage was observed on the irradiated documents.

  11. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    NASA Astrophysics Data System (ADS)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, which enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory impedance cardiography (AICG) monitoring device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), sampled at 200 Hz, and the base impedance signal (Z0), sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option to make manual corrections, which may be necessary to avoid "false positive" recognitions. This application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.

  12. Automatic Pedestrian Crossing Detection and Impairment Analysis Based on Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Liu, X.; Zhang, Y.; Li, Q.

    2017-09-01

    Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians' lives and possessions and to keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and to diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic flow, it is imperative to monitor their condition. On this account, an approach for automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and crossing defilement and impairment are analyzed in this paper. First, a pedestrian crossing classifier is trained with a low recall rate. Initial detections are then refined using projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall, precision and robustness is achieved. The system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates the monitoring and maintenance of traffic facilities, reducing potential traffic safety problems and securing lives and property.

  13. AnaSP: a software suite for automatic image analysis of multicellular spheroids.

    PubMed

    Piccinini, Filippo

    2015-04-01

    Today, more and more biological laboratories use 3D cell cultures and tissues grown in vitro as 3D models of in vivo tumours and metastases. In the last decades, it has been extensively established that multicellular spheroids represent an efficient model for validating effects of drugs and treatments for human care applications. However, a lack of methods for quantitative analysis limits the usage of spheroids as models for routine experiments. Several methods have been proposed in the literature to perform high-throughput experiments employing spheroids by automatically computing different morphological parameters, such as diameter, volume and sphericity. Nevertheless, these systems are typically grounded on expensive automated technologies, which make the suggested solutions affordable only for a limited subset of laboratories, frequently those performing high-content screening analysis. In this work we propose AnaSP, an open source software suite for automatically estimating several morphological parameters of spheroids by simply analyzing brightfield images acquired with a standard widefield microscope, even one not endowed with a motorized stage. The experiments performed proved the sensitivity and precision of the proposed segmentation method, and the excellent reliability of AnaSP in computing several morphological parameters of spheroids imaged under different conditions. AnaSP is distributed as an open source software tool. Its modular architecture and graphical user interface make it attractive also for researchers who do not work in areas of computer vision, and suitable for both high-content screenings and occasional spheroid-based experiments. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
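
    A minimal sketch of this type of measurement (not AnaSP's actual code): segment a brightfield spheroid by thresholding and derive diameter, 2D sphericity and a volume estimate from the resulting mask. The threshold choice and minimum object size are assumptions.

```python
# Minimal sketch of brightfield spheroid measurement in the spirit of
# AnaSP (not its actual code): segment the spheroid, then estimate
# diameter, sphericity and a volume proxy from the 2D mask.
import numpy as np
from skimage import filters, measure, morphology

def spheroid_parameters(gray, pixel_size_um=1.0):
    # Spheroids appear darker than background in brightfield: Otsu threshold.
    mask = gray < filters.threshold_otsu(gray)
    mask = morphology.remove_small_objects(mask, min_size=500)
    props = max(measure.regionprops(mask.astype(int)), key=lambda p: p.area)

    d_equiv = props.equivalent_diameter * pixel_size_um      # diameter (um)
    # 2D sphericity: perimeter of the equal-area circle divided by the
    # actual perimeter (1.0 for a perfect disc).
    sphericity = 2 * np.sqrt(np.pi * props.area) / props.perimeter
    # Volume proxy: rotate the equivalent disc into a sphere.
    volume = (np.pi / 6.0) * d_equiv**3
    return d_equiv, sphericity, volume
```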

  14. Automatic determination of chlorine without standard solutions using a biamperometric flow-batch analysis system.

    PubMed

    Nascimento, Valberes B; Selva, Thiago M G; Coelho, Elaine C S; Santos, Francyana P; Antônio, Jadielson L S; Silva, José R; Gaião, Edvaldo N; Araújo, Mário C U

    2010-04-15

    This study presents an automatic analysis system that does not require the use of standard solutions. The system uses an electrochemical flow cell for in-line generation of the standards, and operates under the standard addition technique. The versatility of this system was demonstrated by the development of a one-key-touch, fully automatic method for the determination of total available chlorine in real samples. The extremely simple, accurate and inexpensive method was based on biamperometric monitoring of the well-known redox reaction of chlorine with iodide ions in a flow-batch system, where the produced iodine (triiodide ions) generates an electrical current proportional to the chlorine concentration in the sample. The flow-batch parameters were optimized to maximize the sensitivity without loss of precision. An excellent linear dependence between the biamperometric signal and the chlorine concentration of the standard additions, and good agreement between the proposed approach and a reference method, were obtained. The method was successfully applied to determine chlorine in several different bleach and chlorinated water samples (r = 0.9995, LOD = 8.261 × 10^-7 mol L^-1) and could easily be extended to other oxidants and samples. Comparison to a reference method and recoveries close to 100% demonstrated the reliability of the proposed method. In addition, low residue disposal and reagent consumption, allied with high accuracy and precision, make it very promising for routine applications.
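
    The concentration calculation behind the standard addition technique is simple enough to show directly: fit the signal-versus-added-concentration line and extrapolate to zero signal. The numbers below are invented for illustration; in the actual system the standards are generated electrochemically in line rather than pipetted.

```python
# Sketch of the standard-addition computation underlying such a system:
# the sample concentration follows from the x-axis intercept of the
# signal-vs-added-standard regression line. Values are made up.
import numpy as np

added = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # added standard (umol/L)
signal = np.array([0.42, 0.63, 0.85, 1.05, 1.27])  # biamperometric current

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope   # |x-intercept| = sample concentration
print(f"sample concentration ~ {c_sample:.2f} umol/L")
```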

  15. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, more and more coding sequences (CDS) are being discovered, and ever larger CDS are being revealed frequently. Approaches and related tools have been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports bootstrapping with 1,000 to 20,000 replicates. The workflow performs tree inference using the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; both SOAP and Java Web Services (JWS) provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper thus proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  16. The Romanian-English Contrastive Analysis Project; Contrastive Studies in the Syntax and Semantics of English and Romanian, Vol. 6.

    ERIC Educational Resources Information Center

    Chitoran, Dumitru, Ed.

    The sixth volume of this series contains eight contrastive studies in the syntax and semantics of English and Romanian. They are: "Criteria for the Contrastive Analysis of English Nouns," by Andrei Bantas; "Adjectives as Noun Modifiers in Post-Verbal Position," by Ioana Poenaru; "Towards a Semantic Description of 'Tense' and 'Aspect' in English…

  17. Analysis of facial expressions in Parkinson's disease through video-based automatic methods.

    PubMed

    Bandini, Andrea; Orlandi, Silvia; Escalante, Hugo Jair; Giovannelli, Fabio; Cincotta, Massimo; Reyes-Garcia, Carlos A; Vanni, Paola; Zaccara, Gaetano; Manfredi, Claudia

    2017-04-01

    The automatic analysis of facial expressions is an evolving field with several clinical applications. One of these applications is the study of facial bradykinesia in Parkinson's disease (PD), a major motor sign of this neurodegenerative illness. Facial bradykinesia consists in the reduction/loss of facial movements and of emotional facial expressions, called hypomimia. In this work we propose an automatic method for studying facial expressions in PD patients relying on video-based methods. Seventeen Parkinsonian patients and 17 healthy control subjects were asked to show basic facial expressions, upon request of the clinician and after the imitation of a visual cue on a screen. Using an existing face tracker, the Euclidean distance of the facial model from a neutral baseline was computed in order to quantify the changes in facial expressivity during the tasks. Moreover, an automatic facial expression recognition algorithm was trained in order to study how PD expressions differed from the standard expressions. Results show that control subjects exhibited on average larger distances than PD patients across the tasks, confirming that control subjects show larger movements during both posed and imitated facial expressions. Moreover, our results demonstrate that anger and disgust are the two most impaired expressions in PD patients. Contactless video-based systems can be important tools for analyzing facial expressions also in rehabilitation, in particular speech therapy, where patients could benefit from real-time feedback about the proper facial expressions/movements to perform. Copyright © 2017 Elsevier B.V. All rights reserved.
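
    A sketch of the expressivity measure described above: the Euclidean distance of tracked facial landmarks from a neutral baseline, averaged over landmarks for each frame. The landmark arrays would come from any face tracker; the array shapes and demo data are assumptions.

```python
# Sketch of the expressivity measure: per-frame mean Euclidean distance
# of tracked facial landmarks from a neutral baseline. Landmark arrays
# would come from any face tracker; shapes here are assumptions.
import numpy as np

def expressivity(frames, neutral):
    """frames: (n_frames, n_landmarks, 2); neutral: (n_landmarks, 2)."""
    return np.linalg.norm(frames - neutral, axis=2).mean(axis=1)

# Larger values = larger facial movement; PD patients are expected to
# show smaller values than controls during posed/imitated expressions.
rng = np.random.default_rng(0)
neutral = rng.random((68, 2))
frames = neutral + 0.05 * rng.standard_normal((100, 68, 2))
print(expressivity(frames, neutral).mean())
```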

  18. Feature relevance analysis supporting automatic motor imagery discrimination in EEG-based BCI systems.

    PubMed

    Álvarez-Meza, A M; Velásquez-Martínez, L F; Castellanos-Dominguez, G

    2013-01-01

    Recently, there have been many efforts to develop Brain Computer Interface (BCI) systems that identify and discriminate brain activity, support the control of external devices, and help us understand cognitive behaviors. In this work, a feature relevance analysis approach based on an eigendecomposition method is proposed to support automatic Motor Imagery (MI) discrimination in electroencephalography (EEG) signals for BCI systems. We select a set of features that represents the studied process as well as possible. For this purpose, a variability study is performed based on traditional Principal Component Analysis. EEG signal modelling is carried out by estimating three frequency-based features and one time-based feature. Our approach is tested on a well-known MI dataset. The attained results show that the presented algorithm can be used as a tool to support the discrimination of MI brain activity, obtaining acceptable results in comparison to state-of-the-art approaches.
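
    One common way to turn a PCA variability study into per-feature relevance scores is to weight the absolute component loadings by the explained variance ratios; the sketch below shows this heuristic, which is not necessarily the authors' exact formulation.

```python
# Sketch of a PCA-based feature relevance score: |loadings| weighted by
# the variance explained by the corresponding principal components.
# One common heuristic, not necessarily the authors' exact method.
import numpy as np
from sklearn.decomposition import PCA

def pca_relevance(X):
    """X: (n_trials, n_features) matrix of EEG-derived features."""
    pca = PCA().fit(X)
    return np.abs(pca.components_).T @ pca.explained_variance_ratio_

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))   # e.g. 3 spectral + 1 temporal feature
X[:, 0] *= 5.0                      # make feature 0 dominate the variance
print(pca_relevance(X))             # feature 0 gets the largest score
```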

  19. Analysis and Exploitation of Automatically Generated Scene Structure from Aerial Imagery

    NASA Astrophysics Data System (ADS)

    Nilosek, David R.

    The recent advancements made in the field of computer vision, along with ever-increasing computational power, have opened up opportunities in the field of automated photogrammetry. Many researchers have focused on using these powerful computer vision algorithms to extract three-dimensional point clouds of scenes from multi-view imagery, with the ultimate goal of creating a photo-realistic scene model. However, geographically accurate three-dimensional scene models have the potential to be exploited for much more than just visualization. This work looks at utilizing automatically generated scene structure from near-nadir aerial imagery to identify and classify objects within the structure through the analysis of spatial-spectral information. The restriction to this type of imagery is imposed because such aerial imagery is commonly available. Popular third-party computer-vision algorithms are used to generate the scene structure. A voxel-based approach for surface estimation is developed using Manhattan-world assumptions, and a surface estimation confidence metric is also presented. This approach provides the basis for further analysis of surface materials, incorporating spectral information. Two cases of spectral analysis are examined: when additional hyperspectral imagery of the reconstructed scene is available, and when only R,G,B spectral information can be obtained. A method for registering the surface estimation to hyperspectral imagery, through orthorectification, is developed. Atmospherically corrected hyperspectral imagery is used to assign reflectance values to estimated surface facets for physical simulation with DIRSIG. A spatial-spectral region-growing-based segmentation algorithm is developed for the R,G,B-limited case, in order to identify possible materials for user attribution. Finally, an analysis of the geographic accuracy of automatically generated three-dimensional structure is performed. An end-to-end, semi-automated workflow

  1. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low-resolution imagery and require analyst input to provide anything more than a simple pixel-level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm, based on a set of metrics, that performs a large-area search for change in high-resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large-area images and provide useful information to an analyst about small regions that have undergone specific types of change, while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large-area search environment will have an impact in the areas of disaster recovery, search-and-rescue situations, and land-use surveys, among others. By utilizing a feature-based approach founded on applying existing statistical methods and new and existing topological methods to high-resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring

  2. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  3. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    PubMed

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-08

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented.

  4. Automatic analysis of left ventricular ejection fraction using stroke volume images.

    PubMed

    Nelson, T R; Verba, J W; Bhargava, V; Shabetai, R; Slutsky, R

    1983-01-01

    The purpose of this study was to analyze, validate, and report on an automatic computer algorithm for analyzing left ventricular ejection fraction and to indicate future applications of the technique to other chambers and more advanced measurements. Thirty-eight patients were studied in the cardiac catheterization laboratory by equilibrium radionuclide ventriculography and concurrent contrast ventriculography. The temporal and spatial behavior of each picture element in a filtered stroke volume image series was monitored throughout the cardiac cycle. Pixels that met specific phase, amplitude, and derivative criteria were assigned to the appropriate chamber. Volume curves were generated from regions of interest for each chamber to enable calculation of the left ventricular ejection fraction. Left ventricular ejection fractions showed a good correlation (r = 0.89) between the two techniques. Ejection fractions ranged between 0.12 and 0.88, showing a wide range of application. It is concluded that automatic analysis of left ventricular ejection fraction is possible using the present algorithm and will be useful in improving the reproducibility and providing more accurate information during exercise protocols, pharmaceutical interventions, and routine clinical studies.
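
    The core of such an analysis, once a left-ventricular volume (or count) curve has been extracted from the assigned pixels, is the ejection fraction calculation itself; a minimal sketch with a synthetic volume curve:

```python
# Ejection fraction from a ventricular volume curve: the ejected fraction
# of the end-diastolic volume. The curve below is synthetic.
import numpy as np

t = np.linspace(0, 1, 50)                   # one cardiac cycle
volume = 100 - 45 * np.sin(np.pi * t)**2    # EDV 100, ESV 55 (arbitrary units)

edv, esv = volume.max(), volume.min()
ef = (edv - esv) / edv
print(f"LVEF = {ef:.2f}")                   # 0.45 here
```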

  5. Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain

    NASA Astrophysics Data System (ADS)

    Krauß, Thomas; Fischer, Peter

    2016-08-01

    In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News reports show that the majority of occurring natural hazards are flood events, and many flood prediction systems have already been developed. However, most existing systems for deriving areas endangered by flooding are based only on horizontal and vertical distances to existing rivers and lakes; typically, such systems do not take into account dangers arising directly from heavy rain events. In a study we conducted together with a German insurance company, a new approach for the detection of areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for the classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and analyze the results using the available insurance data.
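
    One simple DEM-only criterion in this spirit is the depth of local depressions, obtained by morphologically "filling" the DEM from its border; cells with a large fill depth can pond water even far from rivers. The sketch below illustrates the idea and is not one of the authors' three classification methods; the threshold is an assumption.

```python
# Sketch of a DEM-only heavy-rain hazard criterion: depth of local
# depressions, computed by morphological reconstruction (flood-filling
# the DEM from its border). Threshold is an assumption.
import numpy as np
from skimage.morphology import reconstruction

def depression_depth(dem):
    seed = dem.copy()
    seed[1:-1, 1:-1] = dem.max()          # flood from the border inward
    filled = reconstruction(seed, dem, method='erosion')
    return filled - dem                   # >0 where water can pond

rng = np.random.default_rng(2)
dem = rng.random((100, 100)).cumsum(axis=0) / 50  # gently sloped noisy terrain
depth = depression_depth(dem)
endangered = depth > 0.05                 # illustrative threshold
print(f"{endangered.mean():.1%} of cells flagged")
```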

  6. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, it can easily be detected with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, with the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One characteristic of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules within each individual array automatically. The performance of the proposed algorithm was tested on three sample images, verifying a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
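
    The local detection rule lends itself to a compact sketch: within each array, flag modules whose mean thermal intensity exceeds the array statistics. The threshold factor k is an assumption and stands in for the adjustable sensitivity mentioned above.

```python
# Sketch of a local (per-array) detection rule: flag modules whose mean
# thermal intensity deviates from the array's own statistics. Using
# per-array statistics sidesteps the global temperature offset caused by
# varying sensor-to-target distance. k is an illustrative threshold.
import numpy as np

def defective_modules(module_means, k=2.0):
    """module_means: 1D array of mean intensities of one array's modules."""
    mu, sigma = module_means.mean(), module_means.std()
    return module_means > mu + k * sigma      # hot modules only

array1 = np.array([31.2, 30.8, 31.0, 30.9, 36.5, 31.1])  # degrees C, synthetic
print(np.flatnonzero(defective_modules(array1)))          # -> [4]
```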

  7. Automatic geocoding of high-value targets using structural image analysis and GIS data

    NASA Astrophysics Data System (ADS)

    Soergel, Uwe; Thoennessen, Ulrich

    1999-12-01

    Geocoding based merely on navigation data and a sensor model is often not possible or not precise enough. In these cases, an improvement of the pre-registration through image-based approaches is a solution. Due to the large amount of data in remote sensing, automatic geocoding methods are necessary. For geocoding purposes, appropriate tie points, which are present in both image and map, have to be detected and matched; the tie points are the basis of the transformation function. Assigning the tie points is a combinatorial problem whose size depends on the number of tie points. This number can be reduced, and the reliability of the tie points improved, by using structural tie points such as corners or crossings of prominent extended targets (e.g. harbors, airfields). Our approach extracts structural tie points independently in the image and in the vector map by model-based image analysis. The vector map is provided by a GIS using the ATKIS database. The model parameters are extracted from maps or collateral information about the scenario. The two sets of tie points are automatically matched with a Geometric Hashing algorithm. The algorithm was successfully applied to VIS, IR and SAR data.

  8. Automatic fuzzy object-based analysis of VHSR images for urban objects extraction

    NASA Astrophysics Data System (ADS)

    Sebari, Imane; He, Dong-Chen

    2013-05-01

    We present an automatic approach for object extraction from very high spatial resolution (VHSR) satellite images based on Object-Based Image Analysis (OBIA). The proposed solution requires no input data other than the studied image, and no input parameters. First, an automatic non-parametric cooperative segmentation technique is applied to create object primitives. A fuzzy rule base is then developed from the human knowledge used for image interpretation. The rules integrate spectral, textural, geometric and contextual object properties. The classes of interest are tree, lawn, bare soil and water for natural classes, and building, road and parking lot for man-made classes. Fuzzy logic is integrated in our approach in order to manage the complexity of the studied subject, to reason with imprecise knowledge, and to provide information on the precision and certainty of the extracted objects. The proposed approach was applied to extracts of Ikonos images of Sherbrooke city (Canada). An overall extraction accuracy of 80% was observed. The correctness rates obtained for the building, road and parking lot classes are 81%, 75% and 60%, respectively.

  9. Semi-automatic system for UV images analysis of historical musical instruments

    NASA Astrophysics Data System (ADS)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or parts treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is made by analysis of the image in the HSV color model, the one most similar to human perception. The achievable result is more accurate than a manual selection, because it can also detect points that users do not recognize as similar due to perceptual illusions. The application has been developed following usability rules, and the Human Computer Interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection of the "Museo del Violino" in Cremona.
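
    A minimal sketch of the two-step selection in OpenCV: take the mean HSV colour of the user-chosen rectangle and highlight all pixels within a tolerance of it. The tolerances and file name are assumptions, and hue wrap-around (reds near H = 0/180) is ignored for brevity.

```python
# Sketch of the two-step HSV selection: mean colour of a user rectangle,
# then a tolerance mask over the whole image. Tolerances and file name
# are assumptions; hue wrap-around is ignored for brevity.
import cv2
import numpy as np

def select_similar(img_bgr, rect, tol=(8, 40, 40)):
    x, y, w, h = rect                              # user-selected rectangle
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    mean = hsv[y:y + h, x:x + w].reshape(-1, 3).mean(axis=0)
    lower = np.clip(mean - tol, 0, 255).astype(np.uint8)
    upper = np.clip(mean + tol, 0, 255).astype(np.uint8)
    return cv2.inRange(hsv, lower, upper)          # 255 where colour matches

img = cv2.imread('uv_photo.png')                   # hypothetical file name
mask = select_similar(img, rect=(120, 80, 20, 20))
```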

  10. Group-wise automatic mesh-based analysis of cortical thickness

    NASA Astrophysics Data System (ADS)

    Vachet, Clement; Cody Hazlett, Heather; Niethammer, Marc; Oguz, Ipek; Cates, Joshua; Whitaker, Ross; Piven, Joseph; Styner, Martin

    2011-03-01

    The analysis of neuroimaging data from pediatric populations presents several challenges. There are normal variations in brain shape from infancy to adulthood and normal developmental changes related to tissue maturation. Measurement of cortical thickness is one important way to analyze such developmental tissue changes. We developed a novel framework that allows group-wise automatic mesh-based analysis of cortical thickness. Our approach is divided into four main parts. First an individual pre-processing pipeline is applied on each subject to create genus-zero inflated white matter cortical surfaces with cortical thickness measurements. The second part performs an entropy-based group-wise shape correspondence on these meshes using a particle system, which establishes a trade-off between an even sampling of the cortical surfaces and the similarity of corresponding points across the population using sulcal depth information and spatial proximity. A novel automatic initial particle sampling is performed using a matched 98-lobe parcellation map prior to a particle-splitting phase. Third, corresponding re-sampled surfaces are computed with interpolated cortical thickness measurements, which are finally analyzed via a statistical vertex-wise analysis module. This framework consists of a pipeline of automated 3D Slicer compatible modules. It has been tested on a small pediatric dataset and incorporated in an open-source C++ based high-level module called GAMBIT. GAMBIT's setup allows efficient batch processing, grid computing and quality control. The current research focuses on the use of an average template for correspondence and surface re-sampling, as well as thorough validation of the framework and its application to clinical pediatric studies.

  11. Quantitative analysis of lunar crater's landscape: automatic detection, classification and geological applications

    NASA Astrophysics Data System (ADS)

    Li, Ke; Chen, Jianping; He, Shujun; Zhang, Mingchao

    2013-04-01

    Lunar craters are the most important geological tectonic features on the Moon; they are among the most studied subjects in the analysis of the lunar surface, since they provide the relative age of a surface unit and further information about lunar geology. Quantitative analysis of crater landscapes is an important approach in the dating of lunar geological units, which plays a key role in understanding and reconstructing lunar geological evolution. In this paper, a new approach to automatic crater detection and classification is proposed, based on the quantitative analysis of crater landscapes in digital terrain models of different spatial resolutions. The approach includes the following key points: 1) a new crater detection method is presented that uses profile similarity parameters as the distinguishing marks; this method overcomes the high error rate of earlier DTM-based crater detection algorithms; 2) craters are sorted by the morphological characteristics of their profiles; this new quantitative classification method overcomes the subjectivity of earlier descriptive classification methods. To verify the usefulness of the proposed method, the pre-selected landing area of China's Chang'e-III lunar satellite, Sinus Iridum, was chosen as the experimental zone. DTMs of different resolutions from the Chang'e-I Laser Altimeter, the Chang'e-I Stereoscopic Camera and the Lunar Orbiter Laser Altimeter (LOLA) were used for crater detection and classification. Dating results for each geological unit were obtained using the crater size-frequency distribution (CSFD) method. By comparison with earlier dating and manual classification data, we found that the results obtained by our method are strongly consistent with the former results. By combining automatic crater detection and classification, this paper basically provides a quantitative approach which can analyze the lunar crater's landscape and get geological

  12. A Multi-wavelength Analysis of Active Regions and Sunspots by Comparison of Automatic Detection Algorithms

    NASA Astrophysics Data System (ADS)

    Verbeeck, C.; Higgins, P. A.; Colak, T.; Watson, F. T.; Delouille, V.; Mampaey, B.; Qahwaji, R.

    2013-03-01

    Since the Solar Dynamics Observatory (SDO) began recording ≈ 1 TB of data per day, there has been an increased need to automatically extract features and events for further analysis. Here we compare the overall detection performance, correlations between extracted properties, and usability for feature tracking of four solar feature-detection algorithms: the Solar Monitor Active Region Tracker (SMART) detects active regions in line-of-sight magnetograms; the Automated Solar Activity Prediction code (ASAP) detects sunspots and pores in white-light continuum images; the Sunspot Tracking And Recognition Algorithm (STARA) detects sunspots in white-light continuum images; the Spatial Possibilistic Clustering Algorithm (SPoCA) automatically segments solar EUV images into active regions (AR), coronal holes (CH), and quiet Sun (QS). One month of data from the Solar and Heliospheric Observatory (SOHO)/Michelson Doppler Imager (MDI) and SOHO/Extreme Ultraviolet Imaging Telescope (EIT) instruments during 12 May - 23 June 2003 is analysed. The overall detection performance of each algorithm is benchmarked against National Oceanic and Atmospheric Administration (NOAA) and Solar Influences Data Analysis Center (SIDC) catalogues using various feature properties such as total sunspot area, which shows good agreement, and the number of features detected, which shows poor agreement. Principal Component Analysis indicates a clear distinction between photospheric properties, which are highly correlated to the first component and account for 52.86% of variability in the data set, and coronal properties, which are moderately correlated to both the first and second principal components. Finally, case studies of NOAA 10377 and 10365 are conducted to determine algorithm stability for tracking the evolution of individual features. We find that magnetic flux and total sunspot area are the best indicators of active-region emergence. Additionally, for NOAA 10365, it is shown that the

  13. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes' schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  14. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    PubMed

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, and this makes it difficult to identify the effects that are truly related to the underlying neuronal activity. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component, FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original
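
    As an illustration of the kind of temporal feature FIX computes, the sketch below estimates the fraction of a component time series' power above a frequency cutoff; artefactual components typically carry more high-frequency power than neuronal ones. The TR and cutoff values are illustrative assumptions, not FIX's actual settings.

```python
# Sketch of one temporal feature of the kind described above: fraction of
# a component time series' power above a frequency cutoff. TR and cutoff
# are illustrative, not FIX's actual values.
import numpy as np
from scipy.signal import welch

def high_freq_fraction(ts, tr=2.0, cutoff_hz=0.1):
    f, pxx = welch(ts, fs=1.0 / tr, nperseg=min(128, len(ts)))
    return pxx[f > cutoff_hz].sum() / pxx.sum()

rng = np.random.default_rng(3)
slow = np.sin(2 * np.pi * 0.02 * np.arange(200) * 2.0)   # signal-like
noisy = rng.standard_normal(200)                          # noise-like
print(high_freq_fraction(slow), high_freq_fraction(noisy))
```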

  15. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits: higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. Numerical results for the model equations of erbium-doped fiber amplifiers show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method is also capable of rapidly and effectively computing the model equations of fiber Raman amplifiers and semiconductor lasers.

  16. Drift problems in the automatic analysis of gamma-ray spectra using associative memory algorithms

    SciTech Connect

    Olmos, P.; Diaz, J.C.; Perez, J.M.; Aguayo, P. ); Gomez, P.; Rodellar, V. )

    1994-06-01

    Perturbations affecting nuclear radiation spectrometers during operation frequently spoil the accuracy of automatic analysis methods. One problem usually found in practice is fluctuation of the spectrum gain and zero, produced by drifts in the detector and nuclear electronics. A pattern acquired in these conditions may be significantly different from that expected with stable instrumentation, thus complicating the identification and quantification of the radionuclides present. In this work, the performance of Associative Memory algorithms when dealing with spectra affected by drifts is explored, considering a linear energy-calibration function. The formulation of the extended algorithm, constructed to quantify the possible presence of drifts in the spectrometer, is derived, and the results obtained from its application to several practical cases are discussed.

  17. An image analysis approach for automatically re-orienteering CT images for dental implants.

    PubMed

    Cucchiara, Rita; Lamma, Evelina; Sansoni, Tommaso

    2004-06-01

    In the last decade, computerized tomography (CT) has become the most frequently used imaging modality for correct pre-operative implant planning. In this work, we present an image analysis and computer vision approach able to identify, from the reconstructed 3D data set, the optimal cutting plane specific to each implant to be planned, in order to obtain the best view of the implant site and correct measurements. If the patient requires several implants, different cutting planes are automatically identified, and the axial and cross-sectional images can be re-oriented according to each of them. In the paper, we describe the algorithms defined to recognize 3D markers (each one aligned with a missing tooth for which an implant has to be planned) in the reconstructed 3D space, and the results of processing real exams, in terms of effectiveness, precision and reproducibility of the measurements.

  18. Analysis of steranes and triterpanes in geolipid extracts by automatic classification of mass spectra

    NASA Technical Reports Server (NTRS)

    Wardroper, A. M. K.; Brooks, P. W.; Humberston, M. J.; Maxwell, J. R.

    1977-01-01

    A computer method is described for the automatic classification of triterpanes and steranes into gross structural type from their mass spectral characteristics. The method has been applied to the spectra obtained by gas-chromatographic/mass-spectroscopic analysis of two mixtures of standards and of hydrocarbon fractions isolated from Green River and Messel oil shales. Almost all of the steranes and triterpanes identified previously in both shales were classified, in addition to a number of new components. The results indicate that classification of such alkanes is possible with a laboratory computer system. The method has application to diagenesis and maturation studies as well as to oil/oil and oil/source rock correlations in which rapid screening of large numbers of samples is required.

  19. Application of automatic image analysis for the investigation of autoclaved aerated concrete structure

    SciTech Connect

    Petrov, I.; Schlegel, E. . Inst. fuer Silikattechnik)

    1994-01-01

    Autoclaved aerated concrete (AAC) is formed from small-grained mixtures of raw materials with Al powder as an air-entraining agent. Owing to its high porosity, AAC has a low bulk density, which leads to very good heat-insulating qualities. Automatic image analysis, in connection with stereology and stochastic geometry, was used to describe the size distribution of air pores in autoclaved aerated concrete. The experiments were carried out on AAC samples with extremely different bulk densities and compressive strengths. The assumption of an elliptic pore shape leads to an unambiguous characterization of the structure by bi-histograms. It is possible to calculate the spatial pore size distribution from these histograms if the pores are assumed to be spheroids. A marked point field model and the pair correlation function g_a(r) were used to describe the pore structure.

  1. Automatic quantitative analysis of t-tubule organization in cardiac myocytes using ImageJ.

    PubMed

    Pasqualin, Côme; Gannier, François; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2015-02-01

    The transverse tubule system in mammalian striated muscle is highly organized and contributes to optimal and homogeneous contraction. Diverse pathologies, such as heart failure and atrial fibrillation, involve disorganization of t-tubules and contractile dysfunction. Few tools are available for quantifying the organization of the t-tubule system. We developed a plugin for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. This plugin (TTorg) analyzes raw confocal microscopy images. Analysis options include the whole image, specific regions of the image (cropping), and z-axis analysis of the same image, as well as batch analysis of a series of images with identical criteria. There is no need to reorientate the specimen to the horizontal or to threshold the image before analysis. TTorg includes a synthetic "myocyte-like" image generator to test the plugin's efficiency under the user's own experimental conditions. The plugin was validated on synthetic images for different simulated cell characteristics and acquisition parameters. TTorg was able to detect significant differences in t-tubule organization in experimental data from mouse ventricular myocytes isolated from wild-type and dystrophin-deficient mice. TTorg is freely distributed, and its source code is available. It provides a reliable, easy-to-use, automatic, and unbiased measurement of t-tubule organization in a wide variety of experimental conditions. Copyright © 2015 the American Physiological Society.

  2. A new and fast methodology to assess oxidative damage in cardiovascular diseases risk development through eVol-MEPS-UHPLC analysis of four urinary biomarkers.

    PubMed

    Mendes, Berta; Silva, Pedro; Mendonça, Isabel; Pereira, Jorge; Câmara, José S

    2013-11-15

    In this work, a new, fast and reliable methodology using digitally controlled microextraction by packed sorbent (eVol®-MEPS), followed by ultra-high pressure liquid chromatography (UHPLC) analysis with photodiode array (PDA) detection, was developed to establish the urinary profile levels of four putative oxidative stress biomarkers (OSBs) in healthy subjects and patients with cardiovascular diseases (CVDs). These data were used to verify the suitability of the selected OSBs (uric acid - UAc, malondialdehyde - MDA, 5-(hydroxymethyl)uracil - 5-HMUra, and 8-hydroxy-2'-deoxyguanosine - 8-oxodG) as potential biomarkers of CVD progression. Important parameters affecting the efficiency of the extraction process were optimized, particularly stationary phase selection, pH influence, sample volume, number of extraction cycles, and washing and elution volumes. The experimental conditions that allowed the best extraction efficiency, expressed in terms of total area of the target analytes and data reproducibility, include a 10-fold dilution and pH adjustment of the urine samples to 6.0, followed by gradient elution through the C8 adsorbent with 5×50 µL of 0.01% formic acid and 3×50 µL of 20% methanol in 0.01% formic acid. The chromatographic separation of the target analytes was performed with a HSS T3 column (100 mm × 2.1 mm, 1.7 µm particle size) using 0.01% formic acid with 20% methanol at 250 µL min^-1. The methodology was validated in terms of selectivity, linearity, instrumental limit of detection (LOD), method limit of quantification (LOQ), matrix effect, accuracy and precision (intra- and inter-day). Good results were obtained in terms of selectivity and linearity (r^2 > 0.9906), as well as LOD and LOQ, whose values were low, ranging from 0.00005 to 0.72 µg mL^-1 and from 0.00023 to 2.31 µg mL^-1, respectively. The recovery results (91.1-123.0%), intra-day (1.0-8.3%) and inter-day precision (4.6-6.3%), and the matrix effect (60.1-110.3%) of eVol

  3. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    NASA Astrophysics Data System (ADS)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently, a number of treatment approaches have been introduced and proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual and hence time-consuming and prone to human error. In this research, we propose an automatic image analysis based approach to measure the size of an ulcer and to investigate it further in order to determine the effectiveness of the treatment process followed. In ophthalmology, an ulcer area is detected for further inspection via luminous excitation of a dye. Usually, in the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye), the ulcer area is excited to be luminous green in colour, compared to the rest of the cornea, which appears blue/brown. In the proposed approach, we analyse the image in the HSV colour space. Initially, a pre-processing stage that carries out local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly, we remove potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly, the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breaks in the corneal boundary due to occlusion, noise and image quality degradation. The ratio of the ulcer area confined within the corneal area to the corneal area itself is used as the measure for comparison. We demonstrate the use of the proposed tool in the analysis of the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of this measure over time.

  4. Development of a machine learning technique for automatic analysis of seafloor image data: Case example, Pogonophora coverage at mud volcanoes

    NASA Astrophysics Data System (ADS)

    Lüdtke, A.; Jerosch, K.; Herzog, O.; Schlüter, M.

    2012-02-01

    Digital image processing provides powerful tools for fast and precise analysis of large image data sets in marine and geoscientific applications. Because of the increasing volume of georeferenced image and video data acquired by underwater platforms such as remotely operated vehicles, means of automatic analysis of the acquired image data are required. A new and fast-developing application is the combination of video imagery and mosaicking techniques for seafloor habitat mapping. In this article we introduce an approach to fully automatic detection and quantification of Pogonophora coverage in seafloor video mosaics from mud volcanoes. The automatic recognition is based on textural image features extracted from the raw image data and classification using machine learning techniques. Classification rates of up to 98.86% were achieved on the training data. The approach was extensively validated on a data set of more than 4000 seafloor video mosaics from the Håkon Mosby Mud Volcano.
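
    A sketch of the texture-plus-classifier pipeline in the same spirit: local binary pattern histograms per mosaic tile fed to a random forest. Both the feature and the classifier stand in for the article's unspecified choices; the data below are synthetic.

```python
# Sketch of a texture-feature + supervised-classification pipeline:
# local binary pattern (LBP) histograms per image tile, classified with
# a random forest. Feature and classifier choices are stand-ins for the
# article's actual, unspecified ones; data are synthetic.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

P, R = 8, 1  # LBP neighbours and radius

def lbp_histogram(tile):
    lbp = local_binary_pattern(tile, P, R, method='uniform')
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(4)
tiles = rng.random((100, 32, 32))                 # synthetic mosaic tiles
labels = rng.integers(0, 2, 100)                  # 1 = Pogonophora coverage
X = np.array([lbp_histogram(t) for t in tiles])

clf = RandomForestClassifier(n_estimators=100).fit(X[:80], labels[:80])
print(clf.score(X[80:], labels[80:]))
```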

  5. Cold Flow Properties of Biodiesel by Automatic and Manual Analysis Methods

    USDA-ARS?s Scientific Manuscript database

    Biodiesel from most common feedstocks has inferior cold flow properties compared to conventional diesel fuel. Blends with as little as 10 vol% biodiesel content typically have significantly higher cloud point (CP), pour point (PP) and cold filter plugging point (CFPP) than No. 2 grade diesel fuel (...

  6. Automatic generation of the non-holonomic equations of motion for vehicle stability analysis

    NASA Astrophysics Data System (ADS)

    Minaker, B. P.; Rieveley, R. J.

    2010-09-01

    The mathematical analysis of vehicle stability has been utilised as an important tool in the design, development, and evaluation of vehicle architectures and stability controls. This paper presents a novel method for automatic generation of the linearised equations of motion for mechanical systems that is well suited to vehicle stability analysis. Unlike conventional methods for generating linearised equations of motion in standard linear second order form, the proposed method allows for the analysis of systems with non-holonomic constraints. In the proposed method, the algebraic constraint equations are eliminated after linearisation and reduction to first order. The described method has been successfully applied to an assortment of classic dynamic problems of varying complexity including the classic rolling coin, the planar truck-trailer, and the bicycle, as well as in more recent problems such as a rotor-stator and a benchmark road vehicle with suspension. This method has also been applied in the design and analysis of a novel three-wheeled narrow tilting vehicle with zero roll-stiffness. An application for determining passively stable configurations using the proposed method together with a genetic search algorithm is detailed. The proposed method and software implementation has been shown to be robust and provides invaluable conceptual insight into the stability of vehicles and mechanical systems.

  7. Automatic MRI segmentation of para-pharyngeal fat pads using interactive visual feature space analysis for classification.

    PubMed

    Shahid, Muhammad Laiq Ur Rahman; Chitiboi, Teodora; Ivanovska, Tetyana; Molchanov, Vladimir; Völzke, Henry; Linsen, Lars

    2017-02-14

    Obstructive sleep apnea (OSA) is a public health problem. Detailed analysis of the para-pharyngeal fat pads can help us understand the pathogenesis of OSA and may guide intervention in this sleep disorder. A reliable and automatic para-pharyngeal fat pad segmentation technique plays a vital role in investigating larger databases to identify the anatomic risk factors for OSA. Our research aims to develop a context-based automatic segmentation algorithm to delineate the fat pads from magnetic resonance images in a population-based study. Our segmentation pipeline involves texture analysis, connected component analysis, object-based image analysis, and supervised classification using an interactive visual analysis tool to segregate fat pads from other structures automatically. We developed a fully automatic segmentation technique that does not need any user interaction to extract the fat pads. The algorithm is fast enough to be applied to population-based epidemiological studies that provide large amounts of data. We evaluated our approach qualitatively on thirty datasets and quantitatively against the ground truth of ten datasets, resulting in an average detected volume fraction of approximately 78% and a 79% Dice coefficient, which is within the range of the inter-observer variation of manual segmentation results. The suggested method produces sufficiently accurate results and has the potential to be applied in the study of large data sets to understand the pathogenesis of the OSA syndrome.
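
    For reference, the overlap metric quoted above is straightforward to compute; a minimal sketch of the Dice coefficient between an automatic mask and a manual ground truth:

```python
# Dice coefficient of an automatic segmentation mask against a manual
# ground-truth mask: twice the intersection over the sum of sizes.
import numpy as np

def dice(auto_mask, truth_mask):
    auto, truth = auto_mask.astype(bool), truth_mask.astype(bool)
    inter = np.logical_and(auto, truth).sum()
    return 2.0 * inter / (auto.sum() + truth.sum())

a = np.zeros((10, 10), bool); a[2:7, 2:7] = True
b = np.zeros((10, 10), bool); b[3:8, 3:8] = True
print(f"Dice = {dice(a, b):.2f}")   # 0.64 for these two squares
```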

  8. Automatic Differentiation Package

    SciTech Connect

    Gay, David M.; Phipps, Eric; Bartlett, Roscoe

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization, and uncertainty quantification.
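
    The operator-overloading idea behind such packages can be sketched with a minimal dual-number class. The Python below illustrates the technique only; it is not Sacado's C++ API.

```python
# Sketch of forward-mode automatic differentiation via operator
# overloading, the same idea Sacado implements with C++ templates.
# Illustrative dual-number class, not Sacado's actual interface.
import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule: d/dt sin(x(t)) = cos(x) * x'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x) + 2x] at x = 1.5, seeded with der = 1
x = Dual(1.5, 1.0)
y = x * sin(x) + 2 * x
print(y.val, y.der)  # derivative equals sin(x) + x*cos(x) + 2
```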

  9. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    PubMed

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis in 4-chamber (4CH) cine MR imaging, and to investigate the agreement between semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy and repaired tetralogy of Fallot who underwent cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated optimal cutoffs of the minimum longitudinal strain values (εL_min) for diagnosing LV and RV dysfunction with high accuracy (LV εL_min = -7.8 %: area under the curve, 0.89; sensitivity, 83 %; specificity, 91 %; RV εL_min = -15.7 %: area under the curve, 0.82; sensitivity, 92 %; specificity, 68 %). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97, p < 0.01; RV, r = 0.79, p < 0.01). Our semi-automatic longitudinal strain analysis in 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements, and could find extensive application in cardiac imaging for various clinical cases.
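
    The ROC cutoff-selection step can be reproduced in outline with the Youden index; the strain values below are synthetic placeholders, not the study's measurements.

```python
# Sketch: choosing an optimal strain cut-off from an ROC curve with the
# Youden index (sensitivity + specificity - 1). Synthetic data only.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
# minimum longitudinal strain (%, more negative = stronger contraction)
# for 30 dysfunctional and 30 normal ventricles (synthetic)
strain = np.concatenate([rng.normal(-6, 3, 30), rng.normal(-16, 4, 30)])
dysfunction = np.concatenate([np.ones(30), np.zeros(30)])

# less-negative strain indicates dysfunction, so strain is the score
fpr, tpr, thresholds = roc_curve(dysfunction, strain)
youden = tpr - fpr
best = np.argmax(youden)
print(f"AUC = {auc(fpr, tpr):.2f}, cut-off = {thresholds[best]:.1f} %")
print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```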

  10. A system for automatic recording and analysis of motor activity in rats.

    PubMed

    Heredia-López, Francisco J; May-Tuyub, Rossana M; Bata-García, José L; Góngora-Alfaro, José L; Alvarez-Cervera, Fernando J

    2013-03-01

    We describe the design and evaluation of an electronic system for the automatic recording of motor activity in rats. The device continually locates the position of a rat inside a transparent acrylic cube (50 cm/side) with infrared sensors arranged on its walls so as to correspond to the x-, y-, and z-axes. The system is governed by two microcontrollers. The raw data are saved in a text file within a secure digital memory card, and offline analyses are performed with a library of programs that automatically compute several parameters based on the sequence of coordinates and the time of occurrence of each movement. Four analyses can be made at specified time intervals: traveled distance (cm), movement speed (cm/s), time spent in vertical exploration (s), and thigmotaxis (%). In addition, three analyses are made for the total duration of the experiment: time spent at each x-y coordinate pair (min), time spent on vertical exploration at each x-y coordinate pair (s), and frequency distribution of vertical exploration episodes of distinct durations. User profiles of frequently analyzed parameters may be created and saved for future experimental analyses, thus obtaining a full set of analyses for a group of rats in a short time. The performance of the developed system was assessed by recording the spontaneous motor activity of six rats, while their behaviors were simultaneously videotaped for manual analysis by two trained observers. A high and significant correlation was found between the values measured by the electronic system and by the observers.
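
    The core offline analyses reduce to arithmetic on the coordinate sequence; a minimal sketch with a synthetic trajectory, not sensor data:

```python
# Sketch: traveled distance and movement speed from a sampled (x, y)
# trajectory; positions below are a synthetic random walk.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 10.0, 0.5)                # sample times (s)
x = np.cumsum(rng.normal(0.0, 2.0, t.size))  # x positions (cm)
y = np.cumsum(rng.normal(0.0, 2.0, t.size))  # y positions (cm)

steps = np.hypot(np.diff(x), np.diff(y))     # step lengths (cm)
distance = steps.sum()                       # traveled distance (cm)
speed = steps / np.diff(t)                   # speed per interval (cm/s)
print(f"distance = {distance:.1f} cm, mean speed = {speed.mean():.2f} cm/s")
```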

  11. Automatic detection of noisy channels in fNIRS signal based on correlation analysis.

    PubMed

    Guerrero-Mosquera, Carlos; Borragán, Guillermo; Peigneux, Philippe

    2016-09-15

    fNIRS signals can be contaminated by distinct sources of noise. While most of the noise can be corrected using digital filters, optimized experimental paradigms or pre-processing methods, few approaches focus on the automatic detection of noisy channels. In the present study, we propose a new method that automatically detects noisy fNIRS channels by combining the global correlations of the signal obtained from sliding windows (Cui et al., 2010) with correlation coefficients extracted from experimental conditions defined by triggers. The validity of the method was evaluated on test data from 17 participants, for a total of 16 NIRS channels per subject, positioned over frontal, dorsolateral prefrontal, parietal and occipital areas. Additionally, the detection of noisy channels was tested in the context of different levels of cognitive requirement in a working memory N-back paradigm. Bad channel detection accuracy, defined as the proportion of bad NIRS channels correctly detected among the total number of channels examined, was close to 91%. Under different cognitive conditions the area under the receiver operating characteristic curve (AUC) increased from 60.5% (global correlations) to 91.2% (local correlations). Our results show that global correlations are insufficient for detecting potentially noisy channels when the whole data signal is included in the analysis. In contrast, adding specific local information inherent to the experimental paradigm (e.g., cognitive conditions in a block or event-related design) improved detection performance for noisy channels. Also, we show that automated fNIRS channel detection can be achieved with high accuracy at low computational cost. Copyright © 2016 Elsevier B.V. All rights reserved.
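
    A minimal sketch of windowed inter-channel correlation scoring is given below; the window length and threshold are illustrative, and the published method additionally conditions the windows on experimental triggers.

```python
# Sketch: flagging noisy channels from windowed inter-channel
# correlations. Synthetic data: one channel carries no shared signal.
import numpy as np

rng = np.random.default_rng(0)
n_ch, n_samp, win = 16, 3000, 300
common = rng.normal(size=n_samp)                     # shared physiology
data = 0.8 * common + 0.2 * rng.normal(size=(n_ch, n_samp))
data[4] = rng.normal(size=n_samp)                    # channel 4: pure noise

scores = []
for ch in range(n_ch):
    others = np.delete(np.arange(n_ch), ch)
    r = []
    for start in range(0, n_samp - win + 1, win):
        seg = data[:, start:start + win]
        # correlation of this channel with the average of the others
        r.append(np.corrcoef(seg[ch], seg[others].mean(axis=0))[0, 1])
    scores.append(np.mean(r))

bad = [ch for ch, s in enumerate(scores) if s < 0.5]  # threshold assumed
print("flagged noisy channels:", bad)
```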

  12. Automatic isotropic fractionation for large-scale quantitative cell analysis of nervous tissue.

    PubMed

    Azevedo, Frederico A C; Andrade-Moraes, Carlos H; Curado, Marco R; Oliveira-Pinto, Ana V; Guimarães, Daniel M; Szczupak, Diego; Gomes, Bruna V; Alho, Ana T L; Polichiso, Livia; Tampellini, Edilaine; Lima, Luzia; de Lima, Daniel Oliveira; da Silva, Hudson Alves; Lent, Roberto

    2013-01-15

    Isotropic fractionation is a quantitative technique that allows reliable estimates of absolute numbers of neuronal and non-neuronal brain cells. However, although fast for a single small brain, it requires a long time to process large brains, or many small ones, when done manually. To solve this problem, we developed a machine to automate the method, and tested its efficiency, consistency, and reliability as compared with manual processing. The machine consists of a set of electronically controlled rotation and translation motors coupled to tissue grinders, which automatically transform fixed tissue into homogeneous nuclei suspensions. Speed and torque of the motors can be independently regulated by electronic circuits, according to the volume of tissue being processed and its mechanical resistance to fractionation. To test the machine, twelve paraformaldehyde-fixed rat brains and eight human cerebella were separated into two groups, respectively: one processed automatically and the other, manually. Both pairs of groups (rat and human tissue) followed the same, published protocol of the method. We compared the groups according to nuclei morphology, degree of clustering and number of cells. The machine proved superior for yielding faster results due to simultaneous processing in multiple grinders. Quantitative analysis of machine-processed tissue resulted in similar average numbers of total brain cells, neurons, and non-neuronal cells, statistically similar to the manually processed tissue and equivalent to previously published data. We concluded that the machine is more efficient because it utilizes many homogenizers simultaneously, equally consistent in producing high quality material for counting, and quantitatively reliable as compared to manual processing. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
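
    The pattern-extraction and prediction pipeline can be outlined with PCA, linear regression, and leave-one-out cross-validation; the purchase matrix and BMI values below are random placeholders for the cafeteria data.

```python
# Sketch: dietary-pattern extraction with PCA followed by BMI regression
# and leave-one-out cross-validation. All data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
X_food = rng.poisson(3.0, size=(100, 20)).astype(float)  # purchase counts
bmi = 22.0 + 0.3 * X_food[:, 0] + rng.normal(0.0, 1.5, 100)

patterns = PCA(n_components=5).fit_transform(X_food)     # dietary patterns
pred = cross_val_predict(LinearRegression(), patterns, bmi, cv=LeaveOneOut())

# accuracy of the "would-be obese" (BMI >= 23) classification
accuracy = np.mean((pred >= 23) == (bmi >= 23))
print(f"leave-one-out classification accuracy: {accuracy:.1%}")
```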

  14. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
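
    A toy version of the decision-tree classification step is sketched below, with invented event features and class labels; the patent record does not publish its exact feature set.

```python
# Sketch: classifying communication events with a decision tree trained
# on microbenchmark examples. Features and classes are illustrative.
from sklearn.tree import DecisionTreeClassifier

# features per event: [message_bytes, wait_time_us, bandwidth_MBps]
train_X = [[1e3,    5, 900], [1e6,   40, 850],   # efficient benchmarks
           [1e3, 5000,  90], [1e6, 9000,  60]]   # inefficient benchmarks
train_y = ["efficient", "efficient", "late_sender", "late_sender"]

clf = DecisionTreeClassifier(max_depth=3).fit(train_X, train_y)

# classify events taken from an application trace (made-up values)
traced_events = [[2e3, 4200, 110], [5e5, 35, 880]]
print(clf.predict(traced_events))  # -> ['late_sender' 'efficient']
```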

  15. Application of an automatic adaptive filter for Heart Rate Variability analysis.

    PubMed

    Dos Santos, Laurita; Barroso, Joaquim J; Macau, Elbert E N; de Godoy, Moacir F

    2013-12-01

    The presence of artifacts and noise effects in temporal series can seriously hinder the analysis of Heart Rate Variability (HRV). The tachograms should be carefully edited to avoid erroneous interpretations. The physician should carefully analyze the tachogram in order to detect points that might be associated with unlikely biophysical behavior and manually eliminate them from the data series. However, this is a time-consuming procedure. To facilitate the pre-analysis of the tachogram, this study uses a method of data filtering based on an adaptive filter which is quickly able to analyze a large amount of data. The method was applied to 229 time series from a database of patients with different clinical conditions: premature newborns, full-term newborns, healthy young adults, adults submitted to a very-low-calorie diet, and adults under preoperative evaluation for coronary artery bypass grafting. This proposed method is compared to the demanding conventional method, wherein the corrections of occasional ectopic beats and artifacts are usually manually executed by a specialist. To confirm the reliability of the results obtained, correlation coefficients were calculated, using both automatic and manual methods of filtering for each HRV index selected. A high correlation between the results was found, with highly significant p values, for all cases, except for some parameters analyzed in the premature newborns group, an issue that is thoroughly discussed. The authors concluded that the proposed adaptive filtering method helps to efficiently handle the task of editing temporal series for HRV analysis.
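
    One simple adaptive rule for tachogram editing, flagging beats that deviate strongly from a running median, is sketched below; the published filter is more elaborate, and the window and tolerance values are illustrative.

```python
# Sketch: flag RR intervals deviating from a running median, then
# interpolate over the flagged beats. Data and thresholds are synthetic.
import numpy as np

rng = np.random.default_rng(0)
rr = rng.normal(800, 30, 500)            # RR intervals (ms)
rr[[50, 200, 350]] = [300, 1600, 250]    # injected artifacts/ectopics

win, tol = 11, 0.2                       # window size, 20 % tolerance
half = win // 2
padded = np.pad(rr, half, mode="edge")
local_med = np.array([np.median(padded[i:i + win]) for i in range(rr.size)])
bad = np.abs(rr - local_med) > tol * local_med

print("flagged beats:", np.flatnonzero(bad))
# linear interpolation over flagged beats for the edited tachogram
rr_clean = np.interp(np.arange(rr.size), np.flatnonzero(~bad), rr[~bad])
```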

  16. Adaptive automatic data analysis in full-field fringe-pattern-based optical metrology

    NASA Astrophysics Data System (ADS)

    Trusiak, Maciej; Patorski, Krzysztof; Sluzewski, Lukasz; Pokorski, Krzysztof; Sunderland, Zofia

    2016-12-01

    Fringe pattern processing and analysis is an important task of full-field optical measurement techniques like interferometry, digital holography, structured illumination and moiré. In this contribution we present several adaptive automatic data analysis solutions based on the Hilbert-Huang transform for measurand retrieval via fringe pattern phase and amplitude demodulation. The Hilbert-Huang transform consists of a 2D empirical mode decomposition algorithm and Hilbert spiral transform analysis. Empirical mode decomposition adaptively dissects a meaningful number of same-scale subimages from the analyzed pattern - it is a data-driven method. Appropriately managing this set of unique subimages results in a very powerful fringe pre-filtering tool. Phase/amplitude demodulation is performed using the Hilbert spiral transform aided by a local fringe orientation estimator. We describe several optical measurement techniques for the characterization of technical and biological objects, based on specially tailored modifications of the Hilbert-Huang algorithm for fringe pattern denoising, detrending and amplitude/phase demodulation.
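
    The demodulation step can be illustrated in 1D with the analytic signal; the published method performs the 2D analogue via the Hilbert spiral transform with a fringe-orientation estimator.

```python
# Sketch: 1D analytic-signal demodulation of a fringe-like signal with a
# Hilbert transform. Signal parameters are synthetic.
import numpy as np
from scipy.signal import hilbert

x = np.linspace(0.0, 1.0, 2000)
amplitude = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * x)        # slow envelope
phase = 2 * np.pi * 40 * x + 3 * np.sin(2 * np.pi * x)   # carrier + signal
fringe = amplitude * np.cos(phase)

analytic = hilbert(fringe)                   # analytic signal
demod_amp = np.abs(analytic)                 # recovered amplitude
demod_phase = np.unwrap(np.angle(analytic))  # recovered continuous phase

# check away from the edges, where the Hilbert transform is accurate
err = np.max(np.abs(demod_amp[100:-100] - amplitude[100:-100]))
print(f"max amplitude error: {err:.3f}")
```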

  17. Fractal analysis of elastographic images for automatic detection of diffuse diseases of salivary glands: preliminary results.

    PubMed

    Badea, Alexandru Florin; Lupsor Platon, Monica; Crisan, Maria; Cattani, Carlo; Badea, Iulia; Pierro, Gaetano; Sannino, Gianpaolo; Baciut, Grigore

    2013-01-01

    The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases like medical imaging diagnosis, these details are very important since they might contain some hidden information about the possible existence of certain pathological lesions like tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. The fractal analysis is of great importance in relation to a quantitative evaluation of "real-time" elastography, a procedure considered to be operator dependent in the current clinical practice. Mathematical analysis reveals significant discrepancies among normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology.
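
    A box-counting estimate of the fractal dimension, the quantity the study computes for elastographic images, can be sketched as follows (the test image is synthetic):

```python
# Sketch: box-counting estimate of the fractal dimension (FD) of a
# binary image. The test image is a synthetic filled square.
import numpy as np

def box_counting_dimension(img):
    """img: 2D boolean array; returns slope of log N(s) vs log(1/s)."""
    sizes, counts = [], []
    s = min(img.shape) // 2
    while s >= 1:
        # count boxes of side s containing at least one foreground pixel
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        boxes = img[:h, :w].reshape(h // s, s, w // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(boxes.sum())
        s //= 2
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

# quick check on a filled square (expected FD close to 2)
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(f"estimated FD: {box_counting_dimension(img):.2f}")
```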

  18. Fractal Analysis of Elastographic Images for Automatic Detection of Diffuse Diseases of Salivary Glands: Preliminary Results

    PubMed Central

    Badea, Alexandru Florin; Lupsor Platon, Monica; Crisan, Maria; Cattani, Carlo; Badea, Iulia; Pierro, Gaetano; Sannino, Gianpaolo; Baciut, Grigore

    2013-01-01

    The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases like medical imaging diagnosis, these details are very important since they might contain some hidden information about the possible existence of certain pathological lesions like tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. The fractal analysis is of great importance in relation to a quantitative evaluation of “real-time” elastography, a procedure considered to be operator dependent in the current clinical practice. Mathematical analysis reveals significant discrepancies among normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology. PMID:23762183

  19. Automatic Classification of Staphylococci by Principal-Component Analysis and a Gradient Method1

    PubMed Central

    Hill, L. R.; Silvestri, L. G.; Ihm, P.; Farchi, G.; Lanciani, P.

    1965-01-01

    Hill, L. R. (Università Statale, Milano, Italy), L. G. Silvestri, P. Ihm, G. Farchi, and P. Lanciani. Automatic classification of staphylococci by principal-component analysis and a gradient method. J. Bacteriol. 89:1393–1401. 1965.—Forty-nine strains from the species Staphylococcus aureus, S. saprophyticus, S. lactis, S. afermentans, and S. roseus were submitted to different taxometric analyses; clustering was performed by single linkage, by the unweighted pair group method, and by principal-component analysis followed by a gradient method. Results were substantially the same with all methods. All S. aureus clustered together, sharply separated from S. roseus and S. afermentans; S. lactis and S. saprophyticus fell between, with the latter nearer to S. aureus. The main purpose of this study was to introduce a new taxometric technique, based on principal-component analysis followed by a gradient method, and to compare it with some other methods in current use. Advantages of the new method are complete automation and therefore greater objectivity, execution of the clustering in a space of reduced dimensions in which different characters have different weights, easy recognition of taxonomically important characters, and opportunity for representing clusters in three-dimensional models; the principal disadvantage is the need for large computer facilities. PMID:14293013

  20. [Real-time automatic analysis of sleep-waking behavior in the rat recorded by telemetry (author's transl)].

    PubMed

    Kirkham, P A; Lacoste, G; Rodrigues, L; Arnaud, C; Moos, F; Gottesmann, C

    1977-01-01

    The authors present a method for real-time automatic analysis of the sleep-waking cycle in the rat, recorded by a miniature telemetry system. The method detects seven behavioural phases second by second. Agreement between the computer and a human scorer is 88% for the three principal phases: waking, slow-wave sleep and paradoxical sleep.

  1. Characterization of the Vibrio cholerae VolA Surface-Exposed Lipoprotein Lysophospholipase

    PubMed Central

    Pride, Aaron C.; Guan, Ziqiang

    2014-01-01

    Bacterial lipases play important roles in bacterial metabolism and environmental response. Our laboratory recently discovered that a novel lipoprotein lysophospholipase, VolA, localizes on the surface of the Gram-negative aquatic pathogen Vibrio cholerae. VolA functions to cleave exogenous lysophosphatidylcholine, freeing the fatty acid moiety for use by V. cholerae. This fatty acid is transported into the cell and can be used as a nutrient and, more importantly, as a way to alter the membrane architecture via incorporation into the phospholipid biosynthesis pathway. There are few examples of Gram-negative, surface-exposed lipoproteins, and VolA is unique, as it has a previously undercharacterized function in V. cholerae membrane remodeling. Herein, we report the biochemical characterization of VolA. We show that VolA is a canonical lipoprotein via mass spectrometry analysis and demonstrate the in vitro activity of VolA under a variety of conditions. Additionally, we show that VolA contains a conserved Gly-Xaa-Ser-Xaa-Gly motif typical of lipases. Interestingly, we report the observation of VolA homologs in other aquatic pathogens. An Aeromonas hydrophila VolA homolog complements a V. cholerae VolA mutant in growth on lysophosphatidylcholine as the sole carbon source and in enzymatic assays. These results support the idea that the lipase activity of surface-exposed VolA likely contributes to the success of V. cholerae, improving the overall adaptation and survival of the organism in different environments. PMID:24532770

  2. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    SciTech Connect

    Gainey, M; Rothe, T

    2015-06-15

    Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis, and recalculation of the daily recorded fluence and hence dose distribution, bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within ±0.10 mm: 57% within ±0.01 mm; 89% within ±0.05 mm. Mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range -0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned dose distribution and that derived from log files is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.

  3. Automatic roof plane detection and analysis in airborne lidar point clouds for solar potential assessment.

    PubMed

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature for decomposing the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification. It results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The presented method fully automatically detected a subset of 809 out of 1,071 roof planes where the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m².
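
    The per-point normal estimation underlying the segmentation can be sketched as an SVD plane fit over a neighbourhood; the points below are a synthetic tilted patch, not lidar data.

```python
# Sketch: estimating a point's normal vector by fitting a plane to its
# neighbourhood with SVD. Neighbourhood is a synthetic tilted roof patch.
import numpy as np

rng = np.random.default_rng(0)
# synthetic neighbourhood: points on the plane z = 0.5x + 0.2y + noise
xy = rng.uniform(0, 1, size=(50, 2))
z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0, 0.005, 50)
pts = np.column_stack([xy, z])

centered = pts - pts.mean(axis=0)
# the right singular vector with the smallest singular value is the normal
_, _, vt = np.linalg.svd(centered, full_matrices=False)
normal = vt[-1]
normal *= np.sign(normal[2])               # orient upwards
slope = np.degrees(np.arccos(normal[2]))   # inclination from horizontal
print(f"normal = {normal.round(3)}, slope = {slope:.1f} deg")
```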

  4. Automatic Roof Plane Detection and Analysis in Airborne Lidar Point Clouds for Solar Potential Assessment

    PubMed Central

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature for decomposing the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification. It results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The presented method fully automatically detected a subset of 809 out of 1,071 roof planes where the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m². PMID:22346695

  5. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    PubMed

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.

  6. P3BSseq: parallel processing pipeline software for automatic analysis of bisulfite sequencing data.

    PubMed

    Luu, Phuc-Loi; Gerovska, Daniela; Arrospide-Elgarresta, Mikel; Retegi-Carrión, Sugoi; Schöler, Hans R; Araúzo-Bravo, Marcos J

    2017-02-01

    Bisulfite sequencing (BSseq) processing is among the most cumbersome next generation sequencing (NGS) applications. Though some BSseq processing tools are available, they are scattered, require puzzling parameters and are running-time and memory-usage demanding. We developed P3BSseq, a parallel processing pipeline for fast, accurate and automatic analysis of BSseq reads that trims, aligns, annotates, records the intermediate results, performs bisulfite conversion quality assessment, and generates BED methylome and report files following the NIH standards. P3BSseq outperforms the known BSseq mappers regarding running time and computer hardware requirements (processing power and memory use), and is optimized to process the upcoming, extended BSseq reads. We optimized the P3BSseq parameters for directional and non-directional libraries, and for single-end and paired-end reads of Whole Genome and Reduced Representation BSseq. P3BSseq is a user-friendly streamlined solution for BSseq upstream analysis, requiring only basic computer and NGS knowledge. Availability: P3BSseq binaries and documentation are available at http://sourceforge.net/p/p3bsseq/wiki/Home/. Contact: mararabra@yahoo.co.uk. Supplementary data are available at Bioinformatics online.

  7. Automatic sign language analysis: a survey and the future beyond lexical meaning.

    PubMed

    Ong, Sylvie C W; Ranganath, Surendra

    2005-06-01

    Research in automatic analysis of sign language has largely focused on recognizing the lexical (or citation) form of sign gestures as they appear in continuous signing, and developing algorithms that scale well to large vocabularies. However, successful recognition of lexical signs is not sufficient for a full understanding of sign language communication. Nonmanual signals and grammatical processes which result in systematic variations in sign appearance are integral aspects of this communication but have received comparatively little attention in the literature. In this survey, we examine data acquisition, feature extraction and classification methods employed for the analysis of sign language gestures. These are discussed with respect to issues such as modeling transitions between signs in continuous signing, modeling inflectional processes, signer independence, and adaptation. We further examine works that attempt to analyze nonmanual signals and discuss issues related to integrating these with (hand) sign gestures. We also discuss the overall progress toward a true test of sign recognition systems--dealing with natural signing by native signers. We suggest some future directions for this research and also point to contributions it can make to other fields of research. Web-based supplemental materials (appendices) which contain several illustrative examples and videos of signing can be found at www.computer.org/publications/dlib.

  8. Automatic Behavior Analysis During a Clinical Interview with a Virtual Human.

    PubMed

    Rizzo, Albert; Lucas, Gale; Gratch, Jonathan; Stratou, Giota; Morency, Louis-Philippe; Chavez, Kenneth; Shilling, Russ; Scherer, Stefan

    2016-01-01

    SimSensei is a Virtual Human (VH) interviewing platform that uses off-the-shelf sensors (i.e., webcams, Microsoft Kinect and a microphone) to capture and interpret real-time audiovisual behavioral signals from users interacting with the VH system. The system was specifically designed for clinical interviewing and health care support by providing a face-to-face interaction between a user and a VH that can automatically react to the inferred state of the user through analysis of behavioral signals gleaned from the user's facial expressions, body gestures and vocal parameters. Akin to how non-verbal behavioral signals have an impact on human-to-human interaction and communication, SimSensei aims to capture and infer user state from signals generated by user non-verbal communication to improve engagement between a VH and a user, and to quantify user state from the data captured across a 20-minute interview. Results from a sample of service members (SMs) who were interviewed before and after a deployment to Afghanistan indicate that SMs reveal more PTSD symptoms to the VH than they report on the Post Deployment Health Assessment. Pre/post-deployment facial expression analysis indicated more sad expressions and fewer happy expressions at post-deployment.

  9. Towards semi-automatic rock mass discontinuity orientation and set analysis from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Guo, Jiateng; Liu, Shanjun; Zhang, Peina; Wu, Lixin; Zhou, Wenhui; Yu, Yinan

    2017-06-01

    Obtaining accurate information on rock mass discontinuities for deformation analysis and the evaluation of rock mass stability is important. Obtaining measurements for high and steep zones with the traditional compass method is difficult. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have gradually become mainstream methods. In this study, a method that is based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points. The next step is to segment the point cloud into different point subsets. Various parameters, such as the normal, dip direction and dip, can be calculated for each point subset after obtaining the equation of its best-fit plane. A cluster analysis (a point subset that satisfies some conditions and thus forms a cluster) is performed based on the normal vectors by introducing the firefly algorithm (FA) and the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity sets are merged and coloured for visualization purposes. A prototype system is developed based on this method to extract the points of the rock discontinuity from a 3D point cloud. A comparison with existing software shows that this method is feasible. This method can provide a reference for rock mechanics, 3D geological modelling and other related fields.
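
    The grouping of normal vectors into discontinuity sets can be outlined as below; plain k-means stands in for the paper's firefly-algorithm-plus-fuzzy-c-means step, and the joint sets are synthetic.

```python
# Sketch: clustering unit normal vectors into discontinuity sets.
# KMeans is a simpler stand-in for the published FA + FCM clustering.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def noisy_normals(base, n=100, noise=0.05):
    # unit normals scattered around a base orientation
    v = base / np.linalg.norm(base) + rng.normal(0, noise, size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# three synthetic joint sets
normals = np.vstack([noisy_normals(np.array([1.0, 0.0, 0.3])),
                     noisy_normals(np.array([0.0, 1.0, 0.5])),
                     noisy_normals(np.array([0.3, 0.3, 1.0]))])

labels = KMeans(n_clusters=3, n_init=10).fit_predict(normals)
for k in range(3):
    mean_n = normals[labels == k].mean(axis=0)
    mean_n /= np.linalg.norm(mean_n)
    dip = np.degrees(np.arccos(abs(mean_n[2])))   # dip angle of the set
    print(f"set {k}: mean normal {mean_n.round(2)}, dip {dip:.1f} deg")
```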

  10. Automatic Vehicle Trajectory Extraction for Traffic Analysis from Aerial Video Data

    NASA Astrophysics Data System (ADS)

    Apeltauer, J.; Babinec, A.; Herman, D.; Apeltauer, T.

    2015-03-01

    This paper presents a new approach to simultaneous detection and tracking of vehicles moving through an intersection in aerial images acquired by an unmanned aerial vehicle (UAV). Detailed analysis of spatial and temporal utilization of an intersection is an important step for its design evaluation and further traffic inspection. Traffic flow at intersections is typically very dynamic and requires continuous and accurate monitoring systems. Conventional traffic surveillance relies on a set of fixed cameras or other detectors, requiring a high density of the said devices in order to monitor the intersection in its entirety and to provide data in sufficient quality. Alternatively, a UAV can be converted to a very agile and responsive mobile sensing platform for data collection from such large scenes. However, manual vehicle annotation in aerial images would involve tremendous effort. In this paper, the proposed combination of vehicle detection and tracking aims to tackle the problem of automatic traffic analysis at an intersection from visual data. The presented method has been evaluated in several real-life scenarios.

  11. An empirical analysis of the methodology of automatic imitation research in a strategic context.

    PubMed

    Aczel, Balazs; Kekecs, Zoltan; Bago, Bence; Szollosi, Aba; Foldes, Andrei

    2015-08-01

    Since the discovery of the mirror neuron system, it has been proposed that the automatic tendency to copy observed actions exists in humans and that this mechanism might be responsible for a range of social behavior. A strong argument for automatic behavior can be made when actions are executed against motivation to do otherwise. Strategic games in which imitation is disadvantageous serve as ideal designs for studying the automatic nature of participants' behavior. Most recently, Belot, Crawford, and Heyes (2013) conducted an explorative study using a modified version of the Rock-Paper-Scissors game, and suggested that in the case of asynchrony in the execution of the gestures, automatic imitation can be observed early on after the opponent's presentation. In our study, we video recorded the games, which allowed us to examine the effect of delay on imitative behavior as well as the sensitivity of the previously employed analyses. The examination of the recorded images revealed that more than 80% of the data were irrelevant to the study of automatic behavior. Additional bias in the paradigm became apparent, as previously presented gestures were found to affect the behavior of the players. After noise filtering, we found no evidence of automatic imitation in either the whole filtered data set or in selected time windows based on delay length. Besides questioning the strength of the results of previous analyses, we propose several experimental and statistical modifications for further research on automatic imitation. (c) 2015 APA, all rights reserved.

  12. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    PubMed

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  13. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    PubMed Central

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J.; Wild, Conor J.; Auer, Tibor; Linke, Annika C.; Peelle, Jonathan E.

    2015-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address. PMID:25642185

  14. Automatic differentiation bibliography

    SciTech Connect

    Corliss, G.F.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  15. Automatic differentiation bibliography

    SciTech Connect

    Corliss, G.F.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  16. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% by suitable grazing land.

  17. Automatic Imitation

    ERIC Educational Resources Information Center

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  18. Automatic indexing

    SciTech Connect

    Harman, D.

    1992-09-01

    Automatic indexing has been a critical technology as more full-text data becomes available online. The paper discusses issues for automatic indexing of different types of full-text and also presents a survey of much of the current research into new techniques for automatic indexing.

  19. Automatic Imitation

    ERIC Educational Resources Information Center

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  20. Gene Ontology density estimation and discourse analysis for automatic GeneRiF extraction.

    PubMed

    Gobeill, Julien; Tbahriti, Imad; Ehrler, Frédéric; Mottaz, Anaïs; Veuthey, Anne-Lise; Ruch, Patrick

    2008-04-11

    This paper describes and evaluates a sentence selection engine that extracts a GeneRiF (Gene Reference into Functions) as defined in ENTREZ-Gene based on a MEDLINE record. Inputs for this task include both a gene and a pointer to a MEDLINE reference. In the suggested approach we merge two independent sentence extraction strategies. The first proposed strategy (LASt) uses argumentative features, inspired by discourse-analysis models. The second extraction scheme (GOEx) uses an automatic text categorizer to estimate the density of Gene Ontology categories in every sentence; thus providing a full ranking of all possible candidate GeneRiFs. A combination of the two approaches is proposed, which also aims at reducing the size of the selected segment by filtering out non-content bearing rhetorical phrases. Based on the TREC-2003 Genomics collection for GeneRiF identification, the LASt extraction strategy is already competitive (52.78%). When used in a combined approach, the extraction task clearly shows improvement, achieving a Dice score of over 57% (+10%). Argumentative representation levels and conceptual density estimation using Gene Ontology contents appear complementary for functional annotation in proteomics.

  1. Gene Ontology density estimation and discourse analysis for automatic GeneRiF extraction

    PubMed Central

    Gobeill, Julien; Tbahriti, Imad; Ehrler, Frédéric; Mottaz, Anaïs; Veuthey, Anne-Lise; Ruch, Patrick

    2008-01-01

    Background This paper describes and evaluates a sentence selection engine that extracts a GeneRiF (Gene Reference into Functions) as defined in ENTREZ-Gene based on a MEDLINE record. Inputs for this task include both a gene and a pointer to a MEDLINE reference. In the suggested approach we merge two independent sentence extraction strategies. The first proposed strategy (LASt) uses argumentative features, inspired by discourse-analysis models. The second extraction scheme (GOEx) uses an automatic text categorizer to estimate the density of Gene Ontology categories in every sentence; thus providing a full ranking of all possible candidate GeneRiFs. A combination of the two approaches is proposed, which also aims at reducing the size of the selected segment by filtering out non-content bearing rhetorical phrases. Results Based on the TREC-2003 Genomics collection for GeneRiF identification, the LASt extraction strategy is already competitive (52.78%). When used in a combined approach, the extraction task clearly shows improvement, achieving a Dice score of over 57% (+10%). Conclusions Argumentative representation levels and conceptual density estimation using Gene Ontology contents appear complementary for functional annotation in proteomics. PMID:18426554

  2. Automatic Sleep Stage Scoring Using Time-Frequency Analysis and Stacked Sparse Autoencoders.

    PubMed

    Tsinalis, Orestis; Matthews, Paul M; Guo, Yike

    2016-05-01

    We developed a machine learning methodology for automatic sleep stage scoring. Our time-frequency analysis-based feature extraction is fine-tuned to capture sleep stage-specific signal features as described in the American Academy of Sleep Medicine manual that the human experts follow. We used ensemble learning with an ensemble of stacked sparse autoencoders for classifying the sleep stages. We used class-balanced random sampling across sleep stages for each model in the ensemble to avoid skewed performance in favor of the most represented sleep stages, and addressed the problem of misclassification errors due to class imbalance while significantly improving worst-stage classification. We used an openly available dataset from 20 healthy young adults for evaluation. We used a single channel of EEG from this dataset, which makes our method a suitable candidate for longitudinal monitoring using wearable EEG in real-world settings. Our method has both high overall accuracy (78%, range 75-80%), and high mean F1-score (84%, range 82-86%) and mean accuracy across individual sleep stages (86%, range 84-88%) over all subjects. The performance of our method appears to be uncorrelated with the sleep efficiency and percentage of transitional epochs in each recording.
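
    Class-balanced random sampling across sleep stages can be sketched as below, with invented stage labels and counts:

```python
# Sketch: draw the same number of training epochs from every sleep
# stage so rare stages are not swamped. Labels and counts are invented.
import numpy as np

rng = np.random.default_rng(0)
labels = np.array(["W"] * 500 + ["N1"] * 60 + ["N2"] * 900 +
                  ["N3"] * 200 + ["REM"] * 340)

per_class = 60   # sample size per stage (at most the rarest stage count)
balanced_idx = np.concatenate([
    rng.choice(np.flatnonzero(labels == s), per_class, replace=False)
    for s in np.unique(labels)
])
rng.shuffle(balanced_idx)
print({s: int((labels[balanced_idx] == s).sum()) for s in np.unique(labels)})
```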

  3. A marked point process of rectangles and segments for automatic analysis of digital elevation models.

    PubMed

    Ortner, Mathias; Descombes, Xavier; Zerubia, Josiane

    2008-01-01

    This work presents a framework for automatic feature extraction from images using stochastic geometry. Features in images are modeled as realizations of a spatial point process of geometrical shapes. This framework allows the incorporation of a priori knowledge on the spatial repartition of features. More specifically, we present a model based on the superposition of a process of segments and a process of rectangles. The former is dedicated to the detection of linear networks of discontinuities, while the latter aims at segmenting homogeneous areas. An energy is defined, favoring connections of segments, alignments of rectangles, as well as a relevant interaction between both types of objects. The estimation is performed by minimizing the energy using a simulated annealing algorithm. The proposed model is applied to the analysis of Digital Elevation Models (DEMs). These images are raster data representing the altimetry of a dense urban area. We present results on real data provided by the IGN (French National Geographic Institute), consisting of low-quality DEMs of various types.

  4. Analysis of automatic repeat request methods for deep-space downlinks

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Ekroot, L.

    1995-01-01

    Automatic repeat request (ARQ) methods cannot increase the capacity of a memoryless channel. However, they can be used to decrease the complexity of the channel-coding system to achieve essentially error-free transmission and to reduce link margins when the channel characteristics are poorly predictable. This article considers ARQ methods on a power-limited channel (e.g., the deep-space channel), where it is important to minimize the total power needed to transmit the data, as opposed to a bandwidth-limited channel (e.g., terrestrial data links), where the spectral efficiency or the total required transmission time is the most relevant performance measure. In the analysis, we compare the performance of three reference concatenated coded systems used in actual deep-space missions to that obtainable by ARQ methods using the same codes, in terms of required power, time to transmit with a given number of retransmissions, and achievable probability of word error. The ultimate limits of ARQ with an arbitrary number of retransmissions are also derived.
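
    The basic ARQ arithmetic behind such an analysis: with per-attempt word error probability p and retransmission until success, the expected number of transmissions is 1/(1-p); capping retransmissions at n attempts leaves a residual word error of p^n. A small worked calculation (numbers illustrative):

```python
# Sketch: expected transmissions and residual error for a simple ARQ
# scheme. The per-attempt word error probability is illustrative.
p_word_error = 0.1          # per-attempt word error probability

# unlimited retransmissions: geometric distribution of attempts
expected_tx = 1.0 / (1.0 - p_word_error)
print(f"expected transmissions per word: {expected_tx:.3f}")

# capped at n attempts: E[attempts] = sum_{k=0}^{n-1} p^k,
# residual word error = p^n
for n in (1, 2, 3, 5):
    mean_attempts = sum(p_word_error**k for k in range(n))
    print(f"n = {n}: residual word error = {p_word_error**n:.1e}, "
          f"mean attempts = {mean_attempts:.3f}")
```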

  5. Automatic fault diagnosis of rotating machines by time-scale manifold ridge analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; He, Qingbo; Kong, Fanrang

    2013-10-01

    This paper explores an improved time-scale representation that accounts for non-linear signal structure, with the aim of effectively identifying rotating machine faults in the time-scale domain. A new time-scale signature, called the time-scale manifold (TSM), is proposed in this study through combining phase space reconstruction (PSR), continuous wavelet transform (CWT), and manifold learning. For the TSM generation, an optimal scale band is selected to eliminate the influence of unconcerned scale components, and the noise in the selected band is suppressed by manifold learning to highlight the inherent non-linear structure of faulty impacts. The TSM preserves the non-stationary information and reveals the non-linear structure of the fault pattern, with the merits of noise suppression and resolution improvement. The TSM ridge is further extracted by seeking the ridge with energy concentration lying on the TSM signature. It inherits the advantages of both the TSM and ridge analysis, and hence is beneficial to demodulation of the fault information. Through analyzing the instantaneous amplitude (IA) of the TSM ridge, in which almost no noise remains, the fault characteristic frequency can be exactly identified. The whole process of the proposed fault diagnosis scheme is automatic, and its effectiveness has been verified by means of typical faulty vibration/acoustic signals from a gearbox and bearings. A reliable performance of the new method is validated in comparison with traditional enveloping methods for rotating machine fault diagnosis.

  6. Sensitivity Analysis of Photochemical Indicators for O3 Chemistry Using Automatic Differentiation

    SciTech Connect

    Zhang, Yang; Bischof, Christian H.; Easter, Richard C.; Wu, Po-Ting

    2005-05-01

    Photochemical indicators for the determination of O3-NOx-ROG sensitivity, and their sensitivity to model parameters, are studied for a variety of polluted conditions using a comprehensive mixed-phase chemistry box model and the novel automatic differentiation ADIFOR tool. The main chemical reaction pathways in all phases, interfacial mass transfer processes, and ambient physical parameters that affect the indicators are identified and analyzed. Condensed mixed-phase chemical mechanisms are derived from the sensitivity analysis. Our results show that cloud chemistry has a significant impact on the indicators and their sensitivities, particularly on those involving H2O2, HNO3, HCHO, and NOz. Caution should be taken when applying the established threshold values of indicators in regions with large cloud coverage. Among the commonly used indicators, NOy and O3/NOy are relatively insensitive to most model parameters, whereas indicators involving H2O2, HNO3, HCHO, and NOz are highly sensitive to changes in initial species concentrations, reaction rate constants, equilibrium constants, temperature, relative humidity, cloud droplet size, and cloud water content.

  7. Sensitivity analysis of a mixed-phase chemical mechanism using automatic differentiation

    SciTech Connect

    Zhang, Y.; Easter, R.C.

    1998-08-01

    A sensitivity analysis of a comprehensive mixed-phase chemical mechanism is conducted under a variety of atmospheric conditions. The local sensitivities of gas and aqueous phase species concentrations with respect to a variety of model parameters are calculated using the novel automatic differentiation ADIFOR tool. The main chemical reaction pathways in all phases, interfacial mass transfer processes, and ambient physical parameters that affect tropospheric O3 formation and O3-precursor relations under all modeled conditions are identified and analyzed. The results show that the presence of clouds not only reduces many gas phase species concentrations and the total oxidizing capacity but alters O3-precursor relations. Decreases in gas phase concentrations and photochemical formation rates of O3 can be up to 9% and 100%, respectively, depending on the preexisting atmospheric conditions. The decrease in O3 formation is primarily caused by the aqueous phase reactions of O2⁻ with dissolved HO2 and O3 under most cloudy conditions. © 1998 American Geophysical Union

  8. Quantitative analysis of retina layer elasticity based on automatic 3D segmentation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    He, Youmin; Qu, Yueqiao; Zhang, Yi; Ma, Teng; Zhu, Jiang; Miao, Yusi; Humayun, Mark; Zhou, Qifa; Chen, Zhongping

    2017-02-01

    Age-related macular degeneration (AMD) is an eye condition considered to be one of the leading causes of blindness among people over 50. Recent studies suggest that the mechanical properties of the retinal layers are affected during the early onset of the disease. It is therefore necessary to identify such changes in the individual layers of the retina so as to provide useful information for disease diagnosis. In this study, we propose using an acoustic radiation force optical coherence elastography (ARF-OCE) system to dynamically excite the porcine retina and detect the vibrational displacement with phase-resolved Doppler optical coherence tomography. Due to the vibrational mechanism of the tissue response, the image quality is compromised during elastogram acquisition. In order to properly analyze the images, all signals, including the trigger and control signals for excitation, as well as the detection and scanning signals, are synchronized within the OCE software and kept consistent between frames, enabling straightforward phase unwrapping and elasticity analysis. In addition, a combination of segmentation algorithms is used to accommodate the compromised image quality. An automatic 3D segmentation method has been developed to isolate and measure the relative elasticity of every individual retinal layer. Two different segmentation schemes, based on random walker and dynamic programming, are implemented. The algorithm has been validated using a 3D region of the porcine retina, where individual layers have been isolated and analyzed using statistical methods. The errors compared to manual segmentation will be calculated.

  9. Automatic analysis and characterization of the hummingbird wings motion using dense optical flow features.

    PubMed

    Martínez, Fabio; Manzanera, Antoine; Romero, Eduardo

    2015-01-19

    A new method for automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., the pixels with larger velocities. The kinematics and deformation of the wings are then characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing, and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation estimates those wing foci with larger deformation. Finally, a local measure of the orientation highlights those regions with maximal deformation. The approach was evaluated on a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw-turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of [Formula: see text] and [Formula: see text] for the global angular acceleration and the global wing deformation, respectively.
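
    A dense optical-flow field of the kind this method builds on can be computed with OpenCV's Farneback estimator. The sketch below is a simplified stand-in for the paper's multiscale flow; the file names, parameter values and the 95th-percentile wing threshold are assumptions.

      import cv2
      import numpy as np

      prev = cv2.imread('frame_000.png', cv2.IMREAD_GRAYSCALE)
      curr = cv2.imread('frame_001.png', cv2.IMREAD_GRAYSCALE)

      # dense flow: one (dx, dy) vector per pixel
      # args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
      flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                          0.5, 3, 15, 3, 5, 1.2, 0)
      mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

      # wings = fast-moving pixels; the threshold is data dependent
      wing_mask = mag > np.percentile(mag, 95)
      # orientation variance inside the mask flags strongly deforming regions
      print(wing_mask.sum(), np.var(ang[wing_mask]))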

  10. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NASA Astrophysics Data System (ADS)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.
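
    Once the feet are registered, the asymmetric analysis itself reduces to a per-pixel temperature difference. A minimal sketch, assuming registration has already been performed and using the commonly cited 2.2 °C screening threshold (the study's own threshold may differ):

      import numpy as np

      def asymmetry_hotspots(temp_left, temp_right_reg, delta=2.2):
          """Flag suspect spots from two registered plantar temperature
          maps in deg C; the right foot is mirrored so anatomically
          corresponding points line up column-wise."""
          diff = temp_left - np.fliplr(temp_right_reg)
          return np.abs(diff) > delta

      left = 30 + 0.5 * np.random.randn(240, 120)
      right = 30 + 0.5 * np.random.randn(240, 120)
      right[100:110, 40:50] += 4.0          # simulated inflamed area
      print(asymmetry_hotspots(left, right).sum())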

  11. Automatic Actin Filament Quantification of Osteoblasts and Their Morphometric Analysis on Microtextured Silicon-Titanium Arrays

    PubMed Central

    Matschegewski, Claudia; Staehlke, Susanne; Birkholz, Harald; Lange, Regina; Beck, Ulrich; Engel, Konrad; Nebe, J. Barbara

    2012-01-01

    Microtexturing of implant surfaces is of major relevance in the endeavor to improve biorelevant implant designs. In order to elucidate the role of a biomaterial's topography in cell physiology, quantitative correlations between cellular behavior and distinct microarchitectural properties are in great demand. Until now, the microscopically observed reorganization of the cytoskeleton on structured biomaterials has been difficult to convert into data. We used geometrically microtextured silicon-titanium arrays as a model system. Samples were prepared by deep reactive-ion etching of silicon wafers, resulting in rectangular grooves (width and height: 2 µm) and cubic pillars (pillar dimensions: 2 × 2 × 5 and 5 × 5 × 5 µm), finally sputter-coated with 100 nm titanium. We focused on the morphometric analysis of MG-63 osteoblasts, including a quantification of the actin cytoskeleton. By means of our novel software FilaQuant, developed especially for automatic actin filament recognition, we were able, for the first time, to quantify the alterations of the actin network dependent on the microtexture of a material surface. The cells' actin fibers were significantly reduced in length on the pillared surfaces versus the grooved array (4-5 fold) and completely reorganized on the micropillars, but without altering the orientation of the cells. Our morpho-functional approach opens new possibilities for the data correlation of cell-material interactions.

  12. Automatic Robust Neurite Detection and Morphological Analysis of Neuronal Cell Cultures in High-content Screening

    PubMed Central

    Wu, Chaohong; Schulte, Joost; Sepp, Katharine J.; Littleton, J. Troy

    2011-01-01

    Cell-based high content screening (HCS) is becoming an important and increasingly favored approach in therapeutic drug discovery and functional genomics. In HCS, changes in cellular morphology and biomarker distributions provide an information-rich profile of cellular responses to experimental treatments such as small molecules or gene knockdown probes. One obstacle that currently exists with such cell-based assays is the availability of image processing algorithms that are capable of reliably and automatically analyzing large HCS image sets. HCS images of primary neuronal cell cultures are particularly challenging to analyze due to complex cellular morphology. Here we present a robust method for quantifying and statistically analyzing the morphology of neuronal cells in HCS images. The major advantages of our method over existing software lie in its capability to correct non-uniform illumination using the contrast-limited adaptive histogram equalization method; segment neuromeres using Gabor-wavelet texture analysis; and detect faint neurites by a novel phase-based neurite extraction algorithm that is invariant to changes in illumination and contrast and can accurately localize neurites. Our method was successfully applied to analyze a large HCS image set generated in a morphology screen for polyglutamine-mediated neuronal toxicity using primary neuronal cell cultures derived from embryos of a Drosophila Huntington’s Disease (HD) model. PMID:20405243
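
    Two of the named ingredients are available off the shelf in scikit-image; the sketch below chains illumination correction and Gabor texture energy. The phase-based neurite extraction step is the authors' own algorithm and is not reproduced; the file name and filter parameters are assumptions.

      import numpy as np
      from skimage import exposure, filters, io

      img = io.imread('hcs_well.png', as_gray=True)   # hypothetical HCS image

      # contrast-limited adaptive histogram equalization corrects the
      # non-uniform illumination typical of HCS plates
      img_eq = exposure.equalize_adapthist(img, clip_limit=0.02)

      # Gabor texture energy over several orientations as a stand-in for
      # the paper's Gabor-wavelet segmentation of neuromeres
      energy = np.zeros_like(img_eq)
      for theta in np.linspace(0, np.pi, 4, endpoint=False):
          real, imag = filters.gabor(img_eq, frequency=0.15, theta=theta)
          energy += real ** 2 + imag ** 2
      neuromere_mask = energy > filters.threshold_otsu(energy)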

  13. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis.

    PubMed

    Liu, Chanjuan; van Netten, Jaap J; van Baal, Jeff G; Bus, Sicco A; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8% ± 1.1% sensitivity and 98.4% ± 0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.

  14. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A(2) SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.
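
    For reference, baseline SIFT tie-point extraction with Lowe's ratio test is a few lines in OpenCV. This is the standard operator, not the authors' auto-adaptive A(2) SIFT; the file names are placeholders.

      import cv2

      img1 = cv2.imread('aerial_left.tif', cv2.IMREAD_GRAYSCALE)
      img2 = cv2.imread('aerial_right.tif', cv2.IMREAD_GRAYSCALE)

      sift = cv2.SIFT_create()              # OpenCV >= 4.4
      kp1, des1 = sift.detectAndCompute(img1, None)
      kp2, des2 = sift.detectAndCompute(img2, None)

      # Lowe's ratio test rejects ambiguous correspondences
      matcher = cv2.BFMatcher(cv2.NORM_L2)
      tie_points = [m for m, n in matcher.knnMatch(des1, des2, k=2)
                    if m.distance < 0.75 * n.distance]
      print(len(tie_points), 'candidate tie points')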

  15. Automatic detection of basal cell carcinoma using telangiectasia analysis in dermoscopy skin lesion images.

    PubMed

    Cheng, Beibei; Erdos, David; Stanley, Ronald J; Stoecker, William V; Calcara, David A; Gómez, David D

    2011-08-01

    Telangiectasia, small dilated blood vessels of varying diameter near the surface of the skin, are critical dermoscopy structures used in the detection of basal cell carcinoma (BCC). Distinguishing these vessels from other telangiectasia that are commonly found in sun-damaged skin is challenging. Image analysis techniques are investigated to automatically find vessel structures in BCC. The primary screen for vessels uses an optimized local color drop technique. A noise filter is developed to eliminate false-positive structures, primarily bubbles, hair, and blotch and ulcer edges. From the telangiectasia mask containing candidate vessel-like structures, shape, size and normalized count features are computed to facilitate the discrimination of benign skin lesions from BCCs with telangiectasia. Experimental results yielded a diagnostic accuracy as high as 96.7% using a neural network classifier for a data set of 59 BCCs and 152 benign lesions for skin lesion discrimination based on features computed from the telangiectasia masks. In current clinical practice, it is possible to find smaller BCCs by dermoscopy than by clinical inspection. Although almost all of these small BCCs have telangiectasia, they can be short and thin. Normalization of lengths and areas helps to detect these smaller BCCs. © 2011 John Wiley & Sons A/S.

  16. Performance of an automatic quantitative ultrasound analysis of the fetal lung to predict fetal lung maturity.

    PubMed

    Palacio, Montse; Cobo, Teresa; Martínez-Terrón, Mònica; Rattá, Giuseppe A; Bonet-Carné, Elisenda; Amat-Roldán, Ivan; Gratacós, Eduard

    2012-12-01

    The objective of the study was to evaluate the performance of the automatic quantitative ultrasound analysis (AQUA) texture extractor in predicting fetal lung maturity tests in amniotic fluid. Singleton pregnancies (24.0-41.0 weeks) undergoing amniocentesis to assess fetal lung maturity (TDx fetal lung maturity assay [FLM]) were included. A manually delineated box was placed in the lung area of a 4-chamber view of the fetal thorax. AQUA transformed the information into a set of descriptors. Genetic algorithms extracted the most relevant descriptors and then created and validated a model that could distinguish between mature and immature fetal lungs using TDx-FLM as a reference. Gestational age at enrollment was (mean [SD]) 32.2 (4.5) weeks. According to the TDx-FLM results, 41 samples were mature and 62 were not. The imaging biomarker based on AQUA presented a sensitivity of 95.1%, a specificity of 85.7%, and an accuracy of 90.3% in predicting a mature or immature lung. Fetal lung ultrasound textures extracted by AQUA provided robust features to predict TDx-FLM results. Copyright © 2012 Mosby, Inc. All rights reserved.

  17. Automatic identification of mobile and rigid substructures in molecular dynamics simulations and fractional structural fluctuation analysis.

    PubMed

    Martínez, Leandro

    2015-01-01

    The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, possibly resulting in poor quantification of the structural fluctuations and, often, in overlooking important fluctuations associated with biological function. The motivation of this work is to provide a robust measure of structural mobility that is practical and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of the structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments was named MDLovoFit and is available as free software at: http://leandro.iqm.unicamp.br/mdlovofit.
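
    The iterative idea is compact enough to sketch in NumPy: superpose with the Kabsch algorithm, keep the fraction of atoms showing the smallest displacements, and repeat. This is an illustrative reading of the LOVO strategy, not the MDLovoFit implementation; the fraction and iteration count are arbitrary.

      import numpy as np

      def kabsch(P, Q):
          """Rotation R (rows transform as p @ R) superposing P onto Q;
          both are N x 3 and already centered."""
          U, _, Vt = np.linalg.svd(P.T @ Q)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
          return U @ D @ Vt

      def lovo_align(mobile, ref, frac=0.7, iters=20):
          idx = np.arange(len(ref))
          for _ in range(iters):
              R = kabsch(mobile[idx] - mobile[idx].mean(axis=0),
                         ref[idx] - ref[idx].mean(axis=0))
              mobile = (mobile - mobile[idx].mean(axis=0)) @ R \
                       + ref[idx].mean(axis=0)
              dev = np.linalg.norm(mobile - ref, axis=1)
              # keep only the frac of atoms with smallest displacements
              idx = np.argsort(dev)[: int(frac * len(ref))]
          return mobile, idx   # aligned coords + least-mobile core

      ref = np.random.rand(200, 3)
      frame = ref + 0.01 * np.random.randn(200, 3)
      frame[:40] += 0.5 * np.random.randn(40, 3)   # a highly mobile loop
      aligned, core = lovo_align(frame, ref)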

  18. Automatic detection of basal cell carcinoma using telangiectasia analysis in dermoscopy skin lesion images

    PubMed Central

    Cheng, Beibei; Erdos, David; Stanley, Ronald J.; Stoecker, William V.; Calcara, David A.; Gómez, David D.

    2011-01-01

    Background: Telangiectasia, small dilated blood vessels of varying diameter near the surface of the skin, are critical dermoscopy structures used in the detection of basal cell carcinoma (BCC). Distinguishing these vessels from other telangiectasia that are commonly found in sun-damaged skin is challenging. Methods: Image analysis techniques are investigated to automatically find vessel structures in BCC. The primary screen for vessels uses an optimized local color drop technique. A noise filter is developed to eliminate false-positive structures, primarily bubbles, hair, and blotch and ulcer edges. From the telangiectasia mask containing candidate vessel-like structures, shape, size and normalized count features are computed to facilitate the discrimination of benign skin lesions from BCCs with telangiectasia. Results: Experimental results yielded a diagnostic accuracy as high as 96.7% using a neural network classifier for a data set of 59 BCCs and 152 benign lesions for skin lesion discrimination based on features computed from the telangiectasia masks. Conclusion: In current clinical practice, it is possible to find smaller BCCs by dermoscopy than by clinical inspection. Although almost all of these small BCCs have telangiectasia, they can be short and thin. Normalization of lengths and areas helps to detect these smaller BCCs. PMID:23815446

  19. Automatic aerial image shadow detection through the hybrid analysis of RGB and HIS color space

    NASA Astrophysics Data System (ADS)

    Wu, Jun; Li, Huilin; Peng, Zhiyong

    2015-12-01

    This paper presents our research on automatic shadow detection from high-resolution aerial images through the hybrid analysis of the RGB and HIS color spaces. To this end, the spectral characteristics of shadow are first discussed, and three kinds of spectral components, namely the difference between the normalized blue and normalized red components (B-R), intensity, and saturation, are selected as criteria to obtain an initial segmentation of the shadow region (called primary segmentation). After that, within the normalized RGB color space and the HIS color space respectively, the shadow region is extracted again (called auxiliary segmentation) using Otsu thresholding. Finally, the primary and auxiliary segmentations are combined through a logical AND operation to obtain a reliable shadow region. In this step, small shadow areas are removed from the combined shadow region, and morphological algorithms are applied to fill small holes as well. The experimental results show that the proposed approach can effectively detect the shadow region from high-resolution aerial images with a high degree of automation.
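
    One plausible reading of this pipeline in OpenCV terms, with Otsu thresholds on the three criteria combined by a logical AND; the exact criteria combination, parameters and file name are assumptions.

      import cv2
      import numpy as np

      bgr = cv2.imread('aerial.tif')                 # placeholder file name
      b, g, r = [c.astype(np.float32) for c in cv2.split(bgr)]
      nb_minus_nr = (b - r) / (b + g + r + 1e-6)     # normalized B - R

      hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
      intensity, saturation = hsv[..., 2], hsv[..., 1]

      def otsu_mask(x):
          # rescale any criterion image to 8 bit, then Otsu-threshold it
          x8 = cv2.normalize(x.astype(np.float32), None, 0, 255,
                             cv2.NORM_MINMAX).astype(np.uint8)
          _, m = cv2.threshold(x8, 0, 255,
                               cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          return m > 0

      # shadows: high normalized B-R, low intensity, high saturation
      shadow = (otsu_mask(nb_minus_nr) & ~otsu_mask(intensity)
                & otsu_mask(saturation))

      # opening removes small areas, closing fills small holes
      kernel = np.ones((3, 3), np.uint8)
      shadow = cv2.morphologyEx(shadow.astype(np.uint8), cv2.MORPH_OPEN, kernel)
      shadow = cv2.morphologyEx(shadow, cv2.MORPH_CLOSE, kernel)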

  20. A unified framework for automatic wound segmentation and analysis with deep convolutional neural networks.

    PubMed

    Wang, Changhan; Yan, Xinchen; Smith, Max; Kochhar, Kanika; Rubin, Marcie; Warren, Stephen M; Wrobel, James; Lee, Honglak

    2015-01-01

    Wound surface area changes over multiple weeks are highly predictive of the wound healing process. Furthermore, the quality and quantity of the tissue in the wound bed also offer important prognostic information. Unfortunately, accurate measurements of wound surface area changes are out of reach in the busy wound practice setting. Currently, clinicians estimate wound size by estimating wound width and length using a scalpel after wound treatment, which is highly inaccurate. To address this problem, we propose an integrated system to automatically segment wound regions and analyze wound conditions in wound images. Different from previous segmentation techniques which rely on handcrafted features or unsupervised approaches, our proposed deep learning method jointly learns task-relevant visual features and performs wound segmentation. Moreover, learned features are applied to further analysis of wounds in two ways: infection detection and healing progress prediction. To the best of our knowledge, this is the first attempt to automate long-term predictions of general wound healing progress. Our method is computationally efficient and takes less than 5 seconds per wound image (480 by 640 pixels) on a typical laptop computer. Our evaluations on a large-scale wound database demonstrate the effectiveness and reliability of the proposed system.
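
    The record does not specify the network architecture; the toy encoder-decoder below only illustrates the input/output contract of such a segmentation model on the 480 by 640 images quoted, and is not the authors' design.

      import torch
      import torch.nn as nn

      class TinySegNet(nn.Module):
          """Minimal encoder-decoder producing a per-pixel wound logit."""
          def __init__(self):
              super().__init__()
              self.enc = nn.Sequential(
                  nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
              self.dec = nn.Sequential(
                  nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
                  nn.ConvTranspose2d(16, 1, 2, stride=2))

          def forward(self, x):
              return self.dec(self.enc(x))   # apply sigmoid for a mask

      net = TinySegNet()
      img = torch.rand(1, 3, 480, 640)       # image size quoted above
      print(net(img).shape)                  # torch.Size([1, 1, 480, 640])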

  1. Improving knowledge of patient skills thanks to automatic analysis of online discussions.

    PubMed

    Hamon, Thierry; Gagnayre, Rémi

    2013-08-01

    The objective is to automatically analyze online discussions related to diabetes and to extract information on the skills patients use to manage this disease. Two collections of about 7,000 and 23,000 messages from online discussion fora and 174 skills from an available taxonomy are processed with Natural Language Processing methods and semantically enriched. Skills are projected onto the messages to detect those skills which are mentioned by patients. Quantitative and qualitative evaluation is performed. The method recognizes almost all the targeted skills in the fora. The quality of skill recognition varies with the method's parameters. Most of the selected messages are relevant to at least one of the associated skills. Manual analysis shows that a substantial number of messages are dedicated to daily self-care and psychosocial skills. Studying real exchanges between patients leads to a better understanding of their skills in the daily self-management of diabetes. Our experiments can be useful for a better understanding and better knowledge of the self-management of diseases by patients. They can also help refine existing patient education programs. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    PubMed Central

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336

  3. Automatic Detections of P and S Phases using Singular Value Decomposition Analysis

    NASA Astrophysics Data System (ADS)

    Kurzon, I.; Vernon, F.; Ben-Zion, Y.; Rosenberger, A.

    2012-12-01

    We implement a new method for the automatic detection of the primary P and S phases using Singular Value Decomposition (SVD) analysis. The method is based on a real-time iteration algorithm of Rosenberger (2010) for the SVD of three-component seismograms. Rosenberger's algorithm identifies the incidence angle by applying SVD and separates the waveforms into their P and S components. We have been using the same algorithm, with the modification that we apply a set of filters prior to the SVD, and study how successfully these filters detect the P and S arrivals at different stations and segments of the San Jacinto Fault Zone. A recent deployment in the San Jacinto Fault Zone area provides a very dense seismic network, with ~90 stations in a fault zone that is 150 km long and 30 km wide. Embedded in this network are 5 linear arrays crossing the fault trace, with ~10 stations at ~25-50 m spacing in each array. This allows us to test the detection algorithm in a diverse setting, including events with different source mechanisms, stations with different site characteristics, and ray paths that diverge from the SVD approximation used in the algorithm, such as rays propagating within the fault and recorded on the linear arrays. Comparing our new method with classic automatic detection methods using Short Time Average (STA) to Long Time Average (LTA) ratios, we show the success of this SVD detection. Unlike the STA-to-LTA ratio methods, which normally tend to detect the P phase but in many cases cannot distinguish the S arrival, the main advantage of the SVD method is that almost all the P arrivals have an associated S arrival. Moreover, even for short-distance events, in which the S arrivals are masked by the P waves, the SVD algorithm under low-band filters manages to detect those S arrivals. The method is less consistent for stations located directly on the fault traces, for which the SVD approximation is not always valid; but even in such cases the
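
    The core of such a detector is the SVD of a short three-component window. A batch sketch of what the singular vectors provide (Rosenberger's algorithm is a real-time iterative SVD; this simplification is for illustration only):

      import numpy as np

      def polarization(window):
          """window: 3 x N array of Z, N, E samples around a pick.

          A strong, near-vertical first singular vector with high
          rectilinearity indicates P energy; S energy polarizes
          transversely."""
          U, s, _ = np.linalg.svd(window - window.mean(axis=1, keepdims=True))
          rectilinearity = 1.0 - s[1] / s[0]       # ~1 for a body-wave pulse
          incidence = np.degrees(np.arccos(abs(U[0, 0])))  # from vertical
          return rectilinearity, incidence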

  4. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven, through many applications in fluid dynamics and structural mechanics, to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives, respectively. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirements, computational efficiency, and accuracy.

  5. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    PubMed

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern during continuous flexion and extension were collected for five patients prone to patellar luxation, both pre- and post-surgically. In the proposed method, an observer places landmarks in a single 3D volume, which are then automatically propagated to the other volumes in a time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be preferable to a fully manual counterpart, with an observer variability of approximately 1.5[Formula: see text] for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0[Formula: see text]-5.0[Formula: see text] for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15[Formula: see text]-20[Formula: see text] for three patients, but with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods, but with the distinct advantage of supporting continuous motion during image acquisition.

  6. Automatic detection of cortical and PSC cataracts using texture and intensity analysis on retro-illumination lens images.

    PubMed

    Chow, Yew Chung; Gao, Xinting; Li, Huiqi; Lim, Joo Hwee; Sun, Ying; Wong, Tien Yin

    2011-01-01

    Cataract remains a leading cause of blindness worldwide. Cataract diagnosis via human grading is subjective and time-consuming. Several methods of automatic grading are currently available, but each of them suffers from some drawbacks. In this paper, a new approach for automatic detection based on texture and intensity analysis is proposed to address the problems of existing methods and to improve performance in three respects, namely ROI detection, lens mask generation and opacity detection. In the detection method, image clipping and texture analysis are applied to overcome the over-detection problem for clear lens images, and global thresholding is exploited to solve the under-detection problem for severe cataract images. The proposed method is tested on 725 retro-illumination lens images randomly selected from a database of a community study. Experiments show improved performance compared with the state-of-the-art method.

  7. Implementation of terbium-sensitized luminescence in sequential-injection analysis for automatic analysis of orbifloxacin.

    PubMed

    Llorent-Martínez, E J; Ortega-Barrales, P; Molina-Díaz, A; Ruiz-Medina, A

    2008-12-01

    Orbifloxacin (ORBI) is a third-generation fluoroquinolone developed exclusively for use in veterinary medicine, mainly in companion animals. This antimicrobial agent has bactericidal activity against numerous gram-negative and gram-positive bacteria. A few chromatographic methods for its analysis have been described in the scientific literature. Here, coupling of sequential-injection analysis and solid-phase spectroscopy is described in order to develop, for the first time, a terbium-sensitized luminescent optosensor for analysis of ORBI. The cationic resin Sephadex-CM C-25 was used as solid support and measurements were made at 275/545 nm. The system had a linear dynamic range of 10-150 ng mL(-1), with a detection limit of 3.3 ng mL(-1) and an R.S.D. below 3% (n = 10). The analyte was satisfactorily determined in veterinary drugs and dog and horse urine.

  8. Ambulatory 24-h oesophageal impedance-pH recordings: reliability of automatic analysis for gastro-oesophageal reflux assessment.

    PubMed

    Roman, S; Bruley des Varannes, S; Pouderoux, P; Chaput, U; Mion, F; Galmiche, J-P; Zerbib, F

    2006-11-01

    Oesophageal pH-impedance monitoring allows detection of acid and non-acid gastro-oesophageal reflux (GOR) events. Visual analysis of impedance recordings requires expertise. Our aim was to evaluate the efficacy of an automated analysis for GOR assessment. Seventy-three patients with suspected GORD underwent 24-h oesophageal pH-impedance monitoring. Analysis of the recordings was performed visually (V) and automatically using the Autoscan function (AS) of the Bioview software. A symptom index (SI) ≥50% was considered to indicate a significant association between symptoms and reflux events. AS analysis detected more reflux events, especially non-acid, liquid, pure gas and proximal events. Detection of oesophageal acid exposure and acid reflux events was similar with both analyses. Agreement between V and AS analysis was good (Kendall's coefficient W > 0.750, P < 0.01) for all parameters. During the pH-impedance studies, 65 patients reported symptoms. Compared with visual analysis, the sensitivity and specificity of a positive SI determined by AS were respectively 85.7% and 80% for all reflux events, 100% and 98% for acid reflux, and 33% and 87.5% for non-acid reflux. Despite good agreement with visual analysis, automatic analysis overestimates the number of non-acid reflux events. Visual analysis remains the gold standard for detecting an association between symptoms and non-acid reflux events.

  9. Stable hydrogen isotopic analysis of nanomolar molecular hydrogen by automatic multi-step gas chromatographic separation.

    PubMed

    Komatsu, Daisuke D; Tsunogai, Urumu; Kamimura, Kanae; Konno, Uta; Ishimura, Toyoho; Nakagawa, Fumiko

    2011-11-15

    We have developed a new automated analytical system that employs a continuous flow isotope ratio mass spectrometer to determine the stable hydrogen isotopic composition (δD) of nanomolar quantities of molecular hydrogen (H(2)) in an air sample. This method improves on previous methods to attain simpler and lower-cost analyses, especially by avoiding the use of expensive or special devices, such as a Toepler pump, a cryogenic refrigerator, and a special evacuation system to keep the temperature of a coolant under reduced pressure. Instead, the system allows H(2) purification from the air matrix via automatic multi-step gas chromatographic separation using the coolants of both liquid nitrogen (77 K) and liquid nitrogen + ethanol (158 K) under 1 atm pressure. The analytical precision of the δD determination using the developed method was better than 4‰ for >5 nmol injections (250 mL STP for a 500 ppbv air sample) and better than 15‰ for 1 nmol injections, regardless of the δD value, within 1 h for one sample analysis. Using the developed system, the δD values of H(2) can be quantified for atmospheric samples as well as samples of representative sources and sinks, including those containing small quantities of H(2), such as H(2) in soil pores or aqueous environments, for which there is currently little δD data available. As an example of such trace H(2) analyses, we report here the isotope fractionations during H(2) uptake by soils in a static chamber. The δD values of H(2) in these H(2)-depleted environments can be useful in constraining the budgets of atmospheric H(2) by applying an isotope mass balance model.

  10. A new automatic device for routine cord blood banking: critical analysis of different volume reduction methodologies.

    PubMed

    Solves, Pilar; Mirabet, Vicente; Blanquer, Amando; Delgado-Rosas, Francisco; Planelles, Dolores; Andrade, Margarita; Carbonell-Uberos, Francisco; Soler, M Angeles; Roig, Roberto

    2009-01-01

    Volume reduction is the usual process in cord blood banking; it has some advantages, including reducing the storage space and the dimethyl sulfoxide (DMSO) quantity in the final product. The volume reduction methodology must guarantee high cell recovery and red blood cell (RBC) depletion while reducing all the umbilical cord blood (UCB) units to a standard volume. We critically analyzed and compared three different volume reduction methods [hydroxyethyl starch (HES), top and bottom with Optipress II and Compomat G4, and AXP] used at the Valencia Cord Blood Bank over 10 years. The highest significant RBC depletion was achieved with the AXP system (P<0.001), while the top and bottom system with Compomat G4 and a buffy coat (BC) volume adjusted to 41 mL enabled the best total nucleated cell (TNC) recovery (P<0.001). TNC recovery and RBC depletion were similar for AXP and HES with a volume adjusted to 21 mL. In the multivariate analysis, when analyzing all cases, the BC volume set significantly influenced TNC, CD34+ and lymphocyte recoveries and RBC depletion (P<0.001). RBC depletion was significantly influenced by the initial volume and initial RBC content of the UCB units (P<0.001). AXP is a highly efficient method for RBC depletion, providing the same TNC recovery as the HES method with a final volume of 41 mL. AXP has the advantages of being an automatic and functionally closed system that shortens and better standardizes the procedure. Top and bottom is a closed system that allows better TNC recoveries when the BC volume is set to 41 mL.

  11. AUTOMATIC MASS SPECTROMETER

    DOEpatents

    Hanson, M.L.; Tabor, C.D. Jr.

    1961-12-01

    A mass spectrometer for analyzing the components of a gas is designed which is capable of continuous automatic operation such as analysis of samples of process gas from a continuous production system where the gas content may be changing. (AEC)

  12. Design of advanced automatic inspection system for turbine blade FPI analysis

    NASA Astrophysics Data System (ADS)

    Zheng, J.; Xie, W. F.; Viens, M.; Birglen, L.; Mantegh, I.

    2013-01-01

    Aircraft engine turbine blades are the parts most susceptible to discontinuities, as they operate under extremely high pressure and temperature. Among the various types of NDT methods, Fluorescent Penetrant Inspection (FPI) is comparatively cheap and efficient, and thus suitable for detecting turbine blade surface discontinuities. In this paper, we have developed an Advanced Automatic Inspection System (AAIS) with image processing and pattern recognition techniques to aid human inspectors. The system can automatically detect, measure and classify the discontinuities from turbine blade FPI images. Tests on sample images provided by an industrial partner have been performed to evaluate the system.

  13. Imaging and Analysis Platform for Automatic Phenotyping and Trait Ranking of Plant Root Systems

    PubMed Central

    Iyer-Pascuzzi, Anjali S.; Symonova, Olga; Mileyko, Yuriy; Hao, Yueling; Belcher, Heather; Harer, John; Weitz, Joshua S.; Benfey, Philip N.

    2010-01-01

    The ability to nondestructively image and automatically phenotype complex root systems, like those of rice (Oryza sativa), is fundamental to identifying genes underlying root system architecture (RSA). Although root systems are central to plant fitness, identifying genes responsible for RSA remains an underexplored opportunity for crop improvement. Here we describe a nondestructive imaging and analysis system for automated phenotyping and trait ranking of RSA. Using this system, we image rice roots from 12 genotypes. We automatically estimate RSA traits previously identified as important to plant function. In addition, we expand the suite of features examined for RSA to include traits that more comprehensively describe monocot RSA but that are difficult to measure with traditional methods. Using 16 automatically acquired phenotypic traits for 2,297 images from 118 individuals, we observe (1) wide variation in phenotypes among the genotypes surveyed; and (2) greater intergenotype variance of RSA features than variance within a genotype. RSA trait values are integrated into a computational pipeline that utilizes supervised learning methods to determine which traits best separate two genotypes, and then ranks the traits according to their contribution to each pairwise comparison. This trait-ranking step identifies candidate traits for subsequent quantitative trait loci analysis and demonstrates that depth and average radius are key contributors to differences in rice RSA within our set of genotypes. Our results suggest a strong genetic component underlying rice RSA. This work enables the automatic phenotyping of RSA of individuals within mapping populations, providing an integrative framework for quantitative trait loci analysis of RSA. PMID:20107024

  14. A Contrastive Grammatical Analysis of English and Vietnamese. A Contrastive Analysis of English and Vietnamese, Vol. 3. Pacific Linguistics, Series C--Books, No. 5.

    ERIC Educational Resources Information Center

    Nguyen, Dang Liem

    The contrastive analysis presented here is Volume Three in the author's series "A Contrastive Analysis of English and Vietnamese." Other volumes already published are "English Grammar, A Combined Tagmemic and Transformational Approach" [AL 002 422] and "Vietnamese Grammar, A Combined Tagmemic and Transformational Approach." The volume covering…

  15. Flying Qualities (Qualites de Vol)

    DTIC Science & Technology

    1991-02-01

    These new technologies have expanded flight envelopes, reduced drag, increased manoeuvrability, provided the framework for practical gust alleviation... criteria has in general not kept pace with these technological changes. The purpose of this Symposium was to review flying qualities issues today, and... degrees of freedom and integral coupling. These new technologies have had the effect of expanding the flight envelope, reducing drag, increasing the...

  16. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using a Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is automatically and objectively made by spectral matching comparison of the MCR decomposed Raman spectra and the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples, 10 OSCC and 10 normal tissues from the same 10 patients, 3 OSCC and 1 normal tissues from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how to define positivity) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin.

  17. A system for automatic analysis of blood pressure data for digital computer entry

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1972-01-01

    Operation of automatic blood pressure data system is described. Analog blood pressure signal is analyzed by three separate circuits, systolic, diastolic, and cycle defect. Digital computer output is displayed on teletype paper tape punch and video screen. Illustration of system is included.

  18. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    ERIC Educational Resources Information Center

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…

  20. Ability and efficiency of an automatic analysis software to measure microvascular parameters.

    PubMed

    Carsetti, Andrea; Aya, Hollmann D; Pierantozzi, Silvia; Bazurro, Simone; Donati, Abele; Rhodes, Andrew; Cecconi, Maurizio

    2016-09-01

    Analysis of the microcirculation is currently performed offline, is time consuming, and is operator dependent. The aim of this study was to assess the ability and efficiency of the automatic analysis software CytoCamTools 1.7.12 (CC) to measure microvascular parameters in comparison with the Automated Vascular Analysis (AVA) software 3.2. 22 patients admitted to the cardiothoracic intensive care unit following cardiac surgery were prospectively enrolled. Sublingual microcirculatory videos were analysed using the AVA and CC software. The total vessel density (TVD) for small vessels, perfused vessel density (PVD) and proportion of perfused vessels (PPV) were calculated. Blood flow was assessed using the microvascular flow index (MFI) for the AVA software and the averaged perfused speed indicator (APSI) for the CC software. The duration of the analysis was also recorded. Eighty-four videos from 22 patients were analysed. The bias between TVD-CC and TVD-AVA was 2.20 mm/mm(2) (95 % CI 1.37-3.03) with limits of agreement (LOA) of -4.39 (95 % CI -5.66 to -3.16) and 8.79 (95 % CI 7.50-10.01) mm/mm(2). The percentage error (PE) for TVD was ±32.2 %. TVD was positively correlated between CC and AVA (r = 0.74, p < 0.001). The bias between PVD-CC and PVD-AVA was 6.54 mm/mm(2) (95 % CI 5.60-7.48) with LOA of -4.25 (95 % CI -8.48 to -0.02) and 17.34 (95 % CI 13.11-21.57) mm/mm(2). The PE for PVD was ±61.2 %. PVD was positively correlated between CC and AVA (r = 0.66, p < 0.001). The median PPV-AVA was significantly higher than the median PPV-CC [97.39 % (95.25, 100 %) vs. 81.65 % (61.97, 88.99), p < 0.0001]. MFI categories cannot estimate or predict APSI values (p = 0.45). The time required for the analysis was shorter with CC than with the AVA system [2'42″ (2'12″, 3'31″) vs. 16'12″ (13'38″, 17'57″), p < 0.001]. TVD is comparable between the two software packages, although the analysis is faster with CC. The values for PVD and PPV are not interchangeable given the
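
    The agreement statistics reported here (bias, 95% limits of agreement, percentage error) are straightforward to reproduce; a sketch with made-up TVD values:

      import numpy as np

      def bland_altman(a, b):
          """Bias, 95% limits of agreement and percentage error
          (1.96 SD over the mean of the paired method means)."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          diff = a - b
          bias, sd = diff.mean(), diff.std(ddof=1)
          loa = (bias - 1.96 * sd, bias + 1.96 * sd)
          pct_error = 1.96 * sd / np.mean((a + b) / 2) * 100
          return bias, loa, pct_error

      tvd_cc = [18.1, 20.5, 17.3, 22.0, 19.4]    # made-up example values
      tvd_ava = [16.0, 18.9, 15.1, 19.2, 17.7]
      print(bland_altman(tvd_cc, tvd_ava))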

  1. The use of normalized cross-correlation analysis for automatic tendon excursion measurement in dynamic ultrasound imaging.

    PubMed

    Pearson, Stephen J; Ritchings, Tim; Mohamed, Ahmad S A

    2013-04-01

    The work describes an automated method of tracking dynamic ultrasound images using a normalized cross-correlation algorithm, applied to the patellar and gastrocnemius tendon. Displacement was examined during active and passive tendon excursions using B-mode ultrasonography. In the passive test where two regions of interest (2-ROI) were tracked, the automated tracking algorithm showed insignificant deviations from relative zero displacement for the knee (0.01 ± 0.04 mm) and ankle (-0.02 ± 0.04 mm) (P > .05). Similarly, when tracking 1-ROI the passive tests showed no significant differences (P > .05) between automatic and manual methods, 7.50 ± 0.60 vs 7.66 ± 0.63 mm for the patellar and 11.28 ± 1.36 vs 11.17 ± 1.35 mm for the gastrocnemius tests. The active tests gave no significant differences (P > .05) between automatic and manual methods with differences of 0.29 ± 0.04 mm for the patellar and 0.26 ± 0.01 mm for the gastrocnemius. This study showed that automatic tracking of in vivo displacement of tendon during dynamic excursion under load is possible and valid when compared with the standardized method. This approach will save time during analysis and enable discrete areas of the tendon to be examined.
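
    A normalized cross-correlation tracker of this kind can be sketched with OpenCV's template matching. This is a simplified single-ROI version with template re-sampling; the paper's algorithm and parameters may differ.

      import cv2
      import numpy as np

      def track_roi(frames, roi):
          """frames: list of grayscale images; roi: (x, y, w, h) in frame 0.
          Returns the net excursion in pixels (multiply by the probe's
          mm/pixel calibration for millimetres)."""
          x, y, w, h = roi
          template = frames[0][y:y + h, x:x + w]
          path = [(x, y)]
          for frame in frames[1:]:
              res = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
              _, _, _, loc = cv2.minMaxLoc(res)    # best NCC peak
              path.append(loc)
              # re-sample so slow appearance changes are followed
              template = frame[loc[1]:loc[1] + h, loc[0]:loc[0] + w]
          path = np.array(path, float)
          return path[-1] - path[0]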

  2. Automatic segmentation in three-dimensional analysis of fibrovascular pigmentepithelial detachment using high-definition optical coherence tomography.

    PubMed

    Ahlers, C; Simader, C; Geitzenauer, W; Stock, G; Stetson, P; Dastmalchi, S; Schmidt-Erfurth, U

    2008-02-01

    A limited number of scans compromises the ability of conventional optical coherence tomography (OCT) to track chorioretinal disease in its full extension. Failures in edge-detection algorithms falsify the results of retinal mapping even further. High-definition OCT (HD-OCT) is based on raster scanning and was used to visualise the localisation and volume of intra- and sub-pigment-epithelial (RPE) changes in fibrovascular pigment epithelial detachments (fPED). Two different scanning patterns were evaluated. 22 eyes with fPED were imaged using a frequency-domain, high-speed prototype of the Cirrus HD-OCT. The axial resolution was 6 µm, and the scanning speed was 25,000 A-scans/s. Two different scanning patterns covering an area of 6 x 6 mm in the macular retina were compared. Three-dimensional topographic reconstructions and volume calculations were performed using MATLAB-based automatic segmentation software. Detailed information about the layer-specific distribution of fluid accumulation and volumetric measurements can be obtained for retinal and sub-RPE volumes. Both raster scans show a high correlation (p<0.01; R2>0.89) of measured values, that is, PED volume/area, retinal volume and mean retinal thickness. Quality control of the automatic segmentation revealed reasonable results in over 90% of the examinations. Automatic segmentation allows for detailed quantitative and topographic analysis of the RPE and the overlying retina. In fPED, the 128 x 512 scanning pattern shows mild advantages when compared with the 256 x 256 scan. Together with the ability for automatic segmentation, HD-OCT clearly improves the clinical monitoring of chorioretinal disease by adding relevant new parameters. HD-OCT is likely capable of enhancing the understanding of pathophysiology and the benefits of treatment for current anti-CNV strategies in the future.

  3. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost effective computer aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.

  4. Studies on quantitative analysis and automatic recognition of cell types of lung cancer.

    PubMed

    Chen, Yi-Chen; Hu, Kuang-Hu; Li, Fang-Zhen; Li, Shu-Yu; Su, Wan-Fang; Huang, Zhi-Ying; Hu, Ying-Xiong

    2006-01-01

    Recognition of lung cancer cells is very important to the clinical diagnosis of lung cancer. In this paper we present a novel method to extract the structural characteristics of lung cancer cells and automatically recognize their types. First, soft mathematical morphology methods are used to enhance the grayscale image, improving the definition of the images and eliminating most of the disturbance, noise and irrelevant image information, so that the contour of the target lung cancer cell and its biological shape parameters can be extracted accurately. Then a minimum distance classifier is introduced to realize the automatic recognition of different types of lung cancer cells. A software system named "CANCER.LUNG" was built to demonstrate the efficiency of this method. Clinical experiments show that this method can accurately and objectively recognize the type of lung cancer cells, which can significantly improve pathology research on the pathological changes of lung cancer and assist clinical diagnosis.
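
    The minimum distance (nearest-centroid) classifier named here is a few lines of NumPy; the four "shape features" below are random placeholders for the extracted contour parameters.

      import numpy as np

      class MinimumDistanceClassifier:
          def fit(self, X, y):
              self.classes_ = np.unique(y)
              self.centroids_ = np.array([X[y == c].mean(axis=0)
                                          for c in self.classes_])
              return self

          def predict(self, X):
              d = np.linalg.norm(X[:, None, :] - self.centroids_[None],
                                 axis=2)
              return self.classes_[d.argmin(axis=1)]

      X = np.random.rand(60, 4)        # 60 cells x 4 shape features
      y = np.repeat([0, 1, 2], 20)     # three cell types
      print(MinimumDistanceClassifier().fit(X, y).predict(X[:5]))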

  5. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    PubMed

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals.
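
    If the sympathovagal balance index is taken to be the standard LF/HF power ratio of heart rate variability (an assumption; the paper may define it differently), it can be computed from RR intervals as follows:

      import numpy as np
      from scipy.interpolate import interp1d
      from scipy.signal import welch

      def sympathovagal_index(rr_ms, fs=4.0):
          """LF (0.04-0.15 Hz) over HF (0.15-0.40 Hz) power from RR
          intervals in ms, resampled evenly before spectral estimation."""
          t = np.cumsum(rr_ms) / 1000.0                  # beat times (s)
          grid = np.arange(t[0], t[-1], 1.0 / fs)
          rr = interp1d(t, rr_ms, kind='cubic')(grid)
          f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=256)
          lf = np.trapz(pxx[(f >= 0.04) & (f < 0.15)],
                        f[(f >= 0.04) & (f < 0.15)])
          hf = np.trapz(pxx[(f >= 0.15) & (f < 0.40)],
                        f[(f >= 0.15) & (f < 0.40)])
          return lf / hf

      rr = 800 + 50 * np.sin(np.arange(300) * 0.3) \
           + 20 * np.random.randn(300)
      print(sympathovagal_index(rr))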

  6. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.
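
    GMM-based detection of this kind reduces to one mixture per class and a log-likelihood ratio. A sketch with synthetic stand-ins for the per-frame spectral features the study models:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      X_apnoea = rng.normal(0.5, 1.0, size=(500, 13))    # placeholder MFCCs
      X_control = rng.normal(-0.5, 1.0, size=(500, 13))

      gmm_a = GaussianMixture(n_components=16, covariance_type='diag',
                              random_state=0).fit(X_apnoea)
      gmm_c = GaussianMixture(n_components=16, covariance_type='diag',
                              random_state=0).fit(X_control)

      def classify(frames):
          """Average log-likelihood ratio over one speaker's frames."""
          llr = gmm_a.score(frames) - gmm_c.score(frames)
          return 'apnoea' if llr > 0 else 'control'

      print(classify(rng.normal(0.5, 1.0, size=(100, 13))))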

  7. [Automated analysis of bacterial preparations manufactured on automatic heat fixation and staining equipment].

    PubMed

    2012-01-01

    Heat fixation of preparations was performed in the fixation bath designed by EMKO (Russia). The programmable "Emkosteiner" (EMKO, Russia) was used for trial staining. The reagent set Micko-GRAM-NITsF was used for Gram staining. It was demonstrated that automatic smear fixation equipment and programmable staining ensure high-quality imaging (1% chromaticity variation), good enough for standardization of the Gram staining of microbial preparations.

  8. Analysis of Automatic Automotive Gear Boxes by Means of Versatile Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Drewniak, J.; Kopeć, J.; Zawiślak, S.

    Automotive gear boxes are special mechanisms built around planetary gear trains and additionally equipped with control systems. The control system allows for the activation of particular drives. In the present paper, some graph-based models of these boxes are considered, i.e., contour, bond and mixed graphs. An exemplary automatic gear box is analyzed. Based upon the introduced models, the ratios for some drives have been calculated. The advantages of the proposed method of modeling are its algorithmic approach and its simplicity.

  9. An automatic variational level set segmentation framework for computer aided dental X-rays analysis in clinical environments.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2006-03-01

    An automatic variational level set segmentation framework for Computer Aided Dental X-rays Analysis (CADXA) in clinical environments is proposed. Designed for clinical environments, the segmentation contains two stages: a training stage and a segmentation stage. During the training stage, first, manually chosen representative images are segmented using hierarchical level set region detection. Then the window based feature extraction followed by principal component analysis (PCA) is applied and results are used to train a support vector machine (SVM) classifier. During the segmentation stage, dental X-rays are classified first by the trained SVM. The classifier provides initial contours which are close to correct boundaries for three coupled level sets driven by a proposed pathologically variational modeling which greatly accelerates the level set segmentation. Based on the segmentation results and uncertainty maps that are built based on a proposed uncertainty measurement, a computer aided analysis scheme is applied. The experimental results show that the proposed method is able to provide an automatic pathological segmentation which naturally segments those problem areas. Based on the segmentation results, the analysis scheme is able to provide indications of possible problem areas of bone loss and decay to the dentists. As well, the experimental results show that the proposed segmentation framework is able to speed up the level set segmentation in clinical environments.
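
    The training stage (window features, then PCA, then SVM) maps naturally onto a scikit-learn pipeline. The synthetic patches below stand in for the window-based features extracted from the manually segmented training images.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      windows = rng.random((300, 16 * 16))   # flattened training patches
      labels = rng.integers(0, 2, 300)       # region class per patch

      clf = make_pipeline(PCA(n_components=20), SVC(kernel='rbf'))
      clf.fit(windows, labels)
      # at segmentation time, predictions seed the coupled level sets
      print(clf.predict(windows[:5]))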

  10. Accuracy of Coronary Plaque Detection and Assessment of Interobserver Agreement for Plaque Quantification Using Automatic Coronary Plaque Analysis Software on Coronary CT Angiography.

    PubMed

    Laqmani, A; Klink, T; Quitzke, M; Creder, D D; Adam, G; Lund, G

    2016-10-01

    To evaluate the accuracy of automatic plaque detection and the interobserver agreement of automatic versus manually adjusted quantification of coronary plaques on coronary CT angiography (cCTA) using commercially available software. 10 cCTA datasets were evaluated using plaque software. First, the automatically detected plaques were verified. Second, two observers independently performed plaque quantification without revising the automatically constructed plaque contours (automatic approach). Then, each observer adjusted the plaque contours according to the plaque delineation (adjusted approach). The interobserver agreement of both approaches was analyzed. 32 of 114 automatically identified findings (28 %) were true-positive plaques, while 82 (72 %) were false-positive. 20 of 52 plaques (38 %) were missed by the software (false-negative). The automatic approach provided good interobserver agreement, with relative differences of 0.9 ± 16.0 % for plaque area and -3.3 ± 33.8 % for plaque volume. Both observers independently adjusted all contours because they did not represent the plaque delineation. Interobserver agreement decreased for the adjusted approach, with relative differences of 25.0 ± 24.8 % for plaque area and 20.0 ± 40.4 % for plaque volume. The automatic plaque analysis software is of limited value due to the high numbers of false-positive and false-negative plaque findings. The automatic approach was reproducible, but it necessitated adjustment of all constructed plaque contours, resulting in deterioration of the interobserver agreement. • Automatic plaque detection is limited due to high false-positive and false-negative findings. • Automatic plaque quantification was reproducible in the few accurately detected plaques. • The automatically constructed contours did not represent the plaque delineation. • Both observers independently adjusted the plaque contours. • Manual adjustment of plaque contours reduced the interobserver agreement.

  11. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    NASA Astrophysics Data System (ADS)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    We present results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, and compare these evolutionary patterns with each other and with the HiRISE and CRISM surface reflectance evolutions for the same locations. References: Aye, K.-M. et al. (2010), LPSC 2010, 2707; Hansen, C. et al. (2010), Icarus, 205(1), 283-295; Kieffer, H.H. (2007), JGR, 112; Portyankina, G. et al. (2010), Icarus, 205(1), 311-320; Thomas, N. et al. (2009), EPSC Abstracts, Vol. 4, EPSC2009-478.

  12. Rosette Tracker: An Open Source Image Analysis Tool for Automatic Quantification of Genotype Effects

    PubMed Central

    De Vylder, Jonas; Vandenbussche, Filip; Hu, Yuming; Philips, Wilfried; Van Der Straeten, Dominique

    2012-01-01

    Image analysis of Arabidopsis (Arabidopsis thaliana) rosettes is an important nondestructive method for studying plant growth. Some work on automatic rosette measurement using image analysis has been proposed in the past but is generally restricted to use in combination with specific high-throughput monitoring systems. We introduce Rosette Tracker, a new open source image analysis tool for evaluation of plant-shoot phenotypes. This tool is not constrained by one specific monitoring system, can be adapted to different low-budget imaging setups, and requires minimal user input. In contrast with previously described monitoring tools, Rosette Tracker allows us to simultaneously quantify plant growth, photosynthesis, and leaf temperature-related parameters through the analysis of visual, chlorophyll fluorescence, and/or thermal infrared time-lapse sequences. Freely available, Rosette Tracker facilitates the rapid understanding of Arabidopsis genotype effects. PMID:22942389

  13. An approach to automatic blood vessel image registration of microcirculation for blood flow analysis on nude mice.

    PubMed

    Lin, Wen-Chen; Wu, Chih-Chieh; Zhang, Geoffrey; Wu, Tung-Hsin; Lin, Yang-Hsien; Huang, Tzung-Chi; Liu, Ren-Shyan; Lin, Kang-Ping

    2011-04-01

    Image registration is often a required and time-consuming step in the blood flow analysis of large microscopic video sequences in vivo. In order to obtain stable images for blood flow analysis, frame-to-frame image matching as a preprocessing step is a solution to the problem of movement during image acquisition. In this paper, microscopic system analysis without fluorescent labelling is performed to provide precise and continuous quantitative data on the blood flow rate in individual microvessels of nude mice. The performance properties of several matching metrics are evaluated through simulated image registrations. An automatic image registration programme based on Powell's optimisation search method with low calculation redundancy was implemented. The variance-of-ratio matching metric is computationally efficient and improves registration robustness and accuracy in the practical application of microcirculation registration. The presented registration method shows acceptable results when used to analyse red blood cell velocities, confirming the scientific potential of the system for blood flow analysis.
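
    A minimal sketch of the approach named above, matching frames by the variance-of-ratio metric and searching a 2D translation with Powell's method; the interpolation settings and synthetic frames are illustrative assumptions.

```python
# Frame matching by variance of ratio, optimised with Powell's method.
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import minimize

def variance_of_ratio(fixed, moving, offset):
    """Variance of the pixelwise ratio between fixed and shifted moving."""
    moved = nd_shift(moving, offset, order=1, mode='nearest')
    ratio = fixed / (moved + 1e-6)
    return float(np.var(ratio))

def register(fixed, moving):
    """Find the translation minimising the variance-of-ratio metric."""
    res = minimize(lambda p: variance_of_ratio(fixed, moving, p),
                   x0=np.zeros(2), method='Powell')
    return res.x

rng = np.random.default_rng(2)
frame = rng.random((64, 64))
moved = nd_shift(frame, (2.0, -3.0), order=1, mode='nearest')
print(register(frame, moved))   # should be close to (-2, 3)
```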

  14. Automatic Detection of Laryngeal Pathology on Sustained Vowels Using Short-Term Cepstral Parameters: Analysis of Performance and Theoretical Justification

    NASA Astrophysics Data System (ADS)

    Fraile, Rubén; Godino-Llorente, Juan Ignacio; Sáenz-Lechón, Nicolás; Osma-Ruiz, Víctor; Gómez-Vilda, Pedro

    The majority of speech signal analysis procedures for the automatic detection of laryngeal pathologies rely mainly on parameters extracted from time-domain processing. Moreover, calculation of these parameters often requires prior pitch period estimation; therefore, their validity heavily depends on the robustness of pitch detection. Within this paper, an alternative approach based on cepstral-domain processing is presented which has the advantage of not requiring pitch estimation, thus providing a gain in both simplicity and robustness. While the proposed scheme is similar to solutions based on Mel-frequency cepstral parameters, already present in the literature, it has an easier physical interpretation while achieving similar performance standards.
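
    The key point, that cepstral parameters need no prior pitch estimate, can be illustrated with a short sketch computing the real cepstrum of a speech frame directly from its spectrum; the frame length and coefficient count are illustrative assumptions.

```python
# Pitch-independent cepstral features: the real cepstrum is obtained
# directly from the log spectrum, so no pitch detection is required.
import numpy as np

def real_cepstrum(frame, n_coeffs=13):
    """First cepstral coefficients of a windowed speech frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hamming(len(frame))))
    cepstrum = np.fft.irfft(np.log(spectrum + 1e-10))
    return cepstrum[:n_coeffs]

fs = 16000
t = np.arange(int(0.03 * fs)) / fs           # one 30 ms frame
frame = np.sin(2 * np.pi * 150 * t)          # synthetic voiced signal
print(real_cepstrum(frame))
```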

  15. Automatic transmission

    SciTech Connect

    Miura, M.; Aoki, H.

    1988-02-02

    An automatic transmission is described comprising: an automatic transmission mechanism portion comprising a single planetary gear unit and a dual planetary gear unit; carriers of both of the planetary gear units that are integral with one another; an input means for inputting torque to the automatic transmission mechanism; clutches for operatively connecting predetermined ones of the planetary gear elements of both of the planetary gear units to the input means; and braking means for restricting the rotation of predetermined ones of the planetary gear elements of both of the planetary gear units. The clutches are disposed adjacent one another at an end portion of the transmission for defining a clutch portion of the transmission; a first clutch portion which is attachable to the automatic transmission mechanism portion for comprising the clutch portion when attached thereto; a second clutch portion that is attachable to the automatic transmission mechanism portion in place of the first clutch portion for comprising the clutch portion when so attached. The first clutch portion comprises a first clutch for operatively connecting the input means to a ring gear of the single planetary gear unit and a second clutch for operatively connecting the input means to a sun gear of the automatic transmission mechanism portion. The second clutch portion comprises the first clutch, the second clutch, and a third clutch for operatively connecting the input member to a ring gear of the dual planetary gear unit.

  16. Texture analysis of automatic graph cuts segmentations for detection of lung cancer recurrence after stereotactic radiotherapy

    NASA Astrophysics Data System (ADS)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2015-03-01

    Stereotactic ablative radiotherapy (SABR) is a treatment for early-stage lung cancer with local control rates comparable to surgery. After SABR, benign radiation induced lung injury (RILI) results in tumour-mimicking changes on computed tomography (CT) imaging. Distinguishing recurrence from RILI is a critical clinical decision determining the need for potentially life-saving salvage therapies whose high risks in this population dictate their use only for true recurrences. Current approaches do not reliably detect recurrence within a year post-SABR. We measured the detection accuracy of texture features within automatically determined regions of interest, with the only operator input being the single line segment measuring tumour diameter, normally taken during the clinical workflow. Our leave-one-out cross validation on images taken 2-5 months post-SABR showed robustness of the entropy measure, with classification error of 26% and area under the receiver operating characteristic curve (AUC) of 0.77 using automatic segmentation; the results using manual segmentation were 24% and 0.75, respectively. AUCs for this feature increased to 0.82 and 0.93 at 8-14 months and 14-20 months post SABR, respectively, suggesting even better performance nearer to the date of clinical diagnosis of recurrence; thus this system could also be used to support and reinforce the physician's decision at that time. Based on our ongoing validation of this automatic approach on a larger sample, we aim to develop a computer-aided diagnosis system which will support the physician's decision to apply timely salvage therapies and prevent patients with RILI from undergoing invasive and risky procedures.
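
    The entropy feature that proved robust here is simple to compute from the grey-level histogram of a region of interest; a minimal sketch follows, in which the bin count and the synthetic ROI are illustrative assumptions.

```python
# First-order entropy of the grey-level distribution within an ROI.
import numpy as np

def roi_entropy(roi, bins=64):
    """Shannon entropy (bits) of the ROI's grey-level histogram."""
    hist, _ = np.histogram(roi, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(3)
print(roi_entropy(rng.normal(size=(32, 32))))
```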

  17. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    PubMed

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase in computational costs. To cope with these requirements, different methods for numerical screw modelling have been investigated to improve their application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were generated parametrically by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. The accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent bone stock.

  18. Finite Element Analysis of Osteosynthesis Screw Fixation in the Bone Stock: An Appropriate Method for Automatic Screw Modelling

    PubMed Central

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase in computational costs. To cope with these requirements, different methods for numerical screw modelling have been investigated to improve their application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were generated parametrically by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. The accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent bone stock.

  19. A fast automatic plate changer for the analysis of nuclear emulsions

    NASA Astrophysics Data System (ADS)

    Balestra, S.; Bertolin, A.; Bozza, C.; Calligola, P.; Cerroni, R.; D'Ambrosio, N.; Degli Esposti, L.; De Lellis, G.; De Serio, M.; Di Capua, F.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Dusini, S.; Esposito, L. S.; Fini, R. A.; Giacomelli, G.; Giacomelli, R.; Grella, G.; Ieva, M.; Kose, U.; Longhin, A.; Mandrioli, G.; Mauri, N.; Medinaceli, E.; Monacelli, P.; Muciaccia, M. T.; Pasqualini, L.; Pastore, A.; Patrizii, L.; Pozzato, M.; Pupilli, F.; Rescigno, R.; Rosa, G.; Ruggieri, A.; Russo, A.; Sahnoun, Z.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Stellacci, S. M.; Strolin, P.; Tenti, M.; Tioukov, V.; Togo, V.; Valieri, C.

    2013-07-01

    This paper describes the design and performance of a computer controlled emulsion Plate Changer for the automatic placement and removal of nuclear emulsion films for the European Scanning System microscopes. The Plate Changer is used for mass scanning and measurement of the emulsions of the OPERA neutrino oscillation experiment at the Gran Sasso lab on the CNGS neutrino beam. Unlike other systems it works with both dry and oil objectives. The film changing takes less than 20 s and the accuracy on the positioning of the emulsion films is about 10 μm. The final accuracy in retrieving track coordinates after fiducial marks measurement is better than 1 μm.

  20. Fully Automatic Cross-Associations

    DTIC Science & Technology

    2004-08-01


  1. Comparative analysis of image classification methods for automatic diagnosis of ophthalmic images

    NASA Astrophysics Data System (ADS)

    Wang, Liming; Zhang, Kai; Liu, Xiyang; Long, Erping; Jiang, Jiewei; An, Yingying; Zhang, Jia; Liu, Zhenzhen; Lin, Zhuoling; Li, Xiaoyan; Chen, Jingjing; Cao, Qianzhong; Li, Jing; Wu, Xiaohang; Wang, Dongni; Li, Wangting; Lin, Haotian

    2017-01-01

    There are many image classification methods, but it remains unclear which methods are most helpful for analyzing and intelligently identifying ophthalmic images. We select representative slit-lamp images, which show the complexity of ocular images, as research material to compare image classification algorithms for diagnosing ophthalmic diseases. To facilitate this study, several feature extraction algorithms and classifiers are combined to automatically diagnose pediatric cataract on the same dataset, and their performance is then compared using multiple criteria. This comparative study reveals the general characteristics of the existing methods for automatic identification of ophthalmic images and provides new insights into the strengths and shortcomings of these methods. The most relevant methods (local binary patterns + SVMs, wavelet transformation + SVMs) achieve an average accuracy of 87% and can be adopted in specific situations to aid doctors in preliminary disease screening. Furthermore, some methods requiring fewer computational resources and less time could be applied in remote places or on mobile devices to assist individuals in understanding the condition of their body. In addition, this work should help to accelerate the development of innovative approaches and the application of these methods to assist doctors in diagnosing ophthalmic disease.
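
    A minimal sketch of one of the best-performing combinations reported above, local binary patterns with an SVM; the LBP radius, histogram layout, and toy labels are illustrative assumptions, not the study's configuration.

```python
# Local binary pattern histograms as texture features for an SVM.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(image, P=8, R=1):
    """Uniform-LBP histogram as a fixed-length texture descriptor."""
    lbp = local_binary_pattern(image, P, R, method='uniform')
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(4)
images = [(rng.random((64, 64)) * 255).astype(np.uint8) for _ in range(40)]
X = np.array([lbp_histogram(im) for im in images])
y = rng.integers(0, 2, 40)               # e.g. cataract vs. normal
clf = SVC(kernel='rbf').fit(X, y)
print(clf.predict(X[:3]))
```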

  2. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI.

    PubMed

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z; Stone, Maureen; Prince, Jerry L

    2014-12-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average dice similarity coefficient (DSC) score of 0.92 with less segmented volume variability between time frames than in manual segmentations.
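
    The final step, random walker segmentation from propagated seeds, is available off the shelf; the following minimal sketch runs it on a synthetic 2D slice with scikit-image, where the seed placement and parameters are illustrative assumptions.

```python
# Random walker segmentation from sparse seed labels (scikit-image).
import numpy as np
from skimage.segmentation import random_walker

rng = np.random.default_rng(5)
image = rng.normal(0, 0.1, (64, 64))
image[16:48, 16:48] += 1.0                # bright "tongue" region

labels = np.zeros_like(image, dtype=int)  # 0 = unlabelled
labels[32, 32] = 1                        # seed inside the structure
labels[4, 4] = 2                          # seed in the background

segmentation = random_walker(image, labels, beta=130, mode='bf')
print((segmentation == 1).sum(), 'pixels assigned to the structure')
```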

  3. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI

    PubMed Central

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z.; Stone, Maureen; Prince, Jerry L.

    2014-01-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average dice similarity coefficient (DSC) score of 0.92 with less segmented volume variability between time frames than in manual segmentations. PMID:25155697

  4. A clinically viable capsule endoscopy video analysis platform for automatic bleeding detection

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Xie, Jean; Mui, Peter; Leighton, Jonathan A.; Pasha, Shabana; Rentz, Lauri; Abedi, Mahmood

    2013-02-01

    In this paper, we present a novel and clinically valuable software platform for automatic bleeding detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos of the GI tract run about 8 hours and are manually reviewed by physicians to locate diseases such as bleedings and polyps. As a result, the process is time consuming and prone to missed findings. Although researchers have made efforts to automate this process, no clinically acceptable software is available on the market today. Working with our collaborators, we have developed a clinically viable software platform called GISentinel for fully automated GI tract bleeding detection and classification. Major functional modules of the software include: the innovative graph-based NCut segmentation algorithm; the unique feature selection and validation method (e.g. illumination-invariant features, color-independent features, and symmetrical texture features); and the cascade SVM classification for handling various GI tract scenes (e.g. normal tissue, food particles, bubbles, fluid, and specular reflection). Initial evaluation of the software has shown a zero bleeding-instance miss rate and a 4.03% false alarm rate. This work is part of our innovative 2D/3D-based GI tract disease detection software platform. While the overall software framework is designed for intelligent detection and classification of major GI tract diseases such as bleeding, ulcer, and polyp from CE videos, this paper focuses on the automatic bleeding detection module.

  5. Comparative analysis of image classification methods for automatic diagnosis of ophthalmic images

    PubMed Central

    Wang, Liming; Zhang, Kai; Liu, Xiyang; Long, Erping; Jiang, Jiewei; An, Yingying; Zhang, Jia; Liu, Zhenzhen; Lin, Zhuoling; Li, Xiaoyan; Chen, Jingjing; Cao, Qianzhong; Li, Jing; Wu, Xiaohang; Wang, Dongni; Li, Wangting; Lin, Haotian

    2017-01-01

    There are many image classification methods, but it remains unclear which methods are most helpful for analyzing and intelligently identifying ophthalmic images. We select representative slit-lamp images, which show the complexity of ocular images, as research material to compare image classification algorithms for diagnosing ophthalmic diseases. To facilitate this study, several feature extraction algorithms and classifiers are combined to automatically diagnose pediatric cataract on the same dataset, and their performance is then compared using multiple criteria. This comparative study reveals the general characteristics of the existing methods for automatic identification of ophthalmic images and provides new insights into the strengths and shortcomings of these methods. The most relevant methods (local binary patterns + SVMs, wavelet transformation + SVMs) achieve an average accuracy of 87% and can be adopted in specific situations to aid doctors in preliminary disease screening. Furthermore, some methods requiring fewer computational resources and less time could be applied in remote places or on mobile devices to assist individuals in understanding the condition of their body. In addition, this work should help to accelerate the development of innovative approaches and the application of these methods to assist doctors in diagnosing ophthalmic disease. PMID:28139688

  6. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    NASA Astrophysics Data System (ADS)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired at 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Evaluation of the presented method reveals promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in the mouse primary motor cortex, and find significant variations in cellular density distribution across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
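
    Steps ii) and iii), binarization followed by centroid extraction, can be sketched with standard tools; the global threshold and the synthetic volume below are illustrative assumptions, not the pipeline's actual operators.

```python
# Binarize a 3D volume, label connected components, extract centroids.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(6)
volume = rng.normal(0, 0.1, (32, 64, 64))
volume[10:14, 20:24, 20:24] += 1.0         # two synthetic "cells"
volume[20:24, 40:44, 40:44] += 1.0

binary = volume > 0.5                      # ii) image binarization
labelled, n_cells = ndimage.label(binary)  # iii) connected components
centroids = ndimage.center_of_mass(binary, labelled, range(1, n_cells + 1))
print(n_cells, centroids)
```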

  7. Comparative reliability analysis of publicly available software packages for automatic intracranial volume estimation.

    PubMed

    Sargolzaei, S; Goryawala, M; Cabrerizo, M; Chen, G; Jayakar, P; Duara, R; Barker, W; Adjouadi, M

    2014-01-01

    Intracranial volume (ICV) is an important measure in brain research, often used as a correction factor in inter-subject studies. The current study investigates how the choice of software for automatic estimation affects the resulting ICV measure. Five groups comprising 70 subjects in total are considered, including adult controls (AC) (n=11), adults with dementia (AD) (n=11), pediatric controls (PC) (n=18) and two groups of pediatric epilepsy subjects (PE1.5 and PE3) (n=30) scanned on 1.5 T and 3 T scanners, respectively. Reference measurements were calculated for each subject by manually tracing the intracranial cavity without sub-sampling. Four publicly available software packages (AFNI, Freesurfer, FSL, and SPM) were examined in their ability to automatically estimate ICV across the five groups. Linear regression analyses suggest that the reference measurement discrepancy could be explained best by SPM [R² = 0.67; p < 0.01] for the AC group, Freesurfer [R² = 0.46; p = 0.02] for the AD group, AFNI [R² = 0.97; p < 0.01] for the PC group, and FSL [R² = 0.6; p = 0.1] for the PE1.5 and [R² = 0.6; p < 0.01] for the PE3 groups. The study demonstrates that the choice of automated software for ICV estimation depends on the population under consideration and on whether the software used is atlas-based or not.

  8. Automatic Identification of Motion Artifacts in EHG Recording for Robust Analysis of Uterine Contractions

    PubMed Central

    Ye-Lin, Yiyao; Alberola-Rubio, José; Perales, Alfredo

    2014-01-01

    Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system for segmenting EHG recordings that distinguishes between uterine contractions and artifacts. First, the segmentation is performed using an algorithm that generates a TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. These segments are then classified into two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of features that led to the highest accuracy in detecting artifacts was then determined. The results showed that it is possible to detect motion artifacts automatically in segmented EHG recordings with a precision of 92.2% using only seven features. Together, the proposed algorithm and classifier compose a useful tool for analyzing EHG signals and should help to promote clinical applications of this technique. PMID:24523828

  9. Automatic Screening and Grading of Age-Related Macular Degeneration from Texture Analysis of Fundus Images

    PubMed Central

    Phan, Thanh Vân; Seoud, Lama; Chakor, Hadi; Cheriet, Farida

    2016-01-01

    Age-related macular degeneration (AMD) is a disease which causes visual deficiency and irreversible blindness to the elderly. In this paper, an automatic classification method for AMD is proposed to perform robust and reproducible assessments in a telemedicine context. First, a study was carried out to highlight the most relevant features for AMD characterization based on texture, color, and visual context in fundus images. A support vector machine and a random forest were used to classify images according to the different AMD stages following the AREDS protocol and to evaluate the features' relevance. Experiments were conducted on a database of 279 fundus images coming from a telemedicine platform. The results demonstrate that local binary patterns in multiresolution are the most relevant for AMD classification, regardless of the classifier used. Depending on the classification task, our method achieves promising performances with areas under the ROC curve between 0.739 and 0.874 for screening and between 0.469 and 0.685 for grading. Moreover, the proposed automatic AMD classification system is robust with respect to image quality. PMID:27190636

  10. Automatic Tracking and Motility Analysis of Human Sperm in Time-Lapse Images.

    PubMed

    Urbano, Leonardo F; Masson, Puneet; VerMilyea, Matthew; Kam, Moshe

    2017-03-01

    We present a fully automated multi-sperm tracking algorithm. It has the demonstrated capability to detect and track simultaneously hundreds of sperm cells in recorded videos while accurately measuring motility parameters over time and with minimal operator intervention. Algorithms of this kind may help in associating dynamic swimming parameters of human sperm cells with fertility and fertilization rates. Specifically, we offer an image processing method, based on radar tracking algorithms, that automatically detects and tracks the swimming paths of human sperm cells in time-lapse microscopy image sequences of the kind analyzed by fertility clinics. Adapting the well-known joint probabilistic data association filter (JPDAF), we automatically tracked hundreds of human sperm simultaneously and measured their dynamic swimming parameters over time. Unlike existing CASA instruments, our algorithm has the capability to track sperm swimming in close proximity to each other and during apparent cell-to-cell collisions. Continuously collecting parameters for each tracked sperm without sample dilution (currently impossible using standard CASA systems) provides an opportunity to compare such data with standard fertility rates. The use of our algorithm thus has the potential to free the clinician from having to rely on elaborate motility measurements obtained manually by technicians, to speed up semen processing, and to provide medical practitioners and researchers with more useful data than are currently available.
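
    The JPDAF itself is beyond a short example, but the core data-association idea can be illustrated with a much simpler stand-in: optimal nearest-neighbour assignment of detections between consecutive frames with a distance gate. The gate radius and toy detections are illustrative assumptions.

```python
# Simplified frame-to-frame data association (a stand-in for JPDAF):
# optimal assignment of detections with a gate on implausible jumps.
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_frames(prev_pts, next_pts, max_dist=20.0):
    """Return (prev_idx, next_idx) pairs of matched detections."""
    cost = np.linalg.norm(prev_pts[:, None, :] - next_pts[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    keep = cost[rows, cols] <= max_dist      # gate out implausible links
    return list(zip(rows[keep], cols[keep]))

prev_pts = np.array([[10.0, 10.0], [50.0, 40.0]])
next_pts = np.array([[52.0, 41.0], [12.0, 11.0]])
print(link_frames(prev_pts, next_pts))       # [(0, 1), (1, 0)]
```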

  11. On the automaticity and flexibility of covert attention: a speed-accuracy trade-off analysis.

    PubMed

    Giordano, Anna Marie; McElree, Brian; Carrasco, Marisa

    2009-03-31

    Exogenous covert attention improves discriminability and accelerates the rate of visual information processing (M. Carrasco & B. McElree, 2001). Here we investigated and compared the effects of both endogenous (sustained) and exogenous (transient) covert attention. Specifically, we directed attention via spatial cues and evaluated the automaticity and flexibility of exogenous and endogenous attention by manipulating cue validity in conjunction with a response-signal speed-accuracy trade-off (SAT) procedure, which provides conjoint measures of discriminability and information accrual. To investigate whether discriminability and rate of information processing differ as a function of cue validity (chance to 100%), we compared how both types of attention affect performance while keeping experimental conditions constant. With endogenous attention, both the observed benefits (valid-cue) and the costs (invalid-cue) increased with cue validity. However, with exogenous attention, the benefits and costs in both discriminability and processing speed were similar across cue validity conditions. These results provide compelling time-course evidence that whereas endogenous attention can be flexibly allocated according to cue validity, exogenous attention is automatic and unaffected by cue validity.

  12. Automatic three-dimensional quantitative analysis for evaluation of facial movement.

    PubMed

    Hontanilla, B; Aubá, C

    2008-01-01

    The aim of this study is to present a new 3D capture system for facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflecting dots on the subject's face and video recording, with three infrared-light cameras, the subject performing several facial movements such as smiling, mouth puckering, eye closure and forehead elevation. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study has been performed on 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities have been evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that this system is accurate to within 0.13 mm and 0.41 degrees. In conclusion, the accuracy of the FACIAL CLIMA system for the evaluation of facial movements is demonstrated, as well as its high intrarater and interrater reliability. It has advantages over other systems that have been developed for the evaluation of facial movements, such as a short calibration time, a short measuring time and ease of use, and it provides not only distances but also velocities and areas. The FACIAL CLIMA system can therefore be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery, allowing patients with facial paralysis to be compared between surgical centres so that the effectiveness of facial reanimation operations can be evaluated.

  13. Automatic sampling and analysis of organics and biomolecules by capillary action-supported contactless atmospheric pressure ionization mass spectrometry.

    PubMed

    Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie

    2013-01-01

    The contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and the spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet on the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, the sample spray readily forms in the proximity of the mass spectrometer inlet, to which a high electric field is applied. The gas-phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, wells containing rinsing solvent are arranged alternately between the sample wells, so the C-API capillary can be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for C-API MS analysis is minimal, with less than 1 nL of sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated.

  14. Automatic Sampling and Analysis of Organics and Biomolecules by Capillary Action-Supported Contactless Atmospheric Pressure Ionization Mass Spectrometry

    PubMed Central

    Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie

    2013-01-01

    The contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and the spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet on the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, the sample spray readily forms in the proximity of the mass spectrometer inlet, to which a high electric field is applied. The gas-phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, wells containing rinsing solvent are arranged alternately between the sample wells, so the C-API capillary can be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for C-API MS analysis is minimal, with less than 1 nL of sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated. PMID:23762484

  15. Mass-spectra-based peak alignment for automatic nontargeted metabolic profiling analysis for biomarker screening in plant samples.

    PubMed

    Fu, Hai-Yan; Hu, Ou; Zhang, Yue-Ming; Zhang, Li; Song, Jing-Jing; Lu, Peang; Zheng, Qing-Xia; Liu, Ping-Ping; Chen, Qian-Si; Wang, Bing; Wang, Xiao-Yu; Han, Lu; Yu, Yong-Jie

    2017-09-01

    Nontargeted metabolic profiling analysis is a difficult task in routine investigations because hundreds of chromatographic peaks elute within a short time, and the time-shift problem across samples is severe. To address these problems, the present work developed an automatic nontargeted metabolic profiling analysis (anTMPA) method. First, peaks in the total ion chromatogram were extracted using a modified multiscale Gaussian smoothing method. Then, a novel peak alignment strategy based on the mass spectra and retention times of the peaks was employed, in which the maximum mass-spectral correlation coefficient path was extracted using a modified dynamic programming method. Moreover, an automatic landmark-peak-searching strategy was employed for self-adapting time-shift correction. Missing peaks across samples were grouped and registered into the aligned peak list table for final refinement. Finally, the aligned peaks across samples were analyzed using statistical methods to identify potential biomarkers. Mass spectral information on the screened biomarkers can be directly imported into the National Institute of Standards and Technology library to select candidate compounds. The performance of the anTMPA method was evaluated using a complicated plant gas chromatography-mass spectrometry dataset, with the aim of identifying biomarkers between the growth and maturation stages of the tested plant.
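
    The alignment idea, a maximum mass-spectral-correlation path extracted by dynamic programming, can be sketched as follows; the exact scoring and the toy spectra are illustrative assumptions, not the published algorithm.

```python
# Align peaks of two runs by their mass-spectral correlations using a
# monotone dynamic-programming path (Needleman-Wunsch-style, gap = 0).
import numpy as np

def correlation_matrix(spectra_a, spectra_b):
    """Pearson correlation between every pair of peak mass spectra."""
    a = (spectra_a - spectra_a.mean(1, keepdims=True)) / spectra_a.std(1, keepdims=True)
    b = (spectra_b - spectra_b.mean(1, keepdims=True)) / spectra_b.std(1, keepdims=True)
    return a @ b.T / spectra_a.shape[1]

def best_path(sim):
    """Monotone peak matching maximising total spectral correlation."""
    n, m = sim.shape
    score = np.zeros((n + 1, m + 1))
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            score[i, j] = max(score[i - 1, j - 1] + sim[i - 1, j - 1],
                              score[i - 1, j], score[i, j - 1])
    pairs, i, j = [], n, m                 # backtrace the matched pairs
    while i > 0 and j > 0:
        if score[i, j] == score[i - 1, j - 1] + sim[i - 1, j - 1]:
            pairs.append((i - 1, j - 1)); i, j = i - 1, j - 1
        elif score[i, j] == score[i - 1, j]:
            i -= 1
        else:
            j -= 1
    return pairs[::-1]

rng = np.random.default_rng(7)
run_a = rng.random((5, 50))                # 5 peaks x 50 m/z bins
run_b = run_a + rng.normal(0, 0.05, (5, 50))
print(best_path(correlation_matrix(run_a, run_b)))
```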

  16. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques.

    PubMed

    Akyol, Kemal; Şen, Baha; Bayır, Şafak

    2016-01-01

    With the advances in the computer field, methods and techniques for automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, which is the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes, in machine learning terms, information related to the optic disc and information related to hard exudates may look the same. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC.
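
    A minimal sketch of the keypoint-plus-visual-dictionary idea, using ORB keypoints as a stand-in for the paper's unspecified detector and k-means to build the dictionary; the cluster count and synthetic images are illustrative assumptions.

```python
# Bag-of-visual-words: ORB keypoint descriptors clustered with k-means.
import cv2
import numpy as np
from sklearn.cluster import KMeans

orb = cv2.ORB_create(nfeatures=200)

def descriptors(image):
    """ORB descriptors of one 8-bit grayscale image."""
    _, des = orb.detectAndCompute(image, None)
    return des if des is not None else np.empty((0, 32), np.uint8)

rng = np.random.default_rng(8)
images = [(rng.random((128, 128)) * 255).astype(np.uint8) for _ in range(5)]
all_des = np.vstack([descriptors(im) for im in images]).astype(np.float32)
dictionary = KMeans(n_clusters=16, n_init=10).fit(all_des)

def bow_histogram(image):
    """Bag-of-visual-words histogram used as the image feature vector."""
    words = dictionary.predict(descriptors(image).astype(np.float32))
    return np.bincount(words, minlength=16) / max(len(words), 1)

print(bow_histogram(images[0]))
```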

  17. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    PubMed Central

    Bayır, Şafak

    2016-01-01

    With the advances in the computer field, methods and techniques for automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, which is the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes, in machine learning terms, information related to the optic disc and information related to hard exudates may look the same. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC. PMID:27110272

  18. Analysis of the distances covered by first division brazilian soccer players obtained with an automatic tracking method.

    PubMed

    Barros, Ricardo M L; Misuta, Milton S; Menezes, Rafael P; Figueroa, Pascual J; Moura, Felipe A; Cunha, Sergio A; Anido, Ricardo; Leite, Neucimar J

    2007-01-01

    Visual estimation is still the most widely used method for analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the very first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results for 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. The results of a three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking, or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half player performance had already decreased, and this reduction was maintained throughout the second half. Key points: a novel automatic tracking method was presented.

  19. Analysis of the Distances Covered by First Division Brazilian Soccer Players Obtained with an Automatic Tracking Method

    PubMed Central

    Barros, Ricardo M.L.; Misuta, Milton S.; Menezes, Rafael P.; Figueroa, Pascual J.; Moura, Felipe A.; Cunha, Sergio A.; Anido, Ricardo; Leite, Neucimar J.

    2007-01-01

    Visual estimation is still the most widely used method for analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the very first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results for 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. The results of a three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking, or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half player performance had already decreased, and this reduction was maintained throughout the second half. Key points: a novel automatic tracking method was presented.

  20. γH2AX foci as a measure of DNA damage: a computational approach to automatic analysis

    PubMed Central

    Ivashkevich, Alesia N.; Martin, Olga A.; Smith, Andrea J.; Redon, Christophe E.; Bonner, William M.; Martin, Roger F.; Lobachevsky, Pavel N.

    2011-01-01

    The γH2AX focus assay represents a fast and sensitive approach for detection of one of the critical types of DNA damage – double-strand breaks (DSB) induced by various cytotoxic agents including ionising radiation. Apart from research applications, the assay has a potential in clinical medicine/pathology, such as assessment of individual radiosensitivity, response to cancer therapies, as well as in biodosimetry. Given that generally there is a direct relationship between numbers of microscopically visualised γH2AX foci and DNA DSB in a cell, the number of foci per nucleus represents the most efficient and informative parameter of the assay. Although computational approaches have been developed for automatic focus counting, the tedious and time consuming manual focus counting still remains the most reliable approach due to limitations of computational approaches. We suggest a computational approach and associated software for automatic focus counting that minimises these limitations. Our approach, while using standard image processing algorithms, maximises the automation of identification of nuclei/cells in complex images, offers an efficient way to optimise parameters used in the image analysis and counting procedures, optionally invokes additional procedures to deal with variations in intensity of the signal and background in individual images, and provides automatic batch processing of a series of images. We report results of validation studies that demonstrated correlation of manual focus counting with results obtained using our computational algorithm for mouse jejunum touch prints, mouse tongue sections and human blood lymphocytes as well as radiation dose response of γH2AX focus induction for these biological specimens. PMID:21216255
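
    A minimal sketch of automatic focus counting on a single nucleus image using standard steps (smoothing, Otsu thresholding, connected-component labelling with a size filter), in the spirit of the pipeline described above; the filter width, threshold choice, and synthetic image are illustrative assumptions.

```python
# Count γH2AX-like foci: smooth, threshold, label, reject tiny specks.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

rng = np.random.default_rng(9)
nucleus = rng.normal(0.1, 0.02, (64, 64))
for r, c in [(15, 20), (30, 40), (50, 10)]:       # three synthetic foci
    nucleus[r - 2:r + 2, c - 2:c + 2] += 0.5

smoothed = ndimage.gaussian_filter(nucleus, sigma=1.0)
binary = smoothed > threshold_otsu(smoothed)
labelled, n_foci = ndimage.label(binary)
sizes = ndimage.sum(binary, labelled, range(1, n_foci + 1))
n_foci = int((sizes >= 4).sum())                  # size filter
print('foci per nucleus:', n_foci)
```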

  1. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the usage of several different plug-ins, a significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks(®)) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the somatosensorial primary cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat .

  2. Automatic analysis of cerebral asymmetry: an exploratory study of the relationship between brain torque and planum temporale asymmetry.

    PubMed

    Barrick, Thomas R; Mackay, Clare E; Prima, Sylvain; Maes, Frederik; Vandermeulen, Dirk; Crow, Timothy J; Roberts, Neil

    2005-02-01

    Leftward occipital and rightward frontal lobe asymmetry (brain torque) and leftward planum temporale asymmetry have been consistently reported in postmortem and in vivo neuroimaging studies of the human brain. Here automatic image analysis techniques are applied to quantify global and local asymmetries, and investigate the relationship between brain torque and planum temporale asymmetries on T1-weighted magnetic resonance (MR) images of 30 right-handed young healthy subjects (15 male, 15 female). Previously described automatic cerebral hemisphere extraction and 3D interhemispheric reflection-based methods for studying brain asymmetry are applied with a new technique, LowD (Low Dimension), which enables automatic quantification of brain torque. LowD integrates extracted left and right cerebral hemispheres in columns orthogonal to the midsagittal plane (2D column maps), and subsequently integrates slices along the brain's anterior-posterior axis (1D slice profiles). A torque index defined as the magnitude of occipital and frontal lobe asymmetry is computed allowing exploratory investigation of relationships between this global asymmetry and local asymmetries found in the planum temporale. LowD detected significant torque in the 30 subjects with occipital and frontal components found to be highly correlated (P<0.02). Significant leftward planum temporale asymmetry was detected (P<0.05), and the torque index correlated with planum temporale asymmetry (P<0.001). However, torque and total brain volume were not correlated. Therefore, although components of cerebral asymmetry may be related, their magnitude is not influenced by total hemisphere volume. LowD provides increased sensitivity for detection and quantification of brain torque on an individual subject basis, and future studies will apply these techniques to investigate the relationship between cerebral asymmetry and functional laterality.

  3. An automatic detector of drowsiness based on spectral analysis and wavelet decomposition of EEG records.

    PubMed

    Garces Correa, Agustina; Laciar Leber, Eric

    2010-01-01

    An algorithm to automatically detect drowsiness episodes has been developed. It uses only one EEG channel to differentiate between the stages of alertness and drowsiness. In this work the feature vectors are built by combining power spectral density (PSD) and wavelet transform (WT) measures. The features extracted from the PSD of the EEG signal are: the central frequency, the first quartile frequency, the maximum frequency, the total energy of the spectrum, and the power of the theta and alpha bands. In the wavelet domain, the number of zero crossings and the integrals of scales 3, 4 and 5 of the Daubechies-2 WT were computed. Epochs are classified with neural networks. The detection results obtained with this technique are 86.5% for drowsiness stages and 81.7% for alertness segments. These results show that the extracted features and the classifier are able to identify drowsy EEG segments.
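
    Several of the PSD features listed above can be computed directly with Welch's method; in this minimal sketch the sampling rate, epoch length, and band edges are illustrative assumptions.

```python
# PSD-derived features of one EEG epoch via Welch's method.
import numpy as np
from scipy.signal import welch

def psd_features(epoch, fs=128.0):
    """Spectral centroid, total energy, and theta/alpha band powers."""
    freqs, psd = welch(epoch, fs=fs, nperseg=256)
    def band(lo, hi):
        sel = (freqs >= lo) & (freqs < hi)
        return float(np.trapz(psd[sel], freqs[sel]))
    centroid = float(np.sum(freqs * psd) / np.sum(psd))
    return {'central_freq': centroid,
            'total_energy': band(0.5, 40.0),
            'theta_power': band(4.0, 8.0),
            'alpha_power': band(8.0, 13.0)}

rng = np.random.default_rng(10)
t = np.arange(0, 10, 1 / 128.0)                  # one 10 s EEG epoch
eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
print(psd_features(eeg))
```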

  4. The Analysis of Image Contrast: From Quality Assessment to Automatic Enhancement.

    PubMed

    Gu, Ke; Zhai, Guangtao; Lin, Weisi; Liu, Min

    2016-01-01

    Proper contrast change can improve the perceptual quality of most images, but it has largely been overlooked in current research on image quality assessment (IQA). To fill this void, in this paper we first report a new large dedicated contrast-changed image database (CCID2014), which includes 655 images and associated subjective ratings recorded from 22 inexperienced observers. We then present a novel reduced-reference image quality metric for contrast change (RIQMC) using phase congruency and statistical information from the image histogram. Validation of the proposed model is conducted on the contrast-related CCID2014, TID2008, CSIQ and TID2013 databases, and the results justify the superiority and efficiency of RIQMC over a majority of classical and state-of-the-art IQA methods. Furthermore, we combine the aforesaid subjective and objective assessments to derive the RIQMC-based Optimal HIstogram Mapping (ROHIM) for automatic contrast enhancement, which is shown to outperform recently developed enhancement technologies.

  5. An Analysis of Serial Number Tracking Automatic Identification Technology as Used in Naval Aviation Programs

    NASA Astrophysics Data System (ADS)

    Csorba, Robert

    2002-09-01

    The Government Accounting Office found that the Navy, between 1996 and 1998, lost $3 billion of in-transit materiel. This thesis explores the benefits and costs of automatic identification and serial number tracking technologies under consideration by the Naval Supply Systems Command and the Naval Air Systems Command. Detailed cost-savings estimates are made for each aircraft type in the Navy inventory. Project and item managers of repairable components using Serial Number Tracking were surveyed as to the value of this system. The thesis concludes that two-thirds of the in-transit losses can be avoided with the implementation of effective information technology-based logistics and maintenance tracking systems. Recommendations are made for specific steps and components of such an implementation, and suggestions are made for further research.

  6. Flow measurements in sewers based on image analysis: automatic flow velocity algorithm.

    PubMed

    Jeanbourquin, D; Sage, D; Nguyen, L; Schaeli, B; Kayal, S; Barry, D A; Rossi, L

    2011-01-01

    Discharges of combined sewer overflows (CSOs) and stormwater are recognized as an important source of environmental contamination. However, the harsh sewer environment and particular hydraulic conditions during rain events reduce the reliability of traditional flow measurement probes. An in situ system for sewer water flow monitoring based on video images was evaluated. Algorithms to determine water velocities were developed based on image-processing techniques. The image-based water velocity algorithm identifies surface features and measures their positions with respect to real-world coordinates. A web-based user interface and a three-tier system architecture enable remote configuration of the cameras and the image-processing algorithms in order to calculate flow velocity automatically on-line. Results of investigations conducted in a CSO are presented. The system was found to measure water velocities reliably, thereby providing the means to understand particular hydraulic behaviors.
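
    The velocity step can be approximated in a few lines with OpenCV; global phase correlation stands in here for the paper's surface-feature tracking, and the scale (mm_per_px) and frame-interval (dt_s) parameters are assumptions.

    ```python
    import cv2
    import numpy as np

    def surface_velocity(frame_a, frame_b, mm_per_px, dt_s):
        """Estimate surface flow speed between two grayscale frames.

        Crop both frames to the water-surface region before calling;
        phaseCorrelate expects single-channel float32/float64 images.
        """
        a = np.float32(frame_a)
        b = np.float32(frame_b)
        (dx, dy), response = cv2.phaseCorrelate(a, b)
        # Convert the pixel displacement to a real-world speed.
        speed_mm_s = np.hypot(dx, dy) * mm_per_px / dt_s
        return speed_mm_s, response
    ```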

  7. Versatile, high sensitivity, and automatized angular dependent vectorial Kerr magnetometer for the analysis of nanostructured materials.

    PubMed

    Teixeira, J M; Lusche, R; Ventura, J; Fermento, R; Carpinteiro, F; Araujo, J P; Sousa, J B; Cardoso, S; Freitas, P P

    2011-04-01

    Magneto-optical Kerr effect (MOKE) magnetometry is an indispensable, reliable, and one of the most widely used techniques for the characterization of nanostructured magnetic materials. Information such as the magnitude of coercive fields or anisotropy strengths can be readily obtained from MOKE measurements. We present a description of our state-of-the-art vectorial MOKE magnetometer, an extremely versatile, accurate, and sensitive unit with a low cost and a comparatively simple setup. The unit includes focusing lenses and an automated stepper motor stage for angular dependent measurements. The performance of the magnetometer is demonstrated by hysteresis loops of Co thin films displaying uniaxial anisotropy induced during growth, MnIr/CoFe structures exhibiting the so-called exchange bias effect, spin valves, and microfabricated flux guides produced by optical lithography.

  8. SpLiNeS: automatic analysis of ecographic movies of flow-mediated dilation

    NASA Astrophysics Data System (ADS)

    Bartoli, Guido; Menegaz, Gloria; Dragoni, Saverio; Gori, Tommaso

    2007-03-01

    In this paper, we propose a fully automatic system for analyzing echographic movies of flow-mediated dilation (FMD). Our approach uses a spline-based active contour (deformable template) to follow artery boundaries during the FMD procedure. A number of preprocessing steps (grayscale conversion, contrast enhancement, sharpening) are used to improve the visual quality of frames coming from the echographic acquisition. Our system can be used in real-time environments owing to the high speed of the edge recognition, which iteratively minimizes fitting errors on endothelium boundaries. We also implemented a fully functional GUI which makes it possible to interactively follow the whole recognition process as well as to reshape the results. The system's accuracy and reproducibility have been validated with extensive in vivo experiments.

  9. Analysis of parameters for the automatic computation of the tear film break-up time test based on CCLRU standards.

    PubMed

    Ramos, L; Barreira, N; Mosquera, A; Penedo, M G; Yebra-Pimentel, E; García-Resúa, C

    2014-03-01

    Dry eye syndrome affects a remarkable percentage of the population: the prevalence is 10-15% in the normal population and 18-30% among contact lens users. The break-up time (BUT) is a clinical test used for the diagnosis of this disease. In this work, we perform an analysis of parameters for a global and a local automatic computation of the BUT measure, based on criteria of specificity and sensitivity. We have tested our methodology on a dataset composed of 18 videos annotated by 4 different experts. The local analysis preserves the results of the global approach while providing useful additional information about the tear break-up zone. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Automatic cell segmentation and nuclear-to-cytoplasmic ratio analysis for third harmonic generated microscopy medical images.

    PubMed

    Lee, Gwo Giun; Lin, Huan-Hsiang; Tsai, Ming-Rung; Chou, Sin-Yo; Lee, Wen-Jeng; Liao, Yi-Hua; Sun, Chi-Kuang; Chen, Chun-Fu

    2013-04-01

    Traditional biopsy procedures require invasive tissue removal from a living subject, followed by time-consuming and complicated processing, so a noninvasive in vivo virtual biopsy, able to obtain exhaustive tissue images without removing tissue, is highly desired. Sets of in vivo virtual biopsy images provided by healthy volunteers were processed by the proposed cell segmentation approach, which combines a watershed-based approach with the concept of the convergence index filter for automatic cell segmentation. Experimental results suggest that the proposed algorithm not only achieves high accuracy in cell segmentation but also has considerable potential for noninvasive analysis of the cell nuclear-to-cytoplasmic (NC) ratio, which is important in identifying or detecting early symptoms of diseases with abnormal NC ratios, such as skin cancers, during clinical diagnosis via medical image analysis.

  11. Automatic classification of pulmonary function in COPD patients using trachea analysis in chest CT scans

    NASA Astrophysics Data System (ADS)

    van Rikxoort, E. M.; de Jong, P. A.; Mets, O. M.; van Ginneken, B.

    2012-03-01

    Chronic Obstructive Pulmonary Disease (COPD) is a chronic lung disease that is characterized by airflow limitation. COPD is clinically diagnosed and monitored using pulmonary function testing (PFT), which measures the global inspiration and expiration capabilities of patients and is time-consuming and labor-intensive. It is becoming standard practice to obtain paired inspiration-expiration CT scans of COPD patients. Predicting the PFT results from the CT scans would alleviate the need for PFT testing. It is hypothesized that the change of the trachea during breathing might be an indicator of tracheomalacia in COPD patients and correlate with COPD severity. In this paper, we propose to automatically measure morphological changes in the trachea from paired inspiration and expiration CT scans and investigate the influence on COPD GOLD stage classification. The trachea is automatically segmented and the trachea shape is encoded using the lengths of rays cast from the center of gravity of the trachea. These features are used in a classifier, combined with emphysema scoring, to attempt to classify subjects into their COPD stage. A database of 187 subjects, well distributed over COPD GOLD stages 0 through 4, was used for this study. The data was randomly divided into training and test sets. Using the training scans, a nearest mean classifier was trained to classify the subjects into their correct GOLD stage using either the emphysema score, the tracheal shape features, or a combination. Combining the proposed trachea shape features with the emphysema score improved the GOLD stage classification performance by 11%, to 51%. In addition, an 80% accuracy was achieved in distinguishing healthy subjects from COPD patients.
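
    A simplified 2D version of the ray-based shape encoding and nearest mean classification might look as follows; the ray count, the step size and the use of a single cross-section (rather than the full 3D trachea) are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage
    from sklearn.neighbors import NearestCentroid

    def ray_shape_features(mask, n_rays=36):
        """Encode a 2D trachea cross-section by the lengths of rays cast
        from its center of gravity (a 2D stand-in for the 3D encoding)."""
        cy, cx = ndimage.center_of_mass(mask)
        feats = []
        for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
            r = 0.0
            # Step outward until the ray leaves the segmented trachea.
            while True:
                y = int(round(cy + r * np.sin(theta)))
                x = int(round(cx + r * np.cos(theta)))
                if (y < 0 or y >= mask.shape[0] or
                        x < 0 or x >= mask.shape[1] or not mask[y, x]):
                    break
                r += 0.5
            feats.append(r)
        return np.array(feats)

    # Nearest mean classification into GOLD stages, as in the paper:
    # X = per-subject features (rays + emphysema score), y = GOLD stages.
    # clf = NearestCentroid().fit(X_train, y_train); clf.predict(X_test)
    ```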

  12. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    NASA Astrophysics Data System (ADS)

    Rashidi, Mehdi; Dehmeshki, Jamshid; Dickenson, Eric; Daemi, M. Farhang

    1997-10-01

    This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid, laced with a fluorescent dye or microspheres, flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination purposes, a planar laser sheet passes through the column while a CCD camera records the laser-illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For velocity measurements, while the aqueous fluid laced with fluorescent microspheres flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired automatically frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed images are of poor quality at this stage, several preprocessing steps are used to enhance the particles within the images. Finally, the enhanced particles are tracked to calculate velocity vectors in the plane of the beam. For concentration measurements, while the aqueous fluid laced with a fluorescent organic dye flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam traveling simultaneously with the camera. These recorded images are then transferred to the computer for processing in a fashion similar to the velocity measurements. In order to have a fully automatic vision system, several detailed image processing techniques were developed to match images that have different intensity values but the same topological characteristics. This results in normalized interstitial chemical concentrations as a function of time within the porous column.

  13. Automatic quantification of IHC stain in breast TMA using colour analysis.

    PubMed

    Fernández-Carrobles, M Milagro; Bueno, Gloria; García-Rojo, Marcial; González-López, Lucía; López, Carlos; Déniz, Oscar

    2017-06-13

    Immunohistochemical (IHC) biomarkers in breast tissue microarray (TMA) samples are used daily in pathology departments. In recent years, automatic methods to evaluate positive staining have been investigated, since they may save time and reduce errors in the diagnosis; these errors are mostly due to subjective evaluation. The aim of this work is to develop a density tool able to automatically quantify the positive brown IHC stain in breast TMA for different biomarkers. To avoid the problem of colour variation and make a robust tool independent of the staining process, several colour standardization methods have been analysed. Four colour standardization methods have been compared against colour model segmentation. The standardization methods have been compared by means of the NBS colour distance. The use of colour standardization helps to reduce noise due to stain and histological sample preparation. However, the most reliable and robust results have been obtained by combining the HSV and RGB colour models for segmentation with the HSB channels. The segmentation provides three outputs based on three saturation values for weak, medium and strong staining. The output images can be combined according to the type of biomarker staining. The results with 12 biomarkers were evaluated and compared to the segmentation and density calculation done by expert pathologists. The Hausdorff distance, sensitivity and specificity have been used to quantitatively validate the results. The tests carried out with 8000 TMA images provided an average of 95.94% accuracy applied to the total tissue cylinder area. Colour standardization was used only when the tissue core had blurred or faded stain and the expert could not evaluate it without pre-processing. Copyright © 2017 Elsevier Ltd. All rights reserved.
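
    A toy version of the saturation-banded brown-stain segmentation is sketched below; the hue range and the three saturation cut-offs are placeholders, since the paper tunes these values and additionally combines HSV with RGB information.

    ```python
    import cv2

    def dab_stain_masks(bgr, sat_thresholds=(40, 90, 140)):
        """Split positive brown (DAB) staining into weak/medium/strong
        masks using hue plus three saturation bands (illustrative)."""
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        h, s, v = cv2.split(hsv)
        # Brown stain: low hue (red-orange sector of OpenCV's 0-179 hue
        # scale) with the red channel dominating blue in RGB.
        brown = (h < 30) & (bgr[..., 2] > bgr[..., 0])
        weak = brown & (s >= sat_thresholds[0]) & (s < sat_thresholds[1])
        medium = brown & (s >= sat_thresholds[1]) & (s < sat_thresholds[2])
        strong = brown & (s >= sat_thresholds[2])
        return weak, medium, strong
    ```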

  14. Automatic Extraction of Optimal Endmembers from Airborne Hyperspectral Imagery Using Iterative Error Analysis (IEA) and Spectral Discrimination Measurements

    PubMed Central

    Song, Ahram; Chang, Anjin; Choi, Jaewan; Choi, Seokkeun; Kim, Yongil

    2015-01-01

    Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot handle the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two different airborne images, Airborne Imaging Spectrometer for Application (AISA) and Compact Airborne Spectrographic Imager (CASI) data, were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers or errors from mixed materials. PMID:25625907
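
    The core IEA loop can be sketched compactly in NumPy; the unconstrained least-squares unmixing and the RMSE stopping tolerance are simplifications of the published procedure.

    ```python
    import numpy as np

    def iea_endmembers(pixels, n_max, rmse_tol=1e-3):
        """Iterative Error Analysis: grow an endmember set by repeatedly
        adding the pixel worst explained by the current unmixing.

        pixels: (n_pixels, n_bands) array of spectra.
        """
        E = [pixels.mean(axis=0)]            # seed with the mean spectrum
        for _ in range(n_max):
            M = np.stack(E, axis=1)          # (n_bands, n_endmembers)
            A, *_ = np.linalg.lstsq(M, pixels.T, rcond=None)
            resid = pixels.T - M @ A
            rmse = np.sqrt((resid ** 2).mean(axis=0))  # per-pixel RMSE
            worst = int(np.argmax(rmse))
            if rmse[worst] < rmse_tol:       # everything well explained
                break
            E.append(pixels[worst])          # next endmember candidate
        return np.array(E[1:])               # drop the mean seed
    ```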

  15. Automatic extraction of optimal endmembers from airborne hyperspectral imagery using iterative error analysis (IEA) and spectral discrimination measurements.

    PubMed

    Song, Ahram; Chang, Anjin; Choi, Jaewan; Choi, Seokkeun; Kim, Yongil

    2015-01-23

    Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot handle the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two different airborne images, Airborne Imaging Spectrometer for Application (AISA) and Compact Airborne Spectrographic Imager (CASI) data, were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers or errors from mixed materials.

  16. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying each image based on features like the complexity of the background and the visibility of the disease (lesions). An automatic background classification tool for mammograms would therefore help in such clinical studies. The classification tool is based on a multi-content analysis (MCA) framework which was first developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfying accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale, which standardizes mammography reporting terminology and assessment and recommendation categories, is used for grouping the mammograms. Selected features are input into a decision tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these "weak classifiers" are combined into a "strong classifier" (a classifier with a low global error rate) for classifying each category. The classification results for one "strong classifier" show good accuracy with high true-positive rates. For the four categories the results are: TP=90.38%, TN=67.88%, FP=32.12% and FN=9.62%.
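
    In scikit-learn the weak-to-strong construction reads as follows; depth-1 decision trees stand in for the paper's decision-tree weak classifiers, and the estimator counts and the one-vs-rest use are assumptions.

    ```python
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Each shallow tree is a "weak classifier" (error below 50%); AdaBoost
    # reweights the training samples each round and combines the trees
    # into one "strong classifier" per BI-RADS density category.
    weak = DecisionTreeClassifier(max_depth=1)
    strong = AdaBoostClassifier(estimator=weak, n_estimators=100)
    # (scikit-learn >= 1.2 assumed; older releases call it base_estimator.)

    # strong.fit(texture_features, is_category_k)  # one-vs-rest training
    # strong.predict(new_mammogram_features)
    ```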

  17. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis

    NASA Astrophysics Data System (ADS)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan

    2011-06-01

    Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction and an unsupervised clustering method, is described here. The wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure, instead of the Euclidean distance, for distance calculation; the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels, and electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery, are used to evaluate the performance of the GSLC. Feature extraction results from the use of the WT with the KS test indicate a reduced number of feature coefficients as well as good noise rejection, despite similar spike waveforms. Accordingly, the use of the GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results on the electrophysiological data indicate that the quality of spike sorting with the GSLC is adequate.
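
    The WT + KS-test feature selection stage can be sketched as below; the wavelet family, decomposition level and number of retained coefficients are assumptions, and the deviation-from-normality criterion follows the common spike-sorting practice of testing each coefficient against a fitted normal.

    ```python
    import numpy as np
    import pywt
    from scipy import stats

    def select_wavelet_features(spikes, wavelet="haar", level=4, n_keep=10):
        """Keep the wavelet coefficients that deviate most from normality.

        spikes: (n_spikes, n_samples). Coefficients whose distribution
        across spikes is multimodal (non-Gaussian) separate units best,
        so coefficients are ranked by their KS statistic.
        """
        coeffs = np.array([np.concatenate(pywt.wavedec(s, wavelet, level=level))
                           for s in spikes])         # (n_spikes, n_coeffs)
        ks = np.zeros(coeffs.shape[1])
        for j in range(coeffs.shape[1]):
            c = coeffs[:, j]
            sd = c.std()
            if sd > 0:
                z = (c - c.mean()) / sd              # normalize, then test
                ks[j] = stats.kstest(z, "norm").statistic
        keep = np.argsort(ks)[::-1][:n_keep]         # largest deviation first
        return coeffs[:, keep], keep
    ```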

  18. Extraction and parametrization of grain boundary networks in glacier ice, using a dedicated method of automatic image analysis.

    PubMed

    Binder, T; Garbe, C S; Wagenbach, D; Freitag, J; Kipfstuhl, S

    2013-05-01

    Microstructure analysis of polar ice cores is vital to understand the processes controlling the flow of polar ice on the microscale. This paper presents an automatic image processing framework for the extraction and parametrization of grain boundary networks from images of the NEEM deep ice core. As cross-section images are acquired using controlled surface sublimation, grain boundaries and air inclusions appear dark, whereas the inside of grains appears grey. The initial segmentation step of the software is to separate possible boundaries of grains and air inclusions from the background. A machine learning approach is utilized to achieve automatic, reliable classification, which is required for processing large data sets along deep ice cores. The second step is to compose the perimeter of section profiles of grains from planar sections of the grain surface between triple points. Ultimately, grain areas, grain boundaries and triple junctions of the latter are parametrized in various ways. High resolution is achieved, so that small grain sizes and local curvatures of grain boundaries can be systematically investigated. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.

  19. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1988-01-19

    approach for the analysis of aerial images. In this approach image analysis is performed at three levels of abstraction, namely iconic or low-level image analysis, symbolic or medium-level image analysis, and semantic or high-level image analysis. Domain-dependent knowledge about prototypical urban...

  20. Conversation analysis at work: detection of conflict in competitive discussions through semi-automatic turn-organization analysis.

    PubMed

    Pesarin, Anna; Cristani, Marco; Murino, Vittorio; Vinciarelli, Alessandro

    2012-10-01

    This study proposes a semi-automatic approach aimed at detecting conflict in conversations. The approach is based on statistical techniques capable of identifying turn-organization regularities associated with conflict. The only manual step of the process is the segmentation of the conversations into turns (time intervals during which only one person talks) and overlapping speech segments (time intervals during which several persons talk at the same time). The rest of the process takes place automatically and the results show that conflictual exchanges can be detected with Precision and Recall around 70% (the experiments have been performed over 6 h of political debates). The approach brings two main benefits: the first is the possibility of analyzing potentially large amounts of conversational data with a limited effort, the second is that the model parameters provide indications on what turn-regularities are most likely to account for the presence of conflict.

  1. Isothermal reduction kinetics of Panzhihua ilmenite concentrate under 30vol% CO-70vol% N2 atmosphere

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-yi; Lü, Wei; Lü, Xue-wei; Li, Sheng-ping; Bai, Chen-guang; Song, Bing; Han, Ke-xi

    2017-03-01

    The reduction of ilmenite concentrate in a 30vol% CO-70vol% N2 atmosphere was characterized by thermogravimetric and differential thermogravimetric (TG-DTG) analysis methods at temperatures from 1073 to 1223 K. The isothermal reduction results show that the reduction process comprised two stages; the corresponding apparent activation energies were obtained by iso-conversional and model-fitting methods. For the first stage, the effect of temperature on the conversion degree was not obvious; the phase-boundary chemical reaction was the controlling step, with an apparent activation energy of 15.55-40.71 kJ·mol⁻¹. For the second stage, when the temperature was greater than 1123 K, the reaction rate and the conversion degree increased sharply with increasing temperature, and random nucleation and subsequent growth were the controlling steps, with an apparent activation energy ranging from 182.33 to 195.95 kJ·mol⁻¹. For the whole reduction process, the average activation energy and pre-exponential factor were 98.94-118.33 kJ·mol⁻¹ and 1.820-1.816 min⁻¹, respectively.
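
    The model-fitting route to the apparent activation energy is an Arrhenius fit, ln k = ln A - Ea/(RT), linear in 1/T; the sketch below uses placeholder rate constants, not the paper's data.

    ```python
    import numpy as np

    # Arrhenius fit: k = A * exp(-Ea / (R * T)).
    R = 8.314  # gas constant, J/(mol*K)
    T = np.array([1073.0, 1123.0, 1173.0, 1223.0])   # temperatures, K
    k = np.array([0.010, 0.018, 0.031, 0.052])       # rate constants, min^-1 (illustrative)

    slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
    Ea = -slope * R          # apparent activation energy, J/mol
    A = np.exp(intercept)    # pre-exponential factor, min^-1
    print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.3g} min^-1")
    ```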

  2. Monitoring infants by automatic video processing: A unified approach to motion analysis.

    PubMed

    Cattani, Luca; Alinovi, Davide; Ferrari, Gianluigi; Raheli, Riccardo; Pavlidis, Elena; Spagnoli, Carlotta; Pisani, Francesco

    2017-01-01

    A unified approach to contact-less and low-cost video processing for the automatic detection of neonatal diseases characterized by specific movement patterns is presented. This disease category includes neonatal clonic seizures and apneas. Both disorders are characterized by the presence or absence, respectively, of periodic movements of parts of the body (e.g., the limbs in the case of clonic seizures and the chest/abdomen in the case of apneas). Therefore, one can analyze the data obtained from multiple video sensors placed around a patient, extracting relevant motion signals and estimating, using the Maximum Likelihood (ML) criterion, their possible periodicity. This approach is very versatile and allows various scenarios to be investigated, including a single Red, Green and Blue (RGB) camera, an RGB-depth sensor, or a network of a few RGB cameras. Data fusion principles are considered to aggregate the signals from multiple sensors. In the case of apneas, since breathing movements are subtle, the video can be pre-processed by a recently proposed algorithm which is able to emphasize small movements. The performance of the proposed contact-less detection algorithms is assessed on real video recordings of newborns, in terms of sensitivity, specificity, and Receiver Operating Characteristic (ROC) curves, with respect to medical gold standard devices. The obtained results show that a video processing-based system can effectively detect the considered diseases, with performance increasing as the number of sensors increases.
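
    For a sinusoid in white Gaussian noise, the ML frequency estimate coincides with the periodogram peak, so the periodicity check can be sketched as below; the physiological frequency band and the motion-signal construction are assumptions.

    ```python
    import numpy as np

    def dominant_period(motion, fs, fmin=0.2, fmax=3.0):
        """ML-style periodicity estimate for a 1D motion signal.

        The periodogram peak inside a plausible band is taken as the
        frequency estimate; its reciprocal is the movement period.
        """
        x = motion - motion.mean()
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        power = np.abs(np.fft.rfft(x)) ** 2
        band = (freqs >= fmin) & (freqs <= fmax)
        f_hat = freqs[band][np.argmax(power[band])]
        return 1.0 / f_hat   # estimated period, seconds

    # motion could be, e.g., the mean absolute frame difference over a
    # limb (seizure) or chest/abdomen (apnea) region of interest.
    ```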

  3. Analysis of electric energy consumption of automatic milking systems in different configurations and operative conditions.

    PubMed

    Calcante, Aldo; Tangorra, Francesco M; Oberti, Roberto

    2016-05-01

    Automatic milking systems (AMS) have been a revolutionary innovation in dairy cow farming. Currently, more than 10,000 dairy cow farms worldwide use AMS to milk their cows. Electric consumption is one of the most relevant and uncontrollable operational costs of AMS, ranging between 35 and 40% of their total annual operational costs. The aim of the present study was to measure and analyze the electric energy consumption of 4 AMS with different configurations: single box, and central unit featuring a central vacuum system for 1 cow unit and for 2 cow units. The electrical consumption (daily consumption, daily consumption per cow milked, consumption per milking, and consumption per 100 L of milk) of each AMS (milking unit + air compressor) was measured using 2 energy analyzers. The measurement period lasted 24 h with a sampling frequency of 0.2 Hz. The daily total energy consumption (milking unit + air compressor) ranged between 45.4 and 81.3 kWh; the consumption per cow milked ranged between 0.59 and 0.99 kWh; the consumption per milking ranged between 0.21 and 0.33 kWh; and the consumption per 100 L of milk ranged between 1.80 and 2.44 kWh, according to the different configurations and operational contexts considered. Results showed that AMS electric consumption was mainly conditioned by farm management rather than by machine characteristics/architectures. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  4. Multi-objective differential evolution for automatic clustering with application to micro-array data analysis.

    PubMed

    Suresh, Kaushik; Kundu, Debarati; Ghosh, Sayan; Das, Swagatam; Abraham, Ajith; Han, Sang Yong

    2009-01-01

    This paper applies the Differential Evolution (DE) algorithm to the task of automatic fuzzy clustering in a Multi-objective Optimization (MO) framework. It compares the performances of two multi-objective variants of DE on the fuzzy clustering problem, where two conflicting fuzzy validity indices are simultaneously optimized. The resulting Pareto-optimal set of solutions from each algorithm consists of a number of non-dominated solutions, from which the user can choose the most promising ones according to the problem specifications. A real-coded representation of the search variables, accommodating a variable number of cluster centers, is used for DE. The performances of the multi-objective DE variants have also been contrasted with those of two of the most well-known schemes of MO clustering, namely the Non-dominated Sorting Genetic Algorithm (NSGA-II) and Multi-Objective Clustering with an unknown number of Clusters K (MOCK). Experimental results using six artificial and four real-life datasets of varying complexity indicate that DE holds immense promise as a candidate algorithm for devising MO clustering schemes.

  5. Analysis of individual classification of lameness using automatic measurement of back posture in dairy cattle.

    PubMed

    Viazzi, S; Bahr, C; Schlageter-Tello, A; Van Hertem, T; Romanini, C E B; Pluk, A; Halachmi, I; Lokhorst, C; Berckmans, D

    2013-01-01

    Currently, diagnosis of lameness at an early stage in dairy cows relies on visual observation by the farmer, which is time consuming and often omitted. Many studies have tried to develop automatic cow lameness detection systems. However, those studies apply thresholds to the whole population to detect whether or not an individual cow is lame. Therefore, the objective of this study was to develop and test an individualized version of the body movement pattern score, which uses back posture to classify lameness into 3 classes, and to compare both the population and the individual approach under farm conditions. In a data set of 223 videos from 90 cows, 76% of cows were correctly classified, with an 83% true positive rate and 22% false positive rate when using the population approach. A new data set, containing 105 videos of 8 cows that had moved through all 3 lameness classes, was used for an ANOVA on the 3 different classes, showing that body movement pattern scores differed significantly among cows. Moreover, the classification accuracy and the true positive rate increased by 10 percentage units up to 91%, and the false positive rate decreased by 4 percentage units down to 6% when based on an individual threshold compared with a population threshold. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  6. Automatic bone registration in MR knee images for cartilage morphological analysis

    NASA Astrophysics Data System (ADS)

    Yoo, Ji Hyun; Kim, Soo Kyung; Hong, Helen; Shim, Hackjoon; Kwoh, C. Kent; Bae, Kyongtae T.

    2009-02-01

    We propose a cartilage matching technique based on the registration of the corresponding bone structures instead of using the cartilage. Our method consists of five steps. First, cartilage and corresponding bone structures are extracted by semi-automatic segmentation. Second, gross translational mismatch between corresponding bone structures is corrected by point-based rough registration. The center of inertia (COI) of each segmented bone structure is considered as the reference point. Third, the initial alignment is refined by distance-based surface registration. For fast and robust convergence of the distance measure to the optimal value, a 3D distance map is generated by the Gaussian-weighted narrow-band distance propagation. Fourth, rigid transformation of the bone surface registration is applied to the cartilage of baseline MR images. Finally, morphological differences of the corresponding cartilages are visualized by color-coded mapping and image fusion. Experimental results show that the cartilage morphological changes of baseline and follow-up MR knee images can be easily recognized by the correct registration of the corresponding bones.
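
    The second step (COI-based rough registration) amounts to matching centers of mass; a minimal sketch with SciPy follows, assuming binary bone masks on a common voxel grid and the hypothetical helper name coi_translation.

    ```python
    import numpy as np
    from scipy import ndimage

    def coi_translation(moving_mask, fixed_mask):
        """Point-based rough registration: translate the moving bone mask
        so its center of inertia (COI) coincides with the fixed one."""
        shift = (np.array(ndimage.center_of_mass(fixed_mask)) -
                 np.array(ndimage.center_of_mass(moving_mask)))
        # Nearest-neighbor shift keeps the mask binary.
        aligned = ndimage.shift(moving_mask.astype(float), shift, order=0)
        return aligned > 0.5, shift
    ```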

  7. Volume-based Feature Analysis of Mucosa for Automatic Initial Polyp Detection in Virtual Colonoscopy

    PubMed Central

    Wang, Su; Zhu, Hongbin; Lu, Hongbing; Liang, Zhengrong

    2009-01-01

    In this paper, we present a volume-based, mucosa-oriented polyp candidate determination scheme for automatic polyp detection in computed tomographic colonography. Unlike most existing computer-aided detection (CAD) methods, in which the mucosa layer is a one-layer surface, a thick mucosa of 3-5 voxels wide, fully reflecting the partial volume effect, is intentionally extracted, which excludes the direct application of traditional geometrical features. To address this dilemma, fast marching-based adaptive gradient/curvature and weighted integral curvature along normal directions (WICND) are developed for the volume-based mucosa. In doing so, polyp candidates are optimally determined by computing and clustering these fast marching-based adaptive geometrical features. By testing on 52 patient datasets, in which 26 patients were found with polyps of size 4-22 mm, both the locations and number of polyp candidates detected by WICND and by the previously developed linear integral curvature (LIC) were compared. The results were promising in that WICND outperformed LIC mainly in two aspects: (1) the number of detected false positives was reduced from 706 to 132 on average, which significantly eased the machine learning burden in the feature space, and (2) both the sensitivity and accuracy of polyp detection were slightly improved, especially for polyps smaller than 5 mm. PMID:19774204

  8. Automatic nevi segmentation using adaptive mean shift filters and feature analysis

    NASA Astrophysics Data System (ADS)

    King, Michael A.; Lee, Tim K.; Atkins, M. Stella; McLean, David I.

    2004-05-01

    A novel automatic method of segmenting nevi is explained and analyzed in this paper. The first step in nevus segmentation is to iteratively apply an adaptive mean shift filter to form clusters in the image and to remove noise. The goal of this step is to remove differences in skin intensity and hairs from the image, while still preserving the shape of nevi present on the skin. Each iteration of the mean shift filter changes pixel values to a weighted average of the pixels in their neighborhood. Some new extensions of the mean shift filter are proposed to allow better segmentation of nevi from the skin. The kernel, which describes how the pixels in the neighborhood are averaged, is adaptive: its shape is a function of the local histogram. After initial clustering, a simple merging of clusters is done. Finally, clusters that are local minima are found and analyzed to determine which clusters are nevi. When this algorithm was compared to an assessment by an expert dermatologist, it showed a sensitivity rate and diagnostic accuracy of over 95% on the test set, for nevi larger than 1.5 mm.

  9. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents

    PubMed Central

    Colomer Granero, Adrián; Fuentes-Hurtado, Félix; Naranjo Ornedo, Valery; Guixeres Provinciale, Jaime; Ausín, Jose M.; Alcañiz Raya, Mariano

    2016-01-01

    This work focuses on finding the most discriminatory or representative features that allow commercials to be classified as having negative, neutral or positive effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out. In this experiment, electroencephalography (EEG), electrocardiography (ECG), Galvanic Skin Response (GSR) and respiration data were acquired while subjects were watching a 30-min audiovisual content. This content was composed of a submarine documentary and nine commercials (one of them the ad under evaluation). After signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features, computed in the time and frequency domains, are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier consisting of a combination of AdaBoost and Random Forest with automatic feature selection. The selected features were those extracted from the GSR and HRV signals. These results are promising for the audiovisual content evaluation field by means of physiological signal processing. PMID:27471462

  10. Comparison of automatic control systems

    NASA Technical Reports Server (NTRS)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  11. Evaluation of two software tools dedicated to an automatic analysis of the CT scanner image spatial resolution.

    PubMed

    Torfeh, Tarraf; Beaumont, Stéphane; Guédon, Jean Pierre; Denis, Eloïse

    2007-01-01

    An evaluation of two software tools dedicated to the automatic analysis of CT scanner image spatial resolution is presented in this paper. The methods evaluated consist of calculating the Modulation Transfer Function (MTF) of the CT scanner; the first uses an image of an impulse source, while the second, proposed by Droege and Morin, uses an image of cyclic bar patterns. Two Digital Test Objects (DTOs) are created for this purpose. These DTOs are then blurred by convolution with a two-dimensional Gaussian Point Spread Function (PSF(Ref)) which has a well-known Full Width at Half Maximum (FWHM). The evaluation process then consists of comparing the Fourier transform of the reference PSF with the MTFs obtained by the two methods.
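
    The impulse-source method reduces to a normalized Fourier transform of the measured point/line spread function; a sketch follows, with a Gaussian of known FWHM standing in for the blurred DTO so the measured MTF can be checked against the analytic one.

    ```python
    import numpy as np

    def mtf_from_impulse(psf_line):
        """MTF of a 1D impulse-response profile: the normalized magnitude
        of its Fourier transform (DC component scaled to 1)."""
        mtf = np.abs(np.fft.rfft(psf_line))
        return mtf / mtf[0]

    # Reference check: a Gaussian PSF with known FWHM has an analytic
    # (also Gaussian) MTF, so the measured curve can be compared point
    # by point, as in the evaluation described above.
    fwhm_px = 3.0
    sigma = fwhm_px / (2 * np.sqrt(2 * np.log(2)))
    x = np.arange(-32, 33)
    psf = np.exp(-x**2 / (2 * sigma**2))
    measured = mtf_from_impulse(psf)
    ```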

  12. Preliminary Statistical Analysis of the 1995 Evaluation by NASA LaRC of the IAI Automatic Balance Calibration Machine

    NASA Technical Reports Server (NTRS)

    Tcheng, Ping; Tripp, John S.

    1999-01-01

    The NASA Langley Research Center (LaRC) participated in a national cooperative evaluation of the Israel Aircraft Industries (IAI) automatic balance calibration machine at Microcraft, San Diego, in September 1995. A LaRC-designed six-component strain gauge balance was selected for test and calibration during LaRC's scheduled evaluation period. Eight calibrations were conducted using three selected experimental designs. Raw data were exported to LaRC facilities for reduction and statistical analysis using the techniques outlined in Tripp and Tcheng (1994). This report presents preliminary assessments of the results and compares IAI calibration results with manual calibration results obtained at the Modern Machine and Tool Co., Inc. (MM&T), Newport News, VA. A more comprehensive report is forthcoming.

  13. Validating the performance of one-time decomposition for fMRI analysis using ICA with automatic target generation process.

    PubMed

    Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei

    2013-07-01

    Independent component analysis (ICA) has been proven to be effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of an unmixing matrix whose initial values are generated randomly, and this randomness of initialization leads to different decomposition results. A single one-time decomposition is therefore not usually reliable for fMRI data analysis. Under this circumstance, several methods based on repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although RDICA has achieved satisfying results in validating the performance of ICA decomposition, it costs considerable computing time. To mitigate this problem, we propose in this paper a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method, and made a performance comparison of the traditional one-time decomposition with ICA (ODICA), RDICA and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition, but also saves considerable computing time compared to RDICA. Furthermore, a ROC (Receiver Operating Characteristic) power analysis indicated better signal reconstruction performance for ATGP-ICA than for RDICA.

  14. How automatic is the musical Stroop effect? Commentary on "The musical Stroop effect: Opening a new avenue to research on automatisms" by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, Vol. 60, pp. 269-278).

    PubMed

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one big advantage of this variant is that the automaticity of note naming can be better controlled than in other Stroop variants, as musicians are very practiced in note reading whereas non-musicians are not. In this comment we argue that at present the exact impact of automaticity in this Stroop variant remains somewhat unclear, for at least three reasons: the type of information that is automatically retrieved when notes are encountered, the possible influence of object-based attention, and the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme-group design.

  15. Evaluation of automatic exposure control performance in full-field digital mammography systems using contrast-detail analysis

    NASA Astrophysics Data System (ADS)

    Suarez Castellanos, Ivan M.; Kaczmarek, Richard; Brunner, Claudia C.; de Las Heras, Hugo; Liu, Haimo; Chakrabarti, Kish

    2012-03-01

    Full Field Digital Mammography (FFDM) is increasingly replacing screen-film systems for the screening and diagnosis of breast abnormalities. All FFDM systems are equipped with an Automatic Exposure Control (AEC) which automatically selects technique factors to optimize dose and image quality. It is therefore crucial that AEC performance is properly adjusted and optimized for different breast thicknesses. In this work, we studied the AEC performance of three widely used FFDM systems using the CDMAM and QUART mam/digi phantoms. We used the CDMAM phantom to generate Contrast-Detail (C-D) curves for each AEC mode available in the FFDM systems under study, for phantoms with X-ray attenuation properties equivalent to 3.2 cm, 6 cm and 7.5 cm thick breasts. The generated C-D curves were compared with ideal C-D curves constructed using a metric referred to as the k-factor, which is the product of the thickness and the diameter of the smallest correctly identified disks in the CDMAM phantom. Previous observer studies have indicated that k-factor values of 60 to 80 μm2 are particularly useful in demonstrating the threshold for object detectability for detectors used in digital mammography systems. The QUART mam/digi phantom was used to calculate contrast-to-noise ratio (CNR) values at different phantom thicknesses. The results of the C-D analysis and the CNR measurements were used to determine limiting CNR values intended to provide a threshold for proper image quality assessment. The results of the Contrast-Detail analysis show that for two of the three evaluated FFDM systems, low-contrast signal detectability worsens at higher phantom thicknesses. This agrees with the results obtained with the QUART mam/digi phantom, where CNR decreases below the determined limiting CNR values.

  16. A hybrid 3D region growing and 4D curvature analysis-based automatic abdominal blood vessel segmentation through contrast enhanced CT

    NASA Astrophysics Data System (ADS)

    Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen

    2017-03-01

    In abdominal disease diagnosis and the planning of various abdominal surgeries, segmentation of abdominal blood vessels (ABVs) is an imperative task. Automatic segmentation enables fast and accurate processing of ABVs. We propose a fully automatic approach for segmenting ABVs from contrast-enhanced CT images by a hybrid of 3D region growing and 4D curvature analysis. The proposed method comprises three stages. First, candidates for bone, kidneys, ABVs and heart are segmented by an auto-adapted threshold. Second, bone is auto-segmented and classified into spine, ribs and pelvis. Third, ABVs are automatically segmented in two sub-steps: (1) the kidneys and the abdominal part of the heart are segmented; (2) ABVs are segmented by a hybrid approach that integrates 3D region growing and 4D curvature analysis. Results are compared with those of two conventional methods and show that the proposed method is very promising in segmenting and classifying bone and segmenting whole ABVs, and that it may have utility in clinical use.
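
    A generic 3D region-growing kernel of the kind underlying these sub-steps is sketched below; the fixed intensity bounds replace the paper's auto-adapted thresholds, and 6-connectivity is an assumption.

    ```python
    import numpy as np
    from collections import deque

    def region_grow_3d(vol, seed, lo, hi):
        """Grow a 3D region from a seed voxel, accepting 6-connected
        neighbors whose intensity lies in [lo, hi]."""
        grown = np.zeros(vol.shape, dtype=bool)
        q = deque([seed])
        grown[seed] = True
        while q:
            z, y, x = q.popleft()
            for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (z + dz, y + dy, x + dx)
                if all(0 <= n[i] < vol.shape[i] for i in range(3)) \
                        and not grown[n] and lo <= vol[n] <= hi:
                    grown[n] = True
                    q.append(n)
        return grown
    ```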

  17. AUTOMATIC COUNTER

    DOEpatents

    Robinson, H.P.

    1960-06-01

    An automatic counter of alpha particle tracks recorded by a sensitive emulsion of a photographic plate is described. The counter includes a source of modulated dark-field illumination for developing light flashes from the recorded particle tracks as the photographic plate is automatically scanned in narrow strips. Photoelectric means convert the light flashes to proportional current pulses for application to an electronic counting circuit. Photoelectric means are further provided for developing a phase reference signal from the photographic plate in such a manner that signals arising from particle tracks not parallel to the edge of the plate are out of phase with the reference signal. The counting circuit includes provision for rejecting the out-of-phase signals resulting from unoriented tracks as well as signals resulting from spurious marks on the plate such as scratches, dust or grain clumpings, etc. The output of the circuit is hence indicative only of the tracks that would be counted by a human operator.

  18. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    SciTech Connect

    Fang, Y; Huang, H; Su, T

    2015-06-15

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into texture analysis with our open-source software, the Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination
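
    The ROC evaluation step corresponds to a few calls in scikit-learn; the labels, heterogeneity scores and the Youden-index operating point below are placeholders and illustration choices, not the study's data or stated method.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # y_true: PCI outcome (1 = more than 70% stenosis); score: a texture
    # heterogeneity index per patient. Values are made-up placeholders.
    y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
    score = np.array([0.2, 0.4, 0.8, 0.7, 0.9, 0.3, 0.6, 0.5])

    auc = roc_auc_score(y_true, score)
    fpr, tpr, thr = roc_curve(y_true, score)
    best = np.argmax(tpr - fpr)   # Youden's J picks one operating point
    print(f"AUC={auc:.2f}, sens={tpr[best]:.2f}, spec={1 - fpr[best]:.2f}")
    ```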

  19. Automatic regional analysis of DTI properties in the developmental macaque brain

    NASA Astrophysics Data System (ADS)

    Styner, Martin; Knickmeyer, Rebecca; Coe, Christopher; Short, Sarah J.; Gilmore, John

    2008-03-01

    Many neuroimaging studies are performed in monkeys, as pathologies and environmental exposures can be studied in well-controlled settings. In this work, we present a framework for atlas-based, fully automatic segmentation of brain tissues, lobar parcellations and subcortical structures, and for the regional extraction of Diffusion Tensor Imaging (DTI) properties. We first built a structural atlas from training images by iterative, joint deformable registration into an unbiased average image. On this atlas, probabilistic tissue maps, a lobar parcellation and subcortical structures were determined. This information is applied to each subject's structural image via affine, followed by deformable, registration. The affinely transformed atlas is employed for a joint T1- and T2-based tissue classification. The deformed parcellation regions mask the tissue segmentations to define the parcellation for white and gray matter separately. Each subject's structural image is then non-rigidly matched with its DTI image by normalized mutual information, b-spline based registration. The DTI property histograms were then computed using the probabilistic white matter information for each lobar parcellation. We successfully built an average atlas using a developmental training dataset of 18 cases aged 16-34 months. Our framework was successfully applied to over 50 additional subjects in the age range of 9-70 months. The probabilistically weighted FA average in the corpus callosum region showed the largest increase over time in the observed age range. Most cortical regions showed a modest FA increase, whereas the cerebellum's FA values remained stable. The individual methods used in this segmentation framework have been applied before, but their combination is novel, as is their application to macaque MRI data.

  20. Automatic analysis of the Gorkha earthquake aftershock sequence: evidences of structurally-segmented seismicity

    NASA Astrophysics Data System (ADS)

    Baillard, Christian; Lyon-Caen, Hélène; Bollinger, Laurent; Rietbrock, Andreas; Letort, Jean; Adhikari, Lok Bijaya

    2017-03-01

    We present the first 3 months of aftershock activity following the 25 April 2015 MW 7.8 Gorkha earthquake recorded on the Nepalese Seismic network. We deployed an automatic procedure composed of three main stages: 1) coarse determination of the P and S onsets; 2) phase association to declare events and 3) iterative addition and refinement of onsets using the Kurtosis characteristic function. In total 9188 events could be located in the Kathmandu region with the majority having small location errors (< 4.5, 9, 10 km in the X, Y, Z directions, respectively). Additionally, we propose a new attenuation law to estimate local magnitudes in the region. This new seismic catalog reveals a detailed insight into the Gorkha aftershock sequence and its relation to the main shock rupture models and tectonic structures in the region. Most aftershocks fall within the Main Himalayan Thrust (MHT) shear zone or in its hanging-wall. Significant temporal and lateral variations of aftershock locations are observed among them: 1) three distinct stages, highlighting subsequent jump-offs at the easternmost termination, 2) the existence of a seismic gap north of Kathmandu which matches with a low slip zone in the rupture area of the mainshock, 3) the confinement of seismic activity in the trace of the 12 May MW 7.3 earthquake within the MHT and its hanging-wall through a 30 by 30 km2 region, 4) a shallow westward-dipping structure east of the Kathmandu klippe. These new observations with the inferred tectonic structures at depth suggest a tectonic control of part of the aftershock activity by the lateral breaks along the MHT and by the geometry of the duplex above the thrust.

  1. Automatic analysis of the Gorkha earthquake aftershock sequence: evidences of structurally segmented seismicity

    NASA Astrophysics Data System (ADS)

    Baillard, Christian; Lyon-Caen, Hélène; Bollinger, Laurent; Rietbrock, Andreas; Letort, Jean; Adhikari, Lok Bijaya

    2017-05-01

    We present the first 3 months of aftershock activity following the 2015 April 25 Gorkha earthquake Mw 7.8 recorded on the Nepalese Seismic network. We deployed an automatic procedure composed of three main stages: (1) coarse determination of the P and S onsets; (2) phase association to declare events and (3) iterative addition and refinement of onsets using the Kurtosis characteristic function. In total 9188 events could be located in the Kathmandu region with the majority having small location errors (<4.5, 9 and 10 km in the X-, Y- and Z-directions, respectively). Additionally, we propose a new attenuation law to estimate local magnitudes in the region. This new seismic catalogue reveals a detailed insight into the Gorkha aftershock sequence and its relation to the main shock rupture models and tectonic structures in the region. Most aftershocks fall within the Main Himalayan Thrust (MHT) shear zone or in its hangingwall. Significant temporal and lateral variations of aftershocks location are observed among them: (1) three distinct stages, highlighting subsequent jump-offs at the easternmost termination, (2) the existence of a seismic gap north of Kathmandu which matches with a low slip zone in the rupture area of the main shock, (3) the confinement of seismic activity in the trace of the May 12 Mw 7.3 earthquake within the MHT and its hangingwall through a 30 × 30 km2 region and (4) a shallow westward-dipping structure east of the Kathmandu klippe. These new observations with the inferred tectonic structures at depth suggest a tectonic control of part of the aftershock activity by the lateral breaks along the MHT and by the geometry of the duplex above the thrust.
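
    A minimal sketch of a kurtosis characteristic-function picker follows; the window length and the pick-at-steepest-gradient rule are simplifications of the staged onset refinement described above.

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    def kurtosis_onset(trace, fs, win_s=1.0):
        """Pick a phase onset from a kurtosis characteristic function.

        Sliding-window kurtosis rises sharply when an impulsive arrival
        enters the window; the onset is taken at the steepest positive
        gradient of that function.
        """
        n = int(win_s * fs)
        cf = np.full(len(trace), np.nan)
        for i in range(n, len(trace)):
            cf[i] = kurtosis(trace[i - n:i])
        grad = np.gradient(cf)
        return int(np.nanargmax(grad))   # sample index of the picked onset
    ```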

  2. Automatic analysis of slips of the tongue: Insights into the cognitive architecture of speech production.

    PubMed

    Goldrick, Matthew; Keshet, Joseph; Gustafson, Erin; Heller, Jordana; Needle, Jeremy

    2016-04-01

    Traces of the cognitive mechanisms underlying speaking can be found within subtle variations in how we pronounce sounds. While speech errors have traditionally been seen as categorical substitutions of one sound for another, acoustic/articulatory analyses show they partially reflect the intended sound. When "pig" is mispronounced as "big," the resulting /b/ sound differs from correct productions of "big," moving towards intended "pig"-revealing the role of graded sound representations in speech production. Investigating the origins of such phenomena requires detailed estimation of speech sound distributions; this has been hampered by reliance on subjective, labor-intensive manual annotation. Computational methods can address these issues by providing for objective, automatic measurements. We develop a novel high-precision computational approach, based on a set of machine learning algorithms, for measurement of elicited speech. The algorithms are trained on existing manually labeled data to detect and locate linguistically relevant acoustic properties with high accuracy. Our approach is robust, is designed to handle mis-productions, and overall matches the performance of expert coders. It allows us to analyze a very large dataset of speech errors (containing far more errors than the total in the existing literature), illuminating properties of speech sound distributions previously impossible to reliably observe. We argue that this provides novel evidence that two sources both contribute to deviations in speech errors: planning processes specifying the targets of articulation and articulatory processes specifying the motor movements that execute this plan. These findings illustrate how a much richer picture of speech provides an opportunity to gain novel insights into language processing. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Automatic Analysis of Slips of the Tongue: Insights into the Cognitive Architecture of Speech Production

    PubMed Central

    Goldrick, Matthew; Keshet, Joseph; Gustafson, Erin; Heller, Jordana; Needle, Jeremy

    2016-01-01

    Traces of the cognitive mechanisms underlying speaking can be found within subtle variations in how we pronounce sounds. While speech errors have traditionally been seen as categorical substitutions of one sound for another, acoustic/articulatory analyses show they partially reflect the intended sound. When “pig” is mispronounced as “big,” the resulting /b/ sound differs from correct productions of “big,” moving towards intended “pig”—revealing the role of graded sound representations in speech production. Investigating the origins of such phenomena requires detailed estimation of speech sound distributions; this has been hampered by reliance on subjective, labor-intensive manual annotation. Computational methods can address these issues by providing for objective, automatic measurements. We develop a novel high-precision computational approach, based on a set of machine learning algorithms, for measurement of elicited speech. The algorithms are trained on existing manually labeled data to detect and locate linguistically relevant acoustic properties with high accuracy. Our approach is robust, is designed to handle mis-productions, and overall matches the performance of expert coders. It allows us to analyze a very large dataset of speech errors (containing far more errors than the total in the existing literature), illuminating properties of speech sound distributions previously impossible to reliably observe. We argue that this provides novel evidence that two sources both contribute to deviations in speech errors: planning processes specifying the targets of articulation and articulatory processes specifying the motor movements that execute this plan. These findings illustrate how a much richer picture of speech provides an opportunity to gain novel insights into language processing. PMID:26779665

  4. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling.

    PubMed

    Gardi, J E; Nyengaard, J R; Gundersen, H J G

    2008-03-01

    The proportionator is a novel and radically different approach to sampling with microscopes, based on well-known statistical theory (probability proportional to size, or PPS, sampling). It uses automatic image analysis, with a large range of options, to assign to every field of view in the section a weight proportional to some characteristic of the structure under study. A typical and very simple example, examined here, is the amount of color characteristic of the structure, marked with a stain with known properties; the color may be specific or not. From the recorded list of weights over all fields, the desired number of fields is sampled automatically with probability proportional to weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness does not depend on the assumed relation between the weight and the structure, which in practice is always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator does, however, depend directly on this relation being positive. The sampling and estimation procedure is simulated in sections with realistic characteristics and various kinds of noise. In all cases examined, the proportionator is 2- to 15-fold more efficient than common systematic, uniformly random sampling. The simulations also indicate that the lack of a simple predictor of the coefficient of error (CE) due to field-to-field variation is a more severe problem for uniform sampling strategies than anticipated. Because of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to
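
    The core of the approach, PPS sampling plus an unbiasedness-preserving estimator, fits in a few lines. The sketch below uses synthetic weights and counts (all hypothetical) and a Hansen-Hurwitz-style estimator for with-replacement PPS sampling; the estimate is unbiased regardless of how well the weights track the true counts, while its variance shrinks as the correlation improves.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical fields of view: a weight per field (e.g. amount of stain
    # color found by image analysis) and the true particle count per field.
    weights = rng.uniform(0.1, 5.0, size=1000)
    true_counts = rng.poisson(2.0 * weights)          # counts roughly follow weight

    n = 50                                            # fields shown to the observer
    p = weights / weights.sum()                       # per-draw selection probability
    sample = rng.choice(weights.size, size=n, p=p)    # PPS sampling, with replacement

    # Hansen-Hurwitz estimator of the total count: mean of y_i / p_i.
    # Unbiased for any positive weights; efficient when weight tracks count.
    est_total = np.mean(true_counts[sample] / p[sample])
    print(round(est_total), true_counts.sum())
    ```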

  5. A Global Approach to Accurate and Automatic Quantitative Analysis of NMR Spectra by Complex Least-Squares Curve Fitting

    NASA Astrophysics Data System (ADS)

    Martin, Y. L.

    The performance of quantitative analysis of 1D NMR spectra depends greatly on the choice of the NMR signal model. Complex least-squares analysis is well suited for optimizing the quantitative determination of spectra containing a limited number of signals (<30) obtained under satisfactory conditions of signal-to-noise ratio (>20). From a general point of view it is concluded, on the basis of mathematical considerations and numerical simulations, that, in the absence of truncation of the free-induction decay, complex least-squares curve fitting either in the time or in the frequency domain and linear-prediction methods are in fact nearly equivalent and give identical results. However, in the situation considered, complex least-squares analysis in the frequency domain is more flexible since it enables the quality of convergence to be appraised at every resonance position. An efficient data-processing strategy has been developed which makes use of an approximate conjugate-gradient algorithm. All spectral parameters (frequency, damping factors, amplitudes, phases, initial delay associated with intensity, and phase parameters of a baseline correction) are simultaneously managed in an integrated approach which is fully automatable. The behavior of the error as a function of the signal-to-noise ratio is theoretically estimated, and the influence of apodization is discussed. The least-squares curve fitting is theoretically proved to be the most accurate approach for quantitative analysis of 1D NMR data acquired with reasonable signal-to-noise ratio. The method enables complex spectral residuals to be sorted out. These residuals, which can be cumulated thanks to the possibility of correcting for frequency shifts and phase errors, extract systematic components, such as isotopic satellite lines, and characterize the shape and the intensity of the spectral distortion with respect to the Lorentzian model. This distortion is shown to be nearly independent of the chemical species
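
    As a concrete illustration of complex least-squares fitting in the frequency domain, the sketch below fits a single complex Lorentzian by stacking real and imaginary residuals. It is a minimal stand-in for the paper's integrated approach (no baseline, initial delay, or multi-line handling), and the line-shape parameterization and starting values are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def lorentzian(f, f0, lw, amp, phase):
        """Complex Lorentzian line in the frequency domain."""
        return amp * np.exp(1j * phase) / (lw + 1j * (f - f0))

    def residuals(params, f, spec):
        r = lorentzian(f, *params) - spec
        return np.concatenate([r.real, r.imag])   # complex fit = joint real/imag fit

    f = np.linspace(-50, 50, 2048)
    rng = np.random.default_rng(0)
    spec = lorentzian(f, 3.0, 1.5, 100.0, 0.2) \
         + 0.5 * (rng.normal(size=f.size) + 1j * rng.normal(size=f.size))

    # Starting values would come from a peak pick in practice.
    fit = least_squares(residuals, x0=[2.5, 1.0, 50.0, 0.0], args=(f, spec))
    print(fit.x)   # approximately [3.0, 1.5, 100.0, 0.2]
    ```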

  6. Interconnecting smartphone, image analysis server, and case report forms for automatic skin lesion tracking in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subjects' image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system in clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images, which have been collected in the study so far, have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced and, therefore, errors, latency, and costs decrease. Our approach also increases data security and privacy.
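
    The card-based calibration step can be illustrated with a linear color-correction fit. The sketch below estimates a 3×3 matrix mapping photographed patch colors to their known reference values by least squares; all patch values are hypothetical, and a production pipeline would likely add an offset term, nonlinearity handling, and the geometric correction.

    ```python
    import numpy as np

    # measured: RGB values of the reference-card patches as photographed
    # reference: their known ground-truth RGB values (both hypothetical here)
    measured = np.array([[52, 40, 38], [200, 190, 185], [120, 60, 50],
                         [60, 120, 140], [90, 90, 90], [220, 120, 40]], dtype=float)
    reference = np.array([[60, 45, 40], [210, 205, 200], [130, 70, 55],
                          [65, 130, 150], [100, 100, 100], [230, 130, 45]], dtype=float)

    # Solve measured @ M ~= reference in the least-squares sense.
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

    def calibrate(image_rgb):
        """Apply the fitted 3x3 correction to an (H, W, 3) float image."""
        return np.clip(image_rgb @ M, 0, 255)
    ```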

  7. Automatic quantitative computed tomography segmentation and analysis of aerated lung volumes in acute respiratory distress syndrome-A comparative diagnostic study.

    PubMed

    Klapsing, Philipp; Herrmann, Peter; Quintel, Michael; Moerer, Onnen

    2016-11-08

    Quantitative lung computed tomographic (CT) analysis yields objective data regarding lung aeration but is currently not used in clinical routine primarily because of the labor-intensive process of manual CT segmentation. Automatic lung segmentation could help to shorten processing times significantly. In this study, we assessed bias and precision of lung CT analysis using automatic segmentation compared with manual segmentation. In this monocentric clinical study, 10 mechanically ventilated patients with mild to moderate acute respiratory distress syndrome were included who had received lung CT scans at 5- and 45-mbar airway pressure during a prior study. Lung segmentations were performed both automatically using a computerized algorithm and manually. Automatic segmentation yielded similar lung volumes compared with manual segmentation with clinically minor differences both at 5 and 45 mbar. At 5 mbar, results were as follows: overdistended lung 49.58 mL (manual, SD 77.37 mL) and 50.41 mL (automatic, SD 77.3 mL), P = .028; normally aerated lung 2142.17 mL (manual, SD 1131.48 mL) and 2156.68 mL (automatic, SD 1134.53 mL), P = .1038; and poorly aerated lung 631.68 mL (manual, SD 196.76 mL) and 646.32 mL (automatic, SD 169.63 mL), P = .3794. At 45 mbar, values were as follows: overdistended lung 612.85 mL (manual, SD 449.55 mL) and 615.49 mL (automatic, SD 451.03 mL), P = .078; normally aerated lung 3890.12 mL (manual, SD 1134.14 mL) and 3907.65 mL (automatic, SD 1133.62 mL), P = .027; and poorly aerated lung 413.35 mL (manual, SD 57.66 mL) and 469.58 mL (automatic, SD 70.14 mL), P = .007. Bland-Altman analyses revealed the following mean biases and limits of agreement at 5 mbar for automatic vs manual segmentation: overdistended lung +0.848 mL (±2.062 mL), normally aerated +14.51 mL (±49.71 mL), and poorly aerated +14.64 mL (±98.16 mL). At 45 mbar, results were as follows: overdistended +2.639 mL (±8.231 mL), normally aerated 17.53 mL (±41.41 mL), and poorly aerated 56.23 mL (±100.67 mL). Automatic
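
    The volumetric compartments reported above are standard quantitative-CT quantities, so the counting step is easy to sketch. The function below bins lung voxels by Hounsfield-unit ranges; the cut-offs shown are the conventional aeration compartments, assumed here since the abstract does not state the exact thresholds used.

    ```python
    import numpy as np

    def aeration_volumes(hu, lung_mask, voxel_volume_ml):
        """Classify lung voxels by HU range and return volumes in mL.

        The HU ranges are the conventional quantitative-CT compartments
        (an assumption; the study's exact cut-offs are not given here).
        """
        vals = hu[lung_mask]
        compartments = {
            "overdistended":    (-1000, -901),
            "normally_aerated": (-900, -501),
            "poorly_aerated":   (-500, -101),
            "non_aerated":      (-100,  100),
        }
        return {name: np.count_nonzero((vals >= lo) & (vals <= hi)) * voxel_volume_ml
                for name, (lo, hi) in compartments.items()}
    ```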

  8. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    NASA Astrophysics Data System (ADS)

    Chan, T. O.; Lichti, D. D.; Belton, D.

    2013-10-01

    At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high-accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, calibration is complicated by the Velodyne LiDAR's narrow vertical field of view and the highly time-variant nature of its measurements. In the paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method exploits the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point cloud based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model in such a way that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automatic, and this allows end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in a scene. The methods were verified with two different real datasets, and the results suggest that up to 78
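
    The per-layer circle extraction can be sketched with a reduced Hough transform. The code below votes for circle centres at a known radius from scattered 2-D layer points; the paper's generalized variant would also scan over radius, and the cell size, angular sampling, and fixed radius here are all assumptions.

    ```python
    import numpy as np

    def circle_hough(points_xy, radius, cell=0.05):
        """Vote for circle centres at a fixed radius: each 2-D point votes
        for candidate centres on a circle of that radius around itself
        (a reduced Hough transform over centre position only).
        """
        thetas = np.linspace(0, 2 * np.pi, 72, endpoint=False)
        ring = radius * np.stack([np.cos(thetas), np.sin(thetas)], axis=-1)
        centres = (points_xy[:, None, :] + ring).reshape(-1, 2)   # (N*72, 2)

        origin = centres.min(axis=0)
        idx = np.floor((centres - origin) / cell).astype(int)
        acc = np.zeros(idx.max(axis=0) + 1, dtype=int)            # accumulator grid
        np.add.at(acc, tuple(idx.T), 1)

        peak = np.unravel_index(acc.argmax(), acc.shape)
        return origin + (np.array(peak) + 0.5) * cell             # estimated centre
    ```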

  9. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video are time and labor intensive processes. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
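
    Once the per-frame statistics have been parsed out of the encoder's first-pass log (the parsing is format-specific and not shown), event detection can be as simple as flagging outliers in one statistic. The sketch below uses a robust median/MAD z-score so the events themselves do not inflate the scale; the threshold is an assumed tuning parameter.

    ```python
    import numpy as np

    def detect_events(frame_stats, thresh=4.0):
        """Flag frames whose compression statistic deviates from the norm.

        `frame_stats` is a 1-D array of one per-frame statistic (e.g.
        predicted texture bits) already extracted from the first-pass log.
        Uses a robust z-score (median / MAD) so rare events do not distort
        the scale estimate.
        """
        med = np.median(frame_stats)
        mad = np.median(np.abs(frame_stats - med)) + 1e-12
        z = 0.6745 * (frame_stats - med) / mad
        return np.flatnonzero(np.abs(z) > thresh)  # indices of candidate events
    ```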

  10. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    PubMed

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
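
    A minimal version of the unmixing-plus-matching idea can be written with scikit-learn's FastICA. In the hypothetical sketch below, measured mixture spectra are decomposed into independent sources and each source is matched to a reference library by absolute correlation (ICA leaves sign and scale arbitrary); the paper's actual metrics and reliability factor are more elaborate.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    def identify_components(mixtures, library, n_components=2):
        """Unmix Raman mixture spectra and match sources to a library.

        mixtures: (n_spectra, n_wavenumbers) array of measured spectra
                  (n_spectra must be >= n_components).
        library:  dict mapping pigment name -> reference spectrum.
        Returns a list of (best-matching pigment, correlation score).
        """
        ica = FastICA(n_components=n_components, random_state=0)
        sources = ica.fit_transform(mixtures.T).T   # (n_components, n_wavenumbers)
        matches = []
        for s in sources:
            scores = {name: abs(np.corrcoef(s, ref)[0, 1])
                      for name, ref in library.items()}
            best = max(scores, key=scores.get)
            matches.append((best, scores[best]))    # score acts like a reliability factor
        return matches
    ```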

  11. A normative spatiotemporal MRI atlas of the fetal brain for automatic segmentation and analysis of early brain growth.

    PubMed

    Gholipour, Ali; Rollins, Caitlin K; Velasco-Annis, Clemente; Ouaalam, Abdelhakim; Akhondi-Asl, Alireza; Afacan, Onur; Ortinau, Cynthia M; Clancy, Sean; Limperopoulos, Catherine; Yang, Edward; Estroff, Judy A; Warfield, Simon K

    2017-03-28

    Longitudinal characterization of early brain growth in-utero has been limited by a number of challenges in fetal imaging, the rapid change in size, shape and volume of the developing brain, and the consequent lack of suitable algorithms for fetal brain image analysis. There is a need for an improved digital brain atlas of the spatiotemporal maturation of the fetal brain extending over the key developmental periods. We have developed an algorithm for construction of an unbiased four-dimensional atlas of the developing fetal brain by integrating symmetric diffeomorphic deformable registration in space with kernel regression in age. We applied this new algorithm to construct a spatiotemporal atlas from MRI of 81 normal fetuses scanned between 19 and 39 weeks of gestation and labeled the structures of the developing brain. We evaluated the use of this atlas and additional individual fetal brain MRI atlases for completely automatic multi-atlas segmentation of fetal brain MRI. The atlas is available online as a reference for anatomy and for registration and segmentation, to aid in connectivity analysis, and for groupwise and longitudinal analysis of early brain growth.
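
    The age-weighting half of the atlas construction, kernel regression over gestational age, is simple to sketch once images are spatially registered (the symmetric diffeomorphic registration itself is the hard part and is omitted here). The bandwidth below is an assumed value.

    ```python
    import numpy as np

    def age_kernel_atlas(images, ages, target_age, sigma=1.0):
        """Kernel-regression template at `target_age` (weeks): a Gaussian-
        weighted voxelwise average of already-registered images. `sigma`
        sets the temporal window width (assumed value).
        """
        ages = np.asarray(ages, dtype=float)
        w = np.exp(-0.5 * ((ages - target_age) / sigma) ** 2)
        w /= w.sum()
        # contract the subject axis: (n,) x (n, X, Y, Z) -> (X, Y, Z)
        return np.tensordot(w, np.asarray(images, dtype=float), axes=1)
    ```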

  12. A Q-GERT Analysis of the Effect of Improved Automatic Testing on F-16 Aircraft Availability.

    DTIC Science & Technology

    1983-09-01

    September 1982. Guerra, Joel A., Lesko, Andrew J., & Pereira, Jose. Operating and support cost model for avionics automatic test equipment... 1981. ... Corporation. Automatic test equipment. Final report No. RL-R-80-3, U.S. Army Missile Command, Redstone Arsenal, AL, February 1980.

  13. SplitRacer - a semi-automatic tool for the analysis and interpretation of teleseismic shear-wave splitting

    NASA Astrophysics Data System (ADS)

    Reiss, Miriam Christina; Rümpker, Georg

    2017-04-01

    We present a semi-automatic, graphical-user-interface tool for the analysis and interpretation of teleseismic shear-wave splitting in MATLAB. Shear-wave splitting analysis is a standard tool to infer seismic anisotropy, which is often interpreted as due to lattice-preferred orientation of, e.g., mantle minerals or shape-preferred orientation caused by cracks or alternating layers in the lithosphere, and hence provides a direct link to the earth's kinematic processes. The increasing number of permanent stations and temporary experiments results in comprehensive studies of seismic anisotropy world-wide. Their successive comparison with a growing number of global models of mantle flow further advances our understanding of the earth's interior. However, increasingly large data sets pose the inevitable question of how to process them. Well-established routines and programs are accurate but often slow and impractical for analyzing large amounts of data. Additionally, shear-wave splitting results are seldom evaluated using the same quality criteria, which complicates a straightforward comparison. SplitRacer consists of several processing steps: i) download of data via FDSNWS; ii) direct reading of miniSEED files and an initial screening and categorizing of XKS waveforms using a pre-set SNR threshold; iii) analysis of the particle motion of selected phases and successive correction of the sensor misalignment based on the long axis of the particle motion; iv) splitting analysis of selected events: seismograms are first rotated into radial and transverse components, then the energy-minimization method is applied, which provides the polarization and delay time of the phase; to estimate errors, the analysis is done for different randomly chosen time windows; v) joint splitting analysis of all events for one station, where the energy content of all phases is inverted simultaneously. This decreases the influence of noise and increases the robustness of the measurement
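
    Step iv), the energy-minimization splitting measurement, reduces to a grid search. The sketch below is a minimal Silver-and-Chan-style search over fast-axis angle and delay; rotation sign conventions vary between packages, windowing and error estimation are omitted, and the wrap-around of np.roll is ignored for brevity, so treat it as an assumed simplification rather than SplitRacer's implementation.

    ```python
    import numpy as np

    def splitting_grid_search(north, east, max_lag, baz_rad):
        """Find the (fast-axis angle, delay) minimizing transverse energy.

        north/east: windowed horizontal components around an XKS phase.
        max_lag:    maximum delay to test, in samples.
        baz_rad:    back-azimuth in radians (one common rotation convention;
                    the sign does not affect the transverse energy).
        """
        best = (None, None, np.inf)
        for phi in np.deg2rad(np.arange(-90, 90, 1)):
            # rotate N/E into the candidate fast/slow frame
            fast = north * np.cos(phi) + east * np.sin(phi)
            slow = -north * np.sin(phi) + east * np.cos(phi)
            for lag in range(max_lag):
                slow_adv = np.roll(slow, -lag)          # undo the candidate delay
                # back to N/E, then project onto the transverse direction
                n = fast * np.cos(phi) - slow_adv * np.sin(phi)
                e = fast * np.sin(phi) + slow_adv * np.cos(phi)
                t = -n * np.sin(baz_rad) + e * np.cos(baz_rad)
                energy = np.sum(t ** 2)
                if energy < best[2]:
                    best = (np.rad2deg(phi), lag, energy)
        return best   # (fast axis in degrees, delay in samples, residual energy)
    ```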

  14. [Reliability of % vol. declarations on labels of wine bottles].

    PubMed

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

    Council Regulation (EC) No. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (Abl. L 179 dated 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles have to indicate, among other things, the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength as established by analysis. Only when quality wines are stored in bottles for more than three years are the accepted tolerance limits +/- 0.8% vol. The presented investigation results show that deviations have to be taken into account which may be highly relevant for forensic practice.
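
    The tolerance rule itself is a one-liner; a small helper with the two tolerance tiers taken directly from the abstract:

    ```python
    def label_within_tolerance(label_pct, measured_pct, bottle_aged_3y=False):
        """EC 1493/1999 labelling check: the measured alcoholic strength may
        deviate from the label by at most 0.5% vol (0.8% vol for quality
        wines bottle-aged more than three years)."""
        tol = 0.8 if bottle_aged_3y else 0.5
        return abs(label_pct - measured_pct) <= tol

    print(label_within_tolerance(12.5, 13.1))  # False: 0.6% vol exceeds 0.5
    ```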

  15. On 3-D modeling and automatic regridding in shape design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Yao, Tse-Min

    1987-01-01

    The material derivative idea of continuum mechanics and the adjoint variable method of design sensitivity analysis are used to obtain a computable expression for the effect of shape variations on measures of structural performance of three-dimensional elastic solids.

  16. Clarifying inconclusive functional analysis results: Assessment and treatment of automatically reinforced aggression.

    PubMed

    Saini, Valdeep; Greer, Brian D; Fisher, Wayne W

    2015-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy's functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences that maintained aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation.

  17. Clarifying Inconclusive Functional Analysis Results: Assessment and Treatment of Automatically Reinforced Aggression

    PubMed Central

    Saini, Valdeep; Greer, Brian D.; Fisher, Wayne W.

    2016-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy’s functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences maintaining aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation. PMID:25891269

  18. Comparison of the automatic analysis versus the manual scoring from ApneaLink™ device for the diagnosis of obstructive sleep apnoea syndrome.

    PubMed

    Nigro, Carlos Alberto; Dibur, Eduardo; Aimaretti, Silvia; González, Sergio; Rhodius, Edgardo

    2011-12-01

    The purpose of this study was to compare the performance of the automated detection versus the manual scoring from the ApneaLink™ device to diagnose obstructive sleep apnoea syndrome (OSAS). All 96 participants performed the ApneaLink™ (AL) and polysomnography (PSG) simultaneously in the sleep laboratory. The two recordings were interpreted blindly. The hypopnoea criterion used for the analysis of both automatic and manual ApneaLink™ scoring was a fall in airflow ≥50% of baseline for ≥10 s. The agreement between AL and PSG and the interobserver concordance were calculated. ROC analysis, sensitivity and specificity were assessed for the different ApneaLink™ and OSAS criteria. Ninety patients were included (69 men; mean age, 49.6; median RDI, 13.9; median BMI, 29.3 kg/m²). The automatic apnoea/hypopnoea index (AHI-a) showed a lower agreement with the respiratory disturbance index (RDI) than the manual apnoea/hypopnoea index (AHI-m) [AHI-a/RDI: intraclass correlation coefficient (ICC) 0.88 versus AHI-m/RDI: ICC 0.91]. The manual scoring (MS) showed similar sensitivity and higher specificity than the automatic scoring (AA) for the detection of OSAS, defined as an RDI ≥ 5 (sensitivity, AA/MS: 89%/89%; specificity, AA/MS: 60%/86.7%). The accuracy of the automatic and manual scoring of the AL was similar when OSAS was defined as an RDI ≥ 20 or 30. The ApneaLink™ manual scoring had a very good interobserver agreement (k = 0.86). The manual scoring of an ApneaLink™ recording was better than the automatic scoring in terms of agreement with the RDI and in discriminating patients with OSAS. The hand scoring did not improve the accuracy of automatic scoring in patients with severe OSAS.
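
    The stated hypopnoea criterion (airflow falling by ≥50% of baseline for ≥10 s) maps directly onto a simple envelope-threshold detector. The sketch below is a hypothetical reimplementation, not ApneaLink's algorithm: amplitude is approximated by a 1-s moving RMS, the baseline by a long moving average, and both window lengths are assumptions.

    ```python
    import numpy as np

    def detect_hypopnoeas(flow, fs, baseline_win_s=120, min_dur_s=10, drop=0.5):
        """Flag hypopnoeas: airflow amplitude <= 50% of a moving baseline
        for >= 10 s. `fs` is the sampling rate in Hz; window lengths are
        assumed values. Returns (onset_s, offset_s) pairs.
        """
        w1 = int(fs)                                  # ~1 s RMS envelope
        env = np.sqrt(np.convolve(flow ** 2, np.ones(w1) / w1, mode="same"))
        wb = int(baseline_win_s * fs)                 # slow moving baseline
        base = np.convolve(env, np.ones(wb) / wb, mode="same")
        below = env <= drop * base

        events, start = [], None
        for i, b in enumerate(below):
            if b and start is None:
                start = i
            elif not b and start is not None:
                if (i - start) / fs >= min_dur_s:
                    events.append((start / fs, i / fs))
                start = None
        return events
    ```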

  19. Adapting to Changes in the Characteristics of College-Age Youth. Policy Analysis Service Reports, Vol. 4, No. 2, December 1978.

    ERIC Educational Resources Information Center

    Henderson, Cathy; Plummer, Janet C.

    The number of 18-year-olds and characteristics of the traditional college-age population are examined. This study is a followup and expansion of an earlier study, "Changes in Enrollment by 1985," by Cathy Henderson. The analysis considers changes in characteristics of students, college attendance patterns, the effect of demographic changes on…

  20. Automatic transmission

    SciTech Connect

    Miki, N.

    1988-10-11

    This patent describes an automatic transmission including a fluid torque converter, a first gear unit having three forward-speed gears and a single reverse gear, a second gear unit having a low-speed gear and a high-speed gear, and a hydraulic control system, the hydraulic control system comprising: a source of pressurized fluid; a first shift valve for controlling the shifting between the first-speed gear and the second-speed gear of the first gear unit; a second shift valve for controlling the shifting between the second-speed gear and the third-speed gear of the first gear unit; a third shift valve equipped with a spool having two positions for controlling the shifting between the low-speed gear and the high-speed gear of the second gear unit; a manual selector valve having a plurality of shift positions for distributing the pressurized fluid supply from the source of pressurized fluid to the first, second and third shift valves respectively; first, second and third solenoid valves corresponding to the first, second and third shift valves, respectively for independently controlling the operation of the respective shift valves, thereby establishing a six forward-speed automatic transmission by combining the low-speed gear and the high-speed gear of the second gear unit with each of the first-speed gear, the second speed gear and the third-speed gear of the first gear unit; and means to fixedly position the spool of the third shift valve at one of the two positions by supplying the pressurized fluid to the third shift valve when the manual selector valve is shifted to a particular shift position, thereby locking the second gear unit in one of low-speed gear and the high-speed gear, whereby the six forward-speed automatic transmission is converted to a three forward-speed automatic transmission when the manual selector valve is shifted to the particular shift position.

  1. Automatic transmission

    SciTech Connect

    Ohkubo, M.

    1988-02-16

    An automatic transmission is described combining a stator reversing type torque converter and speed changer having first and second sun gears comprising: (a) a planetary gear train composed of first and second planetary gears sharing one planetary carrier in common; (b) a clutch and requisite brakes to control the planetary gear train; and (c) a speed-increasing or speed-decreasing mechanism is installed both in between a turbine shaft coupled to a turbine of the stator reversing type torque converter and the first sun gear of the speed changer, and in between a stator shaft coupled to a reversing stator and the second sun gear of the speed changer.

  2. Automatic Seismic Signal Processing

    DTIC Science & Technology

    1982-02-04

    FINAL TECHNICAL REPORT - ROUTINE AUTOMATIC SEISMIC ANALYSIS TECHNICAL PACKAGE. Seismic Analysis Package. ARPA Order Number: 4199. Name of Contractor: ENSCO, Inc. Contract Number: F08606-80-C-0021. Effective Date of Contract: 10... developed and demonstrated. This timing detector algorithm times the start time of signals and their envelope peaks. It was designed to measure the size

  3. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    SciTech Connect

    Wei, J; Yuan, A; Li, G

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demon algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this and the TPS is <±2% and the time saving is by 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
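
    The voxel-counting step behind the lung volume comparison is tiny in code. A minimal helper (the spacing values in the usage comment are placeholders):

    ```python
    import numpy as np

    def volume_ml(mask, spacing_mm):
        """Voxel-counting volume: number of segmented voxels times the
        voxel volume. `spacing_mm` is the (x, y, z) voxel spacing from
        the CT header; mm^3 are converted to mL."""
        voxel_ml = float(np.prod(spacing_mm)) / 1000.0
        return np.count_nonzero(mask) * voxel_ml

    # e.g. per-phase lung volumes across the phases of a 4DCT:
    # volumes = [volume_ml(m, (0.98, 0.98, 2.0)) for m in phase_masks]
    ```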

  4. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    PubMed Central

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60%, with a range of 3.47%–40.00% across subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), CHAID decision tree analysis also identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of populations with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
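
    scikit-learn has no CHAID implementation, so the sketch below uses a CART decision tree as an explicitly named stand-in to illustrate the tree-based interaction analysis; the data are synthetic and merely shaped like the study's predictors (all values hypothetical).

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Hypothetical binary-encoded risk factors per infant (rows):
    # maternal anemia in pregnancy, exclusive breastfeeding, floating
    # population, maternal education level.
    rng = np.random.default_rng(42)
    X = rng.integers(0, 2, size=(1091, 4))
    risk = 0.05 + 0.15 * X[:, 0] + 0.10 * X[:, 2]   # synthetic risk structure
    y = rng.random(1091) < risk                      # anemic yes/no

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
    print(export_text(tree, feature_names=[
        "maternal_anemia", "excl_breastfeeding", "floating_population", "edu_level"]))
    ```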

  5. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis

    PubMed Central

    Stamile, Claudio; Kocevar, Gabriel; Cotton, François; Durand-Dubief, Françoise; Hannoun, Salem; Frindel, Carole; Guttmann, Charles R. G.; Rousseau, David; Sappey-Marinier, Dominique

    2016-01-01

    Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect local and global longitudinal changes of diffusivity metrics in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: 1) co-registration and diffusion metrics computation, 2) tractography, bundle extraction and processing, and 3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections and allowing sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber scale approach to detect small longitudinal alterations. PMID:27224308

  6. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
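
    The point-cloud-to-voxel-model step can be sketched as an occupancy voxelization; the subsequent export of each occupied voxel as a solid finite element is solver-specific and omitted. A minimal numpy version, with an assumed voxel size:

    ```python
    import numpy as np

    def voxelize(points, voxel_size):
        """Occupancy voxelization of an (N, 3) point cloud: the first step
        from a laser scan toward a voxel-element FE model. Returns a
        boolean grid plus its origin so each True voxel can later be
        emitted as a solid element at a known position."""
        origin = points.min(axis=0)
        idx = np.floor((points - origin) / voxel_size).astype(int)
        grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
        grid[tuple(idx.T)] = True
        return grid, origin

    # usage: grid, origin = voxelize(cloud_xyz, voxel_size=0.25)  # metres, assumed
    ```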

  7. Direct analysis of oligomeric tackifying resins in rubber compounds by automatic thermal desorption gas chromatography/mass spectrometry

    PubMed

    Kim

    1999-01-01

    Two analytical methods, automatic thermal desorption gas chromatography/mass spectrometry (ATD-GC/MS) and pyrolysis gas chromatography/mass spectrometry (Py-GC/MS), were applied as direct methods for the analysis of oligomeric tackifying resins in a vulcanized rubber. The ATD-GC/MS method, based on discontinuous volatile extraction, was found to be an effective means for direct analysis of the oligomeric tackifying resins contained in a vulcanized rubber. The oligomeric tackifying resins, such as t-octylphenolformaldehyde (TOPF) resin, rosin-modified terpene resin, and cashew resin, could be directly analyzed in vulcanized rubber by ATD-GC/MS. Much simpler total ion chromatograms were obtained by ATD-GC/MS than by flash pyrolysis with a Curie-point pyrolyzer, permitting much easier interpretation. Ions at m/z 206, 135, and 107 were fingerprints in the characteristic mass spectra obtained by ATD-GC/MS for TOPF resin in the vulcanized rubber. 1H-Indene, styrene, and isolongifolene were observed as their characteristic mass spectra in the pyrolyzate of the rosin-modified terpene resin. From the cashew resin, phenol, 3-methylphenol, and 4-(1,1,3, 3-tetramethylbutyl)phenol were obtained as the characteristic pyrolyzates by discontinuous thermal extraction via ATD-GC/MS. Copyright 1999 John Wiley & Sons, Ltd.

  8. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978

  9. Automatic transmission

    SciTech Connect

    Aoki, H.

    1989-03-21

    An automatic transmission is described, comprising: a torque converter including an impeller having a connected member, a turbine having an input member, and a reactor; and an automatic transmission mechanism having first to third clutches and plural gear units including a single planetary gear unit with a ring gear and a dual planetary gear unit with a ring gear. The single and dual planetary gear units have respective carriers integrally coupled with each other and respective sun gears integrally coupled with each other, the input member of the turbine being coupled with the ring gear of the single planetary gear unit through the first clutch, and being coupled with the sun gear through the second clutch. The connected member of the impeller is coupled with the ring gear of the dual planetary gear unit, the ring gear of the dual planetary gear unit is made to be restrained as required, and the carrier is coupled with an output member.

  10. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
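
    PRECiSA produces verified symbolic bounds; a quick empirical (and explicitly non-verified) way to see the round-off it bounds is to compare a float evaluation against exact rational arithmetic, which Python's fractions module supports since every binary float is an exact rational. The polynomial and evaluation point below are arbitrary examples.

    ```python
    from fractions import Fraction

    def horner_float(coeffs, x):
        """Evaluate a polynomial (highest coefficient first) in float64."""
        acc = 0.0
        for c in coeffs:
            acc = acc * x + c
        return acc

    def horner_exact(coeffs, x):
        """Same evaluation in exact rational arithmetic (no round-off)."""
        acc, xf = Fraction(0), Fraction(x)
        for c in coeffs:
            acc = acc * xf + Fraction(c)
        return acc

    coeffs = [1.0, -3.0, 3.0, -1.0]          # (x - 1)^3, ill-conditioned near 1
    x = 1.0000001
    err = abs(Fraction(horner_float(coeffs, x)) - horner_exact(coeffs, x))
    print(float(err))   # actual round-off error of the float64 evaluation
    ```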

  11. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    ERIC Educational Resources Information Center

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  12. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    NASA Astrophysics Data System (ADS)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.

  13. "PolyCAFe"--Automatic Support for the Polyphonic Analysis of CSCL Chats

    ERIC Educational Resources Information Center

    Trausan-Matu, Stefan; Dascalu, Mihai; Rebedea, Traian

    2014-01-01

    Chat conversations and other types of online communication environments are widely used within CSCL educational scenarios. However, there is a lack of theoretical and methodological background for the analysis of collaboration. Manual assessing of non-moderated chat discussions is difficult and time-consuming, having as a consequence that learning…

  14. Automatic analysis of nuclear-magnetic-resonance-spectroscopy clinical research data

    NASA Astrophysics Data System (ADS)

    Scott, Katherine N.; Wilson, David C.; Bruner, Angela P.; Lyles, Teresa A.; Underhill, Brandon; Geiser, Edward A.; Ballinger, J. Ray; Scott, James D.; Stopka, Christine B.

    1998-03-01

    A major problem of P-31 nuclear magnetic resonance spectroscopy (MRS) in vivo applications is that when large data sets are acquired, the time invested in data reduction and analysis with currently available technologies may totally overshadow the time required for data acquisition. An example is our MRS monitoring of exercise therapy for patients with peripheral vascular disease. In these studies, spectral acquisition requires 90 minutes per patient study, whereas data analysis and reduction requires 6-8 hours. Our laboratory currently uses the proprietary software SA/GE developed by General Electric; other software packages have similar limitations. When data analysis takes this long, the researcher does not have the rapid feedback required to ascertain the quality of the data acquired or the result of the study. This is highly undesirable even in a research environment, and becomes intolerable in the clinical setting. The purpose of this report is to outline progress towards the development of an automated method for eliminating the spectral analysis burden on the researcher working in the clinical setting.

  15. Automatic co-registration of space-based sensors for precision change detection and analysis

    NASA Technical Reports Server (NTRS)

    Bryant, N.; Zobrist, A.; Logan, T.

    2003-01-01

    A variety of techniques were developed at JPL to assure sub-pixel co-registration of scenes and ortho-rectification of satellite imagery to other georeferenced information to permit precise change detection and analysis of low and moderate resolution space sensors.

  16. Statistical Approaches to Automatic Indexing.

    ERIC Educational Resources Information Center

    Harter, Stephen P.

    1978-01-01

    Views automatic indexing as a two-tiered word frequency analysis that involves selection of a technical vocabulary and identification of document keywords. Assumptions, criteria, evaluation, and relevance are discussed. (JD)

  17. Analysis of automatic match results for cone-beam computed tomography localization of conventionally fractionated lung tumors.

    PubMed

    Grams, Michael P; Brown, Lindsay C; Brinkmann, Debra H; Pafundi, Deanna H; Mundy, Daniel W; Garces, Yolanda I; Park, Sean S; Olivier, Kenneth R; de los Santos, Luis E Fong

    2014-01-01

    To evaluate the dependence of an automatic match process on the size of the user-defined region of interest (ROI), the structure volume of interest (VOI), and changes in tumor volume when using cone-beam computed tomography (CBCT) for tumor localization and to compare these results with a gold standard defined by a physician's manual match. Daily CBCT images for 11 patients with lung cancer treated with conventionally fractionated radiation therapy were retrospectively matched to a reference CT image using the Varian On Board Imager software (Varian, Palo Alto, CA) and a 3-step automatic matching protocol. Matches were performed with 3 ROI sizes (small, medium, large), with and without a structure VOI (internal target volume [ITV] or planning target volume [PTV]) used in the last step. Additionally, matches were performed using an intensity range that isolated the bony anatomy of the spinal column. All automatic matches were compared with a manual match made by a physician. The CBCT images from 109 fractions were analyzed. Automatic match results depend on ROI size and the structure VOI. Compared with the physician's manual match, automatic matches using the PTV as the structure VOI and a small ROI resulted in differences ≥ 5 mm in 1.8% of comparisons. Automatic matches using no VOI and a large ROI differed by ≥ 5 mm in 30.3% of comparisons. Differences between manual and automatic matches using the ITV as the structure VOI increased as tumor size decreased during the treatment course. Users of automatic matching techniques should carefully consider how user-defined parameters affect tumor localization. Automatic matches using the PTV as the structure VOI and a small ROI were most consistent with a physician's manual match, and were independent of volumetric tumor changes. Copyright © 2014 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  1. Enzymatic Microreactors for the Determination of Ethanol by an Automatic Sequential Injection Analysis System

    NASA Astrophysics Data System (ADS)

    Alhadeff, Eliana M.; Salgado, Andrea M.; Cos, Oriol; Pereira, Nei; Valdman, Belkis; Valero, Francisco

    A sequential injection analysis system with two enzymatic microreactors for the determination of ethanol has been designed. Alcohol oxidase and horseradish peroxidase were separately immobilized on glass aminopropyl beads, and packed in 0.91-mL volume microreactors, working in line with the sequential injection analysis system. A stop flow of 120 s was selected for a linear ethanol range of 0.005-0.04 g/L (±0.6% relative standard deviation), with a throughput of seven analyses per hour. The system was applied to measure ethanol concentrations in samples of distilled and nondistilled alcoholic beverages, and of alcoholic fermentation, with good performance and no significant difference compared with other analytical procedures (gas chromatography and high-performance liquid chromatography).
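
    The quantification step behind such a system is a linear calibration inside the stated working range. A hypothetical sketch (the detector response values are invented placeholders):

    ```python
    import numpy as np

    # Calibration standards inside the reported linear range (0.005-0.04 g/L);
    # the response values below are placeholders, not measured data.
    conc = np.array([0.005, 0.01, 0.02, 0.03, 0.04])   # g/L ethanol standards
    resp = np.array([0.11, 0.21, 0.43, 0.62, 0.84])    # detector units (hypothetical)

    slope, intercept = np.polyfit(conc, resp, 1)       # straight-line calibration

    def ethanol_g_per_l(signal):
        """Invert the linear calibration; valid only inside the fitted range."""
        return (signal - intercept) / slope
    ```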

  2. Toward Automatic Scalability Analysis of Message Passing Programs: A Case Study

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekhar R.; Mehra, Pankaj; Block, Robert; Tucker, Deanne (Technical Monitor)

    1994-01-01

    Scalability analysis forms an important component of any performance debugging cycle for massively parallel machines. However, tools that help in performing such analysis for parallel programs are non-existent. The primary reason for the lack of such tools is the complexity involved in capturing program dynamics such as communication-computation overlap, communication latencies and memory hierarchy reference patterns. In this paper, we highlight some simple techniques that can be used to study the scalability of explicit message-passing parallel programs while considering the above issues. We start from the high-level source code and use a methodology for deducing communication characteristics and their impact on the total execution time of the program. The approach is validated with the help of a pipelined method for solving scalar tri-diagonal systems, using both simulations and symbolic cost models on the Intel hypercube.

  3. Sensitivity analysis of automatic flight control systems using singular value concepts

    NASA Technical Reports Server (NTRS)

    Herrera-Vaillard, A.; Paduano, J.; Downing, D.

    1985-01-01

    A sensitivity analysis is presented that can be used to judge the impact of vehicle dynamic model variations on the relative stability of multivariable continuous closed-loop control systems. The sensitivity analysis uses and extends the singular-value concept by developing expressions for the gradients of the singular value with respect to variations in the vehicle dynamic model and the controller design. Combined with a priori estimates of the accuracy of the model, the gradients are used to identify the elements in the vehicle dynamic model and controller that could severely impact the system's relative stability. The technique is demonstrated for a yaw/roll damper stability augmentation designed for a business jet.
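
    The singular-value gradients at the heart of such an analysis have a closed form: for a simple (non-repeated) singular value sigma_k with singular vectors u_k and v_k, the sensitivity to a matrix entry is d sigma_k / dA_ij = u_ik * v_jk. A short numerical check of that identity on a random matrix (distinct singular values assumed):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 6))
    U, s, Vt = np.linalg.svd(A)

    # Analytic sensitivity of the smallest singular value: u_k v_k^T
    k = len(s) - 1
    grad = np.outer(U[:, k], Vt[k, :])

    # Finite-difference check on one matrix entry
    eps = 1e-7
    A2 = A.copy()
    A2[1, 2] += eps
    s2 = np.linalg.svd(A2, compute_uv=False)
    print(grad[1, 2], (s2[k] - s[k]) / eps)   # the two values should agree closely
    ```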

  4. Automatic backscatter analysis of regional left ventricular systolic function using color kinesis.

    PubMed

    Schwartz, S L; Cao, Q L; Vannan, M A; Pandian, N G

    1996-06-15

    Assessment of regional wall motion by 2-dimensional echocardiography can be performed either by semiquantitative wall motion scoring or by quantitative analysis. The former is subjective and requires expertise; quantitative methods are too time-consuming for routine use in a busy clinical laboratory. Color kinesis is a new algorithm utilizing acoustic backscatter analysis. It provides a color-encoded map of endocardial motion in real time. In each frame a new color layer is added; the thickness of the color beam represents endocardial motion during that frame. The end-systolic image has multiple color layers, representing regional and temporal heterogeneity of segmental motion. The purpose of this study was to validate the use of color kinesis for semiquantitative analysis of regional left ventricular systolic function and for quantitative measurement of endocardial excursion. Semiquantitative wall motion scoring was performed in 18 patients using both 2-dimensional echocardiography and color kinesis. Scoring was identical in 74% of segments; there was 84% agreement in the definition of normal vs. abnormal. There was less interobserver variability in wall motion scoring using color kinesis. Endocardial excursion was quantified in 21 patients; 70% of the imaged segments were suitable for analysis. Correlation between 2-dimensional echocardiographic measurements and color kinesis was excellent, r = 0.87. The mean difference in excursion as measured by the 2 methods was -0.05 +/- 2.0 mm. In conclusion, color kinesis is a useful method for assessing regional contraction by displaying a color map of systolic endocardial excursion. This algorithm may improve the confidence and accuracy of assessment of segmental ventricular function by echocardiographic methods.

  5. A Semi-Automatic Method for Image Analysis of Edge Dynamics in Living Cells.

    PubMed

    Huang, Lawrence; Helmke, Brian P

    2011-06-01

    Spatial asymmetry of actin edge ruffling contributes to the process of cell polarization and directional migration, but mechanisms by which external cues control actin polymerization near cell edges remain unclear. We designed a quantitative image analysis strategy to measure the spatiotemporal distribution of actin edge ruffling. Time-lapse images of endothelial cells (ECs) expressing mRFP-actin were segmented using an active contour method. In intensity line profiles oriented normal to the cell edge, peak detection identified the angular distribution of polymerized actin within 1 µm of the cell edge, which was localized to lamellipodia and edge ruffles. Edge features associated with filopodia and peripheral stress fibers were removed. Circular statistical analysis enabled detection of cell polarity, indicated by a unimodal distribution of edge ruffles. To demonstrate the approach, we detected a rapid, nondirectional increase in edge ruffling in serum-stimulated ECs and a change in constitutive ruffling orientation in quiescent, nonpolarized ECs. Error analysis using simulated test images demonstrates robustness of the method to variations in image noise levels, edge ruffle arc length, and edge intensity gradient. These quantitative measurements of edge ruffling dynamics enable investigation at the cellular length scale of the underlying molecular mechanisms regulating actin assembly and cell polarization.
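
    The circular-statistics step can be illustrated with a Rayleigh test for a unimodal (polarized) distribution of ruffle angles; a sketch assuming angles in radians, with the common first-order approximation for the p-value:

        import numpy as np

        def rayleigh_test(angles):
            """Mean resultant length R and approximate Rayleigh p-value.

            A small p suggests a unimodal angular distribution, i.e. a polarized cell.
            """
            n = len(angles)
            C, S = np.cos(angles).sum(), np.sin(angles).sum()
            R = np.hypot(C, S) / n            # mean resultant length in [0, 1]
            z = n * R**2
            p = np.exp(-z)                    # first-order approximation
            mean_dir = np.arctan2(S, C)       # dominant ruffling direction
            return R, p, mean_dir

        angles = np.random.default_rng(2).vonmises(mu=0.5, kappa=2.0, size=200)  # placeholder
        print(rayleigh_test(angles))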

  6. A Semi-Automatic Method for Image Analysis of Edge Dynamics in Living Cells

    PubMed Central

    Huang, Lawrence; Helmke, Brian P.

    2011-01-01

    Spatial asymmetry of actin edge ruffling contributes to the process of cell polarization and directional migration, but mechanisms by which external cues control actin polymerization near cell edges remain unclear. We designed a quantitative image analysis strategy to measure the spatiotemporal distribution of actin edge ruffling. Time-lapse images of endothelial cells (ECs) expressing mRFP-actin were segmented using an active contour method. In intensity line profiles oriented normal to the cell edge, peak detection identified the angular distribution of polymerized actin within 1 µm of the cell edge, which was localized to lamellipodia and edge ruffles. Edge features associated with filopodia and peripheral stress fibers were removed. Circular statistical analysis enabled detection of cell polarity, indicated by a unimodal distribution of edge ruffles. To demonstrate the approach, we detected a rapid, nondirectional increase in edge ruffling in serum-stimulated ECs and a change in constitutive ruffling orientation in quiescent, nonpolarized ECs. Error analysis using simulated test images demonstrates robustness of the method to variations in image noise levels, edge ruffle arc length, and edge intensity gradient. These quantitative measurements of edge ruffling dynamics enable investigation at the cellular length scale of the underlying molecular mechanisms regulating actin assembly and cell polarization. PMID:21643526

  7. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    PubMed Central

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
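
    A minimal sketch of the state-novelty check the approach relies on, assuming an application state can be reduced to a hashable fingerprint; the fingerprinting function below is a placeholder, not the authors' instrumentation:

        import hashlib, json

        seen_states = set()

        def fingerprint(state: dict) -> str:
            """Placeholder: reduce an observed application state to a stable hash."""
            return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

        def maybe_run_field_tests(state: dict, run_tests) -> bool:
            """Run deployment-environment tests only in previously-unseen states."""
            fp = fingerprint(state)
            if fp in seen_states:
                return False            # state already tested: skip, avoiding overhead
            seen_states.add(fp)
            run_tests(state)
            return True

        maybe_run_field_tests({"mode": "idle", "conns": 3}, run_tests=print)
        maybe_run_field_tests({"mode": "idle", "conns": 3}, run_tests=print)  # skipped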

  8. Analysis of cannabis in oral fluid specimens by GC-MS with automatic SPE.

    PubMed

    Choi, Hyeyoung; Baeck, Seungkyung; Kim, Eunmi; Lee, Sooyeun; Jang, Moonhee; Lee, Juseon; Choi, Hwakyung; Chung, Heesun

    2009-12-01

    Methamphetamine (MA) is the most commonly abused drug in Korea, followed by cannabis. Traditionally, MA analysis is carried out on both urine and hair samples, and cannabis analysis on urine samples only. Despite the fact that oral fluid has become increasingly popular as an alternative specimen in the field of driving under the influence of drugs (DUID) and workplace drug testing, its application has not been expanded to drug analysis in Korea. Oral fluid is easy to collect and handle and can provide an indication of recent drug abuse. In this study, we present an analytical method using GC-MS to determine tetrahydrocannabinol (THC) and its main metabolite 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in oral fluid. The validated method was applied to oral fluid samples collected from drug abuse suspects and the results were compared with those in urine. The stability of THC and THC-COOH in oral fluid stored in different containers was also investigated. Oral fluid specimens from 12 drug abuse suspects, submitted by the police, were collected by direct expectoration. The samples were screened with microplate ELISA. For confirmation, they were extracted using automated SPE with a mixed-mode cation exchange cartridge, derivatized, and analyzed by GC-MS using selective ion monitoring (SIM). The concentrations of THC and THC-COOH in oral fluid showed a large variation, and the results from oral fluid and urine samples from cannabis abusers did not show any correlation. Thus, detailed information about the time interval between drug use and sample collection is needed to interpret oral fluid results properly. In addition, further investigation of the detection time window of THC and THC-COOH in oral fluid is required before oral fluid can substitute for urine in drug testing.

  9. Automatic and Interactive Analysis Software for Beta-Gamma Coincidence Systems Used in CTBT Monitoring

    DTIC Science & Technology

    2000-09-01

    A suite of software has been developed by Veridian Systems as part of the Prototype International Data Center (PIDC) to assist in the automatic and interactive analysis of beta-gamma coincidence data (contract DTRA01-99-C-0031). Cited references: [1] ..., Journal of Radioanalytical and Nuclear Chemistry, April 2000; [2] Biegalski, K.M.F. and Biegalski, S., "Determining Minimum Detectable...", Journal of Radioanalytical and Nuclear Chemistry, April 2000; [3] Reeder, P.L., Bowyer, T.W., and Perkins, R.W., "Analysis of Beta-Gamma Spectra for the PNNL ARSA...", Journal of Radioanalytical and Nuclear Chemistry, April 2000.

  10. Automatic classification of the interferential tear film lipid layer using colour texture analysis.

    PubMed

    Remeseiro, B; Penas, M; Barreira, N; Mosquera, A; Novo, J; García-Resúa, C

    2013-07-01

    The tear film lipid layer is heterogeneous among the population. Its classification depends on its thickness and can be done using the interference pattern categories proposed by Guillon. This paper presents an exhaustive study of the characterisation of the interference phenomena as a texture pattern, using different feature extraction methods in different colour spaces. These methods are first analysed individually and then combined to achieve the best results possible. The principal component analysis (PCA) technique has also been tested to reduce the dimensionality of the feature vectors. The proposed methodologies have been tested on a dataset composed of 105 images from healthy subjects, with a classification rate of over 95% in some cases.

  11. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    PubMed

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community, we developed an open-source MATLAB script to analyze immunofluorescent muscle sections, incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the selected analysis. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Using parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection

  12. Automaticity of Conceptual Magnitude.

    PubMed

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-02-16

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object's conceptual magnitude processed? It has been suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli, it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity effect was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggesting that different types of magnitude processing and representation share the same core system.

  13. Automaticity of Conceptual Magnitude

    PubMed Central

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-01-01

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object’s conceptual magnitude processed? It has been suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli, it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity effect was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggesting that different types of magnitude processing and representation share the same core system. PMID:26879153

  14. Automatic transmission

    SciTech Connect

    Meyman, U.

    1987-03-10

    An automatic transmission is described comprising wheel members each having discs defining an inner space therebetween; turnable blades and vane members located in the inner space between the discs of at least one of the wheel members, the turnable blades being mechanically connected with the vane members. Each of the turnable blades has an inner surface and an outer surface formed by circular cylindrical surfaces having a common axis, each of the turnable blades being turnable about the common axis of the circular cylindrical surfaces forming the inner and outer surfaces of the respective blade; levers turnable about the axes and supporting the blades; the discs having openings extending coaxially with the surfaces which describe the blades. The blades are partially received in the openings of the discs; and a housing accommodating the wheel members and the turnable blades and the vane members.

  15. Automatic transmission

    SciTech Connect

    Hamane, M.; Ohri, H.

    1989-03-21

    This patent describes an automatic transmission connected between a drive shaft and a driven shaft and comprising: a planetary gear mechanism including a first gear driven by the drive shaft, a second gear operatively engaged with the first gear to transmit speed change output to the driven shaft, and a third gear operatively engaged with the second gear to control the operation thereof; centrifugally operated clutch means for driving the first gear and the second gear. It also includes a ratchet type one-way clutch for permitting rotation of the third gear in the same direction as that of the drive shaft but preventing rotation in the reverse direction; the clutch means comprising a ratchet pawl supporting plate coaxially disposed relative to the drive shaft and integrally connected to the third gear, the ratchet pawl supporting plate including outwardly projecting radial projections united with one another at base portions thereof.

  16. Performance portability study of an automatic target detection and classification algorithm for hyperspectral image analysis using OpenCL

    NASA Astrophysics Data System (ADS)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Garcia, Carlos; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    Recent advances in heterogeneous high performance computing (HPC) have opened new avenues for demanding remote sensing applications. Perhaps one of the most popular algorithms in target detection and identification is the automatic target detection and classification algorithm (ATDCA), widely used in the hyperspectral image analysis community. Previous research has already investigated the mapping of ATDCA on graphics processing units (GPUs) and field programmable gate arrays (FPGAs), showing impressive speedup factors that allow its exploitation in time-critical scenarios. Based on these studies, our work explores the performance portability of a tuned OpenCL implementation across a range of processing devices including multicore processors, GPUs and other accelerators. This approach differs from previous papers, which focused on achieving the optimal performance on each platform. Here, we are more interested in the following issues: (1) evaluating whether a single code written in OpenCL allows us to achieve acceptable performance across all of them, and (2) assessing the gap between our portable OpenCL code and those hand-tuned versions previously investigated. Our study includes the analysis of different tuning techniques that expose data parallelism as well as enable an efficient exploitation of the complex memory hierarchies found in these new heterogeneous devices. Experiments have been conducted using hyperspectral data sets collected by NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensors. To the best of our knowledge, this kind of analysis has not been previously conducted in the hyperspectral imaging processing literature, and in our opinion it is very important in order to realistically calibrate the possibility of using heterogeneous platforms for efficient hyperspectral imaging processing in real remote sensing missions.
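
    ATDCA is commonly formulated with orthogonal subspace projection: repeatedly select the pixel whose spectrum has the largest residual after projecting out the targets found so far. A serial numpy sketch of that formulation (the OpenCL tuning studied in the paper is not reproduced here):

        import numpy as np

        def atdca(X, n_targets):
            """X: (n_pixels, n_bands) hyperspectral cube reshaped to 2-D.

            Returns indices of automatically detected target pixels.
            """
            targets = [int(np.argmax(np.einsum("ij,ij->i", X, X)))]  # brightest pixel first
            for _ in range(n_targets - 1):
                U = X[targets].T                                # (n_bands, k) target matrix
                P = np.eye(X.shape[1]) - U @ np.linalg.pinv(U)  # orthogonal projector
                R = X @ P                                       # residual spectra
                targets.append(int(np.argmax(np.einsum("ij,ij->i", R, R))))
            return targets

        cube = np.random.default_rng(3).random((10000, 50))     # placeholder image
        print(atdca(cube, n_targets=5))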

  17. Performance of an implantable automatic atrial fibrillation detection device: impact of software adjustments and relevance of manual episode analysis.

    PubMed

    Eitel, Charlotte; Husser, Daniela; Hindricks, Gerhard; Frühauf, Manuela; Hilbert, Sebastian; Arya, Arash; Gaspar, Thomas; Wetzel, Ulrike; Bollmann, Andreas; Piorkowski, Christopher

    2011-04-01

    Implantable loop recorders (ILRs) with specific atrial fibrillation (AF) detection algorithms (ILR-AF) have been developed for continuous AF monitoring. We sought to analyse the clinical value of a new AF monitoring device and to compare it to serial 7-day Holter. Sixty-four consecutive patients suffering from paroxysmal AF were included in this prospective analysis and received an ILR-AF. Manual electrogram analysis was performed for each automatically detected episode and each was categorized into one of three possible diagnoses: 'no AF', 'definite AF', and 'possible AF' (non-diagnostic). Analysis was performed separately before and after a software upgrade that was introduced during the course of the study. A subgroup of patients (51 of 64) underwent AF catheter ablation with subsequent serial 7-day Holter in comparison with the ILR-AF. A total of 333 interrogations were performed (203 before and 130 after the software upgrade). The proportion of patients with AF misdetection was significantly reduced from 72% to 44% following the software upgrade (P = 0.001), and the proportion with non-diagnostic interrogations fell from 38% to 16% (P = 0.001). Compared with serial 7-day Holter, the ILR-AF had a tendency to detect a higher number of patients with AF recurrences (31% vs. 24%; P = 0.125). The rate of AF detection on ILR-AF may be higher compared with standard AF monitoring. However, false-positive AF recordings hamper the clinical value. Developments in device technology and device handling are necessary to minimize non-diagnostic interrogations.

  18. Eolo: software for the automatic on-line treatment and analysis of GPS data for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Amore, M.; Bonaccorso, A.; Ferrari, F.; Mattia, M.

    2002-02-01

    Geodetic measurements based on GPS satellite technology allow monitoring of the state of deformation with great precision. Geodetic studies through GPS have proven to be of strategic importance in risk evaluation of various phenomena such as landslides, volcanoes and earthquakes, and for the evaluation of the stability of large structures (dykes, bridges, etc.). For control and surveillance purposes, continuous monitoring with real-time acquisition and analysis of data is required. We have developed a software package, called Eolo, which allows the management of data from a GPS network using automatic acquisition. The software is the fruit of five years' experience of working with continuous GPS and has been designed with a modular object-based structure to be extremely flexible and adaptable to different needs. The software allows treatment of raw data, their storage in a database, calculation of the main strain parameters, and visualisation of all the results. In this sense the software is a powerful tool in the field of environmental monitoring with GPS geodetic techniques. The software is in use for the continuous monitoring of active Sicilian volcanoes.

  19. Online Determination of Trace Amounts of Tannic Acid in Colored Tannery Wastewaters by Automatic Reference Flow Injection Analysis

    PubMed Central

    Wei, Liang

    2010-01-01

    A simple, rapid and sensitive method was proposed for the online determination of tannic acid in colored tannery wastewater by automatic reference flow injection analysis. The method is based on the reduction of phosphotungstic acid by tannic acid to form a blue compound in alkaline solution at pH 12.38; the absorbance of the blue compound at its absorption maximum of 760 nm is linearly related to the tannic acid content. Optimal experimental conditions were established. The linear range of the proposed method was from 200 μg L−1 to 80 mg L−1 and the detection limit was 0.58 μg L−1. The relative standard deviation was 3.08% and 2.43% for 500 μg L−1 and 40 mg L−1 tannic acid standard solutions, respectively (n = 10). The method was successfully applied to the determination of tannic acid in colored tannery wastewaters and the analytical results were satisfactory. PMID:20508812

  20. Automatic detection and analysis of the EEG sharp wave-slow wave patterns evoked by fluorinated inhalation anesthetics.

    PubMed

    Olejarczyk, Elzbieta; Jozwik, Adam; Zmyslowski, Wojciech; Sobieszek, Aleksander; Marciniak, Radoslaw; Byrczek, Tomasz; Jalowiecki, Przemyslaw; Bem, Tiaza

    2012-08-01

    The aim of this study was to develop a method for the automatic detection of sharp wave-slow wave (SWSW) patterns evoked in the EEG by volatile anesthetics and to identify the patterns' characteristics. The proposed method consisted of k-NN classification with a reference set constructed from expert knowledge, the morphology of the EEG patterns, and a condition for their synchronization. The decision rules were constructed and evaluated using 24-h EEG records from ten patients. The sensitivity, specificity and selectivity of the method were 0.88 ± 0.10, 0.81 ± 0.13 and 0.42 ± 0.16, respectively. SWSW pattern recruitment was strictly dependent on anesthetic concentration, and SWSW patterns evoked by different types of anesthetics expressed different characteristics. The synchronization criterion and adequately selected morphological features of the slow wave were sufficient to achieve the high sensitivity and specificity of the method. The monitoring of SWSW patterns is important in view of possible side effects of volatile anesthetics, and the analysis of SWSW pattern recruitment and morphology could be helpful in the diagnosis of the effects of anesthesia on the CNS. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
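
    A generic sketch of the classify-and-score step, assuming per-epoch feature vectors and expert labels (the paper's actual EEG features and synchronization criterion are not reproduced); selectivity is taken here as positive predictive value:

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(4)
        X = rng.random((500, 6))          # placeholder morphological features per epoch
        y = rng.integers(0, 2, 500)       # expert labels: 1 = SWSW pattern present

        pred = cross_val_predict(KNeighborsClassifier(n_neighbors=5), X, y, cv=10)
        tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
        tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
        print("sensitivity", tp / (tp + fn), "specificity", tn / (tn + fp),
              "selectivity", tp / (tp + fp))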

  1. Automatic extraction of ECG strips from continuous 12-lead holter recordings for QT analysis at prescheduled versus optimized time points.

    PubMed

    Badilini, Fabio; Vaglio, Martino; Sarapa, Nenad

    2009-01-01

    Continuous 12-lead ECG monitoring (Holter) in early-phase pharmaceutical studies is today widely used as an ideal platform from which to extract discrete ECGs for analysis. The extraction process is typically performed manually by trained readers using commercial Holter processing systems. Antares, a novel method for automatic 12-lead extraction from continuous Holter recordings applying minimal-noise criteria and heart-rate stability conditions, is presented. A set of 12-lead Holter recordings from healthy subjects administered sotalol is used to compare ECG extractions at fixed time points with ECG extractions generated by Antares, which optimizes noise and heart rate inside 5-minute windows centered around each expected time point of interest. Global, low- and high-frequency noise content of extracted ECGs was significantly reduced via the optimized approach. Heart rate was also slightly reduced (from 69 +/- 13 to 64 +/- 13 bpm, P < 0.05). Similarly, the corrected QT interval from optimized extractions was significantly reduced (QTcB from 414 +/- 32 to 402 +/- 30 ms, P < 0.05). Using only baseline data, and after adjusting for intersubject variability, the standard deviation (SD) of QT intervals was markedly reduced with optimized extraction (SD of QTcF from 11 +/- 8 to 7 +/- 2 ms, P < 0.05). Extraction of discrete 12-lead ECG strips from continuous Holter generates less noisy and more stable ECGs, leading to more robust QTc data and thereby potentially facilitating the assessment of ECG effects in clinical trials.
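
    A sketch of the optimized-extraction idea: within a window around each nominal time point, choose the strip that minimizes a noise score subject to heart-rate stability. The noise and heart-rate series, window sizes and tolerance below are placeholders:

        import numpy as np

        def best_strip(noise, hr, center, half_win=150, strip=10, hr_tol=3.0):
            """noise, hr: per-second series; returns start time of the chosen strip."""
            lo, hi = center - half_win, center + half_win - strip
            best, best_score = None, np.inf
            for t in range(lo, hi + 1):
                seg_hr = hr[t:t + strip]
                if seg_hr.max() - seg_hr.min() > hr_tol:  # reject unstable heart rate
                    continue
                score = noise[t:t + strip].mean()
                if score < best_score:
                    best, best_score = t, score
            return best

        rng = np.random.default_rng(5)
        noise = rng.random(3600)
        hr = 65 + rng.normal(0, 1, 3600).cumsum() * 0.01
        print(best_strip(noise, hr, center=1800))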

  2. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    NASA Astrophysics Data System (ADS)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was taken as ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but better training is required.
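
    The classifier described, one PCA space per BI-RADS category with assignment to the closest space, can be sketched as classification by reconstruction error in each class subspace (training data below is placeholder):

        import numpy as np
        from sklearn.decomposition import PCA

        def fit_class_spaces(hists, labels, n_components=4):
            return {c: PCA(n_components).fit(hists[labels == c]) for c in np.unique(labels)}

        def classify(hist, spaces):
            """Assign the BI-RADS class whose PCA space reconstructs the histogram best."""
            errs = {c: np.linalg.norm(hist - p.inverse_transform(p.transform(hist[None]))[0])
                    for c, p in spaces.items()}
            return min(errs, key=errs.get)

        rng = np.random.default_rng(6)
        hists = rng.random((86, 256))      # placeholder mammogram histograms
        labels = rng.integers(1, 5, 86)    # BI-RADS densities 1-4
        spaces = fit_class_spaces(hists, labels)
        print(classify(hists[0], spaces))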

  3. The metagenomics RAST server - a public resource for the automatic phylogenetic and functional analysis of metagenomes.

    SciTech Connect

    Meyer, F.; Paarmann, D.; D'Souza, M.; Olson, R.; Glass, E. M.; Kubal, M.; Paczian, T.; Stevens, R.; Wilke, A.; Wilkening, J.; Edwards, R. A.; Rodriguez, A.; Mathematics and Computer Science; Univ. of Chicago; San Diego State Univ.

    2008-09-19

    Random community genomes (metagenomes) are now commonly used to study microbes in different environments. Over the past few years, the major challenge associated with metagenomics shifted from generating to analyzing sequences. High-throughput, low-cost next-generation sequencing has provided access to metagenomics to a wide range of researchers. A high-throughput pipeline has been constructed to provide high-performance computing to all researchers interested in using metagenomics. The pipeline produces automated functional assignments of sequences in the metagenome by comparison against both protein and nucleotide databases. Phylogenetic and functional summaries of the metagenomes are generated, and tools for comparative metagenomics are incorporated into the standard views. User access is controlled to ensure data privacy, but the collaborative environment underpinning the service provides a framework for sharing datasets between multiple users. In the metagenomics RAST, all users retain full control of their data, and everything is available for download in a variety of formats. The open-source metagenomics RAST service provides a new paradigm for the annotation and analysis of metagenomes. With built-in support for multiple data sources and a back end that houses abstract data types, the metagenomics RAST is stable, extensible, and freely available to all researchers. This service has removed one of the primary bottlenecks in metagenome sequence analysis--the availability of high-performance computing for annotating the data.

  4. International seismological datacenter. Database structure, computer facilities, automatic and interactive analysis

    NASA Astrophysics Data System (ADS)

    Barkeby, G.

    1980-11-01

    A database and data analysis system were designed for receiving and processing seismological data from the WMO Global Telecommunications System. The interface with the system uses a PDP 11/34 computer. Data from the seismic bulletins can be analyzed online or stored for later use. The database is constructed on a list basis and is a set of seismological data consisting of either reported or calculated parameters; each parameter is accessible to the user. Programming for the system is in standard FORTRAN except for the communications interface. Record structure is shown with examples of the command language. Computer equipment includes an Amdahl 470/V7, a DEC 10, and a CDC 170-720.

  5. Automatic Contrast Enhancement of Brain MR Images Using Hierarchical Correlation Histogram Analysis.

    PubMed

    Chen, Chiao-Min; Chen, Chih-Cheng; Wu, Ming-Chi; Horng, Gwoboa; Wu, Hsien-Chu; Hsueh, Shih-Hua; Ho, His-Yun

    Parkinson's disease is a progressive neurodegenerative disorder that has a higher probability of occurrence in middle-aged and older adults than in the young. With the use of a computer-aided diagnosis (CAD) system, abnormal cell regions can be identified, and this identification can help medical personnel to evaluate the chance of disease. This study proposes hierarchical correlation histogram analysis based on the grayscale distribution of pixel intensities; by constructing a correlation histogram, it improves adaptive contrast enhancement for specific objects. The proposed method produces significant results during contrast enhancement preprocessing and facilitates subsequent CAD processes, thereby reducing recognition time and improving accuracy. The experimental results show that the proposed method is superior to existing methods as assessed by two quantitative image-quality measures, PSNR and average gradient. Furthermore, the edge information pertaining to specific cells can effectively increase the accuracy of the results.

  6. Automatic Segmentation of Invasive Breast Carcinomas from DCE-MRI using Time Series Analysis

    PubMed Central

    Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A.; Gombos, Eva

    2013-01-01

    Purpose Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on imaging pulse sequence, timing of bolus injection, arterial input function, imaging noise and fitting algorithms. The aim was to accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Methods We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist’s segmentation and the output of a commercial software package, CADstream. The results are quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured in terms of the Dice similarity coefficient (DSC). Results The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared to the radiologist’s segmentation, and with 82.1% accuracy and 100% sensitivity when compared to the CADstream output. The overlap of the algorithm output with the radiologist’s segmentation and the CADstream output, computed in terms of the DSC, was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise. Simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series. The amount of overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC=0.95. Conclusion The time-series analysis based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor from DCE-MRI. PMID:24115175
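
    The Dice similarity coefficient used to quantify overlap is straightforward to compute from two binary masks; a minimal sketch:

        import numpy as np

        def dice(a: np.ndarray, b: np.ndarray) -> float:
            """DSC = 2|A intersect B| / (|A| + |B|) for boolean segmentation masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        rng = np.random.default_rng(7)
        alg = rng.random((128, 128)) > 0.6   # placeholder algorithm mask
        ref = rng.random((128, 128)) > 0.6   # placeholder radiologist mask
        print(round(dice(alg, ref), 3))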

  7. Suitability of UK Biobank Retinal Images for Automatic Analysis of Morphometric Properties of the Vasculature

    PubMed Central

    MacGillivray, Thomas J; Cameron, James R.; Zhang, Qiuli; El-Medany, Ahmed; Mulholland, Carl; Sheng, Ziyan; Dhillon, Bal; Doubal, Fergus N.; Foster, Paul J.

    2015-01-01

    Purpose To assess the suitability of retinal images held in the UK Biobank - the largest retinal data repository in a prospective population-based cohort - for computer assisted vascular morphometry, generating measures that are commonly investigated as candidate biomarkers of systemic disease. Methods Non-mydriatic fundus images from both eyes of 2,690 participants - people with a self-reported history of myocardial infarction (n=1,345) and a matched control group (n=1,345) - were analysed using VAMPIRE software. These images were drawn from those of 68,554 UK Biobank participants who underwent retinal imaging at recruitment. Four operators were trained in the use of the software to measure retinal vascular tortuosity and bifurcation geometry. Results Total operator time was approximately 360 hours (4 minutes per image). 2,252 (84%) of participants had at least one image of sufficient quality for the software to process, i.e. there was sufficient detection of retinal vessels in the image by the software to attempt the measurement of the target parameters. 1,604 (60%) of participants had an image of at least one eye that was adequately analysed by the software, i.e. the measurement protocol was successfully completed. Increasing age was associated with a reduced proportion of images that could be processed (p=0.0004) and analysed (p<0.0001). Cases exhibited more acute arteriolar branching angles (p=0.02) as well as lower arteriolar and venular tortuosity (p<0.0001). Conclusions A proportion of the retinal images in UK Biobank are of insufficient quality for automated analysis. However, the large size of the UK Biobank means that tens of thousands of images are available and suitable for computational analysis. Parametric information measured from the retinas of participants with suspected cardiovascular disease was significantly different to that measured from a matched control group. PMID:26000792

  8. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static-random-access-memory (SRAM) array or a standard cell library, or by using a simple design rule checking run-set. The resulting database was then used as an input for choosing locations for critical dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Building on this example, for process monitoring and variability analyses we extensively used this method to analyze transistor gates having different shapes, and performed an analysis of a large area of a high-density standard cell library. Another set of monitors focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. A further set of analyses was performed on samples after Cu M1 etch: process monitoring information on M1-enclosed contacts was extracted using contact resistance as feedback, and guidelines for the optimal M1 space for different layout configurations were derived. All these data demonstrate the successful in-field implementation of our methodology as a useful process monitoring method.

  9. Automatic pole-like object modeling via 3D part-based analysis of point cloud

    NASA Astrophysics Data System (ADS)

    He, Liu; Yang, Haoxiang; Huang, Yuchun

    2016-10-01

    Pole-like objects, including trees, lampposts and traffic signs, are an indispensable part of urban infrastructure. With the advance of vehicle-based laser scanning (VLS), massive point clouds of roadside urban areas are increasingly applied in 3D digital city modeling. Based on the property that different pole-like objects have various canopy parts but similar trunk parts, this paper proposes 3D part-based shape analysis to robustly extract, identify and model pole-like objects. The proposed method includes: 3D clustering and recognition of trunks, voxel growing, and part-based 3D modeling. After preprocessing, a trunk center is identified as a point that has a local density peak and the largest minimum inter-cluster distance. Starting from the trunk centers, the remaining points are iteratively assigned to the cluster of their nearest higher-density point. To eliminate noisy points, cluster borders are refined by trimming boundary outliers. Candidate trunks are then extracted based on the clustering results in three orthogonal planes by shape analysis, and voxel growing recovers the complete pole-like objects regardless of overlap. Finally, the trunk, branch and crown parts are analyzed to obtain seven feature parameters, which are used to model the three parts separately and assemble them into a single part-based 3D model. The proposed method is tested using the VLS-based point cloud of Wuhan University, China. The point cloud includes many kinds of trees, lampposts and other pole-like posts under different occlusions and overlaps. Experimental results show that the proposed method can extract the exact attributes and model the roadside pole-like objects efficiently.
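
    The trunk-center rule quoted, a point with a local density peak and the largest minimum distance to any denser point, is the decision rule of density-peaks clustering; a 2-D toy sketch (voxel growing and part modeling are not reproduced):

        import numpy as np

        def density_peaks(points, d_c=1.0, n_centers=3):
            """Return indices of cluster centers: high local density rho, large delta."""
            d = np.linalg.norm(points[:, None] - points[None, :], axis=2)
            rho = (d < d_c).sum(axis=1) - 1            # neighbours within cutoff
            delta = np.empty(len(points))
            for i in range(len(points)):
                denser = np.where(rho > rho[i])[0]     # points with higher density
                delta[i] = d[i, denser].min() if len(denser) else d[i].max()
            return np.argsort(rho * delta)[-n_centers:]  # rank by gamma = rho * delta

        rng = np.random.default_rng(8)
        pts = np.concatenate([rng.normal(c, 0.3, (60, 2)) for c in ((0, 0), (4, 0), (2, 3))])
        print(density_peaks(pts))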

  10. Upgrade of a semi-automatic flow injection analysis system to a fully automatic one by means of a resident program

    PubMed Central

    Prodromidis, M. I.; Tsibiris, A. B.

    1995-01-01

    The program and the arrangement for a versatile, computer-controlled flow injection analysis system are described. A resident program (which can be run simultaneously with, and complementary to, any other program) controls (on/off, speed, direction) a pump and a pneumatic valve (emptying and filling positions). The system was designed to be simple and flexible for both research and routine work. PMID:18925039

  11. Bayesian analysis of fingerprint, face and signature evidences with automatic biometric systems.

    PubMed

    Gonzalez-Rodriguez, Joaquin; Fierrez-Aguilar, Julian; Ramos-Castro, Daniel; Ortega-Garcia, Javier

    2005-12-20

    The Bayesian approach provides a unified and logical framework for the analysis of evidence and for reporting results in the form of likelihood ratios (LR) from the forensic laboratory to the court. In this contribution we clarify how the biometric scientist or laboratory can adapt conventional biometric systems or technologies to work according to this Bayesian approach. Forensic systems providing their results in the form of LRs will be assessed through Tippett plots, which give a clear representation of the LR-based performance both for targets (the suspect is the author/source of the test pattern) and for non-targets. However, the computation procedures for the LR values, especially with biometric evidence, are still an open issue. Reliable estimation techniques showing good generalization properties for the estimation of the between- and within-source variabilities of the test pattern are required, as are variance-restriction techniques in the within-source density estimation to account for the variability of the source over time. Fingerprint, face and on-line signature recognition systems are adapted to work according to this Bayesian approach, showing both the range of likelihood ratios in each application and the adequacy of these biometric techniques for daily forensic work.
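
    A generic sketch of turning a biometric comparison score into a likelihood ratio via kernel density estimates of the target and non-target score distributions; the scores are placeholders and the paper's calibration and variance-restriction details are not reproduced:

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(9)
        target_scores = rng.normal(2.0, 0.7, 400)      # placeholder same-source scores
        nontarget_scores = rng.normal(0.0, 1.0, 4000)  # placeholder different-source scores

        p_target = gaussian_kde(target_scores)
        p_nontarget = gaussian_kde(nontarget_scores)

        def likelihood_ratio(score: float) -> float:
            """LR = p(score | same source) / p(score | different sources)."""
            return float(p_target(score) / p_nontarget(score))

        print(likelihood_ratio(1.8))   # LR > 1 supports the same-source hypothesis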

  12. Automatic detection of sleep apnea based on EEG detrended fluctuation analysis and support vector machine.

    PubMed

    Zhou, Jing; Wu, Xiao-ming; Zeng, Wei-jie

    2015-12-01

    Sleep apnea syndrome (SAS) is prevalent, and recently many studies have focused on simple and efficient methods for SAS detection instead of polysomnography. However, not much work has been done on using the nonlinear behavior of electroencephalogram (EEG) signals. The purpose of this study was to find a novel and simpler method for detecting apnea patients and to quantify the nonlinear characteristics of sleep apnea. Thirty-minute EEG scaling exponents that quantify power-law correlations were computed using detrended fluctuation analysis (DFA) and compared between six SAS and six healthy subjects during sleep. The mean scaling exponents were calculated every 30 s, yielding 360 control values and 360 apnea values. These values were compared between the two groups and a support vector machine (SVM) was used to classify apnea patients. A significant difference was found between the EEG scaling exponents of the two groups (p < 0.001). The SVM achieved a high and consistent recognition rate: average classification accuracy reached 95.1%, corresponding to a sensitivity of 93.2% and a specificity of 98.6%. DFA of the EEG is an efficient and practicable method and is clinically helpful in the diagnosis of sleep apnea.
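
    A compact implementation of the first-order DFA scaling exponent used as the feature here; window sizes and the placeholder signal are illustrative:

        import numpy as np

        def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
            """First-order detrended fluctuation analysis scaling exponent alpha."""
            y = np.cumsum(x - np.mean(x))              # integrated profile
            F = []
            for s in scales:
                n = len(y) // s
                segs = y[:n * s].reshape(n, s)
                t = np.arange(s)
                rms = []
                for seg in segs:
                    a, b = np.polyfit(t, seg, 1)       # local linear trend
                    rms.append(np.mean((seg - (a * t + b)) ** 2))
                F.append(np.sqrt(np.mean(rms)))
            alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
            return alpha

        eeg = np.random.default_rng(10).standard_normal(30 * 60 * 100)  # 30 min @ 100 Hz
        print(dfa_exponent(eeg))   # ~0.5 for white noise; apnea/control EEG differ per study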

  13. Automatic segmentation and analysis of fibrin networks in 3D confocal microscopy images

    NASA Astrophysics Data System (ADS)

    Liu, Xiaomin; Mu, Jian; Machlus, Kellie R.; Wolberg, Alisa S.; Rosen, Elliot D.; Xu, Zhiliang; Alber, Mark S.; Chen, Danny Z.

    2012-02-01

    Fibrin networks are a major component of blood clots that provides structural support to the formation of growing clots. Abnormal fibrin networks that are too rigid or too unstable can promote cardiovascular problems and/or bleeding. However, current biological studies of fibrin networks rarely perform quantitative analysis of their structural properties (e.g., the density of branch points) due to the massive branching structures of the networks. In this paper, we present a new approach for segmenting and analyzing fibrin networks in 3D confocal microscopy images. We first identify the target fibrin network by applying the 3D region growing method with global thresholding. We then produce a one-voxel wide centerline for each fiber segment along which the branch points and other structural information of the network can be obtained. Branch points are identified by a novel approach based on the outer medial axis. Cells within the fibrin network are segmented by a new algorithm that combines cluster detection and surface reconstruction based on the α-shape approach. Our algorithm has been evaluated on computer phantom images of fibrin networks for identifying branch points. Experiments on z-stack images of different types of fibrin networks yielded results that are consistent with biological observations.

  14. Evaluation of Septa Quality for Automatic SPME–GC–MS Trace Analysis

    PubMed Central

    Ulanowska, Agnieszka; Ligor, Tomasz; Amann, Anton; Buszewski, Bogusław

    2012-01-01

    The vials used for the preparation of breath samples for automated solid-phase microextraction–gas chromatography–mass spectrometry analysis are crimped with septa. These septa often emit specific volatile organic compounds (VOCs) confounding the measurement results of breath samples. In the current paper, 14 different brands of magnetic caps with silicone–polytetrafluoroethylene (PTFE), butyl–PTFE, or butyl rubber septa were tested. The total emission of septa over a 4 h period was also evaluated. The tested septa emitted 39 different compounds, which are mainly hydrocarbons, alcohols, and ketones. Acetone and toluene are the most abundant out-gassing products. The concentration of acetone was in the range from 55 to 694 ppb for butyl–PTFE septum (brand 14) and butyl rubber (brand 10), respectively. The measured toluene amount was 69–1323 ppb for the septum brand 14 and brand 8 (silicone–PTFE), respectively. Generally, the butyl rubber septa released higher amounts of contaminants in comparison to the silicone ones. PMID:22291050

  15. Formal analysis and automatic generation of user interfaces: approach, methodology, and an algorithm.

    PubMed

    Heymann, Michael; Degani, Asaf

    2007-04-01

    We present a formal approach and methodology for the analysis and generation of user interfaces, with special emphasis on human-automation interaction. A conceptual approach for modeling, analyzing, and verifying the information content of user interfaces is discussed. The proposed methodology is based on two criteria: First, the interface must be correct--that is, given the interface indications and all related information (user manuals, training material, etc.), the user must be able to successfully perform the specified tasks. Second, the interface and related information must be succinct--that is, the amount of information (mode indications, mode buttons, parameter settings, etc.) presented to the user must be reduced (abstracted) to the minimum necessary. A step-by-step procedure for generating the information content of the interface that is both correct and succinct is presented and then explained and illustrated via two examples. Every user interface is an abstract description of the underlying system. The correspondence between the abstracted information presented to the user and the underlying behavior of a given machine can be analyzed and addressed formally. The procedure for generating the information content of user interfaces can be automated, and a software tool for its implementation has been developed. Potential application areas include adaptive interface systems and customized/personalized interfaces.

  16. Automatic Quantitative MRI Texture Analysis in Small-for-Gestational-Age Fetuses Discriminates Abnormal Neonatal Neurobehavior

    PubMed Central

    Sanz-Cortes, Magdalena; Ratta, Giuseppe A.; Figueras, Francesc; Bonet-Carne, Elisenda; Padilla, Nelly; Arranz, Angela; Bargallo, Nuria; Gratacos, Eduard

    2013-01-01

    Background We tested whether texture analysis (TA) of MR images could identify patterns associated with abnormal neurobehavior in small for gestational age (SGA) neonates. Methods Ultrasound and MRI were performed on 91 SGA fetuses at 37 weeks of GA. The frontal lobe, basal ganglia, mesencephalon and cerebellum were delineated from the fetal MRIs. SGA neonates underwent the NBAS test and were classified as abnormal if ≥1 area was <5th centile and as normal if all areas were >5th centile. Textural features associated with neurodevelopment were selected and machine learning was used to build a predictive algorithm. Results Of the 91 SGA neonates, 49 were classified as normal and 42 as abnormal. The accuracies for predicting abnormal neurobehavior based on TA were 95.12% for the frontal lobe, 95.56% for the basal ganglia, 93.18% for the mesencephalon and 83.33% for the cerebellum. Conclusions Fetal brain MRI textural patterns were associated with neonatal neurodevelopment. Brain MRI TA could be a useful tool to predict abnormal neurodevelopment in SGA. PMID:23922750
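
    Texture analysis of an image region typically starts from features such as gray-level co-occurrence statistics; a self-contained numpy sketch of a few GLCM features (the paper's exact feature set and classifier are not specified here):

        import numpy as np

        def glcm_features(img, levels=16, dx=1, dy=0):
            """Contrast, energy and homogeneity from a gray-level co-occurrence matrix."""
            q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
            glcm = np.zeros((levels, levels))
            a = q[:q.shape[0] - dy, :q.shape[1] - dx]   # reference pixels
            b = q[dy:, dx:]                             # neighbours at offset (dx, dy)
            np.add.at(glcm, (a.ravel(), b.ravel()), 1)
            p = glcm / glcm.sum()
            i, j = np.indices(p.shape)
            return {"contrast": np.sum(p * (i - j) ** 2),
                    "energy": np.sum(p ** 2),
                    "homogeneity": np.sum(p / (1.0 + np.abs(i - j)))}

        roi = np.random.default_rng(11).integers(0, 255, (64, 64))  # placeholder brain ROI
        print(glcm_features(roi))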

  17. SpotMetrics: An Open-Source Image-Analysis Software Plugin for Automatic Chromatophore Detection and Measurement.

    PubMed

    Hadjisolomou, Stavros P; El-Haddad, George

    2017-01-01

    Coleoid cephalopods (squid, octopus, and sepia) are renowned for their elaborate body patterning capabilities, which are employed for camouflage or communication. The specific chromatic appearance of a cephalopod, at any given moment, is a direct result of the combined action of their intradermal pigmented chromatophore organs and reflecting cells. Therefore, a lot can be learned about the cephalopod coloration system by video recording and analyzing the activation of individual chromatophores in time. The fact that adult cephalopods have small chromatophores, up to several hundred thousand in number, makes measurement and analysis over several seconds a difficult task. However, current advancements in videography enable high-resolution and high framerate recording, which can be used to record chromatophore activity in more detail and accuracy in both space and time domains. In turn, the additional pixel information and extra frames per video from such recordings result in large video files of several gigabytes, even when the recording spans only few minutes. We created a software plugin, "SpotMetrics," that can automatically analyze high resolution, high framerate video of chromatophore organ activation in time. This image analysis software can track hundreds of individual chromatophores over several hundred frames to provide measurements of size and color. This software may also be used to measure differences in chromatophore activation during different behaviors which will contribute to our understanding of the cephalopod sensorimotor integration system. In addition, this software can potentially be utilized to detect numbers of round objects and size changes in time, such as eye pupil size or number of bacteria in a sample. Thus, we are making this software plugin freely available as open-source because we believe it will be of benefit to other colleagues both in the cephalopod biology field and also within other disciplines.

  18. Fully automatic analysis of the knee articular cartilage T1ρ relaxation time using voxel-based relaxometry.

    PubMed

    Pedoia, Valentina; Li, Xiaojuan; Su, Favian; Calixto, Nathaniel; Majumdar, Sharmila

    2016-04-01

    To develop a fully automatic, local, and unbiased way of studying the knee T1ρ relaxation time by creating an atlas and using voxel-based relaxometry (VBR), and to compare it with the classical region-of-interest (ROI)-based approach in osteoarthritis (OA) and anterior cruciate ligament (ACL) subjects. In this study, 110 subjects from two cohorts were analyzed: 1) a mild-OA cohort of 40 patients with Kellgren-Lawrence (KL) grade ≤ 2 and 15 controls with KL ≤ 1; 2) an ACL cohort (a model for early OA) of 40 ACL-injured patients imaged prior to ACL reconstruction and 1 year postsurgery, and 15 controls. All subjects were acquired at 3T with a protocol that included 3D-FSE (CUBE) and 3D-T1ρ sequences. A nonrigid registration technique was applied to align all the images on a single template, which allows VBR to assess local statistical differences of T1ρ values using z-score analysis. VBR results were compared with those obtained with the classical ROI-based technique. ROI-based results from atlas-based segmentation were consistent with the classical ROI-based method (coefficient of variation [CV] = 3.83%). Voxel-based group analysis revealed local patterns that were overlooked by the ROI-based approach; e.g., VBR showed significant T1ρ elevations in the posterior lateral femur and posterior lateral tibia in ACL-injured patients (sample mean z-scores = 9.7 and 10.3). Those elevations were overlooked by the classical ROI-based approach (sample mean z-scores = 1.87 and -1.73). CONCLUSION: VBR is a feasible and accurate tool for the local evaluation of the biochemical composition of knee articular cartilage and is capable of detecting specific local patterns on T1ρ maps in OA and ACL subjects. © 2015 Wiley Periodicals, Inc.
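
    After spatial normalization to the atlas, the group comparison in VBR reduces to per-voxel z-scores against the control distribution; a minimal numpy sketch with placeholder registered maps:

        import numpy as np

        rng = np.random.default_rng(12)
        controls = rng.normal(40, 4, (15, 32, 32, 16))   # placeholder registered T1rho maps (ms)
        patients = rng.normal(43, 4, (40, 32, 32, 16))

        mu = controls.mean(axis=0)
        sd = controls.std(axis=0, ddof=1)
        z_maps = (patients - mu) / sd                    # per-patient voxel-wise z-scores
        mean_z = z_maps.mean(axis=0)                     # sample mean z-score map

        elevated = mean_z > 2.0                          # voxels with prolonged T1rho
        print(elevated.mean())                           # fraction of cartilage flagged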

  19. SpotMetrics: An Open-Source Image-Analysis Software Plugin for Automatic Chromatophore Detection and Measurement

    PubMed Central

    Hadjisolomou, Stavros P.; El-Haddad, George

    2017-01-01

    Coleoid cephalopods (squid, octopus, and sepia) are renowned for their elaborate body patterning capabilities, which are employed for camouflage or communication. The specific chromatic appearance of a cephalopod, at any given moment, is a direct result of the combined action of their intradermal pigmented chromatophore organs and reflecting cells. Therefore, a lot can be learned about the cephalopod coloration system by video recording and analyzing the activation of individual chromatophores in time. The fact that adult cephalopods have small chromatophores, up to several hundred thousand in number, makes measurement and analysis over several seconds a difficult task. However, current advancements in videography enable high-resolution and high framerate recording, which can be used to record chromatophore activity in more detail and accuracy in both space and time domains. In turn, the additional pixel information and extra frames per video from such recordings result in large video files of several gigabytes, even when the recording spans only few minutes. We created a software plugin, “SpotMetrics,” that can automatically analyze high resolution, high framerate video of chromatophore organ activation in time. This image analysis software can track hundreds of individual chromatophores over several hundred frames to provide measurements of size and color. This software may also be used to measure differences in chromatophore activation during different behaviors which will contribute to our understanding of the cephalopod sensorimotor integration system. In addition, this software can potentially be utilized to detect numbers of round objects and size changes in time, such as eye pupil size or number of bacteria in a sample. Thus, we are making this software plugin freely available as open-source because we believe it will be of benefit to other colleagues both in the cephalopod biology field and also within other disciplines. PMID:28298896

  20. Fully Automatic Analysis of the Knee Articular Cartilage T1ρ relaxation time using Voxel Based Relaxometry

    PubMed Central

    Pedoia, Valentina; Li, Xiaojuan; Su, Favian; Calixto, Nathaniel; Majumdar, Sharmila

    2016-01-01

    Purpose To develop a fully automatic, local and unbiased way of studying the knee T1ρ relaxation time by creating an atlas and using Voxel Based Relaxometry (VBR) in OA and ACL subjects, and to compare it with the classical ROI-based approach. Materials and Methods In this study, 110 subjects from two cohorts were analyzed: (i) a mild-OA cohort of 40 patients with KL ≤ 2 and 15 controls with KL ≤ 1; (ii) an ACL cohort (a model for early OA) of 40 ACL-injured patients imaged prior to ACL reconstruction and 1 year post-surgery, and 15 controls. All subjects were acquired at 3T with a protocol that includes 3D-FSE (CUBE) and 3D-T1ρ. A non-rigid registration technique was applied to align all the images on a single template, which allows for performing VBR to assess local statistical differences of T1ρ values using z-score analysis. VBR results are compared with those obtained with the classical ROI-based technique. Results ROI-based results from atlas-based segmentation were consistent with the classical ROI-based method (CV = 3.83%). Voxel-based group analysis revealed local patterns that were overlooked by the ROI-based approach; e.g., VBR showed significant T1ρ elevations in the posterior lateral femur and posterior lateral tibia in ACL-injured patients (sample mean z-scores = 9.7 and 10.3). Those elevations were overlooked by the classical ROI-based approach (sample mean z-scores = 1.87 and −1.73). Conclusion VBR is a feasible and accurate tool for the local evaluation of the biochemical composition of knee articular cartilage and is capable of detecting specific local patterns on T1ρ maps in OA and ACL subjects. PMID:26443990

  1. Is automatic CPAP titration as effective as manual CPAP titration in OSAHS patients? A meta-analysis.

    PubMed

    Gao, Weijie; Jin, Yinghui; Wang, Yan; Sun, Mei; Chen, Baoyuan; Zhou, Ning; Deng, Yuan

    2012-06-01

    It is costly and time-consuming to conduct standard manual titration to identify an effective pressure before continuous positive airway pressure (CPAP) treatment for obstructive sleep apnea (OSA) patients. Automatic titration is cheaper and more easily available than manual titration. The purpose of this systematic review was to evaluate the effect of automatic titration, compared with manual titration, in identifying a pressure and on the improvement of the apnea/hypopnea index (AHI) and somnolence, the change in sleep quality, and the acceptance of and compliance with CPAP treatment. A systematic search was made of the PubMed, EMBASE, Cochrane Library, SCI, China Academic Journals Full-text, Chinese Biomedical Literature, Chinese Scientific Journals and Chinese Medical Association Journals databases. Randomized controlled trials comparing automatic titration and manual titration were reviewed. Studies were pooled to yield odds ratios (OR) or mean differences (MD) with 95% confidence intervals (CI). Ten trials involving 849 patients met the inclusion criteria. It is hard to identify a trend in the pressures determined by either automatic or manual titration. Automatic titration can improve the AHI (MD = 0.03/h, 95% CI = -4.48 to 4.53) and the Epworth sleepiness scale (SMD = -0.02, 95% CI = -0.34 to 0.31) as effectively as manual titration. There is no difference in sleep architecture between automatic titration and manual titration. The acceptance of CPAP treatment (OR = 0.96, 95% CI = 0.60 to 1.55) and the compliance with treatment (MD = -0.04, 95% CI = -0.17 to 0.10) after automatic titration do not differ from manual titration. Automatic titration is as effective as standard manual titration in improving AHI and somnolence while maintaining sleep quality similar to the standard method. In addition, automatic titration has the same effect on the acceptance and compliance of CPAP treatment as manual titration. With the potential advantage
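
    The pooling step reported above (MD with 95% CI) follows the standard inverse-variance scheme; a minimal fixed-effect sketch in Python (the review itself may have used a random-effects model for some outcomes):

      import numpy as np

      def pooled_mean_difference(md, se):
          """Inverse-variance (fixed-effect) pooling of study-level mean
          differences md with standard errors se; returns MD and its 95% CI."""
          md = np.asarray(md, float)
          w = 1.0 / np.asarray(se, float) ** 2
          pooled = (w * md).sum() / w.sum()
          se_pooled = np.sqrt(1.0 / w.sum())
          return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)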

  2. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    NASA Astrophysics Data System (ADS)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    Detecting a river's feeding type is a complex and multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path. At the same time it should consider the relationship of the feeding type with the corresponding phase of the water regime. Because of these difficulties and the complexity of the approach, there are many different variants of separating a flow hydrograph into feeding types. The most common method is extraction of the so-called base component, which in one way or another reflects groundwater feeding of the river. In this case the selection is most often based on the principle of local minima or on graphical separation of this component. However, neither the origin of the water nor the corresponding phase of the water regime is then considered. In this paper the authors offer a method of complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia having a pronounced spring flood, formed by meltwater, and summer-autumn and winter low-water periods that are periodically interrupted by rain or thaw flooding. The method is based on the genetic separation of the hydrograph proposed in the 1960s by B. I. Kudelin. This technique is intended for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis incorporates reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles: • Ground feeding during the passage of the flood peak tends to zero • The beginning of the flood is determined as the exceedance of a critical low-water discharge • Flood periods are determined on the basis of exceeding the critical low-water discharge; they relate to thaw in the case of above-zero temperatures • During thaw and rain floods
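
    The local-minima baseline that the authors automate and extend can be sketched as follows (a simplified illustration of the classical graphical method, not Kudelin's full genetic separation):

      import numpy as np

      def baseflow_local_minima(q, window=5):
          """Classical local-minimum separation: take discharge minima within a
          sliding window and interpolate linearly between them."""
          q = np.asarray(q, float)
          half = window // 2
          idx = [i for i in range(len(q))
                 if q[i] == q[max(0, i - half):i + half + 1].min()]
          base = np.interp(np.arange(len(q)), idx, q[idx])
          return np.minimum(base, q)  # baseflow can never exceed total flow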

  3. Automatic Digital Analysis of Chromogenic Media for Vancomycin-Resistant-Enterococcus Screens Using Copan WASPLab

    PubMed Central

    Faron, Matthew L.; Coon, Christopher; Liebregts, Theo; van Bree, Anita; Jansz, Arjan R.; Soucy, Genevieve; Korver, John

    2016-01-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-acquired infections (HAIs). Studies have shown that active surveillance of high-risk patients for VRE colonization can aid in reducing HAIs; however, these screens generate a significant cost to the laboratory and health care system. Digital imaging capable of differentiating negative and “nonnegative” chromogenic agar can reduce the labor cost of these screens and potentially improve patient care. In this study, we evaluated the performance of the WASPLab Chromogenic Detection Module (CDM) (Copan, Brescia, Italy) software to analyze VRE chromogenic agar and compared the results to technologist plate reading. Specimens collected at 3 laboratories were cultured using the WASPLab CDM and plated to each site's standard-of-care chromogenic media, which included Colorex VRE (BioMed Diagnostics, White City, OR) or Oxoid VRE (Oxoid, Basingstoke, United Kingdom). Digital images were scored using the CDM software after 24 or 40 h of growth, and all manual reading was performed using digital images on a high-definition (HD) monitor. In total, 104,730 specimens were enrolled and automation agreed with manual analysis for 90.1% of all specimens tested, with sensitivity and specificity of 100% and 89.5%, respectively. Automation results were discordant for 10,348 specimens, and all discordant images were reviewed by a laboratory supervisor or director. After a second review, 499 specimens were identified as representing missed positive cultures falsely called negative by the technologist, 1,616 were identified as containing borderline color results (negative result but with no package insert color visible), and 8,234 specimens were identified as containing colorimetric pigmentation due to residual matrix from the specimen or yeast (Candida). Overall, the CDM was accurate at identifying negative VRE plates, which comprised 84% (87,973) of the specimens in this study. PMID:27413193

  4. Automatic Digital Analysis of Chromogenic Media for Vancomycin-Resistant-Enterococcus Screens Using Copan WASPLab.

    PubMed

    Faron, Matthew L; Buchan, Blake W; Coon, Christopher; Liebregts, Theo; van Bree, Anita; Jansz, Arjan R; Soucy, Genevieve; Korver, John; Ledeboer, Nathan A

    2016-10-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-acquired infections (HAIs). Studies have shown that active surveillance of high-risk patients for VRE colonization can aid in reducing HAIs; however, these screens generate a significant cost to the laboratory and health care system. Digital imaging capable of differentiating negative and "nonnegative" chromogenic agar can reduce the labor cost of these screens and potentially improve patient care. In this study, we evaluated the performance of the WASPLab Chromogenic Detection Module (CDM) (Copan, Brescia, Italy) software to analyze VRE chromogenic agar and compared the results to technologist plate reading. Specimens collected at 3 laboratories were cultured using the WASPLab CDM and plated to each site's standard-of-care chromogenic media, which included Colorex VRE (BioMed Diagnostics, White City, OR) or Oxoid VRE (Oxoid, Basingstoke, United Kingdom). Digital images were scored using the CDM software after 24 or 40 h of growth, and all manual reading was performed using digital images on a high-definition (HD) monitor. In total, 104,730 specimens were enrolled and automation agreed with manual analysis for 90.1% of all specimens tested, with sensitivity and specificity of 100% and 89.5%, respectively. Automation results were discordant for 10,348 specimens, and all discordant images were reviewed by a laboratory supervisor or director. After a second review, 499 specimens were identified as representing missed positive cultures falsely called negative by the technologist, 1,616 were identified as containing borderline color results (negative result but with no package insert color visible), and 8,234 specimens were identified as containing colorimetric pigmentation due to residual matrix from the specimen or yeast (Candida). Overall, the CDM was accurate at identifying negative VRE plates, which comprised 84% (87,973) of the specimens in this study.
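
    The negative/“nonnegative” triage both records describe reduces, at its core, to asking whether a plate image contains enough pigmented-colony pixels; a toy Python sketch (the hue band and pixel cutoff are illustrative assumptions, not CDM parameters):

      import numpy as np
      import cv2

      def is_nonnegative(plate_bgr, lo=(130, 60, 60), hi=(170, 255, 255), min_pixels=50):
          """Flag a chromogenic plate as 'nonnegative' if enough pixels fall in
          the pigment hue band (pink/magenta is typical of VRE media)."""
          hsv = cv2.cvtColor(plate_bgr, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
          return int(np.count_nonzero(mask)) >= min_pixels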

  5. Regional analysis of volumes and reproducibilities of automatic and manual hippocampal segmentations

    PubMed Central

    Vrenken, Hugo; Bijma, Fetsje; Barkhof, Frederik; van Herk, Marcel; de Munck, Jan C.

    2017-01-01

    Purpose: Precise and reproducible hippocampus outlining is important to quantify hippocampal atrophy caused by neurodegenerative diseases and to spare the hippocampus in whole-brain radiation therapy when performing prophylactic cranial irradiation or treating brain metastases. This study aimed to quantify systematic differences between methods by comparing regional volume and outline reproducibility of manual, FSL-FIRST and FreeSurfer hippocampus segmentations. Materials and Methods: This study used a dataset from ADNI (Alzheimer’s Disease Neuroimaging Initiative), including 20 healthy controls, 40 patients with mild cognitive impairment (MCI), and 20 patients with Alzheimer’s disease (AD). For each subject, back-to-back (BTB) T1-weighted 3D MPRAGE images were acquired at baseline (BL) and 12 months later (M12). The hippocampus segmentations of all methods were converted into triangulated meshes, regional volumes were extracted, and regional Jaccard indices were computed between the hippocampus meshes of paired BTB scans to evaluate reproducibility. Regional volumes and Jaccard indices were modelled as a function of group (G), method (M), hemisphere (H), time-point (T), region (R) and interactions. Results: For the volume data, the model selection procedure yielded the significant main effects G, M, H, T and R and the interaction effects G-R and M-R. The same model was found for the BTB scans. For all methods, volumes decreased with the severity of disease. Significant fixed effects for the regional Jaccard index data were M, R and the interaction M-R. For all methods the middle region was most reproducible, independent of diagnostic group. FSL-FIRST was most and FreeSurfer least reproducible. Discussion/Conclusion: A novel method to perform detailed analysis of subtle differences in hippocampus segmentation is proposed. The method showed that hippocampal segmentation reproducibility was best for FSL-FIRST and worst for FreeSurfer. We also found systematic
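
    The reproducibility measure used above is the Jaccard index; for two segmentations it is simply (shown here on voxel masks rather than the study's regional meshes):

      import numpy as np

      def jaccard(mask_a, mask_b):
          """Jaccard index |A ∩ B| / |A ∪ B| of two boolean segmentation masks."""
          a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
          union = np.logical_or(a, b).sum()
          return np.logical_and(a, b).sum() / union if union else 1.0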

  6. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mifs) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  7. Large-scale tracking and classification for automatic analysis of cell migration and proliferation, and experimental optimization of high-throughput screens of neuroblastoma cells.

    PubMed

    Harder, Nathalie; Batra, Richa; Diessl, Nicolle; Gogolin, Sina; Eils, Roland; Westermann, Frank; König, Rainer; Rohr, Karl

    2015-06-01

    Computational approaches for automatic analysis of image-based high-throughput and high-content screens are gaining increased importance to cope with the large amounts of data generated by automated microscopy systems. Typically, automatic image analysis is used to extract phenotypic information once all images of a screen have been acquired. However, image analysis is also important in earlier stages of large-scale experiments, in particular to support and accelerate the tedious and time-consuming optimization of the experimental conditions and technical settings. We here present a novel approach for automatic, large-scale analysis and experimental optimization with application to a screen on neuroblastoma cell lines. Our approach consists of cell segmentation, tracking, feature extraction, classification, and model-based error correction. The approach can be used for experimental optimization by extracting quantitative information which allows experimentalists to optimally choose and verify the experimental parameters. This involves systematically studying global cell movement and proliferation behavior. Moreover, we performed a comprehensive phenotypic analysis of a large-scale neuroblastoma screen including the detection of rare division events such as multi-polar divisions. Major challenges of the analyzed high-throughput data are the relatively low spatio-temporal resolution in conjunction with densely growing cells, as well as the high variability of the data. To account for the data variability we optimized feature extraction and classification, and introduced a gray value normalization technique as well as a novel approach for automatic model-based correction of classification errors. In total, we analyzed 4,400 real image sequences, covering observation periods of around 120 h each. We performed an extensive quantitative evaluation, which showed that our approach yields high accuracies of 92.2% for segmentation, 98.2% for tracking, and 86.5% for

  8. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    PubMed Central

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms. PMID:26393595

  9. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    PubMed

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  10. [The impact of blood smear preparation on effectiveness of functioning of Vision Hema--the digital system of automatic blood analysis].

    PubMed

    Sosnin, D Iu; Nenasheva, O Iu; Falkov, B F; Trusheva, L A

    2013-04-01

    The article deals with the impact of standardization of blood smear preparation on the effectiveness of the Vision Hema system. The analysis covered the results of counting 200 leukocytes in 30 blood smears prepared from venous blood stabilized with ethylenediaminetetraacetic acid, using thoroughly degreased slides and an automatic smear preparation device (comparison group), and in 49 preparations made manually from non-stabilized capillary blood (control group). Standardization of slide preparation resulted in a fivefold decrease in the total number of artifacts and in the disappearance of platelet aggregates and squamous epithelial cells. The absolute number of destroyed leukocytes decreased 2.4-fold and dirt particles 9.5-fold. The proposed smear preparation technique doubled the speed of automatic leukogram analysis by the Vision Hema system and tripled the speed of physician validation of the derived results.

  11. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    NASA Astrophysics Data System (ADS)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the International Monitoring System (IMS) along with possible treaty violations. Another civil source of radioxenon emissions which contributes to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high-resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km distant from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful to identify the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  12. CIRF Publications, Vol. 12, No. 5.

    ERIC Educational Resources Information Center

    International Labour Office, Geneva (Switzerland).

    CIRF Publications, Vol. 12, No. 5 is a collection of 80 abstracts giving particular attention to education, training, and economic growth in developing countries, Iran, Japan, Kenya, the Solomon Islands, and Sri Lanka; vocational rehabilitation in Italy, Spain, the United Kingdom, and the U. S. A.; agriculture in Chad, developing countries, and…

  13. Automatic detection of a hand-held needle in ultrasound via phase-based analysis of the tremor motion

    NASA Astrophysics Data System (ADS)

    Beigi, Parmida; Salcudean, Septimiu E.; Rohling, Robert; Ng, Gary C.

    2016-03-01

    This paper presents an automatic localization method for a standard hand-held needle in ultrasound based on temporal motion analysis of spatially decomposed data. Subtle displacement arising from tremor motion has a periodic pattern which is usually imperceptible in the intensity image but may convey information in the phase image. Our method aims to detect such periodic motion of a hand-held needle and distinguish it from intrinsic tissue motion, using a technique inspired by video magnification. Complex steerable pyramids allow specific design of the wavelets' orientations according to the insertion angle, as well as measurement of the local phase. We therefore use steerable pairs of even and odd Gabor wavelets to decompose the ultrasound B-mode sequence into various spatial frequency bands. Variations of the local phase measurements in the spatially decomposed input data are then temporally analyzed using a finite impulse response bandpass filter to detect regions with a tremor motion pattern. Results obtained from different pyramid levels are then combined and thresholded to generate the binary mask input for the Hough transform, which determines an estimate of the direction angle and discards some of the outliers. Polynomial fitting is used at the final stage to remove any remaining outliers and improve the trajectory detection. The detected needle is finally added back to the input sequence as an overlay of a cloud of points. We demonstrate the efficiency of our approach in detecting the needle from subtle tremor motion in an agar phantom and in in-vivo porcine cases where intrinsic motion is also present. The localization accuracy was calculated by comparison to expert manual segmentation, and is presented as (mean, standard deviation, root-mean-square error) of (0.93°, 1.26° and 0.87°) for the trajectory and (1.53 mm, 1.02 mm and 1.82 mm) for the tip.
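
    A single-scale stand-in for the pipeline above (a Gabor quadrature pair for local phase, then an FIR bandpass over time) can be sketched in Python; the filter parameters and the 4-10 Hz tremor band are illustrative assumptions:

      import numpy as np
      from skimage.filters import gabor
      from scipy.signal import firwin, filtfilt

      def tremor_energy(frames, fps, angle_deg, f_lo=4.0, f_hi=10.0):
          """Per-pixel energy of tremor-band phase variation.
          frames: (T, H, W) B-mode sequence with T well above ~100 frames;
          angle_deg: expected insertion angle."""
          theta = np.deg2rad(angle_deg)
          phase = []
          for f in frames:
              real, imag = gabor(f, frequency=0.1, theta=theta)  # even/odd pair
              phase.append(np.arctan2(imag, real))               # local phase
          phi = np.unwrap(np.asarray(phase), axis=0)
          taps = firwin(31, [f_lo, f_hi], fs=fps, pass_zero=False)
          band = filtfilt(taps, [1.0], phi, axis=0)              # temporal bandpass
          return (band ** 2).mean(axis=0)  # high values -> candidate needle pixels

    Thresholding such a map yields the kind of binary mask that the Hough transform step consumes.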

  14. The prediction of ICD therapy in multicenter automatic defibrillator implantation trial (MADIT) II like patients: a retrospective analysis.

    PubMed

    Budeus, Marco; Reinsch, Nico; Wieneke, Heinrich; Sack, Stefan; Erbel, Raimund

    2008-04-01

    MADIT II-like patients have not previously been compared, within a single study, to patients without an electrophysiological study (EPS), patients in whom ventricular tachycardia or fibrillation was induced during an EPS, and patients without inducibility in an EPS. The multicenter automatic defibrillator implantation trial (MADIT) II showed a benefit of ICD implantation in patients with ischemic heart disease. We performed a retrospective analysis in 93 patients with ischemic heart disease and an ejection fraction

  15. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    PubMed

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.
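
    For reference, the basic Jacoby (1991) process-dissociation estimates behind such analyses are closed-form (the study itself fits richer independent-retrieval and generate-source models); a worked Python sketch:

      def process_dissociation(inclusion, exclusion):
          """I = C + A(1 - C) and E = A(1 - C), so C = I - E and A = E / (1 - C)."""
          c = inclusion - exclusion
          a = exclusion / (1.0 - c) if c < 1.0 else float("nan")
          return c, a

      # e.g. inclusion = .65, exclusion = .25 -> C = .40, A ≈ .42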

  16. Automatic Chloroplast Movement Analysis.

    PubMed

    Johansson, Henrik; Zeidler, Mathias

    2016-01-01

    In response to low or high intensities of light, the chloroplasts in the mesophyll cells of the leaf are able to increase or decrease their exposure to light by accumulating at the upper and lower sides or along the side walls of the cell respectively. This movement, regulated by the phototropin blue light photoreceptors phot1 and phot2, results in a decreased or increased transmission of light through the leaf. This way the plant is able to optimize harvesting of the incoming light or avoid damage caused by excess light. Here we describe a method that indirectly measures the movement of chloroplasts by taking advantage of the resulting change in leaf transmittance. By using a microplate reader, quantitative measurements of chloroplast accumulation or avoidance can be monitored over time, for multiple samples with relatively little hands-on time.

  17. Automatic transmission

    SciTech Connect

    Miura, M.; Inuzuka, T.

    1986-08-26

    1. An automatic transmission with four forward speeds and one reverse position, is described which consists of: an input shaft; an output member; first and second planetary gear sets each having a sun gear, a ring gear and a carrier supporting a pinion in mesh with the sun gear and ring gear; the carrier of the first gear set, the ring gear of the second gear set and the output member all being connected; the ring gear of the first gear set connected to the carrier of the second gear set; a first clutch means for selectively connecting the input shaft to the sun gear of the first gear set, including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a second clutch means for selectively connecting the input shaft to the sun gear of the second gear set a third clutch means for selectively connecting the input shaft to the carrier of the second gear set including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a first drive-establishing means for selectively preventing rotation of the ring gear of the first gear set and the carrier of the second gear set in only one direction and, alternatively, in any direction; a second drive-establishing means for selectively preventing rotation of the sun gear of the second gear set; and a drum being open to the first planetary gear set, with a cylindrical intermediate wall, an inner peripheral wall and outer peripheral wall and forming the hydraulic servos of the first and third clutch means between the intermediate wall and the inner peripheral wall and between the intermediate wall and the outer peripheral wall respectively.

  18. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    NASA Astrophysics Data System (ADS)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    Accurate knowledge of fabric anisotropy is crucial to understanding the mechanical behavior of snow and firn, but it is also important for understanding metamorphism. The Computer-Integrated Polarization Microscopy (CIP) method used for the fabric analysis was developed by Heilbronner and Pauli in the early 1990s and uses a slightly modified traditional polarization microscope. First developed for quartz, it can be applied to other uniaxial minerals. Up to now this method has mainly been used in structural geology; however, it is also well suited for fabric analysis of snow, firn and ice. The method is based on the analysis of first-order interference-color images acquired with a slightly modified optical polarization microscope, a grayscale camera and a computer. The optical polarization microscope is fitted with high-quality objectives, a rotating stage and two polarizers that can be introduced above and below the thin section, as well as a full-wave plate. Additionally, two quarter-wave plates for circular polarization are needed; alternatively, circular polarization can be created from a set of crossed-polarized images through image processing. A narrow-band interference filter transmitting a wavelength between 660 and 700 nm is also required. Finally, a monochrome digital camera is used to capture the input images. The idea is to record the change of interference colors while the thin section is rotated through 180°. The azimuth and inclination of the c-axis are defined by the color change. Recording the color change through a red filter produces a signal with a well-defined amplitude and phase angle. An advantage of this method lies in the simple conversion of an ordinary optical microscope into a fabric analyzer. The Automatic Ice Texture Analyzer (AITA), the first fully functional instrument to measure c-axis orientation, was developed by Wilson and others (2003). The most recent fabric analysis of snow and firn samples was carried

  19. A theoretical analysis of the effect of time lag in an automatic stabilization system on the lateral oscillatory stability of an airplane

    NASA Technical Reports Server (NTRS)

    Sternfield, Leonard; Gates, Ordway B., Jr.

    1951-01-01

    A method is presented for determining the effect of time lag in an automatic stabilization system on the lateral oscillatory stability of an airplane. The method is based on an analytical-graphical procedure. The critical time lag of the airplane-autopilot system is readily determined from the frequency-response analysis. The method is applied to a typical present-day airplane equipped with an automatic pilot sensitive to yawing acceleration and geared to the rudder so that rudder control is applied in proportion to the yawing acceleration. The results calculated for this airplane-autopilot system by this method are compared with the airplane motions calculated by a step-by-step procedure.
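
    In modern terms the critical time lag falls out of the open-loop frequency response: a pure lag tau contributes phase -omega*tau, so oscillatory instability sets in where the loop gain is unity and the total phase reaches -180 degrees. A numerical Python sketch of that idea (a paraphrase, not the report's analytical-graphical procedure):

      import numpy as np

      def critical_time_lag(omega, g_jw):
          """omega: rad/s grid; g_jw: sampled open-loop response G(j*omega) of the
          airplane-autopilot loop without lag. Returns tau_crit in seconds."""
          i = np.argmin(np.abs(np.abs(g_jw) - 1.0))  # gain-crossover sample
          return (np.pi + np.angle(g_jw[i])) / omega[i]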

  20. Sentiment Analysis and Social Cognition Engine (SEANCE): An automatic tool for sentiment, social cognition, and social-order analysis.

    PubMed

    Crossley, Scott A; Kyle, Kristopher; McNamara, Danielle S

    2017-06-01

    This study introduces the Sentiment Analysis and Cognition Engine (SEANCE), a freely available text analysis tool that is easy to use, works on most operating systems (Windows, Mac, Linux), is housed on a user's hard drive (as compared to being accessed via an Internet interface), allows for batch processing of text files, includes negation and part-of-speech (POS) features, and reports on thousands of lexical categories and 20 component scores related to sentiment, social cognition, and social order. In the study, we validated SEANCE by investigating whether its indices and related component scores can be used to classify positive and negative reviews in two well-known sentiment analysis test corpora. We contrasted the results of SEANCE with those from Linguistic Inquiry and Word Count (LIWC), a similar tool that is popular in sentiment analysis, but is pay-to-use and does not include negation or POS features. The results demonstrated that both the SEANCE indices and component scores outperformed LIWC on the categorization tasks.

  1. A computer program to automatically generate state equations and macro-models. [for network analysis and design

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.; Bowers, J. C.; Oreilly, J. E., Jr.

    1978-01-01

    A computer program, PROSE, that produces nonlinear state equations from a simple topological description of an electrical or mechanical network is described. Unnecessary states are also automatically eliminated, so that a simplified terminal circuit model is obtained. The program also prints out the eigenvalues of a linearized system and the sensitivities of the eigenvalue of largest magnitude.

  2. How well Do Phonological Awareness and Rapid Automatized Naming Correlate with Chinese Reading Accuracy and Fluency? A Meta-Analysis

    ERIC Educational Resources Information Center

    Song, Shuang; Georgiou, George K.; Su, Mengmeng; Hua, Shu

    2016-01-01

    Previous meta-analyses on the relationship between phonological awareness, rapid automatized naming (RAN), and reading have been conducted primarily in English, an atypical alphabetic orthography. Here, we aimed to examine the association between phonological awareness, RAN, and word reading in a nonalphabetic language (Chinese). A random-effects…

  5. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is the simplest and most commonly used microscopic method for observing unstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single-cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light enough for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. Compared to commonly used segmentation approaches, MSER required negligible preoptimization steps, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions and, due to their relatively light computational requirements, they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells.
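
    The detector-tracker pairing described above maps directly onto OpenCV primitives; a condensed sketch (default MSER parameters and a constant-velocity motion model are illustrative choices, not the paper's tuned settings):

      import numpy as np
      import cv2

      mser = cv2.MSER_create()

      def detect_centroids(gray):
          """Centroids of maximally stable extremal regions in one frame."""
          regions, _ = mser.detectRegions(gray)
          return [r.mean(axis=0) for r in regions]  # each region is an (N, 2) point set

      def make_track(xy):
          """Constant-velocity Kalman filter initialized at a detection."""
          kf = cv2.KalmanFilter(4, 2)  # state x, y, vx, vy; measures x, y
          kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                          [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
          kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
          kf.processNoiseCov = 1e-2 * np.eye(4, dtype=np.float32)
          kf.measurementNoiseCov = np.eye(2, dtype=np.float32)
          kf.statePost = np.array([[xy[0]], [xy[1]], [0.0], [0.0]], np.float32)
          return kf

      # Per frame: predict() every track, assign each detection to the nearest
      # prediction within a gate, and correct() the matched filter with it.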

  6. Automatic analysis of stereoscopic GOES/GOES and GOES/NOAA image pairs for measurement of hurricane cloud top height and structure

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Strong, J.; Pierce, H.; Woodward, R. H.

    1989-01-01

    Results are presented from a baseline study using a synthetic stereo image pair to test the Automatic Stereo Analysis (ASA) technique for reproducing cloud-top structure. The ASA analysis, display, and calibration procedures are described. A GEO/LEO (GOES/NOAA AVHRR) image pair from Hurricane Allen in 1980 is used to illustrate the results that can be obtained using the ASA technique. Also, results are presented from applying the ASA technique to a GEO/GEO (GOES/GOES) image pair of Hurricane Gilbert in 1988.

  8. AUTOMATIC NAVIGATION.

    DTIC Science & Technology

    NAVIGATION, REPORTS), (*CONTROL SYSTEMS, *INFORMATION THEORY), ABSTRACTS, OPTIMIZATION, DYNAMIC PROGRAMMING, GAME THEORY, NONLINEAR SYSTEMS, CORRELATION TECHNIQUES, FOURIER ANALYSIS, INTEGRAL TRANSFORMS, DEMODULATION, NAVIGATION CHARTS, PATTERN RECOGNITION, DISTRIBUTION THEORY, TIME SHARING, GRAPHICS, DIGITAL COMPUTERS, FEEDBACK, STABILITY

  9. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer

    PubMed Central

    2013-01-01

    Background: In radiation oncology, recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. Using the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. Methods: After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Results: Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border (out-of-field). With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within a 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm radius; however, this might be due to very rapid growth and/or late detection of the tumor progression. Conclusion: The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition. PMID:24499557
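
    On co-registered grids, the key figure reported above (percentage of the recurrence volume inside the 80% isodose) is a one-liner; a sketch under the assumption that the dose grid and masks share one geometry:

      import numpy as np

      def recurrence_in_isodose(recurrence_mask, dose, prescription, level=0.8):
          """Percent of recurrence voxels receiving >= level * prescription dose."""
          rec = np.asarray(recurrence_mask, bool)
          return 100.0 * np.count_nonzero(dose[rec] >= level * prescription) / rec.sum()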

  10. Sleep apnea diagnosis using an ECG Holter device including a nasal pressure (NP) recording: validation of visual and automatic analysis of nasal pressure versus full polysomnography.

    PubMed

    Pépin, Jean-Louis; Defaye, Pascal; Vincent, Elodie; Christophle-Boulard, Sylvain; Tamisier, Renaud; Lévy, Patrick

    2009-06-01

    New simplified techniques for diagnosing sleep apnea should be specially tailored for easy use in cardiologic practice. We dedicated one of the channels of a Holter electrocardiogram (ECG) device (SpiderView, ELA Medical, France) to nasal pressure (NP) recordings. We also developed an automatic analysis of the NP signal providing an apnea-hypopnea index (AHI) for physicians without expertise in sleep medicine. Thirty-four unselected patients referred for symptoms suggesting sleep apnea underwent polysomnography (PSG) with simultaneous NP and Holter ECG recordings. An expert blinded to the PSG results visually scored the Holter plus NP recordings. The AHI obtained by PSG (AHI-PSG) was compared, respectively, to the AHI-NP obtained by visual analysis and by automatic analysis (AHI-NP Auto) of the Holter ECG nasal pressure. In 10 randomly selected subjects (development set), the best cut-off on the Holter ECG for diagnosing sleep apnea patients, as defined by AHI > 20/h in PSG, was determined at 35 events/h by receiver operating characteristic (ROC) analysis. Prospective testing of this threshold was then performed in 19 subjects (test set). For visually scored recordings of Holter ECG plus NP, we obtained a negative predictive value (NPV) of 80% and a positive predictive value (PPV) of 100% for sleep apnea. The area under the ROC curve was 0.97. For the automatic analysis, the NPV was 86% and the PPV 100%. The area under the ROC curve was 0.85. NP recording using a Holter system is an efficient and easy-to-use tool for screening for sleep-disordered breathing in routine cardiology practice.
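
    The development-set/test-set logic above can be sketched as follows (a plain Youden-index cutoff search; the abstract does not detail the original cutoff-selection criterion):

      import numpy as np

      def best_cutoff(scores, truth):
          """Cutoff maximizing Youden's J = sensitivity + specificity - 1.
          scores: per-subject event rates; truth: boolean disease labels."""
          ths = np.unique(scores)
          sens = np.array([(scores[truth] >= t).mean() for t in ths])
          spec = np.array([(scores[~truth] < t).mean() for t in ths])
          return ths[np.argmax(sens + spec - 1.0)]

      def ppv_npv(scores, truth, cutoff):
          """Predictive values of a fixed cutoff on an independent test set."""
          pred = scores >= cutoff
          ppv = truth[pred].mean() if pred.any() else float("nan")
          npv = (~truth[~pred]).mean() if (~pred).any() else float("nan")
          return ppv, npv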

  11. Abundance analysis of targets for the COROT/MONS asteroseismology missions. I. Semi-automatic abundance analysis of the gamma Dor star HD 49434

    NASA Astrophysics Data System (ADS)

    Bruntt, H.; Catala, C.; Garrido, R.; Rodríguez, E.; Stütz, C.; Knoglinger, P.; Mittermayer, P.; Bouret, J. C.; Hua, T.; Lignières, F.; Charpinet, S.; Van't Veer-Menneret, C.; Ballereau, D.

    2002-07-01

    One of the goals of the ground-based support program for the COROT and MONS/Römer satellite missions is to select and characterise suitable target stars for the part of the missions dedicated to asteroseismology. While the global atmospheric parameters may be determined with good accuracy from the Strömgren indices, careful abundance analysis must be made for the proposed main targets. This is a time consuming process considering the long list of primary and secondary targets. We have therefore developed new software called VWA for this task. The VWA automatically selects the least blended lines from the atomic line database VALD, and consequently adjusts the abundance in order to find the best match between the calculated and observed spectra. The variability of HD 49434 was discovered as part of COROT ground-based support observations. Here we present a detailed abundance analysis of HD 49434 using VWA. For most elements we find abundances somewhat below the Solar values, in particular we find [Fe/H] = -0.13 +/- 0.14. We also present the results from the study of the variability that is seen in spectroscopic and photometric time series observations. From the characteristics of the variation seen in photometry and in the line profiles we propose that HD 49434 is a variable star of the gamma Doradus type. Based on observations obtained at Observatoire d'Haute Provence, France and at the Observatory of Sierra Nevada, Granada, Spain.

  12. Automatic vs semi-automatic global cardiac function assessment using 64-row CT

    PubMed Central

    Greupner, J; Zimmermann, E; Hamm, B; Dewey, M

    2012-01-01

    Objective: Global cardiac function assessment using multidetector CT (MDCT) is time-consuming. Therefore we sought to compare an automatic software tool with an established semi-automatic method. Methods: A total of 36 patients underwent CT with 64×0.5 mm detector collimation, and global left ventricular function was subsequently assessed by two independent blinded readers using both an automatic region-growing-based software tool (with and without manual adjustment) and an established semi-automatic software tool. We also analysed automatic motion mapping to identify end-systole. Results: The time needed for assessment using the semi-automatic approach (12:12±6:19 min) was reduced by 75–85% with the automatic software tool (unadjusted: 01:34±0:29 min; adjusted: 02:53±1:19 min; both p<0.001). There was good correlation (r=0.89; p<0.001) for the ejection fraction (EF) between the adjusted automatic (58.6±14.9%) and the semi-automatic (58.0±15.3%) approaches. The manually adjusted automatic approach also led to significantly smaller limits of agreement than the unadjusted automatic approach for end-diastolic volume (±36.4 ml vs ±58.5 ml, p>0.05). Using motion mapping to automatically identify end-systole reduced analysis time by 95% compared with the semi-automatic approach, but showed inferior precision for EF and end-systolic volume. Conclusion: Automatic function assessment using MDCT with manual adjustment shows good agreement with an established semi-automatic approach, while reducing the analysis time by 75% to less than 3 min. This suggests that automatic CT function assessment with manual correction may be used for fast, comfortable and reliable evaluation of global left ventricular function. PMID:22045953
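
    Whichever tool segments the ventricle, the reported global functional parameter reduces to the same formula; a trivial worked sketch:

      def ejection_fraction(edv_ml, esv_ml):
          """EF (%) from end-diastolic and end-systolic LV volumes."""
          return 100.0 * (edv_ml - esv_ml) / edv_ml

      # e.g. EDV = 120 ml, ESV = 50 ml -> EF ≈ 58.3%, in the range reported above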

  13. Automatic single-step quick, easy, cheap, effective, rugged and safe sample preparation devices for analysis of pesticide residues in foods.

    PubMed

    Wang, Jishi; He, Zeying; Wang, Lu; Xu, Yaping; Peng, Yi; Liu, Xiaowei

    2017-09-18

    In this research, the manual two-step QuEChERS approach has been streamlined and automated into a one-step method using a cleanup tube fitted within an extraction tube. A novel automatic QuEChERS combination has been developed to simplify the QuEChERS procedures and improve sample preparation efficiency. This combination integrates the QuEChERS procedures into a single run via the use of a vortex vibration-centrifuge device and a centrifuge filtration tube. To validate the efficiency of our automatic QuEChERS device, 270 pesticides were analyzed in foods of plant origin, including celery, tomatoes, leeks, eggplants, grapes, corn, green tea, and soybean oil, using this automatic platform. The results were then compared with those obtained using the manual QuEChERS method. Different parameters were validated and compared, including recovery, linearity, repeatability and limits of quantification (LOQ). Satisfactory results, comparable to those obtained using the manual QuEChERS method, were achieved. The average recoveries ranged between 70% and 120% for most pesticides, with associated relative standard deviations (RSDs) <20% (n=5), indicating satisfactory accuracy and repeatability. An LOQ of 2 μg/kg was obtained for most pesticides present in celery and corn matrices, and the correlation coefficients (r(2)) were >0.990 within a linearity range of 2-500 μg/kg. Compared to manual QuEChERS, this novel automatic QuEChERS device and combination can significantly improve the sample preparation efficiency for the multiresidue analysis of pesticides. Copyright © 2017. Published by Elsevier B.V.

  14. System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator

    DTIC Science & Technology

    2006-08-01

    System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator, Jae-Jun Kim and Brij N. Agrawal, Department of… Cited references include: Journal of Guidance, Control, and Dynamics, Vol. 20, No. 4, July-August 1997, pp. 625-632; and Schwartz, J. L. and Hall, C. D., “System Identification of a Spherical Air-Bearing

  15. Cognitions, emotions, and sexual response: analysis of the relationship among automatic thoughts, emotional responses, and sexual arousal.

    PubMed

    Nobre, Pedro J; Pinto-Gouveia, José

    2008-08-01

    The relationship between automatic thoughts and emotions presented during sexual activity and their correlation with sexual arousal was investigated. A total of 491 individuals (163 women and 232 men without sexual problems and 47 women and 49 men with a DSM-IV diagnosis of sexual dysfunction) completed the Sexual Modes Questionnaire (SMQ; Nobre and Pinto-Gouveia, Journal of Sex Research, 40, 368-382, 2003). Results indicated several significant correlations among automatic thoughts, emotions, and sexual arousal. Erection concern thoughts in the men and failure/disengagement thoughts and lack of erotic thoughts in the women presented the most significant negative correlations with sexual arousal. Additionally, sadness and disillusion were positively related to these negative cognitions and negatively associated with sexual arousal in both sexes. On the other hand, pleasure and satisfaction were negatively associated with the above-mentioned negative cognitions and positively associated with subjective sexual arousal in both men and women. Overall, findings support the hypothesis that cognitive, emotional, and behavioral dimensions are closely linked and suggest a mode typical of sexual dysfunction composed of negative automatic thoughts, depressive affect, and low subjective sexual arousal.

  16. Analysis of biases from parallel observations of co-located manual and automatic weather stations in Indonesia

    NASA Astrophysics Data System (ADS)

    Sopaheluwakan, Ardhasena; Fajariana, Yuaning; Satyaningsih, Ratna; Aprilina, Kharisma; Astuti Nuraini, Tri; Ummiyatul Badriyah, Imelda; Lukita Sari, Dyah; Haryoko, Urip

    2017-04-01

    Inhomogeneities are often found in long records of climate data. These can occur for various reasons, such as relocation of the observation site, changes in observation method, and the transition to automated instruments. The shift to automated systems is inevitable and is taking place worldwide in many National Meteorological Services. However, this change of observational practice must be made cautiously, and a sufficient period of parallel observation with co-located manual and automated systems should take place, as suggested by the World Meteorological Organization. With a sufficient parallel observation period, biases between the two systems can be analyzed. In this study we analyze the biases from a year-long parallel observation of manual and automatic weather stations at 30 locations in Indonesia. The sites span approximately 45 degrees of longitude from west to east, covering different climate characteristics and geographical settings. We study temperature and rainfall measurements taken by both systems. We found that the biases between the two systems vary from place to place and depend more on the setting of the instrument than on climatic or geographical factors. For instance, daytime observations of the automatic weather stations are found to be consistently higher than the manual observations and, vice versa, night-time observations of the automatic weather stations are lower than the manual observations.

  17. Comparative analysis of different implementations of a parallel algorithm for automatic target detection and classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio; Plaza, Javier

    2009-08-01

    Automatic target detection in hyperspectral images is a task that has attracted a lot of attention recently. In the last few years, several algorithms have been developed for this purpose, including the well-known RX algorithm for anomaly detection and the automatic target detection and classification algorithm (ATDCA), which uses an orthogonal subspace projection (OSP) approach to extract a set of spectrally distinct targets automatically from the input hyperspectral data. Depending on the complexity and dimensionality of the analyzed image scene, the target/anomaly detection process may be computationally very expensive, a fact that limits the possibility of utilizing this process in time-critical applications. In this paper, we develop computationally efficient parallel versions of both the RX and ATDCA algorithms for near-real-time exploitation. In the case of ATDCA, we use several distance metrics in addition to the OSP approach. The parallel versions are quantitatively compared in terms of target detection accuracy, using hyperspectral data collected by NASA's Airborne Visible/Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center in New York, five days after the terrorist attack of September 11th, 2001, and also in terms of parallel performance, using a massively parallel Beowulf cluster available at NASA's Goddard Space Flight Center in Maryland.
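
    The RX anomaly detector at the heart of the comparison is the Mahalanobis distance of each pixel spectrum from the scene statistics; a compact serial reference version in Python (the paper's contribution is the parallelization, which is not shown here):

      import numpy as np

      def rx_scores(cube):
          """Global RX score per pixel. cube: (rows, cols, bands)."""
          h, w, b = cube.shape
          x = cube.reshape(-1, b).astype(float)
          d = x - x.mean(axis=0)
          cov_inv = np.linalg.pinv(np.cov(x, rowvar=False))
          return np.einsum("ij,jk,ik->i", d, cov_inv, d).reshape(h, w)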

  18. Determination of urinary levels of leukotriene B(4) using a highly specific and sensitive methodology based on automatic MEPS combined with UHPLC-PDA analysis.

    PubMed

    Perestrelo, Rosa; Silva, Catarina L; Câmara, José S

    2015-11-01

    Leukotriene B4 (LTB4) is a potent mediator of inflammation and plays a key role in the pathophysiology of chronic asthma. Detectable urinary levels of LTB4 arise from the activation of leukotriene pathways. In this study, an ultra-fast, selective and sensitive analytical method based on the semi-automatic microextraction by packed sorbents (MEPS) technique, using a new digitally controlled syringe (eVol®) combined with ultra-high-pressure liquid chromatography (UHPLC), is proposed for the measurement of urinary LTB4 (U-LTB4) levels in a group of asthmatic patients (APs) and healthy controls (CTRL). Important parameters affecting MEPS performance, namely sorbent type, number of extraction cycles (extract-discard) and elution volume, were evaluated. The optimal experimental conditions among those investigated for the quantification of U-LTB4 in urine samples were as follows: porous graphitic carbon sorbent (PGC), 10 extraction cycles (10×250 μL of sample), and LTB4 elution with 100 μL of acetonitrile. The optimum UHPLC conditions were a mobile phase consisting of 95% (v/v) acidified aqueous solution and 5% (v/v) acetonitrile, a flow rate of 500 µL/min, and a column temperature of 37±0.1 °C. Under optimized conditions the proposed method exhibited good selectivity and sensitivity, with an LOD of 0.37 ng/mL and an LOQ of 1.22 ng/mL. Recoveries ranged from 86.4 to 101.1% for LTB4, with relative standard deviations (% RSD) no larger than 5%. In addition, the method also afforded good results in terms of linearity (r(2)>0.995) within the established concentration range, with a residual deviation for each calibration point below 6%, and intra- and inter-day repeatability in urine samples with RSD values lower than 4 and 5%, respectively. The application of the method to urine samples revealed a tendency towards increased urinary LTB4 levels in APs (5.42±0.17 ng/mL) compared to the CTRL group (from ND to 1.9 ng/mL). Urinary measurement of LTB4 may be an
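
    For the figures of merit quoted above, one common convention derives LOD and LOQ from the calibration line; a sketch under that ICH-style assumption (the authors' exact criterion is not stated in the abstract):

      import numpy as np

      def lod_loq(conc, resp):
          """LOD = 3.3*sigma/S, LOQ = 10*sigma/S, with S the calibration slope
          and sigma the residual standard deviation of the linear fit."""
          conc, resp = np.asarray(conc, float), np.asarray(resp, float)
          slope, intercept = np.polyfit(conc, resp, 1)
          sigma = (resp - (slope * conc + intercept)).std(ddof=2)
          return 3.3 * sigma / slope, 10.0 * sigma / slope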

  19. Dose equations for tube current modulation in CT scanning and the interpretation of the associated CTDI{sub vol}

    SciTech Connect

    Dixon, Robert L.; Boone, John M.

    2013-11-15

    Purpose: The scanner-reported CTDI_vol for automatic tube current modulation (TCM) has a different physical meaning from the traditional CTDI_vol at constant mA, resulting in the dichotomy “CTDI_vol of the first and second kinds” for which a physical interpretation is sought in hopes of establishing some commonality between the two. Methods: Rigorous equations are derived to describe the accumulated dose distributions for TCM. A comparison with formulae for the scanner-reported CTDI_vol clearly identifies the source of their differences. Graphical dose simulations are also provided for a variety of TCM tube current distributions (including constant mA), all having the same scanner-reported CTDI_vol. Results: These convolution equations and simulations show that the local dose at z depends only weakly on the local tube current i(z) due to the strong influence of scatter from all other locations along z, and that the “local CTDI_vol(z)” does not represent a local dose but rather only a relative i(z) ≡ mA(z). TCM is a shift-variant technique to which the CTDI paradigm does not apply, and its application to TCM leads to a CTDI_vol of the second kind which lacks relevance. Conclusions: While the traditional CTDI_vol at constant mA conveys useful information (the peak dose at the center of the scan length), CTDI_vol of the second kind conveys no useful information about the associated TCM dose distribution it purportedly represents, and its physical interpretation remains elusive. On the other hand, the total energy absorbed E (“integral dose”) as well as its surrogate DLP remain robust between variable-i(z) TCM and constant-current i_0 techniques, both depending only on the total mAs = i_0 t_0 during the beam-on time t_0.
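
    The convolution structure referred to above can be written compactly (a schematic form consistent with the abstract's description; Dixon and Boone's exact normalization and integration limits may differ, and the symbols h, v and L are assumptions):

      \[
        D(z) \;=\; \frac{1}{v}\int_{-L/2}^{L/2} i(z')\,h(z - z')\,dz',
        \qquad
        E \;\propto\; \int_{-L/2}^{L/2} \frac{i(z')}{v}\,dz' \;=\; \text{total mAs},
      \]

    where h(z) is the single-rotation dose-spread kernel per unit mAs, v the table speed, and L the scan length. The weak dependence of the local dose D(z) on the local current i(z) is exactly the smoothing action of the wide scatter kernel h.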

  20. A semi-automatic measurement system based on digital image analysis for the application to the single fiber fragmentation test

    NASA Astrophysics Data System (ADS)

    Blobel, Swen; Thielsch, Karin; Ulbricht, Volker

    2013-04-01

    The computational prediction of the effective macroscopic material behavior of fiber-reinforced composites is a goal of research to exploit the potential of these materials. Besides the mechanical characteristics of the material components, extensive knowledge of the mechanical interaction between these components is necessary in order to set up suitable models of the local material structure. For example, an experimental investigation of the micromechanical damage behavior of simplified composite specimens can help to understand the mechanisms which cause matrix and interface damage in the vicinity of a fiber fracture. To realize an appropriate experimental setup, a novel semi-automatic measurement system based on the analysis of digital images using photoelasticity and image correlation was developed. Applied to specimens with a birefringent matrix material, it is able to provide global and local information on the damage evolution and the stress and strain state at the same time. The image acquisition is accomplished using long-distance microscope optics with an effective resolution of two micrometers per pixel. While the system is moved along the domain of interest of the specimen, the acquired images are assembled online and used to interpret optically extracted information in combination with global force-displacement curves provided by the load frame. The illumination of the specimen with circularly polarized light and the projection of the transmitted light through different configurations of polarizer and quarter-wave plates enable the synchronous capturing of four images on the quadrants of a four-megapixel image sensor. The fifth image is decoupled from the same optical path and is projected onto a second camera chip to obtain a non-polarized image of the same scene at the same time. The benefit of this optical setup is the opportunity to extract a wide range of information locally, without influencing the progress of the experiment. The four images