Sample records for automatic vol analysis

  1. volBrain: An Online MRI Brain Volumetry System

    PubMed Central

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is growing rapidly, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that provides accurate volumetric information at different levels of detail in a short time. The method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  2. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    Abstract identical to record 1 above (PubMed version of the same paper).

  3. Automatic imitation: A meta-analysis.

    PubMed

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains.
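
    The effect size reported above, g_z, is the standard repeated-measures estimator: the mean compatibility effect divided by the standard deviation of the paired differences, with Hedges' small-sample correction. This is a hedged sketch of that textbook formula; the reaction times are hypothetical and the meta-analysis' exact estimator may differ in detail.

```python
# Hedged sketch: repeated-measures effect size g_z (Cohen's d_z with
# Hedges' small-sample correction J = 1 - 3/(4*df - 1)).
# The reaction-time data below are hypothetical.
from statistics import mean, stdev

def hedges_gz(incompatible_rts, compatible_rts):
    diffs = [i - c for i, c in zip(incompatible_rts, compatible_rts)]
    n = len(diffs)
    dz = mean(diffs) / stdev(diffs)       # Cohen's d_z
    j = 1 - 3 / (4 * (n - 1) - 1)         # small-sample correction
    return j * dz

# Hypothetical per-participant reaction times (ms) for one experiment
inc = [520, 540, 515, 560, 530]
comp = [500, 525, 498, 538, 512]
print(round(hedges_gz(inc, comp), 2))
```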

  4. Automatic Geo-location Correction of Satellite Imagery

    DTIC Science & Technology

    2014-09-25

    [Fragmentary indexing snippet. Recoverable content: the report uses the Rational Polynomial Coefficient (RPC) model to represent both the internal and external orientation of a satellite image, and cites work on the orientation of large stereo satellite image blocks (Int. Arch. Photogrammetry and Remote Sensing Spatial Inf. Sci., vol. 39, pp. 209-214, 2012), Applications of Digital Image Processing VI (vol. 432, 1983), and E. M. Mikhail, J. S. Bethel, and J. C. McGlone, Introduction to Modern Photogrammetry.]

  5. Service Vessel Analysis. Vol. II : Detailed District Plots.

    DOT National Transportation Integrated Search

    1987-09-01

    This is a supplement to Service Vessel Analysis, Vol. I: Seagoing and Coastal Vessel Requirements for Servicing Aids to Navigation. The material included is not intended to stand alone but is prepared for use in conjunction with the original study. T...

  6. Automatic Fibrosis Quantification By Using a k-NN Classificator

    DTIC Science & Technology

    2001-10-25

    [Fragmentary indexing snippet. Recoverable content: the report presents an automatic algorithm to measure fibrosis in muscle sections of mdx mice, a mutant strain used as a model of Duchenne muscular dystrophy, and cites Fluthrope, "Stages in fiber breakdown in Duchenne muscular dystrophy" (J. Neurol. Sci., vol. 24, pp. 179-186, 1975) and work on muscle fiber degeneration and necrosis in muscular dystrophy and other muscle diseases (Ann. Neurol., vol. 16).]
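
    The record's title names a k-NN classifier. As a minimal illustration of that technique, here is a pure-Python k-nearest-neighbours sketch; the two-channel intensity features and the "fibrosis"/"muscle" labels are hypothetical stand-ins, not the paper's actual feature set.

```python
# Minimal k-NN classifier sketch (pure Python). Each training sample is a
# (feature_vector, label) pair; a query is labelled by majority vote of
# its k nearest training samples (squared Euclidean distance).
from collections import Counter

def knn_predict(train, query, k=3):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical stain-intensity features for image regions
train = [((200, 80), "fibrosis"), ((210, 90), "fibrosis"),
         ((60, 180), "muscle"), ((70, 170), "muscle"), ((65, 175), "muscle")]
print(knn_predict(train, (205, 85)))   # classify a new region
```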

  7. Application of automatic image analysis in wood science

    Treesearch

    Charles W. McMillin

    1982-01-01

    In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...

  8. FAMA: Fast Automatic MOOG Analysis

    NASA Astrophysics Data System (ADS)

    Magrini, Laura; Randich, Sofia; Friel, Eileen; Spina, Lorenzo; Jacobson, Heather; Cantat-Gaudin, Tristan; Donati, Paolo; Baglioni, Roberto; Maiorca, Enrico; Bragaglia, Angela; Sordo, Rosanna; Vallenari, Antonella

    2014-02-01

    FAMA (Fast Automatic MOOG Analysis), written in Perl, computes the atmospheric parameters and abundances of a large number of stars using measurements of equivalent widths (EWs) automatically and independently of any subjective approach. Based on the widely used MOOG code, it simultaneously searches for three equilibria: excitation equilibrium, ionization balance, and the relationship between log n(Fe I) and the reduced EWs. FAMA also evaluates the statistical errors on individual element abundances and the errors due to uncertainties in the stellar parameters. Convergence criteria are not fixed "a priori" but instead are based on the quality of the spectra.
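
    One of the equilibria named above, excitation equilibrium, holds when the least-squares slope of line-by-line Fe I abundance versus excitation potential is consistent with zero. The sketch below shows only that slope test; the line data and the 0.01 dex/eV tolerance are hypothetical, not FAMA's actual convergence thresholds.

```python
# Hedged sketch of an excitation-equilibrium check: fit a least-squares
# slope of Fe I abundance vs. excitation potential; a flat trend suggests
# the effective temperature is acceptable under this criterion.
def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

chi = [0.9, 1.5, 2.2, 2.8, 3.4, 4.1]            # excitation potential (eV)
logn_fe = [7.49, 7.51, 7.50, 7.48, 7.52, 7.50]  # log n(Fe I) per line

print(abs(slope(chi, logn_fe)) < 0.01)   # hypothetical flatness tolerance
```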

  9. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    PubMed

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has repeatedly been shown to be a sensitive and specific parameter. Various automated systems, as well as flow cytometric approaches, have been discussed as replacements for the tedious and time-consuming visual slide analysis procedure. ROBIAS (Robotic Image Analysis System), for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes, was developed at Novartis, where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as the cytotoxicity parameter and for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility of analysing 24 slides within 65 h of automatic analysis over the weekend, together with the high reproducibility of the results, makes automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  10. Paediatric Automatic Phonological Analysis Tools (APAT).

    PubMed

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT show strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since there were previously no valid and reliable tools for automatic phonological analysis that allowed the analysis of different corpora.

  11. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
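
    The core idea of interval error analysis is that every measured quantity is carried as a [lo, hi] range and each arithmetic operation returns the widest possible result. INTLAB itself (cited above) is a MATLAB toolbox; the following is only a toy Python analogue with hypothetical tolerances.

```python
# Toy interval-arithmetic sketch: propagate measurement bounds through
# arithmetic so the result encloses every possible value.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        # The product interval spans the min/max over all endpoint products.
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Hypothetical example: area of a plate measured with +/-0.1 mm tolerance
length = Interval(99.9, 100.1)   # mm
width = Interval(49.9, 50.1)     # mm
print(length * width)            # all possible areas (mm^2)
```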

  12. Automatic emotional expression analysis from eye area

    NASA Astrophysics Data System (ADS)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed with artificial intelligence techniques. As a result of the experimental studies, the six universal emotions (expressions of happiness, sadness, surprise, disgust, anger and fear) were classified at a success rate of 84% using artificial neural networks.
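
    The discrete wavelet transformation mentioned above splits a signal into low-frequency (approximation) and high-frequency (detail) coefficients. This sketch shows one level of the unnormalized Haar variant on a single hypothetical row of pixel intensities; the study's actual wavelet family and feature set are not specified in the abstract.

```python
# One level of an (unnormalized) Haar discrete wavelet transform on a
# 1-D signal: pairwise averages (approximation) and half-differences
# (detail). Input intensities are hypothetical.
def haar_dwt_1d(signal):
    approx = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

row = [10, 12, 14, 18, 40, 42, 8, 6]   # one row of eye-region intensities
approx, detail = haar_dwt_1d(row)
print(approx)  # low-frequency content
print(detail)  # high-frequency content (edges)
```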

  13. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

    Two problems must be solved if the finite element method is to become a reliable and affordable blackbox engineering tool. Finite element meshes must be generated automatically from computer aided design databases and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  14. Thread concept for automatic task parallelization in image analysis

    NASA Astrophysics Data System (ADS)

    Lueckenhaus, Maximilian; Eckstein, Wolfgang

    1998-09-01

    Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. For this we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context, but may follow different threads of execution and work on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs while taking the available hardware into account. The tests made with our system prototype show that the thread concept, combined with the agent paradigm, is suitable for speeding up image processing by automatic parallelization of image analysis tasks.
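
    The general pattern the abstract describes (independent image-analysis subtasks running as parallel workers over shared data) can be sketched with a modern thread pool. This is not the paper's 1998 system; the per-tile "analysis" below is a hypothetical stand-in for a real image operator, and in CPython true speedup requires the work to release the GIL (e.g. I/O or native image libraries).

```python
# Sketch: run one image-analysis subtask per tile in a thread pool.
from concurrent.futures import ThreadPoolExecutor

def analyze_tile(tile):
    # Hypothetical per-tile operation: mean intensity of the tile.
    return sum(tile) / len(tile)

image_tiles = [[10, 20, 30], [40, 50, 60], [5, 15, 25], [70, 80, 90]]

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves tile order in the result list.
    results = list(pool.map(analyze_tile, image_tiles))

print(results)
```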

  15. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    PubMed

    Denecke, Kerstin

    2016-01-01

    Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains untapped, since retrieval and analysis are difficult and time-consuming, and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise automated analysis, natural language processing needs to be applied. We therefore analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for automatic analysis of incident reports, but challenges remain to be solved.

  16. Effectiveness of an automatic tracking software in underwater motion analysis.

    PubMed

    Magalhaes, Fabrício A; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia

    2013-01-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software can be used as a valid and useful tool for underwater motion analysis.
    Key Points: The availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports. An important feature of automatic tracking software is to require limited human interventions and
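
    The study's degree-of-automation measure follows directly from its 4-pixel rule: count the frames where the automatic marker position drifts more than 4 pixels from the reference. A minimal sketch of that computation, with hypothetical coordinates:

```python
# Sketch of the intervention criterion: an operator corrects the tracker
# whenever the automatic position is more than 4 px from the reference.
from math import dist  # Euclidean distance, Python 3.8+

def intervention_rate(auto_xy, ref_xy, threshold=4.0):
    misses = sum(1 for a, r in zip(auto_xy, ref_xy) if dist(a, r) > threshold)
    return misses / len(auto_xy)

# Hypothetical per-frame (x, y) marker positions
auto = [(100, 50), (103, 52), (110, 60), (112, 61)]
ref  = [(100, 50), (101, 51), (105, 57), (112, 60)]
print(intervention_rate(auto, ref))   # fraction of frames needing correction
```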

  17. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Image processing and analysis for the characterization of RBC aggregation have frequently been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted for routine use in hemorheological and clinical biochemistry laboratories, since this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  18. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    DTIC Science & Technology

    2017-01-01

    [Fragmentary indexing snippet. Recoverable content: AIS data files, organized by location, were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis library. Report ERDC/CHL TR-17-2, Coastal Inlets Research Program, January 2017, by Brandan M. Scully.]

  19. Automatic analysis and classification of surface electromyography.

    PubMed

    Abou-Chadi, F E; Nashar, A; Saad, M

    2001-01-01

    In this paper, parametric modeling algorithms for surface electromyography (SEMG), which facilitate automatic feature extraction, are combined with artificial neural networks (ANN) to provide an integrated system for the automatic analysis and diagnosis of myopathic disorders. Three ANN paradigms were investigated: the multilayer backpropagation algorithm, the self-organizing feature map algorithm and a probabilistic neural network model. The performance of the three classifiers was compared with that of a Fisher linear discriminant (FLD) classifier. The results show that the three ANN models give higher performance, with the percentage of correct classification reaching 90%; poorer diagnostic performance was obtained from the FLD classifier. The system presented here indicates that surface EMG, when properly processed, can be used to provide the physician with a diagnostic assist device.

  20. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    PubMed

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing databases of SNPs (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of

  1. Review of automatic detection of pig behaviours by using image analysis

    NASA Astrophysics Data System (ADS)

    Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Kong, Fantao

    2017-06-01

    Automatic detection of lying, moving, feeding, drinking, and aggressive behaviours of pigs by means of image analysis can reduce the observation workload of staff. It can help staff detect diseases or injuries of pigs early during breeding and improve the management efficiency of the swine industry. This study describes the progress of pig behaviour detection based on image analysis, and advances in image segmentation of the pig body, segmentation of adhering pigs, and extraction of pig behaviour characteristic parameters. Challenges for achieving automatic detection of pig behaviours are summarized.

  2. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    PubMed

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
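
    In heartbeat-detection methods of this kind, frame-to-frame changes in the heart region yield a periodic signal whose peaks can be counted to estimate beats per minute. The sketch below applies that idea to a synthetic sine signal; the frame rate, duration, and peak criterion are illustrative assumptions, not the published pipeline.

```python
# Sketch: estimate heart rate by counting local maxima of a periodic
# "cardiac activity" signal. The signal here is a synthetic 2 Hz sine
# (120 bpm) sampled at 30 frames per second for 10 seconds.
import math

def count_peaks(signal):
    return sum(1 for i in range(1, len(signal) - 1)
               if signal[i - 1] < signal[i] >= signal[i + 1])

fps, seconds, bpm_true = 30, 10, 120
n = fps * seconds
signal = [math.sin(2 * math.pi * (bpm_true / 60) * (i / fps)) for i in range(n)]

beats = count_peaks(signal)
print(beats * (60 / seconds))   # estimated beats per minute
```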

  3. A hierarchical structure for automatic meshing and adaptive FEM analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Saxena, Mukul; Perucchio, Renato

    1987-01-01

    A new algorithm for generating automatically, from solid models of mechanical parts, finite element meshes that are organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work) is discussed. Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized and some results from an experimental closed loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively are presented. The implementation of 3-D work is briefly discussed.
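
    The quaternary-tree idea above can be illustrated with a minimal recursive subdivision: a square cell splits into four children wherever a refinement criterion fires, down to a maximum depth. The criterion used here (cell contains a "feature point") and the geometry are hypothetical; the paper derives its trees from solid models.

```python
# Sketch of quadtree-style 2-D subdivision: cells containing a feature
# point are split into four children until max_depth; empty cells become
# leaves (mesh elements) immediately, so the mesh is finer near features.
def refine(cell, points, depth, max_depth=3):
    x, y, size = cell
    inside = [p for p in points if x <= p[0] < x + size and y <= p[1] < y + size]
    if not inside or depth == max_depth:
        return [cell]                       # leaf: one element of the mesh
    half = size / 2
    children = [(x, y, half), (x + half, y, half),
                (x, y + half, half), (x + half, y + half, half)]
    return [leaf for c in children
            for leaf in refine(c, inside, depth + 1, max_depth)]

leaves = refine((0.0, 0.0, 1.0), [(0.1, 0.1)], 0)
print(len(leaves))   # more, smaller cells near the feature point
```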

  4. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.

  5. Automatic Online Lecture Highlighting Based on Multimedia Analysis

    ERIC Educational Resources Information Center

    Che, Xiaoyin; Yang, Haojin; Meinel, Christoph

    2018-01-01

    Textbook highlighting is widely considered to be beneficial for students. In this paper, we propose a comprehensive solution to highlight the online lecture videos in both sentence- and segment-level, just as is done with paper books. The solution is based on automatic analysis of multimedia lecture materials, such as speeches, transcripts, and…

  6. Automatic analysis of stereoscopic satellite image pairs for determination of cloud-top height and structure

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Strong, J.; Woodward, R. H.; Pierce, H.

    1991-01-01

    Results are presented on an automatic stereo analysis of cloud-top heights from nearly simultaneous satellite image pairs from the GOES and NOAA satellites, using a massively parallel processor computer. Comparisons of computer-derived height fields and manually analyzed fields show that the automatic analysis technique shows promise for performing routine stereo analysis in a real-time environment, providing a useful forecasting tool by augmenting observational data sets of severe thunderstorms and hurricanes. Simulations using synthetic stereo data show that it is possible to automatically resolve small-scale features such as 4000-m-diam clouds to about 1500 m in the vertical.
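
    The underlying geometry of stereo height retrieval can be summarized by the textbook pinhole relation: depth equals focal length times baseline divided by disparity, Z = f·B/d. Operational GOES/NOAA stereo analysis uses full satellite viewing geometry rather than this simple model; the numbers below are purely illustrative.

```python
# Textbook pinhole stereo sketch: Z = f * B / d. Larger parallax
# (disparity) means the target is nearer to the cameras. All values
# are hypothetical and not tied to any satellite sensor.
def depth_from_disparity(f_pixels, baseline_m, disparity_pixels):
    return f_pixels * baseline_m / disparity_pixels

f = 1000.0   # focal length expressed in pixels
B = 5.0      # baseline between the two viewpoints, metres
for d in (10.0, 20.0, 40.0):
    print(depth_from_disparity(f, B, d))
```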

  7. Towards automatic music transcription: note extraction based on independent subspace analysis

    NASA Astrophysics Data System (ADS)

    Wellhausen, Jens; Hoynck, Michael

    2005-01-01

    Due to the increasing amount of music available electronically, the need for automatic search, retrieval and classification systems for music becomes more and more important. In this paper an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications, music analysis and music classification. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined using Independent Subspace Analysis to extract sounding notes. Finally, the results are used to build a MIDI file as a new representation of the piece of music under examination.

  8. Towards automatic music transcription: note extraction based on independent subspace analysis

    NASA Astrophysics Data System (ADS)

    Wellhausen, Jens; Höynck, Michael

    2004-12-01

    Abstract identical to record 7 above.

  9. Analysis of Social Variables when an Initial Functional Analysis Indicates Automatic Reinforcement as the Maintaining Variable for Self-Injurious Behavior

    ERIC Educational Resources Information Center

    Kuhn, Stephanie A. Contrucci; Triggs, Mandy

    2009-01-01

    Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…

  10. Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis

    PubMed Central

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, Seyedmohammad; Rosenwald, Dean P.

    2014-01-01

    Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were highly consistent for FACS action units, and showed similar effects for change over time in depression severity. For both systems, when symptom severity was high, participants made more facial expressions associated with contempt, smiled less, and those smiles that occurred were more likely to be accompanied by facial actions associated with contempt. These results are consistent with the “social risk hypothesis” of depression. According to this hypothesis, when symptoms are severe, depressed participants withdraw from other people in order to protect themselves from anticipated rejection, scorn, and social exclusion. As their symptoms fade, participants send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and produced the same pattern of depression effects suggests that automatic facial expression analysis may be ready for use in behavioral and clinical science. PMID:24598859

  11. A Theory of Term Importance in Automatic Text Analysis.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    Most existing automatic content analysis and indexing techniques are based on word frequency characteristics applied largely in an ad hoc manner. Contradictory requirements arise in this connection, in that terms exhibiting high occurrence frequencies in individual documents are often useful for high recall performance (to retrieve many relevant…
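    The frequency-based term selection described above can be sketched in a few lines; the toy document and the `top_terms` helper are hypothetical illustrations, not Salton's actual weighting scheme:

    ```python
    from collections import Counter

    def top_terms(doc, k=3, stopwords=()):
        """Rank candidate index terms by raw occurrence frequency
        within a single document (the simplest ad hoc criterion)."""
        words = [w.lower() for w in doc.split() if w.lower() not in stopwords]
        return [term for term, _ in Counter(words).most_common(k)]

    doc = "retrieval systems index terms terms retrieval terms"
    best = top_terms(doc, k=2)  # → ['terms', 'retrieval']
    ```

    High-frequency terms favor recall, which is exactly the tension the abstract notes: the same statistic that retrieves many relevant documents also retrieves many irrelevant ones.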

  12. Comparison of an automatic analysis and a manual analysis of conjunctival microcirculation in a sheep model of haemorrhagic shock.

    PubMed

    Arnemann, Philip-Helge; Hessler, Michael; Kampmeier, Tim; Morelli, Andrea; Van Aken, Hugo Karel; Westphal, Martin; Rehberg, Sebastian; Ertmer, Christian

    2016-12-01

    Life-threatening diseases of critically ill patients are known to derange microcirculation. Automatic analysis of microcirculation would provide a bedside diagnostic tool for microcirculatory disorders and allow immediate therapeutic decisions based upon microcirculation analysis. After induction of general anaesthesia and instrumentation for haemodynamic monitoring, haemorrhagic shock was induced in ten female sheep by stepwise blood withdrawal of 3 × 10 mL per kilogram body weight. Before and after the induction of haemorrhagic shock, haemodynamic variables, samples for blood gas analysis, and videos of conjunctival microcirculation were obtained by incident dark field illumination microscopy. Microcirculatory videos were analysed (1) manually with AVA software version 3.2 by an experienced user and (2) automatically by AVA software version 4.2 for total vessel density (TVD), perfused vessel density (PVD) and proportion of perfused vessels (PPV). Correlation between the two analysis methods was examined by intraclass correlation coefficient and Bland-Altman analysis. The induction of haemorrhagic shock decreased the mean arterial pressure (from 87 ± 11 to 40 ± 7 mmHg; p < 0.001); stroke volume index (from 38 ± 14 to 20 ± 5 mL·m⁻²; p = 0.001) and cardiac index (from 2.9 ± 0.9 to 1.8 ± 0.5 L·min⁻¹·m⁻²; p < 0.001) and increased the heart rate (from 72 ± 9 to 87 ± 11 bpm; p < 0.001) and lactate concentration (from 0.9 ± 0.3 to 2.0 ± 0.6 mmol·L⁻¹; p = 0.001). Manual analysis showed no change in TVD (17.8 ± 4.2 to 17.8 ± 3.8 mm·mm⁻²; p = 0.993), whereas PVD (from 15.6 ± 4.6 to 11.5 ± 6.5 mm·mm⁻²; p = 0.041) and PPV (from 85.9 ± 11.8 to 62.7 ± 29.6%; p = 0.017) decreased significantly. Automatic analysis was not able to identify these changes. Correlation analysis showed a poor correlation between the analysis methods and a wide
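    The Bland-Altman agreement analysis used to compare the manual and automatic methods can be sketched as follows; the paired values are hypothetical, not the study's data:

    ```python
    import statistics

    def bland_altman(a, b):
        """Bland-Altman statistics for two paired measurement series:
        returns the mean difference (bias) and the 95% limits of
        agreement (bias ± 1.96 * SD of the differences)."""
        diffs = [x - y for x, y in zip(a, b)]
        bias = statistics.mean(diffs)
        sd = statistics.stdev(diffs)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # hypothetical paired PVD readings (manual vs. automatic analysis)
    manual    = [15.6, 14.2, 16.8, 15.0]
    automatic = [11.5, 12.0, 13.1, 12.2]
    bias, (lo, hi) = bland_altman(manual, automatic)  # bias ≈ 3.2
    ```

    A nonzero bias with wide limits of agreement, as reported in the abstract, indicates the two methods cannot be used interchangeably.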

  13. Spectral saliency via automatic adaptive amplitude spectrum analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
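    A rough sketch of the underlying frequency-domain idea, in the spirit of spectral-residual saliency rather than the authors' adaptive scale selection (the filter scale and test image are arbitrary):

    ```python
    import numpy as np

    def spectral_saliency(img, k=9):
        """Suppress nonsalient patterns by smoothing the log-amplitude
        spectrum with a box filter of width k, then reconstruct with
        the original phase; the squared magnitude is the saliency map."""
        F = np.fft.fft2(img)
        log_amp = np.log(np.abs(F) + 1e-12)
        phase = np.angle(F)
        kernel = np.ones(k) / k
        # crude separable box smoothing of the log-amplitude spectrum
        smooth = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, log_amp)
        smooth = np.apply_along_axis(lambda c: np.convolve(c, kernel, 'same'), 0, smooth)
        residual = log_amp - smooth
        sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
        return sal / sal.max()

    rng = np.random.default_rng(0)
    img = rng.standard_normal((64, 64))
    img[20:30, 20:30] += 4.0        # a "salient" bright patch
    sal = spectral_saliency(img)
    ```

    The paper's contribution is choosing the smoothing scale k automatically per salient object and combining the resulting maps; the fixed k above is only a placeholder for that step.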

  14. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    NASA Astrophysics Data System (ADS)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language 𝒞. The use of 𝒞 allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  15. Automatic Match between Delimitation Line and Real Terrain Based on Least-Cost Path Analysis

    NASA Astrophysics Data System (ADS)

    Feng, C. Q.; Jiang, N.; Zhang, X. N.; Ma, J.

    2013-11-01

    Nowadays, during international negotiations on separating disputed areas, the match between the delimitation line and the real terrain is adjusted only manually, which not only consumes much time and labor but also cannot ensure high precision. This paper therefore explores automatic matching between the two and develops a general solution based on Least-Cost Path Analysis. First, under the guidelines of delimitation laws, the cost layer is acquired through special processing of the delimitation line and terrain feature lines. Second, a new delimitation line is constructed with the help of Least-Cost Path Analysis. Third, the whole automatic match model is built via Module Builder so that it can be shared and reused. Finally, the result of automatic matching is analyzed from many different aspects, including delimitation laws, two-sided benefits and so on. Consequently, a conclusion is made that the method of automatic matching is feasible and effective.
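    Least-Cost Path Analysis reduces to a shortest-path search over a cost raster; a minimal Dijkstra sketch on a hypothetical terrain grid (each step pays the cost of the cell entered):

    ```python
    import heapq

    def least_cost_path(cost, start, goal):
        """Dijkstra over a 4-connected cost grid; returns the total
        accumulated cost and the path as a list of (row, col) cells."""
        rows, cols = len(cost), len(cost[0])
        best = {start: cost[start[0]][start[1]]}
        prev = {}
        heap = [(best[start], start)]
        while heap:
            d, (r, c) = heapq.heappop(heap)
            if (r, c) == goal:
                path = [goal]
                while path[-1] != start:
                    path.append(prev[path[-1]])
                return d, path[::-1]
            if d > best[(r, c)]:
                continue  # stale heap entry
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + cost[nr][nc]
                    if nd < best.get((nr, nc), float('inf')):
                        best[(nr, nc)] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(heap, (nd, (nr, nc)))
        return float('inf'), []

    # hypothetical cost layer: high-cost cells (9) model terrain to avoid
    terrain = [[1, 9, 1],
               [1, 9, 1],
               [1, 1, 1]]
    cost, path = least_cost_path(terrain, (0, 0), (0, 2))  # cost 7, routes around the 9s
    ```

    In the paper's workflow the cost layer is derived from delimitation-law constraints and terrain feature lines rather than a toy grid, but the path construction step is this same search.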

  16. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis

    DTIC Science & Technology

    1989-08-01

    Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis. Final Technical Report, December 1989. Keywords: pattern recognition, blackboard-oriented symbolic processing, knowledge-based image analysis, image understanding, aerial imagery, urban area.

  17. Trends of Science Education Research: An Automatic Content Analysis

    ERIC Educational Resources Information Center

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  18. Automatic differential analysis of NMR experiments in complex samples.

    PubMed

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2018-06-01

    Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species; such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
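    The bucketing step mentioned above can be illustrated minimally; the bin width and the `bucket_spectrum` helper are assumptions, far simpler than Plasmodesma's actual processing:

    ```python
    def bucket_spectrum(ppm, intensity, width=0.5):
        """Bucketing: sum spectral intensity into fixed-width ppm bins,
        a standard dimensionality reduction before comparing spectra
        across samples."""
        buckets = {}
        for p, i in zip(ppm, intensity):
            key = int(p // width)  # bin index along the ppm axis
            buckets[key] = buckets.get(key, 0.0) + i
        return buckets

    # hypothetical peak list: positions (ppm) and intensities
    ppm = [0.1, 0.2, 0.6, 0.7, 1.2]
    inten = [1.0, 2.0, 3.0, 4.0, 5.0]
    b = bucket_spectrum(ppm, inten)  # → {0: 3.0, 1: 7.0, 2: 5.0}
    ```

    Bucketed vectors from different samples can then be compared directly, which is what makes differential analysis across a series of extracts tractable.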

  19. Content-based analysis of Ki-67 stained meningioma specimens for automatic hot-spot selection.

    PubMed

    Swiderska-Chadaj, Zaneta; Markiewicz, Tomasz; Grala, Bartlomiej; Lorent, Malgorzata

    2016-10-07

    Hot-spot based examination of immunohistochemically stained histological specimens is one of the most important procedures in pathomorphological practice. The development of image acquisition equipment and computational units allows for the automation of this process. Moreover, many technical problems occur in everyday histological material, which increases the complexity of the task; thus, a full context-based analysis of histological specimens is also needed in the quantification of immunohistochemically stained specimens. One of the most important reactions is the Ki-67 proliferation marker in meningiomas, the most frequent intracranial tumour. The aim of our study is to propose a context-based analysis of Ki-67 stained specimens of meningiomas for automatic selection of hot-spots. The proposed solution is based on textural analysis, mathematical morphology, feature ranking and classification, as well as on the proposed hot-spot gradual extinction algorithm, to allow for the proper detection of a set of hot-spot fields. The designed whole slide image processing scheme eliminates such artifacts as hemorrhages, folds or stained vessels from the region of interest. To validate the automatic results, a set of 104 meningioma specimens was selected and twenty hot-spots inside them were identified independently by two experts. The Spearman rho correlation coefficient was used to compare the results, which were also analyzed with the help of a Bland-Altman plot. The results show that most of the cases (84) were automatically examined properly, with at most two fields of view affected by a technical problem. Next, 13 had three such fields, and only seven specimens did not meet the requirement for automatic examination. Generally, the Automatic System identifies hot-spot areas, especially their maximum points, better. Analysis of the results confirms the very high concordance between an automatic Ki-67 examination and the expert's results, with a Spearman

  20. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    NASA Astrophysics Data System (ADS)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the operation of the automatic slack adjuster. We establish the structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. In addition, we load this structural mechanics model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10⁵ cycles under the effect of braking loads. In the meantime, fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify the conclusion of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10⁵ cycles, which agrees with the results of the finite element analysis using the ANSYS Workbench FEA system.

  1. FURTHER ANALYSIS OF SUBTYPES OF AUTOMATICALLY REINFORCED SIB: A REPLICATION AND QUANTITATIVE ANALYSIS OF PUBLISHED DATASETS

    PubMed Central

    Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.; Bonner, Andrew C.; Arevalo, Alexander R.

    2017-01-01

    Hagopian, Rooker, and Zarcone (2015) evaluated a model for subtyping automatically reinforced self-injurious behavior (SIB) based on its sensitivity to changes in functional analysis conditions and the presence of self-restraint. The current study tested the generality of the model by applying it to all datasets of automatically reinforced SIB published from 1982 to 2015. We identified 49 datasets that included sufficient data to permit subtyping. Similar to the original study, Subtype-1 SIB was generally amenable to treatment using reinforcement alone, whereas Subtype-2 SIB was not. Conclusions could not be drawn about Subtype-3 SIB due to the small number of datasets. Nevertheless, the findings support the generality of the model and suggest that sensitivity of SIB to disruption by alternative reinforcement is an important dimension of automatically reinforced SIB. Findings also suggest that automatically reinforced SIB should no longer be considered a single category and that additional research is needed to better understand and treat Subtype-2 SIB. PMID:28032344

  2. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  3. Algorithm for automatic analysis of electro-oculographic data.

    PubMed

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG measurements and on its own.
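    The auto-calibration idea, thresholds estimated from statistics of the recorded signal itself, can be sketched as follows; the velocity statistic and scale factor are assumptions, not the published algorithm's exact features:

    ```python
    import statistics

    def detect_saccades(eog, threshold_scale=3.0):
        """Flag samples whose point-to-point velocity exceeds an
        auto-calibrated threshold (median + scale * stdev of |velocity|),
        so no manual per-recording calibration is needed."""
        vel = [abs(b - a) for a, b in zip(eog, eog[1:])]
        thr = statistics.median(vel) + threshold_scale * statistics.stdev(vel)
        return [i for i, v in enumerate(vel) if v > thr], thr

    # hypothetical trace: steady gaze with one abrupt 40-unit saccade
    trace = [0.0] * 50 + [40.0] * 50
    events, thr = detect_saccades(trace)  # → one event at sample 49
    ```

    Deriving the threshold from the data is what makes the method robust to the amplitude differences between recordings that defeat fixed thresholds.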

  4. Dose equations for tube current modulation in CT scanning and the interpretation of the associated CTDI{sub vol}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, Robert L.; Boone, John M.

    2013-11-15

    Purpose: The scanner-reported CTDIvol for automatic tube current modulation (TCM) has a different physical meaning from the traditional CTDIvol at constant mA, resulting in the dichotomy “CTDIvol of the first and second kinds” for which a physical interpretation is sought in hopes of establishing some commonality between the two. Methods: Rigorous equations are derived to describe the accumulated dose distributions for TCM. A comparison with formulae for scanner-reported CTDIvol clearly identifies the source of their differences. Graphical dose simulations are also provided for a variety of TCM tube current distributions (including constant mA), all having the same scanner-reported CTDIvol. Results: These convolution equations and simulations show that the local dose at z depends only weakly on the local tube current i(z) due to the strong influence of scatter from all other locations along z, and that the “local CTDIvol(z)” does not represent a local dose but rather only a relative i(z) ≡ mA(z). TCM is a shift-variant technique to which the CTDI paradigm does not apply, and its application to TCM leads to a CTDIvol of the second kind which lacks relevance. Conclusions: While the traditional CTDIvol at constant mA conveys useful information (the peak dose at the center of the scan length), CTDIvol of the second kind conveys no useful information about the associated TCM dose distribution it purportedly represents, and its physical interpretation remains elusive. On the other hand, the total energy absorbed E (“integral dose”) as well as its surrogate DLP remain robust between variable i(z) TCM and constant current i₀ techniques, both depending only on the total mAs = i₀t₀ during the beam-on time t₀.
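    The convolution view of TCM dose can be illustrated numerically: smoothing a modulated tube-current profile with a scatter kernel shows how weakly the local dose tracks the local mA. The Gaussian kernel and all numbers below are assumptions for illustration, not the paper's rigorous equations:

    ```python
    import numpy as np

    def accumulated_dose(tube_current, kernel_width=25.0, dz=1.0):
        """Accumulated dose D(z) modeled as the convolution of the
        tube-current profile i(z) with a normalized scatter kernel h(z)
        (here a Gaussian of the given width, in the same z units)."""
        z = np.arange(-3 * kernel_width, 3 * kernel_width + dz, dz)
        h = np.exp(-0.5 * (z / kernel_width) ** 2)
        h /= h.sum()
        return np.convolve(tube_current, h, mode='same')

    n = 200
    constant = np.full(n, 100.0)                                      # constant-mA technique
    modulated = 100.0 + 80.0 * np.sin(np.linspace(0, 6 * np.pi, n))   # TCM-like i(z)
    d_const = accumulated_dose(constant)
    d_mod = accumulated_dose(modulated)
    # away from the edges, d_mod varies far less than i(z) itself:
    # scatter from neighboring z positions washes out the modulation
    ```

    This is the abstract's point that a "local CTDIvol(z)" tracks i(z), not the actual local dose, which is dominated by scatter from the whole scan length.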

  5. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    NASA Astrophysics Data System (ADS)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. For emotion recognition, little attention has been paid so far to physiological signals compared to audio-visual emotion channels such as facial expression or speech. All essential stages of an automatic recognition system using biosignals are discussed, from the recording of a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to search for the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by emotion recognition results.

  6. Automatic Text Analysis Based on Transition Phenomena of Word Occurrences

    ERIC Educational Resources Information Center

    Pao, Miranda Lee

    1978-01-01

    Describes a method of selecting index terms directly from a word frequency list, an idea originally suggested by Goffman. Results of the analysis of word frequencies of two articles seem to indicate that the automated selection of index terms from a frequency list holds some promise for automatic indexing. (Author/MBR)

  7. Algorithm for automatic analysis of electro-oculographic data

    PubMed Central

    2013-01-01

    Background Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. Methods The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. Results The algorithm achieved 93% detection sensitivity for blinks with 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. Conclusion The developed algorithm enables reliable analysis of EOG data recorded both during EEG measurements and on its own. PMID:24160372

  8. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    NASA Astrophysics Data System (ADS)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strikes critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation, and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  9. Automatic analysis of medical dialogue in the home hemodialysis domain: structure induction and summarization.

    PubMed

    Lacson, Ronilda C; Barzilay, Regina; Long, William J

    2006-10-01

    Spoken medical dialogue is a valuable source of information for patients and caregivers. This work presents a first step towards automatic analysis and summarization of spoken medical dialogue. We first abstract a dialogue into a sequence of semantic categories using linguistic and contextual features integrated in a supervised machine-learning framework. Our model has a classification accuracy of 73%, compared to 33% achieved by a majority baseline (p<0.01). We then describe and implement a summarizer that utilizes this automatically induced structure. Our evaluation results indicate that automatically generated summaries exhibit high resemblance to summaries written by humans. In addition, task-based evaluation shows that physicians can reasonably answer questions related to patient care by looking at the automatically generated summaries alone, in contrast to the physicians' performance when they were given summaries from a naïve summarizer (p<0.05). This work demonstrates the feasibility of automatically structuring and summarizing spoken medical dialogue.

  10. Facilitator control as automatic behavior: A verbal behavior analysis

    PubMed Central

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is automatic when the speaker or writer is not stimulated by the behavior at the time of emission, the behavior is not edited, the products of behavior differ from what the person would produce normally, and the behavior is attributed to an outside source. All of these characteristics appear to be present in facilitator behavior. Other variables seem to account for the thematic content of the typed messages. These variables also are discussed. PMID:22477083

  11. Automatic dirt trail analysis in dermoscopy images.

    PubMed

    Cheng, Beibei; Joe Stanley, R; Stoecker, William V; Osterwise, Christopher T P; Stricklin, Sherea M; Hinton, Kristen A; Moss, Randy H; Oliviero, Margaret; Rabinovitz, Harold S

    2013-02-01

    Basal cell carcinoma (BCC) is the most common cancer in the US. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved a 0.902 area under the receiver operating characteristic curve using a leave-one-out approach. Results obtained from this study show that automatic detection of dirt trails in dermoscopic images of BCC is feasible. This is important because of the large number of these skin cancers seen every year and the challenge of discovering them earlier with instrumentation. © 2011 John Wiley & Sons A/S.

  12. Automatic cortical thickness analysis on rodent brain

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek

    2011-03-01

    Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the different scales; even adult rodent brains are 50 to 100 times smaller than human brains. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step provides the neocortex separation from other brain structures and thus is a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate spline based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood and performed t-tests between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
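    The Laplacian step can be sketched on a toy slab geometry, where the potential between the two boundaries becomes linear and vertical streamlines recover the slab thickness; the Jacobi solver below is illustrative, not the paper's implementation:

    ```python
    import numpy as np

    def laplace_potential(mask, inner, outer, iters=500):
        """Jacobi iterations for Laplace's equation on a voxel mask,
        with u = 0 on the inner boundary and u = 1 on the outer one.
        Streamlines of grad(u) play the role of cortical columns."""
        u = np.zeros(mask.shape)
        u[outer] = 1.0
        for _ in range(iters):
            avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                          np.roll(u, 1, 1) + np.roll(u, -1, 1))
            u = np.where(mask, avg, u)       # relax interior voxels only
            u[inner], u[outer] = 0.0, 1.0    # re-impose boundary values
        return u

    # toy "cortex": a flat slab between two boundary rows
    shape = (12, 20)
    inner = np.zeros(shape, bool); inner[0, :] = True
    outer = np.zeros(shape, bool); outer[-1, :] = True
    mask = ~(inner | outer)
    u = laplace_potential(mask, inner, outer)  # u is linear in the row index
    ```

    For this flat slab the streamlines are straight verticals, so integrating along them (the transport-equation step in the paper) simply returns the slab height as the thickness everywhere.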

  13. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    PubMed

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.
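    Gaussian curvature on a discrete surface is commonly estimated by the angle-deficit formula; this generic sketch is not the Automated Building Block Algorithm itself:

    ```python
    import math

    def angle_deficit(apex, neighbors):
        """Discrete Gaussian curvature at a mesh vertex via the angle
        deficit: 2*pi minus the sum of the angles, at the vertex, of
        the incident triangles (neighbors given in fan order)."""
        total = 0.0
        for i in range(len(neighbors)):
            a = [p - q for p, q in zip(neighbors[i], apex)]
            b = [p - q for p, q in zip(neighbors[(i + 1) % len(neighbors)], apex)]
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            total += math.acos(dot / (na * nb))
        return 2 * math.pi - total

    ring = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
    flat = angle_deficit((0, 0, 0), ring)  # planar fan: zero curvature
    peak = angle_deficit((0, 0, 1), ring)  # pyramid apex: positive curvature
    ```

    Regions of high curvature like the `peak` vertex are exactly where a curvature-driven scheme would place additional building-block boundaries on a bone surface.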

  14. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
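    ADIFOR transforms Fortran source, but the core idea of forward-mode automatic differentiation it relies on can be sketched with dual numbers; `stiffness` is a hypothetical design response, not a structural model:

    ```python
    class Dual:
        """Forward-mode automatic differentiation: each value carries
        its derivative, and arithmetic propagates both exactly."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            # product rule: (uv)' = u'v + uv'
            return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
        __rmul__ = __mul__

    def stiffness(width):
        # hypothetical design response: k(w) = 3*w^2 + 2*w
        return 3 * width * width + 2 * width

    w = Dual(2.0, 1.0)   # seed dw/dw = 1
    k = stiffness(w)     # k.val = 16.0, k.dot = dk/dw = 6*2 + 2 = 14.0
    ```

    Unlike finite differencing, the derivative is exact to machine precision, which is why AD tools are attractive for design sensitivity analysis of large structural models.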

  15. Automatic recognition and analysis of synapses. [in brain tissue

    NASA Technical Reports Server (NTRS)

    Ungerleider, J. A.; Ledley, R. S.; Bloom, F. E.

    1976-01-01

    An automatic system for recognizing synaptic junctions would allow analysis of large samples of tissue for the possible classification of specific well-defined sets of synapses based upon structural morphometric indices. In this paper the three steps of our system are described: (1) cytochemical tissue preparation to allow easy recognition of the synaptic junctions; (2) transmitting the tissue information to a computer; and (3) analyzing each field to recognize the synapses and make measurements on them.

  16. Automatic Real Time Ionogram Scaler with True Height Analysis - Artist

    DTIC Science & Technology

    1983-07-01

    …scaled. The corresponding autoscaled values were compared with the manually scaled h'F, h'F2, fminF, foE, foEs, h'E and h'Es. The ARTIST program… Automatic Real Time Ionogram Scaler with True Height Analysis (ARTIST). Scientific Report No. 7.

  17. Comparison of automatic control systems

    NASA Technical Reports Server (NTRS)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  18. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    PubMed

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, Peripheral Pulse Analyzer (PPA) has been used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data have been acquired in seven rounds; placebo was administered in rounds 1 and 2 and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to a group of around 40 subjects each. Although processing of data required human intervention, a software application has been developed to analyze the processed data and detect the response to eliminate the undue delay as well as human bias in subjective analysis. This utility named Automatic Analysis of Intervention in the Field of Homeopathy is run on the processed PPA data and the outcome has been compared with the manual analysis. The application software uses adaptive threshold based on statistics for detecting responses in contrast to fixed threshold used in manual analysis. The automatic analysis has detected 12.96% higher responses than subjective analysis. Higher response rates have been manually verified to be true positive. This indicates robustness of the application software. The automatic analysis software was run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility and 385 responses were detected in contrast to 272 of variability parameters. It was observed that 65% of the subjects, eliciting response, were common. This not only validates the software utility for giving consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
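The adaptive-vs-fixed threshold distinction described above can be sketched in a few lines (the data, the sensitivity factor `k`, and the function names are illustrative assumptions, not taken from the PPA software):

```python
import statistics

def detect_responses(baseline, post, k=2.0):
    """Flag post-intervention values exceeding an adaptive threshold
    derived from baseline statistics (mean + k * SD)."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    threshold = mu + k * sigma
    return [v > threshold for v in post]

def detect_responses_fixed(post, threshold):
    """A fixed-threshold analysis, for contrast, uses one constant cut-off."""
    return [v > threshold for v in post]
```

Because the adaptive cut-off scales with each subject's own baseline variability, it can pick up responses that a single fixed threshold tuned to the whole cohort would miss.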

  19. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of obtaining real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses data from each honeypot. Results of this analysis, and also other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  20. Automatic variance analysis of multistage care pathways.

    PubMed

    Li, Xiang; Liu, Haifeng; Zhang, Shilei; Mei, Jing; Xie, Guotong; Yu, Yiqin; Li, Jing; Lakshmanan, Geetika T

    2014-01-01

    A care pathway (CP) is a standardized process that consists of multiple care stages, clinical activities and their relations, aimed at ensuring and enhancing the quality of care. However, actual care may deviate from the planned CP, and analysis of these deviations can help clinicians refine the CP and reduce medical errors. In this paper, we propose a CP variance analysis method to automatically identify the deviations between actual patient traces in electronic medical records (EMR) and a multistage CP. As the care stage information is usually unavailable in EMR, we first align every trace with the CP using a hidden Markov model. From the aligned traces, we report three types of deviations for every care stage: additional activities, absent activities and violated constraints, which are identified by using the techniques of temporal logic and binomial tests. The method has been applied to a CP for the management of congestive heart failure and real world EMR, providing meaningful evidence for the further improvement of care quality.
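The binomial-test step for flagging deviations can be illustrated with a minimal sketch (the occurrence counts, expected rate, and function names are hypothetical, not from the paper's implementation):

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), via the exact tail sum."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def flag_additional_activity(n_occurrences, n_traces, expected_rate, alpha=0.05):
    """Flag a care-stage activity as an 'additional activity' deviation when
    it appears in significantly more aligned traces than the pathway
    predicts (one-sided binomial test)."""
    p_value = binom_sf(n_occurrences, n_traces, expected_rate)
    return p_value < alpha
```

For instance, an activity expected in 10% of traces but observed in 15 of 50 would be flagged, while 6 of 50 would not: the tail probability in the second case is far above any conventional significance level.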

  1. Mobile GPU-based implementation of automatic analysis method for long-term ECG.

    PubMed

    Fan, Xiaomao; Yao, Qihang; Li, Ye; Chen, Runge; Cai, Yunpeng

    2018-05-03

    Long-term electrocardiogram (ECG) is one of the important diagnostic assistant approaches in capturing intermittent cardiac arrhythmias. The combination of miniaturized wearable holters and healthcare platforms enables people to have their cardiac condition monitored at home. The high computational burden created by concurrent processing of numerous holter data poses a serious challenge to the healthcare platform. An alternative solution is to shift the analysis tasks from healthcare platforms to the mobile computing devices. However, long-term ECG data processing is quite time consuming due to the limited computation power of the mobile central processing unit (CPU). This paper aimed to propose a novel parallel automatic ECG analysis algorithm which exploited the mobile graphics processing unit (GPU) to reduce the response time for processing long-term ECG data. By studying the architecture of the sequential automatic ECG analysis algorithm, we parallelized the time-consuming parts and reorganized the entire pipeline in the parallel algorithm to fully utilize the heterogeneous computing resources of CPU and GPU. The experimental results showed that the average executing time of the proposed algorithm on a clinical long-term ECG dataset (duration 23.0 ± 1.0 h per signal) is 1.215 ± 0.140 s, which achieved an average speedup of 5.81 ± 0.39× without compromising analysis accuracy, compared with the sequential algorithm. Meanwhile, the battery energy consumption of the automatic ECG analysis algorithm was reduced by 64.16%. Excluding energy consumption from data loading, 79.44% of the energy consumption could be saved, which alleviated the problem of limited battery working hours for mobile devices. The reduction of response time and battery energy consumption in ECG analysis not only brings a better quality of experience to holter users, but also makes it possible to use mobile devices as ECG terminals for healthcare professionals such as physicians and health

  2. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    PubMed

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

    The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time consuming and error-prone, especially for liqueurs that may have problems with entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. The novel instrument comprises increased steam power, a redesigned condenser geometry and a larger cooling coil with controllable flow, compared to previously available devices. Method optimization applying D-optimal and central composite designs showed significant influence of sample volume, distillation time and coolant flow, while other investigated parameters, such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement, did not significantly influence the results. The method validation was conducted using the following settings: steam power 70 %, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible, just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35 % vol, the method showed adequate precision, with relative standard deviations below 0.4 % (intraday) and below 0.6 % (interday). The absolute standard deviations were between 0.06 % vol and 0.08 % vol (intraday) and between 0.07 % vol and 0.10 % vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and there are only

  3. Automatic Real-Time Estimation of Plume Height and Mass Eruption Rate Using Radar Data During Explosive Volcanism

    NASA Astrophysics Data System (ADS)

    Arason, P.; Barsotti, S.; De'Michieli Vitturi, M.; Jónsson, S.; Arngrímsson, H.; Bergsson, B.; Pfeffer, M. A.; Petersen, G. N.; Bjornsson, H.

    2016-12-01

    Plume height and mass eruption rate are the principal scale parameters of explosive volcanic eruptions. Weather radars are important instruments in estimating plume height, due to their independence of daylight, weather and visibility. The Icelandic Meteorological Office (IMO) operates two fixed-position C-band weather radars and two mobile X-band radars. All volcanoes in Iceland can be monitored by IMO's radar network, and during the initial phases of an eruption all available radars will be set to a more detailed volcano scan. When the radar volume data are retrieved at IMO headquarters in Reykjavík, an automatic analysis is performed on the radar data above the proximity of the volcano. The plume height is automatically estimated taking into account the radar scanning strategy, beam width, and a likely reflectivity gradient at the plume top. This analysis provides a distribution of the likely plume height. The automatically determined plume height estimates from the radar data are used as input to a numerical suite that calculates the eruptive source parameters through an inversion algorithm. This is done by using the coupled system DAKOTA-PlumeMoM, which solves the 1D plume model equations iteratively by varying the input values of vent radius and vertical velocity. The model accounts for the effect of wind on the plume dynamics, using atmospheric vertical profiles extracted from the ECMWF numerical weather prediction model. Finally, the resulting estimates of mass eruption rate are used to initialize the dispersal model VOL-CALPUFF to assess hazard due to tephra fallout, and communicated to the London VAAC to support their modelling activity for aviation safety purposes.

  4. Independent component analysis for automatic note extraction from musical trills

    NASA Astrophysics Data System (ADS)

    Brown, Judith C.; Smaragdis, Paris

    2004-05-01

    The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.

  5. Analysis of SSEM Sensor Data Using BEAM

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Park, Han; James, Mark

    2004-01-01

    A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO- 20827), Vol. 26, No.9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO- 21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
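The DIAD idea of comparing autoregressive coefficients against values learned from nominal training data can be sketched as follows (a simplified stand-in for BEAM's detector; the AR order, tolerance, and simulated signals are illustrative assumptions):

```python
import numpy as np

def ar_coefficients(x, order=2):
    """Least-squares fit of x[t] ≈ a1*x[t-1] + ... + ap*x[t-p]."""
    X = np.column_stack([x[order - k : len(x) - k] for k in range(1, order + 1)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def is_anomalous(signal, nominal_coeffs, tol=0.1, order=2):
    """Flag a signal whose fitted AR coefficients drift from the nominal
    ones learned during training (Euclidean distance as drift measure)."""
    drift = np.linalg.norm(ar_coefficients(signal, order) - nominal_coeffs)
    return drift > tol

def simulate(a1, a2, n=50):
    """Generate a noiseless AR(2) test sequence for illustration."""
    x = [1.0, 0.9]
    for _ in range(n):
        x.append(a1 * x[-1] + a2 * x[-2])
    return np.array(x)
```

A nominal signal reproduces its own coefficients and is not flagged, while a signal generated by different dynamics is: the coefficient vector itself, rather than the raw waveform, carries the anomaly signature.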

  6. Biosignal Analysis to Assess Mental Stress in Automatic Driving of Trucks: Palmar Perspiration and Masseter Electromyography

    PubMed Central

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-01-01

    Nowadays insight into human-machine interaction is a critical topic with the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rationally practical use of the automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate mental stress of drivers during automatic driving of trucks, with vehicles set a close gap distance apart to reduce air resistance and save energy consumption. By application of two wearable sensor systems, a continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about 25 m gap distance as a reference. It was found that mental stress significantly increased when the gap distances decreased, and an abrupt increase in mental stress of drivers was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports. PMID:25738768

  7. Fully Automatic Speech-Based Analysis of the Semantic Verbal Fluency Task.

    PubMed

    König, Alexandra; Linz, Nicklas; Tröger, Johannes; Wolters, Maria; Alexandersson, Jan; Robert, Phillipe

    2018-06-08

    Semantic verbal fluency (SVF) tests are routinely used in screening for mild cognitive impairment (MCI). In this task, participants name as many items as possible of a semantic category under a time constraint. Clinicians measure task performance manually by summing the number of correct words and errors. More fine-grained variables add valuable information to clinical assessment, but are time-consuming to derive. Therefore, the aim of this study is to investigate whether automatic analysis of the SVF could provide these as accurately as manual annotation and thus support qualitative screening of neurocognitive impairment. SVF data were collected from 95 older people with MCI (n = 47), Alzheimer's or related dementias (ADRD; n = 24), and healthy controls (HC; n = 24). All data were annotated manually and automatically with clusters and switches. The obtained metrics were validated using a classifier to distinguish HC, MCI, and ADRD. Automatically extracted clusters and switches were highly correlated (r = 0.9) with manually established values, and performed as well on the classification task separating HC from persons with ADRD (area under curve [AUC] = 0.939) and MCI (AUC = 0.758). The results show that it is possible to automate fine-grained analyses of SVF data for the assessment of cognitive decline. © 2018 S. Karger AG, Basel.
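The clusters-and-switches metrics can be computed mechanically once words are mapped to subcategories; below is a minimal sketch with a tiny hypothetical lexicon (a real system would rely on a far richer semantic resource):

```python
# hypothetical subcategory lexicon for the category "animals"
SUBCATEGORY = {
    "dog": "pets", "cat": "pets", "hamster": "pets",
    "lion": "wild", "tiger": "wild",
    "cow": "farm", "pig": "farm",
}

def clusters_and_switches(words):
    """Group consecutive words sharing a subcategory into clusters; each
    transition between adjacent clusters counts as one switch."""
    clusters = []
    for w in words:
        cat = SUBCATEGORY.get(w)
        if clusters and clusters[-1][0] == cat:
            clusters[-1][1].append(w)
        else:
            clusters.append([cat, [w]])
    switches = max(len(clusters) - 1, 0)
    return clusters, switches
```

On the response "dog, cat, lion, tiger, cow, pig, hamster" this yields four clusters and three switches, the kind of fine-grained variables the study extracts automatically from transcribed speech.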

  8. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require substantial memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
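ADIFOR itself performs source-to-source transformation of Fortran, but the forward-mode principle it relies on can be illustrated with dual numbers; this sketch (class, function, and example response are all hypothetical) propagates a value and its derivative through each operation:

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers: each
    arithmetic operation updates (value, derivative) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    """Illustrative response: f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2."""
    return x * x * x + 2 * x

y = f(Dual(2.0, 1.0))   # seed derivative dx/dx = 1
```

Evaluating `f` at `x = 2` yields value 12 and derivative 14 in a single pass; second derivatives, as targeted in the project, can in principle be obtained by nesting the construction or differentiating the generated derivative code again.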

  9. Automatic analysis for neuron by confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Satou, Kouhei; Aoki, Yoshimitsu; Mataga, Nobuko; Hensh, Takao K.; Taki, Katuhiko

    2005-12-01

    The aim of this study is to develop a system that recognizes both the macro- and microscopic configurations of nerve cells and automatically performs the necessary 3-D measurements and functional classification of spines. The acquisition of 3-D images of cranial nerves has been enabled by the use of a confocal laser scanning microscope, although the highly accurate 3-D measurements of the microscopic structures of cranial nerves and their classification based on their configurations have not yet been accomplished. In this study, in order to obtain highly accurate measurements of the microscopic structures of cranial nerves, existing positions of spines were predicted by the 2-D image processing of tomographic images. Next, based on the positions that were predicted on the 2-D images, the positions and configurations of the spines were determined more accurately by 3-D image processing of the volume data. We report the successful construction of an automatic analysis system that uses a coarse-to-fine technique to analyze the microscopic structures of cranial nerves with high speed and accuracy by combining 2-D and 3-D image analyses.

  10. Seismpol: a Visual Basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used in either an interactive earthquake analysis or an automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition method (CMD), so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by analysis of a selected signal window in a series of narrow frequency bands. Significant results supported by well-defined polarizations and source azimuth estimates for P and S phases are also obtained for short-period seismic events (local microearthquakes).
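The core of the Covariance Matrix Decomposition step can be sketched as follows (a minimal illustration; the windowing, filtering, and exact rectilinearity definition used in Seismpol may differ):

```python
import numpy as np

def cmd_polarization(z, n, e):
    """Eigen-decomposition of the 3-component covariance matrix over a
    window: the eigenstructure yields a rectilinearity measure and a
    source-azimuth estimate from the principal polarization direction."""
    C = np.cov(np.vstack([z, n, e]))
    w, v = np.linalg.eigh(C)                 # eigenvalues in ascending order
    l1, l2, l3 = w[2], w[1], w[0]
    rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
    p = v[:, 2]                              # principal eigenvector
    if p[1] < 0:                             # resolve the sign ambiguity
        p = -p
    azimuth = np.degrees(np.arctan2(p[2], p[1])) % 360.0   # atan2(E, N)
    return rectilinearity, azimuth
```

A purely linear horizontal motion with equal N and E components gives rectilinearity 1 and an azimuth of 45 degrees, as expected for a rectilinearly polarized P arrival.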

  11. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring

  12. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  13. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator, which automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors, and analyzed its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with an instruction.

  14. Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain

    NASA Astrophysics Data System (ADS)

    Krauß, Thomas; Fischer, Peter

    2016-08-01

    In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News coverage shows that the majority of occurring natural hazards are flood events, and many flood prediction systems have therefore already been developed. But most of these existing systems for deriving areas endangered by flooding events are based only on horizontal and vertical distances to existing rivers and lakes. Typically, such systems do not take into account dangers arising directly from heavy rain events. In a study conducted by us together with a German insurance company, a new approach for detection of areas endangered by heavy rain was proven to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and analyze the results using the available insurance data.
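As a much-simplified illustration of terrain-only analysis, cells where rainwater cannot drain can be flagged directly from a gridded elevation model (the 8-neighbour depression rule below is an assumption for illustration, not the paper's classification method):

```python
def depression_cells(dem):
    """Flag grid cells strictly lower than all existing 8-neighbours:
    local depressions where water from heavy rain would accumulate."""
    rows, cols = len(dem), len(dem[0])
    flagged = set()
    for r in range(rows):
        for c in range(cols):
            neighbours = [dem[r + dr][c + dc]
                          for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                          if (dr or dc)
                          and 0 <= r + dr < rows and 0 <= c + dc < cols]
            if neighbours and all(dem[r][c] < h for h in neighbours):
                flagged.add((r, c))
    return flagged
```

Running this on a toy 4x4 grid with two pits returns exactly those two cells; a production system would of course add drainage-path and vulnerability analysis on top of such a primitive.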

  15. Automatic Content Analysis; Part I of Scientific Report No. ISR-18, Information Storage and Retrieval...

    ERIC Educational Resources Information Center

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper, "Content Analysis in Information Retrieval" by S. F. Weiss, presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…

  16. Validation of automatic segmentation of ribs for NTCP modeling.

    PubMed

    Stam, Barbara; Peulen, Heike; Rossi, Maddalena M G; Belderbos, José S A; Sonke, Jan-Jakob

    2016-03-01

    Determination of a dose-effect relation for rib fractures in a large patient group has been limited by the time-consuming manual delineation of ribs. Automatic segmentation could facilitate such an analysis. We determine the accuracy of automatic rib segmentation in the context of normal tissue complication probability (NTCP) modeling. Forty-one patients with stage I/II non-small cell lung cancer treated with SBRT to 54 Gy in 3 fractions were selected. Using the 4DCT-derived mid-ventilation planning CT, all ribs were manually contoured and automatically segmented. Accuracy of segmentation was assessed using volumetric, shape and dosimetric measures. Manual and automatic dosimetric parameters Dx and EUD were tested for equivalence using the Two One-Sided T-test (TOST), and assessed for agreement using Bland-Altman analysis. NTCP models based on manual and automatic segmentation were compared. Automatic segmentation was comparable with the manual delineation in the radial direction, but larger near the costal cartilage and vertebrae. Manual and automatic Dx and EUD were significantly equivalent. The Bland-Altman analysis showed good agreement. The two NTCP models were very similar. Automatic rib segmentation was significantly equivalent to manual delineation and can be used for NTCP modeling in a large patient group. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
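The Bland-Altman agreement assessment used above is straightforward to sketch (the paired dose values below are invented for illustration):

```python
import statistics

def bland_altman(manual, automatic):
    """Bland-Altman agreement statistics: the bias (mean of the paired
    differences) and the 95% limits of agreement (bias ± 1.96 SD)."""
    diffs = [a - m for m, a in zip(manual, automatic)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A small bias with narrow limits of agreement supports treating the automatic dosimetric parameters as interchangeable with the manual ones, which is the conclusion the equivalence testing formalizes.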

  17. Operational testing of system for automatic sleep analysis

    NASA Technical Reports Server (NTRS)

    Kellaway, P.

    1972-01-01

    Tables on the performance, under operational conditions, of an automatic sleep monitoring system are presented. Data are recorded from patients who were undergoing heart and great vessel surgery. This study resulted in cap, electrode, and preamplifier improvements. Children were used to test the sleep analyzer and medical console write out units. From these data, an automatic voltage control circuit for the analyzer was developed. A special circuitry for obviating the possibility of incorrect sleep staging due to the presence of a movement artifact was also developed as a result of the study.

  18. Rapid automatic keyword extraction for information retrieval and analysis

    DOEpatents

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
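The scoring pipeline described in the patent abstract can be sketched compactly (the stop-word list is a tiny illustrative subset, and the word score uses the common degree/frequency formulation):

```python
import re

STOP_WORDS = {"for", "and", "of", "the", "a", "an", "in", "is", "are", "to", "on", "by"}

def rake(text):
    """RAKE sketch: stop words and punctuation delimit candidate keyword
    phrases; each word is scored by degree/frequency and each candidate
    by the sum of its word scores."""
    words = re.findall(r"[a-z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in STOP_WORDS:
            if current:
                phrases.append(current)
                current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    freq, cooc = {}, {}
    for phrase in phrases:
        for w in phrase:
            freq[w] = freq.get(w, 0) + 1
            cooc[w] = cooc.get(w, 0) + len(phrase) - 1
    # degree(w) = freq(w) + co-occurrences; word score = degree / frequency
    score = {w: (freq[w] + cooc[w]) / freq[w] for w in freq}
    ranked = [(" ".join(p), sum(score[w] for w in p)) for p in phrases]
    return sorted(ranked, key=lambda kv: -kv[1])
```

Applied to the title of this patent, the four-word phrase dominates because long candidate phrases accumulate both word count and co-occurrence degree.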

  19. Automatic Pedestrian Crossing Detection and Impairment Analysis Based on Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Liu, X.; Zhang, Y.; Li, Q.

    2017-09-01

Pedestrian crossing, as an important part of transportation infrastructure, serves to secure pedestrians' lives and possessions and to keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and to diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic flow, it is imperative to monitor their condition. On this account, this paper puts forward an approach for automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System and analyses crossing defilement and impairment. First, a pedestrian crossing classifier is trained with a low recall rate. Initial detections are then refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness is achieved. The system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  20. Isothermal reduction kinetics of Panzhihua ilmenite concentrate under 30vol% CO-70vol% N2 atmosphere

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-yi; Lü, Wei; Lü, Xue-wei; Li, Sheng-ping; Bai, Chen-guang; Song, Bing; Han, Ke-xi

    2017-03-01

The reduction of ilmenite concentrate in a 30vol% CO-70vol% N2 atmosphere was characterized by thermogravimetric and differential thermogravimetric (TG-DTG) analysis at temperatures from 1073 to 1223 K. The isothermal reduction results show that the reduction process comprised two stages; the corresponding apparent activation energies were obtained by iso-conversional and model-fitting methods. For the first stage, the effect of temperature on the conversion degree was not obvious; the phase-boundary chemical reaction was the controlling step, with an apparent activation energy of 15.55-40.71 kJ·mol-1. For the second stage, when the temperature was greater than 1123 K, the reaction rate and the conversion degree increased sharply with increasing temperature, and random nucleation and subsequent growth were the controlling steps, with an apparent activation energy ranging from 182.33 to 195.95 kJ·mol-1. For the whole reduction process, the average activation energy and pre-exponential factor were 98.94-118.33 kJ·mol-1 and 1.820-1.816 min-1, respectively.
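Apparent activation energies like those quoted above are conventionally obtained from an Arrhenius fit, ln k = ln A - Ea/(RT). A minimal least-squares sketch (not the authors' iso-conversional code; the rate constants here are synthetic):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_fit(temps_K, rate_consts):
    """Fit ln k = ln A - Ea/(R*T) by ordinary least squares.

    Returns (Ea in kJ/mol, pre-exponential factor A).
    """
    xs = [1.0 / T for T in temps_K]          # 1/T axis
    ys = [math.log(k) for k in rate_consts]  # ln k axis
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    return -slope * R / 1000.0, math.exp(intercept)
```

With rate constants generated from a known Ea of 100 kJ/mol over 1073-1223 K, the fit recovers Ea and A essentially exactly, since the model is linear in 1/T.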

  1. Automaticity in acute ischemia: Bifurcation analysis of a human ventricular model

    NASA Astrophysics Data System (ADS)

    Bouchard, Sylvain; Jacquemet, Vincent; Vinet, Alain

    2011-01-01

Acute ischemia (restriction in blood supply to part of the heart as a result of myocardial infarction) induces major changes in the electrophysiological properties of the ventricular tissue. Extracellular potassium concentration ([Ko+]) increases in the ischemic zone, leading to an elevation of the resting membrane potential that creates an “injury current” (IS) between the infarcted and the healthy zone. In addition, the lack of oxygen impairs the metabolic activity of the myocytes and decreases ATP production, thereby affecting ATP-sensitive potassium channels (IKatp). Frequent complications of myocardial infarction are tachycardia, fibrillation, and sudden cardiac death, but the mechanisms underlying their initiation are still debated. One hypothesis is that these arrhythmias may be triggered by abnormal automaticity. We investigated the effect of ischemia on myocyte automaticity by performing a comprehensive bifurcation analysis (fixed points, cycles, and their stability) of a human ventricular myocyte model [K. H. W. J. ten Tusscher and A. V. Panfilov, Am. J. Physiol. Heart Circ. Physiol. 291, H1088 (2006)] as a function of three ischemia-relevant parameters: [Ko+], IS, and IKatp. In this single-cell model, we found that automatic activity was possible only in the presence of an injury current. Changes in [Ko+] and IKatp significantly altered the bifurcation structure of IS, including the occurrence of early afterdepolarizations. The results provide a sound basis for studying higher-dimensional tissue structures representing an ischemic heart.

  2. Automatic quantitative computed tomography segmentation and analysis of aerated lung volumes in acute respiratory distress syndrome-A comparative diagnostic study.

    PubMed

    Klapsing, Philipp; Herrmann, Peter; Quintel, Michael; Moerer, Onnen

    2017-12-01

Quantitative lung computed tomographic (CT) analysis yields objective data regarding lung aeration but is currently not used in clinical routine, primarily because of the labor-intensive process of manual CT segmentation. Automatic lung segmentation could help to shorten processing times significantly. In this study, we assessed bias and precision of lung CT analysis using automatic segmentation compared with manual segmentation. In this monocentric clinical study, 10 mechanically ventilated patients with mild to moderate acute respiratory distress syndrome were included who had received lung CT scans at 5- and 45-mbar airway pressure during a prior study. Lung segmentations were performed both automatically using a computerized algorithm and manually. Automatic segmentation yielded similar lung volumes compared with manual segmentation, with clinically minor differences at both 5 and 45 mbar. At 5 mbar, results were as follows: overdistended lung 49.58 mL (manual, SD 77.37 mL) and 50.41 mL (automatic, SD 77.3 mL), P = .028; normally aerated lung 2142.17 mL (manual, SD 1131.48 mL) and 2156.68 mL (automatic, SD 1134.53 mL), P = .1038; and poorly aerated lung 631.68 mL (manual, SD 196.76 mL) and 646.32 mL (automatic, SD 169.63 mL), P = .3794. At 45 mbar, values were as follows: overdistended lung 612.85 mL (manual, SD 449.55 mL) and 615.49 mL (automatic, SD 451.03 mL), P = .078; normally aerated lung 3890.12 mL (manual, SD 1134.14 mL) and 3907.65 mL (automatic, SD 1133.62 mL), P = .027; and poorly aerated lung 413.35 mL (manual, SD 57.66 mL) and 469.58 mL (automatic, SD 70.14 mL), P = .007. Bland-Altman analyses revealed the following mean biases and limits of agreement at 5 mbar for automatic vs manual segmentation: overdistended lung +0.848 mL (±2.062 mL), normally aerated +14.51 mL (±49.71 mL), and poorly aerated +14.64 mL (±98.16 mL). At 45 mbar, results were as follows: overdistended +2.639 mL (±8.231 mL), normally aerated 17.53 mL (±41.41 mL), and poorly aerated 56.23 mL (±100.67 mL). Automatic
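The Bland-Altman figures quoted above (mean bias and ± limits of agreement) come from a standard computation over paired measurements, sketched here with made-up volumes rather than the study data:

```python
from statistics import mean, stdev

def bland_altman(manual, auto):
    """Mean bias and 95% limits of agreement for paired measurements.

    Bias is the mean of (auto - manual); the limits of agreement are
    bias +/- 1.96 times the SD of the differences.
    """
    diffs = [a - m for m, a in zip(manual, auto)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, (bias - half_width, bias + half_width)
```

A systematic offset shows up as a non-zero bias, while measurement scatter widens the limits of agreement around it.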

  3. Ganalyzer: A tool for automatic galaxy image analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-05-01

    Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
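The radial intensity plot at the heart of this approach can be approximated by sampling pixel values along circles of growing radius around the galaxy centre. This nearest-pixel sketch (function name and sampling scheme are illustrative, not Ganalyzer's implementation) returns one ring of intensities per radius:

```python
import math

def radial_intensity(image, cx, cy, radius, n_angles=360):
    """Sample pixel intensity on circles of increasing radius.

    `image` is a list of rows; returns one list of sampled intensities
    per radius, from which arm peaks can later be traced.
    """
    plot = []
    for r in range(1, radius + 1):
        ring = []
        for a in range(n_angles):
            theta = 2 * math.pi * a / n_angles
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            if 0 <= y < len(image) and 0 <= x < len(image[0]):
                ring.append(image[y][x])
        plot.append(ring)
    return plot
```

Peaks in these rings correspond to arm crossings; how the peak angle shifts with radius is what measures spirality.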

  4. Automatic RGB-depth-pressure anthropometric analysis and individualised sleep solution prescription.

    PubMed

    Esquirol Caussa, Jordi; Palmero Cantariño, Cristina; Bayo Tallón, Vanessa; Cos Morera, Miquel Àngel; Escalera, Sergio; Sánchez, David; Sánchez Padilla, Maider; Serrano Domínguez, Noelia; Relats Vilageliu, Mireia

    2017-08-01

Sleep surfaces must adapt to individual somatotypic features to maintain comfortable, convenient and healthy sleep, preventing diseases and injuries. Individually determining the most adequate rest surface can often be a complex and subjective question. To design and validate an automatic multimodal somatotype determination model to automatically recommend an individually designed mattress-topper-pillow combination. Design and validation of an automated prescription model for an individualised sleep system is performed through single-image 2D-3D analysis and body pressure distribution, to objectively determine optimal individual sleep surfaces combining five different mattress densities, three different toppers and three cervical pillows. A final study (n = 151) and re-analysis (n = 117) defined and validated the model, showing high correlations between calculated and real data (>85% in height and body circumferences, 89.9% in weight, 80.4% in body mass index and more than 70% in morphotype categorisation). The somatotype determination model can accurately prescribe an individualised sleep solution. This can be useful for healthy people and for health centres that need to adapt sleep surfaces to people with special needs. Next steps will increase the model's accuracy and analyse whether this prescribed individualised sleep solution can improve sleep quantity and quality; additionally, future studies will adapt the model to mattresses with technological improvements and tailor-made production, and will define interfaces for people with special needs.

  5. [Reliability of % vol. declarations on labels of wine bottles].

    PubMed

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

Council Regulation (EC) No. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (Abl. L 179 dated 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles have to indicate, among other things, the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength as established by analysis. Only when quality wines are stored in bottles for more than three years is the accepted tolerance limit +/- 0.8% vol. The presented investigation results show that deviations have to be taken into account which may be highly relevant for forensic practice.

  6. Automatic digital image analysis for identification of mitotic cells in synchronous mammalian cell cultures.

    PubMed

    Eccles, B A; Klevecz, R R

    1986-06-01

Mitotic frequency in a synchronous culture of mammalian cells was determined fully automatically and in real time using low-intensity phase-contrast microscopy and a Newvicon video camera connected to an EyeCom III image processor. Image samples, at a frequency of one per minute for 50 hours, were analyzed by first extracting the high-frequency picture components, then thresholding and probing for annular objects indicative of putative mitotic cells. Both the extraction of high-frequency components and the recognition of rings of varying radii and discontinuities employed novel algorithms. Spatial and temporal relationships between annuli were examined to discern the occurrences of mitoses, and such events were recorded in a computer data file. At present, the automatic analysis is suited for random cell proliferation rate measurements or cell cycle studies. The automatic identification of mitotic cells as described here provides a measure of the average proliferative activity of the cell population as a whole and eliminates more than eight hours of manual review per time-lapse video recording.

  7. AUTOMATIC MASS SPECTROMETER

    DOEpatents

    Hanson, M.L.; Tabor, C.D. Jr.

    1961-12-01

A mass spectrometer for analyzing the components of a gas is designed that is capable of continuous automatic operation, such as analysis of samples of process gas from a continuous production system where the gas content may be changing. (AEC)

  8. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  9. Automatic Visual Tracking and Social Behaviour Analysis with Multiple Mice

    PubMed Central

    Giancardo, Luca; Sona, Diego; Huang, Huiping; Sannino, Sara; Managò, Francesca; Scheggia, Diego; Papaleo, Francesco; Murino, Vittorio

    2013-01-01

Social interactions are made of complex behavioural actions that might be found in all mammals, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts for each frame and for each mouse in the cage one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to ours, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57B/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders).
Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different experimental settings and

  10. Flight Experience, Risk Taking, and Hazardous Attitudes in Glider Instructors (Experience de vol, Prise de Risque et Attitudes Dangereuses des Instructeurs de vol sur Planeur)

    DTIC Science & Technology

    2010-11-01

...glider flight instructors. The roles played by flight experience and risk-taking propensity in anticipating ... were also examined ... cross-sectional data, obtained by observing 144 glider flight instructors, active or not, working at five gliding centres ... 137 Summary ..... Flight experience, risk taking and hazardous attitudes of glider flight instructors. Ann-Renee Blais

  11. SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments

    NASA Technical Reports Server (NTRS)

    Leonard, R. F.

    1977-01-01

A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires locating the peaks in X-ray spectra, determining the intensities of the peaks, identifying the origins of the peaks, and determining the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.

  12. Methods for automatically analyzing humpback song units.

    PubMed

    Rickwood, Peter; Taylor, Andrew

    2008-03-01

This paper presents mathematical techniques for automatically extracting and analyzing bioacoustic signals. Automatic techniques are described for isolation of target signals from background noise, extraction of features from target signals and unsupervised classification (clustering) of the target signals based on these features. The only user-provided input, other than raw sound, is an initial set of signal processing and control parameters. Of particular note is that the number of signal categories is determined automatically. The techniques, applied to hydrophone recordings of humpback whales (Megaptera novaeangliae), produce promising initial results, suggesting that they may be of use in automated analysis of not only humpbacks, but possibly also in other bioacoustic settings where automated analysis is desirable.

  13. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible.
This is a major advantage over other statistical data analysis systems

  14. Automatic morphological classification of galaxy images

    PubMed Central

    Shamir, Lior

    2009-01-01

We describe an image analysis supervised learning algorithm that can automatically classify galaxy images. The algorithm is first trained using manually classified images of elliptical, spiral, and edge-on galaxies. A large set of image features is extracted from each image, and the most informative features are selected using Fisher scores. Test images can then be classified using a simple Weighted Nearest Neighbor rule such that the Fisher scores are used as the feature weights. Experimental results show that galaxy images from Galaxy Zoo can be classified automatically into spiral, elliptical and edge-on galaxies with an accuracy of ~90% compared to classifications carried out by the author. Full compilable source code of the algorithm is available for free download, and its general-purpose nature makes it suitable for other uses that involve automatic image analysis of celestial objects. PMID:20161594
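A Weighted Nearest Neighbor rule with Fisher scores as feature weights, as described above, reduces to a nearest-neighbour search under a weighted distance metric. A minimal sketch with hypothetical features and weights (the real algorithm works on a large extracted feature set):

```python
def fisher_weighted_nn(train, labels, weights, query):
    """1-nearest-neighbour classification with per-feature weights.

    `weights` (e.g. Fisher scores) scale each feature's contribution to
    the distance; the query inherits the label of the closest sample.
    """
    def dist(a, b):
        return sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b)) ** 0.5
    best = min(range(len(train)), key=lambda i: dist(train[i], query))
    return labels[best]
```

Setting a feature's weight to zero removes an uninformative feature from the decision entirely, which is exactly the effect of a low Fisher score.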

  15. [Study on the automatic parameters identification of water pipe network model].

    PubMed

    Jia, Hai-Feng; Zhao, Qi-Feng

    2010-01-01

Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is regarded as a key bottleneck for model application in water supply enterprises. A methodology for automatic parameter identification of water pipe network models based on GIS and SCADA databases is proposed. The kernel algorithms are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte-Carlo Sampling) is used for automatic identification of parameters; the detailed technical route based on RSA and MCS is presented. A module for automatic parameter identification of water pipe network models is developed. Finally, taking a typical water pipe network as a case, a study on automatic parameter identification is conducted and satisfactory results are achieved.

  16. Towards the automatic detection and analysis of sunspot rotation

    NASA Astrophysics Data System (ADS)

    Brown, Daniel S.; Walker, Andrew P.

    2016-10-01

Torsional rotation of sunspots has been noted by many authors over the past century. Sunspots have been observed to rotate by up to the order of 200 degrees over 8-10 days, and these rotations have often been linked with eruptive behaviour such as solar flares and coronal mass ejections. However, most studies in the literature are case studies or small-number studies which suffer from selection bias. In order to better understand sunspot rotation and its impact on the corona, unbiased large-sample statistical studies are required (including both rotating and non-rotating sunspots). While this can be done manually, a better approach is to automate the detection and analysis of rotating sunspots using robust methods with well-characterised uncertainties. The SDO/HMI instrument provides long-duration, high-resolution and high-cadence continuum observations suitable for extracting a large number of examples of rotating sunspots. This presentation will outline the analysis of SDO/HMI data to determine the rotation (and non-rotation) profiles of sunspots for the complete duration of their transit across the solar disk, along with how this can be extended to automatically identify sunspots and initiate their analysis.

  17. AUTOMATIC DIRT TRAIL ANALYSIS IN DERMOSCOPY IMAGES

    PubMed Central

    Cheng, Beibei; Stanley, R. Joe; Stoecker, William V.; Osterwise, Christopher T.P.; Stricklin, Sherea M.; Hinton, Kristen A.; Moss, Randy H.; Oliviero, Margaret; Rabinovitz, Harold S.

    2011-01-01

    Basal cell carcinoma (BCC) is the most common cancer in the U.S. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved a 0.902 area under a receiver operating characteristic curve using a leave-one-out approach, demonstrating the potential of dirt trails for BCC lesion discrimination. PMID:22233099

  18. DELINEATING SUBTYPES OF SELF-INJURIOUS BEHAVIOR MAINTAINED BY AUTOMATIC REINFORCEMENT

    PubMed Central

    Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.

    2016-01-01

    Self-injurious behavior (SIB) is maintained by automatic reinforcement in roughly 25% of cases. Automatically reinforced SIB typically has been considered a single functional category, and is less understood than socially reinforced SIB. Subtyping automatically reinforced SIB into functional categories has the potential to guide the development of more targeted interventions and increase our understanding of its biological underpinnings. The current study involved an analysis of 39 individuals with automatically reinforced SIB and a comparison group of 13 individuals with socially reinforced SIB. Automatically reinforced SIB was categorized into 3 subtypes based on patterns of responding in the functional analysis and the presence of self-restraint. These response features were selected as the basis for subtyping on the premise that they could reflect functional properties of SIB unique to each subtype. Analysis of treatment data revealed important differences across subtypes and provides preliminary support to warrant additional research on this proposed subtyping model. PMID:26223959

  19. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-03

Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight data from a metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, the moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
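The baseline procedure described above (local minima collected into a baseline vector, iterative rejection of peak outliers, then linear interpolation across the chromatogram) can be sketched as follows. The outlier threshold and iteration cap are illustrative, not the published settings:

```python
def baseline_correct(signal, n_iter=20):
    """Background-drift correction sketch.

    Anchors are local minima plus the endpoints; points sitting far
    above the running anchor mean (peak outliers) are dropped
    iteratively, and the surviving anchors are linearly interpolated
    into a full-length baseline that is subtracted from the signal.
    """
    idx = ([0]
           + [i for i in range(1, len(signal) - 1)
              if signal[i] <= signal[i - 1] and signal[i] <= signal[i + 1]]
           + [len(signal) - 1])
    vals = [signal[i] for i in idx]
    for _ in range(n_iter):
        m = sum(vals) / len(vals)
        spread = (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5
        keep = [(i, v) for i, v in zip(idx, vals) if v <= m + 2 * spread]
        if len(keep) == len(idx) or len(keep) < 2:
            break
        idx, vals = [i for i, _ in keep], [v for _, v in keep]
    baseline = []
    for x in range(len(signal)):
        j = max(k for k in range(len(idx)) if idx[k] <= x)
        if j == len(idx) - 1:
            baseline.append(vals[j])
        else:
            x0, x1, y0, y1 = idx[j], idx[j + 1], vals[j], vals[j + 1]
            baseline.append(y0 + (y1 - y0) * (x - x0) / (x1 - x0))
    return [s - b for s, b in zip(signal, baseline)]
```

On a synthetic chromatogram with a linear drift and a single peak, the sketch removes the drift and leaves only the peak height.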

  20. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    PubMed

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, the neutral atmospheric gas and the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software, which allows fast and precise quantitative data analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields results with an accuracy of isotope ratios up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25% correct classification rate, which is very promising and underpins the interest in this line of inquiry.

  2. Automatic adventitious respiratory sound analysis: A systematic review.

    PubMed

    Pramono, Renard Xaviero Adhi; Bowyer, Stuart; Rodriguez-Villegas, Esther

    2017-01-01

Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained from references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Articles were included only if they focused on adventitious sound detection or classification based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9 (11

  3. Automatic adventitious respiratory sound analysis: A systematic review

    PubMed Central

    Bowyer, Stuart; Rodriguez-Villegas, Esther

    2017-01-01

    Background Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. Objective To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. Data sources A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained by references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Study selection Only articles were included that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Data extraction Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. Data synthesis A total of 77 reports from the literature were included in this review. 55 (71.43%) of the

  4. Automatic Implementation of Prony Analysis for Electromechanical Mode Identification from Phasor Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.

    2010-07-31

Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied on ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper developed a recursive algorithm for implementing Prony analysis and proposed an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and promptly on the ringdown data. Thus, the mode estimation can be performed reliably and in a timely manner. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis.
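
The recursive implementation is not reproduced in the abstract, but the batch Prony step it builds on can be sketched briefly: fit a linear-prediction model to the ringdown samples, then read modal damping and frequency off the roots of the characteristic polynomial. The signal below (a single 0.5 Hz mode with -0.1 1/s damping) is invented for illustration, not taken from the 17-machine study.

```python
import numpy as np

def prony_modes(y, dt, order):
    """Estimate damped-sinusoid modes from ringdown samples y (Prony's method).

    Fits the linear-prediction model y[n] = -a1*y[n-1] - ... - ap*y[n-p],
    then converts the roots of the characteristic polynomial into
    continuous-time poles s = sigma + j*2*pi*f.
    """
    N = len(y)
    A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
    b = y[order:N]
    a, *_ = np.linalg.lstsq(A, -b, rcond=None)
    z = np.roots(np.concatenate(([1.0], a)))  # discrete-time poles
    return np.log(z) / dt                     # continuous-time poles

# Synthetic ringdown: one 0.5 Hz electromechanical mode, -0.1 1/s damping
dt = 0.1
t = np.arange(0.0, 20.0, dt)
y = np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.5 * t)
poles = prony_modes(y, dt, order=2)
# poles.real is the damping; abs(poles.imag)/(2*pi) is the modal frequency
```

On clean data the two poles recover the damping and frequency exactly; a recursive variant would update the least-squares solution sample by sample instead of solving it in one batch.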

  5. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle

    2009-10-19

Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and finally, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  6. Automatic quantitative analysis of in-stent restenosis using FD-OCT in vivo intra-arterial imaging.

    PubMed

    Mandelias, Kostas; Tsantis, Stavros; Spiliopoulos, Stavros; Katsakiori, Paraskevi F; Karnabatidis, Dimitris; Nikiforidis, George C; Kagadis, George C

    2013-06-01

    A new segmentation technique is implemented for automatic lumen area extraction and stent strut detection in intravascular optical coherence tomography (OCT) images for the purpose of quantitative analysis of in-stent restenosis (ISR). In addition, a user-friendly graphical user interface (GUI) is developed based on the employed algorithm toward clinical use. Four clinical datasets of frequency-domain OCT scans of the human femoral artery were analyzed. First, a segmentation method based on fuzzy C means (FCM) clustering and wavelet transform (WT) was applied toward inner luminal contour extraction. Subsequently, stent strut positions were detected by utilizing metrics derived from the local maxima of the wavelet transform into the FCM membership function. The inner lumen contour and the position of stent strut were extracted with high precision. Compared to manual segmentation by an expert physician, the automatic lumen contour delineation had an average overlap value of 0.917 ± 0.065 for all OCT images included in the study. The strut detection procedure achieved an overall accuracy of 93.80% and successfully identified 9.57 ± 0.5 struts for every OCT image. Processing time was confined to approximately 2.5 s per OCT frame. A new fast and robust automatic segmentation technique combining FCM and WT for lumen border extraction and strut detection in intravascular OCT images was designed and implemented. The proposed algorithm integrated in a GUI represents a step forward toward the employment of automated quantitative analysis of ISR in clinical practice.
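
The paper couples fuzzy C-means (FCM) clustering with wavelet-derived metrics; the FCM half alone fits in a few lines of NumPy. The toy "lumen vs. wall" intensity values below are invented for illustration and do not come from the paper's OCT data.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy C-means on 1-D intensities x: alternate between
    membership-weighted centroid updates and inverse-distance memberships."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)                        # memberships sum to 1 per pixel
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)     # fuzzy centroids
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)             # updated memberships
    return centers, u

# Toy "OCT frame" intensities: dark lumen (~0.1) vs bright wall (~0.8)
rng = np.random.default_rng(1)
pixels = np.concatenate([np.full(200, 0.1), np.full(300, 0.8)])
pixels += rng.normal(0, 0.02, pixels.size)
centers, u = fuzzy_c_means(pixels)
labels = np.argmax(u, axis=0)                 # hard assignment per pixel
```

In the paper this clustering runs on image data and is combined with wavelet-transform maxima for strut detection; the sketch only shows the intensity-clustering principle.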

  7. Analysis of facial expressions in parkinson's disease through video-based automatic methods.

    PubMed

    Bandini, Andrea; Orlandi, Silvia; Escalante, Hugo Jair; Giovannelli, Fabio; Cincotta, Massimo; Reyes-Garcia, Carlos A; Vanni, Paola; Zaccara, Gaetano; Manfredi, Claudia

    2017-04-01

The automatic analysis of facial expressions is an evolving field that finds several clinical applications. One of these applications is the study of facial bradykinesia in Parkinson's disease (PD), which is a major motor sign of this neurodegenerative illness. Facial bradykinesia consists of the reduction/loss of facial movements and emotional facial expressions called hypomimia. In this work we propose an automatic method for studying facial expressions in PD patients relying on video-based methods. Seventeen Parkinsonian patients and 17 healthy control subjects were asked to show basic facial expressions, upon request of the clinician and after the imitation of a visual cue on a screen. Through an existing face tracker, the Euclidean distance of the facial model from a neutral baseline was computed in order to quantify the changes in facial expressivity during the tasks. Moreover, an automatic facial expressions recognition algorithm was trained in order to study how PD expressions differed from the standard expressions. Results show that control subjects reported on average higher distances than PD patients along the tasks. This confirms that control subjects show larger movements during both posed and imitated facial expressions. Moreover, our results demonstrate that anger and disgust are the two most impaired expressions in PD patients. Contactless video-based systems can be important techniques for analyzing facial expressions also in rehabilitation, in particular speech therapy, where patients could get a definite advantage from a real-time feedback about the proper facial expressions/movements to perform. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Automatic Conflict Detection on Contracts

    NASA Astrophysics Data System (ADS)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  9. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
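
Ganalyzer's pipeline starts from the galaxy centre and the radial intensity plot; just that first stage can be sketched as follows. The Gaussian "galaxy" below is synthetic, and the binning scheme is a simplification of whatever Ganalyzer actually uses.

```python
import numpy as np

def radial_intensity_profile(img, n_bins=20):
    """Intensity-weighted centroid plus mean intensity per radial bin,
    the basic quantity a radial-intensity-plot analysis is built on."""
    ys, xs = np.indices(img.shape)
    total = img.sum()
    cy, cx = (ys * img).sum() / total, (xs * img).sum() / total
    r = np.hypot(ys - cy, xs - cx)
    bins = np.linspace(0.0, r.max() + 1e-9, n_bins + 1)
    idx = np.digitize(r.ravel(), bins) - 1
    prof = np.bincount(idx, weights=img.ravel(), minlength=n_bins)
    cnt = np.bincount(idx, minlength=n_bins)
    return np.where(cnt > 0, prof / np.maximum(cnt, 1), 0.0)

# Synthetic "galaxy": a bright Gaussian blob on a dark background
ys, xs = np.indices((64, 64))
img = np.exp(-((ys - 32) ** 2 + (xs - 32) ** 2) / (2 * 5.0 ** 2))
profile = radial_intensity_profile(img)
```

For a spiral galaxy the interesting signal is the angular position of peaks at each radius, whose slopes Ganalyzer uses to measure spirality; the sketch stops at the radial profile itself.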

  10. Automatic Imitation

    ERIC Educational Resources Information Center

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  11. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    NASA Astrophysics Data System (ADS)

    Yasid, A.

    2018-01-01

Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims to acquire the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the result is compared with other state-of-the-art automatic clustering methods. The experiment results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
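
AC-FSDE builds on the standard differential evolution loop. A bare-bones DE/rand/1/bin sketch is shown below, minimizing a toy sphere function rather than a clustering criterion, and without the paper's forced-strategy mutation; population size and control parameters are illustrative defaults.

```python
import numpy as np

def differential_evolution(f, bounds, pop=20, F=0.7, CR=0.9, gens=100, seed=0):
    """Bare-bones DE/rand/1/bin, the search engine variants like AC-FSDE extend."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, (pop, lo.size))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # mutate: three distinct vectors other than the target
            a, b, c = X[rng.choice([j for j in range(pop) if j != i], 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover with at least one mutant gene
            cross = rng.random(lo.size) < CR
            cross[rng.integers(lo.size)] = True
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft <= fit[i]:                 # greedy selection
                X[i], fit[i] = trial, ft
    return X[np.argmin(fit)], fit.min()

best, val = differential_evolution(lambda x: np.sum(x ** 2), [(-5.0, 5.0)] * 3)
```

For automatic clustering, the decision vector would additionally encode candidate cluster centres and activation flags, and f would be a cluster-validity index.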

  12. Brain activity across the development of automatic categorization: A comparison of categorization tasks using multi-voxel pattern analysis

    PubMed Central

    Soto, Fabian A.; Waldschmidt, Jennifer G.; Helie, Sebastien; Ashby, F. Gregory

    2013-01-01

    Previous evidence suggests that relatively separate neural networks underlie initial learning of rule-based and information-integration categorization tasks. With the development of automaticity, categorization behavior in both tasks becomes increasingly similar and exclusively related to activity in cortical regions. The present study uses multi-voxel pattern analysis to directly compare the development of automaticity in different categorization tasks. Each of three groups of participants received extensive training in a different categorization task: either an information-integration task, or one of two rule-based tasks. Four training sessions were performed inside an MRI scanner. Three different analyses were performed on the imaging data from a number of regions of interest (ROIs). The common patterns analysis had the goal of revealing ROIs with similar patterns of activation across tasks. The unique patterns analysis had the goal of revealing ROIs with dissimilar patterns of activation across tasks. The representational similarity analysis aimed at exploring (1) the similarity of category representations across ROIs and (2) how those patterns of similarities compared across tasks. The results showed that common patterns of activation were present in motor areas and basal ganglia early in training, but only in the former later on. Unique patterns were found in a variety of cortical and subcortical areas early in training, but they were dramatically reduced with training. Finally, patterns of representational similarity between brain regions became increasingly similar across tasks with the development of automaticity. PMID:23333700

  13. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Y; Huang, H; Su, T

Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques on cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring the image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard of coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting the coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve a good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and AUC of 0.82. Such performance is similar to those obtained from the semi-automatic QPS software that gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic
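
Texture heterogeneity measures of the kind CGITA computes are typically derived from a grey-level co-occurrence matrix (GLCM). A minimal sketch of one such feature, GLCM contrast, is shown below; the 8x8 test images are invented stand-ins for perfusion data, not from the study.

```python
import numpy as np

def glcm_contrast(img, levels=4, dx=1):
    """GLCM contrast for horizontal neighbour pairs at distance dx.

    img must hold integer grey levels in [0, levels). Contrast is
    sum_ij p(i,j) * (i - j)^2, i.e. 0 for a perfectly homogeneous image
    and large when neighbouring grey levels differ a lot.
    """
    g = np.zeros((levels, levels))
    left = img[:, :-dx].ravel()
    right = img[:, dx:].ravel()
    np.add.at(g, (left, right), 1)          # co-occurrence counts
    p = g / g.sum()                          # joint probabilities
    ii, jj = np.indices(p.shape)
    return float(np.sum(p * (ii - jj) ** 2))

flat = np.zeros((8, 8), dtype=int)                   # homogeneous "perfusion"
checker = (np.indices((8, 8)).sum(axis=0) % 2) * 3   # maximally heterogeneous
```

A real pipeline would quantize the segmented LV voxel intensities to a small number of levels, average the GLCM over several offsets, and feed the resulting features into the ROC analysis.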

  14. Automatic selection of optimal Savitzky-Golay filter parameters for Coronary Wave Intensity Analysis.

    PubMed

    Rivolo, Simone; Nagel, Eike; Smith, Nicolas P; Lee, Jack

    2014-01-01

    Coronary Wave Intensity Analysis (cWIA) is a technique capable of separating the effects of proximal arterial haemodynamics from cardiac mechanics. The cWIA ability to establish a mechanistic link between coronary haemodynamics measurements and the underlying pathophysiology has been widely demonstrated. Moreover, the prognostic value of a cWIA-derived metric has been recently proved. However, the clinical application of cWIA has been hindered due to the strong dependence on the practitioners, mainly ascribable to the cWIA-derived indices sensitivity to the pre-processing parameters. Specifically, as recently demonstrated, the cWIA-derived metrics are strongly sensitive to the Savitzky-Golay (S-G) filter, typically used to smooth the acquired traces. This is mainly due to the inability of the S-G filter to deal with the different timescale features present in the measured waveforms. Therefore, we propose to apply an adaptive S-G algorithm that automatically selects pointwise the optimal filter parameters. The newly proposed algorithm accuracy is assessed against a cWIA gold standard, provided by a newly developed in-silico cWIA modelling framework, when physiological noise is added to the simulated traces. The adaptive S-G algorithm, when used to automatically select the polynomial degree of the S-G filter, provides satisfactory results with ≤ 10% error for all the metrics through all the levels of noise tested. Therefore, the newly proposed method makes cWIA fully automatic and independent from the practitioners, opening the possibility to multi-centre trials.
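
A Savitzky-Golay filter is just a sliding least-squares polynomial fit, which is exactly why cWIA metrics are sensitive to its window and degree. A NumPy-only sketch with fixed parameters follows (the paper's contribution is the adaptive, pointwise parameter selection, which is not shown; the sine trace is synthetic).

```python
import numpy as np

def savgol_smooth(y, window, polyorder):
    """Savitzky-Golay smoothing via local least-squares polynomial fits.

    The projection row h evaluates the window's LS polynomial fit at the
    window centre, so smoothing reduces to one convolution."""
    half = window // 2
    t = np.arange(-half, half + 1)
    V = np.vander(t, polyorder + 1, increasing=True)
    h = (V @ np.linalg.pinv(V))[half]               # centre-row filter taps
    ypad = np.pad(y, half, mode="edge")             # simple edge handling
    return np.convolve(ypad, h[::-1], mode="valid")

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 2 * t)
noisy = clean + rng.normal(0, 0.1, t.size)
smooth = savgol_smooth(noisy, window=21, polyorder=3)
```

Changing `window` or `polyorder` visibly changes the sharpness of recovered peaks, which is the sensitivity the adaptive algorithm is designed to remove.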

  15. Automatic Thesaurus Generation for an Electronic Community System.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; And Others

    1995-01-01

    This research reports an algorithmic approach to the automatic generation of thesauri for electronic community systems. The techniques used include term filtering, automatic indexing, and cluster analysis. The Worm Community System, used by molecular biologists studying the nematode worm C. elegans, was used as the testbed for this research.…

  16. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A proposed signal processing technique for incipient real time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. The traditional spectral analysis is not appropriate for non-stationary vibration signal and for real time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective bearing fault automatic detection method and gives a good basis for an integrated induction machine condition monitor.

  17. Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.

    PubMed

    Mammone, Nadia; Morabito, Francesco Carlo

    2008-09-01

    Artifacts are disturbances that may occur during signal acquisition and may affect their processing. The aim of this paper is to propose a technique for automatically detecting artifacts from the electroencephalographic (EEG) recordings. In particular, a technique based on both Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals against the average 68.7% of the previous technique on the studied available database. Moreover, Renyi's entropy is shown to be able to detect muscle and very low frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
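
Renyi's entropy of a component's amplitude distribution is low for sparse, spiky artifacts and higher for broadband brain-like activity, which is what makes it usable as an automatic detector after ICA. A histogram-based sketch follows; the two synthetic "components" are invented stand-ins for ICA outputs.

```python
import numpy as np

def renyi_entropy(x, alpha=2.0, bins=50):
    """Renyi entropy of order alpha of x's amplitude distribution,
    estimated from a histogram: H = log(sum(p^alpha)) / (1 - alpha)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / x.size
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(0)
brain_like = rng.normal(size=5000)            # broadband, near-Gaussian component
blink_like = np.zeros(5000)                   # sparse, spiky "eye-blink" component
blink_like[rng.integers(0, 5000, 20)] = 8.0
blink_like += 0.05 * rng.normal(size=5000)
```

Thresholding such entropy values per independent component, as the paper does in combination with ICA, flags the artifactual components for rejection.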

  18. Automatic generation of stop word lists for information retrieval and analysis

    DOEpatents

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
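
The claimed ratio test can be sketched directly: count how often each term appears adjacent to a keyword versus how often it appears at all, and keep the high-ratio terms as stop word candidates. The toy corpus, keyword, and 0.7 cutoff below are invented for illustration.

```python
from collections import Counter

def stop_word_candidates(docs, keywords, min_ratio):
    """Keep terms whose keyword-adjacency frequency / term frequency ratio is
    high: such terms (typically function words) sit next to keywords far more
    often than content words do."""
    kw = set(keywords)
    adj, freq = Counter(), Counter()
    for doc in docs:
        toks = doc.lower().split()
        for i, tok in enumerate(toks):
            freq[tok] += 1
            next_to_kw = (i > 0 and toks[i - 1] in kw) or \
                         (i + 1 < len(toks) and toks[i + 1] in kw)
            if next_to_kw:
                adj[tok] += 1
    return {t for t in freq if t not in kw and adj[t] / freq[t] >= min_ratio}

docs = [
    "the segmentation of the image uses the model",
    "automatic segmentation of brain volumes",
    "the method improves the segmentation of lesions",
    "automatic analysis of brain images",
]
candidates = stop_word_candidates(docs, ["segmentation"], min_ratio=0.7)
```

On this tiny corpus only "of" survives the cutoff; the patent's method then truncates the surviving list by predetermined criteria to form the final stop word list.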

  19. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    A new fuzzy set based technique that was developed for decision making is discussed. It is a method to generate fuzzy decision rules automatically for image analysis. This paper proposes a method to generate rule-based approaches to solve problems such as autonomous navigation and image understanding automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.

  20. Automatic Topography Using High Precision Digital Moire Methods

    NASA Astrophysics Data System (ADS)

    Yatagai, T.; Idesawa, M.; Saito, S.

    1983-07-01

    Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.

  1. Automatic analysis of the 2015 Gorkha earthquake aftershock sequence.

    NASA Astrophysics Data System (ADS)

    Baillard, C.; Lyon-Caen, H.; Bollinger, L.; Rietbrock, A.; Letort, J.; Adhikari, L. B.

    2016-12-01

The Mw 7.8 Gorkha earthquake, that partially ruptured the Main Himalayan Thrust North of Kathmandu on the 25th April 2015, was the largest and most catastrophic earthquake striking Nepal since the great M8.4 1934 earthquake. This mainshock was followed by multiple aftershocks, among them, two notable events that occurred on the 12th May with magnitudes of 7.3 Mw and 6.3 Mw. Due to these recent events it became essential for the authorities and for the scientific community to better evaluate the seismic risk in the region through a detailed analysis of the earthquake catalog, amongst others, the spatio-temporal distribution of the Gorkha aftershock sequence. Here we complement this first study by doing a microseismic study using seismic data coming from the eastern part of the Nepalese Seismological Center network associated to one broadband station in Everest. Our primary goal is to deliver an accurate catalog of the aftershock sequence. Due to the exceptional number of events detected we performed an automatic picking/locating procedure which can be split into 4 steps: 1) coarse picking of the onsets using a classical STA/LTA picker, 2) phase association of picked onsets to detect and declare seismic events, 3) Kurtosis pick refinement around theoretical arrival times to increase picking and location accuracy and, 4) local magnitude calculation based on the amplitude of waveforms. This procedure is time efficient (about 1 sec/event), considerably reduces the location uncertainties (2 to 5 km errors) and increases the number of events detected compared to manual processing. Indeed, the automatic detection rate is 10 times higher than the manual detection rate. By comparing to the USGS catalog we were able to derive a new attenuation law to compute local magnitudes in the region. A detailed analysis of the seismicity shows a clear migration toward the east of the region and a sudden decrease of seismicity 100 km east of Kathmandu which may reveal the presence of a tectonic
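
Step 1 of the pipeline, the STA/LTA picker, compares a short-term average of signal amplitude with a long-term one and triggers when their ratio crosses a threshold. A vectorised sketch on a synthetic trace follows; the window lengths and threshold are illustrative, not the values used for the Gorkha catalog.

```python
import numpy as np

def sta_lta_pick(x, n_sta=20, n_lta=200, threshold=4.0):
    """Return the sample index where the STA/LTA ratio of |x| first exceeds
    the threshold (an approximate onset pick), or None if it never triggers."""
    e = np.abs(x)
    c = np.concatenate(([0.0], np.cumsum(e)))
    ends = np.arange(n_lta, e.size + 1)            # exclusive window ends
    sta = (c[ends] - c[ends - n_sta]) / n_sta      # short-term average
    lta = (c[ends] - c[ends - n_lta]) / n_lta      # long-term average
    hits = np.flatnonzero(sta / np.maximum(lta, 1e-12) > threshold)
    return int(hits[0] + n_lta - 1) if hits.size else None

# Synthetic trace: background noise, then a burst arriving at sample 1000
rng = np.random.default_rng(0)
x = 0.1 * rng.normal(size=1500)
x[1000:1100] += 2.0
pick = sta_lta_pick(x)
```

The coarse pick lands a few samples after the true onset, which is why the pipeline refines it with a kurtosis-based repick around theoretical arrival times.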

  2. Automatic network coupling analysis for dynamical systems based on detailed kinetic models.

    PubMed

    Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich

    2005-10-01

    We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
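
The mode-counting step reduces to estimating the numerical rank of a sensitivity matrix from its singular-value spectrum. A minimal sketch follows; the tolerance and the rank-2 toy matrix are illustrative, not from the paper's kinetic models.

```python
import numpy as np

def active_mode_count(S, rel_tol=1e-8):
    """Number of (locally) active dynamical modes, estimated as the numerical
    rank of a sensitivity matrix: singular values above rel_tol * largest."""
    sv = np.linalg.svd(S, compute_uv=False)   # sorted descending
    return int(np.sum(sv > rel_tol * sv[0]))

# Toy sensitivity matrix: 5 species, but dynamics confined to 2 directions
rng = np.random.default_rng(0)
basis = rng.normal(size=(5, 2))
S = basis @ rng.normal(size=(2, 40))          # rank-2 by construction
```

Repeating this along a trajectory yields a piecewise minimal model dimension, the quantity the paper's error-controlled reduction is built around.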

  3. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis.

    PubMed

    Schäfer, Sebastian; Nylund, Kim; Sævik, Fredrik; Engjom, Trond; Mézl, Martin; Jiřík, Radovan; Dimcevski, Georg; Gilja, Odd Helge; Tönnies, Klaus

    2015-08-01

    This paper presents a system for correcting motion influences in time-dependent 2D contrast-enhanced ultrasound (CEUS) images to assess tissue perfusion characteristics. The system consists of a semi-automatic frame selection method to find images with out-of-plane motion as well as a method for automatic motion compensation. Translational and non-rigid motion compensation is applied by introducing a temporal continuity assumption. A study consisting of 40 clinical datasets was conducted to compare the perfusion with simulated perfusion using pharmacokinetic modeling. Overall, the proposed approach decreased the mean average difference between the measured perfusion and the pharmacokinetic model estimation. It was non-inferior for three out of four patient cohorts to a manual approach and reduced the analysis time by 41% compared to manual processing. Copyright © 2014 Elsevier Ltd. All rights reserved.
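
The translational part of such motion compensation can be done by FFT-based cross-correlation (phase-correlation idea) between frames. A sketch with integer shifts on a random "frame" follows; the paper's non-rigid stage and frame-selection step are not shown.

```python
import numpy as np

def estimate_shift(fixed, moving):
    """Integer translation (rows, cols) with moving ~= np.roll(fixed, shift),
    found as the peak of the FFT-based circular cross-correlation."""
    cross = np.fft.ifft2(np.fft.fft2(moving) * np.conj(np.fft.fft2(fixed))).real
    return np.unravel_index(np.argmax(cross), cross.shape)

frame = np.random.default_rng(0).random((64, 64))
shifted = np.roll(frame, (3, 5), axis=(0, 1))
shift = estimate_shift(frame, shifted)
corrected = np.roll(shifted, (-shift[0], -shift[1]), axis=(0, 1))
```

Undoing the estimated shift realigns the frame exactly here; on real CEUS data sub-pixel refinement and a temporal continuity constraint, as in the paper, are needed.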

  4. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

An automatic regridding method and a three dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with boundary displacement method. Shape design parameterization for three dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  5. Automatic Parametrization of Somatosensory Evoked Potentials With Chirp Modeling.

    PubMed

    Vayrynen, Eero; Noponen, Kai; Vipin, Ashwati; Thow, X Y; Al-Nashash, Hasan; Kortelainen, Jukka; All, Angelo

    2016-09-01

    In this paper, an approach using polynomial phase chirp signals to model somatosensory evoked potentials (SEPs) is proposed. SEP waveforms are assumed as impulses undergoing group velocity dispersion while propagating along a multipath neural connection. Mathematical analysis of pulse dispersion resulting in chirp signals is performed. An automatic parameterization of SEPs is proposed using chirp models. A Particle Swarm Optimization algorithm is used to optimize the model parameters. Features describing the latencies and amplitudes of SEPs are automatically derived. A rat model is then used to evaluate the automatic parameterization of SEPs in two experimental cases, i.e., anesthesia level and spinal cord injury (SCI). Experimental results show that chirp-based model parameters and the derived SEP features are significant in describing both anesthesia level and SCI changes. The proposed automatic optimization based approach for extracting chirp parameters offers potential for detailed SEP analysis in future studies. The method implementation in Matlab technical computing language is provided online.
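
The chirp parameters are fitted with Particle Swarm Optimization. A minimal PSO loop is sketched below on a stand-in objective (the sphere function); in the paper's setting, the objective would instead be the misfit between the polynomial-phase chirp model and the recorded SEP, and the control parameters here are generic defaults.

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Bare-bones Particle Swarm Optimization: velocities mix inertia, pull
    toward each particle's own best point, and pull toward the swarm best."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, (n_particles, lo.size))
    V = np.zeros_like(X)
    P, pbest = X.copy(), np.array([f(x) for x in X])
    g = P[np.argmin(pbest)]
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)
        fx = np.array([f(x) for x in X])
        better = fx < pbest
        P[better], pbest[better] = X[better], fx[better]
        g = P[np.argmin(pbest)]
    return g, pbest.min()

best, val = pso(lambda x: np.sum(x ** 2), [(-5.0, 5.0)] * 2)
```

Because PSO needs only objective evaluations, it suits the non-convex chirp-model fitting problem where gradients of the misfit are awkward to obtain.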

  6. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

The method and system described herein presents a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
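
The classification step can be illustrated with the simplest possible decision tree, a depth-1 stump trained on microbenchmark-style labelled events. The features (message size, wait time) and thresholds below are invented for illustration; the patent's classifier is a full decision tree over richer trace features.

```python
import numpy as np

def train_stump(X, y):
    """Depth-1 decision tree: choose the (feature, threshold, polarity) that
    best separates efficient (0) from inefficient (1) communication events."""
    best_acc, best = 0.0, None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            pred = (X[:, f] > t).astype(int)
            for flip in (False, True):
                acc = (((1 - pred) if flip else pred) == y).mean()
                if acc > best_acc:
                    best_acc, best = acc, (f, float(t), flip)
    return best

def predict(stump, X):
    f, t, flip = stump
    pred = (X[:, f] > t).astype(int)
    return (1 - pred) if flip else pred

# Microbenchmark-style training data: [message size (KB), wait time (ms)];
# inefficient events are those dominated by waiting (e.g. a late sender)
rng = np.random.default_rng(0)
efficient = np.column_stack([rng.uniform(1, 100, 50), rng.uniform(0.0, 1.0, 50)])
inefficient = np.column_stack([rng.uniform(1, 100, 50), rng.uniform(2.0, 10.0, 50)])
X = np.vstack([efficient, inefficient])
y = np.r_[np.zeros(50, dtype=int), np.ones(50, dtype=int)]
stump = train_stump(X, y)
```

Training on microbenchmarks run on the target system is what lets the thresholds adapt to that system's actual communication costs, as the patent describes.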

  7. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    PubMed

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation, both pre- and post-surgically. In the proposed method, an observer places landmarks in a single 3D volume, which are then automatically propagated to the other volumes in the time sequence. From the landmarks in each volume, the patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients, with less evident differences for the other two patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods, but with the distinct advantage of supporting continuous motions during image acquisition.
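
    The angular measures can be computed from the propagated landmarks with elementary vector geometry. A minimal sketch (the axis vectors here are hypothetical and not the authors' landmark protocol):

    ```python
    import numpy as np

    def angle_between(u, v):
        """Angle in degrees between two 3-D direction vectors, e.g. the
        femoral and tibial long axes defined by pairs of landmarks."""
        u, v = np.asarray(u, float), np.asarray(v, float)
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        # clip guards against tiny floating-point excursions outside [-1, 1]
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    ```

    Repeating this in every 3D volume of the 4D sequence yields the femur-tibia angle as a function of flexion.
    
    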

  8. A hybrid 3D region growing and 4D curvature analysis-based automatic abdominal blood vessel segmentation through contrast enhanced CT

    NASA Astrophysics Data System (ADS)

    Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen

    2017-03-01

    In abdominal disease diagnosis and the planning of various abdominal surgeries, segmentation of abdominal blood vessels (ABVs) is an essential task. Automatic segmentation enables fast and accurate processing of ABVs. We propose a fully automatic approach for segmenting ABVs from contrast-enhanced CT images by a hybrid of 3D region growing and 4D curvature analysis. The proposed method comprises three stages. First, candidates for bone, kidneys, ABVs and heart are segmented by an auto-adapted threshold. Second, bone is auto-segmented and classified into spine, ribs and pelvis. Third, ABVs are automatically segmented in two sub-steps: (1) kidneys and the abdominal part of the heart are segmented; (2) ABVs are segmented by a hybrid approach that integrates 3D region growing and 4D curvature analysis. Results are compared with two conventional methods and show that the proposed method is very promising in segmenting and classifying bone and segmenting whole ABVs, and may have potential utility in clinical use.
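
    The 3D region-growing sub-step can be sketched in a few lines. Here a 6-connected flood fill keeps voxels whose intensity falls inside an assumed vessel intensity window; the auto-adapted thresholding and 4D curvature analysis of the actual method are omitted:

    ```python
    from collections import deque

    import numpy as np

    def region_grow(vol, seed, lo, hi):
        """6-connected 3-D region growing: starting from `seed`, accumulate
        voxels whose intensity lies in the window [lo, hi]."""
        grown = np.zeros(vol.shape, dtype=bool)
        queue = deque([seed])
        grown[seed] = True
        neighbours = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                      (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        while queue:
            z, y, x = queue.popleft()
            for dz, dy, dx in neighbours:
                n = (z + dz, y + dy, x + dx)
                if (all(0 <= n[i] < vol.shape[i] for i in range(3))
                        and not grown[n] and lo <= vol[n] <= hi):
                    grown[n] = True
                    queue.append(n)
        return grown
    ```

    The seed is assumed to lie inside a vessel; in practice it would come from the earlier candidate-segmentation stage.
    
    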

  9. Analysis and Comparison of Some Automatic Vehicle Monitoring Systems

    DOT National Transportation Integrated Search

    1973-07-01

    In 1970 UMTA solicited proposals and selected four companies to develop systems to demonstrate the feasibility of different automatic vehicle monitoring techniques. The demonstrations culminated in experiments in Philadelphia to assess the performanc...

  10. Automatic thermographic image defect detection of composites

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Liebenberg, Bjorn; Raymont, Jeff; Santospirito, SP

    2011-05-01

    Detecting defects, and especially reliably measuring defect sizes, are critical objectives in automatic NDT defect detection applications. In this work, the Sentence software is proposed for the analysis of pulsed thermography and near-IR images of composite materials. Furthermore, the Sentence software delivers an end-to-end, user-friendly platform for engineers to perform complete manual inspections, as well as tools that allow senior engineers to develop inspection templates and profiles, reducing the requisite thermographic skill level of the operating engineer. Finally, the Sentence software can also offer complete independence from operator decisions through its fully automated "Beep on Defect" detection functionality. The end-to-end automatic inspection system includes sub-systems for defining a panel profile, generating an inspection plan, controlling a robot arm and capturing thermographic images to detect defects. A statistical model has been built to analyze the entire image, evaluate grey-scale ranges, import sentencing criteria and automatically detect impact damage defects. A full width half maximum algorithm is used to quantify flaw sizes. The identified defects are imported into the sentencing engine, which then sentences the inspection (automatically compares analysis results against acceptance criteria) by comparing the most significant defect or group of defects against the inspection standards.
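
    The full width at half maximum sizing step is standard and easy to sketch for a 1-D intensity profile across a flaw. The linear-interpolation detail and names below are my own, not necessarily the Sentence implementation:

    ```python
    import numpy as np

    def fwhm(profile):
        """Full width at half maximum of a 1-D intensity profile, with
        linear interpolation at the two half-maximum crossings
        (result in sample units)."""
        y = np.asarray(profile, float)
        half = y.max() / 2.0
        above = np.where(y >= half)[0]
        left, right = above[0], above[-1]
        # fractional crossing positions on each flank
        l = left - (y[left] - half) / (y[left] - y[left - 1]) if left > 0 else float(left)
        r = right + (y[right] - half) / (y[right] - y[right + 1]) if right < y.size - 1 else float(right)
        return r - l
    ```

    Multiplying the result by the pixel pitch converts the width to physical flaw size.
    
    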

  11. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...

  12. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...

  13. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...

  14. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 3 2012-04-01 2012-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...

  15. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel import...

  16. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time Awake during first third of night, and decreased total sleep time) during the mission.

  17. Urban land use of the Sao Paulo metropolitan area by automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Niero, M.; Foresti, C.

    1983-01-01

    The separability of urban land use classes in the metropolitan area of Sao Paulo was studied by means of automatic analysis of MSS/LANDSAT digital data. The data were analyzed using the K-means and MAXVER (maximum likelihood) classification algorithms. The land use classes obtained were: CBD/vertical growth area, residential area, mixed area, industrial area, embankment area type 1, embankment area type 2, dense vegetation area and sparse vegetation area. The spectral analysis of representative samples of the urban land use classes was done using the "Single Cell" analysis option. The CBD/vertical growth, residential and embankment type 2 classes showed better spectral separability than the other classes.

  18. Automatic Segmenting Structures in MRI's Based on Texture Analysis and Fuzzy Logic

    NASA Astrophysics Data System (ADS)

    Kaur, Mandeep; Rattan, Munish; Singh, Pushpinder

    2017-12-01

    The purpose of this paper is to present a variational method for geometric contours that keeps the level set function close to a signed distance function, thereby removing the need for the expensive re-initialization procedure. The level set method is applied to magnetic resonance images (MRI) to track irregularities in them, as medical imaging plays a substantial part in the treatment, therapy and diagnosis of various organs, tumors and abnormalities, favoring the patient with speedier and more decisive disease control with fewer side effects. The geometrical shape and size of a tumor and abnormal tissue growth can be calculated by segmentation of the image. Fully automatic segmentation in medical imaging remains a great challenge for researchers. Based on texture analysis, different images are processed by optimizing the level set segmentation. Traditionally, optimization was manual for every image, with each parameter selected one after another. By applying fuzzy logic, the segmentation is driven by texture features, making it automatic and more effective. There is no manual initialization of parameters, and the method works like an intelligent system: it segments different MRI images without tuning the level set parameters and gives optimized results for all of them.

  19. [Automatic Extraction and Analysis of Dosimetry Data in Radiotherapy Plans].

    PubMed

    Song, Wei; Zhao, Di; Lu, Hong; Zhang, Biyun; Ma, Jun; Yu, Dahai

    To improve the efficiency and accuracy of extraction and analysis of dosimetry data in radiotherapy plans for a batch of patients. With the interface functions provided by the Matlab platform, a program was written to extract the dosimetry data exported from the treatment planning system in DICOM RT format and to export the dose-volume data to an Excel file in an SPSS-compatible format. This method was compared with manual operation for 14 gastric carcinoma patients to validate its efficiency and accuracy. The output Excel data were compatible with SPSS in format; the dosimetry data errors for the PTV dose interval of 90%-98%, the PTV dose interval of 99%-106% and all OARs were -3.48E-5 ± 3.01E-5, -1.11E-3 ± 7.68E-4 and -7.85E-5 ± 9.91E-5, respectively. Compared with manual operation, the time required was reduced from 5.3 h to 0.19 h and the input error was reduced from 0.002 to 0. Automatic extraction of dosimetry data in DICOM RT format for batches of patients, SPSS-compatible data export and quick analysis were achieved in this work. The efficiency of clinical research based on dosimetry data analysis of large numbers of patients will be improved with this method.
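
    The kind of dose-interval statistic evaluated here can be sketched with NumPy. This is an illustrative voxel-wise computation under an equal-voxel-volume assumption, not the authors' Matlab DICOM RT parser:

    ```python
    import numpy as np

    def fraction_in_dose_interval(voxel_doses, lo_pct, hi_pct, prescribed_dose):
        """Fraction of a structure's volume receiving between lo_pct% and
        hi_pct% of the prescribed dose, assuming equal voxel volumes."""
        d = np.asarray(voxel_doses, float)
        lo = prescribed_dose * lo_pct / 100.0
        hi = prescribed_dose * hi_pct / 100.0
        return float(np.mean((d >= lo) & (d <= hi)))
    ```

    Batch processing then reduces to looping this over the structures of each patient's plan and writing the rows out in a statistics-package-friendly table.
    
    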

  20. The Science of and Advanced Technology for Cost-Effective Manufacture of High Precision Engineering Products. Volume 5. Automatic Generation of Process Outlines of Forming and Machining Processes.

    DTIC Science & Technology

    1986-08-01

    The Science of and Advanced Technology for Cost-Effective Manufacture of High Precision Engineering Products. ONR Contract No. 83K0385, Final Report, Vol. 5: Automatic Generation of Process Outlines of Forming and Machining Processes. [The scanned abstract is largely illegible OCR output; recoverable fragments mention a drawing (#03116-6233) and a raw material of 500 mm diameter and 3000 mm length in a high-carbon alloy steel quenched to a minimum of 45 Rc.]

  1. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    PubMed

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Support vector machine for automatic pain recognition

    NASA Astrophysics Data System (ADS)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects the face in each stored video frame using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
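
    The skin-color front end can be illustrated with a classic rule-based RGB test (widely used fixed thresholds in the style of Peer et al.; the paper does not specify its exact model, so treat the numbers as an assumption):

    ```python
    import numpy as np

    def skin_mask(rgb):
        """Boolean mask of 'skin-like' pixels in an H x W x 3 RGB image,
        using fixed rule-based thresholds."""
        r = rgb[..., 0].astype(int)
        g = rgb[..., 1].astype(int)
        b = rgb[..., 2].astype(int)
        return ((r > 95) & (g > 40) & (b > 20)
                & (np.maximum(np.maximum(r, g), b)
                   - np.minimum(np.minimum(r, g), b) > 15)
                & (np.abs(r - g) > 15) & (r > g) & (r > b))
    ```

    Connected regions of the mask become candidate faces, from which the location and shape features feeding the SVM would be computed.
    
    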

  3. Bifurcation analysis of an automatic dynamic balancing mechanism for eccentric rotors

    NASA Astrophysics Data System (ADS)

    Green, K.; Champneys, A. R.; Lieven, N. J.

    2006-04-01

    We present a nonlinear bifurcation analysis of the dynamics of an automatic dynamic balancing mechanism for rotating machines. The principle of operation is to deploy two or more masses that are free to travel around a race at a fixed distance from the hub and, subsequently, balance any eccentricity in the rotor. Mathematically, we start from a Lagrangian description of the system. It is then shown how under isotropic conditions a change of coordinates into a rotating frame turns the problem into a regular autonomous dynamical system, amenable to a full nonlinear bifurcation analysis. Using numerical continuation techniques, curves are traced of steady states, limit cycles and their bifurcations as parameters are varied. These results are augmented by simulations of the system trajectories in phase space. Taking the case of a balancer with two free masses, broad trends are revealed on the existence of a stable, dynamically balanced steady-state solution for specific rotation speeds and eccentricities. However, the analysis also reveals other potentially attracting states—non-trivial steady states, limit cycles, and chaotic motion—which are not in balance. The transient effects which lead to these competing states, which in some cases coexist, are investigated.

  4. Automatic variable selection method and a comparison for quantitative analysis in laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong

    2018-05-01

    In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate its feasibility and effectiveness, a comparison with a genetic algorithm (GA) and the successive projections algorithm (SPA) was implemented for the detection of different elements (copper, barium and chromium) in soil. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models using the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing prediction performance comparable to GA and SPA.
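
    For reference, the reported figure of merit is the standard root mean square error of prediction over a validation set:

    ```python
    import numpy as np

    def rmsep(y_true, y_pred):
        """Root mean square error of prediction: sqrt of the mean squared
        residual between reference and predicted concentrations."""
        y_true = np.asarray(y_true, float)
        y_pred = np.asarray(y_pred, float)
        return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
    ```
    
    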

  5. A novel automatic quantification method for high-content screening analysis of DNA double strand-break response.

    PubMed

    Feng, Jingwen; Lin, Jie; Zhang, Pengquan; Yang, Songnan; Sa, Yu; Feng, Yuanming

    2017-08-29

    High-content screening is commonly used in studies of the DNA damage response. The double-strand break (DSB) is one of the most harmful types of DNA damage lesions. The conventional method used to quantify DSBs is γH2AX foci counting, which requires manual adjustment and preset parameters and is usually regarded as imprecise, time-consuming, poorly reproducible, and inaccurate. Therefore, a robust automatic alternative method is highly desired. In this manuscript, we present a new method for quantifying DSBs which involves automatic image cropping, automatic foci-segmentation and fluorescent intensity measurement. Furthermore, an additional function was added for standardizing the measurement of DSB response inhibition based on co-localization analysis. We tested the method with a well-known inhibitor of DSB response. The new method requires only one preset parameter, which effectively minimizes operator-dependent variations. Compared with conventional methods, the new method detected a higher percentage difference of foci formation between different cells, which can improve measurement accuracy. The effects of the inhibitor on DSB response were successfully quantified with the new method (p = 0.000). The advantages of this method in terms of reliability, automation and simplicity show its potential in quantitative fluorescence imaging studies and high-content screening for compounds and factors involved in DSB response.

  6. Automatic Assessment and Reduction of Noise using Edge Pattern Analysis in Non-Linear Image Enhancement

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.; Hines, Glenn D.

    2004-01-01

    Noise is the primary visibility limit in the process of non-linear image enhancement, and is no longer a statistically stable additive noise in the post-enhancement image. Therefore novel approaches are needed to both assess and reduce spatially variable noise at this stage in overall image processing. Here we will examine the use of edge pattern analysis both for automatic assessment of spatially variable noise and as a foundation for new noise reduction methods.

  7. Combining Recurrence Analysis and Automatic Movement Extraction from Video Recordings to Study Behavioral Coupling in Face-to-Face Parent-Child Interactions.

    PubMed

    López Pérez, David; Leonardi, Giuseppe; Niedźwiecka, Alicja; Radkowska, Alicja; Rączaszek-Leonardi, Joanna; Tomalski, Przemysław

    2017-01-01

    The analysis of parent-child interactions is crucial for the understanding of early human development. Manual coding of interactions is a time-consuming task, which is a limitation in many projects. This becomes especially demanding if a frame-by-frame categorization of movement needs to be achieved. To overcome this, we present a computational approach for studying movement coupling in natural settings, which combines a state-of-the-art automatic tracker, Tracking-Learning-Detection (TLD), with nonlinear time-series analysis, Cross-Recurrence Quantification Analysis (CRQA). We investigated the use of TLD to extract and automatically classify the movement of each partner from 21 video recordings of interactions in which 5.5-month-old infants and mothers engaged in free play in laboratory settings. As a proof of concept, we focused on those face-to-face episodes where the mother animated an object in front of the infant, in order to measure the coordination between the infants' head movement and the mothers' hand movement. We also tested the feasibility of using such movement data to study behavioral coupling between partners with CRQA. We demonstrate that movement can be extracted automatically from standard-definition video recordings and used in subsequent CRQA to quantify the coupling between the movement of the parent and the infant. Finally, we assess the quality of this coupling using an extension of CRQA called anisotropic CRQA and show asymmetric dynamics between the movement of the parent and the infant. When combined, these methods allow automatic coding and classification of behaviors, which results in a more efficient manner of analyzing movements than manual coding.
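
    The CRQA step rests on a cross-recurrence matrix. A minimal sketch for two 1-D movement series (no phase-space embedding, fixed radius — a simplification of full CRQA):

    ```python
    import numpy as np

    def cross_recurrence(x, y, radius):
        """Cross-recurrence plot of two 1-D series: entry (i, j) is 1 when
        |x[i] - y[j]| <= radius. Also returns the recurrence rate
        (fraction of recurrent points), a basic coupling measure."""
        x = np.asarray(x, float)
        y = np.asarray(y, float)
        cr = (np.abs(x[:, None] - y[None, :]) <= radius).astype(int)
        return cr, float(cr.mean())
    ```

    Diagonal-line statistics of this matrix, not just the recurrence rate, are what quantify sustained coupling between the two movement streams.
    
    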

  8. Combining Recurrence Analysis and Automatic Movement Extraction from Video Recordings to Study Behavioral Coupling in Face-to-Face Parent-Child Interactions

    PubMed Central

    López Pérez, David; Leonardi, Giuseppe; Niedźwiecka, Alicja; Radkowska, Alicja; Rączaszek-Leonardi, Joanna; Tomalski, Przemysław

    2017-01-01

    The analysis of parent-child interactions is crucial for the understanding of early human development. Manual coding of interactions is a time-consuming task, which is a limitation in many projects. This becomes especially demanding if a frame-by-frame categorization of movement needs to be achieved. To overcome this, we present a computational approach for studying movement coupling in natural settings, which combines a state-of-the-art automatic tracker, Tracking-Learning-Detection (TLD), with nonlinear time-series analysis, Cross-Recurrence Quantification Analysis (CRQA). We investigated the use of TLD to extract and automatically classify the movement of each partner from 21 video recordings of interactions in which 5.5-month-old infants and mothers engaged in free play in laboratory settings. As a proof of concept, we focused on those face-to-face episodes where the mother animated an object in front of the infant, in order to measure the coordination between the infants' head movement and the mothers' hand movement. We also tested the feasibility of using such movement data to study behavioral coupling between partners with CRQA. We demonstrate that movement can be extracted automatically from standard-definition video recordings and used in subsequent CRQA to quantify the coupling between the movement of the parent and the infant. Finally, we assess the quality of this coupling using an extension of CRQA called anisotropic CRQA and show asymmetric dynamics between the movement of the parent and the infant. When combined, these methods allow automatic coding and classification of behaviors, which results in a more efficient manner of analyzing movements than manual coding. PMID:29312075

  9. Automatic fluid dispenser

    NASA Technical Reports Server (NTRS)

    Sakellaris, P. C. (Inventor)

    1977-01-01

    Fluid automatically flows to individual dispensing units at predetermined times from a fluid supply and is available only for a predetermined interval of time after which an automatic control causes the fluid to drain from the individual dispensing units. Fluid deprivation continues until the beginning of a new cycle when the fluid is once again automatically made available at the individual dispensing units.

  10. Finite Element Analysis of New Crankshaft Automatic Adjustment Mechanism of Pumping Unit

    NASA Astrophysics Data System (ADS)

    Wu, Jufei; Wang, Qian

    2017-12-01

    In this paper, the crankshaft automatic adjustment mechanism designed for the CYJY10-4.2-53HF pumping unit is the research object. The friction and bending moment of the crank are simulated with ANSYS Workbench, and the finite element simulation results are compared with the theoretical calculation results for verification; the finite element analysis of the crank friction is basically consistent with the theoretical calculation. The stress and deformation of the guide platform are analyzed and calculated for its two limiting working conditions. A dynamic analysis of the mechanism is also carried out to obtain the vibration modes and natural frequencies of the different parts of the counterweight under the no-preload condition, so that excitation frequencies can be kept away from the natural frequencies and resonance can be effectively avoided; for the different modes, the stiffness of the structure can be improved accordingly.

  11. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
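
    The areal-fraction estimate at the core of such methods amounts to counting pixels per segmented component. A small sketch on a hypothetical label image (outside ImageJ):

    ```python
    import numpy as np

    def areal_fractions(label_img):
        """Areal fraction of each labelled component in a segmented 2-D
        image; label 0 (matrix/background) is counted in the total area."""
        labels, counts = np.unique(label_img, return_counts=True)
        return {int(l): float(c) / label_img.size for l, c in zip(labels, counts)}
    ```

    With a pixel-to-mm calibration, the same counts yield clast sizes, and the fractions map directly onto modal abundances.
    
    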

  12. Trends of Science Education Research: An Automatic Content Analysis

    NASA Astrophysics Data System (ADS)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education from 1990 to 2007. The multi-stage clustering technique was employed to investigate with what topics, to what development trends, and from whose contribution that the journal publications constructed as a science education research field. This study found that the research topic of Conceptual Change & Concept Mapping was the most studied topic, although the number of publications has slightly declined in the 2000's. The studies in the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Chang and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  13. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

    Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis, including CT imaging, can detect aggressive malignant pleural mesothelioma at an early stage. To create a quantitative record of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening by visual comparison, which provides neither quantitative nor qualitative measures. In this work, automatic spatiotemporal matching techniques for the detected pleural thickenings at two time points, based on semi-automatic registration, have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal component analysis turns out to be more advantageous than the feature-based mapping using the centroid and mean Hounsfield units of each thickening: sensitivity improved to 98.46% from 42.19%, while the accuracy of the feature-based mapping is only slightly higher (84.38% vs. 76.19%).
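
    One way to picture a PCA-based mapping is to compare pose-invariant shape signatures (principal-axis lengths) of each thickening's voxel cloud across the two time points. The signature choice and the greedy matching below are illustrative assumptions, not the authors' exact algorithm:

    ```python
    import numpy as np

    def pca_signature(points):
        """Sorted principal-axis lengths (square roots of the covariance
        eigenvalues) of a 3-D point cloud: invariant to translation
        and rotation."""
        pts = np.asarray(points, float)
        centred = pts - pts.mean(axis=0)
        evals = np.linalg.eigvalsh(np.cov(centred.T))
        return np.sqrt(np.clip(evals, 0.0, None))[::-1]

    def match_thickenings(clouds_t0, clouds_t1):
        """For each thickening at time 0, index of the nearest-signature
        thickening at time 1 (greedy, no one-to-one constraint)."""
        sigs1 = [pca_signature(c) for c in clouds_t1]
        return [int(np.argmin([np.linalg.norm(pca_signature(c0) - s1)
                               for s1 in sigs1]))
                for c0 in clouds_t0]
    ```

    Because the signature ignores position, it tolerates the residual misalignment left after the semi-automatic registration step.
    
    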

  14. Automatic contact in DYNA3D for vehicle crashworthiness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whirley, R.G.; Engelmann, B.E.

    1993-07-15

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit nonlinear finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. This paper discusses in detail a new four-step automatic contact algorithm. Key aspects of the proposed method include automatic identification of adjacent and opposite surfaces in the global search phase, and the use of a smoothly varying surface normal which allows a consistent treatment of shell intersection and corner contact conditions without ad-hoc rules. The paper concludes with three examples which illustrate the performance of the newly proposed algorithm in the public DYNA3D code.

  15. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    PubMed

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, many coding sequences (CDS) have been discovered, and larger CDS are being revealed frequently. Approaches and related tools have been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), together with our own locally deployed web services. The workflow input is a set of CDS in Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. It performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two ways, using Soaplab2 and Apache Axis2; SOAP and Java Web Service (JWS) endpoints provide WSDL interfaces to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This new integrated automatic workflow will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  17. Automatic weld torch guidance control system

    NASA Technical Reports Server (NTRS)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross seam actuator digital drive motor controller to complete the closed-loop, feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm, or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.

  18. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% was suitable grazing land.

  19. Semi-automatic tracking, smoothing and segmentation of hyoid bone motion from videofluoroscopic swallowing study.

    PubMed

    Kim, Won-Seok; Zeng, Pengcheng; Shi, Jian Qing; Lee, Youngjo; Paik, Nam-Jong

    2017-01-01

    Motion analysis of the hyoid bone via videofluoroscopic study has been used in clinical research, but the classical manual tracking method is generally labor intensive and time consuming. Although some automatic tracking methods have been developed, masked points could not be tracked and smoothing and segmentation, which are necessary for functional motion analysis prior to registration, were not provided by the previous software. We developed software to track the hyoid bone motion semi-automatically. It works even in the situation where the hyoid bone is masked by the mandible and has been validated in dysphagia patients with stroke. In addition, we added the function of semi-automatic smoothing and segmentation. A total of 30 patients' data were used to develop the software, and data collected from 17 patients were used for validation, of which the trajectories of 8 patients were partly masked. Pearson correlation coefficients between the manual and automatic tracking are high and statistically significant (0.942 to 0.991, P-value<0.0001). Relative errors between automatic tracking and manual tracking in terms of the x-axis, y-axis and 2D range of hyoid bone excursion range from 3.3% to 9.2%. We also developed an automatic method to segment each hyoid bone trajectory into four phases (elevation phase, anterior movement phase, descending phase and returning phase). The semi-automatic hyoid bone tracking from VFSS data by our software is valid compared to the conventional manual tracking method. In addition, the ability of automatic indication to switch the automatic mode to manual mode in extreme cases and calibration without attaching the radiopaque object is convenient and useful for users. Semi-automatic smoothing and segmentation provide further information for functional motion analysis which is beneficial to further statistical analysis such as functional classification and prognostication for dysphagia. Therefore, this software could provide the
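    The validation metrics reported above (Pearson correlation between manual and automatic tracking, and relative error of the excursion range) can be sketched directly; the excursion samples below are hypothetical, not the study's data:

```python
import numpy as np

def tracking_agreement(manual: np.ndarray, auto: np.ndarray):
    """Pearson correlation and relative error of the excursion range
    between manual and automatic hyoid-bone traces (one axis, mm)."""
    r = np.corrcoef(manual, auto)[0, 1]
    rel_err = abs(np.ptp(auto) - np.ptp(manual)) / np.ptp(manual)
    return r, rel_err

# Hypothetical y-axis excursion samples over one swallow (mm)
manual_trace = np.array([0.0, 2.1, 7.9, 12.0, 9.5, 3.2, 0.4])
auto_trace   = np.array([0.1, 2.3, 7.6, 11.5, 9.8, 3.0, 0.5])
r, rel_err = tracking_agreement(manual_trace, auto_trace)
```

    A relative range error of a few percent, as in the study's 3.3% to 9.2%, corresponds to close agreement of the two excursion measurements.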

  20. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  1. Analysis of automatic repeat request methods for deep-space downlinks

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Ekroot, L.

    1995-01-01

    Automatic repeat request (ARQ) methods cannot increase the capacity of a memoryless channel. However, they can be used to decrease the complexity of the channel-coding system to achieve essentially error-free transmission and to reduce link margins when the channel characteristics are poorly predictable. This article considers ARQ methods on a power-limited channel (e.g., the deep-space channel), where it is important to minimize the total power needed to transmit the data, as opposed to a bandwidth-limited channel (e.g., terrestrial data links), where the spectral efficiency or the total required transmission time is the most relevant performance measure. In the analysis, we compare the performance of three reference concatenated coded systems used in actual deep-space missions to that obtainable by ARQ methods using the same codes, in terms of required power, time to transmit with a given number of retransmissions, and achievable probability of word error. The ultimate limits of ARQ with an arbitrary number of retransmissions are also derived.
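    The power-versus-retransmissions trade-off discussed above follows from basic ARQ accounting: with word error probability p and independent retries, the expected number of transmissions per word is 1/(1 - p), while the residual error probability falls geometrically with the retry limit. The following sketch shows the standard idealized relations only, not the article's concatenated-code analysis:

```python
def expected_transmissions(p_word_error: float) -> float:
    """Expected transmissions per codeword for ideal ARQ with
    independent retries: sum_{k>=1} k * p^(k-1) * (1-p) = 1/(1-p)."""
    if not 0.0 <= p_word_error < 1.0:
        raise ValueError("word error probability must be in [0, 1)")
    return 1.0 / (1.0 - p_word_error)

def residual_error_after(p_word_error: float, max_attempts: int) -> float:
    """Probability a word is still in error after max_attempts attempts."""
    return p_word_error ** max_attempts

# A 1% word error rate costs only ~1% extra transmissions on average,
# while each additional attempt shrinks the residual error geometrically.
overhead = expected_transmissions(0.01)     # ≈ 1.0101
residual = residual_error_after(0.01, 3)    # ≈ 1e-6
```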

  2. Automatic segmentation in three-dimensional analysis of fibrovascular pigment epithelial detachment using high-definition optical coherence tomography.

    PubMed

    Ahlers, C; Simader, C; Geitzenauer, W; Stock, G; Stetson, P; Dastmalchi, S; Schmidt-Erfurth, U

    2008-02-01

    A limited number of scans compromise conventional optical coherence tomography (OCT) in tracking chorioretinal disease in its full extension. Failures in edge-detection algorithms falsify the results of retinal mapping even further. High-definition OCT (HD-OCT) is based on raster scanning and was used to visualise the localisation and volume of intra- and sub-pigment-epithelial (RPE) changes in fibrovascular pigment epithelial detachments (fPED). Two different scanning patterns were evaluated. 22 eyes with fPED were imaged using a frequency-domain, high-speed prototype of the Cirrus HD-OCT. The axial resolution was 6 μm, and the scanning speed was 25,000 A-scans/s. Two different scanning patterns covering an area of 6 x 6 mm in the macular retina were compared. Three-dimensional topographic reconstructions and volume calculations were performed using MATLAB-based automatic segmentation software. Detailed information about the layer-specific distribution of fluid accumulation and volumetric measurements can be obtained for retinal and sub-RPE volumes. Both raster scans show a high correlation (p<0.01; R² > 0.89) of measured values, that is, PED volume/area, retinal volume, and mean retinal thickness. Quality control of the automatic segmentation revealed reasonable results in over 90% of the examinations. Automatic segmentation allows for detailed quantitative and topographic analysis of the RPE and the overlying retina. In fPED, the 128 x 512 scanning pattern shows mild advantages when compared with the 256 x 256 scan. Together with the ability for automatic segmentation, HD-OCT clearly improves the clinical monitoring of chorioretinal disease by adding relevant new parameters. HD-OCT is likely capable of enhancing the understanding of pathophysiology and the benefits of treatment for current anti-CNV strategies in the future.

  3. Attention to Automatic Movements in Parkinson's Disease: Modified Automatic Mode in the Striatum

    PubMed Central

    Wu, Tao; Liu, Jun; Zhang, Hejia; Hallett, Mark; Zheng, Zheng; Chan, Piu

    2015-01-01

    We investigated neural correlates when attending to a movement that could be made automatically in healthy subjects and Parkinson's disease (PD) patients. Subjects practiced a visuomotor association task until they could perform it automatically, and then directed their attention back to the automated task. Functional MRI was obtained during the early-learning, automatic stage, and when re-attending. In controls, attention to automatic movement induced more activation in the dorsolateral prefrontal cortex (DLPFC), anterior cingulate cortex, and rostral supplementary motor area. The motor cortex received more influence from the cortical motor association regions. In contrast, the pattern of the activity and connectivity of the striatum remained at the level of the automatic stage. In PD patients, attention enhanced activity in the DLPFC, premotor cortex, and cerebellum, but the connectivity from the putamen to the motor cortex decreased. Our findings demonstrate that, in controls, when a movement achieves the automatic stage, attention can influence the attentional networks and cortical motor association areas, but has no apparent effect on the striatum. In PD patients, attention induces a shift from the automatic mode back to the controlled pattern within the striatum. The shifting between controlled and automatic behaviors relies in part on striatal function. PMID:24925772

  4. [Wearable Automatic External Defibrillators].

    PubMed

    Luo, Huajie; Luo, Zhangyuan; Jin, Xun; Zhang, Leilei; Wang, Changjin; Zhang, Wenzan; Tu, Quan

    2015-11-01

    Defibrillation is the most effective method of treating ventricular fibrillation (VF). This paper introduces a wearable automatic external defibrillator based on an embedded system, which includes ECG measurement, bioelectrical impedance measurement, and discharge defibrillation modules. It can automatically identify the VF signal and deliver a biphasic exponential waveform defibrillation discharge. As verified by animal tests, the device can realize ECG acquisition and automatic identification; after identifying the ventricular fibrillation signal, it can automatically defibrillate to abort ventricular fibrillation and realize cardiac electrical cardioversion.

  5. Comparative Analysis of Automatic Exudate Detection between Machine Learning and Traditional Approaches

    NASA Astrophysics Data System (ADS)

    Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas

    To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated early detection of exudates (one of the visible signs of diabetic retinopathy) could help to reduce the number of blind diabetic patients. Traditional automatic exudate detection methods are based on specific parameter configurations, while machine learning approaches, which seem more flexible, may be computationally expensive. A comparative analysis of traditional and machine-learning exudate detection methods, namely mathematical morphology, fuzzy c-means clustering, a naive Bayesian classifier, a Support Vector Machine, and a Nearest Neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy, and time complexity of each method are also compared.
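    The comparison criteria listed above (sensitivity, specificity, precision, accuracy) are all derived from per-pixel confusion counts against the hand-drawn ground truth. A minimal sketch of these standard definitions, using hypothetical counts:

```python
def exudate_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard screening metrics used to compare exudate detectors
    against expert-drawn ground truth (per pixel or per lesion)."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true exudates found
        "specificity": tn / (tn + fp),  # fraction of healthy pixels kept
        "precision":   tp / (tp + fp),  # fraction of detections that are real
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical confusion counts for one detector on one image set
m = exudate_metrics(tp=80, fp=20, tn=880, fn=20)
```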

  6. Approaches to the automatic generation and control of finite element meshes

    NASA Technical Reports Server (NTRS)

    Shephard, Mark S.

    1987-01-01

    The algorithmic approaches being taken to the development of finite element mesh generators capable of automatically discretizing general domains without the need for user intervention are discussed. It is demonstrated that because of the modeling demands placed on an automatic mesh generator, all the approaches taken to date produce unstructured meshes. Consideration is also given to both a priori and a posteriori mesh control devices for automatic mesh generators as well as their integration with geometric modeling and adaptive analysis procedures.

  7. Accuracy of Automatic Cephalometric Software on Landmark Identification

    NASA Astrophysics Data System (ADS)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study assessed the accuracy of an automatic cephalometric analysis software in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used in this study. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracing by the manual method. Superimposition of the printed image and manual tracing was done by registration at the soft tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences of distances of each landmark on the Cartesian plane, where the X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p<0.05) were found in 5 landmarks (Or, A-point, Me, L1T, and L1A) in the horizontal direction and 7 landmarks (Or, A-point, U1T, U1A, B-point, Me, and L1A) in the vertical direction. Four landmarks (Or, A-point, Me, and L1A) showed significant (p<0.05) mean differences in both horizontal and vertical directions. Small mean differences (<0.5 mm) were found for S, N, B-point, Gn, and Pog in the horizontal direction and N, Gn, Me, and L1T in the vertical direction. Large mean differences were found for A-point (3.0 to 3.5 mm) in the horizontal direction and L1A (>4 mm) in the vertical direction. Only 5 of 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the position of landmarks in order to increase the accuracy of cephalometric analysis.
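    The one-sample t-test used above checks whether the mean automatic-minus-manual landmark difference deviates from zero. A minimal sketch of the test statistic, with hypothetical coordinate differences rather than the study's data:

```python
import math

def one_sample_t(diffs):
    """t statistic for H0: mean landmark difference == 0
    (automatic minus manual, in mm)."""
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return mean, t

# Hypothetical x-coordinate differences (mm) for one landmark across patients
diffs = [0.4, 0.1, 0.3, 0.5, 0.2, 0.4, 0.3, 0.2]
mean_diff, t_stat = one_sample_t(diffs)
```

    A |t| value well above the critical value for n-1 degrees of freedom indicates a systematic offset of the automatic landmark, as found here for Or, A-point, Me, and L1A.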

  8. Theoretical Analysis of the Longitudinal Behavior of an Automatically Controlled Supersonic Interceptor During the Attack Phase

    NASA Technical Reports Server (NTRS)

    Gates, Ordway B., Jr.; Woodling, C. H.

    1959-01-01

    Theoretical analysis of the longitudinal behavior of an automatically controlled supersonic interceptor during the attack phase against a nonmaneuvering target is presented. Control of the interceptor's flight path is obtained by use of a pitch rate command system. Topics include lift and pitching moment, effects of initial tracking errors, discussion of normal acceleration limiting, limitations of control surface rate and deflection, and effects of neglecting forward velocity changes of the interceptor during the attack phase.

  9. Automatic Processing of Current Affairs Queries

    ERIC Educational Resources Information Center

    Salton, G.

    1973-01-01

    The SMART system is used for the analysis, search, and retrieval of news stories appearing in "Time" magazine. A comparison is made between the automatic text processing methods incorporated into the SMART system and a manual search using the classified index to "Time." (14 references) (Author)

  10. Automatic exposure control for space sequential camera

    NASA Technical Reports Server (NTRS)

    Mcatee, G. E., Jr.; Stoap, L. J.; Solheim, C. D.; Sharpsteen, J. T.

    1975-01-01

    The final report for the automatic exposure control study for space sequential cameras, for the NASA Johnson Space Center, is presented. The material is shown in the same sequence in which the work was performed. The purpose of the automatic exposure control is to control the lens iris as well as the camera shutter automatically so that the subject is properly exposed on the film. A study of design approaches is presented. Analysis of the light range covered indicates that the practical range would be from approximately 20 to 6,000 foot-lamberts, or about nine f-stops. Observation of film available from space flights shows that optimum scene illumination is apparently not present in vehicle interior photography or in vehicle-to-vehicle situations. The evaluation test procedure for a breadboard, and the results, which provided information for the design of a brassboard, are given.

  11. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    Current digital signal analysis algorithms implemented in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis rather than linguistic analysis of the speech signal. Several digital signal processing algorithms are available for voice recognition, among them Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Among these, LPC is the most widely used: it has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency-domain algorithms with fewer assumptions, but they are not widely implemented or investigated. With recent advances in digital technology, namely signal processors, these two frequency-domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
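    Of the algorithms surveyed, LPC is concrete enough to sketch: the autocorrelation method fits an all-pole model to a frame via the Levinson-Durbin recursion, which is why its execution time and memory needs are modest. A minimal sketch, verified here on a synthetic second-order autoregressive signal rather than real speech:

```python
import numpy as np

def lpc(signal: np.ndarray, order: int) -> np.ndarray:
    """LPC coefficients via the autocorrelation method and the
    Levinson-Durbin recursion (prediction polynomial, a[0] = 1)."""
    n = len(signal)
    r = np.array([np.dot(signal[: n - k], signal[k:]) for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1 : 0 : -1])
        k = -acc / err                      # reflection coefficient
        a[1:i] += k * a[i - 1 : 0 : -1]     # symmetric coefficient update
        a[i] = k
        err *= 1.0 - k * k                  # prediction error shrinks
    return a

# Synthetic AR(2) signal: s[t] = 0.6 s[t-1] - 0.2 s[t-2] + noise
rng = np.random.default_rng(0)
s = np.zeros(4000)
e = rng.standard_normal(4000)
for t in range(2, 4000):
    s[t] = 0.6 * s[t - 1] - 0.2 * s[t - 2] + e[t]
coeffs = lpc(s, order=2)   # expect roughly [1, -0.6, 0.2]
```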

  12. Automatic registration of ICG images using mutual information and perfusion analysis

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Jong-Mo; Lee, June-goo; Kim, Jong Hyo; Park, Kwangsuk; Yu, Hyeong-Gon; Yu, Young Suk; Chung, Hum

    2005-04-01

    Introduction: Indocyanine green fundus angiography (ICGA) of the eye is a useful method for detecting and characterizing choroidal neovascularization (CNV), the major cause of blindness over 65 years of age. For quantitative analysis of blood flow on ICGA, a systematic approach for automatic registration using mutual information, together with a quantitative analysis, was developed. Methods: Intermittent sequential images of indocyanine green angiography were acquired by a Heidelberg retinal angiograph, which uses a laser scanning system for image acquisition. Misalignment of each image, generated by minute eye movements of the patients, was corrected by the mutual information method, because the distribution of the contrast media in the image changes throughout the time sequence. Several regions of interest (ROIs) were selected by a physician, and the intensities of the selected regions were plotted against the time sequence. Results: The registration of ICGA time-sequential images requires not only a translation transform but also a rotational transform. Signal intensities showed variation following a gamma-variate function depending on the ROI, and capillary vessels showed more variance of signal intensity than major vessels. CNV showed intermediate variance of signal intensity and prolonged transit time. Conclusion: The registered images can be used not only for quantitative analysis but also for perfusion analysis. Various investigative approaches to CNV using this method will be helpful in the characterization of the lesion and in follow-up.
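    The mutual information criterion used for registration can be computed from the joint intensity histogram of two frames; alignment is then found by searching for the translation and rotation that maximize it. A minimal sketch of the metric itself, demonstrated on random test images rather than angiographic data:

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 32) -> float:
    """Mutual information between two images from their joint intensity
    histogram; MI peaks when the images are correctly aligned."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)   # marginal of image B
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
frame = rng.random((64, 64))
# An image is maximally informative about itself...
mi_self = mutual_information(frame, frame)
# ...and nearly independent of an unrelated noise frame.
mi_noise = mutual_information(frame, rng.random((64, 64)))
```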

  13. Systematic Design of High-performance Hybrid Feedback Algorithms

    DTIC Science & Technology

    2015-06-24

    Automatic Control, vol. 59, no. 9, pp. 2426-2441, 2014. J6. Liberzon, D.; Nešić, D.; Teel, A.R., “Lyapunov-based small-gain theorems for hybrid...

  14. Automatic identification of artifacts in electrodermal activity data.

    PubMed

    Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind

    2015-01-01

    Recently, wearable devices have allowed for long-term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy and recording artifacts can easily be mistaken for physiological responses during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.

  15. The feasibility of a regional CTDIvol to estimate organ dose from tube current modulated CT exams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khatonabadi, Maryam; Kim, Hyun J.; Lu, Peiyun

    normalized dose to correlate with patient size was investigated. Results: For all five organs, the correlations with patient size increased when organ doses were normalized by regional and organ-specific CTDIvol values. For example, when estimating dose to the liver, CTDIvol,global yielded an R² value of 0.26, which improved to 0.77 and 0.86 when using the regional and organ-specific CTDIvol for abdomen and liver, respectively. For breast dose, the global CTDIvol yielded an R² value of 0.08, which improved to 0.58 and 0.83 when using the regional and organ-specific CTDIvol for chest and breasts, respectively. The R² values also increased once the thoracic models were separated for the analysis into females and males, indicating differences between genders in this region not explained by a simple measure of effective diameter. Conclusions: This work demonstrated the utility of regional and organ-specific CTDIvol as normalization factors when using TCM. It was demonstrated that CTDIvol,global is not an effective normalization factor in TCM exams where attenuation (and therefore tube current) varies considerably throughout the scan, such as abdomen/pelvis and even thorax. These exams can be more accurately assessed for dose using regional CTDIvol descriptors that account for local variations in scanner output present when TCM is employed.
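    The normalization argument above can be reproduced in miniature: regressing organ dose directly on patient size gives a poor fit when scanner output varies between exams, while dose normalized by a per-exam CTDIvol correlates well. A sketch with synthetic data; the dose model and cohort are invented for illustration, not taken from the study:

```python
import numpy as np

def r_squared(x: np.ndarray, y: np.ndarray) -> float:
    """Coefficient of determination for a linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1.0 - resid.var() / y.var()

# Hypothetical cohort: organ dose falls with effective diameter, but
# scanner output (CTDIvol under TCM) varies from exam to exam.
rng = np.random.default_rng(2)
diameter = rng.uniform(20, 40, 200)        # effective diameter, cm
ctdi_regional = rng.uniform(5, 25, 200)    # per-exam regional CTDIvol, mGy
organ_dose = ctdi_regional * np.exp(-0.05 * (diameter - 30))

r2_raw = r_squared(diameter, organ_dose)                   # weak correlation
r2_norm = r_squared(diameter, organ_dose / ctdi_regional)  # strong correlation
```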

  16. Automatic tracking of labeled red blood cells in microchannels.

    PubMed

    Pinho, Diana; Lima, Rui; Pereira, Ana I; Gayubo, Fernando

    2013-09-01

    The current study proposes an automatic method for the segmentation and tracking of red blood cells flowing through a 100-μm glass capillary. The original images were obtained by means of a confocal system and then processed in MATLAB using the Image Processing Toolbox. The measurements obtained with the proposed automatic method were compared with the results determined by a manual tracking method. The comparison was performed by using both linear regressions and Bland-Altman analysis. The results have shown a good agreement between the two methods. Therefore, the proposed automatic method is a powerful way to provide rapid and accurate measurements for in vitro blood experiments in microchannels. Copyright © 2012 John Wiley & Sons, Ltd.
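    The Bland-Altman analysis mentioned above summarizes agreement between two measurement methods as a mean bias plus 95% limits of agreement. A minimal sketch with hypothetical paired measurements:

```python
import numpy as np

def bland_altman(manual: np.ndarray, automatic: np.ndarray):
    """Bland-Altman agreement statistics: mean bias and the 95%
    limits of agreement (bias ± 1.96 SD of the differences)."""
    diff = automatic - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired cell-velocity measurements (mm/s)
manual_v    = np.array([1.02, 0.98, 1.10, 1.05, 0.95, 1.00])
automatic_v = np.array([1.04, 0.97, 1.12, 1.06, 0.96, 1.01])
bias, (lo, hi) = bland_altman(manual_v, automatic_v)
```

    Good agreement means a bias near zero and limits of agreement narrow enough to be clinically (or experimentally) unimportant.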

  17. Automatic visibility retrieval from thermal camera images

    NASA Astrophysics Data System (ADS)

    Dizerens, Céline; Ott, Beat; Wellig, Peter; Wunderle, Stefan

    2017-10-01

    This study presents an automatic visibility retrieval from a FLIR A320 Stationary Thermal Imager installed on a measurement tower on the mountain Lagern, located in the Swiss Jura Mountains. Our visibility retrieval makes use of edges that are automatically detected from thermal camera images. Predefined target regions, such as mountain silhouettes or buildings with high thermal differences to the surroundings, are used to derive the maximum visibility distance that is detectable in the image. To allow stable, automatic processing, our procedure additionally removes noise in the image and includes automatic image alignment to correct small shifts of the camera. We present a detailed analysis of visibility derived from more than 24,000 thermal images of the years 2015 and 2016 by comparing them to (1) visibility derived from a panoramic camera image (VISrange), (2) measurements of a forward-scatter visibility meter (Vaisala FD12 working in the NIR spectrum), and (3) modeled visibility values using the Thermal Range Model TRM4. Atmospheric conditions, mainly water vapor from the European Centre for Medium-Range Weather Forecasts (ECMWF), were considered to calculate the extinction coefficients using MODTRAN. The automatic visibility retrieval based on FLIR A320 images is often in good agreement with the retrievals from the systems working in different spectral ranges. However, some significant differences were detected as well, depending on weather conditions, thermal differences of the monitored landscape, and defined target size.
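    The retrieval principle described above reduces to: among predefined targets at known distances, visibility is the distance of the farthest target whose edges are still detected. A deliberately simplified sketch; the target distances and detection flags are hypothetical:

```python
def visibility_from_targets(distances_km, edge_detected):
    """Maximum visibility = distance of the farthest predefined target
    whose edges are still detectable in the (aligned, denoised) image."""
    visible = [d for d, seen in zip(distances_km, edge_detected) if seen]
    return max(visible) if visible else 0.0

# Hypothetical target distances (km) and per-target edge-detection flags
targets = [0.5, 2.0, 8.0, 15.0, 25.0]
flags   = [True, True, True, True, False]
vis_km = visibility_from_targets(targets, flags)
```

    The real procedure adds the image-alignment and noise-removal steps so that a missing edge reflects atmospheric extinction rather than camera drift.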

  18. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    PubMed

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  19. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens

    PubMed Central

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-01-01

    This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear–cytoplasmic ratio, nuclear density, and number of layers using the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method was found to be an effective method for the quantification of the Edmondson grade. PMID:27335894

  20. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    ERIC Educational Resources Information Center

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…

  1. Fully automatic registration and segmentation of first-pass myocardial perfusion MR image sequences.

    PubMed

    Gupta, Vikas; Hendriks, Emile A; Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F

    2010-11-01

    Derivation of diagnostically relevant parameters from first-pass myocardial perfusion magnetic resonance images involves the tedious and time-consuming manual segmentation of the myocardium in a large number of images. To reduce the manual interaction and expedite the perfusion analysis, we propose an automatic registration and segmentation method for the derivation of perfusion linked parameters. A complete automation was accomplished by first registering misaligned images using a method based on independent component analysis, and then using the registered data to automatically segment the myocardium with active appearance models. We used 18 perfusion studies (100 images per study) for validation in which the automatically obtained (AO) contours were compared with expert drawn contours on the basis of point-to-curve error, Dice index, and relative perfusion upslope in the myocardium. Visual inspection revealed successful segmentation in 15 out of 18 studies. Comparison of the AO contours with expert drawn contours yielded 2.23 ± 0.53 mm and 0.91 ± 0.02 as point-to-curve error and Dice index, respectively. The average difference between manually and automatically obtained relative upslope parameters was found to be statistically insignificant (P = .37). Moreover, the analysis time per slice was reduced from 20 minutes (manual) to 1.5 minutes (automatic). We proposed an automatic method that significantly reduced the time required for analysis of first-pass cardiac magnetic resonance perfusion images. The robustness and accuracy of the proposed method were demonstrated by the high spatial correspondence and statistically insignificant difference in perfusion parameters, when AO contours were compared with expert drawn contours. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.
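    The Dice index reported above measures spatial overlap between automatically obtained and expert-drawn contours (1.0 is perfect agreement; the study's 0.91 indicates close correspondence). A minimal sketch on toy binary masks:

```python
import numpy as np

def dice_index(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary segmentation masks:
    2|A ∩ B| / (|A| + |B|)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Toy myocardium masks: the automatic contour misses one pixel
manual_mask    = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]])
automatic_mask = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
d = dice_index(manual_mask, automatic_mask)   # 2*3 / (4 + 3) ≈ 0.857
```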

  2. Bringing automatic stereotyping under control: implementation intentions as efficient means of thought control.

    PubMed

    Stewart, Brandon D; Payne, B Keith

    2008-10-01

The evidence for whether intentional control strategies can reduce automatic stereotyping is mixed. Therefore, the authors tested the utility of implementation intentions (specific plans linking a behavioral opportunity to a specific response) in reducing automatic bias. In three experiments, automatic stereotyping was reduced when participants made an intention to think specific counterstereotypical thoughts whenever they encountered a Black individual. The authors used two implicit tasks and process dissociation analysis, which allowed them to separate the contributions of automatic and controlled thinking to task performance. Of importance, the reduction in stereotyping was driven by a change in automatic stereotyping and not controlled thinking. This benefit was acquired with little practice and generalized to novel faces. Thus, implementation intentions may be an effective and efficient means for controlling automatic aspects of thought.

  3. Index to Nuclear Safety. A technical progress review by chronology, permuted title, and author. Vol. 11, No. 1--Vol. 17, No. 6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cottrell, W.B.; Klein, A.

    1977-02-23

    This index to Nuclear Safety covers articles in Nuclear Safety Vol. 11, No. 1 (Jan.-Feb. 1970), through Vol. 17, No. 6 (Nov.-Dec. 1976). The index includes a chronological list of articles (including abstract) followed by KWIC and Author Indexes. Nuclear Safety, a bimonthly technical progress review prepared by the Nuclear Safety Information Center, covers all safety aspects of nuclear power reactors and associated facilities. The index lists over 350 technical articles in the last six years of publication.

  4. [The mediating role of anger in the relationship between automatic thoughts and physical aggression in adolescents].

    PubMed

    Yavuzer, Yasemin; Karataş, Zeynep

    2013-01-01

This study aimed to examine the mediating role of anger in the relationship between automatic thoughts and physical aggression in adolescents. The study included 224 adolescents in the 9th grade of 3 different high schools in central Burdur during the 2011-2012 academic year. Participants completed the Aggression Questionnaire and Automatic Thoughts Scale in their classrooms during counseling sessions. Data were analyzed using simple and multiple linear regression analysis. There were positive correlations between the adolescents' automatic thoughts, physical aggression, and anger. According to regression analysis, automatic thoughts effectively predicted the level of physical aggression (b = 0.233, P < 0.001) and anger (b = 0.325, P < 0.001). Analysis of the mediating role of anger showed that anger fully mediated the relationship between automatic thoughts and physical aggression (Sobel z = 5.646, P < 0.001). Providing adolescents with anger management skills training is very important for the prevention of physical aggression. Such training programs should include components related to the development of an awareness of dysfunctional and anger-triggering automatic thoughts, and how to change them. As the study group included adolescents from Burdur, the findings can only be generalized to groups with similar characteristics.
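The Sobel test used in the mediation analysis has a closed form: z = ab / sqrt(b²·SEa² + a²·SEb²), where a is the path from predictor to mediator and b the path from mediator to outcome. A hedged sketch; the coefficients below are made-up illustrations, not the study's values:

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel test statistic for the indirect (mediated) effect a*b.

    a, se_a: predictor-to-mediator coefficient and its standard error.
    b, se_b: mediator-to-outcome coefficient and its standard error.
    """
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

# Illustrative (hypothetical) coefficients; |z| > 1.96 is significant at P < .05.
print(round(sobel_z(0.5, 0.1, 0.4, 0.1), 3))  # 3.123
```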

  5. Terminal Sliding Mode Tracking Controller Design for Automatic Guided Vehicle

    NASA Astrophysics Data System (ADS)

    Chen, Hongbin

    2018-03-01

Based on sliding mode variable structure control theory, the path tracking problem of the automatic guided vehicle is studied, and a controller design method based on the terminal sliding mode is proposed. First, by analyzing the characteristics of automatic guided vehicle movement, the kinematic model is presented. Then, to improve on the traditional formulation of the terminal sliding mode, a nonlinear sliding surface with faster convergence is designed; theoretical analysis verifies that the designed sliding mode is stable and converges in finite time. Finally, the Lyapunov method is used to design the tracking control law of the automatic guided vehicle, so that the controller makes the vehicle track the desired trajectory in the global sense as well as in finite time. The simulation results verify the correctness and effectiveness of the control law.

  6. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using a Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is made automatically and objectively by spectral matching comparison of the MCR-decomposed Raman spectra with the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, and 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin.
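The spectral matching step, comparing each MCR-decomposed component against the keratin reference spectrum, can be scored with a normalized correlation (cosine) measure. A minimal sketch; the toy vectors below are illustrative, not real Raman data, and the paper's exact matching metric may differ:

```python
import math

def spectral_match(s1, s2):
    """Cosine similarity between two spectra sampled on the same wavenumber grid."""
    dot = sum(x * y for x, y in zip(s1, s2))
    norm1 = math.sqrt(sum(x * x for x in s1))
    norm2 = math.sqrt(sum(y * y for y in s2))
    return dot / (norm1 * norm2)

keratin_ref = [1.0, 2.0, 3.0]    # hypothetical reference spectrum
mcr_component = [2.0, 4.0, 6.0]  # same shape, different intensity scale
print(round(spectral_match(keratin_ref, mcr_component), 6))  # 1.0
```

Because the score is scale-invariant, a component that matches keratin in shape scores near 1 regardless of overall intensity, which is what makes it usable as a positivity criterion.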

  7. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    PubMed Central

    Bayır, Şafak

    2016-01-01

With the advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, which appears in the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes, in terms of machine learning, the information describing the optic disc and that describing hard exudates may be indistinguishable. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC. PMID:27110272

  8. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  9. Comparison of automatic procedures in the selection of peaks over threshold in flood frequency analysis: A Canadian case study in the context of climate change

    NASA Astrophysics Data System (ADS)

    Durocher, M.; Mostofi Zadeh, S.; Burn, D. H.; Ashkar, F.

    2017-12-01

Floods are one of the most costly hazards, and frequency analysis of river discharges is an important part of the tools at our disposal to evaluate their inherent risks and to provide an adequate response. In comparison to the common examination of annual streamflow maxima, peaks over threshold (POT) is an interesting alternative that makes better use of the available information by including more than one flood event per year (on average). However, a major challenge is the selection of a satisfactory threshold above which peaks are assumed to respect certain conditions necessary for an adequate estimation of the risk. Additionally, studies have shown that POT is also a valuable approach to investigate the evolution of flood regimes in the context of climate change. Recently, automatic procedures were suggested to guide this important threshold selection, which otherwise relies on graphical tools and expert judgment. Furthermore, having an objective, automatic procedure allows for quickly repeating the analysis on a large number of samples, which is useful in the context of large databases or for uncertainty analysis based on a resampling approach. This study investigates the impact of considering such procedures in a case study including many sites across Canada. A simulation study is conducted to evaluate the bias and predictive power of the automatic procedures in similar conditions, as well as the power of derived nonstationarity tests. The results obtained are also evaluated in the light of expert judgments established in a previous study. Ultimately, this study provides a thorough examination of the considerations that need to be addressed when conducting POT analysis using automatic threshold selection.
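The starting point of any POT analysis is extracting peaks above a candidate threshold. A simplified sketch, assuming a plain discharge series and a minimum peak separation; real flood declustering criteria also impose inter-event time and flow-recession conditions not modeled here:

```python
def peaks_over_threshold(series, threshold, min_separation=1):
    """Indices of local maxima exceeding threshold, keeping only the larger of
    two peaks closer together than min_separation samples (crude declustering)."""
    peaks = []
    for i in range(1, len(series) - 1):
        if series[i] > threshold and series[i] >= series[i - 1] and series[i] > series[i + 1]:
            if peaks and i - peaks[-1] < min_separation:
                if series[i] > series[peaks[-1]]:
                    peaks[-1] = i  # keep the larger of the two clustered peaks
            else:
                peaks.append(i)
    return peaks

discharge = [0, 5, 1, 0, 6, 2, 7, 0]  # hypothetical discharge values
print(peaks_over_threshold(discharge, threshold=4, min_separation=3))  # [1, 6]
```

Lowering the threshold admits more peaks per year but risks violating the independence and asymptotic assumptions behind the risk estimate, which is exactly the trade-off the automatic selection procedures aim to manage.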

  10. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis, with recalculation of the daily recorded fluence and hence dose distribution, brings this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; these are archived on a backed-up network drive (figure). A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within +/-0.10mm: 57% within +/-0.01mm; 89% within 0.05mm. Mean leaf position deviation is 0.02mm. Gantry angle variations lie in the range -0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
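The leaf-position statistics reported above (mean deviation, fraction within a tolerance band) are straightforward once planned and logged MLC positions are paired. A minimal sketch with made-up positions in millimetres; the 0.05 mm default tolerance simply mirrors one of the bands quoted in the abstract:

```python
def leaf_position_stats(planned, actual, tol=0.05):
    """Mean absolute leaf-position deviation (mm) and fraction of leaves within tol (mm)."""
    devs = [abs(p - a) for p, a in zip(planned, actual)]
    mean_dev = sum(devs) / len(devs)
    frac_within = sum(d <= tol for d in devs) / len(devs)
    return mean_dev, frac_within

planned = [10.0, 12.0, 14.0, 16.0]   # hypothetical planned leaf positions
logged = [10.02, 12.0, 14.1, 15.98]  # hypothetical trajectory-log positions
mean_dev, frac = leaf_position_stats(planned, logged)
print(round(mean_dev, 3), frac)  # 0.035 0.75
```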

  11. An integrated exhaust gas analysis system with self-contained data processing and automatic calibration

    NASA Technical Reports Server (NTRS)

    Anderson, R. C.; Summers, R. L.

    1981-01-01

An integrated gas analysis system designed to operate in automatic, semiautomatic, and manual modes from a remote control panel is described. The system measures carbon monoxide, oxygen, water vapor, total hydrocarbons, carbon dioxide, and oxides of nitrogen. A pull-through design provides increased reliability and eliminates the need for manual flow rate adjustment and pressure correction. The system contains two microprocessors that range the analyzers, calibrate the system, process the raw data into units of concentration, and provide information to the facility research computer and to the operator through a terminal and the control panels. After initial setup, the system operates for several hours without significant operator attention.

  12. Motor automaticity in Parkinson’s disease

    PubMed Central

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review the motor deficits associated with impaired automaticity in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of measures of automaticity in the early diagnosis of PD would be valuable. PMID:26102020

  13. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, it can be detected easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from an individual array automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
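The local detection rule described above (flag modules whose mean thermal intensity deviates from the statistics of their own array) can be sketched as follows. The temperatures and the threshold factor k are illustrative assumptions, not values from the paper:

```python
import statistics

def detect_faulty_panels(panel_means, k=2.0):
    """Flag panel indices whose mean thermal intensity deviates from the
    array mean by more than k sample standard deviations (local rule,
    applied per array so that sensor-to-target distance effects cancel)."""
    mu = statistics.mean(panel_means)
    sigma = statistics.stdev(panel_means)
    return [i for i, m in enumerate(panel_means) if abs(m - mu) > k * sigma]

# Hypothetical per-panel mean intensities for one array; panel 5 runs hot.
array_means = [30.1, 30.3, 29.9, 30.0, 30.2, 45.0]
print(detect_faulty_panels(array_means, k=1.5))  # [5]
```

Because the rule is evaluated per array, k can be tuned at the array level, which is the sensitivity adjustment the abstract credits the local rule with.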

  14. Automatic microscopy for mitotic cell location.

    NASA Technical Reports Server (NTRS)

    Herron, J.; Ranshaw, R.; Castle, J.; Wald, N.

    1972-01-01

    Advances are reported in the development of an automatic microscope with which to locate hematologic or other cells in mitosis for subsequent chromosome analysis. The system under development is designed to perform the functions of: slide scanning to locate metaphase cells; conversion of images of selected cells into binary form; and on-line computer analysis of the digitized image for significant cytogenetic data. Cell detection criteria are evaluated using a test sample of 100 mitotic cells and 100 artifacts.

  15. On a program manifold's stability of one contour automatic control systems

    NASA Astrophysics Data System (ADS)

    Zumatov, S. S.

    2017-12-01

A methodology for analysing the stability of single-contour automatic feedback control systems in the presence of non-linearities is expounded. The methodology is based on the use of the simplest mathematical models of nonlinear controllable systems. The stability of program manifolds of single-contour automatic control systems is investigated. Sufficient conditions for the absolute stability of a program manifold of single-contour automatic control systems are obtained, and the Hurwitz angle of absolute stability is determined. Sufficient conditions for the absolute stability of a program manifold of systems controlling the course of a plane in autopilot mode are obtained by means of Lyapunov's second method.

  16. Automatic analysis of ciliary beat frequency using optical flow

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g. primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed us to compute the CBF. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes recovered) by the method; an easy way to select the correct sub-path of a point's path has yet to be found for the cases where the slope method does not work.
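The frequency-analysis step (a Fourier transform of a tracked point's motion signal, with the dominant component giving the CBF) can be sketched with a naive DFT. The synthetic 8 Hz sinusoid below stands in for a real cilium trace, and the frame rate is an assumption:

```python
import math

def dominant_frequency(signal, fps):
    """Dominant frequency (Hz) of a real-valued signal via a naive DFT power scan."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]  # remove DC so bin 0 cannot dominate
    best_k, best_power = 1, 0.0
    for k in range(1, n // 2 + 1):
        re = sum(c * math.cos(2 * math.pi * k * i / n) for i, c in enumerate(centred))
        im = sum(c * math.sin(2 * math.pi * k * i / n) for i, c in enumerate(centred))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * fps / n  # convert bin index to Hz

# Synthetic trace: an 8 Hz beat sampled at 128 frames per second for one second.
sig = [math.sin(2 * math.pi * 8 * t / 128) for t in range(128)]
print(dominant_frequency(sig, 128))  # 8.0
```

In practice an FFT would replace this O(n²) scan, and the paper's frequency-histogram step aggregates such per-point estimates to separate cilia from background.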

  17. Automatic classification of seismic events within a regional seismograph network

    NASA Astrophysics Data System (ADS)

    Tiira, Timo; Kortström, Jari; Uski, Marja

    2015-04-01

A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows: P, P coda, S, and S coda. For each signal window, the short-term average (STA) is computed in 20 narrow frequency bands between 1 and 41 Hz. The resulting 80 discrimination parameters are used as training data for the SVM. The SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training events include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98% have been manually identified as explosions or noise and 2% as earthquakes. The SVM method correctly identifies 94% of the non-earthquakes and all the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the workload in manual seismic analysis by leaving only ~5% of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
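The pipeline above feeds 80 band-energy features per event into an SVM. As a hedged stand-in for the SVM stage (the paper uses a proper SVM, not this), the shape of the classification step can be illustrated with a toy nearest-centroid classifier over feature vectors; the 2-D features and labels below are invented for illustration:

```python
import math

def nearest_centroid_train(features, labels):
    """Per-class centroids of feature vectors (toy stand-in for SVM training)."""
    by_class = {}
    for f, y in zip(features, labels):
        by_class.setdefault(y, []).append(f)
    return {y: [sum(col) / len(col) for col in zip(*fs)] for y, fs in by_class.items()}

def nearest_centroid_predict(centroids, feature):
    """Label of the closest class centroid in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda y: dist(centroids[y], feature))

# Hypothetical 2-D band-energy features (the real method uses 80 dimensions).
centroids = nearest_centroid_train(
    [[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]],
    ['earthquake', 'earthquake', 'explosion', 'explosion'])
print(nearest_centroid_predict(centroids, [1.0, 1.0]))  # earthquake
```

An SVM replaces the centroid distance with a maximum-margin decision boundary, which is what gives the method its robustness on the overlapping real-world feature distributions.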

  18. Automatic classification of retinal three-dimensional optical coherence tomography images using principal component analysis network with composite kernels

    NASA Astrophysics Data System (ADS)

    Fang, Leyuan; Wang, Chong; Li, Shutao; Yan, Jun; Chen, Xiangdong; Rabbani, Hossein

    2017-11-01

We present an automatic method, termed the principal component analysis network with composite kernel (PCANet-CK), for the classification of three-dimensional (3-D) retinal optical coherence tomography (OCT) images. Specifically, the proposed PCANet-CK method first utilizes the PCANet to automatically learn features from each B-scan of the 3-D retinal OCT images. Then, multiple kernels are separately applied to a set of very important features of the B-scans and these kernels are fused together, which can jointly exploit the correlations among features of the 3-D OCT images. Finally, the fused (composite) kernel is incorporated into an extreme learning machine for the OCT image classification. We tested our proposed algorithm on two real 3-D spectral domain OCT (SD-OCT) datasets (of normal subjects and subjects with macular edema or age-related macular degeneration), which demonstrated its effectiveness.

  19. Automatic generation of the non-holonomic equations of motion for vehicle stability analysis

    NASA Astrophysics Data System (ADS)

    Minaker, B. P.; Rieveley, R. J.

    2010-09-01

    The mathematical analysis of vehicle stability has been utilised as an important tool in the design, development, and evaluation of vehicle architectures and stability controls. This paper presents a novel method for automatic generation of the linearised equations of motion for mechanical systems that is well suited to vehicle stability analysis. Unlike conventional methods for generating linearised equations of motion in standard linear second order form, the proposed method allows for the analysis of systems with non-holonomic constraints. In the proposed method, the algebraic constraint equations are eliminated after linearisation and reduction to first order. The described method has been successfully applied to an assortment of classic dynamic problems of varying complexity including the classic rolling coin, the planar truck-trailer, and the bicycle, as well as in more recent problems such as a rotor-stator and a benchmark road vehicle with suspension. This method has also been applied in the design and analysis of a novel three-wheeled narrow tilting vehicle with zero roll-stiffness. An application for determining passively stable configurations using the proposed method together with a genetic search algorithm is detailed. The proposed method and software implementation has been shown to be robust and provides invaluable conceptual insight into the stability of vehicles and mechanical systems.

  20. Automatic Text Structuring and Summarization.

    ERIC Educational Resources Information Center

    Salton, Gerard; And Others

    1997-01-01

    Discussion of the use of information retrieval techniques for automatic generation of semantic hypertext links focuses on automatic text summarization. Topics include World Wide Web links, text segmentation, and evaluation of text summarization by comparing automatically generated abstracts with manually prepared abstracts. (Author/LRW)

  1. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer.

    PubMed

    Kessel, Kerstin A; Habermehl, Daniel; Jäger, Andreas; Floca, Ralf O; Zhang, Lanlan; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E

    2013-06-07

In radiation oncology, recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. Using the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed outside the field border (out-of-field). With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm radius; however, this might be due to very rapid growth and/or late detection of the tumor progression. The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We presented a first approach to, and use of, a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition.
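The coverage comparison above (percentage of the recurrence volume inside the 80%-isodose or expanded boost volumes) reduces to a voxel-set overlap once both volumes are rasterized on the same grid. A minimal sketch with toy voxel sets, not patient data:

```python
def fraction_inside(recurrence_voxels, enclosing_voxels):
    """Fraction of the recurrence volume covered by an enclosing volume,
    both given as sets of voxel coordinates on a common grid."""
    rec = set(recurrence_voxels)
    return len(rec & set(enclosing_voxels)) / len(rec)

# Hypothetical voxel sets: 3 of the 4 recurrence voxels fall inside the isodose volume.
rec = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0)}
isodose_80 = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (5, 5, 5)}
print(fraction_inside(rec, isodose_80))  # 0.75
```

Repeating the same computation against the boost, boost + 1 cm, boost + 1.5 cm, and boost + 2 cm volumes gives the in-field/marginal/out-of-field classification used in the study.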

  2. An analysis of automatic human detection and tracking

    NASA Astrophysics Data System (ADS)

    Demuth, Philipe R.; Cosmo, Daniel L.; Ciarelli, Patrick M.

    2015-12-01

This paper presents an automatic method to detect and follow people in video streams. This method uses two techniques to determine the initial position of the person at the beginning of the video file: one based on optical flow and the other based on Histograms of Oriented Gradients (HOG). After defining the initial bounding box, tracking is done using four different trackers: the Median Flow tracker, the TLD tracker, the Mean Shift tracker, and a modified version of the Mean Shift tracker using the HSV color space. The results of the four trackers are then compared.

  3. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…

  4. Finite Element Analysis of Osteosynthesis Screw Fixation in the Bone Stock: An Appropriate Method for Automatic Screw Modelling

    PubMed Central

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

The use of finite element analysis (FEA) has grown into a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have been investigated to improve the method's application diversity. As an example, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study: without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  5. Application of automatic image analysis for morphometric studies of peroxisomes stained cytochemically for catalase. II. Light-microscopic application.

    PubMed

    Beier, K; Fahimi, H D

    1987-01-01

    The feasibility of the application of a television-based image analyzer, the Texture Analysis System (TAS, Leitz Wetzlar, FRG) in conjunction with a light microscope for morphometric studies of hepatic peroxisomes has been investigated. Rat liver peroxisomes were stained with the alkaline-DAB method for localization of catalase and semithin (0.25 and 1 micron) sections of plastic-embedded material were examined under an oil immersion objective. The TAS detected the peroxisomal profiles selectively and determined their morphometric parameters automatically. The same parameters were obtained also by morphometric analysis of electron micrographs from the same material. The volume density of peroxisomes determined by TAS in semithin sections of normal liver, after correction for section thickness, is quite close to the corresponding value obtained by morphometry of electron micrographs. The difference is approximately 20%. In animals treated with the hypolipidemic drug bezafibrate, which causes proliferation of peroxisomes, TAS detected readily the increase in volume density of peroxisomes in semithin sections. In comparison with electron microscopy, however, the light-microscopic approach seems to underestimate the proliferation. The lower resolution of the light microscope and overlapping of neighbouring particles in relatively thick sections used for light-microscopic analysis may account for the differences. The present study has demonstrated the usefulness of automatic image analysis in conjunction with selective cytochemical staining of peroxisomes for morphometry of this organelle in rat liver. The light-microscopic approach is not only faster but is also extremely economical by obviating the use of an electron microscope.

  6. Automatic SAR/optical cross-matching for GCP monograph generation

    NASA Astrophysics Data System (ADS)

    Nutricato, Raffaele; Morea, Alberto; Nitti, Davide Oscar; La Mantia, Claudio; Agrimano, Luigi; Samarelli, Sergio; Chiaradia, Maria Teresa

    2016-10-01

    Ground Control Points (GCP), automatically extracted from Synthetic Aperture Radar (SAR) images through 3D stereo analysis, can be effectively exploited for automatic orthorectification of optical imagery if they can be robustly located in the basic optical images. The present study outlines a SAR/optical cross-matching procedure that allows a robust alignment of radar and optical images and, consequently, the automatic derivation of the corresponding sub-pixel position of the GCPs in the input optical image, expressed as fractional pixel/line image coordinates. The cross-matching is performed in two subsequent steps, in order to progressively improve precision. The first step is based on Mutual Information (MI) maximization between optical and SAR chips, while the second uses Normalized Cross-Correlation as the similarity metric. This work outlines the designed algorithmic solution and discusses the results derived over the urban area of Pisa (Italy), where more than ten COSMO-SkyMed Enhanced Spotlight stereo images with different beams and passes are available. The experimental analysis involves different satellite images, in order to evaluate the performance of the algorithm w.r.t. the optical spatial resolution. An assessment of the performance of the algorithm has been carried out, and errors are computed by measuring the distance between the GCP pixel/line position in the optical image, automatically estimated by the tool, and the "true" position of the GCP, visually identified by an expert user in the optical images.
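The correlation-based refinement step can be sketched in a few lines of numpy. This is an illustrative brute-force Normalized Cross-Correlation search under our own function names, not the authors' implementation; a real tool would then interpolate around the peak for sub-pixel precision:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def best_offset(search, template):
    """Exhaustively slide `template` over `search` and return the integer
    (row, col) offset that maximizes NCC."""
    th, tw = template.shape
    sh, sw = search.shape
    best, best_rc = -2.0, (0, 0)
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            score = ncc(search[r:r + th, c:c + tw], template)
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc
```

In practice the MI step would first provide a coarse alignment, so this exhaustive search only runs over a small window.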

  7. Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping

    2003-05-01

    Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain, and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied for realigning functional medical images to standard structural medical images. Then, the standard 3D brain model, which shows well-defined brain regions, was used to replace the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnosis evaluation score with less than 3% error on average. To sum up, the method obtains precise VOI information automatically from the well-defined standard 3D brain model, sparing the manual drawing of ROIs slice by slice on structural medical images required by the traditional procedure. That is, the method not only provides precise analysis results, but also improves the processing rate for large volumes of medical images in clinical use.
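The mutual-information measure used in such registration can be approximated with a joint histogram; a minimal numpy sketch (an illustrative estimate, not the authors' registration code):

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram-based mutual information (in nats) between two images,
    the similarity measure maximized when realigning functional images
    to structural images."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over x bins
    py = pxy.sum(axis=0, keepdims=True)   # marginal over y bins
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

Registration then searches over rigid-body transforms for the pose that maximizes this quantity.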

  8. Apparatus enables automatic microanalysis of body fluids

    NASA Technical Reports Server (NTRS)

    Soffen, G. A.; Stuart, J. L.

    1966-01-01

    Apparatus will automatically and quantitatively determine body fluid constituents which are amenable to analysis by fluorometry or colorimetry. The results of the tests are displayed as percentages of full scale deflection on a strip-chart recorder. The apparatus can also be adapted for microanalysis of various other fluids.

  9. Automatic sampling and analysis of organics and biomolecules by capillary action-supported contactless atmospheric pressure ionization mass spectrometry.

    PubMed

    Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie

    2013-01-01

    The contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet on the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, a sample spray is readily formed in the proximity of the mass spectrometer, where a high electric field is applied. The gas phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, wells containing the rinsing solvent are alternately arranged between the sample wells, so the C-API capillary can be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for the C-API MS analysis is minimal, with less than 1 nL of the sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated.

  10. Automatic Segmentation of High-Throughput RNAi Fluorescent Cellular Images

    PubMed Central

    Yan, Pingkum; Zhou, Xiaobo; Shah, Mubarak; Wong, Stephen T. C.

    2010-01-01

    High-throughput genome-wide RNA interference (RNAi) screening is emerging as an essential tool to assist biologists in understanding complex cellular processes. The large number of images produced in each study makes manual analysis intractable; hence, automatic cellular image analysis becomes an urgent need, where segmentation is the first and one of the most important steps. In this paper, a fully automatic method for segmentation of cells from genome-wide RNAi screening images is proposed. Nuclei are first extracted from the DNA channel by using a modified watershed algorithm. Cells are then extracted by modeling the interaction between them as well as combining both gradient and region information in the Actin and Rac channels. A new energy functional is formulated based on a novel interaction model for segmenting tightly clustered cells with significant intensity variance and specific phenotypes. The energy functional is minimized by using a multiphase level set method, which leads to a highly effective cell segmentation method. Promising experimental results demonstrate that automatic segmentation of high-throughput genome-wide multichannel screening can be achieved by using the proposed method, which may also be extended to other multichannel image segmentation problems. PMID:18270043
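For illustration, the nucleus-extraction stage can be approximated by thresholding the DNA channel and counting connected components; the paper's modified watershed additionally splits touching nuclei, which this pure-Python sketch (helper names are ours) does not attempt:

```python
import numpy as np
from collections import deque

def label_components(mask):
    """4-connected component labelling of a boolean mask; a minimal
    stand-in for nucleus extraction (the paper uses a modified watershed)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                labels[i, j] = current
                q = deque([(i, j)])
                while q:  # breadth-first flood fill of this component
                    r, c = q.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = current
                            q.append((rr, cc))
    return labels, current

def count_nuclei(dna_channel, threshold):
    """Threshold the DNA channel and count connected bright regions."""
    return label_components(dna_channel > threshold)[1]
```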

  11. A cloud-based system for automatic glaucoma screening.

    PubMed

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases, including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, as well as the resultant medical reports, are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling more effective intervention and disease management.

  12. Automatic classification of retinal three-dimensional optical coherence tomography images using principal component analysis network with composite kernels.

    PubMed

    Fang, Leyuan; Wang, Chong; Li, Shutao; Yan, Jun; Chen, Xiangdong; Rabbani, Hossein

    2017-11-01

    We present an automatic method, termed the principal component analysis network with composite kernel (PCANet-CK), for the classification of three-dimensional (3-D) retinal optical coherence tomography (OCT) images. Specifically, the proposed PCANet-CK method first utilizes the PCANet to automatically learn features from each B-scan of the 3-D retinal OCT images. Then, multiple kernels are separately applied to a set of very important features of the B-scans, and these kernels are fused together, which can jointly exploit the correlations among features of the 3-D OCT images. Finally, the fused (composite) kernel is incorporated into an extreme learning machine for the OCT image classification. We tested our proposed algorithm on two real 3-D spectral domain OCT (SD-OCT) datasets (of normal subjects and subjects with macular edema and age-related macular degeneration), which demonstrated its effectiveness. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  13. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.

    2010-02-28

    Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and promptly to the ringdown data. Thus, mode estimation can be performed reliably and in a timely manner. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
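The core of Prony analysis on a ringdown can be illustrated with a noise-free linear-prediction step: fit an autoregressive model to the samples, then read each mode's frequency and damping from the roots of the prediction polynomial. A minimal numpy sketch (a batch least-squares formulation, not the paper's recursive implementation):

```python
import numpy as np

def prony_modes(y, order):
    """Estimate discrete-time poles of a ringdown signal via linear
    prediction: solve y[n] = sum_k a_k * y[n-k], then root the
    characteristic polynomial."""
    n = len(y)
    # Delayed copies of the signal form the linear-prediction system.
    A = np.column_stack([y[order - k - 1:n - k - 1] for k in range(order)])
    b = y[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Poles are roots of z^order - a_1 z^(order-1) - ... - a_order.
    return np.roots(np.concatenate(([1.0], -a)))

# Synthetic ringdown: the pole angle gives the mode frequency
# (rad/sample) and the pole magnitude gives the damping.
t = np.arange(200)
y = np.exp(-0.02 * t) * np.cos(0.4 * t)
poles = prony_modes(y, 2)
```

In an on-line setting the same fit would be updated recursively and triggered only when the oscillation detector flags ringdown data.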

  14. Automatic Collision Avoidance Technology (ACAT)

    NASA Technical Reports Server (NTRS)

    Swihart, Donald E.; Skoog, Mark A.

    2007-01-01

    This document represents two views of the Automatic Collision Avoidance Technology (ACAT). One viewgraph presentation reviews the development and system design of Automatic Collision Avoidance Technology (ACAT). Two types of ACAT exist: Automatic Ground Collision Avoidance (AGCAS) and Automatic Air Collision Avoidance (AACAS). The AGCAS uses Digital Terrain Elevation Data (DTED) for mapping functions and navigation data to place the aircraft on the map. It then scans the DTED in front of and around the aircraft and uses the future aircraft trajectory (5g) to provide an automatic fly-up maneuver when required. The AACAS uses a data link to determine position and closing rate. It contains several canned maneuvers to avoid collision. Automatic maneuvers can occur at the last instant, and both aircraft maneuver when using the data link. The system can use a sensor in place of the data link. The second viewgraph presentation reviews the development of a flight test and an evaluation of the test. A review of the operation and comparison of the AGCAS and a pilot's performance is given. The same review is given for the AACAS.

  15. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

    One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'.
Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk

  16. Automatic high throughput empty ISO container verification

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2007-04-01

    Encouraging results are presented for the automatic analysis of radiographic images of a continuous stream of ISO containers to confirm they are truly empty. A series of image processing algorithms are described that process real-time data acquired during the actual inspection of each container and assigns each to one of the classes "empty", "not empty" or "suspect threat". This research is one step towards achieving fully automated analysis of cargo containers.

  17. Automatic segmentation of time-lapse microscopy images depicting a live Dharma embryo.

    PubMed

    Zacharia, Eleni; Bondesson, Maria; Riu, Anne; Ducharme, Nicole A; Gustafsson, Jan-Åke; Kakadiaris, Ioannis A

    2011-01-01

    Biological inferences about the toxicity of chemicals reached during experiments on the zebrafish Dharma embryo can be greatly affected by the analysis of the time-lapse microscopy images depicting the embryo. Among the stages of image analysis, automatic and accurate segmentation of the Dharma embryo is the most crucial and challenging. In this paper, an accurate and automatic segmentation approach for the segmentation of the Dharma embryo data obtained by fluorescent time-lapse microscopy is proposed. Experiments performed in four stacks of 3D images over time have shown promising results.

  18. Machine for Automatic Bacteriological Pour Plate Preparation

    PubMed Central

    Sharpe, A. N.; Biggs, D. R.; Oliver, R. J.

    1972-01-01

    A fully automatic system for preparing poured plates for bacteriological analyses has been constructed and tested. The machine can make decimal dilutions of bacterial suspensions, dispense measured amounts into petri dishes, add molten agar, mix the dish contents, and label the dishes with sample and dilution numbers at the rate of 2,000 dishes per 8-hr day. In addition, the machine can be programmed to select different media so that plates for different types of bacteriological analysis may be made automatically from the same sample. The machine uses only the components of the media and sterile polystyrene petri dishes; requirements for all other materials, such as sterile pipettes and capped bottles of diluents and agar, are eliminated. PMID:4560475
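The decimal-dilution bookkeeping that such a machine automates reduces to simple arithmetic; a hypothetical helper (names are ours) for back-calculating the concentration of the original suspension from a plate count:

```python
def cfu_per_ml(colonies, dilution_exponent, volume_plated_ml):
    """CFU/mL of the undiluted sample from a single plate count.

    dilution_exponent: n for a 10^-n decimal dilution of the sample.
    volume_plated_ml: volume of that dilution dispensed into the dish.
    """
    return colonies * (10 ** dilution_exponent) / volume_plated_ml

# e.g. 150 colonies on a 10^-4 dilution plate with 0.1 mL plated
concentration = cfu_per_ml(150, 4, 0.1)
```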

  19. Fluid status monitoring with a wireless network to reduce cardiovascular-related hospitalizations and mortality in heart failure: rationale and design of the OptiLink HF Study (Optimization of Heart Failure Management using OptiVol Fluid Status Monitoring and CareLink).

    PubMed

    Brachmann, Johannes; Böhm, Michael; Rybak, Karin; Klein, Gunnar; Butter, Christian; Klemm, Hanno; Schomburg, Rolf; Siebermair, Johannes; Israel, Carsten; Sinha, Anil-Martin; Drexler, Helmut

    2011-07-01

    The Optimization of Heart Failure Management using OptiVol Fluid Status Monitoring and CareLink (OptiLink HF) study is designed to investigate whether OptiVol fluid status monitoring with an automatically generated wireless CareAlert notification via the CareLink Network can reduce all-cause death and cardiovascular hospitalizations in an HF population, compared with standard clinical assessment. Methods: Patients with newly implanted or replacement cardioverter-defibrillator devices with or without cardiac resynchronization therapy, who have chronic HF in New York Heart Association class II or III and a left ventricular ejection fraction ≤35% will be eligible to participate. Following device implantation, patients are randomized to either OptiVol fluid status monitoring through CareAlert notification or regular care (OptiLink 'on' vs. 'off'). The primary endpoint is a composite of all-cause death or cardiovascular hospitalization. It is estimated that 1000 patients will be required to demonstrate superiority of the intervention group to reduce the primary outcome by 30% with 80% power. The OptiLink HF study is designed to investigate whether early detection of congestion reduces mortality and cardiovascular hospitalization in patients with chronic HF. The study is expected to close recruitment in September 2012 and to report first results in May 2014.

  20. Automatic Denoising of Functional MRI Data: Combining Independent Component Analysis and Hierarchical Fusion of Classifiers

    PubMed Central

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-01-01

    Many sources of fluctuation contribute to the fMRI signal, and this makes it difficult to identify the effects that are truly related to the underlying neuronal activity. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject “at rest”). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing “signal” (brain activity) can be distinguished from the “noise” components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX (“FMRIB’s ICA-based X-noiseifier”), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of
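One of the features mentioned above, the proportion of a component's temporal fluctuations at high frequencies, can be estimated with a simple FFT-based calculation. This is a sketch with an illustrative cutoff, not FIX's actual feature code:

```python
import numpy as np

def high_freq_fraction(ts, tr, cutoff_hz=0.1):
    """Fraction of spectral power above `cutoff_hz` for a component time
    series sampled once every `tr` seconds (the cutoff is illustrative)."""
    power = np.abs(np.fft.rfft(ts - ts.mean())) ** 2
    freqs = np.fft.rfftfreq(len(ts), d=tr)
    total = power.sum()
    return float(power[freqs > cutoff_hz].sum() / total) if total else 0.0
```

A noise classifier combines many such spatial and temporal features rather than thresholding any single one.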

  1. Automatic Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr.

    1985-01-01

    Coal cutting and removal done with minimal hazard to people. Automatic coal mine cutting, transport and roof-support movement all done by automatic machinery. Exposure of people to hazardous conditions reduced to inspection tours, maintenance, repair, and possibly entry mining.

  2. Validation of Computerized Automatic Calculation of the Sequential Organ Failure Assessment Score

    PubMed Central

    Harrison, Andrew M.; Pickering, Brian W.; Herasevich, Vitaly

    2013-01-01

    Purpose. To validate the use of a computer program for the automatic calculation of the sequential organ failure assessment (SOFA) score, as compared to the gold standard of manual chart review. Materials and Methods. Adult admissions (age > 18 years) to the medical ICU with a length of stay greater than 24 hours were studied in the setting of an academic tertiary referral center. A retrospective cross-sectional analysis was performed using a derivation cohort to compare automatic calculation of the SOFA score to the gold standard of manual chart review. After critical appraisal of sources of disagreement, another analysis was performed using an independent validation cohort. Then, a prospective observational analysis was performed using an implementation of this computer program in AWARE Dashboard, which is an existing real-time patient EMR system for use in the ICU. Results. Good agreement between the manual and automatic SOFA calculations was observed for both the derivation (N=94) and validation (N=268) cohorts: 0.02 ± 2.33 and 0.29 ± 1.75 points, respectively. These results were validated in AWARE (N=60). Conclusion. This EMR-based automatic tool accurately calculates SOFA scores and can facilitate ICU decisions without the need for manual data collection. This tool can also be employed in a real-time electronic environment. PMID:23936639
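Automatic calculation of this kind maps EMR values onto the published SOFA thresholds; a simplified sketch covering only two of the six organ systems (the thresholds follow the standard SOFA definition, but this is not the validated tool described above):

```python
def sofa_coagulation(platelets_k):
    """Coagulation subscore from platelet count (x10^3 per uL)."""
    if platelets_k < 20:
        return 4
    if platelets_k < 50:
        return 3
    if platelets_k < 100:
        return 2
    if platelets_k < 150:
        return 1
    return 0

def sofa_renal(creatinine_mg_dl):
    """Renal subscore from serum creatinine (mg/dL), ignoring the
    alternative urine-output criterion."""
    if creatinine_mg_dl >= 5.0:
        return 4
    if creatinine_mg_dl >= 3.5:
        return 3
    if creatinine_mg_dl >= 2.0:
        return 2
    if creatinine_mg_dl >= 1.2:
        return 1
    return 0
```

The full score sums six such subscores (respiration, coagulation, liver, cardiovascular, CNS, renal).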

  3. Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information.

    PubMed

    Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Yan, Bin; Li, Jianxin

    2015-01-01

    Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in the classification accuracies in both experiments, namely, motor imagery and emotion recognition.

  4. Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information

    PubMed Central

    Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Li, Jianxin

    2015-01-01

    Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in the classification accuracies in both experiments, namely, motor imagery and emotion recognition. PMID:26380294

  5. Automatically inserted technical details improve radiology report accuracy.

    PubMed

    Abujudeh, Hani H; Govindan, Siddharth; Narin, Ozden; Johnson, Jamlik Omari; Thrall, James H; Rosenthal, Daniel I

    2011-09-01

    To assess the effect of automatically inserted technical details on the concordance of a radiology report header with the actual procedure performed. The study was IRB approved and informed consent was waived. We obtained radiology report audit data from the hospital's compliance office from the period of January 2005 through December 2009, spanning a total of 20 financial quarters. A "discordance percentage" was defined as the percentage of total studies in which a procedure code change was made during auditing. Using Chi-square analysis we compared discordance percentages between reports with manually inserted technical details (MITD) and automatically inserted technical details (AITD). The second quarter data of 2007 was not included in the analysis as the switch from MITD to AITD occurred during this quarter. The hospital's compliance office audited 9,110 studies from 2005-2009. Excluding the 564 studies in the second quarter of 2007, we analyzed a total of 8,546 studies, 3,948 with MITD and 4,598 with AITD. The discordance percentage in the MITD group was 3.95% (156/3,948; range per quarter, 1.5-6.1%). The AITD discordance percentage was 1.37% (63/4,598; range per quarter, 0.0-2.6%). A Chi-square analysis determined a statistically significant difference between the 2 groups (P < 0.001). There was a statistically significant improvement in the concordance of a radiology report header with the performed procedure using automatically inserted technical details compared to manually inserted details. Copyright © 2011 American College of Radiology. Published by Elsevier Inc. All rights reserved.
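The comparison reported is a standard Pearson chi-square test on a 2x2 table; applied to the audit counts given in the abstract, it reproduces the P < 0.001 finding:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: MITD and AITD; columns: discordant and concordant reports.
stat = chi_square_2x2(156, 3948 - 156, 63, 4598 - 63)
# stat well exceeds 10.83, the 1-df critical value for P < 0.001
```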

  6. Stationary table CT dosimetry and anomalous scanner-reported values of CTDIvol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, Robert L., E-mail: rdixon@wfubmc.edu; Boone, John M.

    2014-01-15

    Purpose: Anomalous, scanner-reported values of CTDIvol for stationary phantom/table protocols (having elevated values of CTDIvol over 300% higher than the actual dose to the phantom) have been observed, which are well beyond the typical accuracy expected of CTDIvol as a phantom dose. Recognition of these outliers as “bad data” is important to users of CT dose index tracking systems (e.g., ACR DIR), and a method for recognition and correction is provided. Methods: Rigorous methods and equations are presented which describe the dose distributions for stationary-table CT. A comparison with formulae for scanner-reported values of CTDIvol clearly identifies the source of these anomalies. Results: For the stationary table, use of the CTDI100 formula (applicable to a moving phantom only) overestimates the dose due to extra scatter and also includes an overbeaming correction, both of which are nonexistent when the phantom (or patient) is held stationary. The reported DLP remains robust for the stationary phantom. Conclusions: The CTDI paradigm does not apply in the case of a stationary phantom and simpler nonintegral equations suffice. A method of correction of the currently reported CTDIvol using the approach-to-equilibrium formula H(a) and an overbeaming correction factor serves to scale the reported CTDIvol values to more accurate levels for stationary-table CT, as well as serving as an indicator in the detection of “bad data.”

  7. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.

  8. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the usage of several different plug-ins, significant user interaction, and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks(®)) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the somatosensorial primary cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat .
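The FWHM quoted above relates to the Gaussian kernel's standard deviation through FWHM = 2*sqrt(2*ln 2)*sigma; small helpers (names are ours) sketch the study's four-voxel rule:

```python
import math

def fwhm_to_sigma(fwhm_mm):
    """Convert a Gaussian FWHM to its standard deviation (same units)."""
    return fwhm_mm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

def voxel_fwhm(voxel_size_mm, factor=4.0):
    """Smoothing FWHM chosen as a multiple of the voxel size."""
    return factor * voxel_size_mm
```

With 0.3 mm voxels this gives the study's FWHM of 1.2 mm, i.e. a sigma of about 0.51 mm.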

  9. [Automatic adjustment control system for DC glow discharge plasma source].

    PubMed

    Wan, Zhen-zhen; Wang, Yong-qing; Li, Xiao-jia; Wang, Hai-zhou; Shi, Ning

    2011-03-01

    Three parameters are important in the DC glow discharge process: the discharge current, the discharge voltage, and the argon pressure in the discharge source. These parameters influence each other during the glow discharge process. This paper presents an automatic control system for a DC glow discharge plasma source. The system collects and controls the discharge voltage automatically by adjusting the source pressure while the discharge current is held constant. The design concept, circuit principle, and control program of this automatic control system are described. Accuracy is improved by reducing complex manual operations and the errors they introduce. The system enhances the control accuracy of the glow discharge voltage and reduces the time needed to reach a stable discharge voltage. Stability test results with the automatic control system are provided as well: the accuracy is better than 1% FS, improved from 4% FS under manual control, and the time to reach a stable discharge voltage has been shortened to within 30 s, from more than 90 s under manual control. Standard samples such as middle-low alloy steel and tin bronze have been tested with this automatic control system, and the precision of concentration analysis has been significantly improved: the RSDs of all test results are better than 3.5%. In the middle-low alloy steel standard sample, the RSD range of the concentration results for Ti, Co, and Mn is reduced from 3.0%-4.3% under manual control to 1.7%-2.4% under automatic control, and that for S and Mo from 5.2%-5.9% to 3.3%-3.5%. In the tin bronze standard sample, the RSD range for Sn, Zn, and Al is reduced from 2.6%-4.4% to 1.0%-2.4%, and that for Si, Ni, and Fe from 6.6%-13.9% to 2.6%-3.5%. The test data are also presented in the paper.
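
    The regulation idea (adjust source pressure to hold the voltage at a set-point while the current is constant) can be sketched as a simple proportional control loop. The linear plant model and the gain below are illustrative assumptions, not the paper's actual source characteristics:

```python
# Toy sketch of the control idea: hold the discharge voltage at a
# set-point by adjusting source pressure at constant current. The
# plant model (voltage falling linearly with pressure) and the gain
# are illustrative assumptions.

def plant_voltage(pressure):
    """Assumed plant: voltage drops as pressure rises (arbitrary units)."""
    return 1200.0 - 0.8 * pressure

def regulate(target_v, pressure, gain=0.5, tol=1.0, max_steps=200):
    """Proportionally adjust pressure until |V - target| < tol."""
    for _ in range(max_steps):
        v = plant_voltage(pressure)
        error = v - target_v
        if abs(error) < tol:
            break
        pressure += gain * error   # voltage too high -> raise pressure
    return pressure, plant_voltage(pressure)

p, v = regulate(target_v=700.0, pressure=400.0)
print(round(v, 2))  # within 1 V of the 700 V set-point
```

    With this sign convention the loop converges geometrically (error shrinks by the factor 1 - 0.8 * gain per step); a real controller would also need rate limits and safeguards around discharge instabilities.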

  10. Automatic switching matrix

    DOEpatents

    Schlecht, Martin F.; Kassakian, John G.; Caloggero, Anthony J.; Rhodes, Bruce; Otten, David; Rasmussen, Neil

    1982-01-01

    An automatic switching matrix that includes an apertured matrix board containing a matrix of wires that can be interconnected at each aperture. Each aperture has associated therewith a conductive pin which, when fully inserted into the associated aperture, effects electrical connection between the wires within that particular aperture. Means is provided for automatically inserting the pins in a determined pattern and for removing all the pins to permit other interconnecting patterns.

  11. Analysis of steranes and triterpanes in geolipid extracts by automatic classification of mass spectra

    NASA Technical Reports Server (NTRS)

    Wardroper, A. M. K.; Brooks, P. W.; Humberston, M. J.; Maxwell, J. R.

    1977-01-01

    A computer method is described for the automatic classification of triterpanes and steranes into gross structural type from their mass spectral characteristics. The method has been applied to the spectra obtained by gas-chromatographic/mass-spectroscopic analysis of two mixtures of standards and of hydrocarbon fractions isolated from Green River and Messel oil shales. Almost all of the steranes and triterpanes identified previously in both shales were classified, in addition to a number of new components. The results indicate that classification of such alkanes is possible with a laboratory computer system. The method has application to diagenesis and maturation studies as well as to oil/oil and oil/source rock correlations in which rapid screening of large numbers of samples is required.
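
    A minimal sketch of rule-based spectral classification in the spirit of the paper: m/z 217 and m/z 191 are the standard diagnostic fragment ions for steranes and triterpanes in GC-MS work, while the intensity threshold and the example spectra below are illustrative assumptions:

```python
# Minimal sketch of classifying alkane mass spectra into gross
# structural type from characteristic fragments. The diagnostic ions
# (m/z 217 steranes, m/z 191 triterpanes) are standard GC-MS markers;
# the threshold and example spectra are illustrative.

def classify(spectrum, threshold=0.3):
    """spectrum: dict of m/z -> relative intensity."""
    base = max(spectrum.values())
    if spectrum.get(217, 0.0) / base >= threshold:
        return "sterane"
    if spectrum.get(191, 0.0) / base >= threshold:
        return "triterpane"
    return "unclassified"

cholestane = {217: 1.0, 149: 0.6, 372: 0.2}   # sterane-like spectrum
hopane = {191: 1.0, 148: 0.4, 412: 0.15}      # triterpane-like spectrum
print(classify(cholestane), classify(hopane))  # sterane triterpane
```

    The paper's classifier works from broader mass-spectral characteristics rather than two single ions, but the screening principle, mapping spectra to gross structural type automatically, is the same.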

  12. Comparison Of Semi-Automatic And Automatic Slick Detection Algorithms For Jiyeh Power Station Oil Spill, Lebanon

    NASA Astrophysics Data System (ADS)

    Osmanoglu, B.; Ozkan, C.; Sunar, F.

    2013-10-01

    After air strikes on July 14 and 15, 2006, the Jiyeh Power Station started leaking oil into the eastern Mediterranean Sea. The power station is located about 30 km south of Beirut, and the slick covered about 170 km of coastline, threatening the neighboring countries Turkey and Cyprus. Due to the ongoing conflict between Israel and Lebanon, cleaning efforts could not start immediately, resulting in 12 000 to 15 000 tons of fuel oil leaking into the sea. In this paper we compare results from automatic and semi-automatic slick detection algorithms. The automatic detection method combines the probabilities calculated for each pixel from each image to obtain a joint probability, minimizing the adverse effects of the atmosphere on oil spill detection. The method can readily utilize X-, C-, and L-band data where available. Furthermore, wind and wave speed observations can be used for a more accurate analysis. For this study, we utilize Envisat ASAR ScanSAR data. A probability map is generated based on the radar backscatter, the effect of wind, and the dampening value. The semi-automatic algorithm is based on supervised classification. As a classifier, an Artificial Neural Network Multilayer Perceptron (ANN MLP) is used, since it is more flexible and efficient than the conventional maximum likelihood classifier for multisource and multi-temporal data. The learning algorithm for the ANN MLP is Levenberg-Marquardt (LM). Training and test data for supervised classification are composed from the textural information extracted from the SAR images. This approach is semi-automatic because tuning the classifier parameters and composing the training data require human interaction. We point out the similarities and differences between the two methods and their results, and underline their advantages and disadvantages. Due to the lack of ground truth data, we compare the obtained results to each other, as well as to other published oil slick area assessments.
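
    The per-pixel probability fusion can be illustrated with a naive independent-evidence combination; this is an assumed formulation for illustration, not necessarily the authors' exact model:

```python
# Sketch of fusing per-image slick probabilities for one pixel into a
# joint probability, damping the effect of a single noisy acquisition.
# A naive independence assumption is made for illustration.

def joint_probability(probs):
    """Combine independent per-image slick probabilities for a pixel."""
    p_slick = 1.0
    p_clean = 1.0
    for p in probs:
        p_slick *= p
        p_clean *= (1.0 - p)
    return p_slick / (p_slick + p_clean)

# Three acquisitions of the same pixel: two confident, one noisy.
print(round(joint_probability([0.8, 0.7, 0.4]), 3))  # 0.862
```

    Two confident detections outweigh one ambiguous acquisition, which is the intuition behind combining multiple images to suppress atmospheric artifacts.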

  13. Methods of automatic nucleotide-sequence analysis. Multicomponent spectrophotometric analysis of mixtures of nucleic acid components by a least-squares procedure

    PubMed Central

    Lee, Sheila; McMullen, D.; Brown, G. L.; Stokes, A. R.

    1965-01-01

    1. A theoretical analysis of the errors in multicomponent spectrophotometric analysis of nucleoside mixtures, by a least-squares procedure, has been made to obtain an expression for the error coefficient, relating the error in calculated concentration to the error in extinction measurements. 2. The error coefficients, which depend only on the `library' of spectra used to fit the experimental curves, have been computed for a number of `libraries' containing the following nucleosides found in s-RNA: adenosine, guanosine, cytidine, uridine, 5-ribosyluracil, 7-methylguanosine, 6-dimethylaminopurine riboside, 6-methylaminopurine riboside and thymine riboside. 3. The error coefficients have been used to determine the best conditions for maximum accuracy in the determination of the compositions of nucleoside mixtures. 4. Experimental determinations of the compositions of nucleoside mixtures have been made and the errors found to be consistent with those predicted by the theoretical analysis. 5. It has been demonstrated that, with certain precautions, the multicomponent spectrophotometric method described is suitable as a basis for automatic nucleotide-composition analysis of oligonucleotides containing nine nucleotides. Used in conjunction with continuous chromatography and flow chemical techniques, this method can be applied to the study of the sequence of s-RNA. PMID:14346087
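
    The least-squares procedure can be illustrated with a small worked example: extinctions measured at several wavelengths are fitted as a linear combination of library spectra via the normal equations, giving component concentrations. The library values below are made up for illustration:

```python
# Worked sketch of the least-squares idea: fit measured extinctions
# as a linear combination of two library spectra. Library values are
# illustrative, not real nucleoside extinction coefficients.

def lstsq_2comp(A, y):
    """Solve min ||A c - y|| for 2 components via the normal equations."""
    # (A^T A) c = A^T y, solved with an explicit 2x2 inverse.
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * yi for r, yi in zip(A, y))
    b2 = sum(r[1] * yi for r, yi in zip(A, y))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Library: extinction of two components at three wavelengths.
A = [[15.0, 2.0],   # e.g. 260 nm
     [7.0, 9.0],    # e.g. 271 nm
     [1.0, 8.0]]    # e.g. 280 nm
true_c = (0.4, 0.7)
y = [r[0] * true_c[0] + r[1] * true_c[1] for r in A]  # noiseless mixture
print(lstsq_2comp(A, y))  # recovers ~(0.4, 0.7)
```

    The error coefficients analyzed in the paper describe how measurement noise in y propagates into the recovered concentrations; they depend only on the library matrix A, which is why they can be computed in advance to choose the best wavelengths.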

  14. CIRF Publications, Vol. 12, No. 5.

    ERIC Educational Resources Information Center

    International Labour Office, Geneva (Switzerland).

    CIRF Publications, Vol. 12, No. 5 is a collection of 80 abstracts giving particular attention to education, training, and economic growth in developing countries, Iran, Japan, Kenya, the Solomon Islands, and Sri Lanka; vocational rehabilitation in Italy, Spain, the United Kingdom, and the U. S. A.; agriculture in Chad, developing countries, and…

  15. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    PubMed

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step for presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as mandated by FDAAA 801. The common workflow for this task is to export the clinical data from the electronic data capture system used and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement, and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After the file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application while the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can serve as a starting point for their examination and reporting.
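
    The per-item descriptive-statistics step can be sketched as follows; the item name and values are illustrative, and real input would come from a parsed CDISC ODM file:

```python
# Sketch of the per-item descriptive-statistics step performed after
# validation. Item names and values are illustrative.

from statistics import mean, median, stdev

def describe(values):
    """Basic statistics for one numeric study item."""
    return {
        "n": len(values),
        "mean": mean(values),
        "median": median(values),
        "sd": stdev(values),
        "min": min(values),
        "max": max(values),
    }

ages = [54, 61, 47, 58, 66, 52]   # hypothetical baseline item "age"
print(describe(ages)["median"])   # 56.0
```

    Running this over every item in the exported study file, plus a chart per item, is essentially the overview the tool produces without manual per-item scripting.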

  16. [The effects of interpretation bias for social events and automatic thoughts on social anxiety].

    PubMed

    Aizawa, Naoki

    2015-08-01

    Many studies have demonstrated that individuals with social anxiety interpret ambiguous social situations negatively. It is, however, not clear whether this interpretation bias contributes to social anxiety over and above depressive automatic thoughts. The present study investigated the effects of negative interpretation bias and automatic thoughts on social anxiety. The Social Intent Interpretation-Questionnaire, which measures the tendency to interpret ambiguous social events as implying others' rejective intents, the short Japanese version of the Automatic Thoughts Questionnaire-Revised, and the Anthropophobic Tendency Scale were administered to 317 university students. Covariance structure analysis indicated that both rejective-intent interpretation bias and negative automatic thoughts contributed to mental distress in social situations, mediated by a sense of powerlessness and excessive concern about self and others in social situations. Positive automatic thoughts reduced mental distress. These results indicate the importance of interpretation bias and negative automatic thoughts in the development and maintenance of social anxiety. Implications for understanding the cognitive features of social anxiety are discussed.

  17. Automatic T1 bladder tumor detection by using wavelet analysis in cystoscopy images

    NASA Astrophysics Data System (ADS)

    Freitas, Nuno R.; Vieira, Pedro M.; Lima, Estevão; Lima, Carlos S.

    2018-02-01

    Correct classification of cystoscopy images depends on the interpreter's experience. Bladder cancer is a common lesion that can only be confirmed by biopsying the tissue; therefore, automatic identification of tumors plays a significant role in early-stage diagnosis and its accuracy. To the best of our knowledge, the use of white light cystoscopy images for bladder tumor diagnosis has not been reported so far. In this paper, a texture analysis based approach is proposed for bladder tumor diagnosis, on the premise that tumors change tissue texture. As is well accepted by the scientific community, texture information is concentrated in the medium to high frequency range, which can be selected by using a discrete wavelet transform (DWT). Tumor enhancement can be improved by automatic segmentation, since mixing with normal tissue is avoided under ideal conditions. The segmentation module proposed in this paper takes advantage of the wavelet decomposition tree to discard poor texture information, in such a way that both steps of the proposed algorithm, segmentation and classification, share the same focus on texture. A multilayer perceptron and a support vector machine with a stratified ten-fold cross-validation procedure were used for classification in the hue-saturation-value (HSV), red-green-blue, and CIELab color spaces. Performances of 91% in sensitivity and 92.9% in specificity were obtained in the HSV color space using both preprocessing and classification steps based on the DWT. The proposed method achieves good performance in identifying bladder tumor frames. These promising results open the path towards a deeper study of the applicability of this algorithm in computer-aided diagnosis.
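
    The texture measure underlying the approach, detail-band energy after one DWT level, can be sketched with a hand-rolled 2-D Haar transform; the patches below are illustrative:

```python
# Minimal sketch of the texture-feature idea: one Haar DWT level
# splits a patch into a smooth approximation (LL) and detail bands,
# and the detail-band energy serves as a medium/high-frequency
# texture measure. Patch values are illustrative.

def haar2d(patch):
    """One 2-D Haar level; returns (LL sub-band, detail-band energy)."""
    n = len(patch)
    # Row transform: averages in the left half, differences in the right.
    rows = []
    for r in patch:
        avg = [(r[i] + r[i + 1]) / 2 for i in range(0, n, 2)]
        dif = [(r[i] - r[i + 1]) / 2 for i in range(0, n, 2)]
        rows.append(avg + dif)
    # Column transform on the row-transformed patch.
    out = [[0.0] * n for _ in range(n)]
    for c in range(n):
        col = [rows[r][c] for r in range(n)]
        avg = [(col[i] + col[i + 1]) / 2 for i in range(0, n, 2)]
        dif = [(col[i] - col[i + 1]) / 2 for i in range(0, n, 2)]
        merged = avg + dif
        for r in range(n):
            out[r][c] = merged[r]
    h = n // 2
    ll = [row[:h] for row in out[:h]]
    detail = sum(out[r][c] ** 2 for r in range(n) for c in range(n)) \
        - sum(v ** 2 for row in ll for v in row)
    return ll, detail

flat = [[10.0] * 4 for _ in range(4)]              # textureless patch
checker = [[10.0, 0.0] * 2, [0.0, 10.0] * 2] * 2   # strongly textured
print(haar2d(flat)[1], haar2d(checker)[1])  # 0.0 100.0
```

    The paper's pipeline goes further, using the decomposition tree for segmentation and feeding wavelet-domain features to the classifiers, but this contrast between flat and textured patches is the core signal it exploits.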

  18. Automatic telangiectasia analysis in dermoscopy images using adaptive critic design.

    PubMed

    Cheng, B; Stanley, R J; Stoecker, W V; Hinton, K

    2012-11-01

    Telangiectasia, tiny skin vessels, are important dermoscopy structures used to discriminate basal cell carcinoma (BCC) from benign skin lesions. This research builds on previously developed image analysis techniques that identify vessels automatically in order to discriminate benign lesions from BCCs. A biologically inspired reinforcement learning approach is investigated in an adaptive critic design framework, applying action-dependent heuristic dynamic programming (ADHDP) for discrimination based on computed features, with different skin lesion contrast variations used to promote the discrimination process. Lesion discrimination results for ADHDP are compared with multilayer perceptron backpropagation artificial neural networks. This study uses a data set of 498 dermoscopy skin lesion images, 263 BCCs and 226 competitive benign images, as the input sets. This data set extends previous research [Cheng et al., Skin Research and Technology, 2011, 17: 278]. Experimental results yielded a diagnostic accuracy as high as 84.6% using the ADHDP approach, an 8.03% improvement over a standard multilayer perceptron method. We have chosen BCC detection rather than vessel detection as the endpoint. Although vessel detection is inherently easier, BCC detection has potential direct clinical applications. Small BCCs are detectable early by dermoscopy and potentially detectable by the automated methods described in this research. © 2011 John Wiley & Sons A/S.

  19. Industrial application of low voltage bidirectional automatic release of reserve

    NASA Astrophysics Data System (ADS)

    Popa, G. N.; Diniş, C. M.; Iagăr, A.; Deaconu, S. I.; Popa, I.

    2018-01-01

    The paper presents an analysis of a low voltage industrial electrical installation controlled by a bidirectional automatic release of reserve. The installation removes smoke in case of fire at a textile company. Its main parts are: a general electrical panel; a reserve electrical panel; three-phase induction motors driving the fans; electrical actuators for the inlet and outlet valves; a clean air inlet pipe; and the outlet pipe for smoke. The operation and checking of the bidirectional automatic release of reserve are presented in the paper.

  20. Automatic assembly of micro-optical components

    NASA Astrophysics Data System (ADS)

    Gengenbach, Ulrich K.

    1996-12-01

    Automatic assembly becomes an important issue as hybrid micro systems enter industrial fabrication. Moving from a laboratory scale production with manual assembly and bonding processes to automatic assembly requires a thorough re- evaluation of the design, the characteristics of the individual components and of the processes involved. Parts supply for automatic operation, sensitive and intelligent grippers adapted to size, surface and material properties of the microcomponents gain importance when the superior sensory and handling skills of a human are to be replaced by a machine. This holds in particular for the automatic assembly of micro-optical components. The paper outlines these issues exemplified at the automatic assembly of a micro-optical duplexer consisting of a micro-optical bench fabricated by the LIGA technique, two spherical lenses, a wavelength filter and an optical fiber. Spherical lenses, wavelength filter and optical fiber are supplied by third party vendors, which raises the question of parts supply for automatic assembly. The bonding processes for these components include press fit and adhesive bonding. The prototype assembly system with all relevant components e.g. handling system, parts supply, grippers and control is described. Results of first automatic assembly tests are presented.

  1. 30 CFR 27.23 - Automatic warning device.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Automatic warning device. 27.23 Section 27.23... Automatic warning device. (a) An automatic warning device shall be suitably constructed for incorporation in... automatic warning device shall include an alarm signal (audible or colored light), which shall be made to...

  2. 30 CFR 27.23 - Automatic warning device.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Automatic warning device. 27.23 Section 27.23... Automatic warning device. (a) An automatic warning device shall be suitably constructed for incorporation in... automatic warning device shall include an alarm signal (audible or colored light), which shall be made to...

  3. 30 CFR 27.23 - Automatic warning device.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Automatic warning device. 27.23 Section 27.23... Automatic warning device. (a) An automatic warning device shall be suitably constructed for incorporation in... automatic warning device shall include an alarm signal (audible or colored light), which shall be made to...

  4. 30 CFR 27.23 - Automatic warning device.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Automatic warning device. 27.23 Section 27.23... Automatic warning device. (a) An automatic warning device shall be suitably constructed for incorporation in... automatic warning device shall include an alarm signal (audible or colored light), which shall be made to...

  5. NASA automatic system for computer program documentation, volume 2

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.

    1972-01-01

    The DYNASOR 2 program is used for the dynamic nonlinear analysis of shells of revolution. The equations of motion of the shell are solved using Houbolt's numerical procedure. The displacements and stress resultants are determined for both symmetrical and asymmetrical loading conditions. Asymmetrical dynamic buckling can be investigated. Solutions can be obtained for highly nonlinear problems utilizing as many as five of the harmonics generated by the SAMMSOR program. A restart capability allows the user to restart the program at a specified time. For Vol. 1, see N73-22129.

  6. A Network of Automatic Control Web-Based Laboratories

    ERIC Educational Resources Information Center

    Vargas, Hector; Sanchez Moreno, J.; Jara, Carlos A.; Candelas, F. A.; Torres, Fernando; Dormido, Sebastian

    2011-01-01

    This article presents an innovative project in the context of remote experimentation applied to control engineering education. Specifically, the authors describe their experience regarding the analysis, design, development, and exploitation of web-based technologies within the scope of automatic control. This work is part of an inter-university…

  7. Neural Bases of Automaticity

    ERIC Educational Resources Information Center

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  8. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    PubMed

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
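
    One index family the tool computes, mean (log) word frequency against a reference list, can be sketched as below; the tiny frequency table is made up for illustration, whereas TAALES draws on large reference corpora:

```python
# Illustrative sketch of a word-frequency index: the mean log10
# reference frequency of the words in a text. The frequency table is
# made up; real indices use large corpus counts.

from math import log10

REF_FREQ = {"the": 50000, "cat": 800, "sat": 300, "obfuscate": 2}

def mean_log_frequency(text, default=1):
    """Mean log10 frequency; unknown words get a floor frequency."""
    words = text.lower().split()
    return sum(log10(REF_FREQ.get(w, default)) for w in words) / len(words)

# A text with rarer words scores lower, i.e. is lexically more
# sophisticated under this index.
print(mean_log_frequency("the cat sat") > mean_log_frequency("obfuscate the cat"))
```

    The n-gram, range, and association-strength indices the release adds follow the same pattern: look up each unit in precomputed norms and aggregate over the text.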

  9. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    NASA Astrophysics Data System (ADS)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced that have proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual and hence time consuming and prone to human error. In this research we propose an automatic image analysis based approach to measure the size of an ulcer, and its subsequent further investigation, to determine the effectiveness of any treatment process followed. In ophthalmology an ulcer area is detected for further inspection via luminous excitation of a dye. In the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye), the ulcer area is excited to be luminous green in colour compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially a pre-processing stage that carries out a local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly we remove potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise, and image-quality degradation. The ratio between the ulcer area confined within the corneal area and the corneal area is used as a measure of comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure adopted for corneal ulcers by comparing the variation of ulcer size over time.
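
    The final measurement, the ratio of the ulcer area to the corneal area, can be sketched on binary masks; the masks here are tiny illustrative arrays rather than real segmentation output:

```python
# Sketch of the final measurement step: the ulcer-to-cornea area
# ratio, computed from binary masks. The masks are tiny illustrative
# arrays; in the paper they come from the segmentation stages.

def ulcer_ratio(cornea_mask, ulcer_mask):
    """Fraction of the corneal area covered by ulcer (0/1 masks)."""
    cornea = sum(sum(row) for row in cornea_mask)
    ulcer = sum(c and u for cr, ur in zip(cornea_mask, ulcer_mask)
                for c, u in zip(cr, ur))
    return ulcer / cornea

cornea = [[1, 1, 1, 1],
          [1, 1, 1, 1]]
ulcer = [[0, 1, 1, 0],
         [0, 0, 0, 0]]
print(ulcer_ratio(cornea, ulcer))  # 0.25
```

    Tracking this ratio across visits is what makes the measure comparable over time despite changes in framing or magnification.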

  10. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is a simple and the most commonly used microscopy method for observing unstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single-cell motility in large cell populations. The challenge, however, is to find a method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and computationally light enough for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. Compared to commonly used segmentation approaches, MSER required negligible preoptimization, dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multi-object tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis, resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of the collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and non-optimal imaging conditions, and their relatively light computational requirements should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells.
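
    A heavily simplified sketch of the tracking side: a scalar Kalman filter smoothing noisy per-frame positions. The paper tracks 2-D positions of many cells with data association; this reduces the problem to one coordinate of one cell to show the predict/update cycle, with assumed noise parameters:

```python
# Highly simplified tracking sketch: a scalar (random-walk) Kalman
# filter smoothing noisy per-frame positions of one cell along one
# axis. Noise parameters q and r are assumed values.

def kalman_track(measurements, q=0.1, r=4.0):
    """q: process noise variance, r: measurement noise variance."""
    x, p = measurements[0], 1.0   # initial state estimate and variance
    track = [x]
    for z in measurements[1:]:
        p += q                    # predict: uncertainty grows each frame
        k = p / (p + r)           # update: Kalman gain
        x += k * (z - x)          # blend prediction with measurement
        p *= (1.0 - k)
        track.append(x)
    return track

noisy = [10.0, 12.5, 11.0, 14.8, 13.9, 16.2]   # jittery detections
smooth = kalman_track(noisy)
print(round(smooth[-1], 2))
```

    In the full multi-object setting, the predicted position of each track is also what drives data association, i.e. deciding which MSER detection in the next frame belongs to which cell.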

  11. 46 CFR 52.01-10 - Automatic controls.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  12. 46 CFR 52.01-10 - Automatic controls.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  13. 46 CFR 52.01-10 - Automatic controls.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  14. 46 CFR 52.01-10 - Automatic controls.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  15. 46 CFR 52.01-10 - Automatic controls.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  16. Automatic Feature Extraction from Planetary Images

    NASA Technical Reports Server (NTRS)

    Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.

    2010-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images has already been acquired, and many more will be available for analysis in the coming years. Because of the huge amount of data, the images need to be analyzed, preferably by automatic processing techniques. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data, which often present low contrast and uneven illumination. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including watershed segmentation and the generalized Hough transform. The method has many applications, among them image registration, and can be applied to arbitrary planetary images.

  17. Evaluation of needle trap micro-extraction and automatic alveolar sampling for point-of-care breath analysis.

    PubMed

    Trefz, Phillip; Rösner, Lisa; Hein, Dietmar; Schubert, Jochen K; Miekisch, Wolfram

    2013-04-01

    Needle trap devices (NTDs) have shown many advantages, such as improved detection limits, reduced sampling time and volume, and improved stability and reproducibility, compared with other techniques used in breath analysis such as solid-phase extraction and solid-phase micro-extraction. Effects of sampling flow (2-30 ml/min) and volume (10-100 ml) were investigated in dry gas standards containing hydrocarbons, aldehydes, and aromatic compounds, and in humid breath samples. NTDs contained (single-bed) polymer packing and (triple-bed) combinations of divinylbenzene/Carbopack X/Carboxen 1000. Substances were desorbed from the NTDs by means of thermal expansion and analyzed by gas chromatography-mass spectrometry. An automated CO2-controlled sampling device for direct alveolar sampling at the point of care was developed and tested in pilot experiments. Adsorption efficiency for small volatile organic compounds decreased and breakthrough increased when sampling was done with polymer needles from a water-saturated matrix (breath) instead of from a dry gas. Humidity did not affect analysis with triple-bed NTDs. These NTDs showed only a small dependence on sampling flow and low breakthrough of 1-5%. The new sampling device was able to control crucial parameters such as sampling flow and volume. With triple-bed NTDs, substance amounts increased linearly with increasing sample volume when alveolar breath was pre-concentrated automatically. Compared with manual sampling, automatic sampling showed comparable or better results. Thorough control of sampling and an adequate choice of adsorption material are mandatory for application of needle trap micro-extraction in vivo. The new CO2-controlled sampling device allows direct alveolar sampling at the point of care without the need for additional sampling, storage, or pre-concentration steps.

  18. Research into automatic recognition of joints in human symmetrical movements

    NASA Astrophysics Data System (ADS)

    Fan, Yifang; Li, Zhiyu

    2008-03-01

    High-speed photography is a major means of collecting data on human body movement. Automatic identification of joints in such footage has great significance for the research, treatment, and rehabilitation of injuries, for the analysis and diagnosis of sport techniques, and for ergonomics. Based on the observation that the distance between adjacent joints remains constant while the body segments move in a plane, and on the laws of human joint movement (such as articular anatomy and kinematic constraints), a new approach is introduced to threshold the images of joints filmed by a high-speed camera and to automatically identify and trace the joint points (labeled with markers at the joints). Automatic identification is achieved through thresholding based on the closure of the marker points. Given the filming frequency and the laws of human segment movement, once the marker points have been initialized they can be tracked automatically through the successive frames. Kinematic analysis of the test results, together with data from a three-dimensional force platform and the known movement characteristics of the body segments, indicates that the approach is valid.

  19. Progressive sampling-based Bayesian optimization for efficient and automatic machine learning model selection.

    PubMed

    Zeng, Xueqiang; Luo, Gang

    2017-12-01

    Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the barrier to using machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
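
    The core progressive-sampling idea can be sketched in a few lines: evaluate every candidate configuration on a small sample, then repeatedly enlarge the sample while discarding the weaker half, so that only promising candidates ever touch the full data set. The search space, error model, and halving schedule below are illustrative assumptions, not the authors' implementation:

```python
import random

random.seed(0)

# Hypothetical search space: each configuration is a regularization value.
CONFIGS = [0.001, 0.01, 0.1, 1.0, 10.0]

def error_rate(config, n_samples):
    """Toy stand-in for training/validating a model on n_samples rows.
    Error shrinks as the sample grows; config 0.1 is best by construction."""
    base = abs(config - 0.1) * 0.5 + 0.1          # illustrative error landscape
    noise = random.gauss(0, 1.0 / n_samples ** 0.5)
    return base + 10.0 / n_samples + noise * 0.01

def progressive_selection(configs, start=100, full=10000):
    """Score all configs on a small sample, then repeatedly double the
    sample size and keep only the better half of the candidates."""
    n, survivors, cost = start, list(configs), 0
    while len(survivors) > 1 and n < full:
        scored = sorted(survivors, key=lambda c: error_rate(c, n))
        cost += n * len(survivors)                # rows processed this round
        survivors = scored[:max(1, len(scored) // 2)]
        n *= 2
    return survivors[0], cost

best, cost = progressive_selection(CONFIGS)
print(best, cost)
```

    With five candidates and a 10,000-row data set, this sketch performs 900 rows' worth of training work instead of the 50,000 an exhaustive full-data evaluation would need, which is the efficiency argument the abstract makes.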

  20. Automatic Between-Pulse Analysis of DIII-D Experimental Data Performed Remotely on a Supercomputer at Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostuk, M.; Uram, T. D.; Evans, T.

    For the first time, an automatically triggered, between-pulse fusion science analysis code was run on-demand at a remotely located supercomputer at Argonne Leadership Computing Facility (ALCF, Lemont, IL) in support of in-process experiments being performed at DIII-D (San Diego, CA). This represents a new paradigm for combining geographically distant experimental and high performance computing (HPC) facilities to provide enhanced data analysis that is quickly available to researchers. Enhanced analysis improves the understanding of the current pulse, translating into a more efficient use of experimental resources and higher-quality science. The analysis code used here, called SURFMN, calculates the magnetic structure of the plasma using a Fourier transform. Increasing the number of Fourier components provides a more accurate determination of the stochastic boundary layer near the plasma edge by better resolving magnetic islands, but requires 26 minutes to complete using local DIII-D resources, putting it well outside the useful time range for between-pulse analysis. These islands relate to confinement and edge localized mode (ELM) suppression, and may be controlled by adjusting coil currents for the next pulse. Argonne has ensured on-demand execution of SURFMN by providing a reserved queue, a specialized service that launches the code after receiving an automatic trigger, and network access from the worker nodes for data transfer. Runs are executed on 252 cores of ALCF’s Cooley cluster and the data are available locally at DIII-D within three minutes of triggering. The original SURFMN design limits additional improvements with more cores; however, our work shows a path forward where codes that benefit from thousands of processors can run between pulses.
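
    The Fourier decomposition at the heart of such an analysis can be illustrated without any of SURFMN's physics: given a magnetic perturbation sampled around a poloidal circuit, a discrete Fourier transform recovers the amplitude of each mode number. The synthetic m = 3 signal below is purely illustrative, not DIII-D data:

```python
import math

N = 64                                   # poloidal sample points
theta = [2 * math.pi * k / N for k in range(N)]
# Synthetic perturbation: a single m = 3 island chain of amplitude 0.5.
b = [0.5 * math.cos(3 * t) for t in theta]

def mode_amplitude(signal, m):
    """Amplitude of the m-th Fourier harmonic of a real periodic signal."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * m * k / n) for k, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * m * k / n) for k, s in enumerate(signal))
    return 2 * math.sqrt(re * re + im * im) / n

amps = [mode_amplitude(b, m) for m in range(6)]
dominant = max(range(6), key=lambda m: amps[m])
print(dominant, round(amps[3], 3))
```

    Resolving higher mode numbers simply means computing more of these harmonic sums on a finer grid, which is why the cost grows with the number of Fourier components.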

  1. Automatic Between-Pulse Analysis of DIII-D Experimental Data Performed Remotely on a Supercomputer at Argonne Leadership Computing Facility

    DOE PAGES

    Kostuk, M.; Uram, T. D.; Evans, T.; ...

    2018-02-01

    For the first time, an automatically triggered, between-pulse fusion science analysis code was run on-demand at a remotely located supercomputer at Argonne Leadership Computing Facility (ALCF, Lemont, IL) in support of in-process experiments being performed at DIII-D (San Diego, CA). This represents a new paradigm for combining geographically distant experimental and high performance computing (HPC) facilities to provide enhanced data analysis that is quickly available to researchers. Enhanced analysis improves the understanding of the current pulse, translating into a more efficient use of experimental resources and higher-quality science. The analysis code used here, called SURFMN, calculates the magnetic structure of the plasma using a Fourier transform. Increasing the number of Fourier components provides a more accurate determination of the stochastic boundary layer near the plasma edge by better resolving magnetic islands, but requires 26 minutes to complete using local DIII-D resources, putting it well outside the useful time range for between-pulse analysis. These islands relate to confinement and edge localized mode (ELM) suppression, and may be controlled by adjusting coil currents for the next pulse. Argonne has ensured on-demand execution of SURFMN by providing a reserved queue, a specialized service that launches the code after receiving an automatic trigger, and network access from the worker nodes for data transfer. Runs are executed on 252 cores of ALCF’s Cooley cluster and the data are available locally at DIII-D within three minutes of triggering. The original SURFMN design limits additional improvements with more cores; however, our work shows a path forward where codes that benefit from thousands of processors can run between pulses.

  2. Vibration analysis on automatic take-up device of belt conveyor

    NASA Astrophysics Data System (ADS)

    Qin, Tailong; Wei, Jin

    2008-10-01

    After reviewing the application conditions of belt conveyors in the modern mining industry, the paper shows that the dynamic phases of starting, braking, and loading produce travelling tension variations and elastic waves. It then analyzes the factors that cause the conveyor's automatic take-up device to vibrate: the structure of the take-up device itself and the elastic waves. Finally, the paper proposes measures to reduce the vibration and presents modeling and simulation of the tension buffer device.

  3. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification.

    DOT National Transportation Integrated Search

    1978-12-01

    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  4. High-Speed Automatic Microscopy for Real Time Tracks Reconstruction in Nuclear Emulsion

    NASA Astrophysics Data System (ADS)

    D'Ambrosio, N.

    2006-06-01

    The Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment will use a massive nuclear emulsion detector to search for νμ → ντ oscillation by identifying τ leptons through the direct detection of their decay topology. The feasibility of experiments using a large mass emulsion detector is linked to the impressive progress under way in the development of automatic emulsion analysis. A new generation of scanning systems requires the development of fast automatic microscopes for emulsion scanning and image analysis to reconstruct tracks of elementary particles. The paper presents the European Scanning System (ESS) developed in the framework of the OPERA collaboration.

  5. ADMAP (automatic data manipulation program)

    NASA Technical Reports Server (NTRS)

    Mann, F. I.

    1971-01-01

    Instructions are presented on the use of ADMAP (Automatic Data Manipulation Program), an aerospace data manipulation computer program. The program was developed to aid in processing, reducing, plotting, and publishing electric propulsion trajectory data generated by the low thrust optimization program, HILTOP. The program has the option of generating SC4020 electric plots, and therefore requires the SC4020 routines to be available at execution time (even if not used). Several general routines are present, including a cubic spline interpolation routine, an electric plotter dash line drawing routine, and single-parameter and double-parameter sorting routines. Many routines are tailored for the manipulation and plotting of electric propulsion data, including an automatic scale selection routine, an automatic curve labelling routine, and an automatic graph titling routine. Data are accepted from either punched cards or magnetic tape.

  6. Automatic lumbar spine measurement in CT images

    NASA Astrophysics Data System (ADS)

    Mao, Yunxiang; Zheng, Dong; Liao, Shu; Peng, Zhigang; Yan, Ruyi; Liu, Junhua; Dong, Zhongxing; Gong, Liyan; Zhou, Xiang Sean; Zhan, Yiqiang; Fei, Jun

    2017-03-01

    Accurate lumbar spine measurement in CT images provides an essential basis for quantitative analysis of spinal diseases such as spondylolisthesis and scoliosis. In today's clinical workflow, the measurements are performed manually by radiologists and surgeons, which is time-consuming and irreproducible. Therefore, an automatic and accurate lumbar spine measurement algorithm is highly desirable. In this study, we propose a method to automatically calculate five different lumbar spine measurements in CT images. The proposed method has three main stages: First, a learning-based spine labeling method, which integrates both image appearance and spine geometry information, is used to detect the lumbar and sacrum vertebrae in CT images. Then, a multi-atlas image segmentation method is used to segment each lumbar vertebra and the sacrum based on the detection result. Finally, measurements are derived from the segmentation result of each vertebra. Our method has been evaluated on 138 spinal CT scans to automatically calculate five widely used clinical spine measurements. Experimental results show that our method can achieve success rates of more than 90% across all the measurements. Our method also significantly improves measurement efficiency compared to manual measurement. Besides benefiting the routine clinical diagnosis of spinal diseases, our method also enables large-scale data analytics for scientific and clinical research.
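
    The multi-atlas stage rests on label fusion: each registered atlas proposes a label for every voxel and the proposals are combined. A minimal sketch of the simplest fusion rule, per-voxel majority voting (the abstract does not specify the paper's exact fusion strategy), on toy 1-D data:

```python
from collections import Counter

# Hypothetical labels propagated from three registered atlases for a
# strip of voxels (0 = background, 1 = vertebra).
atlas_labels = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
]

def majority_vote(label_maps):
    """Per-voxel majority vote across atlases, the simplest label fusion."""
    fused = []
    for voxel_labels in zip(*label_maps):
        fused.append(Counter(voxel_labels).most_common(1)[0][0])
    return fused

fused = majority_vote(atlas_labels)
print(fused)
```

    Disagreements between atlases (the first and fourth voxels above) are resolved by the vote, which is what makes the fused segmentation more robust than any single atlas.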

  7. Fluid status monitoring with a wireless network to reduce cardiovascular-related hospitalizations and mortality in heart failure: rationale and design of the OptiLink HF Study (Optimization of Heart Failure Management using OptiVol Fluid Status Monitoring and CareLink)

    PubMed Central

    Brachmann, Johannes; Böhm, Michael; Rybak, Karin; Klein, Gunnar; Butter, Christian; Klemm, Hanno; Schomburg, Rolf; Siebermair, Johannes; Israel, Carsten; Sinha, Anil-Martin; Drexler, Helmut

    2011-01-01

    Aims The Optimization of Heart Failure Management using OptiVol Fluid Status Monitoring and CareLink (OptiLink HF) study is designed to investigate whether OptiVol fluid status monitoring with an automatically generated wireless CareAlert notification via the CareLink Network can reduce all-cause death and cardiovascular hospitalizations in an HF population, compared with standard clinical assessment. Methods Patients with newly implanted or replacement cardioverter-defibrillator devices with or without cardiac resynchronization therapy, who have chronic HF in New York Heart Association class II or III and a left ventricular ejection fraction ≤35% will be eligible to participate. Following device implantation, patients are randomized to either OptiVol fluid status monitoring through CareAlert notification or regular care (OptiLink ‘on' vs. ‘off'). The primary endpoint is a composite of all-cause death or cardiovascular hospitalization. It is estimated that 1000 patients will be required to demonstrate superiority of the intervention group to reduce the primary outcome by 30% with 80% power. Conclusion The OptiLink HF study is designed to investigate whether early detection of congestion reduces mortality and cardiovascular hospitalization in patients with chronic HF. The study is expected to close recruitment in September 2012 and to report first results in May 2014. ClinicalTrials.gov Identifier: NCT00769457 PMID:21555324
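
    The quoted enrollment target can be illustrated with the standard normal-approximation sample-size formula for comparing two event proportions; this is a simplification (the trial's composite endpoint is analyzed as time-to-event), and the 40% control event rate below is an assumption for illustration, since the abstract states only the 30% relative reduction and 80% power:

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sided
    comparison of two event proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha/2
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

# Assumed 40% control event rate and the trial's 30% relative reduction.
n = n_per_group(0.40, 0.28)
print(n)
```

    A per-arm figure of this order, inflated for dropout and for the event-driven survival analysis, is consistent with the roughly 1000 patients the study plans to enroll.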

  8. Automatic processing of spoken dialogue in the home hemodialysis domain.

    PubMed

    Lacson, Ronilda; Barzilay, Regina

    2005-01-01

    Spoken medical dialogue is a valuable source of information, and it forms a foundation for diagnosis, prevention and therapeutic management. However, understanding even a perfect transcript of spoken dialogue is challenging for humans because of the lack of structure and the verbosity of dialogues. This work presents a first step towards automatic analysis of spoken medical dialogue. The backbone of our approach is an abstraction of a dialogue into a sequence of semantic categories. This abstraction uncovers structure in informal, verbose conversation between a caregiver and a patient, thereby facilitating automatic processing of dialogue content. Our method induces this structure based on a range of linguistic and contextual features that are integrated in a supervised machine-learning framework. Our model has a classification accuracy of 73%, compared to 33% achieved by a majority baseline (p<0.01). This work demonstrates the feasibility of automatically processing spoken medical dialogue.

  9. Automatic Classification of Medical Text: The Influence of Publication Form

    PubMed Central

    Cole, William G.; Michael, Patricia A.; Stewart, James G.; Blois, Marsden S.

    1988-01-01

    Previous research has shown that within the domain of medical journal abstracts the statistical distribution of words is neither random nor uniform, but is highly characteristic. Many words are used mainly or solely by one medical specialty or when writing about one particular level of description. Due to this regularity of usage, automatic classification within journal abstracts has proved quite successful. The present research asks two further questions. It investigates whether this statistical regularity and automatic classification success can also be achieved in medical textbook chapters. It then goes on to see whether the statistical distribution found in textbooks is sufficiently similar to that found in abstracts to permit accurate classification of abstracts based solely on previous knowledge of textbooks. 14 textbook chapters and 45 MEDLINE abstracts were submitted to an automatic classification program that had been trained only on chapters drawn from a standard textbook series. Statistical analysis of the properties of abstracts vs. chapters revealed important differences in word use. Automatic classification performance was good for chapters, but poor for abstracts.
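
    Classification from characteristic word distributions of this kind is commonly realized as a multinomial naive Bayes classifier. A toy sketch with invented two-specialty training texts (not the paper's data or its exact classification program):

```python
import math
from collections import Counter

# Tiny illustrative corpus: specialty -> training texts (invented).
train = {
    "cardiology": ["myocardial infarction ventricular ischemia",
                   "ventricular arrhythmia infarction"],
    "neurology":  ["cortical seizure ischemia stroke",
                   "seizure cortical epilepsy"],
}

def fit(train):
    """Per-class word counts for a multinomial naive Bayes classifier."""
    counts = {c: Counter(w for doc in docs for w in doc.split())
              for c, docs in train.items()}
    vocab = {w for c in counts for w in counts[c]}
    return counts, vocab

def classify(text, counts, vocab):
    """Pick the class maximizing the add-one-smoothed log likelihood."""
    best, best_lp = None, -math.inf
    for c, cnt in counts.items():
        total = sum(cnt.values())
        lp = sum(math.log((cnt[w] + 1) / (total + len(vocab)))
                 for w in text.split())
        if lp > best_lp:
            best, best_lp = c, lp
    return best

counts, vocab = fit(train)
print(classify("ventricular infarction", counts, vocab))
```

    The paper's finding maps directly onto this sketch: a classifier whose `counts` come from textbook chapters misclassifies abstracts when the two publication forms use words differently.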

  10. Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription

    NASA Astrophysics Data System (ADS)

    Kabir, A.; Barker, J.; Giurgiu, M.

    2010-09-01

    An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox generates robust automatic transcriptions and can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models, without manual intervention. The system is based on the standard hidden Markov model (HMM) approach and was successfully tested on a large audiovisual speech corpus, the GRID corpus. One of the most powerful features of the toolbox is its flexibility in speech processing: the speech community can import automatic transcriptions generated by the HMM Toolkit (HTK) into the popular transcription software PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on GRID data, which shows that the automatic transcription deviates from manual transcription by an average of 20 ms.
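
    The HTK-to-PRAAT direction of such a toolbox amounts to rewriting HTK label files (lines of "start end label" with times in 100 ns units) as Praat TextGrid interval tiers (times in seconds). A minimal sketch of that conversion, using hypothetical label data and emitting only the fields a TextGrid requires:

```python
def htk_to_textgrid(htk_lines):
    """Convert HTK-style labels ('start end phone', times in 100 ns units)
    into a minimal one-tier Praat TextGrid (times in seconds)."""
    items = [(int(s) / 1e7, int(e) / 1e7, lab)
             for s, e, lab in (line.split() for line in htk_lines)]
    xmax = items[-1][1]
    out = ['File type = "ooTextFile"', 'Object class = "TextGrid"', "",
           "xmin = 0", f"xmax = {xmax}", "tiers? <exists>", "size = 1",
           "item []:", "    item [1]:", '        class = "IntervalTier"',
           '        name = "phones"', "        xmin = 0",
           f"        xmax = {xmax}", f"        intervals: size = {len(items)}"]
    for i, (s, e, lab) in enumerate(items, 1):
        out += [f"        intervals [{i}]:", f"            xmin = {s}",
                f"            xmax = {e}", f'            text = "{lab}"']
    return "\n".join(out)

# Hypothetical HTK labels: 0-0.30 s "sil", 0.30-0.52 s "b".
tg = htk_to_textgrid(["0 3000000 sil", "3000000 5200000 b"])
print(tg.splitlines()[4])
```

    The reverse direction parses the TextGrid intervals back into HTK label lines, which is what lets corrections made in PRAAT flow back into HTK training.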

  11. Automatic segmentation of the puborectalis muscle in 3D transperineal ultrasound.

    PubMed

    van den Noort, Frieda; Grob, Anique T M; Slump, Cornelis H; van der Vaart, Carl H; van Stralen, Marijn

    2017-10-11

    The introduction of 3D analysis of the puborectalis muscle for diagnostic purposes into daily practice is hindered by the need for appropriate training of the observers. Automatic 3D segmentation of the puborectalis muscle in 3D transperineal ultrasound may aid its adoption in clinical practice. A manual 3D segmentation protocol was developed to segment the puborectalis muscle. Data from 20 women in their first trimester of pregnancy were used to validate the reproducibility of this protocol. For automatic segmentation, active appearance models of the puborectalis muscle were developed and trained using manual segmentation data from 50 women. The performance of both manual and automatic segmentation was analyzed by measuring the overlap and distance between the segmentations. The intraclass correlation coefficients (ICCs) and their 95% confidence intervals were also determined for mean echogenicity and volume of the puborectalis muscle. The ICC values for mean echogenicity (0.968-0.991) and volume (0.626-0.910) are good to very good for both automatic and manual segmentation. The overlap and distance results for manual segmentation are as expected, showing only a few pixels (2-3) of mismatch on average and a reasonable overlap. Based on overlap and distance, 5 mismatches were detected in automatic segmentation, giving an automatic segmentation success rate of 90%. In conclusion, this study presents reliable manual and automatic 3D segmentation of the puborectalis muscle. This will facilitate future investigation of the puborectalis muscle and allows reliable measurement of clinically valuable parameters such as mean echogenicity. This article is protected by copyright. All rights reserved.
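
    Overlap between a manual and an automatic segmentation is typically quantified with the Dice coefficient; a minimal sketch on toy binary masks (the abstract does not name the paper's exact overlap measure):

```python
def dice(a, b):
    """Dice overlap between two binary masks given as flat 0/1 lists:
    twice the intersection divided by the sum of the mask sizes."""
    inter = sum(x and y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))

manual    = [0, 1, 1, 1, 1, 0, 0, 0]   # toy manual segmentation
automatic = [0, 1, 1, 1, 0, 0, 1, 0]   # toy automatic segmentation
print(round(dice(manual, automatic), 3))
```

    A Dice value of 1.0 means perfect agreement; segmentation failures like the 5 mismatches reported above show up as sharply lower Dice scores.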

  12. Research on Application of Automatic Weather Station Based on Internet of Things

    NASA Astrophysics Data System (ADS)

    Jianyun, Chen; Yunfan, Sun; Chunyan, Lin

    2017-12-01

    In this paper, the Internet of Things (IoT) is briefly introduced and its application to weather stations is studied. A data acquisition and transmission method based on the NB-IoT communication mode is proposed. With IoT technology, digital sensors, and independent power supplies as the technical basis, the components of the automatic weather station are intelligently interconnected to form an automatic weather station based on the Internet of Things. A network structure for such a station is constructed to realize the independent operation of intelligent sensors and wireless data transmission. Networked collection and dissemination of meteorological data are investigated: data are analyzed on a data platform, preliminary standards for publishing meteorological information are prepared, and a data interface is provided for terminals receiving meteorological information, serving the goals of smart-city and smart-meteorology services.

  13. Automatic imitation of pro- and antisocial gestures: Is implicit social behavior censored?

    PubMed

    Cracco, Emiel; Genschow, Oliver; Radkova, Ina; Brass, Marcel

    2018-01-01

    According to social reward theories, automatic imitation can be understood as a means to obtain positive social consequences. In line with this view, it has been shown that automatic imitation is modulated by contextual variables that constrain the positive outcomes of imitation. However, this work has largely neglected that many gestures have an inherent pro- or antisocial meaning. As a result of their meaning, antisocial gestures are considered taboo and should not be used in public. In three experiments, we show that automatic imitation of symbolic gestures is modulated by the social intent of these gestures. Experiment 1 (N=37) revealed reduced automatic imitation of antisocial compared with prosocial gestures. Experiment 2 (N=118) and Experiment 3 (N=118) used a social priming procedure to show that this effect was stronger in a prosocial context than in an antisocial context. These findings were supported in a within-study meta-analysis using both frequentist and Bayesian statistics. Together, our results indicate that automatic imitation is regulated by internalized social norms that act as a stop signal when inappropriate actions are triggered. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Development of automatic visceral fat volume calculation software for CT volume data.

    PubMed

    Nemoto, Mitsutaka; Yeernuer, Tusufuhan; Masutani, Yoshitaka; Nomura, Yukihiro; Hanaoka, Shouhei; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Ohtomo, Kuni

    2014-01-01

    To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Automatic visceral fat volume calculation results of all 24 data sets were obtained successfully and the average calculation time was 252.7 seconds/case. The correlation coefficients between the true visceral fat volume and the automatically calculated visceral fat volume were over 0.999. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and was proved to have high accuracy.
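
    The thresholding step of such a pipeline can be sketched directly: voxels whose CT value falls within a fat window are counted and scaled by the voxel volume. The -190 to -30 HU window below is a commonly used fat range assumed for illustration, not necessarily the paper's threshold:

```python
# Commonly used CT fat window in Hounsfield units (assumed for illustration).
FAT_HU_RANGE = (-190, -30)

def fat_volume_ml(slices, voxel_volume_mm3, hu_range=FAT_HU_RANGE):
    """Count voxels whose Hounsfield value falls in the fat range and
    convert the count to millilitres (1 ml = 1000 mm^3)."""
    lo, hi = hu_range
    n = sum(1 for sl in slices for row in sl for hu in row if lo <= hu <= hi)
    return n * voxel_volume_mm3 / 1000.0

# One synthetic 3x3 slice: four fat voxels (-100 HU) among soft tissue.
slices = [[[-100, 40, -100],
           [30, -100, 50],
           [-100, 20, 35]]]
print(fat_volume_ml(slices, voxel_volume_mm3=2.0))
```

    The morphological analysis described in the abstract is then needed on top of this thresholding to separate visceral from subcutaneous fat, which share the same HU range.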

  15. Automatic enforcement and highway safety.

    DOT National Transportation Integrated Search

    2011-05-01

    The objectives of this research are to: 1. Identify aspects of the automatic detection of red light running that the public finds offensive or problematical, and quantify the level of opposition on each aspect. 2. Identify aspects of the automatic de...

  16. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    NASA Astrophysics Data System (ADS)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the locations announced by environmental authorities and companies and the automatic locations, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven were detected automatically. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.

  17. Utilization of Automatic Tagging Using Web Information to Datamining

    NASA Astrophysics Data System (ADS)

    Sugimura, Hiroshi; Matsumoto, Kazunori

    This paper proposes a data annotation system using an automatic tagging approach. Although annotations are useful for deep analysis and mining of data, the cost of providing them is huge in most cases. In order to solve this problem, we develop a semi-automatic method that consists of two stages. In the first stage, the system searches the Web for related information and discovers candidate annotations. The second stage uses the knowledge of a human user: the candidates are investigated and refined by the user, and then they become annotations. In this paper we focus on time-series data and show the effectiveness of a GUI tool that supports the above process.

  18. 30 CFR 77.314 - Automatic temperature control instruments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Automatic temperature control instruments. 77...

  19. 30 CFR 77.314 - Automatic temperature control instruments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic temperature control instruments. 77...

  20. 30 CFR 77.314 - Automatic temperature control instruments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Automatic temperature control instruments. 77... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  1. 30 CFR 77.314 - Automatic temperature control instruments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Automatic temperature control instruments. 77... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  2. 30 CFR 77.314 - Automatic temperature control instruments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Automatic temperature control instruments. 77... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  3. Thai Automatic Speech Recognition

    DTIC Science & Technology

    2005-01-01

    used in an external DARPA evaluation involving medical scenarios between an American Doctor and a naïve monolingual Thai patient. 2. Thai Language... dictionary generation more challenging, and (3) the lack of word segmentation, which calls for automatic segmentation approaches to make n-gram language...requires a dictionary and provides various segmentation algorithms to automatically select suitable segmentations. Here we used a maximal matching

  4. Automatic Whistler Detector and Analyzer system: Implementation of the analyzer algorithm

    NASA Astrophysics Data System (ADS)

    Lichtenberger, János; Ferencz, Csaba; Hamar, Daniel; Steinbach, Peter; Rodger, Craig J.; Clilverd, Mark A.; Collier, Andrew B.

    2010-12-01

    The full potential of whistlers for monitoring plasmaspheric electron density variations has not yet been realized. The primary reason is the vast human effort required for the analysis of whistler traces. Recently, the first part of a complete whistler analysis procedure was successfully automated, i.e., the automatic detection of whistler traces from the raw broadband VLF signal was achieved. This study describes a new algorithm developed to determine plasmaspheric electron density measurements from whistler traces, based on a Virtual (Whistler) Trace Transformation using a 2-D fast Fourier transform. This algorithm can be automated and can thus form the final step to complete an Automatic Whistler Detector and Analyzer (AWDA) system. In this second AWDA paper, the practical implementation of the Automatic Whistler Analyzer (AWA) algorithm is discussed and a feasible solution is presented. The practical implementation of the algorithm is able to track the variations of the plasmasphere in quasi real time on a PC cluster with 100 CPU cores. The electron densities obtained by the AWA method can be used in investigations such as plasmasphere dynamics, ionosphere-plasmasphere coupling, or space weather models.
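
    Turning a whistler trace into plasmaspheric information starts from the classical Eckersley approximation t ≈ t0 + D/√f, where the dispersion D carries the electron density information. The AWA algorithm itself is far more elaborate, but a least-squares fit of D to a synthetic trace illustrates this first step:

```python
import math

def fit_dispersion(freqs_hz, times_s):
    """Least-squares fit of the Eckersley law t = t0 + D / sqrt(f),
    which is linear in the variable x = 1 / sqrt(f)."""
    xs = [1 / math.sqrt(f) for f in freqs_hz]
    n = len(xs)
    mx, mt = sum(xs) / n, sum(times_s) / n
    d = (sum((x - mx) * (t - mt) for x, t in zip(xs, times_s))
         / sum((x - mx) ** 2 for x in xs))
    return d, mt - d * mx            # D (s * Hz^0.5) and onset time t0

# Synthetic trace generated with D = 80 and t0 = 0.1 s.
freqs = [2000, 4000, 6000, 8000]
times = [0.1 + 80 / math.sqrt(f) for f in freqs]
D, t0 = fit_dispersion(freqs, times)
print(round(D, 6), round(t0, 6))
```

    In a real analysis the (f, t) points come from the detected trace, and the fitted dispersion is then inverted through a plasmasphere model to yield electron density.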

  5. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to, and by a specification language that is more natural to the user's problem domain and to the user's way of thinking about the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools that assist the modeler in defining or constructing a model of the system and then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  6. Clothes Dryer Automatic Termination Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TeGrotenhuis, Ward E.

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  7. Model-Based Reasoning in Humans Becomes Automatic with Training.

    PubMed

    Economides, Marcos; Kurth-Nelson, Zeb; Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J

    2015-09-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load, a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  8. Analysis of motor fan radiated sound and vibration waveform by automatic pattern recognition technique using "Mahalanobis distance"

    NASA Astrophysics Data System (ADS)

    Toma, Eiji

    2018-06-01

    In recent years, as IT equipment has become lighter and more compact, demand for motor fans for cooling the interior of electronic equipment has been rising. Sensory testing by inspectors is still the mainstream method of quality inspection for motor fans in the field. This sensory test requires considerable experience to accurately diagnose subtle differences in the sounds (sound pressures) of the fans, and the judgment varies with the condition of the inspector and the environment. To solve these quality problems, an analysis method capable of quantitatively and automatically diagnosing the sound/vibration level of a fan is required. In this study, it was clarified that an analysis method applying the MT system, based on the waveform information of noise and vibration, is more effective for discriminating normal from abnormal items than the conventional frequency analysis method. Furthermore, it was found that in automating the vibration waveform analysis system, the relation between the fan installation posture and the vibration waveform is a factor influencing the discrimination accuracy.
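    The discrimination step at the heart of the MT (Mahalanobis-Taguchi) system can be sketched as follows. The two-dimensional feature vectors and thresholds here are purely illustrative, not the study's actual waveform features:

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance of feature vector x from a reference ('unit') space."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Build the unit space from (synthetic) waveform features of known-good fans
rng = np.random.default_rng(0)
normal_feats = rng.normal(loc=[1.0, 0.5], scale=[0.1, 0.05], size=(200, 2))
mean = normal_feats.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal_feats, rowvar=False))

good = mahalanobis(np.array([1.02, 0.51]), mean, cov_inv)  # near the unit space
bad = mahalanobis(np.array([1.80, 0.90]), mean, cov_inv)   # abnormal fan
assert good < bad
```

    In the MT framework, items whose distance from the unit space exceeds a chosen threshold are flagged as abnormal.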

  9. Automatic short axis orientation of the left ventricle in 3D ultrasound recordings

    NASA Astrophysics Data System (ADS)

    Pedrosa, João.; Heyde, Brecht; Heeren, Laurens; Engvall, Jan; Zamorano, Jose; Papachristidis, Alexandros; Edvardsen, Thor; Claus, Piet; D'hooge, Jan

    2016-04-01

    The recent advent of three-dimensional echocardiography has led to an increased interest from the scientific community in left ventricle segmentation frameworks for cardiac volume and function assessment. An automatic orientation of the segmented left ventricular mesh is an important step to obtain a point-to-point correspondence between the mesh and the cardiac anatomy. Furthermore, this would allow for an automatic division of the left ventricle into the standard 17 segments and, thus, fully automatic per-segment analysis, e.g. regional strain assessment. In this work, a method for fully automatic short axis orientation of the segmented left ventricle is presented. The proposed framework aims at detecting the inferior right ventricular insertion point. 211 three-dimensional echocardiographic images were used to validate this framework by comparison to manual annotation of the inferior right ventricular insertion point. A mean unsigned error of 8.05° +/- 18.50° was found, whereas the mean signed error was 1.09°. Large deviations between the manual and automatic annotations (> 30°) occurred in only 3.79% of cases. The average computation time was 666 ms in a non-optimized MATLAB environment, which makes real-time application feasible. In conclusion, a successful automatic real-time method for orientation of the segmented left ventricle is proposed.

  10. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    NASA Astrophysics Data System (ADS)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data becoming more commonly available, many studies have been performed to exploit the detailed information on the earth surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the generation of landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for the further use in landslide inventories. The results of the developed open-source approach demonstrated good
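    Step (1), derivation of land surface parameters, can be sketched for the slope parameter with central differences on a raster DTM. The grid here is synthetic; this is not the SAGA or GRASS implementation:

```python
import numpy as np

def slope_degrees(dem, cellsize=1.0):
    """Slope of a DEM grid (in degrees) from central-difference gradients,
    analogous to the slope rasters produced by GIS tools."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)      # gradients along rows, cols
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Synthetic tilted plane rising 1 m per metre in x: slope is 45 deg everywhere
x = np.arange(50, dtype=float)
dem = np.tile(x, (50, 1))
s = slope_degrees(dem)
assert np.allclose(s, 45.0)
```

    Curvature and roughness follow the same pattern with second derivatives and local windowed statistics, respectively.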

  11. IADE: a system for intelligent automatic design of bioisosteric analogs

    NASA Astrophysics Data System (ADS)

    Ertl, Peter; Lewis, Richard

    2012-11-01

    IADE, a software system supporting molecular modellers through the automatic design of non-classical bioisosteric analogs, scaffold hopping and fragment growing, is presented. The program combines sophisticated cheminformatics functionalities for constructing novel analogs and filtering them based on their drug-likeness and synthetic accessibility using automatic structure-based design capabilities: the best candidates are selected according to their similarity to the template ligand and to their interactions with the protein binding site. IADE works in an iterative manner, improving the fitness of designed molecules in every generation until structures with optimal properties are identified. The program frees molecular modellers from routine, repetitive tasks, allowing them to focus on analysis and evaluation of the automatically designed analogs, considerably enhancing their work efficiency as well as the area of chemical space that can be covered. The performance of IADE is illustrated through a case study of the design of a nonclassical bioisosteric analog of a farnesyltransferase inhibitor; an analog that won a recent "Design a Molecule" competition.

  12. IADE: a system for intelligent automatic design of bioisosteric analogs.

    PubMed

    Ertl, Peter; Lewis, Richard

    2012-11-01

    IADE, a software system supporting molecular modellers through the automatic design of non-classical bioisosteric analogs, scaffold hopping and fragment growing, is presented. The program combines sophisticated cheminformatics functionalities for constructing novel analogs and filtering them based on their drug-likeness and synthetic accessibility using automatic structure-based design capabilities: the best candidates are selected according to their similarity to the template ligand and to their interactions with the protein binding site. IADE works in an iterative manner, improving the fitness of designed molecules in every generation until structures with optimal properties are identified. The program frees molecular modellers from routine, repetitive tasks, allowing them to focus on analysis and evaluation of the automatically designed analogs, considerably enhancing their work efficiency as well as the area of chemical space that can be covered. The performance of IADE is illustrated through a case study of the design of a nonclassical bioisosteric analog of a farnesyltransferase inhibitor; an analog that won a recent "Design a Molecule" competition.

  13. Automatic three-dimensional quantitative analysis for evaluation of facial movement.

    PubMed

    Hontanilla, B; Aubá, C

    2008-01-01

    The aim of this study is to present a new 3D capture system of facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflecting dots on the subject's face and video recording, with three infrared-light cameras, the subject performing several face movements such as smiling, mouth puckering, eye closure and forehead elevation. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study has been performed in 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities have been evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that this system is accurate to within 0.13 mm and 0.41 degrees. In conclusion, the accuracy of the FACIAL CLIMA system for evaluation of facial movements is demonstrated, as well as its high intrarater and interrater reliability. It has advantages over other systems developed for evaluation of facial movements, such as short calibration time, short measuring time and ease of use, and it provides not only distances but also velocities and areas. The FACIAL CLIMA system can therefore be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery, allowing patients with facial paralysis to be compared between surgical centres so that the effectiveness of facial reanimation operations can be evaluated.

  14. 10 CFR 429.45 - Automatic commercial ice makers.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Automatic commercial ice makers. 429.45 Section 429.45... PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.45 Automatic commercial ice makers. (a... automatic commercial ice makers; and (2) For each basic model of automatic commercial ice maker selected for...

  15. 10 CFR 429.45 - Automatic commercial ice makers.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Automatic commercial ice makers. 429.45 Section 429.45... PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.45 Automatic commercial ice makers. (a... automatic commercial ice makers; and (2) For each basic model of automatic commercial ice maker selected for...

  16. 10 CFR 429.45 - Automatic commercial ice makers.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Automatic commercial ice makers. 429.45 Section 429.45... PRODUCTS AND COMMERCIAL AND INDUSTRIAL EQUIPMENT Certification § 429.45 Automatic commercial ice makers. (a... automatic commercial ice makers; and (2) For each basic model of automatic commercial ice maker selected for...

  17. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all recorded earthquakes had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.

  18. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to balance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary searching ability of the live wire method, reducing the necessary user interaction while maintaining segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  19. AUTOMATIC COUNTING APPARATUS

    DOEpatents

    Howell, W.D.

    1957-08-20

    An apparatus for automatically recording the results of counting operations on trains of electrical pulses is described. The disadvantages of prior devices utilizing the two common methods of obtaining the count rate are overcome by this apparatus; in the case of time controlled operation, the disclosed system automatically records any information stored by the scaler but not transferred to the printer at the end of the predetermined time controlled operations and, in the case of count controlled operation, provision is made to prevent a weak sample from occupying the apparatus for an excessively long period of time.

  20. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detector with a remote-control interface is investigated and an automatic verification system for it is developed. By using extensible markup language (XML) to build the protocol instruction set and the data-analysis method database, the system software achieves a configurable design and accommodates the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error-compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. The operating results of the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.

  1. A method for measuring enthalpy of volatilization of a compound, Delta(vol)H, from dilute aqueous solution.

    PubMed

    Wang, Tianshu

    2006-01-01

    This study has developed a method for measuring the enthalpy of volatilization (Delta(vol)H) of a compound in a dilute solution via ion-molecule reactions and gas-phase analysis using selected ion flow tube mass spectrometry (SIFT-MS). The Delta(vol)H/R value was obtained using an equation with three variant forms, either from the headspace concentration of the solution or from individual product ion(s). Under certain experimental conditions, the equation has the simplest form [formula: see text], where R is the gas constant (8.314 J . mol(-1) . K(-1)), i(n) and I are the respective product and precursor ion count rates, and T is the temperature of the solution. As an example, a series of 27.0 micromol/L aqueous solutions of acetone was analyzed over a temperature range of 25-50 degrees C at 5 degrees C intervals using H3O+, NO+ and O2+* precursor ions, producing a mean Delta(vol)H/R value of 4700 +/- 200 K. This corresponds with current literature values and supports the consistency of the new method. Notably, using this method, as long as the concentration of the solution falls within the range where Henry's law applies, the exact concentration does not have to be known, and only one sample is required at each temperature. Compared with previous methods, which involve the measurement of the Henry's law constant at each temperature, this method significantly reduces the number of samples required and avoids the labour and difficulties of preparing standard solutions at very low concentrations. Furthermore, if the contents of a solution were unknown, the measured Delta(vol)H/R from individual product ion(s) can help to identify the origin of the ion(s). Copyright 2006 John Wiley & Sons, Ltd.
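    The "simplest form" of the working equation is lost in this record ("[formula: see text]"). The surrounding description (ion count-rate ratio, gas constant, temperature) is consistent with a van't Hoff-type relation, which, under the stated assumption that the headspace concentration is proportional to the product-to-precursor ion ratio i(n)/I, would read:

```latex
\frac{\Delta_{\mathrm{vol}}H}{R} \;=\; -\,\frac{\mathrm{d}\,\ln\!\left(i_n / I\right)}{\mathrm{d}\,(1/T)}
```

    so that a linear fit of ln(i(n)/I) against 1/T over the 25-50 degrees C range yields Delta(vol)H/R directly (about 4700 K for acetone here). This reconstruction is an assumption, not the paper's printed formula.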

  2. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    ERIC Educational Resources Information Center

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  3. Semi-automatic knee cartilage segmentation

    NASA Astrophysics Data System (ADS)

    Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus

    2006-03-01

    Osteoarthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is wear-down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central in monitoring disease progression and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases will be averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias in the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis on individuals, this is even more crucial since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method for the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.

  4. Automatic registration of multi-modal microscopy images for integrative analysis of prostate tissue sections.

    PubMed

    Lippolis, Giuseppe; Edsjö, Anders; Helczynski, Leszek; Bjartell, Anders; Overgaard, Niels Chr

    2013-09-05

    Prostate cancer is one of the leading causes of cancer-related deaths. For diagnosis, predicting the outcome of the disease, and for assessing potential new biomarkers, pathologists and researchers routinely analyze histological samples. Morphological and molecular information may be integrated by aligning microscopic histological images in a multiplex fashion. This process is usually time-consuming and results in intra- and inter-user variability. The aim of this study is to investigate the feasibility of using modern image analysis methods for automated alignment of microscopic images from differently stained adjacent paraffin sections from prostatic tissue specimens. Tissue samples, obtained from biopsy or radical prostatectomy, were sectioned and stained with either hematoxylin & eosin (H&E), immunohistochemistry for p63 and AMACR or Time Resolved Fluorescence (TRF) for androgen receptor (AR). Image pairs were aligned allowing for translation, rotation and scaling. The registration was performed automatically by first detecting landmarks in both images using the scale-invariant feature transform (SIFT), followed by the well-known RANSAC protocol for finding point correspondences, and finally aligned by a Procrustes fit. The registration results were evaluated using both visual and quantitative criteria as defined in the text. Three experiments were carried out. First, images of consecutive tissue sections stained with H&E and p63/AMACR were successfully aligned in 85 of 88 cases (96.6%). The failures occurred in 3 out of 13 cores with highly aggressive cancer (Gleason score ≥ 8). Second, TRF and H&E image pairs were aligned correctly in 103 out of 106 cases (97%). The third experiment considered the alignment of image pairs with the same staining (H&E) coming from a stack of 4 sections. The success rate for alignment dropped from 93.8% in adjacent sections to 22% for the sections furthest away. The proposed method is both reliable and fast and therefore well suited
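    The final Procrustes step, given point correspondences already established (by SIFT matching and RANSAC in the study), amounts to a least-squares similarity fit of translation, rotation and scale. The sketch below is a generic Umeyama-style implementation, not the authors' code:

```python
import numpy as np

def procrustes_fit(src, dst):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    mapping matched landmarks src -> dst, both (N, 2) arrays of points."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))          # guard against reflections
    D = np.diag([1.0, d])
    R = U @ D @ Vt                              # optimal rotation
    s = np.trace(np.diag(S) @ D) / (A ** 2).sum()  # isotropic scale
    t = mu_d - s * R @ mu_s                     # translation
    return s, R, t

# Synthetic check: recover a known 30-degree rotation, scale 2, and offset
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.random.default_rng(1).normal(size=(10, 2))
dst = 2.0 * src @ R_true.T + np.array([5.0, -3.0])
s, R, t = procrustes_fit(src, dst)
assert np.isclose(s, 2.0) and np.allclose(R, R_true) and np.allclose(t, [5.0, -3.0])
```

    In practice RANSAC supplies only inlier correspondences to this fit, so a few SIFT mismatches do not corrupt the alignment.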

  5. Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.

    PubMed

    Weijers, Gert; Starke, Alexander; Haudum, Alois; Thijssen, Johan M; Rehage, Jürgen; De Korte, Chris L

    2010-07-01

    The aim of this study was to test the hypothesis that automatic segmentation of vessels in ultrasound (US) images can produce similar or better results in grading fatty livers than interactive segmentation. A study was performed in postpartum dairy cows (N=151), as an animal model of human fatty liver disease, to test this hypothesis. Five transcutaneous and five intraoperative US liver images were acquired in each animal and a liver biopsy was taken. In liver tissue samples, triacylglycerol (TAG) was measured by biochemical analysis, and hepatic diseases other than hepatic lipidosis were excluded by histopathologic examination. Ultrasonic tissue characterization (UTC) parameters (mean echo level, standard deviation (SD) of echo level, signal-to-noise ratio (SNR), residual attenuation coefficient (ResAtt), and axial and lateral speckle size) were derived using a computer-aided US (CAUS) protocol and software package. First, the liver tissue was interactively segmented by two observers. With increasing fat content, fewer hepatic vessels were visible in the ultrasound images and, therefore, a smaller proportion of the liver needed to be excluded from these images. Automatic-segmentation algorithms were implemented, and it was investigated whether better results could be achieved than with the subjective and time-consuming interactive-segmentation procedure. The automatic-segmentation algorithms were based on both fixed and adaptive thresholding techniques in combination with a 'speckle'-shaped moving-window exclusion technique. All data were analyzed with and without postprocessing as contained in CAUS and with different automated-segmentation techniques. This enabled us to study the effect of the applied postprocessing steps on single and multiple linear regressions of the various UTC parameters with TAG. Improved correlations for all US parameters were found by using automatic-segmentation techniques. Stepwise multiple linear-regression formulas were derived and used
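    The adaptive-thresholding idea described above can be sketched minimally: dark vessel pixels are excluded from a bright speckle background with a threshold derived from the image statistics. The data and the mean-minus-k-SD rule are illustrative; the study's actual algorithms and parameters are not specified in this record:

```python
import numpy as np

def segment_parenchyma(img, k=1.0):
    """Boolean mask of liver tissue: exclude dark (vessel) pixels using an
    adaptive threshold of (mean - k * std) computed from the image itself."""
    thr = img.mean() - k * img.std()
    return img > thr

# Synthetic patch: bright speckle with a dark vessel cross-section
rng = np.random.default_rng(2)
img = rng.normal(100.0, 5.0, size=(64, 64))
img[20:30, 20:30] = 10.0                # dark vessel lumen
mask = segment_parenchyma(img)
assert not mask[25, 25]                 # vessel excluded
assert mask.mean() > 0.9                # most tissue retained
```

    A fixed threshold would use a constant instead of the image statistics, which is why it degrades as echo levels shift with fat content.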

  6. A system for automatic analysis of blood pressure data for digital computer entry

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1972-01-01

    Operation of automatic blood pressure data system is described. Analog blood pressure signal is analyzed by three separate circuits, systolic, diastolic, and cycle defect. Digital computer output is displayed on teletype paper tape punch and video screen. Illustration of system is included.

  7. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy requires objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered to be the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software delivers various scoring systems (Larsen-Score and Ratingen-Rau-Score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs from hands and feet of more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.

  8. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    PubMed

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
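    The dependency-tracking execution described above, running modules in upstream-to-downstream order and skipping what is already done, can be sketched with a topological sort. The module names and the structure below are hypothetical, not aa's actual module list or API:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: module -> set of upstream dependencies
pipeline = {
    "realign": set(),
    "coregister": {"realign"},
    "normalise": {"coregister"},
    "smooth": {"normalise"},
    "firstlevel_stats": {"smooth"},
}

done = {"realign", "coregister"}        # completed in a prior run
order = list(TopologicalSorter(pipeline).static_order())
to_run = [m for m in order if m not in done]
print(to_run)  # ['normalise', 'smooth', 'firstlevel_stats']
```

    Independent branches of such a dependency map have no ordering constraint between them, which is what makes the parallel dispatch to cluster or cloud workers possible.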

  9. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two distinct merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. By computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
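    The abstract does not specify the ASA mechanism, so the sketch below illustrates the general idea with a generic step-doubling adaptive integrator for a power-evolution equation dP/dz = f(z, P) along the fibre; the estimator and step-control rules are assumptions, not the paper's algorithm:

```python
import numpy as np

def rk4_step(f, z, P, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(z, P)
    k2 = f(z + h / 2, P + h / 2 * k1)
    k3 = f(z + h / 2, P + h / 2 * k2)
    k4 = f(z + h, P + h * k3)
    return P + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate_adaptive(f, z0, z1, P0, h=0.1, tol=1e-8):
    """Integrate dP/dz = f(z, P) with step doubling: compare one full step
    against two half steps, shrinking h on failure and growing it when the
    estimated error is comfortably below tol."""
    z, P = z0, P0
    while z1 - z > 1e-12:
        h = min(h, z1 - z)
        full = rk4_step(f, z, P, h)
        half = rk4_step(f, z + h / 2, rk4_step(f, z, P, h / 2), h / 2)
        err = abs(half - full)              # local error estimate
        if err <= tol:
            z, P = z + h, half              # accept the more accurate result
            if err < tol / 32:
                h *= 2                      # grow step when clearly accurate
        else:
            h /= 2                          # reject and retry with smaller step
    return P

# Check on dP/dz = P (pure exponential gain): P(1) should equal e
P_end = integrate_adaptive(lambda z, P: P, 0.0, 1.0, 1.0)
assert abs(P_end - np.e) < 1e-6
```

    Compared with a fixed step, the integrator concentrates small steps where the gain varies quickly, which is the source of the speed-up an ASA-style scheme claims.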

  10. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost-effective computer-aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA TERMS selection, and RECON display.
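    A frequency-based multi-term (phrase) extraction step of the kind described can be sketched as follows (the stopword list and corpus are illustrative; this is not the NASA TERMS algorithm itself):

```python
from collections import Counter
import re

STOPWORDS = {"the", "of", "a", "an", "and", "in", "for", "is", "to"}

def candidate_phrases(text, n=2):
    """Collect adjacent-word phrases that contain no stopwords."""
    words = re.findall(r"[a-z]+", text.lower())
    phrases = []
    for i in range(len(words) - n + 1):
        window = words[i:i + n]
        if not any(w in STOPWORDS for w in window):
            phrases.append(" ".join(window))
    return phrases

def top_print_terms(texts, k=3):
    """Rank candidate phrases across a corpus by frequency."""
    counts = Counter()
    for text in texts:
        counts.update(candidate_phrases(text))
    return [phrase for phrase, _ in counts.most_common(k)]

corpus = [
    "Thermal protection systems for reentry vehicles.",
    "Analysis of thermal protection materials.",
    "Reentry vehicles and thermal protection.",
]
terms = top_print_terms(corpus)
```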

  11. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.
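    The GMM pattern-recognition step can be illustrated with a one-component (single Gaussian) per-class simplification; the feature values and class labels below are toy placeholders, not data from the study:

```python
import math
from statistics import mean, pstdev

def fit_gaussian(samples):
    """Fit a single Gaussian (a one-component simplification of a GMM)."""
    return mean(samples), pstdev(samples)

def log_likelihood(x, params):
    mu, sigma = params
    return -math.log(sigma * math.sqrt(2 * math.pi)) - (x - mu) ** 2 / (2 * sigma ** 2)

# Toy one-dimensional "acoustic feature" values for the two groups
healthy = [1.0, 1.2, 0.9, 1.1, 1.0]
apnoea = [2.0, 2.2, 1.9, 2.1, 2.0]
models = {"healthy": fit_gaussian(healthy), "apnoea": fit_gaussian(apnoea)}

def classify(x):
    """Assign a sample to the class whose model gives it higher likelihood."""
    return max(models, key=lambda label: log_likelihood(x, models[label]))

label = classify(2.05)
```

    A real GMM would mix several Gaussians per class and work on multidimensional spectral features, but the decision rule, maximum likelihood over class models, is the same.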

  12. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    PubMed

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals.
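    The heart-rate-variability computation underlying the stress index can be illustrated with standard time-domain HRV measures (the paper's SVI is derived from spectral low-frequency/high-frequency analysis, which this sketch does not implement; the RR series is invented):

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR intervals (overall variability)."""
    m = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((r - m) ** 2 for r in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR differences, a common
    time-domain marker of parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795, 815]   # RR intervals in milliseconds
overall, vagal = sdnn(rr), rmssd(rr)
```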

  13. Automatic query formulations in information retrieval.

    PubMed

    Salton, G; Buckley, C; Fox, E A

    1983-07-01

    Modern information retrieval systems are designed to supply relevant information in response to requests received from the user population. In most retrieval environments the search requests consist of keywords, or index terms, interrelated by appropriate Boolean operators. Since it is difficult for untrained users to generate effective Boolean search requests, trained search intermediaries are normally used to translate original statements of user need into useful Boolean search formulations. Methods are introduced in this study which reduce the role of the search intermediaries by making it possible to generate Boolean search formulations completely automatically from natural language statements provided by the system patrons. Frequency considerations are used automatically to generate appropriate term combinations as well as Boolean connectives relating the terms. Methods are covered to produce automatic query formulations both in a standard Boolean logic system, as well as in an extended Boolean system in which the strict interpretation of the connectives is relaxed. Experimental results are supplied to evaluate the effectiveness of the automatic query formulation process, and methods are described for applying the automatic query formulation process in practice.
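    The frequency-based generation of term combinations and Boolean connectives can be sketched as follows (the threshold, stopword list, and collection frequencies are illustrative assumptions, not the paper's parameters):

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "in", "on", "for", "about", "i", "want",
             "need", "information", "documents"}

def formulate_query(request, corpus_freq, high=0.2):
    """Turn a natural-language request into a Boolean query:
    rare (discriminating) terms are ANDed, common terms are ORed."""
    terms = [w for w in re.findall(r"[a-z]+", request.lower())
             if w not in STOPWORDS]
    total = sum(corpus_freq.values()) or 1
    must = [t for t in terms if corpus_freq.get(t, 0) / total < high]
    optional = [t for t in terms if t not in must]
    query = " AND ".join(sorted(set(must)))
    if optional:
        query += " AND (" + " OR ".join(sorted(set(optional))) + ")"
    return query

# Hypothetical collection frequencies for the example request
freq = Counter({"retrieval": 50, "boolean": 5, "query": 8, "system": 37})
q = formulate_query("I want documents about Boolean query retrieval", freq)
```

    Relaxing the strict Boolean interpretation, as in the extended system the abstract mentions, would replace the hard AND/OR with soft term-weighting, but the frequency-driven term selection is the same idea.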

  14. Automatic intraaortic balloon pump timing using an intrabeat dicrotic notch prediction algorithm.

    PubMed

    Schreuder, Jan J; Castiglioni, Alessandro; Donelli, Andrea; Maisano, Francesco; Jansen, Jos R C; Hanania, Ramzi; Hanlon, Pat; Bovelander, Jan; Alfieri, Ottavio

    2005-03-01

    The efficacy of intraaortic balloon counterpulsation (IABP) during arrhythmic episodes is questionable. A novel algorithm for intrabeat prediction of the dicrotic notch was used for real-time IABP inflation timing control. A windkessel model algorithm was used to calculate real-time aortic flow from aortic pressure. The dicrotic notch was predicted using a percentage of calculated peak flow. Automatic inflation timing was set at the intrabeat predicted dicrotic notch and was combined with automatic IAB deflation. Prophylactic IABP was applied in 27 patients with low ejection fraction (< 35%) undergoing cardiac surgery. Analysis of IABP at a 1:4 ratio revealed that IAB inflation occurred at a mean of 0.6 +/- 5 ms from the dicrotic notch. In all patients, accurate automatic timing at a 1:1 assist ratio was performed. Seventeen patients had episodes of severe arrhythmia; the novel IABP inflation algorithm accurately assisted 318 of 320 arrhythmic beats at a 1:1 ratio. The novel real-time intrabeat IABP inflation timing algorithm performed accurately in all patients during both regular rhythms and severe arrhythmia, allowing fully automatic intrabeat IABP timing.
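    The intrabeat prediction scheme, estimating flow from pressure with a windkessel model and triggering at a percentage of peak flow, can be sketched as follows (compliance, resistance, the 30% fraction, and the waveform are synthetic placeholders, not the clinical values):

```python
def windkessel_flow(pressure, dt, C=1.2, R=1.0):
    """Estimate aortic flow from pressure with a two-element windkessel:
    q(t) = C * dP/dt + P/R (arbitrary units here)."""
    dP = [(pressure[i + 1] - pressure[i]) / dt for i in range(len(pressure) - 1)]
    return [C * d + p / R for d, p in zip(dP, pressure)]

def predict_notch(pressure, dt, fraction=0.3):
    """Predict the dicrotic notch as the first sample after peak flow
    where flow drops below a fraction of that peak."""
    flow = windkessel_flow(pressure, dt)
    peak = max(range(len(flow)), key=lambda i: flow[i])
    for i in range(peak, len(flow)):
        if flow[i] < fraction * flow[peak]:
            return i
    return len(flow) - 1

# Synthetic beat: systolic upstroke followed by decay (mmHg, 10 ms samples)
pressure = [80, 90, 105, 118, 120, 115, 108, 100, 94, 90, 88]
idx = predict_notch(pressure, dt=0.01)
```

    Because the prediction uses only samples up to the current instant within the beat, inflation can be timed even when arrhythmia makes beat-to-beat averaging unreliable.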

  15. Adding Automatic Evaluation to Interactive Virtual Labs

    ERIC Educational Resources Information Center

    Farias, Gonzalo; Muñoz de la Peña, David; Gómez-Estern, Fabio; De la Torre, Luis; Sánchez, Carlos; Dormido, Sebastián

    2016-01-01

    Automatic evaluation is a challenging field that has been addressed by the academic community in order to reduce the assessment workload. In this work we present a new element for the authoring tool Easy Java Simulations (EJS). This element, which is named automatic evaluation element (AEE), provides automatic evaluation to virtual and remote…

  16. Automatic segmentation of the left ventricle in a cardiac MR short axis image using blind morphological operation

    NASA Astrophysics Data System (ADS)

    Irshad, Mehreen; Muhammad, Nazeer; Sharif, Muhammad; Yasmeen, Mussarat

    2018-04-01

    Conventionally, cardiac MR image analysis is done manually. Automatic examination can replace the monotonous task of analyzing massive amounts of data to assess the global and regional function of the cardiac left ventricle (LV). This task is performed using MR images to calculate analytic cardiac parameters such as end-systolic volume, end-diastolic volume, ejection fraction, and myocardial mass. These analytic parameters depend upon accurate delineation of the epicardial, endocardial, papillary muscle, and trabeculation contours. In this paper, we propose an automatic segmentation method using the sum of absolute differences technique to localize the left ventricle. Blind morphological operations are proposed to automatically segment and detect the LV contours of the epicardium and endocardium. We evaluate the proposed work on the benchmark Sunnybrook dataset. Contours of the epicardium and endocardium are compared quantitatively to determine contour accuracy, and high matching values are observed. Overlap between the automatic examination and the ground truth analysis given by an expert is high, with an index value of 91.30%. The proposed method for automatic segmentation gives better performance than existing techniques in terms of accuracy.
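    The sum of absolute differences (SAD) localization step can be sketched on a toy image; the real method slides a template over full MR slices, but the matching criterion is the same:

```python
def sad(image, template, r0, c0):
    """Sum of absolute differences between the template and the image patch at (r0, c0)."""
    return sum(abs(image[r0 + i][c0 + j] - template[i][j])
               for i in range(len(template))
               for j in range(len(template[0])))

def localize(image, template):
    """Slide the template over the image; the minimum-SAD position is the
    best match (standing in here for the LV localization step)."""
    h, w = len(template), len(template[0])
    positions = [(r, c) for r in range(len(image) - h + 1)
                 for c in range(len(image[0]) - w + 1)]
    return min(positions, key=lambda rc: sad(image, template, *rc))

image = [
    [0, 0, 0, 0, 0],
    [0, 9, 8, 0, 0],
    [0, 7, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
pos = localize(image, template)   # row, column of the best match
```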

  17. Case-based synthesis in automatic advertising creation system

    NASA Astrophysics Data System (ADS)

    Zhuang, Yueting; Pan, Yunhe

    1995-08-01

    Advertising (ads) is an important design area. Though many interactive ad-design software packages have come into commercial use, none of them supports the intelligent part of the work -- automatic ad creation. The potential for this is enormous. This paper describes our current work in the research of an automatic advertising creation system (AACS). After careful analysis of the mental behavior of a human ad designer, we conclude that a case-based approach is appropriate for its intelligent modeling. A model for AACS is given in the paper. A case in advertising is described in two parts, the creation process and the configuration of the ad picture, with detailed data structures given in the paper. Along with the case representation, we put forward a synthesis algorithm. Issues such as similarity measure computation and case adaptation are also discussed.

  18. Text Structuration Leading to an Automatic Summary System: RAFI.

    ERIC Educational Resources Information Center

    Lehman, Abderrafih

    1999-01-01

    Describes the design and construction of Resume Automatique a Fragments Indicateurs (RAFI), a system of automatic text summary which sums up scientific and technical texts. The RAFI system transforms a long source text into several versions of more condensed texts, using discourse analysis, to make searching easier; it could be adapted to the…

  19. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    ERIC Educational Resources Information Center

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which, as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  20. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  1. Automatic detection of confusion in elderly users of a web-based health instruction video.

    PubMed

    Postma-Nilsenová, Marie; Postma, Eric; Tates, Kiek

    2015-06-01

    Because of cognitive limitations and lower health literacy, many elderly patients have difficulty understanding verbal medical instructions. Automatic detection of facial movements provides a nonintrusive basis for building technological tools supporting confusion detection in healthcare delivery applications on the Internet. Twenty-four elderly participants (70-90 years old) were recorded while watching Web-based health instruction videos involving easy and complex medical terminology. Relevant fragments of the participants' facial expressions were rated by 40 medical students for perceived level of confusion and analyzed with automatic software for facial movement recognition. A computer classification of the automatically detected facial features performed more accurately and with a higher sensitivity than the human observers (automatic detection and classification, 64% accuracy, 0.64 sensitivity; human observers, 41% accuracy, 0.43 sensitivity). A drill-down analysis of cues to confusion indicated the importance of the eye and eyebrow region. Confusion caused by misunderstanding of medical terminology is signaled by facial cues that can be automatically detected with currently available facial expression detection technology. The findings are relevant for the development of Web-based services for healthcare consumers.
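    The reported accuracy and sensitivity figures are standard confusion-matrix quantities; for reference, they can be computed as below (the counts are hypothetical, chosen only to reproduce the 64%/0.64 rates mentioned above):

```python
def accuracy_and_sensitivity(tp, tn, fp, fn):
    """Accuracy = correct / all; sensitivity (recall) = TP / (TP + FN)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    return accuracy, sensitivity

# Hypothetical counts for a confusion detector evaluated on 100 fragments
acc, sens = accuracy_and_sensitivity(tp=32, tn=32, fp=18, fn=18)
```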

  2. A Low-Cost Device for Automatic Photometric Titrations

    NASA Astrophysics Data System (ADS)

    Rocha, Fábio R. P.; Reis, Boaventura F.

    2000-02-01

    Electronics is an important topic in chemistry courses. However, the introduction of basic concepts is often difficult and the lab instruments are frequently seen as "black boxes". To address this problem, we propose the construction of a simple, low-cost (about $150 U.S.) automatic photometric titrator employing a light-emitting diode (LED) and a phototransistor. The electronic circuit can be assembled by the students themselves. The device was employed to implement a common procedure in chemical labs, making feasible the introduction of concepts related to electronics in undergraduate chemistry courses. The titrator is able to work automatically, since a feedback system permits stopping the addition of titrant solution when the end-point is achieved. With this demonstration, it can be stressed that automatic procedures can be implemented without expensive instruments. Additionally, a classical procedure becomes more attractive to the students and its importance to chemical analysis can be emphasized. The feasibility of the titrator was demonstrated by acid-base titrations of HCl solutions with NaOH in the presence of phenolphthalein and by iodimetric determination of ascorbic acid in vitamin C tablets and lemon juice. Precise results (0.7% relative standard deviation, n = 10) in agreement at the 95% confidence level with those attained by a conventional procedure were obtained.
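    The feedback principle, stop adding titrant once the photometric signal crosses the end-point, can be simulated in a few lines (the threshold, increment size, and indicator response below are invented for illustration, not the article's circuit values):

```python
def titrate(read_absorbance, add_increment, threshold=0.5, max_steps=1000):
    """Add titrant in fixed increments until the photometric signal
    crosses the end-point threshold, then stop (a simple feedback loop)."""
    volume = 0.0
    for _ in range(max_steps):
        if read_absorbance(volume) >= threshold:
            break                       # end-point reached: stop the pump
        volume = add_increment(volume)  # otherwise dispense more titrant
    return volume

# Simulated indicator: absorbance jumps once 10.0 mL of titrant is added
signal = lambda v: 0.05 if v < 10.0 else 0.9
dose = titrate(signal, lambda v: v + 0.1)   # mL of titrant delivered
```

    In the real device the LED/phototransistor pair plays the role of `read_absorbance` and a pump valve plays the role of `add_increment`; the control logic is this simple.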

  3. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    NASA Astrophysics Data System (ADS)

    Sharifi, Hamid; Larouche, Daniel

    2015-09-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during the casting. Predicting the evolution of these stresses with accuracy in the solidification interval should be highly helpful to avoid the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique of the heterogeneous semi-solid material for a finite element analysis at the microscopic level. This task is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected in the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium-copper alloy (Al-5.8 wt% Cu) when the fraction solid is 0.92. Using the finite element method and the Mie-Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges, which are formed during the tensile loading, have been detected.

  4. Automatic segmentation and supervised learning-based selection of nuclei in cancer tissue images.

    PubMed

    Nandy, Kaustav; Gudla, Prabhakar R; Amundsen, Ryan; Meaburn, Karen J; Misteli, Tom; Lockett, Stephen J

    2012-09-01

    Analysis of preferential localization of certain genes within the cell nuclei is emerging as a new technique for the diagnosis of breast cancer. Quantitation requires accurate segmentation of 100-200 cell nuclei in each tissue section to draw a statistically significant result. Thus, for large-scale analysis, manual processing is too time consuming and subjective. Fortuitously, acquired images generally contain many more nuclei than are needed for analysis. Therefore, we developed an integrated workflow that selects, following automatic segmentation, a subpopulation of accurately delineated nuclei for positioning of fluorescence in situ hybridization-labeled genes of interest. Segmentation was performed by a multistage watershed-based algorithm and screening by an artificial neural network-based pattern recognition engine. The performance of the workflow was quantified in terms of the fraction of automatically selected nuclei that were visually confirmed as well segmented and by the boundary accuracy of the well-segmented nuclei relative to a 2D dynamic programming-based reference segmentation method. Application of the method was demonstrated for discriminating normal and cancerous breast tissue sections based on the differential positioning of the HES5 gene. Automatic results agreed with manual analysis in 11 out of 14 cancers, all four normal cases, and all five noncancerous breast disease cases, thus showing the accuracy and robustness of the proposed approach. Published 2012 Wiley Periodicals, Inc.

  5. An automatic method to detect and track the glottal gap from high speed videoendoscopic images.

    PubMed

    Andrade-Miranda, Gustavo; Godino-Llorente, Juan I; Moro-Velázquez, Laureano; Gómez-García, Jorge Andrés

    2015-10-29

    The image-based analysis of vocal fold vibration plays an important role in the diagnosis of voice disorders. The analysis is based not only on direct observation of the video sequences, but also on an objective characterization of the phonation process by means of features extracted from the recorded images. Such analysis depends, however, on a prior accurate identification of the glottal gap, which is the most challenging step for any further automatic assessment of vocal fold vibration. In this work, a complete framework to automatically segment and track the glottal area (or glottal gap) is proposed. The algorithm identifies a region of interest (ROI) that is adapted over time and combines active contours and the watershed transform for the final delineation of the glottis; an automatic procedure to synthesize different videokymograms (VKGs) is also proposed. Thanks to the ROI implementation, the technique is robust to camera shifting, and objective tests proved the effectiveness and performance of the approach in the most challenging scenario, namely when closure of the vocal folds is incomplete. The novelty of the proposed algorithm lies in the use of temporal information to identify an adaptive ROI and in the use of watershed merging combined with active contours for glottis delimitation. Additionally, an automatic procedure to synthesize multiline VKGs by identifying the glottal main axis is developed.

  6. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NASA Astrophysics Data System (ADS)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.
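    The asymmetric analysis, flagging contralateral regions whose temperature difference exceeds a threshold (2.2 °C is a commonly cited cutoff in diabetic foot monitoring, used here only as an illustration), can be sketched once corresponding regions have been registered:

```python
def asymmetry_map(left, right, threshold=2.2):
    """Flag corresponding foot regions whose mean-temperature difference
    (in degrees C) exceeds a clinical threshold."""
    return {region: abs(left[region] - right[region]) > threshold
            for region in left}

# Hypothetical mean temperatures per registered region, in degrees C
left_foot = {"hallux": 30.1, "midfoot": 29.8, "heel": 28.5}
right_foot = {"hallux": 33.0, "midfoot": 30.0, "heel": 28.7}
flags = asymmetry_map(left_foot, right_foot)
```

    The hard part, which this sketch omits, is producing the correspondence between `left` and `right` regions in the first place; that is what the rigid color-to-thermal registration and the nonrigid B-spline registration in the record accomplish.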

  7. Automatic Cell Segmentation in Fluorescence Images of Confluent Cell Monolayers Using Multi-object Geometric Deformable Model.

    PubMed

    Yang, Zhen; Bogovic, John A; Carass, Aaron; Ye, Mao; Searson, Peter C; Prince, Jerry L

    2013-03-13

    With the rapid development of microscopy for cell imaging, there is a strong and growing demand for image analysis software to quantitatively study cell morphology. Automatic cell segmentation is an important step in image analysis. Despite substantial progress, there is still a need to improve the accuracy, efficiency, and adaptability to different cell morphologies. In this paper, we propose a fully automatic method for segmenting cells in fluorescence images of confluent cell monolayers. This method addresses several challenges through a combination of ideas. 1) It realizes a fully automatic segmentation process by first detecting the cell nuclei as initial seeds and then using a multi-object geometric deformable model (MGDM) for final segmentation. 2) To deal with different defects in the fluorescence images, the cell junctions are enhanced by applying an order-statistic filter and principal curvature based image operator. 3) The final segmentation using MGDM promotes robust and accurate segmentation results, and guarantees no overlaps and gaps between neighboring cells. The automatic segmentation results are compared with manually delineated cells, and the average Dice coefficient over all distinguishable cells is 0.88.

  8. Automatic Layout Design for Power Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ning, Puqi; Wang, Fei; Ngo, Khai

    The layout of power modules is one of the most important elements in power module design, especially at high power densities, where couplings are increased. This paper presents a process for automatic layout design of high-power-density modules using a genetic algorithm (GA), with some practical considerations introduced into the optimization. Detailed GA implementations are introduced for both the outer loop and the inner loop. As verified by a design example, the results of the automatic design process presented here are better than those from manual design and also better than the results from a popular design software package. This automatic design procedure could be a major step toward improving the overall performance of future layout designs.

  9. Word-Level and Sentence-Level Automaticity in English as a Foreign Language (EFL) Learners: a Comparative Study

    ERIC Educational Resources Information Center

    Ma, Dongmei; Yu, Xiaoru; Zhang, Haomin

    2017-01-01

    The present study aimed to investigate second language (L2) word-level and sentence-level automatic processing among English as a foreign language students through a comparative analysis of students with different proficiency levels. As a multidimensional and dynamic construct, automaticity is conceptualized as processing speed, stability, and…

  10. Automatic Scaling of Digisonde Ionograms Computer Program and Numerical Analysis Documentation,

    DTIC Science & Technology

    1983-02-01

    Automatic scaling of Digisonde ionograms by the ARTIST system is discussed, and reference is made to ARTIST's success in scaling over 8000 ionograms. The autoscaling routine has been tested on some 8000 ionograms from Goose Bay, Labrador, for the months of January, April, July, and September of 1980. The method is ideally suited for autoscaled results, as discussed in Reference 1, and the results of ARTIST are output to a standard format.

  11. [An automatic system for anatomophysiological correlation in three planes simultaneously during functional neurosurgery].

    PubMed

    Teijeiro, E J; Macías, R J; Morales, J M; Guerra, E; López, G; Alvarez, L M; Fernández, F; Maragoto, C; Seijo, F; Alvarez, E

    The Neurosurgical Deep Recording System (NDRS) using a personal computer takes the place of complex electronic equipment for recording and processing deep cerebral electrical activity, as a guide in stereotaxic functional neurosurgery. It also permits increased possibilities of presenting information in direct graphic form with automatic management and sufficient flexibility to implement different analyses. This paper describes the possibilities of automatic simultaneous graphic representation in three almost orthogonal planes, available with the new 5.1 version of NDRS, so as to facilitate the analysis of anatomophysiological correlation in the localization of deep structures of the brain during minimal access surgery. This new version can automatically show the spatial behaviour of signals registered throughout the path of the electrode inside the brain, superimposed simultaneously on sagittal, coronal and axial sections of an anatomical atlas of the brain, after adjusting the scale automatically according to the dimensions of the brain of each individual patient. This may also be shown in a three-dimensional representation of the different planes themselves intersecting. The NDRS system has been successfully used in Spain and Cuba in over 300 functional neurosurgery operations. The new version further facilitates analysis of spatial anatomophysiological correlation for the localization of brain structures. This system has contributed to increased precision and safety in selecting surgical targets in the control of Parkinson's disease and other movement disorders.

  12. Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness

    DTIC Science & Technology

    2016-06-01

    within an attack tree structure, then expand attack tree methodology to include cryptographic reductions. We then provide the algorithms for...maintaining and automatically reasoning about these expanded attack trees. We provide a software tool that utilizes machine-readable proof and attack metadata...and the attack tree methodology to provide rapid and precise answers regarding security parameters and effective security. This eliminates the need

  13. An automatic tsunami warning system: TREMORS application in Europe

    NASA Astrophysics Data System (ADS)

    Reymond, D.; Robert, S.; Thomas, Y.; Schindelé, F.

    1996-03-01

    An integrated system named TREMORS (Tsunami Risk Evaluation through seismic Moment of a Real-time System) has been installed at the EVORA station in Portugal, which has been affected by historical tsunamis. The system is based on a three-component long-period seismic station linked to an IBM-PC-compatible computer running specific software. The goals of this system are the following: detect earthquakes, locate them, compute their seismic moments, and issue seismic warnings. The warnings are based on the seismic moment estimation, and all processing is performed automatically. The aim of this study is to check the quality of estimation of the main parameters of interest for tsunami warning: the location, which depends on azimuth and distance, and the seismic moment, M0, which controls the earthquake size. The sine qua non condition for obtaining an automatic location is that the three main seismic phases P, S, and R be visible. This study gives satisfying results (automatic analysis): errors of ±5° in azimuth and epicentral distance, and a standard deviation of less than a factor of 2 for the seismic moment M0.

  14. Automatic Cataract Hardness Classification Ex Vivo by Ultrasound Techniques.

    PubMed

    Caixinha, Miguel; Santos, Mário; Santos, Jaime

    2016-04-01

    To demonstrate the feasibility of a new methodology for cataract hardness characterization and automatic classification using ultrasound techniques, different cataract degrees were induced in 210 porcine lenses. A 25-MHz ultrasound transducer was used to obtain acoustical parameters (velocity and attenuation) and backscattering signals. B-Scan and parametric Nakagami images were constructed. Ninety-seven parameters were extracted and subjected to a Principal Component Analysis. Bayes, K-Nearest-Neighbours, Fisher Linear Discriminant and Support Vector Machine (SVM) classifiers were used to automatically classify the different cataract severities. Statistically significant increases with cataract formation were found for velocity, attenuation, mean brightness intensity of the B-Scan images and mean Nakagami m parameter (p < 0.01). The four classifiers showed a good performance for healthy versus cataractous lenses (F-measure ≥ 92.68%), while for initial versus severe cataracts the SVM classifier showed the highest performance (90.62%). The results showed that ultrasound techniques can be used for non-invasive cataract hardness characterization and automatic classification. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  15. Automatic detection and recognition of signs from natural scenes.

    PubMed

    Chen, Xilin; Yang, Jie; Zhang, Jing; Waibel, Alex

    2004-01-01

    In this paper, we present an approach to automatic detection and recognition of signs from natural scenes, and its application to a sign translation task. The proposed approach embeds multiresolution and multiscale edge detection, adaptive searching, color analysis, and affine rectification in a hierarchical framework for sign detection, with different emphases at each phase to handle the text in different sizes, orientations, color distributions and backgrounds. We use affine rectification to recover deformation of the text regions caused by an inappropriate camera view angle. The procedure can significantly improve text detection rate and optical character recognition (OCR) accuracy. Instead of using binary information for OCR, we extract features from an intensity image directly. We propose a local intensity normalization method to effectively handle lighting variations, followed by a Gabor transform to obtain local features, and finally a linear discriminant analysis (LDA) method for feature selection. We have applied the approach in developing a Chinese sign translation system, which can automatically detect and recognize Chinese signs as input from a camera, and translate the recognized text into English.

  16. Automatic cloud coverage assessment of Formosat-2 image

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2011-11-01

    The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated in a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System consists of two major steps. First, an unsupervised K-means method automatically estimates the cloud statistics of a Formosat-2 image. Second, the cloud coverage is refined by manual examination. A more accurate Automatic Cloud Coverage Assessment (ACCA) method would therefore increase the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that includes both pre-processing and post-processing analysis. In the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, re-examination of non-cloudy pixels, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the pre-processing results, increasing the efficiency of the manual examination.
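
    Otsu's method, one of the pre-processing tools named above, picks the gray level that maximizes between-class variance; a minimal sketch on a toy bimodal histogram (the pixel values are hypothetical, not Formosat-2 data):

```python
def otsu_threshold(pixels, levels=256):
    """Return the gray level that maximizes between-class variance (Otsu's method)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]  # count of pixels at or below t (class 0)
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        mu0 = sum0 / w0                        # class-0 mean
        mu1 = (total_sum - sum0) / (total - w0)  # class-1 mean
        var = w0 * (total - w0) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal toy image: dark ground (~40) and bright cloud (~200)
pixels = [38, 40, 42, 41, 39, 198, 200, 202, 199, 201]
t = otsu_threshold(pixels)
print(t)  # the threshold separates the two modes
```

    Pixels above the returned threshold would be counted toward the cloud statistics.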

  17. A Noise-Assisted Data Analysis Method for Automatic EOG-Based Sleep Stage Classification Using Ensemble Learning.

    PubMed

    Olesen, Alexander Neergaard; Christensen, Julie A E; Sorensen, Helge B D; Jennum, Poul J

    2016-08-01

    Reducing the number of recording modalities for sleep staging research can benefit both researchers and patients, provided that the reduced set yields results as accurate as conventional systems. This paper investigates the possibility of exploiting the multisource nature of electrooculography (EOG) signals by presenting a method for automatic sleep staging using the complete ensemble empirical mode decomposition with adaptive noise algorithm and a random forest classifier. It achieves a high overall accuracy of 82% and a Cohen's kappa of 0.74, indicating substantial agreement between automatic and manual scoring.

  18. Automatic Texture Mapping of Architectural and Archaeological 3d Models

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Stallmann, D.

    2012-07-01

    Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models from digital photographs with software packages such as Maxon Cinema 4D, Autodesk 3ds Max or Maya still requires a complex and time-consuming workflow, so procedures for automatic texture mapping of 3D models are in demand. In this paper two automatic procedures are presented. The first generates textured 3D surface models by web services, while the second textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and is developed in the programming language C++. The studies show that visibility analysis using the ML3DImage algorithm alone is not sufficient to obtain acceptable results for automatic texture mapping. To overcome the visibility problem, the Point Cloud Painter algorithm in combination with the Z-buffer procedure will be applied in the future.

  19. Automatic and Direct Identification of Blink Components from Scalp EEG

    PubMed Central

    Kong, Wanzeng; Zhou, Zhanpeng; Hu, Sanqing; Zhang, Jianhai; Babiloni, Fabio; Dai, Guojun

    2013-01-01

    Eye blink is an important and inevitable artifact during scalp electroencephalogram (EEG) recording. A key problem in EEG signal processing is how to identify eye blink components automatically using independent component analysis (ICA). Taking into account the fact that the eye blink, as an external source, has a higher sum of correlation with frontal EEG channels than all other sources due to both its location and significant amplitude, in this paper we propose a method based on a correlation index and the feature of power distribution to automatically detect eye blink components. Furthermore, we prove mathematically that the correlation between independent components and scalp EEG channels can be derived directly from the mixing matrix of ICA. This helps to simplify calculations and understand the implications of the correlation. The proposed method does not need to select a template or thresholds in advance, and it works without a simultaneously recorded electrooculography (EOG) reference. The experimental results demonstrate that the proposed method can automatically recognize eye blink components with high accuracy on entire datasets from 15 subjects. PMID:23959240
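
    The observation that channel-component correlations follow from the mixing matrix can be sketched as follows: if x = As with unit-variance independent sources, then corr(x_i, s_j) = A[i][j] / ||A_i||, where A_i is row i of A. The mixing matrix and channel layout below are hypothetical, chosen so that component 0 dominates the two "frontal" channels as a blink source would:

```python
import math

def channel_component_corr(A):
    """Correlation between channel i and unit-variance independent component j,
    derived directly from the ICA mixing matrix A (model x = A s):
    corr(x_i, s_j) = A[i][j] / sqrt(sum_k A[i][k]^2)."""
    corr = []
    for row in A:
        norm = math.sqrt(sum(a * a for a in row))
        corr.append([a / norm for a in row])
    return corr

# Hypothetical 3-channel, 3-component mixing matrix; rows 0 and 1 are frontal channels
A = [[5.0, 1.0, 0.5],
     [4.0, 0.8, 1.0],
     [0.5, 3.0, 2.0]]
corr = channel_component_corr(A)
# Sum of correlations with the frontal channels, per component
frontal_score = [abs(corr[0][j]) + abs(corr[1][j]) for j in range(3)]
blink = frontal_score.index(max(frontal_score))
print(blink)  # -> 0
```

    No source signals are needed: the candidate blink component is read off the mixing matrix alone, which is the simplification the paper exploits.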

  20. Comparative analysis of image classification methods for automatic diagnosis of ophthalmic images

    NASA Astrophysics Data System (ADS)

    Wang, Liming; Zhang, Kai; Liu, Xiyang; Long, Erping; Jiang, Jiewei; An, Yingying; Zhang, Jia; Liu, Zhenzhen; Lin, Zhuoling; Li, Xiaoyan; Chen, Jingjing; Cao, Qianzhong; Li, Jing; Wu, Xiaohang; Wang, Dongni; Li, Wangting; Lin, Haotian

    2017-01-01

    There are many image classification methods, but it remains unclear which are most helpful for analyzing and intelligently identifying ophthalmic images. We select representative slit-lamp images, which reflect the complexity of ocular images, as research material to compare image classification algorithms for diagnosing ophthalmic diseases. To facilitate this study, several feature extraction algorithms and classifiers are combined to automatically diagnose pediatric cataract on the same dataset, and their performance is compared using multiple criteria. This comparative study reveals the general characteristics of existing methods for automatic identification of ophthalmic images and provides new insights into their strengths and shortcomings. The best-performing methods (local binary pattern + SVM, wavelet transformation + SVM) achieve an average accuracy of 87% and can be adopted in specific situations to aid doctors in preliminary disease screening. Furthermore, methods requiring fewer computational resources and less time could be applied in remote places or on mobile devices to assist individuals in understanding their condition. In addition, this work should help accelerate the development of innovative approaches and their application to assist doctors in diagnosing ophthalmic disease.

  1. Quality Control of True Height Profiles Obtained Automatically from Digital Ionograms.

    DTIC Science & Technology

    1982-05-01

    Keywords: ionosphere; Digisonde; electron density profile; ionogram; autoscaling; ARTIST. Abstract (fragments): ...analysis technique currently used with the ionogram traces scaled automatically by the ARTIST software [Reinisch and Huang, 1983; Reinisch et al., 1984], and the generalized polynomial analysis technique POLAN [Titheridge, 1985], using the same ARTIST-identified ionogram traces. 2. To determine how...

  2. Automatic Selection of Order Parameters in the Analysis of Large Scale Molecular Dynamics Simulations.

    PubMed

    Sultan, Mohammad M; Kiss, Gert; Shukla, Diwakar; Pande, Vijay S

    2014-12-09

    Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states.

  3. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  4. Robust automatic P-phase picking: an on-line implementation in the analysis of broadband seismogram recordings

    NASA Astrophysics Data System (ADS)

    Sleeman, Reinoud; van Eck, Torild

    1999-06-01

    The onset of a seismic signal is determined through joint AR modeling of the noise and the seismic signal, and the application of the Akaike Information Criterion (AIC) using the onset time as parameter. This so-called AR-AIC phase picker has been tested successfully and implemented on the Z-component of the broadband station HGN to provide automatic P-phase picks for a rapid warning system. The AR-AIC picker is shown to provide accurate and robust automatic picks on a large experimental database. Out of 1109 P-phase onsets with signal-to-noise ratio (SNR) above 1 from local, regional and teleseismic earthquakes, our implementation detects 71% and gives a mean difference with manual picks of 0.1 s. An optimal version of the well-established picker of Baer and Kradolfer [Baer, M., Kradolfer, U., An automatic phase picker for local and teleseismic events, Bull. Seism. Soc. Am. 77 (1987) 1437-1445] detects less than 41% and gives a mean difference with manual picks of 0.3 s using the same dataset.
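
    A minimal sketch of a two-window AIC onset picker in the spirit of the AR-AIC method (omitting the AR modeling step; the synthetic trace and the simple variance-based AIC form are illustrative assumptions, not the paper's implementation):

```python
import math
import random
import statistics

def aic_onset(x):
    """Pick the sample k that minimizes the two-window AIC
    AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])),
    i.e. the best split of the trace into 'noise' and 'signal' segments."""
    n = len(x)
    best_k, best_aic = None, float("inf")
    for k in range(2, n - 2):
        v1 = statistics.pvariance(x[:k])
        v2 = statistics.pvariance(x[k:])
        if v1 <= 0 or v2 <= 0:
            continue
        aic = k * math.log(v1) + (n - k - 1) * math.log(v2)
        if aic < best_aic:
            best_aic, best_k = aic, k
    return best_k

# Synthetic trace: low-amplitude noise, then a stronger arrival at sample 50
random.seed(1)
x = [random.gauss(0, 0.1) for _ in range(50)] + [random.gauss(0, 2.0) for _ in range(50)]
print(aic_onset(x))  # should fall near sample 50
```

    In the full AR-AIC scheme, the variances are replaced by the prediction-error variances of AR models fitted to the noise and signal windows, which sharpens the minimum.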

  5. Automatic Correction Algorithm of Hyfrology Feature Attribute in National Geographic Census

    NASA Astrophysics Data System (ADS)

    Li, C.; Guo, P.; Liu, X.

    2017-09-01

    Some attributes of hydrologic feature data in the national geographic census are unclear; the current solution is manual filling, which is inefficient and error-prone. This paper therefore proposes an automatic correction algorithm for hydrologic feature attributes. Based on an analysis of structural characteristics and topological relations, we put forward three basic principles of correction: network proximity, structural robustness and topological ductility. Based on the WJ-III map workstation, we realize the automatic correction of hydrologic features. Finally, practical data are used to validate the method. The results show that our method is reasonable and efficient.

  6. Automatic patient dose registry and clinical audit on line for mammography.

    PubMed

    Ten, J I; Vano, E; Sánchez, R; Fernandez-Soto, J M

    2015-07-01

    The use of automatic registry systems for patient dose in digital mammography allows clinical audit and patient dose analysis of the whole sample of individual mammography exposures while fulfilling the requirements of the European Directives and other international recommendations. Further parameters associated with radiation exposure (tube voltage, X-ray tube output and HVL values for different kVp and target/filter combinations, breast compression, etc.) should be periodically verified and used to evaluate patient doses. This study presents an experience in routine clinical practice for mammography using automatic systems. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Development of Standard Methods of Testing and Analyzing Fatigue Crack Growth Rate Data

    DTIC Science & Technology

    1978-05-01

    ...nitrogen-cooled cryostat; high-temperature tests were conducted using resistance heating tapes. An automatic controller maintained test temperatures... References (fragments): ..."Cracking," Int. J. Fracture, Vol. 9, 1973, pp. 63-74. 87. P. Paris and F. Erdogan, "A Critical Analysis of Crack Propagation Laws," Trans. ASME, Ser. D: J. ...requirements of Sec. 7.2 and Appendix B. REFERENCES 1. P. C. Paris and F. Erdogan, "A Critical Analysis of Crack Propagation Laws," Trans. ASME, Ser. D...

  8. Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics

    NASA Astrophysics Data System (ADS)

    Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu

    2007-11-01

    In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. This framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically, and abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.

  9. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    PubMed

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.

  10. A method for the automatic reconstruction of fetal cardiac signals from magnetocardiographic recordings

    NASA Astrophysics Data System (ADS)

    Mantini, D.; Alleva, G.; Comani, S.

    2005-10-01

    Fetal magnetocardiography (fMCG) allows monitoring the fetal heart function through algorithms able to retrieve the fetal cardiac signal, but no standardized automatic model has become available so far. In this paper, we describe an automatic method that restores the fetal cardiac trace from fMCG recordings by means of a weighted summation of fetal components separated with independent component analysis (ICA) and identified through dedicated algorithms that analyse the frequency content and temporal structure of each source signal. Multichannel fMCG datasets of 66 healthy and 4 arrhythmic fetuses were used to validate the automatic method with respect to a classical procedure requiring the manual classification of fetal components by an expert investigator. ICA was run with input clusters of different dimensions to simulate various MCG systems. Detection rates, true negative and false positive component categorization, QRS amplitude, standard deviation and signal-to-noise ratio of reconstructed fetal signals, and real and per cent QRS differences between paired fetal traces retrieved automatically and manually were calculated to quantify the performances of the automatic method. Its robustness and reliability, particularly evident with the use of large input clusters, might increase the diagnostic role of fMCG during the prenatal period.

  11. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.

  12. 49 CFR 236.824 - System, automatic block signal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic block signal. 236.824 Section... § 236.824 System, automatic block signal. A block signal system wherein the use of each block is governed by an automatic block signal, cab signal, or both. ...

  13. Real-time automatic registration in optical surgical navigation

    NASA Astrophysics Data System (ADS)

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming

    2016-05-01

    An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
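
    Rigid point-set registration with known marker correspondences, the core step of patient-to-image registration, has a closed-form least-squares solution. The 2-D Procrustes sketch below is an illustration under simplified assumptions (2-D instead of 3-D, exact correspondences, no noise), not the paper's implementation:

```python
import math

def rigid_register_2d(src, dst):
    """Least-squares rotation + translation mapping src onto dst
    (known correspondences; 2-D closed form of the Procrustes/Kabsch solution)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # Cross-covariance terms after centering both point sets
    sxx = sxy = syx = syy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x -= csx; y -= csy; u -= cdx; v -= cdy
        sxx += x * u; sxy += x * v; syx += y * u; syy += y * v
    theta = math.atan2(sxy - syx, sxx + syy)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)            # translation after rotation
    ty = cdy - (s * csx + c * csy)
    return theta, (tx, ty)

# Fiducial markers rotated by 30 degrees and shifted by (5, -2)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
ang = math.radians(30)
dst = [(math.cos(ang) * x - math.sin(ang) * y + 5.0,
        math.sin(ang) * x + math.cos(ang) * y - 2.0) for x, y in src]
theta, t = rigid_register_2d(src, dst)
print(round(math.degrees(theta), 3), round(t[0], 3), round(t[1], 3))  # -> 30.0 5.0 -2.0
```

    In 3-D the rotation is obtained from an SVD of the cross-covariance matrix instead of a single angle, but the centering-then-align structure is the same.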

  14. Negative Life Events and Antenatal Depression among Pregnant Women in Rural China: The Role of Negative Automatic Thoughts.

    PubMed

    Wang, Yang; Wang, Xiaohua; Liu, Fangnan; Jiang, Xiaoning; Xiao, Yun; Dong, Xuehan; Kong, Xianglei; Yang, Xuemei; Tian, Donghua; Qu, Zhiyong

    2016-01-01

    Few studies have examined the relationship between psychological factors and the mental health status of pregnant women in rural China. The current study aims to explore the potential mediating effect of negative automatic thoughts between negative life events and antenatal depression. Data were collected in June 2012 and October 2012, and 495 rural pregnant women were interviewed. Depressive symptoms were measured by the Edinburgh postnatal depression scale, stresses of pregnancy by the pregnancy pressure scale, negative automatic thoughts by the automatic thoughts questionnaire, and negative life events by the life events scale for pregnant women. We used logistic regression and path analysis to test the mediating effect. The prevalence of antenatal depression was 13.7%. In the logistic regression, the only socio-demographic and health behavior factor significantly related to antenatal depression was sleep quality. Negative life events were not associated with depression in the fully adjusted model. Path analysis showed that the eventual direct and general effects of negative automatic thoughts were 0.39 and 0.51, larger than the effects of negative life events. This study suggests a potentially significant mediating effect of negative automatic thoughts: pregnant women with lower scores for negative automatic thoughts suffered less from the negative life events that might lead to antenatal depression.

  15. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Small automatic auxiliary boilers. 63.25-1 Section 63.25... AUXILIARY BOILERS Requirements for Specific Types of Automatic Auxiliary Boilers § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  16. Vessel extraction in retinal images using automatic thresholding and Gabor Wavelet.

    PubMed

    Ali, Aziah; Hussain, Aini; Wan Zaki, Wan Mimi Diyana

    2017-07-01

    Retinal image analysis has been widely used for early detection and diagnosis of multiple systemic diseases. Accurate vessel extraction in retinal images is a crucial step towards a fully automated diagnosis system. This work presents an efficient unsupervised method for extracting blood vessels from retinal images by combining the existing Gabor Wavelet (GW) method with automatic thresholding. The green channel is extracted from the color retinal image and used to produce a Gabor feature image using GW. Both the green channel image and the Gabor feature image undergo a vessel-enhancement step to highlight blood vessels. Next, the two vessel-enhanced images are transformed to binary images using automatic thresholding and then combined to produce the final vessel output. Combining the images yields a significant improvement in blood vessel extraction performance compared with using either image individually. The effectiveness of the proposed method was proven via comparative analysis with existing methods, validated using the publicly available DRIVE database.

  17. Automatic Fastening Large Structures: a New Approach

    NASA Technical Reports Server (NTRS)

    Lumley, D. F.

    1985-01-01

    The external tank (ET) intertank structure for the space shuttle, a 27.5 ft diameter, 22.5 ft long, externally stiffened, mechanically fastened skin-stringer-frame structure, was a labor-intensive manual structure built on a modified Saturn tooling position. A new approach was developed based on half-section subassemblies. The heart of this manufacturing approach will be a 33 ft high vertical automatic riveting system with a 28 ft rotary positioner coming on-line in mid 1985. The Automatic Riveting System incorporates many of the latest automatic riveting technologies. Key features include: vertical columns with two sets of independently operating CNC drill-riveting heads; the capability to drill, insert and upset any one-piece fastener up to 3/8 inch diameter, including slugs, without displacing the workpiece; an offset bucking ram with programmable rotation and deep retraction; a vision system for automatic parts-program re-synchronization and part edge margin control; and an automatic rivet selection/handling system.

  18. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system comprised of scripts written in Perl, C shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  19. AUTOMATIC COUNTER

    DOEpatents

    Robinson, H.P.

    1960-06-01

    An automatic counter of alpha particle tracks recorded by a sensitive emulsion of a photographic plate is described. The counter includes a source of modulated dark-field illumination for developing light flashes from the recorded particle tracks as the photographic plate is automatically scanned in narrow strips. Photoelectric means convert the light flashes to proportional current pulses for application to an electronic counting circuit. Photoelectric means are further provided for developing a phase reference signal from the photographic plate in such a manner that signals arising from particle tracks not parallel to the edge of the plate are out of phase with the reference signal. The counting circuit includes provision for rejecting the out-of-phase signals resulting from unoriented tracks as well as signals resulting from spurious marks on the plate such as scratches, dust or grain clumpings, etc. The output of the circuit is hence indicative only of the tracks that would be counted by a human operator.

  20. Use of seatbelts in cars with automatic belts.

    PubMed Central

    Williams, A F; Wells, J K; Lund, A K; Teed, N J

    1992-01-01

    Use of seatbelts in late model cars with automatic or manual belt systems was observed in suburban Washington, DC, Chicago, Los Angeles, and Philadelphia. In cars with automatic two-point belt systems, the use of shoulder belts by drivers was substantially higher than in the same model cars with manual three-point belts. This finding was true in varying degrees whatever the type of automatic belt, including cars with detachable nonmotorized belts, cars with detachable motorized belts, and especially cars with nondetachable motorized belts. Most of these automatic shoulder belt systems include manual lap belts. Use of lap belts was lower in cars with automatic two-point belt systems than in the same model cars with manual three-point belts; precisely how much lower could not be reliably estimated in this survey. Use of shoulder and lap belts was slightly higher in General Motors cars with detachable automatic three-point belts compared with the same model cars with manual three-point belts; in Hondas there was no difference in the rates of use of manual three-point belts and the rates of use of automatic three-point belts. PMID:1561301

  1. Automatic phase aberration compensation for digital holographic microscopy based on deep learning background detection.

    PubMed

    Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George

    2017-06-26

    We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback for real-time implementation and for dynamic processes such as cell migration phenomena. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards higher-order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations, and ideally, automatic image segmentation techniques make real-time measurement possible. However, existing automatic unsupervised segmentation techniques perform poorly when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique using a convolutional neural network (CNN) with Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection, which allows ZPF to compute the self-conjugated phase to compensate for most aberrations.
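
    The background-fitting idea behind ZPF can be sketched with a low-order least-squares fit over the detected background pixels. The example below fits only an offset-plus-tilt background (the lowest Zernike-like terms) rather than a full Zernike basis, over hypothetical background pixels; it is an illustration, not the paper's implementation:

```python
def solve(A, b):
    """Solve the linear system A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_tilt(points):
    """Least-squares fit phi ~ a + b*x + c*y over background pixels (x, y, phi),
    via the normal equations for the basis [1, x, y]."""
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y, phi in points:
        basis = (1.0, x, y)
        for i in range(3):
            rhs[i] += basis[i] * phi
            for j in range(3):
                A[i][j] += basis[i] * basis[j]
    return solve(A, rhs)

# Hypothetical background pixels of a phase map with tilt phi = 0.5 + 2x - 3y
pts = [(x, y, 0.5 + 2 * x - 3 * y) for x in range(5) for y in range(5)]
a, b, c = fit_tilt(pts)
print(round(a, 6), round(b, 6), round(c, 6))  # -> 0.5 2.0 -3.0
```

    Subtracting the fitted surface from the phase map removes the modeled aberration; the full method extends the basis to higher-order Zernike polynomials and uses the CNN-detected background mask to choose the fitting pixels.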

  2. WOLF; automatic typing program

    USGS Publications Warehouse

    Evenden, G.I.

    1982-01-01

    A FORTRAN IV program for the Hewlett-Packard 1000 series computer provides automatic typing operations and can, when employed with the manufacturer's text editor, provide a system that greatly facilitates preparation of reports, letters and other text. The input text and embedded control data can perform nearly all of the functions of a typist. A few of the features available are centering, titles, footnotes, indentation, page numbering (including Roman numerals), automatic paragraphing, and two forms of tab operations. This documentation contains both a user and a technical description of the program.

  3. 49 CFR 236.825 - System, automatic train control.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will automatically...

  4. 49 CFR 236.825 - System, automatic train control.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will automatically...

  5. 7 CFR 58.418 - Automatic cheese making equipment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Automatic cheese making equipment. 58.418 Section 58.418 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING... Service 1 Equipment and Utensils § 58.418 Automatic cheese making equipment. (a) Automatic Curd Maker. The...

  6. 7 CFR 58.418 - Automatic cheese making equipment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Automatic cheese making equipment. 58.418 Section 58.418 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING... Service 1 Equipment and Utensils § 58.418 Automatic cheese making equipment. (a) Automatic Curd Maker. The...

  7. 21 CFR 892.1900 - Automatic radiographic film processor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automatic radiographic film processor. 892.1900... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1900 Automatic radiographic film processor. (a) Identification. An automatic radiographic film processor is a device intended to be used to...

  8. 21 CFR 892.1900 - Automatic radiographic film processor.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Automatic radiographic film processor. 892.1900... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1900 Automatic radiographic film processor. (a) Identification. An automatic radiographic film processor is a device intended to be used to...

  9. 21 CFR 892.1900 - Automatic radiographic film processor.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Automatic radiographic film processor. 892.1900... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1900 Automatic radiographic film processor. (a) Identification. An automatic radiographic film processor is a device intended to be used to...

  10. 21 CFR 892.1900 - Automatic radiographic film processor.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Automatic radiographic film processor. 892.1900... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1900 Automatic radiographic film processor. (a) Identification. An automatic radiographic film processor is a device intended to be used to...

  11. 21 CFR 892.1900 - Automatic radiographic film processor.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automatic radiographic film processor. 892.1900... (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1900 Automatic radiographic film processor. (a) Identification. An automatic radiographic film processor is a device intended to be used to...

  12. Automatic Query Formulations in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1983-01-01

    Introduces methods designed to reduce role of search intermediaries by generating Boolean search formulations automatically using term frequency considerations from natural language statements provided by system patrons. Experimental results are supplied and methods are described for applying automatic query formulation process in practice.…
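
    The idea, generating a Boolean query from a natural-language statement using term-frequency evidence, can be caricatured in a few lines. The toy corpus, stopword list, and threshold are our illustrative assumptions, and the grouping rule (rare terms ANDed, common terms ORed) is a drastic simplification of Salton's method:

```python
# Toy document collection and stopword list (illustrative only).
corpus = [
    "retrieval of documents by boolean query",
    "automatic indexing of text documents",
    "query formulation for retrieval systems",
    "natural language processing of text",
]
stopwords = {"of", "by", "for", "the", "a", "in"}

def doc_freq(term):
    # Number of corpus documents containing the exact term.
    return sum(term in doc.split() for doc in corpus)

def formulate(statement, low=2):
    terms = [t for t in statement.lower().split() if t not in stopwords]
    # Rare terms are discriminating: connect with AND;
    # common terms are grouped with OR to broaden recall.
    rare = [t for t in terms if doc_freq(t) < low]
    common = [t for t in terms if doc_freq(t) >= low]
    parts = rare[:]
    if common:
        parts.append("(" + " OR ".join(common) + ")")
    return " AND ".join(parts)

query = formulate("automatic query formulation for document retrieval")
```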

  13. Automatic textual annotation of video news based on semantic visual object extraction

    NASA Astrophysics Data System (ADS)

    Boujemaa, Nozha; Fleuret, Francois; Gouet, Valerie; Sahbi, Hichem

    2003-12-01

    In this paper, we present our work on the automatic generation of textual metadata based on visual content analysis of video news. We present two methods for semantic object detection and recognition from a cross-modal image-text thesaurus. These thesauri represent a supervised association between models and semantic labels. This paper is concerned with two semantic objects: faces and TV logos. In the first part, we present our work on efficient face detection and recognition with automatic name generation. This method also allows us to suggest textual annotation of shots through close-up estimation. In the second part, we automatically detect and recognize the different TV logos present on incoming news from different TV channels. This work was done jointly with the French TV channel TF1 within the "MediaWorks" project, which consists of a hybrid text-image indexing and retrieval platform for video news.

  14. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with generating finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
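
    The core transformation, from a point cloud to a solid of voxel elements, can be sketched as a simple binning step. The synthetic panel, element size, and random sampling below are illustrative; the actual procedure also handles inner/outer surfaces and sectioning:

```python
import numpy as np

# Synthetic "wall": points sampled on a 1 m x 1 m panel, 0.1 m thick.
rng = np.random.default_rng(0)
points = rng.uniform([0, 0, 0], [1.0, 1.0, 0.1], size=(5000, 3))

voxel = 0.05  # element edge length in metres (illustrative)

# Bin every point into its grid cell; each occupied cell becomes one
# solid (voxel) finite element.
idx = np.floor(points / voxel).astype(int)
occupied = set(map(tuple, idx))

n_elements = len(occupied)   # element count of the resulting FE mesh
```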

  15. 49 CFR 236.826 - System, automatic train stop.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false System, automatic train stop. 236.826 Section 236..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.826 System, automatic train stop. A system so arranged that its operation will automatically...

  16. The Use of Automatic Indexing for Authority Control.

    ERIC Educational Resources Information Center

    Dillon, Martin; And Others

    1981-01-01

    Uses an experimental system for authority control on a collection of bibliographic records to demonstrate the resemblance between thesaurus-based automatic indexing and automatic authority control. Details of the automatic indexing system are given, results discussed, and the benefits of the resemblance examined. Included are a rules appendix and…

  17. 49 CFR 236.826 - System, automatic train stop.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train stop. 236.826 Section 236..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.826 System, automatic train stop. A system so arranged that its operation will automatically...

  18. 30 CFR 77.1401 - Automatic controls and brakes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic controls and brakes. 77.1401 Section... MINES Personnel Hoisting § 77.1401 Automatic controls and brakes. Hoists and elevators shall be equipped with overspeed, overwind, and automatic stop controls and with brakes capable of stopping the elevator...

  19. 30 CFR 56.19006 - Automatic hoist braking devices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 56.19006 Section 56.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 56.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  20. 30 CFR 57.19006 - Automatic hoist braking devices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 57.19006 Section 57.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 57.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  1. Automatic Network Fingerprinting through Single-Node Motifs

    PubMed Central

    Echtermeyer, Christoph; da Fontoura Costa, Luciano; Rodrigues, Francisco A.; Kaiser, Marcus

    2011-01-01

    Complex networks have been characterised by their specific connectivity patterns (network motifs), but their building blocks can also be identified and described by node-motifs, a combination of local network features. One technique to identify single node-motifs has been presented by Costa et al. (L. D. F. Costa, F. A. Rodrigues, C. C. Hilgetag, and M. Kaiser, Europhys. Lett., 87, 1, 2009). Here, we first suggest improvements to the method, including how its parameters can be determined automatically. Such automatic routines make high-throughput studies of many networks feasible. Second, the new routines are validated on different network series. Third, we provide an example of how the method can be used to analyse network time series. In conclusion, we provide a robust method for systematically discovering and classifying characteristic nodes of a network. In contrast to classical motif analysis, our approach can identify individual components (here: nodes) that are specific to a network. Such special nodes, like hubs before them, might be found to play critical roles in real-world networks. PMID:21297963
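
    A minimal sketch of single-node fingerprinting on a toy graph: each node gets a small vector of local features, and nodes far from the network mean are flagged. The feature set and the z-score threshold are our illustrative choices, not the exact measurements of Costa et al.:

```python
# A star-like toy graph as an adjacency dict: node 0 is the hub.
adj = {
    0: {1, 2, 3, 4, 5},
    1: {0, 2}, 2: {0, 1}, 3: {0, 4}, 4: {0, 3}, 5: {0},
}

def clustering(n):
    # Fraction of realised links among the neighbours of n.
    nbrs = adj[n]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
    return 2 * links / (k * (k - 1))

# Per-node feature vector: (degree, clustering coefficient).
features = {n: (len(adj[n]), clustering(n)) for n in adj}

# Flag nodes whose degree deviates strongly from the network mean.
degs = [d for d, _ in features.values()]
mean = sum(degs) / len(degs)
std = (sum((d - mean) ** 2 for d in degs) / len(degs)) ** 0.5
outliers = [n for n, (d, _) in features.items() if abs(d - mean) > 1.5 * std]
```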

  2. Automatic liver volume segmentation and fibrosis classification

    NASA Astrophysics Data System (ADS)

    Bal, Evgeny; Klang, Eyal; Amitai, Michal; Greenspan, Hayit

    2018-02-01

    In this work, we present an automatic method for liver segmentation and fibrosis classification in liver computed-tomography (CT) portal phase scans. The input is a full abdomen CT scan with an unknown number of slices, and the output is a liver volume segmentation mask and a fibrosis grade. A multi-stage analysis scheme is applied to each scan, including volume segmentation, texture feature extraction and SVM-based classification. The data contain portal phase CT examinations from 80 patients, taken with different scanners, and each examination has a matching Fibroscan grade. The dataset was subdivided into two groups: the first contains healthy cases and mild fibrosis; the second contains moderate fibrosis, severe fibrosis and cirrhosis. Using our automated algorithm, we achieved an average Dice index of 0.93 ± 0.05 for segmentation and a sensitivity of 0.92 and specificity of 0.81 for classification. To the best of our knowledge, this is the first end-to-end automatic framework for liver fibrosis classification; an approach that, once validated, could have great value in the clinic.
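
    The Dice index reported above is defined as twice the overlap of the two masks divided by the sum of their sizes; a minimal sketch with toy masks:

```python
import numpy as np

def dice(a, b):
    # Dice index: 2 * |A ∩ B| / (|A| + |B|).
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Toy 8x8 masks: 16-pixel prediction vs a reference shifted down by 1 row.
pred = np.zeros((8, 8)); pred[2:6, 2:6] = 1
truth = np.zeros((8, 8)); truth[3:7, 2:6] = 1

score = dice(pred, truth)   # overlap is 12 pixels, so 24 / 32
```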

  3. Improved Automatically Locking/Unlocking Orthotic Knee Joint

    NASA Technical Reports Server (NTRS)

    Weddendorf, Bruce

    1995-01-01

    Proposed orthotic knee joint improved version of one described in "Automatically Locking/Unlocking Orthotic Knee Joint" (MFS-28633). Locks automatically upon initial application of radial force (wearer's weight) and unlocks automatically, but only when all loads (radial force and bending) relieved. Joints lock whenever wearer applies weight to knee at any joint angle between full extension and 45 degree bend. Both devices offer increased safety and convenience relative to conventional orthotic knee joints.

  4. A Quantitative and Qualitative Evaluation of an Automatic Occlusion Device for Tracheoesophageal Speech: The Provox FreeHands HME

    ERIC Educational Resources Information Center

    Hamade, Rachel; Hewlett, Nigel; Scanlon, Emer

    2006-01-01

    This study aimed to evaluate a new automatic tracheostoma valve: the Provox FreeHands HME (manufactured by Atos Medical AB, Sweden). Data from four laryngectomee participants using automatic and also manual occlusion were subjected to acoustic and perceptual analysis. The main results were a significant decrease, from the manual to automatic…

  5. [Advances in automatic detection technology for images of thin blood film of malaria parasite].

    PubMed

    Juan-Sheng, Zhang; Di-Qiang, Zhang; Wei, Wang; Xiao-Guang, Wei; Zeng-Guo, Wang

    2017-05-05

    This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria in microscope images of thin blood film smears. After introducing the background and significance of automatic detection technology, the existing detection techniques are summarized and divided into several steps, including image acquisition, pre-processing, morphological analysis, segmentation, counting, and pattern classification. The principles and implementation methods of each step are then given in detail. In addition, the extension of automatic detection technology to thick blood film smears is put forward as a question worthy of study, and a perspective on the future work needed to realize automated microscopy diagnosis of malaria is provided.
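
    The segmentation and counting stages in the pipeline above can be caricatured with a threshold followed by connected-component counting. The tiny synthetic image and threshold are illustrative; real systems add pre-processing, morphology, and a trained classifier on top:

```python
# Synthetic grey image with three bright blobs (candidate parasites).
img = [
    [0, 9, 9, 0, 0, 0],
    [0, 9, 9, 0, 8, 8],
    [0, 0, 0, 0, 8, 8],
    [7, 0, 0, 0, 0, 0],
]
# Segmentation step: global intensity threshold.
binary = [[1 if v > 5 else 0 for v in row] for row in img]

def count_components(grid):
    # Counting step: flood fill with 4-connectivity.
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if grid[i][j] and not seen[i][j]:
                count += 1
                stack = [(i, j)]
                while stack:
                    a, b = stack.pop()
                    if 0 <= a < h and 0 <= b < w and grid[a][b] and not seen[a][b]:
                        seen[a][b] = True
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return count

n_candidates = count_components(binary)
```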

  6. How automatic is the hand's automatic pilot? Evidence from dual-task studies.

    PubMed

    McIntosh, Robert D; Mulroue, Amy; Brockmole, James R

    2010-10-01

    The ability to correct reaching movements for changes in target position has been described as the hand's 'automatic pilot'. These corrections are preconscious and occur by default in double-step reaching tasks, even if the goal is to react to the target jump in some other way, for instance by stopping the movement (STOP instruction). Nonetheless, corrections are strongly modulated by conscious intention: participants make more corrections when asked to follow the target (GO instruction) and can suppress them when explicitly asked not to follow the target (NOGO instruction). We studied the influence of a cognitively demanding (auditory 1-back) task upon correction behaviour under GO, STOP and NOGO instructions. Correction rates under the STOP instruction were unaffected by cognitive load, consistent with the assumption that they reflect the default behaviour of the automatic pilot. Correction rates under the GO instruction were also unaffected, suggesting that minimal cognitive resources are required to enhance online correction. By contrast, cognitive load impeded the ability to suppress online corrections under the NOGO instruction. These data reveal a constitutional bias in the automatic pilot system: intentional suppression of the default correction behaviour is cognitively demanding, but enhancement towards greater responsiveness is seemingly effortless.

  7. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P. W.

    2016-01-01

    Background Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. Methods The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an “external” dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. Results The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. The automatic mitosis detection method can perform mitosis counting in an unbiased

  8. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method.

    PubMed

    Veta, Mitko; van Diest, Paul J; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P W

    2016-01-01

    Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an "external" dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial

  9. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis.

    PubMed

    Liu, Chanjuan; van Netten, Jaap J; van Baal, Jeff G; Bus, Sicco A; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8% ± 1.1% sensitivity and 98.4% ± 0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained. © 2015 Society of Photo-Optical Instrumentation Engineers (SPIE)
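
    Once the two feet are registered, the asymmetric analysis reduces to mirroring one foot and thresholding the pixelwise temperature difference; a minimal sketch (the arrays are synthetic, and the 2.2 °C cut-off is a value commonly used in the diabetic-foot literature, not necessarily the authors'):

```python
import numpy as np

# Synthetic registered temperature maps (°C) of the two feet.
left = np.full((4, 3), 30.0)
right = np.full((4, 3), 30.0)
right[1, 2] = 33.5                  # a hot spot on the right foot

right_mirrored = np.fliplr(right)   # mirror so anatomy corresponds
diff = np.abs(left - right_mirrored)

# Flag locations exceeding the asymmetry threshold (2.2 °C, illustrative).
hotspots = np.argwhere(diff > 2.2)
```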

  10. [Research on automatic external defibrillator based on DSP].

    PubMed

    Jing, Jun; Ding, Jingyan; Zhang, Wei; Hong, Wenxue

    2012-10-01

    Electrical defibrillation is the most effective way to treat ventricular tachycardia (VT) and ventricular fibrillation (VF). An automatic external defibrillator based on DSP is introduced in this paper. The whole design consists of the signal collection module, the microprocessor control module, the display module, the defibrillation module and the automatic recognition algorithm for VF and non-VF, etc. This automatic external defibrillator has achieved goals such as real-time ECG signal acquisition, synchronous ECG waveform display, data delivery to a USB disk and automatic defibrillation when a shockable rhythm appears.

  11. Precision about the automatic emotional brain.

    PubMed

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  12. Algorithm for automatic forced spirometry quality assessment: technological developments.

    PubMed

    Melia, Umberto; Burgos, Felip; Vallverdú, Montserrat; Velickovski, Filip; Lluch-Ariet, Magí; Roca, Josep; Caminal, Pere

    2014-01-01

    We hypothesized that the implementation of automatic real-time assessment of the quality of forced spirometry (FS) may significantly enhance the potential for extensive deployment of an FS program in the community. Recent studies have demonstrated that the automatic quality assessment available in commercial equipment, based on the quality criteria defined by the ATS/ERS (American Thoracic Society/European Respiratory Society), can be markedly improved. To this end, an algorithm for automatically assessing FS quality was previously reported. The current research describes the mathematical development of the algorithm. An innovative analysis of the shape of the spirometric curve was performed, adding 23 new metrics to the 4 traditionally recommended by the ATS/ERS. The algorithm was created through a two-step iterative process including: (1) an initial version using the standard FS curves recommended by the ATS; and (2) a refined version using curves from patients. In each of these steps the results were assessed against one expert's opinion. Finally, an independent set of FS curves from 291 patients was used for validation purposes. The novel mathematical approach to characterizing the FS curves led to appropriate FS classification with high specificity (95%) and sensitivity (96%). The results constitute the basis for a successful transfer of FS testing to non-specialized professionals in the community.

  13. Mobile Genome Express (MGE): A comprehensive automatic genetic analyses pipeline with a mobile device.

    PubMed

    Yoon, Jun-Hee; Kim, Thomas W; Mendez, Pedro; Jablons, David M; Kim, Il-Jin

    2017-01-01

    The development of next-generation sequencing (NGS) technology makes it possible to sequence whole exomes or genomes. However, data analysis is still the biggest bottleneck for its wide implementation. Most laboratories still depend on manual procedures for data handling and analysis, which translates into delays and decreased efficiency in the delivery of NGS results to doctors and patients. Thus, there is high demand for an automatic, easy-to-use NGS data analysis system. We developed a comprehensive, automatic genetic analysis controller named Mobile Genome Express (MGE) that works on smartphones and other mobile devices. MGE can handle all the steps of a genetic analysis, such as sample information submission, sequencing run quality checks from the sequencer, secured data transfer and results review. We sequenced an Actrometrix control DNA containing multiple proven human mutations using a targeted sequencing panel, and the whole analysis was managed by MGE and its data review program, ELECTRO. All steps were processed automatically except for the final review procedure with ELECTRO to confirm mutations. The data analysis process was completed within several hours, and the mutations we identified were consistent with our previous results obtained using multi-step, manual pipelines.

  14. Automatic comparison of striation marks and automatic classification of shoe prints

    NASA Astrophysics Data System (ADS)

    Geradts, Zeno J.; Keijzer, Jan; Keereweer, Isaac

    1995-09-01

    A database for toolmarks (named TRAX) and a database for footwear outsole designs (named REBEZO) have been developed on a PC. The databases are filled with video images and administrative data about the toolmarks and the footwear designs. An algorithm for the automatic comparison of the digitized striation patterns has been developed for TRAX. The algorithm appears to work well for deep and complete striation marks and will be implemented in TRAX. For REBEZO, some effort has been directed toward the automatic classification of outsole patterns. The algorithm first segments the shoe profile. Fourier features are computed for the separate elements and classified with a neural network. Future developments will include information on invariant moments of the shape and the rotation angle in the neural network.
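
    Fourier features of a closed contour can be computed by treating the boundary points as complex numbers and taking FFT magnitudes; dropping the DC term and normalising by the first harmonic yields translation- and scale-invariant descriptors. A sketch on a toy shape (a circle, for which all higher harmonics vanish); the exact feature set used in REBEZO is not specified here:

```python
import numpy as np

# Toy closed contour: a unit circle sampled at 64 boundary points.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
contour = np.cos(t) + 1j * np.sin(t)

# Fourier descriptors: FFT of the complex boundary sequence.
spectrum = np.fft.fft(contour)
mags = np.abs(spectrum)

# Drop the DC term (translation invariance) and normalise by the first
# harmonic (scale invariance); keep a handful of low-order harmonics.
descriptors = mags[2:10] / mags[1]
```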

  15. Automatic alignment of double optical paths in excimer laser amplifier

    NASA Astrophysics Data System (ADS)

    Wang, Dahui; Zhao, Xueqing; Hua, Hengqi; Zhang, Yongsheng; Hu, Yun; Yi, Aiping; Zhao, Jun

    2013-05-01

    An automatic beam-alignment method for double-path amplification in an electron-pumped excimer laser system is demonstrated. In this way, the beams from the amplifiers can be steered along the designated direction and accordingly irradiate the target with high stability and accuracy. However, owing to the absence of natural alignment references in excimer laser amplifiers, a two-cross-hair structure is used to align the beams: one cross-hair, placed in the input beam, is regarded as the near-field reference, while the other, placed in the output beam, is regarded as the far-field reference. The two cross-hairs are relay-imaged onto charge-coupled devices (CCDs) separately. The errors between the intersection points of the two cross-hair images and the centroid coordinates of the actual beam are recorded automatically and sent to a closed-loop feedback control mechanism. Negative feedback keeps running until a preset accuracy is reached. On the basis of the above design, the alignment optical path was built and the software compiled, after which the double-path automatic alignment experiment in the electron-pumped excimer laser amplifier was carried out, and the related influencing factors and the alignment precision were analyzed. Experimental results indicate that the alignment system can automatically align the beams to the aiming direction in a short time. The analysis shows that the accuracy of the alignment system is 0.63 μrad and the maximum beam restoration error is 13.75 μm. Furthermore, the greater the distance between the two cross-hairs, the higher the precision of the system. The automatic alignment system has been used in an angular-multiplexing excimer master oscillator power amplifier (MOPA) system and satisfies the requirement for beam alignment precision on the whole.
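
    The closed-loop idea can be sketched as measuring the offset between the beam centroid and the cross-hair reference on the CCD and applying proportional corrections until a preset accuracy is reached. The reference position, gain, and tolerance below are illustrative, not the system's actual parameters:

```python
import numpy as np

reference = np.array([32.0, 32.0])   # cross-hair intersection (pixels)
centroid = np.array([40.0, 27.0])    # measured beam centroid (pixels)

gain, tol, steps = 0.5, 0.1, 0       # proportional gain and preset accuracy

# Negative feedback: each iteration moves the beam a fraction of the
# remaining error, as a mirror actuator would.
while np.linalg.norm(centroid - reference) > tol and steps < 100:
    centroid -= gain * (centroid - reference)
    steps += 1

converged = np.linalg.norm(centroid - reference) <= tol
```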

  16. Climate change effects on Glacier recession in Himalayas using Multitemporal SAR data and Automatic Weather Station observations

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Singh, S. K.; Venkataraman, G.

    2009-04-01

    The Himalaya is the highest but youngest mountain belt (20 to 60 million years B.P.) on Earth, running in an arc for about 2500 km. It has more than 90 peaks above 6000 m and contains about 50% of all glaciers outside the polar environments (Bahadur, 1993). All glaciers in this region have been in general recession for the last 150 years (Paul et al., 1979). Gangotri, Siachen, Bara Shigri and Patsio are major glaciers in this region; they are retreating at different rates, and their respective tributary glaciers are completely disconnected from the main glacier bodies. Spaceborne synthetic aperture radar (SAR) data provide an important tool for monitoring the fluctuation of the glaciers. In this paper, an attempt has been made to quantify glacier retreat using multitemporal SAR data. SAR intensity and phase information are exploited separately, under SAR intensity tracking and interferometric SAR (InSAR) coherence tracking (Strozzi et al., 2002), respectively. Glacier retreat studies have been carried out using time series of coregistered multitemporal SAR images. Simultaneously, InSAR coherence thresholding is applied for tracking the snout of the Gangotri glacier; it is observed that the glacier is retreating at a rate of 21 m/a. The availability of high-resolution spotlight-mode TerraSAR-X SAR data will supplement the ENVISAT ASAR and ERS-1/2 based observations. The observatory in the proximity of the Gangotri glacier has been made functional at Bhojbasa, and all weather parameters, viz. snowfall, temperature, pressure, air vector, column water vapor and humidity, are recorded twice a day as per WMO standards, both manually and automatically. Three Automatic Weather Stations (AWS) have been established in the glacier area at Bhojbasa, Kalindipass and Nandaban. Since the Himalayan environment is presently under great stress of decay and degeneration, the AWS data will be analyzed in the context of climate change effects on the fluctuation of glaciers.

  17. Automatic estimation of extent of resection and residual tumor volume of patients with glioblastoma.

    PubMed

    Meier, Raphael; Porz, Nicole; Knecht, Urspeter; Loosli, Tina; Schucht, Philippe; Beck, Jürgen; Slotboom, Johannes; Wiest, Roland; Reyes, Mauricio

    2017-10-01

    OBJECTIVE In the treatment of glioblastoma, residual tumor burden is the only prognostic factor that can be actively influenced by therapy. Therefore, an accurate, reproducible, and objective measurement of residual tumor burden is necessary. This study aimed to evaluate the use of a fully automatic segmentation method-brain tumor image analysis (BraTumIA)-for estimating the extent of resection (EOR) and residual tumor volume (RTV) of contrast-enhancing tumor after surgery. METHODS The imaging data of 19 patients who underwent primary resection of histologically confirmed supratentorial glioblastoma were retrospectively reviewed. Contrast-enhancing tumors apparent on structural preoperative and immediate postoperative MR imaging in this patient cohort were segmented by 4 different raters and the automatic segmentation BraTumIA software. The manual and automatic results were quantitatively compared. RESULTS First, the interrater variabilities in the estimates of EOR and RTV were assessed for all human raters. Interrater agreement in terms of the coefficient of concordance (W) was higher for RTV (W = 0.812; p < 0.001) than for EOR (W = 0.775; p < 0.001). Second, the volumetric estimates of BraTumIA for all 19 patients were compared with the estimates of the human raters, which showed that for both EOR (W = 0.713; p < 0.001) and RTV (W = 0.693; p < 0.001) the estimates of BraTumIA were generally located close to or between the estimates of the human raters. No statistically significant differences were detected between the manual and automatic estimates. BraTumIA showed a tendency to overestimate contrast-enhancing tumors, leading to moderate agreement with expert raters with respect to the literature-based, survival-relevant threshold values for EOR. CONCLUSIONS BraTumIA can generate volumetric estimates of EOR and RTV, in a fully automatic fashion, which are comparable to the estimates of human experts. 
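
    The two summary statistics used in this record can be sketched in a few lines. Below is a minimal illustration (NumPy/SciPy assumed; the volumes are invented, not the study's data) of the extent-of-resection formula and Kendall's coefficient of concordance W:

```python
import numpy as np
from scipy.stats import rankdata

def extent_of_resection(pre_vol, post_vol):
    """EOR (%) from pre- and postoperative contrast-enhancing tumor volumes."""
    return 100.0 * (pre_vol - post_vol) / pre_vol

def kendalls_w(ratings):
    """Kendall's coefficient of concordance W for an (m raters x n subjects) array."""
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    ranks = np.vstack([rankdata(r) for r in ratings])  # rank each rater's estimates
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()    # spread of rank sums
    return 12.0 * s / (m ** 2 * (n ** 3 - n))
```

    Raters who rank all subjects identically give W = 1; fully discordant rankings push W toward 0.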

  18. Fully automatic left ventricular myocardial strain estimation in 2D short-axis tagged magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Morais, Pedro; Queirós, Sandro; Heyde, Brecht; Engvall, Jan; D'hooge, Jan; Vilaça, João L.

    2017-09-01

    Cardiovascular diseases are among the leading causes of death and frequently result in local myocardial dysfunction. Among the numerous imaging modalities available to detect these dysfunctional regions, cardiac deformation imaging through tagged magnetic resonance imaging (t-MRI) has been an attractive approach. Nevertheless, fully automatic analysis of these data sets is still challenging. In this work, we present a fully automatic framework to estimate left ventricular myocardial deformation from t-MRI. This strategy performs automatic myocardial segmentation based on B-spline explicit active surfaces, which are initialized using an annular model. A non-rigid image-registration technique is then used to assess myocardial deformation. Three experiments were set up to validate the proposed framework using a clinical database of 75 patients. First, automatic segmentation accuracy was evaluated by comparing against manual delineations at one specific cardiac phase. The proposed solution showed an average perpendicular distance error of 2.35  ±  1.21 mm and 2.27  ±  1.02 mm for the endo- and epicardium, respectively. Second, starting from either manual or automatic segmentation, myocardial tracking was performed and the resulting strain curves were compared. It is shown that the automatic segmentation adds negligible differences during the strain-estimation stage, corroborating its accuracy. Finally, segmental strain was compared with scar tissue extent determined by delay-enhanced MRI. The results proved that both strain components were able to distinguish between normal and infarct regions. Overall, the proposed framework was shown to be accurate, robust, and attractive for clinical practice, as it overcomes several limitations of a manual analysis.
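
    The strain-estimation step can be illustrated with a toy computation that is much simpler than the paper's B-spline/registration pipeline: given tracked contour points at a reference and a deformed phase, segmental Lagrangian strain is (L - L0)/L0 per segment. A hedged sketch:

```python
import numpy as np

def segment_lengths(points):
    """Lengths of the segments of a closed contour given as an (n, 2) point array."""
    d = np.diff(np.vstack([points, points[:1]]), axis=0)  # wrap around to close contour
    return np.hypot(d[:, 0], d[:, 1])

def lagrangian_strain(ref_points, def_points):
    """Segmental Lagrangian strain (L - L0) / L0 between reference and deformed contours."""
    l0 = segment_lengths(np.asarray(ref_points, dtype=float))
    l = segment_lengths(np.asarray(def_points, dtype=float))
    return (l - l0) / l0
```

    A contour uniformly dilated by 10% yields a strain of 0.1 in every segment.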

  19. Automatic sample Dewar for MX beam-line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charignon, T.; Tanchon, J.; Trollier, T.

    2014-01-29

    It is very common for crystals of large biological macromolecules to show considerable variation in the quality of their diffraction. In order to increase the number of samples that are tested for diffraction quality before any full data collection at the ESRF*, an automatic sample Dewar has been implemented. The design and performance of the Dewar are reported in this paper. The automatic sample Dewar has a 240-sample capacity with automatic loading/unloading ports. The storage Dewar is capable of working with robots and can be integrated into a fully automatic MX** beam-line. The samples are positioned in front of the loading/unloading ports with an automatic rotating plate. A view port has been implemented for data-matrix camera reading of each sample loaded in the Dewar. Finally, the Dewar is insulated with polyurethane foam that keeps liquid nitrogen consumption below 1.6 L/h; this static insulation also makes vacuum equipment and maintenance unnecessary. This Dewar will be useful for increasing the number of samples tested at synchrotrons.

  20. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
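
    The leave-one-out validation reported above can be sketched generically. The snippet below is an illustrative ordinary-least-squares version on synthetic data, not the study's actual regression model or variables:

```python
import numpy as np

def loo_threshold_accuracy(X, y, threshold=23.0):
    """Leave-one-out accuracy of an OLS model at classifying y >= threshold
    (e.g. BMI >= 23 for "would-be obese")."""
    Xd = np.column_stack([np.ones(len(X)), X])   # design matrix with intercept
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i            # hold out sample i
        beta, *_ = np.linalg.lstsq(Xd[mask], y[mask], rcond=None)
        pred = Xd[i] @ beta                      # predict the held-out sample
        hits += (pred >= threshold) == (y[i] >= threshold)
    return hits / len(y)
```

    On perfectly linear synthetic data the held-out predictions are exact, so the classification accuracy is 1.0; real dietary data would behave like the study's 68.8%.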

  1. ANNUAL REPORT-AUTOMATIC INDEXING AND ABSTRACTING.

    ERIC Educational Resources Information Center

    Lockheed Missiles and Space Co., Palo Alto, CA. Electronic Sciences Lab.

    THE INVESTIGATION IS CONCERNED WITH THE DEVELOPMENT OF AUTOMATIC INDEXING, ABSTRACTING, AND EXTRACTING SYSTEMS. BASIC INVESTIGATIONS IN ENGLISH MORPHOLOGY, PHONETICS, AND SYNTAX ARE PURSUED AS NECESSARY MEANS TO THIS END. IN THE FIRST SECTION THE THEORY AND DESIGN OF THE "SENTENCE DICTIONARY" EXPERIMENT IN AUTOMATIC EXTRACTION IS OUTLINED. SOME OF…

  2. 14 CFR 29.1329 - Automatic pilot system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... is automatic synchronization, each system must have a means to readily indicate to the pilot the... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  3. 14 CFR 29.1329 - Automatic pilot system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... is automatic synchronization, each system must have a means to readily indicate to the pilot the... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  4. 14 CFR 29.1329 - Automatic pilot system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... is automatic synchronization, each system must have a means to readily indicate to the pilot the... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  5. 14 CFR 29.1329 - Automatic pilot system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... is automatic synchronization, each system must have a means to readily indicate to the pilot the... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  6. 14 CFR 29.1329 - Automatic pilot system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... is automatic synchronization, each system must have a means to readily indicate to the pilot the... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  7. Automatic detection of sleep macrostructure based on a sensorized T-shirt.

    PubMed

    Bianchi, Anna M; Mendez, Martin O

    2010-01-01

    In the present work we apply a fully automatic procedure to the analysis of signals coming from a sensorized T-shirt, worn during the night, for sleep evaluation. The quality and reliability of the signals recorded through the T-shirt were previously tested, while the employed algorithms for feature extraction and sleep classification were previously developed on standard ECG recordings, and the obtained classification was compared to the standard clinical practice based on polysomnography (PSG). In the present work we combined T-shirt recordings and automatic classification and obtained reliable sleep profiles, i.e., the classification of sleep into WAKE, REM (rapid eye movement) and NREM stages, based on heart rate variability (HRV), respiration and movement signals.
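
    The HRV features behind such a classifier can be illustrated with standard time-domain measures (a generic sketch; the exact feature set of the original algorithms is not reproduced here):

```python
import numpy as np

def hrv_features(rr_intervals_ms):
    """Basic time-domain HRV features from RR intervals in milliseconds."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                  # average beat-to-beat interval
        "sdnn": rr.std(ddof=1),                # overall variability
        "rmssd": np.sqrt(np.mean(diff ** 2)),  # short-term variability
    }
```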

  8. 14 CFR 27.1329 - Automatic pilot system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... automatic synchronization, each system must have a means to readily indicate to the pilot the alignment of... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  9. 14 CFR 27.1329 - Automatic pilot system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... automatic synchronization, each system must have a means to readily indicate to the pilot the alignment of... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  10. 14 CFR 27.1329 - Automatic pilot system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... automatic synchronization, each system must have a means to readily indicate to the pilot the alignment of... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  11. 14 CFR 27.1329 - Automatic pilot system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... automatic synchronization, each system must have a means to readily indicate to the pilot the alignment of... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  12. 14 CFR 27.1329 - Automatic pilot system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... automatic synchronization, each system must have a means to readily indicate to the pilot the alignment of... that corrective action begins within a reasonable period of time. (e) If the automatic pilot integrates...

  13. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    PubMed

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
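
    As a drastically simplified stand-in for such automated matching, plain correlation against a reference library (not the paper's PCA/ICA machinery, and with hypothetical pigment names) might look like:

```python
import numpy as np

def identify_spectrum(unknown, library):
    """Return the library entry best matching the unknown spectrum, with a
    Pearson-correlation score usable as a crude reliability factor."""
    scores = {name: np.corrcoef(unknown, ref)[0, 1] for name, ref in library.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]
```

    Correlation is invariant to scale and offset, so a rescaled copy of a reference spectrum still matches it perfectly; the paper's ICA step additionally handles mixtures, which this sketch does not.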

  14. Generalized Self-Organizing Maps for Automatic Determination of the Number of Clusters and Their Multiprototypes in Cluster Analysis.

    PubMed

    Gorzalczany, Marian B; Rudzinski, Filip

    2017-06-07

    This paper presents a generalization of self-organizing maps with 1-D neighborhoods (neuron chains) that can be effectively applied to complex cluster analysis problems. The essence of the generalization consists in introducing mechanisms that allow the neuron chain--during learning--to disconnect into subchains, to reconnect some of the subchains again, and to dynamically regulate the overall number of neurons in the system. These features enable the network--working in a fully unsupervised way (i.e., using unlabeled data without a predefined number of clusters)--to automatically generate collections of multiprototypes that are able to represent a broad range of clusters in data sets. First, the operation of the proposed approach is illustrated on some synthetic data sets. Then, this technique is tested using several real-life, complex, and multidimensional benchmark data sets available from the University of California at Irvine (UCI) Machine Learning repository and the Knowledge Extraction based on Evolutionary Learning data set repository. A sensitivity analysis of our approach to changes in control parameters and a comparative analysis with an alternative approach are also performed.
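
    For contrast with the generalized version described above, the fixed-topology baseline (a 1-D SOM neuron chain) can be sketched as follows; hyperparameters are illustrative:

```python
import numpy as np

def train_som_chain(data, n_neurons=10, epochs=50, lr=0.5, radius=2.0, seed=0):
    """Minimal fixed-topology 1-D SOM (neuron chain): the baseline the paper generalizes."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    w = rng.uniform(data.min(axis=0), data.max(axis=0), (n_neurons, data.shape[1]))
    idx = np.arange(n_neurons)
    for epoch in range(epochs):
        alpha = lr * (1.0 - epoch / epochs)        # linearly decaying learning rate
        for x in rng.permutation(data):            # visit samples in random order
            bmu = int(np.argmin(((w - x) ** 2).sum(axis=1)))       # best-matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2.0 * radius ** 2))  # chain neighborhood
            w += alpha * h[:, None] * (x - w)
    return w
```

    The paper's generalization lets this chain split, rejoin, and grow or shrink during learning, which the fixed chain above cannot do.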

  15. Automatic Video Analysis for Obstructive Sleep Apnea Diagnosis.

    PubMed

    Abad, Jorge; Muñoz-Ferrer, Aida; Cervantes, Miguel Ángel; Esquinas, Cristina; Marin, Alicia; Martínez, Carlos; Morera, Josep; Ruiz, Juan

    2016-08-01

    We investigated the diagnostic accuracy of a noninvasive technology based on image processing (SleepWise) for the identification of obstructive sleep apnea (OSA) and its severity. This is an observational, prospective study to evaluate the degree of agreement between polysomnography (PSG) and SleepWise. We recruited 56 consecutive subjects with suspected OSA who were referred as outpatients to the Sleep Unit of the Hospital Universitari Germans Trias i Pujol (HUGTiP) from January 2013 to January 2014. All patients underwent laboratory PSG and image processing with SleepWise simultaneously the same night. Both PSG and SleepWise analyses were carried out independently and blindly. We analyzed 50 of the 56 patients recruited. OSA was diagnosed through PSG in a total of 44 patients (88%) with a median apnea-hypopnea index (AHI) of 25.35 (24.9). According to SleepWise, 45 patients (90%) met the criteria for a diagnosis of OSA, with a median AHI of 22.8 (22.03). An analysis of the ability of PSG and SleepWise to classify patients by severity on the basis of their AHI shows that the two diagnostic systems distribute the different groups similarly. According to PSG, 23 patients (46%) had a diagnosis of severe OSA, 11 patients (22%) moderate OSA, and 10 patients (20%) mild OSA. According to SleepWise, 20, 13, and 12 patients (40%, 26%, and 24%, respectively) had a diagnosis of severe, moderate, and mild OSA, respectively. For OSA diagnosis, SleepWise was found to have sensitivity of 100% and specificity of 83% in relation to PSG. The positive predictive value was 97% and the negative predictive value was 100%. The Bland-Altman plot comparing the mean AHI values obtained through PSG and SleepWise shows very good agreement between the two diagnostic techniques, with a bias of -3.85, a standard error of 12.18, and a confidence interval of -0.39 to -7.31. SleepWise was reasonably accurate for noninvasive and automatic diagnosis of OSA in outpatients. SleepWise determined the
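
    The accuracy and agreement statistics quoted above are standard and easy to reproduce on one's own data; a minimal sketch (illustrative arrays, not the study's data):

```python
import numpy as np

def diagnostic_metrics(test_pos, truth_pos):
    """Sensitivity, specificity, PPV, NPV from boolean arrays (index test vs. reference)."""
    test_pos = np.asarray(test_pos, dtype=bool)
    truth_pos = np.asarray(truth_pos, dtype=bool)
    tp = np.sum(test_pos & truth_pos)
    tn = np.sum(~test_pos & ~truth_pos)
    fp = np.sum(test_pos & ~truth_pos)
    fn = np.sum(~test_pos & truth_pos)
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurements."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```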

  16. Channel Measurements for Automatic Vehicle Monitoring Systems

    DOT National Transportation Integrated Search

    1974-03-01

    Co-channel and adjacent channel electromagnetic interference measurements were conducted on the Sierra Research Corp. and the Chicago Transit Authority automatic vehicle monitoring systems. These measurements were made to determine if the automatic v...

  17. 10 CFR 431.132 - Definitions concerning automatic commercial ice makers.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Definitions concerning automatic commercial ice makers... CERTAIN COMMERCIAL AND INDUSTRIAL EQUIPMENT Automatic Commercial Ice Makers § 431.132 Definitions concerning automatic commercial ice makers. Automatic commercial ice maker means a factory-made assembly (not...

  18. 10 CFR 431.132 - Definitions concerning automatic commercial ice makers.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Definitions concerning automatic commercial ice makers... CERTAIN COMMERCIAL AND INDUSTRIAL EQUIPMENT Automatic Commercial Ice Makers § 431.132 Definitions concerning automatic commercial ice makers. Automatic commercial ice maker means a factory-made assembly (not...

  19. 10 CFR 431.132 - Definitions concerning automatic commercial ice makers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Definitions concerning automatic commercial ice makers... CERTAIN COMMERCIAL AND INDUSTRIAL EQUIPMENT Automatic Commercial Ice Makers § 431.132 Definitions concerning automatic commercial ice makers. Automatic commercial ice maker means a factory-made assembly (not...

  20. 10 CFR 431.132 - Definitions concerning automatic commercial ice makers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Definitions concerning automatic commercial ice makers... CERTAIN COMMERCIAL AND INDUSTRIAL EQUIPMENT Automatic Commercial Ice Makers § 431.132 Definitions concerning automatic commercial ice makers. Automatic commercial ice maker means a factory-made assembly (not...

  1. 10 CFR 431.132 - Definitions concerning automatic commercial ice makers.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Definitions concerning automatic commercial ice makers... CERTAIN COMMERCIAL AND INDUSTRIAL EQUIPMENT Automatic Commercial Ice Makers § 431.132 Definitions concerning automatic commercial ice makers. Automatic commercial ice maker means a factory-made assembly (not...

  2. Automatic stereotyping against people with schizophrenia, schizoaffective and affective disorders

    PubMed Central

    Rüsch, Nicolas; Corrigan, Patrick W.; Todd, Andrew R.; Bodenhausen, Galen V.

    2010-01-01

    Similar to members of the public, people with mental illness may exhibit general negative automatic prejudice against their own group. However, it is unclear whether more specific negative stereotypes are automatically activated among diagnosed individuals and how such automatic stereotyping may be related to self-reported attitudes and emotional reactions. We therefore studied automatically activated reactions toward mental illness among 85 people with schizophrenia, schizoaffective or affective disorders as well as among 50 members of the general public, using a Lexical Decision Task to measure automatic stereotyping. Deliberately endorsed attitudes and emotional reactions were assessed by self-report. Independent of diagnosis, people with mental illness showed less negative automatic stereotyping than did members of the public. Among members of the public, stronger automatic stereotyping was associated with more self-reported shame about a potential mental illness and more anger toward stigmatized individuals. Reduced automatic stereotyping in the diagnosed group suggests that people with mental illness might not entirely internalize societal stigma. Among members of the public, automatic stereotyping predicted negative emotional reactions to people with mental illness. Initiatives to reduce the impact of public stigma and internalized stigma should take automatic stereotyping and related emotional aspects of stigma into account. PMID:20843560

  3. A semi-automatic microextraction in packed sorbent, using a digitally controlled syringe, combined with ultra-high pressure liquid chromatography as a new and ultra-fast approach for the determination of prenylflavonoids in beers.

    PubMed

    Gonçalves, João L; Alves, Vera L; Rodrigues, Fátima P; Figueira, José A; Câmara, José S

    2013-08-23

    In this work a highly selective and sensitive analytical procedure based on the semi-automatic microextraction by packed sorbents (MEPS) technique, using a new digitally controlled syringe (eVol®) combined with ultra-high pressure liquid chromatography (UHPLC), is proposed to determine the prenylated chalcone derived from the hop (Humulus lupulus L.), xanthohumol (XN), and its isomeric flavanone, isoxanthohumol (IXN), in beers. Extraction and UHPLC parameters were carefully optimized to achieve the highest recoveries and to enhance the analytical characteristics of the method. Important parameters affecting MEPS performance, namely the type of sorbent material (C2, C8, C18, SIL, and M1), elution solvent system, number of extraction cycles (extract-discard), sample volume, elution volume, and sample pH, were evaluated. The optimal experimental conditions involve loading 500 μL of sample through a C18 sorbent in a MEPS syringe placed in the semi-automatic eVol® syringe, followed by elution with 250 μL of acetonitrile (ACN) over 10 extraction cycles (about 5 min for the entire sample preparation step). The obtained extract is directly analyzed in the UHPLC system using a binary mobile phase composed of aqueous 0.1% formic acid (eluent A) and ACN (eluent B) in gradient elution mode (10 min total analysis). Under optimized conditions good results were obtained in terms of linearity within the established concentration range, with correlation coefficient (R) values higher than 0.986 and a residual deviation for each calibration point below 12%. The limit of detection (LOD) and limit of quantification (LOQ) obtained were 0.4 ng mL⁻¹ and 1.0 ng mL⁻¹ for IXN, and 0.9 ng mL⁻¹ and 3.0 ng mL⁻¹ for XN, respectively. Precision was lower than 4.6% for IXN and 8.4% for XN. Typical recoveries ranged between 67.1% and 99.3% for IXN and between 74.2% and 99.9% for XN, with relative standard deviations (%RSD) no larger than 8%. The applicability of the proposed analytical
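
    LOD/LOQ figures like those above are conventionally derived from the calibration curve. One common (ICH-style) recipe, shown here as a generic sketch rather than the authors' exact procedure, is 3.3*s/slope and 10*s/slope, with s the residual standard deviation of the linear fit:

```python
import numpy as np

def lod_loq(conc, response):
    """LOD and LOQ from a linear calibration (3.3*s/slope and 10*s/slope),
    where s is the residual standard deviation of the fit."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    resid = response - (slope * conc + intercept)
    s = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))  # n-2 degrees of freedom
    return 3.3 * s / slope, 10.0 * s / slope
```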

  4. Retrieval, automaticity, vocabulary elaboration, orthography (RAVE-O): a comprehensive, fluency-based reading intervention program.

    PubMed

    Wolf, M; Miller, L; Donnelly, K

    2000-01-01

    The most important implication of the double-deficit hypothesis (Wolf & Bowers, in this issue) concerns a new emphasis on fluency and automaticity in intervention for children with developmental reading disabilities. The RAVE-O (Retrieval, Automaticity, Vocabulary Elaboration, Orthography) program is an experimental, fluency-based approach to reading intervention that is designed to accompany a phonological analysis program. In an effort to address multiple possible sources of dysfluency in readers with disabilities, the program involves comprehensive emphases both on fluency in word attack, word identification, and comprehension and on automaticity in underlying componential processes (e.g., phonological, orthographic, semantic, and lexical retrieval skills). The goals, theoretical principles, and applied activities of the RAVE-O curriculum are described with particular stress on facilitating the development of rapid orthographic pattern recognition and on changing children's attitudes toward language.

  5. Automatic Contour Tracking in Ultrasound Images

    ERIC Educational Resources Information Center

    Li, Min; Kambhamettu, Chandra; Stone, Maureen

    2005-01-01

    In this paper, a new automatic contour tracking system, EdgeTrak, for the ultrasound image sequences of human tongue is presented. The images are produced by a head and transducer support system (HATS). The noise and unrelated high-contrast edges in ultrasound images make it very difficult to automatically detect the correct tongue surfaces. In…

  6. Comparison of liver volumetry on contrast-enhanced CT images: one semiautomatic and two automatic approaches.

    PubMed

    Cai, Wei; He, Baochun; Fan, Yingfang; Fang, Chihua; Jia, Fucang

    2016-11-08

    This study aimed to evaluate the accuracy, consistency, and efficiency of three liver volumetry methods on clinical contrast-enhanced CT images: one interactive method, an in-house-developed 3D Medical Image Analysis (3DMIA) system; one automatic active shape model (ASM)-based segmentation; and one automatic probabilistic atlas (PA)-guided segmentation method. Forty-two datasets, including 27 normal liver and 15 space-occupying liver lesion patients, were retrospectively included in this study. The three methods, the semiautomatic 3DMIA, the automatic ASM-based, and the automatic PA-based liver volumetry, achieved an accuracy with VD (volume difference) of -1.69%, -2.75%, and 3.06% in the normal group, respectively, and with VD of -3.20%, -3.35%, and 4.14% in the space-occupying lesion group, respectively. The three methods took 27.63 min, 1.26 min, and 1.18 min on average, respectively, compared with manual volumetry, which took 43.98 min. The high intraclass correlation coefficient between the three methods and the manual method indicated an excellent agreement on liver volumetry. Significant differences in segmentation time were observed between the three methods (3DMIA, ASM, and PA) and the manual volumetry (p < 0.001), as well as between the automatic volumetries (ASM and PA) and the semiautomatic volumetry (3DMIA) (p < 0.001). The semiautomatic interactive 3DMIA, automatic ASM-based, and automatic PA-based liver volumetry agreed well with the manual gold standard in both the normal liver group and the space-occupying lesion group. The ASM- and PA-based automatic segmentations have better efficiency in clinical use. © 2016 The Authors.
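
    The VD (volume difference) metric used above, and the conversion from a binary segmentation mask to a volume, are simple enough to sketch directly (voxel dimensions are illustrative):

```python
import numpy as np

def mask_volume_ml(mask, voxel_dims_mm):
    """Volume in mL of a binary segmentation mask, given voxel dimensions in mm."""
    voxel_ml = np.prod(voxel_dims_mm) / 1000.0  # mm^3 per voxel -> mL
    return float(mask.sum() * voxel_ml)

def volume_difference(v_auto, v_manual):
    """Signed volume difference VD (%) of an automatic estimate vs. the manual reference."""
    return 100.0 * (v_auto - v_manual) / v_manual
```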

  7. Automatic Generation of English-Japanese Translation Pattern Utilizing Genetic Programming Technique

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Tamekuni, Yuji; Kimura, Shuhei

    There are many structural differences between English and Japanese phrase templates, and that often makes the act of translation difficult. Moreover, there exists a vast variety of phrase templates and sentences to be referred to, and it is not easy to prepare a corpus that covers them all. Therefore, it is very significant to generate translation patterns automatically, from the viewpoint of the translation success rate and the capacity of the pattern dictionary. For the purpose of realizing the automatic generation of translation patterns, this paper proposes a new method based on the genetic programming technique (GP). The technique tries to automatically generate translation patterns for various sentences which are not registered in the phrase template dictionary, by applying genetic operations to the parse trees of basic patterns. Each tree consists of a pair of English-Japanese sentences generated as the first-stage population. Parse-tree databases with 50, 100, 150, and 200 pairs were prepared as the first-stage population, and the system was applied to an English input of 1,555 sentences. As a result, the number of parse trees increased from 200 to 517, and the accuracy rate of the translation patterns improved from 42.57% to 70.10%. Moreover, 86.71% of the generated translations were successful, with meanings that are acceptable and understandable. The proposed technique thus appears to be a clue to raising the translation success rate, and points to the possibility of reducing the size of the parse-tree database.
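
    The core genetic operation on parse trees, subtree crossover, can be sketched with nested lists; this shows only the mechanics of the operator, not the paper's actual grammar, dictionary, or fitness machinery:

```python
import copy

def get_node(tree, path):
    """Follow a path of child indices down a nested-list parse tree."""
    for i in path:
        tree = tree[i]
    return tree

def set_node(tree, path, new):
    """Replace the subtree at `path`; returns the (possibly new) root."""
    if not path:
        return new
    get_node(tree, path[:-1])[path[-1]] = new
    return tree

def crossover(a, b, path_a, path_b):
    """Swap subtrees of two parse trees at the given paths (GP crossover)."""
    a, b = copy.deepcopy(a), copy.deepcopy(b)
    sub_a = copy.deepcopy(get_node(a, path_a))
    sub_b = copy.deepcopy(get_node(b, path_b))
    return set_node(a, path_a, sub_b), set_node(b, path_b, sub_a)
```

    Swapping, say, the NP subtrees of two sentence trees yields two new candidate patterns while leaving the parents untouched.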

  8. Automatically processed alpha-track radon monitor

    DOEpatents

    Langner, Jr., G. Harold

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.

  9. Automatically processed alpha-track radon monitor

    DOEpatents

    Langner, G.H. Jr.

    1993-01-12

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.

  10. 2D Automatic body-fitted structured mesh generation using advancing extraction method

    USDA-ARS?s Scientific Manuscript database

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like...

  11. 2D automatic body-fitted structured mesh generation using advancing extraction method

    USDA-ARS?s Scientific Manuscript database

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like...

  12. Automatic Lamp and Fan Control Based on Microcontroller

    NASA Astrophysics Data System (ADS)

    Widyaningrum, V. T.; Pramudita, Y. D.

    2018-01-01

    In general, automation can be described as a process that follows pre-determined sequential steps with little or no human exertion. Automation is achieved with various sensors suitable for observing the production process, actuators, and different techniques and devices. In this research, the automation system developed consists of an automatic lamp and an automatic fan for a smart home. Both systems are processed using an Arduino Mega 2560 microcontroller, which obtains values of physical conditions through the sensors connected to it. The automatic lamp system requires an LDR (Light Dependent Resistor) sensor to detect light, while the automatic fan system requires a DHT11 sensor to detect temperature. In the tests that have been done, the lamp and fan work properly. The lamp turns on automatically when the light begins to darken, and turns off automatically when the light begins to brighten again. In addition, it can also be concluded that the readings of an LDR sensor placed outside the room differ from those of an LDR sensor placed in the room, because the light intensity received by the indoor LDR sensor is blocked by the walls of the house or by other objects. As for the fan, it turns on automatically when the temperature is greater than 25°C, its speed can be adjusted, and it turns off automatically when the temperature is less than or equal to 25°C.
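
    The threshold logic described above is easy to model in plain code. The sketch below mirrors the behavior (lamp on below a darkness threshold, fan on above the 25°C setpoint); the darkness threshold value and the linear speed ramp are illustrative assumptions, not from the paper:

```python
def lamp_state(ldr_reading, darkness_threshold=300):
    """Lamp turns on when the LDR reading falls below the darkness threshold."""
    return ldr_reading < darkness_threshold

def fan_state(temperature_c, setpoint_c=25.0):
    """Fan turns on above the setpoint; duty cycle (0..1) ramps with excess heat."""
    if temperature_c <= setpoint_c:
        return False, 0.0
    duty = min(1.0, (temperature_c - setpoint_c) / 10.0)  # full speed 10 degrees above
    return True, duty
```

    On the actual Arduino this logic would run in `loop()` with `analogRead` for the LDR and a DHT11 library call for temperature.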

  13. Practical automatic Arabic license plate recognition system

    NASA Astrophysics Data System (ADS)

    Mohammad, Khader; Agaian, Sos; Saleh, Hani

    2011-02-01

    Since the 1970s, the need for an automatic license plate recognition (ALPR) system has been increasing. A license plate recognition system is an automatic system that is able to recognize a license plate number extracted from image sensors. In particular, ALPR systems are being used in conjunction with various transportation systems in application areas such as law enforcement (e.g., speed limit enforcement) and commercial usages such as parking enforcement, automatic toll payment, private and public entrances, border control, and theft and vandalism control. Vehicle license plate recognition has been intensively studied in many countries. Due to the different types of license plates being used, the requirements of an ALPR system differ for each country. Generally, an automatic license plate localization and recognition system is made up of three modules: license plate localization, character segmentation, and optical character recognition. This paper presents an Arabic license plate recognition system that is insensitive to character size, font, shape, and orientation, with an extremely high accuracy rate. The proposed system is based on a combination of enhancement, license plate localization, morphological processing, and feature vector extraction using the Haar transform. The system is fast because the classification of letters and numerals exploits the organization of the license plate. Experimental results for license plates of two different Arab countries show an average of 99% successful license plate localization and recognition on a total of more than 20 different images captured from a complex outdoor environment. The run time is shorter than that of conventional and many state-of-the-art methods.

  14. Reconstruction of Mammary Gland Structure Using Three-Dimensional Computer-Based Microscopy

    DTIC Science & Technology

    2004-08-01

    for image analysis in cytology" Ortiz de Solorzano C., R. Malladi, Lockett S. In: Geometric methods in bio-medical image processing. Ravikanth Malladi ...Deschamps T., Idica A.K., Malladi R., Ortiz de Solorzano C. Journal of Biomedical Optics 9(3):445-453, 2004. Manuscripts (in preparation): "Three...Deschamps T., Idica A.K., Malladi R., Ortiz de Solorzano C., Proceedings of Photonics West 2003, Vol. 4964, 2003. "Automatic and segmentation

  15. [Study on Intelligent Automatic Tracking Radiation Protection Curtain].

    PubMed

    Zhao, Longyang; Han, Jindong; Ou, Minjian; Chen, Jinlong

    2015-09-01

    In order to overcome the shortcomings of the passive protection mode used in traditional X-ray examinations, this paper combines automatic control technology to propose a kind of actively protecting X-ray equipment. The device automatically detects which part of the patient is receiving X-ray irradiation and intelligently adjusts the height of a radiation protection curtain that tracks between the patient and the imaging device. The device has the advantages of automatically adjusting the height of the anti-radiation shield, reducing the X-ray exposure of non-irradiated areas, and improving work efficiency. Testing by a professional organization showed that the device can decrease the X-ray dose to patients' non-irradiated areas by more than 90%.

  16. Development of a cerebral circulation model for the automatic control of brain physiology.

    PubMed

    Utsuki, T

    2015-01-01

    In various clinical guidelines for brain injury, intracranial pressure (ICP), cerebral blood flow (CBF), and brain temperature (BT) are essential targets for the precise management of brain resuscitation. In addition, integrated automatic control of BT, ICP, and CBF is required to improve therapeutic effects and to reduce medical costs and staff burden. Thus, a new model of cerebral circulation was developed in this study for integrative automatic control. With this model, the CBF and cerebral perfusion pressure of a normal adult male were calculated regionally according to cerebrovascular structure, blood viscosity, blood distribution, CBF autoregulation, and ICP. The analysis results were consistent with physiological knowledge obtained in previous studies. Therefore, the developed model is potentially applicable to the integrative control of the physiological state of the brain, either as a reference model for an automatic control system or as a controlled object in various control simulations.

  17. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    NASA Astrophysics Data System (ADS)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline that performs cell centroid detection and provides three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Evaluation of the presented method reveals promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum using the output vectorizations of detected cell centroids in the mouse primary motor cortex, and find significant variations in cellular density across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
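    Steps ii) and iii) of the pipeline above (binarization followed by centroid extraction) can be sketched with plain thresholding and 4-connected component labeling. This is a minimal stand-in, not the authors' implementation, and `detect_centroids` is a hypothetical name:

```python
from collections import deque

def detect_centroids(image, threshold):
    """Detect bright blobs in a 2D grayscale image (list of lists) and
    return one (row, col) centroid per 4-connected component."""
    h, w = len(image), len(image[0])
    binary = [[image[r][c] >= threshold for c in range(w)] for r in range(h)]
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for r in range(h):
        for c in range(w):
            if binary[r][c] and not seen[r][c]:
                # Flood-fill one component, accumulating its pixel coordinates.
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids
```

    On whole-brain data at 1-μm resolution the real pipeline additionally needs preprocessing (denoising, background correction) and contour segmentation to split touching cells, which a simple flood fill cannot do.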

  18. Automatic Brake Slack Adjusters on Transit Buses

    DOT National Transportation Integrated Search

    1985-04-01

    The purpose of this report is to provide transit bus agencies with current information on automatic brake slack adjusters. Rather than a comprehensive statistical survey, this report is intended to provide an overview of the states of automatic slack...

  19. Automatic p wave analysis over 24 hours in patients with paroxysmal or persistent atrial fibrillation.

    PubMed

    Bitzen, Alexander; Sternickel, Karsten; Lewalter, Thorsten; Schwab, Jörg Otto; Yang, Alexander; Schrickel, Jan Wilko; Linhart, Markus; Wolpert, Christian; Jung, Werner; David, Peter; Lüderitz, Berndt; Nickenig, Georg; Lickfett, Lars

    2007-10-01

    Patients with atrial fibrillation (AF) often exhibit abnormalities of P wave morphology during sinus rhythm. We examined a novel method for automatic P wave analysis in the 24-hour Holter ECGs of 60 patients with paroxysmal or persistent AF and 12 healthy subjects. Recorded ECG signals were transferred to the analysis program, where 5-10 P and R waves were manually marked. A wavelet transform performed a time-frequency decomposition used to train neural networks. Afterwards, the detected P waves were described using a Gauss function optimized to fit the individual morphology, providing the amplitude and the duration at half P wave height. More than 96% of P waves were detected, and 47.4 +/- 20.7% were successfully analyzed afterwards. In the patient population, the mean amplitude was 0.073 +/- 0.028 mV (mean variance 0.020 +/- 0.008 mV^2) and the mean duration at half height 23.5 +/- 2.7 ms (mean variance 4.2 +/- 1.6 ms^2). In the control group, the mean amplitude (0.105 +/- 0.020 mV) was significantly higher (P < 0.0005), and the mean variance of the duration at half height (2.9 +/- 0.6 ms^2) significantly lower (P < 0.0085). This method shows promise for the identification of triggering factors of AF.
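    The two reported descriptors, amplitude and duration at half P wave height, can be illustrated on a synthetic waveform. The sketch below skips the wavelet/neural-network detection and the Gauss-function fitting of the paper and simply measures the peak and the width of the region above half maximum; all names are hypothetical:

```python
import math

def amplitude_and_half_height_duration(signal, dt_ms):
    """Estimate amplitude and duration at half maximum from a uniformly
    sampled waveform (dt_ms = sampling interval in milliseconds), with
    linear interpolation at the half-maximum crossings."""
    peak = max(signal)
    k = signal.index(peak)
    half = peak / 2.0
    # walk left from the peak to the half-maximum crossing
    i = k
    while i > 0 and signal[i - 1] >= half:
        i -= 1
    left = i - (signal[i] - half) / (signal[i] - signal[i - 1]) if i > 0 else i
    # walk right from the peak to the half-maximum crossing
    j = k
    while j < len(signal) - 1 and signal[j + 1] >= half:
        j += 1
    right = j + (signal[j] - half) / (signal[j] - signal[j + 1]) if j < len(signal) - 1 else j
    return peak, (right - left) * dt_ms
```

    For an ideal Gaussian the half-height width equals 2*sqrt(2 ln 2)*sigma ≈ 2.355*sigma, so a sigma of 10 ms gives roughly 23.5 ms, the same scale as the durations reported above.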

  20. Automatic cytometric device using multiple wavelength excitations

    NASA Astrophysics Data System (ADS)

    Rongeat, Nelly; Ledroit, Sylvain; Chauvet, Laurence; Cremien, Didier; Urankar, Alexandra; Couderc, Vincent; Nérin, Philippe

    2011-05-01

    Precise identification of eosinophils, basophils, and specific subpopulations of blood cells (B lymphocytes) in an unconventional automatic hematology analyzer is demonstrated. Our apparatus mixes two excitation radiations by means of an acousto-optic tunable filter to properly control the fluorescence emission of phycoerythrin cyanin 5 (PC5) conjugated to antibodies (anti-CD20 or anti-CRTH2) and of Thiazole Orange. In this way, our analyzer, which combines hematology analysis techniques with flow cytometry based on multiple fluorescence detection, drastically improves the signal-to-noise ratio and decreases the impact of spectral overlaps arising from multiple fluorescence emissions.

  1. Automatic safety rod for reactors

    DOEpatents

    Germer, John H.

    1988-01-01

    An automatic safety rod for a nuclear reactor containing neutron-absorbing material and designed to be inserted into a reactor core after a loss of core flow. Actuation occurs when the core pressure drop either decreases suddenly or falls below a predetermined minimum value. The automatic safety rod includes a pressure-regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  2. Automatic image enhancement based on multi-scale image decomposition

    NASA Astrophysics Data System (ADS)

    Feng, Lu; Wu, Zhuangzhi; Pei, Luo; Long, Xiong

    2014-01-01

    In image processing and computational photography, automatic image enhancement is one of the long-standing objectives. Recent automatic image enhancement methods take into account not only global semantics, such as correcting color hue and brightness imbalances, but also the local content of the image, such as human faces or the sky in a landscape. In this paper we describe a new scheme for automatic image enhancement that considers both the global semantics and the local content of the image. Our method employs a multi-scale edge-aware image decomposition approach to detect underexposed regions and enhance the detail of the salient content. The experimental results demonstrate the effectiveness of our approach compared to existing automatic enhancement methods.
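    The base/detail separation at the heart of multi-scale decomposition methods can be illustrated in one dimension: blur the signal to obtain a base layer, treat the residual as detail, and amplify the detail. A real implementation would apply an edge-aware filter to a 2D image at several scales; this 1D box-blur sketch only shows the principle, and the names are hypothetical:

```python
def box_blur(signal, radius):
    """Simple 1D box blur used here as a stand-in for an edge-aware filter."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def enhance(signal, radius=2, detail_gain=2.0):
    """Split a signal into base + detail layers and boost the detail layer,
    the core operation behind decomposition-based detail enhancement."""
    base = box_blur(signal, radius)
    return [b + detail_gain * (s - b) for s, b in zip(signal, base)]
```

    A flat region passes through unchanged (its detail layer is zero), while contrast across edges grows; a box blur also amplifies halos around edges, which is exactly why the paper uses an edge-aware decomposition instead.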

  3. Ability and efficiency of an automatic analysis software to measure microvascular parameters.

    PubMed

    Carsetti, Andrea; Aya, Hollmann D; Pierantozzi, Silvia; Bazurro, Simone; Donati, Abele; Rhodes, Andrew; Cecconi, Maurizio

    2017-08-01

    Analysis of the microcirculation is currently performed offline, is time consuming, and is operator dependent. The aim of this study was to assess the ability and efficiency of the automatic analysis software CytoCamTools 1.7.12 (CC) to measure microvascular parameters in comparison with the Automated Vascular Analysis (AVA) software 3.2. Twenty-two patients admitted to the cardiothoracic intensive care unit following cardiac surgery were prospectively enrolled. Sublingual microcirculatory videos were analysed using the AVA and CC software. The total vessel density (TVD) for small vessels, perfused vessel density (PVD), and proportion of perfused vessels (PPV) were calculated. Blood flow was assessed using the microvascular flow index (MFI) for the AVA software and the averaged perfused speed indicator (APSI) for the CC software. The duration of the analysis was also recorded. Eighty-four videos from 22 patients were analysed. The bias between TVD-CC and TVD-AVA was 2.20 mm/mm^2 (95% CI 1.37-3.03) with limits of agreement (LOA) of -4.39 (95% CI -5.66 to -3.16) and 8.79 (95% CI 7.50-10.01) mm/mm^2. The percentage error (PE) for TVD was ±32.2%. TVD was positively correlated between CC and AVA (r = 0.74, p < 0.001). The bias between PVD-CC and PVD-AVA was 6.54 mm/mm^2 (95% CI 5.60-7.48) with LOA of -4.25 (95% CI -8.48 to -0.02) and 17.34 (95% CI 13.11-21.57) mm/mm^2. The PE for PVD was ±61.2%. PVD was positively correlated between CC and AVA (r = 0.66, p < 0.001). The median PPV-AVA was significantly higher than the median PPV-CC [97.39% (95.25, 100%) vs. 81.65% (61.97, 88.99), p < 0.0001]. MFI categories cannot estimate or predict APSI values (p = 0.45). The time required for the analysis was shorter with the CC than with the AVA system [2'42″ (2'12″, 3'31″) vs. 16'12″ (13'38″, 17'57″), p < 0.001]. TVD is comparable between the two software packages, and the analysis is faster with the CC software. The values for PVD and PPV are not interchangeable given the
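    The bias, limits of agreement, and percentage error quoted above follow standard Bland-Altman method-comparison analysis. A minimal sketch, assuming the conventional 1.96 SD limits and the Critchley percentage-error definition (`bland_altman` is a hypothetical name):

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman agreement statistics between two measurement methods.

    Returns (bias, lower LOA, upper LOA, percentage error): the limits of
    agreement are bias +/- 1.96 SD of the paired differences, and the
    percentage error is 1.96 SD divided by the mean of all measurements.
    """
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
    pe = 100.0 * 1.96 * sd / mean([(x + y) / 2.0 for x, y in zip(a, b)])
    return bias, loa_low, loa_high, pe
```

    A small bias with wide limits of agreement (large percentage error), as found here for PVD, is exactly the pattern that makes two methods non-interchangeable even when they correlate well.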

  4. Observed use of automatic seat belts in 1987 cars.

    PubMed

    Williams, A F; Wells, J K; Lund, A K; Teed, N

    1989-10-01

    Usage of the automatic belt systems supplied by six large-volume automobile manufacturers to meet the federal requirements for automatic restraints was observed in suburban Washington, D.C., Chicago, Los Angeles, and Philadelphia. The belt systems studied were: Ford and Toyota (motorized, nondetachable automatic shoulder belt), Nissan (motorized, detachable shoulder belt), VW and Chrysler (nonmotorized, detachable shoulder belt), and GM (nonmotorized, detachable lap and shoulder belt). Use of automatic belts was significantly greater than manual belt use in otherwise comparable late-model cars for all manufacturers except Chrysler; in Chrysler cars, automatic belt use was significantly lower than manual belt use. The automatic shoulder belts provided by Ford, Nissan, Toyota, and VW increased use rates to about 90%. Because use rates were lower in Ford cars with manual belts, their increase was greater. GM cars had the smallest increase in use rates; however, lap belt use was highest in GM cars. The other manufacturers supply knee bolsters to supplement shoulder belt protection; all except VW also provide manual lap belts, which were used by about half of those who used the automatic shoulder belt. The results indicate that some manufacturers have been more successful than others in providing automatic belt systems that achieve high use and that, in turn, will mean fewer deaths and injuries in those cars.

  5. Density estimation in aerial images of large crowds for automatic people counting

    NASA Astrophysics Data System (ADS)

    Herrmann, Christian; Metzler, Juergen

    2013-05-01

    Counting people is a common topic in the area of visual surveillance and crowd analysis. While many image-based solutions are designed to count only a few persons at the same time, like pedestrians entering a shop or watching an advertisement, there is hardly any solution for counting large crowds of several hundred persons or more. We addressed this problem previously by designing a semi-automatic system being able to count crowds consisting of hundreds or thousands of people based on aerial images of demonstrations or similar events. This system requires major user interaction to segment the image. Our principle aim is to reduce this manual interaction. To achieve this, we propose a new and automatic system. Besides counting the people in large crowds, the system yields the positions of people allowing a plausibility check by a human operator. In order to automatize the people counting system, we use crowd density estimation. The determination of crowd density is based on several features like edge intensity or spatial frequency. They indicate the density and discriminate between a crowd and other image regions like buildings, bushes or trees. We compare the performance of our automatic system to the previous semi-automatic system and to manual counting in images. By counting a test set of aerial images showing large crowds containing up to 12,000 people, the performance gain of our new system will be measured. By improving our previous system, we will increase the benefit of an image-based solution for counting people in large crowds.

  6. [Automated analysis of bacterial preparations manufactured on automatic heat fixation and staining equipment].

    PubMed

    2012-01-01

    Heat fixation of the preparations was performed in a fixation bath designed by EMKO (Russia). The programmable "Emkosteiner" (EMKO, Russia) was used for trial staining. The reagent set Micko-GRAM-NITsF was applied for Gram staining. It was demonstrated that automatic smear fixation equipment and programmable staining ensure high-quality imaging (1% chromaticity variation), good enough for the standardization of Gram staining of microbial preparations.

  7. TU-H-CAMPUS-JeP2-05: Can Automatic Delineation of Cardiac Substructures On Noncontrast CT Be Used for Cardiac Toxicity Analysis?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Y; Liao, Z; Jiang, W

    Purpose: To evaluate the feasibility of using an automatic segmentation tool to delineate cardiac substructures from computed tomography (CT) images for cardiac toxicity analysis in non-small cell lung cancer (NSCLC) patients after radiotherapy. Methods: A multi-atlas segmentation tool developed in-house was used to automatically delineate eleven cardiac substructures, including the whole heart, the four heart chambers, and six great vessels, from the averaged 4D-CT planning images of 49 NSCLC patients. The automatically segmented contours were edited appropriately by two experienced radiation oncologists. The modified contours were compared with the auto-segmented contours using the Dice similarity coefficient (DSC) and the mean surface distance (MSD) to evaluate how much modification was needed. In addition, the dose volume histograms (DVH) of the modified contours were compared with those of the auto-segmented contours to evaluate the dosimetric difference between modified and auto-segmented contours. Results: Across the eleven structures, the averaged DSC values ranged from 0.73 ± 0.08 to 0.95 ± 0.04 and the averaged MSD values ranged from 1.3 ± 0.6 mm to 2.9 ± 5.1 mm for the 49 patients. Overall, the modifications were small. The pulmonary vein (PV) and the inferior vena cava required the most modification. The V30 (volume receiving 30 Gy or above) for the whole heart and the mean dose to the whole heart and the four heart chambers did not show statistically significant differences between modified and auto-segmented contours. The maximum dose to the great vessels did not show statistically significant differences except for the PV. Conclusion: The automatic segmentation of the cardiac substructures did not require substantial modification. The dosimetric evaluation showed no statistically significant difference between auto-segmented and modified contours except for the PV, which suggests that auto-segmented contours for the cardiac dose response study are feasible in the
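    The Dice similarity coefficient used above to compare modified and auto-segmented contours has a compact definition, sketched here for masks represented as sets of voxel coordinates (an assumption for illustration; real tools operate on full image volumes):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks given as sets
    of voxel coordinates: DSC = 2|A ∩ B| / (|A| + |B|); 1.0 = identical."""
    if not mask_a and not mask_b:
        return 1.0
    return 2.0 * len(mask_a & mask_b) / (len(mask_a) + len(mask_b))
```

    A DSC near 0.95, as reported for the best structures, means the edited and automatic contours overlap almost completely; values around 0.73 correspond to the small vessels where more editing was needed.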

  8. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, A.J.

    1994-05-10

    Disclosed are a method and apparatus for automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly. 10 figures.

  9. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, Anthony J.

    1994-05-10

    Disclosed are a method and apparatus for (1) automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, (2) automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, (3) manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and (4) automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly.

  10. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys of buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
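    The core idea of turning stacked point sections into a voxel-based solid can be sketched as a quantization of point coordinates onto a regular grid; the actual procedure also handles section stacking, hole filling, and mesh export, which are omitted in this toy sketch (`voxelize` is a hypothetical name):

```python
def voxelize(points, voxel_size):
    """Map a point cloud (iterable of (x, y, z) tuples) onto the set of
    occupied voxel indices; each occupied voxel would become one solid
    eight-node element of the finite element mesh."""
    return {(int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
            for x, y, z in points}
```

    Choosing the voxel size trades mesh fidelity against element count, which is why the procedure targets a "fine discretized geometry" rather than an exact surface mesh.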

  11. Automatic, semi-automatic and manual validation of urban drainage data.

    PubMed

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferred way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect erroneous data, mark them, remove them, and replace them with interpolated data. In general, this first step of detecting anomalous data is called data quality assessment or data validation. Data validation consists of three parts: data preparation, generation of validation scores, and score interpretation. This paper presents the overall framework for a data quality improvement system suitable for automatic, semi-automatic, or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, score interpretation, needs to be investigated further on the developed system.
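    The score-generation part of such a framework can be illustrated with two simple per-sample checks, a physical-range test and a rate-of-change (spike) test, combined into one score. The paper's actual validation methods are more elaborate, so treat this as a toy sketch with hypothetical names:

```python
def validation_scores(series, lo, hi, max_step):
    """Assign each sample a validation score in [0, 1]: 1.0 passes both a
    physical-range check and a rate-of-change check, 0.5 fails one, and
    0.0 fails both. Interpreting the scores (keep / flag / replace with
    interpolated data) is left to a later stage, mirroring the three-part
    validation framework."""
    scores = []
    prev = None
    for v in series:
        ok_range = lo <= v <= hi
        ok_step = prev is None or abs(v - prev) <= max_step
        scores.append((int(ok_range) + int(ok_step)) / 2.0)
        prev = v
    return scores
```

    Note that a single spike degrades the step-check score of the sample after it as well, which is one reason real systems combine several independent validation methods before interpreting the scores.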

  12. Automatic Gain Control in Compact Spectrometers.

    PubMed

    Protopopov, Vladimir

    2016-03-01

    An image intensifier installed in the optical path of a compact spectrometer may act not only as a fast gating unit, which is widely used for time-resolved measurements, but also as a variable attenuator-amplifier in continuous-wave mode. This opens the possibility of automatic gain control, a new feature in spectroscopy. With it, the user is relieved of the need to manually adjust the signal level to a certain value; this is done automatically by means of an electronic feedback loop. Even more importantly, automatic gain control is performed without changing the exposure time, which is an additional benefit in time-resolved experiments. The concept, algorithm, design considerations, and experimental results are presented. © The Author(s) 2016.
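    The electronic feedback loop described above can be sketched as a multiplicative gain update that drives the measured signal toward a target level while leaving the exposure time untouched. The instrument interface below is a simulation for illustration, not a real spectrometer API, and all names are hypothetical:

```python
class SimulatedIntensifier:
    """Toy instrument model: the measured signal is proportional to the gain."""
    def __init__(self, base_level):
        self.gain = 1.0
        self.base_level = base_level

    def set_gain(self, g):
        self.gain = g

    def read_signal(self):
        return self.base_level * self.gain

def auto_gain(instrument, target, tol=0.05, max_iter=50):
    """Feedback loop for automatic gain control: nudge the intensifier gain
    until the measured level falls within tol of the target, without
    changing the exposure time."""
    gain = 1.0
    for _ in range(max_iter):
        instrument.set_gain(gain)
        level = instrument.read_signal()
        if abs(level - target) <= tol * target:
            return gain
        gain *= target / level   # multiplicative proportional correction
    return gain
```

    Because the correction is multiplicative, the loop converges in very few steps when the signal responds roughly linearly to gain; a hardware loop would also clamp the gain to the intensifier's safe operating range.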

  13. 30 CFR 75.523-3 - Automatic emergency-parking brakes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic emergency-parking brakes. 75.523-3...-3 Automatic emergency-parking brakes. (a) Except for personnel carriers, rubber-tired, self... with automatic emergency-parking brakes in accordance with the following schedule. (1) On and after May...

  14. 30 CFR 75.523-3 - Automatic emergency-parking brakes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Automatic emergency-parking brakes. 75.523-3...-3 Automatic emergency-parking brakes. (a) Except for personnel carriers, rubber-tired, self... with automatic emergency-parking brakes in accordance with the following schedule. (1) On and after May...

  15. A Survey of Automatic Code Generating Software

    DTIC Science & Technology

    1988-09-01

    LIST OF REFERENCES 1. Boehm, Barry W., "Software and Its Impact: A Quantitative Assessment," Datamation, Vol...Decision Support Systems: An Organizational Perspective, pp. 11-12, Addison-Wesley Publishing Company, Inc., 1978. 6. Pressman, Roger S., Software

  16. Automatic target recognition apparatus and method

    DOEpatents

    Baumgart, Chris W.; Ciarcia, Christopher A.

    2000-01-01

    An automatic target recognition apparatus (10) is provided, having a video camera/digitizer (12) for producing a digitized image signal (20) representing an image containing therein objects which objects are to be recognized if they meet predefined criteria. The digitized image signal (20) is processed within a video analysis subroutine (22) residing in a computer (14) in a plurality of parallel analysis chains such that the objects are presumed to be lighter in shading than the background in the image in three of the chains and further such that the objects are presumed to be darker than the background in the other three chains. In two of the chains the objects are defined by surface texture analysis using texture filter operations. In another two of the chains the objects are defined by background subtraction operations. In yet another two of the chains the objects are defined by edge enhancement processes. In each of the analysis chains a calculation operation independently determines an error factor relating to the probability that the objects are of the type which should be recognized, and a probability calculation operation combines the results of the analysis chains.

  17. Comparison between manual and semi-automatic segmentation of nasal cavity and paranasal sinuses from CT images.

    PubMed

    Tingelhoff, K; Moral, A I; Kunkel, M E; Rilk, M; Wagner, I; Eichhorn, K G; Wahl, F M; Bootz, F

    2007-01-01

    Segmentation of medical image data has become increasingly important in recent years. The results are used for diagnosis, surgical planning, or workspace definition of robot-assisted systems. The purpose of this paper is to find out whether manual or semi-automatic segmentation is adequate for the ENT surgical workflow, or whether fully automatic segmentation of the paranasal sinuses and nasal cavity is needed. We present a comparison of manual and semi-automatic segmentation of the paranasal sinuses and the nasal cavity. Manual segmentation is performed with custom software, whereas semi-automatic segmentation is realized with a commercial product (Amira). For this study we used a CT dataset of the paranasal sinuses consisting of 98 transversal slices, each 1.0 mm thick, with a resolution of 512 x 512 pixels. For the analysis of both segmentation procedures we used volume, extension (width, length, and height), segmentation time, and 3D reconstruction. The segmentation time was reduced from 960 minutes with manual segmentation to 215 minutes with semi-automatic segmentation. We found the highest variances when segmenting the nasal cavity. For the paranasal sinuses, the volume differences between manual and semi-automatic segmentation are not significant. Depending on the required segmentation accuracy, both approaches deliver useful results and could be used, e.g., for robot-assisted systems. Nevertheless, both procedures are not useful for the everyday surgical workflow, because they take too much time. Fully automatic and reproducible segmentation algorithms are needed for segmentation of the paranasal sinuses and nasal cavity.

  18. Measuring attachment to life in old age: the Portuguese version of the Positive Valuation of Life Scale (Positive VOL).

    PubMed

    Araújo, Lia; Ribeiro, Oscar; Teixeira, Laetitia; Azevedo, Maria João; Jopp, Daniela S; Rott, Christoph; Paúl, Constança

    2015-10-01

    This study aims to present the psychometric properties of the Portuguese version of the Positive Valuation of Life Scale (Lawton et al. in J Aging Ment Health 13:3-31, 2001). The sample included 207 community-dwelling elders (129 women; mean age = 77.2 years, SD = 7.5). Data collection included the translated and adapted Portuguese version of the Positive Valuation of Life Scale, the Life Satisfaction Index Z, the Meaning in Life Questionnaire, and the Geriatric Depression Scale. From exploratory factor analysis, two factors emerged, existential beliefs and perceived control, explaining 49% of the total variance. Both factors were positively related to meaning in life and life satisfaction and negatively related to depression (p < 0.05). The values obtained for internal consistency for the total scale and for each subscale were good (α > 0.75). The Portuguese version of the Positive VOL Scale represents a reliable and valid measure to capture the subjective experience of attachment to one's life. The two-factor structure is an update to Lawton's previous work and in line with findings obtained in the USA (Dennis et al. in What is valuation of life for frail community-dwelling older adults: factor structure and criterion validity of the VOL, Thomas Jefferson University, Center for Applied Research on Aging and Health Research, 2005) and Japan (Nakagawa et al. in Shinrigaku Kenkyu 84:37-46, 2013). Future research is required to investigate VOL predictors and potential changes toward the end of the life span.

  19. Comparison of liver volumetry on contrast‐enhanced CT images: one semiautomatic and two automatic approaches

    PubMed Central

    Cai, Wei; He, Baochun; Fang, Chihua

    2016-01-01

    This study was designed to evaluate the accuracy, consistency, and efficiency of three liver volumetry methods on clinical contrast-enhanced CT images: one interactive method, an in-house-developed 3D Medical Image Analysis (3DMIA) system; one automatic active shape model (ASM)-based segmentation; and one automatic probabilistic atlas (PA)-guided segmentation method. Forty-two datasets, including 27 normal liver and 15 space-occupying liver lesion patients, were retrospectively included in this study. The three methods, the semiautomatic 3DMIA, the automatic ASM-based, and the automatic PA-based liver volumetry, achieved an accuracy with volume difference (VD) of -1.69%, -2.75%, and 3.06% in the normal group, respectively, and with VD of -3.20%, -3.35%, and 4.14% in the space-occupying lesion group, respectively. The three methods required 27.63 min, 1.26 min, and 1.18 min on average, respectively, compared with manual volumetry, which took 43.98 min. The high intraclass correlation coefficients between the three methods and the manual method indicated excellent agreement on liver volumetry. Significant differences in segmentation time were observed between the three methods (3DMIA, ASM, and PA) and manual volumetry (p < 0.001), as well as between the automatic volumetries (ASM and PA) and the semiautomatic volumetry (3DMIA) (p < 0.001). The semiautomatic interactive 3DMIA, automatic ASM-based, and automatic PA-based liver volumetry all agreed well with the manual gold standard in both the normal liver group and the space-occupying lesion group. The ASM- and PA-based automatic segmentation methods have better efficiency in clinical use. PACS number(s): 87.55.-x PMID:27929487

  20. Back-and-Forth Methodology for Objective Voice Quality Assessment: From/to Expert Knowledge to/from Automatic Classification of Dysphonia

    NASA Astrophysics Data System (ADS)

    Fredouille, Corinne; Pouchoulin, Gilles; Ghio, Alain; Revis, Joana; Bonastre, Jean-François; Giovanni, Antoine

    2009-12-01

    This paper addresses voice disorder assessment. It proposes an original back-and-forth methodology involving an automatic classification system as well as the knowledge of human experts (machine-learning experts, phoneticians, and pathologists). The goal of this methodology is to bring a better understanding of the acoustic phenomena related to dysphonia. The automatic system was validated on a dysphonic corpus (80 female voices) rated according to the GRBAS perceptual scale by an expert jury. First, focusing on the frequency domain, the classification system showed the relevance of the 0-3000 Hz frequency band for the classification task based on the GRBAS scale. Later, an automatic phonemic analysis underlined the significance of consonants, and more surprisingly of unvoiced consonants, for the same classification task. Submitted to the human experts, these observations led to a manual analysis of unvoiced plosives, which highlighted a lengthening of VOT with dysphonia severity, validated by a preliminary statistical analysis.

  1. Grinding Parts For Automatic Welding

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  2. Automatic classification of blank substrate defects

    NASA Astrophysics Data System (ADS)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask later acts as a template for a considerable number of dies on the wafer. Defects on the initial blank substrate, and on the subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent, and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect-avoidance technologies during mask writing. Detailed information on blank defects can help defect-avoidance tools select appropriate job-decks to be written on the mask [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. The features Calibre ADC uses to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. This automated process improves classification accuracy, repeatability, and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
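
    The decision-tree step can be illustrated with a toy rule-based classifier. The features mirror those listed above (size, polarity in transmitted/reflected images, signal vs. noise), but the thresholds and class codes are invented for illustration and are not Calibre's:

    ```python
    def classify_defect(size_nm, polarity_transmitted, polarity_reflected, snr):
        """Toy decision tree mapping blank-defect features to a class code.
        All thresholds and codes are hypothetical."""
        if snr < 2.0:
            return "FALSE_DEFECT"          # signal indistinguishable from noise
        if polarity_transmitted == "dark" and polarity_reflected == "dark":
            kind = "PARTICLE"              # opaque in both imaging modes
        elif polarity_transmitted == "bright":
            kind = "PINHOLE"               # light leaks through the coating
        else:
            kind = "OTHER"
        # Size separates critical defects from minor ones.
        return f"CRITICAL_{kind}" if size_nm >= 80 else f"MINOR_{kind}"

    print(classify_defect(120, "dark", "dark", 5.0))   # CRITICAL_PARTICLE
    print(classify_defect(40, "bright", "dark", 5.0))  # MINOR_PINHOLE
    ```

    A real ADC tree would be deeper and tuned against reviewed defect libraries; the point is only the feature-to-code translation.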

  3. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Automatic, mechanical, and electronic equipment. 211.68 Section 211.68 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... Equipment § 211.68 Automatic, mechanical, and electronic equipment. (a) Automatic, mechanical, or electronic...

  4. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Automatic, mechanical, and electronic equipment. 211.68 Section 211.68 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... Equipment § 211.68 Automatic, mechanical, and electronic equipment. (a) Automatic, mechanical, or electronic...

  5. 30 CFR 75.1404 - Automatic brakes; speed reduction gear.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic brakes; speed reduction gear. 75.1404... Automatic brakes; speed reduction gear. [Statutory Provisions] Each locomotive and haulage car used in an... permit automatic brakes, locomotives and haulage cars shall be subject to speed reduction gear, or other...

  6. 30 CFR 75.1404 - Automatic brakes; speed reduction gear.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Automatic brakes; speed reduction gear. 75.1404... Automatic brakes; speed reduction gear. [Statutory Provisions] Each locomotive and haulage car used in an... permit automatic brakes, locomotives and haulage cars shall be subject to speed reduction gear, or other...

  7. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases, based on features such as the complexity of the background and the visibility of the disease (lesions). An automatic background classification tool for mammograms would therefore help in such clinical studies. The tool is based on a multi-content analysis (MCA) framework first developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to classify digital mammograms automatically with satisfactory accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale, which standardizes mammography reporting terminology and assessment and recommendation categories, is used for grouping the mammograms. Selected features are input into a decision-tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these weak classifiers are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one strong classifier show good accuracy with high true-positive rates; over the four categories the results are TP = 90.38%, TN = 67.88%, FP = 32.12%, and FN = 9.62%.
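
    The AdaBoost step described above (weak classifiers re-weighted and combined into a strong one) can be sketched as follows. One-feature threshold stumps stand in for the paper's decision-tree weak classifiers; the data and round count are illustrative:

    ```python
    import numpy as np

    def train_adaboost(X, y, n_rounds=10):
        """AdaBoost with 1-D threshold stumps; labels y in {-1, +1}.
        Returns a list of (feature, threshold, sign, alpha) weak classifiers."""
        n, d = X.shape
        w = np.full(n, 1.0 / n)                     # sample weights
        ensemble = []
        for _ in range(n_rounds):
            best = None
            for f in range(d):                      # exhaustive stump search
                for thr in np.unique(X[:, f]):
                    for sign in (1, -1):
                        pred = sign * np.where(X[:, f] >= thr, 1, -1)
                        err = w[pred != y].sum()    # weighted error
                        if best is None or err < best[0]:
                            best = (err, f, thr, sign, pred)
            err, f, thr, sign, pred = best
            err = max(err, 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak classifier
            w *= np.exp(-alpha * y * pred)          # boost misclassified samples
            w /= w.sum()
            ensemble.append((f, thr, sign, alpha))
        return ensemble

    def predict_adaboost(ensemble, X):
        """Sign of the alpha-weighted vote of the weak classifiers."""
        score = sum(alpha * sign * np.where(X[:, f] >= thr, 1, -1)
                    for f, thr, sign, alpha in ensemble)
        return np.where(score >= 0, 1, -1)

    # Toy separable data: one feature, two classes.
    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([-1, -1, 1, 1])
    model = train_adaboost(X, y, n_rounds=3)
    ```

    Each stump alone may barely beat chance on hard data; the weighted vote is what drives the global error rate down.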

  8. Automatically-computed prehospital severity scores are equivalent to scores based on medic documentation.

    PubMed

    Reisner, Andrew T; Chen, Liangyou; McKenna, Thomas M; Reifman, Jaques

    2008-10-01

    Prehospital severity scores can be used in routine prehospital care, mass casualty care, and military triage. If computers could reliably calculate clinical scores, new clinical and research methodologies would be possible. One obstacle is that vital signs measured automatically can be unreliable. We hypothesized that Signal Quality Indices (SQIs), computer algorithms that differentiate between reliable and unreliable monitored physiologic data, could improve the predictive power of computer-calculated scores. In a retrospective analysis of trauma casualties transported by air ambulance, we computed the Triage Revised Trauma Score (RTS) from archived travel monitor data. We compared the areas under the curve (AUCs) of receiver operating characteristic curves for prediction of mortality and red blood cell transfusion for 187 subjects with comparable quantities of good-quality and poor-quality data. Vital signs deemed reliable by SQIs led to significantly more discriminatory severity scores than vital signs deemed unreliable. We also compared automatically computed RTS (using the SQIs) with RTS computed from vital signs documented by medics. For the subjects in whom the SQI algorithms identified 15 consecutive seconds of reliable vital-signs data (n = 350), the automatically computed scores' AUCs were the same as the medic-based scores' AUCs. Using the Prehospital Index in place of RTS led to very similar results, corroborating our findings. SQI algorithms improve automatically computed severity scores, and automatically computed scores using SQIs are equivalent to medic-based scores.
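
    For background, the Triage RTS referenced above sums coded values (0-4) for Glasgow Coma Scale, systolic blood pressure, and respiratory rate, giving a 0-12 score. A sketch using the standard published coding bands (the helper and example values are ours, not from the paper):

    ```python
    def coded(value, bands):
        """Return the code of the first band whose lower bound is <= value."""
        for lo, code in bands:
            if value >= lo:
                return code
        return 0

    def triage_rts(gcs, sbp, rr):
        """Triage Revised Trauma Score (0-12): sum of coded GCS,
        systolic blood pressure (mmHg), and respiratory rate (breaths/min)."""
        g = coded(gcs, [(13, 4), (9, 3), (6, 2), (4, 1)])
        s = coded(sbp, [(90, 4), (76, 3), (50, 2), (1, 1)])
        r = 4 if 10 <= rr <= 29 else coded(rr, [(30, 3), (6, 2), (1, 1)])
        return g + s + r

    print(triage_rts(15, 120, 16))  # 12: unimpaired casualty
    print(triage_rts(8, 70, 35))    # 2 + 2 + 3 = 7
    ```

    In the study's setting, the inputs would come either from SQI-filtered monitor data or from medic documentation.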

  9. The Influence of Facial Signals on the Automatic Imitation of Hand Actions

    PubMed Central

    Butler, Emily E.; Ward, Robert; Ramsey, Richard

    2016-01-01

    Imitation and facial signals are fundamental social cues that guide interactions with others, but little is known regarding the relationship between these behaviors. It is clear that during expression detection we imitate observed expressions by engaging similar facial muscles, and it has been proposed that a cognitive system which matches observed and performed actions controls imitation and contributes to emotion understanding. However, little is known regarding the consequences of recognizing affective states for other forms of imitation that are not inherently tied to the observed emotion. The current study investigated the hypothesis that facial cue valence would modulate automatic imitation of hand actions. To test this hypothesis, we paired different types of facial cue with an automatic imitation task. Experiments 1 and 2 demonstrated that a smile prompted greater automatic imitation than angry and neutral expressions. Additionally, a meta-analysis of this and previous studies suggests that both happy and angry expressions increase imitation compared to neutral expressions. By contrast, Experiments 3 and 4 demonstrated that invariant facial cues, which signal trait levels of agreeableness, had no impact on imitation: despite readily identifying trait-based facial signals, participants' automatic imitation was not differentially modulated by agreeableness. Further, a Bayesian analysis showed that the null effect was between 2 and 5 times more likely than the experimental effect. We therefore show that imitation systems are more sensitive to prosocial facial signals that indicate “in the moment” states than to enduring traits. These data support the view that a smile primes multiple forms of imitation, including the copying of actions that are not inherently affective. The influence of expression detection on wider forms of imitation may contribute to facilitating interactions between individuals, such as building rapport and affiliation. PMID:27833573

  10. The Influence of Facial Signals on the Automatic Imitation of Hand Actions.

    PubMed

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2016-01-01

    Imitation and facial signals are fundamental social cues that guide interactions with others, but little is known regarding the relationship between these behaviors. It is clear that during expression detection we imitate observed expressions by engaging similar facial muscles, and it has been proposed that a cognitive system which matches observed and performed actions controls imitation and contributes to emotion understanding. However, little is known regarding the consequences of recognizing affective states for other forms of imitation that are not inherently tied to the observed emotion. The current study investigated the hypothesis that facial cue valence would modulate automatic imitation of hand actions. To test this hypothesis, we paired different types of facial cue with an automatic imitation task. Experiments 1 and 2 demonstrated that a smile prompted greater automatic imitation than angry and neutral expressions. Additionally, a meta-analysis of this and previous studies suggests that both happy and angry expressions increase imitation compared to neutral expressions. By contrast, Experiments 3 and 4 demonstrated that invariant facial cues, which signal trait levels of agreeableness, had no impact on imitation: despite readily identifying trait-based facial signals, participants' automatic imitation was not differentially modulated by agreeableness. Further, a Bayesian analysis showed that the null effect was between 2 and 5 times more likely than the experimental effect. We therefore show that imitation systems are more sensitive to prosocial facial signals that indicate "in the moment" states than to enduring traits. These data support the view that a smile primes multiple forms of imitation, including the copying of actions that are not inherently affective. The influence of expression detection on wider forms of imitation may contribute to facilitating interactions between individuals, such as building rapport and affiliation.

  11. MARZ: Manual and automatic redshifting software

    NASA Astrophysics Data System (ADS)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application, MARZ, with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting 2dF spectra. MARZ is an open-source, client-based JavaScript web application that provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph, producing high-quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines that conform to the current FITS file standard. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm matches input spectra against a variety of stellar and galaxy templates, and automatic matching performance on OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can easily be redshifted manually by cycling through the automatic results, comparing against templates by hand, or marking spectral features.
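
    The cross-correlation idea behind AUTOZ-style template matching can be sketched under simple assumptions (uniform log-wavelength binning, a single template, integer pixel shifts); this is a generic illustration, not the MARZ implementation:

    ```python
    import numpy as np

    def best_redshift(spectrum, template, dlogw, max_shift=50):
        """Match a spectrum against a template by cross-correlation in
        log-wavelength, where a redshift is a rigid shift:
        log10(1 + z) = shift * dlogw (dlogw = log-wavelength pixel size)."""
        s = (spectrum - spectrum.mean()) / spectrum.std()
        t = (template - template.mean()) / template.std()
        best = (-np.inf, 0)
        for k in range(max_shift + 1):
            n = len(s) - k
            r = np.dot(s[k:], t[:n]) / n        # correlation at shift k
            if r > best[0]:
                best = (r, k)
        return 10 ** (best[1] * dlogw) - 1.0    # recovered z

    # Synthetic check: a template with one emission line, and a 'spectrum'
    # that is the same line shifted by 20 log-wavelength pixels.
    dlogw = 1e-4
    pix = np.arange(400)
    template = np.exp(-0.5 * ((pix - 100) / 3.0) ** 2)
    spectrum = np.exp(-0.5 * ((pix - 120) / 3.0) ** 2)
    z = best_redshift(spectrum, template, dlogw)
    ```

    A production matcher would search sub-pixel shifts, many templates, and report a cross-correlation quality figure used to flag spectra for manual review.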

  12. Automatic retinal interest evaluation system (ARIES).

    PubMed

    Yin, Fengshou; Wong, Damon Wing Kee; Yow, Ai Ping; Lee, Beng Hai; Quan, Ying; Zhang, Zhuo; Gopalakrishnan, Kavitha; Li, Ruoying; Liu, Jiang

    2014-01-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases such as glaucoma, age-related macular degeneration, and diabetic retinopathy. In practice, however, retinal image quality is a major concern, as automatic systems that do not account for degraded image quality will likely generate unreliable results. In this paper, an automatic retinal image quality assessment system (ARIES) is introduced to assess both the quality of the whole image and that of focal regions of interest. ARIES achieves 99.54% accuracy in distinguishing fundus images from other types of images through a retinal image identification step in a dataset of 35,342 images. The system employs high-level image quality measures (HIQM) to perform image quality assessment, and achieves areas under the curve (AUCs) of 0.958 and 0.987 for the whole image and the optic disk region, respectively, in a testing dataset of 370 images. ARIES acts as a form of automatic quality control that ensures good-quality images are used for processing, and can also alert operators to poor-quality images at the time of acquisition.

  13. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    PubMed

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discrete, random arrangement of micro coding units. By combining an optimization algorithm with commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess a constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To verify the performance experimentally, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface that generates the required four-beam radiation in specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of the automatic software design. The proposed method provides a smart tool to realize various functional devices and systems automatically.

  14. Automatic soldering machine

    NASA Technical Reports Server (NTRS)

    Stein, J. A.

    1974-01-01

    Fully-automatic tube-joint soldering machine can be used to make leakproof joints in aluminum tubes of 3/16 to 2 in. in diameter. Machine consists of temperature-control unit, heater transformer and heater head, vibrator, and associated circuitry, controls, and indicators.

  15. Automatic computation for optimum height planning of apartment buildings to improve solar access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong, Yoon-Bok; Kim, Yong-Yee; Seok, Ho-Tae

    2011-01-15

    The objective of this study is to suggest a mathematical model and an optimal algorithm for determining the height of apartment buildings so as to satisfy the solar rights of survey buildings or survey housing units, and to develop an automatic computation model for the optimum height of apartment buildings and clarify its performance and expected effects. To accomplish this objective, the following procedures were followed: (1) The necessity of height planning of obstruction buildings to satisfy the solar rights of survey buildings or survey housing units is demonstrated by analyzing, through a literature review, the recent trend of disputes related to solar rights and by examining the social requirements in terms of solar rights. In addition, the necessity of an automatic computation system for height planning of apartment buildings is demonstrated, and a suitable analysis method for this system is chosen by investigating the characteristics of analysis methods for solar-rights assessment. (2) A case study on the process of height planning of apartment buildings is briefly described, and the problems occurring in this process are examined. (3) To develop an automatic computation model for height planning of apartment buildings, the geometrical elements forming apartment buildings are defined by analyzing their geometrical characteristics. In addition, the design factors and regulations required in height planning of apartment buildings are investigated. Based on this knowledge, a methodology and mathematical algorithm to adjust the height of apartment buildings by automatic computation are suggested, probable problems and ways to resolve them are discussed, and the methodology and algorithm for the optimization are suggested. (4) Based on the suggested methodology and mathematical algorithm, the automatic computation model for optimum height of apartment

  16. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process: the sample must be changed and the measurement software set up for every one-hour counting period, and both procedures are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software was developed to automate the counting of up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  17. Automatic rectum limit detection by anatomical markers correlation.

    PubMed

    Namías, R; D'Amato, J P; del Fresno, M; Vénere, M

    2014-06-01

    Several diseases take place at the end of the digestive system. Many of them can be diagnosed by means of different medical imaging modalities together with computer aided detection (CAD) systems. These CAD systems mainly focus on the complete segmentation of the digestive tube. However, the detection of limits between different sections could provide important information to these systems. In this paper we present an automatic method for detecting the rectum and sigmoid colon limit using a novel global curvature analysis over the centerline of the segmented digestive tube in different imaging modalities. The results are compared with the gold standard rectum upper limit through a validation scheme comprising two different anatomical markers: the third sacral vertebra and the average rectum length. Experimental results in both magnetic resonance imaging (MRI) and computed tomography colonography (CTC) acquisitions show the efficacy of the proposed strategy in automatic detection of rectum limits. The method is intended for application to the rectum segmentation in MRI for geometrical modeling and as contextual information source in virtual colonoscopies and CAD systems. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. 46 CFR 171.118 - Automatic ventilators and side ports.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Automatic ventilators and side ports. 171.118 Section... SPECIAL RULES PERTAINING TO VESSELS CARRYING PASSENGERS Openings in the Side of a Vessel Below the Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  19. 46 CFR 171.118 - Automatic ventilators and side ports.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Automatic ventilators and side ports. 171.118 Section... SPECIAL RULES PERTAINING TO VESSELS CARRYING PASSENGERS Openings in the Side of a Vessel Below the Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  20. 46 CFR 171.118 - Automatic ventilators and side ports.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Automatic ventilators and side ports. 171.118 Section... SPECIAL RULES PERTAINING TO VESSELS CARRYING PASSENGERS Openings in the Side of a Vessel Below the Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  1. 46 CFR 171.118 - Automatic ventilators and side ports.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Automatic ventilators and side ports. 171.118 Section... SPECIAL RULES PERTAINING TO VESSELS CARRYING PASSENGERS Openings in the Side of a Vessel Below the Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  2. 46 CFR 171.118 - Automatic ventilators and side ports.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Automatic ventilators and side ports. 171.118 Section... SPECIAL RULES PERTAINING TO VESSELS CARRYING PASSENGERS Openings in the Side of a Vessel Below the Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  3. Presentation video retrieval using automatically recovered slide and spoken text

    NASA Astrophysics Data System (ADS)

    Cooper, Matthew

    2013-03-01

    Video is becoming a prevalent medium for e-learning. Lecture videos contain text information in both the presentation slides and lecturer's speech. This paper examines the relative utility of automatically recovered text from these sources for lecture video retrieval. To extract the visual information, we automatically detect slides within the videos and apply optical character recognition to obtain their text. Automatic speech recognition is used similarly to extract spoken text from the recorded audio. We perform controlled experiments with manually created ground truth for both the slide and spoken text from more than 60 hours of lecture video. We compare the automatically extracted slide and spoken text in terms of accuracy relative to ground truth, overlap with one another, and utility for video retrieval. Results reveal that automatically recovered slide text and spoken text contain different content with varying error profiles. Experiments demonstrate that automatically extracted slide text enables higher precision video retrieval than automatically recovered spoken text.

  4. A device for automatic photoelectric control of the analytical gap for emission spectrographs

    USGS Publications Warehouse

    Dietrich, John A.; Cooley, Elmo F.; Curry, Kenneth J.

    1977-01-01

    A photoelectric device has been built that automatically controls the analytical gap between electrodes during the excitation period. The control device allows precise control of the analytical gap during the arcing of samples, resulting in better analytical precision.

  5. A novel automatic segmentation workflow of axial breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Besbes, Feten; Gargouri, Norhene; Damak, Alima; Sellami, Dorra

    2018-04-01

    In this paper we propose a novel, fully automatic breast tissue segmentation process that is independent of expert calibration and contrast. The proposed algorithm comprises two major steps. The first is the detection of the breast boundaries, based on image content analysis and the Moore-neighbour tracing algorithm; as preprocessing, Otsu thresholding and a neighbourhood algorithm are applied, and the area outside the breast is removed to obtain an approximate breast region. The second step is the delineation of the chest wall, treated as the lowest-cost path linking three key points located automatically on the breast: the left and right boundary points, and a middle upper point placed in the sternum region by a statistical method. The minimum-cost path search is solved with Dijkstra's algorithm. Evaluation results show the robustness of the process in the face of different breast densities, complex shapes, and challenging cases: the mean overlap between manual and automatic segmentation with our method is 96.5%. A comparative study shows that the proposed process is competitive with, and faster than, existing methods; the segmentation of 120 slices with our method is achieved in 20.57 ± 5.2 s.
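
    The lowest-cost-path formulation can be illustrated with a generic Dijkstra sketch on a toy cost grid; the grid and cost values are hypothetical stand-ins for the image-derived costs the authors use along the chest wall:

    ```python
    import heapq

    def dijkstra(cost, start, goal, neighbors):
        """Lowest-cost path between two nodes; 'cost' is the price of
        entering each node (in the paper, derived from the image)."""
        dist, prev = {start: 0.0}, {}
        heap = [(0.0, start)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == goal:
                break
            if d > dist[u]:
                continue                     # stale heap entry
            for v in neighbors(u):
                nd = d + cost[v]
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path = [goal]                        # walk predecessors back to start
        while path[-1] != start:
            path.append(prev[path[-1]])
        return path[::-1], dist[goal]

    # Toy 3x4 grid: the cheap middle row plays the role of the chest wall.
    grid = [[9, 9, 9, 9],
            [1, 1, 1, 1],
            [9, 9, 9, 9]]
    def nbrs(p):
        r, c = p
        return [(r + dr, c + dc) for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0))
                if 0 <= r + dr < 3 and 0 <= c + dc < 4]
    cost = {(r, c): grid[r][c] for r in range(3) for c in range(4)}
    path, total = dijkstra(cost, (1, 0), (1, 3), nbrs)
    ```

    With costs low along dark chest-wall voxels and high elsewhere, the optimal path between the three key points traces the wall.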

  6. Automatic Determination of the Conic Coronal Mass Ejection Model Parameters

    NASA Technical Reports Server (NTRS)

    Pulkkinen, A.; Oates, T.; Taktakishvili, A.

    2009-01-01

    Characterization of the three-dimensional structure of solar transients using incomplete plane-of-sky data is a difficult problem whose solutions have potential for societal benefit in terms of space-weather applications. In this paper, transients are characterized in three dimensions by means of a conic coronal mass ejection (CME) approximation. A novel method for the automatic determination of cone model parameters from observed halo CMEs is introduced. The method uses standard image processing techniques to extract the CME mass from white-light coronagraph images and a novel inversion routine to provide the final cone parameters; a bootstrap technique provides model parameter distributions. When combined with heliospheric modeling, the cone model parameter distributions will provide direct means for ensemble predictions of transient propagation in the heliosphere. An initial validation of the automatic method is carried out by comparison to manually determined cone model parameters. Using 14 halo CME events, it is shown that there is reasonable agreement, especially between the heliocentric locations of the cones derived with the two methods. It is argued that both the heliocentric locations and the opening half-angles of the automatically determined cones may be more realistic than those obtained from the manual analysis.

  7. Oscillatory brain dynamics associated with the automatic processing of emotion in words.

    PubMed

    Wang, Lin; Bastiaansen, Marcel

    2014-10-01

    This study examines the automaticity of processing the emotional aspects of words and characterizes the oscillatory brain dynamics that accompany this automatic processing. Participants read emotionally negative, neutral, and positive nouns while performing a color detection task in which only perceptual-level analysis was required. Event-related potentials and time-frequency representations were computed from the concurrently measured EEG. Negative words elicited a larger P2 and a larger late positivity than positive and neutral words, indicating deeper semantic/evaluative processing of negative words. In addition, sustained alpha-power suppression was found for emotional compared to neutral words in the 500-1000 ms post-stimulus range. These results suggest that sustained attention was allocated to the emotional words, whereas the attention allocated to the neutral words was released after an initial analysis. This seems to hold even when the emotional content of the words is task-irrelevant. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Automatically monitoring driftwood in large rivers: preliminary results

    NASA Astrophysics Data System (ADS)

    Piegay, H.; Lemaire, P.; MacVicar, B.; Mouquet-Noppe, C.; Tougne, L.

    2014-12-01

    Driftwood in rivers affects sediment transport, riverine habitat, and human infrastructure. Quantifying it, in particular large wood in fairly large rivers where it moves easily, would improve our knowledge of fluvial transport processes. There are several ways to study this phenomenon, among them RFID tracking and photo or video monitoring. Here we are interested in the latter, which is easier and cheaper to deploy. However, video monitoring of driftwood generates a huge number of images, and labeling them manually is tedious, so the monitoring process must be automated; this is a difficult task in computer vision and, more specifically, automatic video analysis, since detecting foreground against a dynamic background remains an open problem. We installed a video camera on the riverbank at a gauging station on the Ain River, a 3500 km² piedmont river in France, and several floods were manually annotated by a human operator. We developed software that automatically extracts and characterizes wood pieces within a video stream. The algorithm is based on a statistical model and combines static, dynamic, and spatial data; segmented wood objects are further described with a skeleton-based approach that automatically determines their shape, diameter, and length. The first detailed comparisons between manual annotations and automatically extracted data show that large wood can be detected fairly well down to a given size (approximately 120 cm in length or 15 cm in diameter), whereas smaller pieces are difficult to detect and tend to be missed by either the human operator or the algorithm. Detection is fairly accurate in high-flow conditions, where the water is usually brown because of suspended sediment transport; in low-flow conditions, the algorithm still needs improvement to reduce the number of false positives, so as to better distinguish shadows or turbulence structures from wood pieces.
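
    One common statistical model for detecting foreground against a dynamic background, of the kind alluded to above, is a per-pixel running Gaussian. This is a generic sketch, not the authors' algorithm; the thresholds and toy frames are illustrative:

    ```python
    import numpy as np

    def update_background(mean, var, frame, alpha=0.05, k=2.5):
        """One step of a per-pixel running-Gaussian background model.
        Pixels deviating by more than k standard deviations are flagged
        foreground (candidate wood); the model adapts only where the
        scene looks like background, so the river surface can drift."""
        sigma = np.sqrt(var)
        foreground = np.abs(frame - mean) > k * sigma
        bg = ~foreground
        mean[bg] += alpha * (frame[bg] - mean[bg])
        var[bg] += alpha * ((frame[bg] - mean[bg]) ** 2 - var[bg])
        return foreground

    # Toy sequence: flat water surface, then a bright 'log' covers two pixels.
    mean = np.full((4, 4), 100.0)
    var = np.full((4, 4), 4.0)          # sigma = 2 grey levels
    frame = np.full((4, 4), 100.0)
    frame[1, 1:3] = 180.0               # the log
    fg = update_background(mean, var, frame)
    ```

    Turbulence and shadows defeat such a simple model, which is why the paper combines it with dynamic and spatial cues and a skeleton-based shape description.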

  9. Automatic script identification from images using cluster-based templates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hochberg, J.; Kerns, L.; Kelly, P.

    We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit-image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
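    The pipeline described above (cluster training symbols per script, keep cluster centroids as templates, classify a new document by nearest-template voting) can be sketched with a crude k-means in numpy. The feature vectors, cluster counts, and the two-script toy data are assumptions for illustration; the actual system operates on scaled bitmap symbols.

```python
import numpy as np

def centroid_templates(symbols, n_clusters=2, iters=10, seed=0):
    # Crude k-means: the surviving cluster centroids become script templates.
    rng = np.random.default_rng(seed)
    X = np.asarray(symbols, dtype=float)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return centers

def identify_script(symbols, templates_by_script):
    # Each symbol votes for the script owning its nearest template.
    votes = {}
    for s in symbols:
        best = min(templates_by_script,
                   key=lambda n: min(np.linalg.norm(s - t)
                                     for t in templates_by_script[n]))
        votes[best] = votes.get(best, 0) + 1
    return max(votes, key=votes.get)

# Toy demo: two "scripts", each with two symbol prototypes in 2-D feature space.
rng = np.random.default_rng(1)
proto_a = np.array([[0.0, 0.0], [4.0, 0.0]])
proto_b = np.array([[0.0, 6.0], [4.0, 6.0]])
train_a = np.concatenate([p + rng.normal(0, 0.2, (20, 2)) for p in proto_a])
train_b = np.concatenate([p + rng.normal(0, 0.2, (20, 2)) for p in proto_b])
templates = {"A": centroid_templates(train_a), "B": centroid_templates(train_b)}
doc = proto_a[0] + rng.normal(0, 0.2, (10, 2))  # a new document's symbols
result = identify_script(doc, templates)
```

    The pruning of minor clusters mentioned in the abstract would correspond to discarding centroids whose clusters fall below a support threshold before voting.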

  10. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, 1873 (PL XX); illustration used by eminent British textile engineer to exemplify the ultimate development in American cotton mill technology. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  11. The effects of presession manipulations on automatically maintained challenging behavior and task responding.

    PubMed

    Chung, Yi-Chieh; Cannella-Malone, Helen I

    2010-11-01

    This study examined the effects of presession exposure to attention, response blocking, attention with response blocking, and noninteraction conditions on subsequent engagement in automatically maintained challenging behavior and correct responding in four individuals with significant intellectual disabilities. Following a functional analysis, the effects of the four presession conditions were examined using multielement designs. Results varied across the four participants (e.g., presession noninteraction acted as an abolishing operation for two participants, but as an establishing operation for the other two). As such, the results both replicated and contradicted previous research examining the effects of motivating operations on automatically maintained challenging behavior. Although the results varied across participants, at least one condition resulting in a decrease in challenging behavior and an increase in correct responding was identified for each participant. These findings suggest that presession manipulations that result in decreases in subsequent automatically maintained challenging behavior and simultaneous increases in correct responding may need to be identified individually when the maintaining contingencies cannot be identified.

  12. Improved automatic adjustment of density and contrast in FCR system using neural network

    NASA Astrophysics Data System (ADS)

    Takeo, Hideya; Nakajima, Nobuyoshi; Ishida, Masamitsu; Kato, Hisatoyo

    1994-05-01

    The FCR system automatically adjusts image density and contrast by analyzing the histogram of the image data within the radiation field. The advanced image-recognition methods proposed in this paper, based on neural network technology, can improve this automatic adjustment performance. There are two methods; both use a three-layer neural network trained with back-propagation. In one method the image data are input directly to the input layer; in the other, the histogram data are input. The former is effective for imaging menus such as the shoulder joint, in which the position of the region of interest within the histogram changes with differences in positioning, and the latter is effective for imaging menus such as the pediatric chest, in which the histogram shape changes with differences in positioning. We experimentally confirmed the validity of these methods with respect to automatic adjustment performance, in comparison with conventional histogram analysis methods.
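    As a hedged sketch of the histogram-input variant, the following numpy example trains a three-layer network (input, sigmoid hidden layer, linear output) with plain back-propagation to map an 8-bin histogram to a scalar adjustment target. The network size, the synthetic target (the histogram's normalized mean), and the learning rate are assumptions; the paper's actual architecture and targets are not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 8-bin normalized histograms; the target is the
# histogram's weighted mean, a stand-in for a density/contrast correction.
X = rng.random((200, 8))
X /= X.sum(axis=1, keepdims=True)
bins = np.arange(8) / 7.0
y = (X * bins).sum(axis=1, keepdims=True)

# Three-layer network: 8 -> 6 -> 1, trained by back-propagation of MSE.
W1 = rng.normal(0, 0.5, (8, 6)); b1 = np.zeros(6)
W2 = rng.normal(0, 0.5, (6, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(500):
    h = sigmoid(X @ W1 + b1)          # hidden-layer activations
    out = h @ W2 + b2                 # linear output layer
    err = out - y
    losses.append(float((err ** 2).mean()))
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)   # backprop through the sigmoid
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

    Feeding raw image data instead of the histogram (the paper's first method) only changes the input layer; the training loop is identical.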

  13. 33 CFR 401.20 - Automatic Identification System.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...' maritime Differential Global Positioning System radiobeacon services; or (7) The use of a temporary unit... Identification System. (a) Each of the following vessels must use an Automatic Identification System (AIS... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Automatic Identification System...

  14. Automatic characterization of neointimal tissue by intravascular optical coherence tomography.

    PubMed

    Ughi, Giovanni J; Steigerwald, Kristin; Adriaenssens, Tom; Desmet, Walter; Guagliumi, Giulio; Joner, Michael; D'hooge, Jan

    2014-02-01

    Intravascular optical coherence tomography (IVOCT) is rapidly becoming the method of choice for assessing vessel healing after stent implantation due to its unique axial resolution of <20 μm. The amount of neointimal coverage is an important parameter. In addition, the characterization of neointimal tissue maturity is also important for an accurate analysis, especially in the case of drug-eluting and bioresorbable stent devices. Previous studies indicated that well-organized mature neointimal tissue appears as a high-intensity, smooth, and homogeneous region in IVOCT images, while lower-intensity signal areas might correspond to immature tissue mainly composed of acellular material. A new method for automatic neointimal tissue characterization, based on statistical texture analysis and a supervised classification technique, is presented. Algorithm training and validation used 53 IVOCT images supported by histology data from atherosclerotic New Zealand White rabbits. A pixel-wise classification accuracy of 87% and a two-dimensional region-based analysis accuracy of 92% (with sensitivity and specificity of 91% and 93%, respectively) were found, suggesting that reliable automatic characterization of neointimal tissue was achieved. This may potentially expand the clinical value of IVOCT in assessing the completeness of stent healing and speed up current analysis methodologies (which, due to their time- and energy-consuming character, are not suitable for application in large clinical trials or clinical practice), potentially allowing for wider use of IVOCT technology.
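    The combination of statistical texture features with a supervised classifier can be illustrated with first-order statistics (mean, standard deviation, histogram entropy) and a nearest-class-mean classifier. This is a simplified stand-in, not the paper's feature set or classifier; the "mature = bright and smooth, immature = darker and noisier" assumption follows the description above, and the patch sizes and intensity levels are invented for the demo.

```python
import numpy as np

def texture_features(patch):
    # First-order statistical texture descriptors for one image patch.
    p = patch.ravel()
    hist, _ = np.histogram(p, bins=16, range=(0.0, 1.0))
    hist = hist / hist.sum()
    entropy = -np.sum(hist[hist > 0] * np.log2(hist[hist > 0]))
    return np.array([p.mean(), p.std(), entropy])

def nearest_mean_classify(f, class_means):
    # Assign the class whose mean feature vector is closest.
    return min(class_means, key=lambda c: np.linalg.norm(f - class_means[c]))

rng = np.random.default_rng(0)
# "Mature": bright, smooth patches; "immature": darker, noisier patches.
mature = [rng.normal(0.8, 0.03, (16, 16)).clip(0, 1) for _ in range(30)]
immature = [rng.normal(0.4, 0.15, (16, 16)).clip(0, 1) for _ in range(30)]
means = {
    "mature": np.mean([texture_features(p) for p in mature], axis=0),
    "immature": np.mean([texture_features(p) for p in immature], axis=0),
}
test_patch = rng.normal(0.8, 0.03, (16, 16)).clip(0, 1)
label = nearest_mean_classify(texture_features(test_patch), means)
```

    A pixel-wise version, as evaluated in the paper, would slide such a feature window over the image and classify each pixel from its neighbourhood statistics.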

  15. Impact of basal inferolateral scar burden determined by automatic analysis of 99mTc-MIBI myocardial perfusion SPECT on the long-term prognosis of cardiac resynchronization therapy.

    PubMed

    Morishima, Itsuro; Okumura, Kenji; Tsuboi, Hideyuki; Morita, Yasuhiro; Takagi, Kensuke; Yoshida, Ruka; Nagai, Hiroaki; Tomomatsu, Toshiro; Ikai, Yoshihiro; Terada, Kazushi; Sone, Takahito; Murohara, Toyoaki

    2017-04-01

    Left-ventricular (LV) scarring may be associated with a poor response to cardiac resynchronization therapy (CRT). The automatic analysis of myocardial perfusion single-photon emission computed tomography (MP-SPECT) may provide objective quantification of LV scarring. We investigated the impact of LV scarring determined by an automatic analysis of MP-SPECT on short-term LV volume response as well as long-term outcome. We studied 51 consecutive patients who were eligible to undergo 99mTc-MIBI MP-SPECT both at baseline and 6 months after CRT (ischaemic cardiomyopathies 31%). Quantitative perfusion SPECT was used to evaluate the defect extent (an index of global scarring) and the LV 17-segment regional uptake ratio (an inverse index of regional scar burden). The primary outcome was the composite of overall mortality or first hospitalization for worsening heart failure. A high global scar burden and a low mid/basal inferolateral regional uptake ratio were associated with volume non-response to CRT at 6 months. The basal inferolateral regional uptake ratio remained a predictor of volume non-response after adjusting for the type of cardiomyopathy. During a median follow-up of 36.1 months, the outcome occurred in 28 patients. Patients with a basal inferolateral regional uptake ratio below a cutoff value of 57% showed poor prognosis (log-rank P = 0.006). Scarring determined by automatic analysis of MP-SPECT images may predict a poor response to CRT regardless of the pathogenesis of cardiomyopathy. The basal inferolateral scar burden in particular may have an adverse impact on long-term prognosis.

  16. Automatic blood vessel based-liver segmentation using the portal phase abdominal CT

    NASA Astrophysics Data System (ADS)

    Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen

    2018-02-01

    Liver segmentation is the basis for computer-based planning of hepatic surgical interventions, and automatic segmentation of the liver is highly important for the diagnosis and analysis of hepatic diseases and for surgery planning. Blood vessels (BVs) have shown high utility for liver segmentation. In our previous work, we developed a semi-automatic method that segments the liver from portal-phase abdominal CT images in two stages. The first stage was interactive segmentation of abdominal blood vessels (ABVs) and their subsequent classification into hepatic (HBVs) and non-hepatic (non-HBVs). This stage required five interactions: selecting a threshold for bone segmentation, selecting two seed points for kidney segmentation, selecting the inferior vena cava (IVC) entrance for starting ABV segmentation, identifying the portal vein (PV) entrance to the liver, and identifying the IVC exit for classifying HBVs from other ABVs (non-HBVs). The second stage is automatic segmentation of the liver based on the segmented ABVs, as described in [4]. Toward full automation of our method, we developed a method [5] that segments ABVs automatically, removing the first three interactions. In this paper, we propose fully automatic classification of ABVs into HBVs and non-HBVs, and consequently full automation of the liver segmentation proposed in [4]. Results illustrate that the method is effective at segmenting the liver from portal-phase abdominal CT images.
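    Seed-based growing of a connected structure, the basic operation behind the interactive stages described above (kidney seeds, IVC entrance), can be sketched as a breadth-first region grow in numpy. This is a generic 2-D toy, not the authors' vessel-classification algorithm; the image, seed, and tolerance are invented for the demo.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol):
    # Grow a region from `seed`, accepting 4-neighbours whose intensity
    # lies within `tol` of the seed intensity.
    h, w = img.shape
    ref = img[seed]
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                    and abs(img[ny, nx] - ref) <= tol:
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

# Synthetic slice: a bright "organ" blob on a dark background.
img = np.zeros((30, 30))
img[8:22, 8:22] = 1.0
mask = region_grow(img, seed=(15, 15), tol=0.5)
```

    Full automation, as proposed in the paper, replaces the manually chosen seeds and thresholds with anatomically derived ones; the growing step itself is unchanged.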

  17. Vehicle-to-Grid Automatic Load Sharing with Driver Preference in Micro-Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yubo; Nazaripouya, Hamidreza; Chu, Chi-Cheng

    Integration of Electric Vehicles (EVs) with the power grid brings not only new challenges for load management but also opportunities for distributed storage and generation. This paper comprehensively models and analyzes distributed Vehicle-to-Grid (V2G) for automatic load sharing with driver preference. In a micro-grid with limited communications, V2G EVs must decide load sharing based on their own power and voltage profiles. A droop-based controller that takes driver preference into account is proposed in this paper to address the distributed control of EVs. Simulations are designed for three fundamental V2G automatic load sharing scenarios that include all system dynamics of such applications. Simulation results demonstrate that active power sharing is achieved proportionally among V2G EVs with consideration of driver preference. In addition, the results also verify the system stability and reactive power sharing analysis in the system modelling, which sheds light on large-scale V2G automatic load sharing in more complicated cases.
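    The core of a droop-based sharing rule is that each unit sets its power from a locally measured deviation, scaled by its droop gain, so sharing happens without communication. The sketch below uses a frequency-droop form with a multiplicative preference weight; the exact control law, the preference weighting, and all parameter values here are assumptions, since the paper's controller is not specified in the abstract.

```python
def droop_power_sharing(f_measured, f_nominal, droop_gains, preferences, p_max):
    # P_i = pref_i * (f_nominal - f_measured) / droop_i, clipped to the
    # vehicle's power limit: proportional sharing weighted by driver preference.
    powers = []
    for k_i, pref_i, pmax_i in zip(droop_gains, preferences, p_max):
        p = pref_i * (f_nominal - f_measured) / k_i
        powers.append(max(-pmax_i, min(pmax_i, p)))
    return powers

# Three EVs with identical droop gains but different driver preferences.
# Under-frequency (49.9 Hz vs 50 Hz nominal) makes all of them discharge,
# in proportion to preference.
p = droop_power_sharing(49.9, 50.0,
                        droop_gains=[0.01, 0.01, 0.01],
                        preferences=[1.0, 0.5, 0.25],
                        p_max=[20.0, 20.0, 20.0])
```

    A driver who wants to preserve battery range simply sets a smaller preference weight, and the remaining units pick up proportionally more of the load.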

  18. A clinically viable capsule endoscopy video analysis platform for automatic bleeding detection

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Xie, Jean; Mui, Peter; Leighton, Jonathan A.; Pasha, Shabana; Rentz, Lauri; Abedi, Mahmood

    2013-02-01

    In this paper, we present a novel and clinically valuable software platform for automatic bleeding detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. A typical CE video of the GI tract runs about 8 hours and is manually reviewed by physicians to locate diseases such as bleeding and polyps. As a result, the process is time consuming and prone to missed findings. While researchers have made efforts to automate this process, no clinically acceptable software is available on the marketplace today. Working with our collaborators, we have developed a clinically viable software platform called GISentinel for fully automated GI tract bleeding detection and classification. Major functional modules of the software include: an innovative graph-based NCut segmentation algorithm; a unique feature selection and validation method (e.g., illumination-invariant features, color-independent features, and symmetrical texture features); and cascade SVM classification for handling various GI tract scenes (e.g., normal tissue, food particles, bubbles, fluid, and specular reflection). Initial evaluation of the software has shown a zero bleeding-instance miss rate and a 4.03% false alarm rate. This work is part of our innovative 2D/3D-based GI tract disease detection software platform. While the overall framework is designed for intelligent detection and classification of major GI tract diseases such as bleeding, ulcers, and polyps from CE videos, this paper focuses on the automatic bleeding detection module.
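    GISentinel's actual pipeline (NCut segmentation, engineered features, cascade SVMs) is far richer than any short sketch; as a minimal illustration of the simplest color cue used in bleeding detection generally, the example below scores a frame by the fraction of strongly red-dominant pixels. The thresholds and the synthetic frame are invented assumptions, and this is explicitly not the paper's method.

```python
import numpy as np

def bleeding_score(rgb):
    # Fraction of pixels whose red channel strongly dominates green and blue:
    # a crude color cue, not GISentinel's NCut + SVM pipeline.
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    dominant = (r > 120) & (r > 1.8 * g) & (r > 1.8 * b)
    return dominant.mean()

rng = np.random.default_rng(0)
normal = rng.integers(90, 130, (64, 64, 3))   # pinkish mucosa-like noise
frame = normal.copy()
frame[20:40, 20:40] = (200, 40, 30)           # simulated bleeding region
flagged = bleeding_score(frame) > 0.05
```

    Specular reflections, food particles, and bubbles defeat such a cue in practice, which is precisely why the paper builds illumination-invariant features and a cascade classifier on top of segmentation.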

  19. Analysis of biases from parallel observations of co-located manual and automatic weather stations in Indonesia

    NASA Astrophysics Data System (ADS)

    Sopaheluwakan, Ardhasena; Fajariana, Yuaning; Satyaningsih, Ratna; Aprilina, Kharisma; Astuti Nuraini, Tri; Ummiyatul Badriyah, Imelda; Lukita Sari, Dyah; Haryoko, Urip

    2017-04-01

    Inhomogeneities are often found in long records of climate data. These can occur for various reasons, such as relocation of the observation site, changes in observation method, and the transition to automated instruments. The change to automated systems is inevitable and is taking place worldwide in many National Meteorological Services. However, this shift in observational practice must be done cautiously, and a sufficient period of parallel observation of co-located manual and automated systems should take place, as suggested by the World Meteorological Organization. With a sufficient parallel observation period, biases between the two systems can be analyzed. In this study we analyze the biases from a yearlong parallel observation of manual and automatic weather stations at 30 locations in Indonesia. The sites span approximately 45 degrees of longitude from east to west, covering different climate characteristics and geographical settings. We study measurements of temperature and rainfall taken by both systems. We found that the biases between the two systems vary from place to place and depend more on the setting of the instrument than on climatic and geographical factors. For instance, daytime observations from the automatic weather stations are found to be consistently higher than the manual observations, whereas nighttime observations from the automatic weather stations are lower.
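    The basic bias computation for a parallel observation period is a paired difference, stratified here by day versus night to mirror the finding above. The data are synthetic: the diurnal temperature curve, noise levels, and the +0.4 °C day / −0.3 °C night offsets are invented assumptions standing in for one station's records.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.tile(np.arange(24), 30)  # 30 days of hourly observations

# Manual reference: a diurnal temperature cycle with observation noise.
manual = 26.0 + 4.0 * np.sin((hours - 6) * np.pi / 12) \
         + rng.normal(0, 0.3, hours.size)

# Hypothetical AWS behaviour matching the study's finding:
# reads high by day, low by night.
is_day = (hours >= 6) & (hours < 18)
aws = manual + np.where(is_day, 0.4, -0.3) + rng.normal(0, 0.3, hours.size)

# Paired bias (AWS minus manual), stratified by time of day.
bias = aws - manual
day_bias = bias[is_day].mean()
night_bias = bias[~is_day].mean()
```

    Repeating this per station across the 30 sites would reproduce the study's comparison of instrument-setting effects versus climatic and geographical factors.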

  20. A review of automatic patient identification options for public health care centers with restricted budgets.

    PubMed

    García-Betances, Rebeca I; Huerta, Mónica K

    2012-01-01

    A comparative review is presented of available technologies suitable for the automatic reading of patient identification bracelet tags. The background, characteristics, advantages, and disadvantages of existing technologies are described in relation to their possible use by public health care centers with budgetary limitations. A comparative assessment is presented of suitable automatic identification systems based on graphic codes, both one-dimensional (1D) and two-dimensional (2D), printed on labels, as well as those based on radio frequency identification (RFID) tags. The analysis looks at the tradeoffs of these technologies to provide guidance to hospital administrators looking to deploy patient identification technology. The results suggest that affordable automatic patient identification systems can be easily and inexpensively implemented using 2D codes printed on low-cost bracelet labels, which can then be read and automatically decoded by ordinary mobile smart phones. Because of the versatility and ubiquity of mobile smart phones today, the implementation and operation of 2D code technology, and especially Quick Response® (QR) Code, emerges as a very attractive alternative for automating patient identification processes in low-budget situations.