Science.gov

Sample records for automatic vol analysis

  1. Automatic Microwave Network Analysis.

    DTIC Science & Technology

    A program and procedure are developed for the automatic measurement of microwave networks using a Hewlett-Packard network analyzer and programmable calculator. The program and procedure are used in the measurement of a simple microwave two-port network. These measurements are evaluated by comparison with measurements on the same network made using other techniques. The programs...in the programmable calculator are listed in Appendix 1. The step-by-step procedure used is listed in Appendix 2. (Author)

  2. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
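
    The interval idea can be sketched in a few lines: each measured quantity x ± Δx becomes the interval [x − Δx, x + Δx], interval arithmetic propagates the bounds through the formula, and the half-width of the result bounds the propagated error. The paper uses the INTLAB toolbox; the tiny `Interval` class below is only an illustrative pure-Python stand-in.

```python
# Minimal interval-arithmetic sketch of automatic error propagation (not INTLAB).
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    @classmethod
    def from_error(cls, x, dx):
        # A measurement x +/- dx becomes the enclosing interval [x - dx, x + dx].
        return cls(x - dx, x + dx)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product interval is bounded by the extreme endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def half_width(self):
        # Half-width of the result is a guaranteed bound on the error.
        return (self.hi - self.lo) / 2

# Propagate errors through f(a, b) = a*b + a with a = 2.0 +/- 0.1, b = 3.0 +/- 0.2.
a = Interval.from_error(2.0, 0.1)
b = Interval.from_error(3.0, 0.2)
result = a * b + a
print(result.lo, result.hi)    # encloses every possible value of f
print(result.half_width())     # bound on the propagated error
```

Unlike first-order propagation formulas, the enclosure requires no derivatives, which is the "much less effort" advantage the abstract refers to.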

  3. Integrating Automatic Genre Analysis into Digital Libraries.

    ERIC Educational Resources Information Center

    Rauber, Andreas; Muller-Kogler, Alexander

    With the number and types of documents in digital library systems increasing, tools for automatically organizing and presenting the content have to be found. While many approaches focus on topic-based organization and structuring, hardly any system incorporates automatic structural analysis and representation. Yet, genre information…

  4. Automatic tools for microprocessor failure analysis

    NASA Astrophysics Data System (ADS)

    Conard, Didier; Laurent, J.; Velazco, Raoul; Ziade, Haissam; Cabestany, J.; Sala, F.

    A new approach to fault location when testing microprocessors is presented. The starting point for the backtracing analysis converging on the failure is the automatic localization of a reduced area. Automatic image comparison based on pattern recognition is performed by means of an electron beam tester. The hardware and software tools developed allow large circuit areas to be covered, offering powerful diagnosis capabilities to the user. The validation of this technique was performed on faulty 68000 microprocessors. It shows the feasibility of automating the first and most important step of failure analysis: fault location at the chip surface.

  5. Automatic emotional expression analysis from eye area

    NASA Astrophysics Data System (ADS)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed with artificial intelligence techniques. As a result of the experimental studies, the 6 universal emotions, consisting of expressions of happiness, sadness, surprise, disgust, anger and fear, were classified at a success rate of 84% using artificial neural networks.
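
    The wavelet step can be illustrated with a one-level Haar transform, the simplest discrete wavelet transform. The abstract does not name the wavelet family used, so this Haar version is only a sketch of the kind of coefficient-based feature extraction involved.

```python
import math

def haar_dwt_1d(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient lists for an even-length input."""
    s = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

# A toy "eye-area intensity row"; in practice rows/columns of the cropped eye
# image would be transformed and statistics of the coefficients used as features.
row = [10, 10, 12, 8, 9, 11, 10, 10]
approx, detail = haar_dwt_1d(row)
features = [sum(abs(c) for c in detail)]   # e.g. total detail magnitude as one feature
print(approx, detail, features)
```

The detail coefficients respond to local contrast (eyelid edges, wrinkling), which is why wavelet coefficients are a plausible input to the neural-network classifier described above.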

  6. Automatic Prosodic Analysis to Identify Mild Dementia

    PubMed Central

    Gonzalez-Moreira, Eduardo; Torres-Boza, Diana; Kairuz, Héctor Arturo; Ferrer, Carlos; Garcia-Zamora, Marlene; Espinoza-Cuadros, Fernando; Hernandez-Gómez, Luis Alfonso

    2015-01-01

    This paper describes an exploratory technique to identify mild dementia by assessing the degree of speech deficits. A total of twenty participants were used for this experiment: ten patients with a diagnosis of mild dementia and ten healthy controls. The audio session for each subject was recorded following a methodology developed for the present study. Prosodic features in patients with mild dementia and healthy elderly controls were measured using automatic prosodic analysis on a reading task. A novel method was carried out to gather twelve prosodic features over speech samples. The best classification rate achieved was 85% accuracy using four prosodic features. The results attained show that the proposed computational speech analysis offers a viable alternative for automatic identification of dementia features in elderly adults. PMID:26558287

  7. Automatic analysis of D-partition

    NASA Astrophysics Data System (ADS)

    Bogaevskaya, V. G.

    2017-01-01

    The paper is dedicated to the automation of D-partition analysis. D-partition is one of the most common methods for determining solution stability in systems with time-delayed feedback control and its dependency on the values of the control parameters. A transition from the analytical form of the D-partition to a plane graph has been investigated. An algorithm has been developed for determining the faces of the graph and for counting the characteristic-equation roots with positive real part in the corresponding region of the D-partition. The algorithm keeps information about the analytical formulas for the edges of the faces, which allows further analytical research based on the results of the computer analysis.

  8. Research on automatic human chromosome image analysis

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Tian, Jinwen; Liu, Jian

    2007-11-01

    Human chromosome karyotyping is one of the essential tasks in cytogenetics, especially in genetic syndrome diagnosis. In this paper, an automatic procedure is introduced for human chromosome image analysis. According to the different states of touching and overlapping chromosomes, several segmentation methods are proposed to achieve the best results. The medial axis is extracted by the middle-point algorithm. The chromosome band is enhanced by an algorithm based on multiscale B-spline wavelets, extracted by average gray profile, gradient profile and shape profile, and characterized by WDD (Weighted Density Distribution) descriptors. A multilayer classifier is used for classification. Experimental results demonstrate that the algorithms perform well.

  9. Semi-automatic analysis of fire debris

    PubMed

    Touron; Malaquin; Gardebas; Nicolai

    2000-05-08

    Automated analysis of fire residues involves a strategy that deals with the wide variety of criminalistic samples received. Because the concentration of accelerant in a sample is unknown and the range of flammable products is wide, full attention from the analyst is required. Primary detection with a photoionization detector resolves the first problem by determining the right method to use: either the less sensitive classical head-space determination, or adsorption on an active-charcoal tube, a method better suited to low concentrations. The latter method is suitable for automatic thermal desorption (ATD400), avoiding any risk of cross-contamination. A PONA column (50 m x 0.2 mm i.d.) allows the separation of volatile hydrocarbons from C(1) to C(15) and the updating of a database. A specific second column is used for heavy hydrocarbons. Heavy products (C(13) to C(40)) were extracted from residues using a very small amount of pentane, concentrated to 1 ml at 50 degrees C and then placed on an automatic carousel. Comparison of flammables with reference chromatograms provided the expected identification, with mass spectrometry used where needed. This analytical strategy belongs to the IRCGN quality program, resulting in the analysis of 1500 samples per year by two technicians.

  10. Automatic analysis of speckle photography fringes.

    PubMed

    Buendía, M; Cibrián, R; Salvador, R; Roldán, C; Iñesta, J M

    1997-04-10

    Speckle interferometry is a technique well suited to metrological problems such as the measurement of object deformation. An automatic system for the analysis of such measurements is presented; it consists of a motorized x-y plate positioner controlled by computer, a CCD video camera, and software for image analysis. A fringe-recognition algorithm determines the spacing and orientation of the fringes and permits the calculation of the magnitude and direction of the displacement of the analyzed object point in images with variable degrees of illumination. For a 256 x 256 pixel image resolution, the procedure allows one to analyze from three fringes up to a number of fringes that corresponds to 3 pixels/fringe.

  11. Automatic cortical thickness analysis on rodent brain

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek

    2011-03-01

    Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the difference in scale; even adult rodent brains are 50 to 100 times smaller than human brains. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step separates the neocortex from other brain structures and thus is a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm, which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate-spline-based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood, and performed a t-test between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
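
    The Laplace-based thickness measurement can be illustrated on a toy flat strip, where the correct answer is simply the strip height: solve Laplace's equation between the inner surface (potential 0) and the outer surface (potential 1), then integrate along the gradient to obtain the streamline length. This is a much-simplified 2-D sketch of the idea, not the authors' implementation (which works on segmented 3-D volumes and uses a transport equation for the length).

```python
def solve_laplace(rows, cols, iters=3000):
    """Gauss-Seidel solve of Laplace's equation on a flat strip.
    Dirichlet boundaries: inner surface (row 0) = 0, outer surface = 1,
    side columns fixed to the linear profile as a stand-in for no-flux sides."""
    u = [[0.5 for _ in range(cols)] for _ in range(rows)]
    for c in range(cols):
        u[0][c], u[rows - 1][c] = 0.0, 1.0
    for r in range(rows):
        u[r][0] = u[r][cols - 1] = r / (rows - 1)
    for _ in range(iters):
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                u[r][c] = 0.25 * (u[r-1][c] + u[r+1][c] + u[r][c-1] + u[r][c+1])
    return u

def streamline_length(u, c0, step=0.05):
    """Trace a streamline from the inner surface along the potential gradient
    (the analogue of a cortical column) and accumulate its arc length."""
    rows, cols = len(u), len(u[0])
    r, c, length = 0.0, float(c0), 0.0
    while r < rows - 1:
        ri = max(1, min(rows - 2, int(round(r))))
        ci = max(1, min(cols - 2, int(round(c))))
        gr = (u[ri + 1][ci] - u[ri - 1][ci]) / 2.0   # central-difference gradient
        gc = (u[ri][ci + 1] - u[ri][ci - 1]) / 2.0
        norm = (gr * gr + gc * gc) ** 0.5
        r += step * gr / norm
        c += step * gc / norm
        length += step
    return length

u = solve_laplace(11, 11)
thickness = streamline_length(u, c0=5)
print(thickness)   # ~10 grid units for a flat strip 11 rows tall
```

On curved cortical geometry the streamlines bend with the potential field, which is exactly why this construction generalizes better than a straight-line distance.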

  12. Automatic analysis of the corneal ulcer

    NASA Astrophysics Data System (ADS)

    Ventura, Liliane; Chiaradia, Caio; Faria de Sousa, Sidney J.

    1999-06-01

    A very common disease in agricultural countries is the corneal ulcer. Particularly in the public hospitals, several patients come every week presenting this kind of pathology. One of the most important features in diagnosing the regression of the disease is determining the shrinkage of the affected area. An automatic system (optical system and software), attached to a slit lamp, has been developed to determine the area of the ulcer automatically and to follow up its regression. The clinical procedure to isolate the ulcer is still done, but the measurement is fast enough not to cause the discomfort to the patient that the traditional evaluation does. The system has been used for the last 6 months in a hospital that sees about 80 patients per week presenting corneal ulcer. Patient follow-up (an indispensable criterion for curing the disease) has been improved by the system and has helped guarantee the success of the treatment.

  13. Automatism

    PubMed Central

    McCaldon, R. J.

    1964-01-01

    Individuals can carry out complex activity while in a state of impaired consciousness, a condition termed “automatism”. Consciousness must be considered from both an organic and a psychological aspect, because impairment of consciousness may occur in both ways. Automatism may be classified as normal (hypnosis), organic (temporal lobe epilepsy), psychogenic (dissociative fugue) or feigned. Often painstaking clinical investigation is necessary to clarify the diagnosis. There is legal precedent for assuming that all crimes must embody both consciousness and will. Jurists are loath to apply this principle without reservation, as this would necessitate acquittal and release of potentially dangerous individuals. However, with the sole exception of the defence of insanity, there is at present no legislation to prohibit release without further investigation of anyone acquitted of a crime on the grounds of “automatism”. PMID:14199824

  14. Functional analysis screening for problem behavior maintained by automatic reinforcement.

    PubMed

    Querim, Angie C; Iwata, Brian A; Roscoe, Eileen M; Schlichenmeyer, Kevin J; Ortega, Javier Virués; Hurl, Kylee E

    2013-01-01

    A common finding in previous research is that problem behavior maintained by automatic reinforcement continues to occur in the alone condition of a functional analysis (FA), whereas behavior maintained by social reinforcement typically is extinguished. Thus, the alone condition may represent an efficient screening procedure when maintenance by automatic reinforcement is suspected. We conducted a series of 5-min alone (or no-interaction) probes for 30 cases of problem behavior and compared initial predictions of maintenance or extinction to outcomes obtained in subsequent FAs. Results indicated that data from the screening procedure accurately predicted that problem behavior was maintained by automatic reinforcement in 21 of 22 cases and by social reinforcement in 7 of 8 cases. Thus, results of the screening accurately predicted the function of problem behavior (social vs. automatic reinforcement) in 28 of 30 cases.

  15. Automatic ionospheric layers detection: Algorithms analysis

    NASA Astrophysics Data System (ADS)

    Molina, María G.; Zuccheretti, Enrico; Cabrera, Miguel A.; Bianchi, Cesidio; Sciacca, Umberto; Baskaradas, James

    2016-03-01

    Vertical sounding is a widely used technique for obtaining ionosphere measurements, such as an estimation of virtual height versus scanned frequency. It is performed by a high-frequency radar for geophysical applications called an "ionospheric sounder" (or "ionosonde"). Radar detection depends mainly on target characteristics. While the behavior of several types of target and the corresponding echo-detection algorithms have been studied, a survey to identify a suitable algorithm for the ionospheric sounder still has to be carried out. This paper focuses on automatic echo-detection algorithms implemented specifically for an ionospheric sounder; target-specific characteristics were studied as well. Adaptive threshold detection algorithms are proposed, compared to the currently implemented algorithm, and tested using actual data obtained from the Advanced Ionospheric Sounder (AIS-INGV) at the Rome Ionospheric Observatory. Different cases of study were selected according to typical ionospheric and detection conditions.
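
    A common form of adaptive threshold detection is cell averaging: each range sample is compared against a multiple of the mean of its neighbourhood, with guard cells excluded so a strong echo does not inflate its own noise estimate. The abstract does not specify which adaptive scheme the authors adopted, so the following is a generic sketch only.

```python
def adaptive_threshold_detect(profile, window=8, guard=2, k=3.0):
    """Cell-averaging adaptive threshold: sample i is declared an echo when it
    exceeds k times the mean of its neighbourhood (guard cells excluded)."""
    detections = []
    n = len(profile)
    for i in range(n):
        cells = [profile[j]
                 for j in range(max(0, i - window), min(n, i + window + 1))
                 if abs(j - i) > guard]            # local noise estimate
        noise = sum(cells) / len(cells)
        if profile[i] > k * noise:
            detections.append(i)
    return detections

# Synthetic range profile: flat noise floor ~1 with an echo at sample 30,
# mimicking a single ionospheric reflection at one virtual height.
profile = [1.0] * 60
profile[30] = 12.0
print(adaptive_threshold_detect(profile))   # → [30]
```

Because the threshold tracks the local noise level, the same detector copes with the varying background conditions mentioned in the abstract, unlike a single fixed threshold.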

  16. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell (RBC) aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted for routine use in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  17. A hierarchical structure for automatic meshing and adaptive FEM analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Saxena, Mukul; Perucchio, Renato

    1987-01-01

    A new algorithm for generating automatically, from solid models of mechanical parts, finite element meshes that are organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work) is discussed. Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized and some results from an experimental closed loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively are presented. The implementation of 3-D work is briefly discussed.

  18. Automatic functional analysis of left ventricle in cardiac cine MRI

    PubMed Central

    Lu, Ying-Li; Connelly, Kim A.; Dick, Alexander J.; Wright, Graham A.

    2013-01-01

    Rationale and objectives A fully automated left ventricle segmentation method for the functional analysis of cine short axis (SAX) magnetic resonance (MR) images was developed, and its performance evaluated with 133 studies of subjects with diverse pathology: ischemic heart failure (n=34), non-ischemic heart failure (n=30), hypertrophy (n=32), and healthy (n=37). Materials and methods The proposed automatic method locates the left ventricle (LV), then for each image detects the contours of the endocardium, epicardium, papillary muscles and trabeculations. Manually and automatically determined contours and functional parameters were compared quantitatively. Results There was no significant difference between automatically and manually determined end systolic volume (ESV), end diastolic volume (EDV), ejection fraction (EF) and left ventricular mass (LVM) for each of the four groups (paired sample t-test, α=0.05). The automatically determined functional parameters showed high correlations with those derived from manual contours, and the Bland-Altman analysis biases were small (1.51 mL, 1.69 mL, –0.02%, –0.66 g for ESV, EDV, EF and LVM, respectively). Conclusions The proposed technique automatically and rapidly detects endocardial, epicardial, papillary muscles’ and trabeculations’ contours providing accurate and reproducible quantitative MRI parameters, including LV mass and EF. PMID:24040616
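
    The Bland-Altman figures quoted above are the mean of the pairwise method differences (the bias) together with limits of agreement at ±1.96 standard deviations of those differences. A small stdlib-only sketch with hypothetical volume pairs (the numbers below are illustrative, not the study's data):

```python
import statistics

def bland_altman(manual, automatic):
    """Bland-Altman agreement between two measurement methods:
    returns the bias (mean difference) and the 95% limits of agreement."""
    diffs = [a - m for m, a in zip(manual, automatic)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical end-systolic volumes (mL) from manual and automatic contours.
manual    = [60.0, 75.0, 50.0, 90.0, 68.0]
automatic = [61.5, 76.0, 51.0, 92.0, 69.5]
bias, (lo, hi) = bland_altman(manual, automatic)
print(round(bias, 2), round(lo, 2), round(hi, 2))   # → 1.4 0.58 2.22
```

A small bias with narrow limits, as reported in the abstract (e.g. 1.51 mL for ESV), indicates the automatic contours can substitute for manual ones.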

  19. Trends of Science Education Research: An Automatic Content Analysis

    ERIC Educational Resources Information Center

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  20. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    PubMed

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
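
    Heartbeat-rate estimation of this kind can be sketched as counting upward mean-crossings of the periodic intensity trace extracted from the heart region of the video; the paper's pipeline is more elaborate (it also measures beat-to-beat intervals and arrhythmicity), so this is only illustrative.

```python
import math

def heart_rate_bpm(signal, fps):
    """Estimate beats per minute from a periodic intensity trace by counting
    upward crossings of the mean level (one per cardiac cycle)."""
    mean = sum(signal) / len(signal)
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a < mean <= b)
    duration_min = len(signal) / fps / 60.0
    return crossings / duration_min

# Synthetic heart-region intensity: a 2.5 Hz beat (150 bpm) sampled at 30 fps
# for 10 s; the phase offset keeps samples off the zero crossings.
fps, freq, seconds = 30, 2.5, 10
signal = [math.sin(2 * math.pi * freq * t / fps - 0.3)
          for t in range(fps * seconds)]
print(heart_rate_bpm(signal, fps))   # ≈ 150
```

Real traces are noisy, so a practical version would low-pass filter the signal first; the crossing count itself also yields the beat times needed for beat-to-beat interval analysis.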

  1. Development of an automatic identification algorithm for antibiogram analysis.

    PubMed

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. The main goal of this work is an Automatic Identification Algorithm (AIA) proposed as a solution to overcome some issues associated with the disc diffusion method. AIA allows automatic scanning of the inhibition zones obtained in antibiograms. More than 60 environmental isolates were tested using susceptibility tests performed for 12 different antibiotics, for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA and the results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based readings. Agreement was observed in 88% of cases, and 89% of the tests showed no difference or a <4 mm difference between AIA and human analysis, exhibiting a correlation index of 0.85 for all images, 0.90 for standards and 0.80 for oddities, with no significant difference between the automatic and manual methods. AIA resolved some reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for the automated reading of antibiograms in diagnostic and microbiology laboratories.
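
    The weighted kappa index used for inter-reader agreement penalizes disagreements by their distance between ordered categories, so a susceptible-vs-resistant disagreement costs more than susceptible-vs-intermediate. A minimal linearly weighted implementation; the S/I/R readings below are hypothetical, not the study's data.

```python
def weighted_kappa(rater_a, rater_b, categories, weight=lambda i, j: abs(i - j)):
    """Linearly weighted kappa between two readers over ordered categories."""
    n = len(rater_a)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    observed = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        observed[idx[a]][idx[b]] += 1.0 / n
    pa = [sum(observed[i][j] for j in range(k)) for i in range(k)]  # marginals
    pb = [sum(observed[i][j] for i in range(k)) for j in range(k)]
    # kappa = 1 - (weighted observed disagreement) / (weighted chance disagreement)
    num = sum(weight(i, j) * observed[i][j] for i in range(k) for j in range(k))
    den = sum(weight(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - num / den

# Hypothetical zone readings binned as susceptible/intermediate/resistant.
human = ["S", "S", "I", "R", "R", "S", "I", "R"]
aia   = ["S", "S", "I", "R", "I", "S", "I", "R"]
print(weighted_kappa(human, aia, ["S", "I", "R"]))   # ≈ 0.86
```

Perfect agreement gives kappa = 1; values near the study's 0.85-0.90 correlation range indicate the automatic readings track the human reference closely.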

  2. Automatic movie skimming with general tempo analysis

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Hung; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    In this research, story units are extracted by general tempo analysis, including the tempos of audio and visual information. Although many schemes have been proposed to successfully segment video data into shots using basic low-level features, how to group shots into meaningful units called story units is still a challenging problem. By focusing on a certain type of video, such as sports or news, we can explore models with specific application domain knowledge. For movie content, many heuristic rules based on audiovisual clues have been proposed with limited success. We propose a method to extract story units using general tempo analysis. Experimental results are given to demonstrate the feasibility and efficiency of the proposed technique.

  3. Automatic recognition and analysis of synapses. [in brain tissue

    NASA Technical Reports Server (NTRS)

    Ungerleider, J. A.; Ledley, R. S.; Bloom, F. E.

    1976-01-01

    An automatic system for recognizing synaptic junctions would allow analysis of large samples of tissue for the possible classification of specific well-defined sets of synapses based upon structural morphometric indices. In this paper the three steps of our system are described: (1) cytochemical tissue preparation to allow easy recognition of the synaptic junctions; (2) transmitting the tissue information to a computer; and (3) analyzing each field to recognize the synapses and make measurements on them.

  4. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.

  5. Facilitator control as automatic behavior: A verbal behavior analysis

    PubMed Central

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is automatic when the speaker or writer is not stimulated by the behavior at the time of emission, the behavior is not edited, the products of behavior differ from what the person would produce normally, and the behavior is attributed to an outside source. All of these characteristics appear to be present in facilitator behavior. Other variables seem to account for the thematic content of the typed messages. These variables also are discussed. PMID:22477083

  6. Spectral analysis methods for automatic speech recognition applications

    NASA Astrophysics Data System (ADS)

    Parinam, Venkata Neelima Devi

    In this thesis, we evaluate the front end of Automatic Speech Recognition (ASR) systems with respect to different types of spectral processing methods that are extensively used. A filter-bank approach is one of the most common methods for front-end spectral analysis. In this work we describe and evaluate spectral analysis based on Mel and Gammatone filter banks. These filtering methods are derived from auditory models and are thought to have some advantages for automatic speech recognition work. Experimentally, however, we show that direct use of FFT spectral values is just as effective as using either Mel or Gammatone filter banks, provided that the features extracted from the FFT spectral values take into account a Mel or Mel-like frequency scale. It is also shown that trajectory features based on a sliding block of spectral features, computed using either FFT or filter-bank spectral analysis, are considerably more effective, in terms of ASR accuracy, than the delta and delta-delta terms often used for ASR. Although there is no major performance disadvantage to using a filter bank, simplicity of analysis is a reason to eliminate this step in speech processing. These assertions hold for both clean and noisy speech.
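
    The Mel warping the thesis refers to is the standard mapping mel = 2595·log10(1 + f/700): spacing filter centers evenly on the Mel scale spaces them increasingly widely in Hz, matching the ear's coarser resolution at high frequencies. A small sketch of that spacing (filter count and band edges here are arbitrary choices, not the thesis's configuration):

```python
import math

def hz_to_mel(f):
    """HTK-style Mel-scale mapping used to warp FFT bin frequencies."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    """Inverse of hz_to_mel."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

# Center frequencies for a bank of 10 filters spaced evenly on the Mel scale
# over 0-8000 Hz: equal Mel spacing gives growing spacing in Hz.
lo, hi, n = 0.0, 8000.0, 10
mels = [hz_to_mel(lo) + i * (hz_to_mel(hi) - hz_to_mel(lo)) / (n + 1)
        for i in range(1, n + 1)]
centers = [mel_to_hz(m) for m in mels]
print([round(c) for c in centers])
```

The thesis's point is that applying this same warp to raw FFT values captures most of the benefit, making the explicit filter-bank stage optional.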

  7. Rapid automatic keyword extraction for information retrieval and analysis

    DOEpatents

    Rose, Stuart J. [Richland, WA]; Cowley, Wendy E. [Richland, WA]; Crow, Vernon L. [Richland, WA]; Cramer, Nicholas O. [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
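
    The scoring described above can be sketched directly: candidate phrases are the runs of words between stop words, each word receives a degree/frequency score, and a phrase scores the sum of its word scores. A minimal sketch of this style of extraction (the patented method covers more variants, e.g. delimiter handling; the stop-word list here is truncated for brevity):

```python
import re

STOP = {"a", "an", "the", "of", "and", "or", "for", "is", "are", "in", "on",
        "to", "with", "that", "this", "by", "be", "can", "as", "at", "from"}

def rake_keywords(text):
    """RAKE-style scoring: split into candidate phrases at stop words
    (punctuation handling omitted for brevity), score each word by
    degree/frequency, and score a phrase as the sum of its word scores."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in STOP:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    freq, degree = {}, {}
    for phrase in phrases:
        for w in phrase:
            freq[w] = freq.get(w, 0) + 1
            degree[w] = degree.get(w, 0) + len(phrase)  # co-occurrence degree
    scores = {w: degree[w] / freq[w] for w in freq}
    return sorted(((sum(scores[w] for w in p), " ".join(p)) for p in phrases),
                  reverse=True)

text = ("Rapid automatic keyword extraction parses words in a document by "
        "delimiters and stop words to identify candidate keywords.")
for score, phrase in rake_keywords(text)[:3]:
    print(round(score, 2), phrase)
```

Degree/frequency rewards words that appear inside long candidate phrases, which is why multi-word technical terms float to the top of the ranking.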

  8. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm that classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  9. Entropy analysis of OCT signal for automatic tissue characterization

    NASA Astrophysics Data System (ADS)

    Wang, Yahui; Qiu, Yi; Zaki, Farzana; Xu, Yiqing; Hubbi, Basil; Belfield, Kevin D.; Liu, Xuan

    2016-03-01

    Optical coherence tomography (OCT) signal can provide microscopic characterization of biological tissue and assist clinical decision making in real-time. However, raw OCT data is noisy and complicated. It is challenging to extract information that is directly related to the pathological status of tissue through visual inspection on huge volume of OCT signal streaming from the high speed OCT engine. Therefore, it is critical to discover concise, comprehensible information from massive OCT data through novel strategies for signal analysis. In this study, we perform Shannon entropy analysis on OCT signal for automatic tissue characterization, which can be applied in intraoperative tumor margin delineation for surgical excision of cancer. The principle of this technique is based on the fact that normal tissue is usually more structured with higher entropy value, compared to pathological tissue such as cancer tissue. In this study, we develop high-speed software based on graphic processing units (GPU) for real-time entropy analysis of OCT signal.
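
    The entropy computation itself is straightforward: histogram the OCT amplitudes and take the Shannon entropy of the bin probabilities. A stdlib-only sketch on synthetic 1-D signals, where a homogeneous region collapses into one histogram bin (zero entropy) while a layered, structured region spreads over several bins (higher entropy, the direction the abstract argues for normal tissue). The GPU implementation in the paper computes the same quantity over real A-lines.

```python
import math

def shannon_entropy(samples, bins=16):
    """Shannon entropy (bits) of the amplitude histogram of a 1-D signal."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0          # constant signal -> single bin
    counts = [0] * bins
    for s in samples:
        counts[min(bins - 1, int((s - lo) / width))] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

homogeneous = [0.5] * 256                    # flat region: one amplitude level
layered = [0.1, 0.4, 0.8, 0.2] * 64          # layered region: several levels
print(shannon_entropy(homogeneous), shannon_entropy(layered))
```

Thresholding this scalar per A-line or per patch is what turns the raw OCT stream into the concise tissue-characterization map the abstract describes.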

  10. The Romanian-English Contrastive Analysis Project; Further Developments in Contrastive Studies, Vol. 5.

    ERIC Educational Resources Information Center

    Chitoran, Dumitru, Ed.

    The fifth volume in this series contains ten articles dealing with various aspects of Romanian-English contrastive analysis. They are: "Theoretical Interpretation and Methodological Consequences of 'REGULARIZATION'," by Tatiana Slama-Cazacu; "On Error Analysis," by Charles M. Carlton; "The Contrastive Hypothesis in Second Language Acquisition," by…

  11. Entropy analysis of automatic sequences revisited: An entropy diagnostic for automaticity

    NASA Astrophysics Data System (ADS)

    Karamanos, Kostas

    2001-06-01

    We give a necessary entropy condition, valid for all automatic sequences read by lumping. We next establish new entropic decimation schemes for the Thue-Morse, the Rudin-Shapiro and the paperfolding sequences read by lumping.
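
    The flavor of this result can be illustrated numerically. The sketch below (an illustration, not the paper's decimation schemes) computes the empirical entropy of Thue-Morse blocks read by lumping, i.e. split into non-overlapping blocks; for block lengths that are powers of two the lumped blocks take only two values, so the entropy stays at one bit.

```python
import math
from collections import Counter

def thue_morse(n):
    """First n terms of the Thue-Morse sequence: t(k) = parity of the 1-bits of k."""
    return [bin(k).count("1") % 2 for k in range(n)]

def block_entropy_lumping(seq, block):
    """Shannon entropy (bits) of the non-overlapping ('lumped') blocks of a sequence."""
    blocks = [tuple(seq[i:i + block]) for i in range(0, len(seq) - block + 1, block)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

seq = thue_morse(4096)
for b in (1, 2, 4, 8):
    print(b, block_entropy_lumping(seq, b))
```

A random binary sequence would instead show entropy growing roughly linearly in the block length, which is what makes such entropy readings a diagnostic for automaticity.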

  12. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  14. DWT features performance analysis for automatic speech recognition of Urdu.

    PubMed

    Ali, Hazrat; Ahmad, Nasir; Zhou, Xianwei; Iqbal, Khalid; Ali, Sahibzada Muhammad

    2014-01-01

    This paper presents work on Automatic Speech Recognition of the Urdu language, using a comparative analysis of Discrete Wavelet Transform (DWT) based features and Mel-Frequency Cepstral Coefficients (MFCC). These features have been extracted for one hundred isolated words of Urdu, each word uttered by ten different speakers. The words have been selected from the most frequently used words of Urdu. A variety of ages and dialects has been covered by using a balanced corpus approach. After extraction of the features, classification has been achieved using Linear Discriminant Analysis. After the classification task, the confusion matrix obtained for the DWT features has been compared with the one obtained for MFCC-based speech recognition. The framework has been trained and tested on speech data recorded under controlled environments. The experimental results are useful for determining the optimum features for the speech recognition task.
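
    The feature-extraction side can be sketched as follows. The abstract does not name the mother wavelet, so the Haar wavelet is used here purely for illustration, with per-level detail-coefficient energy as a compact feature vector; the LDA classification stage is omitted.

```python
def haar_dwt(signal):
    """One level of the Haar DWT: approximation and detail coefficients."""
    s = 0.5 ** 0.5
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal) - 1, 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def dwt_features(signal, levels=2):
    """Energy of the detail coefficients per level -- a compact DWT feature vector."""
    feats = []
    for _ in range(levels):
        signal, detail = haar_dwt(signal)
        feats.append(sum(d * d for d in detail))
    return feats

print(dwt_features([1, 3, 2, 4, 5, 7, 6, 8]))
```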

  15. System for the Analysis of Global Energy Markets - Vol. II, Model Documentation

    EIA Publications

    2003-01-01

    The second volume provides a data implementation guide that lists all naming conventions and model constraints. In addition, Volume 1 has two appendixes that provide a schematic of the System for the Analysis of Global Energy Markets (SAGE) structure and a listing of the source code, respectively.

  16. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  17. Automatic analysis for neuron by confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Satou, Kouhei; Aoki, Yoshimitsu; Mataga, Nobuko; Hensh, Takao K.; Taki, Katuhiko

    2005-12-01

    The aim of this study is to develop a system that recognizes both the macro- and microscopic configurations of nerve cells and automatically performs the necessary 3-D measurements and functional classification of spines. The acquisition of 3-D images of cranial nerves has been enabled by the use of a confocal laser scanning microscope, although the highly accurate 3-D measurements of the microscopic structures of cranial nerves and their classification based on their configurations have not yet been accomplished. In this study, in order to obtain highly accurate measurements of the microscopic structures of cranial nerves, existing positions of spines were predicted by the 2-D image processing of tomographic images. Next, based on the positions that were predicted on the 2-D images, the positions and configurations of the spines were determined more accurately by 3-D image processing of the volume data. We report the successful construction of an automatic analysis system that uses a coarse-to-fine technique to analyze the microscopic structures of cranial nerves with high speed and accuracy by combining 2-D and 3-D image analyses.

  18. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    SciTech Connect

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat,; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  19. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    PubMed

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has repeatedly been shown to be a sensitive and specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure, as well as flow cytometric approaches, have been discussed. ROBIAS (Robotic Image Analysis System), for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes, was developed at Novartis, where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as the cytotoxicity parameter and for micronucleus scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility of analysing 24 slides within 65 h by automatic analysis over the weekend, together with the high reproducibility of the results, makes automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  20. Differentiation of normal and disturbed sleep by automatic analysis.

    PubMed

    Hasan, J

    1983-01-01

    stage classification could be used for the differentiation between normal and disturbed sleep. In the present work only EEG waveform parameters and body movement activity were studied with this in mind. It was found that sleep can satisfactorily be classified in stages by automatic analysis if it is not markedly disturbed. The percentage agreement obtained for the three groups having practically normal sleep (young normals appr. 80%, older normals 77% and anonymous alcoholics 75%) was satisfactory and sufficient for clinical and experimental work.(ABSTRACT TRUNCATED AT 400 WORDS)

  1. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
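
    The parameter measurement on a time-intensity profile can be sketched in isolation. The abstract does not list the specific parameters measured; peak enhancement, time-to-peak, and maximum upslope are common semi-quantitative choices and are assumptions here, as are the sample values.

```python
def perfusion_parameters(times, intensities):
    """Semi-quantitative parameters from one pixel's time-intensity profile."""
    baseline = intensities[0]
    peak_idx = max(range(len(intensities)), key=lambda i: intensities[i])
    upslopes = [(intensities[i + 1] - intensities[i]) / (times[i + 1] - times[i])
                for i in range(len(times) - 1)]
    return {"peak_enhancement": intensities[peak_idx] - baseline,
            "time_to_peak": times[peak_idx] - times[0],
            "max_upslope": max(upslopes)}

# First-pass contrast curve (seconds, arbitrary signal units); values illustrative
print(perfusion_parameters([0, 1, 2, 3, 4, 5], [10, 12, 30, 55, 50, 45]))
```

In the described system such parameters, computed per myocardial position, would be rendered as color overlays on the original images.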

  2. Automatic measuring device for octave analysis of noise

    NASA Technical Reports Server (NTRS)

    Memnonov, D. L.; Nikitin, A. M.

    1973-01-01

    An automatic decoder is described that counts noise levels by pulse counters and forms audio signals proportional in duration to the total noise level or to one of the octave noise levels. Automatic tenfold repetition of the measurement cycle is provided at each measurement point before the transition to a new point is made.

  3. Shape analysis for an automatic oyster grading system

    NASA Astrophysics Data System (ADS)

    Lee, Dah-Jye; Xu, Xiaoqian; Lane, Robert M.; Zhan, Pengcheng

    2004-12-01

    An overview of the oyster industry in the U.S., with emphasis on Virginia, shows that oyster grading occurs at harvest, wholesale, and processing markets. Currently whole oysters, also called shellstock, are graded manually by screening and sorting based on diameter or weight. The majority of oysters harvested for the processing industry are divided into three to four main grades: small, medium, large, and selects. We have developed a shape analysis method for an automatic oyster grading system. The system first detects and removes poor-quality oysters, such as those with banana shapes, broken shells, and other irregular shapes. Good-quality oysters move on to be graded as small, medium, or large. The contours of the oysters are extracted for shape analysis. Banana shapes and broken shells have specific shape flaws compared with good-quality oysters. Global shape properties such as compactness, roughness, and elongation are suitable and useful for measuring these flaws. Image projection area or the length of the major axis is measured as a global property for sizing. Incorporating a machine vision system for grading, sorting, and counting oysters reduces operating costs. The savings produced by reducing labor, increasing accuracy in size, grade, and count, and providing real-time accurate data for accounting and billing would contribute to the profit of the oyster industry.
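
    The global shape properties mentioned above have simple closed forms. Below is a hedged sketch using compactness (perimeter squared over 4*pi*area, which is 1.0 for a circle and grows for irregular outlines) and elongation (major/minor axis ratio); the exact descriptors and grading thresholds of the authors' system are not given in the abstract.

```python
import math

def shape_descriptors(area, perimeter, major_axis, minor_axis):
    """Compactness (1.0 for a circle, larger for irregular outlines) and elongation."""
    compactness = perimeter ** 2 / (4.0 * math.pi * area)
    elongation = major_axis / minor_axis
    return compactness, elongation

# A circle of radius 10 vs. a thin 40 x 5 rectangle (a "banana-like" outline)
print(shape_descriptors(math.pi * 100, 2 * math.pi * 10, 20, 20))
print(shape_descriptors(40 * 5, 2 * (40 + 5), 40, 5))
```

A grading system would threshold such descriptors to flag banana shapes and broken shells before sizing the remaining oysters.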

  4. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  5. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.
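
    The model-selection step can be illustrated in isolation. Below is a minimal sketch of choosing among candidate kernel models with the Akaike information criterion under a least-squares noise model; the candidate names, residual sums of squares, and parameter counts are hypothetical, not values from the paper.

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a least-squares kernel fit:
    n target-image pixels, k free kernel parameters, residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

def select_kernel(candidates, n):
    """Return the (name, rss, n_params) candidate with the lowest AIC."""
    return min(candidates, key=lambda c: aic(c[1], n, c[2]))

# Three hypothetical delta-basis kernel models fit to the same n = 1000 pixels:
# larger kernels fit marginally better but pay a complexity penalty.
models = [("delta-3x3", 520.0, 9), ("delta-5x5", 515.0, 25), ("delta-7x7", 514.0, 49)]
print(select_kernel(models, 1000)[0])  # delta-3x3
```

The penalty term is what lets the criterion prefer the simplest kernel that fits sufficiently well, which matches the selection principle described in the abstract.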

  6. Automatic Video Analysis for Obstructive Sleep Apnea Diagnosis

    PubMed Central

    Abad, Jorge; Muñoz-Ferrer, Aida; Cervantes, Miguel Ángel; Esquinas, Cristina; Marin, Alicia; Martínez, Carlos; Morera, Josep; Ruiz, Juan

    2016-01-01

    Study Objectives: We investigated the diagnostic accuracy of a noninvasive technology based on image processing (SleepWise) for the identification of obstructive sleep apnea (OSA) and its severity. Methods: This is an observational, prospective study to evaluate the degree of agreement between polysomnography (PSG) and SleepWise. We recruited 56 consecutive subjects with suspected OSA who were referred as outpatients to the Sleep Unit of the Hospital Universitari Germans Trias i Pujol (HUGTiP) from January 2013 to January 2014. All patients underwent laboratory PSG and image processing with SleepWise simultaneously on the same night. The PSG and SleepWise analyses were carried out independently and blindly. Results: We analyzed 50 of the 56 patients recruited. OSA was diagnosed through PSG in a total of 44 patients (88%) with a median apnea-hypopnea index (AHI) of 25.35 (24.9). According to SleepWise, 45 patients (90%) met the criteria for a diagnosis of OSA, with a median AHI of 22.8 (22.03). An analysis of the ability of PSG and SleepWise to classify patients by severity on the basis of their AHI shows that the two diagnostic systems distribute the different groups similarly. According to PSG, 23 patients (46%) had a diagnosis of severe OSA, 11 patients (22%) moderate OSA, and 10 patients (20%) mild OSA. According to SleepWise, 20, 13, and 12 patients (40%, 26%, and 24%, respectively) had a diagnosis of severe, moderate, and mild OSA, respectively. For OSA diagnosis, SleepWise was found to have sensitivity of 100% and specificity of 83% in relation to PSG. The positive predictive value was 97% and the negative predictive value was 100%. The Bland-Altman plot comparing the mean AHI values obtained through PSG and SleepWise shows very good agreement between the two diagnostic techniques, with a bias of −3.85, a standard error of 12.18, and a confidence interval of −0.39 to −7.31. Conclusions: SleepWise was reasonably accurate for noninvasive and automatic diagnosis
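
    The agreement statistics quoted above are standard and easy to reproduce. Below is a minimal sketch of sensitivity/specificity against a reference standard and the Bland-Altman bias of paired AHI measurements; the labels and values are illustrative toy data, not the study's data.

```python
def sensitivity_specificity(reference, test):
    """Sensitivity and specificity of a binary test against a reference standard."""
    pairs = list(zip(reference, test))
    tp = sum(1 for r, t in pairs if r and t)
    tn = sum(1 for r, t in pairs if not r and not t)
    fp = sum(1 for r, t in pairs if not r and t)
    fn = sum(1 for r, t in pairs if r and not t)
    return tp / (tp + fn), tn / (tn + fp)

def bland_altman_bias(a, b):
    """Mean difference (bias) between two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    return sum(diffs) / len(diffs)

ref = [1, 1, 1, 0, 0, 1]   # OSA by the reference standard (illustrative labels)
alt = [1, 1, 1, 1, 0, 1]   # OSA by the automatic method
print(sensitivity_specificity(ref, alt))
print(bland_altman_bias([25.0, 30.0], [22.0, 28.0]))
```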

  7. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  8. Automatic analysis of ciliary beat frequency using optical flow

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g. primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied this to certain objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed us to compute the CBF. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found by the slope in an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia, and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes found again) by the method; an easy way to identify the correct sub-path of a point's path has yet to be found for cases where the slope method does not work.
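
    The frequency-analysis step can be sketched independently of the tracking. Below, a plain DFT (standing in for MATLAB's FFT) extracts the dominant frequency of a tracked point's displacement signal; the 256 frames/s sampling rate and the synthetic 7 Hz beat are assumptions for illustration.

```python
import cmath
import math

def dominant_frequency(signal, sample_rate):
    """Dominant frequency (Hz) of a displacement signal via a plain DFT."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]  # remove DC before searching for a peak
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum(centered[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate / n

# A synthetic 7 Hz beat sampled at an assumed 256 frames/s high-speed video rate
fps = 256
sig = [math.sin(2 * math.pi * 7 * t / fps) for t in range(fps)]
print(dominant_frequency(sig, fps))  # 7.0
```

In the described pipeline this would be applied per tracked corner, and a histogram of the resulting frequencies would separate cilia from background.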

  9. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  10. Trends of Science Education Research: An Automatic Content Analysis

    NASA Astrophysics Data System (ADS)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research from articles published between 1990 and 2007 in the four journals International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education. The multi-stage clustering technique was employed to investigate with what topics, with what development trends, and from whose contributions the journal publications constructed science education as a research field. This study found that Conceptual Change & Concept Mapping was the most studied research topic, although the number of publications declined slightly in the 2000s. Studies on the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  11. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis

    DTIC Science & Technology

    1989-08-01

    Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis. Final Technical Report, December ... Keywords: pattern recognition, blackboard oriented symbolic processing, knowledge based image analysis, image understanding, aerial imagery, urban area.

  12. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…

  13. A Distance Measure for Automatic Document Classification by Sequential Analysis.

    ERIC Educational Resources Information Center

    Kar, Gautam; White, Lee J.

    1978-01-01

    Investigates the feasibility of using a distance measure for automatic sequential document classification. This property of the distance measure is used to design a sequential classification algorithm which classifies key words and analyzes them separately in order to assign primary and secondary classes to a document. (VT)

  14. Automatic audio signing. Volume 2: Review, analysis and design

    NASA Astrophysics Data System (ADS)

    1981-11-01

    Automatic Audio Signing, also referred to as an 'automatic highway advisory radio system' (AHAR), provides appropriately equipped motor vehicles with one-way noncommercial communications pertaining to traffic, road and weather conditions, travel advisories, directions, tourist information and other matters of interest to the traveling public. The automatic audio signing project reduces accidents by providing advance warning of hazardous traffic, weather and road conditions; saves the motorists' time and fuel and reduces motorist irritation by improving traffic control and providing route diversion information when justified by traffic congestion or road blockage; and provides directions, locations of tourist facilities, descriptions of points of interest, and other messages intended to enhance the convenience and enjoyment of the traveling public.

  15. Automatic Metadata Generation Through Analysis of Narration Within Instructional Videos.

    PubMed

    Rafferty, Joseph; Nugent, Chris; Liu, Jun; Chen, Liming

    2015-09-01

    Current activity recognition based assistive living solutions have adopted relatively rigid models of inhabitant activities. These solutions have some deficiencies associated with the use of these models. To address this, a goal-oriented solution has been proposed. In a goal-oriented solution, goal models offer a method of flexibly modelling inhabitant activity. The flexibility of these goal models can dynamically produce a large number of varying action plans that may be used to guide inhabitants. In order to provide illustrative, video-based instruction for these numerous action plans, a number of video clips would need to be associated with each variation. To address this, rich metadata may be used to automatically match appropriate video clips from a video repository to each specific, dynamically generated activity plan. This study introduces a mechanism for automatically generating suitable rich metadata representing the actions depicted within video clips to facilitate such video matching. The performance of this mechanism was evaluated using eighteen video files; during this evaluation, metadata was automatically generated with a high level of accuracy.

  16. Automatic Match between Delimitation Line and Real Terrain Based on Least-Cost Path Analysis

    NASA Astrophysics Data System (ADS)

    Feng, C. Q.; Jiang, N.; Zhang, X. N.; Ma, J.

    2013-11-01

    Nowadays, during international negotiations on separating disputed areas, only manual adjustment is applied to the match between the delimitation line and the real terrain, which not only consumes much time and labor, but also cannot ensure high precision. Accordingly, this paper explores automatic matching between the two and studies a general solution based on Least-Cost Path Analysis. First, under the guidelines of delimitation laws, the cost layer is acquired through special processing of the delimitation line and terrain feature lines. Second, a new delimitation line is constructed with the help of Least-Cost Path Analysis. Third, the whole automatic matching model is built via Model Builder so that it can be shared and reused. Finally, the result of automatic matching is analyzed from many different aspects, including delimitation laws, two-sided benefits and so on. Consequently, it is concluded that the method of automatic matching is feasible and effective.
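
    Least-Cost Path Analysis itself can be sketched with Dijkstra's algorithm over a rasterized cost surface; the toy 4-connected grid below stands in for the cost layer that would be derived from delimitation laws and terrain feature lines.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 4-connected cost grid; each step pays the
    cost of the cell entered. Returns (total cost, path of (row, col) cells)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return dist[goal], path[::-1]

# High-cost cells (9) model terrain the line should route around
grid = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]
print(least_cost_path(grid, (0, 0), (0, 2)))
```

The path detours around the high-cost column (total cost 7) rather than crossing it directly (cost 11), which is exactly how the matched delimitation line would follow favorable terrain.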

  17. Automatic Crowd Analysis from Very High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Reinartz, P.

    2011-04-01

    Recently, automatic detection of people crowds from images has become a very important research field, since it can provide crucial information, especially for police departments and crisis management teams. Due to the importance of the topic, many researchers have tried to solve this problem using street cameras. However, these cameras cannot be used to monitor very large outdoor public events. To address this, herein we propose a novel approach to detect crowds automatically from remotely sensed images, especially very high resolution satellite images. To do so, we use a local-feature-based probabilistic framework. We extract local features from the color components of the input image. In order to eliminate redundant local features coming from other objects in the given scene, we apply a feature selection method. For feature selection purposes, we benefit from three different types of information: the digital elevation model (DEM) of the region, which is automatically generated using stereo satellite images; possible street segments, obtained by segmentation; and shadow information. After eliminating redundant local features, the remaining features are used to detect individual persons. These local feature coordinates are also taken as observations of the probability density function (pdf) of the crowds to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf, which gives us information about dense crowd and people locations. We test our algorithm using WorldView-2 satellite images over the cities of Cairo and Munich. In addition, we provide test results on airborne images for comparison of the detection accuracy. Our experimental results indicate the possible usage of the proposed approach in real-life mass events.
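
    The density-estimation step can be sketched in isolation. Below is a fixed-bandwidth Gaussian KDE over 2-D local-feature coordinates; the paper uses an adaptive bandwidth, so the fixed bandwidth and the toy coordinates here are simplifying assumptions.

```python
import math

def kde_density(points, query, bandwidth=2.0):
    """Gaussian kernel density estimate at `query` from 2-D feature coordinates."""
    qx, qy = query
    norm = 1.0 / (2 * math.pi * bandwidth ** 2 * len(points))
    return norm * sum(
        math.exp(-((qx - px) ** 2 + (qy - py) ** 2) / (2 * bandwidth ** 2))
        for px, py in points)

# Local features cluster near (10, 10); the estimated pdf is higher there
# than in an empty corner of the scene
feats = [(10, 10), (11, 10), (10, 11), (9, 10), (50, 50)]
print(kde_density(feats, (10, 10)) > kde_density(feats, (0, 0)))  # True
```

Thresholding such a density surface yields the dense-crowd regions described in the abstract.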

  18. An analysis of automatic human detection and tracking

    NASA Astrophysics Data System (ADS)

    Demuth, Philipe R.; Cosmo, Daniel L.; Ciarelli, Patrick M.

    2015-12-01

    This paper presents an automatic method to detect and follow people in video streams. The method uses two techniques to determine the initial position of the person at the beginning of the video: one based on optical flow and the other based on Histograms of Oriented Gradients (HOG). After defining the initial bounding box, tracking is done using four different trackers: the Median Flow tracker, the TLD tracker, the Mean Shift tracker, and a modified version of the Mean Shift tracker using the HSV color space. The results of these methods are compared at the end of the paper.

  19. Automatic interferometer with digital readout for refractometric analysis.

    PubMed

    Kinder, W; Neumann, J; Plesse, H; Torge, R

    1968-02-01

    The paper describes an interference refractometer for liquids and gases which operates automatically and reads out in digital or analog form. A compensating technique using white light is used for measurement. Zero adjustment is achieved by rotating the compensator and capturing the zero-order white light fringe by photoelectric means. Measurement of the path difference compensated by the compensator is based on electronic interpolation and counting of interference fringes by optointerferometric means, a time division multiplex technique with pulse amplitude modulation being used to obtain the electrical fringe signals.

  20. Structuring Lecture Videos by Automatic Projection Screen Localization and Analysis.

    PubMed

    Li, Kai; Wang, Jue; Wang, Haoqian; Dai, Qionghai

    2015-06-01

    We present a fully automatic system for extracting the semantic structure of a typical academic presentation video, which captures the whole presentation stage with abundant camera motions such as panning, tilting, and zooming. Our system automatically detects and tracks both the projection screen and the presenter whenever they are visible in the video. By analyzing the image content of the tracked screen region, our system is able to detect slide progressions and extract a high-quality, non-occluded, geometrically-compensated image for each slide, resulting in a list of representative images that reconstruct the main presentation structure. Afterwards, our system recognizes text content and extracts keywords from the slides, which can be used for keyword-based video retrieval and browsing. Experimental results show that our system is able to generate more stable and accurate screen localization results than commonly-used object tracking methods. Our system also extracts more accurate presentation structures than general video summarization methods, for this specific type of video.

  1. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, the contribution of each component and test step to the final uncertainty is studied. Using the differential method, a mathematical model of the systematic uncertainty transfer function is established. Finally, in a case study, the combined uncertainties of manual and automatic operation are compared (6.11% and 5.87%, respectively). The results demonstrate the reasonableness and practicality of this newly developed automatic testing system.

  2. Automatic Content Analysis; Part I of Scientific Report No. ISR-18, Information Storage and Retrieval...

    ERIC Educational Resources Information Center

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper: "Content Analysis in Information Retrieval" by S. F. Weiss presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…

  3. The tool for the automatic analysis of text cohesion (TAACO): Automatic assessment of local, global, and text cohesion.

    PubMed

    Crossley, Scott A; Kyle, Kristopher; McNamara, Danielle S

    2016-12-01

    This study introduces the Tool for the Automatic Analysis of Cohesion (TAACO), a freely available text analysis tool that is easy to use, works on most operating systems (Windows, Mac, and Linux), is housed on a user's hard drive (rather than having an Internet interface), allows for the batch processing of text files, and incorporates over 150 classic and recently developed indices related to text cohesion. The study validates TAACO by investigating how its indices of local, global, and overall text cohesion predict expert judgments of text coherence and essay quality. The findings of this study provide predictive validation of TAACO and support the notion that expert judgments of text coherence and quality are either negatively correlated with or not predicted by local and overall text cohesion indices, but are positively predicted by global indices of cohesion. Combined, these findings provide supporting evidence that coherence for expert raters is a property of global cohesion and not of local cohesion, and that expert ratings of text quality are positively related to global cohesion.
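    A local cohesion index of the kind TAACO computes can be sketched with a toy adjacent-sentence lexical-overlap measure. This is an illustrative stand-in, not one of TAACO's 150 actual indices.

```python
def local_cohesion(sentences):
    """Mean Jaccard word overlap between adjacent sentences: a toy
    local cohesion index (higher = more lexical tie-back)."""
    def words(s):
        return set(s.lower().split())
    pairs = list(zip(sentences, sentences[1:]))
    if not pairs:
        return 0.0
    overlaps = [len(words(a) & words(b)) / len(words(a) | words(b))
                for a, b in pairs]
    return sum(overlaps) / len(overlaps)

cohesive = ["the cat sat", "the cat slept", "the cat purred"]
disjoint = ["the cat sat", "rain fell hard", "numbers never lie"]
print(local_cohesion(cohesive), local_cohesion(disjoint))  # 0.5 0.0
```

Global indices would instead compare each sentence or paragraph against the whole text, which is the distinction the study's findings turn on.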

  4. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.

  5. Phase Segmentation Methods for an Automatic Surgical Workflow Analysis

    PubMed Central

    Sakurai, Ryuhei; Yamazoe, Hirotake

    2017-01-01

    In this paper, we present robust methods for automatically segmenting phases in a specified surgical workflow by using latent Dirichlet allocation (LDA) and hidden Markov model (HMM) approaches. More specifically, our goal is to output an appropriate phase label for each given time point of a surgical workflow in an operating room. The fundamental idea behind our work lies in constructing an HMM based on observed values obtained via an LDA topic model covering optical flow motion features of general working contexts, including medical staff, equipment, and materials. We have an awareness of such working contexts by using multiple synchronized cameras to capture the surgical workflow. Further, we validate the robustness of our methods by conducting experiments involving up to 12 phases of surgical workflows with the average length of each surgical workflow being 12.8 minutes. The maximum average accuracy achieved after applying leave-one-out cross-validation was 84.4%, which we found to be a very promising result.
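    The HMM decoding step described above can be sketched with a standard Viterbi implementation. The two-phase model, its probabilities, and the coarse "motion level" observations below are illustrative assumptions, not values from the paper.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden phase sequence for an observation sequence
    (standard HMM Viterbi decoding)."""
    # V[t][s] = (best probability of reaching state s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        prev_layer = V[-1]
        V.append({
            s: max((prev_layer[p][0] * trans_p[p][s] * emit_p[s][o], p)
                   for p in states)
            for s in states
        })
    # Backtrack from the best final state
    best = max(states, key=lambda s: V[-1][s][0])
    path = [best]
    for t in range(len(V) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return path[::-1]

# Toy two-phase surgical workflow over coarse motion levels
states = ["prep", "incision"]
start_p = {"prep": 0.9, "incision": 0.1}
trans_p = {"prep": {"prep": 0.7, "incision": 0.3},
           "incision": {"prep": 0.2, "incision": 0.8}}
emit_p = {"prep": {"low": 0.8, "high": 0.2},
          "incision": {"low": 0.3, "high": 0.7}}
phases = viterbi(["low", "low", "high", "high"], states, start_p, trans_p, emit_p)
print(phases)  # ['prep', 'prep', 'incision', 'incision']
```

In the paper's pipeline the emission side would come from LDA topic proportions over optical-flow features rather than from a hand-written table.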

  6. Automatic shape model building based on principal geodesic analysis bootstrapping.

    PubMed

    Dam, Erik B; Fletcher, P Thomas; Pizer, Stephen M

    2008-04-01

    We present a novel method for automatic shape model building from a collection of training shapes. The result is a shape model consisting of the mean model and the major modes of variation with a dense correspondence map between individual shapes. The framework consists of iterations in which a medial shape representation is deformed into the training shapes, followed by computation of the shape mean and modes of shape variation. In the first iteration, a generic shape model is used as the starting point; in the following iterations of the bootstrap method, the resulting mean and modes from the previous iteration are used. Thereby, we gradually capture the shape variation in the training collection more and more accurately. Convergence of the method is explicitly enforced. The method is evaluated on collections of artificial training shapes where the expected shape mean and modes of variation are known by design. Furthermore, collections of real prostates and cartilage sheets are used in the evaluation. The evaluation shows that the method is able to capture the training shapes close to the attainable accuracy already in the first iteration. Furthermore, the correspondence properties measured by generality, specificity, and compactness are improved during the shape model building iterations.

  7. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  8. Automatic A-set selection for dynamics analysis

    NASA Technical Reports Server (NTRS)

    Allen, Tom

    1993-01-01

    A method for selecting optimum NASTRAN analysis set degrees of freedom for the dynamic eigenvalue problem is described. Theoretical development of the Guyan reduction procedure on which the method is based is first summarized. The algorithm used to select the analysis set degrees of freedom is then developed. Two example problems are provided to demonstrate the accuracy of the algorithm.

  9. Automatic Method of Supernovae Classification by Modeling Human Procedure of Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Módolo, Marcelo; Rosa, Reinaldo; Guimaraes, Lamartine N. F.

    2016-07-01

    The classification of a recently discovered supernova must be done as quickly as possible in order to define what information will be captured and analyzed in the following days. This classification is not trivial and only a few expert astronomers are able to perform it. This paper proposes an automatic method that models the human procedure of classification. It uses Multilayer Perceptron Neural Networks to analyze the supernova spectra. Experiments were performed using different pre-processing steps and multiple neural network configurations to identify the classic types of supernovae. Significant results were obtained, indicating the viability of using this method at sites that have no specialist or that require automatic analysis.

  10. System for Automatic Detection and Analysis of Targets in FMICW Radar Signal

    NASA Astrophysics Data System (ADS)

    Rejfek, Luboš; Mošna, Zbyšek; Urbář, Jaroslav; Koucká Knížová, Petra

    2016-01-01

    This paper presents an automatic system for processing signals from a frequency modulated interrupted continuous wave (FMICW) radar and describes methods for the primary signal processing. Further, we present methods for the detection of targets in strong noise. These methods are tested on both real and simulated signals. The real signals were measured using an experimental FMICW radar prototype, developed at the IAP CAS, with an operational frequency of 35.4 GHz. The measurement campaign took place at TU Delft, the Netherlands. The obtained results were used to develop the system for automatic detection and analysis of targets measured by the FMICW radar.

  11. CAD system for automatic analysis of CT perfusion maps

    NASA Astrophysics Data System (ADS)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computer tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis (detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions) and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation of visualized symptoms (deciding whether a lesion is ischemic or hemorrhagic, and whether the brain tissue is at risk of infarction) is done by so-called cognitive inference processes, allowing reasoning about the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET framework installed.

  12. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  13. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  14. Automatic Detection of Military Targets Utilising Neural Networks and Scale Space Analysis

    DTIC Science & Technology

    2001-04-01

    A. Khashman, Department of Computer Engineering, Near East University. The approach combines scale space analysis, fast Laplacian of the Gaussian (FLoG) edge detection, and neural networks to provide an automatic edge detection operator for the detection of military targets.

  15. Development of automatic movement analysis system for a small laboratory animal using image processing

    NASA Astrophysics Data System (ADS)

    Nagatomo, Satoshi; Kawasue, Kikuhito; Koshimoto, Chihiro

    2013-03-01

    Activity analysis of small laboratory animals is an effective procedure in various bioscience fields. The simplest way to obtain animal activity data is manual observation and recording, but this is labor intensive and rather subjective. Analyzing animal movement automatically and objectively usually requires expensive equipment. In the present study, we develop a low-cost animal activity analysis system by applying a template matching method to video recordings of laboratory animal movements.
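    The template matching step can be sketched with a normalized cross-correlation search over a toy frame. The frame, template, and grayscale values below are invented for illustration.

```python
from math import sqrt

def ncc(patch, template):
    """Normalized cross-correlation between two equal-sized grayscale patches."""
    a = [v for row in patch for v in row]
    b = [v for row in template for v in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma)**2 for x in a) * sum((y - mb)**2 for y in b))
    return num / den if den else 0.0

def best_match(image, template):
    """Slide the template over the image; return the top-left of the best match."""
    th, tw = len(template), len(template[0])
    scores = {}
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            scores[(r, c)] = ncc(patch, template)
    return max(scores, key=scores.get)

# Toy frame: the "animal" pattern sits at row 1, column 2
frame = [[0, 0, 0, 0, 0],
         [0, 0, 9, 8, 0],
         [0, 0, 7, 9, 0],
         [0, 0, 0, 0, 0]]
animal = [[9, 8],
          [7, 9]]
print(best_match(frame, animal))  # (1, 2)
```

Tracking then amounts to repeating this search frame by frame and recording the matched position over time.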

  16. Automatic forensic analysis of automotive paints using optical microscopy.

    PubMed

    Thoonen, Guy; Nys, Bart; Vander Haeghen, Yves; De Roy, Gilbert; Scheunders, Paul

    2016-02-01

    The timely identification of vehicles involved in an accident, such as a hit-and-run situation, bears great importance in forensics. To this end, procedures have been defined for analyzing car paint samples that combine techniques such as visual analysis and Fourier transform infrared spectroscopy. This work proposes a new methodology in order to automate the visual analysis using image retrieval. Specifically, color and texture information is extracted from a microscopic image of a recovered paint sample, and this information is then compared with the same features for a database of paint types, resulting in a shortlist of candidate paints. In order to demonstrate the operation of the methodology, a test database has been set up and two retrieval experiments have been performed. The first experiment quantifies the performance of the procedure for retrieving exact matches, while the second experiment emulates the real-life situation of paint samples that experience changes in color and texture over time.

  17. Learning Enterprise Malware Triage from Automatic Dynamic Analysis

    DTIC Science & Technology

    2013-03-01


  18. Biosignal Analysis to Assess Mental Stress in Automatic Driving of Trucks: Palmar Perspiration and Masseter Electromyography

    PubMed Central

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-01-01

    Nowadays insight into human-machine interaction is a critical topic with the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rationally practical use of the automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate mental stress of drivers during automatic driving of trucks, with vehicles set at a closed gap distance apart to reduce air resistance to save energy consumption. By application of two wearable sensor systems, a continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about 25 m gap distance as a reference. It was found that mental stress significantly increased when the gap distances decreased, and an abrupt increase in mental stress of drivers was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports. PMID:25738768

  19. Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis

    PubMed Central

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, Seyedmohammad; Rosenwald, Dean P.

    2014-01-01

    Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were highly consistent for FACS action units, and showed similar effects for change over time in depression severity. For both systems, when symptom severity was high, participants made more facial expressions associated with contempt, smiled less, and those smiles that occurred were more likely to be accompanied by facial actions associated with contempt. These results are consistent with the “social risk hypothesis” of depression. According to this hypothesis, when symptoms are severe, depressed participants withdraw from other people in order to protect themselves from anticipated rejection, scorn, and social exclusion. As their symptoms fade, participants send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and produced the same pattern of depression effects suggests that automatic facial expression analysis may be ready for use in behavioral and clinical science. PMID:24598859

  20. Biosignal analysis to assess mental stress in automatic driving of trucks: palmar perspiration and masseter electromyography.

    PubMed

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-03-02

    Nowadays insight into human-machine interaction is a critical topic with the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rationally practical use of the automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate mental stress of drivers during automatic driving of trucks, with vehicles set at a closed gap distance apart to reduce air resistance to save energy consumption. By application of two wearable sensor systems, a continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about 25 m gap distance as a reference. It was found that mental stress significantly increased when the gap distances decreased, and an abrupt increase in mental stress of drivers was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports.

  1. Automatic Determination of Bacterioplankton Biomass by Image Analysis

    PubMed Central

    Bjørnsen, Peter Koefoed

    1986-01-01

    Image analysis was applied to epifluorescence microscopy of acridine orange-stained plankton samples. A program was developed for discrimination and binary segmentation of digitized video images, taken by an ultrasensitive video camera mounted on the microscope. Cell volumes were estimated from the area and perimeter of the objects in the binary image. The program was tested on fluorescent latex beads of known diameters. Biovolumes measured by image analysis were compared with directly determined carbon biomasses in batch cultures of estuarine and freshwater bacterioplankton. This calibration revealed an empirical conversion factor from biovolume to biomass of 0.35 pg of C μm−3 (±0.03, 95% confidence limit). The deviation of this value from the normally used conversion factors of 0.086 to 0.121 pg of C μm−3 is discussed. The described system was capable of measuring 250 cells within 10 min, providing estimates of cell number, mean cell volume, and biovolume with a precision of 5%. PMID:16347077
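    The biovolume-to-biomass conversion reported above is simple arithmetic; a sketch using the 0.35 pg C per cubic micrometer factor from the abstract (the per-cell biovolumes are invented):

```python
CARBON_PER_VOLUME = 0.35  # pg C per cubic micrometer, factor reported above

def biomass_pg_c(biovolumes_um3):
    """Total carbon biomass (pg C) from a list of per-cell biovolumes."""
    return CARBON_PER_VOLUME * sum(biovolumes_um3)

cells = [0.10, 0.15, 0.08]  # illustrative per-cell biovolumes in cubic micrometers
print(round(biomass_pg_c(cells), 4))  # 0.1155
```

Using one of the older factors (0.086 to 0.121) instead would cut the estimated biomass roughly threefold, which is why the choice of conversion factor matters.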

  2. The Romanian-English Contrastive Analysis Project; Contrastive Studies in the Syntax and Semantics of English and Romanian, Vol. 6.

    ERIC Educational Resources Information Center

    Chitoran, Dumitru, Ed.

    The sixth volume of this series contains eight contrastive studies in the syntax and semantics of English and Romanian. They are: "Criteria for the Contrastive Analysis of English Nouns," by Andrei Bantas; "Adjectives as Noun Modifiers in Post-Verbal Position," by Ioana Poenaru; "Towards a Semantic Description of 'Tense' and 'Aspect' in English…

  3. IMPROMPTU: a system for automatic 3D medical image-analysis.

    PubMed

    Sundaramoorthy, G; Hoford, J D; Hoffman, E A; Higgins, W E

    1995-01-01

    The utility of three-dimensional (3D) medical imaging is hampered by difficulties in extracting anatomical regions and making measurements in 3D images. Presently, a user is generally forced to use time-consuming, subjective, manual methods, such as slice tracing and region painting, to define regions of interest. Automatic image-analysis methods can ameliorate the difficulties of manual methods. This paper describes a graphical user interface (GUI) system for constructing automatic image-analysis processes for 3D medical-imaging applications. The system, referred to as IMPROMPTU, provides a user-friendly environment for prototyping, testing and executing complex image-analysis processes. IMPROMPTU can stand alone or it can interact with an existing graphics-based 3D medical image-analysis package (VIDA), giving a strong environment for 3D image-analysis, consisting of tools for visualization, manual interaction, and automatic processing. IMPROMPTU links to a large library of 1D, 2D, and 3D image-processing functions, referred to as VIPLIB, but a user can easily link in custom-made functions. 3D applications of the system are given for left-ventricular chamber, myocardial, and upper-airway extractions.

  4. Automatic Fatigue Detection of Drivers through Yawning Analysis

    NASA Astrophysics Data System (ADS)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on video analysis of drivers. The focus of the paper is on how to detect yawning, which is an important cue for determining driver fatigue. Initially, the face is located through the Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which lips are searched for through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features to determine the driver's yawning state. If the yawning state of the driver persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and is thus warned through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, and with users of different races and genders.

  5. Automatic analysis of quantitative NMR data of pharmaceutical compound libraries.

    PubMed

    Liu, Xuejun; Kolpak, Michael X; Wu, Jiejun; Leo, Gregory C

    2012-08-07

    In drug discovery, chemical library compounds are usually dissolved in DMSO at a certain concentration and then distributed to biologists for target screening. Quantitative (1)H NMR (qNMR) is the preferred method for determining the actual concentrations of compounds because the relative single-proton peak areas of two chemical species represent the relative molar concentrations of the two compounds, that is, the compound of interest and a calibrant. Thus, an analyte concentration can be determined using a calibration compound at a known concentration. One particularly time-consuming step in the qNMR analysis of compound libraries is the manual integration of peaks. This report presents an automated method for performing this task without prior knowledge of compound structures, using an external calibration spectrum. The script for automated integration is fast and adaptable to large-scale data sets, eliminating the need for manual integration in ~80% of the cases.
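    The underlying qNMR calculation, in which per-proton peak area ratios map to molar concentration ratios, can be sketched as follows (all numbers are illustrative):

```python
def qnmr_concentration(area_analyte, n_h_analyte, area_cal, n_h_cal, conc_cal):
    """Analyte concentration from relative peak areas: the per-proton
    area ratio of analyte to calibrant equals their molar ratio."""
    per_proton_analyte = area_analyte / n_h_analyte
    per_proton_cal = area_cal / n_h_cal
    return conc_cal * per_proton_analyte / per_proton_cal

# A 3-proton analyte peak with area 6.0 against a 2-proton calibrant
# peak with area 5.0 at 10 mM (illustrative numbers):
print(qnmr_concentration(6.0, 3, 5.0, 2, 10.0))  # 8.0 (mM)
```

The automation described in the abstract is about obtaining the peak areas reliably; once the areas are in hand, the concentration follows from this single ratio.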

  6. Automatic localization of cerebral cortical malformations using fractal analysis

    NASA Astrophysics Data System (ADS)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for voxel-level analysis based on fractal geometry, then define two similarity measures to detect lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs under two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffuse malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.

  7. Fully Automatic System for Accurate Localisation and Analysis of Cephalometric Landmarks in Lateral Cephalograms

    PubMed Central

    Lindner, Claudia; Wang, Ching-Wei; Huang, Cheng-Ta; Li, Chung-Hsing; Chang, Sheng-Wei; Cootes, Tim F.

    2016-01-01

    Cephalometric tracing is a standard analysis tool for orthodontic diagnosis and treatment planning. The aim of this study was to develop and validate a fully automatic landmark annotation (FALA) system for finding cephalometric landmarks in lateral cephalograms and its application to the classification of skeletal malformations. Digital cephalograms of 400 subjects (age range: 7–76 years) were available. All cephalograms had been manually traced by two experienced orthodontists with 19 cephalometric landmarks, and eight clinical parameters had been calculated for each subject. A FALA system to locate the 19 landmarks in lateral cephalograms was developed. The system was evaluated via comparison to the manual tracings, and the automatically located landmarks were used for classification of the clinical parameters. The system achieved an average point-to-point error of 1.2 mm, and 84.7% of landmarks were located within the clinically accepted precision range of 2.0 mm. The automatic landmark localisation performance was within the inter-observer variability between two clinical experts. The automatic classification achieved an average classification accuracy of 83.4% which was comparable to an experienced orthodontist. The FALA system rapidly and accurately locates and analyses cephalometric landmarks in lateral cephalograms, and has the potential to significantly improve the clinical work flow in orthodontic treatment. PMID:27645567
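    The evaluation metrics reported above (mean point-to-point error and the fraction of landmarks within the 2.0 mm clinically accepted precision range) can be computed as follows; the landmark coordinates are invented:

```python
from math import hypot

def landmark_errors(auto_pts, manual_pts, precision_mm=2.0):
    """Mean point-to-point error (mm) and the fraction of automatically
    located landmarks within the clinically accepted precision range."""
    errs = [hypot(ax - mx, ay - my)
            for (ax, ay), (mx, my) in zip(auto_pts, manual_pts)]
    mean_err = sum(errs) / len(errs)
    within = sum(e <= precision_mm for e in errs) / len(errs)
    return mean_err, within

auto_pts = [(10.0, 10.0), (20.0, 21.5), (33.0, 30.0)]   # FALA output
manual_pts = [(10.0, 11.0), (20.0, 20.0), (30.0, 30.0)]  # expert tracing
err, frac = landmark_errors(auto_pts, manual_pts)
print(round(err, 2), round(frac, 2))  # 1.83 0.67
```

In the study these two numbers are computed over 19 landmarks per cephalogram and 400 subjects, giving the reported 1.2 mm and 84.7%.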

  8. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To support such modeling, we previously developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up process consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitates dynamic modeling, contributing to research in systems biology and synthetic biology. The application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with its instructions.

  9. Investigation of Ballistic Evidence through an Automatic Image Analysis and Identification System.

    PubMed

    Kara, Ilker

    2016-05-01

    Automated firearms identification (AFI) systems contribute to shedding light on criminal events by comparison between different pieces of evidence on cartridge cases and bullets and by matching similar ones that were fired from the same firearm. Ballistic evidence can be rapidly analyzed and classified by means of an automatic image analysis and identification system. In addition, it can be used to narrow the range of possible matching evidence. In this study conducted on the cartridges ejected from the examined pistol, three imaging areas, namely the firing pin impression, capsule traces, and the intersection of these traces, were compared automatically using the image analysis and identification system through the correlation ranking method to determine the numeric values that indicate the significance of the similarities. These numerical features that signify the similarities and differences between pistol makes and models can be used in groupings to make a distinction between makes and models of pistols.

  10. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    NASA Astrophysics Data System (ADS)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. Compared to audio-visual emotion channels such as facial expression or speech, physiological signals have so far received little attention in emotion recognition. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra and multiscale entropy, is proposed in order to find the most emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is demonstrated by emotion recognition results.
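    A few of the simpler time-domain and entropy features from the families listed above can be sketched as follows. This is illustrative only, computed on a synthetic "biosignal"; real EMG/ECG pipelines would add filtering, windowing, and many more descriptors:

```python
# Illustrative feature extraction sketch (not the paper's code).
import numpy as np

def time_domain_features(x):
    x = np.asarray(x, dtype=float)
    return {
        "mean": float(x.mean()),
        "std": float(x.std()),
        "rms": float(np.sqrt(np.mean(x ** 2))),
        # Zero-crossing rate: fraction of steps where the sign flips.
        "zcr": float(np.mean(np.abs(np.diff(np.sign(x))) > 0)),
    }

def shannon_entropy(x, bins=16):
    """Histogram-based Shannon entropy in bits."""
    counts, _ = np.histogram(np.asarray(x, dtype=float), bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

t = np.linspace(0, 1, 200, endpoint=False)
feats = time_domain_features(np.sin(2 * np.pi * 5 * t))
print(round(feats["rms"], 3))  # RMS of a unit sine is 1/sqrt(2) ~ 0.707
```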

  11. Theoretical Analysis of the Longitudinal Behavior of an Automatically Controlled Supersonic Interceptor During the Attack Phase

    NASA Technical Reports Server (NTRS)

    Gates, Ordway B., Jr.; Woodling, C. H.

    1959-01-01

    Theoretical analysis of the longitudinal behavior of an automatically controlled supersonic interceptor during the attack phase against a nonmaneuvering target is presented. Control of the interceptor's flight path is obtained by use of a pitch rate command system. Topics discussed include lift and pitching moment, effects of initial tracking errors, normal-acceleration limiting, limitations on control-surface rate and deflection, and effects of neglecting forward-velocity changes of the interceptor during the attack phase.

  12. Automatic Assessment and Reduction of Noise using Edge Pattern Analysis in Non-Linear Image Enhancement

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.; Hines, Glenn D.

    2004-01-01

    Noise is the primary visibility limit in the process of non-linear image enhancement, and is no longer a statistically stable additive noise in the post-enhancement image. Therefore novel approaches are needed to both assess and reduce spatially variable noise at this stage in overall image processing. Here we will examine the use of edge pattern analysis both for automatic assessment of spatially variable noise and as a foundation for new noise reduction methods.

  13. Automatic analysis of auditory nerve electrically evoked compound action potential with an artificial neural network.

    PubMed

    Charasse, Basile; Thai-Van, Hung; Chanal, Jean Marc; Berger-Vachon, Christian; Collet, Lionel

    2004-07-01

    The auditory nerve's electrically evoked compound action potential is recorded in deaf patients equipped with the Nucleus 24 cochlear implant using a reverse telemetry system (NRT). Since the threshold of the NRT response (NRT-T) is thought to reflect the psychophysics needed for programming cochlear implants, efforts have been made by specialized management teams to develop its use. This study aimed at developing a valid tool, based on artificial neural networks (ANN) technology, for automatic estimation of NRT-T. The ANN used was a single layer perceptron, trained with 120 NRT traces. Learning traces differed from data used for the validation. A total of 550 NRT traces from 11 cochlear implant subjects were analyzed separately by the system and by a group of physicians with expertise in NRT analysis. Both worked to determine 37 NRT-T values, using the response amplitude growth function (AGF) (linear regression of response amplitudes obtained at decreasing stimulus intensity levels). The validity of the system was assessed by comparing the NRT-T values automatically determined by the system with those determined by the physicians. A strong correlation was found between automatic and physician-obtained NRT-T values (Pearson r correlation coefficient >0.9). ANOVA statistics confirmed that automatic NRT-Ts did not differ from physician-obtained values (F = 0.08999, P = 0.03). Moreover, the average error between NRT-Ts predicted by the system and NRT-Ts measured by the physicians (3.6 stimulation units) did not differ significantly from the average error between NRT-Ts measured by each of the three physicians (4.2 stimulation units). In conclusion, the automatic system developed in this study was found to be as efficient as human experts for fitting the amplitude growth function and estimating NRT-T, with the advantage of considerable time-saving.
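    The amplitude growth function step described above is a linear regression whose extrapolation to zero response amplitude gives the threshold estimate. A minimal sketch with synthetic values (the real NRT stimulation units and fitting details differ):

```python
# Hedged sketch of AGF-based threshold estimation: fit a line to response
# amplitude vs. stimulus level, then take the x-intercept as the threshold.
# The numbers below are synthetic, not clinical data.
import numpy as np

def agf_threshold(levels, amplitudes):
    """Threshold = x-intercept of the fitted amplitude growth function."""
    slope, intercept = np.polyfit(levels, amplitudes, 1)
    return -intercept / slope

# Synthetic AGF: amplitude = 2 * (level - 150) above a 150-unit threshold.
levels = np.array([160.0, 170.0, 180.0, 190.0])
amps = 2.0 * (levels - 150.0)
print(round(agf_threshold(levels, amps), 1))
```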

  14. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  15. Nonverbal Social Withdrawal in Depression: Evidence from manual and automatic analysis

    PubMed Central

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, S. Mohammad; Hammal, Zakia; Rosenwald, Dean P.

    2014-01-01

    The relationship between nonverbal behavior and severity of depression was investigated by following depressed participants over the course of treatment and video recording a series of clinical interviews. Facial expressions and head pose were analyzed from video using manual and automatic systems. Both systems were highly consistent for FACS action units (AUs) and showed similar effects for change over time in depression severity. When symptom severity was high, participants made fewer affiliative facial expressions (AUs 12 and 15) and more non-affiliative facial expressions (AU 14). Participants also exhibited diminished head motion (i.e., amplitude and velocity) when symptom severity was high. These results are consistent with the Social Withdrawal hypothesis: that depressed individuals use nonverbal behavior to maintain or increase interpersonal distance. As individuals recover, they send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and revealed the same pattern of findings suggests that automatic facial expression analysis may be ready to relieve the burden of manual coding in behavioral and clinical science. PMID:25378765

  16. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

    This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25 % correct classification rate, which is very promising and underpins the interest in this line of inquiry.
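    The likelihood-based classification idea can be sketched compactly. The paper fits GMMs to voice features; for brevity the sketch below models each class with a single multivariate Gaussian (a one-component "GMM") on synthetic 2-D features, assigning a sample to the class with the higher log-likelihood:

```python
# Simplified, illustrative stand-in for GMM-based apnoea detection.
import numpy as np

def fit_gaussian(X):
    """MLE mean and covariance for one class."""
    return X.mean(axis=0), np.cov(X, rowvar=False)

def log_likelihood(x, mu, cov):
    d = len(mu)
    diff = np.asarray(x, dtype=float) - mu
    return -0.5 * (d * np.log(2 * np.pi) + np.log(np.linalg.det(cov))
                   + diff @ np.linalg.inv(cov) @ diff)

rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, size=(200, 2))   # synthetic "healthy" features
apnoea = rng.normal(5.0, 1.0, size=(200, 2))    # synthetic "apnoea" features
models = {"control": fit_gaussian(control), "apnoea": fit_gaussian(apnoea)}

def classify(x):
    """Pick the class whose model gives the sample the higher likelihood."""
    return max(models, key=lambda c: log_likelihood(x, *models[c]))

print(classify([5.2, 4.7]), classify([0.1, -0.3]))
```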

  17. Analysis of Social Variables when an Initial Functional Analysis Indicates Automatic Reinforcement as the Maintaining Variable for Self-Injurious Behavior

    ERIC Educational Resources Information Center

    Kuhn, Stephanie A. Contrucci; Triggs, Mandy

    2009-01-01

    Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…

  18. A new and fast methodology to assess oxidative damage in cardiovascular diseases risk development through eVol-MEPS-UHPLC analysis of four urinary biomarkers.

    PubMed

    Mendes, Berta; Silva, Pedro; Mendonça, Isabel; Pereira, Jorge; Câmara, José S

    2013-11-15

    In this work, a new, fast and reliable methodology using a digitally controlled microextraction by packed sorbent (eVol(®)-MEPS) followed by ultra-high pressure liquid chromatography (UHPLC) analysis with photodiode array (PDA) detection, was developed to establish the urinary profile levels of four putative oxidative stress biomarkers (OSBs) in healthy subjects and patients evidencing cardiovascular diseases (CVDs). This data was used to verify the suitability of the selected OSBs (uric acid-UAc, malondialdehyde-MDA, 5-(hydroxymethyl)uracil-5-HMUra and 8-hydroxy-2'-deoxyguanosine-8-oxodG) as potential biomarkers of CVDs progression. Important parameters affecting the efficiency of the extraction process were optimized, particularly stationary phase selection, pH influence, sample volume, number of extraction cycles and washing and elution volumes. The experimental conditions that allowed the best extraction efficiency, expressed in terms of total area of the target analytes and data reproducibility, include a 10-fold dilution and pH adjustment of the urine samples to 6.0, followed by a gradient elution through the C8 adsorbent with 5×50 µL of 0.01% formic acid and 3×50 µL of 20% methanol in 0.01% formic acid. The chromatographic separation of the target analytes was performed with a HSS T3 column (100 mm × 2.1 mm, 1.7 µm in particle size) using 0.01% formic acid 20% methanol at 250 µL min(-1). The methodology was validated in terms of selectivity, linearity, instrumental limit of detection (LOD), method limit of quantification (LOQ), matrix effect, accuracy and precision (intra- and inter-day). Good results were obtained in terms of selectivity and linearity (r(2)>0.9906), as well as the LOD and LOQ, whose values were low, ranging from 0.00005 to 0.72 µg mL(-1) and 0.00023 to 2.31 µg mL(-1), respectively. The recovery results (91.1-123.0%), intra-day (1.0-8.3%), inter-day precision (4.6-6.3%) and the matrix effect (60.1-110.3%) of eVol

  19. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    NASA Astrophysics Data System (ADS)

    Fischer, Bernd; Hajian, Arsen; Knuth, Kevin; Schumann, Johann

    2004-04-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable building blocks for the algorithm construction; they can encapsulate advanced algorithms and data structures. A symbolic-algebraic system is used to find closed-form solutions for problems and emerging subproblems. In this paper, we describe the application of AUTOBAYES to the analysis of planetary nebulae images taken by the Hubble Space Telescope. We explain the system architecture, and present in detail the automatic derivation of the scientists' original analysis as well as a refined analysis using clustering models. This study demonstrates that AUTOBAYES is now mature enough so that it can be applied to realistic scientific data analysis tasks.

  20. Automaticity in acute ischemia: Bifurcation analysis of a human ventricular model

    NASA Astrophysics Data System (ADS)

    Bouchard, Sylvain; Jacquemet, Vincent; Vinet, Alain

    2011-01-01

    Acute ischemia (restriction in blood supply to part of the heart as a result of myocardial infarction) induces major changes in the electrophysiological properties of the ventricular tissue. Extracellular potassium concentration ([Ko+]) increases in the ischemic zone, leading to an elevation of the resting membrane potential that creates an “injury current” (IS) between the infarcted and the healthy zone. In addition, the lack of oxygen impairs the metabolic activity of the myocytes and decreases ATP production, thereby affecting ATP-sensitive potassium channels (IKatp). Frequent complications of myocardial infarction are tachycardia, fibrillation, and sudden cardiac death, but the mechanisms underlying their initiation are still debated. One hypothesis is that these arrhythmias may be triggered by abnormal automaticity. We investigated the effect of ischemia on myocyte automaticity by performing a comprehensive bifurcation analysis (fixed points, cycles, and their stability) of a human ventricular myocyte model [K. H. W. J. ten Tusscher and A. V. Panfilov, Am. J. Physiol. Heart Circ. Physiol. 291, H1088 (2006), doi:10.1152/ajpheart.00109.2006] as a function of three ischemia-relevant parameters: [Ko+], IS, and IKatp. In this single-cell model, we found that automatic activity was possible only in the presence of an injury current. Changes in [Ko+] and IKatp significantly altered the bifurcation structure of IS, including the occurrence of early afterdepolarizations. The results provide a sound basis for studying higher-dimensional tissue structures representing an ischemic heart.

  1. Automatic Segmentation and Quantitative Analysis of the Articular Cartilages From Magnetic Resonance Images of the Knee

    PubMed Central

    Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K.; Ourselin, Sébastien

    2010-01-01

    In this paper, we present a segmentation scheme that automatically and accurately segments all the cartilages from magnetic resonance (MR) images of nonpathological knees. Our scheme involves the automatic segmentation of the bones using a three-dimensional active shape model, the extraction of the expected bone-cartilage interface (BCI), and cartilage segmentation from the BCI using a deformable model that utilizes localization, patient-specific tissue estimation and a model of the thickness variation. The accuracy of this scheme was experimentally validated using leave-one-out experiments on a database of fat-suppressed spoiled gradient recall MR images. The scheme was compared to three state-of-the-art approaches: tissue classification, a modified semi-automatic watershed algorithm and nonrigid registration (B-spline based free-form deformation). Our scheme obtained an average Dice similarity coefficient (DSC) of (0.83, 0.83, 0.85) for the (patellar, tibial, femoral) cartilages, while (0.82, 0.81, 0.86) was obtained with a tissue classifier and (0.73, 0.79, 0.76) with nonrigid registration. The average DSC obtained for all the cartilages using a semi-automatic watershed algorithm (0.90) was slightly higher than with our approach (0.89); however, unlike that method, our approach segments each cartilage as a separate object. The effectiveness of our approach for quantitative analysis was evaluated using volume and thickness measures with a median volume difference error of (5.92, 4.65, 5.69) and absolute Laplacian thickness difference of (0.13, 0.24, 0.12) mm. PMID:19520633
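    The Dice similarity coefficient used for validation above is simple to compute for binary segmentation masks; a minimal sketch (names illustrative):

```python
# DSC = 2|A ∩ B| / (|A| + |B|) for two binary masks.
import numpy as np

def dice_coefficient(a, b):
    """Overlap score in [0, 1]; 1 means identical masks."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Two half-overlapping masks give DSC = 0.5.
print(dice_coefficient([1, 1, 0, 0], [1, 0, 1, 0]))
```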

  2. NGS-Trex: an automatic analysis workflow for RNA-Seq data.

    PubMed

    Boria, Ilenia; Boatti, Lara; Saggese, Igor; Mignone, Flavio

    2015-01-01

    RNA-Seq technology allows the rapid analysis of whole transcriptomes, taking advantage of next-generation sequencing (NGS) platforms. Moreover, with the constantly decreasing cost of NGS analysis, RNA-Seq is becoming very popular and widespread. Unfortunately, data analysis is quite demanding in terms of the bioinformatic skills and infrastructure required, which limits the potential users of this method. Here we describe the complete analysis of sample data from raw sequences to data mining of results using the NGS-Trex platform, a low-user-interaction, fully automatic analysis workflow. Used through a web interface, NGS-Trex processes data and profiles the transcriptome of the samples, identifying expressed genes, transcripts, and new and known splice variants. It also detects differentially expressed genes and transcripts across different experiments.

  3. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis

    PubMed Central

    Haderlein, Tino; Schwemmle, Cornelia; Döllinger, Michael; Matoušek, Václav; Ptok, Martin; Nöth, Elmar

    2015-01-01

    Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation of the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men; 48.7 ± 17.8 years) containing the German version of the text “The North Wind and the Sun” were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) of the Laryngograph and 33 features from a prosodic analysis module were used to model the listeners' ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx (r = 0.71, ρ = 0.57). These correlations were approximately the same as the interrater agreement among human raters (r = 0.65, ρ = 0.61). CQx was one of the substantial features of the hoarseness model. For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for a meaningful objective support for perceptual analysis. PMID:26136813

  4. Seismpol: a Visual-Basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by analysing a selected signal window in a series of narrow frequency bands. Significant results, supported by well defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
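    The core of the Covariance Matrix Decomposition step is an eigen-analysis of the covariance of a three-component window. The sketch below is a minimal illustration; the rectilinearity formula is one common choice, not necessarily the exact one used by the program:

```python
# CMD-style polarization analysis of a three-component seismic window.
import numpy as np

def polarization(z, n, e):
    """Rectilinearity and principal motion axis from the 3-C covariance."""
    C = np.cov(np.vstack([z, n, e]))
    w, v = np.linalg.eigh(C)            # eigenvalues in ascending order
    l1, l2, l3 = w[2], w[1], w[0]       # l1 = largest
    # One common rectilinearity measure: 1 for pure linear motion.
    rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
    principal = v[:, 2]                 # direction of dominant motion
    return rectilinearity, principal

t = np.linspace(0.0, 1.0, 500)
s = np.sin(2 * np.pi * 8 * t)
# Motion purely along the N component: rectilinearity should be ~1.
rect, axis = polarization(0.0 * s, s, 0.0 * s)
print(round(rect, 3))
```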

  5. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    SciTech Connect

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.; Jin, Shuangshuang; Lin, Jenglung; Hauer, Matthew L.

    2010-02-28

    Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and promptly to the ringdown data, so that mode estimation results can be obtained reliably and in a timely manner. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
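    The mode-extraction idea behind Prony analysis can be sketched in a few lines. This is the classic batch formulation (the paper's recursive on-line variant is not reproduced): fit a linear-prediction model to the ringdown samples, take the roots of its characteristic polynomial, and read off frequency and damping.

```python
# Classic (batch) Prony-style mode identification on a synthetic ringdown.
import numpy as np

def prony_modes(y, dt, order):
    y = np.asarray(y, dtype=float)
    N = len(y)
    # Linear prediction: y[n] = a1*y[n-1] + ... + ap*y[n-p].
    A = np.column_stack([y[order - k:N - k] for k in range(1, order + 1)])
    a, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    z = np.roots(np.concatenate(([1.0], -a)))   # discrete-time poles
    freq = np.angle(z) / (2 * np.pi * dt)       # Hz
    damping = np.log(np.abs(z)) / dt            # 1/s, negative = decaying
    return freq, damping

# Synthetic ringdown: a 1.0 Hz mode decaying at 0.5 1/s.
dt = 0.01
t = np.arange(0, 2, dt)
y = np.exp(-0.5 * t) * np.cos(2 * np.pi * 1.0 * t)
freq, damping = prony_modes(y, dt, order=2)
print(round(freq.max(), 3), round(damping.max(), 3))
```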

  6. Automatic generation of stop word lists for information retrieval and analysis

    DOEpatents

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
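    A toy sketch of the procedure as described in the abstract: build a term list from a corpus, keep a term only if its keyword-adjacency frequency is high relative to its own frequency, then truncate the list. Tokenisation, the adjacency window, and the truncation rule are all assumptions here, not the patent's specifics:

```python
# Illustrative stop-word list generation from keyword adjacency statistics.
from collections import Counter

def stop_word_list(corpus_docs, keywords, min_ratio=0.5, max_words=10):
    keywords = set(keywords)
    term_freq, adj_freq = Counter(), Counter()
    for doc in corpus_docs:
        tokens = doc.lower().split()
        for i, tok in enumerate(tokens):
            if tok in keywords:
                continue
            term_freq[tok] += 1
            # "Adjacent" here = immediately before or after a keyword.
            if any(t in keywords for t in tokens[max(0, i - 1):i + 2]):
                adj_freq[tok] += 1
    # Exclude terms whose adjacency/frequency ratio is below the threshold.
    kept = [t for t in term_freq
            if adj_freq[t] / term_freq[t] >= min_ratio]
    kept.sort(key=lambda t: -term_freq[t])   # truncate by frequency
    return kept[:max_words]

docs = ["the cat sat", "the dog ran", "a cat ran", "birds fly south"]
print(stop_word_list(docs, {"cat", "dog"}))
```

    Terms like "the" that routinely appear next to content-bearing keywords survive, while topical terms that never do ("birds", "fly") are dropped.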

  7. Semi-automatic detection of skin malformations by analysis of spectral images

    NASA Astrophysics Data System (ADS)

    Rubins, U.; Spigulis, J.; Valeine, L.; Berzina, A.

    2013-06-01

    A multi-spectral imaging technique to reveal skin malformations is described in this work. Four spectral images taken under polarized monochromatic LED illumination (450 nm, 545 nm, 660 nm and 940 nm) and polarized white LED light, imaged by a CMOS sensor via a cross-oriented polarizing filter, were analyzed to calculate chromophore maps. The algorithm, based on skin color analysis and user-defined threshold selection, allows semi-automatic highlighting of skin areas with a predefined chromophore concentration. Preliminary results of clinical tests are presented.

  8. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A proposed signal processing technique for incipient real time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. The traditional spectral analysis is not appropriate for non-stationary vibration signal and for real time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective bearing fault automatic detection method and gives a good basis for an integrated induction machine condition monitor.
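    The idea behind the kurtogram can be illustrated in simplified form (the real kurtogram scans a dyadic grid of band/bandwidth pairs): compute the spectral kurtosis of the signal restricted to candidate frequency bands, since impulsive bearing faults show up as high kurtosis in the resonant band while smooth components do not. Band choices and the FFT-masking filter below are assumptions for illustration:

```python
# Simplified band-wise spectral kurtosis, in the spirit of the kurtogram.
import numpy as np

def excess_kurtosis(x):
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3.0

def band_kurtosis(x, fs, bands):
    """Excess kurtosis of x band-passed (by FFT masking) to each band."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    result = []
    for lo, hi in bands:
        Y = np.where((freqs >= lo) & (freqs < hi), X, 0)
        result.append(excess_kurtosis(np.fft.irfft(Y, n=len(x))))
    return result

fs = 1000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 10 * t)   # smooth 10 Hz tone ("shaft" component)
x[::100] += 5.0                  # plus sharp periodic impulses ("fault")
low, high = band_kurtosis(x, fs, [(5, 15), (195, 305)])
print(low < 0.0, high > 0.0)     # tone band vs. impulsive band
```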

  9. Urban land use of the Sao Paulo metropolitan area by automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Niero, M.; Foresti, C.

    1983-01-01

    The separability of urban land use classes in the metropolitan area of Sao Paulo was studied by means of automatic analysis of MSS/LANDSAT digital data. The data were analyzed using the K-means ("média K") and maximum-likelihood (MAXVER) classification algorithms. The land use classes obtained were: CBD/vertical growth area, residential area, mixed area, industrial area, embankment area type 1, embankment area type 2, dense vegetation area and sparse vegetation area. The spectral analysis of representative samples of urban land use classes was done using the "Single Cell" analysis option. The classes CBD/vertical growth area, residential area and embankment area type 2 showed better spectral separability when compared to the other classes.

  10. Performance Analysis of Distributed Applications using Automatic Classification of Communication Inefficiencies

    SciTech Connect

    Vetter, J.

    1999-11-01

    We present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. Our method automatically classifies individual communication operations and it reveals the cause of communication inefficiencies in the application. This classification allows the developer to focus quickly on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, we trace the message operations of MPI applications and then classify each individual communication event using decision tree classification, a supervised learning technique. We train our decision tree using microbenchmarks that demonstrate both efficient and inefficient communication. Since our technique adapts to the target system's configuration through these microbenchmarks, we can simultaneously automate the performance analysis process and improve classification accuracy. Our experiments on four applications demonstrate that our technique can improve the accuracy of performance analysis, and dramatically reduce the amount of data that users must encounter.
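    The microbenchmark-trained classification step can be sketched as follows. scikit-learn is assumed available, and the two features (message size, wait time) and labels are synthetic stand-ins for the paper's MPI trace data:

```python
# Illustrative decision-tree classification of communication events.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic "microbenchmark" training events: [bytes, wait time in ms].
# Long waits are labelled inefficient regardless of message size.
X_train = np.array([[1e3, 0.1], [1e4, 0.2], [1e5, 0.5],
                    [1e3, 50.0], [1e4, 80.0], [1e5, 120.0]])
y_train = ["efficient", "efficient", "efficient",
           "inefficient", "inefficient", "inefficient"]

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)

# Classify two observed events with the same size but different waits.
preds = clf.predict([[2e4, 0.3], [2e4, 90.0]])
print(list(preds))
```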

  11. Automatic analysis of image of surface structure of cell wall-deficient EVC.

    PubMed

    Li, S; Hu, K; Cai, N; Su, W; Xiong, H; Lou, Z; Lin, T; Hu, Y

    2001-01-01

    Some computer applications for cell characterization in medicine and biology, such as analysis of the surface structure of cell wall-deficient EVC (El Tor Vibrio of Cholera), operate with cell samples taken from very small areas of interest. To perform texture characterization in such an application, only a few texture operators can be employed: the operators should be insensitive to noise and image distortion and be reliable in order to estimate texture quality from images. We therefore apply wavelet theory and mathematical morphology to analyse cellular surface micro-area images obtained by SEM (Scanning Electron Microscope). To describe the quality of the surface structure of cell wall-deficient EVC, we propose a fully automatic computerized method. The image analysis process is carried out in two steps. In the first, we decompose the given image by dyadic wavelet transform and form a higher-resolution image approximation, which allows efficient edge detection. In the second, we apply operations of mathematical morphology to obtain morphological quantitative parameters of the surface structure of cell wall-deficient EVC. The results show that the method can eliminate noise, detect edges and extract feature parameters reliably. In this work, we have built automatic analysis software named "EVC.CELL".

  12. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    NASA Astrophysics Data System (ADS)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, which enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory monitoring impedance cardiography (AICG) device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), both sampled at 200 Hz, and the base impedance signal (Z0) sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option to make manual corrections, which may be necessary to avoid "false positive" recognitions. The application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.
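    The final computation step can be sketched as follows. The formulas HR = 60/RR and CO = SV × HR are standard; SV itself would come from an ICG-specific formula (e.g. Kubicek) that the abstract does not give, so here it is just an input:

```python
# Minimal sketch: hemodynamic variables from detected characteristic points.

def heart_rate(rr_intervals_s):
    """Mean heart rate in beats per minute from R-R intervals (seconds)."""
    return 60.0 / (sum(rr_intervals_s) / len(rr_intervals_s))

def cardiac_output(stroke_volume_ml, hr_bpm):
    """Cardiac output in litres per minute."""
    return stroke_volume_ml * hr_bpm / 1000.0

hr = heart_rate([0.8, 0.8, 0.8])          # 0.8 s intervals -> 75 bpm
print(round(cardiac_output(70.0, hr), 2))  # 70 ml at 75 bpm -> 5.25 L/min
```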

  13. In-air PIXE set-up for automatic analysis of historical document inks

    NASA Astrophysics Data System (ADS)

    Budnar, Miloš; Simčič, Jure; Rupnik, Zdravko; Uršič, Mitja; Pelicon, Primož; Kolar, Jana; Strlič, Matija

    2004-06-01

    Iron gall inks were among the writing materials most commonly used in historical documents of Western civilization. Owing to the corrosive character of these inks, the documents are in danger of being seriously, and in some cases irreversibly, changed. The elemental composition of the inks is important information for choosing an adequate conservation action [Project InkCor, http://www.infosrvr.nuk.uni-lj.si/jana/Inkcor/index.htm, and references within]. Here, in-air PIXE analysis offers an indispensable tool owing to its sensitivity and almost non-destructive character. An experimental approach developed for precise and automatic analysis of documents at the Jožef Stefan Institute Tandetron accelerator is presented. The selected documents were mounted, one at a time, on the positioning board and the chosen ink spots on the sample were irradiated with 1.7 MeV protons. Data acquisition on the selected ink spots proceeds automatically through a measuring pattern determined prior to the measurement. The chemical elements identified in the documents ranged from Si to Pb, among them the significant iron gall ink components Fe, S, K, Cu, Zn, Co, Mn and Ni, determined with a precision of ±10%. The measurements were non-destructive and no visible damage was observed on the irradiated documents.

  14. Automatic determination of chlorine without standard solutions using a biamperometric flow-batch analysis system.

    PubMed

    Nascimento, Valberes B; Selva, Thiago M G; Coelho, Elaine C S; Santos, Francyana P; Antônio, Jadielson L S; Silva, José R; Gaião, Edvaldo N; Araújo, Mário C U

    2010-04-15

    This study presents an automatic analysis system that does not require the use of standard solutions. The system uses an electrochemical flow cell for in-line generation of the standards and operates under the standard addition technique. The versatility of this system was demonstrated by the development of a one-key-touch, fully automatic method for the determination of total available chlorine in real samples. The extremely simple, accurate and inexpensive method is based on biamperometric monitoring of the well-known redox reaction of chlorine with iodide ions in a flow-batch system, where the iodine produced (triiodide ions) generates an electrical current proportional to the chlorine concentration in the sample. The flow-batch parameters were optimized to maximize sensitivity without loss of precision. An excellent linear dependence between the biamperometric signal and the chlorine concentration of the standard additions, and good agreement between the proposed approach and a reference method, were obtained. The method was successfully applied to determine chlorine in several different bleach and chlorinated water samples (r=0.9995, LOD=8.261 x 10(-7) mol L(-1)) and could easily be extended to other oxidants and samples. Comparison to a reference method and recoveries close to 100% demonstrated the reliability of the proposed method. In addition, low residue disposal and reagent consumption, allied with high accuracy and precision, make it very promising for routine applications.
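
    The standard addition technique the system relies on can be shown numerically: plot signal against added standard concentration, fit a line, and read the sample concentration off the magnitude of the x-intercept. This is a textbook sketch with synthetic numbers (dilution effects neglected), not the instrument's actual data processing.

```python
import numpy as np

def standard_addition_conc(added, signal):
    """Estimate sample concentration by the standard-addition method.

    Fit signal = m*added + b; the sample concentration is the magnitude
    of the x-intercept, b/m (dilution effects neglected in this sketch).
    """
    m, b = np.polyfit(added, signal, 1)
    return b / m

# Synthetic data: true concentration 2.0, sensitivity 5 signal units
# per concentration unit, so signal = 5 * (2.0 + added).
added = np.array([0.0, 1.0, 2.0, 3.0])
signal = 5.0 * (2.0 + added)                  # 10, 15, 20, 25
conc = standard_addition_conc(added, signal)  # -> 2.0
```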

  15. Intercellular fluorescence background on microscope slides: some problems and solutions for automatic analysis

    NASA Astrophysics Data System (ADS)

    Piper, Jim; Sudar, Damir; Peters, Don; Pinkel, Daniel

    1994-05-01

    Although high contrast between signal and the dark background is often claimed as a major advantage of fluorescence staining in cytology and cytogenetics, in practice this is not always the case and in some circumstances the inter-cellular or, in the case of metaphase preparations, the inter-chromosome background can be both brightly fluorescent and vary substantially across the slide or even across a single metaphase. Bright background results in low image contrast, making automatic detection of metaphase cells more difficult. The background correction strategy employed in automatic search must both cope with variable background and be computationally efficient. The method employed in a fluorescence metaphase finder is presented, and the compromises involved are discussed. A different set of problems arise when the analysis is aimed at accurate quantification of the fluorescence signal. Some insight into the nature of the background in the case of comparative genomic hybridization is obtained by image analysis of data obtained from experiments using cell lines with known abnormal copy numbers of particular chromosome types.
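
    One common, computationally cheap strategy for correcting a slowly varying background of the kind described above is grey-scale morphological opening, which removes features smaller than the structuring element and keeps the smooth background. This sketch is illustrative only; it is not necessarily the compromise adopted in the metaphase finder.

```python
import numpy as np
from scipy import ndimage

def subtract_background(img, size=15):
    """Estimate slowly varying background by grey-scale opening
    (structures smaller than `size` are removed) and subtract it."""
    background = ndimage.grey_opening(img, size=(size, size))
    return img - background, background

# Linear ramp background plus one bright 3x3 "signal" spot.
y = np.linspace(0, 10, 64)
img = np.tile(y[:, None], (1, 64))
img[30:33, 30:33] += 100
corrected, bg = subtract_background(img)
# The ramp is removed while the small bright spot survives.
```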

  16. Analysis and Exploitation of Automatically Generated Scene Structure from Aerial Imagery

    NASA Astrophysics Data System (ADS)

    Nilosek, David R.

    Recent advances in the field of computer vision, along with ever-increasing computational power, have opened up opportunities in the field of automated photogrammetry. Many researchers have focused on using these powerful computer vision algorithms to extract three-dimensional point clouds of scenes from multi-view imagery, with the ultimate goal of creating a photo-realistic scene model. However, geographically accurate three-dimensional scene models have the potential to be exploited for much more than just visualization. This work looks at utilizing automatically generated scene structure from near-nadir aerial imagery to identify and classify objects within the structure through the analysis of spatial-spectral information. The restriction to this type of imagery is imposed because of its common availability. Popular third-party computer-vision algorithms are used to generate the scene structure. A voxel-based approach for surface estimation is developed using Manhattan-world assumptions, and a surface-estimation confidence metric is also presented. This approach provides the basis for further analysis of surface materials, incorporating spectral information. Two cases of spectral analysis are examined: when additional hyperspectral imagery of the reconstructed scene is available, and when only R,G,B spectral information can be obtained. A method for registering the surface estimation to hyperspectral imagery, through orthorectification, is developed. Atmospherically corrected hyperspectral imagery is used to assign reflectance values to estimated surface facets for physical simulation with DIRSIG. A spatial-spectral region-growing-based segmentation algorithm is developed for the R,G,B-limited case, in order to identify possible materials for user attribution. Finally, an analysis of the geographic accuracy of automatically generated three-dimensional structure is performed. 
An end-to-end, semi-automated workflow

  17. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring
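
    The idea of separating types of change, rather than only mapping its magnitude, can be sketched with two simple per-pixel metrics: Euclidean change magnitude flags any change, while the spectral angle between the two dates distinguishes illumination-like change (same spectral shape) from material change. This is a minimal stand-in, not the dissertation's actual metric set.

```python
import numpy as np

def change_maps(img_t0, img_t1):
    """Per-pixel change magnitude and spectral angle between two
    co-registered multispectral images (bands along the last axis)."""
    diff = img_t1.astype(float) - img_t0.astype(float)
    magnitude = np.linalg.norm(diff, axis=-1)
    dot = (img_t0 * img_t1).sum(-1)
    norms = np.linalg.norm(img_t0, axis=-1) * np.linalg.norm(img_t1, axis=-1)
    angle = np.arccos(np.clip(dot / np.maximum(norms, 1e-12), -1.0, 1.0))
    return magnitude, angle

t0 = np.array([[[1.0, 2.0, 3.0], [1.0, 1.0, 1.0]]])
t1 = np.array([[[2.0, 4.0, 6.0],    # same spectrum, doubled brightness
                [1.0, 1.0, 5.0]]])  # different spectral shape
mag, ang = change_maps(t0, t1)
# Pixel 0 changes in brightness only (angle ~ 0); pixel 1 changes shape.
```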

  18. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    PubMed

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-08

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented.
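
    The optimization at the heart of the procedure has the classic upper-bound LP form: each variable is the rotation rate on a candidate yield-line discontinuity, the objective is plastic dissipation, and the external work of the load is normalised to one, so the optimal objective equals the collapse load multiplier. The toy below shows only that LP structure (numbers invented); it is not the paper's actual DLO formulation with nodal compatibility constraints.

```python
from scipy.optimize import linprog

# Two candidate yield-line mechanisms with invented coefficients:
# dissipation per unit rotation, and external work per unit rotation.
dissipation = [3.0, 5.0]
work = [[1.0, 1.0]]
res = linprog(c=dissipation, A_eq=work, b_eq=[1.0], bounds=(0, None))
load_multiplier = res.fun   # the cheapest mechanism governs -> 3.0
```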

  19. Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain

    NASA Astrophysics Data System (ADS)

    Krauß, Thomas; Fischer, Peter

    2016-08-01

    In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News coverage shows that the majority of occurring natural hazards are flood events, and many flood prediction systems have already been developed. However, most existing systems for deriving areas endangered by flooding events are based only on horizontal and vertical distances to existing rivers and lakes; typically, such systems do not take into account dangers arising directly from heavy rain events. In a study conducted by us together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and evaluate the results using the available insurance data.
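
    A very simple surrogate for DEM-only rain-hazard screening is to flag terrain sinks: cells strictly lower than all eight neighbours have no surface outflow and collect water during heavy rain. The paper's three classification methods are not reproduced; this only illustrates the kind of purely elevation-based criterion involved.

```python
import numpy as np
from scipy import ndimage

def local_sinks(dem):
    """Flag cells strictly lower than all eight neighbours."""
    footprint = np.ones((3, 3), bool)
    footprint[1, 1] = False   # exclude the centre cell itself
    neigh_min = ndimage.minimum_filter(dem, footprint=footprint,
                                       mode="nearest")
    return dem < neigh_min

dem = np.array([[5.0, 5.0, 5.0],
                [5.0, 1.0, 5.0],
                [5.0, 5.0, 5.0]])
sinks = local_sinks(dem)   # only the centre cell is a sink
```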

  20. Automatic fuzzy object-based analysis of VHSR images for urban objects extraction

    NASA Astrophysics Data System (ADS)

    Sebari, Imane; He, Dong-Chen

    2013-05-01

    We present an automatic approach for object extraction from very high spatial resolution (VHSR) satellite images based on Object-Based Image Analysis (OBIA). The proposed solution requires no input data other than the studied image; no input parameters are required. First, an automatic non-parametric cooperative segmentation technique is applied to create object primitives. A fuzzy rule base is then developed from the human knowledge used for image interpretation. The rules integrate spectral, textural, geometric and contextual object properties. The classes of interest are tree, lawn, bare soil and water for natural classes, and building, road and parking lot for man-made classes. Fuzzy logic is integrated in our approach in order to manage the complexity of the studied subject, to reason with imprecise knowledge, and to give information on the precision and certainty of the extracted objects. The proposed approach was applied to extracts of Ikonos images of the city of Sherbrooke (Canada). An overall total extraction accuracy of 80% was observed; the correctness rates obtained for the building, road and parking lot classes are 81%, 75% and 60%, respectively.
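
    A fuzzy rule of the kind described combines membership functions over object properties with a fuzzy AND (minimum). The sketch below is in that spirit only: the rule, the property names and the trapezoid thresholds are invented for illustration, not taken from the paper's rule base.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside (a, d), 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical rule: "IF vegetation index is high AND elongation is low
# THEN the object is lawn" (all thresholds invented).
def lawn_membership(ndvi, elongation):
    mu_veg = trapezoid(ndvi, 0.2, 0.4, 1.0, 1.01)
    mu_compact = trapezoid(elongation, -0.01, 0.0, 1.5, 3.0)
    return min(mu_veg, mu_compact)     # fuzzy AND = minimum

score = lawn_membership(ndvi=0.7, elongation=1.0)   # -> 1.0
edge = lawn_membership(ndvi=0.3, elongation=1.0)    # -> 0.5 (borderline)
```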

  1. Semi-automatic system for UV images analysis of historical musical instruments

    NASA Astrophysics Data System (ADS)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or areas treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is made by analyzing the image in the HSV color model, the one most similar to human perception. The achievable result is more accurate than a manual selection, because the tool can also detect points that users fail to recognize as similar owing to perceptual illusions. The application has been developed following usability guidelines, and its Human-Computer Interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
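
    The per-pixel decision behind step (ii) can be sketched as an HSV comparison against the seed colour, with hue treated as a circular quantity. The tolerance values here are illustrative assumptions, not the application's actual settings.

```python
import colorsys

def same_perceived_color(seed_rgb, pixel_rgb,
                         hue_tol=0.05, sat_tol=0.2, val_tol=0.2):
    """Decide whether a pixel matches the colour of the user-selected
    seed area, comparing in HSV (hue in [0, 1) wraps around)."""
    h1, s1, v1 = colorsys.rgb_to_hsv(*[c / 255.0 for c in seed_rgb])
    h2, s2, v2 = colorsys.rgb_to_hsv(*[c / 255.0 for c in pixel_rgb])
    dh = min(abs(h1 - h2), 1.0 - abs(h1 - h2))   # circular hue distance
    return (dh <= hue_tol and abs(s1 - s2) <= sat_tol
            and abs(v1 - v2) <= val_tol)

match = same_perceived_color((200, 40, 40), (210, 50, 45))   # similar reds
differ = same_perceived_color((200, 40, 40), (40, 200, 40))  # red vs green
```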

  2. Automatic analysis of left ventricular ejection fraction using stroke volume images.

    PubMed

    Nelson, T R; Verba, J W; Bhargava, V; Shabetai, R; Slutsky, R

    1983-01-01

    The purpose of this study was to analyze, validate, and report on an automatic computer algorithm for analyzing left ventricular ejection fraction and to indicate future applications of the technique to other chambers and more advanced measurements. Thirty-eight patients were studied in the cardiac catheterization laboratory by equilibrium radionuclide ventriculography and concurrent contrast ventriculography. The temporal and spatial behavior of each picture element in a filtered stroke volume image series was monitored throughout the cardiac cycle. Pixels that met specific phase, amplitude, and derivative criteria were assigned to the appropriate chamber. Volume curves were generated from regions of interest for each chamber to enable calculation of the left ventricular ejection fraction. Left ventricular ejection fractions showed a good correlation (r = 0.89) between the two techniques. Ejection fractions ranged between 0.12 and 0.88, showing a wide range of application. It is concluded that automatic analysis of left ventricular ejection fraction is possible using the present algorithm and will be useful in improving the reproducibility and providing more accurate information during exercise protocols, pharmaceutical interventions, and routine clinical studies.
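
    The final calculation on the chamber volume curve is the standard ejection-fraction formula; a minimal sketch with an invented curve (in the gated blood-pool setting of the abstract, the "volumes" would be background-corrected counts in the LV region of interest):

```python
def ejection_fraction(volume_curve):
    """LVEF from a ventricular volume (or count) curve: (EDV - ESV) / EDV."""
    edv = max(volume_curve)   # end-diastolic value
    esv = min(volume_curve)   # end-systolic value
    return (edv - esv) / edv

# Hypothetical volume curve over one cardiac cycle.
ef = ejection_fraction([100, 90, 70, 50, 40, 55, 75, 95])  # -> 0.6
```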

  3. Automatic geocoding of high-value targets using structural image analysis and GIS data

    NASA Astrophysics Data System (ADS)

    Soergel, Uwe; Thoennessen, Ulrich

    1999-12-01

    Geocoding based merely on navigation data and a sensor model is often not possible or not precise enough. In these cases, improving the preregistration through image-based approaches is a solution. Owing to the large amount of data in remote sensing, automatic geocoding methods are necessary. For geocoding purposes, appropriate tie points, present in both image and map, have to be detected and matched; the tie points form the basis of the transformation function. Assigning the tie points is a combinatorial problem whose size depends on the number of tie points. This number can be reduced by using structural tie points such as corners or crossings of prominent extended targets (e.g. harbors, airfields), which also improves the reliability of the tie points. Our approach extracts structural tie points independently in the image and in the vector map by model-based image analysis. The vector map is provided by a GIS using the ATKIS database. The model parameters are extracted from maps or from collateral information about the scenario. The two sets of tie points are automatically matched with a Geometric Hashing algorithm. The algorithm was successfully applied to VIS, IR and SAR data.

  4. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  5. Group-wise automatic mesh-based analysis of cortical thickness

    NASA Astrophysics Data System (ADS)

    Vachet, Clement; Cody Hazlett, Heather; Niethammer, Marc; Oguz, Ipek; Cates, Joshua; Whitaker, Ross; Piven, Joseph; Styner, Martin

    2011-03-01

    The analysis of neuroimaging data from pediatric populations presents several challenges. There are normal variations in brain shape from infancy to adulthood and normal developmental changes related to tissue maturation. Measurement of cortical thickness is one important way to analyze such developmental tissue changes. We developed a novel framework that allows group-wise automatic mesh-based analysis of cortical thickness. Our approach is divided into four main parts. First an individual pre-processing pipeline is applied on each subject to create genus-zero inflated white matter cortical surfaces with cortical thickness measurements. The second part performs an entropy-based group-wise shape correspondence on these meshes using a particle system, which establishes a trade-off between an even sampling of the cortical surfaces and the similarity of corresponding points across the population using sulcal depth information and spatial proximity. A novel automatic initial particle sampling is performed using a matched 98-lobe parcellation map prior to a particle-splitting phase. Third, corresponding re-sampled surfaces are computed with interpolated cortical thickness measurements, which are finally analyzed via a statistical vertex-wise analysis module. This framework consists of a pipeline of automated 3D Slicer compatible modules. It has been tested on a small pediatric dataset and incorporated in an open-source C++ based high-level module called GAMBIT. GAMBIT's setup allows efficient batch processing, grid computing and quality control. The current research focuses on the use of an average template for correspondence and surface re-sampling, as well as thorough validation of the framework and its application to clinical pediatric studies.
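
    The final "statistical vertex-wise analysis" step reduces, in its simplest form, to a test at each mesh vertex across corresponding thickness samples. A minimal sketch with a plain two-sample t-test on synthetic data (the framework's actual module also handles covariates and multiple-comparison correction):

```python
import numpy as np
from scipy import stats

def vertexwise_ttest(group_a, group_b):
    """Two-sample t-test at each vertex; inputs are arrays of shape
    (n_subjects, n_vertices) of corresponding thickness values."""
    t, p = stats.ttest_ind(group_a, group_b, axis=0)
    return t, p

rng = np.random.default_rng(1)
a = rng.normal(3.0, 0.1, (10, 5))   # thickness ~3 mm at 5 vertices
b = a.copy()
b[:, 2] += 1.0                      # group difference only at vertex 2
t, p = vertexwise_ttest(a, b)       # p is small only at vertex 2
```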

  6. AGARD Flight Test Series. Volume 10. Weapon Delivery Analysis and Ballistic Flight Testing (L’Analyse du Largage d’Armes et les en Vol Balistique).

    DTIC Science & Technology

    1992-07-01

    ... Federal Air Force currently flies three weapon systems: F-4F Phantom - automatic release possible; Alpha Jet - CCIP mode; PA-200 Tornado - automatic ...

  7. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observation is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  8. An image analysis approach for automatically re-orienteering CT images for dental implants.

    PubMed

    Cucchiara, Rita; Lamma, Evelina; Sansoni, Tommaso

    2004-06-01

    In the last decade, computerized tomography (CT) has become the most frequently used imaging modality for correct pre-operative implant planning. In this work, we present an image analysis and computer vision approach able to identify, from the reconstructed 3D data set, the optimal cutting plane specific to each implant to be planned, in order to obtain the best view of the implant site and accurate measurements. If the patient requires more implants, different cutting planes are automatically identified, and the axial and cross-sectional images can be re-oriented according to each of them. In the paper, we describe the algorithms defined to recognize 3D markers (each one aligned with a missing tooth for which an implant has to be planned) in the reconstructed 3D space, and the results of processing real exams, in terms of effectiveness, precision and reproducibility of the measurement.
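
    The geometric core of re-orienting slices is fitting a plane to points recovered from a detected marker. A minimal sketch of that step only (marker detection itself is omitted, and the three points here are invented):

```python
import numpy as np

def cutting_plane(p1, p2, p3):
    """Plane through three 3-D points, returned as (unit normal n, d)
    with the plane defined by n . x = d."""
    n = np.cross(p2 - p1, p3 - p1)
    n = n / np.linalg.norm(n)
    return n, float(n @ p1)

n, d = cutting_plane(np.array([0.0, 0.0, 0.0]),
                     np.array([1.0, 0.0, 0.0]),
                     np.array([0.0, 1.0, 0.0]))   # the z = 0 plane
```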

  9. Application of automatic image analysis for the investigation of autoclaved aerated concrete structure

    SciTech Connect

    Petrov, I.; Schlegel, E. (Inst. fuer Silikattechnik)

    1994-01-01

    Autoclaved aerated concrete (AAC) is formed from small-grained mixtures of raw materials with Al-powder as an air-entraining agent. Owing to its high porosity, AAC has a low bulk density, which leads to very good heat-insulating qualities. Automatic image analysis, in connection with stereology and stochastic geometry, was used to describe the size distribution of air pores in autoclaved concrete. The experiments were carried out on AAC samples with extremely different bulk densities and compressive strengths. The assumption of an elliptic pore shape leads to an unambiguous characterization of the structure by bi-histograms. It will be possible to calculate the spatial pore size distribution from these histograms if the pores are assumed to be spheroids. A marked point field model and the pair correlation function g_a(r) were used to describe the pore structure.
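
    The 2-D starting point for such pore-size statistics is labelling connected pore sections and measuring their sizes. A minimal sketch using equivalent circle diameters (the stereological reconstruction to spatial distributions described above is not included):

```python
import numpy as np
from scipy import ndimage

def pore_size_distribution(binary_pores, pixel_size=1.0):
    """Label connected pore sections in a binary image and return their
    equivalent circle diameters (diameter of a circle of equal area)."""
    labels, n = ndimage.label(binary_pores)
    areas = ndimage.sum(binary_pores, labels, range(1, n + 1))
    return np.sqrt(4.0 * areas / np.pi) * pixel_size

img = np.zeros((12, 12), int)
img[1:3, 1:3] = 1      # pore section of area 4
img[6:10, 6:10] = 1    # pore section of area 16
diam = pore_size_distribution(img)
```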

  10. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method has two unique merits: higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed more than a hundredfold at the same error level. In computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method also has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
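
    The general idea of automatically adjusting the propagation step can be illustrated with a generic step-doubling integrator: one step of size h is compared with two half steps, and h shrinks or grows to keep their difference below a tolerance. This is an illustrative sketch of the step-adjustment idea only, not the paper's higher-order APA scheme.

```python
import math

def integrate_adaptive(f, y0, z0, z1, tol=1e-8, h=0.1):
    """Integrate dy/dz = f(z, y) from z0 to z1 with midpoint (RK2) steps
    and step-doubling error control."""
    def rk2(y, z, h):
        k1 = f(z, y)
        return y + h * f(z + h / 2, y + h / 2 * k1)
    y, z = y0, z0
    while z < z1:
        h = min(h, z1 - z)
        big = rk2(y, z, h)                                # one full step
        small = rk2(rk2(y, z, h / 2), z + h / 2, h / 2)   # two half steps
        err = abs(big - small)
        if err > tol:
            h /= 2          # too coarse: retry with a smaller step
            continue
        y, z = small, z + h
        if err < tol / 10:
            h *= 2          # very accurate: lengthen the next step
    return y

# Toy gain equation dP/dz = g * P over unit length; exact answer exp(g).
p_out = integrate_adaptive(lambda z, p: 0.5 * p, 1.0, 0.0, 1.0)
```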

  11. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    PubMed

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, making it difficult to identify the effects that are truly related to the underlying neuronal activity. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component, FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original
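
    The temporal feature mentioned as an example above — the proportion of a component time series' fluctuations at high frequencies — can be computed directly from the power spectrum. This shows the flavour of a single feature only; FIX combines a large number of such spatial and temporal features, and this is not its implementation.

```python
import numpy as np

def high_freq_fraction(ts, fs, cutoff_hz):
    """Fraction of spectral power of a (demeaned) time series lying
    above cutoff_hz, given sampling rate fs in Hz."""
    power = np.abs(np.fft.rfft(ts - ts.mean())) ** 2
    freqs = np.fft.rfftfreq(len(ts), d=1.0 / fs)
    return power[freqs > cutoff_hz].sum() / power.sum()

fs = 100.0
t = np.arange(0, 10, 1 / fs)
slow = np.sin(2 * np.pi * 0.1 * t)    # slow, neuronal-like fluctuation
fast = np.sin(2 * np.pi * 30.0 * t)   # fast, artefact-like fluctuation
f_slow = high_freq_fraction(slow, fs, cutoff_hz=10.0)   # near 0
f_fast = high_freq_fraction(fast, fs, cutoff_hz=10.0)   # near 1
```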

  12. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    NASA Astrophysics Data System (ADS)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced that have proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual, and hence time consuming and prone to human error. In this research we propose an automatic image-analysis-based approach to measure the size of an ulcer, and its subsequent further investigation, to determine the effectiveness of any treatment process followed. In ophthalmology an ulcer area is detected for further inspection via luminous excitation of a dye. Usually, in the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye), the ulcer area is excited to be luminous green in colour compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially, a pre-processing stage that carries out local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly, we deal with the removal of potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly, the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise, and image quality degradation. The ratio of the ulcer area confined within the corneal area to the corneal area is used as the measure of comparison. We demonstrate the use of the proposed tool in the analysis of the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of corneal ulcer size over time.

  13. Development of a machine learning technique for automatic analysis of seafloor image data: Case example, Pogonophora coverage at mud volcanoes

    NASA Astrophysics Data System (ADS)

    Lüdtke, A.; Jerosch, K.; Herzog, O.; Schlüter, M.

    2012-02-01

    Digital image processing provides powerful tools for fast and precise analysis of large image data sets in marine and geoscientific applications. Because of the increasing volume of georeferenced image and video data acquired by underwater platforms such as remotely operated vehicles, means of automatic analysis of the acquired image data are required. A new and fast-developing application is the combination of video imagery and mosaicking techniques for seafloor habitat mapping. In this article we introduce an approach to fully automatic detection and quantification of Pogonophora coverage in seafloor video mosaics from mud volcanoes. The automatic recognition is based on textural image features extracted from the raw image data and classification using machine learning techniques. Classification rates of up to 98.86% were achieved on the training data. The approach was extensively validated on a data set of more than 4000 seafloor video mosaics from the Håkon Mosby Mud Volcano.
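
    The feature-then-classify pattern described above can be sketched end to end with a deliberately tiny texture descriptor and a nearest-centroid rule. The paper uses richer textural features and trained machine-learning classifiers; everything below (features, classes, data) is an invented illustration of the pipeline shape only.

```python
import numpy as np

def texture_features(patch):
    """Tiny texture descriptor: mean, variance, mean absolute gradient."""
    gy, gx = np.gradient(patch.astype(float))
    return np.array([patch.mean(), patch.var(),
                     np.abs(gy).mean() + np.abs(gx).mean()])

def nearest_centroid(feat, centroids):
    """Assign the patch to the class with the closest feature centroid."""
    d = [np.linalg.norm(feat - c) for c in centroids.values()]
    return list(centroids)[int(np.argmin(d))]

rng = np.random.default_rng(0)
smooth = np.full((32, 32), 0.3) + rng.normal(0, 0.01, (32, 32))
rough = rng.uniform(0, 1, (32, 32))   # high-texture, Pogonophora-like patch
centroids = {"sediment": texture_features(smooth),
             "pogonophora": texture_features(rough)}
label = nearest_centroid(texture_features(rough), centroids)
```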

  14. Automatic Differentiation Package

    SciTech Connect

    Gay, David M.; Phipps, Eric; Bartlett, Roscoe

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization, and uncertainty quantification.
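    The operator-overloading idea behind Sacado's forward mode can be shown in miniature with dual numbers: each arithmetic operation carries both a value and its derivative through the chain rule. This is a language-neutral sketch of the technique, not Sacado's C++ API (only `+` and `*` are overloaded here).

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        o = self._lift(other)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Derivative of f at x: seed the input with derivative 1 and read it out."""
    return f(Dual(x, 1.0)).der
```

The result is exact to machine precision, neither symbolic nor a finite-difference approximation, which is the point made in the bibliography record further below.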

  15. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    PubMed

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis in 4-chamber (4CH) cine MR imaging, and to investigate the agreement between semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy and repaired tetralogy of Fallot who underwent cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated that the optimal cut-offs of the minimum longitudinal strain value (εL_min) diagnosed LV and RV dysfunction with high accuracy (LV εL_min = -7.8 %: area under the curve, 0.89; sensitivity, 83 %; specificity, 91 %; RV εL_min = -15.7 %: area under the curve, 0.82; sensitivity, 92 %; specificity, 68 %). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97, p < 0.01; RV, r = 0.79, p < 0.01). Our semi-automatic longitudinal strain analysis in 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.
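    The strain quantity and cutoff in this record are easy to make concrete. The strain formula below is the standard Lagrangian definition (relative shortening from the end-diastolic length); the -7.8 % LV cutoff is the value reported in the abstract, and the function names are illustrative.

```python
def longitudinal_strain(l0, l_min):
    """Peak longitudinal strain in percent: relative shortening of the
    ventricular wall length from end-diastole (l0) to its minimum (l_min).
    More negative values mean stronger contraction."""
    return (l_min - l0) / l0 * 100.0

def lv_dysfunction(strain_percent, cutoff=-7.8):
    """Flag LV dysfunction when the minimum strain fails to reach the
    cutoff from the abstract: a less negative value suggests dysfunction."""
    return strain_percent > cutoff
```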

  16. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool.
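    The two statistical steps used here, principal component analysis to extract dietary patterns and multiple linear regression to predict BMI, can both be written compactly with numpy. This is a generic sketch of those standard techniques on synthetic data, not the study's actual analysis code.

```python
import numpy as np

def principal_components(X, k):
    """PCA via SVD of the mean-centered data matrix: returns the first k
    component directions and the per-sample scores along them."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k], Xc @ Vt[:k].T

def fit_linear(X, y):
    """Ordinary least squares with an intercept column; returns
    [intercept, slope_1, ..., slope_p]."""
    X = np.asarray(X, dtype=float)
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

In the study's setting, the PCA scores (dietary-pattern preferences) would enter `fit_linear` as predictors alongside age, gender, and intake variables.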

  17. A system for automatic recording and analysis of motor activity in rats.

    PubMed

    Heredia-López, Francisco J; May-Tuyub, Rossana M; Bata-García, José L; Góngora-Alfaro, José L; Alvarez-Cervera, Fernando J

    2013-03-01

    We describe the design and evaluation of an electronic system for the automatic recording of motor activity in rats. The device continually locates the position of a rat inside a transparent acrylic cube (50 cm/side) with infrared sensors arranged on its walls so as to correspond to the x-, y-, and z-axes. The system is governed by two microcontrollers. The raw data are saved in a text file within a secure digital memory card, and offline analyses are performed with a library of programs that automatically compute several parameters based on the sequence of coordinates and the time of occurrence of each movement. Four analyses can be made at specified time intervals: traveled distance (cm), movement speed (cm/s), time spent in vertical exploration (s), and thigmotaxis (%). In addition, three analyses are made for the total duration of the experiment: time spent at each x-y coordinate pair (min), time spent on vertical exploration at each x-y coordinate pair (s), and frequency distribution of vertical exploration episodes of distinct durations. User profiles of frequently analyzed parameters may be created and saved for future experimental analyses, thus obtaining a full set of analyses for a group of rats in a short time. The performance of the developed system was assessed by recording the spontaneous motor activity of six rats, while their behaviors were simultaneously videotaped for manual analysis by two trained observers. A high and significant correlation was found between the values measured by the electronic system and by the observers.
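    Two of the analyses listed above, traveled distance and movement speed, follow directly from the recorded coordinate sequence. A minimal sketch (function names are illustrative; the real system also handles the z-axis for vertical exploration):

```python
import math

def traveled_distance(points):
    """Total path length (cm) over a sequence of (x, y) positions."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def mean_speed(points, times):
    """Average speed (cm/s): path length divided by elapsed time,
    with times[i] the timestamp of points[i] in seconds."""
    return traveled_distance(points) / (times[-1] - times[0])
```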

  18. Fractal Analysis of Elastographic Images for Automatic Detection of Diffuse Diseases of Salivary Glands: Preliminary Results

    PubMed Central

    Badea, Alexandru Florin; Lupsor Platon, Monica; Crisan, Maria; Cattani, Carlo; Badea, Iulia; Pierro, Gaetano; Sannino, Gianpaolo; Baciut, Grigore

    2013-01-01

    The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases like medical imaging diagnosis, these details are very important since they might contain some hidden information about the possible existence of certain pathological lesions like tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. The fractal analysis is of great importance in relation to a quantitative evaluation of “real-time” elastography, a procedure considered to be operator dependent in the current clinical practice. Mathematical analysis reveals significant discrepancies among normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology. PMID:23762183
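    The fractal dimension (FD) central to this record is commonly estimated by box counting: count how many boxes of shrinking size are needed to cover the pattern and fit the log-log slope. The sketch below is that standard estimator on a binary image, not the authors' elastography-specific pipeline.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a binary image by box counting:
    count occupied boxes at several scales, then fit the slope of
    log(count) against log(1/size)."""
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in sizes:
        h, w = mask.shape
        # Any occupied pixel marks the whole s-by-s box as occupied.
        boxes = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A straight line yields FD near 1 and a filled region near 2; pathological tissue patterns are expected to fall at intermediate, texture-dependent values.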

  19. Fractal analysis of elastographic images for automatic detection of diffuse diseases of salivary glands: preliminary results.

    PubMed

    Badea, Alexandru Florin; Lupsor Platon, Monica; Crisan, Maria; Cattani, Carlo; Badea, Iulia; Pierro, Gaetano; Sannino, Gianpaolo; Baciut, Grigore

    2013-01-01

    The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases like medical imaging diagnosis, these details are very important since they might contain some hidden information about the possible existence of certain pathological lesions like tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. The fractal analysis is of great importance in relation to a quantitative evaluation of "real-time" elastography, a procedure considered to be operator dependent in the current clinical practice. Mathematical analysis reveals significant discrepancies among normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology.

  20. Application of an automatic adaptive filter for Heart Rate Variability analysis.

    PubMed

    Dos Santos, Laurita; Barroso, Joaquim J; Macau, Elbert E N; de Godoy, Moacir F

    2013-12-01

    The presence of artifacts and noise effects in temporal series can seriously hinder the analysis of Heart Rate Variability (HRV). The tachograms should be carefully edited to avoid erroneous interpretations. The physician should carefully analyze the tachogram in order to detect points that might be associated with unlikely biophysical behavior and manually eliminate them from the data series. However, this is a time-consuming procedure. To facilitate the pre-analysis of the tachogram, this study uses a method of data filtering based on an adaptive filter which is quickly able to analyze a large amount of data. The method was applied to 229 time series from a database of patients with different clinical conditions: premature newborns, full-term newborns, healthy young adults, adults submitted to a very-low-calorie diet, and adults under preoperative evaluation for coronary artery bypass grafting. This proposed method is compared to the demanding conventional method, wherein the corrections of occasional ectopic beats and artifacts are usually manually executed by a specialist. To confirm the reliability of the results obtained, correlation coefficients were calculated, using both automatic and manual methods of filtering for each HRV index selected. A high correlation between the results was found, with highly significant p values, for all cases, except for some parameters analyzed in the premature newborns group, an issue that is thoroughly discussed. The authors concluded that the proposed adaptive filtering method helps to efficiently handle the task of editing temporal series for HRV analysis.
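    A much-simplified stand-in for such a filter flags RR intervals that deviate strongly from a local median and replaces them, which is the essence of removing ectopic beats before HRV indices are computed. The window size and 20 % tolerance below are assumptions, not parameters from the paper.

```python
import statistics

def filter_rr(rr, window=5, tol=0.2):
    """Replace RR intervals deviating more than `tol` (fractional) from the
    local median with that median; return the cleaned series and the
    indices that were edited."""
    cleaned, edited = list(rr), []
    for i, v in enumerate(rr):
        lo = max(0, i - window // 2)
        med = statistics.median(rr[lo:lo + window])
        if abs(v - med) > tol * med:
            cleaned[i] = med
            edited.append(i)
    return cleaned, edited
```

The paper's adaptive filter adjusts its acceptance band from the local signal statistics rather than using a fixed tolerance, but the editing step it automates is the one shown.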

  1. Adaptive automatic data analysis in full-field fringe-pattern-based optical metrology

    NASA Astrophysics Data System (ADS)

    Trusiak, Maciej; Patorski, Krzysztof; Sluzewski, Lukasz; Pokorski, Krzysztof; Sunderland, Zofia

    2016-12-01

    Fringe pattern processing and analysis is an important task of full-field optical measurement techniques like interferometry, digital holography, structured illumination and moiré. In this contribution we present several adaptive automatic data analysis solutions based on the notion of the Hilbert-Huang transform for measurand retrieval via fringe pattern phase and amplitude demodulation. The Hilbert-Huang transform consists of a 2D empirical mode decomposition algorithm and Hilbert spiral transform analysis. Empirical mode decomposition adaptively dissects a meaningful number of same-scale subimages from the analyzed pattern - it is a data-driven method. Appropriately managing this set of unique subimages results in a very powerful fringe pre-filtering tool. Phase/amplitude demodulation is performed using the Hilbert spiral transform aided by a local fringe orientation estimator. We describe several optical measurement techniques for the characterization of technical and biological objects based on especially tailored Hilbert-Huang algorithm modifications for fringe pattern denoising, detrending and amplitude/phase demodulation.
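    The amplitude-demodulation step can be illustrated in 1D: the analytic signal (computed here directly with the FFT) has the signal's amplitude envelope as its magnitude. This is a one-dimensional sketch of the idea; the paper's Hilbert spiral transform is its 2D, orientation-aware generalization.

```python
import numpy as np

def analytic_signal(x):
    """Discrete analytic signal via the FFT: zero the negative frequencies,
    double the positive ones, and invert."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def envelope(x):
    """Amplitude envelope: magnitude of the analytic signal."""
    return np.abs(analytic_signal(x))
```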

  2. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    SciTech Connect

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
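    The supervised-learning step can be sketched with the smallest possible decision tree, a one-level stump trained on microbenchmark-style labeled events. This is a minimal stand-in for the full decision-tree classifier described in the record; the feature layout (e.g. operation duration, message size) is an assumption.

```python
import numpy as np

def train_stump(X, y):
    """One-level decision tree: pick the (feature, threshold, polarity)
    that best separates efficient (0) from inefficient (1) events."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            pred = (X[:, f] > t).astype(int)
            for flip in (False, True):
                p = 1 - pred if flip else pred
                err = np.mean(p != y)
                if best is None or err < best[0]:
                    best = (err, f, t, flip)
    return best[1:]

def predict_stump(model, X):
    f, t, flip = model
    pred = (np.asarray(X, dtype=float)[:, f] > t).astype(int)
    return 1 - pred if flip else pred
```

A real decision tree recursively applies such splits; training on microbenchmarks from the target system is what makes the thresholds machine-specific, as the record emphasizes.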

  3. Automatic Classification of Staphylococci by Principal-Component Analysis and a Gradient Method1

    PubMed Central

    Hill, L. R.; Silvestri, L. G.; Ihm, P.; Farchi, G.; Lanciani, P.

    1965-01-01

    Hill, L. R. (Università Statale, Milano, Italy), L. G. Silvestri, P. Ihm, G. Farchi, and P. Lanciani. Automatic classification of staphylococci by principal-component analysis and a gradient method. J. Bacteriol. 89:1393–1401. 1965.—Forty-nine strains from the species Staphylococcus aureus, S. saprophyticus, S. lactis, S. afermentans, and S. roseus were submitted to different taxometric analyses; clustering was performed by single linkage, by the unweighted pair group method, and by principal-component analysis followed by a gradient method. Results were substantially the same with all methods. All S. aureus clustered together, sharply separated from S. roseus and S. afermentans; S. lactis and S. saprophyticus fell between, with the latter nearer to S. aureus. The main purpose of this study was to introduce a new taxometric technique, based on principal-component analysis followed by a gradient method, and to compare it with some other methods in current use. Advantages of the new method are complete automation and therefore greater objectivity, execution of the clustering in a space of reduced dimensions in which different characters have different weights, easy recognition of taxonomically important characters, and opportunity for representing clusters in three-dimensional models; the principal disadvantage is the need for large computer facilities. PMID:14293013

  4. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    SciTech Connect

    Gainey, M; Rothe, T

    2015-06-15

    Purpose: End-to-end QA for IMRT/VMAT is time-consuming. Automated linac log file analysis, with recalculation of the daily recorded fluence and hence dose distribution, brings this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive (figure). A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within ±0.10 mm: 57% within ±0.01 mm; 89% within ±0.05 mm. The mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, with a mean of 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated-fluence errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
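    The leaf-position statistics quoted in the Results (mean deviation, fraction within each tolerance) are straightforward to compute from logged versus planned positions. A minimal sketch, with hypothetical function and field names:

```python
import numpy as np

def leaf_deviation_report(planned, actual, tolerances=(0.01, 0.05, 0.10)):
    """Summarize MLC leaf position errors from a trajectory log:
    mean absolute deviation (mm) and the fraction of samples falling
    within each tolerance (mm)."""
    dev = np.abs(np.asarray(actual, dtype=float) - np.asarray(planned, dtype=float))
    return {
        "mean_mm": float(dev.mean()),
        "within": {t: float((dev <= t).mean()) for t in tolerances},
    }
```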

  5. Automatic roof plane detection and analysis in airborne lidar point clouds for solar potential assessment.

    PubMed

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature to decompose the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification. It results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The presented method fully automatically detected a subset of 809 out of 1,071 roof planes for which the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m².
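    The per-point normal vector used to group points into planar patches is typically obtained by fitting a plane to a point's neighbourhood. A generic sketch of that step (least-squares plane fit via SVD, plus a normal-similarity test for grouping; the 5 degree tolerance is an assumption, not a value from the paper):

```python
import numpy as np

def plane_normal(points):
    """Unit normal of the best-fit plane through 3D points:
    the right singular vector of the centered coordinates with the
    smallest singular value."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    n = vt[-1]
    return n / np.linalg.norm(n)

def same_plane(n1, n2, max_angle_deg=5.0):
    """True when two normals agree to within max_angle_deg; such points
    can be grouped into one planar roof segment."""
    cos = abs(float(np.dot(n1, n2)))
    return np.degrees(np.arccos(min(1.0, cos))) <= max_angle_deg
```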

  6. Automatic Vehicle Trajectory Extraction for Traffic Analysis from Aerial Video Data

    NASA Astrophysics Data System (ADS)

    Apeltauer, J.; Babinec, A.; Herman, D.; Apeltauer, T.

    2015-03-01

    This paper presents a new approach to simultaneous detection and tracking of vehicles moving through an intersection in aerial images acquired by an unmanned aerial vehicle (UAV). Detailed analysis of spatial and temporal utilization of an intersection is an important step for its design evaluation and further traffic inspection. Traffic flow at intersections is typically very dynamic and requires continuous and accurate monitoring systems. Conventional traffic surveillance relies on a set of fixed cameras or other detectors, requiring a high density of the said devices in order to monitor the intersection in its entirety and to provide data in sufficient quality. Alternatively, a UAV can be converted to a very agile and responsive mobile sensing platform for data collection from such large scenes. However, manual vehicle annotation in aerial images would involve tremendous effort. In this paper, the proposed combination of vehicle detection and tracking aims to tackle the problem of automatic traffic analysis at an intersection from visual data. The presented method has been evaluated in several real-life scenarios.

  7. Automatic Roof Plane Detection and Analysis in Airborne Lidar Point Clouds for Solar Potential Assessment

    PubMed Central

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature to decompose the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification. It results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The presented method fully automatically detected a subset of 809 out of 1,071 roof planes for which the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m². PMID:22346695

  8. Automatic differentiation bibliography

    SciTech Connect

    Corliss, G.F.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  9. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% was suitable grazing land.

  10. Automatic Imitation

    ERIC Educational Resources Information Center

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  11. Digital automatic gain control

    NASA Technical Reports Server (NTRS)

    Uzdy, Z.

    1980-01-01

    Performance analysis, used to evaluate the fitness of several circuits for digital automatic gain control (AGC), indicates that a digital integrator employing a coherent amplitude detector (CAD) is the device best suited for the application. The circuit reduces gain error to half that of a conventional analog AGC while making it possible to automatically modify the response of the receiver to match incoming signal conditions.
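    The digital-integrator AGC concept can be sketched as a simple feedback loop: after each sample, the gain is nudged so the output magnitude integrates toward a target level. This is a generic illustration of the principle, not the circuit from the record; the step size `alpha` and target are assumptions.

```python
def digital_agc(samples, target=1.0, alpha=0.1, gain=1.0):
    """Feedback AGC loop: the integrator accumulates the gain error
    (target minus output magnitude), driving it toward zero."""
    out = []
    for s in samples:
        y = gain * s
        out.append(y)
        gain += alpha * (target - abs(y))  # integrator update
    return out, gain
```

For a constant input of amplitude 0.5 the loop settles at gain 2, where the output magnitude equals the target.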

  12. Automatic aerial image shadow detection through the hybrid analysis of RGB and HIS color space

    NASA Astrophysics Data System (ADS)

    Wu, Jun; Li, Huilin; Peng, Zhiyong

    2015-12-01

    This paper presents our research on automatic shadow detection from high-resolution aerial images through the hybrid analysis of the RGB and HIS color spaces. To this end, the spectral characteristics of shadow are first discussed, and three spectral components, the difference between the normalized blue and normalized red components (B-R), intensity, and saturation, are selected as criteria to obtain an initial segmentation of the shadow region (called the primary segmentation). After that, within the normalized RGB color space and the HIS color space, the shadow region is extracted again (called the auxiliary segmentation) using the Otsu operation, respectively. Finally, the primary and auxiliary segmentations are combined through a logical AND operation to obtain a reliable shadow region. In this step, small shadow areas are removed from the combined shadow region, and morphological algorithms are applied to fill small holes as well. The experimental results show that the proposed approach can effectively detect the shadow region from high-resolution aerial images with a high degree of automation.
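    The Otsu thresholding used for the auxiliary segmentation is a standard algorithm: choose the threshold that maximizes the between-class variance of the histogram. A self-contained numpy sketch (the bin count is an assumption; the paper applies this to intensity and saturation channels):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximizing between-class
    variance (equivalently minimizing within-class variance)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    w0 = np.cumsum(p)                      # class-0 weight up to each bin
    m = np.cumsum(p * np.arange(bins))     # cumulative mean (bin units)
    mt = m[-1]                             # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mt * w0 - m) ** 2 / (w0 * (1.0 - w0))
    between = np.nan_to_num(between)
    k = int(np.argmax(between))
    return edges[k + 1]  # upper edge of the chosen bin
```

Applying this per channel and AND-combining the resulting masks reproduces the record's primary/auxiliary fusion step in miniature.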

  13. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis.

    PubMed

    Liu, Chanjuan; van Netten, Jaap J; van Baal, Jeff G; Bus, Sicco A; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8% ± 1.1% sensitivity and 98.4% ± 0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.
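    Once the two feet are registered, the clinically significant quantity is the per-point temperature difference between corresponding areas. A minimal sketch; the 2.2 °C hotspot cutoff is a commonly cited clinical threshold, not a value taken from this paper:

```python
import numpy as np

def asymmetry_hotspots(left_temp, right_temp, threshold=2.2):
    """Per-point temperature difference between registered left and right
    foot maps, plus a mask of spots whose absolute difference exceeds
    `threshold` degrees Celsius."""
    diff = np.asarray(left_temp, dtype=float) - np.asarray(right_temp, dtype=float)
    return diff, np.abs(diff) > threshold
```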

  14. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NASA Astrophysics Data System (ADS)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.

  15. Automatic Robust Neurite Detection and Morphological Analysis of Neuronal Cell Cultures in High-content Screening

    PubMed Central

    Wu, Chaohong; Schulte, Joost; Sepp, Katharine J.; Littleton, J. Troy

    2011-01-01

    Cell-based high content screening (HCS) is becoming an important and increasingly favored approach in therapeutic drug discovery and functional genomics. In HCS, changes in cellular morphology and biomarker distributions provide an information-rich profile of cellular responses to experimental treatments such as small molecules or gene knockdown probes. One obstacle that currently exists with such cell-based assays is the availability of image processing algorithms that are capable of reliably and automatically analyzing large HCS image sets. HCS images of primary neuronal cell cultures are particularly challenging to analyze due to complex cellular morphology. Here we present a robust method for quantifying and statistically analyzing the morphology of neuronal cells in HCS images. The major advantages of our method over existing software lie in its capability to correct non-uniform illumination using the contrast-limited adaptive histogram equalization method; segment neuromeres using Gabor-wavelet texture analysis; and detect faint neurites by a novel phase-based neurite extraction algorithm that is invariant to changes in illumination and contrast and can accurately localize neurites. Our method was successfully applied to analyze a large HCS image set generated in a morphology screen for polyglutamine-mediated neuronal toxicity using primary neuronal cell cultures derived from embryos of a Drosophila Huntington’s Disease (HD) model. PMID:20405243

  16. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  17. A unified framework for automatic wound segmentation and analysis with deep convolutional neural networks.

    PubMed

    Wang, Changhan; Yan, Xinchen; Smith, Max; Kochhar, Kanika; Rubin, Marcie; Warren, Stephen M; Wrobel, James; Lee, Honglak

    2015-01-01

    Wound surface area changes over multiple weeks are highly predictive of the wound healing process. Furthermore, the quality and quantity of the tissue in the wound bed also offer important prognostic information. Unfortunately, accurate measurements of wound surface area changes are out of reach in the busy wound practice setting. Currently, clinicians estimate wound size by measuring wound width and length after wound treatment, which is highly inaccurate. To address this problem, we propose an integrated system to automatically segment wound regions and analyze wound conditions in wound images. Unlike previous segmentation techniques, which rely on handcrafted features or unsupervised approaches, our proposed deep learning method jointly learns task-relevant visual features and performs wound segmentation. Moreover, learned features are applied to further analysis of wounds in two ways: infection detection and healing progress prediction. To the best of our knowledge, this is the first attempt to automate long-term predictions of general wound healing progress. Our method is computationally efficient and takes less than 5 seconds per wound image (480 by 640 pixels) on a typical laptop computer. Our evaluations on a large-scale wound database demonstrate the effectiveness and reliability of the proposed system.

  18. Automatic identification of mobile and rigid substructures in molecular dynamics simulations and fractional structural fluctuation analysis.

    PubMed

    Martínez, Leandro

    2015-01-01

    The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, possibly resulting in poor quantification of the structural fluctuations and, often, in important fluctuations associated with biological function being overlooked. The motivation of this work is to provide a robust measure of structural mobility that is practical and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of the structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments was named MDLovoFit and is available as free software at: http://leandro.iqm.unicamp.br/mdlovofit.
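    The iterative superposition described above can be sketched as follows: align with a Kabsch rotation, re-select the fraction of atoms with the smallest displacements, and repeat. This is an illustrative reading of the LOVO idea, not the MDLovoFit code; the function names and the 70% fraction are assumptions:

```python
import numpy as np

def kabsch(P, Q):
    """Optimal rotation R minimizing ||R p_i - q_i|| for centered point sets."""
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def lovo_align(mobile, ref, frac=0.7, n_iter=10):
    """Iteratively superpose the fraction of atoms with the smallest
    displacements (a sketch of the LOVO alignment strategy)."""
    n_keep = int(frac * len(ref))
    subset = np.arange(len(ref))           # start from all atoms
    X = mobile.copy()
    for _ in range(n_iter):
        # Center on the current least-mobile subset, then rotate onto ref
        R = kabsch(X[subset] - X[subset].mean(axis=0),
                   ref[subset] - ref[subset].mean(axis=0))
        X = (X - X[subset].mean(axis=0)) @ R.T + ref[subset].mean(axis=0)
        # Re-select the atoms that now move least
        subset = np.argsort(np.linalg.norm(X - ref, axis=1))[:n_keep]
    return X, subset
```

    Because the rotation is fit only on the least-mobile subset, a highly mobile loop no longer inflates the apparent fluctuation of the rigid core.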

  19. Automatic analysis and characterization of the hummingbird wings motion using dense optical flow features.

    PubMed

    Martínez, Fabio; Manzanera, Antoine; Romero, Eduardo

    2015-01-19

    A new method for automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., the pixels with larger velocities. Then, the kinematics and deformation of the wings are characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing, and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation estimates those wing foci with larger deformation. Finally, a local measure of the orientation highlights those regions with maximal deformation. The approach was evaluated on a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw-turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of [Formula: see text] and [Formula: see text] for the global angular acceleration and the global wing deformation, respectively.

  20. Automatic fault diagnosis of rotating machines by time-scale manifold ridge analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; He, Qingbo; Kong, Fanrang

    2013-10-01

    This paper explores an improved time-scale representation that accounts for non-linear structure, for effectively identifying rotating machine faults in the time-scale domain. A new time-scale signature, called the time-scale manifold (TSM), is proposed in this study through combining phase space reconstruction (PSR), the continuous wavelet transform (CWT), and manifold learning. For the TSM generation, an optimal scale band is selected to eliminate the influence of unconcerned scale components, and the noise in the selected band is suppressed by manifold learning to highlight the inherent non-linear structure of faulty impacts. The TSM preserves the non-stationary information and reveals the non-linear structure of the fault pattern, with the merits of noise suppression and resolution improvement. The TSM ridge is further extracted by seeking the ridge of energy concentration lying on the TSM signature. It inherits the advantages of both the TSM and ridge analysis, and hence is beneficial for demodulation of the fault information. Through analyzing the instantaneous amplitude (IA) of the TSM ridge, which contains almost no noise, the fault characteristic frequency can be exactly identified. The whole process of the proposed fault diagnosis scheme is automatic, and its effectiveness has been verified by means of typical faulty vibration/acoustic signals from a gearbox and bearings. A reliable performance of the new method is validated in comparison with traditional enveloping methods for rotating machine fault diagnosis.
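    The underlying time-scale representation can be illustrated with a direct-convolution Morlet CWT in plain NumPy. This is a sketch only: the center frequency w0, the scale grid, and the normalization are illustrative choices, and the paper's PSR and manifold-learning steps are omitted:

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """Continuous wavelet transform magnitude via direct convolution
    with Morlet wavelets at the given scales."""
    out = []
    for s in scales:
        t = np.arange(-4 * s, 4 * s + 1)
        # Complex Morlet wavelet: modulated Gaussian, 1/sqrt(s) normalization
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        out.append(np.convolve(x, np.conj(psi)[::-1], mode="same"))
    return np.abs(np.array(out))
```

    A scale s responds most strongly to oscillations near frequency w0/(2*pi*s) in cycles per sample, which is how the "optimal scale band" for a fault frequency can be located.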

  1. Analysis of automatic repeat request methods for deep-space downlinks

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Ekroot, L.

    1995-01-01

    Automatic repeat request (ARQ) methods cannot increase the capacity of a memoryless channel. However, they can be used to decrease the complexity of the channel-coding system to achieve essentially error-free transmission and to reduce link margins when the channel characteristics are poorly predictable. This article considers ARQ methods on a power-limited channel (e.g., the deep-space channel), where it is important to minimize the total power needed to transmit the data, as opposed to a bandwidth-limited channel (e.g., terrestrial data links), where the spectral efficiency or the total required transmission time is the most relevant performance measure. In the analysis, we compare the performance of three reference concatenated coded systems used in actual deep-space missions to that obtainable by ARQ methods using the same codes, in terms of required power, time to transmit with a given number of retransmissions, and achievable probability of word error. The ultimate limits of ARQ with an arbitrary number of retransmissions are also derived.
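    For intuition, the textbook stop-and-wait ARQ quantities behave as sketched below, assuming i.i.d. word errors with probability p per attempt (a simplification of the article's concatenated-code setting):

```python
def expected_transmissions(p_err, max_tx=None):
    """Mean number of transmissions per word under stop-and-wait ARQ
    with i.i.d. word-error probability p_err per attempt."""
    if max_tx is None:
        # Unlimited retransmissions: E[T] = sum_k k*(1-p)*p^(k-1) = 1/(1-p)
        return 1.0 / (1.0 - p_err)
    # Truncated ARQ: give up after max_tx attempts
    succ = sum(k * (1.0 - p_err) * p_err ** (k - 1) for k in range(1, max_tx + 1))
    return succ + max_tx * p_err ** max_tx

def residual_word_error(p_err, max_tx):
    """Probability a word is still in error after max_tx attempts."""
    return p_err ** max_tx
```

    The geometric decay of the residual error is why a small number of retransmissions can substitute for link margin when channel conditions are poorly predictable.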

  2. A marked point process of rectangles and segments for automatic analysis of digital elevation models.

    PubMed

    Ortner, Mathias; Descombe, Xavier; Zerubia, Josiane

    2008-01-01

    This work presents a framework for automatic feature extraction from images using stochastic geometry. Features in images are modeled as realizations of a spatial point process of geometrical shapes. This framework allows the incorporation of a priori knowledge on the spatial repartition of features. More specifically, we present a model based on the superposition of a process of segments and a process of rectangles. The former is dedicated to the detection of linear networks of discontinuities, while the latter aims at segmenting homogeneous areas. An energy is defined, favoring connections of segments, alignments of rectangles, as well as a relevant interaction between both types of objects. The estimation is performed by minimizing the energy using a simulated annealing algorithm. The proposed model is applied to the analysis of Digital Elevation Models (DEMs). These images are raster data representing the altimetry of a dense urban area. We present results on real data provided by the IGN (French National Geographic Institute) consisting of low-quality DEMs of various types.
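    Energy minimization by simulated annealing, the estimation procedure named above, follows a generic accept/reject loop. The toy one-dimensional sketch below shows only that loop (the paper's energy is defined over configurations of rectangles and segments, not scalars; all names are illustrative):

```python
import math
import random

def simulated_annealing(energy, propose, state, t0=1.0, cooling=0.995,
                        steps=2000, seed=0):
    """Generic simulated annealing: always accept downhill moves and
    accept uphill moves with probability exp(-dE/T) as T decays."""
    rng = random.Random(seed)
    e = energy(state)
    t = t0
    for _ in range(steps):
        cand = propose(state, rng)
        e_cand = energy(cand)
        if e_cand < e or rng.random() < math.exp((e - e_cand) / t):
            state, e = cand, e_cand
        t *= cooling          # geometric cooling schedule
    return state, e
```

    The slowly decaying temperature lets the sampler escape local minima early on while converging to a near-minimal configuration at the end.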

  3. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    PubMed Central

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles often depart from normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for poorly textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performance of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large-scale aerial images acquired using mini-UAV systems. PMID:22412336

  4. Automatic detection of cortical and PSC cataracts using texture and intensity analysis on retro-illumination lens images.

    PubMed

    Chow, Yew Chung; Gao, Xinting; Li, Huiqi; Lim, Joo Hwee; Sun, Ying; Wong, Tien Yin

    2011-01-01

    Cataract remains a leading cause of blindness worldwide. Cataract diagnosis via human grading is subjective and time-consuming. Several methods of automatic grading are currently available, but each of them suffers from some drawbacks. In this paper, a new approach for automatic detection based on texture and intensity analysis is proposed to address the problems of existing methods and improve performance in three aspects, namely ROI detection, lens mask generation and opacity detection. In the detection method, image clipping and texture analysis are applied to overcome the over-detection problem for clear lens images, and global thresholding is exploited to solve the under-detection problem for severe cataract images. The proposed method is tested on 725 retro-illumination lens images randomly selected from a database of a community study. Experiments show improved performance compared with the state-of-the-art method.

  5. Implementation of terbium-sensitized luminescence in sequential-injection analysis for automatic analysis of orbifloxacin.

    PubMed

    Llorent-Martínez, E J; Ortega-Barrales, P; Molina-Díaz, A; Ruiz-Medina, A

    2008-12-01

    Orbifloxacin (ORBI) is a third-generation fluoroquinolone developed exclusively for use in veterinary medicine, mainly in companion animals. This antimicrobial agent has bactericidal activity against numerous gram-negative and gram-positive bacteria. A few chromatographic methods for its analysis have been described in the scientific literature. Here, coupling of sequential-injection analysis and solid-phase spectroscopy is described in order to develop, for the first time, a terbium-sensitized luminescent optosensor for analysis of ORBI. The cationic resin Sephadex-CM C-25 was used as solid support and measurements were made at 275/545 nm. The system had a linear dynamic range of 10-150 ng mL(-1), with a detection limit of 3.3 ng mL(-1) and an R.S.D. below 3% (n = 10). The analyte was satisfactorily determined in veterinary drugs and dog and horse urine.

  6. Ambulatory 24-h oesophageal impedance-pH recordings: reliability of automatic analysis for gastro-oesophageal reflux assessment.

    PubMed

    Roman, S; Bruley des Varannes, S; Pouderoux, P; Chaput, U; Mion, F; Galmiche, J-P; Zerbib, F

    2006-11-01

    Oesophageal pH-impedance monitoring allows detection of acid and non-acid gastro-oesophageal reflux (GOR) events. Visual analysis of impedance recordings requires expertise. Our aim was to evaluate the efficacy of an automated analysis for GOR assessment. Seventy-three patients with suspected GORD underwent 24-h oesophageal pH-impedance monitoring. Analysis of the recordings was performed visually (V) and automatically using the Autoscan function (AS) of Bioview software. A symptom index (SI) ≥ 50% was considered to indicate a significant association between symptoms and reflux events. AS analysis detected more reflux events, especially non-acid, liquid, pure gas and proximal events. Detection of oesophageal acid exposure and acid reflux events was similar with both analyses. Agreement between V and AS analysis was good (Kendall's coefficient W > 0.750, P < 0.01) for all parameters. During pH-impedance studies, 65 patients reported symptoms. As compared to visual analysis, the sensitivity and specificity of a positive SI determined by AS were respectively 85.7% and 80% for all reflux events, 100% and 98% for acid reflux, and 33% and 87.5% for non-acid reflux. Despite good agreement with visual analysis, automatic analysis overestimates the number of non-acid reflux events. Visual analysis remains the gold standard to detect an association between symptoms and non-acid reflux events.

  7. Image structural analysis in the tasks of automatic navigation of unmanned vehicles and inspection of Earth surface

    NASA Astrophysics Data System (ADS)

    Lutsiv, Vadim; Malyshev, Igor

    2013-10-01

    The automatic analysis of terrain images has remained a pressing task for several decades. On the one hand, such analysis is a basis for the automatic navigation of unmanned vehicles. On the other hand, the amount of information transferred to Earth by modern video sensors keeps increasing, so preliminary classification of such data by an onboard computer becomes urgent. We developed an object-independent approach to the structural analysis of images. While creating the methods of image structural description, we abstracted as far as possible from the particular peculiarities of scenes. Only the most general limitations were taken into account, derived from the laws of organization of the observable environment and from the properties of image formation systems. The practical application of this theoretical approach enables reliable matching of aerospace photographs acquired from differing aspect angles, at different times of day and in different seasons, and by sensors of differing types. The aerospace photographs can even be matched with geographic maps. The developed approach enabled solving the tasks of automatic navigation of unmanned vehicles. The signs of changes and catastrophes can be detected by matching and comparing aerospace photographs acquired at different times. We present the theoretical proofs of the chosen strategy of structural description and matching of images. Several examples of matching acquired images with template pictures and terrain maps are shown within the framework of navigation of unmanned vehicles and detection of signs of disasters.

  8. Stable hydrogen isotopic analysis of nanomolar molecular hydrogen by automatic multi-step gas chromatographic separation.

    PubMed

    Komatsu, Daisuke D; Tsunogai, Urumu; Kamimura, Kanae; Konno, Uta; Ishimura, Toyoho; Nakagawa, Fumiko

    2011-11-15

    We have developed a new automated analytical system that employs a continuous flow isotope ratio mass spectrometer to determine the stable hydrogen isotopic composition (δD) of nanomolar quantities of molecular hydrogen (H(2)) in an air sample. This method improves on previous methods, attaining simpler and lower-cost analyses, especially by avoiding the use of expensive or special devices, such as a Toepler pump, a cryogenic refrigerator, and a special evacuation system to keep the temperature of a coolant under reduced pressure. Instead, the system allows H(2) purification from the air matrix via automatic multi-step gas chromatographic separation using the coolants of both liquid nitrogen (77 K) and liquid nitrogen + ethanol (158 K) under 1 atm pressure. The analytical precision of the δD determination using the developed method was better than 4‰ for >5 nmol injections (250 mL STP for 500 ppbv air sample) and better than 15‰ for 1 nmol injections, regardless of the δD value, within 1 h for one sample analysis. Using the developed system, the δD values of H(2) can be quantified for atmospheric samples as well as samples of representative sources and sinks, including those containing small quantities of H(2), such as H(2) in soil pores or aqueous environments, for which there is currently little δD data available. As an example of such trace H(2) analyses, we report here the isotope fractionations during H(2) uptake by soils in a static chamber. The δD values of H(2) in these H(2)-depleted environments can be useful in constraining the budgets of atmospheric H(2) by applying an isotope mass balance model.

  9. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using a Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is automatically and objectively made by spectral matching comparison of the MCR decomposed Raman spectra and the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, and 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin.

  10. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    ERIC Educational Resources Information Center

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…

  11. Ability and efficiency of an automatic analysis software to measure microvascular parameters.

    PubMed

    Carsetti, Andrea; Aya, Hollmann D; Pierantozzi, Silvia; Bazurro, Simone; Donati, Abele; Rhodes, Andrew; Cecconi, Maurizio

    2016-09-01

    Analysis of the microcirculation is currently performed offline and is time-consuming and operator-dependent. The aim of this study was to assess the ability and efficiency of the automatic analysis software CytoCamTools 1.7.12 (CC) to measure microvascular parameters in comparison with Automated Vascular Analysis (AVA) software 3.2. 22 patients admitted to the cardiothoracic intensive care unit following cardiac surgery were prospectively enrolled. Sublingual microcirculatory videos were analysed using AVA and CC software. The total vessel density (TVD) for small vessels, perfused vessel density (PVD) and proportion of perfused vessels (PPV) were calculated. Blood flow was assessed using the microvascular flow index (MFI) for AVA software and the averaged perfused speed indicator (APSI) for the CC software. The duration of the analysis was also recorded. Eighty-four videos from 22 patients were analysed. The bias between TVD-CC and TVD-AVA was 2.20 mm/mm(2) (95 % CI 1.37-3.03) with limits of agreement (LOA) of -4.39 (95 % CI -5.66 to -3.16) and 8.79 (95 % CI 7.50-10.01) mm/mm(2). The percentage error (PE) for TVD was ±32.2 %. TVD was positively correlated between CC and AVA (r = 0.74, p < 0.001). The bias between PVD-CC and PVD-AVA was 6.54 mm/mm(2) (95 % CI 5.60-7.48) with LOA of -4.25 (95 % CI -8.48 to -0.02) and 17.34 (95 % CI 13.11-21.57) mm/mm(2). The PE for PVD was ±61.2 %. PVD was positively correlated between CC and AVA (r = 0.66, p < 0.001). The median PPV-AVA was significantly higher than the median PPV-CC [97.39 % (95.25, 100 %) vs. 81.65 % (61.97, 88.99), p < 0.0001]. MFI categories cannot estimate or predict APSI values (p = 0.45). The time required for the analysis was shorter with CC than with the AVA system [2'42″ (2'12″, 3'31″) vs. 16'12″ (13'38″, 17'57″), p < 0.001]. TVD is comparable between the two software packages, and the analysis is faster with the CC software. The values for PVD and PPV are not interchangeable given the
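    The bias, limits of agreement, and percentage error quoted above are standard Bland-Altman statistics; for reference, they can be computed as in this sketch (the function name is invented here):

```python
import numpy as np

def bland_altman(a, b):
    """Bias, 95% limits of agreement, and percentage error between
    paired measurements from two methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)                      # sample SD of the differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    pe = 100.0 * 1.96 * sd / ((a.mean() + b.mean()) / 2.0)
    return bias, loa, pe
```

    A percentage error above roughly 30% is conventionally taken to mean the two methods are not interchangeable, which matches the study's conclusion for PVD.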

  12. Isothermal reduction kinetics of Panzhihua ilmenite concentrate under 30vol% CO-70vol% N2 atmosphere

    NASA Astrophysics Data System (ADS)

    Zhang, Ying-yi; Lü, Wei; Lü, Xue-wei; Li, Sheng-ping; Bai, Chen-guang; Song, Bing; Han, Ke-xi

    2017-03-01

    The reduction of ilmenite concentrate in 30vol% CO-70vol% N2 atmosphere was characterized by thermogravimetric and differential thermogravimetric (TG-DTG) analysis methods at temperatures from 1073 to 1223 K. The isothermal reduction results show that the reduction process comprised two stages; the corresponding apparent activation energy was obtained by the iso-conversional and model-fitting methods. For the first stage, the effect of temperature on the conversion degree was not obvious, and the phase boundary chemical reaction was the controlling step, with an apparent activation energy of 15.55-40.71 kJ·mol-1. For the second stage, when the temperature was greater than 1123 K, the reaction rate and the conversion degree increased sharply with increasing temperature, and random nucleation and subsequent growth were the controlling steps, with an apparent activation energy ranging from 182.33 to 195.95 kJ·mol-1. For the whole reduction process, the average activation energy and pre-exponential factor were 98.94-118.33 kJ·mol-1 and 1.820-1.816 min-1, respectively.
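    Apparent activation energies such as those above are conventionally recovered from an Arrhenius plot, fitting ln k against 1/T. A minimal sketch with synthetic rate constants (illustrative, not the paper's data):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_fit(T, k):
    """Recover activation energy Ea [J/mol] and pre-exponential factor A
    from rate constants via ln k = ln A - Ea/(R*T)."""
    slope, intercept = np.polyfit(1.0 / np.asarray(T, float), np.log(k), 1)
    return -slope * R, np.exp(intercept)
```

    The slope of ln k versus 1/T is -Ea/R, so a steeper line (as in the second, nucleation-controlled stage) corresponds to a larger apparent activation energy.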

  13. The use of normalized cross-correlation analysis for automatic tendon excursion measurement in dynamic ultrasound imaging.

    PubMed

    Pearson, Stephen J; Ritchings, Tim; Mohamed, Ahmad S A

    2013-04-01

    The work describes an automated method of tracking dynamic ultrasound images using a normalized cross-correlation algorithm, applied to the patellar and gastrocnemius tendons. Displacement was examined during active and passive tendon excursions using B-mode ultrasonography. In the passive test where two regions of interest (2-ROI) were tracked, the automated tracking algorithm showed insignificant deviations from relative zero displacement for the knee (0.01 ± 0.04 mm) and ankle (-0.02 ± 0.04 mm) (P > .05). Similarly, when tracking 1-ROI the passive tests showed no significant differences (P > .05) between automatic and manual methods, 7.50 ± 0.60 vs 7.66 ± 0.63 mm for the patellar and 11.28 ± 1.36 vs 11.17 ± 1.35 mm for the gastrocnemius tests. The active tests gave no significant differences (P > .05) between automatic and manual methods, with differences of 0.29 ± 0.04 mm for the patellar and 0.26 ± 0.01 mm for the gastrocnemius. This study showed that automatic tracking of in vivo displacement of tendon during dynamic excursion under load is possible and valid when compared with the standardized method. This approach will save time during analysis and enable discrete areas of the tendon to be examined.
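    Normalized cross-correlation tracking of a region of interest can be sketched in a few lines; the exhaustive search below is illustrative (practical trackers restrict the search to a small window around the previous position):

```python
import numpy as np

def ncc(template, patch):
    """Normalized cross-correlation between a template and an equal-size patch."""
    t = template - template.mean()
    p = patch - patch.mean()
    return (t * p).sum() / np.sqrt((t ** 2).sum() * (p ** 2).sum())

def track(template, frame):
    """Slide the template over the frame; return the best-matching offset."""
    th, tw = template.shape
    fh, fw = frame.shape
    best, best_pos = -2.0, (0, 0)
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            score = ncc(template, frame[y:y + th, x:x + tw])
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```

    Mean subtraction and normalization make the score invariant to the overall brightness and gain changes typical of ultrasound sequences.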

  14. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost effective computer aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.

  15. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.
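    GMM-based detection reduces, in its simplest form, to comparing class-conditional likelihoods. The sketch below uses a single diagonal Gaussian per class rather than a full mixture, and all names and data are invented for illustration:

```python
import numpy as np

def fit_gauss(X):
    """Diagonal-Gaussian maximum-likelihood fit: per-dimension mean, variance."""
    return X.mean(axis=0), X.var(axis=0) + 1e-9

def log_lik(X, mu, var):
    """Per-sample log-likelihood under a diagonal Gaussian."""
    return -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(axis=1)

def classify(x, model_apnoea, model_control):
    """Assign the class whose model gives the higher likelihood."""
    lp = log_lik(x[None, :], *model_apnoea)[0]
    ln = log_lik(x[None, :], *model_control)[0]
    return "apnoea" if lp > ln else "control"
```

    A real GMM replaces each single Gaussian with a weighted mixture fit by EM, but the decision rule, comparing likelihoods of spectral features under the two class models, is the same.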

  16. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    PubMed

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals.
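    A sympathovagal balance index is commonly derived from the LF/HF power ratio of the (evenly resampled) RR-interval series. The sketch below uses the conventional 0.04-0.15 Hz and 0.15-0.40 Hz bands; the paper's exact SVI definition may differ:

```python
import numpy as np

def lf_hf_ratio(rr_ms, fs=4.0):
    """Sympathovagal balance proxy: LF/HF power ratio from an evenly
    resampled RR-interval series sampled at fs Hz."""
    x = np.asarray(rr_ms, float)
    x = x - x.mean()                         # remove DC before the FFT
    psd = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = psd[(f >= 0.04) & (f < 0.15)].sum()  # low-frequency band
    hf = psd[(f >= 0.15) & (f < 0.40)].sum()  # high-frequency band
    return lf / hf
```

    A rising LF/HF ratio is the usual marker of a shift toward sympathetic dominance, i.e., higher stress.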

  17. Studies on quantitative analysis and automatic recognition of cell types of lung cancer.

    PubMed

    Chen, Yi-Chen; Hu, Kuang-Hu; Li, Fang-Zhen; Li, Shu-Yu; Su, Wan-Fang; Huang, Zhi-Ying; Hu, Ying-Xiong

    2006-01-01

    Recognition of lung cancer cells is very important to the clinical diagnosis of lung cancer. In this paper we present a novel method to extract the structural characteristics of lung cancer cells and automatically recognize their types. First, soft mathematical morphology methods are used to enhance the grayscale image, improve the definition of the images, and eliminate most of the disturbance, noise and irrelevant image information, so that the contour of the target lung cancer cell and its biological shape characteristic parameters can be extracted accurately. Then the minimum-distance classifier is introduced to realize the automatic recognition of different types of lung cancer cells. A software system named "CANCER.LUNG" is established to demonstrate the efficiency of this method. The clinical experiments show that this method can accurately and objectively recognize the type of lung cancer cells, which can significantly improve pathology research on lung cancer and assist clinical diagnosis.
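    A minimum-distance classifier of the kind mentioned above assigns a sample to the class with the nearest feature centroid; a minimal sketch (names and data invented):

```python
import numpy as np

def nearest_centroid(train_X, train_y, x):
    """Minimum-distance classifier: assign x to the class whose
    feature centroid is closest in Euclidean distance."""
    labels = sorted(set(train_y))
    y = np.array(train_y)
    centroids = {c: train_X[y == c].mean(axis=0) for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))
```

    With well-separated shape features (area, perimeter, eccentricity, and the like), this simple rule already yields usable cell-type recognition.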

  18. Analysis of Automatic Automotive Gear Boxes by Means of Versatile Graph-Based Methods

    NASA Astrophysics Data System (ADS)

    Drewniak, J.; Kopeć, J.; Zawiślak, S.

    Automotive gear boxes are special mechanisms built upon planetary gears and additionally equipped with control systems. The control system allows for the activation of particular drives. In the present paper, some graph-based models of these boxes are considered, i.e., contour, bond and mixed graphs. An exemplary automatic gear box is considered. Based upon the introduced models, ratios for some drives have been calculated. Advantages of the proposed method of modeling are its algorithmic approach and simplicity.

  19. [Automated analysis of bacterial preparations manufactured on automatic heat fixation and staining equipment].

    PubMed

    2012-01-01

    Heat fixation of preparations was performed in a fixation bath designed by EMKO (Russia). The programmable "Emkosteiner" (EMKO, Russia) was used for trial staining. The Micko-GRAM-NITsF reagent set was applied for Gram staining. It was demonstrated that automatic smear fixation equipment and programmable staining ensure high-quality imaging (1% chromaticity variation), good enough for standardization of Gram staining of microbial preparations.

  20. An automatic variational level set segmentation framework for computer aided dental X-rays analysis in clinical environments.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2006-03-01

    An automatic variational level set segmentation framework for Computer Aided Dental X-rays Analysis (CADXA) in clinical environments is proposed. Designed for clinical environments, the segmentation contains two stages: a training stage and a segmentation stage. During the training stage, first, manually chosen representative images are segmented using hierarchical level set region detection. Then the window based feature extraction followed by principal component analysis (PCA) is applied and results are used to train a support vector machine (SVM) classifier. During the segmentation stage, dental X-rays are classified first by the trained SVM. The classifier provides initial contours which are close to correct boundaries for three coupled level sets driven by a proposed pathologically variational modeling which greatly accelerates the level set segmentation. Based on the segmentation results and uncertainty maps that are built based on a proposed uncertainty measurement, a computer aided analysis scheme is applied. The experimental results show that the proposed method is able to provide an automatic pathological segmentation which naturally segments those problem areas. Based on the segmentation results, the analysis scheme is able to provide indications of possible problem areas of bone loss and decay to the dentists. As well, the experimental results show that the proposed segmentation framework is able to speed up the level set segmentation in clinical environments.

  1. An approach to automatic blood vessel image registration of microcirculation for blood flow analysis on nude mice.

    PubMed

    Lin, Wen-Chen; Wu, Chih-Chieh; Zhang, Geoffrey; Wu, Tung-Hsin; Lin, Yang-Hsien; Huang, Tzung-Chi; Liu, Ren-Shyan; Lin, Kang-Ping

    2011-04-01

    Image registration is often a required and time-consuming step in blood flow analysis of large microscopic video sequences in vivo. In order to obtain stable images for blood flow analysis, frame-to-frame image matching as a preprocessing step is a solution to the problem of movement during image acquisition. In this paper, microscopic system analysis without fluorescent labelling is performed to provide precise and continuous quantitative data on blood flow rate in individual microvessels of nude mice. The performance properties of several matching metrics are evaluated through simulated image registrations. An automatic image registration programme based on Powell's optimisation search method with low calculation redundancy was implemented. The variance-of-ratio matching metric is computationally efficient and improves registration robustness and accuracy in the practical application of microcirculation registration. The presented registration method shows acceptable results for the requirements of analysing red blood cell velocities, confirming the scientific potential of the system in blood flow analysis.
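    The variance-of-ratio metric can be illustrated with a small sketch. For brevity, an exhaustive integer-shift search replaces Powell's direction-set optimiser, and the "frames" are synthetic; the metric itself is the one named in the abstract: the intensity ratio of two aligned frames should be near-constant, so its variance is minimal at the correct alignment.

```python
import numpy as np

def variance_of_ratio(a, b, eps=1e-6):
    """Similarity metric: variance of the pixelwise intensity ratio."""
    r = a / (b + eps)
    return r.var()

def register_shift(fixed, moving, max_shift=3):
    """Exhaustive integer-shift search minimising variance of ratio
    (a stand-in here for Powell's optimisation search method)."""
    best, best_cost = (0, 0), np.inf
    h, w = fixed.shape
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            m = max_shift  # crop borders affected by wrap-around
            cost = variance_of_ratio(fixed[m:h-m, m:w-m], shifted[m:h-m, m:w-m])
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best

rng = np.random.default_rng(1)
frame = rng.random((32, 32)) + 1.0                      # avoid near-zero pixels
moved = np.roll(np.roll(frame, 2, axis=0), -1, axis=1)  # simulated motion
print(register_shift(frame, moved))  # → (-2, 1), undoing the motion
```

    A real implementation would optimise sub-pixel translations (and possibly rotation) with Powell's method instead of the brute-force grid.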

  2. How automatic is the musical Stroop effect? Commentary on “The musical Stroop effect: Opening a new avenue to research on automatisms” by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, Vol. 60, pp. 269–278).

    PubMed

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one major advantage of this variant is that the automaticity of note naming can be controlled better than in other Stroop variants, as musicians are highly practiced in note reading whereas non-musicians are not. In this comment we argue that, at present, the exact impact of automaticity in this Stroop variant remains unclear for at least three reasons: the type of information that is automatically retrieved when notes are encountered, the possible influence of object-based attention, and the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme-group design.

  3. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    NASA Astrophysics Data System (ADS)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, compare these evolutionary patterns with each other and with surface reflectance evolutions from both HiRISE and CRISM for the same locations. References: Aye, K.-M. et al. (2010), LPSC 2010, 2707; Hansen, C. et al. (2010), Icarus, 205, Issue 1, p. 283-295; Kieffer, H.H. (2007), JGR 112; Portyankina, G. et al. (2010), Icarus, 205, Issue 1, p. 311-320; Thomas, N. et al. (2009), Vol. 4, EPSC2009-478

  4. Automatic transmission

    SciTech Connect

    Miura, M.; Aoki, H.

    1988-02-02

    An automatic transmission is described comprising: an automatic transmission mechanism portion comprising a single planetary gear unit and a dual planetary gear unit; carriers of both planetary gear units that are integral with one another; an input means for inputting torque to the automatic transmission mechanism; clutches for operatively connecting predetermined planetary gear elements of both planetary gear units to the input means; and braking means for restricting the rotation of predetermined planetary gear elements of both planetary gear units. The clutches are disposed adjacent one another at an end portion of the transmission, defining a clutch portion of the transmission; a first clutch portion is attachable to the automatic transmission mechanism portion to form the clutch portion when attached thereto; a second clutch portion is attachable to the automatic transmission mechanism portion in place of the first clutch portion to form the clutch portion when so attached. The first clutch portion comprises a first clutch for operatively connecting the input means to a ring gear of the single planetary gear unit and a second clutch for operatively connecting the input means to a sun gear of the automatic transmission mechanism portion. The second clutch portion comprises the first clutch, the second clutch, and a third clutch for operatively connecting the input means to a ring gear of the dual planetary gear unit.

  5. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    PubMed

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery solid modelling of screws and drill holes limits its use for individual cases and increases computational costs. To cope with these requirements, different methods for numerical screw modelling were investigated to broaden its range of application. As an example, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three numerical modelling techniques for implant fixation were used in this study: without screw modelling, with screws as solid elements, and with screws as structural elements. The latter offers the possibility of implementing automatically generated screws with variable geometry on arbitrary FE models. Structural screws were generated parametrically by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedrally and a hexahedrally meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques, a correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent
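    The idea of parametric screw generation as structural (beam-type) elements can be sketched generically. This is not the authors' Abaqus/CAE script; it is a hypothetical generator that emits nodes along a screw axis and 2-node beam connectivity of the kind a script would write into an FE input deck.

```python
import numpy as np

def generate_screw_beam(head, tip, n_elems, diameter, start_node=1):
    """Parametrically generate a screw as structural beam elements:
    nodes spaced along the screw axis plus 2-node element connectivity,
    returned as (nodes, elements) for writing to an FE input deck."""
    head, tip = np.asarray(head, float), np.asarray(tip, float)
    pts = [head + (tip - head) * t for t in np.linspace(0, 1, n_elems + 1)]
    nodes = [(start_node + i, *p) for i, p in enumerate(pts)]
    elems = [(i + 1, start_node + i, start_node + i + 1, diameter)
             for i in range(n_elems)]
    return nodes, elems

# A 30 mm screw with 3.5 mm diameter, discretised into 5 beam elements
nodes, elems = generate_screw_beam(head=(0, 0, 0), tip=(0, 0, 30.0),
                                   n_elems=5, diameter=3.5)
print(len(nodes), len(elems))  # 6 nodes, 5 beam elements
```

    In practice the node at the screw head would be coupled to the plate mesh and the shaft nodes tied to the surrounding bone mesh, which is what makes this approach independent of the femur meshing.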

  6. Texture analysis of automatic graph cuts segmentations for detection of lung cancer recurrence after stereotactic radiotherapy

    NASA Astrophysics Data System (ADS)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2015-03-01

    Stereotactic ablative radiotherapy (SABR) is a treatment for early-stage lung cancer with local control rates comparable to surgery. After SABR, benign radiation induced lung injury (RILI) results in tumour-mimicking changes on computed tomography (CT) imaging. Distinguishing recurrence from RILI is a critical clinical decision determining the need for potentially life-saving salvage therapies whose high risks in this population dictate their use only for true recurrences. Current approaches do not reliably detect recurrence within a year post-SABR. We measured the detection accuracy of texture features within automatically determined regions of interest, with the only operator input being the single line segment measuring tumour diameter, normally taken during the clinical workflow. Our leave-one-out cross validation on images taken 2-5 months post-SABR showed robustness of the entropy measure, with classification error of 26% and area under the receiver operating characteristic curve (AUC) of 0.77 using automatic segmentation; the results using manual segmentation were 24% and 0.75, respectively. AUCs for this feature increased to 0.82 and 0.93 at 8-14 months and 14-20 months post SABR, respectively, suggesting even better performance nearer to the date of clinical diagnosis of recurrence; thus this system could also be used to support and reinforce the physician's decision at that time. Based on our ongoing validation of this automatic approach on a larger sample, we aim to develop a computer-aided diagnosis system which will support the physician's decision to apply timely salvage therapies and prevent patients with RILI from undergoing invasive and risky procedures.
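    The entropy feature highlighted above is a first-order texture statistic and is straightforward to compute. A minimal numpy-only sketch on synthetic ROIs (not the study's pipeline, which adds automatic segmentation and cross-validated classification):

```python
import numpy as np

def roi_entropy(image, bins=32):
    """First-order texture entropy of an ROI: Shannon entropy of the
    grey-level histogram (the feature found most robust in the study)."""
    hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
uniform_roi = rng.random((50, 50))   # heterogeneous texture: high entropy
flat_roi = np.full((50, 50), 0.5)    # homogeneous region: zero entropy
print(roi_entropy(flat_roi), roi_entropy(uniform_roi) > 4.0)
```

    Recurrent tumour and radiation-induced injury differ in such histogram statistics, which is what lets a classifier separate them within the automatically segmented region.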

  7. [Reliability of % vol. declarations on labels of wine bottles].

    PubMed

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

    Council Regulation (EC) No. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (Abl. L 179 dated 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles must indicate, among other things, the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength as established by analysis. Only when quality wines are stored in bottles for more than three years is the accepted tolerance limit ±0.8% vol. The presented investigation results show that deviations must be taken into account which may be highly relevant for forensic practice.
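    The tolerance rule above reduces to a simple check; a small sketch of the stated limits (function name and interface are illustrative):

```python
def label_within_tolerance(declared_vol, measured_vol, bottle_aged_3yr=False):
    """Regulation check: the declared alcoholic strength may deviate from
    the analysed value by at most 0.5 % vol., or 0.8 % vol. for quality
    wines stored in bottles for more than three years."""
    tol = 0.8 if bottle_aged_3yr else 0.5
    return abs(declared_vol - measured_vol) <= tol

print(label_within_tolerance(12.5, 13.1))        # deviation 0.6 > 0.5
print(label_within_tolerance(12.5, 13.1, True))  # 0.6 <= 0.8
```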

  8. Fully Automatic Cross-Associations

    DTIC Science & Technology

    2004-08-01

    in Proc. 8th KDD, 2002. [16] S. Deerwester, S. T. Dumais, G. W. Furnas, T. K. Landauer, and R. Harshman, “Indexing by latent semantic analysis,” JASIS, vol. 41, pp. 391–407, 1990. [17] T. G. Kolda and D. P. O'Leary, “A semidiscrete matrix decomposition for latent semantic indexing in information retrieval,” ACM Transactions on Information Systems, vol. 16, no. 4, pp. 322–346, 1998. [18] T. Hofmann, “Probabilistic latent semantic indexing,” in Proc

  9. Automatic sampling and analysis of organics and biomolecules by capillary action-supported contactless atmospheric pressure ionization mass spectrometry.

    PubMed

    Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie

    2013-01-01

    The contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is carried on the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, while the sample well on the holder is moved underneath the capillary inlet. The sample droplet in the well is readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, a spray readily forms in proximity to the mass spectrometer inlet, where a high electric field is applied. The gas-phase ions generated from the spray can be monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, wells containing rinsing solvent are arranged alternately between the sample wells, so the C-API capillary is flushed between runs. No carryover problems were observed during the analyses. The sample volume required for C-API MS analysis is minimal: less than 1 nL of sample solution is sufficient. The feasibility of using this setup for quantitative analysis is also demonstrated.
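    The alternating sample/rinse sequencing can be sketched in a few lines. The MovableStage class below is purely illustrative, not a real instrument driver; it only records the order of well visits that the described setup would follow.

```python
# Hypothetical sketch of the automated sampling sequence: each sample
# well is followed by a rinse-solvent well so the capillary is flushed
# between runs, avoiding carryover.
class MovableStage:
    def __init__(self):
        self.log = []

    def move_to(self, well):
        # A real driver would command the XY stage here.
        self.log.append(well)

def run_sequence(stage, n_samples):
    for i in range(1, n_samples + 1):
        stage.move_to(f"sample_{i}")  # droplet drawn up by capillary action
        stage.move_to(f"rinse_{i}")   # flush capillary before the next run

stage = MovableStage()
run_sequence(stage, 3)
print(stage.log)
```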

  10. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    PubMed Central

    Bayır, Şafak

    2016-01-01

    With the advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to automatically detect change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region may be difficult in retinal images affected by diabetic retinopathy. In terms of machine learning, information related to the optic disc and to hard exudates may sometimes be indistinguishable. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC. PMID:27110272

  11. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques.

    PubMed

    Akyol, Kemal; Şen, Baha; Bayır, Şafak

    2016-01-01

    With the advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to automatically detect change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region may be difficult in retinal images affected by diabetic retinopathy. In terms of machine learning, information related to the optic disc and to hard exudates may sometimes be indistinguishable. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC.
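    The visual-dictionary step can be sketched as a bag-of-visual-words encoding: keypoint descriptors are clustered into a dictionary, and each image is represented by a histogram of its nearest visual words. The sketch below is numpy-only with synthetic "descriptors"; it is illustrative, not the paper's pipeline.

```python
import numpy as np

def kmeans(X, init_centers, iters=20):
    """Tiny k-means: cluster centres of keypoint descriptors become
    the visual words of the dictionary."""
    centers = np.array(init_centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for c in range(len(centers)):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return centers

def bovw_histogram(descriptors, dictionary):
    """Encode an image as a normalised histogram of visual-word counts."""
    words = np.argmin(((descriptors[:, None] - dictionary) ** 2).sum(-1), axis=1)
    hist = np.bincount(words, minlength=len(dictionary)).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(3)
# Toy "keypoint descriptors" drawn from two well-separated clusters
descs = np.vstack([rng.normal(0.0, 0.1, (30, 8)),
                   rng.normal(1.0, 0.1, (30, 8))])
dictionary = kmeans(descs, init_centers=descs[[0, -1]])
h = bovw_histogram(descs, dictionary)
print(h)
```

    The histograms would then be fed to the classifier stage to decide whether a candidate region is the optic disc or a confounding lesion.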

  12. A clinically viable capsule endoscopy video analysis platform for automatic bleeding detection

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Xie, Jean; Mui, Peter; Leighton, Jonathan A.; Pasha, Shabana; Rentz, Lauri; Abedi, Mahmood

    2013-02-01

    In this paper, we present a novel and clinically valuable software platform for automatic bleeding detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos of the GI tract run about 8 hours and are manually reviewed by physicians to locate diseases such as bleedings and polyps. As a result, the process is time consuming and prone to missed findings. While researchers have made efforts to automate this process, no clinically acceptable software is available on the marketplace today. Working with our collaborators, we have developed a clinically viable software platform called GISentinel for fully automated GI tract bleeding detection and classification. Major functional modules of the software include: an innovative graph-based NCut segmentation algorithm; a unique feature selection and validation method (e.g. illumination-invariant features, color-independent features, and symmetrical texture features); and a cascade SVM classification for handling various GI tract scenes (e.g. normal tissue, food particles, bubbles, fluid, and specular reflection). Initial evaluation of the software has shown a zero bleeding-instance miss rate and a 4.03% false alarm rate. This work is part of our 2D/3D-based GI tract disease detection software platform. While the overall framework is designed for intelligent detection and classification of major GI tract diseases such as bleeding, ulcer, and polyp from CE videos, this paper focuses on the automatic bleeding detection module.
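    The cascade idea can be sketched in a few lines: each stage rejects one class of confounding scene, and only frames surviving every stage are flagged as bleeding candidates. The thresholds and feature names below are invented for illustration; the actual system uses trained SVM stages.

```python
# Minimal sketch of cascade classification for CE frames. Each stage
# filters out one non-bleeding scene type; only frames that pass all
# stages are flagged as bleeding candidates.
def cascade_classify(frame, stages):
    for reject in stages:
        if reject(frame):
            return "non-bleeding"
    return "bleeding-candidate"

# Hypothetical hand-set rejection rules standing in for trained SVMs
stages = [
    lambda f: f["brightness"] > 0.9,        # specular reflection
    lambda f: f["texture_symmetry"] < 0.2,  # bubbles / food particles
    lambda f: f["redness"] < 0.5,           # not red enough for blood
]
frame = {"brightness": 0.4, "texture_symmetry": 0.7, "redness": 0.8}
print(cascade_classify(frame, stages))
```

    A cascade is attractive here because the cheap early stages discard the vast majority of the roughly 8 hours of frames before the expensive bleeding classifier runs.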

  13. Comparative analysis of image classification methods for automatic diagnosis of ophthalmic images

    NASA Astrophysics Data System (ADS)

    Wang, Liming; Zhang, Kai; Liu, Xiyang; Long, Erping; Jiang, Jiewei; An, Yingying; Zhang, Jia; Liu, Zhenzhen; Lin, Zhuoling; Li, Xiaoyan; Chen, Jingjing; Cao, Qianzhong; Li, Jing; Wu, Xiaohang; Wang, Dongni; Li, Wangting; Lin, Haotian

    2017-01-01

    There are many image classification methods, but it remains unclear which are most helpful for analyzing and intelligently identifying ophthalmic images. We selected representative slit-lamp images, which reflect the complexity of ocular images, as research material to compare image classification algorithms for diagnosing ophthalmic diseases. To facilitate this study, several feature extraction algorithms and classifiers were combined to automatically diagnose pediatric cataract on the same dataset, and their performance was then compared using multiple criteria. This comparative study reveals the general characteristics of the existing methods for automatic identification of ophthalmic images and provides new insights into their strengths and shortcomings. The most relevant methods (local binary patterns + SVMs, wavelet transformation + SVMs) achieve an average accuracy of 87% and can be adopted in specific situations to aid doctors in preliminary disease screening. Furthermore, some methods requiring fewer computational resources and less time could be applied in remote places or on mobile devices to assist individuals in understanding the condition of their body. In addition, this study should help accelerate the development of innovative approaches and the application of these methods to assist doctors in diagnosing ophthalmic disease.
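    The best-performing feature in this comparison, the local binary pattern (LBP), is easy to sketch. Below is a minimal numpy-only implementation of the basic 8-neighbour LBP code and its histogram; a real pipeline would feed such histograms to an SVM, and typically uses the uniform/rotation-invariant LBP variants.

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbour local binary pattern codes (no interpolation):
    each interior pixel gets one bit per neighbour that is >= the centre."""
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def lbp_histogram(img, bins=256):
    """Normalised histogram of LBP codes: the texture feature vector."""
    h = np.bincount(lbp_image(img).ravel(), minlength=bins).astype(float)
    return h / h.sum()

flat = np.full((8, 8), 0.5)
print(lbp_histogram(flat)[255])  # all neighbours >= centre -> code 255
```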

  14. Automatic Screening and Grading of Age-Related Macular Degeneration from Texture Analysis of Fundus Images

    PubMed Central

    Phan, Thanh Vân; Seoud, Lama; Chakor, Hadi; Cheriet, Farida

    2016-01-01

    Age-related macular degeneration (AMD) is a disease which causes visual deficiency and irreversible blindness in the elderly. In this paper, an automatic classification method for AMD is proposed to perform robust and reproducible assessments in a telemedicine context. First, a study was carried out to highlight the most relevant features for AMD characterization based on texture, color, and visual context in fundus images. A support vector machine and a random forest were used to classify images according to the different AMD stages following the AREDS protocol and to evaluate the features' relevance. Experiments were conducted on a database of 279 fundus images coming from a telemedicine platform. The results demonstrate that local binary patterns in multiresolution are the most relevant for AMD classification, regardless of the classifier used. Depending on the classification task, our method achieves promising performance with areas under the ROC curve between 0.739 and 0.874 for screening and between 0.469 and 0.685 for grading. Moreover, the proposed automatic AMD classification system is robust with respect to image quality. PMID:27190636

  15. Comparative analysis of image classification methods for automatic diagnosis of ophthalmic images

    PubMed Central

    Wang, Liming; Zhang, Kai; Liu, Xiyang; Long, Erping; Jiang, Jiewei; An, Yingying; Zhang, Jia; Liu, Zhenzhen; Lin, Zhuoling; Li, Xiaoyan; Chen, Jingjing; Cao, Qianzhong; Li, Jing; Wu, Xiaohang; Wang, Dongni; Li, Wangting; Lin, Haotian

    2017-01-01

    There are many image classification methods, but it remains unclear which are most helpful for analyzing and intelligently identifying ophthalmic images. We selected representative slit-lamp images, which reflect the complexity of ocular images, as research material to compare image classification algorithms for diagnosing ophthalmic diseases. To facilitate this study, several feature extraction algorithms and classifiers were combined to automatically diagnose pediatric cataract on the same dataset, and their performance was then compared using multiple criteria. This comparative study reveals the general characteristics of the existing methods for automatic identification of ophthalmic images and provides new insights into their strengths and shortcomings. The most relevant methods (local binary patterns + SVMs, wavelet transformation + SVMs) achieve an average accuracy of 87% and can be adopted in specific situations to aid doctors in preliminary disease screening. Furthermore, some methods requiring fewer computational resources and less time could be applied in remote places or on mobile devices to assist individuals in understanding the condition of their body. In addition, this study should help accelerate the development of innovative approaches and the application of these methods to assist doctors in diagnosing ophthalmic disease. PMID:28139688

  16. Automatic Tracking and Motility Analysis of Human Sperm in Time-Lapse Images.

    PubMed

    Urbano, Leonardo F; Masson, Puneet; VerMilyea, Matthew; Kam, Moshe

    2017-03-01

    We present a fully automated multi-sperm tracking algorithm. It has the demonstrated capability to detect and track simultaneously hundreds of sperm cells in recorded videos while accurately measuring motility parameters over time and with minimal operator intervention. Algorithms of this kind may help in associating dynamic swimming parameters of human sperm cells with fertility and fertilization rates. Specifically, we offer an image processing method, based on radar tracking algorithms, that automatically detects and tracks the swimming paths of human sperm cells in time-lapse microscopy image sequences of the kind analyzed by fertility clinics. Adapting the well-known joint probabilistic data association filter (JPDAF), we automatically tracked hundreds of human sperm simultaneously and measured their dynamic swimming parameters over time. Unlike existing CASA instruments, our algorithm can track sperm swimming in close proximity to each other and during apparent cell-to-cell collisions. Continuously collecting parameters for each tracked sperm without sample dilution (currently impossible using standard CASA systems) provides an opportunity to compare such data with standard fertility rates. The use of our algorithm thus has the potential to free the clinician from relying on elaborate motility measurements obtained manually by technicians, to speed up semen processing, and to provide medical practitioners and researchers with more useful data than are currently available.
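    The core data-association step of multi-target tracking can be sketched with a deliberately simplified greedy nearest-neighbour rule. The JPDAF used in the paper replaces this with probabilistically weighted assignments, which is what lets it survive close passes and collisions; the sketch below only illustrates the frame-to-frame association problem.

```python
import numpy as np

def associate(prev_pts, new_pts, max_dist=5.0):
    """Greedy nearest-neighbour association of detections between two
    frames: a simplified stand-in for the JPDA filter. Returns pairs
    (index in previous frame, index in new frame)."""
    pairs, used = [], set()
    for i, p in enumerate(prev_pts):
        d = np.linalg.norm(new_pts - p, axis=1)
        for j in np.argsort(d):
            if int(j) not in used and d[j] <= max_dist:
                pairs.append((i, int(j)))
                used.add(int(j))
                break
    return pairs

prev_pts = np.array([[0.0, 0.0], [10.0, 10.0]])
new_pts = np.array([[10.5, 9.5], [1.0, 0.5]])  # both cells moved slightly
print(associate(prev_pts, new_pts))  # → [(0, 1), (1, 0)]
```

    Chaining such associations across frames yields per-cell trajectories, from which velocity and other motility parameters are computed.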

  17. Automatic Identification of Motion Artifacts in EHG Recording for Robust Analysis of Uterine Contractions

    PubMed Central

    Ye-Lin, Yiyao; Alberola-Rubio, José; Perales, Alfredo

    2014-01-01

    Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system for segmenting EHG recordings that distinguishes between uterine contractions and artifacts. First, segmentation is performed using an algorithm that generates a TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. These segments are then classified into two groups: artifacted and non-artifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest accuracy in detecting artifacts was then determined. The results showed that it is possible to detect motion artifacts automatically in segmented EHG recordings with a precision of 92.2% using only seven features. Together, the proposed algorithm and classifier compose a useful tool for analyzing EHG signals and should help to promote clinical applications of this technique. PMID:24523828
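    The envelope-based windowing step can be sketched simply: compute a windowed RMS amplitude (a crude TOCO-like envelope) and flag windows that rise well above baseline. This numpy sketch on a synthetic signal is illustrative of the segmentation idea only, not the paper's algorithm or classifier.

```python
import numpy as np

def significant_windows(signal, fs, win_s=1.0, factor=3.0):
    """Detect windows whose RMS amplitude rises well above the median
    baseline (sketch of the TOCO-like envelope segmentation step)."""
    n = int(win_s * fs)
    rms = np.array([np.sqrt(np.mean(signal[i:i + n] ** 2))
                    for i in range(0, len(signal) - n + 1, n)])
    thresh = factor * np.median(rms)
    return [i for i, r in enumerate(rms) if r > thresh]

fs = 20  # Hz (illustrative sampling rate)
t = np.arange(0, 10, 1 / fs)
sig = 0.05 * np.sin(2 * np.pi * 0.5 * t)                   # quiet baseline
sig[80:120] += 1.0 * np.sin(2 * np.pi * 1.0 * t[80:120])   # burst at 4-6 s
print(significant_windows(sig, fs))  # → [4, 5]
```

    In the full system each flagged window is then passed to the feature-based classifier to decide whether it is a contraction or a motion artifact.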

  18. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    NASA Astrophysics Data System (ADS)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, providing three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in mouse primary motor cortex, and find significant variations in cellular density distribution across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
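    Steps ii and iii of the pipeline, binarization and centroid extraction, can be shown in miniature. This pure-Python/numpy sketch labels 4-connected components by flood fill and returns their centroids; a production pipeline would operate on 3D volumes with far more robust preprocessing.

```python
import numpy as np
from collections import deque

def detect_centroids(image, thresh=0.5):
    """Binarise, label 4-connected components via flood fill, and
    return one centroid per component (steps ii-iii, in miniature)."""
    mask = image > thresh
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    for y, x in zip(*np.nonzero(mask)):
        if seen[y, x]:
            continue
        q, pix = deque([(y, x)]), []
        seen[y, x] = True
        while q:
            cy, cx = q.popleft()
            pix.append((cy, cx))
            for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    q.append((ny, nx))
        ys, xs = zip(*pix)
        centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

img = np.zeros((10, 10))
img[1:3, 1:3] = 1.0   # cell 1
img[6:9, 6:9] = 1.0   # cell 2
print(detect_centroids(img))  # → [(1.5, 1.5), (7.0, 7.0)]
```

    Step iv, laminar density estimation, then amounts to binning these centroids by cortical depth and normalising by layer volume.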

  19. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI.

    PubMed

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z; Stone, Maureen; Prince, Jerry L

    2014-12-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average dice similarity coefficient (DSC) score of 0.92 with less segmented volume variability between time frames than in manual segmentations.
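    The seed-propagation step, carrying manual seeds from the seeded time frame to the others via the registration's displacement field, can be sketched directly. The toy field below is a uniform translation for illustration; a real deformable registration yields a spatially varying field, and the propagated seeds would then initialise the random walker segmentation.

```python
import numpy as np

def propagate_seeds(seeds, displacement_field):
    """Carry manual seeds from one time frame to the next using the
    deformable-registration displacement field (nearest-voxel lookup).
    displacement_field[y, x] = (dy, dx) at that voxel."""
    out = []
    for y, x in seeds:
        dy, dx = displacement_field[y, x]
        out.append((y + int(round(dy)), x + int(round(dx))))
    return out

# Toy field: everything moves 2 voxels down and 1 voxel right
field = np.zeros((20, 20, 2))
field[..., 0] = 2.0
field[..., 1] = 1.0
print(propagate_seeds([(5, 5), (10, 12)], field))  # → [(7, 6), (12, 13)]
```

    This is what makes the approach semi-automatic: seeding a handful of slices at one time frame suffices for all 26 frames.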

  20. On the automaticity and flexibility of covert attention: a speed-accuracy trade-off analysis.

    PubMed

    Giordano, Anna Marie; McElree, Brian; Carrasco, Marisa

    2009-03-31

    Exogenous covert attention improves discriminability and accelerates the rate of visual information processing (M. Carrasco & B. McElree, 2001). Here we investigated and compared the effects of both endogenous (sustained) and exogenous (transient) covert attention. Specifically, we directed attention via spatial cues and evaluated the automaticity and flexibility of exogenous and endogenous attention by manipulating cue validity in conjunction with a response-signal speed-accuracy trade-off (SAT) procedure, which provides conjoint measures of discriminability and information accrual. To investigate whether discriminability and rate of information processing differ as a function of cue validity (chance to 100%), we compared how both types of attention affect performance while keeping experimental conditions constant. With endogenous attention, both the observed benefits (valid-cue) and the costs (invalid-cue) increased with cue validity. However, with exogenous attention, the benefits and costs in both discriminability and processing speed were similar across cue validity conditions. These results provide compelling time-course evidence that whereas endogenous attention can be flexibly allocated according to cue validity, exogenous attention is automatic and unaffected by cue validity.
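    SAT data of this kind are commonly summarised by a shifted exponential approach to an asymptote, d'(t) = λ(1 − e^(−β(t−δ))) for t > δ, where λ is asymptotic discriminability, β the rate of information accrual, and δ the intercept. The parameter values below are invented for illustration; they show how a cue can speed accrual (β) without changing asymptotic discriminability (λ).

```python
import numpy as np

def sat_dprime(t, lam, beta, delta):
    """Shifted-exponential SAT model: d' grows to asymptote lam at rate
    beta after intercept delta (d' = 0 for t <= delta)."""
    return np.where(t > delta, lam * (1 - np.exp(-beta * (t - delta))), 0.0)

t = np.linspace(0, 3, 7)  # processing times in seconds
valid = sat_dprime(t, lam=2.0, beta=3.0, delta=0.3)    # cued: faster accrual
invalid = sat_dprime(t, lam=2.0, beta=1.5, delta=0.3)  # uncued: slower accrual
# Same asymptote, but the cued condition accrues information faster:
print(valid[-1].round(2), invalid[-1].round(2), bool((valid >= invalid).all()))
```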

  1. Analysis of the Distances Covered by First Division Brazilian Soccer Players Obtained with an Automatic Tracking Method

    PubMed Central

    Barros, Ricardo M.L.; Misuta, Milton S.; Menezes, Rafael P.; Figueroa, Pascual J.; Moura, Felipe A.; Cunha, Sergio A.; Anido, Ricardo; Leite, Neucimar J.

    2007-01-01

    Methods based on visual estimation are still the most widely used for analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, both datasets measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results of 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. A three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half, 5,173 m (s = 394 m, cv = 7.6%), was significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half. Key points: A novel automatic tracking method was presented. No previous

  2. Analysis of the distances covered by first division brazilian soccer players obtained with an automatic tracking method.

    PubMed

    Barros, Ricardo M L; Misuta, Milton S; Menezes, Rafael P; Figueroa, Pascual J; Moura, Felipe A; Cunha, Sergio A; Anido, Ricardo; Leite, Neucimar J

    2007-01-01

    Methods based on visual estimation are still the most widely used for analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the very first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results with those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), results for the 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. A three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), highly significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half. Key points: A novel automatic tracking method was presented. No previous
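
    The per-band distance accounting reported above can be sketched directly from tracker output. A minimal illustration assuming 2D positions (in metres) sampled at a fixed interval; the band names and speed thresholds below are illustrative stand-ins, not the study's exact definitions:

```python
# Sketch: distance-by-speed-band accounting from 2D tracking data.
# Band thresholds (m/s) are hypothetical, chosen only for illustration.
import math

BANDS = [(0.0, 3.0, "walk/jog"),
         (3.0, 4.4, "low-speed run"),
         (4.4, 5.5, "moderate-speed run"),
         (5.5, 7.0, "high-speed run"),
         (7.0, float("inf"), "sprint")]

def distance_by_band(track, dt):
    """track: list of (x, y) positions in metres sampled every dt seconds."""
    totals = {name: 0.0 for _, _, name in BANDS}
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        d = math.hypot(x1 - x0, y1 - y0)   # distance in this sample interval
        v = d / dt                         # instantaneous speed estimate
        for lo, hi, name in BANDS:
            if lo <= v < hi:
                totals[name] += d
                break
    return totals
```

Summing the per-band totals over each 45-minute half would reproduce the kind of first-half versus second-half comparison the study reports.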

  3. γH2AX foci as a measure of DNA damage: a computational approach to automatic analysis

    PubMed Central

    Ivashkevich, Alesia N.; Martin, Olga A.; Smith, Andrea J.; Redon, Christophe E.; Bonner, William M.; Martin, Roger F.; Lobachevsky, Pavel N.

    2011-01-01

    The γH2AX focus assay represents a fast and sensitive approach for detection of one of the critical types of DNA damage – double-strand breaks (DSB) induced by various cytotoxic agents including ionising radiation. Apart from research applications, the assay has potential in clinical medicine and pathology, such as assessment of individual radiosensitivity and response to cancer therapies, as well as in biodosimetry. Given that there is generally a direct relationship between the number of microscopically visualised γH2AX foci and DNA DSB in a cell, the number of foci per nucleus represents the most efficient and informative parameter of the assay. Although computational approaches have been developed for automatic focus counting, tedious and time-consuming manual focus counting still remains the most reliable approach, owing to the limitations of the computational ones. We suggest a computational approach and associated software for automatic focus counting that minimises these limitations. Our approach, while using standard image processing algorithms, maximises the automation of identification of nuclei/cells in complex images, offers an efficient way to optimise parameters used in the image analysis and counting procedures, optionally invokes additional procedures to deal with variations in intensity of the signal and background in individual images, and provides automatic batch processing of a series of images. We report results of validation studies that demonstrated correlation of manual focus counting with results obtained using our computational algorithm for mouse jejunum touch prints, mouse tongue sections and human blood lymphocytes, as well as radiation dose response of γH2AX focus induction for these biological specimens. PMID:21216255
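
    The core counting step such software automates can be illustrated with a threshold-and-label pass over a focus-channel image. This pure-Python sketch (4-connected flood fill, hypothetical `count_foci` helper) stands in for the optimised routines a real pipeline would use:

```python
# Sketch: count connected bright regions (foci) inside a nucleus mask.
# Pure-Python 4-connected component labelling; illustrative only.

def count_foci(image, mask, threshold):
    """image: 2D list of intensities; mask: 2D list, 1 inside the nucleus."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    foci = 0
    for i in range(h):
        for j in range(w):
            if seen[i][j] or not mask[i][j] or image[i][j] < threshold:
                continue
            foci += 1                      # found a new connected component
            stack = [(i, j)]
            seen[i][j] = True
            while stack:                   # flood-fill the whole focus
                y, x = stack.pop()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and mask[ny][nx] and image[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
    return foci
```

The per-image threshold optimisation and background correction the abstract describes would sit on top of a routine like this.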

  4. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of rodent fMRI studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks(®)) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat.

  5. Automatic analysis of cerebral asymmetry: an exploratory study of the relationship between brain torque and planum temporale asymmetry.

    PubMed

    Barrick, Thomas R; Mackay, Clare E; Prima, Sylvain; Maes, Frederik; Vandermeulen, Dirk; Crow, Timothy J; Roberts, Neil

    2005-02-01

    Leftward occipital and rightward frontal lobe asymmetry (brain torque) and leftward planum temporale asymmetry have been consistently reported in postmortem and in vivo neuroimaging studies of the human brain. Here automatic image analysis techniques are applied to quantify global and local asymmetries, and investigate the relationship between brain torque and planum temporale asymmetries on T1-weighted magnetic resonance (MR) images of 30 right-handed young healthy subjects (15 male, 15 female). Previously described automatic cerebral hemisphere extraction and 3D interhemispheric reflection-based methods for studying brain asymmetry are applied with a new technique, LowD (Low Dimension), which enables automatic quantification of brain torque. LowD integrates extracted left and right cerebral hemispheres in columns orthogonal to the midsagittal plane (2D column maps), and subsequently integrates slices along the brain's anterior-posterior axis (1D slice profiles). A torque index defined as the magnitude of occipital and frontal lobe asymmetry is computed allowing exploratory investigation of relationships between this global asymmetry and local asymmetries found in the planum temporale. LowD detected significant torque in the 30 subjects with occipital and frontal components found to be highly correlated (P<0.02). Significant leftward planum temporale asymmetry was detected (P<0.05), and the torque index correlated with planum temporale asymmetry (P<0.001). However, torque and total brain volume were not correlated. Therefore, although components of cerebral asymmetry may be related, their magnitude is not influenced by total hemisphere volume. LowD provides increased sensitivity for detection and quantification of brain torque on an individual subject basis, and future studies will apply these techniques to investigate the relationship between cerebral asymmetry and functional laterality.
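
    The LowD reduction described above can be sketched as a 1D profile computation. The torque-index formula below is an illustrative stand-in for the paper's exact definition (which combines frontal and occipital asymmetry magnitudes); `n_end` is a hypothetical parameter choosing how many anterior/posterior slices count as "frontal" and "occipital":

```python
# Sketch: LowD-style 1D slice profiles and a torque index.
# Each hemisphere is first collapsed to a per-slice area along the
# anterior-posterior axis; asymmetry is the normalised left-right difference.

def asymmetry_profile(left, right):
    """left/right: per-slice hemisphere areas, anterior to posterior."""
    return [(l - r) / (l + r) for l, r in zip(left, right)]

def torque_index(left, right, n_end=2):
    """Sum of frontal and occipital asymmetry magnitudes (illustrative)."""
    prof = asymmetry_profile(left, right)
    frontal = sum(prof[:n_end]) / n_end        # anterior slices
    occipital = sum(prof[-n_end:]) / n_end     # posterior slices
    return abs(frontal) + abs(occipital)
```

A rightward-frontal, leftward-occipital pattern (the classic torque) gives opposite signs at the two ends of the profile and hence a large index.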

  6. Automatic cell segmentation and nuclear-to-cytoplasmic ratio analysis for third harmonic generated microscopy medical images.

    PubMed

    Lee, Gwo Giun; Lin, Huan-Hsiang; Tsai, Ming-Rung; Chou, Sin-Yo; Lee, Wen-Jeng; Liao, Yi-Hua; Sun, Chi-Kuang; Chen, Chun-Fu

    2013-04-01

    Traditional biopsy procedures require invasive tissue removal from a living subject, followed by time-consuming and complicated processing, so a noninvasive in vivo virtual biopsy, which can obtain exhaustive tissue images without removing tissue, is highly desirable. Sets of in vivo virtual biopsy images provided by healthy volunteers were processed by the proposed cell segmentation approach, which is based on a watershed algorithm and the concept of the convergence index filter. Experimental results suggest that the proposed algorithm not only achieves high accuracy for cell segmentation but also has great potential for noninvasive analysis of the cell nuclear-to-cytoplasmic (NC) ratio, which is important for identifying or detecting early symptoms of diseases with abnormal NC ratios, such as skin cancers, during clinical diagnosis via medical image analysis.
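
    Once cells are segmented, the NC-ratio computation itself is a simple pixel count over the label image. A minimal sketch, assuming a hypothetical label convention (0 = background, 1 = cytoplasm, 2 = nucleus) rather than the paper's actual data format:

```python
# Sketch: nuclear-to-cytoplasmic (NC) ratio from a per-cell label image.
# Label convention here is an assumption for illustration.

def nc_ratio(labels):
    """labels: 2D list with 0=background, 1=cytoplasm, 2=nucleus."""
    flat = [v for row in labels for v in row]
    nucleus_px = flat.count(2)
    cytoplasm_px = flat.count(1)
    if cytoplasm_px == 0:
        return float("inf")   # degenerate segmentation
    return nucleus_px / cytoplasm_px
```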

  7. An Analysis of Serial Number Tracking Automatic Identification Technology as Used in Naval Aviation Programs

    NASA Astrophysics Data System (ADS)

    Csorba, Robert

    2002-09-01

    The General Accounting Office found that the Navy, between 1996 and 1998, lost $3 billion in materiel in transit. This thesis explores the benefits and costs of automatic identification and serial number tracking technologies under consideration by the Naval Supply Systems Command and the Naval Air Systems Command. Detailed cost-savings estimates are made for each aircraft type in the Navy inventory. Project and item managers of repairable components using serial number tracking were surveyed as to the value of this system. The thesis concludes that two thirds of the in-transit losses could be avoided by implementing effective information technology-based logistics and maintenance tracking systems. Recommendations are made for specific steps and components of such an implementation, and suggestions are made for further research.

  8. Flow measurements in sewers based on image analysis: automatic flow velocity algorithm.

    PubMed

    Jeanbourquin, D; Sage, D; Nguyen, L; Schaeli, B; Kayal, S; Barry, D A; Rossi, L

    2011-01-01

    Discharges of combined sewer overflows (CSOs) and stormwater are recognized as an important source of environmental contamination. However, the harsh sewer environment and particular hydraulic conditions during rain events reduce the reliability of traditional flow measurement probes. An in situ system for sewer water flow monitoring based on video images was evaluated. Algorithms to determine water velocities were developed based on image-processing techniques. The image-based water velocity algorithm identifies surface features and measures their positions with respect to real-world coordinates. A web-based user interface and a three-tier system architecture enable remote configuration of the cameras and the image-processing algorithms in order to calculate flow velocity automatically on-line. Results of investigations conducted in a CSO are presented. The system was found to measure water velocities reliably, thereby providing the means to understand particular hydraulic behaviors.
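
    The displacement-estimation core of such a surface-velocity algorithm can be sketched with a simple cross-correlation search: find the pixel shift between intensity profiles of the water surface in consecutive frames that maximises their correlation, then convert to a velocity using the frame interval and ground resolution. This is a generic sketch, not the paper's specific algorithm:

```python
# Sketch: surface velocity from the pixel shift between two intensity
# profiles of consecutive frames, found by maximising cross-correlation.

def estimate_shift(profile_a, profile_b, max_shift):
    """Return the integer shift s maximising sum(a[i] * b[i + s])."""
    n = len(profile_a)
    best, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(profile_a[i] * profile_b[i + s]
                    for i in range(max(0, -s), min(n, n - s)))
        if score > best_score:
            best, best_score = s, score
    return best

def surface_velocity(profile_a, profile_b, dt, metres_per_pixel, max_shift=10):
    """Velocity in m/s from two profiles dt seconds apart."""
    return estimate_shift(profile_a, profile_b, max_shift) * metres_per_pixel / dt
```

Sub-pixel refinement (e.g. parabolic interpolation around the peak) is the usual next step in production systems.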

  9. Versatile, high sensitivity, and automatized angular dependent vectorial Kerr magnetometer for the analysis of nanostructured materials.

    PubMed

    Teixeira, J M; Lusche, R; Ventura, J; Fermento, R; Carpinteiro, F; Araujo, J P; Sousa, J B; Cardoso, S; Freitas, P P

    2011-04-01

    Magneto-optical Kerr effect (MOKE) magnetometry is an indispensable, reliable, and one of the most widely used techniques for the characterization of nanostructured magnetic materials. Information such as the magnitude of coercive fields or anisotropy strengths can be readily obtained from MOKE measurements. We present a description of our state-of-the-art vectorial MOKE magnetometer, an extremely versatile, accurate, and sensitive unit with a low cost and a comparatively simple setup. The unit includes focusing lenses and an automatized stepper-motor stage for angular-dependent measurements. The performance of the magnetometer is demonstrated by hysteresis loops of Co thin films displaying uniaxial anisotropy induced during growth, MnIr/CoFe structures exhibiting the so-called exchange bias effect, spin valves, and microfabricated flux guides produced by optical lithography.

  10. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    NASA Astrophysics Data System (ADS)

    Rashidi, Mehdi; Dehmeshki, Jamshid; Dickenson, Eric; Daemi, M. Farhang

    1997-10-01

    This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid laced with a fluorescent dye or microspheres flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination purposes, a planar laser sheet passes through the column as a CCD camera records the laser-illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For measuring velocities, while the aqueous fluid, laced with fluorescent microspheres, flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired automatically frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed images are of poor quality at this stage, several preprocessing steps are applied to enhance the particles within the images. Finally, these enhanced particles are tracked to calculate velocity vectors in the plane of the beam. For concentration measurements, while the aqueous fluid, laced with a fluorescent organic dye, flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam traveling simultaneously with the camera. Subsequently, these recorded images are transferred to the computer for processing in a similar fashion to the velocity measurement. In order to have a fully automatic vision system, several detailed image processing techniques were developed to match images that have different intensity values but the same topological characteristics. This results in normalized interstitial chemical concentrations as a function of time within the porous column.

  11. Automatic classification of pulmonary function in COPD patients using trachea analysis in chest CT scans

    NASA Astrophysics Data System (ADS)

    van Rikxoort, E. M.; de Jong, P. A.; Mets, O. M.; van Ginneken, B.

    2012-03-01

    Chronic Obstructive Pulmonary Disease (COPD) is a chronic lung disease characterized by airflow limitation. COPD is clinically diagnosed and monitored using pulmonary function testing (PFT), which measures the global inspiration and expiration capabilities of patients and is time-consuming and labor-intensive. It is becoming standard practice to obtain paired inspiration-expiration CT scans of COPD patients. Predicting the PFT results from the CT scans would alleviate the need for PFT. It is hypothesized that changes in the trachea during breathing might be an indicator of tracheomalacia in COPD patients and correlate with COPD severity. In this paper, we propose to automatically measure morphological changes in the trachea from paired inspiration and expiration CT scans and investigate their influence on COPD GOLD stage classification. The trachea is automatically segmented and its shape is encoded using the lengths of rays cast from the center of gravity of the trachea. These features are used in a classifier, combined with emphysema scoring, to classify subjects into their COPD stage. A database of 187 subjects, well distributed over COPD GOLD stages 0 through 4, was used for this study. The data were randomly divided into training and test sets. Using the training scans, a nearest-mean classifier was trained to classify the subjects into their correct GOLD stage using either the emphysema score, the tracheal shape features, or a combination. Combining the proposed trachea shape features with the emphysema score improved the classification performance into GOLD stages by 11%, to 51%. In addition, an accuracy of 80% was achieved in distinguishing healthy subjects from COPD patients.
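
    The classification step named above (a nearest-mean classifier over feature vectors such as tracheal ray lengths plus an emphysema score) is simple enough to sketch in full. This is a generic nearest-mean classifier, not the paper's implementation:

```python
# Sketch: nearest-mean classification. Each class is represented by the
# mean of its training feature vectors; a sample is assigned to the class
# whose mean is closest in Euclidean distance.

def train_nearest_mean(X, y):
    """X: list of feature vectors, y: class labels. Returns class means."""
    groups = {}
    for xi, yi in zip(X, y):
        groups.setdefault(yi, []).append(xi)
    return {c: [sum(col) / len(col) for col in zip(*rows)]
            for c, rows in groups.items()}

def predict_label(means, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(means, key=lambda c: dist2(means[c], x))
```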

  12. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis

    NASA Astrophysics Data System (ADS)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan

    2011-06-01

    Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction and an unsupervised clustering method, is described here. The wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure instead of the Euclidean distance, and the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels, and electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery, are used to evaluate the performance of GSLC. Feature extraction results from the use of WT with the KS test indicate a reduced number of feature coefficients, as well as good noise rejection despite similar spike waveforms. Accordingly, the use of GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results on the electrophysiological data indicate that the quality of spike sorting with GSLC is adequate.
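
    The similarity measure the framework substitutes for Euclidean distance can be sketched as Deng's grey relational grade; the distinguishing coefficient `rho = 0.5` is the conventional default, and this is a generic formulation rather than the paper's exact implementation:

```python
# Sketch: grey relational grade between two feature vectors (Deng's form).
# Values lie in (0, 1]; identical vectors score 1.

def grey_relational_grade(a, b, rho=0.5):
    diffs = [abs(x - y) for x, y in zip(a, b)]
    dmin, dmax = min(diffs), max(diffs)
    if dmax == 0.0:
        return 1.0                        # identical vectors
    coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in diffs]
    return sum(coeffs) / len(coeffs)      # grade = mean coefficient
```

Single-linkage clustering then merges the pair of clusters with the highest grade at each step, instead of the pair with the smallest distance.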

  13. Automatic Extraction of Optimal Endmembers from Airborne Hyperspectral Imagery Using Iterative Error Analysis (IEA) and Spectral Discrimination Measurements

    PubMed Central

    Song, Ahram; Chang, Anjin; Choi, Jaewan; Choi, Seokkeun; Kim, Yongil

    2015-01-01

    Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot handle the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two airborne data sets, from the Airborne Imaging Spectrometer for Application (AISA) and the Compact Airborne Spectrographic Imager (CASI), were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers or errors from mixed materials. PMID:25625907

  14. Automatic extraction of optimal endmembers from airborne hyperspectral imagery using iterative error analysis (IEA) and spectral discrimination measurements.

    PubMed

    Song, Ahram; Chang, Anjin; Choi, Jaewan; Choi, Seokkeun; Kim, Yongil

    2015-01-23

    Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot handle the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two airborne data sets, from the Airborne Imaging Spectrometer for Application (AISA) and the Compact Airborne Spectrographic Imager (CASI), were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers or errors from mixed materials.
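
    The IEA loop underlying both records above can be sketched compactly, assuming NumPy: at each step the pixel worst reconstructed by the current endmember set (highest RMSE) becomes the next endmember. The unconstrained least-squares unmixing here is a simplification of the constrained unmixing used in practice:

```python
# Sketch of Iterative Error Analysis (IEA) endmember selection.
# Abundances are estimated by plain least squares (a simplification of
# the usual sum-to-one / non-negativity constrained unmixing).
import numpy as np

def iea_endmembers(pixels, n_endmembers):
    """pixels: (n_pixels, n_bands) array. Returns chosen pixel indices."""
    mean = pixels.mean(axis=0)
    rmse = np.sqrt(((pixels - mean) ** 2).mean(axis=1))
    chosen = [int(rmse.argmax())]          # start from the worst-fit pixel
    for _ in range(n_endmembers - 1):
        E = pixels[chosen].T               # bands x current endmembers
        A, *_ = np.linalg.lstsq(E, pixels.T, rcond=None)
        recon = (E @ A).T                  # reconstruction from endmembers
        rmse = np.sqrt(((pixels - recon) ** 2).mean(axis=1))
        chosen.append(int(rmse.argmax()))  # next worst-reconstructed pixel
    return chosen
```

The paper's contribution sits after this loop: using the RMSE images and spectral discrimination measures to decide when to stop and which chosen endmembers to discard as repeated or mixed.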

  15. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying each image based on features such as the complexity of the background and the visibility of the disease (lesions). Therefore, an automatic background classification tool for mammograms would help for such clinical studies. This classification tool is based on a multi-content analysis (MCA) framework which was first developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfying accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale is used for grouping the mammograms; it standardizes mammography reporting terminology and assessment and recommendation categories. Selected features are input into a decision-tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these "weak classifiers" are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one "strong classifier" show good accuracy with high true-positive rates. For the four categories the results are: TP = 90.38%, TN = 67.88%, FP = 32.12% and FN = 9.62%.
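
    The weak-to-strong combination described above is standard AdaBoost. A minimal sketch on 1-D decision stumps (the paper uses decision trees over texture features; the stump learner here is a simplification for illustration):

```python
# Sketch: AdaBoost combining weak threshold classifiers into a weighted
# vote. xs: 1-D features, ys: +1/-1 labels.
import math

def train_adaboost(xs, ys, rounds):
    """Returns a list of (threshold, polarity, alpha) weak classifiers."""
    n = len(xs)
    w = [1.0 / n] * n                      # sample weights
    model = []
    for _ in range(rounds):
        best = None
        for t in xs:                       # candidate thresholds
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if (pol if x >= t else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)              # avoid log(0) for perfect stumps
        alpha = 0.5 * math.log((1 - err) / err)
        model.append((t, pol, alpha))
        # re-weight: boost misclassified samples
        w = [wi * math.exp(-alpha * y * (pol if x >= t else -pol))
             for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return model

def boosted_predict(model, x):
    score = sum(alpha * (pol if x >= t else -pol) for t, pol, alpha in model)
    return 1 if score >= 0 else -1
```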

  16. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1988-01-19

    approach for the analysis of aerial images. In this approach image analysis is performed at three levels of abstraction, namely iconic or low-level image analysis, symbolic or medium-level image analysis, and semantic or high-level image analysis. Domain-dependent knowledge about prototypical urban

  17. CIRF Publications, Vol. 12, No. 5.

    ERIC Educational Resources Information Center

    International Labour Office, Geneva (Switzerland).

    CIRF Publications, Vol. 12, No. 5 is a collection of 80 abstracts giving particular attention to education, training, and economic growth in developing countries, Iran, Japan, Kenya, the Solomon Islands, and Sri Lanka; vocational rehabilitation in Italy, Spain, the United Kingdom, and the U. S. A.; agriculture in Chad, developing countries, and…

  18. Validating the performance of one-time decomposition for fMRI analysis using ICA with automatic target generation process.

    PubMed

    Yao, Shengnan; Zeng, Weiming; Wang, Nizhuan; Chen, Lei

    2013-07-01

    Independent component analysis (ICA) has been proven effective for functional magnetic resonance imaging (fMRI) data analysis. However, ICA decomposition requires iterative optimization of the unmixing matrix, whose initial values are generated randomly, and this randomness of the initialization leads to different decomposition results. A one-time decomposition is therefore not usually reliable for fMRI data analysis. Under this circumstance, several methods based on repeated decompositions with ICA (RDICA) were proposed to reveal the stability of ICA decomposition. Although utilizing RDICA has achieved satisfying results in validating the performance of ICA decomposition, RDICA costs considerable computing time. To mitigate this problem, in this paper we propose a method, named ATGP-ICA, for fMRI data analysis. This method generates fixed initial values with an automatic target generation process (ATGP) instead of producing them randomly. We performed experimental tests on both hybrid data and fMRI data to show the effectiveness of the new method, and compared the performance of traditional one-time decomposition with ICA (ODICA), RDICA and ATGP-ICA. The proposed method not only eliminates the randomness of ICA decomposition, but also saves considerable computing time compared to RDICA. Furthermore, ROC (Receiver Operating Characteristic) power analysis also indicated better signal reconstruction performance for ATGP-ICA than for RDICA.
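
    The ATGP step that replaces random initialization can be sketched with NumPy: repeatedly take the sample with the largest norm, then project all samples onto the orthogonal complement of the targets found so far. This is the generic ATGP recursion, not the paper's full pipeline:

```python
# Sketch: automatic target generation process (ATGP).
# Each round picks the residual sample of maximum norm and projects the
# data onto the orthogonal complement of the targets selected so far.
import numpy as np

def atgp(X, n_targets):
    """X: (n_samples, n_features). Returns indices of generated targets."""
    R = X.astype(float).copy()
    targets = []
    for _ in range(n_targets):
        idx = int(np.argmax((R ** 2).sum(axis=1)))
        targets.append(idx)
        U = X[targets].T                   # features x targets
        # orthogonal-complement projector P = I - U (U^T U)^-1 U^T
        P = np.eye(X.shape[1]) - U @ np.linalg.pinv(U)
        R = X @ P.T                        # residuals for the next round
    return targets
```

The selected target samples then seed the unmixing matrix, making the subsequent ICA optimization deterministic.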

  19. Preliminary Statistical Analysis of the 1995 Evaluation by NASA LaRC of the IAI Automatic Balance Calibration Machine

    NASA Technical Reports Server (NTRS)

    Tcheng, Ping; Tripp, John S.

    1999-01-01

    The NASA Langley Research Center (LaRC) participated in a national cooperative evaluation of the Israel Aircraft Industries (IAI) automatic balance calibration machine at Microcraft, San Diego, in September 1995. A LaRC-designed six-component strain gauge balance was selected for test and calibration during LaRC's scheduled evaluation period. Eight calibrations were conducted using three selected experimental designs. Raw data were exported to LaRC facilities for reduction and statistical analysis using the techniques outlined in Tripp and Tcheng (1994). This report presents preliminary assessments of the results, and compares IAI calibration results with manual calibration results obtained at Modern Machine and Tool Co., Inc. (MM&T), Newport News, VA. A more comprehensive report is forthcoming.

  20. Automatic frequency alignment and quantitation of single resonances in multiple magnetic resonance spectra via complex principal component analysis

    NASA Astrophysics Data System (ADS)

    Van Huffel, Sabine; Wang, Yu; Vanhamme, Leentje; Van Hecke, Paul

    2002-09-01

    Several algorithms for automatic frequency alignment and quantitation of single resonances in multiple magnetic resonance (MR) spectra are investigated. First, a careful comparison between the complex principal component analysis (PCA) and the Hankel total least squares-based methods for quantifying the resonances in the spectral sets of magnetic resonance spectroscopy imaging (MRSI) spectra is presented. Afterward, we discuss a method based on complex PCA plus linear regression and a method based on cross correlation of the magnitude spectra for correcting frequency shifts of resonances in sets of MR spectra. Their advantages and limitations are demonstrated on simulated MR data sets as well as on an in vivo MRSI data set of the human brain.

  1. Volume-based Feature Analysis of Mucosa for Automatic Initial Polyp Detection in Virtual Colonoscopy

    PubMed Central

    Wang, Su; Zhu, Hongbin; Lu, Hongbing; Liang, Zhengrong

    2009-01-01

    In this paper, we present a volume-based, mucosa-based polyp candidate determination scheme for automatic polyp detection in computed tomographic colonography. Unlike most existing computer-aided detection (CAD) methods, in which the mucosa layer is modelled as a one-layer surface, a thick mucosa 3-5 voxels wide that fully reflects the partial volume effect is intentionally extracted, which precludes the direct application of traditional geometrical features. To address this dilemma, fast marching-based adaptive gradient/curvature and weighted integral curvature along normal directions (WICND) are developed for the volume-based mucosa. In doing so, polyp candidates are optimally determined by computing and clustering these fast marching-based adaptive geometrical features. By testing on 52 patient datasets, in which 26 patients were found with polyps of size 4-22 mm, both the locations and number of polyp candidates detected by WICND and by the previously developed linear integral curvature (LIC) were compared. The results were promising: WICND outperformed LIC mainly in two aspects: (1) the number of detected false positives was reduced from 706 to 132 on average, which significantly reduced the burden of machine learning in the feature space, and (2) both the sensitivity and accuracy of polyp detection were slightly improved, especially for polyps smaller than 5 mm. PMID:19774204

  2. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents

    PubMed Central

    Colomer Granero, Adrián; Fuentes-Hurtado, Félix; Naranjo Ornedo, Valery; Guixeres Provinciale, Jaime; Ausín, Jose M.; Alcañiz Raya, Mariano

    2016-01-01

    This work focuses on finding the most discriminatory or representative features that allow commercials to be classified as having negative, neutral or positive effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out. In this experiment electroencephalography (EEG), electrocardiography (ECG), galvanic skin response (GSR) and respiration data were acquired while subjects were watching 30 min of audiovisual content. This content was composed of a submarine documentary and nine commercials (one of them the ad under evaluation). After signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features, computed in the time and frequency domains, are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier combining AdaBoost and Random Forest with automatic feature selection. The selected features were those extracted from the GSR and HRV signals. These results are promising for audiovisual content evaluation by means of physiological signal processing. PMID:27471462

  3. Automatic nevi segmentation using adaptive mean shift filters and feature analysis

    NASA Astrophysics Data System (ADS)

    King, Michael A.; Lee, Tim K.; Atkins, M. Stella; McLean, David I.

    2004-05-01

A novel automatic method of segmenting nevi is explained and analyzed in this paper. The first step in nevi segmentation is to iteratively apply an adaptive mean shift filter to form clusters in the image and to remove noise. The goal of this step is to remove differences in skin intensity and hairs from the image, while still preserving the shape of nevi present on the skin. Each iteration of the mean shift filter changes pixel values to a weighted average of the pixels in their neighborhood. Some new extensions to the mean shift filter are proposed to allow for better segmentation of nevi from the skin. The kernel, which describes how the pixels in a neighborhood are averaged, is adaptive: the shape of the kernel is a function of the local histogram. After initial clustering, a simple merging of clusters is done. Finally, clusters that are local minima are found and analyzed to determine which clusters are nevi. When this algorithm was compared to an assessment by an expert dermatologist, it showed a sensitivity rate and diagnostic accuracy of over 95% on the test set, for nevi larger than 1.5 mm.
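The iterative mean shift step described in this record can be sketched in one dimension; the neighborhood radius, bandwidth, and intensity profile below are our own toy assumptions, not the authors' parameters.

```python
# Illustrative sketch (not the authors' code): one iteration of a mean shift
# filter on a 1-D intensity profile. Each value is replaced by the average of
# spatial neighbours whose intensity lies within a bandwidth `h`, so flat skin
# regions merge into clusters while sharp nevus boundaries are preserved.
def mean_shift_1d(values, radius=2, h=30.0):
    out = []
    for i, v in enumerate(values):
        lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
        near = [u for u in values[lo:hi] if abs(u - v) <= h]
        out.append(sum(near) / len(near))
    return out

profile = [10, 12, 11, 200, 205, 202, 12, 10]  # skin ... nevus ... skin
smoothed = mean_shift_1d(profile)
```

Repeating the filter drives each region toward a constant value, which is what makes the subsequent cluster merging step simple.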

  4. Monitoring infants by automatic video processing: A unified approach to motion analysis.

    PubMed

    Cattani, Luca; Alinovi, Davide; Ferrari, Gianluigi; Raheli, Riccardo; Pavlidis, Elena; Spagnoli, Carlotta; Pisani, Francesco

    2017-01-01

A unified approach to contact-less and low-cost video processing for automatic detection of neonatal diseases characterized by specific movement patterns is presented. This disease category includes neonatal clonic seizures and apneas. Both disorders are characterized by the presence or absence, respectively, of periodic movements of parts of the body, e.g., the limbs in the case of clonic seizures and the chest/abdomen in the case of apneas. Therefore, one can analyze the data obtained from multiple video sensors placed around a patient, extracting relevant motion signals and estimating their possible periodicity using the Maximum Likelihood (ML) criterion. This approach is very versatile and allows investigation of various scenarios, including a single Red, Green and Blue (RGB) camera, an RGB-depth sensor, or a network of a few RGB cameras. Data fusion principles are considered to aggregate the signals from multiple sensors. In the case of apneas, since breathing movements are subtle, the video can be pre-processed by a recently proposed algorithm which is able to emphasize small movements. The performance of the proposed contact-less detection algorithms is assessed, considering real video recordings of newborns, in terms of sensitivity, specificity, and Receiver Operating Characteristic (ROC) curves, with respect to medical gold standard devices. The obtained results show that a video processing-based system can effectively detect the considered specific diseases, with performance increasing with the number of sensors.
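The ML periodicity estimation mentioned in this record can be illustrated under a common simplifying assumption: for a single sinusoid in white Gaussian noise, the ML frequency estimate coincides with the periodogram peak. The frame rate and jerk frequency below are hypothetical, not values from the paper.

```python
import numpy as np

# Sketch only: estimate the dominant period of a motion signal from the
# magnitude of its FFT (periodogram peak = ML frequency estimate for a
# sinusoid in white Gaussian noise).
def dominant_period(signal, fs):
    sig = np.asarray(signal, dtype=float)
    sig -= sig.mean()                      # remove DC before the transform
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    k = 1 + np.argmax(spectrum[1:])        # skip the zero-frequency bin
    return 1.0 / freqs[k]

fs = 30.0                                  # hypothetical 30 fps camera
t = np.arange(0, 10, 1.0 / fs)
motion = np.sin(2 * np.pi * 1.5 * t) \
    + 0.3 * np.random.default_rng(0).normal(size=t.size)
period = dominant_period(motion, fs)       # close to 1/1.5 s
```

A thresholded peak height could then serve as the detection statistic for "periodic movement present".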

  5. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    SciTech Connect

    Fang, Y; Huang, H; Su, T

    2015-06-15

Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of these methods. Methods: Myocardial perfusion images from Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the left ventricle (LV). The segmented voxel intensities were then carried into the texture analysis with our open-source software, the Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity, and specificity as well as the area under the curve (AUC). These indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%), and an AUC of 0.82. This performance is similar to that of the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination
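The ROC evaluation in this record can be sketched via the Mann-Whitney formulation: the AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative case. The heterogeneity scores below are invented for illustration, not study data.

```python
# Toy sketch of ROC AUC computation (our illustration, not the study's code):
# count, over all positive/negative pairs, how often the positive case wins,
# with ties counted as half a win.
def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

stenotic = [0.9, 0.8, 0.75, 0.4]        # hypothetical heterogeneity scores
non_stenotic = [0.3, 0.5, 0.2, 0.35]
score = auc(stenotic, non_stenotic)
```

This pairwise form gives exactly the same number as integrating the empirical ROC curve, without building the curve explicitly.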

  6. AUTOMATIC COUNTER

    DOEpatents

    Robinson, H.P.

    1960-06-01

An automatic counter of alpha particle tracks recorded by a sensitive emulsion of a photographic plate is described. The counter includes a source of modulated dark-field illumination for developing light flashes from the recorded particle tracks as the photographic plate is automatically scanned in narrow strips. Photoelectric means convert the light flashes to proportional current pulses for application to an electronic counting circuit. Photoelectric means are further provided for developing a phase reference signal from the photographic plate in such a manner that signals arising from particle tracks not parallel to the edge of the plate are out of phase with the reference signal. The counting circuit includes provision for rejecting the out-of-phase signals resulting from unoriented tracks as well as signals resulting from spurious marks on the plate such as scratches, dust or grain clumpings, etc. The output of the circuit is hence indicative only of the tracks that would be counted by a human operator.

  7. Automatic analysis of slips of the tongue: Insights into the cognitive architecture of speech production.

    PubMed

    Goldrick, Matthew; Keshet, Joseph; Gustafson, Erin; Heller, Jordana; Needle, Jeremy

    2016-04-01

    Traces of the cognitive mechanisms underlying speaking can be found within subtle variations in how we pronounce sounds. While speech errors have traditionally been seen as categorical substitutions of one sound for another, acoustic/articulatory analyses show they partially reflect the intended sound. When "pig" is mispronounced as "big," the resulting /b/ sound differs from correct productions of "big," moving towards intended "pig"-revealing the role of graded sound representations in speech production. Investigating the origins of such phenomena requires detailed estimation of speech sound distributions; this has been hampered by reliance on subjective, labor-intensive manual annotation. Computational methods can address these issues by providing for objective, automatic measurements. We develop a novel high-precision computational approach, based on a set of machine learning algorithms, for measurement of elicited speech. The algorithms are trained on existing manually labeled data to detect and locate linguistically relevant acoustic properties with high accuracy. Our approach is robust, is designed to handle mis-productions, and overall matches the performance of expert coders. It allows us to analyze a very large dataset of speech errors (containing far more errors than the total in the existing literature), illuminating properties of speech sound distributions previously impossible to reliably observe. We argue that this provides novel evidence that two sources both contribute to deviations in speech errors: planning processes specifying the targets of articulation and articulatory processes specifying the motor movements that execute this plan. These findings illustrate how a much richer picture of speech provides an opportunity to gain novel insights into language processing.

  8. Automatic regional analysis of DTI properties in the developmental macaque brain

    NASA Astrophysics Data System (ADS)

    Styner, Martin; Knickmeyer, Rebecca; Coe, Christopher; Short, Sarah J.; Gilmore, John

    2008-03-01

Many neuroimaging studies are applied to monkeys, as pathologies and environmental exposures can be studied in well-controlled settings. In this work, we present a framework for atlas-based, fully automatic segmentation of brain tissues, lobar parcellations, and subcortical structures, and for the regional extraction of Diffusion Tensor Imaging (DTI) properties. We first built a structural atlas from training images by iterative, joint deformable registration into an unbiased average image. On this atlas, probabilistic tissue maps, a lobar parcellation, and subcortical structures were determined. This information is applied to each subject's structural image via affine, followed by deformable, registration. The affinely transformed atlas is employed for a joint T1- and T2-based tissue classification. The deformed parcellation regions mask the tissue segmentations to define the parcellation for white and gray matter separately. Each subject's structural image is then non-rigidly matched to its DTI image by normalized mutual information, b-spline based registration. The DTI property histograms were then computed using the probabilistic white matter information for each lobar parcellation. We successfully built an average atlas using a developmental training dataset of 18 cases aged 16-34 months. Our framework was successfully applied to over 50 additional subjects in the age range of 9-70 months. The probabilistically weighted FA average in the corpus callosum region showed the largest increase over time in the observed age range. Most cortical regions show a modest FA increase, whereas the cerebellum's FA values remained stable. The individual methods used in this segmentation framework have been applied before, but their combination is novel, as is their application to macaque MRI data. Furthermore, this is the first study to date looking at the DTI properties of the developing macaque brain.
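The probabilistically weighted FA average described in this record reduces to a probability-weighted mean over voxels; the FA values and white-matter probabilities below are toy numbers, not study data.

```python
# Sketch of a probabilistically weighted FA average (our illustration): each
# voxel's fractional anisotropy is weighted by its white-matter probability
# within the parcellation region.
def weighted_fa(fa_values, wm_probs):
    total = sum(wm_probs)
    return sum(f * p for f, p in zip(fa_values, wm_probs)) / total

fa = [0.6, 0.4, 0.2]       # hypothetical per-voxel FA
prob = [0.9, 0.5, 0.1]     # hypothetical white-matter probabilities
avg = weighted_fa(fa, prob)
```

Weighting by probability rather than thresholding the mask keeps partial-volume voxels from either dominating or being discarded.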

  9. Automatic analysis of the Gorkha earthquake aftershock sequence: evidences of structurally-segmented seismicity

    NASA Astrophysics Data System (ADS)

    Baillard, Christian; Lyon-Caen, Hélène; Bollinger, Laurent; Rietbrock, Andreas; Letort, Jean; Adhikari, Lok Bijaya

    2017-03-01

We present the first 3 months of aftershock activity following the 25 April 2015 MW 7.8 Gorkha earthquake, recorded on the Nepalese seismic network. We deployed an automatic procedure composed of three main stages: 1) coarse determination of the P and S onsets; 2) phase association to declare events; and 3) iterative addition and refinement of onsets using the Kurtosis characteristic function. In total, 9188 events could be located in the Kathmandu region, the majority with small location errors (< 4.5, 9, and 10 km in the X, Y, and Z directions, respectively). Additionally, we propose a new attenuation law to estimate local magnitudes in the region. This new seismic catalog gives a detailed insight into the Gorkha aftershock sequence and its relation to the main shock rupture models and tectonic structures in the region. Most aftershocks fall within the Main Himalayan Thrust (MHT) shear zone or in its hanging wall. Significant temporal and lateral variations of aftershock locations are observed, among them: 1) three distinct stages, highlighting subsequent jump-offs at the easternmost termination; 2) a seismic gap north of Kathmandu that matches a low-slip zone in the rupture area of the mainshock; 3) the confinement of seismic activity along the trace of the 12 May MW 7.3 earthquake, within the MHT and its hanging wall, through a 30 by 30 km2 region; and 4) a shallow westward-dipping structure east of the Kathmandu klippe. These new observations, together with the inferred tectonic structures at depth, suggest a tectonic control of part of the aftershock activity by the lateral breaks along the MHT and by the geometry of the duplex above the thrust.

  10. A Global Approach to Accurate and Automatic Quantitative Analysis of NMR Spectra by Complex Least-Squares Curve Fitting

    NASA Astrophysics Data System (ADS)

    Martin, Y. L.

    The performance of quantitative analysis of 1D NMR spectra depends greatly on the choice of the NMR signal model. Complex least-squares analysis is well suited for optimizing the quantitative determination of spectra containing a limited number of signals (<30) obtained under satisfactory conditions of signal-to-noise ratio (>20). From a general point of view it is concluded, on the basis of mathematical considerations and numerical simulations, that, in the absence of truncation of the free-induction decay, complex least-squares curve fitting either in the time or in the frequency domain and linear-prediction methods are in fact nearly equivalent and give identical results. However, in the situation considered, complex least-squares analysis in the frequency domain is more flexible since it enables the quality of convergence to be appraised at every resonance position. An efficient data-processing strategy has been developed which makes use of an approximate conjugate-gradient algorithm. All spectral parameters (frequency, damping factors, amplitudes, phases, initial delay associated with intensity, and phase parameters of a baseline correction) are simultaneously managed in an integrated approach which is fully automatable. The behavior of the error as a function of the signal-to-noise ratio is theoretically estimated, and the influence of apodization is discussed. The least-squares curve fitting is theoretically proved to be the most accurate approach for quantitative analysis of 1D NMR data acquired with reasonable signal-to-noise ratio. The method enables complex spectral residuals to be sorted out. These residuals, which can be cumulated thanks to the possibility of correcting for frequency shifts and phase errors, extract systematic components, such as isotopic satellite lines, and characterize the shape and the intensity of the spectral distortion with respect to the Lorentzian model. 
This distortion is shown to be nearly independent of the chemical species
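The complex least-squares idea in this record can be illustrated in its simplest case: when frequency and damping are held fixed, the complex amplitude (magnitude and phase together) of a Lorentzian free-induction decay is a linear least-squares problem with a one-line closed-form solution. The model, frequency, and damping values below are our own toy assumptions, not the paper's data.

```python
import numpy as np

# Minimal sketch: complex least-squares estimate of a Lorentzian FID's
# complex amplitude, given known frequency (Hz) and damping (1/s).
def fit_complex_amplitude(fid, t, freq, damping):
    basis = np.exp((2j * np.pi * freq - damping) * t)   # unit-amplitude model
    # np.vdot conjugates its first argument, giving the normal-equation
    # solution <basis, fid> / <basis, basis> for a single complex unknown.
    return np.vdot(basis, fid) / np.vdot(basis, basis)

t = np.arange(1024) / 1000.0                 # 1 kHz sampling, hypothetical
true_amp = 2.0 * np.exp(1j * 0.5)            # amplitude 2, phase 0.5 rad
fid = true_amp * np.exp((2j * np.pi * 50 - 3.0) * t)
est = fit_complex_amplitude(fid, t, 50.0, 3.0)
```

The full method in the record additionally treats frequency, damping, delay, and baseline as nonlinear unknowns; this linear sub-step is what each iteration of such a fit solves exactly.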

  11. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subjects' image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system for clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images collected in the study so far have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced and, therefore, errors, latency, and costs are decreased. Our approach also increases data security and privacy.
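The card-based color calibration in this record can be sketched under an assumed linear gain/offset model per channel: fit the known patch values of the reference card against the camera's readings, then correct the lesion photograph. All patch values below are invented for illustration.

```python
# Sketch of per-channel linear colour calibration (assumed model, not the
# App's actual algorithm): least-squares fit of gain and offset from the
# reference card's grey patches.
def fit_gain_offset(measured, reference):
    n = len(measured)
    mx, my = sum(measured) / n, sum(reference) / n
    gain = sum((x - mx) * (y - my) for x, y in zip(measured, reference)) / \
           sum((x - mx) ** 2 for x in measured)
    return gain, my - gain * mx

measured_patches = [40, 90, 140, 190]   # hypothetical camera readings
true_patches = [50, 100, 150, 200]      # known card patch values
g, o = fit_gain_offset(measured_patches, true_patches)
corrected = [g * p + o for p in [60, 110]]   # correct two lesion pixels
```

A real pipeline would fit each RGB channel separately and also solve a geometric transform from the card's corner positions.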

  12. A normative spatiotemporal MRI atlas of the fetal brain for automatic segmentation and analysis of early brain growth.

    PubMed

    Gholipour, Ali; Rollins, Caitlin K; Velasco-Annis, Clemente; Ouaalam, Abdelhakim; Akhondi-Asl, Alireza; Afacan, Onur; Ortinau, Cynthia M; Clancy, Sean; Limperopoulos, Catherine; Yang, Edward; Estroff, Judy A; Warfield, Simon K

    2017-03-28

    Longitudinal characterization of early brain growth in-utero has been limited by a number of challenges in fetal imaging, the rapid change in size, shape and volume of the developing brain, and the consequent lack of suitable algorithms for fetal brain image analysis. There is a need for an improved digital brain atlas of the spatiotemporal maturation of the fetal brain extending over the key developmental periods. We have developed an algorithm for construction of an unbiased four-dimensional atlas of the developing fetal brain by integrating symmetric diffeomorphic deformable registration in space with kernel regression in age. We applied this new algorithm to construct a spatiotemporal atlas from MRI of 81 normal fetuses scanned between 19 and 39 weeks of gestation and labeled the structures of the developing brain. We evaluated the use of this atlas and additional individual fetal brain MRI atlases for completely automatic multi-atlas segmentation of fetal brain MRI. The atlas is available online as a reference for anatomy and for registration and segmentation, to aid in connectivity analysis, and for groupwise and longitudinal analysis of early brain growth.
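The kernel regression in age used for atlas construction in this record can be sketched with scalars standing in for images: each atlas time point is a Gaussian-weighted average of the subjects, weighted by how close their gestational age is to the target age. The bandwidth and age/volume values below are illustrative assumptions only.

```python
import math

# Toy sketch of kernel regression over gestational age (our illustration):
# Gaussian weights centred on the target age average the subject values.
def kernel_regress(ages, values, target_age, bandwidth=2.0):
    w = [math.exp(-((a - target_age) ** 2) / (2 * bandwidth ** 2))
         for a in ages]
    return sum(wi * v for wi, v in zip(w, values)) / sum(w)

ages = [20, 25, 30, 35]                  # gestational weeks, hypothetical
volumes = [50.0, 120.0, 230.0, 380.0]    # hypothetical brain volumes (cm^3)
atlas_at_28 = kernel_regress(ages, volumes, 28)
```

In the actual atlas the averaged quantities are deformation fields and intensities rather than scalars, but the age weighting works the same way.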

  13. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    PubMed

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
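The multicomponent identification task in this record can be illustrated with a deliberately simpler stand-in for the paper's PCA/ICA machinery: project the unknown mixture onto a library of reference spectra by least squares and report the relative contribution of each reference. The spectra below are synthetic toy vectors.

```python
import numpy as np

# Simplified sketch (least-squares projection stands in for the paper's
# PCA/ICA pipeline): decompose a mixed spectrum over known reference spectra.
def match_spectrum(mixture, references):
    refs = np.asarray(references, dtype=float)
    coeffs, *_ = np.linalg.lstsq(refs.T, np.asarray(mixture, float),
                                 rcond=None)
    rel = coeffs / coeffs.sum()   # crude "reliability": fraction per reference
    return coeffs, rel

r1 = [1.0, 0.2, 0.0, 0.0]         # toy reference spectrum, pigment A
r2 = [0.0, 0.0, 1.0, 0.3]         # toy reference spectrum, pigment B
mix = [0.7 * a + 0.3 * b for a, b in zip(r1, r2)]
coeffs, rel = match_spectrum(mix, [r1, r2])
```

The method in the record goes further: ICA lets it separate components without assuming the references are known in advance.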

  14. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

Precise analysis of (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame of the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that analyzes video compression statistics to detect and characterize events in large data sets. This software works by converting the data into a series of images, which it compresses into an MPEG-2 video using the open-source "avconv" utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the statistic(s) for each data type.
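One simple way to turn a per-frame compression statistic into an event detector, in the spirit of this record but not taken from the published tool, is to flag frames whose statistic deviates sharply from the sequence mean. The frame values and threshold below are invented.

```python
# Illustrative spike detector (our sketch): flag frames whose compression
# statistic lies more than k standard deviations from the mean, as a proxy
# for events such as sudden dendrite growth between frames.
def detect_events(stat_per_frame, k=3.0):
    mean = sum(stat_per_frame) / len(stat_per_frame)
    var = sum((s - mean) ** 2 for s in stat_per_frame) / len(stat_per_frame)
    std = var ** 0.5
    return [i for i, s in enumerate(stat_per_frame)
            if abs(s - mean) > k * std]

frame_bits = [100, 102, 99, 101, 100, 400, 101, 98]  # frame 5: sudden change
events = detect_events(frame_bits, k=2.0)
```

A production version would use a running baseline rather than the global mean so that slow drifts are not flagged.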

  15. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    NASA Astrophysics Data System (ADS)

    Chan, T. O.; Lichti, D. D.; Belton, D.

    2013-10-01

At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high-accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, the calibrations are complicated by the Velodyne LiDAR's narrow vertical field of view and the highly time-variant nature of its measurements. In this paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method utilizes the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point clouds based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model in such a way that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automatic, allowing end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scene. The methods were verified with two different real datasets, and the results suggest that up to 78
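The per-layer circular Hough voting described in this record can be sketched under a simplifying assumption of a known radius and an integer accumulator grid: each 2-D point votes for candidate centres one radius away, and the accumulator peak recovers the cylinder axis position in that slice. Everything below is a toy construction, not the paper's implementation.

```python
import math
from collections import Counter

# Toy circular Hough voting for one point-cloud layer (known radius assumed).
def hough_circle_center(points, radius, n_angles=72):
    votes = Counter()
    for x, y in points:
        for i in range(n_angles):
            a = 2 * math.pi * i / n_angles
            cx = round(x - radius * math.cos(a))   # candidate centre, snapped
            cy = round(y - radius * math.sin(a))   # to an integer grid cell
            votes[(cx, cy)] += 1
    return votes.most_common(1)[0][0]

# Points sampled on a circle of radius 5 centred at (10, 10):
pts = [(10 + 5 * math.cos(a), 10 + 5 * math.sin(a))
       for a in [2 * math.pi * k / 36 for k in range(36)]]
center = hough_circle_center(pts, 5)
```

The Generalized Hough Transform used in the paper additionally votes over unknown radii, which adds one dimension to the accumulator.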

  16. A Q-GERT Analysis of the Effect of Improved Automatic Testing on F-16 Aircraft Availability.

    DTIC Science & Technology

    1983-09-01

September 1932. ADA 12%000. Guerra, Joel A., Lesko, Andrew J., & Pereira, Jose. Operating and support cost model for avionics automatic test equipment...1981. Automatic test equipment. Final report No. RL-R-80-3, U.S. Army Missile Command, Redstone Arsenal, AL, February 1980.

  17. Clarifying Inconclusive Functional Analysis Results: Assessment and Treatment of Automatically Reinforced Aggression

    PubMed Central

    Saini, Valdeep; Greer, Brian D.; Fisher, Wayne W.

    2016-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy’s functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences maintaining aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation. PMID:25891269

  18. Clarifying inconclusive functional analysis results: Assessment and treatment of automatically reinforced aggression.

    PubMed

    Saini, Valdeep; Greer, Brian D; Fisher, Wayne W

    2015-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy's functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences that maintained aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation.

  19. On 3-D modeling and automatic regridding in shape design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Yao, Tse-Min

    1987-01-01

    The material derivative idea of continuum mechanics and the adjoint variable method of design sensitivity analysis are used to obtain a computable expression for the effect of shape variations on measures of structural performance of three-dimensional elastic solids.

  20. Automatic transmission

    SciTech Connect

    Ohkubo, M.

    1988-02-16

    An automatic transmission is described combining a stator reversing type torque converter and speed changer having first and second sun gears comprising: (a) a planetary gear train composed of first and second planetary gears sharing one planetary carrier in common; (b) a clutch and requisite brakes to control the planetary gear train; and (c) a speed-increasing or speed-decreasing mechanism is installed both in between a turbine shaft coupled to a turbine of the stator reversing type torque converter and the first sun gear of the speed changer, and in between a stator shaft coupled to a reversing stator and the second sun gear of the speed changer.

  1. Automatic transmission

    SciTech Connect

    Miki, N.

    1988-10-11

    This patent describes an automatic transmission including a fluid torque converter, a first gear unit having three forward-speed gears and a single reverse gear, a second gear unit having a low-speed gear and a high-speed gear, and a hydraulic control system, the hydraulic control system comprising: a source of pressurized fluid; a first shift valve for controlling the shifting between the first-speed gear and the second-speed gear of the first gear unit; a second shift valve for controlling the shifting between the second-speed gear and the third-speed gear of the first gear unit; a third shift valve equipped with a spool having two positions for controlling the shifting between the low-speed gear and the high-speed gear of the second gear unit; a manual selector valve having a plurality of shift positions for distributing the pressurized fluid supply from the source of pressurized fluid to the first, second and third shift valves respectively; first, second and third solenoid valves corresponding to the first, second and third shift valves, respectively for independently controlling the operation of the respective shift valves, thereby establishing a six forward-speed automatic transmission by combining the low-speed gear and the high-speed gear of the second gear unit with each of the first-speed gear, the second speed gear and the third-speed gear of the first gear unit; and means to fixedly position the spool of the third shift valve at one of the two positions by supplying the pressurized fluid to the third shift valve when the manual selector valve is shifted to a particular shift position, thereby locking the second gear unit in one of low-speed gear and the high-speed gear, whereby the six forward-speed automatic transmission is converted to a three forward-speed automatic transmission when the manual selector valve is shifted to the particular shift position.

  2. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    SciTech Connect

    Wei, J; Yuan, A; Li, G

    2014-06-15

Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload introduced by the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully automated 3D lung segmentation algorithm was designed, and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung, and its air component, and local volume changes at the diaphragm and chest wall, to characterize the breathing pattern. Segmented lung volumes in 12 patients were compared with those from a treatment planning system (TPS). Voxel conversion from CT number to other physical parameters, such as gravity-induced pressure, was introduced to create a secondary 4D image. A demons algorithm was applied in deformable image registration, and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to, and will be integrated with, the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this toolkit and the TPS is <±2%, and the time saving is 1-2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
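The voxel-counting step in this record amounts to multiplying the number of segmented voxels by the physical voxel volume. The mask shape and voxel spacing below are assumed toy values, not the toolkit's data.

```python
import numpy as np

# Toy voxel-counting sketch: volume of a binary segmentation mask given the
# scan's voxel spacing in millimetres (spacing values are assumptions).
def segmented_volume_ml(mask, spacing_mm=(1.0, 1.0, 2.5)):
    voxel_ml = spacing_mm[0] * spacing_mm[1] * spacing_mm[2] / 1000.0
    return mask.sum() * voxel_ml          # mm^3 converted to millilitres

mask = np.zeros((10, 10, 10), dtype=bool)
mask[2:8, 2:8, 2:8] = True                # 6*6*6 = 216 "lung" voxels
vol = segmented_volume_ml(mask)
```

Differencing this quantity across the ten 4DCT phases yields the breathing-induced volume variation curve described above.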

  3. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    PubMed Central

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60% with a range of 3.47%–40.00% in different subgroup characteristics. The CHAID decision tree model has demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified the fourth risk factor, the maternal educational level, with higher overall classification accuracy and larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of population with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
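
    The core of CHAID is choosing, at each node, the predictor whose cross-tabulation with the outcome gives the most significant chi-squared statistic. A minimal sketch of that selection step follows; the contingency counts are invented for illustration and are not the study's data.

```python
def chi2_statistic(table):
    """Pearson chi-squared statistic for a 2D contingency table (list of rows)."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total
            stat += (obs - exp) ** 2 / exp
    return stat

def best_split(tables):
    """Pick the predictor whose table maximizes the chi-squared statistic."""
    return max(tables, key=lambda name: chi2_statistic(tables[name]))

# Hypothetical anemic/non-anemic counts split by two candidate predictors.
tables = {
    "maternal_anemia":    [[40, 60], [10, 190]],   # strongly associated
    "maternal_education": [[25, 75], [25, 175]],   # weaker association
}
```

    A full CHAID implementation would also merge non-significant categories and apply Bonferroni-adjusted p-values, but the split choice reduces to this comparison.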

  4. Direct analysis of oligomeric tackifying resins in rubber compounds by automatic thermal desorption gas chromatography/mass spectrometry

    PubMed

    Kim

    1999-01-01

    Two analytical methods, automatic thermal desorption gas chromatography/mass spectrometry (ATD-GC/MS) and pyrolysis gas chromatography/mass spectrometry (Py-GC/MS), were applied as direct methods for the analysis of oligomeric tackifying resins in a vulcanized rubber. The ATD-GC/MS method, based on discontinuous volatile extraction, was found to be an effective means for direct analysis of the oligomeric tackifying resins contained in a vulcanized rubber. The oligomeric tackifying resins, such as t-octylphenolformaldehyde (TOPF) resin, rosin-modified terpene resin, and cashew resin, could be directly analyzed in vulcanized rubber by ATD-GC/MS. Much simpler total ion chromatograms were obtained by ATD-GC/MS than by flash pyrolysis with a Curie-point pyrolyzer, permitting much easier interpretation. Ions at m/z 206, 135, and 107 were fingerprints in the characteristic mass spectra obtained by ATD-GC/MS for TOPF resin in the vulcanized rubber. 1H-Indene, styrene, and isolongifolene were observed as their characteristic mass spectra in the pyrolyzate of the rosin-modified terpene resin. From the cashew resin, phenol, 3-methylphenol, and 4-(1,1,3,3-tetramethylbutyl)phenol were obtained as the characteristic pyrolyzates by discontinuous thermal extraction via ATD-GC/MS. Copyright 1999 John Wiley & Sons, Ltd.

  5. Automatic detection of large dense-core vesicles in secretory cells and statistical analysis of their intracellular distribution.

    PubMed

    Díaz, Ester; Ayala, Guillermo; Díaz, María Elena; Gong, Liang-Wei; Toomre, Derek

    2010-01-01

    Analyzing the morphological appearance and the spatial distribution of large dense-core vesicles (granules) in the cell cytoplasm is central to the understanding of regulated exocytosis. This paper is concerned with the automatic detection of granules and the statistical analysis of their spatial locations in different cell groups. We model the locations of granules of a given cell as a realization of a finite spatial point process and the point patterns associated with the cell groups as replicated point patterns of different spatial point processes. First, an algorithm to segment the granules using electron microscopy images is proposed. Second, the relative locations of the granules with respect to the plasma membrane are characterized by two functional descriptors: the empirical cumulative distribution function of the distances from the granules to the plasma membrane and the density of granules within a given distance to the plasma membrane. The descriptors of the different cells for each group are compared using bootstrap procedures. Our results show that these descriptors and the testing procedure allow discriminating between control and treated cells. The application of these novel tools to studies of secretion should help in the analysis of diseases associated with dysfunctional secretion, such as diabetes.
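
    The first functional descriptor above, the empirical cumulative distribution of granule-to-membrane distances, reduces to a short computation. The toy distances and membrane-band area below are invented for illustration.

```python
import numpy as np

def ecdf(distances, r):
    """Empirical CDF: fraction of granules within distance r of the membrane."""
    d = np.asarray(distances, dtype=float)
    return (d <= r).mean()

def density_within(distances, r, area):
    """Granules per unit area within distance r of the membrane
    (the area of the band within distance r is assumed known)."""
    d = np.asarray(distances, dtype=float)
    return (d <= r).sum() / area

# Toy granule-to-membrane distances (nm) for a "control" and a "treated" cell.
control = [20, 35, 50, 120, 300]
treated = [150, 220, 310, 400, 550]
```

    In the paper's setting, these per-cell descriptors are then compared across cell groups with bootstrap tests; here they simply show the treated cell's granules sitting farther from the membrane.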

  6. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
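
    The voxel-element discretization at the heart of the procedure can be sketched as a simple occupancy grid over the point cloud. `voxelize` and its toy cube are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def voxelize(points, voxel_size):
    """Map 3D points to the set of occupied voxel indices
    (each occupied voxel becomes one solid element of the FE mesh)."""
    pts = np.asarray(points, dtype=float)
    idx = np.floor((pts - pts.min(axis=0)) / voxel_size).astype(int)
    return {tuple(i) for i in idx}

# Eight corner points of a ~1 m cube, voxelized at 0.5 m resolution.
corners = [(x, y, z) for x in (0, 0.9) for y in (0, 0.9) for z in (0, 0.9)]
occupied = voxelize(corners, 0.5)
```

    With inner and outer survey surfaces both voxelized, the occupied cells between them form the solid model the abstract describes.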

  7. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis

    PubMed Central

    Stamile, Claudio; Kocevar, Gabriel; Cotton, François; Durand-Dubief, Françoise; Hannoun, Salem; Frindel, Carole; Guttmann, Charles R. G.; Rousseau, David; Sappey-Marinier, Dominique

    2016-01-01

    Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect, local and global longitudinal changes of diffusivity metrics, in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: 1) co-registration and diffusion metrics computation, 2) tractography, bundle extraction and processing, and 3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections, and allowing a sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber scale approach to detect small longitudinal alterations. PMID:27224308
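
    A much-simplified stand-in for the longitudinal fiber-bundle analysis (step 3): flag cross-sections whose mean diffusivity metric shifts beyond a z-score threshold between timepoints. The paper's actual method uses a Gaussian mixture model over cross-sections; this sketch and its simulated data only illustrate the section-wise comparison idea.

```python
import numpy as np

def section_changes(baseline, follow_up, z_thresh=3.0):
    """Flag cross-sections whose mean metric shifts beyond z_thresh baseline SDs.

    baseline, follow_up: arrays of shape (n_sections, n_fibers) holding a
    diffusivity metric (e.g. FA) sampled per fiber at each cross-section.
    """
    b = np.asarray(baseline, float)
    f = np.asarray(follow_up, float)
    mu, sd = b.mean(axis=1), b.std(axis=1, ddof=1)
    z = (f.mean(axis=1) - mu) / sd
    return np.flatnonzero(np.abs(z) > z_thresh)

rng = np.random.default_rng(0)
base = rng.normal(0.5, 0.01, size=(20, 50))        # 20 sections, 50 fibers
follow = base + rng.normal(0, 0.001, size=base.shape)
follow[7] += 0.05                                  # simulated lesion at section 7
flagged = section_changes(base, follow)
```

    The fiber-scale resolution is what gives the approach its sensitivity: a 0.05 shift confined to one cross-section would vanish in a whole-bundle average.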

  8. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.

  9. Automatic transmission

    SciTech Connect

    Aoki, H.

    1989-03-21

    An automatic transmission is described, comprising: a torque converter including an impeller having a connected member, a turbine having an input member and a reactor; and an automatic transmission mechanism having first to third clutches and plural gear units including a single planetary gear unit with a ring gear and a dual planetary gear unit with a ring gear. The single and dual planetary gear units have respective carriers integrally coupled with each other and respective sun gears integrally coupled with each other, the input member of the turbine being coupled with the ring gear of the single planetary gear unit through the first clutch, and being coupled with the sun gear through the second clutch. The connected member of the impeller is coupled with the ring gear of the dual planetary gear unit, the ring gear of the dual planetary gear unit is made to be restrained as required, and the carrier is coupled with an output member.

  10. Statistical Approaches to Automatic Indexing.

    ERIC Educational Resources Information Center

    Harter, Stephen P.

    1978-01-01

    Views automatic indexing as a two-tiered word frequency analysis that involves selection of a technical vocabulary and identification of document keywords. Assumptions, criteria, evaluation, and relevance are discussed. (JD)
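
    The two-tiered frequency analysis can be sketched minimally: first exclude high-frequency function words (the non-technical vocabulary), then rank the remaining terms by frequency to pick document keywords. The stopword list and example text are invented.

```python
from collections import Counter

# Tiny illustrative function-word list; real systems use much larger ones.
STOPWORDS = {"the", "a", "of", "and", "to", "in", "is", "for", "by"}

def keywords(text, k=3):
    """Two-tier frequency indexing: drop function words, then take the
    k most frequent remaining terms as document keywords."""
    words = [w.strip(".,;:").lower() for w in text.split()]
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return [w for w, _ in counts.most_common(k)]

doc = ("Automatic indexing selects index terms by frequency. "
       "Frequency analysis of terms drives automatic keyword selection.")
```
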

  11. "PolyCAFe"--Automatic Support for the Polyphonic Analysis of CSCL Chats

    ERIC Educational Resources Information Center

    Trausan-Matu, Stefan; Dascalu, Mihai; Rebedea, Traian

    2014-01-01

    Chat conversations and other types of online communication environments are widely used within CSCL educational scenarios. However, there is a lack of theoretical and methodological background for the analysis of collaboration. Manual assessing of non-moderated chat discussions is difficult and time-consuming, having as a consequence that learning…

  12. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    ERIC Educational Resources Information Center

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  13. Automatic co-registration of space-based sensors for precision change detection and analysis

    NASA Technical Reports Server (NTRS)

    Bryant, N.; Zobrist, A.; Logan, T.

    2003-01-01

    A variety of techniques were developed at JPL to assure sub-pixel co-registration of scenes and ortho-rectification of satellite imagery to other georeferenced information to permit precise change detection and analysis of low and moderate resolution space sensors.

  14. Enzymatic Microreactors for the Determination of Ethanol by an Automatic Sequential Injection Analysis System

    NASA Astrophysics Data System (ADS)

    Alhadeff, Eliana M.; Salgado, Andrea M.; Cos, Oriol; Pereira, Nei; Valdman, Belkis; Valero, Francisco

    A sequential injection analysis system with two enzymatic microreactors for the determination of ethanol has been designed. Alcohol oxidase and horseradish peroxidase were separately immobilized on glass aminopropyl beads, and packed in 0.91-mL volume microreactors, working in line with the sequential injection analysis system. A stop flow of 120 s was selected for a linear ethanol range of 0.005–0.04 g/L (±0.6% relative standard deviation) with a throughput of seven analyses per hour. The system was applied to measure ethanol concentrations in samples of distilled and nondistilled alcoholic beverages, and of alcoholic fermentation with good performance and no significant difference compared with other analytical procedures (gas chromatography and high-performance liquid chromatography).

  15. Automatic backscatter analysis of regional left ventricular systolic function using color kinesis.

    PubMed

    Schwartz, S L; Cao, Q L; Vannan, M A; Pandian, N G

    1996-06-15

    Assessment of regional wall motion by 2-dimensional echocardiography can be performed by either semiquantitative wall motion scoring or by quantitative analysis. The former is subjective and requires expertise. Quantitative methods are too time-consuming for routine use in a busy clinical laboratory. Color kinesis is a new algorithm utilizing acoustic backscatter analysis. It provides a color encoded map of endocardial motion in real time. In each frame a new color layer is added; the thickness of the color beam represents endocardial motion during that frame. The end-systolic image has multiple color layers, representing regional and temporal heterogeneity of segmental motion. The purpose of this study was to validate the use of color kinesis for semiquantitative analysis of regional left ventricular systolic function and quantitatively in measurement of endocardial excursion. Semiquantitative wall motion scoring was performed in 18 patients using both 2-dimensional echo and color kinesis. Scoring was identical in 74% of segments; there was 84% agreement in definition of normal vs. abnormal. There was less interobserver variability in wall motion scoring using color kinesis. Endocardial excursion was quantified in 21 patients. 70% of the imaged segments were suitable for analysis. Correlation between 2-dimensional echocardiographic measurements and color kinesis was excellent, r = 0.87. The mean difference in excursion as measured by the 2 methods was -0.05 +/- 2.0 mm. In conclusion, color kinesis is a useful method for assessing regional contraction by displaying a color map of systolic endocardial excursion. This algorithm may improve the confidence and accuracy of assessment of segmental ventricular function by echocardiographic methods.

  16. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    PubMed Central

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
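
    One way to realize "run deployment-environment tests only in previously-unseen states" is to hash an abstraction of the application state and keep a set of seen digests. `StateGate` below is a hypothetical sketch of that gating idea, not the paper's implementation.

```python
import hashlib

class StateGate:
    """Gate expensive in-the-field tests so each distinct state is tested once.

    The application state is abstracted here as a dict of relevant variables;
    a digest of its sorted items stands in for the paper's state model.
    """
    def __init__(self):
        self.seen = set()

    def should_test(self, state):
        digest = hashlib.sha256(repr(sorted(state.items())).encode()).hexdigest()
        if digest in self.seen:
            return False          # state already tested; skip the overhead
        self.seen.add(digest)
        return True

gate = StateGate()
first = gate.should_test({"mode": "idle", "conns": 0})
repeat = gate.should_test({"conns": 0, "mode": "idle"})   # same state, reordered
novel = gate.should_test({"mode": "busy", "conns": 3})
```

    Sorting the items before hashing makes the digest independent of key order, so semantically identical states collapse to one entry.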

  17. Automaticity of Conceptual Magnitude.

    PubMed

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-02-16

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object's conceptual magnitude processed? It was suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggesting that different types of magnitude processing and representation share the same core system.

  18. Automaticity of Conceptual Magnitude

    PubMed Central

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-01-01

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object's conceptual magnitude processed? It was suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggesting that different types of magnitude processing and representation share the same core system. PMID:26879153

  19. Analysis of cannabis in oral fluid specimens by GC-MS with automatic SPE.

    PubMed

    Choi, Hyeyoung; Baeck, Seungkyung; Kim, Eunmi; Lee, Sooyeun; Jang, Moonhee; Lee, Juseon; Choi, Hwakyung; Chung, Heesun

    2009-12-01

    Methamphetamine (MA) is the most commonly abused drug in Korea, followed by cannabis. Traditionally, MA analysis is carried out on both urine and hair samples and cannabis analysis in urine samples only. Despite the fact that oral fluid has become increasingly popular as an alternative specimen in the field of driving under the influence of drugs (DUID) and work place drug testing, its application has not been expanded to drug analysis in Korea. Oral fluid is easy to collect and handle and can provide an indication of recent drug abuse. In this study, we present an analytical method using GC-MS to determine tetrahydrocannabinol (THC) and its main metabolite 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in oral fluid. The validated method was applied to oral fluid samples collected from drug abuse suspects and the results were compared with those in urine. The stability of THC and THC-COOH in oral fluid stored in different containers was also investigated. Oral fluid specimens from 12 drug abuse suspects, submitted by the police, were collected by direct expectoration. The samples were screened with microplate ELISA. For confirmation they were extracted using automated SPE with mixed-mode cation exchange cartridge, derivatized and analyzed by GC-MS using selective ion monitoring (SIM). The concentrations of THC and THC-COOH in oral fluid showed a large variation and the results from oral fluid and urine samples from cannabis abusers did not show any correlation. Thus, detailed information about the time interval between drug use and sample collection is needed to interpret the oral fluid results properly. In addition, further investigation about the detection time window of THC and THC-COOH in oral fluid is required to substitute oral fluid for urine in drug testing.

  20. Performance portability study of an automatic target detection and classification algorithm for hyperspectral image analysis using OpenCL

    NASA Astrophysics Data System (ADS)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Garcia, Carlos; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    Recent advances in heterogeneous high performance computing (HPC) have opened new avenues for demanding remote sensing applications. Perhaps one of the most popular algorithms in target detection and identification is the automatic target detection and classification algorithm (ATDCA) widely used in the hyperspectral image analysis community. Previous research has already investigated the mapping of ATDCA on graphics processing units (GPUs) and field programmable gate arrays (FPGAs), showing impressive speedup factors that allow its exploitation in time-critical scenarios. Based on these studies, our work explores the performance portability of a tuned OpenCL implementation across a range of processing devices including multicore processors, GPUs and other accelerators. This approach differs from previous papers, which focused on achieving the optimal performance on each platform. Here, we are more interested in the following issues: (1) evaluating if a single code written in OpenCL allows us to achieve acceptable performance across all of them, and (2) assessing the gap between our portable OpenCL code and those hand-tuned versions previously investigated. Our study includes the analysis of different tuning techniques that expose data parallelism as well as enable an efficient exploitation of the complex memory hierarchies found in these new heterogeneous devices. Experiments have been conducted using hyperspectral data sets collected by NASA's Airborne Visible Infrared Imaging Spectrometer (AVIRIS) and the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensors. To the best of our knowledge, this kind of analysis has not been previously conducted in the hyperspectral imaging processing literature, and in our opinion it is very important in order to really calibrate the possibility of using heterogeneous platforms for efficient hyperspectral imaging processing in real remote sensing missions.

  1. Automatic classification of the interferential tear film lipid layer using colour texture analysis.

    PubMed

    Remeseiro, B; Penas, M; Barreira, N; Mosquera, A; Novo, J; García-Resúa, C

    2013-07-01

    The tear film lipid layer is heterogeneous among the population. Its classification depends on its thickness and can be done using the interference pattern categories proposed by Guillon. This paper presents an exhaustive study about the characterisation of the interference phenomena as a texture pattern, using different feature extraction methods in different colour spaces. These methods are first analysed individually and then combined to achieve the best results possible. The principal component analysis (PCA) technique has also been tested to reduce the dimensionality of the feature vectors. The proposed methodologies have been tested on a dataset composed of 105 images from healthy subjects, with a classification rate of over 95% in some cases.

  2. Automatic and Interactive Analysis Software for Beta-Gamma Coincidence Systems Used in CTBT Monitoring

    DTIC Science & Technology

    2000-09-01

    publication in the Journal of Radioanalytical and Nuclear Chemistry, April 2000. [2] Biegalski, K.M.F. and Biegalski, S., "Determining Minimum Detectable...", Journal of Radioanalytical and Nuclear Chemistry, April 2000. [3] Reeder, P.L., Bowyer, T.W., and Perkins, R.W., "Analysis of Beta-Gamma Spectra for the PNNL ARSA and...". Contract DTRA01-99-C-0031. ABSTRACT: A suite of software has been developed by Veridian Systems as part of the Prototype International Data Center (PIDC) to assist

  3. Automatic transmission

    SciTech Connect

    Hamane, M.; Ohri, H.

    1989-03-21

    This patent describes an automatic transmission connected between a drive shaft and a driven shaft and comprising: a planetary gear mechanism including a first gear driven by the drive shaft, a second gear operatively engaged with the first gear to transmit speed change output to the driven shaft, and a third gear operatively engaged with the second gear to control the operation thereof; centrifugally operated clutch means for driving the first gear and the second gear. It also includes a ratchet type one-way clutch for permitting rotation of the third gear in the same direction as that of the drive shaft but preventing rotation in the reverse direction; the clutch means comprising a ratchet pawl supporting plate coaxially disposed relative to the drive shaft and integrally connected to the third gear, the ratchet pawl supporting plate including outwardly projecting radial projections united with one another at base portions thereof.

  4. Automatic transmission

    SciTech Connect

    Meyman, U.

    1987-03-10

    An automatic transmission is described comprising wheel members each having discs defining an inner space therebetween; turnable blades and vane members located in the inner space between the discs of at least one of the wheel members, the turnable blades being mechanically connected with the vane members. Each of the turnable blades has an inner surface and an outer surface formed by circular cylindrical surfaces having a common axis, each of the turnable blades being turnable about the common axis of the circular cylindrical surfaces forming the inner and outer surfaces of the respective blade; levers turnable about the axes and supporting the blades; the discs having openings extending coaxially with the surfaces which describe the blades. The blades are partially received in the openings of the discs; and a housing accommodating the wheel members and the turnable blades and the vane members.

  5. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    NASA Astrophysics Data System (ADS)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score and the mode of their evaluations was considered as ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44) which suggests the method may be used for BI-RADS classification but better training is required.
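
    The classification rule, projecting a mammogram's histogram onto per-class PCA spaces and assigning the closest class, can be sketched with reconstruction error as the distance. Two invented synthetic classes stand in for the four BI-RADS categories; none of this is the authors' code.

```python
import numpy as np

def pca_basis(X, n_components=2):
    """Mean and top principal axes of the rows of X (samples x histogram bins)."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:n_components]

def classify(hist, spaces):
    """Assign hist to the class whose PCA space reconstructs it best."""
    errs = {}
    for label, (mu, basis) in spaces.items():
        coeffs = basis @ (hist - mu)          # project onto the class space
        recon = mu + basis.T @ coeffs         # reconstruct from the projection
        errs[label] = np.linalg.norm(hist - recon)
    return min(errs, key=errs.get)

rng = np.random.default_rng(1)
# Hypothetical "dense" vs "fatty" histogram prototypes plus noise.
dense_proto = np.exp(-np.linspace(-1, 3, 32) ** 2)
fatty_proto = np.exp(-np.linspace(-3, 1, 32) ** 2)
dense = dense_proto + rng.normal(0, 0.02, (30, 32))
fatty = fatty_proto + rng.normal(0, 0.02, (30, 32))
spaces = {"dense": pca_basis(dense), "fatty": pca_basis(fatty)}
pred = classify(fatty_proto + rng.normal(0, 0.02, 32), spaces)
```

    With four trained spaces instead of two, the same nearest-space rule yields the BI-RADS score described in the abstract.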

  6. Online Determination of Trace Amounts of Tannic Acid in Colored Tannery Wastewaters by Automatic Reference Flow Injection Analysis

    PubMed Central

    Wei, Liang

    2010-01-01

    A simple, rapid and sensitive method was proposed for online determination of tannic acid in colored tannery wastewater by automatic reference flow injection analysis. The method is based on the reduction of phosphotungstic acid by tannic acid to form a blue compound in alkaline solution at pH 12.38; the intensity of the blue compound is linearly related to the tannic acid content at the maximum absorption peak of 760 nm. The optimal experimental conditions had been obtained. The linear range of the proposed method was from 200 μg L−1 to 80 mg L−1 and the detection limit was 0.58 μg L−1. The relative standard deviation was 3.08% and 2.43% for 500 μg L−1 and 40 mg L−1 of tannic acid standard solution, respectively (n = 10). The method had been successfully applied to the determination of tannic acid in colored tannery wastewaters and the analytical results were satisfactory. PMID:20508812
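
    The linear relation underlying the method is an ordinary calibration curve: fit absorbance against concentration for standards, then invert the fit for unknowns. The absorbance readings below are invented, perfectly linear toy data, not measurements from the paper.

```python
def fit_line(x, y):
    """Least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical absorbance readings at 760 nm for tannic acid standards (mg/L).
conc = [0.2, 10, 20, 40, 80]
absorb = [0.004, 0.200, 0.400, 0.800, 1.600]
slope, intercept = fit_line(conc, absorb)

def unknown_conc(a):
    """Invert the calibration to report the concentration of an unknown sample."""
    return (a - intercept) / slope
```
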

  7. The metagenomics RAST server - a public resource for the automatic phylogenetic and functional analysis of metagenomes.

    SciTech Connect

    Meyer, F.; Paarmann, D.; D'Souza, M.; Olson, R.; Glass, E. M.; Kubal, M.; Paczian, T.; Stevens, R.; Wilke, A.; Wilkening, J.; Edwards, R. A.; Rodriguez, A.; Mathematics and Computer Science; Univ. of Chicago; San Diego State Univ.

    2008-09-19

    Random community genomes (metagenomes) are now commonly used to study microbes in different environments. Over the past few years, the major challenge associated with metagenomics shifted from generating to analyzing sequences. High-throughput, low-cost next-generation sequencing has provided access to metagenomics to a wide range of researchers. A high-throughput pipeline has been constructed to provide high-performance computing to all researchers interested in using metagenomics. The pipeline produces automated functional assignments of sequences in the metagenome by comparison against both protein and nucleotide databases. Phylogenetic and functional summaries of the metagenomes are generated, and tools for comparative metagenomics are incorporated into the standard views. User access is controlled to ensure data privacy, but the collaborative environment underpinning the service provides a framework for sharing datasets between multiple users. In the metagenomics RAST, all users retain full control of their data, and everything is available for download in a variety of formats. The open-source metagenomics RAST service provides a new paradigm for the annotation and analysis of metagenomes. With built-in support for multiple data sources and a back end that houses abstract data types, the metagenomics RAST is stable, extensible, and freely available to all researchers. This service has removed one of the primary bottlenecks in metagenome sequence analysis--the availability of high-performance computing for annotating the data.

  8. International seismological datacenter. Database structure, computer facilities, automatic and interactive analysis

    NASA Astrophysics Data System (ADS)

    Barkeby, G.

    1980-11-01

    A data base and data analysis system were designed for receiving and processing seismological data from the WMO Global Telecommunications System. The interface with the system uses a PDP 11/34 computer. Data from the seismic bulletins can be analyzed online or stored for later use. The data base is constructed on a list basis and is a set of seismological data consisting of either reported or calculated parameters, each of which is accessible to the user. Programming for the system is in standard FORTRAN except for the communications interface. Record structure is shown with examples of the command language. Computer equipment includes an Amdahl 470/V7, a DEC 10, and a CDC 170-720.

  9. Automatic Contrast Enhancement of Brain MR Images Using Hierarchical Correlation Histogram Analysis.

    PubMed

    Chen, Chiao-Min; Chen, Chih-Cheng; Wu, Ming-Chi; Horng, Gwoboa; Wu, Hsien-Chu; Hsueh, Shih-Hua; Ho, His-Yun

    Parkinson's disease is a progressive neurodegenerative disorder that occurs more often in middle-aged and older adults than in the young. With the use of a computer-aided diagnosis (CAD) system, abnormal cell regions can be identified, and this identification can help medical personnel to evaluate the likelihood of disease. This study proposes a hierarchical correlation histogram analysis based on the grayscale distribution of pixel intensity; by constructing a correlation histogram, it improves adaptive contrast enhancement for specific objects. The proposed method produces significant results during contrast-enhancement preprocessing and facilitates subsequent CAD processes, thereby reducing recognition time and improving accuracy. The experimental results show that the proposed method is superior to existing methods according to two quantitative image-quality measures, PSNR and average gradient. Furthermore, the edge information pertaining to specific cells can effectively increase the accuracy of the results.
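
    PSNR, one of the two quality measures cited above, compares a processed image against a reference via the mean squared error. A minimal sketch under the standard definition (the function name and flat-list image representation are illustrative, not from the paper):

```python
import math

def psnr(original, enhanced, max_value=255.0):
    """Peak signal-to-noise ratio between two equally sized images,
    given here as flat lists of pixel intensities."""
    mse = sum((a - b) ** 2 for a, b in zip(original, enhanced)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_value ** 2 / mse)
```

    Higher PSNR means the enhanced image deviates less from the reference; identical images give an infinite PSNR by convention.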

  10. Automatic Segmentation of Invasive Breast Carcinomas from DCE-MRI using Time Series Analysis

    PubMed Central

    Jayender, Jagadaeesan; Chikarmane, Sona; Jolesz, Ferenc A.; Gombos, Eva

    2013-01-01

    Purpose Quantitative segmentation methods based on black-box modeling and pharmacokinetic modeling are highly dependent on the imaging pulse sequence, timing of the bolus injection, arterial input function, imaging noise, and fitting algorithms. The aim of this work was to accurately segment invasive ductal carcinomas (IDCs) from dynamic contrast-enhanced MRI (DCE-MRI) using time series analysis based on linear dynamic system (LDS) modeling. Methods We modeled the underlying dynamics of the tumor by an LDS and used the system parameters to segment the carcinoma on the DCE-MRI. Twenty-four patients with biopsy-proven IDCs were analyzed. The lesions segmented by the algorithm were compared with an expert radiologist’s segmentation and the output of a commercial software package, CADstream. The results were quantified in terms of the accuracy and sensitivity of detecting the lesion and the amount of overlap, measured by the Dice similarity coefficient (DSC). Results The segmentation algorithm detected the tumor with 90% accuracy and 100% sensitivity when compared to the radiologist’s segmentation, and 82.1% accuracy and 100% sensitivity when compared to the CADstream output. The overlap of the algorithm output with the radiologist’s segmentation and the CADstream output, computed as the DSC, was 0.77 and 0.72, respectively. The algorithm also shows robust stability to imaging noise: simulated imaging noise with zero mean and standard deviation equal to 25% of the base signal intensity was added to the DCE-MRI series, and the overlap between the tumor maps generated by the LDS-based algorithm from the noisy and original DCE-MRI was DSC = 0.95. Conclusion The time-series-analysis-based segmentation algorithm provides high accuracy and sensitivity in delineating the regions of enhanced perfusion corresponding to tumor in DCE-MRI. PMID:24115175
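
    The Dice similarity coefficient used above is defined as 2|A∩B|/(|A|+|B|) for two segmentations A and B. A minimal sketch on flat binary masks (the representation is illustrative, not the paper's implementation):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks,
    given as same-length sequences of 0/1 voxel labels."""
    intersection = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    if total == 0:
        return 1.0  # both masks empty: treated as perfect overlap
    return 2.0 * intersection / total
```

    A DSC of 1.0 means identical masks; 0.0 means no overlap at all.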

  11. Automatic pole-like object modeling via 3D part-based analysis of point cloud

    NASA Astrophysics Data System (ADS)

    He, Liu; Yang, Haoxiang; Huang, Yuchun

    2016-10-01

    Pole-like objects, including trees, lampposts and traffic signs, are an indispensable part of urban infrastructure. With the advance of vehicle-based laser scanning (VLS), massive point clouds of roadside urban areas are increasingly used in 3D digital city modeling. Based on the observation that different pole-like objects have varied canopy parts but similar trunk parts, this paper proposes a 3D part-based shape analysis to robustly extract, identify and model pole-like objects. The proposed method includes 3D clustering and recognition of trunks, voxel growing, and part-based 3D modeling. After preprocessing, each trunk center is identified as a point that has a local density peak and the largest minimum distance to any point of higher density. Starting from the trunk centers, the remaining points are iteratively clustered to the same center as their nearest higher-density point. To eliminate noisy points, cluster borders are refined by trimming boundary outliers. Candidate trunks are then extracted from the clustering results by shape analysis in three orthogonal planes. Voxel growing recovers the complete pole-like objects regardless of overlap. Finally, the trunk, branch and crown parts are analyzed to obtain seven feature parameters, which are used to model the three parts separately and assemble them into a single 3D model. The proposed method is tested on a VLS point cloud of Wuhan University, China, which includes many kinds of trees, lampposts and other pole-like objects under different occlusions and overlaps. Experimental results show that the proposed method can extract accurate attributes and model the roadside pole-like objects efficiently.
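
    The trunk-center rule above (local density peak plus largest minimum distance to any denser point) mirrors the density-peaks clustering heuristic. A 2D pure-Python sketch under that reading (the function names, cutoff-radius density, and density-times-distance ranking are illustrative assumptions, not the authors' exact algorithm):

```python
import math

def density_peaks_centers(points, cutoff, n_centers):
    """Pick cluster centers as points combining high local density with a
    large minimum distance to any denser point (density-peaks heuristic)."""
    def dist(a, b):
        return math.hypot(points[a][0] - points[b][0],
                          points[a][1] - points[b][1])
    n = len(points)
    # Local density: number of neighbors within the cutoff radius.
    density = [sum(1 for j in range(n) if j != i and dist(i, j) < cutoff)
               for i in range(n)]
    # Delta: distance to the nearest strictly denser point; the globally
    # densest point gets the maximum pairwise distance instead.
    delta = []
    for i in range(n):
        higher = [dist(i, j) for j in range(n) if density[j] > density[i]]
        delta.append(min(higher) if higher
                     else max(dist(i, j) for j in range(n) if j != i))
    # Rank by the product of density and delta; keep the top candidates.
    ranked = sorted(range(n), key=lambda i: density[i] * delta[i],
                    reverse=True)
    return ranked[:n_centers]
```

    Points that are both dense and far from any denser point score highest, so one center per well-separated cluster is selected.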

  12. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for the systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created using dedicated structures, such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was used as an input for choosing locations for critical-dimension scanning electron microscope images and for extracting specific layout parameters, which were then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Building on this example, we used the method extensively for process monitoring and variability analyses of transistor gates having different shapes, as well as of a large area of a high-density standard cell library. Another set of monitoring runs focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers through transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less robust" transistor configurations in the library and identified asymmetric transistors in the SRAM bit cells. These data were compared to data extracted from the same devices at the end of the line. A further set of analyses was performed on samples after Cu M1 etch: process monitoring information on M1-enclosed contacts was extracted using contact resistance as feedback, and guidelines for the optimal M1 space for different layout configurations were also derived. Together, these results demonstrate the successful in-field implementation of our methodology as a process monitoring method.

  13. Upgrade of a semi-automatic flow injection analysis system to a fully automatic one by means of a resident program

    PubMed Central

    Prodromidis, M. I.; Tsibiris, A. B.

    1995-01-01

    The program and arrangement for a versatile, computer-controlled flow injection analysis system are described. A resident program (which can run simultaneously with, and complementary to, any other program) controls a pump (on/off, speed, direction) and a pneumatic valve (emptying and filling positions). The system was designed to be simple and flexible for both research and routine work. PMID:18925039

  14. The spectrum of K2 Vol

    NASA Astrophysics Data System (ADS)

    Pintado, O. I.; Adelman, S. J.

    High-dispersion spectra of the star K2 Vol were obtained with the REOSC spectrograph at CASLEO, covering the wavelength range 3500 to 5050 Å. The spectrum shows evidence that the star is a binary. We determine the abundances of the chemical species present in its atmosphere, as well as some characteristics of its companion.

  15. Automatic segmentation and analysis of fibrin networks in 3D confocal microscopy images

    NASA Astrophysics Data System (ADS)

    Liu, Xiaomin; Mu, Jian; Machlus, Kellie R.; Wolberg, Alisa S.; Rosen, Elliot D.; Xu, Zhiliang; Alber, Mark S.; Chen, Danny Z.

    2012-02-01

    Fibrin networks are a major component of blood clots, providing structural support to growing clots. Abnormal fibrin networks that are too rigid or too unstable can promote cardiovascular problems and/or bleeding. However, current biological studies of fibrin networks rarely perform quantitative analysis of their structural properties (e.g., the density of branch points) because of the networks' massive branching structures. In this paper, we present a new approach for segmenting and analyzing fibrin networks in 3D confocal microscopy images. We first identify the target fibrin network by applying 3D region growing with global thresholding. We then produce a one-voxel-wide centerline for each fiber segment, along which the branch points and other structural information of the network can be obtained. Branch points are identified by a novel approach based on the outer medial axis. Cells within the fibrin network are segmented by a new algorithm that combines cluster detection and surface reconstruction based on the α-shape approach. Our algorithm has been evaluated on computer phantom images of fibrin networks for identifying branch points. Experiments on z-stack images of different types of fibrin networks yielded results that are consistent with biological observations.

  16. Automatic detection of sleep apnea based on EEG detrended fluctuation analysis and support vector machine.

    PubMed

    Zhou, Jing; Wu, Xiao-ming; Zeng, Wei-jie

    2015-12-01

    Sleep apnea syndrome (SAS) is a prevalent disorder, and many recent studies have focused on simple and efficient methods for SAS detection as alternatives to polysomnography. However, little work has exploited the nonlinear behavior of electroencephalogram (EEG) signals. The purpose of this study was to find a novel and simpler method for detecting apnea patients and to quantify the nonlinear characteristics of sleep apnea. Scaling exponents quantifying power-law correlations in 30-min EEG recordings during sleep were computed using detrended fluctuation analysis (DFA) and compared between six SAS patients and six healthy subjects. The mean scaling exponents were calculated every 30 s, yielding 360 control values and 360 apnea values. These values were compared between the two groups, and a support vector machine (SVM) was used to classify apnea patients. A significant difference was found between the EEG scaling exponents of the two groups (p < 0.001). The SVM achieved a high and consistent recognition rate: average classification accuracy reached 95.1%, with sensitivity of 93.2% and specificity of 98.6%. DFA of EEG is an efficient and practicable method and is clinically helpful in the diagnosis of sleep apnea.
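
    DFA as summarized above integrates the mean-subtracted signal, removes a linear trend in non-overlapping windows of length n, and reads the scaling exponent off the slope of log F(n) versus log n. A pure-Python sketch under those standard definitions (not the authors' code; window sizes are a caller choice):

```python
import math

def dfa_alpha(signal, window_sizes):
    """Detrended fluctuation analysis: return the scaling exponent alpha,
    the slope of log F(n) versus log n over the given window sizes."""
    mean = sum(signal) / len(signal)
    # Integrated profile of the mean-subtracted signal.
    profile, total = [], 0.0
    for x in signal:
        total += x - mean
        profile.append(total)

    log_n, log_f = [], []
    for n in window_sizes:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # Least-squares linear detrend within the window.
            t_mean, s_mean = (n - 1) / 2.0, sum(seg) / n
            denom = sum((t - t_mean) ** 2 for t in range(n))
            slope = sum((t - t_mean) * (s - s_mean)
                        for t, s in enumerate(seg)) / denom
            for t, s in enumerate(seg):
                resid = s - (s_mean + slope * (t - t_mean))
                sq += resid * resid
            count += n
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sq / count)))

    # Slope of the log-log fluctuation plot is the DFA exponent.
    ln_m = sum(log_n) / len(log_n)
    lf_m = sum(log_f) / len(log_f)
    return (sum((a - ln_m) * (b - lf_m) for a, b in zip(log_n, log_f))
            / sum((a - ln_m) ** 2 for a in log_n))
```

    As a sanity check, white noise yields alpha near 0.5 and a random walk near 1.5, which is how uncorrelated and strongly correlated signals separate.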

  17. Bayesian analysis of fingerprint, face and signature evidences with automatic biometric systems.

    PubMed

    Gonzalez-Rodriguez, Joaquin; Fierrez-Aguilar, Julian; Ramos-Castro, Daniel; Ortega-Garcia, Javier

    2005-12-20

    The Bayesian approach provides a unified and logical framework for the analysis of evidence and for reporting results from the forensic laboratory to the court in the form of likelihood ratios (LR). In this contribution we clarify how a biometric scientist or laboratory can adapt conventional biometric systems or technologies to work according to this Bayesian approach. Forensic systems providing their results in the form of LRs can be assessed through Tippett plots, which give a clear representation of the LR-based performance both for targets (the suspect is the author/source of the test pattern) and for non-targets. However, the procedures for computing LR values, especially from biometric evidence, are still an open issue. Reliable estimation techniques with good generalization properties are required for estimating the between-source and within-source variabilities of the test pattern, as are variance-restriction techniques in the within-source density estimation to account for the variability of the source over time. Fingerprint, face and on-line signature recognition systems are adapted to work according to this Bayesian approach, showing both the range of likelihood ratios in each application and the adequacy of these biometric techniques for daily forensic work.
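
    In a score-based system of this kind, one simple way to produce an LR is to model the same-source and different-source score distributions and take the ratio of their densities at the observed score. A Gaussian sketch (the distributional choice and parameter names are assumptions for illustration, not the paper's estimator):

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of a normal distribution at x."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def likelihood_ratio(score, same_mean, same_std, diff_mean, diff_std):
    """LR = p(score | same source) / p(score | different sources)."""
    return (gaussian_pdf(score, same_mean, same_std)
            / gaussian_pdf(score, diff_mean, diff_std))
```

    An LR above 1 supports the same-source hypothesis, below 1 the different-source hypothesis; Tippett plots summarize these LRs over many target and non-target trials.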

  18. Automatic Quantitative MRI Texture Analysis in Small-for-Gestational-Age Fetuses Discriminates Abnormal Neonatal Neurobehavior

    PubMed Central

    Sanz-Cortes, Magdalena; Ratta, Giuseppe A.; Figueras, Francesc; Bonet-Carne, Elisenda; Padilla, Nelly; Arranz, Angela; Bargallo, Nuria; Gratacos, Eduard

    2013-01-01

    Background We tested the hypothesis that texture analysis (TA) of MR images can identify patterns associated with abnormal neurobehavior in small-for-gestational-age (SGA) neonates. Methods Ultrasound and MRI were performed on 91 SGA fetuses at 37 weeks of gestational age. The frontal lobe, basal ganglia, mesencephalon and cerebellum were delineated from fetal MRIs. SGA neonates underwent the NBAS test and were classified as abnormal if ≥1 area was <5th centile and as normal if all areas were >5th centile. Textural features associated with neurodevelopment were selected, and machine learning was used to build a predictive algorithm. Results Of the 91 SGA neonates, 49 were classified as normal and 42 as abnormal. The accuracies for predicting abnormal neurobehavior based on TA were 95.12% for the frontal lobe, 95.56% for the basal ganglia, 93.18% for the mesencephalon and 83.33% for the cerebellum. Conclusions Fetal brain MRI textural patterns were associated with neonatal neurodevelopment. Brain MRI TA could be a useful tool to predict abnormal neurodevelopment in SGA. PMID:23922750

  19. SpotMetrics: An Open-Source Image-Analysis Software Plugin for Automatic Chromatophore Detection and Measurement

    PubMed Central

    Hadjisolomou, Stavros P.; El-Haddad, George

    2017-01-01

    Coleoid cephalopods (squid, octopus, and sepia) are renowned for their elaborate body patterning capabilities, which are employed for camouflage or communication. The specific chromatic appearance of a cephalopod, at any given moment, is a direct result of the combined action of their intradermal pigmented chromatophore organs and reflecting cells. Therefore, a lot can be learned about the cephalopod coloration system by video recording and analyzing the activation of individual chromatophores over time. The fact that adult cephalopods have small chromatophores, up to several hundred thousand in number, makes measurement and analysis over several seconds a difficult task. However, current advancements in videography enable high-resolution, high-framerate recording, which can be used to record chromatophore activity with greater detail and accuracy in both the space and time domains. In turn, the additional pixel information and extra frames per video from such recordings result in large video files of several gigabytes, even when the recording spans only a few minutes. We created a software plugin, “SpotMetrics,” that can automatically analyze high-resolution, high-framerate video of chromatophore organ activation in time. This image analysis software can track hundreds of individual chromatophores over several hundred frames to provide measurements of size and color. The software may also be used to measure differences in chromatophore activation during different behaviors, which will contribute to our understanding of the cephalopod sensorimotor integration system. In addition, it can potentially be utilized to count round objects and measure size changes over time, such as eye pupil size or the number of bacteria in a sample. Thus, we are making this software plugin freely available as open source because we believe it will be of benefit to other colleagues, both in the cephalopod biology field and within other disciplines. PMID:28298896

  1. Regional analysis of volumes and reproducibilities of automatic and manual hippocampal segmentations

    PubMed Central

    Vrenken, Hugo; Bijma, Fetsje; Barkhof, Frederik; van Herk, Marcel; de Munck, Jan C.

    2017-01-01

    Purpose Precise and reproducible hippocampus outlining is important to quantify hippocampal atrophy caused by neurodegenerative diseases and to spare the hippocampus in whole-brain radiation therapy when performing prophylactic cranial irradiation or treating brain metastases. This study aimed to quantify systematic differences between methods by comparing regional volume and outline reproducibility of manual, FSL-FIRST and FreeSurfer hippocampus segmentations. Materials and methods This study used a dataset from ADNI (Alzheimer’s Disease Neuroimaging Initiative), including 20 healthy controls, 40 patients with mild cognitive impairment (MCI), and 20 patients with Alzheimer’s disease (AD). For each subject, back-to-back (BTB) T1-weighted 3D MPRAGE images were acquired at baseline (BL) and 12 months later (M12). Hippocampus segmentations from all methods were converted into triangulated meshes, regional volumes were extracted, and regional Jaccard indices were computed between the hippocampus meshes of paired BTB scans to evaluate reproducibility. Regional volumes and Jaccard indices were modelled as a function of group (G), method (M), hemisphere (H), time-point (T), region (R) and interactions. Results For the volume data, the model selection procedure yielded significant main effects of G, M, H, T and R and interaction effects G-R and M-R. The same model was found for the BTB scans. For all methods, volumes decreased with the severity of disease. Significant fixed effects for the regional Jaccard index data were M, R and the interaction M-R. For all methods the middle region was most reproducible, independent of diagnostic group. FSL-FIRST was the most and FreeSurfer the least reproducible. Discussion/Conclusion A novel method to perform detailed analysis of subtle differences in hippocampus segmentation is proposed. The method showed that hippocampal segmentation reproducibility was best for FSL-FIRST and worst for FreeSurfer. We also found systematic
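
    The Jaccard index used above to score BTB reproducibility is |A∩B|/|A∪B| for two segmentations A and B. A minimal sketch on flat binary label masks (the representation is illustrative, not the study's mesh-based computation):

```python
def jaccard_index(mask_a, mask_b):
    """Jaccard index between two binary masks, given as
    same-length sequences of 0/1 voxel labels."""
    intersection = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    if union == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return intersection / union
```

    A value of 1.0 means the paired scans produced identical outlines; lower values indicate poorer reproducibility.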

  2. Automatic Digital Analysis of Chromogenic Media for Vancomycin-Resistant-Enterococcus Screens Using Copan WASPLab.

    PubMed

    Faron, Matthew L; Buchan, Blake W; Coon, Christopher; Liebregts, Theo; van Bree, Anita; Jansz, Arjan R; Soucy, Genevieve; Korver, John; Ledeboer, Nathan A

    2016-10-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-acquired infections (HAIs). Studies have shown that active surveillance of high-risk patients for VRE colonization can aid in reducing HAIs; however, these screens generate a significant cost to the laboratory and health care system. Digital imaging capable of differentiating negative and "nonnegative" chromogenic agar can reduce the labor cost of these screens and potentially improve patient care. In this study, we evaluated the performance of the WASPLab Chromogenic Detection Module (CDM) (Copan, Brescia, Italy) software to analyze VRE chromogenic agar and compared the results to technologist plate reading. Specimens collected at 3 laboratories were cultured using the WASPLab CDM and plated to each site's standard-of-care chromogenic media, which included Colorex VRE (BioMed Diagnostics, White City, OR) or Oxoid VRE (Oxoid, Basingstoke, United Kingdom). Digital images were scored using the CDM software after 24 or 40 h of growth, and all manual reading was performed using digital images on a high-definition (HD) monitor. In total, 104,730 specimens were enrolled and automation agreed with manual analysis for 90.1% of all specimens tested, with sensitivity and specificity of 100% and 89.5%, respectively. Automation results were discordant for 10,348 specimens, and all discordant images were reviewed by a laboratory supervisor or director. After a second review, 499 specimens were identified as representing missed positive cultures falsely called negative by the technologist, 1,616 were identified as containing borderline color results (negative result but with no package insert color visible), and 8,234 specimens were identified as containing colorimetric pigmentation due to residual matrix from the specimen or yeast (Candida). Overall, the CDM was accurate at identifying negative VRE plates, which comprised 84% (87,973) of the specimens in this study.
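
    The reported sensitivity (100%) and specificity (89.5%) follow from the usual confusion-matrix definitions applied to automated versus manual plate reads. A small sketch with hypothetical counts (the function name and numbers are illustrative, not the study's data):

```python
def screening_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and overall agreement
    from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)          # positives correctly flagged
    specificity = tn / (tn + fp)          # negatives correctly cleared
    agreement = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, agreement
```

    For a screening workflow like this one, sensitivity matters most: a missed positive plate is a missed VRE carrier, whereas a false "nonnegative" only triggers a manual review.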

  4. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI for the automated creation of Matlab scripts suitable for analyzing such data with Fourier and wavelet transforms as well as user-defined operations.
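
    Programmatic Mif generation of the kind MAGE performs can be sketched as string templating over a parameter set. The block names below follow common OOMMF Mif 2.1 conventions but are simplified and illustrative; this is not MAGE's actual output, and a real generator must match the installed OOMMF extensions:

```python
def make_mif(cell_size_nm, ms, exchange_a, stop_mxhxm=0.01):
    """Illustrative Mif-style text generation by string templating.
    Parameters: cell size in nm, saturation magnetization Ms in A/m,
    exchange constant A in J/m, and a stopping criterion."""
    return f"""# MIF 2.1
Specify Oxs_BoxAtlas:atlas {{
  xrange {{0 200e-9}} yrange {{0 100e-9}} zrange {{0 5e-9}}
}}
Specify Oxs_RectangularMesh:mesh {{
  cellsize {{{cell_size_nm}e-9 {cell_size_nm}e-9 {cell_size_nm}e-9}}
  atlas :atlas
}}
Specify Oxs_UniformExchange {{ A {exchange_a} }}
Specify Oxs_MinDriver {{
  evolver Oxs_CGEvolve
  stopping_mxHxm {stop_mxhxm}
  mesh :mesh
  Ms {ms}
  m0 {{ Oxs_UniformVectorField {{ vector {{1 0 0}} }} }}
}}
"""
```

    Sweeping a parameter then reduces to calling the template in a loop and writing one Mif per value, which is the kind of tedium a GUI front end like MAGE removes.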

  5. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    PubMed Central

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms. PMID:26393595

  6. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    PubMed

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  7. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    NASA Astrophysics Data System (ADS)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry's emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty)-relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions contributing to the global background is radiopharmaceutical production. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks, as well as at the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high-resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity, together with other spectrum parameters, were saved into the Linssi database. This database contains a large amount of radionuclide information, which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful to identify the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  8. Automatic detection of a hand-held needle in ultrasound via phase-based analysis of the tremor motion

    NASA Astrophysics Data System (ADS)

    Beigi, Parmida; Salcudean, Septimiu E.; Rohling, Robert; Ng, Gary C.

    2016-03-01

    This paper presents an automatic localization method for a standard hand-held needle in ultrasound based on temporal motion analysis of spatially decomposed data. The subtle displacement arising from tremor motion has a periodic pattern that is usually imperceptible in the intensity image but may convey information in the phase image. Our method aims to detect such periodic motion of a hand-held needle and distinguish it from intrinsic tissue motion, using a technique inspired by video magnification. Complex steerable pyramids allow specific design of the wavelets' orientations according to the insertion angle, as well as measurement of the local phase. We therefore use steerable pairs of even and odd Gabor wavelets to decompose the ultrasound B-mode sequence into various spatial frequency bands. Variations of the local phase measurements in the spatially decomposed input data are then temporally analyzed using a finite impulse response bandpass filter to detect regions with a tremor motion pattern. Results obtained from different pyramid levels are then combined and thresholded to generate the binary mask input for the Hough transform, which estimates the direction angle and discards some of the outliers. Polynomial fitting is used at the final stage to remove any remaining outliers and improve the trajectory detection. The detected needle is finally added back to the input sequence as an overlay of a cloud of points. We demonstrate the efficiency of our approach in detecting the needle from subtle tremor motion in an agar phantom and in in-vivo porcine cases where intrinsic motion is also present. The localization accuracy, calculated by comparison to expert manual segmentation, was (mean, standard deviation, root-mean-square error) of (0.93°, 1.26°, 0.87°) for the trajectory and (1.53 mm, 1.02 mm, 1.82 mm) for the tip.
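
    The temporal analysis step described above bandpasses each phase signal so that only tremor-band oscillation survives while slow intrinsic tissue motion is suppressed. A windowed-sinc FIR sketch (the band edges, frame rate and tap count below are illustrative assumptions, not the paper's parameters):

```python
import math

def bandpass_fir(low_hz, high_hz, fs, num_taps=101):
    """Windowed-sinc band-pass FIR taps, built as the difference of two
    Hamming-windowed low-pass kernels normalized to unity DC gain."""
    def lowpass(cut):
        fc = cut / fs  # cutoff in cycles per sample
        m = num_taps - 1
        taps = []
        for n in range(num_taps):
            k = n - m / 2.0
            h = 2 * fc if k == 0 else math.sin(2 * math.pi * fc * k) / (math.pi * k)
            h *= 0.54 - 0.46 * math.cos(2 * math.pi * n / m)  # Hamming window
            taps.append(h)
        s = sum(taps)
        return [t / s for t in taps]
    return [a - b for a, b in zip(lowpass(high_hz), lowpass(low_hz))]

def filter_signal(signal, taps):
    """Direct-form causal FIR convolution, zero-padded at the edges."""
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, t in enumerate(taps):
            if 0 <= i - j < len(signal):
                acc += t * signal[i - j]
        out.append(acc)
    return out
```

    With a physiological tremor band of roughly 4 to 12 Hz, a slow drift component is attenuated strongly while a tremor-frequency component passes nearly unchanged, which is the separation the detection stage relies on.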

  9. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    PubMed

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.

  10. Automatic transmission

    SciTech Connect

    Miura, M.; Inuzuka, T.

    1986-08-26

    1. An automatic transmission with four forward speeds and one reverse position is described which consists of: an input shaft; an output member; first and second planetary gear sets each having a sun gear, a ring gear and a carrier supporting a pinion in mesh with the sun gear and ring gear; the carrier of the first gear set, the ring gear of the second gear set and the output member all being connected; the ring gear of the first gear set connected to the carrier of the second gear set; a first clutch means for selectively connecting the input shaft to the sun gear of the first gear set, including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a second clutch means for selectively connecting the input shaft to the sun gear of the second gear set; a third clutch means for selectively connecting the input shaft to the carrier of the second gear set including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a first drive-establishing means for selectively preventing rotation of the ring gear of the first gear set and the carrier of the second gear set in only one direction and, alternatively, in any direction; a second drive-establishing means for selectively preventing rotation of the sun gear of the second gear set; and a drum being open to the first planetary gear set, with a cylindrical intermediate wall, an inner peripheral wall and outer peripheral wall and forming the hydraulic servos of the first and third clutch means between the intermediate wall and the inner peripheral wall and between the intermediate wall and the outer peripheral wall respectively.

  11. Automatic Chloroplast Movement Analysis.

    PubMed

    Johansson, Henrik; Zeidler, Mathias

    2016-01-01

    In response to low or high intensities of light, the chloroplasts in the mesophyll cells of the leaf are able to increase or decrease their exposure to light by accumulating at the upper and lower sides of the cell or along its side walls, respectively. This movement, regulated by the phototropin blue-light photoreceptors phot1 and phot2, results in a decreased or increased transmission of light through the leaf. In this way the plant is able to optimize harvesting of the incoming light or avoid damage caused by excess light. Here we describe a method that indirectly measures the movement of chloroplasts by taking advantage of the resulting change in leaf transmittance. By using a microplate reader, quantitative measurements of chloroplast accumulation or avoidance can be monitored over time for multiple samples with relatively little hands-on time.

  12. Dose equations for tube current modulation in CT scanning and the interpretation of the associated CTDI{sub vol}

    SciTech Connect

    Dixon, Robert L.; Boone, John M.

    2013-11-15

    Purpose: The scanner-reported CTDI{sub vol} for automatic tube current modulation (TCM) has a different physical meaning from the traditional CTDI{sub vol} at constant mA, resulting in the dichotomy “CTDI{sub vol} of the first and second kinds” for which a physical interpretation is sought in hopes of establishing some commonality between the two. Methods: Rigorous equations are derived to describe the accumulated dose distributions for TCM. A comparison with formulae for scanner-reported CTDI{sub vol} clearly identifies the source of their differences. Graphical dose simulations are also provided for a variety of TCM tube current distributions (including constant mA), all having the same scanner-reported CTDI{sub vol}. Results: These convolution equations and simulations show that the local dose at z depends only weakly on the local tube current i(z) due to the strong influence of scatter from all other locations along z, and that the “local CTDI{sub vol}(z)” does not represent a local dose but rather only a relative i(z) ≡ mA(z). TCM is a shift-variant technique to which the CTDI paradigm does not apply, and its application to TCM leads to a CTDI{sub vol} of the second kind which lacks relevance. Conclusions: While the traditional CTDI{sub vol} at constant mA conveys useful information (the peak dose at the center of the scan length), CTDI{sub vol} of the second kind conveys no useful information about the associated TCM dose distribution it purportedly represents, and its physical interpretation remains elusive. On the other hand, the total energy absorbed E (“integral dose”) as well as its surrogate DLP remain robust between variable-i(z) TCM and constant-current i{sub 0} techniques, both depending only on the total mAs = īt{sub 0} = i{sub 0}t{sub 0} during the beam-on time t{sub 0}.
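    The convolution structure the abstract refers to can be sketched as follows, in assumed notation (not necessarily the paper's exact symbols): the accumulated TCM dose at position z is a convolution of the relative tube-current profile with the single-rotation dose-spread function, which is why the local dose depends on current delivered all along z, not just at z.

    ```latex
    % Sketch, assumed notation: f(z) is the single-rotation dose profile
    % (primary plus scatter), b the table advance per rotation, i(z) the
    % TCM tube current, L the scan length.
    D(z) \;=\; \frac{1}{b}\int_{-L/2}^{L/2} \frac{i(z')}{i_0}\, f(z-z')\, dz'
    % For constant current i(z') = i_0 this reduces to the conventional
    % accumulated-dose integral underlying the traditional CTDI_vol.
    ```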

  13. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    NASA Astrophysics Data System (ADS)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    Accurate knowledge of fabric anisotropy is crucial to understanding the mechanical behavior of snow and firn, but is also important for understanding metamorphism. The Computer-Integrated Polarization Microscopy (CIP) method used for the fabric analysis was developed by Heilbronner and Pauli in the early 1990s and uses a slightly modified traditional polarization microscope. First developed for quartz, it can be applied to other uniaxial minerals. Up to now this method was mainly used in structural geology; however, it is also well suited to the fabric analysis of snow, firn and ice. The method is based on the analysis of first-order interference color images by a slightly modified optical polarization microscope, a grayscale camera and a computer. The polarization microscope is fitted with high-quality objectives, a rotating stage and two polarizers that can be introduced above and below the thin section, as well as a full-wave plate. Additionally, two quarter-wave plates for circular polarization are needed; alternatively, circular polarization can be created from a set of cross-polarized images through image processing. A narrow-band interference filter transmitting a wavelength between 660 and 700 nm is also required. Finally, a monochrome digital camera is used to capture the input images. The idea is to record the change of interference colors while the thin section is rotated once through 180°. The azimuth and inclination of the c-axis are defined by the color change. Recording the color change through a red filter produces a signal with a well-defined amplitude and phase angle. An advantage of this method lies in the simple conversion of an ordinary optical microscope to a fabric analyzer. The Automatic Ice Texture Analyzer (AITA), the first fully functional instrument to measure c-axis orientation, was developed by Wilson and others (2003). Most recent fabric analysis of snow and firn samples was carried
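    The amplitude and phase angle of the color change can be recovered with a first Fourier projection, as in this minimal numpy sketch (function name and the assumption that the red-filtered signal completes one period per 180° stage rotation are ours, not the source's):

    ```python
    import numpy as np

    def cip_phase_amplitude(intensity, angles_deg):
        """Recover amplitude and phase of the sinusoidal colour change.

        intensity  : per-pixel signal sampled while the stage rotates 0..180 deg.
        Assumes one full period of the signal over the 180 deg turn, so we
        project onto sin/cos of twice the stage angle (first Fourier term).
        """
        theta = 2 * np.deg2rad(angles_deg)          # one period per 180 deg
        c = 2 * np.mean(intensity * np.cos(theta))
        s = 2 * np.mean(intensity * np.sin(theta))
        amplitude = np.hypot(c, s)
        phase = np.arctan2(s, c)                    # relates to c-axis azimuth
        return amplitude, phase
    ```

    With uniform sampling over the full rotation the projection is exact, so amplitude and phase are recovered without fitting.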

  14. A computer program to automatically generate state equations and macro-models. [for network analysis and design

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.; Bowers, J. C.; Oreilly, J. E., Jr.

    1978-01-01

    A computer program, PROSE, that produces nonlinear state equations from a simple topological description of an electrical or mechanical network is described. Unnecessary states are also automatically eliminated, so that a simplified terminal circuit model is obtained. The program also prints out the eigenvalues of a linearized system and the sensitivities of the eigenvalue of largest magnitude.
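    PROSE itself is a historical tool, but the final step it reports (eigenvalues of the linearized system and the sensitivity of the largest-magnitude eigenvalue) can be sketched in a few lines of numpy; the function names and finite-difference approach here are illustrative, not PROSE's implementation.

    ```python
    import numpy as np

    def jacobian(f, x0, eps=1e-6):
        """Finite-difference Jacobian of f at x0 (the linearized system matrix)."""
        n = len(x0)
        J = np.zeros((n, n))
        f0 = f(x0)
        for j in range(n):
            xp = x0.copy()
            xp[j] += eps
            J[:, j] = (f(xp) - f0) / eps
        return J

    def dominant_eig_sensitivity(f, x0, p, dp=1e-6):
        """Largest-|lambda| eigenvalue and its sensitivity d|lambda|/dp.

        f(x, p) is the state-equation right-hand side; p a scalar parameter.
        """
        lam = lambda pv: np.abs(np.linalg.eigvals(
            jacobian(lambda x: f(x, pv), x0))).max()
        l0 = lam(p)
        return l0, (lam(p + dp) - l0) / dp
    ```

    For a first-order lag dx/dt = -p x, the dominant eigenvalue magnitude is p and its sensitivity to p is 1, which the sketch reproduces.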

  15. How well Do Phonological Awareness and Rapid Automatized Naming Correlate with Chinese Reading Accuracy and Fluency? A Meta-Analysis

    ERIC Educational Resources Information Center

    Song, Shuang; Georgiou, George K.; Su, Mengmeng; Hua, Shu

    2016-01-01

    Previous meta-analyses on the relationship between phonological awareness, rapid automatized naming (RAN), and reading have been conducted primarily in English, an atypical alphabetic orthography. Here, we aimed to examine the association between phonological awareness, RAN, and word reading in a nonalphabetic language (Chinese). A random-effects…

  16. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is a simple and the most commonly used microscopic method for observing nonstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single-cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light enough for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method that is based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. When compared to the commonly used segmentation approaches, MSER required negligible preoptimization steps, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions and, due to their relatively light computational requirements, they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells.
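    The temporal-association half of this pipeline is a constant-velocity Kalman filter per cell; a minimal single-track numpy sketch is below (the MSER detection step and the multi-object data association are omitted, and all noise parameters are illustrative, not the paper's values):

    ```python
    import numpy as np

    class CVKalman:
        """Constant-velocity Kalman filter for one cell centroid (x, y).

        State: [x, y, vx, vy]; measurement: a detected centroid (x, y).
        A minimal stand-in for one track of the multi-object tracker.
        """
        def __init__(self, xy, q=1e-2, r=1.0):
            self.x = np.array([xy[0], xy[1], 0.0, 0.0])
            self.P = np.eye(4) * 10.0
            self.F = np.eye(4)
            self.F[0, 2] = self.F[1, 3] = 1.0          # dt = 1 frame
            self.H = np.zeros((2, 4))
            self.H[0, 0] = self.H[1, 1] = 1.0
            self.Q = np.eye(4) * q                     # process noise
            self.R = np.eye(2) * r                     # measurement noise

        def step(self, z):
            # predict
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            # update with the measured centroid z
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ (z - self.H @ self.x)
            self.P = (np.eye(4) - K @ self.H) @ self.P
            return self.x[:2]
    ```

    Fed a cell drifting at one pixel per frame, the filter converges to both the position and the migration velocity, which is what makes the measured migration magnitude and direction robust to per-frame detection jitter.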

  17. Automatic analysis of stereoscopic GOES/GOES and GOES/NOAA image pairs for measurement of hurricane cloud top height and structure

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Strong, J.; Pierce, H.; Woodward, R. H.

    1989-01-01

    Results are presented from a baseline study using a synthetic stereo image pair to test the Automatic Stereo Analysis (ASA) technique for reproducing cloud top structure. The ASA analysis, display, and calibration procedures are described. A GEO/LEO (GOES/NOAA AVHRR) image pair from Hurricane Allen in 1980 is used to illustrate the results that can be obtained using the ASA technique. Also, results are presented from applying the ASA technique to a GEO/GEO (GOES/GOES) image pair of Hurricane Gilbert in 1988.

  18. AUTOMATIC NAVIGATION.

    DTIC Science & Technology

    NAVIGATION, REPORTS), (*CONTROL SYSTEMS, *INFORMATION THEORY), ABSTRACTS, OPTIMIZATION, DYNAMIC PROGRAMMING, GAME THEORY, NONLINEAR SYSTEMS, CORRELATION TECHNIQUES, FOURIER ANALYSIS, INTEGRAL TRANSFORMS, DEMODULATION, NAVIGATION CHARTS, PATTERN RECOGNITION, DISTRIBUTION THEORY, TIME SHARING, GRAPHICS, DIGITAL COMPUTERS, FEEDBACK, STABILITY

  19. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer

    PubMed Central

    2013-01-01

    Background In radiation oncology recurrence analysis is an important part in the evaluation process and clinical quality assurance of treatment concepts. With the example of 9 patients with locally advanced pancreatic cancer we developed and validated interactive analysis tools to support the evaluation workflow. Methods After an automatic registration of the radiation planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volume. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Results Recurrence analysis of 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border/outfield. With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm, however, this might be due to very rapid growth and/or late detection of the tumor progression. Conclusion The main goal of using automatic analysis tools is to reduce time and effort conducting clinical analyses. We showed a first approach and use of a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations we contributed to in-house optimization of subsequent study concepts based on an improved and validated target volume definition. PMID:24499557
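    The reported metric (percentage of the recurrence volume inside the 80% isodose, or inside a boost+margin volume) reduces to a voxel-mask overlap once images are registered; a minimal numpy sketch (function name ours) follows:

    ```python
    import numpy as np

    def percent_inside(recurrence_mask, region_mask):
        """Percentage of the recurrence volume lying inside a reference region.

        Both inputs are boolean voxel masks on the same (registered) grid,
        e.g. the recurrence volume vs. the 80%-isodose volume or the
        boost + 2 cm volume.
        """
        rec = np.count_nonzero(recurrence_mask)
        if rec == 0:
            return 0.0
        inside = np.count_nonzero(recurrence_mask & region_mask)
        return 100.0 * inside / rec
    ```

    The same helper applied with successively expanded region masks (boost, boost + 1 cm, boost + 1.5 cm, boost + 2 cm) yields the distance-volume comparison described above.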

  20. System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator

    DTIC Science & Technology

    2006-08-01

    System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator, Jae-Jun Kim and Brij N. Agrawal, Department of... Cited references include: ...and Dynamics, Vol. 20, No. 4, July-August 1997, pp. 625-632; Schwartz, J. L. and Hall, C. D., “System Identification of a Spherical Air-Bearing

  1. Comparative analysis of different implementations of a parallel algorithm for automatic target detection and classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio; Plaza, Javier

    2009-08-01

    Automatic target detection in hyperspectral images is a task that has attracted a lot of attention recently. In the last few years, several algorithms have been developed for this purpose, including the well-known RX algorithm for anomaly detection, or the automatic target detection and classification algorithm (ATDCA), which uses an orthogonal subspace projection (OSP) approach to extract a set of spectrally distinct targets automatically from the input hyperspectral data. Depending on the complexity and dimensionality of the analyzed image scene, the target/anomaly detection process may be computationally very expensive, a fact that limits the possibility of utilizing this process in time-critical applications. In this paper, we develop computationally efficient parallel versions of both the RX and ATDCA algorithms for near real-time exploitation of these algorithms. In the case of ATDCA, we use several distance metrics in addition to the OSP approach. The parallel versions are quantitatively compared in terms of target detection accuracy, using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center in New York, five days after the terrorist attack of September 11th, 2001, and also in terms of parallel performance, using a massively parallel Beowulf cluster available at NASA's Goddard Space Flight Center in Maryland.
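    The serial core of the OSP-based target extraction can be sketched in a few lines of numpy: repeatedly pick the strongest pixel, then project the data onto the orthogonal complement of the targets found so far. This is a generic ATDCA-style sketch, not the paper's parallel implementation, and the function name is ours.

    ```python
    import numpy as np

    def atgp(X, n_targets):
        """Orthogonal-subspace-projection target extraction (ATDCA-style sketch).

        X : (n_pixels, n_bands) hyperspectral data matrix.
        Iteratively picks the pixel of maximum norm in the subspace
        orthogonal to the target signatures already found.
        """
        n_bands = X.shape[1]
        targets = []
        P = np.eye(n_bands)                 # orthogonal-complement projector
        for _ in range(n_targets):
            proj = X @ P
            idx = np.argmax(np.sum(proj ** 2, axis=1))
            targets.append(int(idx))
            U = X[targets].T                # (n_bands, k) target signatures
            P = np.eye(n_bands) - U @ np.linalg.pinv(U)
        return targets
    ```

    The per-pixel projections are independent, which is exactly what makes a data-parallel (e.g. Beowulf-cluster) decomposition of the loop body straightforward.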

  2. Cognitions, emotions, and sexual response: analysis of the relationship among automatic thoughts, emotional responses, and sexual arousal.

    PubMed

    Nobre, Pedro J; Pinto-Gouveia, José

    2008-08-01

    The relationship between automatic thoughts and emotions presented during sexual activity and their correlation with sexual arousal was investigated. A total of 491 individuals (163 women and 232 men without sexual problems and 47 women and 49 men with a DSM-IV diagnosis of sexual dysfunction) completed the Sexual Modes Questionnaire (SMQ; Nobre and Pinto-Gouveia, Journal of Sex Research, 40, 368-382, 2003). Results indicated several significant correlations among automatic thoughts, emotions, and sexual arousal. Erection concern thoughts in the men and failure/disengagement thoughts and lack of erotic thoughts in the women presented the most significant negative correlations with sexual arousal. Additionally, sadness and disillusion were positively related to these negative cognitions and negatively associated with sexual arousal in both sexes. On the other hand, pleasure and satisfaction were negatively associated with the above-mentioned negative cognitions and positively associated with subjective sexual arousal in both men and women. Overall, findings support the hypothesis that cognitive, emotional, and behavioral dimensions are closely linked and suggest a mode typical of sexual dysfunction composed of negative automatic thoughts, depressive affect, and low subjective sexual arousal.

  3. A semi-automatic measurement system based on digital image analysis for the application to the single fiber fragmentation test

    NASA Astrophysics Data System (ADS)

    Blobel, Swen; Thielsch, Karin; Ulbricht, Volker

    2013-04-01

    The computational prediction of the effective macroscopic material behavior of fiber-reinforced composites is a goal of research to exploit the potential of these materials. Besides the mechanical characteristics of the material components, extensive knowledge of the mechanical interaction between these components is necessary in order to set up suitable models of the local material structure. For example, an experimental investigation of the micromechanical damage behavior of simplified composite specimens can help to understand the mechanisms which cause matrix and interface damage in the vicinity of a fiber fracture. To realize an appropriate experimental setup, a novel semi-automatic measurement system based on the analysis of digital images using photoelasticity and image correlation was developed. Applied to specimens with a birefringent matrix material, it is able to provide global and local information on the damage evolution and the stress and strain state at the same time. The image acquisition is accomplished using a long-distance microscopic optic with an effective resolution of two micrometers per pixel. While the system is moved along the domain of interest of the specimen, the acquired images are assembled online and used to interpret optically extracted information in combination with global force-displacement curves provided by the load frame. The illumination of the specimen with circularly polarized light and the projection of the transmitted light through different configurations of polarizers and quarter-wave plates enable the synchronous capturing of four images at the quadrants of a four-megapixel image sensor. The fifth image is decoupled from the same optical path and is projected onto a second camera chip, to get a non-polarized image of the same scene at the same time. The benefit of this optical setup is the opportunity to extract a wide range of information locally, without influence on the progress of the experiment. 
The four images

  4. Automatic transmission adapter kit

    SciTech Connect

    Stich, R.L.; Neal, W.D.

    1987-02-10

    This patent describes, in a four-wheel-drive vehicle apparatus having a power train including an automatic transmission and a transfer case, an automatic transmission adapter kit for installation of a replacement automatic transmission of shorter length than an original automatic transmission in the four-wheel-drive vehicle. The adapter kit comprises: an extension housing interposed between the replacement automatic transmission and the transfer case; an output shaft, having a first end which engages the replacement automatic transmission and a second end which engages the transfer case; first sealing means for sealing between the extension housing and the replacement automatic transmission; second sealing means for sealing between the extension housing and the transfer case; and fastening means for connecting the extension housing between the replacement automatic transmission and the transfer case.

  5. Fully Automatic Determination of Soil Bacterium Numbers, Cell Volumes, and Frequencies of Dividing Cells by Confocal Laser Scanning Microscopy and Image Analysis

    PubMed Central

    Bloem, J.; Veninga, M.; Shepherd, J.

    1995-01-01

    We describe a fully automatic image analysis system capable of measuring cell numbers, volumes, lengths, and widths of bacteria in soil smears. The system also determines the number of cells in agglomerates and thus provides the frequency of dividing cells (FDC). Images are acquired from a confocal laser scanning microscope. The grey images are smoothed by convolution and by morphological erosion and dilation to remove noise. The background is equalized by flooding holes in the image and is then subtracted by two top hat transforms. Finally, the grey image is sharpened by delineation, and all particles above a fixed threshold are detected. The number of cells in each detected particle is determined by counting the number of local grey-level maxima in the particle. Thus, up to 1,500 cells in 10 fields of view in a soil smear are analyzed in 30 min without human intervention. Automatic counts of cell numbers and FDC were similar to visual counts in field samples. In microcosms, automatic measurements showed significant increases in cell numbers, FDC, mean cell volume, and length-to-width ratio after amendment of the soil. Volumes of fluorescent microspheres were measured with good approximation, but the absolute values obtained were strongly affected by the settings of the detector sensitivity. Independent measurements of bacterial cell numbers and volumes by image analysis and of cell carbon by a total organic carbon analyzer yielded an average specific carbon content of 200 fg of C per cubic micrometer, which indicates that our volume estimates are reasonable. PMID:16534976
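    The core of this pipeline (smoothing, top-hat background removal, fixed threshold, one cell per local grey-level maximum within each particle) maps directly onto standard morphological operators; here is a minimal scipy.ndimage sketch with illustrative parameter values (the original system's exact filters and sizes are not given):

    ```python
    import numpy as np
    from scipy import ndimage as ndi

    def count_cells(img, tophat_size=15, threshold=0.2):
        """Count cells and cells-per-particle in a grey image (sketch).

        Mirrors the pipeline described above: smooth, top-hat background
        removal, fixed threshold, then one cell per local grey-level
        maximum inside each detected particle. Parameters are illustrative.
        """
        smooth = ndi.gaussian_filter(img.astype(float), sigma=1)
        flat = ndi.white_tophat(smooth, size=tophat_size)
        mask = flat > threshold
        # local maxima of the grey image, restricted to detected particles
        maxima = (flat == ndi.maximum_filter(flat, size=5)) & mask
        labels, n_particles = ndi.label(mask)
        cells_per_particle = ndi.sum(maxima, labels,
                                     range(1, n_particles + 1))
        return int(maxima.sum()), n_particles, cells_per_particle
    ```

    The cells-per-particle counts are what yield the FDC: a particle containing two grey-level maxima is scored as a dividing pair.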

  6. A prostate CAD system based on multiparametric analysis of DCE T1-w, and DW automatically registered images

    NASA Astrophysics Data System (ADS)

    Giannini, Valentina; Vignati, Anna; Mazzetti, Simone; De Luca, Massimo; Bracco, Christian; Stasi, Michele; Russo, Filippo; Armando, Enrico; Regge, Daniele

    2013-02-01

    Prostate specific antigen (PSA)-based screening reduces the rate of death from prostate cancer (PCa) by 31%, but this benefit is associated with a high risk of overdiagnosis and overtreatment. As prostate transrectal ultrasound-guided biopsy, the standard procedure for prostate histological sampling, has a sensitivity of 77% with a considerable false-negative rate, more accurate methods need to be found to detect or rule out significant disease. Prostate magnetic resonance imaging has the potential to improve the specificity of PSA-based screening scenarios as a non-invasive detection tool, in particular exploiting the combination of anatomical and functional information in a multiparametric framework. The purpose of this study was to describe a computer aided diagnosis (CAD) method that automatically produces a malignancy likelihood map by combining information from dynamic contrast enhanced MR images and diffusion weighted images. The CAD system consists of multiple sequential stages, from a preliminary registration of images of different sequences, in order to correct for susceptibility deformation and/or movement artifacts, to a Bayesian classifier, which fused all the extracted features into a probability map. The promising results (AUROC=0.87) should be validated on a larger dataset, but they suggest that the discrimination on a voxel basis between benign and malignant tissues is feasible with good performances. This method can be of benefit to improve the diagnostic accuracy of the radiologist, reduce reader variability and speed up the reading time, automatically highlighting probably cancer suspicious regions.
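    The abstract does not spell out the Bayesian classifier; as a generic illustration, per-voxel features from the registered sequences can be fused with a naive-Bayes rule under Gaussian class models. Everything below (function name, Gaussian assumption, parameters) is a hypothetical sketch, not the paper's CAD system.

    ```python
    import numpy as np

    def fuse_probability_map(feature_maps, means, stds, prior=0.5):
        """Naive-Bayes fusion of voxelwise features into a probability map.

        feature_maps : list of arrays (one per feature, same shape).
        means, stds  : per-feature (benign, malignant) Gaussian class
                       parameters, given as pairs -- purely illustrative.
        """
        def gauss(x, mu, sd):
            return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

        like_mal = np.ones_like(feature_maps[0], dtype=float) * prior
        like_ben = np.ones_like(feature_maps[0], dtype=float) * (1 - prior)
        for f, (mu_b, mu_m), (sd_b, sd_m) in zip(feature_maps, means, stds):
            like_mal *= gauss(f, mu_m, sd_m)
            like_ben *= gauss(f, mu_b, sd_b)
        return like_mal / (like_mal + like_ben)
    ```

    The returned array is the malignancy likelihood map: voxels whose features sit near the malignant class model score close to 1, and those near the benign model close to 0.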

  7. Algorithms for skiascopy measurement automatization

    NASA Astrophysics Data System (ADS)

    Fomins, Sergejs; Trukša, Renārs; Krūmiņa, Gunta

    2014-10-01

    An automatic dynamic infrared retinoscope was developed, which allows the procedure to be run at a much higher rate. Our system uses a USB image sensor with up to 180 Hz refresh rate, equipped with a long-focus objective, and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye's pupil reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic accommodative state analysis was developed based on the intensity changes of the fundus reflex.

  8. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, 1873 (PL XXI); illustration of turbine and belt system. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  9. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 2), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 2), Manchester, 1873 (PL XXIX top); illustration of full mill, as enlarged to south. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  10. 1. Photocopy of delineation, American Architect and Building News, Vol ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Photocopy of delineation, American Architect and Building News, Vol VI, No. 146, (September 27, 1879). SHOWING FRONT ELEVATION AND FLOOR PLAN - G. B. P. Carpenter House, 100 Block of Polk Streets (Prospect Point), Burlington, Des Moines County, IA

  11. Photocopy of a photograph (original from Kansas City Spirit, Vol. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of a photograph (original from Kansas City Spirit, Vol. III, no. 5, April 1910) View to the west toward front arcade entrance - Scarritt Building & Arcade, Ninth Street & Grand Avenue, & 819 Walnut Street, Kansas City, Jackson County, MO

  12. Vol(2)velle: Printable Interactive Volume Visualization.

    PubMed

    Stoppel, Sergej; Bruckner, Stefan

    2017-01-01

    Interaction is an indispensable aspect of data visualization. The presentation of volumetric data, in particular, often significantly benefits from interactive manipulation of parameters such as transfer functions, rendering styles, or clipping planes. However, when we want to create hardcopies of such visualizations, this essential aspect is lost. In this paper, we present a novel approach for creating hardcopies of volume visualizations which preserves a certain degree of interactivity. We present a method for automatically generating Volvelles, printable tangible wheel charts that can be manipulated to explore different parameter settings. Our interactive system allows the flexible mapping of arbitrary visualization parameters and supports advanced features such as linked views. The resulting designs can be easily reproduced using a standard printer and assembled within a few minutes.

  13. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  14. Dissimilarity analysis and automatic identification of monomethylalkanes from gas chromatography mass spectrometry data 1. Principle and protocols.

    PubMed

    Zhang, Liangxiao; Liang, Yizeng

    2009-07-03

    Monomethylalkanes are common but important components of many naturally occurring and synthetic organic materials. Generally, these compounds are routinely analyzed by gas chromatography mass spectrometry (GC-MS) and identified by their retention pattern or by similarity matching against a reference mass spectral library. However, these identification approaches rely on limited standard databases or costly standard compounds. When an unknown monomethylalkane is absent from the reference library, these approaches may be of little use. In this study, based on fragmentation rules and empirical observation, many interesting mass spectral characteristics of monomethylalkanes were discovered and employed to infer the number of carbon atoms and the methylated position. Combined with the retention pattern, a protocol is described for the identification of monomethylalkanes analyzed by GC-MS. After testing on simulated data and on GC-MS data of a gasoline sample, it was demonstrated that the approach can automatically and correctly identify monomethylalkanes in complicated GC-MS data.

  15. On the Selection of Non-Invasive Methods Based on Speech Analysis Oriented to Automatic Alzheimer Disease Diagnosis

    PubMed Central

    López-de-Ipiña, Karmele; Alonso, Jesus-Bernardino; Travieso, Carlos Manuel; Solé-Casals, Jordi; Egiraun, Harkaitz; Faundez-Zanuy, Marcos; Ezeiza, Aitzol; Barroso, Nora; Ecay-Torres, Miriam; Martinez-Lage, Pablo; de Lizardui, Unai Martinez

    2013-01-01

    The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two speech dimensions have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low-cost and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients. PMID:23698268
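One of the non-linear features mentioned, Fractal Dimension, is commonly computed from a signal with Higuchi's algorithm. The sketch below is a generic implementation under assumed settings (kmax and the regression details are not the authors' exact feature extractor):

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D signal (generic sketch)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    log_inv_k, log_L = [], []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)          # subsampled series x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            diffs = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / (len(idx) - 1) / k   # Higuchi's length normalisation
            Lk.append(diffs * norm / k)
        log_inv_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(Lk)))
    # FD is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(log_inv_k, log_L, 1)
    return slope
```

On a straight line the estimate is ~1; on white noise it approaches 2, the expected extremes for 1-D signals.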

  16. Automatic control of a robot camera for broadcasting based on cameramen's techniques and subjective evaluation and analysis of reproduced images.

    PubMed

    Kato, D; Katsuura, T; Koyama, H

    2000-03-01

    With the goal of achieving an intelligent robot camera system that can take dynamic images automatically through humanlike, natural camera work, we analyzed how images were shot, subjectively evaluated reproduced images, and examined the effects of camera work, using the camera control technique as a parameter. It was found that (1) a high evaluation is obtained when human-based data are used for the position-adjusting velocity curve of the target; (2) evaluation scores are relatively high for images taken with the feedback-feedforward camera control method for target movement in one direction; (3) keeping the target within the image area using the control method that imitates human camera handling becomes increasingly difficult when the target changes both direction and velocity and becomes bigger and faster; and (4) the mechanical feedback method can cope with rapid changes in the target's direction and velocity, constantly keeping the target within the image area, though the viewer finds the image rather mechanical as opposed to humanlike.
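The benefit of the feedback-feedforward method for one-directional motion (finding 2) can be sketched with a toy pan-tracking loop. The gains, rates and time step below are hypothetical values, not the study's parameters:

```python
# Toy comparison (hypothetical parameters): pure feedback pan control versus
# feedback plus a feedforward term carrying the target's known angular rate.

def track(feedforward: bool, v=0.2, kp=2.0, dt=0.01, steps=2000):
    """Simulate panning after a target moving at constant angular rate v (rad/s)."""
    pan, target = 0.0, 0.0
    for _ in range(steps):
        target += v * dt
        # feedback on the pointing error, optionally plus feedforward of v
        rate = kp * (target - pan) + (v if feedforward else 0.0)
        pan += rate * dt
    return target - pan            # steady-state tracking error (rad)

err_fb = track(feedforward=False)  # pure feedback lags by roughly v/kp
err_ff = track(feedforward=True)   # feedforward largely removes the lag
print(err_fb, err_ff)
```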

  17. LibME-automatic extraction of 3D ligand-binding motifs for mechanistic analysis of protein-ligand recognition.

    PubMed

    He, Wei; Liang, Zhi; Teng, MaiKun; Niu, LiWen

    2016-12-01

    Identifying conserved binding motifs is an efficient way to study protein-ligand recognition. Most 3D binding motifs only contain information from the protein side, and so motifs that combine information from both protein and ligand sides are desired. Here, we propose an algorithm called LibME (Ligand-binding Motif Extractor), which automatically extracts 3D binding motifs composed of the target ligand and surrounding conserved residues. We show that the motifs extracted by LibME for ATP and its analogs are highly similar to well-known motifs reported by previous studies. The superiority of our method in handling flexible ligands was also demonstrated using isocitric acid as an example. Finally, we show that these motifs, together with their visual exhibition, permit better investigation and understanding of the protein-ligand recognition process.

  18. The feasibility of a regional CTDI{sub vol} to estimate organ dose from tube current modulated CT exams

    SciTech Connect

    Khatonabadi, Maryam; Kim, Hyun J.; Lu, Peiyun; McMillan, Kyle L.; Cagnon, Chris H.; McNitt-Gray, Michael F.; DeMarco, John J.

    2013-05-15

    dose to correlate with patient size was investigated. Results: For all five organs, the correlations with patient size increased when organ doses were normalized by regional and organ-specific CTDI{sub vol} values. For example, when estimating dose to the liver, CTDI{sub vol,global} yielded a R{sup 2} value of 0.26, which improved to 0.77 and 0.86, when using the regional and organ-specific CTDI{sub vol} for abdomen and liver, respectively. For breast dose, the global CTDI{sub vol} yielded a R{sup 2} value of 0.08, which improved to 0.58 and 0.83, when using the regional and organ-specific CTDI{sub vol} for chest and breasts, respectively. The R{sup 2} values also increased once the thoracic models were separated for the analysis into females and males, indicating differences between genders in this region not explained by a simple measure of effective diameter. Conclusions: This work demonstrated the utility of regional and organ-specific CTDI{sub vol} as normalization factors when using TCM. It was demonstrated that CTDI{sub vol,global} is not an effective normalization factor in TCM exams where attenuation (and therefore tube current) varies considerably throughout the scan, such as abdomen/pelvis and even thorax. These exams can be more accurately assessed for dose using regional CTDI{sub vol} descriptors that account for local variations in scanner output present when TCM is employed.
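A toy illustration (synthetic numbers, not the paper's patient data) of the normalization idea: when scanner output varies patient-to-patient under TCM, unnormalized organ dose correlates weakly with effective diameter, and dividing by a regional CTDIvol recovers the size dependence:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a least-squares line y ~ a*x + b."""
    a, b = np.polyfit(x, y, 1)
    resid = y - (a * x + b)
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 200
diam = rng.uniform(20, 40, n)                    # effective diameter (cm)
# scanner output varies patient-to-patient under TCM (illustrative factor)
ctdi_regional = 5.0 * rng.uniform(0.5, 2.0, n)
# organ dose scales with scanner output and falls off with patient size
organ_dose = ctdi_regional * np.exp(-0.045 * (diam - 30)) * rng.normal(1, 0.03, n)

r2_raw = r_squared(diam, organ_dose)             # weak: masked by output scatter
r2_norm = r_squared(diam, organ_dose / ctdi_regional)  # strong after normalization
print(r2_raw, r2_norm)
```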

  19. A micellar electrokinetic chromatography-mass spectrometry approach using in-capillary diastereomeric derivatization for fully automatized chiral analysis of amino acids.

    PubMed

    Moldovan, Radu-Cristian; Bodoki, Ede; Kacsó, Timea; Servais, Anne-Catherine; Crommen, Jacques; Oprean, Radu; Fillet, Marianne

    2016-10-07

    In the context of bioanalytical method development, process automation is now a necessity in order to save time, improve method reliability and reduce costs. For the first time, a fully automated micellar electrokinetic chromatography-mass spectrometry (MEKC-MS) method with in-capillary derivatization was developed for the chiral analysis of d- and l-amino acids using (-)-1-(9-fluorenyl)ethyl chloroformate (FLEC) as the labeling reagent. The derivatization procedure was optimized using an experimental design approach, leading to the following conditions: sample and FLEC plugs in a 2:1 ratio (15 s at 30 mbar; 7.5 s at 30 mbar), followed by 15 min of mixing at a voltage of 0.1 kV. The diastereomers formed were then separated using a background electrolyte (BGE) consisting of 150 mM ammonium perfluorooctanoate (APFO) at pH 9.5 and detected by mass spectrometry (MS). Complete chiral resolution was obtained for 8 amino acids, while partial separation was achieved for 6 other amino acid pairs. The method showed good reproducibility and linearity in the low micromolar concentration range. Its applicability to biological samples was tested by analyzing artificial cerebrospinal fluid (aCSF) samples.

  20. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    PubMed

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

    The determination of alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time-consuming and error-prone, especially for liqueurs, which may suffer from entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. The novel instrument offers increased steam power, a redesigned condenser geometry and a larger cooling coil with controllable flow compared to previously available devices. Method optimization applying D-optimal and central composite designs showed a significant influence of sample volume, distillation time and coolant flow, while other investigated parameters, such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement, did not significantly influence the results. The method validation was conducted using the following settings: steam power 70 %, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35 % vol, the method showed adequate precision, with relative standard deviations below 0.4 % (intraday) and below 0.6 % (interday). The absolute standard deviations were between 0.06 % vol and 0.08 % vol (intraday) and between 0.07 % vol and 0.10 % vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and there are only

  1. Application of a method for the automatic detection and Ground-Based Velocity Track Display (GBVTD) analysis of a tornado crossing the Hong Kong International Airport

    NASA Astrophysics Data System (ADS)

    Chan, P. W.; Wurman, J.; Shun, C. M.; Robinson, P.; Kosiba, K.

    2012-03-01

    A weak tornado with a maximum Doppler velocity shear of about 40 m s⁻¹ moved across the Hong Kong International Airport (HKIA) during the evening of 20 May 2002. The tornado caused damage equivalent to F0 on the Fujita Scale, based on a damage survey. The Doppler velocity data from the Hong Kong Terminal Doppler Weather Radar (TDWR) are studied using the Ground-Based Velocity Track Display (GBVTD) method of single Doppler analysis. The GBVTD analysis is able to clearly depict the development and decay of the tornado, though it appears to underestimate its magnitude. In the pre-tornadic state, the wind field is characterized by inflow toward the center near the ground and upward motion near the center. When the tornado attains its maximum strength, an eye-like structure with a downdraft appears to form in the center. Several minutes later the tornado begins to decay and outflow dominates at low levels. Assuming cyclostrophic balance, the pressure drop 200 m from the center of the tornado at its maximum strength is calculated to be about 6 hPa. To estimate the maximum ground-relative wind speed of the tornado, the TDWR's Doppler velocities are adjusted for the ratio of the radar's sample-volume size to the radius of the tornado, resulting in a peak wind speed of 28 m s⁻¹, consistent with readings from nearby ground-based anemometers and the F0 damage observed. An automatic tornado detection algorithm based on Doppler velocity difference (delta-V) and temporal and spatial continuity is applied to this event. The locations and core flow radii of the tornado as determined by the automatic method and by subjective analysis agree closely.
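The cyclostrophic-balance estimate integrates dp/dr = ρ v(r)²/r outward from the radius of interest to the far field. The sketch below uses a Rankine vortex profile and illustrative numbers (core radius, air density), not the paper's retrieved wind field:

```python
import numpy as np

def pressure_deficit(r0, v_max, r_core, rho=1.2, r_far=20000.0, n=200000):
    """Pressure drop (Pa) at radius r0 relative to the environment,
    assuming cyclostrophic balance and a Rankine vortex wind profile."""
    r = np.linspace(r0, r_far, n)
    # solid-body rotation inside the core, 1/r decay outside
    v = np.where(r <= r_core, v_max * r / r_core, v_max * r_core / r)
    integrand = rho * v**2 / r          # dp/dr under cyclostrophic balance
    dr = r[1] - r[0]
    return float(np.sum((integrand[1:] + integrand[:-1]) * 0.5) * dr)

# e.g. a 28 m/s vortex with an assumed 50 m core radius, evaluated 200 m out
print(pressure_deficit(200.0, 28.0, 50.0) / 100.0)  # deficit in hPa
```

Outside the core the integral has the closed form ρ v_max² r_core² / (2 r0²), which serves as a check on the numerical quadrature.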

  2. Automatic Verification of Serializers.

    DTIC Science & Technology

    1980-03-01

    Programming Languages, Academic Press, New York, 1968. Dijkstra 71 E. Dijkstra, Hierarchical Ordering of Sequential Processes, Acta Informatica, vol. 1...Programming Languages, Las Vegas, January 1980, 174-185. Lampson and Redell 79 B. Lampson, D. Redell, Experience with monitors and processes in

  3. Automatic Analysis and Classification of the Roof Surfaces for the Installation of Solar Panels Using a Multi-Data Source and Multi-Sensor Aerial Platform

    NASA Astrophysics Data System (ADS)

    López, L.; Lagüela, S.; Picon, I.; González-Aguilera, D.

    2015-02-01

    A low-cost multi-sensor aerial platform, an aerial trike equipped with visible and thermographic sensors, is used to acquire all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbour solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, together with the temperatures measured on thermographic images, is decisive for evaluating the surfaces, slopes, orientations and the existence of obstacles. In this way, large areas may be analysed efficiently, yielding as the final result the optimal locations for the placement of solar panels as well as the required geometry of the supports for installing the panels on roofs whose geometry is not optimal.

  4. Automatic method of analysis of OCT images in the assessment of the tooth enamel surface after orthodontic treatment with fixed braces

    PubMed Central

    2014-01-01

    Introduction Fixed orthodontic appliances, despite years of research and development, still raise a lot of controversy because of their potentially destructive influence on enamel. It is therefore necessary to quantitatively assess the condition, and in particular the thickness, of tooth enamel in order to select the appropriate orthodontic bonding and debonding methodology, as well as to assess the quality of enamel after treatment and the clean-up procedure in order to choose the most advantageous course of treatment. One of the assessment methods is optical coherence tomography, where the measurement of enamel thickness and the 3D reconstruction of image sequences can be performed fully automatically. Material and method OCT images of 180 teeth were obtained from the Topcon 3D OCT-2000 camera. The images were obtained in vitro by performing 7 stages of treatment sequentially on all the teeth: before any interference with the enamel, polishing with orthodontic paste, etching and application of a bonding system, orthodontic bracket bonding, orthodontic bracket removal, and cleaning off adhesive residue. A dedicated method for the analysis and processing of images, involving median filtering, mathematical morphology, binarization, polynomial approximation and the active contour method, has been proposed. Results The obtained results enable automatic measurement of tooth enamel thickness in 5 seconds using a Core i5 CPU M460 @ 2.5 GHz with 4 GB RAM. For one patient, the proposed method of analysis confirms an enamel thickness loss of 80 μm (from 730 ± 165 μm to 650 ± 129 μm) after polishing with paste, an enamel thickness loss of 435 μm (from 730 ± 165 μm to 295 ± 55 μm) after etching and bonding resin application, and growth of a 265 μm layer (from 295 ± 55 μm to 560 ± 98 μm after etching), which is the adhesive system. After removing an orthodontic bracket, the adhesive residue was 105 μm and after cleaning it off, the enamel thickness was
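The thickness-measurement step of such a pipeline can be sketched on a synthetic depth profile: median filtering to suppress speckle, binarization at half maximum, and conversion from pixels to micrometres. The filter size and the 3.9 μm pixel pitch below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def enamel_thickness_um(a_scan, pixel_um=3.9, win=5):
    """Estimate layer thickness from a 1-D OCT depth profile (toy sketch)."""
    x = np.asarray(a_scan, dtype=float)
    # simple 1-D median filter to suppress speckle noise
    pad = win // 2
    xp = np.pad(x, pad, mode="edge")
    filt = np.array([np.median(xp[i:i + win]) for i in range(len(x))])
    mask = filt > filt.max() / 2.0      # binarization at half maximum
    return mask.sum() * pixel_um        # band width in micrometres

# synthetic A-scan: background noise plus a bright band 150 pixels wide
rng = np.random.default_rng(0)
scan = rng.normal(0.05, 0.01, 600)
scan[200:350] += 1.0
print(enamel_thickness_um(scan))        # ~150 px * 3.9 um, i.e. about 585 um
```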

  5. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

    Recently, GPS has become more and more widely applied in open-pit mine slope safety monitoring. The automatic monitoring system for the high-steep slope of the Daye Iron Mine open pit consists of three modules: a GPS data processing module, a monitoring and warning module, and an emergency plans module. According to the rock mass structural features and the slope stability evaluation, seven GPS deformation monitoring points were arranged on the scarp of Fault F9 at the Daye Iron Mine; observations are carried out with single-frequency static GPS receivers combined with data-transmission radio, and the data processing mainly uses a three-transect interpolation method to address discontinuity and effectiveness in the data series. Based on displacement monitoring data from 1990 to 1996 for Landslide A2 on Shizi Mountain in the Daye Iron Mine East Open Pit, the displacement, rate, acceleration, and creep-curve tangent-angle criteria of landslide failure were investigated. The results show that Landslide A2 is a fall-type rock landslide whose movement occurs in three phases: a creep stage, an accelerated stage, and a failure stage. The failure criteria differ between stages and between positions at the rear, central, and front margins of the landslide. These findings provide important guidance for formulating a comprehensive failure criterion for the seven newly installed monitoring points, combining slope deformation behaviour with macroscopic evidence.

  6. Further evidence of complex motor dysfunction in drug naive children with autism using automatic motion analysis of gait.

    PubMed

    Nobile, Maria; Perego, Paolo; Piccinini, Luigi; Mani, Elisa; Rossi, Agnese; Bellina, Monica; Molteni, Massimo

    2011-05-01

    In order to increase the knowledge of locomotor disturbances in children with autism, and of the mechanism underlying them, the objective of this exploratory study was to reliably and quantitatively evaluate linear gait parameters (spatio-temporal and kinematic parameters), upper body kinematic parameters, walk orientation and smoothness using an automatic motion analyser (ELITE systems) in drug naive children with Autistic Disorder (AD) and healthy controls. The children with AD showed a stiffer gait in which the usual fluidity of walking was lost, trunk postural abnormalities, highly significant difficulty in maintaining a straight line and a marked loss of smoothness (an increased jerk index), compared to the healthy controls. As a whole, these data suggest a complex motor dysfunction involving both cortical and subcortical areas, or possibly a deficit in the integration of sensory-motor information within motor networks (i.e., anomalous connections within the fronto-cerebello-thalamo-frontal network). Although the underlying neural structures involved remain to be better defined, these data may contribute to highlighting the central role of motor impairment in autism and suggest the usefulness of taking into account motor difficulties when developing new diagnostic and rehabilitation programs.
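A jerk-based smoothness index like the one mentioned can be sketched generically; the exact jerk index used by the study is not specified here, so the code below uses a common dimensionless squared-jerk measure (integrated squared jerk scaled by duration and amplitude):

```python
import numpy as np

def dimensionless_jerk(pos, dt):
    """Integrated squared jerk, scaled by duration^5 / amplitude^2
    (a standard smoothness measure; higher values mean less smooth)."""
    pos = np.asarray(pos, dtype=float)
    jerk = np.diff(pos, n=3) / dt**3      # third finite difference
    duration = dt * (len(pos) - 1)
    amplitude = pos.max() - pos.min()
    return np.sum(jerk**2) * dt * duration**5 / amplitude**2

dt = 0.01
t = np.arange(0, 2, dt)
smooth = np.sin(np.pi * t / 2)                  # fluid trajectory
rng = np.random.default_rng(0)
jittery = smooth + rng.normal(0, 0.01, len(t))  # same path with tremor
print(dimensionless_jerk(smooth, dt) < dimensionless_jerk(jittery, dt))  # True
```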

  7. Analysis of feeding and drinking patterns of dairy cows in two cow traffic situations in automatic milking systems.

    PubMed

    Melin, M; Wiktorsson, H; Norell, L

    2005-01-01

    With increasing possibilities for obtaining online information for individual cows, systems for individual management can be developed. Feeding and drinking patterns from automatically obtained records may be valuable input information in these systems. With the aim of evaluating appropriate mixed-distribution models for feeding and drinking events, records of 30 fresh cows from visits at feeding stations (n = 83,249) and water bowls (n = 67,525) were analyzed. Cows were either allowed a high-milking (HF) or a low-milking (LF) frequency by being subjected to controlled cow traffic with minimum milking intervals of 4 and 8 h, respectively. Milking frequency had significant effects on feeding patterns. The major part (84 to 98%) of the random variation in feeding patterns of the cows was due to individual differences between cows. It can be concluded that cows develop consistent feeding and drinking patterns over time that are characteristic for each individual cow. Based on this consistency, patterns of feeding and drinking activities have valuable potential for purposes of monitoring and decision making in individual control management systems. Use of a Weibull distribution to describe the population of intervals between meals increased the statistical fit, predicted biologically relevant starting probabilities, and estimated meal criteria that were closer to what has been published by others.
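The Weibull fit to intervals between meals can be sketched with a generic maximum-likelihood estimator (bisection on the standard shape equation; this is a textbook procedure, not necessarily the authors' exact method):

```python
import numpy as np

def weibull_mle(x, lo=0.05, hi=20.0, iters=80):
    """MLE of Weibull shape and scale via bisection on the shape equation."""
    x = np.asarray(x, dtype=float)
    logx = np.log(x)

    def score(k):
        # the shape MLE solves score(k) = 0; score is increasing in k
        xk = x**k
        return (xk * logx).sum() / xk.sum() - 1.0 / k - logx.mean()

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    scale = (x**k).mean() ** (1.0 / k)
    return k, scale

# intervals drawn from a known Weibull via inverse-CDF sampling
rng = np.random.default_rng(0)
u = rng.uniform(size=5000)
samples = 30.0 * (-np.log(u)) ** (1.0 / 1.5)   # shape 1.5, scale 30 min
print(weibull_mle(samples))                    # close to (1.5, 30.0)
```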

  8. Automatic identification of fault surfaces through Object Based Image Analysis of a Digital Elevation Model in the submarine area of the North Aegean Basin

    NASA Astrophysics Data System (ADS)

    Argyropoulou, Evangelia

    2015-04-01

    The current study focused on the seafloor morphology of the North Aegean Basin in Greece, through Object Based Image Analysis (OBIA) using a Digital Elevation Model. The goal was the automatic extraction of morphologic and morphotectonic features, culminating in fault surface extraction. An OBIA approach was developed based on the bathymetric data, and the features extracted on morphological criteria were compared with the corresponding landforms derived through tectonic analysis. A digital elevation model of 150 m spatial resolution was used. First, slope, profile curvature, and percentile were extracted from this bathymetry grid. The OBIA approach was developed within the eCognition environment. Four segmentation levels were created, with "level 4" as the target. At level 4, the final classes of geomorphological features were classified: discontinuities, fault-like features and fault surfaces. On previous levels, additional landforms were also classified, such as the continental platform and continental slope. The results of the developed approach were evaluated in two ways. First, classification stability measures were computed within eCognition. Then, the results were compared qualitatively and quantitatively with a reference tectonic map created manually from the analysis of seismic profiles. The results of this comparison were satisfactory, which confirms the validity of the developed OBIA approach.

  9. Automatic Versus Manual Indexing

    ERIC Educational Resources Information Center

    Vander Meulen, W. A.; Janssen, P. J. F. C.

    1977-01-01

    A comparative evaluation of results in terms of recall and precision from queries submitted to systems with automatic and manual subject indexing. Differences were attributed to query formulation. The effectiveness of automatic indexing was found equivalent to manual indexing. (Author/KP)

  10. Automatic Test Program Generation.

    DTIC Science & Technology

    1978-03-01

    presents a test description language, NOPAL, in which a user may describe diagnostic tests, and a software system which automatically generates test...programs for automatic test equipment based on the descriptions of tests. The software system accepts as input the tests specified in NOPAL, performs

  11. Keystone feasibility study. Final report. Vol. 4

    SciTech Connect

    Not Available

    1982-12-01

    Volume four of the Keystone coal-to-methanol project includes the following: (1) project management; (2) economic and financial analyses; (3) market analysis; (4) process licensing and agreements; and (5) appendices. 24 figures, 27 tables.

  12. Automatic and integrated micro-enzyme assay (AIμEA) platform for highly sensitive thrombin analysis via an engineered fluorescence protein-functionalized monolithic capillary column.

    PubMed

    Lin, Lihua; Liu, Shengquan; Nie, Zhou; Chen, Yingzhuang; Lei, Chunyang; Wang, Zhen; Yin, Chao; Hu, Huiping; Huang, Yan; Yao, Shouzhuo

    2015-04-21

    Nowadays, large-scale screening for enzyme discovery, engineering, and drug discovery processes requires simple, fast, and sensitive enzyme activity assay platforms with high integration and potential for high-throughput detection. Herein, a novel automatic and integrated micro-enzyme assay (AIμEA) platform was proposed based on a unique microreaction system fabricated from an engineered green fluorescent protein (GFP)-functionalized monolithic capillary column, with thrombin as an example. The recombinant GFP probe was rationally engineered to possess a His-tag and a substrate sequence of thrombin, which enable it to be immobilized on the monolith via metal affinity binding and to be released after thrombin digestion. Combined with capillary electrophoresis-laser-induced fluorescence (CE-LIF), all the procedures, including thrombin injection, online enzymatic digestion in the microreaction system, and label-free detection of the released GFP, were integrated in a single electrophoretic process. By taking advantage of the ultrahigh loading capacity of the AIμEA platform and the CE automatic programming setup, one microreaction column was sufficient for many digestions without replacement. The novel microreaction system showed significantly enhanced catalytic efficiency, about 30-fold higher than that of the equivalent bulk reaction. Accordingly, the AIμEA platform was highly sensitive, with a limit of detection down to 1 pM of thrombin. Moreover, the AIμEA platform was robust and reliable in detecting thrombin in human serum samples and its inhibition by hirudin. Hence, this AIμEA platform exhibits great potential for high-throughput analysis in future biological applications, disease diagnostics, and drug screening.

  13. On the implementation of automatic differentiation tools.

    SciTech Connect

    Bischof, C. H.; Hovland, P. D.; Norris, B.; Mathematics and Computer Science; Aachen Univ. of Technology

    2008-01-01

    Automatic differentiation is a semantic transformation that applies the rules of differential calculus to source code. It thus transforms a computer program that computes a mathematical function into a program that computes the function and its derivatives. Derivatives play an important role in a wide variety of scientific computing applications, including numerical optimization, solution of nonlinear equations, sensitivity analysis, and nonlinear inverse problems. We describe the forward and reverse modes of automatic differentiation and provide a survey of implementation strategies. We describe some of the challenges in the implementation of automatic differentiation tools, with a focus on tools based on source transformation. We conclude with an overview of current research and future opportunities.
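The forward mode described above can be illustrated with dual numbers, a minimal operator-overloading sketch of the same rules that source-transformation tools apply to program text (the class and helper below are illustrative, not any particular tool's API):

```python
# Minimal forward-mode automatic differentiation via dual numbers: each value
# carries its derivative, and arithmetic applies the calculus rules.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot   # value and derivative part

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule applied automatically
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and df/dx at x in one forward pass."""
    y = f(Dual(x, 1.0))
    return y.val, y.dot

f = lambda x: x * x + 3 * x + 1
print(derivative(f, 2.0))   # (11.0, 7.0): f(2) = 11, f'(2) = 2*2 + 3 = 7
```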

  14. Composite materials: Fatigue and fracture. Vol. 3

    NASA Technical Reports Server (NTRS)

    O'Brien, T. K. (Editor)

    1991-01-01

    The present volume discusses topics in the fields of matrix cracking and delamination, interlaminar fracture toughness, delamination analysis, strength and impact characteristics, and fatigue and fracture behavior. Attention is given to cooling rate effects in carbon-reinforced PEEK, the effect of porosity on flange-web corner strength, mode II delamination in toughened composites, the combined effect of matrix cracking and free edge delamination, and a 3D stress analysis of plain weave composites. Also discussed are the compression behavior of composites, damage-based notched-strength modeling, fatigue failure processes in aligned carbon-epoxy laminates, and the thermomechanical fatigue of a quasi-isotropic metal-matrix composite.

  15. Automatic Fibrosis Quantification By Using a k-NN Classificator

    DTIC Science & Technology

    2001-10-25

    Fluthrope, “Stages in fiber breakdown in Duchenne muscular dystrophy,” J. Neurol. Sci., vol. 24, pp. 179–186, 1975. [6] F. Cornelio and I. Dones, “Muscle...pp. 694–701, 1984. [7] A.E.H. Emery, Duchenne muscular dystrophy, 2nd ed, Oxford University Press, 1993. [8] A.T.M. Hageman, F.J.M. Gabreels, and...an automatic algorithm to measure fibrosis in muscle sections of mdx mice, a mutant species used as a model of Duchenne dystrophy. The algorithm
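The k-NN classification step named in the title can be sketched generically; the distance metric, k, and the toy 2-D features below are assumptions for illustration (the report's actual features come from stained muscle sections):

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Classify a query point by majority vote among its k nearest neighbours."""
    d = np.linalg.norm(train_X - query, axis=1)   # Euclidean distances
    nearest = train_y[np.argsort(d)[:k]]          # labels of the k nearest
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]              # majority vote

# toy 2-D features: "fibrotic" samples cluster away from "muscle" samples
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.15, 0.15],
              [0.9, 0.8], [0.8, 0.9], [0.85, 0.85]])
y = np.array([0, 0, 0, 1, 1, 1])                  # 0 = muscle, 1 = fibrosis
print(knn_predict(X, y, np.array([0.12, 0.18])))  # 0
print(knn_predict(X, y, np.array([0.88, 0.84])))  # 1
```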

  16. Youth Studies Abstracts. Vol. 4 No. 1.

    ERIC Educational Resources Information Center

    Youth Studies Abstracts, 1985

    1985-01-01

    This volume contains abstracts of 76 projects (most of which were conducted in Australia and New Zealand) concerned with programs for youth and with social and educational developments affecting youth. The abstracts are arranged in the following two categories: (1) Social and Educational Developments: Policy, Analysis, Research; and (2) Programs:…

  17. TESL Reporter, Vol. 10, No. 4.

    ERIC Educational Resources Information Center

    Pack, Alice C., Ed.

    This issue contains the following articles: "Providing Practice Teaching through Peer Teaching: A Realistic Approach," by Ted Plaister; "Repetition within a Fun Context," by Emilio G. Cortez; "Sector Analysis and Working Sentences," by Lynn Henrichsen; "The TESL Teacher and English Prefixes," by Mohammed Ali…

  18. Risk assessment and economic impact analysis of the implementation of new European legislation on radiopharmaceuticals in Italy: the case of the new monograph chapter Compounding of Radiopharmaceuticals (PHARMEUROPA, Vol. 23, No. 4, October 2011).

    PubMed

    Chitto, Giuseppe; Di Domenico, Elvira; Gandolfo, Patrizia; Ria, Francesco; Tafuri, Chiara; Papa, Sergio

    2013-12-01

    An assessment of the new monograph chapter Compounding of Radiopharmaceuticals has been conducted on the basis of the first period of implementation of Italian legislation on Good Radiopharmaceuticals Practice (NBP) in the preparation of radiopharmaceuticals, in keeping with the Decree by the Italian Ministry of Health dated March 30, 2005. This approach is well grounded in the several points of similarity between the two sets of regulations. The impact on patient risk, staff risk, and healthcare organization risk has been assessed. At the same time, the actual costs of coming into compliance with regulations have been estimated. A change risk analysis has been performed through the identification of healthcare-associated risks, the analysis and measurement of the likelihood of occurrence and of the potential impact in terms of patient harm and staff harm, and the determination of the healthcare organization's controlling capability. In order to evaluate the economic impact, the expenses directly related to the implementation of the activities as per ministerial decree have been estimated after calculating the overall costs unrelated to NBP implementation. The resulting costs have then been averaged over the total number of patient services delivered. NBP implementation shows an extremely positive impact on risk management for both patients receiving Nuclear Medicine services and the healthcare organization. For healthcare workers, by contrast, the implementation of these regulations increases the risk of greater exposure while strengthening the defense against litigation. The economic impact analysis of NBP implementation shows a 34% increase in the costs for a single patient service. The implementation of the ministerial decree allows for greater detectability of and control over a number of critical elements, paving the way for risk management and minimization. We, therefore, believe that the proposed tool can provide basic

  19. 3. Photocopy from Western Architect, Vol. 19, No. 8, August ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Photocopy from Western Architect, Vol. 19, No. 8, August 1913, following page 80. 'TOWN AND COMMUNITY PLANNING, WALTER BURLEY GRIFFEN.' ORIGINAL PRESENTATION DRAWING AT NORTHWESTERN UNIVERSITY, ART DEPARTMENT. - Joshua G. Melson House, 56 River Heights Drive, Mason City, Cerro Gordo County, IA

  20. 14. Photocopy of engraving from History of Westchester County, Vol. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. Photocopy of engraving from History of Westchester County, Vol. 2, by L.E. Preston & Company, Philadelphia, 1886 ALEXANDER SMITH AND SONS CARPET COMPANY, DETAIL, SPINNING AND PRINT MILLS, - Moquette Row Housing, Moquette Row North & Moquette Row South, Yonkers, Westchester County, NY

  1. 13. Photocopy of engraving from History of Westchester County, Vol. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. Photocopy of engraving from History of Westchester County, Vol. 2, by J. Thomas Scharf, published by L.E. Preston & Company, Philadelphia, 1886 ALEXANDER SMITH AND SONS CARPET COMPANY, MOQUETTE MILLS, WEAVING MILLS, SPINNING AND PRINT MILLS - Moquette Row Housing, Moquette Row North & Moquette Row South, Yonkers, Westchester County, NY

  2. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, 1873 (PL XX); illustration used by eminent British textile engineer to exemplify the ultimate development in American cotton mill technology. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  3. Stratified object-based image analysis of high-res laser altimetry data for semi-automatic geomorphological mapping in an alpine area

    NASA Astrophysics Data System (ADS)

    Anders, Niels S.; Seijmonsbergen, Arie C.; Bouten, Willem

    2010-05-01

    Classic geomorphological mapping is gradually being replaced by (semi-)automated techniques to rapidly obtain geomorphological information in remote, steep and/or forested areas. To ensure a high accuracy of these semi-automated maps, there is a need to optimize automated mapping procedures. Within this context, we present a novel approach to semi-automatically map alpine geomorphology using a stratified object-based image analysis approach, in contrast to traditional object-based image analysis. We used a 1 m 'Light Detection And Ranging' (LiDAR) Digital Terrain Model (DTM) from a mountainous area in Vorarlberg (western Austria). From the DTM, we calculated various terrain derivatives which served as input for segmentation of the DTM and object-based classification. We assessed the segmentation results by comparing the generated image objects with a reference dataset. In this way, we optimized image segmentation parameters which were used for classifying karst, glacial, fluvial and denudational landforms. To evaluate our approach, the classification results were compared with results from traditional object-based image analysis. Our results show that landform-specific segmentation parameters are needed to extract and classify alpine landforms in a step-wise manner, producing a geomorphological map with higher accuracy than maps resulting from traditional object-based image analysis. We conclude that stratified object-based image analysis of high-resolution laser altimetry data substantially improves classification results in the study area. Using this approach, geomorphological maps can be produced more accurately and efficiently than before in difficult-to-access alpine areas. A further step may be the development of specific landform segmentation/classification signatures which can be transferred and applied in other mountain regions.

  4. Automatic wire twister.

    PubMed

    Smith, J F; Rodeheaver, G T; Thacker, J G; Morgan, R F; Chang, D E; Fariss, B L; Edlich, R F

    1988-06-01

    This automatic wire twister used in surgery consists of a 6-inch needle holder attached to a twisting mechanism. The major advantage of this device is that it twists wires significantly more rapidly than the conventional manual techniques. Testing has found that the ultimate force required to disrupt the wires twisted by either the automatic wire twister or manual techniques did not differ significantly and was directly related to the number of twists. The automatic wire twister reduces the time needed for wire twisting without altering the security of the twisted wire.

  5. Direct automatic determination of bitterness and total phenolic compounds in virgin olive oil using a pH-based flow-injection analysis system.

    PubMed

    Garcia-Mesa, José A; Mateos, Raquel

    2007-05-16

    Flavor and taste are sensorial attributes of virgin olive oil (VOO) highly appreciated by consumers. Among the organoleptic properties of VOO, bitterness is related to the natural phenolic compounds present in the oil. Sensorial analysis is the official method to evaluate VOO flavor and bitterness, which requires highly specialized experts. Alternatively, methods based on physicochemical determinations could be useful for the industry. This work presents a flow-injection analysis system for the direct automatic determination of bitterness and total phenolic compounds in VOO without prior isolation, based on the spectral shift undergone by phenolic compounds upon pH variation. The system enables complete automation of the process, including dilution of the sample and its sequential injection into buffer solutions of acidic and alkaline pH. The variation of the absorbance at 274 nm showed a high correlation with bitterness and the total phenolic content of VOO, due to the close relationship between these two parameters. Thus, the proposed method determines bitterness and phenolic compounds with results similar to those from reference methods (relative errors ranging from 1% to 8% for bitterness and from 2% to 7% for phenolic compounds). The precision evaluated at two levels of both parameters ranged between 0.6% and 1.5% for bitterness and between 0.7% and 2.6% for phenolic compounds.
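
    The calibration logic behind such a pH-based method can be sketched in a few lines: fit the change in absorbance at 274 nm against a reference bitterness index, then predict unknowns from the fit. All numbers and the helper name `predict_bitterness` are invented for illustration; they are not the paper's data.

```python
import numpy as np

# Hypothetical calibration data: change in absorbance at 274 nm (dA274)
# versus a reference bitterness index for a set of VOO standards.
dA274 = np.array([0.05, 0.12, 0.21, 0.33, 0.44])
bitterness_ref = np.array([1.0, 2.1, 3.9, 6.2, 8.1])

# Least-squares linear calibration: bitterness = slope * dA274 + intercept
slope, intercept = np.polyfit(dA274, bitterness_ref, 1)

def predict_bitterness(da):
    """Predict the bitterness of an unknown sample from its dA274 reading."""
    return slope * da + intercept

# Relative error of the fit against the calibration points themselves
pred = predict_bitterness(dA274)
rel_err = np.abs(pred - bitterness_ref) / bitterness_ref * 100
```

    In practice the calibration would be validated against the official sensory panel values, as the reference methods in the abstract are.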

  6. Application of an automatic thermal desorption-gas chromatography-mass spectrometry system for the analysis of polycyclic aromatic hydrocarbons in airborne particulate matter.

    PubMed

    Gil-Moltó, J; Varea, M; Galindo, N; Crespo, J

    2009-02-27

    The application of the thermal desorption (TD) method coupled with gas chromatography-mass spectrometry (GC-MS) to the analysis of aerosol organics has been the focus of many studies in recent years. This technique overcomes the main drawbacks of the solvent extraction approach such as the use of large amounts of toxic organic solvents and long and laborious extraction processes. In this work, the application of an automatic TD-GC-MS instrument for the determination of particle-bound polycyclic aromatic hydrocarbons (PAHs) is evaluated. This device offers the advantage of allowing the analysis of either gaseous or particulate organics without any modification. Once the thermal desorption conditions for PAH extraction were optimised, the method was verified on NIST standard reference material (SRM) 1649a urban dust, showing good linearity, reproducibility and accuracy for all target PAHs. The method has been applied to PM10 and PM2.5 samples collected on quartz fibre filters with low volume samplers, demonstrating its capability to quantify PAHs when only a small amount of sample is available.

  7. Automatic registration of satellite imagery

    NASA Technical Reports Server (NTRS)

    Fonseca, Leila M. G.; Costa, Max H. M.; Manjunath, B. S.; Kenney, C.

    1997-01-01

    Image registration is one of the basic image processing operations in remote sensing. With the increase in the number of images collected every day from different sensors, automated registration of multi-sensor/multi-spectral images has become an important issue. A wide range of registration techniques has been developed for many different types of applications and data. The objective of this paper is to present an automatic registration algorithm which uses a multiresolution analysis procedure based upon the wavelet transform. The procedure is completely automatic and relies on the grey-level information content of the images and their local wavelet transform modulus maxima. The registration algorithm is simple and easy to apply because it needs essentially only one parameter. We have obtained very encouraging results on test data sets from TM and SPOT sensor images of forest, urban and agricultural areas.
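
    The core idea, matching images on the strength of their local wavelet detail rather than raw intensities, can be caricatured with NumPy alone. This toy uses Haar-like first differences as a one-level wavelet detail and a brute-force search over integer translations; the actual algorithm works across resolutions and matches modulus maxima as features, so everything below is a simplified stand-in.

```python
import numpy as np

def haar_modulus(img):
    """One-level Haar-like wavelet details: horizontal and vertical
    differences combined into a modulus (edge-strength) map."""
    dx = np.diff(img, axis=1, prepend=img[:, :1])
    dy = np.diff(img, axis=0, prepend=img[:1, :])
    return np.hypot(dx, dy)

def estimate_shift(ref, mov, max_shift=5):
    """Brute-force search for the integer translation that best aligns
    the wavelet-modulus maps of two images."""
    m_ref, m_mov = haar_modulus(ref), haar_modulus(mov)
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(m_mov, dy, axis=0), dx, axis=1)
            score = np.sum(m_ref * shifted)  # overlap of edge maps
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

# Synthetic test: a bright square, and the same square shifted by (2, 3)
ref = np.zeros((32, 32)); ref[10:20, 10:20] = 1.0
mov = np.roll(np.roll(ref, -2, axis=0), -3, axis=1)
```

    Matching on edge maps rather than intensities is what makes this kind of scheme usable across sensors with different radiometry, which is the multi-sensor motivation in the abstract.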

  8. Treatment of Automatically Reinforced Object Mouthing with Noncontingent Reinforcement and Response Blocking: Experimental Analysis and Social Validation.

    ERIC Educational Resources Information Center

    Carr, James E.; Dozier, Claudia L.; Patel, Meeta R.; Adams, Amanda Nicolson; Martin, Nichelle

    2002-01-01

    A brief functional analysis indicated that the object mouthing of a young girl diagnosed with autism was maintained independent of social consequences. Separate and combined effects of response blocking and non-contingent reinforcement were then evaluated as treatments. Although both interventions were unsuccessful when implemented separately,…

  9. Automatic image analysis and spot classification for detection of fruit fly infestation in hyperspectral images of mangoes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An algorithm has been developed to identify spots generated in hyperspectral images of mangoes infested with fruit fly larvae. The algorithm incorporates background removal, application of a Gaussian blur, thresholding, and particle count analysis to identify locations of infestations. Each of the f...
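
    The pipeline named in the abstract (background removal, Gaussian blur, thresholding, particle count) can be sketched end to end with NumPy plus the standard library. A 3x3 mean blur stands in for the Gaussian blur, and all thresholds, sizes, and the function name `count_spots` are invented for illustration.

```python
import numpy as np
from collections import deque

def count_spots(img, thresh=0.5, min_size=4):
    """Toy version of the described pipeline: subtract the median as a
    crude background removal, apply a 3x3 mean blur, threshold, and
    count connected particles whose pixel area is at least min_size."""
    bg = img - np.median(img)                 # background removal
    pad = np.pad(bg, 1, mode="edge")          # 3x3 mean blur via shifts
    h, w = img.shape
    blur = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    mask = blur > thresh
    seen = np.zeros_like(mask, dtype=bool)
    count = 0
    for y, x in zip(*np.nonzero(mask)):       # BFS connected components
        if seen[y, x]:
            continue
        queue, size = deque([(y, x)]), 0
        seen[y, x] = True
        while queue:
            cy, cx = queue.popleft()
            size += 1
            for ny, nx in ((cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        if size >= min_size:
            count += 1
    return count

# Synthetic single-band image with two bright "infestation" spots
img = np.zeros((64, 64))
img[10:16, 10:16] = 2.0
img[40:48, 30:38] = 2.0
n_spots = count_spots(img)
```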

  10. Automatic switching matrix

    DOEpatents

    Schlecht, Martin F.; Kassakian, John G.; Caloggero, Anthony J.; Rhodes, Bruce; Otten, David; Rasmussen, Neil

    1982-01-01

    An automatic switching matrix that includes an apertured matrix board containing a matrix of wires that can be interconnected at each aperture. Each aperture has associated therewith a conductive pin which, when fully inserted into the associated aperture, effects electrical connection between the wires within that particular aperture. Means is provided for automatically inserting the pins in a determined pattern and for removing all the pins to permit other interconnecting patterns.

  11. Semi-automatic volumetrics system to parcellate ROI on neocortex

    NASA Astrophysics Data System (ADS)

    Tan, Ou; Ichimiya, Tetsuya; Yasuno, Fumihiko; Suhara, Tetsuya

    2002-05-01

    A template-based, semi-automatic volumetrics system, BrainVol, is built to divide any given patient brain into neocortical and sub-cortical regions. The standard regions are given as standard ROIs drawn on a standard brain volume. After normalization between the standard MR image and the patient MR image, the sub-cortical ROI boundaries are refined based on gray matter. The neocortical ROIs are refined using sulcus information that is semi-automatically marked on the patient brain. The segmentation is then applied to the 4D PET image of the same patient for calculation of the TAC (time activity curve) by co-registration between MR and PET.

  12. Using image analysis and ArcGIS® to improve automatic grain boundary detection and quantify geological images

    NASA Astrophysics Data System (ADS)

    DeVasto, Michael A.; Czeck, Dyanna M.; Bhattacharyya, Prajukti

    2012-12-01

    Geological images, such as photos and photomicrographs of rocks, are commonly used as supportive evidence to indicate geological processes. A limiting factor to quantifying images is the digitization process; therefore, image analysis has remained largely qualitative. ArcGIS®, the most widely used Geographic Information System (GIS) available, is capable of an array of functions, including building models capable of digitizing images. We expanded upon a previously designed model built using Arc ModelBuilder® to quantify photomicrographs and scanned images of thin sections. In order to enhance grain boundary detection but limit computer processing and hard drive space, we utilized a preprocessing image analysis technique such that only a single image is used in the digitizing model. Preprocessing allows the model to accurately digitize grain boundaries with fewer images and requires less user intervention by using batch processing in image analysis software and ArcCatalog®. We present case studies for five basic textural analyses performed on semi-automatically digitized images and quantified in ArcMap®. Grain size distributions, shape preferred orientations, weak-phase connections (networking), and nearest-neighbor statistics are presented in a simplified fashion for further analyses directly obtainable from the automated digitizing method. Finally, we discuss the ramifications of incorporating this method into geological image analyses.

  13. QSAR study and VolSurf characterization of anti-HIV quinolone library

    NASA Astrophysics Data System (ADS)

    Filipponi, Enrica; Cruciani, Gabriele; Tabarrini, Oriana; Cecchetti, Violetta; Fravolini, Arnaldo

    2001-03-01

    Antiviral quinolones are promising compounds in the search for new therapeutically effective agents for the treatment of AIDS. To rationalize the SAR for this interesting new class of anti-HIV derivatives, we performed a 3D-QSAR study on a library of 101 6-fluoro- and 6-desfluoroquinolones, taken either from the literature or synthesized by us. The chemometric procedure involved a fully semiempirical minimization of the molecular structures by the AMSOL program, which takes the solvation effect into account, and their 3D characterization by the VolSurf/GRID program. The QSAR analysis, based on PCA and PLS methods, shows the key structural features responsible for the antiviral activity.

  14. Evaluating the reforested area for the municipality of Buri by automatic analysis of LANDSAT imagery. [Sao Paulo, Brazil

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Lee, D. C. L.; Filho, R. H.; Shimabukuro, Y. E.

    1979-01-01

    The author has identified the following significant results. The class of reforestation (Pinus, Eucalyptus, Araucaria) was defined using interactive image analysis (I-100) and LANDSAT MSS data. Estimates of class area by the I-100 were compared with data supplied by the forestry institute in Sao Paulo. LANDSAT channels 4 and 5 served to differentiate the Pinus, Eucalyptus, and Araucaria from the other trees. Channels 6 and 7 gave the best results for differentiating between the classes. A good representative spectral response was obtained for Araucaria on these two channels. The small relative differences obtained were +4.24% for Araucaria, -7.51% for Pinus, and -32.07% for Eucalyptus.

  15. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and the noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
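
    Once the major and minor axial lengths are measured, the volume step is a one-line formula. A common convention for tumor spheroids treats them as prolate spheroids, V = (π/6)·L·W² with full axial lengths L and W; the abstract does not state SpheroidSizer's exact formula, so this is an assumption for illustration.

```python
import math

def spheroid_volume(major_len, minor_len):
    """Volume of a prolate spheroid from its measured full axial lengths
    (not semi-axes): V = (pi/6) * L * W**2. One common convention for
    tumor spheroids; the tool's exact formula may differ."""
    return math.pi / 6.0 * major_len * minor_len ** 2

# Sanity check: equal axes of length 2 give a unit-radius sphere, V = 4*pi/3
v = spheroid_volume(2.0, 2.0)
```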

  16. Automatic tracking sensor camera system

    NASA Astrophysics Data System (ADS)

    Tsuda, Takao; Kato, Daiichiro; Ishikawa, Akio; Inoue, Seiki

    2001-04-01

    We are developing a sensor camera system for automatically tracking and determining the positions of subjects moving in three dimensions. The system is intended to operate even within areas as large as soccer fields. The system measures the 3D coordinates of the object while driving the pan and tilt movements of the camera heads and the degree of zoom of the lenses. Its principal feature is that it automatically zooms in as the object moves farther away and out as the object moves closer, keeping the area of the object a fixed proportion of the image. This feature makes stable detection by the image processing possible. We are planning to use the system to detect the position of a soccer ball during a soccer game. In this paper, we describe the configuration of the automatic tracking sensor camera system under development. We then give an analysis of the movements of the ball within images of games, the results of experiments on the image processing method used to detect the ball, and the results of other experiments to verify the accuracy of an experimental system. These results show that the system is sufficiently accurate in terms of obtaining positions in three dimensions.
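
    The zoom rule described here follows directly from the pinhole camera model: image size on the sensor is f·S/d for focal length f, object size S, and distance d, so keeping the image size fixed requires f to grow in proportion to d. A sketch with invented sensor numbers (the pixel pitch and target size are assumptions, not values from the paper):

```python
def focal_length_for_constant_size(object_size_m, distance_m, target_px,
                                   pixel_pitch_m=4e-6):
    """Pinhole-model auto-zoom rule: to keep an object of physical size S
    at a fixed image size in pixels, solve image_size = f*S/d for f,
    giving f = image_size * d / S. Pixel pitch is an assumed parameter."""
    target_m = target_px * pixel_pitch_m          # desired size on the sensor
    return target_m * distance_m / object_size_m  # f = s * d / S

# Doubling the distance doubles the required focal length
f1 = focal_length_for_constant_size(0.22, 20.0, 100)  # soccer ball at 20 m
f2 = focal_length_for_constant_size(0.22, 40.0, 100)  # same ball at 40 m
```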

  17. A three-dimensional quantitative analysis of restenosis parameters after balloon angioplasty: comparison between semi-automatic computer-assisted planimetry and stereology.

    PubMed

    Salu, Koen J; Knaapen, Michiel W M; Bosmans, Johan M; Vrints, Chris J; Bult, Hidde

    2002-01-01

    Semi-automatic computer-assisted planimetry is often used for the quantification of restenosis parameters after balloon angioplasty although it is a time-consuming method. Moreover, slicing the artery to enable analysis of two-dimensional (2-D) images leads to a loss of information since the vessel structure is three-dimensional (3-D). Cavalieri's principle uses systematic random sampling allowing 3-D quantification. This study compares the accuracy and efficiency of planimetry versus point-counting measurements on restenosis parameters after balloon angioplasty and investigates the use of Cavalieri's principle for 3-D volume quantification. Bland and Altman plots showed good agreement between planimetry and point counting for the 2-D and 3-D quantification of lumen, internal elastic lamina (IEL) and external elastic lamina (EEL), with a slightly smaller agreement for intima and media. Mean values and induced coefficients of variation were similar for both methods for all parameters. Point counting induced a 6% error in its 3-D quantification, which is negligible in view of the biological variation (>90%) among animals. However, point counting was 3 times faster compared to planimetry, improving its efficiency. This study shows that combining Cavalieri's principle with point counting is a precise and efficient method for the 3-D quantification of restenosis parameters after balloon angioplasty.
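
    The Cavalieri estimator combined with point counting reduces 3-D volume quantification to arithmetic: each systematically sampled section's area is estimated as (points hitting the structure) x (area per grid point), and the volume is the slice spacing times the summed areas. The numbers below are invented for illustration, not data from the study.

```python
def cavalieri_volume(point_counts, grid_area, slice_spacing):
    """Cavalieri estimator with point counting: area of each sampled
    section = hit count * area represented by one grid point; volume =
    slice spacing * sum of section areas."""
    areas = [p * grid_area for p in point_counts]
    return slice_spacing * sum(areas)

# Hypothetical numbers: 5 systematic sections, 0.01 mm^2 per grid point,
# sections 0.3 mm apart -> intima volume estimate in mm^3
v = cavalieri_volume([12, 18, 25, 17, 9], grid_area=0.01, slice_spacing=0.3)
```

    The efficiency gain reported in the abstract comes from counting a few dozen grid hits per section instead of tracing every boundary by planimetry.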

  18. Automatic flow analysis method to determine traces of Mn²⁺ in sea and drinking waters by a kinetic catalytic process using LWCC-spectrophotometric detection.

    PubMed

    Chaparro, Laura; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2016-02-01

    A new automatic kinetic catalytic method has been developed for the measurement of Mn(2+) in drinking and seawater samples. The method is based on the catalytic effect of Mn(2+) on the oxidation of tiron by hydrogen peroxide in the presence of Pb(2+) as an activator. The optimum conditions were obtained at pH 10 with 0.019 mol L(-1) 2,2'-bipyridyl, 0.005 mol L(-1) tiron and 0.38 mol L(-1) hydrogen peroxide. The flow system is based on multisyringe flow injection analysis (MSFIA) coupled with a lab-on-valve (LOV) device, exploiting on-line spectrophotometric detection with a Liquid Waveguide Capillary Cell (LWCC) of 1 m optical path length, performed at 445 nm. Under the conditions optimized by a multivariate approach, the method allowed the measurement of Mn(2+) in a range of 0.03-35 µg L(-1) with a detection limit of 0.010 µg L(-1), attaining a repeatability of 1.4% RSD. The method was satisfactorily applied to the determination of Mn(2+) in environmental water samples. The reliability of the method was also verified by determining the manganese content of the certified standard reference seawater sample CASS-4.

  19. A rapid automatic processing platform for bead label-assisted microarray analysis: application for genetic hearing-loss mutation detection.

    PubMed

    Zhu, Jiang; Song, Xiumei; Xiang, Guangxin; Feng, Zhengde; Guo, Hongju; Mei, Danyang; Zhang, Guohao; Wang, Dong; Mitchelson, Keith; Xing, Wanli; Cheng, Jing

    2014-04-01

    Molecular diagnostics using microarrays are increasingly being used in clinical diagnosis because of their high throughput, sensitivity, and accuracy. However, standard microarray processing takes several hours and involves manual steps during hybridization, slide clean up, and imaging. Here we describe the development of an integrated platform that automates these individual steps as well as significantly shortens the processing time and improves reproducibility. The platform integrates such key elements as a microfluidic chip, flow control system, temperature control system, imaging system, and automated analysis of clinical results. Bead labeling of microarray signals required a simple imaging system and allowed continuous monitoring of the microarray processing. To demonstrate utility, the automated platform was used to genotype hereditary hearing-loss gene mutations. Compared with conventional microarray processing procedures, the platform increases the efficiency and reproducibility of hybridization, speeding microarray processing through to result analysis. The platform also continuously monitors the microarray signals, which can be used to facilitate optimization of microarray processing conditions. In addition, the modular design of the platform lends itself to development of simultaneous processing of multiple microfluidic chips. We believe the novel features of the platform will benefit its use in clinical settings in which fast, low-complexity molecular genetic testing is required.

  20. Histological analysis of tissue structures of the internal organs of steppe tortoises following their exposure to spaceflight conditions while circumnavigating the moon aboard the Zond-7 automatic station

    NASA Technical Reports Server (NTRS)

    Sutulov, L. S.; Sutulov, Y. L.; Trukhina, L. V.

    1975-01-01

    Tortoises flown around the Moon on the 6-1/2 day voyage of the Zond-7 automatic space station evidently did not suffer any pathological changes to their peripheral blood picture, heart, lungs, intestines, or liver.

  1. A radar-based regional extreme rainfall analysis to derive the thresholds for a novel automatic alert system in Switzerland

    NASA Astrophysics Data System (ADS)

    Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis

    2016-06-01

    This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here is aimed at issuing heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as, for example, small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds some predefined thresholds, but also as soon as the sum of past and forecast precipitation is larger than the threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as the predicted one, such as early warning systems for urban floods. The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of
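
    The threshold-derivation step rests on the standard GEV return-level formula: for block maxima fitted with location mu, scale sigma, and shape xi, the level exceeded on average once every T blocks is z_T = mu + (sigma/xi)·[(-ln(1-1/T))^(-xi) - 1]. The parameter values below are invented for illustration, not fitted to the Swiss radar data.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Return level z_T for return period T (in units of the block-maxima
    interval, here months) from GEV parameters (mu, sigma, xi).
    Uses the Gumbel limit when xi is effectively zero."""
    p = 1.0 / T                        # per-block exceedance probability
    if abs(xi) < 1e-12:                # Gumbel (xi -> 0) limit
        return mu - sigma * math.log(-math.log(1.0 - p))
    return mu + sigma / xi * ((-math.log(1.0 - p)) ** (-xi) - 1.0)

# Illustrative parameters for hourly-rainfall monthly maxima (mm):
# mu=20, sigma=8, xi=0.1; levels grow with the return period.
levels = [gev_return_level(20.0, 8.0, 0.1, T) for T in (2, 6, 12)]
```

    In an alert system these levels become the per-region, per-duration thresholds against which past-plus-forecast precipitation totals are compared.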

  2. An interdisciplinary analysis of multispectral satellite data for selected cover types in the Colorado Mountains, using automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has reported the following significant results. A data set containing SKYLAB, LANDSAT, and topographic data has been overlayed, registered, and geometrically corrected to a scale of 1:24,000. After geometrically correcting both sets of data, the SKYLAB data were overlayed on the LANDSAT data. Digital topographic data were then obtained and reformatted, and a data channel containing elevation information was digitally overlayed onto the LANDSAT and SKYLAB spectral data. The 14,039 square kilometers, involving 2,113,776 LANDSAT pixels, represent a relatively large data set available for digital analysis. The overlayed data set enables investigators to numerically analyze and compare two sources of spectral data and topographic data from any point in the scene. This capability is new, and it will permit a numerical comparison of spectral response with elevation, slope, and aspect. Utilization of the spectral and topographic data together to obtain more accurate classifications of the various cover types present is feasible.

  3. Determination of free and total sulfites in wine using an automatic flow injection analysis system with voltammetric detection.

    PubMed

    Goncalves, Luis Moreira; Grosso Pacheco, Joao; Jorge Magalhaes, Paulo; Antonio Rodrigues, Jose; Araujo Barros, Aquiles

    2010-02-01

    An automated flow injection analysis (FIA) system, based on an initial analyte separation by gas-diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell, was developed for the determination of total and free sulfur dioxide (SO(2)) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and a simplified method commonly used by the wine industry). The developed method displayed good repeatability (RSD lower than 6%) and linearity (between 10 and 250 mg l(-1)) as well as a suitable LOD (3 mg l(-1)) and LOQ (9 mg l(-1)). A major advantage of this system is that SO(2) is directly detected by flow SWV.
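
    The figures of merit quoted here (LOD, LOQ, linear range) come from the calibration. A common convention estimates LOD as 3·s/slope and LOQ as 10·s/slope, where s is the blank signal's standard deviation; the paper's own convention may differ, since its reported LOQ is exactly 3x its LOD. The function name and numbers below are illustrative assumptions.

```python
def detection_limits(blank_sd, slope):
    """IUPAC-style estimates from a linear calibration: LOD = 3*s/slope,
    LOQ = 10*s/slope, where blank_sd is the standard deviation of blank
    signals and slope is the sensitivity (signal per mg/L)."""
    lod = 3.0 * blank_sd / slope
    loq = 10.0 * blank_sd / slope
    return lod, loq

# Illustrative values chosen so the LOD matches the reported 3 mg/L
lod, loq = detection_limits(blank_sd=0.2, slope=0.2)
```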

  4. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    NASA Astrophysics Data System (ADS)

    Bainbridge, Matthew B.; Webb, John K.

    2017-01-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one `artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models and show that this procedure is more robust than a human picking a single preferred model since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process and we show using both real and simulated spectra that the unified automated fitting procedure out-performs a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the zabs = 1.8389 absorber towards the zem = 2.145 quasar J110325-264515. The derived constraint of Δα/α = 3.3 ± 2.9 × 10-6 is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.
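
    The Bayesian model averaging step can be sketched without reproducing gvpfit or vpfit internals: per-model estimates of Δα/α are combined with weights derived from an information criterion (a stand-in here for posterior model probabilities), and the averaged variance includes the between-model scatter, which is exactly the model-choice uncertainty a single "preferred model" would hide. All model numbers below are hypothetical.

```python
import math

def bma_average(estimates, uncertainties, aics):
    """Toy Bayesian model averaging: Akaike-style weights
    w_i proportional to exp(-dAIC_i / 2) combine per-model estimates,
    and the averaged variance adds within-model variance to
    between-model scatter (the model-choice term)."""
    min_aic = min(aics)
    w = [math.exp(-(a - min_aic) / 2.0) for a in aics]
    total = sum(w)
    w = [x / total for x in w]
    mean = sum(wi * e for wi, e in zip(w, estimates))
    var = sum(wi * (u ** 2 + (e - mean) ** 2)
              for wi, e, u in zip(w, estimates, uncertainties))
    return mean, math.sqrt(var)

# Three hypothetical velocity-structure models for one absorber
mean, err = bma_average([3.1e-6, 3.6e-6, 2.5e-6],
                        [2.8e-6, 3.0e-6, 3.2e-6],
                        [100.2, 101.5, 103.0])
```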

  6. Automatic Description of the Gulf Stream from IR Images Using Neural Networks.

    DTIC Science & Technology

    1990-01-01

    Reprinted from Applications of Artificial Neural Networks (SPIE Vol. 1294), 18-20 April 1990, Orlando. 'Automatic description of the Gulf Stream from IR images using neural networks,' Matthew Lybanon, Naval Oceanographic and Atmospheric Research Laboratory. Learning was by back propagation; the training set consisted of ...

  7. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

This paper presents an automated abdominal lymph node detection method to aid preoperative diagnosis in abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might harbor a metastasis. This procedure is called lymphadenectomy or lymph node dissection. Insufficient lymphadenectomy carries a high risk of relapse. However, excessive resection decreases a patient's quality of life. Therefore, it is important to identify the location and structure of lymph nodes to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the method uses a support vector machine classifier with texture and shape information. The experimental results reveal that the method detects 70.5% of the lymph nodes with 13.0 false positives per case.
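The candidate-detection stage rests on a multi-scale blob-like enhancement filter computed from local intensity structure. A minimal 2D sketch using Hessian eigenvalues is given below (the paper works on 3D CT volumes); the scale set, the normalisation, and the synthetic test image are assumptions for illustration.

```python
import numpy as np

def smooth(img, sigma):
    """Separable Gaussian smoothing with numpy only."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()
    out = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, out)

def blobness(img, sigmas=(2.0, 4.0, 6.0)):
    """Multi-scale blob-likeness from Hessian eigenvalues (2D sketch).

    A bright blob gives two large negative eigenvalues; the response is
    the scale-normalised Hessian determinant where both are negative,
    maximised over scales. The scale set is an assumption.
    """
    best = np.zeros_like(img, dtype=float)
    for s in sigmas:
        g = smooth(img.astype(float), s)
        gy, gx = np.gradient(g)
        gyy, gyx = np.gradient(gy)
        gxy, gxx = np.gradient(gx)
        tr = gxx + gyy                      # Hessian trace
        det = gxx * gyy - gxy * gyx         # Hessian determinant
        disc = np.sqrt(np.maximum(tr ** 2 / 4.0 - det, 0.0))
        l1, l2 = tr / 2.0 - disc, tr / 2.0 + disc
        resp = np.where((l1 < 0) & (l2 < 0), (s ** 4) * det, 0.0)
        best = np.maximum(best, resp)
    return best

# Synthetic test image: one bright Gaussian blob on a dark background.
yy, xx = np.mgrid[0:64, 0:64]
img = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 5.0 ** 2))
resp = blobness(img)
peak = np.unravel_index(np.argmax(resp), resp.shape)
print(peak)
```

The response peaks near the blob centre; in the paper's pipeline, local maxima of such a filter become lymph node candidates passed to the SVM stage.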

  8. Automatic shape recognition of human limbs to avoid errors due to skin marker shifting in motion analysis

    NASA Astrophysics Data System (ADS)

    Hatze, Herbert; Baca, Arnold

    1991-12-01

    A new method in human motion analysis is presented for overcoming the problem of the shifting of skin-mounted position markers relative to the skeleton. The present version of the method is based on two-dimensional video processing and involves the recording of subjects wearing special clothing. The clothing is designed in such a way as to permit the unambiguous spatial shape recognition of each of the 17 body segments by means of an edge detection algorithm. The latter and the algorithms for the computation of segment translation and rotation constitute improved versions of previously used algorithms, especially with respect to the execution times of the respective computer program on ordinary PCs. From the recognized shapes, the translation and rotation of each segment relative to its initial configuration is computed by using positional information from the previous frames. For the first frame to be analyzed, a starting algorithm has to be applied. Finally, the configurational coordinates of the body model are calculated from the respective spatial linear and angular positions.

  9. Automatic determination of insolubles in lubricating oils by flow injection analysis employing an LED-photometer detector.

    PubMed

    Pignalosa, Gustavo; Sixto, Alexandra; Knochen, Moisés

    2007-10-31

A flow injection system is presented for the determination of the insolubles content in used lubricating oil samples. The system is based on the injection of an aliquot of the sample into a stream of organic solvent, where it is dispersed, and measurement of the scattered radiation (measured as apparent absorbance) in the visible range (λ = 640 nm). An LED-based photometer was used for this purpose. The whole system, including sample injection and data acquisition, was controlled by a personal computer. Calibration curves exhibited good linearity (h = (0.415 ± 0.016)C + (0.00 ± 0.03), r² = 0.9995, 95% confidence level) in the range up to 2.68% (insolubles in pentane). Detection and quantification limits were 0.07% and 0.16% (w/w), respectively. The method was validated by analysis of 25 real samples by both the proposed method and the FTIR method, finding high correlation. Waste generation and reagent consumption are much lower than in the official method (ASTM D-893): the proposed method employs 25 mL of kerosene per sample, while the official method employs 200 mL of pentane.
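The calibration step amounts to fitting a straight line of apparent absorbance h against insolubles content C and checking linearity via r². A minimal sketch with hypothetical data (the slope is chosen near the reported 0.415; the data points themselves are invented):

```python
import numpy as np

# Hypothetical calibration data: insolubles content C (% w/w, in pentane)
# versus apparent absorbance h from scattered light at 640 nm.
C = np.array([0.00, 0.50, 1.00, 1.50, 2.00, 2.68])
h = 0.415 * C + 0.002 + np.array([0.003, -0.002, 0.001, -0.003, 0.002, -0.001])

# Ordinary least-squares line h = a*C + b, as in the calibration curve.
a, b = np.polyfit(C, h, 1)
resid = h - (a * C + b)
r2 = 1 - np.sum(resid ** 2) / np.sum((h - h.mean()) ** 2)
print(round(a, 3), round(b, 3), round(r2, 4))
```

With scatter this small, r² comes out very close to 1, matching the kind of linearity reported in the abstract.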

  10. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    NASA Astrophysics Data System (ADS)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

The main goal of the presented work was to develop a multifunctional beam composed of fiber-reinforced plastics (FRP) and an embedded optical fiber with various fiber Bragg grating sensors (FBG). These beams are developed for use as structural members for bridges or industrial applications. It is now possible to realize large-scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted so as not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analysis were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). In addition to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  11. Automatic Analysis of Retinal Vascular Parameters for Detection of Diabetes in Indian Patients with No Retinopathy Sign

    PubMed Central

    Jain, Rajeev

    2016-01-01

This study investigated the association between retinal vascular parameters and type II diabetes in an Indian population with no observable diabetic retinopathy. It introduces two new retinal vascular parameters, total number of branching angles (TBA) and average acute branching angle (ABA), as potential biomarkers of diabetes in an explanatory model. A total of 180 retinal images (two (left and right) × two (ODC and MC) × 45 subjects (13 diabetics and 32 nondiabetics)) were analysed. Stepwise linear regression analysis was performed to model the association between type II diabetes and the best subset of explanatory variables (predictors), consisting of retinal vascular parameters and patients' demographic information. The P values of the estimated coefficients (P < 0.001) indicated that, at an α level of 0.05, the newly introduced retinal vascular parameters, TBA and ABA, together with CRAE, mean tortuosity, SD of branching angle, and VB, are related to type II diabetes when there is no observable sign of retinopathy. PMID:27579347
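Stepwise linear regression selects the best subset of predictors greedily. A minimal forward-selection sketch on synthetic data follows; the variable names echo the paper's parameters, but the data, true coefficients, and the BIC stopping rule are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: columns mimic retinal parameters; the last two are noise.
n = 120
names = ["TBA", "ABA", "CRAE", "tortuosity", "noise1", "noise2"]
X = rng.normal(size=(n, len(names)))
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

def rss(cols):
    """Residual sum of squares of an OLS fit on the chosen columns."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ beta) ** 2))

def bic(cols):
    return n * np.log(rss(cols) / n) + (len(cols) + 1) * np.log(n)

# Forward stepwise selection: add the predictor that lowers BIC the most,
# stop when no addition improves BIC.
selected, remaining = [], list(range(len(names)))
while remaining:
    score, c = min((bic(selected + [c]), c) for c in remaining)
    if selected and score >= bic(selected):
        break
    selected.append(c)
    remaining.remove(c)

chosen = [names[c] for c in selected]
print(chosen)
```

On this synthetic data the three informative predictors are retained while pure-noise columns are penalised away by the BIC term.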

  12. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    PubMed

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis is explored and its possible applications are discussed. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipate heat by evaporation. Thus, the proportion of pigs in the dunging and resting areas, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The approach is based on extended state-of-the-art features in combination with a structured prediction framework built on a logistic regression solver with elastic net regularization. In addition, the method produces a probability per pixel rather than a hard decision. This overcomes some of the limitations of a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions such as shadows, poor lighting, and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that the learning-based method improves, in comparison with grey-scale methods, the ability to reliably identify the proportion of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness.
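The segmentation classifier described is a logistic regression with elastic net regularization that outputs a per-pixel probability. A stripped-down sketch on synthetic per-pixel features follows; the features, penalty strengths, and plain gradient-descent solver are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-pixel features (stand-ins for intensity/texture features)
# with binary labels: 1 = pig, 0 = background. Everything here is assumed.
n = 400
X = rng.normal(size=(n, 3))
w_true = np.array([2.0, -1.0, 0.0])
y = (X @ w_true + rng.normal(scale=0.5, size=n) > 0).astype(float)

def train_elastic_net_logreg(X, y, l1=0.01, l2=0.01, lr=0.1, steps=2000):
    """Logistic regression with an elastic-net penalty via gradient descent.

    Like the paper's classifier, it yields a probability per sample
    (per pixel) rather than a hard decision.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # predicted probabilities
        g = X.T @ (p - y) / len(y) + l2 * w + l1 * np.sign(w)
        w -= lr * g
        b -= lr * np.mean(p - y)
    return w, b

w, b = train_elastic_net_logreg(X, y)
prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))            # per-pixel probability
acc = np.mean((prob > 0.5) == y)
print(round(acc, 3))
```

Keeping the soft probabilities (rather than thresholding immediately) is what lets a downstream structured-prediction step weigh uncertain pixels.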

  13. WOLF; automatic typing program

    USGS Publications Warehouse

    Evenden, G.I.

    1982-01-01

A FORTRAN IV program for the Hewlett-Packard 1000 series computer provides for automatic typing operations and, when employed with the manufacturer's text editor, can provide a system to greatly facilitate preparation of reports, letters, and other text. The input text and embedded control data can perform nearly all of the functions of a typist. A few of the features available are centering, titles, footnotes, indentation, page numbering (including Roman numerals), automatic paragraphing, and two forms of tab operations. This documentation contains both a user and a technical description of the program.

  14. AUTOMATIC COUNTING APPARATUS

    DOEpatents

    Howell, W.D.

    1957-08-20

An apparatus for automatically recording the results of counting operations on trains of electrical pulses is described. The disadvantages of prior devices utilizing the two common methods of obtaining the count rate are overcome by this apparatus: in the case of time-controlled operation, the disclosed system automatically records any information stored by the scaler but not transferred to the printer at the end of the predetermined time-controlled operations, and, in the case of count-controlled operation, provision is made to prevent a weak sample from occupying the apparatus for an excessively long period of time.

  15. Automatic Program Synthesis Reports.

    ERIC Educational Resources Information Center

    Biermann, A. W.; And Others

Some of the major results and future goals of an automatic program synthesis project are described in the two papers that comprise this document. The first paper gives a detailed algorithm for synthesizing a computer program from a trace of its behavior. Since the algorithm involves a search, the length of time required to do the synthesis of…

  16. Automatic Language Identification

    DTIC Science & Technology

    2000-08-01

the speech utterance is hypothesized. ...better performance for his HMM approach than his static approach. Finally, Thyme-Gobbel et al. [47] have also looked... 1998. [47] A.E. Thyme-Gobbel and S.E. Hutchins. On using prosodic cues in automatic language identification. In International Conference on Spoken

  17. Automatic multiple applicator electrophoresis

    NASA Technical Reports Server (NTRS)

    Grunbaum, B. W.

    1977-01-01

    Easy-to-use, economical device permits electrophoresis on all known supporting media. System includes automatic multiple-sample applicator, sample holder, and electrophoresis apparatus. System has potential applicability to fields of taxonomy, immunology, and genetics. Apparatus is also used for electrofocusing.

  18. Automatic Transmission Vehicle Injuries

    PubMed Central

    Fidler, Malcolm

    1973-01-01

    Four drivers sustained severe injuries when run down by their own automatic cars while adjusting the carburettor or throttle linkages. The transmission had been left in the “Drive” position and the engine was idling. This accident is easily avoidable. PMID:4695693

  19. Reactor component automatic grapple

    DOEpatents

    Greenaway, Paul R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment.

  20. Automatic Thesaurus Generation for an Electronic Community System.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; And Others

    1995-01-01

    This research reports an algorithmic approach to the automatic generation of thesauri for electronic community systems. The techniques used include term filtering, automatic indexing, and cluster analysis. The Worm Community System, used by molecular biologists studying the nematode worm C. elegans, was used as the testbed for this research.…

  1. Automatically Detecting Authors’ Native Language

    DTIC Science & Technology

    2011-03-01

for detecting Chinese and Japanese, but it performed less well with Slavic and Romance languages. Empirical analysis of character trigrams also... well for detecting Chinese and Japanese, but it performed less well with Slavic and Romance languages. We also compared the overall performance... NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. THESIS: Automatically Detecting Authors' Native Language, by Charles S. Ahn, March 2011. Thesis Advisor

  2. Automatic TLI recognition system beta prototype testing

    SciTech Connect

    Lassahn, G.D.

    1996-06-01

This report describes the beta prototype automatic target recognition system ATR3 and some performance tests done with this system. This is a fully operational system with a high computational speed. It is useful for finding any kind of target in digitized image data, and as a general-purpose image analysis tool.

  3. Automatic Frequency Control For DMSK Receiver

    NASA Technical Reports Server (NTRS)

    Davarian, Faramaz; Sumida, Joe T.

    1989-01-01

Report discusses performance of the automatic frequency-control (AFC) subsystem of the differential minimum-shift-keying receiver described in "DMSK Receiver for Mobile/Satellite Service," NPO-16659. Describes efforts to quantify the behavior of the system during acquisition of the carrier signal, including a theoretical analysis leading to numerical simulation, and measurements of the performance of the receiving equipment.

  4. Automatic transmission control method

    SciTech Connect

    Hasegawa, H.; Ishiguro, T.

    1989-07-04

    This patent describes a method of controlling an automatic transmission of an automotive vehicle. The transmission has a gear train which includes a brake for establishing a first lowest speed of the transmission, the brake acting directly on a ring gear which meshes with a pinion, the pinion meshing with a sun gear in a planetary gear train, the ring gear connected with an output member, the sun gear being engageable and disengageable with an input member of the transmission by means of a clutch. The method comprises the steps of: detecting that a shift position of the automatic transmission has been shifted to a neutral range; thereafter introducing hydraulic pressure to the brake if present vehicle velocity is below a predetermined value, whereby the brake is engaged to establish the first lowest speed; and exhausting hydraulic pressure from the brake if present vehicle velocity is higher than a predetermined value, whereby the brake is disengaged.
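The claimed control steps reduce to a small decision rule. A sketch follows; the 10 km/h threshold and the command strings are hypothetical, since the patent only specifies "a predetermined value".

```python
def neutral_brake_command(shift_position: str, speed_kmh: float,
                          threshold_kmh: float = 10.0) -> str:
    """Decision rule sketched from the patent's claim.

    On a shift into neutral: below the predetermined speed, hydraulic
    pressure is introduced to the brake (engaging the first lowest speed);
    above it, pressure is exhausted and the brake disengaged. Threshold
    and command names are assumed for illustration.
    """
    if shift_position != "N":
        return "no_action"
    if speed_kmh < threshold_kmh:
        return "apply_pressure_engage_brake"
    return "exhaust_pressure_release_brake"

print(neutral_brake_command("N", 5.0))
print(neutral_brake_command("N", 40.0))
print(neutral_brake_command("D", 5.0))
```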

  5. Automatic Abstraction in Planning

    NASA Technical Reports Server (NTRS)

    Christensen, J.

    1991-01-01

    Traditionally, abstraction in planning has been accomplished by either state abstraction or operator abstraction, neither of which has been fully automatic. We present a new method, predicate relaxation, for automatically performing state abstraction. PABLO, a nonlinear hierarchical planner, implements predicate relaxation. Theoretical, as well as empirical results are presented which demonstrate the potential advantages of using predicate relaxation in planning. We also present a new definition of hierarchical operators that allows us to guarantee a limited form of completeness. This new definition is shown to be, in some ways, more flexible than previous definitions of hierarchical operators. Finally, a Classical Truth Criterion is presented that is proven to be sound and complete for a planning formalism that is general enough to include most classical planning formalisms that are based on the STRIPS assumption.

  6. Automatic speech recognition

    NASA Astrophysics Data System (ADS)

    Espy-Wilson, Carol

    2005-04-01

Great strides have been made in the development of automatic speech recognition (ASR) technology over the past thirty years. Most of this effort has been centered around the extension and improvement of Hidden Markov Model (HMM) approaches to ASR. Current commercially available and industry systems based on HMMs can perform well for certain situational tasks that restrict variability, such as phone dialing or limited voice commands. However, the holy grail of ASR systems is performance comparable to humans: in other words, the ability to automatically transcribe unrestricted conversational speech spoken by any number of speakers under varying acoustic environments. This goal is far from being reached. Key to the success of ASR is effective modeling of variability in the speech signal. This tutorial will review the basics of ASR and the various ways in which our current knowledge of speech production, speech perception, and prosody can be exploited to improve robustness at every level of the system.

  7. Automatic carrier acquisition system

    NASA Technical Reports Server (NTRS)

    Bunce, R. C. (Inventor)

    1973-01-01

    An automatic carrier acquisition system for a phase locked loop (PLL) receiver is disclosed. It includes a local oscillator, which sweeps the receiver to tune across the carrier frequency uncertainty range until the carrier crosses the receiver IF reference. Such crossing is detected by an automatic acquisition detector. It receives the IF signal from the receiver as well as the IF reference. It includes a pair of multipliers which multiply the IF signal with the IF reference in phase and in quadrature. The outputs of the multipliers are filtered through bandpass filters and power detected. The output of the power detector has a signal dc component which is optimized with respect to the noise dc level by the selection of the time constants of the filters as a function of the sweep rate of the local oscillator.
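The acquisition detector multiplies the IF signal with the IF reference in phase and in quadrature, filters, and power-detects; the output is large only when the carrier crosses the reference. A numerical sketch follows; the sample rate, frequencies, and the simple averaging filter are assumed stand-ins for the analog bandpass filters and power detector.

```python
import numpy as np

# Assumed sample rate, duration, and IF reference frequency.
fs = 100_000.0
t = np.arange(0, 0.02, 1.0 / fs)
f_ref = 10_000.0

def acquisition_power(f_if, phase=0.3):
    """Multiply the IF signal with the reference in phase and quadrature,
    low-pass by averaging, then power-detect (I^2 + Q^2)."""
    sig = np.cos(2 * np.pi * f_if * t + phase)
    i = np.mean(sig * np.cos(2 * np.pi * f_ref * t))   # in-phase arm
    q = np.mean(sig * np.sin(2 * np.pi * f_ref * t))   # quadrature arm
    return i ** 2 + q ** 2

on = acquisition_power(f_ref)           # carrier at the IF reference
off = acquisition_power(f_ref + 2000)   # carrier far from the reference
print(on, off)
```

Using both arms makes the detection independent of the unknown carrier phase, which is why the patent multiplies in phase and in quadrature.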

  8. Automatic vehicle monitoring

    NASA Technical Reports Server (NTRS)

    Bravman, J. S.; Durrani, S. H.

    1976-01-01

    Automatic vehicle monitoring systems are discussed. In a baseline system for highway applications, each vehicle obtains position information through a Loran-C receiver in rural areas and through a 'signpost' or 'proximity' type sensor in urban areas; the vehicle transmits this information to a central station via a communication link. In an advance system, the vehicle carries a receiver for signals emitted by satellites in the Global Positioning System and uses a satellite-aided communication link to the central station. An advanced railroad car monitoring system uses car-mounted labels and sensors for car identification and cargo status; the information is collected by electronic interrogators mounted along the track and transmitted to a central station. It is concluded that automatic vehicle monitoring systems are technically feasible but not economically feasible unless a large market develops.

  9. Automatic Retinal Oximetry

    NASA Astrophysics Data System (ADS)

    Halldorsson, G. H.; Karlsson, R. A.; Hardarson, S. H.; Mura, M. Dalla; Eysteinsson, T.; Beach, J. M.; Stefansson, E.; Benediktsson, J. A.

    2007-10-01

    This paper presents a method for automating the evaluation of hemoglobin oxygen saturation in the retina. This method should prove useful for monitoring ischemic retinal diseases and the effect of treatment. In order to obtain saturation values automatically, spectral images must be registered in pairs, the vessels of the retina located and measurement points must be selected. The registration algorithm is based on a data driven approach that circumvents many of the problems that have plagued previous methods. The vessels are extracted using an algorithm based on morphological profiles and supervised classifiers. Measurement points on retinal arterioles and venules as well as reference points on the adjacent fundus are automatically selected. Oxygen saturation values along vessels are averaged to arrive at a more accurate estimate of the retinal vessel oxygen saturation. The system yields reproducible results as well as being sensitive to changes in oxygen saturation.

  10. Automatic threshold selection using histogram quantization

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Adali, Tulay; Lo, Shih-Chung B.

    1997-04-01

    An automatic threshold selection method is proposed for biomedical image analysis based on a histogram coding scheme. The threshold values can be determined based on the well-known Lloyd-Max scalar quantization rule, which is optimal in the sense of achieving minimum mean-square-error distortion. An iterative self-organizing learning rule is derived to determine the threshold levels. The rule does not require any prior information about the histogram, hence is fully automatic. Experimental results show that this new approach is easy to implement yet is highly efficient, robust with respect to noise, and yields reliable estimates of the threshold levels.
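The Lloyd-Max rule alternates two steps: place each threshold midway between adjacent reconstruction levels, then move each level to the centroid of its cell. A minimal sketch on a synthetic bimodal grey-level distribution follows; the data and the quantile initialisation are assumptions.

```python
import numpy as np

def lloyd_max_thresholds(samples, levels=2, iters=100):
    """Iterative Lloyd-Max quantiser design directly from data.

    Alternates: thresholds at midpoints of adjacent reconstruction levels;
    levels at the centroid (mean) of the samples in each cell. This
    alternation minimises mean-square quantisation distortion and needs
    no prior histogram model, mirroring the self-organising rule above.
    """
    x = np.sort(np.asarray(samples, dtype=float))
    # Initialise reconstruction levels from evenly spaced quantiles.
    recon = np.quantile(x, (np.arange(levels) + 0.5) / levels)
    for _ in range(iters):
        thr = (recon[:-1] + recon[1:]) / 2.0     # nearest-level boundaries
        idx = np.searchsorted(thr, x)            # assign samples to cells
        recon = np.array([x[idx == k].mean() if np.any(idx == k) else recon[k]
                          for k in range(levels)])
    return thr, recon

# Synthetic bimodal grey-level data: object and background populations.
rng = np.random.default_rng(2)
samples = np.concatenate([rng.normal(50, 5, 4000), rng.normal(150, 8, 4000)])
thr, recon = lloyd_max_thresholds(samples, levels=2)
print(thr, recon)
```

With two levels the single threshold settles between the two modes, which is exactly the binarisation threshold an image-segmentation step would use.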

  11. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    NASA Astrophysics Data System (ADS)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information from tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel-beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF-format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating-point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'.
Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk

  12. Automatic Word Alignment

    DTIC Science & Technology

    2014-02-18

strategy was evaluated in the context of English-to-Pashto (E2P) and Pashto-to-English (P2E), a low-resource language pair. For E2P, the training and... improves the quality of automatic word alignment, for example for resource-poor language pairs, thus improving Statistical Machine Translation (SMT) performance.

  13. Automatic Test Equipment

    DTIC Science & Technology

    1980-02-28

Search terms: Automatic Test Equipment; Frequency Analyzers; Oscilloscopes; Pulse Analyzers; Signal Generators; "Etc." Third-level search: Guided... VAST Building Block Equipment; RF Test Point Control Switch; Digital Multimeter; Frequency and Time Interval Meter; Digital Word Generator; Delay Generator; RF Amplifier, 95 Hz-2 GHz; RF Amplifier, 2-4 GHz; RF Amplifier, 4-8 GHz; RF Amplifier, 8-12.2 GHz; Signal Generator, 0.1 Hz-50 kHz

  14. Automatic exposure control for space sequential camera

    NASA Technical Reports Server (NTRS)

    Mcatee, G. E., Jr.; Stoap, L. J.; Solheim, C. D.; Sharpsteen, J. T.

    1975-01-01

The final report for the automatic exposure control study for space sequential cameras, prepared for the NASA Johnson Space Center, is presented. The material is shown in the same sequence in which the work was performed. The purpose of the automatic exposure control is to automatically control the lens iris as well as the camera shutter so that the subject is properly exposed on the film. A study of design approaches is presented. Analysis of the light range covered indicates that the practical range would be from approximately 20 to 6,000 foot-lamberts, or about nine f-stops. Observation of film available from space flights shows that optimum scene illumination is apparently not present in vehicle interior photography as well as in vehicle-to-vehicle situations. The evaluation test procedure for a breadboard, and the results, which provided information for the design of a brassboard, are given.

  15. Automatic weld torch guidance control system

    NASA Technical Reports Server (NTRS)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

A highly reliable, fully digital, closed-circuit television optical type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light-sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light-level signal digitized, and an 8-bit word transmitted to scratch-pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.

  16. Support vector machine for automatic pain recognition

    NASA Astrophysics Data System (ADS)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects faces in the stored video frames using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
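The classification stage is a standard SVM over location and shape features. A minimal linear-SVM sketch trained by sub-gradient descent on the hinge loss follows; the features and labels are synthetic stand-ins, and the authors' actual feature extraction and kernel choice are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins for location/shape features of a detected face
# region; labels: +1 = pain expression, -1 = not pain. All assumed.
n = 200
X = rng.normal(size=(n, 4))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=n))

def linear_svm(X, y, lam=0.01, lr=0.05, epochs=300):
    """Linear SVM trained by full-batch sub-gradient descent on hinge loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1                 # margin violators
        gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(y)
        gb = -y[viol].sum() / len(y)
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
print(round(acc, 3))
```

Only samples inside the margin contribute to the gradient, which is the defining property of the hinge loss that gives SVMs their sparse support-vector behaviour.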

  17. Analysis of results obtained using the automatic chemical control of the quality of the water heat carrier in the drum boiler of the Ivanovo CHP-3 power plant

    NASA Astrophysics Data System (ADS)

    Larin, A. B.; Kolegov, A. V.

    2012-10-01

Results of industrial tests of a new method for the automatic chemical control of the quality of boiler water of a drum-type power boiler (Pd = 13.8 MPa) are described. The possibility of using an H-cationite column for measuring the electric conductivity of an H-cationized sample of boiler water over a long period of time is shown.

  18. A method for measuring enthalpy of volatilization of a compound, Delta(vol)H, from dilute aqueous solution.

    PubMed

    Wang, Tianshu

    2006-01-01

This study developed a method for measuring the enthalpy of volatilization (Delta(vol)H) of a compound in a dilute solution via ion-molecule reactions and gas-phase analysis using selected ion flow tube mass spectrometry (SIFT-MS). The Delta(vol)H/R value was obtained using an equation with three variant forms, either from the headspace concentration of the solution or from individual product ion(s). Under certain experimental conditions, the equation has the simplest form [formula: see text], where R is the gas constant (8.314 J mol(-1) K(-1)), i(n) and I are the respective product and precursor ion count rates, and T is the temperature of the solution. As an example, a series of 27.0 micromol/L aqueous solutions of acetone was analyzed over a temperature range of 25-50 degrees C at 5 degrees C intervals using H3O+, NO+ and O2+* precursor ions, producing a mean Delta(vol)H/R value of 4700 ± 200 K. This corresponds with current literature values and supports the consistency of the new method. Notably, using this method, as long as the concentration of the solution falls within the range of Henry's law, the exact concentration does not have to be known, and only one sample is required at each temperature. Compared with previous methods, which involve measuring the Henry's law constant at each temperature, this method significantly reduces the number of samples required and avoids the labour and difficulties of preparing standard solutions at very low concentrations. Further, if the contents of a solution were unknown, the measured Delta(vol)H/R from individual product ion(s) can help to identify the origin of the ion(s).
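The formula itself is elided in this record ("[formula: see text]"). As a hedged reconstruction only: the description (a ratio of product to precursor ion count rates whose temperature dependence yields Delta(vol)H/R, here about 4700 K, as a slope) is consistent with a van't Hoff-type relation of the following form.

```latex
% Hedged sketch, NOT necessarily the paper's exact expression: a van't
% Hoff-type form in which the log of the product-to-precursor ion
% count-rate ratio is linear in 1/T with slope -\Delta_{vol}H / R.
\ln\!\left(\frac{i_n}{I}\right) = -\frac{\Delta_{\mathrm{vol}}H}{R}\cdot\frac{1}{T} + \mathrm{const}
```

Under this form, fitting ln(i_n/I) against 1/T across the 25-50 degrees C series would give the reported slope magnitude of about 4700 K.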

  19. Automatic microscopy for mitotic cell location.

    NASA Technical Reports Server (NTRS)

    Herron, J.; Ranshaw, R.; Castle, J.; Wald, N.

    1972-01-01

    Advances are reported in the development of an automatic microscope with which to locate hematologic or other cells in mitosis for subsequent chromosome analysis. The system under development is designed to perform the functions of: slide scanning to locate metaphase cells; conversion of images of selected cells into binary form; and on-line computer analysis of the digitized image for significant cytogenetic data. Cell detection criteria are evaluated using a test sample of 100 mitotic cells and 100 artifacts.

  20. Using airborne LiDAR in geoarchaeological contexts: Assessment of an automatic tool for the detection and the morphometric analysis of grazing archaeological structures (French Massif Central).

    NASA Astrophysics Data System (ADS)

    Roussel, Erwan; Toumazet, Jean-Pierre; Florez, Marta; Vautier, Franck; Dousteyssier, Bertrand

    2014-05-01

Airborne laser scanning (ALS) of archaeological regions of interest is nowadays a widely used and established method for accurate topographic and microtopographic survey. The penetration of the vegetation cover by the laser beam allows the reconstruction of reliable digital terrain models (DTM) of forested areas where traditional prospection methods are inefficient, time-consuming, and non-exhaustive. ALS technology provides the opportunity to discover new archaeological features hidden by vegetation and provides a comprehensive survey of cultural heritage sites within their environmental context. However, the post-processing of LiDAR point clouds produces a huge quantity of data in which relevant archaeological features are not easily detectable with common visualizing and analysing tools. Undoubtedly, there is an urgent need for automation of structure detection and morphometric extraction techniques, especially for the "archaeological desert" in densely forested areas. This presentation deals with the development of automatic detection procedures applied to archaeological structures located in the French Massif Central, in the western forested part of the Puy-de-Dôme volcano between 950 and 1100 m a.s.l. These previously unknown archaeological sites were discovered by the March 2011 ALS mission and display a high density of subcircular depressions with corridor access. The spatial organization of these depressions varies from isolated to aggregated or aligned features. Functionally, they appear to be former grazing constructions built from the medieval to the modern period. Similar grazing structures are known in other locations of the French Massif Central (Sancy, Artense, Cézallier) where the ground is vegetation-free. In order to develop a reliable process of automatic detection and mapping of these archaeological structures, a learning zone has been delineated within the ALS-surveyed area. The grazing features were mapped and typical morphometric attributes

  1. Automatic range selector

    DOEpatents

    McNeilly, Clyde E.

    1977-01-04

    A device is provided for automatically selecting from a plurality of ranges of a scale of values to which a meter may be made responsive, that range which encompasses the value of an unknown parameter. A meter relay indicates whether the unknown is of greater or lesser value than the range to which the meter is then responsive. The rotatable part of a stepping relay is rotated in one direction or the other in response to the indication from the meter relay. Various positions of the rotatable part are associated with particular scales. Switching means are sensitive to the position of the rotatable part to couple the associated range to the meter.

  2. AUTOMATIC FREQUENCY CONTROL SYSTEM

    DOEpatents

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency-determining element. A low frequency is mixed with the output of the driving oscillator and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.

  3. Automatic Speech Recognition

    NASA Astrophysics Data System (ADS)

    Potamianos, Gerasimos; Lamel, Lori; Wölfel, Matthias; Huang, Jing; Marcheret, Etienne; Barras, Claude; Zhu, Xuan; McDonough, John; Hernando, Javier; Macho, Dusan; Nadeu, Climent

    Automatic speech recognition (ASR) is a critical component for CHIL services. For example, it provides the input to higher-level technologies, such as summarization and question answering, as discussed in Chapter 8. In the spirit of ubiquitous computing, the goal of ASR in CHIL is to achieve a high performance using far-field sensors (networks of microphone arrays and distributed far-field microphones). However, close-talking microphones are also of interest, as they are used to benchmark ASR system development by providing a best-case acoustic channel scenario to compare against.

  4. Automatic readout micrometer

    DOEpatents

    Lauritzen, T.

    A measuring system is described for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  5. Automatic readout micrometer

    DOEpatents

    Lauritzen, Ted

    1982-01-01

    A measuring system is disclosed for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  6. Automatic enrollment for gait-based person re-identification

    NASA Astrophysics Data System (ADS)

    Ortells, Javier; Martín-Félez, Raúl; Mollineda, Ramón A.

    2015-02-01

    Automatic enrollment involves a critical decision-making process within the person re-identification context. However, this process has traditionally been undervalued. This paper studies the problem of automatic person enrollment from a realistic perspective relying on gait analysis. Experiments simulating random flows of people, with considerable appearance variations between different observations of a person, have been conducted, modeling both short- and long-term scenarios. Promising results based on ROC analysis show that automatically enrolling people by their gait is feasible with high success rates.

  7. Analysis of SSEM Sensor Data Using BEAM

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Park, Han; James, Mark

    2004-01-01

    A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO-20827), Vol. 26, No. 9 (September 2002), page 32, and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO-21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
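    The DIAD idea, fitting an autoregressive model and flagging windows whose coefficients drift away from nominal training values, can be sketched in miniature. This is an illustrative toy, not the BEAM/DIAD code; the AR(1) fit and the tolerance value are assumptions.

```python
import random

def ar1_coefficient(signal):
    """Least-squares estimate of a in the AR(1) model x[t] ~ a * x[t-1]."""
    num = sum(signal[t] * signal[t - 1] for t in range(1, len(signal)))
    den = sum(x * x for x in signal[:-1])
    return num / den

def drifted(train, test, tolerance=0.2):
    """Flag the test window when its AR(1) coefficient drifts from the nominal one."""
    return abs(ar1_coefficient(test) - ar1_coefficient(train)) > tolerance

# Nominal: smooth exponential decay (AR(1) coefficient exactly 0.95).
nominal = [0.95 ** t for t in range(100)]
# Anomalous: white noise, whose lag-1 correlation is near zero.
random.seed(0)
noisy = [random.uniform(-1.0, 1.0) for _ in range(100)]

print(drifted(nominal, nominal))  # False: same dynamical invariant
print(drifted(nominal, noisy))    # True: the invariant has shifted
```

The real detector compares higher-order model coefficients against statistics learned from many nominal firings, but the principle is the same: the coefficients act as a dynamical invariant of the healthy signal.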

  8. Automatic sets and Delone sets

    NASA Astrophysics Data System (ADS)

    Barbé, A.; von Haeseler, F.

    2004-04-01

    Automatic sets D ⊂ ℤ^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ ℤ^m to be a Delone set in ℝ^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples.
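    As an illustration of the kind of automatic set mentioned above, one common two-dimensional generalization of the Thue-Morse sequence keeps a lattice point (i, j) when the combined binary digit sum of its coordinates is even. This particular construction is an assumption for illustration; the paper's exact definition may differ.

```python
def digit_sum(n, base=2):
    """Sum of the digits of n written in the given base."""
    s = 0
    while n:
        s += n % base
        n //= base
    return s

def in_thue_morse_2d(i, j):
    """One 2-D generalization of the Thue-Morse set: keep the lattice point
    (i, j) when the combined binary digit sum of its coordinates is even."""
    return (digit_sum(i) + digit_sum(j)) % 2 == 0

# Render the 8x8 corner of the set ('#' = member of D).
for i in range(8):
    print("".join("#" if in_thue_morse_2d(i, j) else "." for j in range(8)))
```

The membership test is computed by a finite automaton reading the binary digits of the coordinates, which is exactly what makes such a set "automatic".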

  9. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

    Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis including CT imaging can detect aggressive malignant pleural mesothelioma in its early stage. In order to create a quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, techniques for automatic spatiotemporal matching of the detected pleural thickenings at two points in time, based on semi-automatic registration, have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal component analysis turns out to be more advantageous than feature-based mapping using the centroid and mean Hounsfield units of each thickening, since sensitivity was improved from 42.19% to 98.46%, while the accuracy of feature-based mapping is only slightly higher (84.38% vs. 76.19%).

  10. Automatic TLI recognition system, general description

    SciTech Connect

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. The system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system's capabilities.

  11. Automatic vehicle location system

    NASA Technical Reports Server (NTRS)

    Hansen, G. R., Jr. (Inventor)

    1973-01-01

    An automatic vehicle detection system is disclosed, in which each vehicle whose location is to be detected carries active means which interact with passive elements at each location to be identified. The passive elements comprise a plurality of passive loops arranged in a sequence along the travel direction. Each of the loops is tuned to a chosen frequency, so that the sequence of frequencies defines the location code. As the vehicle passes over each loop in the sequence, only signals at the frequency of that loop are coupled from a vehicle transmitter to a vehicle receiver. The frequencies of the received signals produce receiver outputs which together represent the code of the traversed location. The location code may also be defined by a painted pattern which reflects light to a vehicle-carried detector whose output is used to derive the code defined by the pattern.

  12. Automatic routing module

    NASA Technical Reports Server (NTRS)

    Malin, Janice A.

    1987-01-01

    Automatic Routing Module (ARM) is a tool to partially automate Air Launched Cruise Missile (ALCM) routing. For any accessible launch point or target pair, ARM creates flyable routes that, within the fidelity of the models, are optimal in terms of threat avoidance, clobber avoidance, and adherence to vehicle and planning constraints. Although highly algorithmic, ARM is an expert system. Because of the heuristics applied, ARM-generated routes closely resemble manually generated routes in routine cases. In more complex cases, ARM's ability to accumulate and assess threat danger in three dimensions, and trade that danger off against the probability of ground clobber, results in the safest path around or through difficult areas. The tools available prior to ARM neither provided the planner with enough information nor presented it in a way that ensured selection of the safest path.

  13. AUTOMATIC HAND COUNTER

    DOEpatents

    Mann J.R.; Wainwright, A.E.

    1963-06-11

    An automatic, personnel-operated, alpha-particle hand monitor is described which functions as a qualitative instrument to indicate to the person using it whether his hands are "cold" or "hot." The monitor is activated by a push button and includes several capacitor-triggered thyratron tubes. Upon release of the push button, the monitor starts counting the radiation present on the hands of the person. If the count of the radiation exceeds a predetermined level within a predetermined time, a capacitor will trigger a first thyratron tube to light a "hot" lamp. If, however, the count is below that level during the time period, another capacitor will fire a second thyratron to light a "safe" lamp. (AEC)

  14. Automatic Bayesian polarity determination

    NASA Astrophysics Data System (ADS)

    Pugh, D. J.; White, R. S.; Christie, P. A. F.

    2016-07-01

    The polarity of the first motion of a seismic signal from an earthquake is an important constraint in earthquake source inversion. Microseismic events often have low signal-to-noise ratios, which may lead to difficulties estimating the correct first-motion polarities of the arrivals. This paper describes a probabilistic approach to polarity picking that can be both automated and combined with manual picking. This approach includes a quantitative estimate of the uncertainty of the polarity, improving calculation of the polarity probability density function for source inversion. It is sufficiently fast to be incorporated into an automatic processing workflow. When used in source inversion, the results are consistent with those from manual observations. In some cases, they produce a clearer constraint on the range of high-probability source mechanisms, and are better constrained than source mechanisms determined using a uniform probability of an incorrect polarity pick.
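    The flavour of such a probabilistic polarity pick can be illustrated under a simple assumed model (not the paper's actual formulation): if the observed first-motion amplitude is the true amplitude plus zero-mean Gaussian noise, and both polarities are equally likely a priori, the probability of a positive polarity is the standard normal CDF evaluated at the amplitude-to-noise ratio.

```python
import math

def positive_polarity_probability(amplitude, noise_std):
    """P(true first motion is positive | observed amplitude), assuming the
    observation is the true onset amplitude plus zero-mean Gaussian noise and
    a 50/50 prior over polarity: the standard normal CDF at amplitude/noise."""
    z = amplitude / noise_std
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(positive_polarity_probability(3.0, 1.0))  # clear onset: ~0.9987
print(positive_polarity_probability(0.1, 1.0))  # buried in noise: ~0.54
```

A low signal-to-noise arrival thus contributes a soft probability near 0.5 to the source inversion rather than a hard, possibly wrong, polarity pick.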

  15. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    PubMed

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated with complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery.

  16. Networked Automatic Optical Telescopes

    NASA Astrophysics Data System (ADS)

    Mattox, J. R.

    2000-05-01

    Many groups around the world are developing automated or robotic optical observatories. The coordinated operation of automated optical telescopes at diverse sites could provide observing prospects which are not otherwise available, e.g., continuous optical photometry without diurnal interruption. Computer control and scheduling also offer the prospect of effective response to transient events such as γ-ray bursts. These telescopes could also serve science education by providing high-quality CCD data for educators and students. The Automatic Telescope Network (ATN) project has been undertaken to promote networking of automated telescopes. A web site is maintained at http://gamma.bu.edu/atn/. The development of such networks will be facilitated by the existence of standards. A set of standard commands for instrument and telescope control systems will allow for the creation of software for an "observatory control system" which can be used at any facility which complies with the TCS and ICS standards. There is also a strong need for standards for the specification of observations to be done, and for reports on the results and status of observations. A proposed standard for this is the Remote Telescope Markup Language (RTML), which is expected to be described in another poster in this session. It may thus soon be feasible for amateur astronomers to buy all necessary equipment and software to field an automatic telescope. The owner/operator could make otherwise unused telescope time available to the network in exchange for the utilization of other telescopes in the network, including occasional utilization of meter-class telescopes with research-grade CCD detectors at good sites.

  17. Automatic Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr.

    1985-01-01

    Coal cutting and removal done with minimal hazard to people. Automatic coal mine cutting, transport and roof-support movement all done by automatic machinery. Exposure of people to hazardous conditions reduced to inspection tours, maintenance, repair, and possibly entry mining.

  18. Automatic differentiation using vectorized hyper dual numbers

    NASA Astrophysics Data System (ADS)

    Swaroop, Kshitiz

    Sensitivity analysis is a method to measure the change in a dependent variable with respect to one or more independent variables, with uses including optimization, design analysis and risk modeling. Conventional methods like finite differences suffer from both truncation and subtraction errors, and cannot be used to simultaneously calculate derivatives of an output with respect to multiple inputs (commonly seen in optimization problems). Automatic differentiation tackles all these issues successfully, allowing us to calculate derivatives of any variable with respect to the independent variables in a computer program up to machine precision without any significant user input. Vectorized hyper dual numbers, an extension of hyper dual numbers that allows the user to automatically calculate both the Hessian and the derivative along with the function evaluation, are developed in this thesis. The method is then used for the sizing and layup of a composite wind turbine blade as a proof of concept.
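    The core hyper-dual arithmetic (before the thesis's vectorized extension) can be sketched as follows; only addition and multiplication are shown, which is already enough to differentiate polynomials exactly:

```python
class HyperDual:
    """Hyper-dual number a + b*e1 + c*e2 + d*e1*e2 with e1**2 = e2**2 = 0.
    Seeding b = c = 1, d = 0 makes .b the exact first derivative and .d the
    exact second derivative after evaluating a function built from + and *."""
    def __init__(self, a, b=0.0, c=0.0, d=0.0):
        self.a, self.b, self.c, self.d = a, b, c, d

    def __add__(self, other):
        o = other if isinstance(other, HyperDual) else HyperDual(other)
        return HyperDual(self.a + o.a, self.b + o.b, self.c + o.c, self.d + o.d)

    def __mul__(self, other):
        o = other if isinstance(other, HyperDual) else HyperDual(other)
        return HyperDual(self.a * o.a,
                         self.a * o.b + self.b * o.a,
                         self.a * o.c + self.c * o.a,
                         self.a * o.d + self.b * o.c + self.c * o.b + self.d * o.a)

def f(x):
    return x * x * x  # f(x) = x**3, so f'(x) = 3x**2 and f''(x) = 6x

y = f(HyperDual(2.0, 1.0, 1.0, 0.0))
print(y.a, y.b, y.d)  # 8.0 12.0 12.0 -- value, f'(2), f''(2), all exact
```

Because e1² = e2² = 0 holds exactly, there is no step-size choice and hence neither truncation nor subtraction error; the derivatives fall out of ordinary arithmetic on the extra components.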

  19. Automatic multidiagnosis system for slit lamp

    NASA Astrophysics Data System (ADS)

    Ventura, Liliane; Chiaradia, Caio; Vieira Messias, Andre M.; Faria de Sousa, Sidney J.; Isaac, Flavio; Caetano, Cesar A. C.; Rosa Filho, Andre B.

    2001-06-01

    We have developed a system providing four additional automatic diagnostic functions for a slit lamp biomicroscope: (1) counting of the endothelial cells of donated corneas; (2) automatic keratometry; (3) corneal ulcer evaluation; and (4) measurement of linear distances and areas of the ocular image. The system consists of a slit lamp, a beam splitter, some optical components, a CCD detector, a frame grabber and a PC. The optical components attached to the beam splitter are the same for all the functions except function 1, for which we have developed an optical system that magnifies the image 290X and software that counts the cells interactively and automatically. Results are in good agreement with commercial specular microscopes (correlation coefficient 0.98081). The automatic keratometry function is able to measure cylinders over 30 diopters and also irregular astigmatisms. It works by projecting a light ring onto the patient's cornea; analysis of the deformation of the ring provides the radius of curvature as well as the axis of the astigmatism. The nominal precision is 0.005 mm for the curvature radius and 2 degrees for the axis component. The results are in good agreement with commercial systems (correlation coefficient 0.99347). For function 3, the ulcer is isolated by the usual clinical means and the image of the green area is automatically detected by the developed software in order to evaluate the evolution of the disease. Function 4 simply allows the clinician to make any linear or area measurement of the ocular image. The system is a low-cost multi-evaluation instrument and is being used in a public hospital in Brazil.

  20. Automatically Classifying Question Types for Consumer Health Questions

    PubMed Central

    Roberts, Kirk; Kilicoglu, Halil; Fiszman, Marcelo; Demner-Fushman, Dina

    2014-01-01

    We present a method for automatically classifying consumer health questions. Our thirteen question types are designed to aid in the automatic retrieval of medical answers from consumer health resources. To our knowledge, this is the first machine learning-based method specifically for classifying consumer health questions. We demonstrate how previous approaches to medical question classification are insufficient to achieve high accuracy on this task. Additionally, we describe, manually annotate, and automatically classify three important question elements that improve question classification over previous techniques. Our results and analysis illustrate the difficulty of the task and the future directions that are necessary to achieve high-performing consumer health question classification. PMID:25954411

  1. A training programme involving automatic self-transcending meditation in late-life depression: preliminary analysis of an ongoing randomised controlled trial.

    PubMed

    Vasudev, Akshya; Arena, Amanda; Burhan, Amer M; Ionson, Emily; Hirjee, Hussein; Maldeniya, Pramudith; Wetmore, Stephen; Newman, Ronnie I

    2016-03-01

    Late-life depression affects 2-6% of seniors aged 60 years and above. Patients are increasingly embracing non-pharmacological therapies, many of which have not been scientifically evaluated. This study aimed to evaluate a category of meditation, automatic self-transcending meditation (ASTM), in alleviating symptoms of depression when augmenting treatment as usual (NCT02149810). The preliminary results of an ongoing single-blind randomised controlled trial comparing a training programme involving ASTM with a wait-list control indicate that a 12-week ASTM programme may lead to significantly greater reductions in depression and anxiety severity. As such, ASTM may be an effective adjunctive therapy in the treatment of late-life depression.

  2. 3D numerical test objects for the evaluation of a software used for an automatic analysis of a linear accelerator mechanical stability

    NASA Astrophysics Data System (ADS)

    Torfeh, Tarraf; Beaumont, Stéphane; Guédon, Jeanpierre; Benhdech, Yassine

    2010-04-01

    Mechanical stability of a medical LINear ACcelerator (LINAC), particularly the quality of the gantry, collimator and table rotations and the accuracy of the isocenter position, is crucial for the radiation therapy process, especially in stereotactic radiosurgery and in Image Guided Radiation Therapy (IGRT), where this mechanical stability is perturbed by the additional weight of the kV X-ray tube and detector. In this paper, we present a new method to evaluate software used to perform an automatic measurement of the "size" (flex map) and the location of the kV and MV isocenters of the linear accelerator. The method consists of developing a complete numerical 3D simulation of a LINAC and physical phantoms in order to produce Electronic Portal Imaging Device (EPID) images including calibrated distortions of the mechanical movement of the gantry and isocenter misalignments.

  3. A Comparative Analysis of DBSCAN, K-Means, and Quadratic Variation Algorithms for Automatic Identification of Swallows from Swallowing Accelerometry Signals

    PubMed Central

    Dudik, Joshua M.; Kurosu, Atsuko; Coyle, James L

    2015-01-01

    Background Cervical auscultation with high resolution sensors is currently under consideration as a method of automatically screening for specific swallowing abnormalities. To be clinically useful without human involvement, any devices based on cervical auscultation should be able to detect specified swallowing events in an automatic manner. Methods In this paper, we comparatively analyze the density-based spatial clustering of applications with noise algorithm (DBSCAN), a k-means based algorithm, and an algorithm based on quadratic variation as methods of differentiating periods of swallowing activity from periods of time without swallows. These algorithms utilized swallowing vibration data exclusively and compared the results to a gold standard measure of swallowing duration. Data were collected from 23 subjects who were actively suffering from swallowing difficulties. Results Comparing the performance of the DBSCAN algorithm with a proven segmentation algorithm that utilizes k-means clustering demonstrated that the DBSCAN algorithm had a higher sensitivity and correctly segmented more swallows. Comparing its performance with a threshold-based algorithm that utilized the quadratic variation of the signal showed that the DBSCAN algorithm offered no direct increase in performance. However, it offered several other benefits, including a faster run time and more consistent performance between patients. All algorithms showed noticeable differentiation from the endpoints provided by a videofluoroscopy examination as well as reduced sensitivity. Conclusions In summary, we showed that the DBSCAN algorithm is a viable method for detecting the occurrence of a swallowing event using cervical auscultation signals, but significant work must be done to improve its performance before it can be implemented in an unsupervised manner. PMID:25658505
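    A minimal illustration of the DBSCAN idea on one-dimensional data (not the paper's implementation; the eps and min_pts values are arbitrary here) shows how bursts of activity separate from isolated noise samples:

```python
def dbscan_1d(points, eps, min_pts):
    """Minimal DBSCAN for 1-D samples (e.g. times of high vibration energy).
    Returns one label per point: a cluster id (0, 1, ...) or -1 for noise."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(len(points)) if abs(points[j] - points[i]) <= eps]
        if len(neighbors) < min_pts:
            labels[i] = -1            # provisional noise; may become a border point
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(neighbors)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # noise reached from a core point: border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = [k for k in range(len(points)) if abs(points[k] - points[j]) <= eps]
            if len(j_neighbors) >= min_pts:
                queue.extend(j_neighbors)   # j is itself a core point: keep expanding
    return labels

# Two bursts of swallowing-like activity separated by quiet, plus one isolated sample.
times = [1.0, 1.1, 1.2, 1.3, 5.0, 9.0, 9.1, 9.2, 9.3]
print(dbscan_1d(times, eps=0.5, min_pts=3))  # [0, 0, 0, 0, -1, 1, 1, 1, 1]
```

The appeal for this application is visible even in the toy: no fixed number of clusters must be specified in advance, and the lone sample at 5.0 is rejected as noise rather than forced into a segment.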

  4. Automatic Command Sequence Generation

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladded, Roy; Khanampompan, Teerapat

    2007-01-01

    Automatic Sequence Generator (Autogen) Version 3.0 software automatically generates command sequences for the Mars Reconnaissance Orbiter (MRO) and several other JPL spacecraft operated by the multi-mission support team. Autogen uses standard JPL sequencing tools like APGEN, ASP, SEQGEN, and the DOM database to automate the generation of uplink command products, Spacecraft Command Message Format (SCMF) files, and the corresponding ground command products, DSN Keywords Files (DKF). Autogen supports all the major mission phases, including the cruise, aerobraking, mapping/science, and relay mission phases. Autogen is a Perl script which functions within the mission operations UNIX environment. It consists of two parts: a set of model files and the autogen Perl script. Autogen encodes the behaviors of the system into a model and encodes algorithms for context-sensitive customizations of the modeled behaviors. The model includes knowledge of the different mission phases and how the resultant command products must differ between these phases. The executable software portion of Autogen automates the setup and use of APGEN for constructing a spacecraft activity sequence file (SASF). The setup includes file retrieval through the DOM (Distributed Object Manager), an object database used to store project files; this step retrieves all the needed input files for generating the command products. Depending on the mission phase, Autogen also uses the ASP (Automated Sequence Processor) and SEQGEN to generate the command product sent to the spacecraft. Autogen also provides the means for customizing sequences through the use of configuration files. By automating the majority of the sequence generation process, Autogen eliminates many sequence generation errors commonly introduced by manually constructing spacecraft command sequences.
Through the layering of commands into the sequence by a series of scheduling algorithms, users are able to rapidly and reliably construct the

  5. Automatic Parametrization of Somatosensory Evoked Potentials With Chirp Modeling.

    PubMed

    Vayrynen, Eero; Noponen, Kai; Vipin, Ashwati; Thow, X Y; Al-Nashash, Hasan; Kortelainen, Jukka; All, Angelo

    2016-09-01

    In this paper, an approach using polynomial phase chirp signals to model somatosensory evoked potentials (SEPs) is proposed. SEP waveforms are modeled as impulses undergoing group velocity dispersion while propagating along a multipath neural connection. A mathematical analysis of pulse dispersion resulting in chirp signals is performed, and an automatic parameterization of SEPs using chirp models is proposed. A Particle Swarm Optimization algorithm is used to optimize the model parameters, and features describing the latencies and amplitudes of SEPs are automatically derived. A rat model is then used to evaluate the automatic parameterization of SEPs in two experimental cases, i.e., anesthesia level and spinal cord injury (SCI). Experimental results show that chirp-based model parameters and the derived SEP features are significant in describing both anesthesia-level and SCI changes. The proposed automatic optimization-based approach for extracting chirp parameters offers potential for detailed SEP analysis in future studies. An implementation of the method in the MATLAB technical computing language is provided online.
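    A toy version of such a chirp waveform model can be sketched as follows (the functional form, a Gaussian-windowed degree-2 polynomial phase, and all parameter values are illustrative assumptions, not the paper's fitted model):

```python
import math

def chirp_sep(t, amp, t0, width, f0, k):
    """Toy SEP waveform: Gaussian envelope times a degree-2 polynomial-phase
    chirp whose instantaneous frequency sweeps as f0 + k * (t - t0)."""
    envelope = amp * math.exp(-(((t - t0) / width) ** 2))
    phase = 2.0 * math.pi * (f0 * (t - t0) + 0.5 * k * (t - t0) ** 2)
    return envelope * math.cos(phase)

# Sample 200 ms at 1 kHz and read off latency/amplitude features automatically.
ts = [i * 0.001 for i in range(200)]
wave = [chirp_sep(t, amp=1.0, t0=0.05, width=0.01, f0=30.0, k=200.0) for t in ts]
peak = max(range(len(wave)), key=lambda i: abs(wave[i]))
print("latency (s):", ts[peak])          # recovers the modeled latency t0 = 0.05 s
print("amplitude:", round(abs(wave[peak]), 3))
```

In the paper's setting, an optimizer such as Particle Swarm Optimization would search over parameters like these so that the model best matches a recorded SEP, after which latency and amplitude features follow directly from the fitted parameters.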

  6. The Development of the Automatic Target Recognition System for the UGV/RSTA LADAR

    DTIC Science & Technology

    1995-03-21

    The objective of this effort is to develop the automatic target recognition (ATR) system that will process the imagery from the RSTA laser radar (ladar). The report describes the image data bases used, including Tri-Service Laser-Radar (TSLR) data and Hobby Shop (HBS) laser-radar data.

  7. [Study on the automatic parameters identification of water pipe network model].

    PubMed

    Jia, Hai-Feng; Zhao, Qi-Feng

    2010-01-01

    An analysis of problems in the development and application of water pipe network models shows that automatic identification of model parameters is the key bottleneck for applying such models in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The kernel algorithms are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte Carlo Sampling) is used for automatic identification of parameter values; a detailed technical route based on RSA and MCS is presented. A software module for automatic identification of water pipe network model parameters was developed. Finally, a typical water pipe network was selected as a case study, and the automatic parameter identification achieved satisfactory results.
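The MCS/RSA kernel can be illustrated on a single hypothetical pipe: sample candidate roughness coefficients (Monte Carlo), keep the "behavioural" samples whose simulated head loss matches a SCADA-style observation (the RSA split), and identify the parameter from that set. The Hazen-Williams single-pipe model below is a stand-in assumption for a full network hydraulic solver.

```python
import numpy as np

def head_loss(flow, roughness_c, length=1000.0, diameter=0.3):
    """Hazen-Williams head loss (m) for one pipe: flow in m^3/s, roughness
    coefficient C (dimensionless), length and diameter in m."""
    return 10.67 * length * flow ** 1.852 / (roughness_c ** 1.852 * diameter ** 4.87)

rng = np.random.default_rng(1)
observed = head_loss(0.05, 110.0)            # synthetic "SCADA" reading, true C = 110
samples = rng.uniform(80.0, 150.0, 5000)     # Monte Carlo sampling of candidate C

# RSA-style split: behavioural samples reproduce the observation within 5%
errors = np.abs(head_loss(0.05, samples) - observed)
behavioural = samples[errors < 0.05 * observed]
identified = behavioural.mean()              # identified roughness coefficient
```

The same accept/reject logic extends to many parameters at once, which is where the sensitivity screening pays off: insensitive parameters produce no behavioural/non-behavioural contrast and can be fixed at nominal values.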

  8. Automatic transmission structure

    SciTech Connect

    Iwase, Y.; Morisawa, K.

    1987-03-24

    An automatic transmission is described comprising: an output shaft of the transmission including a stepped portion; a parking gear spline-connected with the output shaft on a first side of the stepped portion; a plurality of governor valves mounted on a rear side of the parking gear and radially disposed around the output shaft on the first side of the stepped portion; a speed meter drive gear spline-connected with the output shaft on a second side of the stepped portion and on a rear side of the governor valves; and an annular spacer fitted on the output shaft on the second side of the stepped portion between the governor valves and the speed meter drive gear to abut on each of the governor valves and the speed meter drive gear. The annular spacer is constructed separately from the speed meter drive gear and has an outer diameter larger than an outer diameter of the speed meter drive gear, thereby resulting in a contact area between the annular spacer and the speed meter drive gear which is smaller than a contact area between the annular spacer and the rear side of the governor valves; the drive gear being axially secured relative to the output shaft by a bearing, thereby enabling a fixed axial positioning of the annular spacer on the output shaft.

  9. Electronically controlled automatic transmission

    SciTech Connect

    Ohkubo, M.; Shiba, H.; Nakamura, K.

    1989-03-28

    This patent describes an electronically controlled automatic transmission having a manual valve working in connection with a manual shift lever, shift valves operated by solenoid valves which are driven by an electronic control circuit that has previously memorized shift patterns, and a hydraulic circuit controlled by the manual valve and shift valves for driving brakes and a clutch in order to change speed. Shift patterns for 2-range and L-range, in addition to a shift pattern for D-range, are memorized in the electronic control circuit. An operation switch is provided which changes the shift pattern of the electronic control circuit to any shift pattern among those of D-range, 2-range, and L-range while the manual shift lever is in the D-range position, and a releasable lock mechanism prevents the manual shift lever from entering the 2-range and L-range positions. In the case where the shift valves are not operating, the hydraulic circuit is set to a third speed mode when the manual shift lever is in the D-range position, to a second speed mode when it is in the 2-range position, and to a first speed mode when it is in the L-range position.

  10. Automatic Welding System

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Robotic welding has been of interest to industrial firms because it offers higher productivity at lower cost than manual welding. Some systems with automated arc guidance are available, but they have disadvantages, such as limitations on the types of materials or seams that can be welded; susceptibility to stray electrical signals; a restricted field of view; or a tendency to contaminate the weld seam. Wanting to overcome these disadvantages, Marshall Space Flight Center, aided by Hayes International Corporation, developed a system that uses closed-circuit TV signals for automatic guidance of the welding torch. NASA granted a license to Combined Technologies, Inc. for commercial application of the technology, and the company developed a refined and improved arc guidance system. CTI in turn licensed the Merrick Corporation, also of Nashville, for marketing and manufacturing of the new system, called the CT2 Optical Tracker. CT2 is a non-contacting system that offers adaptability to a broader range of welding jobs and provides greater reliability in high speed operation. It is extremely accurate and can travel at speeds of up to 150 inches per minute.

  11. Automatic imitation in dogs

    PubMed Central

    Range, Friederike; Huber, Ludwig; Heyes, Cecilia

    2011-01-01

    After preliminary training to open a sliding door using their head and their paw, dogs were given a discrimination task in which they were rewarded with food for opening the door using the same method (head or paw) as demonstrated by their owner (compatible group), or for opening the door using the alternative method (incompatible group). The incompatible group, which had to counterimitate to receive food reward, required more trials to reach a fixed criterion of discrimination performance (85% correct) than the compatible group. This suggests that, like humans, dogs are subject to ‘automatic imitation’; they cannot inhibit online the tendency to imitate head use and/or paw use. In a subsequent transfer test, where all dogs were required to imitate their owners' head and paw use for food reward, the incompatible group made a greater proportion of incorrect, counterimitative responses than the compatible group. These results are consistent with the associative sequence learning model, which suggests that the development of imitation depends on sensorimotor experience and phylogenetically general mechanisms of associative learning. More specifically, they suggest that the imitative behaviour of dogs is shaped more by their developmental interactions with humans than by their evolutionary history of domestication. PMID:20667875

  12. Automatic aircraft recognition

    NASA Astrophysics Data System (ADS)

    Hmam, Hatem; Kim, Jijoong

    2002-08-01

    Automatic aircraft recognition is very complex because of clutter, shadows, clouds, self-occlusion and degraded imaging conditions. This paper presents an aircraft recognition system, which assumes from the start that the image is possibly degraded, and implements a number of strategies to overcome edge fragmentation and distortion. The current vision system employs a bottom-up approach, where recognition begins by locating image primitives (e.g., lines and corners), which are then combined in an incremental fashion into larger sets of line groupings using knowledge about aircraft, as viewed from a generic viewpoint. Knowledge about aircraft is represented in the form of whole/part shape description and the connectedness property, and is embedded in production rules, which primarily aim at finding instances of the aircraft parts in the image and checking the connectedness property between the parts. Once a match is found, a confidence score is assigned, and as evidence in support of an aircraft interpretation is accumulated, the score is increased proportionally. Finally, a selection of the resulting image interpretations with the highest scores is subjected to competition tests, and only non-ambiguous interpretations are allowed to survive. Experimental results demonstrating the effectiveness of the current recognition system are given.

  13. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    A new fuzzy-set-based technique that was developed for decision making is discussed: a method to generate fuzzy decision rules automatically for image analysis. From training data, the method generates rule-based approaches to problems such as autonomous navigation and image understanding. It is also capable of filtering irrelevant features and criteria out of the rules.

  14. Automatic Weather Station (AWS) Lidar

    NASA Technical Reports Server (NTRS)

    Rall, Jonathan A.R.; Abshire, James B.; Spinhirne, James D.; Smith, David E. (Technical Monitor)

    2000-01-01

    An autonomous, low-power atmospheric lidar instrument is being developed at NASA Goddard Space Flight Center. This compact, portable lidar will operate continuously in a temperature controlled enclosure, charge its own batteries through a combination of a small rugged wind generator and solar panels, and transmit its data from remote locations to ground stations via satellite. A network of these instruments will be established by co-locating them at remote Automatic Weather Station (AWS) sites in Antarctica under the auspices of the National Science Foundation (NSF). The NSF Office of Polar Programs provides support to place the weather stations in remote areas of Antarctica in support of meteorological research and operations. The AWS meteorological data will directly benefit the analysis of the lidar data while a network of ground based atmospheric lidar will provide knowledge regarding the temporal evolution and spatial extent of Type Ia polar stratospheric clouds (PSC). These clouds play a crucial role in the annual austral springtime destruction of stratospheric ozone over Antarctica, i.e. the ozone hole. In addition, the lidar will monitor and record the general atmospheric conditions (transmission and backscatter) of the overlying atmosphere which will benefit the Geoscience Laser Altimeter System (GLAS). Prototype lidar instruments have been deployed to the Amundsen-Scott South Pole Station (1995-96, 2000) and to an Automated Geophysical Observatory site (AGO 1) in January 1999. We report on data acquired with these instruments, instrument performance, and anticipated performance of the AWS Lidar.

  15. Automatic landslides detection on Stromboli volcanic Island

    NASA Astrophysics Data System (ADS)

    Silengo, Maria Cristina; Delle Donne, Dario; Ulivieri, Giacomo; Cigolini, Corrado; Ripepe, Maurizio

    2016-04-01

    Landslides occurring on active volcanic islands play a key role in triggering tsunamis and other related risks. It therefore becomes vital for correct and prompt risk assessment to monitor landslide activity and to have an automatic system for robust early warning. We developed a system based on multi-frequency analysis of seismic signals for automatic detection of landslides occurring at Stromboli volcano, using a network of four 3-component seismic stations located along the unstable flank of the Sciara del Fuoco. Our method is able to recognize and separate the different sources of seismic signals related to volcanic and tectonic activity (e.g. tremor, explosions, earthquakes) from landslides. This is done using multi-frequency analysis combined with waveform pattern recognition. We applied the method to one year of seismic activity of Stromboli volcano centered on the 2007 effusive eruption, which was characterized by pre-eruptive landslide activity reflecting the slow deformation of the volcano edifice. The algorithm is at the moment running off-line but has proved to be robust and efficient in automatically picking landslides. The method also provides real-time statistics on landslide occurrence, which could be used as a proxy for volcano deformation during pre-eruptive phases. The method is very promising since the number of false detections is quite small (<5%) and decreases as the size of the landslide increases. The final aim will be to apply this method on-line for real-time automatic detection as an improved tool for early warning of tsunami-genic landslide activity. We suggest that a similar approach could also be applied to other unstable, non-volcanic slopes.
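A minimal sketch of the multi-frequency idea: classify a seismic window by the ratio of spectral energy in a high band to a low band, since landslide signals are broadband while volcanic tremor concentrates at low frequencies. The bands, threshold, and sampling rate below are illustrative assumptions, not the values used at Stromboli.

```python
import numpy as np

def band_energy(sig, fs, f_lo, f_hi):
    """Spectral energy of sig in the [f_lo, f_hi) Hz band."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    return spec[(freqs >= f_lo) & (freqs < f_hi)].sum()

def classify(sig, fs=50.0):
    """Label a seismic window by its high/low frequency energy ratio.
    Bands (10-25 Hz vs 1-5 Hz) and the threshold of 1.0 are illustrative."""
    ratio = band_energy(sig, fs, 10.0, 25.0) / band_energy(sig, fs, 1.0, 5.0)
    return "landslide" if ratio > 1.0 else "tremor/explosion"

fs = 50.0
t = np.arange(0.0, 10.0, 1.0 / fs)
tremor = np.sin(2 * np.pi * 2.0 * t)            # narrow-band, low-frequency signal
rng = np.random.default_rng(0)
landslide = rng.standard_normal(t.size)         # broadband proxy for a landslide
```

A production detector would combine this spectral test with the waveform pattern recognition the abstract mentions, rather than rely on a single band ratio.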

  16. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to, and by a specification language that is more natural to the user's problem domain and way of thinking about the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools that assist the modeler in defining or constructing a model of the system and then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  17. Clothes Dryer Automatic Termination Evaluation

    SciTech Connect

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  18. Automatic pump for deep wells

    SciTech Connect

    Brown, K.D.

    1981-11-24

    An automatic pump for deep wells comprises a long stroke reciprocating pump having its piston normally in its bottom position and an automatic control dependent upon the collection of a predetermined amount of liquid in the pump cylinder above the piston for actuating the piston to pump the liquid into a production line. The automatic control includes an electric motor driven hydraulic pump and a reservoir of hydraulic fluid which is actuated upon filling of the reciprocating pump chamber to supply hydraulic fluid to a closed chamber below the piston and force the piston upwardly to discharge liquid from the pump cylinder. Gas collected in the top of the pump cylinder results in low starting current and a saving of energy. The hydraulic pump is reversed automatically upon completion of the pumping stroke of the piston.

  19. Automatic Classification in Information Retrieval.

    ERIC Educational Resources Information Center

    van Rijsbergen, C. J.

    1978-01-01

    Addresses the application of automatic classification methods to the problems associated with computerized document retrieval. Different kinds of classifications are described, and both document and term clustering methods are discussed. References and notes are provided. (Author/JD)

  20. Automatic safety rod for reactors

    DOEpatents

    Germer, John H.

    1988-01-01

    An automatic safety rod for a nuclear reactor containing neutron absorbing material and designed to be inserted into a reactor core after a loss of core flow. Actuation occurs when there is either a sudden decrease in core pressure drop or a fall of the pressure drop below a predetermined minimum value. The automatic safety rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  1. Indian summer heat wave of 2015: a biometeorological analysis using half hourly automatic weather station data with special reference to Andhra Pradesh

    NASA Astrophysics Data System (ADS)

    Sarath Chandran, M. A.; Subba Rao, A. V. M.; Sandeep, V. M.; Pramod, V. P.; Pani, P.; Rao, V. U. M.; Visha Kumari, V.; Srinivasa Rao, Ch

    2016-12-01

    Heat wave is a hazardous weather-related extreme event that affects living beings. The 2015 summer heat wave affected many regions in India and caused the death of 2248 people across the country. An attempt has been made to quantify the intensity and duration of the heat wave that resulted in high mortality across the country. Half hourly Physiologically Equivalent Temperature (PET), based on a complete heat budget of the human body, was estimated using automatic weather station (AWS) data from four locations in Andhra Pradesh state, where the maximum number of deaths was reported. The heat wave characterization using PET revealed that extreme heat load conditions (PET >41 °C) existed in all four locations throughout May during 2012-2015, with varying intensity. The intensity and duration of heat waves characterized by the "area under the curve" method showed good results for the Srikakulam and Undi locations. Variations in PET during each half-hour were estimated. Such studies will help in fixing thresholds for defining heat waves, designing early warning systems, etc.
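The "area under the curve" characterization can be sketched as degree-hours of PET exceedance above the extreme-heat-load threshold (41 °C), integrated over a half-hourly series with the trapezoid rule. The PET values below are invented for illustration, not station data from the study.

```python
import numpy as np

# Hypothetical half-hourly PET series (°C) over a 6-hour afternoon window
pet = np.array([38, 40, 42, 44, 45.5, 46, 45, 43.5, 42, 40.5, 39, 38, 37])
hours = np.arange(pet.size) * 0.5          # elapsed time in hours

# Exceedance above the extreme-heat-load threshold (PET > 41 °C)
exceedance = np.clip(pet - 41.0, 0.0, None)

# Area under the exceedance curve (degree-hours), trapezoid rule
heat_load = np.sum((exceedance[1:] + exceedance[:-1]) / 2 * np.diff(hours))
```

A larger `heat_load` means a hotter and/or longer event, which is exactly the intensity-plus-duration quantity the abstract uses to rank heat waves across locations.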

  2. Factors influencing relative speech intelligibility in patients with oral squamous cell carcinoma: a prospective study using automatic, computer-based speech analysis.

    PubMed

    Stelzle, F; Knipfer, C; Schuster, M; Bocklet, T; Nöth, E; Adler, W; Schempf, L; Vieler, P; Riemann, M; Neukam, F W; Nkenke, E

    2013-11-01

    Oral squamous cell carcinoma (OSCC) and its treatment impair speech intelligibility by alteration of the vocal tract. The aim of this study was to identify the factors of oral cancer treatment that influence speech intelligibility by means of an automatic, standardized speech-recognition system. The study group comprised 71 patients (mean age 59.89, range 35-82 years) with OSCC ranging from stage T1 to T4 (TNM staging). Tumours were located on the tongue (n=23), lower alveolar crest (n=27), and floor of the mouth (n=21). Reconstruction was conducted through local tissue plasty or microvascular transplants. Adjuvant radiotherapy was performed in 49 patients. Speech intelligibility was evaluated before, and at 3, 6, and 12 months after tumour resection, and compared to that of a healthy control group (n=40). Postoperatively, significant influences on speech intelligibility were tumour localization (P=0.010) and resection volume (P=0.019). Additionally, adjuvant radiotherapy (P=0.049) influenced intelligibility at 3 months after surgery. At 6 months after surgery, influences were resection volume (P=0.028) and adjuvant radiotherapy (P=0.034). The influence of tumour localization (P=0.001) and adjuvant radiotherapy (P=0.022) persisted after 12 months. Tumour localization, resection volume, and radiotherapy are crucial factors for speech intelligibility. Radiotherapy significantly impaired word recognition rate (WR) values with a progression of the impairment for up to 12 months after surgery.

  3. A training programme involving automatic self-transcending meditation in late-life depression: preliminary analysis of an ongoing randomised controlled trial

    PubMed Central

    Arena, Amanda; Burhan, Amer M.; Ionson, Emily; Hirjee, Hussein; Maldeniya, Pramudith; Wetmore, Stephen; Newman, Ronnie I.

    2016-01-01

    Late-life depression affects 2–6% of seniors aged 60 years and above. Patients are increasingly embracing non-pharmacological therapies, many of which have not been scientifically evaluated. This study aimed to evaluate a category of meditation, automatic self-transcending meditation (ASTM), in alleviating symptoms of depression when augmenting treatment as usual (NCT02149810). The preliminary results of an ongoing single-blind randomised controlled trial comparing a training programme involving ASTM with a wait-list control indicate that a 12-week ASTM programme may lead to significantly greater reductions in depression and anxiety severity. As such, ASTM may be an effective adjunctive therapy in the treatment of late-life depression. Declaration of interest R.I.N. is Director of Research and Health Promotion for the Art of Living Foundation, Canada and supervised the staff providing ASTM training. PMID:27703774

  4. Automatic Collision Avoidance Technology (ACAT)

    NASA Technical Reports Server (NTRS)

    Swihart, Donald E.; Skoog, Mark A.

    2007-01-01

    This document represents two views of the Automatic Collision Avoidance Technology (ACAT). One viewgraph presentation reviews the development and system design of ACAT. Two types of ACAT exist: Automatic Ground Collision Avoidance (AGCAS) and Automatic Air Collision Avoidance (AACAS). The AGCAS uses Digital Terrain Elevation Data (DTED) for mapping functions and uses navigation data to place the aircraft on the map. It then scans the DTED in front of and around the aircraft and uses the future aircraft trajectory (5g) to provide an automatic flyup maneuver when required. The AACAS uses a data link to determine position and closing rate. It contains several canned maneuvers to avoid collision. Automatic maneuvers can occur at the last instant, and both aircraft maneuver when using the data link. The system can use a sensor in place of the data link. The second viewgraph presentation reviews the development of a flight test and an evaluation of the test, including a review of the operation and a comparison of the AGCAS with a pilot's performance; the same review is given for the AACAS.

  5. An automatic composition model of Chinese folk music

    NASA Astrophysics Data System (ADS)

    Zheng, Xiaomei; Li, Dongyang; Wang, Lei; Shen, Lin; Gao, Yanyuan; Zhu, Yuanyuan

    2017-03-01

    Automatic composition has achieved rich results in recent decades for Western and some other musical traditions, but Chinese music has been less involved. After thousands of years of development, Chinese folk music offers a wealth of resources, so designing an automatic composition model that learns the characteristics of Chinese folk melody and imitates the creative process of music is of some significance. According to the melodic features of Chinese folk music, a composition model based on a Markov model is proposed to analyze Chinese traditional music. Folk songs with typical Chinese national characteristics are selected for analysis, and an example of automatic composition is given. The experimental results show that this composition model can produce music with the characteristics of Chinese folk music.
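A minimal sketch of such a Markov composition model: learn first-order note-to-note transition statistics from a corpus, then sample a new melody from them. The pentatonic toy corpus below (gong, shang, jue, zhi, yu, rendered as C D E G A) is invented for illustration, not taken from the paper's data.

```python
import random
from collections import defaultdict

# Toy corpus of pentatonic phrases standing in for Chinese folk melodies
corpus = [
    ["C", "D", "E", "G", "E", "D", "C"],
    ["G", "A", "G", "E", "D", "C", "D"],
    ["E", "G", "A", "G", "E", "D", "C"],
]

# First-order Markov chain: record every observed note-to-note transition;
# sampling from these lists reproduces the corpus transition frequencies.
transitions = defaultdict(list)
for phrase in corpus:
    for a, b in zip(phrase, phrase[1:]):
        transitions[a].append(b)

def compose(start="C", length=8, seed=42):
    """Generate a melody by sampling successive notes from the learned chain."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        melody.append(rng.choice(transitions[melody[-1]]))
    return melody
```

Higher-order chains (conditioning on the previous two or three notes) capture more of a style's phrase structure at the cost of needing a larger corpus.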

  6. 12 CFR 925.4 - Automatic membership.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Automatic membership. 925.4 Section 925.4 Banks... MEMBERS OF THE BANKS Membership Application Process § 925.4 Automatic membership. (a) Automatic membership... between the member and the Bank at the time of such conversion may continue. (b) Automatic membership...

  7. Automatic morphometry of nerve histological sections.

    PubMed

    Romero, E; Cuisenaire, O; Denef, J F; Delbeke, J; Macq, B; Veraart, C

    2000-04-15

    A method for the automatic segmentation, recognition and measurement of neuronal myelinated fibers in nerve histological sections is presented. In this method, the fiber parameters, i.e. perimeter, area, position of the fiber and myelin sheath thickness, are automatically computed; obliquity of the sections may be taken into account. First, the image is thresholded to provide a coarse classification between myelin and non-myelin pixels. Next, the resulting binary image is further simplified using connected morphological operators. By applying semantic rules to the zonal graph, axon candidates are identified; these are either isolated or still connected. Separation of connected fibers is then performed by evaluating myelin sheath thickness around each candidate area with a Euclidean distance transformation. Finally, properties of each detected fiber are computed and false positives are removed. The accuracy of the method is assessed by evaluating missed detections and the false positive ratio, and by comparing the results to the manual procedure with sampling. Over the evaluated nerve surface, 0.9% false positives were found, along with 6.36% missed detections. The resulting histograms show strong correlation with those obtained by manual measurement. The noise introduced by this method is significantly lower than the intrinsic sampling variability. This automatic method constitutes an original tool for morphometrical analysis.
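The thresholding and Euclidean-distance-transform steps can be sketched on a synthetic annular "myelin" region. The brute-force transform and the ring geometry are illustrative assumptions; a real pipeline would run an optimized distance transform on segmented micrographs.

```python
import numpy as np

def distance_transform(binary):
    """Brute-force Euclidean distance transform: for each foreground pixel,
    the distance to the nearest background pixel (fine for small images)."""
    fg = np.argwhere(binary)
    bg = np.argwhere(~binary)
    out = np.zeros(binary.shape)
    for y, x in fg:
        out[y, x] = np.sqrt(((bg - (y, x)) ** 2).sum(axis=1)).min()
    return out

# Synthetic thresholded "myelin" ring (annulus) around an axon lumen
yy, xx = np.mgrid[0:21, 0:21]
r = np.hypot(yy - 10, xx - 10)
myelin = (r >= 4) & (r <= 7)

# The ridge (maximum) of the distance map sits mid-sheath, so twice the
# ridge value approximates the local myelin sheath thickness in pixels.
dist = distance_transform(myelin)
ridge = dist.max()
```

Evaluating this thickness around each candidate is what lets the published method decide where one fiber ends and a touching neighbor begins.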

  8. On-line dynamic fractionation and automatic determination of inorganic phosphorus in environmental solid substrates exploiting sequential injection microcolumn extraction and flow injection analysis.

    PubMed

    Buanuam, Janya; Miró, Manuel; Hansen, Elo Harald; Shiowatana, Juwadee

    2006-06-16

    Sequential injection microcolumn extraction (SI-MCE) based on the implementation of a soil-containing microcartridge as external reactor in a sequential injection network is, for the first time, proposed for dynamic fractionation of macronutrients in environmental solids, as exemplified by the partitioning of inorganic phosphorus in agricultural soils. The on-line fractionation method capitalises on the accurate metering and sequential exposure of the various extractants to the solid sample by application of programmable flow as precisely coordinated by a syringe pump. Three different soil phase associations for phosphorus, that is, exchangeable, Al- and Fe-bound, and Ca-bound fractions, were elucidated by accommodation in the flow manifold of the three steps of the Hieltjes-Lijklema (HL) scheme involving the use of 1.0M NH4Cl, 0.1M NaOH and 0.5M HCl, respectively, as sequential leaching reagents. The precise timing and versatility of SI for tailoring various operational extraction modes were utilized for investigating the extractability and the extent of phosphorus re-distribution for variable partitioning times. Automatic spectrophotometric determination of soluble reactive phosphorus in soil extracts was performed by a flow injection (FI) analyser based on the Molybdenum Blue (MB) chemistry. The 3σ detection limit was 0.02 mg P L⁻¹ while the linear dynamic range extended up to 20 mg P L⁻¹ regardless of the extracting media. Despite the variable chemical composition of the HL extracts, a single FI set-up was assembled with no need for either manifold re-configuration or modification of chemical composition of reagents. The mobilization of trace elements, such as Cd, often present in grazed pastures as a result of the application of phosphate fertilizers, was also explored in the HL fractions by electrothermal atomic absorption spectrometry.

  9. Solidification of Magnesium (AM50A)/5 vol.% SiCp composite

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Hu, H.

    2012-01-01

    Magnesium matrix composite is one of the advanced lightweight materials with high potential to be used in automotive and aircraft industries due to its low density and high specific mechanical properties. The magnesium composites can be fabricated by adding the reinforcements of fibers or/and particles. In the previous literature, extensive studies have been performed on the development of matrix grain structure of aluminum-based metal matrix composites. However, there is limited information available on the development of grain structure during the solidification of particulate-reinforced magnesium. In this work, a 5 vol.% SiCp particulate-reinforced magnesium (AM50A) matrix composite (AM50A/SiCp) was prepared by stir casting. The solidification behavior of the cast AM50A/SiCp composite was investigated by computer-based thermal analysis. Optical and scanning electron microscopies (SEM) were employed to examine the occurrence of nucleation and grain refinement involved. The results indicate that the addition of SiCp particulates leads to a finer grain structure in the composite compared with the matrix alloy. The refinement of grain structure should be attributed to both the heterogeneous nucleation and the restricted primary crystal growth.

  10. Warmer temperatures stimulate respiration and reduce net ecosystem productivity in a northern Great Plains grassland: Analysis of CO2 exchange in automatic chambers

    NASA Astrophysics Data System (ADS)

    Flanagan, L. B.

    2013-12-01

    The interacting effects of altered temperature and precipitation are expected to have significant consequences for ecosystem net carbon storage. Here I report the results of an experiment that evaluated the effects of elevated temperature and altered precipitation on ecosystem CO2 exchange in a northern Great Plains grassland, near Lethbridge, Alberta Canada. Open-top chambers were used to establish an experiment in 2012 with three treatments (control, warmed, warmed plus 50% of normal precipitation input). A smaller experiment with only the two temperature treatments (control and warmed) was conducted in 2013. Continuous half-hourly net CO2 exchange measurements were made using nine automatic chambers during May-October in both years. My objectives were to determine the sensitivity of the ecosystem carbon budget to temperature and moisture manipulations, and to test for direct and indirect effects of the environmental changes on ecosystem CO2 exchange. The experimental manipulations resulted primarily in a significant increase in air temperature in the warmed treatment plots. A cumulative net loss of carbon or negative net ecosystem productivity (NEP) occurred during May through September in the warmed treatment (NEP = -659 g C m-2), while in the control treatment there was a cumulative net gain of carbon (NEP = +50 g C m-2). An eddy covariance system that operated at the site, over a footprint region that was not influenced by the experimental treatments, also showed a net gain of carbon by the ecosystem. The reduced NEP was due to higher plant and soil respiration rates in the warmed treatment that appeared to be caused by a combination of: (i) higher carbon substrate availability indirectly stimulating soil respiration in the warmed relative to the control treatment, and (ii) a strong increase in leaf respiration likely caused by a shift in electron partitioning to the alternative pathway respiration in the warmed treatment, particularly when exposed to high

  11. Automatically Tracing Information Flow of Vulnerability and Cyber-Attack Information through Text Strings

    DTIC Science & Technology

    2008-06-01

    network security. Computer Communications, Vol. 30, No. 9, 2032-2047. Li, B., Wang, Q., & Luo, J. 2006, December. Forensic analysis of document...come bundled with a legitimate piece of software the user actually wants, such as a game or emoticon. http://www.greatarticleshere.com/aid32813/Adware

  12. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  13. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, Anthony J.

    1994-05-10

    Disclosed are a method and apparatus for (1) automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, (2) automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, (3) manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and (4) automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly.

  14. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, A.J.

    1994-05-10

    Disclosed are a method and apparatus for automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly. 10 figures.

  15. Automatic cytometric device using multiple wavelength excitations

    NASA Astrophysics Data System (ADS)

    Rongeat, Nelly; Ledroit, Sylvain; Chauvet, Laurence; Cremien, Didier; Urankar, Alexandra; Couderc, Vincent; Nérin, Philippe

    2011-05-01

    Precise identification of eosinophils, basophils, and specific subpopulations of blood cells (B lymphocytes) in an unconventional automatic hematology analyzer is demonstrated. Our apparatus mixes two excitation radiations by means of an acousto-optic tunable filter to properly control the fluorescence emission of phycoerythrin cyanin 5 (PC5) conjugated to antibodies (anti-CD20 or anti-CRTH2) and of Thiazole Orange. In this way our analyzer, combining techniques of hematology analysis and flow cytometry based on multiple fluorescence detection, drastically improves the signal-to-noise ratio and decreases the impact of spectral overlap among multiple fluorescence emissions.

  16. Grinding Parts For Automatic Welding

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  17. Automatic interpretation of Schlumberger soundings

    SciTech Connect

    Ushijima, K.

    1980-09-01

    The automatic interpretation of apparent resistivity curves from horizontally layered earth models is carried out by the curve-fitting method in three steps: (1) the observed VES data are interpolated at equidistant points of electrode separation on the logarithmic scale by using the cubic spline function; (2) the layer parameters, namely resistivities and depths, are predicted from the sampled apparent resistivity values by the SALS system program; and (3) the theoretical VES curves for the models are calculated by Ghosh's linear filter method using Zohdy's computer program. Two soundings taken over the Takenoyu geothermal area were chosen to test the automatic interpretation procedure.
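
    Step (1), resampling at points equidistant on the logarithmic scale, can be sketched as follows. For brevity this uses linear interpolation in log-log space as a stand-in for the cubic spline named in the abstract, and the sounding values are invented:

```python
import math

# Sketch of log-equidistant resampling of a VES sounding curve.
# Linear interpolation in log-log space stands in for the cubic spline;
# the sounding data below are hypothetical, not the Takenoyu soundings.

def resample_log_equidistant(ab2, rho_a, points_per_decade=6):
    """Resample (AB/2, apparent resistivity) pairs at abscissae
    equidistant in log10(spacing), interpolating in log-log space."""
    xs = [math.log10(a) for a in ab2]
    ys = [math.log10(r) for r in rho_a]
    step = 1.0 / points_per_decade
    n = int(math.floor((xs[-1] - xs[0]) * points_per_decade)) + 1
    out_x, out_y = [], []
    j = 0  # index of the segment containing the current abscissa
    for k in range(n):
        x = xs[0] + k * step
        while j + 1 < len(xs) - 1 and xs[j + 1] < x:
            j += 1
        t = (x - xs[j]) / (xs[j + 1] - xs[j])
        out_x.append(10.0 ** x)
        out_y.append(10.0 ** (ys[j] + t * (ys[j + 1] - ys[j])))
    return out_x, out_y

# Hypothetical sounding: half-spacings AB/2 (m) and apparent
# resistivities (ohm-m); the numbers are illustrative only.
ab2 = [1.0, 2.0, 5.0, 10.0, 22.0, 46.0, 100.0]
rho = [50.0, 48.0, 40.0, 30.0, 22.0, 20.0, 25.0]
spacings, rhos = resample_log_equidistant(ab2, rho)
```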

  18. Automatic diluter for bacteriological samples.

    PubMed Central

    Trinel, P A; Bleuze, P; Leroy, G; Moschetto, Y; Leclerc, H

    1983-01-01

    The described apparatus, carrying 190 tubes, allows automatic and aseptic dilution of liquid or suspended-solid samples. Serial 10-fold dilutions are programmable from 10(-1) to 10(-9) and are carried out in glass tubes with screw caps and split silicone septa. Dilution assays performed with strains of Escherichia coli and Bacillus stearothermophilus allowed efficient needle-sterilization conditions to be defined and showed that the automatic dilutions were as accurate and as reproducible as the most rigorous conventional dilutions. PMID:6338826
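
    The 10-fold bookkeeping behind such a series is simple; a hypothetical sketch (the tube and transfer volumes are assumptions, not the instrument's actual settings) is:

```python
# Sketch of serial 10-fold dilution arithmetic. Volumes and the starting
# concentration are illustrative assumptions, not the apparatus settings.

def serial_dilutions(c0, steps, diluent_ml=9.0, transfer_ml=1.0):
    """Concentration in each tube of a serial dilution series.

    Each step transfers `transfer_ml` of the previous tube into
    `diluent_ml` of sterile diluent, so the per-step factor is
    transfer / (transfer + diluent) = 1/10 with the defaults.
    """
    factor = transfer_ml / (transfer_ml + diluent_ml)
    concs, c = [], c0
    for _ in range(steps):
        c *= factor
        concs.append(c)
    return concs

# A 10^-1 .. 10^-9 series from a 1e9 CFU/ml starting suspension.
series = serial_dilutions(1e9, steps=9)
```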

  19. Traduction automatique et terminologie automatique (Automatic Translation and Automatic Terminology)

    ERIC Educational Resources Information Center

    Dansereau, Jules

    1978-01-01

    An exposition of reasons why a system of automatic translation could not use a terminology bank except as a source of information. The fundamental difference between the two tools is explained and examples of translation and mistranslation are given as evidence of the limits and possibilities of each process. (Text is in French.) (AMH)

  20. Performance characterization of the MiniVol PM{sub 2.5} sampler

    SciTech Connect

    Hill, J.S.; Patel, P.D.; Turner, J.R.

    1999-07-01

    Measurements were conducted to assess the MiniVol PM{sub 2.5} sampler performance for various particle preseparator configurations, including flat and cup impaction stages. Laboratory measurements were conducted to determine the impactor collection efficiency as a function of particle size. Impactor cut points--the aerodynamic particle diameter exhibiting 50% collection efficiency--were 2.5 µm (approximately 10%) for the flat stage and 3.0 µm for the cup stage. Collection efficiency curves for cascade (tandem) impactor configurations (PM{sub 10} followed by PM{sub 2.5}) generally agreed with the single-stage results. In all cases the collection efficiency curves exhibited the classical sigmoidal shape, albeit less steep than required for PM{sub 2.5} Federal Reference Method (FRM) samplers. Field data were collected at urban sites in St. Louis using MiniVol samplers collocated with a PM{sub 2.5} FRM sampler to quantify the MiniVol sampler precision and accuracy. Collocated sampler precision was 6% and 10% for MiniVol cascade impactors with flat PM{sub 2.5} stages and cup PM{sub 2.5} stages, respectively (N = 31). Both of these MiniVol configurations were deemed statistically equivalent to the FRM when the reported ambient mass concentrations were corrected for field blank values (N = 15).
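
    The sigmoidal efficiency curve and its 50% cut point can be sketched numerically; the logistic curve shape and steepness below are illustrative assumptions, not the measured MiniVol characteristics:

```python
import math

# Sketch: a sigmoidal impactor collection-efficiency curve and a
# bisection search for its 50% cut point (d50). Curve shape and
# steepness are assumptions for illustration only.

def efficiency(d, d50=2.5, steepness=6.0):
    """Logistic collection efficiency vs. aerodynamic diameter (um)."""
    return 1.0 / (1.0 + math.exp(-steepness * (d - d50)))

def find_cut_point(eff, lo=0.1, hi=10.0, tol=1e-6):
    """Bisect for the diameter where collection efficiency reaches 50%."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if eff(mid) < 0.5:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

d50 = find_cut_point(efficiency)
```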

  1. NASA automatic system for computer program documentation, volume 2

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.

    1972-01-01

    The DYNASOR 2 program is used for the dynamic nonlinear analysis of shells of revolution. The equations of motion of the shell are solved using Houbolt's numerical procedure. The displacements and stress resultants are determined for both symmetrical and asymmetrical loading conditions. Asymmetrical dynamic buckling can be investigated. Solutions can be obtained for highly nonlinear problems utilizing as many as five of the harmonics generated by the SAMMSOR program. A restart capability allows the user to restart the program at a specified time. For Vol. 1, see N73-22129.

  2. Automatic Recognition of Deaf Speech.

    ERIC Educational Resources Information Center

    Abdelhamied, Kadry; And Others

    1990-01-01

    This paper describes a speech perception system for automatic recognition of deaf speech. Using a two-step segmentation approach on 468 utterances by 2 hearing-impaired men and 2 normal-hearing men, recognition rates as high as 93.01 percent for isolated words and 81.81 percent for connected speech were obtained from deaf speech…

  3. Automatic calculation in quarkonium physics

    NASA Astrophysics Data System (ADS)

    Gong, Bin; Wan, Lu-Ping; Wang, Jian-Xiong; Zhang, Hong-Fei

    2014-06-01

    In this report, an automatic calculation package based on REDUCE and RLISP, FDC, is introduced, with emphasis on its one-loop calculation part and its special treatment of quarkonium physics. With FDC, many works have been completed, most of which are very important in solving or clarifying current puzzles in quarkonium physics.

  4. Automatic Identification of Metaphoric Utterances

    ERIC Educational Resources Information Center

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  5. Automatic evaluation of interferograms

    NASA Technical Reports Server (NTRS)

    Becker, F.

    1982-01-01

    A system for the evaluation of interference patterns was developed. For digitizing and processing of the interferograms from classical and holographic interferometers, a picture analysis system based upon a computer with a television digitizer was installed. Depending on the quality of the interferograms, four different picture enhancement operations may be used: signal averaging, spatial smoothing, subtraction of the overlaid intensity function, and removal of distortion patterns using a spatial filtering technique in the frequency spectrum of the interferograms. The extraction of fringe loci from the digitized interferograms is performed by a floating-threshold method. The fringes are numbered using a special scheme after the removal of any fringe disconnections, which appeared when there was insufficient contrast in the holograms. The reconstruction of the object function from the fringe field uses a least-squares approximation with spline fit. Applications are given.
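
    A floating threshold compares each sample against a locally adapted level rather than a global one; a simplified one-dimensional sketch (the window size and synthetic fringe signal are assumptions, not the system's actual parameters) is:

```python
import math

# Sketch of a floating-threshold fringe locator: compare each sample of
# a 1-D intensity scan against a local moving-average threshold and
# record threshold crossings as fringe edges. A simplified stand-in for
# the 2-D method in the abstract; window and signal are assumptions.

def floating_threshold_crossings(signal, window=9):
    half = window // 2
    crossings = []
    prev_above = None
    for i, v in enumerate(signal):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        local_mean = sum(signal[lo:hi]) / (hi - lo)
        above = v > local_mean
        if prev_above is not None and above != prev_above:
            crossings.append(i)
        prev_above = above
    return crossings

# Synthetic fringe pattern: cosine intensity over a slowly drifting bias.
scan = [0.5 + 0.1 * i / 200 + 0.4 * math.cos(2 * math.pi * i / 20)
        for i in range(200)]
edges = floating_threshold_crossings(scan)
```

    Because the threshold follows the local mean, the slow bias drift does not shift the detected fringe positions the way a fixed global threshold would.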

  6. Stationary table CT dosimetry and anomalous scanner-reported values of CTDI{sub vol}

    SciTech Connect

    Dixon, Robert L.; Boone, John M.

    2014-01-15

    Purpose: Anomalous, scanner-reported values of CTDI{sub vol} for stationary phantom/table protocols (having elevated values of CTDI{sub vol} over 300% higher than the actual dose to the phantom) have been observed; which are well-beyond the typical accuracy expected of CTDI{sub vol} as a phantom dose. Recognition of these outliers as “bad data” is important to users of CT dose index tracking systems (e.g., ACR DIR), and a method for recognition and correction is provided. Methods: Rigorous methods and equations are presented which describe the dose distributions for stationary-table CT. A comparison with formulae for scanner-reported values of CTDI{sub vol} clearly identifies the source of these anomalies. Results: For the stationary table, use of the CTDI{sub 100} formula (applicable to a moving phantom only) overestimates the dose due to extra scatter and also includes an overbeaming correction, both of which are nonexistent when the phantom (or patient) is held stationary. The reported DLP remains robust for the stationary phantom. Conclusions: The CTDI-paradigm does not apply in the case of a stationary phantom and simpler nonintegral equations suffice. A method of correction of the currently reported CTDI{sub vol} using the approach-to-equilibrium formula H(a) and an overbeaming correction factor serves to scale the reported CTDI{sub vol} values to more accurate levels for stationary-table CT, as well as serving as an indicator in the detection of “bad data.”.

  7. Users manual for AUTOMESH-2D: A program of automatic mesh generation for two-dimensional scattering analysis by the finite element method

    NASA Technical Reports Server (NTRS)

    Hua, Chongyu; Volakis, John L.

    1990-01-01

    AUTOMESH-2D is a computer program specifically designed as a preprocessor for the scattering analysis of two-dimensional bodies by the finite element method. This program was developed due to a need for reducing the effort required to define and check the geometry data, element topology, and material properties. There are six modules in the program: (1) Parameter Specification; (2) Data Input; (3) Node Generation; (4) Element Generation; (5) Mesh Smoothing; and (6) Data File Generation.

  8. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  9. Automatic segmentation of solitary pulmonary nodules based on local intensity structure analysis and 3D neighborhood features in 3D chest CT images

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Kitasaka, Takayuki; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku

    2012-03-01

    This paper presents a solitary pulmonary nodule (SPN) segmentation method based on local intensity structure analysis and neighborhood feature analysis in chest CT images. Automated segmentation of SPNs is desirable for a chest computer-aided detection/diagnosis (CAD) system, since an SPN may indicate an early stage of lung cancer. Due to the similar intensities of SPNs and other chest structures such as blood vessels, many false positives (FPs) are generated by nodule detection methods. To reduce such FPs, we introduce two features that analyze the relation between each segmented nodule candidate and its neighborhood region. The proposed method utilizes a blob-like structure enhancement (BSE) filter based on Hessian analysis to augment the blob-like structures as initial nodule candidates. Then a fine segmentation is performed to segment a much more accurate region of each nodule candidate. FP reduction is mainly addressed by investigating two neighborhood features, based on the volume ratio and the eigenvector of the Hessian, that are calculated from the neighborhood region of each nodule candidate. We evaluated the proposed method using 40 chest CT images, including 20 standard-dose CT images randomly chosen from a local database and 20 low-dose CT images randomly chosen from a public database: LIDC. The experimental results revealed that the average TP rate of the proposed method was 93.6% with 12.3 FPs/case.
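
    The Hessian-based blob criterion can be illustrated in two dimensions with finite differences; the test below (both eigenvalues negative at a bright blob) is a simplified assumption in the spirit of the BSE filter, not the authors' exact formulation:

```python
import math

# Sketch: a 2-D Hessian-based blob measure via finite differences on a
# small synthetic image. The blobness criterion (both eigenvalues of the
# Hessian negative at a bright blob) is a simplified illustrative
# assumption, not the paper's exact BSE filter.

def hessian_blobness(img, y, x):
    """Return True if the local Hessian indicates a bright blob."""
    dyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
    dxx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
    dxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    # Eigenvalues of the symmetric 2x2 Hessian [[dxx, dxy], [dxy, dyy]].
    tr, det = dxx + dyy, dxx * dyy - dxy * dxy
    disc = math.sqrt(max(0.0, tr * tr / 4.0 - det))
    lo_ev, hi_ev = tr / 2.0 - disc, tr / 2.0 + disc
    return lo_ev < 0 and hi_ev < 0  # intensity curves down in all directions

# Synthetic image: a Gaussian blob centered at (8, 8) on a 17x17 grid.
n = 17
img = [[math.exp(-((y - 8) ** 2 + (x - 8) ** 2) / 8.0) for x in range(n)]
       for y in range(n)]
at_center = hessian_blobness(img, 8, 8)
off_blob = hessian_blobness(img, 2, 2)
```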

  10. Manual and Automatic Lineament Mapping: Comparing Results

    NASA Astrophysics Data System (ADS)

    Vaz, D. A.; di Achille, G.; Barata, M. T.; Alves, E. I.

    2008-03-01

    A method for automatic lineament extraction using topographic data is applied on the Thaumasia plateau. A comparison is made between the results that are obtained from the automatic mapping approach and from a traditional tectonic lineament mapping.

  11. Tank Farms Technical Safety Requirements [VOL 1 and 2

    SciTech Connect

    CASH, R.J.

    2000-12-28

    The Technical Safety Requirements (TSRs) define the acceptable conditions, safe boundaries, basis thereof, and controls to ensure safe operation during authorized activities, for facilities within the scope of the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR).

  12. Self-Compassion and Automatic Thoughts

    ERIC Educational Resources Information Center

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  13. 8 CFR 1205.1 - Automatic revocation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Automatic revocation. 1205.1 Section 1205.1... REGULATIONS REVOCATION OF APPROVAL OF PETITIONS § 1205.1 Automatic revocation. (a) Reasons for automatic revocation. The approval of a petition or self-petition made under section 204 of the Act and in...

  14. 8 CFR 205.1 - Automatic revocation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Automatic revocation. 205.1 Section 205.1 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION REGULATIONS REVOCATION OF APPROVAL OF PETITIONS § 205.1 Automatic revocation. (a) Reasons for automatic revocation. The approval of a petition...

  15. 12 CFR 1263.4 - Automatic membership.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 9 2012-01-01 2012-01-01 false Automatic membership. 1263.4 Section 1263.4 Banks and Banking FEDERAL HOUSING FINANCE AGENCY FEDERAL HOME LOAN BANKS MEMBERS OF THE BANKS Membership Application Process § 1263.4 Automatic membership. (a) Automatic membership for certain charter...

  16. 12 CFR 1263.4 - Automatic membership.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Automatic membership. 1263.4 Section 1263.4 Banks and Banking FEDERAL HOUSING FINANCE AGENCY FEDERAL HOME LOAN BANKS MEMBERS OF THE BANKS Membership Application Process § 1263.4 Automatic membership. (a) Automatic membership for certain charter...

  17. 12 CFR 1263.4 - Automatic membership.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 10 2014-01-01 2014-01-01 false Automatic membership. 1263.4 Section 1263.4 Banks and Banking FEDERAL HOUSING FINANCE AGENCY FEDERAL HOME LOAN BANKS MEMBERS OF THE BANKS Membership Application Process § 1263.4 Automatic membership. (a) Automatic membership for certain charter...

  18. 12 CFR 1263.4 - Automatic membership.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 9 2013-01-01 2013-01-01 false Automatic membership. 1263.4 Section 1263.4 Banks and Banking FEDERAL HOUSING FINANCE AGENCY FEDERAL HOME LOAN BANKS MEMBERS OF THE BANKS Membership Application Process § 1263.4 Automatic membership. (a) Automatic membership for certain charter...

  19. Adding Automatic Evaluation to Interactive Virtual Labs

    ERIC Educational Resources Information Center

    Farias, Gonzalo; Muñoz de la Peña, David; Gómez-Estern, Fabio; De la Torre, Luis; Sánchez, Carlos; Dormido, Sebastián

    2016-01-01

    Automatic evaluation is a challenging field that has been addressed by the academic community in order to reduce the assessment workload. In this work we present a new element for the authoring tool Easy Java Simulations (EJS). This element, which is named automatic evaluation element (AEE), provides automatic evaluation to virtual and remote…

  20. Delineating subtypes of self-injurious behavior maintained by automatic reinforcement.

    PubMed

    Hagopian, Louis P; Rooker, Griffin W; Zarcone, Jennifer R

    2015-09-01

    Self-injurious behavior (SIB) is maintained by automatic reinforcement in roughly 25% of cases. Automatically reinforced SIB typically has been considered a single functional category, and is less understood than socially reinforced SIB. Subtyping automatically reinforced SIB into functional categories has the potential to guide the development of more targeted interventions and increase our understanding of its biological underpinnings. The current study involved an analysis of 39 individuals with automatically reinforced SIB and a comparison group of 13 individuals with socially reinforced SIB. Automatically reinforced SIB was categorized into 3 subtypes based on patterns of responding in the functional analysis and the presence of self-restraint. These response features were selected as the basis for subtyping on the premise that they could reflect functional properties of SIB unique to each subtype. Analysis of treatment data revealed important differences across subtypes and provides preliminary support to warrant additional research on this proposed subtyping model.

  1. DELINEATING SUBTYPES OF SELF-INJURIOUS BEHAVIOR MAINTAINED BY AUTOMATIC REINFORCEMENT

    PubMed Central

    Hagopian, Louis P.; Rooker, Griffin W.; Zarcone, Jennifer R.

    2016-01-01

    Self-injurious behavior (SIB) is maintained by automatic reinforcement in roughly 25% of cases. Automatically reinforced SIB typically has been considered a single functional category, and is less understood than socially reinforced SIB. Subtyping automatically reinforced SIB into functional categories has the potential to guide the development of more targeted interventions and increase our understanding of its biological underpinnings. The current study involved an analysis of 39 individuals with automatically reinforced SIB and a comparison group of 13 individuals with socially reinforced SIB. Automatically reinforced SIB was categorized into 3 subtypes based on patterns of responding in the functional analysis and the presence of self-restraint. These response features were selected as the basis for subtyping on the premise that they could reflect functional properties of SIB unique to each subtype. Analysis of treatment data revealed important differences across subtypes and provides preliminary support to warrant additional research on this proposed subtyping model. PMID:26223959

  2. Harmonizing Automatic Test System Assets, Drivers, and Control Methodologies

    DTIC Science & Technology

    1999-07-18

    technology for automatically extracting inferred knowledge. Technology is Latent Semantic Analysis (LSA) and appears to have merit. Assessed approach to...terms by different manufacturers. One technique we discovered is the use of Latent Semantic Analysis (LSA) as a tool to aid in classifying an...the IVI specifications. 3.5 Latent Semantic Analysis The technique of LSA may be quite suitable to addressing the issues of class membership

  3. The Concept of Automatic Reinforcement: Implications for Behavioral Research in Developmental Disabilities.

    ERIC Educational Resources Information Center

    Vollmer, Timothy R.

    1994-01-01

    This article discusses problems inherent in the analysis of automatically reinforced behaviors, which are behaviors that are maintained by operant mechanisms independent of the social environment. Four classes of treatment that are compatible with automatic reinforcement are reviewed, including manipulations of establishing operations, sensory…

  4. Automatic Text Structuring and Categorization As a First Step in Summarizing Legal Cases.

    ERIC Educational Resources Information Center

    Moens, Marie-Francine; Uyttendaele, Caroline

    1997-01-01

    Describes SALOMON (Summary and Analysis of Legal texts for Managing Online Needs), a system which automatically summarizes Belgian criminal cases to improve access to court decisions. Highlights include a text grammar represented as a semantic network; automatic abstracting; knowledge acquisition and representation; parsing; evaluation, including…

  5. Automatic methods for the adjustment of faceted solar-energy concentrators and heliostats

    NASA Astrophysics Data System (ADS)

    Zakhidov, R. A.; Khakimov, R. A.; Abdurakhmanov, A. A.; Sizov, Iu. M.; Baranov, V. K.

    An automatic adjustment technique is described which makes possible a considerable simplification of the operation of solar-energy installations. Diagrams of devices for the automatic adjustment of the facets of a composite concentrator and of a plane heliostat are presented. An error analysis shows that the accuracy of the adjustment method is 4-5 arcmin.

  6. Characterizing chaotic melodies in automatic music composition

    NASA Astrophysics Data System (ADS)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
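
    One of the dynamical descriptors named above, the Lyapunov exponent, is easy to estimate for a textbook chaotic system; the logistic map below is a standard illustration and an assumption of this sketch, not the authors' composition system:

```python
import math

# Sketch: estimating the largest Lyapunov exponent of the logistic map
# x_{k+1} = r x_k (1 - x_k) by averaging log|f'(x)| along an orbit.
# The map and parameters are a standard textbook choice, illustrating
# the kind of dynamical feature the abstract correlates with melody.

def logistic_lyapunov(r, x0=0.4, n=100_000, burn_in=1_000):
    x = x0
    for _ in range(burn_in):       # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        # Guard against log(0) if the orbit lands exactly on x = 0.5.
        total += math.log(max(abs(r * (1.0 - 2.0 * x)), 1e-300))
    return total / n

chaotic = logistic_lyapunov(4.0)   # positive: sensitive dependence
periodic = logistic_lyapunov(3.2)  # negative: stable 2-cycle
```

    For r = 4 the exponent converges to ln 2, a positive value signalling chaos, while in the periodic regime it is negative.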

  7. Anisotropic path searching for automatic neuron reconstruction.

    PubMed

    Xie, Jun; Zhao, Ting; Lee, Tzumin; Myers, Eugene; Peng, Hanchuan

    2011-10-01

    Full reconstruction of neuron morphology is of fundamental interest for the analysis and understanding of neuronal functioning. We have developed a novel method capable of automatically tracing neurons in three-dimensional microscopy data. In contrast to template-based methods, the proposed approach makes no assumptions about the shape or appearance of neurite structure. Instead, an efficient seeding approach is applied to capture complex neuronal structures, and the tracing problem is solved by computing the optimal reconstruction with a weighted graph. The optimality is determined by the cost function designed for the path between each pair of seeds and by topological constraints defining the component interrelations and completeness. In addition, an automated neuron comparison method is introduced for performance evaluation and structure analysis. The proposed algorithm is computationally efficient and has been validated using different types of microscopy data sets, including Drosophila projection neurons and fly neurons with presynaptic sites. In all cases, the approach yielded promising results.
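
    Optimal seed-to-seed paths on a weighted graph are classically found with Dijkstra's algorithm; the sketch below uses a tiny invented graph, since the paper's actual cost function is not given in the abstract:

```python
import heapq

# Sketch: optimal path computation on a weighted graph via Dijkstra's
# algorithm, the standard machinery behind seed-to-seed reconstruction
# costs of the kind described above. The graph and node names below are
# invented for illustration.

def shortest_path(graph, src, dst):
    """graph: {node: [(neighbor, weight), ...]}, non-negative weights."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:          # walk predecessors back to the source
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

graph = {
    "seed_a": [("branch", 1.0), ("noise", 5.0)],
    "branch": [("seed_b", 2.0)],
    "noise": [("seed_b", 0.5)],
}
path, cost = shortest_path(graph, "seed_a", "seed_b")
```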

  8. Investigation of morphometric variability of subthalamic nucleus, red nucleus, and substantia nigra in advanced Parkinson's disease patients using automatic segmentation and PCA-based analysis.

    PubMed

    Xiao, Yiming; Jannin, Pierre; D'Albis, Tiziano; Guizard, Nicolas; Haegelen, Claire; Lalys, Florent; Vérin, Marc; Collins, D Louis

    2014-09-01

    Subthalamic nucleus (STN) deep brain stimulation (DBS) is an effective surgical therapy to treat Parkinson's disease (PD). Conventional methods employ standard atlas coordinates to target the STN, which, along with the adjacent red nucleus (RN) and substantia nigra (SN), are not well visualized on conventional T1w MRIs. However, the positions and sizes of the nuclei may be more variable than the standard atlas, thus making the pre-surgical plans inaccurate. We investigated the morphometric variability of the STN, RN and SN by using label-fusion segmentation results from 3T high resolution T2w MRIs of 33 advanced PD patients. In addition to comparing the size and position measurements of the cohort to the Talairach atlas, principal component analysis (PCA) was performed to acquire more intuitive and detailed perspectives of the measured variability. Lastly, the potential correlation between the variability shown by PCA results and the clinical scores was explored.
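
    The PCA step applied to the morphometric measurements can be illustrated in two dimensions, where the covariance eigendecomposition has a closed form; the point cloud below is synthetic and merely stands in for per-subject shape measurements:

```python
import math

# Sketch: principal component analysis on 2-D points via the closed-form
# eigendecomposition of the 2x2 sample covariance matrix, mirroring the
# PCA step the abstract applies to nucleus morphometry. The point cloud
# is synthetic.

def pca_2d(points):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]].
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(0.0, tr * tr / 4.0 - det))
    l1, l2 = tr / 2.0 + disc, tr / 2.0 - disc
    # First principal axis (unnormalized eigenvector for l1).
    if abs(sxy) > 1e-12:
        axis = (l1 - syy, sxy)
    else:
        axis = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    return (l1, l2), axis

# Points stretched along y = x with small alternating perpendicular
# jitter: the first component should carry almost all the variance.
pts = [(t, t + 0.1 * ((-1) ** i)) for i, t in enumerate(
    [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])]
(l1, l2), axis = pca_2d(pts)
```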

  9. On Automatic Support to Indexing a Life Sciences Data Base.

    ERIC Educational Resources Information Center

    Vleduts-Stokolov, N.

    1982-01-01

    Describes technique developed as automatic support to subject heading indexing at BIOSIS based on use of formalized language for semantic representation of biological texts and subject headings. Language structures, experimental results, and analysis of journal/subject heading and author/subject heading correlation data are discussed. References…

  10. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    ERIC Educational Resources Information Center

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  11. A Network of Automatic Control Web-Based Laboratories

    ERIC Educational Resources Information Center

    Vargas, Hector; Sanchez Moreno, J.; Jara, Carlos A.; Candelas, F. A.; Torres, Fernando; Dormido, Sebastian

    2011-01-01

    This article presents an innovative project in the context of remote experimentation applied to control engineering education. Specifically, the authors describe their experience regarding the analysis, design, development, and exploitation of web-based technologies within the scope of automatic control. This work is part of an inter-university…

  12. IBM MASTOR SYSTEM: Multilingual Automatic Speech-to-speech Translator

    DTIC Science & Technology

    2006-01-01

    IBM MASTOR SYSTEM: Multilingual Automatic Speech-to-speech Translator * Yuqing Gao, Liang Gu, Bowen Zhou, Ruhi Sarikaya, Mohamed Afify, Hong-Kwang...for Improved Discriminative Training,” In Proc. ICASSP, Orlando, 2002. [14] M. Afify et al., “On the Use of Morphological Analysis for Dialectal

  13. Use of Automatic Text Analyzer in Preparation of SDI Profiles

    ERIC Educational Resources Information Center

    Carroll, John M.; Tague, Jean M.

    1973-01-01

    This research shows that by submitting samples of the client's recent professional reading material to automatic text analysis, Selective Dissemination of Information (SDI) profiles can be prepared that result in significantly higher initial recall scores than do those prepared by conventional techniques; relevance scores are not significantly…
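The profile-preparation step described here can be sketched with simple term-frequency analysis: extract the most frequent content words from the client's reading sample, then score incoming documents by their overlap with that profile. This is an illustrative reconstruction, not the paper's actual text analyzer; the stopword list, the top-10 cutoff, and the overlap score are all assumptions:

```python
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on", "that"}

def build_profile(reading_sample: str, top_n: int = 10) -> set[str]:
    """Derive an SDI interest profile: the most frequent content words
    in a sample of the client's recent professional reading."""
    words = re.findall(r"[a-z]+", reading_sample.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return {w for w, _ in counts.most_common(top_n)}

def relevance(profile: set[str], document: str) -> float:
    """Score an incoming document by its term overlap with the profile."""
    doc_terms = set(re.findall(r"[a-z]+", document.lower()))
    return len(profile & doc_terms) / len(profile) if profile else 0.0

sample = ("Microwave network analysis uses a network analyzer to measure "
          "scattering parameters of a microwave network under test.")
profile = build_profile(sample)
print(relevance(profile, "A new microwave network analyzer design"))  # → 0.3
```

A real SDI system would refine the profile over time from the client's relevance feedback; the fixed snapshot above only illustrates the initial profile construction.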

  14. Automatic design of magazine covers

    NASA Astrophysics Data System (ADS)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  15. Automatically scramming nuclear reactor system

    DOEpatents

    Ougouag, Abderrafi M.; Schultz, Richard R.; Terry, William K.

    2004-10-12

    An automatically scramming nuclear reactor system. One embodiment comprises a core having a coolant inlet end and a coolant outlet end. A cooling system operatively associated with the core provides coolant to the coolant inlet end and removes heated coolant from the coolant outlet end, thus maintaining a pressure differential therebetween during a normal operating condition of the nuclear reactor system. A guide tube is positioned within the core with a first end of the guide tube in fluid communication with the coolant inlet end of the core, and a second end of the guide tube in fluid communication with the coolant outlet end of the core. A control element is positioned within the guide tube and is movable therein between upper and lower positions, and automatically falls under the action of gravity to the lower position when the pressure differential drops below a safe pressure differential.

  16. Automatic identification of artifacts in electrodermal activity data.

    PubMed

    Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind

    2015-01-01

    Recently, wearable devices have allowed for long term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy, and recording artifacts can easily be mistaken for a physiological response during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts, and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.
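The paper trains a machine learning classifier; as a minimal illustration of the underlying idea (motion artifacts produce implausibly steep jumps in skin conductance, while real responses rise over seconds), a slope-threshold heuristic can stand in. The sampling rate, window length, and threshold below are assumptions, not values from the paper:

```python
def detect_artifacts(eda: list[float], fs: float = 8.0,
                     window_s: float = 0.5, max_slope: float = 5.0) -> list[bool]:
    """Flag windows of an EDA signal (in microsiemens) whose steepest
    sample-to-sample change exceeds a plausible slope (uS per second)."""
    win = max(1, int(window_s * fs))
    flags = []
    for start in range(0, len(eda) - 1, win):
        seg = eda[start:start + win + 1]
        steepest = max(abs(b - a) * fs for a, b in zip(seg, seg[1:]))
        flags.append(steepest > max_slope)
    return flags

clean = [0.50 + 0.01 * i for i in range(16)]      # slow drift: ~0.08 uS/s
spiky = clean[:8] + [clean[7] + 3.0] + clean[9:]  # 3 uS jump in a single sample
print(detect_artifacts(clean), detect_artifacts(spiky))
```

A learned classifier, as in the paper, replaces the hand-picked threshold with a decision boundary fit to expert-labeled windows over many such features.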

  17. Automatic translation among spoken languages

    NASA Astrophysics Data System (ADS)

    Walter, Sharon M.; Costigan, Kelly

    1994-02-01

    The Machine Aided Voice Translation (MAVT) system was developed in response to the shortage of experienced military field interrogators with both foreign language proficiency and interrogation skills. Combining speech recognition, machine translation, and speech generation technologies, the MAVT accepts an interrogator's spoken English question and translates it into spoken Spanish. The spoken Spanish response of the potential informant can then be translated into spoken English. Potential military and civilian applications for automatic spoken language translation technology are discussed in this paper.

  18. How CBO Estimates Automatic Stabilizers

    DTIC Science & Technology

    2015-11-01

Pamela Green, Kurt Seibert, Joshua Shakin, and Robert Stewart for technical assistance. The authors also thank Leah Loversky for outstanding research...Abstract: Federal receipts and outlays regularly respond to cyclical movements in the economy. When the economy is operating...Those “automatic stabilizers” thus tend to dampen the size of cyclical movements in the economy by supporting or restraining private spending. (The

  19. Automatically-Programed Machine Tools

    NASA Technical Reports Server (NTRS)

    Purves, L.; Clerman, N.

    1985-01-01

Software produces cutter location files for numerically-controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programing NC machine tools. The APT system includes a specification of the APT programing language and a language processor, which executes APT statements and generates the NC machine-tool motions specified by those statements.

  20. Automatic computation of transfer functions

    DOEpatents

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
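As a minimal illustration of computing a transfer function from circuit element values (not the patented netlist-matrix method), the single nodal equation of a first-order RC low-pass filter gives H(jω) = 1/(1 + jωRC):

```python
def rc_lowpass_tf(r: float, c: float, omega: float) -> complex:
    """Transfer function H(jw) = Vout/Vin of a series-R, shunt-C low-pass
    filter, from its nodal equation:
        (Vin - Vout) / R = Vout * jw*C   =>   H = 1 / (1 + jw*R*C)
    """
    return 1.0 / (1.0 + 1j * omega * r * c)

r, c = 1_000.0, 1e-6        # 1 kOhm, 1 uF -> cutoff at 1/(R*C) = 1000 rad/s
h = rc_lowpass_tf(r, c, omega=1.0 / (r * c))
print(abs(h))               # ≈ 0.707, i.e. -3 dB at the cutoff frequency
```

The patent generalizes this idea: a matrix netlist encodes the elements, values, and topology, and the nodal equations are assembled and solved symbolically rather than written out by hand for one known circuit.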