Science.gov

Sample records for automatic vol analysis

  1. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
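The idea behind interval-based error analysis can be sketched in a few lines (this is an illustrative toy, not the paper's INTLAB code): a measured value x ± dx is represented as the interval [x − dx, x + dx], and arithmetic on intervals yields guaranteed bounds on the result without differentiating the formula.

```python
# Minimal interval-arithmetic sketch. Each measurement becomes an interval;
# operations propagate the bounds automatically.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    @classmethod
    def from_measurement(cls, x, dx):
        return cls(x - dx, x + dx)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product bounds: min/max over the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def half_width(self):
        # Interval analogue of the "error bar" on the result.
        return (self.hi - self.lo) / 2

# Example: P = V * I with V = 5.0 ± 0.1 V and I = 2.0 ± 0.05 A.
V = Interval.from_measurement(5.0, 0.1)
I = Interval.from_measurement(2.0, 0.05)
P = V * I
# For a single product the rigorous half-width equals the first-order
# propagation estimate I*dV + V*dI = 0.45; for complicated formulas the
# interval result is obtained with no manual derivative bookkeeping at all.
print(P.lo, P.hi, P.half_width())
```

The benefit claimed in the abstract shows up for complicated formulas: no partial derivatives need to be derived by hand, the bounds simply propagate through the expression.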

  2. FAMA: Fast Automatic MOOG Analysis

    NASA Astrophysics Data System (ADS)

    Magrini, Laura; Randich, Sofia; Friel, Eileen; Spina, Lorenzo; Jacobson, Heather; Cantat-Gaudin, Tristan; Donati, Paolo; Baglioni, Roberto; Maiorca, Enrico; Bragaglia, Angela; Sordo, Rosanna; Vallenari, Antonella

    2014-02-01

FAMA (Fast Automatic MOOG Analysis), written in Perl, computes the atmospheric parameters and abundances of a large number of stars from measurements of equivalent widths (EWs), automatically and independently of any subjective approach. Based on the widely used MOOG code, it simultaneously searches for three equilibria: excitation equilibrium, ionization balance, and the relationship between log n(FeI) and the reduced EWs. FAMA also evaluates the statistical errors on individual element abundances and the errors due to uncertainties in the stellar parameters. Convergence criteria are not fixed a priori but are instead based on the quality of the spectra.
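One of the three equilibria can be illustrated with a short sketch (a hedged toy, not the actual FAMA Perl code): excitation equilibrium holds when the least-squares slope of line-by-line Fe I abundance versus excitation potential is consistent with zero; the effective temperature would be adjusted until this slope vanishes.

```python
# Excitation-equilibrium check: fit a line to abundance vs. excitation
# potential; a nonzero slope signals an incorrect effective temperature.

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical line list: excitation potentials (eV) and derived log n(Fe I).
chi = [0.9, 1.5, 2.2, 3.0, 3.9, 4.5]
abund_ok = [7.50, 7.51, 7.49, 7.50, 7.51, 7.50]   # flat: temperature consistent
abund_bad = [7.40, 7.45, 7.52, 7.58, 7.66, 7.71]  # trend: temperature off

print(slope(chi, abund_ok))   # close to zero -> excitation equilibrium
print(slope(chi, abund_bad))  # clearly positive -> adjust temperature
```

The ionization balance and reduced-EW relation would be checked analogously, with all three conditions driven toward zero simultaneously.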

  3. Automatic fringe analysis

    NASA Technical Reports Server (NTRS)

    Chiu, Arnold; Ladewski, Ted; Turney, Jerry

    1991-01-01

To satisfy the requirement for fast, accurate interferometric analytical tools, the Fringe Analysis Workstation (FAW) has been developed to analyze complex fringe image data easily and rapidly. FAW is employed for flow studies in hydrodynamics and aerodynamics experiments, and for target shell characterization in inertial confinement fusion research. The three major components of the FAW system are described: fringe analysis/image processing, input/output, and the visualization/graphical user interface.

  4. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

Two problems must be solved if the finite element method is to become a reliable and affordable black-box engineering tool: finite element meshes must be generated automatically from computer-aided design databases, and the analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  5. Automatic emotional expression analysis from eye area

    NASA Astrophysics Data System (ADS)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed with artificial intelligence techniques. As a result of the experimental studies, the six universal emotions, happiness, sadness, surprise, disgust, anger and fear, were classified at a success rate of 84% using artificial neural networks.
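The wavelet-feature step can be sketched with a one-level Haar transform (an assumed illustration; the abstract does not specify the wavelet family or decomposition depth): sub-band energies of a signal sampled from the eye region could serve as inputs to the neural network classifier.

```python
# One-level Haar discrete wavelet transform of a 1-D signal, plus sub-band
# energies as a tiny feature vector.

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def energy(coeffs):
    return sum(c * c for c in coeffs)

# Hypothetical row of gray levels sampled across an eye region.
row = [10, 12, 11, 40, 42, 41, 12, 10]
approx, detail = haar_dwt(row)
features = [energy(approx), energy(detail)]
print(features)
```

Because the Haar transform is orthonormal, the two sub-band energies sum to the signal energy; the split between them captures how much of the signal is smooth structure versus local variation.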

  6. Accuracy analysis of automatic distortion correction

    NASA Astrophysics Data System (ADS)

    Kolecki, Jakub; Rzonca, Antoni

    2015-06-01

The paper addresses the problem of automatic distortion removal from images acquired with a non-metric SLR camera equipped with prime lenses. From the photogrammetric point of view, the following question arises: is the accuracy of the distortion control data provided by the manufacturer for a certain lens model (not an individual item) sufficient to achieve the demanded accuracy? To obtain a reliable answer, two kinds of tests were carried out for three lens models. First, multi-variant camera calibration was conducted using software that provides a full accuracy analysis. Second, an accuracy analysis using check points was performed. The check points were measured in images resampled based on the estimated distortion model, or in distortion-free images acquired directly in the automatic distortion removal mode. Extensive conclusions regarding the practical application of each calibration approach are given. Finally, rules for applying automatic distortion removal in photogrammetric measurements are suggested.

  7. Automatic Syntactic Analysis of Free Text.

    ERIC Educational Resources Information Center

    Schwarz, Christoph

    1990-01-01

Discusses problems encountered with the syntactic analysis of free text documents in indexing. Postcoordination and precoordination of terms are discussed, an automatic indexing system called COPSY (context operator syntax) that uses natural language processing techniques is described, and future developments are explained. (60 references) (LRW)

  8. Automatic photointerpretation via texture and morphology analysis

    NASA Technical Reports Server (NTRS)

    Tou, J. T.

    1982-01-01

Computer-based techniques for automatic photointerpretation based upon information derived from texture and morphology analysis of images are discussed. By automatic photointerpretation is meant the determination, by computer, of semantic descriptions of the content of images. To perform semantic analysis of morphology, a hierarchical structure of knowledge representation was developed. The simplest elements in a morphology are strokes, which are used to form alphabets. The alphabets are the elements for generating words, which are used to describe the function or property of an object or a region. The words are the elements for constructing sentences, which are used for semantic description of the content of the image. Photointerpretation based upon morphology is then augmented by textural information. Textural analysis is performed using a pixel-vector approach.

  9. Automatic Prosodic Analysis to Identify Mild Dementia

    PubMed Central

    Gonzalez-Moreira, Eduardo; Torres-Boza, Diana; Kairuz, Héctor Arturo; Ferrer, Carlos; Garcia-Zamora, Marlene; Espinoza-Cuadros, Fernando; Hernandez-Gómez, Luis Alfonso

    2015-01-01

This paper describes an exploratory technique to identify mild dementia by assessing the degree of speech deficits. A total of twenty participants took part in this experiment: ten patients with a diagnosis of mild dementia and ten healthy controls. The audio session for each subject was recorded following a methodology developed for the present study. Prosodic features in patients with mild dementia and healthy elderly controls were measured using automatic prosodic analysis on a reading task. A novel method was used to gather twelve prosodic features over the speech samples. The best classification rate achieved was 85% accuracy using four prosodic features. The results show that the proposed computational speech analysis offers a viable alternative for automatic identification of dementia features in elderly adults. PMID:26558287
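Two of the kind of prosodic timing features used in such studies can be sketched as follows (the paper's twelve features are not listed in the abstract, so these specific features and the threshold are assumptions): the proportion of time spent in pauses and the pause count, computed from a per-frame energy envelope of the reading task.

```python
# Pause-based prosodic features from a frame-level energy envelope.

def pause_features(energies, threshold, frame_s=0.01):
    """Return (pause_fraction, n_pauses) from frame energies."""
    silent = [e < threshold for e in energies]
    pause_fraction = sum(silent) / len(silent)
    # Each run of consecutive silent frames counts as one pause.
    n_pauses = sum(1 for i, s in enumerate(silent)
                   if s and (i == 0 or not silent[i - 1]))
    return pause_fraction, n_pauses

# Hypothetical envelope: speech / pause / speech / pause.
env = [0.9, 0.8, 0.02, 0.01, 0.01, 0.7, 0.9, 0.03, 0.02, 0.8]
print(pause_features(env, threshold=0.1))  # (0.5, 2)
```

Feature vectors of this kind would then feed a standard classifier to separate the patient and control groups.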

  10. Automatic Prosodic Analysis to Identify Mild Dementia.

    PubMed

    Gonzalez-Moreira, Eduardo; Torres-Boza, Diana; Kairuz, Héctor Arturo; Ferrer, Carlos; Garcia-Zamora, Marlene; Espinoza-Cuadros, Fernando; Hernandez-Gómez, Luis Alfonso

    2015-01-01

This paper describes an exploratory technique to identify mild dementia by assessing the degree of speech deficits. A total of twenty participants took part in this experiment: ten patients with a diagnosis of mild dementia and ten healthy controls. The audio session for each subject was recorded following a methodology developed for the present study. Prosodic features in patients with mild dementia and healthy elderly controls were measured using automatic prosodic analysis on a reading task. A novel method was used to gather twelve prosodic features over the speech samples. The best classification rate achieved was 85% accuracy using four prosodic features. The results show that the proposed computational speech analysis offers a viable alternative for automatic identification of dementia features in elderly adults. PMID:26558287

  11. Automatic analysis and classification of surface electromyography.

    PubMed

    Abou-Chadi, F E; Nashar, A; Saad, M

    2001-01-01

In this paper, parametric modeling algorithms for surface electromyography (SEMG) that facilitate automatic feature extraction are combined with artificial neural networks (ANN) to provide an integrated system for the automatic analysis and diagnosis of myopathic disorders. Three ANN paradigms were investigated: the multilayer backpropagation algorithm, the self-organizing feature map algorithm, and a probabilistic neural network model. The performance of the three classifiers was compared with that of the classical Fisher linear discriminant (FLD) classifier. The results show that the three ANN models give higher performance, with correct classification reaching 90%; poorer diagnostic performance was obtained from the FLD classifier. The system presented here indicates that surface EMG, when properly processed, can provide the physician with a diagnostic assist device. PMID:11556501
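A common parametric model for SEMG is autoregressive (AR) modeling, whose coefficients serve as compact features for a classifier. The abstract does not name the exact model, so the following is a hedged sketch of that idea: AR coefficients fitted with the Levinson-Durbin recursion from the signal's autocorrelation.

```python
import random

def autocorr(x, maxlag):
    """Biased sample autocorrelation up to lag maxlag."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) / n
            for k in range(maxlag + 1)]

def levinson_durbin(r, order):
    """Fit AR(order) coefficients a[1..order] from autocorrelation r."""
    a = [0.0] * (order + 1)
    err = r[0]
    for k in range(1, order + 1):
        acc = r[k] - sum(a[j] * r[k - j] for j in range(1, k))
        refl = acc / err
        new_a = a[:]
        new_a[k] = refl
        for j in range(1, k):
            new_a[j] = a[j] - refl * a[k - j]
        a = new_a
        err *= (1 - refl * refl)
    return a[1:], err

# Synthetic stand-in for an SEMG frame: AR(2) process
# x[n] = 0.6 x[n-1] - 0.2 x[n-2] + noise.
random.seed(0)
x = [0.0, 0.0]
for _ in range(5000):
    x.append(0.6 * x[-1] - 0.2 * x[-2] + random.gauss(0, 1))
coeffs, err = levinson_durbin(autocorr(x, 2), 2)
print(coeffs)  # close to the true [0.6, -0.2]
```

The recovered coefficients (one short vector per analysis window) are exactly the kind of low-dimensional input an ANN classifier would be trained on.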

  12. Automatic processing, analysis, and recognition of images

    NASA Astrophysics Data System (ADS)

    Abrukov, Victor S.; Smirnov, Evgeniy V.; Ivanov, Dmitriy G.

    2004-11-01

New approaches and computer codes (A&CC) for the automatic processing, analysis and recognition of images are offered. The A&CC are based on representing an object image as a collection of pixels of various colours and on consecutive automatic painting of the distinguished parts of the image. The A&CC address such directions as: 1) image processing, 2) image feature extraction, and 3) image analysis, in any sequence and combination. The A&CC allow various geometrical and statistical parameters of an object image and its parts to be obtained. Additional possibilities arise from the use of artificial neural network technologies. We believe that the A&CC can be used in building systems for testing and control in various fields of industry and in military applications (airborne imaging systems, tracking of moving objects), in medical diagnostics, in new software for CCDs, and in industrial vision and decision-making systems. The capabilities of the A&CC have been tested on image analysis of model fires, plumes of sprayed fluid and ensembles of particles, on decoding of interferometric images, on digitization of paper diagrams of electrical signals, on text recognition, on noise elimination and image filtering, on analysis of astronomical images and aerial photography, and on object detection.

  13. Automatic analysis of computation in biochemical reactions.

    PubMed

    Egri-Nagy, Attila; Nehaniv, Chrystopher L; Rhodes, John L; Schilstra, Maria J

    2008-01-01

    We propose a modeling and analysis method for biochemical reactions based on finite state automata. This is a completely different approach compared to traditional modeling of reactions by differential equations. Our method aims to explore the algebraic structure behind chemical reactions using automatically generated coordinate systems. In this paper we briefly summarize the underlying mathematical theory (the algebraic hierarchical decomposition theory of finite state automata) and describe how such automata can be derived from the description of chemical reaction networks. We also outline techniques for the flexible manipulation of existing models. As a real-world example we use the Krebs citric acid cycle. PMID:18606208

  14. Research on automatic human chromosome image analysis

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Tian, Jinwen; Liu, Jian

    2007-11-01

Human chromosome karyotyping is one of the essential tasks in cytogenetics, especially in genetic syndrome diagnoses. In this work, an automatic procedure is introduced for human chromosome image analysis. According to the different states of touching and overlapping chromosomes, several segmentation methods are proposed to achieve the best results. The medial axis is extracted by the middle-point algorithm. The chromosome band is enhanced by an algorithm based on multiscale B-spline wavelets, extracted by the average gray profile, gradient profile and shape profile, and characterized by WDD (Weighted Density Distribution) descriptors. A multilayer classifier is used for classification. Experimental results demonstrate that the algorithms perform well.

  15. Semi-automatic analysis of fire debris

    PubMed

    Touron; Malaquin; Gardebas; Nicolai

    2000-05-01

Automated analysis of fire residues involves a strategy that deals with the wide variety of criminalistic samples received. Because the concentration of accelerant in a sample is unknown and the range of flammable products is wide, full attention from the analyst is required. Primary detection with a photoionisation detector resolves the first problem by determining the right method to use: either the less sensitive classical head-space determination, or absorption on an active charcoal tube, a method better suited to low concentrations. The latter method is suitable for automatic thermal desorption (ATD400), avoiding any risk of cross-contamination. A PONA column (50 m × 0.2 mm i.d.) allows the separation of volatile hydrocarbons from C(1) to C(15) and the updating of a database. A specific second column is used for heavy hydrocarbons. Heavy products (C(13) to C(40)) were extracted from residues using a very small amount of pentane, concentrated to 1 ml at 50 degrees C and then placed on an automatic carousel. Comparison of flammables with reference chromatograms provided the expected identification, possibly supplemented by mass spectrometry. This analytical strategy belongs to the IRCGN quality program, resulting in the analysis of 1500 samples per year by two technicians. PMID:10802196

  16. Automatic cortical thickness analysis on rodent brain

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek

    2011-03-01

Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the difference in scale; even adult rodent brains are 50 to 100 times smaller than human brains. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step separates the neocortex from the other brain structures and is thus a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm, which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate spline based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood and performed a t-test between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
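The Laplace-based thickness step can be illustrated on a toy 2-D slab (an illustrative sketch under simplifying assumptions, not the authors' implementation): solve Laplace's equation between the inner and outer boundaries, then integrate a streamline along the gradient; its length is the local thickness. On a flat slab the answer is just the slab height, which makes the result easy to check.

```python
def solve_laplace(rows, cols, iters=2000):
    """Jacobi relaxation. Dirichlet boundaries: inner surface (row 0) = 0,
    outer surface (last row) = 1; side walls are mirrored (zero-flux)."""
    u = [[0.0] * cols for _ in range(rows)]
    u[rows - 1] = [1.0] * cols
    for _ in range(iters):
        new = [row[:] for row in u]
        for r in range(1, rows - 1):
            for c in range(cols):
                left = u[r][max(c - 1, 0)]
                right = u[r][min(c + 1, cols - 1)]
                new[r][c] = (u[r - 1][c] + u[r + 1][c] + left + right) / 4
        u = new
    return u

def streamline_length(u, col, step=0.05):
    """Integrate from the inner boundary along the (here purely vertical)
    gradient until the outer boundary potential is reached."""
    rows = len(u)
    y, length = 0.0, 0.0
    while True:
        r0 = min(int(y), rows - 2)
        frac = y - r0
        pot = u[r0][col] * (1 - frac) + u[r0 + 1][col] * frac
        if pot >= 0.999 or y >= rows - 1:
            return length
        y += step          # unit gradient direction on this flat slab
        length += step

ROWS, COLS = 20, 5
u = solve_laplace(ROWS, COLS)
print(streamline_length(u, COLS // 2))  # ~ slab height of 19 grid units
```

On a curved cortex the gradient direction varies from point to point, so the streamlines bend like cortical columns; the flat slab merely makes the expected length obvious.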

  17. Automatic variance analysis of multistage care pathways.

    PubMed

    Li, Xiang; Liu, Haifeng; Zhang, Shilei; Mei, Jing; Xie, Guotong; Yu, Yiqin; Li, Jing; Lakshmanan, Geetika T

    2014-01-01

    A care pathway (CP) is a standardized process that consists of multiple care stages, clinical activities and their relations, aimed at ensuring and enhancing the quality of care. However, actual care may deviate from the planned CP, and analysis of these deviations can help clinicians refine the CP and reduce medical errors. In this paper, we propose a CP variance analysis method to automatically identify the deviations between actual patient traces in electronic medical records (EMR) and a multistage CP. As the care stage information is usually unavailable in EMR, we first align every trace with the CP using a hidden Markov model. From the aligned traces, we report three types of deviations for every care stage: additional activities, absent activities and violated constraints, which are identified by using the techniques of temporal logic and binomial tests. The method has been applied to a CP for the management of congestive heart failure and real world EMR, providing meaningful evidence for the further improvement of care quality. PMID:25160280
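The alignment step can be sketched with a tiny left-to-right HMM (the stage names and probabilities below are assumed toys; the paper's model is learned from data): Viterbi decoding assigns each activity in a patient trace to a care stage, after which stage-wise deviations can be checked.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely stage sequence for a trace of observed activities."""
    V = [{s: (start_p.get(s, 0.0) * emit_p[s].get(obs[0], 1e-6), [s])
          for s in states}]
    for o in obs[1:]:
        layer = {}
        for s in states:
            prob, path = max(
                (V[-1][prev][0] * trans_p[prev].get(s, 0.0)
                 * emit_p[s].get(o, 1e-6), V[-1][prev][1])
                for prev in states)
            layer[s] = (prob, path + [s])
        V.append(layer)
    return max(V[-1].values())[1]

# Left-to-right stage model: stages can repeat or advance, never go back.
stages = ["admission", "treatment", "discharge"]
start_p = {"admission": 1.0}
trans_p = {"admission": {"admission": 0.5, "treatment": 0.5},
           "treatment": {"treatment": 0.6, "discharge": 0.4},
           "discharge": {"discharge": 1.0}}
emit_p = {"admission": {"register": 0.7, "assess": 0.3},
          "treatment": {"medicate": 0.6, "assess": 0.2, "lab": 0.2},
          "discharge": {"summary": 0.8, "medicate": 0.2}}

trace = ["register", "assess", "medicate", "lab", "summary"]
alignment = viterbi(trace, stages, start_p, trans_p, emit_p)
print(alignment)
```

Once every activity carries a stage label, additional activities, absent activities and violated constraints can be tallied per stage against the planned pathway.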

  18. Remote weapon station for automatic target recognition system demand analysis

    NASA Astrophysics Data System (ADS)

    Lei, Zhang; Li, Sheng-cai; Shi, Cai

    2015-08-01

This paper introduces the basic composition and main advantages of a remote weapon station and analyses the practical significance of an image-based automatic target recognition system for such a station. It then elaborates the demand for the key technologies underlying image-based automatic target recognition: photoelectric stabilization, multi-sensor image fusion, integrated control of target image enhancement, target behavior risk analysis, intelligent recognition algorithms based on image characteristics, and micro-sensor technology.

  19. Automatic analysis of the corneal ulcer

    NASA Astrophysics Data System (ADS)

    Ventura, Liliane; Chiaradia, Caio; Faria de Sousa, Sidney J.

    1999-06-01

A very common disease in agricultural countries is the corneal ulcer. Particularly in public hospitals, several patients present with this pathology every week. One of the most important features in diagnosing the regression of the disease is determining how the affected area is diminishing. An automatic system (optics and software), attached to a slit lamp, has been developed to determine the area of the ulcer automatically and to follow up its regression. The clinical procedure to isolate the ulcer is still performed, but the measuring time is short enough not to cause the patient the discomfort that the traditional evaluation does. The system has been used for the last 6 months in a hospital that sees about 80 patients with corneal ulcer per week. Patient follow-up (an indispensable criterion for curing the disease) has been improved by the system and has guaranteed the success of the treatment.

  20. Automatic basal slice detection for cardiac analysis

    NASA Astrophysics Data System (ADS)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

Identification of the basal slice in cardiac imaging is a key step in measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods at times identify the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice was detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. Of the 51 samples tested, 92% and 84% of detection results were accurate at the end-systolic and end-diastolic phases of the cardiac cycle, respectively.

  1. Automatism

    PubMed Central

    McCaldon, R. J.

    1964-01-01

    Individuals can carry out complex activity while in a state of impaired consciousness, a condition termed “automatism”. Consciousness must be considered from both an organic and a psychological aspect, because impairment of consciousness may occur in both ways. Automatism may be classified as normal (hypnosis), organic (temporal lobe epilepsy), psychogenic (dissociative fugue) or feigned. Often painstaking clinical investigation is necessary to clarify the diagnosis. There is legal precedent for assuming that all crimes must embody both consciousness and will. Jurists are loath to apply this principle without reservation, as this would necessitate acquittal and release of potentially dangerous individuals. However, with the sole exception of the defence of insanity, there is at present no legislation to prohibit release without further investigation of anyone acquitted of a crime on the grounds of “automatism”. PMID:14199824

  2. Automatic ionospheric layers detection: Algorithms analysis

    NASA Astrophysics Data System (ADS)

    Molina, María G.; Zuccheretti, Enrico; Cabrera, Miguel A.; Bianchi, Cesidio; Sciacca, Umberto; Baskaradas, James

    2016-03-01

Vertical sounding is a widely used technique to obtain ionospheric measurements, such as an estimation of virtual height versus frequency. It is performed by a high-frequency radar for geophysical applications called an "ionospheric sounder" (or "ionosonde"). Radar detection depends mainly on target characteristics. While the behavior of several kinds of targets and the corresponding echo-detection algorithms have been studied, a survey to identify a suitable algorithm for the ionospheric sounder still has to be carried out. This paper focuses on automatic echo-detection algorithms implemented specifically for an ionospheric sounder; the specific characteristics of the target were studied as well. Adaptive threshold detection algorithms are proposed, compared with the currently implemented algorithm, and tested using actual data obtained from the Advanced Ionospheric Sounder (AIS-INGV) at the Rome Ionospheric Observatory. Different cases of study were selected according to typical ionospheric and detection conditions.
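A standard family of adaptive threshold detectors is cell-averaging CFAR; the following is an illustrative sketch of that idea with assumed parameters, not the algorithms evaluated in the paper. Each range cell is compared against a threshold scaled from the noise level estimated in neighbouring "training" cells, skipping adjacent guard cells so the echo does not contaminate its own noise estimate.

```python
# Cell-averaging adaptive threshold (CFAR-style) detector for a 1-D
# power profile.

def ca_cfar(power, n_train=4, n_guard=1, scale=3.0):
    """Return indices of cells whose power exceeds the adaptive threshold."""
    detections = []
    for i in range(len(power)):
        train = []
        for j in range(i - n_guard - n_train, i + n_guard + n_train + 1):
            if 0 <= j < len(power) and abs(j - i) > n_guard:
                train.append(power[j])
        noise = sum(train) / len(train)  # local noise-level estimate
        if power[i] > scale * noise:
            detections.append(i)
    return detections

# Synthetic range profile: noise floor ~1 with an echo at cell 7.
profile = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 1.1, 9.0, 1.0, 1.2, 0.9, 1.0]
print(ca_cfar(profile))  # [7]
```

Because the threshold adapts to the local noise estimate, the detector tolerates a noise floor that varies with virtual height, which a single fixed threshold would not.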

  3. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of an aggregate is a linear array of cells commonly termed a rouleau. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of the aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation has been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be attractive as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  4. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    PubMed

    Denecke, Kerstin

    2016-01-01

Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unconsidered, since retrieval and analysis are difficult and time-consuming and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for the automatic analysis of incident reports, but there are still challenges to be solved. PMID:27139389
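The faceted-retrieval idea can be sketched in a few lines (the facet names and sample reports below are hypothetical, not taken from an actual incident-reporting system): each report carries facet values, a query is a conjunction of facet filters, and per-facet counts drive the navigation UI.

```python
# Minimal faceted search over structured incident-report records.

reports = [
    {"id": 1, "event_type": "medication", "severity": "harm",
     "unit": "ICU", "text": "wrong dose administered"},
    {"id": 2, "event_type": "fall", "severity": "no-harm",
     "unit": "ward", "text": "patient slipped near bed"},
    {"id": 3, "event_type": "medication", "severity": "no-harm",
     "unit": "ward", "text": "dose delayed by two hours"},
]

def faceted_search(reports, **filters):
    """Return reports matching every facet=value filter."""
    return [r for r in reports
            if all(r.get(facet) == value for facet, value in filters.items())]

def facet_counts(reports, facet):
    """Counts shown next to each facet value in the navigation UI."""
    counts = {}
    for r in reports:
        counts[r[facet]] = counts.get(r[facet], 0) + 1
    return counts

print([r["id"] for r in faceted_search(reports, event_type="medication")])
print(facet_counts(reports, "severity"))
```

In a real system the facet values would come from the automatic text analysis the paper calls for, rather than being entered manually by reporters.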

  5. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on the spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system comprised of scripts written in Perl, C shell and AWK. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise it converts the commands to binary and sends the resultant information to be radiated to the spacecraft.

  6. A hierarchical structure for automatic meshing and adaptive FEM analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Saxena, Mukul; Perucchio, Renato

    1987-01-01

    A new algorithm for generating automatically, from solid models of mechanical parts, finite element meshes that are organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work) is discussed. Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized and some results from an experimental closed loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively are presented. The implementation of 3-D work is briefly discussed.

  7. Profiling School Shooters: Automatic Text-Based Analysis

    PubMed Central

    Neuman, Yair; Assaf, Dan; Cohen, Yochai; Knoll, James L.

    2015-01-01

School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their various characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by 6 school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters’ texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology. PMID:26089804

  8. Profiling School Shooters: Automatic Text-Based Analysis.

    PubMed

    Neuman, Yair; Assaf, Dan; Cohen, Yochai; Knoll, James L

    2015-01-01

School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their various characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by 6 school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology. PMID:26089804

  9. Trends of Science Education Research: An Automatic Content Analysis

    ERIC Educational Resources Information Center

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  10. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  11. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    PubMed Central

    Magalhaes, Fabrício A.; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia

    2013-01-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker’s coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% less manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software presented can be used as a valid and useful tool for underwater motion analysis. Key Points The availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports. An important feature of automatic tracking software is to require limited human
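
    The degree-of-automation measure described above reduces to a simple proportion: a frame is flagged for manual correction when the tracked marker drifts more than 4 pixels from the reference coordinate. A minimal pure-Python sketch, with illustrative function and argument names (not the authors' code):

```python
import math

def intervention_rate(auto_xy, reference_xy, threshold_px=4.0):
    """Fraction of tracked positions whose distance from the manually
    determined reference coordinate exceeds the correction threshold
    (4 pixels in the study described above)."""
    flagged = 0
    for (ax, ay), (rx, ry) in zip(auto_xy, reference_xy):
        if math.dist((ax, ay), (rx, ry)) > threshold_px:
            flagged += 1
    return flagged / len(auto_xy)

# Example: second position is 10 px off, so half the frames need correction.
rate = intervention_rate([(0, 0), (10, 0)], [(0, 0), (0, 0)])  # → 0.5
```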

  12. Automatic analysis of double coronal mass ejections from coronagraph images

    NASA Astrophysics Data System (ADS)

    Jacobs, Matthew; Chang, Lin-Ching; Pulkkinen, Antti; Romano, Michelangelo

    2015-11-01

    Coronal mass ejections (CMEs) can have major impacts on man-made technology and humans, both in space and on Earth. These impacts have created a high interest in the study of CMEs in an effort to detect and track events and forecast the CME arrival time to provide time for proper mitigation. A robust automatic real-time CME processing pipeline is greatly desired to avoid laborious and subjective manual processing. Automatic methods have been proposed to segment CMEs from coronagraph images and estimate CME parameters such as their heliocentric location and velocity. However, existing methods suffered from several shortcomings such as the use of hard thresholding and an inability to handle two or more CMEs occurring within the same coronagraph image. Double-CME analysis is a necessity for forecasting the many CME events that occur within short time frames. Robust forecasts for all CME events are required to fully understand space weather impacts. This paper presents a new method to segment CME masses and pattern recognition approaches to differentiate two CMEs in a single coronagraph image. The proposed method is validated on a data set of 30 halo CMEs, with results showing comparable ability in transient arrival time prediction accuracy and the new ability to automatically predict the arrival time of a double-CME event. The proposed method is the first automatic method to successfully calculate CME parameters from double-CME events, making this automatic method applicable to a wider range of CME events.

  13. Automatic recognition and analysis of synapses. [in brain tissue

    NASA Technical Reports Server (NTRS)

    Ungerleider, J. A.; Ledley, R. S.; Bloom, F. E.

    1976-01-01

    An automatic system for recognizing synaptic junctions would allow analysis of large samples of tissue for the possible classification of specific well-defined sets of synapses based upon structural morphometric indices. In this paper the three steps of our system are described: (1) cytochemical tissue preparation to allow easy recognition of the synaptic junctions; (2) transmitting the tissue information to a computer; and (3) analyzing each field to recognize the synapses and make measurements on them.

  14. Development of an automatic identification algorithm for antibiogram analysis.

    PubMed

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using susceptibility tests which were performed for 12 different antibiotics for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA and results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and correlation between AIA-based and human-based reading. Agreements were observed in 88% of cases and 89% of the tests showed no difference or a <4 mm difference between AIA and human analysis, exhibiting a correlation index of 0.85 for all images, 0.90 for standards and 0.80 for oddities, with no significant difference between the automatic and manual methods. AIA resolved some reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for automated reading of antibiograms in diagnostic and microbiology laboratories. PMID:26513468
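
    The weighted kappa index used above to quantify inter-reader agreement can be computed directly. This is a generic linear-weighted kappa sketch in pure Python (the paper does not publish its code, and the weighting scheme here is an assumption):

```python
def weighted_kappa(rater_a, rater_b, categories):
    """Linear weighted kappa: chance-corrected agreement between two
    readers on ordinal categories, with disagreement penalized in
    proportion to the distance between the assigned categories."""
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)
    # Observed joint distribution of the two readers' calls.
    observed = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        observed[index[a]][index[b]] += 1 / n
    pa = [sum(row) for row in observed]                              # A marginals
    pb = [sum(observed[i][j] for i in range(k)) for j in range(k)]   # B marginals
    # Linear disagreement weights |i - j| / (k - 1); kappa = 1 - Do/De.
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)
            num += w * observed[i][j]
            den += w * pa[i] * pb[j]
    return 1 - num / den
```

    Perfect agreement yields kappa = 1; agreement no better than chance yields 0.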

  15. Automatic Generation of User Material Subroutines for Biomechanical Growth Analysis

    PubMed Central

    Young, Jonathan M.; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A.; Perucchio, Renato

    2010-01-01

    Background The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis (FEA) package Abaqus allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. Method To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package Mathematica, and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress–stretch response of a material defined by a Fung-Orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in Abaqus. The Mathematica UMAT generator is then extended to include continuum growth, by adding a growth subroutine to the automatically generated UMAT. Results The Mathematica UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT we simulate the growth-based bending of a bilayered bar with differing fiber directions in a non-growing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. Conclusions The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for biomechanical growth analysis. PMID:20887023

  16. Cernuc: A program for automatic high-resolution radioelemental analysis

    NASA Astrophysics Data System (ADS)

    Roca, V.; Terrasi, F.; Moro, R.; Sorrentino, G.

    1981-04-01

    A computer program capable of qualitative and quantitative radioelemental analysis with high accuracy, a high degree of automation, and great ease of use is presented. It was produced for Ge(Li) gamma-ray spectroscopy and can be used for X-ray spectroscopy as well. This program provides automatic searching and fitting of peaks, energy and intensity determination, and identification and calculation of activities of the radioisotopes present in the sample. The last step is carried out using a radionuclides library. The problem of a gamma line being assigned to more than one nuclide is solved by searching for the least-squares solution of a set of equations for the activities of the isotopes. Two versions of this program have been written, to be run batchwise on a medium-sized computer (UNIVAC 1106) and interactively on a small computer (HP 2100A).

  17. An integrated spatial signature analysis and automatic defect classification system

    SciTech Connect

    Gleason, S.S.; Tobin, K.W.; Karnowski, T.P.

    1997-08-01

    An integrated Spatial Signature Analysis (SSA) and automatic defect classification (ADC) system for improved automatic semiconductor wafer manufacturing characterization is presented. Both concepts of SSA and ADC methodologies are reviewed and then the benefits of an integrated system are described, namely, focused ADC and signature-level sampling. Focused ADC involves the use of SSA information on a defect signature to reduce the number of possible classes that an ADC system must consider, thus improving the ADC system performance. Signature-level sampling improved the ADC system throughput and accuracy by intelligently sampling defects within a given spatial signature for subsequent off-line, high-resolution ADC. A complete example of wafermap characterization via an integrated SSA/ADC system is presented where a wafer with 3274 defects is completely characterized by revisiting only 25 defects on an off-line ADC review station. 13 refs., 7 figs.

  18. Facilitator control as automatic behavior: A verbal behavior analysis

    PubMed Central

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is automatic when the speaker or writer is not stimulated by the behavior at the time of emission, the behavior is not edited, the products of behavior differ from what the person would produce normally, and the behavior is attributed to an outside source. All of these characteristics appear to be present in facilitator behavior. Other variables seem to account for the thematic content of the typed messages. These variables also are discussed. PMID:22477083

  19. Spectral analysis methods for automatic speech recognition applications

    NASA Astrophysics Data System (ADS)

    Parinam, Venkata Neelima Devi

    In this thesis, we evaluate the front-end of Automatic Speech Recognition (ASR) systems, with respect to different types of spectral processing methods that are extensively used. A filter bank approach for front end spectral analysis is one of the common methods used for spectral analysis. In this work we describe and evaluate spectral analysis based on Mel and Gammatone filter banks. These filtering methods are derived from auditory models and are thought to have some advantages for automatic speech recognition work. Experimentally, however, we show that direct use of FFT spectral values is just as effective as using either Mel or Gammatone filter banks, provided that the features extracted from the FFT spectral values take into account a Mel or Mel-like frequency scale. It is also shown that trajectory features based on sliding block of spectral features, computed using either FFT or filter bank spectral analysis are considerably more effective, in terms of ASR accuracy, than are delta and delta-delta terms often used for ASR. Although there is no major performance disadvantage to using a filter bank, simplicity of analysis is a reason to eliminate this step in speech processing. These assertions hold for both clean and noisy speech.
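
    The Mel-like frequency scale discussed above is a fixed formula, so a Mel-spaced band layout over FFT spectral values can be sketched directly. The helper names below are ours, not the thesis's, and the standard O'Shaughnessy constants are assumed:

```python
import math

def hz_to_mel(f_hz):
    """Convert frequency in Hz to the Mel scale (2595 * log10(1 + f/700))."""
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

def mel_to_hz(m):
    """Inverse of hz_to_mel."""
    return 700.0 * (10 ** (m / 2595.0) - 1.0)

def mel_spaced_edges(f_min, f_max, n_bands):
    """Band edges equally spaced on the Mel scale, mapped back to Hz.
    Returns n_bands + 2 edges, so each band shares edges with neighbors."""
    lo, hi = hz_to_mel(f_min), hz_to_mel(f_max)
    return [mel_to_hz(lo + i * (hi - lo) / (n_bands + 1))
            for i in range(n_bands + 2)]

# 10 Mel-spaced bands between 0 and 8 kHz: narrow at low frequencies,
# progressively wider toward the top, mirroring auditory resolution.
edges = mel_spaced_edges(0.0, 8000.0, 10)
```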

  20. Automatic 3-D grayscale volume matching and shape analysis.

    PubMed

    Guétat, Grégoire; Maitre, Matthieu; Joly, Laurène; Lai, Sen-Lin; Lee, Tzumin; Shinagawa, Yoshihisa

    2006-04-01

    Recently, shape matching in three dimensions (3-D) has been gaining importance in a wide variety of fields such as computer graphics, computer vision, medicine, and biology, with applications such as object recognition, medical diagnosis, and quantitative morphological analysis of biological operations. Automatic shape matching techniques developed in the field of computer graphics handle object surfaces, but ignore intensities of inner voxels. In biology and medical imaging, voxel intensities obtained by computed tomography (CT), magnetic resonance imagery (MRI), and confocal microscopes are important to determine point correspondences. Nevertheless, most biomedical volume matching techniques require human interactions, and automatic methods assume matched objects to have very similar shapes so as to avoid combinatorial explosions of point correspondences. This article is aimed at decreasing the gap between the two fields. The proposed method automatically finds dense point correspondences between two grayscale volumes; i.e., finds a correspondent in the second volume for every voxel in the first volume, based on the voxel intensities. Multiresolution pyramids are introduced to reduce computational load and handle highly plastic objects. We calculate the average shape of a set of similar objects and give a measure of plasticity to compare them. Matching results can also be used to generate intermediate volumes for morphing. We use various data to validate the effectiveness of our method: we calculate the average shape and plasticity of a set of fly brain cells, and we also match a human skull and an orangutan skull. PMID:16617625

  1. Automatic selection of region of interest for radiographic texture analysis

    NASA Astrophysics Data System (ADS)

    Lan, Li; Giger, Maryellen L.; Wilkie, Joel R.; Vokes, Tamara J.; Chen, Weijie; Li, Hui; Lyons, Tracy; Chinander, Michael R.; Pham, Ann

    2007-03-01

    We have been developing radiographic texture analysis (RTA) for assessing osteoporosis and the related risk of fracture. Currently, analyses are performed on heel images obtained from a digital imaging device, the GE/Lunar PIXI, that yields both the bone mineral density (BMD) and digital images (0.2-mm pixels; 12-bit quantization). RTA is performed on the image data in a region-of-interest (ROI) placed just below the talus in order to include the trabecular structure in the analysis. We have found that variations occur from manually selecting this ROI for RTA. To reduce the variations, we present an automatic method involving an optimized Canny edge detection technique and parameterized bone segmentation, to define bone edges for the placement of an ROI within the predominantly calcaneus portion of the radiographic heel image. The technique was developed using 1158 heel images and then tested on an independent set of 176 heel images. Results from a subjective analysis noted that 87.5% of ROI placements were rated as "good". In addition, an objective overlap measure showed that 98.3% of images had successful ROI placements as compared to placement by an experienced observer at an overlap threshold of 0.4. In conclusion, our proposed method for automatic ROI selection on radiographic heel images yields promising results and the method has the potential to reduce intra- and inter-observer variations in selecting ROIs for radiographic texture analysis.

  2. Feature++: Automatic Feature Construction for Clinical Data Analysis.

    PubMed

    Sun, Wen; Hao, Bibo; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong

    2016-01-01

    With the rapid growth of clinical data and knowledge, feature construction for clinical analysis becomes increasingly important and challenging. Given a clinical dataset with up to hundreds or thousands of columns, the traditional manual feature construction process is usually too labour intensive to generate a full spectrum of features with potential values. As a result, advanced large-scale data analysis technologies, such as feature selection for predictive modelling, cannot be fully utilized for clinical data analysis. In this paper, we propose an automatic feature construction framework for clinical data analysis, namely, Feature++. It leverages available public knowledge to understand the semantics of the clinical data, and is able to integrate external data sources to automatically construct new features based on predefined rules and clinical knowledge. We demonstrate the effectiveness of Feature++ in a typical predictive modelling use case with a public clinical dataset, and the results suggest that the proposed approach is able to fulfil typical feature construction tasks with minimal dataset specific configurations, so that more accurate models can be obtained from various clinical datasets in a more efficient way. PMID:27577443

  3. Rapid automatic keyword extraction for information retrieval and analysis

    DOEpatents

    Rose, Stuart J; Cowley, Wendy E; Crow, Vernon L; Cramer, Nicholas O

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
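
    The scoring scheme described above (word score from co-occurrence degree and frequency, keyword score as the sum over a candidate phrase's words) can be sketched in a few lines of pure Python. The stop-word list and function name are illustrative only, not the patented implementation:

```python
import re

STOP_WORDS = {"a", "an", "and", "are", "as", "at", "be", "by", "for",
              "from", "in", "is", "it", "of", "on", "or", "the", "to"}

def rake_keywords(text, top_n=3):
    """Extract candidate keywords by splitting at stop words/punctuation,
    then rank them by summed word scores (degree / frequency)."""
    words = re.split(r"[^a-zA-Z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if not w or w in STOP_WORDS:           # delimiters end a phrase
            if current:
                phrases.append(current)
                current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)

    # degree(w) counts words co-occurring with w inside candidate phrases;
    # word score = degree / frequency favors words in longer phrases.
    freq, degree = {}, {}
    for phrase in phrases:
        for w in phrase:
            freq[w] = freq.get(w, 0) + 1
            degree[w] = degree.get(w, 0) + len(phrase) - 1
    word_score = {w: (degree[w] + freq[w]) / freq[w] for w in freq}

    # Keyword score = sum of member word scores; highest-scoring first.
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in phrases}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]
```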

  4. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.


  6. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses data from each honeypot. Results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as an information basis for an algorithm which classifies different types of VoIP attacks. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  7. Breast Density Analysis Using an Automatic Density Segmentation Algorithm.

    PubMed

    Oliver, Arnau; Tortajada, Meritxell; Lladó, Xavier; Freixenet, Jordi; Ganau, Sergi; Tortajada, Lidia; Vilagran, Mariona; Sentís, Melcior; Martí, Robert

    2015-10-01

    Breast density is a strong risk factor for breast cancer. In this paper, we present an automated approach for breast density segmentation in mammographic images based on a supervised pixel-based classification and using textural and morphological features. The objective of the paper is not only to show the feasibility of an automatic algorithm for breast density segmentation but also to prove its potential application to the study of breast density evolution in longitudinal studies. The database used here contains three complete screening examinations, acquired 2 years apart, of 130 different patients. The approach was validated by comparing manual expert annotations with automatically obtained estimations. Transversal analysis of the breast density analysis of craniocaudal (CC) and mediolateral oblique (MLO) views of both breasts acquired in the same study showed a correlation coefficient of ρ = 0.96 between the mammographic density percentage for left and right breasts, whereas a comparison of both mammographic views showed a correlation of ρ = 0.95. A longitudinal study of breast density confirmed the trend that dense tissue percentage decreases over time, although we noticed that the decrease in the ratio depends on the initial amount of breast density. PMID:25720749

  8. Spectral saliency via automatic adaptive amplitude spectrum analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.

  9. Automatic Visual Tracking and Social Behaviour Analysis with Multiple Mice

    PubMed Central

    Giancardo, Luca; Sona, Diego; Huang, Huiping; Sannino, Sara; Managò, Francesca; Scheggia, Diego; Papaleo, Francesco; Murino, Vittorio

    2013-01-01

    Social interactions are made of complex behavioural actions that might be found in all mammalians, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components. A module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterise social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts for each frame and for each mouse in the cage one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to us, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57B/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different experimental settings and

  10. DWT features performance analysis for automatic speech recognition of Urdu.

    PubMed

    Ali, Hazrat; Ahmad, Nasir; Zhou, Xianwei; Iqbal, Khalid; Ali, Sahibzada Muhammad

    2014-01-01

    This paper presents work on Automatic Speech Recognition of the Urdu language, using a comparative analysis of Discrete Wavelet Transform (DWT) based features and Mel Frequency Cepstral Coefficients (MFCC). These features have been extracted for one hundred isolated words of Urdu, each word uttered by ten different speakers. The words have been selected from the most frequently used words of Urdu. A variety of ages and dialects has been covered by using a balanced corpus approach. After extraction of features, classification has been achieved by using Linear Discriminant Analysis. After the classification task, the confusion matrix obtained for the DWT features has been compared with the one obtained for MFCC-based speech recognition. The framework has been trained and tested on speech data recorded under controlled environments. The experimental results are useful in determining the optimum features for the speech recognition task. PMID:25674450
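
    A single level of the DWT used above for feature extraction can be illustrated with the simplest wavelet, the Haar wavelet. The paper does not specify its wavelet family, so this pure-Python sketch is only an illustration of the transform itself:

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: averages of
    adjacent sample pairs give the approximation (low-pass) coefficients,
    differences give the detail (high-pass) coefficients, each scaled
    by 1/sqrt(2) to preserve energy."""
    assert len(signal) % 2 == 0, "even-length input expected"
    inv_sqrt2 = 2 ** -0.5
    approx = [(signal[i] + signal[i + 1]) * inv_sqrt2
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * inv_sqrt2
              for i in range(0, len(signal), 2)]
    return approx, detail

# A piecewise-constant signal has zero detail coefficients.
approx, detail = haar_dwt([1.0, 1.0, 2.0, 2.0])  # detail → [0.0, 0.0]
```

    Repeating the transform on the approximation coefficients yields the usual multi-level decomposition from which DWT features are typically taken.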

  11. Automatic analysis for neuron by confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Satou, Kouhei; Aoki, Yoshimitsu; Mataga, Nobuko; Hensch, Takao K.; Taki, Katuhiko

    2005-12-01

    The aim of this study is to develop a system that recognizes both the macro- and microscopic configurations of nerve cells and automatically performs the necessary 3-D measurements and functional classification of spines. The acquisition of 3-D images of cranial nerves has been enabled by the use of a confocal laser scanning microscope, although the highly accurate 3-D measurements of the microscopic structures of cranial nerves and their classification based on their configurations have not yet been accomplished. In this study, in order to obtain highly accurate measurements of the microscopic structures of cranial nerves, existing positions of spines were predicted by the 2-D image processing of tomographic images. Next, based on the positions that were predicted on the 2-D images, the positions and configurations of the spines were determined more accurately by 3-D image processing of the volume data. We report the successful construction of an automatic analysis system that uses a coarse-to-fine technique to analyze the microscopic structures of cranial nerves with high speed and accuracy by combining 2-D and 3-D image analyses.

  12. The automaticity of emotional Stroop: a meta-analysis.

    PubMed

    Phaf, R Hans; Kan, Kees-Jan

    2007-06-01

    An automatic bias to threat is often invoked to account for colour-naming interference in emotional Stroop. Recent findings by McKenna and Sharma [(2004). Reversing the emotional Stroop effect reveals that it is not what it seems: The role of fast and slow components. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30, 382-392], however, cast doubt on the fast and non-conscious nature of emotional Stroop. Interference by threat words only occurred with colour naming in the trial subsequent to the threat trial (i.e., a "slow" effect), but not immediately (i.e., a "fast" effect, as would be predicted by the bias hypothesis). In a meta-analysis of 70 published emotional Stroop studies, the largest effects occurred when presentation of threat words was blocked, suggesting a strong contribution by slow interference. Moreover, we did not find evidence for interference in suboptimal (less conscious) presentation conditions; the only significant effects were observed in optimal (fully conscious) conditions with high-anxious non-clinical participants and patients. The emotional Stroop effect seems to rely more on a slow disengagement process than on a fast, automatic bias. PMID:17112461

  13. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    SciTech Connect

Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and finally, based on the derived information, analyzing temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods on a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  14. Automatic Analysis of Cellularity in Glioblastoma and Correlation with ADC Using Trajectory Analysis and Automatic Nuclei Counting

    PubMed Central

    Burth, Sina; Kieslich, Pascal J.; Jungk, Christine; Sahm, Felix; Kickingereder, Philipp; Kiening, Karl; Unterberg, Andreas; Wick, Wolfgang; Schlemmer, Heinz-Peter; Bendszus, Martin; Radbruch, Alexander

    2016-01-01

    Objective Several studies have analyzed a correlation between the apparent diffusion coefficient (ADC) derived from diffusion-weighted MRI and the tumor cellularity of corresponding histopathological specimens in brain tumors with inconclusive findings. Here, we compared a large dataset of ADC and cellularity values of stereotactic biopsies of glioblastoma patients using a new postprocessing approach including trajectory analysis and automatic nuclei counting. Materials and Methods Thirty-seven patients with newly diagnosed glioblastomas were enrolled in this study. ADC maps were acquired preoperatively at 3T and coregistered to the intraoperative MRI that contained the coordinates of the biopsy trajectory. 561 biopsy specimens were obtained; corresponding cellularity was calculated by semi-automatic nuclei counting and correlated to the respective preoperative ADC values along the stereotactic biopsy trajectory which included areas of T1-contrast-enhancement and necrosis. Results There was a weak to moderate inverse correlation between ADC and cellularity in glioblastomas that varied depending on the approach towards statistical analysis: for mean values per patient, Spearman’s ρ = -0.48 (p = 0.002), for all trajectory values in one joint analysis Spearman’s ρ = -0.32 (p < 0.001). The inverse correlation was additionally verified by a linear mixed model. Conclusions Our data confirms a previously reported inverse correlation between ADC and tumor cellularity. However, the correlation in the current article is weaker than the pooled correlation of comparable previous studies. Hence, besides cell density, other factors, such as necrosis and edema might influence ADC values in glioblastomas. PMID:27467557
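The per-patient correlation step reported above can be illustrated with a small, self-contained Spearman rank correlation. The ADC and cellularity values below are invented for illustration (the study used 561 real biopsy specimens); the point is only the rank-based, monotone nature of the statistic:

```python
# Spearman rank correlation from scratch (average ranks for ties).
def _ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0          # average rank for a tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Higher cellularity with lower ADC: a perfectly monotone inverse
# relation gives rho = -1 (real data gave rho around -0.3 to -0.5).
adc = [1400, 1250, 1100, 900, 800]     # hypothetical ADC values
cells = [120, 180, 260, 400, 520]      # hypothetical nuclei counts
print(spearman_rho(adc, cells))        # → -1.0
```

In practice one would use `scipy.stats.spearmanr`; the hand-rolled version just makes the ranking step explicit.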

  15. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
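One of the parameters typically measured from such a time-intensity profile is the maximum upslope during the first pass of the contrast agent. The sketch below uses a synthetic curve (the values and the sampling interval are invented; the paper measures several parameters per myocardial position):

```python
# Maximum upslope of a time-intensity profile (one perfusion parameter).
def max_upslope(intensity, dt):
    # largest intensity increase per unit time between consecutive samples
    return max((b - a) / dt for a, b in zip(intensity, intensity[1:]))

# baseline, contrast wash-in, plateau (arbitrary units, dt = 1 s)
profile = [10, 10, 12, 30, 55, 70, 74, 75]
print(max_upslope(profile, 1.0))    # → 25.0
```

Computed per pixel, such parameters form the colour overlays described in the abstract.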

  16. Automatic measuring device for octave analysis of noise

    NASA Technical Reports Server (NTRS)

    Memnonov, D. L.; Nikitin, A. M.

    1973-01-01

An automatic decoder is described that counts noise levels by pulse counters and forms audio signals proportional in duration to the total noise level or to one of the octave noise levels. Automatic tenfold repetition of the measurement cycle is provided at each measurement point before the transition to a new point is made.

  17. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  18. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  19. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.
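The model-selection idea can be shown with a toy 1-D analogue: compare candidate convolution kernels of increasing size and keep the one the Akaike information criterion (AIC) prefers. The signals, noise, and candidate kernels below are invented; the paper solves for delta-basis-function kernels on real reference and target images.

```python
import math

# Toy 1-D difference-image kernel selection via AIC (illustration only).
def convolve(sig, kernel):
    h = len(kernel) // 2
    out = []
    for i in range(len(sig)):
        s = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - h
            if 0 <= idx < len(sig):
                s += k * sig[idx]
        out.append(s)
    return out

def aic(target, model, n_params):
    # AIC = n ln(RSS/n) + 2k : penalizes extra kernel parameters
    n = len(target)
    rss = sum((t - m) ** 2 for t, m in zip(target, model))
    return n * math.log(rss / n) + 2 * n_params

ref = [math.sin(i / 3.0) for i in range(50)]
truth = [1 / 3.0] * 3                       # "true" blurring kernel
target = [v + 0.01 * math.sin(7 * i)        # blurred reference + small noise
          for i, v in enumerate(convolve(ref, truth))]

candidates = [[1.0], [1 / 3.0] * 3, [1 / 5.0] * 5]
best = min(candidates, key=lambda k: aic(target, convolve(ref, k), len(k)))
print(len(best))                            # → 3
```

The information criterion rejects both the too-small kernel (poor fit) and the too-large one (extra parameters without matching gain in fit), which is the "simplest sufficiently good kernel" principle the abstract describes.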

  20. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  1. Trends of Science Education Research: An Automatic Content Analysis

    NASA Astrophysics Data System (ADS)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research, based on articles published from 1990 to 2007 in four journals: International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education. A multi-stage clustering technique was employed to investigate the topics, development trends, and contributors through which these journal publications constituted science education as a research field. This study found that Conceptual Change & Concept Mapping was the most studied topic, although its number of publications slightly declined in the 2000s. Studies on the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  2. Variable frame rate analysis for automatic speech recognition

    NASA Astrophysics Data System (ADS)

    Tan, Zheng-Hua

    2007-09-01

In this paper we investigate the use of variable frame rate (VFR) analysis in automatic speech recognition (ASR). First, we review the VFR technique and analyze its behavior. It is experimentally shown that VFR improves ASR performance for signals with low signal-to-noise ratios, since it generates improved acoustic models and substantially reduces insertion and substitution errors, although it may increase deletion errors. It is also underlined that the match between the average frame rate and the number of hidden Markov model states is critical in implementing VFR. Second, we analyze an effective VFR method that uses a cumulative, weighted cepstral-distance criterion for frame selection and present a revision of it. Finally, the revised VFR method is combined with spectral- and cepstral-domain enhancement methods, including minimum statistics noise estimation (MSNE) based spectral subtraction and the cepstral mean subtraction, variance normalization and ARMA filtering (MVA) process. Experiments on the Aurora 2 database confirm that VFR is highly complementary to the enhancement methods: enhancement of speech both facilitates frame selection in VFR and provides de-noised speech for recognition.
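The cumulative-distance frame-selection rule can be sketched as follows. The threshold and the toy "cepstra" are invented for illustration (and the weighting of the real criterion is omitted): accumulate the distance between consecutive frames and keep a frame only when the running total crosses the threshold, so stationary stretches are thinned out while fast spectral changes keep a high frame rate.

```python
# Minimal cumulative-distance variable frame rate (VFR) selection.
def select_frames(cepstra, threshold):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    kept = [0]                      # always keep the first frame
    acc = 0.0
    for t in range(1, len(cepstra)):
        acc += dist(cepstra[t], cepstra[t - 1])
        if acc >= threshold:        # enough spectral change accumulated
            kept.append(t)
            acc = 0.0
    return kept

# a stationary stretch followed by a rapid spectral change
frames = [[0.0, 0.0]] * 5 + [[float(i), 0.0] for i in range(1, 5)]
print(select_frames(frames, 1.0))   # → [0, 5, 6, 7, 8]
```

Lowering the threshold raises the average frame rate, which is the knob that must be matched to the number of HMM states, as the abstract notes.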

  3. Automatic analysis of ciliary beat frequency using optical flow

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g., primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, using the corner detection of Shi and Tomasi. Discriminating between background/noise and cilia with a frequency histogram allowed us to compute the CBF. Frequency analysis was done using the Fourier transform in MATLAB; the correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia, and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes re-found) by the method, and an easy way to select the correct sub-path of a point's path has yet to be found for the cases where the slope method does not work.
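The frequency-analysis step reduces to finding the dominant frequency of a tracked point's displacement over time. The sketch below uses a naive pure-Python DFT in place of the MATLAB FFT, with an invented sampling rate and synthetic track:

```python
import math

# Dominant frequency of a 1-D displacement track (naive DFT, DC skipped).
def dominant_frequency(signal, fs):
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):          # positive frequencies, skip DC
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n              # bin index → frequency in Hz

fs = 100.0                              # hypothetical high-speed camera rate
track = [math.sin(2 * math.pi * 12.0 * t / fs) for t in range(100)]
print(dominant_frequency(track, fs))    # → 12.0 (a plausible CBF in Hz)
```

Applied to each tracked feature, these per-point frequencies populate the histogram used to separate cilia from background.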

  4. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  5. volBrain: An Online MRI Brain Volumetry System

    PubMed Central

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  6. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  7. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…

  8. Discourse Analysis for Language Learners. Australian Review of Applied Linguistics, Vol. 1, No. 2.

    ERIC Educational Resources Information Center

    Hartmann, R. R. K.

    Discourse analysis, a field that reflects an interest in language as text and social interaction, is discussed. Discourse analysis deals with the way language varies from one communicative situation to another; textological analysis deals with the internal organization of such discourse in terms of grammar and vocabulary. Assumptions in…

  9. Automatic Match between Delimitation Line and Real Terrain Based on Least-Cost Path Analysis

    NASA Astrophysics Data System (ADS)

    Feng, C. Q.; Jiang, N.; Zhang, X. N.; Ma, J.

    2013-11-01

Nowadays, during international negotiations on separating disputed areas, the match between the delimitation line and the real terrain is adjusted only manually, which not only consumes much time and labor but also cannot ensure high precision. Concerning that, this paper explores automatic matching between them and studies a general solution based on Least-Cost Path Analysis. First, under the guidelines of delimitation laws, the cost layer is acquired through special processing of the delimitation line and terrain feature lines. Second, a new delimitation line is constructed with the help of Least-Cost Path Analysis. Third, the whole automatic matching model is built via Module Builder in order to share and reuse it. Finally, the result of automatic matching is analyzed from many different aspects, including delimitation laws, two-sided benefits, and so on. Consequently, we conclude that the automatic matching method is feasible and effective.
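At its core, a least-cost path over a cost layer is a shortest-path search on a weighted grid. The sketch below uses plain Dijkstra on a tiny invented cost raster (the real work builds the cost layer from the delimitation line and terrain feature lines in a GIS):

```python
import heapq

# Least-cost path on a cost raster via Dijkstra (4-connected grid).
def least_cost_path(cost, start, goal):
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue                     # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:                 # walk back to the start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# high costs mark terrain the new delimitation line should avoid
cost = [[1, 1, 1],
        [9, 9, 1],
        [1, 1, 1]]
print(least_cost_path(cost, (0, 0), (2, 0)))
```

The returned path detours around the high-cost middle row, just as the constructed delimitation line routes around penalized terrain features.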

  10. Automatic Crowd Analysis from Very High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Reinartz, P.

    2011-04-01

Recently, automatic detection of people crowds from images has become a very important research field, since it can provide crucial information, especially for police departments and crisis management teams. Due to the importance of the topic, many researchers have tried to solve this problem using street cameras. However, these cameras cannot be used to monitor very large outdoor public events. To bring a solution to the problem, herein we propose a novel approach to detect crowds automatically from remotely sensed images, especially very high resolution satellite images. To do so, we use a local-feature-based probabilistic framework. We extract local features from the color components of the input image. In order to eliminate redundant local features coming from other objects in the given scene, we apply a feature selection method. For feature selection purposes, we benefit from three different types of information: the digital elevation model (DEM) of the region, which is automatically generated using stereo satellite images; possible street segments, obtained by segmentation; and shadow information. After eliminating redundant local features, the remaining features are used to detect individual persons. Those local feature coordinates are also taken as observations of the probability density function (pdf) of the crowds to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf, which gives us information about dense crowd and people locations. We test our algorithm using WorldView-2 satellite images over the cities of Cairo and Munich. Besides, we also provide test results on airborne images for comparison of the detection accuracy. Our experimental results indicate the possible usage of the proposed approach in real-life mass events.
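The density-mapping step can be illustrated with a fixed-bandwidth Gaussian kernel density estimate over detected feature coordinates (the paper uses an adaptive bandwidth; the detections and bandwidth below are invented):

```python
import math

# Gaussian kernel density estimate over 2-D feature detections.
def density(points, x, y, bandwidth=2.0):
    s = 0.0
    for px, py in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        s += math.exp(-d2 / (2 * bandwidth ** 2))
    return s / (2 * math.pi * bandwidth ** 2 * len(points))

# feature detections clustered around (10, 10) = a dense crowd
detections = [(10 + dx, 10 + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
crowd = density(detections, 10, 10)
empty = density(detections, 40, 40)
print(crowd > empty)                    # → True
```

Evaluating this pdf over the whole image yields the crowd-density map from which dense-crowd locations are read off.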

  11. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    NASA Astrophysics Data System (ADS)

    Lecca, Paola

    2003-12-01

The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosive. Innovative algorithms for likely-background calculation, a neural-network-based system for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data processing.

  12. An analysis of automatic human detection and tracking

    NASA Astrophysics Data System (ADS)

    Demuth, Philipe R.; Cosmo, Daniel L.; Ciarelli, Patrick M.

    2015-12-01

This paper presents an automatic method to detect and follow people in video streams. The method uses two techniques to determine the initial position of the person at the beginning of the video: one based on optical flow and the other based on Histograms of Oriented Gradients (HOG). After defining the initial bounding box, tracking is done using four different trackers: the Median Flow tracker, the TLD tracker, the Mean Shift tracker, and a modified version of the Mean Shift tracker using the HSV color space. The results of these methods are then compared at the end of the paper.

  13. AVTA: a device for automatic vocal transaction analysis

    PubMed Central

    Cassotta, Louis; Feldstein, Stanley; Jaffe, Joseph

    1964-01-01

    The Automatic Vocal Transaction Analyzer was designed to recognize the pattern of certain variables in spontaneous vocal transactions. In addition, it records these variables directly in a machine-readable form and preserves their sequential relationships. This permits the immediate extraction of data by a digital computer. The AVTA system reliability has been shown to be equal to or better than that of a trained human operator in uncomplicated interaction. The superiority of the machine was demonstrated in complex interactions which tax the information processing abilities of the human observer. PMID:14120152

  14. The Romanian-English Contrastive Analysis Project; Further Developments in Contrastive Studies, Vol. 5.

    ERIC Educational Resources Information Center

    Chitoran, Dumitru, Ed.

    The fifth volume in this series contains ten articles dealing with various aspects of Romanian-English contrastive analysis. They are: "Theoretical Interpretation and Methodological Consequences of 'REGULARIZATION'," by Tatiana Slama-Cazacu; "On Error Analysis," by Charles M. Carlton; "The Contrastive Hypothesis in Second Language Acquisition," by…

  15. Structuring Lecture Videos by Automatic Projection Screen Localization and Analysis.

    PubMed

    Li, Kai; Wang, Jue; Wang, Haoqian; Dai, Qionghai

    2015-06-01

    We present a fully automatic system for extracting the semantic structure of a typical academic presentation video, which captures the whole presentation stage with abundant camera motions such as panning, tilting, and zooming. Our system automatically detects and tracks both the projection screen and the presenter whenever they are visible in the video. By analyzing the image content of the tracked screen region, our system is able to detect slide progressions and extract a high-quality, non-occluded, geometrically-compensated image for each slide, resulting in a list of representative images that reconstruct the main presentation structure. Afterwards, our system recognizes text content and extracts keywords from the slides, which can be used for keyword-based video retrieval and browsing. Experimental results show that our system is able to generate more stable and accurate screen localization results than commonly-used object tracking methods. Our system also extracts more accurate presentation structures than general video summarization methods, for this specific type of video. PMID:26357345

  16. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    NASA Astrophysics Data System (ADS)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the adjuster's operation. We establish the structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. In addition, we load this structural mechanics model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10^5 cycles under the effect of braking loads. In the meantime, fatigue tests of 20 automatic slack adjusters are carried out on a fatigue test bench to verify the conclusion of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10^5 cycles, which agrees with the results of the finite element analysis using the ANSYS Workbench FEA system.

  17. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, we study how much each component and test step contributes to the final uncertainty. Using the differential method, the mathematical model for the systematic uncertainty transfer function is established. Finally, in a case study, the combined uncertainties of manual operation and automatic operation are compared with each other (6.11% and 5.87%, respectively). The reasonableness and practicality of this newly developed automatic testing system are thus demonstrated.
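The final combination step typically follows the standard root-sum-of-squares rule for independent uncertainty components. The sketch below uses invented component values (the paper derives each component from the transfer function of its test step):

```python
import math

# Root-sum-of-squares combination of independent relative uncertainties.
def combined_uncertainty(components):
    return math.sqrt(sum(u ** 2 for u in components))

# hypothetical relative uncertainties: buret volume reading, timing,
# pressure regulation, temperature
manual = combined_uncertainty([0.040, 0.030, 0.025, 0.020])
print(round(manual * 100, 2))           # → 5.94 (percent)
```

Reducing any single component (e.g. replacing manual level reading with the optical fiber sensors) lowers the combined figure, which is how the automatic system improves on manual operation.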

  18. Development of a System for Automatic Facial Expression Analysis

    NASA Astrophysics Data System (ADS)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

Automatic recognition of facial expressions can be an important component of natural human-machine interaction. While many samples are desirable for accurately estimating a person's feelings (e.g., likeness) about a machine interface, in real-world situations only a small number of samples can be obtained, because of the high cost of collecting emotions from the observed person. This paper proposes a system that solves this problem while conforming to individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, the proposed method achieved better generalization performance than former HNN and Support Vector Machine (SVM) classifiers, using less learning time than the SVM classifiers.

  19. Mathematical morphology for TOFD image analysis and automatic crack detection.

    PubMed

    Merazi-Meksen, Thouraya; Boudraa, Malika; Boudraa, Bachir

    2014-08-01

The aim of this work is to automate the interpretation of ultrasonic images during the non-destructive testing (NDT) technique called time-of-flight diffraction (TOFD) to aid in decision making. In this paper, the mathematical morphology approach is used to extract relevant pixels corresponding to the presence of a discontinuity, and a pattern recognition technique is used to characterize the discontinuity. The watershed technique is exploited to determine the region of interest, and the image background is removed using an erosion process, thereby improving the detection of connected shapes present in the image. The remaining shapes are finally reduced to curves using a skeletonization technique. In the case of crack defects, the curve formed by such pixels has a parabolic form that can be automatically detected using the randomized Hough transform. PMID:24709071
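The randomized Hough step for parabolas can be sketched as: repeatedly pick three skeleton pixels, solve for the unique parabola through them, and vote in a quantized parameter space; a crack's diffraction arc shows up as a dominant cell. The skeleton points and quantization below are synthetic (real TOFD skeletons are noisy and votes are spread over many cells):

```python
import random

# Randomized Hough transform for a parabola y = a*x^2 + b*x + c.
def fit_parabola(p1, p2, p3):
    # Lagrange-interpolation coefficients through three points
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

random.seed(0)
# skeleton pixels of a synthetic diffraction arc y = 0.5 x^2 - 2 x + 9
points = [(x, 0.5 * x * x - 2 * x + 9) for x in range(-5, 6)]
votes = {}
for _ in range(50):
    sample = random.sample(points, 3)          # three random skeleton pixels
    a, b, c = fit_parabola(*sample)
    key = (round(a, 1), round(b, 1), round(c, 1))   # coarse quantization
    votes[key] = votes.get(key, 0) + 1
print(max(votes, key=votes.get))               # → (0.5, -2.0, 9.0)
```

Sampling triples instead of voting every pixel over the full parameter space is what makes the randomized variant practical for three-parameter curves.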

  20. Automatic K scaling by means of fractal and harmonic analysis

    NASA Astrophysics Data System (ADS)

    de Santis, A.; Chiappini, M.

    1992-08-01

The K index indicates the level of magnetic perturbation with respect to the normal diurnal variation. Usually K is derived manually from magnetograms, and the operations involved are consequently rather subjective. When data are available in digital form, it is possible to derive the K index automatically, using computer algorithms. This work applies a new combined technique based on both fractal and harmonic analyses. While the latter is often used in K determination, the former provides a substantially novel approach. One year (1989) of K observations at L'Aquila observatory has been used as a basis for comparison between hand and computer estimations of K. The agreement found is comparable with that expected between two different human operators.

  1. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  2. System for the Analysis of Global Energy Markets - Vol. II, Model Documentation

    EIA Publications

    2003-01-01

    The second volume provides a data implementation guide that lists all naming conventions and model constraints. In addition, Volume 1 has two appendixes that provide a schematic of the System for the Analysis of Global Energy Markets (SAGE) structure and a listing of the source code, respectively.

  3. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    NASA Astrophysics Data System (ADS)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language \\cal{C}. The use of \\cal{C} allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  4. USE OF ULTRASONICS IN THE RAPID EXTRACTION OF HI-VOL FILTERS FOR BENZO-A-PYRENE (BAP) ANALYSIS

    EPA Science Inventory

    A rapid simple procedure was developed to extract residual Benzo-a-pyrene from a single hi-vol filter strip. It involves quantitative dispensing of cyclohexane, 10 minutes of ultrasonication at 78C and a quiescent period of 1 hour. At that time the solvent is ready for chromatogr...

  5. Automatic classification for pathological prostate images based on fractal analysis.

    PubMed

    Huang, Po-Whei; Lee, Cheng-Hsiung

    2009-07-01

    Accurate grading for prostatic carcinoma in pathological images is important to prognosis and treatment planning. Since human grading is always time-consuming and subjective, this paper presents a computer-aided system to automatically grade pathological images according to the Gleason grading system, which is the most widespread method for histological grading of prostate tissues. We proposed two feature extraction methods based on fractal dimension to analyze variations of intensity and texture complexity in regions of interest. Each image can be classified into an appropriate grade by using Bayesian, k-NN, and support vector machine (SVM) classifiers, respectively. Leave-one-out and k-fold cross-validation procedures were used to estimate the correct classification rates (CCR). Experimental results show that 91.2%, 93.7%, and 93.7% CCR can be achieved by Bayesian, k-NN, and SVM classifiers, respectively, for a set of 205 pathological prostate images. If our fractal-based feature set is optimized by the sequential floating forward selection method, the CCR can be improved to 94.6%, 94.2%, and 94.6%, respectively, using each of the above three classifiers. Experimental results also show that our feature set is better than the feature sets extracted from multiwavelets, Gabor filters, and gray-level co-occurrence matrix methods because it has a much smaller size and still keeps the most powerful discriminating capability in grading prostate images. PMID:19164082
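
    The fractal features referred to above rest on the box-counting dimension: cover the foreground at several box sizes and regress log(count) on log(1/size). A minimal stand-alone sketch, not the paper's exact feature extractor:

```python
import math

def box_counting_dimension(pixels, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal (box-counting) dimension of a set of (x, y)
    foreground pixels: count occupied boxes at several scales, then take
    the least-squares slope of log(count) versus log(1/size)."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(x // s, y // s) for x, y in pixels}  # occupied boxes
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

    A filled region yields a dimension near 2 and a thin curve near 1; texture-complexity features exploit how real tissue falls between these extremes.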

  6. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
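
    As a rough illustration of how a topic model can replace fixed rules, a network event can be scored by its per-token surprise under learned topic distributions; events no topic explains well are flagged for review. This is a hedged stand-in for the paper's LDA machinery: the function names, inputs, and smoothing constant are all assumptions.

```python
import math

def event_surprise(tokens, topics, topic_prior, floor=1e-9):
    """Negative average log-likelihood of an event (bag of tokens) under a
    mixture of learned topics. `topics` is a list of {token: probability}
    dicts, `topic_prior` the mixing weights; `floor` smooths unseen tokens.
    High surprise marks candidate exfiltration events."""
    logp = 0.0
    for tok in tokens:
        p = sum(prior * topic.get(tok, floor)
                for topic, prior in zip(topics, topic_prior))
        logp += math.log(p)
    return -logp / max(len(tokens), 1)  # per-token surprise
```

    Because the score is continuous, it supports the risk-based, evolutionary deployment the abstract describes: thresholds can be tuned rather than rules rewritten.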

  7. Towards automatic music transcription: note extraction based on independent subspace analysis

    NASA Astrophysics Data System (ADS)

    Wellhausen, Jens; Hoynck, Michael

    2005-01-01

    Due to the increasing amount of music available electronically, the need for automatic search, retrieval and classification systems for music becomes more and more important. In this paper an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications, music analysis and music classification. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined using Independent Subspace Analysis to extract the sounding notes. Finally, the results are used to build a MIDI file as a new representation of the piece of music being examined.

  8. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  9. Automatic photolaryngoscope for vibration analysis of vocal cords

    NASA Astrophysics Data System (ADS)

    Igielski, J.; Kujawinska, Malgorzata; Pawlowski, Z.

    1995-05-01

    The vibration analysis of vocal cords gives information about the functioning of speech organs as well as about some illness within human organism. The analysis is usually performed by electroglottography or stroboscopic methods. The authors present the new opto-mechanical and electronic system of photolaryngoscope. The instrument uses laser diode light for illumination of vocal cords. The light reflected from the vibrating cord surface is detected electronically and analyzed. The further mathematical analysis of glottograms by autoregression method with covariance or by periodogram method is performed in order to define new criteria for medical interpretation of results.

  10. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  11. Automatic Method of Supernovae Classification by Modeling Human Procedure of Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Módolo, Marcelo; Rosa, Reinaldo; Guimaraes, Lamartine N. F.

    2016-07-01

    The classification of a recently discovered supernova must be done as quickly as possible in order to define what information will be captured and analyzed in the following days. This classification is not trivial and only a few expert astronomers are able to perform it. This paper proposes an automatic method that models the human procedure of classification. It uses Multilayer Perceptron Neural Networks to analyze the supernova spectra. Experiments were performed using different pre-processing steps and multiple neural network configurations to identify the classic types of supernovae. Significant results were obtained, indicating the viability of using this method in places that have no specialist or that require an automatic analysis.

  12. System for Automatic Detection and Analysis of Targets in FMICW Radar Signal

    NASA Astrophysics Data System (ADS)

    Rejfek, Luboš; Mošna, Zbyšek; Urbář, Jaroslav; Koucká Knížová, Petra

    2016-01-01

    This paper presents an automatic system for processing signals from a frequency modulated interrupted continuous wave (FMICW) radar and describes methods for the primary signal processing. Further, we present methods for the detection of targets in strong noise. These methods are tested on both real and simulated signals. The real signals were measured using an experimental FMICW radar prototype, developed at the IAP CAS, with an operational frequency of 35.4 GHz. The measurement campaign took place at TU Delft, the Netherlands. The obtained results were used for the development of the system for automatic detection and analysis of targets measured by the FMICW radar.

  13. Automatic "pipeline" analysis of 3-D MRI data for clinical trials: application to multiple sclerosis.

    PubMed

    Zijdenbos, Alex P; Forghani, Reza; Evans, Alan C

    2002-10-01

    The quantitative analysis of magnetic resonance imaging (MRI) data has become increasingly important in both research and clinical studies of human brain development, function, and pathology. Inevitably, the role of quantitative image analysis in the evaluation of drug therapy will increase, driven in part by requirements imposed by regulatory agencies. However, the prohibitive length of time involved and the significant intra- and inter-rater variability of the measurements obtained from manual analysis of large MRI databases represent major obstacles to the wider application of quantitative MRI analysis. We have developed a fully automatic "pipeline" image analysis framework and have successfully applied it to a number of large-scale, multicenter studies (more than 1,000 MRI scans). This pipeline system is based on robust image processing algorithms, executed in a parallel, distributed fashion. This paper describes the application of this system to the automatic quantification of multiple sclerosis lesion load in MRI, in the context of a phase III clinical trial. The pipeline results were evaluated through an extensive validation study, revealing that the obtained lesion measurements are statistically indistinguishable from those obtained by trained human observers. Given that intra- and inter-rater measurement variability is eliminated by automatic analysis, this system enhances the ability to detect small treatment effects not readily detectable through conventional analysis techniques. While useful for clinical trial analysis in multiple sclerosis, this system holds widespread potential for applications in other neurological disorders, as well as for the study of neurobiology in general. PMID:12585710

  14. CAD system for automatic analysis of CT perfusion maps

    NASA Astrophysics Data System (ADS)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-aided diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis [detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (deciding the type of lesion: ischemic/hemorrhagic, and whether the brain tissue is at risk of infarction or not) of visualized symptoms is done by so-called cognitive inference processes, allowing for reasoning on the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET framework installed.
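
    Detecting perfusion asymmetries of the kind described above typically reduces to comparing homologous left/right regions of the map. A minimal sketch of such an asymmetry index; the formula and any decision threshold are illustrative, not the authors' exact measure:

```python
def asymmetry_index(left_region, right_region):
    """Relative difference between mean perfusion values of homologous
    left/right regions: 0 means symmetric; larger values flag candidate
    lesions for further (qualitative) interpretation."""
    ml = sum(left_region) / len(left_region)
    mr = sum(right_region) / len(right_region)
    denom = (ml + mr) / 2.0
    return abs(ml - mr) / denom if denom else 0.0
```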

  15. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  16. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  17. Automatic forensic analysis of automotive paints using optical microscopy.

    PubMed

    Thoonen, Guy; Nys, Bart; Vander Haeghen, Yves; De Roy, Gilbert; Scheunders, Paul

    2016-02-01

    The timely identification of vehicles involved in an accident, such as a hit-and-run situation, bears great importance in forensics. To this end, procedures have been defined for analyzing car paint samples that combine techniques such as visual analysis and Fourier transform infrared spectroscopy. This work proposes a new methodology in order to automate the visual analysis using image retrieval. Specifically, color and texture information is extracted from a microscopic image of a recovered paint sample, and this information is then compared with the same features for a database of paint types, resulting in a shortlist of candidate paints. In order to demonstrate the operation of the methodology, a test database has been set up and two retrieval experiments have been performed. The first experiment quantifies the performance of the procedure for retrieving exact matches, while the second experiment emulates the real-life situation of paint samples that experience changes in color and texture over time. PMID:26774250

  18. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    NASA Astrophysics Data System (ADS)

    Tiira, Timo; Kaisko, Outi; Kortström, Jari; Vuorinen, Tommi; Uski, Marja; Korja, Annakaisa

    2015-04-01

    The site of a new planned nuclear power plant is located in Pyhäjoki, on the eastern coast of the Bay of Bothnia. The area is characterized by low-activity intraplate seismicity, with earthquake magnitudes rarely exceeding 4.0. IAEA guidelines state that when a nuclear power plant site is evaluated, a network of sensitive seismographs capable of recording micro-earthquakes should be installed to acquire more detailed information on potential seismic sources. The operation period of the network should be long enough to obtain a comprehensive earthquake catalogue for seismotectonic interpretation. A near-optimal configuration of ten seismograph stations will be installed around the site. A central station, including 3-C high-frequency and strong-motion seismographs, is located in the site area. In addition, the network comprises nine high-frequency 3-C stations within a distance of 50 km from the central station. The network is dense enough to fulfil the requirements of azimuthal coverage better than 180° and automatic event location capability down to ~ML -0.1 within a radius of 25 km from the site. Automatic processing and analysis of the planned seismic network is presented. Following the IAEA guidelines, real-time monitoring of the site area is integrated with the automatic detection and location process operated by the Institute of Seismology, University of Helsinki. In addition, interactive data analysis is needed. By the end of 2013, five stations had been installed. The automatic analysis also utilizes seven nearby stations of the national seismic networks of Finland and Sweden. During this preliminary phase several small earthquakes have been detected. The detection capability and location accuracy of the automatic analysis are estimated using chemical explosions at 15 known sites.

  19. A framework for automatic heart sound analysis without segmentation

    PubMed Central

    2011-01-01

    Background A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method An equal number of cardiac cycles were extracted from heart sounds with different heart rates using information from the envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using autocorrelation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result The proposed method was tested on a set of heart sounds obtained from several online databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion The proposed method showed promising results and high noise robustness for a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients and further evaluating the method on it. PMID:21303558
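
    The cycle-length step can be illustrated with plain autocorrelation: for a periodic envelope, the lag of the dominant autocorrelation peak approximates the cardiac cycle length in samples. A simplified sketch of that idea (the paper works with envelopes of autocorrelation functions; the names and the search range here are assumptions):

```python
def autocorrelation(x):
    """Biased autocorrelation of a zero-meaned signal."""
    n = len(x)
    m = sum(x) / n
    x = [v - m for v in x]
    return [sum(x[i] * x[i + lag] for i in range(n - lag)) / n
            for lag in range(n)]

def cycle_length(envelope, min_lag=1):
    """Estimate the cycle length (in samples) as the lag of the dominant
    autocorrelation peak, searching lags up to half the signal length."""
    ac = autocorrelation(envelope)
    return max(range(min_lag, len(ac) // 2), key=lambda lag: ac[lag])
```

    With the cycle length known, equal numbers of cycles can be cut from recordings at different heart rates without labeling individual FHS.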

  20. Theory and algorithms of an efficient fringe analysis technology for automatic measurement applications.

    PubMed

    Juarez-Salazar, Rigoberto; Guerrero-Sanchez, Fermin; Robledo-Sanchez, Carlos

    2015-06-10

    Some advances in fringe analysis technology for phase computing are presented. A full scheme for phase evaluation, applicable to automatic applications, is proposed. The proposal consists of: a fringe-pattern normalization method, Fourier fringe-normalized analysis, generalized phase-shifting processing for inhomogeneous nonlinear phase shifts and spatiotemporal visibility, and a phase-unwrapping method by a rounding-least-squares approach. The theoretical principles of each algorithm are given. Numerical examples and an experimental evaluation are presented. PMID:26192836
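
    For the equal-step special case, generalized phase-shifting reduces to the classical synchronous-detection formula, which a short sketch makes concrete. The paper's algorithm additionally handles inhomogeneous nonlinear shifts and spatiotemporal visibility, which this sketch does not:

```python
import math

def phase_from_steps(intensities):
    """Wrapped phase from N equally shifted fringe intensities
    I_k = A + B*cos(phi + 2*pi*k/N), by synchronous detection:
    phi = atan2(-sum(I_k * sin d_k), sum(I_k * cos d_k))."""
    n = len(intensities)
    num = -sum(I * math.sin(2 * math.pi * k / n)
               for k, I in enumerate(intensities))
    den = sum(I * math.cos(2 * math.pi * k / n)
              for k, I in enumerate(intensities))
    return math.atan2(num, den)  # wrapped to (-pi, pi]
```

    The recovered phase is wrapped; a separate unwrapping stage, such as the rounding-least-squares approach mentioned above, removes the 2*pi ambiguities.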

  1. Development of automatic movement analysis system for a small laboratory animal using image processing

    NASA Astrophysics Data System (ADS)

    Nagatomo, Satoshi; Kawasue, Kikuhito; Koshimoto, Chihiro

    2013-03-01

    Activity analysis of small laboratory animals is an effective procedure in various bioscience fields. The simplest way to obtain animal activity data is manual observation and recording, although this is labor-intensive and rather subjective. Analyzing animal movement automatically and objectively usually requires expensive equipment. In the present study, we develop a low-cost animal activity analysis system that applies a template-matching method to video-recorded movements of laboratory animals.
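
    The template-matching core of such a system can be sketched with normalized cross-correlation: slide a template of the animal over each frame and keep the best-scoring position; tracking the position over frames yields the activity record. An illustrative, unoptimized pure-Python sketch (a real system would use an optimized image library):

```python
def ncc(patch, template):
    """Normalized cross-correlation between two equally sized 2-D patches."""
    a = [v for row in patch for v in row]
    b = [v for row in template for v in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def locate_animal(frame, template):
    """Slide the template over the frame; return the top-left corner
    (row, col) of the best-matching window."""
    th, tw = len(template), len(template[0])
    best, best_pos = -2.0, (0, 0)
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            patch = [row[c:c + tw] for row in frame[r:r + th]]
            score = ncc(patch, template)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```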

  2. Biosignal Analysis to Assess Mental Stress in Automatic Driving of Trucks: Palmar Perspiration and Masseter Electromyography

    PubMed Central

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-01-01

    Nowadays, insight into human-machine interaction is a critical topic with the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rationally practical use of the automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate the mental stress of drivers during automatic driving of trucks, with vehicles set at a close gap distance apart to reduce air resistance and save energy. By application of two wearable sensor systems, a continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about 25 m gap distance as a reference. It was found that mental stress significantly increased when the gap distance decreased, and an abrupt increase in the mental stress of drivers was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports. PMID:25738768

  4. Toward automatic computer aided dental X-ray analysis using level set method.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Jin, Chao; Li, Song

    2005-01-01

    A Computer Aided Dental X-rays Analysis (CADXA) framework is proposed to semi-automatically detect areas of bone loss and root decay in digital dental X-rays. In this framework, first, a new proposed competitive coupled level set method is proposed to segment the image into three pathologically meaningful regions using two coupled level set functions. Tailored for the dental clinical environment, the segmentation stage uses a trained support vector machine (SVM) classifier to provide initial contours. Then, based on the segmentation results, an analysis scheme is applied. First, the scheme builds an uncertainty map from which those areas with bone loss will be automatically detected. Secondly, the scheme employs a method based on the SVM and the average intensity profile to isolate the teeth and detect root decay. Experimental results show that our proposed framework is able to automatically detect the areas of bone loss and, when given the orientation of the teeth, it is able to automatically detect the root decay with a seriousness level marked for diagnosis. PMID:16685904

  5. Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis

    PubMed Central

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, Seyedmohammad; Rosenwald, Dean P.

    2014-01-01

    Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were highly consistent for FACS action units, and showed similar effects for change over time in depression severity. For both systems, when symptom severity was high, participants made more facial expressions associated with contempt, smiled less, and those smiles that occurred were more likely to be accompanied by facial actions associated with contempt. These results are consistent with the “social risk hypothesis” of depression. According to this hypothesis, when symptoms are severe, depressed participants withdraw from other people in order to protect themselves from anticipated rejection, scorn, and social exclusion. As their symptoms fade, participants send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and produced the same pattern of depression effects suggests that automatic facial expression analysis may be ready for use in behavioral and clinical science. PMID:24598859

  6. Automatic Fatigue Detection of Drivers through Yawning Analysis

    NASA Astrophysics Data System (ADS)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on the video analysis of drivers. The focus of the paper is on how to detect yawning, which is an important cue for determining driver fatigue. Initially, the face is located through the Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which lips are searched for through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features to determine the driver's yawning state. If the yawning state of the driver persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue, and the driver is warned through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, with users of different races and genders.
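
    The persistence rule, by which yawning must last several consecutive frames before the alarm fires, can be sketched as a simple stateful counter over the per-frame mouth-openness measure. The threshold and frame count below are illustrative, not the paper's values:

```python
def fatigue_monitor(openness_per_frame, threshold=0.5, persist=30):
    """Return the frame indices at which a fatigue alarm would fire:
    mouth openness must exceed `threshold` for `persist` consecutive
    frames; the counter resets whenever openness drops (or on the
    misdetections/occlusions that trigger reinitialization)."""
    alarms, run = [], 0
    for i, openness in enumerate(openness_per_frame):
        run = run + 1 if openness > threshold else 0
        if run == persist:  # fire once per sustained yawn
            alarms.append(i)
    return alarms
```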

  7. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    PubMed Central

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by the manual data acquisition and analysis process, which is labor-intensive and time-consuming. Building on our original design of the biochemical reactions, we propose here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patterns into a microfluidic array. These function patterns provide quantitative information on the characteristic dimensions of the microfluidic array, as well as marking its orientation and origin of coordinates. We used a computer program to perform automatic analysis for a high-throughput antigen/antibody interaction experiment in 10 s, which was more than 500 times faster than conventional manual processing. Our method is broadly applicable to many other microchannel-based immunoassays. PMID:24404030

  8. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    PubMed

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks. PMID:20924860
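
    Discrete Gaussian curvature on a triangle mesh is commonly computed as the angle defect at each vertex, which is plausibly the kind of quantity such a curvature analysis starts from. A sketch of the standard formula, not the authors' algorithm:

```python
import math

def angle_defect(vertex, one_ring):
    """Discrete Gaussian curvature at an interior mesh vertex: 2*pi minus
    the sum of the incident triangle angles at the vertex. `one_ring`
    lists the neighboring vertices in cyclic fan order."""
    def angle_at_vertex(a, b):
        ax, ay, az = (a[i] - vertex[i] for i in range(3))
        bx, by, bz = (b[i] - vertex[i] for i in range(3))
        dot = ax * bx + ay * by + az * bz
        na = math.sqrt(ax * ax + ay * ay + az * az)
        nb = math.sqrt(bx * bx + by * by + bz * bz)
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))
    total = sum(angle_at_vertex(one_ring[i], one_ring[(i + 1) % len(one_ring)])
                for i in range(len(one_ring)))
    return 2 * math.pi - total  # ~0 on flat regions, nonzero at curvature
```

    Flat regions give a defect near zero, while corners and ridges give large defects; clusters of high-defect vertices are natural anchor candidates for building blocks.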

  9. Automatic localization of cerebral cortical malformations using fractal analysis

    NASA Astrophysics Data System (ADS)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel level analysis based on fractal geometry, then define two similarity measures to detect the lesions at single subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC  =  85%, FPR  =  96%), sensitivity (WACC  =  83%, FPR  =  63%) and accuracy (WACC  =  85%, FPR  =  90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.

  10. Automatic localization of cerebral cortical malformations using fractal analysis.

    PubMed

    De Luca, A; Arrigoni, F; Romaniello, R; Triulzi, F M; Peruzzo, D; Bertoldo, A

    2016-08-21

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel level analysis based on fractal geometry, then define two similarity measures to detect the lesions at single subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC  =  85%, FPR  =  96%), sensitivity (WACC  =  83%, FPR  =  63%) and accuracy (WACC  =  85%, FPR  =  90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity. PMID:27444964

  11. Analysis of Fiber deposition using Automatic Image Processing Method

    NASA Astrophysics Data System (ADS)

    Belka, M.; Lizal, F.; Jedelsky, J.; Jicha, M.

    2013-04-01

    Fibers are a permanent threat to human health. They are able to penetrate deep into the human lung, deposit there, and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways at an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method based on image analysis was established for deposition data acquisition. The images were captured by a high-definition camera attached to a phase-contrast microscope. Results of the new method were compared with the standard PCM method, which follows NIOSH methodology 7400, and a good match was found. The new method proved applicable for the evaluation of fibers, and deposition fraction and deposition efficiency were calculated afterwards.

  12. Generalization versus contextualization in automatic evaluation revisited: A meta-analysis of successful and failed replications.

    PubMed

    Gawronski, Bertram; Hu, Xiaoqing; Rydell, Robert J; Vervliet, Bram; De Houwer, Jan

    2015-08-01

    To account for disparate findings in the literature on automatic evaluation, Gawronski, Rydell, Vervliet, and De Houwer (2010) proposed a representational theory that specifies the contextual conditions under which automatic evaluations reflect initially acquired attitudinal information or subsequently acquired counterattitudinal information. The theory predicts that automatic evaluations should reflect the valence of expectancy-violating counterattitudinal information only in the context in which this information had been learned. In contrast, automatic evaluations should reflect the valence of initial attitudinal information in any other context, be it the context in which the initial attitudinal information had been acquired (ABA renewal) or a novel context in which the target object had not been encountered before (ABC renewal). The current article presents a meta-analysis of all published and unpublished studies from the authors' research groups regardless of whether they produced the predicted pattern of results. Results revealed average effect sizes of d = 0.249 for ABA renewal (30 studies, N = 3,142) and d = 0.174 for ABC renewal (27 studies, N = 2,930), both of which were significantly different from zero. Effect sizes were moderated by attention to context during learning, order of positive and negative information, context-valence contingencies during learning, and sample country. Although some of the obtained moderator effects are consistent with the representational theory, others require theoretical refinements and future research to gain deeper insights into the mechanisms underlying contextual renewal. PMID:26010481
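
    The meta-analysis above pools standardized mean differences (Cohen's d) across studies. A common way to do this is inverse-variance weighting; the sketch below uses the standard fixed-effect formulas (the published analysis may have used a random-effects model, so treat this as illustrative only).

```python
def pooled_effect(studies):
    """Fixed-effect inverse-variance pooling of Cohen's d values.

    studies: list of (d, n1, n2) tuples. The variance of d is
    approximated by (n1+n2)/(n1*n2) + d^2 / (2*(n1+n2)).
    Returns (pooled d, standard error of the pooled estimate).
    """
    weights, weighted = [], []
    for d, n1, n2 in studies:
        var = (n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2))
        w = 1.0 / var
        weights.append(w)
        weighted.append(w * d)
    pooled = sum(weighted) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se
```

    Pooling several studies with the same effect reproduces that effect with a smaller standard error than any single study.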

  13. Investigation of Ballistic Evidence through an Automatic Image Analysis and Identification System.

    PubMed

    Kara, Ilker

    2016-05-01

    Automated firearms identification (AFI) systems contribute to shedding light on criminal events by comparison between different pieces of evidence on cartridge cases and bullets and by matching similar ones that were fired from the same firearm. Ballistic evidence can be rapidly analyzed and classified by means of an automatic image analysis and identification system. In addition, it can be used to narrow the range of possible matching evidence. In this study conducted on the cartridges ejected from the examined pistol, three imaging areas, namely the firing pin impression, capsule traces, and the intersection of these traces, were compared automatically using the image analysis and identification system through the correlation ranking method to determine the numeric values that indicate the significance of the similarities. These numerical features that signify the similarities and differences between pistol makes and models can be used in groupings to make a distinction between makes and models of pistols. PMID:27122419
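
    The system above ranks candidate matches by correlation between imaged tool-mark regions. A minimal version of correlation ranking treats each image region as a flattened intensity vector and sorts references by normalized cross-correlation with the query; this is a generic sketch, not the commercial AFI system's algorithm.

```python
def ncc(a, b):
    """Normalized cross-correlation (Pearson) of two equal-length vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def rank_matches(query, references):
    """Rank reference images (name -> flattened pixels) by similarity."""
    scores = [(name, ncc(query, img)) for name, img in references.items()]
    return sorted(scores, key=lambda t: t[1], reverse=True)
```

    An identical reference ranks first with a score of 1; a reversed pattern scores -1.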

  14. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote the modeling, we had developed the CADLIVE dynamic simulator that automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors and analyzed its robustness. To enhance the feasibility by CADLIVE and extend its functions, we propose the CADLIVE toolbox available for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to the research of systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with an instruction. PMID:24623466
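
    CADLIVE converts a biochemical map into a dynamic model. The smallest version of that idea is integrating a mass-action rate law; the hypothetical one-reaction network and forward-Euler integrator below are only a sketch of the kind of ODE system such a tool generates, not CADLIVE's conversion rules.

```python
def simulate_chain(k1, a0, dt, steps):
    """Forward-Euler integration of the toy mass-action reaction A -> B
    (dA/dt = -k1*A, dB/dt = +k1*A). Returns final (A, B).
    Mass is conserved because the same flux leaves A and enters B."""
    a, b = a0, 0.0
    for _ in range(steps):
        flux = k1 * a
        a -= dt * flux
        b += dt * flux
    return a, b
```

    After ten time constants nearly all of A has converted to B, and A + B stays at its initial value.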

  15. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    NASA Astrophysics Data System (ADS)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. Compared to audio-visual emotion channels such as facial expression or speech, little attention has so far been paid to physiological signals for emotion recognition. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra and multiscale entropy, is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is proven by emotion recognition results.
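
    The feature-extraction stage described above spans many domains; the time domain is the simplest. The sketch below computes three standard time-domain features often used on biosignals (mean, RMS, zero-crossing count). It is a generic illustration, not the paper's feature set.

```python
def time_domain_features(x):
    """Mean, root-mean-square, and zero-crossing count of a 1-D signal.
    Zero crossings are counted as sign changes between adjacent samples."""
    n = len(x)
    mean = sum(x) / n
    rms = (sum(v * v for v in x) / n) ** 0.5
    zc = sum(1 for a, b in zip(x, x[1:]) if a * b < 0)
    return {"mean": mean, "rms": rms, "zero_crossings": zc}
```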

  16. Automatic Assessment and Reduction of Noise using Edge Pattern Analysis in Non-Linear Image Enhancement

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.; Hines, Glenn D.

    2004-01-01

    Noise is the primary visibility limit in the process of non-linear image enhancement, and is no longer a statistically stable additive noise in the post-enhancement image. Therefore novel approaches are needed to both assess and reduce spatially variable noise at this stage in overall image processing. Here we will examine the use of edge pattern analysis both for automatic assessment of spatially variable noise and as a foundation for new noise reduction methods.

  17. Theoretical Analysis of the Longitudinal Behavior of an Automatically Controlled Supersonic Interceptor During the Attack Phase

    NASA Technical Reports Server (NTRS)

    Gates, Ordway B., Jr.; Woodling, C. H.

    1959-01-01

    A theoretical analysis of the longitudinal behavior of an automatically controlled supersonic interceptor during the attack phase against a nonmaneuvering target is presented. Control of the interceptor's flight path is obtained by use of a pitch-rate command system. Topics discussed include lift and pitching moment, effects of initial tracking errors, normal-acceleration limiting, limitations of control-surface rate and deflection, and effects of neglecting forward velocity changes of the interceptor during the attack phase.

  18. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time Awake during first third of night, and decreased total sleep time) during the mission.

  19. SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments

    NASA Technical Reports Server (NTRS)

    Leonard, R. F.

    1977-01-01

    A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations by using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires the locating of peaks in X-ray spectra, determination of intensities of peaks, identification of origins of peaks, and determination of the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.
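
    Peak location in a spectrum, the first analysis step named above, can be sketched as finding local maxima above a height threshold. This is a minimal stand-in for the SAVLOC step (the original PDP-15 logic is not reproduced here).

```python
def find_peaks(spectrum, min_height):
    """Return indices of local maxima in `spectrum` that are at least
    `min_height` tall. Plateaus are credited to their left edge."""
    peaks = []
    for i in range(1, len(spectrum) - 1):
        if (spectrum[i] >= min_height
                and spectrum[i] > spectrum[i - 1]
                and spectrum[i] >= spectrum[i + 1]):
            peaks.append(i)
    return peaks
```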

  20. Imaging and analysis platform for automatic phenotyping and trait ranking of plant root systems.

    PubMed

    Iyer-Pascuzzi, Anjali S; Symonova, Olga; Mileyko, Yuriy; Hao, Yueling; Belcher, Heather; Harer, John; Weitz, Joshua S; Benfey, Philip N

    2010-03-01

    The ability to nondestructively image and automatically phenotype complex root systems, like those of rice (Oryza sativa), is fundamental to identifying genes underlying root system architecture (RSA). Although root systems are central to plant fitness, identifying genes responsible for RSA remains an underexplored opportunity for crop improvement. Here we describe a nondestructive imaging and analysis system for automated phenotyping and trait ranking of RSA. Using this system, we image rice roots from 12 genotypes. We automatically estimate RSA traits previously identified as important to plant function. In addition, we expand the suite of features examined for RSA to include traits that more comprehensively describe monocot RSA but that are difficult to measure with traditional methods. Using 16 automatically acquired phenotypic traits for 2,297 images from 118 individuals, we observe (1) wide variation in phenotypes among the genotypes surveyed; and (2) greater intergenotype variance of RSA features than variance within a genotype. RSA trait values are integrated into a computational pipeline that utilizes supervised learning methods to determine which traits best separate two genotypes, and then ranks the traits according to their contribution to each pairwise comparison. This trait-ranking step identifies candidate traits for subsequent quantitative trait loci analysis and demonstrates that depth and average radius are key contributors to differences in rice RSA within our set of genotypes. Our results suggest a strong genetic component underlying rice RSA. This work enables the automatic phenotyping of RSA of individuals within mapping populations, providing an integrative framework for quantitative trait loci analysis of RSA. PMID:20107024

  1. Nonverbal Social Withdrawal in Depression: Evidence from manual and automatic analysis

    PubMed Central

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, S. Mohammad; Hammal, Zakia; Rosenwald, Dean P.

    2014-01-01

    The relationship between nonverbal behavior and severity of depression was investigated by following depressed participants over the course of treatment and video recording a series of clinical interviews. Facial expressions and head pose were analyzed from video using manual and automatic systems. Both systems were highly consistent for FACS action units (AUs) and showed similar effects for change over time in depression severity. When symptom severity was high, participants made fewer affiliative facial expressions (AUs 12 and 15) and more non-affiliative facial expressions (AU 14). Participants also exhibited diminished head motion (i.e., amplitude and velocity) when symptom severity was high. These results are consistent with the Social Withdrawal hypothesis: that depressed individuals use nonverbal behavior to maintain or increase interpersonal distance. As individuals recover, they send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and revealed the same pattern of findings suggests that automatic facial expression analysis may be ready to relieve the burden of manual coding in behavioral and clinical science. PMID:25378765

  2. Dynamic Response and Stability Analysis of AN Automatic Ball Balancer for a Flexible Rotor

    NASA Astrophysics Data System (ADS)

    Chung, J.; Jang, I.

    2003-01-01

    Dynamic stability and time responses are studied for an automatic ball balancer of a rotor with a flexible shaft. The Stodola-Green rotor model, of which the shaft is flexible, is selected for analysis. This rotor model is able to include the influence of rigid-body rotations due to the shaft flexibility on dynamic responses. Applying Lagrange's equation to the rotor with the ball balancer, the non-linear equations of motion are derived. Based on the linearized equations, the stability of the ball balancer around the balanced equilibrium position is analyzed. On the other hand, the time responses computed from the non-linear equations are investigated. This study shows that the automatic ball balancer can achieve the balancing of a rotor with a flexible shaft if the system parameters of the balancer satisfy the stability conditions for the balanced equilibrium position.

  3. Analysis of Social Variables when an Initial Functional Analysis Indicates Automatic Reinforcement as the Maintaining Variable for Self-Injurious Behavior

    ERIC Educational Resources Information Center

    Kuhn, Stephanie A. Contrucci; Triggs, Mandy

    2009-01-01

    Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…

  4. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    NASA Astrophysics Data System (ADS)

    Fischer, Bernd; Hajian, Arsen; Knuth, Kevin; Schumann, Johann

    2004-04-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable building blocks for the algorithm construction; they can encapsulate advanced algorithms and data structures. A symbolic-algebraic system is used to find closed-form solutions for problems and emerging subproblems. In this paper, we describe the application of AUTOBAYES to the analysis of planetary nebulae images taken by the Hubble Space Telescope. We explain the system architecture, and present in detail the automatic derivation of the scientists' original analysis as well as a refined analysis using clustering models. This study demonstrates that AUTOBAYES is now mature enough so that it can be applied to realistic scientific data analysis tasks.

  5. Automaticity in acute ischemia: Bifurcation analysis of a human ventricular model

    NASA Astrophysics Data System (ADS)

    Bouchard, Sylvain; Jacquemet, Vincent; Vinet, Alain

    2011-01-01

    Acute ischemia (restriction in blood supply to part of the heart as a result of myocardial infarction) induces major changes in the electrophysiological properties of the ventricular tissue. Extracellular potassium concentration ([Ko+]) increases in the ischemic zone, leading to an elevation of the resting membrane potential that creates an “injury current” (IS) between the infarcted and the healthy zone. In addition, the lack of oxygen impairs the metabolic activity of the myocytes and decreases ATP production, thereby affecting ATP-sensitive potassium channels (IKatp). Frequent complications of myocardial infarction are tachycardia, fibrillation, and sudden cardiac death, but the mechanisms underlying their initiation are still debated. One hypothesis is that these arrhythmias may be triggered by abnormal automaticity. We investigated the effect of ischemia on myocyte automaticity by performing a comprehensive bifurcation analysis (fixed points, cycles, and their stability) of a human ventricular myocyte model [K. H. W. J. ten Tusscher and A. V. Panfilov, Am. J. Physiol. Heart Circ. Physiol.AJPHAP0363-613510.1152/ajpheart.00109.2006 291, H1088 (2006)] as a function of three ischemia-relevant parameters [Ko+], IS, and IKatp. In this single-cell model, we found that automatic activity was possible only in the presence of an injury current. Changes in [Ko+] and IKatp significantly altered the bifurcation structure of IS, including the occurrence of early-after depolarization. The results provide a sound basis for studying higher-dimensional tissue structures representing an ischemic heart.
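
    The bifurcation analysis above tracks fixed points and their stability as parameters change. In one dimension the core computation is simple: locate the zeros of dV/dt = f(V) and classify each by the sign of the local slope. The cubic used in the test is a hypothetical excitable-membrane caricature, not the ten Tusscher-Panfilov model.

```python
def fixed_points(f, lo, hi, n=2000, tol=1e-10):
    """Scan [lo, hi] for sign changes of f, bisect each bracket to a
    root, and classify stability from the numerical slope f'(root):
    negative slope means a stable fixed point of dV/dt = f(V)."""
    roots = []
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    for a, b in zip(xs, xs[1:]):
        fa, fb = f(a), f(b)
        if fa == 0.0:
            roots.append(a)
            continue
        if fa * fb < 0:
            while b - a > tol:
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
    h = 1e-6
    return [(r, "stable" if (f(r + h) - f(r - h)) / (2 * h) < 0 else "unstable")
            for r in roots]
```

    For f(V) = -V(V-0.3)(V-1) this finds the classic stable-unstable-stable triple of an excitable cell.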

  6. NGS-Trex: an automatic analysis workflow for RNA-Seq data.

    PubMed

    Boria, Ilenia; Boatti, Lara; Saggese, Igor; Mignone, Flavio

    2015-01-01

    RNA-Seq technology allows the rapid analysis of whole transcriptomes by taking advantage of next-generation sequencing platforms. Moreover, with the constant decrease in the cost of NGS analysis, RNA-Seq is becoming very popular and widespread. Unfortunately, data analysis is quite demanding in terms of the bioinformatic skills and infrastructure required, thus limiting the potential users of this method. Here we describe the complete analysis of sample data from raw sequences to data mining of results by using the NGS-Trex platform, a low-user-interaction, fully automatic analysis workflow. Used through a web interface, NGS-Trex processes data and profiles the transcriptome of the samples, identifying expressed genes, transcripts, and new and known splice variants. It also detects differentially expressed genes and transcripts across different experiments. PMID:25577383

  7. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis

    PubMed Central

    Haderlein, Tino; Schwemmle, Cornelia; Döllinger, Michael; Matoušek, Václav; Ptok, Martin; Nöth, Elmar

    2015-01-01

    Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation of the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men; 48.7 ± 17.8 years) containing the German version of the text “The North Wind and the Sun” were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) of the Laryngograph and 33 features from a prosodic analysis module were used to model the listeners' ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx (r = 0.71, ρ = 0.57). These correlations were approximately the same as the interrater agreement among human raters (r = 0.65, ρ = 0.61). CQx was one of the substantial features of the hoarseness model. For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for a meaningful objective support for perceptual analysis. PMID:26136813

  8. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis.

    PubMed

    Haderlein, Tino; Schwemmle, Cornelia; Döllinger, Michael; Matoušek, Václav; Ptok, Martin; Nöth, Elmar

    2015-01-01

    Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation of the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men; 48.7 ± 17.8 years) containing the German version of the text "The North Wind and the Sun" were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) of the Laryngograph and 33 features from a prosodic analysis module were used to model the listeners' ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx (r = 0.71, ρ = 0.57). These correlations were approximately the same as the interrater agreement among human raters (r = 0.65, ρ = 0.61). CQx was one of the substantial features of the hoarseness model. For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for a meaningful objective support for perceptual analysis. PMID:26136813

  9. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    SciTech Connect

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.; Jin, Shuangshuang; Lin, Jenglung; Hauer, Matthew L.

    2010-02-28

    Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner, so that the mode estimation results can be obtained reliably and promptly. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
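
    The core of Prony-style ringdown analysis is a linear-prediction fit: a damped sinusoid y[n] satisfies a second-order recurrence whose characteristic roots encode the mode's damping and frequency. The sketch below handles the single-mode case in batch form; it is a minimal illustration, not PNNL's recursive implementation.

```python
import cmath
import math

def prony_mode(y, dt):
    """Fit y[n] = a1*y[n-1] + a2*y[n-2] by least squares (assuming y is
    one damped sinusoid sampled every dt seconds), then convert a root
    z of z^2 - a1*z - a2 = 0 to the continuous-time mode lambda =
    ln(z)/dt. Returns (damping [1/s], frequency [Hz])."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(y)):
        x1, x2, t = y[n - 1], y[n - 2], y[n]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * t;  b2 += x2 * t
    det = s11 * s22 - s12 * s12          # 2x2 normal equations
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    z = (a1 + cmath.sqrt(a1 * a1 + 4 * a2)) / 2
    lam = cmath.log(z) / dt
    return lam.real, abs(lam.imag) / (2 * math.pi)
```

    On a noiseless 0.8 Hz ringdown decaying at 0.2 s^-1, the fit recovers both parameters essentially exactly.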

  10. Automatic quantitative analysis of ultrasound tongue contours via wavelet-based functional mixed models.

    PubMed

    Lancia, Leonardo; Rausch, Philip; Morris, Jeffrey S

    2015-02-01

    This paper illustrates the application of wavelet-based functional mixed models to automatic quantification of differences between tongue contours obtained through ultrasound imaging. The reliability of this method is demonstrated through the analysis of tongue positions recorded from a female and a male speaker at the onset of the vowels /a/ and /i/ produced in the context of the consonants /t/ and /k/. The proposed method allows detection of significant differences between configurations of the articulators that are visible in ultrasound images during the production of different speech gestures and is compatible with statistical designs containing both fixed and random terms. PMID:25698047
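
    The method above represents each tongue contour in a wavelet basis before fitting the functional mixed model. The simplest such basis is Haar; the one-level transform below is only an illustrative stand-in for whichever wavelet family the authors used.

```python
def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform of an
    even-length sequence: returns (approximation, detail) coefficients."""
    r = 2 ** 0.5
    approx = [(a + b) / r for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / r for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: perfect reconstruction of the input."""
    r = 2 ** 0.5
    x = []
    for a, d in zip(approx, detail):
        x += [(a + d) / r, (a - d) / r]
    return x
```

    A constant signal has all-zero detail coefficients, which is why smooth functional effects compress well in this basis.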

  11. Automatic generation of stop word lists for information retrieval and analysis

    DOEpatents

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
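
    The patent abstract above is concrete enough to sketch: count, for each term, how often it occurs adjacent to a keyword versus as a keyword itself, keep terms whose adjacency-to-frequency ratio meets a threshold, and truncate. The sketch below makes simplifying assumptions (single-token keywords, pre-tokenized documents) and is not the patented implementation.

```python
from collections import Counter

def generate_stop_words(docs, keywords, min_ratio=1.0, max_size=50):
    """docs: list of token lists; keywords: set of keyword tokens.
    A term becomes a stop-word candidate when it appears next to
    keywords at least `min_ratio` times as often as it appears as a
    keyword itself; candidates are ranked by adjacency count."""
    adjacency = Counter()   # occurrences next to a keyword (non-keywords only)
    frequency = Counter()   # occurrences as a keyword
    for tokens in docs:
        for i, tok in enumerate(tokens):
            frequency[tok] += tok in keywords
            if ((i > 0 and tokens[i - 1] in keywords)
                    or (i + 1 < len(tokens) and tokens[i + 1] in keywords)):
                adjacency[tok] += tok not in keywords
    candidates = [t for t in adjacency
                  if adjacency[t] / max(frequency[t], 1) >= min_ratio]
    candidates.sort(key=lambda t: adjacency[t], reverse=True)
    return candidates[:max_size]
```

    Function words that cluster around keywords ("the", "of") surface at the top of the list, while the keywords themselves are excluded.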

  12. Automatic Analysis of Single-Channel Sleep EEG: Validation in Healthy Individuals

    PubMed Central

    Berthomier, Christian; Drouot, Xavier; Herman-Stoïca, Maria; Berthomier, Pierre; Prado, Jacques; Bokar-Thire, Djibril; Benoit, Odile; Mattout, Jérémie; d'Ortho, Marie-Pia

    2007-01-01

    Study Objective: To assess the performance of automatic sleep scoring software (ASEEGA) based on a single EEG channel comparatively with manual scoring (2 experts) of conventional full polysomnograms. Design: Polysomnograms from 15 healthy individuals were scored by 2 independent experts using conventional R&K rules. The results were compared to those of ASEEGA scoring on an epoch-by-epoch basis. Setting: Sleep laboratory in the physiology department of a teaching hospital. Participants: Fifteen healthy volunteers. Measurements and Results: The epoch-by-epoch comparison was based on classifying into 2 states (wake/sleep), 3 states (wake/REM/NREM), 4 states (wake/REM/stages 1-2/SWS), or 5 states (wake/REM/stage 1/stage 2/SWS). The obtained overall agreements, as quantified by the kappa coefficient, were 0.82, 0.81, 0.75, and 0.72, respectively. Furthermore, obtained agreements between ASEEGA and the expert consensual scoring were 96.0%, 92.1%, 84.9%, and 82.9%, respectively. Finally, when classifying into 5 states, the sensitivity and positive predictive value of ASEEGA regarding wakefulness were 82.5% and 89.7%, respectively. Similarly, sensitivity and positive predictive value regarding REM state were 83.0% and 89.1%. Conclusions: Our results establish the face validity and convergent validity of ASEEGA for single-channel sleep analysis in healthy individuals. ASEEGA appears as a good candidate for diagnostic aid and automatic ambulant scoring. Citation: Berthomier C; Drouot X; Herman-Stoïca M; Berthomier P; Prado J; Bokar-Thire D; Benoit O; Mattout J; d'Ortho MP. Automatic analysis of single-channel sleep EEG: validation in healthy individuals. SLEEP 2007;30(11):1587-1595. PMID:18041491
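
    The overall agreements above are quantified by the kappa coefficient, which corrects raw epoch-by-epoch agreement for chance. A minimal Cohen's kappa for two scorers' label sequences:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two label sequences:
    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected comes from the two raters' marginal label frequencies."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (po - pe) / (1 - pe)
```

    Perfect agreement gives kappa = 1; the values 0.72 to 0.82 reported above indicate substantial agreement on the usual benchmarks.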

  13. Urban land use of the Sao Paulo metropolitan area by automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Niero, M.; Foresti, C.

    1983-01-01

    The separability of urban land use classes in the metropolitan area of Sao Paulo was studied by means of automatic analysis of MSS/LANDSAT digital data. The data were analyzed using the media K and MAXVER classification algorithms. The land use classes obtained were: CBD/vertical growth area, residential area, mixed area, industrial area, embankment area type 1, embankment area type 2, dense vegetation area and sparse vegetation area. The spectral analysis of representative samples of urban land use classes was done using the "Single Cell" analysis option. The classes CBD/vertical growth area, residential area and embankment area type 2 showed better spectral separability when compared to the other classes.

  14. Automatic quantitative evaluation of autoradiographic band films by computerized image analysis

    SciTech Connect

    Masseroli, M.; Messori, A.; Bendotti, C.; Ponti, M.; Forloni, G. )

    1993-01-01

    The present paper describes a new image processing method for automatic quantitative analysis of autoradiographic band films. It was developed in a specific image analysis environment (IBAS 2.0), but the algorithms and methods can be utilized elsewhere. The program is easy to use and presents some particularly useful features for the evaluation of autoradiographic band films, such as the choice of whole-film or single-lane background determination, the possibility of evaluating bands with film scratch artifacts, and quantification in absolute terms or relative to reference values. The method was tested by comparison with laser-scanner densitometric quantifications of the same autoradiograms. The results show the full compatibility of the two methods and demonstrate the reliability and sensitivity of image analysis. The method can be used not only to evaluate autoradiographic band films, but also to analyze any type of signal bands on other materials (e.g., electrophoresis gels, chromatographic paper, etc.).

  15. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A proposed signal processing technique for incipient real time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. The traditional spectral analysis is not appropriate for non-stationary vibration signal and for real time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective bearing fault automatic detection method and gives a good basis for an integrated induction machine condition monitor. PMID:23145702
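
    The kurtogram selects the frequency band in which the vibration signal is most impulsive, i.e. has the highest spectral kurtosis. The sketch below captures that core idea with a naive O(N^2) DFT band-pass and an excess-kurtosis measure; the real kurtogram uses an efficient filter-bank decomposition over many band/bandwidth combinations.

```python
import cmath
import math

def kurtosis(x):
    """Excess kurtosis: 0 for Gaussian data, about -1.5 for a sine,
    strongly positive for sparse impulsive content."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / (m2 * m2) - 3.0

def bandpass(x, lo, hi):
    """Naive DFT band-pass: keep only bins whose folded index min(k, N-k)
    lies in [lo, hi], then inverse-transform. O(N^2), for illustration."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if not (lo <= min(k, n - k) <= hi):
            X[k] = 0
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def best_band(x, bands):
    """Pick the band whose filtered signal is most impulsive."""
    return max(bands, key=lambda b: kurtosis(bandpass(x, *b)))
```

    A low-frequency sine plus rare impacts is correctly flagged in the high band, where only the impulsive bearing-like content survives.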

  16. AnaSP: a software suite for automatic image analysis of multicellular spheroids.

    PubMed

    Piccinini, Filippo

    2015-04-01

    Today, more and more biological laboratories use 3D cell cultures and tissues grown in vitro as 3D models of in vivo tumours and metastases. In recent decades it has been well established that multicellular spheroids are an efficient model for validating the effects of drugs and treatments in human care applications. However, a lack of methods for quantitative analysis limits the use of spheroids as models for routine experiments. Several methods have been proposed in the literature to perform high-throughput spheroid experiments by automatically computing different morphological parameters, such as diameter, volume and sphericity. Nevertheless, these systems are typically built on expensive automated technologies, which makes them affordable only for the limited subset of laboratories that frequently perform high-content screening analysis. In this work we propose AnaSP, an open-source software suitable for automatically estimating several morphological parameters of spheroids by simply analyzing brightfield images acquired with a standard widefield microscope, even one without a motorized stage. The experiments performed demonstrated the sensitivity and precision of the proposed segmentation method and the excellent reliability of AnaSP in computing several morphological parameters of spheroids imaged under different conditions. AnaSP is distributed as an open-source software tool. Its modular architecture and graphical user interface make it attractive also for researchers who do not work in computer vision, and suitable both for high-content screenings and for occasional spheroid-based experiments. PMID:25737369
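    The morphological parameters the abstract names can be estimated from a binary segmentation mask alone. A minimal numpy sketch (not AnaSP's actual code; the rotational-symmetry volume estimate and the crude pixel-edge perimeter are simplifying assumptions):

```python
import numpy as np

def spheroid_morphology(mask, pixel_size=1.0):
    """Morphological descriptors of a spheroid from a binary mask (not touching the border).

    Volume assumes the spheroid is roughly rotationally symmetric, so it is
    derived from the equivalent circular radius of the projected area.  The
    perimeter is a 4-neighbour pixel-edge count, which overestimates the
    length of curved boundaries, so circularity of a digitised circle is < 1.
    """
    m = mask.astype(int)
    area = m.sum() * pixel_size ** 2
    r_eq = np.sqrt(area / np.pi)                      # equivalent circular radius
    edges = np.abs(np.diff(m, axis=0)).sum() + np.abs(np.diff(m, axis=1)).sum()
    perimeter = edges * pixel_size
    return {
        "diameter": 2.0 * r_eq,
        "volume": 4.0 / 3.0 * np.pi * r_eq ** 3,
        "circularity": 4.0 * np.pi * area / perimeter ** 2,
    }
```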

  17. Intercellular fluorescence background on microscope slides: some problems and solutions for automatic analysis

    NASA Astrophysics Data System (ADS)

    Piper, Jim; Sudar, Damir; Peters, Don; Pinkel, Daniel

    1994-05-01

    Although high contrast between signal and the dark background is often claimed as a major advantage of fluorescence staining in cytology and cytogenetics, in practice this is not always the case: in some circumstances the inter-cellular or, in the case of metaphase preparations, the inter-chromosome background can be brightly fluorescent and can vary substantially across the slide or even across a single metaphase. Bright background results in low image contrast, making automatic detection of metaphase cells more difficult. The background correction strategy employed in automatic search must both cope with variable background and be computationally efficient. The method employed in a fluorescence metaphase finder is presented, and the compromises involved are discussed. A different set of problems arises when the analysis is aimed at accurate quantification of the fluorescence signal. Some insight into the nature of the background in the case of comparative genomic hybridization is obtained by image analysis of data from experiments using cell lines with known abnormal copy numbers of particular chromosome types.

  18. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    NASA Astrophysics Data System (ADS)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, that enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected with ReoMonitor, an ambulatory impedance cardiography (AICG) monitoring device. The data comprise one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), both sampled at 200 Hz, and the base impedance signal (Z0), sampled every 8 s. The program consists of two parts: a bioscope for displaying the traces (ECG, AICG, Z0) and an analytical part for detecting characteristic points on the signals and automatically calculating hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option of manual correction, which may be necessary to avoid false-positive recognitions. The application is used to determine the values of the basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.
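    Once the characteristic points have been detected (and manually corrected where needed), the listed variables follow from simple formulas. A sketch assuming the widely used Kubicek stroke-volume formula with an assumed constant blood resistivity; the abstract does not state which formula the program actually uses:

```python
RHO_BLOOD = 135.0   # blood resistivity, ohm*cm (assumed constant)

def hemodynamic_parameters(r_time, b_time, x_time, dzdt_max, z0, L, rr_interval):
    """Beat-wise hemodynamic variables from already-detected characteristic points.

    r_time: ECG R-wave time (used here in place of Q onset); b_time, x_time:
    ICG B and X points (s); dzdt_max: peak of dz/dt (ohm/s); z0: base
    impedance (ohm); L: electrode distance (cm); rr_interval: RR interval (s).
    Stroke volume via the Kubicek formula.
    """
    pep = b_time - r_time                                 # pre-ejection period, s
    lvet = x_time - b_time                                # LV ejection time, s
    sv = RHO_BLOOD * (L / z0) ** 2 * lvet * dzdt_max      # stroke volume, ml
    hr = 60.0 / rr_interval                               # heart rate, bpm
    co = sv * hr / 1000.0                                 # cardiac output, l/min
    return {"PEP": pep, "LVET": lvet, "SV": sv, "HR": hr, "CO": co}
```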

  19. [Design and analysis of automatic measurement instrument for diffraction efficiency of plane reflection grating].

    PubMed

    Wang, Fang; Qi, Xiang-Dong; Yu, Hong-Zhu; Yu, Hai-Li

    2009-02-01

    A new system that automatically measures the diffraction efficiency of plane reflection gratings was designed. A continuous light source is used for illumination, a duplex grating spectrograph structure is applied, and a linear NMOS array serves as the receiving component. The testing system was analyzed theoretically using the principles of the grating spectrograph, and the image quality of the optical system was analyzed using the aberration theory of geometrical optics. The analysis indicated that the device structure is compact and the electronics are simplified. The system avoids the wavelength-scan synchronization problem of the two grating spectrographs, and its wavelength repeatability is very good, so precision is easy to guarantee. Compared with earlier automated schemes, the production cost is reduced, operation is easier, and working efficiency is enhanced. The study showed that this automatic measurement system covers a spectral range of 190-1100 nm with a resolution better than 3 nm, which fully satisfies the design requirements. It is an economical and feasible design. PMID:19445251

  20. Acoustic Analysis of Inhaler Sounds From Community-Dwelling Asthmatic Patients for Automatic Assessment of Adherence

    PubMed Central

    D'arcy, Shona; Costello, Richard W.

    2014-01-01

    Inhalers are devices which deliver medication to the airways in the treatment of chronic respiratory diseases. When used correctly, inhalers relieve and improve patients' symptoms. However, adherence to inhaler medication has been demonstrated to be poor, leading to reduced clinical outcomes, wasted medication, and higher healthcare costs. There is a clinical need for a system that can accurately monitor inhaler adherence, as currently no method exists to evaluate how patients use their inhalers between clinic visits. This paper presents a method of automatically evaluating inhaler adherence through acoustic analysis of inhaler sounds. An acoustic monitoring device was employed to record the sounds patients produce while using a Diskus dry powder inhaler, in addition to the time and date patients use the inhaler. An algorithm was designed and developed to automatically detect inhaler events from the audio signals and provide feedback regarding patient adherence. The algorithm was evaluated on 407 audio files obtained from 12 community-dwelling asthmatic patients. Results of the automatic classification were compared against two expert human raters. For patients for whom the raters' Cohen's kappa agreement score was >0.81, the algorithm's accuracy was 83% in determining the correct inhaler technique score compared with the raters. This paper has several clinical implications as it demonstrates the feasibility of using acoustics to objectively monitor patient inhaler adherence and provide real-time personalized medical care for a chronic respiratory illness. PMID:27170883
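    The inter-rater agreement threshold used above (Cohen's kappa > 0.81) can be computed directly from the two raters' label sequences; a small numpy sketch:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa agreement between two raters' label sequences."""
    a, b = np.asarray(a), np.asarray(b)
    labels = np.union1d(a, b)
    po = np.mean(a == b)                                          # observed agreement
    pe = sum((a == l).mean() * (b == l).mean() for l in labels)   # chance agreement
    return (po - pe) / (1.0 - pe)
```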

  1. Analysis and Exploitation of Automatically Generated Scene Structure from Aerial Imagery

    NASA Astrophysics Data System (ADS)

    Nilosek, David R.

    The recent advancements in the field of computer vision, along with ever-increasing computational power, have opened up opportunities in the field of automated photogrammetry. Many researchers have focused on using these powerful computer vision algorithms to extract three-dimensional point clouds of scenes from multi-view imagery, with the ultimate goal of creating a photo-realistic scene model. However, geographically accurate three-dimensional scene models have the potential to be exploited for much more than just visualization. This work looks at utilizing automatically generated scene structure from near-nadir aerial imagery to identify and classify objects within the structure, through the analysis of spatial-spectral information. The restriction to this type of imagery reflects its common availability. Popular third-party computer-vision algorithms are used to generate the scene structure. A voxel-based approach for surface estimation is developed using Manhattan-world assumptions. A surface estimation confidence metric is also presented. This approach provides the basis for further analysis of surface materials, incorporating spectral information. Two cases of spectral analysis are examined: when additional hyperspectral imagery of the reconstructed scene is available, and when only R,G,B spectral information can be obtained. A method for registering the surface estimation to hyperspectral imagery, through orthorectification, is developed. Atmospherically corrected hyperspectral imagery is used to assign reflectance values to estimated surface facets for physical simulation with DIRSIG. A spatial-spectral region-growing segmentation algorithm is developed for the R,G,B-limited case, in order to identify possible materials for user attribution. Finally, an analysis of the geographic accuracy of automatically generated three-dimensional structure is performed.
An end-to-end, semi-automated, workflow

  2. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring

  3. Automatic computer-aided detection of prostate cancer based on multiparametric magnetic resonance image analysis

    NASA Astrophysics Data System (ADS)

    Vos, P. C.; Barentsz, J. O.; Karssemeijer, N.; Huisman, H. J.

    2012-03-01

    In this paper, a fully automatic computer-aided detection (CAD) method is proposed for the detection of prostate cancer. The CAD method consists of multiple sequential steps in order to detect locations that are suspicious for prostate cancer. In the initial stage, a voxel classification is performed using a Hessian-based blob detection algorithm at multiple scales on an apparent diffusion coefficient map. Next, a parametric multi-object segmentation method is applied and the resulting segmentation is used as a mask to restrict the candidate detection to the prostate. The remaining candidates are characterized by performing histogram analysis on multiparametric MR images. The resulting feature set is summarized into a malignancy likelihood by a supervised classifier in a two-stage classification approach. The detection performance for prostate cancer was tested on a screening population of 200 consecutive patients and evaluated using the free response operating characteristic methodology. The results show that the CAD method obtained sensitivities of 0.41, 0.65 and 0.74 at false positive (FP) levels of 1, 3 and 5 per patient, respectively. In conclusion, this study showed that it is feasible to automatically detect prostate cancer at a FP rate lower than systematic biopsy. The CAD method may assist the radiologist to detect prostate cancer locations and could potentially guide biopsy towards the most aggressive part of the tumour.
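    The first stage is Hessian-based blob detection at multiple scales. A closely related scale-normalised Laplacian response (the Laplacian is the trace of the Hessian) can be sketched in 2-D with plain numpy; dark blobs, such as low-ADC lesions, give positive responses. This is a simplified stand-in for illustration, not the authors' CAD pipeline:

```python
import numpy as np

def gaussian_smooth(img, sigma):
    """Separable Gaussian smoothing with edge padding (pure numpy)."""
    r = max(1, int(3 * sigma))
    t = np.arange(-r, r + 1)
    k = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()

    def smooth_line(line):
        return np.convolve(np.pad(line, r, mode="edge"), k, mode="valid")

    out = np.apply_along_axis(smooth_line, 0, img)
    return np.apply_along_axis(smooth_line, 1, out)

def multiscale_blob_response(img, sigmas):
    """Max over scales of the scale-normalised Laplacian; maxima mark dark blobs."""
    best = np.full(img.shape, -np.inf)
    for s in sigmas:
        g = gaussian_smooth(img, s)
        lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
               np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4.0 * g)   # discrete Laplacian
        best = np.maximum(best, s ** 2 * lap)   # sigma^2 makes scales comparable
    return best
```

    On a bright image with one dark Gaussian spot, the response peaks at the spot centre, which is the voxel-level cue that seeds the candidate detection.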

  4. Hardware and software system for automatic microemulsion assay evaluation by analysis of optical properties

    NASA Astrophysics Data System (ADS)

    Maeder, Ulf; Schmidts, Thomas; Burg, Jan-Michael; Heverhagen, Johannes T.; Runkel, Frank; Fiebich, Martin

    2010-03-01

    A new hardware device called the Microemulsion Analyzer (MEA), which facilitates the preparation and evaluation of microemulsions, was developed. Microemulsions, consisting of three phases (oil, surfactant and water) and prepared on deep-well plates according to the PDMPD method, can be evaluated automatically by means of their optical properties. The ratio of ingredients needed to form a microemulsion depends strongly on the properties and amounts of the ingredients used, and a microemulsion assay is set up on deep-well plates to determine these ratios. The optical properties of the mixture change from turbid to transparent as soon as a microemulsion is formed. The MEA consists of a frame and an image-processing and analysis algorithm. The frame itself is made of aluminum and carries an electroluminescent foil (ELF) and a camera. The frame keeps the well plate at the correct position and angle, while the ELF provides constant illumination of the plate from below. The camera provides an image that the algorithm processes to automatically evaluate the turbidity in the wells. Using the determined parameters, a phase diagram is created that visualizes the information. This setup can be used to analyze microemulsion assays and obtain results in a standardized way. In addition, stability tests of the assay can be performed by creating differential stability diagrams after a period of time.
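    The turbidity evaluation reduces to measuring the transmitted backlight per well. A sketch under the simplifying assumption that the wells lie on an even grid in the image (the real device locates them via the mounting frame); the function name and threshold are illustrative:

```python
import numpy as np

def classify_wells(img, rows, cols, threshold=0.6):
    """Mean backlight transmission per well on an evenly spaced plate grid.

    img: grayscale plate image in [0, 1], illuminated from below, wells on a
    regular rows x cols grid.  A bright (transparent) well is taken to mean a
    microemulsion has formed; a dark (turbid) well means it has not.
    """
    h, w = img.shape
    cell_h, cell_w = h // rows, w // cols
    result = np.zeros((rows, cols), dtype=bool)
    for i in range(rows):
        for j in range(cols):
            cell = img[i * cell_h:(i + 1) * cell_h, j * cell_w:(j + 1) * cell_w]
            result[i, j] = cell.mean() > threshold     # True -> transparent
    return result
```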

  5. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  6. Semi-automatic system for UV images analysis of historical musical instruments

    NASA Astrophysics Data System (ADS)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) users select a small rectangular area of the image; (ii) program automatically highlights all the areas that have the same color of the selected input. The identification is made by the analysis of the image in HSV color model, the most similar to the human perception. The achievable result is more accurate than a manual selection, because it can detect also points that users do not recognize as similar due to perception illusion. The application has been developed following the rules of usability, and Human Computer Interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violins collection stored by "Museo del Violino" in Cremona.
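    The two-step selection can be sketched with numpy and the standard-library colorsys module: average the HSV colour of the user's seed rectangle, then flag every pixel within a tolerance of it, comparing hue on the circle so reds near 0/1 still match. The tolerances are illustrative assumptions, not the tool's values:

```python
import colorsys
import numpy as np

def similar_color_mask(rgb, seed_box, h_tol=0.05, s_tol=0.15, v_tol=0.15):
    """Mask of pixels whose HSV colour matches the mean colour of a seed box.

    rgb: float image in [0, 1], shape (H, W, 3); seed_box: (r0, r1, c0, c1)
    selecting the user's rectangle in image coordinates.
    """
    flat = rgb.reshape(-1, 3)
    hsv = np.array([colorsys.rgb_to_hsv(*p) for p in flat]).reshape(rgb.shape)
    r0, r1, c0, c1 = seed_box
    ref = hsv[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
    dh = np.abs(hsv[..., 0] - ref[0])
    dh = np.minimum(dh, 1.0 - dh)                      # circular hue distance
    return ((dh <= h_tol) &
            (np.abs(hsv[..., 1] - ref[1]) <= s_tol) &
            (np.abs(hsv[..., 2] - ref[2]) <= v_tol))
```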

  7. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    PubMed

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  8. Automatic fuzzy object-based analysis of VHSR images for urban objects extraction

    NASA Astrophysics Data System (ADS)

    Sebari, Imane; He, Dong-Chen

    2013-05-01

    We present an automatic approach for object extraction from very high spatial resolution (VHSR) satellite images based on Object-Based Image Analysis (OBIA). The proposed solution requires no input data other than the studied image, and no input parameters are required. First, an automatic non-parametric cooperative segmentation technique is applied to create object primitives. A fuzzy rule base is then developed from the human knowledge used for image interpretation. The rules integrate spectral, textural, geometric and contextual object properties. The classes of interest are tree, lawn, bare soil and water for natural classes, and building, road and parking lot for man-made classes. Fuzzy logic is integrated in our approach in order to manage the complexity of the studied subject, to reason with imprecise knowledge, and to provide information on the precision and certainty of the extracted objects. The proposed approach was applied to extracts of Ikonos images of the city of Sherbrooke (Canada). An overall extraction accuracy of 80% was observed. The correctness rates obtained for the building, road and parking lot classes are 81%, 75% and 60%, respectively.

  9. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-01

    Chromatographic background drift correction, which influences peak detection and time-shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers in this vector, which belong to the chromatographic peaks, and to update the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight data from a metabolic study of Escherichia coli. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, the moving-window minimum value strategy, and background drift correction by orthogonal subspace projection. The proposed method allows almost fully automatic implementation of background drift correction, which is convenient for practical use. PMID:27139215
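    The three stages described (local-minima baseline vector, iterative outlier rejection, linear interpolation back onto the chromatogram) can be sketched directly in numpy. The 2-sigma rejection rule below is an illustrative stand-in for the paper's convergence criterion:

```python
import numpy as np

def correct_baseline(y, max_iter=50):
    """Estimate and remove background drift from a chromatogram y.

    1) collect local minima as baseline candidates,
    2) iteratively reject candidates sitting on peaks (here: above
       mean + 2 sd of the candidate values, an assumed criterion),
    3) linearly interpolate the kept candidates onto the full signal.
    Returns (corrected signal, estimated drift).
    """
    y = np.asarray(y, dtype=float)
    idx = np.where((y[1:-1] <= y[:-2]) & (y[1:-1] <= y[2:]))[0] + 1   # local minima
    idx = np.concatenate(([0], idx, [len(y) - 1]))
    base = y[idx]
    for _ in range(max_iter):
        keep = base <= base.mean() + 2.0 * base.std()
        if keep.all():
            break
        idx, base = idx[keep], base[keep]
    drift = np.interp(np.arange(len(y)), idx, base)
    return y - drift, drift
```

    On a synthetic chromatogram (linear drift, a small ripple, one large Gaussian peak) the recovered peak height and the flattened baseline are close to the ground truth.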

  10. Automatic brain tumour detection and neovasculature assessment with multiseries MRI analysis.

    PubMed

    Szwarc, Pawel; Kawa, Jacek; Rudzki, Marcin; Pietka, Ewa

    2015-12-01

    In this paper a novel multi-stage automatic method for brain tumour detection and neovasculature assessment is presented. First, brain symmetry is exploited to register the magnetic resonance (MR) series analysed. Then, the intracranial structures are found, and the region of interest (ROI) is constrained within them to the tumour and peritumoural areas using the Fluid-Attenuated Inversion Recovery (FLAIR) series. Next, contrast-enhanced lesions are detected on the basis of T1-weighted (T1W) differential images taken before and after contrast medium administration. Finally, their vascularisation is assessed on the basis of Regional Cerebral Blood Volume (RCBV) perfusion maps. The relative RCBV (rRCBV) map is calculated in relation to healthy white matter, also found automatically, and visualised on the analysed series. Three main types of brain tumours, i.e. high-grade gliomas, metastases and meningiomas, were subjected to the analysis. The results of contrast-enhanced lesion detection were compared with manual delineations performed independently by two experts, yielding 64.84% sensitivity, 99.89% specificity and a 71.83% Dice Similarity Coefficient (DSC) for the twenty analysed studies of subjects diagnosed with brain tumours. PMID:26183648

  11. Two-Stage Automatic Calibration and Predictive Uncertainty Analysis of a Semi-distributed Watershed Model

    NASA Astrophysics Data System (ADS)

    Lin, Z.; Radcliffe, D. E.; Doherty, J.

    2004-12-01

    monthly flow produced a very good fit to the measured data. Nash and Sutcliffe coefficients for daily and monthly flow over the calibration period were 0.60 and 0.86, respectively; over the validation period they were 0.61 and 0.87. Regardless of the level of model-to-measurement fit, the nonuniqueness of the optimal parameter values makes uncertainty analysis necessary for model prediction. The nonlinear prediction uncertainty analysis showed that caution must be exercised when using the SWAT model to predict instantaneous peak flows. The PEST (Parameter Estimation) free software was used to conduct the two-stage automatic calibration and prediction uncertainty analysis of the SWAT model.
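    The Nash and Sutcliffe coefficient quoted above is one minus the ratio of residual variance to the variance of the observations (1 = perfect fit, 0 = no better than predicting the observed mean); a short sketch:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency coefficient."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```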

  12. Group-wise automatic mesh-based analysis of cortical thickness

    NASA Astrophysics Data System (ADS)

    Vachet, Clement; Cody Hazlett, Heather; Niethammer, Marc; Oguz, Ipek; Cates, Joshua; Whitaker, Ross; Piven, Joseph; Styner, Martin

    2011-03-01

    The analysis of neuroimaging data from pediatric populations presents several challenges. There are normal variations in brain shape from infancy to adulthood and normal developmental changes related to tissue maturation. Measurement of cortical thickness is one important way to analyze such developmental tissue changes. We developed a novel framework that allows group-wise automatic mesh-based analysis of cortical thickness. Our approach is divided into four main parts. First an individual pre-processing pipeline is applied on each subject to create genus-zero inflated white matter cortical surfaces with cortical thickness measurements. The second part performs an entropy-based group-wise shape correspondence on these meshes using a particle system, which establishes a trade-off between an even sampling of the cortical surfaces and the similarity of corresponding points across the population using sulcal depth information and spatial proximity. A novel automatic initial particle sampling is performed using a matched 98-lobe parcellation map prior to a particle-splitting phase. Third, corresponding re-sampled surfaces are computed with interpolated cortical thickness measurements, which are finally analyzed via a statistical vertex-wise analysis module. This framework consists of a pipeline of automated 3D Slicer compatible modules. It has been tested on a small pediatric dataset and incorporated in an open-source C++ based high-level module called GAMBIT. GAMBIT's setup allows efficient batch processing, grid computing and quality control. The current research focuses on the use of an average template for correspondence and surface re-sampling, as well as thorough validation of the framework and its application to clinical pediatric studies.

  13. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observation is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes's schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible.
This is a major advantage over other statistical data analysis systems

  14. [A new method of automatic analysis of tongue deviation using self-correction].

    PubMed

    Zhu, Mingfeng; Du, Jianqiang; Meng, Fan; Zhang, Kang; Ding, Chenghua

    2012-02-01

    This article reviews the previous method for automatic analysis of tongue deviation and introduces a new method with self-correction that avoids the shortcomings of the old one. Comparisons and analyses of current central-axis extraction methods showed that they are not suitable for extracting the central axis of tongue images. To overcome the old method's reliance on area symmetry for central-axis extraction, which could fail to find the axis, we introduced a shape-symmetry analysis method to extract it. This method corrects the edge of the tongue root automatically and improves the accuracy of central-axis extraction. In addition, a mouth-corner analysis method based on hue variation in tongue images is introduced. In comparison experiments, the new method was both more accurate and more efficient than the old one. PMID:22404028

  15. Cold Flow Properties of Biodiesel by Automatic and Manual Analysis Methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Biodiesel from most common feedstocks has inferior cold flow properties compared to conventional diesel fuel. Blends with as little as 10 vol% biodiesel content typically have significantly higher cloud point (CP), pour point (PP) and cold filter plugging point (CFPP) than No. 2 grade diesel fuel (...

  16. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens.

    PubMed

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-04-01

    This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear-cytoplasmic ratio, nuclear density, and number of layers from the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method proved effective for quantifying the Edmondson grade. PMID:27335894
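    Given binary masks for the nuclei and the trabecular tissue, the first two quantities named above reduce to simple area ratios. A minimal mask-level sketch (the paper computes these per local cord; the function names are illustrative):

```python
import numpy as np

def nc_ratio(nucleus_mask, trabecula_mask):
    """Nuclear-cytoplasmic ratio inside a trabecular region: nuclear area over
    non-nuclear (cytoplasmic) area.  Boolean masks of equal shape; assumes the
    trabecula contains some cytoplasm (non-zero denominator)."""
    nuc = np.logical_and(nucleus_mask, trabecula_mask).sum()
    cyto = np.logical_and(~nucleus_mask, trabecula_mask).sum()
    return nuc / cyto

def nuclear_density(nucleus_count, trabecula_mask, pixel_area):
    """Nuclei per unit trabecular area."""
    return nucleus_count / (trabecula_mask.sum() * pixel_area)
```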

  17. Development of automatic image analysis algorithms for protein localization studies in budding yeast

    NASA Astrophysics Data System (ADS)

    Logg, Katarina; Kvarnström, Mats; Diez, Alfredo; Bodvard, Kristofer; Käll, Mikael

    2007-02-01

    Microscopy of fluorescently labeled proteins has become a standard technique for live cell imaging. However, it is still a challenge to systematically extract quantitative data from large sets of images in an unbiased fashion, which is particularly important in high-throughput or time-lapse studies. Here we describe the development of a software package aimed at automatic quantification of abundance and spatio-temporal dynamics of fluorescently tagged proteins in vivo in the budding yeast Saccharomyces cerevisiae, one of the most important model organisms in proteomics. The image analysis methodology is based on first identifying cell contours from bright field images, and then using this information to measure and statistically analyse protein abundance in specific cellular domains from the corresponding fluorescence images. The applicability of the procedure is exemplified for two nuclear localized GFP-tagged proteins, Mcm4p and Nrm1p.

  18. Analysis of steranes and triterpanes in geolipid extracts by automatic classification of mass spectra

    NASA Technical Reports Server (NTRS)

    Wardroper, A. M. K.; Brooks, P. W.; Humberston, M. J.; Maxwell, J. R.

    1977-01-01

    A computer method is described for the automatic classification of triterpanes and steranes into gross structural type from their mass spectral characteristics. The method has been applied to the spectra obtained by gas-chromatographic/mass-spectroscopic analysis of two mixtures of standards and of hydrocarbon fractions isolated from Green River and Messel oil shales. Almost all of the steranes and triterpanes identified previously in both shales were classified, in addition to a number of new components. The results indicate that classification of such alkanes is possible with a laboratory computer system. The method has application to diagenesis and maturation studies as well as to oil/oil and oil/source rock correlations in which rapid screening of large numbers of samples is required.

  19. Application of automatic image analysis for the investigation of autoclaved aerated concrete structure

    SciTech Connect

    Petrov, I.; Schlegel, E. (Inst. fuer Silikattechnik)

    1994-01-01

    Autoclaved aerated concrete (AAC) is formed from small-grained mixtures of raw materials and Al-powder as an air entraining agent. Owing to its high porosity AAC has a low bulk density, which leads to very good heat insulating qualities. Automatic image analysis in connection with stereology and stochastic geometry was used to describe the size distribution of air pores in autoclaved concrete. The experiments were carried out on AAC samples with extremely different bulk densities and compressive strengths. The assumption of an elliptic shape of pores leads to an unambiguous characterization of structure by bi-histograms. It will be possible to calculate the spatial pore size distribution from these histograms, if the pores are assumed to be spheroids. A marked point field model and the pair correlation function g_a(r) were used to describe the pore structure.

  20. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. By computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method also has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.

  1. Automatic Denoising of Functional MRI Data: Combining Independent Component Analysis and Hierarchical Fusion of Classifiers

    PubMed Central

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-01-01

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject “at rest”). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing “signal” (brain activity) can be distinguished from the “noise” components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX (“FMRIB’s ICA-based X-noiseifier”), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of

  2. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    NASA Astrophysics Data System (ADS)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced that have proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual, and hence time consuming and prone to human error. In this research we propose an automatic image analysis based approach to measure the size of an ulcer and to track it over time so as to determine the effectiveness of the treatment process followed. In Ophthalmology an ulcer area is detected for further inspection via luminous excitation of a dye. Usually in the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye) the ulcer area is excited to be luminous green in colour, as compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially a pre-processing stage that carries out a local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly we deal with the removal of potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise, or image quality degradation. The ratio of the ulcer area confined within the corneal area to the total corneal area is used as the measure of comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of ulcer size over time.
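
    The measure described above reduces to pixel counting once the dye-stained and corneal regions are identified. A minimal sketch of that final ratio step in Python, where the hue window, data layout, and function name are illustrative assumptions rather than the paper's implementation:

```python
def ulcer_ratio(hsv_pixels, cornea_mask, green=(60, 180)):
    """Ratio of ulcer (green-fluorescing) area to total corneal area.

    hsv_pixels:  2-D grid of (h, s, v) tuples, hue in degrees.
    cornea_mask: same-shape grid of booleans (True = inside cornea).
    green:       hue window taken as 'dye fluorescence' (an assumption).
    """
    ulcer = corneal = 0
    for row_px, row_mask in zip(hsv_pixels, cornea_mask):
        for (h, s, v), inside in zip(row_px, row_mask):
            if inside:
                corneal += 1          # count every corneal pixel
                if green[0] <= h <= green[1]:
                    ulcer += 1        # corneal pixel inside hue window
    return ulcer / corneal if corneal else 0.0
```

    For instance, a 2×2 image with one pixel masked out and two green-hued corneal pixels gives a ratio of 2/3.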

  3. A system for automatic recording and analysis of motor activity in rats.

    PubMed

    Heredia-López, Francisco J; May-Tuyub, Rossana M; Bata-García, José L; Góngora-Alfaro, José L; Alvarez-Cervera, Fernando J

    2013-03-01

    We describe the design and evaluation of an electronic system for the automatic recording of motor activity in rats. The device continually locates the position of a rat inside a transparent acrylic cube (50 cm/side) with infrared sensors arranged on its walls so as to correspond to the x-, y-, and z-axes. The system is governed by two microcontrollers. The raw data are saved in a text file within a secure digital memory card, and offline analyses are performed with a library of programs that automatically compute several parameters based on the sequence of coordinates and the time of occurrence of each movement. Four analyses can be made at specified time intervals: traveled distance (cm), movement speed (cm/s), time spent in vertical exploration (s), and thigmotaxis (%). In addition, three analyses are made for the total duration of the experiment: time spent at each x-y coordinate pair (min), time spent on vertical exploration at each x-y coordinate pair (s), and frequency distribution of vertical exploration episodes of distinct durations. User profiles of frequently analyzed parameters may be created and saved for future experimental analyses, thus obtaining a full set of analyses for a group of rats in a short time. The performance of the developed system was assessed by recording the spontaneous motor activity of six rats, while their behaviors were simultaneously videotaped for manual analysis by two trained observers. A high and significant correlation was found between the values measured by the electronic system and by the observers. PMID:22707401
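
    Several of the reported parameters (traveled distance, movement speed, time immobile, number of stops) are simple functions of the recorded coordinate sequence. A minimal pure-Python sketch of that bookkeeping; the sampling interval and immobility threshold are illustrative assumptions, not the authors' values:

```python
import math

def locomotion_stats(track, dt=0.1, move_threshold=0.5):
    """Basic locomotor parameters from a sequence of (x, y) positions
    (cm) sampled every dt seconds. A step shorter than move_threshold
    is treated as immobility."""
    dist = 0.0
    immobile = 0.0
    stops = 0
    was_moving = True
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        dist += step
        if step < move_threshold:
            immobile += dt
            if was_moving:            # a new stop begins
                stops += 1
            was_moving = False
        else:
            was_moving = True
    duration = dt * (len(track) - 1)
    return {"distance_cm": dist,
            "mean_speed_cm_s": dist / duration if duration else 0.0,
            "time_immobile_s": immobile,
            "stops": stops}
```

    A real system would add the vertical (z-axis) sensor data for rearing time and bin the x-y coordinates for the spatial occupancy maps.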

  4. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    NASA Astrophysics Data System (ADS)

    Sharifi, Hamid; Larouche, Daniel

    2015-09-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during the casting. Predicting the evolution of these stresses with accuracy in the solidification interval should be highly helpful to avoid the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique of the heterogeneous semi-solid material for a finite element analysis at the microscopic level. This task is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected in the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium-copper alloy (Al-5.8 wt% Cu) at a fraction solid of 0.92. Using the finite element method and the Mie-Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges formed during the tensile loading have been detected.

  5. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein presents a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
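
    The classification idea can be illustrated at its smallest scale: train a depth-1 decision tree (a stump) on labeled microbenchmark timings, then classify new communication events. The single feature (event duration) and the exhaustive threshold search below are illustrative stand-ins for the richer feature set and full decision-tree learner the patent describes:

```python
def train_stump(samples):
    """Learn the threshold of a depth-1 decision tree from labeled
    microbenchmark events: (duration_us, label) pairs, where
    label 0 = efficient and 1 = inefficient. Picks the candidate
    threshold with the fewest misclassifications."""
    samples = sorted(samples)
    best_thr, best_err = None, float("inf")
    for thr, _ in samples:            # every observed duration is a candidate
        err = sum((d > thr) != bool(lab) for d, lab in samples)
        if err < best_err:
            best_thr, best_err = thr, err
    return best_thr

def classify(duration_us, thr):
    """Label a new communication event against the learned threshold."""
    return "inefficient" if duration_us > thr else "efficient"
```

    Because the threshold is learned from microbenchmarks run on the target system, the same code adapts automatically to different machine configurations.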

  6. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.

  7. Automatic target detection algorithm for foliage-penetrating ultrawideband SAR data using split spectral analysis

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Kapoor, Ravinder; Ressler, Marc A.

    1999-07-01

    We present an automatic target detection (ATD) algorithm for foliage penetrating (FOPEN) ultra-wideband (UWB) synthetic aperture radar (SAR) data using split spectral analysis. Split spectral analysis is commonly used in the ultrasonic, non-destructive evaluation of materials using wide band pulses for flaw detection. In this paper, we show the application of split spectral analysis to detecting obscured targets in foliage using UWB pulse returns. To discriminate targets from foliage, the data spectrum is split into several bands, namely 20 to 75, 75 to 150, ..., 825 to 900 MHz. An ATD algorithm is developed based on the relative energy levels in the various bands, the number of bands containing significant energy (spread of energy), and chip size (number of crossrange and range bins). The algorithm is tested on the FOPEN UWB SAR data of foliage and of vehicles obscured by foliage collected at Aberdeen Proving Ground, MD. The paper presents the various split spectral parameters used in the algorithm and discusses the rationale for their use.
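
    The core operation, splitting a return's spectrum into bands and comparing their energies, can be sketched with a naive DFT. The toy sampling rate and bands below are illustrative; the actual algorithm uses 75 MHz-wide bands spanning 20 to 900 MHz together with the spread-of-energy and chip-size criteria:

```python
import cmath

def band_energies(signal, fs, bands):
    """Split-spectrum sketch: naive DFT of a real pulse return, then
    the summed power |X(f)|^2 inside each (lo, hi) band in Hz."""
    n = len(signal)
    spectrum = []
    for k in range(n // 2 + 1):       # non-negative frequencies only
        X = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        spectrum.append(abs(X) ** 2)
    energies = []
    for lo, hi in bands:
        e = sum(p for k, p in enumerate(spectrum)
                if lo <= k * fs / n < hi)
        energies.append(e)
    return energies
```

    A target-like return concentrates energy in a few bands, while foliage clutter spreads it across many; thresholding on the band energies and their spread yields the detection decision.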

  8. Automatic system for analysis of locomotor activity in rodents--a reproducibility study.

    PubMed

    Aragão, Raquel da Silva; Rodrigues, Marco Aurélio Benedetti; de Barros, Karla Mônica Ferraz Teixeira; Silva, Sebastião Rogério Freitas; Toscano, Ana Elisa; de Souza, Ricardo Emmanuel; Manhães-de-Castro, Raul

    2011-02-15

    Automatic analysis of locomotion in studies of behavior and development is of great importance because it eliminates the subjective influence of evaluators on the study. This study aimed to develop and test the reproducibility of a system for automated analysis of locomotor activity in rats. For this study, 15 male Wistar rats were evaluated at P8, P14, P17, P21, P30 and P60. A monitoring system was developed that consisted of an open field of 1 m in diameter with a black surface, an infrared digital camera and a video capture card. The animals were filmed for 2 min as they moved freely in the field. The images were sent to a computer connected to the camera. Afterwards, the videos were analyzed using software developed in MATLAB® (mathematical software). The software was able to recognize the pixels constituting the image and extract the following parameters: distance traveled, average speed, average potency, time immobile, number of stops, time spent in different areas of the field and time immobile/number of stops. All data were exported for further analysis. The system was able to effectively extract the desired parameters. Thus, it was possible to observe developmental changes in the patterns of movement of the animals. We also discuss similarities and differences between this system and previously described systems. PMID:21182870

  9. Fractal analysis of elastographic images for automatic detection of diffuse diseases of salivary glands: preliminary results.

    PubMed

    Badea, Alexandru Florin; Lupsor Platon, Monica; Crisan, Maria; Cattani, Carlo; Badea, Iulia; Pierro, Gaetano; Sannino, Gianpaolo; Baciut, Grigore

    2013-01-01

    The geometry of some medical images of tissues, obtained by elastography and ultrasonography, is characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that in any image there are very subtle details that are not easily detectable by the human eye. However, in many cases like medical imaging diagnosis, these details are very important since they might contain some hidden information about the possible existence of certain pathological lesions like tissue degeneration, inflammation, or tumors. Therefore, an automatic method of analysis could be an expedient tool for physicians to give a faultless diagnosis. The fractal analysis is of great importance in relation to a quantitative evaluation of "real-time" elastography, a procedure considered to be operator dependent in the current clinical practice. Mathematical analysis reveals significant discrepancies among normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology. PMID:23762183
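
    The fractal dimension (FD) used here is typically estimated by box counting: count the boxes N(s) needed to cover the structure at each box size s, then fit log N(s) against log(1/s). A self-contained sketch on a 2-D point set (the input representation and box sizes are illustrative; FD estimation on elastograms operates on segmented image regions):

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the fractal dimension of a set of 2-D points:
    count occupied boxes N(s) at each box size s, then return the
    least-squares slope of log N(s) versus log(1/s)."""
    logs, logN = [], []
    for s in sizes:
        # bucket each point into its grid cell at resolution s
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        logN.append(math.log(len(boxes)))
    n = len(sizes)
    mx = sum(logs) / n
    my = sum(logN) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(logs, logN))
             / sum((a - mx) ** 2 for a in logs))
    return slope
```

    A filled square yields a dimension near 2 and a straight line near 1; textures of pathological tissue fall at intermediate, discriminative values.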

  10. Semi-automatic analysis of rock fracture orientations from borehole wall images

    SciTech Connect

    Thapa, B.B.; Hughett, P.; Karasaki, K.

    1997-01-01

    The authors develop a semiautomatic method of identifying rock fractures and analyzing their orientations from digital images of borehole walls. This method is based on an algorithm related to the Hough transform which is modified to find sinusoidal rather than linear patterns. The algorithm uses the high-intensity contrast between the fracture aperture and the rock wall, as well as the sinusoidal trajectory defined by the intersection of the borehole and the fracture. The analysis rate of the algorithm itself is independent of fracture contrast and network complexity. The method has successfully identified fractures both in test cases containing several fractures in a noisy background and in real borehole images. The analysis rate was 0.3--1.2 minutes/m of input data, compared to an average of 12 minutes/m using an existing interactive method. An automatic version under development should open new possibilities for site characterization, such as real-time exploration and analysis of tunnel stability and support requirements as construction proceeds.
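
    The modified Hough transform can be sketched as voting over sinusoid parameters: each edge pixel (x, d) of the unrolled borehole-wall image votes for every (mean depth, amplitude, phase) triple consistent with it, and the best-supported sinusoid is the fracture trace. The parameter ranges and discretization below are illustrative assumptions, not the authors' settings:

```python
import math

def hough_sinusoid(points, width, m_range, a_range, phi_steps=36):
    """Hough-style voting for d = m + A*sin(2*pi*x/width + phi)
    through edge pixels (x, d) of an unrolled borehole-wall image.
    Returns the (m, A, phi) cell that collected the most votes."""
    votes = {}
    for x, d in points:
        theta = 2 * math.pi * x / width
        for A in a_range:
            for k in range(phi_steps):
                phi = 2 * math.pi * k / phi_steps
                # the mean depth m implied by this pixel for (A, phi)
                m = round(d - A * math.sin(theta + phi))
                if m in m_range:
                    key = (m, A, k)
                    votes[key] = votes.get(key, 0) + 1
    (m, A, k), _ = max(votes.items(), key=lambda kv: kv[1])
    return m, A, 2 * math.pi * k / phi_steps
```

    As in the classical line-detecting Hough transform, the voting cost is independent of how many fractures are present, which matches the reported insensitivity to network complexity.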

  11. Automatic generation of skeletal mechanisms for ignition combustion based on level of importance analysis

    SciTech Connect

    Loevaas, Terese

    2009-07-15

    A level of importance (LOI) selection parameter is employed in order to identify species with general low importance to the overall accuracy of a chemical model. This enables elimination of the minor reaction paths in which these species are involved. The generation of such skeletal mechanisms is performed automatically in a pre-processing step ranking species according to their level of importance. This selection criterion is a combined parameter based on a time scale and sensitivity analysis, identifying both short lived species and species with respect to which the observable of interest has low sensitivity. In this work a careful element flux analysis demonstrates that such species do not interact in major reaction paths. Employing the LOI procedure replaces the previous method of identifying redundant species through a two step procedure involving a reaction flow analysis followed by a sensitivity analysis. The flux analysis is performed using DARS®, a digital analysis tool modelling reactive systems. Simplified chemical models are generated based on a detailed ethylene mechanism involving 111 species and 784 reactions (1566 forward and backward reactions) proposed by Wang et al. Eliminating species from detailed mechanisms introduces errors in the predicted combustion parameters. In the present work these errors are systematically studied for a wide range of conditions, including temperature, pressure and mixtures. Results show that the accuracy of simplified models is particularly lowered when the initial temperatures are close to the transition between low- and high-temperature chemistry. A speed-up factor of 5 is observed when using a simplified model containing only 27% of the original species and 19% of the original reactions. (author)

  12. Automatic Differentiation Package

    Energy Science and Technology Software Center (ESTSC)

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization and uncertainty quantification.
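
    Sacado itself is a C++ library, but the forward-mode idea it implements via operator overloading can be sketched in a few lines of Python using dual numbers; the class and function names here are illustrative, not Sacado's API:

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    every arithmetic operation propagates (value, derivative), so
    evaluating f(Dual(x, 1)) yields f(x) and f'(x) exactly, with no
    finite-difference truncation error."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """d/dx of any f built from the overloaded operations, at x."""
    return f(Dual(x, 1.0)).der
```

    For f(x) = 3x² + 2x, seeding x = 2 with derivative 1 propagates the exact value f(2) = 16 and slope f′(2) = 14 through the arithmetic.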

  13. Automatic Behavior Analysis During a Clinical Interview with a Virtual Human.

    PubMed

    Rizzo, Albert; Lucas, Gale; Gratch, Jonathan; Stratou, Giota; Morency, Louis-Philippe; Chavez, Kenneth; Shilling, Russ; Scherer, Stefan

    2016-01-01

    SimSensei is a Virtual Human (VH) interviewing platform that uses off-the-shelf sensors (i.e., webcams, Microsoft Kinect and a microphone) to capture and interpret real-time audiovisual behavioral signals from users interacting with the VH system. The system was specifically designed for clinical interviewing and health care support by providing a face-to-face interaction between a user and a VH that can automatically react to the inferred state of the user through analysis of behavioral signals gleaned from the user's facial expressions, body gestures and vocal parameters. Akin to how non-verbal behavioral signals have an impact on human-to-human interaction and communication, SimSensei aims to capture and infer user state from signals generated from user non-verbal communication to improve engagement between a VH and a user and to quantify user state from the data captured across a 20 minute interview. Results from a sample of service members (SMs) who were interviewed before and after a deployment to Afghanistan indicate that SMs reveal more PTSD symptoms to the VH than they report on the Post Deployment Health Assessment. Pre/post deployment facial expression analysis indicated more sad expressions and fewer happy expressions at post deployment. PMID:27046598

  14. Automatic Roof Plane Detection and Analysis in Airborne Lidar Point Clouds for Solar Potential Assessment

    PubMed Central

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature to decompose the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification. It results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The presented method fully automatically detected a subset of 809 out of 1,071 roof planes where the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m2. PMID:22346695

  15. Learner Systems and Error Analysis. Perspective: A New Freedom. ACTFL Review of Foreign Language Education, Vol. 7.

    ERIC Educational Resources Information Center

    Valdman, Albert

    Errors in second language learning are viewed as evidence of the learner's hypotheses and strategies about the new data. Error observation and analysis are important to the formulation of theories about language learning and the preparation of teaching materials. Learning a second language proceeds by a series of approximative reorganizations…

  16. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    PubMed

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for studies ranging from simple single-subject analyses up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address. PMID:25642185
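
    The processing engine's bookkeeping, a dependency map plus skipping completed work while re-running anything downstream of a missing result, can be sketched as follows. The data model (module name mapped to a list of upstream dependencies) is an illustrative assumption, not aa's actual internals:

```python
def schedule(pipeline, done):
    """Sketch of an aa-style engine: given module -> upstream deps,
    return the modules that still need to run, in a valid execution
    order. Modules in `done` are skipped, but any module downstream
    of one that must (re)run is itself re-run."""
    order, state, pending = [], {}, set()

    def visit(mod):
        if state.get(mod) == "visited":
            return
        state[mod] = "visited"
        for dep in pipeline.get(mod, []):
            visit(dep)                 # resolve upstream first
        needs_run = (mod not in done or
                     any(d in pending for d in pipeline.get(mod, [])))
        if needs_run:
            pending.add(mod)
            order.append(mod)

    for mod in pipeline:
        visit(mod)
    return order
```

    Independent modules in the resulting order could then be dispatched in parallel, which is the other half of what the engine automates.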

  18. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% by suitable grazing land.

  19. A new and fast methodology to assess oxidative damage in cardiovascular diseases risk development through eVol-MEPS-UHPLC analysis of four urinary biomarkers.

    PubMed

    Mendes, Berta; Silva, Pedro; Mendonça, Isabel; Pereira, Jorge; Câmara, José S

    2013-11-15

    In this work, a new, fast and reliable methodology using a digitally controlled microextraction by packed sorbent (eVol(®)-MEPS) followed by ultra-high pressure liquid chromatography (UHPLC) analysis with photodiode array (PDA) detection was developed to establish the urinary profile levels of four putative oxidative stress biomarkers (OSBs) in healthy subjects and patients evidencing cardiovascular diseases (CVDs). These data were used to verify the suitability of the selected OSBs (uric acid-UAc, malondialdehyde-MDA, 5-(hydroxymethyl)uracil-5-HMUra and 8-hydroxy-2'-deoxyguanosine-8-oxodG) as potential biomarkers of CVDs progression. Important parameters affecting the efficiency of the extraction process were optimized, particularly stationary phase selection, pH influence, sample volume, number of extraction cycles, and washing and elution volumes. The experimental conditions that allowed the best extraction efficiency, expressed in terms of total area of the target analytes and data reproducibility, include a 10-fold dilution and pH adjustment of the urine samples to 6.0, followed by a gradient elution through the C8 adsorbent with 5×50 µL of 0.01% formic acid and 3×50 µL of 20% methanol in 0.01% formic acid. The chromatographic separation of the target analytes was performed with a HSS T3 column (100 mm × 2.1 mm, 1.7 µm particle size) using 0.01% formic acid with 20% methanol at 250 µL min(-1). The methodology was validated in terms of selectivity, linearity, instrumental limit of detection (LOD), method limit of quantification (LOQ), matrix effect, accuracy and precision (intra- and inter-day). Good results were obtained in terms of selectivity and linearity (r(2)>0.9906), as well as the LOD and LOQ, whose values were low, ranging from 0.00005 to 0.72 µg mL(-1) and 0.00023 to 2.31 µg mL(-1), respectively. The recovery results (91.1-123.0%), intra-day (1.0-8.3%), inter-day precision (4.6-6.3%) and the matrix effect (60.1-110.3%) of eVol

  20. Automatic Robust Neurite Detection and Morphological Analysis of Neuronal Cell Cultures in High-content Screening

    PubMed Central

    Wu, Chaohong; Schulte, Joost; Sepp, Katharine J.; Littleton, J. Troy

    2011-01-01

    Cell-based high content screening (HCS) is becoming an important and increasingly favored approach in therapeutic drug discovery and functional genomics. In HCS, changes in cellular morphology and biomarker distributions provide an information-rich profile of cellular responses to experimental treatments such as small molecules or gene knockdown probes. One obstacle that currently exists with such cell-based assays is the lack of image processing algorithms that are capable of reliably and automatically analyzing large HCS image sets. HCS images of primary neuronal cell cultures are particularly challenging to analyze due to complex cellular morphology. Here we present a robust method for quantifying and statistically analyzing the morphology of neuronal cells in HCS images. The major advantages of our method over existing software lie in its capability to correct non-uniform illumination using the contrast-limited adaptive histogram equalization method; segment neuromeres using Gabor-wavelet texture analysis; and detect faint neurites by a novel phase-based neurite extraction algorithm that is invariant to changes in illumination and contrast and can accurately localize neurites. Our method was successfully applied to analyze a large HCS image set generated in a morphology screen for polyglutamine-mediated neuronal toxicity using primary neuronal cell cultures derived from embryos of a Drosophila Huntington’s Disease (HD) model. PMID:20405243

  1. Automatic aerial image shadow detection through the hybrid analysis of RGB and HIS color space

    NASA Astrophysics Data System (ADS)

    Wu, Jun; Li, Huilin; Peng, Zhiyong

    2015-12-01

    This paper presents our research on automatic shadow detection from high-resolution aerial images through the hybrid analysis of the RGB and HIS color spaces. To this end, the spectral characteristics of shadow are first discussed, and three spectral components — the difference between the normalized blue and normalized red components (B-R), intensity, and saturation — are selected as criteria to obtain an initial segmentation of the shadow region (called primary segmentation). After that, within the normalized RGB color space and the HIS color space, the shadow region is extracted again (called auxiliary segmentation) using the Otsu operation. Finally, the primary segmentation and auxiliary segmentation are combined through a logical AND operation to obtain a reliable shadow region. In this step, small shadow areas are removed from the combined shadow region, and morphological operations are applied to fill small holes as well. The experimental results show that the proposed approach can effectively detect the shadow region from high-resolution aerial images with a high degree of automation.
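
    A minimal numpy sketch of the two core operations the paper combines — Otsu thresholding of individual color components and a logical AND of the resulting masks — might look as follows. The channel definitions and the synthetic test image are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the threshold that maximizes the
    between-class variance of the gray-level histogram."""
    hist, edges = np.histogram(img.ravel(), bins=256)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                      # pixels below threshold
    w1 = w0[-1] - w0                          # pixels above threshold
    m0 = np.cumsum(hist * centers)
    mu0 = np.where(w0 > 0, m0 / np.maximum(w0, 1e-12), 0)
    mu1 = np.where(w1 > 0, (m0[-1] - m0) / np.maximum(w1, 1e-12), 0)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between)]

def shadow_mask(rgb):
    """Flag pixels that are both dark (low intensity) and relatively
    saturated, combining the two Otsu masks with a logical AND."""
    rgb = rgb.astype(float)
    intensity = rgb.mean(axis=2)
    saturation = 1 - rgb.min(axis=2) / np.maximum(rgb.max(axis=2), 1e-12)
    dark = intensity < otsu_threshold(intensity)         # primary cue
    saturated = saturation > otsu_threshold(saturation)  # auxiliary cue
    return dark & saturated                              # AND combination

# synthetic aerial tile: bright ground on the left, dark shadow on the right
img = np.zeros((10, 10, 3))
img[:, :5] = [200, 200, 200]
img[:, 5:] = [10, 10, 80]
mask = shadow_mask(img)
```

On the synthetic tile the AND of the two cues keeps only the dark, saturated right half, mirroring how the paper suppresses false positives that satisfy only one criterion.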

  2. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NASA Astrophysics Data System (ADS)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.
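
    Once the two feet are registered, the asymmetric analysis itself reduces to a per-pixel contralateral temperature difference. A small illustrative sketch — the 2.2 °C cutoff is a value often cited in the diabetic-foot literature, used here only as an assumption, and the registration step is presumed already done:

```python
import numpy as np

def asymmetry_hotspots(left, right_mirrored, delta=2.2):
    """Given registered temperature maps (degrees Celsius) of the left
    foot and the mirrored right foot, flag pixels whose contralateral
    temperature difference exceeds `delta`."""
    diff = left - right_mirrored
    return diff, np.abs(diff) > delta

left = np.full((4, 4), 30.0)
right = np.full((4, 4), 30.0)
left[1, 1] = 33.5            # a 3.5 degC hotspot on the left foot
diff, hot = asymmetry_hotspots(left, right)
```

In a full system the mask of hotspots would be the clinically significant output handed to the risk-assessment stage.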

  3. Sensitivity analysis of a mixed-phase chemical mechanism using automatic differentiation

    SciTech Connect

    Zhang, Y.; Easter, R.C.

    1998-08-01

    A sensitivity analysis of a comprehensive mixed-phase chemical mechanism is conducted under a variety of atmospheric conditions. The local sensitivities of gas and aqueous phase species concentrations with respect to a variety of model parameters are calculated using the novel automatic differentiation ADIFOR tool. The main chemical reaction pathways in all phases, interfacial mass transfer processes, and ambient physical parameters that affect tropospheric O{sub 3} formation and O{sub 3}-precursor relations under all modeled conditions are identified and analyzed. The results show that the presence of clouds not only reduces many gas phase species concentrations and the total oxidizing capacity but alters O{sub 3}-precursor relations. Decreases in gas phase concentrations and photochemical formation rates of O{sub 3} can be up to 9{percent} and 100{percent}, respectively, depending on the preexisting atmospheric conditions. The decrease in O{sub 3} formation is primarily caused by the aqueous phase reactions of O{sub 2}{sup {minus}} with dissolved HO{sub 2} and O{sub 3} under most cloudy conditions. {copyright} 1998 American Geophysical Union

  4. Sensitivity Analysis of Photochemical Indicators for O3 Chemistry Using Automatic Differentiation

    SciTech Connect

    Zhang, Yang; Bischof, Christian H.; Easter, Richard C.; Wu, Po-Ting

    2005-05-01

    Photochemical indicators for determination of O{sub 3}-NO{sub x}-ROG sensitivity and their sensitivity to model parameters are studied for a variety of polluted conditions using a comprehensive mixed-phase chemistry box model and the novel automatic differentiation ADIFOR tool. The main chemical reaction pathways in all phases, interfacial mass transfer processes, and ambient physical parameters that affect the indicators are identified and analyzed. Condensed mixed-phase chemical mechanisms are derived from the sensitivity analysis. Our results show that cloud chemistry has a significant impact on the indicators and their sensitivities, particularly on those involving H{sub 2}O{sub 2}, HNO{sub 3}, HCHO, and NO{sub z}. Caution should be taken when applying the established threshold values of indicators in regions with large cloud coverage. Among the commonly used indicators, NO{sub y} and O{sub 3}/NO{sub y} are relatively insensitive to most model parameters, whereas indicators involving H{sub 2}O{sub 2}, HNO{sub 3}, HCHO, and NO{sub z} are highly sensitive to changes in initial species concentrations, reaction rate constants, equilibrium constants, temperature, relative humidity, cloud droplet size, and cloud water content.

  5. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A(2) SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336
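
    Tie-point extraction with SIFT typically keeps only correspondences that pass Lowe's ratio test, which rejects ambiguous matches. The following numpy sketch illustrates that test on toy descriptors; the 0.8 ratio and the descriptor values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def ratio_test_matches(desc1, desc2, ratio=0.8):
    """Match each descriptor in desc1 to its nearest neighbor in desc2,
    keeping the match only if it is clearly better than the second-best
    candidate (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

desc1 = np.array([[1.0, 0.0], [0.0, 1.0]])
desc2 = np.array([[0.9, 0.1], [0.0, 1.0], [5.0, 5.0]])
matches = ratio_test_matches(desc1, desc2)
```

Both toy descriptors have one unambiguous nearest neighbor, so both matches survive; a descriptor with two similar candidates would be dropped.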

  6. A visual latent semantic approach for automatic analysis and interpretation of anaplastic medulloblastoma virtual slides.

    PubMed

    Cruz-Roa, Angel; González, Fabio; Galaro, Joseph; Judkins, Alexander R; Ellison, David; Baccon, Jennifer; Madabhushi, Anant; Romero, Eduardo

    2012-01-01

    A method for automatic analysis and interpretation of histopathology images is presented. The method uses a representation of the image data set based on bag-of-features histograms built from a visual dictionary of Haar-based patches and a novel visual latent semantic strategy for characterizing the visual content of a set of images. One important contribution of the method is the provision of an interpretability layer, which is able to explain a particular classification by visually mapping the most important visual patterns associated with such a classification. The method was evaluated on a challenging problem involving the automated discrimination of medulloblastoma tumors as anaplastic or non-anaplastic based on image-derived attributes from whole slide images. The data set comprised 10 labeled histopathological patient studies, 5 anaplastic and 5 non-anaplastic, with 750 square images per study cropped randomly from cancerous regions of the whole slide. The experimental results show that the new method is competitive in terms of classification accuracy, achieving 0.87 on average. PMID:23285547
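
    The bag-of-features representation assigns each image patch to its nearest visual word and accumulates a normalized histogram per image. A hypothetical numpy sketch — the two-word dictionary and patch descriptors are invented for illustration, and a real system would learn the dictionary (e.g. by clustering) from training patches:

```python
import numpy as np

def bag_of_features(patches, dictionary):
    """Assign each patch descriptor to its nearest visual word and
    return the normalized word-count histogram for the image."""
    # pairwise distances: shape (n_patches, n_words)
    d = np.linalg.norm(patches[:, None, :] - dictionary[None, :, :], axis=2)
    words = d.argmin(axis=1)
    hist = np.bincount(words, minlength=len(dictionary)).astype(float)
    return hist / hist.sum()

dictionary = np.array([[0.0, 0.0], [1.0, 1.0]])   # two visual words
patches = np.array([[0.1, 0.0], [0.9, 1.1], [1.0, 0.9], [0.0, 0.2]])
h = bag_of_features(patches, dictionary)
```

The resulting histograms are the per-image feature vectors on which a latent semantic analysis or a classifier can then operate.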

  7. Automatic Sleep Stage Scoring Using Time-Frequency Analysis and Stacked Sparse Autoencoders.

    PubMed

    Tsinalis, Orestis; Matthews, Paul M; Guo, Yike

    2016-05-01

    We developed a machine learning methodology for automatic sleep stage scoring. Our time-frequency analysis-based feature extraction is fine-tuned to capture sleep stage-specific signal features as described in the American Academy of Sleep Medicine manual that the human experts follow. We used ensemble learning with an ensemble of stacked sparse autoencoders for classifying the sleep stages. We used class-balanced random sampling across sleep stages for each model in the ensemble to avoid skewed performance in favor of the most represented sleep stages, and addressed the problem of misclassification errors due to class imbalance while significantly improving worst-stage classification. We used an openly available dataset from 20 healthy young adults for evaluation. We used a single channel of EEG from this dataset, which makes our method a suitable candidate for longitudinal monitoring using wearable EEG in real-world settings. Our method has both high overall accuracy (78%, range 75-80%), and high mean [Formula: see text]-score (84%, range 82-86%) and mean accuracy across individual sleep stages (86%, range 84-88%) over all subjects. The performance of our method appears to be uncorrelated with the sleep efficiency and percentage of transitional epochs in each recording. PMID:26464268
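
    The class-balanced random sampling step can be sketched as drawing an equal number of epochs from every sleep stage, so that no stage dominates training. The function name and per-class count below are illustrative assumptions:

```python
import numpy as np

def class_balanced_sample(labels, per_class, rng):
    """Draw the same number of epochs from every sleep stage so the
    training set is not dominated by the most frequent stage."""
    idx = []
    for c in np.unique(labels):
        pool = np.flatnonzero(labels == c)
        # sample with replacement when a stage has too few epochs
        idx.append(rng.choice(pool, size=per_class,
                              replace=len(pool) < per_class))
    return np.concatenate(idx)

rng = np.random.default_rng(0)
labels = np.array([0] * 80 + [1] * 15 + [2] * 5)   # heavily imbalanced stages
sample = class_balanced_sample(labels, per_class=10, rng=rng)
```

Each model in the ensemble would be trained on a different balanced draw like this one.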

  8. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature. PMID:26722989

  9. Automatic fault diagnosis of rotating machines by time-scale manifold ridge analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; He, Qingbo; Kong, Fanrang

    2013-10-01

    This paper explores an improved time-scale representation that accounts for non-linear properties to effectively identify rotating machine faults in the time-scale domain. A new time-scale signature, called the time-scale manifold (TSM), is proposed in this study by combining phase space reconstruction (PSR), the continuous wavelet transform (CWT), and manifold learning. For the TSM generation, an optimal scale band is selected to eliminate the influence of unconcerned scale components, and the noise in the selected band is suppressed by manifold learning to highlight the inherent non-linear structure of faulty impacts. The TSM preserves the non-stationary information and reveals the non-linear structure of the fault pattern, with the merits of noise suppression and resolution improvement. The TSM ridge is further extracted by seeking the ridge with energy concentration lying on the TSM signature. It inherits the advantages of both the TSM and ridge analysis, and hence is beneficial to demodulation of the fault information. By analyzing the instantaneous amplitude (IA) of the TSM ridge, which contains almost no noise, the fault characteristic frequency can be exactly identified. The whole process of the proposed fault diagnosis scheme is automatic, and its effectiveness has been verified by means of typical faulty vibration/acoustic signals from a gearbox and bearings. A reliable performance of the new method is validated in comparison with traditional enveloping methods for rotating machine fault diagnosis.
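
    Ridge extraction from a time-scale map can be illustrated as picking, at each time instant, the scale of maximal energy together with the amplitude along that path. This is a simplification of the paper's TSM ridge seeking; the toy scalogram below is an invented example:

```python
import numpy as np

def ridge(tsm, scales):
    """Extract the ridge of a time-scale map: at each time sample keep
    the scale with maximal energy, plus the amplitude along the ridge."""
    k = np.abs(tsm).argmax(axis=0)                 # scale index per sample
    amp = np.abs(tsm)[k, np.arange(tsm.shape[1])]  # amplitude on the ridge
    return scales[k], amp

# toy scalogram: energy concentrated on the middle scale
tsm = np.zeros((3, 5))
tsm[1] = [1.0, 2.0, 3.0, 2.0, 1.0]
scales = np.array([0.5, 1.0, 2.0])
ridge_scales, ridge_amp = ridge(tsm, scales)
```

The amplitude profile along the ridge plays the role of the instantaneous amplitude that is then scanned for the fault characteristic frequency.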

  10. Automatic Selection of Order Parameters in the Analysis of Large Scale Molecular Dynamics Simulations

    PubMed Central

    2015-01-01

    Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states. PMID:25516725

  11. Automatic analysis and characterization of the hummingbird wings motion using dense optical flow features.

    PubMed

    Martínez, Fabio; Manzanera, Antoine; Romero, Eduardo

    2015-01-01

    A new method for automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., the pixels with larger velocities. Then, the kinematics and deformation of the wings are characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing, and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation estimates those wing foci with larger deformation. Finally, a local measure of the orientation highlights those regions with maximal deformation. The approach was evaluated on a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw-turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of [Formula: see text] and [Formula: see text] for the global angular acceleration and the global wing deformation respectively. PMID:25599248
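
    The wing segmentation step — keep the pixels whose optical-flow magnitude exceeds a threshold — can be sketched as below. The flow fields and threshold are toy values, and computing the dense optical flow itself is outside this sketch:

```python
import numpy as np

def segment_wings(u, v, thresh):
    """Keep pixels whose apparent speed (optical-flow magnitude) exceeds
    a threshold, and summarize their motion by the variance of the flow
    orientation inside the mask (a deformation cue)."""
    speed = np.hypot(u, v)
    mask = speed > thresh
    angles = np.arctan2(v[mask], u[mask])
    return mask, angles.var()

u = np.zeros((3, 3))
v = np.zeros((3, 3))
u[1, 1], v[1, 1] = 3.0, 4.0      # one fast-moving wing pixel
mask, orient_var = segment_wings(u, v, thresh=1.0)
```

With a single moving pixel the orientation variance is zero; regions of the real wing with diverse flow orientations would score high, flagging larger deformation.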

  12. Automatic detection of basal cell carcinoma using telangiectasia analysis in dermoscopy skin lesion images

    PubMed Central

    Cheng, Beibei; Erdos, David; Stanley, Ronald J.; Stoecker, William V.; Calcara, David A.; Gómez, David D.

    2011-01-01

    Background Telangiectasia, dilated blood vessels of small, varying diameter near the surface of the skin, are critical dermoscopy structures used in the detection of basal cell carcinoma (BCC). Distinguishing these vessels from other telangiectasia that are commonly found in sun-damaged skin is challenging. Methods Image analysis techniques are investigated to automatically find the vessel structures present in BCC. The primary screen for vessels uses an optimized local color drop technique. A noise filter is developed to eliminate false-positive structures, primarily bubbles, hair, and blotch and ulcer edges. From the telangiectasia mask containing candidate vessel-like structures, shape, size, and normalized count features are computed to facilitate the discrimination of benign skin lesions from BCCs with telangiectasia. Results Experimental results yielded a diagnostic accuracy as high as 96.7% using a neural network classifier for a data set of 59 BCCs and 152 benign lesions for skin lesion discrimination based on features computed from the telangiectasia masks. Conclusion In current clinical practice, it is possible to find smaller BCCs by dermoscopy than by clinical inspection. Although almost all of these small BCCs have telangiectasia, they can be short and thin. Normalization of lengths and areas helps to detect these smaller BCCs. PMID:23815446

  13. A marked point process of rectangles and segments for automatic analysis of digital elevation models.

    PubMed

    Ortner, Mathias; Descombe, Xavier; Zerubia, Josiane

    2008-01-01

    This work presents a framework for automatic feature extraction from images using stochastic geometry. Features in images are modeled as realizations of a spatial point process of geometrical shapes. This framework allows the incorporation of a priori knowledge on the spatial repartition of features. More specifically, we present a model based on the superposition of a process of segments and a process of rectangles. The former is dedicated to the detection of linear networks of discontinuities, while the latter aims at segmenting homogeneous areas. An energy is defined, favoring connections of segments and alignments of rectangles, as well as a relevant interaction between both types of objects. The estimation is performed by minimizing the energy using a simulated annealing algorithm. The proposed model is applied to the analysis of Digital Elevation Models (DEMs). These images are raster data representing the altimetry of a dense urban area. We present results on real data provided by the IGN (French National Geographic Institute) consisting of low-quality DEMs of various types. PMID:18000328
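
    The energy minimization by simulated annealing can be illustrated generically: accept uphill moves with probability exp(-ΔE/T) and cool the temperature. The sketch below uses a 1-D toy energy in place of the paper's rectangle/segment configuration energy, so the state, neighbor move, and schedule are all illustrative assumptions:

```python
import math
import random

def simulated_annealing(energy, state, neighbor,
                        t0=1.0, cooling=0.95, steps=2000, seed=0):
    """Generic simulated annealing: accept uphill moves with
    probability exp(-dE/T) and cool the temperature geometrically."""
    rng = random.Random(seed)
    t = t0
    best, best_e = state, energy(state)
    e = best_e
    for _ in range(steps):
        cand = neighbor(state, rng)
        de = energy(cand) - e
        if de < 0 or rng.random() < math.exp(-de / t):
            state, e = cand, e + de
            if e < best_e:
                best, best_e = state, e
        t *= cooling
    return best, best_e

# toy 1-D energy with its minimum at x = 2
f = lambda x: (x - 2) ** 2
best, val = simulated_annealing(f, 5.0,
                                lambda x, r: x + r.uniform(-0.5, 0.5))
```

In the paper the state is a configuration of rectangles and segments and the neighbor move adds, removes, or perturbs an object, but the acceptance rule is the same.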

  14. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    PubMed Central

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although on the one hand, digital pathology and new bioimaging technologies find their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often terminate in a suboptimal local maximum. PMID:23766941
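
    A coordinate-descent parameter fit of the kind described — sweep one pipeline parameter at a time over a grid while holding the rest fixed, then repeat — can be sketched as follows. The "quality" function below is a stand-in for a real segmentation score, and the parameter names and grids are invented:

```python
def coordinate_descent(score, params, grids, sweeps=3):
    """Fit pipeline parameters one at a time: sweep each parameter over
    its grid while the others stay fixed, keep the best value, repeat.
    Like hill climbing, this can stall in a local maximum."""
    params = dict(params)
    for _ in range(sweeps):
        for name, grid in grids.items():
            best_v, best_s = params[name], score(params)
            for v in grid:
                trial = {**params, name: v}
                s = score(trial)
                if s > best_s:
                    best_v, best_s = v, s
            params[name] = best_v
    return params

# toy segmentation "quality" peaked at threshold=0.5, min_size=20
quality = lambda p: -(p["threshold"] - 0.5) ** 2 \
                    - ((p["min_size"] - 20) / 100) ** 2
fitted = coordinate_descent(quality,
                            {"threshold": 0.1, "min_size": 5},
                            {"threshold": [0.1, 0.3, 0.5, 0.7],
                             "min_size": [5, 10, 20, 40]})
```

On this unimodal toy score the sweep converges to the peak; the paper's point is precisely that real parameter spaces are multimodal, so such sweeps need the visual exploration described above.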

  15. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    PubMed Central

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336

  16. Automatic Analysis for the Chemical Testing of Urine Examination Using Digital Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Vilardy, Juan M.; Peña, Jose C.; Daza, Miller F.; Torres, Cesar O.; Mattos, Lorenzo

    2008-04-01

    To perform the chemical testing of a urine examination, a dipstick is used that contains pads incorporating the reagents for chemical reactions for the detection of a number of substances in the urine. Urine is applied to the pads by dipping the dipstick into the urine sample and then slowly withdrawing it. The subsequent colorimetric reactions are timed to an endpoint; the extent of color formation is directly related to the level of the urine constituent. The colors can be read manually by comparison with color charts or with the use of automated reflectance meters. The aim of the system described in this paper is to automatically analyze and determine the color changes in the dipstick after it is withdrawn from the urine sample, and to compare the results with color charts for the diagnosis of many common diseases such as diabetes. The system consists of: (a) a USB camera; (b) a computer; (c) Matlab v7.4 software. Image analysis begins with the digital capture of the image as data. Once the image is acquired in digital format, the data can be manipulated through digital image processing. Our objective was to develop a computerized image processing system and an interactive software package to support clinicians, medical research, and medical students.
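
    The color-chart comparison step reduces to nearest-neighbor matching in color space. A hypothetical sketch — the chart colors are invented, and a real system would calibrate against the printed chart and likely work in a perceptual color space rather than raw RGB:

```python
import numpy as np

def read_pad(pad_rgb, chart):
    """Compare the mean color of a reagent pad against the reference
    color chart and report the closest chart entry."""
    names = list(chart)
    refs = np.array([chart[n] for n in names], dtype=float)
    d = np.linalg.norm(refs - np.asarray(pad_rgb, dtype=float), axis=1)
    return names[int(d.argmin())]

# hypothetical glucose chart entries (RGB)
chart = {"negative": (120, 190, 220),
         "trace": (110, 170, 150),
         "high": (130, 90, 60)}
result = read_pad((125, 95, 70), chart)
```

The measured pad color sits closest to the "high" reference, so that level would be reported for this pad.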

  17. Automatic Detections of P and S Phases using Singular Value Decomposition Analysis

    NASA Astrophysics Data System (ADS)

    Kurzon, I.; Vernon, F.; Ben-Zion, Y.; Rosenberger, A.

    2012-12-01

    We implement a new method for the automatic detection of the primary P and S phases using Singular Value Decomposition (SVD) analysis. The method is based on a real-time iteration algorithm of Rosenberger (2010) for the SVD of three-component seismograms. Rosenberger's algorithm identifies the incidence angle by applying SVD and separates the waveforms into their P and S components. We have been using the same algorithm, with the modification that we apply a set of filters prior to the SVD, and study the success of these filters in correctly detecting the P and S arrivals at different stations and segments of the San Jacinto Fault Zone. A recent deployment in the San Jacinto Fault Zone area provides a very dense seismic network, with ~90 stations in a fault zone 150 km long and 30 km wide. Embedded in this network are 5 linear arrays crossing the fault trace, with ~10 stations at ~25-50 m spacing in each array. This allows us to test the detection algorithm in a diverse setting, including events with different source mechanisms, stations with different site characteristics, and ray paths that diverge from the SVD approximation used in the algorithm, such as rays propagating within the fault and recorded on the linear arrays. Comparing our new method with classic automatic detection methods using Short Time Average (STA) to Long Time Average (LTA) ratios, we show the success of this SVD detection. Unlike the STA/LTA ratio methods, which normally tend to detect the P phase but in many cases cannot distinguish the S arrival, the main advantage of the SVD method is that almost all the P arrivals have an associated S arrival. Moreover, even for short-distance events, in which the S arrivals are masked by the P waves, the SVD algorithm under low-band filters manages to detect those S arrivals.
The method is less consistent for stations located directly on the fault traces, in which the SVD approximation is not always valid; but even in such cases the
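
    The core of the SVD-based polarization analysis can be illustrated on a single three-component window: the leading left singular vector gives the dominant particle-motion direction, and a rectilinearity measure distinguishes P-like linear motion from more elliptical S motion. This is a simplification of Rosenberger's real-time iterative algorithm, with an invented synthetic window:

```python
import numpy as np

def polarization(window):
    """SVD of a (3, n) three-component window: the leading left singular
    vector approximates the dominant particle-motion direction, and the
    rectilinearity (1 - s2/s1) is near 1 for linear, P-like motion."""
    u, s, _ = np.linalg.svd(window - window.mean(axis=1, keepdims=True))
    direction = u[:, 0]
    rectilinearity = 1 - s[1] / s[0]
    return direction, rectilinearity

t = np.linspace(0, 1, 200)
wave = np.sin(2 * np.pi * 10 * t)
rng = np.random.default_rng(1)
# nearly vertical, rectilinear motion (P-like) plus faint noise
window = np.vstack([0.05 * wave, 0.05 * wave, wave]) \
         + 0.01 * rng.standard_normal((3, 200))
direction, rect = polarization(window)
```

A detector can then run this over sliding windows and trigger P or S picks from the rectilinearity and incidence direction instead of a plain STA/LTA amplitude ratio.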

  18. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.

  19. Automatic differentiation bibliography

    SciTech Connect

    Corliss, G.F.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
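The chain-rule propagation described above can be demonstrated with a minimal forward-mode sketch using dual numbers (purely illustrative; production AD tools such as ADIFOR are far more general).

```python
import math

class Dual:
    """Minimal forward-mode automatic differentiation via dual numbers:
    each value carries its derivative, and arithmetic applies the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def sin(x):
    # chain rule: d(sin u) = cos(u) * du
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# derivative of f(x) = x*sin(x) + x at x = 2, with no symbolic manipulation
# and no finite-difference truncation error
x = Dual(2.0, 1.0)
y = x * sin(x) + x
```

The derivative emerges exactly (to machine precision) as a by-product of evaluating the function, which is the sense in which AD is "neither symbolic nor numeric".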

  20. Automatic Imitation

    ERIC Educational Resources Information Center

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  1. Multifractal Analysis and Relevance Vector Machine-Based Automatic Seizure Detection in Intracranial EEG.

    PubMed

    Zhang, Yanli; Zhou, Weidong; Yuan, Shasha

    2015-09-01

    Automatic seizure detection technology is of great significance for long-term electroencephalogram (EEG) monitoring of epilepsy patients. The aim of this work is to develop a seizure detection system with high accuracy. The proposed system was mainly based on multifractal analysis, which describes the local singular behavior of fractal objects and characterizes the multifractal structure using a continuous spectrum. Compared with computing a single fractal dimension, multifractal analysis can provide a better description of the transient behavior of EEG fractal time series during the evolution from the interictal stage to seizures. Thus both interictal EEG and ictal EEG were analyzed by the multifractal formalism, and their differences in multifractal features were used to distinguish the two classes of EEG and detect seizures. In the proposed detection system, eight features (α0, α(min), α(max), Δα, f(α(min)), f(α(max)), Δf and R) were extracted from the multifractal spectra of the preprocessed EEG to construct feature vectors. Subsequently, a relevance vector machine (RVM) was applied for EEG pattern classification, and a series of post-processing operations were used to increase the accuracy and reduce false detections. Both epoch-based and event-based evaluation methods were performed to appraise the system's performance on the EEG recordings of 21 patients in the Freiburg database. An epoch-based sensitivity of 92.94% and specificity of 97.47% were achieved, and the proposed system obtained a sensitivity of 92.06% with a false detection rate of 0.34/h in the event-based performance assessment. PMID:25986754
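Most of the listed features are simple geometric descriptors of the singularity spectrum f(α). Given a precomputed spectrum, they can be extracted as below; this is a sketch, and the asymmetry measure R is omitted because its definition is not given in the abstract.

```python
import numpy as np

def spectrum_features(alpha, f):
    """Extract the geometric multifractal-spectrum features named in the
    abstract from a precomputed singularity spectrum f(alpha)."""
    alpha = np.asarray(alpha, dtype=float)
    f = np.asarray(f, dtype=float)
    i0 = np.argmax(f)                                # apex of the spectrum
    feats = {
        "alpha0":      alpha[i0],                    # position of max f
        "alpha_min":   alpha.min(),
        "alpha_max":   alpha.max(),
        "delta_alpha": alpha.max() - alpha.min(),    # spectrum width
        "f_alpha_min": f[np.argmin(alpha)],
        "f_alpha_max": f[np.argmax(alpha)],
    }
    feats["delta_f"] = feats["f_alpha_max"] - feats["f_alpha_min"]
    return feats
```

A wider spectrum (larger Δα) indicates stronger multifractality, which is the kind of contrast between interictal and ictal EEG the system exploits.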

  2. Digital automatic gain control

    NASA Technical Reports Server (NTRS)

    Uzdy, Z.

    1980-01-01

    Performance analysis, used to evaluate fitness of several circuits for digital automatic gain control (AGC), indicates that digital integrator employing coherent amplitude detector (CAD) is best device suited for application. Circuit reduces gain error to half that of conventional analog AGC while making it possible to automatically modify response of receiver to match incoming signal conditions.
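The integrator-plus-detector structure can be sketched as a generic digital AGC loop. This is a hypothetical illustration: the target level and loop gain are assumed parameters, and the coherent amplitude detector is replaced by a simple magnitude detector.

```python
import numpy as np

def digital_agc(x, target=1.0, mu=0.05):
    """Generic digital AGC: an integrator accumulates the amplitude error so
    the detected output level converges to the target."""
    g = 0.0                                # log-domain gain (integrator state)
    y = np.zeros_like(x, dtype=float)
    for i, s in enumerate(x):
        y[i] = s * np.exp(g)               # apply current gain
        err = target - abs(y[i])           # amplitude detector + error
        g += mu * err                      # digital integrator
    return y
```

For a constant low-level input the loop drives the output magnitude to the target, analogous to the gain-error reduction the record describes.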

  3. Implementation of terbium-sensitized luminescence in sequential-injection analysis for automatic analysis of orbifloxacin.

    PubMed

    Llorent-Martínez, E J; Ortega-Barrales, P; Molina-Díaz, A; Ruiz-Medina, A

    2008-12-01

    Orbifloxacin (ORBI) is a third-generation fluoroquinolone developed exclusively for use in veterinary medicine, mainly in companion animals. This antimicrobial agent has bactericidal activity against numerous gram-negative and gram-positive bacteria. A few chromatographic methods for its analysis have been described in the scientific literature. Here, coupling of sequential-injection analysis and solid-phase spectroscopy is described in order to develop, for the first time, a terbium-sensitized luminescent optosensor for analysis of ORBI. The cationic resin Sephadex-CM C-25 was used as solid support and measurements were made at 275/545 nm. The system had a linear dynamic range of 10-150 ng mL(-1), with a detection limit of 3.3 ng mL(-1) and an R.S.D. below 3% (n = 10). The analyte was satisfactorily determined in veterinary drugs and dog and horse urine. PMID:18958455

  4. Quantification of coronary artery plaque using 64-slice dual-source CT: comparison of semi-automatic and automatic computer-aided analysis based on intravascular ultrasonography as the gold standard.

    PubMed

    Kim, Young Jun; Jin, Gong Yong; Kim, Eun Young; Han, Young Min; Chae, Jei Keon; Lee, Sang Rok; Kwon, Keun Sang

    2013-12-01

    We evaluated the feasibility of automatic computer-aided analysis (CAA) compared with semi-automatic CAA for differentiating lipid-rich from fibrous plaques based on coronary CT angiography (CCTA) imaging. Seventy-four coronary plaques in 57 patients were evaluated by CCTA using 64-slice dual-source CT. Quantitative analysis of coronary artery plaques was performed by measuring the relative volumes (low, medium, and calcified) of plaque components using automatic CAA and by measuring mean CT density using semi-automatic CAA. We compared the two plaque measurement methods for lipid-rich and fibrous plaques using Pearson's correlation. Intravascular ultrasonography was used as the gold standard for assessment of plaques. Mean CT density of plaques tended to increase in the order of lipid [36 ± 19 Hounsfield unit (HU)], fibrous (106 ± 34 HU), and then calcified plaques (882 ± 296 HU). The mean relative volumes of 'low' components measured by automatic CAA were 13.8 ± 4.6, 7.9 ± 6.7, and 3.5 ± 3.0 % for lipid, fibrous, and calcified plaques, respectively (r = -0.348, P = 0.022). The mean relative volumes of 'medium' components on automatic CAA were 12.9 ± 4.1, 15.7 ± 9.6, and 5.6 ± 4.8 % for lipid, fibrous, and calcified plaques, respectively (r = -0.385, P = 0.011). The mean relative volumes of low and medium components within plaques significantly correlated with the types of plaques. Plaque analysis using automatic CAA has the potential to differentiate lipid from fibrous plaques based on measurement of the relative volume percentages of the low and medium components. PMID:24293043
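Using the mean densities reported above, a minimal nearest-mean rule illustrates how mean CT density separates the plaque types. This is an illustration only, not the paper's semi-automatic CAA software; the decision rule is an assumption.

```python
# Class means taken from the abstract: lipid 36 HU, fibrous 106 HU,
# calcified 882 HU.  The nearest-mean rule itself is a hypothetical sketch.
CLASS_MEAN_HU = {"lipid": 36.0, "fibrous": 106.0, "calcified": 882.0}

def classify_plaque(mean_hu):
    """Assign a plaque to the class whose reported mean HU is closest."""
    return min(CLASS_MEAN_HU, key=lambda k: abs(CLASS_MEAN_HU[k] - mean_hu))
```

Because the reported class means are widely separated relative to their standard deviations (except lipid vs fibrous, where the ±19 and ±34 HU spreads overlap), such a density rule works best for calcified plaque and is weakest exactly where the paper focuses: lipid versus fibrous.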

  5. Automatic quantitative analysis of experimental primary and secondary retinal neurodegeneration: implications for optic neuropathies

    PubMed Central

    Davis, B M; Guo, L; Brenton, J; Langley, L; Normando, E M; Cordeiro, M F

    2016-01-01

    Secondary neurodegeneration is thought to play an important role in the pathology of neurodegenerative disease, which potential therapies may target. However, the quantitative assessment of the degree of secondary neurodegeneration is difficult. The present study describes a novel algorithm from which estimates of primary and secondary degeneration are computed using well-established rodent models of partial optic nerve transection (pONT) and ocular hypertension (OHT). Brn3a-labelled retinal ganglion cells (RGCs) were identified in whole-retinal mounts from which RGC density, nearest neighbour distances and regularity indices were determined. The spatial distribution and rate of RGC loss were assessed and the percentage of primary and secondary degeneration in each non-overlapping segment was calculated. Mean RGC number (82 592±681) and RGC density (1695±23.3 RGC/mm2) in naïve eyes were comparable with previous studies, with an average decline in RGC density of 71±17 and 23±5% over the time course of pONT and OHT models, respectively. Spatial analysis revealed greatest RGC loss in the superior and central retina in pONT, but significant RGC loss in the inferior retina from 3 days post model induction. In comparison, there was no significant difference between superior and inferior retina after OHT induction, and RGC loss occurred mainly along the superior/inferior axis (~30%) versus the nasal–temporal axis (~15%). Intriguingly, a significant loss of RGCs was also observed in contralateral eyes in experimental OHT. In conclusion, a novel algorithm to automatically segment Brn3a-labelled retinal whole-mounts into non-overlapping segments is described, which enables automated spatial and temporal segmentation of RGCs, revealing heterogeneity in the spatial distribution of primary and secondary degenerative processes. This method provides an attractive means to rapidly determine the efficacy of neuroprotective therapies with implications for any
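The nearest-neighbour distances and regularity index mentioned above can be computed from RGC coordinates roughly as follows. This is a brute-force sketch; the definition regularity index = mean NN distance / SD is a common convention in retinal-mosaic studies and is assumed here, the paper's exact definition may differ.

```python
import numpy as np

def regularity_index(points):
    """Nearest-neighbour regularity index of a 2-D point pattern:
    mean NN distance divided by its standard deviation.  High values
    indicate a regular mosaic, low values a random (Poisson-like) pattern."""
    pts = np.asarray(points, dtype=float)
    # pairwise distance matrix (brute force; fine for a few thousand cells)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude self-distances
    nn = d.min(axis=1)                     # nearest-neighbour distances
    return nn.mean() / nn.std()
```

A healthy, regularly spaced RGC mosaic scores far higher than the same number of randomly scattered cells, which is why the index tracks degeneration.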

  6. Automatic identification of responses from porphyry intrusive systems within magnetic data using image analysis

    NASA Astrophysics Data System (ADS)

    Holden, Eun-Jung; Fu, Shih Ching; Kovesi, Peter; Dentith, Michael; Bourne, Barry; Hope, Matthew

    2011-08-01

    Direct targeting of mineral deposits using magnetic data may be facilitated by hydrothermal alteration associated with the mineralising event if the alteration changes the magnetic properties of the host rock. Hydrothermal alteration associated with porphyry-style mineralisation typically comprises concentric near-circular alteration zones surrounding a roughly circular central intrusion. The intrusion itself and the proximal alteration zone are usually associated with positive magnetic anomalies whilst the outer alteration zones are much less magnetic. Because the country rocks are usually magnetic, this pattern of alteration produces a central magnetic 'high' surrounded by an annular magnetic 'low'. This paper presents an automatic image analysis system for gridded data that provides an efficient, accurate and non-subjective way to seek the magnetic response of an idealised porphyry mineralising system within magnetic datasets. The method finds circular anomalies that are associated with the central intrusion and inner alteration zone of the porphyry system using a circular feature detection method called the radial symmetry transform. Next, their boundaries are traced using deformable splines that are drawn to the locations of maximum contrast between the amplitudes of the central 'high' and surrounding area of lower magnetisation. Experiments were conducted on magnetic data from Reko Diq, Pakistan; a region known to contain numerous occurrences of porphyry-style mineralisation. The predicted locations of porphyry systems closely match the locations of the known deposits in this region. This system is suitable as an initial screening tool for large geophysical datasets, therefore reducing the time and cost imposed by manual data inspection in the exploration process. The same principles can be applied to the search for circular magnetic responses with different amplitude characteristics.

  7. Image structural analysis in the tasks of automatic navigation of unmanned vehicles and inspection of Earth surface

    NASA Astrophysics Data System (ADS)

    Lutsiv, Vadim; Malyshev, Igor

    2013-10-01

    The automatic analysis of terrain images has been an urgent task for several decades. On the one hand, such analysis is the basis of automatic navigation of unmanned vehicles. On the other hand, the amount of information transferred to Earth by modern video sensors keeps increasing, so preliminary classification of these data by the onboard computer is becoming essential. We developed an object-independent approach to the structural analysis of images. While creating the methods of image structural description, we abstracted away as far as possible from the particular peculiarities of scenes. Only the most general limitations were taken into account, derived from the laws of organization of the observable environment and from the properties of image formation systems. The practical application of this theoretical approach enables reliable matching of aerospace photographs acquired from differing aspect angles, at different times of day and in different seasons, by sensors of differing types. The aerospace photographs can even be matched with geographic maps. The developed approach enabled solving the tasks of automatic navigation of unmanned vehicles. The signs of changes and catastrophes can be detected by matching and comparing aerospace photographs acquired at different times. We present the theoretical proofs of the chosen strategy of structural description and matching of images. Several examples of matching acquired images with template pictures and maps of terrain are shown within the framework of navigation of unmanned vehicles and detection of signs of disasters.

  8. [Development of a Japanese version of the Valuation of Life (VOL) scale].

    PubMed

    Nakagawa, Takeshi; Gondo, Yasuyuki; Masui, Yukie; Ishioka, Yoshiko; Tabuchi, Megumi; Kamide, Kei; Ikebe, Kazunori; Arai, Yasumichi; Takahashi, Ryutaro

    2013-04-01

    This study developed a Japanese version of the Valuation of Life (VOL) scale, to measure psychological wellbeing among older adults. In Analysis 1, we conducted a factor analysis of 13 items, and identified two factors: positive VOL and spiritual well-being. These factors had adequate degrees of internal consistency, and were related to positive mental health. In Analysis 2, we examined sociodemographic, social, and health predictors for VOL. The role of social factors was stronger than the role of health factors, and spiritual well-being was more related to moral or religious activities than positive VOL. These results suggest that predictors for VOL vary by culture. In Analysis 3, we investigated the relationship between VOL and desired years of life. Positive VOL significantly predicted more desired years of life, whereas spiritual well-being did not. Positive VOL had acceptable reliability and validity. Future research is required to investigate whether VOL predicts survival duration or end-of-life decisions. PMID:23705232

  9. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    PubMed Central

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-o

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using a Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is automatically and objectively made by spectral matching comparison of the MCR decomposed Raman spectra and the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples, 10 OSCC and 10 normal tissues from the same 10 patients, 3 OSCC and 1 normal tissues from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how to define positivity) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin. PMID:26806007

  10. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    NASA Astrophysics Data System (ADS)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the location announced by environmental authorities and companies and the automatic location, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven could be automatically detected. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.
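The trigger underlying most automatic detection systems of this kind is the STA/LTA ratio, which can be sketched in a few lines (window lengths and trigger threshold are assumptions here; in practice they are tuned per station).

```python
import numpy as np

def sta_lta(x, nsta, nlta):
    """Classic STA/LTA ratio on signal energy, computed with cumulative sums;
    a detection is declared where the ratio exceeds a chosen threshold."""
    e = np.asarray(x, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(e)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta   # short-term average
    lta = (csum[nlta:] - csum[:-nlta]) / nlta   # long-term average
    n = min(len(sta), len(lta))
    # align both windows so they end on the same sample
    return sta[len(sta) - n:] / (lta[len(lta) - n:] + 1e-12)
```

A sudden amplitude increase raises the short window long before the long window catches up, producing the ratio peak used for triggering.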

  11. A system for automatic analysis of blood pressure data for digital computer entry

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1972-01-01

    Operation of automatic blood pressure data system is described. Analog blood pressure signal is analyzed by three separate circuits, systolic, diastolic, and cycle defect. Digital computer output is displayed on teletype paper tape punch and video screen. Illustration of system is included.

  12. AUTOMATIC ANALYSIS OF DISSOLVED METAL POLLUTANTS IN WATER BY ENERGY DISPERSIVE X-RAY SPECTROSCOPY

    EPA Science Inventory

    An automated system for the quantitative determination of dissolved metals such as Fe, Cu, Zn, Ca, Co, Ni, Cr, Hg, Se, and Pb in water is described. The system collects a water sample, preconcentrates the dissolved metals with ion-exchange paper automatically in a sample collecti...

  13. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    ERIC Educational Resources Information Center

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…

  14. Analysis of operators for detection of corners set in automatic image matching

    NASA Astrophysics Data System (ADS)

    Zawieska, D.

    2011-12-01

    Reconstruction of three-dimensional models of objects from images has been a long-lasting research topic in photogrammetry and computer vision. The demand for 3D models is continuously increasing in such fields as cultural heritage, computer graphics, robotics and many others. The number and types of features of a 3D model are highly dependent on the use of the models, and can be very variable in terms of accuracy and time for their creation. In recent years, both the computer vision and photogrammetric communities have approached the reconstruction problems by using different methods to solve the same tasks, such as camera calibration, orientation, object reconstruction and modelling. The terminology used for addressing a particular task in the two disciplines is sometimes diverse. On the other hand, the integration of methods and algorithms coming from them can be used to improve both. The image-based modelling of an object has been defined as a complete process that starts with image acquisition and ends with an interactive 3D virtual model. The photogrammetric approach to creating 3D models involves the following steps: image pre-processing, camera calibration, orientation of the image network, image scanning for point detection, surface measurement and point triangulation, blunder detection and statistical filtering, mesh generation and texturing, visualization and analysis. Currently there is no single software package available that allows each of those steps to be executed within the same environment. For high-accuracy 3D object reconstruction, operators are required as a preliminary step in the surface measurement process, to find the features that serve as suitable points when matching across multiple images. Operators are the algorithms which detect the features of interest in an image, such as corners, edges or regions.
This paper reports on the first phase of research on the generation of high accuracy 3D model measurement and modelling, focusing
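One of the classic corner operators in this family is the Harris detector. The sketch below computes its response map with a simple box-filtered structure tensor; real implementations typically use Gaussian windows, and the parameter k is an assumed conventional value.

```python
import numpy as np

def box_blur(a, r=1):
    # (2r+1) x (2r+1) box average via edge padding and shifted sums
    p = np.pad(a, r, mode="edge")
    out = np.zeros_like(a, dtype=float)
    h, w = a.shape
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += p[dy:dy + h, dx:dx + w]
    return out / (2 * r + 1) ** 2

def harris_response(img, k=0.05):
    """Harris corner response R = det(M) - k*trace(M)^2, where M is the
    locally averaged structure tensor.  Corners give strong positive R,
    straight edges negative R, flat regions R near zero."""
    img = np.asarray(img, dtype=float)
    iy, ix = np.gradient(img)
    sxx = box_blur(ix * ix)
    syy = box_blur(iy * iy)
    sxy = box_blur(ix * iy)
    return sxx * syy - sxy * sxy - k * (sxx + syy) ** 2
```

Thresholding and non-maximum suppression of R then yield the corner set passed on to matching.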

  15. Texture analysis of the 3D collagen network and automatic classification of the physiology of articular cartilage.

    PubMed

    Duan, Xiaojuan; Wu, Jianping; Swift, Benjamin; Kirk, Thomas Brett

    2015-07-01

    A close relationship has been found between the 3D collagen structure and physiological condition of articular cartilage (AC). Studying the 3D collagen network in AC offers a way to determine the condition of the cartilage. However, traditional qualitative studies are time consuming and subjective. This study aims to develop a computer vision-based classifier to automatically determine the condition of AC tissue based on the structural characteristics of the collagen network. Texture analysis was applied to quantitatively characterise the 3D collagen structure in normal (International Cartilage Repair Society, ICRS, grade 0), aged (ICRS grade 1) and osteoarthritic cartilages (ICRS grade 2). Principal component techniques and linear discriminant analysis were then used to classify the microstructural characteristics of the 3D collagen meshwork and the condition of the AC. The 3D collagen meshwork in the three physiological condition groups displayed distinctive characteristics. Texture analysis indicated a significant difference in the mean texture parameters of the 3D collagen network between groups. The principal component and linear discriminant analysis of the texture data allowed for the development of a classifier for identifying the physiological status of the AC with an expected prediction error of 4.23%. An automatic image analysis classifier has been developed to predict the physiological condition of AC (from ICRS grade 0 to 2) based on texture data from the 3D collagen network in the tissue. PMID:24428581
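Texture quantification of this kind typically starts from grey-level co-occurrence statistics. The sketch below computes a co-occurrence matrix and two classic features, contrast and energy; this is a generic illustration, and the paper's actual texture parameters may differ.

```python
import numpy as np

def glcm_features(img, levels=8, dx=1, dy=0):
    """Grey-level co-occurrence matrix for offset (dy, dx) and two classic
    texture features.  Pixel values are assumed to lie in [0, 1]."""
    q = np.minimum((np.asarray(img, dtype=float) * levels).astype(int),
                   levels - 1)                     # quantise to grey levels
    a = q[:q.shape[0] - dy, :q.shape[1] - dx].ravel()
    b = q[dy:, dx:].ravel()                        # offset neighbours
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1.0)                   # count co-occurrences
    glcm /= glcm.sum()                             # normalise to probabilities
    i, j = np.indices(glcm.shape)
    contrast = ((i - j) ** 2 * glcm).sum()         # local intensity variation
    energy = (glcm ** 2).sum()                     # uniformity of the texture
    return contrast, energy
```

Feature vectors built from several offsets and statistics like these are what PCA and LDA then project and classify.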

  16. Design and implementation of a context-sensitive, flow-sensitive activity analysis algorithm for automatic differentiation.

    SciTech Connect

    Shin, J.; Malusare, P.; Hovland, P. D.; Mathematics and Computer Science

    2008-01-01

    Automatic differentiation (AD) has been expanding its role in scientific computing. While several AD tools have been actively developed and used, a wide range of problems remain to be solved. Activity analysis allows AD tools to generate derivative code for fewer variables, leading to a faster run time of the output code. This paper describes a new context-sensitive, flow-sensitive (CSFS) activity analysis, which is developed by extending an existing context-sensitive, flow-insensitive (CSFI) activity analysis. Our experiments with eight benchmarks show that the new CSFS activity analysis is more than 27 times slower but reduces 8 overestimations for the MIT General Circulation Model (MITgcm) and 1 for an ODE solver (c2) compared with the existing CSFI activity analysis implementation. Although the number of reduced overestimations looks small, the additionally identified passive variables may significantly reduce tedious human effort in maintaining a large code base such as MITgcm.

  17. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost effective computer aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.

  18. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.

  19. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    PubMed

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals. PMID:25571461
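A common way to derive a sympathovagal balance index from heart rate variability is the LF/HF spectral power ratio of the RR-interval series; the sketch below uses that definition as an assumption, since the paper's exact SVI formula is not given in the abstract.

```python
import numpy as np

def sympathovagal_index(rr_ms, fs=4.0):
    """LF/HF spectral power ratio of an evenly resampled RR-interval series.
    rr_ms: consecutive RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                 # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)    # uniform time grid
    tach = np.interp(grid, t, rr)              # evenly sampled tachogram
    tach = tach - tach.mean()
    psd = np.abs(np.fft.rfft(tach)) ** 2
    f = np.fft.rfftfreq(len(tach), d=1.0 / fs)
    lf = psd[(f >= 0.04) & (f < 0.15)].sum()   # low-frequency power
    hf = psd[(f >= 0.15) & (f < 0.40)].sum()   # high-frequency power
    return lf / hf
```

Higher values indicate sympathetic (stress) dominance, lower values parasympathetic dominance, which is how the system scores the listener's response to each track.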

  20. Design and analysis of an automatic method of measuring silicon-controlled-rectifier holding current

    NASA Technical Reports Server (NTRS)

    Maslowski, E. A.

    1971-01-01

    The design of an automated SCR holding-current measurement system is described. The circuits used in the measurement system were designed to meet the major requirements of automatic data acquisition, reliability, and repeatability. Performance data are presented and compared with calibration data. The data verified the accuracy of the measurement system. Data taken over a 48-hr period showed that the measurement system operated satisfactorily and met all the design requirements.

  1. Sensitivity analysis using parallel ODE solvers and automatic differentiation in C : SensPVODE and ADIC.

    SciTech Connect

    Lee, S. L.; Hovland, P. D.

    2000-11-01

    PVODE is a high-performance ordinary differential equation solver for the types of initial value problems (IVPs) that arise in large-scale computational simulations. Often, one wants to compute sensitivities with respect to certain parameters in the IVP. We discuss the use of automatic differentiation (AD) to compute these sensitivities in the context of PVODE. Results on a simple test problem indicate that the use of AD-generated derivative code can reduce the time to solution over finite difference approximations.
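The sensitivity system that AD helps construct can be shown on a scalar IVP: for dy/dt = f(y, p), the sensitivity s = ∂y/∂p obeys ds/dt = (∂f/∂y)s + ∂f/∂p. The sketch below integrates both for dy/dt = -p·y with explicit Euler, purely for illustration; PVODE itself uses far more capable integrators.

```python
def forward_sensitivity(p, y0=1.0, t_end=1.0, n=10000):
    """Forward sensitivity analysis for dy/dt = -p*y: the state y and its
    parameter sensitivity s = dy/dp are advanced together, with
    ds/dt = -y - p*s (here f_y = -p and f_p = -y)."""
    h = t_end / n
    y, s = y0, 0.0
    for _ in range(n):
        # simultaneous explicit Euler step for state and sensitivity
        y, s = y + h * (-p * y), s + h * (-y - p * s)
    return y, s
```

Against the analytic solution y = y0·e^(-pt), s = -t·y0·e^(-pt), the Euler result is accurate to O(h), without the step-size tuning that finite-difference sensitivities require.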

  2. Sensitivity analysis using parallel ODE solvers and automatic differentiation in C: sensPVODE and ADIC

    SciTech Connect

    Lee, S L; Hovland, P D

    2000-09-15

    PVODE is a high-performance ordinary differential equation solver for the types of initial value problems (IVPs) that arise in large-scale computational simulations. Often, one wants to compute sensitivities with respect to certain parameters in the IVP. They discuss the use of automatic differentiation (AD) to compute these sensitivities in the context of PVODE. Results on a simple test problem indicate that the use of AD-generated derivative code can reduce the time to solution over finite difference approximations.

  3. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    NASA Astrophysics Data System (ADS)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    results of these semi-automatically determined seasonal fan count evolutions for Inca City, Ithaca and Manhattan ROIs, compare these evolutionary patterns with each other and with surface reflectance evolutions of both HiRISE and CRISM for the same locations. References: Aye, K.-M. et al. (2010), LPSC 2010, 2707; Hansen, C. et al. (2010), Icarus, 205(1), 283-295; Kieffer, H.H. (2007), JGR 112; Portyankina, G. et al. (2010), Icarus, 205(1), 311-320; Thomas, N. et al. (2009), Vol. 4, EPSC2009-478

  4. Automatic Detection of Laryngeal Pathology on Sustained Vowels Using Short-Term Cepstral Parameters: Analysis of Performance and Theoretical Justification

    NASA Astrophysics Data System (ADS)

    Fraile, Rubén; Godino-Llorente, Juan Ignacio; Sáenz-Lechón, Nicolás; Osma-Ruiz, Víctor; Gómez-Vilda, Pedro

    The majority of speech signal analysis procedures for automatic detection of laryngeal pathologies rely on parameters extracted from time-domain processing. Moreover, calculation of these parameters often requires prior pitch period estimation; therefore, their validity heavily depends on the robustness of pitch detection. Within this paper, an alternative approach based on cepstral-domain processing is presented which has the advantage of not requiring pitch estimation, thus providing a gain in both simplicity and robustness. While the proposed scheme is similar to solutions based on Mel-frequency cepstral parameters, already present in the literature, it has an easier physical interpretation while achieving similar performance standards.
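The cepstral parameters in question derive from the real cepstrum, which is computed from the log-magnitude spectrum and needs no pitch estimate. A minimal sketch (the Hamming window and log floor are conventional choices assumed here, not necessarily the authors'):

```python
import numpy as np

def real_cepstrum(frame):
    """Real cepstrum of a speech frame: c = IDFT(log |DFT(x)|).
    Low-quefrency coefficients describe the spectral envelope (vocal tract),
    independently of any pitch period estimate."""
    x = np.asarray(frame, dtype=float) * np.hamming(len(frame))
    spec = np.abs(np.fft.rfft(x))
    return np.fft.irfft(np.log(spec + 1e-12))   # floor avoids log(0)
```

Truncating the output to its first few coefficients yields the kind of compact, pitch-free parameter vector the paper feeds to its detector.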

  5. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    SciTech Connect

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  6. Texture analysis of automatic graph cuts segmentations for detection of lung cancer recurrence after stereotactic radiotherapy

    NASA Astrophysics Data System (ADS)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2015-03-01

    Stereotactic ablative radiotherapy (SABR) is a treatment for early-stage lung cancer with local control rates comparable to surgery. After SABR, benign radiation-induced lung injury (RILI) results in tumour-mimicking changes on computed tomography (CT) imaging. Distinguishing recurrence from RILI is a critical clinical decision determining the need for potentially life-saving salvage therapies whose high risks in this population dictate their use only for true recurrences. Current approaches do not reliably detect recurrence within a year post-SABR. We measured the detection accuracy of texture features within automatically determined regions of interest, with the only operator input being the single line segment measuring tumour diameter, normally taken during the clinical workflow. Our leave-one-out cross validation on images taken 2-5 months post-SABR showed robustness of the entropy measure, with classification error of 26% and area under the receiver operating characteristic curve (AUC) of 0.77 using automatic segmentation; the results using manual segmentation were 24% and 0.75, respectively. AUCs for this feature increased to 0.82 and 0.93 at 8-14 months and 14-20 months post-SABR, respectively, suggesting even better performance nearer to the date of clinical diagnosis of recurrence; thus this system could also be used to support and reinforce the physician's decision at that time. Based on our ongoing validation of this automatic approach on a larger sample, we aim to develop a computer-aided diagnosis system which will support the physician's decision to apply timely salvage therapies and prevent patients with RILI from undergoing invasive and risky procedures.
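
    The entropy measure highlighted above is, in its simplest first-order form, the Shannon entropy of the grey-level histogram inside the region of interest. A hedged sketch (the bin count and names are assumptions, not the paper's settings):

```python
import numpy as np

def histogram_entropy(roi, bins=64):
    """First-order (Shannon) entropy of the grey-level histogram, in bits."""
    hist, _ = np.histogram(roi, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                       # ignore empty bins
    return float(-np.sum(p * np.log2(p)))
```

    A perfectly uniform ROI has zero entropy; heterogeneous tumour-like texture scores higher, which is what makes this feature a candidate discriminator between RILI and recurrence.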

  7. Finite Element Analysis of Osteosynthesis Screw Fixation in the Bone Stock: An Appropriate Method for Automatic Screw Modelling

    PubMed Central

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation for individual cases and increases computational costs. To cope with these requirements, different methods for numerical screw modelling have been investigated to improve its application diversity. As an example, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e., without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  9. A fast automatic plate changer for the analysis of nuclear emulsions

    NASA Astrophysics Data System (ADS)

    Balestra, S.; Bertolin, A.; Bozza, C.; Calligola, P.; Cerroni, R.; D'Ambrosio, N.; Degli Esposti, L.; De Lellis, G.; De Serio, M.; Di Capua, F.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Dusini, S.; Esposito, L. S.; Fini, R. A.; Giacomelli, G.; Giacomelli, R.; Grella, G.; Ieva, M.; Kose, U.; Longhin, A.; Mandrioli, G.; Mauri, N.; Medinaceli, E.; Monacelli, P.; Muciaccia, M. T.; Pasqualini, L.; Pastore, A.; Patrizii, L.; Pozzato, M.; Pupilli, F.; Rescigno, R.; Rosa, G.; Ruggieri, A.; Russo, A.; Sahnoun, Z.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Stellacci, S. M.; Strolin, P.; Tenti, M.; Tioukov, V.; Togo, V.; Valieri, C.

    2013-07-01

    This paper describes the design and performance of a computer controlled emulsion Plate Changer for the automatic placement and removal of nuclear emulsion films for the European Scanning System microscopes. The Plate Changer is used for mass scanning and measurement of the emulsions of the OPERA neutrino oscillation experiment at the Gran Sasso lab on the CNGS neutrino beam. Unlike other systems it works with both dry and oil objectives. The film changing takes less than 20 s and the accuracy on the positioning of the emulsion films is about 10 μm. The final accuracy in retrieving track coordinates after fiducial marks measurement is better than 1 μm.

  10. Journalism Abstracts. Vol. 15.

    ERIC Educational Resources Information Center

    Popovich, Mark N., Ed.

    This book, the fifteenth volume of an annual publication, contains 373 abstracts of 52 doctoral and 321 master's theses from 50 colleges and universities. The abstracts are arranged alphabetically by author, with the doctoral dissertations appearing first. These cover such topics as advertising, audience analysis, content analysis of news issues…

  11. Automatic Screening and Grading of Age-Related Macular Degeneration from Texture Analysis of Fundus Images.

    PubMed

    Phan, Thanh Vân; Seoud, Lama; Chakor, Hadi; Cheriet, Farida

    2016-01-01

    Age-related macular degeneration (AMD) is a disease which causes visual impairment and irreversible blindness in the elderly. In this paper, an automatic classification method for AMD is proposed to perform robust and reproducible assessments in a telemedicine context. First, a study was carried out to highlight the most relevant features for AMD characterization based on texture, color, and visual context in fundus images. A support vector machine and a random forest were used to classify images according to the different AMD stages following the AREDS protocol and to evaluate the features' relevance. Experiments were conducted on a database of 279 fundus images coming from a telemedicine platform. The results demonstrate that multiresolution local binary patterns are the most relevant features for AMD classification, regardless of the classifier used. Depending on the classification task, our method achieves promising performance with areas under the ROC curve between 0.739 and 0.874 for screening and between 0.469 and 0.685 for grading. Moreover, the proposed automatic AMD classification system is robust with respect to image quality. PMID:27190636
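
    Local binary patterns are computed per pixel by thresholding a neighbourhood against its centre and histogramming the resulting codes. A minimal single-resolution sketch in plain NumPy (the paper's exact LBP variant, radii, and parameters are not specified here):

```python
import numpy as np

def lbp8(img):
    """Basic 8-neighbour local binary pattern codes for interior pixels."""
    c = img[1:-1, 1:-1]
    # neighbours enumerated clockwise from top-left; bit weights are powers of two
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for k, (dy, dx) in enumerate(offsets):
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((n >= c).astype(np.uint8) << k)
    return codes

def lbp_histogram(img, bins=256):
    """Normalised LBP code histogram, usable as a texture feature vector."""
    h, _ = np.histogram(lbp8(img), bins=bins, range=(0, bins))
    return h / h.sum()
```

    The normalised histogram is the feature vector that would then be fed to a classifier such as an SVM or random forest.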

  13. A clinically viable capsule endoscopy video analysis platform for automatic bleeding detection

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Xie, Jean; Mui, Peter; Leighton, Jonathan A.; Pasha, Shabana; Rentz, Lauri; Abedi, Mahmood

    2013-02-01

    In this paper, we present a novel and clinically valuable software platform for automatic bleeding detection in the gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos of the GI tract run about 8 hours and are manually reviewed by physicians to locate diseases such as bleeding and polyps. As a result, the process is time-consuming and prone to missed findings. Although researchers have made efforts to automate this process, no clinically acceptable software is available on the market today. Working with our collaborators, we have developed a clinically viable software platform called GISentinel for fully automated GI tract bleeding detection and classification. Major functional modules of the software include: an innovative graph-based NCut segmentation algorithm; a unique feature selection and validation method (e.g. illumination-invariant features, color-independent features, and symmetrical texture features); and cascade SVM classification for handling various GI tract scenes (e.g. normal tissue, food particles, bubbles, fluid, and specular reflection). Initial evaluation of the software has shown a zero miss rate for bleeding instances and a 4.03% false alarm rate. This work is part of our 2D/3D-based GI tract disease detection software platform. While the overall framework is designed for intelligent detection and classification of major GI tract diseases such as bleeding, ulcers, and polyps from CE videos, this paper focuses on the automatic bleeding detection module.

  14. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI.

    PubMed

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z; Stone, Maureen; Prince, Jerry L

    2014-12-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average Dice similarity coefficient (DSC) of 0.92 with less segmented volume variability between time frames than in manual segmentations. PMID:25155697
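
    The role of propagated seeds can be illustrated with a much simpler stand-in: seeded region growing, where seed labels spread to connected pixels of similar intensity. The published method instead uses random walker segmentation, which solves a diffusion problem on the image graph; the sketch below only conveys how seed positions constrain the final labelling. All names and the tolerance value are illustrative.

```python
from collections import deque

import numpy as np

def grow_from_seeds(img, seeds, tol=10):
    """Propagate integer seed labels to connected pixels whose intensity is
    within `tol` of the seed pixel (a simple stand-in for random-walker
    label spreading)."""
    labels = np.zeros(img.shape, dtype=int)
    q = deque()
    for (y, x), lab in seeds.items():
        labels[y, x] = lab
        q.append((y, x, img[y, x]))
    while q:
        y, x, ref = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                    and labels[ny, nx] == 0
                    and abs(int(img[ny, nx]) - int(ref)) <= tol):
                labels[ny, nx] = labels[y, x]
                q.append((ny, nx, ref))
    return labels

# Two-region toy image with one seed per region
img = np.zeros((4, 6), dtype=np.uint8)
img[:, 3:] = 100
labels = grow_from_seeds(img, {(0, 0): 1, (0, 5): 2})
```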

  15. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    NASA Astrophysics Data System (ADS)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall of 92.1% and an average precision of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in mouse primary motor cortex, and find significant cellular density variations in different layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
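
    Steps ii) and iii) of the pipeline can be sketched with standard tools: threshold to a binary image, label connected components, and take each component's centre of mass as a cell centroid. This is a hedged illustration, not the authors' code; the threshold is an assumed parameter.

```python
import numpy as np
from scipy import ndimage

def detect_centroids(img, threshold):
    """Binarise, label connected components, and return their centroids --
    a minimal version of the binarization and centroid-extraction steps."""
    binary = img > threshold
    labeled, n = ndimage.label(binary)
    centroids = ndimage.center_of_mass(binary, labeled, range(1, n + 1))
    return n, centroids

# Toy image with two well-separated "cells"
img = np.zeros((10, 10))
img[2:4, 2:4] = 1.0
img[7:9, 7:9] = 1.0
n, cents = detect_centroids(img, 0.5)
```

    Laminar density estimation (step iv) would then follow by binning these centroids along the depth axis.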

  16. Automatic Prediction of Cardiovascular and Cerebrovascular Events Using Heart Rate Variability Analysis

    PubMed Central

    Melillo, Paolo; Izzo, Raffaele; Orrico, Ada; Scala, Paolo; Attanasio, Marcella; Mirra, Marco; De Luca, Nicola; Pecchia, Leandro

    2015-01-01

    Background: There is consensus that Heart Rate Variability is associated with the risk of vascular events. However, its predictive value for vascular events is not completely clear. The aim of this study is to develop novel predictive models based on data-mining algorithms to provide an automatic risk stratification tool for hypertensive patients. Methods: A database of 139 Holter recordings with clinical data of hypertensive patients followed up for at least 12 months was collected ad hoc. Subjects who experienced a vascular event (i.e., myocardial infarction, stroke, syncopal event) were considered high-risk subjects. Several data-mining algorithms (such as support vector machine, tree-based classifiers, artificial neural network) were used to develop automatic classifiers, and their accuracy was tested by assessing the receiver-operating characteristic curve. Moreover, we tested the echographic parameters, which have been shown to be powerful predictors of future vascular events. Results: The best predictive model was based on random forest and identified high-risk hypertensive patients with sensitivity and specificity rates of 71.4% and 87.8%, respectively. The Heart Rate Variability based classifier showed higher predictive values than the conventional echographic parameters, which are considered significant cardiovascular risk factors. Conclusions: A combination of Heart Rate Variability measures, analyzed with a data-mining algorithm, could be a reliable tool for identifying hypertensive patients at high risk of future vascular events. PMID:25793605
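
    The reported accuracy figures are sensitivity and specificity, which follow directly from the confusion matrix of a binary high-risk/low-risk classifier. A small helper, illustrative rather than the study's code:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity and specificity of a binary risk classifier
    (1 = high-risk subject, 0 = low-risk)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 3 high-risk and 4 low-risk subjects
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0, 0],
                                     [1, 1, 0, 0, 0, 0, 1])
```

    Sweeping the classifier's decision threshold and plotting sensitivity against (1 - specificity) yields the receiver-operating characteristic curve mentioned in the Methods.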

  17. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    PubMed Central

    Bayır, Şafak

    2016-01-01

    With advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to automatically detect change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, which is the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes optic disc information and hard exudate information may look the same to a machine learning system. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC. PMID:27110272
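
    The visual dictionary step is typically a bag-of-visual-words encoding: each keypoint descriptor is assigned to its nearest codeword, and the image region is represented by the normalised word histogram fed to the classifier. A sketch under that assumption (the codebook here is given directly, whereas in practice it would be learned, e.g. by clustering training descriptors):

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Assign each keypoint descriptor to its nearest visual word and
    return the normalised word-occurrence histogram."""
    # pairwise squared distances, shape (n_descriptors, n_words)
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# Toy 2-word codebook and four descriptors, two near each codeword
codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
descriptors = np.array([[0.1, 0.0], [9.9, 10.0], [0.0, 0.2], [10.0, 9.8]])
hist = bow_histogram(descriptors, codebook)
```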

  18. Dynamic response and stability analysis of an unbalanced flexible rotating shaft equipped with n automatic ball-balancers

    NASA Astrophysics Data System (ADS)

    Ehyaei, J.; Moghaddam, Majid M.

    2009-04-01

    The paper presents analytical and numerical investigations of an unbalanced flexible rotating shaft equipped with n automatic ball-balancers, where the unbalanced masses are distributed along the length of the shaft. It includes the derivation of the equations of motion, a stability analysis based on the equations of motion linearized around the equilibrium position, and the results of the time responses of the system. The Stodola-Green rotor model, in which the shaft is assumed to be flexible, is proposed for the analysis step. The rotor model includes the influence of rigid-body rotations due to the shaft flexibility. Utilizing Lagrange's method, the nonlinear equations of motion are derived. The study shows that for angular velocities above the first natural frequency, and with suitable values for the parameters of the automatic ball-balancers lying in the stability region, the ball-balancers tend to improve the vibration behavior of the system, i.e., partial balancing; complete balancing was achieved only in the special case where the imbalances are in the planes of the ball-balancers. Furthermore, it is shown that if the ball-balancers are closer to the unbalanced masses, a better vibration reduction is achieved.

  19. Study on triterpenoic acids distribution in Ganoderma mushrooms by automatic multiple development high performance thin layer chromatographic fingerprint analysis.

    PubMed

    Yan, Yu-Zhen; Xie, Pei-Shan; Lam, Wai-Kei; Chui, Eddie; Yu, Qiong-Xi

    2010-01-01

    Ganoderma--"Lingzhi" in Chinese--is one of the superior Chinese tonic materia medicas in China, Japan, and Korea. Two species, Ganoderma lucidum (Red Lingzhi) and G. sinense (Purple Lingzhi), have been included in the Chinese Pharmacopoeia since its 2000 Edition. However, some other species of Ganoderma are also available in the market. For example, there are five species divided by color called "Penta-colors Lingzhi" that have been advocated as being the most invigorating among the Lingzhi species; but there is no scientific evidence for such a claim. Morphological identification can serve as an effective practice for differentiating the various species, but the inherent quality has to be delineated by chemical analysis. Among the diverse constituents in Lingzhi, triterpenoids are commonly recognized as the major active ingredients. An automatic triple development HPTLC fingerprint analysis was carried out for detecting the distribution consistency of the triterpenoic acids in various Lingzhi samples. The chromatographic conditions were optimized as follows: stationary phase, precoated HPTLC silica gel 60 plate; mobile phase, toluene-ethyl acetate-methanol-formic acid (15 + 15 + 1 + 0.1); and triple-development using automatic multiple development equipment. The chromatograms showed good resolution, and the color images provided more specific HPTLC fingerprints than have been previously published. It was observed that the abundance of triterpenoic acids and consistent fingerprint pattern in Red Lingzhi (fruiting body of G. lucidum) outweighs the other species of Lingzhi. PMID:21140647

  20. Automatic image analysis methods for the determination of stereological parameters - application to the analysis of densification during solid state sintering of WC-Co compacts

    PubMed

    Missiaen; Roure

    2000-08-01

    Automatic image analysis methods used to determine microstructural parameters of sintered materials are presented. Estimation of stereological parameters at interfaces, when the system contains more than two phases, is treated in particular detail. It is shown that the specific surface areas and mean curvatures of the various interfaces can be estimated in the numerical space of the images. The methods are applied to the analysis of densification during solid state sintering of WC-Co compacts, and the microstructural evolution is commented upon. Application of microstructural measurements to the analysis of densification kinetics is also discussed. PMID:10947907

  1. [Digital storage and semi-automatic analysis of esophageal pressure signals. Evaluation of a commercialized system (PC Polygraf, Synectics)].

    PubMed

    Bruley des Varannes, S; Pujol, P; Salim, B; Cherbut, C; Cloarec, D; Galmiche, J P

    1989-11-01

    The aim of this work was to evaluate a new commercially available pressure recording system (PC Polygraf, Synectics) and to compare this system with a classical method using perfused catheters. The PC Polygraf uses microtransducers and allows direct digitized storage and semi-automatic analysis of data. In the first part of this study, manometric assessment was conducted using only perfused catheters. The transducers were connected both to an analog recorder and to a PC Polygraf. Using the two methods of analysis, contraction amplitudes were strongly correlated (r = 0.99; p less than 0.0001) whereas durations were significantly but loosely correlated (r = 0.51; p less than 0.001). Resting LES pressure was significantly correlated (r = 0.87; p less than 0.05). In the second part of this study, simultaneous recordings of esophageal pressure were conducted in 7 patients by placing the two tubes (microtransducers and perfused catheters) side by side with the side holes at the same level. The characteristics of the waves were determined both by visual analysis of the analog tracing and by semi-automatic analysis of the digitized recording with the appropriate program. Mean amplitude was lower with the microtransducers than with the perfused catheters (60 vs 68 cm H2O; p less than 0.05), but the duration of the waves was not significantly different between the two systems. Values obtained for each of these parameters using both methods were significantly correlated (amplitude: r = 0.74; duration: r = 0.51). Localization and measurement of the basal sphincter tone were found to be difficult when using microtransducers. These results show that the PC Polygraf allows satisfactory analysis of esophageal pressure signals. However, only perfused catheters offer excellent reliability for complete studies of both the sphincter and peristalsis. PMID:2612832

  2. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of rodent fMRI studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series, on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values in the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat . PMID:26285671

  3. Analysis of the distribution of the brain cells of the fruit fly by an automatic cell counting algorithm

    NASA Astrophysics Data System (ADS)

    Shimada, Takashi; Kato, Kentaro; Kamikouchi, Azusa; Ito, Kei

    2005-05-01

    The fruit fly is the smallest model animal that has a brain. Its brain is said to consist of only about 250,000 neurons, yet it shows “the rudiments of consciousness” in addition to high abilities such as learning and memory. As the starting point for an exhaustive analysis of its brain-circuit information, we have developed a new algorithm for counting cells automatically from source 2D/3D images. In our algorithm, counting cells is realized by embedding objects (typically, disks/balls), each of which has exclusive volume. Using this method, we have succeeded in counting thousands of cells accurately. This method provides the information necessary for the analysis of brain circuits: the precise distribution of the cells of the whole brain.

  4. Automatic detection of CT perfusion datasets unsuitable for analysis due to head movement of acute ischemic stroke patients.

    PubMed

    Fahmi, Fahmi; Marquering, Henk A; Streekstra, Geert J; Beenen, Ludo F M; Janssen, Natasja N Y; Majoie, Charles B L; van Bavel, Ed

    2014-01-01

    Head movement during brain Computed Tomography Perfusion (CTP) can deteriorate perfusion analysis quality in acute ischemic stroke patients. We developed a method for automatic detection of CTP datasets with excessive head movement, based on 3D image registration of CTP with non-contrast CT providing the transformation parameters. For parameter values exceeding predefined thresholds, the dataset was classified as 'severely moved'. Threshold values were determined by digital CTP phantom experiments. The automated selection was compared to manual screening by two experienced radiologists for 114 brain CTP datasets. Based on receiver operating characteristic analysis, optimal thresholds were found of 1.0°, 2.8° and 6.9° for pitch, roll and yaw, respectively, and 2.8 mm for z-axis translation. The proposed method had a sensitivity of 91.4% and a specificity of 82.3%. This method allows accurate automated detection of brain CTP datasets that are unsuitable for perfusion analysis. PMID:24691387
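
    The classification rule described above reduces to comparing registration parameters against fixed thresholds. A sketch using the optimal thresholds reported in the abstract (key names and the input format are illustrative):

```python
# Thresholds from the abstract: pitch 1.0°, roll 2.8°, yaw 6.9°,
# z-axis translation 2.8 mm
THRESHOLDS = {"pitch": 1.0, "roll": 2.8, "yaw": 6.9, "z_mm": 2.8}

def severely_moved(params):
    """Classify a CTP dataset as 'severely moved' if any registration
    parameter exceeds its threshold in absolute value."""
    return any(abs(params[k]) > THRESHOLDS[k] for k in THRESHOLDS)
```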

  5. SpLiNeS: automatic analysis of ecographic movies of flow-mediated dilation

    NASA Astrophysics Data System (ADS)

    Bartoli, Guido; Menegaz, Gloria; Dragoni, Saverio; Gori, Tommaso

    2007-03-01

    In this paper, we propose a fully automatic system for analyzing ecographic movies of flow-mediated dilation (FMD). Our approach uses a spline-based active contour (deformable template) to follow artery boundaries during the FMD procedure. A number of preprocessing steps (grayscale conversion, contrast enhancement, sharpening) are used to improve the visual quality of frames coming from the ecographic acquisition. Our system can be used in real-time environments due to the high speed of the edge recognition, which iteratively minimizes fitting errors on endothelium boundaries. We also implemented a fully functional GUI which permits the user to follow the whole recognition process interactively as well as to reshape the results. The system's accuracy and reproducibility have been validated with extensive in vivo experiments.

  6. Automatic measurement and analysis of neonatal O2 consumption and CO2 production

    NASA Astrophysics Data System (ADS)

    Chang, Jyh-Liang; Luo, Ching-Hsing; Yeh, Tsu-Fuh

    1996-02-01

    It is difficult to estimate daily energy expenditure unless continuous O2 consumption (VO2) and CO2 production (VCO2) can be measured. This study describes a simple method for calculating daily and interim changes in O2 consumption and CO2 production for neonates, especially premature infants. Oxygen consumption and CO2 production are measured using a flow-through technique in which the total VO2 and VCO2 over a given period of time are determined through a computerized system. This system can automatically calculate VO2 and VCO2 not only minute to minute but also over a period of time, e.g., 24 h. As a result, it provides a better indirect estimate of the actual energy expenditure in an infant's daily life and can be used at the bedside during ongoing nursery care.
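
    Accumulating minute-by-minute VO2 (or VCO2) samples into a total over a period such as 24 h reduces to numerical integration of the flow signal. A minimal sketch, assuming uniformly spaced samples; names and the trapezoidal rule are illustrative, not the paper's stated method:

```python
def total_volume(flow_ml_per_min, dt_min=1.0):
    """Trapezoidal integration of evenly spaced VO2 (or VCO2) samples
    in ml/min into a total volume in ml over the recording period."""
    total = 0.0
    for a, b in zip(flow_ml_per_min, flow_ml_per_min[1:]):
        total += 0.5 * (a + b) * dt_min
    return total

# Constant 5 ml/min sampled every minute for 10 minutes (11 samples)
vol = total_volume([5.0] * 11)
```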

  7. Versatile, high sensitivity, and automatized angular dependent vectorial Kerr magnetometer for the analysis of nanostructured materials

    NASA Astrophysics Data System (ADS)

    Teixeira, J. M.; Lusche, R.; Ventura, J.; Fermento, R.; Carpinteiro, F.; Araujo, J. P.; Sousa, J. B.; Cardoso, S.; Freitas, P. P.

    2011-04-01

Magneto-optical Kerr effect (MOKE) magnetometry is an indispensable, reliable, and one of the most widely used techniques for the characterization of nanostructured magnetic materials. Information such as the magnitude of coercive fields or anisotropy strengths can be readily obtained from MOKE measurements. We present a description of our state-of-the-art vectorial MOKE magnetometer, an extremely versatile, accurate, and sensitive unit with a low cost and a comparatively simple setup. The unit includes focusing lenses and an automatized stepper-motor stage for angular-dependent measurements. The performance of the magnetometer is demonstrated by hysteresis loops of Co thin films displaying uniaxial anisotropy induced during growth, MnIr/CoFe structures exhibiting the so-called exchange bias effect, spin valves, and microfabricated flux guides produced by optical lithography.

  8. An Analysis of Serial Number Tracking Automatic Identification Technology as Used in Naval Aviation Programs

    NASA Astrophysics Data System (ADS)

    Csorba, Robert

    2002-09-01

The Government Accounting Office found that the Navy, between 1996 and 1998, lost 3 billion in materiel in-transit. This thesis explores the benefits and costs of the automatic identification and serial number tracking technologies under consideration by the Naval Supply Systems Command and the Naval Air Systems Command. Detailed cost-savings estimates are made for each aircraft type in the Navy inventory. Project and item managers of repairable components that use serial number tracking were surveyed as to the value of the system. The thesis concludes that two-thirds of the in-transit losses could be avoided by implementing effective information technology-based logistics and maintenance tracking systems. Recommendations are made for the specific steps and components of such an implementation, and suggestions are made for further research.

  9. An automatic detector of drowsiness based on spectral analysis and wavelet decomposition of EEG records.

    PubMed

    Garces Correa, Agustina; Laciar Leber, Eric

    2010-01-01

An algorithm to automatically detect drowsiness episodes has been developed. It uses only one EEG channel to differentiate the stages of alertness and drowsiness. In this work, the feature vectors are built by combining Power Spectral Density (PSD) and Wavelet Transform (WT) measures. The features extracted from the PSD of the EEG signal are the central frequency, the first-quartile frequency, the maximum frequency, the total energy of the spectrum, and the power of the theta and alpha bands. In the wavelet domain, the number of zero crossings and the integrated power at scales 3, 4, and 5 of a Daubechies order-2 WT were computed. Epochs are classified with neural networks. The detection results obtained with this technique are 86.5% for drowsiness stages and 81.7% for alertness segments. These results show that the extracted features and the classifier are able to identify drowsiness EEG segments. PMID:21096343
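The spectral features named in this abstract have standard definitions and can be sketched as follows. This is an illustrative reimplementation, not the authors' code; the band limits (theta 4-8 Hz, alpha 8-13 Hz) and the `psd_features` helper are assumptions:

```python
import numpy as np

# Sketch of the PSD features: centroid (central) frequency, first-quartile
# frequency, peak frequency, total spectral energy, and theta/alpha band powers.
def psd_features(signal, fs):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    total = spectrum.sum()
    centroid = (freqs * spectrum).sum() / total      # power-weighted mean frequency
    cumulative = np.cumsum(spectrum) / total
    q1 = freqs[np.searchsorted(cumulative, 0.25)]    # frequency below which 25% of power lies
    peak = freqs[np.argmax(spectrum)]                # frequency of maximum power
    theta = spectrum[(freqs >= 4) & (freqs < 8)].sum()
    alpha = spectrum[(freqs >= 8) & (freqs < 13)].sum()
    return {"centroid": centroid, "q1": q1, "peak": peak,
            "energy": total, "theta": theta, "alpha": alpha}
```

For a 10 Hz test tone the peak and centroid both land at 10 Hz and the alpha-band power dominates the theta band, as expected.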

10. Automatic classification of pulmonary function in COPD patients using trachea analysis in chest CT scans

    NASA Astrophysics Data System (ADS)

    van Rikxoort, E. M.; de Jong, P. A.; Mets, O. M.; van Ginneken, B.

    2012-03-01

Chronic Obstructive Pulmonary Disease (COPD) is a chronic lung disease characterized by airflow limitation. COPD is clinically diagnosed and monitored using pulmonary function testing (PFT), which measures the global inspiration and expiration capabilities of patients and is time-consuming and labor-intensive. It is becoming standard practice to obtain paired inspiration-expiration CT scans of COPD patients. Predicting the PFT results from the CT scans would alleviate the need for PFT testing. It is hypothesized that the change of the trachea during breathing might be an indicator of tracheomalacia in COPD patients and correlate with COPD severity. In this paper, we propose to automatically measure morphological changes in the trachea from paired inspiration and expiration CT scans and investigate the influence on COPD GOLD stage classification. The trachea is automatically segmented and the trachea shape is encoded using the lengths of rays cast from the center of gravity of the trachea. These features are used in a classifier, combined with emphysema scoring, to attempt to classify subjects into their COPD stage. A database of 187 subjects, well distributed over COPD GOLD stages 0 through 4, was used for this study. The data was randomly divided into training and test sets. Using the training scans, a nearest mean classifier was trained to classify the subjects into their correct GOLD stage using either the emphysema score, the tracheal shape features, or a combination. Combining the proposed trachea shape features with the emphysema score, the classification performance into GOLD stages improved by 11%, to 51%. In addition, an 80% accuracy was achieved in distinguishing healthy subjects from COPD patients.
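The ray-cast shape encoding can be sketched in a few lines: compute the segmentation's center of gravity and march rays outwards until they leave the mask. A hypothetical 2D version for illustration (the paper works on 3D trachea segmentations; `n_rays` and `step` are assumed parameters):

```python
import numpy as np

# Sketch of a ray-length shape descriptor from a binary segmentation mask.
def ray_shape_features(mask, n_rays=32, step=0.25):
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()          # center of gravity of the mask
    h, w = mask.shape
    lengths = []
    for angle in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        r = 0.0
        # march outwards until the ray leaves the segmented region
        while True:
            y = int(round(cy + r * np.sin(angle)))
            x = int(round(cx + r * np.cos(angle)))
            if not (0 <= y < h and 0 <= x < w) or not mask[y, x]:
                break
            r += step
        lengths.append(r)
    return np.array(lengths)
```

On a circular mask all ray lengths are close to the radius; a collapsing trachea would instead show a strongly anisotropic length profile between inspiration and expiration.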

  11. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis

    NASA Astrophysics Data System (ADS)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan

    2011-06-01

Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction and an unsupervised clustering method, is described here. Wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure, instead of the Euclidean distance, for distance calculation; the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels and electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery are used to evaluate the performance of GSLC. Feature extraction results from the use of WT with the KS test indicate a reduced number of feature coefficients, as well as good noise rejection, despite similar spike waveforms. Accordingly, the use of GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results on the electrophysiological data indicate that the quality of spike sorting with GSLC is adequate.
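The grey relational grade that GSLC uses as its similarity measure has a standard closed form. A minimal sketch, assuming the conventional distinguishing coefficient zeta = 0.5 (the paper's exact normalization may differ):

```python
import numpy as np

# Sketch of the grey relational grade (GRG) between a reference sequence x0
# and a comparison sequence xi: identical sequences score 1, and the score
# decreases as the sequences diverge.
def grey_relational_grade(reference, comparison, zeta=0.5):
    x0 = np.asarray(reference, dtype=float)
    xi = np.asarray(comparison, dtype=float)
    delta = np.abs(x0 - xi)            # pointwise absolute difference
    dmax = delta.max()
    if dmax == 0.0:
        return 1.0                     # identical sequences
    dmin = delta.min()
    coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)  # grey relational coefficients
    return float(coeff.mean())         # grade = mean coefficient
```

In a single-linkage clustering, this grade would replace the Euclidean distance when comparing spike feature vectors.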

  12. Automatic Extraction of Optimal Endmembers from Airborne Hyperspectral Imagery Using Iterative Error Analysis (IEA) and Spectral Discrimination Measurements

    PubMed Central

    Song, Ahram; Chang, Anjin; Choi, Jaewan; Choi, Seokkeun; Kim, Yongil

    2015-01-01

Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot handle the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two different airborne images, Airborne Imaging Spectrometer for Application (AISA) and Compact Airborne Spectrographic Imager (CASI) data, were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers and errors from mixed materials. PMID:25625907
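One IEA iteration can be sketched as: unmix every pixel with the current endmember set, form the RMSE image, and take the worst-fit pixel as the next endmember candidate. A simplified version using unconstrained least squares (the published method imposes abundance constraints; `iea_step` is a hypothetical name):

```python
import numpy as np

# Sketch of one Iterative Error Analysis step.
def iea_step(pixels, endmembers):
    """pixels: (n_pixels, n_bands); endmembers: (k, n_bands).
    Returns (next_endmember_candidate, rmse_image)."""
    E = endmembers.T                               # (n_bands, k)
    abundances, *_ = np.linalg.lstsq(E, pixels.T, rcond=None)
    residuals = pixels.T - E @ abundances          # reconstruction error per band
    rmse = np.sqrt((residuals ** 2).mean(axis=0))  # RMSE image (one value per pixel)
    return pixels[np.argmax(rmse)], rmse
```

Pixels that are mixtures of the current endmembers reconstruct with near-zero RMSE, so the argmax picks out a spectrally distinct material.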

  13. Conversation analysis at work: detection of conflict in competitive discussions through semi-automatic turn-organization analysis.

    PubMed

    Pesarin, Anna; Cristani, Marco; Murino, Vittorio; Vinciarelli, Alessandro

    2012-10-01

This study proposes a semi-automatic approach aimed at detecting conflict in conversations. The approach is based on statistical techniques capable of identifying turn-organization regularities associated with conflict. The only manual step of the process is the segmentation of the conversations into turns (time intervals during which only one person talks) and overlapping speech segments (time intervals during which several persons talk at the same time). The rest of the process takes place automatically, and the results show that conflictual exchanges can be detected with Precision and Recall around 70% (the experiments were performed over 6 h of political debates). The approach brings two main benefits: the first is the possibility of analyzing potentially large amounts of conversational data with limited effort; the second is that the model parameters provide indications of which turn-regularities are most likely to account for the presence of conflict. PMID:22009168

  14. Automatic analysis of pediatric renal ultrasound using shape, anatomical and image acquisition priors.

    PubMed

    Kang, Xin; Safdar, Nabile; Myers, Emmarie; Martin, Aaron D; Grisan, Enrico; Peters, Craig A; Linguraru, Marius George

    2013-01-01

In this paper we present a segmentation method for ultrasound (US) images of the pediatric kidney, a difficult and barely studied problem. Our method segments the kidney on 2D sagittal US images and relies on minimal user intervention and a combination of improvements made to the Active Shape Model (ASM) framework. Our contributions include particle swarm initialization and profile training with rotation correction. We also introduce our methodology for segmentation of the kidney's collecting system (CS), based on graph-cuts (GC) with intensity and positional priors. Our intensity model corrects for intensity bias by comparison with other biased versions of the most similar kidneys in the training set. We show significant improvements (p < 0.001) with respect to the classic ASM and GC for kidney and CS segmentation, respectively. We use our semi-automatic method to compute the hydronephrosis index (HI) with an average error of 2.67 +/- 5.22 percentage points, similar to the inter-operator error of manual HI (2.31 +/- 4.54 percentage points). PMID:24505769

  15. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents.

    PubMed

    Colomer Granero, Adrián; Fuentes-Hurtado, Félix; Naranjo Ornedo, Valery; Guixeres Provinciale, Jaime; Ausín, Jose M; Alcañiz Raya, Mariano

    2016-01-01

This work focuses on finding the most discriminatory or representative features that allow commercials to be classified as negative, neutral, or positive in effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out. In this experiment, electroencephalography (EEG), electrocardiography (ECG), Galvanic Skin Response (GSR), and respiration data were acquired while subjects watched a 30-min audiovisual content composed of a submarine documentary and nine commercials (one of them the ad under evaluation). After signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features, computed in the time and frequency domains, are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier combining AdaBoost and Random Forest with automatic feature selection. The selected features were those extracted from the GSR and HRV signals. These results are promising for audiovisual content evaluation by means of physiological signal processing. PMID:27471462

  16. GALA: An Automatic Tool for the Abundance Analysis of Stellar Spectra

    NASA Astrophysics Data System (ADS)

    Mucciarelli, Alessio; Pancino, Elena; Lovisi, Loredana; Ferraro, Francesco R.; Lapenna, Emilio

    2013-04-01

    GALA is a freely distributed Fortran code for automatically deriving the atmospheric parameters (temperature, gravity, microturbulent velocity, and overall metallicity) and abundances for individual species of stellar spectra using the classical method based on the equivalent widths of metallic lines. The abundances of individual spectral lines are derived by using the WIDTH9 code developed by R. L. Kurucz. GALA is designed to obtain the best model atmosphere by optimizing temperature, surface gravity, microturbulent velocity, and metallicity after rejecting the discrepant lines. Finally, it computes accurate internal errors for each atmospheric parameter and abundance. GALA is suitable for analyzing both early- and late-type stars, under the assumption of local thermodynamical equilibrium. The code permits us to obtain chemical abundances and atmospheric parameters for large stellar samples in a very short time, thus making GALA a useful tool in the epoch of multi-object spectrographs and large surveys. An extensive set of tests with both synthetic and observed spectra is performed and discussed to explore the capabilities and robustness of the code. Based on observations collected at the ESO-VLT under programs 65.L-0165, 165.L-0263, 073.D-0211, 080.D-0368, 083.D-0208, and 266.D-5655 and on data available in the ELODIE archive. This research has also made use of the SIMBAD database, operated at CDS, Strasbourg, France.

  17. Applicability of semi-automatic segmentation for volumetric analysis of brain lesions.

    PubMed

    Heinonen, T; Dastidar, P; Eskola, H; Frey, H; Ryymin, P; Laasonen, E

    1998-01-01

This project involves the development of a fast semi-automatic segmentation procedure to make an accurate volumetric estimation of brain lesions. The method has been applied in the segmentation of demyelination plaques in Multiple Sclerosis (MS) and of right cerebral hemispheric infarctions in patients with neglect. The developed segmentation method includes several image processing techniques, such as image enhancement, amplitude segmentation, and region growing. The entire program operates on a PC-based computer and uses graphical user interfaces. Twenty-three patients with MS and 43 patients with right cerebral hemisphere infarctions were studied on a 0.5 T MRI unit. The MS plaques and cerebral infarctions were then segmented. The volumetric accuracy of the program was demonstrated by segmenting Magnetic Resonance (MR) images of fluid-filled syringes; the relative error of the total volume measurement based on these images was 1.5%. A repeatability test was also carried out as an inter- and intra-observer study in which the MS plaques of six randomly selected patients were segmented. These tests indicated 7% variability in the inter-observer study and 4% in the intra-observer study. The average time needed to segment and calculate the total plaque volume for one patient was 10 min. This simple segmentation method can be utilized in the quantitation of anatomical structures, such as air cells in the sinonasal and temporal bone areas, as well as in different pathological conditions, such as brain tumours, intracerebral haematomas, and bony destruction. PMID:9680601
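The region-growing step mentioned above can be sketched as a breadth-first flood fill that accepts 4-connected neighbours whose intensity stays within a tolerance of the seed intensity. A minimal 2D sketch; the acceptance rule and function name are illustrative, not the authors' exact criterion:

```python
from collections import deque

import numpy as np

# Sketch of amplitude-based region growing from a single seed pixel.
def region_grow(image, seed, tolerance):
    h, w = image.shape
    ref = float(image[seed])               # intensity at the seed
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(float(image[ny, nx]) - ref) <= tolerance):
                mask[ny, nx] = True        # accept neighbour into the region
                queue.append((ny, nx))
    return mask
```

The lesion volume estimate then follows by counting mask voxels and multiplying by the voxel volume.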

  18. GALA: AN AUTOMATIC TOOL FOR THE ABUNDANCE ANALYSIS OF STELLAR SPECTRA

    SciTech Connect

    Mucciarelli, Alessio; Lovisi, Loredana; Ferraro, Francesco R.; Lapenna, Emilio

    2013-04-01

    GALA is a freely distributed Fortran code for automatically deriving the atmospheric parameters (temperature, gravity, microturbulent velocity, and overall metallicity) and abundances for individual species of stellar spectra using the classical method based on the equivalent widths of metallic lines. The abundances of individual spectral lines are derived by using the WIDTH9 code developed by R. L. Kurucz. GALA is designed to obtain the best model atmosphere by optimizing temperature, surface gravity, microturbulent velocity, and metallicity after rejecting the discrepant lines. Finally, it computes accurate internal errors for each atmospheric parameter and abundance. GALA is suitable for analyzing both early- and late-type stars, under the assumption of local thermodynamical equilibrium. The code permits us to obtain chemical abundances and atmospheric parameters for large stellar samples in a very short time, thus making GALA a useful tool in the epoch of multi-object spectrographs and large surveys. An extensive set of tests with both synthetic and observed spectra is performed and discussed to explore the capabilities and robustness of the code.

  19. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents

    PubMed Central

    Colomer Granero, Adrián; Fuentes-Hurtado, Félix; Naranjo Ornedo, Valery; Guixeres Provinciale, Jaime; Ausín, Jose M.; Alcañiz Raya, Mariano

    2016-01-01

    This work focuses on finding the most discriminatory or representative features that allow to classify commercials according to negative, neutral and positive effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out. In this experiment electroencephalography (EEG), electrocardiography (ECG), Galvanic Skin Response (GSR) and respiration data were acquired while subjects were watching a 30-min audiovisual content. This content was composed by a submarine documentary and nine commercials (one of them the ad under evaluation). After the signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features computed in time and frequency domains are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier consisting of a combination between AdaBoost and Random Forest with automatic selection of features. The selected features were those extracted from GSR and HRV signals. These results are promising in the audiovisual content evaluation field by means of physiological signal processing. PMID:27471462

  20. Analysis of the automatic depressurization system (ADS) tests for the AP600 design

    SciTech Connect

    Brockie, A.J.; Carlson, R.W.; Bueter, T.W.

    1995-12-31

    The AP600 is a Westinghouse advanced pressurized water reactor (PWR) designed with passive plant safety features that rely on natural driving forces, such as gravity, and natural circulation which allows significant simplification of the plant systems equipment and operation. As part of the passive safety concept, the AP600 utilizes an Automatic Depressurization System (ADS) to depressurize the Reactor Coolant System (RCS) allowing long-term gravity injection to be initiated and maintained for passive reflood and long term core cooling. The ADS design consists of four flow paths, two of which are connected to the top of the pressurizer and a flow path from each of the two RCS hot legs. During a postulated accident, the two flow paths from the hot legs discharge directly to containment. The two paths from the pressurizer discharge steam and/or water from the RCS into the In-containment Refueling Water Storage Tank (IRWST) through spargers located underwater where the steam is normally condensed with no increase in containment pressure or temperature. The ADS tests are one part of the planned AP600 Westinghouse test program for the passive core cooling system (PXS). The ADS tests are full-scale simulations of AP600 ADS components. The ADS tests provide dynamic performance data of the ADS for use in computer code validation and design verification.

  1. Automatic control of a robot camera for broadcasting and subjective evaluation and analysis of reproduced images

    NASA Astrophysics Data System (ADS)

    Kato, Daiichiro; Ishikawa, Akio; Tsuda, Takao; Shimoda, Shigeru; Fukushima, Hiroshi

    2000-06-01

We are studying an intelligent robot camera that can automatically shoot an object and produce images with a powerful sense of reality, as if a very skilled cameraman were at work. In this study, we designed a control algorithm based on cameramen's techniques for the control of the robot camera and conducted a series of experiments to understand the effects of camera work on how images look to viewers. The findings were as follows: (1) Evaluation scores are high when actual data from cameramen, especially typical data, are used as the position-adjusting velocity curve of the target. (2) Evaluation scores are relatively high for images taken with the feedback-feedforward camera control method when the target moves in one direction. (3) When both the direction and velocity of the target change, and when the target gets bigger and faster in the viewfinder, it becomes increasingly difficult to keep the target within the viewfinder using the control method that imitates human camera handling. (4) The method with mechanical feedback, on the other hand, is able to cope with rapid changes in the target's direction and velocity, constantly keeping the target within the viewfinder. Even so, the viewer finds the image more mechanical than natural.

  2. Analysis of electric energy consumption of automatic milking systems in different configurations and operative conditions.

    PubMed

    Calcante, Aldo; Tangorra, Francesco M; Oberti, Roberto

    2016-05-01

Automatic milking systems (AMS) have been a revolutionary innovation in dairy cow farming. Currently, more than 10,000 dairy cow farms worldwide use AMS to milk their cows. Electric consumption is one of the most relevant and least controllable operational costs of an AMS, ranging between 35 and 40% of total annual operational costs. The aim of the present study was to measure and analyze the electric energy consumption of 4 AMS with different configurations: single box, and central unit featuring a central vacuum system for 1 cow unit and for 2 cow units. The electrical consumption (daily consumption, daily consumption per cow milked, consumption per milking, and consumption per 100 L of milk) of each AMS (milking unit + air compressor) was measured using 2 energy analyzers. The measurement period lasted 24 h with a sampling frequency of 0.2 Hz. The daily total energy consumption (milking unit + air compressor) ranged between 45.4 and 81.3 kWh; the consumption per cow milked ranged between 0.59 and 0.99 kWh; the consumption per milking ranged between 0.21 and 0.33 kWh; and the consumption per 100 L of milk ranged from 1.80 to 2.44 kWh, according to the different configurations and operational contexts considered. Results showed that AMS electric consumption was conditioned mainly by farm management rather than by machine characteristics/architectures. PMID:26971145

  3. Automatic Stabilization

    NASA Technical Reports Server (NTRS)

    Haus, FR

    1936-01-01

    This report lays more stress on the principles underlying automatic piloting than on the means of applications. Mechanical details of servomotors and the mechanical release device necessary to assure instantaneous return of the controls to the pilot in case of malfunction are not included. Descriptions are provided of various commercial systems.

  4. Automatic warranties.

    PubMed

    Decker, R

    1987-10-01

    In addition to express warranties (those specifically made by the supplier in the contract) and implied warranties (those resulting from circumstances of the sale), there is one other classification of warranties that needs to be understood by hospital materials managers. These are sometimes known as automatic warranties. In the following dialogue, Doctor Decker develops these legal concepts. PMID:10284977

  5. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    SciTech Connect

    Rashidi, M.; Dehmeshid, J.; Dickenson, E.; Daemi, F.

    1997-07-01

This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid laced with a fluorescent dye or microspheres flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination, a planar laser sheet passes through the column while a CCD camera records the illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For velocity measurements, while the aqueous fluid laced with fluorescent microspheres flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed images are of poor quality at this stage, several preprocessing steps are used to enhance the particles within them. For concentration measurements, while the aqueous fluid laced with a fluorescent organic dye flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam, which travels simultaneously with the camera. These recorded images are then transferred to the computer and processed in a similar fashion to the velocity measurements. To make the vision system fully automatic, several detailed image processing techniques were developed to match corresponding images (taken at different times during the experiments) that have different intensity values but the same topological characteristics. This yields the normalized interstitial chemical concentration as a function of time within the porous column.

  6. Automatic analysis of slips of the tongue: Insights into the cognitive architecture of speech production.

    PubMed

    Goldrick, Matthew; Keshet, Joseph; Gustafson, Erin; Heller, Jordana; Needle, Jeremy

    2016-04-01

Traces of the cognitive mechanisms underlying speaking can be found within subtle variations in how we pronounce sounds. While speech errors have traditionally been seen as categorical substitutions of one sound for another, acoustic/articulatory analyses show they partially reflect the intended sound. When "pig" is mispronounced as "big," the resulting /b/ sound differs from correct productions of "big," moving towards the intended "pig," revealing the role of graded sound representations in speech production. Investigating the origins of such phenomena requires detailed estimation of speech sound distributions; this has been hampered by reliance on subjective, labor-intensive manual annotation. Computational methods can address these issues by providing objective, automatic measurements. We develop a novel high-precision computational approach, based on a set of machine learning algorithms, for the measurement of elicited speech. The algorithms are trained on existing manually labeled data to detect and locate linguistically relevant acoustic properties with high accuracy. Our approach is robust, is designed to handle mis-productions, and overall matches the performance of expert coders. It allows us to analyze a very large dataset of speech errors (containing far more errors than the total in the existing literature), illuminating properties of speech sound distributions previously impossible to reliably observe. We argue that this provides novel evidence that two sources both contribute to deviations in speech errors: planning processes specifying the targets of articulation, and articulatory processes specifying the motor movements that execute this plan. These findings illustrate how a much richer picture of speech provides an opportunity to gain novel insights into language processing. PMID:26779665

  7. Automatic regional analysis of DTI properties in the developmental macaque brain

    NASA Astrophysics Data System (ADS)

    Styner, Martin; Knickmeyer, Rebecca; Coe, Christopher; Short, Sarah J.; Gilmore, John

    2008-03-01

Many neuroimaging studies are performed in monkeys because pathologies and environmental exposures can be studied in well-controlled settings and environments. In this work, we present a framework for atlas-based, fully automatic segmentation of brain tissues, lobar parcellations, and subcortical structures, and for the regional extraction of Diffusion Tensor Imaging (DTI) properties. We first built a structural atlas from training images by iterative, joint deformable registration into an unbiased average image. On this atlas, probabilistic tissue maps, a lobar parcellation, and subcortical structures were determined. This information is applied to each subject's structural image via affine, followed by deformable, registration. The affinely transformed atlas is employed for a joint T1- and T2-based tissue classification. The deformed parcellation regions mask the tissue segmentations to define the parcellation for white and gray matter separately. Each subject's structural image is then non-rigidly matched with its DTI image by normalized mutual information, b-spline based registration. The DTI property histograms were then computed using the probabilistic white matter information for each lobar parcellation. We successfully built an average atlas using a developmental training dataset of 18 cases aged 16-34 months. Our framework was successfully applied to over 50 additional subjects in the age range of 9-70 months. The probabilistically weighted average FA in the corpus callosum region showed the largest increase over time in the observed age range. Most cortical regions show a modest FA increase, whereas the cerebellum's FA values remained stable. The individual methods used in this segmentation framework have been applied before, but their combination is novel, as is their application to macaque MRI data.

  8. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    SciTech Connect

    Fang, Y; Huang, H; Su, T

    2015-06-15

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques on cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring the image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI Result was used as the gold standard of coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting the coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve a good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and AUC of 0.82. Such performance is similar to those obtained from the semi-automatic QPS software that gives a sensitivity of 71% and specificity of 77%. 
Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination
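
    The ROC figures quoted in this record can be reproduced from raw per-patient scores without any statistics package: the AUC equals the Mann-Whitney probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch of that computation (illustrative only, not the authors' pipeline; function names are ours):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    positive case outranks a negative one (ties count as half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))


def sensitivity_specificity(pos_scores, neg_scores, threshold):
    """Operating point at a decision threshold (score >= threshold => positive)."""
    sens = sum(1 for p in pos_scores if p >= threshold) / len(pos_scores)
    spec = sum(1 for n in neg_scores if n < threshold) / len(neg_scores)
    return sens, spec
```

    Sweeping the threshold over all observed scores traces out the full ROC curve from which the reported sensitivity/specificity pairs are read off.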

  9. Comparison of automatic control systems

    NASA Technical Reports Server (NTRS)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  10. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs insufficiently support integration of subjects' image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system in clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images, which have been collected in the study so far, have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced and, therefore, errors, latency, and costs are decreased. Our approach also increases data security and privacy.

  11. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    NASA Astrophysics Data System (ADS)

    Chan, T. O.; Lichti, D. D.; Belton, D.

    2013-10-01

    At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, the calibrations are complicated by the Velodyne LiDAR's narrow vertical field of view and the highly time-variant nature of its measurements. In this paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method that is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method utilizes the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. Then the layers are treated as 2D images and are processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point clouds based on the previously extracted points. The points are passed to the calibration that estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model such that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automatic and this allows end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scene. The methods were verified with two different real datasets, and the results suggest that up to 78
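
    The layer-wise decomposition described in this abstract can be illustrated with a pure-Python sketch: 3D points are binned into horizontal slices, and each slice's circular cross-section is recovered with an algebraic (Kåsa) least-squares circle fit. The circle fit stands in for the Generalized Hough Transform step only as a simplified illustration; function names are ours, not the authors'.

```python
import math

def slice_layers(points3d, dz):
    """Bin 3D points into horizontal layers of thickness dz (slice index -> 2D points)."""
    layers = {}
    for x, y, z in points3d:
        layers.setdefault(int(math.floor(z / dz)), []).append((x, y))
    return layers

def fit_circle(points2d):
    """Kasa algebraic circle fit: least-squares solution of x^2 + y^2 = a*x + b*y + c,
    giving center (a/2, b/2) and radius sqrt(c + cx^2 + cy^2)."""
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y in points2d:
        row = (x, y, 1.0)
        rhs = x * x + y * y
        for i in range(3):
            v[i] += row[i] * rhs
            for j in range(3):
                M[i][j] += row[i] * row[j]
    # Solve the 3x3 normal equations by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    u = [0.0] * 3
    for r in (2, 1, 0):
        u[r] = (v[r] - sum(M[r][c] * u[c] for c in range(r + 1, 3))) / M[r][r]
    cx, cy = u[0] / 2.0, u[1] / 2.0
    return cx, cy, math.sqrt(u[2] + cx * cx + cy * cy)
```

    Fitting every layer of a segmented cylinder yields per-slice centers and radii that can then feed a joint adjustment of cylinder and sensor parameters.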

  12. Automatic tremor detection and waveform component analysis using a neural network approach

    NASA Astrophysics Data System (ADS)

    Horstmann, T.; Harrington, R. M.; Cochran, E. S.; Wang, T.; Potier, C. E.

    2010-12-01

    Studies over the last decade have established that non-volcanic tremor is a ubiquitous phenomenon commonly observed in subduction zones. In recent years, it has also been widely observed in strike-slip faulting environments. In particular, observations of tremor along the San Andreas Fault indicate that many of the events occur at depths ranging between 15 and 45 km, suggesting that tremor typically occurs in the zone where fault slip behaviour transitions between stick-slip and stable-sliding frictional regimes. As such, much of the tremor occurs at or below the depths of microseismicity, and therefore has the potential to provide clues about the slip behaviour of faults at depth. Despite several recent advances, the origin and characteristics of tremor along strike-slip faults are not well-resolved. The emergent phase arrivals, low amplitude waveforms, and variable event durations associated with non-volcanic tremor make automatic tremor event detection a non-trivial task. Recent approaches employ a cross-correlation technique which readily isolates individual template tremor bursts within tremor episodes. However, the method tends to detect events with nearly identical waveforms and moveout across stations within an array. We employ a new method to identify tremor in large data volumes using an automated technique that does not require the use of a designated template event. Furthermore, the same method can be used to identify distinctive tremor waveform features, such as frequency content, polarity, and amplitude ratios. We use continuous broadband waveforms from 13 STS-2 seismometers deployed in May 2010 along the Cholame segment of the San Andreas Fault. The maximum station spacing within the array is approximately 25 km. We first cross-correlate waveform envelopes to reduce the data volume and find isolated seismic events. Next, we use a neural network approach to cluster events in the reduced data set. 
Because the unsupervised neural network algorithm
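
    The envelope cross-correlation step described above can be sketched in a few lines: rectify and smooth each trace, then slide one envelope against the other and keep the lag with the largest overlap product. This is a toy version of the detection idea only; window sizes and names are illustrative, not the authors' parameters.

```python
def envelope(trace, win=3):
    """Crude amplitude envelope: rectification followed by a moving average."""
    rect = [abs(s) for s in trace]
    half = win // 2
    return [
        sum(rect[max(0, i - half):i + half + 1]) / len(rect[max(0, i - half):i + half + 1])
        for i in range(len(rect))
    ]

def best_lag(env_a, env_b, max_lag):
    """Lag (in samples) maximizing the cross-correlation sum env_a[i] * env_b[i + lag]."""
    def xcorr(lag):
        return sum(
            env_a[i] * env_b[i + lag]
            for i in range(len(env_a))
            if 0 <= i + lag < len(env_b)
        )
    return max(range(-max_lag, max_lag + 1), key=xcorr)
```

    Applied pairwise across stations, the peak lags give relative move-out times, and events whose envelopes correlate poorly everywhere are discarded before clustering.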

  13. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video are time and labor intensive processes. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
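
    Given any one of the per-frame statistics pulled from the encoder log, a first-pass event detector can simply flag frames whose value deviates strongly from the rest of the series. A deliberately minimal sketch (the thresholding rule is our illustration, not the authors' algorithm):

```python
from statistics import mean, stdev

def detect_events(frame_stat, k=3.0):
    """Indices of frames whose statistic deviates from the series mean
    by more than k standard deviations."""
    m, s = mean(frame_stat), stdev(frame_stat)
    return [i for i, v in enumerate(frame_stat) if abs(v - m) > k * s]
```

    In practice one would combine several of the 15 per-frame statistics, but the same outlier logic applies to each.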

  14. AUTOMATIC COUNTER

    DOEpatents

    Robinson, H.P.

    1960-06-01

    An automatic counter of alpha particle tracks recorded by a sensitive emulsion of a photographic plate is described. The counter includes a source of modulated dark-field illumination for developing light flashes from the recorded particle tracks as the photographic plate is automatically scanned in narrow strips. Photoelectric means convert the light flashes to proportional current pulses for application to an electronic counting circuit. Photoelectric means are further provided for developing a phase reference signal from the photographic plate in such a manner that signals arising from particle tracks not parallel to the edge of the plate are out of phase with the reference signal. The counting circuit includes provision for rejecting the out-of-phase signals resulting from unoriented tracks as well as signals resulting from spurious marks on the plate such as scratches, dust or grain clumpings, etc. The output of the circuit is hence indicative only of the tracks that would be counted by a human operator.

  15. On 3-D modeling and automatic regridding in shape design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Yao, Tse-Min

    1987-01-01

    The material derivative idea of continuum mechanics and the adjoint variable method of design sensitivity analysis are used to obtain a computable expression for the effect of shape variations on measures of structural performance of three-dimensional elastic solids.

  16. Automatic spike train analysis and report generation. An implementation with R, R2HTML and STAR.

    PubMed

    Pouzat, Christophe; Chaffiol, Antoine

    2009-06-30

    Multi-electrode arrays (MEA) allow experimentalists to record extracellularly from many neurons simultaneously for long durations. They therefore often require that the data analyst spend a considerable amount of time first sorting the spikes, then repeating the same basic analysis on the different spike trains isolated from the raw data. This spike train analysis also often generates a considerable amount of figures, mainly diagnostic plots, that need to be stored (and/or printed) and organized for efficient subsequent use. The analysis of our data recorded from the first olfactory relay of an insect, the cockroach Periplaneta americana, has led us to settle on such "routine" spike train analysis procedures: one applied to spontaneous activity recordings, the other used with recordings where an olfactory stimulation was repetitively applied. We have developed a group of functions implementing a mixture of common and original procedures and producing graphical or numerical outputs. These functions can be run in batch mode and moreover produce an organized report of their results in an HTML file. An R package, Spike Train Analysis with R (STAR), makes these functions readily available to the neurophysiology community. Like R, STAR is open source and free. We believe that our basic analysis procedures are of general interest but they can also be very easily modified to suit user specific needs. PMID:19473708

  17. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    PubMed Central

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60% with a range of 3.47%–40.00% in different subgroup characteristics. The CHAID decision tree model has demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by logistic regression model including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified the fourth risk factor, the maternal educational level, with higher overall classification accuracy and larger area below the receiver operating characteristic curve. Conclusions: The infant anemic status in metropolis is complex and should be carefully considered by the basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of population with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
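
    At its core, each CHAID split is chosen by a chi-squared test of independence between a candidate predictor and the outcome. A minimal sketch of that selection step (the predictor names and counts below are hypothetical, not the study's data):

```python
def chi2_stat(table):
    """Pearson chi-squared statistic for a contingency table (rows = predictor
    levels, columns = outcome levels), given as a list of lists of observed counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total
            stat += (obs - exp) ** 2 / exp
    return stat

def best_split(candidate_tables):
    """Pick the candidate predictor with the largest chi-squared association."""
    return max(candidate_tables, key=lambda name: chi2_stat(candidate_tables[name]))
```

    CHAID applies this selection recursively (with Bonferroni-adjusted p-values and category merging), which is how the multilevel interactions reported above emerge.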

  18. FamPipe: An Automatic Analysis Pipeline for Analyzing Sequencing Data in Families for Disease Studies.

    PubMed

    Chung, Ren-Hua; Tsai, Wei-Yun; Kang, Chen-Yu; Yao, Po-Ju; Tsai, Hui-Ju; Chen, Chia-Hsiang

    2016-06-01

    In disease studies, family-based designs have become an attractive approach to analyzing next-generation sequencing (NGS) data for the identification of rare mutations enriched in families. Substantial research effort has been devoted to developing pipelines for automating sequence alignment, variant calling, and annotation. However, fewer pipelines have been designed specifically for disease studies. Most of the current analysis pipelines for family-based disease studies using NGS data focus on a specific function, such as identifying variants with Mendelian inheritance or identifying shared chromosomal regions among affected family members. Consequently, some other useful family-based analysis tools, such as imputation, linkage, and association tools, have yet to be integrated and automated. We developed FamPipe, a comprehensive analysis pipeline, which includes several family-specific analysis modules, including the identification of shared chromosomal regions among affected family members, prioritizing variants assuming a disease model, imputation of untyped variants, and linkage and association tests. We used simulation studies to compare properties of some modules implemented in FamPipe, and based on the results, we provided suggestions for the selection of modules to achieve an optimal analysis strategy. The pipeline is under the GNU GPL License and can be downloaded for free at http://fampipe.sourceforge.net. PMID:27272119
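
    One of the family-specific filters mentioned above, screening variants for Mendelian inheritance, reduces in the trio case to checking that the child's two alleles can be drawn one from each parent. A simplified sketch of that check (our illustration, not FamPipe's actual implementation):

```python
def mendelian_consistent(child, father, mother):
    """True if the child genotype (a pair of alleles) can be formed by
    drawing one allele from each parent's genotype."""
    a, b = child
    return (a in father and b in mother) or (b in father and a in mother)

def candidate_de_novo(trio_genotypes):
    """Variant IDs whose (child, father, mother) genotypes violate
    Mendelian inheritance, e.g. candidate de novo mutations."""
    return [
        variant
        for variant, (c, f, m) in trio_genotypes.items()
        if not mendelian_consistent(c, f, m)
    ]
```

    Real pipelines additionally account for genotyping error and sequencing depth before calling a violation a de novo event.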

  19. FamPipe: An Automatic Analysis Pipeline for Analyzing Sequencing Data in Families for Disease Studies

    PubMed Central

    Chung, Ren-Hua; Tsai, Wei-Yun; Kang, Chen-Yu; Yao, Po-Ju; Tsai, Hui-Ju; Chen, Chia-Hsiang

    2016-01-01

    In disease studies, family-based designs have become an attractive approach to analyzing next-generation sequencing (NGS) data for the identification of rare mutations enriched in families. Substantial research effort has been devoted to developing pipelines for automating sequence alignment, variant calling, and annotation. However, fewer pipelines have been designed specifically for disease studies. Most of the current analysis pipelines for family-based disease studies using NGS data focus on a specific function, such as identifying variants with Mendelian inheritance or identifying shared chromosomal regions among affected family members. Consequently, some other useful family-based analysis tools, such as imputation, linkage, and association tools, have yet to be integrated and automated. We developed FamPipe, a comprehensive analysis pipeline, which includes several family-specific analysis modules, including the identification of shared chromosomal regions among affected family members, prioritizing variants assuming a disease model, imputation of untyped variants, and linkage and association tests. We used simulation studies to compare properties of some modules implemented in FamPipe, and based on the results, we provided suggestions for the selection of modules to achieve an optimal analysis strategy. The pipeline is under the GNU GPL License and can be downloaded for free at http://fampipe.sourceforge.net. PMID:27272119

  20. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978
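
    The voxel-element idea in this abstract can be illustrated by the occupancy step alone: snap every point to an integer grid cell, and let each occupied cell become a hexahedral element. A minimal sketch under that assumption (not the authors' code):

```python
import math

def voxelize(points, size):
    """Set of integer (i, j, k) grid cells occupied by at least one point,
    for cubic voxels of edge length `size`."""
    return {
        tuple(int(math.floor(c / size)) for c in p)
        for p in points
    }

def solid_volume(points, size):
    """Volume of the voxelized solid: occupied cells times cell volume."""
    return len(voxelize(points, size)) * size ** 3
```

    The full procedure works on stacked point-cloud sections and fills the space between inner and outer surfaces, but each finite element it emits is exactly such an occupied cell.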

  1. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis

    PubMed Central

    Stamile, Claudio; Kocevar, Gabriel; Cotton, François; Durand-Dubief, Françoise; Hannoun, Salem; Frindel, Carole; Guttmann, Charles R. G.; Rousseau, David; Sappey-Marinier, Dominique

    2016-01-01

    Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect local and global longitudinal changes of diffusivity metrics in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: 1) co-registration and diffusion metrics computation, 2) tractography, bundle extraction and processing, and 3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections, and allowing a sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber scale approach to detect small longitudinal alterations. PMID:27224308
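
    The paper's Gaussian mixture model is more elaborate than can be shown here; as a much-simplified illustration of the same goal, detecting a localized longitudinal change along a fiber bundle, one can compare a per-cross-section diffusivity profile between two time points against an assumed noise level (the threshold rule and names below are ours):

```python
def changed_sections(profile_t0, profile_t1, noise_sd, k=3.0):
    """Indices of fiber-bundle cross-sections whose metric (e.g. FA) changed
    between two time points by more than k times the measurement noise."""
    return [
        i
        for i, (a, b) in enumerate(zip(profile_t0, profile_t1))
        if abs(b - a) > k * noise_sd
    ]
```

    Working at this cross-section scale, rather than averaging over the whole bundle, is what gives the approach its sensitivity to small focal alterations.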

  2. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978

  3. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis.

    PubMed

    Stamile, Claudio; Kocevar, Gabriel; Cotton, François; Durand-Dubief, Françoise; Hannoun, Salem; Frindel, Carole; Guttmann, Charles R G; Rousseau, David; Sappey-Marinier, Dominique

    2016-01-01

    Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect local and global longitudinal changes of diffusivity metrics in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: 1) co-registration and diffusion metrics computation, 2) tractography, bundle extraction and processing, and 3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections, and allowing a sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber scale approach to detect small longitudinal alterations. PMID:27224308

  4. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    SciTech Connect

    Wei, J; Yuan, A; Li, G

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demons algorithm was applied in deformable image registration and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this toolkit and the TPS is <±2%, and the time saving is 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
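
    The voxel-counting step mentioned above is straightforward: the segmented volume is the number of mask voxels times the physical voxel volume, and the reported <±2% discrepancy is just the relative difference between two such estimates. A minimal sketch (array layout and units are our assumptions):

```python
def segmented_volume_cc(mask, spacing_mm):
    """Volume in cubic centimeters of a binary segmentation mask, given as
    nested lists [slice][row][column] and per-axis voxel spacing in millimeters."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    n_voxels = sum(1 for sl in mask for row in sl for v in row if v)
    return n_voxels * voxel_mm3 / 1000.0

def percent_discrepancy(vol_a, vol_b):
    """Relative difference between two volume estimates, in percent of the first."""
    return 100.0 * abs(vol_a - vol_b) / vol_a
```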

  5. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    NASA Astrophysics Data System (ADS)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.
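
    The co-occurrence verification described above amounts to a spatial join: for one situation type (say, pedestrian crossings), every detected sign should have a matching road marking nearby, and vice versa. A minimal sketch with hypothetical coordinates in meters:

```python
import math

def cooccurrence_gaps(signs, markings, max_dist):
    """Split detections into inconsistencies: signs with no marking within
    max_dist, and markings with no sign within max_dist. Inputs are (x, y) positions."""
    def has_neighbor(p, others):
        return any(math.dist(p, q) <= max_dist for q in others)
    signs_without_marking = [s for s in signs if not has_neighbor(s, markings)]
    markings_without_sign = [m for m in markings if not has_neighbor(m, signs)]
    return signs_without_marking, markings_without_sign
```

    The two returned lists are exactly the "potentially incorrectly signaled" cases toward which safety actions can be directed.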

  6. Automatic co-registration of space-based sensors for precision change detection and analysis

    NASA Technical Reports Server (NTRS)

    Bryant, N.; Zobrist, A.; Logan, T.

    2003-01-01

    A variety of techniques were developed at JPL to assure sub-pixel co-registration of scenes and ortho-rectification of satellite imagery to other georeferenced information to permit precise change detection and analysis of low and moderate resolution space sensors.
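
    Sub-pixel co-registration itself requires interpolation or phase correlation, but the core idea can be sketched at integer precision: exhaustively test small shifts and keep the one minimizing the mean absolute difference over the overlap. An illustrative toy version, not JPL's method:

```python
def best_shift(ref, img, max_shift):
    """Integer (dy, dx) minimizing mean absolute difference between ref[y][x]
    and img[y + dy][x + dx] over the overlapping region."""
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < len(img) and 0 <= xx < len(img[0]):
                        err += abs(ref[y][x] - img[yy][xx])
                        n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best
```

    Sub-pixel accuracy is then typically obtained by fitting a parabola to the error surface around the integer minimum.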

  7. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    ERIC Educational Resources Information Center

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  8. "PolyCAFe"--Automatic Support for the Polyphonic Analysis of CSCL Chats

    ERIC Educational Resources Information Center

    Trausan-Matu, Stefan; Dascalu, Mihai; Rebedea, Traian

    2014-01-01

    Chat conversations and other types of online communication environments are widely used within CSCL educational scenarios. However, there is a lack of theoretical and methodological background for the analysis of collaboration. Manual assessing of non-moderated chat discussions is difficult and time-consuming, having as a consequence that learning…

  9. Enzymatic Microreactors for the Determination of Ethanol by an Automatic Sequential Injection Analysis System

    NASA Astrophysics Data System (ADS)

    Alhadeff, Eliana M.; Salgado, Andrea M.; Cos, Oriol; Pereira, Nei; Valdman, Belkis; Valero, Francisco

    A sequential injection analysis system with two enzymatic microreactors for the determination of ethanol has been designed. Alcohol oxidase and horseradish peroxidase were separately immobilized on glass aminopropyl beads, and packed in 0.91-mL volume microreactors, working in line with the sequential injection analysis system. A stop flow of 120 s was selected for a linear ethanol range of 0.005–0.04 g/L (±0.6% relative standard deviation) with a throughput of seven analyses per hour. The system was applied to measure ethanol concentrations in samples of distilled and nondistilled alcoholic beverages, and of alcoholic fermentation with good performance and no significant difference compared with other analytical procedures (gas chromatography and high-performance liquid chromatography).

  10. A software tool for automatic analysis of selected area diffraction patterns within Digital Micrograph™.

    PubMed

    Wu, C H; Reynolds, W T; Murayama, M

    2012-01-01

    A software package "SADP Tools" is developed as a complementary diffraction pattern analysis tool. The core program, called AutoSADP, is designed to facilitate automated measurements of d-spacing and interplanar angles from TEM selected area diffraction patterns (SADPs) of single crystals. The software uses iterative cross correlations to locate the forward scattered beam position and to find the coordinates of the diffraction spots. The newly developed algorithm is suitable for fully automated analysis and it works well with asymmetric diffraction patterns, off-zone axis patterns, patterns with streaks, and noisy patterns such as Fast Fourier transforms of high-resolution images. The AutoSADP tool runs as a macro for the Digital Micrograph program and can determine d-spacing values and interplanar angles based on the pixel ratio with an accuracy of better than about 2%. PMID:22079497
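
    Once the forward-scattered beam position and the spot coordinates are known, the quantities the tool reports follow from simple geometry: d-spacing is inversely proportional to a spot's distance from the pattern center, and the interplanar angle is the angle between the two spot vectors. A minimal sketch (coordinates in pixels; function name is ours):

```python
import math

def spot_geometry(spot1, spot2, center):
    """Return (d1/d2 ratio, interplanar angle in degrees) for two diffraction
    spots, using d proportional to 1/r where r is the spot's radius from the
    pattern center."""
    v1 = (spot1[0] - center[0], spot1[1] - center[1])
    v2 = (spot2[0] - center[0], spot2[1] - center[1])
    r1, r2 = math.hypot(*v1), math.hypot(*v2)
    d_ratio = r2 / r1  # d1/d2 = r2/r1
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (r1 * r2)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return d_ratio, angle
```

    With a calibrated camera constant, the absolute d-spacings follow from the same radii; working with pixel ratios, as the abstract notes, avoids needing that calibration.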

  11. Automatic stabilization

    NASA Technical Reports Server (NTRS)

    Haus, FR

    1936-01-01

    This report concerns the study of automatic stabilizers and extends it to include the control of the three-control system of the airplane instead of just altitude control. Some of the topics discussed include lateral disturbed motion, static stability, the mathematical theory of lateral motion, and large angles of incidence. Various mechanisms and stabilizers are also discussed. The feeding of Diesel engines by injection pumps actuated by engine compression, achieves the required high speeds of injection readily and permits rigorous control of the combustible charge introduced into each cylinder and of the peak pressure in the resultant cycle.

  12. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis.

    PubMed

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140

  13. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    PubMed Central

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
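The core idea of executing tests only in previously-unseen states can be sketched as a filter that canonicalizes and hashes each observed application state. The class and method names below are illustrative, not the authors' implementation:

```python
import hashlib

class UnseenStateFilter:
    """Run deployment-environment tests only in states not seen before."""

    def __init__(self):
        self._seen = set()

    def should_test(self, state: dict) -> bool:
        # Canonicalize (sort keys) so equivalent states hash identically,
        # then remember the digest; test only on first encounter.
        key = hashlib.sha256(repr(sorted(state.items())).encode()).hexdigest()
        if key in self._seen:
            return False
        self._seen.add(key)
        return True
```

Storing only fixed-size digests keeps the in-field memory overhead small even when many distinct states are encountered.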

  14. Analysis of cannabis in oral fluid specimens by GC-MS with automatic SPE.

    PubMed

    Choi, Hyeyoung; Baeck, Seungkyung; Kim, Eunmi; Lee, Sooyeun; Jang, Moonhee; Lee, Juseon; Choi, Hwakyung; Chung, Heesun

    2009-12-01

    Methamphetamine (MA) is the most commonly abused drug in Korea, followed by cannabis. Traditionally, MA analysis is carried out on both urine and hair samples and cannabis analysis in urine samples only. Despite the fact that oral fluid has become increasingly popular as an alternative specimen in the field of driving under the influence of drugs (DUID) and workplace drug testing, its application has not been expanded to drug analysis in Korea. Oral fluid is easy to collect and handle and can provide an indication of recent drug abuse. In this study, we present an analytical method using GC-MS to determine tetrahydrocannabinol (THC) and its main metabolite 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in oral fluid. The validated method was applied to oral fluid samples collected from drug abuse suspects and the results were compared with those in urine. The stability of THC and THC-COOH in oral fluid stored in different containers was also investigated. Oral fluid specimens from 12 drug abuse suspects, submitted by the police, were collected by direct expectoration. The samples were screened with microplate ELISA. For confirmation they were extracted using automated SPE with a mixed-mode cation exchange cartridge, derivatized and analyzed by GC-MS using selective ion monitoring (SIM). The concentrations of THC and THC-COOH in oral fluid showed a large variation and the results from oral fluid and urine samples from cannabis abusers did not show any correlation. Thus, detailed information about the time interval between drug use and sample collection is needed to interpret the oral fluid results properly. In addition, further investigation of the detection time window of THC and THC-COOH in oral fluid is required before oral fluid can substitute for urine in drug testing. PMID:20120601

  15. Semi-automatic tool for segmentation and volumetric analysis of medical images.

    PubMed

    Heinonen, T; Dastidar, P; Kauppinen, P; Malmivuo, J; Eskola, H

    1998-05-01

    Segmentation software is described, developed for medical image processing and run on Windows. The software applies basic image processing techniques through a graphical user interface. For particular applications, such as brain lesion segmentation, the software enables the combination of different segmentation techniques to improve its efficiency. The program is applied for magnetic resonance imaging, computed tomography and optical images of cryosections. The software can be utilised in numerous applications, including pre-processing for three-dimensional presentations, volumetric analysis and construction of volume conductor models. PMID:9747567
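Volumetric analysis of a binary segmentation reduces to counting labeled voxels and scaling by the physical voxel volume. A minimal sketch, with a hypothetical function name rather than the described software:

```python
def segmented_volume(mask, voxel_size_mm):
    """Volume (mm^3) of a binary segmentation: voxel count x voxel volume.

    mask: 3-D nested list of 0/1 values (slices x rows x columns)
    voxel_size_mm: (dx, dy, dz) voxel edge lengths in millimetres
    """
    n_voxels = sum(v for sl in mask for row in sl for v in row)
    dx, dy, dz = voxel_size_mm
    return n_voxels * dx * dy * dz
```

For MRI or CT data, dx and dy come from the in-plane pixel spacing and dz from the slice thickness (plus any inter-slice gap).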

  16. The metagenomics RAST server - a public resource for the automatic phylogenetic and functional analysis of metagenomes.

    SciTech Connect

    Meyer, F.; Paarmann, D.; D'Souza, M.; Olson, R.; Glass, E. M.; Kubal, M.; Paczian, T.; Stevens, R.; Wilke, A.; Wilkening, J.; Edwards, R. A.; Rodriguez, A.; Mathematics and Computer Science; Univ. of Chicago; San Diego State Univ.

    2008-09-19

    Random community genomes (metagenomes) are now commonly used to study microbes in different environments. Over the past few years, the major challenge associated with metagenomics shifted from generating to analyzing sequences. High-throughput, low-cost next-generation sequencing has provided access to metagenomics to a wide range of researchers. A high-throughput pipeline has been constructed to provide high-performance computing to all researchers interested in using metagenomics. The pipeline produces automated functional assignments of sequences in the metagenome by comparing both protein and nucleotide databases. Phylogenetic and functional summaries of the metagenomes are generated, and tools for comparative metagenomics are incorporated into the standard views. User access is controlled to ensure data privacy, but the collaborative environment underpinning the service provides a framework for sharing datasets between multiple users. In the metagenomics RAST, all users retain full control of their data, and everything is available for download in a variety of formats. The open-source metagenomics RAST service provides a new paradigm for the annotation and analysis of metagenomes. With built-in support for multiple data sources and a back end that houses abstract data types, the metagenomics RAST is stable, extensible, and freely available to all researchers. This service has removed one of the primary bottlenecks in metagenome sequence analysis--the availability of high-performance computing for annotating the data.

  17. Performance portability study of an automatic target detection and classification algorithm for hyperspectral image analysis using OpenCL

    NASA Astrophysics Data System (ADS)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Garcia, Carlos; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    Recent advances in heterogeneous high performance computing (HPC) have opened new avenues for demanding remote sensing applications. Perhaps one of the most popular algorithms in target detection and identification is the automatic target detection and classification algorithm (ATDCA) widely used in the hyperspectral image analysis community. Previous research has already investigated the mapping of ATDCA on graphics processing units (GPUs) and field programmable gate arrays (FPGAs), showing impressive speedup factors that allow its exploitation in time-critical scenarios. Based on these studies, our work explores the performance portability of a tuned OpenCL implementation across a range of processing devices including multicore processors, GPUs and other accelerators. This approach differs from previous papers, which focused on achieving the optimal performance on each platform. Here, we are more interested in the following issues: (1) evaluating if a single code written in OpenCL allows us to achieve acceptable performance across all of them, and (2) assessing the gap between our portable OpenCL code and those hand-tuned versions previously investigated. Our study includes the analysis of different tuning techniques that expose data parallelism as well as enable an efficient exploitation of the complex memory hierarchies found in these new heterogeneous devices. Experiments have been conducted using hyperspectral data sets collected by NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensors. To the best of our knowledge, this kind of analysis has not been previously conducted in the hyperspectral imaging processing literature, and in our opinion it is very important in order to really calibrate the possibility of using heterogeneous platforms for efficient hyperspectral imaging processing in real remote sensing missions.
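ATDCA is commonly formulated with orthogonal subspace projection: each new target is the pixel least explainable by the targets already found. A minimal serial NumPy sketch of that common formulation (not the paper's OpenCL implementation):

```python
import numpy as np

def atdca(pixels, n_targets):
    """Orthogonal-subspace-projection ATDCA sketch.

    pixels: (N, B) array of N spectra with B bands.
    Returns the indices of the detected target pixels.
    """
    # Initial target: the pixel with the largest spectral norm.
    targets = [int(np.argmax(np.einsum("ij,ij->i", pixels, pixels)))]
    for _ in range(n_targets - 1):
        U = pixels[targets].T                            # (B, t) found targets
        # Projector onto the subspace orthogonal to span(U).
        P = np.eye(U.shape[0]) - U @ np.linalg.pinv(U)
        scores = np.einsum("ij,ij->i", pixels @ P, pixels)  # x^T P x per pixel
        targets.append(int(np.argmax(scores)))
    return targets
```

The per-pixel score computation is embarrassingly parallel, which is what makes GPU, FPGA and OpenCL mappings of the algorithm attractive.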

  18. How automatic is the musical Stroop effect? Commentary on “The musical Stroop effect: Opening a new avenue to research on automatisms” by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, vol. 60, pp. 269–278).

    PubMed

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one big advantage of this variant is that the automaticity of note naming can be better controlled than in other Stroop variants as musicians are very practiced in note reading whereas non-musicians are not. In this comment we argue that at present the exact impact of automaticity in this Stroop variant remains somewhat unclear for at least three reasons, namely due to the type of information that is automatically retrieved when notes are encountered, due to the possible influence of object-based attention, and finally due to the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme group design. PMID:24449648

  19. Automatic Tumor-Stroma Separation in Fluorescence TMAs Enables the Quantitative High-Throughput Analysis of Multiple Cancer Biomarkers

    PubMed Central

    Lahrmann, Bernd; Halama, Niels; Sinn, Hans-Peter; Schirmacher, Peter; Jaeger, Dirk; Grabe, Niels

    2011-01-01

    The upcoming quantification and automation in biomarker based histological tumor evaluation will require computational methods capable of automatically identifying tumor areas and differentiating them from the stroma. As no single generally applicable tumor biomarker is available, pathology routinely uses morphological criteria as a spatial reference system. We here present and evaluate a method capable of performing the classification in immunofluorescence histological slides solely using a DAPI background stain. Due to the restriction to a single color channel this is inherently challenging. We formed cell graphs based on the topological distribution of the tissue cell nuclei and extracted the corresponding graph features. By using topological, morphological and intensity based features we could systematically quantify and compare the discrimination capability individual features contribute to the overall algorithm. We here show that when classifying fluorescence tissue slides in the DAPI channel, morphological and intensity based features clearly outpace topological ones which have been used exclusively in related previous approaches. We assembled the 15 best features to train a support vector machine based on Keratin stained tumor areas. On a test set of TMAs with 210 cores of triple negative breast cancers our classifier was able to distinguish between tumor and stroma tissue with a total overall accuracy of 88%. Our method yields first results on the discrimination capability of feature groups, which is essential for automated tumor diagnostics. It also provides an objective spatial reference system for the multiplex analysis of biomarkers in fluorescence immunohistochemistry. PMID:22164226
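To illustrate classifying tissue regions from extracted feature vectors, the sketch below substitutes a simple nearest-centroid rule for the paper's support vector machine; the feature values and function names are hypothetical:

```python
import math

def train_centroids(features, labels):
    """Mean feature vector per class label (e.g. 'tumor' / 'stroma')."""
    sums, counts = {}, {}
    for x, y in zip(features, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda y: math.dist(x, centroids[y]))
```

A real pipeline would, as in the paper, select the most discriminative morphological and intensity features before training the classifier.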

  20. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    NASA Astrophysics Data System (ADS)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking as input multiple quality metrics. The first method uses a neural network based machine learning process. The second method consists of evaluating video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work done on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  1. Automatic clustering and population analysis of white matter tracts using maximum density paths.

    PubMed

    Prasad, Gautam; Joshi, Shantanu H; Jahanshad, Neda; Villalon-Reina, Julio; Aganj, Iman; Lenglet, Christophe; Sapiro, Guillermo; McMahon, Katie L; de Zubicaray, Greig I; Martin, Nicholas G; Wright, Margaret J; Toga, Arthur W; Thompson, Paul M

    2014-08-15

    We introduce a framework for population analysis of white matter tracts based on diffusion-weighted images of the brain. The framework enables extraction of fibers from high angular resolution diffusion images (HARDI); clustering of the fibers based partly on prior knowledge from an atlas; representation of the fiber bundles compactly using a path following points of highest density (maximum density path; MDP); and registration of these paths together using geodesic curve matching to find local correspondences across a population. We demonstrate our method on 4-Tesla HARDI scans from 565 young adults to compute localized statistics across 50 white matter tracts based on fractional anisotropy (FA). Experimental results show increased sensitivity in the determination of genetic influences on principal fiber tracts compared to the tract-based spatial statistics (TBSS) method. Our results show that the MDP representation reveals important parts of the white matter structure and considerably reduces the dimensionality over comparable fiber matching approaches. PMID:24747738

  2. Automatic neuron segmentation and neural network analysis method for phase contrast microscopy images

    PubMed Central

    Pang, Jincheng; Özkucur, Nurdan; Ren, Michael; Kaplan, David L.; Levin, Michael; Miller, Eric L.

    2015-01-01

    Phase Contrast Microscopy (PCM) is an important tool for the long term study of living cells. Unlike fluorescence methods which suffer from photobleaching of fluorophore or dye molecules, PCM image contrast is generated by the natural variations in optical index of refraction. Unfortunately, the same physical principles which allow for these studies give rise to complex artifacts in the raw PCM imagery. Of particular interest in this paper are neuron images where these image imperfections manifest in very different ways for the two structures of specific interest: cell bodies (somas) and dendrites. To address these challenges, we introduce a novel parametric image model using the level set framework and an associated variational approach which simultaneously restores and segments this class of images. Using this technique as the basis for an automated image analysis pipeline, results for both the synthetic and real images validate and demonstrate the advantages of our approach. PMID:26601004

  3. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static-random-access-memory (SRAM) array or a standard cell library, or by using a simple design rule checking run-set. The resulting database was then used as an input for choosing locations for critical dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: transistor sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to insufficient N-well-to-P-well spacing. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis for a large area of a high density standard cell library was done. Another set of monitoring focused on a high density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1 enclosed contact was extracted based on contact resistance as a feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  4. Suitability of UK Biobank Retinal Images for Automatic Analysis of Morphometric Properties of the Vasculature

    PubMed Central

    MacGillivray, Thomas J; Cameron, James R.; Zhang, Qiuli; El-Medany, Ahmed; Mulholland, Carl; Sheng, Ziyan; Dhillon, Bal; Doubal, Fergus N.; Foster, Paul J.

    2015-01-01

    Purpose To assess the suitability of retinal images held in the UK Biobank - the largest retinal data repository in a prospective population-based cohort - for computer assisted vascular morphometry, generating measures that are commonly investigated as candidate biomarkers of systemic disease. Methods Non-mydriatic fundus images from both eyes of 2,690 participants - people with a self-reported history of myocardial infarction (n=1,345) and a matched control group (n=1,345) - were analysed using VAMPIRE software. These images were drawn from those of 68,554 UK Biobank participants who underwent retinal imaging at recruitment. Four operators were trained in the use of the software to measure retinal vascular tortuosity and bifurcation geometry. Results Total operator time was approximately 360 hours (4 minutes per image). 2,252 (84%) of participants had at least one image of sufficient quality for the software to process, i.e. there was sufficient detection of retinal vessels in the image by the software to attempt the measurement of the target parameters. 1,604 (60%) of participants had an image of at least one eye that was adequately analysed by the software, i.e. the measurement protocol was successfully completed. Increasing age was associated with a reduced proportion of images that could be processed (p=0.0004) and analysed (p<0.0001). Cases exhibited more acute arteriolar branching angles (p=0.02) as well as lower arteriolar and venular tortuosity (p<0.0001). Conclusions A proportion of the retinal images in UK Biobank are of insufficient quality for automated analysis. However, the large size of the UK Biobank means that tens of thousands of images are available and suitable for computational analysis. Parametric information measured from the retinas of participants with suspected cardiovascular disease was significantly different to that measured from a matched control group. PMID:26000792

  5. Operating safety of automatic objects

    NASA Astrophysics Data System (ADS)

    Maiorov, Anatolii Vladimirovich; Moskatov, Genrikh Karlovich; Shibanov, Georgii Petrovich

    Operating-safety assurance for automatic objects (aircraft, spacecraft, and underwater vehicles) is considered in the framework of safety-automata theory and automatic-control considerations. The interaction between the operator and the safety-assurance facilities is considered. Methodological recommendations are presented on the specification of reliability requirements for the vehicles considered, as well as on automata synthesis and analysis considerations, test planning, and the analysis of test results.

  6. Evaluation of Septa Quality for Automatic SPME–GC–MS Trace Analysis

    PubMed Central

    Ulanowska, Agnieszka; Ligor, Tomasz; Amann, Anton; Buszewski, Bogusław

    2012-01-01

    The vials used for the preparation of breath samples for automated solid-phase microextraction–gas chromatography–mass spectrometry analysis are crimped with septa. These septa often emit specific volatile organic compounds (VOCs) confounding the measurement results of breath samples. In the current paper, 14 different brands of magnetic caps with silicone–polytetrafluoroethylene (PTFE), butyl–PTFE, or butyl rubber septa were tested. The total emission of septa over a 4 h period was also evaluated. The tested septa emitted 39 different compounds, which are mainly hydrocarbons, alcohols, and ketones. Acetone and toluene are the most abundant out-gassing products. The concentration of acetone was in the range from 55 to 694 ppb for butyl–PTFE septum (brand 14) and butyl rubber (brand 10), respectively. The measured toluene amount was 69–1323 ppb for the septum brand 14 and brand 8 (silicone–PTFE), respectively. Generally, the butyl rubber septa released higher amounts of contaminants in comparison to the silicone ones. PMID:22291050

  7. Inter-observer Variability Analysis of Automatic Lung Delineation in Normal and Disease Patients.

    PubMed

    Saba, Luca; Than, Joel C M; Noor, Norliza M; Rijal, Omar M; Kassim, Rosminah M; Yunus, Ashari; Ng, Chue R; Suri, Jasjit S

    2016-06-01

    Human interaction has become almost mandatory for an automated medical system wishing to be accepted by clinical regulatory agencies such as the Food and Drug Administration. Since this interaction causes variability in the gathered data, the inter-observer and intra-observer variability must be analyzed in order to validate the accuracy of the system. This study focuses on the variability from different observers that interact with an automated lung delineation system that relies on human interaction in the form of delineation of the lung borders. The database consists of High Resolution Computed Tomography (HRCT): 15 normal and 81 diseased patients' images taken retrospectively at five levels per patient. Three observers independently delineated the lung borders manually, using software called ImgTracer™ (AtheroPoint™, Roseville, CA, USA) to delineate the lung boundaries in all five levels of 3-D lung volume. The three observers consisted of Observer-1: a lesser experienced novice tracer who is a resident in radiology under the guidance of a radiologist, whereas Observer-2 and Observer-3 are lung image scientists trained by a lung radiologist and biomedical imaging scientists and experts. The inter-observer variability can be shown by comparing each observer's tracings to the automated delineation and also by comparing each manual tracing of the observers with one another. The normality of the tracings was tested using the D'Agostino-Pearson test and all observers' tracings showed a normal P-value higher than 0.05. The analysis of variance (ANOVA) test between three observers and automated showed a P-value higher than 0.89 and 0.81 for the right lung (RL) and left lung (LL), respectively. The performance of the automated system was evaluated using Dice Similarity Coefficient (DSC), Jaccard Index (JI) and Hausdorff (HD) Distance measures. Although Observer-1 has lesser experience compared to Observer-2 and Observer-3, the Observer Deterioration Factor (ODF) shows that

  8. Automaticity of Conceptual Magnitude.

    PubMed

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-01-01

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object's conceptual magnitude processed? It was suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggested that different types of magnitude processing and representation share the same core system. PMID:26879153

  9. Automaticity of Conceptual Magnitude

    PubMed Central

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-01-01

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object’s conceptual magnitude processed? It was suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggested that different types of magnitude processing and representation share the same core system. PMID:26879153
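The congruity effect and the quartile analysis described above amount to simple statistics over response times. A minimal sketch with hypothetical function names:

```python
from statistics import mean

def congruity_effect(rt_congruent, rt_incongruent):
    """Mean RT difference (ms); a positive value indicates interference
    from the irrelevant magnitude dimension."""
    return mean(rt_incongruent) - mean(rt_congruent)

def quartile_means(rts):
    """Mean RT within each quartile of the sorted RT distribution,
    used to track how the effect develops with processing time."""
    s = sorted(rts)
    q = len(s) // 4
    return [mean(s[i * q:(i + 1) * q]) for i in range(4)]
```

Comparing the congruity effect across matching quartiles of the two magnitude tasks is one way to test whether both are affected similarly by processing time.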

  10. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    PubMed Central

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms. PMID:26393595

  11. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    PubMed

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms. PMID:26393595

  12. Large-scale tracking and classification for automatic analysis of cell migration and proliferation, and experimental optimization of high-throughput screens of neuroblastoma cells.

    PubMed

    Harder, Nathalie; Batra, Richa; Diessl, Nicolle; Gogolin, Sina; Eils, Roland; Westermann, Frank; König, Rainer; Rohr, Karl

    2015-06-01

    Computational approaches for automatic analysis of image-based high-throughput and high-content screens are gaining increased importance to cope with the large amounts of data generated by automated microscopy systems. Typically, automatic image analysis is used to extract phenotypic information once all images of a screen have been acquired. However, also in earlier stages of large-scale experiments image analysis is important, in particular, to support and accelerate the tedious and time-consuming optimization of the experimental conditions and technical settings. We here present a novel approach for automatic, large-scale analysis and experimental optimization with application to a screen on neuroblastoma cell lines. Our approach consists of cell segmentation, tracking, feature extraction, classification, and model-based error correction. The approach can be used for experimental optimization by extracting quantitative information which allows experimentalists to optimally choose and to verify the experimental parameters. This involves systematically studying the global cell movement and proliferation behavior. Moreover, we performed a comprehensive phenotypic analysis of a large-scale neuroblastoma screen including the detection of rare division events such as multi-polar divisions. Major challenges of the analyzed high-throughput data are the relatively low spatio-temporal resolution in conjunction with densely growing cells as well as the high variability of the data. To account for the data variability we optimized feature extraction and classification, and introduced a gray value normalization technique as well as a novel approach for automatic model-based correction of classification errors. In total, we analyzed 4,400 real image sequences, covering observation periods of around 120 h each. We performed an extensive quantitative evaluation, which showed that our approach yields high accuracies of 92.2% for segmentation, 98.2% for tracking, and 86.5% for

  13. [Reliability of % vol. declarations on labels of wine bottles].

    PubMed

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

    The Council Regulation (EC) no. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (Abl. L 179 dated 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles must indicate, among other things, the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength established by analysis; only for quality wines stored in bottles for more than three years is the accepted tolerance +/- 0.8% vol. The presented investigation results show that deviations have to be taken into account which may be highly relevant for forensic practice. PMID:15887778
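    The tolerance rule above reduces to a simple check. A minimal sketch in Python; the function name and the bottle-age parameter are hypothetical conveniences, not part of the regulation's wording:

```python
def label_within_tolerance(declared_vol, measured_vol, bottle_aged_years=0.0):
    """Check an alcoholic-strength declaration against the EC tolerance.

    Tolerance is +/-0.5 % vol in general, widened to +/-0.8 % vol for
    quality wines stored in bottles for more than three years.
    """
    tolerance = 0.8 if bottle_aged_years > 3 else 0.5
    return abs(declared_vol - measured_vol) <= tolerance

# A label declaring 12.5 % vol with 13.1 % vol found by analysis fails:
print(label_within_tolerance(12.5, 13.1))        # False
print(label_within_tolerance(12.5, 13.1, 4.0))   # True (aged quality wine)
```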

  14. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    NASA Astrophysics Data System (ADS)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS (International Monitoring System) along with possible treaty violations. Another civil source of radioxenon emissions contributing to the global background is radiopharmaceutical production. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The nuclide activity results, together with other spectrum parameters, were saved to the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful for identifying the specific mechanisms of activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  15. An automatic sleep-stage analysis system with off-line high-speed processing using a super mini-computer.

    PubMed

    Kuwahara, H; Tanaka, M; Mizuki, Y; Suetsugi, M

    1996-01-01

    An automatic sleep analysis system using a super mini-computer was developed, improving and expanding on the data processing previously performed with a mini-computer. It had the following features: 1) wave-forms were collected and analyzed at high speed (reproduced at 10 or 20 times the speed of a data recorder) by an off-line procedure to utilize the computer resources more efficiently; 2) all information and the original wave-forms were output to a laser printer because of the lower cost and more efficient arrangement of the data; 3) various wave-form parameters were measured by wave-form analysis; 4) the application program was based on a general-purpose language; and 5) wave-form reanalysis and reconstruction of the logic were easily implemented for automatic evaluation of the sleep stages. In one of the cases analyzed, automatic staging was impossible for 15 of 1484 periods (20 sec per period), and 142 of 1484 periods had to be corrected because of erroneous identification. PMID:8942145

  16. A semantic approach to the efficient integration of interactive and automatic target recognition systems for the analysis of complex infrastructure from aerial imagery

    NASA Astrophysics Data System (ADS)

    Bauer, A.; Peinsipp-Byma, E.

    2008-04-01

    The analysis of complex infrastructure from aerial imagery, for instance a detailed analysis of an airfield, requires the interpreter, besides being familiar with the sensor's imaging characteristics, to have a detailed understanding of the infrastructure domain. The required domain knowledge includes knowledge about the processes and functions involved in the operation of the infrastructure, the potential objects used to provide those functions, and their spatial and functional interrelations. Since it is not yet possible to provide reliable automatic object recognition (AOR) for the analysis of such complex scenes, we developed systems to support a human interpreter with either interactive approaches, which assist the interpreter with previously acquired expert knowledge about the domain in question, or AOR methods capable of detecting, recognizing or analyzing certain classes of objects for certain sensors. We believe that, to achieve an optimal result at the end of an interpretation process in terms of efficiency and effectiveness, it is essential to integrate both interactive and automatic approaches to image interpretation. In this paper we present an approach, inspired by advancing semantic web technology, to represent domain knowledge, the capabilities of available AOR modules and the image parameters in an explicit way. This enables us to seamlessly extend an interactive image interpretation environment with AOR modules such that we can automatically select suitable AOR methods for the current subtask, focus them on an appropriate area of interest and reintegrate their results into the environment.

  17. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    PubMed

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes. PMID:12435377

  18. English Grammar, A Combined Tagmemic and Transformational Approach. A Contrastive Analysis of English and Vietnamese, Vol. 1. Linguistic Circle of Canberra Publications, Series C--Books, No. 3.

    ERIC Educational Resources Information Center

    Nguyen, Dang Liem

    This is the first volume of a contrastive analysis of English and Vietnamese in the light of a combined tagmemic and transformational approach. The dialects contrasted are Midwest Standard American English and Standard Saigon Vietnamese. The study has been designed chiefly for pedagogical applications. A general introduction gives the history of…

  19. Automatic transmission

    SciTech Connect

    Miura, M.; Inuzuka, T.

    1986-08-26

    1. An automatic transmission with four forward speeds and one reverse position is described, which consists of: an input shaft; an output member; first and second planetary gear sets each having a sun gear, a ring gear and a carrier supporting a pinion in mesh with the sun gear and ring gear; the carrier of the first gear set, the ring gear of the second gear set and the output member all being connected; the ring gear of the first gear set connected to the carrier of the second gear set; a first clutch means for selectively connecting the input shaft to the sun gear of the first gear set, including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a second clutch means for selectively connecting the input shaft to the sun gear of the second gear set; a third clutch means for selectively connecting the input shaft to the carrier of the second gear set, including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a first drive-establishing means for selectively preventing rotation of the ring gear of the first gear set and the carrier of the second gear set in only one direction and, alternatively, in any direction; a second drive-establishing means for selectively preventing rotation of the sun gear of the second gear set; and a drum being open to the first planetary gear set, with a cylindrical intermediate wall, an inner peripheral wall and outer peripheral wall and forming the hydraulic servos of the first and third clutch means between the intermediate wall and the inner peripheral wall and between the intermediate wall and the outer peripheral wall respectively.

  20. How well Do Phonological Awareness and Rapid Automatized Naming Correlate with Chinese Reading Accuracy and Fluency? A Meta-Analysis

    ERIC Educational Resources Information Center

    Song, Shuang; Georgiou, George K.; Su, Mengmeng; Hua, Shu

    2016-01-01

    Previous meta-analyses on the relationship between phonological awareness, rapid automatized naming (RAN), and reading have been conducted primarily in English, an atypical alphabetic orthography. Here, we aimed to examine the association between phonological awareness, RAN, and word reading in a nonalphabetic language (Chinese). A random-effects…

  1. "AID"-ing Academic Program Evaluation: The "Automatic Interaction Detector" as Analysis Tool. AIR 1984 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Bloom, Allan M.; And Others

    The use of the Automatic Interaction Detector (program AID3 of the OSIRIS statistical package) to study a university program is discussed. The performance of students who took general physics lecture and laboratory concurrently is compared to the performance of those who took them separately. Five years of data are analyzed, covering 1,997…

  2. A computer program to automatically generate state equations and macro-models. [for network analysis and design

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.; Bowers, J. C.; Oreilly, J. E., Jr.

    1978-01-01

    A computer program, PROSE, that produces nonlinear state equations from a simple topological description of an electrical or mechanical network is described. Unnecessary states are also automatically eliminated, so that a simplified terminal circuit model is obtained. The program also prints out the eigenvalues of a linearized system and the sensitivities of the eigenvalue of largest magnitude.

  3. Robust automatic P-phase picking: an on-line implementation in the analysis of broadband seismogram recordings

    NASA Astrophysics Data System (ADS)

    Sleeman, Reinoud; van Eck, Torild

    1999-06-01

    The onset of a seismic signal is determined through joint AR modeling of the noise and the seismic signal, and the application of the Akaike Information Criterion (AIC) using the onset time as parameter. This so-called AR-AIC phase picker has been tested successfully and implemented on the Z-component of the broadband station HGN to provide automatic P-phase picks for a rapid warning system. The AR-AIC picker is shown to provide accurate and robust automatic picks on a large experimental database. Out of 1109 P-phase onsets with signal-to-noise ratio (SNR) above 1 from local, regional and teleseismic earthquakes, our implementation detects 71% and gives a mean difference with manual picks of 0.1 s. An optimal version of the well-established picker of Baer and Kradolfer [Baer, M., Kradolfer, U., An automatic phase picker for local and teleseismic events, Bull. Seism. Soc. Am. 77 (1987) 1437-1445] detects less than 41% and gives a mean difference with manual picks of 0.3 s using the same dataset.
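    The AIC onset criterion can be illustrated with a simplified variance-based form of the picker (often attributed to Maeda) rather than the full joint AR modeling the authors use; a sketch under that assumption, on a synthetic trace:

```python
import numpy as np

def aic_onset(x):
    """Simplified AIC onset picker (variance form of the AR-AIC idea).

    AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])); the onset is
    the index minimising AIC, i.e. the best split point between a
    'noise' model and a 'signal' model of the trace.
    """
    N = len(x)
    aic = np.full(N, np.inf)
    for k in range(2, N - 2):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (N - k - 1) * np.log(v2)
    return int(np.argmin(aic))

rng = np.random.default_rng(0)
trace = np.concatenate([0.1 * rng.normal(size=500),   # pre-onset noise
                        1.0 * rng.normal(size=500)])  # P-phase energy
print(aic_onset(trace))  # close to sample 500, the true onset
```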

  4. Comparative analysis of different implementations of a parallel algorithm for automatic target detection and classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio; Plaza, Javier

    2009-08-01

    Automatic target detection in hyperspectral images is a task that has attracted a lot of attention recently. In the last few years, several algorithms have been developed for this purpose, including the well-known RX algorithm for anomaly detection, and the automatic target detection and classification algorithm (ATDCA), which uses an orthogonal subspace projection (OSP) approach to extract a set of spectrally distinct targets automatically from the input hyperspectral data. Depending on the complexity and dimensionality of the analyzed image scene, the target/anomaly detection process may be computationally very expensive, a fact that limits the possibility of utilizing this process in time-critical applications. In this paper, we develop computationally efficient parallel versions of both the RX and ATDCA algorithms for near real-time exploitation. In the case of ATDCA, we use several distance metrics in addition to the OSP approach. The parallel versions are quantitatively compared in terms of target detection accuracy, using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center in New York, five days after the terrorist attack of September 11th, 2001, and also in terms of parallel performance, using a massive Beowulf cluster available at NASA's Goddard Space Flight Center in Maryland.
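    For reference, the classical (global) RX detector amounts to the Mahalanobis distance of each pixel spectrum from the scene background statistics; a serial NumPy sketch on synthetic data, not the parallel implementation developed in the paper:

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of each pixel
    spectrum from the background mean and covariance.

    cube: (rows, cols, bands) hyperspectral array.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    d = X - mu
    # Quadratic form d_i^T C^{-1} d_i for every pixel i at once:
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return scores.reshape(rows, cols)

rng = np.random.default_rng(1)
cube = rng.normal(size=(32, 32, 10))
cube[5, 5] += 8.0                      # implant one anomalous pixel
scores = rx_scores(cube)
i, j = np.unravel_index(scores.argmax(), scores.shape)
print(int(i), int(j))  # 5 5
```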

  5. Automatic Spectroscopic Data Categorization by Clustering Analysis (ASCLAN): A Data-Driven Approach for Distinguishing Discriminatory Metabolites for Phenotypic Subclasses.

    PubMed

    Zou, Xin; Holmes, Elaine; Nicholson, Jeremy K; Loo, Ruey Leng

    2016-06-01

    We propose a novel data-driven approach aiming to reliably distinguish discriminatory metabolites from nondiscriminatory metabolites for a given spectroscopic data set containing two biological phenotypic subclasses. The automatic spectroscopic data categorization by clustering analysis (ASCLAN) algorithm aims to categorize spectral variables within a data set into three clusters corresponding to noise, nondiscriminatory and discriminatory metabolite regions. This is achieved by clustering each spectral variable based on the r(2) value representing the loading weight of each spectral variable as extracted from an orthogonal partial least-squares discriminant analysis (OPLS-DA) model of the data set. The variables are ranked according to r(2) values and a series of principal component analysis (PCA) models are then built for subsets of these spectral data corresponding to ranges of r(2) values. The Q(2)X value for each PCA model is extracted. K-means clustering is then applied to the Q(2)X values to generate two clusters based on a minimum Euclidean distance criterion. The cluster consisting of lower Q(2)X values is deemed devoid of metabolic information (noise), while the cluster consisting of higher Q(2)X values is further subclustered into two groups based on the r(2) values. We considered the cluster with high Q(2)X but low r(2) values as nondiscriminatory, and the cluster with high Q(2)X and r(2) values as discriminatory. The boundaries between these three clusters of spectral variables, on the basis of the r(2) values, were considered the cut-off values for defining the noise, nondiscriminatory and discriminatory variables. We evaluated the ASCLAN algorithm using six simulated (1)H NMR spectroscopic data sets representing small, medium and large data sets (N = 50, 500, and 1000 samples per group, respectively), each with a reduced and full resolution set of variables (0.005 and 0.0005 ppm, respectively). ASCLAN correctly identified all discriminatory
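    The K-means step on the Q(2)X values is one-dimensional with k = 2 and can be sketched in a few lines; the Q(2)X values below are synthetic, not data from the paper:

```python
import numpy as np

def two_means_1d(values, iters=50):
    """Minimal 1-D k-means (k=2), as used to split spectral variables into
    a low-Q2X 'noise' cluster and a high-Q2X informative cluster."""
    v = np.asarray(values, float)
    c = np.array([v.min(), v.max()])          # initialise at the extremes
    for _ in range(iters):
        # assign each value to the nearer centre, then update the centres
        labels = (np.abs(v - c[0]) > np.abs(v - c[1])).astype(int)
        for k in (0, 1):
            if np.any(labels == k):
                c[k] = v[labels == k].mean()
    return labels, c

q2x = np.array([0.02, 0.05, 0.04, 0.71, 0.80, 0.76, 0.03, 0.69])
labels, centres = two_means_1d(q2x)
print(labels)   # -> [0 0 0 1 1 1 0 1]: noise vs. informative variables
```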

  6. A semi-automatic measurement system based on digital image analysis for the application to the single fiber fragmentation test

    NASA Astrophysics Data System (ADS)

    Blobel, Swen; Thielsch, Karin; Ulbricht, Volker

    2013-04-01

    The computational prediction of the effective macroscopic material behavior of fiber reinforced composites is a goal of research to exploit the potential of these materials. Besides the mechanical characteristics of the material components, extensive knowledge of the mechanical interaction between these components is necessary in order to set up suitable models of the local material structure. For example, an experimental investigation of the micromechanical damage behavior of simplified composite specimens can help to understand the mechanisms that cause matrix and interface damage in the vicinity of a fiber fracture. To realize an appropriate experimental setup, a novel semi-automatic measurement system based on the analysis of digital images using photoelasticity and image correlation was developed. Applied to specimens with a birefringent matrix material, it is able to provide global and local information on the damage evolution and the stress and strain state at the same time. The image acquisition is accomplished using a long-distance microscope optic with an effective resolution of two micrometers per pixel. While the system is moved along the domain of interest of the specimen, the acquired images are assembled online and used to interpret optically extracted information in combination with global force-displacement curves provided by the load frame. The illumination of the specimen with circularly polarized light and the projection of the transmitted light through different configurations of polarizer and quarter-wave plates enables the synchronous capturing of four images at the quadrants of a four megapixel image sensor. The fifth image is decoupled from the same optical path and is projected to a second camera chip, to get a non-polarized image of the same scene at the same time. The benefit of this optical setup is the opportunity to extract a wide range of information locally, without influence on the progress of the experiment. The four images

  7. CIRF Publications, Vol. 12, No. 5.

    ERIC Educational Resources Information Center

    International Labour Office, Geneva (Switzerland).

    CIRF Publications, Vol. 12, No. 5 is a collection of 80 abstracts giving particular attention to education, training, and economic growth in developing countries, Iran, Japan, Kenya, the Solomon Islands, and Sri Lanka; vocational rehabilitation in Italy, Spain, the United Kingdom, and the U. S. A.; agriculture in Chad, developing countries, and…

  8. Fully Automatic Determination of Soil Bacterium Numbers, Cell Volumes, and Frequencies of Dividing Cells by Confocal Laser Scanning Microscopy and Image Analysis

    PubMed Central

    Bloem, J.; Veninga, M.; Shepherd, J.

    1995-01-01

    We describe a fully automatic image analysis system capable of measuring cell numbers, volumes, lengths, and widths of bacteria in soil smears. The system also determines the number of cells in agglomerates and thus provides the frequency of dividing cells (FDC). Images are acquired from a confocal laser scanning microscope. The grey images are smoothed by convolution and by morphological erosion and dilation to remove noise. The background is equalized by flooding holes in the image and is then subtracted by two top hat transforms. Finally, the grey image is sharpened by delineation, and all particles above a fixed threshold are detected. The number of cells in each detected particle is determined by counting the number of local grey-level maxima in the particle. Thus, up to 1,500 cells in 10 fields of view in a soil smear are analyzed in 30 min without human intervention. Automatic counts of cell numbers and FDC were similar to visual counts in field samples. In microcosms, automatic measurements showed significant increases in cell numbers, FDC, mean cell volume, and length-to-width ratio after amendment of the soil. Volumes of fluorescent microspheres were measured with good approximation, but the absolute values obtained were strongly affected by the settings of the detector sensitivity. Independent measurements of bacterial cell numbers and volumes by image analysis and of cell carbon by a total organic carbon analyzer yielded an average specific carbon content of 200 fg of C μm(sup -3), which indicates that our volume estimates are reasonable. PMID:16534976
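    The cell-splitting rule, counting local grey-level maxima inside a detected particle, can be sketched directly; the toy image below is illustrative, not taken from the paper:

```python
import numpy as np

def count_local_maxima(grey, mask):
    """Count strict local grey-level maxima inside a binary particle mask,
    the rule used to split an agglomerate into individual cells."""
    g = np.pad(grey.astype(float), 1, constant_values=-np.inf)
    count = 0
    for i, j in zip(*np.nonzero(mask)):
        nb = g[i:i + 3, j:j + 3].copy()   # 3x3 neighbourhood of (i, j)
        centre = nb[1, 1]
        nb[1, 1] = -np.inf
        if centre > nb.max():
            count += 1
    return count

grey = np.array([[0, 1, 0, 0, 0],
                 [1, 5, 1, 0, 0],
                 [0, 1, 2, 1, 0],
                 [0, 0, 1, 6, 1],
                 [0, 0, 0, 1, 0]])
mask = grey > 0
print(count_local_maxima(grey, mask))  # 2 (two cells in one agglomerate)
```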

  9. Innovative automatic resonant mode identification for nano-scale dynamic full-field characterization of MEMS using interferometric fringe analysis

    NASA Astrophysics Data System (ADS)

    Chen, Liang-Chia; Huang, Yao-Ting; Lai, Huang-Wen; Chen, Jin-Liang; Chang, Calvin C.

    2008-12-01

    A dynamic 3D nano-scale surface profilometer was developed for automatic resonant frequency identification based on the stroboscopic interferometry principle. With the rapid increase in industrial applications of micro-electromechanical systems (MEMS), accurate dynamic characterization has become a major challenge in design and fabrication. In view of this, an interferometric microscope was developed using LED stroboscopic interferometry to achieve dynamic full-field profilometry and characterization of MEMS with a measurement bandwidth exceeding 1 MHz. Most importantly, a novel detection algorithm was developed that employs an interferogram fringe-density measure for automatic resonant frequency identification. Natural resonant modes of a series of microstructures can be accurately detected, giving values consistent with theoretical ones. To verify the effectiveness of the developed methodology, an AFM cantilever microbeam and a cross-bridge microbeam were measured to analyze their full-field resonant vibratory shapes. Our experimental results confirmed that the resonant vibration of the tested beams can be fully characterized while achieving a vertical measurement accuracy of 3-5 nm over a vertical range of tens of micrometers.

  10. A prostate CAD system based on multiparametric analysis of DCE T1-w, and DW automatically registered images

    NASA Astrophysics Data System (ADS)

    Giannini, Valentina; Vignati, Anna; Mazzetti, Simone; De Luca, Massimo; Bracco, Christian; Stasi, Michele; Russo, Filippo; Armando, Enrico; Regge, Daniele

    2013-02-01

    Prostate specific antigen (PSA)-based screening reduces the rate of death from prostate cancer (PCa) by 31%, but this benefit is associated with a high risk of overdiagnosis and overtreatment. As prostate transrectal ultrasound-guided biopsy, the standard procedure for prostate histological sampling, has a sensitivity of 77% with a considerable false-negative rate, more accurate methods are needed to detect or rule out significant disease. Prostate magnetic resonance imaging has the potential to improve the specificity of PSA-based screening scenarios as a non-invasive detection tool, in particular by exploiting the combination of anatomical and functional information in a multiparametric framework. The purpose of this study was to describe a computer aided diagnosis (CAD) method that automatically produces a malignancy likelihood map by combining information from dynamic contrast enhanced MR images and diffusion weighted images. The CAD system consists of multiple sequential stages, from a preliminary registration of images of different sequences, to correct for susceptibility deformation and/or movement artifacts, to a Bayesian classifier, which fuses all the extracted features into a probability map. The promising results (AUROC = 0.87) remain to be validated on a larger dataset, but they suggest that voxel-wise discrimination between benign and malignant tissues is feasible with good performance. This method can help improve the diagnostic accuracy of the radiologist, reduce reader variability and speed up reading time by automatically highlighting probable cancer-suspicious regions.
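    As an illustration of fusing features into a probability map, a naive-Bayes combination of per-voxel likelihood ratios; this is a generic sketch with synthetic values, not the paper's trained Bayesian classifier:

```python
import numpy as np

def fuse_probability_map(likelihood_maps, prior=0.5):
    """Naive-Bayes fusion of per-voxel feature likelihood ratios into a
    single malignancy probability map."""
    log_odds = np.log(prior / (1 - prior))
    for lr in likelihood_maps:   # lr = P(feature|malignant) / P(feature|benign)
        log_odds = log_odds + np.log(lr)
    return 1.0 / (1.0 + np.exp(-log_odds))

# Two synthetic 4x4 maps of likelihood ratios (assumed values):
dce = np.ones((4, 4)); dce[1, 2] = 9.0      # strong DCE enhancement
adc = np.ones((4, 4)); adc[1, 2] = 4.0      # restricted diffusion
pmap = fuse_probability_map([dce, adc])
print(pmap[1, 2] > 0.9, pmap[0, 0])  # True 0.5 (suspicious voxel stands out; background stays at the prior)
```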

  11. Automatic screening of narrow anterior chamber angle and angle-closure glaucoma based on slit-lamp image analysis by using support vector machine.

    PubMed

    Theeraworn, C; Kongprawechnon, W; Kondo, T; Bunnun, P; Nishihara, A; Manassakorn, A

    2013-01-01

    At present, Van Herick's method is a standard technique used to screen for Narrow Anterior Chamber Angle (NACA) and Angle-Closure Glaucoma (ACG). It identifies patients who suffer from NACA and ACG by considering the width of the peripheral anterior chamber depth (PACD) and the corneal thickness. However, the screening result of this method often varies among ophthalmologists. Therefore, an automatic screening of NACA and ACG based on slit-lamp image analysis using a Support Vector Machine (SVM) is proposed. The SVM automatically generates the classification model, which is used to classify a case as angle-closure likely or angle-closure unlikely, and results show that it can improve the accuracy of screening. To develop the classification model, the widths of the PACD and corneal thickness at many positions are measured and selected as features. A statistical analysis is also used in the PACD and corneal thickness estimation to reduce the error from reflection on the cornea. In this study, the generated models, evaluated by 5-fold cross validation, give better results than classification by Van Herick's method. PMID:24111078
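    A linear SVM of this kind can be sketched without any ML library via the Pegasos subgradient method; the features and labels below are synthetic stand-ins for the PACD and corneal-thickness measurements, not clinical data:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=500, seed=0):
    """Tiny linear SVM trained with the Pegasos subgradient method
    (a pure-NumPy stand-in for the paper's SVM classifier)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            if y[i] * (X[i] @ w + b) < 1:  # hinge-loss violation
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1 - eta * lam) * w
    return w, b

# Toy features: [PACD width, corneal thickness ratio]; labels
# +1 = angle-closure likely, -1 = unlikely (synthetic, for illustration).
X = np.array([[0.10, 0.9], [0.12, 1.0], [0.15, 0.95],
              [0.45, 1.0], [0.50, 0.9], [0.55, 1.1]])
y = np.array([1, 1, 1, -1, -1, -1])
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
print(preds)  # matches y on this separable toy set
```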

  12. Automatic recognition of T and teleseismic P waves by statistical analysis of their spectra: An application to continuous records of moored hydrophones

    NASA Astrophysics Data System (ADS)

    Sukhovich, Alexey; Irisson, Jean-Olivier; Perrot, Julie; Nolet, Guust

    2014-08-01

    A network of moored hydrophones is an effective way of monitoring seismicity of oceanic ridges since it allows detection and localization of underwater events by recording generated T waves. The high cost of ship time necessitates long periods (normally a year) of autonomous functioning of the hydrophones, which results in very large data sets. The preliminary but indispensable part of the data analysis consists of identifying all T wave signals. This process is extremely time consuming if it is done by a human operator who visually examines the entire database. We propose a new method for automatic signal discrimination based on the Gradient Boosted Decision Trees technique that uses the distribution of signal spectral power among different frequency bands as the discriminating characteristic. We have applied this method to automatically identify the types of acoustic signals in data collected by two moored hydrophones in the North Atlantic. We show that the method is capable of efficiently resolving the signals of seismic origin with a small percentage of wrong identifications and missed events: 1.2% and 0.5% for T waves and 14.5% and 2.8% for teleseismic P waves, respectively. In addition, good identification rates for signals of other types (iceberg and ship generated) are obtained. Our results indicate that the method can be successfully applied to automate the analysis of other (not necessarily acoustic) databases provided that enough information is available to describe statistical properties of the signals to be identified.
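    The discriminating characteristic, the distribution of signal power among frequency bands, is straightforward to compute; a sketch with an assumed sampling rate and band layout (the Gradient Boosted Decision Trees classifier itself is omitted):

```python
import numpy as np

def band_power_features(signal, fs, bands):
    """Fraction of spectral power in each frequency band: the kind of
    feature vector fed to the boosted-tree classifier."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    total = spec.sum()
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum() / total
                     for lo, hi in bands])

fs = 250.0                         # hydrophone sampling rate (assumed)
t = np.arange(0, 4, 1 / fs)
t_wave = np.sin(2 * np.pi * 8 * t)   # T-wave-like energy in the 2-35 Hz band
p_wave = np.sin(2 * np.pi * 1 * t)   # teleseismic-P-like energy below 2 Hz

bands = [(0, 2), (2, 35), (35, 125)]
print(band_power_features(t_wave, fs, bands).round(2))  # power concentrated in band 2
print(band_power_features(p_wave, fs, bands).round(2))  # power concentrated in band 1
```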

  13. Automatic beamline calibration procedures

    SciTech Connect

    Corbett, W.J.; Lee, M.J.; Zambre, Y.

    1992-03-01

    Recent experience with the SLC and SPEAR accelerators has led to a well-defined set of procedures for calibration of the beamline model using the orbit fitting program, RESOLVE. Difference orbit analysis is used to calibrate quadrupole strengths, BPM sensitivities, corrector strengths, and focusing effects from insertion devices, and to determine the source of dispersion and coupling errors. Absolute orbit analysis is used to locate quadrupole misalignments, BPM offsets, or beam loss. For light source applications, the photon beam source coordinates can be found. The result is an accurate model of the accelerator which can be used for machine control. In this paper, automatable beamline calibration procedures are outlined and illustrated with recent examples. 5 refs.

  14. Annual Report: Automatic Informative Abstracting and Extracting.

    ERIC Educational Resources Information Center

    Earl, L. L.; And Others

    The development of automatic indexing, abstracting, and extracting systems is investigated. Part I describes the development of tools for making syntactic and semantic distinctions of potential use in automatic indexing and extracting. One of these tools is a program for syntactic analysis (i.e., parsing) of English, the other is a dictionary of…

  15. Dose equations for tube current modulation in CT scanning and the interpretation of the associated CTDI{sub vol}

    SciTech Connect

    Dixon, Robert L.; Boone, John M.

    2013-11-15

    Purpose: The scanner-reported CTDI{sub vol} for automatic tube current modulation (TCM) has a different physical meaning from the traditional CTDI{sub vol} at constant mA, resulting in the dichotomy “CTDI{sub vol} of the first and second kinds” for which a physical interpretation is sought in hopes of establishing some commonality between the two. Methods: Rigorous equations are derived to describe the accumulated dose distributions for TCM. A comparison with formulae for scanner-reported CTDI{sub vol} clearly identifies the source of their differences. Graphical dose simulations are also provided for a variety of TCM tube current distributions (including constant mA), all having the same scanner-reported CTDI{sub vol}. Results: These convolution equations and simulations show that the local dose at z depends only weakly on the local tube current i(z) due to the strong influence of scatter from all other locations along z, and that the “local CTDI{sub vol}(z)” does not represent a local dose but rather only a relative i(z) ≡ mA(z). TCM is a shift-variant technique to which the CTDI-paradigm does not apply and its application to TCM leads to a CTDI{sub vol} of the second kind which lacks relevance. Conclusions: While the traditional CTDI{sub vol} at constant mA conveys useful information (the peak dose at the center of the scan length), CTDI{sub vol} of the second kind conveys no useful information about the associated TCM dose distribution it purportedly represents and its physical interpretation remains elusive. On the other hand, the total energy absorbed E (“integral dose”) as well as its surrogate DLP remain robust between variable i(z) TCM and constant current i{sub 0} techniques, both depending only on the total mAs = i{sub 0}t{sub 0} during the beam-on time t{sub 0}.
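    The convolution structure referred to in the Results can be sketched as follows; the notation here is assumed for illustration and is not taken verbatim from the paper:

```latex
% Accumulated central-axis dose as a convolution of the tube-current
% profile i(z) with a single-rotation dose-spread kernel f(z) per unit mAs,
% for table advance b per rotation over scan length L:
D(z) = \frac{1}{b}\int_{-L/2}^{L/2} i(z')\, f(z - z')\, dz'
% For constant current i(z) = i_0 this reduces to the familiar CTDI form;
% for TCM, D(z) at any z mixes contributions of i(z') from the whole scan
% length, which is why the local dose depends only weakly on the local mA.
```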

  16. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    PubMed

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time-consuming and error-prone, especially for liqueurs, which may have problems with entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. The novel instrument offers increased steam power, a redesigned condenser geometry and a larger cooling coil with controllable flow, compared to previously available devices. Method optimization applying D-optimal and central composite designs showed a significant influence of sample volume, distillation time and coolant flow, while other investigated parameters such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement did not significantly influence the results. The method validation was conducted using the following settings: steam power 70%, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible, just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35% vol, the method showed adequate precision, with relative standard deviations below 0.4% (intraday) and below 0.6% (interday). The absolute standard deviations were between 0.06% vol and 0.08% vol (intraday) and between 0.07% vol and 0.10% vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before.

  17. Algorithms for skiascopy measurement automatization

    NASA Astrophysics Data System (ADS)

Fomins, Sergejs; Trukša, Renārs; Krūmiņa, Gunta

    2014-10-01

An automatic dynamic infrared retinoscope was developed, which allows the procedure to be run at a much higher rate. Our system uses a USB image sensor with a refresh rate of up to 180 Hz, equipped with a long-focus objective, and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye's pupillary reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic analysis of the accommodative state was developed, based on the intensity changes of the fundus reflex.

  18. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  19. Quantitative Analysis of Heavy Metals in Water Based on LIBS with an Automatic Device for Sample Preparation

    NASA Astrophysics Data System (ADS)

    Hu, Li; Zhao, Nanjing; Liu, Wenqing; Meng, Deshuo; Fang, Li; Wang, Yin; Yu, Yang; Ma, Mingjun

    2015-08-01

Heavy metals in water can be deposited on graphite flakes, which can be used as an enrichment method for laser-induced breakdown spectroscopy (LIBS) and is studied in this paper. The graphite samples were prepared with an automatic device composed of a loading and unloading module, a quantitative solution-adding module, a rapid heating and drying module and a precise rotating module. The experimental results showed that the sample preparation method had no significant effect on sample distribution, and the LIBS signal accumulated over 20 pulses was stable and repeatable. With an increasing amount of sample solution on the graphite flake, the peak intensity at Cu I 324.75 nm followed an exponential function with a correlation coefficient of 0.9963, while the background intensity remained unchanged. The limit of detection (LOD) was calculated through linear fitting of the peak intensity versus the concentration. The LOD decreased rapidly with an increasing amount of sample solution until the amount exceeded 20 mL, and the correlation coefficient of the exponential function fit was 0.991. The LOD of Pb, Ni, Cd, Cr and Zn after evaporating different amounts of sample solution on the graphite flakes was measured, and the variation of their LOD with sample solution amount was similar to that for Cu. The experimental data and conclusions could provide a reference for automatic sample preparation and in situ detection of heavy metals. Supported by the National Natural Science Foundation of China (No. 60908018), the National High Technology Research and Development Program of China (No. 2013AA065502) and the Anhui Province Outstanding Youth Science Fund of China (No. 1108085J19).
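The LOD-from-linear-fit step described above is commonly implemented as the 3σ/slope rule; a minimal sketch, with illustrative calibration numbers that are not from the paper:

```python
import numpy as np

def lod_3sigma(concentrations, intensities, background_sd):
    """Estimate limit of detection as 3 * sigma_background / slope,
    where the slope comes from a linear fit of intensity vs concentration."""
    slope, intercept = np.polyfit(concentrations, intensities, 1)
    return 3.0 * background_sd / slope

# Illustrative numbers (not from the paper): a perfectly linear Cu calibration
conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])       # mg/L
intens = np.array([5.0, 105.0, 205.0, 405.0, 805.0])  # counts, slope = 10
print(lod_3sigma(conc, intens, background_sd=10.0))   # ≈ 3.0 mg/L
```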

  20. Experimental analysis of perching in the European starling (Sturnus vulgaris: Passeriformes; Passeres), and the automatic perching mechanism of birds.

    PubMed

    Galton, Peter M; Shepherd, Jeffrey D

    2012-04-01

    The avian automatic perching mechanism (APM) involves the automatic digital flexor mechanism (ADFM) and the digital tendon-locking mechanism (DTLM). When birds squat on a perch to sleep, the increased tendon travel distance due to flexion of the knee and ankle supposedly causes the toes to grip the perch (ADFM) and engage the DTLM so perching while sleeping involves no muscular effort. However, the knees and ankles of sleeping European starlings (Sturnus vulgaris) are only slightly flexed and, except for occasional balancing adjustments, the distal two-thirds of the toes are not flexed to grip a 6-mm-diameter perch. The cranial ankle angle (CAA) is ∼120° and the foot forms an inverted "U" that, with the mostly unflexed toes, provides a saddle-like structure so the bird balances its weight over the central pad of the foot (during day weight further back and digits actively grasp perch). In the region of the pad, the tendon sheath of many birds is unribbed, or only very slightly so, and it is always separated from the tendon of the M. flexor digitorum longus by tendons of the other toe flexor muscles. Passive leg flexion produces no toe flexion in anesthetized Starlings and only after 15-20 min, at the onset of rigor mortis, in freshly sacrificed Starlings. Anesthetized Starlings could not remain perched upon becoming unconscious (ADFM, DTLM intact). Birds whose digital flexor tendons were severed or the locking mechanism eliminated surgically (no ADFM or DTLM), so without ability to flex their toes, slept on the perch in a manner similar to unoperated Starlings (except CAA ∼90°-110°). Consequently, there is no APM or ADFM and the DTLM, although involved in lots of other activities, only acts in perching with active contraction of the digital flexor muscles. PMID:22539208

  1. Application of a method for the automatic detection and Ground-Based Velocity Track Display (GBVTD) analysis of a tornado crossing the Hong Kong International Airport

    NASA Astrophysics Data System (ADS)

    Chan, P. W.; Wurman, J.; Shun, C. M.; Robinson, P.; Kosiba, K.

    2012-03-01

A weak tornado with a maximum Doppler velocity shear of about 40 m s⁻¹ moved across the Hong Kong International Airport (HKIA) during the evening of 20 May 2002. The tornado caused damage equivalent to F0 on the Fujita Scale, based on a damage survey. The Doppler velocity data from the Hong Kong Terminal Doppler Weather Radar (TDWR) are studied using the Ground-Based Velocity Track Display (GBVTD) method of single-Doppler analysis. The GBVTD analysis is able to clearly depict the development and decay of the tornado, though it appears to underestimate its magnitude. In the pre-tornadic state, the wind field is characterized by inflow toward the center near the ground and upward motion near the center. When the tornado attains its maximum strength, an eye-like structure with a downdraft appears to form in the center. Several minutes later the tornado begins to decay and outflow dominates at low levels. Assuming cyclostrophic balance, the pressure drop 200 m from the center of the tornado at its maximum strength is calculated to be about 6 hPa. To estimate the maximum ground-relative wind speed of the tornado, the TDWR's Doppler velocities are adjusted for the ratio of the sample-volume size of the radar and the radius of the tornado, resulting in a peak wind speed of 28 m s⁻¹, consistent with the readings from nearby ground-based anemometers and the F0 damage observed. An automatic tornado detection algorithm based on Doppler velocity difference (delta-V) and temporal and spatial continuity is applied to this event. The locations and core flow radii of the tornado as determined by the automatic method and by subjective analysis agree closely.
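The cyclostrophic pressure-drop estimate can be sketched numerically. Assuming a Rankine vortex (the paper's exact vortex model is not stated, and the 50 m core radius below is an assumption) and integrating dp/dr = ρv²/r inward:

```python
import numpy as np

RHO = 1.2  # assumed air density, kg/m^3

def rankine_v(r, v_max, r_core):
    """Tangential wind of an assumed Rankine vortex."""
    return np.where(r <= r_core, v_max * r / r_core, v_max * r_core / r)

def cyclostrophic_drop(r_outer, v_max, r_core, n=200000):
    """Pressure drop (Pa) from r_outer to the vortex centre, obtained by
    integrating cyclostrophic balance dp/dr = rho * v(r)**2 / r."""
    r = np.linspace(1e-3, r_outer, n)
    integrand = RHO * rankine_v(r, v_max, r_core) ** 2 / r
    # trapezoidal rule
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r)))

# Abstract's adjusted peak wind of 28 m/s with an assumed 50 m core radius
print(cyclostrophic_drop(200.0, 28.0, 50.0) / 100.0, "hPa")
```

With these assumed parameters the drop out to 200 m is roughly 9 hPa, the same order as the abstract's ~6 hPa estimate; the exact value depends on the assumed core radius and wind profile.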

  2. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

Recently, GPS has become more and more applicable to slope safety monitoring in open-pit mines. The automatic monitoring system for the high-steep slope at the Daye Iron Mine open pit consists of three modules: a GPS data processing module, a monitoring and warning module, and an emergency plans module. Based on the rock mass structure and the slope stability evaluation, seven GPS deformation monitoring points were arranged along the scarp of Fault F9 at the Daye Iron Mine, and observations were carried out with a combination of single-frequency static GPS receivers and data-transmission radio. The data processing mainly uses a three-transect interpolation method to address problems of discontinuity and validity in the data series. Based on the displacement monitoring data from 1990 to 1996 for Landslide A2 on Shizi Mountain in the Daye Iron Mine East Open Pit, the displacement, rate, acceleration, and creep-curve tangent-angle criteria of landslide failure were investigated. The results show that Landslide A2 is a collapse-type rock landslide whose movement comprises three phases: a creep stage, an accelerated stage, and a destruction stage. The failure criteria differ between stages and between positions at the rear, central, and front margins of the landslide. These results provide important guidance for establishing a comprehensive failure criterion for the seven newly installed monitoring points, combining slope deformation behaviour with macroscopic evidence.

  3. Automatic Analysis and Classification of the Roof Surfaces for the Installation of Solar Panels Using a Multi-Data Source and Multi-Sensor Aerial Platform

    NASA Astrophysics Data System (ADS)

    López, L.; Lagüela, S.; Picon, I.; González-Aguilera, D.

    2015-02-01

A low-cost multi-sensor aerial platform, an aerial trike equipped with visible and thermographic sensors, is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbour solar panels. The geometry of a georeferenced 3D point cloud, generated from visible images using photogrammetric and computer vision algorithms, and the temperatures measured on thermographic images are decisive in evaluating the surfaces, slopes, orientations and the existence of obstacles. In this way, large areas can be analysed efficiently, yielding as the final result the optimal locations for the placement of solar panels, as well as the required geometry of the supports for installing the panels on those roofs whose geometry is not optimal.

  4. Automatic method of analysis of OCT images in the assessment of the tooth enamel surface after orthodontic treatment with fixed braces

    PubMed Central

    2014-01-01

Introduction Fixed orthodontic appliances, despite years of research and development, still raise a lot of controversy because of their potentially destructive influence on enamel. Therefore, it is necessary to quantitatively assess the condition, and in particular the thickness, of tooth enamel in order to select the appropriate orthodontic bonding and debonding methodology, as well as to assess the quality of enamel after treatment and the clean-up procedure, in order to choose the most advantageous course of treatment. One of the assessment methods is optical coherence tomography (OCT), where the measurement of enamel thickness and the 3D reconstruction of image sequences can be performed fully automatically. Material and method OCT images of 180 teeth were obtained from the Topcon 3D OCT-2000 camera. The images were obtained in vitro by sequentially performing 7 stages of treatment on all the teeth: before any intervention in the enamel, polishing with orthodontic paste, etching and application of a bonding system, orthodontic bracket bonding, orthodontic bracket removal, and cleaning off adhesive residue. A dedicated method for the analysis and processing of images, involving median filtering, mathematical morphology, binarization, polynomial approximation and the active contour method, has been proposed. Results The obtained results enable automatic measurement of tooth enamel thickness in 5 seconds on a Core i5 CPU M460 @ 2.5 GHz with 4 GB RAM. For one patient, the proposed method of analysis confirms an enamel thickness loss of 80 μm (from 730 ± 165 μm to 650 ± 129 μm) after polishing with paste, an enamel thickness loss of 435 μm (from 730 ± 165 μm to 295 ± 55 μm) after etching and bonding resin application, and growth of a layer with a thickness of 265 μm (from 295 ± 55 μm to 560 ± 98 μm after etching), which is the adhesive system. After removing an orthodontic bracket, the adhesive residue was 105 μm, and after cleaning it off, the enamel thickness was
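The measurement idea (filter the B-scan, binarise, measure the bright enamel band) can be sketched on a synthetic image; the mathematical morphology, polynomial approximation and active contour steps of the actual method are omitted, and the pixel scale is an assumption:

```python
import numpy as np
from scipy.ndimage import median_filter

def enamel_thickness_um(bscan, pixel_um, threshold=None):
    """Rough sketch of the abstract's pipeline: median filtering,
    binarisation, then per-column thickness of the bright layer."""
    smoothed = median_filter(bscan, size=3)
    if threshold is None:
        threshold = 0.5 * (smoothed.min() + smoothed.max())
    mask = smoothed > threshold
    col_px = mask.sum(axis=0)              # bright pixels per A-scan column
    return float(np.mean(col_px) * pixel_um)

# Synthetic B-scan: a uniform bright band 20 px thick on a dark background
img = np.zeros((100, 60))
img[30:50, :] = 1.0
print(enamel_thickness_um(img, pixel_um=10.0))  # 20 px * 10 um = 200.0
```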

  5. An anatomy of automatism.

    PubMed

    Mackay, R D

    2015-07-01

    The automatism defence has been described as a quagmire of law and as presenting an intractable problem. Why is this so? This paper will analyse and explore the current legal position on automatism. In so doing, it will identify the problems which the case law has created, including the distinction between sane and insane automatism and the status of the 'external factor doctrine', and comment briefly on recent reform proposals. PMID:26378105

  6. Automatic identification of fault surfaces through Object Based Image Analysis of a Digital Elevation Model in the submarine area of the North Aegean Basin

    NASA Astrophysics Data System (ADS)

    Argyropoulou, Evangelia

    2015-04-01

The current study focused on the seafloor morphology of the North Aegean Basin in Greece, using Object Based Image Analysis (OBIA) of a Digital Elevation Model. The goal was the automatic extraction of morphologic and morphotectonic features, resulting in fault surface extraction. An Object Based Image Analysis approach was developed based on the bathymetric data, and the extracted features, derived from morphological criteria, were compared with the corresponding landforms obtained through tectonic analysis. A digital elevation model of 150 meters spatial resolution was used. First, slope, profile curvature, and percentile were extracted from this bathymetry grid. The OBIA approach was developed within the eCognition environment. Four segmentation levels were created, with "level 4" as the target. At level 4, the final classes of geomorphological features were classified: discontinuities, fault-like features and fault surfaces. On previous levels, additional landforms were also classified, such as the continental platform and continental slope. The results of the developed approach were evaluated by two methods. First, classification stability measures were computed within eCognition. Then, a qualitative and quantitative comparison of the results was made against a reference tectonic map that had been created manually based on the analysis of seismic profiles. The results of this comparison were satisfactory, which confirms the validity of the developed OBIA approach.

  7. Automatic crack propagation tracking

    NASA Technical Reports Server (NTRS)

    Shephard, M. S.; Weidner, T. J.; Yehia, N. A. B.; Burd, G. S.

    1985-01-01

    A finite element based approach to fully automatic crack propagation tracking is presented. The procedure presented combines fully automatic mesh generation with linear fracture mechanics techniques in a geometrically based finite element code capable of automatically tracking cracks in two-dimensional domains. The automatic mesh generator employs the modified-quadtree technique. Crack propagation increment and direction are predicted using a modified maximum dilatational strain energy density criterion employing the numerical results obtained by meshes of quadratic displacement and singular crack tip finite elements. Example problems are included to demonstrate the procedure.

  8. Automatic brain tumor segmentation

    NASA Astrophysics Data System (ADS)

    Clark, Matthew C.; Hall, Lawrence O.; Goldgof, Dmitry B.; Velthuizen, Robert P.; Murtaugh, F. R.; Silbiger, Martin L.

    1998-06-01

A system that automatically segments and labels complete glioblastoma multiforme tumor volumes in magnetic resonance images of the human brain is presented. The magnetic resonance images consist of three feature images (T1-weighted, proton density, T2-weighted) and are processed by a system which integrates knowledge-based techniques with multispectral analysis and is independent of a particular magnetic resonance scanning protocol. Initial segmentation is performed by an unsupervised clustering algorithm. The segmented image, along with the cluster centers for each class, is provided to a rule-based expert system which extracts the intra-cranial region. Multispectral histogram analysis separates suspected tumor from the rest of the intra-cranial region, with region analysis used in performing the final tumor labeling. This system has been trained on eleven volume data sets and tested on twenty-two unseen volume data sets acquired from a single magnetic resonance imaging system. The knowledge-based tumor segmentation was compared with radiologist-verified 'ground truth' tumor volumes and results generated by a supervised fuzzy clustering algorithm. The results of this system generally correspond well to ground truth, both on a per-slice basis and, more importantly, in tracking total tumor volume during treatment over time.
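The initial unsupervised clustering stage can be illustrated with a tiny k-means sketch on synthetic three-feature (T1, PD, T2) voxels; the knowledge-based rules and expert-system labelling that follow it in the paper are not reproduced here, and the seeding scheme is an assumption for determinism:

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal k-means for illustration; deterministic seeding by
    picking k evenly spaced points from the input."""
    centers = points[np.linspace(0, len(points) - 1, k).astype(int)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                    # nearest-center assignment
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Two well-separated synthetic "tissue" clusters in 3-feature space
rng = np.random.default_rng(1)
tissue_a = rng.normal(0.0, 0.01, size=(50, 3))
tissue_b = rng.normal(10.0, 0.01, size=(50, 3))
labels, _ = kmeans(np.vstack([tissue_a, tissue_b]), k=2)
print(labels[0], labels[-1])  # the two tissues land in different clusters: 0 1
```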

  9. Automatic inspection of density in yarn-dyed fabrics by utilizing fabric light transmittance and Fourier analysis.

    PubMed

    Zhang, Jie; Pan, Ruru; Gao, Weidong

    2015-02-01

    Yarn density measurement is a significant part of yarn-dyed fabric analysis, traditionally based on reflective image analysis. In this paper, utilizing fabric light transmittance, a method for two-dimensional discrete Fourier transform (2D DFT) analysis on the transmission fabric image is developed for fabric density inspection. First, the power spectrum is generated from the fabric image by a 2D DFT. Next, the yarn skew angles are detected based on the power spectrum analysis. Then the fabric image is reconstructed by an inverse 2D DFT. Finally, projection curves are generated from the reconstructed images and the number of yarns is counted according to the peaks and valleys to obtain the fabric density. Through a comparison between analysis on the reflective and transmission images of multiple-color fabrics, it is proved that the latter method can segment the yarns with more satisfactory accuracy. Furthermore, the experimental and theoretical analyses demonstrate that the proposed method is effective for the density inspection of yarn-dyed fabrics with good robustness and great accuracy. PMID:25967813
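In one dimension, the frequency-domain counting idea reduces to locating the dominant non-DC peak of the brightness profile's DFT; a sketch on a synthetic transmission profile (skew detection and the 2D reconstruction of the paper's method are omitted):

```python
import numpy as np

def yarn_count_fft(profile):
    """Number of periodic yarns in a 1D brightness profile, taken as the
    dominant non-DC peak of its DFT magnitude."""
    spec = np.abs(np.fft.rfft(profile - profile.mean()))
    return int(np.argmax(spec[1:]) + 1)   # bin index = cycles per profile length

# Synthetic transmission profile: 24 yarns across the image width
x = np.arange(512)
profile = 1.0 + 0.5 * np.cos(2 * np.pi * 24 * x / 512)
print(yarn_count_fft(profile))  # 24
```

Dividing the count by the physical width of the imaged region would then give the density (yarns per cm).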

  10. Application of an automatic thermal desorption-gas chromatography-mass spectrometry system for the analysis of polycyclic aromatic hydrocarbons in airborne particulate matter.

    PubMed

    Gil-Moltó, J; Varea, M; Galindo, N; Crespo, J

    2009-02-27

    The application of the thermal desorption (TD) method coupled with gas chromatography-mass spectrometry (GC-MS) to the analysis of aerosol organics has been the focus of many studies in recent years. This technique overcomes the main drawbacks of the solvent extraction approach such as the use of large amounts of toxic organic solvents and long and laborious extraction processes. In this work, the application of an automatic TD-GC-MS instrument for the determination of particle-bound polycyclic aromatic hydrocarbons (PAHs) is evaluated. This device offers the advantage of allowing the analysis of either gaseous or particulate organics without any modification. Once the thermal desorption conditions for PAH extraction were optimised, the method was verified on NIST standard reference material (SRM) 1649a urban dust, showing good linearity, reproducibility and accuracy for all target PAHs. The method has been applied to PM10 and PM2.5 samples collected on quartz fibre filters with low volume samplers, demonstrating its capability to quantify PAHs when only a small amount of sample is available. PMID:19150718

  11. On the implementation of automatic differentiation tools.

    SciTech Connect

    Bischof, C. H.; Hovland, P. D.; Norris, B.; Mathematics and Computer Science; Aachen Univ. of Technology

    2008-01-01

    Automatic differentiation is a semantic transformation that applies the rules of differential calculus to source code. It thus transforms a computer program that computes a mathematical function into a program that computes the function and its derivatives. Derivatives play an important role in a wide variety of scientific computing applications, including numerical optimization, solution of nonlinear equations, sensitivity analysis, and nonlinear inverse problems. We describe the forward and reverse modes of automatic differentiation and provide a survey of implementation strategies. We describe some of the challenges in the implementation of automatic differentiation tools, with a focus on tools based on source transformation. We conclude with an overview of current research and future opportunities.
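Forward mode, the simpler of the two modes described, can be illustrated with dual numbers that propagate a value and its derivative together. This is a toy operator-overloading sketch, not one of the source-transformation tools the paper surveys:

```python
import math

class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def sin(x):
    # chain rule: d/dx sin(u) = cos(u) * du/dx
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# d/dx [x*sin(x) + 3x] at x = 2
x = Dual(2.0, 1.0)                 # seed derivative dx/dx = 1
y = x * sin(x) + 3 * x
print(y.dot)  # sin(2) + 2*cos(2) + 3 ≈ 3.077
```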

  12. Automatic image analysis and spot classification for detection of fruit fly infestation in hyperspectral images of mangoes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An algorithm has been developed to identify spots generated in hyperspectral images of mangoes infested with fruit fly larvae. The algorithm incorporates background removal, application of a Gaussian blur, thresholding, and particle count analysis to identify locations of infestations. Each of the f...
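The blur-threshold-count pipeline can be sketched with SciPy on a synthetic single-band image; the background removal and hyperspectral band handling are omitted, and all parameter values are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

def count_spots(image, sigma=1.0, threshold=0.5, min_pixels=3):
    """Sketch of the pipeline: Gaussian blur, thresholding, then particle
    (connected-component) counting, discarding tiny noise particles."""
    blurred = gaussian_filter(image, sigma)
    labeled, _ = label(blurred > threshold)
    sizes = np.bincount(labeled.ravel())[1:]     # skip background label 0
    return int(np.sum(sizes >= min_pixels))

# Synthetic image with two bright 5x5 "infestation" spots
img = np.zeros((64, 64))
img[10:15, 10:15] = 1.0
img[40:45, 30:35] = 1.0
print(count_spots(img))  # 2
```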

  13. Treatment of Automatically Reinforced Object Mouthing with Noncontingent Reinforcement and Response Blocking: Experimental Analysis and Social Validation.

    ERIC Educational Resources Information Center

    Carr, James E.; Dozier, Claudia L.; Patel, Meeta R.; Adams, Amanda Nicolson; Martin, Nichelle

    2002-01-01

    A brief functional analysis indicated that the object mouthing of a young girl diagnosed with autism was maintained independent of social consequences. Separate and combined effects of response blocking and non-contingent reinforcement were then evaluated as treatments. Although both interventions were unsuccessful when implemented separately,…

  14. Automatic Whistler Detector and Analyzer system: Automatic Whistler Detector

    NASA Astrophysics Data System (ADS)

    Lichtenberger, J.; Ferencz, C.; BodnáR, L.; Hamar, D.; Steinbach, P.

    2008-12-01

A new, unique system has been developed for the automatic detection and analysis of whistlers. The Automatic Whistler Detector and Analyzer (AWDA) system has two purposes: (1) to automatically provide plasmaspheric electron densities extracted from whistlers and (2) to collect statistical data for the investigation of whistler generation and propagation. This paper presents the details of and the first results obtained by the automatic detector segment. The detector algorithm is based on image correlation where the target image is a preprocessed spectrogram of raw VLF signals and the pattern is a model whistler. The first AWDA system has been working in Tihany, Hungary (L = 1.8), and has collected 100,000 whistler traces per year. The overall detection efficiency using a parameter set optimized for purpose 2 is 90% for misdetection and 50-80% for false detection. The statistical analysis over the period February 2002 to February 2008, including 600,000 whistler traces, shows high diurnal variations; whistlers were mainly, but not only, detected when both the source and receiver regions were unlit. The seasonal occurrence is high during austral summer and low during austral winter. Comparison with Tarcsai et al.'s (1988) statistical study on Tihany whistlers shows differences in both diurnal and seasonal variations, but the latter study was made on only 1388 manually identified whistlers. The L value distributions of both data sets are similar. A global network of AWDA systems (AWDAnet) has been set up to overcome the time and space limitations of a single station; the network consists of 13 nodes, and another 6 are envisaged for the near future.
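The core detection step (correlating a model pattern against a spectrogram) can be sketched as a normalised cross-correlation slid along the time axis of a synthetic frequency-time array; the AWDA system's preprocessing and model-whistler construction are far more elaborate than this, and all parameters here are assumptions:

```python
import numpy as np

def detect_events(spectrogram, template, thresh=0.8):
    """Slide a model pattern along the time axis of a (freq x time)
    spectrogram and report offsets where the normalised cross-correlation
    exceeds a threshold."""
    f, tw = template.shape
    tnorm = (template - template.mean()) / template.std()
    hits = []
    for t0 in range(spectrogram.shape[1] - tw + 1):
        win = spectrogram[:, t0:t0 + tw]
        w = (win - win.mean()) / (win.std() + 1e-12)
        score = np.mean(w * tnorm)          # normalised cross-correlation
        if score > thresh:
            hits.append(t0)
    return hits

# Synthetic spectrogram with a falling-tone pattern inserted at t = 37
rng = np.random.default_rng(0)
spec = 0.1 * rng.standard_normal((32, 128))
pattern = np.zeros((32, 16))
pattern[np.arange(16), np.arange(16)] = 1.0   # simple diagonal "tone"
spec[:, 37:53] += 5.0 * pattern
print(detect_events(spec, pattern))  # [37]
```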

  15. Toward dynamic isotopomer analysis in the rat brain in vivo: automatic quantitation of 13C NMR spectra using LCModel.

    PubMed

    Henry, Pierre-Gilles; Oz, Gülin; Provencher, Stephen; Gruetter, Rolf

    2003-01-01

The LCModel method was adapted to analyze localized in vivo (13)C NMR spectra obtained from the rat brain in vivo at 9.4 T. Prior knowledge of chemical shifts, J-coupling constants and J-evolution was included in the analysis. Up to 50 different isotopomer signals corresponding to 10 metabolites were quantified simultaneously in 400 μl volumes in the rat brain in vivo during infusion of [1,6-(13)C(2)]glucose. The analysis remained accurate even at a low signal-to-noise ratio of the order of 3:1. The relative distribution of isotopomers in glutamate, glutamine and aspartate determined in vivo in 22 min was in excellent agreement with that measured in brain extracts. Quantitation of time series of (13)C spectra yielded time courses of total (13)C label incorporation into up to 16 carbon positions, as well as time courses of individual isotopomer signals, with a temporal resolution as low as 5 min (dynamic isotopomer analysis). The possibility of measuring in vivo a wealth of information that was hitherto accessible only in extracts is likely to expand the scope of metabolic studies in the intact brain. PMID:14679502

  16. Automatic registration of satellite imagery

    NASA Technical Reports Server (NTRS)

    Fonseca, Leila M. G.; Costa, Max H. M.; Manjunath, B. S.; Kenney, C.

    1997-01-01

Image registration is one of the basic image processing operations in remote sensing. With the increase in the number of images collected every day from different sensors, the automated registration of multi-sensor/multi-spectral images has become an important issue. A wide range of registration techniques has been developed for many different types of applications and data. The objective of this paper is to present an automatic registration algorithm which uses a multiresolution analysis procedure based upon the wavelet transform. The procedure is completely automatic and relies on the grey-level information content of the images and their local wavelet transform modulus maxima. The registration algorithm is very simple and easy to apply because it essentially requires only one parameter. We have obtained very encouraging results on test data sets from TM and SPOT sensor images of forest, urban and agricultural areas.

  17. Selection of shape parameters that differentiate sand grains, based on the automatic analysis of two-dimensional images

    NASA Astrophysics Data System (ADS)

    Sochan, Agata; Zieliński, Paweł; Bieganowski, Andrzej

    2015-08-01

    A grain shape analysis of sandy deposits has implications for determining the processes that affect grain shape. So far, most methods of carrying out a grain shape analysis are based on the subjective estimation of the researcher. The purpose of this study is to indicate the shape parameter/parameters that best differentiate sand grains, and to compare the results with those that have been obtained by the Krumbein method. We determined the shape parameters of sand grains (size range from 0.71 mm to 1 mm) using photos of two-dimensional images of particle projections onto a plane. The photos were taken under an optical microscope and were then subjected to an image analysis. We selected six shape parameters that best differentiate the studied sand grains, based on the criteria of: i) the monotonicity of parameter value, which changed depending on the categorization of the grains to the successive Krumbein roundness classes, and ii) the statistical significance of differences between the optical parameter values in the individual Krumbein classes. We selected three circularity parameters (θ1, θ4 and θ5) and three surface structure parameters (κ3, κ4 and κ6). None of these shape parameters allowed the direct categorization of a particle into a particular Krumbein roundness class. Nevertheless, despite the fact that an unambiguous characterization of deposits is not possible, this method can be helpful in identifying the origin of deposits. Moreover, it is possible to develop more advanced methods (e.g., based on artificial intelligence tools), which would allow an unambiguous categorization based on the determined shape parameters.
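The paper's θ and κ parameters are not defined in the abstract; as a generic illustration of a circularity-type shape parameter computed from a 2D projection, the classic 4πA/P² can be used (this is shown for illustration only and is not one of the paper's parameters):

```python
import math

def circularity(area, perimeter):
    """Classic 2D circularity 4*pi*A / P**2: 1.0 for a disc,
    smaller for elongated or angular grains."""
    return 4.0 * math.pi * area / perimeter ** 2

r = 1.0
print(circularity(math.pi * r**2, 2 * math.pi * r))  # 1.0 for a circle
s = 1.0
print(circularity(s * s, 4 * s))                     # pi/4 ≈ 0.785 for a square
```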

  18. Evaluating the reforested area for the municipality of Buri by automatic analysis of LANDSAT imagery. [Sao Paulo, Brazil

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Lee, D. C. L.; Filho, R. H.; Shimabukuro, Y. E.

    1979-01-01

The author has identified the following significant results. The reforestation class (Pinus, Eucalyptus, Araucaria) was defined using interactive image analysis (I-100) and LANDSAT MSS data. Estimates of class area from the I-100 were compared with data supplied by the forestry institute in Sao Paulo. LANDSAT channels 4 and 5 served to differentiate the Pinus, Eucalyptus, and Araucaria from the other trees. Channels 6 and 7 gave the best results for differentiating between the classes. A good representative spectral response was obtained for Araucaria on these two channels. The small relative differences obtained were +4.24% for Araucaria, -7.51% for Pinus, and -32.07% for Eucalyptus.

  19. The feasibility of a regional CTDI{sub vol} to estimate organ dose from tube current modulated CT exams

    SciTech Connect

    Khatonabadi, Maryam; Kim, Hyun J.; Lu, Peiyun; McMillan, Kyle L.; Cagnon, Chris H.; McNitt-Gray, Michael F.; DeMarco, John J.

    2013-05-15

    dose to correlate with patient size was investigated. Results: For all five organs, the correlations with patient size increased when organ doses were normalized by regional and organ-specific CTDI{sub vol} values. For example, when estimating dose to the liver, CTDI{sub vol,global} yielded a R{sup 2} value of 0.26, which improved to 0.77 and 0.86, when using the regional and organ-specific CTDI{sub vol} for abdomen and liver, respectively. For breast dose, the global CTDI{sub vol} yielded a R{sup 2} value of 0.08, which improved to 0.58 and 0.83, when using the regional and organ-specific CTDI{sub vol} for chest and breasts, respectively. The R{sup 2} values also increased once the thoracic models were separated for the analysis into females and males, indicating differences between genders in this region not explained by a simple measure of effective diameter. Conclusions: This work demonstrated the utility of regional and organ-specific CTDI{sub vol} as normalization factors when using TCM. It was demonstrated that CTDI{sub vol,global} is not an effective normalization factor in TCM exams where attenuation (and therefore tube current) varies considerably throughout the scan, such as abdomen/pelvis and even thorax. These exams can be more accurately assessed for dose using regional CTDI{sub vol} descriptors that account for local variations in scanner output present when TCM is employed.
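
    The reported R² improvements come from normalizing organ dose by a dose descriptor before regressing against patient size. A minimal sketch with hypothetical synthetic data (the dose model and variable names are assumptions for illustration, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
diameter = rng.uniform(20, 40, n)            # effective diameter (cm)
# Hypothetical TCM behaviour: regional scanner output rises with size,
# with scan-to-scan variability that a single global CTDIvol cannot capture.
ctdi_regional = 5 + 0.4 * diameter + rng.normal(0, 2, n)
organ_dose = ctdi_regional * np.exp(-0.05 * (diameter - 30))

def r_squared(x, y):
    """R^2 of an ordinary least-squares line fit."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - resid.var() / y.var()

r2_raw = r_squared(diameter, organ_dose)
r2_norm = r_squared(diameter, organ_dose / ctdi_regional)
print(r2_raw, r2_norm)  # normalization tightens the size-dose relationship
```

    Dividing out the regional CTDIvol removes the scan-specific output variability, which is why the normalized dose correlates much better with patient size.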

  20. [1st experience on the use of an automatic ECG analysis in a large industrial plant in Erfurt].

    PubMed

    Giegler, I; Grossmann, K; Knorre, M; Reissmann, H C; Rübesam, M

    1977-10-01

    Using a device system assembled from SW-production components and the plant's own developments, serial ECG examinations were performed for the first time with automatic ECG analysis at the factory Kombinat VEB Umformtechnik (a nationally owned metal-forming combine). The corrected orthogonal Frank system with three leads served as the lead system. ECG registration was carried out independently by function nurses. The subjects were 1,720 male and female workers of the factory, aged 21 to 59 years. ECG registration lasted 20 s; the whole examination, including changing clothes and the walk from the workplace to the examination room, took no more than 4 to 8 min. The diagnostic program used was the one developed by Pipberger. Automatic analysis classified 74.4% of the electrocardiographic traces as normal. Among the pathological or abnormal ECGs (25.6%), vegetative-functional heart disorders predominated at 92%, followed by chronic ischaemic heart disease at 7.9% and hypertension at 5.1%. Newly established diseases of the heart and circulation accounted for 8.9%; of these, 5% required follow-up and 3.9% required therapy. PMID:595717

  1. An environmental friendly method for the automatic determination of hypochlorite in commercial products using multisyringe flow injection analysis.

    PubMed

    Soto, N Ornelas; Horstkotte, B; March, J G; López de Alba, P L; López Martínez, L; Cerdá Martín, V

    2008-03-24

    A multisyringe flow injection analysis system was used for the determination of hypochlorite in cleaning agents by measurement of the native absorbance of hypochlorite at 292 nm. The methodology was based on the selective decomposition of hypochlorite by a cobalt oxide catalyst, giving chloride and oxygen. The difference between the absorbance of the sample before and after its passage through a cobalt oxide column was selected as the analytical signal. As no further reagent was required, this work can be considered a contribution to environmentally friendly analytical chemistry. The entire analytical procedure, including in-line sample dilution in three steps (first in a stirred miniature vessel, second by dispersion, and third by in-line addition of water), was automated using the multisyringe flow injection technique. The dynamic concentration range was 0.04-0.78 g L(-1) (relative standard deviation lower than 3%), and the extent of the hypochlorite decomposition was 90+/-4%. The proposed method was successfully applied to the analysis of commercial cleaning products. The accuracy of the method was established by iodometric titration. PMID:18328319

  2. Automatic amino acid analyzer

    NASA Technical Reports Server (NTRS)

    Berdahl, B. J.; Carle, G. C.; Oyama, V. I.

    1971-01-01

    Analyzer operates unattended or up to 15 hours. It has an automatic sample injection system and can be programmed. All fluid-flow valve switching is accomplished pneumatically from miniature three-way solenoid pilot valves.

  3. Automatic Payroll Deposit System.

    ERIC Educational Resources Information Center

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  4. A radar-based regional extreme rainfall analysis to derive the thresholds for a novel automatic alert system in Switzerland

    NASA Astrophysics Data System (ADS)

    Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis

    2016-06-01

    This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here is aimed at issuing heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as, for example, small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds some predefined thresholds but also as soon as the sum of past and forecast precipitation is larger than threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as the predicted one, such as urban flood early-warning systems. The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of
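
    The abstract above fits monthly maxima to a full generalized extreme value distribution; as a simpler hedged sketch, the Gumbel special case (GEV with zero shape parameter) fitted by the method of moments shows how return levels are derived from block maxima:

```python
import numpy as np

EULER_GAMMA = 0.5772156649

def gumbel_return_level(block_maxima, return_period):
    """Method-of-moments Gumbel fit (the GEV special case with shape 0).

    `block_maxima` are e.g. monthly maxima of a regional rainfall statistic;
    `return_period` is expressed in the same blocks (months here). The paper
    fits a full GEV; Gumbel is used as a simpler illustrative stand-in.
    """
    x = np.asarray(block_maxima, dtype=float)
    beta = x.std(ddof=1) * np.sqrt(6) / np.pi          # scale estimate
    mu = x.mean() - EULER_GAMMA * beta                 # location estimate
    p = 1.0 - 1.0 / return_period                      # non-exceedance prob.
    return mu - beta * np.log(-np.log(p))              # Gumbel quantile

rng = np.random.default_rng(1)
maxima = rng.gumbel(loc=30.0, scale=8.0, size=120)  # 10 years of monthly maxima
print(gumbel_return_level(maxima, 12))  # ~1-year return level
```

    The resulting return levels are exactly the kind of thresholds the nowcasting system compares past-plus-forecast precipitation against.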

  5. Automatic flow analysis method to determine traces of Mn²⁺ in sea and drinking waters by a kinetic catalytic process using LWCC-spectrophotometric detection.

    PubMed

    Chaparro, Laura; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2016-02-01

    A new automatic kinetic catalytic method has been developed for the measurement of Mn(2+) in drinking and seawater samples. The method is based on the catalytic effect of Mn(2+) on the oxidation of tiron by hydrogen peroxide in the presence of Pb(2+) as an activator. The optimum conditions were obtained at pH 10 with 0.019 mol L(-1) 2,2'-bipyridyl, 0.005 mol L(-1) tiron and 0.38 mol L(-1) hydrogen peroxide. The flow system is based on multisyringe flow injection analysis (MSFIA) coupled with a lab-on-valve (LOV) device, exploiting on-line spectrophotometric detection at 445 nm with a Liquid Waveguide Capillary Cell (LWCC) of 1 m optical path length. Under the conditions optimized by a multivariate approach, the method allowed the measurement of Mn(2+) in a range of 0.03-35 µg L(-1) with a detection limit of 0.010 µg L(-1), attaining a repeatability of 1.4% RSD. The method was satisfactorily applied to the determination of Mn(2+) in environmental water samples. The reliability of the method was also verified by determining the manganese content of the certified standard reference seawater sample, CASS-4. PMID:26653487
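
    Figures of merit such as the 0.010 µg L(-1) detection limit quoted above are typically derived from a linear calibration. A hedged sketch with hypothetical standards (the concentrations, absorbances, and blank deviation below are invented for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical calibration standards for a kinetic catalytic method:
# Mn2+ concentration vs. measured absorbance at 445 nm.
conc = np.array([0.0, 5.0, 10.0, 20.0, 35.0])            # µg L-1
absorbance = np.array([0.012, 0.118, 0.221, 0.438, 0.751])

slope, intercept = np.polyfit(conc, absorbance, 1)  # calibration line
blank_sd = 0.0001          # assumed standard deviation of blank replicates
lod = 3 * blank_sd / slope  # classic 3-sigma detection limit
print(slope, lod)
```

    Unknown samples are then quantified as `(signal - intercept) / slope`, and repeatability (%RSD) is taken from replicate injections.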

  6. Determination of free and total sulfites in wine using an automatic flow injection analysis system with voltammetric detection.

    PubMed

    Goncalves, Luis Moreira; Grosso Pacheco, Joao; Jorge Magalhaes, Paulo; Antonio Rodrigues, Jose; Araujo Barros, Aquiles

    2010-02-01

    An automated flow injection analysis (FIA) system, based on an initial analyte separation by gas-diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell, was developed for the determination of total and free sulfur dioxide (SO(2)) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and a simplified method commonly used by the wine industry). The developed method displayed good repeatability (RSD lower than 6%) and linearity (between 10 and 250 mg l(-1)) as well as a suitable LOD (3 mg l(-1)) and LOQ (9 mg l(-1)). A major advantage of this system is that SO(2) is directly detected by flow SWV. PMID:20013444

  7. An interdisciplinary analysis of multispectral satellite data for selected cover types in the Colorado Mountains, using automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has reported the following significant results. A data set containing SKYLAB, LANDSAT, and topographic data has been overlayed, registered, and geometrically corrected to a scale of 1:24,000. After geometrically correcting both sets of data, the SKYLAB data were overlayed on the LANDSAT data. Digital topographic data were then obtained, reformatted, and a data channel containing elevation information was then digitally overlayed onto the LANDSAT and SKYLAB spectral data. The 14,039 square kilometers involving 2,113,776 LANDSAT pixels represents a relatively large data set available for digital analysis. The overlayed data set enables investigators to numerically analyze and compare two sources of spectral data and topographic data from any point in the scene. This capability is new and it will permit a numerical comparison of spectral response with elevation, slope, and aspect. Utilization of the spectral and topographic data together to obtain more accurate classifications of the various cover types present is feasible.

  8. Automatic switching matrix

    DOEpatents

    Schlecht, Martin F.; Kassakian, John G.; Caloggero, Anthony J.; Rhodes, Bruce; Otten, David; Rasmussen, Neil

    1982-01-01

    An automatic switching matrix that includes an apertured matrix board containing a matrix of wires that can be interconnected at each aperture. Each aperture has associated therewith a conductive pin which, when fully inserted into the associated aperture, effects electrical connection between the wires within that particular aperture. Means is provided for automatically inserting the pins in a determined pattern and for removing all the pins to permit other interconnecting patterns.

  9. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images

    PubMed Central

    2012-01-01

    Background Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy cannot. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time-consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. Method The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of textural measures are computed from the co-occurrence matrices of images synthesized by the inverse wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. Results The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study regarding the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice. PMID:22236465
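
    The second-order textural measures mentioned above are derived from grey-level co-occurrence matrices. A minimal sketch of that one step (the paper goes further, computing higher-order moments over wavelet-reconstructed color channels):

```python
import numpy as np

def glcm(img, levels):
    """Horizontal-neighbour grey-level co-occurrence matrix, normalized."""
    m = np.zeros((levels, levels))
    a, b = img[:, :-1].ravel(), img[:, 1:].ravel()
    np.add.at(m, (a, b), 1)           # count neighbouring grey-level pairs
    return m / m.sum()

def energy(p):
    """Uniformity of the co-occurrence distribution (1.0 = constant image)."""
    return (p ** 2).sum()

def contrast(p):
    """Weighted sum of squared grey-level differences between neighbours."""
    i, j = np.indices(p.shape)
    return ((i - j) ** 2 * p).sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img, 4)
print(energy(p), contrast(p))
```

    Real implementations accumulate co-occurrences over several offsets and directions before computing the texture statistics.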

  10. Semi-automatic quantification of neurite fasciculation in high-density neurite images by the Neurite Directional Distribution Analysis (NDDA)

    PubMed Central

    Hopkins, Amy M; Wheeler, Brandon; Staii, Cristian; Kaplan, David L.; Atherton, Timothy J.

    2014-01-01

    Background Bundling of neurite extensions occurs during nerve development and regeneration. Understanding the factors that drive neurite bundling is important for designing biomaterials for nerve regeneration toward the innervation target and preventing nociceptive collateral sprouting. High-density neuron cultures, including dorsal root ganglia explants, are employed for in vitro screening of biomaterials designed to control directional outgrowth. Although some semiautomated image processing methods exist for quantification of neurite outgrowth, methods to quantify axonal fasciculation in terms of direction of neurite outgrowth are lacking. New Method This work presents a semi-automated program to analyze micrographs of high-density neurites; the program aims to quantify axonal fasciculation by determining the orientational distribution function of the tangent vectors of the neurites and calculating its Fourier series coefficients ('c' values). Results We found that neurite directional distribution analysis (NDDA) of fasciculated neurites yielded 'c' values of ≥ ~0.25, whereas branched outgrowth led to statistically significantly lower values of < ~0.2. The 'c' values correlated directly with the width of neurite bundles and inversely with the number of branching points. Comparison with Existing Methods Information about the directional distribution of outgrowth is lost in simple counting methods or achieved laboriously through manual analysis. The NDDA supplements previous quantitative analyses of axonal bundling using a vector-based approach that captures new information about the directionality of outgrowth. Conclusion The NDDA is a valuable addition to the open source image processing tools available to biomedical researchers, offering a robust, precise approach to quantification of imaged features important in tissue development, disease, and repair. PMID:24680908
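
    The idea of summarizing an orientational distribution by Fourier coefficients can be sketched with the first even circular moment of the tangent angles (a nematic-order-style coefficient). This is an illustrative analogue of the paper's 'c' values, not its exact definition:

```python
import numpy as np

def alignment_coefficient(angles_rad):
    """Magnitude of the first even Fourier coefficient of the tangent-angle
    distribution. Orientations are direction-free (theta and theta + pi are
    equivalent), hence the doubled angle. Near 1 for fasciculated (parallel)
    neurites, near 0 for isotropic branched outgrowth."""
    a = np.asarray(angles_rad, dtype=float)
    return abs(np.exp(2j * a).mean())

parallel = np.full(500, 0.3) + np.random.default_rng(2).normal(0, 0.05, 500)
isotropic = np.random.default_rng(3).uniform(0, np.pi, 500)
print(alignment_coefficient(parallel))   # close to 1
print(alignment_coefficient(isotropic))  # close to 0
```

    In the NDDA itself, tangent vectors are extracted from the neurite micrographs first; the coefficient then separates bundled from branched morphologies.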

  11. Histological analysis of tissue structures of the internal organs of steppe tortoises following their exposure to spaceflight conditions while circumnavigating the moon aboard the Zond-7 automatic station

    NASA Technical Reports Server (NTRS)

    Sutulov, L. S.; Sutulov, Y. L.; Trukhina, L. V.

    1975-01-01

    Tortoises flown around the Moon on the 6-1/2 day voyage of the Zond-7 automatic space station evidently did not suffer any pathological changes to their peripheral blood picture, heart, lungs, intestines, or liver.

  12. Flow injection analysis-based methodology for automatic on-line monitoring and quality control for biodiesel production.

    PubMed

    Pinzi, S; Priego Capote, F; Ruiz Jiménez, J; Dorado, M P; Luque de Castro, M D

    2009-01-01

    An automated on-line approach based on the determination of free and bound glycerol was proposed here to monitor biodiesel production. The method was based on liquid-liquid extraction of glycerol from the biodiesel into an aqueous ethanolic phase, in which glycerol is oxidized to formaldehyde with meta-periodate and subsequently reacted with acetylacetone. The reaction product was photometrically measured at 410 nm. Free and bound glycerol were differentiated by hydrolysis of the glycerides with potassium ethylate. The experimental set-up consisted of a flow-injection manifold for liquid-liquid extraction without phase separation and iterative change of the flow direction that enabled: (a) filling the flow manifold with a meta-periodate-acetylacetone acceptor phase; (b) sampling of small amounts (µl) from the reactor; (c) determination of free glycerol by extraction from biodiesel to the aqueous phase with simultaneous oxidation and reaction with acetylacetone in the acceptor phase; (d) continuous monitoring of the aqueous phase by passage through a photometric detector; (e) filling the flow manifold with a new potassium ethylate-meta-periodate-acetylacetone acceptor phase; (f) repetition of steps (b) to (d) to determine total glycerol after saponification of the bound glycerol by potassium ethylate; and (g) determination of bound glycerol as the difference between the second and first analyses. The results showed that the proposed automated on-line method is a suitable option for routine analysis during biodiesel production. PMID:18614358

  13. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

    This paper presents an automated method of abdominal lymph node detection to aid the preoperative diagnosis of abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might have a metastasis. This procedure is called lymphadenectomy or lymph node dissection. Insufficient lymphadenectomy carries a high risk for relapse. However, excessive resection decreases a patient's quality of life. Therefore, it is important to identify the location and the structure of lymph nodes to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the proposed method uses a classifier based on support vector machine with the texture and shape information. The experimental results reveal that it detects 70.5% of the lymph nodes with 13.0 false positives per case.
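
    The blob-like enhancement step described above rests on the eigenvalues of the image Hessian: at a bright blob both eigenvalues are strongly negative. A hedged 2-D sketch (the paper works in 3-D CT volumes and uses a specific multi-scale filter; this toy version uses plain numpy finite differences):

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing with plain numpy (no scipy)."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, out)

def blob_response(img, sigma):
    """Bright-blob response from Hessian eigenvalues: high only where both
    second derivatives are strongly negative (an intensity peak)."""
    s = gaussian_blur(img.astype(float), sigma)
    gy, gx = np.gradient(s)
    hyy, hyx = np.gradient(gy)
    hxy, hxx = np.gradient(gx)
    tr = hxx + hyy                              # eigenvalues of the 2x2 Hessian
    det = hxx * hyy - hxy * hxy
    disc = np.sqrt(np.maximum((tr / 2) ** 2 - det, 0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    return np.where((l1 < 0) & (l2 < 0), l1 * l2, 0.0)

# Synthetic image with one bright Gaussian blob centred at (20, 12)
yy, xx = np.mgrid[0:40, 0:40]
img = np.exp(-((yy - 20) ** 2 + (xx - 12) ** 2) / (2 * 3.0 ** 2))
resp = blob_response(img, 2.0)
peak = np.unravel_index(resp.argmax(), resp.shape)
print(peak)
```

    In the full method, such candidate responses are then pruned by an SVM classifier using texture and shape features to reduce false positives.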

  14. Automatic Analysis of Retinal Vascular Parameters for Detection of Diabetes in Indian Patients with No Retinopathy Sign.

    PubMed

    Aliahmad, Behzad; Kumar, Dinesh Kant; Jain, Rajeev

    2016-01-01

    This study investigated the association between retinal vascular parameters and type II diabetes in an Indian population with no observable diabetic retinopathy. It introduced two new retinal vascular parameters, total number of branching angles (TBA) and average acute branching angle (ABA), as potential biomarkers of diabetes in an explanatory model. A total of 180 retinal images (two (left and right) × two (ODC and MC) × 45 subjects (13 diabetics and 32 nondiabetics)) were analysed. Stepwise linear regression analysis was performed to model the association between type II diabetes and the best subset of explanatory variables (predictors), consisting of retinal vascular parameters and patients' demographic information. The P value of the estimated coefficients (P < 0.001) indicated that, at an α level of 0.05, the newly introduced retinal vascular parameters, that is, TBA and ABA, together with CRAE, mean tortuosity, SD of branching angle, and VB, are related to type II diabetes when there is no observable sign of retinopathy. PMID:27579347

  15. Automatic Analysis of Retinal Vascular Parameters for Detection of Diabetes in Indian Patients with No Retinopathy Sign

    PubMed Central

    Jain, Rajeev

    2016-01-01

    This study investigated the association between retinal vascular parameters and type II diabetes in an Indian population with no observable diabetic retinopathy. It introduced two new retinal vascular parameters, total number of branching angles (TBA) and average acute branching angle (ABA), as potential biomarkers of diabetes in an explanatory model. A total of 180 retinal images (two (left and right) × two (ODC and MC) × 45 subjects (13 diabetics and 32 nondiabetics)) were analysed. Stepwise linear regression analysis was performed to model the association between type II diabetes and the best subset of explanatory variables (predictors), consisting of retinal vascular parameters and patients' demographic information. The P value of the estimated coefficients (P < 0.001) indicated that, at an α level of 0.05, the newly introduced retinal vascular parameters, that is, TBA and ABA, together with CRAE, mean tortuosity, SD of branching angle, and VB, are related to type II diabetes when there is no observable sign of retinopathy. PMID:27579347

  16. CancerMA: a web-based tool for automatic meta-analysis of public cancer microarray data

    PubMed Central

    Feichtinger, Julia; McFarlane, Ramsay J.; Larcombe, Lee D.

    2012-01-01

    The identification of novel candidate markers is a key challenge in the development of cancer therapies. This can be facilitated by putting accessible and automated approaches analysing the current wealth of ‘omic’-scale data in the hands of researchers who are directly addressing biological questions. Data integration techniques and standardized, automated, high-throughput analyses are needed to manage the data available as well as to help narrow down the excessive number of target gene possibilities presented by modern databases and system-level resources. Here we present CancerMA, an online, integrated bioinformatic pipeline for automated identification of novel candidate cancer markers/targets; it operates by means of meta-analysing expression profiles of user-defined sets of biologically significant and related genes across a manually curated database of 80 publicly available cancer microarray datasets covering 13 cancer types. A simple-to-use web interface allows bioinformaticians and non-bioinformaticians alike to initiate new analyses as well as to view and retrieve the meta-analysis results. The functionality of CancerMA is shown by means of two validation datasets. Database URL: http://www.cancerma.org.uk PMID:23241162

  17. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    PubMed

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis is explored and discussed for possible applications. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipating heat by evaporation. Thus, the proportions of pigs in the dunging area and resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The learning-based approach applied is based on extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method is able to produce a probability per pixel rather than form a hard decision. This overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method improves, in comparison with greyscale methods, the possibility of reliably identifying the proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness. PMID:26189971
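
    The core of the segmentation is a logistic regression with an elastic net penalty that outputs a probability per pixel. A minimal stand-in with plain gradient descent and hypothetical two-feature pixel data (intensity plus a local texture score; the paper's structured-prediction solver and feature set are richer):

```python
import numpy as np

def train_elastic_net_logreg(X, y, alpha=0.01, l1_ratio=0.5, lr=0.1, epochs=500):
    """Logistic regression with an elastic-net penalty, plain gradient descent.

    Each row of X is a per-pixel feature vector, y is the pig/background
    label; the trained model yields a probability per pixel rather than a
    hard decision."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))        # per-pixel probability
        grad_w = X.T @ (p - y) / len(y)
        # elastic net: mix of L1 (sparsity) and L2 (shrinkage) gradients
        grad_w += alpha * (l1_ratio * np.sign(w) + (1 - l1_ratio) * w)
        w -= lr * grad_w
        b -= lr * (p - y).mean()
    return w, b

def predict_proba(X, w, b):
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

# Hypothetical pixel features: background vs. pig clusters
rng = np.random.default_rng(0)
bg = rng.normal([0.2, 0.3], 0.1, (300, 2))
pig = rng.normal([0.7, 0.8], 0.1, (300, 2))
X = np.vstack([bg, pig])
y = np.r_[np.zeros(300), np.ones(300)]
w, b = train_elastic_net_logreg(X, y)
acc = ((predict_proba(X, w, b) > 0.5) == y).mean()
print(acc)
```

    Keeping the per-pixel probabilities (rather than thresholding immediately) is what lets downstream steps weight uncertain pixels, as the abstract notes.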

  18. Automatic lexical classification: bridging research and practice.

    PubMed

    Korhonen, Anna

    2010-08-13

    Natural language processing (NLP)--the automatic analysis, understanding and generation of human language by computers--is vitally dependent on accurate knowledge about words. Because words change their behaviour between text types, domains and sub-languages, a fully accurate static lexical resource (e.g. a dictionary, word classification) is unattainable. Researchers are now developing techniques that could be used to automatically acquire or update lexical resources from textual data. If successful, the automatic approach could considerably enhance the accuracy and portability of language technologies, such as machine translation, text mining and summarization. This paper reviews the recent and on-going research in automatic lexical acquisition. Focusing on lexical classification, it discusses the many challenges that still need to be met before the approach can benefit NLP on a large scale. PMID:20603372

  19. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 2), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 2), Manchester, 1873 (PL XXIX top); illustration of full mill, as enlarged to south. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  20. 2. Historic American Buildings Survey Photocopy from Harpers, vol. 20 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Historic American Buildings Survey Photocopy from Harpers, vol. 20 1859 Courtesy of Library of Congress NORTH AND EAST FRONTS - United States General Post Office, Between Seventh, Eighth, E, & F Streets, Northwest, Washington, District of Columbia, DC

  1. Automatic analysis with thermometric detection.

    PubMed

    McLean, W R; Penketh, G E

    1968-11-01

    The construction of a cell and associated Wheatstone bridge detector circuitry is described for a thermometric detector suitable for attachment to a Technicon Autoanalyzer. The detector produces a d.c. mV signal linearly proportional to the concentration (0.005-0.1M) of the thermally reactive component in the sample stream when it is mixed in the cell with the reagent stream. The influence of various pertinent parameters such as ambient temperature, thermistor voltage, heats of reaction and sensitivity are discussed together with interference effects arising through chemistry, ionic strength effects and heat of dilution. PMID:18960422
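
    The detector above reads a thermistor through a Wheatstone bridge: a reaction-induced temperature change shifts the thermistor's resistance, unbalancing the bridge and producing a small voltage. A hedged sketch of the bridge equation (component values are illustrative, not from the paper):

```python
def bridge_output(v_supply, r1, r2, r3, r4):
    """Output of a Wheatstone bridge: the difference between the two
    voltage-divider taps. With a thermistor as one arm, a temperature
    change unbalances the bridge, giving a mV-scale signal that is
    near-linear for small resistance changes."""
    return v_supply * (r2 / (r1 + r2) - r4 / (r3 + r4))

print(bridge_output(1.0, 1000, 1000, 1000, 1000))  # balanced -> 0.0
# A 1% thermistor change produces a small, near-linear output:
print(bridge_output(1.0, 1000, 1010, 1000, 1000))
```

    The near-linearity of this response for small changes is what makes the d.c. mV signal proportional to the reactant concentration in the sample stream.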

  2. A new automatic synchronizer

    SciTech Connect

    Malm, C.F.

    1995-12-31

    A phase lock loop automatic synchronizer, PLLS, matches generator speed starting from dead stop to bus frequency, and then locks the phase difference at zero, thereby maintaining zero slip frequency while the generator breaker is being closed to the bus. The significant difference between the PLLS and a conventional automatic synchronizer is that there is no slip frequency difference between generator and bus. The PLL synchronizer is most advantageous when the penstock pressure fluctuates, the grid frequency fluctuates, or both. The PLL synchronizer is relatively inexpensive. Hydroplants with multiple units can economically be equipped with a synchronizer for each unit.
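
    The lock-at-zero-slip-and-zero-phase behaviour can be illustrated with a toy feedback loop; the gains and governor dynamics below are assumptions for illustration, not a real turbine model:

```python
import math

def synchronize(f_bus=60.0, f_gen=58.0, kp=2.0, kf=4.0, dt=0.001, t_end=10.0):
    """Toy phase-lock loop: the speed correction is driven by both the
    phase difference and the frequency (slip) difference, so the generator
    is pulled to zero slip AND zero phase error before the breaker closes."""
    phase = 0.5                                   # initial phase error (rad)
    t = 0.0
    while t < t_end:
        slip = f_gen - f_bus                      # slip frequency (Hz)
        phase += 2 * math.pi * slip * dt          # phase error integrates slip
        phase = (phase + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
        f_gen -= (kp * phase + kf * slip) * dt    # damped correction
        t += dt
    return f_gen, phase

f_final, phase_final = synchronize()
print(f_final, phase_final)  # converges to bus frequency with zero phase error
```

    This is the behaviour the abstract contrasts with a conventional synchronizer, which closes the breaker while a small slip frequency still exists.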

  3. Automatic Program Synthesis Reports.

    ERIC Educational Resources Information Center

    Biermann, A. W.; And Others

    Some of the major results and future goals of an automatic program synthesis project are described in the two papers that comprise this document. The first paper gives a detailed algorithm for synthesizing a computer program from a trace of its behavior. Since the algorithm involves a search, the length of time required to do the synthesis of…

  4. AUTOmatic Message PACKing Facility

    Energy Science and Technology Software Center (ESTSC)

    2004-07-01

    AUTOPACK is a library that provides several useful features for programs using the Message Passing Interface (MPI). Features included are: 1. automatic message packing facility; 2. management of send and receive requests; 3. management of message buffer memory; 4. determination of the number of anticipated messages from a set of arbitrary sends; and 5. deterministic message delivery for testing purposes.

  5. Automatic finite element generators

    NASA Technical Reports Server (NTRS)

    Wang, P. S.

    1984-01-01

    The design and implementation of a software system for generating finite elements and related computations are described. Exact symbolic computational techniques are employed to derive strain-displacement matrices and element stiffness matrices. Methods for dealing with the excessive growth of symbolic expressions are discussed. Automatic FORTRAN code generation is described with emphasis on improving the efficiency of the resultant code.

  6. Principles of Automatic Lemmatisation

    ERIC Educational Resources Information Center

    Hann, M. L.

    1974-01-01

    Introduces some algorithmic methods, for which no pre-editing is necessary, for automatically "lemmatising" raw text (changing raw text to an equivalent version in which all inflected words are artificially transformed to their dictionary look-up form). The results of a study of these methods, which used a German Text, are also given. (KM)
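
    A lemmatisation algorithm of this kind maps inflected forms to a dictionary look-up form by rule. A toy suffix-stripping sketch in the same spirit (the rules below are illustrative English ones, not the German rules studied in the paper):

```python
# Ordered suffix-stripping rules: (suffix, replacement). Longest/most
# specific rules must come before shorter ones ("ies" before "s").
RULES = [("ies", "y"), ("sses", "ss"), ("ing", ""), ("ed", ""), ("s", "")]

def lemmatise(word):
    """Return a dictionary look-up form by stripping the first matching
    suffix, provided a minimal stem of 3 characters remains."""
    for suffix, repl in RULES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)] + repl
    return word

print([lemmatise(w) for w in ["studies", "walked", "walking", "cats"]])
# -> ['study', 'walk', 'walk', 'cat']
```

    Real lemmatisers need far richer rule sets and exception lists, which is exactly why the paper studies the error behaviour of fully automatic, no-pre-editing methods.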

  7. Reactor component automatic grapple

    SciTech Connect

    Greenaway, P.R.

    1982-12-07

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment.

  8. Reactor component automatic grapple

    DOEpatents

    Greenaway, Paul R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment.

  9. Automatic sweep circuit

    DOEpatents

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.
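The search strategy described — examine a time window relative to the trigger, repeat the monitoring for statistical accuracy, and move to a different window if no response is found — can be sketched as a digital analogue of the circuit. Amplitudes and the threshold below are invented for illustration:

```python
def find_response_window(trials, window, threshold):
    """
    Sweep fixed-width time windows across trial data averaged over repeated
    trials (the "statistical accuracy" step); return the start index of the
    first window whose mean amplitude exceeds `threshold`, or None.
    """
    n = len(trials[0])
    avg = [sum(t[i] for t in trials) / len(trials) for i in range(n)]
    for start in range(0, n - window + 1):
        if sum(avg[start:start + window]) / window > threshold:
            return start
    return None
```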

  10. Automatic Data Processing Glossary.

    ERIC Educational Resources Information Center

    Bureau of the Budget, Washington, DC.

    The technology of the automatic information processing field has progressed dramatically in the past few years and has created a problem in common term usage. As a solution, "Datamation" Magazine offers this glossary which was compiled by the U.S. Bureau of the Budget as an official reference. The terms appear in a single alphabetic sequence,…

  11. Automatic Dance Lesson Generation

    ERIC Educational Resources Information Center

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  12. Automatic multiple applicator electrophoresis

    NASA Technical Reports Server (NTRS)

    Grunbaum, B. W.

    1977-01-01

    Easy-to-use, economical device permits electrophoresis on all known supporting media. System includes automatic multiple-sample applicator, sample holder, and electrophoresis apparatus. System has potential applicability to fields of taxonomy, immunology, and genetics. Apparatus is also used for electrofocusing.

  13. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next-generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Arora, N. S., Russell, S., and Sudderth, E., Bulletin of the Seismological Society of America (BSSA), vol. 103, no. 2A, pp. 709-729, April 2013.

  14. Automatic morphological classification of galaxy images

    PubMed Central

    Shamir, Lior

    2009-01-01

    We describe an image analysis supervised learning algorithm that can automatically classify galaxy images. The algorithm is first trained using manually classified images of elliptical, spiral, and edge-on galaxies. A large set of image features is extracted from each image, and the most informative features are selected using Fisher scores. Test images can then be classified using a simple Weighted Nearest Neighbor rule such that the Fisher scores are used as the feature weights. Experimental results show that galaxy images from Galaxy Zoo can be classified automatically as spiral, elliptical, and edge-on galaxies with an accuracy of ~90% compared to classifications carried out by the author. Full compilable source code of the algorithm is available for free download, and its general-purpose nature makes it suitable for other uses that involve automatic image analysis of celestial objects. PMID:20161594
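The classification rule described — a nearest-neighbour search in which each feature's contribution is weighted by its Fisher score — can be sketched as follows. The toy two-feature data in the usage below are invented for illustration:

```python
def fisher_score(values, labels):
    """Ratio of between-class to within-class variance for one feature."""
    overall = sum(values) / len(values)
    between = within = 0.0
    for c in set(labels):
        vc = [v for v, l in zip(values, labels) if l == c]
        mc = sum(vc) / len(vc)
        between += len(vc) * (mc - overall) ** 2
        within += sum((v - mc) ** 2 for v in vc)
    return between / within if within else 0.0

def classify(sample, train_x, train_y, weights):
    """Weighted nearest neighbour: Fisher scores weight each squared feature distance."""
    def dist(x):
        return sum(w * (s - f) ** 2 for w, s, f in zip(weights, sample, x))
    return min(zip(train_x, train_y), key=lambda pair: dist(pair[0]))[1]
```

With an informative first feature and a noisy second one, the Fisher weights let the informative feature dominate the distance, which is the point of the weighting scheme.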

  15. Fully automatic telemetry data processor

    NASA Technical Reports Server (NTRS)

    Cox, F. B.; Keipert, F. A.; Lee, R. C.

    1968-01-01

    Satellite Telemetry Automatic Reduction System /STARS 2/, a fully automatic computer-controlled telemetry data processor, maximizes data recovery, reduces turnaround time, increases flexibility, and improves operational efficiency. The system incorporates a CDC 3200 computer as its central element.

  16. Sensitivity analysis of O{sub 3} and photochemical indicators using a mixed-phase chemistry box model and automatic differentiation technology

    SciTech Connect

    Zhang, Y.; Easter, R.C.; Bischof, C.H.; Wu, P.T.

    1997-12-31

    A comprehensive sensitivity analysis of a multi-phase atmospheric chemical mechanism is conducted under a variety of atmospheric conditions. The ADIFOR automatic differentiation technology is applied to evaluate the local sensitivities of species concentrations in gas, aqueous and aerosol phases with respect to a variety of model parameters. In this paper, sensitivities of tropospheric ozone and photochemical indicators with respect to species initial concentrations, gas-phase reaction rate constants, and aerosol surface uptake coefficients are presented and analyzed. The main gas-phase reaction pathways and aerosol surface uptake processes that affect tropospheric O{sub 3} formation, O{sub 3}-precursor relations and sensitivity of indicators are identified. The most influential gas-phase reactions include the photolytic reactions of NO{sub 2}, O{sub 3}, H{sub 2}O{sub 2}, HCHO, ALD{sub 2} and MGLY, the conversion of NO to NO{sub 2}, the generation and inter-conversion of OH, HO{sub 2} and RO{sub 2} radicals, and the formation and dissociation of oxidants and acids. Photochemical indicators such as O{sub 3}/NO{sub x} and H{sub 2}O{sub 2}/HNO{sub 3} are sensitive to changes in reaction rate constants, initial species concentrations, and uptake coefficients. These indicators are found to have higher sensitivities for hydrocarbon reactions and lower sensitivities for NO{sub x} reactions under polluted conditions as compared to less polluted conditions. Aerosol surface uptake is important when the total surface area is larger than 1,000 {micro}m{sup 2} cm{sup {minus}3}. The identified important heterogeneous processes include aerosol surface uptake of HCHO, O{sub 3}, HO{sub 2}, HNO{sub 3}, NO, NO{sub 2}, N{sub 2}O{sub 5}, PAN, H{sub 2}O{sub 2}, CH{sub 3}O{sub 2} and SO{sub 2}. These uptake processes can affect not only O{sub 3} formation and its sensitivity, but also O{sub 3}-precursor relations and sensitivities of indicators.
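Automatic differentiation of the kind ADIFOR provides can be illustrated with forward-mode dual numbers: each value carries a derivative alongside it, so local sensitivities come out exactly, without finite differencing. This is a toy sketch; the rate expression and all numbers are invented and bear no relation to the paper's chemical mechanism:

```python
class Dual:
    """Forward-mode AD number: value plus derivative w.r.t. one seeded input."""
    def __init__(self, v, d=0.0):
        self.v, self.d = v, d
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.v + o.v, self.d + o.d)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.v - o.v, self.d - o.d)
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.v * o.v, self.d * o.v + self.v * o.d)
    __rmul__ = __mul__

def net_rate(k1, k2, no2, o3):
    # toy mass-action expression: d[O3]/dt = k1*[NO2] - k2*[O3]
    return k1 * no2 - k2 * o3
```

Seeding a rate constant with derivative 1 makes the `.d` field of the result the sensitivity of the rate with respect to that constant.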

  17. Automatic Thesaurus Generation for an Electronic Community System.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; And Others

    1995-01-01

    This research reports an algorithmic approach to the automatic generation of thesauri for electronic community systems. The techniques used include term filtering, automatic indexing, and cluster analysis. The Worm Community System, used by molecular biologists studying the nematode worm C. elegans, was used as the testbed for this research.…

  18. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel...

  19. 19 CFR 360.103 - Automatic issuance of import licenses.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Automatic issuance of import licenses. 360.103 Section 360.103 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT OF COMMERCE STEEL IMPORT MONITORING AND ANALYSIS SYSTEM § 360.103 Automatic issuance of import licenses. (a) In general. Steel...

  20. Automatic TLI recognition system beta prototype testing

    SciTech Connect

    Lassahn, G.D.

    1996-06-01

    This report describes the beta prototype automatic target recognition system ATR3 and some performance tests done with this system. This is a fully operational system with a high computational speed. It is useful for finding any kind of target in digitized image data, and as a general-purpose image analysis tool.

  1. Automatic Processing of Current Affairs Queries

    ERIC Educational Resources Information Center

    Salton, G.

    1973-01-01

    The SMART system is used for the analysis, search and retrieval of news stories appearing in "Time" magazine. A comparison is made between the automatic text processing methods incorporated into the SMART system and a manual search using the classified index to "Time." (14 references) (Author)

  2. Apparatus enables automatic microanalysis of body fluids

    NASA Technical Reports Server (NTRS)

    Soffen, G. A.; Stuart, J. L.

    1966-01-01

    Apparatus will automatically and quantitatively determine body fluid constituents which are amenable to analysis by fluorometry or colorimetry. The results of the tests are displayed as percentages of full scale deflection on a strip-chart recorder. The apparatus can also be adapted for microanalysis of various other fluids.

  3. Automatic carrier acquisition system

    NASA Technical Reports Server (NTRS)

    Bunce, R. C. (Inventor)

    1973-01-01

    An automatic carrier acquisition system for a phase locked loop (PLL) receiver is disclosed. It includes a local oscillator, which sweeps the receiver to tune across the carrier frequency uncertainty range until the carrier crosses the receiver IF reference. Such crossing is detected by an automatic acquisition detector. It receives the IF signal from the receiver as well as the IF reference. It includes a pair of multipliers which multiply the IF signal with the IF reference in phase and in quadrature. The outputs of the multipliers are filtered through bandpass filters and power detected. The output of the power detector has a signal dc component which is optimized with respect to the noise dc level by the selection of the time constants of the filters as a function of the sweep rate of the local oscillator.
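The in-phase/quadrature detection described can be sketched numerically: multiply the IF signal by the reference in phase and in quadrature, low-pass (here, by averaging over the whole record), and power-detect. Sample rates and frequencies below are assumed for illustration:

```python
import math

def iq_power(signal, fs, f_ref):
    """Multiply by in-phase and quadrature references, average (low-pass),
    and sum the squared outputs (power detection)."""
    n = len(signal)
    i_out = sum(s * math.cos(2 * math.pi * f_ref * k / fs)
                for k, s in enumerate(signal)) / n
    q_out = sum(s * math.sin(2 * math.pi * f_ref * k / fs)
                for k, s in enumerate(signal)) / n
    return i_out ** 2 + q_out ** 2
```

A signal at the reference frequency produces a large detector output, while an off-frequency signal averages toward zero; a rising output as the sweep passes through is how the crossing of the IF reference is detected.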

  4. Automatism and driving offences.

    PubMed

    Rumbold, John

    2013-10-01

    Automatism is a rarely used defence, but it is particularly used for driving offences because many are strict liability offences. Medical evidence is almost always crucial to argue the defence, and it is important to understand the bars that limit the use of automatism so that the important medical issues can be identified. The issue of prior fault is an important public safeguard to ensure that reasonable precautions are taken to prevent accidents. The total loss of control definition is more problematic, especially with disorders of more gradual onset like hypoglycaemic episodes. In these cases the alternative of 'effective loss of control' would be fairer. This article explores several cases, how the criteria were applied to each, and the types of medical assessment required. PMID:24112330

  5. Automatic Abstraction in Planning

    NASA Technical Reports Server (NTRS)

    Christensen, J.

    1991-01-01

    Traditionally, abstraction in planning has been accomplished by either state abstraction or operator abstraction, neither of which has been fully automatic. We present a new method, predicate relaxation, for automatically performing state abstraction. PABLO, a nonlinear hierarchical planner, implements predicate relaxation. Theoretical, as well as empirical results are presented which demonstrate the potential advantages of using predicate relaxation in planning. We also present a new definition of hierarchical operators that allows us to guarantee a limited form of completeness. This new definition is shown to be, in some ways, more flexible than previous definitions of hierarchical operators. Finally, a Classical Truth Criterion is presented that is proven to be sound and complete for a planning formalism that is general enough to include most classical planning formalisms that are based on the STRIPS assumption.

  6. Automatic speech recognition

    NASA Astrophysics Data System (ADS)

    Espy-Wilson, Carol

    2005-04-01

    Great strides have been made in the development of automatic speech recognition (ASR) technology over the past thirty years. Most of this effort has been centered around the extension and improvement of Hidden Markov Model (HMM) approaches to ASR. Current commercially available and industry systems based on HMMs can perform well for certain situational tasks that restrict variability, such as phone dialing or limited voice commands. However, the holy grail of ASR systems is performance comparable to that of humans; in other words, the ability to automatically transcribe unrestricted conversational speech spoken by an infinite number of speakers under varying acoustic environments. This goal is far from being reached. Key to the success of ASR is effective modeling of variability in the speech signal. This tutorial will review the basics of ASR and the various ways in which our current knowledge of speech production, speech perception, and prosody can be exploited to improve robustness at every level of the system.

  7. Adaptation is automatic.

    PubMed

    Samuel, A G; Kat, D

    1998-04-01

    Two experiments were used to test whether selective adaptation for speech occurs automatically or instead requires attentional resources. A control condition demonstrated the usual large identification shifts caused by repeatedly presenting an adapting sound (/wa/, with listeners identifying members of a /ba/-/wa/ test series). Two types of distractor tasks were used: (1) Subjects did a rapid series of arithmetic problems during the adaptation periods (Experiments 1 and 2), or (2) they made a series of rhyming judgments, requiring phonetic coding (Experiment 2). A control experiment (Experiment 3) demonstrated that these tasks normally impose a heavy attentional cost on phonetic processing. Despite this, for both experimental conditions, the observed adaptation effect was just as large as in the control condition. This result indicates that adaptation is automatic, operating at an early, preattentive level. The implications of these results for current models of speech perception are discussed. PMID:9599999

  8. Analysis of SSEM Sensor Data Using BEAM

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Park, Han; James, Mark

    2004-01-01

    A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being Beacon-Based Exception Analysis for Multimissions (NPO- 20827), Vol. 26, No.9 (September 2002), page 32 and Integrated Formulation of Beacon-Based Exception Analysis for Multimissions (NPO- 21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
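The DIAD's core idea — fit an autoregressive model to a signal and flag deviations of the fitted coefficients from values learned on nominal training data — can be sketched with a first-order model. This is a toy stand-in, not the BEAM implementation, and the tolerance is an assumption:

```python
def ar1_coeff(x):
    """Least-squares AR(1) coefficient: x[t] ~ a * x[t-1]."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def is_anomalous(train, test, tol=0.1):
    """Flag the test series when its AR coefficient drifts from training."""
    return abs(ar1_coeff(train) - ar1_coeff(test)) > tol
```

A series with the same dynamics as the training data passes; one whose decay behaviour changes (as a frozen sense line or failed sensor would change it) is flagged.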

  9. Automatic circuit interrupter

    NASA Technical Reports Server (NTRS)

    Dwinell, W. S.

    1979-01-01

    In this technique, the voice circuits connecting the crew's cabin to the launch station through an umbilical connector automatically disconnect the unused, or deadened, portion of the circuits immediately after the vehicle is launched. This eliminates the possibility that unused wiring interferes with voice communications inside the vehicle, as well as the need for a manual cutoff switch and its associated wiring. The technique can also be applied to other types of electrical actuation circuits and to the launch of other vehicles requiring the assistance of a ground crew, such as balloons, submarines, test sleds, and test chambers.

  10. Automatic digital image registration

    NASA Technical Reports Server (NTRS)

    Goshtasby, A.; Jain, A. K.; Enslin, W. R.

    1982-01-01

    This paper introduces a general procedure for automatic registration of two images which may have translational, rotational, and scaling differences. This procedure involves (1) segmentation of the images, (2) isolation of dominant objects from the images, (3) determination of corresponding objects in the two images, and (4) estimation of transformation parameters using the center of gravities of objects as control points. An example is given which uses this technique to register two images which have translational, rotational, and scaling differences.
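Step (4) — estimating the transformation from the centres of gravity of corresponding objects — reduces, in the purely translational case, to a centroid difference. A minimal sketch (the point coordinates below are invented):

```python
def centroid(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def estimate_translation(src, dst):
    """Translation mapping the src object centres onto dst; rotation and
    scaling, also handled by the paper's procedure, are omitted here."""
    (sx, sy), (dx, dy) = centroid(src), centroid(dst)
    return (dx - sx, dy - sy)
```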

  11. Automatic estimation of the first subglottal resonance.

    PubMed

    Arsikere, Harish; Lulich, Steven M; Alwan, Abeer

    2011-05-01

    This letter focuses on the automatic estimation of the first subglottal resonance (Sg1). A database comprising speech and subglottal data of native American English speakers and bilingual Spanish/English speakers was used for the analysis. Data from 11 speakers (five males and six females) were used to derive an empirical relation among the first formant frequency, fundamental frequency, and Sg1. Using the derived relation, Sg1 was automatically estimated from voiced sounds in English and Spanish sentences spoken by 22 different speakers (11 males and 11 females). The error in estimating Sg1 was less than 50 Hz, on average. PMID:21568375
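An empirical relation of this kind can be obtained by least squares over speaker data. The sketch below fits Sg1 ≈ a·F1 + b·f0 + c via the normal equations; the synthetic data and coefficients in the test are invented, and the letter's actual derived relation is not reproduced here:

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_sg1_relation(f1s, f0s, sg1s):
    """Normal-equations least squares for Sg1 ~ a*F1 + b*f0 + c."""
    rows = [(f1, f0, 1.0) for f1, f0 in zip(f1s, f0s)]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    rhs = [sum(r[i] * s for r, s in zip(rows, sg1s)) for i in range(3)]
    return solve3(A, rhs)
```

Once fitted on one group of speakers, the relation can estimate Sg1 for new speakers from measured F1 and f0 alone, which is the workflow the letter describes.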

  12. Analysis of results obtained using the automatic chemical control of the quality of the water heat carrier in the drum boiler of the Ivanovo CHP-3 power plant

    NASA Astrophysics Data System (ADS)

    Larin, A. B.; Kolegov, A. V.

    2012-10-01

    Results of industrial tests of the new method used for the automatic chemical control of the quality of boiler water in a drum-type power boiler (Pd = 13.8 MPa) are described. The possibility of using an H-cationite column for measuring the electric conductivity of an H-cationized sample of boiler water over a long period of time is shown.

  13. Support vector machine for automatic pain recognition

    NASA Astrophysics Data System (ADS)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for the recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects faces from the stored video frames using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
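An SVM classifier of the kind used here can be sketched in pure Python with sub-gradient descent on the hinge loss. This is a minimal linear stand-in; the paper's face detection, feature extraction, and any kernel choice are not reproduced, and the toy 2D data in the usage are invented:

```python
def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=500):
    """Sub-gradient descent on the regularized hinge loss; labels are +1/-1."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:   # inside the margin: hinge sub-gradient step
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:            # correct with margin: regularization shrink only
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```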

  14. Automatic exposure control for space sequential camera

    NASA Technical Reports Server (NTRS)

    Mcatee, G. E., Jr.; Stoap, L. J.; Solheim, C. D.; Sharpsteen, J. T.

    1975-01-01

    The final report of the automatic exposure control study for space sequential cameras, prepared for the NASA Johnson Space Center, is presented. The material is presented in the sequence in which the work was performed. The purpose of the automatic exposure control is to automatically control the lens iris as well as the camera shutter so that the subject is properly exposed on the film. A study of design approaches is presented. Analysis of the light range of the spectrum covered indicates that the practical range would be from approximately 20 to 6,000 foot-lamberts, or about nine f-stops. Observation of film available from space flights shows that optimum scene illumination is apparently not present in vehicle interior photography or in vehicle-to-vehicle situations. The evaluation test procedure for a breadboard, and the results, which provided information for the design of a brassboard, are given.

  15. Automatic weld torch guidance control system

    NASA Technical Reports Server (NTRS)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of +/- 0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.

  16. An image-based automatic mesh generation and numerical simulation for a population-based analysis of aerosol delivery in the human lungs

    NASA Astrophysics Data System (ADS)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2013-11-01

    The authors propose a method to automatically generate three-dimensional subject-specific airway geometries and meshes for computational fluid dynamics (CFD) studies of aerosol delivery in the human lungs. The proposed method automatically expands computed tomography (CT)-based airway skeleton to generate the centerline (CL)-based model, and then fits it to the CT-segmented geometry to generate the hybrid CL-CT-based model. To produce a turbulent laryngeal jet known to affect aerosol transport, we developed a physiologically-consistent laryngeal model that can be attached to the trachea of the above models. We used Gmsh to automatically generate the mesh for the above models. To assess the quality of the models, we compared the regional aerosol distributions in a human lung predicted by the hybrid model and the manually generated CT-based model. The aerosol distribution predicted by the hybrid model was consistent with the prediction by the CT-based model. We applied the hybrid model to 8 healthy and 16 severe asthmatic subjects, and average geometric error was 3.8% of the branch radius. The proposed method can be potentially applied to the branch-by-branch analyses of a large population of healthy and diseased lungs. NIH Grants R01-HL-094315 and S10-RR-022421, CT data provided by SARP, and computer time provided by XSEDE.

  17. Using airborne LiDAR in geoarchaeological contexts: Assessment of an automatic tool for the detection and the morphometric analysis of grazing archaeological structures (French Massif Central).

    NASA Astrophysics Data System (ADS)

    Roussel, Erwan; Toumazet, Jean-Pierre; Florez, Marta; Vautier, Franck; Dousteyssier, Bertrand

    2014-05-01

    Airborne laser scanning (ALS) of archaeological regions of interest is nowadays a widely used and established method for accurate topographic and microtopographic survey. The penetration of the vegetation cover by the laser beam allows the reconstruction of reliable digital terrain models (DTM) of forested areas where traditional prospection methods are inefficient, time-consuming and non-exhaustive. The ALS technology provides the opportunity to discover new archaeological features hidden by vegetation and provides a comprehensive survey of cultural heritage sites within their environmental context. However, the post-processing of LiDAR point clouds produces a huge quantity of data in which relevant archaeological features are not easily detectable with common visualizing and analysing tools. Undoubtedly, there is an urgent need for automation of structure detection and morphometric extraction techniques, especially for the "archaeological desert" in densely forested areas. This presentation deals with the development of automatic detection procedures applied to archaeological structures located in the French Massif Central, in the western forested part of the Puy-de-Dôme volcano between 950 and 1100 m a.s.l. These unknown archaeological sites were discovered by the March 2011 ALS mission and display a high density of subcircular depressions with a corridor access. The spatial organization of these depressions varies from isolated to aggregated or aligned features. Functionally, they appear to be former grazing constructions built from the medieval to the modern period. Similar grazing structures are known in other locations of the French Massif Central (Sancy, Artense, Cézallier) where the ground is vegetation-free. In order to develop a reliable process of automatic detection and mapping of these archaeological structures, a learning zone has been delineated within the ALS surveyed area. The grazing features were mapped and typical morphometric attributes

  18. Classification and automatic transcription of primate calls.

    PubMed

    Versteegh, Maarten; Kuhn, Jeremy; Synnaeve, Gabriel; Ravaux, Lucie; Chemla, Emmanuel; Cäsar, Cristiane; Fuller, James; Murphy, Derek; Schel, Anne; Dunbar, Ewan

    2016-07-01

    This paper reports on an automated and openly available tool for automatic acoustic analysis and transcription of primate calls, which takes raw field recordings and outputs call labels time-aligned with the audio. The system's output predicts a majority of the start times of calls accurately within 200 milliseconds. The tools do not require any manual acoustic analysis or selection of spectral features by the researcher. PMID:27475207

  19. Automatic microscopy for mitotic cell location.

    NASA Technical Reports Server (NTRS)

    Herron, J.; Ranshaw, R.; Castle, J.; Wald, N.

    1972-01-01

    Advances are reported in the development of an automatic microscope with which to locate hematologic or other cells in mitosis for subsequent chromosome analysis. The system under development is designed to perform the functions of: slide scanning to locate metaphase cells; conversion of images of selected cells into binary form; and on-line computer analysis of the digitized image for significant cytogenetic data. Cell detection criteria are evaluated using a test sample of 100 mitotic cells and 100 artifacts.

  20. AUTOMATIC FREQUENCY CONTROL SYSTEM

    DOEpatents

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency determining element. A low frequency is mixed with the output of the driving oscillator and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.

  1. Automatic range selector

    DOEpatents

    McNeilly, Clyde E.

    1977-01-04

    A device is provided for automatically selecting from a plurality of ranges of a scale of values to which a meter may be made responsive, that range which encompasses the value of an unknown parameter. A meter relay indicates whether the unknown is of greater or lesser value than the range to which the meter is then responsive. The rotatable part of a stepping relay is rotated in one direction or the other in response to the indication from the meter relay. Various positions of the rotatable part are associated with particular scales. Switching means are sensitive to the position of the rotatable part to couple the associated range to the meter.
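The range-selection logic can be sketched in software: step up one range at a time while the reading exceeds the current full scale, and step down when it fits a lower range, as the meter relay's greater/lesser indication would drive the stepping relay. The ranges below are hypothetical:

```python
FULL_SCALE = [1.0, 10.0, 100.0, 1000.0]  # hypothetical full-scale values

def select_range(value, start=0, ranges=FULL_SCALE):
    """Return the index of the lowest range covering `value`, stepping one
    range at a time from `start` as a stepping relay would."""
    i = start
    while i < len(ranges) - 1 and value > ranges[i]:
        i += 1   # meter relay reads "greater": step up
    while i > 0 and value <= ranges[i - 1]:
        i -= 1   # value fits a lower range: step down
    return i
```

A value above the highest range simply settles on the top range; a real instrument would indicate overload there.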

  2. Automatic clutch control system

    SciTech Connect

    Kasai, H.; Ogawa, N.; Hattori, T.; Ishihara, M.; Uriuhara, M.

    1986-12-16

    This patent describes an automatic clutch control system, comprising: a clutch having a full clutch engagement point and a clutch contact point; a clutch actuator for controlling a clutch stroke; a plurality of solenoid valves for controlling the clutch actuator; clutch stroke sensor means for measuring the clutch stroke and for detecting the full clutch engagement point and the clutch contact point in the clutch stroke; control means, for feeding back a stroke signal detected by the clutch stroke sensor and for controlling the solenoid valves to control clutch engagement and disengagement.

  3. Automatic speaker recognition system

    NASA Astrophysics Data System (ADS)

    Higgins, Alan; Naylor, Joe

    1984-07-01

    The Defense Communications Division of ITT (ITTDCD) has developed an automatic speaker recognition (ASR) system that meets the functional requirements defined in NRL's Statement of Work. This report is organized as follows. Chapter 2 is a short history of the development of the ASR system, both the algorithm and the implementation. Chapter 3 describes the methodology of system testing, and Chapter 4 summarizes test results. In Chapter 5, some additional testing performed using GFM test material is discussed. Conclusions derived from the contract work are given in Chapter 6.

  4. Automatic readout micrometer

    SciTech Connect

    Lauritzen, T.

    1982-03-23

    A measuring system is disclosed for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibilities of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  5. Automatic engine control system

    SciTech Connect

    Geary, W.C.; Mirsaiidi, M.V.; Redfern, T.; Wolfe, D.W.

    1986-01-14

    This patent describes an automatic control circuit for an internal combustion engine and clutch assembly. One component of this circuit is a timer for determining a first period of time when the engine is allowed to run and the clutch is engaged, and a second period of time when the clutch is automatically disengaged. Associated with the timer is a starter means to start the engine during the first time period and a clutch actuating mechanism for engaging the clutch near the start of the first time period after the starter starts the engine. An engine shut-down and clutch disengagement mechanism is also responsive to the first timer. The patent then describes a supplemental timer mechanism for determining a third and a fourth period of time within the second time period, such that the third period is when the engine is shut off and the fourth period is when the engine runs with the clutch disengaged. The starter mechanism is responsive to the supplemental timer to start the engine at the beginning of the fourth period. A shut-down means stops the engine at the beginning of the third period in response to the timer.
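    The nested timer periods can be laid out as a simple state lookup over one repeating cycle. The function and the period lengths used in the example are hypothetical; only the four-period structure comes from the patent abstract.

```python
def plant_state(t, run_end, off_start, off_end, cycle_len):
    """Engine/clutch state at time t (seconds) within one repeating cycle:
    first period  [0, run_end)         -> engine on, clutch engaged
    second period [run_end, cycle_len) -> clutch disengaged, within which
      third period  [off_start, off_end) -> engine also shut down
      fourth period (remainder)          -> engine restarted, clutch out
    """
    t = t % cycle_len
    if t < run_end:
        return ("engine_on", "clutch_engaged")
    if off_start <= t < off_end:
        return ("engine_off", "clutch_disengaged")
    return ("engine_on", "clutch_disengaged")
```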

  6. Automatic readout micrometer

    DOEpatents

    Lauritzen, T.

    A measuring system is described for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibilities of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  7. Automatic readout micrometer

    DOEpatents

    Lauritzen, Ted

    1982-01-01

    A measuring system is disclosed for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibilities of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  8. Automatic enrollment for gait-based person re-identification

    NASA Astrophysics Data System (ADS)

    Ortells, Javier; Martín-Félez, Raúl; Mollineda, Ramón A.

    2015-02-01

    Automatic enrollment involves a critical decision-making process within the person re-identification context. However, this process has been traditionally undervalued. This paper studies the problem of automatic person enrollment from a realistic perspective relying on gait analysis. Experiments simulating random flows of people with considerable appearance variations between different observations of a person have been conducted, modeling both short- and long-term scenarios. Promising results based on ROC analysis show that automatically enrolling people by their gait is affordable with high success rates.
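    The ROC analysis mentioned above reduces, at each decision threshold, to counting how often genuine (same-person) match scores are accepted versus how often impostor scores are falsely accepted. A minimal sketch, with hypothetical score values:

```python
def roc_point(genuine_scores, impostor_scores, threshold):
    """One ROC operating point for enrollment decisions: true-positive
    rate over genuine scores, false-positive rate over impostor scores."""
    tpr = sum(s >= threshold for s in genuine_scores) / len(genuine_scores)
    fpr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return tpr, fpr
```

    Sweeping the threshold over all observed scores traces out the full ROC curve.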

  9. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, 1873 (PL XX); illustration used by eminent British textile engineer to exemplify the ultimate development in American cotton mill technology. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  10. 3. Photocopy from Western Architect, Vol. 19, No. 8, August ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Photocopy from Western Architect, Vol. 19, No. 8, August 1913, following page 80. 'TOWN AND COMMUNITY PLANNING, WALTER BURLEY GRIFFEN.' ORIGINAL PRESENTATION DRAWING AT NORTHWESTERN UNIVERSITY, ART DEPARTMENT. - Joshua G. Melson House, 56 River Heights Drive, Mason City, Cerro Gordo County, IA

  11. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    PubMed

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated with complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery. PMID:24473345
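    The RMSE figure quoted above is the standard root-mean-square error between the DSM-derived terrace heights and the field GPS check heights. A minimal sketch (the sample values are hypothetical):

```python
import math

def rmse(estimated, reference):
    """Root mean square error, e.g. DSM terrace heights vs. field GPS."""
    n = len(estimated)
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimated, reference)) / n)
```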

  12. Analysis and Design of a Gated Envelope Feedback Technique for Automatic Hardware Reconfiguration of RFIC Power Amplifiers, with Full On-Chip Implementation in Gallium Arsenide Heterojunction Bipolar Transistor Technology

    NASA Astrophysics Data System (ADS)

    Constantin, Nicolas Gerard David

    In this doctoral dissertation, the author presents the theoretical foundation, the analysis and design of analog and RF circuits, the chip-level implementation, and the experimental validation pertaining to a new radio frequency integrated circuit (RFIC) power amplifier (PA) architecture that is intended for wireless portable transceivers. A method called Gated Envelope Feedback is proposed to allow the automatic hardware reconfiguration of a stand-alone RFIC PA in multiple states for power efficiency improvement purposes. The method uses self-operating and fully integrated circuitry comprising RF power detection, switching and sequential logic, and RF envelope feedback in conjunction with a hardware gating function for triggering and activating current reduction mechanisms as a function of the transmitted RF power level. Because of the critical role that RFIC PA components occupy in modern wireless transceivers, and given the major impact that these components have on the overall RF performance and energy consumption in wireless transceivers, very significant benefits stem from the underlying innovations. The method has been validated through the successful design of a 1.88 GHz CDMA RFIC PA with automatic hardware reconfiguration capability, using an industry-renowned state-of-the-art GaAs HBT semiconductor process developed and owned by Skyworks Solutions, Inc., USA. The circuit techniques that have enabled the successful and full on-chip embodiment of the technique are analyzed in detail. The IC implementation is discussed, and experimental results showing significant current reduction upon automatic hardware reconfiguration, gain regulation performance, and compliance with the stringent linearity requirements for CDMA transmission demonstrate that the gated envelope feedback method is a viable and promising approach to automatic hardware reconfiguration of RFIC PAs for current reduction purposes. Moreover, in regard to on-chip integration of advanced PA

  13. Comment on "Why reduced-form regression models of health effects versus exposures should not replace QRA: livestock production and infant mortality as an example," by Louis Anthony (Tony) Cox, Jr., Risk Analysis 2009, Vol. 29, No. 12.

    PubMed

    Sneeringer, Stacy

    2010-04-01

    While a recent paper by Cox in this journal uses as its motivating factor the benefits of quantitative risk assessment, its content is entirely devoted to critiquing Sneeringer's article in the American Journal of Agricultural Economics. Cox's two main critiques of Sneeringer are fundamentally flawed and misrepresent the original article. Cox posits that Sneeringer did A and B, and then argues why A and B are incorrect. However, Sneeringer in fact did C and D; thus critiques of A and B are not applicable to Sneeringer's analysis. PMID:20345577

  14. Dry Sliding Wear Behavior of 6351 Al-(4 vol.% SiC + 4 vol.% Al2O3) Hybrid Composite

    NASA Astrophysics Data System (ADS)

    Show, Bijay Kumar; Mondal, Dipak Kumar; Maity, Joydeep

    2014-09-01

    In this research work, the dry sliding wear behavior of 6351 Al-(4 vol.% SiC + 4 vol.% Al2O3) hybrid composite was investigated at low sliding speed (1 m/s) against a hardened EN 31 disk at different loads. In general, the wear mechanism involved adhesion (along with associated subsurface cracking and delamination) and microcutting abrasion at lower load, while at higher load, abrasive wear involving microcutting and microploughing along with adherent oxide formation was observed. The overall wear rate increased with increasing normal load. The massive particle clusters as well as individual reinforcement particles were found to stand tall to resist abrasive wear. Besides, at higher load, the generation of adherent nodular tribo-oxide through nucleation and epitaxial growth on existing Al2O3 particles lowered the wear rate. Accordingly, at any normal load, 6351 Al-(4 vol.% SiC + 4 vol.% Al2O3) hybrid composite exhibited superior wear resistance (lower overall wear rate) than the reported wear resistance of monolithic 6351 Al alloy.

  15. AUTOMATIC HAND COUNTER

    DOEpatents

    Mann J.R.; Wainwright, A.E.

    1963-06-11

    An automatic, personnel-operated, alpha-particle hand monitor is described which functions as a qualitative instrument to indicate to the person using it whether his hands are "cold" or "hot." The monitor is activated by a push button and includes several capacitor-triggered thyratron tubes. Upon release of the push button, the monitor starts the counting of the radiation present on the hands of the person. If the count of the radiation exceeds a predetermined level within a predetermined time, then a capacitor will trigger a first thyratron tube to light a "hot" lamp. If, however, the count is below such level during this time period, another capacitor will fire a second thyratron to light a "safe" lamp. (AEC)
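    The decision logic of the monitor is a count-versus-time threshold. A minimal sketch of that logic (function name, threshold, and window are hypothetical; the two return values stand in for the two lamps):

```python
def classify_hands(counts_per_second, threshold, window_s):
    """Accumulate counts over the timing window; 'hot' as soon as the
    total crosses the threshold, otherwise 'safe' at window end."""
    total = 0
    for count in counts_per_second[:window_s]:
        total += count
        if total > threshold:
            return "hot"     # first thyratron fires
    return "safe"            # second thyratron fires
```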

  16. Automatic routing module

    NASA Technical Reports Server (NTRS)

    Malin, Janice A.

    1987-01-01

    Automatic Routing Module (ARM) is a tool to partially automate Air Launched Cruise Missile (ALCM) routing. For any accessible launch point or target pair, ARM creates flyable routes that, within the fidelity of the models, are optimal in terms of threat avoidance, clobber avoidance, and adherence to vehicle and planning constraints. Although highly algorithmic, ARM is an expert system. Because of the heuristics applied, ARM generated routes closely resemble manually generated routes in routine cases. In more complex cases, ARM's ability to accumulate and assess threat danger in three dimensions and trade that danger off with the probability of ground clobber results in the safest path around or through difficult areas. The tools available prior to ARM did not provide the planner with enough information or present it in such a way that ensured he would select the safest path.

  17. Automatic Bayesian polarity determination

    NASA Astrophysics Data System (ADS)

    Pugh, D. J.; White, R. S.; Christie, P. A. F.

    2016-07-01

    The polarity of the first motion of a seismic signal from an earthquake is an important constraint in earthquake source inversion. Microseismic events often have low signal-to-noise ratios, which may lead to difficulties estimating the correct first-motion polarities of the arrivals. This paper describes a probabilistic approach to polarity picking that can be both automated and combined with manual picking. This approach includes a quantitative estimate of the uncertainty of the polarity, improving calculation of the polarity probability density function for source inversion. It is sufficiently fast to be incorporated into an automatic processing workflow. When used in source inversion, the results are consistent with those from manual observations. In some cases, they produce a clearer constraint on the range of high-probability source mechanisms, and are better constrained than source mechanisms determined using a uniform probability of an incorrect polarity pick.
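    One common way to express such a quantitative polarity uncertainty is as the probability that the true first motion is positive given the picked amplitude and an estimate of the noise level. The sketch below assumes Gaussian noise on the measured amplitude; it is an illustration of the idea, not the paper's exact formulation.

```python
import math

def polarity_probability(amplitude, noise_std):
    """P(true first motion is positive | observed amplitude), assuming
    zero-mean Gaussian noise of known standard deviation on the pick."""
    return 0.5 * (1.0 + math.erf(amplitude / (noise_std * math.sqrt(2.0))))
```

    A large positive amplitude relative to the noise gives a probability near 1, a large negative one near 0, and an amplitude buried in noise stays near 0.5, which is exactly the hedged input a probabilistic source inversion needs.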

  18. Automatic flowmeter calibration system

    NASA Technical Reports Server (NTRS)

    Lisle, R. V.; Wilson, T. L. (Inventor)

    1981-01-01

    A system for automatically calibrating the accuracy of a flowmeter is described. The system includes a calculator capable of performing mathematical functions responsive to receiving data signals and function command signals. A prover cylinder is provided for measuring the temperature, pressure, and time required for accumulating a predetermined volume of fluid. Along with these signals, signals representing the temperature and pressure of the fluid going into the meter are fed to a plurality of data registers. Under control of a program controller, the data registers are read out and the information is fed through a data select circuit to the calculator. Command signals are also produced by a function select circuit and are fed to the calculator to indicate the desired function to be performed. The reading is then compared with the reading produced by the flowmeter.
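    The comparison step amounts to correcting the prover's accumulated volume for temperature and pressure and then expressing the meter's deviation from it. The sketch below uses a simple ideal-gas correction with assumed reference conditions; the function names and reference values are illustrative, not from the patent.

```python
def corrected_volume(raw_volume, temp_k, press_pa,
                     ref_temp_k=288.15, ref_press_pa=101325.0):
    """Ideal-gas correction of the prover's accumulated volume to
    assumed reference conditions (15 degC, 1 atm)."""
    return raw_volume * (press_pa / ref_press_pa) * (ref_temp_k / temp_k)

def meter_error_pct(meter_volume, prover_volume):
    """Flowmeter reading compared against the prover-derived volume."""
    return 100.0 * (meter_volume - prover_volume) / prover_volume
```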

  19. Automatic Bayesian polarity determination

    NASA Astrophysics Data System (ADS)

    Pugh, D. J.; White, R. S.; Christie, P. A. F.

    2016-04-01

    The polarity of the first motion of a seismic signal from an earthquake is an important constraint in earthquake source inversion. Microseismic events often have low signal-to-noise ratios, which may lead to difficulties estimating the correct first-motion polarities of the arrivals. This paper describes a probabilistic approach to polarity picking that can be both automated and combined with manual picking. This approach includes a quantitative estimate of the uncertainty of the polarity, improving calculation of the polarity probability density function for source inversion. It is sufficiently fast to be incorporated into an automatic processing workflow. When used in source inversion, the results are consistent with those from manual observations. In some cases, they produce a clearer constraint on the range of high-probability source mechanisms, and are better constrained than source mechanisms determined using a uniform probability of an incorrect polarity pick.

  20. Automatic thermal switch

    NASA Technical Reports Server (NTRS)

    Wing, L. D.; Cunningham, J. W. (Inventor)

    1981-01-01

    An automatic thermal switch to control heat flow includes a first thermally conductive plate, a second thermally conductive plate and a thermal transfer plate pivotally mounted between the first and second plates. A phase change power unit, including a plunger connected to the transfer plate, is in thermal contact with the first thermally conductive plate. A biasing element, connected to the transfer plate, biases the transfer plate in a predetermined position with respect to the first and second plates. When the phase change power unit is actuated by an increase in heat transmitted through the first plate, the plunger extends and pivots the transfer plate to vary the thermal conduction between the first and second plates through the transfer plate. The biasing element, transfer plate and piston can be arranged to provide either a normally closed or normally open thermally conductive path between the first and second plates.

  1. Automatic TLI recognition system, general description

    SciTech Connect

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  2. Semi-automatic approach for music classification

    NASA Astrophysics Data System (ADS)

    Zhang, Tong

    2003-11-01

    Audio categorization is essential when managing a music database, either a professional library or a personal collection. However, a complete automation in categorizing music into proper classes for browsing and searching is not yet supported by today's technology. Also, the issue of music classification is subjective to some extent as each user may have his own criteria for categorizing music. In this paper, we propose the idea of semi-automatic music classification. With this approach, a music browsing system is set up which contains a set of tools for separating music into a number of broad types (e.g. male solo, female solo, string instruments performance, etc.) using existing music analysis methods. With results of the automatic process, the user may further cluster music pieces in the database into finer classes and/or adjust misclassifications manually according to his own preferences and definitions. Such a system may greatly improve the efficiency of music browsing and retrieval, while at the same time guarantee accuracy and user's satisfaction with the results. Since this semi-automatic system has two parts, i.e. the automatic part and the manual part, they are described separately in the paper, with detailed descriptions and examples of each step of the two parts included.

  3. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

    Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis, including CT imaging, can detect aggressive malignant pleural mesothelioma in its early stage. In order to create a quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, automatic spatiotemporal matching techniques for the detected pleural thickenings at two points of time, based on semi-automatic registration, have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal component analysis proves more advantageous than the feature-based mapping using the centroid and mean Hounsfield units of each thickening, since the sensitivity improved from 42.19% to 98.46%, while the accuracy of the feature-based mapping is only slightly higher (84.38% vs. 76.19%).
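    To make the matching idea concrete, the sketch below implements the simpler of the two strategies compared above: greedy nearest-centroid matching of thickenings between two scans after registration. It is a minimal illustration (function name, distance cutoff, and coordinates are hypothetical), not the paper's PCA-based mapping, which the abstract reports as the more sensitive variant.

```python
def match_thickenings(centroids_t1, centroids_t2, max_dist):
    """Greedily pair each thickening centroid from scan 1 with its
    nearest unused centroid in scan 2, within a distance cutoff."""
    matches = []
    used = set()
    for i, c1 in enumerate(centroids_t1):
        best, best_d = None, max_dist
        for j, c2 in enumerate(centroids_t2):
            if j in used:
                continue
            d = sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            matches.append((i, best))
            used.add(best)
    return matches
```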

  4. Semi-automatic transmission

    SciTech Connect

    Morscheck, T.J.; Davis, A.R.; Huggins, M.J.

    1987-06-30

    This patent describes a semi-automatic mechanical change gear transmission system of the type comprising: a mechanical change gear transmission of the type comprising a transmission housing, an input shaft rotatably supported in the housing and driven by an engine through a nonpositive coupling, an output shaft rotatably supported in the housing and a plurality of selectable ratio gears selectively engageable one at a time to a first transmission element by means of positive, nonsynchronized jaw clutch assemblies for providing a plurality of manually selectable drive ratios between the input and output shafts, each of the jaw clutch assemblies comprising a first jaw clutch member rotatably associated with the first transmission element and a second jaw clutch member rotatably associated with a second transmission element, each of the first jaw clutch members axially moveable relative to the first transmission element; manually operated means for engaging and disengaging the nonpositive coupling; manually operated shifting means for engaging selected ratio gears to and disengaging selected ratio gears from the first transmission element; selection means for sensing the identity of the particular ratio gear selected for manual engagement or disengagement from the first transmission element and for providing a signal; first and second rotational speed sensors for sensing the rotational speed of the first and second transmission elements and providing signals; a power synchronizer assembly selectively actuable for selectively varying the rotational speed of the second transmission element and the second jaw clutch members rotatably associated therewith; and a central processing unit.

  5. Automatic alkaloid removal system.

    PubMed

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid-removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant; scientific study has shown that its tubers contain a toxic alkaloid constituent, dioscorine, and they can only be consumed after the poison is removed. In this experiment, the tubers are blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump ON, creating a turbulent wave of water in the machine tank. The water is then drained automatically by triggering the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. The controller then automatically triggers the inlet solenoid valve, and fresh water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, after which a positive result is achieved, shown to be significant according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters give readings near or identical to those of the control water, and the toxin is assumed to be fully removed when the pH of the DH wash water approaches that of the control water. The control water has a pH of about 5.3 and the water from this experimental process 6.0, while before running the machine the contaminated water has a pH of about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user. PMID:24783795
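    The controller's schedule is a repeating wash / drain / refill cycle over the 7 h run. The sketch below only counts complete cycles; the 5 min drain-and-refill time is an assumed figure, since the abstract gives only the 10 min wash duration and the 7 h total.

```python
def wash_cycles(total_hours=7, wash_min=10, drain_refill_min=5):
    """Complete cycles (wash, drain contaminated water, refill to the
    ultrasonic-sensor level) that fit in the run. drain_refill_min is
    an assumption, not stated in the abstract."""
    cycle_min = wash_min + drain_refill_min
    return (total_hours * 60) // cycle_min
```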

  6. A Comparative Analysis of DBSCAN, K-Means, and Quadratic Variation Algorithms for Automatic Identification of Swallows from Swallowing Accelerometry Signals

    PubMed Central

    Dudik, Joshua M.; Kurosu, Atsuko; Coyle, James L.

    2015-01-01

    Background Cervical auscultation with high resolution sensors is currently under consideration as a method of automatically screening for specific swallowing abnormalities. To be clinically useful without human involvement, any devices based on cervical auscultation should be able to detect specified swallowing events in an automatic manner. Methods In this paper, we comparatively analyze the density-based spatial clustering of applications with noise algorithm (DBSCAN), a k-means based algorithm, and an algorithm based on quadratic variation as methods of differentiating periods of swallowing activity from periods of time without swallows. These algorithms utilized swallowing vibration data exclusively and compared the results to a gold standard measure of swallowing duration. Data was collected from 23 subjects who were actively suffering from swallowing difficulties. Results Comparing the performance of the DBSCAN algorithm with a proven segmentation algorithm that utilizes k-means clustering demonstrated that the DBSCAN algorithm had a higher sensitivity and correctly segmented more swallows. Comparing its performance with a threshold-based algorithm that utilized the quadratic variation of the signal showed that the DBSCAN algorithm offered no direct increase in performance. However, it offered several other benefits including a faster run time and more consistent performance between patients. All algorithms showed noticeable differentiation from the endpoints provided by a videofluoroscopy examination as well as reduced sensitivity. Conclusions In summary, we showed that the DBSCAN algorithm is a viable method for detecting the occurrence of a swallowing event using cervical auscultation signals, but significant work must be done to improve its performance before it can be implemented in an unsupervised manner. PMID:25658505
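    To illustrate how DBSCAN segments swallowing activity, the sketch below is a minimal 1-D DBSCAN over sample times of high-energy vibration frames: dense runs of samples become clusters (candidate swallows) and isolated samples become noise. The parameter values in the example are hypothetical, not from the paper.

```python
def dbscan_1d(points, eps, min_pts):
    """Minimal DBSCAN over 1-D samples; returns one cluster label per
    point, with -1 marking noise."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(len(points)) if abs(points[j] - points[i]) <= eps]
        if len(neighbors) < min_pts:
            labels[i] = -1              # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # border point absorbed into cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neigh = [k for k in range(len(points)) if abs(points[k] - points[j]) <= eps]
            if len(j_neigh) >= min_pts:
                seeds.extend(j_neigh)   # core point: keep expanding
    return labels
```

    Two dense bursts of samples separated by quiet time come back as two clusters, the kind of segmentation compared against videofluoroscopy endpoints above.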

  7. Identification of Xanthomonas fragariae, Xanthomonas axonopodis pv. phaseoli, and Xanthomonas fuscans subsp. fuscans with novel markers and using a dot blot platform coupled with automatic data analysis.

    PubMed

    Albuquerque, Pedro; Caridade, Cristina M R; Marcal, Andre R S; Cruz, Joana; Cruz, Leonor; Santos, Catarina L; Mendes, Marta V; Tavares, Fernando

    2011-08-15

    Phytosanitary regulations and the provision of plant health certificates still rely mainly on long and laborious culture-based methods of diagnosis, which are frequently inconclusive. DNA-based methods of detection can circumvent many of the limitations of currently used screening methods, allowing a fast and accurate monitoring of samples. The genus Xanthomonas includes 13 phytopathogenic quarantine organisms for which improved methods of diagnosis are needed. In this work, we propose 21 new Xanthomonas-specific molecular markers, within loci coding for Xanthomonas-specific protein domains, useful for DNA-based methods of identification of xanthomonads. The specificity of these markers was assessed by a dot blot hybridization array using 23 non-Xanthomonas species, mostly soil dwelling and/or phytopathogens for the same host plants. In addition, the validation of these markers on 15 Xanthomonas spp. suggested species-specific hybridization patterns, which allowed discrimination among the different Xanthomonas species. Bearing in mind that DNA-based methods of diagnosis are particularly hampered for unsequenced species, namely, Xanthomonas fragariae, Xanthomonas axonopodis pv. phaseoli, and Xanthomonas fuscans subsp. fuscans, for which comparative genomics tools to search for DNA signatures are not yet applicable, emphasis was given to the selection of informative markers able to identify X. fragariae, X. axonopodis pv. phaseoli, and X. fuscans subsp. fuscans strains. In order to avoid inconsistencies due to operator-dependent interpretation of dot blot data, an image-processing algorithm was developed to automatically analyze the dot blot patterns. Ultimately, the proposed markers and the dot blot platform, coupled with automatic data analyses, have the potential to foster a thorough monitoring of phytopathogenic xanthomonads. PMID:21705524
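    The operator-free spot calling that such an image-processing step produces can be reduced to a background-relative intensity rule once mean intensities have been extracted per spot. The sketch below is one plausible rule of this kind (function name, the k-sigma cutoff, and all values are assumptions, not the paper's algorithm):

```python
def call_spots(intensities, background, k=3.0):
    """Positive/negative hybridization calls: a spot is positive when its
    mean intensity exceeds the background mean by k background SDs."""
    n = len(background)
    bg_mean = sum(background) / n
    bg_sd = (sum((b - bg_mean) ** 2 for b in background) / n) ** 0.5
    cutoff = bg_mean + k * bg_sd
    return [1 if v > cutoff else 0 for v in intensities]
```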

  8. An open system for automatic home-cage behavioral analysis and its application to male and female mouse models of Huntington's disease.

    PubMed

    Zarringhalam, Kourosh; Ka, Minhan; Kook, Yeon-Hee; Terranova, Joseph I; Suh, Yongjoon; King, Oliver D; Um, Moonkyoung

    2012-04-01

    Changes in routine mouse home-cage behavioral activities have been used recently to study alterations of neural circuits caused by genetic and environmental modifications and by drug administration. Nevertheless, automatic assessment of mouse home-cage behaviors remains challenging due to the cost of proprietary systems and to the difficulty in adjusting systems to different monitoring conditions. Here we present software for the automatic quantification of multiple facets of mouse home-cage behaviors, suitable for continuous 24 h video monitoring. We used this program to assess behavioral changes in male and female R6/2 transgenic mouse models of Huntington's disease over a 10-week period. Consistent with the well-known progressive motor coordination deficits of R6/2 mice, their hanging, rearing, and climbing activity declined as the disease progressed. R6/2 mice also exhibited frequent disturbances in their resting activity compared to wild-type mice, suggesting that R6/2 mice are more restless and wakeful. Behavioral differences were seen earlier for male R6/2 mice than female R6/2 mice, and "behavioral signatures" based on multiple behaviors enabled us to distinguish male R6/2 mice from sex- and age-matched wild-type controls as early as 5 weeks of age. These results demonstrate that the automated behavioral classification software that we developed ("OpenCage") provides a powerful tool for analyzing natural home-cage mouse behaviors, and for constructing behavioral signatures that will be useful for assessing therapeutic strategies. The OpenCage software is available under an open-source GNU General Public License, allowing other users to freely modify and extend it to suit their purposes. PMID:22266926

  9. Automatic Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr.

    1985-01-01

    Coal cutting and removal done with minimal hazard to people. Automatic coal mine cutting, transport and roof-support movement all done by automatic machinery. Exposure of people to hazardous conditions reduced to inspection tours, maintenance, repair, and possibly entry mining.

  10. Keystone feasibility study. Final report. Vol. 4

    SciTech Connect

    Not Available

    1982-12-01

    Volume four of the Keystone coal-to-methanol project includes the following: (1) project management; (2) economic and financial analyses; (3) market analysis; (4) process licensing and agreements; and (5) appendices. 24 figures, 27 tables.

  11. Automatic multidiagnosis system for slit lamp

    NASA Astrophysics Data System (ADS)

    Ventura, Liliane; Chiaradia, Caio; Vieira Messias, Andre M.; Faria de Sousa, Sidney J.; Isaac, Flavio; Caetano, Cesar A. C.; Rosa Filho, Andre B.

    2001-06-01

    We have developed a system that adds four automatic diagnostic functions to the slit lamp biomicroscope: (1) counting of the endothelial cells of donated corneas; (2) automatic keratometry; (3) corneal ulcer evaluation; and (4) measurement of linear distances and areas of the ocular image. The system consists of a slit lamp, a beam-splitter, some optical components, a CCD detector, a frame grabber, and a PC. The optical components attached to the beam-splitter are the same for all functions except the first, for which we developed an optical system that magnifies the image 290X and software that counts the cells interactively and automatically. Results are in good agreement with commercial specular microscopes (correlation coefficient 0.98081). The automatic keratometry function is able to measure cylinders over 30 diopters as well as irregular astigmatisms. It works by projecting a light ring onto the patient's cornea; analysis of the deformation of the ring provides the radius of curvature as well as the axis of the astigmatism. The nominal precision is 0.005 mm for the curvature radius and 2 degrees for the axis component. The results are in good agreement with commercial systems (correlation coefficient 0.99347). For function 3, the ulcer is isolated by the usual clinical means and the image of the green area is automatically detected by the developed software in order to evaluate the evolution of the disease. Function 4 simply allows the clinician to make any linear or area measurement of the ocular image. The system is a low-cost multi-evaluation instrument and is being used in a public hospital in Brazil.

  12. Automatic Command Sequence Generation

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladded, Roy; Khanampompan, Teerapat

    2007-01-01

    Automatic Sequence Generator (Autogen) Version 3.0 software automatically generates command sequences for the Mars Reconnaissance Orbiter (MRO) and several other JPL spacecraft operated by the multi-mission support team. Autogen uses standard JPL sequencing tools like APGEN, ASP, SEQGEN, and the DOM database to automate the generation of uplink command products, Spacecraft Command Message Format (SCMF) files, and the corresponding ground command products, DSN Keywords Files (DKF). Autogen supports all the major multi-mission phases, including the cruise, aerobraking, mapping/science, and relay mission phases. Autogen is a Perl script that functions within the mission operations UNIX environment. It consists of two parts: a set of model files and the autogen Perl script. Autogen encodes the behaviors of the system into a model and encodes algorithms for context-sensitive customizations of the modeled behaviors. The model includes knowledge of the different mission phases and how the resultant command products must differ for these phases. The executable software portion of Autogen automates the setup and use of APGEN for constructing a spacecraft activity sequence file (SASF). The setup includes file retrieval through the DOM (Distributed Object Manager), an object database used to store project files; this step retrieves all the input files needed for generating the command products. Depending on the mission phase, Autogen also uses the ASP (Automated Sequence Processor) and SEQGEN to generate the command product sent to the spacecraft. Autogen also provides the means for customizing sequences through the use of configuration files. By automating the majority of the sequence generation process, Autogen eliminates many sequence generation errors commonly introduced by manually constructing spacecraft command sequences. Through the layering of commands into the sequence by a series of scheduling algorithms, users are able to rapidly and reliably construct the sequence.

  13. Composite materials: Fatigue and fracture. Vol. 3

    NASA Technical Reports Server (NTRS)

    O'Brien, T. K. (Editor)

    1991-01-01

    The present volume discusses topics in the fields of matrix cracking and delamination, interlaminar fracture toughness, delamination analysis, strength and impact characteristics, and fatigue and fracture behavior. Attention is given to cooling rate effects in carbon-reinforced PEEK, the effect of porosity on flange-web corner strength, mode II delamination in toughened composites, the combined effect of matrix cracking and free edge delamination, and a 3D stress analysis of plain weave composites. Also discussed are the compression behavior of composites, damage-based notched-strength modeling, fatigue failure processes in aligned carbon-epoxy laminates, and the thermomechanical fatigue of a quasi-isotropic metal-matrix composite.

  14. Composite materials: Fatigue and fracture. Vol. 3

    NASA Astrophysics Data System (ADS)

    O'Brien, T. K.

    1991-11-01

    The present volume discusses topics in the fields of matrix cracking and delamination, interlaminar fracture toughness, delamination analysis, strength and impact characteristics, and fatigue and fracture behavior. Attention is given to cooling rate effects in carbon-reinforced PEEK, the effect of porosity on flange-web corner strength, mode II delamination in toughened composites, the combined effect of matrix cracking and free edge delamination, and a 3D stress analysis of plain weave composites. Also discussed are the compression behavior of composites, damage-based notched-strength modeling, fatigue failure processes in aligned carbon-epoxy laminates, and the thermomechanical fatigue of a quasi-isotropic metal-matrix composite.

  15. Empirical Research in Theatre, Vol 3.

    ERIC Educational Resources Information Center

    Addington, David W., Ed.; Kepke, Allen N., Ed.

    This journal provides a focal point for the collection and distribution of systematically processed information about theory and practice in theatre. Part of an irregularly published series, this issue contains investigations of the application of transactional analysis to the theatre, the psychological effect of counterattitudinal acting in…

  16. Youth Studies Abstracts. Vol. 4 No. 1.

    ERIC Educational Resources Information Center

    Youth Studies Abstracts, 1985

    1985-01-01

    This volume contains abstracts of 76 projects (most of which were conducted in Australia and New Zealand) concerned with programs for youth and with social and educational developments affecting youth. The abstracts are arranged in the following two categories: (1) Social and Educational Developments: Policy, Analysis, Research; and (2) Programs:…

  17. Electronically controlled automatic transmission

    SciTech Connect

    Ohkubo, M.; Shiba, H.; Nakamura, K.

    1989-03-28

    This patent describes an electronically controlled automatic transmission having a manual valve working in connection with a manual shift lever, shift valves operated by solenoid valves that are driven by an electronic control circuit in which shift patterns are stored, and a hydraulic circuit controlled by the manual valve and shift valves to drive brakes and a clutch in order to change speed. Shift patterns for 2-range and L-range, in addition to the shift pattern for D-range, are stored in the electronic control circuit. An operation switch changes the shift pattern of the electronic control circuit to any of the D-range, 2-range, and L-range patterns while the manual shift lever is in the D-range position, and a releasable lock mechanism prevents the manual shift lever from entering the 2-range and L-range positions. When the shift valves are not operating, the hydraulic circuit is set to a third-speed mode with the manual shift lever in the D-range position, to a second-speed mode in the 2-range position, and to a first-speed mode in the L-range position.

  18. Automatic imitation in dogs

    PubMed Central

    Range, Friederike; Huber, Ludwig; Heyes, Cecilia

    2011-01-01

    After preliminary training to open a sliding door using their head and their paw, dogs were given a discrimination task in which they were rewarded with food for opening the door using the same method (head or paw) as demonstrated by their owner (compatible group), or for opening the door using the alternative method (incompatible group). The incompatible group, which had to counterimitate to receive food reward, required more trials to reach a fixed criterion of discrimination performance (85% correct) than the compatible group. This suggests that, like humans, dogs are subject to ‘automatic imitation’; they cannot inhibit online the tendency to imitate head use and/or paw use. In a subsequent transfer test, where all dogs were required to imitate their owners' head and paw use for food reward, the incompatible group made a greater proportion of incorrect, counterimitative responses than the compatible group. These results are consistent with the associative sequence learning model, which suggests that the development of imitation depends on sensorimotor experience and phylogenetically general mechanisms of associative learning. More specifically, they suggest that the imitative behaviour of dogs is shaped more by their developmental interactions with humans than by their evolutionary history of domestication. PMID:20667875

  19. Automatic battery analyzer

    SciTech Connect

    Dougherty, T.J.; Frailing, C.E.

    1980-03-11

    Apparatus for automatically testing automotive-type, lead acid storage batteries is disclosed in which three separate tests are made and the results thereof compared to predetermined standards in a specified order to maximize the information obtained about the battery. The three tests measure (1) whether the battery meets its cold cranking rating by drawing a predetermined load current therefrom for a predetermined period of time and determining whether the battery terminal voltage is above a specified level at the end of that period, (2) whether the battery terminal voltage is above another specified level at the end of a predetermined period of time following the completion of the first test, and (3) whether the internal resistance is acceptably low. If the battery passes the first test, it is known to be acceptable. If the battery fails the first test and passes the second test, it is known to be unacceptable. If the battery fails the first and second tests, the third test is performed. If the battery then passes the third test, it is known to be acceptable but to require a recharge, whereas if the battery then fails the third test the acceptability of the battery is then not yet determined and it must be recharged and retested.
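    The three-test decision order described in the abstract can be sketched as a small function. The boolean inputs stand for the outcome of each test; the actual load currents, voltage levels, and resistance limits are device-specific and not reproduced here.

    ```python
    def classify_battery(passes_cold_crank, passes_recovery, passes_resistance):
        """Apply the three tests in the order described in the patent abstract."""
        if passes_cold_crank:        # test 1: terminal voltage held up under load
            return "acceptable"
        if passes_recovery:          # test 2: voltage recovered after a rest period
            return "unacceptable"
        if passes_resistance:        # test 3: internal resistance acceptably low
            return "acceptable, recharge required"
        return "undetermined: recharge and retest"
    ```

    Note the asymmetry: passing test 2 after failing test 1 indicates a charged but weak battery (unacceptable), while test 3 is only consulted when the battery may simply be discharged.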

  20. Automatic Welding System

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Robotic welding has been of interest to industrial firms because it offers higher productivity at lower cost than manual welding. Some systems with automated arc guidance are available, but they have disadvantages, such as limitations on the types of materials or seams that can be welded, susceptibility to stray electrical signals, a restricted field of view, or a tendency to contaminate the weld seam. Seeking to overcome these disadvantages, Marshall Space Flight Center, aided by Hayes International Corporation, developed a system that uses closed-circuit TV signals for automatic guidance of the welding torch. NASA granted a license to Combined Technologies, Inc. for commercial application of the technology, and the company developed a refined and improved arc guidance system. CTI in turn licensed the Merrick Corporation, also of Nashville, to market and manufacture the new system, called the CT2 Optical Tracker. CT2 is a non-contacting system that adapts to a broader range of welding jobs and provides greater reliability in high-speed operation. It is extremely accurate and can travel at speeds of up to 150 inches per minute.

  1. Attaining Automaticity in the Visual Numerosity Task is Not Automatic

    PubMed Central

    Speelman, Craig P.; Muller Townsend, Katrina L.

    2015-01-01

    This experiment is a replication of experiments reported by Lassaline and Logan (1993) using the visual numerosity task. The aim was to replicate the transition from controlled to automatic processing reported by Lassaline and Logan (1993), and to examine the extent to which this result, reported with average group results, can be observed in the results of individuals within a group. The group results in this experiment did replicate those reported by Lassaline and Logan (1993); however, one half of the sample did not attain automaticity with the task, and one-third did not exhibit a transition from controlled to automatic processing. These results raise questions about the pervasiveness of automaticity, and the interpretation of group means when examining cognitive processes. PMID:26635658

  2. [Study on the automatic parameters identification of water pipe network model].

    PubMed

    Jia, Hai-Feng; Zhao, Qi-Feng

    2010-01-01

    Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is regarded as the key bottleneck for applying such models in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed, and its kernel algorithms are studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte-Carlo Sampling) is used for automatic identification of parameters; a detailed technical route based on RSA and MCS is presented. A software module for automatic identification of water pipe network model parameters was developed. Finally, a case study on a typical water pipe network was conducted, and satisfactory results were achieved. PMID:20329520
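    The MCS step can be sketched as follows: draw candidate parameter values at random, run the hydraulic model for each, and keep the candidate that best reproduces the SCADA measurements. This is a minimal sketch; the `simulate` function, the single-parameter search, and the bounds are placeholders for a real hydraulic solver and multi-parameter sampling.

    ```python
    import random

    def identify_parameters(simulate, observed, n_samples=1000, bounds=(80.0, 150.0)):
        """Monte-Carlo Sampling sketch for identifying one model parameter.

        `simulate(c)` maps a candidate parameter (e.g. a pipe roughness
        coefficient) to predicted node pressures; `observed` are the
        corresponding SCADA measurements.
        """
        best_c, best_err = None, float("inf")
        for _ in range(n_samples):
            c = random.uniform(*bounds)  # draw a candidate from the prior range
            err = sum((p - o) ** 2 for p, o in zip(simulate(c), observed))
            if err < best_err:           # keep the best-fitting candidate so far
                best_c, best_err = c, err
        return best_c
    ```

    In practice RSA would first narrow the sampled set to the sensitive parameters, so that the Monte-Carlo search stays tractable.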

  3. Semi-automatic detection of gunshot residue (GSR) by scanning electron microscopy and energy dispersive X-ray analysis (SEM/EDX).

    PubMed

    Gansau, H; Becker, U

    1982-01-01

    Wide strips of adhesive tape (40 mm x 86 mm) are pressed for about one minute on the thumb, forefinger and outer edge of a subject's hand. Subsequently these tapes are fixed to a cylinder that rotates within the SEM chamber and is adjustable in an axial direction. The tapes are scanned for possible GSR particles with simultaneous SE and X-ray imaging. The X-ray signal caused by particles of high atomic number automatically stops the rotating cylinder, and the EDX spectrum of the suspect particle is produced within five seconds. A chart is then plotted to record the size, elemental composition, morphology and tape coordinates of the particles of interest. The outcome is a complete map of GSR particles found on certain parts of the hand, and this map has a characteristic pattern depending on the firearm used, the ammunition and the circumstances of the shooting incident. The time lapse between firing and sampling may allow this GSR pattern to change, but this method is helpful in suicide/homicide decisions. PMID:7167744

  4. Factors influencing relative speech intelligibility in patients with oral squamous cell carcinoma: a prospective study using automatic, computer-based speech analysis.

    PubMed

    Stelzle, F; Knipfer, C; Schuster, M; Bocklet, T; Nöth, E; Adler, W; Schempf, L; Vieler, P; Riemann, M; Neukam, F W; Nkenke, E

    2013-11-01

    Oral squamous cell carcinoma (OSCC) and its treatment impair speech intelligibility by alteration of the vocal tract. The aim of this study was to identify the factors of oral cancer treatment that influence speech intelligibility by means of an automatic, standardized speech-recognition system. The study group comprised 71 patients (mean age 59.89, range 35-82 years) with OSCC ranging from stage T1 to T4 (TNM staging). Tumours were located on the tongue (n=23), lower alveolar crest (n=27), and floor of the mouth (n=21). Reconstruction was conducted through local tissue plasty or microvascular transplants. Adjuvant radiotherapy was performed in 49 patients. Speech intelligibility was evaluated before, and at 3, 6, and 12 months after tumour resection, and compared to that of a healthy control group (n=40). Postoperatively, significant influences on speech intelligibility were tumour localization (P=0.010) and resection volume (P=0.019). Additionally, adjuvant radiotherapy (P=0.049) influenced intelligibility at 3 months after surgery. At 6 months after surgery, influences were resection volume (P=0.028) and adjuvant radiotherapy (P=0.034). The influence of tumour localization (P=0.001) and adjuvant radiotherapy (P=0.022) persisted after 12 months. Tumour localization, resection volume, and radiotherapy are crucial factors for speech intelligibility. Radiotherapy significantly impaired word recognition rate (WR) values with a progression of the impairment for up to 12 months after surgery. PMID:23845298

  5. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    A new fuzzy-set-based technique for decision making is discussed: a method to generate fuzzy decision rules automatically for image analysis. The proposed method generates rule-based approaches to problems such as autonomous navigation and image understanding automatically from training data, and is also capable of filtering irrelevant features and criteria out of the rules.

  6. Semiconductor yield improvements through automatic defect classification

    SciTech Connect

    Gleason, S.; Kulkarni, A.

    1995-09-30

    Detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. Projections by semiconductor manufacturers predict that, with larger wafer sizes and smaller line-width technology, the number of defects to be manually classified will increase exponentially. This cooperative research and development agreement (CRADA) between Martin Marietta Energy Systems (MMES) and KLA Instruments developed concepts, algorithms, and systems to automate the classification of wafer defects in order to decrease inspection time, improve the reliability of defect classification, and hence increase process throughput and yield. Image analysis, feature extraction, pattern recognition, and classification schemes were developed that are now being used as research tools for future products and are being integrated into the KLA line of wafer inspection hardware. An automatic defect classification software research tool was developed and delivered to the CRADA partner to facilitate continuation of this research beyond the end of the partnership.

  7. Retrotrapezoid nucleus, respiratory chemosensitivity and breathing automaticity

    PubMed Central

    Guyenet, Patrice G.; Bayliss, Douglas A.; Stornetta, Ruth L.; Fortuna, Michal G.; Abbott, Stephen B.; Depuy, Seth D.

    2009-01-01

    Breathing automaticity and CO2 regulation are inseparable neural processes. The retrotrapezoid nucleus (RTN), a group of glutamatergic neurons that express the transcription factor Phox2b, may be a crucial nodal point through which breathing automaticity is regulated to maintain CO2 constant. This review updates the analysis presented in prior publications. Additional evidence that RTN neurons have central respiratory chemoreceptor properties is presented but this is only one of many factors that determine their activity. The RTN is also regulated by powerful inputs from the carotid bodies and, at least in the adult, by many other synaptic inputs. We also analyze how RTN neurons may control the activity of the downstream central respiratory pattern generator. Specifically, we review the evidence which suggests that RTN neurons a) innervate the entire ventral respiratory column, and b) control both inspiration and expiration. Finally, we argue that the RTN neurons are the adult form of the parafacial respiratory group in neonate rats. PMID:19712903

  8. Automatic Weather Station (AWS) Lidar

    NASA Technical Reports Server (NTRS)

    Rall, Jonathan A.R.; Abshire, James B.; Spinhirne, James D.; Smith, David E. (Technical Monitor)

    2000-01-01

    An autonomous, low-power atmospheric lidar instrument is being developed at NASA Goddard Space Flight Center. This compact, portable lidar will operate continuously in a temperature-controlled enclosure, charge its own batteries through a combination of a small rugged wind generator and solar panels, and transmit its data from remote locations to ground stations via satellite. A network of these instruments will be established by co-locating them at remote Automatic Weather Station (AWS) sites in Antarctica under the auspices of the National Science Foundation (NSF). The NSF Office of Polar Programs provides support to place the weather stations in remote areas of Antarctica in support of meteorological research and operations. The AWS meteorological data will directly benefit the analysis of the lidar data, while a network of ground-based atmospheric lidar will provide knowledge regarding the temporal evolution and spatial extent of Type Ia polar stratospheric clouds (PSCs). These clouds play a crucial role in the annual austral springtime destruction of stratospheric ozone over Antarctica, i.e. the ozone hole. In addition, the lidar will monitor and record the general atmospheric conditions (transmission and backscatter) of the overlying atmosphere, which will benefit the Geoscience Laser Altimeter System (GLAS). Prototype lidar instruments have been deployed to the Amundsen-Scott South Pole Station (1995-96, 2000) and to an Automated Geophysical Observatory site (AGO 1) in January 1999. We report on data acquired with these instruments, instrument performance, and anticipated performance of the AWS Lidar.

  9. Automatic landslides detection on Stromboli volcanic Island

    NASA Astrophysics Data System (ADS)

    Silengo, Maria Cristina; Delle Donne, Dario; Ulivieri, Giacomo; Cigolini, Corrado; Ripepe, Maurizio

    2016-04-01

    Landslides occurring on active volcanic islands play a key role in triggering tsunamis and other related risks. It is therefore vital for correct and prompt risk assessment to monitor landslide activity and to have an automatic system for robust early warning. We have developed a system based on multi-frequency analysis of seismic signals for automatic detection of landslides occurring at Stromboli volcano, using a network of four 3-component seismic stations located along the unstable flank of the Sciara del Fuoco. Our method is able to recognize and separate the different sources of seismic signals related to volcanic and tectonic activity (e.g., tremor, explosions, earthquakes) from landslides. This is done using multi-frequency analysis combined with waveform pattern recognition. We applied the method to one year of seismic activity at Stromboli volcano centered on the 2007 effusive eruption, which was characterized by pre-eruptive landslide activity reflecting the slow deformation of the volcanic edifice. The algorithm currently runs off-line but has proved robust and efficient in automatically picking landslides. The method also provides real-time statistics on landslide occurrence, which could be used as a proxy for volcano deformation during pre-eruptive phases. The method is very promising, since the number of false detections is small (<5%) and decreases as landslide size increases. The final aim is to apply the method on-line for real-time automatic detection, as an improved tool for early warning of tsunami-genic landslide activity. We suggest that a similar approach could also be applied to unstable slopes in non-volcanic settings.
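    One common building block of such multi-frequency discrimination is the ratio of seismic energy in a high-frequency band to that in a low-frequency band. The sketch below shows this single feature; the band limits are illustrative assumptions, not the bands used by the Stromboli network, and the full system combines several such features with waveform pattern recognition.

    ```python
    import numpy as np

    def band_ratio(signal, fs, low_band=(1.0, 5.0), high_band=(10.0, 20.0)):
        """Ratio of high- to low-frequency energy in a seismic trace.

        `signal` is a 1-D array sampled at `fs` Hz; bands are (min, max) Hz.
        Landslide signals tend to be richer in high frequencies than
        volcanic tremor, so a large ratio favors the landslide class.
        """
        spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

        def energy(band):
            mask = (freqs >= band[0]) & (freqs < band[1])
            return spectrum[mask].sum()

        return energy(high_band) / max(energy(low_band), 1e-12)
    ```

    Thresholding this ratio, station by station, gives a first automatic screen; coincidence across the four stations then suppresses local noise.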

  10. Clothes Dryer Automatic Termination Evaluation

    SciTech Connect

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs.
    Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  11. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to, and by a specification language that is more natural to the user's problem domain and to the user's way of thinking about and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools to assist the modeler in defining or constructing a model of the system and then automatically writing the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  12. Vision-based industrial automatic vehicle classifier

    NASA Astrophysics Data System (ADS)

    Khanipov, Timur; Koptelov, Ivan; Grigoryev, Anton; Kuznetsova, Elena; Nikolaev, Dmitry

    2015-02-01

    The paper describes an automatic video-stream-based motor vehicle classification system that determines vehicle type at payment collection plazas on toll roads. Classification is performed in accordance with a preconfigured set of rules that determine type by number of wheel axles, vehicle length, height over the first axle, and full height. These characteristics are calculated using various computer vision algorithms: contour detectors, correlation analysis, the fast Hough transform, Viola-Jones detectors, connected-component analysis, elliptic shape detectors, and others. Input data comprise video streams and induction loop signals. Output signals are vehicle entry and exit events, vehicle type, motion direction, speed, and the above-mentioned features.
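    Once the features are measured, the rule stage itself is straightforward. The sketch below shows a hypothetical rule set keyed on axle count, length, and height; real toll classes and thresholds are configured per operator and are not those of the described system.

    ```python
    def classify_vehicle(axles, length_m, height_m):
        """Toy rule-based toll classifier over measured vehicle features.

        All class names and thresholds are illustrative placeholders.
        """
        if axles <= 2 and height_m < 2.0:
            return "car"
        if axles <= 2:
            return "van"                 # two axles but tall profile
        if length_m < 12.0:
            return "truck"
        return "truck with trailer"      # three or more axles, long vehicle
    ```

    Expressing the classes as an ordered list of such rules keeps the classifier reconfigurable without touching the vision pipeline that produces the features.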

  13. On-line dynamic fractionation and automatic determination of inorganic phosphorus in environmental solid substrates exploiting sequential injection microcolumn extraction and flow injection analysis.

    PubMed

    Buanuam, Janya; Miró, Manuel; Hansen, Elo Harald; Shiowatana, Juwadee

    2006-06-16

    Sequential injection microcolumn extraction (SI-MCE) based on the implementation of a soil-containing microcartridge as external reactor in a sequential injection network is, for the first time, proposed for dynamic fractionation of macronutrients in environmental solids, as exemplified by the partitioning of inorganic phosphorus in agricultural soils. The on-line fractionation method capitalises on the accurate metering and sequential exposure of the various extractants to the solid sample by application of programmable flow as precisely coordinated by a syringe pump. Three different soil phase associations for phosphorus, that is, exchangeable, Al- and Fe-bound, and Ca-bound fractions, were elucidated by accommodation in the flow manifold of the three steps of the Hieltjes-Lijklema (HL) scheme involving the use of 1.0M NH4Cl, 0.1M NaOH and 0.5M HCl, respectively, as sequential leaching reagents. The precise timing and versatility of SI for tailoring various operational extraction modes were utilized for investigating the extractability and the extent of phosphorus re-distribution for variable partitioning times. Automatic spectrophotometric determination of soluble reactive phosphorus in soil extracts was performed by a flow injection (FI) analyser based on the Molybdenum Blue (MB) chemistry. The 3sigma detection limit was 0.02 mg P L(-1) while the linear dynamic range extended up to 20 mg P L(-1) regardless of the extracting media. Despite the variable chemical composition of the HL extracts, a single FI set-up was assembled with no need for either manifold re-configuration or modification of chemical composition of reagents. The mobilization of trace elements, such as Cd, often present in grazed pastures as a result of the application of phosphate fertilizers, was also explored in the HL fractions by electrothermal atomic absorption spectrometry. PMID:17723403

  14. Automatic safety rod for reactors

    DOEpatents

    Germer, John H.

    1988-01-01

    An automatic safety rod for a nuclear reactor containing neutron absorbing material and designed to be inserted into a reactor core after a loss of core flow. Actuation occurs either upon a sudden decrease in core pressure drop or when the pressure drop falls below a predetermined minimum value. The automatic control rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  15. Prospects for de-automatization.

    PubMed

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness. PMID:20356765

  16. Microstructure and Mechanical Properties of Al6061-31vol.% B4C Composites Prepared by Hot Isostatic Pressing

    NASA Astrophysics Data System (ADS)

    Xian, Yajiang; Pang, Xiaoxuan; He, Shixiong; Wang, Wei; Wang, Xin; Zhang, Pengcheng

    2015-10-01

    Fabrication of durable and usable composites with a high content of B4C (up to 31vol.%) is quite challenging in several aspects, including blending, cold isostatic pressing, and hot isostatic pressing (HIP); in particular, an optimal HIP process is essential to achieve a metal matrix composite with desirable properties. The microstructure and mechanical properties of Al6061-31vol.% B4C with different particle sizes were investigated by scanning electron microscopy (SEM) and tensile testing, respectively. SEM analysis and quantitative measurements of the particle distribution reveal that B4C particles were uniformly distributed in the matrix without agglomeration when the HIP treatment temperature was about 580 °C, and x-ray diffraction also identified a dispersion of B4C particles as well as reaction products (AlB2 and Al3BC) in the composites. The microhardness of the Al6061-31vol.% B4C composites increased with B4C particle size, whereas the tensile strength of all the samples declined with an increase in B4C particle size. The contributions from different strengthening mechanisms are also discussed.

  17. VolRoverN: Enhancing surface and volumetric reconstruction for realistic dynamical simulation of cellular and subcellular function

    PubMed Central

    Edwards, John; Daniel, Eric; Kinney, Justin; Bartol, Tom; Sejnowski, Terrence; Johnston, Daniel; Harris, Kristen; Bajaj, Chandrajit

    2014-01-01

    Establishing meaningful relationships between cellular structure and function requires accurate morphological reconstructions. In particular, there is an unmet need for high-quality surface reconstructions to model subcellular and synaptic interactions among neurons at nanometer resolution. We address this need with VolRoverN, a software package that produces accurate, efficient, and automated 3D surface reconstructions from stacked 2D contour tracings. While many techniques and tools have been developed in the past for 3D visualization of cellular structure, the reconstructions from VolRoverN meet specific quality criteria that are important for dynamical simulations. These criteria include manifoldness, water-tightness, lack of self-intersections and object-object intersections, and geometric accuracy. These enhanced surface reconstructions are readily extensible to any cell type (including glia) and are used here on complex spiny dendrites and axons from mature rat hippocampal area CA1. Both spatially realistic surface reconstructions and reduced skeletonizations are produced and formatted by VolRoverN for easy input into analysis software packages for neurophysiological simulations at multiple spatial and temporal scales ranging from ion electro-diffusion to electrical cable models. PMID:24100964

  18. Automatic Collision Avoidance Technology (ACAT)

    NASA Technical Reports Server (NTRS)

    Swihart, Donald E.; Skoog, Mark A.

    2007-01-01

    This document presents two views of the Automatic Collision Avoidance Technology (ACAT). One viewgraph presentation reviews the development and system design of ACAT. Two types of ACAT exist: Automatic Ground Collision Avoidance (AGCAS) and Automatic Air Collision Avoidance (AACAS). AGCAS uses Digital Terrain Elevation Data (DTED) for mapping functions and uses navigation data to place the aircraft on the map. It then scans the DTED in front of and around the aircraft and uses the future aircraft trajectory (5g) to provide an automatic fly-up maneuver when required. AACAS uses a data link to determine position and closing rate. It contains several canned maneuvers to avoid collision. Automatic maneuvers can occur at the last instant, and both aircraft maneuver when using the data link. The system can use a sensor in place of the data link. The second viewgraph presentation reviews the development of a flight test and an evaluation of the test. A review of the operation and a comparison of the AGCAS with a pilot's performance are given. The same review is given for the AACAS.

  19. Automatic contact in DYNA3D for vehicle crashworthiness

    SciTech Connect

    Whirley, R.G.; Engelmann, B.E.

    1993-07-15

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit nonlinear finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. This paper discusses in detail a new four-step automatic contact algorithm. Key aspects of the proposed method include automatic identification of adjacent and opposite surfaces in the global search phase, and the use of a smoothly varying surface normal which allows a consistent treatment of shell intersection and corner contact conditions without ad-hoc rules. The paper concludes with three examples which illustrate the performance of the newly proposed algorithm in the public DYNA3D code.
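The smoothly varying surface normal described above can be illustrated, in spirit, by nodal normal averaging on a triangulated surface: each vertex normal is the renormalized sum of the unit normals of its adjacent faces, so the normal field varies continuously across corners. This is a generic sketch, not the DYNA3D implementation:

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Average the unit normals of all faces sharing each vertex, then
    renormalize. The resulting nodal normal field varies smoothly across
    facet boundaries, avoiding ad-hoc rules at corners."""
    vn = np.zeros_like(vertices)
    for f in faces:
        a, b, c = vertices[f[0]], vertices[f[1]], vertices[f[2]]
        n = np.cross(b - a, c - a)      # face normal (right-hand rule)
        n /= np.linalg.norm(n)
        for idx in f:
            vn[idx] += n
    vn /= np.linalg.norm(vn, axis=1, keepdims=True)
    return vn
```

For a flat patch all vertex normals coincide with the face normal; along an edge or corner the averaged normal bisects the adjacent facets, which is the property that makes a consistent contact treatment possible.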

  20. Warmer temperatures stimulate respiration and reduce net ecosystem productivity in a northern Great Plains grassland: Analysis of CO2 exchange in automatic chambers

    NASA Astrophysics Data System (ADS)

    Flanagan, L. B.

    2013-12-01

    The interacting effects of altered temperature and precipitation are expected to have significant consequences for ecosystem net carbon storage. Here I report the results of an experiment that evaluated the effects of elevated temperature and altered precipitation on ecosystem CO2 exchange in a northern Great Plains grassland, near Lethbridge, Alberta Canada. Open-top chambers were used to establish an experiment in 2012 with three treatments (control, warmed, warmed plus 50% of normal precipitation input). A smaller experiment with only the two temperature treatments (control and warmed) was conducted in 2013. Continuous half-hourly net CO2 exchange measurements were made using nine automatic chambers during May-October in both years. My objectives were to determine the sensitivity of the ecosystem carbon budget to temperature and moisture manipulations, and to test for direct and indirect effects of the environmental changes on ecosystem CO2 exchange. The experimental manipulations resulted primarily in a significant increase in air temperature in the warmed treatment plots. A cumulative net loss of carbon or negative net ecosystem productivity (NEP) occurred during May through September in the warmed treatment (NEP = -659 g C m⁻²), while in the control treatment there was a cumulative net gain of carbon (NEP = +50 g C m⁻²). An eddy covariance system that operated at the site, over a footprint region that was not influenced by the experimental treatments, also showed a net gain of carbon by the ecosystem. The reduced NEP was due to higher plant and soil respiration rates in the warmed treatment that appeared to be caused by a combination of: (i) higher carbon substrate availability indirectly stimulating soil respiration in the warmed relative to the control treatment, and (ii) a strong increase in leaf respiration likely caused by a shift in electron partitioning to the alternative pathway respiration in the warmed treatment, particularly when exposed to high
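Cumulative NEP figures like those quoted above come from integrating half-hourly chamber fluxes over the season. A minimal sketch of that bookkeeping, assuming fluxes in µmol CO2 m⁻² s⁻¹ with uptake positive (a common but not universal sign convention, and not necessarily the one used in this study):

```python
def cumulative_nep(half_hourly_flux_umol):
    """Integrate half-hourly NEP fluxes (umol CO2 m^-2 s^-1, uptake
    positive) into a cumulative seasonal carbon budget in g C m^-2."""
    G_C_PER_UMOL = 12.011e-6  # grams of carbon per micromole of CO2
    SECONDS = 1800            # seconds in one half-hour interval
    return sum(f * SECONDS * G_C_PER_UMOL for f in half_hourly_flux_umol)
```

A sustained flux of +1 µmol m⁻² s⁻¹ adds about 0.022 g C m⁻² per half hour, so season-long totals in the hundreds of g C m⁻² correspond to modest average fluxes.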

  1. Automatic Detection of Dominance and Expected Interest

    NASA Astrophysics Data System (ADS)

    Escalera, Sergio; Pujol, Oriol; Radeva, Petia; Vitrià, Jordi; Anguera, M. Teresa

    2010-12-01

    Social Signal Processing is an emergent area of research that focuses on the analysis of social constructs. Dominance and interest are two of these social constructs. Dominance refers to the level of influence a person has in a conversation. Interest, when referred to in terms of group interactions, can be defined as the degree of engagement that the members of a group collectively display during their interaction. In this paper, we argue that, using only behavioral motion information, we are able to predict the interest of observers watching face-to-face interactions as well as to identify the dominant people. First, we propose a simple set of movement-based features from body, face, and mouth activity in order to define a higher-level set of interaction indicators. The considered indicators are manually annotated by observers. Based on the opinions obtained, we define an automatic binary dominance detection problem and a multiclass interest quantification problem. The Error-Correcting Output Codes framework is used to learn to rank the perceived observer interest in face-to-face interactions, while AdaBoost is used to solve the dominance detection problem. The automatic system shows good correlation between the automatic categorization results and the manual ranking made by the observers in both the dominance and interest detection problems.
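The Error-Correcting Output Codes step can be sketched with scikit-learn's `OutputCodeClassifier`, which reduces a multiclass problem to several binary subproblems via a random code matrix. The data below is synthetic, and the feature layout and base learner are illustrative assumptions, not those of the paper:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OutputCodeClassifier

# Synthetic stand-in for movement-based interaction features:
# 200 samples, 6 features, 3 interest levels.
X, y = make_classification(n_samples=200, n_features=6, n_informative=4,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)

# ECOC encodes each class as a binary codeword; each bit is learned by
# a separate binary classifier (here, logistic regression).
ecoc = OutputCodeClassifier(LogisticRegression(max_iter=1000),
                            code_size=2.0, random_state=0)
ecoc.fit(X, y)
print("training accuracy:", ecoc.score(X, y))
```

At prediction time the binary outputs form a codeword that is matched to the nearest class codeword, which gives ECOC some robustness to individual classifier errors.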

  2. Automatic image classification for the urinoculture screening.

    PubMed

    Andreini, Paolo; Bonechi, Simone; Bianchini, Monica; Garzelli, Andrea; Mecocci, Alessandro

    2016-03-01

    Urinary tract infections (UTIs) are considered to be the most common bacterial infection; in fact, it is estimated that about 150 million UTIs occur worldwide yearly, giving rise to roughly $6 billion in healthcare expenditures and resulting in 100,000 hospitalizations. Nevertheless, it is difficult to carefully assess the incidence of UTIs, since an accurate diagnosis depends both on the presence of symptoms and on a positive urinoculture, whereas in most outpatient settings this diagnosis is made without an ad hoc analysis protocol. On the other hand, in the traditional urinoculture test, a sample of midstream urine is put onto a Petri dish, where a growth medium favors the proliferation of germ colonies. Then, the infection severity is evaluated by a visual inspection of a human expert, an error-prone and lengthy process. In this paper, we propose a fully automated system for urinoculture screening that can provide quick and easily traceable results for UTIs. Based on advanced image processing and machine learning tools, the infection type recognition, together with the estimation of the bacterial load, can be automatically carried out, yielding accurate diagnoses. The proposed AID (Automatic Infection Detector) system provides support during the whole analysis process: first, digital color images of Petri dishes are automatically captured, then specific preprocessing and spatial clustering algorithms are applied to isolate the colonies from the culture ground and, finally, an accurate classification of the infections and their severity evaluation are performed. The AID system speeds up the analysis, contributes to the standardization of the process, allows result repeatability, and reduces the costs. Moreover, the continuous transition between sterile and external environments (typical of the standard analysis procedure) is completely avoided. PMID:26780249
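The colony-isolation step can be illustrated with a toy threshold-and-label approach: binarize the plate image, group foreground pixels into connected components, and discard specks. The real AID pipeline is more sophisticated; the threshold and size filter below are assumptions for illustration only:

```python
import numpy as np
from scipy import ndimage

def count_colonies(image, threshold, min_pixels=5):
    """Toy colony isolation: threshold the plate image, label connected
    components, and drop components smaller than min_pixels.
    Returns the number of detected colonies."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    # Pixel count of each labeled component
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))
```

Estimating bacterial load could then proceed from the component count and sizes, which is the kind of quantity the abstract says AID derives automatically.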

  3. NASA automatic system for computer program documentation, volume 2

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.

    1972-01-01

    The DYNASOR 2 program is used for the dynamic nonlinear analysis of shells of revolution. The equations of motion of the shell are solved using Houbolt's numerical procedure. The displacements and stress resultants are determined for both symmetrical and asymmetrical loading conditions. Asymmetrical dynamic buckling can be investigated. Solutions can be obtained for highly nonlinear problems utilizing as many as five of the harmonics generated by the SAMMSOR program. A restart capability allows the user to restart the program at a specified time. For Vol. 1, see N73-22129.

  4. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  5. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, Anthony J.

    1994-05-10

    Disclosed are a method and apparatus for (1) automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, (2) automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, (3) manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and (4) automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly.

  6. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.

  7. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, A.J.

    1994-05-10

    Disclosed are a method and apparatus for automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly. 10 figures.

  8. Automatic cytometric device using multiple wavelength excitations

    NASA Astrophysics Data System (ADS)

    Rongeat, Nelly; Ledroit, Sylvain; Chauvet, Laurence; Cremien, Didier; Urankar, Alexandra; Couderc, Vincent; Nérin, Philippe

    2011-05-01

    Precise identification of eosinophils, basophils, and specific subpopulations of blood cells (B lymphocytes) in an unconventional automatic hematology analyzer is demonstrated. Our apparatus mixes two excitation radiations by means of an acousto-optic tunable filter to properly control the fluorescence emission of phycoerythrin cyanin 5 (PC5) conjugated to antibodies (anti-CD20 or anti-CRTH2) and of Thiazole Orange. In this way, our analyzer, which combines techniques of hematology analysis and flow cytometry based on multiple-fluorescence detection, drastically improves the signal-to-noise ratio and decreases the impact of spectral overlap from multiple fluorescence emissions.

  9. Automatic diluter for bacteriological samples.

    PubMed

    Trinel, P A; Bleuze, P; Leroy, G; Moschetto, Y; Leclerc, H

    1983-02-01

    The described apparatus, carrying 190 tubes, allows automatic and aseptic dilution of liquid or suspended-solid samples. Serial 10-fold dilutions are programmable from 10(-1) to 10(-9) and are carried out in glass tubes with screw caps and split silicone septa. Dilution assays performed with strains of Escherichia coli and Bacillus stearothermophilus permitted efficient conditions for sterilization of the needle to be defined and showed that the automatic dilutions were as accurate and as reproducible as the most rigorous conventional dilutions. PMID:6338826
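The programmable serial dilution series reduces to simple arithmetic: each transfer multiplies the previous concentration by the dilution factor. The sketch below captures only that arithmetic, not the instrument's control logic:

```python
def serial_dilution_factors(steps=9, factor=10):
    """Concentration fraction remaining after each transfer in a
    serial dilution series, e.g. 10^-1 ... 10^-9 for 9 ten-fold steps."""
    return [factor ** -(i + 1) for i in range(steps)]

print(serial_dilution_factors(3))  # [0.1, 0.01, 0.001]
```

For a 10-fold step this corresponds to transferring, say, 0.5 mL of sample into 4.5 mL of diluent at each stage; the exact volumes used by the apparatus are not stated in the abstract.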

  10. Automatic diluter for bacteriological samples.

    PubMed Central

    Trinel, P A; Bleuze, P; Leroy, G; Moschetto, Y; Leclerc, H

    1983-01-01

    The described apparatus, carrying 190 tubes, allows automatic and aseptic dilution of liquid or suspended-solid samples. Serial 10-fold dilutions are programmable from 10(-1) to 10(-9) and are carried out in glass tubes with screw caps and split silicone septa. Dilution assays performed with strains of Escherichia coli and Bacillus stearothermophilus permitted efficient conditions for sterilization of the needle to be defined and showed that the automatic dilutions were as accurate and as reproducible as the most rigorous conventional dilutions. Images PMID:6338826

  11. Grinding Parts For Automatic Welding

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  12. Users manual for AUTOMESH-2D: A program of automatic mesh generation for two-dimensional scattering analysis by the finite element method

    NASA Technical Reports Server (NTRS)

    Hua, Chongyu; Volakis, John L.

    1990-01-01

    AUTOMESH-2D is a computer program specifically designed as a preprocessor for the scattering analysis of two dimensional bodies by the finite element method. This program was developed due to a need for reducing the effort required to define and check the geometry data, element topology, and material properties. There are six modules in the program: (1) Parameter Specification; (2) Data Input; (3) Node Generation; (4) Element Generation; (5) Mesh Smoothing; and (6) Data File Generation.

  13. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  14. Automatic segmentation of solitary pulmonary nodules based on local intensity structure analysis and 3D neighborhood features in 3D chest CT images

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Kitasaka, Takayuki; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku

    2012-03-01

    This paper presents a solitary pulmonary nodule (SPN) segmentation method based on local intensity structure analysis and neighborhood feature analysis in chest CT images. Automated segmentation of SPNs is desirable for a chest computer-aided detection/diagnosis (CAD) system, since an SPN may indicate an early stage of lung cancer. Because SPNs and other chest structures such as blood vessels have similar intensities, nodule detection methods generate many false positives (FPs). To reduce such FPs, we introduce two features that analyze the relation between each segmented nodule candidate and its neighborhood region. The proposed method utilizes a blob-like structure enhancement (BSE) filter based on Hessian analysis to augment blob-like structures as initial nodule candidates. A fine segmentation is then performed to extract a more accurate region for each nodule candidate. FP reduction is mainly addressed by investigating two neighborhood features, based on the volume ratio and the eigenvectors of the Hessian, that are calculated from the neighborhood region of each nodule candidate. We evaluated the proposed method on 40 chest CT images: 20 standard-dose CT images randomly chosen from a local database and 20 low-dose CT images randomly chosen from a public database (LIDC). The experimental results revealed that the average TP rate of the proposed method was 93.6% with 12.3 FPs/case.
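A Hessian-based blob enhancement filter can be sketched generically: compute Gaussian second derivatives, take the Hessian eigenvalues at each voxel, and respond where all eigenvalues are negative (a bright blob). This is a minimal generic blobness filter, not the authors' exact BSE formulation:

```python
import numpy as np
from scipy import ndimage

def blob_enhancement(volume, sigma=1.5):
    """Minimal Hessian-based blob filter. Bright blob-like structures
    have all Hessian eigenvalues negative; the response is the magnitude
    of the least-negative eigenvalue, and 0 elsewhere."""
    d = volume.ndim
    H = np.empty(volume.shape + (d, d))
    for i in range(d):
        for j in range(d):
            # Gaussian derivative of order 1 along axes i and j
            order = [0] * d
            order[i] += 1
            order[j] += 1
            H[..., i, j] = ndimage.gaussian_filter(volume, sigma, order=order)
    eig = np.linalg.eigvalsh(H)  # eigenvalues in ascending order per voxel
    return np.where(eig[..., -1] < 0, -eig[..., -1], 0.0)
```

Tubular structures such as vessels keep one eigenvalue near zero along their axis, so they respond weakly here; that difference is what lets Hessian analysis separate nodules from vessels.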

  15. Approaches to the automatic generation and control of finite element meshes

    NASA Technical Reports Server (NTRS)

    Shephard, Mark S.

    1987-01-01

    The algorithmic approaches being taken to the development of finite element mesh generators capable of automatically discretizing general domains without the need for user intervention are discussed. It is demonstrated that, because of the modeling demands placed on an automatic mesh generator, all the approaches taken to date produce unstructured meshes. Consideration is also given to both a priori and a posteriori mesh control devices for automatic mesh generators, as well as their integration with geometric modeling and adaptive analysis procedures.

  16. Automatic 35 mm slide duplicator

    NASA Technical Reports Server (NTRS)

    Seidel, H. F.; Texler, R. E.

    1980-01-01

    Automatic duplicator is readily assembled from conventional, inexpensive equipment and parts. Series of slides can be exposed without operator attention, eliminating considerable manual handling and processing ordinarily required. At end of programmed exposure sequence, unit shuts off and audible alarm signals completion of process.

  17. Automatic Association of News Items.

    ERIC Educational Resources Information Center

    Carrick, Christina; Watters, Carolyn

    1997-01-01

    Discussion of electronic news delivery systems and the automatic generation of electronic editions focuses on the association of related items of different media type, specifically photos and stories. The goal is to be able to determine to what degree any two news items refer to the same news event. (Author/LRW)

  18. Automatically Preparing Safe SQL Queries

    NASA Astrophysics Data System (ADS)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
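The effect of the transformation (replacing string-concatenated SQL with parameterized, prepared-style queries) can be demonstrated in a few lines; Python's sqlite3 is used purely for illustration and is not the paper's target platform:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # classic injection payload

# Unsafe: string concatenation lets the payload rewrite the query logic.
unsafe = "SELECT role FROM users WHERE name = '" + user_input + "'"
leaked = conn.execute(unsafe).fetchall()

# Safe: a parameterized query treats the payload as literal data.
safe = conn.execute("SELECT role FROM users WHERE name = ?",
                    (user_input,)).fetchall()

print(leaked)  # [('admin',)] -- injection succeeded
print(safe)    # [] -- payload matched no user
```

The transformation described in the abstract automates exactly this rewrite across a legacy code base, so that query structure is fixed at prepare time and user input can no longer alter it.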

  19. Bubble vector in automatic merging

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Butler, T. G.

    1987-01-01

    It is shown that it is within the capability of the DMAP language to build a set of vectors that can grow incrementally to be applied automatically and economically within a DMAP loop that serves to append sub-matrices that are generated within a loop to a core matrix. The method of constructing such vectors is explained.

  20. Automatic Identification of Metaphoric Utterances

    ERIC Educational Resources Information Center

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…