Science.gov

Sample records for automatic vol analysis

  1. Automatic analysis of macroarrays images.

    PubMed

    Caridade, C R; Marcal, A S; Mendonca, T; Albuquerque, P; Mendes, M V; Tavares, F

    2010-01-01

    The analysis of dot blot (macroarray) images is currently based on the human identification of positive/negative dots, which is a subjective and time-consuming process. This paper presents a system for the automatic analysis of dot blot images, using a pre-defined grid of markers that includes a number of ON and OFF controls. Both the correction of geometric deformations in the input image and the detection of the individual markers are performed fully automatically. Based on a previous training stage, the probability for each marker to be ON is established. This information is provided together with quality parameters for training, noise and classification, allowing for a fully automatic evaluation of a dot blot image. PMID:21097139
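
    A minimal sketch of the final classification step described above, assuming the grid has already been registered so that a mean intensity can be sampled at each marker position; the Gaussian class models fitted from the ON/OFF control dots are an illustrative assumption, not the authors' published model.

    ```python
    import numpy as np
    from scipy.stats import norm

    def marker_on_probability(intensity, on_controls, off_controls):
        """Probability that a marker is ON, from Gaussian models of the control dots."""
        p_on = norm.pdf(intensity, np.mean(on_controls), np.std(on_controls))
        p_off = norm.pdf(intensity, np.mean(off_controls), np.std(off_controls))
        return p_on / (p_on + p_off)

    # Mean intensities sampled at grid positions after geometric correction:
    on_controls = [210.0, 198.0, 205.0]   # known-positive (ON) control dots
    off_controls = [40.0, 52.0, 47.0]     # known-negative (OFF) control dots
    print(marker_on_probability(150.0, on_controls, off_controls))
    ```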

  2. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
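
    A minimal illustration of the interval idea (not the INTLAB toolbox the abstract refers to): each uncertain quantity is carried as an interval, and every arithmetic operation returns an interval guaranteed to enclose the true result, so error bounds propagate automatically through a formula.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float
        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)
        def __sub__(self, other):
            return Interval(self.lo - other.hi, self.hi - other.lo)
        def __mul__(self, other):
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))

    # A resistance and a current, each known only to within a tolerance:
    R = Interval(99.0, 101.0)   # ohms
    I = Interval(1.98, 2.02)    # amperes
    print(R * I)                # encloses every possible voltage, ~[196.02, 204.02]
    ```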

  3. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

    Two problems must be solved if the finite element method is to become a reliable and affordable black-box engineering tool. Finite element meshes must be generated automatically from computer-aided design databases, and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  4. Automatic emotional expression analysis from eye area

    NASA Astrophysics Data System (ADS)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
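
    A hedged sketch of this pipeline, assuming a cropped grayscale eye image, PyWavelets for the discrete wavelet transform, and a scikit-learn multilayer perceptron standing in for the paper's unspecified network:

    ```python
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    def eye_features(eye_img, wavelet="db2", level=2):
        """Summary statistics of 2-D DWT detail subbands as a feature vector."""
        coeffs = pywt.wavedec2(eye_img, wavelet, level=level)
        feats = []
        for detail in coeffs[1:]:              # (cH, cV, cD) triples per level
            for c in detail:
                feats += [np.mean(np.abs(c)), np.std(c)]
        return np.array(feats)

    # X: list of cropped eye images, y: one of the 6 universal emotion labels
    # clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
    # clf.fit([eye_features(img) for img in X], y)
    ```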

  5. Multiple Regression Analysis and Automatic Interaction Detection.

    ERIC Educational Resources Information Center

    Koplyay, Janos B.

    The Automatic Interaction Detector (AID) is discussed as to its usefulness in multiple regression analysis. The algorithm of AID-4 is a reversal of the model building process; it starts with the ultimate restricted model, namely, the whole group as a unit. By a unique splitting process maximizing the between sum of squares for the categories of…

  6. Indexing and Automatic Significance Analysis

    ERIC Educational Resources Information Center

    Steinacker, Ivo

    1974-01-01

    An algorithm is proposed to solve the problem of sequential indexing which does not use any grammatical or semantic analysis, but follows the principle of emulating human judgement by evaluation of machine-recognizable attributes of structured word assemblies. (Author)

  7. Automatic photointerpretation via texture and morphology analysis

    NASA Technical Reports Server (NTRS)

    Tou, J. T.

    1982-01-01

    Computer-based techniques for automatic photointerpretation based upon information derived from texture and morphology analysis of images are discussed. By automatic photointerpretation is meant the determination of semantic descriptions of the content of the images by computer. To perform semantic analysis of morphology, a hierarchical structure of knowledge representation was developed. The simplest elements in a morphology are strokes, which are used to form alphabets. The alphabets are the elements for generating words, which are used to describe the function or property of an object or a region. The words are the elements for constructing sentences, which are used for semantic description of the content of the image. Photointerpretation based upon morphology is then augmented by textural information. Textural analysis is performed using a pixel-vector approach.

  8. Automatic Prosodic Analysis to Identify Mild Dementia.

    PubMed

    Gonzalez-Moreira, Eduardo; Torres-Boza, Diana; Kairuz, Héctor Arturo; Ferrer, Carlos; Garcia-Zamora, Marlene; Espinoza-Cuadros, Fernando; Hernandez-Gómez, Luis Alfonso

    2015-01-01

    This paper describes an exploratory technique to identify mild dementia by assessing the degree of speech deficits. A total of twenty participants were used for this experiment: ten patients with a diagnosis of mild dementia and ten healthy controls. The audio session for each subject was recorded following a methodology developed for the present study. Prosodic features in patients with mild dementia and healthy elderly controls were measured using automatic prosodic analysis on a reading task. A novel method was carried out to gather twelve prosodic features over speech samples. The best classification rate achieved was 85% accuracy using four prosodic features. The results attained show that the proposed computational speech analysis offers a viable alternative for automatic identification of dementia features in elderly adults. PMID:26558287

  9. Automatic Prosodic Analysis to Identify Mild Dementia

    PubMed Central

    Gonzalez-Moreira, Eduardo; Torres-Boza, Diana; Kairuz, Héctor Arturo; Ferrer, Carlos; Garcia-Zamora, Marlene; Espinoza-Cuadros, Fernando; Hernandez-Gómez, Luis Alfonso

    2015-01-01

    This paper describes an exploratory technique to identify mild dementia by assessing the degree of speech deficits. A total of twenty participants were used for this experiment: ten patients with a diagnosis of mild dementia and ten healthy controls. The audio session for each subject was recorded following a methodology developed for the present study. Prosodic features in patients with mild dementia and healthy elderly controls were measured using automatic prosodic analysis on a reading task. A novel method was carried out to gather twelve prosodic features over speech samples. The best classification rate achieved was 85% accuracy using four prosodic features. The results attained show that the proposed computational speech analysis offers a viable alternative for automatic identification of dementia features in elderly adults. PMID:26558287

  10. Automatic cortical thickness analysis on rodent brain

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek

    2011-03-01

    Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the different scales; even adult rodent brains are 50 to 100 times smaller than human brains. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step provides the neocortex separation from other brain structures and thus is a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate spline based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood and performed t-test between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
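
    A toy 2-D illustration of the Laplace step described above, assuming boolean arrays marking the cortex interior and its inner/outer boundaries; thickness then follows by tracing streamlines of the returned unit gradient field, the step the paper performs with a transport equation.

    ```python
    import numpy as np

    def laplace_field(mask, inner, outer, iters=5000):
        """Solve Laplace's equation (0 on inner, 1 on outer boundary) by
        Jacobi iteration; returns the potential and its unit gradient field."""
        u = np.where(outer, 1.0, 0.0)
        for _ in range(iters):
            avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                          np.roll(u, 1, 1) + np.roll(u, -1, 1))
            u = np.where(mask, avg, u)        # relax interior points only
            u[inner], u[outer] = 0.0, 1.0     # re-impose boundary conditions
        gy, gx = np.gradient(u)
        norm = np.hypot(gx, gy) + 1e-12       # avoid division by zero
        return u, gx / norm, gy / norm        # streamlines follow (gx, gy)
    ```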

  11. Automatic variance analysis of multistage care pathways.

    PubMed

    Li, Xiang; Liu, Haifeng; Zhang, Shilei; Mei, Jing; Xie, Guotong; Yu, Yiqin; Li, Jing; Lakshmanan, Geetika T

    2014-01-01

    A care pathway (CP) is a standardized process that consists of multiple care stages, clinical activities and their relations, aimed at ensuring and enhancing the quality of care. However, actual care may deviate from the planned CP, and analysis of these deviations can help clinicians refine the CP and reduce medical errors. In this paper, we propose a CP variance analysis method to automatically identify the deviations between actual patient traces in electronic medical records (EMR) and a multistage CP. As the care stage information is usually unavailable in EMR, we first align every trace with the CP using a hidden Markov model. From the aligned traces, we report three types of deviations for every care stage: additional activities, absent activities and violated constraints, which are identified by using the techniques of temporal logic and binomial tests. The method has been applied to a CP for the management of congestive heart failure and real world EMR, providing meaningful evidence for the further improvement of care quality. PMID:25160280
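
    For the "absent activities" check, a minimal sketch of how a binomial test can flag an activity that is missing more often than chance allows, assuming an expected occurrence rate estimated from conforming traces (the function and rates here are illustrative, not the paper's implementation):

    ```python
    from scipy.stats import binomtest

    def absent_activity_pvalue(n_traces_with_activity, n_traces, expected_rate):
        """Is the activity observed significantly less often than the CP expects?"""
        return binomtest(n_traces_with_activity, n_traces,
                         expected_rate, alternative="less").pvalue

    # e.g. an expected order appears in 31 of 60 stage-aligned traces,
    # while the care pathway anticipates it in ~80% of cases:
    print(absent_activity_pvalue(31, 60, 0.8))  # small p-value => report deviation
    ```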

  12. AUTOMATISM.

    PubMed

    MCCALDON, R J

    1964-10-24

    Individuals can carry out complex activity while in a state of impaired consciousness, a condition termed "automatism". Consciousness must be considered from both an organic and a psychological aspect, because impairment of consciousness may occur in both ways. Automatism may be classified as normal (hypnosis), organic (temporal lobe epilepsy), psychogenic (dissociative fugue) or feigned. Often painstaking clinical investigation is necessary to clarify the diagnosis. There is legal precedent for assuming that all crimes must embody both consciousness and will. Jurists are loath to apply this principle without reservation, as this would necessitate acquittal and release of potentially dangerous individuals. However, with the sole exception of the defence of insanity, there is at present no legislation to prohibit release without further investigation of anyone acquitted of a crime on the grounds of "automatism".

  13. Remote weapon station for automatic target recognition system demand analysis

    NASA Astrophysics Data System (ADS)

    Lei, Zhang; Li, Sheng-cai; Shi, Cai

    2015-08-01

    This paper introduces the basic composition and main advantages of a remote weapon station, and analyzes the practical significance of an image-based automatic target recognition system for such stations. It then elaborates the demand for the key technologies underpinning development in this field: photoelectric stabilization, multi-sensor image fusion, integrated control of target image enhancement, target behavior risk analysis, research on intelligent automatic target recognition algorithms based on image characteristics, and micro-sensor technology.

  14. Automatic basal slice detection for cardiac analysis

    NASA Astrophysics Data System (ADS)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step to measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with a variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods at times identify the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to the poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice was detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. From the 51 successfully tested samples, 92% and 84% of detection results were accurate at the end-systolic and the end-diastolic phases of the cardiac cycle, respectively.

  15. Automatic basal slice detection for cardiac analysis.

    PubMed

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S

    2016-07-01

    Identification of the basal slice in cardiac imaging is a key step to measuring the ejection fraction of the left ventricle. Despite all the effort placed on automatic cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, suffers from high interobserver variability. As a result, an automatic algorithm for basal slice identification is required. Guidelines published in 2013 identify the basal slice based on the percentage of myocardium surrounding the blood cavity in the short-axis view. Existing methods, however, assume that the basal slice is the first short-axis view slice below the mitral valve and are consequently at times identifying the incorrect short-axis slice. Correct identification of the basal slice under the Society for Cardiovascular Magnetic Resonance guidelines is challenging due to the poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that utilizes the two-chamber view to determine the basal slice while following the guidelines. To this end, an active shape model is trained to segment the two-chamber view and create temporal binary profiles from which the basal slice is identified. From the 51 tested cases, our method obtains 92% and 84% accurate basal slice detection for the end-systole and the end-diastole, respectively. PMID:27660805
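
    A sketch of the guideline criterion itself, under the assumption that a binary myocardium mask and the blood-pool centroid (row, col) are available per short-axis slice: a slice qualifies as basal if myocardium covers more than 50% of the angular directions around the cavity.

    ```python
    import numpy as np

    def myocardium_coverage(myo_mask, centroid, n_bins=72):
        """Fraction of angular sectors around the cavity that contain myocardium."""
        ys, xs = np.nonzero(myo_mask)
        angles = np.arctan2(ys - centroid[0], xs - centroid[1])
        bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
        return len(np.unique(bins)) / n_bins

    def find_basal_slice(myo_masks, centroids):
        """Uppermost short-axis slice with >50% myocardial coverage."""
        for idx, (mask, c) in enumerate(zip(myo_masks, centroids)):
            if myocardium_coverage(mask, c) > 0.5:
                return idx
        return None
    ```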

  16. Automatism

    PubMed Central

    McCaldon, R. J.

    1964-01-01

    Individuals can carry out complex activity while in a state of impaired consciousness, a condition termed “automatism”. Consciousness must be considered from both an organic and a psychological aspect, because impairment of consciousness may occur in both ways. Automatism may be classified as normal (hypnosis), organic (temporal lobe epilepsy), psychogenic (dissociative fugue) or feigned. Often painstaking clinical investigation is necessary to clarify the diagnosis. There is legal precedent for assuming that all crimes must embody both consciousness and will. Jurists are loath to apply this principle without reservation, as this would necessitate acquittal and release of potentially dangerous individuals. However, with the sole exception of the defence of insanity, there is at present no legislation to prohibit release without further investigation of anyone acquitted of a crime on the grounds of “automatism”. PMID:14199824

  17. Automatic ionospheric layers detection: Algorithms analysis

    NASA Astrophysics Data System (ADS)

    Molina, María G.; Zuccheretti, Enrico; Cabrera, Miguel A.; Bianchi, Cesidio; Sciacca, Umberto; Baskaradas, James

    2016-03-01

    Vertical sounding is a widely used technique to obtain ionosphere measurements, such as an estimation of virtual height versus frequency scanning. It is performed by a high-frequency radar for geophysical applications called an "ionospheric sounder" (or "ionosonde"). Radar detection depends mainly on target characteristics. While the behavior of several targets and the corresponding echo detection algorithms have been studied, a survey to identify a suitable algorithm for the ionospheric sounder still has to be carried out. This paper focuses on automatic echo detection algorithms implemented specifically for an ionospheric sounder; the target-specific characteristics were studied as well. Adaptive threshold detection algorithms are proposed, compared to the currently implemented algorithm, and tested using actual data obtained from the Advanced Ionospheric Sounder (AIS-INGV) at the Rome Ionospheric Observatory. Different case studies were selected according to typical ionospheric and detection conditions.
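
    One widely used family of adaptive-threshold detectors is cell-averaging CFAR, sketched below on a 1-D power profile of one sounding; whether the paper's proposed algorithms take exactly this form is not stated in the abstract, and the parameter values are illustrative.

    ```python
    import numpy as np

    def cfar_detect(power, n_ref=16, n_guard=4, scale=3.0):
        """Flag range cells whose power exceeds a locally estimated noise level."""
        hits = np.zeros(len(power), dtype=bool)
        for i in range(n_ref + n_guard, len(power) - n_ref - n_guard):
            left = power[i - n_guard - n_ref : i - n_guard]
            right = power[i + n_guard + 1 : i + n_guard + 1 + n_ref]
            noise = np.mean(np.concatenate([left, right]))   # local noise estimate
            hits[i] = power[i] > scale * noise               # adaptive threshold
        return hits
    ```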

  18. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be attractive as a routine in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  19. Automatic analysis of a skull fracture based on image content

    NASA Astrophysics Data System (ADS)

    Shao, Hong; Zhao, Hong

    2003-09-01

    Automatic analysis based on image content is an active area of medical image diagnosis research with a bright future. Analysis of skull fractures can help doctors diagnose. In this paper, a new approach is proposed to automatically detect skull fractures based on CT image content. First, a region-growing method, whose seeds and growing rules are chosen dynamically by k-means clustering, is applied for automatic image segmentation. The segmented region boundary is found by boundary tracing. Then the shape of the boundary is analyzed, and the circularity measure is taken as the description parameter. Finally, the rules for automatic computer diagnosis of skull fractures are derived using an entropy function. This method is used to analyze the images from the slice below the third ventricle to the top slice of the cerebral cortex. Experimental results show that the recognition rate is 100% for the 100 images, which were chosen randomly from a medical image database and were not included in the training examples. This method integrates color and shape features and is not affected by image size or position. This research achieves a high recognition rate and lays a basis for automatic analysis of brain images.
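
    The circularity measure used as the shape descriptor is standard: 4πA/P², equal to 1 for a perfect circle and falling toward 0 for elongated or jagged boundaries. A sketch with OpenCV; the paper's own implementation details are not given.

    ```python
    import cv2
    import numpy as np

    def circularity(binary_region):
        """4*pi*Area / Perimeter^2 of the largest contour in a binary image."""
        contours, _ = cv2.findContours(binary_region.astype(np.uint8),
                                       cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        c = max(contours, key=cv2.contourArea)
        area = cv2.contourArea(c)
        perimeter = cv2.arcLength(c, closed=True)
        return 4 * np.pi * area / (perimeter ** 2 + 1e-12)
    ```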

  20. Environment for the automatic manipulation and analysis of morphological expressions

    NASA Astrophysics Data System (ADS)

    Richardson, Craig H.; Schafer, Ronald W.

    1990-11-01

    This paper describes a LISP-based environment for the automatic manipulation and analysis of morphological expressions. The foundation of this environment is an aggregation of morphological knowledge that includes signal and system property information, rule bases for representing morphological relationships, and inferencing mechanisms for using this collection of knowledge. The layers surrounding this foundation include representations of abstract signal and structuring element classes, as well as actual structuring elements, implementations of the morphological operators, and the ability to optimally decompose structuring elements (structels). The representational requirements for automatically manipulating expressions and determining their computational cost are described, and the capabilities of the environment are illustrated by examples of symbolic manipulations and expression analysis.

  1. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that is comprised of scripts written in Perl, C shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.

  2. Profiling School Shooters: Automatic Text-Based Analysis

    PubMed Central

    Neuman, Yair; Assaf, Dan; Cohen, Yochai; Knoll, James L.

    2015-01-01

    School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their various characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by 6 school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters’ texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology. PMID:26089804

  3. Profiling School Shooters: Automatic Text-Based Analysis.

    PubMed

    Neuman, Yair; Assaf, Dan; Cohen, Yochai; Knoll, James L

    2015-01-01

    School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their various characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by 6 school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology. PMID:26089804

  4. Trends of Science Education Research: An Automatic Content Analysis

    ERIC Educational Resources Information Center

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  5. A new approach to automatic radiation spectrum analysis

    SciTech Connect

    Olmos, P.; Diaz, J.C.; Perez, J.M.; Aguayo, P.; Bru, A.; Garcia-Belmonte, G.; de Pablos, J.L. ); Gomez, P.; Rodellar, V. )

    1991-08-01

    In this paper the application of adaptive methods to the solution of the automatic radioisotope identification problem using the energy spectrum is described. The identification is carried out by means of neural networks, which allow the use of relatively reduced computational structures while keeping high pattern recognition capability. In this context, it has been found that one of these simple structures, once adequately trained, is quite suitable for identifying a given isotope present in a mixture of elements, as well as the relative proportions of each identified substance. Preliminary results are good enough to consider these adaptive structures powerful and simple tools for automatic spectrum analysis.

  6. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    PubMed

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
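
    A hedged sketch of the core frequency estimate: average the pixel intensity of the heart region per frame and read the heart rate off the dominant spectral peak. The paper's beat-to-beat and arrhythmicity analysis is more involved, and the heart-region selection is assumed already done.

    ```python
    import numpy as np

    def heart_rate_bpm(frames, fps):
        """frames: (T, H, W) array of the heart region; returns beats per minute."""
        signal = frames.reshape(len(frames), -1).mean(axis=1)
        signal = signal - signal.mean()              # remove the DC component
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
        band = (freqs > 0.5) & (freqs < 5.0)         # plausible 30-300 bpm range
        return 60.0 * freqs[band][np.argmax(spectrum[band])]
    ```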

  7. Effectiveness of an automatic tracking software in underwater motion analysis.

    PubMed

    Magalhaes, Fabrício A; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia

    2013-01-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker's coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software can be used as a valid and useful tool for underwater motion analysis.

    Key points: The availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports. An important feature of automatic tracking software is to require limited human interventions and
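
    The Kanade-Lucas-Tomasi tracking at the heart of the described software is available in OpenCV; a minimal sketch, including the 4-pixel drift rule used in the validation (the reference coordinates stand in for the operator's judgement):

    ```python
    import cv2
    import numpy as np

    LK_PARAMS = dict(winSize=(21, 21), maxLevel=3,
                     criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                               30, 0.01))

    def track_markers(prev_gray, next_gray, points, reference=None, tol=4.0):
        """Track marker points between frames; flag points drifting > tol pixels."""
        pts = points.reshape(-1, 1, 2).astype(np.float32)
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, next_gray, pts, None, **LK_PARAMS)
        new_pts = new_pts.reshape(-1, 2)
        needs_fix = None
        if reference is not None:   # distance to manually tracked ground truth
            needs_fix = np.linalg.norm(new_pts - reference, axis=1) > tol
        return new_pts, status.ravel() == 1, needs_fix
    ```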

  8. Development of an automatic identification algorithm for antibiogram analysis.

    PubMed

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present some difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using susceptibility tests, which were performed for 12 different antibiotics, for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA and results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based reading. Agreements were observed in 88% of cases, and 89% of the tests showed no difference or a <4 mm difference between AIA and human analysis, exhibiting a correlation index of 0.85 for all images, 0.90 for standards and 0.80 for oddities, with no significant difference between the automatic and manual methods. AIA resolved some reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for automated reading of antibiograms in diagnostic and microbiology laboratories. PMID:26513468

  9. Cernuc: A program for automatic high-resolution radioelemental analysis

    NASA Astrophysics Data System (ADS)

    Roca, V.; Terrasi, F.; Moro, R.; Sorrentino, G.

    1981-04-01

    A computer program capable of qualitative and quantitative radioelemental analysis with high accuracy, a high degree of automation and great ease of use is presented. It was produced for Ge(Li) gamma-ray spectroscopy and can be used for X-ray spectroscopy as well. The program provides automatic searching and fitting of peaks, energy and intensity determination, and identification and calculation of activities of the radioisotopes present in the sample. The last step is carried out by using a radionuclide library. The problem of a gamma line being assigned to more than one nuclide is solved by searching for the least-squares solution of a set of equations for the activities of the isotopes. Two versions of this program have been written, to be run batchwise on a medium-sized computer (UNIVAC 1106) and interactively on a small computer (HP 2100A).

  10. Facilitator control as automatic behavior: A verbal behavior analysis

    PubMed Central

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is automatic when the speaker or writer is not stimulated by the behavior at the time of emission, the behavior is not edited, the products of behavior differ from what the person would produce normally, and the behavior is attributed to an outside source. All of these characteristics appear to be present in facilitator behavior. Other variables seem to account for the thematic content of the typed messages. These variables also are discussed. PMID:22477083

  11. An integrated spatial signature analysis and automatic defect classification system

    SciTech Connect

    Gleason, S.S.; Tobin, K.W.; Karnowski, T.P.

    1997-08-01

    An integrated Spatial Signature Analysis (SSA) and automatic defect classification (ADC) system for improved automatic semiconductor wafer manufacturing characterization is presented. Both the SSA and ADC methodologies are reviewed, and the benefits of an integrated system are described, namely, focused ADC and signature-level sampling. Focused ADC involves the use of SSA information on a defect signature to reduce the number of possible classes that an ADC system must consider, thus improving the ADC system performance. Signature-level sampling improves the ADC system throughput and accuracy by intelligently sampling defects within a given spatial signature for subsequent off-line, high-resolution ADC. A complete example of wafer-map characterization via an integrated SSA/ADC system is presented, in which a wafer with 3274 defects is completely characterized by revisiting only 25 defects on an off-line ADC review station. 13 refs., 7 figs.

  12. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    PubMed Central

    2012-01-01

    Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automation of the analysis workflow for DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies regarding

  13. Feature++: Automatic Feature Construction for Clinical Data Analysis.

    PubMed

    Sun, Wen; Hao, Bibo; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong

    2016-01-01

    With the rapid growth of clinical data and knowledge, feature construction for clinical analysis becomes increasingly important and challenging. Given a clinical dataset with up to hundreds or thousands of columns, the traditional manual feature construction process is usually too labour intensive to generate a full spectrum of features with potential values. As a result, advanced large-scale data analysis technologies, such as feature selection for predictive modelling, cannot be fully utilized for clinical data analysis. In this paper, we propose an automatic feature construction framework for clinical data analysis, namely, Feature++. It leverages available public knowledge to understand the semantics of the clinical data, and is able to integrate external data sources to automatically construct new features based on predefined rules and clinical knowledge. We demonstrate the effectiveness of Feature++ in a typical predictive modelling use case with a public clinical dataset, and the results suggest that the proposed approach is able to fulfil typical feature construction tasks with minimal dataset specific configurations, so that more accurate models can be obtained from various clinical datasets in a more efficient way. PMID:27577443

  14. Rapid automatic keyword extraction for information retrieval and analysis

    DOEpatents

    Rose, Stuart J; Cowley, Wendy E; Crow, Vernon L; Cramer, Nicholas O

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
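
    The scoring step reads almost directly as code; a minimal sketch assuming the candidate keywords have already been split out by delimiters and stop words, using the degree/frequency ratio as the word score:

    ```python
    from collections import defaultdict

    def rake_scores(candidate_keywords):
        """Score candidates by summing member-word degree/frequency ratios."""
        freq = defaultdict(int)     # occurrences of each word across candidates
        degree = defaultdict(int)   # co-occurrence degree within candidates
        for phrase in candidate_keywords:
            words = phrase.split()
            for w in words:
                freq[w] += 1
                degree[w] += len(words)   # w plus the words it co-occurs with
        word_score = {w: degree[w] / freq[w] for w in freq}
        return {p: sum(word_score[w] for w in p.split())
                for p in candidate_keywords}

    print(rake_scores(["automatic keyword extraction",
                       "information retrieval", "analysis"]))
    ```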

  15. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  16. Corpus analysis and automatic detection of emotion-inducing keywords

    NASA Astrophysics Data System (ADS)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

    Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), the words in use that convey emotion. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in precision, recall and F1-score.

  17. Automatic analysis of attack data from distributed honeypot network

    NASA Astrophysics Data System (ADS)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, together with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  18. Processing, analysis, recognition, and automatic understanding of medical images

    NASA Astrophysics Data System (ADS)

    Tadeusiewicz, Ryszard; Ogiela, Marek R.

    2004-07-01

    This paper presents some new ideas introducing automatic understanding of the semantic content of medical images. The idea under consideration can be seen as the next step along a path that starts from capturing images in digital form as two-dimensional data structures, proceeds through image processing as a tool for enhancing image visibility and readability, applies image analysis algorithms for extracting selected features of the images (or parts of images, e.g. objects), and ends with algorithms devoted to image classification and recognition. In the paper we try to explain why all the procedures mentioned above cannot give us full satisfaction in many important medical problems, where we need to understand the semantic sense of an image, not only describe the image in terms of selected features and/or classes. The general idea of automatic image understanding is presented, as well as some remarks about successful applications of such ideas for increasing the potential capabilities and performance of computer vision systems dedicated to advanced medical image analysis. This is achieved by means of a linguistic description of the merit content of the picture. We then use new AI methods to undertake tasks of automatic understanding of image semantics in intelligent medical information systems. Successfully obtaining the crucial semantic content of a medical image may contribute considerably to the creation of new intelligent multimedia cognitive medical systems. Thanks to the new idea of cognitive resonance between the stream of data extracted from the image using linguistic methods and the expectations taken from the representation of medical knowledge, it is possible to understand the merit content of an image even if the form of the image is very different from any known pattern.

  19. Entropy analysis of OCT signal for automatic tissue characterization

    NASA Astrophysics Data System (ADS)

    Wang, Yahui; Qiu, Yi; Zaki, Farzana; Xu, Yiqing; Hubbi, Basil; Belfield, Kevin D.; Liu, Xuan

    2016-03-01

    Optical coherence tomography (OCT) signals can provide microscopic characterization of biological tissue and assist clinical decision making in real time. However, raw OCT data are noisy and complicated. It is challenging to extract information that is directly related to the pathological status of tissue through visual inspection of the huge volume of OCT signal streaming from the high-speed OCT engine. Therefore, it is critical to discover concise, comprehensible information from massive OCT data through novel strategies for signal analysis. In this study, we perform Shannon entropy analysis on the OCT signal for automatic tissue characterization, which can be applied in intraoperative tumor margin delineation for surgical excision of cancer. The principle of this technique is based on the fact that normal tissue is usually more structured, with a higher entropy value, compared to pathological tissue such as cancer tissue. In this study, we develop high-speed software based on graphics processing units (GPUs) for real-time entropy analysis of the OCT signal.
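
    A sketch of the per-A-scan entropy computation, with the intensity profile normalized into a probability distribution; the thresholds separating normal from pathological tissue would come from the study's own data and are not reproduced here.

    ```python
    import numpy as np

    def shannon_entropy(a_scan):
        """Shannon entropy (bits) of one OCT A-scan intensity profile."""
        p = np.abs(a_scan).astype(float)
        p = p / (p.sum() + 1e-12)          # normalize to a distribution
        p = p[p > 0]                       # log is undefined at zero
        return -np.sum(p * np.log2(p))

    # One entropy value per lateral position of a B-scan (columns are A-scans):
    # entropy_profile = np.array([shannon_entropy(col) for col in bscan.T])
    ```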

  20. Spectral saliency via automatic adaptive amplitude spectrum analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
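
    A single-scale version of the underlying operation is easy to state; the paper's contribution, automatic scale selection and adaptive weighted combination of the resulting maps, is omitted from this sketch and the scale here is arbitrary.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def spectral_saliency(img, scale=3.0, post_blur=2.5):
        """Suppress repetitive (non-salient) patterns by smoothing the
        amplitude spectrum, keeping the phase, and reconstructing."""
        f = np.fft.fft2(img.astype(float))
        amplitude, phase = np.abs(f), np.angle(f)
        smoothed = gaussian_filter(amplitude, scale)   # scale to be selected
        sal = np.abs(np.fft.ifft2(smoothed * np.exp(1j * phase))) ** 2
        return gaussian_filter(sal, post_blur)         # smooth the saliency map
    ```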

  1. Automatic Visual Tracking and Social Behaviour Analysis with Multiple Mice

    PubMed Central

    Giancardo, Luca; Sona, Diego; Huang, Huiping; Sannino, Sara; Managò, Francesca; Scheggia, Diego; Papaleo, Francesco; Murino, Vittorio

    2013-01-01

    Social interactions are made of complex behavioural actions that might be found in all mammalians, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts for each frame and for each mouse in the cage one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to us, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57BL/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different experimental settings and

  2. [Automatic analysis pipeline of next-generation sequencing data].

    PubMed

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although there is a lot of software for analyzing next-generation sequencing data, most of it is designed for one specific function (e.g., alignment, variant calling or annotation). Therefore, it is necessary to combine these tools for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data based on the Perl programming language and the SGE system. The pipeline takes original sequence data (FASTQ format) as input, calls standard data processing software (e.g., BWA, Samtools, GATK, and ANNOVAR), and finally outputs a list of annotated variants that researchers can further analyze. The pipeline simplifies manual operation and improves efficiency through automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking the graphical interface. Our work will facilitate research projects using sequencing technology.

  3. Automatic analysis for neuron by confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Satou, Kouhei; Aoki, Yoshimitsu; Mataga, Nobuko; Hensch, Takao K.; Taki, Katuhiko

    2005-12-01

    The aim of this study is to develop a system that recognizes both the macro- and microscopic configurations of nerve cells and automatically performs the necessary 3-D measurements and functional classification of spines. The acquisition of 3-D images of cranial nerves has been enabled by the use of a confocal laser scanning microscope, although the highly accurate 3-D measurements of the microscopic structures of cranial nerves and their classification based on their configurations have not yet been accomplished. In this study, in order to obtain highly accurate measurements of the microscopic structures of cranial nerves, existing positions of spines were predicted by the 2-D image processing of tomographic images. Next, based on the positions that were predicted on the 2-D images, the positions and configurations of the spines were determined more accurately by 3-D image processing of the volume data. We report the successful construction of an automatic analysis system that uses a coarse-to-fine technique to analyze the microscopic structures of cranial nerves with high speed and accuracy by combining 2-D and 3-D image analyses.

  4. Automatic Analysis of Cellularity in Glioblastoma and Correlation with ADC Using Trajectory Analysis and Automatic Nuclei Counting

    PubMed Central

    Burth, Sina; Kieslich, Pascal J.; Jungk, Christine; Sahm, Felix; Kickingereder, Philipp; Kiening, Karl; Unterberg, Andreas; Wick, Wolfgang; Schlemmer, Heinz-Peter; Bendszus, Martin; Radbruch, Alexander

    2016-01-01

    Objective Several studies have analyzed a correlation between the apparent diffusion coefficient (ADC) derived from diffusion-weighted MRI and the tumor cellularity of corresponding histopathological specimens in brain tumors with inconclusive findings. Here, we compared a large dataset of ADC and cellularity values of stereotactic biopsies of glioblastoma patients using a new postprocessing approach including trajectory analysis and automatic nuclei counting. Materials and Methods Thirty-seven patients with newly diagnosed glioblastomas were enrolled in this study. ADC maps were acquired preoperatively at 3T and coregistered to the intraoperative MRI that contained the coordinates of the biopsy trajectory. 561 biopsy specimens were obtained; corresponding cellularity was calculated by semi-automatic nuclei counting and correlated to the respective preoperative ADC values along the stereotactic biopsy trajectory which included areas of T1-contrast-enhancement and necrosis. Results There was a weak to moderate inverse correlation between ADC and cellularity in glioblastomas that varied depending on the approach towards statistical analysis: for mean values per patient, Spearman’s ρ = -0.48 (p = 0.002), for all trajectory values in one joint analysis Spearman’s ρ = -0.32 (p < 0.001). The inverse correlation was additionally verified by a linear mixed model. Conclusions Our data confirms a previously reported inverse correlation between ADC and tumor cellularity. However, the correlation in the current article is weaker than the pooled correlation of comparable previous studies. Hence, besides cell density, other factors, such as necrosis and edema might influence ADC values in glioblastomas. PMID:27467557
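
    The per-patient arm of the reported statistics is straightforward to outline with SciPy; the values below are hypothetical stand-ins for the per-patient means, not data from the study.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical per-patient mean values along the biopsy trajectories:
    adc = np.array([1.10, 0.95, 1.30, 0.88, 1.05])          # x10^-3 mm^2/s
    cellularity = np.array([2100, 3500, 1400, 3900, 2600])  # nuclei per specimen

    rho, p = spearmanr(adc, cellularity)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")   # expect a negative rho
    ```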

  5. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    SciTech Connect

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and based on the information derived performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management, we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  6. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    PubMed

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure as well as flow cytometric approaches have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as cytotoxicity parameter, as well as for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility of analysing 24 slides within 65 h by automatic analysis over the weekend and the high reproducibility of the results make automatic image processing a powerful tool for the micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  7. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is, however, hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
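
    A minimal sketch of parameter extraction from a single time-intensity profile, assuming the first-pass interval has already been detected (the parameter names and the baseline-of-first-three-samples convention are illustrative, not the paper's exact definitions):

      import numpy as np

      def perfusion_parameters(curve, times):
          # curve: signal intensity at one myocardial position over time
          baseline = curve[:3].mean()              # pre-contrast level
          peak_idx = int(np.argmax(curve))
          peak_enhancement = curve[peak_idx] - baseline
          time_to_peak = times[peak_idx] - times[0]
          upslope = (np.max(np.gradient(curve[:peak_idx + 1], times[:peak_idx + 1]))
                     if peak_idx > 0 else 0.0)     # maximum wash-in slope
          return {"peak": peak_enhancement,
                  "time_to_peak": time_to_peak,
                  "max_upslope": upslope}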

  8. The interprocedural analysis and automatic parallelization of scheme programs

    SciTech Connect

    Harrison, W.L. III.

    1989-01-01

    Lisp and its descendants are among the most important and widely used of programming languages. At the same time, parallelism in the architecture of computer systems is becoming commonplace. There is a pressing need to extend the technology of automatic parallelization that has become available to Fortran programmers of parallel machines to the realm of Lisp programs and symbolic computing. In this thesis the author presents a comprehensive approach to the compilation of Scheme programs for shared-memory multiprocessors. The strategy has two principal components: interprocedural analysis and program restructuring. He introduces procedure strings and stack configurations as a framework in which to reason about interprocedural side-effects and object lifetimes, and develops a system of interprocedural analysis, using abstract interpretation, that is used in the dependence analysis and memory management of Scheme programs. He introduces the transformations of exit-loop translation and recursion splitting to treat the control structures of iteration and recursion that arise commonly in Scheme programs. He proposes an alternative representation for s-expressions that facilitates the parallel creation and access of lists. He has implemented these ideas in a parallelizing Scheme compiler and run-time system, and he complements the theory of the work with snapshots of programs during the restructuring process, and some preliminary performance results of the execution of object codes produced by the compiler.

  9. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
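
    The radial intensity plot at the heart of Ganalyzer can be sketched as follows (a simplified reimplementation, assuming the galaxy center and radius are already known; Ganalyzer's own peak detection and slope computation are not reproduced here):

      import numpy as np

      def radial_intensity_plot(image, cx, cy, radius, n_angles=360):
          # One row per radius: pixel intensity sampled along a circle around
          # the galaxy center. Peaks in successive rows shift with the spiral
          # arms, so the slope of the peak positions measures the spirality.
          thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
          plot = np.empty((radius, n_angles))
          for r in range(1, radius + 1):
              xs = np.clip((cx + r * np.cos(thetas)).astype(int), 0, image.shape[1] - 1)
              ys = np.clip((cy + r * np.sin(thetas)).astype(int), 0, image.shape[0] - 1)
              plot[r - 1] = image[ys, xs]
          return plot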

  11. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  14. Trends of Science Education Research: An Automatic Content Analysis

    NASA Astrophysics Data System (ADS)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research based on articles published in four journals: International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education, from 1990 to 2007. A multi-stage clustering technique was employed to investigate which topics, which development trends, and whose contributions constructed the journal publications as a science education research field. This study found that the research topic of Conceptual Change & Concept Mapping was the most studied topic, although the number of publications slightly declined in the 2000s. The studies in the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  15. Automatic analysis of ciliary beat frequency using optical flow

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g., primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection method of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia, and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes re-found) by the method; an easy way to identify the correct sub-path of a point's path has yet to be found for cases where the slope method does not work.
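
    The tracking-plus-spectrum pipeline described above can be sketched with OpenCV and NumPy (a simplified version: grayscale frames are assumed, lost tracks are not pruned, and the Fourier-summand selection by slope is replaced by a plain spectral peak):

      import cv2
      import numpy as np

      def ciliary_beat_frequency(frames, fps):
          # Shi-Tomasi corners give high-contrast points to follow.
          pts = cv2.goodFeaturesToTrack(frames[0], maxCorners=200,
                                        qualityLevel=0.05, minDistance=3)
          tracks = [pts.reshape(-1, 2)]
          for prev, nxt in zip(frames[:-1], frames[1:]):
              # Pyramidal Lucas-Kanade optical flow from frame to frame.
              pts, _status, _err = cv2.calcOpticalFlowPyrLK(prev, nxt, pts, None)
              tracks.append(pts.reshape(-1, 2))
          traj = np.stack(tracks)                         # (frames, points, 2)
          signal = traj[:, :, 1] - traj[:, :, 1].mean(axis=0)
          spectrum = np.abs(np.fft.rfft(signal, axis=0)).mean(axis=1)
          freqs = np.fft.rfftfreq(len(frames), d=1.0 / fps)
          return freqs[1 + int(np.argmax(spectrum[1:]))]  # skip DC; result in Hz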

  16. Variable frame rate analysis for automatic speech recognition

    NASA Astrophysics Data System (ADS)

    Tan, Zheng-Hua

    2007-09-01

    In this paper we investigate the use of variable frame rate (VFR) analysis in automatic speech recognition (ASR). First, we review the VFR technique and analyze its behavior. It is experimentally shown that VFR improves ASR performance for signals with low signal-to-noise ratios since it generates improved acoustic models and substantially reduces insertion and substitution errors although it may increase deletion errors. It is also underlined that the match between the average frame rate and the number of hidden Markov model states is critical in implementing VFR. Second, we analyze an effective VFR method that uses a cumulative, weighted cepstral-distance criterion for frame selection and present a revision of it. Finally, the revised VFR method is combined with spectral- and cepstral-domain enhancement methods including the minimum statistics noise estimation (MSNE) based spectral subtraction and the cepstral mean subtraction, variance normalization and ARMA filtering (MVA) process. Experiments on the Aurora 2 database show that VFR is highly complementary to the enhancement methods. Enhancement of speech both facilitates frame selection in VFR and provides de-noised speech for recognition.
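
    A minimal sketch of the cumulative-distance frame selection underlying VFR (the actual method uses a weighted cepstral distance; plain Euclidean distance is used here for illustration):

      import numpy as np

      def select_frames(cepstra, threshold):
          # cepstra: (n_frames, n_coeffs) array of cepstral vectors.
          # A frame is kept whenever the accumulated spectral change since the
          # last kept frame crosses the threshold, so stationary speech yields
          # few frames and transient speech yields many.
          kept, accum = [0], 0.0
          for t in range(1, len(cepstra)):
              accum += float(np.linalg.norm(cepstra[t] - cepstra[t - 1]))
              if accum >= threshold:
                  kept.append(t)
                  accum = 0.0
          return kept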

  17. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of comic page images by detecting the storyboards and identifying the reading order automatically. It is a key technique for producing digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is evaluated on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  18. Automatic analysis of silver-stained comets by CellProfiler software.

    PubMed

    González, J E; Romero, I; Barquinero, J F; García, O

    2012-10-01

    The comet assay is one of the most widely used methods to evaluate DNA damage and repair in eukaryotic cells. The comets can be measured by software, in a semi-automatic or automatic process. In this paper, we apply the CellProfiler open-source software for automatic analysis of comets from digitized images, reporting the percentage of tail DNA. A side-by-side comparison of CellProfiler with CASP software demonstrated good agreement between the two packages. Our work demonstrates that automatic measurement of silver-stained comets with open-source software is possible, providing significant time savings. PMID:22771502
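
    The reported measurement, percentage of tail DNA, reduces to an intensity ratio once the comet and its head have been segmented; a minimal sketch (the mask conventions are assumptions, and CellProfiler's own segmentation is not reproduced):

      import numpy as np

      def percent_tail_dna(intensity, comet_mask, head_mask):
          # % tail DNA = tail intensity / whole-comet intensity * 100,
          # where the tail is the comet region outside the head.
          total = float(intensity[comet_mask].sum())
          head = float(intensity[comet_mask & head_mask].sum())
          return 100.0 * (total - head) / total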

  19. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…

  20. Automatic analysis of the in vitro micronucleus test on V79 cells.

    PubMed

    Frieauff, W; Pötter-Locher, F; Cordier, A; Suter, W

    1998-02-23

    The in vitro micronucleus test is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and specific parameter. As a measure for numerical and structural chromosome aberrations, the in vitro micronucleus test consists of determining the frequency of micronucleated cells in a representative fraction of cells in a culture. So far, manual counting has been the only method for evaluating microscopic V79 Chinese hamster cell preparations. To replace this tedious and time-consuming procedure, a fully automatic system for micronucleus scoring in V79 cells by image analysis has been developed and introduced into the routine genotoxicity screening of drug candidates. The comparison of manual and automatic micronucleus analysis showed a high degree of concordance between the results obtained by the two techniques. For concentration series of cyclophosphamide (CP) and ethyl-methanesulphonate (EMS) as test compounds, the frequency of erroneously missed micronuclei through automatic scoring proved to be below 15% in comparison with manual scoring. Generally, false-positive micronucleus decisions could be controlled easily by fast and simple relocation of the automatically detected patterns. The possibility of analyzing 24 slides within 1 day by fully automatic overnight analysis and the high reproducibility of the results make automatic image processing a powerful tool for the in vitro micronucleus analysis.

  1. Automatic Match between Delimitation Line and Real Terrain Based on Least-Cost Path Analysis

    NASA Astrophysics Data System (ADS)

    Feng, C. Q.; Jiang, N.; Zhang, X. N.; Ma, J.

    2013-11-01

    Nowadays, during international negotiations on separating disputed areas, only manual adjustment is applied to the match between the delimitation line and the real terrain, which not only consumes much time and labor, but also cannot ensure high precision. Accordingly, this paper explores automatic matching between the two and studies a general solution based on Least-Cost Path Analysis. First, under the guidelines of delimitation laws, the cost layer is acquired through special processing of the delimitation line and terrain feature lines. Second, a new delimitation line is constructed with the help of Least-Cost Path Analysis. Third, the whole automatic matching model is built via Module Builder so that it can be shared and reused. Finally, the result of the automatic match is analyzed from several different aspects, including delimitation laws, two-sided benefits and so on. Consequently, it is concluded that the automatic matching method is feasible and effective.
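
    The core of Least-Cost Path Analysis is a shortest-path search over a cost raster; a minimal 4-connected Dijkstra sketch (the cost layer encoding the delimitation-law and terrain constraints is assumed to be given):

      import heapq
      import numpy as np

      def least_cost_path(cost, start, goal):
          # cost: 2D array of per-cell traversal costs; start/goal: (row, col).
          dist = np.full(cost.shape, np.inf)
          prev = {}
          dist[start] = cost[start]
          heap = [(dist[start], start)]
          while heap:
              d, (r, c) = heapq.heappop(heap)
              if (r, c) == goal:
                  break
              if d > dist[r, c]:
                  continue            # stale queue entry
              for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                  if 0 <= nr < cost.shape[0] and 0 <= nc < cost.shape[1]:
                      nd = d + cost[nr, nc]
                      if nd < dist[nr, nc]:
                          dist[nr, nc] = nd
                          prev[(nr, nc)] = (r, c)
                          heapq.heappush(heap, (nd, (nr, nc)))
          path, node = [goal], goal   # walk predecessors back to the start
          while node != start:
              node = prev[node]
              path.append(node)
          return path[::-1]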

  2. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    NASA Astrophysics Data System (ADS)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosives. Innovative algorithms for calculating the likely background, a neural-network-based system for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data processing.

  3. Automatic Crowd Analysis from Very High Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Reinartz, P.

    2011-04-01

    Recently, automatic detection of people crowds from images has become a very important research field, since it can provide crucial information, especially for police departments and crisis management teams. Due to the importance of the topic, many researchers have tried to solve this problem using street cameras. However, these cameras cannot be used to monitor very large outdoor public events. In order to bring a solution to the problem, herein we propose a novel approach to detect crowds automatically from remotely sensed images, and especially from very high resolution satellite images. To do so, we use a local-feature-based probabilistic framework. We extract local features from the color components of the input image. In order to eliminate redundant local features coming from other objects in the given scene, we apply a feature selection method. For feature selection purposes, we benefit from three different types of information: the digital elevation model (DEM) of the region, which is automatically generated using stereo satellite images; possible street segments, obtained by segmentation; and shadow information. After eliminating redundant local features, the remaining features are used to detect individual persons. Those local feature coordinates are also taken as observations of the probability density function (pdf) of the crowds to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf, which gives us information about dense crowd and people locations. We test our algorithm using WorldView-2 satellite images over the cities of Cairo and Munich. We also provide test results on airborne images for comparison of the detection accuracy. Our experimental results indicate the possible usage of the proposed approach in real-life mass events.
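
    The final density-estimation step can be sketched as follows (a fixed-bandwidth stand-in for the adaptive kernel density estimator used in the paper; the detected person coordinates are assumed to be given):

      import numpy as np
      from scipy.stats import gaussian_kde

      def crowd_density(points, grid_shape):
          # points: (n, 2) array of (x, y) person detections. The KDE value at
          # each pixel approximates the crowd pdf; its peaks mark dense crowds.
          kde = gaussian_kde(points.T)
          ys, xs = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
          grid = np.vstack([xs.ravel(), ys.ravel()]).astype(float)
          return kde(grid).reshape(grid_shape)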

  4. Automatic emotion recognition based on body movement analysis: a survey.

    PubMed

    Zacharatos, Haris; Gatzoulis, Christos; Chrysanthou, Yiorgos L

    2014-01-01

    Humans are emotional beings, and their feelings influence how they perform and interact with computers. One of the most expressive modalities for humans is body posture and movement, which researchers have recently started exploiting for emotion recognition. This survey describes emerging techniques and modalities related to emotion recognition based on body movement, as well as recent advances in automatic emotion recognition. It also describes application areas and notation systems and explains the importance of movement segmentation. It then discusses unsolved problems and provides promising directions for future research. The Web extra (a PDF file) contains tables with additional information related to the article. PMID:25216477

  5. California State Library: Processing Center Design and Specifications, Vol. V: Cost Analysis. Supplemental Volume

    ERIC Educational Resources Information Center

    Hargrove, Thomas L; Stirling, Keith H.

    Presenting this cost analysis as a supplemental volume, separate from the main report, allows the chief activities in implementing the Processing Center Design to be correlated with costs as of a particular date and according to varying rates of production. In considering the total budget, three main areas are distinguished: (1) Systems…

  6. System for the Analysis of Global Energy Markets - Vol. II, Model Documentation

    EIA Publications

    2003-01-01

    The second volume provides a data implementation guide that lists all naming conventions and model constraints. In addition, Volume 1 has two appendixes that provide a schematic of the System for the Analysis of Global Energy Markets (SAGE) structure and a listing of the source code, respectively.

  7. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  8. Development of a System for Automatic Facial Expression Analysis

    NASA Astrophysics Data System (ADS)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

    Automatic recognition of facial expressions can be an important component of natural human-machine interactions. While a large number of samples is desirable for accurately estimating a person's feelings (e.g. likeness) about a machine interface, in real-world situations only a small number of samples can be obtained, because of the high cost of collecting emotional responses from the observed person. This paper proposes a system that solves this problem while conforming to individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, the proposed method achieved the best generalization performance, using less learning time than Support Vector Machine (SVM) classifiers, when compared with former HNN and SVM classifiers.

  9. Hand radiograph analysis for fully automatic bone age assessment

    NASA Astrophysics Data System (ADS)

    Chassignet, Philippe; Nitescu, Teodor; Hassan, Max; Stanescu, Ruxandra

    1999-05-01

    This paper describes a method for the fully automatic and reliable segmentation of the bones in a radiograph of a child's hand. The problem consists in identifying the contours of the bones; the difficulty lies in the large variability of the anatomical structures according to age, hand pose or individual. The model shall not force any standard interpretation, hence we use a simple hierarchical geometric model that provides only the information required for the identification of the chunks of contours. The resulting phalangeal and metacarpal segmentation proved robust over a set of many hundreds of images, and measurements of shapes, sizes, areas, etc. are now possible. The next step consists in extending the model for more accurate measurements and also for the localization of the carpal bones.

  10. [Development of a Japanese version of the Valuation of Life (VOL) scale].

    PubMed

    Nakagawa, Takeshi; Gondo, Yasuyuki; Masui, Yukie; Ishioka, Yoshiko; Tabuchi, Megumi; Kamide, Kei; Ikebe, Kazunori; Arai, Yasumichi; Takahashi, Ryutaro

    2013-04-01

    This study developed a Japanese version of the Valuation of Life (VOL) scale, to measure psychological wellbeing among older adults. In Analysis 1, we conducted a factor analysis of 13 items, and identified two factors: positive VOL and spiritual well-being. These factors had adequate degrees of internal consistency, and were related to positive mental health. In Analysis 2, we examined sociodemographic, social, and health predictors for VOL. The role of social factors was stronger than the role of health factors, and spiritual well-being was more related to moral or religious activities than positive VOL. These results suggest that predictors for VOL vary by culture. In Analysis 3, we investigated the relationship between VOL and desired years of life. Positive VOL significantly predicted more desired years of life, whereas spiritual well-being did not. Positive VOL had acceptable reliability and validity. Future research is required to investigate whether VOL predicts survival duration or end-of-life decisions.

  11. Automatic Segmentation of Cell Nuclei in Bladder and Skin Tissue for Karyometric Analysis

    PubMed Central

    Korde, Vrushali R.; Bartels, Hubert; Barton, Jennifer; Ranger-Moore, James

    2010-01-01

    Objective To automatically segment cell nuclei in histology images of bladder and skin tissue for karyometric analysis. Study Design The four main steps in the program were as follows: median filtering and thresholding, segmentation, categorizing, and cusp correction. This robust segmentation technique used properties of the image histogram to optimally select a threshold and create closed four-way chain code nuclear segmentations. Each cell nucleus segmentation was treated as an individual object whose properties of segmentation quality were used for criteria to classify each nucleus as: throw away, salvageable, or good. An erosion/dilation procedure and re-thresholding were performed on salvageable nuclei to correct cusps. Results Ten bladder histology images were segmented both by hand and using this automatic segmentation algorithm. The automatic segmentation resulted in a sensitivity of 76.4%, defined as the percentage of hand segmented nuclei that were automatically segmented with good quality. The median proportional difference between hand and automatic segmentations over 42 nuclei each with 95 features used in karyometric analysis was 1.6%. The same procedure was performed on 10 skin histology images with a sensitivity of 83.0% and median proportional difference of 2.6%. Conclusion The close agreement in karyometric features with hand segmentation shows that automated segmentation can be used for analysis of bladder and skin histology images. PMID:19402384
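
    A simplified sketch of the threshold-segment-categorize flow (using scikit-image building blocks as a stand-in for the paper's chain-code segmentation and cusp correction; the darker-than-background assumption and the area cutoff are illustrative):

      import numpy as np
      from scipy import ndimage
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      def segment_nuclei(image, min_area=50):
          # Median filtering suppresses noise before histogram-based
          # thresholding; connected components become candidate nuclei.
          smoothed = ndimage.median_filter(image, size=3)
          mask = smoothed < threshold_otsu(smoothed)  # nuclei darker than background
          labels = label(mask)
          # Simple triage by size, standing in for the good/salvageable/
          # throw-away categorization of segmentation quality.
          good = [r for r in regionprops(labels) if r.area >= min_area]
          return labels, good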

  12. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early with a quantifiable risk associated with decision making when responding to suspicious activity.

  13. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
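
    The idea behind a tool like ADIFOR, propagating derivatives through the analysis code itself, can be illustrated with forward-mode dual numbers (a toy Python sketch; ADIFOR itself transforms Fortran source):

      class Dual:
          # Carries (value, derivative) through arithmetic, yielding exact
          # derivatives of composite expressions without finite differences.
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val + other.val, self.der + other.der)
          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.val * other.val,
                          self.der * other.val + self.val * other.der)
          __rmul__ = __mul__

      x = Dual(2.0, 1.0)             # seed dx/dx = 1
      print((x * x + 3 * x).der)     # d/dx (x^2 + 3x) at x = 2 -> 7.0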

  14. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  15. Automatic Method of Supernovae Classification by Modeling Human Procedure of Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Módolo, Marcelo; Rosa, Reinaldo; Guimaraes, Lamartine N. F.

    2016-07-01

    The classification of a recently discovered supernova must be done as quickly as possible in order to define what information will be captured and analyzed in the following days. This classification is not trivial and only a few expert astronomers are able to perform it. This paper proposes an automatic method that models the human procedure of classification. It uses Multilayer Perceptron Neural Networks to analyze the supernova spectra. Experiments were performed using different pre-processing steps and multiple neural network configurations to identify the classic types of supernovae. Significant results were obtained, indicating the viability of using this method in places that have no specialist or that require an automatic analysis.

  16. CAD system for automatic analysis of CT perfusion maps

    NASA Astrophysics Data System (ADS)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). The methods perform both quantitative analysis [detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (deciding the type of lesion: ischemic/hemorrhagic, and whether or not the brain tissue is at risk of infarction) of visualized symptoms is done by so-called cognitive inference processes, allowing reasoning on the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET framework installed.

  17. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  18. Rapid Automatized Naming and Reading Performance: A Meta-Analysis

    ERIC Educational Resources Information Center

    Araújo, Susana; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2015-01-01

    Evidence that rapid naming skill is associated with reading ability has become increasingly prevalent in recent years. However, there is considerable variation in the literature concerning the magnitude of this relationship. The objective of the present study was to provide a comprehensive analysis of the evidence on the relationship between rapid…

  19. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    NASA Astrophysics Data System (ADS)

    Tiira, Timo; Kaisko, Outi; Kortström, Jari; Vuorinen, Tommi; Uski, Marja; Korja, Annakaisa

    2015-04-01

    The site of a new planned nuclear power plant is located in Pyhäjoki, on the eastern coast of the Bay of Bothnia. The area is characterized by low-active intraplate seismicity, with earthquake magnitudes rarely exceeding 4.0. IAEA guidelines state that when a nuclear power plant site is evaluated, a network of sensitive seismographs with a recording capability for micro-earthquakes should be installed to acquire more detailed information on potential seismic sources. The operation period of the network should be long enough to obtain a comprehensive earthquake catalogue for seismotectonic interpretation. A near-optimal configuration of ten seismograph stations will be installed around the site. A central station, including 3-C high-frequency and strong motion seismographs, is located in the site area. In addition, the network comprises nine high-frequency 3-C stations within a distance of 50 km from the central station. The network is dense enough to fulfil the requirements of azimuthal coverage better than 180° and automatic event location capability down to ~ML -0.1 within a radius of 25 km from the site. Automatic processing and analysis for the planned seismic network are presented. Following the IAEA guidelines, real-time monitoring of the site area is integrated with the automatic detection and location process operated by the Institute of Seismology, University of Helsinki. In addition, interactive data analysis is needed. By the end of 2013, five stations had been installed. The automatic analysis also utilizes seven nearby stations of the national seismic networks of Finland and Sweden. During this preliminary phase, several small earthquakes have been detected. The detection capability and location accuracy of the automatic analysis are estimated using chemical explosions at 15 known sites.

  20. Automatic system for brain MRI analysis using a novel combination of fuzzy rule-based and automatic clustering techniques

    NASA Astrophysics Data System (ADS)

    Hillman, Gilbert R.; Chang, Chih-Wei; Ying, Hao; Kent, T. A.; Yen, John

    1995-05-01

    Analysis of magnetic resonance images (MRI) of the brain permits the identification and measurement of brain compartments. These compartments include normal subdivisions of brain tissue, such as gray matter, white matter and specific structures, and also include pathologic lesions associated with stroke or viral infection. A fuzzy system has been developed to analyze images of animal and human brain, segmenting the images into physiologically meaningful regions for display and measurement. This image segmentation system consists of two stages: a fuzzy rule-based system and the fuzzy c-means (FCM) algorithm. The first stage is a fuzzy rule-based system which classifies most pixels in MR images into several known classes and one 'unclassified' group, which fails to fit the predetermined rules. In the second stage, the system uses the result of the first stage as initial estimates for the properties of the compartments and applies FCM to classify all the previously unclassified pixels. The initial prototypes are estimated by using the averages of the previously classified pixels. The combined processes constitute a fast, accurate and robust image segmentation system. This method can be applied to many clinical image segmentation problems. While the rule-based portion of the system allows specialized knowledge about the images to be incorporated, the FCM allows the resolution of ambiguities that result from noise and artifacts in the image data. The volumes and locations of the compartments can easily be measured and reported quantitatively once they are identified. It is easy to adapt this approach to new imaging problems by introducing a new set of fuzzy rules and adjusting the number of expected compartments. However, for the purpose of building a practical fully automatic system, a rule learning mechanism may be necessary to improve the efficiency of modification of the fuzzy rules.
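
    A compact NumPy sketch of the FCM stage (standard fuzzy c-means; in the system above the initial prototypes would come from the class averages produced by the rule-based stage rather than from random choice):

      import numpy as np

      def fuzzy_c_means(x, c, m=2.0, n_iter=100):
          # x: (n_samples, n_features); c: number of compartments;
          # m: fuzzification exponent. Alternates membership and centroid
          # updates for a fixed number of iterations.
          rng = np.random.default_rng(0)
          centroids = x[rng.choice(len(x), size=c, replace=False)]
          for _ in range(n_iter):
              d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
              u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
              w = u ** m
              centroids = (w.T @ x) / w.sum(axis=0)[:, None]
          return u, centroids          # memberships and class prototypes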

  1. Automatic forensic analysis of automotive paints using optical microscopy.

    PubMed

    Thoonen, Guy; Nys, Bart; Vander Haeghen, Yves; De Roy, Gilbert; Scheunders, Paul

    2016-02-01

    The timely identification of vehicles involved in an accident, such as a hit-and-run situation, bears great importance in forensics. To this end, procedures have been defined for analyzing car paint samples that combine techniques such as visual analysis and Fourier transform infrared spectroscopy. This work proposes a new methodology in order to automate the visual analysis using image retrieval. Specifically, color and texture information is extracted from a microscopic image of a recovered paint sample, and this information is then compared with the same features for a database of paint types, resulting in a shortlist of candidate paints. In order to demonstrate the operation of the methodology, a test database has been set up and two retrieval experiments have been performed. The first experiment quantifies the performance of the procedure for retrieving exact matches, while the second experiment emulates the real-life situation of paint samples that experience changes in color and texture over time. PMID:26774250

  2. A framework for automatic heart sound analysis without segmentation

    PubMed Central

    2011-01-01

    Background A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method Equal numbers of cardiac cycles were extracted from heart sounds with different heart rates using information from the envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using autocorrelation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result The proposed method was tested on a set of heart sounds obtained from several online databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion The proposed method showed promising results and high noise robustness for a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, and further evaluating the method on this new training set. PMID:21303558
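
    The cycle-length step, finding the period from the autocorrelation of the envelope rather than labeling individual FHS, can be sketched as follows (the physiological heart-rate bounds are assumptions, and the record is assumed to span several cycles):

      import numpy as np

      def cardiac_cycle_length(envelope, fs, min_hr=40.0, max_hr=200.0):
          # Returns the cycle length in samples: the lag of the strongest
          # autocorrelation peak within a plausible heart-rate range.
          env = envelope - envelope.mean()
          ac = np.correlate(env, env, mode="full")[len(env) - 1:]
          lo = int(fs * 60.0 / max_hr)    # shortest plausible cycle
          hi = int(fs * 60.0 / min_hr)    # longest plausible cycle
          return lo + int(np.argmax(ac[lo:hi]))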

  3. Biosignal Analysis to Assess Mental Stress in Automatic Driving of Trucks: Palmar Perspiration and Masseter Electromyography

    PubMed Central

    Zheng, Rencheng; Yamabe, Shigeyuki; Nakano, Kimihiko; Suda, Yoshihiro

    2015-01-01

    Nowadays, insight into human-machine interaction is a critical topic, given the large-scale development of intelligent vehicles. Biosignal analysis can provide a deeper understanding of driver behaviors that may indicate rational practical use of the automatic technology. Therefore, this study concentrates on biosignal analysis to quantitatively evaluate the mental stress of drivers during automatic driving of trucks, with vehicles set a close gap distance apart to reduce air resistance and save energy consumption. By application of two wearable sensor systems, continuous measurement was realized for palmar perspiration and masseter electromyography, and a biosignal processing method was proposed to assess mental stress levels. In a driving simulator experiment, ten participants completed automatic driving with 4, 8, and 12 m gap distances from the preceding vehicle, and manual driving with about a 25 m gap distance as a reference. It was found that mental stress significantly increased as the gap distance decreased, and an abrupt increase in the mental stress of drivers was also observed accompanying a sudden change of the gap distance during automatic driving, which corresponded to significantly higher ride discomfort according to subjective reports. PMID:25738768

  4. Automatic quantification of neurite outgrowth by means of image analysis

    NASA Astrophysics Data System (ADS)

    Van de Wouwer, Gert; Nuydens, Rony; Meert, Theo; Weyn, Barbara

    2004-07-01

    A system for the quantification of neurite outgrowth in in-vitro experiments is described. The system is developed for routine use in a high-throughput setting and therefore needs to be fast, cheap, and robust. It relies on automated digital microscopic imaging of microtiter plates. Image analysis is applied to extract features for the characterisation of neurite outgrowth. The system is tested in a dose-response experiment on PC12 cells + Taxol. The performance of the system and its ability to measure changes in neuronal morphology are studied.

  5. Automatic movie skimming with story units via general tempo analysis

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Hung; Yeh, Chia H.; Kuo, C.-C. J.

    2003-12-01

    A skimming system for movie content exploration is proposed using story units extracted via general tempo analysis of audio and visual data. Quite a few schemes have been proposed to segment video data into shots with low-level features, yet the grouping of shots into meaningful units, called story units here, is important and challenging. In this work, we detect similar shots using key frames and include these similar shots as a node in the scene transition graph. Then, an importance measure is calculated based on the total length of each node. Finally, we select sinks and shots according to this measure. Based on these semantic shots, meaningful skims can be successfully generated. Simulation results will be presented to show that the proposed video skimming scheme can preserve the essential and significant content of the original video data.

  6. Automatic Fatigue Detection of Drivers through Yawning Analysis

    NASA Astrophysics Data System (ADS)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on the video analysis of drivers. The focus of the paper is on how to detect yawning, which is an important cue for determining driver fatigue. Initially, the face is located through the Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which the lips are searched for through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features to determine the driver's yawning state. If the yawning state of the driver persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and warns the driver through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, and with users of different races and genders.

  7. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    PubMed

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.
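
    Discrete Gaussian curvature on a triangle mesh is commonly computed via the angle deficit at each vertex; a minimal sketch of that computation (the paper's block-placement logic built on top of the curvature analysis is not reproduced):

      import numpy as np

      def angle_deficit_curvature(vertices, faces):
          # vertices: (n, 3) float array; faces: (m, 3) int array.
          # Gaussian curvature at a vertex ~ 2*pi minus the sum of the
          # incident triangle angles: ~0 on flat regions, large at corners.
          K = np.full(len(vertices), 2.0 * np.pi)
          for tri in faces:
              for i in range(3):
                  a = vertices[tri[i]]
                  b = vertices[tri[(i + 1) % 3]]
                  c = vertices[tri[(i + 2) % 3]]
                  u, v = b - a, c - a
                  cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
                  K[tri[i]] -= np.arccos(np.clip(cosang, -1.0, 1.0))
          return K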

  9. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    PubMed Central

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by the manual data acquisition and analysis processes, which are labor intensive and time consuming. Based on our original design of the biochemical reactions, we propose here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patterns into a microfluidic array. These function patterns provide quantitative information on the characteristic dimensions of the microfluidic array, as well as mark its orientation and origin of coordinates. We used a computer program to perform automatic analysis for a high-throughput antigen/antibody interaction experiment in 10 s, which was more than 500 times faster than conventional manual processing. Our method is broadly applicable to many other microchannel-based immunoassays. PMID:24404030

  11. Image analysis techniques for automatic evaluation of two-dimensional electrophoresis.

    PubMed

    Häder, D P; Kauer, G

    1990-05-01

    Techniques for the automatic analysis of two-dimensional electrophoresis gels by computer-aided image analysis are described. Original gels or photographic films are scanned using a laser scanner and the files are transferred to a microcomputer. The program package first performs a compression and pre-evaluation of the files. Spot identification and quantification are performed by the chain code algorithm after appropriate zooming and cutting. Labeling facilitates spot identification and quantification in numerical and graphical (pseudocolor) representations on peripheral devices for camera-ready output. Interpolation between measured basepoints is performed by cubic spline algorithms, which are automatically switched on and off as needed by the program. High-speed analysis and graphic representation are achieved using fast Assembler-language routines rather than high-level languages. One-dimensional gels can be analyzed using the same software. Spot matching between parallel two-dimensional gels has not yet been implemented.

  12. Automatic localization of cerebral cortical malformations using fractal analysis

    NASA Astrophysics Data System (ADS)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel level analysis based on fractal geometry, then define two similarity measures to detect the lesions at single subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.

  14. Analysis of Fiber deposition using Automatic Image Processing Method

    NASA Astrophysics Data System (ADS)

    Belka, M.; Lizal, F.; Jedelsky, J.; Jicha, M.

    2013-04-01

    Fibers are a permanent threat to human health. They are able to penetrate deep into the human lung, deposit there and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways at an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method based on the principle of image analysis was established for deposition data acquisition. The images were captured by a high-definition camera attached to a phase contrast microscope. Results of the new method were compared with the standard PCM method, which follows the NIOSH 7400 methodology, and a good match was found. The new method was found applicable for the evaluation of fibers, and deposition fraction and deposition efficiency were calculated afterwards.

  15. Fully Automatic System for Accurate Localisation and Analysis of Cephalometric Landmarks in Lateral Cephalograms

    PubMed Central

    Lindner, Claudia; Wang, Ching-Wei; Huang, Cheng-Ta; Li, Chung-Hsing; Chang, Sheng-Wei; Cootes, Tim F.

    2016-01-01

    Cephalometric tracing is a standard analysis tool for orthodontic diagnosis and treatment planning. The aim of this study was to develop and validate a fully automatic landmark annotation (FALA) system for finding cephalometric landmarks in lateral cephalograms and its application to the classification of skeletal malformations. Digital cephalograms of 400 subjects (age range: 7–76 years) were available. All cephalograms had been manually traced by two experienced orthodontists with 19 cephalometric landmarks, and eight clinical parameters had been calculated for each subject. A FALA system to locate the 19 landmarks in lateral cephalograms was developed. The system was evaluated via comparison to the manual tracings, and the automatically located landmarks were used for classification of the clinical parameters. The system achieved an average point-to-point error of 1.2 mm, and 84.7% of landmarks were located within the clinically accepted precision range of 2.0 mm. The automatic landmark localisation performance was within the inter-observer variability between two clinical experts. The automatic classification achieved an average classification accuracy of 83.4% which was comparable to an experienced orthodontist. The FALA system rapidly and accurately locates and analyses cephalometric landmarks in lateral cephalograms, and has the potential to significantly improve the clinical work flow in orthodontic treatment. PMID:27645567
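
    The two headline metrics reduce to simple distance statistics over paired landmark sets; a sketch, where the millimetre-per-pixel calibration is a hypothetical value:

    ```python
    import numpy as np

    def landmark_error_stats(auto_pts, manual_pts, tol_mm=2.0, mm_per_pixel=0.1):
        """Mean point-to-point error and fraction of landmarks within tolerance.

        auto_pts, manual_pts: arrays of shape (n_landmarks, 2) in pixel coordinates.
        """
        dists = np.linalg.norm(auto_pts - manual_pts, axis=1) * mm_per_pixel
        return dists.mean(), (dists <= tol_mm).mean()
    ```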

  16. Fully Automatic System for Accurate Localisation and Analysis of Cephalometric Landmarks in Lateral Cephalograms.

    PubMed

    Lindner, Claudia; Wang, Ching-Wei; Huang, Cheng-Ta; Li, Chung-Hsing; Chang, Sheng-Wei; Cootes, Tim F

    2016-09-20

    Cephalometric tracing is a standard analysis tool for orthodontic diagnosis and treatment planning. The aim of this study was to develop and validate a fully automatic landmark annotation (FALA) system for finding cephalometric landmarks in lateral cephalograms and its application to the classification of skeletal malformations. Digital cephalograms of 400 subjects (age range: 7-76 years) were available. All cephalograms had been manually traced by two experienced orthodontists with 19 cephalometric landmarks, and eight clinical parameters had been calculated for each subject. A FALA system to locate the 19 landmarks in lateral cephalograms was developed. The system was evaluated via comparison to the manual tracings, and the automatically located landmarks were used for classification of the clinical parameters. The system achieved an average point-to-point error of 1.2 mm, and 84.7% of landmarks were located within the clinically accepted precision range of 2.0 mm. The automatic landmark localisation performance was within the inter-observer variability between two clinical experts. The automatic classification achieved an average classification accuracy of 83.4% which was comparable to an experienced orthodontist. The FALA system rapidly and accurately locates and analyses cephalometric landmarks in lateral cephalograms, and has the potential to significantly improve the clinical work flow in orthodontic treatment.

  17. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we previously developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. The application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
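
    CADLIVE's map-to-model conversion is specific to the toolbox; as a generic illustration of the kind of dynamic model such tools produce, a toy two-species mass-action system simulated with SciPy (the reaction scheme and rate constants are invented):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy mass-action network: S -> P (conversion), P -> degradation
    k_cat, k_deg = 1.2, 0.3   # hypothetical rate constants

    def rhs(t, y):
        s, p = y
        return [-k_cat * s, k_cat * s - k_deg * p]

    sol = solve_ivp(rhs, t_span=(0.0, 20.0), y0=[1.0, 0.0],
                    t_eval=np.linspace(0.0, 20.0, 101))
    # sol.y[0] is the substrate and sol.y[1] the product over time
    ```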

  18. Investigation of Ballistic Evidence through an Automatic Image Analysis and Identification System.

    PubMed

    Kara, Ilker

    2016-05-01

    Automated firearms identification (AFI) systems contribute to shedding light on criminal events by comparing different pieces of evidence on cartridge cases and bullets and by matching similar ones that were fired from the same firearm. Ballistic evidence can be rapidly analyzed and classified by means of an automatic image analysis and identification system, which can also be used to narrow the range of possible matching evidence. In this study, conducted on cartridges ejected from the examined pistol, three imaging areas, namely the firing pin impression, the capsule traces, and the intersection of these traces, were compared automatically using the image analysis and identification system through the correlation ranking method, yielding numeric values that indicate the significance of the similarities. These numerical features can then be used to group pistols and to distinguish between makes and models. PMID:27122419
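
    The correlation ranking can be illustrated as a Pearson correlation score between registered, same-size grayscale images of the marks; this is a simplification of commercial AFI matching, not the system used in the study:

    ```python
    import numpy as np

    def correlation_score(img_a, img_b):
        """Zero-mean Pearson correlation between two same-sized mark images."""
        a = img_a.astype(float).ravel()
        b = img_b.astype(float).ravel()
        a -= a.mean()
        b -= b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Ranking candidate matches is then a sort over the scores:
    # ranked = sorted(candidates, key=lambda c: correlation_score(probe, c), reverse=True)
    ```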

  19. Analysis of outdoor radon progeny concentration measured at the Spanish radioactive aerosol automatic monitoring network.

    PubMed

    Arnold, D; Vargas, A; Ortega, X

    2009-05-01

    An analysis of 10-year radon progeny data, provided by the Spanish automatic radiological surveillance network, in relation to meteorology is presented. Results show great spatial variability depending mainly on the station location and thus, the surrounding radon exhalation rate. Hourly averages show the typical diurnal cycle with an early morning maximum and a minimum at noon, except for one mountain station, which shows an inverse behaviour. Monthly averaged values show lower concentrations during months with higher atmospheric instability.
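
    The hourly and monthly averaging behind such observations is a straightforward time-series group-by; a sketch with a synthetic concentration series standing in for the network data:

    ```python
    import numpy as np
    import pandas as pd

    # Synthetic one-year hourly radon-progeny concentration series
    idx = pd.date_range("2000-01-01", periods=365 * 24, freq="h")
    conc = pd.Series(np.random.lognormal(0.0, 0.5, size=len(idx)), index=idx)

    hourly_cycle = conc.groupby(conc.index.hour).mean()    # mean diurnal cycle
    monthly_means = conc.groupby(conc.index.month).mean()  # seasonal pattern
    ```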

  1. Automatic Assessment and Reduction of Noise using Edge Pattern Analysis in Non-Linear Image Enhancement

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.; Hines, Glenn D.

    2004-01-01

    Noise is the primary visibility limit in the process of non-linear image enhancement, and it is no longer statistically stable additive noise in the post-enhancement image. Therefore, novel approaches are needed to both assess and reduce spatially variable noise at this stage of the overall image processing. Here we examine the use of edge pattern analysis both for automatic assessment of spatially variable noise and as a foundation for new noise reduction methods.

  2. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  3. Nonverbal Social Withdrawal in Depression: Evidence from manual and automatic analysis

    PubMed Central

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, S. Mohammad; Hammal, Zakia; Rosenwald, Dean P.

    2014-01-01

    The relationship between nonverbal behavior and severity of depression was investigated by following depressed participants over the course of treatment and video recording a series of clinical interviews. Facial expressions and head pose were analyzed from video using manual and automatic systems. Both systems were highly consistent for FACS action units (AUs) and showed similar effects for change over time in depression severity. When symptom severity was high, participants made fewer affiliative facial expressions (AUs 12 and 15) and more non-affiliative facial expressions (AU 14). Participants also exhibited diminished head motion (i.e., amplitude and velocity) when symptom severity was high. These results are consistent with the Social Withdrawal hypothesis: that depressed individuals use nonverbal behavior to maintain or increase interpersonal distance. As individuals recover, they send more signals indicating a willingness to affiliate. The finding that automatic facial expression analysis was both consistent with manual coding and revealed the same pattern of findings suggests that automatic facial expression analysis may be ready to relieve the burden of manual coding in behavioral and clinical science. PMID:25378765

  4. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

    This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25 % correct classification rate, which is very promising and underpins the interest in this line of inquiry.
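
    The generative GMM scheme described can be sketched with scikit-learn: fit one mixture per class and classify by log-likelihood. The feature matrices below are random stand-ins for real voice features:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    X_healthy = rng.normal(0.0, 1.0, size=(200, 12))   # stand-in voice features
    X_apnoea = rng.normal(0.5, 1.0, size=(200, 12))

    gmm_h = GaussianMixture(n_components=4, random_state=0).fit(X_healthy)
    gmm_a = GaussianMixture(n_components=4, random_state=0).fit(X_apnoea)

    def classify(features):
        # higher mean log-likelihood wins, a simple GMM likelihood-ratio test
        return "apnoea" if gmm_a.score(features) > gmm_h.score(features) else "healthy"

    print(classify(rng.normal(0.5, 1.0, size=(10, 12))))  # one test speaker's frames
    ```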

  5. Automatic inspection of the printing on soft drink cans by image processing analysis

    NASA Astrophysics Data System (ADS)

    Ni, Catherine W.

    1999-03-01

    This paper describes a machine-vision algorithm for automatic inspection of the printing on soft drink cans by image processing analysis. Two new techniques are employed in this procedure to make the automatic inspection possible: (1) barcode referencing: we develop a fast barcode detection algorithm so that, when cans pass through the image-taking area of the inspection line in uncertain orientations, the barcode location can be used as the reference point; (2) 2D matching: we connect multiple view-angle images of the whole 3D can surface and then, with artificial 2D images, adjust the matching process to meet flexible inspection resolution requirements for quality-control decision making. This process inspects 3D cans with true color information and can easily be adapted to different cans.

  6. Analysis of Social Variables when an Initial Functional Analysis Indicates Automatic Reinforcement as the Maintaining Variable for Self-Injurious Behavior

    ERIC Educational Resources Information Center

    Kuhn, Stephanie A. Contrucci; Triggs, Mandy

    2009-01-01

    Self-injurious behavior (SIB) that occurs at high rates across all conditions of a functional analysis can suggest automatic or multiple functions. In the current study, we conducted a functional analysis for 1 individual with SIB. Results indicated that SIB was, at least in part, maintained by automatic reinforcement. Further analyses using…

  7. Automatic extraction of initial moving object based on advanced feature and video analysis

    NASA Astrophysics Data System (ADS)

    Liu, Mao-Ying; Dai, Qiong-Hai; Liu, Xiao-Dong; Er, Gui-Hua

    2005-07-01

    Traditionally, video segmentation extracts objects using low-level features such as color, texture, edge, motion, and optical flow. This paper proposes that the connectivity of object motion is an advanced feature of a video moving object, because it can reflect the semantic meaning of the object to some extent, and it can be fully represented on a cumulated difference image, which is the combination of a certain number of interframe difference images. Based on this principle, a novel system is designed to extract the initial moving object automatically. The system includes three key innovations: 1) The system operates on the cumulated difference image, which makes the object more prominent than the background noise. Object extraction is based on the connectivity of object motion, which guarantees the integrity of the extracted object while eliminating large background regions that cannot be removed by conventional change detection methods, for example, intense-noise regions and shadow regions that are not tightly connected to the object. 2) Video sequence analysis is performed ahead of video segmentation, and suitable object extraction methods are adopted according to the characteristics of the background noise and object motion. 3) An adaptive threshold is determined automatically on the cumulated difference image after acute noise is removed. The threshold determined in this way is more reasonable, and with it most noise can be eliminated while small-motion regions of the object are preserved. Results show that the system can extract objects in different kinds of sequences automatically, promptly and properly, making it well suited to real-time video applications.
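
    A minimal numpy sketch of the cumulated difference image and a simple adaptive threshold; the paper's threshold selection is more elaborate, here approximated as mean + k·std:

    ```python
    import numpy as np

    def cumulated_difference(frames):
        """Combine interframe differences over a clip of grayscale frames.

        frames: array of shape (n_frames, height, width).
        """
        diffs = np.abs(np.diff(frames.astype(float), axis=0))
        return diffs.sum(axis=0)

    def object_mask(cum_diff, k=2.0):
        # crude adaptive threshold on the cumulated image
        thr = cum_diff.mean() + k * cum_diff.std()
        return cum_diff > thr
    ```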

  8. Automaticity in acute ischemia: Bifurcation analysis of a human ventricular model

    NASA Astrophysics Data System (ADS)

    Bouchard, Sylvain; Jacquemet, Vincent; Vinet, Alain

    2011-01-01

    Acute ischemia (restriction in blood supply to part of the heart as a result of myocardial infarction) induces major changes in the electrophysiological properties of the ventricular tissue. Extracellular potassium concentration ([K+]o) increases in the ischemic zone, leading to an elevation of the resting membrane potential that creates an “injury current” (IS) between the infarcted and the healthy zone. In addition, the lack of oxygen impairs the metabolic activity of the myocytes and decreases ATP production, thereby affecting ATP-sensitive potassium channels (IKatp). Frequent complications of myocardial infarction are tachycardia, fibrillation, and sudden cardiac death, but the mechanisms underlying their initiation are still debated. One hypothesis is that these arrhythmias may be triggered by abnormal automaticity. We investigated the effect of ischemia on myocyte automaticity by performing a comprehensive bifurcation analysis (fixed points, cycles, and their stability) of a human ventricular myocyte model [K. H. W. J. ten Tusscher and A. V. Panfilov, Am. J. Physiol. Heart Circ. Physiol. 291, H1088 (2006)] as a function of three ischemia-relevant parameters: [K+]o, IS, and IKatp. In this single-cell model, we found that automatic activity was possible only in the presence of an injury current. Changes in [K+]o and IKatp significantly altered the bifurcation structure of IS, including the occurrence of early afterdepolarizations. The results provide a sound basis for studying higher-dimensional tissue structures representing an ischemic heart.

  9. NGS-Trex: an automatic analysis workflow for RNA-Seq data.

    PubMed

    Boria, Ilenia; Boatti, Lara; Saggese, Igor; Mignone, Flavio

    2015-01-01

    RNA-Seq technology allows the rapid analysis of whole transcriptomes, taking advantage of next-generation sequencing platforms. Moreover, with the constantly decreasing cost of NGS analysis, RNA-Seq is becoming very popular and widespread. Unfortunately, data analysis is quite demanding in terms of the bioinformatic skills and infrastructure required, thus limiting the potential users of this method. Here we describe the complete analysis of sample data, from raw sequences to data mining of results, using the NGS-Trex platform, a low-user-interaction, fully automatic analysis workflow. Used through a web interface, NGS-Trex processes data and profiles the transcriptome of the samples, identifying expressed genes, transcripts, and new and known splice variants. It also detects differentially expressed genes and transcripts across different experiments.

  10. Automatic segmentation and quantitative analysis of the articular cartilages from magnetic resonance images of the knee.

    PubMed

    Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K; Ourselin, Sébastien

    2010-01-01

    In this paper, we present a segmentation scheme that automatically and accurately segments all the cartilages from magnetic resonance (MR) images of nonpathological knees. Our scheme involves the automatic segmentation of the bones using a three-dimensional active shape model, the extraction of the expected bone-cartilage interface (BCI), and cartilage segmentation from the BCI using a deformable model that utilizes localization, patient-specific tissue estimation and a model of the thickness variation. The accuracy of this scheme was experimentally validated using leave-one-out experiments on a database of fat-suppressed spoiled gradient recalled MR images. The scheme was compared to three state-of-the-art approaches: tissue classification, a modified semi-automatic watershed algorithm and nonrigid registration (B-spline based free-form deformation). Our scheme obtained an average Dice similarity coefficient (DSC) of (0.83, 0.83, 0.85) for the (patellar, tibial, femoral) cartilages, while (0.82, 0.81, 0.86) was obtained with the tissue classifier and (0.73, 0.79, 0.76) with nonrigid registration. The average DSC obtained for all the cartilages using the semi-automatic watershed algorithm (0.90) was slightly higher than with our approach (0.89); however, unlike that approach, we segment each cartilage as a separate object. The effectiveness of our approach for quantitative analysis was evaluated using volume and thickness measures, with a median volume difference error of (5.92, 4.65, 5.69) and an absolute Laplacian thickness difference of (0.13, 0.24, 0.12) mm.
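
    The Dice similarity coefficient used here is DSC = 2|A∩B| / (|A| + |B|); a one-function sketch:

    ```python
    import numpy as np

    def dice(seg_a, seg_b):
        """Dice similarity coefficient between two binary segmentations."""
        a = seg_a.astype(bool)
        b = seg_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
    ```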

  11. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis.

    PubMed

    Haderlein, Tino; Schwemmle, Cornelia; Döllinger, Michael; Matoušek, Václav; Ptok, Martin; Nöth, Elmar

    2015-01-01

    Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation of the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men; 48.7 ± 17.8 years) containing the German version of the text "The North Wind and the Sun" were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) of the Laryngograph and 33 features from a prosodic analysis module were used to model the listeners' ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx (r = 0.71, ρ = 0.57). These correlations were approximately the same as the interrater agreement among human raters (r = 0.65, ρ = 0.61). CQx was one of the substantial features of the hoarseness model. For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for a meaningful objective support for perceptual analysis.
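
    The human-machine modelling step has a direct scikit-learn counterpart; the feature matrix and ratings below are random stand-ins for the prosodic/Laryngograph features and listener scores:

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X = rng.normal(size=(58, 7))    # e.g. six prosodic features plus CFx per speaker
    y = rng.uniform(0, 3, size=58)  # mean perceptual roughness ratings (0-3 scale)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, epsilon=0.1))
    model.fit(X, y)
    predicted = model.predict(X)
    r = np.corrcoef(predicted, y)[0, 1]  # human-machine correlation
    ```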

  12. Automatic Evaluation of Voice Quality Using Text-Based Laryngograph Measurements and Prosodic Analysis

    PubMed Central

    Haderlein, Tino; Schwemmle, Cornelia; Döllinger, Michael; Matoušek, Václav; Ptok, Martin; Nöth, Elmar

    2015-01-01

    Due to low intra- and interrater reliability, perceptual voice evaluation should be supported by objective, automatic methods. In this study, text-based, computer-aided prosodic analysis and measurements of connected speech were combined in order to model perceptual evaluation of the German Roughness-Breathiness-Hoarseness (RBH) scheme. 58 connected speech samples (43 women and 15 men; 48.7 ± 17.8 years) containing the German version of the text “The North Wind and the Sun” were evaluated perceptually by 19 speech and voice therapy students according to the RBH scale. For the human-machine correlation, Support Vector Regression with measurements of the vocal fold cycle irregularities (CFx) and the closed phases of vocal fold vibration (CQx) of the Laryngograph and 33 features from a prosodic analysis module were used to model the listeners' ratings. The best human-machine results for roughness were obtained from a combination of six prosodic features and CFx (r = 0.71, ρ = 0.57). These correlations were approximately the same as the interrater agreement among human raters (r = 0.65, ρ = 0.61). CQx was one of the substantial features of the hoarseness model. For hoarseness and breathiness, the human-machine agreement was substantially lower. Nevertheless, the automatic analysis method can serve as the basis for a meaningful objective support for perceptual analysis. PMID:26136813

  13. Automatic phase correction of fourier transform NMR spectra based on the dispersion versus absorption (DISPA) lineshape analysis

    NASA Astrophysics Data System (ADS)

    Sotak, Christopher H.; Dumoulin, Charles L.; Newsham, Mark D.

    A method for automatic phase correction of Fourier transform NMR spectra based on the dispersion versus absorption (DISPA) lineshape analysis is described. The DISPA display of a single misphased Lorentzian line gives a unit circle which has been rotated about the origin (relative to its "reference circle") by a number of degrees equal to the phase misadjustment. This rotation, Φ, is a combination of the zero- and first-order phase angles at the frequency of the resonance. Calculation of Φ for two or more resonances allows the spectral phasing parameters to be determined and applied to correct the spectrum. This approach has been implemented in both automatic and "semi-automatic" modes.
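
    Once Φ has been estimated at two resonance frequencies, φ0 and φ1 follow from the linear system Φ_k = φ0 + φ1·ν_k, and the correction itself is S'(ν) = S(ν)·exp(−i(φ0 + φ1·ν)). A numpy sketch, using a normalized frequency axis (an assumption; the real axis scaling depends on the spectrometer):

    ```python
    import numpy as np

    def phase_correct(spectrum, phi0, phi1):
        """Apply zero- and first-order phase correction to a complex spectrum.

        phi0, phi1 in radians; phi1 varies linearly across the spectrum.
        """
        nu = np.linspace(-0.5, 0.5, len(spectrum))  # normalized frequency axis
        return spectrum * np.exp(-1j * (phi0 + phi1 * nu))
    ```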

  14. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    SciTech Connect

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.; Jin, Shuangshuang; Lin, Jenglung; Hauer, Matthew L.

    2010-02-28

    Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate the on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner, so that mode estimation can be performed reliably and promptly. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
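
    A textbook block-processing Prony sketch (linear prediction followed by polynomial rooting), not PNNL's recursive implementation, to show where modal frequencies and damping come from:

    ```python
    import numpy as np

    def prony_modes(x, dt, order):
        """Estimate damped-sinusoid modes from ringdown samples x (textbook Prony)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # 1) linear prediction: x[k] = a1*x[k-1] + ... + ap*x[k-p], least squares
        A = np.column_stack([x[order - j - 1:n - j - 1] for j in range(order)])
        coeffs, *_ = np.linalg.lstsq(A, x[order:], rcond=None)
        # 2) roots of the characteristic polynomial are the discrete-time poles
        poles = np.roots(np.concatenate(([1.0], -coeffs)))
        freq_hz = np.angle(poles) / (2 * np.pi * dt)  # modal frequency per pole
        damping = np.log(np.abs(poles)) / dt          # negative values mean decay
        return freq_hz, damping
    ```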

  15. Automatic generation of stop word lists for information retrieval and analysis

    DOEpatents

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
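
    A hedged re-implementation sketch of the ratio test described above, with tokenization simplified to whitespace splitting; the patent's threshold and truncation criteria are more elaborate:

    ```python
    from collections import Counter

    def generate_stoplist(documents, keywords, min_ratio=1.0, max_size=200):
        """Terms occurring next to keywords far more often than inside them
        carry little content and become stop-word candidates."""
        keyword_terms = {t for kw in keywords for t in kw.lower().split()}
        adjacency = Counter()  # occurrences adjacent to a keyword term
        inside = Counter()     # occurrences as part of a keyword
        for doc in documents:
            tokens = doc.lower().split()
            for i, tok in enumerate(tokens):
                if tok in keyword_terms:
                    inside[tok] += 1
                else:
                    neighbours = tokens[max(i - 1, 0):i] + tokens[i + 1:i + 2]
                    if any(t in keyword_terms for t in neighbours):
                        adjacency[tok] += 1
        candidates = [t for t in adjacency
                      if adjacency[t] / max(inside[t], 1) >= min_ratio]
        # truncate to the most frequent candidates
        candidates.sort(key=lambda t: adjacency[t], reverse=True)
        return candidates[:max_size]
    ```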

  16. [A brain tumor automatic assisted-diagnostic system based on medical image shape analysis].

    PubMed

    Wang, Li-Li; Yang, Jie

    2005-03-01

    This paper presents a brain tumor assisted-diagnosis system based on medical image analysis. The system supplements PACS functions such as the display of medical images and database queries. It segments slices in real time using a fuzzy region competition algorithm, extracts shape features such as contour label, compactness, moments, Fourier descriptors, chord length and radius, together with other medical data, from segmented brain tumor images with irregular contours, and then feeds them to a Bayesian network to classify the brain tumor, implementing automatic assisted diagnosis. PMID:16011110

  17. Automatic quantitative analysis of ultrasound tongue contours via wavelet-based functional mixed models.

    PubMed

    Lancia, Leonardo; Rausch, Philip; Morris, Jeffrey S

    2015-02-01

    This paper illustrates the application of wavelet-based functional mixed models to automatic quantification of differences between tongue contours obtained through ultrasound imaging. The reliability of this method is demonstrated through the analysis of tongue positions recorded from a female and a male speaker at the onset of the vowels /a/ and /i/ produced in the context of the consonants /t/ and /k/. The proposed method allows detection of significant differences between configurations of the articulators that are visible in ultrasound images during the production of different speech gestures and is compatible with statistical designs containing both fixed and random terms.

  18. Performance Analysis of Distributed Applications using Automatic Classification of Communication Inefficiencies

    SciTech Connect

    Vetter, J.

    1999-11-01

    We present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. Our method automatically classifies individual communication operations and it reveals the cause of communication inefficiencies in the application. This classification allows the developer to focus quickly on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, we trace the message operations of MPI applications and then classify each individual communication event using decision tree classification, a supervised learning technique. We train our decision tree using microbenchmarks that demonstrate both efficient and inefficient communication. Since our technique adapts to the target system's configuration through these microbenchmarks, we can simultaneously automate the performance analysis process and improve classification accuracy. Our experiments on four applications demonstrate that our technique can improve the accuracy of performance analysis, and dramatically reduce the amount of data that users must encounter.
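
    The decision-tree classification step has a direct scikit-learn analogue; the features and labels below are synthetic stand-ins for trace-derived microbenchmark data:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(2)
    # Stand-in features per MPI event: duration, message size, wait time
    X_train = rng.normal(size=(500, 3))
    y_train = (X_train[:, 2] > 0.5).astype(int)  # 1 = "inefficient" in this toy set

    clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
    labels = clf.predict(rng.normal(size=(10, 3)))  # classify new trace events
    ```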

  19. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating the resonance signatures over selected frequency bands to extract representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined in a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective automatic bearing fault detection method and provides a good basis for an integrated induction machine condition monitor.
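
    The kurtogram builds on spectral kurtosis; a hedged sketch of that building block (kurtosis of each STFT band's envelope over time), not the full kurtogram filter-bank tree:

    ```python
    import numpy as np
    from scipy.signal import stft
    from scipy.stats import kurtosis

    def spectral_kurtosis(x, fs, nperseg=256):
        """Kurtosis of each STFT band's magnitude over time."""
        f, _, Z = stft(x, fs=fs, nperseg=nperseg)
        sk = kurtosis(np.abs(Z), axis=1, fisher=True)  # one value per frequency bin
        return f, sk

    # The band with maximum spectral kurtosis is a candidate for envelope analysis:
    # f, sk = spectral_kurtosis(vibration, fs); f_best = f[np.argmax(sk)]
    ```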

  20. Urban land use of the Sao Paulo metropolitan area by automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Niero, M.; Foresti, C.

    1983-01-01

    The separability of urban land use classes in the metropolitan area of Sao Paulo was studied by means of automatic analysis of MSS/LANDSAT digital data. The data were analyzed using the K-means and MAXVER (maximum likelihood) classification algorithms. The land use classes obtained were: CBD/vertical growth area, residential area, mixed area, industrial area, embankment area type 1, embankment area type 2, dense vegetation area and sparse vegetation area. The spectral analysis of representative samples of urban land use classes was done using the "Single Cell" analysis option. The classes CBD/vertical growth area, residential area and embankment area type 2 showed better spectral separability than the other classes.

  1. Intercellular fluorescence background on microscope slides: some problems and solutions for automatic analysis

    NASA Astrophysics Data System (ADS)

    Piper, Jim; Sudar, Damir; Peters, Don; Pinkel, Daniel

    1994-05-01

    Although high contrast between signal and the dark background is often claimed as a major advantage of fluorescence staining in cytology and cytogenetics, in practice this is not always the case and in some circumstances the inter-cellular or, in the case of metaphase preparations, the inter-chromosome background can be both brightly fluorescent and vary substantially across the slide or even across a single metaphase. Bright background results in low image contrast, making automatic detection of metaphase cells more difficult. The background correction strategy employed in automatic search must both cope with variable background and be computationally efficient. The method employed in a fluorescence metaphase finder is presented, and the compromises involved are discussed. A different set of problems arise when the analysis is aimed at accurate quantification of the fluorescence signal. Some insight into the nature of the background in the case of comparative genomic hybridization is obtained by image analysis of data obtained from experiments using cell lines with known abnormal copy numbers of particular chromosome types.

  2. Automatic analysis of image of surface structure of cell wall-deficient EVC.

    PubMed

    Li, S; Hu, K; Cai, N; Su, W; Xiong, H; Lou, Z; Lin, T; Hu, Y

    2001-01-01

    Some computer applications for cell characterization in medicine and biology, such as the analysis of the surface structure of cell wall-deficient EVC (El Tor Vibrio of Cholera), operate on cell samples taken from very small areas of interest. To perform texture characterization in such an application, only a few texture operators can be employed: the operators should be insensitive to noise and image distortion and be reliable in estimating texture quality from images. We therefore apply wavelet theory and mathematical morphology to analyse cellular surface micro-area images obtained by SEM (Scanning Electron Microscope). To describe the quality of the surface structure of cell wall-deficient EVC, we propose a fully automatic computerized method. The image analysis process is carried out in two steps. In the first, we decompose the given image by dyadic wavelet transform and form an image approximation with higher resolution, thereby performing edge detection of the given images efficiently. In the second, we apply operations of mathematical morphology to obtain morphological quantitative parameters of the surface structure of cell wall-deficient EVC. The results show that the method can eliminate noise, detect edges and extract feature parameters reliably. In this work, we have built automatic analysis software named "EVC.CELL".

  3. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    NASA Astrophysics Data System (ADS)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, that enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory impedance cardiography (AICG) monitoring device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), both sampled at 200 Hz, and the base impedance signal (Z0), sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option of manual corrections, which may be necessary to avoid "false positive" recognitions. The application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.

  4. Semiautomated ROI analysis in dynamic MR studies. Part I: Image analysis tools for automatic correction of organ displacements.

    PubMed

    Gerig, G; Kikinis, R; Kuoni, W; von Schulthess, G K; Kübler, O

    1991-01-01

    The most important problem in the analysis of time sequences is the compensation for artifactual motion. Owing to motion, medical images of the abdominal region do not represent organs with fixed configuration. Analysis of organ function with dynamic contrast medium studies using regions of interest (ROIs) is thus not readily accomplished. Images of the organ of interest need to be registered and corrected prior to a detailed local analysis. We have developed an image analysis scheme that allows the automatic detection of the organ contours, the extraction of the motion parameters per frame, and the registration of images. The complete procedure requires only minimal user interaction and results in a readjusted image sequence, where organs of interest remain fixed. Both a visual analysis of the dynamic behavior of functional properties and a quantitative statistical analysis of signal intensity versus time within local ROIs are considerably facilitated using the corrected series.

  5. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    PubMed

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. PMID:24418010

  6. Techniques for automatic large scale change analysis of temporal multispectral imagery

    NASA Astrophysics Data System (ADS)

    Mercovich, Ryan A.

    Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring

  7. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  8. Glioma grading using apparent diffusion coefficient map: application of histogram analysis based on automatic segmentation.

    PubMed

    Lee, Jeongwon; Choi, Seung Hong; Kim, Ji-Hoon; Sohn, Chul-Ho; Lee, Sooyeul; Jeong, Jaeseung

    2014-09-01

    The accurate diagnosis of glioma subtypes is critical for appropriate treatment, but conventional histopathologic diagnosis often exhibits significant intra-observer variability and sampling error. The aim of this study was to investigate whether histogram analysis using an automatically segmented region of interest (ROI), excluding cystic or necrotic portions, could improve the differentiation between low-grade and high-grade gliomas. Thirty-two patients (nine low-grade and 23 high-grade gliomas) were included in this retrospective investigation. The outer boundaries of the entire tumors were manually drawn in each section of the contrast-enhanced T1 -weighted MR images. We excluded cystic or necrotic portions from the entire tumor volume. The histogram analyses were performed within the ROI on normalized apparent diffusion coefficient (ADC) maps. To evaluate the contribution of the proposed method to glioma grading, we compared the area under the receiver operating characteristic (ROC) curves. We found that an ROI excluding cystic or necrotic portions was more useful for glioma grading than was an entire tumor ROI. In the case of the fifth percentile values of the normalized ADC histogram, the area under the ROC curve for the tumor ROIs excluding cystic or necrotic portions was significantly higher than that for the entire tumor ROIs (p < 0.005). The automatic segmentation of a cystic or necrotic area probably improves the ability to differentiate between high- and low-grade gliomas on an ADC map. PMID:25042540
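
    The histogram feature and ROC comparison reduce to a percentile and an AUC; a sketch with synthetic 5th-percentile values standing in for the per-patient measurements:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    # Per-ROI feature would be np.percentile(adc_roi_values, 5); here synthetic:
    p5_low = rng.normal(1.3, 0.15, size=9)    # low-grade gliomas
    p5_high = rng.normal(1.0, 0.15, size=23)  # high-grade gliomas

    scores = np.concatenate([p5_low, p5_high])
    labels = np.concatenate([np.zeros(9), np.ones(23)])  # 1 = high grade
    # a lower 5th percentile suggests higher grade, so negate the score
    auc = roc_auc_score(labels, -scores)
    ```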

  9. Automatic brain tumour detection and neovasculature assessment with multiseries MRI analysis.

    PubMed

    Szwarc, Pawel; Kawa, Jacek; Rudzki, Marcin; Pietka, Ewa

    2015-12-01

    In this paper a novel multi-stage automatic method for brain tumour detection and neovasculature assessment is presented. First, brain symmetry is exploited to register the magnetic resonance (MR) series analysed. Then, the intracranial structures are found and the region of interest (ROI) is constrained within them to the tumour and peritumoural areas using the Fluid-Attenuated Inversion Recovery (FLAIR) series. Next, the contrast-enhanced lesions are detected on the basis of T1-weighted (T1W) differential images acquired before and after contrast medium administration. Finally, their vascularisation is assessed based on the Regional Cerebral Blood Volume (RCBV) perfusion maps. The relative RCBV (rRCBV) map is calculated in relation to healthy white matter, also found automatically, and visualised on the analysed series. Three main types of brain tumours, i.e. HG gliomas, metastases and meningiomas, have been subjected to the analysis. The results of contrast-enhanced lesion detection were compared with manual delineations performed independently by two experts, yielding 64.84% sensitivity, 99.89% specificity and a 71.83% Dice Similarity Coefficient (DSC) for twenty analysed studies of subjects diagnosed with brain tumours.
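
    The rRCBV computation itself is a voxel-wise normalization by the mean RCBV inside the detected healthy white matter mask; a minimal sketch:

    ```python
    import numpy as np

    def relative_rcbv(rcbv_map, white_matter_mask):
        """Voxel-wise RCBV relative to mean RCBV in healthy white matter."""
        reference = rcbv_map[white_matter_mask].mean()
        return rcbv_map / reference
    ```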

  10. Hardware and software system for automatic microemulsion assay evaluation by analysis of optical properties

    NASA Astrophysics Data System (ADS)

    Maeder, Ulf; Schmidts, Thomas; Burg, Jan-Michael; Heverhagen, Johannes T.; Runkel, Frank; Fiebich, Martin

    2010-03-01

    A new hardware device called the Microemulsion Analyzer (MEA), which facilitates the preparation and evaluation of microemulsions, was developed. Microemulsions, consisting of three phases (oil, surfactant and water) and prepared on deep-well plates according to the PDMPD method, can be evaluated automatically by means of their optical properties. The ratio of ingredients needed to form a microemulsion strongly depends on the properties and amounts of the ingredients used. A microemulsion assay is set up on deep-well plates to determine these ratios. The optical properties of the ingredients change from turbid to transparent as soon as a microemulsion is formed. The MEA consists of a frame and an image-processing and analysis algorithm. The frame itself is built from aluminum, an electroluminescent foil (ELF) and a camera. While the frame keeps the well plate at the correct position and angle, the ELF provides constant illumination of the plate from below. The camera provides an image that is processed by the algorithm to automatically evaluate the turbidity in the wells. Using the determined parameters, a phase diagram is created that visualizes the information. This setup can be used to analyze microemulsion assays and obtain results in a standardized way. In addition, it is possible to perform stability tests of the assay by creating differential stability diagrams after a period of time.

  11. Automatic geocoding of high-value targets using structural image analysis and GIS data

    NASA Astrophysics Data System (ADS)

    Soergel, Uwe; Thoennessen, Ulrich

    1999-12-01

    Geocoding based merely on navigation data and a sensor model is often not possible or not precise enough. In these cases, an improvement of the preregistration through image-based approaches is a solution. Due to the large amount of data in remote sensing, automatic geocoding methods are necessary. For geocoding purposes, appropriate tie points, which are present in both image and map, have to be detected and matched; the tie points form the basis of the transformation function. Assigning the tie points is a combinatorial problem whose complexity depends on their number. This number can be reduced, and the reliability of the tie points improved, by using structural tie points such as corners or crossings of prominent extended targets (e.g. harbors, airfields). Our approach extracts structural tie points independently in the image and in the vector map by model-based image analysis. The vector map is provided by a GIS using the ATKIS database. The model parameters are extracted from maps or collateral information about the scenario. The two sets of tie points are automatically matched with a Geometric Hashing algorithm. The algorithm was successfully applied to VIS, IR and SAR data.

  12. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-01

    Chromatographic background drift correction, which influences peak detection and time-shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers in this vector (points belonging to chromatographic peaks) and to update the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, the moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use.
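
    A hedged sketch of the three stages described (local-minima anchors, iterative outlier rejection, interpolation across the full axis); the outlier test here is a simple mean + k·std rule, standing in for the authors' optimization:

    ```python
    import numpy as np

    def correct_baseline(signal, n_iter=20, k=3.0):
        """Local-minima baseline estimation with iterative outlier removal."""
        signal = np.asarray(signal, dtype=float)
        # 1) local minima as initial baseline anchor points
        idx = np.where((signal[1:-1] < signal[:-2]) &
                       (signal[1:-1] < signal[2:]))[0] + 1
        base = signal[idx]
        keep = np.ones(len(idx), dtype=bool)
        # 2) iteratively discard anchors that sit on peaks (statistical outliers)
        for _ in range(n_iter):
            mu, sd = base[keep].mean(), base[keep].std()
            new_keep = base <= mu + k * sd
            if np.array_equal(new_keep, keep):
                break
            keep = new_keep
        # 3) expand to the full axis by linear interpolation and subtract
        x = np.arange(len(signal))
        baseline = np.interp(x, idx[keep], base[keep])
        return signal - baseline
    ```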

  13. Automatic computer-aided detection of prostate cancer based on multiparametric magnetic resonance image analysis

    NASA Astrophysics Data System (ADS)

    Vos, P. C.; Barentsz, J. O.; Karssemeijer, N.; Huisman, H. J.

    2012-03-01

    In this paper, a fully automatic computer-aided detection (CAD) method is proposed for the detection of prostate cancer. The CAD method consists of multiple sequential steps in order to detect locations that are suspicious for prostate cancer. In the initial stage, a voxel classification is performed using a Hessian-based blob detection algorithm at multiple scales on an apparent diffusion coefficient map. Next, a parametric multi-object segmentation method is applied and the resulting segmentation is used as a mask to restrict the candidate detection to the prostate. The remaining candidates are characterized by performing histogram analysis on multiparametric MR images. The resulting feature set is summarized into a malignancy likelihood by a supervised classifier in a two-stage classification approach. The detection performance for prostate cancer was tested on a screening population of 200 consecutive patients and evaluated using the free response operating characteristic methodology. The results show that the CAD method obtained sensitivities of 0.41, 0.65 and 0.74 at false positive (FP) levels of 1, 3 and 5 per patient, respectively. In conclusion, this study showed that it is feasible to automatically detect prostate cancer at a FP rate lower than systematic biopsy. The CAD method may assist the radiologist to detect prostate cancer locations and could potentially guide biopsy towards the most aggressive part of the tumour.
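
    scikit-image's determinant-of-Hessian detector is a readily available stand-in for the multi-scale Hessian-based blob detection step (not the authors' exact implementation); inversion is assumed because lesions appear dark on ADC maps:

    ```python
    import numpy as np
    from skimage.feature import blob_doh

    def candidate_blobs(adc_slice, max_sigma=15, threshold=0.01):
        """Multi-scale determinant-of-Hessian blob candidates on a 2-D ADC slice."""
        img = adc_slice.astype(float)
        # dark lesions on ADC: invert and rescale to [0, 1] before detection
        img = (img.max() - img) / (np.ptp(img) + 1e-9)
        return blob_doh(img, max_sigma=max_sigma, threshold=threshold)  # rows: (y, x, sigma)
    ```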

  14. Automatic computer-aided detection of prostate cancer based on multiparametric magnetic resonance image analysis.

    PubMed

    Vos, P C; Barentsz, J O; Karssemeijer, N; Huisman, H J

    2012-03-21

    In this paper, a fully automatic computer-aided detection (CAD) method is proposed for the detection of prostate cancer. The CAD method consists of multiple sequential steps in order to detect locations that are suspicious for prostate cancer. In the initial stage, a voxel classification is performed using a Hessian-based blob detection algorithm at multiple scales on an apparent diffusion coefficient map. Next, a parametric multi-object segmentation method is applied and the resulting segmentation is used as a mask to restrict the candidate detection to the prostate. The remaining candidates are characterized by performing histogram analysis on multiparametric MR images. The resulting feature set is summarized into a malignancy likelihood by a supervised classifier in a two-stage classification approach. The detection performance for prostate cancer was tested on a screening population of 200 consecutive patients and evaluated using the free response operating characteristic methodology. The results show that the CAD method obtained sensitivities of 0.41, 0.65 and 0.74 at false positive (FP) levels of 1, 3 and 5 per patient, respectively. In conclusion, this study showed that it is feasible to automatically detect prostate cancer at a FP rate lower than systematic biopsy. The CAD method may assist the radiologist to detect prostate cancer locations and could potentially guide biopsy towards the most aggressive part of the tumour.

  15. Semi-automatic system for UV images analysis of historical musical instruments

    NASA Astrophysics Data System (ADS)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. restored parts or areas treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is made by analyzing the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because the method can also detect points that users do not recognize as similar due to perceptual illusions. The application has been developed following usability guidelines, and its human-computer interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection kept by the "Museo del Violino" in Cremona.
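
    A sketch of the two-step selection under the simplifying assumption of fixed per-channel HSV tolerances (the tool's actual similarity criterion is not specified in the abstract):

    ```python
    import numpy as np
    from skimage.color import rgb2hsv

    def similar_color_mask(rgb_image, rect, tol=(0.03, 0.15, 0.15)):
        """Mask pixels whose HSV colour matches a user-selected rectangle.

        rect: (row0, row1, col0, col1); tol: per-channel HSV tolerances.
        """
        hsv = rgb2hsv(rgb_image)
        r0, r1, c0, c1 = rect
        target = hsv[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
        diff = np.abs(hsv - target)
        diff[..., 0] = np.minimum(diff[..., 0], 1.0 - diff[..., 0])  # hue wraps around
        return np.all(diff <= np.array(tol), axis=-1)
    ```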

  16. [SMEAC Newsletters, Science Education, Vol. 2, No. 2--Vol. 2, No. 3, 1969].

    ERIC Educational Resources Information Center

    ERIC Clearinghouse for Science, Mathematics, and Environmental Education, Columbus, OH.

    Each of these newsletters, produced by the ERIC Information Analysis Center for Science, Mathematics, and Environmental Education, contains information concerning center publications and activities, as well as other items considered of interest to researchers and educators of various educational levels. One of the emphases in Vol. 2, No. 2, is a…

  17. [SMEAC Newsletters, Science Education, Vol. 1, No. 1--Vol. 2, No. 1, 1967-1968].

    ERIC Educational Resources Information Center

    ERIC Clearinghouse for Science, Mathematics, and Environmental Education, Columbus, OH.

    Each of these newsletters, produced by the ERIC Information Analysis Center for Science, Mathematics, and Environmental Education, contains information concerning center publications and other items considered of interest to researchers and educators of various education levels. Vol. 1, No. 1 highlights selected bibliographies (no longer produced…

  18. Two-Stage Automatic Calibration and Predictive Uncertainty Analysis of a Semi-distributed Watershed Model

    NASA Astrophysics Data System (ADS)

    Lin, Z.; Radcliffe, D. E.; Doherty, J.

    2004-12-01

    monthly flow produced a very good fit to the measured data. Nash and Sutcliffe coefficients for daily and monthly flow over the calibration period were 0.60 and 0.86, respectively; they were 0.61 and 0.87, respectively, over the validation period. Regardless of the level of model-to-measurement fit, the nonuniqueness of the optimal parameter values makes uncertainty analysis necessary for model prediction. The nonlinear prediction uncertainty analysis showed that caution must be exercised when using the SWAT model to predict instantaneous peak flows. The free PEST (Parameter Estimation) software was used to conduct the two-stage automatic calibration and prediction uncertainty analysis of the SWAT model.
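
    The Nash-Sutcliffe coefficients quoted above follow the standard definition, which is a one-liner to compute; a minimal sketch with placeholder flow arrays:

    ```python
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        """NSE = 1 - SSE / variance of observations; 1 is a perfect fit and
        0 means the model does no better than the observed mean."""
        obs = np.asarray(observed, dtype=float)
        sim = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    # e.g. nash_sutcliffe(obs_monthly, sim_monthly) -> ~0.86 over calibration
    ```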

  19. Group-wise automatic mesh-based analysis of cortical thickness

    NASA Astrophysics Data System (ADS)

    Vachet, Clement; Cody Hazlett, Heather; Niethammer, Marc; Oguz, Ipek; Cates, Joshua; Whitaker, Ross; Piven, Joseph; Styner, Martin

    2011-03-01

    The analysis of neuroimaging data from pediatric populations presents several challenges. There are normal variations in brain shape from infancy to adulthood and normal developmental changes related to tissue maturation. Measurement of cortical thickness is one important way to analyze such developmental tissue changes. We developed a novel framework that allows group-wise automatic mesh-based analysis of cortical thickness. Our approach is divided into four main parts. First an individual pre-processing pipeline is applied on each subject to create genus-zero inflated white matter cortical surfaces with cortical thickness measurements. The second part performs an entropy-based group-wise shape correspondence on these meshes using a particle system, which establishes a trade-off between an even sampling of the cortical surfaces and the similarity of corresponding points across the population using sulcal depth information and spatial proximity. A novel automatic initial particle sampling is performed using a matched 98-lobe parcellation map prior to a particle-splitting phase. Third, corresponding re-sampled surfaces are computed with interpolated cortical thickness measurements, which are finally analyzed via a statistical vertex-wise analysis module. This framework consists of a pipeline of automated 3D Slicer compatible modules. It has been tested on a small pediatric dataset and incorporated in an open-source C++ based high-level module called GAMBIT. GAMBIT's setup allows efficient batch processing, grid computing and quality control. The current research focuses on the use of an average template for correspondence and surface re-sampling, as well as thorough validation of the framework and its application to clinical pediatric studies.

  20. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  1. Automatic quantitative analysis of t-tubule organization in cardiac myocytes using ImageJ.

    PubMed

    Pasqualin, Côme; Gannier, François; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2015-02-01

    The transverse tubule system in mammalian striated muscle is highly organized and contributes to optimal and homogeneous contraction. Diverse pathologies such as heart failure and atrial fibrillation involve disorganization of t-tubules and contractile dysfunction. Few tools are available for the quantification of the organization of the t-tubule system. We developed a plugin for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. This plugin (TTorg) analyzes raw confocal microscopy images. Analysis options include the whole image, specific regions of the image (cropping), and z-axis analysis of the same image. Batch analysis of a series of images with identical criteria is also available. There is no need either to reorient the specimen to the horizontal or to threshold the image before analysis. TTorg includes a synthetic "myocyte-like" image generator to test the plugin's efficiency under the user's own experimental conditions. The plugin was validated on synthetic images for different simulated cell characteristics and acquisition parameters. TTorg was able to detect significant differences in t-tubule organization in experimental data from ventricular myocytes isolated from wild-type and dystrophin-deficient mice. TTorg is freely distributed, and its source code is available. It provides a reliable, easy-to-use, automatic, and unbiased measurement of t-tubule organization in a wide variety of experimental conditions.
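
    TTorg's exact indices are defined in the paper and the plugin source; purely as a flavour of how t-tubule regularity can be quantified, one crude stand-in is the fraction of spectral power near the expected tubule spacing:

    ```python
    import numpy as np

    def regularity_index(img, pixels_per_period):
        """Row-wise power spectrum of an image of aligned cells; return the
        share of power near the expected t-tubule spacing. Not TTorg's algorithm."""
        rows = img - img.mean(axis=1, keepdims=True)       # remove DC per row
        power = np.abs(np.fft.rfft(rows, axis=1)) ** 2
        freqs = np.fft.rfftfreq(img.shape[1])              # cycles per pixel
        target = 1.0 / pixels_per_period
        band = (freqs > 0.8 * target) & (freqs < 1.2 * target)
        return power[:, band].sum() / power[:, 1:].sum()
    ```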

  2. Drift problems in the automatic analysis of gamma-ray spectra using associative memory algorithms

    SciTech Connect

    Olmos, P.; Diaz, J.C.; Perez, J.M.; Aguayo, P. ); Gomez, P.; Rodellar, V. )

    1994-06-01

    Perturbations affecting nuclear radiation spectrometers during their operation frequently spoil the accuracy of automatic analysis methods. One of the problems usually found in practice concerns fluctuations in the spectrum gain and zero, produced by drifts in the detector and nuclear electronics. The pattern acquired under these conditions may be significantly different from that expected with stable instrumentation, thus complicating the identification and quantification of the radionuclides present in it. In this work, the performance of Associative Memory algorithms when dealing with spectra affected by drifts is explored, considering a linear energy-calibration function. The formulation of the extended algorithm, constructed to quantify the possible presence of drifts in the spectrometer, is derived, and the results obtained from its application to several practical cases are discussed.
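
    The linear gain/zero drift model can be illustrated independently of the associative-memory formulation: a brute-force search for the calibration that best re-aligns a drifted spectrum with a stable reference (the spectra and search grids here are hypothetical):

    ```python
    import numpy as np

    def estimate_drift(measured, reference, gains, zeros):
        """Grid-search the linear channel map ch' = gain * ch + zero that
        maximises the cosine similarity with the reference spectrum."""
        ch = np.arange(len(reference), dtype=float)
        best, best_score = (1.0, 0.0), -np.inf
        for g in gains:
            for z in zeros:
                warped = np.interp(g * ch + z, ch, measured, left=0.0, right=0.0)
                score = warped @ reference / (np.linalg.norm(warped)
                                              * np.linalg.norm(reference) + 1e-12)
                if score > best_score:
                    best, best_score = (g, z), score
        return best   # estimated (gain, zero) of the drifted spectrometer

    # spec, ref: measured and reference spectra of equal length
    # gain, zero = estimate_drift(spec, ref, np.linspace(0.95, 1.05, 41),
    #                             np.linspace(-5, 5, 21))
    ```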

  3. SAUNA—a system for automatic sampling, processing, and analysis of radioactive xenon

    NASA Astrophysics Data System (ADS)

    Ringbom, A.; Larson, T.; Axelsson, A.; Elmgren, K.; Johansson, C.

    2003-08-01

    A system for automatic sampling, processing, and analysis of atmospheric radioxenon has been developed. From an air sample of about 7 m³ collected during 12 h, 0.5 cm³ of xenon is extracted, and the atmospheric activities from the four xenon isotopes 133Xe, 135Xe, 131mXe, and 133mXe are determined with a beta-gamma coincidence technique. The collection is performed using activated charcoal and molecular sieves at ambient temperature. The sample preparation and quantification are performed using preparative gas chromatography. The system was tested under routine conditions for a 5-month period, with average minimum detectable concentrations below 1 mBq/m³ for all four isotopes.

  4. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. By computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method can also rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
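
    The paper's ASA scheme is tied to the APA model equations, but the generic idea of automatic step adjustment, estimating the local error and growing or shrinking the step accordingly, can be sketched with step doubling (Euler steps for brevity):

    ```python
    def integrate_adaptive(f, y0, z_end, h0=1e-3, tol=1e-6):
        """Advance y' = f(z, y): compare one full step with two half steps and
        adapt the step size h to keep the local error below tol."""
        z, y, h = 0.0, float(y0), h0
        while z < z_end:
            h = min(h, z_end - z)
            full = y + h * f(z, y)
            half = y + 0.5 * h * f(z, y)
            two_half = half + 0.5 * h * f(z + 0.5 * h, half)
            if abs(two_half - full) <= tol or h <= 1e-12:
                z, y = z + h, two_half
                if abs(two_half - full) < 0.1 * tol:
                    h *= 2.0          # error is tiny: take larger steps
            else:
                h *= 0.5              # error too large: refine and retry
        return y
    ```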

  5. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens.

    PubMed

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-04-01

    This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear-cytoplasmic ratio, nuclear density, and number of layers from the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method proved effective for quantification of the Edmondson grade. PMID:27335894

  6. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    NASA Astrophysics Data System (ADS)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced that have been proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual and hence time consuming and prone to human error. In this research we propose an automatic image analysis based approach to measure the size of an ulcer and subsequently to determine the effectiveness of any treatment process followed. In ophthalmology, an ulcer area is detected for further inspection via luminous excitation of a dye. Usually in the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye) the ulcer area is excited to be luminous green in colour, as compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially a pre-processing stage that carries out a local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly we deal with the removal of potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise, and image quality degradation. The ratio of the ulcer area confined within the corneal area to the corneal area itself is used as a measure of comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of ulcer size over time.

  7. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    PubMed

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis of 4-chamber (4CH) cine MR imaging, and to investigate the agreement between semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy or repaired tetralogy of Fallot who underwent cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated that the optimal cutoffs of the minimum longitudinal strain value (εL_min) diagnosed LV and RV dysfunction with high accuracy (LV εL_min = -7.8 %: area under the curve, 0.89; sensitivity, 83 %; specificity, 91 %; RV εL_min = -15.7 %: area under the curve, 0.82; sensitivity, 92 %; specificity, 68 %). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97, p < 0.01; RV, r = 0.79, p < 0.01). Our semi-automatic longitudinal strain analysis of 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.
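
    A sketch of the two computations behind such numbers, the Lagrangian strain definition and an optimal-cutoff search (here via Youden's J, which may differ from the authors' criterion; the data arrays are placeholders):

    ```python
    import numpy as np

    def longitudinal_strain(l0, l):
        """Lagrangian strain in %: negative values indicate shortening."""
        return 100.0 * (l - l0) / l0

    def best_cutoff(strain_min, dysfunction):
        """Cutoff maximising Youden's J = sensitivity + specificity - 1;
        less-negative minimum strain is treated as abnormal."""
        v, y = np.asarray(strain_min, float), np.asarray(dysfunction, bool)
        best, best_j = None, -1.0
        for c in np.unique(v):
            pred = v >= c
            j = (pred & y).sum() / y.sum() + (~pred & ~y).sum() / (~y).sum() - 1.0
            if j > best_j:
                best, best_j = c, j
        return best   # e.g. about -7.8 % for the LV in this study
    ```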

  8. Automatic Differentiation Package

    SciTech Connect

    Gay, David M.; Phipps, Eric; Bartlett, Roscoe

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization and uncertainty quantification.
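
    Sacado itself is C++, but the operator-overloading idea behind forward-mode automatic differentiation can be sketched in a few lines of Python with dual numbers:

    ```python
    class Dual:
        """Forward-mode AD value: carries f and df/dx through arithmetic."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.der * o.val + self.val * o.der)   # product rule
        __rmul__ = __mul__

    x = Dual(3.0, 1.0)            # seed dx/dx = 1
    y = x * x + 2.0 * x + 1.0     # y = x**2 + 2x + 1
    print(y.val, y.der)           # 16.0 8.0, i.e. dy/dx = 2x + 2 at x = 3
    ```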

  9. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool.
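
    The pipeline described (principal components of the purchase data, a linear model for BMI, leave-one-out validation) maps directly onto scikit-learn; a sketch with synthetic stand-in data, since the cafeteria records are not public:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(899, 20))                    # per-employee purchase features
    bmi = rng.normal(loc=22.0, scale=3.0, size=899)   # outcome

    model = make_pipeline(PCA(n_components=5), LinearRegression())
    pred = cross_val_predict(model, X, bmi, cv=LeaveOneOut())
    agreement = np.mean((pred >= 23) == (bmi >= 23))  # "would-be obese" accuracy
    ```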

  10. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    NASA Astrophysics Data System (ADS)

    Sharifi, Hamid; Larouche, Daniel

    2015-09-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during the casting. Predicting the evolution of these stresses with accuracy in the solidification interval should be very helpful in avoiding the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique of the heterogeneous semi-solid material for a finite element analysis at the microscopic level. This task is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated, surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected in the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium-copper alloy (Al-5.8 wt% Cu) when the fraction solid is 0.92. Using the finite element method and the Mie-Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges, which are formed during the tensile loading, have been detected.

  11. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    PubMed

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. PMID:20534329

  12. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must examine.
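
    A toy version of the classification step with scikit-learn; the feature names and labels here are invented, whereas the patent trains on features extracted from microbenchmark traces:

    ```python
    from sklearn.tree import DecisionTreeClassifier

    # hypothetical per-event features: [message_bytes, wait_time_us, rank_distance]
    X_train = [[1024, 5, 1], [1024, 900, 1], [65536, 40, 4], [65536, 3000, 4]]
    y_train = ["efficient", "late_sender", "efficient", "late_receiver"]

    clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
    print(clf.predict([[2048, 750, 1]]))   # classify a traced MPI event
    ```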

  13. Automatic Structure Analysis in High-Throughput Characterization of Porous Materials.

    PubMed

    Haranczyk, Maciej; Sethian, James A

    2010-11-01

    Inspection of the structure and the void space of a porous material is a critical step in most computational studies involving guest molecules. Some sections of the void space, like inaccessible pockets, have to be identified and blocked in molecular simulations. These pockets are typically detected by visual analysis of the geometry, potential or free energy landscapes, or a histogram of an initial molecular simulation. Such visual analysis is time-consuming and inhibits characterization of large sets of materials required in studies focused on identification of the best materials for a given application. We present an automatic approach that bypasses manual visual analysis of this kind, thereby enabling execution of molecular simulations in an unsupervised, high-throughput manner. In our approach, we used a partial differential equations-based front propagation technique to segment out channels and inaccessible pockets of a periodic unit cell of a material. We cast the problem as a path planning problem in 3D space representing a periodic fragment of porous material, and solve the resulting Eikonal equation by using Fast Marching Methods. One attractive feature of this approach is that the to-be-analyzed data can be of varying types, including, for example, a 3D grid representing the distance to the material's surface, the potential or free energy of a molecule inside the material, or even a histogram (a set of snapshots) from a molecular simulation showing areas which were visited by the molecule during the simulation. PMID:26617098
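
    The paper segments the void space by solving an Eikonal equation with Fast Marching Methods; a far cruder connectivity check with SciPy conveys the end goal, splitting void voxels into accessible channels and inaccessible pockets (the grid file and seed voxel are hypothetical, and periodic boundaries are ignored):

    ```python
    import numpy as np
    from scipy import ndimage

    void = np.load("void_grid.npy")        # True where a probe molecule fits
    labels, n = ndimage.label(void)        # connected components of void space

    seed = (0, 0, 0)                       # voxel known to lie in a channel
    pockets = void & (labels != labels[seed])
    # 'pockets' voxels would be blocked before running molecular simulations
    ```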

  14. Automatic target detection algorithm for foliage-penetrating ultrawideband SAR data using split spectral analysis

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Kapoor, Ravinder; Ressler, Marc A.

    1999-07-01

    We present an automatic target detection (ATD) algorithm for foliage-penetrating (FOPEN) ultra-wideband (UWB) synthetic aperture radar (SAR) data using split spectral analysis. Split spectral analysis is commonly used in the ultrasonic, non-destructive evaluation of materials using wide-band pulses for flaw detection. In this paper, we show the application of split spectral analysis to detecting targets obscured by foliage using UWB pulse returns. To discriminate targets from foliage, the data spectrum is split into several bands, namely 20 to 75, 75 to 150, ..., 825 to 900 MHz. An ATD algorithm is developed based on the relative energy levels in the various bands, the number of bands containing significant energy (spread of energy), and chip size (number of cross-range and range bins). The algorithm was tested on FOPEN UWB SAR data of foliage and of vehicles obscured by foliage collected at Aberdeen Proving Ground, MD. The paper presents the various split spectral parameters used in the algorithm and discusses the rationale for their use.
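
    A minimal sketch of the split-spectrum step, sub-band energies from an FFT of a pulse return; the sampling rate, the test signal and the significance threshold are illustrative:

    ```python
    import numpy as np

    def band_energies(pulse, fs, edges_mhz):
        """Energy of the return in each frequency band (lo, hi) in MHz."""
        power = np.abs(np.fft.rfft(pulse)) ** 2
        f = np.fft.rfftfreq(len(pulse), d=1.0 / fs) / 1e6
        return np.array([power[(f >= lo) & (f < hi)].sum() for lo, hi in edges_mhz])

    # the bands used in the paper: 20-75, 75-150, ..., 825-900 MHz
    edges = [(20, 75)] + [(75 * i, 75 * (i + 1)) for i in range(1, 12)]
    e = band_energies(np.random.randn(1024), fs=2e9, edges_mhz=edges)
    spread = int((e > 0.1 * e.max()).sum())   # bands with significant energy
    ```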

  15. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated, and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of some examples of fragmental impactites.
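
    The measurements listed (areal fractions, clast sizes, orientations) correspond closely to what scikit-image's regionprops provides; a sketch, assuming a binary mask of one component (e.g. the clasts) has already been segmented:

    ```python
    import numpy as np
    from skimage import measure

    def component_stats(mask, mm_per_pixel=0.1):
        """Areal fraction plus per-object size and orientation descriptors."""
        props = measure.regionprops(measure.label(mask))
        areal_fraction = mask.sum() / mask.size
        sizes_mm = [p.equivalent_diameter * mm_per_pixel for p in props]
        orientations = [p.orientation for p in props]   # clast-preferred orientation
        return areal_fraction, sizes_mm, orientations
    ```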

  16. Automatic generation of skeletal mechanisms for ignition combustion based on level of importance analysis

    SciTech Connect

    Loevaas, Terese

    2009-07-15

    A level of importance (LOI) selection parameter is employed in order to identify species with generally low importance to the overall accuracy of a chemical model. This enables elimination of the minor reaction paths in which these species are involved. The generation of such skeletal mechanisms is performed automatically in a pre-processing step ranking species according to their level of importance. This selection criterion is a combined parameter based on a time scale and sensitivity analysis, identifying both short-lived species and species with respect to which the observable of interest has low sensitivity. In this work a careful element flux analysis demonstrates that such species do not interact in major reaction paths. Employing the LOI procedure replaces the previous method of identifying redundant species through a two-step procedure involving a reaction flow analysis followed by a sensitivity analysis. The flux analysis is performed using DARS©, a digital analysis tool for modelling reactive systems. Simplified chemical models are generated based on a detailed ethylene mechanism involving 111 species and 784 reactions (1566 forward and backward reactions) proposed by Wang et al. Eliminating species from detailed mechanisms introduces errors in the predicted combustion parameters. In the present work these errors are systematically studied for a wide range of conditions, including temperature, pressure and mixtures. Results show that the accuracy of simplified models is particularly lowered when the initial temperatures are close to the transition between low- and high-temperature chemistry. A speed-up factor of 5 is observed when using a simplified model containing only 27% of the original species and 19% of the original reactions. (author)

  17. A new and fast methodology to assess oxidative damage in cardiovascular diseases risk development through eVol-MEPS-UHPLC analysis of four urinary biomarkers.

    PubMed

    Mendes, Berta; Silva, Pedro; Mendonça, Isabel; Pereira, Jorge; Câmara, José S

    2013-11-15

    In this work, a new, fast and reliable methodology using a digitally controlled microextraction by packed sorbent (eVol®-MEPS) followed by ultra-high pressure liquid chromatography (UHPLC) analysis with photodiode array (PDA) detection was developed to establish the urinary profile levels of four putative oxidative stress biomarkers (OSBs) in healthy subjects and patients evidencing cardiovascular diseases (CVDs). This data was used to verify the suitability of the selected OSBs (uric acid-UAc, malondialdehyde-MDA, 5-(hydroxymethyl)uracil-5-HMUra and 8-hydroxy-2'-deoxyguanosine-8-oxodG) as potential biomarkers of CVDs progression. Important parameters affecting the efficiency of the extraction process were optimized, particularly stationary phase selection, pH influence, sample volume, number of extraction cycles, and washing and elution volumes. The experimental conditions that allowed the best extraction efficiency, expressed in terms of total area of the target analytes and data reproducibility, include a 10-fold dilution and pH adjustment of the urine samples to 6.0, followed by gradient elution through the C8 adsorbent with 5×50 µL of 0.01% formic acid and 3×50 µL of 20% methanol in 0.01% formic acid. The chromatographic separation of the target analytes was performed with a HSS T3 column (100 mm × 2.1 mm, 1.7 µm particle size) using 0.01% formic acid and 20% methanol at 250 µL min⁻¹. The methodology was validated in terms of selectivity, linearity, instrumental limit of detection (LOD), method limit of quantification (LOQ), matrix effect, accuracy and precision (intra- and inter-day). Good results were obtained in terms of selectivity and linearity (r² > 0.9906), as well as the LOD and LOQ, whose values were low, ranging from 0.00005 to 0.72 µg mL⁻¹ and 0.00023 to 2.31 µg mL⁻¹, respectively. The recovery results (91.1-123.0%), intra-day (1.0-8.3%) and inter-day precision (4.6-6.3%) and the matrix effect (60.1-110.3%) of eVol
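
    LOD and LOQ figures of this kind are commonly derived from the calibration curve; a sketch of the standard 3.3σ/m and 10σ/m formulas (the paper's exact procedure may differ):

    ```python
    import numpy as np

    def lod_loq(conc, response):
        """LOD = 3.3*s/m and LOQ = 10*s/m, with m the calibration slope and
        s the residual standard deviation of the linear fit."""
        m, b = np.polyfit(conc, response, 1)
        resid = np.asarray(response, float) - (m * np.asarray(conc, float) + b)
        s = resid.std(ddof=2)              # n - 2 degrees of freedom
        return 3.3 * s / m, 10.0 * s / m
    ```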

  18. Automatic Analysis of Facial Affect: A Survey of Registration, Representation, and Recognition.

    PubMed

    Sariyanidi, Evangelos; Gunes, Hatice; Cavallaro, Andrea

    2015-06-01

    Automatic affect analysis has attracted great interest in various contexts including the recognition of action units and basic or non-basic emotions. In spite of major efforts, there are several open questions on what the important cues to interpret facial expressions are and how to encode them. In this paper, we review the progress across a range of affect recognition applications to shed light on these fundamental questions. We analyse the state-of-the-art solutions by decomposing their pipelines into fundamental components, namely face registration, representation, dimensionality reduction and recognition. We discuss the role of these components and highlight the models and new trends that are followed in their design. Moreover, we provide a comprehensive analysis of facial representations by uncovering their advantages and limitations; we elaborate on the type of information they encode and discuss how they deal with the key challenges of illumination variations, registration errors, head-pose variations, occlusions, and identity bias. This survey allows us to identify open issues and to define future directions for designing real-world affect recognition systems. PMID:26357337

  19. Automatic roof plane detection and analysis in airborne lidar point clouds for solar potential assessment.

    PubMed

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature to decompose the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification. It results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The presented method detected fully automatically a subset of 809 out of 1,071 roof planes where the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m².
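
    Per-point normal vectors of the kind used for the plane segmentation can be estimated by fitting a plane to each point's neighbourhood; a compact NumPy/SciPy sketch (the neighbourhood size k is illustrative):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def point_normals(xyz, k=10):
        """Normal of each lidar point = direction of least variance of its
        k nearest neighbours (local plane fit via SVD)."""
        _, idx = cKDTree(xyz).query(xyz, k=k)
        normals = np.empty_like(xyz)
        for i, nb in enumerate(idx):
            centred = xyz[nb] - xyz[nb].mean(axis=0)
            _, _, vt = np.linalg.svd(centred, full_matrices=False)
            normals[i] = vt[-1]
        return normals   # near-parallel normals group into planar roof patches
    ```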

  20. Automatic Behavior Analysis During a Clinical Interview with a Virtual Human.

    PubMed

    Rizzo, Albert; Lucas, Gale; Gratch, Jonathan; Stratou, Giota; Morency, Louis-Philippe; Chavez, Kenneth; Shilling, Russ; Scherer, Stefan

    2016-01-01

    SimSensei is a Virtual Human (VH) interviewing platform that uses off-the-shelf sensors (i.e., webcams, Microsoft Kinect and a microphone) to capture and interpret real-time audiovisual behavioral signals from users interacting with the VH system. The system was specifically designed for clinical interviewing and health care support by providing a face-to-face interaction between a user and a VH that can automatically react to the inferred state of the user through analysis of behavioral signals gleaned from the user's facial expressions, body gestures and vocal parameters. Akin to how non-verbal behavioral signals have an impact on human-to-human interaction and communication, SimSensei aims to capture and infer user state from signals generated by the user's non-verbal communication, to improve engagement between the VH and the user, and to quantify user state from the data captured across a 20-minute interview. Results from a sample of service members (SMs) who were interviewed before and after a deployment to Afghanistan indicate that SMs reveal more PTSD symptoms to the VH than they report on the Post Deployment Health Assessment. Pre/post-deployment facial expression analysis indicated more sad expressions and fewer happy expressions at post-deployment. PMID:27046598

  1. [Development of a Japanese version of the Valuation of Life (VOL) scale].

    PubMed

    Nakagawa, Takeshi; Gondo, Yasuyuki; Masui, Yukie; Ishioka, Yoshiko; Tabuchi, Megumi; Kamide, Kei; Ikebe, Kazunori; Arai, Yasumichi; Takahashi, Ryutaro

    2013-04-01

    This study developed a Japanese version of the Valuation of Life (VOL) scale to measure psychological well-being among older adults. In Analysis 1, we conducted a factor analysis of 13 items and identified two factors: positive VOL and spiritual well-being. These factors had adequate degrees of internal consistency and were related to positive mental health. In Analysis 2, we examined sociodemographic, social, and health predictors of VOL. The role of social factors was stronger than that of health factors, and spiritual well-being was more related to moral or religious activities than positive VOL was. These results suggest that predictors of VOL vary by culture. In Analysis 3, we investigated the relationship between VOL and desired years of life. Positive VOL significantly predicted more desired years of life, whereas spiritual well-being did not. Positive VOL had acceptable reliability and validity. Future research is required to investigate whether VOL predicts survival duration or end-of-life decisions. PMID:23705232

  2. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    PubMed

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
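
    Not aa's actual API, but the engine's core bookkeeping, running modules in dependency order and skipping stages already completed, can be sketched with Python's standard library:

    ```python
    from graphlib import TopologicalSorter

    # hypothetical module graph: each module lists its upstream dependencies
    pipeline = {
        "realign": [],
        "normalise": ["realign"],
        "smooth": ["normalise"],
        "firstlevel_stats": ["smooth"],
        "grouplevel_stats": ["firstlevel_stats"],
    }
    done = {"realign"}                      # e.g. restored from a previous run

    for module in TopologicalSorter(pipeline).static_order():
        if module not in done:
            print("running", module)        # a real engine would dispatch jobs
    ```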

  3. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    PubMed Central

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J.; Wild, Conor J.; Auer, Tibor; Linke, Annika C.; Peelle, Jonathan E.

    2015-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address. PMID:25642185

  4. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    PubMed

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, from simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address. PMID:25642185

  5. An empirical analysis of the methodology of automatic imitation research in a strategic context.

    PubMed

    Aczel, Balazs; Kekecs, Zoltan; Bago, Bence; Szollosi, Aba; Foldes, Andrei

    2015-08-01

    Since the discovery of the mirror neuron system, it has been proposed that the automatic tendency to copy observed actions exists in humans and that this mechanism might be responsible for a range of social behavior. A strong argument for automatic behavior can be made when actions are executed against motivation to do otherwise. Strategic games in which imitation is disadvantageous serve as ideal designs for studying the automatic nature of participants' behavior. Most recently, Belot, Crawford, and Heyes (2013) conducted an explorative study using a modified version of the Rock-Paper-Scissors game, and suggested that in the case of asynchrony in the execution of the gestures, automatic imitation can be observed early on after the opponent's presentation. In our study, we video recorded the games, which allowed us to examine the effect of delay on imitative behavior as well as the sensitivity of the previously employed analyses. The examination of the recorded images revealed that more than 80% of the data were irrelevant to the study of automatic behavior. Additional bias in the paradigm became apparent, as previously presented gestures were found to affect the behavior of the players. After noise filtering, we found no evidence of automatic imitation in either the whole filtered data set or in selected time windows based on delay length. Besides questioning the strength of the results of previous analyses, we propose several experimental and statistical modifications for further research on automatic imitation. PMID:26010594

  6. An empirical analysis of the methodology of automatic imitation research in a strategic context.

    PubMed

    Aczel, Balazs; Kekecs, Zoltan; Bago, Bence; Szollosi, Aba; Foldes, Andrei

    2015-08-01

    Since the discovery of the mirror neuron system, it has been proposed that the automatic tendency to copy observed actions exists in humans and that this mechanism might be responsible for a range of social behavior. A strong argument for automatic behavior can be made when actions are executed against motivation to do otherwise. Strategic games in which imitation is disadvantageous serve as ideal designs for studying the automatic nature of participants' behavior. Most recently, Belot, Crawford, and Heyes (2013) conducted an explorative study using a modified version of the Rock-Paper-Scissors game, and suggested that in the case of asynchrony in the execution of the gestures, automatic imitation can be observed early on after the opponent's presentation. In our study, we video recorded the games, which allowed us to examine the effect of delay on imitative behavior as well as the sensitivity of the previously employed analyses. The examination of the recorded images revealed that more than 80% of the data were irrelevant to the study of automatic behavior. Additional bias in the paradigm became apparent, as previously presented gestures were found to affect the behavior of the players. After noise filtering, we found no evidence of automatic imitation in either the whole filtered data set or in selected time windows based on delay length. Besides questioning the strength of the results of previous analyses, we propose several experimental and statistical modifications for further research on automatic imitation.

  7. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% was suitable grazing land.

  8. Automatic differentiation bibliography

    SciTech Connect

    Corliss, G.F.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  9. Automatic Imitation

    ERIC Educational Resources Information Center

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  10. Automatic analysis of selected choroidal diseases in OCT images of the eye fundus

    PubMed Central

    2013-01-01

    Introduction: This paper describes a method for automatic analysis of the choroid in OCT images of the eye fundus in ophthalmology. Vascular lesions occur, for example, in a large population of patients with diabetes or macular degeneration. Their correct diagnosis and a quantitative assessment of treatment progress are a critical part of eye fundus diagnosis. Material and method: The study analysed about 1,000 OCT images acquired using SOCT Copernicus (Optopol Tech. SA, Zawiercie, Poland). The proposed algorithm enabled analysis of the texture of the choroid portion located beneath the RPE (Retinal Pigment Epithelium) layer. The analysis was performed using a purpose-built algorithm based on morphological and texture analysis and a classifier in the form of decision trees. Results: The locations of the centres of gravity of individual objects present in the image beneath the RPE layer proved to be important in the evaluation of different types of images. In addition, the value of the standard deviation and the number of objects in a scene were equally important. These features enabled classification of three different forms of the choroid related to retinal pathology: diabetic edema (the classification gave accuracy ACC1 = 0.73), ischemia of the inner retinal layers (ACC2 = 0.83) and scarring fibrovascular tissue (ACC3 = 0.69). For the pruned decision tree the results were as follows: ACC1 = 0.76, ACC2 = 0.81, ACC3 = 0.68. Conclusions: The decision tree gave satisfactory classification results for the three types of choroidal imaging. In addition, it was shown that for the assumed characteristics and the developed classifier, the location of the B-scan does not significantly affect the results. The texture analysis method presented in the paper confirmed its usefulness in choroid imaging. Currently the application is further studied in the Clinical Department

  11. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NASA Astrophysics Data System (ADS)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.

  12. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not satisfy normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for badly textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performance of the SIFT operator is compared with that of the feature extraction and matching techniques traditionally used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed in order to improve its performance in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large-scale aerial images acquired using mini-UAV systems. PMID:22412336
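
    For reference, SIFT tie-point candidates of the kind evaluated here can be produced with OpenCV (the file names are placeholders, and the authors used their own implementation):

    ```python
    import cv2

    img1 = cv2.imread("aerial_left.tif", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("aerial_right.tif", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Lowe's ratio test keeps distinctive matches -> candidate tie points
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.8 * n.distance]
    ```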

  13. Automatic extraction of faults and fractal analysis from remote sensing data

    NASA Astrophysics Data System (ADS)

    Gloaguen, R.; Marpu, P. R.; Niemeyer, I.

    2007-03-01

    Object-based classification is a promising technique for image classification. Unlike pixel-based methods, which use only the measured radiometric values, object-based techniques can also use shape and context information of scene textures. These extra degrees of freedom provided by the objects allow the automatic identification of geological structures. In this article, we present an evaluation of object-based classification in the context of the extraction of geological faults. Digital elevation models and radar data of an area near Lake Magadi (Kenya) have been processed. We then determine the statistics of the fault populations. The fractal dimensions of the fault populations are similar to the fractal dimensions measured directly on remote sensing images of the study area using power spectra (PSD) and variograms. These methods allow unbiased statistics of faults and help us to understand the evolution of fault systems in extensional domains. Furthermore, the direct analysis of image texture is a good indicator of the fault statistics and allows us to classify the intensity and type of deformation. We propose that extensional fault networks can be modeled by iterated function systems (IFS).
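
    A compact box-counting estimate of the fractal dimension of a binary fault map, one of several possible estimators alongside the power spectra and variograms mentioned above (assumes the mask is non-empty at every box size):

    ```python
    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32, 64)):
        """Slope of log N(s) against log(1/s), where N(s) is the number of
        s-by-s boxes containing at least one fault pixel."""
        counts = []
        for s in sizes:
            h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope
    ```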

  14. Automatic aerial image shadow detection through the hybrid analysis of RGB and HIS color space

    NASA Astrophysics Data System (ADS)

    Wu, Jun; Li, Huilin; Peng, Zhiyong

    2015-12-01

    This paper presents our research on automatic shadow detection from high-resolution aerial images through the hybrid analysis of RGB and HIS color space. To this end, the spectral characteristics of shadows are first discussed, and three spectral components - the difference between the normalized blue and normalized red components (BR), intensity, and saturation - are selected as criteria to obtain an initial segmentation of the shadow region (called primary segmentation). After that, within the normalized RGB color space and the HIS color space, the shadow region is extracted again (called auxiliary segmentation) using the Otsu operation, respectively. Finally, the primary segmentation and auxiliary segmentation are combined through a logical AND operation to obtain a reliable shadow region. In this step, small shadow areas are removed from the combined shadow region and morphological algorithms are applied to fill small holes as well. The experimental results show that the proposed approach can effectively detect the shadow region from high-resolution aerial images with a high degree of automation.
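
    A loose sketch of the hybrid segmentation idea follows, using scikit-image with HSV standing in for HIS; the exact criteria (BR difference, intensity, saturation thresholds) are simplified and the minimum-area value is an assumption, not the authors' setting.

    import numpy as np
    from skimage import color, filters, morphology

    def detect_shadows(img, min_area=64):
        # img: RGB image as floats in [0, 1].
        rgb_sum = img.sum(axis=2) + 1e-6
        br = img[..., 2] / rgb_sum - img[..., 0] / rgb_sum  # normalized B - R
        hsv = color.rgb2hsv(img)
        saturation, intensity = hsv[..., 1], hsv[..., 2]

        # Primary segmentation: dark pixels with a high blue-red difference.
        primary = (br > filters.threshold_otsu(br)) & \
                  (intensity < filters.threshold_otsu(intensity))
        # Auxiliary segmentation via Otsu on saturation, then AND-combination.
        auxiliary = saturation > filters.threshold_otsu(saturation)
        shadows = morphology.remove_small_objects(primary & auxiliary, min_area)
        # Morphological hole filling, as in the final step described above.
        return morphology.remove_small_holes(shadows, min_area)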

  15. Gene Ontology density estimation and discourse analysis for automatic GeneRiF extraction

    PubMed Central

    Gobeill, Julien; Tbahriti, Imad; Ehrler, Frédéric; Mottaz, Anaïs; Veuthey, Anne-Lise; Ruch, Patrick

    2008-01-01

    Background This paper describes and evaluates a sentence selection engine that extracts a GeneRiF (Gene Reference into Functions) as defined in ENTREZ-Gene based on a MEDLINE record. Inputs for this task include both a gene and a pointer to a MEDLINE reference. In the suggested approach we merge two independent sentence extraction strategies. The first proposed strategy (LASt) uses argumentative features, inspired by discourse-analysis models. The second extraction scheme (GOEx) uses an automatic text categorizer to estimate the density of Gene Ontology categories in every sentence; thus providing a full ranking of all possible candidate GeneRiFs. A combination of the two approaches is proposed, which also aims at reducing the size of the selected segment by filtering out non-content bearing rhetorical phrases. Results Based on the TREC-2003 Genomics collection for GeneRiF identification, the LASt extraction strategy is already competitive (52.78%). When used in a combined approach, the extraction task clearly shows improvement, achieving a Dice score of over 57% (+10%). Conclusions Argumentative representation levels and conceptual density estimation using Gene Ontology contents appear complementary for functional annotation in proteomics. PMID:18426554

  16. A unified framework for automatic wound segmentation and analysis with deep convolutional neural networks.

    PubMed

    Wang, Changhan; Yan, Xinchen; Smith, Max; Kochhar, Kanika; Rubin, Marcie; Warren, Stephen M; Wrobel, James; Lee, Honglak

    2015-08-01

    Wound surface area changes over multiple weeks are highly predictive of the wound healing process. Furthermore, the quality and quantity of the tissue in the wound bed also offer important prognostic information. Unfortunately, accurate measurements of wound surface area changes are out of reach in the busy wound practice setting. Currently, clinicians estimate wound size from the wound width and length, measured with a scalpel after wound treatment, which is highly inaccurate. To address this problem, we propose an integrated system to automatically segment wound regions and analyze wound conditions in wound images. Different from previous segmentation techniques which rely on handcrafted features or unsupervised approaches, our proposed deep learning method jointly learns task-relevant visual features and performs wound segmentation. Moreover, learned features are applied to further analysis of wounds in two ways: infection detection and healing progress prediction. To the best of our knowledge, this is the first attempt to automate long-term predictions of general wound healing progress. Our method is computationally efficient and takes less than 5 seconds per wound image (480 by 640 pixels) on a typical laptop computer. Our evaluations on a large-scale wound database demonstrate the effectiveness and reliability of the proposed system. PMID:26736781

  17. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis.

    PubMed

    Liu, Chanjuan; van Netten, Jaap J; van Baal, Jeff G; Bus, Sicco A; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes between contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8% ± 1.1% sensitivity and 98.4% ± 0.5% specificity over 76 high-risk diabetic patients with manual annotation as a reference. Nonrigid landmark-based registration with B-splines solved the second problem. Corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference of the left and right feet could be obtained.

  18. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature.
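
    For orientation, a single-level version of the baseline model is easy to state: a Negative Binomial regression of segment crash counts on speed and geometry covariates. The sketch below uses statsmodels with hypothetical column names; the paper's multi-level and random-parameters Bayesian models go well beyond this.

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("segments.csv")  # hypothetical AVI + geometry table
    X = sm.add_constant(df[["mean_speed", "speed_std", "aux_lane", "curvature"]])
    nb = sm.GLM(df["crashes"], X, family=sm.families.NegativeBinomial()).fit()
    print(nb.summary())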

  19. Sensitivity analysis of a mixed-phase chemical mechanism using automatic differentiation

    SciTech Connect

    Zhang, Y.; Easter, R.C.

    1998-08-01

    A sensitivity analysis of a comprehensive mixed-phase chemical mechanism is conducted under a variety of atmospheric conditions. The local sensitivities of gas and aqueous phase species concentrations with respect to a variety of model parameters are calculated using the novel automatic differentiation ADIFOR tool. The main chemical reaction pathways in all phases, interfacial mass transfer processes, and ambient physical parameters that affect tropospheric O3 formation and O3-precursor relations under all modeled conditions are identified and analyzed. The results show that the presence of clouds not only reduces many gas phase species concentrations and the total oxidizing capacity but alters O3-precursor relations. Decreases in gas phase concentrations and photochemical formation rates of O3 can be up to 9% and 100%, respectively, depending on the preexisting atmospheric conditions. The decrease in O3 formation is primarily caused by the aqueous phase reactions of O2− with dissolved HO2 and O3 under most cloudy conditions. © 1998 American Geophysical Union
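
    The study relies on ADIFOR applied to a Fortran mechanism; as a loose modern analogue of the same idea, the sketch below uses JAX to obtain local sensitivities of a toy O3 production rate with respect to its rate constants. The reaction form and all numerical values are illustrative only, not the paper's mechanism.

    import jax
    import jax.numpy as jnp

    def o3_rate(k, c):
        # Toy balance: production k[0]*NO2 minus aqueous-style loss k[1]*O3*HO2.
        no2, o3, ho2 = c
        return k[0] * no2 - k[1] * o3 * ho2

    k = jnp.array([2.0e-2, 1.1e-3])    # illustrative rate constants
    c = jnp.array([40.0, 60.0, 0.01])  # illustrative concentrations
    sens = jax.jacfwd(o3_rate)(k, c)   # d(rate)/dk via forward-mode AD
    print(sens)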

  20. Sensitivity Analysis of Photochemical Indicators for O3 Chemistry Using Automatic Differentiation

    SciTech Connect

    Zhang, Yang; Bischof, Christian H.; Easter, Richard C.; Wu, Po-Ting

    2005-05-01

    Photochemical indicators for determination of O3-NOx-ROG sensitivity and their sensitivity to model parameters are studied for a variety of polluted conditions using a comprehensive mixed-phase chemistry box model and the novel automatic differentiation ADIFOR tool. The main chemical reaction pathways in all phases, interfacial mass transfer processes, and ambient physical parameters that affect the indicators are identified and analyzed. Condensed mixed-phase chemical mechanisms are derived from the sensitivity analysis. Our results show that cloud chemistry has a significant impact on the indicators and their sensitivities, particularly on those involving H2O2, HNO3, HCHO, and NOz. Caution should be taken when applying the established threshold values of indicators in regions with large cloud coverage. Among the commonly used indicators, NOy and O3/NOy are relatively insensitive to most model parameters, whereas indicators involving H2O2, HNO3, HCHO, and NOz are highly sensitive to changes in initial species concentrations, reaction rate constants, equilibrium constants, temperature, relative humidity, cloud droplet size, and cloud water content.

  1. Automatic analysis and characterization of the hummingbird wings motion using dense optical flow features.

    PubMed

    Martínez, Fabio; Manzanera, Antoine; Romero, Eduardo

    2015-01-01

    A new method for automatic analysis and characterization of recorded hummingbird wing motion is proposed. The method starts by computing a multiscale dense optical flow field, which is used to segment the wings, i.e., pixels with larger velocities. Then, the kinematics and deformation of the wings were characterized as a temporal set of global and local measures: a global angular acceleration as a time function of each wing and a local acceleration profile that approximates the dynamics of the different wing segments. Additionally, the variance of the apparent velocity orientation estimates those wing foci with larger deformation. Finally, a local measure of the orientation highlights those regions with maximal deformation. The approach was evaluated in a total of 91 flight cycles, captured using three different setups. The proposed measures follow the yaw turn hummingbird flight dynamics, with a strong correlation of all computed paths, reporting a standard deviation of [Formula: see text] and [Formula: see text] for the global angular acceleration and the global wing deformation, respectively. PMID:25599248
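
    The front end of such a method can be sketched briefly: dense Farneback optical flow between consecutive frames, followed by a speed threshold to isolate the fast-moving wing pixels. OpenCV is used here; the frame paths, Farneback parameters and the 95th-percentile threshold are assumptions, not the authors' settings.

    import cv2
    import numpy as np

    prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    # Dense optical flow field, one (dx, dy) vector per pixel.
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = np.linalg.norm(flow, axis=2)
    wings = speed > np.percentile(speed, 95)  # segment the fastest pixels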

  2. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature. PMID:26722989

  3. Automatic Analysis for the Chemical Testing of Urine Examination Using Digital Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Vilardy, Juan M.; Peña, Jose C.; Daza, Miller F.; Torres, Cesar O.; Mattos, Lorenzo

    2008-04-01

    To perform the chemical testing of a urine examination, a dipstick is used, which contains pads that have incorporated within them the reagents for chemical reactions for the detection of a number of substances in the urine. Urine is added to the pads for reaction by dipping the dipstick into the urine and then slowly withdrawing it. The subsequent colorimetric reactions are timed to an endpoint; the extent of color formation is directly related to the level of the urine constituent. The colors can be read manually by comparison with color charts or with the use of automated reflectance meters. The aim of the system described in this paper is to automatically analyze and determine the changes of the colors in the dipstick when it is withdrawn from the urine sample, and to compare the results with color charts for the diagnosis of many common diseases such as diabetes. The system consists of: (a) a USB camera; (b) a computer; (c) software (Matlab v7.4). Image analysis begins with a digital capture of the image as data. Once the image is acquired in digital format, the data can be manipulated through digital image processing. Our objective was to develop a computerised image processing system and an interactive software package for the support of clinicians, medical research and medical students.
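
    The pad-reading step reduces to a nearest-color lookup, sketched below with numpy; the glucose chart values are invented placeholders, not a real reagent chart.

    import numpy as np

    GLUCOSE_CHART = {              # illustrative RGB references only
        "negative":  (120, 180, 200),
        "100 mg/dL": (140, 170, 120),
        "250 mg/dL": (150, 140,  80),
        "500 mg/dL": (140, 100,  60),
    }

    def read_pad(image, pad_box):
        # Average the RGB values inside the pad's bounding box...
        x0, y0, x1, y1 = pad_box
        mean_rgb = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
        # ...and return the chart entry with the smallest Euclidean distance.
        return min(GLUCOSE_CHART,
                   key=lambda k: np.linalg.norm(mean_rgb - GLUCOSE_CHART[k]))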

  4. Automatic Selection of Order Parameters in the Analysis of Large Scale Molecular Dynamics Simulations

    PubMed Central

    2015-01-01

    Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states. PMID:25516725

  5. A visual latent semantic approach for automatic analysis and interpretation of anaplastic medulloblastoma virtual slides.

    PubMed

    Cruz-Roa, Angel; González, Fabio; Galaro, Joseph; Judkins, Alexander R; Ellison, David; Baccon, Jennifer; Madabhushi, Anant; Romero, Eduardo

    2012-01-01

    A method for automatic analysis and interpretation of histopathology images is presented. The method uses a representation of the image data set based on bag-of-features histograms built from a visual dictionary of Haar-based patches and a novel visual latent semantic strategy for characterizing the visual content of a set of images. One important contribution of the method is the provision of an interpretability layer, which is able to explain a particular classification by visually mapping the most important visual patterns associated with such a classification. The method was evaluated on a challenging problem involving automated discrimination of medulloblastoma tumors, based on image-derived attributes from whole slide images, as anaplastic or non-anaplastic. The data set comprised 10 labeled histopathological patient studies, 5 anaplastic and 5 non-anaplastic, with 750 square images cropped randomly from the cancerous region of the whole slide image per study. The experimental results show that the new method is competitive in terms of classification accuracy, achieving 0.87 on average.

  6. Automatic Detections of P and S Phases using Singular Value Decomposition Analysis

    NASA Astrophysics Data System (ADS)

    Kurzon, I.; Vernon, F.; Ben-Zion, Y.; Rosenberger, A.

    2012-12-01

    We implement a new method for the automatic detection of the primary P and S phases using Singular Value Decomposition (SVD) analysis. The method is based on a real-time iteration algorithm of Rosenberger (2010) for the SVD of three-component seismograms. Rosenberger's algorithm identifies the incidence angle by applying SVD and separates the waveforms into their P and S components. We have been using the same algorithm, with the modification that we apply a set of filters prior to the SVD, and study how well these filters detect the P and S arrivals at different stations and segments of the San Jacinto Fault Zone. A recent deployment in the San Jacinto Fault Zone area provides a very dense seismic network, with ~90 stations in a fault zone that is 150 km long and 30 km wide. Embedded in this network are 5 linear arrays crossing the fault trace, with ~10 stations at ~25-50 m spacing in each array. This allows us to test the detection algorithm in a diverse setting, including events with different source mechanisms, stations with different site characteristics, and ray paths that diverge from the SVD approximation used in the algorithm, such as rays propagating within the fault and recorded on the linear arrays. Comparing our new method with classic automatic detection methods using Short Time Average (STA) to Long Time Average (LTA) ratios, we show the success of this SVD detection. Unlike the STA to LTA ratio methods, which normally tend to detect the P phase but in many cases cannot distinguish the S arrival, the main advantage of the SVD method is that almost all the P arrivals have an associated S arrival. Moreover, even for short-distance events, in which the S arrivals are masked by the P waves, the SVD algorithm, under low-band filters, manages to detect those S arrivals. The method is less consistent for stations located directly on the fault traces, in which the SVD approximation is not always valid; but even in such cases the
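
    The core SVD step can be sketched compactly: the dominant singular vector of a (bandpass-filtered) three-component window approximates the particle-motion direction, whose rectilinearity and incidence angle help separate P-like from S-like energy. This is a generic polarization sketch, not Rosenberger's real-time iteration algorithm.

    import numpy as np

    def polarization(window):
        # window: (n_samples, 3) array of Z, N, E traces, assumed bandpassed.
        u, s, vt = np.linalg.svd(window - window.mean(axis=0),
                                 full_matrices=False)
        direction = vt[0]                   # dominant particle-motion axis
        rectilinearity = 1.0 - s[1] / s[0]  # near 1 for linear (P-like) motion
        incidence = np.degrees(np.arccos(abs(direction[0])))  # from vertical
        return rectilinearity, incidence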

  7. Multiple-point unit for remote automatic spectral analysis. [on vibrating mechanical devices

    NASA Technical Reports Server (NTRS)

    Aleksandrov, O. A.; Ivanov, V. A.; Karovetskiy, V. N.; Lapenko, A. N.; Masharskiy, B. N.

    1973-01-01

    An experimental model of a mechanical spectrometer is reported that permits vibration measurements at 297 points on a mechanical device and processes this information by digital computer for automatic printout.

  8. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large memory space. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order derivatives, respectively. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.

  9. Multifractal Analysis and Relevance Vector Machine-Based Automatic Seizure Detection in Intracranial EEG.

    PubMed

    Zhang, Yanli; Zhou, Weidong; Yuan, Shasha

    2015-09-01

    Automatic seizure detection technology is of great significance for long-term electroencephalogram (EEG) monitoring of epilepsy patients. The aim of this work is to develop a seizure detection system with high accuracy. The proposed system was mainly based on multifractal analysis, which describes the local singular behavior of fractal objects and characterizes the multifractal structure using a continuous spectrum. Compared with computing a single fractal dimension, multifractal analysis can provide a better description of the transient behavior of EEG fractal time series during the evolution from the interictal stage to seizures. Thus both interictal and ictal EEG were analyzed by the multifractal formalism, and their differences in multifractal features were used to distinguish the two classes of EEG and detect seizures. In the proposed detection system, eight features (α0, α(min), α(max), Δα, f(α(min)), f(α(max)), Δf and R) were extracted from the multifractal spectra of the preprocessed EEG to construct feature vectors. Subsequently, a relevance vector machine (RVM) was applied for EEG pattern classification, and a series of post-processing operations were used to increase the accuracy and reduce false detections. Both epoch-based and event-based evaluation methods were performed to appraise the system's performance on the EEG recordings of 21 patients in the Freiburg database. An epoch-based sensitivity of 92.94% and specificity of 97.47% were achieved, and the proposed system obtained a sensitivity of 92.06% with a false detection rate of 0.34/h in the event-based performance assessment.

  10. Implementation of terbium-sensitized luminescence in sequential-injection analysis for automatic analysis of orbifloxacin.

    PubMed

    Llorent-Martínez, E J; Ortega-Barrales, P; Molina-Díaz, A; Ruiz-Medina, A

    2008-12-01

    Orbifloxacin (ORBI) is a third-generation fluoroquinolone developed exclusively for use in veterinary medicine, mainly in companion animals. This antimicrobial agent has bactericidal activity against numerous gram-negative and gram-positive bacteria. A few chromatographic methods for its analysis have been described in the scientific literature. Here, coupling of sequential-injection analysis and solid-phase spectroscopy is described in order to develop, for the first time, a terbium-sensitized luminescent optosensor for analysis of ORBI. The cationic resin Sephadex-CM C-25 was used as solid support and measurements were made at 275/545 nm. The system had a linear dynamic range of 10-150 ng mL(-1), with a detection limit of 3.3 ng mL(-1) and an R.S.D. below 3% (n = 10). The analyte was satisfactorily determined in veterinary drugs and dog and horse urine.

  11. Automatic detection of local arterial input functions through Independent Component Analysis on Dynamic Contrast enhanced Magnetic Resonance Imaging.

    PubMed

    Narvaez, Mario; Ruiz-Espana, Silvia; Arana, Estanislao; Moratal, David

    2015-08-01

    The Arterial Input Function (AIF) is obtained from perfusion studies as a basic parameter for the calculation of hemodynamic variables used as surrogate markers of the vascular status of tissues. However, at present, its identification is made manually, leading to high subjectivity, low repeatability and considerable time consumption. We propose an alternative method to automatically identify local AIFs in perfusion images using Independent Component Analysis. PMID:26737244
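
    A minimal sketch of the unmixing step, assuming a dynamic series `dce` of shape (timepoints, x, y): scikit-learn's FastICA separates pixel time-courses, and the earliest-peaking component is kept as the AIF candidate. The component count and the selection heuristic are assumptions, not the authors' criteria.

    import numpy as np
    from sklearn.decomposition import FastICA

    def candidate_aif(dce, n_components=5):
        t, h, w = dce.shape
        sources = FastICA(n_components=n_components,
                          random_state=0).fit_transform(dce.reshape(t, h * w))
        # Keep the component whose (absolute) time course peaks earliest.
        peak_times = np.argmax(np.abs(sources), axis=0)
        return sources[:, np.argmin(peak_times)]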

  12. Automatic quantitative analysis of experimental primary and secondary retinal neurodegeneration: implications for optic neuropathies

    PubMed Central

    Davis, B M; Guo, L; Brenton, J; Langley, L; Normando, E M; Cordeiro, M F

    2016-01-01

    Secondary neurodegeneration is thought to play an important role in the pathology of neurodegenerative disease, which potential therapies may target. However, the quantitative assessment of the degree of secondary neurodegeneration is difficult. The present study describes a novel algorithm from which estimates of primary and secondary degeneration are computed using well-established rodent models of partial optic nerve transection (pONT) and ocular hypertension (OHT). Brn3-labelled retinal ganglion cells (RGCs) were identified in whole-retinal mounts from which RGC density, nearest neighbour distances and regularity indices were determined. The spatial distribution and rate of RGC loss were assessed and the percentage of primary and secondary degeneration in each non-overlapping segment was calculated. Mean RGC number (82 592±681) and RGC density (1695±23.3 RGC/mm2) in naïve eyes were comparable with previous studies, with an average decline in RGC density of 71±17 and 23±5% over the time course of pONT and OHT models, respectively. Spatial analysis revealed greatest RGC loss in the superior and central retina in pONT, but significant RGC loss in the inferior retina from 3 days post model induction. In comparison, there was no significant difference between superior and inferior retina after OHT induction, and RGC loss occurred mainly along the superior/inferior axis (~30%) versus the nasal–temporal axis (~15%). Intriguingly, a significant loss of RGCs was also observed in contralateral eyes in experimental OHT. In conclusion, a novel algorithm to automatically segment Brn3a-labelled retinal whole-mounts into non-overlapping segments is described, which enables automated spatial and temporal segmentation of RGCs, revealing heterogeneity in the spatial distribution of primary and secondary degenerative processes. This method provides an attractive means to rapidly determine the efficacy of neuroprotective therapies with implications for any

  13. Image structural analysis in the tasks of automatic navigation of unmanned vehicles and inspection of Earth surface

    NASA Astrophysics Data System (ADS)

    Lutsiv, Vadim; Malyshev, Igor

    2013-10-01

    The automatic analysis of terrain images has been an urgent problem for several decades. On the one hand, such analysis is a basis of automatic navigation of unmanned vehicles. On the other hand, the amount of information transferred to Earth by modern video sensors keeps increasing, so a preliminary classification of such data by the onboard computer becomes urgent. We developed an object-independent approach to the structural analysis of images. While creating the methods of image structural description, we did our best to abstract away from the particular peculiarities of scenes. Only the most general limitations were taken into account, derived from the laws of organization of the observable environment and from the properties of image formation systems. The practical application of this theoretical approach enables reliable matching of aerospace photographs acquired from differing aspect angles, at different times of day and in different seasons, by sensors of differing types. The aerospace photographs can even be matched with geographic maps. The developed approach enabled solving the tasks of automatic navigation of unmanned vehicles. The signs of changes and catastrophes can be detected by matching and comparing aerospace photographs acquired at different times. We present the theoretical proofs of the chosen strategy of structural description and matching of images. Several examples of matching acquired images with template pictures and maps of terrain are shown within the framework of navigation of unmanned vehicles and detection of signs of disasters.

  14. AUTOMATIC MASS SPECTROMETER

    DOEpatents

    Hanson, M.L.; Tabor, C.D. Jr.

    1961-12-01

    A mass spectrometer for analyzing the components of a gas is designed which is capable of continuous automatic operation such as analysis of samples of process gas from a continuous production system where the gas content may be changing. (AEC)

  15. Design of advanced automatic inspection system for turbine blade FPI analysis

    NASA Astrophysics Data System (ADS)

    Zheng, J.; Xie, W. F.; Viens, M.; Birglen, L.; Mantegh, I.

    2013-01-01

    The aircraft engine turbine blade is the part most susceptible to discontinuities, as it works under extremely high pressure and temperature. Among the various types of NDT methods, Fluorescent Penetrant Inspection (FPI) is comparatively cheap and efficient and thus suitable for detecting turbine blade surface discontinuities. In this paper, we have developed an Advanced Automatic Inspection System (AAIS) with image processing and pattern recognition techniques to aid the human inspector. The system can automatically detect, measure and classify the discontinuities from turbine blade FPI images. Tests on the sample images provided by an industrial partner have been performed to evaluate the system.

  16. A system for automatic analysis of blood pressure data for digital computer entry

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1972-01-01

    Operation of automatic blood pressure data system is described. Analog blood pressure signal is analyzed by three separate circuits, systolic, diastolic, and cycle defect. Digital computer output is displayed on teletype paper tape punch and video screen. Illustration of system is included.

  17. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    ERIC Educational Resources Information Center

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…

  18. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    NASA Astrophysics Data System (ADS)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the locations announced by environmental authorities and companies and the automatic locations, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven could be detected automatically. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.
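
    Detection systems of this kind conventionally build on STA/LTA triggering; a numpy-only sketch is given below, with typical but untuned window lengths and threshold.

    import numpy as np

    def sta_lta_triggers(trace, fs, sta_sec=0.5, lta_sec=10.0, threshold=4.0):
        # Short-term over long-term average of signal energy.
        energy = np.asarray(trace, dtype=float) ** 2
        sta_n, lta_n = int(sta_sec * fs), int(lta_sec * fs)
        sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
        lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same") + 1e-12
        return np.flatnonzero(sta / lta > threshold)  # candidate trigger samples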

  19. Analysis of operators for detection of corners set in automatic image matching

    NASA Astrophysics Data System (ADS)

    Zawieska, D.

    2011-12-01

    Reconstruction of three-dimensional models of objects from images has been a long-lasting research topic in photogrammetry and computer vision. The demand for 3D models is continuously increasing in fields such as cultural heritage, computer graphics, robotics and many others. The number and types of features of a 3D model are highly dependent on the use of the models, and can be very variable in terms of accuracy and time for their creation. In recent years, both the computer vision and photogrammetric communities have approached the reconstruction problems by using different methods to solve the same tasks, such as camera calibration, orientation, object reconstruction and modelling. The terminology used for addressing a particular task sometimes differs between the two disciplines. On the other hand, the integration of methods and algorithms coming from them can be used to improve both. Image-based modelling of an object has been defined as a complete process that starts with image acquisition and ends with an interactive 3D virtual model. The photogrammetric approach to creating 3D models involves the following steps: image pre-processing, camera calibration, orientation of the image network, image scanning for point detection, surface measurement and point triangulation, blunder detection and statistical filtering, mesh generation and texturing, visualization and analysis. Currently there is no single software package available that allows each of those steps to be executed within the same environment. For high-accuracy 3D object reconstruction, operators are required as a preliminary step in the surface measurement process, to find the features that serve as suitable points when matching across multiple images. Operators are the algorithms which detect the features of interest in an image, such as corners, edges or regions. This paper reports on the first phase of research on the generation of high-accuracy 3D model measurement and modelling, focusing
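
    As one concrete member of the operator family under analysis, the sketch below computes Harris corner candidates with scikit-image; the image path and the k, sigma and min_distance parameters are assumptions.

    from skimage import feature, io
    from skimage.color import rgb2gray

    img = rgb2gray(io.imread("object_view_01.jpg"))  # placeholder path
    response = feature.corner_harris(img, k=0.05, sigma=1.0)
    corners = feature.corner_peaks(response, min_distance=5)
    print(len(corners), "corner candidates")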

  20. Semi-automatic measures of activity in selected south polar regions of Mars using morphological image analysis

    NASA Astrophysics Data System (ADS)

    Aye, Klaus-Michael; Portyankina, Ganna; Pommerol, Antoine; Thomas, Nicolas

    results of these semi-automatically determined seasonal fan count evolutions for the Inca City, Ithaca and Manhattan ROIs, compare these evolutionary patterns with each other and with surface reflectance evolutions of both HiRISE and CRISM for the same locations. References: Aye, K.-M., et al. (2010), LPSC 2010, 2707; Hansen, C., et al. (2010), Icarus, 205(1), 283-295; Kieffer, H.H. (2007), JGR, 112; Portyankina, G., et al. (2010), Icarus, 205(1), 311-320; Thomas, N., et al. (2009), EPSC Abstracts, Vol. 4, EPSC2009-478.

  1. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.

  2. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    PubMed

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals. PMID:25571461
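
    The paper's SVI comes from heart rate variability; a common proxy is the LF/HF power ratio of the RR-interval series, sketched below with scipy. The 4 Hz resampling rate and the band edges are conventional HRV choices, not necessarily the authors' exact computation.

    import numpy as np
    from scipy.interpolate import interp1d
    from scipy.signal import welch

    def lf_hf_ratio(rr_times, rr_intervals, fs=4.0):
        # Resample the unevenly spaced RR series onto a uniform grid.
        t = np.arange(rr_times[0], rr_times[-1], 1.0 / fs)
        rr = interp1d(rr_times, rr_intervals)(t)
        f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=min(256, len(rr)))
        lf = np.trapz(pxx[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])
        hf = np.trapz(pxx[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])
        return lf / hf  # higher values suggest sympathetic dominance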

  3. Studies on quantitative analysis and automatic recognition of cell types of lung cancer.

    PubMed

    Chen, Yi-Chen; Hu, Kuang-Hu; Li, Fang-Zhen; Li, Shu-Yu; Su, Wan-Fang; Huang, Zhi-Ying; Hu, Ying-Xiong

    2006-01-01

    Recognition of lung cancer cells is very important to the clinical diagnosis of lung cancer. In this paper we present a novel method to extract the structural characteristics of lung cancer cells and automatically recognize their types. First, soft mathematical morphology methods are used to enhance the grayscale image, to improve the definition of the image, and to eliminate most of the disturbance, noise and subordinate image information, so that the contour of the target lung cancer cell and its biological shape characteristic parameters can be extracted accurately. Then the minimum distance classifier is introduced to realize the automatic recognition of different types of lung cancer cells. A software system named "CANCER.LUNG" was established to demonstrate the efficiency of this method. The clinical experiments show that this method can accurately and objectively recognize the type of lung cancer cells, which can significantly improve pathology research on the pathological changes of lung cancer and computer-assisted clinical diagnosis.
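
    The minimum distance classifier mentioned above is essentially a nearest-centroid rule; a sketch with scikit-learn follows, where the feature files are hypothetical stand-ins for the extracted shape parameters.

    import numpy as np
    from sklearn.neighbors import NearestCentroid

    X = np.load("cell_shape_features.npy")  # hypothetical feature matrix
    y = np.load("cell_type_labels.npy")     # lung cancer cell type per row
    clf = NearestCentroid().fit(X, y)       # class = nearest class centroid
    print(clf.predict(X[:5]))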

  4. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost effective computer aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.

  5. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    PubMed

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals.

  6. Computer analysis of two-dimensional gels: semi-automatic matching.

    PubMed

    Miller, M J; Vo, P K; Nielsen, C; Geiduschek, E P; Xuong, N H

    1982-04-01

    We describe a computer program system for finding, quantitating, and matching the protein spots resolved on a two-dimensional electropherogram. The programs that locate and quantitate the incorporation of radioactivity into individual spots are totally automatic, as are the programs for matching protein spots between two exposures of the same gel. A semi-automatic method is used to match protein spots between different gels. This procedure is quite fast with the use of a computer-graphic display, which is also helpful in the editing process. A data base is set up and programs have been written to correlate matched protein spots from multi-gel experiments and to efficiently plot out quantitative data from sequences of equivalent spots from many gels or even many multi-gel experiments. The practical use of this system is discussed.

  7. [Automated analysis of bacterial preparations manufactured on automatic heat fixation and staining equipment].

    PubMed

    2012-01-01

    Heat fixation of preparations was made in the fixation bath designed by EMKO (Russia). The programmable "Emkosteiner" (EMKO, Russia) was used for trial staining. The reagent set Micko-GRAM-NITsF was applied for Gram's method of staining. It was demonstrated that automatic smear fixation equipment and programmable staining ensure high-quality imaging (1% chromaticity variation), good enough for standardization of Gram's staining of microbial preparations.

  8. Texture analysis of automatic graph cuts segmentations for detection of lung cancer recurrence after stereotactic radiotherapy

    NASA Astrophysics Data System (ADS)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2015-03-01

    Stereotactic ablative radiotherapy (SABR) is a treatment for early-stage lung cancer with local control rates comparable to surgery. After SABR, benign radiation induced lung injury (RILI) results in tumour-mimicking changes on computed tomography (CT) imaging. Distinguishing recurrence from RILI is a critical clinical decision determining the need for potentially life-saving salvage therapies whose high risks in this population dictate their use only for true recurrences. Current approaches do not reliably detect recurrence within a year post-SABR. We measured the detection accuracy of texture features within automatically determined regions of interest, with the only operator input being the single line segment measuring tumour diameter, normally taken during the clinical workflow. Our leave-one-out cross validation on images taken 2-5 months post-SABR showed robustness of the entropy measure, with classification error of 26% and area under the receiver operating characteristic curve (AUC) of 0.77 using automatic segmentation; the results using manual segmentation were 24% and 0.75, respectively. AUCs for this feature increased to 0.82 and 0.93 at 8-14 months and 14-20 months post SABR, respectively, suggesting even better performance nearer to the date of clinical diagnosis of recurrence; thus this system could also be used to support and reinforce the physician's decision at that time. Based on our ongoing validation of this automatic approach on a larger sample, we aim to develop a computer-aided diagnosis system which will support the physician's decision to apply timely salvage therapies and prevent patients with RILI from undergoing invasive and risky procedures.
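
    Since entropy is singled out as the robust feature, a minimal sketch of that measurement is worth spelling out: Shannon entropy of the CT intensities inside the automatically grown region of interest. The function name and inputs are assumptions.

    from skimage.measure import shannon_entropy

    def roi_entropy(ct_slice, roi_mask):
        # Entropy of the intensity distribution over ROI pixels only.
        return shannon_entropy(ct_slice[roi_mask])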

  9. Validation of an automatic comet assay analysis system integrating the curve fitting of combined comet intensity profiles.

    PubMed

    Dehon, G; Catoire, L; Duez, P; Bogaerts, P; Dubois, J

    2008-02-29

    In recent years, the single-cell gel electrophoresis (comet) assay has become a reference technique for the assessment of DNA fragmentation both in vitro and in vivo at the cellular level. In order to improve the throughput of genotoxicity screening, development of fully automated systems is clearly a must. This would allow us to increase processing time and to avoid subjectivity brought about by frequent manual settings required for the 'classical' analysis systems. To validate a fully automatic system developed in our laboratory, different experiments were conducted in vitro on murine P388D1 cells with increasing doses of ethyl methanesulfonate (up to 5 mM), thus covering a large range of DNA damage (up to 80% of DNA in the tail). The present study (1) validates our 'in house' fully automatic system versus a widely used semi-automatic commercial system for the image-analysis step, and versus the human eye for the image acquisition step, (2) shows that computing tail DNA a posteriori on the basis of a curve fitting concept that combines intensity profiles [G. Dehon, P. Bogaerts, P. Duez, L. Catoire, J. Dubois, Curve fitting of combined comet intensity profiles: a new global concept to quantify DNA damage by the comet assay, Chemom. Intell. Lab. Syst. 73 (2004) 235-243] gives results not significantly different from the 'classical' approach but is much more accurate and easy to undertake and (3) demonstrates that, with these increased performances, the number of comets to be scored can be reduced to a minimum of 20 comets per slide without sacrificing statistical reliability. PMID:18160335

  10. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    PubMed

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. As an example, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  11. Finite Element Analysis of Osteosynthesis Screw Fixation in the Bone Stock: An Appropriate Method for Automatic Screw Modelling

    PubMed Central

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. As an example, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both the deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% by using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  12. Automatic flow injection analysis (FIA) determination of total reducing capacity in serum and urine samples.

    PubMed

    Segundo, Marcela A; Tóth, Ildikó V; Magalhães, Luís M; Reis, Salette

    2015-01-01

    Automation of total antioxidant capacity assessment can substantially increase the determination throughput, allowing large-scale studies and screening experiments. Total reducing capacity evaluation can be implemented under different chemistries, including the CUPRAC (Cupric Ion Reducing Antioxidant Capacity) assay. This assay is based on reduction of the Cu(II)-neocuproine complex to the highly colored Cu(I)-neocuproine complex by reducing (antioxidant) components of biological samples. In this chapter, we propose an automatic flow injection method for evaluation of total reducing capacity in serum and urine samples, attaining end-point data within 4 min using a kinetic matching strategy.

  13. Computer aided diagnosis system for retinal analysis: automatic assessment of the vascular tortuosity.

    PubMed

    Sánchez, L; Barreira, N; Penedo, M G; Coll De Tuero, G

    2014-01-01

    The tortuosity of a vessel, that is, how many times a vessel curves and what those turns are like, is an important value for the diagnosis of certain diseases. Clinicians analyze fundus images manually in order to estimate it, but this has many drawbacks, as it is tedious, time-consuming and subjective work. Thus, automatic image processing methods become a necessity, as they make possible the efficient computation of objective parameters. In this paper we discuss Sirius (System for the Integration of Retinal Images Understanding Service), a web-based application that enables the storage and treatment of various types of diagnostic tests and, more specifically, its tortuosity calculation module.
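
    The simplest member of the tortuosity family is the distance metric: arc length of the vessel centerline over its chord. Sirius computes richer curvature-based variants, so the sketch below is only the baseline idea.

    import numpy as np

    def distance_metric_tortuosity(centerline):
        # centerline: (n, 2) ordered (x, y) points along one vessel.
        arc = np.sum(np.linalg.norm(np.diff(centerline, axis=0), axis=1))
        chord = np.linalg.norm(centerline[-1] - centerline[0])
        return arc / chord  # 1.0 for a straight vessel, larger when it curves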

  14. Nondestructive activation analysis of sample of lunar surface material returned by Luna 16 automatic station. [chemical composition

    NASA Technical Reports Server (NTRS)

    Chayko, M.; Sabo, E.

    1974-01-01

    The composition of a sample of lunar surface material returned by the Luna 16 automatic station from the Sea of Fertility was studied, using nondestructive activation analysis. The structure of the returned surface material is inhomogeneous; the surficial material is thin, quite homogeneous, and the granularity increases with depth. Based on grain size, the sample was separated into five zones. The activation analysis was conducted on a sample taken from the friable surficial layer, zone A. The content of Al, Mn, Na, Cr, Co, Fe, and Sc was determined by nondestructive activation analysis of the sample. In determining Cr, Co, Fe, and Sc, the sample was irradiated for 24 hours and cooled for 10 days. Gamma spectra of the samples were recorded with a semiconductor Ge(Li)-detector and a multichannel analyzer, and measurement data were processed with an electronic computer.

  15. Automatic sampling and analysis of organics and biomolecules by capillary action-supported contactless atmospheric pressure ionization mass spectrometry.

    PubMed

    Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie

    2013-01-01

    Contactless atmospheric pressure ionization (C-API) method has been recently developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet on the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, the sample spray is readily formed in the proximity of the mass spectrometer applied with a high electric field. The gas phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, the well containing the rinsing solvent is alternately arranged between the sample wells. Therefore, the C-API capillary could be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for the C-API MS analysis is minimal, with less than 1 nL of the sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated.

  17. Automatic Screening and Grading of Age-Related Macular Degeneration from Texture Analysis of Fundus Images.

    PubMed

    Phan, Thanh Vân; Seoud, Lama; Chakor, Hadi; Cheriet, Farida

    2016-01-01

    Age-related macular degeneration (AMD) is a disease which causes visual deficiency and irreversible blindness in the elderly. In this paper, an automatic classification method for AMD is proposed to perform robust and reproducible assessments in a telemedicine context. First, a study was carried out to highlight the most relevant features for AMD characterization, based on texture, color, and visual context in fundus images. A support vector machine and a random forest were used to classify images according to the different AMD stages following the AREDS protocol and to evaluate the features' relevance. Experiments were conducted on a database of 279 fundus images coming from a telemedicine platform. The results demonstrate that local binary patterns in multiresolution are the most relevant for AMD classification, regardless of the classifier used. Depending on the classification task, our method achieves promising performance, with areas under the ROC curve between 0.739 and 0.874 for screening and between 0.469 and 0.685 for grading. Moreover, the proposed automatic AMD classification system is robust with respect to image quality. PMID:27190636
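
    A minimal sketch of the winning feature/classifier combination, multiresolution local binary patterns fed to a random forest, using scikit-image and scikit-learn. The radii, histogram binning, and toy data are assumptions, not the paper's exact configuration.

      import numpy as np
      from skimage.feature import local_binary_pattern
      from sklearn.ensemble import RandomForestClassifier

      def lbp_multires_features(gray: np.ndarray, radii=(1, 2, 3)) -> np.ndarray:
          """Concatenate uniform-LBP histograms at several radii (multiresolution)."""
          feats = []
          for r in radii:
              p = 8 * r                                  # sampling points scale with radius
              lbp = local_binary_pattern(gray, P=p, R=r, method="uniform")
              hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
              feats.append(hist)
          return np.concatenate(feats)

      # Toy usage: random "fundus" patches with binary screening labels
      rng = np.random.default_rng(0)
      patches = [(rng.random((64, 64)) * 255).astype(np.uint8) for _ in range(40)]
      X = np.stack([lbp_multires_features(p) for p in patches])
      y = rng.integers(0, 2, size=40)
      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print(clf.predict_proba(X[:3]))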

  18. Semi-automatic segmentation for 3D motion analysis of the tongue with dynamic MRI.

    PubMed

    Lee, Junghoon; Woo, Jonghye; Xing, Fangxu; Murano, Emi Z; Stone, Maureen; Prince, Jerry L

    2014-12-01

    Dynamic MRI has been widely used to track the motion of the tongue and measure its internal deformation during speech and swallowing. Accurate segmentation of the tongue is a prerequisite step to define the target boundary and constrain the tracking to tissue points within the tongue. Segmentation of 2D slices or 3D volumes is challenging because of the large number of slices and time frames involved in the segmentation, as well as the incorporation of numerous local deformations that occur throughout the tongue during motion. In this paper, we propose a semi-automatic approach to segment 3D dynamic MRI of the tongue. The algorithm steps include seeding a few slices at one time frame, propagating seeds to the same slices at different time frames using deformable registration, and random walker segmentation based on these seed positions. This method was validated on the tongue of five normal subjects carrying out the same speech task with multi-slice 2D dynamic cine-MR images obtained at three orthogonal orientations and 26 time frames. The resulting semi-automatic segmentations of a total of 130 volumes showed an average dice similarity coefficient (DSC) score of 0.92 with less segmented volume variability between time frames than in manual segmentations. PMID:25155697
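
    The random-walker step can be reproduced with scikit-image. The sketch below segments a synthetic 3D volume from two seed labels; in the actual pipeline the seeds would come from the registration-propagated slices rather than being placed by hand.

      import numpy as np
      from skimage.segmentation import random_walker

      # Synthetic 3D volume: a bright ellipsoid (the "tongue") on a dark background
      z, y, x = np.mgrid[0:32, 0:32, 0:32]
      volume = ((z - 16) ** 2 / 100 + (y - 16) ** 2 / 64 + (x - 16) ** 2 / 64 < 1.0)
      volume = volume.astype(float) + 0.2 * np.random.default_rng(0).normal(size=volume.shape)

      # Seeds: 1 = object, 2 = background, 0 = unlabeled (resolved by the walker).
      labels = np.zeros(volume.shape, dtype=np.uint8)
      labels[16, 16, 16] = 1           # inside the ellipsoid
      labels[2, 2, 2] = 2              # background corner

      segmentation = random_walker(volume, labels, beta=130)
      print(segmentation.shape, np.unique(segmentation))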

  20. Automatic analysis of composite physical signals using non-negative factorization and information criterion.

    PubMed

    Watanabe, Kenji; Hidaka, Akinori; Otsu, Nobuyuki; Kurita, Takio

    2012-01-01

    In time-resolved spectroscopy, composite signal sequences representing energy transfer in fluorescence materials are measured, and the physical characteristics of the materials are analyzed. Each signal sequence is represented by a sum of non-negative signal components, which are expressed by model functions. For analyzing the physical characteristics of a measured signal sequence, the parameters of the model functions are estimated. Furthermore, in order to quantitatively analyze real measurement data and to reduce the risk of improper decisions, it is necessary to obtain the statistical characteristics from several sequences rather than just a single sequence. In the present paper, we propose an automatic method by which to analyze composite signals using non-negative factorization and an information criterion. The proposed method decomposes the composite signal sequences using non-negative factorization subjected to parametric base functions. The number of components (i.e., rank) is also estimated using Akaike's information criterion. Experiments using simulated and real data reveal that the proposed method automatically estimates the acceptable ranks and parameters.
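
    The sketch below illustrates rank selection by AIC using plain non-negative matrix factorization from scikit-learn in place of the authors' parametric base functions; the AIC form and parameter count are standard textbook choices, not taken from the paper.

      import numpy as np
      from sklearn.decomposition import NMF

      def select_rank_by_aic(X: np.ndarray, max_rank: int = 6) -> int:
          """Pick the NMF rank minimizing an AIC-style criterion.

          AIC = n*ln(RSS/n) + 2k, with k the number of fitted entries in W and H.
          """
          n = X.size
          best_rank, best_aic = 1, np.inf
          for r in range(1, max_rank + 1):
              model = NMF(n_components=r, init="nndsvda", max_iter=500, random_state=0)
              W = model.fit_transform(X)
              H = model.components_
              rss = np.sum((X - W @ H) ** 2)
              k = r * (X.shape[0] + X.shape[1])       # free parameters in W and H
              aic = n * np.log(rss / n + 1e-12) + 2 * k
              if aic < best_aic:
                  best_rank, best_aic = r, aic
          return best_rank

      # Toy data: 20 sequences that are mixtures of two exponential decays
      t = np.linspace(0, 5, 200)
      bases = np.vstack([np.exp(-t), np.exp(-3 * t)])
      X = np.random.default_rng(0).random((20, 2)) @ bases
      print(select_rank_by_aic(X))  # expected: 2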

  1. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    NASA Astrophysics Data System (ADS)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in the mouse primary motor cortex, and find significant cellular density variations across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
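
    Steps i)-iii) can be approximated in a few lines with scikit-image; the smoothing sigma and minimum blob area below are illustrative assumptions rather than the pipeline's tuned values.

      import numpy as np
      from skimage.filters import threshold_otsu, gaussian
      from skimage.measure import label, regionprops

      def detect_cell_centroids(image: np.ndarray, min_area: int = 5):
          """Return centroids of bright blobs: smooth -> Otsu binarize -> label."""
          smoothed = gaussian(image, sigma=1.0)          # i) preprocessing
          binary = smoothed > threshold_otsu(smoothed)   # ii) binarization
          labeled = label(binary)                        # iii) connected components
          return [r.centroid for r in regionprops(labeled) if r.area >= min_area]

      # Toy image with two bright "cells"
      img = np.zeros((64, 64))
      img[10:14, 10:14] = 1.0
      img[40:45, 30:35] = 1.0
      print(detect_cell_centroids(img))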

  3. A clinically viable capsule endoscopy video analysis platform for automatic bleeding detection

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Jiao, Heng; Xie, Jean; Mui, Peter; Leighton, Jonathan A.; Pasha, Shabana; Rentz, Lauri; Abedi, Mahmood

    2013-02-01

    In this paper, we present a novel and clinically valuable software platform for automatic bleeding detection in the gastrointestinal (GI) tract from capsule endoscopy (CE) videos. Typical CE videos of the GI tract run about 8 hours and are manually reviewed by physicians to locate diseases such as bleeding and polyps. As a result, the process is time consuming and prone to missed findings. While researchers have made efforts to automate this process, no clinically acceptable software is available on the marketplace today. Working with our collaborators, we have developed a clinically viable software platform called GISentinel for fully automated GI tract bleeding detection and classification. Major functional modules of the software include: the innovative graph-based NCut segmentation algorithm, the unique feature selection and validation method (e.g. illumination-invariant features, color-independent features, and symmetrical texture features), and the cascade SVM classification for handling various GI tract scenes (e.g. normal tissue, food particles, bubbles, fluid, and specular reflection). Initial evaluation of the software has shown a zero bleeding-instance miss rate and a 4.03% false alarm rate. This work is part of our innovative 2D/3D-based GI tract disease detection software platform. While the overall software framework is designed for intelligent detection and classification of major GI tract diseases such as bleeding, ulcer, and polyp from CE videos, this paper focuses on the automatic bleeding detection module.
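
    The cascade idea, where a first classifier filters out non-tissue scenes before a second one flags bleeding, can be sketched with two scikit-learn SVMs; the features, data, and stage ordering below are placeholders, not GISentinel's actual modules.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)

      # Stage 1: tissue vs. non-tissue (bubbles, food particles, fluid, reflections)
      X1 = rng.normal(size=(200, 8)); y1 = rng.integers(0, 2, 200)
      stage1 = SVC(kernel="rbf").fit(X1, y1)

      # Stage 2: bleeding vs. normal, trained on tissue frames only
      X2 = rng.normal(size=(120, 8)); y2 = rng.integers(0, 2, 120)
      stage2 = SVC(kernel="rbf").fit(X2, y2)

      def classify_frame(features: np.ndarray) -> str:
          """Cascade: only frames accepted as tissue reach the bleeding detector."""
          if stage1.predict(features.reshape(1, -1))[0] == 0:
              return "non-tissue"
          return "bleeding" if stage2.predict(features.reshape(1, -1))[0] == 1 else "normal"

      print(classify_frame(rng.normal(size=8)))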

  6. Analysis of the Distances Covered by First Division Brazilian Soccer Players Obtained with an Automatic Tracking Method

    PubMed Central

    Barros, Ricardo M.L.; Misuta, Milton S.; Menezes, Rafael P.; Figueroa, Pascual J.; Moura, Felipe A.; Cunha, Sergio A.; Anido, Ricardo; Leite, Neucimar J.

    2007-01-01

    Methods based on visual estimation are still the most widely used for analyzing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the very first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results of 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. The results of a three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking, or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half. Key points: A novel automatic tracking method was presented. No previous
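
    The speed-band accounting used in such studies reduces to summing frame-to-frame displacements whose instantaneous speed falls in each band. A sketch, with band thresholds that are illustrative rather than the paper's exact cutoffs:

      import numpy as np

      # Speed bands (m/s) -- illustrative thresholds, not the paper's exact ones
      BANDS = [("standing/walking/jogging", 0.0, 3.0),
               ("low-speed running",        3.0, 4.0),
               ("moderate-speed running",   4.0, 5.5),
               ("high-speed running",       5.5, 7.0),
               ("sprinting",                7.0, np.inf)]

      def distance_per_band(positions: np.ndarray, fps: float = 30.0) -> dict:
          """Sum frame-to-frame displacement into speed bands.

          positions: (N, 2) pitch coordinates in meters from the tracking system.
          """
          steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)  # m per frame
          speeds = steps * fps                                        # m/s
          return {name: float(steps[(speeds >= lo) & (speeds < hi)].sum())
                  for name, lo, hi in BANDS}

      # Toy trajectory: 10 s of straight-line motion at 4 m/s (moderate band)
      pos = np.column_stack([np.linspace(0, 40, 301), np.zeros(301)])
      print(distance_per_band(pos))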

  7. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the use of several different plug-ins, significant user interaction, and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks(®)) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the somatosensorial primary cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat . PMID:26285671

  9. Automatic cell segmentation and nuclear-to-cytoplasmic ratio analysis for third harmonic generated microscopy medical images.

    PubMed

    Lee, Gwo Giun; Lin, Huan-Hsiang; Tsai, Ming-Rung; Chou, Sin-Yo; Lee, Wen-Jeng; Liao, Yi-Hua; Sun, Chi-Kuang; Chen, Chun-Fu

    2013-04-01

    Traditional biopsy procedures require invasive tissue removal from a living subject, followed by time-consuming and complicated processes, so a noninvasive in vivo virtual biopsy, which can obtain exhaustive tissue images without removing tissue, is highly desired. Sets of in vivo virtual biopsy images provided by healthy volunteers were processed by the proposed cell segmentation approach, which is based on a watershed algorithm and the concept of the convergence index filter for automatic cell segmentation. Experimental results suggest that the proposed algorithm not only achieves high accuracy for cell segmentation but also has dramatic potential for noninvasive analysis of the cell nuclear-to-cytoplasmic ratio (NC ratio), which is important in identifying or detecting early symptoms of diseases with abnormal NC ratios, such as skin cancers, during clinical diagnosis via medical image analysis.
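
    Once nucleus and whole-cell masks are available, the NC ratio itself is a simple area ratio; a minimal sketch (the mask shapes are toy assumptions):

      import numpy as np

      def nc_ratio(nucleus_mask: np.ndarray, cell_mask: np.ndarray) -> float:
          """Nuclear-to-cytoplasmic ratio from binary masks of one segmented cell.

          Cytoplasm area = cell area minus nucleus area.
          """
          nucleus_area = int(nucleus_mask.sum())
          cytoplasm_area = int(cell_mask.sum()) - nucleus_area
          return nucleus_area / cytoplasm_area

      # Toy cell: 20x20 cell with an 8x8 nucleus
      cell = np.zeros((32, 32), bool);    cell[5:25, 5:25] = True
      nucleus = np.zeros((32, 32), bool); nucleus[12:20, 12:20] = True
      print(round(nc_ratio(nucleus, cell), 3))  # 64 / (400 - 64) ≈ 0.190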

  10. Automatic detection of CT perfusion datasets unsuitable for analysis due to head movement of acute ischemic stroke patients.

    PubMed

    Fahmi, Fahmi; Marquering, Henk A; Streekstra, Geert J; Beenen, Ludo F M; Janssen, Natasja N Y; Majoie, Charles B L; van Bavel, Ed

    2014-01-01

    Head movement during brain Computed Tomography Perfusion (CTP) can deteriorate the quality of perfusion analysis in acute ischemic stroke patients. We developed a method for the automatic detection of CTP datasets with excessive head movement, based on 3D image registration of the CTP data with non-contrast CT, which provides the transformation parameters. For parameter values exceeding predefined thresholds, the dataset was classified as 'severely moved'. Threshold values were determined by digital CTP phantom experiments. The automated selection was compared with manual screening by 2 experienced radiologists for 114 brain CTP datasets. Based on receiver operating characteristic analysis, optimal thresholds were found of 1.0°, 2.8° and 6.9° for pitch, roll and yaw, respectively, and 2.8 mm for z-axis translation. The proposed method had a sensitivity of 91.4% and a specificity of 82.3%. This method allows accurate automated detection of brain CTP datasets that are unsuitable for perfusion analysis. PMID:24691387
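
    The final classification rule is a simple threshold test on the registration parameters, using the optimal thresholds reported above:

      # Optimal thresholds reported in the abstract (degrees and millimeters)
      THRESHOLDS = {"pitch": 1.0, "roll": 2.8, "yaw": 6.9, "z_translation": 2.8}

      def severely_moved(params: dict) -> bool:
          """Classify a CTP dataset as 'severely moved' if any registration
          parameter exceeds its threshold."""
          return any(abs(params[k]) > v for k, v in THRESHOLDS.items())

      print(severely_moved({"pitch": 0.4, "roll": 1.0, "yaw": 7.5, "z_translation": 1.2}))
      # True: yaw exceeds 6.9 degrees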

  11. Analysis of switch-induced error voltage for automatic conversion mode change charge pumps

    NASA Astrophysics Data System (ADS)

    Yuan, Chi; Xinquan, Lai; Hanxiao, Du

    2015-05-01

    This paper presents an exact expression for the switch-induced error voltage that causes a spike voltage on the output capacitor of automatic conversion mode change (ACMC) charge pumps. The spike voltage introduces several undesired problems: large output voltage ripple, serious frequency noise and low efficiency. Some methods for reducing the spike voltage are suggested by the proposed expression. An equivalent lumped model is used to derive the expression. The ACMC charge pump circuit has been designed in a SILTERRA 0.18 μm CMOS process. The experimental results show that the value of the spike voltage matches the expression well. Compared with three different improved versions, the spike voltage caused by the switch-induced error voltage can be reduced appreciably. Project supported by the National Natural Science Foundation of China (No. 61106026).
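
    The paper's exact expression is not reproduced in the abstract. For orientation only, the usual first-order textbook estimate of the error voltage a MOS switch leaves on its load capacitor combines channel-charge injection and clock feedthrough:

      % First-order, textbook estimate (not the paper's exact expression):
      \Delta V_{\mathrm{err}} \approx
        \underbrace{\frac{W L C_{\mathrm{ox}}\left(V_{GS}-V_{TH}\right)}{2\,C_{L}}}_{\text{channel charge}}
        + \underbrace{\frac{C_{\mathrm{ov}}}{C_{\mathrm{ov}}+C_{L}}\,V_{\mathrm{clk}}}_{\text{clock feedthrough}}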

  12. Accuracy and Block Deformation Analysis in Automatic Uav and Terrestrial Photogrammetry - Lesson Learnt -

    NASA Astrophysics Data System (ADS)

    Nocerino; Menna; Remondino; Saleri

    2013-07-01

    The paper reports the results of an integrated Unmanned Aerial Vehicle (UAV) and terrestrial photogrammetric survey realized in the archaeological site of the Roman theatre in Ventimiglia, Italy. The main deliverables were 2D drawings at scale 1:20, which required a Ground Sample Distance (GSD) less than 4 mm and, consequently, accuracy better than 4 mm. The UAV was employed to acquire both vertical and oblique images, while the terrestrial imaging acquisition was realized with the aim of generating separate dense point clouds of some vertical structures, corresponding to the sections required. The variability of results from automatic photogrammetric procedures under different image network configurations, with and without ground control, is analysed and presented.
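
    The GSD requirement translates directly into a flying-height constraint through the standard photogrammetric relation (a textbook formula, not specific to this paper):

      \mathrm{GSD} = \frac{p \cdot H}{f}
      % p: sensor pixel size, H: object distance (flying height), f: focal length.
      % The 1:20 drawings demanded GSD < 4 mm, which bounds H for a given camera.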

  13. Versatile, high sensitivity, and automatized angular dependent vectorial Kerr magnetometer for the analysis of nanostructured materials.

    PubMed

    Teixeira, J M; Lusche, R; Ventura, J; Fermento, R; Carpinteiro, F; Araujo, J P; Sousa, J B; Cardoso, S; Freitas, P P

    2011-04-01

    Magneto-optical Kerr effect (MOKE) magnetometry is an indispensable, reliable, and one of the most widely used techniques for the characterization of nanostructured magnetic materials. Information, such as the magnitude of coercive fields or anisotropy strengths, can be readily obtained from MOKE measurements. We present a description of our state-of-the-art vectorial MOKE magnetometer, an extremely versatile, accurate, and sensitive unit with a low cost and a comparatively simple setup. The unit includes focusing lenses and an automatized stepper-motor stage for angular-dependent measurements. The performance of the magnetometer is demonstrated by hysteresis loops of Co thin films displaying uniaxial anisotropy induced during growth, MnIr/CoFe structures exhibiting the so-called exchange bias effect, spin valves, and microfabricated flux guides produced by optical lithography.

  14. Automatic classification of pulmonary function in COPD patients using trachea analysis in chest CT scans

    NASA Astrophysics Data System (ADS)

    van Rikxoort, E. M.; de Jong, P. A.; Mets, O. M.; van Ginneken, B.

    2012-03-01

    Chronic Obstructive Pulmonary Disease (COPD) is a chronic lung disease that is characterized by airflow limitation. COPD is clinically diagnosed and monitored using pulmonary function testing (PFT), which measures global inspiration and expiration capabilities of patients and is time-consuming and labor-intensive. It is becoming standard practice to obtain paired inspiration-expiration CT scans of COPD patients. Predicting the PFT results from the CT scans would alleviate the need for PFT testing. It is hypothesized that the change of the trachea during breathing might be an indicator of tracheomalacia in COPD patients and correlate with COPD severity. In this paper, we propose to automatically measure morphological changes in the trachea from paired inspiration and expiration CT scans and investigate the influence on COPD GOLD stage classification. The trachea is automatically segmented and the trachea shape is encoded using the lengths of rays cast from the center of gravity of the trachea. These features are used in a classifier, combined with emphysema scoring, to attempt to classify subjects into their COPD stage. A database of 187 subjects, well distributed over the COPD GOLD stages 0 through 4, was used for this study. The data were randomly divided into training and test sets. Using the training scans, a nearest-mean classifier was trained to classify the subjects into their correct GOLD stage using either the emphysema score, the tracheal shape features, or a combination. Combining the proposed trachea shape features with the emphysema score, the classification accuracy into GOLD stages improved by 11%, to 51%. In addition, an 80% accuracy was achieved in distinguishing healthy subjects from COPD patients.
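
    The ray-casting shape encoding can be sketched as follows: from the center of gravity of a binary trachea cross-section, step outward along evenly spaced angles until leaving the mask. The step size and ray count below are assumptions.

      import numpy as np

      def ray_lengths(mask: np.ndarray, n_rays: int = 32) -> np.ndarray:
          """Encode a 2D trachea cross-section by ray lengths from its centroid.

          For each angle, step outward from the center of gravity until the ray
          leaves the binary mask; the traveled distance is the shape feature.
          """
          ys, xs = np.nonzero(mask)
          cy, cx = ys.mean(), xs.mean()
          lengths = np.zeros(n_rays)
          for i, theta in enumerate(np.linspace(0, 2 * np.pi, n_rays, endpoint=False)):
              r = 0.0
              while True:
                  y = int(round(cy + r * np.sin(theta)))
                  x = int(round(cx + r * np.cos(theta)))
                  if not (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]) or not mask[y, x]:
                      break
                  r += 0.5
              lengths[i] = r
          return lengths

      # Toy circular cross-section of radius ~10
      yy, xx = np.mgrid[0:64, 0:64]
      disk = (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2
      print(ray_lengths(disk, n_rays=8))  # roughly constant ~10 for a circle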

  15. Automatic Extraction of Optimal Endmembers from Airborne Hyperspectral Imagery Using Iterative Error Analysis (IEA) and Spectral Discrimination Measurements

    PubMed Central

    Song, Ahram; Chang, Anjin; Choi, Jaewan; Choi, Seokkeun; Kim, Yongil

    2015-01-01

    Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain and these algorithms cannot consider the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such a method. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two different airborne images such as Airborne Imaging Spectrometer for Application (AISA) and Compact Airborne Spectrographic Imager (CASI) data were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers and errors from mixed materials. PMID:25625907

  16. Automatic spike sorting for extracellular electrophysiological recording using unsupervised single linkage clustering based on grey relational analysis

    NASA Astrophysics Data System (ADS)

    Lai, Hsin-Yi; Chen, You-Yin; Lin, Sheng-Huang; Lo, Yu-Chun; Tsang, Siny; Chen, Shin-Yuan; Zhao, Wan-Ting; Chao, Wen-Hung; Chang, Yao-Chuan; Wu, Robby; Shih, Yen-Yu I.; Tsai, Sheng-Tsung; Jaw, Fu-Shan

    2011-06-01

    Automatic spike sorting is a prerequisite for neuroscience research on multichannel extracellular recordings of neuronal activity. A novel spike sorting framework, combining efficient feature extraction and an unsupervised clustering method, is described here. Wavelet transform (WT) is adopted to extract features from each detected spike, and the Kolmogorov-Smirnov test (KS test) is utilized to select discriminative wavelet coefficients from the extracted features. Next, an unsupervised single linkage clustering method based on grey relational analysis (GSLC) is applied for spike clustering. The GSLC uses the grey relational grade as the similarity measure, instead of the Euclidean distance, for distance calculation; the number of clusters is automatically determined by the elbow criterion in the threshold-cumulative distribution. Four simulated data sets with four noise levels and electrophysiological data recorded from the subthalamic nucleus of eight patients with Parkinson's disease during deep brain stimulation surgery are used to evaluate the performance of GSLC. Feature extraction results from the use of WT with the KS test indicate a reduced number of feature coefficients, as well as good noise rejection, despite similar spike waveforms. Accordingly, the use of GSLC for spike sorting achieves high classification accuracy in all simulated data sets. Moreover, J-measure results on the electrophysiological data indicate that the quality of spike sorting is adequate with the use of GSLC.
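
    The WT-plus-KS-test front end can be approximated with PyWavelets and SciPy: compute wavelet coefficients per spike, then keep the coefficients whose distribution across spikes deviates most from normality. The wavelet family, decomposition level, and number of kept coefficients are assumptions.

      import numpy as np
      import pywt
      from scipy import stats

      def wavelet_ks_features(spikes: np.ndarray, n_keep: int = 10) -> np.ndarray:
          """WT feature extraction with KS-test coefficient selection.

          spikes: (n_spikes, n_samples) aligned waveforms. Coefficients whose
          distribution across spikes deviates most from normality (largest KS
          statistic) are kept, since multimodality suggests separable units.
          """
          coeffs = np.array([np.concatenate(pywt.wavedec(s, "haar", level=4))
                             for s in spikes])
          ks_stat = np.zeros(coeffs.shape[1])
          for j in range(coeffs.shape[1]):
              c = coeffs[:, j]
              sd = c.std()
              if sd > 0:
                  ks_stat[j] = stats.kstest((c - c.mean()) / sd, "norm").statistic
          best = np.argsort(ks_stat)[-n_keep:]
          return coeffs[:, best]

      # Toy spikes: two waveform classes plus noise
      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 64)
      spikes = np.vstack(
          [np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=64) for _ in range(20)]
          + [np.sin(4 * np.pi * t) + 0.1 * rng.normal(size=64) for _ in range(20)])
      print(wavelet_ks_features(spikes).shape)  # (40, 10)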

  17. Conversation analysis at work: detection of conflict in competitive discussions through semi-automatic turn-organization analysis.

    PubMed

    Pesarin, Anna; Cristani, Marco; Murino, Vittorio; Vinciarelli, Alessandro

    2012-10-01

    This study proposes a semi-automatic approach aimed at detecting conflict in conversations. The approach is based on statistical techniques capable of identifying turn-organization regularities associated with conflict. The only manual step of the process is the segmentation of the conversations into turns (time intervals during which only one person talks) and overlapping speech segments (time intervals during which several persons talk at the same time). The rest of the process takes place automatically and the results show that conflictual exchanges can be detected with Precision and Recall around 70% (the experiments have been performed over 6 h of political debates). The approach brings two main benefits: the first is the possibility of analyzing potentially large amounts of conversational data with a limited effort, the second is that the model parameters provide indications on what turn-regularities are most likely to account for the presence of conflict. PMID:22009168

  18. Automatic Stabilization

    NASA Technical Reports Server (NTRS)

    Haus, FR

    1936-01-01

    This report lays more stress on the principles underlying automatic piloting than on the means of applications. Mechanical details of servomotors and the mechanical release device necessary to assure instantaneous return of the controls to the pilot in case of malfunction are not included. Descriptions are provided of various commercial systems.

  19. A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents

    PubMed Central

    Colomer Granero, Adrián; Fuentes-Hurtado, Félix; Naranjo Ornedo, Valery; Guixeres Provinciale, Jaime; Ausín, Jose M.; Alcañiz Raya, Mariano

    2016-01-01

    This work focuses on finding the most discriminatory or representative features that allow commercials to be classified as having negative, neutral or positive effectiveness based on the Ace Score index. For this purpose, an experiment involving forty-seven participants was carried out. In this experiment, electroencephalography (EEG), electrocardiography (ECG), Galvanic Skin Response (GSR) and respiration data were acquired while subjects were watching 30 min of audiovisual content. This content was composed of a submarine documentary and nine commercials (one of them the ad under evaluation). After signal pre-processing, four sets of features were extracted from the physiological signals using different state-of-the-art metrics. These features, computed in the time and frequency domains, are the inputs to several basic and advanced classifiers. An average of 89.76% of the instances was correctly classified according to the Ace Score index. The best results were obtained by a classifier consisting of a combination of AdaBoost and Random Forest with automatic feature selection. The selected features were those extracted from the GSR and HRV signals. These results are promising for the audiovisual content evaluation field by means of physiological signal processing. PMID:27471462
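
    The winning combination, AdaBoost over random forests with automatic feature selection, maps naturally onto scikit-learn; the sketch below uses placeholder data and parameter values (note that scikit-learn versions before 1.2 name the estimator argument base_estimator):

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 30))     # stand-in for GSR/HRV feature vectors
      y = rng.integers(0, 3, size=100)   # negative / neutral / positive effectiveness

      # Automatic feature selection feeding boosted random forests
      clf = make_pipeline(
          SelectKBest(f_classif, k=10),
          AdaBoostClassifier(estimator=RandomForestClassifier(n_estimators=50,
                                                              random_state=0),
                             n_estimators=10, random_state=0),
      )
      clf.fit(X, y)
      print(clf.predict(X[:5]))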

  1. Automatic control of a robot camera for broadcasting and subjective evaluation and analysis of reproduced images

    NASA Astrophysics Data System (ADS)

    Kato, Daiichiro; Ishikawa, Akio; Tsuda, Takao; Shimoda, Shigeru; Fukushima, Hiroshi

    2000-06-01

    We are studying an intelligent robot camera that can automatically shoot an object and produce images with a powerful sense of reality, as if a very skilled cameraman were at work. In this study, we designed a control algorithm based on cameramen's techniques to control the robot camera and conducted a series of experiments to understand the effects of camera work on how images look to viewers. The findings were as follows: (1) Evaluation scores are high when actual data from cameramen, especially typical data, are used as the position-adjusting velocity curve of the target. (2) Evaluation scores are relatively high for images taken with the feedback-feedforward camera control method when the target moves in one direction. (3) When both the direction and velocity of the target change, and when the target gets bigger and faster in the viewfinder, it becomes increasingly difficult to keep the target within the viewfinder using the control method that imitates human camera handling. (4) The method with mechanical feedback, on the other hand, is able to cope with rapid changes in the target's direction and velocity, constantly keeping the target within the viewfinder. Even so, viewers find the resulting images mechanical rather than natural.

  3. Multi-Objective Differential Evolution for Automatic Clustering with Application to Micro-Array Data Analysis

    PubMed Central

    Suresh, Kaushik; Kundu, Debarati; Ghosh, Sayan; Das, Swagatam; Abraham, Ajith; Han, Sang Yong

    2009-01-01

    This paper applies the Differential Evolution (DE) algorithm to the task of automatic fuzzy clustering in a Multi-objective Optimization (MO) framework. It compares the performances of two multi-objective variants of DE over the fuzzy clustering problem, where two conflicting fuzzy validity indices are simultaneously optimized. The resultant Pareto optimal set of solutions from each algorithm consists of a number of non-dominated solutions, from which the user can choose the most promising ones according to the problem specifications. A real-coded representation of the search variables, accommodating variable number of cluster centers, is used for DE. The performances of the multi-objective DE-variants have also been contrasted to that of two most well-known schemes of MO clustering, namely the Non Dominated Sorting Genetic Algorithm (NSGA II) and Multi-Objective Clustering with an unknown number of Clusters K (MOCK). Experimental results using six artificial and four real life datasets of varying range of complexities indicate that DE holds immense promise as a candidate algorithm for devising MO clustering schemes. PMID:22412346

  4. Analysis of electric energy consumption of automatic milking systems in different configurations and operative conditions.

    PubMed

    Calcante, Aldo; Tangorra, Francesco M; Oberti, Roberto

    2016-05-01

    Automatic milking systems (AMS) have been a revolutionary innovation in dairy cow farming. Currently, more than 10,000 dairy cow farms worldwide use AMS to milk their cows. Electric consumption is one of the most relevant and uncontrollable operational costs of AMS, ranging between 35 and 40% of their total annual operational costs. The aim of the present study was to measure and analyze the electric energy consumption of 4 AMS with different configurations: single box, and central unit featuring a central vacuum system for 1 cow unit and for 2 cow units. The electrical consumption (daily consumption, daily consumption per cow milked, consumption per milking, and consumption per 100 L of milk) of each AMS (milking unit + air compressor) was measured using 2 energy analyzers. The measurement period lasted 24 h with a sampling frequency of 0.2 Hz. The daily total energy consumption (milking unit + air compressor) ranged between 45.4 and 81.3 kWh; the consumption per cow milked ranged between 0.59 and 0.99 kWh; the consumption per milking ranged between 0.21 and 0.33 kWh; and the consumption per 100 L of milk ranged from 1.80 to 2.44 kWh, according to the different configurations and operational contexts considered. Results showed that AMS electric consumption was conditioned mainly by farm management rather than by machine characteristics/architectures.

  5. Formant analysis in dysphonic patients and automatic Arabic digit speech recognition

    PubMed Central

    2011-01-01

    Background and objective: There has been growing interest in the objective assessment of speech in dysphonic patients for the classification of the type and severity of voice pathologies using automatic speech recognition (ASR). The aim of this work was to study the accuracy of a conventional ASR system (with a Mel-frequency cepstral coefficient (MFCC) based front end and a hidden Markov model (HMM) based back end) in recognizing the speech characteristics of people with pathological voice. Materials and methods: The speech samples of 62 dysphonic patients with six different types of voice disorders and of 50 normal subjects were analyzed. Spoken Arabic digits were taken as input. The distribution of the first four formants of the vowel /a/ was extracted to examine deviation of the formants from normal. Results: 100% recognition accuracy was obtained for Arabic digits spoken by normal speakers. However, there was a significant loss of accuracy in the classifications when the digits were spoken by voice-disordered subjects. Moreover, no significant improvement in ASR performance was achieved after assessing a subset of the individuals with disordered voices who underwent treatment. Conclusion: The results of this study reveal that the current ASR technique is not a reliable tool for recognizing the speech of dysphonic patients. PMID:21624137
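
    A conventional MFCC/HMM digit recognizer of the kind evaluated here can be sketched with librosa and hmmlearn: one Gaussian HMM per digit, with recognition by maximum log-likelihood. Random noise stands in for real recordings, and all model sizes are assumptions.

      import numpy as np
      import librosa
      from hmmlearn.hmm import GaussianHMM

      def mfcc_frames(signal: np.ndarray, sr: int = 16000) -> np.ndarray:
          """Frame-level MFCC front end, as in a conventional ASR system."""
          return librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13).T  # (frames, 13)

      # Back end: one HMM per digit; recognition picks the highest log-likelihood
      rng = np.random.default_rng(0)
      models = {}
      for digit in range(3):                                  # three digits for the demo
          train = np.vstack([mfcc_frames(rng.normal(size=8000)) for _ in range(3)])
          models[digit] = GaussianHMM(n_components=4, covariance_type="diag",
                                      n_iter=10).fit(train)

      test = mfcc_frames(rng.normal(size=8000))
      print(max(models, key=lambda d: models[d].score(test)))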

  6. Automatic simultaneous determination of copper and lead in biological samples by flow injection/stripping voltammetric analysis

    PubMed Central

    Izquierdo, Andrés; de Castro, M. D. Luque; Valcárcel, Miguel

    1993-01-01

    An automatic, continuous method for the simultaneous determination of copper and lead based on flow injection analysis (FIA) and stripping voltammetry (SV) is proposed. The method affords the determination of the analytes at the ng/ml level (linear ranges 0.64 to 64.0 ng/ml and 2.1 to 62.2 ng/ml for copper and lead, respectively) with good precision (r.s.d. values smaller than 4%). The selectivity of SV allows the method to be applied to the determination of these analytes in fresh bovine liver samples and in certified reference materials from the National Institute of Standards and Technology and the Community Bureau of Reference. The performance of the method was assessed by repeatability and validation statistical studies. PMID:18924966

  7. Laser-induced breakdown spectroscopy for 24/7 automatic liquid slag analysis at a steel works.

    PubMed

    Sturm, Volker; Fleige, Rüdiger; de Kanter, Martinus; Leitner, Richard; Pilz, Karl; Fischer, Daniel; Hubmer, Gerhard; Noll, Reinhard

    2014-10-01

    Laser-induced breakdown spectroscopy (LIBS) is applied for the inline analysis of liquid slag at a steel works. The slag in the ladle of a slag transporter is measured at a distance of several meters during a short stop of the transporter. The slag surface with temperatures from ≈600 to ≈1400 °C consists of liquid slag and solidified slag parts. Automatic measurements at varying filling levels of the ladle are realized, and the duration amounts to 2 min including data transmission to the host computer. Analytical results of the major components such as CaO, Fe, SiO2, MgO, Mn, and Al2O3 are compared with reference values from the steel works laboratory for solid pressed slag samples as well as for samples from the liquid slag. Stable 24/7 operation during the first three-month test run was achieved.

  8. Antenna system analysis and design for automatic detection and real-time tracking of electron Bernstein waves in FTU

    NASA Astrophysics Data System (ADS)

    Bin, W.; Alessi, E.; Bruschi, A.; D'Arcangelo, O.; Figini, L.; Galperti, C.; Garavaglia, S.; Granucci, G.; Moro, A.

    2014-05-01

    The algorithm for the automatic control of the new front steering antenna of the Frascati Tokamak Upgrade device has been improved, in view of forthcoming experiments aimed at testing the mode conversion of electron cyclotron waves at a frequency of 140 GHz. The existing antenna system has been prepared to provide two-point real-time measurements of electron Bernstein waves and to allow real-time tracking of the optimal conversion region. This required an accurate analysis of the antenna to minimize the risk of a mechanical damage of the movable launching mirrors, when accessing the high toroidal launching angles needed for this kind of experiment. A detailed description is presented of the work carried out to safely reach and validate the desired range of steering angles, which include the region of interest, and a technique is proposed to track and chase the correct line of sight for electron Bernstein waves detection during the shot.

  9. Comparison of automatic control systems

    NASA Technical Reports Server (NTRS)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator, which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  10. AUTOMATIC COUNTER

    DOEpatents

    Robinson, H.P.

    1960-06-01

    An automatic counter of alpha particle tracks recorded by a sensitive emulsion of a photographic plate is described. The counter includes a source of modulated dark-field illumination for developing light flashes from the recorded particle tracks as the photographic plate is automatically scanned in narrow strips. Photoelectric means convert the light flashes to proportional current pulses for application to an electronic counting circuit. Photoelectric means are further provided for developing a phase reference signal from the photographic plate in such a manner that signals arising from particle tracks not parallel to the edge of the plate are out of phase with the reference signal. The counting circuit includes provision for rejecting the out-of-phase signals resulting from unoriented tracks as well as signals resulting from spurious marks on the plate such as scratches, dust or grain clumpings, etc. The output of the circuit is hence indicative only of the tracks that would be counted by a human operator.

  11. Automatic regional analysis of DTI properties in the developmental macaque brain

    NASA Astrophysics Data System (ADS)

    Styner, Martin; Knickmeyer, Rebecca; Coe, Christopher; Short, Sarah J.; Gilmore, John

    2008-03-01

    Many neuroimaging studies are carried out in monkeys, as pathologies and environmental exposures can be studied in well-controlled settings and environments. In this work, we present a framework for the use of an atlas-based, fully automatic segmentation of brain tissues, lobar parcellations, subcortical structures and the regional extraction of Diffusion Tensor Imaging (DTI) properties. We first built a structural atlas from training images by iterative, joint deformable registration into an unbiased average image. On this atlas, probabilistic tissue maps, a lobar parcellation and subcortical structures were determined. This information is applied to each subject's structural image via affine, followed by deformable, registration. The affinely transformed atlas is employed for a joint T1- and T2-based tissue classification. The deformed parcellation regions mask the tissue segmentations to define the parcellation for white and gray matter separately. Each subject's structural image is then non-rigidly matched with its DTI image by normalized mutual information, b-spline based registration. The DTI property histograms were then computed using the probabilistic white matter information for each lobar parcellation. We successfully built an average atlas using a developmental training dataset of 18 cases aged 16-34 months. Our framework was successfully applied to over 50 additional subjects in the age range of 9-70 months. The probabilistically weighted FA average in the corpus callosum region showed the largest increase over time in the observed age range. Most cortical regions showed a modest FA increase, whereas the cerebellum's FA values remained stable. The individual methods used in this segmentation framework have been applied before, but their combination is novel, as is their application to macaque MRI data. Furthermore, this is the first study to date looking at the DTI properties of the developing macaque brain.

  12. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    SciTech Connect

    Rashidi, M.; Dehmeshid, J.; Dickenson, E.; Daemi, F.

    1997-07-01

    This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid laced with a fluorescent dye or microspheres flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination purposes, a planar laser sheet passes through the column as a CCD camera records the laser-illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For measuring velocities, while the aqueous fluid laced with fluorescent microspheres flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed images are poor at this stage, some preprocessing is used to enhance the particles within the images. For concentration measurements, while the aqueous fluid laced with a fluorescent organic dye flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam traveling simultaneously with the camera. Subsequently, these recorded images are transferred to the computer for processing in a similar fashion to the velocity measurements. In order to have a fully automatic vision system, several detailed image processing techniques were developed to match corresponding images (taken at different times during the experiments) that have different intensity values but the same topological characteristics. This results in normalized interstitial chemical concentration as a function of time within the porous column.

  13. Automatic Indexing of Full Texts.

    ERIC Educational Resources Information Center

    Jonak, Zdenek

    1984-01-01

    Demonstrates efficiency of preparation of query description using semantic analyser method based on analysis of semantic structure of documents in field of automatic indexing. Results obtained are compared with automatic indexing results performed by traditional methods and results of indexing done by human indexers. Sample terms and codes are…

  14. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    SciTech Connect

    Fang, Y; Huang, H; Su, T

    2015-06-15

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic in imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts at applying such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). These indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination, with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and a specificity of 77%. Conclusion: Based on fully automatic procedures for data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination
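
    The ROC step is standard; a sketch with scikit-learn, where a synthetic heterogeneity index stands in for the texture feature and the operating point is chosen by Youden's J statistic (an assumption; the abstract does not state the cutoff rule):

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(0)
      y_true = rng.integers(0, 2, size=293)                  # PCI outcome (gold standard)
      heterogeneity = rng.normal(size=293) + 0.9 * y_true    # stand-in texture index

      auc = roc_auc_score(y_true, heterogeneity)
      fpr, tpr, thr = roc_curve(y_true, heterogeneity)
      best = np.argmax(tpr - fpr)                            # Youden's J for the cutoff
      print(f"AUC={auc:.2f}, sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")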

  15. A Global Approach to Accurate and Automatic Quantitative Analysis of NMR Spectra by Complex Least-Squares Curve Fitting

    NASA Astrophysics Data System (ADS)

    Martin, Y. L.

    The performance of quantitative analysis of 1D NMR spectra depends greatly on the choice of the NMR signal model. Complex least-squares analysis is well suited for optimizing the quantitative determination of spectra containing a limited number of signals (<30) obtained under satisfactory conditions of signal-to-noise ratio (>20). From a general point of view it is concluded, on the basis of mathematical considerations and numerical simulations, that, in the absence of truncation of the free-induction decay, complex least-squares curve fitting either in the time or in the frequency domain and linear-prediction methods are in fact nearly equivalent and give identical results. However, in the situation considered, complex least-squares analysis in the frequency domain is more flexible since it enables the quality of convergence to be appraised at every resonance position. An efficient data-processing strategy has been developed which makes use of an approximate conjugate-gradient algorithm. All spectral parameters (frequency, damping factors, amplitudes, phases, initial delay associated with intensity, and phase parameters of a baseline correction) are simultaneously managed in an integrated approach which is fully automatable. The behavior of the error as a function of the signal-to-noise ratio is theoretically estimated, and the influence of apodization is discussed. The least-squares curve fitting is theoretically proved to be the most accurate approach for quantitative analysis of 1D NMR data acquired with reasonable signal-to-noise ratio. The method enables complex spectral residuals to be sorted out. These residuals, which can be cumulated thanks to the possibility of correcting for frequency shifts and phase errors, extract systematic components, such as isotopic satellite lines, and characterize the shape and the intensity of the spectral distortion with respect to the Lorentzian model. This distortion is shown to be nearly independent of the chemical species
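    As a concrete illustration of complex least-squares curve fitting in the frequency domain, the sketch below fits a single complex Lorentzian line by stacking real and imaginary residuals. The model and the solver choice (scipy's least_squares rather than the paper's approximate conjugate-gradient algorithm) are assumptions made for brevity.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def lorentzian(f, amp, f0, damping, phase):
        """Frequency-domain spectrum of a damped complex exponential."""
        return amp * np.exp(1j * phase) / (damping + 2j * np.pi * (f - f0))

    def residuals(params, f, spectrum):
        r = spectrum - lorentzian(f, *params)
        return np.concatenate([r.real, r.imag])   # solver needs real residuals

    f = np.linspace(-50, 50, 1024)                # frequency axis, Hz
    clean = lorentzian(f, 10.0, 5.0, 3.0, 0.3)
    noisy = clean + 0.02 * (np.random.randn(f.size) + 1j * np.random.randn(f.size))

    fit = least_squares(residuals, x0=[5.0, 0.0, 1.0, 0.0], args=(f, noisy))
    amp, f0, damping, phase = fit.x               # recovered spectral parameters
    ```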

  16. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subject's medical data in controlled clinical trials is captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subject's image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to an communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system in clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images, which have been collected in the study so far, have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced, and, therefore, errors, latency and costs decreased. Our approach also increases data security and privacy.

  17. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    NASA Astrophysics Data System (ADS)

    Chan, T. O.; Lichti, D. D.; Belton, D.

    2013-10-01

    At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, the calibrations are complicated by the Velodyne LiDAR's narrow vertical field of view and the very highly time-variant nature of its measurements. In this paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method utilizes the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point clouds based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model in such a way that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automatic, and this allows end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scene. The methods were verified with two different real datasets, and the results suggest that up to 78
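    The layer-wise circular feature extraction can be sketched as below. The paper uses the Generalized Hough Transform on layer images; this simplified stand-in votes directly with the 2-D points for circle centres of a known radius, and the function names and grid sizes are assumptions.

    ```python
    import numpy as np

    def slice_layers(points, z_step=0.25):
        """Decompose an (N, 3) point cloud into horizontal 2-D layers."""
        z0 = points[:, 2].min()
        idx = ((points[:, 2] - z0) / z_step).astype(int)
        return [points[idx == k, :2] for k in range(idx.max() + 1)]

    def circle_centre(layer_xy, radius, grid=0.05):
        """Vote for circle centres of a known radius in one layer
        (a simplified circular Hough transform)."""
        theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
        offsets = radius * np.stack([np.cos(theta), np.sin(theta)], axis=-1)
        centres = (layer_xy[:, None, :] - offsets[None, :, :]).reshape(-1, 2)
        lo = centres.min(axis=0)
        bins = ((centres - lo) / grid).astype(int)
        acc = np.zeros(tuple(bins.max(axis=0) + 1), dtype=int)
        np.add.at(acc, (bins[:, 0], bins[:, 1]), 1)     # accumulate votes
        peak = np.unravel_index(acc.argmax(), acc.shape)
        return lo + np.array(peak) * grid, int(acc.max())
    ```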

  18. Automatic tremor detection and waveform component analysis using a neural network approach

    NASA Astrophysics Data System (ADS)

    Horstmann, T.; Harrington, R. M.; Cochran, E. S.; Wang, T.; Potier, C. E.

    2010-12-01

    Recent studies over the last decade have established that non-volcanic tremor is a ubiquitous phenomenon commonly observed in subduction zones. In recent years, it has also been widely observed in strike-slip faulting environments. In particular, observations of tremor along the San Andreas Fault indicate that many of the events occur at depths ranging between 15 and 45 km, suggesting that tremor typically occurs in the zone where fault slip behaviour transitions between stick-slip and stable-sliding frictional regimes. As such, much of the tremor occurs at or below the depths of microseismicity, and therefore has the potential to provide clues about the slip behaviour of faults at depth. Despite several recent advances, the origin and characteristics of tremor along strike-slip faults are not well-resolved. The emergent phase arrivals, low amplitude waveforms, and variable event durations associated with non-volcanic tremor make automatic tremor event detection a non-trivial task. Recent approaches employ a cross-correlation technique which readily isolates individual template tremor bursts within tremor episodes. However, the method tends to detect events with nearly identical waveforms and moveout across stations within an array. We employ a new method to identify tremor in large data volumes using an automated technique that does not require the use of a designated template event. Furthermore, the same method can be used to identify distinctive tremor waveform features, such as frequency content, polarity, and amplitude ratios. We use continuous broadband waveforms from 13 STS-2 seismometers deployed in May 2010 along the Cholame segment of the San Andreas Fault. The maximum station spacing within the array is approximately 25 km. We first cross-correlate waveform envelopes to reduce the data volume and find isolated seismic events. Next, we use a neural network approach to cluster events in the reduced data set. Because the unsupervised neural network algorithm
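    The envelope cross-correlation step used to reduce the data volume can be sketched as follows; the smoothing length and similarity measure are illustrative assumptions, not the authors' exact settings.

    ```python
    import numpy as np
    from scipy.signal import hilbert, correlate

    def envelope(trace, smooth=200):
        """Waveform envelope via the analytic signal, lightly smoothed."""
        env = np.abs(hilbert(trace))
        kernel = np.ones(smooth) / smooth
        return np.convolve(env, kernel, mode="same")

    def envelope_similarity(trace_a, trace_b):
        """Normalized peak cross-correlation between two station envelopes;
        thresholding this value flags coherent (candidate tremor) windows."""
        a = envelope(trace_a) - envelope(trace_a).mean()
        b = envelope(trace_b) - envelope(trace_b).mean()
        cc = correlate(a, b, mode="full")
        return cc.max() / (np.linalg.norm(a) * np.linalg.norm(b))
    ```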

  19. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
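    A sketch of the log-driven detection idea follows. The exact format of the first-pass statistics log varies between avconv/ffmpeg builds, so the field name ("ptex", for the predicted-texture bits mentioned above) and the thresholding rule are assumptions.

    ```python
    import re
    import numpy as np

    def frame_stats(logfile, field="ptex"):
        """Pull one per-frame statistic out of the first-pass encoder log."""
        pattern = re.compile(rf"{field}:(\d+)")
        values = [int(m.group(1)) for line in open(logfile)
                  if (m := pattern.search(line))]
        return np.array(values, dtype=float)

    def detect_events(values, nsigma=3.0, win=25):
        """Flag frames whose statistic departs from a running baseline."""
        baseline = np.convolve(values, np.ones(win) / win, mode="same")
        resid = values - baseline
        return np.where(np.abs(resid) > nsigma * resid.std())[0]
    ```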

  20. Clarifying Inconclusive Functional Analysis Results: Assessment and Treatment of Automatically Reinforced Aggression

    PubMed Central

    Saini, Valdeep; Greer, Brian D.; Fisher, Wayne W.

    2016-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy’s functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences maintaining aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation. PMID:25891269

  1. Clarifying inconclusive functional analysis results: Assessment and treatment of automatically reinforced aggression.

    PubMed

    Saini, Valdeep; Greer, Brian D; Fisher, Wayne W

    2015-01-01

    We conducted a series of studies in which multiple strategies were used to clarify the inconclusive results of one boy's functional analysis of aggression. Specifically, we (a) evaluated individual response topographies to determine the composition of aggregated response rates, (b) conducted a separate functional analysis of aggression after high rates of disruption masked the consequences that maintained aggression during the initial functional analysis, (c) modified the experimental design used during the functional analysis of aggression to improve discrimination and decrease interaction effects between conditions, and (d) evaluated a treatment matched to the reinforcer hypothesized to maintain aggression. An effective yet practical intervention for aggression was developed based on the results of these analyses and from data collected during the matched-treatment evaluation.

  2. On 3-D modeling and automatic regridding in shape design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Yao, Tse-Min

    1987-01-01

    The material derivative idea of continuum mechanics and the adjoint variable method of design sensitivity analysis are used to obtain a computable expression for the effect of shape variations on measures of structural performance of three-dimensional elastic solids.

  3. How automatic is the musical Stroop effect? Commentary on “The Musical Stroop Effect: Opening a New Avenue to Research on Automatisms” by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, Vol. 60, pp. 269–278).

    PubMed

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one big advantage of this variant is that the automaticity of note naming can be better controlled than in other Stroop variants as musicians are very practiced in note reading whereas non-musicians are not. In this comment we argue that at present the exact impact of automaticity in this Stroop variant remains somewhat unclear for at least three reasons, namely due to the type of information that is automatically retrieved when notes are encountered, due to the possible influence of object-based attention, and finally due to the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme group design. PMID:24449648

  4. How automatic is the musical Stroop effect? Commentary on “The Musical Stroop Effect: Opening a New Avenue to Research on Automatisms” by L. Grégoire, P. Perruchet, and B. Poulin-Charronnat (Experimental Psychology, 2013, Vol. 60, pp. 269–278).

    PubMed

    Moeller, Birte; Frings, Christian

    2014-01-01

    Grégoire, Perruchet, and Poulin-Charronnat (2013) investigated a musical variant of the reversed Stroop effect. According to the authors, one big advantage of this variant is that the automaticity of note naming can be better controlled than in other Stroop variants as musicians are very practiced in note reading whereas non-musicians are not. In this comment we argue that at present the exact impact of automaticity in this Stroop variant remains somewhat unclear for at least three reasons, namely due to the type of information that is automatically retrieved when notes are encountered, due to the possible influence of object-based attention, and finally due to the fact that the exact influence of expertise on interference cannot be pinpointed with an extreme group design.

  5. Automatic transmission

    SciTech Connect

    Ohkubo, M.

    1988-02-16

    An automatic transmission is described combining a stator reversing type torque converter and speed changer having first and second sun gears comprising: (a) a planetary gear train composed of first and second planetary gears sharing one planetary carrier in common; (b) a clutch and requisite brakes to control the planetary gear train; and (c) a speed-increasing or speed-decreasing mechanism is installed both in between a turbine shaft coupled to a turbine of the stator reversing type torque converter and the first sun gear of the speed changer, and in between a stator shaft coupled to a reversing stator and the second sun gear of the speed changer.

  6. Automatic stabilization

    NASA Technical Reports Server (NTRS)

    Haus, FR

    1936-01-01

    This report concerns the study of automatic stabilizers and extends it to include the control of the three-control system of the airplane instead of just altitude control. Some of the topics discussed include lateral disturbed motion, static stability, the mathematical theory of lateral motion, and large angles of incidence. Various mechanisms and stabilizers are also discussed.

  7. Automatic transmission

    SciTech Connect

    Miki, N.

    1988-10-11

    This patent describes an automatic transmission including a fluid torque converter, a first gear unit having three forward-speed gears and a single reverse gear, a second gear unit having a low-speed gear and a high-speed gear, and a hydraulic control system, the hydraulic control system comprising: a source of pressurized fluid; a first shift valve for controlling the shifting between the first-speed gear and the second-speed gear of the first gear unit; a second shift valve for controlling the shifting between the second-speed gear and the third-speed gear of the first gear unit; a third shift valve equipped with a spool having two positions for controlling the shifting between the low-speed gear and the high-speed gear of the second gear unit; a manual selector valve having a plurality of shift positions for distributing the pressurized fluid supply from the source of pressurized fluid to the first, second and third shift valves respectively; first, second and third solenoid valves corresponding to the first, second and third shift valves, respectively for independently controlling the operation of the respective shift valves, thereby establishing a six forward-speed automatic transmission by combining the low-speed gear and the high-speed gear of the second gear unit with each of the first-speed gear, the second speed gear and the third-speed gear of the first gear unit; and means to fixedly position the spool of the third shift valve at one of the two positions by supplying the pressurized fluid to the third shift valve when the manual selector valve is shifted to a particular shift position, thereby locking the second gear unit in one of low-speed gear and the high-speed gear, whereby the six forward-speed automatic transmission is converted to a three forward-speed automatic transmission when the manual selector valve is shifted to the particular shift position.

  8. [Reliability of % vol. declarations on labels of wine bottles].

    PubMed

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

    The Council Regulation (EC) no. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (Abl. L 179 dated 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles have to indicate, among others, information on the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength as established by analysis. Only when quality wines are stored in bottles for more than three years are the accepted tolerance limits +/- 0.8% vol. The investigation results presented here show that deviations have to be taken into account, which may be highly relevant for forensic practice.

  9. [Reliability of % vol. declarations on labels of wine bottles].

    PubMed

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

    The Council Regulation (EC) no. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (Abl. L 179 dated 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles have to indicate, among others, information on the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength as established by analysis. Only when quality wines are stored in bottles for more than three years are the accepted tolerance limits +/- 0.8% vol. The investigation results presented here show that deviations have to be taken into account, which may be highly relevant for forensic practice. PMID:15887778
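    The tolerance rule quoted in these records reduces to a one-line check; the sketch below simply encodes the 0.5% vol limit (0.8% vol for quality wines bottled for more than three years).

    ```python
    def vol_declaration_ok(declared_vol, measured_vol, quality_stored_3y=False):
        """True if the label's alcoholic strength is within the legal tolerance."""
        tolerance = 0.8 if quality_stored_3y else 0.5
        return abs(declared_vol - measured_vol) <= tolerance

    print(vol_declaration_ok(12.5, 13.2))        # False: deviation of 0.7 % vol
    print(vol_declaration_ok(12.5, 13.2, True))  # True: within the 0.8 % vol limit
    ```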

  10. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    PubMed Central

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60%, with a range of 3.47%–40.00% across different subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), CHAID decision tree analysis also identified a fourth risk factor, the maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of populations with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
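    A decision-tree risk model of this kind can be sketched as below. Note the substitution: scikit-learn implements CART rather than CHAID, and the data here are synthetic stand-ins for the study's predictors, so the numbers are illustrative only.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n = 1091
    X = np.column_stack([
        rng.integers(0, 2, n),   # maternal anemia during pregnancy
        rng.integers(0, 2, n),   # exclusive breastfeeding in first 6 months
        rng.integers(0, 2, n),   # floating population
        rng.integers(0, 4, n),   # maternal educational level (ordinal)
    ])
    # Synthetic outcome: baseline risk plus contributions from two predictors.
    y = (rng.random(n) < 0.05 + 0.10 * X[:, 0] + 0.08 * X[:, 2]).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=30).fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1]))
    ```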

  11. FamPipe: An Automatic Analysis Pipeline for Analyzing Sequencing Data in Families for Disease Studies

    PubMed Central

    Chung, Ren-Hua; Tsai, Wei-Yun; Kang, Chen-Yu; Yao, Po-Ju; Tsai, Hui-Ju; Chen, Chia-Hsiang

    2016-01-01

    In disease studies, family-based designs have become an attractive approach to analyzing next-generation sequencing (NGS) data for the identification of rare mutations enriched in families. Substantial research effort has been devoted to developing pipelines for automating sequence alignment, variant calling, and annotation. However, fewer pipelines have been designed specifically for disease studies. Most of the current analysis pipelines for family-based disease studies using NGS data focus on a specific function, such as identifying variants with Mendelian inheritance or identifying shared chromosomal regions among affected family members. Consequently, some other useful family-based analysis tools, such as imputation, linkage, and association tools, have yet to be integrated and automated. We developed FamPipe, a comprehensive analysis pipeline, which includes several family-specific analysis modules, including the identification of shared chromosomal regions among affected family members, prioritizing variants assuming a disease model, imputation of untyped variants, and linkage and association tests. We used simulation studies to compare properties of some modules implemented in FamPipe, and based on the results, we provided suggestions for the selection of modules to achieve an optimal analysis strategy. The pipeline is under the GNU GPL License and can be downloaded for free at http://fampipe.sourceforge.net. PMID:27272119

  12. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    SciTech Connect

    Wei, J; Yuan, A; Li, G

    2014-06-15

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully-automated 3D lung segmentation algorithm was designed and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, lung and its air component, and local volume changes at the diaphragm and chest wall to characterize breathing pattern. Segmented lung volumes in 12 patients are compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demons algorithm was applied for deformable image registration, and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this toolkit and the TPS is <±2%, and the time saving is 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational
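    The voxel-counting step described above amounts to multiplying the number of segmented voxels by the voxel volume; a minimal sketch (in Python rather than the toolkit's MATLAB, with assumed names) is shown below.

    ```python
    import numpy as np

    def structure_volume(mask, spacing):
        """Volume of a segmented structure by voxel counting.

        mask:    3-D boolean array, True inside the structure (e.g. a lung).
        spacing: (dz, dy, dx) voxel dimensions in mm.
        Returns the volume in millilitres.
        """
        voxel_mm3 = float(np.prod(spacing))
        return mask.sum() * voxel_mm3 / 1000.0    # mm^3 -> mL

    # Applying this to the mask of each 4DCT phase traces the breathing pattern:
    # volumes = [structure_volume(m, (2.5, 0.98, 0.98)) for m in phase_masks]
    ```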

  13. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
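    The core of the procedure, turning a point cloud into a solid of voxel elements, can be sketched by simple occupancy voxelization. Filling the interior between the inner and outer surfaces, which the paper handles via stacked point sections, is omitted here, and all names are assumptions.

    ```python
    import numpy as np

    def voxelize(points, voxel_size):
        """Occupancy voxelization of an (N, 3) point cloud.

        Each occupied cell corresponds to one hexahedral (voxel) finite
        element; the grid indices plus the origin let a mesh be written out.
        """
        origin = points.min(axis=0)
        idx = np.floor((points - origin) / voxel_size).astype(int)
        grid = np.zeros(tuple(idx.max(axis=0) + 1), dtype=bool)
        grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
        return grid, origin
    ```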

  14. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978

  15. [Automatic EEG analysis in the time domain and its possible clinical significance--presentation of a flexible software package].

    PubMed

    Spiel, G; Benninger, F

    1986-01-01

    The intention of the automatic EEG analysis is to take several EEG characteristics into account and therefore to be usable in different applications for quantifying events in the EEG. The procedure is based on the estimation of maxima and minima points within the measured data and the calculation of the wavelength of the half-waves. This is done by correcting the actually measured maxima-minima points along the t-axis by means of an interpolation technique; the frequency of half-waves is calculated from this solution with an accuracy of half a Hertz. This method was necessary because our equipment allows only a digitization rate of 8 ms (Harner, 1977). Using this procedure it is possible to record the frequency distribution, the distribution of amplitudes, and the distribution of steepness as distributions of elementary EEG characteristics. To characterize specified EEG patterns, the EEG data can be classified according to categories of combinations of quantified characteristics. If topological aspects are considered as well, there are the following possibilities: one elementary characteristic and one EEG channel; one elementary characteristic and two or more EEG channels; several elementary characteristics and one EEG channel; several elementary characteristics and two or more channels. There are possibilities of data reduction, exemplified on the distribution of frequencies without taking topological aspects into account. The above-mentioned methods of data reduction are useful for one EEG channel; on the other hand, a comparison of the EEG activity in different channels can be made. PMID:3774345

  16. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978

  17. Tensor based singular spectrum analysis for automatic scoring of sleep EEG.

    PubMed

    Kouchaki, Samaneh; Sanei, Saeid; Arbon, Emma L; Dijk, Derk-Jan

    2015-01-01

    A new supervised approach for decomposition of single channel signal mixtures is introduced in this paper. The performance of the traditional singular spectrum analysis algorithm is significantly improved by applying tensor decomposition instead of traditional singular value decomposition. As another contribution to this subspace analysis method, the inherent frequency diversity of the data has been effectively exploited to highlight the subspace of interest. As an important application, sleep electroencephalogram has been analyzed and the stages of sleep for the subjects in normal condition, with sleep restriction, and with sleep extension have been accurately estimated and compared with the results of sleep scoring by clinical experts.
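    For orientation, the classical matrix SSA step that the tensor method improves upon looks like the sketch below (embedding, SVD, diagonal averaging); the tensor decomposition and the supervised, frequency-diverse extensions of the paper are not shown.

    ```python
    import numpy as np

    def ssa_components(x, window, rank):
        """Classical single-channel SSA: embed, decompose, reconstruct."""
        n = len(x)
        k = n - window + 1
        X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        comps = []
        for r in range(rank):
            Xr = s[r] * np.outer(U[:, r], Vt[r])
            # Diagonal (Hankel) averaging maps the rank-1 matrix back to a series.
            comp = np.array([np.mean(Xr[::-1].diagonal(i - (window - 1)))
                             for i in range(n)])
            comps.append(comp)
        return np.array(comps)
    ```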

  18. Automatic transmission

    SciTech Connect

    Aoki, H.

    1989-03-21

    An automatic transmission is described, comprising: a torque converter including an impeller having a connected member, a turbine having an input member, and a reactor; and an automatic transmission mechanism having first to third clutches and plural gear units including a single planetary gear unit with a ring gear and a dual planetary gear unit with a ring gear. The single and dual planetary gear units have respective carriers integrally coupled with each other and respective sun gears integrally coupled with each other, the input member of the turbine being coupled with the ring gear of the single planetary gear unit through the first clutch, and being coupled with the sun gear through the second clutch. The connected member of the impeller is coupled with the ring gear of the dual planetary gear unit, the ring gear of the dual planetary gear unit is made to be restrained as required, and the carrier is coupled with an output member.

  19. Automatic co-registration of space-based sensors for precision change detection and analysis

    NASA Technical Reports Server (NTRS)

    Bryant, N.; Zobrist, A.; Logan, T.

    2003-01-01

    A variety of techniques were developed at JPL to assure sub-pixel co-registration of scenes and ortho-rectification of satellite imagery to other georeferenced information to permit precise change detection and analysis of low and moderate resolution space sensors.

  20. Automatic analysis of nuclear-magnetic-resonance-spectroscopy clinical research data

    NASA Astrophysics Data System (ADS)

    Scott, Katherine N.; Wilson, David C.; Bruner, Angela P.; Lyles, Teresa A.; Underhill, Brandon; Geiser, Edward A.; Ballinger, J. Ray; Scott, James D.; Stopka, Christine B.

    1998-03-01

    A major problem of P-31 nuclear magnetic resonance spectroscopy (MRS) in vivo applications is that, when large data sets are acquired, the time invested in data reduction and analysis with currently available technologies may totally overshadow the time required for data acquisition. An example is our MRS monitoring of exercise therapy for patients with peripheral vascular disease. In these studies, the spectral acquisition requires 90 minutes per patient study, whereas data analysis and reduction require 6-8 hours. Our laboratory currently uses the proprietary software SA/GE developed by General Electric. However, other software packages have similar limitations. When data analysis takes this long, the researcher does not have the rapid feedback required to ascertain the quality of the data acquired or the result of the study. This is highly undesirable even in a research environment, but becomes intolerable in the clinical setting. The purpose of this report is to outline progress towards the development of an automated method for eliminating the spectral analysis burden on the researcher working in the clinical setting.

  1. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    ERIC Educational Resources Information Center

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  2. "PolyCAFe"--Automatic Support for the Polyphonic Analysis of CSCL Chats

    ERIC Educational Resources Information Center

    Trausan-Matu, Stefan; Dascalu, Mihai; Rebedea, Traian

    2014-01-01

    Chat conversations and other types of online communication environments are widely used within CSCL educational scenarios. However, there is a lack of theoretical and methodological background for the analysis of collaboration. Manual assessing of non-moderated chat discussions is difficult and time-consuming, having as a consequence that learning…

  3. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    NASA Astrophysics Data System (ADS)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.

  4. A Semantic Approach to the Automatic Analysis of Japanese and English.

    ERIC Educational Resources Information Center

    Tamati, Tuneo; Kurihara, Tosihiko

    In order to mechanize the processing of natural language, the linguist must make the machine interpret the meaning, or semantic content of the language, in some way or other. This means that the machine should extract not only syntactic but also semantic information from the source sentence through the analysis of it. In this paper, the authors…

  5. Enzymatic Microreactors for the Determination of Ethanol by an Automatic Sequential Injection Analysis System

    NASA Astrophysics Data System (ADS)

    Alhadeff, Eliana M.; Salgado, Andrea M.; Cos, Oriol; Pereira, Nei; Valdman, Belkis; Valero, Francisco

    A sequential injection analysis system with two enzymatic microreactors for the determination of ethanol has been designed. Alcohol oxidase and horseradish peroxidase were separately immobilized on glass aminopropyl beads and packed in 0.91-mL volume microreactors, working in line with the sequential injection analysis system. A stop flow of 120 s was selected for a linear ethanol range of 0.005–0.04 g/L ± 0.6% relative standard deviation, with a throughput of seven analyses per hour. The system was applied to measure ethanol concentrations in samples of distilled and nondistilled alcoholic beverages, and of alcoholic fermentation, with good performance and no significant difference compared with other analytical procedures (gas chromatography and high-performance liquid chromatography).

  6. Toward Automatic Scalability Analysis of Message Passing Programs: A Case Study

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekhar R.; Mehra, Pankaj; Block, Robert; Tucker, Deanne (Technical Monitor)

    1994-01-01

    Scalability analysis forms an important component of any performance debugging cycle for massively parallel machines. However, tools that help in performing such analysis for parallel programs are non-existent. The primary reason for the lack of such tools is the complexity involved in capturing program dynamics such as communication-computation overlap, communication latencies, and memory hierarchy reference patterns. In this paper, we highlight some simple techniques that can be used to study the scalability of explicit message-passing parallel programs while considering the above issues. We start from the high-level source code and use a methodology for deducing communication characteristics and their impact on the total execution time of the program. The approach is validated with the help of a pipelined method for solving scalar tri-diagonal systems, using both simulations and symbolic cost models on the Intel hypercube.

  7. Automatic backscatter analysis of regional left ventricular systolic function using color kinesis.

    PubMed

    Schwartz, S L; Cao, Q L; Vannan, M A; Pandian, N G

    1996-06-15

    Assessment of regional wall motion by 2-dimensional echocardiography can be performed either by semiquantitative wall motion scoring or by quantitative analysis. The former is subjective and requires expertise; quantitative methods are too time-consuming for routine use in a busy clinical laboratory. Color kinesis is a new algorithm utilizing acoustic backscatter analysis. It provides a color-encoded map of endocardial motion in real time. In each frame a new color layer is added; the thickness of the color band represents endocardial motion during that frame. The end-systolic image has multiple color layers, representing regional and temporal heterogeneity of segmental motion. The purpose of this study was to validate the use of color kinesis semiquantitatively, in the analysis of regional left ventricular systolic function, and quantitatively, in the measurement of endocardial excursion. Semiquantitative wall motion scoring was performed in 18 patients using both 2-dimensional echocardiography and color kinesis. Scoring was identical in 74% of segments; there was 84% agreement in the definition of normal vs. abnormal. There was less interobserver variability in wall motion scoring using color kinesis. Endocardial excursion was quantified in 21 patients; 70% of the imaged segments were suitable for analysis. Correlation between 2-dimensional echocardiographic measurements and color kinesis was excellent, r = 0.87. The mean difference in excursion as measured by the 2 methods was -0.05 +/- 2.0 mm. In conclusion, color kinesis is a useful method for assessing regional contraction by displaying a color map of systolic endocardial excursion. This algorithm may improve the confidence and accuracy of assessment of segmental ventricular function by echocardiographic methods.

  8. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    PubMed Central

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
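    The central idea, executing tests only in previously-unseen states, can be sketched with a fingerprint set, assuming the relevant state can be serialized; the names below are illustrative, not the paper's implementation.

    ```python
    import hashlib
    import json

    seen_states = set()

    def state_fingerprint(state: dict) -> str:
        """Hash a canonical serialization of the relevant application state."""
        canon = json.dumps(state, sort_keys=True)
        return hashlib.sha256(canon.encode()).hexdigest()

    def maybe_run_tests(state: dict, run_tests) -> bool:
        """Run deployment-environment tests only in states not seen before."""
        fp = state_fingerprint(state)
        if fp in seen_states:
            return False              # already covered: skip the overhead
        seen_states.add(fp)
        run_tests(state)
        return True
    ```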

  9. Performance portability study of an automatic target detection and classification algorithm for hyperspectral image analysis using OpenCL

    NASA Astrophysics Data System (ADS)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Garcia, Carlos; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    Recent advances in heterogeneous high performance computing (HPC) have opened new avenues for demanding remote sensing applications. Perhaps one of the most popular algorithms in target detection and identification is the automatic target detection and classification algorithm (ATDCA), widely used in the hyperspectral image analysis community. Previous research has already investigated the mapping of ATDCA on graphics processing units (GPUs) and field programmable gate arrays (FPGAs), showing impressive speedup factors that allow its exploitation in time-critical scenarios. Based on these studies, our work explores the performance portability of a tuned OpenCL implementation across a range of processing devices including multicore processors, GPUs and other accelerators. This approach differs from previous papers, which focused on achieving the optimal performance on each platform. Here, we are more interested in the following issues: (1) evaluating whether a single code written in OpenCL allows us to achieve acceptable performance across all of them, and (2) assessing the gap between our portable OpenCL code and the hand-tuned versions previously investigated. Our study includes the analysis of different tuning techniques that expose data parallelism as well as enable an efficient exploitation of the complex memory hierarchies found in these new heterogeneous devices. Experiments have been conducted using hyperspectral data sets collected by NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensors. To the best of our knowledge, this kind of analysis has not been previously conducted in the hyperspectral imaging processing literature, and in our opinion it is very important in order to really calibrate the possibility of using heterogeneous platforms for efficient hyperspectral imaging processing in real remote sensing missions.

  10. Automatic snow extent extraction in alpine environments: short and medium term 2000-2006 analysis

    NASA Astrophysics Data System (ADS)

    Gamba, P.; Lisini, G.; Merlin, E.; Riva, F.

    2007-10-01

    Water resources in Northern Italy have dramatically diminished in the past 10 to 20 years, and recent phenomena connected to climate change have further sharpened the trend. To match the observed and collected information with this experience and to find methodologies to improve the water management cycle in the Lombardy Region, the University of Milan Bicocca, Fondazione Lombardia per l'Ambiente, and ARPA Lombardia are currently funding a project named "Regional Impact of Climatic Change in Lombardy Water Resources: Modelling and Applications" (RICLIC-WARM). In the framework of this project, the fraction of water available and provided to the whole regional network by the snow cover of the Alps will be investigated by means of remotely sensed data. While there are already a number of algorithms devoted to this task for data coming from various and different sensors in the visible and infrared regions, no operative comparison and analytical assessment of the advantages and drawbacks of using different data has been attempted. This idea will pave the way for a fusion of the available information as well as a multi-source mapping procedure able to successfully exploit the huge quantity of data available for the past and the even larger amount that may be accessed in the future. To this aim, a comparison on selected dates for the whole 2000/2006 period was performed.

  11. The metagenomics RAST server - a public resource for the automatic phylogenetic and functional analysis of metagenomes.

    SciTech Connect

    Meyer, F.; Paarmann, D.; D'Souza, M.; Olson, R.; Glass, E. M.; Kubal, M.; Paczian, T.; Stevens, R.; Wilke, A.; Wilkening, J.; Edwards, R. A.; Rodriguez, A.; Mathematics and Computer Science; Univ. of Chicago; San Diego State Univ.

    2008-09-19

    Random community genomes (metagenomes) are now commonly used to study microbes in different environments. Over the past few years, the major challenge associated with metagenomics has shifted from generating to analyzing sequences. High-throughput, low-cost next-generation sequencing has provided access to metagenomics to a wide range of researchers. A high-throughput pipeline has been constructed to provide high-performance computing to all researchers interested in using metagenomics. The pipeline produces automated functional assignments of sequences in the metagenome by comparing both protein and nucleotide databases. Phylogenetic and functional summaries of the metagenomes are generated, and tools for comparative metagenomics are incorporated into the standard views. User access is controlled to ensure data privacy, but the collaborative environment underpinning the service provides a framework for sharing datasets between multiple users. In the metagenomics RAST, all users retain full control of their data, and everything is available for download in a variety of formats. The open-source metagenomics RAST service provides a new paradigm for the annotation and analysis of metagenomes. With built-in support for multiple data sources and a back end that houses abstract data types, the metagenomics RAST is stable, extensible, and freely available to all researchers. This service has removed one of the primary bottlenecks in metagenome sequence analysis: the availability of high-performance computing for annotating the data.

  12. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    NASA Astrophysics Data System (ADS)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín.

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was considered as ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but better training is required.
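    The classification rule described above can be sketched as follows, reading "closest space" as smallest PCA reconstruction error; that reading, and the component count, are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def train_spaces(hists_by_class, n_components=5):
        """Fit one PCA subspace per BI-RADS category from training histograms
        (each value is an array of shape (n_images, n_bins))."""
        return {c: PCA(n_components=n_components).fit(h)
                for c, h in hists_by_class.items()}

    def classify(hist, spaces):
        """Assign the category whose subspace reconstructs the histogram best."""
        errors = {}
        for c, pca in spaces.items():
            recon = pca.inverse_transform(pca.transform(hist[None, :]))
            errors[c] = np.linalg.norm(hist - recon[0])
        return min(errors, key=errors.get)
    ```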

  13. Automaticity of Conceptual Magnitude

    PubMed Central

    Gliksman, Yarden; Itamar, Shai; Leibovich, Tali; Melman, Yonatan; Henik, Avishai

    2016-01-01

    What is bigger, an elephant or a mouse? This question can be answered without seeing the two animals, since these objects elicit conceptual magnitude. How is an object’s conceptual magnitude processed? It was suggested that conceptual magnitude is automatically processed; namely, irrelevant conceptual magnitude can affect performance when comparing physical magnitudes. The current study further examined this question and aimed to expand the understanding of automaticity of conceptual magnitude. Two different objects were presented and participants were asked to decide which object was larger on the screen (physical magnitude) or in the real world (conceptual magnitude), in separate blocks. By creating congruent (the conceptually larger object was physically larger) and incongruent (the conceptually larger object was physically smaller) pairs of stimuli it was possible to examine the automatic processing of each magnitude. A significant congruity effect was found for both magnitudes. Furthermore, quartile analysis revealed that the congruity was affected similarly by processing time for both magnitudes. These results suggest that the processing of conceptual and physical magnitudes is automatic to the same extent. The results support recent theories suggested that different types of magnitude processing and representation share the same core system. PMID:26879153

  14. Automatic neuron segmentation and neural network analysis method for phase contrast microscopy images

    PubMed Central

    Pang, Jincheng; Özkucur, Nurdan; Ren, Michael; Kaplan, David L.; Levin, Michael; Miller, Eric L.

    2015-01-01

    Phase Contrast Microscopy (PCM) is an important tool for the long term study of living cells. Unlike fluorescence methods which suffer from photobleaching of fluorophore or dye molecules, PCM image contrast is generated by the natural variations in optical index of refraction. Unfortunately, the same physical principles which allow for these studies give rise to complex artifacts in the raw PCM imagery. Of particular interest in this paper are neuron images where these image imperfections manifest in very different ways for the two structures of specific interest: cell bodies (somas) and dendrites. To address these challenges, we introduce a novel parametric image model using the level set framework and an associated variational approach which simultaneously restores and segments this class of images. Using this technique as the basis for an automated image analysis pipeline, results for both the synthetic and real images validate and demonstrate the advantages of our approach. PMID:26601004

  15. Automatic clustering and population analysis of white matter tracts using maximum density paths.

    PubMed

    Prasad, Gautam; Joshi, Shantanu H; Jahanshad, Neda; Villalon-Reina, Julio; Aganj, Iman; Lenglet, Christophe; Sapiro, Guillermo; McMahon, Katie L; de Zubicaray, Greig I; Martin, Nicholas G; Wright, Margaret J; Toga, Arthur W; Thompson, Paul M

    2014-08-15

    We introduce a framework for population analysis of white matter tracts based on diffusion-weighted images of the brain. The framework enables extraction of fibers from high angular resolution diffusion images (HARDI); clustering of the fibers based partly on prior knowledge from an atlas; representation of the fiber bundles compactly using a path following points of highest density (maximum density path; MDP); and registration of these paths together using geodesic curve matching to find local correspondences across a population. We demonstrate our method on 4-Tesla HARDI scans from 565 young adults to compute localized statistics across 50 white matter tracts based on fractional anisotropy (FA). Experimental results show increased sensitivity in the determination of genetic influences on principal fiber tracts compared to the tract-based spatial statistics (TBSS) method. Our results show that the MDP representation reveals important parts of the white matter structure and considerably reduces the dimensionality over comparable fiber matching approaches. PMID:24747738

  16. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    NASA Astrophysics Data System (ADS)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural-network-based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work on synthetic video artifacts. The results obtained by each method are compared with scores from a database derived from subjective experiments.
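
    As a concrete illustration of the regression step described in this record, the following is a minimal sketch (not the authors' implementation; scikit-learn is assumed, and synthetic values stand in for real HVS-based quality metrics) of mapping several objective metrics to a subjective quality score with a small neural network:

      # Hypothetical illustration: metrics and opinion scores are synthetic.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.uniform(size=(200, 3))   # e.g. blockiness, blur, noise metrics
      # Toy "mean opinion score" with a nonlinear dependence on the metrics
      mos = 5 - 2 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(0, 0.1, 200)

      X_tr, X_te, y_tr, y_te = train_test_split(X, mos, random_state=0)
      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X_tr, y_tr)
      print("predicted MOS for first test items:", model.predict(X_te[:3]))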

  17. Automatic Clustering and Population Analysis of White Matter Tracts using Maximum Density Paths

    PubMed Central

    Prasad, Gautam; Joshi, Shantanu H.; Jahanshad, Neda; Villalon-Reina, Julio; Aganj, Iman; Lenglet, Christophe; Sapiro, Guillermo; McMahon, Katie L.; de Zubicaray, Greig I.; Martin, Nicholas G.; Wright, Margaret J.; Toga, Arthur W.; Thompson, Paul M.

    2014-01-01

    We introduce a framework for population analysis of white matter tracts based on diffusion-weighted images of the brain. The framework enables extraction of fibers from high angular resolution diffusion images (HARDI); clustering of the fibers based partly on prior knowledge from an atlas; representation of the fiber bundles compactly using a path following points of highest density (maximum density path; MDP); and registration of these paths together using geodesic curve matching to find local correspondences across a population. We demonstrate our method on 4-Tesla HARDI scans from 565 young adults to compute localized statistics across 50 white matter tracts based on fractional anisotropy (FA). Experimental results show increased sensitivity in the determination of genetic influences on principal fiber tracts compared to the tract-based spatial statistics (TBSS) method. Our results show that the MDP representation reveals important parts of the white matter structure and considerably reduces the dimensionality over comparable fiber matching approaches. PMID:24747738

  18. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created using dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for extraction of specific layout parameters, which were then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to insufficient N-well to P-well spacing. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis of a large area of a high-density standard cell library was done. Another set of monitoring focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified asymmetric transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  19. Suitability of UK Biobank Retinal Images for Automatic Analysis of Morphometric Properties of the Vasculature

    PubMed Central

    MacGillivray, Thomas J; Cameron, James R.; Zhang, Qiuli; El-Medany, Ahmed; Mulholland, Carl; Sheng, Ziyan; Dhillon, Bal; Doubal, Fergus N.; Foster, Paul J.

    2015-01-01

    Purpose: To assess the suitability of retinal images held in the UK Biobank - the largest retinal data repository in a prospective population-based cohort - for computer assisted vascular morphometry, generating measures that are commonly investigated as candidate biomarkers of systemic disease. Methods: Non-mydriatic fundus images from both eyes of 2,690 participants - people with a self-reported history of myocardial infarction (n=1,345) and a matched control group (n=1,345) - were analysed using VAMPIRE software. These images were drawn from those of 68,554 UK Biobank participants who underwent retinal imaging at recruitment. Four operators were trained in the use of the software to measure retinal vascular tortuosity and bifurcation geometry. Results: Total operator time was approximately 360 hours (4 minutes per image). 2,252 (84%) of participants had at least one image of sufficient quality for the software to process, i.e. there was sufficient detection of retinal vessels in the image by the software to attempt the measurement of the target parameters. 1,604 (60%) of participants had an image of at least one eye that was adequately analysed by the software, i.e. the measurement protocol was successfully completed. Increasing age was associated with a reduced proportion of images that could be processed (p=0.0004) and analysed (p<0.0001). Cases exhibited more acute arteriolar branching angles (p=0.02) as well as lower arteriolar and venular tortuosity (p<0.0001). Conclusions: A proportion of the retinal images in UK Biobank are of insufficient quality for automated analysis. However, the large size of the UK Biobank means that tens of thousands of images are available and suitable for computational analysis. Parametric information measured from the retinas of participants with suspected cardiovascular disease was significantly different to that measured from a matched control group. PMID:26000792

  20. Index to the Journal of American Indian Education, Vol. 1, No. 1 - Vol. 8, No. 1.

    ERIC Educational Resources Information Center

    Loomis, Charlotte Ann

    All articles (112) that appeared in the "Journal of American Indian Education" (JAIE), Vol. 1, No. 1 (June 1961) through Vol. 8, No. 1 (October 1968) are indexed and annotated. The publication is divided into 3 parts: (1) annotations listed in order of appearance in JAIE by volume, number, and page; (2) author index; and (3) subject index. Later…

  1. Bayesian analysis of fingerprint, face and signature evidences with automatic biometric systems.

    PubMed

    Gonzalez-Rodriguez, Joaquin; Fierrez-Aguilar, Julian; Ramos-Castro, Daniel; Ortega-Garcia, Javier

    2005-12-20

    The Bayesian approach provides a unified and logical framework for the analysis of evidence and for reporting results in the form of likelihood ratios (LR) from the forensic laboratory to court. In this contribution we clarify how the biometric scientist or laboratory can adapt their conventional biometric systems or technologies to work according to this Bayesian approach. Forensic systems providing their results in the form of LR will be assessed through Tippett plots, which give a clear representation of the LR-based performance both for targets (the suspect is the author/source of the test pattern) and non-targets. However, the computation procedures of the LR values, especially with biometric evidence, are still an open issue. Reliable estimation techniques showing good generalization properties are required for the estimation of the between- and within-source variabilities of the test pattern, as are variance restriction techniques in the within-source density estimation to account for the variability of the source over time. Fingerprint, face and on-line signature recognition systems are adapted to work according to this Bayesian approach, showing both the range of likelihood ratios in each application and the adequacy of these biometric techniques to daily forensic work.
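
    To make the likelihood-ratio computation concrete, here is a minimal sketch (an illustration under stated assumptions, not the paper's method: scipy is assumed, and the score distributions are synthetic) of converting a biometric comparison score into an LR via kernel density estimates of target and non-target scores:

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(1)
      target_scores = rng.normal(2.0, 0.5, 500)      # same-source comparisons
      nontarget_scores = rng.normal(0.0, 0.7, 5000)  # different-source comparisons

      f_target = gaussian_kde(target_scores)
      f_nontarget = gaussian_kde(nontarget_scores)

      def likelihood_ratio(score):
          # LR = p(score | same source) / p(score | different source)
          return (f_target(score) / f_nontarget(score)).item()

      print(likelihood_ratio(1.8))   # LR >> 1 supports the same-source hypothesis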

  2. Automatic unattended sampling and analysis of background levels of C2-C5 hydrocarbons

    NASA Astrophysics Data System (ADS)

    Mowrer, Jacques; Lindskog, Anne

    As part of the European program for monitoring anthropogenic air pollutants (EUROTRAC), C2-C5 hydrocarbons (gas phase) are being routinely measured at a background station at Rörvik, Sweden. A 2 ℓ air sample is taken every 4 h, and a compressed air standard and helium blank are analysed daily. The method is based on adsorption of the hydrocarbons onto an active-charcoal-based adsorbent, desorption/cryofocusing onto a capillary trap, and analysis using capillary gas chromatography with a flame ionization detector. A Perma Pure dryer is used to remove water from the sample, and hydrocarbons > C6 are removed using a Tenax adsorbent. The analytical instrument can be left unattended for up to 2 weeks at a time, depending on the consumption of liquid nitrogen and the compressed gases. Baseline or near-baseline resolution is obtained for the 23 hydrocarbons monitored in this study. Reproducibility is 1-2% for the C2-C4 isomers and 2-15% for the C5 isomers. The detection limit is 1-7 pptv. Preliminary mean hydrocarbon concentrations are presented for the period 21 February-9 April 1989.

  3. Automatic QSO Selection Algorithm Using Time Series Analysis and Machine Learning

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Won; Protopapas, P.; Alcock, C.; Byun, Y.; Khardon, R.

    2011-01-01

    We present a new QSO selection algorithm using time series analysis and supervised machine learning. To characterize the lightcurves, we extracted multiple time-series features such as period, amplitude, color and autocorrelation value. We then used a Support Vector Machine (SVM), a supervised machine learning algorithm, to separate QSOs from other types of variable stars, microlensing events and non-variable stars. In order to train the QSO SVM model, we used 58 known QSOs, 1,629 variable stars and 4,288 non-variable stars from the MAssive Compact Halo Objects (MACHO) database. A cross-validation test shows that the model identifies 80% of known QSOs with a 25% false-positive rate. Most of the false positives during cross-validation are Be stars, known to show variability characteristics similar to QSOs. We applied the trained QSO SVM model to the MACHO Large Magellanic Cloud (LMC) dataset, which consists of 40 million lightcurves, and found 1,097 QSO candidates. We crossmatched the candidates with several astronomical catalogs including the Spitzer SAGE (Surveying the Agents of a Galaxy's Evolution) LMC catalog and various X-ray catalogs. The results suggest that most of the candidates are likely true QSOs.
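
    A minimal sketch of the SVM separation step (scikit-learn assumed; the feature table below is synthetic rather than extracted from MACHO lightcurves) might look like this:

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n = 300
      X = np.column_stack([
          rng.lognormal(3, 1, n),     # period
          rng.uniform(0.1, 2.0, n),   # amplitude
          rng.normal(0.5, 0.3, n),    # color
          rng.uniform(0, 1, n),       # autocorrelation value
      ])
      y = (X[:, 3] + 0.1 * rng.normal(size=n) > 0.6).astype(int)  # 1 = QSO-like

      clf = SVC(kernel="rbf", class_weight="balanced")
      print("cross-validation accuracy:", cross_val_score(clf, X, y, cv=5).mean())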

  4. Assessment of features for automatic CTG analysis based on expert annotation.

    PubMed

    Chudácek, Vacláv; Spilka, Jirí; Lhotská, Lenka; Janku, Petr; Koucký, Michal; Huptych, Michal; Bursa, Miroslav

    2011-01-01

    Cardiotocography (CTG), the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), has been used routinely by obstetricians since the 1960s to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features ever used for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time- and frequency-domain features, are investigated, and the features are assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. Annotation derived from a panel of experts, instead of the commonly utilized pH values, was used for evaluation of the features on a large data set (552 records). We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to a meta-analysis of three different ranking methods. The number of accelerations and decelerations, the interval index, as well as Lempel-Ziv complexity and Higuchi's fractal dimension are among the top five features. PMID:22255719

  5. Automatic Digital Analysis of Chromogenic Media for Vancomycin-Resistant-Enterococcus Screens Using Copan WASPLab.

    PubMed

    Faron, Matthew L; Buchan, Blake W; Coon, Christopher; Liebregts, Theo; van Bree, Anita; Jansz, Arjan R; Soucy, Genevieve; Korver, John; Ledeboer, Nathan A

    2016-10-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-acquired infections (HAIs). Studies have shown that active surveillance of high-risk patients for VRE colonization can aid in reducing HAIs; however, these screens generate a significant cost to the laboratory and health care system. Digital imaging capable of differentiating negative and "nonnegative" chromogenic agar can reduce the labor cost of these screens and potentially improve patient care. In this study, we evaluated the performance of the WASPLab Chromogenic Detection Module (CDM) (Copan, Brescia, Italy) software to analyze VRE chromogenic agar and compared the results to technologist plate reading. Specimens collected at 3 laboratories were cultured using the WASPLab CDM and plated to each site's standard-of-care chromogenic media, which included Colorex VRE (BioMed Diagnostics, White City, OR) or Oxoid VRE (Oxoid, Basingstoke, United Kingdom). Digital images were scored using the CDM software after 24 or 40 h of growth, and all manual reading was performed using digital images on a high-definition (HD) monitor. In total, 104,730 specimens were enrolled and automation agreed with manual analysis for 90.1% of all specimens tested, with sensitivity and specificity of 100% and 89.5%, respectively. Automation results were discordant for 10,348 specimens, and all discordant images were reviewed by a laboratory supervisor or director. After a second review, 499 specimens were identified as representing missed positive cultures falsely called negative by the technologist, 1,616 were identified as containing borderline color results (negative result but with no package insert color visible), and 8,234 specimens were identified as containing colorimetric pigmentation due to residual matrix from the specimen or yeast (Candida). Overall, the CDM was accurate at identifying negative VRE plates, which comprised 84% (87,973) of the specimens in this study. PMID:27413193

  6. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
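
    In the same spirit (a hypothetical sketch, not the MAGE tool itself), programmatic Mif generation amounts to filling a MIF 2.1 text template once per swept parameter value; the template below is illustrative and deliberately incomplete:

      # Hypothetical template; a real OOMMF run needs further Specify blocks.
      MIF_TEMPLATE = """# MIF 2.1
      Specify Oxs_BoxAtlas:atlas {{
        xrange {{0 {size_x}}}  yrange {{0 {size_y}}}  zrange {{0 {size_z}}}
      }}
      Specify Oxs_RectangularMesh:mesh {{
        cellsize {{{cell} {cell} {cell}}}  atlas :atlas
      }}
      """

      for i, hx in enumerate([1e3, 2e3, 4e3]):   # simple parameter sweep (A/m)
          mif = MIF_TEMPLATE.format(size_x=200e-9, size_y=100e-9, size_z=5e-9,
                                    cell=5e-9) + f"# swept field Hx = {hx}\n"
          with open(f"sweep_{i}.mif", "w") as fh:
              fh.write(mif)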

  7. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    PubMed Central

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-01-01

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms. PMID:26393595

  8. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    PubMed

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  9. Large-scale tracking and classification for automatic analysis of cell migration and proliferation, and experimental optimization of high-throughput screens of neuroblastoma cells.

    PubMed

    Harder, Nathalie; Batra, Richa; Diessl, Nicolle; Gogolin, Sina; Eils, Roland; Westermann, Frank; König, Rainer; Rohr, Karl

    2015-06-01

    Computational approaches for automatic analysis of image-based high-throughput and high-content screens are gaining increased importance to cope with the large amounts of data generated by automated microscopy systems. Typically, automatic image analysis is used to extract phenotypic information once all images of a screen have been acquired. However, also in earlier stages of large-scale experiments image analysis is important, in particular, to support and accelerate the tedious and time-consuming optimization of the experimental conditions and technical settings. We here present a novel approach for automatic, large-scale analysis and experimental optimization with application to a screen on neuroblastoma cell lines. Our approach consists of cell segmentation, tracking, feature extraction, classification, and model-based error correction. The approach can be used for experimental optimization by extracting quantitative information which allows experimentalists to optimally choose and to verify the experimental parameters. This involves systematically studying the global cell movement and proliferation behavior. Moreover, we performed a comprehensive phenotypic analysis of a large-scale neuroblastoma screen including the detection of rare division events such as multi-polar divisions. Major challenges of the analyzed high-throughput data are the relatively low spatio-temporal resolution in conjunction with densely growing cells as well as the high variability of the data. To account for the data variability we optimized feature extraction and classification, and introduced a gray value normalization technique as well as a novel approach for automatic model-based correction of classification errors. In total, we analyzed 4,400 real image sequences, covering observation periods of around 120 h each. We performed an extensive quantitative evaluation, which showed that our approach yields high accuracies of 92.2% for segmentation, 98.2% for tracking, and 86.5% for

  10. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time-Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    NASA Astrophysics Data System (ADS)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work made in the past 25 years has established the Stressmeter as an automatic stress measurement system to study the timing of forthcoming major earthquakes, in support of the current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed. The cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced to the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as in all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  11. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    NASA Astrophysics Data System (ADS)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions which contributes to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high resolution gamma spectra were collected every 15 minutes using a HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful for identifying the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  12. Automatic detection of a hand-held needle in ultrasound via phase-based analysis of the tremor motion

    NASA Astrophysics Data System (ADS)

    Beigi, Parmida; Salcudean, Septimiu E.; Rohling, Robert; Ng, Gary C.

    2016-03-01

    This paper presents an automatic localization method for a standard hand-held needle in ultrasound based on temporal motion analysis of spatially decomposed data. Subtle displacement arising from tremor motion has a periodic pattern which is usually imperceptible in the intensity image but may convey information in the phase image. Our method aims to detect such periodic motion of a hand-held needle and distinguish it from intrinsic tissue motion, using a technique inspired by video magnification. Complex steerable pyramids allow specific design of the wavelets' orientations according to the insertion angle as well as measurement of the local phase. We therefore use steerable pairs of even and odd Gabor wavelets to decompose the ultrasound B-mode sequence into various spatial frequency bands. Variations of the local phase measurements in the spatially decomposed input data are then temporally analyzed using a finite impulse response bandpass filter to detect regions with a tremor motion pattern. Results obtained from different pyramid levels are then combined and thresholded to generate the binary mask input for the Hough transform, which determines an estimate of the direction angle and discards some of the outliers. Polynomial fitting is used at the final stage to remove any remaining outliers and improve the trajectory detection. The detected needle is finally added back to the input sequence as an overlay of a cloud of points. We demonstrate the efficiency of our approach in detecting the needle using subtle tremor motion in an agar phantom and in in-vivo porcine cases where intrinsic motion is also present. The localization accuracy was calculated by comparison to expert manual segmentation, and is presented as (mean, standard deviation and root-mean-square error) of (0.93°, 1.26° and 0.87°) for the trajectory and (1.53 mm, 1.02 mm and 1.82 mm) for the tip.
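
    The temporal analysis step can be pictured with a minimal sketch (scipy assumed; the per-pixel phase signal is synthetic, the 4-12 Hz tremor band is an assumption rather than a value from the paper, and a Butterworth filter stands in for the paper's FIR filter):

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 60.0                                   # assumed frame rate (Hz)
      t = np.arange(600) / fs
      tremor = 0.2 * np.sin(2 * np.pi * 8.0 * t)  # 8 Hz tremor-like motion
      drift = 0.5 * np.sin(2 * np.pi * 0.3 * t)   # slow tissue motion
      noise = 0.05 * np.random.default_rng(3).normal(size=t.size)
      phase_signal = tremor + drift + noise       # local phase at one pixel

      # Band-pass filter isolating the assumed tremor band
      b, a = butter(4, [4.0 / (fs / 2), 12.0 / (fs / 2)], btype="band")
      tremor_band = filtfilt(b, a, phase_signal)
      energy = float(np.mean(tremor_band ** 2))
      label = "needle pixel" if energy > 0.01 else "background"
      print(f"tremor-band energy = {energy:.4f} -> {label}")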

  13. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    PubMed

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes. PMID:12435377
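
    For reference, the process dissociation procedure named above separates a conscious contribution C from an automatic contribution A with the standard estimating equations (a textbook reconstruction of Jacoby's procedure, not equations quoted from this record), here in LaTeX:

      \begin{aligned}
      P(\text{respond}\mid\text{inclusion}) &= C + A(1 - C) \\
      P(\text{respond}\mid\text{exclusion}) &= A(1 - C) \\
      \Rightarrow\quad C &= P_{\text{inc}} - P_{\text{exc}}, \qquad
      A = \frac{P_{\text{exc}}}{1 - C}
      \end{aligned}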

  14. Automatic transmission

    SciTech Connect

    Miura, M.; Inuzuka, T.

    1986-08-26

    1. An automatic transmission with four forward speeds and one reverse position is described which consists of: an input shaft; an output member; first and second planetary gear sets each having a sun gear, a ring gear and a carrier supporting a pinion in mesh with the sun gear and ring gear; the carrier of the first gear set, the ring gear of the second gear set and the output member all being connected; the ring gear of the first gear set connected to the carrier of the second gear set; a first clutch means for selectively connecting the input shaft to the sun gear of the first gear set, including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a second clutch means for selectively connecting the input shaft to the sun gear of the second gear set; a third clutch means for selectively connecting the input shaft to the carrier of the second gear set, including friction elements, a piston selectively engaging the friction elements and a fluid servo in which hydraulic fluid is selectively supplied to the piston; a first drive-establishing means for selectively preventing rotation of the ring gear of the first gear set and the carrier of the second gear set in only one direction and, alternatively, in any direction; a second drive-establishing means for selectively preventing rotation of the sun gear of the second gear set; and a drum being open to the first planetary gear set, with a cylindrical intermediate wall, an inner peripheral wall and an outer peripheral wall, and forming the hydraulic servos of the first and third clutch means between the intermediate wall and the inner peripheral wall and between the intermediate wall and the outer peripheral wall respectively.

  15. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    NASA Astrophysics Data System (ADS)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    Accurate knowledge of fabric anisotropy is crucial to understand the mechanical behavior of snow and firn, but is also important for understanding metamorphism. The Computer-Integrated Polarization Microscopy (CIP) method used for fabric analysis was developed by Heilbronner and Pauli in the early 1990s and uses a slightly modified traditional polarization microscope. First developed for quartz, it can be applied to other uniaxial minerals. Up to now this method has mainly been used in structural geology; however, it is also well suited for the fabric analysis of snow, firn and ice. The method is based on the analysis of first-order interference color images by a slightly modified optical polarization microscope, a grayscale camera and a computer. The optical polarization microscope is fitted with high-quality objectives, a rotating table and two polarizers that can be introduced above and below the thin section, as well as a full-wave plate. Additionally, two quarter-wave plates for circular polarization are needed; otherwise it is also possible to create circular polarization from a set of cross-polarized images through image processing. A narrow-band interference filter transmitting a wavelength between 660 and 700 nm is also required. Finally, a monochrome digital camera is used to capture the input images. The idea is to record the change of interference colors while the thin section is rotated once through 180°. The azimuth and inclination of the c-axis are defined by the color change. Recording the color change through a red filter produces a signal with a well-defined amplitude and phase angle. An advantage of this method lies in the simple conversion of an ordinary optical microscope to a fabric analyzer. The Automatic Ice Texture Analyzer (AITA), the first fully functional instrument to measure c-axis orientation, was developed by Wilson and others (2003). Most recent fabric analysis of snow and firn samples was carried

  16. Automatic analysis of medial temporal lobe atrophy from structural MRIs for the early assessment of Alzheimer disease

    PubMed Central

    Calvini, Piero; Chincarini, Andrea; Gemme, Gianluca; Penco, Maria Antonietta; Squarcia, Sandro; Nobili, Flavio; Rodriguez, Guido; Bellotti, Roberto; Catanzariti, Ezio; Cerello, Piergiorgio; De Mitri, Ivan; Fantacci, Maria Evelina

    2009-01-01

    The purpose of this study is to develop a software for the extraction of the hippocampus and surrounding medial temporal lobe (MTL) regions from T1-weighted magnetic resonance (MR) images with no interactive input from the user, to introduce a novel statistical indicator, computed on the intensities in the automatically extracted MTL regions, which measures atrophy, and to evaluate the accuracy of the newly developed intensity-based measure of MTL atrophy to (a) distinguish between patients with Alzheimer disease (AD), patients with amnestic mild cognitive impairment (aMCI), and elderly controls by using established criteria for patients with AD and aMCI as the reference standard and (b) infer the clinical outcome of aMCI patients. For the development of the software, the study included 61 patients with mild AD (17 men, 44 women; mean age±standard deviation (SD), 75.8 years±7.8; Mini Mental State Examination (MMSE) score, 24.1±3.1), 42 patients with aMCI (11 men, 31 women; mean age±SD, 75.2 years±4.9; MMSE score, 27.9±1.9), and 30 elderly healthy controls (10 men, 20 women; mean age±SD, 74.7 years±5.2; MMSE score, 29.1±0.8). For the evaluation of the statistical indicator, the study included 150 patients with mild AD (62 men, 88 women; mean age±SD, 76.3 years±5.8; MMSE score, 23.2±4.1), 247 patients with aMCI (143 men, 104 women; mean age±SD, 75.3 years±6.7; MMSE score, 27.0±1.8), and 135 elderly healthy controls (61 men, 74 women; mean age±SD, 76.4 years±6.1). Fifty aMCI patients were evaluated every 6 months over a 3 year period to assess conversion to AD. For each participant, two subimages of the MTL regions were automatically extracted from T1-weighted MR images with high spatial resolution. An intensity-based MTL atrophy measure was found to separate the control, MCI, and AD cohorts. Group differences were assessed by using a two-sample t test. Individual classification was analyzed by using receiver operating characteristic (ROC) curves. Compared to controls

  17. "AID"-ing Academic Program Evaluation: The "Automatic Interaction Detector" as Analysis Tool. AIR 1984 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Bloom, Allan M.; And Others

    The use of the Automatic Interaction Detector (program AID3 of the OSIRIS statistical package) to study a university program is discussed. The performance of students who took general physics lecture and laboratory concurrently is compared to the performance of those who took them separately. Five years of data are analyzed, covering 1,997…

  18. How well Do Phonological Awareness and Rapid Automatized Naming Correlate with Chinese Reading Accuracy and Fluency? A Meta-Analysis

    ERIC Educational Resources Information Center

    Song, Shuang; Georgiou, George K.; Su, Mengmeng; Hua, Shu

    2016-01-01

    Previous meta-analyses on the relationship between phonological awareness, rapid automatized naming (RAN), and reading have been conducted primarily in English, an atypical alphabetic orthography. Here, we aimed to examine the association between phonological awareness, RAN, and word reading in a nonalphabetic language (Chinese). A random-effects…

  19. A computer program to automatically generate state equations and macro-models. [for network analysis and design]

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.; Bowers, J. C.; Oreilly, J. E., Jr.

    1978-01-01

    A computer program, PROSE, that produces nonlinear state equations from a simple topological description of an electrical or mechanical network is described. Unnecessary states are also automatically eliminated, so that a simplified terminal circuit model is obtained. The program also prints out the eigenvalues of a linearized system and the sensitivities of the eigenvalue of largest magnitude.

  1. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is a simple and commonly used microscopy method for observing unstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single-cell motility in large cell populations. However, the challenge is to find a method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and computationally light enough for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. Compared to commonly used segmentation approaches, MSER required negligible preoptimization, dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions, and due to their relatively light computational requirements they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells.
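
    The detection stage can be sketched as follows (OpenCV assumed; the file name is hypothetical, and the MSER parameters are illustrative rather than the published settings):

      import cv2
      import numpy as np

      # Hypothetical input frame from a phase-contrast time-lapse sequence
      frame = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)

      # Positional args: (delta, min_area, max_area) -- illustrative values
      mser = cv2.MSER_create(5, 60, 1000)
      regions, _ = mser.detectRegions(frame)

      centroids = np.array([r.mean(axis=0) for r in regions])  # (x, y) per region
      print(f"{len(centroids)} candidate cells detected")

      # A cv2.KalmanFilter(4, 2) per cell (state: x, y, vx, vy) can then
      # predict each cell's next position and associate the nearest centroid
      # detected in the following frame (temporal data association).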

  2. CIRF Publications, Vol. 12, No. 5.

    ERIC Educational Resources Information Center

    International Labour Office, Geneva (Switzerland).

    CIRF Publications, Vol. 12, No. 5 is a collection of 80 abstracts giving particular attention to education, training, and economic growth in developing countries, Iran, Japan, Kenya, the Solomon Islands, and Sri Lanka; vocational rehabilitation in Italy, Spain, the United Kingdom, and the U. S. A.; agriculture in Chad, developing countries, and…

  3. Dose equations for tube current modulation in CT scanning and the interpretation of the associated CTDIvol

    SciTech Connect

    Dixon, Robert L.; Boone, John M.

    2013-11-15

    Purpose: The scanner-reported CTDIvol for automatic tube current modulation (TCM) has a different physical meaning from the traditional CTDIvol at constant mA, resulting in the dichotomy "CTDIvol of the first and second kinds" for which a physical interpretation is sought in hopes of establishing some commonality between the two. Methods: Rigorous equations are derived to describe the accumulated dose distributions for TCM. A comparison with formulae for scanner-reported CTDIvol clearly identifies the source of their differences. Graphical dose simulations are also provided for a variety of TCM tube current distributions (including constant mA), all having the same scanner-reported CTDIvol. Results: These convolution equations and simulations show that the local dose at z depends only weakly on the local tube current i(z) due to the strong influence of scatter from all other locations along z, and that the "local CTDIvol(z)" does not represent a local dose but rather only a relative i(z) ≡ mA(z). TCM is a shift-variant technique to which the CTDI paradigm does not apply, and its application to TCM leads to a CTDIvol of the second kind which lacks relevance. Conclusions: While the traditional CTDIvol at constant mA conveys useful information (the peak dose at the center of the scan length), CTDIvol of the second kind conveys no useful information about the associated TCM dose distribution it purportedly represents, and its physical interpretation remains elusive. On the other hand, the total energy absorbed E ("integral dose") as well as its surrogate DLP remain robust between variable-current i(z) TCM and constant-current i0 techniques, both depending only on the total mAs delivered during the beam-on time t0 (equal to i0t0 for a constant-current scan).
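
    In outline, and as a hedged reconstruction consistent with this abstract (notation assumed, not copied from the paper): if f(z) is the single-rotation dose profile per unit tube current-time product, b the table advance per rotation, and i(z) the modulated tube current, then

      D(z) = \frac{1}{b}\int f(z - z')\, i(z')\, dz'
      \qquad\text{and}\qquad
      E \;\propto\; \int D(z)\,dz
        = \frac{1}{b}\left(\int f(\zeta)\,d\zeta\right)\left(\int i(z')\,dz'\right)
        \;\propto\; \text{total mAs}.

    The first relation shows why the local dose depends only weakly on the local current (the convolution mixes scatter contributions from the whole scan length); the second shows why the integral dose and DLP depend only on the total mAs.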

  4. Comparative analysis of different implementations of a parallel algorithm for automatic target detection and classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Paz, Abel; Plaza, Antonio; Plaza, Javier

    2009-08-01

    Automatic target detection in hyperspectral images is a task that has attracted a lot of attention recently. In the last few years, several algorithms have been developed for this purpose, including the well-known RX algorithm for anomaly detection and the automatic target detection and classification algorithm (ATDCA), which uses an orthogonal subspace projection (OSP) approach to extract a set of spectrally distinct targets automatically from the input hyperspectral data. Depending on the complexity and dimensionality of the analyzed image scene, the target/anomaly detection process may be computationally very expensive, a fact that limits the possibility of utilizing this process in time-critical applications. In this paper, we develop computationally efficient parallel versions of both the RX and ATDCA algorithms for near-real-time exploitation. In the case of ATDCA, we use several distance metrics in addition to the OSP approach. The parallel versions are quantitatively compared in terms of target detection accuracy, using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center in New York five days after the terrorist attack of September 11th, 2001, and also in terms of parallel performance, using a massively parallel Beowulf cluster available at NASA's Goddard Space Flight Center in Maryland.
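
    The RX step named above reduces to a per-pixel Mahalanobis distance, as in this minimal numpy sketch (the parallelization and ATDCA stages of the paper are not reproduced, and the scene is synthetic):

      import numpy as np

      def rx_scores(cube):
          """cube: (rows, cols, bands) hyperspectral image -> per-pixel RX score."""
          h, w, b = cube.shape
          X = cube.reshape(-1, b).astype(float)
          mu = X.mean(axis=0)
          cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
          d = X - mu
          # Mahalanobis distance of each pixel spectrum to the background mean
          scores = np.einsum("ij,jk,ik->i", d, cov_inv, d)
          return scores.reshape(h, w)

      cube = np.random.default_rng(4).normal(size=(50, 50, 20))  # synthetic scene
      cube[25, 25] += 5.0                                        # implanted target
      peak = np.unravel_index(rx_scores(cube).argmax(), (50, 50))
      print("strongest anomaly at pixel", peak)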

  5. Automatic Spectroscopic Data Categorization by Clustering Analysis (ASCLAN): A Data-Driven Approach for Distinguishing Discriminatory Metabolites for Phenotypic Subclasses.

    PubMed

    Zou, Xin; Holmes, Elaine; Nicholson, Jeremy K; Loo, Ruey Leng

    2016-06-01

    We propose a novel data-driven approach aiming to reliably distinguish discriminatory metabolites from nondiscriminatory metabolites for a given spectroscopic data set containing two biological phenotypic subclasses. The automatic spectroscopic data categorization by clustering analysis (ASCLAN) algorithm aims to categorize spectral variables within a data set into three clusters corresponding to noise, nondiscriminatory and discriminatory metabolite regions. This is achieved by clustering each spectral variable based on the r2 value representing the loading weight of each spectral variable as extracted from an orthogonal partial least-squares discriminant analysis (OPLS-DA) model of the data set. The variables are ranked according to r2 values and a series of principal component analysis (PCA) models are then built for subsets of these spectral data corresponding to ranges of r2 values. The Q2X value for each PCA model is extracted. K-means clustering is then applied to the Q2X values to generate two clusters based on a minimum Euclidean distance criterion. The cluster consisting of lower Q2X values is deemed devoid of metabolic information (noise), while the cluster consisting of higher Q2X values is further subclustered into two groups based on the r2 values. We considered the cluster with high Q2X but low r2 values as nondiscriminatory, and the cluster with high Q2X and r2 values as discriminatory variables. The boundaries between these three clusters of spectral variables, on the basis of the r2 values, were considered as the cut-off values for defining the noise, nondiscriminatory and discriminatory variables. We evaluated the ASCLAN algorithm using six simulated 1H NMR spectroscopic data sets representing small, medium and large data sets (N = 50, 500, and 1000 samples per group, respectively), each with a reduced and full resolution set of variables (0.005 and 0.0005 ppm, respectively). ASCLAN correctly identified all discriminatory
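
    The two-stage clustering can be sketched as follows (scikit-learn assumed; the r2 and Q2X values below are synthetic stand-ins for values that would come from the OPLS-DA and PCA models described above):

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)
      r2 = rng.uniform(0, 1, 300)                               # loading weights
      q2x = np.clip(0.8 * r2 + rng.normal(0, 0.2, 300), 0, 1)   # toy Q2X values

      # Stage 1: the low-Q2X cluster is treated as noise
      km1 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(q2x.reshape(-1, 1))
      noise_label = km1.cluster_centers_.argmin()
      informative = km1.labels_ != noise_label

      # Stage 2: split informative variables on r2 (low = nondiscriminatory)
      km2 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(
          r2[informative].reshape(-1, 1))
      discr_label = km2.cluster_centers_.argmax()
      print("discriminatory variables:", int((km2.labels_ == discr_label).sum()))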

  6. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    PubMed

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

    The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time consuming and error-prone, especially for liqueurs that may have problems with entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. The novel instrument comprises increased steam power, a redesigned condenser geometry and a larger cooling coil with controllable flow, compared to previously available devices. Method optimization applying D-optimal and central composite designs showed significant influence of sample volume, distillation time and coolant flow, while other investigated parameters such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement did not significantly influence the results. The method validation was conducted using the following settings: steam power 70 %, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35 % vol, the method showed adequate precision, with relative standard deviations below 0.4 % (intraday) and below 0.6 % (interday). The absolute standard deviations were between 0.06 % vol and 0.08 % vol (intraday) and between 0.07 % vol and 0.10 % vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and there are only

  7. Fully Automatic Determination of Soil Bacterium Numbers, Cell Volumes, and Frequencies of Dividing Cells by Confocal Laser Scanning Microscopy and Image Analysis

    PubMed Central

    Bloem, J.; Veninga, M.; Shepherd, J.

    1995-01-01

    We describe a fully automatic image analysis system capable of measuring cell numbers, volumes, lengths, and widths of bacteria in soil smears. The system also determines the number of cells in agglomerates and thus provides the frequency of dividing cells (FDC). Images are acquired from a confocal laser scanning microscope. The grey images are smoothed by convolution and by morphological erosion and dilation to remove noise. The background is equalized by flooding holes in the image and is then subtracted by two top hat transforms. Finally, the grey image is sharpened by delineation, and all particles above a fixed threshold are detected. The number of cells in each detected particle is determined by counting the number of local grey-level maxima in the particle. Thus, up to 1,500 cells in 10 fields of view in a soil smear are analyzed in 30 min without human intervention. Automatic counts of cell numbers and FDC were similar to visual counts in field samples. In microcosms, automatic measurements showed significant increases in cell numbers, FDC, mean cell volume, and length-to-width ratio after amendment of the soil. Volumes of fluorescent microspheres were measured with good approximation, but the absolute values obtained were strongly affected by the settings of the detector sensitivity. Independent measurements of bacterial cell numbers and volumes by image analysis and of cell carbon by a total organic carbon analyzer yielded an average specific carbon content of 200 fg of C μm⁻³, which indicates that our volume estimates are reasonable. PMID:16534976
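
    The background-equalization and thresholding steps have the following minimal counterpart (scikit-image assumed; the image is synthetic, and this is an illustration of the top-hat idea rather than the published system):

      import numpy as np
      from skimage import filters, measure, morphology

      rng = np.random.default_rng(6)
      img = rng.normal(100, 5, (256, 256))      # noisy background
      img[100:104, 50:54] += 80                 # two bright "cells"
      img[200:204, 180:184] += 80

      smoothed = filters.gaussian(img, sigma=1, preserve_range=True)
      # White top-hat: subtract a morphological opening to flatten the background
      flat = morphology.white_tophat(smoothed, morphology.disk(6))
      binary = flat > filters.threshold_otsu(flat)
      labels = measure.label(binary)
      print("cells detected:", labels.max())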

  8. NAGWS Research Reports. Vol. III.

    ERIC Educational Resources Information Center

    Adrian, Marlene, Ed.; Brame, Judith, Ed.

    The emphasis in this collection of research reports is on women athletes, their training, physical characteristics, and mental attitudes. Each article describes a research project on one of these topics, the methodology used to obtain findings, analysis of results, and a summary of applications a teacher or coach may use. (JD)

  9. [Wearable Automatic External Defibrillators].

    PubMed

    Luo, Huajie; Luo, Zhangyuan; Jin, Xun; Zhang, Leilei; Wang, Changjin; Zhang, Wenzan; Tu, Quan

    2015-11-01

    Defibrillation is the most effective method of treating ventricular fibrillation (VF). This paper introduces a wearable automatic external defibrillator based on an embedded system that includes ECG measurement, bioelectrical impedance measurement, and a discharge defibrillation module, and that can automatically identify the VF signal and deliver a biphasic exponential defibrillation waveform. As verified by animal tests, the device can acquire the ECG and automatically identify VF; after identifying the ventricular fibrillation signal, it automatically discharges to abort ventricular fibrillation and achieve electrical cardioversion.

  10. A prostate CAD system based on multiparametric analysis of DCE T1-w, and DW automatically registered images

    NASA Astrophysics Data System (ADS)

    Giannini, Valentina; Vignati, Anna; Mazzetti, Simone; De Luca, Massimo; Bracco, Christian; Stasi, Michele; Russo, Filippo; Armando, Enrico; Regge, Daniele

    2013-02-01

    Prostate specific antigen (PSA)-based screening reduces the rate of death from prostate cancer (PCa) by 31%, but this benefit is associated with a high risk of overdiagnosis and overtreatment. Because prostate transrectal ultrasound-guided biopsy, the standard procedure for prostate histological sampling, has a sensitivity of 77% with a considerable false-negative rate, more accurate methods need to be found to detect or rule out significant disease. Prostate magnetic resonance imaging has the potential to improve the specificity of PSA-based screening scenarios as a non-invasive detection tool, in particular by exploiting the combination of anatomical and functional information in a multiparametric framework. The purpose of this study was to describe a computer aided diagnosis (CAD) method that automatically produces a malignancy likelihood map by combining information from dynamic contrast enhanced MR images and diffusion weighted images. The CAD system consists of multiple sequential stages, from a preliminary registration of images of different sequences, in order to correct for susceptibility deformation and/or movement artifacts, to a Bayesian classifier, which fuses all the extracted features into a probability map. The promising results (AUROC=0.87) should be validated on a larger dataset, but they suggest that discrimination on a voxel basis between benign and malignant tissues is feasible with good performance. This method can help improve the diagnostic accuracy of the radiologist, reduce reader variability and speed up the reading time, automatically highlighting probable cancer-suspicious regions.
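
    As a toy illustration of the final fusion stage (scikit-learn assumed; the two features are synthetic stand-ins for registered DCE and DW parameters, and a Gaussian naive Bayes model stands in for whatever Bayesian classifier the authors used):

      import numpy as np
      from sklearn.naive_bayes import GaussianNB

      rng = np.random.default_rng(8)
      n = 1000
      X = np.column_stack([rng.normal(0, 1, n),    # e.g. DCE wash-in slope
                           rng.normal(0, 1, n)])   # e.g. ADC from DW imaging
      y = (X[:, 0] - X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(int)

      clf = GaussianNB().fit(X, y)
      prob_map = clf.predict_proba(X)[:, 1]        # per-voxel malignancy likelihood
      print("mean malignancy probability:", float(prob_map.mean()))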

  12. Automatic and imperative motor activations in stimulus-response compatibility: magnetoencephalographic analysis of upper and lower limbs.

    PubMed

    Kato, Yuichiro; Endo, Hiroshi; Kizuka, Tomohiro; Asami, Takaaki

    2006-01-01

    The stimulus-response (S-R) compatibility effect refers to the difference in performance due to the spatial S-R relationship in choice reaction time. We investigated the mechanism of neural activities in S-R compatibility at the level of the primary motor cortices for upper and lower limbs responses using magnetoencephalography (MEG). In the S-R compatible task, subjects were required to respond on the same side of the stimulus light using either an upper or lower limb. In the incompatible task, subjects were required to respond in the reverse manner. Premotor times of upper and lower limbs were faster for the compatible response than for the incompatible response. The neuromagnetic brain activities related to response execution were estimated using a multi-dipole model. Stimulus-locked MEG indicated that the current moments of motor dipoles for both effectors occurred bilaterally and reached the first peak at a constant delay irrespective of whether the task was compatible or incompatible. This indicates that the neural activation of the primary motor cortex is automatically synchronized with the stimulus onset. Response-locked MEG showed that the peak current moment of the motor dipole contralateral to the response was stronger for the compatible task than for the incompatible one regardless of whether the responses were made using the upper or lower limbs. The MEG results suggest that automatic motor activation facilitates imperative motor activation for a compatible response, whereas it is not sufficient to prime imperative motor activation for an incompatible response.

  13. Automatic recognition of T and teleseismic P waves by statistical analysis of their spectra: An application to continuous records of moored hydrophones

    NASA Astrophysics Data System (ADS)

    Sukhovich, Alexey; Irisson, Jean-Olivier; Perrot, Julie; Nolet, Guust

    2014-08-01

    A network of moored hydrophones is an effective way of monitoring seismicity of oceanic ridges since it allows detection and localization of underwater events by recording generated T waves. The high cost of ship time necessitates long periods (normally a year) of autonomous functioning of the hydrophones, which results in very large data sets. The preliminary but indispensable part of the data analysis consists of identifying all T wave signals. This process is extremely time consuming if it is done by a human operator who visually examines the entire database. We propose a new method for automatic signal discrimination based on the Gradient Boosted Decision Trees technique that uses the distribution of signal spectral power among different frequency bands as the discriminating characteristic. We have applied this method to automatically identify the types of acoustic signals in data collected by two moored hydrophones in the North Atlantic. We show that the method is capable of efficiently resolving the signals of seismic origin with a small percentage of wrong identifications and missed events: 1.2% and 0.5% for T waves and 14.5% and 2.8% for teleseismic P waves, respectively. In addition, good identification rates for signals of other types (iceberg and ship generated) are obtained. Our results indicate that the method can be successfully applied to automate the analysis of other (not necessarily acoustic) databases provided that enough information is available to describe statistical properties of the signals to be identified.
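
    The discriminating characteristic named above (distribution of signal power among frequency bands) lends itself to a compact sketch. The snippet below, a hedged illustration rather than the authors' code, computes band-power fractions with Welch's method and feeds them to scikit-learn's gradient-boosted trees; the band edges are invented placeholders.

```python
# Illustrative sketch: fractions of spectral power in a few frequency
# bands feed a gradient-boosted tree classifier separating T waves,
# teleseismic P waves and other signals. Band edges are hypothetical.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import GradientBoostingClassifier

BANDS = [(1, 5), (5, 15), (15, 30), (30, 60)]  # Hz, placeholder values

def band_power_features(trace, fs):
    f, pxx = welch(trace, fs=fs, nperseg=1024)
    total = np.trapz(pxx, f)
    feats = []
    for lo, hi in BANDS:
        sel = (f >= lo) & (f < hi)
        feats.append(np.trapz(pxx[sel], f[sel]) / total)  # power fraction
    return feats

# X, y: labeled training traces and classes ("T", "P", "ship", "iceberg")
# clf = GradientBoostingClassifier().fit(
#     [band_power_features(x, fs=250.0) for x in X], y)
```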

  14. Annual Report: Automatic Informative Abstracting and Extracting.

    ERIC Educational Resources Information Center

    Earl, L. L.; And Others

    The development of automatic indexing, abstracting, and extracting systems is investigated. Part I describes the development of tools for making syntactic and semantic distinctions of potential use in automatic indexing and extracting. One of these tools is a program for syntactic analysis (i.e., parsing) of English, the other is a dictionary of…

  15. Algorithms for skiascopy measurement automatization

    NASA Astrophysics Data System (ADS)

    Fomins, Sergejs; Trukša, Renārs; Krūmiņa, Gunta

    2014-10-01

    An automatic dynamic infrared retinoscope was developed, which allows the procedure to run at a much higher rate. Our system uses a USB image sensor with up to 180 Hz refresh rate, equipped with a long-focus objective and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye's pupil reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic analysis of the accommodative state was developed, based on the intensity changes of the fundus reflex.

  16. Automatic differentiation as a tool for sensitivity analysis of a convective storm in a 3-D cloud model

    SciTech Connect

    Park, S.K.; Droegemeier, K.K.; Bischof, C.H.

    1996-10-01

    The ADIFOR automatic differentiation tool is applied to a 3-D storm-scale meteorological model to generate a sensitivity-enhanced code capable of providing derivatives of all model output variables and related diagnostic (derived) parameters as functions of specified control parameters. The tangent linear approximation, applied here to a deep convective storm for the first time using a full-physics compressible model, is valid for up to 50 min for a 1% water vapor perturbation. This result is very encouraging considering the highly nonlinear and discontinuous properties of the solutions. The ADIFOR-generated code has provided valuable sensitivity information on storm dynamics. In particular, it is efficient and useful for investigating how a perturbation inserted at an earlier time propagates through the model variables at later times. However, it is computationally too expensive to apply to variational data assimilation, especially for 3-D meteorological models, which potentially have a large number of input variables.
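
    To make the idea of a "sensitivity-enhanced code" concrete, here is a conceptual sketch using JAX on a toy scalar model in place of ADIFOR on the full cloud model; the toy function and its control parameter are purely illustrative assumptions.

```python
# Conceptual sketch of automatic differentiation, using JAX on a toy
# scalar "model" instead of ADIFOR on the full 3-D cloud model.
import jax
import jax.numpy as jnp

def toy_model(qv0):
    """Stand-in for 'model output as a function of a control parameter'
    (here, an initial water-vapor amount); purely illustrative."""
    return jnp.tanh(3.0 * qv0) + 0.1 * qv0 ** 2

d_output = jax.grad(toy_model)      # derivative code built automatically

# Tangent-linear check in the spirit of the paper: compare the linear
# prediction grad * dqv with the nonlinear difference for a 1% perturbation.
base, dqv = 0.2, 0.002
print(toy_model(base + dqv) - toy_model(base), d_output(base) * dqv)
```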

  17. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  18. Dissimilarity analysis and automatic identification of monomethylalkanes from gas chromatography mass spectrometry data 1. Principle and protocols.

    PubMed

    Zhang, Liangxiao; Liang, Yizeng

    2009-07-01

    Monomethylalkanes are common but important components of many naturally occurring and synthetic organic materials. Generally, these compounds are routinely analyzed by gas chromatography mass spectrometry (GC-MS) and identified by their retention pattern or by similarity matching against a reference mass spectral library. However, these identification approaches rely on limited standard databases or costly standard compounds, and when an unknown monomethylalkane is absent from the reference library they are of little use. In this study, based on fragmentation rules and empirical observation, several characteristic mass spectral features of monomethylalkanes were identified and employed to infer the number of carbon atoms and the methylated position. Combined with the retention pattern, a protocol is described for the identification of monomethylalkanes analyzed by GC-MS. Tests on simulated data and on GC-MS data from a gasoline sample demonstrated that the approach can automatically and correctly identify monomethylalkanes in complicated GC-MS data. PMID:19477452

  19. Experimental analysis of perching in the European starling (Sturnus vulgaris: Passeriformes; Passeres), and the automatic perching mechanism of birds.

    PubMed

    Galton, Peter M; Shepherd, Jeffrey D

    2012-04-01

    The avian automatic perching mechanism (APM) involves the automatic digital flexor mechanism (ADFM) and the digital tendon-locking mechanism (DTLM). When birds squat on a perch to sleep, the increased tendon travel distance due to flexion of the knee and ankle supposedly causes the toes to grip the perch (ADFM) and engage the DTLM so perching while sleeping involves no muscular effort. However, the knees and ankles of sleeping European starlings (Sturnus vulgaris) are only slightly flexed and, except for occasional balancing adjustments, the distal two-thirds of the toes are not flexed to grip a 6-mm-diameter perch. The cranial ankle angle (CAA) is ∼120° and the foot forms an inverted "U" that, with the mostly unflexed toes, provides a saddle-like structure so the bird balances its weight over the central pad of the foot (during day weight further back and digits actively grasp perch). In the region of the pad, the tendon sheath of many birds is unribbed, or only very slightly so, and it is always separated from the tendon of the M. flexor digitorum longus by tendons of the other toe flexor muscles. Passive leg flexion produces no toe flexion in anesthetized Starlings and only after 15-20 min, at the onset of rigor mortis, in freshly sacrificed Starlings. Anesthetized Starlings could not remain perched upon becoming unconscious (ADFM, DTLM intact). Birds whose digital flexor tendons were severed or the locking mechanism eliminated surgically (no ADFM or DTLM), so without ability to flex their toes, slept on the perch in a manner similar to unoperated Starlings (except CAA ∼90°-110°). Consequently, there is no APM or ADFM and the DTLM, although involved in lots of other activities, only acts in perching with active contraction of the digital flexor muscles. PMID:22539208

  20. Analysis of automatically generated peptide mass fingerprints of cellular proteins and antigens from Helicobacter pylori 26695 separated by two-dimensional electrophoresis.

    PubMed

    Krah, Alexander; Schmidt, Frank; Becher, Dörte; Schmid, Monika; Albrecht, Dirk; Rack, Axel; Büttner, Knut; Jungblut, Peter R

    2003-12-01

    Helicobacter pylori is a causative agent of severe diseases of the gastric tract ranging from chronic gastritis to gastric cancer. Cellular proteins of H. pylori were separated by high resolution two-dimensional gel electrophoresis. A dataset of 384 spots was automatically picked, digested, spotted, and analyzed by matrix-assisted laser desorption ionization mass spectrometry peptide mass fingerprint in triple replicates. This procedure resulted in 960 evaluable mass spectra. Using a new version of our data analysis software MS-Screener we improved identification and tested reliability of automatically generated data by comparing with manually produced data. Antigenic proteins from H. pylori are candidates for vaccines and diagnostic tests. Previous immunoproteomics studies of our group revealed antigen candidates, and 24 of them were now closely analyzed using the MS-Screener software. Only in three spots minor components were found that may have influenced their antigenicities. These findings affirm the value of immunoproteomics as a hypothesis-free approach. Additionally, the protein species distribution of the known antigen GroEL was investigated, dimers of the protein alkyl hydroperoxide reductase were found, and the fragmentation of gamma-glutamyltranspeptidase was demonstrated.

  1. Automatic segmentation and identification of solitary pulmonary nodules on follow-up CT scans based on local intensity structure analysis and non-rigid image registration

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Naito, Hideto; Nakamura, Yoshihiko; Kitasaka, Takayuki; Rueckert, Daniel; Honma, Hirotoshi; Takabatake, Hirotsugu; Mori, Masaki; Natori, Hiroshi; Mori, Kensaku

    2011-03-01

    This paper presents a novel method that can automatically segment solitary pulmonary nodules (SPNs) and match the segmented SPNs across follow-up thoracic CT scans. Because of its clinical importance, a physician needs to find SPNs on chest CT and observe their progress over time, in order to diagnose whether a nodule is benign or malignant, or to observe the effect of chemotherapy for malignant ones using follow-up data. However, the enormous number of CT images places a large burden on the physician. To lighten this burden, we developed a method for automatic segmentation and observation assistance of SPNs in follow-up CT scans. The SPNs on an input 3D thoracic CT scan are segmented based on local intensity structure analysis and information about the pulmonary blood vessels. To compensate for lung deformation, we co-register follow-up CT scans using an affine and a non-rigid registration. Finally, matches between detected nodules are found across registered CT scans based on a similarity measure. We applied these methods to three patients comprising 14 thoracic CT scans. Our segmentation method detected 96.7% of SPNs in the whole image set, and the nodule matching method found 83.3% of correspondences between segmented SPNs. The results also show that our matching method is robust to the growth of SPNs, including integration/separation and appearance/disappearance. These results confirm that our method is feasible for segmenting and identifying SPNs on follow-up CT scans.

  2. Application of a method for the automatic detection and Ground-Based Velocity Track Display (GBVTD) analysis of a tornado crossing the Hong Kong International Airport

    NASA Astrophysics Data System (ADS)

    Chan, P. W.; Wurman, J.; Shun, C. M.; Robinson, P.; Kosiba, K.

    2012-03-01

    A weak tornado with a maximum Doppler velocity shear of about 40 m s⁻¹ moved across the Hong Kong International Airport (HKIA) during the evening of 20 May 2002. The tornado caused damage equivalent to F0 on the Fujita Scale, based on a damage survey. The Doppler velocity data from the Hong Kong Terminal Doppler Weather Radar (TDWR) are studied using the Ground-Based Velocity Track Display (GBVTD) method of single-Doppler analysis. The GBVTD analysis is able to clearly depict the development and decay of the tornado, though it appears to underestimate its magnitude. In the pre-tornadic state, the wind field is characterized by inflow toward the center near the ground and upward motion near the center. When the tornado attains its maximum strength, an eye-like structure with a downdraft appears to form in the center. Several minutes later the tornado begins to decay and outflow dominates at low levels. Assuming cyclostrophic balance, the pressure drop 200 m from the center of the tornado at its maximum strength is calculated to be about 6 hPa. To estimate the maximum ground-relative wind speed of the tornado, the TDWR's Doppler velocities are adjusted for the ratio of the sample-volume size of the radar to the radius of the tornado, resulting in a peak wind speed of 28 m s⁻¹, consistent with readings from nearby ground-based anemometers and the F0 damage observed. An automatic tornado detection algorithm based on Doppler velocity difference (delta-V) and temporal and spatial continuity is applied to this event. The locations and core flow radii of the tornado as determined by the automatic method and by subjective analysis agree closely.
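
    The cyclostrophic estimate can be reproduced with a short numerical integration. The sketch below assumes a Rankine vortex profile, a peak wind of 28 m s⁻¹, a core radius of 200 m and an air density of 1.2 kg m⁻³ (all assumptions, not values stated by the authors beyond the peak wind) and shows that a deficit of a few hPa is the right order of magnitude.

```python
# Worked cyclostrophic estimate: integrate dp/dr = rho * v(r)**2 / r
# outward from r0, assuming a Rankine vortex; all profile parameters
# are assumptions chosen only to check the order of magnitude.
import numpy as np

rho, v_max, r_core = 1.2, 28.0, 200.0            # kg m^-3, m s^-1, m

def v(r):                                        # Rankine tangential wind
    return np.where(r < r_core, v_max * r / r_core, v_max * r_core / r)

r = np.linspace(200.0, 20000.0, 20000)           # from r0 = 200 m outward
dp = np.trapz(rho * v(r) ** 2 / r, r)            # pressure deficit at r0, Pa
print(dp / 100.0)                                # ~4.7 hPa; same order as 6 hPa
```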

  3. Automatism and hypoglycaemia.

    PubMed

    Beaumont, Guy

    2007-02-01

    A case of a detained person (DP) suffering from insulin-dependent diabetes, who subsequently used the disorder in his defence as a reason to claim automatism, is discussed. The legal and medical history of automatism is outlined along with the present day situation. Forensic physicians should be aware when examining any diabetic that automatism may subsequently be claimed. With this in mind, the importance of relevant history taking specifically relating to diabetic control and symptoms is discussed.

  4. An anatomy of automatism.

    PubMed

    Mackay, R D

    2015-07-01

    The automatism defence has been described as a quagmire of law and as presenting an intractable problem. Why is this so? This paper will analyse and explore the current legal position on automatism. In so doing, it will identify the problems which the case law has created, including the distinction between sane and insane automatism and the status of the 'external factor doctrine', and comment briefly on recent reform proposals. PMID:26378105

  6. Semi-automatic image analysis methodology for the segmentation of bubbles and drops in complex dispersions occurring in bioreactors

    NASA Astrophysics Data System (ADS)

    Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.

    2006-09-01

    Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm which was tested in two, three and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
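
    For orientation, the snippet below shows the circle-detection idea with OpenCV's stock circular Hough transform; the paper uses an improved Hough variant, and the file name and every parameter value here are placeholders to be tuned per image set.

```python
# Minimal sketch of Hough-based bubble/drop detection (the paper uses an
# improved Hough variant); parameters and the file name are placeholders.
import cv2
import numpy as np

img = cv2.imread("broth_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image
img = cv2.medianBlur(img, 5)                               # pre-processing
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
                           param1=100, param2=30, minRadius=3, maxRadius=80)
if circles is not None:
    diameters_px = 2 * circles[0, :, 2]   # bubble/drop diameters in pixels
    print(np.sort(diameters_px))
```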

  7. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

    Recently, GPS has become more and more widely applied to slope safety monitoring in open-pit mines. The Daye Iron Mine open-pit high-steep slope automatic monitoring system consists mainly of three modules: a GPS data processing module, a monitoring and warning module, and an emergency plans module. Based on the rock mass structure and a slope stability evaluation, seven GPS deformation monitoring points were arranged on the scarp of Fault F9 at Daye Iron Mine, and observations were carried out using a combination of single-frequency static GPS receivers and data-transmission radio. The data processing mainly uses a three-transect interpolation method to address discontinuity and reliability problems in the data series. Based on the displacement monitoring data from 1990 to 1996 for Landslide A2 on Shizi Mountain in the Daye Iron Mine East Open Pit, failure criteria for the landslide were studied, including displacement, rate, acceleration, and creep-curve tangent-angle criteria. The results show that Landslide A2 is a collapse-type rock landslide whose movement has three phases: a creep stage, an accelerated stage, and a failure stage. The failure criteria differ between stages and between positions at the rear, central, and front margins of the landslide. Combining slope deformation behavior with macroscopic evidence to propose a comprehensive failure criterion has important guiding significance for the seven newly installed monitoring points.

  8. Automatic crack propagation tracking

    NASA Technical Reports Server (NTRS)

    Shephard, M. S.; Weidner, T. J.; Yehia, N. A. B.; Burd, G. S.

    1985-01-01

    A finite element based approach to fully automatic crack propagation tracking is presented. The procedure presented combines fully automatic mesh generation with linear fracture mechanics techniques in a geometrically based finite element code capable of automatically tracking cracks in two-dimensional domains. The automatic mesh generator employs the modified-quadtree technique. Crack propagation increment and direction are predicted using a modified maximum dilatational strain energy density criterion employing the numerical results obtained by meshes of quadratic displacement and singular crack tip finite elements. Example problems are included to demonstrate the procedure.

  9. Automatic Analysis and Classification of the Roof Surfaces for the Installation of Solar Panels Using a Multi-Data Source and Multi-Sensor Aerial Platform

    NASA Astrophysics Data System (ADS)

    López, L.; Lagüela, S.; Picon, I.; González-Aguilera, D.

    2015-02-01

    A low-cost multi-sensor aerial platform (an aerial trike) equipped with visible and thermographic sensors is used to acquire all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbour solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, together with the temperatures measured on thermographic images, is decisive in evaluating the surfaces, slopes, orientations and the existence of obstacles. In this way, large areas may be analysed efficiently, yielding as a final result the optimal locations for the placement of solar panels, as well as the required geometry of the supports for installing panels on those roofs whose geometry is not optimal.

  10. Index to the "Journal of American Indian Education", Vol. 1, No. 1, 1961 - Vol. 13, No. 2, 1974.

    ERIC Educational Resources Information Center

    Gill, George A., Ed.

    Articles that appear in the "Journal of American Indian Education", Vol. 1, No. 1 (June 1961) through Vol. 13, No. 2 (January 1974), inclusive, are annotated in this index. Each of the approximately 190 citations gives: (1) title of article/manuscript, (2) author(s), (3) volume, number, pages, and date, and (4) subject annotation. Although most of…

  11. The feasibility of a regional CTDIvol to estimate organ dose from tube current modulated CT exams

    SciTech Connect

    Khatonabadi, Maryam; Kim, Hyun J.; Lu, Peiyun; McMillan, Kyle L.; Cagnon, Chris H.; McNitt-Gray, Michael F.; DeMarco, John J.

    2013-05-15

    dose to correlate with patient size was investigated. Results: For all five organs, the correlations with patient size increased when organ doses were normalized by regional and organ-specific CTDIvol values. For example, when estimating dose to the liver, CTDIvol,global yielded an R² value of 0.26, which improved to 0.77 and 0.86 when using the regional and organ-specific CTDIvol for abdomen and liver, respectively. For breast dose, the global CTDIvol yielded an R² value of 0.08, which improved to 0.58 and 0.83 when using the regional and organ-specific CTDIvol for chest and breasts, respectively. The R² values also increased once the thoracic models were separated into females and males for the analysis, indicating differences between genders in this region not explained by a simple measure of effective diameter. Conclusions: This work demonstrated the utility of regional and organ-specific CTDIvol as normalization factors when using TCM. It was demonstrated that CTDIvol,global is not an effective normalization factor in TCM exams where attenuation (and therefore tube current) varies considerably throughout the scan, such as abdomen/pelvis and even thorax. These exams can be more accurately assessed for dose using regional CTDIvol descriptors that account for local variations in scanner output present when TCM is employed.
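
    The analysis pattern behind those R² comparisons is simple to state in code. The sketch below, with hypothetical arrays, normalizes organ dose by a chosen CTDIvol descriptor and regresses it against patient effective diameter; a linear fit is assumed here purely for illustration.

```python
# Sketch of the reported analysis pattern: organ dose normalized by a
# CTDIvol descriptor, regressed against patient size; compare R^2 values.
import numpy as np
from scipy.stats import linregress

def r_squared(organ_dose, ctdi_vol, effective_diameter_cm):
    normalized = organ_dose / ctdi_vol            # dose per unit CTDIvol
    fit = linregress(effective_diameter_cm, normalized)
    return fit.rvalue ** 2

# Compare, e.g., r_squared(liver_dose, ctdi_global, diam) against
# r_squared(liver_dose, ctdi_regional_abdomen, diam).
```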

  12. Automatic Versus Manual Indexing

    ERIC Educational Resources Information Center

    Vander Meulen, W. A.; Janssen, P. J. F. C.

    1977-01-01

    A comparative evaluation of results in terms of recall and precision from queries submitted to systems with automatic and manual subject indexing. Differences were attributed to query formulation. The effectiveness of automatic indexing was found equivalent to manual indexing. (Author/KP)

  13. Automatic and Flexible

    PubMed Central

    Hassin, Ran R.; Bargh, John A.; Zimerman, Shira

    2008-01-01

    Arguing from the nature of goal pursuit and from the economy of mental resources this paper suggests that automatic goal pursuit, much like its controlled counterpart, may be flexible. Two studies that employ goal priming procedures examine this hypothesis using the Wisconsin Card Sorting Test (Study 1) and a variation of the Iowa Gambling Task (Study 2). Implications of the results for our understanding of the dichotomy between automatic and controlled processes in general, and for our conception of automatic goal pursuit in particular, are discussed. PMID:19325712

  14. Application of an automatic thermal desorption-gas chromatography-mass spectrometry system for the analysis of polycyclic aromatic hydrocarbons in airborne particulate matter.

    PubMed

    Gil-Moltó, J; Varea, M; Galindo, N; Crespo, J

    2009-02-27

    The application of the thermal desorption (TD) method coupled with gas chromatography-mass spectrometry (GC-MS) to the analysis of aerosol organics has been the focus of many studies in recent years. This technique overcomes the main drawbacks of the solvent extraction approach such as the use of large amounts of toxic organic solvents and long and laborious extraction processes. In this work, the application of an automatic TD-GC-MS instrument for the determination of particle-bound polycyclic aromatic hydrocarbons (PAHs) is evaluated. This device offers the advantage of allowing the analysis of either gaseous or particulate organics without any modification. Once the thermal desorption conditions for PAH extraction were optimised, the method was verified on NIST standard reference material (SRM) 1649a urban dust, showing good linearity, reproducibility and accuracy for all target PAHs. The method has been applied to PM10 and PM2.5 samples collected on quartz fibre filters with low volume samplers, demonstrating its capability to quantify PAHs when only a small amount of sample is available. PMID:19150718

  16. Direct automatic determination of bitterness and total phenolic compounds in virgin olive oil using a pH-based flow-injection analysis system.

    PubMed

    Garcia-Mesa, José A; Mateos, Raquel

    2007-05-16

    Flavor and taste are sensory attributes of virgin olive oil (VOO) highly appreciated by consumers. Among the organoleptic properties of VOO, bitterness is related to the natural phenolic compounds present in the oil. Sensory analysis is the official method to evaluate VOO flavor and bitterness, but it requires highly specialized experts. Alternatively, methods based on physicochemical determinations could be useful for the industry. This work presents a flow-injection analysis system for the direct automatic determination of bitterness and total phenolic compounds in VOO without prior isolation, based on the spectral shift undergone by phenolic compounds upon pH variation. The system enables complete automation of the process, including dilution of the sample and its sequential injection into buffer solutions of acidic and alkaline pH. The variation of the absorbance at 274 nm showed a high correlation with bitterness and with the total phenolic content of VOO, owing to the close relationship between these two parameters. Thus, the proposed method determines bitterness and phenolic compounds with results similar to those from reference methods (relative errors ranging from 1% to 8% for bitterness and from 2% to 7% for phenolic compounds). The precision, evaluated at two levels of both parameters, ranged between 0.6% and 1.5% for bitterness and between 0.7% and 2.6% for phenolic compounds.

  17. Automatic image analysis and spot classification for detection of fruit fly infestation in hyperspectral images of mangoes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An algorithm has been developed to identify spots generated in hyperspectral images of mangoes infested with fruit fly larvae. The algorithm incorporates background removal, application of a Gaussian blur, thresholding, and particle count analysis to identify locations of infestations. Each of the f...
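
    Although the record is truncated, the enumerated steps form a standard pipeline. The sketch below illustrates them (background removal, Gaussian blur, thresholding, particle-count analysis) on a single band of a hyperspectral cube; the band choice, thresholds and minimum particle size are invented for illustration.

```python
# Minimal sketch of the named steps on one hyperspectral band; the
# thresholds and minimum size are invented placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter, label

def count_spots(band_img, background_mask, thresh=0.3, min_pixels=5):
    img = np.where(background_mask, 0.0, band_img)  # background removal
    img = gaussian_filter(img, sigma=2.0)           # Gaussian blur
    labels, _ = label(img > thresh)                 # thresholding + labeling
    sizes = np.bincount(labels.ravel())[1:]         # particle-count analysis
    return int(np.sum(sizes >= min_pixels))         # candidate infestations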

  18. Automatic amino acid analyzer

    NASA Technical Reports Server (NTRS)

    Berdahl, B. J.; Carle, G. C.; Oyama, V. I.

    1971-01-01

    The analyzer operates unattended for up to 15 hours. It has an automatic sample injection system and can be programmed. All fluid-flow valve switching is accomplished pneumatically from miniature three-way solenoid pilot valves.

  19. Automatic Payroll Deposit System.

    ERIC Educational Resources Information Center

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  20. Automatic switching matrix

    DOEpatents

    Schlecht, Martin F.; Kassakian, John G.; Caloggero, Anthony J.; Rhodes, Bruce; Otten, David; Rasmussen, Neil

    1982-01-01

    An automatic switching matrix that includes an apertured matrix board containing a matrix of wires that can be interconnected at each aperture. Each aperture has associated therewith a conductive pin which, when fully inserted into the associated aperture, effects electrical connection between the wires within that particular aperture. Means is provided for automatically inserting the pins in a determined pattern and for removing all the pins to permit other interconnecting patterns.

  1. GAIT-ER-AID: An Expert System for Analysis of Gait with Automatic Intelligent Pre-Processing of Data

    PubMed Central

    Bontrager, EL.; Perry, J.; Bogey, R.; Gronley, J.; Barnes, L.; Bekey, G.; Kim, JW.

    1990-01-01

    This paper describes the architecture and applications of an expert system designed to identify the specific muscles responsible for a given dysfunctional gait pattern. The system consists of two parts: a data analysis expert system (DA/ES) and a gait pathology expert system (GP/ES). The DA/ES processes raw data on joint angles, foot-floor contact patterns and EMGs from relevant muscles and synthesizes them into a data frame for use by the GP/ES. Various aspects of the intelligent data pre-processing are described in detail, followed by a presentation of the GP/ES, including its structure, knowledge base, rule base, and inference engine. The inference process is clarified by careful analysis of an actual case, a patient with an equinus gait.
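
    As a toy illustration of the two-stage idea, the snippet below takes a DA/ES-style synthesized data frame and applies GP/ES-style rules mapping patterns to suspect muscles; the rules and frame fields are invented and far simpler than the real GAIT-ER-AID knowledge base.

```python
# Toy rule-based inference over a synthesized gait data frame; the rules
# and field names are invented, not from the GAIT-ER-AID system.
def infer_muscles(frame):
    findings = []
    if frame["ankle_dorsiflexion_swing"] < 0 and frame["tib_ant_emg"] == "low":
        findings.append("tibialis anterior weakness (drop foot)")
    if frame["knee_flexion_stance"] > 30 and frame["quad_emg"] == "high":
        findings.append("quadriceps overactivity compensating instability")
    return findings or ["no rule fired: refer for manual review"]

print(infer_muscles({"ankle_dorsiflexion_swing": -5, "tib_ant_emg": "low",
                     "knee_flexion_stance": 10, "quad_emg": "normal"}))
```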

  2. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 2), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 2), Manchester, 1873 (PL XXIX top); illustration of full mill, as enlarged to south. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  3. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, 1873 (PL XXI); illustration of turbine and belt system. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  4. 2. Historic American Buildings Survey Photocopy from Harpers, vol. 20 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Historic American Buildings Survey Photocopy from Harpers, vol. 20 1859 Courtesy of Library of Congress NORTH AND EAST FRONTS - United States General Post Office, Between Seventh, Eighth, E, & F Streets, Northwest, Washington, District of Columbia, DC

  5. Selection of shape parameters that differentiate sand grains, based on the automatic analysis of two-dimensional images

    NASA Astrophysics Data System (ADS)

    Sochan, Agata; Zieliński, Paweł; Bieganowski, Andrzej

    2015-08-01

    A grain shape analysis of sandy deposits has implications for determining the processes that affect grain shape. So far, most methods of carrying out a grain shape analysis are based on the subjective estimation of the researcher. The purpose of this study is to indicate the shape parameter/parameters that best differentiate sand grains, and to compare the results with those that have been obtained by the Krumbein method. We determined the shape parameters of sand grains (size range from 0.71 mm to 1 mm) using photos of two-dimensional images of particle projections onto a plane. The photos were taken under an optical microscope and were then subjected to an image analysis. We selected six shape parameters that best differentiate the studied sand grains, based on the criteria of: i) the monotonicity of parameter value, which changed depending on the categorization of the grains to the successive Krumbein roundness classes, and ii) the statistical significance of differences between the optical parameter values in the individual Krumbein classes. We selected three circularity parameters (θ1, θ4 and θ5) and three surface structure parameters (κ3, κ4 and κ6). None of these shape parameters allowed the direct categorization of a particle into a particular Krumbein roundness class. Nevertheless, despite the fact that an unambiguous characterization of deposits is not possible, this method can be helpful in identifying the origin of deposits. Moreover, it is possible to develop more advanced methods (e.g., based on artificial intelligence tools), which would allow an unambiguous categorization based on the determined shape parameters.
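
    Shape parameters of this kind are routinely computed from binary particle images. The sketch below derives a few generic circularity-type descriptors with scikit-image; these are standard definitions, not necessarily the θ/κ parameters selected in the paper.

```python
# Generic 2-D shape descriptors from a binary grain image; standard
# definitions, not the paper's specific theta/kappa parameters.
import numpy as np
from skimage.measure import label, regionprops

def grain_shape_parameters(binary_img):
    params = []
    for region in regionprops(label(binary_img)):
        circularity = 4 * np.pi * region.area / region.perimeter ** 2
        elongation = region.minor_axis_length / region.major_axis_length
        params.append((circularity, elongation, region.solidity))
    return params
```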

  6. Evaluating the reforested area for the municipality of Buri by automatic analysis of LANDSAT imagery. [Sao Paulo, Brazil

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Lee, D. C. L.; Filho, R. H.; Shimabukuro, Y. E.

    1979-01-01

    The author has identified the following significant results. The reforestation classes (Pinus, Eucalyptus, Araucaria) were defined using iterative image analysis (I-100) and LANDSAT MSS data. Estimates of class area from the I-100 were compared with data supplied by the forestry institute in Sao Paulo. LANDSAT channels 4 and 5 served to differentiate the Pinus, Eucalyptus, and Araucaria from the other trees. Channels 6 and 7 gave the best results for differentiating between the classes. A good representative spectral response was obtained for Araucaria on these two channels. The small relative differences obtained were +4.24% for Araucaria, -7.51% for Pinus, and -32.07% for Eucalyptus.

  7. A rapid automatic processing platform for bead label-assisted microarray analysis: application for genetic hearing-loss mutation detection.

    PubMed

    Zhu, Jiang; Song, Xiumei; Xiang, Guangxin; Feng, Zhengde; Guo, Hongju; Mei, Danyang; Zhang, Guohao; Wang, Dong; Mitchelson, Keith; Xing, Wanli; Cheng, Jing

    2014-04-01

    Molecular diagnostics using microarrays are increasingly being used in clinical diagnosis because of their high throughput, sensitivity, and accuracy. However, standard microarray processing takes several hours and involves manual steps during hybridization, slide clean up, and imaging. Here we describe the development of an integrated platform that automates these individual steps as well as significantly shortens the processing time and improves reproducibility. The platform integrates such key elements as a microfluidic chip, flow control system, temperature control system, imaging system, and automated analysis of clinical results. Bead labeling of microarray signals required a simple imaging system and allowed continuous monitoring of the microarray processing. To demonstrate utility, the automated platform was used to genotype hereditary hearing-loss gene mutations. Compared with conventional microarray processing procedures, the platform increases the efficiency and reproducibility of hybridization, speeding microarray processing through to result analysis. The platform also continuously monitors the microarray signals, which can be used to facilitate optimization of microarray processing conditions. In addition, the modular design of the platform lends itself to development of simultaneous processing of multiple microfluidic chips. We believe the novel features of the platform will benefit its use in clinical settings in which fast, low-complexity molecular genetic testing is required.

  8. An environmental friendly method for the automatic determination of hypochlorite in commercial products using multisyringe flow injection analysis.

    PubMed

    Soto, N Ornelas; Horstkotte, B; March, J G; López de Alba, P L; López Martínez, L; Cerdá Martín, V

    2008-03-24

    A multisyringe flow injection analysis system was used for the determination of hypochlorite in cleaning agents, by measurement of the native absorbance of hypochlorite at 292 nm. The methodology was based on the selective decomposition of hypochlorite by a cobalt oxide catalyst, giving chloride and oxygen. The difference between the absorbance of the sample before and after its pass through a cobalt oxide column was selected as the analytical signal. As no further reagent was required, this work can be considered a contribution to environmentally friendly analytical chemistry. The entire analytical procedure, including in-line sample dilution in three steps, was automated: first, dilution in a stirred miniature vessel; second, dispersion; and third, in-line addition of water using the multisyringe flow injection technique. The dynamic concentration range was 0.04-0.78 g L⁻¹ (relative standard deviation lower than 3%), where the extent of hypochlorite decomposition was 90 ± 4%. The proposed method was successfully applied to the analysis of commercial cleaning products. The accuracy of the method was established by iodometric titration. PMID:18328319

  9. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use, free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis engine adapted for large numbers of images, providing a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and the noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse image qualities. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
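
    A hedged sketch of the underlying approach appears below: seed a circular snake, let scikit-image's active-contour ("Snakes") implementation relax it onto the spheroid boundary, then estimate axial lengths from the fitted contour. All parameters and the axis-length formula are illustrative assumptions, not SpheroidSizer's implementation.

```python
# Sketch: active-contour ("Snakes") boundary fit, then axis lengths from
# the contour's principal components; parameters are illustrative.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def spheroid_axes(gray_img, center, radius):
    t = np.linspace(0, 2 * np.pi, 200)
    init = np.column_stack([center[0] + radius * np.sin(t),
                            center[1] + radius * np.cos(t)])
    snake = active_contour(gaussian(gray_img, 3), init, alpha=0.015, beta=10)
    pts = snake - snake.mean(axis=0)
    # For points on an ellipse boundary, var(x) = a^2 / 2, so each full
    # axis length is 2a = 2 * sqrt(2 * variance) along a principal axis.
    evals = np.linalg.eigvalsh(np.cov(pts.T))     # ascending order
    minor, major = 2 * np.sqrt(2 * evals)
    return major, minor
```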

  10. An image-analysis system based on support vector machines for automatic grade diagnosis of brain-tumour astrocytomas in clinical routine.

    PubMed

    Glotsos, D; Spyridonos, P; Cavouras, D; Ravazoula, P; Dadioti, P Arapantoni; Nikiforidis, G

    2005-09-01

    An image-analysis system based on the concept of Support Vector Machines (SVM) was developed to assist in grade diagnosis of brain tumour astrocytomas in clinical routine. One hundred and forty biopsies of astrocytomas were characterized according to the WHO system as grade II, III or IV. Images from biopsies were digitized, and cell nuclei regions were automatically detected by encoding texture variations in a set of wavelet, autocorrelation and Parzen-estimated descriptors and using an unsupervised SVM clustering methodology. Based on morphological and textural nuclear features, a decision-tree classification scheme distinguished between different grades of tumours employing an SVM classifier. The system was validated on clinical material collected from two different hospitals. On average, the SVM clustering algorithm correctly identified and accurately delineated 95% of all nuclei. Low-grade tumours were distinguished from high-grade tumours with an accuracy of 90.2%, and grade III from grade IV with an accuracy of 88.3%. The system was tested on a new clinical data set, and the classification rates were 87.5% and 83.8%, respectively. Segmentation and classification results are very encouraging, considering that the method was developed based on everyday clinical standards. The proposed methodology might be used in parallel with conventional grading to support the regular diagnostic procedure and reduce subjectivity in astrocytoma grading. PMID:16403707
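
    The decision-tree-of-SVMs scheme can be sketched compactly: one binary SVM separates low from high grade, and a second separates grade III from IV. The snippet below uses scikit-learn RBF SVMs and assumes the nuclear morphology/texture features have been extracted upstream; it is an illustration of the scheme, not the authors' code.

```python
# Two-stage SVM cascade mirroring the described decision-tree scheme;
# feature extraction is assumed to have been done upstream.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def fit_cascade(X, grades):                 # grades in {"II", "III", "IV"}
    is_high = [g in ("III", "IV") for g in grades]
    stage1 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, is_high)
    X_hi = [x for x, h in zip(X, is_high) if h]
    g_hi = [g for g, h in zip(grades, is_high) if h]
    stage2 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_hi, g_hi)
    return stage1, stage2

def predict_grade(stage1, stage2, x):
    return "II" if not stage1.predict([x])[0] else stage2.predict([x])[0]
```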

  11. Automatic flow analysis method to determine traces of Mn²⁺ in sea and drinking waters by a kinetic catalytic process using LWCC-spectrophotometric detection.

    PubMed

    Chaparro, Laura; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2016-02-01

    A new automatic kinetic catalytic method has been developed for the measurement of Mn²⁺ in drinking and seawater samples. The method is based on the catalytic effect of Mn²⁺ on the oxidation of tiron by hydrogen peroxide in the presence of Pb²⁺ as an activator. The optimum conditions were obtained at pH 10 with 0.019 mol L⁻¹ 2,2′-bipyridyl, 0.005 mol L⁻¹ tiron and 0.38 mol L⁻¹ hydrogen peroxide. The flow system is based on multisyringe flow injection analysis (MSFIA) coupled with a lab-on-valve (LOV) device, exploiting on-line spectrophotometric detection with a Liquid Waveguide Capillary Cell (LWCC) of 1 m optical path length, performed at 445 nm. Under conditions optimized by a multivariate approach, the method allowed the measurement of Mn²⁺ in a range of 0.03-35 µg L⁻¹ with a detection limit of 0.010 µg L⁻¹, attaining a repeatability of 1.4% RSD. The method was satisfactorily applied to the determination of Mn²⁺ in environmental water samples. The reliability of the method was also verified by determining the manganese content of the certified standard reference seawater sample, CASS-4.

  12. Application of 3-D digital deconvolution to optically sectioned images for improving the automatic analysis of fluorescent-labeled tumor specimens

    NASA Astrophysics Data System (ADS)

    Lockett, Stephen J.; Jacobson, Kenneth A.; Herman, Brian

    1992-06-01

    The analysis of fluorescent stained clusters of cells has been improved by recording multiple images of the same microscopic scene at different focal planes and then applying a three-dimensional (3-D) out-of-focus background subtraction algorithm. The algorithm significantly reduced the out-of-focus signal and improved the spatial resolution. The method was tested on specimens of 10 μm diameter (φ) beads embedded in agarose and on a 5 μm breast tumor section labeled with a fluorescent DNA stain. The images were analyzed using an algorithm for automatically detecting fluorescent objects. The proportion of correctly detected in-focus beads and breast nuclei increased from 1/8 to 8/8 and from 56/104 to 81/104, respectively, after processing by the subtraction algorithm. Furthermore, the subtraction algorithm reduced the proportion of out-of-focus relative to in-focus total intensity detected in the bead images from 51% to 33%. Further developments of these techniques, utilizing the 3-D point spread function (PSF) of the imaging system and a 3-D segmentation algorithm, should result in the correct detection and precise quantification of virtually all cells in solid tumor specimens. Thus the approach should serve as a highly reliable automated screening method for a wide variety of clinical specimens.
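
    One classical variant of such out-of-focus subtraction is nearest-neighbour deconvolution: blur the adjacent optical sections to mimic their out-of-focus contribution and subtract it from each plane. The sketch below illustrates that variant; the kernel width and weight are guesses, and the paper's own 3-D algorithm may differ in detail.

```python
# Nearest-neighbour out-of-focus subtraction sketch; sigma and c are
# illustrative guesses, not the paper's parameters.
import numpy as np
from scipy.ndimage import gaussian_filter

def remove_out_of_focus(stack, sigma=4.0, c=0.45):
    """stack: 3-D array (z, y, x) of optical sections."""
    out = np.empty_like(stack, dtype=float)
    for k in range(stack.shape[0]):
        below = stack[max(k - 1, 0)].astype(float)
        above = stack[min(k + 1, stack.shape[0] - 1)].astype(float)
        haze = c * (gaussian_filter(below, sigma) + gaussian_filter(above, sigma))
        out[k] = np.clip(stack[k] - haze, 0, None)   # keep intensities >= 0
    return out
```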

  13. A radar-based regional extreme rainfall analysis to derive the thresholds for a novel automatic alert system in Switzerland

    NASA Astrophysics Data System (ADS)

    Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis

    2016-06-01

    This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here aims to issue heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds predefined thresholds, but also as soon as the sum of past and forecast precipitation is larger than the threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as predicted rainfall, such as early warning systems for urban floods. The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of
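
    The extreme-value step can be illustrated in a few lines: fit a GEV distribution to monthly regional maxima and read off return levels for sub-annual return periods. The sketch below uses SciPy's genextreme; the input file and the printed periods are hypothetical.

```python
# GEV fit to monthly maxima and sub-annual return levels; the data file
# is hypothetical.
import numpy as np
from scipy.stats import genextreme

monthly_max = np.loadtxt("region_042_1h_maxima.txt")  # hypothetical data
shape, loc, scale = genextreme.fit(monthly_max)

for T_months in (2, 6, 12):                           # sub-annual periods
    # T-month return level: exceeded with probability 1/T per month.
    level = genextreme.ppf(1 - 1 / T_months, shape, loc=loc, scale=scale)
    print(f"{T_months}-month return level: {level:.1f} mm")
```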

  14. Histological analysis of tissue structures of the internal organs of steppe tortoises following their exposure to spaceflight conditions while circumnavigating the moon aboard the Zond-7 automatic station

    NASA Technical Reports Server (NTRS)

    Sutulov, L. S.; Sutulov, Y. L.; Trukhina, L. V.

    1975-01-01

    Tortoises flown around the Moon on the 6-1/2 day voyage of the Zond-7 automatic space station evidently did not suffer any pathological changes to their peripheral blood picture, heart, lungs, intestines, or liver.

  15. An interdisciplinary analysis of multispectral satellite data for selected cover types in the Colorado Mountains, using automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has reported the following significant results. A data set containing SKYLAB, LANDSAT, and topographic data has been overlayed, registered, and geometrically corrected to a scale of 1:24,000. After geometrically correcting both sets of data, the SKYLAB data were overlayed on the LANDSAT data. Digital topographic data were then obtained, reformatted, and a data channel containing elevation information was then digitally overlayed onto the LANDSAT and SKYLAB spectral data. The 14,039 square kilometers involving 2,113,776 LANDSAT pixels represents a relatively large data set available for digital analysis. The overlayed data set enables investigators to numerically analyze and compare two sources of spectral data and topographic data from any point in the scene. This capability is new and it will permit a numerical comparison of spectral response with elevation, slope, and aspect. Utilization of the spectral and topographic data together to obtain more accurate classifications of the various cover types present is feasible.

  16. Determination of free and total sulfites in wine using an automatic flow injection analysis system with voltammetric detection.

    PubMed

    Goncalves, Luis Moreira; Grosso Pacheco, Joao; Jorge Magalhaes, Paulo; Antonio Rodrigues, Jose; Araujo Barros, Aquiles

    2010-02-01

    An automated flow injection analysis (FIA) system, based on an initial analyte separation by gas diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell, was developed for the determination of total and free sulfur dioxide (SO₂) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and a simplified method commonly used by the wine industry). The developed method displayed good repeatability (RSD lower than 6%) and linearity (between 10 and 250 mg L⁻¹) as well as a suitable LOD (3 mg L⁻¹) and LOQ (9 mg L⁻¹). A major advantage of this system is that SO₂ is directly detected by flow SWV.

  17. Semi-automatic quantification of neurite fasciculation in high-density neurite images by the Neurite Directional Distribution Analysis (NDDA)

    PubMed Central

    Hopkins, Amy M; Wheeler, Brandon; Staii, Cristian; Kaplan, David L.; Atherton, Timothy J.

    2014-01-01

    Background Bundling of neurite extensions occurs during nerve development and regeneration. Understanding the factors that drive neurite bundling is important for designing biomaterials for nerve regeneration toward the innervation target and for preventing nociceptive collateral sprouting. High-density neuron cultures, including dorsal root ganglia explants, are employed for in vitro screening of biomaterials designed to control directional outgrowth. Although some semiautomated image processing methods exist for quantification of neurite outgrowth, methods to quantify axonal fasciculation in terms of the direction of neurite outgrowth are lacking. New Method This work presents a semi-automated program to analyze micrographs of high-density neurites; the program aims to quantify axonal fasciculation by determining the orientational distribution function of the tangent vectors of the neurites and calculating its Fourier series coefficients ('c' values). Results We found that neurite directional distribution analysis (NDDA) of fasciculated neurites yielded 'c' values of ≥ ~0.25, whereas branched outgrowth led to significantly lower values of < ~0.2. The 'c' values correlated directly with the width of neurite bundles and indirectly with the number of branching points. Comparison with Existing Methods Information about the directional distribution of outgrowth is lost in simple counting methods or achieved laboriously through manual analysis. The NDDA supplements previous quantitative analyses of axonal bundling using a vector-based approach that captures new information about the directionality of outgrowth. Conclusion The NDDA is a valuable addition to the open source image processing tools available to biomedical researchers, offering a robust, precise approach to quantification of imaged features important in tissue development, disease, and repair. PMID:24680908
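
    The central quantity can be sketched directly: tangent angles along traced neurites define an orientational distribution on [0, π), and the magnitude of a low-order Fourier coefficient of that distribution measures how aligned (fasciculated) the outgrowth is. The snippet below is a hedged illustration; the paper's exact normalization of the 'c' values may differ.

```python
# Orientational-distribution Fourier coefficient from neurite traces;
# a hedged illustration of the NDDA quantity, not the authors' code.
import numpy as np

def bundling_coefficient(polylines):
    """polylines: list of (N_i, 2) coordinate arrays of neurite traces."""
    angles = []
    for pts in polylines:
        d = np.diff(pts, axis=0)
        angles.append(np.arctan2(d[:, 1], d[:, 0]) % np.pi)  # tangent angles
    theta = np.concatenate(angles)
    # |<exp(2i*theta)>| -> 1 for perfectly aligned tangents (bundles),
    # -> 0 for an isotropic, highly branched distribution.
    return np.abs(np.mean(np.exp(2j * theta)))
```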

  18. Automatic small bowel tumor diagnosis by using multi-scale wavelet-based analysis in wireless capsule endoscopy images

    PubMed Central

    2012-01-01

    Background Wireless capsule endoscopy has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy cannot. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. Method The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both joint and marginal non-Gaussianity of the second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of the textural measures are computed from co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced by using Principal Component Analysis. Results The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study of the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice. PMID:22236465
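
    A rough sketch of that feature pipeline follows: keep selected wavelet scales, rebuild the image, extract co-occurrence statistics, reduce with PCA and classify with a small neural network. The wavelet family, scale selection and GLCM settings below are placeholder assumptions, and the moment statistics are simplified to standard GLCM properties.

```python
# Simplified wavelet-scale + co-occurrence texture pipeline; families,
# scales and GLCM settings are placeholder assumptions.
import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def texture_features(channel):
    coeffs = pywt.wavedec2(channel, "db2", level=3)
    kept = [coeffs[0] * 0, coeffs[1], coeffs[2],
            tuple(np.zeros_like(d) for d in coeffs[3])]   # keep two scales
    recon = pywt.waverec2(kept, "db2")                    # synthesized image
    img8 = np.uint8(255 * (recon - recon.min()) / np.ptp(recon))
    glcm = graycomatrix(img8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "correlation", "energy", "homogeneity")]

# X = PCA(n_components=8).fit_transform(frame_feature_matrix)
# clf = MLPClassifier(hidden_layer_sizes=(16,)).fit(X, frame_labels)
```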

  19. Automatic Analysis of Retinal Vascular Parameters for Detection of Diabetes in Indian Patients with No Retinopathy Sign

    PubMed Central

    Aliahmad, Behzad; Kumar, Dinesh Kant; Jain, Rajeev

    2016-01-01

    This study investigated the association between retinal vascular parameters and type II diabetes in an Indian population with no observable diabetic retinopathy. It introduced two new retinal vascular parameters: total number of branching angles (TBA) and average acute branching angles (ABA) as potential biomarkers of diabetes in an explanatory model. A total of 180 retinal images (two (left and right) × two (ODC and MC) × 45 subjects (13 diabetics and 32 nondiabetics)) were analysed. Stepwise linear regression analysis was performed to model the association between type II diabetes and the best subset of explanatory variables (predictors), consisting of retinal vascular parameters and patients' demographic information. The P values of the estimated coefficients (P < 0.001) indicated that, at an α level of 0.05, the newly introduced retinal vascular parameters, that is, TBA and ABA together with CRAE, mean tortuosity, SD of branching angle, and VB, are related to type II diabetes when there is no observable sign of retinopathy. PMID:27579347
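
    A minimal sketch of forward stepwise selection of the kind used above, assuming p-value-based entry at α = 0.05; the column names are illustrative stand-ins for the study's predictors:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_select(X, y, alpha=0.05):
    """Forward stepwise OLS: add the predictor with the smallest p-value
    until no remaining candidate is significant at the given alpha."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return sm.OLS(y, sm.add_constant(X[selected])).fit()

# Synthetic data shaped like the study's predictors (names illustrative):
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(45, 4)),
                 columns=["TBA", "ABA", "CRAE", "mean_tortuosity"])
y = 0.8 * X["TBA"] - 0.5 * X["ABA"] + rng.normal(scale=0.3, size=45)
print(forward_select(X, y).params)
```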

  1. Flow injection analysis-based methodology for automatic on-line monitoring and quality control for biodiesel production.

    PubMed

    Pinzi, S; Priego Capote, F; Ruiz Jiménez, J; Dorado, M P; Luque de Castro, M D

    2009-01-01

    An automated on-line approach based on the determination of free and bound glycerol was proposed here to monitor biodiesel production. The method was based on liquid-liquid extraction of glycerol from the biodiesel to an aqueous ethanolic phase, in which glycerol is oxidized to formaldehyde with meta periodate with subsequent reaction with acetylacetone. The reaction product was photometrically measured at 410 nm. Free and bound glycerol were differentiated by glycerides hydrolysis with potassium ethylate. The experimental set-up consisted of a flow-injection manifold for liquid-liquid extraction without phase separation and iterative change of the flow direction that enabled: (a) filling the flow manifold with a meta periodate-acetylacetone acceptor phase; (b) sampling of small amounts (µl) from the reactor; (c) determination of free glycerol by extraction from biodiesel to the aqueous phase with simultaneous oxidation-reaction with acetylacetone in the acceptor phase; (d) continuous monitoring of the aqueous phase by passage through a photometric detector; (e) filling the flow manifold with a new potassium ethylate-meta periodate-acetylacetone acceptor phase; (f) repetition of steps (b) to (d) to determine total glycerol after saponification of the bound glycerol by potassium ethylate; and (g) determination of bound glycerol by difference between the second and first analyses. The results showed that the proposed automated on-line method is a suitable option in routine analysis during biodiesel production. PMID:18614358

  2. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    PubMed

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis techniques is explored and discussed for possible applications. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipate heat by evaporation. Thus, the proportions of pigs in the dunging area and resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The learning-based approach applied is based on extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method is able to produce a probability per pixel rather than form a hard decision. This overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method improves, in comparison with greyscale methods, the possibility of reliably identifying the proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness.
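
    A minimal sketch of the per-pixel classifier described above, assuming per-pixel feature vectors have already been extracted; scikit-learn's elastic-net logistic regression stands in for the paper's actual solver:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_pixel_classifier(features, labels):
    """features: (n_pixels, n_features); labels: 1 = pig, 0 = background.
    Elastic-net-regularized logistic regression, yielding a probability
    per pixel rather than a hard decision."""
    clf = LogisticRegression(penalty="elasticnet", solver="saga",
                             l1_ratio=0.5, C=1.0, max_iter=5000)
    clf.fit(features, labels)
    return clf

rng = np.random.default_rng(1)
X = rng.normal(size=(10_000, 8))                # stand-in per-pixel features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
clf = train_pixel_classifier(X, y)
print(clf.predict_proba(X[:5])[:, 1])           # pig probability per pixel
```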

  3. A new automatic synchronizer

    SciTech Connect

    Malm, C.F.

    1995-12-31

    A phase lock loop automatic synchronizer, PLLS, matches generator speed starting from dead stop to bus frequency, and then locks the phase difference at zero, thereby maintaining zero slip frequency while the generator breaker is being closed to the bus. The significant difference between the PLLS and a conventional automatic synchronizer is that there is no slip frequency difference between generator and bus. The PLL synchronizer is most advantageous when the penstock pressure fluctuates, the grid frequency fluctuates, or both. The PLL synchronizer is relatively inexpensive. Hydroplants with multiple units can economically be equipped with a synchronizer for each unit.

  4. WOLF; automatic typing program

    USGS Publications Warehouse

    Evenden, G.I.

    1982-01-01

    A FORTRAN IV program for the Hewlett-Packard 1000 series computer provides for automatic typing operations and can, when employed with the manufacturer's text editor, provide a system to greatly facilitate preparation of reports, letters and other text. The input text and embedded control data can perform nearly all of the functions of a typist. A few of the features available are centering, titles, footnotes, indentation, page numbering (including Roman numerals), automatic paragraphing, and two forms of tab operations. This documentation contains both user and technical descriptions of the program.

  5. AUTOMATIC COUNTING APPARATUS

    DOEpatents

    Howell, W.D.

    1957-08-20

    An apparatus for automatically recording the results of counting operations on trains of electrical pulses is described. The disadvantages of prior devices utilizing the two common methods of obtaining the count rate are overcome by this apparatus; in the case of time-controlled operation, the disclosed system automatically records any information stored by the scaler but not transferred to the printer at the end of the predetermined time-controlled operations and, in the case of count-controlled operation, provision is made to prevent a weak sample from occupying the apparatus for an excessively long period of time.

  6. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed where the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Nimar S. Arora, Stuart Russell, Erik Sudderth, Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  7. Modification of Hewlett-Packard Chemstation (G1034C version C.02.00 and G1701AA version C.02.00 and C.03.00) data analysis program for addition of automatic extracted ion chromatographic groups.

    PubMed

    Amick, G D

    1998-01-01

    A modification to data analysis macros to create "ion groups" and add a menu item to the data analysis menu bar is described. These "ion groups" consist of up to 10 ions for extracted ion chromatograms. The present manuscript describes modifications to yield a drop-down menu in data analysis that contains user-defined names of groups (e.g., opiates, barbiturates, benzodiazepines) of ions that, when selected, will automatically produce extracted ion chromatograms with up to 10 ions in that group for the loaded datafile.
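
    The grouping idea generalizes beyond Chemstation macros; a language-neutral sketch in Python, where the group names and m/z values are illustrative placeholders, not validated assay parameters:

```python
import numpy as np

# User-defined "ion groups", each capped at 10 ions as in the modification
# above (m/z values here are illustrative only).
ION_GROUPS = {
    "opiates": [286, 268, 204],
    "barbiturates": [156, 141, 98],
}

def extract_ion_chromatograms(mz_axis, intensities, group):
    """intensities: (n_scans, n_mz) matrix of scan data.
    Returns one extracted ion chromatogram (trace over scans) per m/z."""
    traces = {}
    for mz in ION_GROUPS[group][:10]:
        col = int(np.argmin(np.abs(np.asarray(mz_axis) - mz)))
        traces[mz] = intensities[:, col]
    return traces

mz_axis = np.arange(50, 400)
intensities = np.random.rand(600, mz_axis.size)    # stand-in scan table
eics = extract_ion_chromatograms(mz_axis, intensities, "opiates")
print({mz: trace.max() for mz, trace in eics.items()})
```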

  8. Automatic sweep circuit

    DOEpatents

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  9. Automatic Program Synthesis Reports.

    ERIC Educational Resources Information Center

    Biermann, A. W.; And Others

    Some of the major results and future goals of an automatic program synthesis project are described in the two papers that comprise this document. The first paper gives a detailed algorithm for synthesizing a computer program from a trace of its behavior. Since the algorithm involves a search, the length of time required to do the synthesis of…

  10. Brut: Automatic bubble classifier

    NASA Astrophysics Data System (ADS)

    Beaumont, Christopher; Goodman, Alyssa; Williams, Jonathan; Kendrew, Sarah; Simpson, Robert

    2014-07-01

    Brut, written in Python, identifies bubbles in infrared images of the Galactic midplane; it uses a database of known bubbles from the Milky Way Project and Spitzer images to build an automatic bubble classifier. The classifier is based on the Random Forest algorithm, and uses the WiseRF implementation of this algorithm.
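
    A minimal sketch of a Brut-style training step, assuming feature vectors have already been extracted from Spitzer image cutouts and labeled against the Milky Way Project catalog; scikit-learn's Random Forest stands in for the WiseRF implementation the tool actually uses:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 16))                 # stand-in cutout features
y = (X[:, :4].sum(axis=1) > 0).astype(int)      # 1 = bubble, 0 = non-bubble
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict_proba(X[:3])[:, 1])           # bubble probability per cutout
```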

  11. Automatic multiple applicator electrophoresis

    NASA Technical Reports Server (NTRS)

    Grunbaum, B. W.

    1977-01-01

    Easy-to-use, economical device permits electrophoresis on all known supporting media. System includes automatic multiple-sample applicator, sample holder, and electrophoresis apparatus. System has potential applicability to fields of taxonomy, immunology, and genetics. Apparatus is also used for electrofocusing.

  12. Automatic finite element generators

    NASA Technical Reports Server (NTRS)

    Wang, P. S.

    1984-01-01

    The design and implementation of a software system for generating finite elements and related computations are described. Exact symbolic computational techniques are employed to derive strain-displacement matrices and element stiffness matrices. Methods for dealing with the excessive growth of symbolic expressions are discussed. Automatic FORTRAN code generation is described with emphasis on improving the efficiency of the resultant code.

  14. Reactor component automatic grapple

    DOEpatents

    Greenaway, Paul R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment.

  15. Automatic Data Processing Glossary.

    ERIC Educational Resources Information Center

    Bureau of the Budget, Washington, DC.

    The technology of the automatic information processing field has progressed dramatically in the past few years and has created a problem in common term usage. As a solution, "Datamation" Magazine offers this glossary which was compiled by the U.S. Bureau of the Budget as an official reference. The terms appear in a single alphabetic sequence,…

  16. AUTOmatic Message PACKing Facility

    2004-07-01

    AUTOPACK is a library that provides several useful features for programs using the Message Passing Interface (MPI). Features included are: 1. an automatic message packing facility; 2. management of send and receive requests; 3. management of message buffer memory; 4. determination of the number of anticipated messages from a set of arbitrary sends; and 5. deterministic message delivery for testing purposes.

  17. Fully automatic telemetry data processor

    NASA Technical Reports Server (NTRS)

    Cox, F. B.; Keipert, F. A.; Lee, R. C.

    1968-01-01

    Satellite Telemetry Automatic Reduction System /STARS 2/, a fully automatic computer-controlled telemetry data processor, maximizes data recovery, reduces turnaround time, increases flexibility, and improves operational efficiency. The system incorporates a CDC 3200 computer as its central element.

  18. Automatism and driving offences.

    PubMed

    Rumbold, John

    2013-10-01

    Automatism is a rarely used defence, but it is particularly used for driving offences because many are strict liability offences. Medical evidence is almost always crucial to argue the defence, and it is important to understand the bars that limit the use of automatism so that the important medical issues can be identified. The issue of prior fault is an important public safeguard to ensure that reasonable precautions are taken to prevent accidents. The total loss of control definition is more problematic, especially with disorders of more gradual onset like hypoglycaemic episodes. In these cases the alternative of 'effective loss of control' would be fairer. This article explores several cases, how the criteria were applied to each, and the types of medical assessment required. PMID:24112330

  19. Automatic transmission control method

    SciTech Connect

    Hasegawa, H.; Ishiguro, T.

    1989-07-04

    This patent describes a method of controlling an automatic transmission of an automotive vehicle. The transmission has a gear train which includes a brake for establishing a first lowest speed of the transmission, the brake acting directly on a ring gear which meshes with a pinion, the pinion meshing with a sun gear in a planetary gear train, the ring gear connected with an output member, the sun gear being engageable and disengageable with an input member of the transmission by means of a clutch. The method comprises the steps of: detecting that a shift position of the automatic transmission has been shifted to a neutral range; thereafter introducing hydraulic pressure to the brake if present vehicle velocity is below a predetermined value, whereby the brake is engaged to establish the first lowest speed; and exhausting hydraulic pressure from the brake if present vehicle velocity is higher than a predetermined value, whereby the brake is disengaged.

  20. Automatism and driving offences.

    PubMed

    Rumbold, John

    2013-10-01

    Automatism is a rarely used defence, but it is particularly used for driving offences because many are strict liability offences. Medical evidence is almost always crucial to argue the defence, and it is important to understand the bars that limit the use of automatism so that the important medical issues can be identified. The issue of prior fault is an important public safeguard to ensure that reasonable precautions are taken to prevent accidents. The total loss of control definition is more problematic, especially with disorders of more gradual onset like hypoglycaemic episodes. In these cases the alternative of 'effective loss of control' would be fairer. This article explores several cases, how the criteria were applied to each, and the types of medical assessment required.

  1. Automatic Abstraction in Planning

    NASA Technical Reports Server (NTRS)

    Christensen, J.

    1991-01-01

    Traditionally, abstraction in planning has been accomplished by either state abstraction or operator abstraction, neither of which has been fully automatic. We present a new method, predicate relaxation, for automatically performing state abstraction. PABLO, a nonlinear hierarchical planner, implements predicate relaxation. Theoretical as well as empirical results are presented which demonstrate the potential advantages of using predicate relaxation in planning. We also present a new definition of hierarchical operators that allows us to guarantee a limited form of completeness. This new definition is shown to be, in some ways, more flexible than previous definitions of hierarchical operators. Finally, a Classical Truth Criterion is presented that is proven to be sound and complete for a planning formalism that is general enough to include most classical planning formalisms that are based on the STRIPS assumption.

  2. Automatic vehicle monitoring

    NASA Technical Reports Server (NTRS)

    Bravman, J. S.; Durrani, S. H.

    1976-01-01

    Automatic vehicle monitoring systems are discussed. In a baseline system for highway applications, each vehicle obtains position information through a Loran-C receiver in rural areas and through a 'signpost' or 'proximity' type sensor in urban areas; the vehicle transmits this information to a central station via a communication link. In an advance system, the vehicle carries a receiver for signals emitted by satellites in the Global Positioning System and uses a satellite-aided communication link to the central station. An advanced railroad car monitoring system uses car-mounted labels and sensors for car identification and cargo status; the information is collected by electronic interrogators mounted along the track and transmitted to a central station. It is concluded that automatic vehicle monitoring systems are technically feasible but not economically feasible unless a large market develops.

  3. Automatic speech recognition

    NASA Astrophysics Data System (ADS)

    Espy-Wilson, Carol

    2005-04-01

    Great strides have been made in the development of automatic speech recognition (ASR) technology over the past thirty years. Most of this effort has been centered around the extension and improvement of Hidden Markov Model (HMM) approaches to ASR. Current commercially available and industry systems based on HMMs can perform well for certain situational tasks that restrict variability, such as phone dialing or limited voice commands. However, the holy grail of ASR systems is performance comparable to humans; in other words, the ability to automatically transcribe unrestricted conversational speech spoken by any number of speakers under varying acoustic environments. This goal is far from being reached. Key to the success of ASR is effective modeling of variability in the speech signal. This tutorial will review the basics of ASR and the various ways in which our current knowledge of speech production, speech perception and prosody can be exploited to improve robustness at every level of the system.
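
    At the core of the HMM approach is the forward recursion, which scores an observation sequence against a model. A self-contained toy version in the log domain for numerical stability (sizes and values are illustrative; real recognizers use thousands of context-dependent states):

```python
import numpy as np

def forward(log_A, log_pi, log_B):
    """HMM forward pass in the log domain.

    log_A: (S, S) transition log-probs; log_pi: (S,) initial log-probs;
    log_B: (T, S) per-frame observation log-likelihoods.
    Returns the total log-likelihood of the observation sequence."""
    alpha = log_pi + log_B[0]
    for t in range(1, log_B.shape[0]):
        # log-sum-exp over previous states keeps the recursion stable
        alpha = log_B[t] + np.logaddexp.reduce(alpha[:, None] + log_A, axis=0)
    return np.logaddexp.reduce(alpha)

S = 3
rng = np.random.default_rng(0)
log_A = np.log(rng.dirichlet(np.ones(S), size=S))   # each row sums to 1
log_pi = np.log(np.full(S, 1.0 / S))
log_B = np.log(rng.random((10, S)))                 # toy frame likelihoods
print(forward(log_A, log_pi, log_B))
```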

  4. Automatic volume calibration system

    SciTech Connect

    Gates, A.J.; Aaron, C.C.

    1985-05-06

    The Automatic Volume Calibration System presently consists of three independent volume-measurement subsystems and can possibly be expanded to five subsystems. When completed, the system will manually or automatically perform the sequence of valve-control and data-acquisition operations required to measure given volumes. An LSI-11 minicomputer controls the vacuum and pressure sources and controls solenoid control valves to open and close various volumes. The input data are obtained from numerous displacement, temperature, and pressure sensors read by the LSI-11. The LSI-11 calculates the unknown volume from the data acquired during the sequence of valve operations. The results, based on the Ideal Gas Law, also provide information for feedback and control. This paper describes the volume calibration system, its subsystems, and the integration of the various instrumentation used in the system's design and development. 11 refs., 13 figs., 4 tabs.
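
    The underlying measurement is a simple Ideal Gas Law calculation: a reference volume at known pressure is expanded into the evacuated unknown volume and the settled pressure is read. A worked sketch under an isothermal assumption (values illustrative):

```python
# p1 * V_ref = p2 * (V_ref + V_unknown)  =>  solve for V_unknown.
def unknown_volume(v_ref, p1, p2):
    return v_ref * (p1 / p2 - 1.0)

# Expanding a 1.000 L reference from 200 to 125 pressure units:
print(unknown_volume(v_ref=1.000, p1=200.0, p2=125.0))  # -> 0.6 L
```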

  5. Automatic Thesaurus Generation for an Electronic Community System.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; And Others

    1995-01-01

    This research reports an algorithmic approach to the automatic generation of thesauri for electronic community systems. The techniques used include term filtering, automatic indexing, and cluster analysis. The Worm Community System, used by molecular biologists studying the nematode worm C. elegans, was used as the testbed for this research.…

  6. Automatic TLI recognition system beta prototype testing

    SciTech Connect

    Lassahn, G.D.

    1996-06-01

    This report describes the beta prototype automatic target recognition system ATR3, and some performance tests done with this system. This is a fully operational system with a high computational speed. It is useful for finding any kind of target in digitized image data, and as a general-purpose image analysis tool.

  7. Automatic Skin Color Beautification

    NASA Astrophysics Data System (ADS)

    Chen, Chih-Wei; Huang, Da-Yuan; Fuh, Chiou-Shann

    In this paper, we propose an automatic skin beautification framework based on color-temperature-insensitive skin-color detection. To polish the selected skin region, we apply a bilateral filter to smooth facial flaws. Lastly, we use Poisson image cloning to integrate the beautified parts into the original input. Experimental results show that the proposed method can be applied in varied light-source environments. In addition, this method can naturally beautify portrait skin.
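
    A sketch of the three-step pipeline with OpenCV: a rough skin mask, bilateral smoothing, then Poisson (seamless) blending back into the original. The input file name and the YCrCb skin thresholds are illustrative assumptions, not the paper's color-temperature-insensitive detector:

```python
import cv2
import numpy as np

img = cv2.imread("portrait.jpg")                          # hypothetical input
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))  # rough skin mask
smoothed = cv2.bilateralFilter(img, 9, 75, 75)            # smooth facial flaws
center = (img.shape[1] // 2, img.shape[0] // 2)
result = cv2.seamlessClone(smoothed, img, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("beautified.jpg", result)
```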

  8. Sensitivity analysis of O{sub 3} and photochemical indicators using a mixed-phase chemistry box model and automatic differentiation technology

    SciTech Connect

    Zhang, Y.; Easter, R.C.; Bischof, C.H.; Wu, P.T.

    1997-12-31

    A comprehensive sensitivity analysis of a multi-phase atmospheric chemical mechanism is conducted under a variety of atmospheric conditions. The ADIFOR automatic differentiation technology is applied to evaluate the local sensitivities of species concentrations in gas, aqueous and aerosol phases with respect to a variety of model parameters. In this paper, sensitivities of tropospheric ozone and photochemical indicators with respect to species initial concentrations, gas-phase reaction rate constants, and aerosol surface uptake coefficients are presented and analyzed. The main gas-phase reaction pathways and aerosol surface uptake processes that affect tropospheric O₃ formation, O₃-precursor relations and sensitivity of indicators are identified. The most influential gas-phase reactions include the photolytic reactions of NO₂, O₃, H₂O₂, HCHO, ALD₂ and MGLY, the conversion of NO to NO₂, the generation and inter-conversion of OH, HO₂ and RO₂ radicals, and the formation and dissociation of oxidants and acids. Photochemical indicators such as O₃/NOₓ and H₂O₂/HNO₃ are sensitive to changes in reaction rate constants, initial species concentrations, and uptake coefficients. These indicators are found to have higher sensitivities for hydrocarbon reactions and lower sensitivities for NOₓ reactions under polluted conditions as compared to less polluted conditions. Aerosol surface uptake is important when the total surface area is larger than 1,000 µm² cm⁻³. The identified important heterogeneous processes include aerosol surface uptake of HCHO, O₃, HO₂, HNO₃, NO, NO₂, N₂O₅, PAN, H₂O₂, CH₃O₂ and SO₂. These uptake processes can affect not only O₃ formation and its sensitivity, but also O₃-precursor relations and sensitivities of indicators.
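
    ADIFOR produces the local sensitivities d(output)/d(parameter) by differentiating the model code itself. A toy stand-in that conveys the quantity being computed, using central finite differences on a two-parameter production/loss balance (not the actual mechanism):

```python
import numpy as np

def steady_state_o3(k_prod, k_loss):
    """Toy production/loss balance standing in for the chemical mechanism."""
    return k_prod / k_loss

def sensitivities(f, params, rel_step=1e-6):
    """Central finite differences d f / d params[i] at the given point."""
    base = np.asarray(params, dtype=float)
    sens = np.empty_like(base)
    for i in range(base.size):
        h = rel_step * max(abs(base[i]), 1.0)
        hi, lo = base.copy(), base.copy()
        hi[i] += h
        lo[i] -= h
        sens[i] = (f(*hi) - f(*lo)) / (2.0 * h)
    return sens

# Sensitivities of the toy O3 output to the two rate constants:
print(sensitivities(steady_state_o3, [2.0e-3, 5.0e-5]))
```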

  9. Automatic payload deployment system

    NASA Astrophysics Data System (ADS)

    Pezeshkian, Narek; Nguyen, Hoa G.; Burmeister, Aaron; Holz, Kevin; Hart, Abraham

    2010-04-01

    The ability to precisely emplace stand-alone payloads in hostile territory has long been on the wish list of US warfighters. This type of activity is one of the main functions of special operation forces, often conducted at great danger. Such risk can be mitigated by transitioning the manual placement of payloads over to an automated placement mechanism by the use of the Automatic Payload Deployment System (APDS). Based on the Automatically Deployed Communication Relays (ADCR) system, which provides non-line-of-sight operation for unmanned ground vehicles by automatically dropping radio relays when needed, the APDS takes this concept a step further and allows for the delivery of a mixed variety of payloads. For example, payloads equipped with a camera and gas sensor in addition to a radio repeater, can be deployed in support of rescue operations of trapped miners. Battlefield applications may include delivering food, ammunition, and medical supplies to the warfighter. Covert operations may require the unmanned emplacement of a network of sensors for human-presence detection, before undertaking the mission. The APDS is well suited for these tasks. Demonstrations have been conducted using an iRobot PackBot EOD in delivering a variety of payloads, for which the performance and results will be discussed in this paper.

  10. Technical comparison between Hythane, CNG and gasoline fueled vehicles. [Hythane = 85 vol% natural gas, 15 vol% H₂]

    SciTech Connect

    Not Available

    1992-05-01

    This interim report documents progress on this 2-year Alternative Fuel project, scheduled to end early 1993. Hythane is 85 vol% compressed natural gas (CNG) and 15 vol% hydrogen; it has the potential to meet or exceed the California Ultra-Low Emission Vehicle (ULEV) standard. Three USA trucks (3/4 ton pickup) were operated on single fuel (unleaded gasoline, CNG, Hythane) in Denver. The report includes emission testing, fueling facility, hazard and operability study, and a framework for a national hythane strategy.

  11. Automatic CT Measurement In Lumbar Vertebrae

    NASA Astrophysics Data System (ADS)

    Bisseling, Johannes T.; van Erning, Leon J. T. O.; Schouten, Theo E.; Lemmen, J. Albert M.

    1989-04-01

    Reliable software for automatic determination of the border between the cancellous bone and the cortical bone of lumbar vertebrae has been developed. An automatic procedure is needed because calculations in a larger series of patient data take too much time due to the inevitable human interaction required by available software packages. Processing in batch mode is essential. An important advantage of automatic outlining is its reproducibility, because only a single technique with objective criteria is used. In a so-called Region Of Interest (ROI) texture analysis can be performed to quantify the condition of the vertebral body in order to diagnose osteoporosis. This technique may be an alternative to a classification based solely on the average X-ray absorption value.

  12. Support vector machine for automatic pain recognition

    NASA Astrophysics Data System (ADS)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects face from the stored video frame using skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experiment results indicate that using support vector machine as classifier can certainly improve the performance of automatic pain recognition system.
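
    A minimal sketch of the classification stage described above: location and shape features from detected faces feed an SVM that separates "pain" from "no pain". The features here are synthetic stand-ins, not the paper's actual descriptors:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 12))               # stand-in face shape features
y = (X[:, 0] - X[:, 3] > 0).astype(int)      # 1 = pain, 0 = no pain
clf = SVC(kernel="rbf", probability=True).fit(X[:300], y[:300])
print("held-out accuracy:", clf.score(X[300:], y[300:]))
```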

  13. Automatic weld torch guidance control system

    NASA Technical Reports Server (NTRS)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical type automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross seam actuator digital drive motor controller to complete the closed-loop, feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm, or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.
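
    A minimal sketch of the kind of tracking-error computation the microprocessor performs, assuming the seam appears as an intensity dip in the digitized scan (a simplified stand-in for the actual analysis):

```python
import numpy as np

def tracking_error(frame, mm_per_pixel=0.05):
    """frame: (rows, cols) 8-bit image of the joint region.
    Returns the signed cross-seam error in mm, assuming the seam shows up
    as the darkest column in the averaged intensity profile."""
    profile = frame.mean(axis=0)             # average intensity per column
    seam_col = int(np.argmin(profile))       # seam appears as a dark gap
    center = frame.shape[1] / 2.0
    return (seam_col - center) * mm_per_pixel

frame = np.full((64, 64), 200, dtype=np.uint8)
frame[:, 40] = 20                            # synthetic seam at column 40
print(tracking_error(frame))                 # signed correction, in mm
```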

  14. Automatic exposure control for space sequential camera

    NASA Technical Reports Server (NTRS)

    Mcatee, G. E., Jr.; Stoap, L. J.; Solheim, C. D.; Sharpsteen, J. T.

    1975-01-01

    The final report for the automatic exposure control study for space sequential cameras, for the NASA Johnson Space Center, is presented. The material is shown in the same sequence in which the work was performed. The purpose of the automatic exposure control is to automatically control the lens iris as well as the camera shutter so that the subject is properly exposed on the film. A study of design approaches is presented. Analysis of the light range of the spectrum covered indicates that the practical range would be from approximately 20 to 6,000 foot-lamberts, or about nine f-stops. Observation of film available from space flights shows that optimum scene illumination is apparently not present in vehicle interior photography as well as in vehicle-to-vehicle situations. The evaluation test procedure for a breadboard, and the results, which provided information for the design of a brassboard, are given.

  15. Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy from Evan Leigh's Modern Cotton Spinning (Vol 1), Manchester, 1873 (PL XX); illustration used by eminent British textile engineer to exemplify the ultimate development in American cotton mill technology. - Harmony Manufacturing Company, Mill Number 3, 100 North Mohawk Street, Cohoes, Albany County, NY

  16. 14. Photocopy of engraving from History of Westchester County, Vol. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. Photocopy of engraving from History of Westchester County, Vol. 2, by L.E. Preston & Company, Philadelphia, 1886 ALEXANDER SMITH AND SONS CARPET COMPANY, DETAIL, SPINNING AND PRINT MILLS, - Moquette Row Housing, Moquette Row North & Moquette Row South, Yonkers, Westchester County, NY

  17. 13. Photocopy of engraving from History of Westchester County, Vol. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. Photocopy of engraving from History of Westchester County, Vol. 2, by J. Thomas Scharf, published by L.E. Preston & Company, Philadelphia, 1886 ALEXANDER SMITH AND SONS CARPET COMPANY, MOQUETTE MILLS, WEAVING MILLS, SPINNING AND PRINT MILLS - Moquette Row Housing, Moquette Row North & Moquette Row South, Yonkers, Westchester County, NY

  18. 15. Photocopy of engraving from History of Westchester County, Vol. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Photocopy of engraving from History of Westchester County, Vol. 2, by J. Thomas Scharf, published by L.E. Preston ALEXANDER SMITH AND SONS CARPET COMPANY, DETAIL, MOQUETTE MILLS - Moquette Row Housing, Moquette Row North & Moquette Row South, Yonkers, Westchester County, NY

  19. 3. Photocopy from Western Architect, Vol. 19, No. 8, August ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Photocopy from Western Architect, Vol. 19, No. 8, August 1913, following page 80. 'TOWN AND COMMUNITY PLANNING, WALTER BURLEY GRIFFEN.' ORIGINAL PRESENTATION DRAWING AT NORTHWESTERN UNIVERSITY, ART DEPARTMENT. - Joshua G. Melson House, 56 River Heights Drive, Mason City, Cerro Gordo County, IA

  20. An image-based automatic mesh generation and numerical simulation for a population-based analysis of aerosol delivery in the human lungs

    NASA Astrophysics Data System (ADS)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2013-11-01

    The authors propose a method to automatically generate three-dimensional subject-specific airway geometries and meshes for computational fluid dynamics (CFD) studies of aerosol delivery in the human lungs. The proposed method automatically expands computed tomography (CT)-based airway skeleton to generate the centerline (CL)-based model, and then fits it to the CT-segmented geometry to generate the hybrid CL-CT-based model. To produce a turbulent laryngeal jet known to affect aerosol transport, we developed a physiologically-consistent laryngeal model that can be attached to the trachea of the above models. We used Gmsh to automatically generate the mesh for the above models. To assess the quality of the models, we compared the regional aerosol distributions in a human lung predicted by the hybrid model and the manually generated CT-based model. The aerosol distribution predicted by the hybrid model was consistent with the prediction by the CT-based model. We applied the hybrid model to 8 healthy and 16 severe asthmatic subjects, and average geometric error was 3.8% of the branch radius. The proposed method can be potentially applied to the branch-by-branch analyses of a large population of healthy and diseased lungs. NIH Grants R01-HL-094315 and S10-RR-022421, CT data provided by SARP, and computer time provided by XSEDE.

  1. Using airborne LiDAR in geoarchaeological contexts: Assessment of an automatic tool for the detection and the morphometric analysis of grazing archaeological structures (French Massif Central).

    NASA Astrophysics Data System (ADS)

    Roussel, Erwan; Toumazet, Jean-Pierre; Florez, Marta; Vautier, Franck; Dousteyssier, Bertrand

    2014-05-01

    Airborne laser scanning (ALS) of archaeological regions of interest is nowadays a widely used and established method for accurate topographic and microtopographic survey. The penetration of the vegetation cover by the laser beam allows the reconstruction of reliable digital terrain models (DTM) of forested areas where traditional prospection methods are inefficient, time-consuming and non-exhaustive. The ALS technology provides the opportunity to discover new archaeological features hidden by vegetation and provides a comprehensive survey of cultural heritage sites within their environmental context. However, the post-processing of LiDAR point clouds produces a huge quantity of data in which relevant archaeological features are not easily detectable with common visualizing and analysing tools. Undoubtedly, there is an urgent need for the automation of structure detection and morphometric extraction techniques, especially for the "archaeological desert" in densely forested areas. This presentation deals with the development of automatic detection procedures applied to archaeological structures located in the French Massif Central, in the western forested part of the Puy-de-Dôme volcano between 950 and 1100 m a.s.l. These unknown archaeological sites were discovered by the March 2011 ALS mission and display a high density of subcircular depressions with a corridor access. The spatial organization of these depressions varies from isolated to aggregated or aligned features. Functionally, they appear to be former grazing constructions built from the medieval to the modern period. Similar grazing structures are known in other locations of the French Massif Central (Sancy, Artense, Cézallier) where the ground is vegetation-free. In order to develop a reliable process of automatic detection and mapping of these archaeological structures, a learning zone has been delineated within the ALS surveyed area. The grazing features were mapped and typical morphometric attributes
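
    One standard way to flag such closed depressions in a DTM is sink filling by morphological reconstruction; a sketch with illustrative thresholds, not the authors' calibrated detection rules:

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import reconstruction

def detect_depressions(dtm, min_depth=0.3):
    """Label closed depressions: fill sinks from the border inward, then
    keep connected regions whose fill depth exceeds min_depth (metres)."""
    seed = dtm.copy()
    seed[1:-1, 1:-1] = dtm.max()               # fill from the border inward
    filled = reconstruction(seed, dtm, method="erosion")
    depth = filled - dtm                       # per-cell fill depth
    labels, n = ndimage.label(depth > min_depth)
    return labels, n

dtm = np.zeros((50, 50))
dtm[20:25, 20:25] = -1.0                       # synthetic sub-circular hollow
labels, n = detect_depressions(dtm)
print(n)                                       # 1 candidate depression
```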

  2. Automatic range selector

    DOEpatents

    McNeilly, Clyde E.

    1977-01-04

    A device is provided for automatically selecting from a plurality of ranges of a scale of values to which a meter may be made responsive, that range which encompasses the value of an unknown parameter. A meter relay indicates whether the unknown is of greater or lesser value than the range to which the meter is then responsive. The rotatable part of a stepping relay is rotated in one direction or the other in response to the indication from the meter relay. Various positions of the rotatable part are associated with particular scales. Switching means are sensitive to the position of the rotatable part to couple the associated range to the meter.

  3. AUTOMATIC FREQUENCY CONTROL SYSTEM

    DOEpatents

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency determining element. A low frequency is mixed with the output of the driving oscillator and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.

  4. Automatic level control circuit

    NASA Technical Reports Server (NTRS)

    Toole, P. C.; Mccarthy, D. M. (Inventor)

    1983-01-01

    An automatic level control circuit for an operational amplifier is provided for minimizing spikes, or instantaneous gain, of the amplifier during low periods when no signal is received at the input. The apparatus includes a multibranch circuit which is connected between an output terminal and a feedback terminal. A pair of zener diodes connected back to back is placed in series with a capacitor in one of the branches. A pair of voltage-dividing resistors is connected in another of the branches, and a second capacitor is provided in the remaining branch for controlling the high-frequency oscillations of the operational amplifier.

  5. Classification and automatic transcription of primate calls.

    PubMed

    Versteegh, Maarten; Kuhn, Jeremy; Synnaeve, Gabriel; Ravaux, Lucie; Chemla, Emmanuel; Cäsar, Cristiane; Fuller, James; Murphy, Derek; Schel, Anne; Dunbar, Ewan

    2016-07-01

    This paper reports on an automated and openly available tool for automatic acoustic analysis and transcription of primate calls, which takes raw field recordings and outputs call labels time-aligned with the audio. The system's output predicts a majority of the start times of calls accurately within 200 milliseconds. The tools do not require any manual acoustic analysis or selection of spectral features by the researcher. PMID:27475207
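
    The 200-millisecond onset criterion reported above reduces to a simple matching metric; a sketch where the tolerance and example onsets are illustrative:

```python
def onset_accuracy(pred_onsets, ref_onsets, tol=0.2):
    """Fraction of predicted call onsets within tol seconds of a reference."""
    if not pred_onsets:
        return 0.0
    hits = sum(1 for p in pred_onsets
               if any(abs(p - r) <= tol for r in ref_onsets))
    return hits / len(pred_onsets)

print(onset_accuracy([0.51, 2.10, 7.43], [0.50, 2.00, 7.40]))  # -> 1.0
```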

  6. Automatic microscopy for mitotic cell location.

    NASA Technical Reports Server (NTRS)

    Herron, J.; Ranshaw, R.; Castle, J.; Wald, N.

    1972-01-01

    Advances are reported in the development of an automatic microscope with which to locate hematologic or other cells in mitosis for subsequent chromosome analysis. The system under development is designed to perform the functions of: slide scanning to locate metaphase cells; conversion of images of selected cells into binary form; and on-line computer analysis of the digitized image for significant cytogenetic data. Cell detection criteria are evaluated using a test sample of 100 mitotic cells and 100 artifacts.

  8. Automatic thermographic image defect detection of composites

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Liebenberg, Bjorn; Raymont, Jeff; Santospirito, SP

    2011-05-01

    Detecting defects, and especially reliably measuring defect sizes, are critical objectives in automatic NDT defect detection applications. In this work, the Sentence software is proposed for the analysis of pulsed thermography and near IR images of composite materials. Furthermore, the Sentence software delivers an end-to-end, user friendly platform for engineers to perform complete manual inspections, as well as tools that allow senior engineers to develop inspection templates and profiles, reducing the requisite thermographic skill level of the operating engineer. Finally, the Sentence software can also offer complete independence of operator decisions by the fully automated "Beep on Defect" detection functionality. The end-to-end automatic inspection system includes sub-systems for defining a panel profile, generating an inspection plan, controlling a robot-arm and capturing thermographic images to detect defects. A statistical model has been built to analyze the entire image, evaluate grey-scale ranges, import sentencing criteria and automatically detect impact damage defects. A full width half maximum algorithm has been used to quantify the flaw sizes. The identified defects are imported into the sentencing engine which then sentences (automatically compares analysis results against acceptance criteria) the inspection by comparing the most significant defect or group of defects against the inspection standards.

  10. Automatic readout micrometer

    DOEpatents

    Lauritzen, Ted

    1982-01-01

    A measuring system is disclosed for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibilities of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment without having the fine adjustment outrun the coarse adjustment until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  11. Automatic temperature control

    SciTech Connect

    Sheridan, J.P.

    1986-07-22

    An automatic temperature control system is described for maintaining a preset temperature in an enclosed space in a building, comprising: heating and cooling means for conditioning the air in the enclosed space to maintain the preset temperature; exterior thermostat means outside the building for sensing ambient exterior temperature levels; interior thermostat means in the enclosed space, preset to the preset temperature to be maintained and connected with the heating and cooling means to energize the means for heating or cooling, as appropriate, when the preset temperature is reached; means defining a heat sink containing a volume of air heated by solar radiation, the volume of the heat sink being such that the temperature level therein is not affected by minor or temporary ambient temperature fluctuations; and heat sink thermostat means in the heat sink sensing the temperature in the heat sink, the heat sink thermostat means being connected in tandem with the exterior thermostat means and operative with the exterior thermostat means to switch the interior thermostat means to either a first readiness state for heating or a second readiness state for cooling, depending upon which mode is indicated by both the exterior and heat sink thermostat means, whereby the system automatically switches between heating and cooling, as required, in response to a comparison of exterior and heat sink temperatures.

  12. Image feature meaning for automatic key-frame extraction

    NASA Astrophysics Data System (ADS)

    Di Lecce, Vincenzo; Guerriero, Andrea

    2003-12-01

    Video abstraction and summarization, being requested in several applications, have directed a number of research efforts toward automatic video analysis techniques. The processes for automatic video analysis are based on the recognition of short sequences of contiguous frames that describe the same scene (shots), and of key frames representing the salient content of the shot. Since effective shot boundary detection techniques exist in the literature, in this paper we focus our attention on key frame extraction techniques that identify the low-level visual features of the frames that best represent the shot content. To evaluate the performance of the features, key frames automatically extracted using these features are compared to human operator video annotations.
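
    A minimal key-frame selector of the kind evaluated above, assuming shot boundaries are already known; the grey-level histogram feature and the threshold are illustrative choices:

```python
import numpy as np

def key_frames(frames, threshold=0.3):
    """frames: iterable of 2-D uint8 arrays belonging to one shot.
    Selects a frame whenever its grey-level histogram moves far enough
    (total-variation distance) from the last selected key frame."""
    keys, last_hist = [], None
    for i, frame in enumerate(frames):
        hist, _ = np.histogram(frame, bins=64, range=(0, 256))
        hist = hist / hist.sum()
        if last_hist is None or 0.5 * np.abs(hist - last_hist).sum() > threshold:
            keys.append(i)
            last_hist = hist
    return keys

rng = np.random.default_rng(3)
shot = [rng.integers(0, 120, (48, 64), dtype=np.uint8) for _ in range(10)]
shot += [rng.integers(120, 256, (48, 64), dtype=np.uint8) for _ in range(10)]
print(key_frames(shot))   # first frame, plus the frame where content shifts
```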

  13. Automatic enrollment for gait-based person re-identification

    NASA Astrophysics Data System (ADS)

    Ortells, Javier; Martín-Félez, Raúl; Mollineda, Ramón A.

    2015-02-01

    Automatic enrollment involves a critical decision-making process within the person re-identification context. However, this process has been traditionally undervalued. This paper studies the problem of automatic person enrollment from a realistic perspective, relying on gait analysis. Experiments simulating random flows of people with considerable appearance variations between different observations of a person have been conducted, modeling both short- and long-term scenarios. Promising results based on ROC analysis show that automatically enrolling people by their gait is feasible with high success rates.

  14. Automatism, medicine and the law.

    PubMed

    Fenwick, P

    1990-01-01

    The law on automatism is undergoing change. For some time there has been a conflict between the medical and the legal views. The medical profession believes that the present division between sane and insane automatism makes little medical sense. Insane automatism is due to an internal factor, that is, a disease of the brain, while sane automatism is due to an external factor, such as a blow on the head or an injection of a drug. This leads to the situation where, for example, the hypoglycaemia resulting from injected insulin would be sane automatism, while hypoglycaemia which results from an islet tumour would be insane automatism. This would not matter if the consequences were the same. However, sane automatism leads to an acquittal, whereas insane automatism leads to committal to a secure mental hospital. This article traces the development of the concept of automatism from the 1950s to the present time, and looks at the anomalies in the law as it now stands. It considers the medical conditions of, and the law relating to, epilepsy, alcohol and drug automatism, hypoglycaemic automatisms, transient global amnesia, and hysterical automatisms. Sleep automatisms, and offences committed during a somnambulistic automatism, are also discussed in detail. The article also examines the need of the Courts to be provided with expert evidence and the role that the qualified medical practitioner should take. It clarifies the various points which medical practitioners should consider when assessing whether a defence of automatism is justified on medical grounds, and in seeking to establish such a defence. The present law is unsatisfactory, as it does not allow any discretion in sentencing on the part of the judge once a verdict of not guilty by virtue of insane automatism has been passed. The judge must sentence the defendant to detention in a secure mental hospital. This would certainly be satisfactory where violent crimes have been committed. However, it is inappropriate in

  15. Keystone feasibility study. Final report. Vol. 4

    SciTech Connect

    Not Available

    1982-12-01

    Volume four of the Keystone coal-to-methanol project includes the following: (1) project management; (2) economic and financial analyses; (3) market analysis; (4) process licensing and agreements; and (5) appendices. 24 figures, 27 tables.

  16. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    PubMed

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated to complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery. PMID:24473345

  18. Automatic repair in active-matrix liquid crystal display (AMLCD)

    NASA Astrophysics Data System (ADS)

    Qiu, Hongjie; Sheng, King C.; Lam, Joseph K.; Knuth, Tim; Miller, Mike; Addiego, Ginetto

    1994-04-01

    This paper presents an automatic AMLCD repair system utilizing real-time video, image processing and analysis, pattern recognition, and artificial intelligence. The system fundamentally includes automatic optical focus, automatic alignment, defect detection, defect analysis and identification, repair point and path definition, and automatic metal removal and addition (cutting, ablating, and metal deposition). Automatic alignment includes mark alignment as well as AMLCD pixel alignment. The features (area, centroid, slope, perimeter, length, width, and relative location between objects of interest) are measured for defect analysis. A least cost criterion is employed for defect detection and classification. The choice of repair process is determined by the defect type, either 'Open' or 'Short'. The repair point and path definition is made from the material structure type (Data line, Gate line, or ITO area), the defect position, and repair rules. The rules are generated from global and local knowledge. In the automatic repair process, the system automatically performs optical focus, mark and pixel alignment, defect detection and classification, and laser writing or cutting.
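
    The paper specifies only that a "least cost criterion" drives detection and classification; the sketch below shows one plausible reading, scoring a defect's measured features against per-class prototypes with assumed cost weights. The prototype values and weights are illustrative assumptions, not values from the system.

```python
import numpy as np

# Assumed prototype feature vectors (area, length/width ratio, edge slope)
# for the two repair-relevant defect types named in the paper.
PROTOTYPES = {
    "Open":  np.array([120.0, 4.0, 0.1]),
    "Short": np.array([300.0, 1.5, 0.8]),
}
WEIGHTS = np.array([0.01, 1.0, 2.0])  # assumed per-feature cost weights

def classify_defect(features):
    """Pick the defect type whose prototype gives the smallest weighted
    feature distance (one possible least-cost criterion)."""
    costs = {name: float(np.sum(WEIGHTS * np.abs(features - proto)))
             for name, proto in PROTOTYPES.items()}
    return min(costs, key=costs.get), costs

label, costs = classify_defect(np.array([280.0, 1.8, 0.7]))
print(label, costs)  # -> "Short"
```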

  19. Automatic routing module

    NASA Technical Reports Server (NTRS)

    Malin, Janice A.

    1987-01-01

    Automatic Routing Module (ARM) is a tool to partially automate Air Launched Cruise Missile (ALCM) routing. For any accessible launch point or target pair, ARM creates flyable routes that, within the fidelity of the models, are optimal in terms of threat avoidance, clobber avoidance, and adherence to vehicle and planning constraints. Although highly algorithmic, ARM is an expert system. Because of the heuristics applied, ARM-generated routes closely resemble manually generated routes in routine cases. In more complex cases, ARM's ability to accumulate and assess threat danger in three dimensions, and to trade that danger off against the probability of ground clobber, results in the safest path around or through difficult areas. The tools available prior to ARM neither provided the planner with enough information nor presented it in a way that ensured the safest path would be selected.

  20. AUTOMATIC HAND COUNTER

    DOEpatents

    Mann J.R.; Wainwright, A.E.

    1963-06-11

    An automatic, personnel-operated, alpha-particle hand monitor is described which functions as a qualitative instrument to indicate to the person using it whether his hands are "cold" or "hot." The monitor is activated by a push button and includes several capacitor-triggered thyratron tubes. Upon release of the push button, the monitor starts the counting of the radiation present on the hands of the person. If the count of the radiation exceeds a predetermined level within a predetermined time, then a capacitor will trigger a first thyratron tube to light a "hot" lamp. If, however, the count is below such level during this time period, another capacitor will fire a second thyratron to light a "safe" lamp. (AEC)
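
    The patent's counting logic reduces to a threshold test within a fixed time window. A toy software equivalent, with `read_count` standing in for the detector electronics (a hypothetical callable returning the cumulative count):

```python
import time

def monitor_hands(read_count, threshold=50, window_s=10.0):
    """Report 'HOT' if the accumulated alpha count exceeds the threshold
    within the time window, else 'SAFE' -- mirroring the two-lamp logic."""
    start = time.monotonic()
    while time.monotonic() - start < window_s:
        if read_count() >= threshold:
            return "HOT"
        time.sleep(0.1)  # poll the detector ten times per second
    return "SAFE"
```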

  1. Automatic thermal switch

    NASA Technical Reports Server (NTRS)

    Wing, L. D.; Cunningham, J. W. (Inventor)

    1981-01-01

    An automatic thermal switch to control heat flow includes a first thermally conductive plate, a second thermally conductive plate and a thermal transfer plate pivotally mounted between the first and second plates. A phase change power unit, including a plunger connected to the transfer plate, is in thermal contact with the first thermally conductive plate. A biasing element, connected to the transfer plate, biases the transfer plate in a predetermined position with respect to the first and second plates. When the phase change power unit is actuated by an increase in heat transmitted through the first plate, the plunger extends and pivots the transfer plate to vary the thermal conduction between the first and second plates through the transfer plate. The biasing element, transfer plate and piston can be arranged to provide either a normally closed or normally open thermally conductive path between the first and second plates.

  2. Automatic Bayesian polarity determination

    NASA Astrophysics Data System (ADS)

    Pugh, D. J.; White, R. S.; Christie, P. A. F.

    2016-07-01

    The polarity of the first motion of a seismic signal from an earthquake is an important constraint in earthquake source inversion. Microseismic events often have low signal-to-noise ratios, which may lead to difficulties estimating the correct first-motion polarities of the arrivals. This paper describes a probabilistic approach to polarity picking that can be both automated and combined with manual picking. This approach includes a quantitative estimate of the uncertainty of the polarity, improving calculation of the polarity probability density function for source inversion. It is sufficiently fast to be incorporated into an automatic processing workflow. When used in source inversion, the results are consistent with those from manual observations. In some cases, they produce a clearer constraint on the range of high-probability source mechanisms, and are better constrained than source mechanisms determined using a uniform probability of an incorrect polarity pick.
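
    The paper's probability density formulation is not reproduced here, but the flavour of a probabilistic polarity pick can be shown with a toy model: assume the measured first-motion amplitude is the true amplitude plus zero-mean Gaussian noise, with a flat prior on the sign, so the probability of a positive polarity is the Gaussian CDF of the signal-to-noise ratio.

```python
from math import erf, sqrt

def polarity_probability(amplitude, noise_std):
    """P(true first motion is positive | observed amplitude), under a
    Gaussian-noise, flat-prior toy model of polarity picking."""
    z = amplitude / (noise_std * sqrt(2.0))
    return 0.5 * (1.0 + erf(z))

print(polarity_probability( 2.0, 1.0))  # clear upward pick    -> ~0.98
print(polarity_probability(-0.2, 1.0))  # ambiguous noisy pick -> ~0.42
```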

  3. Automatic alkaloid removal system.

    PubMed

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid-removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant whose tubers have been shown to contain the toxic alkaloid dioscorine; the tubers can be consumed only after the poison is removed. In this experiment, the tubers are blended into powder form before being placed in the machine basket. The user pushes the START button on the machine controller to switch the water pump on, creating a turbulent wave of water in the machine tank. The wash stops automatically when the outlet solenoid valve is triggered. The tuber powder is washed for 10 minutes while 1 liter of toxin-contaminated water flows out. The controller then automatically triggers the inlet solenoid valve, and fresh water flows into the tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, after which a significant positive result is achieved according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters approach those of the control water, and the toxin is assumed to be fully removed when the pH of the DH wash water is close to that of the control water. The pH of the control water is about 5.3, while the water from this process reaches 6.0; before the machine is run, the contaminated water has a pH of about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less supervision by the user. PMID:24783795
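
    The wash cycle described above is a straightforward control loop. A sketch of that loop, with hypothetical stub functions standing in for the pump, solenoid valves, and ultrasonic level sensor (the 10-minute wash and roughly 7-hour total follow the paper):

```python
import time

# Hypothetical hardware stubs; a real controller would drive GPIO lines.
def pump_on(): print("pump ON (turbulence wash)")
def pump_off(): print("pump OFF")
def open_outlet(): print("outlet valve open: draining toxin-laden water")
def open_inlet(): print("inlet valve open: refilling with fresh water")
def water_at_level(): return True  # ultrasonic level-sensor stub

def run_detoxification(wash_min=10, total_h=7):
    cycles = int(total_h * 60 / wash_min)
    for i in range(cycles):
        pump_on()
        time.sleep(wash_min * 60)    # wash the tuber powder for 10 minutes
        pump_off()
        open_outlet()                # discard ~1 litre of contaminated water
        open_inlet()
        while not water_at_level():  # refill to the sensor-defined level
            time.sleep(1)
        print(f"cycle {i + 1}/{cycles} complete")
```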

  4. Composite materials: Fatigue and fracture. Vol. 3

    NASA Technical Reports Server (NTRS)

    O'Brien, T. K. (Editor)

    1991-01-01

    The present volume discusses topics in the fields of matrix cracking and delamination, interlaminar fracture toughness, delamination analysis, strength and impact characteristics, and fatigue and fracture behavior. Attention is given to cooling rate effects in carbon-reinforced PEEK, the effect of porosity on flange-web corner strength, mode II delamination in toughened composites, the combined effect of matrix cracking and free edge delamination, and a 3D stress analysis of plain weave composites. Also discussed are the compression behavior of composites, damage-based notched-strength modeling, fatigue failure processes in aligned carbon-epoxy laminates, and the thermomechanical fatigue of a quasi-isotropic metal-matrix composite.

  5. Automatic TLI recognition system, general description

    SciTech Connect

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  6. Semi-automatic approach for music classification

    NASA Astrophysics Data System (ADS)

    Zhang, Tong

    2003-11-01

    Audio categorization is essential when managing a music database, either a professional library or a personal collection. However, a complete automation in categorizing music into proper classes for browsing and searching is not yet supported by today's technology. Also, the issue of music classification is subjective to some extent as each user may have his own criteria for categorizing music. In this paper, we propose the idea of semi-automatic music classification. With this approach, a music browsing system is set up which contains a set of tools for separating music into a number of broad types (e.g. male solo, female solo, string instruments performance, etc.) using existing music analysis methods. With results of the automatic process, the user may further cluster music pieces in the database into finer classes and/or adjust misclassifications manually according to his own preferences and definitions. Such a system may greatly improve the efficiency of music browsing and retrieval, while at the same time guarantee accuracy and user's satisfaction of the results. Since this semi-automatic system has two parts, i.e. the automatic part and the manual part, they are described separately in the paper, with detailed descriptions and examples of each step of the two parts included.

  7. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

    Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis, including CT imaging, can detect aggressive malignant pleural mesothelioma at an early stage. In order to create quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change in each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, techniques for automatic spatiotemporal matching of the detected pleural thickenings at two points in time, based on semi-automatic registration, have been developed, implemented, and tested, so that the same thickening can be compared fully automatically. As a result, mapping based on principal component analysis turns out to be more advantageous than feature-based mapping using the centroid and mean Hounsfield units of each thickening: sensitivity improved from 42.19% to 98.46%, while the accuracy of the feature-based mapping is only slightly higher (84.38% versus 76.19%).
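
    The paper does not spell out its PCA-based mapping, but the idea of matching the same thickening across two scans can be sketched as follows: summarize each segmented thickening by its centroid and principal-axis extents (via SVD), then pair thickenings across time points by smallest combined distance. The weighting and the greedy pairing below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def pca_signature(voxels):
    """Centroid and principal-axis extents of one segmented thickening,
    given its voxel coordinates as an (N, 3) array with N >= 3."""
    X = np.asarray(voxels, dtype=float)
    centroid = X.mean(axis=0)
    extents = np.linalg.svd(X - centroid, compute_uv=False)  # 3 values
    return centroid, extents

def match_thickenings(sets_t0, sets_t1, w_shape=1.0):
    """Greedily pair thickenings across two time points by centroid
    distance plus weighted principal-component shape distance."""
    sigs1 = [pca_signature(v) for v in sets_t1]
    pairs = []
    for i, voxels in enumerate(sets_t0):
        c0, s0 = pca_signature(voxels)
        d = [np.linalg.norm(c0 - c1) + w_shape * np.linalg.norm(s0 - s1)
             for c1, s1 in sigs1]
        pairs.append((i, int(np.argmin(d))))  # index into sets_t1
    return pairs
```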

  8. Youth Studies Abstracts. Vol. 4 No. 1.

    ERIC Educational Resources Information Center

    Youth Studies Abstracts, 1985

    1985-01-01

    This volume contains abstracts of 76 projects (most of which were conducted in Australia and New Zealand) concerned with programs for youth and with social and educational developments affecting youth. The abstracts are arranged in the following two categories: (1) Social and Educational Developments: Policy, Analysis, Research; and (2) Programs:…

  9. Empirical Research in Theatre, Vol 3.

    ERIC Educational Resources Information Center

    Addington, David W., Ed.; Kepke, Allen N., Ed.

    This journal provides a focal point for the collection and distribution of systematically processed information about theory and practice in theatre. Part of an irregularly published series, this issue contains investigations of the application of transactional analysis to the theatre, the psychological effect of counterattitudinal acting in…

  10. Automatic Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr.

    1985-01-01

    Coal cutting and removal done with minimal hazard to people. Automatic coal mine cutting, transport and roof-support movement all done by automatic machinery. Exposure of people to hazardous conditions reduced to inspection tours, maintenance, repair, and possibly entry mining.

  11. 3D numerical test objects for the evaluation of a software used for an automatic analysis of a linear accelerator mechanical stability

    NASA Astrophysics Data System (ADS)

    Torfeh, Tarraf; Beaumont, Stéphane; Guédon, Jeanpierre; Benhdech, Yassine

    2010-04-01

    Mechanical stability of a medical LINear ACcelerator (LINAC), particularly the quality of the gantry, collimator and table rotations and the accuracy of the isocenter position, is crucial for the radiation therapy process, especially in stereotactic radiosurgery and in Image Guided Radiation Therapy (IGRT), where this mechanical stability is perturbed by the additional weight of the kV x-ray tube and detector. In this paper, we present a new method to evaluate software used to perform an automatic measurement of the "size" (flex map) and the location of the kV and the MV isocenters of the linear accelerator. The method consists of developing a complete numerical 3D simulation of a LINAC and physical phantoms in order to produce Electronic Portal Imaging Device (EPID) images including calibrated distortions of the mechanical movement of the gantry and isocenter misalignments.

  12. Microstructure and Mechanical Properties of Al6061-31vol.% B4C Composites Prepared by Hot Isostatic Pressing

    NASA Astrophysics Data System (ADS)

    Xian, Yajiang; Pang, Xiaoxuan; He, Shixiong; Wang, Wei; Wang, Xin; Zhang, Pengcheng

    2015-10-01

    Fabrication of durable and usable composites with a high content of B4C (up to 31 vol.%) is quite challenging in several respects, including blending, cold isostatic pressing, and hot isostatic pressing (HIP); an optimal HIP process in particular is essential to achieve a metal matrix composite with desirable properties. The microstructure and mechanical properties of Al6061-31 vol.% B4C with different particle sizes were investigated by scanning electron microscopy (SEM) and tensile testing, respectively. SEM analysis and quantitative measurements of the particle distribution reveal that B4C particles were uniformly distributed in the matrix without agglomeration when the HIP treatment temperature was about 580 °C, and x-ray diffraction also identified a dispersion of B4C particles as well as reaction products (AlB2 and Al3BC) in the composites. The microhardness of the Al6061-31 vol.% B4C composites improved with increasing B4C particle size, while the tensile strength of all the samples declined with increasing B4C particle size. The contribution from different strengthening mechanisms is also discussed.

  13. An open system for automatic home-cage behavioral analysis and its application to male and female mouse models of Huntington's disease.

    PubMed

    Zarringhalam, Kourosh; Ka, Minhan; Kook, Yeon-Hee; Terranova, Joseph I; Suh, Yongjoon; King, Oliver D; Um, Moonkyoung

    2012-04-01

    Changes in routine mouse home-cage behavioral activities have been used recently to study alterations of neural circuits caused by genetic and environmental modifications and by drug administration. Nevertheless, automatic assessment of mouse home-cage behaviors remains challenging due to the cost of proprietary systems and to the difficulty in adjusting systems to different monitoring conditions. Here we present software for the automatic quantification of multiple facets of mouse home-cage behaviors, suitable for continuous 24 h video monitoring. We used this program to assess behavioral changes in male and female R6/2 transgenic mouse models of Huntington's disease over a 10-week period. Consistent with the well-known progressive motor coordination deficits of R6/2 mice, their hanging, rearing, and climbing activity declined as the disease progressed. R6/2 mice also exhibited frequent disturbances in their resting activity compared to wild-type mice, suggesting that R6/2 mice are more restless and wakeful. Behavioral differences were seen earlier for male R6/2 mice than female R6/2 mice, and "behavioral signatures" based on multiple behaviors enabled us to distinguish male R6/2 mice from sex- and age-matched wild-type controls as early as 5 weeks of age. These results demonstrate that the automated behavioral classification software that we developed ("OpenCage") provides a powerful tool for analyzing natural home-cage mouse behaviors, and for constructing behavioral signatures that will be useful for assessing therapeutic strategies. The OpenCage software is available under an open-source GNU General Public License, allowing other users to freely modify and extend it to suit their purposes. PMID:22266926

  14. A Comparative Analysis of DBSCAN, K-Means, and Quadratic Variation Algorithms for Automatic Identification of Swallows from Swallowing Accelerometry Signals

    PubMed Central

    Dudik, Joshua M.; Kurosu, Atsuko; Coyle, James L

    2015-01-01

    Background Cervical auscultation with high resolution sensors is currently under consideration as a method of automatically screening for specific swallowing abnormalities. To be clinically useful without human involvement, any devices based on cervical auscultation should be able to detect specified swallowing events in an automatic manner. Methods In this paper, we comparatively analyze the density-based spatial clustering of applications with noise algorithm (DBSCAN), a k-means based algorithm, and an algorithm based on quadratic variation as methods of differentiating periods of swallowing activity from periods of time without swallows. These algorithms utilized swallowing vibration data exclusively and compared the results to a gold standard measure of swallowing duration. Data were collected from 23 subjects who were actively suffering from swallowing difficulties. Results Comparing the performance of the DBSCAN algorithm with a proven segmentation algorithm that utilizes k-means clustering demonstrated that the DBSCAN algorithm had a higher sensitivity and correctly segmented more swallows. Comparing its performance with a threshold-based algorithm that utilized the quadratic variation of the signal showed that the DBSCAN algorithm offered no direct increase in performance. However, it offered several other benefits, including a faster run time and more consistent performance between patients. All algorithms showed noticeable differentiation from the endpoints provided by a videofluoroscopy examination as well as reduced sensitivity. Conclusions In summary, we showed that the DBSCAN algorithm is a viable method for detecting the occurrence of a swallowing event using cervical auscultation signals, but significant work must be done to improve its performance before it can be implemented in an unsupervised manner. PMID:25658505
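
    As an illustration of the clustering step, the sketch below applies scikit-learn's DBSCAN to the time stamps of high-energy samples of a vibration signal, returning candidate swallow intervals. The activity threshold, `eps`, and `min_samples` are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_swallows(signal, fs, eps=0.3, min_samples=25, k=2.0):
    """Cluster high-energy samples of a swallowing-vibration signal into
    candidate swallow events; returns (start_s, end_s) intervals."""
    t = np.arange(len(signal)) / fs
    env = np.abs(signal - np.median(signal))
    active = env > k * np.median(env)           # crude activity mask
    pts = t[active].reshape(-1, 1)              # cluster along the time axis
    if len(pts) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)
    events = []
    for lab in set(labels) - {-1}:              # label -1 marks noise
        seg = pts[labels == lab].ravel()
        events.append((float(seg.min()), float(seg.max())))
    return sorted(events)
```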

  15. Identification of Xanthomonas fragariae, Xanthomonas axonopodis pv. phaseoli, and Xanthomonas fuscans subsp. fuscans with novel markers and using a dot blot platform coupled with automatic data analysis.

    PubMed

    Albuquerque, Pedro; Caridade, Cristina M R; Marcal, Andre R S; Cruz, Joana; Cruz, Leonor; Santos, Catarina L; Mendes, Marta V; Tavares, Fernando

    2011-08-15

    Phytosanitary regulations and the provision of plant health certificates still rely mainly on long and laborious culture-based methods of diagnosis, which are frequently inconclusive. DNA-based methods of detection can circumvent many of the limitations of currently used screening methods, allowing a fast and accurate monitoring of samples. The genus Xanthomonas includes 13 phytopathogenic quarantine organisms for which improved methods of diagnosis are needed. In this work, we propose 21 new Xanthomonas-specific molecular markers, within loci coding for Xanthomonas-specific protein domains, useful for DNA-based methods of identification of xanthomonads. The specificity of these markers was assessed by a dot blot hybridization array using 23 non-Xanthomonas species, mostly soil dwelling and/or phytopathogens for the same host plants. In addition, the validation of these markers on 15 Xanthomonas spp. suggested species-specific hybridization patterns, which allowed discrimination among the different Xanthomonas species. Bearing in mind that DNA-based methods of diagnosis are particularly hampered for unsequenced species, namely, Xanthomonas fragariae, Xanthomonas axonopodis pv. phaseoli, and Xanthomonas fuscans subsp. fuscans, for which comparative genomics tools to search for DNA signatures are not yet applicable, emphasis was given to the selection of informative markers able to identify X. fragariae, X. axonopodis pv. phaseoli, and X. fuscans subsp. fuscans strains. In order to avoid inconsistencies due to operator-dependent interpretation of dot blot data, an image-processing algorithm was developed to automatically analyze the dot blot patterns. Ultimately, the proposed markers and the dot blot platform, coupled with automatic data analyses, have the potential to foster a thorough monitoring of phytopathogenic xanthomonads. PMID:21705524
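
    The automatic scoring of a hybridization pattern can be pictured as follows: after the image is deformation-corrected and the grid located, average the intensity in each grid cell and call a dot ON or OFF by its distance to the means of the ON and OFF control dots. This is a simplified stand-in for the image-processing algorithm in the paper, whose exact steps are not reproduced here.

```python
import numpy as np

def dot_intensities(image, rows, cols):
    """Mean intensity of each cell of a regular dot-blot grid, assuming the
    image is already geometry-corrected and cropped to the grid."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    out = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            out[r, c] = img[r * h // rows:(r + 1) * h // rows,
                            c * w // cols:(c + 1) * w // cols].mean()
    return out

def call_markers(cells, on_controls, off_controls):
    """True where a dot is closer to the ON-control mean than to the
    OFF-control mean -- a minimal hybridization call."""
    on_mu, off_mu = np.mean(on_controls), np.mean(off_controls)
    return np.abs(cells - on_mu) < np.abs(cells - off_mu)
```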

  16. Automatic Command Sequence Generation

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladded, Roy; Khanampompan, Teerapat

    2007-01-01

    Automatic Sequence Generator (Autogen) Version 3.0 software automatically generates command sequences for the Mars Reconnaissance Orbiter (MRO) and several other JPL spacecraft operated by the multi-mission support team. Autogen uses standard JPL sequencing tools like APGEN, ASP, SEQGEN, and the DOM database to automate the generation of uplink command products, Spacecraft Command Message Format (SCMF) files, and the corresponding ground command products, DSN Keywords Files (DKF). Autogen supports all the major mission phases, including the cruise, aerobraking, mapping/science, and relay mission phases. Autogen is a Perl script, which functions within the mission operations UNIX environment. It consists of two parts: a set of model files and the autogen Perl script. Autogen encodes the behaviors of the system into a model and encodes algorithms for context-sensitive customizations of the modeled behaviors. The model includes knowledge of different mission phases and how the resultant command products must differ for these phases. The executable software portion of Autogen automates the setup and use of APGEN for constructing a spacecraft activity sequence file (SASF). The setup includes file retrieval through the DOM (Distributed Object Manager), an object database used to store project files. This step retrieves all the needed input files for generating the command products. Depending on the mission phase, Autogen also uses the ASP (Automated Sequence Processor) and SEQGEN to generate the command product sent to the spacecraft. Autogen also provides the means for customizing sequences through the use of configuration files. By automating the majority of the sequence generation process, Autogen eliminates many sequence generation errors commonly introduced by manually constructing spacecraft command sequences. Through the layering of commands into the sequence by a series of scheduling algorithms, users are able to rapidly and reliably construct the

  17. Automatically classifying question types for consumer health questions.

    PubMed

    Roberts, Kirk; Kilicoglu, Halil; Fiszman, Marcelo; Demner-Fushman, Dina

    2014-01-01

    We present a method for automatically classifying consumer health questions. Our thirteen question types are designed to aid in the automatic retrieval of medical answers from consumer health resources. To our knowledge, this is the first machine learning-based method specifically for classifying consumer health questions. We demonstrate how previous approaches to medical question classification are insufficient to achieve high accuracy on this task. Additionally, we describe, manually annotate, and automatically classify three important question elements that improve question classification over previous techniques. Our results and analysis illustrate the difficulty of the task and the future directions that are necessary to achieve high-performing consumer health question classification.
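
    A minimal sketch of this kind of question-type classifier, using a TF-IDF bag-of-words model and logistic regression; the toy labels and examples below are invented for illustration and are not the paper's thirteen types or its annotated corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented toy training data.
questions = [
    "What are the side effects of metformin?",
    "How is type 2 diabetes diagnosed?",
    "Is chest pain a symptom of anxiety?",
    "What is the usual dosage of ibuprofen for adults?",
]
labels = ["side_effects", "diagnosis", "symptom", "dosage"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(questions, labels)
print(clf.predict(["What is the recommended dosage of acetaminophen?"]))
```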

  18. Electronically controlled automatic transmission

    SciTech Connect

    Ohkubo, M.; Shiba, H.; Nakamura, K.

    1989-03-28

    This patent describes an electronically controlled automatic transmission having a manual valve working in connection with a manual shift lever, shift valves operated by solenoid valves which are driven by an electronic control circuit that has previously memorized shift patterns, and a hydraulic circuit controlled by these manual and shift valves for driving brakes and a clutch in order to change speed. Shift patterns for 2-range and L-range, in addition to the shift pattern for D-range, are memorized in the electronic control circuit. An operation switch is provided which changes the shift pattern of the electronic control circuit to any of the D-range, 2-range, and L-range patterns while the manual shift lever is in the D-range position, and a releasable lock mechanism prevents the manual shift lever from entering the 2-range and L-range positions. In cases where the shift valves are not operating, the hydraulic circuit is set to a third speed mode when the manual shift lever is in the D-range position, to a second speed mode when it is in the 2-range position, and to a first speed mode when it is in the L-range position.

  19. Automatic Welding System

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Robotic welding has been of interest to industrial firms because it offers higher productivity at lower cost than manual welding. There are some systems with automated arc guidance available, but they have disadvantages, such as limitations on the types of materials or types of seams that can be welded; susceptibility to stray electrical signals; a restricted field of view; or a tendency to contaminate the weld seam. Wanting to overcome these disadvantages, Marshall Space Flight Center, aided by Hayes International Corporation, developed a system that uses closed-circuit TV signals for automatic guidance of the welding torch. NASA granted a license to Combined Technologies, Inc. for commercial application of the technology, and they developed a refined and improved arc guidance system. CTI, in turn, licensed the Merrick Corporation, also of Nashville, for marketing and manufacturing of the new system, called the CT2 Optical Tracker. The CT2 is a non-contacting system that offers adaptability to a broader range of welding jobs and provides greater reliability in high-speed operation. It is extremely accurate and can travel at speeds of up to 150 inches per minute.

  20. Automatic transmission system

    SciTech Connect

    Ha, J.S.

    1989-04-25

    An automatic transmission system is described for use in vehicles, which comprises: a clutch wheel containing a plurality of concentric rings of decreasing diameter, the clutch wheel being attached to an engine of the vehicle; a plurality of clutch gears corresponding in size to the concentric rings, the clutch gears being adapted to selectively and frictionally engage with the concentric rings of the clutch wheel; an accelerator pedal and a gear selector, the accelerator pedals being connected to one end of a substantially U-shaped frame member, the other end of the substantially U-shaped frame member selectively engaging with one end of one of wires received in a pair of apertures of the gear selector; a plurality of drive gear controllers and a reverse gear controller; means operatively connected with the gear selector and the plurality of drive gear controllers and reverse gear controller for selectively engaging one of the drive and reverse gear controllers depending upon the position of the gear selector; and means for individually connecting the drive and reverse gear controllers with the corresponding clutch gears whereby upon the selection of the gear selector, friction engagement is achieved between the clutch gear and the clutch wheels for rotating the wheel in the forward or reverse direction.

  1. Attaining Automaticity in the Visual Numerosity Task is Not Automatic

    PubMed Central

    Speelman, Craig P.; Muller Townsend, Katrina L.

    2015-01-01

    This experiment is a replication of experiments reported by Lassaline and Logan (1993) using the visual numerosity task. The aim was to replicate the transition from controlled to automatic processing reported by Lassaline and Logan (1993), and to examine the extent to which this result, reported with average group results, can be observed in the results of individuals within a group. The group results in this experiment did replicate those reported by Lassaline and Logan (1993); however, one half of the sample did not attain automaticity with the task, and one-third did not exhibit a transition from controlled to automatic processing. These results raise questions about the pervasiveness of automaticity, and the interpretation of group means when examining cognitive processes. PMID:26635658

  2. [Study on the automatic parameters identification of water pipe network model].

    PubMed

    Jia, Hai-Feng; Zhao, Qi-Feng

    2010-01-01

    Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is regarded as the key bottleneck for a model's application in a water supply enterprise. A methodology for automatic parameter identification of water pipe network models based on GIS and SCADA databases is proposed. The kernel algorithms of automatic parameter identification are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte-Carlo Sampling) is used for automatic identification of the parameters; a detailed technical route based on RSA and MCS is presented. A module for automatic parameter identification of water pipe network models was developed. Finally, a typical water pipe network was selected as a case study on automatic model parameter identification, and satisfactory results were achieved. PMID:20329520
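
    A sketch of the MCS step under stated assumptions: a hypothetical stand-in function plays the role of the hydraulic network solver, candidate roughness vectors are drawn uniformly, scored against assumed SCADA pressure observations, and the best-fitting fraction summarizes the identified parameters. In the paper, RSA would first restrict sampling to the sensitive parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pressures(roughness):
    """Hypothetical stand-in for a hydraulic network solver returning nodal
    pressures (m) for a given pipe-roughness vector."""
    return 50.0 - 0.05 * roughness + rng.normal(0.0, 0.01, roughness.shape)

observed = np.array([45.2, 44.8, 45.6])  # assumed SCADA pressures (m)

def identify(n_samples=10_000, keep=0.01):
    """Monte-Carlo Sampling identification: draw candidates, score them
    against observations, keep the best-fitting fraction."""
    cands = rng.uniform(80.0, 140.0, size=(n_samples, observed.size))
    errs = np.array([np.sum((simulate_pressures(c) - observed) ** 2)
                     for c in cands])
    best = cands[np.argsort(errs)[: int(keep * n_samples)]]
    return best.mean(axis=0), best.std(axis=0)

mean, spread = identify()
print("identified roughness:", mean, "+/-", spread)
```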

  3. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    A new fuzzy-set-based technique developed for decision making is discussed: a method for automatically generating fuzzy decision rules for image analysis. The proposed approach builds rule-based solutions to problems such as autonomous navigation and image understanding directly from training data, and it is also capable of filtering irrelevant features and criteria out of the rules.

  4. Automatic Weather Station (AWS) Lidar

    NASA Technical Reports Server (NTRS)

    Rall, Jonathan A.R.; Abshire, James B.; Spinhirne, James D.; Smith, David E. (Technical Monitor)

    2000-01-01

    An autonomous, low-power atmospheric lidar instrument is being developed at NASA Goddard Space Flight Center. This compact, portable lidar will operate continuously in a temperature-controlled enclosure, charge its own batteries through a combination of a small rugged wind generator and solar panels, and transmit its data from remote locations to ground stations via satellite. A network of these instruments will be established by co-locating them at remote Automatic Weather Station (AWS) sites in Antarctica under the auspices of the National Science Foundation (NSF). The NSF Office of Polar Programs provides support to place the weather stations in remote areas of Antarctica in support of meteorological research and operations. The AWS meteorological data will directly benefit the analysis of the lidar data, while a network of ground-based atmospheric lidars will provide knowledge regarding the temporal evolution and spatial extent of Type Ia polar stratospheric clouds (PSC). These clouds play a crucial role in the annual austral springtime destruction of stratospheric ozone over Antarctica, i.e. the ozone hole. In addition, the lidar will monitor and record the general atmospheric conditions (transmission and backscatter) of the overlying atmosphere, which will benefit the Geoscience Laser Altimeter System (GLAS). Prototype lidar instruments have been deployed to the Amundsen-Scott South Pole Station (1995-96, 2000) and to an Automated Geophysical Observatory site (AGO 1) in January 1999. We report on data acquired with these instruments, instrument performance, and anticipated performance of the AWS Lidar.

  5. Automatic transmission apparatus

    SciTech Connect

    Hiketa, M.

    1987-10-06

    An automatic transmission apparatus is described comprising: an input shaft, an output shaft disposed behind and coaxially with the input shaft, a counter shaft disposed substantially parallel to both of the input and output shafts, a first gear train including a first gear provided on the input shaft and a second gear provided on the counter shaft to be meshed with the first gear so as to form a first power transmitting path, first friction clutch means operative selectively to make and break the first power transmitting path, a second gear train including a third gear provided through one-way clutch means on a rear end portion of the input shaft and a fourth gear provided on the counter shaft to be meshed with the third gear so as to form a second power transmitting path, second friction clutch means provided at a front end portion of the output shaft, a third gear train including a fifth gear provided on a rear end portion of the counter shaft and a sixth gear provided on the output shaft to be meshed with the fifth gear so as to form a fourth power transmitting path, third friction clutch means operative selectively to make and break the fourth power transmitting path, fourth friction clutch means operative selectively to make and break the second power transmitting path, a fourth gear train including a seventh gear provided on the counter shaft and an eighth gear provided on the output shaft and fifth friction clutch means operative selectively to make and break the fifth power transmitting path.

  6. Clothes Dryer Automatic Termination Evaluation

    SciTech Connect

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  7. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to. Furthermore, this improved environment is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools to assist the modeler in defining or constructing a model of the system and then automatically writing the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  8. Automatic landslides detection on Stromboli volcanic Island

    NASA Astrophysics Data System (ADS)

    Silengo, Maria Cristina; Delle Donne, Dario; Ulivieri, Giacomo; Cigolini, Corrado; Ripepe, Maurizio

    2016-04-01

    Landslides occurring on active volcanic islands play a key role in triggering tsunamis and other related hazards. It is therefore vital for correct and prompt risk assessment to monitor landslide activity and to have an automatic system for robust early warning. We developed a system based on multi-frequency analysis of seismic signals for automatic detection of landslides occurring at Stromboli volcano, using a network of four three-component seismic stations located along the unstable flank of the Sciara del Fuoco. Our method is able to recognize and separate the different sources of seismic signals related to volcanic and tectonic activity (e.g. tremor, explosions, earthquakes) from landslides. This is done using multi-frequency analysis combined with waveform pattern recognition. We applied the method to one year of seismic activity of Stromboli volcano centered on the 2007 effusive eruption. This eruption was characterized by pre-eruptive landslide activity reflecting the slow deformation of the volcanic edifice. The algorithm currently runs off-line but has proved to be robust and efficient in picking landslides automatically. The method also provides real-time statistics on landslide occurrence, which could be used as a proxy for volcano deformation during the pre-eruptive phases. The method is very promising, since the rate of false detections is quite small (<5%) and decreases as the size of the landslide increases. The final aim is to apply the method on-line for real-time automatic detection, as a tool for improving early warning of tsunami-genic landslide activity. We suggest that a similar approach could also be applied to other unstable, non-volcanic slopes.
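
    The multi-frequency discrimination can be caricatured as a band-energy ratio test: landslide signals at Stromboli are comparatively richer in high frequencies than tremor and explosion quakes, so windows whose high-to-low band ratio crosses a threshold become candidates. The band edges and threshold below are illustrative assumptions, not the study's values.

```python
import numpy as np
from scipy.signal import spectrogram

def landslide_candidates(x, fs, low=(1.0, 5.0), high=(10.0, 30.0), thresh=3.0):
    """Return the centre times (s) of windows whose high/low frequency-band
    energy ratio exceeds the threshold -- a toy multi-frequency detector."""
    f, t, S = spectrogram(x, fs=fs, nperseg=int(2 * fs))
    lo = S[(f >= low[0]) & (f < low[1])].sum(axis=0)
    hi = S[(f >= high[0]) & (f < high[1])].sum(axis=0)
    ratio = hi / np.maximum(lo, 1e-12)   # guard against empty low band
    return t[ratio > thresh]
```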

  9. A training programme involving automatic self-transcending meditation in late-life depression: preliminary analysis of an ongoing randomised controlled trial

    PubMed Central

    Arena, Amanda; Burhan, Amer M.; Ionson, Emily; Hirjee, Hussein; Maldeniya, Pramudith; Wetmore, Stephen; Newman, Ronnie I.

    2016-01-01

    Late-life depression affects 2–6% of seniors aged 60 years and above. Patients are increasingly embracing non-pharmacological therapies, many of which have not been scientifically evaluated. This study aimed to evaluate a category of meditation, automatic self-transcending meditation (ASTM), in alleviating symptoms of depression when augmenting treatment as usual (NCT02149810). The preliminary results of an ongoing single-blind randomised controlled trial comparing a training programme involving ASTM with a wait-list control indicate that a 12-week ASTM programme may lead to significantly greater reductions in depression and anxiety severity. As such, ASTM may be an effective adjunctive therapy in the treatment of late-life depression. Declaration of interest: R.I.N. is Director of Research and Health Promotion for the Art of Living Foundation, Canada and supervised the staff providing ASTM training. PMID:27703774

  10. Automatic safety rod for reactors

    DOEpatents

    Germer, John H.

    1988-01-01

    An automatic safety rod for a nuclear reactor containing neutron absorbing material and designed to be inserted into a reactor core after a loss of core flow. Actuation occurs either upon a sudden decrease in core pressure drop or when the pressure drop decreases below a predetermined minimum value. The automatic control rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  11. Prospects for de-automatization.

    PubMed

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness.

  12. Automatic Collision Avoidance Technology (ACAT)

    NASA Technical Reports Server (NTRS)

    Swihart, Donald E.; Skoog, Mark A.

    2007-01-01

    This document represents two views of the Automatic Collision Avoidance Technology (ACAT). One viewgraph presentation reviews the development and system design of ACAT. Two types of ACAT exist: Automatic Ground Collision Avoidance (AGCAS) and Automatic Air Collision Avoidance (AACAS). The AGCAS uses Digital Terrain Elevation Data (DTED) for mapping functions and uses navigation data to place the aircraft on the map. It then scans the DTED in front of and around the aircraft and uses the future aircraft trajectory (5g) to provide an automatic fly-up maneuver when required. The AACAS uses a data link to determine position and closing rate, and it contains several canned maneuvers to avoid collision. Automatic maneuvers can occur at the last instant, and both aircraft maneuver when using the data link; the system can use a sensor in place of the data link. The second viewgraph presentation reviews the development of a flight test and an evaluation of the test. A review of the operation and a comparison of the AGCAS with a pilot's performance are given, and the same review is given for the AACAS.

  13. Automatic target detection in cluttered IR images

    NASA Astrophysics Data System (ADS)

    Mueller, Markus; Korn, Axel

    1998-07-01

    Automatic target detection (ATR) generally refers to the localization of potential targets by computer processing of data from a variety of sensors. Automatic detection is applicable for data reduction purposes in the reconnaissance domain and is therefore aimed at reducing the workload on human operators. ATR covers activities such as the localization of individual objects in large areas or volumes for assessing the battlefield situation. An increase in the reliability and efficiency of the overall reconnaissance process is expected. The results of automatic image evaluation are offered to the image analyst as hypotheses. In this paper, cluttered images from an infrared sensor are analyzed with the aim of finding Regions of Interest (ROIs) in which hints of man-made objects are to be found. This analysis uses collateral data on acquisition time and location (e.g. time of day, weather conditions, resolution, sensor specification and orientation, etc.). The assumed target size in the image is likewise estimated using collateral data. Based on the collateral data, the algorithm adjusts its parameters in order to find ROIs and to detect targets. Low-contrast conditions can be successfully tackled if the directions of the grey-value gradient are considered, which are nearly independent of the contrast. Blobs are generated by applying adaptive thresholds in the ROIs. Here the evaluation of histograms is very important for the extraction of structured features. The height, aspect angle, and camera parameters are approximately known, allowing target sizes in the image domain to be estimated from the collateral data.
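
    The contrast-independence argument can be made concrete: bin only the gradient directions (not magnitudes) inside a candidate region and measure how concentrated the orientation histogram is, since straight man-made edges pile directions into a few bins while natural clutter spreads them out. A sketch of that measure, with an assumed magnitude floor to reject unreliable pixels:

```python
import numpy as np
from scipy.ndimage import sobel

def orientation_peakiness(roi, bins=36):
    """Fraction of gradient directions falling in the most popular bin;
    values near 1 suggest dominant straight edges (man-made structure)."""
    gx = sobel(roi.astype(float), axis=1)
    gy = sobel(roi.astype(float), axis=0)
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)[mag > mag.mean()]   # keep reliable pixels only
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi))
    return hist.max() / max(hist.sum(), 1)
```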

  14. Automatic segmentation editing for cortical surface reconstruction

    NASA Astrophysics Data System (ADS)

    Han, Xiao; Xu, Chenyang; Rettmann, Maryam E.; Prince, Jerry L.

    2001-07-01

    Segmentation and representation of the human cerebral cortex from magnetic resonance images is an important goal in neuroscience and medicine. Accurate cortical segmentation requires preprocessing of the image data to separate certain subcortical structures from the cortex in order to generate a good initial white-matter/gray-matter interface. This step is typically manual or semi-automatic. In this paper, we propose an automatic procedure that is based on a careful analysis of the brain anatomy. Following a fuzzy segmentation of the brain image, the method first extracts the ventricles using a geometric deformable surface model. A region force, derived from the cerebrospinal fluid membership function, is used to deform the surface towards the boundary of the ventricles, while a curvature force controls the smoothness of the surface and prevents it from growing into the outer pial surface. Next, region growing identifies and fills the subcortical regions in each cortical slice using the detected ventricles as seeds and the white matter and several automatically determined sealing lines as boundaries. To make the method robust to segmentation artifacts, a putamen mask drawn in the Talairach coordinate system is also used to help the region growing process. Visual inspection and initial results on 15 subjects show the success of the proposed method.

  15. Automatic Detection of Dominance and Expected Interest

    NASA Astrophysics Data System (ADS)

    Escalera, Sergio; Pujol, Oriol; Radeva, Petia; Vitrià, Jordi; Anguera, M. Teresa

    2010-12-01

    Social Signal Processing is an emergent area of research that focuses on the analysis of social constructs. Dominance and interest are two of these social constructs. Dominance refers to the level of influence a person has in a conversation. Interest, when referred to in terms of group interactions, can be defined as the degree of engagement that the members of a group collectively display during their interaction. In this paper, we argue that, using only behavioral motion information, we are able to predict the interest of observers when looking at face-to-face interactions, as well as the dominant people. First, we propose a simple set of movement-based features from body, face, and mouth activity in order to define a higher-level set of interaction indicators. The considered indicators are manually annotated by observers. Based on the opinions obtained, we define an automatic binary dominance detection problem and a multiclass interest quantification problem. The Error-Correcting Output Codes framework is used to learn to rank the observers' perceived interest in face-to-face interactions, while AdaBoost is used to solve the dominance detection problem. The automatic system shows good correlation between the automatic categorization results and the manual ranking made by the observers in both the dominance and interest detection problems.

  16. Automatic image classification for the urinoculture screening.

    PubMed

    Andreini, Paolo; Bonechi, Simone; Bianchini, Monica; Garzelli, Andrea; Mecocci, Alessandro

    2016-03-01

    Urinary tract infections (UTIs) are considered to be the most common bacterial infection; it is estimated that about 150 million UTIs occur worldwide yearly, giving rise to roughly $6 billion in healthcare expenditures and resulting in 100,000 hospitalizations. Nevertheless, it is difficult to carefully assess the incidence of UTIs, since an accurate diagnosis depends both on the presence of symptoms and on a positive urinoculture, whereas in most outpatient settings this diagnosis is made without an ad hoc analysis protocol. On the other hand, in the traditional urinoculture test, a sample of midstream urine is put onto a Petri dish, where a growth medium favors the proliferation of germ colonies. Then, the infection severity is evaluated by a visual inspection of a human expert, an error-prone and lengthy process. In this paper, we propose a fully automated system for urinoculture screening that can provide quick and easily traceable results for UTIs. Based on advanced image processing and machine learning tools, the infection type recognition, together with the estimation of the bacterial load, can be automatically carried out, yielding accurate diagnoses. The proposed AID (Automatic Infection Detector) system provides support during the whole analysis process: first, digital color images of Petri dishes are automatically captured, then specific preprocessing and spatial clustering algorithms are applied to isolate the colonies from the culture ground and, finally, an accurate classification of the infections and their severity evaluation are performed. The AID system speeds up the analysis, contributes to the standardization of the process, allows result repeatability, and reduces the costs. Moreover, the continuous transition between sterile and external environments (typical of the standard analysis procedure) is completely avoided. PMID:26780249

  17. Warmer temperatures stimulate respiration and reduce net ecosystem productivity in a northern Great Plains grassland: Analysis of CO2 exchange in automatic chambers

    NASA Astrophysics Data System (ADS)

    Flanagan, L. B.

    2013-12-01

    The interacting effects of altered temperature and precipitation are expected to have significant consequences for ecosystem net carbon storage. Here I report the results of an experiment that evaluated the effects of elevated temperature and altered precipitation on ecosystem CO2 exchange in a northern Great Plains grassland, near Lethbridge, Alberta Canada. Open-top chambers were used to establish an experiment in 2012 with three treatments (control, warmed, warmed plus 50% of normal precipitation input). A smaller experiment with only the two temperature treatments (control and warmed) was conducted in 2013. Continuous half-hourly net CO2 exchange measurements were made using nine automatic chambers during May-October in both years. My objectives were to determine the sensitivity of the ecosystem carbon budget to temperature and moisture manipulations, and to test for direct and indirect effects of the environmental changes on ecosystem CO2 exchange. The experimental manipulations resulted primarily in a significant increase in air temperature in the warmed treatment plots. A cumulative net loss of carbon or negative net ecosystem productivity (NEP) occurred during May through September in the warmed treatment (NEP = -659 g C m-2), while in the control treatment there was a cumulative net gain of carbon (NEP = +50 g C m-2). An eddy covariance system that operated at the site, over a footprint region that was not influenced by the experimental treatments, also showed a net gain of carbon by the ecosystem. The reduced NEP was due to higher plant and soil respiration rates in the warmed treatment that appeared to be caused by a combination of: (i) higher carbon substrate availability indirectly stimulating soil respiration in the warmed relative to the control treatment, and (ii) a strong increase in leaf respiration likely caused by a shift in electron partitioning to the alternative pathway respiration in the warmed treatment, particularly when exposed to high

  18. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  19. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, Anthony J.

    1994-05-10

    Disclosed are a method and apparatus for (1) automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, (2) automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, (3) manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and (4) automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly.

  20. Automatic rapid attachable warhead section

    DOEpatents

    Trennel, A.J.

    1994-05-10

    Disclosed are a method and apparatus for automatically selecting warheads or reentry vehicles from a storage area containing a plurality of types of warheads or reentry vehicles, automatically selecting weapon carriers from a storage area containing at least one type of weapon carrier, manipulating and aligning the selected warheads or reentry vehicles and weapon carriers, and automatically coupling the warheads or reentry vehicles with the weapon carriers such that coupling of improperly selected warheads or reentry vehicles with weapon carriers is inhibited. Such inhibition enhances safety of operations and is achieved by a number of means including computer control of the process of selection and coupling and use of connectorless interfaces capable of assuring that improperly selected items will be rejected or rendered inoperable prior to coupling. Also disclosed are a method and apparatus wherein the stated principles pertaining to selection, coupling and inhibition are extended to apply to any item-to-be-carried and any carrying assembly. 10 figures.

  1. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
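
    The generate-then-run idea can be sketched in a few lines of Python. The specification fields below are hypothetical and the model is a toy serial job shop, not the AMPS domain language:

        # A toy code generator: a declarative problem specification is turned
        # into executable simulation source, which is then run directly.
        spec = {"stations": ["mill", "drill"], "jobs": 5, "mean_service": 1.0}

        src = ["import random", "random.seed(1)", "clock = 0.0"]
        src.append(f"for job in range({spec['jobs']}):")
        for s in spec["stations"]:
            src.append(f"    clock += random.expovariate(1.0 / {spec['mean_service']})  # service at {s}")
        src.append("print('finish time of last job:', round(clock, 2))")

        code = "\n".join(src)
        print(code)  # inspect the generated simulation source
        exec(code)   # the generated code executes as ordinary Python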

  2. Automatic cytometric device using multiple wavelength excitations

    NASA Astrophysics Data System (ADS)

    Rongeat, Nelly; Ledroit, Sylvain; Chauvet, Laurence; Cremien, Didier; Urankar, Alexandra; Couderc, Vincent; Nérin, Philippe

    2011-05-01

    Precise identification of eosinophils, basophils, and specific subpopulations of blood cells (B lymphocytes) in an unconventional automatic hematology analyzer is demonstrated. Our apparatus mixes two excitation radiations by means of an acousto-optic tunable filter to properly control the fluorescence emission of phycoerythrin-cyanin 5 (PC5) conjugated to antibodies (anti-CD20 or anti-CRTH2) and of Thiazole Orange. In this way, our analyzer, which combines hematology analysis with flow cytometry based on multiple-fluorescence detection, drastically improves the signal-to-noise ratio and decreases the impact of spectral overlap among the fluorescence emissions.
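
    The spectral-overlap problem addressed here optically is, in conventional flow cytometry, handled numerically by spillover compensation. A generic two-dye sketch with illustrative numbers, not parameters from this instrument:

        import numpy as np

        # Each detector sees its own dye plus a fraction of the other;
        # observed = spillover @ true, so unmixing solves a linear system.
        spillover = np.array([[1.00, 0.15],   # detector 1: dye 1 + 15% of dye 2
                              [0.08, 1.00]])  # detector 2: dye 2 + 8% of dye 1
        observed = np.array([520.0, 310.0])   # raw detector signals

        true_signal = np.linalg.solve(spillover, observed)
        print(true_signal.round(1))           # per-dye intensities after unmixing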

  3. Grinding Parts For Automatic Welding

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding, with attendant variations and distortion, is not necessary. Developed to enable automatic welding of parts whose manual welding resulted in weld beads permeated with microscopic fissures.

  4. Automatic interpretation of Schlumberger soundings

    SciTech Connect

    Ushijima, K.

    1980-09-01

    The automatic interpretation of apparent resistivity curves from horizontally layered earth models is carried out by the curve-fitting method in three steps: (1) the observed VES data are interpolated at equidistant points of electrode separation on the logarithmic scale using a cubic spline function; (2) the layer parameters, i.e., resistivities and depths, are predicted from the sampled apparent resistivity values by the SALS system program; and (3) the theoretical VES curves for the models are calculated by Ghosh's linear filter method using Zhody's computer program. Two soundings taken over the Takenoyu geothermal area were chosen to test the automatic interpretation procedure.
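
    Step (1) can be sketched with SciPy; the sounding values below are illustrative, not the Takenoyu data:

        import numpy as np
        from scipy.interpolate import CubicSpline

        # Illustrative field data: electrode half-spacings AB/2 (m) and
        # apparent resistivities (ohm-m); real soundings would be read from file.
        ab2 = np.array([1.0, 1.5, 2.2, 3.2, 4.6, 10.0, 22.0, 46.0, 100.0])
        rho_a = np.array([55.0, 60.0, 72.0, 90.0, 110.0, 140.0, 120.0, 80.0, 60.0])

        # Fit log(rho_a) against log(AB/2), then resample at equidistant
        # points on the logarithmic scale (here six points per decade).
        spline = CubicSpline(np.log10(ab2), np.log10(rho_a))
        log_grid = np.arange(np.log10(ab2[0]), np.log10(ab2[-1]), 1.0 / 6.0)
        rho_resampled = 10.0 ** spline(log_grid)

        for x, r in zip(10.0 ** log_grid, rho_resampled):
            print(f"AB/2 = {x:7.2f} m   rho_a = {r:6.1f} ohm-m")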

  5. Traduction automatique et terminologie automatique (Automatic Translation and Automatic Terminology)

    ERIC Educational Resources Information Center

    Dansereau, Jules

    1978-01-01

    An exposition of reasons why a system of automatic translation could not use a terminology bank except as a source of information. The fundamental difference between the two tools is explained and examples of translation and mistranslation are given as evidence of the limits and possibilities of each process. (Text is in French.) (AMH)

  6. Approaches to the automatic generation and control of finite element meshes

    NASA Technical Reports Server (NTRS)

    Shephard, Mark S.

    1987-01-01

    The algorithmic approaches being taken to the development of finite element mesh generators capable of automatically discretizing general domains without user intervention are discussed. It is demonstrated that, because of the modeling demands placed on an automatic mesh generator, all the approaches taken to date produce unstructured meshes. Consideration is also given to a priori and a posteriori mesh control devices for automatic mesh generators, as well as to their integration with geometric modeling and adaptive analysis procedures.

  7. Automatic 35 mm slide duplicator

    NASA Technical Reports Server (NTRS)

    Seidel, H. F.; Texler, R. E.

    1980-01-01

    Automatic duplicator is readily assembled from conventional, inexpensive equipment and parts. Series of slides can be exposed without operator attention, eliminating considerable manual handling and processing ordinarily required. At end of programmed exposure sequence, unit shuts off and audible alarm signals completion of process.

  8. Bubble vector in automatic merging

    NASA Technical Reports Server (NTRS)

    Pamidi, P. R.; Butler, T. G.

    1987-01-01

    It is shown that the DMAP language is capable of building a set of vectors that grows incrementally and can be applied automatically and economically within a DMAP loop, where it serves to append sub-matrices generated in the loop to a core matrix. The method of constructing such vectors is explained.
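
    Outside DMAP, the append-in-a-loop pattern can be illustrated with a small NumPy analogue; this is a sketch of the idea only, not the DMAP implementation:

        import numpy as np

        core = np.zeros((4, 0))                   # core matrix, initially no columns

        for k in range(3):
            sub = np.full((4, 2), float(k + 1))   # sub-matrix generated in the loop
            core = np.hstack([core, sub])         # append it to the core matrix

        print(core.shape)                         # (4, 6): three 4x2 blocks appended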

  9. Automatically Preparing Safe SQL Queries

    NASA Astrophysics Data System (ADS)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
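
    A minimal sketch of the before/after that such a transformation produces automatically, shown here by hand with Python's sqlite3; the table, column, and input are hypothetical:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
        conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

        user_input = "nobody' OR '1'='1"   # attacker-controlled string

        # Legacy pattern: query text built by string concatenation.
        unsafe = "SELECT role FROM users WHERE name = '" + user_input + "'"
        print(conn.execute(unsafe).fetchall())   # [('admin',)] -- injection succeeds

        # Transformed pattern: a prepared/parameterized statement; the driver
        # binds user_input as data, never as SQL text.
        safe = "SELECT role FROM users WHERE name = ?"
        print(conn.execute(safe, (user_input,)).fetchall())   # [] -- no match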

  10. Graphonomics, Automaticity and Handwriting Assessment

    ERIC Educational Resources Information Center

    Tucha, Oliver; Tucha, Lara; Lange, Klaus W.

    2008-01-01

    A recent review of handwriting research in "Literacy" concluded that current curricula of handwriting education focus too much on writing style and neatness and neglect the aspect of handwriting automaticity. This conclusion is supported by evidence in the field of graphonomic research, where a range of experiments have been used to investigate…

  11. Automatic Identification of Metaphoric Utterances

    ERIC Educational Resources Information Center

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  12. Automatic marker for photographic film

    NASA Technical Reports Server (NTRS)

    Gabbard, N. M.; Surrency, W. M.

    1974-01-01

    Commercially-produced wire-marking machine is modified to title or mark film rolls automatically. Machine is used with film drive mechanism which is powered with variable-speed, 28-volt dc motor. Up to 40 frames per minute can be marked, reducing time and cost of process.

  13. Human Identification Using Automatic and Semi-Automatically Detected Facial Marks.

    PubMed

    Srinivas, Nisha; Flynn, Patrick J; Vorder Bruegge, Richard W

    2016-01-01

    Continuing advancements in the field of digital cameras and surveillance imaging devices have led law enforcement and intelligence agencies to use analysis of images and videos for the investigation and prosecution of crime. When determining identity from photographic evidence, forensic analysts compare visible facial features manually, which is inefficient. In this study, we address research efforts to use facial marks as biometric signatures to distinguish between individuals. We propose two systems to assist forensic analysts during photographic comparison: an improved multiscale facial mark system in which facial marks are detected automatically, and a semi-automatic facial mark system that integrates human knowledge within the improved multiscale facial mark system. Experiments employ a high-resolution time-elapsed dataset acquired at the University of Notre Dame between 2009 and 2011. The results indicate that the geometric distributions of facial mark patterns can be used to distinguish between individuals. PMID:27405018
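
    A generic multiscale spot-detection step of the sort a facial-mark detector might build on can be sketched with scikit-image; this is illustrative, not the authors' pipeline, and it assumes marks appear as dark spots on lighter skin, hence the inversion:

        from skimage import data, color
        from skimage.feature import blob_log

        gray = color.rgb2gray(data.astronaut())   # stand-in image, not a forensic dataset
        # blob_log finds bright blobs, so invert: facial marks are dark spots.
        blobs = blob_log(1.0 - gray, min_sigma=2, max_sigma=8, threshold=0.1)
        # Each row is (row, col, sigma): a candidate mark and its detected scale.
        print(len(blobs), "candidate marks")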

  15. Automatic segmentation of chromosomes in Q-band images.

    PubMed

    Grisan, Enrico; Poletti, Enea; Tomelleri, Christopher; Ruggeri, Alfredo

    2007-01-01

    Karyotype analysis is a widespread procedure in cytogenetics to assess the possible presence of genetic defects. The procedure is lengthy and repetitive, so an automatic analysis would greatly help the cytogeneticist's routine work. Still, automatic segmentation and full disentangling of chromosomes are open issues. We propose an automatic procedure to obtain separated chromosomes, which are then ready for a subsequent classification step. The segmentation is carried out by means of a space-variant thresholding scheme, which proved successful even in the presence of hyper- or hypo-fluorescent regions in the image. A greedy approach is then used to identify and resolve touching and overlapping chromosomes, based on geometric evidence and image information. We show the effectiveness of the proposed method on routine data: 90% of the overlaps and 92% of the adjacencies are resolved, resulting in a correct segmentation of 96% of the chromosomes.
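
    A minimal sketch of a space-variant (locally adaptive) thresholding step of the kind described, using scikit-image; the block size and offset are illustrative assumptions, not the authors' parameters:

        import numpy as np
        from skimage.filters import threshold_local

        def binarize_qband(image: np.ndarray) -> np.ndarray:
            """Binarize with a threshold that varies across the field, so
            hyper- or hypo-fluorescent regions are compared against a locally
            estimated background rather than one global cut. Expects a float
            image scaled to [0, 1]."""
            local_thresh = threshold_local(image, block_size=51, offset=-0.01)
            return image > local_thresh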

  16. How automatic are crossmodal correspondences?

    PubMed

    Spence, Charles; Deroy, Ophelia

    2013-03-01

    The last couple of years have seen a rapid growth of interest (especially amongst cognitive psychologists, cognitive neuroscientists, and developmental researchers) in the study of crossmodal correspondences - the tendency for our brains (not to mention the brains of other species) to preferentially associate certain features or dimensions of stimuli across the senses. By now, robust empirical evidence supports the existence of numerous crossmodal correspondences, affecting people's performance across a wide range of psychological tasks - in everything from the redundant target effect paradigm through to studies of the Implicit Association Test, and from speeded discrimination/classification tasks through to unspeeded spatial localisation and temporal order judgment tasks. However, one question that has yet to receive a satisfactory answer is whether crossmodal correspondences automatically affect people's performance (in all, or at least in a subset of tasks), as opposed to reflecting more of a strategic, or top-down, phenomenon. Here, we review the latest research on the topic of crossmodal correspondences to have addressed this issue. We argue that answering the question will require researchers to be more precise in terms of defining what exactly automaticity entails. Furthermore, one's answer to the automaticity question may also hinge on the answer to a second question: Namely, whether crossmodal correspondences are all 'of a kind', or whether instead there may be several different kinds of crossmodal mapping (e.g., statistical, structural, and semantic). Different answers to the automaticity question may then be revealed depending on the type of correspondence under consideration. We make a number of suggestions for future research that might help to determine just how automatic crossmodal correspondences really are. PMID:23370382

  17. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  18. Users manual for AUTOMESH-2D: A program of automatic mesh generation for two-dimensional scattering analysis by the finite element method

    NASA Technical Reports Server (NTRS)

    Hua, Chongyu; Volakis, John L.

    1990-01-01

    AUTOMESH-2D is a computer program specifically designed as a preprocessor for the scattering analysis of two-dimensional bodies by the finite element method. The program was developed to reduce the effort required to define and check the geometry data, element topology, and material properties. There are six modules in the program: (1) Parameter Specification; (2) Data Input; (3) Node Generation; (4) Element Generation; (5) Mesh Smoothing; and (6) Data File Generation.
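
    The node- and element-generation steps can be illustrated with a minimal structured-quad sketch for a rectangular domain; this is generic, not the AUTOMESH-2D algorithm:

        import numpy as np

        def rect_mesh(nx, ny, width=1.0, height=1.0):
            """Nodes on an (nx+1) x (ny+1) grid plus quad element connectivity."""
            xs = np.linspace(0.0, width, nx + 1)
            ys = np.linspace(0.0, height, ny + 1)
            nodes = np.array([(x, y) for y in ys for x in xs])
            elems = []
            for j in range(ny):
                for i in range(nx):
                    n0 = j * (nx + 1) + i     # counter-clockwise node ordering
                    elems.append((n0, n0 + 1, n0 + nx + 2, n0 + nx + 1))
            return nodes, elems

        nodes, elems = rect_mesh(3, 2)
        print(len(nodes), "nodes,", len(elems), "elements")   # 12 nodes, 6 elements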

  19. Auxiliary circuit enables automatic monitoring of EKG's

    NASA Technical Reports Server (NTRS)

    1965-01-01

    Auxiliary circuits allow direct, automatic monitoring of electrocardiograms by digital computers. The circuit produces one noiseless square-wave output signal for each trigger pulse from an electrocardiogram preamplifier. It also permits automatic processing of cardiovascular data from analog tapes.

  20. Automatisms: bridging clinical neurology with criminal law.

    PubMed

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms.