Science.gov

Sample records for automated change detection

  1. Automated change detection for synthetic aperture sonar

    NASA Astrophysics Data System (ADS)

    G-Michael, Tesfaye; Marchand, Bradley; Tucker, J. D.; Sternlicht, Daniel D.; Marston, Timothy M.; Azimi-Sadjadi, Mahmood R.

    2014-05-01

    In this paper, an automated change detection technique is presented that compares new and historical seafloor images created with sidescan synthetic aperture sonar (SAS) for changes occurring over time. The method consists of a four-stage process: a coarse navigational alignment; fine-scale co-registration using the scale invariant feature transform (SIFT) algorithm to match features between overlapping images; sub-pixel co-registration to improve phase coherence; and finally, change detection utilizing canonical correlation analysis (CCA). The method was tested using data collected with a high-frequency SAS in a sandy shallow-water environment. By using precise co-registration tools and change detection algorithms, it is shown that the coherent nature of the SAS data can be exploited in this environment over time scales ranging from hours through several days.
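
    The co-registration and change-mapping steps can be illustrated with a short Python sketch. This is not the authors' pipeline: OpenCV's SIFT implementation stands in for the feature matching, a simple per-tile normalized correlation replaces the canonical correlation analysis stage, and the inputs are assumed to be two coarsely aligned, single-channel 8-bit SAS magnitude images. The 0.3 correlation cut-off is an arbitrary placeholder.

      import cv2
      import numpy as np

      def coregister(historical, repeat):
          """Warp the repeat-pass image onto the historical one using SIFT feature matches."""
          sift = cv2.SIFT_create()
          kp1, des1 = sift.detectAndCompute(historical, None)
          kp2, des2 = sift.detectAndCompute(repeat, None)
          matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des2, des1, k=2)
          good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio test
          src = np.float32([kp2[m.queryIdx].pt for m in good])
          dst = np.float32([kp1[m.trainIdx].pt for m in good])
          warp, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
          return cv2.warpAffine(repeat, warp, historical.shape[::-1])

      def change_map(historical, registered, tile=32, corr_min=0.3):
          """Per-tile normalized correlation; tiles whose correlation drops below corr_min are flagged."""
          rows, cols = historical.shape
          flags = np.zeros((rows // tile, cols // tile), dtype=bool)
          for i in range(rows // tile):
              for j in range(cols // tile):
                  a = historical[i*tile:(i+1)*tile, j*tile:(j+1)*tile].astype(float).ravel()
                  b = registered[i*tile:(i+1)*tile, j*tile:(j+1)*tile].astype(float).ravel()
                  a -= a.mean()
                  b -= b.mean()
                  denom = np.linalg.norm(a) * np.linalg.norm(b)
                  flags[i, j] = (a @ b) / denom < corr_min if denom > 0 else False
          return flags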

  2. Automated baseline change detection phase I. Final report

    SciTech Connect

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust and other corrosion, the ABCD system can also detect leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
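
    The image-subtraction core of absolute change detection is simple to sketch. The snippet below assumes a pair of co-registered 8-bit grayscale inspection images and uses illustrative threshold and blob-size values; it is a minimal stand-in for the fielded ABCD software, not a reproduction of it.

      import numpy as np
      from scipy import ndimage

      def detect_changes(baseline, current, diff_threshold=25, min_pixels=50):
          """Flag connected regions where the current image departs from the archived baseline.

          baseline, current: co-registered 8-bit grayscale images of the same barrel.
          diff_threshold and min_pixels are illustrative values, not calibrated ABCD settings.
          """
          diff = np.abs(current.astype(np.int16) - baseline.astype(np.int16))
          mask = diff > diff_threshold                       # any visible change, regardless of cause
          labels, n = ndimage.label(mask)                    # group changed pixels into blobs
          sizes = ndimage.sum(mask, labels, range(1, n + 1))
          keep = np.isin(labels, np.nonzero(sizes >= min_pixels)[0] + 1)
          return keep                                        # boolean mask of candidate rust, dents, leaks, bulges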

  3. Classification, change-detection and accuracy assessment: Toward fuller automation

    NASA Astrophysics Data System (ADS)

    Podger, Nancy E.

    This research aims to automate methods for conducting change detection studies using remotely sensed images. Five major objectives were tested on two study sites, one encompassing Madison, Wisconsin, and the other Fort Hood, Texas. (Objective 1) Enhance accuracy assessments by estimating standard errors using bootstrap analysis. Bootstrap estimates of the standard errors were found to be comparable to parametric statistical estimates. Also, results show that bootstrapping can be used to evaluate the consistency of a classification process. (Objective 2) Automate the guided clustering classifier. This research shows that the guided clustering classification process can be automated while maintaining highly accurate results. Three different evaluation methods were used. (Evaluation 1) Appraised the consistency of 25 classifications produced from the automated system. The classifications differed from one another by only two to four percent. (Evaluation 2) Compared accuracies produced by the automated system to classification accuracies generated following a manual guided clustering protocol. Results: The automated system produced higher overall accuracies in 50 percent of the tests and was comparable for all but one of the remaining tests. (Evaluation 3) Assessed the time and effort required to produce accurate classifications. Results: The automated system produced classifications in less time and with less effort than the manual 'protocol' method. (Objective 3) Built a flexible, interactive software tool to aid in producing binary change masks. (Objective 4) Reduced by automation the amount of training data needed to classify the second image of a two-time-period change detection project. Locations of the training sites in 'unchanged' areas employed to classify the first image were used to identify sites where spectral information was automatically extracted from the second image. Results: The automatically generated training data produces classification accuracies
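
    Objective 1 (bootstrap estimates of the standard error of classification accuracy) reduces to resampling the reference/classified label pairs with replacement and recomputing overall accuracy. A minimal sketch, with an assumed number of bootstrap replicates:

      import numpy as np

      def bootstrap_accuracy_se(reference, classified, n_boot=1000, seed=0):
          """Bootstrap estimate of the standard error of overall classification accuracy."""
          rng = np.random.default_rng(seed)
          reference = np.asarray(reference)
          classified = np.asarray(classified)
          n = reference.size
          accs = np.empty(n_boot)
          for b in range(n_boot):
              idx = rng.integers(0, n, n)              # resample validation pixels with replacement
              accs[b] = np.mean(reference[idx] == classified[idx])
          return accs.mean(), accs.std(ddof=1)         # bootstrap mean accuracy and its standard error

      # e.g. ref and cls are per-pixel class labels at independent validation sites:
      # acc, se = bootstrap_accuracy_se(ref, cls)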

  4. Automated baseline change detection -- Phases 1 and 2. Final report

    SciTech Connect

    Byler, E.

    1997-10-31

    The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust and other corrosion, the ABCD system can also detect leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. The ABCD image processing software was installed on a robotic vehicle developed under a related DOE/FETC contract, DE-AC21-92MC29112 Intelligent Mobile Sensor System (IMSS), and integrated with the electronics and software. This vehicle was designed especially to navigate in DOE Waste Storage Facilities. Initial system testing was performed at Fernald in June 1996. After some further development and more extensive integration, the prototype integrated system was installed and tested at the Radioactive Waste Management Complex (RWMC) at INEEL from April 1997 through the present (November 1997). The integrated system, composed of the ABCD imaging software and the IMSS mobility base, is called MISS EVE (Mobile Intelligent Sensor System--Environmental Validation Expert). Evaluation of the integrated system in RWMC Building 628, containing approximately 10,000 drums, demonstrated an easy-to-use system with the ability to properly navigate through the facility, image all the defined drums, and process the results into a report delivered to the operator via a GUI and on hard copy. Further work is needed to make the brassboard system more operationally robust.

  5. Information Foraging and Change Detection for Automated Science Exploration

    NASA Technical Reports Server (NTRS)

    Furlong, P. Michael; Dille, Michael

    2016-01-01

    This paper presents a new algorithm for autonomous on-line exploration in unknown environments. The objective is to free remote scientists from extensive, and possibly infeasible, preliminary site investigation prior to sending robotic agents. We simulate a common exploration task for an autonomous robot sampling the environment at various locations and compare performance against simpler control strategies. An extension is proposed and evaluated that further permits operation in the presence of environmental variability in which the robot encounters a change in the distribution underlying sampling targets. Experimental results indicate a strong improvement in performance across varied parameter choices for the scenario.

  6. SU-E-J-191: Automated Detection of Anatomic Changes in H&N Patients

    SciTech Connect

    Usynin, A; Ramsey, C

    2014-06-01

    Purpose: To develop a novel statistics-based method for automated detection of anatomical changes using cone-beam CT data. A method was developed that can provide a reliable and automated early warning system that enables a “just-in-time” adaptation of the treatment plan. Methods: Anatomical changes were evaluated by comparing the original treatment planning CT with daily CBCT images taken prior to treatment delivery. The external body contour was computed on a given CT slice and compared against the corresponding contour on the daily CBCT. In contrast to threshold-based techniques, a statistical approach was employed to evaluate the difference between the contours using a given confidence level. The detection tool used the two-sample Kolmogorov-Smirnov (KS) test, a non-parametric technique that compares two samples drawn from arbitrary probability distributions. 11 H&N patients were retrospectively selected from a clinical imaging database with a total of 186 CBCT images. Six patients in the database were confirmed to have anatomic changes during the course of radiotherapy. Five of the H&N patients did not have significant changes. The KS test was applied to the contour data using a sliding window analysis. A confidence level of 0.99 was used to moderate false detections. Results: The algorithm was able to correctly detect anatomical changes in 6 out of 6 patients with excellent spatial accuracy, as early as the 14th elapsed day. The algorithm provided a consistent and accurate delineation of the detected changes. The output of the anatomical change tool is easily interpretable and can be shown overlaid on a 3D rendering of the patient's anatomy. Conclusion: The detection method provides the basis for one of the key components of Adaptive Radiation Therapy. The method uses tools that are readily available in the clinic, including daily CBCT imaging and image co-registration facilities.
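
    The statistical core of the method, a two-sample KS test applied to contour data in a sliding window, can be sketched as follows. Representing each body contour as radial distances from a common centroid, the window length, and the function names are assumptions for illustration; only the test itself and the 0.99 confidence level come from the abstract.

      import numpy as np
      from scipy.stats import ks_2samp

      def contour_change_flags(plan_radii, cbct_radii, window=30, alpha=0.01):
          """Slide a window along paired contour samples and flag segments whose
          radial-distance distributions differ at the 0.99 confidence level."""
          flags = []
          for start in range(0, len(plan_radii) - window + 1, window):
              a = np.asarray(plan_radii[start:start + window])
              b = np.asarray(cbct_radii[start:start + window])
              stat, p = ks_2samp(a, b)
              flags.append((start, p < alpha))   # True -> anatomical change suspected in this segment
          return flags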

  7. Automated detection of slum area change in Hyderabad, India using multitemporal satellite imagery

    NASA Astrophysics Data System (ADS)

    Kit, Oleksandr; Lüdeke, Matthias

    2013-09-01

    This paper presents an approach to automated identification of slum area change patterns in Hyderabad, India, using multi-year and multi-sensor very high resolution satellite imagery. It relies upon a lacunarity-based slum detection algorithm, combined with Canny- and LSD-based imagery pre-processing routines. This method outputs plausible and spatially explicit slum locations for the whole urban agglomeration of Hyderabad in years 2003 and 2010. The results indicate a considerable growth of area occupied by slums between these years and allow identification of trends in slum development in this urban agglomeration.
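
    A gliding-box lacunarity measure, the texture statistic underlying the slum detector, can be sketched with NumPy. The binary input (for example, an edge map produced by the Canny/LSD pre-processing) and the box size are assumed inputs; this is a generic illustration, not the authors' implementation.

      import numpy as np
      from numpy.lib.stride_tricks import sliding_window_view

      def lacunarity(binary_image, box_size):
          """Gliding-box lacunarity: second moment of box 'mass' divided by the squared first moment."""
          boxes = sliding_window_view(binary_image.astype(np.float64), (box_size, box_size))
          mass = boxes.sum(axis=(-1, -2)).ravel()        # pixel count inside every gliding box
          mean = mass.mean()
          return (mass ** 2).mean() / mean ** 2 if mean > 0 else np.inf

      # Low lacunarity at small box sizes corresponds to dense, homogeneous built-up texture
      # of the kind associated with slum areas in the detection algorithm described above.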

  8. Automated detection of sperm whale sounds as a function of abrupt changes in sound intensity

    NASA Astrophysics Data System (ADS)

    Walker, Christopher D.; Rayborn, Grayson H.; Brack, Benjamin A.; Kuczaj, Stan A.; Paulos, Robin L.

    2003-04-01

    An algorithm designed to detect abrupt changes in sound intensity was developed and used to identify and count sperm whale vocalizations and to measure boat noise. The algorithm is a MATLAB routine that counts the number of occurrences for which the change in intensity level exceeds a threshold. The algorithm also permits the setting of a "dead time" interval to prevent the counting of multiple pulses within a single sperm whale click. This algorithm was used to analyze digitally sampled recordings of ambient noise obtained from the Gulf of Mexico using near-bottom-mounted EARS buoys deployed as part of the Littoral Acoustic Demonstration Center experiment. Because the background in these data varied slowly, the result of the application of the algorithm was automated detection of sperm whale clicks and creaks, with results that agreed well with those obtained by trained human listeners. [Research supported by ONR.]
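
    The counting rule, register an event whenever the jump in intensity level exceeds a threshold and then ignore further crossings during a "dead time", is straightforward to sketch. The original routine is in MATLAB; the version below is a Python rendering with placeholder sample-rate, threshold, and dead-time values.

      import numpy as np

      def count_clicks(intensity_db, fs, threshold_db=6.0, dead_time_s=0.02):
          """Count abrupt intensity jumps, suppressing re-triggers within the dead-time interval."""
          delta = np.diff(intensity_db)                # sample-to-sample change in intensity level
          dead_samples = int(dead_time_s * fs)
          clicks, last = [], -dead_samples
          for i, d in enumerate(delta):
              if d > threshold_db and (i - last) >= dead_samples:
                  clicks.append(i)                     # candidate sperm whale click onset
                  last = i
          return clicks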

  9. Eigenvector methods for automated detection of electrocardiographic changes in partial epileptic patients.

    PubMed

    Ubeyli, Elif Derya

    2009-07-01

    In this paper, automated diagnostic systems trained on diverse and composite features are presented for the detection of electrocardiographic changes in partial epileptic patients. In practical applications of pattern recognition, there are often diverse features extracted from raw data that must be recognized. Combining multiple classifiers with diverse features is viewed as a general problem in various application areas of pattern recognition. Two types (normal and partial epilepsy) of ECG beats (180 records from each class) were obtained from the Physiobank database. The multilayer perceptron neural network (MLPNN), combined neural network (CNN), mixture of experts (ME), and modified mixture of experts (MME) were tested and benchmarked for their performance on the classification of the studied ECG signals, having been trained on diverse or composite features. Decision making was performed in two stages: feature extraction by eigenvector methods and classification using the classifiers trained on the extracted features. The present research demonstrated that the MME trained on the diverse features achieved accuracy rates (total classification accuracy of 99.44%) higher than those of the other automated diagnostic systems. PMID:19273021

  10. Tapping into the Hexagon spy imagery database: A new automated pipeline for geomorphic change detection

    NASA Astrophysics Data System (ADS)

    Maurer, Joshua; Rupper, Summer

    2015-10-01

    Declassified historical imagery from the Hexagon spy satellite database has near-global coverage, yet remains a largely untapped resource for geomorphic change studies. Unavailable satellite ephemeris data make DEM (digital elevation model) extraction difficult in terms of time and accuracy. A new fully-automated pipeline for DEM extraction and image orthorectification is presented which yields accurate results and greatly increases efficiency over traditional photogrammetric methods, making the Hexagon image database much more appealing and accessible. A 1980 Hexagon DEM is extracted and geomorphic change computed for the Thistle Creek Landslide region in the Wasatch Range of North America to demonstrate an application of the new method. Surface elevation changes resulting from the landslide show an average elevation decrease of 14.4 ± 4.3 m in the source area, an increase of 17.6 ± 4.7 m in the deposition area, and a decrease of 30.2 ± 5.1 m resulting from a new roadcut. Two additional applications of the method include volume estimates of material excavated during the Mount St. Helens volcanic eruption and the volume of net ice loss over a 34-year period for glaciers in the Bhutanese Himalayas. These results show the value of Hexagon imagery in detecting and quantifying historical geomorphic change, especially in regions where other data sources are limited.
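
    The reported elevation-change figures follow from differencing co-registered DEMs and propagating each product's vertical uncertainty. A generic sketch of that step is given below; the quadrature error model, the absence of spatial error averaging, and all variable names are simplifying assumptions, not the paper's published uncertainty treatment.

      import numpy as np

      def dem_change(dem_old, dem_new, sigma_old, sigma_new, mask):
          """Mean elevation change over a region of interest with a propagated 1-sigma uncertainty.

          dem_old, dem_new: co-registered elevation grids (m); mask: boolean ROI
          (e.g. landslide source or deposition area). sigma_old, sigma_new: vertical
          uncertainties of each DEM (m), treated as independent.
          """
          diff = dem_new - dem_old
          mean_change = np.nanmean(diff[mask])
          sigma_pixel = np.hypot(sigma_old, sigma_new)   # per-pixel errors added in quadrature
          return mean_change, sigma_pixel                # conservative: no spatial averaging of errors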

  11. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    NASA Astrophysics Data System (ADS)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has a significant potential for 3D topographic change detection. In the present case study, the latest point cloud generation and analysis capabilities are used to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m2. It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics is running via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation, a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM2008) to the post-DEM. • Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtraction of the pre-DEM from the post-DEM. • Filtering and threshold-based classification of

  12. Automated segmentation algorithm for detection of changes in vaginal epithelial morphology using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Chitchian, Shahab; Vincent, Kathleen L.; Vargas, Gracie; Motamedi, Massoud

    2012-11-01

    We have explored the use of optical coherence tomography (OCT) as a noninvasive tool for assessing the toxicity of topical microbicides, products used to prevent HIV, by monitoring the integrity of the vaginal epithelium. A novel feature-based segmentation algorithm using a nearest-neighbor classifier was developed to monitor changes in the morphology of vaginal epithelium. The two-step automated algorithm yielded OCT images with a clearly defined epithelial layer, enabling differentiation of normal and damaged tissue. The algorithm was robust in that it was able to discriminate the epithelial layer from underlying stroma as well as residual microbicide product on the surface. This segmentation technique for OCT images has the potential to be readily adaptable to the clinical setting for noninvasively defining the boundaries of the epithelium, enabling quantifiable assessment of microbicide-induced damage in vaginal tissue.

  13. Automated urban change detection using scanned cartographic and satellite image data

    USGS Publications Warehouse

    Spooner, Jeffrey D.

    1991-01-01

    The objective of this study was to develop a digital procedure to measure the amount of urban change that has occurred in an area since the publication of its corresponding 1:24,000-scale topographic map. Traditional change detection techniques are dependent upon the visual comparison of high-altitude aerial photographs or, more recently, satellite image data to a corresponding map. Analytical change detection techniques typically involve the digital comparison of satellite images to one another. As a result of this investigation, a new technique has been developed that analytically compares the most recently published map to a corresponding digital satellite image. Scanned cartographic and satellite image data are combined in a single file with a structural component derived from the satellite image. This investigation determined that with this combination of data the spectral characteristics of urban change are predictable. A supervised classification was used to detect and delimit urban change. Although it was not intended to identify the specific nature of any change, this procedure does provide a means of differentiating between areas that have or have not experienced urbanization to determine appropriate map revision strategies.

  14. Automated seizure detection using EKG.

    PubMed

    Osorio, Ivan

    2014-03-01

    Changes in heart rate, most often increases, are associated with the onset of epileptic seizures and may be used in lieu of cortical activity for automated seizure detection. The feasibility of this aim was tested on 241 clinical seizures from 81 subjects admitted to several Epilepsy Centers for invasive monitoring in evaluation for epilepsy surgery. The performance of the EKG-based seizure detection algorithm was compared to that of a validated algorithm applied to electrocorticogram (ECoG). With the most sensitive detection settings [threshold T: 1.15; duration D: 0 s], 5/241 seizures (2%) were undetected (false negatives), and with the highest [T: 1.3; D: 5 s] settings, the number of false negative detections rose to 34 (14%). The rate of potential false positive (PFP) detections was 9.5/h with the lowest and 1.1/h with the highest T, D settings. Visual review of 336 ECoG segments associated with PFPs revealed that 120 (36%) were associated with seizures, 127 (38%) with bursts of epileptiform discharges, and only 87 (26%) were true false positives. Electrocardiographic (EKG)-based seizure onset detection preceded clinical onset by 0.8 s with the lowest and followed it by 13.8 s with the highest T, D settings. Automated EKG-based seizure detection is feasible and has potential clinical utility given its ease of acquisition, processing, high signal/noise, and ergonomic advantages vis-à-vis EEG (electroencephalogram) or ECoG. Its use as an "electronic" seizure diary will remedy, in part, the inaccuracies of those generated by patients/care-givers in a cost-effective manner.
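
    One way to read the T and D settings is as a ratio test on heart rate: declare a candidate seizure when the rate exceeds T times a running baseline and stays there for at least D seconds. The sketch below takes R-R intervals as input and uses an assumed 60-beat baseline window; it is an interpretation of the quoted settings, not the validated algorithm.

      import numpy as np

      def ekg_seizure_onsets(rr_intervals_s, T=1.15, D=0.0, baseline_beats=60):
          """Return beat indices where heart rate exceeds T x running-median baseline for >= D seconds."""
          hr = 60.0 / np.asarray(rr_intervals_s)              # beats per minute, one value per beat
          onsets = []
          run_start, run_time, reported = None, 0.0, False
          for i in range(baseline_beats, len(hr)):
              baseline = np.median(hr[i - baseline_beats:i])   # slowly varying reference rate
              if hr[i] > T * baseline:
                  if run_start is None:
                      run_start, run_time, reported = i, 0.0, False
                  run_time += rr_intervals_s[i]
                  if run_time >= D and not reported:
                      onsets.append(run_start)                 # one detection per supra-threshold run
                      reported = True
              else:
                  run_start = None
          return onsets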

  15. Automated thematic mapping and change detection of ERTS-1 images. [Phoenix, Arizona]

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N.

    1974-01-01

    Results of an automated thematic mapping investigation using ERTS-1 MSS images are presented. A diffraction pattern analysis of MSS images led to the development of spatial signatures for farm land, urban areas, and mountains. Four spatial features are employed to describe the spatial characteristics of image cells in the digital data. Three spectral features are combined with the spatial features to form a seven dimensional vector describing each cell. Then, the classification of the feature vectors is accomplished by using the maximum likelihood criterion. Three ERTS-1 images from the Phoenix, Arizona area were processed, and recognition rates between 85% and 100% were obtained for the terrain classes of desert, farms, mountains and urban areas. To eliminate the need for training data, a new clustering algorithm has also been developed.

  16. LANDSAT image differencing as an automated land cover change detection technique

    NASA Technical Reports Server (NTRS)

    Stauffer, M. L.; Mckinney, R. L.

    1978-01-01

    Image differencing was investigated as a technique for use with LANDSAT digital data to delineate areas of land cover change in an urban environment. LANDSAT data collected in April 1973 and April 1975 for Austin, Texas, were geometrically corrected and precisely registered to United States Geological Survey 7.5-minute quadrangle maps. At each pixel location, reflectance values for the corresponding bands were subtracted to produce four difference images. Areas of major reflectance differences were isolated by thresholding each of the difference images. The resulting images were combined to obtain an image data set of total change. These areas of reflectance differences were found, in general, to correspond to areas of land cover change. Information on areas of land cover change was incorporated into a procedure to mask out all nonchange areas and perform an unsupervised classification only for data in the change areas. This procedure identified three broad categories: (1) areas of high reflectance (construction or extractive), (2) changes in agricultural areas, and (3) areas of confusion between agricultural and other areas.
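
    The differencing-and-thresholding chain can be sketched for a stack of co-registered bands. The ±2-standard-deviation cut-off below is an assumed stand-in for whatever thresholds the study actually applied to each difference image.

      import numpy as np

      def total_change_mask(image_t1, image_t2, k=2.0):
          """Per-band differencing and thresholding, combined into a single change mask.

          image_t1, image_t2: arrays of shape (bands, rows, cols), co-registered.
          A pixel is 'changed' if any band's difference lies more than k standard
          deviations from that band's mean difference.
          """
          diff = image_t2.astype(np.float64) - image_t1.astype(np.float64)
          mu = diff.mean(axis=(1, 2), keepdims=True)
          sigma = diff.std(axis=(1, 2), keepdims=True)
          band_masks = np.abs(diff - mu) > k * sigma
          return band_masks.any(axis=0)                  # union over the per-band difference masks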

  17. Use of an automated digital images system for detecting plant status changes in response to climate change manipulations

    NASA Astrophysics Data System (ADS)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo

    2014-05-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events appears to be of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations, and, consequently, to develop effective management and adaptation strategies. However, traditional monitoring of phenology is labor-intensive, costly, and affected to a certain degree by subjective inaccuracy. Other methods used to quantify the seasonal patterns of vegetation development are based on satellite remote sensing (land surface phenology), but they operate at coarse spatial and temporal resolution. To overcome the issues of these methodologies, different approaches for vegetation monitoring based on "near-surface" remote sensing have been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in seasonal variations of intensities of each color channel can be used to identify phenological events. Canopy green-up phenology can be quantified from the greenness indices. Species-specific dates of leaf emergence can be estimated by RGB image analyses. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species in a Mediterranean site. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior has been conducted continuously since October 2012. The system was set to acquire one panorama per day at noon, which included three experimental plots for
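
    The color-channel analysis described is commonly expressed as a green chromatic coordinate (GCC) averaged over each plot's region of interest in every daily image; inflection points in the resulting time series mark phenological transitions. A minimal sketch, with placeholder ROI coordinates:

      import numpy as np

      def green_chromatic_coordinate(rgb_image, roi):
          """GCC = G / (R + G + B), averaged over one plot's region of interest in a daily image.

          rgb_image: H x W x 3 array; roi: (row_slice, col_slice) covering the plot.
          """
          patch = rgb_image[roi].astype(np.float64)
          r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
          total = r + g + b
          gcc = np.where(total > 0, g / np.maximum(total, 1e-9), np.nan)
          return float(np.nanmean(gcc))

      # Daily values form a time series whose inflection points mark green-up and senescence, e.g.:
      # series = [green_chromatic_coordinate(img, (slice(400, 600), slice(800, 1000))) for img in daily_images]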

  18. An automated process for deceit detection

    NASA Astrophysics Data System (ADS)

    Nwogu, Ifeoma; Frank, Mark; Govindaraju, Venu

    2010-04-01

    In this paper we present a prototype for an automated deception detection system. Similar to polygraph examinations, we attempt to take advantage of the theory that false answers will produce distinctive measurements in certain physiological manifestations. We investigate the role of dynamic eye-based features such as eye closure/blinking and lateral movements of the iris in detecting deceit. The features are recorded both when the test subjects are having non-threatening conversations as well as when they are being interrogated about a crime they might have committed. The rates of the behavioral changes are blindly clustered into two groups. Examining the clusters and their characteristics, we observe that the dynamic features selected for deception detection show promising results with an overall deceptive/non-deceptive prediction rate of 71.43% from a study consisting of 28 subjects.

  19. Satellite mapping and automated feature extraction: Geographic information system-based change detection of the Antarctic coast

    NASA Astrophysics Data System (ADS)

    Kim, Kee-Tae

    Declassified Intelligence Satellite Photograph (DISP) data are important resources for measuring the geometry of the coastline of Antarctica. By using state-of-the-art digital imaging technology, bundle block triangulation based on tie points and control points derived from a RADARSAT-1 Synthetic Aperture Radar (SAR) image mosaic and the Ohio State University (OSU) Antarctic digital elevation model (DEM), the individual DISP images were accurately assembled into a map-quality mosaic of Antarctica as it appeared in 1963. The new map is an important benchmark for gauging the response of the Antarctic coastline to changing climate. Automated coastline extraction algorithm design is the second theme of this dissertation. At the pre-processing stage, adaptive neighborhood filtering was used to remove the film-grain noise while preserving edge features. At the segmentation stage, an adaptive Bayesian approach to image segmentation was used to split the DISP imagery into its homogeneous regions, in which the fuzzy c-means clustering (FCM) technique and a Gibbs random field (GRF) model were introduced to estimate the conditional and prior probability density functions. A Gaussian mixture model was used to estimate reliable initial values for the FCM technique. At the post-processing stage, image object formation and labeling, removal of noisy image objects, and vectorization algorithms were sequentially applied to the segmented images to extract a vector representation of coastlines. Results were presented that demonstrate the effectiveness of the algorithm in segmenting the DISP data. In cases of cloud cover and low-contrast scenes, manual editing was carried out based on intermediate image processing and visual inspection in comparison with old paper maps. Through a geographic information system (GIS), the derived DISP coastline data were integrated with earlier and later data to assess continental-scale changes in the Antarctic coast. Computing the area of

  20. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, which is a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other implementing a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  1. Automated detection of bacteria in urine

    NASA Technical Reports Server (NTRS)

    Fleig, A. J.; Picciolo, G. L.; Chappelle, E. W.; Kelbaugh, B. N.

    1972-01-01

    A method for detecting the presence of bacteria in urine was developed which utilizes the bioluminescent reaction of adenosine triphosphate with luciferin and luciferase derived from the tails of fireflies. The method was derived from work on extraterrestrial life detection. A device was developed which completely automates the assay process.

  2. An Automated Flying-Insect-Detection System

    NASA Technical Reports Server (NTRS)

    Vann, Timi; Andrews, Jane C.; Howell, Dane; Ryan, Robert

    2005-01-01

    An automated flying-insect-detection system (AFIDS) was developed as a proof-of-concept instrument for real-time detection and identification of flying insects. This type of system has use in public health and homeland security decision support, agriculture and military pest management, and/or entomological research. Insects are first lured into the AFIDS integrated sphere by insect attractants. Once inside the sphere, the insect's wing beats cause alterations in light intensity that are detected by a photoelectric sensor. Following detection, the insects are encouraged (with the use of a small fan) to move out of the sphere and into a designated insect trap where they are held for taxonomic identification or serological testing. The acquired electronic wing beat signatures are preprocessed (Fourier transformed) in real-time to display a periodic signal. These signals are sent to the end user where they are graphically displayed. All AFIDS data are pre-processed in the field with the use of a laptop computer equipped with LabVIEW. The AFIDS software can be programmed to run continuously or at specific time intervals when insects are prevalent. A special DC-restored transimpedance amplifier reduces the contributions of low-frequency background light signals, and affords approximately two orders of magnitude greater AC gain than conventional amplifiers. This greatly increases the signal-to-noise ratio and enables the detection of small changes in light intensity. The AFIDS light source consists of high-intensity AlGaInP light-emitting diodes (LEDs). The AFIDS circuitry minimizes brightness fluctuations in the LEDs and, when integrated with an integrating sphere, creates a diffuse uniform light field. The insect wing beats isotropically scatter the diffuse light in the sphere and create wing beat signatures that are detected by the sensor. This configuration minimizes variations in signal associated with insect flight orientation.
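
    The real-time Fourier-transform step amounts to taking the spectrum of the photodetector trace and reading off the dominant wing-beat frequency. A sketch with an assumed sample rate and search band:

      import numpy as np

      def wing_beat_frequency(sensor_signal, fs, f_min=50.0, f_max=1500.0):
          """Dominant wing-beat frequency (Hz) of a photodetector trace, searched in a plausible insect band."""
          x = sensor_signal - np.mean(sensor_signal)            # remove the DC light level
          spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
          freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
          band = (freqs >= f_min) & (freqs <= f_max)
          return freqs[band][np.argmax(spectrum[band])]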

  3. Automated Methods for Multiplexed Pathogen Detection

    SciTech Connect

    Straub, Tim M.; Dockendorff, Brian P.; Quinonez-Diaz, Maria D.; Valdez, Catherine O.; Shutthanandan, Janani I.; Tarasevich, Barbara J.; Grate, Jay W.; Bruckner-Lea, Cindy J.

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells for each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities. However

  4. Imaging flow cytometry for automated detection of hypoxia-induced erythrocyte shape change in sickle cell disease.

    PubMed

    van Beers, Eduard J; Samsel, Leigh; Mendelsohn, Laurel; Saiyed, Rehan; Fertrin, Kleber Y; Brantner, Christine A; Daniels, Mathew P; Nichols, James; McCoy, J Philip; Kato, Gregory J

    2014-06-01

    In preclinical and early phase pharmacologic trials in sickle cell disease, the percentage of sickled erythrocytes after deoxygenation, an ex vivo functional sickling assay, has been used as a measure of a patient's disease outcome. We developed a new sickle imaging flow cytometry assay (SIFCA) and investigated its application. To perform the SIFCA, peripheral blood was diluted, deoxygenated (2% oxygen) for 2 hr, fixed, and analyzed using imaging flow cytometry. We developed a software algorithm that correctly classified investigator tagged "sickled" and "normal" erythrocyte morphology with a sensitivity of 100% and a specificity of 99.1%. The percentage of sickled cells as measured by SIFCA correlated strongly with the percentage of sickle cell anemia blood in experimentally admixed samples (R = 0.98, P ≤ 0.001), negatively with fetal hemoglobin (HbF) levels (R = -0.558, P = 0.027), negatively with pH (R = -0.688, P = 0.026), negatively with pretreatment with the antisickling agent, Aes-103 (5-hydroxymethyl-2-furfural) (R = -0.766, P = 0.002), and positively with the presence of long intracellular fibers as visualized by transmission electron microscopy (R = 0.799, P = 0.002). This study shows proof of principle that the automated, operator-independent SIFCA is associated with predictable physiologic and clinical parameters and is altered by the putative antisickling agent, Aes-103. SIFCA is a new method that may be useful in sickle cell drug development. PMID:24585634

  5. Imaging flow cytometry for automated detection of hypoxia-induced erythrocyte shape change in sickle cell disease

    PubMed Central

    van Beers, Eduard J.; Samsel, Leigh; Mendelsohn, Laurel; Saiyed, Rehan; Fertrin, Kleber Y.; Brantner, Christine A.; Daniels, Mathew P.; Nichols, James; McCoy, J. Philip; Kato, Gregory J.

    2014-01-01

    In preclinical and early phase pharmacologic trials in sickle cell disease, the percentage of sickled erythrocytes after deoxygenation, an ex vivo functional sickling assay, has been used as a measure of a patient’s disease outcome. We developed a new sickle imaging flow cytometry assay (SIFCA) and investigated its application. To perform the SIFCA, peripheral blood was diluted, deoxygenated (2% oxygen) for 2 hr, fixed, and analyzed using imaging flow cytometry. We developed a software algorithm that correctly classified investigator tagged “sickled” and “normal” erythrocyte morphology with a sensitivity of 100% and a specificity of 99.1%. The percentage of sickled cells as measured by SIFCA correlated strongly with the percentage of sickle cell anemia blood in experimentally admixed samples (R = 0.98, P ≤ 0.001), negatively with fetal hemoglobin (HbF) levels (R = −0.558, P = 0.027), negatively with pH (R = −0.688, P = 0.026), negatively with pretreatment with the antisickling agent, Aes-103 (5-hydroxymethyl-2-furfural) (R = −0.766, P = 0.002), and positively with the presence of long intracellular fibers as visualized by transmission electron microscopy (R = 0.799, P = 0.002). This study shows proof of principle that the automated, operator-independent SIFCA is associated with predictable physiologic and clinical parameters and is altered by the putative antisickling agent, Aes-103. SIFCA is a new method that may be useful in sickle cell drug development. PMID:24585634

  6. An Automated Flying-Insect Detection System

    NASA Technical Reports Server (NTRS)

    Vann, Timi; Andrews, Jane C.; Howell, Dane; Ryan, Robert

    2007-01-01

    An automated flying-insect detection system (AFIDS) was developed as a proof-of-concept instrument for real-time detection and identification of flying insects. This type of system has use in public health and homeland-security decision support, agriculture and military pest management, and/or entomological research. Insects are first lured into the AFIDS integrated sphere by insect attractants. Once inside the sphere, the insect's wing beats cause alterations in light intensity that are detected by a photoelectric sensor. Following detection, the insects are encouraged (with the use of a small fan) to move out of the sphere and into a designated insect trap where they are held for taxonomic identification or serological testing. The acquired electronic wing-beat signatures are preprocessed (Fourier transformed) in real time to display a periodic signal. These signals are sent to the end user where they are graphically displayed. All AFIDS data are preprocessed in the field with the use of a laptop computer equipped with LabVIEW. The AFIDS software can be programmed to run continuously or at specific time intervals when insects are prevalent. A special DC-restored transimpedance amplifier reduces the contributions of low-frequency background light signals, and affords approximately two orders of magnitude greater AC gain than conventional amplifiers. This greatly increases the signal-to-noise ratio and enables the detection of small changes in light intensity. The AFIDS light source consists of high-intensity AlGaInP light-emitting diodes (LEDs). The AFIDS circuitry minimizes brightness fluctuations in the LEDs and, when integrated with an integrating sphere, creates a diffuse uniform light field. The insect wing beats isotropically scatter the diffuse light in the sphere and create wing-beat signatures that are detected by the sensor. This configuration minimizes variations in signal associated with insect flight orientation. Preliminary data indicate that AFIDS has

  7. Photoelectric detection system. [manufacturing automation

    NASA Technical Reports Server (NTRS)

    Currie, J. R.; Schansman, R. R. (Inventor)

    1982-01-01

    A photoelectric beam system for the detection of the arrival of an object at a discrete station wherein artificial light, natural light, or no light may be present is described. A signal generator turns on and off a signal light at a selected frequency. When the object in question arrives on station, ambient light is blocked by the object, and the light from the signal light is reflected onto a photoelectric sensor which has a delayed electrical output but is of the frequency of the signal light. Outputs from both the signal source and the photoelectric sensor are fed to inputs of an exclusive-OR detector which provides as an output the difference between them. The difference signal is a small-width pulse occurring at the frequency of the signal source. By filter means, this signal is distinguished from those responsive to sunlight, darkness, or 120 Hz artificial light. In this fashion, the presence of an object is positively established.

  8. Automated macromolecular crystal detection system and method

    DOEpatents

    Christian, Allen T.; Segelke, Brent; Rupp, Bernard; Toppani, Dominique

    2007-06-05

    An automated method and system for detecting macromolecular crystals in two-dimensional images, such as light microscopy images obtained from an array of crystallization screens, is described. Edges are detected from the images by identifying local maxima of a phase congruency-based function associated with each image. The detected edges are segmented into discrete line segments, which are subsequently geometrically evaluated with respect to each other to identify any crystal-like qualities such as parallel lines, facing each other, similarity in length, and relative proximity. From this evaluation, a determination is made as to whether crystals are present in each image.

  9. Automated Wildfire Detection Through Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Miller, Jerry; Borne, Kirk; Thomas, Brian; Huang, Zhenping; Chi, Yuechen

    2005-01-01

    We have tested and deployed Artificial Neural Network (ANN) data mining techniques to analyze remotely sensed multi-channel imaging data from MODIS, GOES, and AVHRR. The goal is to train the ANN to learn the signatures of wildfires in remotely sensed data in order to automate the detection process. We train the ANN using the set of human-detected wildfires in the U.S., which are provided by the Hazard Mapping System (HMS) wildfire detection group at NOAA/NESDIS. The ANN is trained to mimic the behavior of fire detection algorithms and the subjective decision-making by NOAA HMS Fire Analysts. We use a local extremum search in order to isolate fire pixels, and then we extract a 7x7 pixel array around that location in 3 spectral channels. The corresponding 147 pixel values are used to populate a 147-dimensional input vector that is fed into the ANN. The ANN accuracy is tested and overfitting is avoided by using a subset of the training data that is set aside as a test data set. We have achieved an automated fire detection accuracy of 80-92%, depending on a variety of ANN parameters and for different instrument channels among the 3 satellites. We believe that this system can be deployed worldwide or for any region to detect wildfires automatically in satellite imagery of those regions. These detections can ultimately be used to provide thermal inputs to climate models.
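
    The feature construction (a 7x7 window around each candidate local maximum in 3 spectral channels, flattened to 147 values) can be sketched directly. scikit-learn's MLPClassifier appears below purely as a placeholder for the project's ANN; the layer size and training call are illustrative assumptions.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      def fire_feature_vector(channels, row, col, half=3):
          """Flatten a 7x7 window around (row, col) from each of the 3 spectral channels into 147 values."""
          windows = [c[row - half:row + half + 1, col - half:col + half + 1] for c in channels]
          return np.concatenate([w.ravel() for w in windows])   # shape (147,)

      # X: feature vectors at candidate local maxima; y: 1 for analyst-confirmed fire pixels, 0 otherwise.
      # clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X_train, y_train)
      # Held-out accuracy then plays the role of the 80-92% detection accuracy quoted above.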

  10. Automated assistance for detecting malicious code

    SciTech Connect

    Crawford, R.; Kerchen, P.; Levitt, K.; Olsson, R.; Archer, M.; Casillas, M.

    1993-06-18

    This paper gives an update on the continuing work on the Malicious Code Testbed (MCT). The MCT is a semi-automated tool, operating in a simulated, cleanroom environment, that is capable of detecting many types of malicious code, such as viruses, Trojan horses, and time/logic bombs. The MCT allows security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict.

  11. Automated target detection from compressive measurements

    NASA Astrophysics Data System (ADS)

    Shilling, Richard Z.; Muise, Robert R.

    2016-04-01

    A novel compressive imaging model is proposed that multiplexes segments of the field of view onto an infrared focal plane array (FPA). Similar to the compound eyes of insects, our imaging model is based on combining pixels from a surface comprising different parts of the field of view (FOV). We formalize this superposition of pixels as a global multiplexing process that reduces the resolution requirements of the FPA. We then apply automated target detection algorithms directly on the measurements of this model in a typical missile seeker scene. Based on quadratic correlation filters, we extend the target training and detection processes directly using these encoded measurements. Preliminary results are promising.

  12. Automated Wildfire Detection Through Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Miller, Jerry; Borne, Kirk; Thomas, Brian; Huang, Zhenping; Chi, Yuechen

    2005-01-01

    Wildfires have a profound impact upon the biosphere and our society in general. They cause loss of life, destruction of personal property and natural resources and alter the chemistry of the atmosphere. In response to the concern over the consequences of wildland fire and to support the fire management community, the National Oceanic and Atmospheric Administration (NOAA), National Environmental Satellite, Data and Information Service (NESDIS), located in Camp Springs, Maryland, gradually developed an operational system to routinely monitor wildland fire by satellite observations. The Hazard Mapping System, as it is known today, allows a team of trained fire analysts to examine and integrate, on a daily basis, remote sensing data from Geostationary Operational Environmental Satellite (GOES), Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) satellite sensors and generate a 24-hour fire product for the conterminous United States. Although assisted by automated fire detection algorithms, NOAA has not been able to eliminate the human element from its fire detection procedures. As a consequence, the manually intensive effort has prevented NOAA from transitioning to a global fire product as urged particularly by climate modelers. NASA at Goddard Space Flight Center in Greenbelt, Maryland, is helping NOAA more fully automate the Hazard Mapping System by training neural networks to mimic the decision-making process of the fire analyst team as well as the automated algorithms.

  13. Automated DNA electrophoresis, hybridization and detection

    SciTech Connect

    Zapolski, E.J.; Gersten, D.M.; Golab, T.J.; Ledley, R.S.

    1986-05-01

    A fully automated, computer-controlled system for nucleic acid hybridization analysis has been devised and constructed. In practice, DNA is digested with restriction endonuclease enzyme(s) and loaded into the system by pipette; ³²P-labelled nucleic acid probe(s) is loaded into the nine hybridization chambers. Instructions for all the steps in the automated process are specified by answering questions that appear on the computer screen at the start of the experiment. Subsequent steps are performed automatically. The system performs horizontal electrophoresis in agarose gel, fixes the fragments to a solid-phase matrix, denatures, neutralizes, prehybridizes, hybridizes, washes, dries, and detects the radioactivity according to the specifications given by the operator. The results, printed out at the end, give the positions on the matrix to which radioactivity remains hybridized following stringent washing.

  14. Automated Detection of HONcode Website Conformity Compared to Manual Detection: An Evaluation

    PubMed Central

    2015-01-01

    of at least 75%, with a recall of more than 50% for contact details (100% precision, 69% recall), authority (85% precision, 52% recall), and reference (75% precision, 56% recall). The results also revealed issues for some criteria such as date. Changing the “document” definition (ie, using the sentence instead of whole document as a unit of classification) within the automated system resolved some but not all of them. Conclusions Study results indicate concordance between automated and expert manual compliance detection for authority, privacy, reference, and contact details. Results also indicate that using the same general parameters for automated detection of each criterion produces suboptimal results. Future work to configure optimal system parameters for each HONcode principle would improve results. The potential utility of integrating automated detection of HONcode conformity into future search engines is also discussed. PMID:26036669

  15. Computing and Office Automation: Changing Variables.

    ERIC Educational Resources Information Center

    Staman, E. Michael

    1981-01-01

    Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…

  16. Sunglint Detection for Unmanned and Automated Platforms

    PubMed Central

    Garaba, Shungudzemwoyo Pascal; Schulz, Jan; Wernand, Marcel Robert; Zielinski, Oliver

    2012-01-01

    We present an empirical quality control protocol for above-water radiometric sampling focusing on identifying sunglint situations. Using hyperspectral radiometers, measurements were taken on an automated and unmanned seaborne platform in northwest European shelf seas. In parallel, a camera system was used to capture sea surface and sky images of the investigated points. The quality control consists of meteorological flags, to mask dusk, dawn, precipitation and low light conditions, utilizing incoming solar irradiance (ES) spectra. Using 629 from a total of 3,121 spectral measurements that passed the test conditions of the meteorological flagging, a new sunglint flag was developed. To predict sunglint conspicuous in the simultaneously available sea surface images, a sunglint image detection algorithm was developed and implemented. Applying this algorithm, two sets of data were derived: one with sunglint (having excessive or clearly detectable white pixels) and one without sunglint (having the fewest visible or detectable white pixels). To identify the most effective sunglint flagging criteria, we evaluated the spectral characteristics of these two data sets using water leaving radiance (LW) and remote sensing reflectance (RRS). Spectral conditions satisfying ‘mean LW (700–950 nm) < 2 mW·m−2·nm−1·Sr−1’ or alternatively ‘minimum RRS (700–950 nm) < 0.010 Sr−1’, mask most measurements affected by sunglint, providing an efficient empirical flagging of sunglint in automated quality control.
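
    The two flagging criteria quoted can be applied directly to a measured spectrum. The sketch assumes wavelength and value arrays for a single observation, in the units given in the abstract, and treats the two criteria as interchangeable alternatives, as the text states.

      import numpy as np

      def passes_sunglint_flag(wavelengths_nm, lw=None, rrs=None):
          """Return True if a spectrum passes (is kept by) the sunglint flag.

          Either criterion from the abstract can be used on its own:
          mean Lw(700-950 nm) < 2 mW m-2 nm-1 sr-1, or min Rrs(700-950 nm) < 0.010 sr-1.
          """
          nir = (wavelengths_nm >= 700) & (wavelengths_nm <= 950)
          if lw is not None:
              return float(np.mean(lw[nir])) < 2.0
          return float(np.min(rrs[nir])) < 0.010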

  17. Automated detection of Antarctic blue whale calls.

    PubMed

    Socheleau, Francois-Xavier; Leroy, Emmanuelle; Pecci, Andres Carvallo; Samaran, Flore; Bonnel, Julien; Royer, Jean-Yves

    2015-11-01

    This paper addresses the problem of automated detection of Z-calls emitted by Antarctic blue whales (B. m. intermedia). The proposed solution is based on a subspace detector of sigmoidal-frequency signals with unknown time-varying amplitude. This detection strategy takes into account frequency variations of blue whale calls as well as the presence of other transient sounds that can interfere with Z-calls (such as airguns or other whale calls). The proposed method has been tested on more than 105 h of acoustic data containing about 2200 Z-calls (as found by an experienced human operator). This method is shown to have a correct-detection rate that is, at best, more than 15% better than that of the extensible bioacoustic tool package, a spectrogram-based correlation detector commonly used to study blue whales. Because the proposed method relies on subspace detection, it does not suffer from some drawbacks of correlation-based detectors. In particular, it does not require the choice of an a priori fixed and subjective template. The analytic expression of the detection performance is also derived, which provides crucial information for higher-level analyses such as animal density estimation from acoustic data. Finally, the detection threshold automatically adapts to the soundscape in order not to violate a user-specified false alarm rate. PMID:26627784
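
    A toy matched-subspace statistic illustrates the idea: build a small dictionary of sigmoidal-frequency templates, orthonormalize it, and compare the energy of a data window inside that subspace with its total energy. This is a schematic GLRT-style sketch, not the published detector; the template frequencies and sweep slopes are placeholders.

      import numpy as np

      def sigmoid_template(n, fs, f_hi=28.0, f_lo=18.0, k=3.0, t_mid=None):
          """Unit-norm tone whose instantaneous frequency follows a sigmoid from f_hi down to f_lo (Hz)."""
          t = np.arange(n) / fs
          t_mid = t[n // 2] if t_mid is None else t_mid
          f_inst = f_lo + (f_hi - f_lo) / (1.0 + np.exp(k * (t - t_mid)))
          s = np.cos(2 * np.pi * np.cumsum(f_inst) / fs)
          return s / np.linalg.norm(s)

      def subspace_statistic(window, fs, slopes=(1.0, 3.0, 6.0)):
          """Fraction of a data window's energy captured by the sigmoidal-frequency subspace (0..1)."""
          H = np.column_stack([sigmoid_template(len(window), fs, k=k) for k in slopes])
          Q, _ = np.linalg.qr(H)                        # orthonormal basis of the signal subspace
          x = window - np.mean(window)
          return float(np.linalg.norm(Q.T @ x) ** 2 / (np.linalg.norm(x) ** 2 + 1e-12))

      # Slide over the recording and flag windows whose statistic exceeds a threshold tuned,
      # as in the paper, to keep the false-alarm rate at a user-specified level.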

  18. Automated Detection of Clouds in Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary

    2010-01-01

    Many different approaches have been used to automatically detect clouds in satellite imagery. Most approaches are deterministic and provide a binary cloud/no-cloud product used in a variety of applications. Some of these applications require the identification of cloudy pixels for cloud parameter retrieval, while others require only an ability to mask out clouds for the retrieval of surface or atmospheric parameters in the absence of clouds. A few approaches estimate a probability of the presence of a cloud at each point in an image. These probabilities allow a user to select cloud information based on the tolerance of the application to uncertainty in the estimate. Many automated cloud detection techniques develop sophisticated tests using a combination of visible and infrared channels to determine the presence of clouds in both day and night imagery. Visible channels are quite effective in detecting clouds during the day, as long as test thresholds properly account for variations in surface features and atmospheric scattering. Cloud detection at night is more challenging, since only coarser-resolution infrared measurements are available. A few schemes use just two infrared channels for day and night cloud detection. The most influential factor in the success of a particular technique is the determination of the thresholds for each cloud test. The techniques which perform the best usually have thresholds that are varied based on the geographic region, time of year, time of day and solar angle.

  20. Automated System Marketplace 1995: The Changing Face of Automation.

    ERIC Educational Resources Information Center

    Barry, Jeff; And Others

    1995-01-01

    Discusses trends in the automated system marketplace with specific attention to online vendors and their customers: academic, public, school, and special libraries. Presents vendor profiles; tables and charts on computer systems and sales; and sidebars that include a vendor source list and the differing views on procuring an automated library…

  1. Automated Hydrogen Gas Leak Detection System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Gencorp Aerojet Automated Hydrogen Gas Leak Detection System was developed through the cooperation of industry, academia, and the Government. Although the original purpose of the system was to detect leaks in the main engine of the space shuttle while on the launch pad, it also has significant commercial potential in applications for which there are no existing commercial systems. With high sensitivity, the system can detect hydrogen leaks at low concentrations in inert environments. The sensors are integrated with hardware and software to form a complete system. Several of these systems have already been purchased for use on the Ford Motor Company assembly line for natural gas vehicles. This system to detect trace hydrogen gas leaks from pressurized systems consists of a microprocessor-based control unit that operates a network of sensors. The sensors can be deployed around pipes, connectors, flanges, and tanks of pressurized systems where leaks may occur. The control unit monitors the sensors and provides the operator with a visual representation of the magnitude and locations of the leak as a function of time. The system can be customized to fit the user's needs; for example, it can monitor and display the condition of the flanges and fittings associated with the tank of a natural gas vehicle.

  2. A semi-automated system for the assessment of toxicity to cultured mammalian cells based on detection of changes in staining properties.

    PubMed

    Barer, M R; Mann, G F; Drasar, B S

    1986-01-01

    We have established a semi-automated microtiter-based system for the quantification of dye binding to cultured eukaryotic cells. This system has been applied to the quantitation of toxic activities that disrupt cell monolayers and their neutralization. We have used this background as a basis for developing a detection and characterization system for activities that do not cause such gross toxicity. A prototype system has been established based on three staining procedures which in broad terms assess cellular dehydrogenase activity, and protein, DNA, and RNA content. The activity of several agents affecting cyclic nucleotide metabolism, including cholera toxin, on the staining properties of exposed monolayers has been assessed. Several new categories of cellular response are readily discernible in this latter system indicating that biological activities may be identified on the basis of the pattern of such responses. Since microtiter based systems show considerable potential for automation, it is suggested that the further development of this approach could offer a realistic prospect for numerous forms of toxicity testing on an industrial scale.

  3. Automated detection of Karnal bunt teliospores

    SciTech Connect

    Linder, K.D.; Baumgart, C.; Creager, J.; Heinen, B.; Troupe, T.; Meyer, D.; Carr, J.; Quint, J.

    1998-02-01

    Karnal bunt is a fungal disease which infects wheat and, when present in wheat crops, renders it unsatisfactory for human consumption. Because Karnal bunt (KB) is difficult to detect in the field, samples are taken to laboratories where technicians use microscopes and methodically search for KB teliospores. AlliedSignal Federal Manufacturing and Technologies (FM and T), working with the Kansas Department of Agriculture, created a system which utilizes pattern recognition, feature extraction, and neural networks to prototype an automated detection system for identifying KB teliospores. System hardware consists of a biological compound microscope, motorized stage, CCD camera, frame grabber, and a PC. Integration of the system hardware with custom software comprises the machine vision system. Fundamental processing steps involve capturing an image from the slide while concurrently processing the previous image. Features extracted from the acquired imagery are then processed by a neural network classifier which has been trained to recognize spore-like objects. Images with spore-like objects are reviewed by trained technicians. Benefits of this system include: (1) reduction of the overall cycle-time; (2) utilization of technicians for intelligent decision making (vs. manual searching); (3) a regulatory standard which is quantifiable and repeatable; (4) guaranteed 100% coverage of the cover slip; and (5) significantly enhanced detection accuracy.

  4. Automated Detection of Activity Transitions for Prompting

    PubMed Central

    Feuz, Kyle D.; Cook, Diane J.; Rosasco, Cody; Robertson, Kayela; Schmitter-Edgecombe, Maureen

    2016-01-01

    Individuals with cognitive impairment can benefit from intervention strategies like recording important information in a memory notebook. However, training individuals to use the notebook on a regular basis requires a constant delivery of reminders. In this work, we design and evaluate machine learning-based methods for providing automated reminders using a digital memory notebook interface. Specifically, we identify transition periods between activities as times to issue prompts. We consider the problem of detecting activity transitions using supervised and unsupervised machine learning techniques, and find that both techniques show promising results for detecting transition periods. We test the techniques in a scripted setting with 15 individuals. Motion sensor data are recorded and annotated as participants perform a fixed set of activities. We also test the techniques in an unscripted setting with 8 individuals. Motion sensor data are recorded as participants go about their normal daily routine. In both the scripted and unscripted settings, a true positive rate of greater than 80% can be achieved while maintaining a false positive rate of less than 15%. On average, this leads to transitions being detected within 1 minute of a true transition for the scripted data and within 2 minutes of a true transition on the unscripted data. PMID:27019791
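
    A hedged sketch of the supervised variant of this idea: train a classifier on windowed motion-sensor features and report true/false positive rates. The synthetic features and the RandomForestClassifier below are stand-ins, not the features or models used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for windowed motion-sensor features:
# each row is a feature vector for one time window, label 1 = activity transition.
rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 1.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

tp = np.sum((pred == 1) & (y_te == 1))
fn = np.sum((pred == 0) & (y_te == 1))
fp = np.sum((pred == 1) & (y_te == 0))
tn = np.sum((pred == 0) & (y_te == 0))
print(f"true positive rate:  {tp / (tp + fn):.2f}")
print(f"false positive rate: {fp / (fp + tn):.2f}")
```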

  5. Automated detection of glaucoma using structural and non structural features.

    PubMed

    Salam, Anum A; Khalil, Tehmina; Akram, M Usman; Jameel, Amina; Basit, Imran

    2016-01-01

    Glaucoma is a chronic disease often called the "silent thief of sight," as it has no symptoms and, if not detected at an early stage, may cause permanent blindness. Glaucoma progression precedes some structural changes in the retina which aid ophthalmologists in detecting glaucoma at an early stage and stopping its progression. Fundoscopy is one of the biomedical imaging techniques used to analyze the internal structure of the retina. Our proposed technique provides a novel algorithm to detect glaucoma from a digital fundus image using a hybrid feature set. This paper proposes a novel combination of structural (cup to disc ratio) and non-structural (texture and intensity) features to improve the accuracy of automated diagnosis of glaucoma. The proposed method introduces a suspect class in automated diagnosis in case of any conflict in decision from structural and non-structural features. The evaluation of the proposed algorithm is performed using a local database containing fundus images from 100 patients. This system is designed to refer glaucoma cases from rural areas to specialists, and the motivation behind introducing the suspect class is to ensure high sensitivity of the proposed system. The average sensitivity and specificity of the proposed system are 100% and 87%, respectively. PMID:27652092
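
    The decision logic implied by the suspect class can be sketched as below; the cup-to-disc-ratio threshold and the two input decisions are illustrative assumptions, not the paper's actual feature pipeline.

```python
def glaucoma_decision(cup_to_disc_ratio, texture_says_glaucoma,
                      cdr_threshold=0.6):
    """Combine a structural cue (cup-to-disc ratio) with a non-structural
    classifier decision. Conflicting decisions yield a 'suspect' label so
    borderline cases are referred rather than silently dismissed."""
    structural_says_glaucoma = cup_to_disc_ratio > cdr_threshold
    if structural_says_glaucoma and texture_says_glaucoma:
        return "glaucoma"
    if not structural_says_glaucoma and not texture_says_glaucoma:
        return "healthy"
    return "suspect"

print(glaucoma_decision(0.72, True))    # -> glaucoma
print(glaucoma_decision(0.35, False))   # -> healthy
print(glaucoma_decision(0.72, False))   # -> suspect (conflict, refer to specialist)
```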

  6. Automated System for Early Breast Cancer Detection in Mammograms

    NASA Technical Reports Server (NTRS)

    Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.

    1993-01-01

    The increasing demand on mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, suggest an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications that are the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.

  7. Automated detection and location of structural degradation

    SciTech Connect

    Damiano, B.; Blakeman, E.D.; Phillips, L.D.

    1997-03-01

    The investigation of a diagnostic method for detecting and locating the source of structural degradation in mechanical systems is described in this paper. The diagnostic method uses a mathematical model of the mechanical system to define relationships between system parameters, such as spring rates and damping rates, and measurable spectral features, such as natural frequencies and mode shapes. These model-defined relationships are incorporated into a neural network, which is used to relate measured spectral features to system parameters. The diagnosis of the system's condition is performed by presenting the neural network with measured spectral features and comparing the system parameters estimated by the neural network to previously estimated values. Changes in the estimated system parameters indicate the location and severity of degradation in the mechanical system. The investigation applied the method by using computer-simulated data and data collected from a bench-top mechanical system. The effects of neural network training set size and composition on the accuracy of the model parameter estimates were investigated by using computer-simulated data. The results show that the diagnostic method can be applied to successfully locate and estimate the magnitude of structural changes in a mechanical system. The average error in the estimated spring rate values of the bench-top mechanical system was less than 10%. This degree of accuracy is sufficient to permit the use of this method for detecting and locating structural degradation in mechanical systems.
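
    A toy version of the idea, under strong simplifying assumptions: a regressor (here scikit-learn's MLPRegressor, not necessarily the network used in the study) is trained on model-generated pairs of spectral features and system parameters for a single spring-mass-damper, and degradation is then flagged as a change in the estimated spring rate.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

m = 10.0  # assumed mass (kg)

def spectral_features(k, c):
    """Natural frequency (Hz) and damping ratio of a single spring-mass-damper."""
    fn = np.sqrt(k / m) / (2.0 * np.pi)
    zeta = c / (2.0 * np.sqrt(k * m))
    return np.column_stack([fn, zeta])

# Model-generated training set over plausible parameter ranges;
# targets are scaled so the network trains on O(1) values.
rng = np.random.default_rng(2)
k_train = rng.uniform(5e4, 2e5, 5000)
c_train = rng.uniform(50.0, 500.0, 5000)
X = spectral_features(k_train, c_train)
Y = np.column_stack([k_train / 1e5, c_train / 100.0])

net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0),
).fit(X, Y)

def estimate_spring_rate(k_true, c_true):
    """Estimated spring rate recovered from 'measured' spectral features."""
    return net.predict(spectral_features(np.array([k_true]), np.array([c_true])))[0][0] * 1e5

baseline = estimate_spring_rate(1.0e5, 200.0)   # healthy structure
degraded = estimate_spring_rate(0.8e5, 200.0)   # spring rate dropped by 20%
print(f"estimated spring-rate change: {100 * (degraded - baseline) / baseline:.1f}%")
```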

  8. Automated Detection of Opaque Volcanic Plumes in Polar Satellite Data

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.

    2013-12-01

    Response to an explosive volcanic eruption is time-sensitive, so automated eruption detection techniques are essential to minimize alert times after an event. Automated detection of volcanic ash plumes in satellite imagery is usually done using a variant of the split-window or reverse-absorption method. This method is often effective but requires, among other things, that an ash plume be translucent to allow thermal radiation to pass through it. In the critical first hour or two of an eruption, plumes are most often opaque, and therefore cannot be detected by this method. It has been shown that an emergent plume appears as a sudden cold cloud over a volcano where a weather system should not appear, and this has been applied to geostationary data that is acquired every 15 to 30 minutes and will be an integral part of the upcoming geostationary mission, GOES-R. In this study, this concept is used on time-sequential polar-orbiting satellite data to detect emergent plumes. This augments geostationary data, and may detect smaller plumes at higher latitudes where geostationary data suffers from poorer spatial resolution. A series of weighted credits and demerits are used to determine the presence of an anomalously cold cloud over a volcano in time-sequential advanced very high resolution radiometer (AVHRR) data. Parameters such as coldest thermal infrared temperature, time between images, ratio of cold to background temperature, and temperature trend are assigned a weighted value, and a threshold is used to determine the presence of an anomalous cloud. The weighting and threshold are unique for each volcano due to weather conditions and satellite coverage. Using the 20-year archive of eruptions in the North Pacific at the Geophysical Institute of the University of Alaska Fairbanks, explosive eruptions were evaluated at Karymsky Volcano (1996), Pavlof Volcano (1996, 2007, 2013), Cleveland Volcano (1994, 2001, 2008), Shishaldin Volcano (1999), Augustine Volcano (2006), Fourpeaked
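
    A toy weighted credit/demerit score in the spirit of the scheme described above; the particular weights, cut-offs and threshold are invented for illustration and would in practice be tuned per volcano.

```python
def plume_score(coldest_bt, background_bt, minutes_since_last_image, bt_trend):
    """Toy weighted credit/demerit score for an anomalously cold cloud over a
    volcano in sequential thermal imagery. Weights and cut-offs are illustrative,
    not operational values."""
    score = 0.0
    if coldest_bt < background_bt - 15.0:          # much colder than surroundings
        score += 2.0
    if coldest_bt < 240.0:                         # absolutely cold cloud top
        score += 1.5
    if bt_trend < 0.0:                             # still cooling between images
        score += 1.0
    if minutes_since_last_image > 120.0:           # stale data: demerit
        score -= 1.0
    return score

THRESHOLD = 3.0  # would be a per-volcano threshold in a real system
print(plume_score(coldest_bt=225.0, background_bt=260.0,
                  minutes_since_last_image=40.0, bt_trend=-2.0) > THRESHOLD)
```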

  9. Detecting Unidentified Changes

    PubMed Central

    Howe, Piers D. L.; Webb, Margaret E.

    2014-01-01

    Does becoming aware of a change to a purely visual stimulus necessarily cause the observer to be able to identify or localise the change, or can change detection occur in the absence of identification or localisation? Several theories of visual awareness stress that we are aware of more than just the few objects to which we attend. In particular, it is clear that to some extent we are also aware of the global properties of the scene, such as the mean luminance or the distribution of spatial frequencies. It follows that we may be able to detect a change to a visual scene by detecting a change to one or more of these global properties. However, detecting a change to a global property may not supply us with enough information to accurately identify or localise which object in the scene has been changed. Thus, it may be possible to reliably detect the occurrence of changes without being able to identify or localise what has changed. Previous attempts to show that this can occur with natural images have produced mixed results. Here we use a novel analysis technique to provide additional evidence that changes can be detected in natural images without also being identified or localised. It is likely that this occurs by the observers monitoring the global properties of the scene. PMID:24454727

  10. Laboratory Detection of Respiratory Viruses by Automated Techniques

    PubMed Central

    Pérez-Ruiz, Mercedes; Pedrosa-Corral, Irene; Sanbonmatsu-Gámez, Sara; Navarro-Marí, José-María

    2012-01-01

    Advances in clinical virology for detecting respiratory viruses have been focused on nucleic acid amplification techniques, which have become the reference method for the diagnosis of acute respiratory infections of viral aetiology. Improvements of current commercial molecular assays to reduce hands-on time rely on two strategies: stepwise automation (semi-automation) and complete automation of the whole procedure. Contributions to the former strategy have been the use of automated nucleic acid extractors, multiplex PCR, real-time PCR and/or DNA arrays for detection of amplicons. Commercial fully automated molecular systems are now available for the detection of respiratory viruses. Some of them could become point-of-care methods, substituting for antigen tests in the detection of respiratory syncytial virus and influenza A and B viruses. This article describes laboratory methods for detection of respiratory viruses. A cost-effective and rational diagnostic algorithm is proposed, considering technical aspects of the available assays, the infrastructure of each laboratory, and the clinical and epidemiologic factors of the infection. PMID:23248735

  11. Automated multidimensional single molecule fluorescence microscopy feature detection and tracking.

    PubMed

    Rolfe, Daniel J; McLachlan, Charles I; Hirsch, Michael; Needham, Sarah R; Tynan, Christopher J; Webb, Stephen E D; Martin-Fernandez, Marisa L; Hobson, Michael P

    2011-10-01

    Characterisation of multi-protein interactions in cellular networks can be achieved by optical microscopy using multidimensional single molecule fluorescence imaging. Proteins of different species, individually labelled with a single fluorophore, can be imaged as isolated spots (features) of different colour light in different channels, and their diffusive behaviour in cells directly measured through time. Challenges in data analysis have, however, thus far hindered its application in biology. A set of methods for the automated analysis of multidimensional single molecule microscopy data from cells is presented, incorporating Bayesian segmentation-based feature detection, image registration and particle tracking. Single molecules of different colours can be simultaneously detected in noisy, high background data with an arbitrary number of channels, acquired simultaneously or time-multiplexed, and then tracked through time. The resulting traces can be further analysed, for example to detect intensity steps, count discrete intensity levels, measure fluorescence resonance energy transfer (FRET) or changes in polarisation. Examples are shown illustrating the use of the algorithms in investigations of the epidermal growth factor receptor (EGFR) signalling network, a key target for cancer therapeutics, and with simulated data.

  12. Automation: An Illustration of Social Change.

    ERIC Educational Resources Information Center

    Warnat, Winifred I.

    Advanced automation is significantly affecting American society and the individual. To understand the extent of this impact, an understanding of the country's service economy is necessary. The United States made the transition from a goods- to service-based economy shortly after World War II. In 1982, services generated 67% of the Gross National…

  13. A highly automated moving object detection package

    NASA Astrophysics Data System (ADS)

    Petit, J.-M.; Holman, M.; Scholl, H.; Kavelaars, J.; Gladman, B.

    2004-01-01

    With the deployment of large CCD mosaic cameras and their use in large-scale surveys to discover Solar system objects, there is a need for fast detection algorithms that can handle large data loads in a nearly automatic way. We present here an algorithm that we have developed. Our approach, by using two independent detection algorithms and combining the results, maintains high efficiency while producing low false-detection rates. These properties are crucial in order to reduce the operator time associated with searching these huge data sets. We have used this algorithm on two different mosaic data sets obtained using the CFH12K camera at the Canada-France-Hawaii Telescope (CFHT). Comparing the detection efficiency and false-detection rate of each individual algorithm with the combination of both, we show that our approach decreases the false detection rate by a factor of a few hundred to a thousand, while decreasing the `limiting magnitude' (where the detection rate drops to 50 per cent) by only 0.1-0.3 mag. The limiting magnitude is similar to that of a human operator blinking the images. Our full pipeline also characterizes the magnitude efficiency of the entire system by implanting artificial objects in the data set. The detection portion of the package is publicly available.

  14. Systems and Methods for Automated Water Detection Using Visible Sensors

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L. (Inventor); Matthies, Larry H. (Inventor); Bellutta, Paolo (Inventor)

    2016-01-01

    Systems and methods are disclosed that include automated machine vision that can utilize images of scenes captured by a 3D imaging system configured to image light within the visible light spectrum to detect water. One embodiment includes autonomously detecting water bodies within a scene including capturing at least one 3D image of a scene using a sensor system configured to detect visible light and to measure distance from points within the scene to the sensor system, and detecting water within the scene using a processor configured to detect regions within each of the at least one 3D images that possess at least one characteristic indicative of the presence of water.

  15. Automated RNA Extraction and Purification for Multiplexed Pathogen Detection

    SciTech Connect

    Bruzek, Amy K.; Bruckner-Lea, Cindy J.

    2005-01-01

    Pathogen detection has become an extremely important part of our nation's defense in this post-9/11 world, where the threat of bioterrorist attacks is a grim reality. When a biological attack takes place, response time is critical. The faster the biothreat is assessed, the faster countermeasures can be put in place to protect the health of the general public. Today, some of the most widely used methods for detecting pathogens are either time-consuming or not reliable [1]. Therefore, a method that can detect multiple pathogens that is inherently reliable, rapid, automated and field-portable is needed. To that end, we are developing automated fluidics systems for the recovery, cleanup, and direct labeling of community RNA from suspect environmental samples. The advantage of using RNA for detection is that there are multiple copies of mRNA in a cell, whereas there are normally only one or two copies of DNA [2]. Because there are multiple copies of mRNA in a cell for highly expressed genes, no amplification of the genetic material may be necessary, and thus rapid and direct detection of only a few cells may be possible [3]. This report outlines the development of both manual and automated methods for the extraction and purification of mRNA. The methods were evaluated using cell lysates from Escherichia coli 25922 (nonpathogenic), Salmonella typhimurium (pathogenic), and Shigella spp (pathogenic). Automated RNA purification was achieved using a custom sequential injection fluidics system consisting of a syringe pump, a multi-port valve and a magnetic capture cell. mRNA was captured using silica-coated superparamagnetic beads that were trapped in the tubing by a rare-earth magnet. RNA was detected by gel electrophoresis and/or by hybridization of the RNA to microarrays. The versatility of the fluidics systems and the ability to automate these systems allows for quick and easy processing of samples and eliminates the need for an experienced operator.

  16. Automated Detection of Solar Loops by the Oriented Connectivity Method

    NASA Technical Reports Server (NTRS)

    Lee, Jong Kwan; Newman, Timothy S.; Gary, G. Allen

    2004-01-01

    An automated technique to segment solar coronal loops from intensity images of the Sun's corona is introduced. It exploits physical characteristics of the solar magnetic field to enable robust extraction from noisy images. The technique is a constructive curve detection approach, constrained by collections of estimates of the magnetic field's orientation. Its effectiveness is evaluated through experiments on synthetic and real coronal images.

  17. Defect Prevention and Detection in Software for Automated Test Equipment

    SciTech Connect

    E. Bean

    2006-11-30

    Software for automated test equipment can be tedious and monotonous, making it just as error-prone as other software. Active defect prevention and detection are also important for test applications. Incomplete or unclear requirements, a cryptic syntax used for some test applications (especially script-based test sets), variability in syntax or structure, and changing requirements are among the problems encountered in one tester. Such problems are common to all software but can be particularly problematic in test equipment software intended to test another product. Each of these issues increases the probability of error injection during test application development. This report describes a test application development tool designed to address these issues and others for a particular piece of test equipment. By addressing these problems in the development environment, the tool has powerful built-in defect prevention and detection capabilities. Regular expressions are widely used in the development tool as a means of formally defining test equipment requirements for the test application and verifying conformance to those requirements. A novel means of using regular expressions to perform range checking was developed. A reduction in rework and increased productivity are the results. These capabilities are described along with lessons learned and their applicability to other test equipment software. The test application development tool, or “application builder”, is known as the PT3800 AM Creation, Revision and Archiving Tool (PACRAT).
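
    The report does not give its exact patterns, but one plausible way to perform range checking purely with a regular expression, as the tool is said to do, is to enumerate the digit patterns of the allowed interval; the 0-255 range below is only an example, not a PACRAT requirement.

```python
import re

# One way to range-check an integer field purely with a regular expression:
# accept 0-255 (an interval chosen only for illustration).
RANGE_0_255 = re.compile(r"^(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])$")

for value in ["0", "42", "199", "255", "256", "-1", "07x"]:
    ok = bool(RANGE_0_255.match(value))
    print(f"{value!r}: {'in range' if ok else 'rejected'}")
```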

  18. Automated fetal spine detection in ultrasound images

    NASA Astrophysics Data System (ADS)

    Tolay, Paresh; Vajinepalli, Pallavi; Bhattacharya, Puranjoy; Firtion, Celine; Sisodia, Rajendra Singh

    2009-02-01

    In this paper, a novel method is proposed for the automatic detection of the fetal spine and its orientation in ultrasound images. This problem presents a variety of challenges, including robustness to speckle noise, variations in the visible shape of the spine due to the orientation of the ultrasound probe with respect to the fetus, and the lack of a proper edge enclosing the entire spine on account of its composition of distinct vertebrae. The proposed method improves robustness and accuracy by making use of two independent techniques to estimate the spine, and then detects the exact location using a cross-correlation approach. Experimental results show that the proposed method is promising for fetal spine detection.

  19. Automated Detection of Stereotypical Motor Movements

    ERIC Educational Resources Information Center

    Goodwin, Matthew S.; Intille, Stephen S.; Albinali, Fahd; Velicer, Wayne F.

    2011-01-01

    To overcome problems with traditional methods for measuring stereotypical motor movements in persons with Autism Spectrum Disorders (ASD), we evaluated the use of wireless three-axis accelerometers and pattern recognition algorithms to automatically detect body rocking and hand flapping in children with ASD. Findings revealed that, on average,…

  20. Automated Human Screening for Detecting Concealed Knowledge

    ERIC Educational Resources Information Center

    Twyman, Nathan W.

    2012-01-01

    Screening individuals for concealed knowledge has traditionally been the purview of professional interrogators investigating a crime. But the ability to detect when a person is hiding important information would be of high value to many other fields and functions. This dissertation proposes design principles for and reports on an implementation…

  1. Automated detection of geomagnetic storms with heightened risk of GIC

    NASA Astrophysics Data System (ADS)

    Bailey, Rachel L.; Leonhardt, Roman

    2016-06-01

    Automated detection of geomagnetic storms is of growing importance to operators of technical infrastructure (e.g., power grids, satellites), which is susceptible to damage caused by the consequences of geomagnetic storms. In this study, we compare three methods for automated geomagnetic storm detection: a method analyzing the first derivative of the geomagnetic variations, another looking at the Akaike information criterion, and a third using multi-resolution analysis of the maximal overlap discrete wavelet transform of the variations. These detection methods are used in combination with an algorithm for the detection of coronal mass ejection shock fronts in ACE solar wind data prior to the storm arrival on Earth as an additional constraint for possible storm detection. The maximal overlap discrete wavelet transform is found to be the most accurate of the detection methods. The final storm detection software, implementing analysis of both satellite solar wind and geomagnetic ground data, detects 14 of 15 more powerful geomagnetic storms over a period of 2 years.
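
    Of the three detectors compared, the first-derivative method is the simplest to sketch; the synthetic data, sampling interval and nT-per-minute threshold below are assumptions for illustration (the wavelet-based detector found to be most accurate needs considerably more machinery).

```python
import numpy as np

def derivative_detector(h_component, dt_seconds, threshold_nt_per_min):
    """Flag samples where |dH/dt| exceeds a threshold expressed in nT/min."""
    dhdt = np.gradient(h_component, dt_seconds) * 60.0   # nT per minute
    return np.abs(dhdt) > threshold_nt_per_min

# Synthetic one-minute data: quiet field with a sudden 60 nT drop at minute 300.
rng = np.random.default_rng(3)
n, dt = 600, 60.0
h = 21000.0 + rng.normal(0.0, 0.5, n)
h[300:] -= 60.0

flags = derivative_detector(h, dt, threshold_nt_per_min=5.0)
print("storm flagged near minute:", int(np.argmax(flags)))
```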

  2. Method and automated apparatus for detecting coliform organisms

    NASA Technical Reports Server (NTRS)

    Dill, W. P.; Taylor, R. E.; Jeffers, E. L. (Inventor)

    1980-01-01

    Method and automated apparatus are disclosed for determining the time of detection of metabolically produced hydrogen by coliform bacteria cultured in an electroanalytical cell from the time the cell is inoculated with the bacteria. The detection time data provides bacteria concentration values. The apparatus is sequenced and controlled by a digital computer to discharge a spent sample, clean and sterilize the culture cell, provide a bacteria nutrient into the cell, control the temperature of the nutrient, inoculate the nutrient with a bacteria sample, measure the electrical potential difference produced by the cell, and measure the time of detection from inoculation.

  3. Automated detection, characterization, and tracking of filaments from SDO data

    NASA Astrophysics Data System (ADS)

    Buchlin, Eric; Vial, Jean-Claude; Mercier, Claude

    2016-07-01

    Thanks to the cadence and continuity of AIA and HMI observations, SDO offers unique data for detecting, characterizing, and tracking solar filaments until their eruptions, which are often associated with coronal mass ejections. Because of the short latency required for space weather applications, and because of the large data volume, only an automated detection is practical. We present the code "FILaments, Eruptions, and Activations detected from Space" (FILEAS) that we have developed for the automated detection and tracking of filaments. Detections are based on the analysis of AIA 30.4 nm He II images and on the magnetic polarity inversion lines derived from HMI. Following the tracking of filaments as they rotate with the Sun, filament characteristics are computed and a database of filament parameters is built. We present the algorithms and performance of the code, and we compare its results with the filaments detected in Hα and already present in the Heliophysics Events Knowledgebase. We finally discuss the possibility of using such a code to detect eruptions in real time.

  4. Towards an Automated Acoustic Detection System for Free Ranging Elephants

    PubMed Central

    Zeppelzauer, Matthias; Hensman, Sean; Stoeger, Angela S.

    2015-01-01

    The human-elephant conflict is one of the most serious conservation problems in Asia and Africa today. The involuntary confrontation of humans and elephants claims the lives of many animals and humans every year. A promising approach to alleviate this conflict is the development of an acoustic early warning system. Such a system requires the robust automated detection of elephant vocalizations under unconstrained field conditions. Today, no system exists that fulfills these requirements. In this paper, we present a method for the automated detection of elephant vocalizations that is robust to the diverse noise sources present in the field. We evaluate the method on a dataset recorded under natural field conditions to simulate a real-world scenario. The proposed method outperformed existing approaches and robustly and accurately detected elephants. It thus can form the basis for a future automated early warning system for elephants. Furthermore, the method may be a useful tool for scientists in bioacoustics for the study of wildlife recordings. PMID:25983398

  5. Automated Imaging Techniques for Biosignature Detection in Geologic Samples

    NASA Astrophysics Data System (ADS)

    Williford, K. H.

    2015-12-01

    Robust biosignature detection in geologic samples typically requires the integration of morphological/textural data with biogeochemical data across a variety of scales. We present new automated imaging and coordinated biogeochemical analysis techniques developed at the JPL Astrobiogeochemistry Laboratory (abcLab) in support of biosignature detection in terrestrial samples as well as those that may eventually be returned from Mars. Automated gigapixel mosaic imaging of petrographic thin sections in transmitted and incident light (including UV epifluorescence) is supported by a microscopy platform with a digital XYZ stage. Images are acquired, processed, and co-registered using multiple software platforms at JPL and can be displayed and shared using Gigapan, a freely available, web-based toolset. Automated large area (cm-scale) elemental mapping at sub-micrometer spatial resolution is enabled by a variable-pressure scanning electron microscope (SEM) with a large (150 mm2) silicon drift energy dispersive spectroscopy (EDS) detector system. The abcLab light and electron microscopy techniques are augmented by additional elemental chemistry, mineralogy and organic detection/classification using laboratory Micro-XRF and UV Raman/fluorescence systems, precursors to the PIXL and SHERLOC instrument platforms selected for flight on the NASA Mars 2020 rover mission. A workflow including careful sample preparation followed by iterative gigapixel imaging, SEM/EDS, Micro-XRF and UV fluorescence/Raman in support of organic, mineralogic, and elemental biosignature target identification and follow-up analysis with other techniques including secondary ion mass spectrometry (SIMS) will be discussed.

  6. An Automated Cloud-edge Detection Algorithm Using Cloud Physics and Radar Data

    NASA Technical Reports Server (NTRS)

    Ward, Jennifer G.; Merceret, Francis J.; Grainger, Cedric A.

    2003-01-01

    An automated cloud edge detection algorithm was developed and extensively tested. The algorithm uses in-situ cloud physics data measured by a research aircraft coupled with ground-based weather radar measurements to determine whether the aircraft is in or out of cloud. Cloud edges are determined when the in/out state changes, subject to a hysteresis constraint. The hysteresis constraint prevents isolated transient cloud puffs or data dropouts from being identified as cloud boundaries. The algorithm was verified by detailed manual examination of the data set in comparison to the results from application of the automated algorithm.
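
    A minimal sketch of an in/out-of-cloud decision with a hysteresis (persistence) constraint of the kind described above, assuming a liquid-water-content threshold and persistence length chosen purely for illustration.

```python
def cloud_edges(lwc_series, in_threshold=0.05, persistence=3):
    """Return indices where the in-cloud state changes, requiring the raw in/out
    indication to persist for several consecutive samples before a transition is
    accepted (simple hysteresis against transient puffs and data dropouts)."""
    state = lwc_series[0] > in_threshold
    run = 0
    edges = []
    for i, lwc in enumerate(lwc_series):
        candidate = lwc > in_threshold
        run = run + 1 if candidate != state else 0
        if run >= persistence:                   # the change has persisted long enough
            state = candidate
            run = 0
            edges.append(i - persistence + 1)    # report where the new state began
    return edges

# Toy liquid-water-content trace (g/m^3) with one transient puff and one real cloud.
lwc = [0.0, 0.0, 0.2, 0.0, 0.0, 0.0, 0.3, 0.4, 0.5, 0.4, 0.3, 0.0, 0.0, 0.0]
print(cloud_edges(lwc))   # the single-sample puff at index 2 is ignored
```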

  7. Automated choroidal neovascularization detection algorithm for optical coherence tomography angiography

    PubMed Central

    Liu, Li; Gao, Simon S.; Bailey, Steven T.; Huang, David; Li, Dengwang; Jia, Yali

    2015-01-01

    Optical coherence tomography angiography has recently been used to visualize choroidal neovascularization (CNV) in participants with age-related macular degeneration. Identification and quantification of CNV area is important clinically for disease assessment. An automated algorithm for CNV area detection is presented in this article. It relies on denoising and a saliency detection model to overcome issues such as projection artifacts and the heterogeneity of CNV. Qualitative and quantitative evaluations were performed on scans of 7 participants. Results from the algorithm agreed well with manual delineation of CNV area. PMID:26417524

  8. An Automated Motion Detection and Reward System for Animal Training

    PubMed Central

    Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F

    2015-01-01

    A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an “off-the-shelf” automated training system that suited our needs.  We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use. PMID:26798573

  9. An Automated Motion Detection and Reward System for Animal Training.

    PubMed

    Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F; Black, Kevin J

    2015-12-04

    A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an "off-the-shelf" automated training system that suited our needs.  We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use.

  10. Automated Feature Detection and Solar Flare Prediction Using SDO Data

    NASA Astrophysics Data System (ADS)

    Qahwaji, Rami; Ahmed, Omar; Colak, Tufan

    The importance of real-time processing of solar data, particularly for space weather applications, is increasing continuously, especially with the launch of SDO, which will provide several times more data than previous solar satellites. In this paper, we will show the initial results of applying our Automated Solar Activity Prediction (ASAP) system for the short-term prediction of significant solar flares to SDO data. This automated system is currently working in real-time mode with SOHO/MDI images and its results are available online (http://spaceweather.inf.brad.ac.uk/) whenever a new solar image is available. The system integrates image processing and machine learning to deliver these predictions. A machine learning-based component is designed to analyse years of sunspot and flare data to extract knowledge and to create associations that can be represented using computer-based learning rules. An imaging-based real-time component provides automated detection, grouping and classification of recent sunspots based on the McIntosh classification and is integrated within this system. The feature detections and flare predictions of ASAP using SOHO data will be compared to those of ASAP using SDO data and presented in this paper.

  11. SAR change detection MTI

    NASA Astrophysics Data System (ADS)

    Scarborough, Steven; Lemanski, Christopher; Nichols, Howard; Owirka, Gregory; Minardi, Michael; Hale, Todd

    2006-05-01

    This paper examines the theory, application, and results of using single-channel synthetic aperture radar (SAR) data with Moving Reference Processing (MRP) to focus and geolocate moving targets. Moving targets within a standard SAR imaging scene are defocused, displaced, or completely missing in the final image. Building on previous research at AFRL, the SAR-MRP method focuses and geolocates moving targets by reprocessing the SAR data to focus the movers rather than the stationary clutter. SAR change detection is used so that target detection and focusing is performed more robustly. In the cases where moving target returns possess the same range versus slow-time histories, a geolocation ambiguity results. This ambiguity can be resolved in a number of ways. This paper concludes by applying the SAR-MRP method to high-frequency radar measurements from persistent continuous-dwell SAR observations of a moving target.

  12. Automated edge detection versus manual edge measurement in analysis of brachial artery reactivity: a comparison study.

    PubMed

    Williamson, Eric B; Bronas, Ulf G; Dengel, Donald R

    2008-09-01

    High resolution ultrasound, combined with computer imaging technology, is commonly used to measure changes in brachial artery diameter for the determination of endothelial-dependent vasodilation (EDD) and endothelial independent-vasodilation (EID). Currently, two methods of computerized edge-detection systems are in use to measure changes in artery diameter. One system involves the sonographer manually tracking the artery walls while the second system involves a computer automated edge-detection system that automatically tracks the artery wall. The purpose of this study was to compare the two types of computerized edge-detection systems for measuring vascular function and structure. One hundred fifty (female = 70, male = 80) participants agreed to participate. Baseline brachial diameter, carotid intima-medial thickness (cIMT), EDD and EID were measured by the two computerized edge-detection systems utilizing the same ultrasound B-mode image. Mean values (+/-standard error) for baseline diameter, cIMT, EDD and EID were 3.53 (+/-0.10) mm, 0.43 (+/-0.01) mm, 5.72 (+/-0.20)% and 22.17 (+/-0.60)%, respectively for the manual edge-detection software system. Mean values for baseline diameter, cIMT, EDD and EID were 3.59 (+/-0.10) mm, 0.44 (+/-0.01) mm, 7.33 (+/-0.30)% and 25.77 (+/-0.60)%, respectively for the automated edge-detection software system. Bland-Altman plots displayed large variations between the two edge-detection methods for assessing cIMT and changes in artery diameter following brachial EDD and EID. The results of the study demonstrate that manual and automated computerized edge-detection systems track dynamic changes in brachial artery diameter and cIMT measures differently. Therefore, caution should be used when comparing research utilizing different computerized edge-detection systems for measuring vascular function and structure.
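
    The agreement analysis used here (Bland-Altman) is easy to reproduce in outline; the sketch below computes the bias and 95% limits of agreement for synthetic paired measurements standing in for the manual and automated edge-detection systems (the numbers are not the study's data).

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Synthetic paired EDD (%) values standing in for the manual vs. automated systems.
rng = np.random.default_rng(4)
manual = rng.normal(5.7, 2.0, 150)
automated = manual + rng.normal(1.6, 1.5, 150)   # systematic offset plus scatter

bias, lo, hi = bland_altman(automated, manual)
print(f"bias: {bias:.2f}%, 95% limits of agreement: [{lo:.2f}%, {hi:.2f}%]")
```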

  13. Detecting and predicting changes.

    PubMed

    Brown, Scott D; Steyvers, Mark

    2009-02-01

    When required to predict sequential events, such as random coin tosses or basketball free throws, people reliably use inappropriate strategies, such as inferring temporal structure when none is present. We investigate the ability of observers to predict sequential events in dynamically changing environments, where there is an opportunity to detect true temporal structure. In two experiments we demonstrate that participants often make correct statistical decisions when asked to infer the hidden state of the data generating process. However, when asked to make predictions about future outcomes, accuracy decreased even though normatively correct responses in the two tasks were identical. A particle filter model accounts for all data, describing performance in terms of a plausible psychological process. By varying the number of particles, and the prior belief about the probability of a change occurring in the data generating process, we were able to model most of the observed individual differences.
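
    A minimal bootstrap particle filter in the spirit of the model described (not the authors' implementation): particles are candidate Bernoulli rates, each particle may jump to a new rate with a small change probability, and particles are reweighted and resampled after every observation.

```python
import numpy as np

def particle_filter(observations, n_particles=200, change_prob=0.05, rng=None):
    """Track the hidden success rate of a Bernoulli sequence that occasionally
    jumps to a new value. Returns the filtered mean rate at each step."""
    rng = rng or np.random.default_rng(0)
    rates = rng.uniform(0.0, 1.0, n_particles)          # particles = candidate rates
    means = []
    for y in observations:
        # With small probability each particle's rate is redrawn (a change point).
        jump = rng.random(n_particles) < change_prob
        rates = np.where(jump, rng.uniform(0.0, 1.0, n_particles), rates)
        # Reweight by the Bernoulli likelihood of the new observation and resample.
        w = rates if y == 1 else 1.0 - rates
        w = w / w.sum()
        rates = rng.choice(rates, size=n_particles, p=w)
        means.append(rates.mean())
    return np.array(means)

# Data generated with a change point: rate 0.2 for 50 trials, then 0.8.
rng = np.random.default_rng(5)
obs = np.concatenate([rng.random(50) < 0.2, rng.random(50) < 0.8]).astype(int)
est = particle_filter(obs, rng=np.random.default_rng(6))
print(f"estimated rate before change: {est[45]:.2f}, after change: {est[-1]:.2f}")
```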

  14. Eclipsing binaries in the Gaia era: automated detection performance

    NASA Astrophysics Data System (ADS)

    Holl, Berry; Mowlavi, Nami; Lecoeur-Taïbi, Isabelle; Geneva Gaia CU7 Team members

    2014-09-01

    Binary systems can have periods from a fraction of a day to several years and exist in a large range of possible configurations at various evolutionary stages. About 2% of them are oriented such that eclipses can be observed. Such observations provide unique opportunities for the determination of their orbital and stellar parameters. Large-scale multi-epoch photometric surveys produce large sets of eclipsing binaries that allow for statistical studies of binary systems. In this respect the ESA Gaia mission, launched in December 2013, is expected to deliver an unprecedented sample of millions of eclipsing binaries. Their detection from Gaia photometry and estimation of their orbital periods are essential for their subclassification and orbital and stellar parameter determination. For a subset of these eclipsing systems, Gaia radial velocities and astrometric orbital measurements will further complement the Gaia light curves. A key challenge of the detection and period determination of the expected millions of Gaia eclipsing binaries is the automation of the procedure. Such an automated pipeline is being developed within the Gaia Data Processing Analysis Consortium, in the framework of automated detection and identification of various types of photometric variable objects. In this poster we discuss the performance of this pipeline on eclipsing binaries using simulated Gaia data and the existing Hipparcos data. We show that we can detect a wide range of binary systems and very often determine their orbital periods from photometry alone, even though the data sampling is relatively sparse. The results can further be improved for those objects for which spectroscopic and/or astrometric orbital measurements will also be available from Gaia.

  15. Automated J wave detection from digital 12-lead electrocardiogram.

    PubMed

    Wang, Yi Grace; Wu, Hau-Tieng; Daubechies, Ingrid; Li, Yabing; Estes, E Harvey; Soliman, Elsayed Z

    2015-01-01

    In this report we provide a method for automated detection of the J wave, defined as a notch or slur in the descending slope of the terminal positive wave of the QRS complex, using signal processing and functional data analysis techniques. Two different sets of ECG tracings were selected from the EPICARE ECG core laboratory, Wake Forest School of Medicine, Winston-Salem, NC. The first set was a training set comprising 100 ECGs, of which 50 had a J wave and the other 50 did not. The second set was a test set (n=116 ECGs) in which the J-wave status (present/absent) was only known by the ECG Center staff. All ECGs were recorded using GE MAC 1200 (GE Marquette, Milwaukee, Wisconsin) at 10 mm/mV calibration, a speed of 25 mm/s and a 500 Hz sampling rate. All ECGs were initially inspected visually for technical errors and inadequate quality, and then automatically processed with the GE Marquette 12-SL program 2001 version (GE Marquette, Milwaukee, WI). We excluded ECG tracings with major abnormalities or rhythm disorder. Confirmation of the presence or absence of a J wave was done visually by the ECG Center staff and verified once again by three of the coauthors. There was no disagreement in the identification of the J wave state. The signal processing and functional data analysis techniques applied to the ECGs were conducted at Duke University and the University of Toronto. In the training set, the automated detection had sensitivity of 100% and specificity of 94%. For the test set, sensitivity was 89% and specificity was 86%. In conclusion, test results of the automated method we developed show a good J wave detection accuracy, suggesting possible utility of this approach for defining and detecting other complex ECG waveforms.

  16. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy.

    PubMed

    Pociask, Elżbieta; Jaworek-Korjakowska, Joanna; Malinowski, Krzysztof Piotr; Roleder, Tomasz; Wojakowski, Wojciech

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, thus triggering the need to develop a new tool, NIRS-IVUS, that can visualize plaque characteristics in terms of their chemical and morphologic character. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index. Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods used. Conclusions. The proposed algorithm performs fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could be easily augmented with new functions and projects. PMID:27610191
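
    As a rough sketch of the final stage only: the Lipid Core Burden Index is conventionally the number of lipid-flagged pixels per 1000 valid chemogram pixels, and the block metrics are its maximum over a sliding longitudinal window. The chemogram geometry and 0.5 mm frame spacing below are assumptions, not values from the paper.

```python
import numpy as np

def lcbi(lipid_mask, valid_mask):
    """Lipid Core Burden Index: lipid pixels per 1000 valid chemogram pixels."""
    valid = valid_mask.sum()
    return 1000.0 * (lipid_mask & valid_mask).sum() / valid if valid else 0.0

def max_lcbi_in_window(lipid_mask, valid_mask, mm_per_frame, window_mm=4.0):
    """Maximum LCBI over a sliding longitudinal window (e.g. 4 mm blocks)."""
    frames_per_window = max(1, int(round(window_mm / mm_per_frame)))
    best = 0.0
    for start in range(lipid_mask.shape[0] - frames_per_window + 1):
        sl = slice(start, start + frames_per_window)
        best = max(best, lcbi(lipid_mask[sl], valid_mask[sl]))
    return best

# Toy chemogram: 100 longitudinal frames x 360 angular bins, 0.5 mm per frame.
valid = np.ones((100, 360), dtype=bool)
lipid = np.zeros((100, 360), dtype=bool)
lipid[40:50, 90:270] = True                      # a 5 mm lipid-rich segment
print(f"total LCBI: {lcbi(lipid, valid):.0f}")
print(f"max LCBI in 4 mm blocks: {max_lcbi_in_window(lipid, valid, 0.5):.0f}")
```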

  17. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    PubMed Central

    Wojakowski, Wojciech

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, thus triggering the need to develop a new tool, NIRS-IVUS, that can visualize plaque characteristics in terms of their chemical and morphologic character. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index. Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods used. Conclusions. The proposed algorithm performs fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could be easily augmented with new functions and projects. PMID:27610191

  18. Glaucoma risk index: automated glaucoma detection from color fundus images.

    PubMed

    Bock, Rüdiger; Meier, Jörg; Nyúl, László G; Hornegger, Joachim; Michelson, Georg

    2010-06-01

    Glaucoma, a neurodegeneration of the optic nerve, is one of the most common causes of blindness. Because revitalization of the degenerated nerve fibers of the optic nerve is impossible, early detection of the disease is essential. This can be supported by robust and automated mass screening. We propose a novel automated glaucoma detection system that operates on inexpensive-to-acquire and widely used digital color fundus images. After glaucoma-specific preprocessing, different generic feature types are compressed by an appearance-based dimension reduction technique. Subsequently, a probabilistic two-stage classification scheme combines these feature types to extract the novel Glaucoma Risk Index (GRI) that shows a reasonable glaucoma detection performance. On a sample set of 575 fundus images, a classification accuracy of 80% has been achieved in a 5-fold cross-validation setup. The GRI gains a competitive area under the ROC curve (AUC) of 88% compared to the established topography-based glaucoma probability score of scanning laser tomography with an AUC of 87%. The proposed color fundus image-based GRI achieves a competitive and reliable detection performance on a low-priced modality by the statistical analysis of entire images of the optic nerve head.

  19. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    PubMed Central

    Wojakowski, Wojciech

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, thus triggering the need to develop a new tool, NIRS-IVUS, that can visualize plaque characteristics in terms of their chemical and morphologic character. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index. Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods used. Conclusions. The proposed algorithm performs fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could be easily augmented with new functions and projects.

  20. An Automated Directed Spectral Search Methodology for Small Target Detection

    NASA Astrophysics Data System (ADS)

    Grossman, Stanley I.

    Much of the current effort in remote sensing tackles macro-level problems such as determining the extent of wheat in a field, the general health of vegetation or the extent of mineral deposits in an area. However, for many of the remaining remote sensing challenges being studied currently, such as border protection, drug smuggling, treaty verification, and the war on terror, most targets are very small in nature - a vehicle or even a person. While in typical macro-level problems the objective vegetation is in the scene, for small target detection problems it is not usually known if the desired small target even exists in the scene, never mind finding it in abundance. The ability to find specific small targets, such as vehicles, typifies this problem. Complicating the analyst's life, the growing number of available sensors is generating mountains of imagery, outstripping the analysts' ability to visually peruse them. This work presents the important factors influencing spectral exploitation using multispectral data and suggests a different approach to small target detection. The methodology of directed search is presented, including the use of scene-modeled spectral libraries, various search algorithms, and traditional statistical and ROC curve analysis. The work suggests a new metric to calibrate analysis, labeled the analytic sweet spot, as well as an estimation method for identifying the sweet spot threshold for an image. It also suggests a new visualization aid for highlighting the target in its entirety, called nearest neighbor inflation (NNI). It brings these all together to propose that these additions to the target detection arena allow for the construction of a fully automated target detection scheme. This dissertation next details experiments to support the hypothesis that the optimum detection threshold is the analytic sweet spot and that the estimation method adequately predicts it. Experimental results and analysis are presented for the proposed directed

  1. Change detection: training and transfer.

    PubMed

    Gaspar, John G; Neider, Mark B; Simons, Daniel J; McCarley, Jason S; Kramer, Arthur F

    2013-01-01

    Observers often fail to notice even dramatic changes to their environment, a phenomenon known as change blindness. If training could enhance change detection performance in general, then it might help to remedy some real-world consequences of change blindness (e.g. failing to detect hazards while driving). We examined whether adaptive training on a simple change detection task could improve the ability to detect changes in untrained tasks for young and older adults. Consistent with an effective training procedure, both young and older adults were better able to detect changes to trained objects following training. However, neither group showed differential improvement on untrained change detection tasks when compared to active control groups. Change detection training led to improvements on the trained task but did not generalize to other change detection tasks.

  2. Change Detection: Training and Transfer

    PubMed Central

    Gaspar, John G.; Neider, Mark B.; Simons, Daniel J.; McCarley, Jason S.; Kramer, Arthur F.

    2013-01-01

    Observers often fail to notice even dramatic changes to their environment, a phenomenon known as change blindness. If training could enhance change detection performance in general, then it might help to remedy some real-world consequences of change blindness (e.g. failing to detect hazards while driving). We examined whether adaptive training on a simple change detection task could improve the ability to detect changes in untrained tasks for young and older adults. Consistent with an effective training procedure, both young and older adults were better able to detect changes to trained objects following training. However, neither group showed differential improvement on untrained change detection tasks when compared to active control groups. Change detection training led to improvements on the trained task but did not generalize to other change detection tasks. PMID:23840775

  3. An Automated Visual Event Detection System for Cabled Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Mariette, J.

    2007-12-01

    The permanent presence of underwater cameras on oceanic cabled observatories, such as the Victoria Experimental Network Under the Sea (VENUS) and Eye-In-The-Sea (EITS) on the Monterey Accelerated Research System (MARS), will generate valuable data that can push forward the boundaries of our understanding of the underwater world. However, sightings of underwater animal activity are rare, so many hours of video are recorded with relatively few events of interest. The burden of video management and analysis often requires reducing the amount of video recorded and later analyzed. Often there are not enough human resources to analyze the video, and the strain that video analysis places on human attention demands an automated way to assist in the task. Towards this end, an Automated Visual Event Detection System (AVED) is in development at the Monterey Bay Aquarium Research Institute (MBARI) to address the problem of analyzing cabled observatory video. Here we describe the overall design of the system to process video data and enable science users to analyze the results. We present our results analyzing video from the VENUS observatory and test data from EITS deployments. This automated system for detecting visual events includes a collection of custom and open source software that can be run three ways: through a Web Service, through a Condor-managed pool of AVED-enabled compute servers, or locally on a single computer. The collection of software also includes a graphical user interface to preview or edit detected results and to set up processing options. To optimize the compute-intensive AVED algorithms, a parallel program has been implemented for high-data-rate applications like the EITS instrument on MARS.

  4. Automated sleep scoring and sleep apnea detection in children

    NASA Astrophysics Data System (ADS)

    Baraglia, David P.; Berryman, Matthew J.; Coussens, Scott W.; Pamula, Yvonne; Kennedy, Declan; Martin, A. James; Abbott, Derek

    2005-12-01

    This paper investigates the automated detection of a patient's breathing rate and heart rate from their skin conductivity as well as sleep stage scoring and breathing event detection from their EEG. The software developed for these tasks is tested on data sets obtained from the sleep disorders unit at the Adelaide Women's and Children's Hospital. The sleep scoring and breathing event detection tasks used neural networks to achieve signal classification. The Fourier transform and the Higuchi fractal dimension were used to extract features for input to the neural network. The filtered skin conductivity appeared visually to bear a similarity to the breathing and heart rate signal, but a more detailed evaluation showed the relation was not consistent. Sleep stage classification was achieved with an accuracy of around 65%, with some stages being accurately scored and others poorly scored. The two breathing events, hypopnea and apnea, were scored with varying degrees of accuracy, with the highest scores being around 75% and 30%, respectively.
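
    The Higuchi fractal dimension used here as an EEG feature has a standard formulation that is independent of this paper; a compact sketch of it (the k_max value is an illustrative choice):

```python
import numpy as np

def higuchi_fd(x: np.ndarray, k_max: int = 8) -> float:
    """Estimate the Higuchi fractal dimension of a 1-D signal x."""
    n = len(x)
    ks = np.arange(1, k_max + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)              # subsampled index set
            if len(idx) < 2:
                continue
            # mean absolute increment of the subsampled curve, rescaled
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / (len(idx) - 1) / k
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    # slope of log(L(k)) versus log(1/k) estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

# Example: a smooth sine has a dimension near 1, white noise near 2.
t = np.linspace(0, 4 * np.pi, 1000)
print(higuchi_fd(np.sin(t)), higuchi_fd(np.random.default_rng(0).normal(size=1000)))
```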

  5. Automated Vulnerability Detection for Compiled Smart Grid Software

    SciTech Connect

    Prowell, Stacy J; Pleszkoch, Mark G; Sayre, Kirk D; Linger, Richard C

    2012-01-01

    While testing performed with proper experimental controls can provide scientifically quantifiable evidence that software does not contain unintentional vulnerabilities (bugs), it is insufficient to show that no intentional vulnerabilities exist, and it is impractical for certifying devices over the expected long lifetimes of use. For both of these needs, rigorous analysis of the software itself is essential. Automated software behavior computation applies rigorous static software analysis methods based on function extraction (FX) to compiled software to detect vulnerabilities, intentional or unintentional, and to verify critical functionality. This analysis is based on the compiled firmware, takes into account machine precision, and does not rely on heuristics or approximations early in the analysis.

  6. Automated Detection and Annotation of Disturbance in Eastern Forests

    NASA Astrophysics Data System (ADS)

    Hughes, M. J.; Chen, G.; Hayes, D. J.

    2013-12-01

    Forest disturbances represent an important component of the terrestrial carbon budget. To generate spatially-explicit estimates of disturbance and regrowth, we developed an automated system to detect and characterize forest change in the eastern United States at 30 m resolution from a 28-year Landsat Thematic Mapper time-series (1984-2011). Forest changes are labeled as 'disturbances' or 'regrowth', assigned to a severity class, and attributed to a disturbance type: fire, insects, harvest, or 'unknown'. The system generates cloud-free summertime composite images for each year from multiple summer scenes and calculates vegetation indices from these composites. Patches of similar terrain on the landscape are identified by segmenting the Normalized Burn Ratio image. The spatial variance within each patch, which has been found to be a good indicator of diffuse disturbances such as forest insect damage, is then calculated for each index, creating an additional set of indices. To identify vegetation change and quantify its degree, the derivative through time is calculated for each index using total variation regularization to account for noise and create a piecewise-linear trend. These indices and their derivatives detect areas of disturbance and regrowth and are also used as inputs into a neural network that classifies the disturbance type/agent. Disturbance and disease information from the US Forest Service Aerial Detection Surveys (ADS) geodatabase and disturbed plots from the US Forest Service Forest Inventory and Analysis (FIA) database provided training data for the neural network. Although there have been recent advances in discriminating between disturbance types in boreal forests, due to the larger number of forest species and cosmopolitan nature of overstory communities in eastern forests, separation remains difficult. The ADS database, derived from sketch maps and later digitized, commonly designates a single large area encompassing many smaller affected
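
    The Normalized Burn Ratio that drives the segmentation is a standard band ratio of near-infrared and shortwave-infrared reflectance; a minimal sketch of computing it for two yearly composites and differencing them as a crude change signal (the array names, simulated values, and dNBR threshold are illustrative assumptions, not the authors' processing chain):

```python
import numpy as np

def nbr(nir: np.ndarray, swir: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir + eps)

# Toy yearly composites (reflectance in [0, 1]); a drop in NBR suggests disturbance.
rng = np.random.default_rng(0)
nir_2010 = rng.uniform(0.3, 0.5, (100, 100))
swir_2010 = rng.uniform(0.1, 0.2, (100, 100))
nir_2011, swir_2011 = nir_2010 * 0.6, swir_2010 * 1.4   # simulated disturbance
dnbr = nbr(nir_2010, swir_2010) - nbr(nir_2011, swir_2011)
disturbed = dnbr > 0.27   # illustrative threshold, not the authors' value
```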

  7. Multisensor Fusion for Change Detection

    NASA Astrophysics Data System (ADS)

    Schenk, T.; Csatho, B.

    2005-12-01

    Combining sensors that record different properties of a 3-D scene leads to complementary and redundant information. If fused properly, a more robust and complete scene description becomes available. Moreover, fusion facilitates automatic procedures for object reconstruction and modeling. For example, aerial imaging sensors, hyperspectral scanning systems, and airborne laser scanning systems generate complementary data. We describe how data from these sensors can be fused for such diverse applications as mapping surface erosion and landslides, reconstructing urban scenes, monitoring urban land use and urban sprawl, and deriving velocities and surface changes of glaciers and ice sheets. An absolute prerequisite for successful fusion is a rigorous co-registration of the sensors involved. We establish a common 3-D reference frame by using sensor-invariant features. Such features are caused by the same object space phenomena and are extracted in multiple steps from the individual sensors. After extracting, segmenting and grouping the features into more abstract entities, we discuss how to automatically establish correspondences. This is followed by a brief description of rigorous mathematical models suitable for dealing with linear and areal features. In contrast to traditional, point-based registration methods, linear and areal features lend themselves to a more robust and more accurate registration. More importantly, the chances of automating the registration process increase significantly. The result of the co-registration of the sensors is a unique transformation between the individual sensors and the object space. This makes spatial reasoning over extracted information more versatile; reasoning can be performed in sensor space or in 3-D space, where domain knowledge about features and objects constrains reasoning processes, reduces the search space, and helps to make the problem well-posed. We demonstrate the feasibility of the proposed multisensor fusion approach

  8. Observer performance in semi-automated microbleed detection

    NASA Astrophysics Data System (ADS)

    Kuijf, Hugo J.; Brundel, Manon; de Bresser, Jeroen; Viergever, Max A.; Biessels, Geert Jan; Geerlings, Mirjam I.; Vincken, Koen L.

    2013-03-01

    Cerebral microbleeds are small bleedings in the human brain, detectable with MRI. Microbleeds are associated with vascular disease and dementia. The number of studies involving microbleed detection is increasing rapidly. Visual rating is the current standard for detection, but is a time-consuming process, especially for high-resolution 7.0 T MR images; it also has limited reproducibility and is highly observer dependent. Recently, multiple techniques have been published for the semi-automated detection of microbleeds, attempting to overcome these problems. In the present study, a 7.0 T dual-echo gradient echo MR image was acquired in 18 participants with microbleeds from the SMART study. Two experienced observers identified 54 microbleeds in these participants, using a validated visual rating scale. The radial symmetry transform (RST) can be used for semi-automated detection of microbleeds in 7.0 T MR images. In the present study, the results of the RST were assessed by two observers and 47 microbleeds were identified: 35 true positives and 12 extra positives (microbleeds that were missed during visual rating). Hence, after scoring, a total of 66 microbleeds could be identified in the 18 participants. The use of the RST increased the average sensitivity of observers from 59% to 69%. More importantly, inter-observer agreement (ICC and Dice's coefficient) increased from 0.85 and 0.64 to 0.98 and 0.96, respectively. Furthermore, the required rating time was reduced from 30 to 2 minutes per participant. By fine-tuning the RST, sensitivities up to 90% can be achieved, at the cost of extra false positives.

  9. Automated microaneurysm detection in diabetic retinopathy using curvelet transform.

    PubMed

    Ali Shah, Syed Ayaz; Laude, Augustinus; Faye, Ibrahima; Tang, Tong Boon

    2016-10-01

    Microaneurysms (MAs) are known to be early signs of diabetic retinopathy (DR). An automated MA detection system based on the curvelet transform is proposed for color fundus image analysis. MA candidates were extracted in two parallel steps. In step one, blood vessels were removed from the preprocessed green-band image and preliminary MA candidates were selected by a local thresholding technique. In step two, based on statistical features, the image background was estimated. The results from the two steps allowed us to identify preliminary MA candidates that were also present in the image foreground. A collection of features was fed to a rule-based classifier to divide the candidates into MAs and non-MAs. The proposed system was tested with the Retinopathy Online Challenge database. The automated system detected 162 MAs out of 336, thus achieving a sensitivity of 48.21% with 65 false positives per image. Counting MAs is a means of measuring the progression of DR. Hence, the proposed system may be deployed to monitor the progression of DR at an early stage in population studies.

  10. Development of automated detection of radiology reports citing adrenal findings

    NASA Astrophysics Data System (ADS)

    Zopf, Jason; Langer, Jessica; Boonn, William; Kim, Woojin; Zafar, Hanna

    2011-03-01

    Indeterminate incidental findings pose a challenge to both the radiologist and the ordering physician, as their imaging appearance is potentially harmful but their clinical significance and optimal management are unknown. We seek to determine if it is possible to automate detection of adrenal nodules, an indeterminate incidental finding, on imaging examinations at our institution. Using PRESTO (Pathology-Radiology Enterprise Search tool), a newly developed search engine at our institution that mines dictated radiology reports, we searched for phrases used by attendings to describe incidental adrenal findings. Using these phrases as a guide, we designed a query that can be used with the PRESTO index. The results were refined using a modified version of NegEx to eliminate query terms that have been negated within the report text. In order to validate these findings we used an online random date generator to select two random weeks. We queried our RIS database for all reports created on those dates and manually reviewed each report to check for adrenal incidental findings. This survey produced a ground-truth dataset of reports citing adrenal incidental findings against which to compare query performance. We further reviewed the false positives and negatives identified by our validation study in an attempt to improve the query's performance. This algorithm is an important step towards automating the detection of incidental adrenal nodules on cross-sectional imaging at our institution. Subsequently, this query can be combined with electronic medical record data searches to determine the clinical significance of these findings through resultant follow-up.
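
    NegEx-style refinement amounts to discarding finding mentions that fall within the scope of a nearby preceding negation phrase. A toy regex sketch of that idea (the trigger list, finding pattern, and context window are illustrative; this is not the modified NegEx used in the study):

```python
import re

NEGATION_TRIGGERS = ["no ", "without ", "no evidence of ", "negative for "]
FINDING = re.compile(r"adrenal (nodule|mass|lesion)", re.IGNORECASE)

def non_negated_findings(report: str, window: int = 40) -> list[str]:
    """Return finding mentions not preceded by a negation trigger within `window` chars."""
    hits = []
    lowered = report.lower()
    for m in FINDING.finditer(report):
        context = lowered[max(0, m.start() - window):m.start()]
        if not any(trigger in context for trigger in NEGATION_TRIGGERS):
            hits.append(m.group(0))
    return hits

print(non_negated_findings("There is no evidence of adrenal nodule. Stable 1.2 cm adrenal mass."))
# -> ['adrenal mass']
```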

  11. ADVICE: Automated Detection and Validation of Interaction by Co-Evolution.

    PubMed

    Tan, Soon-Heng; Zhang, Zhuo; Ng, See-Kiong

    2004-07-01

    ADVICE (Automated Detection and Validation of Interaction by Co-Evolution) is a web tool for predicting and validating protein-protein interactions using the observed co-evolution between interacting proteins. Interacting proteins are known to share similar evolutionary histories since they undergo coordinated evolutionary changes to preserve interactions and functionalities. The web tool automates a commonly adopted methodology to quantify the similarities in proteins' evolutionary histories for postulating potential protein-protein interactions. ADVICE can also be used to validate experimental data against spurious protein interactions by identifying those that have few similarities in their evolutionary histories. The web tool accepts a list of protein sequences or sequence pairs as input and retrieves orthologous sequences to compute the similarities in the proteins' evolutionary histories. To facilitate hypothesis generation, detected co-evolved proteins can be visualized as a network at the website. ADVICE is available at http://advice.i2r.a-star.edu.sg.

  12. [Automated detection of estrus and mastitis in dairy cows].

    PubMed

    de Mol, R M

    2001-02-15

    The development and testing of detection models for oestrus and mastitis in dairy cows is described in a PhD thesis that was defended in Wageningen on June 5, 2000. These models were based on sensors for milk yield, milk temperature, electrical conductivity of milk, cow activity and concentrate intake, and on combined processing of the sensor data. The models alert farmers to cows that need attention because of possible oestrus or mastitis. A first detection model, for cows milked twice a day, was based on time-series models for the sensor variables. A time-series model describes the dependence between successive observations. The parameters of the time-series models were fitted on-line for each cow after each milking by means of a Kalman filter, a mathematical method to estimate the state of a system on-line. The Kalman filter gives the best estimate of the current state of a system based on all preceding observations. This model was tested for 2 years on two experimental farms, and under field conditions on four farms over several years. A second detection model, for cows milked in an automatic milking system (AMS), was based on a generalization of the first model. Two data sets (one small, one large) were used for testing. The results for oestrus detection were good for both models. The results for mastitis detection varied (good in some cases, moderate in others). Fuzzy logic was used to classify mastitis and oestrus alerts with both detection models, to reduce the number of false positive alerts. Fuzzy logic makes approximate reasoning possible, where statements can be partly true or false. Inputs to the fuzzy logic model were the alerts from the detection models and additional information. The number of false positive alerts decreased considerably, while the number of detected cases remained at the same level. These models make automated detection possible in practice.
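
    The on-line time-series tracking described above can be illustrated with a scalar Kalman filter that follows a cow's expected milk yield and flags milkings whose innovation (deviation from the prediction) is unusually large; the noise parameters and alert threshold below are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def kalman_alerts(yields, process_var=0.05, meas_var=0.4, z_alert=3.0):
    """Track expected milk yield with a random-walk Kalman filter; flag outliers."""
    x, p = yields[0], 1.0          # state estimate and its variance
    alerts = []
    for t, y in enumerate(yields[1:], start=1):
        p += process_var           # predict: random-walk state, variance grows
        innovation = y - x
        s = p + meas_var           # innovation variance
        if abs(innovation) / np.sqrt(s) > z_alert:
            alerts.append(t)       # milking deviates strongly from expectation
        k = p / s                  # Kalman gain
        x += k * innovation        # update state estimate
        p *= (1 - k)
    return alerts

yields = np.array([24.1, 23.8, 24.4, 24.0, 17.5, 23.9])   # sudden drop at index 4
print(kalman_alerts(yields))                                # -> [4]
```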

  13. The Development of Change Detection

    ERIC Educational Resources Information Center

    Shore, David I.; Burack, Jacob A.; Miller, Danny; Joseph, Shari; Enns, James T.

    2006-01-01

    Changes to a scene often go unnoticed if the objects of the change are unattended, making change detection an index of where attention is focused during scene perception. We measured change detection in school-age children and young adults by repeatedly alternating two versions of an image. To provide an age-fair assessment we used a bimanual…

  14. Fast-time Simulation of an Automated Conflict Detection and Resolution Concept

    NASA Technical Reports Server (NTRS)

    Windhorst, Robert; Erzberger, Heinz

    2006-01-01

    This paper investigates the effect on the National Airspace System of reducing air traffic controller workload by automating conflict detection and resolution. The Airspace Concept Evaluation System is used to perform simulations of the Cleveland Center with conventional and with automated conflict detection and resolution concepts. Results show that the automated conflict detection and resolution concept significantly decreases the growth of delay as traffic demand is increased in en-route airspace.

  15. Automated calibration methods for robotic multisensor landmine detection

    NASA Astrophysics Data System (ADS)

    Keranen, Joe G.; Miller, Jonathan; Schultz, Gregory; Topolosky, Zeke

    2007-04-01

    Both force protection and humanitarian demining missions require efficient and reliable detection and discrimination of buried anti-tank and anti-personnel landmines. Widely varying surface and subsurface conditions, mine types and placement, as well as environmental regimes challenge the robustness of the automatic target recognition process. In this paper we present applications created for the U.S. Army Nemesis detection platform. Nemesis is an unmanned rubber-tracked vehicle-based system designed to eradicate a wide variety of anti-tank and anti-personnel landmines for humanitarian demining missions. The detection system integrates advanced ground penetrating synthetic aperture radar (GPSAR) and electromagnetic induction (EMI) arrays, highly accurate global and local positioning, and on-board target detection/classification software on the front loader of a semi-autonomous UGV. An automated procedure is developed to estimate the soil's dielectric constant using surface reflections from the ground penetrating radar. The results have implications not only for calibration of system data acquisition parameters, but also for user awareness and tuning of automatic target recognition detection and discrimination algorithms.
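
    Estimating the soil's relative permittivity from the ground-surface reflection is commonly done by comparing the surface-echo amplitude against a perfect-reflector (metal plate) reference; a minimal sketch of that standard relation (not necessarily the calibration procedure implemented on Nemesis):

```python
def soil_permittivity(surface_amp: float, metal_plate_amp: float) -> float:
    """Relative permittivity from the normal-incidence surface reflection.

    |R| = A_surface / A_metal (a metal plate reflects with |R| = 1), and for an
    air-to-soil interface R = (1 - sqrt(eps_r)) / (1 + sqrt(eps_r)), so
    eps_r = ((1 + |R|) / (1 - |R|)) ** 2.
    """
    r = abs(surface_amp) / abs(metal_plate_amp)
    if r >= 1.0:
        raise ValueError("surface echo cannot exceed the metal-plate reference")
    return ((1.0 + r) / (1.0 - r)) ** 2

# Example: a surface echo at 33% of the metal-plate amplitude -> eps_r ~ 4 (dry sand).
print(soil_permittivity(0.33, 1.0))
```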

  16. Automated transient detection in the STEREO Heliospheric Imagers.

    NASA Astrophysics Data System (ADS)

    Barnard, Luke; Scott, Chris; Owens, Mat; Lockwood, Mike; Tucker-Hood, Kim; Davies, Jackie

    2014-05-01

    Since the launch of the twin STEREO satellites, the heliospheric imagers (HI) have been used, with good results, in tracking transients of solar origin, such as Coronal Mass Ejections (CMEs), far out into the heliosphere. A frequently used approach is to build a "J-map", in which multiple elongation profiles along a constant position angle are stacked in time, building an image in which radially propagating transients form curved tracks in the J-map. From this the time-elongation profile of a solar transient can be manually identified. This is a time-consuming and laborious process, and the results are subjective, depending on the skill and expertise of the investigator. Therefore, it is desirable to develop an automated algorithm for the detection and tracking of the transient features observed in HI data. This is to some extent previously covered ground, as similar problems have been encountered in the analysis of coronagraph data and have led to the development of products such as CACTus. We present the results of our investigation into the automated detection of solar transients observed in J-maps formed from HI data. We use edge and line detection methods to identify transients in the J-maps, and then use kinematic models of solar transient propagation (such as the fixed-phi and harmonic mean geometric models) to estimate the solar transient's properties, such as speed and propagation direction, from the time-elongation profile. The effectiveness of this process is assessed by comparison of our results with a set of manually identified CMEs, extracted and analysed by the Solar Storm Watch Project. Solar Storm Watch is a citizen science project in which solar transients are identified in J-maps formed from HI data and tracked multiple times by different users. This allows the calculation of a consensus time-elongation profile for each event, and therefore does not suffer from the potential subjectivity of an individual researcher tracking an

  17. Automated focusing in bright-field microscopy for tuberculosis detection

    PubMed Central

    OSIBOTE, O.A.; DENDERE, R.; KRISHNAN, S.; DOUGLAS, T.S.

    2010-01-01

    Summary Automated microscopy to detect Mycobacterium tuberculosis in sputum smear slides would enable laboratories in countries with a high tuberculosis burden to cope efficiently with large numbers of smears. Focusing is a core component of automated microscopy, and successful autofocusing depends on selection of an appropriate focus algorithm for a specific task. We examined autofocusing algorithms for bright-field microscopy of Ziehl–Neelsen stained sputum smears. Six focus measures, defined in the spatial domain, were examined with respect to accuracy, execution time, range, full width at half maximum of the peak and the presence of local maxima. Curve fitting around an estimate of the focal plane was found to produce good results and is therefore an acceptable strategy to reduce the number of images captured for focusing and the processing time. Vollath's F4 measure performed best for full z-stacks, with a mean difference of 0.27 μm between manually and automatically determined focal positions, whereas it is jointly ranked best with the Brenner gradient for curve fitting. PMID:20946382
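
    Vollath's F4 measure, which ranked best for full z-stacks here, is an autocorrelation-based sharpness score with a well-known closed form; a compact sketch (the helper for picking the sharpest slice is an illustrative addition):

```python
import numpy as np

def vollath_f4(image: np.ndarray) -> float:
    """Vollath F4: sum(I(x,y)*I(x+1,y)) - sum(I(x,y)*I(x+2,y)); larger means sharper."""
    img = image.astype(np.float64)
    return float((img[:-1] * img[1:]).sum() - (img[:-2] * img[2:]).sum())

def best_focus(z_stack) -> int:
    """Return the index of the sharpest slice in a z-stack of grayscale images."""
    return int(np.argmax([vollath_f4(im) for im in z_stack]))
```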

  18. Digital tripwire: a small automated human detection system

    NASA Astrophysics Data System (ADS)

    Fischer, Amber D.; Redd, Emmett; Younger, A. Steven

    2009-05-01

    A low-cost, lightweight, easily deployable imaging sensor that can dependably discriminate threats from other activities within its field of view and, only then, alert the distant duty officer by transmitting a visual confirmation of the threat would provide a valuable asset to modern defense. At present, solutions suffer from a multitude of deficiencies: size, cost, and power endurance, but most notably an inability to assess an image and conclude that it contains a threat. The human attention span cannot maintain critical surveillance over banks of displays constantly conveying such images from the field. DigitalTripwire is a small, self-contained, automated human-detection system capable of running for 1-5 days on two AA batteries. To achieve such long endurance, the DigitalTripwire system utilizes an FPGA designed with sleep functionality. The system uses robust vision algorithms, such as an innovative, partially unsupervised background-modeling algorithm, which employ several data reduction strategies to operate in real time and achieve high detection rates. When it detects human activity, either mounted or dismounted, it sends an alert including images to notify the command center. In this paper, we describe the hardware and software design of the DigitalTripwire system. In addition, we provide detection and false alarm rates across several challenging data sets, demonstrating the performance of the vision algorithms in autonomously analyzing the video stream and classifying moving objects into four primary categories: dismounted human, vehicle, non-human, or unknown. Performance results across several challenging data sets are provided.
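
    Background modeling for this kind of motion detection is often an exponentially weighted running average with a threshold on the foreground residual; a generic sketch of that idea (not the DigitalTripwire algorithm itself, whose FPGA implementation details are specific to the paper):

```python
import numpy as np

class RunningAverageBackground:
    """Exponentially weighted background model with a foreground threshold."""

    def __init__(self, alpha: float = 0.05, threshold: float = 25.0):
        self.alpha = alpha          # adaptation rate of the background
        self.threshold = threshold  # grey-level difference that counts as motion
        self.background = None

    def apply(self, frame: np.ndarray) -> np.ndarray:
        frame = frame.astype(np.float32)
        if self.background is None:
            self.background = frame.copy()
        foreground = np.abs(frame - self.background) > self.threshold
        # update the model slowly so gradual illumination changes are absorbed
        self.background = (1 - self.alpha) * self.background + self.alpha * frame
        return foreground

# Example: feed successive grayscale frames and count changed pixels per frame.
model = RunningAverageBackground()
rng = np.random.default_rng(0)
for _ in range(5):
    frame = rng.integers(0, 255, (120, 160)).astype(np.uint8)
    mask = model.apply(frame)
    print(int(mask.sum()), "foreground pixels")
```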

  19. Automated detection of oscillations in extreme ultraviolet imaging data

    NASA Astrophysics Data System (ADS)

    Ireland, J.; Marsh, M. S.; Kucera, T. A.; Young, C. A.

    2008-12-01

    The corona is now known to support many different types of oscillation. Until now, detection of these oscillations has relied largely on manual effort. With the advent of much higher cadence EUV (extreme ultraviolet) data at better spatial resolution, sifting through the data manually to look for oscillatory material becomes an onerous task. Further, different observers tend to see different behavior in the data. To overcome these problems, we introduce a Bayesian probability-based automated method to detect areas in EUV images that support oscillations. The method is fast and can handle time series data with even or uneven cadences. Interestingly, the Bayesian approach allows us to generate a probability that a given frequency is present without the need for an estimate of the noise in the data. We also generate simple and intuitive "quality measures" for each detected oscillation. This will allow users to select the "best" examples in a given dataset automatically. The method is demonstrated on existing datasets (EIT, TRACE, STEREO). Its application to Solar Dynamics Observatory data is also discussed. We also discuss some of the problems in detecting oscillations in the presence of a significant background trend, which can pollute the frequency spectrum.

  20. Effects of response bias and judgment framing on operator use of an automated aid in a target detection task.

    PubMed

    Rice, Stephen; McCarley, Jason S

    2011-12-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.

  1. Automated rice leaf disease detection using color image analysis

    NASA Astrophysics Data System (ADS)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, which is usually done as a manual eyeball exercise, is important to come up with good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from a rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. Upon obtaining the outlier, it is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. Then, these clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
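
    Histogram intersection, used here to isolate the outlier (suspected diseased) region relative to a healthy reference leaf, is a simple similarity measure between normalized color histograms; a minimal sketch (the bin count and toy images are illustrative assumptions):

```python
import numpy as np

def color_histogram(image: np.ndarray, bins: int = 32) -> np.ndarray:
    """Normalized per-channel histogram of an RGB image, concatenated."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
    h = np.concatenate(hists).astype(np.float64)
    return h / h.sum()

def histogram_intersection(h1: np.ndarray, h2: np.ndarray) -> float:
    """Similarity in [0, 1]; 1 means identical histograms."""
    return float(np.minimum(h1, h2).sum())

# Toy example: a uniform "healthy" leaf versus the same leaf with a brown lesion.
rng = np.random.default_rng(0)
healthy = rng.integers(60, 200, (256, 256, 3)).astype(np.uint8)
test = healthy.copy()
test[100:150, 100:150] = (140, 90, 40)   # simulated lesion patch
print(histogram_intersection(color_histogram(healthy), color_histogram(test)))
```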

  2. Automated detection of open magnetic field regions in EUV images

    NASA Astrophysics Data System (ADS)

    Krista, Larisza Diana; Reinard, Alysha

    2016-05-01

    Open magnetic regions on the Sun are either long-lived (coronal holes) or transient (dimmings) in nature, but both appear as dark regions in EUV images. For this reason their detection can be done in a similar way. As coronal holes are often large and long-lived in comparison to dimmings, their detection is more straightforward. The Coronal Hole Automated Recognition and Monitoring (CHARM) algorithm detects coronal holes using EUV images and a magnetogram. The EUV images are used to identify dark regions, and the magnetogram allows us to determine if the dark region is unipolar - a characteristic of coronal holes. There is no temporal sensitivity in this process, since coronal hole lifetimes span days to months. Dimming regions, however, emerge and disappear within hours. Hence, the time and location of a dimming emergence need to be known to successfully identify them and distinguish them from regular coronal holes. Currently, the Coronal Dimming Tracker (CoDiT) algorithm is semi-automated - it requires the dimming emergence time and location as an input. With those inputs we can identify the dimming and track it through its lifetime. CoDiT has also been developed to allow the tracking of dimmings that split or merge - a typical feature of dimmings. The advantage of these particular algorithms is their ability to adapt to detecting different types of open field regions. For coronal hole detection, each full-disk solar image is processed individually to determine a threshold for the image; hence, we are not limited to a single pre-determined threshold. For dimming regions we also allow individual thresholds for each dimming, as they can differ substantially. This flexibility is necessary for a subjective analysis of the studied regions. These algorithms were developed with the goal of allowing us to better understand the processes that give rise to eruptive and non-eruptive open field regions. We aim to study how these regions evolve over time and what environmental
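
    The coronal-hole side of the detection reduces, in outline, to thresholding dark EUV pixels with a per-image threshold and keeping connected regions whose magnetogram counterpart is predominantly unipolar; a simplified sketch of that logic (the threshold rule, minimum size, and unipolarity criterion are illustrative assumptions, not the CHARM settings):

```python
import numpy as np
from scipy import ndimage

def candidate_dark_regions(euv: np.ndarray, magnetogram: np.ndarray,
                           dark_fraction: float = 0.35, min_pixels: int = 500,
                           unipolarity: float = 0.6):
    """Label dark EUV regions and keep those whose magnetic flux is mostly one sign."""
    threshold = dark_fraction * np.median(euv)       # per-image threshold
    labels, n = ndimage.label(euv < threshold)       # connected dark patches
    keep = []
    for region in range(1, n + 1):
        mask = labels == region
        if mask.sum() < min_pixels:
            continue
        flux = magnetogram[mask]
        skew = abs(flux.sum()) / (np.abs(flux).sum() + 1e-9)
        if skew > unipolarity:                        # predominantly one polarity
            keep.append(region)
    return labels, keep
```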

  3. Computer automated movement detection for the analysis of behavior.

    PubMed

    Ramazani, Roseanna B; Krishnan, Harish R; Bergeson, Susan E; Atkinson, Nigel S

    2007-05-15

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time-intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtraction removes the background and non-moving flies, leaving white pixels where movement has occurred. These pixels are tallied, giving a value that corresponds to the number of animals that have moved between images. Perl scripts automate these processes, allowing compatibility with high-throughput genetic screens. Four experiments demonstrate the utility of this method, the first showing heat-induced locomotor changes, the second showing tolerance to ethanol in a climbing assay, the third showing tolerance to ethanol by scoring the recovery of individual flies, and the fourth showing a mouse's preference for a novel object. Our lab will use this method to conduct a genetic screen for ethanol-induced hyperactivity and sedation; however, it could also be used to analyze the locomotor behavior of any organism. PMID:17335906
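
    The movement measure described, digital subtraction of successive frames followed by a tally of changed pixels, can be re-expressed in a few lines (the original pipeline used dvgrab, ImageMagick and Perl; this generic sketch is not that code, and the difference threshold is an illustrative assumption):

```python
import numpy as np

def movement_score(prev_frame: np.ndarray, frame: np.ndarray, threshold: int = 30) -> int:
    """Count pixels that changed by more than `threshold` grey levels between frames."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return int((diff > threshold).sum())

# Example: score a sequence of grayscale snapshots taken at regular intervals.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 255, (240, 320)).astype(np.uint8) for _ in range(4)]
scores = [movement_score(a, b) for a, b in zip(frames, frames[1:])]
print(scores)   # higher values correspond to more animals moving between snapshots
```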

  4. Computer automated movement detection for the analysis of behavior

    PubMed Central

    Ramazani, Roseanna B.; Krishnan, Harish R.; Bergeson, Susan E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtraction removes the background and non-moving flies, leaving white pixels where movement has occurred. These pixels are tallied, giving a value that corresponds to the number of animals that have moved between images. Perl scripts automate these processes, allowing compatibility with high-throughput genetic screens. Four experiments demonstrate the utility of this method, the first showing heat-induced locomotor changes, the second showing tolerance to ethanol in a climbing assay, the third showing tolerance to ethanol by scoring the recovery of individual flies, and the fourth showing a mouse’s preference for a novel object. Our lab will use this method to conduct a genetic screen for ethanol induced hyperactivity and sedation, however, it could also be used to analyze locomotor behavior of any organism. PMID:17335906

  5. Automated detection of retinal whitening in malarial retinopathy

    NASA Astrophysics Data System (ADS)

    Joshi, V.; Agurto, C.; Barriga, S.; Nemeth, S.; Soliz, P.; MacCormick, I.; Taylor, T.; Lewallen, S.; Harding, S.

    2016-03-01

    Cerebral malaria (CM) is a severe neurological complication associated with malarial infection. Malaria affects approximately 200 million people worldwide and claims 600,000 lives annually, 75% of whom are African children under five years of age. Because most of these mortalities are caused by the high incidence of CM misdiagnosis, there is a need for an accurate diagnostic to confirm the presence of CM. The retinal lesions associated with malarial retinopathy (MR), such as retinal whitening, vessel discoloration, and hemorrhages, are highly specific to CM, and their detection can improve the accuracy of CM diagnosis. This paper focuses on the development of an automated method for the detection of retinal whitening, a unique sign of MR that manifests due to retinal ischemia resulting from CM. We propose to detect the whitening region in retinal color images based on multiple color and textural features. First, we preprocess the image using color and textural features of the CMYK and CIE-XYZ color spaces to minimize camera reflex. Next, we utilize color features of the HSL, CMYK, and CIE-XYZ channels, along with the structural features of difference of Gaussians. A watershed segmentation algorithm is used to assign each image region a probability of being inside the whitening, based on extracted features. The algorithm was applied to a dataset of 54 images (40 with whitening and 14 controls), resulting in an image-based (binary) classification with an AUC of 0.80. This provides 88% sensitivity at a specificity of 65%. For a clinical application that requires a high-specificity setting, the algorithm can be tuned to a specificity of 89% at a sensitivity of 82%. This is the first published method for retinal whitening detection, and combining it with the detection methods for vessel discoloration and hemorrhages can further improve the detection accuracy for malarial retinopathy.

  6. The Automated Planet Finder telescope's automation and first three years of planet detections

    NASA Astrophysics Data System (ADS)

    Burt, Jennifer

    The Automated Planet Finder (APF) is a 2.4m, f/15 telescope located at the UCO's Lick Observatory, atop Mt. Hamilton. The telescope has been specifically optimized to detect and characterize extrasolar planets via high-precision radial velocity (RV) observations using the high-resolution Levy echelle spectrograph. The telescope has demonstrated world-class internal precision levels of 1 m/s when observing bright, RV standard stars. Observing time on the telescope is divided such that ~80% is spent on exoplanet-related research and the remaining ~20% is made available to the University of California consortium for other science goals. The telescope achieved first light in 2013, and this work describes the APF's early science achievements and its transition from a traditional observing approach to a fully autonomous facility. First we provide a characteristic look at the APF telescope and the Levy spectrograph, focusing on the stability of the instrument and its performance on RV standard stars. Second, we describe the design and implementation of the dynamic scheduling software which has been running our team's nightly observations on the APF for the past year. Third, we discuss the detection of a Neptune-mass planet orbiting the nearby, low-mass star GL687 by the APF in collaboration with the HIRES instrument on Keck I. Fourth, we summarize the APF's detection of two multi-planet systems: the four-planet system orbiting HD 141399 and the six-planet system orbiting HD 219134. Fifth, we expand our science focus to assess the impact that the APF - with the addition of a new, time-varying prioritization scheme to the telescope's dynamic scheduling software - can have on filling out the exoplanet Mass-Radius diagram when pursuing RV follow-up of transiting planets detected by NASA's TESS satellite. Finally, we outline some likely next science goals for the telescope.

  7. Evaluation of object level change detection techniques

    NASA Astrophysics Data System (ADS)

    Irvine, John M.; Bergeron, Stuart; Hugo, Doug; O'Brien, Michael A.

    2007-04-01

    A variety of change detection (CD) methods have been developed and employed to support imagery analysis for applications including environmental monitoring, mapping, and support to military operations. Evaluation of these methods is necessary to assess technology maturity, identify areas for improvement, and support transition to operations. This paper presents a methodology for conducting this type of evaluation, discusses the challenges, and illustrates the techniques. The evaluation of object-level change detection methods is more complicated than for automated techniques for processing a single image. We explore algorithm performance assessments, emphasizing the definition of the operating conditions (sensor, target, and environmental factors) and the development of measures of performance. Specific challenges include image registration; occlusion due to foliage, cultural clutter and terrain masking; diurnal differences; and differences in viewing geometry. Careful planning, sound experimental design, and access to suitable imagery with image truth and metadata are critical.

  8. Automation and the future practice of pharmacy--changing the focus of pharmacy.

    PubMed

    Lee, M P

    1995-10-01

    Automation technology offers great potential in pharmacy practice. To realize the full benefits of the potential inherent in automation systems, it is necessary to understand basic concepts of automation and to realize that automation is simply a tool to help achieve the goals of practice. The goal of pharmacy practice is pharmaceutical care. Through using the techniques of reengineering, pharmacies can be redesigned with the help of automation to facilitate the accomplishment of that goal. Essential to achieving that goal is the necessity to change the focus of pharmacy from distribution to pharmaceutical care. Reengineering and automation are the tools to help make that change in focus.

  9. Automated Detection of Firearms and Knives in a CCTV Image.

    PubMed

    Grega, Michał; Matiolański, Andrzej; Guzik, Piotr; Leszczuk, Mikołaj

    2016-01-01

    Closed circuit television systems (CCTV) are becoming more and more popular and are being deployed in many offices, housing estates and in most public spaces. Monitoring systems have been implemented in many European and American cities. This makes for an enormous load for the CCTV operators, as the number of camera views a single operator can monitor is limited by human factors. In this paper, we focus on the task of automated detection and recognition of dangerous situations for CCTV systems. We propose algorithms that are able to alert the human operator when a firearm or knife is visible in the image. We have focused on limiting the number of false alarms in order to allow for a real-life application of the system. The specificity and sensitivity of the knife detection are significantly better than others published recently. We have also managed to propose a version of a firearm detection algorithm that offers a near-zero rate of false alarms. We have shown that it is possible to create a system that is capable of an early warning in a dangerous situation, which may lead to faster and more effective response times and a reduction in the number of potential victims.

  10. Automated analysis for detecting beams in laser wakefield simulations

    SciTech Connect

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with a high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.

  11. Automated Detection of Firearms and Knives in a CCTV Image.

    PubMed

    Grega, Michał; Matiolański, Andrzej; Guzik, Piotr; Leszczuk, Mikołaj

    2016-01-01

    Closed circuit television systems (CCTV) are becoming more and more popular and are being deployed in many offices, housing estates and in most public spaces. Monitoring systems have been implemented in many European and American cities. This makes for an enormous load for the CCTV operators, as the number of camera views a single operator can monitor is limited by human factors. In this paper, we focus on the task of automated detection and recognition of dangerous situations for CCTV systems. We propose algorithms that are able to alert the human operator when a firearm or knife is visible in the image. We have focused on limiting the number of false alarms in order to allow for a real-life application of the system. The specificity and sensitivity of the knife detection are significantly better than others published recently. We have also managed to propose a version of a firearm detection algorithm that offers a near-zero rate of false alarms. We have shown that it is possible to create a system that is capable of an early warning in a dangerous situation, which may lead to faster and more effective response times and a reduction in the number of potential victims. PMID:26729128

  12. Automated detection of microaneurysms using robust blob descriptors

    NASA Astrophysics Data System (ADS)

    Adal, K.; Ali, S.; Sidibé, D.; Karnowski, T.; Chaum, E.; Mériaudeau, F.

    2013-03-01

    Microaneurysms (MAs) are among the first signs of diabetic retinopathy (DR) that can be seen as round dark-red structures in digital color fundus photographs of the retina. In recent years, automated computer-aided detection and diagnosis (CAD) of MAs has attracted many researchers due to its low cost and versatile nature. In this paper, the MA detection problem is modeled as finding interest points in a given image, and several interest point descriptors are introduced and integrated with machine learning techniques to detect MAs. The proposed approach starts by applying a novel fundus image contrast enhancement technique using Singular Value Decomposition (SVD) of fundus images. Then, a Hessian-based candidate selection algorithm is applied to extract image regions which are more likely to be MAs. For each candidate region, robust low-level blob descriptors such as Speeded Up Robust Features (SURF) and the Intensity Normalized Radon Transform are extracted to characterize candidate MA regions. The combined features are then classified using an SVM which has been trained using ten manually annotated training images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. Preliminary results show the competitiveness of the proposed candidate selection techniques against state-of-the-art methods as well as the promising future for the proposed descriptors to be used in the localization of MAs from fundus images.

  13. Automated Detection of Firearms and Knives in a CCTV Image

    PubMed Central

    Grega, Michał; Matiolański, Andrzej; Guzik, Piotr; Leszczuk, Mikołaj

    2016-01-01

    Closed circuit television systems (CCTV) are becoming more and more popular and are being deployed in many offices, housing estates and in most public spaces. Monitoring systems have been implemented in many European and American cities. This makes for an enormous load for the CCTV operators, as the number of camera views a single operator can monitor is limited by human factors. In this paper, we focus on the task of automated detection and recognition of dangerous situations for CCTV systems. We propose algorithms that are able to alert the human operator when a firearm or knife is visible in the image. We have focused on limiting the number of false alarms in order to allow for a real-life application of the system. The specificity and sensitivity of the knife detection are significantly better than others published recently. We have also managed to propose a version of a firearm detection algorithm that offers a near-zero rate of false alarms. We have shown that it is possible to create a system that is capable of an early warning in a dangerous situation, which may lead to faster and more effective response times and a reduction in the number of potential victims. PMID:26729128

  14. Automated detection and recognition of wildlife using thermal cameras.

    PubMed

    Christiansen, Peter; Steen, Kim Arild; Jørgensen, Rasmus Nyholm; Karstoft, Henrik

    2014-01-01

    In agricultural mowing operations, thousands of animals are injured or killed each year, due to the increased working widths and speeds of agricultural machinery. Detection and recognition of wildlife within the agricultural fields is important to reduce wildlife mortality and, thereby, promote wildlife-friendly farming. The work presented in this paper contributes to the automated detection and classification of animals in thermal imaging. The methods and results are based on top-view images taken manually from a lift to motivate work towards unmanned aerial vehicle-based detection and recognition. Hot objects are detected based on a threshold dynamically adjusted to each frame. For the classification of animals, we propose a novel thermal feature extraction algorithm. For each detected object, a thermal signature is calculated using morphological operations. The thermal signature describes heat characteristics of objects and is partly invariant to translation, rotation, scale and posture. The discrete cosine transform (DCT) is used to parameterize the thermal signature and, thereby, calculate a feature vector, which is used for subsequent classification. Using a k-nearest-neighbor (kNN) classifier, animals are discriminated from non-animals with a balanced classification accuracy of 84.7% in an altitude range of 3-10 m and an accuracy of 75.2% for an altitude range of 10-20 m. To incorporate temporal information in the classification, a tracking algorithm is proposed. Using temporal information improves the balanced classification accuracy to 93.3% in an altitude range of 3-10 m and 77.7% in an altitude range of 10-20 m.
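
    The classification chain, a DCT parameterization of a per-object thermal signature followed by a k-nearest-neighbor classifier, can be sketched generically; the signature below is simplified to a normalized row-mean intensity profile, which is an assumption and not the authors' morphological procedure:

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.neighbors import KNeighborsClassifier

def thermal_feature(patch: np.ndarray, n_coeffs: int = 10) -> np.ndarray:
    """DCT parameterization of a crude thermal signature (normalized row-mean profile)."""
    profile = patch.astype(np.float64).mean(axis=1)
    profile = (profile - profile.mean()) / (profile.std() + 1e-9)
    return dct(profile, norm="ortho")[:n_coeffs]

# Toy training data: warm compact blobs ("animal") vs diffuse patches ("non-animal").
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(20):
        patch = rng.normal(20, 2, (32, 32))
        if label == 1:
            patch[12:20, 12:20] += 15          # hot core typical of an animal
        X.append(thermal_feature(patch))
        y.append(label)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
new_patch = rng.normal(20, 2, (32, 32))
new_patch[12:20, 12:20] += 15
print(clf.predict([thermal_feature(new_patch)]))   # -> [1] ("animal")
```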

  15. Automated Detection and Recognition of Wildlife Using Thermal Cameras

    PubMed Central

    Christiansen, Peter; Steen, Kim Arild; Jørgensen, Rasmus Nyholm; Karstoft, Henrik

    2014-01-01

    In agricultural mowing operations, thousands of animals are injured or killed each year, due to the increased working widths and speeds of agricultural machinery. Detection and recognition of wildlife within the agricultural fields is important to reduce wildlife mortality and, thereby, promote wildlife-friendly farming. The work presented in this paper contributes to the automated detection and classification of animals in thermal imaging. The methods and results are based on top-view images taken manually from a lift to motivate work towards unmanned aerial vehicle-based detection and recognition. Hot objects are detected based on a threshold dynamically adjusted to each frame. For the classification of animals, we propose a novel thermal feature extraction algorithm. For each detected object, a thermal signature is calculated using morphological operations. The thermal signature describes heat characteristics of objects and is partly invariant to translation, rotation, scale and posture. The discrete cosine transform (DCT) is used to parameterize the thermal signature and, thereby, calculate a feature vector, which is used for subsequent classification. Using a k-nearest-neighbor (kNN) classifier, animals are discriminated from non-animals with a balanced classification accuracy of 84.7% in an altitude range of 3–10 m and an accuracy of 75.2% for an altitude range of 10–20 m. To incorporate temporal information in the classification, a tracking algorithm is proposed. Using temporal information improves the balanced classification accuracy to 93.3% in an altitude range of 3–10 m and 77.7% in an altitude range of 10–20 m. PMID:25196105

  16. A practical automated polyp detection scheme for CT colonography

    NASA Astrophysics Data System (ADS)

    Li, Hong; Santago, Pete

    2004-05-01

    A fully automated computerized polyp detection (CPD) system is presented that takes DICOM images from CT scanners and provides a list of detected polyps. The system comprises three stages: segmentation, polyp candidate generation (PCG), and false positive reduction (FPR). Employing computed tomographic colonography (CTC), both supine and prone scans are used to improve detection sensitivity. We developed a novel and efficient segmentation scheme. Major shape features, e.g., the mean curvature and Gaussian curvature, together with a connectivity test, efficiently produce polyp candidates. We select six shape features and introduce a multi-plane linear discriminant function (MLDF) classifier in our system for FPR. The classifier parameters are empirically assigned with respect to the geometric meaning of each feature. We have tested the system on 68 real subjects, 20 positive and 48 negative for 6 mm and larger polyps from colonoscopy results. Using a patient-based criterion, 95% sensitivity and 31% specificity were achieved when 6 mm was used as the cutoff size, implying that 15 out of 48 healthy subjects could avoid OC. One 11 mm polyp was missed by CPD but was also not reported by the radiologist. With a complete polyp database, we anticipate that a maximum a posteriori probability (MAP) classifier tuned by supervised training will improve the detection performance. The execution time for both scans is about 10-15 minutes using a 1 GHz PC running Linux. The system may be used standalone, but is envisioned more as part of a computer-aided CTC screening workflow that can address the problems of both a fully automatic approach and a purely physician-driven approach.

  17. Automated Point Cloud Correspondence Detection for Underwater Mapping Using AUVs

    NASA Technical Reports Server (NTRS)

    Hammond, Marcus; Clark, Ashley; Mahajan, Aditya; Sharma, Sumant; Rock, Stephen

    2015-01-01

    An algorithm for automating correspondence detection between point clouds composed of multibeam sonar data is presented. This allows accurate initialization for point cloud alignment techniques even in cases where accurate inertial navigation is not available, such as iceberg profiling or vehicles with low-grade inertial navigation systems. Techniques from computer vision literature are used to extract, label, and match keypoints between "pseudo-images" generated from these point clouds. Image matches are refined using RANSAC and information about the vehicle trajectory. The resulting correspondences can be used to initialize an iterative closest point (ICP) registration algorithm to estimate accumulated navigation error and aid in the creation of accurate, self-consistent maps. The results presented use multibeam sonar data obtained from multiple overlapping passes of an underwater canyon in Monterey Bay, California. Using strict matching criteria, the method detects 23 between-swath correspondence events in a set of 155 pseudo-images with zero false positives. Using less conservative matching criteria doubles the number of matches but introduces several false positive matches as well. Heuristics based on known vehicle trajectory information are used to eliminate these.

  18. A heuristic approach to automated nipple detection in digital mammograms.

    PubMed

    Jas, Mainak; Mukhopadhyay, Sudipta; Chakraborty, Jayasree; Sadhu, Anup; Khandelwal, Niranjan

    2013-10-01

    In this paper, a heuristic approach to automated nipple detection in digital mammograms is presented. A multithresholding algorithm is first applied to segment the mammogram and separate the breast region from the background region. Next, the problem is considered separately for craniocaudal (CC) and mediolateral-oblique (MLO) views. In the simplified algorithm, a search is performed on the segmented image along a band around the centroid and in a direction perpendicular to the pectoral muscle edge in the MLO view image. The direction defaults to the horizontal (perpendicular to the thoracic wall) in case of CC view images. The farthest pixel from the base found in this direction can be approximated as the nipple point. Further, an improved version of the simplified algorithm is proposed which can be considered as a subclass of the Branch and Bound algorithms. The mean Euclidean distance between the ground truth and calculated nipple position for 500 mammograms from the Digital Database for Screening Mammography (DDSM) database was found to be 11.03 mm and the average total time taken by the algorithm was 0.79 s. Results of the proposed algorithm demonstrate that even simple heuristics can achieve the desired result in nipple detection thus reducing the time and computational complexity.

  19. Evaluation of automated target detection using image fusion

    NASA Astrophysics Data System (ADS)

    Irvine, John M.; Abramson, Susan; Mossing, John

    2003-09-01

    Reliance on Automated Target Recognition (ATR) technology is essential to the future success of Intelligence, Surveillance, and Reconnaissance (ISR) missions. Although benefits may be realized through ATR processing of a single data source, fusion of information across multiple images and multiple sensors promises significant performance gains. A major challenge, as ATR fusion technologies mature, is the establishment of sound methods for evaluating ATR performance in the context of data fusion. The Deputy Under Secretary of Defense for Science and Technology (DUSD/S&T), as part of its ongoing ATR Program, has sponsored an effort to develop and demonstrate methods for evaluating ATR algorithms that utilize multiple data sources, i.e., fusion-based ATR. This paper presents results from this program, focusing on the target detection and cueing aspect of the problem. The first step in assessing target detection performance is to relate the ground truth to the ATR decisions. Once the ATR decisions have been mapped to ground truth, the second step in the evaluation is to characterize ATR performance. A common approach is to vary the confidence threshold of the ATR and compute the Probability of Detection (PD) and the False Alarm Rate (FAR) associated with each threshold. Varying the threshold, therefore, produces an empirical performance curve relating detection performance to false alarms. Various statistical methods have been developed, largely in the medical imaging literature, to model this curve so that statistical inferences are possible. One approach, based on signal detection theory, generalizes the Receiver Operator Characteristic (ROC) curve. Under this approach, the Free Response Operating Characteristic (FROC) curve models performance for search problems. The FROC model is appropriate when multiple detections are possible and the number of false alarms is unconstrained. The parameterization of the FROC model provides a natural method for characterizing both
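    A small sketch of the second evaluation step, assuming detections have already been mapped to ground truth: sweep the confidence threshold and record the probability of detection and the false alarm rate (here, false alarms per image) at each setting; the example numbers are illustrative.

        import numpy as np

        def pd_far_curve(scores, is_true_target, n_targets, n_images):
            """Empirical PD/FAR operating points, one per unique confidence threshold."""
            scores = np.asarray(scores, dtype=float)
            hits = np.asarray(is_true_target, dtype=bool)
            curve = []
            for t in np.unique(scores):
                declared = scores >= t
                pd = (declared & hits).sum() / n_targets
                far = (declared & ~hits).sum() / n_images
                curve.append((t, pd, far))
            return curve

        # Six detections over three images containing four ground-truth targets.
        print(pd_far_curve([0.9, 0.8, 0.7, 0.6, 0.4, 0.3],
                           [True, True, False, True, False, True],
                           n_targets=4, n_images=3))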

  20. Detection of Operator Performance Breakdown as an Automation Triggering Mechanism

    NASA Technical Reports Server (NTRS)

    Yoo, Hyo-Sang; Lee, Paul U.; Landry, Steven J.

    2015-01-01

    Performance breakdown (PB) has been anecdotally described as a state where the human operator "loses control of context" and "cannot maintain required task performance." Preventing such a decline in performance is critical to assure the safety and reliability of human-integrated systems, and therefore PB could be useful as a point at which automation can be applied to support human performance. However, PB has never been scientifically defined or empirically demonstrated. Moreover, there is no validated objective way of detecting such a state or the transition to that state. The purpose of this work is: 1) to empirically demonstrate a PB state, and 2) to develop an objective way of detecting such a state. This paper defines PB and proposes an objective method for its detection. A human-in-the-loop study was conducted: 1) to demonstrate PB by increasing workload until the subject reported being in a state of PB, 2) to identify possible parameters of a detection method for objectively identifying the subjectively reported PB point, and 3) to determine if the parameters are idiosyncratic to an individual/context or are more generally applicable. In the experiment, fifteen participants were asked to manage three concurrent tasks (one primary and two secondary) for 18 minutes. The difficulty of the primary task was manipulated over time to induce PB while the difficulty of the secondary tasks remained static. The participants' task performance data were collected. Three hypotheses were constructed: 1) increasing workload will induce subjectively identified PB, 2) there exist criteria that identify the threshold parameters that best match the subjectively identified PB point, and 3) the criteria for choosing the threshold parameters are consistent across individuals. The results show that increasing workload can induce subjectively identified PB, although it might not be generalizable: only 12 out of 15 participants declared PB. The PB detection method based on

  1. Automated motion detection from space in sea surveillance

    NASA Astrophysics Data System (ADS)

    Charalambous, Elisavet; Takaku, Junichi; Michalis, Pantelis; Dowman, Ian; Charalampopoulou, Vasiliki

    2015-06-01

    The Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) carried by the Advanced Land-Observing Satellite (ALOS) was designed to generate worldwide topographic data with its high-resolution and stereoscopic observation. PRISM performs along-track (AT) triplet stereo observations using independent forward (FWD), nadir (NDR), and backward (BWD) panchromatic optical line sensors of 2.5 m ground resolution in swaths 35 km wide. The FWD and BWD sensors are arranged at an inclination of ±23.8° from NDR. In this paper, PRISM images are used from a new perspective, in the security domain for sea surveillance, based on the triplet sequence, which is acquired over an interval of 90 sec (45 sec between images). An automated motion detection algorithm is developed that combines the information captured at each instant and therefore identifies patterns and trajectories of moving objects at sea, including the extraction of geometric characteristics along with the speed and direction of movement. The developed methodology combines well-established image segmentation and morphological operation techniques for the detection of objects. Each object in the scene is represented by dimensionless measure properties and maintained in a database to allow the generation of trajectories as these arise over time, while the location of moving objects is updated based on the result of neighbourhood calculations. Most importantly, the developed methodology can be deployed on any airborne (optionally piloted) sensor system with along-track stereo capability, enabling near-real-time automatic detection of targets; a task that cannot be achieved with satellite imagery due to the very intermittent coverage.
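    A hedged sketch of the detection idea, assuming the two images have already been co-registered: difference two bands of the triplet, clean the change mask with a morphological opening, label the connected objects, and estimate speed from centroid displacement over the 45 s gap; the threshold is illustrative.

        import numpy as np
        from scipy import ndimage

        PIXEL_SIZE_M = 2.5   # PRISM ground resolution
        DT_S = 45.0          # time between successive triplet acquisitions

        def moving_object_centroids(img_t0, img_t1, thresh=30):
            """Label changed regions and return their centroids (row, col)."""
            diff = np.abs(img_t1.astype(float) - img_t0.astype(float)) > thresh
            diff = ndimage.binary_opening(diff, structure=np.ones((3, 3)))
            labels, n = ndimage.label(diff)
            return ndimage.center_of_mass(diff, labels, range(1, n + 1))

        def speed_m_per_s(c0, c1):
            """Speed implied by matching a centroid across two acquisitions."""
            d_pix = np.hypot(c1[0] - c0[0], c1[1] - c0[1])
            return d_pix * PIXEL_SIZE_M / DT_S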

  2. Automated detection of Martian water ice clouds: the Valles Marineris

    NASA Astrophysics Data System (ADS)

    Ogohara, Kazunori; Munetomo, Takafumi; Hatanaka, Yuji; Okumura, Susumu

    2016-10-01

    Water ice clouds need to be extracted from the large number of Mars images in order to reveal spatial and temporal variations in their occurrence and to understand their climatology meteorologically. However, the visible images observed by Mars orbiters over several years are too numerous to inspect visually, even if the inspection is limited to one region. An automated detection algorithm for Martian water ice clouds is therefore necessary for collecting ice cloud images efficiently. In addition, it may reveal new aspects of the spatial and temporal variations of water ice clouds of which we have not previously been aware. We present a method for automatically evaluating the presence of Martian water ice clouds using difference images and cross-correlation distributions calculated from blue band images of the Valles Marineris obtained by the Mars Orbiter Camera onboard the Mars Global Surveyor (MGS/MOC). We derived one subtracted image and one cross-correlation distribution from two reflectance images. The difference between the maximum and the average, the variance, the kurtosis, and the skewness of the subtracted image were calculated, and the same statistics were calculated for the cross-correlation distribution. These eight statistics were used as feature vectors for training a Support Vector Machine, and its generalization ability was tested using 10-fold cross-validation. F-measure and accuracy tended to be approximately 0.8 when the maximum of the normalized reflectance and the difference between the maximum and the average of the cross-correlation were chosen as features. In the process of developing the detection algorithm, we found many cases where the Valles Marineris became clearly brighter than adjacent areas in the blue band. It is at present unclear whether the bright Valles Marineris indicates the occurrence of water ice clouds inside the Valles Marineris. Therefore, subtracted images showing the bright Valles Marineris were excluded from the detection of
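    A minimal sketch of the classification step under the assumptions above: four statistics of the difference image and four of the cross-correlation distribution form an eight-element feature vector, which trains a support vector machine evaluated with 10-fold cross-validation; the synthetic arrays stand in for real MOC image pairs.

        import numpy as np
        from scipy.stats import kurtosis, skew
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        def stats_vector(arr):
            """Max-minus-mean, variance, kurtosis and skewness of a 2-D array."""
            a = arr.ravel()
            return [a.max() - a.mean(), a.var(), kurtosis(a), skew(a)]

        def features(diff_image, xcorr_map):
            return np.array(stats_vector(diff_image) + stats_vector(xcorr_map))

        rng = np.random.default_rng(1)
        # Placeholder data: 200 synthetic image pairs, half labelled "cloudy".
        X = np.array([features(rng.normal(size=(64, 64)), rng.normal(size=(64, 64)))
                      for _ in range(200)])
        y = np.repeat([0, 1], 100)
        print("10-fold CV accuracy:", cross_val_score(SVC(), X, y, cv=10).mean())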

  3. Fusion of geometric and thermographic data for automated defect detection

    NASA Astrophysics Data System (ADS)

    Oswald-Tranta, Beata; O'Leary, Paul

    2012-04-01

    Many workpieces produced in large numbers with a large variety of sizes and geometries, e.g. castings and forgings, have to be 100% inspected. In addition to geometric tolerances, material defects, e.g. surface cracks, also have to be detected. We present a fully automated nondestructive testing technique for both types of defects. The workpiece is subject to continuous motion, and during this motion two measurements are performed. In the first step, after applying a short inductive heating, a thermographic measurement is carried out. An infrared camera records the surface temperature of the workpiece enabling the localization of material defects and surface cracks. In the second step, a light sectioning measurement is performed to measure the three-dimensional geometry of the piece. With the help of feature-based registration the data from the two different sources are fused and evaluated together. The advantage of this technique is that a more reliable decision can be made about the nature of the failures and their possible causes. The same registration technique can also be used for the comparison of different pieces and therefore to localize different failure types, via comparison with a "golden," defect-free piece. The registration technique can be applied to any part that has unique geometric features, around which moments can be computed. Consequently, the inspection technique can be applied to many different parts. The efficacy of the method is demonstrated with measurements on three parts having different geometries.

  4. Automated single particle detection and tracking for large microscopy datasets.

    PubMed

    Wilson, Rhodri S; Yang, Lei; Dun, Alison; Smyth, Annya M; Duncan, Rory R; Rickman, Colin; Lu, Weiping

    2016-05-01

    Recent advances in optical microscopy have enabled the acquisition of very large datasets from living cells with unprecedented spatial and temporal resolutions. Our ability to process these datasets now plays an essential role in order to understand many biological processes. In this paper, we present an automated particle detection algorithm capable of operating in low signal-to-noise fluorescence microscopy environments and handling large datasets. When combined with our particle linking framework, it can provide hitherto intractable quantitative measurements describing the dynamics of large cohorts of cellular components from organelles to single molecules. We begin by validating the performance of our method on synthetic image data, and then extend the validation to include experimental images with ground truth. Finally, we apply the algorithm to two single-particle-tracking photo-activated localization microscopy biological datasets, acquired from living primary cells with very high temporal rates. Our analysis of the dynamics of very large cohorts of 10,000s of membrane-associated protein molecules shows that they behave as if caged in nanodomains. We show that the robustness and efficiency of our method provides a tool for the examination of single-molecule behaviour with unprecedented spatial detail and high acquisition rates.

  5. Automated single particle detection and tracking for large microscopy datasets

    PubMed Central

    Wilson, Rhodri S.; Yang, Lei; Dun, Alison; Smyth, Annya M.; Duncan, Rory R.; Rickman, Colin

    2016-01-01

    Recent advances in optical microscopy have enabled the acquisition of very large datasets from living cells with unprecedented spatial and temporal resolutions. Our ability to process these datasets now plays an essential role in order to understand many biological processes. In this paper, we present an automated particle detection algorithm capable of operating in low signal-to-noise fluorescence microscopy environments and handling large datasets. When combined with our particle linking framework, it can provide hitherto intractable quantitative measurements describing the dynamics of large cohorts of cellular components from organelles to single molecules. We begin by validating the performance of our method on synthetic image data, and then extend the validation to include experimental images with ground truth. Finally, we apply the algorithm to two single-particle-tracking photo-activated localization microscopy biological datasets, acquired from living primary cells with very high temporal rates. Our analysis of the dynamics of very large cohorts of 10,000s of membrane-associated protein molecules shows that they behave as if caged in nanodomains. We show that the robustness and efficiency of our method provides a tool for the examination of single-molecule behaviour with unprecedented spatial detail and high acquisition rates. PMID:27293801

  6. Automated optic disk boundary detection by modified active contour model.

    PubMed

    Xu, Juan; Chutatape, Opas; Chew, Paul

    2007-03-01

    This paper presents a novel deformable-model-based algorithm for fully automated detection of the optic disk boundary in fundus images. The proposed method improves and extends the original snake (deforming-only technique) in two aspects: clustering and smoothing update. The contour points are first self-separated into an edge-point group or an uncertain-point group by clustering after each deformation, and these contour points are then updated by different criteria based on the different groups. The updating process combines both the local and global information of the contour to achieve a balance of contour stability and accuracy. The modifications make the proposed algorithm more accurate and robust to blood vessel occlusions, noise, ill-defined edges and fuzzy contour shapes. The comparative results show that the proposed method can estimate the disk boundaries of 100 test images closer to the ground truth, as measured by mean distance to closest point (MDCP) <3 pixels, with a better success rate than those obtained by the gradient vector flow snake (GVF-snake) and modified active shape models (ASM).

  7. Stage Evolution of Office Automation Technological Change and Organizational Learning.

    ERIC Educational Resources Information Center

    Sumner, Mary

    1985-01-01

    A study was conducted to identify stage characteristics in terms of technology, applications, the role and responsibilities of the office automation organization, and planning and control strategies; and to describe the respective roles of data processing professionals, office automation analysts, and users in office automation systems development…

  8. Automated Ground Penetrating Radar hyperbola detection in complex environment

    NASA Astrophysics Data System (ADS)

    Mertens, Laurence; Lambot, Sébastien

    2015-04-01

    Ground Penetrating Radar (GPR) systems are commonly used in many applications to detect, amongst others, buried targets (various types of pipes, landmines, tree roots ...), which, in a cross-section, theoretically present a particular hyperbolic-shaped signature resulting from the antenna radiation pattern. Considering the large quantity of information acquired during a field campaign, manual detection of these hyperbolas is barely possible, so a quick and automated detection method is needed. However, this task can prove laborious on real field data because the hyperbolas are often ill-shaped due to the heterogeneity of the medium and to instrumentation clutter. We propose a new detection algorithm for well- and ill-shaped GPR reflection hyperbolas, developed especially for complex field data. The algorithm emulates human pattern recognition to identify the hyperbola apexes. The main principle is to fit the GPR image edge points, detected with a Canny filter, to analytical hyperbolas, treating the object as a point disturbance with physical constraints on the parameters. A long phase of observation of a large number of ill-shaped hyperbolas in various complex media led to the definition of criteria characterizing the hyperbolic shape and to the choice of value ranges within which an edge point is accepted as the apex of a specific hyperbola. These values were chosen to match the ambiguity zone of human interpretation and have the particularity of remaining functional in most heterogeneous media. Furthermore, irregularity is taken into account by defining a buffer zone around the theoretical hyperbola within which the edge points must fall to belong to that hyperbola. First, the method was tested in laboratory conditions over tree roots and over PVC pipes with both time- and frequency-domain radars.
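    A hedged sketch of the fitting principle, simplified to a single point target: Canny edge points extracted from the B-scan are fitted to the analytical hyperbola t(x) = (2/v) * sqrt(d^2 + (x - x0)^2), where x0 is the apex position, d the target depth and v the wave velocity; the sampling steps, initial guess and bounds are illustrative, and the paper's edge-point grouping and acceptance criteria are omitted.

        import numpy as np
        from scipy.optimize import curve_fit
        from skimage.feature import canny

        def hyperbola(x, x0, d, v):
            """Two-way travel time of a point reflector at depth d below position x0."""
            return 2.0 / v * np.sqrt(d**2 + (x - x0)**2)

        def fit_apex(bscan, dx=0.02, dt=1e-10):
            """bscan: 2-D array (time samples x traces). Returns fitted (x0, d, v)."""
            edges = canny(bscan / np.abs(bscan).max())    # edge points of the signature
            t_idx, x_idx = np.nonzero(edges)
            x, t = x_idx * dx, t_idx * dt
            p0 = (x.mean(), 0.5, 1.0e8)                   # mid-scan apex, 0.5 m, 0.1 m/ns
            popt, _ = curve_fit(hyperbola, x, t, p0=p0,
                                bounds=([x.min(), 0.01, 5e7], [x.max(), 5.0, 3e8]))
            return popt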

  9. Rapid toxicity detection in water quality control utilizing automated multispecies biomonitoring for permanent space stations

    NASA Technical Reports Server (NTRS)

    Morgan, E. L.; Young, R. C.; Smith, M. D.; Eagleson, K. W.

    1986-01-01

    The objective of this study was to evaluate proposed design characteristics and applications of automated biomonitoring devices for real-time toxicity detection in water quality control on-board permanent space stations. Downlink transmission of automated biomonitoring data to Earth-receiving stations was simulated using satellite data transmissions from remote Earth-based stations.

  10. Quantitative Automated Image Analysis System with Automated Debris Filtering for the Detection of Breast Carcinoma Cells

    PubMed Central

    Martin, David T.; Sandoval, Sergio; Ta, Casey N.; Ruidiaz, Manuel E.; Cortes-Mateos, Maria Jose; Messmer, Davorka; Kummel, Andrew C.; Blair, Sarah L.; Wang-Rodriguez, Jessica

    2011-01-01

    Objective To develop an intraoperative method for margin status evaluation during breast conservation therapy (BCT) using an automated analysis of imprint cytology specimens. Study Design Imprint cytology samples were prospectively taken from 47 patients undergoing either BCT or breast reduction surgery. Touch preparations from BCT patients were taken on cut sections through the tumor to generate positive margin controls. For breast reduction patients, slide imprints were taken at cuts through the center of excised tissue. Analysis results from the presented technique were compared against standard pathologic diagnosis. Slides were stained with cytokeratin and Hoechst, imaged with an automated fluorescent microscope, and analyzed with a fast algorithm to automate discrimination between epithelial cells and noncellular debris. Results The accuracy of the automated analysis was 95% for identifying invasive cancers compared against final pathologic diagnosis. The overall sensitivity was 87% while specificity was 100% (no false positives). This is comparable to the best reported results from manual examination of intraoperative imprint cytology slides while reducing the need for direct input from a cytopathologist. Conclusion This work demonstrates a proof of concept for developing a highly accurate and automated system for the intraoperative evaluation of margin status to guide surgical decisions and lower positive margin rates. PMID:21525740

  11. Fully automated and colorimetric foodborne pathogen detection on an integrated centrifugal microfluidic device.

    PubMed

    Oh, Seung Jun; Park, Byung Hyun; Choi, Goro; Seo, Ji Hyun; Jung, Jae Hwan; Choi, Jong Seob; Kim, Do Hyun; Seo, Tae Seok

    2016-05-21

    This work describes fully automated and colorimetric foodborne pathogen detection on an integrated centrifugal microfluidic device, which is called a lab-on-a-disc. All the processes for molecular diagnostics including DNA extraction and purification, DNA amplification and amplicon detection were integrated on a single disc. Silica microbeads incorporated in the disc enabled extraction and purification of bacterial genomic DNA from bacteria-contaminated milk samples. We targeted four kinds of foodborne pathogens (Escherichia coli O157:H7, Salmonella typhimurium, Vibrio parahaemolyticus and Listeria monocytogenes) and performed loop-mediated isothermal amplification (LAMP) to amplify the specific genes of the targets. Colorimetric detection mediated by a metal indicator confirmed the results of the LAMP reactions with the colour change of the LAMP mixtures from purple to sky blue. The whole process was conducted in an automated manner using the lab-on-a-disc and a miniaturized rotary instrument equipped with three heating blocks. We demonstrated that a milk sample contaminated with foodborne pathogens can be automatically analysed on the centrifugal disc even at the 10 bacterial cell level in 65 min. The simplicity and portability of the proposed microdevice would provide an advanced platform for point-of-care diagnostics of foodborne pathogens, where prompt confirmation of food quality is needed. PMID:27112702

  12. Automated shock detection and analysis algorithm for space weather application

    NASA Astrophysics Data System (ADS)

    Vorotnikov, Vasiliy S.; Smith, Charles W.; Hu, Qiang; Szabo, Adam; Skoug, Ruth M.; Cohen, Christina M. S.

    2008-03-01

    Space weather applications have grown steadily as real-time data have become increasingly available. Numerous industrial applications have arisen, with safeguarding of the power distribution grids being a particular interest. NASA uses short-term and long-term space weather predictions in its launch facilities. Researchers studying ionospheric, auroral, and magnetospheric disturbances use real-time space weather services to determine launch times. Commercial airlines, communication companies, and the military use space weather measurements to manage their resources and activities. As the effects of solar transients upon the Earth's environment and society grow with the increasing complexity of technology, better tools are needed to monitor and evaluate the characteristics of the incoming disturbances. There is a need for automated shock detection and analysis methods that are applicable to in situ measurements upstream of the Earth. Such tools can provide advance warning of approaching disturbances that have significant space weather impacts. Knowledge of the shock strength and speed can also provide insight into the nature of the approaching solar transient prior to arrival at the magnetopause. We report on efforts to develop a tool that can find and analyze shocks in interplanetary plasma data without operator intervention. This method will run with sufficient speed to be a practical space weather tool, providing useful shock information within 1 min of the necessary data reaching the ground. The ability to run without human intervention frees space weather operators to perform other vital services. We describe ways of handling upstream data that minimize the frequency of false positive alerts while providing the most complete description of approaching disturbances that is reasonably possible.

  13. Algorithm for Automated Detection of Edges of Clouds

    NASA Technical Reports Server (NTRS)

    Ward, Jennifer G.; Merceret, Francis J.

    2006-01-01

    An algorithm processes cloud-physics data gathered in situ by an aircraft, along with reflectivity data gathered by ground-based radar, to determine whether the aircraft is inside or outside a cloud at a given time. A cloud edge is deemed to be detected when the in/out state changes, subject to a hysteresis constraint. Such determinations are important in continuing research on relationships among lightning, electric charges in clouds, and decay of electric fields with distance from cloud edges.
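    A minimal sketch of the in/out decision with hysteresis: the aircraft is declared inside a cloud only when a cloud-physics measurement exceeds an upper threshold and outside again only when it falls below a lower one, so small fluctuations near a single threshold do not register as spurious edge crossings; the threshold values and the choice of measurement are illustrative.

        def cloud_edges(signal, enter_thresh=10.0, exit_thresh=5.0):
            """Return (sample index, 'enter' or 'exit') events from a measurement series."""
            inside = False
            edges = []
            for i, value in enumerate(signal):
                if not inside and value > enter_thresh:
                    inside, event = True, "enter"
                elif inside and value < exit_thresh:
                    inside, event = False, "exit"
                else:
                    continue
                edges.append((i, event))
            return edges

        print(cloud_edges([1, 4, 12, 11, 7, 6, 3, 2, 14, 1]))
        # -> [(2, 'enter'), (6, 'exit'), (8, 'enter'), (9, 'exit')]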

  14. Change Detection in Auditory Textures.

    PubMed

    Boubenec, Yves; Lawlor, Jennifer; Shamma, Shihab; Englitz, Bernhard

    2016-01-01

    Many natural sounds have spectrotemporal signatures only on a statistical level, e.g. wind, fire or rain. While their local structure is highly variable, the spectrotemporal statistics of these auditory textures can be used for recognition. This suggests the existence of a neural representation of these statistics. To explore their encoding, we investigated the detectability of changes in the spectral statistics in relation to the properties of the change. To achieve precise parameter control, we designed a minimal sound texture, a modified cloud of tones, which retains the central property of auditory textures: solely statistical predictability. Listeners had to rapidly detect a change in the frequency marginal probability of the tone cloud occurring at a random time. The size of change as well as the time available to sample the original statistics were found to correlate positively with performance and negatively with reaction time, suggesting the accumulation of noisy evidence. In summary, we quantified dynamic aspects of change detection in statistically defined contexts, and found evidence of integration of statistical information.

  15. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It demonstrated high accuracy and at the same time maximized observer independence and time reduction, and thus usefulness for clinical routine.

  16. Detection of Salmonella from chicken rinses and chicken hot dogs with the automated BAX PCR system.

    PubMed

    Bailey, J S; Cosby, D E

    2003-11-01

    The BAX system with automated PCR detection was compared with standard cultural procedures for the detection of naturally occurring and spiked Salmonella in 183 chicken carcass rinses and 90 chicken hot dogs. The automated assay procedure consists of overnight growth (16 to 18 h) of the sample in buffered peptone broth at 35 degrees C, transfer of the sample to lysis tubes, incubation and lysis of the cells, transfer of the sample to PCR tubes, and placement of tubes into the cycler-detector, which runs automatically. The automated PCR detection assay takes about 4 h after 16 to 24 h of overnight preenrichment. The culture procedure consists of preenrichment, enrichment, plating, and serological confirmation and takes about 72 h. Three trials involving 10 to 31 samples were carried out for each product. Some samples were spiked with Salmonella Typhimurium, Salmonella Heidelberg, Salmonella Montevideo, and Salmonella Enteritidis at 1 to 250 cells per ml of rinse or 1 to 250 cells per g of meat. For unspiked chicken rinses, Salmonella was detected in 2 of 61 samples with the automated system and in 1 of 61 samples with the culture method. Salmonella was recovered from 111 of 122 spiked samples with the automated PCR system and from 113 of 122 spiked samples with the culture method. For chicken hot dogs, Salmonella was detected in all 60 of the spiked samples with both the automated PCR and the culture procedures. For the 30 unspiked samples, Salmonella was recovered from 19 samples with the automated PCR system and from 10 samples with the culture method. The automated PCR system provided reliable Salmonella screening of chicken product samples within 24 h.

  17. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    NASA Technical Reports Server (NTRS)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  18. Land-cover change detection

    USGS Publications Warehouse

    Chen, Xuexia; Giri, Chandra; Vogelmann, James

    2012-01-01

    Land cover is the biophysical material on the surface of the earth. Land-cover types include grass, shrubs, trees, barren, water, and man-made features. Land cover changes continuously. The rate of change can be either dramatic and abrupt, such as the changes caused by logging, hurricanes and fire, or subtle and gradual, such as regeneration of forests and damage caused by insects (Verbesselt et al., 2001). Previous studies have shown that land cover has changed dramatically during the past several centuries and that these changes have severely affected our ecosystems (Foody, 2010; Lambin et al., 2001). Lambin and Strahlers (1994b) summarized five types of cause for land-cover changes: (1) long-term natural changes in climate conditions, (2) geomorphological and ecological processes, (3) human-induced alterations of vegetation cover and landscapes, (4) interannual climate variability, and (5) the human-induced greenhouse effect. Tools and techniques are needed to detect, describe, and predict these changes to facilitate sustainable management of natural resources.

  19. A method for the automated detection of phishing websites through both site characteristics and image analysis

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as the number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results that demonstrate the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as the resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
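    A hedged sketch of the visual-comparison step: each rendered page screenshot is reduced to a 64-bit average hash and the Hamming distance between hashes measures visual similarity; the hashing scheme is one common choice rather than the paper's exact method, the file names are placeholders, and the distance threshold is illustrative.

        import numpy as np
        from PIL import Image

        def average_hash(path, hash_size=8):
            """64-bit boolean hash: pixels brighter than the mean of an 8x8 thumbnail."""
            img = Image.open(path).convert("L").resize((hash_size, hash_size))
            pixels = np.asarray(img, dtype=float)
            return (pixels > pixels.mean()).flatten()

        def hamming(hash_a, hash_b):
            return int(np.count_nonzero(hash_a != hash_b))

        h1 = average_hash("suspect_page.png")             # placeholder screenshots
        h2 = average_hash("known_phish_template.png")
        print("likely visual clone" if hamming(h1, h2) <= 10 else "different layout")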

  20. Short wave–automated perimetry (SWAP) versus optical coherence tomography in early detection of glaucoma

    PubMed Central

    Zaky, Adel Galal; Yassin, Ahmed Tarek; El Sayid, Saber Hamed

    2016-01-01

    Objective To assess the role and diagnostic effectiveness of optical coherence tomography (OCT) and short wave–automated perimetry (SWAP) to distinguish between normal, glaucoma suspect, and definitively diagnosed glaucomatous eyes. Background Changes in the optic disc and retinal nerve fiber layer (RNFL) often precede the appearance of visual field defects with standard automated perimetry. Unfortunately, RNFL defects can be difficult to identify during clinical examination. Early detection of glaucoma is still controversial, whether by OCT, SWAP, or frequency-doubling technology perimetry. Patients and methods In this randomized controlled, consecutive, prospective study, a total of 70 subjects (140 eyes) were included, divided into three groups: Group A, 10 healthy volunteers (20 eyes); Group B, 30 patients (60 eyes) with suspected glaucoma; and Group C, 30 patients (60 eyes) with already diagnosed glaucomatous eyes. Results Average RNFL thickness was 75±9.0 in the glaucoma group, 99±15.5 in the control group, and 94±12 in the glaucoma suspect group. The inferior quadrant was the earliest parameter affected. There was a significant correlation between visual field parameters and RNFL thickness in both the glaucoma and glaucoma suspect groups. Conclusion Both RNFL thickness measured by OCT and SWAP indices are good discrimination tools between glaucomatous, glaucoma suspect, and normal eyes. OCT parameters tend to be more sensitive than SWAP parameters. PMID:27698551

  1. Short wave–automated perimetry (SWAP) versus optical coherence tomography in early detection of glaucoma

    PubMed Central

    Zaky, Adel Galal; Yassin, Ahmed Tarek; El Sayid, Saber Hamed

    2016-01-01

    Objective To assess the role and diagnostic effectiveness of optical coherence tomography (OCT) and short wave–automated perimetry (SWAP) to distinguish between normal, glaucoma suspect, and definitively diagnosed glaucomatous eyes. Background Changes in the optic disc and retinal nerve fiber layer (RNFL) often precede the appearance of visual field defects with standard automated perimetry. Unfortunately, RNFL defects can be difficult to identify during clinical examination. Early detection of glaucoma is still controversial, whether by OCT, SWAP, or frequency-doubling technology perimetry. Patients and methods In this randomized controlled, consecutive, prospective study, a total of 70 subjects (140 eyes) were included, divided into three groups: Group A, 10 healthy volunteers (20 eyes); Group B, 30 patients (60 eyes) with suspected glaucoma; and Group C, 30 patients (60 eyes) with already diagnosed glaucomatous eyes. Results Average RNFL thickness was 75±9.0 in the glaucoma group, 99±15.5 in the control group, and 94±12 in the glaucoma suspect group. The inferior quadrant was the earliest parameter affected. There was a significant correlation between visual field parameters and RNFL thickness in both the glaucoma and glaucoma suspect groups. Conclusion Both RNFL thickness measured by OCT and SWAP indices are good discrimination tools between glaucomatous, glaucoma suspect, and normal eyes. OCT parameters tend to be more sensitive than SWAP parameters.

  2. An automated approach to detecting signals in electroantennogram data

    USGS Publications Warehouse

    Slone, D.H.; Sullivan, B.T.

    2007-01-01

    Coupled gas chromatography/electroantennographic detection (GC-EAD) is a widely used method for identifying insect olfactory stimulants present in mixtures of volatiles, and it can greatly accelerate the identification of insect semiochemicals. In GC-EAD, voltage changes across an insect's antenna are measured while the antenna is exposed to compounds eluting from a gas chromatograph. The antenna thus serves as a selective GC detector whose output can be compared to that of a "general" GC detector, commonly a flame ionization detector. Appropriate interpretation of GC-EAD results requires that olfaction-related voltage changes in the antenna be distinguishable from background noise that arises inevitably from antennal preparations and the GC-EAD-associated hardware. In this paper, we describe and compare mathematical algorithms for discriminating olfaction-generated signals in an EAD trace from background noise. The algorithms amplify signals by recognizing their characteristic shape and wavelength while suppressing unstructured noise. We have found these algorithms to be both powerful and highly discriminatory even when applied to noisy traces where the signals would be difficult to discriminate by eye. This new methodology removes operator bias as a factor in signal identification, can improve realized sensitivity of the EAD system, and reduces the number of runs required to confirm the identity of an olfactory stimulant. © 2007 Springer Science+Business Media, LLC.
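    A hedged sketch of shape-based discrimination in the spirit described above: the trace is cross-correlated with a template deflection of the expected width and peaks exceeding a noise-scaled threshold are flagged; the template shape, width and threshold factor are illustrative and not the authors' algorithm.

        import numpy as np
        from scipy.signal import correlate, find_peaks

        def detect_responses(trace, template_width=50, k=5.0):
            """Return indices of candidate olfactory responses in an EAD trace."""
            t = np.arange(template_width)
            tau = template_width / 4.0
            template = -(t / tau) * np.exp(-t / tau)      # negative-going alpha-like deflection
            template /= np.linalg.norm(template)          # unit energy
            score = correlate(trace - np.median(trace), template, mode="same")
            mad = np.median(np.abs(score - np.median(score)))
            peaks, _ = find_peaks(score, height=k * 1.4826 * mad)  # robust noise scale
            return peaks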

  3. Strategies for Working with Library Staff Members in Embracing Change Caused by Library Automation.

    ERIC Educational Resources Information Center

    Shepherd, Murray

    This paper begins with a discussion of information management as it pertains to the four operations of automated library systems (i.e., acquisitions, cataloging, circulation, and reference). Library staff reactions to library automation change are summarized, including uncertainty, cynicism, and resignation or hope. Common pitfalls that interfere…

  4. Automated detection of a prostate Ni-Ti stent in electronic portal images

    SciTech Connect

    Carl, Jesper; Nielsen, Henning; Nielsen, Jane; Lund, Bente; Larsen, Erik Hoejkjaer

    2006-12-15

    Planning target volumes (PTV) in fractionated radiotherapy still have to be outlined with wide margins to the clinical target volume due to uncertainties arising from daily shifts of the prostate position. A recently proposed new method of visualization of the prostate is based on insertion of a thermo-expandable Ni-Ti stent. The current study proposes a new algorithm for automated detection of the Ni-Ti stent in electronic portal images. The algorithm is based on the Ni-Ti stent having a cylindrical shape with a fixed diameter. The automated method uses enhancement of lines combined with a grayscale morphology operation that looks for enhanced pixels separated by a distance similar to the diameter of the stent. The images in this study are all from prostate cancer patients treated with radiotherapy in a previous study. Images of a stent inserted in a humanoid phantom demonstrated a localization accuracy of 0.4-0.7 mm, which equals the pixel size in the image. The automated detection of the stent was compared to manual detection in 71 pairs of orthogonal images taken in nine patients. The algorithm was successful in 67 of 71 pairs of images. The method is fast, has a high success rate and good accuracy, and has potential for unsupervised localization of the prostate before radiotherapy, which would enable automated repositioning before treatment and allow for the use of very tight PTV margins.

  5. Water quality change detection: multivariate algorithms

    NASA Astrophysics Data System (ADS)

    Klise, Katherine A.; McKenna, Sean A.

    2006-05-01

    In light of growing concern over the safety and security of our nation's drinking water, increased attention has been focused on advanced monitoring of water distribution systems. The key to these advanced monitoring systems lies in the combination of real-time data and robust statistical analysis. Currently available data streams from sensors provide near real-time information on water quality. Combining these data streams with change detection algorithms, this project aims to develop automated monitoring techniques that will classify real-time data and denote anomalous water types. Here, water quality data in 1-hour increments over 3000 hours at 4 locations are used to test multivariate algorithms to detect anomalous water quality events. The algorithms use all available water quality sensors to measure deviation from expected water quality. Simulated anomalous water quality events are added to the measured data to test three approaches to measuring this deviation. These approaches use multivariate distance measures to 1) the previous observation, 2) the closest observation in multivariate space, and 3) the closest cluster of previous water quality observations. Clusters are established using k-means classification. Each approach uses a moving window of previous water quality measurements to classify the current measurement as normal or anomalous. Receiver Operating Characteristic (ROC) curves test the ability of each approach to discriminate between normal and anomalous water quality using a variety of thresholds and simulated anomalous events. These analyses result in a better understanding of the deviation from normal water quality that is necessary to sound an alarm.
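    A minimal sketch of the third approach listed above: standardise the sensor streams over a moving window, cluster the windowed history with k-means, and flag the current multivariate sample when its distance to the nearest cluster centre exceeds a threshold; the window handling, number of clusters and threshold are illustrative.

        import numpy as np
        from sklearn.cluster import KMeans

        def is_anomalous(current_sample, window, n_clusters=3, threshold=2.0):
            """window: (n_history, n_sensors) array; current_sample: (n_sensors,) array."""
            mu, sigma = window.mean(axis=0), window.std(axis=0) + 1e-9
            z_hist = (window - mu) / sigma                # standardise each sensor stream
            z_now = (current_sample - mu) / sigma
            km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(z_hist)
            dist = np.min(np.linalg.norm(km.cluster_centers_ - z_now, axis=1))
            return dist > threshold, dist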

  6. Cell-Detection Technique for Automated Patch Clamping

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2008-01-01

    A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: A standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed. A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image

  7. Managing laboratory automation in a changing pharmaceutical industry.

    PubMed

    Rutherford, M L

    1995-01-01

    The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, increasing customer service, reducing cycle time, while ensuring accurate results and more effective use of their staff. To achieve these expectations, many laboratories are using robotics and automated work stations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, project outcome, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented. PMID:18925014

  8. Managing laboratory automation in a changing pharmaceutical industry.

    PubMed

    Rutherford, M L

    1995-01-01

    The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, increasing customer service, reducing cycle time, while ensuring accurate results and more effective use of their staff. To achieve these expectations, many laboratories are using robotics and automated work stations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, project outcome, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented.

  9. An Optimal Cell Detection Technique for Automated Patch Clamping

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2004-01-01

    While several hardware techniques for the automated patch clamping of cells describe the equipment and apparatus used, very few explain the science behind locating the ideal cell for a patch-clamping procedure. We present a machine vision approach to patch-clamping cell selection by developing an intelligent algorithm technique that gives the user the ability to identify a good cell to patch clamp in an image within one second. This technique will aid the user in determining the best candidates for patch clamping and will ultimately save time, increase efficiency and reduce cost. The ultimate goal is to combine intelligent processing with instrumentation and controls in order to produce a complete turnkey automated patch-clamping system capable of accurately and reliably patch clamping cells with a minimum amount of human intervention. We present a unique technique that identifies good patch-clamping cell candidates based on feature metrics of a cell's (x, y) position, major axis length, minor axis length, area, elongation, roundness, smoothness, angle of orientation, thinness and whether or not the cell is only partially in the field of view. A patent is pending for this research.

  10. An Automated Classification Technique for Detecting Defects in Battery Cells

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2006-01-01

    Battery cell defect classification is primarily done manually by a human conducting a visual inspection to determine if the battery cell is acceptable for a particular use or device. Human visual inspection is a time consuming task when compared to an inspection process conducted by a machine vision system. Human inspection is also subject to human error and fatigue over time. We present a machine vision technique that can be used to automatically identify defective sections of battery cells via a morphological feature-based classifier using an adaptive two-dimensional fast Fourier transformation technique. The initial area of interest is automatically classified as either an anode or cathode cell view as well as classified as an acceptable or a defective battery cell. Each battery cell is labeled and cataloged for comparison and analysis. The result is the implementation of an automated machine vision technique that provides a highly repeatable and reproducible method of identifying and quantifying defects in battery cells.

  11. On Radar Resolution in Coherent Change Detection.

    SciTech Connect

    Bickel, Douglas L.

    2015-11-01

    It is commonly observed that resolution plays a role in coherent change detection. Although this is the case, the relationship of resolution to coherent change detection performance is not yet defined. In this document, we present an analytical method of evaluating this relationship using detection theory. Specifically, we examine the effect of resolution on receiver operating characteristic curves for coherent change detection.

  12. Assessment of an Automated Touchdown Detection Algorithm for the Orion Crew Module

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.

    2011-01-01

    Orion Crew Module (CM) touchdown detection is critical to activating the post-landing sequence that safes the Reaction Control System (RCS) jets, ensures that the vehicle remains upright, and establishes communication with recovery forces. In order to accommodate safe landing of an unmanned vehicle or incapacitated crew, an onboard automated detection system is required. An Orion-specific touchdown detection algorithm was developed and evaluated to differentiate landing events from in-flight events. The proposed method will be used to initiate post-landing cutting of the parachute riser lines, to prevent CM rollover, and to terminate RCS jet firing prior to submersion. The RCS jets continue to fire until touchdown to maintain proper CM orientation with respect to the flight path and to limit impact loads, but have potentially hazardous consequences if submerged while firing. The time available after impact to cut risers and initiate the CM Up-righting System (CMUS) is measured in minutes, whereas the time from touchdown to RCS jet submersion is a function of descent velocity and sea state conditions, and is often less than one second. Evaluation of the detection algorithms was performed for in-flight events (e.g. descent under chutes) using hi-fidelity rigid body analyses in the Decelerator Systems Simulation (DSS), whereas water impacts were simulated using a rigid finite element model of the Orion CM in LS-DYNA. Two touchdown detection algorithms were evaluated with various thresholds: acceleration magnitude spike detection, and accumulated velocity change (over a given time window) spike detection. Data for both detection methods are acquired from an onboard Inertial Measurement Unit (IMU) sensor. The detection algorithms were tested with analytically generated in-flight and landing IMU data simulations. The acceleration spike detection proved to be faster while maintaining the desired safety margin. Time to RCS jet submersion was predicted analytically across a series of
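    A hedged sketch of the two candidate triggers named above, acting on IMU samples: an instantaneous acceleration-magnitude spike, or the accumulated velocity change over a short sliding window, exceeding a threshold; the sample rate and threshold values are illustrative, not flight values.

        import numpy as np

        def touchdown_index(accel_g, dt=0.01, g_trigger=8.0,
                            dv_window_s=0.2, dv_trigger=3.0):
            """accel_g: (n, 3) sensed acceleration in g. Returns first trigger index or None."""
            mag = np.linalg.norm(accel_g, axis=1)                          # |a| in g
            win = max(1, int(dv_window_s / dt))
            dv = np.convolve(mag * 9.81 * dt, np.ones(win), mode="same")   # m/s over window
            hits = np.nonzero((mag > g_trigger) | (dv > dv_trigger))[0]
            return int(hits[0]) if hits.size else None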

  13. An automated walk-over weighing system as a tool for measuring liveweight change in lactating dairy cows.

    PubMed

    Dickinson, R A; Morton, J M; Beggs, D S; Anderson, G A; Pyman, M F; Mansell, P D; Blackwood, C B

    2013-07-01

    Automated walk-over weighing systems can be used to monitor liveweights of cattle. Minimal literature exists to describe agreement between automated and static scales, and no known studies describe repeatability when used for daily measurements of dairy cows. This study establishes the repeatability of an automated walk-over cattle-weighing system, and agreement with static electronic scales, when used in a commercial dairy herd to weigh lactating cows. Forty-six lactating dairy cows from a seasonal calving, pasture-based dairy herd in southwest Victoria, Australia, were weighed once using a set of static scales and repeatedly using an automated walk-over weighing system at the exit of a rotary dairy. Substantial agreement was observed between the automated and static scales when assessed using Lin's concordance correlation coefficient. Weights measured by the automated walkover scales were within 5% of those measured by the static scales in 96% of weighings. Bland and Altman's 95% limits of agreement were -23.3 to 43.6 kg, a range of 66.9 kg. The 95% repeatability coefficient for automated weighings was 46.3 kg. Removal of a single outlier from the data set increased Lin's concordance coefficient, narrowed Bland and Altman's 95% limits of agreement to a range of 32.5 kg, and reduced the 95% repeatability coefficient to 18.7 kg. Cow misbehavior during walk-over weighing accounted for many of the larger weight discrepancies. The automated walk-over weighing system showed substantial agreement with the static scales when assessed using Lin's concordance correlation coefficient. This contrasted with limited agreement when assessed using Bland and Altman's method, largely due to poor repeatability. This suggests the automated weighing system is inadequate for detecting small liveweight differences in individual cows based on comparisons of single weights. Misbehaviors and other factors can result in the recording of spurious values on walk-over scales. Excluding

  14. Automated Network Anomaly Detection with Learning, Control and Mitigation

    ERIC Educational Resources Information Center

    Ippoliti, Dennis

    2014-01-01

    Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…

  15. Automation - Changes in cognitive demands and mental workload

    NASA Technical Reports Server (NTRS)

    Tsang, Pamela S.; Johnson, Walter W.

    1987-01-01

    The effect of partial automation on mental workloads in man/machine tasks is investigated experimentally. Subjective workload measures are obtained from six subjects after performance of a task battery comprising two manual (flight-path control, FC, and target acquisition, TA) tasks and one decisionmaking (engine failure, EF) task; the FC task was performed in both a fully manual (altitude and lateral control) mode and in a semiautomated mode (automatic latitude control). The performance results and subjective evaluations are presented in graphs and characterized in detail. The automation is shown to improve objective performance and lower subjective workload significantly in the combined FC/TA task, but not in the FC task alone or in the FC/EF task.

  16. Automated detection scheme of architectural distortion in mammograms using adaptive Gabor filter

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Ruriha; Teramoto, Atsushi; Matsubara, Tomoko; Fujita, Hiroshi

    2013-03-01

    Breast cancer is a serious health concern for all women. Computer-aided detection for mammography has been used for detecting masses and micro-calcifications. However, automated detection of architectural distortion remains challenging with respect to sensitivity. In this study, we propose a novel automated method for detecting architectural distortion. Our method consists of analysis of the mammary gland structure, detection of the distorted region, and reduction of false positive results. We developed an adaptive Gabor filter for analyzing the mammary gland structure that sets the filter parameters depending on the thickness of the gland structure. As for post-processing, healthy mammary glands that run from the nipple to the chest wall are eliminated by angle analysis. Moreover, background mammary glands are removed based on the intensity output image obtained from the adaptive Gabor filter. The distorted region of the mammary gland is then detected as an initial candidate using a concentration index followed by binarization and labeling. False positives in the initial candidates are eliminated using 23 types of characteristic features and a support vector machine. In the experiments, we compared the automated detection results with interpretations by a radiologist using 50 cases (200 images) from the Digital Database for Screening Mammography (DDSM). As a result, the true positive rate was 82.72%, and the number of false positives per image was 1.39. These results indicate that the proposed method may be useful for detecting architectural distortion in mammograms.
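    A minimal sketch of the orientation analysis the method builds on: filter the mammogram with a small Gabor bank and keep, per pixel, the strongest response and its orientation; adapting the frequency to the estimated gland thickness (the "adaptive" part) is only hinted at here, and the parameter values are illustrative.

        import numpy as np
        from skimage.filters import gabor

        def gland_orientation_map(image, frequency=0.1, n_orientations=8):
            """Return per-pixel maximum Gabor magnitude and the orientation achieving it."""
            best_mag = np.zeros(image.shape, dtype=float)
            best_theta = np.zeros(image.shape, dtype=float)
            for theta in np.linspace(0.0, np.pi, n_orientations, endpoint=False):
                real, imag = gabor(image, frequency=frequency, theta=theta)
                mag = np.hypot(real, imag)
                update = mag > best_mag
                best_mag[update], best_theta[update] = mag[update], theta
            return best_mag, best_theta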

  17. Automated detection and location of indications in eddy current signals

    DOEpatents

    Brudnoy, David M.; Oppenlander, Jane E.; Levy, Arthur J.

    2000-01-01

    A computer implemented information extraction process that locates and identifies eddy current signal features in digital point-ordered signals, signals representing data from inspection of test materials, by enhancing the signal features relative to signal noise, detecting features of the signals, verifying the location of the signal features that can be known in advance, and outputting information about the identity and location of all detected signal features.

  18. ASTRiDE: Automated Streak Detection for Astronomical Images

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Won

    2016-05-01

    ASTRiDE detects streaks in astronomical images using the "border" of each object (i.e., "boundary-tracing" or "contour-tracing") and its morphological parameters. Fast-moving objects such as meteors, satellites, near-Earth objects (NEOs), or even cosmic rays can leave streak-like traces in the images; ASTRiDE can detect not only long streaks but also relatively short or curved streaks.
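    ASTRiDE itself is distributed by the author; the sketch below is only a simplified illustration of the underlying idea, segmenting bright objects and keeping those whose morphology is streak-like (highly elongated), using skimage region properties rather than ASTRiDE's own border tracing. The threshold and elongation parameters are illustrative assumptions.

```python
import numpy as np
from skimage.measure import label, regionprops

def find_streak_candidates(image, k_sigma=3.0, min_elongation=5.0, min_area=30):
    """Flag connected bright regions whose shape is long and thin (streak-like)."""
    image = np.asarray(image, float)
    threshold = image.mean() + k_sigma * image.std()   # crude global background model
    labels = label(image > threshold)
    streaks = []
    for region in regionprops(labels):
        if region.area < min_area:
            continue
        elongation = region.major_axis_length / max(region.minor_axis_length, 1.0)
        if elongation >= min_elongation:
            streaks.append((region.label, round(elongation, 1), region.centroid))
    return streaks

# Synthetic test: a noisy frame with one diagonal streak drawn into it.
rng = np.random.default_rng(0)
frame = rng.normal(100, 5, (256, 256))
rr = np.arange(40, 200)
frame[rr, rr] += 80          # the "streak"
print(find_streak_candidates(frame))
```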

  19. Fully automated procedure for ship detection using optical satellite imagery

    NASA Astrophysics Data System (ADS)

    Corbane, C.; Pecoul, E.; Demagistri, L.; Petit, M.

    2009-01-01

    Ship detection from remote sensing imagery is a crucial application for maritime security, which includes, among others, traffic surveillance, protection against illegal fisheries, oil discharge control and sea pollution monitoring. In the framework of the European integrated project GMES-Security/LIMES, we developed an operational ship detection algorithm using high spatial resolution optical imagery to complement existing regulations, in particular the fishing control system. The automatic detection model is based on statistical methods, mathematical morphology and other signal processing techniques such as wavelet analysis and the Radon transform. This paper presents current progress made on the detection model and describes the prototype designed to classify small targets. The prototype was tested on panchromatic SPOT 5 imagery taking into account the environmental and fishing context in French Guiana. In terms of automatic detection of small ship targets, the proposed algorithm performs well. Its advantages are manifold: it is simple and robust, but most of all, it is efficient and fast, which is a crucial point in performance evaluation of advanced ship detection strategies.

  20. Detection of anti-Salmonella FlgK antibodies in chickens by automated capillary immunoassay

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Western blot is a very useful tool to identify specific protein, but is tedious, labor-intensive and time-consuming. An automated "Simple Western" assay has recently been developed that enables the protein separation, blotting and detection in an automatic manner. However, this technology has not ...

  1. Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System

    ERIC Educational Resources Information Center

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-01-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…

  2. Automated Detection of Lupus White Matter Lesions in MRI.

    PubMed

    Roura, Eloy; Sarbu, Nicolae; Oliver, Arnau; Valverde, Sergi; González-Villà, Sandra; Cervera, Ricard; Bargalló, Núria; Lladó, Xavier

    2016-01-01

    Brain magnetic resonance imaging provides detailed information which can be used to detect and segment white matter lesions (WML). In this work we propose an approach to automatically segment WML in lupus patients by using T1w and fluid-attenuated inversion recovery (FLAIR) images. Lupus WML appear as small focal areas of abnormal tissue observed as hyperintensities in the FLAIR images. The quantification of these WML is a key factor for the stratification of lupus patients, and therefore both lesion detection and segmentation play an important role. In our approach, the T1w image is first used to classify the three main tissues of the brain, white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF), while the FLAIR image is then used to detect focal WML as outliers of its GM intensity distribution. A set of post-processing steps based on lesion size, tissue neighborhood, and location is used to refine the lesion candidates. The proposal is evaluated on 20 patients, presenting qualitative and quantitative results in terms of sensitivity and precision of lesion detection [True Positive Rate (62%) and Positive Predictive Value (80%), respectively] as well as segmentation accuracy [Dice Similarity Coefficient (72%)]. The obtained results illustrate the validity of the approach to automatically detect and segment lupus lesions. Moreover, our approach is publicly available as an SPM8/12 toolbox extension with a simple parameter configuration. PMID:27570507
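    The core detection idea described above (model the grey-matter intensity distribution on FLAIR and flag hyperintense outliers, then refine by lesion size) can be sketched as below. This is a minimal illustration and not the authors' SPM toolbox; it assumes co-registered flair, gm_mask, and brain_mask arrays, and the outlier threshold and toy volume are invented for illustration.

```python
import numpy as np
from scipy import ndimage

def detect_flair_outliers(flair, gm_mask, brain_mask, alpha=3.0, min_size=5):
    """Flag FLAIR hyperintensities as outliers of the grey-matter intensity distribution."""
    flair = np.asarray(flair, float)
    gm_values = flair[gm_mask & brain_mask]
    mu, sigma = gm_values.mean(), gm_values.std()
    candidates = (flair > mu + alpha * sigma) & brain_mask   # hyperintense outliers
    labels, n = ndimage.label(candidates)
    sizes = ndimage.sum(candidates, labels, index=np.arange(1, n + 1))
    keep = np.isin(labels, np.flatnonzero(sizes >= min_size) + 1)  # drop tiny specks
    return keep

# Hypothetical toy volume for illustration only.
rng = np.random.default_rng(1)
flair = rng.normal(100, 10, (32, 32, 32))
flair[10:13, 10:13, 10:13] += 80                    # a fake lesion
brain = np.ones(flair.shape, bool)
gm = rng.random(flair.shape) < 0.4
print(detect_flair_outliers(flair, gm, brain).sum(), "lesion voxels flagged")
```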

  3. Automated Detection of Lupus White Matter Lesions in MRI

    PubMed Central

    Roura, Eloy; Sarbu, Nicolae; Oliver, Arnau; Valverde, Sergi; González-Villà, Sandra; Cervera, Ricard; Bargalló, Núria; Lladó, Xavier

    2016-01-01

    Brain magnetic resonance imaging provides detailed information which can be used to detect and segment white matter lesions (WML). In this work we propose an approach to automatically segment WML in lupus patients by using T1w and fluid-attenuated inversion recovery (FLAIR) images. Lupus WML appear as small focal areas of abnormal tissue observed as hyperintensities in the FLAIR images. The quantification of these WML is a key factor for the stratification of lupus patients, and therefore both lesion detection and segmentation play an important role. In our approach, the T1w image is first used to classify the three main tissues of the brain, white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF), while the FLAIR image is then used to detect focal WML as outliers of its GM intensity distribution. A set of post-processing steps based on lesion size, tissue neighborhood, and location is used to refine the lesion candidates. The proposal is evaluated on 20 patients, presenting qualitative and quantitative results in terms of sensitivity and precision of lesion detection [True Positive Rate (62%) and Positive Predictive Value (80%), respectively] as well as segmentation accuracy [Dice Similarity Coefficient (72%)]. The obtained results illustrate the validity of the approach to automatically detect and segment lupus lesions. Moreover, our approach is publicly available as an SPM8/12 toolbox extension with a simple parameter configuration. PMID:27570507

  4. Automated detection of periventricular veins on 7 T brain MRI

    NASA Astrophysics Data System (ADS)

    Kuijf, Hugo J.; Bouvy, Willem H.; Zwanenburg, Jaco J. M.; Viergever, Max A.; Biessels, Geert Jan; Vincken, Koen L.

    2015-03-01

    Cerebral small vessel disease is common in elderly persons and a leading cause of cognitive decline, dementia, and acute stroke. With the introduction of ultra-high field strength 7.0T MRI, it is possible to visualize small vessels in the brain. In this work, a proof-of-principle study is conducted to assess the feasibility of automatically detecting periventricular veins. Periventricular veins are organized in a fan-pattern and drain venous blood from the brain towards the caudate vein of Schlesinger, which is situated along the lateral ventricles. Just outside this vein, a region-of-interest (ROI) through which all periventricular veins must cross is defined. Within this ROI, a combination of the vesselness filter, tubular tracking, and hysteresis thresholding is applied to locate periventricular veins. All detected locations were evaluated by an expert human observer. The results showed a positive predictive value of 88% and a sensitivity of 95% for detecting periventricular veins. The proposed method shows good results in detecting periventricular veins in the brain on 7.0T MR images. Compared to previous works that only use a 1D or 2D ROI and limited image processing, our work presents a more comprehensive definition of the ROI, advanced image processing techniques to detect periventricular veins, and a quantitative analysis of the performance. The results of this proof-of-principle study are promising and will be used to assess periventricular veins on 7.0T brain MRI.
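    A minimal sketch of the pipeline named above, vesselness filtering followed by hysteresis thresholding inside a region of interest, is given below using skimage's Frangi filter as a stand-in for the vesselness filter. The ROI construction along the caudate vein and the tubular tracking step of the paper are not reproduced, and all parameters and the toy image are illustrative assumptions.

```python
import numpy as np
from skimage.filters import frangi, apply_hysteresis_threshold

def detect_veins_in_roi(image, roi_mask, low_frac=0.1, high_frac=0.5):
    """Vesselness filtering followed by hysteresis thresholding inside an ROI."""
    vesselness = frangi(np.asarray(image, float), black_ridges=False)  # enhance bright tubes
    vesselness = np.where(roi_mask, vesselness, 0.0)                   # restrict to the ROI
    vmax = vesselness.max()
    if vmax == 0:
        return np.zeros(roi_mask.shape, bool)
    return apply_hysteresis_threshold(vesselness, low_frac * vmax, high_frac * vmax)

# Toy 2-D example: a bright "vein" crossing a rectangular ROI.
img = np.zeros((128, 128))
img[:, 60:63] = 1.0
roi = np.zeros((128, 128), bool)
roi[40:90, 30:100] = True
print("vein pixels detected in ROI:", int(detect_veins_in_roi(img, roi).sum()))
```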

  5. Characterizing interplanetary shocks for development and optimization of an automated solar wind shock detection algorithm

    NASA Astrophysics Data System (ADS)

    Cash, M. D.; Wrobel, J. S.; Cosentino, K. C.; Reinard, A. A.

    2014-06-01

    Human evaluation of solar wind data for interplanetary (IP) shock identification relies on both heuristics and pattern recognition, with the former lending itself to algorithmic representation and automation. Such detection algorithms can potentially alert forecasters of approaching shocks, providing increased warning of subsequent geomagnetic storms. However, capturing shocks with an algorithmic treatment alone is challenging, as past and present work demonstrates. We present a statistical analysis of 209 IP shocks observed at L1, and we use this information to optimize a set of shock identification criteria for use with an automated solar wind shock detection algorithm. In order to specify ranges for the threshold values used in our algorithm, we quantify discontinuities in the solar wind density, velocity, temperature, and magnetic field magnitude by analyzing 8 years of IP shocks detected by the SWEPAM and MAG instruments aboard the ACE spacecraft. Although automatic shock detection algorithms have previously been developed, in this paper we conduct a methodical optimization to refine shock identification criteria and present the optimal performance of this and similar approaches. We compute forecast skill scores for over 10,000 permutations of our shock detection criteria in order to identify the set of threshold values that yield optimal forecast skill scores. We then compare our results to previous automatic shock detection algorithms using a standard data set, and our optimized algorithm shows improvements in the reliability of automated shock detection.
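    The optimization strategy described above can be sketched as a grid search: sweep permutations of jump thresholds on density, speed, temperature, and field magnitude, score each permutation against a labelled event list via a contingency table, and keep the best-scoring set. The sketch below uses the true skill statistic and synthetic candidate events; the actual criteria, skill scores, and data used by the authors differ.

```python
import itertools
import numpy as np

def true_skill_statistic(hits, misses, false_alarms, correct_negs):
    pod = hits / (hits + misses) if hits + misses else 0.0                       # probability of detection
    pofd = false_alarms / (false_alarms + correct_negs) if false_alarms + correct_negs else 0.0
    return pod - pofd

def detect(jumps, thresholds):
    """A candidate is flagged as a shock if every parameter jump exceeds its threshold."""
    return np.all(jumps >= thresholds, axis=1)

def optimize_thresholds(jumps, labels, grids):
    """Grid-search threshold permutations and return the best-scoring set."""
    best = (-np.inf, None)
    for combo in itertools.product(*grids):
        pred = detect(jumps, np.array(combo))
        hits = int(np.sum(pred & labels))
        misses = int(np.sum(~pred & labels))
        fas = int(np.sum(pred & ~labels))
        cns = int(np.sum(~pred & ~labels))
        score = true_skill_statistic(hits, misses, fas, cns)
        if score > best[0]:
            best = (score, combo)
    return best

# Hypothetical candidate events: columns are jumps in n, V, T, |B|; labels mark "real" shocks.
rng = np.random.default_rng(2)
jumps = np.abs(rng.normal(1.0, 0.5, (300, 4)))
labels = jumps.min(axis=1) > 1.0                     # toy ground truth
grids = [np.linspace(0.5, 1.5, 5)] * 4               # 5 values per parameter -> 625 permutations
print(optimize_thresholds(jumps, labels, grids))
```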

  6. An automated computer misuse detection system for UNICOS

    SciTech Connect

    Jackson, K.A.; Neuman, M.C.; Simmonds, D.D.; Stallings, C.A.; Thompson, J.L.; Christoph, G.G.

    1994-09-27

    An effective method for detecting computer misuse is the automatic monitoring and analysis of on-line user activity. This activity is reflected in the system audit record, in the system vulnerability posture, and in other evidence found through active testing of the system. During the last several years we have implemented an automatic misuse detection system at Los Alamos. This is the Network Anomaly Detection and Intrusion Reporter (NADIR). We are currently expanding NADIR to include processing of the Cray UNICOS operating system. This new component is called the UNICOS Realtime NADIR, or UNICORN. UNICORN summarizes user activity and system configuration in statistical profiles. It compares these profiles to expert rules that define security policy and improper or suspicious behavior. It reports suspicious behavior to security auditors and provides tools to aid in follow-up investigations. The first phase of UNICORN development is nearing completion, and will be operational in late 1994.

  7. System and method for automated object detection in an image

    DOEpatents

    Kenyon, Garrett T.; Brumby, Steven P.; George, John S.; Paiton, Dylan M.; Schultz, Peter F.

    2015-10-06

    A contour/shape detection model may use relatively simple and efficient kernels to detect target edges in an object within an image or video. A co-occurrence probability may be calculated for two or more edge features in an image or video using an object definition. Edge features may be differentiated from one another based on measured contextual support, and prominent edge features may be extracted based on the measured contextual support. The object may then be identified based on the extracted prominent edge features.

  8. Automated Detection of Anomalous Shipping Manifests to Identify Illicit Trade

    SciTech Connect

    Sanfilippo, Antonio P.; Chikkagoudar, Satish

    2013-11-12

    We describe an approach to analyzing trade data which uses clustering to detect similarities across shipping manifest records, classification to evaluate clustering results and categorize new unseen shipping data records, and visual analytics to support situation awareness in dynamic decision making and to monitor and warn against the movement of radiological threat materials through search, analysis and forecasting capabilities. The evaluation of clustering results through classification and systematic inspection of the clusters shows that the clusters have strong semantic cohesion and offer novel ways to detect transactions related to nuclear smuggling.

  9. Automated Micro-Object Detection for Mobile Diagnostics Using Lens-Free Imaging Technology

    PubMed Central

    Roy, Mohendra; Seo, Dongmin; Oh, Sangwoo; Chae, Yeonghun; Nam, Myung-Hyun; Seo, Sungkyu

    2016-01-01

    Lens-free imaging technology has been extensively used recently for microparticle and biological cell analysis because of its high throughput, low cost, and simple and compact arrangement. However, this technology still lacks a dedicated and automated detection system. In this paper, we describe a custom-developed automated micro-object detection method for a lens-free imaging system. In our previous work (Roy et al.), we developed a lens-free imaging system using low-cost components. This system was used to generate and capture the diffraction patterns of micro-objects and a global threshold was used to locate the diffraction patterns. In this work we used the same setup to develop an improved automated detection and analysis algorithm based on adaptive threshold and clustering of signals. For this purpose, images from the lens-free system were then used to understand the features and characteristics of the diffraction patterns of several types of samples. On the basis of this information, we custom-developed an automated algorithm for the lens-free imaging system. Next, all the lens-free images were processed using this custom-developed automated algorithm. The performance of this approach was evaluated by comparing the counting results with standard optical microscope results. We evaluated the counting results for polystyrene microbeads, red blood cells, and HepG2, HeLa, and MCF7 cell lines. The comparison shows good agreement between the systems, with a correlation coefficient of 0.91 and linearity slope of 0.877. We also evaluated the automated size profiles of the microparticle samples. This Wi-Fi-enabled lens-free imaging system, along with the dedicated software, possesses great potential for telemedicine applications in resource-limited settings. PMID:27164146
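    A minimal sketch of the adaptive-threshold-plus-clustering idea described above is given below; it is not the authors' software. A local threshold separates bright diffraction patterns from an uneven background, connected pixels are clustered, and the clusters are counted. The block size, offset, and synthetic frame are assumptions for illustration.

```python
import numpy as np
from skimage.filters import threshold_local
from skimage.measure import label, regionprops

def count_micro_objects(image, block_size=51, offset=-5, min_area=20):
    """Adaptive (local) threshold, then cluster connected pixels and count the clusters."""
    image = np.asarray(image, float)
    local_thresh = threshold_local(image, block_size=block_size, offset=offset)
    mask = image > local_thresh                       # bright diffraction patterns
    regions = [r for r in regionprops(label(mask)) if r.area >= min_area]
    centroids = [r.centroid for r in regions]
    return len(regions), centroids

# Toy "lens-free" frame: a few bright blobs on an uneven background.
rng = np.random.default_rng(3)
yy, xx = np.mgrid[0:256, 0:256]
frame = 0.1 * xx + rng.normal(0, 2, (256, 256))       # slowly varying background + noise
for cy, cx in [(60, 60), (128, 200), (200, 90)]:
    frame += 40 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 50.0)
count, centers = count_micro_objects(frame)
print(count, "objects at", [tuple(round(c, 1) for c in p) for p in centers])
```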

  10. Automated Micro-Object Detection for Mobile Diagnostics Using Lens-Free Imaging Technology.

    PubMed

    Roy, Mohendra; Seo, Dongmin; Oh, Sangwoo; Chae, Yeonghun; Nam, Myung-Hyun; Seo, Sungkyu

    2016-01-01

    Lens-free imaging technology has been extensively used recently for microparticle and biological cell analysis because of its high throughput, low cost, and simple and compact arrangement. However, this technology still lacks a dedicated and automated detection system. In this paper, we describe a custom-developed automated micro-object detection method for a lens-free imaging system. In our previous work (Roy et al.), we developed a lens-free imaging system using low-cost components. This system was used to generate and capture the diffraction patterns of micro-objects and a global threshold was used to locate the diffraction patterns. In this work we used the same setup to develop an improved automated detection and analysis algorithm based on adaptive threshold and clustering of signals. For this purpose images from the lens-free system were then used to understand the features and characteristics of the diffraction patterns of several types of samples. On the basis of this information, we custom-developed an automated algorithm for the lens-free imaging system. Next, all the lens-free images were processed using this custom-developed automated algorithm. The performance of this approach was evaluated by comparing the counting results with standard optical microscope results. We evaluated the counting results for polystyrene microbeads, red blood cells, and HepG2, HeLa, and MCF7 cells. The comparison shows good agreement between the systems, with a correlation coefficient of 0.91 and linearity slope of 0.877. We also evaluated the automated size profiles of the microparticle samples. This Wi-Fi-enabled lens-free imaging system, along with the dedicated software, possesses great potential for telemedicine applications in resource-limited settings. PMID:27164146

  11. Automated Micro-Object Detection for Mobile Diagnostics Using Lens-Free Imaging Technology.

    PubMed

    Roy, Mohendra; Seo, Dongmin; Oh, Sangwoo; Chae, Yeonghun; Nam, Myung-Hyun; Seo, Sungkyu

    2016-05-05

    Lens-free imaging technology has been extensively used recently for microparticle and biological cell analysis because of its high throughput, low cost, and simple and compact arrangement. However, this technology still lacks a dedicated and automated detection system. In this paper, we describe a custom-developed automated micro-object detection method for a lens-free imaging system. In our previous work (Roy et al.), we developed a lens-free imaging system using low-cost components. This system was used to generate and capture the diffraction patterns of micro-objects and a global threshold was used to locate the diffraction patterns. In this work we used the same setup to develop an improved automated detection and analysis algorithm based on adaptive threshold and clustering of signals. For this purpose images from the lens-free system were then used to understand the features and characteristics of the diffraction patterns of several types of samples. On the basis of this information, we custom-developed an automated algorithm for the lens-free imaging system. Next, all the lens-free images were processed using this custom-developed automated algorithm. The performance of this approach was evaluated by comparing the counting results with standard optical microscope results. We evaluated the counting results for polystyrene microbeads, red blood cells, and HepG2, HeLa, and MCF7 cells. The comparison shows good agreement between the systems, with a correlation coefficient of 0.91 and linearity slope of 0.877. We also evaluated the automated size profiles of the microparticle samples. This Wi-Fi-enabled lens-free imaging system, along with the dedicated software, possesses great potential for telemedicine applications in resource-limited settings.

  12. Automated design of image operators that detect interest points.

    PubMed

    Trujillo, Leonardo; Olague, Gustavo

    2008-01-01

    This work describes how evolutionary computation can be used to synthesize low-level image operators that detect interesting points on digital images. Interest point detection is an essential part of many modern computer vision systems that solve tasks such as object recognition, stereo correspondence, and image indexing, to name but a few. The design of the specialized operators is posed as an optimization/search problem that is solved with genetic programming (GP), a strategy still mostly unexplored by the computer vision community. The proposed approach automatically synthesizes operators that are competitive with state-of-the-art designs, taking into account an operator's geometric stability and the global separability of detected points during fitness evaluation. The GP search space is defined using simple primitive operations that are commonly found in point detectors proposed by the vision community. The experiments described in this paper extend previous results (Trujillo and Olague, 2006a,b) by presenting 15 new operators that were synthesized through the GP-based search. Some of the synthesized operators can be regarded as improved manmade designs because they employ well-known image processing techniques and achieve highly competitive performance. On the other hand, since the GP search also generates what can be considered as unconventional operators for point detection, these results provide a new perspective to feature extraction research.

  13. Automated detection of gait initiation and termination using wearable sensors.

    PubMed

    Novak, Domen; Reberšek, Peter; De Rossi, Stefano Marco Maria; Donati, Marco; Podobnik, Janez; Beravs, Tadej; Lenzi, Tommaso; Vitiello, Nicola; Carrozza, Maria Chiara; Munih, Marko

    2013-12-01

    This paper presents algorithms for detection of gait initiation and termination using wearable inertial measurement units and pressure-sensitive insoles. Body joint angles, joint angular velocities, ground reaction force and center of plantar pressure of each foot are obtained from these sensors and input into supervised machine learning algorithms. The proposed initiation detection method recognizes two events: gait onset (an anticipatory movement preceding foot lifting) and toe-off. The termination detection algorithm segments gait into steps, measures the signals over a buffer at the beginning of each step, and determines whether this measurement belongs to the final step. The approach is validated with 10 subjects at two gait speeds, using within-subject and subject-independent cross-validation. Results show that gait initiation can be detected timely and accurately, with few errors in the case of within-subject cross-validation and overall good performance in subject-independent cross-validation. Gait termination can be predicted in over 80% of trials well before the subject comes to a complete stop. Results also show that the two sensor types are equivalent in predicting gait initiation while inertial measurement units are generally superior in predicting gait termination. Potential use of the algorithms is foreseen primarily with assistive devices such as prostheses and exoskeletons.
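    The supervised-learning framing described above can be sketched as follows: build a feature vector per sliding window from the sensor channels and train a classifier to label the windows. The sketch below uses a random forest on synthetic signals and is not the paper's algorithm, sensor set, or feature set; the window sizes and the "onset" construction are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(signals, window=50, step=25):
    """Mean and standard deviation of each channel over sliding windows."""
    feats = []
    for start in range(0, signals.shape[0] - window + 1, step):
        seg = signals[start:start + window]
        feats.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0)]))
    return np.array(feats)

# Hypothetical recording: 8 sensor channels, with a synthetic "gait onset" ramp halfway in.
rng = np.random.default_rng(4)
signals = rng.normal(0, 1, (2000, 8))
signals[1000:] += np.linspace(0, 3, 1000)[:, None]    # activity after onset
X = window_features(signals)
y = (np.arange(len(X)) >= len(X) // 2).astype(int)    # 0 = quiet standing, 1 = walking

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```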

  14. Automated design of image operators that detect interest points.

    PubMed

    Trujillo, Leonardo; Olague, Gustavo

    2008-01-01

    This work describes how evolutionary computation can be used to synthesize low-level image operators that detect interesting points on digital images. Interest point detection is an essential part of many modern computer vision systems that solve tasks such as object recognition, stereo correspondence, and image indexing, to name but a few. The design of the specialized operators is posed as an optimization/search problem that is solved with genetic programming (GP), a strategy still mostly unexplored by the computer vision community. The proposed approach automatically synthesizes operators that are competitive with state-of-the-art designs, taking into account an operator's geometric stability and the global separability of detected points during fitness evaluation. The GP search space is defined using simple primitive operations that are commonly found in point detectors proposed by the vision community. The experiments described in this paper extend previous results (Trujillo and Olague, 2006a,b) by presenting 15 new operators that were synthesized through the GP-based search. Some of the synthesized operators can be regarded as improved manmade designs because they employ well-known image processing techniques and achieve highly competitive performance. On the other hand, since the GP search also generates what can be considered as unconventional operators for point detection, these results provide a new perspective to feature extraction research. PMID:19053496

  15. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    USGS Publications Warehouse

    Gorresen, P.M.; Miles, A.C.; Todd, C.M.; Bonaccorso, F.J.; Weller, T.J.

    2008-01-01

    Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled to digital recorders operating at a series of proximate sites on consecutive nights in 2 trial surveys for the Hawaiian hoary bat (Lasiurus cinereus semotus). Our results confirmed that the technique is readily amenable for use in occupancy analysis. We also conducted a simulation exercise to assess the effects of sampling effort on parameter estimation. The results indicated that the precision and bias of parameter estimation were often more influenced by the number of sites sampled than the number of visits. Acceptable accuracy often was not attained until at least 15 sites or 15 visits were used to estimate detection probability and occupancy. The method has significant potential for use in monitoring trends in bat activity and in comparative studies of habitat use. © 2008 American Society of Mammalogists.

  16. Automated video quality measurement based on manmade object characterization and motion detection

    NASA Astrophysics Data System (ADS)

    Kalukin, Andrew; Harguess, Josh; Maltenfort, A. J.; Irvine, John; Algire, C.

    2016-05-01

    Automated video quality assessment methods have generally been based on measurements of engineering parameters such as ground sampling distance, level of blur, and noise. However, humans rate video quality using specific criteria that measure the interpretability of the video by determining the kinds of objects and activities that might be detected in the video. Given the improvements in tracking, automatic target detection, and activity characterization that have occurred in video science, it is worth considering whether new automated video assessment methods might be developed by imitating the logical steps taken by humans in evaluating scene content. This article will outline a new procedure for automatically evaluating video quality based on automated object and activity recognition, and demonstrate the method for several ground-based and maritime examples. The detection and measurement of in-scene targets makes it possible to assess video quality without relying on source metadata. A methodology is given for comparing automated assessment with human assessment. For the human assessment, objective video quality ratings can be obtained through a menu-driven, crowd-sourced scheme of video tagging, in which human participants tag objects such as vehicles and people on film clips. The size, clarity, and level of detail of features present on the tagged targets are compared directly with the Video National Image Interpretability Rating Scale (VNIIRS).

  17. Active change detection by pigeons and humans.

    PubMed

    Hagmann, Carl Erick; Cook, Robert G

    2013-10-01

    Detecting change is vital to both human and nonhuman animals' interactions with the environment. Using the go/no-go dynamic change detection task, we examined the capacity of four pigeons to detect changes in brightness of an area on a computer display. In contrast to our prior research, we reversed the response contingencies so that the animals had to actively inhibit pecking upon detecting change in brightness rather than its constancy. Testing eight rates of change revealed that this direct report change detection contingency produced results equivalent to the earlier indirect procedure. Corresponding tests with humans suggested that the temporal dynamics of detecting change were similar for both species. The results indicate the mechanisms of change detection in both pigeons and humans are organized in similar ways, although limitations in the operations of working memory may prevent pigeons from integrating information over the same time scale as humans.

  18. Automated muscle wrapping using finite element contact detection.

    PubMed

    Favre, Philippe; Gerber, Christian; Snedeker, Jess G

    2010-07-20

    Realistic muscle path representation is essential to musculoskeletal modeling of joint function. Algorithms predicting these muscle paths typically rely on a labor intensive predefinition of via points or underlying geometries to guide wrapping for given joint positions. While muscle wrapping using anatomically precise three-dimensional (3D) finite element (FE) models of bone and muscle has been achieved, computational expense and pre-processing associated with this approach exclude its use in applications such as subject-specific modeling. With the intention of combining advantageous features of both approaches, an intermediate technique relying on contact detection capabilities of commercial FE packages is presented. We applied the approach to the glenohumeral joint, and validated the method by comparison against existing experimental data. Individual muscles were modeled as a straight series of deformable beam elements and bones as anatomically precise 3D rigid bodies. Only the attachment locations and a default orientation of the undeformed muscle segment were pre-defined. The joint was then oriented in a static position of interest. The muscle segment free end was then moved along the shortest Euclidean path to its origin on the scapula, wrapping the muscle along bone surfaces by relying on software contact detection. After wrapping for a given position, the resulting moment arm was computed as the perpendicular distance from the line of action vector to the humeral head center of rotation. This approach reasonably predicted muscle length and moment arm for 27 muscle segments when compared to experimental measurements over a wide range of shoulder motion. Artificial via points or underlying contact geometries were avoided, contact detection and multiobject wrapping on the bone surfaces were automatic, and low computational cost permitted wrapping of individual muscles within seconds on a standard desktop PC. These advantages may be valuable for both general

  19. Automated detection of meteors in observed image sequence

    NASA Astrophysics Data System (ADS)

    Šimberová, Stanislava; Suk, Tomáš

    2015-12-01

    We propose a new detection technique based on statistical characteristics of images in a video sequence. These characteristics, tracked over time, make it possible to catch any bright track during the whole sequence. We applied our method to image datacubes created from camera pictures of the night sky. A meteor flying through the Earth's atmosphere leaves a light trail lasting a few seconds against the sky background. We developed a special technique to recognize this event automatically in the complete observed video sequence. For further analysis leading to precise object recognition, we suggest applying Fourier and Hough transforms.
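    One of the suggested steps, applying a Hough transform to pick up a straight bright trail in a background-subtracted frame, can be sketched as below with skimage's probabilistic Hough line detector. It is not the authors' pipeline; the frames, thresholds, and trail geometry are synthetic assumptions.

```python
import numpy as np
from skimage.transform import probabilistic_hough_line

def detect_trails(frame, background, k_sigma=4.0, line_length=40, line_gap=3):
    """Subtract the sky background, threshold, and look for straight bright trails."""
    diff = np.asarray(frame, float) - np.asarray(background, float)
    mask = diff > diff.mean() + k_sigma * diff.std()
    return probabilistic_hough_line(mask, threshold=10,
                                    line_length=line_length, line_gap=line_gap)

# Synthetic night-sky frames: constant background plus one diagonal trail.
rng = np.random.default_rng(5)
background = rng.normal(50, 3, (256, 256))
frame = background + rng.normal(0, 3, (256, 256))
idx = np.arange(30, 220)
frame[idx, idx + 10] += 60                             # the meteor trail
lines = detect_trails(frame, background)
print(len(lines), "trail segment(s), e.g.", lines[:1])
```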

  20. High-Speed Observer: Automated Streak Detection in SSME Plumes

    NASA Technical Reports Server (NTRS)

    Rieckoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A high frame rate digital video camera installed on test stands at Stennis Space Center has been used to capture images of Space Shuttle main engine plumes during test. These plume images are processed in real time to detect and differentiate anomalous plume events occurring during a time interval on the order of 5 msec. Such speed yields near instantaneous availability of information concerning the state of the hardware. This information can be monitored by the test conductor or by other computer systems, such as the integrated health monitoring system processors, for possible test shutdown before occurrence of a catastrophic engine failure.

  1. Development of an automated MODS plate reader to detect early growth of Mycobacterium tuberculosis.

    PubMed

    Comina, G; Mendoza, D; Velazco, A; Coronel, J; Sheen, P; Gilman, R H; Moore, D A J; Zimic, M

    2011-06-01

    In this work, an automated microscopic observation drug susceptibility (MODS) plate reader has been developed. The reader automatically handles MODS plates and after autofocussing digital images are acquired of the characteristic microscopic cording structures of Mycobacterium tuberculosis, which are the identification method utilized in the MODS technique to detect tuberculosis and multidrug resistant tuberculosis. In conventional MODS, trained technicians manually move the MODS plate on the stage of an inverted microscope while trying to locate and focus upon the characteristic microscopic cording colonies. In centres with high tuberculosis diagnostic demand, sufficient time may not be available to adequately examine all cultures. An automated reader would reduce labour time and the handling of M. tuberculosis cultures by laboratory personnel. Two hundred MODS culture images (100 from tuberculosis positive and 100 from tuberculosis negative sputum samples confirmed by a standard MODS reading using a commercial microscope) were acquired randomly using the automated MODS plate reader. A specialist analysed these digital images with the help of a personal computer and designated them as M. tuberculosis present or absent. The specialist considered four images insufficiently clear to permit a definitive reading. The readings from the 196 valid images resulted in a 100% agreement with the conventional nonautomated standard reading. The automated MODS plate reader combined with open-source MODS pattern recognition software provides a novel platform for high throughput automated tuberculosis diagnosis.

  2. Rapid and automated detection of salmonella by electrical measurements.

    PubMed Central

    Easter, M. C.; Gibson, D. M.

    1985-01-01

    A rapid method for determining the presence of salmonella in food is described. It consists of pre-enrichment in buffered peptone water modified by the addition of dulcitol and trimethylamine oxide, followed by selective enrichment in a selenite-cystine broth with similar modifications. Changes in the conductance of the selective enrichment broth are monitored continuously using a suitable impediometric instrument. Most of the Salmonella spp. tested gave a fast (approximately 100 microS/h) and large (greater than 600 microS) change in conductance, other enteric bacteria much less or no change. The assay is usually complete within 24 h. Samples of foodstuffs, naturally and artificially contaminated with Salmonella spp., were all correctly classified. Some strains of Citrobacter freundii produced a false positive conductance response, and they could not be selectively eliminated using antibiotics or cyanide. The conductance method is simple and easy to use, gives rapid results and involves less media and subculturing than is required for traditional methods. PMID:3891846

  3. Fully Automated Detection of Cloud and Aerosol Layers in the CALIPSO Lidar Measurements

    NASA Technical Reports Server (NTRS)

    Vaughan, Mark A.; Powell, Kathleen A.; Kuehn, Ralph E.; Young, Stuart A.; Winker, David M.; Hostetler, Chris A.; Hunt, William H.; Liu, Zhaoyan; McGill, Matthew J.; Getzewich, Brian J.

    2009-01-01

    Accurate knowledge of the vertical and horizontal extent of clouds and aerosols in the Earth's atmosphere is critical in assessing the planet's radiation budget and for advancing human understanding of climate change issues. To retrieve this fundamental information from the elastic backscatter lidar data acquired during the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission, a selective, iterated boundary location (SIBYL) algorithm has been developed and deployed. SIBYL accomplishes its goals by integrating an adaptive context-sensitive profile scanner into an iterated multiresolution spatial averaging scheme. This paper provides an in-depth overview of the architecture and performance of the SIBYL algorithm. It begins with a brief review of the theory of target detection in noise-contaminated signals, and an enumeration of the practical constraints levied on the retrieval scheme by the design of the lidar hardware, the geometry of a space-based remote sensing platform, and the spatial variability of the measurement targets. Detailed descriptions are then provided for both the adaptive threshold algorithm used to detect features of interest within individual lidar profiles and the fully automated multiresolution averaging engine within which this profile scanner functions. The resulting fusion of profile scanner and averaging engine is specifically designed to optimize the trade-offs between the widely varying signal-to-noise ratio of the measurements and the disparate spatial resolutions of the detection targets. Throughout the paper, specific algorithm performance details are illustrated using examples drawn from the existing CALIPSO dataset. Overall performance is established by comparisons to existing layer height distributions obtained by other airborne and space-based lidars.
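    A greatly simplified sketch of the two ingredients named above, horizontal multiresolution averaging and an adaptive noise-based threshold, is given below: profiles are averaged at progressively coarser horizontal resolution and range bins whose averaged signal exceeds a threshold derived from an assumed clear-air region are flagged. The real SIBYL algorithm is considerably more elaborate; everything here, including the toy scene, is illustrative.

```python
import numpy as np

def detect_layers(profiles, clear_air, n_avg_steps=(1, 4, 16), k_sigma=4.0):
    """Multi-resolution horizontal averaging with an adaptive threshold from a clear-air region."""
    profiles = np.asarray(profiles, float)              # shape: (n_profiles, n_range_bins)
    detected = np.zeros(profiles.shape[1], bool)
    for n_avg in n_avg_steps:
        n_blocks = profiles.shape[0] // n_avg
        averaged = profiles[:n_blocks * n_avg].reshape(n_blocks, n_avg, -1).mean(axis=1)
        noise = averaged[:, clear_air]                  # range bins assumed feature-free
        threshold = noise.mean() + k_sigma * noise.std()
        detected |= (averaged > threshold).any(axis=0)  # feature seen at any resolution
    return detected

# Toy lidar scene: 64 profiles x 200 range bins with a weak "cloud" at bins 80-90.
rng = np.random.default_rng(6)
profiles = rng.normal(1.0, 0.3, (64, 200))
profiles[:, 80:90] += 0.5                               # weak layer, clearer after averaging
clear_air = slice(150, 200)
print(np.flatnonzero(detect_layers(profiles, clear_air)))
```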

  4. Use of an automated database to evaluate markers for early detection of pregnancy.

    PubMed

    Manson, J M; McFarland, B; Weiss, S

    2001-07-15

    The objective of this study was to develop and validate algorithms to detect pregnancies from the time of first clinical recognition by using Kaiser Permanente automated databases from Portland, Oregon. In 1993-1994, the authors evaluated these databases retrospectively to identify markers indicative of initial clinical detection of pregnancy and pregnancy outcomes. Pregnancy markers were found for 99% of the women for whom pregnancy outcomes were included in the automated databases, and pregnancy outcomes were identified for 77% of the women for whom there were pregnancy markers. The earliest marker most predictive of a pregnancy outcome was a positive human chorionic gonadotropin test; least predictive was an obstetric outpatient visit. Medical record review indicated that in a sample of women with pregnancy markers in the database, an estimated 6% of pregnancy outcomes (primarily early fetal deaths and elective terminations) were lost. Pregnancies were first captured in automated databases 6-8 weeks after the last menstrual period, and a combination of a positive human chorionic gonadotropin test and an outpatient obstetric visit was the most sensitive and specific early marker of pregnancy. When combined with automated pharmacy records, these databases may be valuable tools for evaluating prescription drug effects on all major outcomes of clinically recognized pregnancies. PMID:11447053

  5. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of the scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates a huge volume of image data. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, representing images of interphase cells and FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D confocal image. The CAD scheme was applied to each confocal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In the four scanned specimen slides, CAD generated 1676 confocal images that depicted analyzable cells. FISH-probed signals were independently detected by our CAD algorithm and an observer. The Kappa coefficients for agreement between CAD and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots. The study demonstrated the feasibility of applying automated FISH image and signal analysis to assist cyto-geneticists in detecting cervical cancers.
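    The top-hat step mentioned above, used to pull small bright FISH spots off the smoother cellular background, can be sketched as below with scipy.ndimage. The adaptive multiple-threshold cell segmentation and the abnormal-cell-ratio logic of the CAD scheme are not reproduced, and all sizes, thresholds, and the toy image are illustrative.

```python
import numpy as np
from scipy import ndimage

def detect_fish_spots(image, tophat_size=7, k_sigma=5.0, min_size=3):
    """White top-hat to suppress smooth background, then threshold and count bright spots."""
    image = np.asarray(image, float)
    tophat = ndimage.white_tophat(image, size=tophat_size)   # keeps small bright structures
    mask = tophat > tophat.mean() + k_sigma * tophat.std()
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum(sizes >= min_size))

# Toy confocal projection: smooth cell background plus two point-like FISH signals.
rng = np.random.default_rng(7)
yy, xx = np.mgrid[0:128, 0:128]
img = 50 * np.exp(-((yy - 64) ** 2 + (xx - 64) ** 2) / 2000.0) + rng.normal(0, 1, (128, 128))
img[40:42, 40:42] += 30
img[90:92, 70:72] += 30
print("FISH spots detected:", detect_fish_spots(img))
```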

  6. Application of Reflectance Transformation Imaging Technique to Improve Automated Edge Detection in a Fossilized Oyster Reef

    NASA Astrophysics Data System (ADS)

    Djuricic, Ana; Puttonen, Eetu; Harzhauser, Mathias; Dorninger, Peter; Székely, Balázs; Mandic, Oleg; Nothegger, Clemens; Molnár, Gábor; Pfeifer, Norbert

    2016-04-01

    The world's largest fossilized oyster reef is located in Stetten, Lower Austria, and was excavated during field campaigns of the Natural History Museum Vienna between 2005 and 2008. It is studied in paleontology to learn about past climate change. To support this study, a laser scanning and photogrammetric campaign was organized in 2014 for 3D documentation of the large and complex site. The 3D point clouds and high-resolution images from this field campaign are visualized by photogrammetric methods in the form of digital surface models (DSM, 1 mm resolution) and orthophotos (0.5 mm resolution) to help paleontological interpretation of the data. Due to the size of the reef, automated analysis techniques are needed to interpret all digital data obtained from the field. One of the key components in successful automation is the detection of oyster shell edges. We have tested Reflectance Transformation Imaging (RTI) to visualize the reef data sets for end-users through a cultural heritage viewing interface (RTIViewer). The implementation includes a Lambert shading method to visualize DSMs derived from terrestrial laser scanning using the scientific software OPALS. In contrast to conventional RTI, no hardware system of LED lights or a rig to rotate the light source around the object is needed. The gray value of a given shaded pixel is related to the angle between the light source and the surface normal at that position. Brighter values correspond to slope surfaces facing the light source. Increasing the zenith angle results in internal shading all over the reef surface. In total, the oyster reef surface is covered by 81 DSMs of 3 m x 2 m each. Their surfaces were illuminated by moving the virtual sun every 30 degrees (12 azimuth angles from 20-350) and every 20 degrees (4 zenith angles from 20-80). This technique provides paleontologists an interactive approach to virtually inspect the oyster reef and to interpret the shell surface by changing the light source direction.
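    The Lambert-style shading described above can be sketched as a standard hillshade: compute the surface normal from the DSM gradient and take its dot product with a unit vector defined by the light azimuth and zenith angles. The sketch below is not the OPALS implementation; the toy DSM, grid spacing, and angle sets are assumptions chosen only to mirror the 12 azimuth and 4 zenith angles mentioned.

```python
import numpy as np

def hillshade(dsm, azimuth_deg, zenith_deg, cellsize=0.001):
    """Lambertian shading of a DSM for a given light azimuth and zenith (in degrees)."""
    dz_dy, dz_dx = np.gradient(np.asarray(dsm, float), cellsize)
    # Unit surface normals from the height gradients.
    norm = np.sqrt(dz_dx ** 2 + dz_dy ** 2 + 1.0)
    nx, ny, nz = -dz_dx / norm, -dz_dy / norm, 1.0 / norm
    # Unit vector pointing towards the light source.
    az, zen = np.radians(azimuth_deg), np.radians(zenith_deg)
    lx, ly, lz = np.sin(az) * np.sin(zen), np.cos(az) * np.sin(zen), np.cos(zen)
    return np.clip(nx * lx + ny * ly + nz * lz, 0.0, 1.0)   # cosine of the incidence angle

# Toy 1 mm resolution DSM: a smooth "oyster shell" bump on a flat surface.
yy, xx = np.mgrid[0:200, 0:200] * 0.001
dsm = 0.02 * np.exp(-((yy - 0.1) ** 2 + (xx - 0.1) ** 2) / 0.002)
# Render 12 azimuths (every 30 degrees from 20 to 350) and 4 zenith angles.
shaded = [hillshade(dsm, az, zen) for az in range(20, 351, 30) for zen in (20, 40, 60, 80)]
print(len(shaded), "shaded views; value range of first view:",
      float(shaded[0].min()), float(shaded[0].max()))
```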

  7. Automated Extraction Improves Multiplex Molecular Detection of Infection in Septic Patients

    PubMed Central

    Regueiro, Benito J.; Varela-Ledo, Eduardo; Martinez-Lamas, Lucia; Rodriguez-Calviño, Javier; Aguilera, Antonio; Santos, Antonio; Gomez-Tato, Antonio; Alvarez-Escudero, Julian

    2010-01-01

    Sepsis is one of the leading causes of morbidity and mortality in hospitalized patients worldwide. Molecular technologies for rapid detection of microorganisms in patients with sepsis have only recently become available. LightCycler SeptiFast test Mgrade (Roche Diagnostics GmbH) is a multiplex PCR analysis able to detect DNA of the 25 most frequent pathogens in bloodstream infections. The time and labor saved while avoiding excessive laboratory manipulation is the rationale for selecting the automated MagNA Pure compact nucleic acid isolation kit-I (Roche Applied Science, GmbH) as an alternative to conventional SeptiFast extraction. For the purposes of this study, we evaluate extraction in order to demonstrate the feasibility of automation. Finally, a prospective observational study was done using 106 clinical samples obtained from 76 patients in our ICU. Both extraction methods were used in parallel to test the samples. When molecular detection test results using both manual and automated extraction were compared with the data from blood cultures obtained at the same time, the results show that SeptiFast with the alternative MagNA Pure compact extraction not only shortens the complete workflow to 3.57 hrs., but also increases sensitivity of the molecular assay for detecting infection as defined by positive blood culture confirmation. PMID:20967222

  8. An automated procedure for covariation-based detection of RNA structure

    SciTech Connect

    Winker, S.; Overbeek, R.; Woese, C.R.; Olsen, G.J.; Pfluger, N.

    1989-12-01

    This paper summarizes our investigations into the computational detection of secondary and tertiary structure of ribosomal RNA. We have developed a new automated procedure that not only identifies potential bondings of secondary and tertiary structure, but also provides the covariation evidence that supports the proposed bondings, and any counter-evidence that can be detected in the known sequences. A small number of previously unknown bondings have been detected in individual RNA molecules (16S rRNA and 7S RNA) through the use of our automated procedure. Currently, we are systematically studying mitochondrial rRNA. Our goal is to detect tertiary structure within 16S rRNA and quaternary structure between 16S and 23S rRNA. Our ultimate hope is that automated covariation analysis will contribute significantly to a refined picture of ribosome structure. Our colleagues in biology have begun experiments to test certain hypotheses suggested by an examination of our program's output. These experiments involve sequencing key portions of the 23S ribosomal RNA for species in which the known 16S ribosomal RNA exhibits variation (from the dominant pattern) at the site of a proposed bonding. The hope is that the 23S ribosomal RNA of these species will exhibit corresponding complementary variation or generalized covariation. 24 refs.
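    The covariation idea can be sketched by scoring alignment column pairs with mutual information, which is high when two positions vary in a correlated way, as expected for base-paired positions. The sketch below omits the complementarity evidence and counter-evidence weighing of the authors' procedure, and the toy alignment is invented purely for illustration.

```python
import itertools
from collections import Counter
import numpy as np

def mutual_information(col_i, col_j):
    """Mutual information (in bits) between two alignment columns."""
    n = len(col_i)
    pi, pj = Counter(col_i), Counter(col_j)
    pij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

def covarying_pairs(alignment, min_mi=0.5):
    """Rank column pairs of a multiple sequence alignment by mutual information."""
    cols = list(zip(*alignment))                       # columns of the alignment
    pairs = []
    for i, j in itertools.combinations(range(len(cols)), 2):
        mi = mutual_information(cols[i], cols[j])
        if mi >= min_mi:
            pairs.append((i, j, round(mi, 2)))
    return sorted(pairs, key=lambda t: -t[2])

# Toy alignment: positions 0 and 5 covary (Watson-Crick-like), the rest are conserved.
alignment = ["GACUUC", "CACUUG", "GACUUC", "UACUUA", "CACUUG", "AACUUU"]
print(covarying_pairs(alignment))
```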

  9. Automating dicentric chromosome detection from cytogenetic biodosimetry data

    PubMed Central

    Rogan, Peter K.; Li, Yanxin; Wickramasinghe, Asanka; Subasinghe, Akila; Caminsky, Natasha; Khan, Wahab; Samarabandu, Jagath; Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H.

    2014-01-01

    We present a prototype software system with sufficient capacity and speed to estimate radiation exposures in a mass casualty event by counting dicentric chromosomes (DCs) in metaphase cells from many individuals. Top-ranked metaphase cell images are segmented by classifying and defining chromosomes with an active contour gradient vector field (GVF) and by determining centromere locations along the centreline. The centreline is extracted by discrete curve evolution (DCE) skeleton branch pruning and curve interpolation. Centromere detection minimises the global width and DAPI-staining intensity profiles along the centreline. A second centromere is identified by reapplying this procedure after masking the first. Dicentrics can be identified from features that capture width and intensity profile characteristics as well as local shape features of the object contour at candidate pixel locations. The correct location of the centromere is also refined in chromosomes with sister chromatid separation. The overall algorithm has both high sensitivity (85 %) and specificity (94 %). Results are independent of the shape and structure of chromosomes in different cells, or the laboratory preparation protocol followed. The prototype software was recoded in C++/OpenCV; image processing was accelerated by data and task parallelisation with Message Passing Interface and Intel Threading Building Blocks and an asynchronous non-blocking I/O strategy. Relative to a serial process, metaphase ranking, GVF and DCE are, respectively, 100 and 300-fold faster on an 8-core desktop and 64-core cluster computers. The software was then ported to a 1024-core supercomputer, which processed 200 metaphase images each from 1025 specimens in 1.4 h. PMID:24757176

  10. Automating dicentric chromosome detection from cytogenetic biodosimetry data.

    PubMed

    Rogan, Peter K; Li, Yanxin; Wickramasinghe, Asanka; Subasinghe, Akila; Caminsky, Natasha; Khan, Wahab; Samarabandu, Jagath; Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H

    2014-06-01

    We present a prototype software system with sufficient capacity and speed to estimate radiation exposures in a mass casualty event by counting dicentric chromosomes (DCs) in metaphase cells from many individuals. Top-ranked metaphase cell images are segmented by classifying and defining chromosomes with an active contour gradient vector field (GVF) and by determining centromere locations along the centreline. The centreline is extracted by discrete curve evolution (DCE) skeleton branch pruning and curve interpolation. Centromere detection minimises the global width and DAPI-staining intensity profiles along the centreline. A second centromere is identified by reapplying this procedure after masking the first. Dicentrics can be identified from features that capture width and intensity profile characteristics as well as local shape features of the object contour at candidate pixel locations. The correct location of the centromere is also refined in chromosomes with sister chromatid separation. The overall algorithm has both high sensitivity (85 %) and specificity (94 %). Results are independent of the shape and structure of chromosomes in different cells, or the laboratory preparation protocol followed. The prototype software was recoded in C++/OpenCV; image processing was accelerated by data and task parallelisation with Message Passing Interface and Intel Threading Building Blocks and an asynchronous non-blocking I/O strategy. Relative to a serial process, metaphase ranking, GVF and DCE are, respectively, 100 and 300-fold faster on an 8-core desktop and 64-core cluster computers. The software was then ported to a 1024-core supercomputer, which processed 200 metaphase images each from 1025 specimens in 1.4 h.

  11. Automating dicentric chromosome detection from cytogenetic biodosimetry data.

    PubMed

    Rogan, Peter K; Li, Yanxin; Wickramasinghe, Asanka; Subasinghe, Akila; Caminsky, Natasha; Khan, Wahab; Samarabandu, Jagath; Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H

    2014-06-01

    We present a prototype software system with sufficient capacity and speed to estimate radiation exposures in a mass casualty event by counting dicentric chromosomes (DCs) in metaphase cells from many individuals. Top-ranked metaphase cell images are segmented by classifying and defining chromosomes with an active contour gradient vector field (GVF) and by determining centromere locations along the centreline. The centreline is extracted by discrete curve evolution (DCE) skeleton branch pruning and curve interpolation. Centromere detection minimises the global width and DAPI-staining intensity profiles along the centreline. A second centromere is identified by reapplying this procedure after masking the first. Dicentrics can be identified from features that capture width and intensity profile characteristics as well as local shape features of the object contour at candidate pixel locations. The correct location of the centromere is also refined in chromosomes with sister chromatid separation. The overall algorithm has both high sensitivity (85 %) and specificity (94 %). Results are independent of the shape and structure of chromosomes in different cells, or the laboratory preparation protocol followed. The prototype software was recoded in C++/OpenCV; image processing was accelerated by data and task parallelisation with Message Passing Interface and Intel Threading Building Blocks and an asynchronous non-blocking I/O strategy. Relative to a serial process, metaphase ranking, GVF and DCE are, respectively, 100 and 300-fold faster on an 8-core desktop and 64-core cluster computers. The software was then ported to a 1024-core supercomputer, which processed 200 metaphase images each from 1025 specimens in 1.4 h. PMID:24757176

  12. Automated cerebellar segmentation: Validation and application to detect smaller volumes in children prenatally exposed to alcohol☆

    PubMed Central

    Cardenas, Valerie A.; Price, Mathew; Infante, M. Alejandra; Moore, Eileen M.; Mattson, Sarah N.; Riley, Edward P.; Fein, George

    2014-01-01

    Objective To validate an automated cerebellar segmentation method based on active shape and appearance modeling and then segment the cerebellum on images acquired from adolescents with histories of prenatal alcohol exposure (PAE) and non-exposed controls (NC). Methods Automated segmentations of the total cerebellum, right and left cerebellar hemispheres, and three vermal lobes (anterior, lobules I–V; superior posterior, lobules VI–VII; inferior posterior, lobules VIII–X) were compared to expert manual labelings on 20 subjects, studied twice, that were not used for model training. The method was also used to segment the cerebellum on 11 PAE and 9 NC adolescents. Results The test–retest intraclass correlation coefficients (ICCs) of the automated method were greater than 0.94 for all cerebellar volume and mid-sagittal vermal area measures, comparable to or better than the test–retest ICCs for manual measurement (all ICCs > 0.92). The ICCs computed on all four cerebellar measurements (manual and automated measures on the repeat scans) to assess comparability were above 0.97 for non-vermis parcels, and above 0.89 for vermis parcels. When applied to patients, the automated method detected smaller cerebellar volumes and mid-sagittal areas in the PAE group compared to controls (p < 0.05 for all regions except the superior posterior lobe, consistent with prior studies). Discussion These results demonstrate excellent reliability and validity of automated cerebellar volume and mid-sagittal area measurements, compared to manual measurements. These data also illustrate that this new technology for automatically delineating the cerebellum leads to conclusions regarding the effects of prenatal alcohol exposure on the cerebellum consistent with prior studies that used labor-intensive manual delineation, even with a very small sample. PMID:25061566

  13. Automated Retinal Image Analysis for Evaluation of Focal Hyperpigmentary Changes in Intermediate Age-Related Macular Degeneration

    PubMed Central

    Schmitz-Valckenberg, Steffen; Göbel, Arno P.; Saur, Stefan C.; Steinberg, Julia S.; Thiele, Sarah; Wojek, Christian; Russmann, Christoph; Holz, Frank G.; for the MODIAMD-Study Group

    2016-01-01

    Purpose To develop and evaluate a software tool for automated detection of focal hyperpigmentary changes (FHC) in eyes with intermediate age-related macular degeneration (AMD). Methods Color fundus (CFP) and autofluorescence (AF) photographs of 33 eyes with FHC of 28 AMD patients (mean age 71 years) from the prospective longitudinal natural history MODIAMD-study were included. Fully automated to semiautomated registration of baseline to corresponding follow-up images was evaluated. Following the manual circumscription of individual FHC (four different readings by two readers), a machine-learning algorithm was evaluated for automatic FHC detection. Results The overall pixel distance error for the semiautomated (CFP follow-up to CFP baseline: median 5.7; CFP to AF images from the same visit: median 6.5) was larger as compared for the automated image registration (4.5 and 5.7; P < 0.001 and P < 0.001). The total number of manually circumscribed objects and the corresponding total size varied between 637 to 1163 and 520,848 pixels to 924,860 pixels, respectively. Performance of the learning algorithms showed a sensitivity of 96% at a specificity level of 98% using information from both CFP and AF images and defining small areas of FHC (“speckle appearance”) as “neutral.” Conclusions FHC as a high-risk feature for progression of AMD to late stages can be automatically assessed at different time points with similar sensitivity and specificity as compared to manual outlining. Upon further development of the research prototype, this approach may be useful both in natural history and interventional large-scale studies for a more refined classification and risk assessment of eyes with intermediate AMD. Translational Relevance Automated FHC detection opens the door for a more refined and detailed classification and risk assessment of eyes with intermediate AMD in both natural history and future interventional studies. PMID:26966639

  14. An Investigation of Automatic Change Detection for Topographic Map Updating

    NASA Astrophysics Data System (ADS)

    Duncan, P.; Smit, J.

    2012-08-01

    Changes to the landscape are constantly occurring and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing changes. These manual methods are time-consuming and labour-intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The methods investigated for detecting changes are image classification and spatial analysis, with a focus on urban landscapes. The major data inputs into this study are high-resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, however, generalization of techniques on a broad scale has produced inconsistent results. A solution may lie in a hybrid of pixel-based and object-oriented techniques.

  15. Automated detection, 3D segmentation and analysis of high resolution spine MR images using statistical shape models

    NASA Astrophysics Data System (ADS)

    Neubert, A.; Fripp, J.; Engstrom, C.; Schwarz, R.; Lauer, L.; Salvado, O.; Crozier, S.

    2012-12-01

    Recent advances in high resolution magnetic resonance (MR) imaging of the spine provide a basis for the automated assessment of intervertebral disc (IVD) and vertebral body (VB) anatomy. High resolution three-dimensional (3D) morphological information contained in these images may be useful for early detection and monitoring of common spine disorders, such as disc degeneration. This work proposes an automated approach to extract the 3D segmentations of lumbar and thoracic IVDs and VBs from MR images using statistical shape analysis and registration of grey level intensity profiles. The algorithm was validated on a dataset of volumetric scans of the thoracolumbar spine of asymptomatic volunteers obtained on a 3T scanner using the relatively new 3D T2-weighted SPACE pulse sequence. Manual segmentations and expert radiological findings of early signs of disc degeneration were used in the validation. There was good agreement between manual and automated segmentation of the IVD and VB volumes with mean Dice scores of 0.89 ± 0.04 and 0.91 ± 0.02 and mean absolute surface distances of 0.55 ± 0.18 mm and 0.67 ± 0.17 mm, respectively. The method compares favourably to existing 3D MR segmentation techniques for VBs. This is the first time IVDs have been automatically segmented from 3D volumetric scans, and the shape parameters obtained were used in preliminary analyses to accurately classify (100% sensitivity, 98.3% specificity) disc abnormalities associated with early degenerative changes.
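
    To make the overlap metric quoted above concrete, here is a minimal Dice similarity coefficient computation for two binary segmentation masks. It is an editor's illustration with invented arrays, not the authors' validation code.

        import numpy as np

        def dice_score(seg_a, seg_b):
            """Dice similarity coefficient between two binary masks."""
            a = np.asarray(seg_a, dtype=bool)
            b = np.asarray(seg_b, dtype=bool)
            intersection = np.logical_and(a, b).sum()
            denom = a.sum() + b.sum()
            return 2.0 * intersection / denom if denom else 1.0

        # Hypothetical manual vs automated IVD mask on one slice.
        manual = np.zeros((64, 64), dtype=bool); manual[20:40, 22:44] = True
        auto   = np.zeros((64, 64), dtype=bool); auto[22:41, 20:42] = True
        print(f"Dice = {dice_score(manual, auto):.3f}")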

  16. Automated Region of Interest Detection of Fluorescent Neurons for Optogenetic Stimulation

    NASA Astrophysics Data System (ADS)

    Mishler, Jonathan; Plenz, Dietmar

    With the emergence of optogenetics, light has been used to simultaneously stimulate and image neural clusters in vivo for the purpose of understanding neural dynamics. Spatial light modulators (SLMs) have become the method of choice for the targeted stimulation of neural clusters, offering unprecedented spatio-temporal resolution. By first imaging, and subsequently selecting the desired neurons for stimulation, SLMs can reliably stimulate those regions of interest (ROIs). However, as the cluster size grows, manually selecting the neurons becomes cumbersome and inefficient. Automated ROI detectors for this purpose have been developed, but rely on neural fluorescent spiking for detection, requiring several thousand imaging frames. To overcome this limitation, we present an automated ROI detection algorithm, adjustable for sensitivity, that utilizes neural geometry and stationary information from a few hundred imaging frames.

  17. Effects of Response Bias and Judgment Framing on Operator Use of an Automated Aid in a Target Detection Task

    ERIC Educational Resources Information Center

    Rice, Stephen; McCarley, Jason S.

    2011-01-01

    Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in…

  18. A Statistical Analysis of Automated and Manually Detected Fires Using Environmental Satellites

    NASA Astrophysics Data System (ADS)

    Ruminski, M. G.; McNamara, D.

    2003-12-01

    The National Environmental Satellite and Data Information Service (NESDIS) of the National Oceanic and Atmospheric Administration (NOAA) has been producing an analysis of fires and smoke over the US since 1998. This product underwent significant enhancement in June 2002 with the introduction of the Hazard Mapping System (HMS), an interactive workstation based system that displays environmental satellite imagery (NOAA Geostationary Operational Environmental Satellite (GOES), NOAA Polar Operational Environmental Satellite (POES) and National Aeronautics and Space Administration (NASA) MODIS data) and fire detects from the automated algorithms for each of the satellite sensors. The focus of this presentation is to present statistics compiled on the fire detects since November 2002. The Automated Biomass Burning Algorithm (ABBA) detects fires using GOES East and GOES West imagery. The Fire Identification, Mapping and Monitoring Algorithm (FIMMA) utilizes NOAA POES 15/16/17 imagery and the MODIS algorithm uses imagery from the MODIS instrument on the Terra and Aqua spacecraft. The HMS allows satellite analysts to inspect and interrogate the automated fire detects and the input satellite imagery. The analyst can then delete those detects that are felt to be false alarms and/or add fire points that the automated algorithms have not selected. Statistics are compiled for the number of automated detects from each of the algorithms, the number of automated detects that are deleted and the number of fire points added by the analyst for the contiguous US and immediately adjacent areas of Mexico and Canada. There is no attempt to distinguish between wildfires and control or agricultural fires. A detailed explanation of the automated algorithms is beyond the scope of this presentation. However, interested readers can find a more thorough description by going to www.ssd.noaa.gov/PS/FIRE/hms.html and scrolling down to Individual Fire Layers. For the period November 2002 thru August

  19. Automated Detection of Volcanic Thermal Anomalies: Detailed Analysis of the 2004 - 2005 Mt. Etna, Italy Eruption

    NASA Astrophysics Data System (ADS)

    Steffke, A. M.; Harris, A.; Garbeil, H.; Wright, R.; Dehn, J.

    2007-05-01

    Use of thermal infrared satellite data to detect, characterize and track volcanic thermal emissions is an appealing method for monitoring volcanoes for a number of reasons. It provides a synoptic perspective, with satellite sensors such as AVHRR and MODIS allowing global coverage at least four times per day. At the same time, direct reception of calibrated digital data in a standard and stable format allows automation, enabling near-real-time analysis of many volcanoes over large regions, including volcanoes where other geophysical instruments are not deployed. In addition, extracted thermal data can be used to derive heat and volume flux estimates and time series. The development of an automated algorithm to detect volcanic thermal anomalies using thermal satellite data was first attempted over a decade ago (VAST). Subsequently several attempts have been made to create an effective way to automatically detect thermal anomalies at volcanoes using such high-temporal resolution satellite data (e.g. Okmok, MODVOLC and RAT). The underlying motivation has been to allow automated, routine and timely hot spot detection for volcanic monitoring purposes. In this study we review four algorithms that have been implemented to date, specifically: VAST, Okmok, MODVOLC and RAT. To assess how VAST and MODVOLC perform, we tested them on the 2004 - 2005 effusive eruption of Mount Etna (Sicily, Italy). These results were then compared with manually detected and picked thermal anomalies. Each algorithm is designed for a different purpose, and thus they perform differently. MODVOLC, for example, must run efficiently, up to 4 times a day, on a full global data set. Thus the number of algorithm steps is minimal and the detection threshold is high, meaning that the incidence of false positives is low, but so too is its sensitivity. In contrast, VAST is designed to run on a single volcano and has the added advantage of some user input. Thus, a greater incidence of false positives occurs, but more
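
    As an illustration of the fixed-threshold style of hot-spot test used by global algorithms of this kind, the sketch below computes a normalized thermal index from mid-infrared and thermal-infrared radiances and flags pixels that exceed a threshold. The band choices, the form of the index, and the -0.80 cutoff follow the editor's recollection of the published MODVOLC description and should be checked against the original papers; the radiance values are invented.

        import numpy as np

        def modvolc_style_hotspots(rad_mir, rad_tir, threshold=-0.80):
            """Flag pixels whose normalized thermal index exceeds a fixed threshold.

            rad_mir : mid-infrared radiance (e.g., MODIS band 21/22, ~3.9 um)
            rad_tir : thermal-infrared radiance (e.g., MODIS band 32, ~12 um)
            NTI = (MIR - TIR) / (MIR + TIR); hot volcanic pixels push NTI upward.
            """
            nti = (rad_mir - rad_tir) / (rad_mir + rad_tir)
            return nti, nti > threshold

        # Hypothetical 1x4 strip of radiances (W m^-2 sr^-1 um^-1): last pixel is "hot".
        mir = np.array([0.55, 0.60, 0.58, 3.20])
        tir = np.array([8.90, 9.10, 9.00, 7.50])
        nti, flags = modvolc_style_hotspots(mir, tir)
        print(np.round(nti, 2), flags)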

  20. Measuring the fit between human judgments and automated alerting algorithms: a study of collision detection.

    PubMed

    Bisantz, Ann M; Pritchett, Amy R

    2003-01-01

    Methodologies for assessing human judgment in complex domains are important for the design of both displays that inform judgment and automated systems that suggest judgments. This paper uses the n-system lens model to evaluate the impact of displays on human judgment and to explicitly assess the similarity between human judgments and a set of potential judgment algorithms for use in automated systems. First, the need for and concepts underlying judgment analysis are outlined. Then the n-system lens model and its parameters are formally described. This model is then used to examine a previously conducted study of aircraft collision detection that had been analyzed using standard analysis of variance methods. Our analysis found the same main effects as did the earlier analysis. However, n-system lens model analysis was able to provide greater insight into the information relied upon for judgments and the impact of displays on judgment. Additionally, the analysis was able to identify attributes of human judgments that were--and were not--similar to judgments produced by automated systems. Potential applications of this research include automated aid design and operator training.

  1. Optimal training dataset composition for SVM-based, age-independent, automated epileptic seizure detection.

    PubMed

    Bogaarts, J G; Gommer, E D; Hilkman, D M W; van Kranen-Mastenbroek, V H J M; Reulen, J P H

    2016-08-01

    Automated seizure detection is a valuable asset to health professionals, which makes adequate treatment possible in order to minimize brain damage. Most research focuses on two separate aspects of automated seizure detection: EEG feature computation and classification methods. Little research has been published regarding optimal training dataset composition for patient-independent seizure detection. This paper evaluates the performance of classifiers trained on different datasets in order to determine the optimal dataset for use in classifier training for automated, age-independent, seizure detection. Three datasets are used to train a support vector machine (SVM) classifier: (1) EEG from neonatal patients, (2) EEG from adult patients and (3) EEG from both neonates and adults. To correct for baseline EEG feature differences among patients, feature normalization is essential. Usually, dedicated detection systems are developed for either neonatal or adult patients. Normalization might allow for the development of a single seizure detection system for patients irrespective of their age. Two classifier versions are trained on all three datasets: one with feature normalization and one without. This gives us six different classifiers to evaluate using both the neonatal and adult test sets. As a performance measure, the area under the receiver operating characteristic curve (AUC) is used. With application of FBC, performance values of 0.90 and 0.93 were obtained for neonatal and adult seizure detection, respectively. For neonatal seizure detection, the classifier trained on EEG from adult patients performed significantly worse compared to both the classifier trained on EEG data from neonatal patients and the classifier trained on both neonatal and adult EEG data. For adult seizure detection, optimal performance was achieved by either the classifier trained on adult EEG data or the classifier trained on both neonatal and adult EEG data. Our results show that age
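
    The sketch below illustrates the general recipe the abstract describes, per-patient feature normalization followed by an SVM and AUC evaluation, using scikit-learn on synthetic data. It is an editor's simplification, not the authors' pipeline; the feature matrix, patient split, and normalization scheme are invented stand-ins.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.metrics import roc_auc_score

        def normalize_per_patient(features, patient_ids):
            """Z-score each patient's features against that patient's own statistics."""
            out = np.empty_like(features, dtype=float)
            for pid in np.unique(patient_ids):
                rows = patient_ids == pid
                mu = features[rows].mean(axis=0)
                sd = features[rows].std(axis=0) + 1e-12
                out[rows] = (features[rows] - mu) / sd
            return out

        rng = np.random.default_rng(0)
        # Hypothetical EEG features: 400 epochs x 6 features, 8 patients, binary seizure labels.
        X = rng.normal(size=(400, 6))
        patients = rng.integers(0, 8, size=400)
        y = rng.integers(0, 2, size=400)
        X[y == 1] += 0.8                      # toy effect so the example is non-trivial

        Xn = normalize_per_patient(X, patients)
        train = patients < 6                  # train on some patients, test on held-out patients
        clf = SVC(kernel="rbf", probability=True).fit(Xn[train], y[train])
        scores = clf.predict_proba(Xn[~train])[:, 1]
        print(f"AUC on held-out patients: {roc_auc_score(y[~train], scores):.2f}")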

  2. Automated Guided-Wave Scanning Developed to Characterize Materials and Detect Defects

    NASA Technical Reports Server (NTRS)

    Martin, Richard E.; Gyekenyeski, Andrew L.; Roth, Don J.

    2004-01-01

    The Nondestructive Evaluation (NDE) Group of the Optical Instrumentation Technology Branch at the NASA Glenn Research Center has developed a scanning system that uses guided waves to characterize materials and detect defects. The technique uses two ultrasonic transducers to interrogate the condition of a material. The sending transducer introduces an ultrasonic pulse at a point on the surface of the specimen, and the receiving transducer detects the signal after it has passed through the material. The aim of the method is to correlate certain parameters in both the time and frequency domains of the detected waveform to characteristics of the material between the two transducers. The scanning system is shown. The waveform parameters of interest include the attenuation due to internal damping, waveform shape parameters, and frequency shifts due to material changes. For the most part, guided waves are used to gauge the damage state and defect growth of materials subjected to various mechanical or environmental loads. The technique has been applied to polymer matrix composites, ceramic matrix composites, and metal matrix composites as well as metallic alloys. Historically, guided wave analysis has been a point-by-point, manual technique with waveforms collected at discrete locations and postprocessed. Data collection and analysis of this type limits the amount of detail that can be obtained. Also, the manual movement of the sensors is prone to user error and is time consuming. The development of an automated guided-wave scanning system has allowed the method to be applied to a wide variety of materials in a consistent, repeatable manner. Experimental studies have been conducted to determine the repeatability of the system as well as compare the results obtained using more traditional NDE methods. The following screen capture shows guided-wave scan results for a ceramic matrix composite plate, including images for each of nine calculated parameters. The system can

  3. Anomalous change detection in imagery

    DOEpatents

    Theiler, James P.; Perkins, Simon J.

    2011-05-31

    A distribution-based anomaly detection platform is described that identifies a non-flat background that is specified in terms of the distribution of the data. A resampling approach is also disclosed employing scrambled resampling of the original data with one class specified by the data and the other by the explicit distribution, and solving using binary classification.
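
    The patent abstract is terse, so the sketch below gives one common reading of resampling-based anomalous change detection: pixel pairs taken jointly from two co-registered images form one class, pairs with the correspondence deliberately scrambled form the other, and a binary classifier's confidence that a real pair looks scrambled serves as an anomalousness score. This is an editor's paraphrase with synthetic data, not the patented method itself.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        # Hypothetical co-registered image pair, flattened to per-pixel feature vectors.
        x = rng.normal(size=(10000, 3))                   # time-1 bands
        y = 0.9 * x + 0.1 * rng.normal(size=x.shape)      # time-2 bands, mostly correlated
        y[:20] += 4.0                                     # a few genuinely anomalous changes

        pairs_real = np.hstack([x, y])                                 # class 0: true correspondence
        pairs_scrambled = np.hstack([x, y[rng.permutation(len(y))]])   # class 1: broken correspondence
        data = np.vstack([pairs_real, pairs_scrambled])
        labels = np.r_[np.zeros(len(x)), np.ones(len(x))]

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(data, labels)
        # Anomalousness: how "scrambled" each real pixel pair looks to the classifier.
        anomaly_score = clf.predict_proba(pairs_real)[:, 1]
        print("top-scoring pixels:", np.argsort(anomaly_score)[-5:])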

  4. Automated and miniaturized detection of biological threats with a centrifugal microfluidic system

    NASA Astrophysics Data System (ADS)

    Mark, D.; van Oordt, T.; Strohmeier, O.; Roth, G.; Drexler, J.; Eberhard, M.; Niedrig, M.; Patel, P.; Zgaga-Griesz, A.; Bessler, W.; Weidmann, M.; Hufert, F.; Zengerle, R.; von Stetten, F.

    2012-06-01

    The world's growing mobility, mass tourism, and the threat of terrorism increase the risk of the fast spread of infectious microorganisms and toxins. Today's procedures for pathogen detection involve complex stationary devices, and are often too time consuming for a rapid and effective response. Therefore a robust and mobile diagnostic system is required. We present a microstructured LabDisk which performs complex biochemical analyses together with a mobile centrifugal microfluidic device which processes the LabDisk. This portable system will allow fully automated and rapid detection of biological threats at the point-of-need.

  5. Automated, per pixel Cloud Detection from High-Resolution VNIR Data

    NASA Technical Reports Server (NTRS)

    Varlyguin, Dmitry L.

    2007-01-01

    CASA is a fully automated software program for the per-pixel detection of clouds and cloud shadows from medium- (e.g., Landsat, SPOT, AWiFS) and high- (e.g., IKONOS, QuickBird, OrbView) resolution imagery without the use of thermal data. CASA is an object-based feature extraction program which utilizes a complex combination of spectral, spatial, and contextual information available in the imagery and the hierarchical self-learning logic for accurate detection of clouds and their shadows.

  6. Automated thematic mapping and change detection of ERTS-1 images

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N. (Principal Investigator); Alpaugh, H.

    1972-01-01

    The author has identified the following significant results. An ERTS-1 image was compared to aircraft photography and maps of an area near Brownsville, Texas. In the coastal region of Cameron County, natural and cultural detail were identified in the ERTS-1 image. In Hidalgo County, ground truth was located on the ERTS-1 image. Haze and 50% cloud cover over Hidalgo County reduced the usefulness of multispectral techniques for recognizing crops.

  7. THE IMPACT OF TECHNOLOGICAL CHANGE IN THE MEATPACKING INDUSTRY. AUTOMATION PROGRAM REPORT, NUMBER 1.

    ERIC Educational Resources Information Center

    DICK, WILLIAM G.

    TWENTY AUTOMATION MANPOWER SERVICES DEMONSTRATION PROJECTS WERE STARTED TO PROVIDE EXPERIENCE WITH JOB MARKET PROBLEMS CAUSED BY CHANGING TECHNOLOGY AND MASS LAYOFFS. THE FIRST OF THE SERIES, ESTABLISHED IN LOCAL PUBLIC EMPLOYMENT SERVICE OFFICES, THIS PROJECT DEALT WITH THE LAYOFF OF 675 WORKERS, PROBLEMS OF READJUSTMENT IN THE PLANT, THE…

  8. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    SciTech Connect

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.

  9. Automated Fovea Detection in Spectral Domain Optical Coherence Tomography Scans of Exudative Macular Disease.

    PubMed

    Wu, Jing; Waldstein, Sebastian M; Montuoro, Alessio; Gerendas, Bianca S; Langs, Georg; Schmidt-Erfurth, Ursula

    2016-01-01

    In macular spectral domain optical coherence tomography (SD-OCT) volumes, detection of the foveal center is required for accurate and reproducible follow-up studies, structure function correlation, and measurement grid positioning. However, disease can cause severe obscuring or deformation of the fovea, thus presenting a major challenge in automated detection. We propose a fully automated fovea detection algorithm to extract the fovea position in SD-OCT volumes of eyes with exudative maculopathy. The fovea is classified into 3 main appearances to both specify the detection algorithm used and reduce computational complexity. Based on foveal type classification, the fovea position is computed based on retinal nerve fiber layer thickness. Mean absolute distance between system and clinical expert annotated fovea positions from a dataset comprised of 240 SD-OCT volumes was 162.3 µm in cystoid macular edema and 262 µm in nAMD. The presented method has cross-vendor functionality, while demonstrating accurate and reliable performance close to typical expert interobserver agreement. The automatically detected fovea positions may be used as landmarks for intra- and cross-patient registration and to create a joint reference frame for extraction of spatiotemporal features in "big data." Furthermore, reliable analyses of retinal thickness, as well as retinal structure function correlation, may be facilitated. PMID:27660636

  10. Automated Fovea Detection in Spectral Domain Optical Coherence Tomography Scans of Exudative Macular Disease

    PubMed Central

    Wu, Jing; Montuoro, Alessio; Gerendas, Bianca S.; Langs, Georg

    2016-01-01

    In macular spectral domain optical coherence tomography (SD-OCT) volumes, detection of the foveal center is required for accurate and reproducible follow-up studies, structure function correlation, and measurement grid positioning. However, disease can cause severe obscuring or deformation of the fovea, thus presenting a major challenge in automated detection. We propose a fully automated fovea detection algorithm to extract the fovea position in SD-OCT volumes of eyes with exudative maculopathy. The fovea is classified into 3 main appearances to both specify the detection algorithm used and reduce computational complexity. Based on foveal type classification, the fovea position is computed based on retinal nerve fiber layer thickness. Mean absolute distance between system and clinical expert annotated fovea positions from a dataset comprised of 240 SD-OCT volumes was 162.3 µm in cystoid macular edema and 262 µm in nAMD. The presented method has cross-vendor functionality, while demonstrating accurate and reliable performance close to typical expert interobserver agreement. The automatically detected fovea positions may be used as landmarks for intra- and cross-patient registration and to create a joint reference frame for extraction of spatiotemporal features in “big data.” Furthermore, reliable analyses of retinal thickness, as well as retinal structure function correlation, may be facilitated. PMID:27660636

  12. Automated detection of clustered microcalcifications on mammograms: CAD system application to MIAS database

    NASA Astrophysics Data System (ADS)

    Ibrahim, Norhayati; Fujita, Hiroshi; Hara, Takeshi; Endo, Tokiko

    1997-12-01

    To investigate the detection performance of our automated detection scheme for clustered microcalcifications on mammograms, we applied our computer-aided diagnosis (CAD) system to the database of the Mammographic Image Analysis Society (MIAS) in the UK. Forty-three mammograms from this database were used in this study. In our scheme, the breast region was first extracted by determining the skinline. Histograms of the original images were used to separate the high-density area within the breast region from the fatty area around the skinline. A contrast correction technique was then applied. Gradient vectors of the image density were calculated on the contrast-corrected images. To extract the specific features of the pattern of the microcalcifications, triple-ring filter analysis was employed. A variable-ring filter was used for more accurate detection after the triple-ring filter. The features of the detected candidate areas were then characterized by feature analysis. Areas that satisfied the specified criteria were classified and displayed as clusters. As a result, the sensitivity was 95.8% at a false-positive rate of 1.8 clusters per image. This demonstrates that the automated detection of clustered microcalcifications in our CAD system is reliable as an aid to radiologists.

  13. Filament Chirality over an Entire Cycle Determined with an Automated Detection Module -- a Neat Surprise!

    NASA Astrophysics Data System (ADS)

    Martens, Petrus C.; Yeates, A. R.; Mackay, D.; Pillai, K. G.

    2013-07-01

    Using metadata produced by automated solar feature detection modules developed for SDO (Martens et al. 2012), we have discovered some trends in filament chirality and filament-sigmoid relations that are new and in part contradict the current consensus. Automated detection of solar features has the advantage over manual detection of having the detection criteria applied consistently and of being able to deal with enormous amounts of data, such as the 1 terabyte per day that SDO produces. Here we use the filament detection module developed by Bernasconi, which has metadata from 2000 on, and the sigmoid sniffer, which has been producing metadata from AIA 94 Å images since October 2011. The most interesting result we find is that the hemispheric chirality preference for filaments (dextral in the north and sinistral in the south), studied in detail for a three-year period by Pevtsov et al. (2003), seems to disappear during parts of the decline of cycle 23 and during the extended solar minimum that followed. Moreover, the hemispheric chirality rule seems to be much less pronounced during the onset of cycle 24. For sigmoids we find the expected correlation between filament chirality and sigmoid handedness (S or Z shape), although it is not as strong as expected.

  14. Detection of diarrhoeal pathogens in human faeces using an automated, robotic platform.

    PubMed

    Jex, Aaron R; Stanley, Keith K; Lo, William; Littman, Rachael; Verweij, Jaco J; Campbell, Bronwyn E; Nolan, Matthew J; Pangasa, Aradhana; Stevens, Melita A; Haydon, Shane; Gasser, Robin B

    2012-02-01

    Infectious diarrhoeal diseases represent a major socio-economic burden to humans, and are linked to a range of pathogens, including viruses, bacteria and protists. The accurate detection of such pathogens is central to control. However, detection often relies on methods that have limited diagnostic sensitivity and specificity. Here, we assessed an automated, robotic platform for the simultaneous detection of eight major pathogens associated with infectious diarrhoea. Genomic DNA samples (n = 167) from faeces from humans with diarrhoea and diagnosed as cryptosporidiosis, and 100 uninfected control subjects, were tested for adenovirus 40/41, norovirus, Clostridium difficile, Campylobacter, Salmonella, Shigella, Cryptosporidium and Giardia by multiplexed-tandem PCR, and also characterized by single-strand conformation polymorphism analysis (SSCP) and selective sequencing. All 167 samples tested positive for Cryptosporidium, five for adenovirus 40/41, four for Campylobacter, three for C. difficile and seven for Shigella spp., with no false positive results for any assay. The automated PCR exhibited a high sensitivity, with <10 individual pathogens being readily detected. The robotic detection platform assessed here represents a sensitive, high-throughput tool for key pathogens linked to infectious diarrhoea in humans. This platform requires little molecular biological expertise and is well suited to various diagnostic facilities and settings.

  15. Automated Detection of Brain Abnormalities in Neonatal Hypoxia Ischemic Injury from MR Images

    PubMed Central

    Ghosh, Nirmalya; Sun, Yu; Bhanu, Bir; Ashwal, Stephen; Obenaus, Andre

    2014-01-01

    We compared the efficacy of three automated brain injury detection methods, namely symmetry-integrated region growing (SIRG), hierarchical region splitting (HRS) and modified watershed segmentation (MWS) in human and animal magnetic resonance imaging (MRI) datasets for the detection of hypoxic ischemic injuries (HII). Diffusion-weighted imaging (DWI, 1.5T) data from neonatal arterial ischemic stroke (AIS) patients, as well as T2-weighted imaging (T2WI, 11.7T, 4.7T) at seven different time-points (1, 4, 7, 10, 17, 24 and 31 days post HII) in a rat-pup model of hypoxic ischemic injury, were used to check the temporal efficacy of our computational approaches. Sensitivity, specificity, and similarity were used as performance metrics based on manual (‘gold standard’) injury detection to quantify comparisons. When compared to the manual gold standard, automated injury localization by SIRG performed best in 62% of the data, compared with 29% for HRS and 9% for MWS. For injury severity detection, SIRG performed best in 67% of cases and HRS in 33%. Prior information is required by HRS and MWS, but not by SIRG. However, SIRG is sensitive to parameter-tuning, while HRS and MWS are not. Among these methods, SIRG performs the best in detecting lesion volumes; HRS is the most robust, while MWS lags behind in both respects. PMID:25000294

  16. Automated Detection of Benzodiazepine Dosage in ICU Patients through a Computational Analysis of Electrocardiographic Data

    PubMed Central

    Spadafore, Maxwell T.; Syed, Zeeshan; Rubinfeld, Ilan S.

    2015-01-01

    To enable automated maintenance of patient sedation in an intensive care unit (ICU) setting, more robust, quantitative metrics of sedation depth must be developed. In this study, we demonstrated the feasibility of a fully computational system that leverages low-quality electrocardiography (ECG) from a single lead to detect the presence of benzodiazepine sedatives in a subject’s system. Starting with features commonly examined manually by cardiologists searching for evidence of poisonings, we generalized the extraction of these features to a fully automated process. We tested the predictive power of these features using nine subjects from an intensive care clinical database. Features were found to be significantly indicative of a binary relationship between dose and ECG morphology, but we were unable to find evidence of a predictable continuous relationship. Fitting this binary relationship to a classifier, we achieved a sensitivity of 89% and a specificity of 95%. PMID:26958308

  17. A fully automated IIF system for the detection of antinuclear antibodies and antineutrophil cytoplasmic antibodies.

    PubMed

    Shovman, O; Agmon-Levin, N; Gilburd, B; Martins, T; Petzold, A; Matthias, T; Shoenfeld, Y

    2015-02-01

    Indirect immunofluorescence (IIF) is the main technique for the detection of antinuclear antibodies (ANA) and antineutrophil cytoplasmic antibodies (ANCA). The fully automated IIF processor HELIOS® is the first IIF processor able to automatically prepare slides and perform automated reading. The objective of the present study was to determine the diagnostic performance of this system for ANA and ANCA IIF interpretation, in comparison with visual IIF. ANA detection by visual IIF or HELIOS® was performed on 425 serum samples, including 218 consecutive samples submitted to a reference laboratory for routine ANA testing, 137 samples from healthy subjects and 70 ANA/ENA positive samples. For ANCA determination, 170 serum samples were collected: 40 samples for routine testing, 90 samples from healthy blood donors and 40 anti-PR3/anti-MPO positive subjects. Good agreement was found between the visual and automated ANA IIF approaches for positive/negative discrimination of these samples (kappa = 0.633 for ANA positive samples and kappa = 0.657 for ANA negative samples, respectively). Positive/negative IIF ANCA discrimination by HELIOS® and visual IIF revealed a complete agreement of 100% in sera from healthy subjects and PR3/MPO positive samples (kappa = 1.00). There was 95% agreement between automated and visual ANCA IIF for the routine samples. Based on these results, HELIOS® demonstrated high diagnostic performance for automated ANA and ANCA IIF interpretation, similar to visual reading, in all groups of samples.

  18. Development of an Automated DNA Detection System Using an Electrochemical DNA Chip Technology

    NASA Astrophysics Data System (ADS)

    Hongo, Sadato; Okada, Jun; Hashimoto, Koji; Tsuji, Koichi; Nikaido, Masaru; Gemma, Nobuhiro

    A new compact automated DNA detection system, Genelyzer™, has been developed. After injecting a sample solution into a cassette with a built-in electrochemical DNA chip, all steps from hybridization to detection and analysis are performed fully automatically. To detect sample DNA, the system measures electrode currents arising from the oxidation of electrochemically active intercalator molecules bound to hybridized DNA. The intercalator is supplied as a reagent solution by a fluid supply unit of the system. A feasibility test showed that simultaneous typing of six single nucleotide polymorphisms (SNPs) associated with rheumatoid arthritis (RA) was completed within two hours and that all results were consistent with those obtained by conventional typing methods. It is expected that this system opens a new way to DNA testing applications such as infectious disease testing, personalized medicine, food inspection, forensics, and others.

  19. A system for automated outbreak detection of communicable diseases in Germany.

    PubMed

    Salmon, Maëlle; Schumacher, Dirk; Burmann, Hendrik; Frank, Christina; Claus, Hermann; Höhle, Michael

    2016-01-01

    We describe the design and implementation of a novel automated outbreak detection system in Germany that monitors the routinely collected surveillance data for communicable diseases. Detecting unusually high case counts as early as possible is crucial as an accumulation may indicate an ongoing outbreak. The detection in our system is based on state-of-the-art statistical procedures conducting the necessary data mining task. In addition, we have developed effective methods to improve the presentation of the results of such algorithms to epidemiologists and other system users. The objective was to effectively integrate automatic outbreak detection into the epidemiological workflow of a public health institution. Since 2013, the system has been in routine use at the German Robert Koch Institute. PMID:27063588
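
    The production system relies on state-of-the-art statistical procedures; the sketch below is deliberately much simpler, a moving-baseline threshold in Python that flags unusually high weekly counts, included by the editor only to illustrate the basic idea of automated aberration detection. The counts, window, and z parameter are invented.

        import numpy as np

        def simple_outbreak_flags(counts, window=7, z=2.5):
            """Flag weeks whose count exceeds mean + z*std of the preceding `window` weeks.

            A deliberately simple stand-in for the far more elaborate procedures used in
            production surveillance systems.
            """
            counts = np.asarray(counts, dtype=float)
            flags = np.zeros(len(counts), dtype=bool)
            for t in range(window, len(counts)):
                base = counts[t - window:t]
                flags[t] = counts[t] > base.mean() + z * base.std(ddof=1)
            return flags

        # Hypothetical weekly case counts with a spike in week 12.
        weekly = [4, 5, 3, 6, 4, 5, 4, 6, 5, 4, 5, 6, 19, 7, 5]
        print(simple_outbreak_flags(weekly))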

  20. A rich Internet application for automated detection of road blockage in post-disaster scenarios

    NASA Astrophysics Data System (ADS)

    Liu, W.; Dong, P.; Liu, S.; Liu, J.

    2014-02-01

    This paper presents the development of a rich Internet application for automated detection of road blockage in post-disaster scenarios using volunteered geographic information from OpenStreetMap street centerlines and airborne light detection and ranging (LiDAR) data. The architecture of the application on the client-side and server-side was described. The major functionality of the application includes shapefile uploading, Web editing for spatial features, road blockage detection, and blockage points downloading. An example from the 2010 Haiti earthquake was included to demonstrate the effectiveness of the application. The results suggest that the prototype application can effectively detect (1) road blockage caused by earthquakes, and (2) some human errors caused by contributors of volunteered geographic information.

  1. Automated sinkhole detection using a DEM subsetting technique and fill tools at Mammoth Cave National Park

    NASA Astrophysics Data System (ADS)

    Wall, J.; Bohnenstiehl, D. R.; Levine, N. S.

    2013-12-01

    An automated workflow for sinkhole detection is developed using Light Detection and Ranging (Lidar) data from Mammoth Cave National Park (MACA). While the park is known to sit within a karst formation, the generally dense canopy cover and the size of the park (~53,000 acres) create issues for sinkhole inventorying. Lidar provides a useful remote sensing technology for peering beneath the canopy in hard-to-reach areas of the park. In order to detect sinkholes, a subsetting technique is used to interpolate a Digital Elevation Model (DEM), thereby reducing edge effects. For each subset, standard GIS fill tools are used to fill depressions within the DEM. The initial DEM is then subtracted from the filled DEM, yielding the detected depressions or sinkholes. The resulting depressions are then described in terms of size and geospatial trend.
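
    A minimal version of the fill-and-subtract step can be written with scikit-image's greyscale reconstruction standing in for the GIS fill tool; the tiny DEM and the 0.5 m depth threshold below are invented for illustration and are not the workflow actually used at MACA.

        import numpy as np
        from skimage.morphology import reconstruction

        def fill_depressions(dem):
            """Fill closed depressions in a DEM by greyscale reconstruction by erosion."""
            seed = np.copy(dem)
            seed[1:-1, 1:-1] = dem.max()      # start from a 'flooded' interior, keep the rim
            return reconstruction(seed, dem, method="erosion")

        # Hypothetical 5x5 DEM (metres) with a small closed depression in the middle.
        dem = np.full((5, 5), 210.0)
        dem[2, 2] = 204.0
        dem[2, 1] = 207.0

        filled = fill_depressions(dem)
        depth = filled - dem                   # depression depth surface
        sinkholes = depth > 0.5                # minimum-depth threshold in metres
        print(np.round(depth, 1))
        print(sinkholes)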

  2. Image Change Detection via Ensemble Learning

    SciTech Connect

    Martin, Benjamin W; Vatsavai, Raju

    2013-01-01

    The concept of geographic change detection is relevant in many areas. Changes in geography can reveal much information about a particular location. For example, analysis of changes in geography can identify regions of population growth, change in land use, and potential environmental disturbance. A common way to perform change detection is to use a simple method such as differencing to detect regions of change. Though simple, these techniques are often of very limited applicability. Recently, use of machine learning methods such as neural networks for change detection has been explored with great success. In this work, we explore the use of ensemble learning methodologies for detecting changes in bitemporal synthetic aperture radar (SAR) images. Ensemble learning uses a collection of weak machine learning classifiers to create a stronger classifier with higher accuracy than the individual classifiers in the ensemble. The strength of the ensemble lies in the fact that its individual classifiers form a mixture of experts, with the final classification calculated from their combined outputs. Our methodology leverages this aspect of ensemble learning by training collections of weak decision-tree-based classifiers to identify regions of change in SAR images collected over a region of the Staten Island, New York area during Hurricane Sandy. Preliminary studies show that the ensemble method has approximately 11.5% higher change detection accuracy than an individual classifier.
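
    As a hedged illustration of the ensemble idea (not the authors' exact features, data, or classifier), the sketch below trains bagged decision trees on simple per-pixel bitemporal SAR features, including a log-ratio band, using scikit-learn and synthetic intensities.

        import numpy as np
        from sklearn.ensemble import BaggingClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(2)
        # Hypothetical per-pixel features from a bitemporal SAR pair:
        # [intensity_t1, intensity_t2, log-ratio], with labels True = changed.
        n = 5000
        i1 = rng.gamma(shape=4.0, scale=0.25, size=n)
        i2 = i1 * rng.lognormal(mean=0.0, sigma=0.15, size=n)
        labels = rng.random(n) < 0.1
        i2[labels] *= rng.uniform(2.0, 4.0, size=labels.sum())   # changed pixels brighten
        X = np.column_stack([i1, i2, np.log(i2 / i1)])

        train = rng.random(n) < 0.7
        ensemble = BaggingClassifier(
            DecisionTreeClassifier(max_depth=5),   # weak base learner
            n_estimators=50,
            random_state=0,
        ).fit(X[train], labels[train])
        pred = ensemble.predict(X[~train])
        print(f"change-detection accuracy: {accuracy_score(labels[~train], pred):.3f}")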

  3. Automated Detection of Selective Logging in Amazon Forests Using Airborne Lidar Data and Pattern Recognition Algorithms

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; d'Oliveira, M. N.; Takemura, C. M.; Vitoria, D.; Araujo, L. S.; Morton, D. C.

    2012-12-01

    Selective logging, the removal of several valuable timber trees per hectare, is an important land use in the Brazilian Amazon and may degrade forests through long-term changes in structure, loss of forest carbon and species diversity. Similar to deforestation, the annual area affected by selective logging has declined significantly in the past decade. Nonetheless, this land use affects several thousand km² per year in Brazil. We studied a 1000 ha area of the Antimary State Forest (FEA) in the State of Acre, Brazil (9.304°S, 68.281°W) that has a basal area of 22.5 m² ha⁻¹ and an above-ground biomass of 231 Mg ha⁻¹. Logging intensity was low, approximately 10 to 15 m³ ha⁻¹. We collected small-footprint airborne lidar data using an Optech ALTM 3100EA over the study area once each in 2010 and 2011. The study area contained both recent and older logging that used both conventional and technologically advanced logging techniques. Lidar return density averaged over 20 returns m⁻² for both collection periods, with estimated horizontal and vertical precision of 0.30 and 0.15 m. A relative density model comparing returns from 0 to 1 m elevation to returns in the 1–5 m elevation range revealed the pattern of roads and skid trails. These patterns were confirmed by ground-based GPS survey. A GIS model of the road and skid network was built using lidar and ground data. We tested and compared two pattern recognition approaches used to automate logging detection. Both commercial eCognition segmentation and a Frangi filter algorithm identified the road and skid trail network, which was compared against the GIS model. We report on the effectiveness of these two techniques.
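
    The relative density model mentioned above can be illustrated with a small gridding routine that counts returns in the 0–1 m and 1–5 m height-above-ground bins per cell and takes their ratio; cleared roads and skid trails tend to show high ratios. The point cloud, cell size, and the artificial "cleared strip" below are invented; this is an editor's sketch, not the authors' processing chain.

        import numpy as np

        def relative_density_grid(x, y, hag, cell=5.0):
            """Ratio of ground-hugging returns (0-1 m above ground) to understory
            returns (1-5 m) per grid cell; high ratios can indicate cleared skid trails."""
            ix = (x // cell).astype(int)
            iy = (y // cell).astype(int)
            low = np.zeros((iy.max() + 1, ix.max() + 1))
            mid = np.zeros_like(low)
            np.add.at(low, (iy, ix), (hag >= 0) & (hag < 1))
            np.add.at(mid, (iy, ix), (hag >= 1) & (hag < 5))
            return low / np.maximum(mid, 1)    # avoid division by zero in empty cells

        # Hypothetical normalized lidar returns: x, y (m) and height above ground (m).
        rng = np.random.default_rng(3)
        x = rng.uniform(0, 50, 20000)
        y = rng.uniform(0, 50, 20000)
        hag = rng.uniform(0, 30, 20000)
        strip = (x > 20) & (x < 25)                       # a cleared strip of low vegetation
        hag[strip] = rng.uniform(0, 1, strip.sum())
        print(np.round(relative_density_grid(x, y, hag), 2))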

  4. Computerized detection of breast cancer on automated breast ultrasound imaging of women with dense breasts

    PubMed Central

    Drukker, Karen; Sennett, Charlene A.; Giger, Maryellen L.

    2014-01-01

    Purpose: Develop a computer-aided detection method and investigate its feasibility for detection of breast cancer in automated 3D ultrasound images of women with dense breasts. Methods: The HIPAA compliant study involved a dataset of volumetric ultrasound image data, “views,” acquired with an automated U-Systems Somo•V® ABUS system for 185 asymptomatic women with dense breasts (BI-RADS Composition/Density 3 or 4). For each patient, three whole-breast views (3D image volumes) per breast were acquired. A total of 52 patients had breast cancer (61 cancers), diagnosed through any follow-up at most 365 days after the original screening mammogram. Thirty-one of these patients (32 cancers) had a screening-mammogram with a clinically assigned BI-RADS Assessment Category 1 or 2, i.e., were mammographically negative. All software used for analysis was developed in-house and involved 3 steps: (1) detection of initial tumor candidates, (2) characterization of candidates, and (3) elimination of false-positive candidates. Performance was assessed by calculating the cancer detection sensitivity as a function of the number of “marks” (detections) per view. Results: At a single mark per view, i.e., six marks per patient, the median detection sensitivity by cancer was 50.0% (16/32) ± 6% for patients with a screening mammogram-assigned BI-RADS category 1 or 2—similar to radiologists’ performance sensitivity (49.9%) for this dataset from a prior reader study—and 45.9% (28/61) ± 4% for all patients. Conclusions: Promising detection sensitivity was obtained for the computer on a 3D ultrasound dataset of women with dense breasts at a rate of false-positive detections that may be acceptable for clinical implementation. PMID:24387528

  5. Computerized detection of breast cancer on automated breast ultrasound imaging of women with dense breasts

    SciTech Connect

    Drukker, Karen; Sennett, Charlene A.; Giger, Maryellen L.

    2014-01-15

    Purpose: Develop a computer-aided detection method and investigate its feasibility for detection of breast cancer in automated 3D ultrasound images of women with dense breasts. Methods: The HIPAA compliant study involved a dataset of volumetric ultrasound image data, “views,” acquired with an automated U-Systems Somo•V® ABUS system for 185 asymptomatic women with dense breasts (BI-RADS Composition/Density 3 or 4). For each patient, three whole-breast views (3D image volumes) per breast were acquired. A total of 52 patients had breast cancer (61 cancers), diagnosed through any follow-up at most 365 days after the original screening mammogram. Thirty-one of these patients (32 cancers) had a screening-mammogram with a clinically assigned BI-RADS Assessment Category 1 or 2, i.e., were mammographically negative. All software used for analysis was developed in-house and involved 3 steps: (1) detection of initial tumor candidates, (2) characterization of candidates, and (3) elimination of false-positive candidates. Performance was assessed by calculating the cancer detection sensitivity as a function of the number of “marks” (detections) per view. Results: At a single mark per view, i.e., six marks per patient, the median detection sensitivity by cancer was 50.0% (16/32) ± 6% for patients with a screening mammogram-assigned BI-RADS category 1 or 2—similar to radiologists’ performance sensitivity (49.9%) for this dataset from a prior reader study—and 45.9% (28/61) ± 4% for all patients. Conclusions: Promising detection sensitivity was obtained for the computer on a 3D ultrasound dataset of women with dense breasts at a rate of false-positive detections that may be acceptable for clinical implementation.

  6. Automated detection of presence of mucus foci in airway diseases: preliminary results

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.; Ko, Jane; Godoy, Myrna C. B.

    2009-02-01

    Chronic Obstructive Pulmonary Disease (COPD) is often characterized by partial or complete obstruction of airflow in the lungs. This can be due to airway wall thickening and retained secretions, resulting in foci of mucoid impactions. Although radiologists have proposed scoring systems to assess extent and severity of airway diseases from CT images, these scores are seldom used clinically due to impracticality. The high level of subjectivity from visual inspection and the sheer number of airways in the lungs mean that automation is critical in order to realize accurate scoring. In this work we assess the feasibility of including an automated mucus detection method in a clinical scoring system. Twenty high-resolution datasets of patients with mild to severe bronchiectasis were randomly selected and used to test the ability of the computer to detect the presence or absence of mucus in each lobe (100 lobes in all). Two experienced radiologists independently scored the presence or absence of mucus in each lobe based on the visual assessment method recommended by Sheehan et al. [1]. These results were compared with an automated method developed for mucus plug detection [2]. Results showed agreement between the two readers on 44% of the lobes for presence of mucus, 39% of lobes for absence of mucus, and discordant opinions on 17 lobes. For the 61 lobes where one or both readers detected mucus, the computer sensitivity was 75.4%, the specificity was 69.2%, and the positive predictive value (PPV) was 79.3%. Six computer false positives were reviewed a posteriori by the experts and reassessed as true positives, yielding 77.6% sensitivity, 81.8% specificity, and 89.6% PPV.

  7. Automated Cell Detection and Morphometry on Growth Plate Images of Mouse Bone

    PubMed Central

    Ascenzi, Maria-Grazia; Du, Xia; Harding, James I; Beylerian, Emily N; de Silva, Brian M; Gross, Ben J; Kastein, Hannah K; Wang, Weiguang; Lyons, Karen M; Schaeffer, Hayden

    2014-01-01

    Microscopy imaging of mouse growth plates is extensively used in biology to understand the effect of specific molecules on various stages of normal bone development and on bone disease. Until now, such image analysis has been conducted by manual detection. In fact, when existing automated detection techniques were applied, morphological variations across the growth plate and heterogeneity of image background color, including the faint presence of cells (chondrocytes) located deeper in tissue away from the image’s plane of focus, and lack of cell-specific features, interfered with the identification of cells. We propose the first method of automated detection and morphometry applicable to images of cells in the growth plate of long bone. Through ad hoc sequential application of the Retinex method, anisotropic diffusion and thresholding, our new cell detection algorithm (CDA) addresses these challenges on bright-field microscopy images of mouse growth plates. Five parameters, chosen by the user according to image characteristics, regulate our CDA. Our results demonstrate the effectiveness of the proposed numerical method relative to manual methods. Our CDA confirms previously established results regarding the number, area, orientation, height, and shape of chondrocytes in normal growth plates. Our CDA also confirms differences previously found between the genetically mutated Smad1/5CKO mouse and its control on fluorescence images. The CDA aims to aid biomedical research by increasing efficiency and consistency of data collection regarding arrangement and characteristics of chondrocytes. Our results suggest that automated extraction of data from microscopy imaging of growth plates can assist in unlocking information on normal and pathological development, key to the underlying biological mechanisms of bone growth. PMID:25525552
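
    The abstract's Retinex, anisotropic diffusion, and thresholding stages are not reproduced here, but the flavor of the smoothing-then-threshold portion can be sketched with a generic Perona–Malik diffusion loop followed by Otsu thresholding on a stock scikit-image test image. Parameter values and the test image are placeholders, not those of the published CDA.

        import numpy as np
        from skimage import data, filters

        def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.2):
            """Perona-Malik diffusion: smooth within regions while preserving edges
            (periodic boundaries used for brevity)."""
            u = img.astype(float).copy()
            for _ in range(n_iter):
                # finite differences to the four neighbours
                dn = np.roll(u, -1, axis=0) - u
                ds = np.roll(u, 1, axis=0) - u
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u
                # edge-stopping weight g = exp(-(|grad|/kappa)^2) applied to each difference
                u += gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
            return u

        img = data.coins()                     # stand-in for a bright-field growth-plate image
        smoothed = anisotropic_diffusion(img)
        mask = smoothed > filters.threshold_otsu(smoothed)
        print(f"segmented foreground fraction: {mask.mean():.2f}")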

  8. Automated Detection of coronal mass ejections in three-dimensions using multi-viewpoint observations

    NASA Astrophysics Data System (ADS)

    Hutton, Joseph; Morgan, Huw

    2016-10-01

    A new, automated method of detecting Solar Wind transients such as Coronal Mass Ejections (CMEs) in three dimensions for the LASCO C2 and STEREO COR2 coronagraphs is presented. By triangulating isolated CME signal from the three coronagraphs over a sliding window of five hours, the most likely region through which CMEs pass at 5 solar radii is identified. The centre and size of the region gives the most likely direction of propagation and angular extent. The Automated CME Triangulation (ACT) method is tested extensively using a series of synthetic CME images created using a flux rope density model, and on a sample of real coronagraph data; including Halo CMEs. The accuracy of the detection remains acceptable regardless of CME position relative to the observer, the relative separation of the three observers, and even through the loss of one coronagraph. By comparing the detection results with the input parameters of the synthetic CMEs, and the low coronal sources of the real CMEs, it is found that the detection is on average accurate to within 7.14 degrees. All current CME catalogues (CDAW, CACTus, SEEDS, ARTEMIS and CORIMP) rely on plane-of-sky measurements for key parameters such as height and velocity. Estimating the true geometry using the new method gains considerable accuracy for kinematics and mass/density. The results of the new method will be incorporated into the CORIMP database in the near future, enabling improved space weather diagnostics and forecasting.

  9. Fully Automated Centrifugal Microfluidic Device for Ultrasensitive Protein Detection from Whole Blood.

    PubMed

    Park, Yang-Seok; Sunkara, Vijaya; Kim, Yubin; Lee, Won Seok; Han, Ja-Ryoung; Cho, Yoon-Kyoung

    2016-01-01

    Enzyme-linked immunosorbent assay (ELISA) is a promising method to detect small amounts of proteins in biological samples. Devices that provide a platform for reduced sample volume and assay time, as well as full automation, are required for potential use in point-of-care diagnostics. Recently, we have demonstrated ultrasensitive detection of serum proteins, C-reactive protein (CRP) and cardiac troponin I (cTnI), utilizing a lab-on-a-disc composed of TiO2 nanofibrous (NF) mats. It showed a large dynamic range with femtomolar (fM) detection sensitivity, from a small volume of whole blood in 30 min. The device consists of several components for blood separation, metering, mixing, and washing that are automated for improved sensitivity from low sample volumes. Here, in the video demonstration, we show the experimental protocols and know-how for the fabrication of NFs as well as the disc, their integration and the operation in the following order: processes for preparing the TiO2 NF mat; transfer-printing of the TiO2 NF mat onto the disc; surface modification for immunoreactions; disc assembly and operation; on-disc detection and representative results for immunoassay. Use of this device enables multiplexed analysis with minimal consumption of samples and reagents. Given the advantages, the device should find use in a wide variety of applications, and prove beneficial in facilitating the analysis of low-abundance proteins. PMID:27167836

  10. Probabilistic Change Detection Framework for Analyzing Settlement Dynamics Using Very High-resolution Satellite Imagery

    SciTech Connect

    Vatsavai, Raju; Graesser, Jordan B

    2012-01-01

    Global human population growth and an increasingly urbanizing world have led to rapid changes in human settlement landscapes and patterns. Timely monitoring and assessment of these changes and dissemination of accurate information are important for policy makers, city planners, and humanitarian relief workers. Satellite imagery provides useful data for the aforementioned applications, and remote sensing can be used to identify and quantify change areas. We explore a probabilistic framework to identify changes in human settlements using very high-resolution satellite imagery. Compared to predominantly pixel-based change detection systems, which are highly sensitive to image registration errors, our grid (block) based approach is more robust to such errors. The presented framework is an automated change detection system applicable to both panchromatic and multi-spectral imagery. The detection system provides comprehensible information about change areas, and minimizes the post-detection thresholding procedure often needed in traditional change detection algorithms.
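
    The published probabilistic framework is not reproduced here; the sketch below shows only the general grid-based idea, comparing per-block grey-level histograms of two co-registered images and flagging blocks whose distributions differ strongly, which is naturally more tolerant of small registration errors than pixel differencing. Block size, histogram settings, and the synthetic images are invented.

        import numpy as np

        def block_change_map(img1, img2, block=32, bins=16, threshold=0.25):
            """Compare per-block grey-level histograms of two co-registered images and
            flag blocks whose normalized histograms differ strongly (total-variation distance)."""
            h, w = img1.shape
            ny, nx = h // block, w // block
            change = np.zeros((ny, nx), dtype=bool)
            edges = np.linspace(min(img1.min(), img2.min()),
                                max(img1.max(), img2.max()), bins + 1)
            width = edges[1] - edges[0]
            for j in range(ny):
                for i in range(nx):
                    b1 = img1[j*block:(j+1)*block, i*block:(i+1)*block]
                    b2 = img2[j*block:(j+1)*block, i*block:(i+1)*block]
                    h1, _ = np.histogram(b1, bins=edges, density=True)
                    h2, _ = np.histogram(b2, bins=edges, density=True)
                    change[j, i] = 0.5 * np.abs(h1 - h2).sum() * width > threshold
            return change

        rng = np.random.default_rng(4)
        t1 = rng.normal(100, 10, size=(128, 128))
        t2 = t1 + rng.normal(0, 2, size=(128, 128))
        t2[64:96, 32:64] += 40                  # a new structure appears in one block region
        print(block_change_map(t1, t2))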

  11. Automatic detection of surface changes on Mars - a status report

    NASA Astrophysics Data System (ADS)

    Sidiropoulos, Panagiotis; Muller, Jan-Peter

    2016-10-01

    Orbiter missions have acquired approximately 500,000 high-resolution visible images of the Martian surface, covering an area approximately 6 times larger than the overall area of Mars. This data abundance allows the scientific community to examine the Martian surface thoroughly and potentially make exciting new discoveries. However, the increased data volume, as well as its complexity, generates problems at the data processing stages, which are mainly related to a number of unresolved issues that batch-mode planetary data processing presents. As a matter of fact, the scientific community is currently struggling to scale the common paradigm ("one-at-a-time" processing of incoming products by expert scientists) to tackle the large volumes of input data. Moreover, expert scientists are more or less forced to use complex software in order to extract input information for their research from raw data, even though they are not data scientists themselves. Our work within the STFC and EU FP7 i-Mars projects aims at developing automated software that will process all of the acquired data, leaving domain expert planetary scientists to focus on their final analysis and interpretation. Moreover, after completing the development of a fully automated pipeline that co-registers high-resolution NASA images to the ESA/DLR HRSC baseline, our main goal has shifted to the automated detection of surface changes on Mars. In particular, we are developing a pipeline that takes multi-instrument image pairs as input and processes them automatically in order to identify changes that are correlated with dynamic phenomena on the Martian surface. The pipeline has so far been tested operationally on 8,000 co-registered images and by the time of DPS/EPSC we expect to have processed many tens of thousands of image pairs, producing a set of change detection results, a subset of which will be shown in the presentation. The research leading to these results has received

  12. Indigenous people's detection of rapid ecological change.

    PubMed

    Aswani, Shankar; Lauer, Matthew

    2014-06-01

    When sudden catastrophic events occur, it becomes critical for coastal communities to detect and respond to environmental transformations, because failure to do so may undermine overall ecosystem resilience and threaten people's livelihoods. We therefore asked how capable local, indigenous people are of detecting rapid ecological change following massive environmental disruptions. We assessed the direction and periodicity of experiential learning of people in the Western Solomon Islands after a tsunami in 2007. We compared the results of marine science surveys with local ecological knowledge of the benthos across 3 affected villages and 3 periods before and after the tsunami. We sought to determine how people recognize biophysical changes in the environment before and after catastrophic events such as earthquakes and tsunamis, and whether people can detect ecological changes over short time scales or need longer time scales to recognize them. Indigenous people were able to detect changes in the benthos over time. Detection levels differed between marine science surveys and local ecological knowledge sources over time, but overall patterns of statistically significant detection of change were evident for various habitats. Our findings have implications for marine conservation, coastal management policies, and disaster-relief efforts, because when people are able to detect ecological changes, this, in turn, affects how they exploit and manage their marine resources.

  13. Automated Detection of Toxigenic Clostridium difficile in Clinical Samples: Isothermal tcdB Amplification Coupled to Array-Based Detection

    PubMed Central

    Pasko, Chris; Groves, Benjamin; Ager, Edward; Corpuz, Maylene; Frech, Georges; Munns, Denton; Smith, Wendy; Warcup, Ashley; Denys, Gerald; Ledeboer, Nathan A.; Lindsey, Wes; Owen, Charles; Rea, Larry; Jenison, Robert

    2012-01-01

    Clostridium difficile can carry a genetically variable pathogenicity locus (PaLoc), which encodes clostridial toxins A and B. In hospitals and in the community at large, this organism is increasingly identified as a pathogen. To develop a diagnostic test that combines the strengths of immunoassays (cost) and DNA amplification assays (sensitivity/specificity), we targeted a genetically stable PaLoc region, amplifying tcdB sequences and detecting them by hybridization capture. The assay employs a hot-start isothermal method coupled to a multiplexed chip-based readout, creating a manual assay that detects toxigenic C. difficile with high sensitivity and specificity within 1 h. Assay automation on an electromechanical instrument produced an analytical sensitivity of 10 CFU (95% probability of detection) of C. difficile in fecal samples, along with discrimination against other enteric bacteria. To verify automated assay function, 130 patient samples were tested: 31/32 positive samples (97% sensitive; 95% confidence interval [CI], 82 to 99%) and 98/98 negative samples (100% specific; 95% CI, 95 to 100%) were scored correctly. Large-scale clinical studies are now planned to determine clinical sensitivity and specificity. PMID:22675134

  14. Foreign object detection and removal to improve automated analysis of chest radiographs

    SciTech Connect

    Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-07-15

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to that of images with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
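
    A hedged sketch of the detection and segmentation steps described above (supervised pixel classification with a kNN classifier, thresholding, and grouping). The features (intensity plus local mean and standard deviation), window size, and threshold are assumptions made for illustration; the inpainting stage is omitted.

        import numpy as np
        from scipy import ndimage
        from sklearn.neighbors import KNeighborsClassifier

        def pixel_features(img):
            """Per-pixel features: intensity plus local mean and standard deviation."""
            mean = ndimage.uniform_filter(img, size=9)
            sq_mean = ndimage.uniform_filter(img * img, size=9)
            std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
            return np.stack([img, mean, std], axis=-1).reshape(-1, 3)

        def detect_foreign_objects(train_imgs, train_masks, test_img, k=15, thr=0.5):
            """kNN posterior per pixel, thresholded and grouped into candidate objects."""
            X = np.vstack([pixel_features(im) for im in train_imgs])
            y = np.concatenate([m.ravel().astype(int) for m in train_masks])
            clf = KNeighborsClassifier(n_neighbors=k).fit(X, y)
            prob = clf.predict_proba(pixel_features(test_img))[:, 1]
            prob = prob.reshape(test_img.shape)
            labels, n_objects = ndimage.label(prob > thr)   # group suprathreshold pixels
            return prob, labels, n_objects                  # objects could then be inpainted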

  15. Semi-automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    PubMed Central

    Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of these data make human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867
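
    The 3D linking step can be illustrated with a small sketch (a stand-in for the correlation measure used by the authors, not their implementation): regions in consecutive 2D label images are linked when their overlap fraction is high enough, and chains of such links form nonbranching processes. The overlap threshold and label conventions are assumptions.

        import numpy as np

        def link_sections(labels_a, labels_b, min_overlap=0.5):
            """Link 2D region labels in consecutive sections (0 = background).

            Each region in section A is linked to the region in section B it
            overlaps most, provided the overlap fraction exceeds the threshold."""
            links = {}
            for ra in np.unique(labels_a):
                if ra == 0:
                    continue
                mask = labels_a == ra
                below = labels_b[mask]
                below = below[below != 0]
                if below.size == 0:
                    continue
                rb, counts = np.unique(below, return_counts=True)
                best = np.argmax(counts)
                if counts[best] / mask.sum() >= min_overlap:
                    links[int(ra)] = int(rb[best])
            return links   # {region id in A: linked region id in B}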

  16. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transduction events, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the processing time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS including an integrated microfluidic device, an automated stage, and electrical relays for high-throughput clinical screening. Based on these results, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application for analyzing tissue samples in a clinical setting.

  17. Development of Raman microspectroscopy for automated detection and imaging of basal cell carcinoma

    NASA Astrophysics Data System (ADS)

    Larraona-Puy, Marta; Ghita, Adrian; Zoladek, Alina; Perkins, William; Varma, Sandeep; Leach, Iain H.; Koloydenko, Alexey A.; Williams, Hywel; Notingher, Ioan

    2009-09-01

    We investigate the potential of Raman microspectroscopy (RMS) for automated evaluation of excised skin tissue during Mohs micrographic surgery (MMS). The main aim is to develop an automated method for imaging and diagnosis of basal cell carcinoma (BCC) regions. Selected Raman bands responsible for the largest spectral differences between BCC and normal skin regions, together with linear discriminant analysis (LDA), are used to build a multivariate supervised classification model. The model is based on 329 Raman spectra measured on skin tissue obtained from 20 patients. BCC is discriminated from healthy tissue with 90±9% sensitivity and 85±9% specificity in a 70%/30% split cross-validation. This multivariate model is then applied to tissue sections from new patients to image tumor regions. The RMS images show excellent correlation with the gold standard of histopathology sections, BCC being detected in all positive sections. We demonstrate the potential of RMS as an automated objective method for tumor evaluation during MMS. The replacement of current histopathology during MMS by a "generalization" of the proposed technique may improve the feasibility and efficacy of MMS, leading to wider use according to clinical need.
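
    A minimal sketch of the classification step under stated assumptions: intensities at a set of selected Raman bands are used as features for an LDA model with a 70%/30% split, as in the abstract. The band indices, the spectra matrix layout, and the use of scikit-learn are illustrative choices, not the authors' code.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split

        def train_bcc_classifier(spectra, labels, band_idx, test_size=0.3, seed=0):
            """LDA on selected Raman bands (columns of the spectra matrix).

            spectra: (n_spectra, n_wavenumbers); band_idx: indices of the bands that
            best separate BCC from normal tissue; labels: 1 = BCC, 0 = normal skin."""
            spectra, labels = np.asarray(spectra), np.asarray(labels)
            X = spectra[:, band_idx]
            X_tr, X_te, y_tr, y_te = train_test_split(
                X, labels, test_size=test_size, stratify=labels, random_state=seed)
            lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
            y_hat = lda.predict(X_te)
            sensitivity = np.mean(y_hat[y_te == 1] == 1)   # on held-out BCC spectra
            specificity = np.mean(y_hat[y_te == 0] == 0)   # on held-out normal spectra
            return lda, sensitivity, specificity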

  18. Semi-Automated Neuron Boundary Detection and Nonbranching Process Segmentation in Electron Microscopy Images

    SciTech Connect

    Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga

    2013-01-01

    Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of these data make human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.

  19. Automated detection system of single nucleotide polymorphisms using two kinds of functional magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Liu, Hongna; Li, Song; Wang, Zhifei; Li, Zhiyang; Deng, Yan; Wang, Hua; Shi, Zhiyang; He, Nongyue

    2008-11-01

    Single nucleotide polymorphisms (SNPs) comprise the most abundant source of genetic variation in the human genome. Large-scale codominant SNP identification, especially for SNPs associated with complex diseases, has therefore created the need for completely high-throughput and automated SNP genotyping methods. Herein, we present an automated SNP detection system based on two kinds of functional magnetic nanoparticles (MNPs) and dual-color hybridization. The amino-modified MNPs (NH2-MNPs), functionalized with APTES, were used for DNA extraction directly from whole blood by electrostatic interaction, and PCR was then successfully performed. Furthermore, biotinylated PCR products were captured on streptavidin-coated MNPs (SA-MNPs) and interrogated by hybridization with a pair of dual-color probes to determine the SNP, and the genotype of each sample can then be simultaneously identified by scanning the microarray printed with the denatured fluorescent probes. This system provides a rapid, sensitive, and highly versatile automated procedure that will greatly facilitate the analysis of different known SNPs in the human genome.

  20. Sleep spindle detection: crowdsourcing and evaluating performance of experts, non-experts, and automated methods

    PubMed Central

    Warby, Simon C.; Wendt, Sabrina L.; Welinder, Peter; Munk, Emil G.S.; Carrillo, Oscar; Sorensen, Helge B.D.; Jennum, Poul; Peppard, Paul E.; Perona, Pietro; Mignot, Emmanuel

    2014-01-01

    Sleep spindles are discrete, intermittent patterns of brain activity that arise as a result of interactions of several circuits in the brain. Increasingly, these oscillations are of biological and clinical interest because of their role in development, learning, and neurological disorders. We used an internet interface to ‘crowdsource’ spindle identification from human experts and non-experts, and compared performance with 6 automated detection algorithms in middle-to-older aged subjects from the general population. We also developed a method for forming group consensus, and refined methods of evaluating the performance of event detectors in physiological data such as polysomnography. Compared to the gold standard, the highest performance was by individual experts and the non-expert group consensus, followed by automated spindle detectors. Crowdsourcing the scoring of sleep data is an efficient method to collect large datasets, even for difficult tasks such as spindle identification. Further refinements to automated sleep spindle algorithms are needed for middle-to-older aged subjects. PMID:24562424

  1. Anger superiority effect for change detection and change blindness.

    PubMed

    Lyyra, Pessi; Hietanen, Jari K; Astikainen, Piia

    2014-11-01

    In visual search, an angry face in a crowd "pops out" unlike a happy or a neutral face. This "anger superiority effect" conflicts with views of visual perception holding that complex stimulus contents cannot be detected without focused top-down attention. Implicit visual processing of threatening changes was studied by recording event-related potentials (ERPs) to facial stimuli in the change blindness paradigm, in which conscious change detection is prevented by presenting a blank screen before the change. Even before conscious detection, angry faces modulated relatively early emotion-sensitive ERPs when appearing among happy and neutral faces, whereas happy faces did so only among neutral, not angry, faces. Conscious change detection was more efficient for angry than happy faces regardless of background. These findings indicate that the brain can implicitly extract complex emotional information from facial stimuli, and that the biological relevance of threatening content can speed up its breakthrough into visual consciousness.

  2. Automated feature detection and identification in digital point-ordered signals

    DOEpatents

    Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.

    1998-01-01

    A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, the features are verified using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically, without initial operator set-up and without subjective operator judgement of features.
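
    A sketch of the morphological filtering idea on a 1D point-ordered signal, under assumed window sizes and a robust threshold: a long grey-scale opening estimates the baseline, a short closing-opening pair suppresses noise in the residual, and suprathreshold runs become candidate features. The expert-system verification stage is omitted.

        import numpy as np
        from scipy.ndimage import grey_opening, grey_closing, label, find_objects

        def detect_signal_features(signal, baseline_win=201, feature_win=9, k=4.0):
            """Detect peak-like features in a point-ordered signal via 1D morphology."""
            signal = np.asarray(signal, dtype=float)
            baseline = grey_opening(signal, size=baseline_win)    # long opening = baseline
            residual = signal - baseline
            smoothed = grey_opening(grey_closing(residual, size=feature_win), size=feature_win)
            # Robust threshold from the median absolute deviation of the residual.
            thr = k * np.median(np.abs(smoothed - np.median(smoothed)))
            labels, n = label(smoothed > thr)
            # Return (start, stop) sample indices of each detected feature.
            return [(sl[0].start, sl[0].stop) for sl in find_objects(labels)]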

  3. Automatic nipple detection on 3D images of an automated breast ultrasound system (ABUS)

    NASA Astrophysics Data System (ADS)

    Javanshir Moghaddam, Mandana; Tan, Tao; Karssemeijer, Nico; Platel, Bram

    2014-03-01

    Recent studies have demonstrated that applying Automated Breast Ultrasound in addition to mammography in women with dense breasts can lead to additional detection of small, early stage breast cancers which are occult in the corresponding mammograms. In this paper, we propose a fully automatic method for detecting the nipple location in 3D ultrasound breast images acquired from Automated Breast Ultrasound Systems. The nipple location is a valuable landmark for reporting the position of possible abnormalities in a breast or for guiding image registration. To detect the nipple location, all images were normalized. Subsequently, features were extracted in a multi-scale approach and classification experiments were performed using a GentleBoost classifier to identify the nipple location. The method was applied to a dataset of 100 patients with 294 different 3D ultrasound views from Siemens and U-Systems acquisition systems. Our database is a representative sample of cases obtained in clinical practice by four medical centers. The automatic method could accurately locate the nipple in 90% of AP (Anterior-Posterior) views and in 79% of the other views.

  4. Automated night/day standoff detection, tracking, and identification of personnel for installation protection

    NASA Astrophysics Data System (ADS)

    Lemoff, Brian E.; Martin, Robert B.; Sluch, Mikhail; Kafka, Kristopher M.; McCormick, William; Ice, Robert

    2013-06-01

    The capability to positively and covertly identify people at a safe distance, 24-hours per day, could provide a valuable advantage in protecting installations, both domestically and in an asymmetric warfare environment. This capability would enable installation security officers to identify known bad actors from a safe distance, even if they are approaching under cover of darkness. We will describe an active-SWIR imaging system being developed to automatically detect, track, and identify people at long range using computer face recognition. The system illuminates the target with an eye-safe and invisible SWIR laser beam, to provide consistent high-resolution imagery night and day. SWIR facial imagery produced by the system is matched against a watch-list of mug shots using computer face recognition algorithms. The current system relies on an operator to point the camera and to review and interpret the face recognition results. Automation software is being developed that will allow the system to be cued to a location by an external system, automatically detect a person, track the person as they move, zoom in on the face, select good facial images, and process the face recognition results, producing alarms and sharing data with other systems when people are detected and identified. Progress on the automation of this system will be presented along with experimental night-time face recognition results at distance.

  5. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices

    PubMed Central

    Golden, J.P.; Verbarg, J.; Howell, P.B.; Shriver-Lake, L.C.; Ligler, F.S.

    2012-01-01

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres, and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, were demonstrated for detection of Escherichia coli O157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. PMID:22960010

  6. Primer effect in the detection of mitochondrial DNA point heteroplasmy by automated sequencing.

    PubMed

    Calatayud, Marta; Ramos, Amanda; Santos, Cristina; Aluja, Maria Pilar

    2013-06-01

    The correct detection of mitochondrial DNA (mtDNA) heteroplasmy by automated sequencing presents methodological constraints. The main goals of this study were to investigate the effect of the sense and distance of primers on heteroplasmy detection and to test whether there are differences in the accurate determination of heteroplasmy involving transitions or transversions. A gradient of heteroplasmy levels was generated for mtDNA positions 9477 (transition G/A) and 15,452 (transversion C/A). Amplification and subsequent sequencing with forward and reverse primers, situated at 550 and 150 bp from the heteroplasmic positions, were performed. Our data provide evidence of a significant difference between the use of forward and reverse primers, with the forward primer appearing to give a better approximation of the real proportion of the variants. No significant differences were found concerning the distance at which the sequencing primers were placed, nor between the analysis of transitions and transversions. The data collected in this study are a starting point that highlights the importance of the sequencing primers in the accurate detection of point heteroplasmy, providing additional insight into the overall automated sequencing strategy.

  7. A novel automated detection system for swallowing sounds during eating and speech under everyday conditions.

    PubMed

    Fukuike, C; Kodama, N; Manda, Y; Hashimoto, Y; Sugimoto, K; Hirata, A; Pan, Q; Maeda, N; Minagi, S

    2015-05-01

    Wave analysis of swallowing sounds has been receiving attention because the recording process is easy and non-invasive. However, until now, an expert has been needed to visually examine the entire recorded wave to distinguish swallowing from other sounds. The purpose of this study was to establish a methodology to automatically distinguish the sound of swallowing from sound data recorded during a meal in the presence of everyday ambient sound. Seven healthy participants (mean age: 26.7 ± 1.3 years) took part in this study. A laryngeal microphone and a condenser microphone attached to the nostril were used for simultaneous recording. Recording took place while participants were taking a meal and talking with a conversational partner. Participants were instructed to step on a foot pedal trigger switch when they swallowed, representing self-enumeration of swallowing, and also to perform six additional noise-making tasks during the meal in a randomised manner. The automated analysis system correctly detected 342 of the 352 self-enumerated swallowing events (sensitivity: 97.2%) and 479 of the 503 semblable wave periods of swallowing (specificity: 95.2%). In this study, the automated detection system for swallowing sounds using a nostril microphone was able to detect swallowing events with high sensitivity and specificity even under the conditions of daily life, showing potential utility in the diagnosis or screening of dysphagic patients in future studies.

  8. Procedure for Automated Eddy Current Crack Detection in Thin Titanium Plates

    NASA Technical Reports Server (NTRS)

    Wincheski, Russell A.

    2012-01-01

    This procedure provides the detailed instructions for conducting Eddy Current (EC) inspections of thin (5-30 mils) titanium membranes with thickness and material properties typical of the development of Ultra-Lightweight diaphragm Tanks Technology (ULTT). The inspection focuses on the detection of part-through, surface breaking fatigue cracks with depths between approximately 0.002" and 0.007" and aspect ratios (a/c) of 0.2-1.0 using an automated eddy current scanning and image processing technique.

  9. Airborne change detection system for the detection of route mines

    NASA Astrophysics Data System (ADS)

    Donzelli, Thomas P.; Jackson, Larry; Yeshnik, Mark; Petty, Thomas E.

    2003-09-01

    The US Army is interested in technologies that will enable it to maintain the free flow of traffic along routes such as Main Supply Routes (MSRs). Mines emplaced in the road by enemy forces under cover of darkness represent a major threat to maintaining a rapid Operational Tempo (OPTEMPO) along such routes. One technique that shows promise for detecting enemy mining activity is airborne change detection, which allows an operator to detect suspicious day-to-day changes in and around the road that may be indicative of enemy mining. This paper presents an airborne change detection system that is currently under development at the US Army Night Vision and Electronic Sensors Directorate (NVESD). The system has been tested using a longwave infrared (LWIR) sensor on a vertical take-off and landing unmanned aerial vehicle (VTOL UAV) and a midwave infrared (MWIR) sensor on a fixed-wing aircraft. The system is described and results of the various tests conducted to date are presented.

  10. Change detection and change blindness in pigeons (Columba livia).

    PubMed

    Herbranson, Walter T; Trinh, Yvan T; Xi, Patricia M; Arand, Mark P; Barker, Michael S K; Pratt, Theodore H

    2014-05-01

    Change blindness is a phenomenon in which even obvious details in a visual scene change without being noticed. Although change blindness has been studied extensively in humans, we do not yet know if it is a phenomenon that also occurs in other animals. Thus, investigation of change blindness in a nonhuman species may prove to be valuable by beginning to provide some insight into its ultimate causes. Pigeons learned a change detection task in which pecks to the location of a change in a sequence of stimulus displays were reinforced. They were worse at detecting changes if the stimulus displays were separated by a brief interstimulus interval, during which the display was blank, and this primary result matches the general pattern seen in previous studies of change blindness in humans. A second experiment attempted to identify specific stimulus characteristics that most reliably produced a failure to detect changes. Change detection was more difficult when interstimulus intervals were longer and when the change was iterated fewer times.

  11. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadastre Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process involved comparing images from different periods. The success rates in identifying objects were low, and most detections were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technology, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis, and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.
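
    The Digital Surface Model analysis component can be illustrated with a hedged sketch (the thresholds, cell size, and the absence of the spectral classification and object-analysis steps are assumptions): height differences between two co-registered DSMs are thresholded and grouped into candidate change objects, discarding small regions likely to be noise or vegetation.

        import numpy as np
        from scipy import ndimage

        def dsm_change_candidates(dsm_old, dsm_new, height_thr=2.5, min_area=50):
            """Candidate change objects from two co-registered DSMs (heights in metres).

            Cells whose height changed by more than `height_thr` are grouped into
            objects; objects smaller than `min_area` cells are dropped as noise."""
            diff = dsm_new - dsm_old
            mask = np.abs(diff) > height_thr
            labels, n = ndimage.label(mask)
            areas = ndimage.sum(mask, labels, index=range(1, n + 1))
            keep = np.isin(labels, np.flatnonzero(areas >= min_area) + 1)
            return diff, keep        # signed height change and candidate-change mask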

  12. Change Detection via Morphological Comparative Filters

    NASA Astrophysics Data System (ADS)

    Vizilter, Y. V.; Rubis, A. Y.; Zheltov, S. Y.; Vygolov, O. V.

    2016-06-01

    In this paper we propose a new change detection technique based on morphological comparative filtering. This technique generalizes the morphological image analysis scheme proposed by Pytiev. A new class of comparative filters based on guided contrasting is developed, and comparative filtering based on diffusion morphology is implemented as well. The change detection pipeline contains: comparative filtering on an image pyramid, calculation of a morphological difference map, binarization, extraction of change proposals, and testing of change proposals using a local morphological correlation coefficient. Experimental results demonstrate the applicability of the proposed approach.
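
    A simplified sketch of the stated pipeline (difference map, binarization, extraction of change proposals). A plain normalized absolute difference stands in for the morphological comparative filters and the image pyramid, and the final morphological-correlation test is omitted; the thresholds are assumptions.

        import numpy as np
        from scipy import ndimage

        def change_proposals(img_ref, img_test, thr=0.2, min_area=25):
            """Difference map -> binarization -> connected-component change proposals."""
            diff = np.abs(img_test.astype(float) - img_ref.astype(float))
            diff /= diff.max() + 1e-9                    # normalized difference map
            mask = diff > thr                            # binarization
            labels, n = ndimage.label(mask)              # candidate change regions
            slices = ndimage.find_objects(labels)
            areas = ndimage.sum(mask, labels, index=range(1, n + 1))
            proposals = [s for s, a in zip(slices, areas) if a >= min_area]
            return diff, proposals                       # bounding slices of proposals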

  13. Automated High-Pressure Titration System with In Situ Infrared Spectroscopic Detection

    SciTech Connect

    Thompson, Christopher J.; Martin, Paul F.; Chen, Jeffrey; Benezeth, Pascale; Schaef, Herbert T.; Rosso, Kevin M.; Felmy, Andrew R.; Loring, John S.

    2014-04-17

    A fully automated titration system with infrared detection was developed for investigating interfacial chemistry at high pressures. The apparatus consists of a high-pressure fluid generation and delivery system coupled to a high-pressure cell with infrared optics. A manifold of electronically actuated valves is used to direct pressurized fluids into the cell. Precise reagent additions to the pressurized cell are made with calibrated tubing loops that are filled with reagent and placed in-line with the cell and a syringe pump. The cell’s infrared optics facilitate both transmission and attenuated total reflection (ATR) measurements to monitor bulk-fluid composition and solid-surface phenomena such as adsorption, desorption, complexation, dissolution, and precipitation. Switching between the two measurement modes is accomplished with moveable mirrors that direct radiation from a Fourier transform infrared spectrometer into the cell along transmission or ATR light paths. The versatility of the high-pressure IR titration system is demonstrated with three case studies. First, we titrated water into supercritical CO2 (scCO2) to generate an infrared calibration curve and determine the solubility of water in CO2 at 50 °C and 90 bar. Next, we characterized the partitioning of water between a montmorillonite clay and scCO2 at 50 °C and 90 bar. Transmission-mode spectra were used to quantify changes in the clay’s sorbed water concentration as a function of scCO2 hydration, and ATR measurements provided insights into competitive residency of water and CO2 on the clay surface and in the interlayer. Finally, we demonstrated how time-dependent studies can be conducted with the system by monitoring the carbonation reaction of forsterite (Mg2SiO4) in water-bearing scCO2 at 50 °C and 90 bar. Immediately after water dissolved in the scCO2, a thin film of adsorbed water formed on the mineral surface, and the film thickness increased with time as the forsterite began to dissolve

  14. Automated high-pressure titration system with in situ infrared spectroscopic detection

    NASA Astrophysics Data System (ADS)

    Thompson, Christopher J.; Martin, Paul F.; Chen, Jeffrey; Benezeth, Pascale; Schaef, Herbert T.; Rosso, Kevin M.; Felmy, Andrew R.; Loring, John S.

    2014-04-01

    A fully automated titration system with infrared detection was developed for investigating interfacial chemistry at high pressures. The apparatus consists of a high-pressure fluid generation and delivery system coupled to a high-pressure cell with infrared optics. A manifold of electronically actuated valves is used to direct pressurized fluids into the cell. Precise reagent additions to the pressurized cell are made with calibrated tubing loops that are filled with reagent and placed in-line with the cell and a syringe pump. The cell's infrared optics facilitate both transmission and attenuated total reflection (ATR) measurements to monitor bulk-fluid composition and solid-surface phenomena such as adsorption, desorption, complexation, dissolution, and precipitation. Switching between the two measurement modes is accomplished with moveable mirrors that direct the light path of a Fourier transform infrared spectrometer into the cell along transmission or ATR light paths. The versatility of the high-pressure IR titration system was demonstrated with three case studies. First, we titrated water into supercritical CO2 (scCO2) to generate an infrared calibration curve and determine the solubility of water in CO2 at 50 °C and 90 bar. Next, we characterized the partitioning of water between a montmorillonite clay and scCO2 at 50 °C and 90 bar. Transmission-mode spectra were used to quantify changes in the clay's sorbed water concentration as a function of scCO2 hydration, and ATR measurements provided insights into competitive residency of water and CO2 on the clay surface and in the interlayer. Finally, we demonstrated how time-dependent studies can be conducted with the system by monitoring the carbonation reaction of forsterite (Mg2SiO4) in water-bearing scCO2 at 50 °C and 90 bar. Immediately after water dissolved in the scCO2, a thin film of adsorbed water formed on the mineral surface, and the film thickness increased with time as the forsterite began to dissolve

  15. Development of an Automated Microfluidic System for DNA Collection, Amplification, and Detection of Pathogens

    SciTech Connect

    Hagan, Bethany S.; Bruckner-Lea, Cynthia J.

    2002-12-01

    This project was focused on developing and testing automated routines for a microfluidic Pathogen Detection System. The basic pathogen detection routine has three primary components; cell concentration, DNA amplification, and detection. In cell concentration, magnetic beads are held in a flow cell by an electromagnet. Sample liquid is passed through the flow cell and bacterial cells attach to the beads. These beads are then released into a small volume of fluid and delivered to the peltier device for cell lysis and DNA amplification. The cells are lysed during initial heating in the peltier device, and the released DNA is amplified using polymerase chain reaction (PCR) or strand displacement amplification (SDA). Once amplified, the DNA is then delivered to a laser induced fluorescence detection unit in which the sample is detected. These three components create a flexible platform that can be used for pathogen detection in liquid and sediment samples. Future developments of the system will include on-line DNA detection during DNA amplification and improved capture and release methods for the magnetic beads during cell concentration.

  16. Change Detection Experiments Using Low Cost UAVs

    NASA Technical Reports Server (NTRS)

    Logan, Michael J.; Vranas, Thomas L.; Motter, Mark; Hines, Glenn D.; Rahman, Zia-ur

    2005-01-01

    This paper presents progress in the development of a low-cost change-detection system. This system is being developed to provide users with the ability to use a low-cost unmanned aerial vehicle (UAV) and image processing system that can detect changes at specific fixed ground locations using video provided by an autonomous UAV. The results of field experiments conducted with the US Army at Ft. A.P. Hill are presented.

  17. An automated algorithm for online detection of fragmented QRS and identification of its various morphologies

    PubMed Central

    Maheshwari, Sidharth; Acharyya, Amit; Puddu, Paolo Emilio; Mazomenos, Evangelos B.; Leekha, Gourav; Maharatna, Koushik; Schiariti, Michele

    2013-01-01

    Fragmented QRS (f-QRS) has been proven to be an efficient biomarker for several diseases, including remote and acute myocardial infarction, cardiac sarcoidosis, and non-ischaemic cardiomyopathy. It has also been shown to have higher sensitivity and/or specificity than conventional markers (e.g. Q-wave, ST-elevation), which may even regress or disappear with time. Patients with such diseases have to undergo expensive and sometimes invasive tests for diagnosis. Automated detection of f-QRS, followed by identification of its various morphologies in addition to conventional ECG features (e.g. P, QRS, and T amplitude and duration), will lead to more reliable diagnosis, therapy, and disease prognosis than state-of-the-art approaches, and will thereby be of significant clinical importance both for hospital-based and emerging remote health monitoring environments and for implanted ICD devices. An automated algorithm for detection of f-QRS from the ECG and identification of its various morphologies is proposed in this work which, to the best of our knowledge, is the first of its kind. Using our recently proposed time-domain morphology and gradient-based ECG feature extraction algorithm, the QRS complex is extracted, and a discrete wavelet transform (DWT) with one level of decomposition, using the 'Haar' wavelet, is applied to it to detect the presence of fragmentation. Detail DWT coefficients were examined to formulate detection rules for all types of morphologies reported in the literature. To model and verify the algorithm, PhysioNet's PTB database was used. Forty patients were randomly selected from the database and their ECGs were examined by two experienced cardiologists, and the results were compared with those obtained from the algorithm. Out of 40 patients, 31 were considered appropriate for comparison by the two cardiologists, and it is shown that 334 out of 372 (89.8%) leads from the chosen 31 patients
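
    A hedged sketch of the wavelet step: a one-level Haar DWT is applied to an already-extracted QRS complex (as in the abstract), and extra oscillations in the detail coefficients are taken as a crude fragmentation indicator. The thresholding and sign-change counting rule below are illustrative assumptions, not the authors' published morphology rules. The sketch uses the PyWavelets package.

        import numpy as np
        import pywt

        def qrs_is_fragmented(qrs, rel_thr=0.1, min_notches=2):
            """Flag a QRS complex as fragmented from level-1 Haar detail coefficients.

            Notches and slurs appear as additional significant oscillations in the
            detail band; counting sign changes among coefficients above a relative
            threshold is a crude surrogate for morphology-based rules."""
            _, detail = pywt.dwt(np.asarray(qrs, dtype=float), 'haar')
            significant = detail[np.abs(detail) > rel_thr * np.max(np.abs(detail))]
            if significant.size < 2:
                return False
            sign_changes = np.count_nonzero(np.diff(np.sign(significant)) != 0)
            return sign_changes >= min_notches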

  18. Change Point Detection in Correlation Networks.

    PubMed

    Barnett, Ian; Onnela, Jukka-Pekka

    2016-01-07

    Many systems of interacting elements can be conceptualized as networks, where network nodes represent the elements and network ties represent interactions between the elements. In systems where the underlying network evolves, it is useful to determine the points in time where the network structure changes significantly as these may correspond to functional change points. We propose a method for detecting change points in correlation networks that, unlike previous change point detection methods designed for time series data, requires minimal distributional assumptions. We investigate the difficulty of change point detection near the boundaries of the time series in correlation networks and study the power of our method and competing methods through simulation. We also show the generalizable nature of the method by applying it to stock price data as well as fMRI data.
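
    A generic, nonparametric stand-in for the idea (not the authors' statistic): correlation matrices estimated in windows before and after each candidate time point are compared with a Frobenius norm, and peaks in the resulting series suggest change points; significance would be assessed separately, e.g. by permutation. The window length and the toy data are assumptions.

        import numpy as np

        def correlation_change_stat(X, window=50):
            """Sliding-window change statistic for a multivariate time series.

            X has shape (time, nodes). For each candidate time t, correlation matrices
            of the `window` samples before and after t are compared; peaks in the
            returned series suggest change points in the correlation network."""
            T = X.shape[0]
            stats = np.full(T, np.nan)
            for t in range(window, T - window):
                c_before = np.corrcoef(X[t - window:t].T)
                c_after = np.corrcoef(X[t:t + window].T)
                stats[t] = np.linalg.norm(c_after - c_before, ord='fro')
            return stats

        # Toy example: independent nodes before t=300, strongly coupled after.
        rng = np.random.default_rng(0)
        L = np.linalg.cholesky(0.6 * np.ones((5, 5)) + 0.4 * np.eye(5))
        X = np.concatenate([rng.normal(size=(300, 5)),
                            rng.normal(size=(300, 5)) @ L.T])
        print(int(np.nanargmax(correlation_change_stat(X))))   # near 300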

  19. Development of adapted GMR-probes for automated detection of hidden defects in thin steel sheets

    NASA Astrophysics Data System (ADS)

    Pelkner, Matthias; Pohl, Rainer; Kreutzbruck, Marc; Commandeur, Colin

    2016-02-01

    Thin steel sheets with a thickness of 0.3 mm and less are the base materials of many everyday products (cans, batteries, etc.). Potential inhomogeneities such as non-metallic inclusions inside the steel can lead to rupture of the sheets when they are formed into a product such as a beverage can. Therefore, there is a need to develop automated NDT techniques to detect hidden defects and inclusions in thin sheets during production. For this purpose, Tata Steel Europe and BAM, the Federal Institute for Materials Research and Testing (Germany), are collaborating to develop an automated NDT system. Defect detection systems have to be robust against external influences, especially when used in an industrial environment. In addition, such a facility has to achieve high sensitivity and high spatial resolution to detect small inclusions in the µm regime. In a first step, we carried out a feasibility study to determine which testing method is promising for detecting hidden defects and inclusions inside ferrous thin steel sheets. Two methods were investigated in more detail - magnetic flux leakage testing (MFL) using giant magnetoresistance (GMR) sensor arrays as receivers [1,2] and eddy current testing (ET). The capabilities of both methods were tested with 0.2 mm-thick steel samples containing small defects with depths ranging from 5 µm up to 60 µm. Only in the case of GMR-MFL testing were we able to reliably detect part of the hidden defects with a depth of 10 µm, with an SNR better than 10 dB. Here, the lift-off between sensor and surface was 250 µm. On this basis, we investigated different testing scenarios including velocity tests and different lift-offs. In this contribution we present the results of the feasibility study, leading to first prototypes of GMR probes which are now installed as part of a demonstrator inside a production line.

  20. Semi-Automated Detection of Cerebral Microbleeds on 3.0 T MR Images

    PubMed Central

    Kuijf, Hugo J.; Brundel, Manon; de Bresser, Jeroen; van Veluw, Susanne J.; Heringa, Sophie M.; Viergever, Max A.; Biessels, Geert Jan; Vincken, Koen L.

    2013-01-01

    Cerebral microbleeds are associated with vascular disease and dementia. They can be detected on MRI and are receiving increasing attention. Visual rating is the current standard for microbleed detection, but it is rater dependent, has limited reproducibility and modest sensitivity, and can be time-consuming. The goal of the current study is to present a tool for semi-automated detection of microbleeds that can assist human raters in the rating procedure. The radial symmetry transform was originally a technique to highlight circular-shaped objects in two-dimensional images. In the current study, the three-dimensional radial symmetry transform was adapted to detect spherical microbleeds in a series of 72 patients from our hospital, for whom a ground-truth visual rating was made by four raters. Potential microbleeds were automatically identified on T2*-weighted 3.0 T MRI scans and the results were visually checked to identify microbleeds. Final ratings of the radial symmetry transform were compared to human ratings. After implementing and optimizing the radial symmetry transform, the method achieved a high sensitivity while maintaining a modest number of false positives. Depending on the settings, sensitivities ranged from 65% to 84% compared to the ground-truth rating. Rating of the processed images required 1-2 minutes per participant, in which 20-96 false positive locations per participant were censored. Sensitivities of individual raters ranged from 39% to 86% compared to the ground truth and required 5-10 minutes per participant per rater. The sensitivities achieved by the radial symmetry transform are similar to those of individual experienced human raters, demonstrating its feasibility and usefulness for semi-automated microbleed detection. PMID:23805246

  1. Comparison of hyperspectral change detection algorithms

    NASA Astrophysics Data System (ADS)

    Pieper, M.; Manolakis, D.; Truslow, E.; Cooley, T.; Brueggeman, M.; Weisner, A.; Jacobson, J.

    2015-09-01

    There are a multitude of civilian and military applications for the detection of anomalous changes in hyperspectral images. Anomalous changes occur when the material within a pixel is replaced. Environmental factors that change over time, such as illumination, affect the radiance of all the pixels in a scene even when the materials within them remain constant. The goal of an anomalous change detection algorithm is to suppress changes caused by the environment and detect pixels where the materials within have changed. Anomalous change detection is a two-step process. Two co-registered images of a scene are first transformed to maximize the overall correlation between the images, then an anomalous change detector (ACD) is applied to the transformed images. The transforms maximize the correlation between the two images to attenuate the environmental differences that distract from the anomalous changes of importance. Several categories of transforms with different optimization parameters are discussed and compared. One of two types of ACDs is then applied to the transformed images. The first ACD uses the difference of the two transformed images. The second concatenates the spectra of the two images and uses an aggregated ACD. A comparison of the two ACD methods and their effectiveness with the different transforms is done for the first time.
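
    One way to make the "difference of the two transformed images" concrete is sketched below under stated assumptions: a global least-squares (chronochrome-like) prediction of the second image from the first absorbs scene-wide effects such as illumination, and the Mahalanobis distance of the residual scores anomalous change per pixel. This is an illustration, not the specific transforms or detectors compared in the paper.

        import numpy as np

        def anomalous_change_scores(X, Y):
            """Difference-style ACD on two co-registered images given as (pixels, bands).

            A global linear predictor of Y from X (chronochrome-like) absorbs
            scene-wide changes such as illumination; the Mahalanobis distance of
            the prediction residual then scores anomalous change per pixel."""
            Xc = X - X.mean(axis=0)
            Yc = Y - Y.mean(axis=0)
            A = np.linalg.lstsq(Xc, Yc, rcond=None)[0]      # bands-by-bands predictor
            resid = Yc - Xc @ A
            cov = np.cov(resid, rowvar=False) + 1e-6 * np.eye(resid.shape[1])
            icov = np.linalg.inv(cov)
            return np.einsum('ij,jk,ik->i', resid, icov, resid)   # one score per pixel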

  2. A systematic review of automated melanoma detection in dermatoscopic images and its ground truth data

    NASA Astrophysics Data System (ADS)

    Ali, Abder-Rahman A.; Deserno, Thomas M.

    2012-02-01

    Malignant melanoma is the third most frequent type of skin cancer and one of the most malignant tumors, accounting for 79% of skin cancer deaths. Melanoma is highly curable if diagnosed early and treated properly, as the survival rate varies between 15% and 65% from early to terminal stages, respectively. So far, melanoma diagnosis depends subjectively on the dermatologist's expertise. Computer-aided diagnosis (CAD) systems based on epiluminescence light microscopy can provide an objective second opinion on pigmented skin lesions (PSL). This work systematically analyzes the evidence for the effectiveness of automated melanoma detection in images from a dermatoscopic device. Automated CAD applications were analyzed to estimate their diagnostic outcome. Searching online databases for publication dates between 1985 and 2011, a total of 182 studies on dermatoscopic CAD were found. With respect to the systematic selection criteria, 9 studies were included, published between 2002 and 2011. Those studies formed databases of 14,421 dermatoscopic images including both malignant "melanoma" and benign "nevus", with 8,110 images being available, ranging in resolution from 150 x 150 to 1568 x 1045 pixels. Maximum and minimum sensitivity and specificity are 100.0% and 80.0%, and 98.14% and 61.6%, respectively. The area under the receiver operating characteristic curve (AUC) and pooled sensitivity, specificity, and diagnostic odds ratio (DOR) are 0.87, 0.90, 0.81, and 15.89, respectively. Although automated melanoma detection showed good accuracy in terms of sensitivity, specificity, and AUC, diagnostic performance in terms of the DOR was found to be poor. This might be due to the lack of dermatoscopic image resources (ground truth) that are needed for comprehensive assessment of diagnostic performance. In future work, we aim at testing this hypothesis by joining dermatoscopic images into a unified database that serves as a standard reference for dermatology related research in

  3. Sensor for detecting changes in magnetic fields

    DOEpatents

    Praeg, W.F.

    1980-02-26

    A sensor is described for detecting changes in the magnetic field of the equilibrium-field coil of a Tokamak plasma device that comprises a pair of bifilar wires disposed circumferentially, one inside and one outside the equilibrium-field coil. Each is shorted at one end. The difference between the voltages detected at the other ends of the bifilar wires provides a measure of changing flux in the equilibrium-field coil. This difference can be used to detect faults in the coil in time to take action to protect the coil.

  4. Sensor for detecting changes in magnetic fields

    DOEpatents

    Praeg, Walter F.

    1981-01-01

    A sensor for detecting changes in the magnetic field of the equilibrium-field coil of a Tokamak plasma device comprises a pair of bifilar wires disposed circumferentially, one inside and one outside the equilibrium-field coil. Each is shorted at one end. The difference between the voltages detected at the other ends of the bifilar wires provides a measure of changing flux in the equilibrium-field coil. This difference can be used to detect faults in the coil in time to take action to protect the coil.

  5. Electrochemical pesticide detection with AutoDip--a portable platform for automation of crude sample analyses.

    PubMed

    Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils

    2015-02-01

    Lab-on-a-chip devices hold promise for automation of complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples as they occur in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform, which is based on the movement of a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at concentrations of 10^-7 M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission.

  6. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    NASA Astrophysics Data System (ADS)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  7. Semi-Automated, Occupationally Safe Immunofluorescence Microtip Sensor for Rapid Detection of Mycobacterium Cells in Sputum

    PubMed Central

    Soelberg, Scott D.; Weigel, Kris M.; Hiraiwa, Morgan; Cairns, Andrew; Lee, Hyun-Boo; Furlong, Clement E.; Oh, Kieseok; Lee, Kyong-Hoon; Gao, Dayong; Chung, Jae-Hyun; Cangelosi, Gerard A.

    2014-01-01

    An occupationally safe (biosafe) sputum liquefaction protocol was developed for use with a semi-automated antibody-based microtip immunofluorescence sensor. The protocol effectively liquefied sputum and inactivated microorganisms including Mycobacterium tuberculosis, while preserving the antibody-binding activity of Mycobacterium cell surface antigens. Sputum was treated with a synergistic chemical-thermal protocol that included moderate concentrations of NaOH and detergent at 60°C for 5 to 10 min. Samples spiked with M. tuberculosis complex cells showed approximately 10^6-fold inactivation of the pathogen after treatment. Antibody binding was retained post-treatment, as determined by analysis with a microtip immunosensor. The sensor correctly distinguished between Mycobacterium species and other cell types naturally present in biosafe-treated sputum, with a detection limit of 100 CFU/mL for M. tuberculosis, in a 30-minute sample-to-result process. The microtip device was also semi-automated and shown to be compatible with low-cost, LED-powered fluorescence microscopy. The device and biosafe sputum liquefaction method open the door to rapid detection of tuberculosis in settings with limited laboratory infrastructure. PMID:24465845

  8. Automated detection and quantification of single RNAs at cellular resolution in zebrafish embryos.

    PubMed

    Stapel, L Carine; Lombardot, Benoit; Broaddus, Coleman; Kainmueller, Dagmar; Jug, Florian; Myers, Eugene W; Vastenhouw, Nadine L

    2016-02-01

    Analysis of differential gene expression is crucial for the study of cell fate and behavior during embryonic development. However, automated methods for the sensitive detection and quantification of RNAs at cellular resolution in embryos are lacking. With the advent of single-molecule fluorescence in situ hybridization (smFISH), gene expression can be analyzed at single-molecule resolution. However, the limited availability of protocols for smFISH in embryos and the lack of efficient image analysis pipelines have hampered quantification at the (sub)cellular level in complex samples such as tissues and embryos. Here, we present a protocol for smFISH on zebrafish embryo sections in combination with an image analysis pipeline for automated transcript detection and cell segmentation. We use this strategy to quantify gene expression differences between different cell types and identify differences in subcellular transcript localization between genes. The combination of our smFISH protocol and custom-made, freely available, analysis pipeline will enable researchers to fully exploit the benefits of quantitative transcript analysis at cellular and subcellular resolution in tissues and embryos. PMID:26700682

  9. Semi-automated, occupationally safe immunofluorescence microtip sensor for rapid detection of Mycobacterium cells in sputum.

    PubMed

    Inoue, Shinnosuke; Becker, Annie L; Kim, Jong-Hoon; Shu, Zhiquan; Soelberg, Scott D; Weigel, Kris M; Hiraiwa, Morgan; Cairns, Andrew; Lee, Hyun-Boo; Furlong, Clement E; Oh, Kieseok; Lee, Kyong-Hoon; Gao, Dayong; Chung, Jae-Hyun; Cangelosi, Gerard A

    2014-01-01

    An occupationally safe (biosafe) sputum liquefaction protocol was developed for use with a semi-automated antibody-based microtip immunofluorescence sensor. The protocol effectively liquefied sputum and inactivated microorganisms including Mycobacterium tuberculosis, while preserving the antibody-binding activity of Mycobacterium cell surface antigens. Sputum was treated with a synergistic chemical-thermal protocol that included moderate concentrations of NaOH and detergent at 60°C for 5 to 10 min. Samples spiked with M. tuberculosis complex cells showed approximately 10^6-fold inactivation of the pathogen after treatment. Antibody binding was retained post-treatment, as determined by analysis with a microtip immunosensor. The sensor correctly distinguished between Mycobacterium species and other cell types naturally present in biosafe-treated sputum, with a detection limit of 100 CFU/mL for M. tuberculosis, in a 30-minute sample-to-result process. The microtip device was also semi-automated and shown to be compatible with low-cost, LED-powered fluorescence microscopy. The device and biosafe sputum liquefaction method open the door to rapid detection of tuberculosis in settings with limited laboratory infrastructure.

  10. Automated Aflatoxin Analysis Using Inline Reusable Immunoaffinity Column Cleanup and LC-Fluorescence Detection.

    PubMed

    Rhemrev, Ria; Pazdanska, Monika; Marley, Elaine; Biselli, Scarlett; Staiger, Simone

    2015-01-01

    A novel reusable immunoaffinity cartridge containing monoclonal antibodies to aflatoxins coupled to a pressure resistant polymer has been developed. The cartridge is used in conjunction with a handling system inline to LC with fluorescence detection to provide fully automated aflatoxin analysis for routine monitoring of a variety of food matrixes. The handling system selects an immunoaffinity cartridge from a tray and automatically applies the sample extract. The cartridge is washed, then aflatoxins B1, B2, G1, and G2 are eluted and transferred inline to the LC system for quantitative analysis using fluorescence detection with postcolumn derivatization using a KOBRA® cell. Each immunoaffinity cartridge can be used up to 15 times without loss in performance, offering increased sample throughput and reduced costs compared to conventional manual sample preparation and cleanup. The system was validated in two independent laboratories using samples of peanuts and maize spiked at 2, 8, and 40 μg/kg total aflatoxins, and paprika, nutmeg, and dried figs spiked at 5, 20, and 100 μg/kg total aflatoxins. Recoveries exceeded 80% for both aflatoxin B1 and total aflatoxins. The between-day repeatability ranged from 2.1 to 9.6% for aflatoxin B1 for the six levels and five matrixes. Satisfactory Z-scores were obtained with this automated system when used for participation in proficiency testing (FAPAS®) for samples of chilli powder and hazelnut paste containing aflatoxins. PMID:26651571

  11. How Small Can Impact Craters Be Detected at Large Scale by Automated Algorithms?

    NASA Astrophysics Data System (ADS)

    Bandeira, L.; Machado, M.; Pina, P.; Marques, J. S.

    2013-12-01

    The last decade has seen a widespread publication of crater detection algorithms (CDA) with increasing detection performances. The adaptive nature of some of the algorithms [1] has permitted their use in the construction or update of global catalogues for Mars and the Moon. Nevertheless, the smallest craters detected in these situations by CDA are 10 pixels in diameter (or about 2 km in MOC-WA images) [2] or can go down to 16 pixels or 200 m in HRSC imagery [3]. The availability of Martian images with metric (HRSC and CTX) and centimetric (HiRISE) resolutions makes it possible to unveil craters not perceived before, so automated approaches seem a natural way of detecting the myriad of these structures. In this study we present our efforts, based on our previous algorithms [2-3] and new training strategies, to push the automated detection of craters to a dimensional threshold as close as possible to the detail that can be perceived in the images, something that has not yet been addressed in a systematic way. The approach is based on the selection of candidate regions of the images (portions that contain crescent highlight and shadow shapes indicating the possible presence of a crater) using mathematical morphology operators (connected operators of different sizes), and on the extraction of texture features (Haar-like) and classification by Adaboost into crater and non-crater. This is a supervised approach, meaning that a training phase, in which manually labelled samples are provided, is necessary so the classifier can learn what crater and non-crater structures are. The algorithm is intensively tested on Martian HiRISE images from different locations on the planet, in order to cover the largest range of surface types from the geological point of view (different ages and crater densities) and also from the imaging or textural perspective (different degrees of smoothness/roughness). The quality of the detections obtained is clearly dependent on the dimension of the craters
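
    For illustration of the classification stage summarized above (Haar-like texture features fed to a supervised AdaBoost classifier), a minimal Python sketch using scikit-image and scikit-learn follows. The window size, the Haar feature types, and the crater/non-crater training patches are placeholder assumptions, and the candidate-selection step with morphological connected operators is not shown.

        import numpy as np
        from skimage.transform import integral_image
        from skimage.feature import haar_like_feature
        from sklearn.ensemble import AdaBoostClassifier

        WINDOW = 24  # assumed size (pixels) of a candidate window

        def haar_features(patch):
            """Haar-like texture features of one WINDOW x WINDOW candidate window."""
            ii = integral_image(patch)
            return haar_like_feature(ii, 0, 0, WINDOW, WINDOW,
                                     feature_type=['type-2-x', 'type-2-y'])

        def train_crater_classifier(patches, labels):
            """patches: list of WINDOW x WINDOW arrays; labels: 1 = crater, 0 = non-crater."""
            X = np.array([haar_features(p) for p in patches])
            return AdaBoostClassifier(n_estimators=200).fit(X, labels)

        def classify_candidates(clf, patches):
            return clf.predict(np.array([haar_features(p) for p in patches]))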

  12. Automated Detection and Extraction of Coronal Dimmings from SDO/AIA Data

    NASA Astrophysics Data System (ADS)

    Davey, Alisdair R.; Attrill, G. D. R.; Wills-Davey, M. J.

    2010-05-01

    The sheer volume of data anticipated from the Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) highlights the necessity of developing automatic detection methods for various types of solar activity. Coronal dimmings, initially recognised in the 1970s, are now well established to be closely associated with coronal mass ejections (CMEs), and are particularly recognised as an indicator of front-side (halo) CMEs, which can be difficult to detect in white-light coronagraph data. An automated coronal dimming region detection and extraction algorithm removes visual observer bias from the determination of physical quantities such as spatial location, area and volume. This allows reproducible, quantifiable results to be mined from very large datasets. The information derived may facilitate more reliable early space weather detection, as well as offering the potential for conducting large-sample studies focused on determining the geoeffectiveness of CMEs, coupled with analysis of their associated coronal dimmings. We present examples of dimming events extracted using our algorithm from existing EUV data, demonstrating the potential for the anticipated application to SDO/AIA data. Metadata returned by our algorithm include location, area, volume, mass and dynamics of coronal dimmings. As well as running on historic datasets, this algorithm is capable of detecting and extracting coronal dimmings in near real-time. The coronal dimming detection and extraction algorithm described in this poster is part of the SDO/Computer Vision Center effort hosted at SAO (Martens et al., 2009). We acknowledge NASA grant NNH07AB97C.

  13. An automated lung nodule detection system for CT images using synthetic minority oversampling

    NASA Astrophysics Data System (ADS)

    Mehre, Shrikant A.; Mukhopadhyay, Sudipta; Dutta, Anirvan; Harsha, Nagam Chaithan; Dhara, Ashis Kumar; Khandelwal, Niranjan

    2016-03-01

    Pulmonary nodules are a potential manifestation of lung cancer, and their early detection can remarkably enhance the survival rate of patients. This paper presents an automated pulmonary nodule detection algorithm for lung CT images. The algorithm utilizes a two-stage approach comprising nodule candidate detection followed by reduction of false positives. The nodule candidate detection involves thresholding followed by morphological opening. The geometrical features at this stage are selected from properties of nodule size and compactness, and lead to a reduced number of false positives. An SVM classifier with a radial basis function kernel is used. The data imbalance between nodules and non-nodules that results from the candidate detection stage is addressed by oversampling the minority class using the Synthetic Minority Over-sampling Technique (SMOTE) and by increasing its misclassification penalty. Experiments were performed on 97 CT scans of a publicly available (LIDC-IDRI) database. Performance is evaluated in terms of sensitivity and false positives per scan (FP/scan). Results indicate noteworthy performance of the proposed approach (nodule detection sensitivity after 4-fold cross-validation is 92.91% with 3 FP/scan). Comparative analysis also reflects a comparable and often better performance of the proposed setup over some of the existing techniques.
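
    A minimal sketch of the false-positive-reduction stage as described (SMOTE oversampling of the minority class plus an RBF-kernel SVM with an extra misclassification penalty on nodules), assuming feature vectors have already been extracted for each candidate. The class-weight value, the SMOTE settings, and the synthetic data are illustrative assumptions, not values from the paper.

        import numpy as np
        from imblearn.over_sampling import SMOTE
        from sklearn.svm import SVC

        def train_nodule_classifier(X, y):
            """X: candidate feature vectors; y: 1 = nodule, 0 = non-nodule (heavily imbalanced)."""
            X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)          # oversample minority class
            clf = SVC(kernel='rbf', C=10.0, class_weight={1: 5.0, 0: 1.0})   # heavier penalty on missed nodules
            return clf.fit(X_res, y_res)

        # illustrative call on synthetic, imbalanced data
        X = np.random.randn(500, 10)
        y = (np.random.rand(500) < 0.1).astype(int)
        model = train_nodule_classifier(X, y)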

  14. An automated and integrated framework for dust storm detection based on ogc web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical thickness (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data

  15. LiDAR change detection using building models

    NASA Astrophysics Data System (ADS)

    Kim, Angela M.; Runyon, Scott C.; Jalobeanu, Andre; Esterline, Chelsea H.; Kruse, Fred A.

    2014-06-01

    Terrestrial LiDAR scans of building models collected with a FARO Focus3D and a RIEGL VZ-400 were used to investigate point-to-point and model-to-model LiDAR change detection. LiDAR data were scaled, decimated, and georegistered to mimic real world airborne collects. Two physical building models were used to explore various aspects of the change detection process. The first model was a 1:250-scale representation of the Naval Postgraduate School campus in Monterey, CA, constructed from Lego blocks and scanned in a laboratory setting using both the FARO and RIEGL. The second model at 1:8-scale consisted of large cardboard boxes placed outdoors and scanned from rooftops of adjacent buildings using the RIEGL. A point-to-point change detection scheme was applied directly to the point-cloud datasets. In the model-to-model change detection scheme, changes were detected by comparing Digital Surface Models (DSMs). The use of physical models allowed analysis of effects of changes in scanner and scanning geometry, and performance of the change detection methods on different types of changes, including building collapse or subsidence, construction, and shifts in location. Results indicate that at low false-alarm rates, the point-to-point method slightly outperforms the model-to-model method. The point-to-point method is less sensitive to misregistration errors in the data. Best results are obtained when the baseline and change datasets are collected using the same LiDAR system and collection geometry.
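
    The point-to-point scheme can be illustrated as follows: for every point in the later ("change") cloud, find the distance to its nearest neighbour in the baseline cloud and flag points whose distance exceeds a threshold. The sketch below uses SciPy's KD-tree; the threshold and the random clouds are placeholders, and the scaling, decimation, and georegistration steps described above are omitted.

        import numpy as np
        from scipy.spatial import cKDTree

        def point_to_point_changes(baseline_xyz, change_xyz, threshold=0.5):
            """Boolean mask of points in change_xyz farther than `threshold`
            (coordinate units) from any point in the baseline cloud."""
            tree = cKDTree(baseline_xyz)
            dist, _ = tree.query(change_xyz, k=1)
            return dist > threshold

        # illustrative call: baseline cloud plus a small 'new structure' offset in z
        baseline = np.random.rand(10000, 3) * 100.0
        changed = np.vstack([baseline + np.random.normal(0, 0.02, baseline.shape),
                             np.random.rand(50, 3) * 100.0 + np.array([0.0, 0.0, 5.0])])
        mask = point_to_point_changes(baseline, changed, threshold=0.5)
        print(mask.sum(), "points flagged as changed")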

  16. Integrating edge detection and fuzzy connectedness for automated segmentation of anatomical branching structures.

    PubMed

    Skoura, Angeliki; Nuzhnaya, Tatyana; Megalooikonomou, Vasileios

    2014-01-01

    Image segmentation algorithms are critical components of medical image analysis systems. This paper presents a novel and fully automated methodology for segmenting anatomical branching structures in medical images. It is a hybrid approach which integrates the Canny edge detection to obtain a preliminary boundary of the structure and the fuzzy connectedness algorithm to handle efficiently the discontinuities of the returned edge map. To ensure efficient localisation of weak branches, the fuzzy connectedness framework is applied in a sliding window mode and using a voting scheme the optimal connection point is estimated. Finally, the image regions are labelled as tissue or background using a locally adaptive thresholding technique. The proposed methodology is applied and evaluated in segmenting ductal trees visualised in clinical X-ray galactograms and vasculature visualised in angiograms. The experimental results demonstrate the effectiveness of the proposed approach achieving high scores of detection rate and accuracy among state-of-the-art segmentation techniques.
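
    A minimal sketch of the first stage only (a preliminary Canny edge map of the branching structure). The fuzzy-connectedness gap handling and the sliding-window voting are considerably more involved and are replaced here by a simple morphological closing; the Gaussian sigma and structuring-element radius are placeholder assumptions.

        from skimage import io, img_as_float
        from skimage.feature import canny
        from skimage.morphology import binary_closing, disk

        def preliminary_boundary(image_path, sigma=2.0):
            """Canny edge map of a grayscale image; small gaps are closed morphologically
            as a crude stand-in for the fuzzy-connectedness step."""
            img = img_as_float(io.imread(image_path, as_gray=True))
            edges = canny(img, sigma=sigma)
            return binary_closing(edges, disk(3))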

  17. Automated lesion detection in dynamic contrast enhanced magnetic resonance imaging of breast

    NASA Astrophysics Data System (ADS)

    Liang, Xi; Kotagiri, Romamohanarao; Frazer, Helen; Yang, Qing

    2015-03-01

    We propose an automated method for detecting lesions to assist radiologists in interpreting dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) of the breast. The aim is to highlight suspicious regions of interest, reducing the time spent searching for lesions and the possibility of radiologists overlooking small regions. In our method, we locate the suspicious regions by applying a threshold on essential features. The features are normalized to reduce the variation between patients. A support vector machine classifier is then applied to exclude normal tissues from these regions, using both kinetic and morphological features extracted from the lesions. In an evaluation of the system on 21 patients with 50 lesions, all lesions were successfully detected with 5.02 false positive regions per breast.

  18. An evaluation of some factors affecting the detection of blood group antibodies by automated methods.

    PubMed

    Kolberg, J; Nordhagen, R

    1975-01-01

    Some factors affecting the sensitivity of automated methods for blood group antibody detection have been evaluated. The experiments revealed differences in performance between various albumin preparations. In the BMC method, one lot of albumin permitted no significant antibody detection. In the LISP technique, a plateau of maximum Polybrene activity was found. The beginning of this plateau depended on both the albumin preparation and the Polybrene lot. In the BMC method, methyl cellulose gave optimal sensitivity within a concentration range of 0.3 to 0.5 per cent. The stability of test cells stored in ACD at 4°C was studied. All test cells could be used safely for up to two weeks. Cells from different donors showed variable reactivity after three weeks. PMID:1101466

  19. The impact of misregistration on change detection

    NASA Technical Reports Server (NTRS)

    Townshend, John R. G.; Justice, Christopher O.; Gurney, Charlotte; Mcmanus, James

    1992-01-01

    The impact of image misregistration on the detection of changes in land cover was studied using spatially degraded Landsat MSS images. Emphasis is placed on simulated images of the Normalized Difference Vegetation Index (NDVI) at spatial resolutions of 250 and 500 m. The results point to the need for high registration accuracy. The evidence from simulations suggests that misregistration can have a marked effect on the ability of remotely sensed data to detect changes in land cover. Even subpixel misregistrations can have a major impact, and the most marked proportional changes will tend to occur at the finest misregistrations.
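
    The basic effect is easy to reproduce numerically: compute NDVI for two dates, shift one image (here by a whole pixel, for simplicity) and measure how many pixels appear to have "changed" purely because of the shift. The sketch below uses NumPy only; the band arrays and the change threshold are hypothetical.

        import numpy as np

        def ndvi(nir, red):
            return (nir - red) / (nir + red + 1e-9)

        def apparent_change(ndvi_t1, ndvi_t2, shift_pixels=1, threshold=0.1):
            """Fraction of pixels flagged as changed when the second date is
            misregistered by `shift_pixels` along the x axis."""
            shifted = np.roll(ndvi_t2, shift_pixels, axis=1)
            return np.mean(np.abs(shifted - ndvi_t1) > threshold)

        # identical dates: any flagged change below is purely a registration artefact
        nir, red = np.random.rand(250, 250), np.random.rand(250, 250)
        v = ndvi(nir, red)
        print("false change fraction:", apparent_change(v, v.copy(), shift_pixels=1))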

  20. Line matching for automatic change detection algorithm

    NASA Astrophysics Data System (ADS)

    Dhollande, Jérôme; Monnin, David; Gond, Laetitia; Cudel, Christophe; Kohler, Sophie; Dieterlen, Alain

    2012-06-01

    During foreign operations, Improvised Explosive Devices (IEDs) are one of the major threats that soldiers may unfortunately encounter along itineraries. Based on a vehicle-mounted camera, we propose an original approach using image comparison to detect significant changes on these roads. Classic 2D image registration techniques do not take parallax phenomena into account; the consequence is that misregistration errors can be detected as changes. Following stereovision principles, our automatic method compares intensity profiles along corresponding epipolar lines by extrema matching. An adaptive space warping compensates for scale differences in the 3D scene. When the signals are matched, the signal difference highlights changes, which are marked in the current video.

  1. Noninvasive Real-Time Automated Skin Lesion Analysis System for Melanoma Early Detection and Prevention

    PubMed Central

    Abuzaghleh, Omar; Barkana, Buket D.

    2015-01-01

    Melanoma spreads through metastasis, and therefore it has proved to be very fatal. Statistical evidence has revealed that the majority of deaths resulting from skin cancer are a result of melanoma. Further investigations have shown that survival rates in patients depend on the stage of the cancer; early detection and intervention of melanoma implicate higher chances of cure. Clinical diagnosis and prognosis of melanoma are challenging, since the processes are prone to misdiagnosis and inaccuracies due to doctors’ subjectivity. Malignant melanomas are asymmetrical, have irregular borders, notched edges, and color variations, so analyzing the shape, color, and texture of the skin lesion is important for the early detection and prevention of melanoma. This paper proposes the two major components of a noninvasive real-time automated skin lesion analysis system for the early detection and prevention of melanoma. The first component is a real-time alert to help users prevent skin burn caused by sunlight; a novel equation to compute the time for skin to burn is thereby introduced. The second component is an automated image analysis module, which contains image acquisition, hair detection and exclusion, lesion segmentation, feature extraction, and classification. The proposed system uses the PH2 Dermoscopy image database from Pedro Hispano Hospital for development and testing purposes. The image database contains a total of 200 dermoscopy images of lesions, including benign, atypical, and melanoma cases. The experimental results show that the proposed system is efficient, achieving classification of the benign, atypical, and melanoma images with accuracies of 96.3%, 95.7%, and 97.5%, respectively. PMID:27170906

  2. Holistic processing improves change detection but impairs change identification.

    PubMed

    Mathis, Katherine M; Kahan, Todd A

    2014-10-01

    It has been just over a century since Gestalt psychologists described the factors that contribute to the holistic processing of visually presented stimuli. Recent research indicates that holistic processing may come at a cost; specifically, the perception of holistic forms may reduce the visibility of constituent parts. In the present experiment, we examined change detection and change identification accuracy with Kanizsa rectangle patterns that were arranged to either form a Gestalt whole or not. Results from an experiment with 62 participants support this trade-off in processing holistic forms. Holistic processing improved the detection of change but obstructed its identification. Results are discussed in terms of both their theoretical significance and their application in areas ranging from baggage screening and the detection of changes in radiological images to the systems that are used to generate composite images of perpetrators on the basis of eyewitness reports.

  3. Low power multi-camera system and algorithms for automated threat detection

    NASA Astrophysics Data System (ADS)

    Huber, David J.; Khosla, Deepak; Chen, Yang; Van Buer, Darrel J.; Martin, Kevin

    2013-05-01

    A key to any robust automated surveillance system is continuous, wide field-of-view sensor coverage and high-accuracy target detection algorithms. Newer systems typically employ an array of multiple fixed cameras that provide individual data streams, each of which is managed by its own processor. This array can continuously capture the entire field of view, but collecting all of the data and running the back-end detection algorithm consumes additional power and increases the size, weight, and power (SWaP) of the package. This is often unacceptable, as many potential surveillance applications have strict system SWaP requirements. This paper describes a wide field-of-view video system that employs multiple fixed cameras and exhibits low SWaP without compromising the target detection rate. We cycle through the sensors, fetch a fixed number of frames, and process them through a modified target detection algorithm. During this time, the other sensors remain powered down, which reduces the required hardware and power consumption of the system. We show that the resulting gaps in coverage and irregular frame rate do not affect the detection accuracy of the underlying algorithms. This reduces the power of an N-camera system by up to approximately N-fold compared to baseline normal operation. This work was applied to Phase 2 of the DARPA Cognitive Technology Threat Warning System (CT2WS) program and used during field testing.

  4. Facile electrochemical method and corresponding automated instrument for the detection of furfural in insulation oil.

    PubMed

    Wang, Ruili; Huang, Xinjian; Wang, Lishi

    2016-02-01

    Determining the concentration of furfural contained in the insulation oil of a transformer has been established as a method to evaluate the health status of the transformer. However, the detection of furfural involves the use of expensive instruments and/or time-consuming laboratory operations. In this paper, we propose a convenient electrochemical method for this detection. The quantification of furfural was realized by extraction of furfural from the oil phase to the aqueous phase, followed by reductive detection of furfural with differential pulse voltammetry (DPV) at a mercury electrode. This method is very sensitive and the limit of detection, corresponding to furfural contained in oil, is estimated to be 0.03 μg g^-1. Furthermore, excellent linearity is obtained in the range of 0-10 μg g^-1. These features make the method very suitable for the determination of furfural in real situations. A fully automated instrument that can perform the extraction and detection operations was developed, and this instrument enables the whole measurement to be finished within eight minutes. The methodology and the instrument were tested with real samples, and the very favorable agreement between results obtained with this instrument and HPLC indicates that the proposed method, along with the instrument, can be employed as a facile tool to diagnose the health status of aged transformers.

  5. Automated detection of submerged navigational obstructions in freshwater impoundments with hull mounted sidescan sonar

    NASA Astrophysics Data System (ADS)

    Morris, Phillip A.

    The prevalence of low-cost side-scanning sonar systems mounted on small recreational vessels has created improved opportunities to identify and map submerged navigational hazards in freshwater impoundments. However, these economical sensors also present unique challenges for automated techniques. This research explores related literature in automated sonar imagery processing and mapping technology, proposes and implements a framework derived from these sources, and evaluates the approach with video collected from a recreational-grade sonar system. Image analysis techniques, including optical character recognition and an unsupervised computer automated detection (CAD) algorithm, are employed to extract the transducer GPS coordinates and slant range distance of objects protruding from the lake bottom. The retrieved information is formatted for inclusion in a spatial mapping model. Specific attributes of the sonar sensors are modeled such that probability profiles may be projected onto a three-dimensional gridded map. These profiles are computed from multiple points of view as sonar traces crisscross or come near each other. As lake levels fluctuate over time, so do the elevation points of view. With each sonar record, the probability of a hazard existing at certain elevations at the respective grid points is updated with Bayesian mechanics. As reinforcing data are collected, the confidence of the map improves. Given a lake's current elevation and a vessel draft, a final generated map can identify areas of the lake that have a high probability of containing hazards that threaten navigation. The approach is implemented in C/C++ utilizing OpenCV, Tesseract OCR, and QGIS open source software and evaluated in a designated test area at Lake Lavon, Collin County, Texas.
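
    The Bayesian map update described above can be sketched as a log-odds occupancy-grid update, in which each sonar observation raises or lowers the probability that a hazard is present at a grid cell. The sensor-model probabilities and grid size below are illustrative assumptions, and the thesis's third (elevation) dimension and vessel-draft query are omitted.

        import numpy as np

        P_HIT, P_MISS = 0.7, 0.4            # assumed sensor model: P(detect | hazard), P(detect | no hazard)
        L_HIT = np.log(P_HIT / P_MISS)
        L_MISS = np.log((1 - P_HIT) / (1 - P_MISS))

        class HazardGrid:
            """Log-odds grid of P(hazard) per (x, y) cell."""
            def __init__(self, nx, ny):
                self.logodds = np.zeros((nx, ny))

            def update(self, ix, iy, detected):
                self.logodds[ix, iy] += L_HIT if detected else L_MISS

            def probability(self):
                return 1.0 / (1.0 + np.exp(-self.logodds))

        grid = HazardGrid(100, 100)
        for _ in range(5):                  # repeated detections at the same cell raise confidence
            grid.update(10, 20, detected=True)
        print(grid.probability()[10, 20])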

  6. Automated Image Analysis for the Detection of Benthic Crustaceans and Bacterial Mat Coverage Using the VENUS Undersea Cabled Network

    PubMed Central

    Aguzzi, Jacopo; Costa, Corrado; Robert, Katleen; Matabos, Marjolaine; Antonucci, Francesca; Juniper, S. Kim; Menesatti, Paolo

    2011-01-01

    The development and deployment of sensors for undersea cabled observatories is presently biased toward the measurement of habitat variables, while sensor technologies for biological community characterization through species identification and individual counting are less common. The VENUS cabled multisensory network (Vancouver Island, Canada) deploys seafloor camera systems at several sites. Our objective in this study was to implement new automated image analysis protocols for the recognition and counting of benthic decapods (i.e., the galatheid squat lobster, Munida quadrispina), as well as for the evaluation of changes in bacterial mat coverage (i.e., Beggiatoa spp.), using a camera deployed in Saanich Inlet (103 m depth). For the counting of Munida we remotely acquired 100 digital photos at hourly intervals from 2 to 6 December 2009. In the case of bacterial mat coverage estimation, images were taken from 2 to 8 December 2009 at the same time frequency. The automated image analysis protocols for both study cases were created in MatLab 7.1. Automation for Munida counting incorporated the combination of both filtering and background correction (Median- and Top-Hat Filters) with Euclidean Distances (ED) on Red-Green-Blue (RGB) channels. The Scale-Invariant Feature Transform (SIFT) features and Fourier Descriptors (FD) of tracked objects were then extracted. Animal classifications were carried out with the tools of morphometric multivariate statistic (i.e., Partial Least Square Discriminant Analysis; PLSDA) on Mean RGB (RGBv) value for each object and Fourier Descriptors (RGBv+FD) matrices plus SIFT and ED. The SIFT approach returned the better results. Higher percentages of images were correctly classified and lower misclassification errors (an animal is present but not detected) occurred. In contrast, RGBv+FD and ED resulted in a high incidence of records being generated for non-present animals. Bacterial mat coverage was estimated in terms of Percent Coverage

  7. NOVELTY DETECTION UNDER CHANGING ENVIRONMENTAL CONDITIONS

    SciTech Connect

    H. SOHN; K. WORDEN; C. R. FARRAR

    2001-04-01

    The primary objective of novelty detection is to examine a system's dynamic response to determine if the system significantly deviates from an initial baseline condition. In reality, the system is often subject to changing environmental and operational conditions that affect its dynamic characteristics. Such variations include changes in loading, boundary conditions, temperature, and moisture. Most damage diagnosis techniques, however, generally neglect the effects of these changing ambient conditions. Here, a novelty detection technique is developed that explicitly takes these natural variations of the system into account in order to minimize false-positive indications of system change. Auto-associative neural networks are employed to discriminate system changes of interest, such as structural deterioration and damage, from the natural variations of the system.
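
    A minimal sketch of the auto-associative idea: train a network to reproduce baseline feature vectors measured under varying ambient conditions, then flag new measurements whose reconstruction error is unusually large. scikit-learn's MLPRegressor is used here purely for illustration; the layer sizes, iteration count, and percentile threshold are assumptions, not the paper's settings.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        def fit_novelty_detector(baseline_features, bottleneck=2):
            """Auto-associative network: inputs are mapped back to themselves through a bottleneck layer."""
            scaler = StandardScaler().fit(baseline_features)
            X = scaler.transform(baseline_features)
            net = MLPRegressor(hidden_layer_sizes=(8, bottleneck, 8), max_iter=5000).fit(X, X)
            errors = np.mean((net.predict(X) - X) ** 2, axis=1)
            threshold = np.percentile(errors, 99)      # tolerate the natural environmental variation
            return scaler, net, threshold

        def is_novel(scaler, net, threshold, features):
            X = scaler.transform(np.atleast_2d(features))
            return np.mean((net.predict(X) - X) ** 2, axis=1) > threshold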

  8. Automated microfluidically controlled electrochemical biosensor for the rapid and highly sensitive detection of Francisella tularensis.

    PubMed

    Dulay, Samuel B; Gransee, Rainer; Julich, Sandra; Tomaso, Herbert; O'Sullivan, Ciara K

    2014-09-15

    Tularemia is a highly infectious zoonotic disease caused by a Gram-negative coccoid rod bacterium, Francisella tularensis. Tularemia is considered as a life-threatening potential biological warfare agent due to its high virulence, transmission, mortality and simplicity of cultivation. In the work reported here, different electrochemical immunosensor formats for the detection of whole F. tularensis bacteria were developed and their performance compared. An anti-Francisella antibody (FB11) was used for the detection that recognises the lipopolysaccharide found in the outer membrane of the bacteria. In the first approach, gold-supported self-assembled monolayers of a carboxyl terminated bipodal alkanethiol were used to covalently cross-link with the FB11 antibody. In an alternative second approach F(ab) fragments of the FB11 antibody were generated and directly chemisorbed onto the gold electrode surface. The second approach resulted in an increased capture efficiency and higher sensitivity. Detection limits of 4.5 ng/mL for the lipopolysaccharide antigen and 31 bacteria/mL for the F. tularensis bacteria were achieved. Having demonstrated the functionality of the immunosensor, an electrode array was functionalised with the antibody fragment and integrated with microfluidics and housed in a tester set-up that facilitated complete automation of the assay. The only end-user intervention is sample addition, requiring less than one-minute hands-on time. The use of the automated microfluidic set-up not only required much lower reagent volumes but also the required incubation time was considerably reduced and a notable increase of 3-fold in assay sensitivity was achieved with a total assay time from sample addition to read-out of less than 20 min. PMID:24747573

  9. Automated coronary artery calcification detection on low-dose chest CT images

    NASA Astrophysics Data System (ADS)

    Xie, Yiting; Cham, Matthew D.; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.

    2014-03-01

    Coronary artery calcification (CAC) measurement from low-dose CT images can be used to assess the risk of coronary artery disease. A fully automatic algorithm to detect and measure CAC from low-dose non-contrast, non-ECG-gated chest CT scans is presented. Based on the automatically detected CAC, the Agatston score (AS), mass score and volume score were computed. These were compared with scores obtained manually from standard-dose ECG-gated scans and low-dose un-gated scans of the same patient. The automatic algorithm segments the heart region based on other pre-segmented organs to provide a coronary region mask. The mitral valve and aortic valve calcification is identified and excluded. All remaining voxels greater than 180 HU within the mask region are considered as CAC candidates. The heart segmentation algorithm was evaluated on 400 non-contrast cases with both low-dose and regular dose CT scans. By visual inspection, 371 (92.8%) of the segmentations were acceptable. The automated CAC detection algorithm was evaluated on 41 low-dose non-contrast CT scans. Manual markings were performed on both low-dose and standard-dose scans for these cases. Using linear regression, the correlation of the automatic AS with the standard-dose manual scores was 0.86; with the low-dose manual scores the correlation was 0.91. Standard risk categories were also computed. The automated method risk category agreed with manual markings of gated scans for 24 cases while 15 cases were 1 category off. For low-dose scans, the automatic method agreed with 33 cases while 7 cases were 1 category off.
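
    For reference, the Agatston score sums, over calcified lesions and slices, the lesion area multiplied by a weight set by the lesion's peak attenuation (1 for 130-199 HU, 2 for 200-299 HU, 3 for 300-399 HU, 4 for 400 HU and above). A minimal NumPy/SciPy sketch follows; the conventional 130 HU threshold is used here, whereas the algorithm above applies 180 HU to low-dose un-gated scans, and the pixel area and synthetic slice are placeholders.

        import numpy as np
        from scipy import ndimage

        def agatston_slice_score(ct_slice_hu, pixel_area_mm2, hu_threshold=130):
            """Agatston contribution of one axial slice within the coronary region mask."""
            labels, n = ndimage.label(ct_slice_hu >= hu_threshold)
            score = 0.0
            for lesion in range(1, n + 1):
                voxels = ct_slice_hu[labels == lesion]
                peak = voxels.max()
                weight = 1 if peak < 200 else 2 if peak < 300 else 3 if peak < 400 else 4
                score += voxels.size * pixel_area_mm2 * weight
            return score

        # illustrative call on a synthetic slice (0.6 x 0.6 mm pixels assumed)
        slice_hu = np.full((50, 50), 40.0)
        slice_hu[10:13, 10:13] = 450.0      # one small, dense calcification
        print(agatston_slice_score(slice_hu, pixel_area_mm2=0.36))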

  10. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Because data and processing resources in disaster management are distributed, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides an efficient, flexible and reliable implementation for handling different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing on SOA platforms to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fire and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented using web-based processing of remote sensing imagery from MODIS. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.

  11. Investigation of automated feature extraction techniques for applications in cancer detection from multispectral histopathology images

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Levenson, Richard M.; Rimm, David L.

    2003-05-01

    Recent developments in imaging technology mean that it is now possible to obtain high-resolution histological image data at multiple wavelengths. This allows pathologists to image specimens over a full spectrum, thereby revealing (often subtle) distinctions between different types of tissue. With this type of data, the spectral content of the specimens, combined with quantitative spatial feature characterization may make it possible not only to identify the presence of an abnormality, but also to classify it accurately. However, such are the quantities and complexities of these data, that without new automated techniques to assist in the data analysis, the information contained in the data will remain inaccessible to those who need it. We investigate the application of a recently developed system for the automated analysis of multi-/hyper-spectral satellite image data to the problem of cancer detection from multispectral histopathology image data. The system provides a means for a human expert to provide training data simply by highlighting regions in an image using a computer mouse. Application of these feature extraction techniques to examples of both training and out-of-training-sample data demonstrate that these, as yet unoptimized, techniques already show promise in the discrimination between benign and malignant cells from a variety of samples.

  12. Automated detection of retinal layers from OCT spectral domain images of healthy eyes

    NASA Astrophysics Data System (ADS)

    Giovinco, Gaspare; Savastano, Maria Cristina; Ventre, Salvatore; Tamburrino, Antonello

    2015-06-01

    Optical coherence tomography (OCT) has become one of the most relevant diagnostic tools for retinal diseases. Besides being a non-invasive technique, one distinguished feature is its unique capability of providing (in vivo) cross-sectional view of the retina. Specifically, OCT images show the retinal layers. From the clinical point of view, the identification of the retinal layers opens new perspectives to study the correlation between morphological and functional aspects of the retinal tissue. The main contribution of this paper is a new method/algorithm for the automated segmentation of cross-sectional images of the retina of healthy eyes, obtained by means of spectral domain optical coherence tomography (SD-OCT). Specifically, the proposed segmentation algorithm provides the automated detection of different retinal layers. Tests on experimental SD-OCT scans performed by three different instruments/manufacturers have been successfully carried out and compared to a manual segmentation made by an independent ophthalmologist, showing the generality and the effectiveness of the proposed method.

  13. Automated detection of retinal layers from OCT spectral-domain images of healthy eyes

    NASA Astrophysics Data System (ADS)

    Giovinco, Gaspare; Savastano, Maria Cristina; Ventre, Salvatore; Tamburrino, Antonello

    2015-12-01

    Optical coherence tomography (OCT) has become one of the most relevant diagnostic tools for retinal diseases. Besides being a non-invasive technique, one distinguished feature is its unique capability of providing (in vivo) cross-sectional view of the retina. Specifically, OCT images show the retinal layers. From the clinical point of view, the identification of the retinal layers opens new perspectives to study the correlation between morphological and functional aspects of the retinal tissue. The main contribution of this paper is a new method/algorithm for the automated segmentation of cross-sectional images of the retina of healthy eyes, obtained by means of spectral-domain optical coherence tomography (SD-OCT). Specifically, the proposed segmentation algorithm provides the automated detection of different retinal layers. Tests on experimental SD-OCT scans performed by three different instruments/manufacturers have been successfully carried out and compared to a manual segmentation made by an independent ophthalmologist, showing the generality and the effectiveness of the proposed method.

  14. Automated Detection of Oil Depots from High Resolution Images: a New Perspective

    NASA Astrophysics Data System (ADS)

    Ok, A. O.; Başeski, E.

    2015-03-01

    This paper presents an original approach to identify oil depots from single high resolution aerial/satellite images in an automated manner. The new approach considers the symmetric nature of circular oil depots, and it computes the radial symmetry in a unique way. An automated thresholding method to focus on circular regions and a new measure to verify circles are proposed. Experiments are performed on six GeoEye-1 test images. In addition, we perform tests on 16 Google Earth images of an industrial test site acquired as a time series (between the years 1995 and 2012). The results reveal that our approach is capable of detecting circular objects in very different/difficult images. We computed an overall performance of 95.8% for the GeoEye-1 dataset. The time series investigation reveals that our approach is robust enough to locate oil depots in industrial environments under varying illumination and environmental conditions. The overall performance computed for the Google Earth dataset is 89.4%, which confirms the advantage of our approach over a state-of-the-art method.
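
    The paper's radial-symmetry measure is its own contribution and is not reproduced here. As a generic stand-in for locating circular structures such as oil depots in a grayscale image, the sketch below applies a Canny edge map followed by scikit-image's circular Hough transform; the radius range, edge sigma, and peak count are placeholder parameters.

        import numpy as np
        from skimage.feature import canny
        from skimage.transform import hough_circle, hough_circle_peaks

        def find_circular_objects(gray_image, radii=np.arange(10, 60, 2), max_objects=20):
            """Return (row, col, radius) candidates for circular structures."""
            edges = canny(gray_image, sigma=2.0)
            accumulator = hough_circle(edges, radii)
            _, cols, rows, found_radii = hough_circle_peaks(accumulator, radii,
                                                            total_num_peaks=max_objects)
            return list(zip(rows, cols, found_radii))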

  15. Automated Detection/Characterization of EUV Waves in SDO/AIA Data

    NASA Astrophysics Data System (ADS)

    Shih, A. Y.; Ireland, J.; Christe, S.; Hughitt, V. K.; Young, C.; Earnshaw, M. D.; Mayer, F.

    2012-12-01

    Although EUV waves in the solar corona (also called coronal bright fronts or "EIT waves") were first observed in 1996, many questions still remain about their nature and their association with other phenomena such as flares, CMEs, and Moreton waves. The high-resolution, high-cadence data from the Atmospheric Imaging Assembly (AIA) instrument on the Solar Dynamics Observatory (SDO) allows for unprecedented analysis of the kinematics and morphology of EUV waves. This information can be crucial for constraining and differentiating between theoretical models. While this analysis can be performed "by hand", the large volume of AIA data is well-suited for automated algorithms to detect and characterize these waves. We are developing such algorithms, which will generate a comprehensive catalog that can be used for statistical studies, and the biases of the algorithms can be well-studied using simulated data. We take advantage of image processing methods developed in Python, a general-purpose scientific computing language widely used by multiple communities, as well as the SunPy Python library. We compare the results of our automated algorithms with other efforts that use more traditional, human-powered methods to identify and characterize EUV waves.

  16. Computer-aided interpretation of ICU portable chest images: automated detection of endotracheal tubes

    NASA Astrophysics Data System (ADS)

    Huo, Zhimin; Li, Simon; Chen, Minjie; Wandtke, John

    2008-03-01

    In intensive care units (ICU), endotracheal (ET) tubes are inserted to assist patients who may have difficulty breathing. A malpositioned ET tube could lead to a collapsed lung, which is life threatening. The purpose of this study is to develop a new method that automatically detects the positioning of ET tubes on portable chest X-ray images. The method determines a region of interest (ROI) in the image and processes the raw image to provide edge enhancement for further analysis. The search of ET tubes is performed within the ROI. The ROI is determined based upon the analysis of the positions of the detected lung area and the spine in the image. Two feature images are generated: a Haar-like image and an edge image. The Haar-like image is generated by applying a Haar-like template to the raw ROI or the enhanced version of the raw ROI. The edge image is generated by applying a direction-specific edge detector. Both templates are designed to represent the characteristics of the ET tubes. Thresholds are applied to the Haar-like image and the edge image to detect initial tube candidates. Region growing, combined with curve fitting of the initial detected candidates, is performed to detect the entire ET tube. The region growing or "tube growing" is guided by the fitted curve of the initial candidates. Merging of the detected tubes after tube growing is performed to combine the detected broken tubes. Tubes within a predefined space can be merged if they meet a set of criteria. Features, such as width, length of the detected tubes, tube positions relative to the lung and spine, and the statistics from the analysis of the detected tube lines, are extracted to remove the false-positive detections in the images. The method is trained and evaluated on two different databases. Preliminary results show that computer-aided detection of tubes in portable chest X-ray images is promising. It is expected that automated detection of ET tubes could lead to timely detection of

  17. Computer-automated caries detection in digital bitewings: consistency of a program and its influence on observer agreement.

    PubMed

    Wenzel, A

    2001-01-01

    The aim of this study was to evaluate a decision-support, caries detection program and its influence on observer agreement in caries diagnosis. 130 patients were examined by digital bitewing radiography (RVG XL sensor, Trophy Radiologie Inc.). Fifty-four approximal surfaces (27 in premolars and 27 in molars) were selected by the author: 24 surfaces (9 in molars and 15 in premolars) scored as sound, 16 surfaces (9 in molars and 7 in premolars) scored as carious in enamel, and 14 surfaces (9 in molars and 5 in premolars) scored as carious in dentine. The Logicon Caries Detector (LCD) program (Logicon Inc., USA) was assessed by repeating the automated analysis ten times for each surface. The two most varying outcomes for lesion probability (Lp(min) and Lp(max)) were saved. Five observers scored the 54 surfaces independently as sound, caries in enamel or caries in dentine before and after the use of LCD. In more than one third of all surfaces the program indicated different lesion probability, from sound at Lp(min) to the presence of a carious lesion at Lp(max). The 5 observers changed their caries score after the use of LCD in a total of 31 surfaces (only 2 of these were in the same surface). The mean kappa value for inter-observer agreement for caries scores before the use of LCD was 0.47 (range 0.39-0.61) and after LCD 0.48 (range 0.37-0.69). It was concluded that the automated caries detection program was not very consistent and provided different opinions on the caries status of a surface. Inter-observer agreement in caries diagnosis did not improve using the program.
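
    Inter-observer agreement of the kind reported above can be computed, for example, as the mean pairwise Cohen's kappa across observers (the abstract does not state whether a pairwise or multi-rater statistic was used). The sketch below uses scikit-learn; the ratings array is a hypothetical stand-in for the five observers' scores on the 54 surfaces.

        import numpy as np
        from itertools import combinations
        from sklearn.metrics import cohen_kappa_score

        def mean_pairwise_kappa(ratings):
            """ratings: (n_observers, n_surfaces) array of ordinal caries scores."""
            kappas = [cohen_kappa_score(ratings[i], ratings[j])
                      for i, j in combinations(range(ratings.shape[0]), 2)]
            return float(np.mean(kappas)), (min(kappas), max(kappas))

        # hypothetical scores for 5 observers on 54 surfaces (0 = sound, 1 = enamel, 2 = dentine)
        rng = np.random.default_rng(0)
        ratings = rng.integers(0, 3, size=(5, 54))
        print(mean_pairwise_kappa(ratings))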

  18. Creating an automated chiller fault detection and diagnostics tool using a data fault library.

    PubMed

    Bailey, Margaret B; Kreider, Jan F

    2003-07-01

    Reliable, automated detection and diagnosis of abnormal behavior within vapor compression refrigeration cycle (VCRC) equipment is extremely desirable for equipment owners and operators. The specific type of VCRC equipment studied in this paper is a 70-ton helical rotary, air-cooled chiller. The fault detection and diagnostic (FDD) tool developed as part of this research analyzes chiller operating data and detects faults through recognizing trends or patterns existing within the data. The FDD method incorporates a neural network (NN) classifier to infer the current state given a vector of observables. Therefore the FDD method relies upon the availability of normal and fault empirical data for training purposes and therefore a fault library of empirical data is assembled. This paper presents procedures for conducting sophisticated fault experiments on chillers that simulate air-cooled condenser, refrigerant, and oil related faults. The experimental processes described here are not well documented in literature and therefore will provide the interested reader with a useful guide. In addition, the authors provide evidence, based on both thermodynamics and empirical data analysis, that chiller performance is significantly degraded during fault operation. The chiller's performance degradation is successfully detected and classified by the NN FDD classifier as discussed in the paper's final section. PMID:12858981

  19. Use of computer and respiratory inductance plethysmography for the automated detection of swallowing in the elderly.

    PubMed

    Moreau-Gaudry, Alexandre; Sabil, Abdelkebir; Baconnier, Pierre; Benchetrit, Gila; Franco, Alain

    2005-01-01

    Deglutition disorders can occur at any age but are especially prevalent in the elderly. The resulting morbidity and mortality are being recognized as major geriatric health issues. Because of difficulties in studying swallowing in the frail elderly, a new, non-invasive, user-friendly, bedside technique has been developed. Ideally suited to such patients, this tool, an intermediary between purely instrumental and clinical methods, combines respiratory inductance plethysmography (RIP) and a computer to detect swallowing automatically. Based on an automated analysis of the airflow estimated from the RIP-derived signal, this new tool was evaluated according to its capacity to detect clinical swallowing among the 1643 automatically detected respiratory events. The evaluation used contingency tables and Receiver Operating Characteristic (ROC) curves. Results were all significant (χ²(1, n = 1643) > 100, p < 0.01). Considering its high accuracy in detecting swallowing (area under the ROC curve greater than 0.9), this system is proposed for studying deglutition, and then deglutition disorders, in the frail elderly, for setting up medical supervision and for evaluating the efficacy of remedial therapy for swallowing disorders.

  20. Automated detection of pain from facial expressions: a rule-based approach using AAM

    NASA Astrophysics Data System (ADS)

    Chen, Zhanli; Ansari, Rashid; Wilkie, Diana J.

    2012-02-01

    In this paper, we examine the problem of using video analysis to assess pain, an important problem especially for critically ill, non-communicative patients, and people with dementia. We propose and evaluate an automated method to detect the presence of pain manifested in patient videos using a unique and large collection of cancer patient videos captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS) that is widely used for objective assessment in pain analysis. In our research, a person-specific Active Appearance Model (AAM) based on the Project-Out Inverse Compositional Method is trained for each patient individually for modeling purposes. A flexible representation of the shape model is used in a rule-based method that is better suited than the more commonly used classifier-based methods for application to the cancer patient videos, in which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on feature points that provide facial action cues and are extracted from the shape vertices of the AAM, which have a natural correspondence to facial muscle movement. In this paper, we investigate the detection of a commonly used set of pain-related action units in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained FACS coders who independently reviewed and scored the action units in the cancer patient videos.

  1. An automated dental caries detection and scoring system for optical images of tooth occlusal surface.

    PubMed

    Ghaedi, Leila; Gottlieb, Riki; Sarrett, David C; Ismail, Amid; Belle, Ashwin; Najarian, Kayvan; Hargraves, Rosalyn Hobson

    2014-01-01

    Dental caries are one of the most prevalent chronic diseases. The management of dental caries demands detection of carious lesions at early stages. This study aims to design an automated system to detect and score caries lesions based on optical images of the occlusal tooth surface according to the International Caries Detection and Assessment System (ICDAS) guidelines. The system detects the tooth boundaries and irregular regions, and extracts 77 features from each image. These features include statistical measures of color space, grayscale image, as well as Wavelet Transform and Fourier Transform based features. Used in this study were 88 occlusal surface photographs of extracted teeth examined and scored by ICDAS experts. Seven ICDAS codes which show the different stages in caries development were collapsed into three classes: score 0, scores 1 and 2, and scores 3 to 6. The system shows accuracy of 86.3%, specificity of 91.7%, and sensitivity of 83.0% in ten-fold cross validation in classification of the tooth images. While the system needs further improvement and validation using larger datasets, it presents promising potential for clinical diagnostics with high accuracy and minimal cost. This is a notable advantage over existing systems requiring expensive imaging and external hardware.

  2. Knee X-ray image analysis method for automated detection of Osteoarthritis

    PubMed Central

    Shamir, Lior; Ling, Shari M.; Scott, William W.; Bos, Angelo; Orlov, Nikita; Macura, Tomasz; Eckley, D. Mark; Ferrucci, Luigi; Goldberg, Ilya G.

    2008-01-01

    We describe a method for automated detection of radiographic Osteoarthritis (OA) in knee X-ray images. The detection is based on the Kellgren-Lawrence classification grades, which correspond to the different stages of OA severity. The classifier was built using manually classified X-rays, representing the first four KL grades (normal, doubtful, minimal and moderate). Image analysis is performed by first identifying a set of image content descriptors and image transforms that are informative for the detection of OA in the X-rays, and assigning weights to these image features using Fisher scores. Then, a simple weighted nearest neighbor rule is used in order to predict the KL grade to which a given test X-ray sample belongs. The dataset used in the experiment contained 350 X-ray images classified manually by their KL grades. Experimental results show that moderate OA (KL grade 3) and minimal OA (KL grade 2) can be differentiated from normal cases with accuracy of 91.5% and 80.4%, respectively. Doubtful OA (KL grade 1) was detected automatically with a much lower accuracy of 57%. The source code developed and used in this study is available for free download at www.openmicroscopy.org. PMID:19342330
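
    A rough sketch of the classification rule described above: per-feature Fisher scores are used as weights in a nearest-neighbour prediction of the KL grade. The variance-ratio form of the Fisher score is assumed, and the feature matrix and grades are synthetic placeholders rather than the study's 350 graded X-rays.

        import numpy as np

        def fisher_scores(X, y):
            """Per-feature Fisher score: between-class variance over within-class variance."""
            classes = np.unique(y)
            overall_mean = X.mean(axis=0)
            between = sum((y == c).sum() * (X[y == c].mean(axis=0) - overall_mean) ** 2 for c in classes)
            within = sum(((X[y == c] - X[y == c].mean(axis=0)) ** 2).sum(axis=0) for c in classes)
            return between / (within + 1e-12)

        def weighted_nn_predict(X_train, y_train, x_test, weights):
            """Predict the KL grade of x_test from its nearest training sample in weighted feature space."""
            d = np.sqrt(((weights * (X_train - x_test)) ** 2).sum(axis=1))
            return y_train[np.argmin(d)]

        # synthetic stand-in: 350 images, 40 image-content descriptors, KL grades 0-3
        rng = np.random.default_rng(1)
        X, y = rng.normal(size=(350, 40)), rng.integers(0, 4, 350)
        w = fisher_scores(X, y)
        print(weighted_nn_predict(X[:-1], y[:-1], X[-1], w))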

  3. Automated electronic systems for the detection of oestrus and timing of AI in cattle.

    PubMed

    Nebel, R L; Dransfield, M G; Jobst, S M; Bame, J H

    2000-07-01

    For the majority of dairy herds where artificial insemination (AI) is practiced, the limiting factor toward obtaining efficient reproductive performance is the failure to detect oestrus in a timely and accurate manner. Periodic visual observation has been the dominant method used to identify cows in oestrus. New approaches are being developed to provide automated systems of detection of oestrus using electronic technology. The goal of an oestrus detection program should be to identify oestrus positively and accurately in all cycling animals and consequently to identify animals not cycling. The ultimate goal should be to predict the time of ovulation, thus allowing for insemination that will maximize the opportunity for conception. Unfortunately, most studies designed to evaluate the optimal time of AI generally contained two technical deficiencies: inadequate numbers of cows for valid statistical comparisons and inaccurate knowledge of the onset of oestrus because of low frequency of visual observations and/or efficiency of methods used for the detection of oestrus. Studies using pedometry and a pressure sensing radiotelemetric system will be reviewed as each have independently obtained an optimal time of AI of 5 to 17 h after either the increase in locomotive activity or following the first standing event associated with the onset of oestrus. PMID:10844237

  4. The automated system of detection and research of pollution in the atmosphere

    NASA Astrophysics Data System (ADS)

    Isakova, Anna I.; Smal, Oksana V.; Chistyakova, Liliya K.; Penin, Sergei T.

    2004-02-01

    In this paper, the automated system of data processing (ASDP) for the hardware complex DAN-2, designed for registration of emission and absorption of optical and microwave radiation initiated by gas-aerosol pollution in the atmosphere, is presented. The DAN-2 complex was developed at the Institute of Atmospheric Optics of the Siberian Branch of the Russian Academy of Sciences. The ASDP automates the recording, storage, and processing of the information measured in the experiments. Subsystems for forecasting optical noise and for forecasting the distribution of impurities in plumes of gas-aerosol emissions from industrial plants allow rapid (express) analysis of ecological pollution in the inspection zone. A modular design allows all ASDP subsystems to be implemented independently of each other, so they can operate either on their own or as part of the overall software complex. Delphi 5.0 was chosen as the development tool for the system software; it offers a convenient graphical interface that displays calculation results as scrolling tables and plots, access to data files, fast numerical computation, and the possibility of further extension and modification of the calculation algorithms. Use of the ASDP has improved the quality of data recording, processing, and visualization of results. For the first time, the automated system provides a comprehensive assessment of the ecological situation from experimental data in real time. The ASDP can also be used with other experimental equipment intended for solving problems in atmospheric optics.

  5. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
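
    As a concrete reference point for the Gaussian end of this family, the sketch below (an illustrative stand-in, not the authors' code) scores each pixel pair by its Mahalanobis distance in the stacked space of the two co-registered images, i.e. an RX-style anomaly detector applied to stacked pixels; the function name and the use of a pseudo-inverse are assumptions.

      import numpy as np

      def stacked_gaussian_acd(img_x, img_y):
          # img_x, img_y: co-registered images of shape (rows, cols, bands).
          rows, cols = img_x.shape[:2]
          x = img_x.reshape(rows * cols, -1).astype(float)
          y = img_y.reshape(rows * cols, -1).astype(float)
          z = np.hstack([x, y])            # stacked pixel vectors
          z -= z.mean(axis=0)
          icov = np.linalg.pinv(np.cov(z, rowvar=False))
          # Mahalanobis distance in the stacked space: large values mark pixel
          # pairs that are jointly unusual, i.e. candidate anomalous changes.
          score = np.einsum('ij,jk,ik->i', z, icov, z)
          return score.reshape(rows, cols)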

  6. Monitoring gypsy moth defoliation by applying change detection techniques to Landsat imagery

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Stauffer, M. L.

    1978-01-01

    The overall objective of a research effort at NASA's Goddard Space Flight Center is to develop and evaluate digital image processing techniques that will facilitate the assessment of the intensity and spatial distribution of forest insect damage in Northeastern U.S. forests using remotely sensed data from Landsats 1, 2 and C. Automated change detection techniques are presently being investigated as a method of isolating the areas of change in the forest canopy resulting from pest outbreaks. In order to follow the change detection approach, Landsat scene correction and overlay capabilities are utilized to provide multispectral/multitemporal image files of 'defoliation' and 'nondefoliation' forest stand conditions.

  7. Automation of disbond detection in aircraft fuselage through thermal image processing

    NASA Technical Reports Server (NTRS)

    Prabhu, D. R.; Winfree, W. P.

    1992-01-01

    A procedure for interpreting thermal images obtained during the nondestructive evaluation of aircraft bonded joints is presented. The procedure operates on time-derivative thermal images and results in a disbond image with disbonds highlighted. The size of the 'black clusters' in the output disbond image is a quantitative measure of disbond size. The procedure is illustrated using simulation data as well as data obtained through experimental testing of fabricated samples and aircraft panels. Good results are obtained, and, except in pathological cases, 'false calls' in the cases studied appeared only as noise in the output disbond image, which was easily filtered out. The thermal detection technique coupled with an automated image interpretation capability will be a very fast and effective method for inspecting bonded joints in an aircraft structure.

  8. Automated Detection of Atrial Fibrillation from the Electrocardiogram Channel of Polysomnograms

    PubMed Central

    Monahan, Ken; Song, Yanna; Loparo, Ken; Mehra, Reena; Harrell, Frank E; Redline, Susan

    2016-01-01

    Purpose Accurate identification of atrial fibrillation episodes from polysomnograms is important for research purposes, but requires manual review of a large number of long electrocardiographic tracings. As automated assessment of these tracings for atrial fibrillation may improve efficiency, this study aimed to evaluate this approach in polysomnogram-derived electrocardiographic data. Methods A previously described algorithm to detect atrial fibrillation from single-lead electrocardiograms was applied to polysomnograms from a large epidemiologic study of obstructive sleep apnea in older men (Osteoporotic Fractures in Men [MrOS] Sleep Study). Atrial fibrillation status during each participant's PSG was determined by independent manual review. Models to predict atrial fibrillation status from a combination of algorithm output and clinical/polysomnographic characteristics were developed and their accuracy was evaluated using standard statistical techniques. Results Derivation and validation cohorts each consisted of 1395 individuals; 5% of each group had atrial fibrillation. Model parameters were optimized for the derivation cohort using the Akaike Information Criterion. Application to the validation cohort of these optimized models revealed high sensitivity (85-90%) and specificity (90-95%) as well as good predictive ability, as assessed by the C statistic (> 0.9) and generalized R2 values (∼ 0.6). Addition of cardiovascular or polysomnogram data to the models did not improve their performance. Conclusions In a research setting, automated detection of atrial fibrillation from polysomnogram-derived electrocardiographic signals appears feasible and agrees well with manual identification. Future studies can evaluate the utility of this technique as applied to clinical polysomnograms and ambulatory electrocardiographic monitoring. PMID:26092280

  9. Automated wavelet denoising of photoacoustic signals for circulating melanoma cell detection and burn image reconstruction.

    PubMed

    Holan, Scott H; Viator, John A

    2008-06-21

    Photoacoustic image reconstruction may involve hundreds of point measurements, each of which contributes unique information about the subsurface absorbing structures under study. For backprojection imaging, two or more point measurements of photoacoustic waves induced by irradiating a biological sample with laser light are used to produce an image of the acoustic source. Each of these measurements must undergo some signal processing, such as denoising or system deconvolution. In order to process the numerous signals, we have developed an automated wavelet algorithm for denoising signals. We appeal to the discrete wavelet transform for denoising photoacoustic signals generated in a dilute melanoma cell suspension and in thermally coagulated blood. We used 5, 9, 45 and 270 melanoma cells in the laser beam path as test concentrations. For the burn phantom, we used coagulated blood in a 1.6 mm silicon tube submerged in Intralipid. Although these two targets were chosen as typical applications for photoacoustic detection and imaging, they are of independent interest. The denoising employs level-independent universal thresholding. In order to accommodate non-radix-2 signals, we considered a maximal overlap discrete wavelet transform (MODWT). For the lower melanoma cell concentrations, as the signal-to-noise ratio approached 1, denoising allowed better peak finding. For coagulated blood, the signals were denoised to yield a clean photoacoustic signal, resulting in an improvement of 22% in the reconstructed image. The entire signal processing technique was automated so that minimal user intervention was needed to reconstruct the images. Such an algorithm may be used for image reconstruction and signal extraction for applications such as burn depth imaging, depth profiling of vascular lesions in skin and the detection of single cancer cells in blood samples. PMID:18495977

  10. NOTE: Automated wavelet denoising of photoacoustic signals for circulating melanoma cell detection and burn image reconstruction

    NASA Astrophysics Data System (ADS)

    Holan, Scott H.; Viator, John A.

    2008-06-01

    Photoacoustic image reconstruction may involve hundreds of point measurements, each of which contributes unique information about the subsurface absorbing structures under study. For backprojection imaging, two or more point measurements of photoacoustic waves induced by irradiating a biological sample with laser light are used to produce an image of the acoustic source. Each of these measurements must undergo some signal processing, such as denoising or system deconvolution. In order to process the numerous signals, we have developed an automated wavelet algorithm for denoising signals. We appeal to the discrete wavelet transform for denoising photoacoustic signals generated in a dilute melanoma cell suspension and in thermally coagulated blood. We used 5, 9, 45 and 270 melanoma cells in the laser beam path as test concentrations. For the burn phantom, we used coagulated blood in a 1.6 mm silicon tube submerged in Intralipid. Although these two targets were chosen as typical applications for photoacoustic detection and imaging, they are of independent interest. The denoising employs level-independent universal thresholding. In order to accommodate non-radix-2 signals, we considered a maximal overlap discrete wavelet transform (MODWT). For the lower melanoma cell concentrations, as the signal-to-noise ratio approached 1, denoising allowed better peak finding. For coagulated blood, the signals were denoised to yield a clean photoacoustic signal, resulting in an improvement of 22% in the reconstructed image. The entire signal processing technique was automated so that minimal user intervention was needed to reconstruct the images. Such an algorithm may be used for image reconstruction and signal extraction for applications such as burn depth imaging, depth profiling of vascular lesions in skin and the detection of single cancer cells in blood samples.
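
    A minimal sketch of the kind of automated, level-independent universal-threshold denoising described in the two records above, written with PyWavelets; it uses the standard decimated DWT rather than the MODWT of the paper, and the wavelet choice, decomposition level, and noise estimator are assumptions.

      import numpy as np
      import pywt

      def denoise_universal(signal, wavelet='sym8', level=5):
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          # Robust noise estimate from the finest-scale detail coefficients.
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745
          # Level-independent (universal) threshold: sigma * sqrt(2 ln N).
          thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
          coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft')
                                  for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[:len(signal)]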

  11. Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool.

    SciTech Connect

    Hart, Darren

    2004-07-01

    MatSeis's infrasound analysis tool, Infra Tool, uses frequency slowness processing to deconstruct the array data into three outputs per processing step: correlation, azimuth and slowness. Until now, infrasound signal detection has been accomplished manually by an experienced analyst trained to recognize patterns in the outputs of the signal processing. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value. These two statements describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file will list the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg) and trace velocity (km/s). As an example, a ground truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array location, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and
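
    The detection criterion (flat azimuth plus elevated correlation over consecutive processing steps) can be illustrated with a least-squares stand-in for the representative-slope estimate; the window length and the slope and correlation thresholds below are assumptions, not the tool's actual settings.

      import numpy as np

      def detect_flat_azimuth(azimuth, correlation, win=10,
                              slope_tol=0.5, corr_floor=0.5):
          # azimuth, correlation: one value per frequency-slowness processing step.
          detections = []
          steps = np.arange(win)
          for i in range(len(azimuth) - win + 1):
              a = azimuth[i:i + win]
              c = correlation[i:i + win]
              slope = np.polyfit(steps, a, 1)[0]   # representative slope (deg/step)
              if abs(slope) < slope_tol and c.mean() > corr_floor:
                  detections.append(i)             # azimuth flat and correlation high
          return detections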

  12. A Novel Method for the Separation of Overlapping Pollen Species for Automated Detection and Classification.

    PubMed

    Tello-Mijares, Santiago; Flores, Francisco

    2016-01-01

    The identification of pollen in an automated way will accelerate different tasks and applications of palynology to aid in, among others, climate change studies, medical allergy calendars, and forensic science. The aim of this paper is to develop a system that automatically captures a hundred microscopic images of pollen and classifies them into the 12 different species from the Lagunera Region, Mexico. Pollen grains often overlap in the microscopic images, which makes their automated identification and classification more difficult. This paper focuses on a method to segment the overlapping pollen. The proposed method first segments the overlapping pollen and then separates the individual grains based on the mean shift process (100% segmentation) and erosion by H-minima based on the Fibonacci series. The pollen is then characterized by its shape, color, and texture for training and evaluating the performance of three classification techniques: random tree forest, multilayer perceptron, and Bayes net. Using the newly developed system, we obtained segmentation results of 100% and classification performance above 96.2% recall and 96.1% precision using a multilayer perceptron in twofold cross-validation.

  13. A Novel Method for the Separation of Overlapping Pollen Species for Automated Detection and Classification

    PubMed Central

    Flores, Francisco

    2016-01-01

    The identification of pollen in an automated way will accelerate different tasks and applications of palynology to aid in, among others, climate change studies, medical allergy calendars, and forensic science. The aim of this paper is to develop a system that automatically captures a hundred microscopic images of pollen and classifies them into the 12 different species from the Lagunera Region, Mexico. Pollen grains often overlap in the microscopic images, which makes their automated identification and classification more difficult. This paper focuses on a method to segment the overlapping pollen. The proposed method first segments the overlapping pollen and then separates the individual grains based on the mean shift process (100% segmentation) and erosion by H-minima based on the Fibonacci series. The pollen is then characterized by its shape, color, and texture for training and evaluating the performance of three classification techniques: random tree forest, multilayer perceptron, and Bayes net. Using the newly developed system, we obtained segmentation results of 100% and classification performance above 96.2% recall and 96.1% precision using a multilayer perceptron in twofold cross-validation. PMID:27034710

  15. Image patch-based method for automated classification and detection of focal liver lesions on CT

    NASA Astrophysics Data System (ADS)

    Safdari, Mustafa; Pasari, Raghav; Rubin, Daniel; Greenspan, Hayit

    2013-03-01

    We developed a method for automated classification and detection of liver lesions in CT images based on image patch representation and bag-of-visual-words (BoVW). BoVW analysis has been extensively used in the computer vision domain to analyze scenery images. In the current work we discuss how it can be used for liver lesion classification and detection. The methodology includes building a dictionary for a training set using local descriptors and representing a region in the image using a visual word histogram. Two tasks are described: a classification task, for lesion characterization, and a detection task in which a scan window moves across the image and is determined to be normal liver tissue or a lesion. Data: In the classification task 73 CT images of liver lesions were used, 25 images having cysts, 24 having metastasis and 24 having hemangiomas. A radiologist circumscribed the lesions, creating a region of interest (ROI), in each of the images. He then provided the diagnosis, which was established either by biopsy or clinical follow-up. Thus our data set comprises 73 images and 73 ROIs. In the detection task, a radiologist drew ROIs around each liver lesion and two regions of normal liver, for a total of 159 liver lesion ROIs and 146 normal liver ROIs. The radiologist also demarcated the liver boundary. Results: Classification results of more than 95% were obtained. In the detection task, the F1 score obtained is 0.76, with recall of 84% and precision of 73%. Results show the ability to detect lesions, regardless of shape.
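
    A compact sketch of the BoVW pipeline described above, assuming scikit-learn and raw grey-level patches as the local descriptors; the patch size, dictionary size, classifier, and the variables train_rois and train_labels are placeholders rather than the study's actual choices.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.svm import SVC

      def extract_patches(roi, size=9, stride=4):
          # Collect raw grey-level patches from an ROI as local descriptors.
          patches = []
          for r in range(0, roi.shape[0] - size, stride):
              for c in range(0, roi.shape[1] - size, stride):
                  patches.append(roi[r:r + size, c:c + size].ravel())
          return np.array(patches, dtype=float)

      def bovw_histogram(roi, kmeans):
          # Map each patch to its nearest visual word and normalise the counts.
          words = kmeans.predict(extract_patches(roi))
          hist = np.bincount(words, minlength=kmeans.n_clusters).astype(float)
          return hist / hist.sum()

      # train_rois: list of 2D ROI arrays; train_labels: lesion types (placeholders).
      train_patches = np.vstack([extract_patches(roi) for roi in train_rois])
      kmeans = KMeans(n_clusters=100, n_init=10).fit(train_patches)

      X = np.array([bovw_histogram(roi, kmeans) for roi in train_rois])
      clf = SVC(kernel='rbf').fit(X, train_labels)   # cyst / metastasis / hemangioma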

  16. Automated SNP detection in expressed sequence tags: statistical considerations and application to maritime pine sequences.

    PubMed

    Dantec, Loïck Le; Chagné, David; Pot, David; Cantin, Olivier; Garnier-Géré, Pauline; Bedon, Frank; Frigerio, Jean-Marc; Chaumeil, Philippe; Léger, Patrick; Garcia, Virginie; Laigret, Frédéric; De Daruvar, Antoine; Plomion, Christophe

    2004-02-01

    We developed an automated pipeline for the detection of single nucleotide polymorphisms (SNPs) in expressed sequence tag (EST) data sets by combining three DNA sequence analysis programs: Phred, Phrap and PolyBayes. This application requires access to the individual electrophoregram traces. First, a reference set of 65 SNPs was obtained from the sequencing of 30 gametes in 13 maritime pine (Pinus pinaster Ait.) gene fragments (6671 bp), resulting in a frequency of 1 SNP every 102.6 bp. Second, parameters of the three programs were optimized in order to retrieve as many true SNPs as possible while keeping the rate of false positives as low as possible. Overall, the efficiency of detection of true SNPs was 83.1%. However, this rate varied largely as a function of the rare SNP allele frequency: down to 41% for rare SNP alleles (frequency < 10%), and up to 98% for allele frequencies above 10%. Third, the detection method was applied to the 18498 assembled maritime pine ESTs, allowing a total of 1400 candidate SNPs to be identified in contigs containing between 4 and 20 sequence reads. These genetic resources, described for the first time in a forest tree species, were made available at http://www.pierroton.inra/genetics/Pinesnps. We also derived an analytical expression for the SNP detection probability as a function of the SNP allele frequency, the number of haploid genomes used to generate the EST sequence database, and the sample size of the contigs considered for SNP detection. The frequency of the SNP allele was shown to be the main factor influencing the probability of SNP detection.
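
    The authors' analytical expression is not reproduced in this abstract; as a loudly hypothetical illustration of the dominant allele-frequency effect, the snippet below gives the simplest such probability, namely that both alleles of a SNP with allele frequency p are sampled at least once among the n reads of a contig. It ignores the EST database size and error terms that the authors also model.

      def p_both_alleles_sampled(p, n):
          # p: SNP minor-allele frequency; n: number of reads in the contig.
          # Probability that both alleles appear at least once, a prerequisite
          # for the SNP being detectable in that contig (hypothetical simplification).
          return 1.0 - p**n - (1.0 - p)**n

      # Example: a 5%-frequency allele in a 4-read contig is sampled both ways only
      # about 19% of the time, whereas a 30%-frequency allele reaches about 75%.
      print(p_both_alleles_sampled(0.05, 4), p_both_alleles_sampled(0.30, 4))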

  17. Detecting Landscape Change: The View from Above

    ERIC Educational Resources Information Center

    Porter, Jess

    2008-01-01

    This article will demonstrate an approach for discovering and assessing local landscape change through the use of remotely sensed images. A brief introduction to remotely sensed imagery is followed by a discussion of relevant ways to introduce this technology into the college science classroom. The Map Detective activity demonstrates the…

  18. Detection of abrupt changes in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1984-01-01

    Some of the basic ideas associated with the detection of abrupt changes in dynamic systems are presented. Multiple filter-based techniques and residual-based methods, including the multiple model and generalized likelihood ratio methods, are considered. Issues such as the effect of unknown onset time on algorithm complexity and structure, and robustness to model uncertainty, are discussed.
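
    To make the residual-based idea concrete, here is a minimal two-sided CUSUM detector operating on filter residuals (innovations); this is a simpler relative of the generalized likelihood ratio approach mentioned above, and the drift and threshold values are assumptions.

      def cusum_on_residuals(residuals, drift=0.5, threshold=5.0):
          # residuals: innovations from a filter tracking the nominal model,
          # assumed roughly zero-mean before any change.  Two-sided CUSUM.
          g_pos, g_neg = 0.0, 0.0
          for k, r in enumerate(residuals):
              g_pos = max(0.0, g_pos + r - drift)
              g_neg = max(0.0, g_neg - r - drift)
              if g_pos > threshold or g_neg > threshold:
                  return k            # estimated alarm (change) time
          return None                 # no abrupt change detected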

  19. Automated high-pressure titration system with in situ infrared spectroscopic detection

    SciTech Connect

    Thompson, Christopher J.; Martin, Paul F.; Chen, Jeffrey; Schaef, Herbert T.; Rosso, Kevin M.; Felmy, Andrew R.; Loring, John S.; Benezeth, Pascale

    2014-04-15

    A fully automated titration system with infrared detection was developed for investigating interfacial chemistry at high pressures. The apparatus consists of a high-pressure fluid generation and delivery system coupled to a high-pressure cell with infrared optics. A manifold of electronically actuated valves is used to direct pressurized fluids into the cell. Precise reagent additions to the pressurized cell are made with calibrated tubing loops that are filled with reagent and placed in-line with the cell and a syringe pump. The cell's infrared optics facilitate both transmission and attenuated total reflection (ATR) measurements to monitor bulk-fluid composition and solid-surface phenomena such as adsorption, desorption, complexation, dissolution, and precipitation. Switching between the two measurement modes is accomplished with moveable mirrors that direct the light path of a Fourier transform infrared spectrometer into the cell along transmission or ATR light paths. The versatility of the high-pressure IR titration system was demonstrated with three case studies. First, we titrated water into supercritical CO₂ (scCO₂) to generate an infrared calibration curve and determine the solubility of water in CO₂ at 50 °C and 90 bar. Next, we characterized the partitioning of water between a montmorillonite clay and scCO₂ at 50 °C and 90 bar. Transmission-mode spectra were used to quantify changes in the clay's sorbed water concentration as a function of scCO₂ hydration, and ATR measurements provided insights into competitive residency of water and CO₂ on the clay surface and in the interlayer. Finally, we demonstrated how time-dependent studies can be conducted with the system by monitoring the carbonation reaction of forsterite (Mg₂SiO₄) in water-bearing scCO₂ at 50 °C and 90 bar. Immediately after water dissolved in the scCO₂, a thin film of adsorbed water formed on the mineral surface, and the film

  20. Automated detection and characterization of microstructural features: application to eutectic particles in single crystal Ni-based superalloys

    NASA Astrophysics Data System (ADS)

    Tschopp, M. A.; Groeber, M. A.; Fahringer, R.; Simmons, J. P.; Rosenberger, A. H.; Woodward, C.

    2010-03-01

    Serial sectioning methods continue to produce an abundant amount of image data for quantifying the three-dimensional nature of material microstructures. Here, we discuss a methodology for automated detection and characterization of eutectic particles in serial images taken from a production turbine blade made of a heat-treated single crystal Ni-based superalloy (PWA 1484). This method includes two important steps for unassisted eutectic particle characterization: automatically identifying a seed point within each particle and segmenting the particle using a region growing algorithm with an automated stop point. Once detected, the segmented eutectic particles are used to calculate microstructural statistics for characterizing and reconstructing statistically representative synthetic microstructures for single crystal Ni-based superalloys. The significance of this work is its ability to automate characterization for analysing the 3D nature of eutectic particles.
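
    The segmentation step can be illustrated with a small seeded region-growing routine; the intensity-tolerance stopping rule below is only a stand-in, since the authors use their own automated stop-point criterion.

      import numpy as np
      from collections import deque

      def region_grow(image, seed, tol=10.0):
          # Grow a region from a seed pixel, accepting 4-connected neighbours whose
          # intensity stays within `tol` of the running region mean; growth stops
          # when no neighbour qualifies.
          mask = np.zeros(image.shape, dtype=bool)
          mask[seed] = True
          queue = deque([seed])
          total, count = float(image[seed]), 1
          while queue:
              r, c = queue.popleft()
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                          and not mask[nr, nc]
                          and abs(float(image[nr, nc]) - total / count) <= tol):
                      mask[nr, nc] = True
                      total += float(image[nr, nc])
                      count += 1
                      queue.append((nr, nc))
          return mask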

  1. Automated Detection of Microaneurysms Using Scale-Adapted Blob Analysis and Semi-Supervised Learning

    SciTech Connect

    Adal, Kedir M.; Sidebe, Desire; Ali, Sharib; Chaum, Edward; Karnowski, Thomas Paul; Meriaudeau, Fabrice

    2014-01-07

    Despite several attempts, automated detection of microaneurysms (MA) from digital fundus images still remains an open issue. This is due to the subtle nature of MAs against the surrounding tissues. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs in an image, and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are then introduced to characterize these blob regions. A semi-supervised learning approach, which requires few manually annotated learning examples, is also proposed to train a classifier to detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques as well as the applicability of the proposed features to analyze fundus images.
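
    The blob-finding stage can be approximated with a multi-scale Laplacian-of-Gaussian detector from scikit-image; the channel choice, scale range, and threshold below are assumptions, and fundus_image is a placeholder array, not part of the authors' pipeline.

      import numpy as np
      from skimage.feature import blob_log

      # Candidate microaneurysms as dark, roughly circular blobs: invert the green
      # channel (MAs appear dark) and run a multi-scale Laplacian-of-Gaussian
      # detector; each returned row is (row, col, sigma), with sigma the blob scale.
      green = fundus_image[..., 1].astype(float)
      candidates = blob_log(green.max() - green,
                            min_sigma=1, max_sigma=6, num_sigma=6,
                            threshold=0.02)
      radii = candidates[:, 2] * np.sqrt(2.0)   # approximate blob radii in pixels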

  2. Breast cancer detection: radiologists’ performance using mammography with and without automated whole-breast ultrasound

    PubMed Central

    Dean, Judy; Lee, Sung-Jae; Comulada, W. Scott

    2010-01-01

    Objective Radiologist reader performance for breast cancer detection using mammography plus automated whole-breast ultrasound (AWBU) was compared with mammography alone. Methods Screenings for non-palpable breast malignancies in women with radiographically dense breasts with contemporaneous mammograms and AWBU were reviewed by 12 radiologists blinded to the diagnoses; half the studies were abnormal. Readers first reviewed the 102 mammograms. The American College of Radiology (ACR) Breast Imaging Reporting and Data System (BIRADS) and Digital Mammographic Imaging Screening Trial (DMIST) likelihood ratings were recorded with location information for identified abnormalities. Readers then reviewed the mammograms and AWBU with knowledge of previous mammogram-only evaluation. We compared reader performance across screening techniques using absolute callback, areas under the curve (AUC), and figure of merit (FOM). Results True positivity of cancer detection increased 63%, with only a 4% decrease in true negativity. Reader-averaged AUC was higher for mammography plus AWBU compared with mammography alone by BIRADS (0.808 versus 0.701) and likelihood scores (0.810 versus 0.703). Similarly, FOM was higher for mammography plus AWBU compared with mammography alone by BIRADS (0.786 versus 0.613) and likelihood scores (0.791 versus 0.614). Conclusion Adding AWBU to mammography improved callback rates, accuracy of breast cancer detection, and confidence in callbacks for dense-breasted women. PMID:20632009

  3. Automated real-time detection of defects during machining of ceramics

    DOEpatents

    Ellingson, William A.; Sun, Jiangang

    1997-01-01

    Apparatus for the automated real-time detection and classification of defects during the machining of ceramic components employs an elastic optical scattering technique using polarized laser light. A ceramic specimen is continuously moved while being machined. Polarized laser light is directed onto the ceramic specimen surface at a fixed position just aft of the machining tool for examination of the newly machined surface. Any foreign material near the location of the laser light on the ceramic specimen is cleared by an air blast. As the specimen is moved, its surface is continuously scanned by the polarized laser light beam to provide a two-dimensional image presented in real-time on a video display unit, with the motion of the ceramic specimen synchronized with the data acquisition speed. By storing known "feature masks" representing various surface and sub-surface defects and comparing measured defects with the stored feature masks, detected defects may be automatically characterized. Using multiple detectors, various types of defects may be detected and classified.

  4. Automated real-time detection of defects during machining of ceramics

    DOEpatents

    Ellingson, W.A.; Sun, J.

    1997-11-18

    Apparatus for the automated real-time detection and classification of defects during the machining of ceramic components employs an elastic optical scattering technique using polarized laser light. A ceramic specimen is continuously moved while being machined. Polarized laser light is directed onto the ceramic specimen surface at a fixed position just aft of the machining tool for examination of the newly machined surface. Any foreign material near the location of the laser light on the ceramic specimen is cleared by an air blast. As the specimen is moved, its surface is continuously scanned by the polarized laser light beam to provide a two-dimensional image presented in real-time on a video display unit, with the motion of the ceramic specimen synchronized with the data acquisition speed. By storing known "feature masks" representing various surface and sub-surface defects and comparing measured defects with the stored feature masks, detected defects may be automatically characterized. Using multiple detectors, various types of defects may be detected and classified. 14 figs.

  5. Automated x-ray detection of contaminants in continuous food streams

    NASA Astrophysics Data System (ADS)

    Penman, David W.

    1996-10-01

    As an inspection technology, x-rays have been used in food product inspection for a number of years. Despite this, in contrast with the use of image processing techniques in medical applications of x-rays, food inspection systems have remained relatively rudimentary. Some of our earlier work in this area has been stimulated by specific x-ray inspection tasks, where we have been required to locate contaminants in batches of particular packaged products. We have developed techniques for contaminant detection in both canned and bagged products. This paper gives an overview of work undertaken by Industrial Research Limited on the development of automated machine vision techniques for the inspection of food products for contaminants. Our recent work has concentrated on the development of more generic techniques for detecting contaminants in a wide range of continuously produced products, with no requirement for product singulation. A particular emphasis in our work has been the real-time detection of contaminants appearing indistinctly in x-ray images in the presence of noise and major product variability.

  6. Automated detection of heuristics and biases among pathologists in a computer-based system.

    PubMed

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-08-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases. PMID:22618855

  7. Automated detection of cerebral microbleeds in patients with Traumatic Brain Injury.

    PubMed

    van den Heuvel, T L A; van der Eerden, A W; Manniesing, R; Ghafoorian, M; Tan, T; Andriessen, T M J C; Vande Vyvere, T; van den Hauwe, L; Ter Haar Romeny, B M; Goraj, B M; Platel, B

    2016-01-01

    In this paper a Computer Aided Detection (CAD) system is presented to automatically detect Cerebral Microbleeds (CMBs) in patients with Traumatic Brain Injury (TBI). It is believed that the presence of CMBs has clinical prognostic value in TBI patients. To study the contribution of CMBs in patient outcome, accurate detection of CMBs is required. Manual detection of CMBs in TBI patients is a time consuming task that is prone to errors, because CMBs are easily overlooked and are difficult to distinguish from blood vessels. This study included 33 TBI patients. Because of the laborious nature of manually annotating CMBs, only one trained expert manually annotated the CMBs in all 33 patients. A subset of ten TBI patients was annotated by six experts. Our CAD system makes use of both Susceptibility Weighted Imaging (SWI) and T1 weighted magnetic resonance images to detect CMBs. After pre-processing these images, a two-step approach was used for automated detection of CMBs. In the first step, each voxel was characterized by twelve features based on the dark and spherical nature of CMBs and a random forest classifier was used to identify CMB candidate locations. In the second step, segmentations were made from each identified candidate location. Subsequently an object-based classifier was used to remove false positive detections of the voxel classifier, by considering seven object-based features that discriminate between spherical objects (CMBs) and elongated objects (blood vessels). A guided user interface was designed for fast evaluation of the CAD system result. During this process, an expert checked each CMB detected by the CAD system. A Fleiss' kappa value of only 0.24 showed that the inter-observer variability for the TBI patients in this study was very large. An expert using the guided user interface reached an average sensitivity of 93%, which was significantly higher (p = 0.03) than the average sensitivity of 77% (sd 12.4%) that the six experts manually detected
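
    A schematic two-step sketch in the spirit of the CAD pipeline above: a voxel-level random forest proposes candidate locations, and a crude object-level shape test removes elongated (vessel-like) objects. All variable names, features, and thresholds are placeholders for illustration only.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from scipy import ndimage

      # Step 1: voxel-level screening.  train_voxel_features / train_voxel_labels /
      # test_voxel_features / swi_volume are placeholders for the per-voxel
      # descriptors, expert annotations, and the SWI volume being screened.
      voxel_clf = RandomForestClassifier(n_estimators=200)
      voxel_clf.fit(train_voxel_features, train_voxel_labels)
      prob = voxel_clf.predict_proba(test_voxel_features)[:, 1]
      candidate_mask = (prob > 0.5).reshape(swi_volume.shape)

      # Step 2: object-level false-positive reduction.  Label connected candidate
      # regions and keep only compact, roughly spherical objects; elongated objects
      # are more likely blood vessels than microbleeds.
      labels, n_objects = ndimage.label(candidate_mask)
      kept = []
      for obj in ndimage.find_objects(labels):
          extents = [s.stop - s.start for s in obj]
          if max(extents) <= 2 * min(extents):   # crude sphericity test
              kept.append(obj)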

  9. Automatic change detection using mobile laser scanning

    NASA Astrophysics Data System (ADS)

    Hebel, M.; Hammer, M.; Gordon, M.; Arens, M.

    2014-10-01

    Automatic change detection in 3D environments requires the comparison of multi-temporal data. By comparing current data with past data of the same area, changes can be automatically detected and identified. Volumetric changes in the scene hint at suspicious activities such as the movement of military vehicles, the application of camouflage nets, or the placement of IEDs. In contrast to broad research activities in remote sensing with optical cameras, this paper addresses the topic using 3D data acquired by mobile laser scanning (MLS). We present a framework for immediate comparison of current MLS data to given 3D reference data. Our method extends the concept of occupancy grids known from robot mapping, which incorporates the sensor positions in the processing of the 3D point clouds. This allows the information included in the data acquisition geometry to be extracted. For each single range measurement, an object reflects the laser pulse at the measured range, i.e., space is occupied at that 3D position, and space is empty along the line of sight between the sensor and the reflecting object. Everywhere else, the occupancy of space remains unknown. This approach handles occlusions and changes implicitly, such that the latter are identifiable by conflicts between empty space and occupied space. The presented concept of change detection has been successfully validated in experiments with recorded MLS data streams. Results are shown for test sites at which MLS data were acquired at different time intervals.
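
    A toy 2D version of this occupancy-grid bookkeeping, purely for illustration (the paper works with full 3D grids and sensor trajectories); the grid resolution, the ray-sampling step, and the variable names ref and cur are assumptions.

      import numpy as np

      UNKNOWN, EMPTY, OCCUPIED = 0, 1, 2

      def insert_scan(grid, sensor_rc, hits_rc):
          # grid: 2D array initialised to UNKNOWN; coordinates given in grid cells.
          # Each range measurement marks its end cell OCCUPIED and every cell along
          # the line of sight from the sensor EMPTY.
          s = np.asarray(sensor_rc, dtype=float)
          for hit in hits_rc:
              h = np.asarray(hit, dtype=float)
              n = int(np.ceil(np.linalg.norm(h - s))) * 2 + 1
              for t in np.linspace(0.0, 1.0, n)[:-1]:
                  r, c = np.round(s + t * (h - s)).astype(int)
                  grid[r, c] = EMPTY
              grid[int(round(h[0])), int(round(h[1]))] = OCCUPIED
          return grid

      # Changes are conflicts between epochs: occupied where the reference grid saw
      # empty space, or empty where it saw an object; UNKNOWN cells never conflict.
      changes = ((ref == EMPTY) & (cur == OCCUPIED)) | ((ref == OCCUPIED) & (cur == EMPTY))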

  10. Total least squares for anomalous change detection

    NASA Astrophysics Data System (ADS)

    Theiler, James; Matsekh, Anna M.

    2010-04-01

    A family of subtraction-based anomalous change detection algorithms is derived from a total least squares (TLSQ) framework. This provides an alternative to the well-known chronochrome algorithm, which is derived from ordinary least squares. In both cases, the most anomalous changes are identified with the pixels that exhibit the largest residuals with respect to the regression of the two images against each other. The family of TLSQ-based anomalous change detectors is shown to be equivalent to the subspace RX formulation for straight anomaly detection, but applied to the stacked space. However, this family is not invariant to linear coordinate transforms. On the other hand, whitened TLSQ is coordinate invariant, and special cases of it are equivalent to canonical correlation analysis and optimized covariance equalization. What whitened TLSQ offers is a generalization of these algorithms with the potential for better performance.
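
    For reference, the ordinary-least-squares baseline (the chronochrome) that this family generalizes can be sketched in a few lines: regress one image on the other over the whole scene and score each pixel by its residual. This is an illustrative stand-in only; the TLSQ and whitened-TLSQ variants of the paper are not shown.

      import numpy as np

      def chronochrome_residuals(img_x, img_y):
          # Per-pixel spectral vectors; regress image Y on image X by ordinary least
          # squares over the whole scene, then score each pixel by the squared
          # magnitude of its regression residual.
          rows, cols = img_x.shape[:2]
          X = img_x.reshape(rows * cols, -1).astype(float)
          Y = img_y.reshape(rows * cols, -1).astype(float)
          X -= X.mean(axis=0)
          Y -= Y.mean(axis=0)
          L, *_ = np.linalg.lstsq(X, Y, rcond=None)   # OLS operator mapping X to Y
          resid = Y - X @ L
          return np.sum(resid**2, axis=1).reshape(rows, cols)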

  11. Performance evaluation of an automated single-channel sleep–wake detection algorithm

    PubMed Central

    Kaplan, Richard F; Wang, Ying; Loparo, Kenneth A; Kelly, Monica R; Bootzin, Richard R

    2014-01-01

    Background A need exists, from both a clinical and a research standpoint, for objective sleep measurement systems that are both easy to use and can accurately assess sleep and wake. This study evaluates the output of an automated sleep–wake detection algorithm (Z-ALG) used in the Zmachine (a portable, single-channel, electroencephalographic [EEG] acquisition and analysis system) against laboratory polysomnography (PSG) using a consensus of expert visual scorers. Methods Overnight laboratory PSG studies from 99 subjects (52 females/47 males, 18–60 years, median age 32.7 years), including both normal sleepers and those with a variety of sleep disorders, were assessed. PSG data obtained from the differential mastoids (A1–A2) were assessed by Z-ALG, which determines sleep versus wake every 30 seconds using low-frequency, intermediate-frequency, and high-frequency and time domain EEG features. PSG data were independently scored by two to four certified PSG technologists, using standard Rechtschaffen and Kales guidelines, and these score files were combined on an epoch-by-epoch basis, using a majority voting rule, to generate a single score file per subject to compare against the Z-ALG output. Both epoch-by-epoch and standard sleep indices (eg, total sleep time, sleep efficiency, latency to persistent sleep, and wake after sleep onset) were compared between the Z-ALG output and the technologist consensus score files. Results Overall, the sensitivity and specificity for detecting sleep using the Z-ALG as compared to the technologist consensus are 95.5% and 92.5%, respectively, across all subjects, and the positive predictive value and the negative predictive value for detecting sleep are 98.0% and 84.2%, respectively. Overall κ agreement is 0.85 (approaching the level of agreement observed among sleep technologists). These results persist when the sleep disorder subgroups are analyzed separately. Conclusion This study demonstrates that the Z-ALG automated sleep

  12. Enhanced pulsar and single pulse detection via automated radio frequency interference detection in multipixel feeds

    NASA Astrophysics Data System (ADS)

    Kocz, J.; Bailes, M.; Barnes, D.; Burke-Spolaor, S.; Levin, L.

    2012-02-01

    Single pixel feeds on large aperture radio telescopes have the ability to detect weak (~10 mJy) impulsive bursts of radio emission and sub-mJy radio pulsars. Unfortunately, in large-scale blind surveys, radio frequency interference (RFI) mimics both radio bursts and radio pulsars, greatly reducing the sensitivity to new discoveries as real signals of astronomical origin get lost among the millions of false candidates. In this paper a technique that takes advantage of multipixel feeds to use eigenvector decomposition of common signals is used to greatly facilitate radio burst and pulsar discovery. Since the majority of RFI occurs with zero dispersion, the method was tested on the total power present in the 13 beams of the Parkes multibeam receiver using data from archival intermediate-latitude surveys. The implementation of this method greatly reduced the number of false candidates and led to the discovery of one new rotating radio transient or RRAT, six new pulsars and five new pulses that shared the swept-frequency characteristics similar in nature to the 'Lorimer burst'. These five new signals occurred within minutes of 11 previous detections of a similar type. When viewed together, they display temporal characteristics related to integer seconds, with non-random distributions and characteristic 'gaps' between them, suggesting they are not from a naturally occurring source. Despite the success in removing RFI, false candidates present in the data that are only visible after integrating in time or at non-zero dispersion remained. It is demonstrated that with some computational penalty, the method can be applied iteratively at all trial dispersions and time resolutions to remove the vast majority of spurious candidates.
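
    The common-signal eigenvector idea can be illustrated on the zero-dispersion total-power time series of the individual beams: broadband RFI appears in many beams at once and therefore dominates the leading eigenvectors of the beam covariance, which can be projected out. The function below is a simplified sketch, not the survey pipeline.

      import numpy as np

      def remove_common_mode(beams, n_remove=1):
          # beams: array of shape (n_beams, n_samples) of total power from each
          # pixel of the multibeam receiver.  Projecting out the leading
          # eigenvectors removes signals common to many beams (likely RFI) while
          # leaving signals confined to one or a few beams (candidate pulses).
          X = beams - beams.mean(axis=1, keepdims=True)
          w, V = np.linalg.eigh(np.cov(X))   # eigenvalues in ascending order
          common = V[:, -n_remove:]          # leading (largest-eigenvalue) vectors
          return X - common @ (common.T @ X)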

  13. Machine Learning Techniques for the Detection of Shockable Rhythms in Automated External Defibrillators

    PubMed Central

    Irusta, Unai; Morgado, Eduardo; Aramendi, Elisabete; Ayala, Unai; Wik, Lars; Kramer-Johansen, Jo; Eftestøl, Trygve; Alonso-Atienza, Felipe

    2016-01-01

    Early recognition of ventricular fibrillation (VF) and electrical therapy are key for the survival of out-of-hospital cardiac arrest (OHCA) patients treated with automated external defibrillators (AED). AED algorithms for VF detection are customarily assessed using Holter recordings from public electrocardiogram (ECG) databases, which may be different from the ECG seen during OHCA events. This study evaluates VF detection using data from both OHCA patients and public Holter recordings. ECG segments of 4-s and 8-s duration were analyzed. For each segment, 30 features were computed and fed to state-of-the-art machine learning (ML) algorithms. ML algorithms with built-in feature selection capabilities were used to determine the optimal feature subsets for both databases. Patient-wise bootstrap techniques were used to evaluate algorithm performance in terms of sensitivity (Se), specificity (Sp) and balanced error rate (BER). Performance was significantly better for public data, with a mean Se of 96.6%, Sp of 98.8% and BER of 2.2%, compared to a mean Se of 94.7%, Sp of 96.5% and BER of 4.4% for OHCA data. OHCA data required twice as many features as the data from public databases for accurate detection (6 vs 3). No significant differences in performance were found for different segment lengths; the BER differences were below 0.5 points in all cases. Our results show that VF detection is more challenging for OHCA data than for data from public databases, and that accurate VF detection is possible with segments as short as 4 s. PMID:27441719

  14. Machine Learning Techniques for the Detection of Shockable Rhythms in Automated External Defibrillators.

    PubMed

    Figuera, Carlos; Irusta, Unai; Morgado, Eduardo; Aramendi, Elisabete; Ayala, Unai; Wik, Lars; Kramer-Johansen, Jo; Eftestøl, Trygve; Alonso-Atienza, Felipe

    2016-01-01

    Early recognition of ventricular fibrillation (VF) and electrical therapy are key for the survival of out-of-hospital cardiac arrest (OHCA) patients treated with automated external defibrillators (AED). AED algorithms for VF detection are customarily assessed using Holter recordings from public electrocardiogram (ECG) databases, which may be different from the ECG seen during OHCA events. This study evaluates VF detection using data from both OHCA patients and public Holter recordings. ECG segments of 4-s and 8-s duration were analyzed. For each segment, 30 features were computed and fed to state-of-the-art machine learning (ML) algorithms. ML algorithms with built-in feature selection capabilities were used to determine the optimal feature subsets for both databases. Patient-wise bootstrap techniques were used to evaluate algorithm performance in terms of sensitivity (Se), specificity (Sp) and balanced error rate (BER). Performance was significantly better for public data, with a mean Se of 96.6%, Sp of 98.8% and BER of 2.2%, compared to a mean Se of 94.7%, Sp of 96.5% and BER of 4.4% for OHCA data. OHCA data required twice as many features as the data from public databases for accurate detection (6 vs 3). No significant differences in performance were found for different segment lengths; the BER differences were below 0.5 points in all cases. Our results show that VF detection is more challenging for OHCA data than for data from public databases, and that accurate VF detection is possible with segments as short as 4 s. PMID:27441719
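
    A sketch of the patient-wise bootstrap evaluation described in these records, assuming NumPy arrays of per-segment labels and predictions plus a patient identifier per segment; the number of resamples and confidence levels are arbitrary choices, and both classes are assumed to be present in every resample.

      import numpy as np

      def patientwise_bootstrap(y_true, y_pred, patient_ids, n_boot=1000, seed=0):
          # Resample patients (not segments) with replacement, then pool the
          # segments of the sampled patients to estimate Se, Sp and BER.
          rng = np.random.default_rng(seed)
          patients = np.unique(patient_ids)
          stats = []
          for _ in range(n_boot):
              sample = rng.choice(patients, size=len(patients), replace=True)
              idx = np.concatenate([np.flatnonzero(patient_ids == p) for p in sample])
              t, p = y_true[idx], y_pred[idx]
              se = np.mean(p[t == 1] == 1)            # sensitivity
              sp = np.mean(p[t == 0] == 0)            # specificity
              stats.append((se, sp, 1.0 - 0.5 * (se + sp)))   # BER
          stats = np.array(stats)
          return stats.mean(axis=0), np.percentile(stats, [2.5, 97.5], axis=0)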

  15. Improved Detection of Hepatitis B Virus Surface Antigen by a New Rapid Automated Assay

    PubMed Central

    Weber, Bernard; Bayer, Anja; Kirch, Peter; Schlüter, Volker; Schlieper, Dietmar; Melchior, Walter

    1999-01-01

    The performance of hepatitis B virus (HBV) surface antigen (HBsAg) screening assays is continuously improved in order to reduce the residual risk of transfusion-associated hepatitis B. In a multicenter study, a new automated rapid screening assay, Elecsys HBsAg (Roche Diagnostics), was compared to well-established tests (Auszyme Monoclonal [overnight incubation] version B and IMx HBsAg [Abbott]). Included in the evaluation were 23 seroconversion panels; sera from the acute and chronic phases of infection; dilution series of various HBsAg standards, HBV subtypes, and S gene mutants; and isolated anti-HBV core antigen-positive samples. To challenge the specificity of the new assay, sera from HBsAg-negative blood donors, pregnant women, and dialysis and hospitalized patients and potentially cross-reactive samples were investigated. Elecsys HBsAg showed a higher sensitivity for HBsAg subtypes ad, ay, adw2, adw4, ayw1, ayw2, ayw4, and adr detection in dilution series of different standards or sera than Auszyme Monoclonal version B and/or IMx HBsAg. Acute hepatitis B was detected in 11 to 16 of 23 seroconversion panels between 2 and 16 days earlier with Elecsys HBsAg than with the alternative assays. Elecsys HBsAg and Auszyme Monoclonal version B detected HBsAg surface mutants with equal sensitivity. The sensitivity and specificity of Elecsys HBsAg were 100%. Auszyme Monoclonal version B had a 99.9% specificity, and its sensitivity was 96.6%. IMx HBsAg showed a poorer sensitivity and specificity than the other assays. In conclusion, Elecsys HBsAg permits earlier detection of acute hepatitis B and different HBV subtypes than the alternative assays. By using highly sensitive HBsAg screening assays, low-level HBsAg carriers among isolated anti-HBV core antigen-positive individuals can be detected. PMID:10405414

  16. Olfactory processing: detection of rapid changes.

    PubMed

    Croy, Ilona; Krone, Franziska; Walker, Susannah; Hummel, Thomas

    2015-06-01

    Changes in the olfactory environment have a rather poor chance of being detected. The aim of the present study was to determine whether the same (cued) or different (uncued) odors can generally be detected at short interstimulus intervals (ISIs) below 2.5 s. Furthermore, we investigated whether inhibition of return, an attentional phenomenon facilitating the detection of new stimuli at longer ISIs, is present in the domain of olfaction. Thirteen normosmic people (3 men, 10 women; age range 19-27 years; mean age 23 years) participated. Stimulation was performed using air-dilution olfactometry with 2 odors: phenylethyl alcohol and hydrogen disulfide. Reaction time to target stimuli was assessed in cued and uncued conditions at ISIs of 1, 1.5, 2, and 2.5 s. There was a significant main effect of ISI, indicating that odors presented only 1 s apart are missed frequently. Uncued presentation facilitated detection at short ISIs, implying that changes in the olfactory environment are detected better than repeated presentation of the same odor. Effects in relation to "olfactory inhibition of return," on the other hand, are not supported by our results. This suggests that attention works differently for the olfactory system compared with the visual and auditory systems.

  17. Reproducibility of In Vivo Corneal Confocal Microscopy Using an Automated Analysis Program for Detection of Diabetic Sensorimotor Polyneuropathy

    PubMed Central

    Ostrovski, Ilia; Lovblom, Leif E.; Farooqi, Mohammed A.; Scarr, Daniel; Boulet, Genevieve; Hertz, Paul; Wu, Tong; Halpern, Elise M.; Ngo, Mylan; Ng, Eduardo; Orszag, Andrej; Bril, Vera; Perkins, Bruce A.

    2015-01-01

    Objective In vivo Corneal Confocal Microscopy (IVCCM) is a validated, non-invasive test for diabetic sensorimotor polyneuropathy (DSP) detection, but its utility is limited by the image analysis time and expertise required. We aimed to determine the inter- and intra-observer reproducibility of a novel automated analysis program compared to manual analysis. Methods In a cross-sectional diagnostic study, 20 non-diabetes controls (mean age 41.4±17.3y, HbA1c 5.5±0.4%) and 26 participants with type 1 diabetes (42.8±16.9y, 8.0±1.9%) underwent two separate IVCCM examinations by one observer and a third by an independent observer. Along with nerve density and branch density, corneal nerve fibre length (CNFL) was obtained by manual analysis (CNFL-Manual), a protocol in which images were manually selected for automated analysis (CNFL-Semi-Automated), and one in which selection and analysis were performed electronically (CNFL-Fully-Automated). Reproducibility of each protocol was determined using intraclass correlation coefficients (ICC) and, as a secondary objective, the method of Bland and Altman was used to explore agreement between protocols. Results Mean CNFL-Manual was 16.7±4.0, 13.9±4.2 mm/mm² for non-diabetes controls and diabetes participants, while CNFL-Semi-Automated was 10.2±3.3, 8.6±3.0 mm/mm² and CNFL-Fully-Automated was 12.5±2.8, 10.9±2.9 mm/mm². Inter-observer ICC and 95% confidence intervals (95%CI) were 0.73(0.56, 0.84), 0.75(0.59, 0.85), and 0.78(0.63, 0.87), respectively (p = NS for all comparisons). Intra-observer ICC and 95%CI were 0.72(0.55, 0.83), 0.74(0.57, 0.85), and 0.84(0.73, 0.91), respectively (p<0.05 for CNFL-Fully-Automated compared to others). The other IVCCM parameters had substantially lower ICC compared to those for CNFL. CNFL-Semi-Automated and CNFL-Fully-Automated underestimated CNFL-Manual by mean and 95%CI of 35.1(-4.5, 67.5)% and 21.0(-21.6, 46.1)%, respectively. Conclusions Despite an apparent measurement (underestimation) bias in

  18. Automated Immunomagnetic Separation and Microarray Detection of E. coli O157:H7 from Poultry Carcass Rinse

    SciTech Connect

    Chandler, Darrell P.; Brown, Jeremy D.; Call, Douglas R.; Wunschel, Sharon C.; Grate, Jay W.; Holman, David A.; Olson, Lydia G.; Stottlemyer, Mark S.; Bruckner-Lea, Cindy J.

    2001-09-01

    We describe the development and application of a novel electromagnetic flow cell and fluidics system for automated immunomagnetic separation of E. coli directly from unprocessed poultry carcass rinse, and the biochemical coupling of automated sample preparation with nucleic acid microarrays without cell growth. Highly porous nickel foam was used as a magnetic flux conductor. Up to 32% recovery efficiency of 'total' E. coli was achieved within the automated system with 6 sec contact times and a 15 minute protocol (from sample injection through elution), statistically similar to cell recovery efficiencies in > 1 hour 'batch' captures. The electromagnet flow cell allowed complete recovery of 2.8 µm particles directly from unprocessed poultry carcass rinse, whereas the batch system did not. O157:H7 cells were reproducibly isolated directly from unprocessed poultry rinse with 39% recovery efficiency at a 10³ cells ml⁻¹ inoculum. Direct plating of washed beads showed positive recovery of O157:H7 directly from carcass rinse at an inoculum of 10 cells ml⁻¹. Recovered beads were used for direct PCR amplification and microarray detection, with a process-level detection limit (automated cell concentration through microarray detection) of < 10³ cells ml⁻¹ carcass rinse. The fluidic system and analytical approach described here are generally applicable to most microbial detection problems and applications.

  19. Change detection based on integration of multi-scale mixed-resolution information

    NASA Astrophysics Data System (ADS)

    Wei, Li; Wang, Cheng; Wen, Chenglu

    2016-03-01

    In this paper, a new method of unsupervised change detection is proposed by modeling a multi-scale change detector based on local mixed information, and a method of automated thresholding is presented. A theoretical analysis demonstrates that more comprehensive information is taken into account by the integration of multi-scale information. The ROC curves show that the change detector based on multi-scale mixed information (MSM) is more effective than the one based on mixed information (MIX) alone. Experiments on artificial and real-world datasets indicate that multi-scale change detection with mixed information can eliminate pseudo-change areas. Therefore, the proposed MSM algorithm is an effective method for change detection applications.
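
    As a rough illustration of combining differences across scales with an automated threshold, the sketch below averages smoothed absolute differences over several Gaussian scales and thresholds the result from its own statistics; the scales and the mean-plus-k-sigma rule are assumptions and do not reproduce the paper's MSM detector.

      import numpy as np
      from scipy import ndimage

      def multiscale_change(img1, img2, scales=(1, 2, 4), k=1.0):
          # Average absolute differences computed after Gaussian smoothing at
          # several scales, then threshold automatically from the statistics of
          # the fused difference image (mean + k * std, an assumed rule).
          diff = np.zeros(img1.shape, dtype=float)
          for s in scales:
              a = ndimage.gaussian_filter(img1.astype(float), s)
              b = ndimage.gaussian_filter(img2.astype(float), s)
              diff += np.abs(a - b)
          diff /= len(scales)
          thresh = diff.mean() + k * diff.std()
          return diff > thresh    # boolean change mask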

  20. Total least squares for anomalous change detection

    SciTech Connect

    Theiler, James P; Matsekh, Anna M

    2010-01-01

    A family of difference-based anomalous change detection algorithms is derived from a total least squares (TLSQ) framework. This provides an alternative to the well-known chronochrome algorithm, which is derived from ordinary least squares. In both cases, the most anomalous changes are identified with the pixels that exhibit the largest residuals with respect to the regression of the two images against each other. The family of TLSQ-based anomalous change detectors is shown to be equivalent to the subspace RX formulation for straight anomaly detection, but applied to the stacked space. However, this family is not invariant to linear coordinate transforms. On the other hand, whitened TLSQ is coordinate invariant, and furthermore it is shown to be equivalent to the optimized covariance equalization algorithm. What whitened TLSQ offers, in addition to connecting with a common language the derivations of two of the most popular anomalous change detection algorithms - chronochrome and covariance equalization - is a generalization of these algorithms with the potential for better performance.
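
    For readers unfamiliar with the stacked-space view, the sketch below is a simplified illustration (not the authors' code) of scoring anomalous change from the joint statistics of two co-registered images: per-pixel spectra are stacked, the joint covariance is eigen-decomposed, and anomalousness is taken as the Mahalanobis-weighted energy in the minor eigen-directions, i.e. the residual left after the dominant joint structure between the two images is explained.

        import numpy as np

        def stacked_minor_residual(img_x, img_y, n_minor):
            # img_x, img_y: (rows, cols, bands) co-registered images; returns an anomalousness map
            X = img_x.reshape(-1, img_x.shape[-1]).astype(float)
            Y = img_y.reshape(-1, img_y.shape[-1]).astype(float)
            Z = np.hstack([X, Y])                     # stacked per-pixel vectors
            Z -= Z.mean(axis=0)
            cov = np.cov(Z, rowvar=False)
            evals, evecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
            minor = evecs[:, :n_minor]                # directions with the least joint variance
            proj = Z @ minor
            score = np.sum(proj ** 2 / evals[:n_minor], axis=1)
            return score.reshape(img_x.shape[:2])

    Whitening each image before stacking, as in the whitened TLSQ variant discussed above, would make such a score invariant to linear transforms of either image.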

  1. Performance Evaluation of an Automated ELISA System for Alzheimer's Disease Detection in Clinical Routine.

    PubMed

    Chiasserini, Davide; Biscetti, Leonardo; Farotti, Lucia; Eusebi, Paolo; Salvadori, Nicola; Lisetti, Viviana; Baschieri, Francesca; Chipi, Elena; Frattini, Giulia; Stoops, Erik; Vanderstichele, Hugo; Calabresi, Paolo; Parnetti, Lucilla

    2016-07-22

    The variability of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers undermines their full-fledged introduction into routine diagnostics and clinical trials. Automation may help to increase precision and decrease operator errors, eventually improving the diagnostic performance. Here we evaluated three new CSF immunoassays, EUROIMMUN™ amyloid-β 1-40 (Aβ1-40), amyloid-β 1-42 (Aβ1-42), and total tau (t-tau), in combination with automated analysis of the samples. The CSF biomarkers were measured in a cohort consisting of AD patients (n = 28), mild cognitive impairment (MCI, n = 77), and neurological controls (OND, n = 35). MCI patients were evaluated yearly and cognitive functions were assessed by Mini-Mental State Examination. The patients clinically diagnosed with AD and MCI were classified according to the CSF biomarkers profile following NIA-AA criteria and the Erlangen score. Technical evaluation of the immunoassays was performed together with the calculation of their diagnostic performance. Furthermore, the results for EUROIMMUN Aβ1-42 and t-tau were compared to standard immunoassay methods (INNOTEST™). EUROIMMUN assays for Aβ1-42 and t-tau correlated with INNOTEST (r = 0.83, p < 0.001 for both) and allowed a similar interpretation of the CSF profiles. The Aβ1-42/Aβ1-40 ratio measured with EUROIMMUN was the best parameter for AD detection and improved the diagnostic accuracy of Aβ1-42 (area under the curve = 0.93). In MCI patients, the Aβ1-42/Aβ1-40 ratio was associated with cognitive decline and clinical progression to AD. The diagnostic performance of the EUROIMMUN assays with automation is comparable to other currently used methods. The variability of the method and the value of the Aβ1-42/Aβ1-40 ratio in AD diagnosis need to be validated in large multi-center studies. PMID:27447425

  2. Competitive SWIFT cluster templates enhance detection of aging changes

    PubMed Central

    Rebhahn, Jonathan A.; Roumanes, David R.; Qi, Yilin; Khan, Atif; Thakar, Juilee; Rosenberg, Alex; Lee, F. Eun‐Hyung; Quataert, Sally A.; Sharma, Gaurav

    2015-01-01

    Abstract Clustering‐based algorithms for automated analysis of flow cytometry datasets have achieved more efficient and objective analysis than manual processing. Clustering organizes flow cytometry data into subpopulations with substantially homogenous characteristics but does not directly address the important problem of identifying the salient differences in subpopulations between subjects and groups. Here, we address this problem by augmenting SWIFT—a mixture model based clustering algorithm reported previously. First, we show that SWIFT clustering using a “template” mixture model, in which all subpopulations are represented, identifies small differences in cell numbers per subpopulation between samples. Second, we demonstrate that resolution of inter‐sample differences is increased by “competition” wherein a joint model is formed by combining the mixture model templates obtained from different groups. In the joint model, clusters from individual groups compete for the assignment of cells, sharpening differences between samples, particularly differences representing subpopulation shifts that are masked under clustering with a single template model. The benefit of competition was demonstrated first with a semisynthetic dataset obtained by deliberately shifting a known subpopulation within an actual flow cytometry sample. Single templates correctly identified changes in the number of cells in the subpopulation, but only the competition method detected small changes in median fluorescence. In further validation studies, competition identified a larger number of significantly altered subpopulations between young and elderly subjects. This enrichment was specific, because competition between templates from consensus male and female samples did not improve the detection of age‐related differences. Several changes between the young and elderly identified by SWIFT template competition were consistent with known alterations in the elderly, and additional

  3. Automated Detection of Soma Location and Morphology in Neuronal Network Cultures

    PubMed Central

    Ozcan, Burcin; Negi, Pooran; Laezza, Fernanda; Papadakis, Manos; Labate, Demetrio

    2015-01-01

    Automated identification of the primary components of a neuron and extraction of its sub-cellular features are essential steps in many quantitative studies of neuronal networks. The focus of this paper is the development of an algorithm for the automated detection of the location and morphology of somas in confocal images of neuronal network cultures. This problem is motivated by applications in high-content screenings (HCS), where the extraction of multiple morphological features of neurons on large data sets is required. Existing algorithms are not very efficient when applied to the analysis of confocal image stacks of neuronal cultures. In addition to the usual difficulties associated with the processing of fluorescent images, these types of stacks contain a small number of images so that only a small number of pixels are available along the z-direction and it is challenging to apply conventional 3D filters. The algorithm we present in this paper applies a number of innovative ideas from the theory of directional multiscale representations and involves the following steps: (i) image segmentation based on support vector machines with specially designed multiscale filters; (ii) soma extraction and separation of contiguous somas, using a combination of level set method and directional multiscale filters. We also present an approach to extract the soma’s surface morphology using the 3D shearlet transform. Extensive numerical experiments show that our algorithms are computationally efficient and highly accurate in segmenting the somas and separating contiguous ones. The algorithms presented in this paper will facilitate the development of a high-throughput quantitative platform for the study of neuronal networks for HCS applications. PMID:25853656

  4. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells.

    PubMed

    Park, Han Sang; Rinehart, Matthew T; Walzer, Katelyn A; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that heavily relies on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors are highly statistically different between infected and uninfected cells, each descriptor does not enable separation of populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy of up to 99.7% in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity with early trophozoites most often mistaken for late trophozoite or schizont stage and late trophozoite and schizont stage most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection
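
    As a rough illustration of the classifier comparison described above (not the paper's pipeline), the Python sketch below cross-validates linear discriminant classification, logistic regression and k-nearest-neighbour classification on a placeholder feature matrix standing in for the 23 phase-derived morphological descriptors; the random data and the choice of five neighbours are assumptions made for the example.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 23))        # placeholder for 23 morphological descriptors per cell
        y = rng.integers(0, 2, size=200)      # placeholder labels: infected vs uninfected

        for name, clf in [("LDC", LinearDiscriminantAnalysis()),
                          ("LR", LogisticRegression(max_iter=1000)),
                          ("NNC", KNeighborsClassifier(n_neighbors=5))]:
            pipeline = make_pipeline(StandardScaler(), clf)
            accuracy = cross_val_score(pipeline, X, y, cv=5).mean()
            print(f"{name}: cross-validated accuracy = {accuracy:.3f}")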

  5. Deep convolutional networks for automated detection of posterior-element fractures on spine CT

    NASA Astrophysics Data System (ADS)

    Roth, Holger R.; Wang, Yinong; Yao, Jianhua; Lu, Le; Burns, Joseph E.; Summers, Ronald M.

    2016-03-01

    Injuries of the spine, and its posterior elements in particular, are a common occurrence in trauma patients, with potentially devastating consequences. Computer-aided detection (CADe) could assist in the detection and classification of spine fractures. Furthermore, CAD could help assess the stability and chronicity of fractures, as well as facilitate research into optimization of treatment paradigms. In this work, we apply deep convolutional networks (ConvNets) for the automated detection of posterior element fractures of the spine. First, the vertebra bodies of the spine with its posterior elements are segmented in spine CT using multi-atlas label fusion. Then, edge maps of the posterior elements are computed. These edge maps serve as candidate regions for predicting a set of probabilities for fractures along the image edges using ConvNets in a 2.5D fashion (three orthogonal patches in axial, coronal and sagittal planes). We explore three different methods for training the ConvNet using 2.5D patches along the edge maps of `positive', i.e. fractured posterior-elements and `negative', i.e. non-fractured elements. An experienced radiologist retrospectively marked the location of 55 displaced posterior-element fractures in 18 trauma patients. We randomly split the data into training and testing cases. In testing, we achieve an area-under-the-curve of 0.857. This corresponds to 71% or 81% sensitivities at 5 or 10 false-positives per patient, respectively. Analysis of our set of trauma patients demonstrates the feasibility of detecting posterior-element fractures in spine CT images using computer vision techniques such as deep convolutional networks.
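
    The 2.5D idea described above is easy to picture in code. The helper below is a hypothetical sketch (not the authors' implementation) that extracts the three orthogonal axial, coronal and sagittal patches around a candidate edge voxel and stacks them as channels for a ConvNet; callers are assumed to keep centers far enough from the volume borders for full patches.

        import numpy as np

        def extract_25d_patch(volume, center, half=16):
            # volume: (z, y, x) CT array; center: (z, y, x) voxel index
            z, y, x = center
            axial = volume[z, y - half:y + half, x - half:x + half]
            coronal = volume[z - half:z + half, y, x - half:x + half]
            sagittal = volume[z - half:z + half, y - half:y + half, x]
            return np.stack([axial, coronal, sagittal], axis=0)   # shape (3, 2*half, 2*half)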

  6. Automated detection and labeling of high-density EEG electrodes from structural MR images

    NASA Astrophysics Data System (ADS)

    Marino, Marco; Liu, Quanying; Brem, Silvia; Wenderoth, Nicole; Mantini, Dante

    2016-10-01

    Objective. Accurate knowledge about the positions of electrodes in electroencephalography (EEG) is very important for precise source localizations. Direct detection of electrodes from magnetic resonance (MR) images is particularly interesting, as it is possible to avoid errors of co-registration between electrode and head coordinate systems. In this study, we propose an automated MR-based method for electrode detection and labeling, particularly tailored to high-density montages. Approach. Anatomical MR images were processed to create an electrode-enhanced image in individual space. Image processing included intensity non-uniformity correction, background noise and goggles artifact removal. Next, we defined a search volume around the head where electrode positions were detected. Electrodes were identified as local maxima in the search volume and registered to the Montreal Neurological Institute standard space using an affine transformation. This allowed the matching of the detected points with the specific EEG montage template, as well as their labeling. Matching and labeling were performed by the coherent point drift method. Our method was assessed on 8 MR images collected in subjects wearing a 256-channel EEG net, using the displacement with respect to manually selected electrodes as the performance metric. Main results. The average displacement achieved by our method was significantly lower compared to alternative techniques, such as the photogrammetry technique. The maximum displacement was lower than 1 cm for more than 99% of the electrodes, which is typically considered an acceptable upper limit for errors in electrode positioning. Our method showed robustness and reliability, even in suboptimal conditions, such as in the case of net rotation, imprecisely gathered wires, electrode detachment from the head, and MR image ghosting. Significance. We showed that our method provides objective, repeatable and precise estimates of EEG electrode coordinates. We hope our work
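
    The local-maxima step lends itself to a compact sketch. The code below is an assumption-laden illustration, not the published pipeline: it flags voxels that are local maxima of the electrode-enhanced image inside a head search mask and returns their centroids as candidate electrode positions; matching and labeling against the montage template would follow separately.

        import numpy as np
        from scipy.ndimage import maximum_filter, label, center_of_mass

        def electrode_candidates(enhanced, search_mask, size=5, rel_thr=0.5):
            # voxels that equal the local maximum of their neighbourhood, inside the search volume
            peaks = (enhanced == maximum_filter(enhanced, size=size)) & search_mask
            # discard weak maxima relative to the brightest voxel in the search volume
            peaks &= enhanced > rel_thr * enhanced[search_mask].max()
            labelled, n = label(peaks)
            return np.array(center_of_mass(peaks, labelled, index=range(1, n + 1)))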

  7. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells

    PubMed Central

    Park, Han Sang; Rinehart, Matthew T.; Walzer, Katelyn A.; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that heavily relies on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors are highly statistically different between infected and uninfected cells, each descriptor does not enable separation of populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy of up to 99.7% in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity with early trophozoites most often mistaken for late trophozoite or schizont stage and late trophozoite and schizont stage most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection

  8. Automated detection of optic disk in retinal fundus images using intuitionistic fuzzy histon segmentation.

    PubMed

    Mookiah, Muthu Rama Krishnan; Acharya, U Rajendra; Chua, Chua Kuang; Min, Lim Choo; Ng, E Y K; Mushrif, Milind M; Laude, Augustinus

    2013-01-01

    The human eye is one of the most sophisticated organs, with perfectly interrelated retina, pupil, iris, cornea, lens, and optic nerve. Automatic retinal image analysis is emerging as an important screening tool for early detection of eye diseases. Uncontrolled diabetic retinopathy (DR) and glaucoma may lead to blindness. The identification of retinal anatomical regions is a prerequisite for the computer-aided diagnosis of several retinal diseases. The manual examination of the optic disk (OD) is a standard procedure used for detecting different stages of DR and glaucoma. In this article, a novel automated, reliable, and efficient OD localization and segmentation method using digital fundus images is proposed. General-purpose edge detection algorithms often fail to segment the OD due to fuzzy boundaries, inconsistent image contrast, or missing edge features. This article proposes a novel, and probably the first, method using Atanassov intuitionistic fuzzy histon (A-IFSH)-based segmentation to detect the OD in retinal fundus images. OD pixel intensity and column-wise neighborhood operation are employed to locate and isolate the OD. The method has been evaluated on 100 images comprising 30 normal, 39 glaucomatous, and 31 DR images. Our proposed method has yielded precision of 0.93, recall of 0.91, F-score of 0.92, and mean segmentation accuracy of 93.4%. We have also compared the performance of our proposed method with the Otsu and gradient vector flow (GVF) snake methods. Overall, our result shows the superiority of the proposed fuzzy segmentation technique over the other two segmentation methods. PMID:23516954

  9. Automated Mapping of Rapid Arctic Ocean Coastal Change Over Large Spans of Time and Geography

    NASA Astrophysics Data System (ADS)

    Hulslander, D.

    2012-12-01

    While climate change is global in scope, its impacts vary greatly from region to region. The dynamic Arctic Ocean coastline often shows greater sensitivity to climate change and more obvious impacts. Current longer ice-free conditions, rising sea level, thawing permafrost, and melting of larger ice bodies combine to produce extremely rapid coastal change and erosion. Anderson et al. (2009; Geology News) have measured erosion rates at sites along the Alaskan Arctic Ocean coast of 15 m per year and greater. Completely understanding coastal change in the Arctic requires mapping both current erosional regimes as well as changes in erosional rates over several decades. Studying coastal change and trends in the Arctic, however, presents several significant difficulties. The study area is enormous, with over 45,000 km of coastline; it is also one of the most remote, inaccessible, and hostile environments on Earth. Moreover, the region has little to no historical data from which to start. Thus, any study of the area must be able to construct its own baseline. Remote sensing offers the best solution given these difficulties. Spaceborne platforms allow for regular global coverage at temporal and spatial scales sufficient for mapping coastal erosion and deposition. The Landsat family of instruments (MSS, TM, and ETM) has data available as frequently as every 16 days and starting as early as 1972. The data are freely available from the USGS through earthexplorer.usgs.gov and are well calibrated both geometrically and spectrally, eliminating expensive pre-processing steps and making them analysis-ready. Finally, because manual coastline delineation of the quantity of data involved would be prohibitive in both budget and labor, an automated processing chain must be used. ENVI Feature Extraction can provide results in line with those generated by expert analysts (Hulslander, et al., 2008; GEOBIA 2008 Proceedings). Previous studies near Drew Point, Alaska have shown that feature

  10. Automated detection of remineralization in simulated enamel lesions with PS-OCT

    PubMed Central

    Lee, Robert C.; Darling, Cynthia L.; Fried, Daniel

    2014-01-01

    Previous in vitro and in vivo studies have demonstrated that polarization-sensitive optical coherence tomography (PS-OCT) can be used to nondestructively image the subsurface structure and measure the thickness of the highly mineralized transparent surface zone of caries lesions. There are structural differences between active lesions and arrested lesions, and the surface layer thickness may correlate with activity of the lesion. The purpose of this study was to develop a method that can be used to automatically detect and measure the thickness of the transparent surface layer in PS-OCT images. Automated methods of analysis were used to measure the thickness of the transparent layer and the depth of the bovine enamel lesions produced using simulated caries models that emulate demineralization in the mouth. The transparent layer thickness measured with PS-OCT correlated well with polarization light microscopy (PLM) measurements of all regions (r2=0.9213). This study demonstrates that PS-OCT can automatically detect and measure thickness of the transparent layer formed due to remineralization in simulated caries lesions. PMID:25075267

  11. Automated detection of anesthetic depth levels using chaotic features with artificial neural networks.

    PubMed

    Lalitha, V; Eswaran, C

    2007-12-01

    Monitoring the depth of anesthesia (DOA) during surgery is very important in order to avoid patients' intraoperative awareness. Since the traditional methods of assessing DOA, which involve monitoring heart rate, pupil size, sweating, etc., may vary from patient to patient depending on the type of surgery and the type of drug administered, modern methods based on the electroencephalogram (EEG) are preferred. EEG being a nonlinear signal, it is appropriate to use nonlinear chaotic parameters to identify the anesthetic depth levels. This paper discusses an automated detection method of anesthetic depth levels based on EEG recordings using nonlinear chaotic features and neural network classifiers. Three nonlinear parameters, namely correlation dimension (CD), Lyapunov exponent (LE) and Hurst exponent (HE), are used as features, and two neural network models, namely the multi-layer perceptron network (feed-forward model) and the Elman network (feedback model), are used for classification. The neural network models are trained and tested with single and multiple features derived from the chaotic parameters, and the performances are evaluated in terms of sensitivity, specificity and overall accuracy. It is found from the experimental results that the Lyapunov exponent feature with the Elman network yields an overall accuracy of 99% in detecting the anesthetic depth levels.
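
    Of the three chaotic features named above, the Hurst exponent is the simplest to show compactly. The sketch below is a generic rescaled-range (R/S) estimator applied to a placeholder signal, offered only to illustrate the kind of feature fed to the classifiers; it is not the authors' implementation, and all EEG preprocessing is omitted.

        import numpy as np

        def hurst_rs(signal, window_sizes=(16, 32, 64, 128, 256)):
            # rescaled-range (R/S) estimate of the Hurst exponent of a 1-D signal
            x = np.asarray(signal, dtype=float)
            rs = []
            for n in window_sizes:
                segments = x[: (len(x) // n) * n].reshape(-1, n)
                deviations = segments - segments.mean(axis=1, keepdims=True)
                z = np.cumsum(deviations, axis=1)
                r = z.max(axis=1) - z.min(axis=1)          # range of cumulative deviations
                s = segments.std(axis=1)                   # per-segment standard deviation
                valid = s > 0
                rs.append(np.mean(r[valid] / s[valid]))
            slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
            return slope                                   # roughly 0.5 for uncorrelated noise

        example = np.random.default_rng(0).standard_normal(4096)
        print("Hurst exponent:", round(hurst_rs(example), 2))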

  12. Automated detection of perturbed cardiac physiology during oral food allergen challenge in children.

    PubMed

    Twomey, N; Temko, A; Hourihane, J O'B; Marnane, W P

    2014-05-01

    This paper investigates the fully automated computer-based detection of allergic reaction in oral food challenges using pediatric ECG signals. Nonallergic background is modeled using a mixture of Gaussians during oral food challenges, and the model likelihoods are used to determine whether a subject is allergic to a food type. The system performance is assessed on the dataset of 24 children (15 allergic and 9 nonallergic) totaling 34 h of data. The proposed detector correctly classified all nonallergic subjects (100% specificity) and 12 allergic subjects (80% sensitivity) and is capable of detecting allergy on average 17 min earlier than trained clinicians during oral food challenges, the gold standard of allergy diagnosis. Inclusion of the developed allergy classification platform during oral food challenges recorded would result in a 30% reduction of doses administered to allergic subjects. The results of study introduce the possibility to halt challenges earlier which can safely advance the state of clinical art of allergy diagnosis by reducing the overall exposure to the allergens.
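
    The detection principle described above (a background model of nonallergic physiology whose likelihood falls when a reaction perturbs the signal) can be sketched generically. The snippet below fits scikit-learn's GaussianMixture to placeholder per-window heart-rate features; the feature choice, the number of components and the decision threshold are assumptions made for the illustration, not the study's configuration.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # placeholder per-window features (e.g. mean heart rate and a variability measure)
        # taken from the nonallergic background portion of challenge recordings
        background = rng.normal(loc=[95.0, 6.0], scale=[6.0, 1.5], size=(600, 2))

        background_model = GaussianMixture(n_components=3, random_state=0).fit(background)

        def window_is_suspect(window_features, threshold=-8.0):
            # a low log-likelihood under the background model suggests departure from baseline
            log_likelihood = background_model.score_samples(window_features.reshape(1, -1))[0]
            return log_likelihood < threshold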

  13. Automated detection of remineralization in simulated enamel lesions with PS-OCT

    NASA Astrophysics Data System (ADS)

    Lee, Robert C.; Darling, Cynthia L.; Fried, Daniel

    2014-02-01

    Previous in vitro and in vivo studies have demonstrated that polarization-sensitive optical coherence tomography (PS-OCT) can be used to nondestructively image the subsurface structure and measure the thickness of the highly mineralized transparent surface zone of caries lesions. There are structural differences between active lesions and arrested lesions, and the surface layer thickness may correlate with activity of the lesion. The purpose of this study was to develop a method that can be used to automatically detect and measure the thickness of the transparent surface layer in PS-OCT images. Automated methods of analysis were used to measure the thickness of the transparent layer and the depth of the bovine enamel lesions produced using simulated caries models that emulate demineralization in the mouth. The transparent layer thickness measured with PS-OCT correlated well with polarization light microscopy (PLM) measurements of all regions (r2=0.9213). This study demonstrates that PS-OCT can automatically detect and measure thickness of the transparent layer formed due to remineralization in simulated caries lesions.

  14. Performance of the Automated Neuropsychological Assessment Metrics (ANAM) in Detecting Cognitive Impairment in Heart Failure Patients

    PubMed Central

    Xie, Susan S.; Goldstein, Carly M.; Gathright, Emily C.; Gunstad, John; Dolansky, Mary A.; Redle, Joseph; Hughes, Joel W.

    2015-01-01

    Objective Evaluate capacity of the Automated Neuropsychological Assessment Metrics (ANAM) to detect cognitive impairment (CI) in heart failure (HF) patients. Background CI is a key prognostic marker in HF. Though the most widely used cognitive screen in HF, the Mini-Mental State Examination (MMSE) is insufficiently sensitive. The ANAM has demonstrated sensitivity to cognitive domains affected by HF, but has not been assessed in this population. Methods Investigators administered the ANAM and MMSE to 57 HF patients, compared against a composite model of cognitive function. Results ANAM efficiency (p < .05) and accuracy scores (p < .001) successfully differentiated CI and non-CI. ANAM efficiency and accuracy scores classified 97.7% and 93.0% of non-CI patients, and 14.3% and 21.4% with CI, respectively. Conclusions The ANAM is more effective than the MMSE for detecting CI, but further research is needed to develop a more optimal cognitive screen for routine use in HF patients. PMID:26354858

  15. Evaluation of automated term groupings for detecting anaphylactic shock signals for drugs

    PubMed Central

    Souvignet, Julien; Declerck, Gunnar; Trombert, Béatrice; Rodrigues, Jean Marie; Jaulent, Marie-Christine; Bousquet, Cédric

    2012-01-01

    Signal detection in pharmacovigilance should take into account all terms related to a medical concept rather than a single term. We built an OWL-DL file with formal definitions of MedDRA and SNOMED-CT concepts and performed two queries, Query 1 and 2, to retrieve narrow and broad terms within the Standard MedDRA Query (SMQ) related to ‘anaphylactic shock’ and the terms from the High Level Term (HLT) grouping related to ‘anaphylaxis’. We compared values of the EB05 (EBGM) statistical test for disproportionality with 50 active ingredients randomly selected in the public version of the FDA pharmacovigilance database. Coefficient of correlation was R2 = 1.00 between Query 1 and HLT; R2 = 0.98 between Query 1 and SMQ narrow; R2 = 0.89 between Query 2 and SMQ Narrow+Broad. Generating automated groupings of terms for signal detection is feasible but requires additional efforts in modeling MedDRA terms in order to improve precision and recall of these groupings. PMID:23304363

  16. Automated detection of diagnostically relevant regions in H&E stained digital pathology slides

    NASA Astrophysics Data System (ADS)

    Bahlmann, Claus; Patel, Amar; Johnson, Jeffrey; Ni, Jie; Chekkoury, Andrei; Khurd, Parmeshwar; Kamen, Ali; Grady, Leo; Krupinski, Elizabeth; Graham, Anna; Weinstein, Ronald

    2012-03-01

    We present a computationally efficient method for analyzing H&E stained digital pathology slides with the objective of discriminating diagnostically relevant vs. irrelevant regions. Such technology is useful for several applications: (1) It can speed up computer aided diagnosis (CAD) for histopathology based cancer detection and grading by an order of magnitude through a triage-like preprocessing and pruning. (2) It can improve the response time for an interactive digital pathology workstation (which is usually dealing with several GByte digital pathology slides), e.g., through controlling adaptive compression or prioritization algorithms. (3) It can support the detection and grading workflow for expert pathologists in a semi-automated diagnosis, hereby increasing throughput and accuracy. At the core of the presented method is the statistical characterization of tissue components that are indicative for the pathologist's decision about malignancy vs. benignity, such as, nuclei, tubules, cytoplasm, etc. In order to allow for effective yet computationally efficient processing, we propose visual descriptors that capture the distribution of color intensities observed for nuclei and cytoplasm. Discrimination between statistics of relevant vs. irrelevant regions is learned from annotated data, and inference is performed via linear classification. We validate the proposed method both qualitatively and quantitatively. Experiments show a cross validation error rate of 1.4%. We further show that the proposed method can prune ~90% of the area of pathological slides while maintaining 100% of all relevant information, which allows for a speedup of a factor of 10 for CAD systems.

  17. Detecting changes during pregnancy with Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Vargis, Elizabeth; Robertson, Kesha; Al-Hendy, Ayman; Reese, Jeff; Mahadevan-Jansen, Anita

    2010-02-01

    Preterm labor is the second leading cause of neonatal mortality and leads to a myriad of complications like delayed development and cerebral palsy. Currently, there is no way to accurately predict preterm labor, making its prevention and treatment virtually impossible. While there are some at-risk patients, over half of all preterm births do not fall into any high-risk category. This study seeks to predict and prevent preterm labor by using Raman spectroscopy to detect changes in the cervix during pregnancy. Since Raman spectroscopy has been used to detect cancers in vivo in organs like the cervix and skin, it follows that spectra will change over the course of pregnancy. Previous studies have shown that fluorescence decreased during pregnancy and increased during post-partum exams to pre-pregnancy levels. We believe significant changes will occur in the Raman spectra obtained during the course of pregnancy. In this study, Raman spectra from the cervix of pregnant mice and women will be acquired. Specific changes that occur due to cervical softening or changes in hormonal levels will be observed to understand the likelihood that a female mouse or a woman will enter labor.

  18. Detecting genetic responses to environmental change.

    PubMed

    Hoffmann, Ary A; Willi, Yvonne

    2008-06-01

    Changes in environmental conditions can rapidly shift allele frequencies in populations of species with relatively short generation times. Frequency shifts might be detectable in neutral genetic markers when stressful conditions cause a population decline. However, frequency shifts that are diagnostic of specific conditions depend on isolating sets of genes that are involved in adaptive responses. Shifts at candidate loci underlying adaptive responses and DNA regions that control their expression have now been linked to evolutionary responses to pollution, global warming and other changes. Conversely, adaptive constraints, particularly in physiological traits, are recognized through DNA decay in candidate genes. These approaches help researchers and conservation managers understand the power and constraints of evolution.

  19. Detecting hydrological changes through conceptual model

    NASA Astrophysics Data System (ADS)

    Viola, Francesco; Caracciolo, Domenico; Pumo, Dario; Francipane, Antonio; Valerio Noto, Leonardo

    2015-04-01

    Natural changes and human modifications in hydrological systems coevolve and interact in a coupled and interlinked way. If, on one hand, climatic changes are stochastic, non-steady, and affect the hydrological systems, on the other hand, human-induced changes due to over-exploitation of soils and water resources modify the natural landscape, water fluxes and their partitioning. Indeed, the traditional assumption of static systems in hydrological analysis, which has been adopted for a long time, fails whenever transient climatic conditions and/or land use changes occur. Time series analysis is a way to explore environmental changes together with societal changes; unfortunately, the inability to distinguish between causes restricts the scope of this method. In order to overcome this limitation, it is possible to couple time series analysis with a suitable hydrological model, such as a conceptual hydrological model, which offers a schematization of the complex dynamics acting within a basin. Assuming that model parameters represent morphological basin characteristics and that calibration is a way to detect the hydrological signature at a specific moment, it is possible to argue that calibrating the model over different time windows could be a method for detecting potential hydrological changes. In order to test the capabilities of a conceptual model in detecting hydrological changes, this work presents different "in silico" experiments. A synthetic basin is forced with an ensemble of possible future scenarios generated with a stochastic weather generator able to simulate steady and non-steady climatic conditions. The experiments refer to a Mediterranean climate, which is characterized by marked seasonality, and consider the outcomes of the IPCC 5th report for describing climate evolution in the next century. In particular, in order to generate future climate change scenarios, a stochastic downscaling in space and time is carried out using realizations of an ensemble of General

  20. Ischemia detection from morphological QRS angle changes.

    PubMed

    Romero, Daniel; Martínez, Juan Pablo; Laguna, Pablo; Pueyo, Esther

    2016-07-01

    In this paper, an ischemia detector is presented based on the analysis of QRS-derived angles. The detector has been developed by modeling ischemic effects on the QRS angles as a gradual change with a certain transition time and assuming a Laplacian additive modeling error contaminating the angle series. Both standard and non-standard leads were used for analysis. Non-standard leads were obtained by applying the PCA technique over specific lead subsets to represent different potential locations of the ischemic zone. The performance of the proposed detector was tested over a population of 79 patients undergoing percutaneous coronary intervention in one of the major coronary arteries (LAD (n  =  25), RCA (n  =  16) and LCX (n  =  38)). The best detection performance, obtained for standard ECG leads, was achieved in the LAD group with values of sensitivity and specificity of [Formula: see text], [Formula: see text], followed by the RCA group with [Formula: see text], Sp  =  94.4 and the LCX group with [Formula: see text], [Formula: see text], notably outperforming detection based on the ST series in all cases, with the same detector structure. The timing of the detected ischemic events ranged from 30 s up to 150 s (mean  =  66.8 s) following the start of occlusion. We conclude that changes in the QRS angles can be used to detect acute myocardial ischemia. PMID:27243441

  1. Seabed change detection in challenging environments

    NASA Astrophysics Data System (ADS)

    Matthews, Cameron A.; Sternlicht, Daniel D.

    2011-06-01

    Automatic Change Detection (ACD) compares new and stored terrain images for alerting to changes occurring over time. These techniques, long used in airborne radar, are just beginning to be applied to sidescan sonar. Under the right conditions, ACD by image correlation (comparing multi-temporal image data at the pixel or parcel level) can be used to detect new objects on the seafloor. Synthetic aperture sonars (SAS), coherent sensors that produce fine-scale, range-independent resolution seafloor images, are well suited for this approach; however, dynamic seabed environments can introduce "clutter" to the process. This paper explores an ACD method that uses salience mapping in a global-to-local analysis architecture. In this method, termed Temporally Invariant Saliency (TIS), variance ratios of median-filtered repeat-pass images are used to detect new objects, while deemphasizing modest environmental or radiometric-induced changes in the background. Successful tests with repeat-pass data from two SAS systems mounted on autonomous undersea vehicles (AUV) demonstrate the feasibility of the technique.
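
    The variance-ratio test at the heart of TIS is straightforward to prototype. The sketch below is a simplified stand-in for the published global-to-local architecture: it median-filters two co-registered repeat-pass images, computes local variance in a sliding window, and returns the larger of the two variance ratios so that both appearing and disappearing objects score highly; the window sizes are arbitrary choices for the example.

        import numpy as np
        from scipy.ndimage import median_filter, uniform_filter

        def local_variance(img, win):
            mean = uniform_filter(img, size=win)
            mean_sq = uniform_filter(img * img, size=win)
            return np.maximum(mean_sq - mean * mean, 1e-12)   # clamp tiny negatives from round-off

        def variance_ratio_saliency(reference, repeat, med_size=5, win=15):
            ref = median_filter(reference.astype(float), size=med_size)
            rep = median_filter(repeat.astype(float), size=med_size)
            v_ref, v_rep = local_variance(ref, win), local_variance(rep, win)
            # large values where localized structure appears or disappears between passes
            return np.maximum(v_rep / v_ref, v_ref / v_rep)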

  2. Scene change detection based on multimodal integration

    NASA Astrophysics Data System (ADS)

    Zhu, Yingying; Zhou, Dongru

    2003-09-01

    Scene change detection is an essential step towards automatic and content-based video indexing, retrieval and browsing. In this paper, a robust scene change detection and classification approach is presented, which analyzes audio, visual and textual sources and accounts for their inter-relations and coincidence to semantically identify and classify video scenes. Audio analysis focuses on the segmentation of the audio stream into four types of semantic data: silence, speech, music and environmental sound. Further processing on speech segments aims at locating speaker changes. Video analysis partitions the visual stream into shots. Text analysis can provide a supplemental source of clues for scene classification and indexing information. We integrate the video and audio analysis results to identify video scenes and use the text information detected by video OCR technology, or derived from available transcripts, to refine scene classification. Results from single-source segmentation are in some cases suboptimal. By combining visual and aural features with the accessory text information, the scene extraction accuracy is enhanced and more semantically meaningful segmentations are obtained. Experimental results are rather promising.

  3. Development of an automated updated Selvester QRS scoring system using SWT-based QRS fractionation detection and classification.

    PubMed

    Bono, Valentina; Mazomenos, Evangelos B; Chen, Taihai; Rosengarten, James A; Acharyya, Amit; Maharatna, Koushik; Morgan, John M; Curzen, Nick

    2014-01-01

    The Selvester score is an effective means for estimating the extent of myocardial scar in a patient from low-cost ECG recordings. Automation of such a system is deemed to help implementing low-cost high-volume screening mechanisms of scar in the primary care. This paper describes, for the first time to the best of our knowledge, an automated implementation of the updated Selvester scoring system for that purpose, where fractionated QRS morphologies and patterns are identified and classified using a novel stationary wavelet transform (SWT)-based fractionation detection algorithm. This stage informs the two principal steps of the updated Selvester scoring scheme--the confounder classification and the point awarding rules. The complete system is validated on 51 ECG records of patients detected with ischemic heart disease. Validation has been carried out using manually detected confounder classes and computation of the actual score by expert cardiologists as the ground truth. Our results show that as a stand-alone system it is able to classify different confounders with 94.1% accuracy whereas it exhibits 94% accuracy in computing the actual score. When coupled with our previously proposed automated ECG delineation algorithm, that provides the input ECG parameters, the overall system shows 90% accuracy in confounder classification and 92% accuracy in computing the actual score and thereby showing comparable performance to the stand-alone system proposed here, with the added advantage of complete automated analysis without any human intervention.

  4. Automated monitoring of early neurobehavioral changes in mice following traumatic brain injury

    PubMed Central

    Qu, Wenrui; Liu, Nai-kui; Xie, Xin-min (Simon); Li, Rui; Xu, Xiao-ming

    2016-01-01

    Traumatic brain injury often causes a variety of behavioral and emotional impairments that can develop into chronic disorders. Therefore, there is a need to shift towards identifying early symptoms that can aid in the prediction of traumatic brain injury outcomes and behavioral endpoints in patients with traumatic brain injury after early interventions. In this study, we used the SmartCage system, an automated quantitative approach, to assess behavior alterations in mice during an early phase of traumatic brain injury in their home cages. Female C57BL/6 adult mice were subjected to moderate controlled cortical impact (CCI) injury. The mice then received a battery of behavioral assessments including neurological score, locomotor activity, sleep/wake states, and anxiety-like behaviors on days 1, 2, and 7 after CCI. Histological analysis was performed on day 7 after the last assessment. Spontaneous activities on days 1 and 2 after injury were significantly decreased in the CCI group. The average percentage of sleep time spent in both dark and light cycles was significantly higher in the CCI group than in the sham group. For anxiety-like behaviors, the time spent in a light compartment and the number of transitions between the dark/light compartments were significantly reduced in the CCI group compared with the sham group. In addition, the mice suffering from CCI exhibited a preference for staying in the dark compartment of a dark/light cage. The CCI mice showed reduced neurological scores and histological abnormalities, which correlated well with the automated behavioral assessments. Our findings demonstrate that the automated SmartCage system provides sensitive and objective measures for early behavior changes in mice following traumatic brain injury. PMID:27073377

  5. Topographic attributes as a guide for automated detection or highlighting of geological features

    NASA Astrophysics Data System (ADS)

    Viseur, Sophie; Le Men, Thibaud; Guglielmi, Yves

    2015-04-01

    Photogrammetry or LIDAR technology combined with photography allows geoscientists to obtain 3D high-resolution numerical representations of outcrops, generally termed Digital Outcrop Models (DOMs). For over a decade, these 3D numerical outcrops have served as support for precise and accurate interpretations of geological features such as fracture traces or planes, strata, facies mapping, etc. These interpretations have the benefit of being directly georeferenced and embedded in the 3D space. They are then easily integrated into GIS or geomodeler software for modelling the subsurface geological structures in 3D. However, numerical outcrops generally represent huge data sets that are cumbersome to manipulate and hence to interpret. This may be particularly tedious when several scales of geological features must be investigated or when geological features are very dense and imbricated. Automated tools for interpreting geological features from DOMs would therefore be a significant help in processing these kinds of data. Such technologies are commonly used for interpreting seismic or medical data. However, even though many efforts have been devoted to easily and accurately acquiring 3D topographic point clouds and photos and to visualizing accurate 3D textured DOMs, little attention has been paid to the development of algorithms for automated detection of geological structures from DOMs. The automatic detection of objects in numerical data generally assumes that signals or attributes computed from the data allow the recognition of the targeted object boundaries. The first step then consists of defining attributes that highlight the objects or their boundaries. For DOM interpretations, some authors have proposed using differential operators computed on the surface, such as normals or curvatures. These methods generally extract polylines corresponding to fracture traces or bed limits. Other approaches rely on the PCA technology to segregate different topographic planes

  6. Sink detection on tilted terrain for automated identification of glacial cirques

    NASA Astrophysics Data System (ADS)

    Prasicek, Günther; Robl, Jörg; Lang, Andreas

    2016-04-01

    Glacial cirques are morphologically distinct but complex landforms and represent a vital part of high mountain topography. Their distribution, elevation and relief are expected to hold information on (1) the extent of glacial occupation, (2) the mechanism of glacial cirque erosion, and (3) how glacial processes, in concert with periglacial processes, can limit peak altitude and mountain range height. While easily detectable to the expert's eye both in nature and on various representations of topography, their complicated nature makes them a nemesis for computer algorithms. Consequently, manual mapping of glacial cirques is commonplace in many mountain landscapes worldwide, but consistent datasets of cirque distribution and objectively mapped cirques and their morphometric attributes are lacking. Among the biggest problems for algorithm development are the complexity in shape and the great variability of cirque size. For example, glacial cirques can be rather circular or longitudinal in extent, exist as individual and composite landforms, show prominent topographic depressions or can be entirely filled with water or sediment. For these reasons, attributes like circularity, size, drainage area and topology of landform elements (e.g. a flat floor surrounded by steep walls) have only a limited potential for automated cirque detection. Here we present a novel geomorphometric method for automated identification of glacial cirques on digital elevation models that exploits their genetic bowl-like shape. First, we differentiate between glacial and fluvial terrain employing an algorithm based on a moving window approach and multi-scale curvature, which is also capable of fitting the analysis window to valley width. We then fit a plane to the valley stretch clipped by the analysis window and rotate the terrain around the center cell until the plane is level. Doing so, we produce sinks of considerable size if the clipped terrain represents a cirque, while no or only very small sinks
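
    The plane-fit-and-level step maps naturally onto a short script. The sketch below is an approximation of that idea, not the authors' algorithm: the rotation is replaced by simply removing the fitted plane, and closed depressions in the leveled window are then measured by morphological depression filling, so a cirque-like bowl leaves a sizeable positive residual.

        import numpy as np
        from skimage.morphology import reconstruction

        def level_window(dem_window):
            # least-squares plane fit z ~ a*x + b*y + c, then remove it (stand-in for rotating level)
            rows, cols = dem_window.shape
            yy, xx = np.mgrid[0:rows, 0:cols]
            A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(rows * cols)])
            coeffs, *_ = np.linalg.lstsq(A, dem_window.ravel(), rcond=None)
            return dem_window - (A @ coeffs).reshape(rows, cols)

        def sink_depth(dem_window):
            # depth of closed depressions in the leveled window (large values suggest a cirque)
            leveled = level_window(dem_window.astype(float))
            seed = leveled.copy()
            seed[1:-1, 1:-1] = leveled.max()            # flood from the border inward
            filled = reconstruction(seed, leveled, method='erosion')
            return filled - leveled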

  7. Automated Indexing of Internet Stories for Health Behavior Change: Weight Loss Attitude Pilot Study

    PubMed Central

    Manuvinakurike, Ramesh; Velicer, Wayne F

    2014-01-01

    Background Automated health behavior change interventions show promise, but suffer from high attrition and disuse. The Internet abounds with thousands of personal narrative accounts of health behavior change that could not only provide useful information and motivation for others who are also trying to change, but an endless source of novel, entertaining stories that may keep participants more engaged than messages authored by interventionists. Objective Given a collection of relevant personal health behavior change stories gathered from the Internet, the aim of this study was to develop and evaluate an automated indexing algorithm that could select the best possible story to provide to a user to have the greatest possible impact on their attitudes toward changing a targeted health behavior, in this case weight loss. Methods An indexing algorithm was developed using features informed by theories from behavioral medicine together with text classification and machine learning techniques. The algorithm was trained using a crowdsourced dataset, then evaluated in a 2×2 between-subjects randomized pilot study. One factor compared the effects of participants reading 2 indexed stories vs 2 randomly selected stories, whereas the second factor compared the medium used to tell the stories: text or animated conversational agent. Outcome measures included changes in self-efficacy and decisional balance for weight loss before and after the stories were read. Results Participants were recruited from a crowdsourcing website (N=103; 53.4%, 55/103 female; mean age 35, SD 10.8 years; 65.0%, 67/103 precontemplation; 19.4%, 20/103 contemplation for weight loss). Participants who read indexed stories exhibited a significantly greater increase in self-efficacy for weight loss compared to the control group (F 1,107=5.5, P=.02). There were no significant effects of indexing on change in decisional balance (F 1,97=0.05, P=.83) and no significant effects of medium on change in self

  8. Time series change detection: Algorithms for land cover change

    NASA Astrophysics Data System (ADS)

    Boriah, Shyam

    can be used for decision making and policy planning purposes. In particular, previous change detection studies have primarily relied on examining differences between two or more satellite images acquired on different dates. Thus, a technological solution that detects global land cover change using high temporal resolution time series data will represent a paradigm-shift in the field of land cover change studies. To realize these ambitious goals, a number of computational challenges in spatio-temporal data mining need to be addressed. Specifically, analysis and discovery approaches need to be cognizant of climate and ecosystem data characteristics such as seasonality, non-stationarity/inter-region variability, multi-scale nature, spatio-temporal autocorrelation, high-dimensionality and massive data size. This dissertation, a step in that direction, translates earth science challenges to computer science problems, and provides computational solutions to address these problems. In particular, three key technical capabilities are developed: (1) Algorithms for time series change detection that are effective and can scale up to handle the large size of earth science data; (2) Change detection algorithms that can handle large numbers of missing and noisy values present in satellite data sets; and (3) Spatio-temporal analysis techniques to identify the scale and scope of disturbance events.

  9. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    NASA Astrophysics Data System (ADS)

    Girolamo, D.; Girolamo, L.; Yuan, F. G.

    2015-03-01

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damages by propagating ultrasonic waves normal to the surface. However they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler Vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed for processing the wave field and estimate spatially dependent wavenumber values, related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative to overcome limitations of conventional NDI technologies.

  10. Semi-automated scar detection in delayed enhanced cardiac magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Morisi, Rita; Donini, Bruno; Lanconelli, Nico; Rosengarden, James; Morgan, John; Harden, Stephen; Curzen, Nick

    2015-06-01

    Late enhancement cardiac magnetic resonance imaging (MRI) has the ability to precisely delineate myocardial scars. We present a semi-automated method for detecting scars in cardiac MRI. This model has the potential to improve routine clinical practice, since quantification is not currently offered due to time constraints. A first segmentation step was developed for extracting the target regions for potential scar and determining pre-candidate objects. Pattern recognition methods are then applied to the segmented images in order to detect the position of the myocardial scar. The database of late gadolinium enhancement (LE) cardiac MR images consists of 111 blocks of images acquired from 63 patients at the University Hospital Southampton NHS Foundation Trust (UK). At least one scar was present for each patient, and all the scars were manually annotated by an expert. A group of images (around one third of the entire set) was used for training the system, which was subsequently tested on all the remaining images. Four different classifiers were trained (Support Vector Machine (SVM), k-nearest neighbor (KNN), Bayesian and feed-forward neural network) and their performance was evaluated using Free-response Receiver Operating Characteristic (FROC) analysis. Feature selection was implemented for analyzing the importance of the various features. The segmentation method proposed allowed the region affected by the scar to be extracted correctly in 96% of the blocks of images. The SVM was shown to be the best classifier for our task, and our system reached an overall sensitivity of 80% with fewer than 7 false positives per patient. The method we present provides an effective tool for detection of scars on cardiac MRI. This may be of value in clinical practice by permitting routine reporting of scar quantification.

  11. Evaluation of automated and manual DNA purification methods for detecting Ricinus communis DNA during ricin investigations.

    PubMed

    Hutchins, Anne S; Astwood, Michael J; Saah, J Royden; Michel, Pierre A; Newton, Bruce R; Dauphin, Leslie A

    2014-03-01

    In April of 2013, letters addressed to the President of United States and other government officials were intercepted and found to be contaminated with ricin, heightening awareness about the need to evaluate laboratory methods for detecting ricin. This study evaluated commercial DNA purification methods for isolating Ricinus communis DNA as measured by real-time polymerase chain reaction (PCR). Four commercially available DNA purification methods (two automated, MagNA Pure compact and MagNA Pure LC, and two manual, MasterPure complete DNA and RNA purification kit and QIAamp DNA blood mini kit) were evaluated. We compared their ability to purify detectable levels of R. communis DNA from four different sample types, including crude preparations of ricin that could be used for biological crimes or acts of bioterrorism. Castor beans, spiked swabs, and spiked powders were included to simulate sample types typically tested during criminal and public health investigations. Real-time PCR analysis indicated that the QIAamp kit resulted in the greatest sensitivity for ricin preparations; the MasterPure kit performed best with spiked powders. The four methods detected equivalent levels by real-time PCR when castor beans and spiked swabs were used. All four methods yielded DNA free of PCR inhibitors as determined by the use of a PCR inhibition control assay. This study demonstrated that DNA purification methods differ in their ability to purify R. communis DNA; therefore, the purification method used for a given sample type can influence the sensitivity of real-time PCR assays for R. communis.

  12. Imaging and automated detection of Sitophilus oryzae (Coleoptera: Curculionidae) pupae in hard red winter wheat.

    PubMed

    Toews, Michael D; Pearson, Tom C; Campbell, James F

    2006-04-01

    Computed tomography, an imaging technique commonly used for diagnosing internal human health ailments, uses multiple x-rays and sophisticated software to recreate a cross-sectional representation of a subject. The use of this technique to image hard red winter wheat, Triticum aestivum L., samples infested with pupae of Sitophilus oryzae (L.) was investigated. A software program was developed to rapidly recognize and quantify the infested kernels. Samples were imaged in a 7.6-cm (o.d.) plastic tube containing 0, 50, or 100 infested kernels per kg of wheat. Interkernel spaces were filled with corn oil so as to increase the contrast between voids inside kernels and voids among kernels. Automated image processing, using a custom C language software program, was conducted separately on each 100-g portion of the prepared samples. The average detection accuracy in the five infested kernels per 100-g samples was 94.4 +/- 7.3% (mean +/- SD, n = 10), whereas the average detection accuracy in the 10 infested kernels per 100-g sample was 87.3 +/- 7.9% (n = 10). Detection accuracy in the 10 infested kernels per 100-g samples was slightly less than the five infested kernels per 100-g samples because of some infested kernels overlapping with each other or air bubbles in the oil. A mean of 1.2 +/- 0.9 (n = 10) bubbles (per tube) was incorrectly classed as infested kernels in replicates containing no infested kernels. In light of these positive results, future studies should be conducted using additional grains, insect species, and life stages.

  14. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    SciTech Connect

    Girolamo, D.; Yuan, F. G.; Girolamo, L.

    2015-03-31

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damage by propagating ultrasonic waves normal to the surface. However, they usually require physical contact with the structure and are time-consuming and labor-intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler Vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high-resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed to process the wave field and estimate spatially dependent wavenumber values related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high-fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained show that the laser-based nondestructive system is an effective alternative that overcomes the limitations of conventional NDI technologies.
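
    The SWEA step can be illustrated with a very reduced sketch: take the temporal FFT of the measured wavefield at every scan point and map the spectral energy in a band of interest, where standing waves trapped over damage concentrate energy. The array shapes, sampling rate, and 50-150 kHz band below are assumptions, not values from the study, and the polar-coordinate reconstruction is omitted.

```python
# Illustrative standing-wave-energy style map: temporal FFT of the wavefield
# at every scan point, then band-limited spectral energy per point. Shapes and
# the frequency band are assumptions, not values from the study.
import numpy as np

def energy_map(wavefield, fs, band=(50e3, 150e3)):
    """wavefield: (n_t, n_y, n_x) out-of-plane velocity; fs: sampling rate in Hz."""
    spectrum = np.fft.rfft(wavefield, axis=0)               # temporal FFT per scan point
    freqs = np.fft.rfftfreq(wavefield.shape[0], d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return np.sum(np.abs(spectrum[in_band]) ** 2, axis=0)   # (n_y, n_x) energy image
```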

  15. Automated laser scatter detection of surface and subsurface defects in Si{sub 3}N{sub 4} components

    SciTech Connect

    Steckenrider, J.S.

    1995-06-01

    Silicon Nitride (Si{sub 3}N{sub 4}) ceramics are currently a primary material of choice to replace conventional materials in many structural applications because of their oxidation resistance and desirable mechanical and thermal properties at elevated temperatures. However, surface or near-subsurface defects, such as cracks, voids, or inclusions, significantly affect component lifetimes. These defects are currently difficult to detect, so a technique is desired for the rapid automated detection and quantification of both surface and subsurface defects. To address this issue, the authors have developed an automated system based on the detection of scattered laser light which provides a 2-D map of surface or subsurface defects. This system has been used for the analysis of flexure bars and button-head tensile rods of several Si{sub 3}N{sub 4} materials. Mechanical properties of these bars have also been determined and compared with the laser scatter results.

  16. Development of an automated classification scheme for detection of polar stratospheric clouds over Antarctica using AVHRR imagery

    NASA Astrophysics Data System (ADS)

    Foschi, Patricia S.; Pagan, Kathy L.; Garcia, Oswaldo; Smith, Deborah K.; Gaines, Steven E.; Hipskind, R. Stephen

    1995-12-01

    Although polar stratospheric clouds (PSCs) are a critical component in the ozone depletion process, their timing, duration, geographic extent, and annual variability are not well understood. The goal of this study is the development of an automated classification scheme for detecting PSCs using NOAA AVHRR data. Visual interpretation, density slicing, and standard multispectral classification detect most optically thick PSCs, but only some thin PSCs. Two types of automated techniques for detecting thin PSCs are being investigated: namely, multispectral classification methods, including the use of texture and other image-derived features, and back-propagation neural networks, including the use of hyperspatial and hypertemporal data. UARS CLAES temperature and aerosol extinction coefficient data are being used as a verification dataset. If successful, this classification scheme will be used to process the entire record of AVHRR data in order to assemble a long-term PSC climatology.

  17. Evaluation of change detection techniques for monitoring coastal zone environments

    NASA Technical Reports Server (NTRS)

    Weismiller, R. A. (Principal Investigator); Kristof, S. J.; Scholz, D. K.; Anuta, P. E.; Momin, S. M.

    1977-01-01

    The author has identified the following significant results. Four change detection techniques were designed and implemented for evaluation: (1) post classification comparison change detection, (2) delta data change detection, (3) spectral/temporal change classification, and (4) layered spectral/temporal change classification. The post classification comparison technique reliably identified areas of change and was used as the standard for qualitatively evaluating the other three techniques. The layered spectral/temporal change classification and the delta data change detection results generally agreed with the post classification comparison technique results; however, many small areas of change were not identified. Major discrepancies existed between the post classification comparison and spectral/temporal change detection results.
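
    For reference, post classification comparison change detection, the technique used above as the evaluation standard, reduces to a per-pixel comparison of two independently classified maps. The sketch below is a generic illustration, not the original implementation; class codes and array shapes are arbitrary.

```python
# Post-classification comparison in its simplest form: classify each date
# independently, then compare the two label maps pixel by pixel. Class codes
# and map shapes are illustrative only.
import numpy as np

def post_classification_change(labels_t1, labels_t2):
    """labels_t1, labels_t2: non-negative integer class maps of identical shape."""
    changed = labels_t1 != labels_t2                          # boolean change mask
    # Optional "from-to" matrix summarising what changed into what.
    n_classes = int(max(labels_t1.max(), labels_t2.max())) + 1
    from_to = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(from_to, (labels_t1.ravel(), labels_t2.ravel()), 1)
    return changed, from_to
```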

  18. Intelligent Transportation Systems: Automated Guided Vehicle Systems in Changing Logistics Environments

    NASA Astrophysics Data System (ADS)

    Schulze, L.; Behling, S.; Buhrs, S.

    2008-06-01

    The usage of Automated Guided Vehicle Systems (AGVS) is growing. This has not always been the case in the past. A new record in sales figures is the result of inventive developments, new applications and modern thinking. One market that AGVS have not yet been able to thoroughly conquer is rapidly changing logistics environments. The advantages of AGVS in recurrent transportation used to be hindered by the need for flexibility. When nowadays managers talk about Flexible Manufacturing Systems (FMS), there is no reason not to consider AGVS. Fixed guidelines, permanent transfer stations and static routes are no longer a necessity for most AGVS producers. Flexible Manufacturing Systems can raise profitability with AGVS. When robots start saving billions in production costs, the next step at the same plants is automated materials handling systems. Today, there are hundreds of instances of computer-controlled systems designed to handle and transport materials, many of which have replaced conventional human-driven platform trucks. Reduced costs due to damage and failures, tracking and tracing, and improved production scheduling, on top of lower personnel needs, are only some of the advantages.

  19. Selective Automated Perimetry Under Photopic, Mesopic, and Scotopic Conditions: Detection Mechanisms and Testing Strategies

    PubMed Central

    Simunovic, Matthew P.; Moore, Anthony T.; MacLaren, Robert E.

    2016-01-01

    Purpose Automated scotopic, mesopic, and photopic perimetry are likely to be important paradigms in the assessment of emerging treatments of retinal diseases, yet our knowledge of the photoreceptor mechanisms detecting targets under these conditions remains largely dependent on simian data. We therefore aimed to establish the photoreceptor/postreceptoral mechanisms detecting perimetric targets in humans under photopic, mesopic, and scotopic conditions and to make recommendations for suitable clinical testing strategies for selective perimetry. Methods Perimetric sensitivities within 30° of fixation were determined for eight wavelengths (410, 440, 480, 520, 560, 600, 640, and 680 nm) under scotopic, mesopic (1.3 cd.m−2) and photopic (10 cd.m−2) conditions. Data were fitted with vector combinations of rod, S-cone, nonopponent M+L-cone mechanism, and opponent M- versus L-cone mechanism templates. Results Scotopic perimetric sensitivity was determined by rods peripherally and by a combination of rods and cones at, and immediately around, fixation. Mesopic perimetric sensitivity was mediated by M+L-cones and S-cones centrally and by M+L-cones and rods more peripherally. Photopic perimetric sensitivity was determined by an opponent M- versus L-cone, a nonopponent M+L-cone, and an S-cone mechanism centrally and by a combination of an S-cone and an M+L-cone mechanism peripherally. Conclusions Under scotopic conditions, a 480-nm stimulus provides adequate isolation (≥28 dB) of the rod mechanism. Several mechanisms contribute to mesopic sensitivity: this redundancy in detection may cause both insensitivity to broadband white targets and ambiguity in determining which mechanism is being probed with short-wavelength stimuli. M- and L-cone–derived mechanisms are well isolated at 10 cd.m−2: these may be selectively probed by a stimulus at 640 nm (≥20 dB isolation). Translation Relevance In human observers, multiple mechanisms contribute to the detection of Goldmann

  20. A fully automated microfluidic micellar electrokinetic chromatography analyzer for organic compound detection.

    PubMed

    Jang, Lee-Woon; Razu, Md Enayet; Jensen, Erik C; Jiao, Hong; Kim, Jungkyu

    2016-09-21

    An integrated microfluidic chemical analyzer utilizing micellar electrokinetic chromatography (MEKC) is developed using a pneumatically actuated Lifting-Gate microvalve array and a capillary zone electrophoresis (CZE) chip. Each of the necessary liquid-handling processes, such as metering, mixing, transferring, and washing, is performed autonomously by the microvalve array. In addition, a method is presented for automated washing of the high-resistance CZE channel for device reuse and periodic automated in situ analyses. To demonstrate the functionality of this MEKC platform, amino acids and thiols are labeled and efficiently separated via a fully automated program. Reproducibility of the automated programs for sample labeling and periodic in situ MEKC analysis was tested and found to be equivalent to conventional sample processing techniques for capillary electrophoresis analysis. This platform enables simple, portable, and automated chemical compound analysis which can be used in challenging environments. PMID:27507322

  1. Immunohistochemical Detection of Changes in Tumor Hypoxia

    SciTech Connect

    Russell, James; Carlin, Sean; Burke, Sean A.; Wen, Bixiu; Yang, Kwang Mo; Ling, C. Clifton

    2009-03-15

    Purpose: Although hypoxia is a known prognostic factor, its effect will be modified by the rate of reoxygenation and the extent to which the cells are acutely hypoxic. We tested the ability of exogenous and endogenous markers to detect reoxygenation in a xenograft model. Our technique might be applicable to stored patient samples. Methods and Materials: The human colorectal carcinoma line, HT29, was grown in nude mice. Changes in tumor hypoxia were examined by injection of pimonidazole, followed 24 hours later by EF5. Cryosections were stained for these markers and for carbonic anhydrase IX (CAIX) and hypoxia-inducible factor 1{alpha} (HIF1{alpha}). Tumor hypoxia was artificially manipulated by carbogen exposure. Results: In unstressed tumors, all four markers showed very similar spatial distributions. After carbogen treatment, pimonidazole and EF5 could detect decreased hypoxia. HIF1{alpha} staining was also decreased relative to CAIX, although the effect was less pronounced than for EF5. Control tumors displayed small regions that had undergone spontaneous changes in tumor hypoxia, as judged by pimonidazole relative to EF5; most of these changes were reflected by CAIX and HIF1{alpha}. Conclusion: HIF1{alpha} can be compared with either CAIX or a previously administered nitroimidazole to provide an estimate of reoxygenation.

  2. Nationwide Hybrid Change Detection of Buildings

    NASA Astrophysics Data System (ADS)

    Hron, V.; Halounova, L.

    2016-06-01

    The Fundamental Base of Geographic Data of the Czech Republic (hereinafter FBGD) is a national 2D geodatabase at a 1:10,000 scale with more than 100 geographic objects. This paper describes the design of the permanent updating mechanism of buildings in FBGD. The proposed procedure belongs to the category of hybrid change detection (HCD) techniques which combine pixel-based and object-based evaluation. The main sources of information for HCD are cadastral information and bi-temporal vertical digital aerial photographs. These photographs have great information potential because they contain multispectral, position and also elevation information. Elevation information represents a digital surface model (DSM) which can be obtained using the image matching technique. Pixel-based evaluation of bi-temporal DSMs enables fast localization of places with potential building changes. These coarse results are subsequently classified through the object-based image analysis (OBIA) using spectral, textural and contextual features and GIS tools. The advantage of the two-stage evaluation is the pre-selection of locations where image segmentation (a computationally demanding part of OBIA) is performed. It is not necessary to apply image segmentation to the entire scene, but only to the surroundings of detected changes, which contributes to significantly faster processing and lower hardware requirements. The created technology is based on open-source software solutions that allow easy portability on multiple computers and parallelization of processing. This leads to significant savings of financial resources which can be expended on the further development of FBGD.
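
    The pixel-based first stage of the hybrid approach can be pictured as a simple differencing of co-registered digital surface models followed by blob filtering. The sketch below is only a schematic reading of that stage; the 2.5 m height threshold and the minimum blob size are invented for illustration.

```python
# Pixel-based first stage: difference two co-registered DSMs and flag
# contiguous groups of cells whose height change exceeds a threshold.
# Threshold and minimum size are illustrative assumptions.
import numpy as np
from scipy import ndimage

def candidate_building_changes(dsm_old, dsm_new, dz_threshold=2.5, min_pixels=50):
    diff = dsm_new - dsm_old
    mask = np.abs(diff) > dz_threshold                   # potential construction/demolition
    labels, n = ndimage.label(mask)                      # group contiguous pixels
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = {i + 1 for i, s in enumerate(sizes) if s >= min_pixels}
    return np.isin(labels, list(keep)), diff
```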

  3. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    NASA Astrophysics Data System (ADS)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, in September of 2007 a dynamic monitoring system was installed in a concrete arch bridge at the city of Porto, in Portugal. The implementation of algorithms to perform the continuous on-line identification of modal parameters based on structural responses to ambient excitation (automated Operational Modal Analysis) has made it possible to create a very complete database with the time evolution of the bridge modal characteristics during more than 2 years. This paper describes the strategy that was followed to minimize the effects of environmental and operational factors on the bridge natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, the identification of damage is attempted with control charts. At the end, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
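
    The core of such a monitoring strategy, removing environmental and operational effects from tracked natural frequencies and then watching the residuals on a control chart, can be sketched as follows. The regression form, baseline length, and 3-sigma limits below are illustrative assumptions rather than the models actually used on the Porto bridge.

```python
# Regress each tracked natural frequency on environmental/operational
# variables, then monitor the residuals with a simple Shewhart-style chart.
# Predictor meaning, baseline length and control limits are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

def residual_control_chart(freqs, env, n_baseline=365, k=3.0):
    """freqs: (n_obs,) tracked frequency; env: (n_obs, n_vars) e.g. temperature, traffic."""
    model = LinearRegression().fit(env[:n_baseline], freqs[:n_baseline])
    residuals = freqs - model.predict(env)                # frequency with environment removed
    mu, sigma = residuals[:n_baseline].mean(), residuals[:n_baseline].std()
    out_of_control = np.abs(residuals - mu) > k * sigma   # possible structural anomaly
    return residuals, out_of_control
```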

  4. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method

    NASA Astrophysics Data System (ADS)

    Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    Due to the effect of climate factors, natural phenomena and human usage, buildings and infrastructures are subject of progressive degradation. The deterioration of these structures has to be monitored in order to avoid hazards for human beings and for the natural environment in their neighborhood. Hence, on the one hand, monitoring such infrastructures is of primarily importance. On the other hand, unfortunately, nowadays this monitoring effort is mostly done by expert and skilled personnel, which follow the overall data acquisition, analysis and result reporting process, making the whole monitoring procedure quite expensive for the public (and private, as well) agencies. This paper proposes the use of a partially user-assisted procedure in order to reduce the monitoring cost and to make the obtained result less subjective as well. The developed method relies on the use of images acquired with standard cameras by even inexperienced personnel. The deterioration on the infrastructure surface is detected by image segmentation based on a level sets method. The results of the semi-automated analysis procedure are remapped on a 3D model of the infrastructure obtained by means of a terrestrial laser scanning acquisition. The proposed method has been successfully tested on a portion of a road bridge in Perarolo di Cadore (BL), Italy.

  5. Automated detection of the left ventricular region in gated nuclear cardiac imaging.

    PubMed

    Boudraa, A E; Arzi, M; Sau, J; Champier, J; Hadj-Moussa, S; Besson, J E; Sappey-Marinier, D; Itti, R; Mallet, J J

    1996-04-01

    An approach to automated outlining of the left ventricular contour and its bounded area in gated isotopic ventriculography is proposed. Its purpose is to determine the ejection fraction (EF), an important parameter for measuring cardiac function. The method uses a modified version of the fuzzy C-means (MFCM) algorithm and a labeling technique. The MFCM algorithm is applied to the end diastolic (ED) frame and then the FCM is applied to the remaining images in a "box" of interest. The MFCM generates a number of fuzzy clusters, each of which is a substructure of the heart (left ventricle, ...). A cluster validity index is used to estimate the optimum number of clusters present in the image data. This index takes account of the homogeneity in each cluster and is connected to the geometrical properties of the data set. The labeling is only performed to achieve the detection process in the ED frame. Since the left ventricle (LV) cluster has the greatest area of the cardiac image sequence in the ED phase, a framing operation is performed to obtain, automatically, the "box" enclosing the LV cluster. The EF assessed in 50 patients by the proposed method and by a routinely used semi-automatic one are presented. A good correlation between the EF values of the two methods is obtained (R = 0.93). The LV contour found has been judged very satisfactory by a team of trained clinicians. PMID:8626193
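
    Once the LV region has been delineated on each frame, the ejection fraction follows from the background-corrected counts at end diastole and end systole, EF = (ED - ES) / ED. A minimal sketch of that final computation is given below; the count values are invented for illustration.

```python
# Count-based ejection fraction as used in gated blood-pool studies, computed
# from background-corrected LV counts at end diastole (ED) and end systole (ES).
# The numbers in the example call are made up for illustration.
def ejection_fraction(ed_counts, es_counts, background_counts=0.0):
    ed = ed_counts - background_counts
    es = es_counts - background_counts
    return (ed - es) / ed

print(ejection_fraction(ed_counts=12500, es_counts=5000, background_counts=1500))  # ~0.68
```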

  6. ECO fill: automated fill modification to support late-stage design changes

    NASA Astrophysics Data System (ADS)

    Davis, Greg; Wilson, Jeff; Yu, J. J.; Chiu, Anderson; Chuang, Yao-Jen; Yang, Ricky

    2014-03-01

    One of the most critical factors in achieving a positive return for a design is ensuring the design not only meets performance specifications, but also produces sufficient yield to meet the market demand. The goal of design for manufacturability (DFM) technology is to enable designers to address manufacturing requirements during the design process. While new cell-based, DP-aware, and net-aware fill technologies have emerged to provide the designer with automated fill engines that support these new fill requirements, design changes that arrive late in the tapeout process (as engineering change orders, or ECOs) can have a disproportionate effect on tapeout schedules, due to the complexity of replacing fill. If not handled effectively, the impacts on file size, run time, and timing closure can significantly extend the tapeout process. In this paper, the authors examine changes to design flow methodology, supported by new fill technology, that enable efficient, fast, and accurate adjustments to metal fill late in the design process. We present an ECO fill methodology coupled with the support of advanced fill tools that can quickly locate the portion of the design affected by the change, remove and replace only the fill in that area, while maintaining the fill hierarchy. This new fill approach effectively reduces run time, contains fill file size, minimizes timing impact, and minimizes mask costs due to ECO-driven fill changes, all of which are critical factors to ensuring time-to-market schedules are maintained.

  7. Imaging, object detection, and change detection with a polarized multistatic GPR array

    DOEpatents

    Beer, N. Reginald; Paglieroni, David W.

    2015-07-21

    A polarized detection system performs imaging, object detection, and change detection factoring in the orientation of an object relative to the orientation of transceivers. The polarized detection system may operate on one of several modes of operation based on whether the imaging, object detection, or change detection is performed separately for each transceiver orientation. In combined change mode, the polarized detection system performs imaging, object detection, and change detection separately for each transceiver orientation, and then combines changes across polarizations. In combined object mode, the polarized detection system performs imaging and object detection separately for each transceiver orientation, and then combines objects across polarizations and performs change detection on the result. In combined image mode, the polarized detection system performs imaging separately for each transceiver orientation, and then combines images across polarizations and performs object detection followed by change detection on the result.

  8. Detection of coronary calcifications from computed tomography scans for automated risk assessment of coronary artery disease

    SciTech Connect

    Isgum, Ivana; Rutten, Annemarieke; Prokop, Mathias; Ginneken, Bram van

    2007-04-15

    A fully automated method for coronary calcification detection from non-contrast-enhanced, ECG-gated multi-slice computed tomography (CT) data is presented. Candidates for coronary calcifications are extracted by thresholding and component labeling. These candidates include coronary calcifications, calcifications in the aorta and in the heart, and other high-density structures such as noise and bone. A dedicated set of 64 features is calculated for each candidate object. They characterize the object's spatial position relative to the heart and the aorta, for which an automatic segmentation scheme was developed, its size and shape, and its appearance, which is described by a set of approximated Gaussian derivatives for which an efficient computational scheme is presented. Three classification strategies were designed. The first one tested direct classification without feature selection. The second approach also utilized direct classification, but with feature selection. Finally, the third scheme employed two-stage classification. In a computationally inexpensive first stage, the most easily recognizable false positives were discarded. The second stage discriminated between more difficult to separate coronary calcium and other candidates. Performance of linear, quadratic, nearest neighbor, and support vector machine classifiers was compared. The method was tested on 76 scans containing 275 calcifications in the coronary arteries and 335 calcifications in the heart and aorta. The best performance was obtained employing a two-stage classification system with a k-nearest neighbor (k-NN) classifier and a feature selection scheme. The method detected 73.8% of coronary calcifications at the expense of on average 0.1 false positives per scan. A calcium score was computed for each scan and subjects were assigned one of four risk categories based on this score. The method assigned the correct risk category to 93.4% of all scans.
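
    The candidate-extraction step (thresholding followed by connected-component labeling) is easy to sketch with scipy. The 130 HU value is the conventional calcium-scoring threshold and the minimum component size below is an illustrative guess; the 64-feature characterization and two-stage classification are not reproduced here.

```python
# Candidate extraction by thresholding and 3-D connected-component labeling.
# Threshold follows the conventional calcium-scoring value; minimum size is an
# illustrative assumption.
import numpy as np
from scipy import ndimage

def calcification_candidates(ct_volume_hu, threshold_hu=130, min_voxels=3):
    """ct_volume_hu: 3-D array of Hounsfield units. Returns labels of kept components."""
    mask = ct_volume_hu >= threshold_hu
    labels, n = ndimage.label(mask)                       # 3-D connected components
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return [label for label, size in zip(range(1, n + 1), sizes) if size >= min_voxels]
```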

  9. Automated determinations of selenium in thermal power plant wastewater by sequential hydride generation and chemiluminescence detection.

    PubMed

    Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei

    2016-02-01

    After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium were performed separately in the cold reactor. Sample transfers between the reactors were carried out by a pneumatic air operation by switching with three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry.

  10. An Automated Measurement of Ciliary Beating Frequency using a Combined Optical Flow and Peak Detection

    PubMed Central

    Kim, Woojae; Han, Tae Hwa; Kim, Hyun Jun; Park, Man Young; Kim, Ku Sang

    2011-01-01

    Objectives The mucociliary transport system is a major defense mechanism of the respiratory tract. The performance of mucous transportation in the nasal cavity can be represented by the ciliary beating frequency (CBF). This study proposes a novel method to measure CBF by using optical flow. Methods To obtain objective estimates of CBF from video images, an automated computer-based image processing technique is developed. This study proposes a new method based on optical flow for image processing and peak detection for signal processing. We compare the measuring accuracy of the method in various combinations of image processing (optical flow versus difference image) and signal processing (fast Fourier transform [FFT] vs. peak detection [PD]). The digital high-speed video method, with a manual count of CBF in slow-motion video play, is the gold standard in CBF measurement. We obtained a total of fifty recorded ciliated sinonasal epithelium images to measure CBF from the Department of Otolaryngology. The ciliated sinonasal epithelium images were recorded at 50-100 frames per second using a charge-coupled device camera with an inverted microscope at a magnification of ×1,000. Results The mean square errors and variance for each method were 1.24, 0.84 Hz; 11.8, 2.63 Hz; 3.22, 1.46 Hz; and 3.82, 1.53 Hz for optical flow (OF) + PD, OF + FFT, difference image [DI] + PD, and DI + FFT, respectively. Of the four methods, PD using optical flow showed the best performance for measuring the CBF of nasal mucosa. Conclusions The proposed method was able to measure CBF more objectively and efficiently than what is currently possible. PMID:21886872
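
    The peak-detection half of the best-performing combination can be sketched as follows: reduce the video to a one-dimensional motion signal (for example, the mean optical-flow magnitude per frame), locate its peaks, and convert the mean peak spacing to a frequency. The helper below is a generic illustration with assumed parameter values, not the authors' code.

```python
# Estimate a beating frequency from a 1-D motion signal via peak detection.
# The minimum peak spacing is an illustrative assumption.
import numpy as np
from scipy.signal import find_peaks

def beating_frequency(motion_signal, frame_rate):
    """motion_signal: (n_frames,) 1-D trace; frame_rate: frames per second."""
    peaks, _ = find_peaks(motion_signal, distance=2)      # at least 2 frames between beats
    if len(peaks) < 2:
        return float("nan")
    mean_period_frames = np.mean(np.diff(peaks))
    return frame_rate / mean_period_frames                # beating frequency in Hz
```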

  11. Automated Detection and Classification of Rockfall Induced Seismic Signals with Hidden-Markov-Models

    NASA Astrophysics Data System (ADS)

    Zeckra, M.; Hovius, N.; Burtin, A.; Hammer, C.

    2015-12-01

    Originally introduced in speech recognition, Hidden Markov Models are applied in different research fields of pattern recognition. In seismology, this technique has recently been introduced to improve common detection algorithms, like the STA/LTA ratio or cross-correlation methods. Mainly used for the monitoring of volcanic activity, this study is one of the first applications to seismic signals induced by geomorphologic processes. With an array of eight broadband seismometers deployed around the steep, highly erosive Illgraben catchment (Switzerland), we studied a sequence of landslides triggered over a period of several days in winter. A preliminary manual classification led us to identify three main seismic signal classes that were used as a starting point for the HMM automated detection and classification: (1) rockslide signals, including a failure source and the debris mobilization along the slope, (2) rockfall signals from the remobilization of debris along the unstable slope, and (3) single cracking signals from the affected cliff observed before the rockslide events. Besides the ability to classify the whole dataset automatically, the HMM approach reflects the origin and the interactions of the three signal classes, which helps us to understand this geomorphic crisis and the possible triggering mechanisms for slope processes. The temporal distribution of crack events (duration > 5 s, frequency band 2-8 Hz) follows an inverse Omori law, pointing to the catastrophic behaviour of the failure mechanism and its interest for warning purposes in rockslide risk assessment. Thanks to a dense seismic array and independent weather observations in the landslide area, this dataset also provides information about the triggering mechanisms, which exhibit a tight link between rainfall and freezing-level fluctuations.
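
    One common way to use HMMs for this kind of signal classification, roughly in the spirit of the approach above, is to train one Gaussian HMM per signal class on feature sequences and assign a new event to the class whose model yields the highest log-likelihood. The sketch below assumes the third-party hmmlearn package and invented feature dimensions and state counts; it is not the detector used in the study.

```python
# Train one Gaussian HMM per signal class, then classify a new event by the
# model that scores it highest. Feature extraction, state counts and the use
# of hmmlearn are assumptions for illustration.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_class_models(sequences_by_class, n_states=4):
    """sequences_by_class: {class_name: list of (n_frames, n_features) arrays}."""
    models = {}
    for name, seqs in sequences_by_class.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        models[name] = GaussianHMM(n_components=n_states, covariance_type="diag").fit(X, lengths)
    return models

def classify(event_features, models):
    """event_features: (n_frames, n_features) sequence for one detected event."""
    return max(models, key=lambda name: models[name].score(event_features))
```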

  13. Automated Visual Event Detection, Tracking, and Data Management System for Cabled- Observatory Video

    NASA Astrophysics Data System (ADS)

    Edgington, D. R.; Cline, D. E.; Schlining, B.; Raymond, E.

    2008-12-01

    Ocean observatories and underwater video surveys have the potential to unlock important discoveries with new and existing camera systems. Yet the burden of video management and analysis often requires reducing the amount of video recorded through time-lapse video or similar methods. It is unknown how many digitized video data sets exist in the oceanographic community, but we suspect that many remain underanalyzed due to a lack of good tools or human resources to analyze the video. To help address this problem, the Automated Visual Event Detection (AVED) software and the Video Annotation and Reference System (VARS) have been under development at MBARI. For detecting interesting events in the video, the AVED software has been developed over the last 5 years. AVED is based on a neuromorphic-selective attention algorithm, modeled on the human vision system. Frames are decomposed into specific feature maps that are combined into a unique saliency map. This saliency map is then scanned to determine the most salient locations. The candidate salient locations are then segmented from the scene using algorithms suitable for the low, non-uniform light and marine snow typical of deep underwater video. For managing the AVED descriptions of the video, the VARS system provides an interface and database for describing, viewing, and cataloging the video. VARS was developed by MBARI for annotating deep-sea video data and is currently being used to describe over 3000 dives by our remotely operated vehicles (ROVs), making it well suited to this deepwater observatory application with only a few modifications. To meet the compute- and data-intensive job of video processing, a distributed heterogeneous network of computers is managed using the Condor workload management system. This system manages data storage, video transcoding, and AVED processing. Looking to the future, we see high-speed networks and Grid technology as an important element in addressing the problem of processing and

  14. A completely automated CAD system for mass detection in a large mammographic database

    SciTech Connect

    Bellotti, R.; De Carlo, F.; Tangaro, S.

    2006-08-15

    Mass localization plays a crucial role in computer-aided detection (CAD) systems for the classification of suspicious regions in mammograms. In this article we present a completely automated classification system for the detection of masses in digitized mammographic images. The system we discuss consists of three processing levels: (a) Image segmentation for the localization of regions of interest (ROIs). This step relies on an iterative dynamical threshold algorithm able to select iso-intensity closed contours around gray level maxima of the mammogram. (b) ROI characterization by means of textural features computed from the gray tone spatial dependence matrix (GTSDM), containing second-order spatial statistics information on the pixel gray-level intensity. As the images under study were recorded in different centers and with different machine settings, eight GTSDM features were selected so as to be invariant under monotonic transformation. In this way, the images do not need to be normalized, as the adopted features depend on the texture only, rather than on the gray-tone levels themselves. (c) ROI classification by means of a neural network, with supervision provided by the radiologist's diagnosis. The CAD system was evaluated on a large database of 3369 mammographic images [2307 negative, 1062 pathological (or positive), containing at least one confirmed mass, as diagnosed by an expert radiologist]. To assess the performance of the system, receiver operating characteristic (ROC) and free-response ROC analysis were employed. The area under the ROC curve was found to be A{sub z}=0.783{+-}0.008 for the ROI-based classification. When evaluating the accuracy of the CAD against the radiologist-drawn boundaries, 4.23 false positives per image are found at 80% of mass sensitivity.
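
    The GTSDM (gray-level co-occurrence) texture step for a single ROI can be sketched with scikit-image, where the matrix is exposed as graycomatrix/graycoprops (greycomatrix/greycoprops in older releases). The offsets, angles, and properties below are illustrative; the paper's eight monotonic-transformation-invariant features are not reproduced.

```python
# Co-occurrence-matrix texture features for one ROI using scikit-image.
# Distances, angles and the chosen properties are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(roi_uint8):
    """roi_uint8: 2-D uint8 image patch (gray levels 0-255)."""
    glcm = graycomatrix(roi_uint8,
                        distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])
```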

  15. Detecting past changes of effective population size.

    PubMed

    Nikolic, Natacha; Chevalet, Claude

    2014-06-01

    Understanding and predicting population abundance is a major challenge confronting scientists. Several genetic models have been developed using microsatellite markers to estimate present and ancestral effective population sizes. However, getting an overview of the evolution of a population requires that past fluctuations of population size be traceable. To address this question, we developed a new model estimating past changes of effective population size from microsatellite data by resolving coalescence theory and using approximate likelihoods in a Markov chain Monte Carlo approach. The efficiency of the model and its sensitivity to gene flow and to assumptions on the mutational process were checked using simulated data and analysis. The model was found especially useful for providing evidence of transient changes of population size in the past. The times at which some past demographic events cannot be detected because they are too ancient, and the risk that gene flow may suggest the false detection of a bottleneck, are discussed considering the distribution of coalescence times. The method was applied to real data sets from several Atlantic salmon populations. The method, called VarEff (Variation of Effective size), was implemented in the R package VarEff and is made available at https://qgsp.jouy.inra.fr and at http://cran.r-project.org/web/packages/VarEff.

  16. Point pattern match-based change detection in a constellation of previously detected objects

    DOEpatents

    Paglieroni, David W.

    2016-06-07

    A method and system is provided that applies attribute- and topology-based change detection to objects that were detected on previous scans of a medium. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, detection strength, size, elongation, orientation, etc. The locations define a three-dimensional network topology forming a constellation of previously detected objects. The change detection system stores attributes of the previously detected objects in a constellation database. The change detection system detects changes by comparing the attributes and topological consistency of newly detected objects encountered during a new scan of the medium to previously detected objects in the constellation database. The change detection system may receive the attributes of the newly detected objects as the objects are detected by an object detection system in real time.
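
    A minimal reading of the constellation idea is a nearest-neighbor match between newly detected objects and the stored constellation, flagging unmatched new detections (appeared objects) and unmatched database entries (disappeared objects). The sketch below uses locations only; the distance threshold is an invented value, and the patent's attribute and topology checks are not reproduced.

```python
# Nearest-neighbor matching of newly detected objects against a constellation
# of previously detected objects. Distance threshold is illustrative; attribute
# and topology comparisons are omitted.
import numpy as np
from scipy.spatial import cKDTree

def constellation_changes(prev_xyz, new_xyz, max_dist=0.5):
    """prev_xyz, new_xyz: (n, 3) object locations from the old and new scans."""
    tree = cKDTree(prev_xyz)
    dist, idx = tree.query(new_xyz)                       # nearest previous object per new object
    matched_new = dist <= max_dist
    matched_prev = np.unique(idx[matched_new])
    appeared = np.where(~matched_new)[0]                  # new objects with no prior counterpart
    disappeared = np.setdiff1d(np.arange(len(prev_xyz)), matched_prev)
    return appeared, disappeared
```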

  17. Automated detection of retinal nerve fiber layer defects on fundus images: false positive reduction based on vessel likelihood

    NASA Astrophysics Data System (ADS)

    Muramatsu, Chisako; Ishida, Kyoko; Sawada, Akira; Hatanaka, Yuji; Yamamoto, Tetsuya; Fujita, Hiroshi

    2016-03-01

    Early detection of glaucoma is important for slowing or stopping progression of the disease and for preventing total blindness. We have previously proposed an automated scheme for detection of retinal nerve fiber layer defects (NFLDs), which are one of the early signs of glaucoma observed on retinal fundus images. In this study, a new multi-step detection scheme was included to improve detection of subtle and narrow NFLDs. In addition, new features were added to distinguish between NFLDs and blood vessels, which are frequent sites of false positives (FPs). The result was evaluated with a new test dataset consisting of 261 cases, including 130 cases with NFLDs. Using the proposed method, the initial detection rate was improved from 82% to 98%. At a sensitivity of 80%, the number of FPs per image was reduced from 4.25 to 1.36. The result indicates the potential usefulness of the proposed method for early detection of glaucoma.

  18. Erratum to: Automated Sample Preparation Method for Suspension Arrays using Renewable Surface Separations with Multiplexed Flow Cytometry Fluorescence Detection

    SciTech Connect

    Grate, Jay W.; Bruckner-Lea, Cindy J.; Jarrell, Ann E.; Chandler, Darrell P.

    2003-04-10

    In this paper we describe a new method of automated sample preparation for multiplexed biological analysis systems that use flow cytometry fluorescence detection. In this approach, color-encoded microspheres derivatized to capture particular biomolecules are temporarily trapped in a renewable surface separation column to enable perfusion with sample and reagents prior to delivery to the detector. This method provides for separation of the biomolecules of interest from other sample matrix components as well as from labeling solutions.

  19. Automated image-based colon cleansing for laxative-free CT colonography computer-aided polyp detection

    SciTech Connect

    Linguraru, Marius George; Panjwani, Neil; Fletcher, Joel G.; Summer, Ronald M.

    2011-12-15

    Purpose: To evaluate the performance of a computer-aided detection (CAD) system for detecting colonic polyps at noncathartic computed tomography colonography (CTC) in conjunction with an automated image-based colon cleansing algorithm. Methods: An automated colon cleansing algorithm was designed to detect and subtract tagged-stool, accounting for heterogeneity and poor tagging, to be used in conjunction with a colon CAD system. The method is locally adaptive and combines intensity, shape, and texture analysis with probabilistic optimization. CTC data from cathartic-free bowel preparation were acquired for testing and training the parameters. Patients underwent various colonic preparations with barium or Gastroview in divided doses over 48 h before scanning. No laxatives were administered and no dietary modifications were required. Cases were selected from a polyp-enriched cohort and included scans in which at least 90% of the solid stool was visually estimated to be tagged and each colonic segment was distended in either the prone or supine view. The CAD system was run comparatively with and without the stool subtraction algorithm. Results: The dataset comprised 38 CTC scans from prone and/or supine scans of 19 patients containing 44 polyps larger than 10 mm (22 unique polyps, if matched between prone and supine scans). The results are robust on fine details around folds, thin-stool linings on the colonic wall, near polyps and in large fluid/stool pools. The sensitivity of the CAD system is 70.5% per polyp at a rate of 5.75 false positives/scan without using the stool subtraction module. This detection improved significantly (p = 0.009) after automated colon cleansing on cathartic-free data to 86.4% true positive rate at 5.75 false positives/scan. Conclusions: An automated image-based colon cleansing algorithm designed to overcome the challenges of the noncathartic colon significantly improves the sensitivity of colon CAD by approximately 15%.

  20. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    SciTech Connect

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.; Virden, Daniel J.; Myers, Joshua R.; Maxwell, Adam R.

    2012-09-01

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to develop algorithms and software to identify and differentiate thermally detected targets of interest, allowing automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. We also recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was also captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.

  1. Automated detection of unstable glacier flow and a spectrum of speedup behavior in the Alaska Range

    NASA Astrophysics Data System (ADS)

    Herreid, Sam; Truffer, Martin

    2016-01-01

    Surge-type glaciers are loosely defined as glaciers that experience periodic alterations between slow and fast flow regimes. Glaciers from a variety of mountain ranges around the world have been classified as surge type, yet consensus on what defines a glacier as surge type has not always been reached. A common source of dispute is the lack of a succinct and globally applicable delimiter between a surging and nonsurging glacier. The attempt is often a Boolean classification; however, glacier speedup events can vary significantly with respect to event magnitude, duration, and the fraction of the glacier that participates in the speedup. For this study, we first updated the inventory of glaciers that show flow instabilities in the Alaska Range and then quantified the spectrum of speedup behavior. We developed a new method that automatically detects glaciers with flow instabilities. Our automated results show a 91% success rate when compared to direct observations of speedup events and glaciers that are suspected to display unstable flow based on surface features. Through a combination of observations from the Landsat archive and previously published data, our inventory now contains 36 glaciers that encompass at least one branch exhibiting unstable flow, and we document 53 speedup events that occurred between 1936 and 2014. We then present a universal method for comparing glacier speedup events based on a normalized event magnitude metric. This method provides a consistent way to include and quantify the full spectrum of speedup events and allows for comparisons with glaciers that exhibit clear surge characteristics yet have no observed surge event to date. Our results show a continuous spectrum of speedup magnitudes, from steady flow to clearly surge type, which suggests that qualitative classifications, such as "surge-type" or "pulse-type" behavior, might be too simplistic and should be accompanied by a standardized magnitude metric.

  2. Automated detection of arterial input function in DSC perfusion MRI in a stroke rat model

    NASA Astrophysics Data System (ADS)

    Yeh, M.-Y.; Lee, T.-H.; Yang, S.-T.; Kuo, H.-H.; Chyi, T.-K.; Liu, H.-L.

    2009-05-01

    Quantitative cerebral blood flow (CBF) estimation requires deconvolution of the tissue concentration time curves with an arterial input function (AIF). However, image-based determination of the AIF in rodents is challenging due to limited spatial resolution. We evaluated the feasibility of quantitative analysis using automated AIF detection and compared the results with the commonly applied semi-quantitative analysis. Permanent occlusion of the bilateral or unilateral common carotid artery was used to induce cerebral ischemia in rats. Imaging with the dynamic susceptibility contrast method was performed on a 3-T magnetic resonance scanner with a spin-echo echo-planar-imaging sequence (TR/TE = 700/80 ms, FOV = 41 mm, matrix = 64, 3 slices, SW = 2 mm), starting from 7 s prior to contrast injection (1.2 ml/kg), at four different time points. For quantitative analysis, CBF was calculated by deconvolution with the AIF, which was obtained from the 10 voxels with the greatest contrast enhancement. For semi-quantitative analysis, relative CBF was estimated as the integral divided by the first moment of the relaxivity time curve. We observed that if the AIFs obtained in the three different ROIs (whole brain, hemisphere without lesion and hemisphere with lesion) were similar, the CBF ratios (lesion/normal) from the quantitative and semi-quantitative analyses tended to follow a similar trend at the different operative time points; if the AIFs were different, the CBF ratios could differ. We concluded that using local maxima one can define a proper AIF without knowing the anatomical location of arteries in a stroke rat model.
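
    The automated AIF selection can be sketched as: convert each voxel's signal to a relaxivity-change curve, rank voxels by peak enhancement, and average the top-N curves. The shapes, baseline length, and clipping below are assumptions layered on the description above, and the subsequent deconvolution step is omitted.

```python
# Automated AIF selection from DSC-MRI time curves: pick the voxels with the
# greatest peak relaxivity change and average their curves. Baseline length,
# N, and clipping are illustrative assumptions.
import numpy as np

def automated_aif(signal, te, n_baseline=10, n_voxels=10):
    """signal: (n_t, n_voxels_total) DSC-MRI time curves; te: echo time in seconds."""
    s0 = signal[:n_baseline].mean(axis=0)                        # pre-contrast baseline
    delta_r2 = -np.log(np.clip(signal / s0, 1e-6, None)) / te    # concentration-proportional curves
    peak = delta_r2.max(axis=0)
    top = np.argsort(peak)[-n_voxels:]                           # voxels with greatest enhancement
    return delta_r2[:, top].mean(axis=1)                         # averaged AIF curve
```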

  3. New Automated Chemiluminescence Immunoassay for Simultaneous but Separate Detection of Human Immunodeficiency Virus Antigens and Antibodies

    PubMed Central

    López Roa, Paula; Suárez, Marisol; Bouza, Emilio

    2014-01-01

    The recently launched Liaison XL Murex HIV Ab/Ag assay (DiaSorin S.p.A) uses chemiluminescence immunoassay technology for the combined qualitative determination of p24 antigen of HIV-1 and specific antibodies to both HIV-1 and HIV-2. We studied 571 serum samples from those submitted to our laboratory for HIV screening. The samples were divided into 3 subsets: subset A, 365 samples collected prospectively during 1 week; subset B, 158 samples from confirmed HIV-positive patients; and subset C, 48 samples with a positive screening result but a negative or indeterminate confirmatory test result. Our standard screening/confirmatory algorithm was used as a reference. In subset A (prospective), 5 samples were positive and 360 negative by the standard procedure. Liaison XL Murex HIV Ab/Ag correctly identified all 5 positive samples (100%) and 357 negative samples (99.2%). In subset B (confirmed positive), all 158 positive samples were in total agreement in both procedures. In subset C (screen positive only), Liaison XL Murex HIV Ab/Ag yielded accurate results in 42 out of 48 samples (87.5%). Global sensitivity and specificity for Liaison XL Murex HIV Ab/Ag (all subsets included) were 98.3% and 98.5%, respectively. Considering only nonselected prospective samples and confirmed positive samples (subsets A and B), the corresponding sensitivity and specificity values were 100% and 99.2%, respectively. The new fully automated HIV screening test showed high sensitivity and specificity compared to our standard algorithm. Its added advantage of being able to detect HIV-1 and HIV-2 antibodies and p24 antigen separately could prove useful in the diagnosis of early infections. PMID:24574285

  4. Evaluation of a CLEIA automated assay system for the detection of a panel of tumor markers.

    PubMed

    Falzarano, Renato; Viggiani, Valentina; Michienzi, Simona; Longo, Flavia; Tudini, Silvestra; Frati, Luigi; Anastasi, Emanuela

    2013-10-01

    Tumor markers are commonly used to detect a relapse of disease in oncologic patients during follow-up. It is important to evaluate new assay systems for a better and more precise assessment, as a standardized method is currently lacking. The aim of this study was to assess the concordance between an automated chemiluminescent enzyme immunoassay system (LUMIPULSE® G1200) and our reference methods using seven tumor markers. Serum samples from 787 subjects representing a variety of diagnoses, including oncologic, were analyzed using LUMIPULSE® G1200 and our reference methods. Serum values were measured for the following analytes: prostate-specific antigen (PSA), alpha-fetoprotein (AFP), carcinoembryonic antigen (CEA), cancer antigen 125 (CA125), carbohydrate antigen 15-3 (CA15-3), carbohydrate antigen 19-9 (CA19-9), and cytokeratin 19 fragment (CYFRA 21-1). For the determination of CEA, AFP, and PSA, an automatic analyzer based on chemiluminescence was applied as the reference method. To assess CYFRA 21-1, CA125, CA19-9, and CA15-3, an immunoradiometric manual system was employed. Method comparison by Passing-Bablok analysis resulted in slopes ranging from 0.9728 to 1.9089 and correlation coefficients ranging from 0.9335 to 0.9977. The precision of each assay was assessed by testing six serum samples. Each sample was analyzed for all tumor biomarkers in duplicate and in three different runs. The coefficients of variation were less than 6.3% and 6.2% for within-run and between-run variation, respectively. Our data suggest an overall good interassay agreement for all markers. The comparison with our reference methods showed good precision and reliability, highlighting its usefulness in routine clinical laboratory practice. PMID:23775009

  5. Automated Detection and Predictive Modeling of Flux Transfer Events using CLUSTER Data

    NASA Astrophysics Data System (ADS)

    Sipes, T. B.; Karimabadi, H.; Driscoll, J.; Wang, Y.; Lavraud, B.; Slavin, J. A.

    2006-12-01

    Almost all statistical studies of flux transfer events (FTEs) and traveling compression regions (TCRs) have been based on (i) visual inspection of data to compile a list of events and (ii) use of histograms and simple linear correlation analysis to study their properties and potential causes and dependencies. This approach has several major drawbacks, including being highly subjective and inefficient. The traditional use of histograms and simple linear correlation analysis is also only useful for analysis of systems that show dominant dependencies on one or two variables at the most. However, if the system has complex dependencies, more sophisticated statistical techniques are required. For example, Wang et al. [2006] showed evidence that FTE occurrence rates are affected by IMF Bygsm, Bzgsm, and magnitude, and the IMF clock, tilt, spiral, and cone angles. If the initial findings that FTEs occur only during periods of southward IMF were correct, one could use the direction of the IMF as a predictor of the occurrence of FTEs. But in light of the Wang et al. result, one cannot draw quantitative conclusions about the conditions under which FTEs occur. It may be that a certain combination of these parameters is the true controlling parameter. To uncover this, one needs to deploy more sophisticated techniques. We have developed a new, sophisticated data mining tool called MineTool. MineTool is highly accurate, flexible, and capable of handling difficult and even noisy datasets extremely well. It has the ability to outperform standard data mining tools such as artificial neural networks, decision/regression trees, and support vector machines. Here we present preliminary results of the application of this tool to the CLUSTER data to perform two tasks: (i) automated detection of FTEs, and (ii) predictive modeling of occurrences of FTEs based on IMF and magnetospheric conditions.
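
    The record above does not spell out MineTool's internals, so the following is only an illustrative sketch of the second task (predictive modeling of FTE occurrence from IMF conditions), using scikit-learn's gradient boosting as a stand-in model. The feature list follows the IMF parameters named in the abstract; the data are synthetic placeholders, not CLUSTER or OMNI measurements.

      # Hedged sketch: a generic classifier predicting FTE occurrence from IMF parameters.
      # Feature names follow the abstract; values below are synthetic, not real solar wind data.
      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n = 500
      # One row per interval: By_gsm, Bz_gsm, |B|, clock, tilt, spiral, cone angle (synthetic)
      X = rng.normal(size=(n, 7))
      # Placeholder label: FTE observed in the interval (1) or not (0)
      y = ((X[:, 1] < 0) ^ (rng.random(n) < 0.1)).astype(int)

      model = GradientBoostingClassifier(n_estimators=200, max_depth=3)
      print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())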

  6. Automated detection of prostate cancer using wavelet transform features of ultrasound RF time series

    NASA Astrophysics Data System (ADS)

    Aboofazeli, Mohammad; Abolmaesumi, Purang; Moradi, Mehdi; Sauerbrei, Eric; Siemens, Robert; Boag, Alexander; Mousavi, Parvin

    2009-02-01

    The aim of this research was to investigate the performance of wavelet transform based features of ultrasound radiofrequency (RF) time series for automated detection of prostate cancer tumors in transrectal ultrasound images. Sequential frames of RF echo signals from 35 extracted prostate specimens were recorded in parallel planes, while the ultrasound probe and the tissue were fixed in position in each imaging plane. The sequence of RF echo signal samples corresponding to a particular spot in the tissue imaging plane constitutes one RF time series. Each region of interest (ROI) of the ultrasound image was represented by three groups of features of its time series, namely, wavelet, spectral, and fractal features. Wavelet transform approximation and detail sequences of each ROI were averaged and used as wavelet features. The average value of the normalized spectrum in four quarters of the frequency range, along with the intercept and slope of a regression line fitted to the values of the spectrum versus normalized frequency plot, formed six spectral features. The fractal dimension (FD) of each RF time series was computed based on Higuchi's approach. A support vector machine (SVM) classifier was used to classify the ROIs. The results indicate that combining wavelet coefficient based features with previously proposed spectral and fractal features of RF time series data would increase the area under the ROC curve from 93.1% to 95.0%. Furthermore, the accuracy, sensitivity, and specificity increase to 91.7%, 86.6%, and 94.7% from the values of 85.7%, 85.2%, and 86.1%, respectively, obtained using only spectral and fractal features.
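
    To make the feature pipeline above concrete, here is a hedged sketch of per-ROI feature extraction and SVM classification: averaged wavelet approximation/detail sequences plus a Higuchi fractal dimension. The wavelet family ("db4"), kmax, and the synthetic training data are assumptions, not the authors' settings.

      # Sketch of ROI features for RF time series: wavelet averages + Higuchi fractal dimension.
      import numpy as np
      import pywt                      # PyWavelets
      from sklearn.svm import SVC

      def higuchi_fd(x, kmax=8):
          """Higuchi fractal dimension of a 1-D time series."""
          n = len(x)
          lk = []
          for k in range(1, kmax + 1):
              lengths = []
              for m in range(k):
                  idx = np.arange(m, n, k)
                  if idx.size < 2:
                      continue
                  lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((idx.size - 1) * k)
                  lengths.append(lm / k)
              lk.append(np.mean(lengths))
          # FD is the slope of log L(k) versus log(1/k)
          return np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)[0]

      def roi_features(ts):
          """Averaged wavelet approximation/detail sequences plus fractal dimension."""
          cA, cD = pywt.dwt(ts, "db4")
          return [cA.mean(), cD.mean(), higuchi_fd(ts)]

      rng = np.random.default_rng(0)
      series = rng.normal(size=(60, 128))          # synthetic stand-ins for RF time series
      labels = rng.integers(0, 2, 60)              # placeholder benign/cancer labels
      X = np.array([roi_features(s) for s in series])
      clf = SVC(kernel="rbf").fit(X, labels)
      print("training accuracy on synthetic data:", clf.score(X, labels))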

  7. Evaluation of a CLEIA automated assay system for the detection of a panel of tumor markers.

    PubMed

    Falzarano, Renato; Viggiani, Valentina; Michienzi, Simona; Longo, Flavia; Tudini, Silvestra; Frati, Luigi; Anastasi, Emanuela

    2013-10-01

    Tumor markers are commonly used to detect a relapse of disease in oncologic patients during follow-up. It is important to evaluate new assay systems for a better and more precise assessment, as a standardized method is currently lacking. The aim of this study was to assess the concordance between an automated chemiluminescent enzyme immunoassay system (LUMIPULSE® G1200) and our reference methods using seven tumor markers. Serum samples from 787 subjects representing a variety of diagnoses, including oncologic, were analyzed using LUMIPULSE® G1200 and our reference methods. Serum values were measured for the following analytes: prostate-specific antigen (PSA), alpha-fetoprotein (AFP), carcinoembryonic antigen (CEA), cancer antigen 125 (CA125), carbohydrate antigen 15-3 (CA15-3), carbohydrate antigen 19-9 (CA19-9), and cytokeratin 19 fragment (CYFRA 21-1). For the determination of CEA, AFP, and PSA, an automatic analyzer based on chemiluminescence was applied as the reference method. To assess CYFRA 21-1, CA125, CA19-9, and CA15-3, an immunoradiometric manual system was employed. Method comparison by Passing-Bablok analysis resulted in slopes ranging from 0.9728 to 1.9089 and correlation coefficients from 0.9977 to 0.9335. The precision of each assay was assessed by testing six serum samples. Each sample was analyzed for all tumor biomarkers in duplicate and in three different runs. The coefficients of variation were less than 6.3% and 6.2% for within-run and between-run variation, respectively. Our data suggest an overall good interassay agreement for all markers. The comparison with our reference methods showed good precision and reliability, highlighting the usefulness of the system in routine clinical laboratory practice.

  8. Monitoring seasonal and diurnal changes in photosynthetic pigments with automated PRI and NDVI sensors

    NASA Astrophysics Data System (ADS)

    Gamon, J. A.; Kovalchuck, O.; Wong, C. Y. S.; Harris, A.; Garrity, S. R.

    2015-07-01

    The vegetation indices normalized difference vegetation index (NDVI) and photochemical reflectance index (PRI) provide indicators of pigmentation and photosynthetic activity that can be used to model photosynthesis from remote sensing with the light-use-efficiency model. To help develop and validate this approach, reliable proximal NDVI and PRI sensors have been needed. We tested new NDVI and PRI sensors, "spectral reflectance sensors" (SRS sensors, recently developed by Decagon Devices), during spring activation of photosynthetic activity in evergreen and deciduous stands. We also evaluated two methods of sensor cross-calibration - one that considered sky conditions (cloud cover) at midday only, and another that also considered diurnal sun angle effects. Cross-calibration clearly affected sensor agreement with independent measurements, with the best method dependent upon the study aim and time frame (seasonal vs. diurnal). The seasonal patterns of NDVI and PRI differed for evergreen and deciduous species, demonstrating the complementary nature of these two indices. Over the spring season, PRI was most strongly influenced by changing chlorophyll : carotenoid pool sizes, while over the diurnal timescale, PRI was most affected by the xanthophyll cycle epoxidation state. This finding demonstrates that the SRS PRI sensors can resolve different processes affecting PRI over different timescales. The advent of small, inexpensive, automated PRI and NDVI sensors offers new ways to explore environmental and physiological constraints on photosynthesis, and may be particularly well suited for use at flux tower sites. Wider application of automated sensors could lead to improved integration of flux and remote sensing approaches for studying photosynthetic carbon uptake, and could help define the concept of contrasting vegetation optical types.

  9. Monitoring seasonal and diurnal changes in photosynthetic pigments with automated PRI and NDVI sensors

    NASA Astrophysics Data System (ADS)

    Gamon, J. A.; Kovalchuk, O.; Wong, C. Y. S.; Harris, A.; Garrity, S. R.

    2015-02-01

    The vegetation indices normalized difference vegetation index (NDVI) and photochemical reflectance index (PRI) provide indicators of pigmentation and photosynthetic activity that can be used to model photosynthesis from remote sensing with the light-use efficiency model. To help develop and validate this approach, reliable proximal NDVI and PRI sensors have been needed. We tested new NDVI and PRI sensors, "SRS" sensors recently developed by Decagon Devices, during spring activation of photosynthetic activity in evergreen and deciduous stands. We also evaluated two methods of sensor cross-calibration, one that considered sky conditions (cloud cover) at midday only, and the other that also considered diurnal sun angle effects. Cross-calibration clearly affected sensor agreement with independent measurements, with the best method dependent upon the study aim and time frame (seasonal vs. diurnal). The seasonal patterns of NDVI and PRI differed for evergreen and deciduous species, demonstrating the complementary nature of these two indices. Over the spring season, PRI was most strongly influenced by changing chlorophyll : carotenoid pool sizes, while over the diurnal time scale PRI was most affected by the xanthophyll cycle epoxidation state. This finding demonstrates that the SRS PRI sensors can resolve different processes affecting PRI over different time scales. The advent of small, inexpensive, automated PRI and NDVI sensors offers new ways to explore environmental and physiological constraints on photosynthesis, and may be particularly well-suited for use at flux tower sites. Wider application of automated sensors could lead to improved integration of flux and remote sensing approaches to studying photosynthetic carbon uptake, and could help define the concept of contrasting vegetation optical types.

  10. Study of vegetation index selection and changing detection thresholds in land cover change detection assessment using change vector analysis

    NASA Astrophysics Data System (ADS)

    Nguyen, Duy; Tran, Giang

    2012-07-01

    In recent years, Vietnam's rapidly developing economy has led to rapid changes in land cover. The detection of land cover change therefore plays an important role in management and planning strategy. There are two main approaches to change detection research using remote sensing and GIS: the post-classification change detection approach and the pre-classification spectral change determination approach, each with its own advantages and disadvantages. The second approach is further divided into image differencing, multi-date principal component analysis (MPCA), and change vector analysis (CVA). In this study we apply the CVA method, which is based on two indices that capture the primary features of land cover: a vegetation index (NDVI) and a barren land index (BI). The applicability of CVA has been demonstrated in previous studies [1, 2, 3, 4]; however, those studies did not address the selection of the vegetation index or of the change detection threshold in the change detection assessment. This paper proposes an approach to solve these two problems.
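
    Since the record describes change vector analysis over an NDVI and a barren-land index, a minimal sketch of the core CVA computation follows. The index stacks are synthetic placeholders and the change threshold is an arbitrary assumption (threshold selection is precisely the open question the paper raises).

      # Minimal change vector analysis (CVA) sketch over two per-pixel indices (NDVI, BI).
      import numpy as np

      rng = np.random.default_rng(0)
      ndvi_t1, ndvi_t2 = rng.random((2, 100, 100))   # stand-ins for date-1 / date-2 NDVI
      bi_t1, bi_t2 = rng.random((2, 100, 100))       # stand-ins for date-1 / date-2 barren index

      d_ndvi = ndvi_t2 - ndvi_t1
      d_bi = bi_t2 - bi_t1

      magnitude = np.hypot(d_ndvi, d_bi)                 # length of the change vector
      direction = np.degrees(np.arctan2(d_bi, d_ndvi))   # angle characterizes the type of change

      changed = magnitude > 0.3                          # assumed threshold; tuning it is the open problem
      print("changed pixels:", int(changed.sum()))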

  11. Sensitivity testing of trypanosome detection by PCR from whole blood samples using manual and automated DNA extraction methods.

    PubMed

    Dunlop, J; Thompson, C K; Godfrey, S S; Thompson, R C A

    2014-11-01

    Automated extraction of DNA for testing of laboratory samples is an attractive alternative to labour-intensive manual methods when higher throughput is required. However, it is important to maintain the maximum detection sensitivity possible to reduce the occurrence of type II errors (false negatives; failure to detect the target when it is present), especially in the biomedical field, where PCR is used for diagnosis. We used blood infected with known concentrations of Trypanosoma copemani to test the impact of analysis techniques on trypanosome detection sensitivity by PCR. We compared combinations of a manual and an automated DNA extraction method and two different PCR primer sets to investigate the impact of each on detection levels. Both extraction techniques and specificity of primer sets had a significant impact on detection sensitivity. Samples extracted using the same DNA extraction technique performed substantially differently for each of the separate primer sets. Type I errors (false positives; detection of the target when it is not present), produced by contaminants, were avoided with both extraction methods. This study highlights the importance of testing laboratory techniques with known samples to optimise accuracy of test results.

  12. Census cities experiment in urban change detection

    NASA Technical Reports Server (NTRS)

    Wray, J. R. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Work continues on mapping of 1970 urban land use from 1970 census contemporaneous aircraft photography. In addition, change detection analysis from 1972 aircraft photography is underway for several urban test sites. Land use maps, mosaics, and census overlays for the two largest urban test sites are nearing publication readiness. Preliminary examinations of ERTS-1 imagery of San Francisco Bay have been conducted which show that tracts of land of more than 10 acres in size which are undergoing development in an urban setting can be identified. In addition, each spectral band is being evaluated as to its utility for urban analyses. It has been found that MSS infrared band 7 helps to differentiate intra-urban land use details not found in other MSS bands or in the RBV coverage of the same scene. Good quality false CIR composites have been generated from 9 x 9 inch positive MSS bands using the Diazo process.

  13. Feasibility of fully automated detection of fiducial markers implanted into the prostate using electronic portal imaging: A comparison of methods

    SciTech Connect

    Harris, Emma J., E-mail: eharris@icr.ac.uk; McNair, Helen A.; Evans, Phillip M.

    2006-11-15

    Purpose: To investigate the feasibility of fully automated detection of fiducial markers implanted into the prostate using portal images acquired with an electronic portal imaging device. Methods and Materials: We have made a direct comparison of 4 different methods (2 template matching-based methods, a method incorporating attenuation and constellation analyses, and a cross-correlation method) that have been published in the literature for the automatic detection of fiducial markers. The cross-correlation technique requires a priori information from the portal images; therefore, the technique is not fully automated for the first treatment fraction. Images of 7 patients implanted with gold fiducial markers (8 mm in length and 1 mm in diameter) were acquired before treatment (set-up images) and during treatment (movie images) using 1 MU and 15 MU per image, respectively. Images included 75 anterior (AP) and 69 lateral (LAT) set-up images and 51 AP and 83 LAT movie images. Using the different methods described in the literature, marker positions were automatically identified. Results: The method based upon cross-correlation techniques gave the highest percentage detection success rates of 99% (AP) and 83% (LAT) for set-up (1 MU) images. The other methods gave detection success rates of less than 91% (AP) and 42% (LAT) for set-up images. The amount of a priori information used, and how it affects the way the techniques are implemented, is discussed. Conclusions: Fully automated marker detection in set-up images for the first treatment fraction is unachievable using these methods; cross-correlation is the best technique for automatic detection on subsequent radiotherapy treatment fractions.

  14. Treehuggers: Wireless Sensor Networks for Automated Measurement and Reporting of Changes in Tree Diameter

    NASA Astrophysics Data System (ADS)

    DeLucia, E. H.; Mies, T. A.; Anderson-Teixeira, K. J.; Bohleber, A. P.; Herrmann, V.

    2014-12-01

    Ground-based measurements of changes in tree diameter and subsequent calculation of carbon storage provide validation of indirect estimates of forest productivity from remote sensing platforms, and measurements made with high temporal resolution provide critical information about the responsiveness of tree growth to variations in important physical drivers (e.g. temperature and water availability). We have developed an environmentally robust instrument for automated measurement of expansion and contraction in tree diameter that can be deployed in remote locations (TreeHuggers; TH). TH uses a membrane potentiometer to measure changes in circumference with resolution ≤ 6 mm at user-selected intervals (≥ 1 min). Simultaneous measurement of temperature is used to correct for the thermal properties of the stainless steel band. Data are stored on micro SD cards and transmitted tree-to-tree to a base station. Preliminary measurement of beech trees shows the precise initiation of growth and the emergence of diel changes in stem diameter associated with sap flow. Because of their low cost and on-board data logging and communication packages, TH will greatly increase the capacity of the scientific community and private sectors to monitor tree growth and carbon storage. Possible applications include deploying TH in the footprint of eddy covariance sites to help interpret drivers affecting net ecosystem exchange and evapotranspiration. A large scale implementation of TH will contribute to our ability to forecast changes in the carbon sink strength of forests across environmental gradients and biotic disturbances, and they could prove useful in assessing changes in forest stocks as part of evaluating carbon offsets purchased by commercial entities.
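
    The dendrometer arithmetic implied above (band circumference to stem diameter, with a correction for thermal expansion of the steel band) can be sketched as follows. The expansion coefficient and reference temperature are nominal assumed values, not the instrument's calibration constants.

      # Circumference-to-diameter conversion with a first-order steel-band temperature correction.
      import math

      ALPHA_STEEL = 17e-6      # 1/degC, nominal for stainless steel (assumption)
      T_REF = 20.0             # degC reference temperature (assumption)

      def diameter_change_mm(delta_circ_mm, band_length_mm, temp_c):
          """Stem diameter change inferred from a band circumference change."""
          # Apparent circumference change caused purely by the band expanding or contracting.
          thermal_mm = band_length_mm * ALPHA_STEEL * (temp_c - T_REF)
          return (delta_circ_mm - thermal_mm) / math.pi

      print(diameter_change_mm(delta_circ_mm=0.50, band_length_mm=800.0, temp_c=30.0))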

  15. Automated Non-invasive Video-Microscopy of Oyster Spat Heart Rate during Acute Temperature Change: Impact of Acclimation Temperature.

    PubMed

    Domnik, Nicolle J; Polymeropoulos, Elias T; Elliott, Nicholas G; Frappell, Peter B; Fisher, John T

    2016-01-01

    We developed an automated, non-invasive method to detect real-time cardiac contraction in post-larval (1.1-1.7 mm length), juvenile oysters (i.e., oyster spat) via a fiber-optic trans-illumination system. The system is housed within a temperature-controlled chamber and video microscopy imaging of the heart was coupled with video edge-detection to measure cardiac contraction, inter-beat interval, and heart rate (HR). We used the method to address the hypothesis that cool acclimation (10°C vs. 22°C-Ta10 or Ta22, respectively; each n = 8) would preserve cardiac phenotype (assessed via HR variability, HRV analysis and maintained cardiac activity) during acute temperature changes. The temperature ramp (TR) protocol comprised 2°C steps (10 min/experimental temperature, Texp) from 22°C to 10°C to 22°C. HR was related to Texp in both acclimation groups. Spat became asystolic at low temperatures, particularly Ta22 spat (Ta22: 8/8 vs. Ta10: 3/8 asystolic at Texp = 10°C). The rate of HR decrease during cooling was less in Ta10 vs. Ta22 spat when asystole was included in analysis (P = 0.026). Time-domain HRV was inversely related to temperature and elevated in Ta10 vs. Ta22 spat (P < 0.001), whereas a lack of defined peaks in spectral density precluded frequency-domain analysis. Application of the method during an acute cooling challenge revealed that cool temperature acclimation preserved active cardiac contraction in oyster spat and increased time-domain HRV responses, whereas warm acclimation enhanced asystole. These physiologic changes highlight the need for studies of mechanisms, and have translational potential for oyster aquaculture practices. PMID:27445833
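
    The time-domain HRV quantities referred to above can be computed from the detected beat times; which specific statistic the authors used is not stated here, so SDNN and RMSSD are shown as common examples on made-up beat timestamps.

      # Time-domain HRV from beat timestamps (seconds): SDNN and RMSSD in milliseconds.
      import numpy as np

      def time_domain_hrv(beat_times_s):
          ibi_ms = np.diff(beat_times_s) * 1000.0          # inter-beat intervals
          sdnn = ibi_ms.std(ddof=1)                        # overall IBI variability
          rmssd = np.sqrt(np.mean(np.diff(ibi_ms) ** 2))   # beat-to-beat variability
          return sdnn, rmssd

      beats = np.array([0.00, 1.02, 2.08, 3.05, 4.15, 5.10])   # example beat times (s)
      print(time_domain_hrv(beats))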

  16. Automated Non-invasive Video-Microscopy of Oyster Spat Heart Rate during Acute Temperature Change: Impact of Acclimation Temperature

    PubMed Central

    Domnik, Nicolle J.; Polymeropoulos, Elias T.; Elliott, Nicholas G.; Frappell, Peter B.; Fisher, John T.

    2016-01-01

    We developed an automated, non-invasive method to detect real-time cardiac contraction in post-larval (1.1–1.7 mm length), juvenile oysters (i.e., oyster spat) via a fiber-optic trans-illumination system. The system is housed within a temperature-controlled chamber and video microscopy imaging of the heart was coupled with video edge-detection to measure cardiac contraction, inter-beat interval, and heart rate (HR). We used the method to address the hypothesis that cool acclimation (10°C vs. 22°C—Ta10 or Ta22, respectively; each n = 8) would preserve cardiac phenotype (assessed via HR variability, HRV analysis and maintained cardiac activity) during acute temperature changes. The temperature ramp (TR) protocol comprised 2°C steps (10 min/experimental temperature, Texp) from 22°C to 10°C to 22°C. HR was related to Texp in both acclimation groups. Spat became asystolic at low temperatures, particularly Ta22 spat (Ta22: 8/8 vs. Ta10: 3/8 asystolic at Texp = 10°C). The rate of HR decrease during cooling was less in Ta10 vs. Ta22 spat when asystole was included in analysis (P = 0.026). Time-domain HRV was inversely related to temperature and elevated in Ta10 vs. Ta22 spat (P < 0.001), whereas a lack of defined peaks in spectral density precluded frequency-domain analysis. Application of the method during an acute cooling challenge revealed that cool temperature acclimation preserved active cardiac contraction in oyster spat and increased time-domain HRV responses, whereas warm acclimation enhanced asystole. These physiologic changes highlight the need for studies of mechanisms, and have translational potential for oyster aquaculture practices. PMID:27445833

  17. Flow cytometric-membrane potential detection of sodium channel active marine toxins: application to ciguatoxins in fish muscle and feasibility of automating saxitoxin detection.

    PubMed

    Manger, Ronald; Woodle, Doug; Berger, Andrew; Dickey, Robert W; Jester, Edward; Yasumoto, Takeshi; Lewis, Richard; Hawryluk, Timothy; Hungerford, James

    2014-01-01

    Ciguatoxins are potent neurotoxins with a significant public health impact. Cytotoxicity assays have allowed the most sensitive means of detection of ciguatoxin-like activity without reliance on mouse bioassays and have been invaluable in studying outbreaks. An improvement of these cell-based assays is presented here in which rapid flow cytometric detection of ciguatoxins and saxitoxins is demonstrated using fluorescent voltage sensitive dyes. A depolarization response can be detected directly due to ciguatoxin alone; however, an approximate 1000-fold increase in sensitivity is observed in the presence of veratridine. These results demonstrate that flow cytometric assessment of ciguatoxins is possible at levels approaching the trace detection limits of our earlier cytotoxicity assays, however, with a significant reduction in analysis time. Preliminary results are also presented for detection of brevetoxins and for automation and throughput improvements to a previously described method for detecting saxitoxins in shellfish extracts.

  18. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    SciTech Connect

    Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat; Ruebel, O.; Weber, G.; Hamann, B.

    2010-05-21

    Scientific data mining is increasingly considered. In plasma simulations, Bagherjeiran et al. presented a comprehensive report on applying graph-based techniques for orbit classification. They used the KAM classifier to label points and components in single and multiple orbits. Love et al. conducted an image space analysis of coherent structures in plasma simulations. They used a number of segmentation and region-growing techniques to isolate regions of interest in orbit plots. Both approaches analyzed particle accelerator data, targeting the system dynamics in terms of particle orbits. However, they did not address particle dynamics as a function of time, nor did they inspect the behavior of bunches of particles. Ruebel et al. addressed the visual analysis of massive laser wakefield acceleration (LWFA) simulation data using interactive procedures to query the data. Sophisticated visualization tools were provided to inspect the data manually. Ruebel et al. have integrated these tools into the visualization and analysis system VisIt, in addition to utilizing efficient data management based on HDF5, H5Part, and the index/query tool FastBit. Ruebel et al. further proposed automatic beam path analysis using a suite of methods to classify particles in simulation data and to analyze their temporal evolution. To enable researchers to accurately define particle beams, the method computes a set of measures based on the path of particles relative to the distance of the particles to a beam. To achieve good performance, this framework uses an analysis pipeline designed to quickly reduce the amount of data that needs to be considered in the actual path distance computation. As part of this process, region-growing methods are utilized to detect particle bunches at single time steps. Efficient data reduction is essential to enable automated analysis of large data sets as described in the next section, where data reduction methods are steered to the particular requirements of our clustering analysis.

  19. M-Track: A New Software for Automated Detection of Grooming Trajectories in Mice.

    PubMed

    Reeves, Sheldon L; Fleming, Kelsey E; Zhang, Lin; Scimemi, Annalisa

    2016-09-01

    Grooming is a complex and robust innate behavior, commonly performed by most vertebrate species. In mice, grooming consists of a series of stereotyped patterned strokes, performed along the rostro-caudal axis of the body. The frequency and duration of each grooming episode is sensitive to changes in stress levels, social interactions and pharmacological manipulations, and is therefore used in behavioral studies to gain insights into the function of brain regions that control movement execution and anxiety. Traditional approaches to analyze grooming rely on manually scoring the time of onset and duration of each grooming episode, and are often performed on grooming episodes triggered by stress exposure, which may not be entirely representative of spontaneous grooming in freely-behaving mice. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important to understand movement stereotypy and bilateral coordination in mice. Currently available commercial and freeware video-tracking software allow automated tracking of the whole body of a mouse or of its head and tail, not of individual forepaws. Here we describe a simple experimental set-up and a novel open-source code, named M-Track, for simultaneously tracking the movement of individual forepaws during spontaneous grooming in multiple freely-behaving mice. This toolbox provides a simple platform to perform trajectory analysis of forepaw movement during distinct grooming episodes. By using M-track we show that, in C57BL/6 wild type mice, the speed and bilateral coordination of the left and right forepaws remain unaltered during the execution of distinct grooming episodes. Stress exposure induces a profound increase in the length of the forepaw grooming trajectories. M-Track provides a valuable and user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies. PMID:27636358

  20. M-Track: A New Software for Automated Detection of Grooming Trajectories in Mice

    PubMed Central

    Zhang, Lin

    2016-01-01

    Grooming is a complex and robust innate behavior, commonly performed by most vertebrate species. In mice, grooming consists of a series of stereotyped patterned strokes, performed along the rostro-caudal axis of the body. The frequency and duration of each grooming episode is sensitive to changes in stress levels, social interactions and pharmacological manipulations, and is therefore used in behavioral studies to gain insights into the function of brain regions that control movement execution and anxiety. Traditional approaches to analyze grooming rely on manually scoring the time of onset and duration of each grooming episode, and are often performed on grooming episodes triggered by stress exposure, which may not be entirely representative of spontaneous grooming in freely-behaving mice. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important to understand movement stereotypy and bilateral coordination in mice. Currently available commercial and freeware video-tracking software allow automated tracking of the whole body of a mouse or of its head and tail, not of individual forepaws. Here we describe a simple experimental set-up and a novel open-source code, named M-Track, for simultaneously tracking the movement of individual forepaws during spontaneous grooming in multiple freely-behaving mice. This toolbox provides a simple platform to perform trajectory analysis of forepaw movement during distinct grooming episodes. By using M-track we show that, in C57BL/6 wild type mice, the speed and bilateral coordination of the left and right forepaws remain unaltered during the execution of distinct grooming episodes. Stress exposure induces a profound increase in the length of the forepaw grooming trajectories. M-Track provides a valuable and user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies. PMID:27636358

  1. Detecting Thermohaline Circulation Changes from Ocean properties

    NASA Astrophysics Data System (ADS)

    Hu, A.; Meehl, G. A.; Han, W.

    2003-12-01

    height gradient (SHG) between 30°S and 60°N in the Atlantic. In HOS and CON, a higher SSS contrast is related to a stronger THC, but the opposite holds in TRC. However, a colder NP (warmer NA) is related to a stronger THC in both forced runs. The SHG in the Atlantic gives the most consistent result among these three runs; however, the linear regression shows a 17 Sv change in THC per change of one cm/deg-lat in SHG for CON and HOS, with this number increasing to 29 for TRC. EOF analyses of global SST indicate that the first EOF in CON, explaining 14% of the total variance, is an ENSO-related pattern with a 2.6-year period. In the forced runs, this pattern becomes the second EOF with a period of 3.3 to 4.5 years, explaining 11% and 3% of the total variance for HOS and TRC, respectively. The first EOFs are a general cooling in the Northern Hemisphere and warming in the Southern Hemisphere in HOS (explaining 12% of the variance) and a global warming in TRC (explaining 70% of the variance). The general conclusion is that the proposed mechanisms for detecting THC strength hold for past and current climate conditions, but do not hold perfectly for the future (at least in NCAR's CCSM2.0). The increase in atmospheric CO2 levels appears to have changed the behavior of the THC, causing a breakdown of many of the teleconnections between the THC and other quantities.

  2. Rational Manual and Automated Scoring Thresholds for the Immunohistochemical Detection of TP53 Missense Mutations in Human Breast Carcinomas.

    PubMed

    Taylor, Nicholas J; Nikolaishvili-Feinberg, Nana; Midkiff, Bentley R; Conway, Kathleen; Millikan, Robert C; Geradts, Joseph

    2016-07-01

    Missense mutations in TP53 are common in human breast cancer, have been associated with worse prognosis, and may predict therapy effect. TP53 missense mutations are associated with aberrant accumulation of p53 protein in tumor cell nuclei. Previous studies have used relatively arbitrary cutoffs to characterize breast tumors as positive for p53 staining by immunohistochemical assays. This study aimed to objectively determine optimal thresholds for p53 positivity by manual and automated scoring methods using whole tissue sections from the Carolina Breast Cancer Study. p53-immunostained slides were available for 564 breast tumors previously assayed for TP53 mutations. Average nuclear p53 staining intensity was manually scored as negative, borderline, weak, moderate, or strong and percentage of positive tumor cells was estimated. Automated p53 signal intensity was measured using the Aperio nuclear v9 algorithm combined with the Genie histology pattern recognition tool and tuned to achieve optimal nuclear segmentation. Receiver operating characteristic curve analysis was performed to determine optimal cutoffs for average staining intensity and percent cells positive to distinguish between tumors with and without a missense mutation. Receiver operating characteristic curve analysis demonstrated a threshold of moderate average nuclear staining intensity as a good surrogate for TP53 missense mutations in both manual (area under the curve=0.87) and automated (area under the curve=0.84) scoring systems. Both manual and automated immunohistochemical scoring methods predicted missense mutations in breast carcinomas with high accuracy. Validation of the automated intensity scoring threshold suggests a role for such algorithms in detecting TP53 missense mutations in high throughput studies.
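
    A hedged sketch of deriving a staining-intensity cutoff from an ROC curve, as described above, follows. Youden's J is shown as one common criterion for the optimal threshold, not necessarily the criterion the authors used; the labels and ordinal intensity scores are made-up placeholders.

      # ROC-based threshold selection for an ordinal staining-intensity score.
      import numpy as np
      from sklearn.metrics import roc_curve, auc

      # 1 = tumor carries a TP53 missense mutation, 0 = wild type (placeholder labels)
      y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0])
      # Average nuclear staining intensity, e.g. 0 = negative ... 4 = strong (placeholder scores)
      scores = np.array([0, 1, 3, 4, 1, 2, 3, 0, 4, 2])

      fpr, tpr, thresholds = roc_curve(y_true, scores)
      print("AUC:", auc(fpr, tpr))

      best = np.argmax(tpr - fpr)           # Youden's J = sensitivity + specificity - 1
      print("optimal intensity cutoff:", thresholds[best])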

  3. 2006 Automation Survey: The Systems Are Changing. But School Libraries Aren't

    ERIC Educational Resources Information Center

    Fuller, Daniel

    2006-01-01

    This article presents the findings of the 2006 School Library Journal-San Jose State University Automation Survey. The study takes a close look at the systems that media specialists are using, how they are using them, and what librarians want from their future automation programs. The findings reveal that while respondents were satisfied with…

  4. Attribute and topology based change detection in a constellation of previously detected objects

    DOEpatents

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
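
    The attribute-comparison idea in the record above can be sketched as follows: each newly detected object is compared against a "constellation" of previously detected objects, and flagged as a change if no stored object has sufficiently similar attributes. The attribute set and matching thresholds are illustrative assumptions, not the patented method itself.

      # Attribute-based matching of new detections against previously detected objects.
      from dataclasses import dataclass
      import math

      @dataclass
      class DetectedObject:
          x: float
          y: float
          size: float
          orientation_deg: float

      def is_change(new, constellation, max_dist=2.0, max_size_diff=0.5, max_angle=20.0):
          """True if no previously detected object matches the new detection's attributes."""
          for old in constellation:
              close = math.hypot(new.x - old.x, new.y - old.y) <= max_dist
              similar = (abs(new.size - old.size) <= max_size_diff and
                         abs(new.orientation_deg - old.orientation_deg) <= max_angle)
              if close and similar:
                  return False          # matches an existing object -> not a change
          return True                    # unmatched -> newly appeared (or altered) object

      previous = [DetectedObject(10.0, 4.0, 1.2, 30.0), DetectedObject(12.5, 7.1, 0.8, 90.0)]
      print(is_change(DetectedObject(10.3, 4.2, 1.1, 28.0), previous))   # False: matches first object
      print(is_change(DetectedObject(20.0, 1.0, 2.0, 10.0), previous))   # True: no match -> change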

  5. The use of adaptable automation: Effects of extended skill lay-off and changes in system reliability.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain

    2017-01-01

    This experiment aimed to examine how skill lay-off and system reliability would affect operator behaviour in a simulated work environment under wide-range and large-choice adaptable automation comprising six different levels. Twenty-four participants were tested twice during a 2-hr testing session, with the second session taking place 8 months after the first. In the middle of the second testing session, system reliability changed. The results showed that after the retention interval trust increased and self-confidence decreased. Complacency was unaffected by the lay-off period. Diagnostic speed slowed down after the retention interval but diagnostic accuracy was maintained. No difference between experimental conditions was found for automation management behaviour (i.e. level of automation chosen and frequency of switching between levels). There were few effects of system reliability. Overall, the findings showed that subjective measures were more sensitive to the impact of skill lay-off than objective behavioural measures.

  6. The use of adaptable automation: Effects of extended skill lay-off and changes in system reliability.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain

    2017-01-01

    This experiment aimed to examine how skill lay-off and system reliability would affect operator behaviour in a simulated work environment under wide-range and large-choice adaptable automation comprising six different levels. Twenty-four participants were tested twice during a 2-hr testing session, with the second session taking place 8 months after the first. In the middle of the second testing session, system reliability changed. The results showed that after the retention interval trust increased and self-confidence decreased. Complacency was unaffected by the lay-off period. Diagnostic speed slowed down after the retention interval but diagnostic accuracy was maintained. No difference between experimental conditions was found for automation management behaviour (i.e. level of automation chosen and frequency of switching between levels). There were few effects of system reliability. Overall, the findings showed that subjective measures were more sensitive to the impact of skill lay-off than objective behavioural measures. PMID:27633244

  7. Texture analysis and classification in coherent anti-Stokes Raman scattering (CARS) microscopy images for automated detection of skin cancer.

    PubMed

    Legesse, Fisseha Bekele; Medyukhina, Anna; Heuke, Sandro; Popp, Jürgen

    2015-07-01

    Coherent anti-Stokes Raman scattering (CARS) microscopy is a powerful tool for fast label-free tissue imaging, which is promising for early medical diagnostics. To facilitate the diagnostic process, automatic image analysis algorithms, which are capable of extracting relevant features from the image content, are needed. In this contribution we perform an automated classification of healthy and tumor areas in CARS images of basal cell carcinoma (BCC) skin samples. The classification is based on extraction of texture features from image regions and subsequent classification of these regions into healthy and cancerous with a perceptron algorithm. The developed approach is capable of an accurate classification of texture types with high sensitivity and specificity, which is an important step towards an automated tumor detection procedure. PMID:25797604
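
    In the spirit of the region-wise texture classification described above, here is a hedged sketch in which simple first-order texture statistics per image patch feed a perceptron. The feature set and the synthetic patches are illustrative only; they are not the texture descriptors or CARS data used by the authors.

      # Patch texture statistics (mean, standard deviation, histogram entropy) + perceptron.
      import numpy as np
      from sklearn.linear_model import Perceptron

      def patch_features(patch):
          p = patch.astype(float).ravel()
          hist, _ = np.histogram(p, bins=32)
          prob = hist / hist.sum()
          prob = prob[prob > 0]
          entropy = -np.sum(prob * np.log2(prob))
          return [p.mean(), p.std(), entropy]

      rng = np.random.default_rng(0)
      patches = rng.integers(0, 256, size=(40, 32, 32))   # synthetic grayscale patches
      labels = rng.integers(0, 2, 40)                     # placeholder healthy/tumor labels

      X = np.array([patch_features(p) for p in patches])
      clf = Perceptron(max_iter=1000).fit(X, labels)
      print(clf.predict(X[:5]))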

  8. Automated detection of kinks from blood vessels for optic cup segmentation in retinal images

    NASA Astrophysics Data System (ADS)

    Wong, D. W. K.; Liu, J.; Lim, J. H.; Li, H.; Wong, T. Y.

    2009-02-01

    The accurate localization of the optic cup in retinal images is important to assess the cup-to-disc ratio (CDR) for glaucoma screening and management. Glaucoma is physiologically assessed by the increased excavation of the optic cup within the optic nerve head, also known as the optic disc. The CDR is thus an important indicator of risk and severity of glaucoma. In this paper, we propose a method of determining the cup boundary using non-stereographic retinal images by the automatic detection of morphological features within the optic disc known as kinks. Kinks are defined as the bendings of small vessels as they traverse from the disc to the cup, providing physiological validation for the cup boundary. To detect kinks, localized patches are first generated from a preliminary cup boundary obtained via a level set method. Features obtained using edge detection and the wavelet transform are combined using a statistical rule to identify likely vessel edges. The kinks are then obtained automatically by analyzing the detected vessel edges for angular changes, and these kinks are subsequently used to obtain the cup boundary. A set of retinal images from the Singapore Eye Research Institute was obtained to assess the performance of the method, with each image being clinically graded for the CDR. From experiments, when kinks were used, the error on the CDR was reduced to less than 0.1 CDR units relative to the clinical CDR, which is within the intra-observer variability of 0.2 CDR units.

  9. [Application of optical flow dynamic texture in land use/cover change detection].

    PubMed

    Yan, Li; Gong, Yi-Long; Zhang, Yi; Duan, Wei

    2014-11-01

    In the present study, a novel change detection approach for high-resolution remote sensing images is proposed based on optical flow dynamic texture (OFDT), which extracts land use and land cover change information automatically with a dynamic description of ground-object changes. Using optical flow theory, the approach describes the gradual change process of ground objects, breaking with the assumption of sudden ground-object change made by previous remote sensing change detection methods. Because the steps of the method are simple, it can be integrated into systems and software, such as land resource management and urban planning software, that need to find ground-object changes. The method takes the temporal dimension between remote sensing images into account, which provides a richer set of information for change detection and improves on the current situation in which most change detection methods depend mainly on spatial information. Here, the optical flow dynamic texture is the basic reflection of change and is used, combined with spectral information, in support vector machine post-classification change detection on high-resolution remote sensing images. The texture in the temporal dimension considered here involves a smaller amount of data than most textures in the spatial dimensions. The highly automated texture computation has only one parameter to set, which relaxes the present burden of manual evaluation. The effectiveness of the proposed approach is evaluated with the 2011 and 2012 QuickBird datasets covering Duerbert Mongolian Autonomous County of Daqing City, China. The effects of different optical flow smoothing coefficients and their impact on the description of ground-object changes are then analyzed in depth. The experimental result is satisfactory, with an 87.29% overall accuracy and a 0.8507 Kappa index, and the method achieves better
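
    A per-pixel "dynamic texture" in the spirit of the record above can be sketched with dense optical flow between two co-registered dates; OpenCV's Farneback flow is used here as a stand-in, and neither the original OFDT formulation nor its single smoothing parameter is reproduced.

      # Dense optical flow magnitude between two dates as an extra change feature.
      import cv2
      import numpy as np

      rng = np.random.default_rng(0)
      img_t1 = (rng.random((256, 256)) * 255).astype(np.uint8)   # stand-in for the date-1 image
      img_t2 = np.roll(img_t1, shift=3, axis=1)                  # date-2 image with apparent motion

      # Positional arguments: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
      flow = cv2.calcOpticalFlowFarneback(img_t1, img_t2, None, 0.5, 3, 15, 3, 5, 1.2, 0)
      magnitude = np.linalg.norm(flow, axis=2)      # per-pixel flow magnitude

      # The magnitude map could be stacked with the spectral bands before SVM
      # post-classification change detection, as the record suggests.
      print(magnitude.mean())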

  10. Automated Flaw Detection Scheme For Cast Austenitic Stainless Steel Weld Specimens Using Hilbert Huang Transform Of Ultrasonic Phased Array Data

    SciTech Connect

    Khan, T.; Majumdar, Shantanu; Udpa, L.; Ramuhalli, Pradeep; Crawford, Susan L.; Diaz, Aaron A.; Anderson, Michael T.

    2012-01-01

    The objective of this work is to develop processing algorithms to detect and localize the flaws using NDE ultrasonic data. Data was collected using cast austenitic stainless steel (CASS) weld specimens on-loan from the U.S. nuclear power industry’s Pressurized Water Reactor Owners Group (PWROG) specimen set. Each specimen consists of a centrifugally cast stainless steel (CCSS) pipe section welded to a statically cast (SCSS) or wrought (WRSS) section. The paper presents a novel automated flaw detection and localization scheme using low frequency ultrasonic phased array inspection signals in the weld and heat affected zone of the base materials. The major steps of the overall scheme are preprocessing and region of interest (ROI) detection followed by the Hilbert Huang transform (HHT) of A-scans in the detected ROIs. HHT offers time-frequency-energy distribution for each ROI. The accumulation of energy in a particular frequency band is used as a classification feature for the particular ROI.
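
    A hedged sketch of the HHT-style feature described above: decompose one A-scan into intrinsic mode functions, take the Hilbert transform of each, and accumulate instantaneous energy falling in a chosen frequency band. The PyEMD package is used as a stand-in EMD implementation, and the sampling rate and band limits are assumptions.

      # Band-limited Hilbert-Huang energy feature for one ultrasonic A-scan.
      import numpy as np
      from scipy.signal import hilbert
      from PyEMD import EMD            # pip install EMD-signal (assumed dependency)

      def band_energy_feature(a_scan, fs, f_lo, f_hi):
          """Sum of instantaneous energy whose instantaneous frequency lies in [f_lo, f_hi]."""
          imfs = EMD()(a_scan)
          energy = 0.0
          for imf in imfs:
              analytic = hilbert(imf)
              amp = np.abs(analytic)
              phase = np.unwrap(np.angle(analytic))
              inst_freq = np.diff(phase) * fs / (2.0 * np.pi)
              in_band = (inst_freq >= f_lo) & (inst_freq <= f_hi)
              energy += np.sum(amp[1:][in_band] ** 2)
          return energy

      fs = 25e6                                     # assumed sampling rate (Hz)
      t = np.arange(0, 2e-5, 1.0 / fs)
      a_scan = np.sin(2 * np.pi * 1.0e6 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
      print(band_energy_feature(a_scan, fs, f_lo=0.5e6, f_hi=1.5e6))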

  11. Automated Bayesian model development for frequency detection in biological time series

    PubMed Central

    2011-01-01

    Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly
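
    A minimal sketch of single-frequency Bayesian spectrum analysis in the spirit of the record above (a Bretthorst-style posterior for one stationary sinusoid with the amplitude and noise level marginalized out) is given below; the exact model used by the authors may differ, and the data are synthetic.

      # Unnormalized log posterior over trial frequencies for a single-sinusoid model.
      import numpy as np

      def log_frequency_posterior(t, d, freqs):
          d = d - d.mean()
          n = d.size
          mean_sq = np.mean(d ** 2)
          logp = np.empty(freqs.size)
          for i, f in enumerate(freqs):
              c = np.sum(d * np.cos(2 * np.pi * f * t))
              s = np.sum(d * np.sin(2 * np.pi * f * t))
              periodogram = (c ** 2 + s ** 2) / n                  # Schuster periodogram C(f)
              arg = max(1e-12, 1.0 - 2.0 * periodogram / (n * mean_sq))
              logp[i] = (2.0 - n) / 2.0 * np.log(arg)              # "Student-t" form of the posterior
          return logp

      t = np.linspace(0, 48, 97)                                   # e.g. hourly samples over two days
      d = np.sin(2 * np.pi * t / 24.0) + 0.4 * np.random.default_rng(0).normal(size=t.size)
      freqs = np.linspace(0.01, 0.2, 500)                          # cycles per hour
      best = freqs[np.argmax(log_frequency_posterior(t, d, freqs))]
      print("most probable period (h):", 1.0 / best)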

  12. Ensembles of detectors for online detection of transient changes

    NASA Astrophysics Data System (ADS)

    Artemov, Alexey; Burnaev, Evgeny

    2015-12-01

    Classical change-point detection procedures assume that the change-point model is known and that a change consists in establishing a new observation regime, i.e. the change lasts infinitely long. These modeling assumptions contradict the statements of many applied problems. Therefore, even theoretically optimal statistics very often fail in practice when detecting transient changes online. In this work, in order to overcome the limitations of classical change-point detection procedures, we consider approaches to constructing ensembles of change-point detectors, i.e. algorithms that use many detectors to reliably identify a change-point. We propose a learning paradigm and specific implementations of ensembles for the detection of short-term (transient) changes in observed time series. We demonstrate by means of numerical experiments that the performance of an ensemble is superior to that of conventional change-point detection procedures.
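
    A hedged sketch of one possible ensemble of elementary detectors for online transient-change detection follows: several one-sided CUSUM detectors with different drift/threshold settings vote at each sample. The specific learning paradigm and detector pool used by the authors are not reproduced here.

      # Ensemble of CUSUM detectors with majority voting on a series containing a short transient.
      import numpy as np

      def cusum_alarms(x, drift, threshold):
          """Upper one-sided CUSUM; returns a boolean alarm flag per sample."""
          g, alarms = 0.0, np.zeros(x.size, dtype=bool)
          for i, xi in enumerate(x):
              g = max(0.0, g + xi - drift)
              alarms[i] = g > threshold
              if alarms[i]:
                  g = 0.0                      # reset after an alarm
          return alarms

      def ensemble_alarms(x, settings, min_votes):
          """Alarm whenever at least min_votes of the individual detectors alarm."""
          votes = np.sum([cusum_alarms(x, d, h) for d, h in settings], axis=0)
          return votes >= min_votes

      rng = np.random.default_rng(1)
      x = rng.normal(0.0, 1.0, 300)
      x[150:170] += 2.0                        # short-lived (transient) mean shift
      settings = [(0.25, 4.0), (0.5, 3.0), (1.0, 2.0)]   # assumed drift/threshold pairs
      print(np.nonzero(ensemble_alarms(x, settings, min_votes=2))[0][:5])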

  13. Automated JPSS VIIRS GEO code change testing by using Chain Run Scripts

    NASA Astrophysics Data System (ADS)

    Chen, W.; Wang, W.; Zhao, Q.; Das, B.; Mikles, V. J.; Sprietzer, K.; Tsidulko, M.; Zhao, Y.; Dharmawardane, V.; Wolf, W.

    2015-12-01

    The Joint Polar Satellite System (JPSS) is the next-generation polar-orbiting operational environmental satellite system. The first satellite in the JPSS series, J-1, is scheduled to launch in early 2017. J-1 will carry similar versions of the instruments that are on board the Suomi National Polar-orbiting Partnership (S-NPP) satellite, which was launched on October 28, 2011. The Center for Satellite Applications and Research (STAR) Algorithm Integration Team (AIT) uses the Algorithm Development Library (ADL) to run S-NPP and pre-J1 algorithms in a development and test mode. The ADL is an offline test system developed by Raytheon to mimic the operational system while enabling a development environment for plug-and-play algorithms. Perl Chain Run Scripts have been developed by the STAR AIT to automate the staging and processing of multiple JPSS Sensor Data Record (SDR) and Environmental Data Record (EDR) products. Based on prelaunch testing, the JPSS J1 VIIRS Day-Night Band (DNB) has an anomalous non-linear response at high scan angles. The flight project has proposed multiple mitigation options through onboard aggregation, and Option 21 has been suggested by the VIIRS SDR team as the baseline aggregation mode. VIIRS geolocation (GEO) code analysis results show that the J1 DNB GEO product cannot be generated correctly without a software update. The modified code will support both Op21 and Op21/26 and is backward compatible with S-NPP. The J1 GEO code change version 0 delivery package is under development for the current change request. In this presentation, we discuss how to use the Chain Run Scripts to verify the code change and Lookup Table (LUT) updates in ADL Block 2.

  14. Supervised automated microscopy increases sensitivity and efficiency of detection of sentinel node micrometastases in patients with breast cancer

    PubMed Central

    Mesker, W E; Torrenga, H; Sloos, W C R; Vrolijk, H; Tollenaar, R A E M; de Bruin, P C; van Diest, P J; Tanke, H J

    2004-01-01

    Aims: To investigate the practicality and sensitivity of supervised automated microscopy (AM) for the detection of micrometastasis in sentinel lymph nodes (SLNs) from patients with breast carcinoma. Methods: In total, 440 SLN slides (immunohistochemically stained for cytokeratin) from 86 patients were obtained from two hospitals. Samples were selected on the basis of: (1) a pathology report mentioning micrometastases or isolated tumour cells (ITCs) and (2) reported as negative nodes (N0). Results: From a test set of 29 slides (12 SLN positive patients, including positive and negative nodes), 18 slides were scored positive by supervised AM and 11 were negative. Routine examination revealed 17 positive slides and 12 negative. Subsequently, automated reanalysis of 187 slides (34 patients; institute I) and 216 slides (40 patients; institute II) from reported node negative (N0) patients showed that two and seven slides (from two and five patients, respectively) contained ITCs, respectively, all confirmed by the pathologists, corresponding to 5.9% and 12.5% missed patients. In four of the seven missed cases from institute II, AM also detected clusters of four to 30 cells, but all with a size ⩽ 0.2 mm. Conclusions: Supervised AM is a more sensitive method for detecting immunohistochemically stained micrometastasis and ITCs in SLNs than routine pathology. However, the clinical relevance of detecting cytokeratin positive cells in SLNs of patients with breast cancer is still an unresolved issue and is at the moment being validated in larger clinical trials. PMID:15333658

  15. Color changing photonic crystals detect blast exposure.

    PubMed

    Cullen, D Kacy; Xu, Yongan; Reneer, Dexter V; Browne, Kevin D; Geddes, James W; Yang, Shu; Smith, Douglas H

    2011-01-01

    Blast-induced traumatic brain injury (bTBI) is the "signature wound" of the current wars in Iraq and Afghanistan. However, with no objective information of relative blast exposure, warfighters with bTBI may not receive appropriate medical care and are at risk of being returned to the battlefield. Accordingly, we have created a colorimetric blast injury dosimeter (BID) that exploits material failure of photonic crystals to detect blast exposure. Appearing like a colored sticker, the BID is fabricated in photosensitive polymers via multi-beam interference lithography. Although very stable in the presence of heat, cold or physical impact, sculpted micro- and nano-structures of the BID are physically altered in a precise manner by blast exposure, resulting in color changes that correspond with blast intensity. This approach offers a lightweight, power-free sensor that can be readily interpreted by the naked eye. Importantly, with future refinement this technology may be deployed to identify soldiers exposed to blast at levels suggested to be supra-threshold for non-impact blast-induced mild TBI.

  16. Color changing photonic crystals detect blast exposure

    PubMed Central

    Cullen, D. Kacy; Xu, Yongan; Reneer, Dexter V.; Browne, Kevin D.; Geddes, James W.; Yang, Shu; Smith, Douglas H.

    2010-01-01

    Blast-induced traumatic brain injury (bTBI) is the “signature wound” of the current wars in Iraq and Afghanistan. However, with no objective information of relative blast exposure, warfighters with bTBI may not receive appropriate medical care and are at risk of being returned to the battlefield. Accordingly, we have created a colorimetric blast injury dosimeter (BID) that exploits material failure of photonic crystals to detect blast exposure. Appearing like a colored sticker, the BID is fabricated in photosensitive polymers via multi-beam interference lithography. Although very stable in the presence of heat, cold or physical impact, sculpted micro- and nano-structures of the BID are physically altered in a precise manner by blast exposure, resulting in color changes that correspond with blast intensity. This approach offers a lightweight, power-free sensor that can be readily interpreted by the naked eye. Importantly, with future refinement this technology may be deployed to identify soldiers exposed to blast at levels suggested to be supra-threshold for non-impact blast-induced mild TBI. PMID:21040795

  17. Eye Movements and Display Change Detection during Reading

    ERIC Educational Resources Information Center

    Slattery, Timothy J.; Angele, Bernhard; Rayner, Keith

    2011-01-01

    In the boundary change paradigm (Rayner, 1975), when a reader's eyes cross an invisible boundary location, a preview word is replaced by a target word. Readers are generally unaware of such changes due to saccadic suppression. However, some readers detect changes on a few trials and a small percentage of them detect many changes. Two experiments…

  18. Automated Detection of Essay Revising Patterns: Applications for Intelligent Feedback in a Writing Tutor

    ERIC Educational Resources Information Center

    Roscoe, Rod D.; Snow, Erica L.; Allen, Laura K.; McNamara, Danielle S.

    2015-01-01

    The Writing Pal is an intelligent tutoring system designed to support writing proficiency and strategy acquisition for adolescent writers. A fundamental aspect of the instructional model is automated formative feedback that provides concrete information and strategies oriented toward student improvement. In this paper, the authors explore…

  19. Automated pattern analysis: A new silent partner in insect acoustic detection studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This seminar reviews methods that have been developed for automated analysis of field-collected sounds used to estimate pest populations and guide insect pest management decisions. Several examples are presented of successful usage of acoustic technology to map insect distributions in field environ...

  20. Automated Detection of Geomorphic Features in LiDAR Point Clouds of Various Spatial Density

    NASA Astrophysics Data System (ADS)

    Dorninger, Peter; Székely, Balázs; Zámolyi, András; Nothegger, Clemens

    2010-05-01

    LiDAR, also referred to as laser scanning, has proved to be an important tool for topographic data acquisition. Terrestrial laser scanning allows for accurate (several millimeter) and high resolution (several centimeter) data acquisition at distances of up to some hundred meters. By contrast, airborne laser scanning allows for acquiring homogeneous data for large areas, albeit with lower accuracy (decimeter) and resolution (some ten points per square meter) compared to terrestrial laser scanning. Hence, terrestrial laser scanning is preferably used for precise data acquisition of limited areas such as landslides or steep structures, while airborne laser scanning is well suited for the acquisition of topographic data of huge areas or even countrywide. Laser scanners acquire more or less homogeneously distributed point clouds. These points represent natural objects like terrain and vegetation and artificial objects like buildings, streets or power lines. Typical products derived from such data are geometric models such as digital surface models representing all natural and artificial objects and digital terrain models representing the geomorphic topography only. As the LiDAR technology evolves, the amount of data produced increases almost exponentially even in smaller projects. This poses a considerable challenge for the end user of the data: the experimenter has to have enough knowledge, experience and computer capacity in order to manage the acquired dataset and to derive geomorphologically relevant information from the raw or intermediate data products. Additionally, all this information might need to be integrated with other data like orthophotos. In all these cases, in general, interactive interpretation is necessary to determine geomorphic structures from such models to achieve effective data reduction. There is little support for the automatic determination of characteristic features and their statistical evaluation. From the lessons learnt from automated

  1. Automated kidney detection for 3D ultrasound using scan line searching

    NASA Astrophysics Data System (ADS)

    Noll, Matthias; Nadolny, Anne; Wesarg, Stefan

    2016-04-01

    Ultrasound (U/S) is a fast and inexpensive imaging modality that is used for the examination of various anatomical structures, e.g. the kidneys. One important task for automatic organ tracking or computer-aided diagnosis is the identification of the organ region. During this process the exact information about the transducer location and orientation is usually unavailable. This renders the implementation of such automatic methods exceedingly challenging. In this work we introduce a new automatic method for the detection of the kidney in 3D U/S images. This novel technique analyses the U/S image data along virtual scan lines. Here, the method searches for the characteristic texture changes that occur when a scan line enters and leaves the symmetric tissue regions of the renal cortex. A subsequent feature accumulation along a second scan direction produces a 2D heat map of renal cortex candidates, from which the kidney location is extracted in two steps. First, the strongest candidate as well as its counterpart are extracted by heat map intensity ranking and renal cortex size analysis. This process exploits the heat map gap caused by the renal pelvis region. Substituting the renal pelvis detection with this combined cortex tissue feature increases the detection robustness. In contrast to model-based methods that generate characteristic pattern matches, our method is simpler and therefore faster. An evaluation performed on 61 3D U/S data sets showed that in the 55 cases with no or only minor shadowing the kidney location could be correctly identified.
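
    A minimal sketch of the scan-line idea described above, assuming the volume is a NumPy array of intensities scaled to [0, 1]; the function name, brightness threshold and minimum run length are illustrative assumptions rather than values from the paper. Every bright band that a scan line enters and leaves again votes into a 2D heat map whose strongest cells mark renal cortex candidates.

        import numpy as np

        def scan_line_heatmap(volume, bright=0.6, min_run=5):
            """Vote along the first axis of a 3D ultrasound volume: each bright
            band entered and left on a scan line adds one hit to the heat map."""
            depth, rows, cols = volume.shape
            heat = np.zeros((rows, cols))
            for r in range(rows):
                for c in range(cols):
                    run = 0
                    for above in volume[:, r, c] > bright:
                        if above:
                            run += 1
                        else:
                            if run >= min_run:   # left a sufficiently thick bright band
                                heat[r, c] += 1
                            run = 0
                    if run >= min_run:           # band extends to the end of the line
                        heat[r, c] += 1
            return heat

        # the strongest candidate is simply the heat-map peak:
        # peak = np.unravel_index(heat.argmax(), heat.shape)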

  2. Evaluation of change detection techniques for monitoring coastal zone environments

    NASA Technical Reports Server (NTRS)

    Weismiller, R. A.; Kristof, S. J.; Scholz, D. K.; Anuta, P. E.; Momin, S. M.

    1977-01-01

    Development of satisfactory techniques for detecting change in coastal zone environments is required before operational monitoring procedures can be established. In an effort to meet this need, a study was directed toward developing and evaluating different types of change detection techniques, based upon computer-aided analysis of LANDSAT multispectral scanner (MSS) data, to monitor these environments. The Matagorda Bay estuarine system along the Texas coast was selected as the study area. Four change detection techniques were designed and implemented for evaluation: (1) post classification comparison change detection, (2) delta data change detection, (3) spectral/temporal change classification, and (4) layered spectral/temporal change classification. Each of the four techniques was used to analyze a LANDSAT MSS temporal data set to detect areas of change of the Matagorda Bay region.
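
    The first two techniques lend themselves to a compact sketch (hypothetical NumPy inputs; the difference threshold is illustrative): post-classification comparison flags pixels whose independently produced class labels differ between dates, while delta data change detection thresholds the magnitude of the band-wise difference image.

        import numpy as np

        def post_classification_change(class_t1, class_t2):
            """Post-classification comparison: change wherever the two
            independently classified maps disagree."""
            return class_t1 != class_t2

        def delta_data_change(img_t1, img_t2, threshold=25.0):
            """Delta data change detection: threshold the magnitude of the
            band-wise difference image (bands on the last axis)."""
            delta = img_t2.astype(float) - img_t1.astype(float)
            return np.linalg.norm(delta, axis=-1) > threshold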

  3. AUTOMATED LEAK DETECTION OF BURIED TANKS USING GEOPHYSICAL METHODS AT THE HANFORD NUCLEAR SITE

    SciTech Connect

    CALENDINE S; SCHOFIELD JS; LEVITT MT; FINK JB; RUCKER DF

    2011-03-30

    At the Hanford Nuclear Site in Washington State, the Department of Energy oversees the containment, treatment, and retrieval of liquid high-level radioactive waste. Much of the waste is stored in single-shelled tanks (SSTs) built between 1943 and 1964. Currently, the waste is being retrieved from the SSTs and transferred into newer double-shelled tanks (DSTs) for temporary storage before final treatment. Monitoring the tanks during the retrieval process is critical to identifying leaks. An electrically-based geophysics monitoring program for leak detection and monitoring (LDM) has been successfully deployed on several SSTs at the Hanford site since 2004. The monitoring program takes advantage of changes in contact resistance that will occur when conductive tank liquid leaks into the soil. During monitoring, electrical current is transmitted on a number of different electrode types (e.g., steel cased wells and surface electrodes) while voltages are measured on all other electrodes, including the tanks. Data acquisition hardware and software allow for continuous real-time monitoring of the received voltages and the leak assessment is conducted through a time-series data analysis. The specific hardware and software combination creates a highly sensitive method of leak detection, complementing existing drywell logging as a means to detect and quantify leaks. Working in an industrial environment such as the Hanford site presents many challenges for electrical monitoring: cathodic protection, grounded electrical infrastructure, lightning strikes, diurnal and seasonal temperature trends, and precipitation, all of which create a complex environment for leak detection. In this discussion we present examples of challenges and solutions to working in the tank farms of the Hanford site.

  4. Eye movements and display change detection during reading.

    PubMed

    Slattery, Timothy J; Angele, Bernhard; Rayner, Keith

    2011-12-01

    In the boundary change paradigm (Rayner, 1975), when a reader's eyes cross an invisible boundary location, a preview word is replaced by a target word. Readers are generally unaware of such changes due to saccadic suppression. However, some readers detect changes on a few trials and a small percentage of them detect many changes. Two experiments are reported in which we combined eye movement data with signal detection analyses to investigate display change detection. On each trial, readers had to indicate if they saw a display change in addition to reading for meaning. On half the trials the display change occurred during the saccade (immediate condition); on the other half, it was delayed by 15-25 ms (delay condition) to increase the likelihood that a change would be detected. Sentences were presented in an alternating-case fashion, allowing us to investigate the influence of both letter identity and case. In the immediate condition, change detection was higher when letters changed than when case changed, corroborating findings that word processing utilizes abstract (case-independent) letter identities. However, in the delay condition (where d' was much higher than in the immediate condition), detection was equal for letter and case changes. The results of both experiments indicate that sensitivity to display changes was related to how close the eyes were to the invalid preview on the fixation prior to the display change, as well as the timing of the completion of this change relative to the start of the post-change fixation.
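
    The signal detection analyses reported here rest on the sensitivity index d'; a minimal sketch of its computation (the log-linear correction applied below to avoid hit or false-alarm rates of exactly 0 or 1 is an assumption, not necessarily the authors' procedure).

        from scipy.stats import norm

        def d_prime(hits, misses, false_alarms, correct_rejections):
            """d' = z(hit rate) - z(false alarm rate), with a log-linear
            correction so the z-scores stay finite."""
            hit_rate = (hits + 0.5) / (hits + misses + 1.0)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            return norm.ppf(hit_rate) - norm.ppf(fa_rate)

        # d_prime(40, 10, 5, 45) is roughly 2.1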

  5. Computerized nodule detection in thin-slice CT using selective enhancement filter and automated rule-based classifier

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Li, Feng; Doi, Kunio

    2005-04-01

    We have been developing a computer-aided diagnostic (CAD) scheme to assist radiologists in detecting lung nodules in thoracic CT images. In order to improve the sensitivity for nodule detection, we developed a selective nodule enhancement filter which can simultaneously enhance nodules and suppress other normal anatomic structures such as blood vessels and airway walls. Therefore, as a preprocessing step, this filter is useful for improving the sensitivity of nodule detection and for reducing the number of false positives. Another new technique we employed in this study is an automated rule-based classifier. It can significantly reduce the extent of the disadvantages of existing rule-based classifiers, including manual design, poor reproducibility, poor evaluation methods such as re-substitution, and a large overtraining effect. Experimental results performed with Monte Carlo simulation and a real lung nodule CT dataset demonstrated that the automated method can completely eliminate the overtraining effect in the procedure of cutoff threshold selection, and thus can minimize the overall overtraining effect in the rule-based classifier.
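
    The abstract does not spell out the filter's formulation; as a hedged illustration only, selective enhancement of blob-like structures is commonly built from Hessian eigenvalues, which simultaneously suppresses elongated structures such as vessels. A 2D scikit-image sketch of that generic idea (the published filter is defined in 3D and differs in detail):

        import numpy as np
        from skimage.feature import hessian_matrix, hessian_matrix_eigvals

        def dot_enhancement(slice_2d, sigma=2.0):
            """Enhance bright blob-like (nodule-like) structures in one CT slice;
            line-like structures such as vessels give a near-zero response."""
            H = hessian_matrix(slice_2d.astype(float), sigma=sigma, order="rc")
            l1, l2 = hessian_matrix_eigvals(H)   # eigenvalues with l1 >= l2
            out = np.zeros_like(slice_2d, dtype=float)
            blob = (l1 < 0) & (l2 < 0)           # bright blob: both curvatures negative
            out[blob] = (l1[blob] ** 2) / np.abs(l2[blob])
            return out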

  6. Evaluation of genotoxicity using automated detection of γH2AX in metabolically competent HepaRG cells.

    PubMed

    Quesnot, Nicolas; Rondel, Karine; Audebert, Marc; Martinais, Sophie; Glaise, Denise; Morel, Fabrice; Loyer, Pascal; Robin, Marie-Anne

    2016-01-01

    The in situ detection of γH2AX was recently reported to be a promising biomarker of genotoxicity. In addition, the human HepaRG hepatoma cells appear to be relevant for investigating hepatic genotoxicity since they express most drug-metabolizing enzymes and wild-type p53. The aim of this study was to determine whether the automated in situ detection of γH2AX-positive HepaRG cells could be relevant for evaluation of genotoxicity after single or long-term repeated in vitro exposure compared to the micronucleus assay. Metabolically competent HepaRG cells were treated daily with environmental contaminants and genotoxicity was evaluated after 1, 7 and 14 days. Using these cells, we confirmed the genotoxicity of aflatoxin B1 and benzo(a)pyrene and demonstrated that dimethylbenzanthracene, fipronil and endosulfan, previously found genotoxic with comet or micronucleus assays, also induced γH2AX phosphorylation. Furthermore, we showed that fluoranthene and bisphenol A induced γH2AX while no effect had been previously reported in HepG2 cells. In addition, induction of γH2AX was observed with some compounds only after 7 days, highlighting the importance of studying long-term effects of low doses of contaminants. Together, our data demonstrate that automated γH2AX detection in metabolically competent HepaRG cells is a suitable high-throughput genotoxicity screening assay.

  7. Neural network for change detection of remotely sensed imagery

    NASA Astrophysics Data System (ADS)

    Chen, C. F.; Chen, Kun S.; Chang, J. S.

    1995-11-01

    The use of a neural network for determining land-cover/land-use change from remotely sensed data is proposed. In this study, a single image containing both spectral and temporal information is created from multidate satellite imagery. The proposed change detection method can be divided into two main steps: training data selection and change detection. At the training step, the training set, which consists of no-change and possible-change classes, is obtained from the composited image. The training data are then fed into the neural network to obtain the network's weights. At the change detection step, the network's weights are employed to detect the change and no-change classes in the combined image. The proposed method is tested using multidate SPOT imagery, and satisfactory change detection results are obtained.
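
    A minimal sketch of the composite-image idea, using scikit-learn's MLPClassifier as a stand-in for the original network; the two dates are assumed to be co-registered NumPy arrays with bands on the last axis, and the labels (0 = no change, 1 = possible change) come from analyst-selected training pixels.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def stack_dates(img_t1, img_t2):
            """Build the single image holding both spectral and temporal information."""
            return np.concatenate([img_t1, img_t2], axis=-1)

        def train_change_network(stacked, train_mask, train_labels):
            """Fit a small feed-forward network on the selected training pixels."""
            X = stacked[train_mask]              # (n_pixels, 2 * n_bands)
            clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500)
            clf.fit(X, train_labels)
            return clf

        def detect_change(clf, stacked):
            rows, cols, bands = stacked.shape
            return clf.predict(stacked.reshape(-1, bands)).reshape(rows, cols)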

  8. Automated detection of structural alerts (chemical fragments) in (eco)toxicology

    PubMed Central

    Lepailleur, Alban; Poezevara, Guillaume; Bureau, Ronan

    2013-01-01

    This mini-review describes the evolution of different algorithms dedicated to the automated discovery of chemical fragments associated with (eco)toxicological endpoints. These structural alerts constitute one of the most interesting approaches in in silico toxicology because of their direct link to specific toxicological mechanisms. A number of expert systems are already available but, since the first work in this field, which considered a binomial distribution of chemical fragments between two datasets, new data-mining methods have been developed and applied with success in chemoinformatics. The frequency of a chemical fragment in a dataset is often at the core of the process for defining its toxicological relevance. However, recent progress in data mining provides new insights into the automated discovery of new rules. In particular, this review highlights the notion of Emerging Patterns, which can capture contrasts between classes of data. PMID:24688706
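
    The frequency-based reasoning can be illustrated with the growth-rate criterion commonly used to define emerging patterns; the fragment enumeration itself (e.g. by a graph miner) is outside this sketch, and the threshold is an arbitrary assumption.

        def emerging_fragments(freq_toxic, freq_nontoxic, min_growth=3.0):
            """freq_*: dict mapping a fragment (e.g. a SMARTS string) to the fraction
            of molecules containing it in that dataset. Returns fragments whose
            support grows strongly from the non-toxic to the toxic set."""
            eps = 1e-9   # avoids division by zero for fragments absent from one set
            growth = {frag: f / (freq_nontoxic.get(frag, 0.0) + eps)
                      for frag, f in freq_toxic.items()}
            return {frag: g for frag, g in growth.items() if g >= min_growth}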

  9. Detecting holocene changes in thermohaline circulation.

    PubMed

    Keigwin, L D; Boyle, E A

    2000-02-15

    Throughout the last glacial cycle, reorganizations of deep ocean water masses were coincident with rapid millennial-scale changes in climate. Climate changes have been less severe during the present interglacial, but evidence for concurrent deep ocean circulation change is ambiguous. PMID:10677463

  10. Detecting Holocene changes in thermohaline circulation

    PubMed Central

    Keigwin, L. D.; Boyle, E. A.

    2000-01-01

    Throughout the last glacial cycle, reorganizations of deep ocean water masses were coincident with rapid millennial-scale changes in climate. Climate changes have been less severe during the present interglacial, but evidence for concurrent deep ocean circulation change is ambiguous. PMID:10677463

  11. Detecting holocene changes in thermohaline circulation.

    PubMed

    Keigwin, L D; Boyle, E A

    2000-02-15

    Throughout the last glacial cycle, reorganizations of deep ocean water masses were coincident with rapid millennial-scale changes in climate. Climate changes have been less severe during the present interglacial, but evidence for concurrent deep ocean circulation change is ambiguous.

  12. Change Detection in Naturalistic Pictures among Children with Autism

    ERIC Educational Resources Information Center

    Burack, Jacob A.; Joseph, Shari; Russo, Natalie; Shore, David I.; Porporino, Mafalda; Enns, James T.

    2009-01-01

    Persons with autism often show strong reactions to changes in the environment, suggesting that they may detect changes more efficiently than typically developing (TD) persons. However, Fletcher-Watson et al. (Br J Psychol 97:537-554, 2006) reported no differences between adults with autism and TD adults with a change-detection task. In this study,…

  13. Automated Analysis of Planktic Foraminifers Part I: Application to Macroevolution and Climate Change

    NASA Astrophysics Data System (ADS)

    Schmidt, D. N.; Thierstein, H. R.; Bollmann, J.

    2003-04-01

    Size and shape data of microfossils are tedious to gather manually but are efficiently obtained electronically. Using automated microscopy (see Part II, Bollmann et al., this volume) we have collected digitized images and extracted size information of >1.6 million foraminiferal tests in globally distributed Holocene and Cenozoic samples. In the Holocene the maximum sizes and maximum abundances co-occur under similar "optimum" environmental conditions. Total assemblage test sizes show a strong correlation with surface water temperature and stratification: they increase from high to low latitudes and show deviations from this general relationship towards smaller sizes in frontal systems and upwelling areas. Cenozoic size changes in planktic foraminiferal assemblages were pronounced in low latitude and minor in high latitude areas. The global size record reveals the following major features: (a) a dramatic size decrease at the K/T-boundary, followed by (b) an interval of "dwarfs" (Paleocene - late Eocene), (c) a transition period from the late Eocene to the mid-Miocene, when low-latitude assemblages started evolving towards larger size, and (d) an interval of growth to gigantism in tropical and subtropical assemblages from mid-Miocene to Recent. Comparisons with proxy data show that low latitude planktic foraminiferal size increases are positively correlated to polar cooling. We infer that cooling of deep-waters led to enhanced stratification of surface waters and thus to increases in number and stability of ecological niches into which species could evolve. Whether these processes affected all species or only some awaits size analyses of single taxa (see Part III, Schiebel et al., this volume). The observed macroevolutionary size increase of planktic foraminifers in the Cenozoic seems largely abiotically driven.

  14. Can Automated Imaging for Optic Disc and Retinal Nerve Fiber Layer Analysis Aid Glaucoma Detection?

    PubMed Central

    Banister, Katie; Boachie, Charles; Bourne, Rupert; Cook, Jonathan; Burr, Jennifer M.; Ramsay, Craig; Garway-Heath, David; Gray, Joanne; McMeekin, Peter; Hernández, Rodolfo; Azuara-Blanco, Augusto

    2016-01-01

    Purpose To compare the diagnostic performance of automated imaging for glaucoma. Design Prospective, direct comparison study. Participants Adults with suspected glaucoma or ocular hypertension referred to hospital eye services in the United Kingdom. Methods We evaluated 4 automated imaging test algorithms: the Heidelberg Retinal Tomography (HRT; Heidelberg Engineering, Heidelberg, Germany) glaucoma probability score (GPS), the HRT Moorfields regression analysis (MRA), scanning laser polarimetry (GDx enhanced corneal compensation; Glaucoma Diagnostics (GDx), Carl Zeiss Meditec, Dublin, CA) nerve fiber indicator (NFI), and Spectralis optical coherence tomography (OCT; Heidelberg Engineering) retinal nerve fiber layer (RNFL) classification. We defined abnormal tests as an automated classification of outside normal limits for HRT and OCT or NFI ≥ 56 (GDx). We conducted a sensitivity analysis, using borderline abnormal image classifications. The reference standard was clinical diagnosis by a masked glaucoma expert including standardized clinical assessment and automated perimetry. We analyzed 1 eye per patient (the one with more advanced disease). We also evaluated the performance according to severity and using a combination of 2 technologies. Main Outcome Measures Sensitivity and specificity, likelihood ratios, diagnostic odds ratio, and proportion of indeterminate tests. Results We recruited 955 participants, and 943 were included in the analysis. The average age was 60.5 years (standard deviation, 13.8 years); 51.1% were women. Glaucoma was diagnosed in at least 1 eye in 16.8%; 32% of participants had no glaucoma-related findings. The HRT MRA had the highest sensitivity (87.0%; 95% confidence interval [CI], 80.2%–92.1%), but lowest specificity (63.9%; 95% CI, 60.2%–67.4%); GDx had the lowest sensitivity (35.1%; 95% CI, 27.0%–43.8%), but the highest specificity (97.2%; 95% CI, 95.6%–98.3%). The HRT GPS sensitivity was 81.5% (95% CI, 73.9%–87.6%), and
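
    The stated outcome measures all follow from a 2x2 table of index test result against reference diagnosis; a short sketch of the arithmetic (the counts in the usage line are placeholders, not the study's data).

        def diagnostic_measures(tp, fp, fn, tn):
            """Standard accuracy measures from a 2x2 diagnostic table."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
            lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
            dor = lr_pos / lr_neg                      # diagnostic odds ratio
            return sensitivity, specificity, lr_pos, lr_neg, dor

        # diagnostic_measures(tp=138, fp=283, fn=20, tn=502)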

  15. Change detection on a hunch: pre-attentive vision allows "sensing" of unique feature changes.

    PubMed

    Ball, Felix; Busch, Niko A

    2015-11-01

    Studies on change detection and change blindness have investigated the nature of visual representations by testing the conditions under which observers are able to detect when an object in a complex scene changes from one moment to the next. Several authors have proposed that change detection can occur without identification of the changing object, but the perceptual processes underlying this phenomenon are currently unknown. We hypothesized that change detection without localization or identification occurs when the change happens outside the focus of attention. Such changes would usually go entirely unnoticed, unless the change brings about a modification of one of the feature maps representing the scene. Thus, the appearance or disappearance of a unique feature might be registered even in the absence of focused attention and without feature binding, allowing for change detection, but not localization or identification. We tested this hypothesis in three experiments, in which changes either involved colors that were already present elsewhere in the display or entirely unique colors. Observers detected whether any change had occurred and then localized or identified the change. Change detection without localization occurred almost exclusively when changes involved a unique color. Moreover, change detection without localization for unique feature changes was independent of the number of objects in the display and independent of change identification. These findings suggest that pre-attentive registration of a change on a feature map can give rise to a conscious experience even when feature binding has failed: that something has changed without knowing what or where. PMID:26353860

  16. Change detection on a hunch: pre-attentive vision allows "sensing" of unique feature changes.

    PubMed

    Ball, Felix; Busch, Niko A

    2015-11-01

    Studies on change detection and change blindness have investigated the nature of visual representations by testing the conditions under which observers are able to detect when an object in a complex scene changes from one moment to the next. Several authors have proposed that change detection can occur without identification of the changing object, but the perceptual processes underlying this phenomenon are currently unknown. We hypothesized that change detection without localization or identification occurs when the change happens outside the focus of attention. Such changes would usually go entirely unnoticed, unless the change brings about a modification of one of the feature maps representing the scene. Thus, the appearance or disappearance of a unique feature might be registered even in the absence of focused attention and without feature binding, allowing for change detection, but not localization or identification. We tested this hypothesis in three experiments, in which changes either involved colors that were already present elsewhere in the display or entirely unique colors. Observers detected whether any change had occurred and then localized or identified the change. Change detection without localization occurred almost exclusively when changes involved a unique color. Moreover, change detection without localization for unique feature changes was independent of the number of objects in the display and independent of change identification. These findings suggest that pre-attentive registration of a change on a feature map can give rise to a conscious experience even when feature binding has failed: that something has changed without knowing what or where.

  17. Is a pre-change object representation weakened under correct detection of a change?

    PubMed

    Yeh, Yei-Yu; Yang, Cheng-Ta

    2009-03-01

    We investigated whether a pre-change representation is inhibited or weakened under correct change detection. Two arrays of six objects were rapidly presented for change detection in three experiments. After detection, the perceptual identification of degraded stimuli was tested in Experiments 1 and 2. The weakening of a pre-change representation was not observed under correct detection. The repetition priming effect was observed for a pre-change object and the magnitude was equivalent to the effect for a post-change object. Under change blindness, repetition priming for a pre-change representation was observed when detection did not require report of location in Experiment 1 and was not observed when location was required to be reported in Experiment 2. The results of Experiment 3 showed that a pre-change representation was recognized at a higher rate under correct detection than under change blindness, reflecting a stronger rather than a weaker pre-change representation in the former context.

  18. Object-based change detection: dimension of damage in residential areas of Abu Suruj, Sudan

    NASA Astrophysics Data System (ADS)

    Demharter, Timo; Michel, Ulrich; Ehlers, Manfred; Reinartz, Peter

    2011-11-01

    Given the importance of Change Detection, especially in the field of crisis management, this paper discusses the advantage of object-based Change Detection. This project and the methods used provide an opportunity to coordinate relief actions strategically. The principal objective of this project was to develop an algorithm which allows rapid detection of damaged and destroyed buildings in the area of Abu Suruj. This Sudanese village is located in West Darfur and has become a victim of civil war. The software eCognition Developer was used to perform an object-based Change Detection on two panchromatic Quickbird 2 images from two different dates. The first image shows the area before, the second image shows the area after the massacres in this region. The classification of the huts of the Sudanese town Abu Suruj was achieved by first segmenting the huts and then classifying them on the basis of geometrical and brightness-related values. The huts were classified as "new", "destroyed" and "preserved" with the help of an automated algorithm. Finally, the results were presented in the form of a map which displays the different conditions of the huts. The accuracy of the project is validated by an accuracy assessment resulting in an Overall Classification Accuracy of 90.50 percent. These change detection results allow aid organizations to provide quick and efficient help where it is needed the most.
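
    The reported Overall Classification Accuracy is the fraction of reference samples whose mapped condition ("new", "destroyed" or "preserved") matches the reference label; a minimal sketch over a confusion matrix with hypothetical counts.

        import numpy as np

        def overall_accuracy(confusion):
            """Overall accuracy = sum of the diagonal / total number of samples."""
            confusion = np.asarray(confusion, dtype=float)
            return np.trace(confusion) / confusion.sum()

        # rows = reference class, columns = mapped class (new, destroyed, preserved)
        example = [[60, 3, 2],
                   [5, 68, 4],
                   [3, 5, 50]]
        print(overall_accuracy(example))   # 0.89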

  19. Automated identification of abnormal metaphase chromosome cells for the detection of chronic myeloid leukemia using microscopic images

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Zheng, Bin; Li, Shibo; Mulvihill, John J.; Chen, Xiaodong; Liu, Hong

    2010-07-01

    Karyotyping is an important process to classify chromosomes into standard classes and the results are routinely used by the clinicians to diagnose cancers and genetic diseases. However, visual karyotyping using microscopic images is time-consuming and tedious, which reduces the diagnostic efficiency and accuracy. Although many efforts have been made to develop computerized schemes for automated karyotyping, no scheme can operate without substantial human intervention. Instead of developing a method to classify all chromosome classes, we develop an automatic scheme to detect abnormal metaphase cells by identifying a specific class of chromosomes (class 22) and to prescreen for suspected chronic myeloid leukemia (CML). The scheme includes three steps: (1) iteratively segment randomly distributed individual chromosomes, (2) process segmented chromosomes and compute image features to identify the candidates, and (3) apply an adaptive matching template to identify chromosomes of class 22. An image data set of 451 metaphase cells extracted from bone marrow specimens of 30 positive and 30 negative cases for CML is selected to test the scheme's performance. The overall case-based classification accuracy is 93.3% (100% sensitivity and 86.7% specificity). The results demonstrate the feasibility of applying an automated scheme to detect or prescreen suspicious cancer cases.

  20. Detection of Thermometer Clustering in the Calibration of Large Batches of Industrial Thermometers for the LHC by Automated Data Processing

    NASA Astrophysics Data System (ADS)

    Pavese, F.; Ichim, D.; Ciarlini, P.; Balle, C.; Casas-Cubillos, J.

    2003-09-01

    The complete procedure to calibrate thermometers is a complex process, especially when several thousand semiconductor-type thermometers are used and must be individually calibrated, as in the case of the instrumentation of the new Large Hadron Collider (LHC) machine at CERN. Indeed, the similarity of the characteristics of semiconducting thermometers is more limited than that of wire-wound thermometers. The spread of characteristics can appear as a homogeneous set, or can show clusters (groups) of characteristics. In the latter case, one of the reasons for clustering may be the fabrication process, in which batches of numerous devices are produced on the same wafer. Consequently, the detection of the groups can be useful, from the supplier's point of view, to give information relevant to improving the fabrication uniformity. From the user's point of view, it is useful for estimating the likely stability of the thermometers over time, when this is essential, as in the LHC case. In fact, thermometers whose characteristics are outliers or fall in small clusters should be considered potentially anomalous. In addition, the identification of anomalous groups allows the detection of artifacts due to the experimental process. For a large number of thermometers, this analysis requires the use of automated procedures and, consequently, automated decisions that avoid false effects. The paper describes the mathematical methodology adopted for the identification of the clusters, based on mixed-effect modeling of the thermometer characteristics.
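
    The clustering idea can be caricatured as follows; this is not the paper's mixed-effect model: a simple two-parameter characteristic per thermometer is assumed purely for illustration, and k-means stands in for the statistical grouping.

        import numpy as np
        from sklearn.cluster import KMeans

        def cluster_thermometers(resistances, temperatures, n_clusters=3):
            """Fit log(R) = a + b/T to each thermometer and cluster the (a, b) pairs.
            resistances: (n_thermometers, n_points); temperatures: (n_points,)."""
            temperatures = np.asarray(temperatures, dtype=float)
            X = np.vstack([np.ones_like(temperatures), 1.0 / temperatures]).T
            coeffs = np.linalg.lstsq(X, np.log(resistances).T, rcond=None)[0].T
            labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(coeffs)
            return coeffs, labels

        # thermometers whose label is shared by only a few devices, or whose
        # coefficients sit far from every cluster centre, are flagged as suspect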

  1. Detection of intestinal parasites by use of the cuvette-based automated microscopy analyser sediMAX(®).

    PubMed

    Intra, J; Taverna, E; Sala, M R; Falbo, R; Cappellini, F; Brambilla, P

    2016-03-01

    Microscopy is the reference method for intestinal parasite identification. The cuvette-based automated microscopy analyser, sediMAX 1, provides 15 digital images of each sediment sample. In this study, we have evaluated this fully automated instrument for detection of enteric parasites, helminths and protozoa. A total of 700 consecutively preserved samples consisting of 60 positive samples (50 protozoa, ten helminths) and 640 negative samples were analysed. Operators were blinded to each other's results. Samples were randomized and were tested both by manual microscopy and sediMAX 1 for parasite recognition. The sediMAX 1 analysis was conducted using a dilution of faecal samples, allowing determination of morphology. The data obtained using sediMAX 1 showed a specificity of 100% and a sensitivity of 100%. Some species of helminths, such as Enterobius vermicularis, Strongyloides stercoralis, the Ancylostoma duodenale/Necator americanus complex, and schistosomes were not considered in this work, because they are rare in stool specimens, are not easily detectable with microscopy analysis, and require specific recovery techniques. This study demonstrated for the first time that sediMAX 1 can be an aid in enteric parasite identification.

  2. Automated characterization of cell shape changes during amoeboid motility by skeletonization

    PubMed Central

    2010-01-01

    Background The ability of a cell to change shape is crucial for the proper function of many cellular processes, including cell migration. One type of cell migration, referred to as amoeboid motility, involves alternating cycles of morphological expansion and retraction. Traditionally, this process has been characterized by a number of parameters providing global information about shape changes, which are insufficient to distinguish phenotypes based on local pseudopodial activities that typify amoeboid motility. Results We developed a method that automatically detects and characterizes pseudopodial behavior of cells. The method uses skeletonization, a technique from morphological image processing to reduce a shape into a series of connected lines. It involves a series of automatic algorithms including image segmentation, boundary smoothing, skeletonization and branch pruning, and takes into account the cell shape changes between successive frames to detect protrusion and retraction activities. In addition, the activities are clustered into different groups, each representing the protruding and retracting history of an individual pseudopod. Conclusions We illustrate the algorithms on movies of chemotaxing Dictyostelium cells and show that our method makes it possible to capture the spatial and temporal dynamics as well as the stochastic features of the pseudopodial behavior. Thus, the method provides a powerful tool for investigating amoeboid motility. PMID:20334652
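
    A minimal sketch of the central skeletonization step using scikit-image; the published pipeline's boundary smoothing, branch pruning and frame-to-frame matching are only hinted at here, and the Otsu threshold is an assumed stand-in for the paper's segmentation.

        from skimage.filters import gaussian, threshold_otsu
        from skimage.morphology import skeletonize

        def cell_skeleton(frame):
            """Segment one microscopy frame and reduce the cell mask
            to a one-pixel-wide skeleton."""
            smoothed = gaussian(frame, sigma=2)           # boundary smoothing
            mask = smoothed > threshold_otsu(smoothed)    # crude segmentation
            return skeletonize(mask)

        # comparing skeleton branches of successive frames highlights the
        # protrusion and retraction history of individual pseudopods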

  3. The utility of automated measures of ocular metrics for detecting driver drowsiness during extended wakefulness.

    PubMed

    Jackson, Melinda L; Kennedy, Gerard A; Clarke, Catherine; Gullo, Melissa; Swann, Philip; Downey, Luke A; Hayley, Amie C; Pierce, Rob J; Howard, Mark E

    2016-02-01

    Slowed eyelid closure coupled with increased duration and frequency of closure is associated with drowsiness. This study assessed the utility of two devices for automated measurement of slow eyelid closure in a standard poor performance condition (alcohol) and following 12-h sleep deprivation. Twenty-two healthy participants (mean age=20.8 (SD 1.9) years) with no history of sleep disorders participated in the study. Participants underwent one baseline and one counterbalanced session each over two weeks; one 24-hour period of sleep deprivation, and one daytime session during which alcohol was consumed after a normal night of sleep. Participants completed a test battery consisting of a 30-min simulated driving task, a 10-min Psychomotor Vigilance Task (PVT) and the Karolinska Sleepiness Scale (KSS) each in two baseline sessions, and in two randomised, counterbalanced experimental sessions: following sleep deprivation and following alcohol consumption. Eyelid closure was measured during both tasks using two automated devices (Copilot and Optalert™). There was an increase in the proportion of time with eyelids closed and the Johns Drowsiness Score (incorporating relative velocity of eyelid movements) following sleep deprivation using Optalert (p<0.05 for both). These measures correlated significantly with crashes, PVT lapses and subjective sleepiness (r-values 0.46-0.69, p<0.05). There was no difference between the two sessions in PERCLOS recorded during the PVT or the driving task, as measured by the Copilot. The duration of eyelid closure predicted frequent lapses following sleep deprivation (which were equivalent to the average lapses at a blood alcohol concentration of 0.05% - area under the ROC curve 0.87, p<0.01). The duration of time with slow eyelid closure, assessed by the automated devices, increased following sleep deprivation and was associated with deterioration in psychomotor performance and subjective sleepiness. Comprehensive algorithms inclusive of

  4. Applicability of day-to-day variation in behavior for the automated detection of lameness in dairy cows.

    PubMed

    de Mol, R M; André, G; Bleumer, E J B; van der Werf, J T N; de Haas, Y; van Reenen, C G

    2013-06-01

    Lameness is a major problem in modern dairy husbandry and has welfare implications and other negative consequences. The behavior of dairy cows is influenced by lameness. Automated lameness detection can, among other methods, be based on day-to-day variation in animal behavior. Activity sensors that measure lying time, number of lying bouts, and other parameters were used to record behavior per cow per day. The objective of this research was to develop and validate a lameness detection model based on daily activity data. Besides the activity data, milking data and data from the computerized concentrate feeders were available as input data. Locomotion scores were available as reference data. Data from up to 100 cows collected at an experimental farm during 23 mo in 2010 and 2011 were available for model development. Behavior is cow-dependent, and therefore quadratic trend models were fitted with a dynamic linear model on-line per cow for 7 activity variables and 2 other variables (milk yield per day and concentrate leftovers per day). It is assumed that lameness develops gradually; therefore, a lameness alert was given when the linear trend in 2 or more of the 9 models differed significantly from zero in a direction that corresponded with lameness symptoms. The developed model was validated during the first 4 mo of 2012 with almost 100 cows on the same farm by generating lameness alerts each week. Performance on the model validation data set was comparable with performance on the model development data set. The overall sensitivity (percentage of detected lameness cases) was 85.5% combined with specificity (percentage of nonlame cow-days that were not alerted) of 88.8%. All variables contributed to this performance. These results indicate that automated lameness detection based on day-to-day variation in behavior is a useful tool for dairy management. PMID:23548300
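
    A simplified sketch of the alerting rule: an ordinary least-squares trend per variable replaces the paper's dynamic linear model, the two-variable criterion follows the text, and the significance test and window length are assumptions.

        import numpy as np
        from scipy import stats

        def lameness_alert(daily_data, expected_sign, alpha=0.05, min_hits=2):
            """daily_data: dict of variable name -> recent daily values for one cow.
            expected_sign: dict of variable name -> +1 or -1, the trend direction
            that corresponds with lameness symptoms."""
            hits = 0
            for name, values in daily_data.items():
                days = np.arange(len(values))
                slope, _, _, p_value, _ = stats.linregress(days, values)
                if p_value < alpha and np.sign(slope) == expected_sign[name]:
                    hits += 1
            return hits >= min_hits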

  5. Occupancy change detection system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-01

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes instructions for producing an occupancy grid map of an environment around the robot, scanning the environment to generate a current obstacle map relative to a current robot position, and converting the current obstacle map to a current occupancy grid map. The instructions also include processing each grid cell in the occupancy grid map. Within the processing of each grid cell, the instructions include comparing each grid cell in the occupancy grid map to a corresponding grid cell in the current occupancy grid map. For grid cells with a difference, the instructions include defining a change vector for each changed grid cell, wherein the change vector includes a direction from the robot to the changed grid cell and a range from the robot to the changed grid cell.
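
    A minimal sketch of the comparison described above (the grid representation, coordinate convention and cell size are assumptions): every cell whose occupancy differs from the stored map yields a change vector holding the direction and range from the robot to that cell.

        import math

        def change_vectors(stored_grid, current_grid, robot_row, robot_col, cell_size=0.1):
            """Compare two occupancy grids (2D lists of 0/1) cell by cell and
            return (direction_rad, range_m) for every changed cell."""
            vectors = []
            for r, (old_row, new_row) in enumerate(zip(stored_grid, current_grid)):
                for c, (old, new) in enumerate(zip(old_row, new_row)):
                    if old != new:
                        dr, dc = r - robot_row, c - robot_col
                        direction = math.atan2(dr, dc)
                        vectors.append((direction, math.hypot(dr, dc) * cell_size))
            return vectors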

  6. Automated detection and reporting of Volatile Organic Compounds (VOCs) in complex environments

    SciTech Connect

    Hargis, P.J. Jr.; Preppernau, B.L.; Osbourn, G.C.

    1997-03-01

    This paper describes results from efforts to develop VOC sensing systems based on two complementary techniques. The first technique used a gated channeltron detector for resonant laser-induced multiphoton photoionization detection of trace organic vapors in a supersonic molecular beam. The channeltron was gated using a relatively simple circuit to generate a negative gate pulse with a width of 400 ns (FWHM), a 50 ns turn-on (rise) time, a 1.5 μs turn-off (decay) time, a pulse amplitude of -1000 V, and a DC offset adjustable from zero to -1500 V. The gated channeltron allows rejection of spurious responses to UV laser light scattered directly into the channeltron and time-delayed ionization signals induced by photoionization of residual gas in the vacuum chamber. Detection limits in the part-per-trillion range have been demonstrated with the gated detector. The second technique used arrays of surface acoustic wave (SAW) devices coated with various chemically selective materials (e.g., polymers, self-assembled monolayers) to provide unique response patterns to various chemical analytes. This work focused on polymers, formed by spin casting from solution or by plasma polymerization, as well as on self-assembled monolayers. Responses from coated SAWs to various concentrations of water, volatile organics, and organophosphonates (chemical warfare agent simulants) were used to provide calibration data. A novel visual empirical region of influence (VERI) pattern recognition technique was used to evaluate the ability to use these response patterns to correctly identify chemical species. This investigation shows how the VERI technique can be used to determine the best set of coatings for an array, to predict the performance of the array even if sensor responses change due to aging of the coating materials, and to identify unknown analytes based on previous calibration data.

  7. Detecting Temporal Change in Watershed Nutrient Yields

    NASA Astrophysics Data System (ADS)

    Wickham, James D.; Wade, Timothy G.; Riitters, Kurt H.

    2008-08-01

    Meta-analyses reveal that nutrient yields tend to be higher for watersheds dominated by anthropogenic uses (e.g., urban, agriculture) and lower for watersheds dominated by natural vegetation. One implication of this pattern is that loss of natural vegetation will produce increases in watershed nutrient yields. Yet, the same meta-analyses also reveal that, absent land-cover change, watershed nutrient yields vary from one year to the next due to many exogenous factors. The interacting effects of land cover and exogenous factors suggest nutrient yields should be treated as distributions, and the effect of land-cover change should be examined by looking for significant changes in the distributions. We compiled nutrient yield distributions from published data. The published data included watersheds with homogeneous land cover that typically reported two or more years of annual nutrient yields for the same watershed. These data were used to construct statistical models, and the models were used to estimate changes in the nutrient yield distributions as a result of land-cover change. Land-cover changes were derived from the National Land Cover Database (NLCD). Total nitrogen (TN) yield distributions increased significantly for 35 of 1550 watersheds and decreased significantly for 51. Total phosphorus (TP) yield distributions increased significantly for 142 watersheds and decreased significantly for 17. The amount of land-cover change required to produce significant shifts in nutrient yield distributions was not constant. Small land-cover changes led to significant shifts in nutrient yield distributions when watersheds were dominated by natural vegetation, whereas much larger land-cover changes were needed to produce significant shifts when watersheds were dominated by urban or agriculture. We discuss our results in the context of the Clean Water Act.

  8. Detecting temporal change in watershed nutrient yields.

    PubMed

    Wickham, James D; Wade, Timothy G; Riitters, Kurt H

    2008-08-01

    Meta-analyses reveal that nutrient yields tend to be higher for watersheds dominated by anthropogenic uses (e.g., urban, agriculture) and lower for watersheds dominated by natural vegetation. One implication of this pattern is that loss of natural vegetation will produce increases in watershed nutrient yields. Yet, the same meta-analyses also reveal that, absent land-cover change, watershed nutrient yields vary from one year to the next due to many exogenous factors. The interacting effects of land cover and exogenous factors suggest nutrient yields should be treated as distributions, and the effect of land-cover change should be examined by looking for significant changes in the distributions. We compiled nutrient yield distributions from published data. The published data included watersheds with homogeneous land cover that typically reported two or more years of annual nutrient yields for the same watershed. These data were used to construct statistical models, and the models were used to estimate changes in the nutrient yield distributions as a result of land-cover change. Land-cover changes were derived from the National Land Cover Database (NLCD). Total nitrogen (TN) yield distributions increased significantly for 35 of 1550 watersheds and decreased significantly for 51. Total phosphorus (TP) yield distributions increased significantly for 142 watersheds and decreased significantly for 17. The amount of land-cover change required to produce significant shifts in nutrient yield distributions was not constant. Small land-cover changes led to significant shifts in nutrient yield distributions when watersheds were dominated by natural vegetation, whereas much larger land-cover changes were needed to produce significant shifts when watersheds were dominated by urban or agriculture. We discuss our results in the context of the Clean Water Act. PMID:18446405

  9. Understanding reliance on automation: effects of error type, error distribution, age and experience

    PubMed Central

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over-relying on it during non-alarm states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  10. Robust detection and classification of longitudinal changes in color retinal fundus images for monitoring diabetic retinopathy.

    PubMed

    Narasimha-Iyer, Harihar; Can, Ali; Roysam, Badrinath; Stewart, Charles V; Tanenbaum, Howard L; Majerovics, Anna; Singh, Hanumant

    2006-06-01

    A fully automated approach is presented for robust detection and classification of changes in longitudinal time-series of color retinal fundus images of diabetic retinopathy. The method is robust to: 1) spatial variations in illumination resulting from instrument limitations and changes both within and between patient visits; 2) imaging artifacts such as dust particles; 3) outliers in the training data; 4) segmentation and alignment errors. Robustness to illumination variation is achieved by a novel iterative algorithm to estimate the reflectance of the retina exploiting automatically extracted segmentations of the retinal vasculature, optic disk, fovea, and pathologies. Robustness to dust artifacts is achieved by exploiting their spectral characteristics, enabling application to film-based as well as digital imaging systems. False changes from alignment errors are minimized by subpixel accuracy registration using a 12-parameter transformation that accounts for unknown retinal curvature and camera parameters. Bayesian detection and classification algorithms are used to generate a color-coded output that is readily inspected. A multiobserver validation on 43 image pairs from 22 eyes involving nonproliferative and proliferative diabetic retinopathies showed a 97% change detection rate, a 3% miss rate, and a 10% false alarm rate. The performance in correctly classifying the changes was 99.3%. A self-consistency metric and an error factor were developed to measure performance over more than two periods. The average self-consistency was 94% and the error factor was 0.06%. Although this study focuses on diabetic changes, the proposed techniques have broader applicability in ophthalmology.
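
    A deliberately simplified sketch of the overall flow; the paper's iterative reflectance estimation, 12-parameter registration and Bayesian classifier are replaced here by a division-based illumination correction and a plain threshold on images assumed to be already registered.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def normalize_illumination(img, sigma=51):
            """Approximate reflectance by dividing out a smooth illumination field."""
            illumination = gaussian_filter(img.astype(float), sigma) + 1e-6
            return img / illumination

        def detect_changes(img_t1, img_t2, threshold=0.3):
            """Flag pixels whose normalized intensity differs between visits."""
            r1 = normalize_illumination(img_t1)
            r2 = normalize_illumination(img_t2)
            return np.abs(r2 - r1) > threshold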

  11. The