Sample records for quality detection method

  1. Detecting grouting quality of tendon ducts using the impact-echo method

    NASA Astrophysics Data System (ADS)

    Qu, Guangzhen; Sun, Min; Zhou, Guangli

    2018-06-01

    The performance, durability and safety of prestressed concrete bridges are directly affected by the compaction of the grout in the prestressing ducts. However, the ducts are hidden inside the beam, and their grouting density is difficult to detect. In this study, test models representing three different states of grouting quality were fabricated, and the impact-echo method was then adopted to detect the grouting quality of the tendon ducts. The findings are summarized as follows: as the reflection time from the slab bottom and the nominal slab thickness increase, the degree of compaction increases; when testing from the half-hole of the web, the reflection time and nominal slab thickness were largest; and the reflection times of compacted and uncompacted tendon ducts differed. Finally, the method was verified on an engineering project, providing a useful reference.

  2. Quality detection system and method of micro-accessory based on microscopic vision

    NASA Astrophysics Data System (ADS)

    Li, Dongjie; Wang, Shiwei; Fu, Yu

    2017-10-01

    Considering that traditional manual inspection of micro-accessories suffers from heavy workload, low efficiency and large operator error, a quality inspection system for micro-accessories has been designed. Micro-vision technology is used to inspect quality, which optimizes the structure of the detection system. A stepper motor drives a rotating micro-platform to transfer the parts to be inspected, and a microscopic vision system acquires image information of the micro-accessory. Image processing and pattern matching, a variable-scale Sobel differential edge detection algorithm and an improved Zernike-moment sub-pixel edge detection algorithm are combined in the system to achieve more detailed and accurate detection of defect edges. The proposed system accurately extracts edges even from complex signals and can therefore distinguish qualified from unqualified products with high recognition precision.
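
    The record combines a variable-scale Sobel differential edge detector with Zernike-moment sub-pixel refinement, neither of which is spelled out in the abstract. The sketch below shows only a generic fixed-scale Sobel gradient edge map as an illustration of the coarse edge-detection stage; the function name, kernel size and threshold are assumptions, not the published algorithm.

```python
# Minimal sketch of a coarse edge-detection stage only: a fixed-scale Sobel
# gradient map with a simple threshold. The variable-scale Sobel operator and
# the Zernike-moment sub-pixel refinement described in the record are not
# reproduced here; names and parameters are illustrative.
import cv2
import numpy as np

def detect_coarse_edges(gray: np.ndarray, ksize: int = 3, thresh: float = 60.0) -> np.ndarray:
    """Return a binary edge map from the Sobel gradient magnitude."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=ksize)   # horizontal gradient
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=ksize)   # vertical gradient
    mag = np.hypot(gx, gy)                                 # gradient magnitude
    return (mag > thresh).astype(np.uint8) * 255

# Example usage (hypothetical file name):
# img = cv2.imread("micro_accessory.png", cv2.IMREAD_GRAYSCALE)
# edges = detect_coarse_edges(img)
```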

  3. A simple method for low-contrast detectability, image quality and dose optimisation with CT iterative reconstruction algorithms and model observers.

    PubMed

    Bellesi, Luca; Wyttenbach, Rolf; Gaudino, Diego; Colleoni, Paolo; Pupillo, Francesco; Carrara, Mauro; Braghetti, Antonio; Puligheddu, Carla; Presilla, Stefano

    2017-01-01

    The aim of this work was to evaluate detection of low-contrast objects and image quality in computed tomography (CT) phantom images acquired at different tube loadings (i.e. mAs) and reconstructed with different algorithms, in order to find appropriate settings to reduce the dose to the patient without any image detriment. Images of supraslice low-contrast objects of a CT phantom were acquired using different mAs values. Images were reconstructed using filtered back projection (FBP), hybrid and iterative model-based methods. Image quality parameters were evaluated in terms of modulation transfer function, noise, and uniformity using two software resources. For the definition of low-contrast detectability, studies based on both human (i.e. four-alternative forced-choice test) and model observers were performed across the various images. Compared to FBP, image quality parameters were improved by using iterative reconstruction (IR) algorithms. In particular, IR model-based methods provided a 60% noise reduction and a 70% dose reduction, preserving image quality and low-contrast detectability for human radiological evaluation. According to the model observer, the diameters of the minimum detectable detail were around 2 mm (up to 100 mAs). Below 100 mAs, the model observer was unable to provide a result. IR methods improve CT protocol quality, providing a potential dose reduction while maintaining good image detectability. The model observer can in principle be useful to assist human performance in CT low-contrast detection tasks and in dose optimisation.

  4. Detection of coliform bacteria and Escherichia coli by multiplex polymerase chain reaction: comparison with defined substrate and plating methods for water quality monitoring.

    PubMed Central

    Bej, A K; McCarty, S C; Atlas, R M

    1991-01-01

    Multiplex polymerase chain reaction (PCR) and gene probe detection of the target lacZ and uidA genes were used to detect total coliform bacteria and Escherichia coli, respectively, for determining water quality. In tests of environmental water samples, the lacZ PCR method gave results statistically equivalent to those of the plate count and defined substrate methods accepted by the U.S. Environmental Protection Agency for water quality monitoring, and the uidA PCR method was more sensitive than 4-methylumbelliferyl-beta-D-glucuronide-based defined substrate tests for specific detection of E. coli. PMID:1768116

  5. Anomaly Detection in Power Quality at Data Centers

    NASA Technical Reports Server (NTRS)

    Grichine, Art; Solano, Wanda M.

    2015-01-01

    The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
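
    The abstract states that voltage, current and total harmonic distortion follow Gaussian distributions and that set-points are chosen from this model while keeping the false positive count low. The following is a minimal sketch of that idea, assuming simple symmetric thresholds derived from a Gaussian fit for a chosen per-sample false-positive rate; the function names and example numbers are illustrative, not the NCCIPS implementation.

```python
# Illustrative sketch: derive alarm set-points for a power-quality signal
# assumed to be Gaussian, so that the expected false-positive rate per
# sample equals `fp_rate`.
import numpy as np
from scipy.stats import norm

def gaussian_setpoints(samples: np.ndarray, fp_rate: float = 0.001):
    """Return (low, high) thresholds for anomaly flagging."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    z = norm.ppf(1.0 - fp_rate / 2.0)      # two-sided critical value
    return mu - z * sigma, mu + z * sigma

def is_anomalous(value: float, low: float, high: float) -> bool:
    return value < low or value > high

# Example with synthetic voltage readings (hypothetical numbers):
# v = np.random.normal(480.0, 2.0, size=10_000)
# lo, hi = gaussian_setpoints(v, fp_rate=0.001)
```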

  6. Outlier Detection in Urban Air Quality Sensor Networks.

    PubMed

    van Zoest, V M; Stein, A; Hoek, G

    2018-01-01

    Low-cost urban air quality sensor networks are increasingly used to study the spatio-temporal variability in air pollutant concentrations. Recently installed low-cost urban sensors, however, are more prone to producing erroneous data than conventional monitors, e.g., leading to outliers. Commonly applied outlier detection methods are unsuitable for air pollutant measurements that have large spatial and temporal variations as occur in urban areas. We present a novel outlier detection method based upon a spatio-temporal classification, focusing on hourly NO2 concentrations. We divide a full year's observations into 16 spatio-temporal classes, reflecting urban background vs. urban traffic stations, weekdays vs. weekends, and four periods per day. For each spatio-temporal class, we detect outliers using the mean and standard deviation of the normal distribution underlying the truncated normal distribution of the NO2 observations. Applying this method to a low-cost air quality sensor network in the city of Eindhoven, the Netherlands, we found 0.1-0.5% of outliers. Outliers could reflect measurement errors or unusually high air pollution events. Additional evaluation using expert knowledge is needed to decide on treatment of the identified outliers. We conclude that our method is able to detect outliers while maintaining the spatio-temporal variability of air pollutant concentrations in urban areas.
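
    The published method fits a truncated normal distribution per spatio-temporal class and uses the mean and standard deviation of the underlying normal to flag outliers. The sketch below is a simplified stand-in that groups hourly values by class and flags observations more than k standard deviations from the class mean using plain sample statistics; the column names, class construction and threshold k are assumptions.

```python
# Simplified sketch of per-class outlier flagging. The published method
# estimates the mean/sd of the *underlying* normal from a truncated-normal
# fit; here plain sample statistics are used instead, and the class labels,
# column names and threshold k are assumptions.
import pandas as pd

def flag_outliers(df: pd.DataFrame, k: float = 3.0) -> pd.Series:
    """df needs columns: 'no2', 'station_type', 'daytype', 'period'."""
    cls = df["station_type"] + "_" + df["daytype"] + "_" + df["period"]
    stats = df.groupby(cls)["no2"].agg(["mean", "std"])
    mu = cls.map(stats["mean"])
    sd = cls.map(stats["std"])
    return (df["no2"] - mu).abs() > k * sd   # True = candidate outlier
```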

  7. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery
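
    The core of the SCED idea is a chain of independent checks executed in a fixed order, where the first failing layer identifies the origin of the error. A structural sketch of such a chain follows; the two example checks and their tolerances are placeholders, not the published tests or tolerance values.

```python
# Structural sketch of a Swiss-cheese style check chain: each layer is an
# independent test of one measured EPID frame against its prediction, run in
# sequence, and the first failing layer names the likely error origin.
# The check functions and tolerances below are placeholders.
import numpy as np

def aperture_ok(meas, pred, tol=0.02):
    # Verify no out-of-field radiation: signal outside the predicted field
    # must stay below a small fraction of the total signal.
    field = pred > 0.1 * pred.max()
    return meas[~field].sum() <= tol * meas.sum()

def output_ok(meas, pred, tol=0.03):
    # Global output normalization within tolerance.
    return abs(meas.sum() / pred.sum() - 1.0) <= tol

def run_sced(meas, pred, checks):
    """checks: list of (name, check_fn). Return first failed name, or None."""
    for name, check in checks:
        if not check(meas, pred):
            return name
    return None

# checks = [("aperture", aperture_ok), ("output", output_ok)]
# failed = run_sced(measured_frame, predicted_frame, checks)
```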

  8. Image Quality Ranking Method for Microscopy

    PubMed Central

    Koho, Sami; Fazeli, Elnaz; Eriksson, John E.; Hänninen, Pekka E.

    2016-01-01

    Automated analysis of microscope images is necessitated by the increased need for high-resolution follow-up of events in time. Manually finding the right images to analyze, or to eliminate from data analysis, is a common day-to-day problem in microscopy research today, and the constantly growing size of image datasets does not help matters. We propose a simple method and a software tool for sorting images within a dataset, according to their relative quality. We demonstrate the applicability of our method in finding good quality images in a STED microscope sample preparation optimization image dataset. The results are validated by comparisons to subjective opinion scores, as well as five state-of-the-art blind image quality assessment methods. We also show how our method can be applied to eliminate useless out-of-focus images in a High-Content-Screening experiment. We further evaluate the ability of our image quality ranking method to detect out-of-focus images, by extensive simulations, and by comparing its performance against previously published, well-established microscopy autofocus metrics. PMID:27364703

  9. Effect of image quality on calcification detection in digital mammography

    PubMed Central

    Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M.; Wallis, Matthew G.; Chakraborty, Dev P.; Dance, David R.; Bosmans, Hilde; Young, Kenneth C.

    2012-01-01

    Purpose: This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including: different detectors, dose levels, and different image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at quarter this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC

  10. [A review on studies and applications of near infrared spectroscopy technique(NIRS) in detecting quality of hay].

    PubMed

    Ding, Wu-Rong; Gan, You-Min; Guo, Xu-Sheng; Yang, Fu-Yu

    2009-02-01

    The quality of hay directly affects both its price and livestock productivity. Many methods have been developed for detecting hay quality, and near infrared spectroscopy (NIRS) has been widely used because the detection process is fast, effective and nondestructive. In the present paper, the feasibility and effectiveness of applying NIRS to hay quality detection are expounded. Advances in using NIRS to detect chemical composition, the extent of epiphyte incursion, the amount of toxicants excreted by endogenetic epiphytes, and some trace components that cannot be detected by chemical methods are also introduced in detail. Based on this review of progress in using NIRS to detect hay quality, it can be concluded that NIRS avoids the time consumption, complexity and high cost of traditional chemical methods. For better utilization of NIRS in practice, further studies are still needed to perfect and improve the use of NIRS for detecting forage quality, and more accurate models and systematic analysis software need to be established.

  11. Method and apparatus for detecting neutrons

    DOEpatents

    Perkins, Richard W.; Reeder, Paul L.; Wogman, Ned A.; Warner, Ray A.; Brite, Daniel W.; Richey, Wayne C.; Goldman, Don S.

    1997-01-01

    The instant invention is a method for making and using an apparatus for detecting neutrons. Scintillating optical fibers are fabricated by melting SiO2 with a thermal neutron capturing substance and a scintillating material in a reducing atmosphere. The melt is then drawn into fibers in an anoxic atmosphere. The fibers may then be coated and used directly in a neutron detection apparatus, or assembled into a geometrical array in a second, hydrogen-rich, scintillating material such as a polymer. Photons generated by interaction with thermal neutrons are trapped within the coated fibers and are directed to photoelectric converters. A measurable electronic signal is generated for each thermal neutron interaction within the fiber. These electronic signals are then manipulated, stored, and interpreted by normal methods to infer the quality and quantity of incident radiation. When the fibers are arranged in an array within a second scintillating material, photons generated by kinetic neutrons interacting with the second scintillating material and photons generated by thermal neutron capture within the fiber can both be directed to photoelectric converters. These electronic signals are then manipulated, stored, and interpreted by normal methods to infer the quality and quantity of incident radiation.

  12. Automatic detection of retina disease: robustness to image quality and localization of anatomy structure.

    PubMed

    Karnowski, T P; Aykac, D; Giancardo, L; Li, Y; Nichols, T; Tobin, K W; Chaum, E

    2011-01-01

    The automated detection of diabetic retinopathy and other eye diseases in images of the retina has great promise as a low-cost method for broad-based screening. Many systems in the literature which perform automated detection include a quality estimation step and physiological feature detection, including the vascular tree and the optic nerve / macula location. In this work, we study the robustness of an automated disease detection method with respect to the accuracy of the optic nerve location and the quality of the images obtained as judged by a quality estimation algorithm. The detection algorithm features microaneurysm and exudate detection followed by feature extraction on the detected population to describe the overall retina image. Labeled images of retinas ground-truthed to disease states are used to train a supervised learning algorithm to identify the disease state of the retina image and exam set. Under the restrictions of high confidence optic nerve detections and good quality imagery, the system achieves a sensitivity and specificity of 94.8% and 78.7% with area-under-curve of 95.3%. Analysis of the effect of constraining quality and the distinction between mild non-proliferative diabetic retinopathy, normal retina images, and more severe disease states is included.

  13. Applications of hyperspectral imaging in chicken meat safety and quality detection and evaluation: a review.

    PubMed

    Xiong, Zhenjie; Xie, Anguo; Sun, Da-Wen; Zeng, Xin-An; Liu, Dan

    2015-01-01

    Currently, the issue of food safety and quality is of great public concern. In order to satisfy the demands of consumers and obtain superior food quality, non-destructive and fast methods are required for quality evaluation. As one of these methods, the hyperspectral imaging (HSI) technique has emerged as a smart and promising analytical tool for quality evaluation purposes and has attracted much interest in the non-destructive analysis of different food products. With the main advantage of combining spectroscopy and imaging, the HSI technique has shown convincing capability for detecting and evaluating chicken meat quality objectively. Moreover, developing a quality evaluation system based on HSI technology would bring economic benefits to the chicken meat industry. Therefore, in recent years, many studies have been conducted on using HSI technology for the safety and quality detection and evaluation of chicken meat. The aim of this review is thus to give a detailed overview of HSI and to focus on recently developed HSI-based methods for microbiological spoilage detection and quality classification of chicken meat. Moreover, the usefulness of the HSI technique for detecting fecal contamination and bone fragments on chicken carcasses is presented. Finally, some viewpoints on future research and its applicability in the modern poultry industry are proposed.

  14. Method and apparatus for detecting neutrons

    DOEpatents

    Perkins, R.W.; Reeder, P.L.; Wogman, N.A.; Warner, R.A.; Brite, D.W.; Richey, W.C.; Goldman, D.S.

    1997-10-21

    The instant invention is a method for making and using an apparatus for detecting neutrons. Scintillating optical fibers are fabricated by melting SiO2 with a thermal neutron capturing substance and a scintillating material in a reducing atmosphere. The melt is then drawn into fibers in an anoxic atmosphere. The fibers may then be coated and used directly in a neutron detection apparatus, or assembled into a geometrical array in a second, hydrogen-rich, scintillating material such as a polymer. Photons generated by interaction with thermal neutrons are trapped within the coated fibers and are directed to photoelectric converters. A measurable electronic signal is generated for each thermal neutron interaction within the fiber. These electronic signals are then manipulated, stored, and interpreted by normal methods to infer the quality and quantity of incident radiation. When the fibers are arranged in an array within a second scintillating material, photons generated by kinetic neutrons interacting with the second scintillating material and photons generated by thermal neutron capture within the fiber can both be directed to photoelectric converters. These electronic signals are then manipulated, stored, and interpreted by normal methods to infer the quality and quantity of incident radiation. 5 figs.

  15. Effect of image quality on calcification detection in digital mammography.

    PubMed

    Warren, Lucy M; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M; Wallis, Matthew G; Chakraborty, Dev P; Dance, David R; Bosmans, Hilde; Young, Kenneth C

    2012-06-01

    This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including: different detectors, dose levels, and different image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. One hundred and sixty two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at quarter this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC (AFROC) area decreased from

  16. Shallow Reflection Method for Water-Filled Void Detection and Characterization

    NASA Astrophysics Data System (ADS)

    Zahari, M. N. H.; Madun, A.; Dahlan, S. H.; Joret, A.; Hazreek, Z. A. M.; Mohammad, A. H.; Izzaty, R. A.

    2018-04-01

    Shallow investigation is crucial for characterizing the subsurface voids commonly encountered in civil engineering, and one commonly used technique is the seismic-reflection technique. An assessment of the effectiveness of such an approach is critical to determine whether the quality of the works meets the prescribed requirements. Conventional quality testing suffers from limitations including limited coverage (in both area and depth) and problems with resolution quality. Traditionally, quality assurance measurements use laboratory and in-situ invasive and destructive tests. Geophysical approaches, however, which are typically non-invasive and non-destructive, offer a cost-effective means of improving detection. Among these, seismic reflection has proved useful for assessing void characteristics; this paper evaluates the application of the shallow seismic-reflection method in characterizing the properties of a water-filled void at 0.34 m depth, specifically for the detection and characterization of the void using 2-dimensional tomography.

  17. Automated macromolecular crystal detection system and method

    DOEpatents

    Christian, Allen T [Tracy, CA; Segelke, Brent [San Ramon, CA; Rupp, Bernard [Livermore, CA; Toppani, Dominique [Fontainebleau, FR

    2007-06-05

    An automated macromolecular method and system for detecting crystals in two-dimensional images, such as light microscopy images obtained from an array of crystallization screens. Edges are detected from the images by identifying local maxima of a phase congruency-based function associated with each image. The detected edges are segmented into discrete line segments, which are subsequently geometrically evaluated with respect to each other to identify any crystal-like qualities such as, for example, parallel lines, facing each other, similarity in length, and relative proximity. And from the evaluation a determination is made as to whether crystals are present in each image.

  18. New reporting procedures based on long-term method detection levels and some considerations for interpretations of water-quality data provided by the U.S. Geological Survey National Water Quality Laboratory

    USGS Publications Warehouse

    Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.

    1999-01-01

    This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user. A decision to censor data after they are reported from the laboratory may still be made by the user, if
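
    The reporting convention described above (laboratory reporting level set at twice the long-term method detection level, estimated concentrations between the two levels, and non-detections censored to the laboratory reporting level) can be summarized in a small helper; the function name and output format below are illustrative only.

```python
# Minimal sketch of the reporting convention as described in the record:
# LRL = 2 x LT-MDL; detections between LT-MDL and LRL are reported as
# estimated ("E"); non-detections are censored to "< LRL".
def report_value(measured: float, ltmdl: float) -> str:
    lrl = 2.0 * ltmdl                      # laboratory reporting level
    if measured < ltmdl:
        return f"< {lrl:g}"                # censored non-detection
    if measured < lrl:
        return f"E {measured:g}"           # estimated concentration
    return f"{measured:g}"                 # quantified above the LRL

# report_value(0.004, ltmdl=0.006)  ->  '< 0.012'
# report_value(0.008, ltmdl=0.006)  ->  'E 0.008'
```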

  19. A portable device for rapid nondestructive detection of fresh meat quality

    NASA Astrophysics Data System (ADS)

    Lin, Wan; Peng, Yankun

    2014-05-01

    Quality attributes of fresh meat influence its nutritional value and consumers' purchasing decisions. In order to meet the demand of inspection departments for a portable device, a rapid and nondestructive detection device for fresh meat quality based on an ARM (Advanced RISC Machines) processor and VIS/NIR technology was designed. The working principle, hardware composition, software system and functional tests are described. The hardware system consists of the ARM processing unit, light source unit, detection probe unit, spectral data acquisition unit, LCD (Liquid Crystal Display) touch screen display unit, power unit and cooling unit. A Linux operating system and an application for acquiring and processing the quality parameters were developed. The system integrates spectral signal collection, storage, display and processing, with a total weight of 3.5 kg. Forty beef samples were used in experiments to validate its stability and reliability. The results indicated that prediction models developed with the PLSR method, using SNV as the pre-processing method, performed well, with validation-set correlation coefficients and root mean square errors of 0.90 and 1.56 for L*, 0.95 and 1.74 for a*, 0.94 and 0.59 for b*, 0.88 and 0.13 for pH, 0.79 and 12.46 for tenderness, and 0.89 and 0.91 for water content, respectively. The experimental results show that this device can be a useful tool for detecting meat quality.
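
    The modelling step reported here is partial least squares regression (PLSR) on VIS/NIR spectra with standard normal variate (SNV) pre-processing. A minimal sketch using scikit-learn follows; the number of latent variables, data shapes and variable names are assumptions.

```python
# Sketch of the modelling step described in the record: standard normal
# variate (SNV) pre-processing of each spectrum followed by a PLSR model.
# The number of latent variables and the data shapes are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def snv(spectra: np.ndarray) -> np.ndarray:
    """Row-wise SNV: subtract each spectrum's mean, divide by its std."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

# X: (n_samples, n_wavelengths) VIS/NIR spectra; y: reference values (e.g. L*)
# pls = PLSRegression(n_components=8).fit(snv(X_train), y_train)
# r = np.corrcoef(y_val, pls.predict(snv(X_val)).ravel())[0, 1]
```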

  20. The advance of non-invasive detection methods in osteoarthritis

    NASA Astrophysics Data System (ADS)

    Dai, Jiao; Chen, Yanping

    2011-06-01

    Osteoarthritis (OA) is one of the most prevalent chronic diseases; it seriously affects patients' quality of life and imposes a considerable economic burden. Detection and evaluation technologies can provide basic information for early treatment. A variety of imaging methods used in OA are reviewed, such as conventional X-ray, computed tomography (CT), ultrasound (US), magnetic resonance imaging (MRI) and near-infrared spectroscopy (NIRS). Among the existing imaging modalities, the spatial resolution of X-ray is extremely high; CT is a three-dimensional method with high density resolution; US, as an evaluation method for knee OA, sensitively discriminates degenerative cartilage from normal cartilage; MRI, a sensitive and nonionizing method, is suitable for the detection of early OA, but it is too expensive for routine use; NIRS is a safe, low-cost modality that is also good at detecting early-stage OA. In short, each method has its own advantages, but NIRS has the broadest application prospects and is likely to be used in daily clinical routine and to become a gold standard for diagnostic detection.

  1. Quality assurance using outlier detection on an automatic segmentation method for the cerebellar peduncles

    NASA Astrophysics Data System (ADS)

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed to assess the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
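
    The second stage described above trains four classifiers on the designed features and evaluates them with leave-one-out cross validation. A minimal scikit-learn sketch of that evaluation loop follows; the feature matrix X, labels y and the hyperparameters are placeholders.

```python
# Sketch of the supervised failure-detection evaluation: four classifiers
# compared with leave-one-out cross validation. X (designed features) and
# y (success/failure labels) are placeholders.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(kernel="rbf"),
    "RFC": RandomForestClassifier(n_estimators=200, random_state=0),
}

# for name, clf in classifiers.items():
#     acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
#     print(f"{name}: leave-one-out accuracy = {acc:.3f}")
```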

  2. Nonparametric rank regression for analyzing water quality concentration data with multiple detection limits.

    PubMed

    Fu, Liya; Wang, You-Gan

    2011-02-15

    Environmental data usually include measurements, such as water quality data, which fall below detection limits, because of limitations of the instruments or of certain analytical methods used. The fact that some responses are not detected needs to be properly taken into account in statistical analysis of such data. However, it is well-known that it is challenging to analyze a data set with detection limits, and we often have to rely on the traditional parametric methods or simple imputation methods. Distributional assumptions can lead to biased inference and justification of distributions is often not possible when the data are correlated and there is a large proportion of data below detection limits. The extent of bias is usually unknown. To draw valid conclusions and hence provide useful advice for environmental management authorities, it is essential to develop and apply an appropriate statistical methodology. This paper proposes rank-based procedures for analyzing non-normally distributed data collected at different sites over a period of time in the presence of multiple detection limits. To take account of temporal correlations within each site, we propose an optimal linear combination of estimating functions and apply the induced smoothing method to reduce the computational burden. Finally, we apply the proposed method to the water quality data collected at Susquehanna River Basin in United States of America, which clearly demonstrates the advantages of the rank regression models.

  3. Infrared machine vision system for the automatic detection of olive fruit quality.

    PubMed

    Guzmán, Elena; Baeten, Vincent; Pierna, Juan Antonio Fernández; García-Mesa, José A

    2013-11-15

    External quality is an important factor in the extraction of olive oil and the marketing of olive fruits. The appearance and presence of external damage are factors that influence the quality of the oil extracted and the perception of consumers, determining the level of acceptance prior to purchase in the case of table olives. The aim of this paper is to report on artificial vision techniques developed for the online estimation of olive quality and to assess the effectiveness of these techniques in evaluating quality based on detecting external defects. This method of classifying olives according to the presence of defects is based on an infrared (IR) vision system. Images of defects were acquired using a digital monochrome camera with band-pass filters in the near-infrared (NIR). The original images were processed using segmentation algorithms, edge detection and pixel intensity values to classify the whole fruit. The detection of defects involved a pixel classification procedure based on nonparametric models of the healthy and defective areas of olives. Classification tests were performed on olives to assess the effectiveness of the proposed method. This research showed that the IR vision system is a useful technology for the automatic assessment of olives, with potential for use in offline inspection and in online sorting for defects and the presence of surface damage, easily distinguishing those fruits that do not meet minimum quality requirements.

  4. High-quality JPEG compression history detection for fake uncompressed images

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Wang, Rang-Ding; Guo, Li-Jun; Jiang, Bao-Chuan

    2017-05-01

    Authenticity is one of the most important evaluation factors of images for photography competitions or journalism. Unusual compression history of an image often implies the illicit intent of its author. Our work aims at distinguishing real uncompressed images from fake uncompressed images that are saved in uncompressed formats but have been previously compressed. To detect the potential image JPEG compression, we analyze the JPEG compression artifacts based on the tetrolet covering, which corresponds to the local image geometrical structure. Since the compression can alter the structure information, the tetrolet covering indexes may be changed if a compression is performed on the test image. Such changes can provide valuable clues about the image compression history. To be specific, the test image is first compressed with different quality factors to generate a set of temporary images. Then, the test image is compared with each temporary image block-by-block to investigate whether the tetrolet covering index of each 4×4 block is different between them. The percentages of the changed tetrolet covering indexes corresponding to the quality factors (from low to high) are computed and used to form the p-curve, the local minimum of which may indicate the potential compression. Our experimental results demonstrate the advantage of our method to detect JPEG compressions of high quality, even the highest quality factors such as 98, 99, or 100 of the standard JPEG compression, from uncompressed-format images. At the same time, our detection algorithm can accurately identify the corresponding compression quality factor.
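
    The detection idea is to recompress the test image at a range of quality factors, measure how many blocks change, and look for a local minimum in the resulting p-curve. The sketch below illustrates only that outer loop, substituting a plain per-block mean absolute difference for the tetrolet covering index used in the paper; the block-change threshold is an assumption.

```python
# Simplified sketch of the recompress-and-compare idea. The published method
# compares tetrolet covering indexes per 4x4 block; here a plain per-block
# mean absolute difference is used as a stand-in, so this only illustrates
# how the p-curve over quality factors is formed.
import io
import numpy as np
from PIL import Image

def p_curve(img, qualities=range(50, 101)):
    """Return {quality_factor: fraction of 4x4 blocks judged 'changed'}."""
    ref = np.asarray(img.convert("L"), dtype=np.float64)
    h, w = (ref.shape[0] // 4) * 4, (ref.shape[1] // 4) * 4
    curve = {}
    for q in qualities:
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=q)          # temporary recompression
        buf.seek(0)
        tmp = np.asarray(Image.open(buf).convert("L"), dtype=np.float64)
        diff = np.abs(ref[:h, :w] - tmp[:h, :w]).reshape(h // 4, 4, w // 4, 4)
        changed = diff.mean(axis=(1, 3)) > 0.5            # per-block change test
        curve[q] = changed.mean()
    return curve

# A pronounced local minimum of the curve near some quality factor q suggests
# the image was previously JPEG-compressed at (about) that quality.
```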

  5. A study on real-time low-quality content detection on Twitter from the users' perspective.

    PubMed

    Chen, Weiling; Yeo, Chai Kiat; Lau, Chiew Tong; Lee, Bu Sung

    2017-01-01

    Detection techniques of malicious content such as spam and phishing on Online Social Networks (OSN) are common with little attention paid to other types of low-quality content which actually impacts users' content browsing experience most. The aim of our work is to detect low-quality content from the users' perspective in real time. To define low-quality content comprehensibly, Expectation Maximization (EM) algorithm is first used to coarsely classify low-quality tweets into four categories. Based on this preliminary study, a survey is carefully designed to gather users' opinions on different categories of low-quality content. Both direct and indirect features including newly proposed features are identified to characterize all types of low-quality content. We then further combine word level analysis with the identified features and build a keyword blacklist dictionary to improve the detection performance. We manually label an extensive Twitter dataset of 100,000 tweets and perform low-quality content detection in real time based on the characterized significant features and word level analysis. The results of our research show that our method has a high accuracy of 0.9711 and a good F1 of 0.8379 based on a random forest classifier with real time performance in the detection of low-quality content in tweets. Our work therefore achieves a positive impact in improving user experience in browsing social media content.

  6. A study on real-time low-quality content detection on Twitter from the users’ perspective

    PubMed Central

    Yeo, Chai Kiat; Lau, Chiew Tong; Lee, Bu Sung

    2017-01-01

    Detection techniques of malicious content such as spam and phishing on Online Social Networks (OSN) are common with little attention paid to other types of low-quality content which actually impacts users’ content browsing experience most. The aim of our work is to detect low-quality content from the users’ perspective in real time. To define low-quality content comprehensibly, Expectation Maximization (EM) algorithm is first used to coarsely classify low-quality tweets into four categories. Based on this preliminary study, a survey is carefully designed to gather users’ opinions on different categories of low-quality content. Both direct and indirect features including newly proposed features are identified to characterize all types of low-quality content. We then further combine word level analysis with the identified features and build a keyword blacklist dictionary to improve the detection performance. We manually label an extensive Twitter dataset of 100,000 tweets and perform low-quality content detection in real time based on the characterized significant features and word level analysis. The results of our research show that our method has a high accuracy of 0.9711 and a good F1 of 0.8379 based on a random forest classifier with real time performance in the detection of low-quality content in tweets. Our work therefore achieves a positive impact in improving user experience in browsing social media content. PMID:28793347

  7. DEVELOPMENT OF MOLECULAR METHODS TO DETECT ...

    EPA Pesticide Factsheets

    A large number of human enteric viruses are known to cause gastrointestinal illness and waterborne outbreaks. Many of these are emerging viruses that do not grow or grow poorly in cell culture, so molecular detection methods based on the polymerase chain reaction (PCR) are being developed. Current studies focus on detecting two virus groups, the caliciviruses and the hepatitis E virus strains, both of which have been found to cause significant outbreaks via contaminated drinking water. Once developed, these methods will be used to collect occurrence data for risk assessment studies. Objectives: develop sensitive techniques to detect and identify emerging human waterborne pathogenic viruses and viruses on the CCL; determine the effectiveness of viral indicators for measuring microbial quality in water matrices; support activities: (a) culture and distribution of mammalian cells for Agency and scientific community research needs, (b) provide operator expertise for research requiring confocal and electron microscopy, (c) glassware cleaning, sterilization and biological waste disposal for the Cincinnati EPA facility, (d) operation of the infectious pathogen suite, (e) maintenance of walk-in constant temperature rooms and (f) provision of Giardia cysts.

  8. A nonlinear quality-related fault detection approach based on modified kernel partial least squares.

    PubMed

    Jiao, Jianfang; Zhao, Ning; Wang, Guang; Yin, Shen

    2017-01-01

    In this paper, a new nonlinear quality-related fault detection method is proposed based on kernel partial least squares (KPLS) model. To deal with the nonlinear characteristics among process variables, the proposed method maps these original variables into feature space in which the linear relationship between kernel matrix and output matrix is realized by means of KPLS. Then the kernel matrix is decomposed into two orthogonal parts by singular value decomposition (SVD) and the statistics for each part are determined appropriately for the purpose of quality-related fault detection. Compared with relevant existing nonlinear approaches, the proposed method has the advantages of simple diagnosis logic and stable performance. A widely used literature example and an industrial process are used for the performance evaluation of the proposed method.

  9. Critical considerations for the application of environmental DNA methods to detect aquatic species

    USGS Publications Warehouse

    Goldberg, Caren S.; Turner, Cameron R.; Deiner, Kristy; Klymus, Katy E.; Thomsen, Philip Francis; Murphy, Melanie A.; Spear, Stephen F.; McKee, Anna; Oyler-McCance, Sara J.; Cornman, Robert S.; Laramie, Matthew B.; Mahon, Andrew R.; Lance, Richard F.; Pilliod, David S.; Strickler, Katherine M.; Waits, Lisette P.; Fremier, Alexander K.; Takahara, Teruhiko; Herder, Jelger E.; Taberlet, Pierre

    2016-01-01

    Species detection using environmental DNA (eDNA) has tremendous potential for contributing to the understanding of the ecology and conservation of aquatic species. Detecting species using eDNA methods, rather than directly sampling the organisms, can reduce impacts on sensitive species and increase the power of field surveys for rare and elusive species. The sensitivity of eDNA methods, however, requires a heightened awareness and attention to quality assurance and quality control protocols. Additionally, the interpretation of eDNA data demands careful consideration of multiple factors. As eDNA methods have grown in application, diverse approaches have been implemented to address these issues. With interest in eDNA continuing to expand, supportive guidelines for undertaking eDNA studies are greatly needed. Environmental DNA researchers from around the world have collaborated to produce this set of guidelines and considerations for implementing eDNA methods to detect aquatic macroorganisms. Critical considerations for study design include preventing contamination in the field and the laboratory, choosing appropriate sample analysis methods, validating assays, testing for sample inhibition and following minimum reporting guidelines. Critical considerations for inference include temporal and spatial processes, limits of correlation of eDNA with abundance, uncertainty of positive and negative results, and potential sources of allochthonous DNA. We present a synthesis of knowledge at this stage for application of this new and powerful detection method.

  10. QRS detection based ECG quality assessment.

    PubMed

    Hayn, Dieter; Jammerbund, Bernhard; Schreier, Günter

    2012-09-01

    Although immediate feedback concerning ECG signal quality during recording is useful, up to now not much literature describing quality measures is available. We have implemented and evaluated four ECG quality measures. Empty lead criterion (A), spike detection criterion (B) and lead crossing point criterion (C) were calculated from basic signal properties. Measure D quantified the robustness of QRS detection when applied to the signal. An advanced Matlab-based algorithm combining all four measures and a simplified algorithm for Android platforms, excluding measure D, were developed. Both algorithms were evaluated by taking part in the Computing in Cardiology Challenge 2011. Each measure's accuracy and computing time was evaluated separately. During the challenge, the advanced algorithm correctly classified 93.3% of the ECGs in the training-set and 91.6 % in the test-set. Scores for the simplified algorithm were 0.834 in event 2 and 0.873 in event 3. Computing time for measure D was almost five times higher than for other measures. Required accuracy levels depend on the application and are related to computing time. While our simplified algorithm may be accurate for real-time feedback during ECG self-recordings, QRS detection based measures can further increase the performance if sufficient computing power is available.
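
    Two of the basic signal-property measures named above (the empty-lead criterion A and the spike-detection criterion B) can be expressed very compactly; the sketch below is an illustration with assumed thresholds, not the published settings, and measures C and D (lead crossing points and QRS-detection robustness) are omitted.

```python
# Sketch of two of the basic signal-property measures named in the record:
# (A) an empty-lead criterion and (B) a spike criterion. Thresholds are
# assumptions, not the published settings.
import numpy as np

def empty_lead(sig: np.ndarray, min_range_mv: float = 0.05) -> bool:
    """Measure A: lead considered empty/flat if its amplitude range is tiny."""
    return (np.max(sig) - np.min(sig)) < min_range_mv

def has_spikes(sig: np.ndarray, k: float = 10.0) -> bool:
    """Measure B: flag implausibly large sample-to-sample jumps."""
    step = np.abs(np.diff(sig))
    return bool(np.any(step > k * (np.median(step) + 1e-9)))

# lead_ok = (not empty_lead(ecg_lead)) and (not has_spikes(ecg_lead))
```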

  11. RAPID, PCR-BASED METHODS FOR MEASURING THE QUALITY OF BATHING BEACH WATERS

    EPA Science Inventory

    The current methods for measuring the quality of recreational waters were developed in the 1970's and were recommended to the States by EPA in 1986. These methods detect and quantify Escherichia coli and enterococci, two bacteria that are consistently associated with fecal wast...

  12. Improving image quality for digital breast tomosynthesis: an automated detection and diffusion-based method for metal artifact reduction

    NASA Astrophysics Data System (ADS)

    Lu, Yao; Chan, Heang-Ping; Wei, Jun; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2017-10-01

    In digital breast tomosynthesis (DBT), the high-attenuation metallic clips marking a previous biopsy site in the breast cause errors in the estimation of attenuation along the ray paths intersecting the markers during reconstruction, which result in interplane and inplane artifacts obscuring the visibility of subtle lesions. We proposed a new metal artifact reduction (MAR) method to improve image quality. Our method uses automatic detection and segmentation to generate a marker location map for each projection (PV). A voting technique based on the geometric correlation among different PVs is designed to reduce false positives (FPs) and to label the pixels on the PVs and the voxels in the imaged volume that represent the location and shape of the markers. An iterative diffusion method replaces the labeled pixels on the PVs with estimated tissue intensity from the neighboring regions while preserving the original pixel values in the neighboring regions. The inpainted PVs are then used for DBT reconstruction. The markers are repainted on the reconstructed DBT slices for radiologists’ information. The MAR method is independent of reconstruction techniques or acquisition geometry. For the training set, the method achieved 100% success rate with one FP in 19 views. For the test set, the success rate by view was 97.2% for core biopsy microclips and 66.7% for clusters of large post-lumpectomy markers with a total of 10 FPs in 58 views. All FPs were large dense benign calcifications that also generated artifacts if they were not corrected by MAR. For the views with successful detection, the metal artifacts were reduced to a level that was not visually apparent in the reconstructed slices. The visibility of breast lesions obscured by the reconstruction artifacts from the metallic markers was restored.
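
    The inpainting step replaces the labeled marker pixels with tissue intensity estimated by iterative diffusion from the surrounding region, while neighbouring pixels keep their original values. A minimal sketch of such a diffusion fill follows; the initialization, iteration count and border handling (np.roll wraps at the edges) are simplifications, not the published scheme.

```python
# Minimal sketch of a diffusion-style inpainting step: pixels inside the
# marker mask are repeatedly replaced by the average of their 4-neighbours
# while all pixels outside the mask keep their original values.
import numpy as np

def diffuse_inpaint(pv: np.ndarray, mask: np.ndarray, n_iter: int = 500) -> np.ndarray:
    """pv: projection view; mask: True where metallic markers were detected."""
    out = pv.astype(np.float64).copy()
    out[mask] = out[~mask].mean()                # crude initial fill
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
                      np.roll(out, 1, 1) + np.roll(out, -1, 1))
        out[mask] = avg[mask]                    # update only masked pixels
    return out
```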

  13. Waterborne Pathogens: Detection Methods and Challenges

    PubMed Central

    Ramírez-Castillo, Flor Yazmín; Loera-Muro, Abraham; Jacques, Mario; Garneau, Philippe; Avelar-González, Francisco Javier; Harel, Josée; Guerrero-Barrera, Alma Lilián

    2015-01-01

    Waterborne pathogens and related diseases are a major public health concern worldwide, not only because of the morbidity and mortality that they cause, but also because of the high cost of their prevention and treatment. These diseases are directly related to environmental deterioration and pollution. Despite continued efforts to maintain water safety, waterborne outbreaks are still reported globally. Proper assessment of pathogens in water and water quality monitoring are key factors for decision-making regarding water distribution systems' infrastructure, the choice of the best water treatment and the prevention of waterborne outbreaks. Powerful, sensitive and reproducible diagnostic tools are being developed to monitor pathogen contamination in water and to detect not only cultivable pathogens but also viable but non-culturable microorganisms, as well as pathogens in biofilms. Quantitative microbial risk assessment (QMRA) is a helpful tool to evaluate the scenarios for pathogen contamination that involve surveillance, detection methods, analysis and decision-making. This review aims to present a research outlook on waterborne outbreaks that have occurred in recent years. This review also focuses on the main molecular techniques for detection of waterborne pathogens and the use of the QMRA approach to protect public health. PMID:26011827

  14. [Development of an automated processing method to detect coronary motion for coronary magnetic resonance angiography].

    PubMed

    Asou, Hiroya; Imada, N; Sato, T

    2010-06-20

    In coronary MR angiography (CMRA), cardiac motion degrades image quality. To improve image quality, detection of cardiac motion, and especially of individual coronary motion, is very important. Usually, the scan delay and duration are determined manually by the operator. We developed a new evaluation method to calculate the static time of an individual coronary artery. First, coronary cine MRI was acquired at a level about 3 cm below the aortic valve (80 images per R-R interval). The chronological change of the signal in each pixel of the images was evaluated with the Fourier transform. Noise reduction by a subtraction process and an extraction process were performed. To extract strongly moving structures such as the coronary arteries, a morphological filter and a labeling process were added. Using this image processing, individual coronary motion was extracted and the individual coronary static time was calculated automatically. We compared images obtained with the ordinary manual method and the new automated method in 10 healthy volunteers. Coronary static times were calculated with our method. The calculated coronary static time was shorter than that of the ordinary manual method, and the scan time became about 10% longer than that of the ordinary method. Image quality was improved with our method. Our automated detection method for the coronary static time, based on a chronological Fourier transform, has the potential to improve the image quality of CMRA and allows easy processing.
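
    The key processing step is a per-pixel temporal Fourier transform of the cine series, from which strongly moving structures such as the coronary arteries are extracted. The sketch below illustrates that step only, using the summed non-DC spectral magnitude as a motion-energy map with a percentile threshold; the threshold and the omitted morphological filtering and labeling are assumptions.

```python
# Sketch of the per-pixel temporal-frequency analysis: FFT each pixel's
# signal over the cine frames, sum the non-DC magnitude as a "motion energy"
# map and threshold it. The percentile threshold is an assumption.
import numpy as np

def motion_map(cine: np.ndarray, percentile: float = 95.0) -> np.ndarray:
    """cine: array of shape (n_frames, ny, nx) covering one R-R interval."""
    spec = np.fft.rfft(cine, axis=0)                    # temporal spectrum per pixel
    energy = np.abs(spec[1:]).sum(axis=0)               # drop DC term, sum magnitudes
    return energy > np.percentile(energy, percentile)   # True = strongly moving pixel
```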

  15. A Strategy to Establish a Quality Assurance/Quality Control Plan for the Application of Biosensors for the Detection of E. coli in Water.

    PubMed

    Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza

    2017-01-03

    Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for a near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities.

  16. A Strategy to Establish a Quality Assurance/Quality Control Plan for the Application of Biosensors for the Detection of E. coli in Water

    PubMed Central

    Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza

    2017-01-01

    Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for a near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities. PMID:28054956

  17. Impact of polymeric membrane breakage on drinking water quality and an online detection method of the breakage.

    PubMed

    Wu, Qilong; Zhang, Zhenghua; Cao, Guodong; Zhang, Xihui

    2017-10-15

    online detection of particle count can be used to evaluate the bacterial risk. It was also suggested that the online detection of particle count after backwashing within 100 s would be a quick and precise method to identify any fiber breakage in time. These results are very important for the safety issue in the application of polymeric membrane to water treatment plants.

  18. PCR-Based Method for Detecting Viral Penetration of Medical Exam Gloves

    PubMed Central

    Broyles, John M.; O'Connell, Kevin P.; Korniewicz, Denise M.

    2002-01-01

    The test approved by the U.S. Food and Drug Administration for assessment of the barrier quality of medical exam gloves includes visual inspection and a water leak test. Neither method tests directly the ability of gloves to prevent penetration by microorganisms. Methods that use microorganisms (viruses and bacteria) to test gloves have been developed but require classical culturing of the organism to detect it. We have developed a PCR assay for bacteriophage φX174 that allows the rapid detection of penetration of gloves by this virus. The method is suitable for use with both latex and synthetic gloves. The presence of glove powder on either latex or synthetic gloves had no effect on the ability of the PCR assay to detect bacteriophage DNA. The assay is rapid, sensitive, and inexpensive; requires only small sample volumes; and can be automated. PMID:12149320

  19. Rapid detection of Salmonella in pet food: design and evaluation of integrated methods based on real-time PCR detection.

    PubMed

    Balachandran, Priya; Friberg, Maria; Vanlandingham, V; Kozak, K; Manolis, Amanda; Brevnov, Maxim; Crowley, Erin; Bird, Patrick; Goins, David; Furtado, Manohar R; Petrauskene, Olga V; Tebbs, Robert S; Charbonneau, Duane

    2012-02-01

    Reducing the risk of Salmonella contamination in pet food is critical for both companion animals and humans, and its importance is reflected by the substantial increase in the demand for pathogen testing. Accurate and rapid detection of foodborne pathogens improves food safety, protects the public health, and benefits food producers by assuring product quality while facilitating product release in a timely manner. Traditional culture-based methods for Salmonella screening are laborious and can take 5 to 7 days to obtain definitive results. In this study, we developed two methods for the detection of low levels of Salmonella in pet food using real-time PCR: (i) detection of Salmonella in 25 g of dried pet food in less than 14 h with an automated magnetic bead-based nucleic acid extraction method and (ii) detection of Salmonella in 375 g of composite dry pet food matrix in less than 24 h with a manual centrifugation-based nucleic acid preparation method. Both methods included a preclarification step using a novel protocol that removes food matrix-associated debris and PCR inhibitors and improves the sensitivity of detection. Validation studies revealed no significant differences between the two real-time PCR methods and the standard U.S. Food and Drug Administration Bacteriological Analytical Manual (chapter 5) culture confirmation method.

  20. The practical application of signal detection theory to image quality assessment in x-ray image intensifier-TV fluoroscopy.

    PubMed

    Marshall, N W

    2001-06-01

    This paper applies a published version of signal detection theory to x-ray image intensifier fluoroscopy data and compares the results with more conventional subjective image quality measures. An eight-bit digital framestore was used to acquire temporally contiguous frames of fluoroscopy data from which the modulation transfer function (MTF(u)) and noise power spectrum were established. These parameters were then combined to give detective quantum efficiency (DQE(u)) and used in conjunction with signal detection theory to calculate contrast-detail performance. DQE(u) was found to lie between 0.1 and 0.5 for a range of fluoroscopy systems. Two separate image quality experiments were then performed in order to assess the correspondence between the objective and subjective methods. First, image quality for a given fluoroscopy system was studied as a function of doserate using objective parameters and a standard subjective contrast-detail method. Following this, the two approaches were used to assess three different fluoroscopy units. Agreement between objective and subjective methods was good; doserate changes were modelled correctly while both methods ranked the three systems consistently.
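
    As a rough illustration of how the two measured quantities combine, the sketch below computes a frequency-dependent DQE from an MTF and a noise power spectrum using the common textbook relation DQE(u) = S^2*MTF(u)^2/(q*NPS(u)); this is a generic formulation, not necessarily the exact one used in the paper, and all numbers are invented.

        import numpy as np

        def dqe(mtf, nps, mean_signal, photon_fluence):
            # Textbook frequency-dependent relation DQE(u) = S^2 * MTF(u)^2 / (q * NPS(u));
            # the exact formulation used in the paper may differ.
            return (mean_signal ** 2) * mtf ** 2 / (photon_fluence * nps)

        # Illustrative (made-up) curves, not data from the study.
        u = np.linspace(0.05, 2.0, 40)          # spatial frequency, cycles/mm
        mtf = np.exp(-1.2 * u)                  # toy MTF
        nps = 1e-4 * np.exp(-0.5 * u)           # toy normalised NPS
        print(dqe(mtf, nps, mean_signal=1.0, photon_fluence=2.5e4)[:3])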

  1. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

    Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratios (CNRs) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
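
    The core of the extraction step can be illustrated with a short sketch: holes are counted when their grey-value difference from the background exceeds a JND-derived threshold ΔG, and a CNR is computed for comparison with the reported 1.5 level. The Weber fraction and all numbers below are assumptions for illustration, not the authors' values.

        import numpy as np

        def extracted_hole_count(hole_means, bg_mean, jnd_fraction=0.02):
            # Count holes whose grey-value difference from the background exceeds a
            # JND-derived threshold Delta-G; the fraction is assumed, not the authors' value.
            delta_g = jnd_fraction * bg_mean
            return int(np.sum(np.abs(np.asarray(hole_means) - bg_mean) >= delta_g))

        def cnr(roi_mean, bg_mean, bg_std):
            # Contrast-to-noise ratio of a hole region against the background.
            return abs(roi_mean - bg_mean) / bg_std

        hole_means = [118, 122, 126, 131, 138]   # toy mean grey values per hole depth
        print(extracted_hole_count(hole_means, bg_mean=115))
        print(cnr(122, 115, 3.0))                # detectable holes reportedly had CNR >= 1.5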

  2. New methods for the detection of viruses: call for review of drinking water quality guidelines.

    PubMed

    Grabow, W O; Taylor, M B; de Villiers, J C

    2001-01-01

    Drinking water supplies which meet international recommendations for source, treatment and disinfection were analysed. Viruses recovered from 100 L-1,000 L volumes by in-line glass wool filters were inoculated in parallel into four cell culture systems. Cell culture inoculation was used to isolate cytopathogenic viruses, amplify the nucleic acid of non-cytopathogenic viruses and confirm viability of viruses. Over a period of two years, viruses were detected in 23% of 413 drinking water samples and 73% of 224 raw water samples. Cytopathogenic viruses were detected in 6% of raw water samples but not in any treated drinking water supplies. Enteroviruses were detected in 17% of drinking water samples, adenoviruses in 4% and hepatitis A virus in 3%. In addition to these viruses, astro- and rotaviruses were detected in raw water. All drinking water supplies had heterotrophic plate counts of < 100/mL, total and faecal coliform counts of 0/100 mL and negative results in qualitative presence-absence tests for somatic and F-RNA coliphages (500 mL samples). These results call for a revision of water quality guidelines based on indicator organisms and vague reference to the absence of viruses.

  3. Multigrid contact detection method

    NASA Astrophysics Data System (ADS)

    He, Kejing; Dong, Shoubin; Zhou, Zhaoyao

    2007-03-01

    Contact detection is a general problem in many physical simulations. This work presents an O(N) multigrid method for general contact detection problems (MGCD), integrating the multigrid idea with contact detection. Both the time complexity and the memory consumption of the MGCD are O(N). Unlike other methods, whose efficiency is strongly influenced by the object size distribution, the performance of the MGCD is insensitive to it. We compare the MGCD with the no binary search (NBS) method and the multilevel boxing method in three dimensions, for both time complexity and memory consumption. For objects of similar size, the MGCD is as good as the NBS method, and both outperform the multilevel boxing method in memory consumption. For objects of diverse size, the MGCD outperforms both the NBS method and the multilevel boxing method. We use the MGCD to solve the contact detection problem for a granular simulation system based on the discrete element method. From this granular simulation, we obtain the packing density of monosize packing and of binary packing with a size ratio of 10. The packing density for monosize particles is 0.636. For binary packing with a size ratio of 10, when the number of small particles is 300 times the number of large particles, a maximal packing density of 0.824 is achieved.
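
    The following sketch shows the single-level core of grid-based contact detection (bin objects into cells, then test only neighbouring cells); the published MGCD extends this idea with a multigrid hierarchy to stay insensitive to the size distribution, which is not reproduced here.

        from collections import defaultdict
        from itertools import product

        def grid_contacts(centers, radii, cell_size):
            # Single-level uniform-grid broad phase: bin spheres by cell, then test only
            # pairs sharing a cell or neighbouring cells. A true multigrid scheme (as in
            # the MGCD) would use one grid level per size class.
            grid = defaultdict(list)
            for i, (x, y, z) in enumerate(centers):
                grid[(int(x // cell_size), int(y // cell_size), int(z // cell_size))].append(i)
            contacts = set()
            for (cx, cy, cz), members in grid.items():
                cand = []
                for dx, dy, dz in product((-1, 0, 1), repeat=3):   # 27 surrounding cells
                    cand.extend(grid.get((cx + dx, cy + dy, cz + dz), []))
                for i in members:
                    for j in cand:
                        if j <= i:
                            continue
                        d2 = sum((a - b) ** 2 for a, b in zip(centers[i], centers[j]))
                        if d2 <= (radii[i] + radii[j]) ** 2:
                            contacts.add((i, j))
            return contacts

        centers = [(0.0, 0.0, 0.0), (0.9, 0.0, 0.0), (5.0, 5.0, 5.0)]
        radii = [0.5, 0.5, 0.5]
        print(grid_contacts(centers, radii, cell_size=1.0))   # {(0, 1)}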

  4. Missed detection of significant positive and negative shifts in gentamicin assay: implications for routine laboratory quality practices.

    PubMed

    Koerbin, Gus; Liu, Jiakai; Eigenstetter, Alex; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-15

    A product recall was issued for the Roche/Hitachi Cobas Gentamicin II assays on 25 May 2016 in Australia, after a 15-20% positive analytical shift was discovered. Laboratories were advised to employ the Thermo Fisher Gentamicin assay as an alternative. Following the reintroduction of the revised assay on 12 September 2016, a second reagent recall was made on 20 March 2017 after the discovery of a 20% negative analytical shift due to an erroneous instrument adjustment factor. The practices of an index laboratory were examined to determine how the analytical shifts evaded detection by routine internal quality control (IQC) and external quality assurance (EQA) systems. The ability of patient result-based approaches, including the moving average (MovAvg) and moving sum of outliers (MovSO) methods, to detect these shifts was examined. Internal quality control data of the index laboratory were acceptable prior to the product recall. The practice of adjusting the IQC target following a change in assay method resulted in the missed negative shift when the revised Roche assay was reintroduced. While the EQA data of the Roche subgroup showed a clear negative bias relative to other laboratory methods, the results were attributed to a possible 'matrix effect'. The MovAvg method detected the positive shift before the product recall. The MovSO method did not detect the negative shift in the index laboratory but did so in another laboratory 5 days before the second product recall. There are gaps in current laboratory quality practices that leave room for analytical errors to evade detection.
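
    A minimal sketch of a patient-result-based moving average check of the kind referred to as MovAvg is given below; the window length, tolerance and the toy gentamicin values are illustrative assumptions, not the settings used in the study.

        import numpy as np

        def moving_average_flags(results, target_mean, window=20, tolerance=0.10):
            # Flag positions where the moving average of consecutive patient results
            # drifts more than `tolerance` (fractional) from the expected mean.
            x = np.asarray(results, dtype=float)
            if len(x) < window:
                return []
            mov_avg = np.convolve(x, np.ones(window) / window, mode="valid")
            shift = (mov_avg - target_mean) / target_mean
            return [i + window - 1 for i, s in enumerate(shift) if abs(s) > tolerance]

        # Toy data: a 20% positive shift introduced halfway through the series.
        rng = np.random.default_rng(0)
        baseline = rng.normal(5.0, 0.5, 60)
        shifted = rng.normal(6.0, 0.5, 60)
        print(moving_average_flags(np.concatenate([baseline, shifted]), target_mean=5.0))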

  5. The Detection Method of Escherichia coli in Water Resources: A Review

    NASA Astrophysics Data System (ADS)

    Nurliyana, M. R.; Sahdan, M. Z.; Wibowo, K. M.; Muslihati, A.; Saim, H.; Ahmad, S. A.; Sari, Y.; Mansor, Z.

    2018-04-01

    This article reviews several approaches to Escherichia coli (E. coli) detection, from conventional methods and emerging methods to biosensor-based techniques. Detection and enumeration of E. coli usually require a long time to obtain a result, since laboratory-based approaches are normally used. Culturing the samples requires 24 to 72 hours after sampling before results are available. Although faster techniques for detecting E. coli in water, such as polymerase chain reaction (PCR) and enzyme-linked immunosorbent assay (ELISA), have been developed, they still require transporting samples from the water source to the laboratory, are costly, involve complicated equipment and complex procedures, and need skilled specialists, all of which limit their widespread use in water quality detection. Consequently, the development of biosensor devices that are easy to use, portable, highly sensitive and selective has become indispensable for detecting extremely low concentrations of pathogenic E. coli in water samples.

  6. Methods of DNA methylation detection

    NASA Technical Reports Server (NTRS)

    Maki, Wusi Chen (Inventor); Filanoski, Brian John (Inventor); Mishra, Nirankar (Inventor); Rastogi, Shiva (Inventor)

    2010-01-01

    The present invention provides methods of DNA methylation detection. It provides methods of generating and detecting specific electronic signals that report the methylation status of targeted DNA molecules in biological samples. Two methods are described: direct and indirect detection of methylated DNA molecules in a nano transistor-based device. In direct detection, methylated target DNA molecules are captured on the sensing surface, changing the electrical properties of the nano transistor; these changes generate detectable electronic signals. In indirect detection, antibody-DNA conjugates are used to identify methylated DNA molecules, and RNA signal molecules are generated through an in vitro transcription process. These RNA molecules are captured on the sensing surface and change the electrical properties of the nano transistor, thereby generating detectable electronic signals.

  7. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    PubMed

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

    The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive and efficient detection technology that mainly comprises infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy and mass spectrometry. Rapid detection of Camellia sinensis growth process information and tea quality helps to realize the informatization and automation of tea production and to ensure tea quality and safety. This paper reviews its applications, covering the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues and so on), the quality evaluation of tea beverages and tea by-products, and instrumentation for tea quality determination and discrimination. It also briefly introduces trends in the determination of tea growth process information, sensors and industrial applications. In conclusion, spectral technology shows high potential to detect Camellia sinensis growth process information, to predict tea internal quality and to classify tea varieties and grades. Suitable chemometrics and preprocessing methods are helpful to improve model performance and remove redundancy, which makes portable instruments feasible. Future work on developing portable instruments and on-line detection systems is recommended to broaden practical application. The applications and research achievements of spectral technology concerning tea, including Camellia sinensis growth, tea production, the quality and safety of tea and tea by-products, as well as remaining problems to be solved, are outlined in this paper for the first time.

  8. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    PubMed

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as wavelet transform, filtering, morphology and contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Statistical methods are often appropriate for identifying large defects such as spots, whereas techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  9. DNA extraction methods for detecting genetically modified foods: A comparative study.

    PubMed

    Elsanhoty, Rafaat M; Ramadan, Mohamed Fawzy; Jany, Klaus Dieter

    2011-06-15

    The work presented in this manuscript compared six different methods for extracting DNA from raw maize and its derived products. The methods that gave the highest yield and quality of DNA were chosen to detect genetic modification in samples collected from the Egyptian market. The different methods were evaluated for extracting DNA from maize kernels (no treatment), maize flour (mechanical treatment), canned maize (sweet corn), frozen maize (sweet corn), maize starch, extruded maize, popcorn, corn flakes, maize snacks, and bread made from corn flour (mechanical and thermal treatments). The quality and quantity of the DNA extracted from the standards, containing known percentages of GMO material, and from the different food products were evaluated. For qualitative detection of GMO varieties in foods, the GMOScreen 35S/NOS test kit was used to screen the samples for genetic modification. Samples positive for the 35S promoter and/or the NOS terminator were identified by the standard methods adopted by the EU. All of the methods used yielded DNA of good quality. However, the purest DNA extracts were obtained using the DNA extraction kit (Roche), and this was generally the best method for extracting DNA from most of the maize-derived foods. The yield of DNA extracted from maize-derived foods was generally lower in the processed products. The results indicated that 17 samples were positive for the presence of the 35S promoter, while 34% of the samples were positive for the genetically modified maize line Bt-176. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. Method modification of the Legipid® Legionella fast detection test kit.

    PubMed

    Albalat, Guillermo Rodríguez; Broch, Begoña Bedrina; Bono, Marisa Jiménez

    2014-01-01

    Legipid(®) Legionella Fast Detection is a test based on combined magnetic immunocapture and enzyme-immunoassay (CEIA) for the detection of Legionella in water. The test is based on the use of anti-Legionella antibodies immobilized on magnetic microspheres. Target microorganism is preconcentrated by filtration. Immunomagnetic analysis is applied on these preconcentrated water samples in a final test portion of 9 mL. The test kit was certified by the AOAC Research Institute as Performance Tested Method(SM) (PTM) No. 111101 in a PTM validation which certifies the performance claims of the test method in comparison to the ISO reference method 11731-1998 and the revision 11731-2004 "Water Quality: Detection and Enumeration of Legionella pneumophila" in potable water, industrial water, and waste water. The modification of this test kit has been approved. The modification includes increasing the target analyte from L. pneumophila to Legionella species and adding an optical reader to the test method. In this study, 71 strains of Legionella spp. other than L. pneumophila were tested to determine its reactivity with the kit based on CEIA. All the strains of Legionella spp. tested by the CEIA test were confirmed positive by reference standard method ISO 11731. This test (PTM 111101) has been modified to include a final optical reading. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Two water matrixes were analyzed. Results show no statistically detectable difference between the test method and the reference culture method for the enumeration of Legionella spp. The relative level of detection was 93 CFU/volume examined (LOD50). For optical reading, the LOD was 40 CFU/volume examined and the LOQ was 60 CFU/volume examined. Results showed that the test Legipid Legionella Fast Detection is equivalent to the reference culture method for the enumeration of Legionella spp.

  11. Automatic detection method for mura defects on display film surface using modified Weber's law

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Muk; Lee, Seung-Ho

    2014-07-01

    We propose a method that automatically detects mura defects on display film surfaces using a modified version of Weber's law. The proposed method detects mura defects regardless of their properties and shapes by identifying regions that human vision perceives as mura, using pixel brightness and the distribution ratio of mura in the image histogram. The detection method comprises five stages. In the first stage, the display film surface image is acquired and a gray-level shift is performed. In the second and third stages, the image histogram is acquired and analyzed, respectively. In the fourth stage, the mura range is obtained, followed by postprocessing in the fifth stage. Evaluations of the proposed method conducted on 200 display film mura image samples indicate a maximum detection rate of approximately 95.5%. Further, applying the Semu index for luminance mura in flat panel display (FPD) image quality inspection indicates that the proposed method is more reliable than a popular conventional method.
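
    A minimal sketch of the underlying idea, flagging pixels whose relative deviation from the dominant grey level exceeds a Weber-like just-noticeable contrast and pruning rare grey levels via the histogram, is given below; the constants and the simplified two-step structure are assumptions, not the published five-stage algorithm.

        import numpy as np

        def mura_mask(gray, weber_fraction=0.02, min_area_ratio=1e-4):
            # Flag pixels whose relative deviation from the dominant (defect-free) grey
            # level exceeds a Weber-like just-noticeable contrast, then keep only grey
            # levels that occupy a non-negligible share of the histogram.
            g = gray.astype(float)
            background = np.median(g)
            contrast = np.abs(g - background) / max(background, 1.0)
            mask = contrast > weber_fraction
            levels, counts = np.unique(gray[mask], return_counts=True)
            keep = levels[counts / gray.size >= min_area_ratio]
            return mask & np.isin(gray, keep)

        img = np.full((200, 200), 128, dtype=np.uint8)
        img[60:90, 60:120] = 122                 # faint rectangular mura region
        print(mura_mask(img).sum())              # number of flagged pixels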

  12. [Application of THz technology to nondestructive detection of agricultural product quality].

    PubMed

    Jiang, Yu-ying; Ge, Hong-yi; Lian, Fei-yu; Zhang, Yuan; Xia, Shan-hong

    2014-08-01

    With the recent development of THz sources and detectors, applications of THz radiation to nondestructive testing and quality control have expanded in many fields, such as agriculture, safety inspection and quality control, medicine, biochemistry and communication. Compared with other detection techniques, THz radiation has low photon energy, good penetrating ability and a high signal-to-noise ratio, and can thus provide physical, chemical and biological information. This paper first introduces the basic concept and major properties of THz radiation, then gives an extensive review of recent research progress in detecting the quality of agricultural products via THz techniques, analyzes the existing shortcomings of THz detection, discusses potential applications, and finally proposes a new application of THz techniques to the quality assessment of stored grain.

  13. Classification methods to detect sleep apnea in adults based on respiratory and oximetry signals: a systematic review.

    PubMed

    Uddin, M B; Chow, C M; Su, S W

    2018-03-26

    Sleep apnea (SA), a common sleep disorder, can significantly decrease the quality of life, and is closely associated with major health risks such as cardiovascular disease, sudden death, depression, and hypertension. The normal diagnostic process of SA using polysomnography is costly and time consuming. In addition, the accuracy of different classification methods to detect SA varies with the use of different physiological signals. If an effective, reliable, and accurate classification method is developed, then the diagnosis of SA and its associated treatment will be time-efficient and economical. This study aims to systematically review the literature and present an overview of classification methods to detect SA using respiratory and oximetry signals and address the automated detection approach. Sixty-two included studies revealed the application of single and multiple signals (respiratory and oximetry) for the diagnosis of SA. Both airflow and oxygen saturation signals alone were effective in detecting SA in the case of binary decision-making, whereas multiple signals were good for multi-class detection. In addition, some machine learning methods were superior to the other classification methods for SA detection using respiratory and oximetry signals. To deal with the respiratory and oximetry signals, a good choice of classification method as well as the consideration of associated factors would result in high accuracy in the detection of SA. An accurate classification method should provide a high detection rate with an automated (independent of human action) analysis of respiratory and oximetry signals. Future high-quality automated studies using large samples of data from multiple patient groups or record batches are recommended.

  14. Application of Nemerow Index Method and Integrated Water Quality Index Method in Water Quality Assessment of Zhangze Reservoir

    NASA Astrophysics Data System (ADS)

    Zhang, Qian; Feng, Minquan; Hao, Xiaoyan

    2018-03-01

    [Objective] Based on historical water quality data from the Zhangze Reservoir over the last five years, the water quality was assessed by the integrated water quality identification index method and the Nemerow pollution index method. The results of the different evaluation methods were analyzed and compared, and the characteristics of each method were identified. [Methods] The suitability of the water quality assessment methods was compared and analyzed based on these results. [Results] The water quality tended to decrease over time, with 2016 being the year with the worst water quality. The sections with the worst water quality were the southern and northern sections. [Conclusion] The results produced by the traditional Nemerow index method fluctuated greatly in each water quality monitoring section and therefore could not effectively reveal the trend of water quality at each section. The combination of qualitative and quantitative measures in the comprehensive pollution index identification method meant it could evaluate the degree of water pollution as well as determine whether the river water was black and odorous; however, its evaluation results indicated relatively low water pollution. The results from the improved Nemerow index evaluation were better, as the single indicators and the evaluation results are in strong agreement; therefore the method is able to objectively reflect the water quality of each monitoring section and is more suitable for the water quality evaluation of the reservoir.
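
    For reference, the sketch below computes the classic Nemerow pollution index from single-factor indices Pi = Ci/Si; the paper's improved Nemerow variant and the integrated identification index modify this basic form, and the example values are invented.

        import math

        def nemerow_index(concentrations, standards):
            # Classic Nemerow pollution index: P = sqrt((mean(Pi)^2 + max(Pi)^2) / 2),
            # with single-factor indices Pi = Ci / Si.
            p = [c / s for c, s in zip(concentrations, standards)]
            p_mean = sum(p) / len(p)
            return math.sqrt((p_mean ** 2 + max(p) ** 2) / 2.0)

        # Toy data: measured values vs. assumed standard limits (illustrative units).
        conc = {"COD": 22.0, "NH3-N": 0.9, "TP": 0.18}
        limit = {"COD": 20.0, "NH3-N": 1.0, "TP": 0.20}
        print(round(nemerow_index(list(conc.values()), list(limit.values())), 3))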

  15. Detection and Analysis of the Quality of Ibuprofen Granules

    NASA Astrophysics Data System (ADS)

    Yu-bin, Ji; Xin, LI; Guo-song, Xin; Qin-bing, Xue

    2017-12-01

    Comprehensive quality testing of Ibuprofen Granules was carried out to ensure that they meet the provisions of the Chinese Pharmacopoeia. With reference to the Chinese Pharmacopoeia, the Ibuprofen Granules were tested by UV and HPLC in terms of particle size, volume deviation, loss on drying, dissolution rate and overall quality evaluation. The results indicated that the Ibuprofen Granules conform to the standards, are qualified and should be permitted to be marketed.

  16. Laboratory Evaluations of the Enterococcus qPCR Method for Recreational Water Quality Testing: Method Performance and Sources of Uncertainty in Quantitative Measurements

    EPA Science Inventory

    The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...

  17. Methods for automatic detection of artifacts in microelectrode recordings.

    PubMed

    Bakštein, Eduard; Sieger, Tomáš; Wild, Jiří; Novák, Daniel; Schneider, Jakub; Vostatek, Pavel; Urgošík, Dušan; Jech, Robert

    2017-10-01

    Extracellular microelectrode recording (MER) is a prominent technique for studies of extracellular single-unit neuronal activity. In order to achieve robust results in more complex analysis pipelines, it is necessary to have high-quality input data with a low amount of artifacts. We show that noise (mainly electromagnetic interference and motion artifacts) may affect more than 25% of the recording length in a clinical MER database. We present several methods for automatic detection of noise in MER signals, based on (i) unsupervised detection of stationary segments, (ii) large peaks in the power spectral density, and (iii) a classifier based on multiple time- and frequency-domain features. We evaluate the proposed methods on a manually annotated database of 5735 ten-second MER signals from 58 Parkinson's disease patients. The existing methods for artifact detection in single-channel MER that have been rigorously tested are based on unsupervised change-point detection. We show on an extensive real MER database that the presented techniques are better suited for the task of artifact identification and achieve much better results. The best-performing classifiers (bagging and decision tree) achieved artifact classification accuracy of up to 89% on an unseen test set and outperformed the unsupervised techniques by 5-10%. This was close to the level of agreement among raters using manual annotation (93.5%). We conclude that the proposed methods are suitable for automatic MER denoising and may help in the efficient elimination of undesirable signal artifacts. Copyright © 2017 Elsevier B.V. All rights reserved.
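
    The feature-plus-classifier route (iii) can be sketched as follows with a Welch power spectral density and a bagged decision tree; the feature set, the assumed sampling rate and the classifier settings are simplifications, not the published configuration.

        import numpy as np
        from scipy.signal import welch
        from sklearn.ensemble import BaggingClassifier
        from sklearn.tree import DecisionTreeClassifier

        def mer_features(segment, fs=24000):
            # A few simple time- and frequency-domain features for one MER segment;
            # fs is an assumed sampling rate, and this is only an illustrative subset.
            f, pxx = welch(segment, fs=fs, nperseg=2048)
            low_share = pxx[f < 1000].sum() / pxx.sum()    # power share below 1 kHz
            peakiness = pxx.max() / np.median(pxx)         # dominant spectral peak
            return [np.std(segment), np.max(np.abs(segment)), low_share, peakiness]

        def train_artifact_classifier(segments, labels):
            # labels: 1 = artifact, 0 = clean segment.
            X = np.array([mer_features(s) for s in segments])
            clf = BaggingClassifier(DecisionTreeClassifier(max_depth=5),
                                    n_estimators=50, random_state=0)
            return clf.fit(X, labels)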

  18. Quantifying clutter: A comparison of four methods and their relationship to bat detection

    Treesearch

    Joy M. O’Keefe; Susan C. Loeb; Hoke S. Hill Jr.; J. Drew Lanham

    2014-01-01

    The degree of spatial complexity in the environment, or clutter, affects the quality of foraging habitats for bats and their detection with acoustic systems. Clutter has been assessed in a variety of ways but there are no standardized methods for measuring clutter. We compared four methods (Visual Clutter, Cluster, Single Variable, and Clutter Index) and related these...

  19. Comparative studies of copy number variation detection methods for next-generation sequencing technologies.

    PubMed

    Duan, Junbo; Zhang, Ji-Gang; Deng, Hong-Wen; Wang, Yu-Ping

    2013-01-01

    Copy number variation (CNV) has played an important role in studies of susceptibility or resistance to complex diseases. Traditional methods such as fluorescence in situ hybridization (FISH) and array comparative genomic hybridization (aCGH) suffer from low resolution of genomic regions. Following the emergence of next generation sequencing (NGS) technologies, CNV detection methods based on the short read data have recently been developed. However, due to the relatively young age of the procedures, their performance is not fully understood. To help investigators choose suitable methods to detect CNVs, comparative studies are needed. We compared six publicly available CNV detection methods: CNV-seq, FREEC, readDepth, CNVnator, SegSeq and event-wise testing (EWT). They are evaluated both on simulated and real data with different experiment settings. The receiver operating characteristic (ROC) curve is employed to demonstrate the detection performance in terms of sensitivity and specificity, box plot is employed to compare their performances in terms of breakpoint and copy number estimation, Venn diagram is employed to show the consistency among these methods, and F-score is employed to show the overlapping quality of detected CNVs. The computational demands are also studied. The results of our work provide a comprehensive evaluation on the performances of the selected CNV detection methods, which will help biological investigators choose the best possible method.
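
    One common way to score such comparisons, and a plausible reading of the F-score step, is a reciprocal-overlap match between detected and true CNV intervals; the 50% overlap rule below is a convention assumed for illustration, not necessarily the criterion used in the cited comparison.

        def interval_overlap(a, b):
            return max(0, min(a[1], b[1]) - max(a[0], b[0]))

        def cnv_f_score(detected, truth, min_reciprocal_overlap=0.5):
            # Score detected CNV intervals (start, end) against a truth set using a
            # reciprocal-overlap rule (assumed threshold).
            def matches(x, y):
                ov = interval_overlap(x, y)
                return (ov >= min_reciprocal_overlap * (x[1] - x[0]) and
                        ov >= min_reciprocal_overlap * (y[1] - y[0]))
            tp = sum(any(matches(d, t) for t in truth) for d in detected)
            precision = tp / len(detected) if detected else 0.0
            recall = sum(any(matches(t, d) for d in detected) for t in truth) / len(truth)
            if precision + recall == 0:
                return 0.0
            return 2 * precision * recall / (precision + recall)

        truth = [(1000, 5000), (20000, 26000)]
        detected = [(1200, 5200), (40000, 42000)]
        print(round(cnv_f_score(detected, truth), 2))   # 0.5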

  20. Suitability of Optical, Physical and Chemical Measurements for Detection of Changes in Bacterial Drinking Water Quality

    PubMed Central

    Ikonen, Jenni; Pitkänen, Tarja; Miettinen, Ilkka T.

    2013-01-01

    In this study, different optical, physical and chemical measurements were tested for their capacity to detect changes in water quality. The tests included UV-absorbance at 254 nm, absorbance at 420 nm, turbidity, particle counting, temperature, pH, electric conductivity (EC), free chlorine concentration and ATP concentration measurements. Special emphasis was given to investigating the potential for measurement tools to detect changes in bacterial concentrations in drinking water. Bacterial colony counts (CFU) and total bacterial cell counts (TBC) were used as reference methods for assessing the bacterial water quality. The study consists of a series of laboratory scale experiments: monitoring of regrowth of Pseudomonas fluorescens, estimation of the detection limits for optical measurements using Escherichia coli dilutions, verification of the relationships by analysing grab water samples from various distribution systems and utilisation of the measurements in the case of an accidentally contaminated distribution network. We found significant correlations between the tested measurements and the bacterial water quality. As the bacterial contamination of water often co-occurs with the intrusion of matrixes containing mainly non-bacterial components, the tested measurement tools can be considered to have the potential to rapidly detect any major changes in drinking water quality. PMID:24284353

  1. Suitability of optical, physical and chemical measurements for detection of changes in bacterial drinking water quality.

    PubMed

    Ikonen, Jenni; Pitkänen, Tarja; Miettinen, Ilkka T

    2013-10-25

    In this study, different optical, physical and chemical measurements were tested for their capacity to detect changes in water quality. The tests included UV-absorbance at 254 nm, absorbance at 420 nm, turbidity, particle counting, temperature, pH, electric conductivity (EC), free chlorine concentration and ATP concentration measurements. Special emphasis was given to investigating the potential for measurement tools to detect changes in bacterial concentrations in drinking water. Bacterial colony counts (CFU) and total bacterial cell counts (TBC) were used as reference methods for assessing the bacterial water quality. The study consists of a series of laboratory scale experiments: monitoring of regrowth of Pseudomonas fluorescens, estimation of the detection limits for optical measurements using Escherichia coli dilutions, verification of the relationships by analysing grab water samples from various distribution systems and utilisation of the measurements in the case of an accidentally contaminated distribution network. We found significant correlations between the tested measurements and the bacterial water quality. As the bacterial contamination of water often co-occurs with the intrusion of matrixes containing mainly non-bacterial components, the tested measurement tools can be considered to have the potential to rapidly detect any major changes in drinking water quality.

  2. Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan

    2015-05-01

    With the continuous improvement of living standards and changes in dietary structure, consumers show a rising and persistent demand for better meat quality. Colour, pH value and cooking loss are important quality attributes when evaluating meat, and simultaneous nondestructive detection of multiple meat quality parameters is in demand in the production and processing of meat and meat products. The objective of this research was to compare the effectiveness of two bands for rapid, nondestructive and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected with a dual-band visible/near-infrared spectroscopy system covering 350-1100 nm and 1000-2600 nm. Colour, pH value and cooking loss were then determined by standard methods as reference values. The standard normal variate transform (SNVT) was employed to eliminate spectral noise. A spectrum connection method was put forward for effective integration of the dual-band spectrum, to make full use of the available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra, respectively. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction results, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054 and 0.8789 for L*, a*, b*, pH value and cooking loss, respectively. This is mainly because the dual-band spectrum provides sufficient and comprehensive information reflecting the quality attributes. Data fusion of the dual-band spectrum could significantly improve the prediction of pork quality parameters.
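
    A minimal sketch of the modelling chain (SNV preprocessing, a simple concatenation standing in for the spectrum connection step, and a cross-validated PLSR model) is given below using scikit-learn; the component count and the synthetic data are assumptions, not the study's settings.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        def snv(spectra):
            # Standard normal variate transform: centre and scale each spectrum.
            x = np.asarray(spectra, dtype=float)
            return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

        def fit_dual_band_plsr(band1, band2, y, n_components=8):
            # Concatenate the two preprocessed bands and fit PLSR; simple concatenation
            # stands in for the paper's spectrum-connection step.
            X = np.hstack([snv(band1), snv(band2)])
            model = PLSRegression(n_components=n_components)
            y_cv = cross_val_predict(model, X, y, cv=5).ravel()
            r_cv = np.corrcoef(y_cv, y)[0, 1]    # correlation of cross-validated predictions
            return model.fit(X, y), r_cv

        # Synthetic usage example (random spectra, invented pH reference values).
        rng = np.random.default_rng(1)
        b1, b2 = rng.normal(size=(60, 150)), rng.normal(size=(60, 200))
        y = rng.normal(5.6, 0.2, 60)
        model, r_cv = fit_dual_band_plsr(b1, b2, y)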

  3. [Quant efficiency of the detection as a quality parameter of the visualization equipment].

    PubMed

    Morgun, O N; Nemchenko, K E; Rogov, Iu V

    2003-01-01

    The paper defines the critical notion of "quantum efficiency of detection" (detective quantum efficiency, DQE). Different methods of specifying the DQE are discussed, including techniques for determining the DQE of a whole unit and ways of finding the DQE at the limiting spatial frequency. Particular attention is given to the notion of DQE at zero frequency. Finally, difficulties in determining this parameter, as well as its disadvantages as a parameter characterizing the quality of X-ray imaging systems, are discussed.

  4. Automated Fall Detection With Quality Improvement “Rewind” to Reduce Falls in Hospital Rooms

    PubMed Central

    Rantz, Marilyn J.; Banerjee, Tanvi S.; Cattoor, Erin; Scott, Susan D.; Skubic, Marjorie; Popescu, Mihail

    2014-01-01

    The purpose of this study was to test the implementation of a fall detection and “rewind” privacy-protecting technique using the Microsoft® Kinect™ to not only detect but prevent falls from occurring in hospitalized patients. Kinect sensors were placed in six hospital rooms in a step-down unit and data were continuously logged. Prior to implementation with patients, three researchers performed a total of 18 falls (walking and then falling down or falling from the bed) and 17 non-fall events (crouching down, stooping down to tie shoe laces, and lying on the floor). All falls and non-falls were correctly identified using automated algorithms to process Kinect sensor data. During the first 8 months of data collection, processing methods were perfected to manage data and provide a “rewind” method to view events that led to falls for post-fall quality improvement process analyses. Preliminary data from this feasibility study show that using the Microsoft Kinect sensors provides detection of falls, fall risks, and facilitates quality improvement after falls in real hospital environments unobtrusively, while taking into account patient privacy. PMID:24296567

  5. A method of rapidly evaluating image quality of NED optical system

    NASA Astrophysics Data System (ADS)

    Sun, Qi; Qiu, Chuankai; Yang, Huan

    2014-11-01

    In recent years, with the development of micro-display technology, advanced optics, and supporting software and hardware, near-to-eye display (NED) optical systems have a wide range of potential applications in the fields of entertainment and virtual reality. However, research on evaluating the image quality of this kind of optical system is comparatively lagging behind. Although some evaluation methods and equipment exist, they cannot be applied in commercial production because of their complex operation and inaccuracy. In this paper, a method is proposed and a Rapid Evaluation System (RES) is designed to evaluate the image quality of the optical system rapidly and accurately. First, a set of parameters to which the eye is sensitive and which express system quality is extracted and quantified as criteria, so that evaluation standards can be established. Then, these parameters can be measured by the RES, which consists of a micro-display, a CCD camera, a computer and related components. Through calibration, the measurement results of the RES are accurate and credible, and the relationship between objective measurement, subjective evaluation and the RES is established. After that, the image quality of an optical system can be evaluated simply by measuring its parameters. The RES is simple, and the evaluation results are accurate and consistent with human vision. Therefore, the method can be used not only for optimizing the design of optical systems but also for evaluation in commercial production.

  6. Detection of License Plate using Sliding Window, Histogram of Oriented Gradient, and Support Vector Machines Method

    NASA Astrophysics Data System (ADS)

    Astawa, INGA; Gusti Ngurah Bagus Caturbawa, I.; Made Sajayasa, I.; Dwi Suta Atmaja, I. Made Ari

    2018-01-01

    License plate recognition is usually used as part of a larger system, such as a parking system. License plate detection is considered the most important step in a license plate recognition system. We propose methods that can be used to detect the vehicle plate on a mobile phone. In this paper, we used the sliding window, histogram of oriented gradients (HOG) and support vector machine (SVM) methods for license plate detection, so that the detection rate increases even when the image quality is poor. The image is processed by the sliding window method in order to find the plate position, and feature extraction and classification at each window position are done by the HOG and SVM methods. Good results were obtained in this research, with an accuracy of 96%.
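
    The pipeline described above can be sketched with standard HOG and linear-SVM implementations; the window size, stride and single-scale scan below are assumptions, and a practical detector would add an image pyramid and non-maximum suppression.

        import numpy as np
        from skimage.feature import hog
        from sklearn.svm import LinearSVC

        WIN_H, WIN_W, STEP = 40, 120, 16          # assumed plate-sized window and stride

        def hog_features(patch):
            return hog(patch, orientations=9, pixels_per_cell=(8, 8),
                       cells_per_block=(2, 2), block_norm="L2-Hys")

        def train_plate_classifier(plate_patches, background_patches):
            X = [hog_features(p) for p in plate_patches + background_patches]
            y = [1] * len(plate_patches) + [0] * len(background_patches)
            return LinearSVC(C=1.0).fit(X, y)

        def detect_plates(gray, clf):
            # Slide a fixed-size window over the image and keep windows the SVM
            # classifies as plate (single scale only, for brevity).
            hits = []
            for top in range(0, gray.shape[0] - WIN_H + 1, STEP):
                for left in range(0, gray.shape[1] - WIN_W + 1, STEP):
                    patch = gray[top:top + WIN_H, left:left + WIN_W]
                    if clf.predict([hog_features(patch)])[0] == 1:
                        hits.append((top, left, WIN_H, WIN_W))
            return hits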

  7. Improved Statistical Method For Hydrographic Climatic Records Quality Control

    NASA Astrophysics Data System (ADS)

    Gourrion, J.; Szekely, T.

    2016-02-01

    Climate research benefits from the continuous development of global in-situ hydrographic networks over the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit the next version of the Copernicus CMEMS dataset.
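
    The core idea, validity intervals taken from historical extremes rather than mean/standard-deviation limits, can be sketched as follows; the cell definition and the padding margin are assumptions, not the CORA 4.2 settings.

        import numpy as np

        def build_validity_intervals(ref_values, ref_cells, pad=0.1):
            # Min/max validity interval per spatial cell from a historical reference
            # dataset, padded by a small margin (both the cell labels and the padding
            # are illustrative assumptions).
            intervals = {}
            cells = np.asarray(ref_cells)
            for cell in set(ref_cells):
                vals = ref_values[cells == cell]
                lo, hi = vals.min(), vals.max()
                margin = pad * (hi - lo)
                intervals[cell] = (lo - margin, hi + margin)
            return intervals

        def flag_observations(values, cells, intervals):
            flags = []
            for v, c in zip(values, cells):
                lo, hi = intervals.get(c, (-np.inf, np.inf))
                flags.append(not (lo <= v <= hi))     # True = suspicious observation
            return flags

        hist_t = np.array([12.1, 13.4, 12.8, 11.9, 13.0])
        hist_cell = ["45N_10W_0-50m"] * 5
        iv = build_validity_intervals(hist_t, hist_cell)
        print(flag_observations([12.5, 16.2], ["45N_10W_0-50m"] * 2, iv))   # [False, True]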

  8. Statistical methods for convergence detection of multi-objective evolutionary algorithms.

    PubMed

    Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J

    2009-01-01

    In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs and on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, needing fewer function evaluations while preserving good approximation quality at the same time.
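
    The online stopping rule can be sketched as below: track a performance indicator per generation and stop when its recent variance or fitted trend falls below a threshold; the window length and thresholds are illustrative, not the published values.

        import numpy as np

        def should_stop(indicator_history, window=20, var_threshold=1e-6, trend_threshold=1e-5):
            # Online convergence check on a per-generation performance indicator
            # (e.g. hypervolume): stop when the recent variance falls below a threshold
            # or the fitted linear trend stagnates.
            if len(indicator_history) < window:
                return False
            recent = np.asarray(indicator_history[-window:], dtype=float)
            slope = np.polyfit(np.arange(window), recent, 1)[0]
            return recent.var() < var_threshold or abs(slope) < trend_threshold

        # Usage inside an MOEA loop (population update omitted):
        # history.append(hypervolume(current_front, reference_point))
        # if should_stop(history): break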

  9. Faint Debris Detection by Particle Based Track-Before-Detect Method

    NASA Astrophysics Data System (ADS)

    Uetsuhara, M.; Ikoma, N.

    2014-09-01

    This study proposes a particle method to detect faint debris, which is hardly visible in a single frame, from an image sequence based on the concept of track-before-detect (TBD). The most widely used detection approach is detect-before-track (DBT), which first detects target signals in each single frame by distinguishing intensity differences between foreground and background, and then associates the signals of each target between frames. DBT is capable of tracking bright targets but is limited: it must account for the presence of false signals and has difficulty recovering from false associations. On the other hand, TBD methods try to track targets without explicitly detecting the signals, followed by evaluation of the goodness of each track to obtain detection results. TBD has an advantage over DBT in detecting weak signals near the background level in a single frame. However, conventional TBD methods for debris detection apply a brute-force search over candidate tracks, and the true track is then selected manually from the candidates. To reduce the significant drawbacks of brute-force search and a not fully automated process, this study proposes a faint debris detection algorithm based on a particle TBD method consisting of sequential update of the target state and heuristic search of the initial state. The state consists of the position, velocity direction and magnitude, and size of the debris in the image at a single frame. The sequential update process is implemented by a particle filter (PF). The PF is an optimal filtering technique that requires an initial distribution of the target state as prior knowledge. An evolutionary algorithm (EA) is utilized to search for the initial distribution. The EA iteratively applies propagation and likelihood evaluation of particles for the same image sequences, and the resulting set of particles is used as the initial distribution of the PF. This paper describes the algorithm of the proposed faint debris detection method. The algorithm demonstrates performance on image sequences acquired
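
    One sequential-update step of such a particle-based TBD scheme might look like the sketch below (constant-velocity prediction, intensity-based weighting, resampling on low effective sample size); the motion and likelihood models are simplifications, the size component is omitted, and the EA initialisation is not reproduced.

        import numpy as np

        def pf_step(particles, weights, frame, rng, pos_noise=1.0, vel_noise=0.1):
            # One bootstrap particle-filter update for track-before-detect.
            # particles: float array (N, 4) of [x, y, vx, vy] in pixel units.
            n = len(particles)
            # predict: constant-velocity motion plus process noise
            particles[:, 0:2] += particles[:, 2:4] + rng.normal(0, pos_noise, (n, 2))
            particles[:, 2:4] += rng.normal(0, vel_noise, (n, 2))
            # weight: background-subtracted image intensity at the predicted position
            xs = np.clip(particles[:, 0].astype(int), 0, frame.shape[1] - 1)
            ys = np.clip(particles[:, 1].astype(int), 0, frame.shape[0] - 1)
            weights *= np.maximum(frame[ys, xs] - np.median(frame), 1e-3)
            weights /= weights.sum()
            # resample when the effective sample size collapses
            if 1.0 / np.sum(weights ** 2) < n / 2:
                idx = rng.choice(n, size=n, p=weights)
                particles, weights = particles[idx], np.full(n, 1.0 / n)
            return particles, weights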

  10. Guidelines for the detection of Trichinella larvae at the slaughterhouse in a quality assurance system.

    PubMed

    Rossi, Patrizia; Pozio, Edoardo

    2008-01-01

    The European Community Regulation (EC) No. 2075/2005 lays down specific rules on official controls for the detection of Trichinella in fresh meat for human consumption, recommending the pooled-sample digestion method as the reference method. The aim of this document is to provide specific guidance to implement an appropriate Trichinella digestion method by a laboratory accredited according to the ISO/IEC 17025:2005 international standard, and performing microbiological testing following the EA-04/10:2002 international guideline. Technical requirements for the correct implementation of the method, such as personnel competence, specific equipment and reagents, validation of the method, reference materials, sampling, quality assurance of results and quality control of performance, are provided, pointing out the critical control points for the correct implementation of the digestion method.

  11. Landsat change detection can aid in water quality monitoring

    NASA Technical Reports Server (NTRS)

    Macdonald, H. C.; Steele, K. F.; Waite, W. P.; Shinn, M. R.

    1977-01-01

    Comparison between Landsat-1 and -2 imagery of Arkansas provided evidence of significant land use changes during the 1972-75 time period. Analysis of Arkansas historical water quality information has shown conclusively that whereas point source pollution generally can be detected by use of water quality data collected by state and federal agencies, sampling methodologies for nonpoint source contamination attributable to surface runoff are totally inadequate. The expensive undertaking of monitoring all nonpoint sources for numerous watersheds can be lessened by implementing Landsat change detection analyses.

  12. A Precise Visual Method for Narrow Butt Detection in Specular Reflection Workpiece Welding

    PubMed Central

    Zeng, Jinle; Chang, Baohua; Du, Dong; Hong, Yuxiang; Chang, Shuhe; Zou, Yirong

    2016-01-01

    During the complex path workpiece welding, it is important to keep the welding torch aligned with the groove center using a visual seam detection method, so that the deviation between the torch and the groove can be corrected automatically. However, when detecting the narrow butt of a specular reflection workpiece, the existing methods may fail because of the extremely small groove width and the poor imaging quality. This paper proposes a novel detection method to solve these issues. We design a uniform surface light source to get high signal-to-noise ratio images against the specular reflection effect, and a double-line laser light source is used to obtain the workpiece surface equation relative to the torch. Two light sources are switched on alternately and the camera is synchronized to capture images when each light is on; then the position and pose between the torch and the groove can be obtained nearly at the same time. Experimental results show that our method can detect the groove effectively and efficiently during the welding process. The image resolution is 12.5 μm and the processing time is less than 10 ms per frame. This indicates our method can be applied to real-time narrow butt detection during high-speed welding process. PMID:27649173

  13. A Precise Visual Method for Narrow Butt Detection in Specular Reflection Workpiece Welding.

    PubMed

    Zeng, Jinle; Chang, Baohua; Du, Dong; Hong, Yuxiang; Chang, Shuhe; Zou, Yirong

    2016-09-13

    During the complex path workpiece welding, it is important to keep the welding torch aligned with the groove center using a visual seam detection method, so that the deviation between the torch and the groove can be corrected automatically. However, when detecting the narrow butt of a specular reflection workpiece, the existing methods may fail because of the extremely small groove width and the poor imaging quality. This paper proposes a novel detection method to solve these issues. We design a uniform surface light source to get high signal-to-noise ratio images against the specular reflection effect, and a double-line laser light source is used to obtain the workpiece surface equation relative to the torch. Two light sources are switched on alternately and the camera is synchronized to capture images when each light is on; then the position and pose between the torch and the groove can be obtained nearly at the same time. Experimental results show that our method can detect the groove effectively and efficiently during the welding process. The image resolution is 12.5 μm and the processing time is less than 10 ms per frame. This indicates our method can be applied to real-time narrow butt detection during high-speed welding process.

  14. Genomic Data Quality Impacts Automated Detection of Lateral Gene Transfer in Fungi

    PubMed Central

    Dupont, Pierre-Yves; Cox, Murray P.

    2017-01-01

    Lateral gene transfer (LGT, also known as horizontal gene transfer), an atypical mechanism of transferring genes between species, has almost become the default explanation for genes that display an unexpected composition or phylogeny. Numerous methods of detecting LGT events all rely on two fundamental strategies: primary structure composition or gene tree/species tree comparisons. Discouragingly, the results of these different approaches rarely coincide. With the wealth of genome data now available, detection of laterally transferred genes is increasingly being attempted in large uncurated eukaryotic datasets. However, detection methods depend greatly on the quality of the underlying genomic data, which are typically complex for eukaryotes. Furthermore, given the automated nature of genomic data collection, it is typically impractical to manually verify all protein or gene models, orthology predictions, and multiple sequence alignments, requiring researchers to accept a substantial margin of error in their datasets. Using a test case comprising plant-associated genomes across the fungal kingdom, this study reveals that composition- and phylogeny-based methods have little statistical power to detect laterally transferred genes. In particular, phylogenetic methods reveal extreme levels of topological variation in fungal gene trees, the vast majority of which show departures from the canonical species tree. Therefore, it is inherently challenging to detect LGT events in typical eukaryotic genomes. This finding is in striking contrast to the large number of claims for laterally transferred genes in eukaryotic species that routinely appear in the literature, and questions how many of these proposed examples are statistically well supported. PMID:28235827

  15. On-line bolt-loosening detection method of key components of running trains using binocular vision

    NASA Astrophysics Data System (ADS)

    Xie, Yanxia; Sun, Junhua

    2017-11-01

    Bolt loosening, as one of the hidden faults, affects the running quality of trains and can even cause serious safety accidents. However, existing fault detection approaches based on two-dimensional images cannot detect bolt loosening due to the lack of depth information. Therefore, we propose a novel online bolt-loosening detection method using binocular vision. First, a target detection model based on a convolutional neural network (CNN) is used to locate the target regions. Then, stereo matching and three-dimensional reconstruction are performed to detect bolt-loosening faults. The experimental results show that the looseness of multiple bolts can be characterized by the method simultaneously. The measurement repeatability and precision are less than 0.03 mm and 0.09 mm respectively, and the relative error is within 1.09%.
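
    The depth cue that two-dimensional methods lack comes from the stereo step; a minimal sketch of the rectified-stereo depth relation used to turn disparity into a bolt-head gap is given below, with all camera parameters and disparities invented for illustration (the published CNN localisation and dense matching steps are not reproduced).

        def depth_from_disparity(disparity_px, focal_px, baseline_mm):
            # Standard rectified-stereo relation Z = f * B / d.
            return focal_px * baseline_mm / disparity_px

        # Illustrative numbers only: depth of the bolt head and of its seating surface.
        z_bolt = depth_from_disparity(disparity_px=289.4, focal_px=2400.0, baseline_mm=120.0)
        z_seat = depth_from_disparity(disparity_px=288.0, focal_px=2400.0, baseline_mm=120.0)
        gap_mm = z_seat - z_bolt
        print(round(gap_mm, 2))   # compare with the gap in the tightened reference state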

  16. Detection of outliers in water quality monitoring samples using functional data analysis in San Esteban estuary (Northern Spain).

    PubMed

    Díaz Muñiz, C; García Nieto, P J; Alonso Fernández, J R; Martínez Torres, J; Taboada, J

    2012-11-15

    Water quality controls involve a large number of variables and observations, often subject to some outliers. An outlier is an observation that is numerically distant from the rest of the data or that appears to deviate markedly from other members of the sample in which it occurs. An interesting analysis is to find those observations that produce measurements that are different from the pattern established in the sample. Therefore, identification of atypical observations is an important concern in water quality monitoring and a difficult task because of the multivariate nature of water quality data. Our study provides a new method for detecting outliers in water quality monitoring parameters, using oxygen and turbidity as indicator variables. Until now, methods were based on considering the different parameters as a vector whose components were their concentration values. Our approach lies in considering water quality monitoring through time as curves instead of vectors, that is to say, the data set of the problem is considered as a time-dependent function and not as a set of discrete values at different time instants. The methodology, which is based on the concept of functional depth, was applied to the detection of outliers in water quality monitoring samples in San Esteban estuary. Results were discussed in terms of origin, causes, etc., and compared with those obtained using the conventional method based on vector comparison. Finally, the advantages of the functional method are presented. Copyright © 2012 Elsevier B.V. All rights reserved.
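
    A simplified stand-in for the functional-depth criterion is sketched below: each monitoring curve is ranked by the inverse of its mean L1 distance to the other curves and the lowest-depth tail is flagged; the study uses formal functional depth measures, which this toy version does not reproduce.

        import numpy as np

        def l1_depth_outliers(curves, alpha=0.05):
            # Rank curves (rows = curves, columns = time points) by a simple depth
            # surrogate: the inverse of the mean L1 distance to all other curves.
            # Curves in the lowest-depth alpha tail are flagged as outliers.
            X = np.asarray(curves, dtype=float)
            n = len(X)
            dist = np.abs(X[:, None, :] - X[None, :, :]).mean(axis=2)   # pairwise mean |diff|
            depth = 1.0 / (dist.sum(axis=1) / (n - 1) + 1e-12)
            cutoff = np.quantile(depth, alpha)
            return np.where(depth <= cutoff)[0]

        rng = np.random.default_rng(2)
        normal_days = rng.normal(8.0, 0.3, size=(40, 24))   # e.g. hourly dissolved oxygen
        odd_day = rng.normal(5.0, 0.3, size=(1, 24))        # anomalous curve
        print(l1_depth_outliers(np.vstack([normal_days, odd_day]), alpha=0.02))  # flags index 40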

  17. Detection of water quality trends at high, median, and low flow in a Catskill Mountain stream, New York, through a new statistical method

    USGS Publications Warehouse

    Murdoch, Peter S.; Shanley, James B.

    2006-01-01

    The effects of changes in acid deposition rates resulting from the Clean Air Act Amendments of 1990 should first appear in stream waters during rainstorms and snowmelt, when the surface of the watershed is most hydrologically connected to the stream. Early detection of improved stream water quality would be possible if trends at high flow could be determined separately. Trends in concentrations of sulfate (SO42−), nitrate (NO3−), calcium plus magnesium (Ca2++Mg2+), and acid‐neutralizing capacity (ANC) in Biscuit Brook, Catskill Mountains, New York, were assessed through segmented regression analysis (SRA). The method uses annual concentration‐to‐discharge relations to predict concentrations for specific discharges, then compares those annual values to determine trends at specific discharge levels. Median‐flow trends using SRA were comparable to those predicted by the seasonal Kendall tau test and a multiple regression residual analysis. All of these methods show that stream water SO42− concentrations have decreased significantly since 1983; Ca2++Mg2+ concentrations have decreased at a steady but slower rate than SO42−; and ANC shows no trend. The new SRA method, however, reveals trends that differ at specified flow levels. ANC has increased, and NO3− concentrations have decreased at high flows, but neither has changed as significantly at low flows. The general downward trend in SO42− flattened at median flow and reversed at high flow between 1997 and 2002. The reversal of the high‐flow SO42− trend is consistent with increases in SO42− concentrations in both precipitation and soil solutions at Biscuit Brook. Separate calculation of high‐flow trends provides resource managers with an early detection system for assessing changes in water quality resulting from changes in acidic deposition.
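
    The flow-specific trend idea can be illustrated with a short script: fit each year's concentration-to-discharge relation, predict the concentration at a fixed reference discharge (for example, a high-flow value), and test the annual predictions for a monotonic trend. The sketch below uses synthetic data and a Kendall tau test; it illustrates the concept rather than reproducing the USGS implementation.

        # Sketch of the flow-specific trend idea behind SRA: annual regression of
        # concentration on log(discharge), prediction at a fixed high-flow discharge,
        # then a trend test across the annual predictions. Data are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        years = np.arange(1983, 2003)
        high_flow = 2.5                                   # reference discharge (illustrative)

        predicted = []
        for i, yr in enumerate(years):
            q = rng.lognormal(mean=0.0, sigma=0.8, size=40)               # discharges sampled that year
            conc = 120 - 1.5 * i + 10 * np.log(q) + rng.normal(0, 5, 40)  # synthetic SO4 with a downward trend
            slope, intercept, *_ = stats.linregress(np.log(q), conc)
            predicted.append(intercept + slope * np.log(high_flow))

        tau, p = stats.kendalltau(years, predicted)
        print(f"high-flow trend: tau={tau:.2f}, p={p:.4f}")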

  18. Chemical investigation of commercial grape seed derived products to assess quality and detect adulteration.

    PubMed

    Villani, Tom S; Reichert, William; Ferruzzi, Mario G; Pasinetti, Giulio M; Simon, James E; Wu, Qingli

    2015-03-01

    Fundamental concerns in quality control arise from the increasing use of grape seed extract (GSE) and the complex chemical composition of GSE. Proanthocyanidin monomers and oligomers are the major bioactive compounds in GSE. Given the absence of standardized quality criteria, large variation exists in the composition of commercial GSE supplements. Twenty-one commercial GSE-containing products were purchased and chemically profiled using HPLC/UV/MS; the major compounds were quantitated and compared against authenticated grape seed extract, peanut skin extract, and pine bark extract. The antioxidant capacity and total polyphenol content of each sample were also determined and compared using standard techniques. Nine products were found to be adulterated with peanut skin extract. A wide degree of variability in chemical composition was detected in commercial products, demonstrating the need for the development of quality control standards for GSE. A TLC method was developed to allow rapid and inexpensive detection of adulteration of GSE with peanut skin. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Gastrointestinal symptoms and quality of life in screen-detected celiac disease.

    PubMed

    Paavola, Aku; Kurppa, Kalle; Ukkola, Anniina; Collin, Pekka; Lähdeaho, Marja-Leena; Huhtala, Heini; Mäki, Markku; Kaukinen, Katri

    2012-10-01

    Active serological screening has proved an effective means of increasing the diagnostic rate in celiac disease. The effects of a long-term gluten-free diet on possible gastrointestinal symptoms and psychological well-being in screen-detected patients have nevertheless remained obscure. Abdominal symptoms and quality of life were measured in a large cohort of treated screen-detected celiac adults. Comparisons were made with corresponding symptom-detected patients and with non-celiac controls. Dietary adherence was assessed both by structured interview and by serological testing. In both the screen- and symptom-detected celiac groups, 88% of the patients were adherent. On a diet, both screen- and symptom-detected patients reported significantly more gastrointestinal symptoms than non-celiac controls. Those screen-detected patients who reported having no symptoms at the time of diagnosis also remained asymptomatic on the diet. Despite persistent symptoms, psychological well-being in screen-detected patients was comparable with that in non-celiac controls, whereas the symptom-detected patients showed lower quality of life. Long-term treated screen-detected celiac patients, especially women, suffer from gastrointestinal symptoms on a gluten-free diet similarly to symptom-detected patients. However, despite a similar frequency of persistent symptoms, quality of life was unimpaired in the screen-detected group but remained low in the symptom-detected group. Copyright © 2012 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  20. Natural Fatigue Crack Initiation and Detection in High Quality Spur Gears

    DTIC Science & Technology

    2012-06-01

    Natural Fatigue Crack Initiation and Detection in High Quality Spur Gears, by David “Blake” Stringer, Ph.D., Kelsen E. LaBerge, Ph.D., Cory... (report, June 2012). Only report-documentation metadata was captured for this record; no abstract is available.

  1. A comparison of moving object detection methods for real-time moving object detection

    NASA Astrophysics Data System (ADS)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification, and face detection to military surveillance. Many methods have been developed for moving object detection, but it is very difficult to find one that works in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods that can be implemented in software on a desktop or laptop for real-time object detection. Several moving object detection methods are noted in the literature, but few of them are suitable for real-time moving object detection, and most of those that do run in real time are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet-based and optical flow-based methods. The work is based on evaluating these four methods using two different sets of cameras and two different scenes. The methods have been implemented in MATLAB, and results are compared based on completeness of detected objects, noise, sensitivity to lighting changes, processing time, etc. After comparison, it is observed that the optical flow-based method takes the least processing time and successfully detects the boundaries of moving objects, which implies that it can be implemented for real-time moving object detection.
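
    As a concrete point of reference for one of the compared families, the sketch below runs a Gaussian-mixture background subtractor from OpenCV over a video and counts foreground blobs per frame; the video path and area threshold are placeholder assumptions, and this is an illustration of the technique rather than the paper's MATLAB implementation.

        # Minimal sketch of Gaussian-mixture background subtraction using OpenCV.
        # "traffic.avi" is a placeholder path; parameters are illustrative.
        import cv2

        cap = cv2.VideoCapture("traffic.avi")
        subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16, detectShadows=True)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)                       # foreground mask (moving pixels)
            mask = cv2.medianBlur(mask, 5)                       # suppress salt-and-pepper noise
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            movers = [c for c in contours if cv2.contourArea(c) > 300]   # illustrative size threshold
            print(f"moving objects in frame: {len(movers)}")

        cap.release()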

  2. Image Quality Assessment for Fake Biometric Detection: Application to Iris, Fingerprint, and Face Recognition.

    PubMed

    Galbally, Javier; Marcel, Sébastien; Fierrez, Julian

    2014-02-01

    Ensuring the actual presence of a real legitimate trait, as opposed to a fake self-manufactured synthetic or reconstructed sample, is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. In this paper, we present a novel software-based fake detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of biometric recognition frameworks, by adding liveness assessment in a fast, user-friendly, and non-intrusive manner, through the use of image quality assessment. The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image (i.e., the same image acquired for authentication purposes) to distinguish between legitimate and impostor samples. The experimental results, obtained on publicly available data sets of fingerprint, iris, and 2D face, show that the proposed method is highly competitive compared with other state-of-the-art approaches and that the analysis of the general image quality of real biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake traits.
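
    The core recipe, extracting generic image quality features and feeding them to a classifier, can be sketched compactly. The example below computes two illustrative features (MSE and PSNR of the image against a Gaussian-smoothed version of itself) and trains a logistic regression on synthetic labelled samples; the features, data, and classifier are stand-ins, not the paper's 25 measures or its experimental setup.

        # Sketch of the image-quality-based liveness idea: simple quality features
        # plus a generic classifier trained on labelled real/fake samples.
        import numpy as np
        from scipy.ndimage import gaussian_filter
        from sklearn.linear_model import LogisticRegression

        def quality_features(img):
            ref = gaussian_filter(img.astype(float), sigma=1.5)   # smoothed "reference"
            err = img - ref
            mse = np.mean(err ** 2)
            psnr = 10 * np.log10(255.0 ** 2 / (mse + 1e-12))
            return [mse, psnr]

        # Hypothetical labelled data: grayscale images and 0 (real) / 1 (fake) labels.
        rng = np.random.default_rng(2)
        real = [rng.integers(0, 256, (64, 64)) for _ in range(20)]
        fake = [gaussian_filter(rng.integers(0, 256, (64, 64)).astype(float), 2) for _ in range(20)]

        X = np.array([quality_features(im) for im in real + fake])
        y = np.array([0] * 20 + [1] * 20)
        clf = LogisticRegression().fit(X, y)
        print("training accuracy:", clf.score(X, y))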

  3. Towards a Systematic Screening Tool for Quality Assurance and Semiautomatic Fraud Detection for Images in the Life Sciences.

    PubMed

    Koppers, Lars; Wormer, Holger; Ickstadt, Katja

    2017-08-01

    The quality and authenticity of images are essential for data presentation, especially in the life sciences. Questionable images may often be a first indicator for questionable results, too. Therefore, a tool that uses mathematical methods to detect suspicious images in large image archives can be a helpful instrument to improve quality assurance in publications. As a first step towards a systematic screening tool, especially for journal editors and other staff members who are responsible for quality assurance, such as laboratory supervisors, we propose a basic classification of image manipulation. Based on this classification, we developed and explored some simple algorithms to detect copied areas in images. Using an artificial image and two examples of previously published modified images, we apply quantitative methods such as pixel-wise comparison, a nearest-neighbor algorithm, and a variance algorithm to detect copied-and-pasted areas or duplicated images. We show that our algorithms are able to detect some simple types of image alteration, such as copying and pasting background areas. The variance algorithm detects not only identical, but also very similar areas that differ only by brightness. Further types could, in principle, be implemented in a standardized scanning routine. We detected the copied areas in a proven case of image manipulation in Germany and showed the similarity of two images in a retracted paper from the Kato labs, which has been widely discussed on sites such as PubPeer and Retraction Watch.
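
    For the simplest case named above, detecting copied-and-pasted areas, an exact-duplicate check by block hashing is easy to sketch. The example below reports pairs of identical pixel blocks found at different positions in a synthetic image; the block size and step are illustrative, and the variance-based detection of merely similar regions is not reproduced.

        # Sketch of exact copy-paste detection by block hashing: identical pixel
        # blocks appearing at two different positions are reported as candidates.
        import numpy as np

        def find_duplicate_blocks(img, block=16, step=8):
            seen = {}
            hits = []
            h, w = img.shape
            for y in range(0, h - block + 1, step):
                for x in range(0, w - block + 1, step):
                    key = img[y:y + block, x:x + block].tobytes()
                    if key in seen and seen[key] != (y, x):
                        hits.append((seen[key], (y, x)))
                    else:
                        seen.setdefault(key, (y, x))
            return hits

        rng = np.random.default_rng(3)
        img = rng.integers(0, 256, (128, 128), dtype=np.uint8)
        img[64:96, 64:96] = img[0:32, 0:32]          # paste a background patch elsewhere
        print("duplicated block pairs:", find_duplicate_blocks(img)[:3])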

  4. Parametric Analysis of Surveillance Quality and Level and Quality of Intent Information and Their Impact on Conflict Detection Performance

    NASA Technical Reports Server (NTRS)

    Guerreiro, Nelson M.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Lewis, Timothy A.

    2016-01-01

    A loss-of-separation (LOS) is said to occur when two aircraft are spatially too close to one another. A LOS is the fundamental unsafe event to be avoided in air traffic management, and conflict detection (CD) is the function that attempts to predict these LOS events. In general, the effectiveness of conflict detection relates to the overall safety and performance of an air traffic management concept. An abstract, parametric analysis was conducted to investigate the impact of surveillance quality, level of intent information, and quality of intent information on conflict detection performance. The data collected in this analysis can be used to estimate the conflict detection performance under alternative future scenarios or alternative allocations of the conflict detection function, based on the quality of the surveillance and intent information under those conditions. Alternatively, these data could also be used to estimate the surveillance and intent information quality required to achieve some desired CD performance as part of the design of a new separation assurance system.

  5. Improved statistical method for temperature and salinity quality control

    NASA Astrophysics Data System (ADS)

    Gourrion, Jérôme; Szekely, Tanguy

    2017-04-01

    Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.
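
    The core of the screening step can be sketched as a table of historical extremes per location (here simplified to depth bins only) against which new observations are checked. The reference data, binning, and the absence of any tolerance margin are illustrative simplifications of the approach described above.

        # Sketch of the extreme-value QC idea: validity intervals taken from the
        # minimum/maximum ever observed in a reference dataset, binned by depth.
        # Synthetic reference data; the real method works on local spatial neighbourhoods.
        import numpy as np

        rng = np.random.default_rng(4)
        depth_bins = np.arange(0, 2000, 100)
        reference_T = {d: rng.normal(15 - d / 200, 1.0, 500) for d in depth_bins}   # synthetic history

        valid = {d: (t.min(), t.max()) for d, t in reference_T.items()}             # historical extremes

        def qc_flag(depth, temperature):
            d = depth_bins[np.argmin(np.abs(depth_bins - depth))]     # nearest depth bin
            lo, hi = valid[d]
            return "good" if lo <= temperature <= hi else "suspect"

        print(qc_flag(depth=500, temperature=12.4))
        print(qc_flag(depth=500, temperature=25.0))    # far outside anything ever observed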

  6. Improving the Quality of Positive Datasets for the Establishment of Machine Learning Models for pre-microRNA Detection.

    PubMed

    Demirci, Müşerref Duygu Saçar; Allmer, Jens

    2017-07-28

    MicroRNAs (miRNAs) are involved in the post-transcriptional regulation of protein abundance and thus have a great impact on the resulting phenotype. It is, therefore, no wonder that they have been implicated in many diseases ranging from virus infections to cancer. This impact on the phenotype leads to a great interest in establishing the miRNAs of an organism. Experimental methods are complicated, which has led to the development of computational methods for pre-miRNA detection. Such methods generally employ machine learning to establish models for the discrimination between miRNAs and other sequences. Positive training data for model establishment, for the most part, stem from miRBase, the miRNA registry. The quality of the entries in miRBase has been questioned, though. This unknown quality led to the development of filtering strategies in attempts to produce high-quality positive datasets, which can lead to a scarcity of positive data. To analyze the quality of filtered data, we developed a machine learning model and found that it is well able to establish data quality based on intrinsic measures. Additionally, we analyzed which features describing pre-miRNAs could discriminate between low- and high-quality data. Both models are applicable to data from miRBase and can be used for establishing high-quality positive data. This will facilitate the development of better miRNA detection tools, which will make the prediction of miRNAs in disease states more accurate. Finally, we applied both models to all miRBase data and provide the list of high-quality hairpins.

  7. Detection of oral HPV infection - Comparison of two different specimen collection methods and two HPV detection methods.

    PubMed

    de Souza, Marjorie M A; Hartel, Gunter; Whiteman, David C; Antonsson, Annika

    2018-04-01

    Very little is known about the natural history of oral HPV infection. Several different methods exist to collect oral specimens and detect HPV, but their respective performance characteristics are unknown. We compared two different methods for oral specimen collection (oral saline rinse and commercial saliva kit) from 96 individuals and then analyzed the samples for HPV by two different PCR detection methods (single GP5+/6+ PCR and nested MY09/11 and GP5+/6+ PCR). For the oral rinse samples, the oral HPV prevalence was 10.4% (GP+ PCR; 10% repeatability) vs 11.5% (nested PCR method; 100% repeatability). For the commercial saliva kit samples, the prevalence was 3.1% with GP+ PCR vs 16.7% with the nested PCR method (repeatability 100% for both detection methods). Overall, the agreement between samples and methods was fair to poor (kappa 0.06-0.36). Standardizing methods of oral sample collection and HPV detection would ensure comparability between future oral HPV studies. Copyright © 2017 Elsevier Inc. All rights reserved.
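
    The kappa statistic quoted above summarizes chance-corrected agreement between two binary detection results on the same samples. A minimal sketch with illustrative counts, assuming Cohen's kappa is the statistic used:

        # Chance-corrected agreement between two HPV detection methods on the same
        # samples (illustrative 0/1 calls, assuming Cohen's kappa).
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        method_a = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])   # 1 = HPV detected
        method_b = np.array([1, 0, 1, 0, 0, 0, 0, 1, 0, 1])
        print("kappa:", round(cohen_kappa_score(method_a, method_b), 2))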

  8. The detection methods of dynamic objects

    NASA Astrophysics Data System (ADS)

    Knyazev, N. L.; Denisova, L. A.

    2018-01-01

    The article deals with the application of cluster analysis to the task of aircraft detection, based on partitioning a selection of navigation parameters into groups (clusters). A modified cluster analysis method is suggested for the search and detection of objects, with iterative merging into clusters and a subsequent count of their number to increase the accuracy of aircraft detection. The operation of the method and the features of its implementation are considered. In conclusion, the efficiency of the proposed method of cluster analysis for finding targets is demonstrated.

  9. Picking vs Waveform based detection and location methods for induced seismicity monitoring

    NASA Astrophysics Data System (ADS)

    Grigoli, Francesco; Boese, Maren; Scarabello, Luca; Diehl, Tobias; Weber, Bernd; Wiemer, Stefan; Clinton, John F.

    2017-04-01

    Microseismic monitoring is a common operation in various industrial activities related to geo-resources, such as oil and gas production, mining operations, or geothermal energy exploitation. In microseismic monitoring we generally deal with large datasets from dense monitoring networks that require robust automated analysis procedures. The seismic sequences being monitored are often characterized by very many events with short inter-event times, whose seismic signatures can even overlap. In these situations, traditional approaches that identify seismic events using dense seismic networks based on detections, phase identification, and event association can fail, leading to missed detections and/or reduced location resolution. In recent years, to improve the quality of automated catalogues, various waveform-based methods for the detection and location of microseismicity have been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Although this family of methods has been applied to different induced seismicity datasets, an extensive comparison with sophisticated pick-based detection and location methods is still lacking. We aim here to perform a systematic comparison in terms of performance between the waveform-based method LOKI and the pick-based detection and location methods (SCAUTOLOC and SCANLOC) implemented within the SeisComP3 software package. SCANLOC is a new detection and location method specifically designed for seismic monitoring at the local scale. Although it has proved successful in recent applications, an extensive test with induced seismicity datasets has not yet been performed. This method is based on a cluster search algorithm to associate detections with one or many potential earthquake sources. On the other hand, SCAUTOLOC is a more "conventional" method and is the basic tool for seismic event detection and location in SeisComP3. This approach was specifically designed for

  10. TBDQ: A Pragmatic Task-Based Method to Data Quality Assessment and Improvement

    PubMed Central

    Vaziri, Reza; Mohsenzadeh, Mehran; Habibi, Jafar

    2016-01-01

    Organizations are increasingly accepting data quality (DQ) as a major key to their success. In order to assess and improve DQ, methods have been devised. Many of these methods attempt to raise DQ by directly manipulating low quality data. Such methods operate reactively and are suitable for organizations with highly developed integrated systems. However, there is a lack of a proactive DQ method for businesses with weak IT infrastructure where data quality is largely affected by tasks that are performed by human agents. This study aims to develop and evaluate a new method for structured data, which is simple and practical so that it can easily be applied to real world situations. The new method detects the potentially risky tasks within a process, and adds new improving tasks to counter them. To achieve continuous improvement, an award system is also developed to help with the better selection of the proposed improving tasks. The task-based DQ method (TBDQ) is most appropriate for small and medium organizations, and simplicity in implementation is one of its most prominent features. TBDQ is case studied in an international trade company. The case study shows that TBDQ is effective in selecting optimal activities for DQ improvement in terms of cost and improvement. PMID:27192547

  11. Applications of emerging imaging techniques for meat quality and safety detection and evaluation: A review.

    PubMed

    Xiong, Zhenjie; Sun, Da-Wen; Pu, Hongbin; Gao, Wenhong; Dai, Qiong

    2017-03-04

    With improvements in living standards, people nowadays pay more attention to the quality and safety of meat. However, traditional methods for meat quality and safety detection and evaluation, such as manual inspection, mechanical methods, and chemical methods, are tedious, time-consuming, and destructive, and cannot meet the requirements of the modern meat industry. Therefore, seeking rapid, non-destructive, and accurate inspection techniques is important for the meat industry. In recent years, a number of novel and noninvasive imaging techniques, such as optical imaging, ultrasound imaging, tomographic imaging, thermal imaging, and odor imaging, have emerged and shown great potential in quality and safety assessment. In this paper, a detailed overview of advanced applications of these emerging imaging techniques for quality and safety assessment of different types of meat (pork, beef, lamb, chicken, and fish) is presented. In addition, the advantages and disadvantages of each imaging technique are summarized. Finally, future trends for these emerging imaging techniques are discussed, including the integration of multiple imaging techniques, cost reduction, and the development of powerful image-processing algorithms.

  12. Characteristic image quality of a third generation dual-source MDCT scanner: Noise, resolution, and detectability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solomon, Justin, E-mail: justin.solomon@duke.edu; Wilson, Joshua; Samei, Ehsan

    2015-08-15

    Purpose: The purpose of this work was to assess the inherent image quality characteristics of a new multidetector computed tomography system in terms of noise, resolution, and detectability index as a function of image acquisition and reconstruction for a range of clinically relevant settings. Methods: A multisized image quality phantom (37, 30, 23, 18.5, and 12 cm physical diameter) was imaged on a SOMATOM Force scanner (Siemens Medical Solutions) under variable dose, kVp, and tube current modulation settings. Images were reconstructed with filtered back projection (FBP) and with advanced modeled iterative reconstruction (ADMIRE) with iterative strengths of 3, 4, and 5. Image quality was assessed in terms of the noise power spectrum (NPS), task transfer function (TTF), and detectability index for a range of detection tasks (contrasts of approximately 45, 90, 300, −900, and 1000 HU, and 2–20 mm diameter) based on a non-prewhitening matched filter model observer with eye filter. Results: Image noise magnitude decreased with decreasing phantom size, increasing dose, and increasing ADMIRE strength, offering up to 64% noise reduction relative to FBP. Noise texture in terms of the NPS was similar between FBP and ADMIRE (<5% shift in peak frequency). The resolution, based on the TTF, improved with increased ADMIRE strength by an average of 15% in the TTF 50% frequency for ADMIRE-5. The detectability index increased with increasing dose and ADMIRE strength by an average of 55%, 90%, and 163% for ADMIRE 3, 4, and 5, respectively. Assessing the impact of mA modulation for a fixed average dose over the length of the phantom, detectability was up to 49% lower in smaller phantom sections and up to 26% higher in larger phantom sections for the modulated scan compared to a fixed tube current scan. Overall, the detectability exhibited less variability with phantom size for modulated scans compared to fixed tube current scans. Conclusions: Image quality increased with
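
    For reference, the detectability index named above is commonly computed from the measured TTF and NPS with a non-prewhitening matched filter model observer with eye filter (NPWE). A standard form of that index, under the assumption that this is the formulation used, is

        d'^2_{NPWE} = \frac{\left[ \iint |W_{task}(u,v)|^{2} \, \mathrm{TTF}^{2}(u,v) \, E^{2}(u,v) \, du \, dv \right]^{2}}{\iint |W_{task}(u,v)|^{2} \, \mathrm{TTF}^{2}(u,v) \, E^{4}(u,v) \, \mathrm{NPS}(u,v) \, du \, dv}

    where W_task is the Fourier transform of the task (the object contrast profile), TTF and NPS are the measured transfer and noise properties, and E is the eye filter; the exact normalization may differ from the authors' implementation.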

  13. Molecular Method for Detection of Total Coliforms in Drinking Water Samples

    PubMed Central

    Boudreau, Dominique K.; Bisson, Marc-Antoine; Dion-Dupont, Vanessa; Bouchard, Sébastien; Nkuranga, Martine; Bergeron, Michel G.; Rodriguez, Manuel J.

    2014-01-01

    This work demonstrates the ability of a bacterial concentration and recovery procedure combined with three different PCR assays targeting the lacZ, wecG, and 16S rRNA genes, respectively, to detect the presence of total coliforms in 100-ml samples of potable water (presence/absence test). PCR assays were first compared to the culture-based Colilert and MI agar methods to determine their ability to detect 147 coliform strains representing 76 species of Enterobacteriaceae encountered in fecal and environmental settings. Results showed that 86 (58.5%) and 109 (74.1%) strains yielded a positive signal with Colilert and MI agar methods, respectively, whereas the lacZ, wecG, and 16S rRNA PCR assays detected 133 (90.5%), 111 (75.5%), and 146 (99.3%) of the 147 total coliform strains tested. These assays were then assessed by testing 122 well water samples collected in the Québec City region of Canada. Results showed that 97 (79.5%) of the samples tested by culture-based methods and 95 (77.9%), 82 (67.2%), and 98 (80.3%) of samples tested using PCR-based methods contained total coliforms, respectively. Consequently, despite the high genetic variability of the total coliform group, this study demonstrated that it is possible to use molecular assays to detect total coliforms in potable water: the 16S rRNA molecular assay was shown to be as efficient as recommended culture-based methods. This assay might be used in combination with an Escherichia coli molecular assay to assess drinking water quality. PMID:24771030

  14. Comparison of outlier identification methods in hospital surgical quality improvement programs.

    PubMed

    Bilimoria, Karl Y; Cohen, Mark E; Merkow, Ryan P; Wang, Xue; Bentrem, David J; Ingraham, Angela M; Richards, Karen; Hall, Bruce L; Ko, Clifford Y

    2010-10-01

    Surgeons and hospitals are being increasingly assessed by third parties regarding surgical quality and outcomes, and much of this information is reported publicly. Our objective was to compare various methods used to classify hospitals as outliers in established surgical quality assessment programs by applying each approach to a single data set. Using American College of Surgeons National Surgical Quality Improvement Program data (7/2008-6/2009), hospital risk-adjusted 30-day morbidity and mortality were assessed for general surgery at 231 hospitals (cases = 217,630) and for colorectal surgery at 109 hospitals (cases = 17,251). The number of outliers (poor performers) identified using different methods and criteria was compared. The overall morbidity was 10.3% for general surgery and 25.3% for colorectal surgery. The mortality was 1.6% for general surgery and 4.0% for colorectal surgery. Programs used different methods (logistic regression, hierarchical modeling, partitioning) and criteria (P < 0.01, P < 0.05, P < 0.10) to identify outliers. Depending on the outlier identification methods and criteria employed, when each approach was applied to this single dataset, the number of outliers ranged from 7 to 57 hospitals for general surgery morbidity, 1 to 57 hospitals for general surgery mortality, 4 to 27 hospitals for colorectal morbidity, and 0 to 27 hospitals for colorectal mortality. There was considerable variation in the number of outliers identified using different detection approaches. Quality programs seem to be utilizing outlier identification methods contrary to what might be expected; thus, they should justify their methodology based on the intent of the program (i.e., quality improvement vs. reimbursement). Surgeons and hospitals should be aware of the variability in methods used to assess their performance, as these outlier designations will likely have referral and reimbursement consequences.

  15. Application of Islanding Detection and Classification of Power Quality Disturbance in Hybrid Energy System

    NASA Astrophysics Data System (ADS)

    Sun, L. B.; Wu, Z. S.; Yang, K. K.

    2018-04-01

    Islanding and power quality (PQ) disturbances in hybrid energy systems become more serious with the application of renewable energy sources. In this paper, a novel method based on the wavelet transform (WT) and a modified feed-forward neural network (FNN) is proposed to detect islanding and classify PQ problems. First, the performance indices, i.e., the energy content and standard deviation (SD) of the transformed signal, are extracted from the negative sequence component of the voltage signal at the PCC using the WT. Afterward, the WT indices are fed to train the modified FNN using Particle Swarm Optimization (PSO), a heuristic optimization method. Then, the results of simulations based on WT-PSOFNN are discussed in MATLAB/SIMULINK. Simulations of the hybrid power system show that the accuracy in detecting and classifying different disturbances, with multiple distributed generations connected, can be significantly improved by the proposed method.
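
    The WT feature step can be sketched briefly: decompose the voltage signal with a discrete wavelet transform and use the energy and standard deviation of the detail coefficients as disturbance indices. The example below uses PyWavelets on a synthetic voltage sag; the negative-sequence extraction and the PSO-trained neural classifier are not reproduced, and the wavelet choice and decomposition level are illustrative.

        # Sketch of the WT feature step: energy and SD of detail coefficients of a
        # synthetic voltage signal containing a sag halfway through the record.
        import numpy as np
        import pywt

        fs, f0 = 6400, 50
        t = np.arange(0, 0.2, 1 / fs)
        v = np.sin(2 * np.pi * f0 * t)
        v[640:] *= 0.6                                   # illustrative voltage sag

        coeffs = pywt.wavedec(v, "db4", level=4)         # [cA4, cD4, cD3, cD2, cD1]
        for lvl, d in enumerate(coeffs[1:], start=1):
            print(f"D{5 - lvl}: energy={np.sum(d ** 2):.3f}, sd={np.std(d):.4f}")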

  16. GMDD: a database of GMO detection methods

    PubMed Central

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans JP; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-01-01

    Background Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information for the harmonization and standardization of GMO analysis methods at the global level is needed. Results The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, which have been grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, or protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primer and probe design. The information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included in this database. Furthermore, registered users can submit new detection methods and sequences to this database, and the newly submitted information will be released soon after being checked. Conclusion GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier. PMID:18522755

  17. GMDD: a database of GMO detection methods.

    PubMed

    Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing

    2008-06-04

    Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information for the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, which have been grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, or protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primer and probe design. The information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included in this database. Furthermore, registered users can submit new detection methods and sequences to this database, and the newly submitted information will be released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.

  18. Comparing CNV detection methods for SNP arrays.

    PubMed

    Winchester, Laura; Yau, Christopher; Ragoussis, Jiannis

    2009-09-01

    Data from whole genome association studies can now be used for dual purposes, genotyping and copy number detection. In this review we discuss some of the methods for using SNP data to detect copy number events. We examine a number of algorithms designed to detect copy number changes through the use of signal-intensity data and consider methods to evaluate the changes found. We describe the use of several statistical models in copy number detection in germline samples. We also present a comparison of data using these methods to assess accuracy of prediction and detection of changes in copy number.

  19. Quality Assurance Through Quality Improvement and Professional Development in the National Breast and Cervical Cancer Early Detection Program

    PubMed Central

    Siegl, Elvira J.; Miller, Jacqueline W.; Khan, Kris; Harris, Susan E.

    2015-01-01

    Quality assurance (QA) is the process of providing evidence that the outcome meets the established standards. Quality improvement (QI), by contrast, is the act of methodically developing ways to meet acceptable quality standards and evaluating current processes to improve overall performance. In the case of the National Breast and Cervical Cancer Early Detection Program (NBCCEDP), the desired outcome is the delivery of quality health care services to program clients. The NBCCEDP provides professional development to ensure that participating providers have current knowledge of evidence-based clinical standards regarding breast and cervical cancer screening and diagnosis and are monitoring women with abnormal screening results for timely follow-up. To assess the quality of clinical care provided to NBCCEDP clients, performance data are collected by NBCCEDP grantees and compared against predetermined Centers for Disease Control and Prevention (CDC) benchmarks known as Data Quality Indicator Guides. In this article, the authors describe 1) the development and use of indicators for QI in the NBCCEDP and 2) the professional development activities implemented to improve clinical outcomes. QA identifies problems, whereas QI systematically corrects them. The quality of service delivery and patient outcomes among NBCCEDP grantees have improved significantly because of continuous monitoring of performance and professional development. By using QA, NBCCEDP grantees can maximize the quality of patient screening, diagnostic services, and follow-up. Examples of grantee activities to maintain quality of care are also described in this report. PMID:25099901

  20. Effects of three highway-runoff detention methods on water quality of the surficial aquifer system in central Florida

    USGS Publications Warehouse

    Schiffer, D.M.

    1989-01-01

    Water quality of the surficial aquifer system in central Florida was evaluated at one exfiltration pipe, two ponds (detention and retention), and two swales in central Florida, representing three runoff-detention methods, to detect any effect from infiltrating highway runoff. Concentrations of major ions, metals, and nutrients in groundwater and bottom sediments were measured from 1984 through 1986. At each study area, constituent concentrations in groundwater near the structure were compared to concentrations in groundwater from an upgradient control site. Groundwater quality data were also pooled by detention method and statistically compared to detect any significant differences between methods. Significantly greater mean phosphorus concentrations in groundwater near the exfiltration pipe than those in the control well was the only evidence of increasing constituent concentrations in groundwater near structures. The quality of water was more variable, and had greater constituent concentrations in the unsaturated zone than in the saturated zone near the exfiltration pipe. Values of water quality variables measured in groundwater at all study areas generally were within State drinking water standards. The main exception was dissolved iron, which commonly exceeded 300 micrograms/L at one swale and the detention pond. Results of the study indicate that natural processes occurring in soils attenuate inorganic constituent concentrations prior to reaching the receiving groundwater. However, organic compounds detected in bottom sediments at the retention pond indicate a potential problem that may eventually affect the quality of the receiving groundwater. (USGS)

  1. Chemicals of emerging concern in water and bottom sediment in the Great Lakes Basin, 2012: collection methods, analytical methods, quality assurance, and study data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Hansen, Donald S.; Foreman, William T.; Furlong, Edward T.; Jorgenson, Zachary G.; Choy, Steven J.; Moore, Jeremy N.; Banda, JoAnn; Gefell, Daniel J.

    2015-01-01

    During this study, 53 environmental samples, 4 field duplicate samples, and 8 field spike samples of bottom sediment and laboratory matrix-spike samples were analyzed for a wide variety of CECs at the USGS National Water Quality Laboratory using laboratory schedule 5433 for wastewater indicators; research method 6434 for steroid hormones, sterols, and bisphenol A; and research method 9008 for human-use pharmaceuticals and antidepressants. Forty of the 57 chemicals analyzed using laboratory schedule 5433 had detectable concentrations ranging from 1 to 49,000 micrograms per kilogram. Fourteen of the 20 chemicals analyzed using research method 6434 had detectable concentrations ranging from 0.04 to 24,940 nanograms per gram. Ten of the 20 chemicals analyzed using research method 9008 had detectable concentrations ranging from 0.59 to 197.5 micrograms per kilogram. Five of the 11 chemicals analyzed using research method 9008 had detectable concentrations ranging from 1.16 to 25.0 micrograms per kilogram.

  2. [Establishment of Quality Control System of Nucleic Acid Detection for Ebola Virus in Sierra Leone-China Friendship Biological Safety Laboratory].

    PubMed

    Wang, Qin; Zhang, Yong; Nie, Kai; Wang, Huanyu; Du, Haijun; Song, Jingdong; Xiao, Kang; Lei, Wenwen; Guo, Jianqiang; Wei, Hejiang; Cai, Kun; Wang, Yanhai; Wu, Jiang; Gerald, Bangura; Kamara, Idrissa Laybohr; Liang, Mifang; Wu, Guizhen; Dong, Xiaoping

    2016-03-01

    The quality control process throughout Ebola virus nucleic acid detection in the Sierra Leone-China Friendship Biological Safety Laboratory (SLE-CHN Biosafety Lab) was described in detail, in order to comprehensively present the scientific, rigorous, accurate, and efficient practice of the first batch detection team in the SLE-CHN Biosafety Lab in detecting Ebola virus. Firstly, the key points of the laboratory quality control system were described, including management and organization, quality control documents and information management, instruments, reagents and supplies, assessment, facilities design and space allocation, laboratory maintenance, and biosecurity. Secondly, the application of quality control methods throughout the whole process of Ebola virus detection, including before, during, and after testing, was analyzed. Excellent, professional laboratory staff and the implementation of humane management are the cornerstone of success; high-level biological safety protection is the precondition for effective quality control and completion of Ebola virus detection tasks; and professional logistics is a prerequisite for launching the laboratory diagnosis of Ebola virus. The establishment and running of the SLE-CHN Biosafety Lab has landmark significance for the friendship between Sierra Leone and China, and the lab has become the most important base for Ebola virus laboratory testing in Sierra Leone.

  3. Method variation in the impact of missing data on response shift detection.

    PubMed

    Schwartz, Carolyn E; Sajobi, Tolulope T; Verdam, Mathilde G E; Sebille, Veronique; Lix, Lisa M; Guilleux, Alice; Sprangers, Mirjam A G

    2015-03-01

    Missing data due to attrition or item non-response can result in biased estimates and loss of power in longitudinal quality-of-life (QOL) research. The impact of missing data on response shift (RS) detection is relatively unknown. This overview article synthesizes the findings of three methods tested in this special section regarding the impact of missing data patterns on RS detection in incomplete longitudinal data. The RS detection methods investigated include: (1) Relative importance analysis to detect reprioritization RS in stroke caregivers; (2) Oort's structural equation modeling (SEM) to detect recalibration, reprioritization, and reconceptualization RS in cancer patients; and (3) Rasch-based item-response theory-based (IRT) models as compared to SEM models to detect recalibration and reprioritization RS in hospitalized chronic disease patients. Each method dealt with missing data differently, either with imputation (1), attrition-based multi-group analysis (2), or probabilistic analysis that is robust to missingness due to the specific objectivity property (3). Relative importance analyses were sensitive to the type and amount of missing data and imputation method, with multiple imputation showing the largest RS effects. The attrition-based multi-group SEM revealed differential effects of both the changes in health-related QOL and the occurrence of response shift by attrition stratum, and enabled a more complete interpretation of findings. The IRT RS algorithm found evidence of small recalibration and reprioritization effects in General Health, whereas SEM mostly evidenced small recalibration effects. These differences may be due to differences between the two methods in handling of missing data. Missing data imputation techniques result in different conclusions about the presence of reprioritization RS using the relative importance method, while the attrition-based SEM approach highlighted different recalibration and reprioritization RS effects by

  4. Molecular method for detection of total coliforms in drinking water samples.

    PubMed

    Maheux, Andrée F; Boudreau, Dominique K; Bisson, Marc-Antoine; Dion-Dupont, Vanessa; Bouchard, Sébastien; Nkuranga, Martine; Bergeron, Michel G; Rodriguez, Manuel J

    2014-07-01

    This work demonstrates the ability of a bacterial concentration and recovery procedure combined with three different PCR assays targeting the lacZ, wecG, and 16S rRNA genes, respectively, to detect the presence of total coliforms in 100-ml samples of potable water (presence/absence test). PCR assays were first compared to the culture-based Colilert and MI agar methods to determine their ability to detect 147 coliform strains representing 76 species of Enterobacteriaceae encountered in fecal and environmental settings. Results showed that 86 (58.5%) and 109 (74.1%) strains yielded a positive signal with Colilert and MI agar methods, respectively, whereas the lacZ, wecG, and 16S rRNA PCR assays detected 133 (90.5%), 111 (75.5%), and 146 (99.3%) of the 147 total coliform strains tested. These assays were then assessed by testing 122 well water samples collected in the Québec City region of Canada. Results showed that 97 (79.5%) of the samples tested by culture-based methods and 95 (77.9%), 82 (67.2%), and 98 (80.3%) of samples tested using PCR-based methods contained total coliforms, respectively. Consequently, despite the high genetic variability of the total coliform group, this study demonstrated that it is possible to use molecular assays to detect total coliforms in potable water: the 16S rRNA molecular assay was shown to be as efficient as recommended culture-based methods. This assay might be used in combination with an Escherichia coli molecular assay to assess drinking water quality. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  5. Error detection method

    DOEpatents

    Olson, Eric J.

    2013-06-11

    An apparatus, program product, and method that run an algorithm on a hardware based processor, generate a hardware error as a result of running the algorithm, generate an algorithm output for the algorithm, compare the algorithm output to another output for the algorithm, and detect the hardware error from the comparison. The algorithm is designed to cause the hardware based processor to heat to a degree that increases the likelihood of hardware errors to manifest, and the hardware error is observable in the algorithm output. As such, electronic components may be sufficiently heated and/or sufficiently stressed to create better conditions for generating hardware errors, and the output of the algorithm may be compared at the end of the run to detect a hardware error that occurred anywhere during the run that may otherwise not be detected by traditional methodologies (e.g., due to cooling, insufficient heat and/or stress, etc.).
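
    A minimal sketch of the idea described above: run a deterministic, compute-intensive workload that heats and stresses the processor, then compare its output against a second run (or a known-good reference); any mismatch indicates that a hardware error manifested during the run. This is an illustration under those assumptions, not the patented implementation.

        # Run a deterministic, CPU-intensive workload and compare its output hash
        # against a repeat run; a mismatch signals a hardware error during the run.
        import hashlib
        import numpy as np

        def stress_workload(seed=0, reps=50):
            rng = np.random.default_rng(seed)
            a = rng.random((256, 256))
            acc = np.eye(256)
            for _ in range(reps):                     # repeated dense matrix products stress the FPU
                acc = acc @ a
                acc /= np.max(np.abs(acc))            # keep values bounded so runs stay comparable
            return hashlib.sha256(acc.tobytes()).hexdigest()

        if stress_workload() != stress_workload():
            print("outputs differ: hardware error detected during the run")
        else:
            print("outputs match: no error observed")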

  6. A novel method for sex determination by detecting the number of X chromosomes.

    PubMed

    Nakanishi, Hiroaki; Shojo, Hideki; Ohmori, Takeshi; Hara, Masaaki; Takada, Aya; Adachi, Noboru; Saito, Kazuyuki

    2015-01-01

    A novel method for sex determination, based on the detection of the number of X chromosomes, was established. Current methods, based on the detection of the Y chromosome, can directly identify an unknown sample as male, but female gender is determined indirectly, by not detecting the Y chromosome. Thus, a direct determination of female gender is important because the quality (e.g., fragmentation and amelogenin-Y null allele) of the Y chromosome DNA may lead to a false result. Therefore, we developed a novel sex determination method by analyzing the number of X chromosomes using a copy number variation (CNV) detection technique (the comparative Ct method). In this study, we designed a primer set using the amelogenin-X gene without the CNV region as the target to determine the X chromosome copy number, to exclude the influence of the CNV region from the comparative Ct value. The number of X chromosomes was determined statistically using the CopyCaller software with real-time PCR. All DNA samples from participants (20 males, 20 females) were evaluated correctly using this method with 1-ng template DNA. A minimum of 0.2-ng template DNA was found to be necessary for accurate sex determination with this method. When using ultraviolet-irradiated template DNA, as mock forensic samples, the sex of the samples could not be determined by short tandem repeat (STR) analysis but was correctly determined using our method. Thus, we successfully developed a method of sex determination based on the number of X chromosomes. Our novel method will be useful in forensic practice for sex determination.
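
    The comparative Ct calculation underlying the copy-number call follows the standard relation below, assuming an autosomal reference assay and a calibrator sample with a known number of X chromosomes (the exact reference target and calibrator are as implemented by the authors):

        \Delta C_t = C_t^{\mathrm{AMELX}} - C_t^{\mathrm{ref}}, \qquad \Delta\Delta C_t = \Delta C_t^{\mathrm{sample}} - \Delta C_t^{\mathrm{calibrator}}, \qquad \text{relative X copy number} = 2^{-\Delta\Delta C_t}

    With a two-X (female) calibrator, a ratio near 1.0 indicates two X chromosomes and a ratio near 0.5 indicates one.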

  7. Detection of Pelger-Huet anomaly based on augmented fast marching method and speeded up robust features.

    PubMed

    Sun, Minglei; Yang, Shaobao; Jiang, Jinling; Wang, Qiwei

    2015-01-01

    Pelger-Huet anomaly (PHA) and pseudo Pelger-Huet anomaly (PPHA) are neutrophils with abnormal morphology. They have a bilobed or unilobed nucleus and excessively clumped chromatin. Currently, detection of this kind of cell mainly depends on manual microscopic examination by a clinician; thus, the quality of detection is limited by the efficiency and subjectivity of the clinician. In this paper, a detection method for PHA and PPHA is proposed based on karyomorphism and chromatin distribution features. Firstly, the skeleton of the nucleus is extracted using an augmented Fast Marching Method (AFMM), and the width distribution is obtained through a distance transform. Then, caryoplastin in the nucleus is extracted based on Speeded Up Robust Features (SURF), and a K-nearest-neighbor (KNN) classifier is constructed to analyze the features. Experiments show that the sensitivity and specificity of this method reached 87.5% and 83.33%, respectively, which means that the detection accuracy for PHA is acceptable. The detection method should also be helpful for the automatic morphological classification of blood cells.
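
    The final classification step (a K-nearest-neighbor model over per-cell feature vectors) can be sketched as below. The nucleus-width and chromatin features are synthetic stand-ins; the AFMM skeletonization and SURF extraction stages are not reproduced.

        # KNN classification over per-cell feature vectors, e.g. [mean nucleus width,
        # chromatin clumping score]. Feature values here are synthetic.
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(6)
        normal = rng.normal([3.0, 0.4], 0.3, size=(40, 2))
        pha = rng.normal([5.0, 0.8], 0.3, size=(40, 2))
        X = np.vstack([normal, pha])
        y = np.array([0] * 40 + [1] * 40)                    # 1 = PHA/PPHA

        knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
        print(knn.predict([[4.8, 0.75]]))                    # likely classified as PHA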

  8. NONDESTRUCTIVE QUALITY CONTROL: SOME SPECIAL METHODS OF IRRADIATION TESTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van der Klis, T.

    1961-06-10

    Various methods using open radioactive sources are discussed. In one method, oil containing Pd-109 is used, which is adsorbed by Mg compounds with which the object to be tested is covered after it has been enveloped in a photographic film. Another method consists of coking the material in the radioactive oil and then scanning it with a suitable detector. A third method, applied especially to porous materials, uses pressure to promote the penetration of the radioactive oil into cracks and fissures. The filtered particle technique is also used for the detection of cracks or cavities in porous materials, such as ceramics, cement, graphite, pressed powdered metals, and sintered carbides. In this method, radioactive liquids are used along with fluid fluorescent substances. Finally, a method is mentioned in which radioactive powder is made to adhere to the surface of the investigated objects by means of an electrostatic charge. This method is used for quality control of china, glass, enamel, and electrical insulation material. (OID)

  9. Automated methods for multiplexed pathogen detection.

    PubMed

    Straub, Timothy M; Dockendorff, Brian P; Quiñonez-Díaz, Maria D; Valdez, Catherine O; Shutthanandan, Janani I; Tarasevich, Barbara J; Grate, Jay W; Bruckner-Lea, Cynthia J

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells for each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities. However

  10. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    PubMed Central

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm2 areas and ≥2% in ∼20 mm2 areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified. PMID:22894421
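
    The PID-based detection strategy can be sketched compactly: form the percent pixel intensity deviation between a reference and a repeat image, median filter it to suppress uncorrelated imager noise, and flag pixels beyond a threshold. The images, filter size, and threshold below are illustrative assumptions; the gradient scaling and ROC analysis are omitted.

        # Sketch of PID-style anomaly flagging on a synthetic reference/repeat pair.
        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(5)
        reference = rng.normal(100.0, 1.0, (256, 256))
        repeat = reference + rng.normal(0.0, 1.0, (256, 256))
        repeat[100:110, 100:110] *= 1.05                    # inject a 5% fluence anomaly

        pid = 100.0 * (repeat - reference) / reference      # percent intensity deviation
        pid = median_filter(pid, size=3)                    # suppress uncorrelated imager noise
        anomaly_mask = np.abs(pid) > 3.0                    # illustrative 3% action threshold
        print("flagged pixels:", int(anomaly_mask.sum()))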

  11. Apparatus and method for combusting low quality fuel

    DOEpatents

    Brushwood, John Samuel; Pillsbury, Paul; Foote, John; Heilos, Andreas

    2003-11-04

    A gas turbine (12) capable of combusting a low quality gaseous fuel having a ratio of flammability limits less than 2, or a heat value below 100 BTU/SCF. A high quality fuel is burned simultaneously with the low quality fuel to eliminate instability in the combustion flame. A sensor (46) is used to monitor at least one parameter of the flame indicative of instability. A controller (50) having the sensor signal (48) as input is programmed to control the relative flow rates of the low quality and high quality fuels. When instability is detected, the flow rate of high quality fuel is automatically increased in relation to the flow rate of low quality fuel to restore stability.

  12. Nucleic Acid Detection Methods

    DOEpatents

    Smith, Cassandra L.; Yaar, Ron; Szafranski, Przemyslaw; Cantor, Charles R.

    1998-05-19

    The invention relates to methods for rapidly determining the sequence and/or length of a target sequence. The target sequence may be a series of known or unknown repeat sequences which are hybridized to an array of probes. The hybridized array is digested with a single-strand nuclease and free 3'-hydroxyl groups are extended with a nucleic acid polymerase. Nuclease-cleaved heteroduplexes can be easily distinguished from nuclease-uncleaved heteroduplexes by differential labeling. Probes and target can be differentially labeled with detectable labels. Matched target can be detected by cleaving the resulting loops from the hybridized target and creating free 3'-hydroxyl groups. These groups are recognized and extended by polymerases added into the reaction system, which also adds or releases a label into solution. Analysis of the resulting products can be performed using either a solid phase or solution. These methods can be used to detect characteristic nucleic acid sequences, to determine target sequences, and to screen for genetic defects and disorders. Assays can be conducted on solid surfaces, allowing multiple reactions to be conducted in parallel and, if desired, automated.

  13. Nucleic acid detection methods

    DOEpatents

    Smith, C.L.; Yaar, R.; Szafranski, P.; Cantor, C.R.

    1998-05-19

    The invention relates to methods for rapidly determining the sequence and/or length of a target sequence. The target sequence may be a series of known or unknown repeat sequences which are hybridized to an array of probes. The hybridized array is digested with a single-strand nuclease and free 3'-hydroxyl groups are extended with a nucleic acid polymerase. Nuclease-cleaved heteroduplexes can be easily distinguished from nuclease-uncleaved heteroduplexes by differential labeling. Probes and target can be differentially labeled with detectable labels. Matched target can be detected by cleaving the resulting loops from the hybridized target and creating free 3'-hydroxyl groups. These groups are recognized and extended by polymerases added into the reaction system, which also adds or releases one label into solution. Analysis of the resulting products can be carried out using either solid-phase or solution methods. These methods can be used to detect characteristic nucleic acid sequences, to determine target sequence and to screen for genetic defects and disorders. Assays can be conducted on solid surfaces, allowing for multiple reactions to be conducted in parallel and, if desired, automated. 18 figs.

  14. Nondestructive Methods for Detecting Defects in Softwood Logs

    Treesearch

    Kristin C. Schad; Daniel L. Schmoldt; Robert J. Ross

    1996-01-01

    Wood degradation and defects, such as voids and knots, affect the quality and processing time of lumber. The ability to detect internal defects in the log can save mills time and processing costs. In this study, we investigated three nondestructive evaluation techniques for detecting internal wood defects. Sound wave transmission, x-ray computed tomography, and impulse...

  15. ADSA Foundation Scholar Award: Trends in culture-independent methods for assessing dairy food quality and safety: emerging metagenomic tools.

    PubMed

    Yeung, Marie

    2012-12-01

    Enhancing the quality and safety of dairy food is critical to maintaining the competitiveness of dairy products in the food and beverage market and in reinforcing consumer confidence in the dairy industry. Raw milk quality has a significant effect on finished product quality. Several microbial groups found in raw milk have been shown to adversely affect the shelf life of pasteurized milk. Current microbiological criteria used to define milk quality are based primarily on culture-dependent methods, some of which are perceived to lack the desired sensitivity and specificity. To supplement traditional methods, culture-independent methods are increasingly being used to identify specific species or microbial groups, and to detect indicator genes or proteins in raw milk or dairy products. Some molecular subtyping techniques have been developed to track the transmission of microbes in dairy environments. The burgeoning "-omics" technologies offer new and exciting opportunities to enhance our understanding of food quality and safety in relation to microbes. Metagenomics has the potential to characterize microbial diversity, detect nonculturable microbes, and identify unique sequences or other factors associated with dairy product quality and safety. In this review, fluid milk will be used as the primary example to examine the adequacy and validity of conventional methods, the current trend of culture-independent methods, and the potential applications of metagenomics in dairy food research. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  16. Extracting transient Rayleigh wave and its application in detecting quality of highway roadbed

    USGS Publications Warehouse

    Liu, J.; Xia, J.; Luo, Y.; Li, X.; Xu, S.; ,

    2004-01-01

    This paper first explains the tau-p mapping method of extracting Rayleigh waves (LR waves) from field shot gathers. It also presents a mathematical model relating physical character parameters to the quality of high-grade roads. The paper then discusses an algorithm for computing dispersion curves using adjacent channels. Shear velocity and physical character parameters are obtained by inversion of the dispersion curves. The adjacent-channel algorithm eliminates the averaging effects inherent in multi-channel estimation of dispersion curves, thereby improving the longitudinal and transverse resolution of LR waves and the precision of non-invasive detection, and broadening its range of applications. Based on analysis of modeling results for separate computation of the ground roll and on real examples of detecting the density and pressure strength of a high-grade roadbed, and on comparison of the shallow seismic image method with borehole cores, we conclude that: (1) the anomaly extent and configuration obtained from LR waves largely agree with the results of the shallow seismic image method; (2) the average relative error of density obtained from LR-wave inversion is 1.6% compared with borehole coring; and (3) transient LR waves are feasible and effective for detecting the density and pressure strength of a high-grade roadbed.

  17. A Simple and Rapid UPLC-PDA Method for Quality Control of Nardostachys jatamansi.

    PubMed

    Zhang, Weize; Nan, Guo; Wu, Hong-Hua; Jiang, Miaomiao; Li, Tian-Xiang; Wang, Meng; Gao, Xiu-Mei; Zhu, Yan; Song, Yun Seon; Wang, Jiaming; Xu, Yan-Tong

    2018-05-01

    Nardostachys jatamansi is a well-documented herbal agent used to treat digestive and neuropsychiatric disorders in oriental medicinal systems. However, few simple, rapid, and comprehensive methods have been reported for quality assessment and control of N. jatamansi. Herein, a UPLC method with photodiode array detection was developed for both fingerprint investigation of N. jatamansi and simultaneous quantitative analysis of six serotonin transporter modulatory constituents of N. jatamansi. For chromatographic fingerprinting, 24 common peaks were selected as characteristic peaks to assess the consistency of N. jatamansi samples from different retail sources. Six of the common peaks (5, 7, 12, and 16-18) were identified as desoxo-narchinol A, buddleoside, isonardosinone, nardosinone, kanshone H, and (-)-aristolone, respectively, by phytochemical investigation. Five of the six compounds significantly either enhanced or inhibited serotonin transporter activity, while (-)-aristolone (peak 18) did not show any serotonin transporter activity. In quantitative analysis, the six compounds showed good linearity (r > 0.999) within the test ranges. The precision, expressed as relative standard deviation, was in the range of 0.25-2.77%, and the recovery of the method was in the range of 92-105%. The UPLC-photodiode array detection-based fingerprint analysis and quantitative methods reported here could be used for routine quality control of N. jatamansi. Georg Thieme Verlag KG Stuttgart · New York.

  18. Application of principal component regression and partial least squares regression in ultraviolet spectrum water quality detection

    NASA Astrophysics Data System (ADS)

    Li, Jiangtong; Luo, Yongdao; Dai, Honglin

    2018-01-01

    Water is the source of life and the essential foundation of all life. With the development of industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which partial least squares regression (PLSR) has become the predominant technique; however, in some special cases, PLSR produces considerable errors. To solve this problem, the traditional principal component regression (PCR) analysis method is improved in this paper by using the principle of PLSR. The experimental results show that, for some special experimental data sets, the improved PCR analysis method performs better than PLSR. PCR and PLSR are the focus of this paper. Firstly, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components, which carry most of the original data information, are extracted by using the principle of PLSR. Secondly, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components are obtained. Finally, the same water spectral data set is processed by both PLSR and the improved PCR, and the two results are analyzed and compared: the improved PCR and PLSR give similar results for most data, but the improved PCR is better than PLSR for data near the detection limit. Both PLSR and the improved PCR can be used in ultraviolet spectral analysis of water, but for data near the detection limit the improved PCR gives better results than PLSR.
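    As an illustration of the comparison described above, the sketch below contrasts principal component regression with PLSR on a synthetic spectral data set. It uses Python/scikit-learn rather than the MATLAB/SPSS workflow of the paper, and the component count and data are assumptions for demonstration only.

```python
# Minimal sketch comparing principal component regression (PCR) and partial
# least squares regression (PLSR); data and component count are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.standard_normal((80, 200))          # 80 samples x 200 UV wavelengths
y = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(80)   # e.g. a concentration

pcr = make_pipeline(PCA(n_components=5), LinearRegression())
pls = PLSRegression(n_components=5)

pcr.fit(X, y)
pls.fit(X, y)
print("PCR  R^2:", pcr.score(X, y))
print("PLSR R^2:", pls.score(X, y))
```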

  19. Parallel evaluation of broad virus detection methods.

    PubMed

    Modrof, Jens; Berting, Andreas; Kreil, Thomas R

    2014-01-01

    The testing for adventitious viruses is of critical importance during development and production of biological products. The recent emergence and ongoing development of broad virus detection methods calls for an evaluation of whether these methods can appropriately be implemented into current adventitious agent testing procedures. To assess the suitability of several broad virus detection methods, a comparative experimental study was conducted: four virus preparations, which were spiked at two different concentrations each into two different cell culture media, were sent to four investigators in a blinded fashion for analysis with broad virus detection methods such as polymerase chain reaction-electrospray ionization mass spectrometry (PCR-ESI/MS), microarray, and two approaches utilizing massively parallel sequencing. The results that were reported by the investigators revealed that all methods were able to identify the majority of samples correctly (mean 83%), with a surprisingly narrow range among the methods, that is, between 72% (PCR-ESI/MS) and 95% (microarray). In addition to the correct results, a variety of unexpected assignments were reported for a minority of samples, again with little variation regarding the methods used (range 20-45%), while false negatives were reported for 0-25% of the samples. Regarding assay sensitivity, the viruses were detected by all methods included in this study at concentrations of about 4-5 log10 quantitative PCR copies/mL, and probably with higher sensitivity in some cases. In summary, the broad virus detection methods investigated were shown to be suitable even for detection of relatively low virus concentrations. However, there is also some potential for the production of false-positive as well as false-negative assignments, which indicates the requirement for further improvements before these methods can be considered for routine use. © PDA, Inc. 2014.

  20. Detecting sulphate aerosol geoengineering with different methods

    DOE PAGES

    Lo, Y. T. Eunice; Charlton-Perez, Andrew J.; Lott, Fraser C.; ...

    2016-12-15

    Sulphate aerosol injection has been widely discussed as a possible way to engineer future climate. Monitoring it would require detecting its effects amidst internal variability and in the presence of other external forcings. Here, we investigate how the use of different detection methods and filtering techniques affects the detectability of sulphate aerosol geoengineering in annual-mean global-mean near-surface air temperature. This is done by assuming a future scenario that injects 5 Tg yr⁻¹ of sulphur dioxide into the stratosphere and cross-comparing simulations from 5 climate models. 64% of the studied comparisons would require 25 years or more for detection when no filter and the multi-variate method that has been extensively used for attributing climate change are used, while 66% of the same comparisons would require fewer than 10 years for detection using a trend-based filter. This then highlights the high sensitivity of sulphate aerosol geoengineering detectability to the choice of filter. With the same trend-based filter but a non-stationary method, 80% of the comparisons would require fewer than 10 years for detection. This does not imply sulphate aerosol geoengineering should be deployed, but suggests that both detection methods could be used for monitoring geoengineering in global, annual mean temperature should it be needed.

  1. Automated Methods for Multiplexed Pathogen Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Tim M.; Dockendorff, Brian P.; Quinonez-Diaz, Maria D.

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells of each organism. Results for the RNA version of BEADS are also promising. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities

  2. Conditional anomaly detection methods for patient–management alert systems

    PubMed Central

    Valko, Michal; Cooper, Gregory; Seybert, Amy; Visweswaran, Shyam; Saul, Melissa; Hauskrecht, Milos

    2010-01-01

    Anomaly detection methods can be very useful in identifying unusual or interesting patterns in data. A recently proposed conditional anomaly detection framework extends anomaly detection to the problem of identifying anomalous patterns on a subset of attributes in the data. The anomaly always depends (is conditioned) on the value of remaining attributes. The work presented in this paper focuses on instance–based methods for detecting conditional anomalies. The methods rely on the distance metric to identify examples in the dataset that are most critical for detecting the anomaly. We investigate various metrics and metric learning methods to optimize the performance of the instance–based anomaly detection methods. We show the benefits of the instance–based methods on two real–world detection problems: detection of unusual admission decisions for patients with the community–acquired pneumonia and detection of unusual orders of an HPF4 test that is used to confirm Heparin induced thrombocytopenia — a life–threatening condition caused by the Heparin therapy. PMID:25392850
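    The instance-based idea described above can be illustrated with a simple nearest-neighbour sketch: an instance is scored as conditionally anomalous when its target attribute disagrees with those of its neighbours in the space of the remaining (context) attributes. This is a generic kNN-style illustration under assumed inputs, not the authors' metric-learning method.

```python
# Illustrative kNN-style conditional anomaly score; `context` holds the
# conditioning attributes and `target` a numeric (e.g. binary) outcome.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def conditional_anomaly_scores(context, target, k=10):
    """Score is high when an instance's target differs from the average
    target of its k nearest neighbours in the context space."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(context)
    _, idx = nn.kneighbors(context)
    idx = idx[:, 1:]                       # drop the self-match
    neighbour_mean = target[idx].mean(axis=1)
    return np.abs(target - neighbour_mean)
```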

  3. [The establishment and application of internal quality control system for real-time quantitative PCR detection of BCR-ABL (P210) transcript levels].

    PubMed

    Zhong, C Q; He, N; Hua, M Q; Wei, X D; Ma, D X; Ji, C Y

    2016-09-14

    Objective: To establish an internal quality control system for real-time quantitative PCR (RQ-PCR) detection of BCR-ABL (P210) transcript levels. Methods: Using K562 cells and HL-60 cells, we prepared high- and low-level BCR-ABL internal quality control substances. The BCR-ABL (P210) transcript levels of the internal quality control substances were determined 184 times together with clinical samples from August 2013 to October 2015. The slope, intercept and correlation coefficient of the standard curve were calculated according to different reagent lots (lot numbers 20130303, 20131212, 20140411 and 20150327, referred to as R1, R2, R3 and R4, respectively), and the detection results of the quality control substances were calculated according to different reagent lots and quality control substance lots (lot numbers 20130725 and 20140611, referred to as Q1 and Q2, respectively). The results were then analyzed with Levey-Jennings quality control charts combined with the Westgard multi-rule theory. Results: (1) We analyzed the slope and intercept of the standard curve. Fifty-three detections with the R1 reagent, 80 with the R3 reagent and 14 with the R4 reagent were all under control. Of the 37 detections with the R2 reagent, the slope was out of control 6 times: it was below the mean minus one standard deviation for tests 2-8 and above the mean for tests 12-37. The intercept was out of control 9 times: above the mean plus one standard deviation for tests 1-8 and below the mean for tests 12-37. (2) According to the detection results of the quality control substances, for the Q1 quality control substance, 49 tests with the R1 reagent were under control, and 1 of 23 tests with the R2 reagent was out of control. For the Q2 quality control substance, 14 tests with the R2 reagent, 72 tests with the R3 reagent and 14 tests with the R4 reagent were all under control. Conclusion: The preparation of high- and low-level quality control substances using K562 and HL-60 cells was convenient and the detection

  4. WE-G-204-07: Automated Characterization of Perceptual Quality of Clinical Chest Radiographs: Improvements in Lung, Spine, and Hardware Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, J; Zhang, L; Samei, E

    Purpose: To develop and validate more robust methods for automated lung, spine, and hardware detection in AP/PA chest images. This work is part of a continuing effort to automatically characterize the perceptual image quality of clinical radiographs. [Y. Lin et al. Med. Phys. 39, 7019–7031 (2012)] Methods: Our previous implementation of lung/spine identification was applicable to only one vendor. A more generalized routine was devised based on three primary components: lung boundary detection, fuzzy c-means (FCM) clustering, and a clinically-derived lung pixel probability map. Boundary detection was used to constrain the lung segmentations. FCM clustering produced grayscale- and neighborhood-based pixel classification probabilities which are weighted by the clinically-derived probability maps to generate a final lung segmentation. Lung centerlines were set along the left-right lung midpoints. Spine centerlines were estimated as a weighted average of body contour, lateral lung contour, and intensity-based centerline estimates. Centerline estimation was tested on 900 clinical AP/PA chest radiographs which included inpatient/outpatient, upright/bedside, men/women, and adult/pediatric images from multiple imaging systems. Our previous implementation further did not account for the presence of medical hardware (pacemakers, wires, implants, staples, stents, etc.) potentially biasing image quality analysis. A hardware detection algorithm was developed using a gradient-based thresholding method. The training and testing paradigm used a set of 48 images from which 1920 51×51-pixel ROIs with and 1920 ROIs without hardware were manually selected. Results: Acceptable lung centerlines were generated in 98.7% of radiographs while spine centerlines were acceptable in 99.1% of radiographs. Following threshold optimization, the hardware detection software yielded average true positive and true negative rates of 92.7% and 96.9%, respectively. Conclusion: Updated

  5. Performance of Traditional and Molecular Methods for Detecting Biological Agents in Drinking Water

    USGS Publications Warehouse

    Francy, Donna S.; Bushon, Rebecca N.; Brady, Amie M.G.; Bertke, Erin E.; Kephart, Christopher M.; Likirdopulos, Christina A.; Mailot, Brian E.; Schaefer, Frank W.; Lindquist, H.D. Alan

    2009-01-01

    To reduce the impact from a possible bioterrorist attack on drinking-water supplies, analytical methods are needed to rapidly detect the presence of biological agents in water. To this end, 13 drinking-water samples were collected at 9 water-treatment plants in Ohio to assess the performance of a molecular method in comparison to traditional analytical methods that take longer to perform. Two 100-liter samples were collected at each site during each sampling event; one was seeded in the laboratory with six biological agents - Bacillus anthracis (B. anthracis), Burkholderia cepacia (as a surrogate for Bu. pseudomallei), Francisella tularensis (F. tularensis), Salmonella Typhi (S. Typhi), Vibrio cholerae (V. cholerae), and Cryptosporidium parvum (C. parvum). The seeded and unseeded samples were processed by ultrafiltration and analyzed by use of quantitative polymerase chain reaction (qPCR), a molecular method, and culture methods for bacterial agents or the immunomagnetic separation/fluorescent antibody (IMS/FA) method for C. parvum as traditional methods. Six replicate seeded samples were also processed and analyzed. For traditional methods, recoveries were highly variable between samples and even between some replicate samples, ranging from below detection to greater than 100 percent. Recoveries were significantly related to water pH, specific conductance, and dissolved organic carbon (DOC) for all bacteria combined by culture methods, but none of the water-quality characteristics tested were related to recoveries of C. parvum by IMS/FA. Recoveries were not determined by qPCR because of problems in quantifying organisms by qPCR in the composite seed. Instead, qPCR results were reported as detected, not detected (no qPCR signal), or +/- detected (Cycle Threshold or 'Ct' values were greater than 40). Several sample results by qPCR were omitted from the dataset because of possible problems with qPCR reagents, primers, and probes. For the remaining 14 qPCR results

  6. [Method for the quality assessment of data collection processes in epidemiological studies].

    PubMed

    Schöne, G; Damerow, S; Hölling, H; Houben, R; Gabrys, L

    2017-10-01

    For a quantitative evaluation of primary data collection processes in epidemiological surveys based on accompaniments and observations (in the field), there is no description of test criteria or methodologies in the relevant literature and thus no known application in practice. Therefore, methods need to be developed and existing procedures adapted. The aim was to identify quality-relevant developments within quality dimensions by means of inspection points (quality indicators) during the process of data collection. As a result, we seek to implement and establish a methodology for the assessment of overall survey quality supplementary to standardized data analyses. Monitors detect deviations from standard primary data collection during site visits by applying standardized checklists. Quantitative results - overall and for each dimension - are obtained by numerical calculation of quality indicators. Score results are categorized and color coded. This visual prioritization indicates the necessity for intervention. The results obtained give clues regarding the current quality of data collection. This allows for the identification of those sections where interventions for quality improvement are needed. In addition, process quality development can be shown over time on an intercomparable basis. This methodology for the evaluation of data collection quality can identify deviations from norms, focus quality analyses and help trace causes for significant deviations.

  7. A non-reference evaluation method for edge detection of wear particles in ferrograph images

    NASA Astrophysics Data System (ADS)

    Wang, Jingqiu; Bi, Ju; Wang, Lianjun; Wang, Xiaolei

    2018-02-01

    Edges are one of the most important features of wear particles in a ferrograph image and are widely used to extract parameters, recognize types of wear particles, and assist in the identification of the wear mode and severity. Edge detection is a critical step in ferrograph image processing and analysis. To date, no single algorithm guarantees the production of good-quality edges in ferrograph images for a variety of applications. Therefore, it is desirable to have a reliable evaluation method for measuring the performance of various edge detection algorithms and for aiding in the selection of the optimal parameter and algorithm for ferrographic applications. In this paper, a new non-reference method for the objective evaluation of wear particle edge detection is proposed. In this method, a comprehensive index of edge evaluation is composed of three components, i.e., the reconstruction-based similarity sub-index between the original image and the reconstructed image, the confidence degree sub-index used to show the true or false degree of the edge pixels, and the edge form sub-index used to determine the direction consistency and width uniformity of the edges. Two experiments are performed to illustrate the validity of the proposed method. First, the method is used to select the best parameters for an edge detection algorithm; it is then used to compare the results obtained using various edge detection algorithms and determine the best algorithm. Experimental results on various real ferrograph images verify the effectiveness of the proposed method.

  8. Raman spectroscopy method for subsurface detection of food powders through plastic layers

    NASA Astrophysics Data System (ADS)

    Dhakal, Sagar; Chao, Kuanglin; Qin, Jianwei; Schmidt, Walter F.; Kim, Moon S.; Chan, Diane E.; Bae, Abigail

    2017-05-01

    Proper chemical analysis of materials in sealed containers is important for quality control purposes. Although it is feasible to detect chemicals at the top surface layer, it is relatively challenging to detect objects beneath an obscuring surface. This study used the spatially offset Raman spectroscopy (SORS) method to detect urea, ibuprofen and acetaminophen powders contained within one or more (up to eight) layers of gelatin capsules, to demonstrate subsurface chemical detection and identification. A 785 nm point-scan Raman spectroscopy system was used to acquire spatially offset Raman spectra over an offset range of 0 to 10 mm from the surfaces of 24 encapsulated samples, using a step size of 0.1 mm to obtain 101 spectral measurements per sample. With increasing offset distance, the fraction of information from the deeper subsurface material increased compared to that from the top surface material. The series of measurements was analyzed to differentiate and identify the top surface and subsurface materials. Because the SORS of each sample contains mixed contributions from the powder and capsule, it was decomposed using self-modeling mixture analysis (SMA) to obtain the pure spectrum of each component, and the corresponding components were identified using spectral information divergence values. Results show that the SORS technique together with the SMA method has potential for non-invasive detection of chemicals at deep subsurface layers.
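    Spectral information divergence, used above to match the decomposed component spectra to references, can be computed as a symmetric Kullback-Leibler-type divergence between spectra normalized to unit sum. The sketch below is a minimal illustration with placeholder spectra, not the study's processing chain.

```python
# Minimal sketch of spectral information divergence (SID) between two spectra;
# the input arrays are placeholders for measured Raman spectra.
import numpy as np

def spectral_information_divergence(s1, s2, eps=1e-12):
    """Symmetric KL-type divergence between two non-negative spectra."""
    p = np.clip(s1, eps, None); p = p / p.sum()
    q = np.clip(s2, eps, None); q = q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# A lower SID indicates a closer spectral match to a library reference spectrum.
```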

  9. Improved astigmatic focus error detection method

    NASA Technical Reports Server (NTRS)

    Bernacki, Bruce E.

    1992-01-01

    All easy-to-implement focus- and track-error detection methods presently used in magneto-optical (MO) disk drives using pre-grooved media suffer from a side effect known as feedthrough. Feedthrough is the unwanted focus error signal (FES) produced when the optical head is seeking a new track, and light refracted from the pre-grooved disk produces an erroneous FES. Some focus and track-error detection methods are more resistant to feedthrough, but tend to be complicated and/or difficult to keep in alignment as a result of environmental insults. The astigmatic focus/push-pull tracking method is an elegant, easy-to-align focus- and track-error detection method. Unfortunately, it is also highly susceptible to feedthrough when astigmatism is present, with the worst effects caused by astigmatism oriented such that the tangential and sagittal foci are at 45 deg to the track direction. This disclosure outlines a method to nearly completely eliminate the worst-case form of feedthrough due to astigmatism oriented 45 deg to the track direction. Feedthrough due to other primary aberrations is not improved, but performance is identical to the unimproved astigmatic method.

  10. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  11. Major quality trait analysis and QTL detection in hexaploid wheat in humid rain-fed agriculture.

    PubMed

    Li, H M; Tang, Z X; Zhang, H Q; Yan, B J; Ren, Z L

    2013-05-21

    Humid rain-fed agriculture is a special environment for wheat (Triticum aestivum) culture that tends to negatively affect wheat yield and quality. To identify quality characters of wheat in a humid environment, we conducted quality analysis and quantitative trait loci (QTL) detection over several years in a recombinant inbred line population, one parent of which had a high level of quality. We found that high-quality wheat had less gluten content and lower protein content. Apparently, wheat quality and the associated quantity traits were in a dynamic state of equilibrium. We detected 83 QTL for 10 wheat quality traits in this recombinant inbred line population. Nine QTL were detected in both evaluation years; Q.DT.scau-2A, linked to Xwmc522-2A, was detected at the same genetic location in both years. Other QTL for different traits were detected simultaneously in more than one location. Consequently, there appeared to be pleiotropic genes that control wheat quality. Based on previous studies and our research on QTL analysis of grain protein content, we conclude that there must be one or more genes for grain protein content on chromosome 6B, whose expression was little affected by environment. We constructed a consensus map and projected the QTL onto it. This was useful for choosing optimal markers for marker-assisted breeding and map-based cloning.

  12. Detection of expression quantitative trait Loci in complex mouse crosses: impact and alleviation of data quality and complex population substructure.

    PubMed

    Iancu, Ovidiu D; Darakjian, Priscila; Kawane, Sunita; Bottomly, Daniel; Hitzemann, Robert; McWeeney, Shannon

    2012-01-01

    Complex Mus musculus crosses, e.g., heterogeneous stock (HS), provide increased resolution for quantitative trait loci detection. However, increased genetic complexity challenges detection methods, with discordant results due to low data quality or complex genetic architecture. We quantified the impact of these factors across three mouse crosses and two different detection methods, identifying procedures that greatly improve detection quality. Importantly, HS populations have complex genetic architectures not fully captured by the whole genome kinship matrix, calling for incorporating chromosome-specific relatedness information. We analyze three increasingly complex crosses, using gene expression levels as quantitative traits. The three crosses were an F2 intercross, a HS formed by crossing four inbred strains (HS4), and a HS (HS-CC) derived from the eight lines found in the collaborative cross. Brain (striatum) gene expression and genotype data were obtained using the Illumina platform. We found large disparities between methods, with concordance varying as genetic complexity increased; this problem was more acute for probes with distant regulatory elements (trans). A suite of data filtering steps resulted in substantial increases in reproducibility. Genetic relatedness between samples generated an overabundance of detected eQTLs; an adjustment procedure that includes the kinship matrix attenuates this problem. However, we find that relatedness between individuals is not evenly distributed across the genome; information from distinct chromosomes results in relatedness structure different from the whole genome kinship matrix. Shared polymorphisms from distinct chromosomes collectively affect expression levels, confounding eQTL detection. We suggest that considering chromosome-specific relatedness can result in improved eQTL detection.

  13. Study of New Method Combined Ultra-High Frequency (UHF) Method and Ultrasonic Method on PD Detection for GIS

    NASA Astrophysics Data System (ADS)

    Li, Yanran; Chen, Duo; Zhang, Jiwei; Chen, Ning; Li, Xiaoqi; Gong, Xiaojing

    2017-09-01

    GIS (gas-insulated switchgear) is an important piece of equipment in power systems. Partial discharge detection plays an important role in assessing the insulation performance of GIS. The UHF method and the ultrasonic method are frequently used for partial discharge (PD) detection in GIS, so it is necessary to investigate both methods. However, very few studies have been conducted on combining the two. From the viewpoint of safety, a new PD detection method for GIS that combines the UHF method and the ultrasonic method is proposed in order to greatly enhance the anti-interference capability of signal detection and the accuracy of fault localization. This paper presents a study aimed at clarifying the effect of this combined UHF-ultrasonic method. Partial discharge tests were performed in a laboratory-simulated environment. The results demonstrate the anti-interference capability of signal detection and the accuracy of fault localization achieved by the new combined UHF-ultrasonic method.

  14. Enhanced data validation strategy of air quality monitoring network.

    PubMed

    Harkat, Mohamed-Faouzi; Mansouri, Majdi; Nounou, Mohamed; Nounou, Hazem

    2018-01-01

    Quick validation and detection of faults in measured air quality data is a crucial step towards achieving the objectives of air quality networks. Therefore, the objectives of this paper are threefold: (i) to develop a modeling technique that can be used to predict the normal behavior of air quality variables and help provide an accurate reference for monitoring purposes; (ii) to develop a fault detection method that can effectively and quickly detect any anomalies in measured air quality data. For this purpose, a new fault detection method based on the combination of the generalized likelihood ratio test (GLRT) and the exponentially weighted moving average (EWMA) will be developed. GLRT is a well-known statistical fault detection method that relies on maximizing the detection probability for a given false alarm rate. In this paper, we propose to develop a GLRT-based EWMA fault detection method that will be able to detect changes in the values of certain air quality variables; (iii) to develop a fault isolation and identification method that allows the fault source(s) to be defined so that appropriate corrective actions can be applied. In this paper, a reconstruction approach based on the Midpoint-Radii Principal Component Analysis (MRPCA) model will be developed to handle the types of data and models associated with air quality monitoring networks. All air quality modeling, fault detection, fault isolation and reconstruction methods developed in this paper will be validated using real air quality data (such as particulate matter, ozone, nitrogen and carbon oxide measurements). Copyright © 2017 Elsevier Inc. All rights reserved.
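    One building block of the scheme described above, the EWMA monitoring statistic applied to model residuals, is sketched below; the smoothing factor and control-limit width are illustrative choices, and the GLRT stage and MRPCA model are not shown.

```python
# Minimal sketch of an EWMA chart on zero-mean model residuals; lambda and L
# are illustrative tuning values, not those of the paper.
import numpy as np

def ewma_alarms(residuals, lam=0.2, L=3.0):
    """Return boolean alarms from an EWMA control chart on the residuals."""
    sigma = np.std(residuals, ddof=1)
    z, alarms = 0.0, []
    for t, r in enumerate(residuals, start=1):
        z = lam * r + (1 - lam) * z
        # time-varying control limit from the EWMA variance formula
        limit = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        alarms.append(abs(z) > limit)
    return np.array(alarms)
```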

  15. Thermoelectric SQUID method for the detection of segregations

    NASA Astrophysics Data System (ADS)

    Hinken, Johann H.; Tavrin, Yury

    2000-05-01

    Aero engine turbine discs are most critical parts. Material inhomogeneities can cause disc fractures during the flight with fatal air disasters. Nondestructive testing (NDT) of the discs in various machining steps is necessary and performed as well as possible. Conventional NDT methods, however, like eddy current testing and ultrasonic testing have unacceptable limits. For example, subsurface segregations often cannot be detected directly but only indirectly in such cases when cracks already have developed from them. This may be too late. A new NDT method, which we call the Thermoelectric SQUID Method, has been developed. It allows for the detection of metallic inclusions within non-ferromagnetic metallic base material. This paper describes the results of a feasibility study on aero engine turbine discs made from Inconel® 718. These contained segregations that had been detected before by anodic etching. With the Thermoelectric SQUID Method, these segregations were detected again, and further segregations below the surfaces have been found, which had not been detected before. For this new NDT method the disc material is quasi-transparent. The Thermoelectric SQUID Method is also useful to detect distributed and localized inhomogeneities in pure metals like niobium sheets for particle accelerators.

  16. Evaluation of Pan-Sharpening Methods for Automatic Shadow Detection in High Resolution Images of Urban Areas

    NASA Astrophysics Data System (ADS)

    de Azevedo, Samara C.; Singh, Ramesh P.; da Silva, Erivaldo A.

    2017-04-01

    In high-spatial-resolution images of urban environments, tall objects cast intense shadows that introduce erroneous information into urban mapping. Because of these shadows, it is difficult to automatically detect objects (such as buildings, trees, structures and towers) and to estimate surface coverage from high-spatial-resolution imagery. Thus, automatic shadow detection is the first necessary preprocessing step to improve the outcome of many remote sensing applications, particularly for high-spatial-resolution images. Efforts have been made to explore spatial and spectral information to evaluate such shadows. In this paper, we have used morphological attribute filtering to extract contextual relations in an efficient multilevel approach for high-resolution images. The attribute selected for the filtering was the area estimated from the shadow spectral feature using the Normalized Saturation-Value Difference Index (NSVDI) derived from pan-sharpened images. In order to assess the quality of the fusion products and their influence on the shadow detection algorithm, we evaluated three pan-sharpening methods - Intensity-Hue-Saturation (IHS), Principal Components (PC) and Gram-Schmidt (GS) - through the image quality measures Correlation Coefficient (CC), Root Mean Square Error (RMSE), Relative Dimensionless Global Error in Synthesis (ERGAS) and Universal Image Quality Index (UIQI). Experimental results over a WorldView-2 scene of São Paulo city (Brazil) show that the GS method provides good correlation with the original multispectral bands with no radiometric or contrast distortion. The automatic method using GS pan-sharpening for NSVDI generation clearly distinguishes shadow from non-shadow pixels with an overall accuracy of more than 90%. The experimental results confirm the effectiveness of the proposed approach, which could be used for subsequent shadow removal and is reliable for object recognition, land-cover mapping, 3D reconstruction, etc., especially in developing countries where land use and
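    For illustration, a minimal shadow-masking sketch based on the NSVDI is given below, assuming the common definition NSVDI = (S − V)/(S + V) in HSV space; the threshold value is a placeholder, not the one used in the study, and the morphological attribute filtering stage is not shown.

```python
# Minimal NSVDI-based shadow mask for an RGB image scaled to [0, 1];
# the definition (S - V)/(S + V) and the threshold are assumptions.
import numpy as np
from skimage.color import rgb2hsv

def nsvdi_shadow_mask(rgb, threshold=0.0):
    """Return a boolean shadow mask: shadows tend to have high S relative to V."""
    hsv = rgb2hsv(rgb)
    s, v = hsv[..., 1], hsv[..., 2]
    nsvdi = (s - v) / np.clip(s + v, 1e-6, None)
    return nsvdi > threshold
```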

  17. Comparative analysis of methods for detecting interacting loci

    PubMed Central

    2011-01-01

    Background Interactions among genetic loci are believed to play an important role in disease risk. While many methods have been proposed for detecting such interactions, their relative performance remains largely unclear, mainly because different data sources, detection performance criteria, and experimental protocols were used in the papers introducing these methods and in subsequent studies. Moreover, there have been very few studies strictly focused on comparison of existing methods. Given the importance of detecting gene-gene and gene-environment interactions, a rigorous, comprehensive comparison of performance and limitations of available interaction detection methods is warranted. Results We report a comparison of eight representative methods, of which seven were specifically designed to detect interactions among single nucleotide polymorphisms (SNPs), with the last a popular main-effect testing method used as a baseline for performance evaluation. The selected methods, multifactor dimensionality reduction (MDR), full interaction model (FIM), information gain (IG), Bayesian epistasis association mapping (BEAM), SNP harvester (SH), maximum entropy conditional probability modeling (MECPM), logistic regression with an interaction term (LRIT), and logistic regression (LR) were compared on a large number of simulated data sets, each, consistent with complex disease models, embedding multiple sets of interacting SNPs, under different interaction models. The assessment criteria included several relevant detection power measures, family-wise type I error rate, and computational complexity. There are several important results from this study. First, while some SNPs in interactions with strong effects are successfully detected, most of the methods miss many interacting SNPs at an acceptable rate of false positives. In this study, the best-performing method was MECPM. Second, the statistical significance assessment criteria, used by some of the methods to control the

  18. Comparative analysis of methods for detecting interacting loci.

    PubMed

    Chen, Li; Yu, Guoqiang; Langefeld, Carl D; Miller, David J; Guy, Richard T; Raghuram, Jayaram; Yuan, Xiguo; Herrington, David M; Wang, Yue

    2011-07-05

    Interactions among genetic loci are believed to play an important role in disease risk. While many methods have been proposed for detecting such interactions, their relative performance remains largely unclear, mainly because different data sources, detection performance criteria, and experimental protocols were used in the papers introducing these methods and in subsequent studies. Moreover, there have been very few studies strictly focused on comparison of existing methods. Given the importance of detecting gene-gene and gene-environment interactions, a rigorous, comprehensive comparison of performance and limitations of available interaction detection methods is warranted. We report a comparison of eight representative methods, of which seven were specifically designed to detect interactions among single nucleotide polymorphisms (SNPs), with the last a popular main-effect testing method used as a baseline for performance evaluation. The selected methods, multifactor dimensionality reduction (MDR), full interaction model (FIM), information gain (IG), Bayesian epistasis association mapping (BEAM), SNP harvester (SH), maximum entropy conditional probability modeling (MECPM), logistic regression with an interaction term (LRIT), and logistic regression (LR) were compared on a large number of simulated data sets, each, consistent with complex disease models, embedding multiple sets of interacting SNPs, under different interaction models. The assessment criteria included several relevant detection power measures, family-wise type I error rate, and computational complexity. There are several important results from this study. First, while some SNPs in interactions with strong effects are successfully detected, most of the methods miss many interacting SNPs at an acceptable rate of false positives. In this study, the best-performing method was MECPM. Second, the statistical significance assessment criteria, used by some of the methods to control the type I error rate

  19. Multidrug-resistant tuberculosis treatment failure detection depends on monitoring interval and microbiological method

    PubMed Central

    White, Richard A.; Lu, Chunling; Rodriguez, Carly A.; Bayona, Jaime; Becerra, Mercedes C.; Burgos, Marcos; Centis, Rosella; Cohen, Theodore; Cox, Helen; D'Ambrosio, Lia; Danilovitz, Manfred; Falzon, Dennis; Gelmanova, Irina Y.; Gler, Maria T.; Grinsdale, Jennifer A.; Holtz, Timothy H.; Keshavjee, Salmaan; Leimane, Vaira; Menzies, Dick; Milstein, Meredith B.; Mishustin, Sergey P.; Pagano, Marcello; Quelapio, Maria I.; Shean, Karen; Shin, Sonya S.; Tolman, Arielle W.; van der Walt, Martha L.; Van Deun, Armand; Viiklepp, Piret

    2016-01-01

    Debate persists about monitoring method (culture or smear) and interval (monthly or less frequently) during treatment for multidrug-resistant tuberculosis (MDR-TB). We analysed existing data and estimated the effect of monitoring strategies on timing of failure detection. We identified studies reporting microbiological response to MDR-TB treatment and solicited individual patient data from authors. Frailty survival models were used to estimate pooled relative risk of failure detection in the last 12 months of treatment; hazard of failure using monthly culture was the reference. Data were obtained for 5410 patients across 12 observational studies. During the last 12 months of treatment, failure detection occurred in a median of 3 months by monthly culture; failure detection was delayed by 2, 7, and 9 months relying on bimonthly culture, monthly smear and bimonthly smear, respectively. Risk (95% CI) of failure detection delay resulting from monthly smear relative to culture is 0.38 (0.34–0.42) for all patients and 0.33 (0.25–0.42) for HIV-co-infected patients. Failure detection is delayed by reducing the sensitivity and frequency of the monitoring method. Monthly monitoring of sputum cultures from patients receiving MDR-TB treatment is recommended. Expanded laboratory capacity is needed for high-quality culture, and for smear microscopy and rapid molecular tests. PMID:27587552

  20. Chemicals of emerging concern in water and bottom sediment in Great Lakes areas of concern, 2010 to 2011-Collection methods, analyses methods, quality assurance, and data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Foreman, William T.; Furlong, Edward T.; Smith, Steven G.

    2012-01-01

    The U.S. Geological Survey (USGS) cooperated with the U.S. Environmental Protection Agency and the U.S. Fish and Wildlife Service on a study to identify the occurrence of chemicals of emerging concern (CECs) in water and bottom-sediment samples collected during 2010–11 at sites in seven areas of concern (AOCs) throughout the Great Lakes. Study sites include tributaries to the Great Lakes in AOCs located near Duluth, Minn.; Green Bay, Wis.; Rochester, N.Y.; Detroit, Mich.; Toledo, Ohio; Milwaukee, Wis.; and Ashtabula, Ohio. This report documents the collection methods, analyses methods, quality-assurance data and analyses, and provides the data for this study. Water and bottom-sediment samples were analyzed at the USGS National Water Quality Laboratory in Denver, Colo., for a broad suite of CECs. During this study, 135 environmental and 23 field duplicate samples of surface water and wastewater effluent, 10 field blank water samples, and 11 field spike water samples were collected and analyzed. Sixty-one of the 69 wastewater indicator chemicals (laboratory method 4433) analyzed were detected at concentrations ranging from 0.002 to 11.2 micrograms per liter. Twenty-eight of the 48 pharmaceuticals (research method 8244) analyzed were detected at concentrations ranging from 0.0029 to 22.0 micrograms per liter. Ten of the 20 steroid hormones and sterols analyzed (research method 4434) were detected at concentrations ranging from 0.16 to 10,000 nanograms per liter. During this study, 75 environmental, 13 field duplicate samples, and 9 field spike samples of bottom sediment were collected and analyzed for a wide variety of CECs. Forty-seven of the 57 wastewater indicator chemicals (laboratory method 5433) analyzed were detected at concentrations ranging from 0.921 to 25,800 nanograms per gram. Seventeen of the 20 steroid hormones and sterols (research method 6434) analyzed were detected at concentrations ranging from 0.006 to 8,921 nanograms per gram. Twelve of

  1. Coherent scattering noise reduction method with wavelength diversity detection for holographic data storage system

    NASA Astrophysics Data System (ADS)

    Nakamura, Yusuke; Hoshizawa, Taku; Takashima, Yuzuru

    2017-09-01

    A new method, wavelength diversity detection (WDD), for improving signal quality is proposed and its effectiveness is numerically confirmed. We consider that WDD is especially effective for high-capacity systems having low hologram diffraction efficiencies. In such systems, the signal quality is primarily limited by coherent scattering noise; thus, effective improvement of the signal quality under a scattering-limited system is of great interest. WDD utilizes a new degree of freedom, the spectrum width, and scattering by molecules to improve the signal quality of the system. We found that WDD improves the quality by counterbalancing the degradation of the quality due to Bragg mismatch. With WDD, a higher-scattering-coefficient medium can improve the quality. The result provides an interesting insight into the requirements for material characteristics, especially for a large-M/# material. In general, a larger-M/# material contains more molecules; thus, the system is subject to more scattering, which actually improves the quality with WDD. We propose a pathway for a future holographic data storage system (HDSS) using WDD, which can record a larger amount of data than a conventional HDSS.

  2. Weld quality inspection using laser-EMAT ultrasonic system and C-scan method

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Ume, I. Charles

    2014-02-01

    The laser/EMAT ultrasonic technique has attracted more and more interest in weld quality inspection because of its non-destructive and non-contact characteristics. When ultrasonic techniques are used to inspect welds joining relatively thin plates, the dominant ultrasonic waves present in the plates are Lamb waves, which propagate through the whole thickness, and the traditional time-of-flight (ToF) method loses its power. The broadband nature of laser-excited ultrasound, plus the dispersive and multi-modal characteristics of Lamb waves, make the EMAT-acquired signals very complicated in this situation. The challenge lies in interpreting the received signals and establishing a relationship between signal features and weld quality. In this paper, the laser/EMAT ultrasonic technique was applied in a C-scan manner to record the full wave propagation field over an area close to the weld. The effect of a weld defect on the propagation field of Lamb waves was then studied visually by watching a movie generated from the recorded signals. This method proved effective for detecting the presence of a hidden defect in the weld. The discrete wavelet transform (DWT) was applied to characterize the acquired ultrasonic signals, and an ideal band-pass filter was used to isolate the wave components most sensitive to the weld defect. Different interactions with the weld defect were observed for different wave components. Thus this C-scan method, combined with the DWT and an ideal band-pass filter, proved to be an effective methodology for experimentally studying the interactions of various laser-excited Lamb wave components with a weld defect. In this work, the method was demonstrated by inspecting a hidden local incomplete penetration in a weld. In fact, this method can be applied to study Lamb wave interactions with any type of structural inconsistency. This work also proposed an ideal-filter-based method to effectively reduce the total experimental time.
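    The wavelet step described above can be illustrated with a short sketch that decomposes a single ultrasonic A-scan into DWT bands and reports the energy per band; the wavelet family, decomposition level, sampling rate, and synthetic signal are assumptions for demonstration only.

```python
# Minimal sketch of DWT-based characterization of one ultrasonic A-scan;
# 'db4', level 5, the 10 MHz sampling rate, and the toy burst are illustrative.
import numpy as np
import pywt

fs = 10e6                                   # assumed 10 MHz sampling rate
t = np.arange(0, 200e-6, 1 / fs)
signal = np.sin(2 * np.pi * 500e3 * t) * np.exp(-t / 50e-6)   # toy wave burst

# Multi-level DWT: coeffs[0] is the approximation band, the rest are detail bands.
coeffs = pywt.wavedec(signal, wavelet="db4", level=5)
band_energy = [float(np.sum(c ** 2)) for c in coeffs]
print("energy per DWT band:", band_energy)
```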

  3. Face detection on distorted images using perceptual quality-aware features

    NASA Astrophysics Data System (ADS)

    Gunasekar, Suriya; Ghosh, Joydeep; Bovik, Alan C.

    2014-02-01

    We quantify the degradation in performance of a popular and effective face detector when human-perceived image quality is degraded by distortions due to additive white gaussian noise, gaussian blur or JPEG compression. It is observed that, within a certain range of perceived image quality, a modest increase in image quality can drastically improve face detection performance. These results can be used to guide resource or bandwidth allocation in a communication/delivery system that is associated with face detection tasks. A new face detector based on QualHOG features is also proposed that augments face-indicative HOG features with perceptual quality-aware spatial Natural Scene Statistics (NSS) features, yielding improved tolerance against image distortions. The new detector provides statistically significant improvements over a strong baseline on a large database of face images representing a wide range of distortions. To facilitate this study, we created a new Distorted Face Database, containing face and non-face patches from images impaired by a variety of common distortion types and levels. This new dataset is available for download and further experimentation at www.ideal.ece.utexas.edu/˜suriya/DFD/.

  4. A rapid silica spin column-based method of RNA extraction from fruit trees for RT-PCR detection of viruses.

    PubMed

    Yang, Fan; Wang, Guoping; Xu, Wenxing; Hong, Ni

    2017-09-01

    Efficient recovery of high quality RNA is very important for successful RT-PCR detection of plant RNA viruses. High levels of polyphenols and polysaccharides in plant tissues can irreversibly bind to and/or co-precipitate with RNA, which influences RNA isolation. In this study, a silica spin column-based RNA isolation method was developed by using commercially available silica columns combined with the application of a tissue lysis solution, and binding and washing buffers with a high concentration of guanidinium thiocyanate (GuSCN, 50% w/v), which helps remove plant proteins, polysaccharides and polyphenolic compounds. The method was successfully used to extract high quality RNA from citrus (Citrus aurantifolia), grapevine (Vitis vinifera), peach (Prunus persica), pear (Pyrus spp.), taro (Colocosia esculenta) and tobacco (Nicotiana benthamiana) samples. The method was comparable to the conventional CTAB method in RNA isolation efficiency, but it was more sample-adaptable and cost-effective than commercial kits. High quality RNA isolated using the silica spin column-based method was successfully used for the RT-PCR and/or multiplex RT-PCR amplification of woody fruit tree viruses and a viroid. The study provided a useful tool for the detection and characterization of plant viruses. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Detecting adverse events in surgery: comparing events detected by the Veterans Health Administration Surgical Quality Improvement Program and the Patient Safety Indicators.

    PubMed

    Mull, Hillary J; Borzecki, Ann M; Loveland, Susan; Hickson, Kathleen; Chen, Qi; MacDonald, Sally; Shin, Marlena H; Cevasco, Marisa; Itani, Kamal M F; Rosen, Amy K

    2014-04-01

    The Patient Safety Indicators (PSIs) use administrative data to screen for select adverse events (AEs). In this study, VA Surgical Quality Improvement Program (VASQIP) chart review data were used as the gold standard to measure the criterion validity of 5 surgical PSIs. Independent chart review was also used to determine reasons for PSI errors. The sensitivity, specificity, and positive predictive value of PSI software version 4.1a were calculated among Veterans Health Administration hospitalizations (2003-2007) reviewed by VASQIP (n = 268,771). Nurses re-reviewed a sample of hospitalizations for which PSI and VASQIP AE detection disagreed. Sensitivities ranged from 31% to 68%, specificities from 99.1% to 99.8%, and positive predictive values from 31% to 72%. Reviewers found that coding errors accounted for some PSI-VASQIP disagreement; some disagreement was also the result of differences in AE definitions. These results suggest that the PSIs have moderate criterion validity; however, some surgical PSIs detect different AEs than VASQIP. Future research should explore using both methods to evaluate surgical quality. Published by Elsevier Inc.
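
    The three validity measures reported here come straight from a 2x2 comparison of the screening tool against the gold standard; the sketch below shows the arithmetic on hypothetical counts (not the study's data).

      def screening_metrics(tp, fp, fn, tn):
          """Criterion validity of a screening indicator against a gold standard."""
          sensitivity = tp / (tp + fn)      # share of true AEs the indicator flags
          specificity = tn / (tn + fp)      # share of non-AEs it leaves alone
          ppv = tp / (tp + fp)              # share of flags that are real AEs
          return sensitivity, specificity, ppv

      # hypothetical counts for one surgical PSI
      print(screening_metrics(tp=120, fp=90, fn=160, tn=99000))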

  6. How do we watch images? A case of change detection and quality estimation

    NASA Astrophysics Data System (ADS)

    Radun, Jenni; Leisti, Tuomas; Virtanen, Toni; Nyman, Göte

    2012-01-01

    The most common tasks in subjective image estimation are change detection (a detection task) and image quality estimation (a preference task). We examined how the task influences gaze behavior by comparing detection and preference tasks. The eye movements of 16 naïve observers were recorded, with eight observers performing each task. The setting was a flicker paradigm, where the observers see a non-manipulated image, a manipulated version of the image and again the non-manipulated image, and estimate the difference they perceived between them. The material was photographic material with different image distortions and contents. To examine the spatial distribution of fixations, we defined the regions of interest using a memory task and calculated information entropy to estimate how concentrated the fixations were on the image plane. The quality task was faster and needed fewer fixations, and the first eight fixations were more concentrated on certain image areas than in the change detection task. The bottom-up influences of the image also caused more variation in gaze behavior in the quality estimation task than in the change detection task. The results show that quality estimation is faster and the regions of interest are emphasized more in certain images compared with the change detection task, which is a scan task where the whole image is always thoroughly examined. In conclusion, in subjective image estimation studies it is important to consider the task.

  7. Water quality real-time monitoring system via biological detection based on video analysis

    NASA Astrophysics Data System (ADS)

    Xin, Chen; Fei, Yuan

    2017-11-01

    With the development of society, water pollution has become a serious problem in China, so real-time water quality monitoring is an important part of water pollution prevention. In this paper, the behavior of zebrafish was monitored by computer vision. Firstly, the moving target was extracted by a saliency detection method and tracked by fitting an ellipse model. Then the motion parameters were extracted by an optical flow method, and the data were monitored in real time by means of Hinkley warnings and threshold warnings. Classification warnings across several behavioural dimensions were issued through a comprehensive toxicity index. The experimental results show that the system can achieve accurate real-time monitoring.
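
    The "Hinkley warning" on the extracted motion parameters is plausibly a Page-Hinkley change test; the sketch below is a generic Page-Hinkley detector applied to a stream of swimming-speed values (the parameter values and the data are illustrative assumptions, not the paper's settings).

      class PageHinkley:
          """Minimal Page-Hinkley detector for a sustained shift in a data stream."""
          def __init__(self, delta=0.005, threshold=5.0):
              self.delta, self.threshold = delta, threshold
              self.mean = self.cum = self.min_cum = 0.0
              self.n = 0

          def update(self, x):
              self.n += 1
              self.mean += (x - self.mean) / self.n
              self.cum += x - self.mean - self.delta
              self.min_cum = min(self.min_cum, self.cum)
              return (self.cum - self.min_cum) > self.threshold   # True = warning

      detector = PageHinkley()
      for speed in [1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 5.1, 5.3, 5.4]:
          if detector.update(speed):
              print("behavioural change detected at speed", speed)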

  8. Design of Passive Power Filter for Hybrid Series Active Power Filter using Estimation, Detection and Classification Method

    NASA Astrophysics Data System (ADS)

    Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.

    2016-06-01

    This research paper presents the design of a shunt Passive Power Filter (PPF) in a Hybrid Series Active Power Filter (HSAPF) that employs a novel analytic methodology superior to FFT analysis. This novel approach consists of the estimation, detection and classification of the signals. The proposed method is applied to estimate, detect and classify power quality (PQ) disturbances such as harmonics. The work uses three methods: harmonic detection through the wavelet transform method, harmonic estimation by the Kalman filter algorithm and harmonic classification by the decision tree method. Among the different mother wavelets available for the wavelet transform method, db8 is selected as the mother wavelet because of its good transient response and compact oscillation in the frequency domain. In the harmonic compensation process, the detected harmonics are compensated through the Hybrid Series Active Power Filter (HSAPF) based on Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified in the MATLAB/SIMULINK environment as well as with an experimental set-up. The obtained results confirm the superiority of the proposed methodology over FFT analysis. The newly proposed PPF makes the conventional HSAPF more robust and stable.
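
    For the wavelet step only, the sketch below (illustrative; the sampling rate, signal and decomposition depth are assumptions, and the Kalman estimation and decision-tree classification stages are omitted) decomposes a distorted 50 Hz waveform with the db8 mother wavelet and uses detail-band energies as a crude indicator of harmonic content.

      import numpy as np
      import pywt

      fs = 3200                                    # assumed sampling rate (Hz)
      t = np.arange(0, 0.2, 1 / fs)
      supply = np.sin(2 * np.pi * 50 * t)          # 50 Hz fundamental
      distorted = supply + 0.2 * np.sin(2 * np.pi * 250 * t)   # added 5th harmonic

      # multi-level decomposition with the db8 mother wavelet
      cA4, cD4, cD3, cD2, cD1 = pywt.wavedec(distorted, 'db8', level=4)

      # detail-band energies rise when harmonic distortion is present
      energies = {name: float(np.sum(c ** 2))
                  for name, c in zip(('cD1', 'cD2', 'cD3', 'cD4'), (cD1, cD2, cD3, cD4))}
      print(energies)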

  9. High sensitivity leak detection method and apparatus

    DOEpatents

    Myneni, Ganapatic R.

    1994-01-01

    An improved leak detection method is provided that utilizes the cyclic adsorption and desorption of accumulated helium on a non-porous metallic surface. The method provides reliable leak detection at superfluid helium temperatures. The zero drift that is associated with residual gas analyzers in common leak detectors is virtually eliminated by utilizing a time integration technique. The sensitivity of the apparatus of this disclosure allows the detection of leaks as small as 1×10⁻¹⁸ atm cc sec⁻¹.

  10. Rapid Detection Methods for Asphalt Pavement Thicknesses and Defects by a Vehicle-Mounted Ground Penetrating Radar (GPR) System

    PubMed Central

    Dong, Zehua; Ye, Shengbo; Gao, Yunze; Fang, Guangyou; Zhang, Xiaojuan; Xue, Zhongjun; Zhang, Tao

    2016-01-01

    The thickness estimation of the top surface layer and surface layer, as well as the detection of road defects, are of great importance to the quality conditions of asphalt pavement. Although ground penetrating radar (GPR) methods have been widely used in non-destructive detection of pavements, the thickness estimation of the thin top surface layer is still a difficult problem due to the limitations of GPR resolution and the similar permittivity of asphalt sub-layers. Besides, the detection of some road defects, including inadequate compaction and delamination at interfaces, require further practical study. In this paper, a newly-developed vehicle-mounted GPR detection system is introduced. We used a horizontal high-pass filter and a modified layer localization method to extract the underground layers. Besides, according to lab experiments and simulation analysis, we proposed theoretical methods for detecting the degree of compaction and delamination at the interface, respectively. Moreover, a field test was carried out and the estimated results showed a satisfactory accuracy of the system and methods. PMID:27929409

  11. Rapid Detection Methods for Asphalt Pavement Thicknesses and Defects by a Vehicle-Mounted Ground Penetrating Radar (GPR) System.

    PubMed

    Dong, Zehua; Ye, Shengbo; Gao, Yunze; Fang, Guangyou; Zhang, Xiaojuan; Xue, Zhongjun; Zhang, Tao

    2016-12-06

    The thickness estimation of the top surface layer and surface layer, as well as the detection of road defects, are of great importance to the quality conditions of asphalt pavement. Although ground penetrating radar (GPR) methods have been widely used in non-destructive detection of pavements, the thickness estimation of the thin top surface layer is still a difficult problem due to the limitations of GPR resolution and the similar permittivity of asphalt sub-layers. Besides, the detection of some road defects, including inadequate compaction and delamination at interfaces, require further practical study. In this paper, a newly-developed vehicle-mounted GPR detection system is introduced. We used a horizontal high-pass filter and a modified layer localization method to extract the underground layers. Besides, according to lab experiments and simulation analysis, we proposed theoretical methods for detecting the degree of compaction and delamination at the interface, respectively. Moreover, a field test was carried out and the estimated results showed a satisfactory accuracy of the system and methods.

  12. Survey of Anomaly Detection Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ng, B

    This survey defines the problem of anomaly detection and provides an overview of existing methods. The methods are categorized into two general classes: generative and discriminative. A generative approach involves building a model that represents the joint distribution of the input features and the output labels of system behavior (e.g., normal or anomalous), and then applies the model to formulate a decision rule for detecting anomalies. On the other hand, a discriminative approach aims directly to find the decision rule, with the smallest error rate, that distinguishes between normal and anomalous behavior. For each approach, we give an overview of popular techniques and provide references to state-of-the-art applications.
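
    As a minimal illustration of the generative class described in this survey (my own toy example, not one of the surveyed techniques), the sketch below fits a single multivariate Gaussian to normal behaviour and flags test points whose squared Mahalanobis distance exceeds a chi-squared cut-off; a discriminative counterpart would instead train a classifier directly on labelled normal/anomalous examples.

      import numpy as np

      class GaussianAnomalyDetector:
          """Generative anomaly detector: model normal data, flag low-likelihood points."""
          def fit(self, X):
              self.mean = X.mean(axis=0)
              cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
              self.inv_cov = np.linalg.inv(cov)
              return self

          def score(self, X):
              d = X - self.mean
              return np.einsum('ij,jk,ik->i', d, self.inv_cov, d)   # squared Mahalanobis

      normal = np.random.randn(1000, 3)                 # training data: normal behaviour
      detector = GaussianAnomalyDetector().fit(normal)
      test = np.vstack([np.random.randn(5, 3), [[6.0, 6.0, 6.0]]])
      print(detector.score(test) > 16.27)               # ~chi^2(3) 0.999 quantile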

  13. Detection methods and performance criteria for genetically modified organisms.

    PubMed

    Bertheau, Yves; Diolez, Annick; Kobilinsky, André; Magin, Kimberly

    2002-01-01

    Detection methods for genetically modified organisms (GMOs) are necessary for many applications, from seed purity assessment to compliance of food labeling in several countries. Numerous analytical methods are currently used or under development to support these needs. The currently used methods are bioassays and protein- and DNA-based detection protocols. To avoid discrepancy of results between such largely different methods and, for instance, the potential resulting legal actions, compatibility of the methods is urgently needed. Performance criteria of methods allow evaluation against a common standard. The more-common performance criteria for detection methods are precision, accuracy, sensitivity, and specificity, which together specifically address other terms used to describe the performance of a method, such as applicability, selectivity, calibration, trueness, precision, recovery, operating range, limit of quantitation, limit of detection, and ruggedness. Performance criteria should provide objective tools to accept or reject specific methods, to validate them, to ensure compatibility between validated methods, and be used on a routine basis to reject data outside an acceptable range of variability. When selecting a method of detection, it is also important to consider its applicability, its field of applications, and its limitations, by including factors such as its ability to detect the target analyte in a given matrix, the duration of the analyses, its cost effectiveness, and the necessary sample sizes for testing. Thus, the current GMO detection methods should be evaluated against a common set of performance criteria.

  14. Accuracy of imaging methods for detection of bone tissue invasion in patients with oral squamous cell carcinoma

    PubMed Central

    Uribe, S; Rojas, LA; Rosas, CF

    2013-01-01

    The objective of this review is to evaluate the diagnostic accuracy of imaging methods for detection of mandibular bone tissue invasion by squamous cell carcinoma (SCC). A systematic review was carried out of studies in MEDLINE, SciELO and ScienceDirect, published between 1960 and 2012, in English, Spanish or German, which compared detection of mandibular bone tissue invasion via different imaging tests against a histopathology reference standard. Sensitivity and specificity data were extracted from each study. The outcome measure was diagnostic accuracy. We found 338 articles, of which 5 fulfilled the inclusion criteria. Tests included were: CT (four articles), MRI (four articles), panoramic radiography (one article), positron emission tomography (PET)/CT (one article) and cone beam CT (CBCT) (one article). The quality of articles was low to moderate and the evidence showed that all tests have a high diagnostic accuracy for detection of mandibular bone tissue invasion by SCC, with sensitivity values of 94% (MRI), 91% (CBCT), 83% (CT) and 55% (panoramic radiography), and specificity values of 100% (CT, MRI, CBCT), 97% (PET/CT) and 91.7% (panoramic radiography). Available evidence is scarce and of only low to moderate quality. However, it is consistently shown that current imaging methods give a moderate to high diagnostic accuracy for the detection of mandibular bone tissue invasion by SCC. Recommendations are given for improving the quality of future reports, in particular provision of a detailed description of the patients' conditions, the imaging instrument and both imaging and histopathological invasion criteria. PMID:23420854

  15. [Seed quality test methods of Paeonia suffruticosa].

    PubMed

    Cao, Ya-Yue; Zhu, Zai-Biao; Guo, Qiao-Sheng; Liu, Li; Wang, Chang-Lin

    2014-11-01

    In order to optimize the testing methods for Paeonia suffruticosa seed quality and provide a basis for establishing seed testing rules and a seed quality standard for P. suffruticosa, the seed quality of P. suffruticosa from different producing areas was measured based on the related seed testing regulations, and seed testing methods for the quality attributes of P. suffruticosa were established preliminarily. The sample weight of P. suffruticosa was at least 7 000 g for purity analysis and at least 700 g for testing. Phenotypic observation and size measurement were used for authenticity testing. The 1 000-seed weight was determined by the 100-seed method, and the water content was determined by the low-temperature drying method (10 hours). After soaking in distilled water for 24 h, the seeds were treated with a day/night temperature stratification (25 degrees C/20 degrees C, day/night) in the dark for 60 d. After soaking in a solution of GA3 at 300 mg x L(-1) for 24 h, the P. suffruticosa seeds were cultured in wet sand at 15 degrees C for 12-60 days for germination testing. Seed viability was tested by the TTC method.

  16. Infrared and visible image fusion method based on saliency detection in sparse domain

    NASA Astrophysics Data System (ADS)

    Liu, C. H.; Qi, Y.; Ding, W. R.

    2017-06-01

    Infrared and visible image fusion is a key problem in the field of multi-sensor image fusion. To better preserve the significant information of the infrared and visible images in the final fused image, the saliency maps of the source images are introduced into the fusion procedure. Firstly, under the framework of the joint sparse representation (JSR) model, the global and local saliency maps of the source images are obtained based on sparse coefficients. Then, a saliency detection model is proposed, which combines the global and local saliency maps to generate an integrated saliency map. Finally, a weighted fusion algorithm based on the integrated saliency map is developed to carry out the fusion process. The experimental results show that our method is superior to the state-of-the-art methods in terms of several universal quality evaluation indexes, as well as in visual quality.
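
    The final weighted-fusion step admits a very compact sketch (below; a simplified stand-in in which a crude saliency map replaces the paper's JSR-based integrated map, and the images are placeholders).

      import numpy as np

      def weighted_fusion(infrared, visible, saliency):
          """Pixel-wise weighted average driven by a normalised saliency map."""
          w = (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)
          return w * infrared + (1.0 - w) * visible

      ir = np.random.rand(128, 128)            # placeholder infrared image
      vis = np.random.rand(128, 128)           # placeholder visible image
      sal = np.abs(ir - ir.mean())             # crude saliency stand-in
      fused = weighted_fusion(ir, vis, sal)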

  17. High sensitivity leak detection method and apparatus

    DOEpatents

    Myneni, G.R.

    1994-09-06

    An improved leak detection method is provided that utilizes the cyclic adsorption and desorption of accumulated helium on a non-porous metallic surface. The method provides reliable leak detection at superfluid helium temperatures. The zero drift that is associated with residual gas analyzers in common leak detectors is virtually eliminated by utilizing a time integration technique. The sensitivity of the apparatus of this disclosure allows the detection of leaks as small as 1×10⁻¹⁸ atm cc sec⁻¹. 2 figs.

  18. Radionuclide detection devices and associated methods

    DOEpatents

    Mann, Nicholas R [Rigby, ID; Lister, Tedd E [Idaho Falls, ID; Tranter, Troy J [Idaho Falls, ID

    2011-03-08

    Radionuclide detection devices comprise a fluid cell comprising a flow channel for a fluid stream. A radionuclide collector is positioned within the flow channel and configured to concentrate one or more radionuclides from the fluid stream onto at least a portion of the radionuclide collector. A scintillator for generating scintillation pulses responsive to an occurrence of a decay event is positioned proximate at least a portion of the radionuclide collector and adjacent to a detection system for detecting the scintillation pulses. Methods of selectively detecting a radionuclide are also provided.

  19. Method for remote detection of trace contaminants

    DOEpatents

    Simonson, Robert J.; Hance, Bradley G.

    2003-09-09

    A method for remote detection of trace contaminants in a target area comprises applying sensor particles that preconcentrate the trace contaminant to the target area and detecting the contaminant-sensitive fluorescence from the sensor particles. The sensor particles can have contaminant-sensitive and contaminant-insensitive fluorescent compounds to enable the determination of the amount of trace contaminant present in the target area by relative comparison of the emission of the fluorescent compounds by a local or remote fluorescence detector. The method can be used to remotely detect buried minefields.

  20. Method for detecting toxic gases

    DOEpatents

    Stetter, J.R.; Zaromb, S.; Findlay, M.W. Jr.

    1991-10-08

    A method is disclosed which is capable of detecting low concentrations of a pollutant or other component in air or other gas. This method utilizes a combination of a heating filament having a catalytic surface of a noble metal for exposure to the gas and producing a derivative chemical product from the component. An electrochemical sensor responds to the derivative chemical product for providing a signal indicative of the product. At concentrations in the order of about 1-100 ppm of tetrachloroethylene, neither the heating filament nor the electrochemical sensor is individually capable of sensing the pollutant. In the combination, the heating filament converts the benzyl chloride to one or more derivative chemical products which may be detected by the electrochemical sensor. 6 figures.

  1. An improved method to detect correct protein folds using partial clustering.

    PubMed

    Zhou, Jianjun; Wishart, David S

    2013-01-16

    Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient "partial" clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either C(α) RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance.

  2. An improved method to detect correct protein folds using partial clustering

    PubMed Central

    2013-01-01

    Background Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient “partial” clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. Results We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either Cα RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. Conclusions The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance. PMID:23323835
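
    The core idea of selecting representatives without clustering every decoy can be sketched as a greedy leader-style pass over a random subsample (below; this is a simplification for illustration, not the HS-Forest algorithm, and the decoys, radius and distance function are assumptions).

      import numpy as np

      def ca_rmsd(a, b):
          """RMSD between two pre-aligned C-alpha coordinate arrays of shape (N, 3)."""
          return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

      def pick_representatives(decoys, dist, radius, sample_size=100, seed=0):
          """Greedy leader pass over a subsample: a decoy either supports an existing
          representative (if within `radius`) or becomes a new one; decoys outside
          the subsample are never assigned to any cluster."""
          rng = np.random.default_rng(seed)
          order = rng.choice(len(decoys), size=min(sample_size, len(decoys)),
                             replace=False)
          reps, support = [], []
          for i in order:
              for k, r in enumerate(reps):
                  if dist(decoys[i], decoys[r]) < radius:
                      support[k] += 1
                      break
              else:
                  reps.append(i)
                  support.append(1)
          return sorted(zip(reps, support), key=lambda p: -p[1])   # densest first

      decoys = [np.random.rand(50, 3) for _ in range(500)]          # toy decoys
      print(pick_representatives(decoys, ca_rmsd, radius=0.4)[:3])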

  3. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of organochlorine pesticides and polychlorinated biphenyls in bottom sediment by dual capillary-column gas chromatography with electron-capture detection

    USGS Publications Warehouse

    Foreman, William T.; Connor, Brooke F.; Furlong, Edward T.; Vaught, Deborah G.; Merten, Leslie M.

    1995-01-01

    A method for the determination of 30 individual organochlorine pesticides, total toxaphene, and total polychlorinated biphenyls (PCBs) in bottom sediment is described. The method isolates the pesticides and PCBs by solvent extraction with dichlorobenzene, removes inorganic sulfur, large naturally occurring molecules, and other unwanted interferences by gel permeation chromatography, and further cleans up and class fractionates the extract using adsorption chromatography. The compounds then are instrumentally determined using dual capillary-column gas chromatography with electron-capture detection. Reporting limits range from 1 to 5 micrograms per kilogram for 30 individual pesticides, 50 micrograms per kilogram for total PCBs, and 200 micrograms per kilogram for total toxaphene. The method also is designed to allow the simultaneous isolation of 79 other semivolatile organic compounds from the sediment, which are separately quantified using gas chromatography with mass spectrometric detection. The method was developed in support of the U.S. Geological Survey's National Water-Quality Assessment program.

  4. Fluorescence-based methods for detecting caries lesions: systematic review, meta-analysis and sources of heterogeneity.

    PubMed

    Gimenez, Thais; Braga, Mariana Minatel; Raggio, Daniela Procida; Deery, Chris; Ricketts, David N; Mendes, Fausto Medeiros

    2013-01-01

    Fluorescence-based methods have been proposed to aid caries lesion detection. Summarizing and analysing findings of studies about fluorescence-based methods could clarify their real benefits. We aimed to perform a comprehensive systematic review and meta-analysis to evaluate the accuracy of fluorescence-based methods in detecting caries lesions. Two independent reviewers searched PubMed, Embase and Scopus through June 2012 to identify papers/articles published. Other sources were checked to identify non-published literature. STUDY ELIGIBILITY CRITERIA, PARTICIPANTS AND DIAGNOSTIC METHODS: The eligibility criteria were studies that: (1) have assessed the accuracy of fluorescence-based methods of detecting caries lesions on occlusal, approximal or smooth surfaces, in both primary or permanent human teeth, in the laboratory or clinical setting; (2) have used a reference standard; and (3) have reported sufficient data relating to the sample size and the accuracy of methods. A diagnostic 2×2 table was extracted from included studies to calculate the pooled sensitivity, specificity and overall accuracy parameters (Diagnostic Odds Ratio and Summary Receiver-Operating curve). The analyses were performed separately for each method and different characteristics of the studies. The quality of the studies and heterogeneity were also evaluated. Seventy-five studies met the inclusion criteria from the 434 articles initially identified. The search of the grey or non-published literature did not identify any further studies. In general, the analysis demonstrated that fluorescence-based methods tend to have similar accuracy for all types of teeth, dental surfaces or settings. There was a trend of better performance of fluorescence methods in detecting more advanced caries lesions. We also observed moderate to high heterogeneity and evidence of publication bias. Fluorescence-based devices have similar overall performance; however, better accuracy in detecting more advanced caries
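
    For readers unfamiliar with the pooled accuracy parameters named here, the sketch below computes per-study sensitivity, specificity and the diagnostic odds ratio from a 2x2 table and pools the DOR on the log scale; the counts are hypothetical and the unweighted pooling is a simplification of the meta-analytic models actually used.

      import numpy as np

      def accuracy_from_2x2(tp, fp, fn, tn):
          """Sensitivity, specificity and DOR, with a 0.5 continuity correction."""
          tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          dor = (sens / (1 - sens)) / ((1 - spec) / spec)
          return sens, spec, dor

      # hypothetical 2x2 tables (TP, FP, FN, TN) for three studies
      studies = [(40, 10, 8, 60), (35, 5, 12, 70), (50, 15, 6, 55)]
      log_dors = [np.log(accuracy_from_2x2(*s)[2]) for s in studies]
      print("crude pooled DOR:", float(np.exp(np.mean(log_dors))))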

  5. An Effective Method for Substance Detection Using the Broad Spectrum THz Signal: A “Terahertz Nose”

    PubMed Central

    Trofimov, Vyacheslav A.; Varentsova, Svetlana A.

    2015-01-01

    We propose an effective method for the detection and identification of dangerous substances by using the broadband THz pulse. This pulse excites, for example, many vibrational or rotational energy levels of molecules simultaneously. By analyzing the time-dependent spectrum of the THz pulse transmitted through or reflected from a substance, we follow the average response spectrum dynamics. Comparing the absorption and emission spectrum dynamics of a substance under analysis with the corresponding data for a standard substance, one can detect and identify the substance under real conditions taking into account the influence of packing material, water vapor and substance surface. For quality assessment of the standard substance detection in the signal under analysis, we propose time-dependent integral correlation criteria. Restrictions of usually used detection and identification methods, based on a comparison between the absorption frequencies of a substance under analysis and a standard substance, are demonstrated using a physical experiment with paper napkins. PMID:26020281

  6. Paper analytical devices for detection of low-quality pharmaceuticals

    NASA Astrophysics Data System (ADS)

    Weaver, A.; Lieberman, M.

    2014-03-01

    There is currently no global screening system to detect low quality pharmaceuticals, despite widespread recognition of the public health problems caused by substandard and falsified medicines. In order to fill this void, we designed a rapid field screening test that is interfaced with the mobile phone network. The user scrapes a pill over several reaction areas on a paper test card, and then dips one edge of the card into water to activate dried reagents stored on the paper. These reagents carry out multiple color tests and result in a pattern of colored stripes that give information about the chemical content of the pill. The test cards are inexpensive and instrument-free, and we think they will be a scalable testing option in low resource settings. Studies on falsified drugs archived at the FDA show that the test cards are effective at detecting a wide variety of low-quality formulations of many classes of pharmaceuticals, and field tests are currently under way in Kenya.

  7. High-performance combination method of electric network frequency and phase for audio forgery detection in battery-powered devices.

    PubMed

    Savari, Maryam; Abdul Wahab, Ainuddin Wahid; Anuar, Nor Badrul

    2016-09-01

    Audio forgery is any act of tampering, illegal copying or quality falsification applied to audio for criminal purposes. In the last decade, there has been increasing attention to audio forgery detection due to a significant increase in the number of forgeries across different types of audio. Among the available forgery detection methods, electric network frequency (ENF) analysis is one of the most powerful in terms of accuracy. Despite the suitable accuracy of ENF for the majority of plug-in powered devices, its weak accuracy in audio forgery detection for battery-powered devices, especially laptops and mobile phones, can be considered one of its main obstacles. To solve the accuracy problem of ENF on battery-powered devices, a combination method of ENF and a phase feature is proposed. In the experiments conducted, ENF alone gives 50% and 60% accuracy for forgery detection on mobile phones and laptops, respectively, while the proposed method shows 88% and 92% accuracy, respectively, for forgery detection on battery-powered devices. The results demonstrate higher accuracy for forgery detection with the combination of ENF and the phase feature. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
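
    A basic ENF track can be extracted by locating the spectral peak near the nominal mains frequency in each short-time frame; the sketch below does this for a synthetic 50 Hz hum (the nominal frequency, window length and signal are assumptions, and the paper's phase feature is not reproduced here).

      import numpy as np
      from scipy.signal import stft

      def enf_track(audio, fs, nominal=50.0, half_band=1.0):
          """Frame-by-frame peak frequency in a narrow band around the mains frequency."""
          f, t, Z = stft(audio, fs=fs, nperseg=int(2 * fs), noverlap=int(fs))
          band = (f >= nominal - half_band) & (f <= nominal + half_band)
          peak_bins = np.argmax(np.abs(Z[band, :]), axis=0)
          return t, f[band][peak_bins]

      fs = 8000
      t = np.arange(0, 10, 1 / fs)
      audio = 0.01 * np.sin(2 * np.pi * 50.02 * t) + 0.001 * np.random.randn(t.size)
      frame_times, enf = enf_track(audio, fs)
      print(enf[:5])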

  8. A fast automatic target detection method for detecting ships in infrared scenes

    NASA Astrophysics Data System (ADS)

    Özertem, Kemal Arda

    2016-05-01

    Automatic target detection in infrared scenes is a vital task for many application areas such as defense, security and border surveillance. For anti-ship missiles, having a fast and robust ship detection algorithm is crucial for overall system performance. In this paper, a straightforward yet effective ship detection method for infrared scenes is introduced. First, morphological grayscale reconstruction is applied to the input image, followed by automatic thresholding of the suppressed image. For the segmentation step, connected component analysis is employed to obtain target candidate regions. At this point, the detection is still vulnerable to outliers such as small objects with relatively high intensity values or clouds. To deal with this drawback, a post-processing stage is introduced, using two different methods. First, noisy detection results are rejected with respect to target size. Second, the waterline is detected using the Hough transform, and detection results located above the waterline, with a small margin, are rejected. After the post-processing stage, undesired holes may still remain, which cause one object to be detected as multiple objects or prevent an object from being detected as a whole. To improve the detection performance, another automatic thresholding is applied only to the target candidate regions. Finally, the two detection results are fused and the post-processing stage is repeated to obtain the final detection result. The performance of the overall methodology is tested with real-world infrared test data.
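
    The first three stages of this pipeline (background suppression by grayscale reconstruction, automatic thresholding and connected-component analysis) can be sketched with scikit-image as below; the h offset, the minimum area and the synthetic frame are illustrative assumptions, and the waterline and hole-filling post-processing steps are omitted.

      import numpy as np
      from skimage.morphology import reconstruction
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      def ship_candidates(ir_image, h=0.2, min_area=30):
          """Bright-structure extraction followed by size filtering."""
          seed = np.clip(ir_image - h, 0, None)              # h-dome style seed
          background = reconstruction(seed, ir_image, method='dilation')
          residual = ir_image - background                   # suppressed image
          binary = residual > threshold_otsu(residual)
          return [r.bbox for r in regionprops(label(binary)) if r.area >= min_area]

      ir = np.random.rand(240, 320)                          # placeholder IR frame
      ir[100:110, 150:200] += 1.0                            # synthetic bright target
      print(ship_candidates(ir))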

  9. A fast method for detecting Cryptosporidium parvum oocysts in real world samples

    NASA Astrophysics Data System (ADS)

    Stewart, Shona; McClelland, Lindy; Maier, John

    2005-04-01

    Contamination of drinking water with pathogenic microorganisms such as Cryptosporidium has become an increasing concern in recent years. Cryptosporidium oocysts are particularly problematic, as infections caused by this organism can be life threatening in immunocompromised patients. Current methods for monitoring and analyzing water are often laborious and require experts to conduct. In addition, many of the techniques require very specific reagents to be employed. These factors add considerable cost and time to the analytical process. Raman spectroscopy provides specific molecular information on samples, and offers advantages of speed, sensitivity and low cost over current methods of water monitoring. Raman spectroscopy is an optical method that has demonstrated the capability to identify and differentiate microorganisms at the species and strain levels. In addition, this technique has exhibited sensitivities down to the single organism detection limit. We have employed Raman spectroscopy and Raman Chemical Imaging, in conjunction with chemometric techniques, to detect small numbers of oocysts in the presence of interferents derived from real-world water samples. Our investigations have also indicated that Raman Chemical Imaging may provide chemical and physiological information about an oocyst sample which complements information provided by the traditional methods. This work provides evidence that Raman imaging is a useful technique for consideration in the water quality industry.

  10. Novel Method For Low-Rate Ddos Attack Detection

    NASA Astrophysics Data System (ADS)

    Chistokhodova, A. A.; Sidorov, I. D.

    2018-05-01

    The relevance of the work is associated with an increasing number of advanced types of DDoS attacks, in particular, low-rate HTTP-flood. Last year, the power and complexity of such attacks increased significantly. The article is devoted to the analysis of DDoS attacks detecting methods and their modifications with the purpose of increasing the accuracy of DDoS attack detection. The article details low-rate attacks features in comparison with conventional DDoS attacks. During the analysis, significant shortcomings of the available method for detecting low-rate DDoS attacks were found. Thus, the result of the study is an informal description of a new method for detecting low-rate denial-of-service attacks. The architecture of the stand for approbation of the method is developed. At the current stage of the study, it is possible to improve the efficiency of an already existing method by using a classifier with memory, as well as additional information.

  11. Detection of fatigue cracks by nondestructive testing methods

    NASA Technical Reports Server (NTRS)

    Anderson, R. T.; Delacy, T. J.; Stewart, R. C.

    1973-01-01

    The effectiveness of various NDT methods in detecting small, tight cracks was assessed by randomly introducing fatigue cracks into aluminum sheets. The study included optimizing the NDT methods, calibrating NDT equipment with fatigue-cracked standards, and evaluating a number of cracked specimens by the optimized NDT methods. The evaluations were conducted by highly trained personnel, provided with detailed procedures, in order to minimize the effects of human variability. These personnel performed the NDT on the test specimens without knowledge of the flaw locations and reported the flaws detected. The performance of these tests was measured by comparing the flaws detected against the flaws present. The principal NDT methods utilized were radiographic, ultrasonic, penetrant, and eddy current. Holographic interferometry, acoustic emission monitoring, and replication methods were also applied to a reduced number of specimens. Generally, the best performance was shown by the eddy current, ultrasonic, penetrant and holographic tests. Etching provided no measurable improvement, while proof loading improved flaw detectability. Data are shown that quantify the performance of the NDT methods applied.

  12. Dent detection method by high gradation photometric stereo

    NASA Astrophysics Data System (ADS)

    Hasebe, Akihisa; Kato, Kunihito; Tanahashi, Hideki; Kubota, Naoki

    2017-03-01

    This paper describes an automatic detection method for small dents on a metal plate. We adopted photometric stereo as the three-dimensional measurement method, which has advantages in terms of low cost and short measurement time. In addition, a high-precision measurement system was realized by using an 18-bit camera. The small dents on the surface of the metal plate are detected from the inner product of the normal vectors measured by photometric stereo. Finally, the effectiveness of our method was confirmed by detection experiments.
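
    The two steps named in this abstract, least-squares photometric stereo followed by an inner-product test on the normals, are sketched below; the light directions, smoothing window and toy images are assumptions, not the authors' setup.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def photometric_stereo_normals(images, light_dirs):
          """Per-pixel surface normals from >=3 images under known lighting."""
          h, w = images[0].shape
          I = np.stack([im.reshape(-1) for im in images])        # (k, h*w) intensities
          L = np.asarray(light_dirs, dtype=float)                # (k, 3) light directions
          G, *_ = np.linalg.lstsq(L, I, rcond=None)              # solve L @ G = I
          n = G / (np.linalg.norm(G, axis=0, keepdims=True) + 1e-12)
          return n.T.reshape(h, w, 3)

      def dent_score(normals, size=15):
          """Inner product of each normal with a locally averaged normal;
          values well below 1 flag local deviations such as small dents."""
          smoothed = np.stack([uniform_filter(normals[..., c], size) for c in range(3)],
                              axis=-1)
          smoothed /= np.linalg.norm(smoothed, axis=-1, keepdims=True) + 1e-12
          return np.sum(normals * smoothed, axis=-1)

      lights = [(0.3, 0.0, 1.0), (-0.3, 0.0, 1.0), (0.0, 0.3, 1.0)]   # assumed
      imgs = [np.full((64, 64), 0.8) for _ in lights]                 # toy flat plate
      score = dent_score(photometric_stereo_normals(imgs, lights))
      print(float(score.min()), float(score.max()))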

  13. Cancer Detection and Diagnosis Methods - Annual Plan

    Cancer.gov

    Early cancer detection is a proven life-saving strategy. Learn about the research opportunities NCI supports, including liquid biopsies and other less-invasive methods, for detecting early cancers and precancerous growths.

  14. Comparison of different detection methods for persistent multiple hypothesis tracking in wide area motion imagery

    NASA Astrophysics Data System (ADS)

    Hartung, Christine; Spraul, Raphael; Schuchert, Tobias

    2017-10-01

    Wide area motion imagery (WAMI) acquired by an airborne multicamera sensor enables continuous monitoring of large urban areas. Each image can cover regions of several square kilometers and contain thousands of vehicles. Reliable vehicle tracking in this imagery is an important prerequisite for surveillance tasks, but remains challenging due to low frame rate and small object size. Most WAMI tracking approaches rely on moving object detections generated by frame differencing or background subtraction. These detection methods fail when objects slow down or stop. Recent approaches for persistent tracking compensate for missing motion detections by combining a detection-based tracker with a second tracker based on appearance or local context. In order to avoid the additional complexity introduced by combining two trackers, we employ an alternative single tracker framework that is based on multiple hypothesis tracking and recovers missing motion detections with a classifier-based detector. We integrate an appearance-based similarity measure, merge handling, vehicle-collision tests, and clutter handling to adapt the approach to the specific context of WAMI tracking. We apply the tracking framework on a region of interest of the publicly available WPAFB 2009 dataset for quantitative evaluation; a comparison to other persistent WAMI trackers demonstrates state-of-the-art performance of the proposed approach. Furthermore, we analyze in detail the impact of different object detection methods and detector settings on the quality of the output tracking results. For this purpose, we choose four different motion-based detection methods that vary in detection performance and computation time to generate the input detections. As detector parameters can be adjusted to achieve different precision and recall performance, we combine each detection method with different detector settings that yield (1) high precision and low recall, (2) high recall and low precision, and (3) best f

  15. Need for new caries detection methods

    NASA Astrophysics Data System (ADS)

    Young, Douglas A.; Featherstone, John D. B.

    1999-05-01

    Dental caries (tooth decay) continues to be a major problem for adults as well as children, even though great advances have been made in preventive methods in the last 20 years. New methods for the management of caries will work best if lesions can be detected at an early stage and chemical rather than physical intervention can take place, thereby preserving the natural tooth structure and helping the saliva to heal, or remineralize, the areas of early decay. Clinical detection of caries in the US relies on visual examination, tactile examination with a hand-held explorer, and conventional radiographs, all of which are inadequate for the occlusal (biting) surfaces of the teeth where most of the decay now occurs. The dentist often has to explore by drilling with a dental bur to confirm early decay in these areas. New methods that can non-destructively determine the extent and degree of subsurface lesions in these surfaces are essential for further advances in the clinical management of dental caries. Optical methods, which exploit the differences between sound and carious enamel and dentin, show great promise for the accurate detection of these lesions. Two- or three-dimensional images, which include a measure of severity, will be needed.

  16. Bioluminescent bioreporter integrated circuit detection methods

    DOEpatents

    Simpson, Michael L.; Paulus, Michael J.; Sayler, Gary S.; Applegate, Bruce M.; Ripp, Steven A.

    2005-06-14

    Disclosed are monolithic bioelectronic devices comprising a bioreporter and an OASIC. These bioluminescent bioreporter integrated circuits are useful in detecting substances such as pollutants, explosives, and heavy metals residing in inhospitable areas such as groundwater, industrial process vessels, and battlefields. Also disclosed are methods and apparatus for detection of particular analytes, including ammonia and estrogen compounds.

  17. Apparatus and methods for detecting chemical permeation

    DOEpatents

    Vo-Dinh, Tuan

    1994-01-01

    Apparatus and methods for detecting the permeation of hazardous or toxic chemicals through protective clothing are disclosed. The hazardous or toxic chemicals of interest do not possess the spectral characteristic of luminescence. The apparatus and methods utilize a spectrochemical modification technique to detect the luminescence quenching of an indicator compound: when the chemical permeates the protective clothing, the indicator is exposed to the chemical and the quenching of its luminescence indicates chemical permeation.

  18. Culture-dependent enumeration methods failed to simultaneously detect disinfectant-injured and genetically modified Escherichia coli in drinking water.

    PubMed

    Li, Jing; Liu, Lu; Yang, Dong; Liu, Wei-Li; Shen, Zhi-Qiang; Qu, Hong-Mei; Qiu, Zhi-Gang; Hou, Ai-Ming; Wang, Da-Ning; Ding, Chen-Shi; Li, Jun-Wen; Guo, Jian-Hua; Jin, Min

    2017-05-24

    Underestimation of Escherichia coli in drinking water, an indicator microorganism of sanitary risk, may result in potential risks of waterborne diseases. However, the detection of disinfectant-injured or genetically modified (GM) E. coli has been largely overlooked so far. To evaluate the accuracy of culture-dependent enumeration with regard to disinfectant-injured and GM E. coli, chlorine- or ozone-injured wild-type (WT) and GM E. coli were prepared and characterized. Then, water samples contaminated with these E. coli strains were assayed by four widely used methods, including lactose tryptose broth-based multiple-tube fermentation (MTF), m-endo-based membrane filtration method (MFM), an enzyme substrate test (EST) known as Colilert, and Petrifilm-based testing slip method (TSM). It was found that MTF was the most effective method to detect disinfectant-injured WT E. coli (with 76.9% of trials detecting all these bacteria), while this method could not effectively detect GM E. coli (with uninjured bacteria undetectable and a maximal detection rate of 21.5% for the injured). The EST was the only method which enabled considerable enumeration of uninjured GM E. coli, with a detection rate of over 93%. However, the detection rate declined to lower than 45.4% once the GM E. coli was injured by disinfectants. The MFM was invalid for both disinfectant-injured and GM E. coli. This is the first study to report the failure of these commonly used enumeration methods to simultaneously detect disinfectant-injured and GM E. coli. Thus, it highlights the urgent requirement for the development of a more accurate and versatile enumeration method which allows the detection of disinfectant-injured and GM E. coli in the assessment of microbial quality of drinking water.

  19. System and Method for Multi-Wavelength Optical Signal Detection

    NASA Technical Reports Server (NTRS)

    McGlone, Thomas D. (Inventor)

    2017-01-01

    The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.

  20. Odour detection methods: olfactometry and chemical sensors.

    PubMed

    Brattoli, Magda; de Gennaro, Gianluigi; de Pinto, Valentina; Loiotile, Annamaria Demarinis; Lovascio, Sara; Penza, Michele

    2011-01-01

    The complexity of the odours issue arises from the sensory nature of smell. From the evolutionary point of view olfaction is one of the oldest senses, allowing for seeking food, recognizing danger or communication: human olfaction is a protective sense as it allows the detection of potential illnesses or infections by taking into account the odour pleasantness/unpleasantness. Odours are mixtures of light and small molecules that, coming in contact with various human sensory systems, also at very low concentrations in the inhaled air, are able to stimulate an anatomical response: the experienced perception is the odour. Odour assessment is a key point in some industrial production processes (i.e., food, beverages, etc.) and it is acquiring steady importance in unusual technological fields (i.e., indoor air quality); this issue mainly concerns the environmental impact of various industrial activities (i.e., tanneries, refineries, slaughterhouses, distilleries, civil and industrial wastewater treatment plants, landfills and composting plants) as sources of olfactory nuisances, the top air pollution complaint. Although the human olfactory system is still regarded as the most important and effective "analytical instrument" for odour evaluation, the demand for more objective analytical methods, along with the discovery of materials with chemo-electronic properties, has boosted the development of sensor-based machine olfaction potentially imitating the biological system. This review examines the state of the art of both human and instrumental sensing currently used for the detection of odours. The olfactometric techniques employing a panel of trained experts are discussed and the strong and weak points of odour assessment through human detection are highlighted. The main features and the working principles of modern electronic noses (E-Noses) are then described, focusing on their better performances for environmental analysis. Odour emission monitoring carried out through

  1. G-CNV: A GPU-Based Tool for Preparing Data to Detect CNVs with Read-Depth Methods.

    PubMed

    Manconi, Andrea; Manca, Emanuele; Moscatelli, Marco; Gnocchi, Matteo; Orro, Alessandro; Armano, Giuliano; Milanesi, Luciano

    2015-01-01

    Copy number variations (CNVs) are the most prevalent types of structural variations (SVs) in the human genome and are involved in a wide range of common human diseases. Different computational methods have been devised to detect this type of SVs and to study how they are implicated in human diseases. Recently, computational methods based on high-throughput sequencing (HTS) are increasingly used. The majority of these methods focus on mapping short-read sequences generated from a donor against a reference genome to detect signatures distinctive of CNVs. In particular, read-depth based methods detect CNVs by analyzing genomic regions with significantly different read-depth from the other ones. The pipeline analysis of these methods consists of four main stages: (i) data preparation, (ii) data normalization, (iii) CNV regions identification, and (iv) copy number estimation. However, available tools do not support most of the operations required at the first two stages of this pipeline. Typically, they start the analysis by building the read-depth signal from pre-processed alignments. Therefore, third-party tools must be used to perform most of the preliminary operations required to build the read-depth signal. These data-intensive operations can be efficiently parallelized on graphics processing units (GPUs). In this article, we present G-CNV, a GPU-based tool devised to perform the common operations required at the first two stages of the analysis pipeline. G-CNV is able to filter low-quality read sequences, to mask low-quality nucleotides, to remove adapter sequences, to remove duplicated read sequences, to map the short-reads, to resolve multiple mapping ambiguities, to build the read-depth signal, and to normalize it. G-CNV can be efficiently used as a third-party tool able to prepare data for the subsequent read-depth signal generation and analysis. Moreover, it can also be integrated in CNV detection tools to generate read-depth signals.
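
    The end product of the data-preparation stages, the read-depth signal, can be illustrated with a few lines of NumPy (below; a CPU toy on simulated read positions, not the GPU implementation, and the bin size, read length and simulated duplication are assumptions).

      import numpy as np

      def read_depth_signal(read_starts, read_length, chrom_length, bin_size=1000):
          """Pile up reads, average coverage per fixed-size bin and normalise to the mean."""
          coverage = np.zeros(chrom_length, dtype=np.int32)
          for s in read_starts:
              coverage[s:s + read_length] += 1
          n_bins = chrom_length // bin_size
          binned = coverage[:n_bins * bin_size].reshape(n_bins, bin_size).mean(axis=1)
          return binned / (binned.mean() + 1e-12)

      rng = np.random.default_rng(0)
      starts = rng.integers(0, 99_900, size=50_000)                   # background reads
      starts = np.concatenate([starts, rng.integers(40_000, 60_000, size=10_000)])  # gain
      signal = read_depth_signal(starts, read_length=100, chrom_length=100_000)
      print(signal[35:65].round(2))       # elevated read depth over the duplicated region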

  2. 25 Years of Self-organized Criticality: Numerical Detection Methods

    NASA Astrophysics Data System (ADS)

    McAteer, R. T. James; Aschwanden, Markus J.; Dimitropoulou, Michaila; Georgoulis, Manolis K.; Pruessner, Gunnar; Morales, Laura; Ireland, Jack; Abramenko, Valentyna

    2016-01-01

    The detection and characterization of self-organized criticality (SOC), in both real and simulated data, has undergone many significant revisions over the past 25 years. The explosive advances in the many numerical methods available for detecting, discriminating, and ultimately testing, SOC have played a critical role in developing our understanding of how systems experience and exhibit SOC. In this article, methods of detecting SOC are reviewed; from correlations to complexity to critical quantities. A description of the basic autocorrelation method leads into a detailed analysis of application-oriented methods developed in the last 25 years. In the second half of this manuscript space-based, time-based and spatial-temporal methods are reviewed and the prevalence of power laws in nature is described, with an emphasis on event detection and characterization. The search for numerical methods to clearly and unambiguously detect SOC in data often leads us outside the comfort zone of our own disciplines—the answers to these questions are often obtained by studying the advances made in other fields of study. In addition, numerical detection methods often provide the optimum link between simulations and experiments in scientific research. We seek to explore this boundary where the rubber meets the road, to review this expanding field of research of numerical detection of SOC systems over the past 25 years, and to iterate forwards so as to provide some foresight and guidance into developing breakthroughs in this subject over the next quarter of a century.

  3. Method for predicting peptide detection in mass spectrometry

    DOEpatents

    Kangas, Lars [West Richland, WA; Smith, Richard D [Richland, WA; Petritis, Konstantinos [Richland, WA

    2010-07-13

    A method of predicting whether a peptide present in a biological sample will be detected by analysis with a mass spectrometer. The method uses at least one mass spectrometer to perform repeated analysis of a sample containing peptides from proteins with known amino acids. The method then generates a data set of peptides identified as contained within the sample by the repeated analysis. The method then calculates the probability that a specific peptide in the data set was detected in the repeated analysis. The method then creates a plurality of vectors, where each vector has a plurality of dimensions, and each dimension represents a property of one or more of the amino acids present in each peptide and adjacent peptides in the data set. Using these vectors, the method then generates an algorithm from the plurality of vectors and the calculated probabilities that specific peptides in the data set were detected in the repeated analysis. The algorithm is thus capable of calculating the probability that a hypothetical peptide represented as a vector will be detected by a mass spectrometry based proteomic platform, given that the peptide is present in a sample introduced into a mass spectrometer.

  4. Criteria For Evaluation of Proposed Protozoan Detection Methods

    EPA Science Inventory

    Currently, the only EPA approved method for detection and quantitation of protozoan cysts and oöcysts in source and drinking water, is the “ICR Protozoan Method for Detecting Giardia Cysts and Cryptosporidium Oöcysts in Water by a Fluorescent Antibody Procedure (ICR Microbial La...

  5. A Method of Detections' Fusion for GNSS Anti-Spoofing.

    PubMed

    Tao, Huiqi; Li, Hong; Lu, Mingquan

    2016-12-19

    The spoofing attack is one of the security threats of systems depending on the Global Navigation Satellite System (GNSS). There have been many GNSS spoofing detection methods, and each of them focuses on a characteristic of the GNSS signal or a measurement that the receiver has obtained. The method based on a single detector is insufficient against spoofing attacks in some scenarios. How to fuse multiple detections together is a problem that concerns the performance of GNSS anti-spoofing. Scholars have put forward a model to fuse different detection results based on the Dempster-Shafer theory (DST) of evidence combination. However, there are some problems in the application. The main challenge is the valuation of the belief function, which is a key issue in DST. This paper proposes a practical method of detections' fusion based on an approach to assign the belief function for spoofing detections. The frame of discernment is simplified, and the hard decision of hypothesis testing is replaced by the soft decision; then, the belief functions for some detections can be evaluated. The method is discussed in detail, and a performance evaluation is provided, as well. Detections' fusion reduces false alarms of detection and makes the result more reliable. Experimental results based on public test datasets demonstrate the performance of the proposed method.
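
    The DST combination step itself is compact; the sketch below applies Dempster's rule to two mass functions defined over a simplified frame {spoofed, authentic} plus the ignorance set, in line with the simplification of the frame of discernment mentioned above (the mass values are illustrative, and the paper's belief-assignment strategy is not reproduced).

      def dempster_combine(m1, m2):
          """Dempster's rule for two mass functions over {'spoofed', 'authentic', 'either'}."""
          frame = ('spoofed', 'authentic', 'either')

          def intersect(a, b):
              if a == 'either':
                  return b
              if b == 'either' or a == b:
                  return a
              return None                                   # conflicting hypotheses

          combined = {h: 0.0 for h in frame}
          conflict = 0.0
          for a in frame:
              for b in frame:
                  meet = intersect(a, b)
                  if meet is None:
                      conflict += m1[a] * m2[b]
                  else:
                      combined[meet] += m1[a] * m2[b]
          return {h: v / (1.0 - conflict) for h, v in combined.items()}

      # illustrative belief assignments from two independent spoofing detectors
      m_power = {'spoofed': 0.6, 'authentic': 0.1, 'either': 0.3}
      m_phase = {'spoofed': 0.5, 'authentic': 0.2, 'either': 0.3}
      print(dempster_combine(m_power, m_phase))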

  6. Multi-scale occupancy estimation and modelling using multiple detection methods

    USGS Publications Warehouse

    Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.

    2008-01-01

    Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species’ distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species’ use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications. Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods. The models can be

  7. Spectral analysis method for detecting an element

    DOEpatents

    Blackwood, Larry G [Idaho Falls, ID; Edwards, Andrew J [Idaho Falls, ID; Jewell, James K [Idaho Falls, ID; Reber, Edward L [Idaho Falls, ID; Seabury, Edward H [Idaho Falls, ID

    2008-02-12

    A method for detecting an element is described which includes the steps of: providing a gamma-ray spectrum which has a region of interest corresponding to a small amount of an element to be detected; providing nonparametric assumptions about the shape of the gamma-ray spectrum in the region of interest which would indicate the presence of the element to be detected; and applying a statistical test to the shape of the gamma-ray spectrum, based upon the nonparametric assumptions, to detect the small amount of the element to be detected.

  8. A Bayesian method for detecting stellar flares

    NASA Astrophysics Data System (ADS)

    Pitkin, M.; Williams, D.; Fletcher, L.; Grant, S. D. T.

    2014-12-01

    We present a Bayesian-odds-ratio-based algorithm for detecting stellar flares in light-curve data. We assume flares are described by a model in which there is a rapid rise with a half-Gaussian profile, followed by an exponential decay. Our signal model also contains a polynomial background model required to fit underlying light-curve variations in the data, which could otherwise partially mimic a flare. We characterize the false alarm probability and efficiency of this method under the assumption that any unmodelled noise in the data is Gaussian, and compare it with a simpler thresholding method based on that used in Walkowicz et al. We find our method has a significant increase in detection efficiency for low signal-to-noise ratio (S/N) flares. For a conservative false alarm probability, our method can detect 95 per cent of flares with S/N less than 20, as compared with S/N of 25 for the simpler method. We also test how well the assumption of Gaussian noise holds by applying the method to a selection of `quiet' Kepler stars. As an example, we have applied our method to a selection of stars in Kepler Quarter 1 data. The method finds 687 flaring stars with a total of 1873 flares after vetoes have been applied. For these flares we have made preliminary characterizations of their durations and S/N.
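
    As an illustration of the flare shape assumed in the signal model, the short sketch below evaluates a half-Gaussian rise joined to an exponential decay; the parameter names are ours, and the polynomial background and the Bayesian odds-ratio machinery of the paper are deliberately left out.

    import numpy as np

    def flare_model(t, t_peak, amplitude, sigma_rise, tau_decay):
        """Idealised flare profile: half-Gaussian rise before the peak time,
        exponential decay after it. The two branches meet at `amplitude` at t_peak."""
        t = np.asarray(t, dtype=float)
        rise = amplitude * np.exp(-0.5 * ((t - t_peak) / sigma_rise) ** 2)
        decay = amplitude * np.exp(-(t - t_peak) / tau_decay)
        return np.where(t < t_peak, rise, decay)

    t = np.linspace(0.0, 10.0, 500)                 # arbitrary time units
    flux = flare_model(t, t_peak=3.0, amplitude=1.0, sigma_rise=0.2, tau_decay=1.5)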

  9. Nested methylation-specific polymerase chain reaction cancer detection method

    DOEpatents

    Belinsky, Steven A [Albuquerque, NM; Palmisano, William A [Edgewood, NM

    2007-05-08

    A molecular marker-based method for monitoring and detecting cancer in humans. Aberrant methylation of gene promoters is a marker for cancer risk in humans. A two-stage, or "nested", polymerase chain reaction method is disclosed for detecting methylated DNA sequences at sufficiently high levels of sensitivity to permit cancer screening in biological fluid samples, such as sputum, obtained non-invasively. The method is for detecting the aberrant methylation of the p16 gene, O6-methylguanine-DNA methyltransferase gene, Death-associated protein kinase gene, RAS-associated family 1 gene, or other gene promoters. The method offers a potentially powerful approach to population-based screening for the detection of lung and other cancers.

  10. Ultra-high sensitivity radiation detection apparatus and method

    DOEpatents

    Gross, Kenneth C.; Valentine, John D.; Markum, Francis; Zawadzki, Mary; Dickerman, Charles

    1999-01-01

    A method and apparatus are provided to concentrate and detect very low levels of radioactive noble gases from the atmosphere. More specifically, the invention provides a method and apparatus to concentrate xenon, krypton and radon in an organic fluid and to detect these gases by their radioactive emissions.

  11. Mobile/android application for QRS detection using zero cross method

    NASA Astrophysics Data System (ADS)

    Rizqyawan, M. I.; Simbolon, A. I.; Suhendra, M. A.; Amri, M. F.; Kusumandari, D. E.

    2018-03-01

    In automatic ECG signal processing, one of the main research topics is QRS complex detection. Detecting the correct QRS complex, or R peak, is important because it is used to measure several other ECG metrics. One of the robust methods for QRS detection is the zero-cross method, which adds a high-frequency signal to the ECG and counts zero crossings to detect the QRS complex, which has a low-frequency oscillation. This paper presents an application of QRS detection using the zero-cross algorithm on an Android-based system. The performance of the algorithm in the mobile environment is measured, and the results show that this method is suitable for real-time QRS detection in a mobile application.
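
    To give a feel for the zero-crossing idea, the sketch below adds a small alternating high-frequency sequence to a band-limited ECG and counts sign changes in a sliding window; regions where the count collapses are QRS candidates. This is a simplified illustration of the principle rather than the published algorithm, and the filter, scale factor and threshold are assumed values.

    import numpy as np

    def qrs_candidates_zero_cross(ecg, fs, win_sec=0.15, hf_scale=0.25):
        """Mark candidate QRS regions via a zero-crossing count feature.
        Outside the QRS the small added alternating component dominates and the signal
        changes sign almost every sample; inside the large QRS deflection it does not."""
        ecg = np.asarray(ecg, dtype=float)
        filtered = np.diff(ecg, prepend=ecg[0])             # crude high-pass: removes baseline wander
        hf = hf_scale * np.std(filtered) * (-1.0) ** np.arange(len(filtered))
        z = filtered + hf
        sign_change = (np.sign(z[1:]) != np.sign(z[:-1])).astype(float)
        sign_change = np.append(sign_change, 0.0)
        win = max(1, int(win_sec * fs))
        zc_rate = np.convolve(sign_change, np.ones(win) / win, mode="same")
        return zc_rate < 0.5 * np.median(zc_rate)           # low zero-crossing rate -> QRS candidate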

  12. A Signal Detection Theory Approach to Evaluating Oculometer Data Quality

    NASA Technical Reports Server (NTRS)

    Latorella, Kara; Lynn, William, III; Barry, John S.; Kelly, Lon; Shih, Ming-Yun

    2013-01-01

    Currently, data quality is described in terms of spatial and temporal accuracy and precision [Holmqvist et al. in press]. While this approach provides precise errors in pixels or visual angle, experiments are often more concerned with whether subjects' points of gaze can be said to be reliable with respect to experimentally relevant areas of interest (AOIs). This paper proposes a method to characterize oculometer data quality using Signal Detection Theory (SDT) [Marcum 1947]. SDT classification results in four cases: Hit (correct report of a signal), Miss (failure to report a signal), False Alarm (a signal falsely reported), and Correct Reject (absence of a signal correctly reported). A technique is proposed where subjects are directed to look at points in and outside of an AOI, and the resulting points of gaze (POG) are classified as Hits (points known to be internal to an AOI are classified as such), Misses (AOI points are not indicated as such), False Alarms (points external to AOIs are indicated as in the AOI), or Correct Rejects (points external to the AOI are indicated as such). SDT metrics describe performance in terms of discriminability, sensitivity, and specificity. This paper provides the procedure for conducting this assessment and an example of data collected for AOIs in a simulated flight deck environment.
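
    A minimal sketch of the resulting metrics, assuming the four POG classification counts are already available, is given below; the small +0.5/+1 correction that keeps rates away from 0 and 1 is a common convention, not something prescribed by the paper.

    from scipy.stats import norm

    def sdt_metrics(hits, misses, false_alarms, correct_rejects):
        """Discriminability (d'), sensitivity and specificity from POG classification counts."""
        n_signal = hits + misses
        n_noise = false_alarms + correct_rejects
        hit_rate = (hits + 0.5) / (n_signal + 1.0)          # correction keeps the z-scores finite
        fa_rate = (false_alarms + 0.5) / (n_noise + 1.0)
        d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
        return d_prime, hits / n_signal, correct_rejects / n_noise

    # Hypothetical counts for one AOI.
    print(sdt_metrics(hits=90, misses=10, false_alarms=5, correct_rejects=95))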

  13. Field Tests of the Magnetotelluric Method to Detect Gas Hydrates, Mallik, Mackenzie Delta, Canada

    NASA Astrophysics Data System (ADS)

    Craven, J. A.; Roberts, B.; Bellefleur, G.; Spratt, J.; Wright, F.; Dallimore, S. R.

    2008-12-01

    The magnetotelluric method is not generally utilized at extreme latitudes, due primarily to difficulties in making the good electrical contact with the ground that is required to measure the electric field. As such, the magnetotelluric technique has not previously been investigated for the direct detection of gas hydrates in onshore permafrost environments. We present the results of preliminary field tests at Mallik, Northwest Territories, Canada, that demonstrate that good-quality magnetotelluric data can be obtained in this environment using specialized electrodes and buffer amplifiers similar to those utilized by Wannamaker et al. (2004). This result suggests that subsurface images from larger magnetotelluric surveys will be useful to complement other techniques to detect, quantify and characterize gas hydrates.

  14. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz [Livermore, CA; Langlois, Richard G [Livermore, CA; Venkateswaran, Kodumudi S [Round Rock, TX

    2011-07-05

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional TaqMan® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM™, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA™, on the 5' end.

  15. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz [Livermore, CA; Langlois, Richard G [Livermore, CA; Venkateswaran, Kodumudi S [Livermore, CA

    2006-08-01

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional TaqMan® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM™, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA™, on the 5' end.

  16. Multiple targets detection method in detection of UWB through-wall radar

    NASA Astrophysics Data System (ADS)

    Yang, Xiuwei; Yang, Chuanfa; Zhao, Xingwen; Tian, Xianzhong

    2017-11-01

    In this paper, the problems and difficulties encountered in the detection of multiple moving targets by UWB radar are analyzed, and an experimental environment and through-wall radar system are established. An adaptive threshold method based on the local area is proposed to effectively filter out clutter interference. The moving targets are then analyzed, and false targets are further filtered out by extracting target features. Based on the correlation between targets, a target matching algorithm is proposed to improve the detection accuracy. Finally, the effectiveness of the above methods is verified by practical experiments.

  17. Effectiveness Comparison of TxDOT Quality Control/Quality Assurance and Method Specifications

    DOT National Transportation Integrated Search

    1998-12-01

    Original Report date: October 1997. This is the first and final report for research project 0-1721, "Effectiveness Comparison of TxDOT Quality Control/Quality Assurance and Method Specifications." This study was established and sponsored by TxDOT to ...

  18. System and method for detecting cells or components thereof

    DOEpatents

    Porter, Marc D [Ames, IA; Lipert, Robert J [Ames, IA; Doyle, Robert T [Ames, IA; Grubisha, Desiree S [Corona, CA; Rahman, Salma [Ames, IA

    2009-01-06

    A system and method for detecting a detectably labeled cell or component thereof in a sample comprising one or more cells or components thereof, at least one cell or component thereof of which is detectably labeled with at least two detectable labels. In one embodiment, the method comprises: (i) introducing the sample into one or more flow cells of a flow cytometer, (ii) irradiating the sample with one or more light sources that are absorbed by the at least two detectable labels, the absorption of which is to be detected, and (iii) detecting simultaneously the absorption of light by the at least two detectable labels on the detectably labeled cell or component thereof with an array of photomultiplier tubes, which are operably linked to two or more filters that selectively transmit detectable emissions from the at least two detectable labels.

  19. [Quality assurance in geriatric rehabilitation--approaches and methods].

    PubMed

    Deckenbach, B; Borchelt, M; Steinhagen-Thiessen, E

    1997-08-01

    Quality assurance issues have gained significance in the field of geriatric rehabilitation as well, and this did not require the provisions of the 5th Book of the Social Code. While in the surgical specialties experience, in particular with external quality assurance, has already been gathered over several years, suitable concepts and methods for the new geriatric rehabilitation specialty are still in the early stages of development. Proven methods from the industrial and service sectors, such as auditing, monitoring and quality circles, can in principle be drawn on for devising geriatric rehabilitation quality assurance schemes. These need to take into account, in particular, the multiple factors influencing the course and outcome of rehabilitation entailed by multimorbidity and multi-drug use, the eminent role of the social environment, therapeutic interventions by a multidisciplinary team, and the multi-dimensional nature of rehabilitation outcomes. Moreover, the specific conditions of geriatric rehabilitation require the development not only of quality standards unique to this domain but also of quality assurance procedures specific to geriatrics. Along with a number of other methods, standardized geriatric assessment will play a crucial role in this respect.

  20. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

    This paper presents an automated method of abdominal lymph node detection to aid the preoperative diagnosis of abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might have a metastasis. This procedure is called lymphadenectomy or lymph node dissection. Insufficient lymphadenectomy carries a high risk for relapse. However, excessive resection decreases a patient's quality of life. Therefore, it is important to identify the location and the structure of lymph nodes to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the proposed method uses a classifier based on support vector machine with the texture and shape information. The experimental results reveal that it detects 70.5% of the lymph nodes with 13.0 false positives per case.
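
    The candidate-detection step can be illustrated with a Hessian-based blob-likeness filter; the sketch below works on a 2-D slice for brevity and uses eigenvalue rules and scales that are our own choices, so it is a simplified stand-in for the 3-D local-intensity-structure filter and SVM stage described above.

    import numpy as np
    from scipy import ndimage

    def blob_enhancement_2d(image, sigmas=(1.5, 2.5, 3.5)):
        """Multi-scale blob-likeness response on a 2-D slice. Bright blob-like structures
        have two strongly negative Hessian eigenvalues of similar magnitude."""
        image = np.asarray(image, dtype=float)
        response = np.zeros_like(image)
        for sigma in sigmas:
            # scale-normalised second derivatives of the Gaussian-smoothed slice
            hrr = ndimage.gaussian_filter(image, sigma, order=(2, 0)) * sigma ** 2
            hcc = ndimage.gaussian_filter(image, sigma, order=(0, 2)) * sigma ** 2
            hrc = ndimage.gaussian_filter(image, sigma, order=(1, 1)) * sigma ** 2
            # closed-form eigenvalues of the 2x2 Hessian at every pixel
            half_trace = 0.5 * (hrr + hcc)
            root = np.sqrt((0.5 * (hrr - hcc)) ** 2 + hrc ** 2)
            lam_hi, lam_lo = half_trace + root, half_trace - root
            # bright blobs: both eigenvalues negative; scoring by the less negative one
            # keeps elongated (vessel-like) structures from scoring high
            blobness = np.where((lam_hi < 0) & (lam_lo < 0), -lam_hi, 0.0)
            response = np.maximum(response, blobness)
        return response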

  1. Identification of suitable fundus images using automated quality assessment methods.

    PubMed

    Şevik, Uğur; Köse, Cemal; Berber, Tolga; Erdöl, Hidayet

    2014-04-01

    Retinal image quality assessment (IQA) is a crucial process for automated retinal image analysis systems to obtain an accurate and successful diagnosis of retinal diseases. Consequently, the first step in a good retinal image analysis system is measuring the quality of the input image. We present an approach for finding medically suitable retinal images for retinal diagnosis. We used a three-class grading system that consists of good, bad, and outlier classes. We created a retinal image quality dataset with a total of 216 consecutive images called the Diabetic Retinopathy Image Database. We identified the suitable images within the good images for automatic retinal image analysis systems using a novel method. Subsequently, we evaluated our retinal image suitability approach using the Digital Retinal Images for Vessel Extraction and Standard Diabetic Retinopathy Database Calibration level 1 public datasets. The results were measured through the F1 metric, which is a harmonic mean of precision and recall metrics. The highest F1 scores of the IQA tests were 99.60%, 96.50%, and 85.00% for good, bad, and outlier classes, respectively. Additionally, the accuracy of our suitable image detection approach was 98.08%. Our approach can be integrated into any automatic retinal analysis system with sufficient performance scores.
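
    For reference, the F1 score quoted above is simply the harmonic mean of precision and recall, e.g.:

    def f1_score(precision, recall):
        """Harmonic mean of precision and recall."""
        return 2.0 * precision * recall / (precision + recall)

    print(f1_score(0.97, 0.96))   # hypothetical precision/recall pair -> about 0.965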

  2. Detection and characterisation of anthropogenic pieces by magnetic method

    NASA Astrophysics Data System (ADS)

    Nodot, Emilie; Munschy, Marc; Benevent, Pierre

    2013-04-01

    Human activities have left many anthropogenic objects buried under our feet. Some of these, like explosive devices left after the World Wars, turn out to be a threat to safety or the environment. Others, for example gas pipes, must be located precisely in case of construction work. Geophysics, and more specifically magnetic mapping (many of these items are magnetic), can obviously help to locate them. We already use this method on a daily basis to detect UXO (unexploded ordnance), but less than 10% of the unearthed objects are actually bombs or shells; detection and above all characterisation methods must be improved in order to reduce this proportion. In the field there are a few things we can do to improve data quality. Characterisation may be improved by multi-scale prospection: we survey a large area with our usual, rather fast method, then acquire high-definition maps of small areas of interest (over the object to characterise). In the case of measurements in an urban environment, for example, the data are distorted: the traffic (trains, tramways, cars…) produces temporal variations of the magnetic field. This effect can be lessened, and sometimes even removed, by the use of a fixed scalar magnetic sensor. Data processing is another key to characterisation. Tools such as the analytic signal or derivatives are frequently used at the first degree; we will show on a synthetic case that the second and third degrees bring even more information. A new issue appeared recently concerning pipes: can we localise a gas pipe very precisely (with less than 10 cm uncertainty)? Horizontally we can, but due to our inversion method we still have trouble with depth accuracy. Our final concern is the amplitude of some anomalies. Potential-field equations are based on the assumption that the anomaly norm is small compared with the magnetic field norm; sometimes this is not the case, and vector magnetometry is a lead to solve this problem.
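
    One of the standard tools mentioned above, the analytic signal of a magnetic profile, can be sketched in a few lines; for profile (2-D source) data the vertical derivative is the Hilbert transform of the horizontal derivative, so the amplitude falls out of one library call. This is a generic illustration, not the authors' processing chain.

    import numpy as np
    from scipy.signal import hilbert

    def analytic_signal_amplitude(anomaly_profile, dx):
        """Analytic-signal amplitude sqrt((dT/dx)^2 + (dT/dz)^2) of a magnetic profile.
        For profile data over 2-D sources, dT/dz is the Hilbert transform of dT/dx, so the
        amplitude is the envelope of the analytic signal of the horizontal derivative."""
        dT_dx = np.gradient(np.asarray(anomaly_profile, dtype=float), dx)
        return np.abs(hilbert(dT_dx))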

  3. Tunnel Detection Using Seismic Methods

    NASA Astrophysics Data System (ADS)

    Miller, R.; Park, C. B.; Xia, J.; Ivanov, J.; Steeples, D. W.; Ryden, N.; Ballard, R. F.; Llopis, J. L.; Anderson, T. S.; Moran, M. L.; Ketcham, S. A.

    2006-05-01

    Surface seismic methods have shown great promise for use in detecting clandestine tunnels in areas where unauthorized movement beneath secure boundaries has been, or is, a matter of concern for authorities. Unauthorized infiltration beneath national borders and into or out of secure facilities is possible at many sites by tunneling. Developments in acquisition, processing, and analysis techniques using multi-channel seismic imaging have opened the door to a vast number of near-surface applications, including anomaly detection and delineation, specifically tunnels. Body waves have great potential based on modeling and very preliminary empirical studies trying to capitalize on diffracted energy. A primary limitation of all seismic energy is the natural attenuation of high-frequency energy by earth materials and the difficulty in transmitting a high-amplitude source pulse with a broad spectrum above 500 Hz into the earth. Surface waves have shown great potential since the development of multi-channel analysis methods (e.g., MASW). Both shear-wave velocity and backscattered energy from surface waves have been shown, through modeling and empirical studies, to have great promise in detecting the presence of anomalies such as tunnels. Success in developing and evaluating various seismic approaches for detecting tunnels relies on investigations at known tunnel locations, in a variety of geologic settings, employing a wide range of seismic methods, and targeting a range of uniquely different tunnel geometries, characteristics, and host lithologies. Body-wave research at the Moffat tunnels in Winter Park, Colorado, provided well-defined diffraction-looking events that correlated with the subsurface location of the tunnel complex. Natural voids related to karst have been studied in Kansas, Oklahoma, Alabama, and Florida using shear-wave velocity imaging techniques based on the MASW approach. Manmade tunnels, culverts, and crawl spaces have been the target of multi-modal analysis

  4. Study of comparison between Ultra-high Frequency (UHF) method and ultrasonic method on PD detection for GIS

    NASA Astrophysics Data System (ADS)

    Li, Yanran; Chen, Duo; Li, Li; Zhang, Jiwei; Li, Guang; Liu, Hongxia

    2017-11-01

    GIS (gas-insulated switchgear) is an important type of equipment in power systems. Partial discharge (PD) measurement plays an important role in assessing the insulation performance of GIS, and the UHF method and the ultrasonic method are frequently used for PD detection in GIS. However, few studies have compared these two methods. From the viewpoint of safety, it is necessary to investigate the UHF method and the ultrasonic method for partial discharge in GIS. This paper presents a study aimed at clarifying the effectiveness of the UHF method and the ultrasonic method for partial discharge caused by free metal particles in GIS. Partial discharge tests were performed in a simulated laboratory environment. The obtained results show the anti-interference capability of signal detection and the accuracy of fault localization for the UHF method and the ultrasonic method. A new method combining the UHF and ultrasonic methods for PD detection in GIS is proposed in order to greatly enhance the anti-interference capability of signal detection and the accuracy of localization.

  5. Spectral methods to detect surface mines

    NASA Astrophysics Data System (ADS)

    Winter, Edwin M.; Schatten Silvious, Miranda

    2008-04-01

    Over the past five years, advances have been made in the spectral detection of surface mines under minefield detection programs at the U. S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD). The problem of detecting surface land mines ranges from the relatively simple, the detection of large anti-vehicle mines on bare soil, to the very difficult, the detection of anti-personnel mines in thick vegetation. While spatial and spectral approaches can be applied to the detection of surface mines, spatial-only detection requires many pixels-on-target such that the mine is actually imaged and shape-based features can be exploited. This method is unreliable in vegetated areas because only part of the mine may be exposed, while spectral detection is possible without the mine being resolved. At NVESD, hyperspectral and multi-spectral sensors throughout the reflection and thermal spectral regimes have been applied to the mine detection problem. Data has been collected on mines in forest and desert regions and algorithms have been developed both to detect the mines as anomalies and to detect the mines based on their spectral signature. In addition to the detection of individual mines, algorithms have been developed to exploit the similarities of mines in a minefield to improve their detection probability. In this paper, the types of spectral data collected over the past five years will be summarized along with the advances in algorithm development.
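
    As a generic baseline for the anomaly-detection side of this work, the sketch below implements the global RX detector, i.e. the Mahalanobis distance of each pixel spectrum from the scene background; it is offered only as an illustration of spectral anomaly scoring, not as the NVESD algorithms.

    import numpy as np

    def rx_anomaly_scores(cube):
        """Global RX anomaly scores for a hyperspectral cube of shape (rows, cols, bands)."""
        rows, cols, bands = cube.shape
        pixels = cube.reshape(-1, bands).astype(float)
        mean = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(bands)   # regularised background covariance
        cov_inv = np.linalg.inv(cov)
        centred = pixels - mean
        scores = np.einsum("ij,jk,ik->i", centred, cov_inv, centred)
        return scores.reshape(rows, cols)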

  6. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    PubMed

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how can regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations be efficiently utilized to detect water quality contamination events? This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work is aimed at reducing this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Bayesian methods for outliers detection in GNSS time series

    NASA Astrophysics Data System (ADS)

    Qianqian, Zhang; Qingming, Gui

    2013-07-01

    This article is concerned with the problem of detecting outliers in GNSS time series based on Bayesian statistical theory. Firstly, a new model is proposed to simultaneously detect different types of outliers by introducing different types of classification variables corresponding to the different types of outliers; the problem of outlier detection is converted into the computation of the corresponding posterior probabilities, and an algorithm for computing the posterior probabilities based on a standard Gibbs sampler is designed. Secondly, we analyze in detail the causes of masking and swamping when detecting patches of additive outliers, and an unmasking Bayesian method for detecting additive outlier patches is proposed based on an adaptive Gibbs sampler. Thirdly, the correctness of the theories and methods proposed above is illustrated with simulated data and then by analyzing real GNSS observations, such as cycle-slip detection in carrier phase data. The examples illustrate that the Bayesian methods for outlier detection in GNSS time series proposed in this paper are capable of detecting not only isolated outliers but also additive outlier patches. Furthermore, they can be successfully used to process cycle slips in phase data, which solves the problem of small cycle slips.
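
    The single-point building block of such an approach can be written down in closed form; the sketch below computes the posterior probability that each residual carries an additive outlier under deliberately simple assumptions (known noise level, known outlier scale, independent Bernoulli prior), whereas the paper's Gibbs-sampler treatment of outlier patches and cycle slips is much richer.

    import numpy as np
    from scipy.stats import norm

    def additive_outlier_posterior(residuals, sigma, tau, prior=0.01):
        """P(outlier | residual) for each point, assuming residual ~ N(0, sigma^2) when clean
        and residual ~ N(0, sigma^2 + tau^2) when an additive outlier of scale tau is present."""
        residuals = np.asarray(residuals, dtype=float)
        like_outlier = norm.pdf(residuals, scale=np.sqrt(sigma ** 2 + tau ** 2))
        like_clean = norm.pdf(residuals, scale=sigma)
        numer = prior * like_outlier
        return numer / (numer + (1.0 - prior) * like_clean)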

  8. Quality Control Guidelines for SAM Chemical Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the chemistry methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  9. Quality Control Guidelines for SAM Radiochemical Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the radiochemistry methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  10. Quality Control Guidelines for SAM Biotoxin Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the biotoxin methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  11. Quality Control Guidelines for SAM Pathogen Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the pathogen methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  12. Optimization of a Viability PCR Method for the Detection of Listeria monocytogenes in Food Samples.

    PubMed

    Agustí, Gemma; Fittipaldi, Mariana; Codony, Francesc

    2018-06-01

    Rapid detection of Listeria and other microbial pathogens in food is an essential part of quality control and is critical for ensuring consumer safety. Culture-based methods for detecting foodborne pathogens are time-consuming and laborious and cannot detect viable but non-culturable microorganisms, whereas viability PCR methodology provides quick results, is able to detect viable but non-culturable cells, and allows easier handling of large numbers of samples. The most critical point in using the viability PCR technique is achieving complete exclusion of amplification signals from dead cells, and many improvements are being introduced to overcome this. In the present work, the yield of dead-cell DNA neutralization was enhanced by incorporating two new sample treatment strategies: a tube change combined with a double light treatment. This procedure was successfully tested using artificially contaminated food samples, showing improved neutralization of dead-cell DNA.

  13. Molecular methods for pathogen detection and quantification

    USDA-ARS?s Scientific Manuscript database

    Ongoing interest in convenient, inexpensive, fast, sensitive and accurate techniques for detecting and/or quantifying the presence of soybean pathogens has resulted in increased usage of molecular tools. The method of extracting a molecular target (usually DNA or RNA) for detection depends wholly up...

  14. External Quality Assessment for the Detection of Measles Virus by Reverse Transcription-PCR Using Armored RNA

    PubMed Central

    Jia, Tingting; Zhang, Lei; Wang, Guojing; Zhang, Rui; Zhang, Kuo; Lin, Guigao; Xie, Jiehong; Wang, Lunan; Li, Jinming

    2015-01-01

    In recent years, nucleic acid tests for detection of measles virus RNA have been widely applied in laboratories belonging to the measles surveillance system of China. An external quality assessment program was established by the National Center for Clinical Laboratories to evaluate the performance of nucleic acid tests for measles virus. The external quality assessment panel, which consisted of 10 specimens, was prepared using armored RNAs, complexes of noninfectious MS2 bacteriophage coat proteins encapsulating measles virus RNA, as measles virus surrogate controls. Conserved sequences amplified from a circulating measles virus strain or from a vaccine strain were encapsulated into these armored RNAs. Forty-one participating laboratories from 15 provinces, municipalities, or autonomous regions that currently conduct molecular detection of measles virus enrolled in the external quality assessment program, including 40 measles surveillance system laboratories and one diagnostic reagent manufacturer. Forty laboratories used commercial reverse transcription-quantitative PCR kits, with only one laboratory applying a conventional PCR method developed in-house. The results indicated that most of the participants (38/41, 92.7%) were able to accurately detect the panel with 100% sensitivity and 100% specificity. Although a wide range of commercially available kits for nucleic acid extraction and reverse transcription polymerase chain reaction were used by the participants, only two false-negative results and one false-positive result were generated; these were generated by three separate laboratories. Both false-negative results were obtained with tests performed on specimens with the lowest concentration (1.2 × 10⁴ genomic equivalents/mL). In addition, all 18 participants from Beijing achieved 100% sensitivity and 100% specificity. Overall, we conclude that the majority of the laboratories evaluated have reliable diagnostic capacities for the detection of measles virus

  15. External Quality Assessment for the Detection of Measles Virus by Reverse Transcription-PCR Using Armored RNA.

    PubMed

    Zhang, Dong; Sun, Yu; Jia, Tingting; Zhang, Lei; Wang, Guojing; Zhang, Rui; Zhang, Kuo; Lin, Guigao; Xie, Jiehong; Wang, Lunan; Li, Jinming

    2015-01-01

    In recent years, nucleic acid tests for detection of measles virus RNA have been widely applied in laboratories belonging to the measles surveillance system of China. An external quality assessment program was established by the National Center for Clinical Laboratories to evaluate the performance of nucleic acid tests for measles virus. The external quality assessment panel, which consisted of 10 specimens, was prepared using armored RNAs, complexes of noninfectious MS2 bacteriophage coat proteins encapsulating measles virus RNA, as measles virus surrogate controls. Conserved sequences amplified from a circulating measles virus strain or from a vaccine strain were encapsulated into these armored RNAs. Forty-one participating laboratories from 15 provinces, municipalities, or autonomous regions that currently conduct molecular detection of measles virus enrolled in the external quality assessment program, including 40 measles surveillance system laboratories and one diagnostic reagent manufacturer. Forty laboratories used commercial reverse transcription-quantitative PCR kits, with only one laboratory applying a conventional PCR method developed in-house. The results indicated that most of the participants (38/41, 92.7%) were able to accurately detect the panel with 100% sensitivity and 100% specificity. Although a wide range of commercially available kits for nucleic acid extraction and reverse transcription polymerase chain reaction were used by the participants, only two false-negative results and one false-positive result were generated; these were generated by three separate laboratories. Both false-negative results were obtained with tests performed on specimens with the lowest concentration (1.2 × 10⁴ genomic equivalents/mL). In addition, all 18 participants from Beijing achieved 100% sensitivity and 100% specificity. Overall, we conclude that the majority of the laboratories evaluated have reliable diagnostic capacities for the detection of measles virus.

  16. Dim target detection method based on salient graph fusion

    NASA Astrophysics Data System (ADS)

    Hu, Ruo-lan; Shen, Yi-yan; Jiang, Jun

    2018-02-01

    Dim target detection is a key problem in the digital image processing field. With the development of multi-spectrum imaging sensors, it has become a trend to improve the performance of dim target detection by fusing information from different spectral images. In this paper, a dim target detection method based on salient graph fusion is proposed. In the method, multi-direction Gabor filters and multi-scale contrast filters are combined to construct a salient graph from a digital image. A maximum-salience fusion strategy is then designed to fuse the salient graphs from different spectral images, and a top-hat filter is used to detect dim targets from the fused salient graph. Experimental results show that the proposed method improved the probability of target detection and reduced the probability of false alarm on clutter background images.
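
    The final detection step can be pictured with a few lines of code: a white top-hat removes the slowly varying background of the fused salient graph and leaves small bright residues, which are then thresholded. The window size and the k-sigma threshold below are illustrative choices, not values from the paper.

    import numpy as np
    from scipy import ndimage

    def tophat_detect(fused_salience, size=5, k=3.0):
        """Binary detection map from a fused salient graph via white top-hat filtering."""
        fused = np.asarray(fused_salience, dtype=float)
        residue = ndimage.white_tophat(fused, size=size)    # keeps structures smaller than `size`
        return residue > residue.mean() + k * residue.std()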

  17. [Optimized application of nested PCR method for detection of malaria].

    PubMed

    Yao-Guang, Z; Li, J; Zhen-Yu, W; Li, C

    2017-04-28

    Objective To optimize the application of the nested PCR method for the detection of malaria according to working practice, so as to improve the efficiency of malaria detection. Methods A PCR premix, internal primers for further amplification, and newly designed primers targeting two Plasmodium ovale subspecies were employed to optimize the reaction system, reaction conditions and specific primers of P. ovale on the basis of routine nested PCR. The specificity and sensitivity of the optimized method were then analyzed. Positive blood samples and examination samples of malaria were detected by the routine nested PCR and the optimized method simultaneously, and the detection results were compared and analyzed. Results The optimized method showed good specificity, and its sensitivity reached the pg to fg level. When the two methods were used to detect the same positive malarial blood samples simultaneously, the PCR products of the two methods showed no significant difference, but non-specific amplification was obviously reduced, the detection rate of P. ovale subspecies improved, and the overall specificity also increased with the optimized method. The detection results of 111 malarial blood samples showed that the sensitivity and specificity of the routine nested PCR were 94.57% and 86.96%, respectively, while those of the optimized method were both 93.48%; the difference in sensitivity between the two methods was not statistically significant (P > 0.05), but the difference in specificity was (P < 0.05). Conclusion The optimized PCR can improve the specificity without reducing the sensitivity of the routine nested PCR, and it can also save costs and increase the efficiency of malaria detection by requiring fewer experimental steps.

  18. DNA Extraction Method Affects the Detection of a Fungal Pathogen in Formalin-Fixed Specimens Using qPCR.

    PubMed

    Adams, Andrea J; LaBonte, John P; Ball, Morgan L; Richards-Hrdlicka, Kathryn L; Toothman, Mary H; Briggs, Cheryl J

    2015-01-01

    Museum collections provide indispensable repositories for obtaining information about the historical presence of disease in wildlife populations. The pathogenic amphibian chytrid fungus Batrachochytrium dendrobatidis (Bd) has played a significant role in global amphibian declines, and examining preserved specimens for Bd can improve our understanding of its emergence and spread. Quantitative PCR (qPCR) enables Bd detection with minimal disturbance to amphibian skin and is significantly more sensitive to detecting Bd than histology; therefore, developing effective qPCR methodologies for detecting Bd DNA in formalin-fixed specimens can provide an efficient and effective approach to examining historical Bd emergence and prevalence. Techniques for detecting Bd in museum specimens have not been evaluated for their effectiveness in control specimens that mimic the conditions of animals most likely to be encountered in museums, including those with low pathogen loads. We used American bullfrogs (Lithobates catesbeianus) of known infection status to evaluate the success of qPCR to detect Bd in formalin-fixed specimens after three years of ethanol storage. Our objectives were to compare the most commonly used DNA extraction method for Bd (PrepMan, PM) to Macherey-Nagel DNA FFPE (MN), test optimizations for Bd detection with PM, and provide recommendations for maximizing Bd detection. We found that successful detection is relatively high (80-90%) when Bd loads before formalin fixation are high, regardless of the extraction method used; however, at lower infection levels, detection probabilities were significantly reduced. The MN DNA extraction method increased Bd detection by as much as 50% at moderate infection levels. Our results indicate that, for animals characterized by lower pathogen loads (i.e., those most commonly encountered in museum collections), current methods may underestimate the proportion of Bd-infected amphibians. Those extracting DNA from archived museum

  19. Evaluating online data of water quality changes in a pilot drinking water distribution system with multivariate data exploration methods.

    PubMed

    Mustonen, Satu M; Tissari, Soile; Huikko, Laura; Kolehmainen, Mikko; Lehtola, Markku J; Hirvonen, Arja

    2008-05-01

    The distribution of drinking water generates soft deposits and biofilms in the pipelines of distribution systems. Disturbances in water distribution can detach these deposits and biofilms and thus deteriorate the water quality. We studied the effects of simulated pressure shocks on water quality with online analysers. The study was conducted with copper and composite plastic pipelines in a pilot distribution system. The online data gathered during the study were evaluated with the Self-Organising Map (SOM) and Sammon's mapping, which are useful methods for exploring large amounts of multivariate data. The objective was to test the usefulness of these methods in pinpointing abnormal water quality changes in the online data. The pressure shocks temporarily increased the number of particles, turbidity and electrical conductivity. SOM and Sammon's mapping were able to separate these situations from the normal data and thus make them visible. These methods therefore make it possible to detect abrupt changes in water quality and to react rapidly to any disturbances in the system. They are useful in developing alert systems and predictive applications connected to online monitoring.

  20. New Optical Methods for Liveness Detection on Fingers

    PubMed Central

    Dolezel, Michal; Vana, Jan; Brezinova, Eva; Yim, Jaegeol; Shim, Kyubark

    2013-01-01

    This paper is devoted to new optical methods, which are supposed to be used for liveness detection on fingers. First we describe the basics about fake finger use in fingerprint recognition process and the possibilities of liveness detection. Then we continue with introducing three new liveness detection methods, which we developed and tested in the scope of our research activities—the first one is based on measurement of the pulse, the second one on variations of optical characteristics caused by pressure change, and the last one is based on reaction of skin to illumination with different wavelengths. The last part deals with the influence of skin diseases on fingerprint recognition, especially on liveness detection. PMID:24151584

  1. A morphological method for ammonia detection in liver

    PubMed Central

    Gutiérrez-de-Juan, Virginia; López de Davalillo, Sergio; Fernández-Ramos, David; Barbier-Torres, Lucía; Zubiete-Franco, Imanol; Fernández-Tussy, Pablo; Simon, Jorge; Lopitz-Otsoa, Fernando; de las Heras, Javier; Iruzubieta, Paula; Arias-Loste, María Teresa; Villa, Erica; Crespo, Javier; Andrade, Raúl; Lucena, M. Isabel; Varela-Rey, Marta; Lu, Shelly C.; Mato, José M.; Delgado, Teresa Cardoso

    2017-01-01

    Hyperammonemia is a metabolic condition characterized by elevated levels of ammonia and a common event in acute liver injury/failure and chronic liver disease. Even though hepatic ammonia levels are potential predictive factors of patient outcome, easy and inexpensive methods aiming at the detection of liver ammonia accumulation in the clinical setting remain unavailable. Thus, herein we have developed a morphological method, based on the utilization of Nessler's reagent, to accurately and precisely detect the accumulation of ammonia in biological tissue. We have validated our method against a commercially available kit in mouse tissue samples and, by using this modified method, we have confirmed the hepatic accumulation of ammonia in clinical and animal models of acute and chronic advanced liver injury as well as in the progression of fatty liver disease. Overall, we propose a morphological method for ammonia detection in liver that correlates well with the degree of liver disease severity and therefore can be potentially used to predict patient outcome. PMID:28319158

  2. A morphological method for ammonia detection in liver.

    PubMed

    Gutiérrez-de-Juan, Virginia; López de Davalillo, Sergio; Fernández-Ramos, David; Barbier-Torres, Lucía; Zubiete-Franco, Imanol; Fernández-Tussy, Pablo; Simon, Jorge; Lopitz-Otsoa, Fernando; de Las Heras, Javier; Iruzubieta, Paula; Arias-Loste, María Teresa; Villa, Erica; Crespo, Javier; Andrade, Raúl; Lucena, M Isabel; Varela-Rey, Marta; Lu, Shelly C; Mato, José M; Delgado, Teresa Cardoso; Martínez-Chantar, María-Luz

    2017-01-01

    Hyperammonemia is a metabolic condition characterized by elevated levels of ammonia and a common event in acute liver injury/failure and chronic liver disease. Even though hepatic ammonia levels are potential predictive factors of patient outcome, easy and inexpensive methods aiming at the detection of liver ammonia accumulation in the clinical setting remain unavailable. Thus, herein we have developed a morphological method, based on the utilization of Nessler's reagent, to accurately and precisely detect the accumulation of ammonia in biological tissue. We have validated our method against a commercially available kit in mouse tissue samples and, by using this modified method, we have confirmed the hepatic accumulation of ammonia in clinical and animal models of acute and chronic advanced liver injury as well as in the progression of fatty liver disease. Overall, we propose a morphological method for ammonia detection in liver that correlates well with the degree of liver disease severity and therefore can be potentially used to predict patient outcome.

  3. Apparatus and methods for detecting chemical permeation

    DOEpatents

    Vo-Dinh, T.

    1994-12-27

    Apparatus and methods for detecting the permeation of hazardous or toxic chemicals through protective clothing are disclosed. The hazardous or toxic chemicals of interest do not possess the spectral characteristic of luminescence. The apparatus and methods utilize a spectrochemical modification technique to detect luminescence quenching of an indicator compound: upon permeation of the chemical through the protective clothing, the indicator is exposed to the chemical, and the quenching indicates chemical permeation. The invention also relates to the fabrication of protective clothing materials. 13 figures.

  4. Novel methods for detecting buried explosive devices

    NASA Astrophysics Data System (ADS)

    Kercel, Stephen W.; Burlage, Robert S.; Patek, David R.; Smith, Cyrus M.; Hibbs, Andrew D.; Rayner, Timothy J.

    1997-07-01

    Oak Ridge National Laboratory and Quantum Magnetics, Inc. are exploring novel landmine detection technologies. Technologies considered here include bioreporter bacteria, swept acoustic resonance, nuclear quadrupole resonance (NQR), and semiotic data fusion. Bioreporter bacteria look promising for third-world humanitarian applications; they are inexpensive, and deployment does not require high-tech methods. Swept acoustic resonance may be a useful adjunct to magnetometers in humanitarian demining. For military demining, NQR is a promising method for detecting explosive substances; of 50,000 substances that have been tested, one has an NQR signature that can be mistaken for RDX or TNT. For both military and commercial demining, sensor fusion entails two daunting tasks, identifying fusible features in both present-day and emerging technologies, and devising a fusion algorithm that runs in real-time on cheap hardware. Preliminary research in these areas is encouraging. A bioreporter bacterium for TNT detection is under development. Investigation has just started in swept acoustic resonance as an approach to a cheap mine detector for humanitarian use. Real-time wavelet processing appears to be a key to extending NQR bomb detection into mine detection, including TNT-based mines. Recent discoveries in semiotics may be the breakthrough that will lead to a robust fused detection scheme.

  5. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques, such as principal component analysis, are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences of the dynamical system; it combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamic states of a protein over different time scales. It is not limited to linear dynamics regimes and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for time windows that are large enough to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
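
    The flavour of the approach can be conveyed with a toy version that uses a single recurrence measure: resample windows from a baseline stretch of the trajectory to build a null distribution of the recurrence rate, then ask how extreme a test window is. The measure, window handling and p-value below are simplified assumptions, not the published procedure.

    import numpy as np

    def recurrence_rate(window, epsilon):
        """Fraction of frame pairs in a (n_frames, n_coords) window closer than epsilon."""
        d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=-1)
        return np.mean(d < epsilon)

    def transition_pvalue(baseline, test_window, epsilon, n_boot=1000, seed=0):
        """Two-sided bootstrap p-value for the test window's recurrence rate."""
        rng = np.random.default_rng(seed)
        win_len = len(test_window)
        starts = rng.integers(0, len(baseline) - win_len, size=n_boot)
        null = np.array([recurrence_rate(baseline[s:s + win_len], epsilon) for s in starts])
        observed = recurrence_rate(test_window, epsilon)
        return np.mean(np.abs(null - null.mean()) >= np.abs(observed - null.mean()))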

  6. Accurate Determination of the Q Quality Factor in Magnetoelastic Resonant Platforms for Advanced Biological Detection

    PubMed Central

    Lopes, Ana Catarina; Sagasti, Ariane; Lasheras, Andoni; Muto, Virginia; Gutiérrez, Jon; Kouzoudis, Dimitris; Barandiarán, José Manuel

    2018-01-01

    The main parameters of magnetoelastic resonators in the detection of chemical (i.e., salts, gases, etc.) or biological (i.e., bacteria, phages, etc.) agents are the sensitivity S (or external agent change magnitude per Hz change in the resonance frequency) and the quality factor Q of the resonance. We present an extensive study on the experimental determination of the Q factor in such magnetoelastic resonant platforms, using three different strategies: (a) analyzing the real and imaginary components of the susceptibility at resonance; (b) numerical fitting of the modulus of the susceptibility; (c) using an exact mathematical expression for the real part of the susceptibility. Q values obtained by the three methods are analyzed and discussed, aiming to establish the most adequate one to accurately determine the quality factor of the magnetoelastic resonance. PMID:29547578

  7. Accurate Determination of the Q Quality Factor in Magnetoelastic Resonant Platforms for Advanced Biological Detection.

    PubMed

    Lopes, Ana Catarina; Sagasti, Ariane; Lasheras, Andoni; Muto, Virginia; Gutiérrez, Jon; Kouzoudis, Dimitris; Barandiarán, José Manuel

    2018-03-16

    The main parameters of magnetoelastic resonators in the detection of chemical (i.e., salts, gases, etc.) or biological (i.e., bacteria, phages, etc.) agents are the sensitivity S (or external agent change magnitude per Hz change in the resonance frequency) and the quality factor Q of the resonance. We present an extensive study on the experimental determination of the Q factor in such magnetoelastic resonant platforms, using three different strategies: (a) analyzing the real and imaginary components of the susceptibility at resonance; (b) numerical fitting of the modulus of the susceptibility; (c) using an exact mathematical expression for the real part of the susceptibility. Q values obtained by the three methods are analyzed and discussed, aiming to establish the most adequate one to accurately determine the quality factor of the magnetoelastic resonance.
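
    For orientation, the crudest comparable estimate, Q as the resonance frequency divided by the half-power bandwidth, can be read straight off a measured amplitude curve; the sketch below is a generic illustration and the paper's three susceptibility-based strategies are more careful than this.

    import numpy as np

    def q_from_halfpower(freq, amplitude):
        """Q = f_res / Delta f, with Delta f the full width at 1/sqrt(2) of the peak amplitude.
        Assumes a single, well-sampled resonance in the supplied frequency sweep."""
        freq = np.asarray(freq, dtype=float)
        amplitude = np.asarray(amplitude, dtype=float)
        i_peak = np.argmax(amplitude)
        above = np.where(amplitude >= amplitude[i_peak] / np.sqrt(2.0))[0]
        bandwidth = freq[above[-1]] - freq[above[0]]
        return freq[i_peak] / bandwidth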

  8. Comparison of detection methods for cell surface globotriaosylceramide.

    PubMed

    Kim, Minji; Binnington, Beth; Sakac, Darinka; Fernandes, Kimberly R; Shi, Sheryl P; Lingwood, Clifford A; Branch, Donald R

    2011-08-31

    The cell surface-expressed glycosphingolipid (GSL), globotriaosylceramide (Gb3), is becoming increasingly important and is widely studied in the areas of verotoxin (VT)-mediated cytotoxicity, human immunodeficiency virus (HIV) infection, immunology and cancer. However, despite its diverse roles and implications, an optimized detection method for cell surface Gb3 has not been determined. GSLs are differentially organized in the plasma membrane, which can affect their availability for protein binding. To examine various detection methods for cell surface Gb3, we compared four reagents for use in flow cytometry analysis. A natural ligand (VT1B) and three different monoclonal antibodies (mAbs) were optimized and tested on various human cell lines for Gb3 detection. A differential detection pattern of cell surface Gb3 expression, which was influenced by the choice of reagent, was observed. Two mAbs were found to be suboptimal. However, two other methods were found to be useful as defined by their high percentage of positivity and mean fluorescence intensity (MFI) values. Rat IgM anti-Gb3 mAb (clone 38-13) used with a phycoerythrin-conjugated secondary antibody was found to be the most specific detection method, while the use of VT1B conjugated to the Alexa488 fluorochrome was found to be the most sensitive, showing rare cross-reactivity only when Gb4 expression was highly elevated. The findings of this study demonstrate the variability in detection of Gb3 depending on the reagent and cell target used and emphasize the importance of selecting an optimal methodology in studies for the detection of cell surface expression of Gb3. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Methods of detection and identification of carbon- and nitrogen-containing materials

    DOEpatents

    Karev, Alexander Ivanovich; Raevsky, Valery Georgievich; Dzhalivyan, Leonid Zavenovich; Brothers, Louis Joseph; Wilhide, Larry K

    2013-11-12

    Methods for detecting and identifying carbon- and/or nitrogen-containing materials are disclosed. The methods may comprise detection of photo-nuclear reaction products of nitrogen and carbon to detect and identify the carbon- and/or nitrogen-containing materials.

  10. Evaluation of Anomaly Detection Method Based on Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Fontugne, Romain; Himura, Yosuke; Fukuda, Kensuke

    The number of threats on the Internet is rapidly increasing, and anomaly detection has become of increasing importance. High-speed backbone traffic is particularly affected, but its analysis is a complicated task due to the amount of data, the lack of payload data, asymmetric routing and the use of sampling techniques. Most anomaly detection schemes focus on the statistical properties of network traffic and highlight anomalous traffic through its singularities. In this paper, we concentrate on unusual traffic distributions, which are easily identifiable in temporal-spatial space (e.g., time/address or port). We present an anomaly detection method that uses a pattern recognition technique to identify anomalies in pictures representing traffic. The main advantage of this method is its ability to detect attacks involving mice flows. We evaluate the parameter set and the effectiveness of this approach by analyzing six years of Internet traffic collected from a trans-Pacific link. We show several examples of detected anomalies and compare our results with those of two other methods. The comparison indicates that the anomalies detected only by the pattern-recognition-based method are mainly malicious traffic with a few packets.

  11. A Method of Face Detection with Bayesian Probability

    NASA Astrophysics Data System (ADS)

    Sarker, Goutam

    2010-10-01

    The objective of face detection is to identify all images which contain a face, irrespective of orientation, illumination conditions, etc. This is a hard problem, because faces are highly variable in size, shape, lighting conditions and so on. Many methods have been designed and developed to detect faces in a single image. The present paper is based on an `Appearance Based Method', which relies on learning facial and non-facial features from image examples. This in turn is based on a statistical analysis of examples and counter-examples of facial images and employs a Bayesian conditional classification rule to estimate the probability that an image frame contains a face (or non-face). The detection rate of the present system is very high, and the numbers of false positive and false negative detections are substantially low.

  12. Method of Fault Detection and Rerouting

    NASA Technical Reports Server (NTRS)

    Gibson, Tracy L. (Inventor); Medelius, Pedro J. (Inventor); Lewis, Mark E. (Inventor)

    2013-01-01

    A system and method for detecting damage in an electrical wire, including delivering at least one test electrical signal to an outer electrically conductive material in a continuous or non-continuous layer covering an electrically insulative material layer that covers an electrically conductive wire core. Detecting the test electrical signals in the outer conductive material layer to obtain data that is processed to identify damage in the outer electrically conductive material layer.

  13. Compositions and methods for detecting single nucleotide polymorphisms

    DOEpatents

    Yeh, Hsin-Chih; Werner, James; Martinez, Jennifer S.

    2016-11-22

    Described herein are nucleic acid based probes and methods for discriminating and detecting single nucleotide variants in nucleic acid molecules (e.g., DNA). The methods include the use of a pair of probes to detect and identify polymorphisms, for example single nucleotide polymorphisms in DNA. The pair of probes emit a different fluorescent wavelength of light depending on the association and alignment of the probes when hybridized to a target nucleic acid molecule. Each pair of probes is capable of discriminating at least two different nucleic acid molecules that differ by at least a single nucleotide. The methods and probes can be used, for example, for detection of DNA polymorphisms that are indicative of a particular disease or condition.

  14. Stable, sensitive, fluorescence-based method for detecting cAMP.

    PubMed

    Hesley, Jayne; Daijo, Janet; Ferguson, Anne T

    2002-09-01

    cAMP is a universal secondary messenger that connects changes in the extracellular environment, as detected by cell surface receptors, to transcriptional changes in the nucleus. Since cAMP-mediated signal transduction plays a role in critical cell functions and human diseases, monitoring its activity can aid in understanding these responses and the process of drug discovery. This report examines the performance of a fluorescence-based competitive immunoassay in 384-well microplate format. Using purified cAMP as a competitor, the estimated detection limit was determined to be 0.1 nM and the Z'-factor was greater than 0.83, which indicates that the assay is of high quality and one of the most sensitive assays currently on the market. Of note, the results obtained were similar whether the reaction was allowed to proceed for 10 min or up to 60 min. Next, HEK 293 cells were treated with the promiscuous adenylate cyclase activator, forskolin, and the beta-adrenoceptor agonist, isoproterenol. The resultant average EC50 values were 11 microM and 123 nM, respectively, which correspond to those found in the literature. Together, these results demonstrate that this assay is a fast, accurate, non-radioactive method that is ideal for high-throughput screening.
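
    The Z'-factor quoted above is computed from the means and standard deviations of the positive and negative controls (one minus three times the sum of the control standard deviations divided by the absolute difference of the control means). Below is a minimal sketch of that calculation; the control readings are illustrative stand-ins, not data from the assay described in the record.

    ```python
    # Minimal sketch of the standard Z'-factor computation; the fluorescence values
    # below are illustrative placeholders.
    import numpy as np

    def z_prime(positive, negative):
        """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; values > 0.5 indicate an excellent assay."""
        positive, negative = np.asarray(positive, float), np.asarray(negative, float)
        return 1.0 - 3.0 * (positive.std(ddof=1) + negative.std(ddof=1)) / abs(
            positive.mean() - negative.mean())

    pos = [980, 1010, 995, 1005, 990]   # e.g. fluorescence with no competing cAMP
    neg = [110, 120, 105, 115, 112]     # e.g. fluorescence at saturating cAMP
    print(f"Z' = {z_prime(pos, neg):.2f}")
    ```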

  15. Softcopy quality ruler method: implementation and validation

    NASA Astrophysics Data System (ADS)

    Jin, Elaine W.; Keelan, Brian W.; Chen, Junqing; Phillips, Jonathan B.; Chen, Ying

    2009-01-01

    A softcopy quality ruler method was implemented for the International Imaging Industry Association (I3A) Camera Phone Image Quality (CPIQ) Initiative. This work extends ISO 20462 Part 3 by virtue of creating reference digital images of known subjective image quality, complementing the hardcopy Standard Reference Stimuli (SRS). The softcopy ruler method was developed using images from a Canon EOS 1Ds Mark II D-SLR digital still camera (DSC) and a Kodak P880 point-and-shoot DSC. Images were viewed on an Apple 30in Cinema Display at a viewing distance of 34 inches. Ruler images were made for 16 scenes. Thirty ruler images were generated for each scene, representing ISO 20462 Standard Quality Scale (SQS) values of approximately 2 to 31 at an increment of one just noticeable difference (JND) by adjusting the system modulation transfer function (MTF). A Matlab GUI was developed to display the ruler and test images side-by-side with a user-adjustable ruler level controlled by a slider. A validation study was performed at Kodak, Vista Point Technology, and Aptina Imaging in which all three companies set up a similar viewing lab to run the softcopy ruler method. The results show that the three sets of data are in reasonable agreement with each other, with the differences within the range expected from observer variability. Compared to previous implementations of the quality ruler, the slider-based user interface allows approximately 2x faster assessments with 21.6% better precision.

  16. Evaluation of two outlier-detection-based methods for detecting tissue-selective genes from microarray data.

    PubMed

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-05-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent's non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent's method is not suitable for ROKU.

  17. Droplet digital PCR-based EGFR mutation detection with an internal quality control index to determine the quality of DNA.

    PubMed

    Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee

    2018-01-11

    In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from the droplet digital PCR-based EGFR mutation test (ddEGFR test) and the qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, the iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
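
    The abstract defines the pass criterion relative to an input of 1,000 genome equivalents (iQC index ≥ 0.5 corresponding to ≥ 500 amplifiable iQC copies). A minimal sketch of that pass/fail check is given below, under the assumption that the index is the ratio of measured iQC copies to input genome equivalents; the constants and names are illustrative, not taken from the assay's documentation.

    ```python
    # Hedged sketch of the iQC pass/fail criterion, assuming
    # iQC index = measured iQC copies / input genome equivalents,
    # with ~3.3 pg of DNA per haploid human genome (so 3.3 ng ~ 1,000 genome equivalents,
    # as stated in the abstract). Names and the helper itself are illustrative.
    PG_PER_GENOME_EQUIVALENT = 3.3

    def iqc_index(iqc_copies, input_dna_ng):
        genome_equivalents = input_dna_ng * 1000.0 / PG_PER_GENOME_EQUIVALENT
        return iqc_copies / genome_equivalents

    def dna_quality_ok(iqc_copies, input_dna_ng, threshold=0.5):
        """True if at least `threshold` of the input DNA appears amplifiable."""
        return iqc_index(iqc_copies, input_dna_ng) >= threshold

    # Example matching the abstract's criterion: 500 iQC copies from 3.3 ng of FFPET-DNA.
    print(iqc_index(500, 3.3))        # -> 0.5
    print(dna_quality_ok(450, 3.3))   # -> False: sample judged too degraded for testing
    ```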

  18. System and method for anomaly detection

    DOEpatents

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.

  19. Comparison of ultraviolet detection and charged aerosol detection methods for liquid-chromatographic determination of protoescigenin.

    PubMed

    Filip, Katarzyna; Grynkiewicz, Grzegorz; Gruza, Mariusz; Jatczak, Kamil; Zagrodzki, Bogdan

    2014-01-01

    Escin, a complex mixture of pentacyclic triterpene saponins obtained from horse chestnut seed extract (HCSE; Aesculus hippocastanum L.), constitutes a traditional herbal active substance of preparations (drugs) used for the treatment of chronic venous insufficiency and capillary blood vessel leakage. A new approach to exploiting the pharmacological potential of this saponin complex has recently been proposed, in which the β-escin mixture is perceived as a source of a hitherto unavailable raw material, the pentacyclic triterpene aglycone protoescigenin. Although many liquid chromatography methods are described in the literature for saponin determination, analysis of protoescigenin is barely mentioned. In this work, a new ultra-high performance liquid chromatography (UHPLC) method developed for protoescigenin quantification is described. Charged aerosol detection (CAD), a relatively new detection method based on aerosol charging, has been applied in this method as an alternative to ultraviolet (UV) detection. The influence of individual parameters on CAD response and sensitivity was studied. The detection was performed using CAD and UV (200 nm) simultaneously, and the results were compared with reference to linearity, accuracy, precision and limit of detection.

  20. An improved PCA method with application to boiler leak detection.

    PubMed

    Sun, Xi; Marquez, Horacio J; Chen, Tongwen; Riaz, Muhammad

    2005-07-01

    Principal component analysis (PCA) is a popular fault detection technique. It has been widely used in process industries, especially in the chemical industry. In industrial applications, achieving a sensitive system capable of detecting incipient faults while keeping the false alarm rate to a minimum is a crucial issue. Although much research has focused on these issues for PCA-based fault detection and diagnosis methods, the sensitivity of the fault detection scheme versus the false alarm rate continues to be an important issue. In this paper, an improved PCA method is proposed to address this problem. In this method, a new data preprocessing scheme and a new fault detection scheme designed for Hotelling's T2 as well as the squared prediction error are developed. A dynamic PCA model is also developed for boiler leak detection. This new method is applied to boiler water/steam leak detection with real data from Syncrude Canada's utility plant in Fort McMurray, Canada. Our results demonstrate that the proposed method can effectively reduce the false alarm rate, provide effective and correct leak alarms, and give early warning to operators.
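
    For context, the following is a minimal sketch of conventional PCA-based fault detection with Hotelling's T2 and the squared prediction error (SPE/Q); it does not reproduce the paper's improved preprocessing or its dynamic model, and the percentile-based control limits are an illustrative simplification of the usual chi-square/F limits.

    ```python
    # Hedged sketch of conventional PCA monitoring with Hotelling's T^2 and SPE (Q).
    # Thresholds are simple empirical percentiles of the training statistics.
    import numpy as np

    def fit_pca_monitor(X_train, n_components):
        mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
        Z = (X_train - mu) / sigma
        _, s, Vt = np.linalg.svd(Z, full_matrices=False)
        P = Vt[:n_components].T                        # loadings of retained components
        lam = (s[:n_components] ** 2) / (len(Z) - 1)   # variances of retained scores
        return mu, sigma, P, lam

    def t2_spe(x, mu, sigma, P, lam):
        z = (x - mu) / sigma
        t = z @ P                                      # scores
        t2 = np.sum(t ** 2 / lam)                      # Hotelling's T^2
        residual = z - t @ P.T
        spe = residual @ residual                      # squared prediction error (Q)
        return t2, spe

    # Usage with synthetic "normal operation" data and one deviating sample.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))
    mu, sigma, P, lam = fit_pca_monitor(X, n_components=3)
    stats = np.array([t2_spe(x, mu, sigma, P, lam) for x in X])
    t2_lim, spe_lim = np.percentile(stats, 99, axis=0)   # empirical 99% control limits
    t2, spe = t2_spe(X[0] + 5.0, mu, sigma, P, lam)      # a shifted (faulty) sample
    print(t2 > t2_lim or spe > spe_lim)                  # -> True: fault alarm raised
    ```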

  1. Methods for computing water-quality loads at sites in the U.S. Geological Survey National Water Quality Network

    USGS Publications Warehouse

    Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.

    2017-10-24

    The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.

  2. Radiologists' confidence in detecting abnormalities on chest images and their subjective judgments of image quality

    NASA Astrophysics Data System (ADS)

    King, Jill L.; Gur, David; Rockette, Howard E.; Curtin, Hugh D.; Obuchowski, Nancy A.; Thaete, F. Leland; Britton, Cynthia A.; Metz, Charles E.

    1991-07-01

    The relationship between subjective judgments of image quality for the performance of specific detection tasks and radiologists' confidence level in arriving at correct diagnoses was investigated in two studies in which 12 readers, using a total of three different display environments, interpreted a series of 300 PA chest images. The modalities used were conventional films, laser-printed films, and high-resolution CRT display of digitized images. For the detection of interstitial disease, nodules, and pneumothoraces, there was no statistically significant correlation (Spearman rho) between subjective ratings of quality and radiologists' confidence in detecting these abnormalities. However, in each study, for all modalities and all readers but one, a small but statistically significant correlation was found between the radiologists' ability to correctly and confidently rule out interstitial disease and their subjective ratings of image quality.

  3. Transistor-based particle detection systems and methods

    DOEpatents

    Jain, Ankit; Nair, Pradeep R.; Alam, Muhammad Ashraful

    2015-06-09

    Transistor-based particle detection systems and methods may be configured to detect charged and non-charged particles. Such systems may include a supporting structure contacting a gate of a transistor and separating the gate from a dielectric of the transistor, and the transistor may have a near pull-in bias and a sub-threshold region bias to facilitate particle detection. The transistor may be configured to change current flow through the transistor in response to a change in stiffness of the gate caused by securing of a particle to the gate, and the transistor-based particle detection system may be configured to detect the non-charged particle at least from the change in current flow.

  4. The comparison of detection methods of asymptomatic malaria in hypoendemic areas

    NASA Astrophysics Data System (ADS)

    Siahaan, L.; Panggabean, M.; Panggabean, Y. C.

    2018-03-01

    Malaria is still a problem that disrupts public health in North Sumatera. Late diagnosis increases morbidity and mortality due to malaria. Early detection of asymptomatic malaria is one of the best efforts to reduce transmission of the disease, and it must be carried out on suspected patients who have no malaria complaints. Passive Case Detection (PCD) methods are unlikely to find asymptomatic malaria. This study was conducted to compare ACD (Active Case Detection) and PCD methods for asymptomatic malaria detection in hypoendemic areas of malaria. The ACD method is performed by visiting subjects identified from secondary data, whereas PCD is performed on patients who present to health services. Samples were taken randomly, and the diagnosis was confirmed by microscopic examination with 3% Giemsa staining, the gold standard of malaria diagnostics. There was a significant difference between the ACD and PCD detection methods (p = 0.034); the ACD method was superior in detecting malaria patients in all categories: clinical malaria (65.2%), asymptomatic malaria (65.1%) and submicroscopic malaria (58.5%). ACD detection methods are superior in detecting malaria sufferers, especially asymptomatic malaria sufferers.

  5. Salient object detection method based on multiple semantic features

    NASA Astrophysics Data System (ADS)

    Wang, Chunyang; Yu, Chunyan; Song, Meiping; Wang, Yulei

    2018-04-01

    Existing salient object detection models can only detect the approximate location of a salient object, or may highlight the background. To resolve this problem, a salient object detection method based on image semantic features was proposed. First, three novel salient features were presented in this paper: an object edge density feature (EF), an object semantic feature based on the convex hull (CF), and an object lightness contrast feature (LF). Second, the multiple salient features were trained with random detection windows. Third, a Naive Bayesian model was used to combine these features for saliency detection. Results on public datasets showed that our method performed well: the location of the salient object could be determined, and the salient object could be accurately detected and marked by the specific window.

  6. A simple method of DNA isolation from jute (Corchorus olitorius) seed suitable for PCR-based detection of the pathogen Macrophomina phaseolina (Tassi) Goid.

    PubMed

    Biswas, C; Dey, P; Satpathy, S; Sarkar, S K; Bera, A; Mahapatra, B S

    2013-02-01

    A simple method was developed for isolating DNA from jute seed, which contains high amounts of mucilage and secondary metabolites, and a PCR protocol was standardized for detecting the seedborne pathogen Macrophomina phaseolina. The cetyl trimethyl ammonium bromide method was modified with increased salt concentration and a simple sodium acetate treatment to extract genomic as well as fungal DNA directly from infected jute seed. The Miniprep was evaluated along with five other methods of DNA isolation in terms of yield and quality of DNA and number of PCR-positive samples. The Miniprep consistently recovered high amounts of DNA with good spectral qualities at A260/A280. The DNA isolated from jute seed was found suitable for PCR amplification. Macrophomina phaseolina could be detected by PCR from artificially inoculated as well as naturally infected jute seeds. The limit of PCR-based detection of M. phaseolina in jute seed was determined to be 0.62 × 10⁻⁷ CFU g⁻¹ seed. © 2012 The Society for Applied Microbiology.

  7. Iodine absorption cells quality evaluation methods

    NASA Astrophysics Data System (ADS)

    Hrabina, Jan; Zucco, Massimo; Holá, Miroslava; Šarbort, Martin; Acef, Ouali; Du-Burck, Frédéric; Lazar, Josef; Číp, Ondřej

    2016-12-01

    Absorption cells represent a unique tool for laser frequency stabilization. They serve as irreplaceable optical frequency references in the realization of highly stable laser standards and laser sources for different branches of optical measurement, including the most precise frequency and dimensional measurement systems. One of the most often used absorption media covering the visible and near-IR spectral range is molecular iodine. It offers a rich atlas of very strong and narrow spectral transitions which allow the realization of laser systems with ultimate frequency stabilities at or below the 10⁻¹⁴ level. One of the most often discussed disadvantages of iodine cells is iodine's corrosivity and sensitivity to the presence of foreign substances. The impurities react with the absorption medium and cause spectral shifts of the absorption spectra, spectral broadening of the transitions, and a decrease in the achievable signal-to-noise ratio of the detected spectra. All of these unwanted effects directly influence the frequency stability of the realized laser standard, and due to this fact the quality of iodine cells must be precisely controlled. We present a comparison of the traditionally used method of laser-induced fluorescence (LIF) with a novel technique based on measurement of hyperfine transition linewidths. The results summarize advantages and drawbacks of these techniques and give a recommendation for their practical usage.

  8. A flow-cytometry-based method for detecting simultaneously five allergens in a complex food matrix.

    PubMed

    Otto, Gaetan; Lamote, Amandine; Deckers, Elise; Dumont, Valery; Delahaut, Philippe; Scippo, Marie-Louise; Pleck, Jessica; Hillairet, Caroline; Gillard, Nathalie

    2016-12-01

    To avoid carry-over contamination with allergens, food manufacturers implement quality control strategies relying primarily on detection of allergenic proteins by ELISA. Although sensitive and specific, this method allows detection of only one allergen per analysis, and effective control policies were thus based on multiplying the number of tests done in order to cover the whole range of allergens. We present in this work an immunoassay for the simultaneous detection of milk, egg, peanut, mustard and crustaceans in cookie samples. The method was based on a combination of flow cytometry with competitive ELISA, where microbeads were used as the sorbent surface. The test was able to detect the presence of the five allergens with median inhibitory concentrations (IC50) ranging from 2.5 to 15 mg/kg according to the allergen to be detected. The lowest concentrations of contaminants inducing a significant difference in signal between non-contaminated controls and test samples were 2 mg/kg of peanut, 5 mg/kg of crustaceans, 5 mg/kg of milk, 5 mg/kg of mustard and 10 mg/kg of egg. Assay sensitivity was influenced by the concentration of primary antibodies added to the sample extract for the competition and by the concentration of allergenic proteins bound to the surface of the microbeads.

  9. The Continuous Monitoring of Desert Dust using an Infrared-based Dust Detection and Retrieval Method

    NASA Technical Reports Server (NTRS)

    Duda, David P.; Minnis, Patrick; Trepte, Qing; Sun-Mack, Sunny

    2006-01-01

    Airborne dust and sand are significant aerosol sources that can impact the atmospheric and surface radiation budgets. Because airborne dust affects visibility and air quality, it is desirable to monitor the location and concentrations of this aerosol for transportation and public health. Although aerosol retrievals have been derived for many years using visible and near-infrared reflectance measurements from satellites, the detection and quantification of dust from these channels is problematic over bright surfaces, or when dust concentrations are large. In addition, aerosol retrievals from polar orbiting satellites lack the ability to monitor the progression and sources of dust storms. As a complement to current aerosol dust retrieval algorithms, multi-spectral thermal infrared (8-12 micron) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Meteosat-8 Spinning Enhanced Visible and Infrared Imager (SEVIRI) are used in the development of a prototype dust detection method and dust property retrieval that can monitor the progress of Saharan dust fields continuously, both night and day. The dust detection method is incorporated into the processing of CERES (Clouds and the Earth's Radiant Energy System) aerosol retrievals to produce dust property retrievals. Both MODIS (from Terra and Aqua) and SEVIRI data are used to develop the method.

  10. Odour Detection Methods: Olfactometry and Chemical Sensors

    PubMed Central

    Brattoli, Magda; de Gennaro, Gianluigi; de Pinto, Valentina; Loiotile, Annamaria Demarinis; Lovascio, Sara; Penza, Michele

    2011-01-01

    The complexity of the odours issue arises from the sensory nature of smell. From the evolutionary point of view olfaction is one of the oldest senses, allowing for seeking food, recognizing danger or communication: human olfaction is a protective sense as it allows the detection of potential illnesses or infections by taking into account the odour pleasantness/unpleasantness. Odours are mixtures of light and small molecules that, coming in contact with various human sensory systems, also at very low concentrations in the inhaled air, are able to stimulate an anatomical response: the experienced perception is the odour. Odour assessment is a key point in some industrial production processes (i.e., food, beverages, etc.) and it is acquiring steady importance in unusual technological fields (i.e., indoor air quality); this issue mainly concerns the environmental impact of various industrial activities (i.e., tanneries, refineries, slaughterhouses, distilleries, civil and industrial wastewater treatment plants, landfills and composting plants) as sources of olfactory nuisances, the top air pollution complaint. Although the human olfactory system is still regarded as the most important and effective “analytical instrument” for odour evaluation, the demand for more objective analytical methods, along with the discovery of materials with chemo-electronic properties, has boosted the development of sensor-based machine olfaction potentially imitating the biological system. This review examines the state of the art of both human and instrumental sensing currently used for the detection of odours. The olfactometric techniques employing a panel of trained experts are discussed and the strong and weak points of odour assessment through human detection are highlighted. The main features and the working principles of modern electronic noses (E-Noses) are then described, focusing on their better performances for environmental analysis. Odour emission monitoring carried out

  11. SU-F-18C-01: Minimum Detectability Analysis for Comprehensive Sized Based Optimization of Image Quality and Radiation Dose Across CT Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smitherman, C; Chen, B; Samei, E

    2014-06-15

    Purpose: This work involved a comprehensive modeling of task-based performance of CT across a wide range of protocols. The approach was used for optimization and consistency of dose and image quality within a large multi-vendor clinical facility. Methods: 150 adult protocols from the Duke University Medical Center were grouped into sub-protocols with similar acquisition characteristics. A size based image quality phantom (Duke Mercury Phantom) was imaged using these sub-protocols for a range of clinically relevant doses on two CT manufacturer platforms (Siemens, GE). The images were analyzed to extract task-based image quality metrics such as the Task Transfer Function (TTF), Noise Power Spectrum, and Az based on designer nodule task functions. The data were analyzed in terms of the detectability of a lesion size/contrast as a function of dose, patient size, and protocol. A graphical user interface (GUI) was developed to predict image quality and dose to achieve a minimum level of detectability. Results: Image quality trends with variations in dose, patient size, and lesion contrast/size were evaluated and calculated data behaved as predicted. The GUI proved effective to predict the Az values representing radiologist confidence for a targeted lesion, patient size, and dose. As an example, an abdomen pelvis exam for the GE scanner, with a task size/contrast of 5-mm/50-HU, and an Az of 0.9 requires a dose of 4.0, 8.9, and 16.9 mGy for patient diameters of 25, 30, and 35 cm, respectively. For a constant patient diameter of 30 cm, the minimum detected lesion size at those dose levels would be 8.4, 5, and 3.9 mm, respectively. Conclusion: The designed CT protocol optimization platform can be used to evaluate minimum detectability across dose levels and patient diameters. The method can be used to improve individual protocols as well as to improve protocol consistency across CT scanners.

  12. Lifting wavelet method of target detection

    NASA Astrophysics Data System (ADS)

    Han, Jun; Zhang, Chi; Jiang, Xu; Wang, Fang; Zhang, Jin

    2009-11-01

    Image target recognition plays a very important role in the areas of scientific exploration, aeronautics and space-to-ground observation, photography and topographic mapping. Image noise, blur and various kinds of interference in complex environments have always affected the stability of recognition algorithms. In this paper, the problems of real-time performance, accuracy and anti-interference ability in target detection are addressed using a lifting-wavelet-based image target detection method. First, histogram equalization and frame differencing are used to obtain the target region, and adaptive thresholding and mathematical morphology operations are applied to eliminate background errors. Second, a multi-channel wavelet filter is used for wavelet-transform de-noising and enhancement of the original image, to overcome the noise sensitivity of general algorithms and reduce the rate of false detections. The multi-resolution characteristics of the lifting wavelet framework can be applied directly in the spatio-temporal domain for target detection and target feature extraction. The experimental results show that the designed lifting wavelet resolves the detection difficulties caused by target motion in complex backgrounds, effectively suppresses noise, and improves the efficiency and speed of detection.

  13. A new method for water quality assessment: by harmony degree equation.

    PubMed

    Zuo, Qiting; Han, Chunhui; Liu, Jing; Ma, Junxia

    2018-02-22

    Water quality assessment is an important basic task in the development, utilization, management, and protection of water resources, and also a prerequisite for water safety. In this paper, the harmony degree equation (HDE) was introduced into research on water quality assessment, and a new method for water quality assessment based on the HDE was proposed: water quality assessment by harmony degree equation (WQA-HDE). First, the calculation steps and ideas of this method are described in detail; then, this method, together with some other important methods of water quality assessment (the single factor assessment method, the mean-type comprehensive index assessment method, and the multi-level gray correlation assessment method), was used to assess the water quality of the Shaying River (the largest tributary of the Huaihe in China). For this purpose, a 2-year (2013-2014) dataset of nine water quality variables covering seven monitoring sites, comprising approximately 189 observations, was used to compare and analyze the characteristics and advantages of the new method. The results showed that the calculation steps of WQA-HDE are similar to those of the comprehensive assessment method, and WQA-HDE is more operational compared with other water quality assessment methods. In addition, the new method shows good flexibility through the setting of the water quality judgment criterion value HD0; when HD0 = 0.8, the results are closer to reality, and more realistic and reliable. In particular, when HD0 = 1, the results of WQA-HDE are consistent with the single factor assessment method; both methods are subject to the most stringent "one vote veto" judgment condition. So, WQA-HDE is a composite method that combines single factor assessment and comprehensive assessment. This research not only broadens the research field of the theoretical method system of harmony theory but also promotes the unification of water quality assessment methods, and it can be used for reference in other comprehensive assessments.

  14. Development of image processing method to detect noise in geostationary imagery

    NASA Astrophysics Data System (ADS)

    Khlopenkov, Konstantin V.; Doelling, David R.

    2016-10-01

    The Clouds and the Earth's Radiant Energy System (CERES) has incorporated imagery from 16 individual geostationary (GEO) satellites across five contiguous domains since March 2000. In order to derive broadband fluxes uniform across satellite platforms it is important to ensure a good quality of the input raw count data. GEO data obtained by older imagers (such as MTSAT-1, Meteosat-5, Meteosat-7, GMS-5, and GOES-9) are known to frequently contain various types of noise caused by transmission errors, sync errors, stray light contamination, and others. This work presents an image processing methodology designed to detect most kinds of noise and corrupt data in all bands of raw imagery from modern and historic GEO satellites. The algorithm is based on a set of different approaches to detect abnormal image patterns, including inter-line and inter-pixel differences within a scanline, correlation between scanlines, analysis of spatial variance, and also a 2D Fourier analysis of the image spatial frequencies. In spite of its computational complexity, the described method is highly optimized for performance to facilitate volume processing of multi-year data and runs in fully automated mode. The reliability of this noise detection technique has been assessed by human supervision for each GEO dataset obtained during selected time periods in 2005 and 2006. This assessment has demonstrated an overall detection accuracy of over 99.5% and a false alarm rate of under 0.3%. The described noise detection routine is currently used in volume processing of historical GEO imagery for subsequent production of global gridded data products and for cross-platform calibration.
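
    As an illustration of one ingredient of the approach described above (inter-line differences and correlation between scanlines), the following hedged sketch flags suspect scanlines in a raw image. The thresholds and helper are illustrative assumptions; the full algorithm combines several additional tests (spatial variance, 2D Fourier analysis) that are not reproduced here.

    ```python
    # Hedged sketch: flag noisy scanlines by their inter-line difference and their
    # correlation with the neighbouring scanline. Thresholds are illustrative.
    import numpy as np

    def flag_bad_scanlines(image, diff_thresh=5.0, corr_thresh=0.2):
        """Return a boolean mask of scanlines suspected to contain transmission noise."""
        image = np.asarray(image, float)
        bad = np.zeros(image.shape[0], dtype=bool)
        for i in range(1, image.shape[0]):
            line, prev = image[i], image[i - 1]
            mean_abs_diff = np.mean(np.abs(line - prev))
            corr = np.corrcoef(line, prev)[0, 1]
            if mean_abs_diff > diff_thresh or corr < corr_thresh:
                bad[i] = True
        return bad

    # Usage: corrupt one line of a smooth synthetic image and detect it.
    rng = np.random.default_rng(1)
    img = np.tile(np.linspace(0, 100, 256), (128, 1)) + rng.normal(0, 0.5, (128, 256))
    img[40] = rng.uniform(0, 255, 256)            # simulated transmission error
    print(np.where(flag_bad_scanlines(img))[0])   # -> [40 41] (41 is flagged because its reference line 40 is corrupt)
    ```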

  15. A new method for ultrasound detection of interfacial position in gas-liquid two-phase flow.

    PubMed

    Coutinho, Fábio Rizental; Ofuchi, César Yutaka; de Arruda, Lúcia Valéria Ramos; Neves, Flávio; Morales, Rigoberto E M

    2014-05-22

    Ultrasonic measurement techniques for velocity estimation are currently widely used in fluid flow studies and applications. An accurate determination of the interfacial position in gas-liquid two-phase flows is still an open problem. The quality of this information directly reflects on the accuracy of void fraction measurement, and it provides a means of discriminating velocity information of both phases. The algorithm known as Velocity Matched Spectrum (VM Spectrum) is a velocity estimator that stands out from other methods by returning a spectrum of velocities for each interrogated volume sample. Interface detection of free-rising bubbles in quiescent liquid presents some difficulties due to abrupt changes in interface inclination. In this work a method based on the velocity spectrum curve shape is used to generate a spatial-temporal mapping, which, after spatial filtering, yields an accurate contour of the air-water interface. It is shown that the proposed technique yields an RMS error between 1.71 and 3.39 and a probability of detection failure and false detection between 0.89% and 11.9% in determining the spatial-temporal gas-liquid interface position in the flow of free-rising bubbles in stagnant liquid. This result is valid both for a free path and with the transducer emitting through a metallic plate or a Plexiglas pipe.

  16. Variable threshold method for ECG R-peak detection.

    PubMed

    Kew, Hsein-Ping; Jeong, Do-Un

    2011-10-01

    In this paper, a wearable belt-type ECG electrode, worn around the chest to measure the ECG in real time, was produced in order to minimize the inconvenience of wearing it. The ECG signal is detected using a potential-measuring instrumentation system. The measured ECG signal is transmitted via an ultra-low-power wireless data communication unit to a personal computer using a Zigbee-compatible wireless sensor node. ECG signals carry a great deal of clinical information for a cardiologist, and R-peak detection in the ECG is especially important. R-peak detection generally uses a fixed threshold value. Errors in peak detection occur when the baseline changes due to motion artifacts and when the signal size changes. A preprocessing stage that includes differentiation and a Hilbert transform is used as the signal preprocessing algorithm. Thereafter, a variable threshold method is used to detect the R-peak, which is more accurate and efficient than the fixed-threshold method. R-peak detection using the MIT-BIH databases and long-term real-time ECG recordings was performed in this research in order to evaluate the performance of the method.
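
    For illustration, a minimal sketch of the preprocessing described above (differentiation followed by a Hilbert-transform envelope) combined with a simple variable threshold follows. The exact threshold-update rule of the paper is not given in the abstract, so the rule below (a fraction of a running maximum over a sliding window) and the synthetic signal are illustrative assumptions.

    ```python
    # Hedged sketch of differentiation + Hilbert envelope + variable threshold R-peak detection.
    import numpy as np
    from scipy.signal import hilbert, find_peaks

    def detect_r_peaks(ecg, fs, window_s=2.0, frac=0.5):
        diff = np.diff(ecg, prepend=ecg[0])                 # differentiation stage
        envelope = np.abs(hilbert(diff))                    # Hilbert-transform envelope
        win = int(window_s * fs)
        threshold = np.array([frac * envelope[max(0, i - win):i + 1].max()
                              for i in range(len(envelope))])      # variable threshold
        # Candidate peaks; the prominence filter only removes trivial baseline ripples.
        peaks, _ = find_peaks(envelope, distance=int(0.25 * fs),
                              prominence=0.05 * envelope.max())
        return peaks[envelope[peaks] > threshold[peaks]]

    # Usage on a synthetic ECG-like signal: sharp "R waves" on a wandering baseline.
    fs = 250
    t = np.arange(0, 10, 1 / fs)
    ecg = 0.2 * np.sin(2 * np.pi * 0.3 * t)                 # baseline wander
    for beat in np.arange(0.5, 10, 0.8):                    # ~75 bpm spike train
        ecg += np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2))
    print(np.round(detect_r_peaks(ecg, fs) / fs, 2))        # detected R-peak times (s)
    ```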

  17. [A new method for safety monitoring of natural dietary supplements--quality profile].

    PubMed

    Wang, Juan; Wang, Li-Ping; Yang, Da-Jin; Chen, Bo

    2008-07-01

    A new method for safety monitoring of natural dietary supplements--the quality profile--was proposed. It would convert passive monitoring of synthetic drug adulteration into active monitoring and help guarantee the safety of natural dietary supplements. Preliminary research on the quality profile was completed by high performance liquid chromatography (HPLC) and mass spectrometry (MS). HPLC was employed to analyze chemical constituent profiles of natural dietary supplements. The separation was completed on a C18 column with acetonitrile and water (0.05% H3PO4) as the mobile phase, and the detection wavelength was 223 nm. Based on HPLC, the stability of the quality profile was studied, and abnormal compounds in the quality profile were analyzed after the addition of phenolphthalein, sibutramine, rosiglitazone, glibenclamide and gliclazide. For MS, the detector worked with ESI+, capillary voltage: 3.5 kV, cone voltage: 30 V, extractor voltage: 4 V, RF lens voltage: 0.5 V, source temperature: 105 degrees C, desolvation temperature: 300 degrees C, desolvation gas flow rate: 260 L/h, cone gas flow rate: 50 L/h, full scan mass spectra: m/z 100-600. The abnormal compound in the quality profile was analyzed after the addition of N-mono-desmethyl sibutramine. The quality profile based on HPLC had good stability (similarity > 0.877). Addition of phenolphthalein, sibutramine, rosiglitazone, glibenclamide and gliclazide to natural dietary supplements could be detected by HPLC, and addition of N-mono-desmethyl sibutramine to natural dietary supplements could be detected by MS. The quality profile might be used to monitor adulteration of natural dietary supplements and to prevent the addition of synthetic drugs after "approval".

  18. A New Intrusion Detection Method Based on Antibody Concentration

    NASA Astrophysics Data System (ADS)

    Zeng, Jie; Li, Tao; Li, Guiyang; Li, Haibo

    An antibody is a kind of protein that fights against harmful antigens in the human immune system. In modern medical examination, the health status of a human body can be diagnosed by detecting the intrusion intensity of a specific antigen and the concentration indicator of the corresponding antibody in the body's serum. In this paper, inspired by the principle of antigen-antibody reactions, we present a New Intrusion Detection Method Based on Antibody Concentration (NIDMBAC) to reduce the false alarm rate without affecting the detection rate. In our proposed method, the basic definitions of self, nonself, antigen and detector in the intrusion detection domain are given. Then, according to the antigen intrusion intensity, the change in antibody number is recorded during the clonal proliferation of detectors based on classified antigen recognition. Finally, building upon the above work, a probabilistic calculation method for intrusion alarm production, based on the correlation between the antigen intrusion intensity and the antibody concentration, is proposed. Our theoretical analysis and experimental results show that our proposed method has better performance than traditional methods.

  19. Novel face-detection method under various environments

    NASA Astrophysics Data System (ADS)

    Jing, Min-Quan; Chen, Ling-Hwei

    2009-06-01

    We propose a method to detect a face with different poses under various environments. On the basis of skin color information, skin regions are first extracted from an input image. Next, the shoulder part is cut out by using shape information and the head part is then identified as a face candidate. For a face candidate, a set of geometric features is applied to determine if it is a profile face. If not, then a set of eyelike rectangles extracted from the face candidate and the lighting distribution are used to determine if the face candidate is a nonprofile face. Experimental results show that the proposed method is robust under a wide range of lighting conditions, different poses, and races. The detection rate for the HHI face database is 93.68%. For the Champion face database, the detection rate is 95.15%.

  20. External Quality Assessment for Rubella Virus RNA Detection Using Armored RNA in China.

    PubMed

    Zhang, D; Lin, G; Yi, L; Hao, M; Fan, G; Yang, X; Peng, R; Ding, J; Zhang, K; Zhang, R; Li, J

    2017-02-01

    Although tremendous efforts have been made to reduce rubella incidence, there are still 300 new cases of congenital rubella syndrome daily; thus, rubella infections remain one of the leading causes of preventable congenital birth defects. An effective surveillance system, which could be achieved and maintained by using an external quality assessment program, is critical for prevention and control of this disease. Armored RNAs, which are noninfectious and RNase-resistant, were used for encapsulation of the E1 gene of rubella virus and for preparation of a 10-specimen panel for external quality assessment. Thirty-two laboratories across mainland China that used nucleic acid tests for rubella virus RNA detection were included in the external quality assessment program organized by the National Center for Clinical Laboratories of China. Different kinds of commercial kits were used by the laboratories for nucleic acid extraction and TaqMan real-time reverse-transcription PCR for rubella virus RNA detection; 99.2% sensitivity and 100% specificity were achieved in this external quality assessment program. Most of the participating laboratories obtained accurate results for rubella nucleic acid tests, thereby achieving the quality required for regional rubella and congenital rubella syndrome elimination.

  1. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio.

    PubMed

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco

    2016-01-01

    Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficient the sampling methods commonly used in biodiversity surveys are at estimating the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability for species is important information to guide field studies aiming to understand sex-ratio-related patterns.

  3. Molecular methods for the detection of mutations.

    PubMed

    Monteiro, C; Marcelino, L A; Conde, A R; Saraiva, C; Giphart-Gassler, M; De Nooij-van Dalen, A G; Van Buuren-van Seggelen, V; Van der Keur, M; May, C A; Cole, J; Lehmann, A R; Steinsgrimsdottir, H; Beare, D; Capulas, E; Armour, J A

    2000-01-01

    We report the results of a collaborative study aimed at developing reliable, direct assays for mutation in human cells. The project used common lymphoblastoid cell lines, both with and without mutagen treatment, as a shared resource to validate the development of new molecular methods for the detection of low-level mutations in the presence of a large excess of normal alleles. As the "gold standard," hprt mutation frequencies were also measured on the same samples. The methods under development included i) the restriction site mutation (RSM) assay, in which mutations lead to the destruction of a restriction site; ii) minisatellite length-change mutation, in which mutations lead to alleles containing new numbers of tandem repeat units; iii) loss of heterozygosity for HLA epitopes, in which antibodies can be used to direct selection for mutant cells; iv) multiple fluorescence-based long linker arm nucleotides assay (mf-LLA) technology, for the detection of substitutional mutations; v) detection of alterations in the TP53 locus using a (CA) array as the target for the screening; and vi) PCR analysis of lymphocytes for the presence of the BCL2 t(14:18) translocation. The relative merits of these molecular methods are discussed, and a comparison made with more "traditional" methods.

  4. A novel method for detection of apoptosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zagariya, Alexander M., E-mail: zagariya@uic.edu

    2012-04-15

    There are two different Angiotensin II (ANG II) peptides in nature: the Human type (ANG II) and the Bovine type (ANG II*). These eight-amino-acid peptides differ only at position 5, where Valine is replaced by Isoleucine in the Bovine type. They are present in all species studied so far. These amino acids differ by only one atom of carbon. This difference is so small that it allows any ANG II antibody, Bovine or Human, to interact with all species and create a universal method for apoptosis detection. ANG II concentrations are found at substantially higher levels in apoptotic, compared to non-apoptotic, tissues. ANG II accumulation can lead to DNA damage, mutations, carcinogenesis and cell death. We demonstrate that Bovine antiserum can be used for universal detection of apoptosis. In 2010, the worldwide market for apoptosis detection reached the $20 billion mark, and it increases significantly each year. Most commercially available methods are related to Annexin V and TUNEL. Our new method based on ANG II is more widely known to physicians and scientists compared to previously used methods. Our approach offers a novel alternative for assessing apoptosis activity with enhanced sensitivity, at a lower cost and with ease of use.

  5. Evaluation of Two Outlier-Detection-Based Methods for Detecting Tissue-Selective Genes from Microarray Data

    PubMed Central

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-01-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent’s non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent’s method is not suitable for ROKU. PMID:19936074

  6. A novel method to detect shadows on multispectral images

    NASA Astrophysics Data System (ADS)

    Dağlayan Sevim, Hazan; Yardımcı Çetin, Yasemin; Özışık Başkurt, Didem

    2016-10-01

    Shadowing occurs when the direct light coming from a light source is obstructed by tall human-made structures, mountains or clouds. Since shadow regions are illuminated only by scattered light, the true spectral properties of the objects are not observed in such regions. Therefore, many object classification and change detection problems utilize shadow detection as a preprocessing step. Besides, shadows are useful for obtaining 3D information about objects, such as estimating the height of buildings. With the pervasiveness of remote sensing images, shadow detection is ever more important. This study aims to develop a shadow detection method for multispectral images based on a transformation of the C1C2C3 space and the contribution of NIR bands. The proposed method is tested on WorldView-2 images covering Ankara, Turkey, acquired at different times. The new index is used on these 8-band multispectral images with two NIR bands. The method is compared with methods in the literature.

  7. Efficient method of image edge detection based on FSVM

    NASA Astrophysics Data System (ADS)

    Cai, Aiping; Xiong, Xiaomei

    2013-07-01

    For efficient detection of object edges in digital images, this paper studied traditional methods and an algorithm based on the support vector machine (SVM). It was observed that the Canny edge detection algorithm produces some pseudo-edges and has poor anti-noise capability. In order to provide a reliable edge extraction method, a new detection algorithm based on a fuzzy support vector machine (FSVM) is proposed. It consists of several steps: first, the classification samples are trained and different membership functions are assigned to different samples. Then, a new training set is formed by increasing the penalty on misclassified sub-samples, and the new FSVM classification model is trained and tested on it. Finally, the edges of the object image are extracted using the model. Experimental results show that good edge-detection images are obtained, and noise-addition experiments show that this method has good anti-noise capability.

  8. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated. These methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties. Dynamic mechanical analysis and fracture toughness methods were investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs. The liquid chromatography method was the most sensitive to changes in resin formulation. The methods were also successfully applied to the second prepreg system.

  9. Method for detecting toxic gases

    DOEpatents

    Stetter, Joseph R.; Zaromb, Solomon; Findlay, Jr., Melvin W.

    1991-01-01

    A method capable of detecting low concentrations of a pollutant or other component in air or other gas, utilizing a combination of a heating filament having a catalytic surface of a noble metal for exposure to the gas and producing a derivative chemical product from the component, and an electrochemical sensor responsive to the derivative chemical product for providing a signal indicative of the product. At concentrations in the order of about 1-100 ppm of tetrachloroethylene, neither the heating filament nor the electrochemical sensor is individually capable of sensing the pollutant. In the combination, the heating filament converts the benzyl chloride to one or more derivative chemical products which may be detected by the electrochemical sensor.

  10. A high-throughput method for the detection of homoeologous gene deletions in hexaploid wheat

    PubMed Central

    2010-01-01

    Background Mutational inactivation of plant genes is an essential tool in gene function studies. Plants with inactivated or deleted genes may also be exploited for crop improvement if such mutations/deletions produce a desirable agronomical and/or quality phenotype. However, the use of mutational gene inactivation/deletion has been impeded in polyploid plant species by genetic redundancy, as polyploids contain multiple copies of the same genes (homoeologous genes) encoded by each of the ancestral genomes. Similar to many other crop plants, bread wheat (Triticum aestivum L.) is polyploid; specifically allohexaploid possessing three progenitor genomes designated as 'A', 'B', and 'D'. Recently modified TILLING protocols have been developed specifically for mutation detection in wheat. Whilst extremely powerful in detecting single nucleotide changes and small deletions, these methods are not suitable for detecting whole gene deletions. Therefore, high-throughput methods for screening of candidate homoeologous gene deletions are needed for application to wheat populations generated by the use of certain mutagenic agents (e.g. heavy ion irradiation) that frequently generate whole-gene deletions. Results To facilitate the screening for specific homoeologous gene deletions in hexaploid wheat, we have developed a TaqMan qPCR-based method that allows high-throughput detection of deletions in homoeologous copies of any gene of interest, provided that sufficient polymorphism (as little as a single nucleotide difference) amongst homoeologues exists for specific probe design. We used this method to identify deletions of individual TaPFT1 homoeologues, a wheat orthologue of the disease susceptibility and flowering regulatory gene PFT1 in Arabidopsis. This method was applied to wheat nullisomic-tetrasomic lines as well as other chromosomal deletion lines to locate the TaPFT1 gene to the long arm of chromosome 5. By screening of individual DNA samples from 4500 M2 mutant wheat

  11. Intercomparison of methods for image quality characterization. II. Noise power spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobbins, James T. III; Samei, Ehsan; Ranger, Nicole T.

    Second in a two-part series comparing measurement techniques for the assessment of basic image quality metrics in digital radiography, in this paper we focus on the measurement of the image noise power spectrum (NPS). Three methods were considered: (1) a method published by Dobbins et al. [Med. Phys. 22, 1581-1593 (1995)], (2) a method published by Samei et al. [Med. Phys. 30, 608-622 (2003)], and (3) a new method sanctioned by the International Electrotechnical Commission (IEC 62220-1, 2003), developed as part of an international standard for the measurement of detective quantum efficiency. In addition to an overall comparison of the estimated NPS between the three techniques, the following factors were also evaluated for their effect on the measured NPS: horizontal versus vertical directional dependence, the use of beam-limiting apertures, beam spectrum, and computational methods of NPS analysis, including the region-of-interest (ROI) size and the method of ROI normalization. Of these factors, none was found to demonstrate a substantial impact on the amplitude of the NPS estimates (≤3.1% relative difference in NPS averaged over frequency, for each factor considered separately). Overall, the three methods agreed to within 1.6% ± 0.8% when averaged over frequencies >0.15 mm⁻¹.
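
    At its core, the NPS estimate compared by these methods is the ensemble-averaged squared magnitude of the 2D Fourier transform of mean-subtracted ROIs from a uniform exposure, scaled by pixel area over ROI area. The sketch below illustrates that core computation only; the points on which the three methods differ (detrending order, ROI overlap, normalization of raw values, radial averaging) are deliberately simplified, and all parameter values are illustrative.

    ```python
    # Minimal sketch of a 2D noise power spectrum estimate from a flat-field image.
    import numpy as np

    def noise_power_spectrum(flat_image, roi=128, pixel_pitch_mm=0.1):
        rois = []
        ny, nx = flat_image.shape
        for y in range(0, ny - roi + 1, roi):                # non-overlapping ROIs
            for x in range(0, nx - roi + 1, roi):
                block = flat_image[y:y + roi, x:x + roi].astype(float)
                rois.append(block - block.mean())            # zero-order detrend (mean removal)
        spectra = [np.abs(np.fft.fft2(b)) ** 2 for b in rois]
        nps = np.mean(spectra, axis=0) * pixel_pitch_mm ** 2 / (roi * roi)
        freqs = np.fft.fftshift(np.fft.fftfreq(roi, d=pixel_pitch_mm))   # spatial frequencies, mm^-1
        return freqs, np.fft.fftshift(nps)

    # Sanity check on synthetic white noise: the NPS should be flat at sigma^2 * pixel area.
    img = np.random.default_rng(2).normal(0, 10, (1024, 1024))
    freqs, nps = noise_power_spectrum(img)
    print(round(nps.mean(), 3), 10 ** 2 * 0.1 ** 2)          # both ~1.0 (image units^2 * mm^2)
    ```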

  12. Quality in Colonoscopy: Beyond the Adenoma Detection Rate Fever

    PubMed Central

    Taveira, Filipe; Areia, Miguel; Elvas, Luís; Alves, Susana; Brito, Daniel; Saraiva, Sandra; Cadime, Ana Teresa

    2017-01-01

    Background Colonoscopy quality is a hot topic in gastroenterological communities, with several current guidelines focusing on this aspect. Although the adenoma detection rate (ADR) is the single most important indicator, several other metrics are described and need reporting. Electronic medical reports are essential for the audit of quality indicators; nevertheless, they have proved not to be faultless. Aim The aim of this study was to analyse and audit quality indicators (apart from ADR) using only our internal electronic endoscopy records as a starting point for improvement. Methods An analysis of electronically recorded information of 8,851 total colonoscopies from a single tertiary centre from 2010 to 2015 was performed. Results The mean patient age was 63.4 ± 8.5 years; 45.5% of them were female, and in 14.6% sedation was used. Photographic documentation was done in 98.4% with 10.7 photographs on average, and 37.4% reports had <8 pictures per exam. Bowel preparation was rated as adequate in 67%, fair in 27% and inadequate in 4.9% of cases. The adjusted caecal intubation rate (CIR) was 92%, while negative predictors were inadequate preparation (OR 119, 95% CI 84–170), no sedation (OR 2.39, 95% CI 1.81–3.15), female gender (OR 1.61, 95% CI 1.38–1.88) and age ≥65 years (OR 1.56, 95% CI 1.34–1.82). In 28% of patients, a snare polypectomy was performed, correlating with adequate preparation (OR 5.75, 95% CI 3.90–8.48), male gender (OR 1.82, 95% CI 1.64–2.01) and age ≥65 years (OR 1.25, 95% CI 1.13–1.37; p < 0.01) as positive predictors. An annual evolution was observed with improvements in photographic documentation (10.7 vs. 12.9; p < 0.001), CIR (91 vs. 94%; p = 0.002) and “adequate” bowel preparation (p = 0.004). Conclusions There is much more to report than the ADR to ensure quality in colonoscopy practice. Better registry systematization and integrated software should be goals to achieve in the short term. PMID:29255755

  13. Salmonella detection in poultry samples. Comparison of two commercial real-time PCR systems with culture methods for the detection of Salmonella spp. in environmental and fecal samples of poultry.

    PubMed

    Sommer, D; Enderlein, D; Antakli, A; Schönenbrücher, H; Slaghuis, J; Redmann, T; Lierz, M

    2012-01-01

    The efficiency of two commercial real-time PCR methods, the foodproof® Salmonella detection system and the BAX® PCR Assay Salmonella system, was compared to the standardized culture method (EN ISO 6579:2002 - Annex D) for the detection of Salmonella spp. in poultry samples. Four sample matrices (feed, dust, boot swabs, feces) obtained directly from poultry flocks, as well as artificially spiked samples of the same matrices, were used. All samples were first tested for Salmonella spp. using the culture method as the gold standard. In addition, samples spiked with Salmonella Enteritidis were tested to evaluate the sensitivity of both PCR methods. Furthermore, all methods were evaluated in an annual ring trial of the National Salmonella Reference Laboratory of Germany. Salmonella detection in the feed, dust and boot swab matrices was comparable between the two PCR systems, whereas the results for feces differed markedly. The quality, especially the freshness, of the fecal samples influenced the sensitivity of the real-time PCR and the results of the culture method. In fresh fecal samples an initial spiking level of 100 cfu/25 g of Salmonella Enteritidis was detected. Fecal samples dried for two days allowed the detection of 14 cfu/25 g. Both real-time PCR protocols appear to be suitable for the detection of Salmonella spp. in all four matrices. The foodproof® system detected eight more samples as positive compared to the BAX® system, but had a potentially false-positive result in one case. In samples dried for seven days, none of the methods was able to detect Salmonella, likely because of lethal cell damage. In general, the advantage of PCR analysis over the culture method is the reduction of working time from 4-5 days to only 2 days. However, especially for the analysis of fecal samples, official validation should be conducted according to the requirements of EN ISO 6579:2002 - Annex D.

  14. Efficient method for events detection in phonocardiographic signals

    NASA Astrophysics Data System (ADS)

    Martinez-Alajarin, Juan; Ruiz-Merino, Ramon

    2005-06-01

    The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer a patient to a cardiologist. In order to improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist physicians at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this approach often fails to detect the main sounds when additional sounds and murmurs exist, or it may merge several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of the murmurs is detected, which aids in differentiating diseases that can occur at the same temporal location. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
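
    As a rough illustration of envelope-based event detection of the kind described above (not the authors' algorithm), the sketch below band-passes a phonocardiogram, computes its amplitude envelope and picks relative maxima as candidate events; the filter band, smoothing window, minimum peak distance and height threshold are assumed values.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def pcg_events(pcg, fs=4000):
    """Detect candidate events in a phonocardiogram from its amplitude envelope."""
    # Band-pass to the range where heart sounds and murmurs carry most energy
    b, a = butter(4, [25, 400], btype="band", fs=fs)
    filtered = filtfilt(b, a, pcg)
    # Amplitude envelope via the analytic signal, then 20 ms smoothing
    envelope = np.abs(hilbert(filtered))
    win = int(0.02 * fs)
    envelope = np.convolve(envelope, np.ones(win) / win, mode="same")
    # Relative maxima separated by at least 60 ms, above a soft noise floor
    peaks, _ = find_peaks(envelope, distance=int(0.06 * fs),
                          height=0.1 * envelope.max())
    # Return event times (s) and their envelope amplitudes as basic parameters
    return peaks / fs, envelope[peaks]
```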

  15. Method for detection of antibodies for metallic elements

    DOEpatents

    Barrick, C.W.; Clarke, S.M.; Nordin, C.W.

    1993-11-30

    An apparatus and method for detecting antibodies specific to non-protein antigens. The apparatus is an immunological plate containing a plurality of plastic projections coated with a non-protein material. Assays utilizing the plate are capable of stabilizing the non-protein antigens, with detection levels for antibodies specific to the antigens at the nanogram level. A screening assay with the apparatus allows for early detection of exposure to non-protein materials. Specifically, metallic elements are detected. 10 figures.

  16. An R-peak detection method that uses an SVD filter and a search back system.

    PubMed

    Jung, Woo-Hyuk; Lee, Sang-Goog

    2012-12-01

    In this paper, we present a method for detecting the R-peak of an ECG signal by using a singular value decomposition (SVD) filter and a search-back system. The ECG signal was processed in two phases: a pre-processing phase and a decision phase. The pre-processing phase consisted of SVD filtering, a Butterworth high-pass filter (HPF), a moving average (MA), and squaring, whereas the decision phase consisted of a single stage that detected the R-peak. In the pre-processing phase, the SVD filter removed noise while the Butterworth HPF eliminated baseline wander. The MA removed the noise remaining after the SVD filter to smooth the signal, and squaring strengthened the signal. In the decision phase, a threshold was used to set the interval before detecting the R-peak. When the latest R-R interval (RRI) was greater than 150% of the previous RRI, as suggested by Hamilton et al., the detection rule for that interval was modified to 150% or more of the smaller of the two most recent RRIs. When the modified search-back system was used, the error rate of the peak detection decreased to 0.29%, compared to 1.34% when it was not used. Consequently, the sensitivity was 99.47%, the positive predictivity was 99.47%, and the detection error was 1.05%. Furthermore, the quality of the signal in data with a substantial amount of noise was improved, and thus the R-peak was detected effectively. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
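
    A minimal sketch in the spirit of this pipeline is shown below; the SVD denoising stage is omitted, and the cut-off frequency, window lengths and thresholds are assumptions, so this only illustrates a high-pass / squaring / moving-average / adaptive-threshold-with-search-back chain, not the published method.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs=360):
    """Illustrative R-peak detector with a simple search-back rule."""
    # High-pass removes baseline wander (the SVD denoising step is omitted)
    b, a = butter(2, 0.5, btype="high", fs=fs)
    x = filtfilt(b, a, ecg)
    # Emphasise QRS energy: squaring followed by a 150 ms moving average
    win = int(0.15 * fs)
    feature = np.convolve(x ** 2, np.ones(win) / win, mode="same")
    # First pass: peaks above a fixed fraction of the maximum, 200 ms refractory
    threshold = 0.3 * feature.max()
    peaks, _ = find_peaks(feature, height=threshold, distance=int(0.2 * fs))
    # Search-back: when an R-R interval is >150% of the previous one,
    # re-scan that gap with a lowered threshold
    refined = list(peaks)
    for i in range(2, len(peaks)):
        rri = peaks[i] - peaks[i - 1]
        prev_rri = peaks[i - 1] - peaks[i - 2]
        if rri > 1.5 * prev_rri:
            seg = feature[peaks[i - 1]:peaks[i]]
            extra, _ = find_peaks(seg, height=0.5 * threshold,
                                  distance=int(0.2 * fs))
            refined.extend(peaks[i - 1] + extra)
    return np.unique(refined)
```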

  17. Research on infrared ship detection method in sea-sky background

    NASA Astrophysics Data System (ADS)

    Tang, Da; Sun, Gang; Wang, Ding-he; Niu, Zhao-dong; Chen, Zeng-ping

    2013-09-01

    An approach to infrared ship detection based on sea-sky-line (SSL) detection, ROI extraction and feature recognition is proposed in this paper. Firstly, considering that distant ships are expected to appear adjacent to the SSL, the SSL is detected to find potential target areas. A Radon transform is performed on the gradient image to choose candidate SSLs, and the detection result is given by fuzzy synthetic evaluation values. Secondly, since a recognizable target must differ sufficiently from the background in the infrared image, two gradient masks are created and refined as practical guidelines for eliminating false alarms. Thirdly, ROIs near the SSL are extracted using a multi-grade segmentation and fusion method after image sharpening, and unsuitable candidates are screened out according to the gradient masks and ROI shape. Finally, the remaining ROIs are segmented by a two-stage modified Otsu method, and a target confidence is calculated as a measure of how likely each candidate is to be a real target. Compared with other ship detection methods, the proposed method is suitable for bipolar targets, offers good practicality and accuracy, and achieves a satisfactory detection speed. Detection experiments with 200,000 frames show that the proposed method is widely applicable and robust to interference and noise, with a detection rate above 95%, which satisfies engineering needs.
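
    As an illustration of the first step only, the sketch below looks for the dominant near-horizontal line in an IR frame by applying a Radon transform to a Sobel gradient image (using scikit-image); the angle range and the reading of the sinogram peak are assumptions, and the fuzzy evaluation, gradient masks and Otsu stages are not reproduced.

```python
import numpy as np
from skimage.filters import sobel
from skimage.transform import radon

def detect_sea_sky_line(ir_frame):
    """Locate the strongest near-horizontal line in an IR frame as the
    sea-sky-line candidate, using a Radon transform of the gradient image."""
    grad = sobel(ir_frame.astype(float))
    # Restrict to near-horizontal orientations (projection angles near 90 deg
    # under scikit-image's convention)
    theta = np.linspace(80.0, 100.0, 81)
    sinogram = radon(grad, theta=theta, circle=False)
    # The brightest point of the sinogram gives the dominant line parameters
    rho_idx, theta_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    return theta[theta_idx], rho_idx - sinogram.shape[0] // 2  # angle, offset
```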

  18. [Investigation of quantitative detection of water quality using spectral fluorescence signature].

    PubMed

    He, Jun-hua; Cheng, Yong-jin; Han, Yan-ling; Zhang, Hao; Yang, Tao

    2008-08-01

    A method of spectral analysis that can simultaneously detect dissolved organic matter (DOM) and chlorophyll a (Chl-a) in natural water was developed in the present paper, with the intention of monitoring water quality quickly and quantitatively. Firstly, the total luminescence spectra (TLS) of a water sample from East Lake in Wuhan city were measured using laser (532 nm) induced fluorescence (LIF). There were obvious peaks of relative intensity at wavelengths of 580, 651 and 687 nm in the TLS of the sample, corresponding respectively to the spectrum of DOM, the Raman scattering of water, and Chl-a in the water. Then the spectral fluorescence signature (SFS) technique was adopted to analyze and distinguish the spectral characteristics of DOM and Chl-a in natural water. The calibration curves and function expressions, which relate the normalized fluorescence intensities of DOM and Chl-a in water to their concentrations, were obtained under the condition of low concentration (<40 mg·L⁻¹) by normalizing to the Raman scattering spectrum of water. The curves show high linearity. When the concentration of the humic acid solution is high (>40 mg·L⁻¹), the Raman scattering signal is totally absorbed by ground-state humic acid molecules, so the normalization technique cannot be adopted. However, the functional relation between the humic acid concentration and its relative fluorescence peak intensity can be obtained directly from the fluorescence spectrum experiment. Although this expression is nonlinear as a whole, there is an excellent linear relation between the fluorescence intensity and the DOM concentration when the concentration is less than 200 mg·L⁻¹. The measurement method based on the spectral fluorescence signature technique and the calibration curves obtained will have broad application prospects. It can recognize fast

  19. [Review of driver fatigue/drowsiness detection methods].

    PubMed

    Wang, Lei; Wu, Xiaojuan; Yu, Mengsun

    2007-02-01

    Driver fatigue/drowsiness is an important cause of serious traffic accidents, resulting not only in many deaths and injuries but also in substantial direct and indirect economic costs. Therefore, many countries are making great efforts to detect drowsiness during driving. In this paper, we review recent developments in driver fatigue/drowsiness detection technology worldwide and classify the existing methods into several categories according to the features that are measured and analyzed. Finally, the challenges facing fatigue/drowsiness detection technology and its development trends are presented.

  20. [Comparison of several methods for detecting anti-erythrocyte alloantibodies].

    PubMed

    Bencomo, A A

    1990-08-01

    The efficacy of different methods for the detection of anti-red-cell antibodies was assessed, and variations were found according to the specificity of the alloantibodies. The usefulness of enzyme tests in anti-Rh antibody detection was demonstrated, as well as that of low ionic strength saline solutions in detecting anti-Kell, anti-Duffy and anti-Kidd antibodies. Serum precipitation with 15% polyethylene glycol 8000 prior to the indirect antiglobulin test was found to be the most sensitive method, providing the best results for all the antibodies studied.

  1. Localized surface plasmon resonance mercury detection system and methods

    DOEpatents

    James, Jay; Lucas, Donald; Crosby, Jeffrey Scott; Koshland, Catherine P.

    2016-03-22

    A mercury detection system that includes a flow cell having a mercury sensor, a light source and a light detector is provided. The mercury sensor includes a transparent substrate and a submonolayer of mercury absorbing nanoparticles, e.g., gold nanoparticles, on a surface of the substrate. Methods of determining whether mercury is present in a sample using the mercury sensors are also provided. The subject mercury detection systems and methods find use in a variety of different applications, including mercury detecting applications.

  2. Methods, compounds and systems for detecting a microorganism in a sample

    DOEpatents

    Colston, Jr, Bill W.; Fitch, J. Patrick; Gardner, Shea N.; Williams, Peter L.; Wagner, Mark C.

    2016-09-06

    Methods to identify a set of probe polynucleotides suitable for detecting a set of targets, and in particular methods for the identification of primers suitable for the detection of target microorganisms; related polynucleotides, sets of polynucleotides and compositions; and related methods and systems for the detection and/or identification of microorganisms in a sample.

  3. Quality control methods for linear accelerator radiation and mechanical axes alignment.

    PubMed

    Létourneau, Daniel; Keller, Harald; Becker, Nathan; Amin, Md Nurul; Norrlinger, Bernhard; Jaffray, David A

    2018-06-01

    The delivery accuracy of highly conformal dose distributions generated using intensity modulation and collimator, gantry, and couch degrees of freedom is directly affected by the quality of the alignment between the radiation beam and the mechanical axes of a linear accelerator. For this purpose, quality control (QC) guidelines recommend a tolerance of ±1 mm for the coincidence of the radiation and mechanical isocenters. Traditional QC methods for assessment of radiation and mechanical axes alignment (based on pointer alignment) are time consuming and complex tasks that provide limited accuracy. In this work, an automated test suite based on an analytical model of the linear accelerator motions was developed to streamline the QC of radiation and mechanical axes alignment. The proposed method used the automated analysis of megavoltage images of two simple task-specific phantoms acquired at different linear accelerator settings to determine the coincidence of the radiation and mechanical isocenters. The sensitivity and accuracy of the test suite were validated by introducing actual misalignments on a linear accelerator between the radiation axis and the mechanical axes using both beam steering and mechanical adjustments of the gantry and couch. The validation demonstrated that the new QC method can detect sub-millimeter misalignment between the radiation axis and the three mechanical axes of rotation. A displacement of the radiation source of 0.2 mm using beam steering parameters was easily detectable with the proposed collimator rotation axis test. Mechanical misalignments of the gantry and couch rotation axes of the same magnitude (0.2 mm) were also detectable using the new gantry and couch rotation axis tests. For the couch rotation axis, the phantom and test design allow detection of both translational and tilt misalignments with the radiation beam axis. For the collimator rotation axis, the test can isolate the misalignment between the beam radiation axis

  4. A high-throughput multiplex method adapted for GMO detection.

    PubMed

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes, from GMO constructs (screening targets, construct-specific and event-specific targets), and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  5. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
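
    The cited tools are an S-language/R package; purely as an illustration of the idea behind ROS, the Python sketch below assumes a single detection limit and a lognormal distribution, fits the detected values against normal quantiles, and imputes the censored ones from the fitted line. It is a simplification of the published procedure, not a substitute for it.

```python
import numpy as np
from scipy import stats

def simple_ros(values, censored):
    """Simplified regression-on-order-statistics (ROS) imputation.

    values   : reported concentrations (the detection limit for censored entries)
    censored : boolean array, True where the value is a "<DL" result
    Returns imputed concentrations (in sorted order) for summary statistics.
    """
    values = np.asarray(values, float)
    censored = np.asarray(censored, bool)
    n = len(values)
    order = np.argsort(values)
    # Plotting positions for the full sample, mapped to normal quantiles
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)
    z = stats.norm.ppf(pp)
    detected = ~censored[order]
    # Fit a line on the detected observations only (log scale)
    slope, intercept, *_ = stats.linregress(z[detected],
                                            np.log(values[order][detected]))
    imputed = values[order].copy()
    imputed[~detected] = np.exp(intercept + slope * z[~detected])
    return imputed  # e.g. imputed.mean(), np.median(imputed), imputed.std()
```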

  6. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, (1) the algorithm adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time-series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
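
    A heavily simplified, single-stream stand-in for the window-ranking idea (not the authors' adaptive spatio-temporal model) might look like the sketch below; the window length, step size and score definition are assumptions.

```python
import numpy as np

def rank_anomalous_windows(series, window=60, step=30):
    """Rank time windows of a single sensor stream by a simple anomaly score.

    The score combines how far the window mean drifts from the running history
    and how much its spread departs from the historical spread.
    """
    series = np.asarray(series, float)
    scores = []
    for start in range(window, len(series) - window + 1, step):
        history = series[:start]
        win = series[start:start + window]
        mu, sigma = history.mean(), history.std() + 1e-9
        mean_drift = abs(win.mean() - mu) / sigma
        spread_shift = abs(win.std() - history.std()) / sigma
        scores.append((start, mean_drift + spread_shift))
    # Highest scores first: the most likely data-quality issues or events
    return sorted(scores, key=lambda s: s[1], reverse=True)
```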

  7. Comparative Study in Laboratory Rats to Validate Sperm Quality Methods and Endpoints

    NASA Technical Reports Server (NTRS)

    Price, W. A.; Briggs, G. B.; Alexander, W. K.; Still, K. R.; Grasman, K. A.

    2000-01-01

    The Naval Health Research Center, Detachment (Toxicology) performs toxicity studies in laboratory animals to characterize the risk of exposure to chemicals of Navy interest. Research was conducted at the Toxicology Detachment at WPAFB, OH, in collaboration with Wright State University, Department of Biological Sciences, for the validation of new bioassay methods for evaluating reproductive toxicity. The Hamilton Thorne sperm analyzer was used to evaluate sperm damage produced by exposure to a known testicular toxicant, methoxyacetic acid, and by inhalation exposure to JP-8 and JP-5 in laboratory rats. Sperm quality parameters (sperm concentration, motility, and morphology) were evaluated to provide evidence of sperm damage. The Hamilton Thorne sperm analyzer utilizes a DNA-specific fluorescent stain (similar to flow cytometry) and digitized optical computer analysis to detect sperm cell damage. Computer-assisted sperm analysis (CASA) is a more rapid, robust, predictive and sensitive method for characterizing reproductive toxicity. The results presented in this poster report validation information showing that exposure to methoxyacetic acid causes reproductive toxicity, whereas inhalation exposure to JP-8 and JP-5 had no significant effects. The CASA method detects early changes that result in reproductive deficits, and these data will be used in a continuing program to characterize the toxicity of chemicals, and combinations of chemicals, of military interest to formulate permissible exposure limits.

  8. Fault detection of gearbox using time-frequency method

    NASA Astrophysics Data System (ADS)

    Widodo, A.; Satrijo, Dj.; Prahasto, T.; Haryanto, I.

    2017-04-01

    This research deals with fault detection and diagnosis of a gearbox using vibration signatures. In this work, fault detection and diagnosis are approached with time-frequency methods, and the results are compared with cepstrum analysis. Experimental work was conducted to acquire vibration signals from a self-designed gearbox test rig. This test rig can reproduce normal and faulty gearbox conditions, i.e., wear and tooth breakage. Three accelerometers were used to acquire vibration signals from the gearbox, and an optical tachometer was used to measure shaft rotation speed. The results show that frequency-domain analysis using the fast Fourier transform was less sensitive to the wear and tooth breakage conditions. However, the short-time Fourier transform was able to reveal the gearbox faults. The wavelet transform (WT) also performed well in gearbox fault detection after time-synchronous averaging (TSA) of the vibration signal.
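
    For illustration, a short-time Fourier transform of a vibration record can be computed with SciPy as shown below; the sampling rate, window length and the synthetic test signal are assumptions rather than parameters from the study.

```python
import numpy as np
from scipy.signal import stft

def gear_mesh_spectrogram(vibration, fs=20_000, nperseg=2048):
    """Short-time Fourier transform of a vibration record; localized bursts of
    energy around the gear-mesh frequency can hint at tooth damage."""
    f, t, Z = stft(vibration, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    power_db = 20 * np.log10(np.abs(Z) + 1e-12)
    return f, t, power_db

# Usage with a synthetic signal standing in for measured data:
fs = 20_000
t = np.arange(0, 2.0, 1 / fs)
vib = np.sin(2 * np.pi * 350 * t)                       # gear-mesh tone
vib[int(0.9 * fs):int(0.95 * fs)] += 0.8 * np.random.randn(int(0.05 * fs))
freqs, times, power_db = gear_mesh_spectrogram(vib, fs)
```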

  9. Development, validation and comparison of NIR and Raman methods for the identification and assay of poor-quality oral quinine drops.

    PubMed

    Mbinze, J K; Sacré, P-Y; Yemoa, A; Mavar Tayey Mbay, J; Habyalimana, V; Kalenda, N; Hubert, Ph; Marini, R D; Ziemons, E

    2015-01-01

    Poor-quality antimalarial drugs are a major public health problem in Africa. The depth of this problem may be explained in part by the lack of effective enforcement and the lack of efficient local drug analysis laboratories. To tackle part of this issue, two spectroscopic methods able to detect and quantify quinine dihydrochloride in children's oral drop formulations were developed and validated. Raman and near infrared (NIR) spectroscopy were selected for the drug analysis because they are low-cost, non-destructive and rapid. Both methods were successfully validated using the total error approach in the range of 50-150% of the target concentration (20% w/v) within the 10% acceptance limits. Samples collected on the Congolese pharmaceutical market were analyzed by both techniques to detect potentially substandard drugs. After a comparison of the analytical performance of both methods, it was decided to implement the NIR spectroscopy method for the routine analysis of quinine oral drop samples in the Quality Control Laboratory of Drugs at the University of Kinshasa (DRC). Copyright © 2015 Elsevier B.V. All rights reserved.

  10. An operant-based detection method for inferring tinnitus in mice.

    PubMed

    Zuo, Hongyan; Lei, Debin; Sivaramakrishnan, Shobhana; Howie, Benjamin; Mulvany, Jessica; Bao, Jianxin

    2017-11-01

    Subjective tinnitus is a hearing disorder in which a person perceives sound when no external sound is present. It can be acute or chronic. Because our current understanding of its pathology is incomplete, no effective cures have yet been established. Mouse models are useful for studying the pathophysiology of tinnitus as well as for developing therapeutic treatments. We have developed a new method for determining acute and chronic tinnitus in mice, called sound-based avoidance detection (SBAD). The SBAD method utilizes one paradigm to detect tinnitus and another paradigm to monitor possible confounding factors, such as motor impairment, loss of motivation, and deficits in learning and memory. The SBAD method has succeeded in monitoring both acute and chronic tinnitus in mice. Its detection ability is further validated by functional studies demonstrating an abnormal increase in neuronal activity in the inferior colliculus of mice that had previously been identified as having tinnitus by the SBAD method. The SBAD method provides a new means by which investigators can detect tinnitus in a single mouse accurately and with more control over potential confounding factors than existing methods. This work establishes a new behavioral method for detecting tinnitus in mice. The detection outcome is consistent with functional validation. One key advantage of mouse models is they provide researchers the opportunity to utilize an extensive array of genetic tools. This new method could lead to a deeper understanding of the molecular pathways underlying tinnitus pathology. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. First External Quality Assessment of Molecular and Serological Detection of Rift Valley Fever in the Western Mediterranean Region

    PubMed Central

    Monaco, Federica; Cosseddu, Gian Mario; Doumbia, Baba; Madani, Hafsa; El Mellouli, Fatiha; Jiménez-Clavero, Miguel Angel; Sghaier, Soufien; Marianneau, Philippe; Cetre-Sossah, Catherine; Polci, Andrea; Lacote, Sandra; Lakhdar, Magtouf; Fernandez-Pinero, Jovita; Sari Nassim, Chabane; Pinoni, Chiara; Capobianco Dondona, Andrea; Gallardo, Carmina; Bouzid, Taoufiq; Conte, Annamaria; Bortone, Grazia; Savini, Giovanni; Petrini, Antonio; Puech, Lilian

    2015-01-01

    Rift Valley fever (RVF) is a mosquito-borne viral zoonosis which affects humans and a wide range of domestic and wild ruminants. The large spread of RVF in Africa and its potential to emerge beyond its geographic range require the development of surveillance strategies to promptly detect disease outbreaks in order to implement efficient control measures, which could prevent the spread of the virus to humans. The Animal Health Mediterranean Network (REMESA), which links Northern African countries such as Algeria, Egypt, Libya, Mauritania, Morocco and Tunisia with Southern European ones such as France, Italy, Portugal and Spain, has aimed at improving animal health in the Western Mediterranean Region since 2009. In this context, a first assessment of the diagnostic capacities of the laboratories involved in RVF surveillance was performed. The first proficiency testing (external quality assessment, EQA) for the detection of the viral genome and antibodies of RVF virus (RVFV) was carried out from October 2013 to February 2014. Ten laboratories from 6 different countries participated (4 from North Africa and 2 from Europe). Six laboratories participated in the ring trial for both viral RNA and antibody detection methods, while four laboratories participated exclusively in the antibody detection ring trial. For the EQA targeting the viral RNA detection methods, 5 out of 6 laboratories reported 100% correct results. One laboratory misidentified 2 positive samples as negative and 3 positive samples as doubtful, indicating a need for corrective actions. For the EQA targeting IgG and IgM antibody methods, 9 out of the 10 laboratories reported 100% correct results, whilst one laboratory reported all correct results except one false positive. These two ring trials provide evidence that most of the participating laboratories are capable of detecting RVF antibodies and viral RNA, thus recognizing RVF infection in affected ruminants with the diagnostic methods currently

  12. First External Quality Assessment of Molecular and Serological Detection of Rift Valley Fever in the Western Mediterranean Region.

    PubMed

    Monaco, Federica; Cosseddu, Gian Mario; Doumbia, Baba; Madani, Hafsa; El Mellouli, Fatiha; Jiménez-Clavero, Miguel Angel; Sghaier, Soufien; Marianneau, Philippe; Cetre-Sossah, Catherine; Polci, Andrea; Lacote, Sandra; Lakhdar, Magtouf; Fernandez-Pinero, Jovita; Sari Nassim, Chabane; Pinoni, Chiara; Capobianco Dondona, Andrea; Gallardo, Carmina; Bouzid, Taoufiq; Conte, Annamaria; Bortone, Grazia; Savini, Giovanni; Petrini, Antonio; Puech, Lilian

    2015-01-01

    Rift Valley fever (RVF) is a mosquito-borne viral zoonosis which affects humans and a wide range of domestic and wild ruminants. The large spread of RVF in Africa and its potential to emerge beyond its geographic range require the development of surveillance strategies to promptly detect disease outbreaks in order to implement efficient control measures, which could prevent the spread of the virus to humans. The Animal Health Mediterranean Network (REMESA), which links Northern African countries such as Algeria, Egypt, Libya, Mauritania, Morocco and Tunisia with Southern European ones such as France, Italy, Portugal and Spain, has aimed at improving animal health in the Western Mediterranean Region since 2009. In this context, a first assessment of the diagnostic capacities of the laboratories involved in RVF surveillance was performed. The first proficiency testing (external quality assessment, EQA) for the detection of the viral genome and antibodies of RVF virus (RVFV) was carried out from October 2013 to February 2014. Ten laboratories from 6 different countries participated (4 from North Africa and 2 from Europe). Six laboratories participated in the ring trial for both viral RNA and antibody detection methods, while four laboratories participated exclusively in the antibody detection ring trial. For the EQA targeting the viral RNA detection methods, 5 out of 6 laboratories reported 100% correct results. One laboratory misidentified 2 positive samples as negative and 3 positive samples as doubtful, indicating a need for corrective actions. For the EQA targeting IgG and IgM antibody methods, 9 out of the 10 laboratories reported 100% correct results, whilst one laboratory reported all correct results except one false positive. These two ring trials provide evidence that most of the participating laboratories are capable of detecting RVF antibodies and viral RNA, thus recognizing RVF infection in affected ruminants with the diagnostic methods currently

  13. Method for outlier detection: a tool to assess the consistency between laboratory data and ultraviolet-visible absorbance spectra in wastewater samples.

    PubMed

    Zamora, D; Torres, A

    2014-01-01

    Reliable estimation of the evolution of water quality parameters using in situ technologies makes it possible to follow the operation of a wastewater treatment plant (WWTP), as well as to improve the understanding and control of the operation, especially in the detection of disturbances. However, ultraviolet (UV)-Vis sensors have to be calibrated by means of a local fingerprint data-set of laboratory reference concentration values. The detection of outliers in these data-sets is therefore important. This paper presents a method for detecting outliers in UV-Vis absorbances coupled to water quality reference laboratory concentrations for samples used for calibration purposes. An application to samples from the influent of the San Fernando WWTP (Medellín, Colombia) is shown. After the removal of outliers, improvements in the predictability of the influent concentrations from the absorbance spectra were found.
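
    The paper's own outlier criterion is not reproduced here; as a generic sketch of the task, the code below flags calibration samples whose laboratory concentration disagrees with a simple prediction made from the first principal component of their absorbance spectra. The PCA/least-squares model and the z-score cut-off are assumptions.

```python
import numpy as np

def flag_calibration_outliers(absorbance, concentration, z_max=2.5):
    """Flag calibration samples whose lab concentration disagrees with a
    simple linear prediction from their UV-Vis absorbance spectra.

    absorbance    : (n_samples, n_wavelengths) spectra
    concentration : (n_samples,) laboratory reference values
    Returns a boolean mask of suspected outliers.
    """
    A = np.asarray(absorbance, float)
    c = np.asarray(concentration, float)
    # Project spectra onto their first principal component as a 1-D summary
    centred = A - A.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    score = centred @ vt[0]
    # Least-squares fit score -> concentration, then standardized residuals
    X = np.column_stack([score, np.ones_like(score)])
    coef, *_ = np.linalg.lstsq(X, c, rcond=None)
    residual = c - X @ coef
    z = (residual - residual.mean()) / (residual.std() + 1e-12)
    return np.abs(z) > z_max
```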

  14. Reliably detectable flaw size for NDE methods that use calibration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process to determine the reliably detectable flaw size.

  15. Reliably Detectable Flaw Size for NDE Methods that Use Calibration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process to determine the reliably detectable flaw size.

  16. Essential Limitations of the Standard THz TDS Method for Substance Detection and Identification and a Way of Overcoming Them.

    PubMed

    Trofimov, Vyacheslav A; Varentsova, Svetlana A

    2016-04-08

    The low efficiency of the standard THz TDS method for the detection and identification of substances, which is based on comparing the spectrum of the signal under investigation with a standard signal spectrum, is demonstrated using physical experiments conducted under real conditions with a thick paper bag as well as with Si-based semiconductors under laboratory conditions. In fact, standard THz spectroscopy leads to false detection of hazardous substances in neutral samples that do not contain them. This disadvantage of the THz TDS method can be overcome by using time-dependent THz pulse spectrum analysis. To assess the presence of a standard substance's spectral features in the signal under analysis, one may use time-dependent integral correlation criteria.

  17. General Quality Control (QC) Guidelines for SAM Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  18. River water quality analysis via headspace detection of volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Tang, Johnny Jock Lee; Nishi, Phyllis Jacqueline; Chong, Gabriel Eng Wee; Wong, Martin Gideon; Chua, Hong Siang; Persaud, Krishna; Ng, Sing Muk

    2017-03-01

    Human civilization has intensified the interaction between the community and the environment. This increases the threat of the environment being over-exploited and contaminated with man-made products and synthetic chemicals. Clean water, in particular, is a resource that is easily contaminated, since water is a universal solvent with high mobility. This work reports the development and optimization of a water quality monitoring system based on metal oxide sensors. The system is intended to assist the detection of volatile organic compounds (VOCs) present in water sources online and onsite. The sampling mechanism was contactless: the headspace partial pressure of the VOCs formed above the water body in a closed chamber was drawn for detection at the sensor platform. Pure toluene was used as a standard to represent the broad spectrum of VOCs, and a sensor dynamic range of 1-1000 ppb was achieved. Several sensing parameters, such as sampling time, headspace volume, and sensor recovery, were studied and optimized. Besides direct detection of VOC contaminants in the water, the work has also been extended to detect VOCs produced by microbial communities and to correlate the size of the communities with the recorded VOC readings. This can give a better indication of water quality, reflecting not only the concentration of chemical VOC contamination but also the microbial content, some of which can have severe effects on human health.

  19. Development of novel wireless sensor for food quality detection

    NASA Astrophysics Data System (ADS)

    Son Nguyen, Dat; Ngan Le, Nguyen; Phat Lam, Tan; Fribourg-Blanc, Eric; Chien Dang, Mau; Tedjini, Smail

    2015-12-01

    In this paper we present a wireless sensor for the monitoring of food quality. We integrate sensing capability into ultrahigh frequency (UHF) radio-frequency identification (RFID) tags through the relationship between the physical read-range and the permittivity of the object labelled with the RFID tag. Using the known variation of food permittivity as a function of time, we can detect the contamination time at which a food product becomes unacceptable for consumption, based on the measurement of read-range with the as-designed sensing tags. This low-cost UHF RFID passive sensor was designed and experimentally tested on beef, pork, and cheese under the same storage conditions as in supermarkets. The agreement between the experimental and simulation results shows the potential of this technique for practical application in food-quality tracking.

  20. Magnetic Stirrer Method for the Detection of Trichinella Larvae in Muscle Samples.

    PubMed

    Mayer-Scholl, Anne; Pozio, Edoardo; Gayda, Jennifer; Thaben, Nora; Bahn, Peter; Nöckler, Karsten

    2017-03-03

    Trichinellosis is a debilitating disease in humans and is caused by the consumption of raw or undercooked meat of animals infected with the nematode larvae of the genus Trichinella. The most important sources of human infections worldwide are game meat and pork or pork products. In many countries, the prevention of human trichinellosis is based on the identification of infected animals by means of the artificial digestion of muscle samples from susceptible animal carcasses. There are several methods based on the digestion of meat but the magnetic stirrer method is considered the gold standard. This method allows the detection of Trichinella larvae by microscopy after the enzymatic digestion of muscle samples and subsequent filtration and sedimentation steps. Although this method does not require special and expensive equipment, internal controls cannot be used. Therefore, stringent quality management should be applied throughout the test. The aim of the present work is to provide detailed handling instructions and critical control points of the method to analysts, based on the experience of the European Union Reference Laboratory for Parasites and the National Reference Laboratory of Germany for Trichinella.

  1. Dynamic baseline detection method for power data network service

    NASA Astrophysics Data System (ADS)

    Chen, Wei

    2017-08-01

    This paper proposes a dynamic-baseline traffic detection method for power data networks that is based on historical traffic data. The method uses Cisco's NetFlow acquisition tool to collect the original historical traffic data from network elements at fixed intervals. It uses three dimensions of information: the communication port, the time, and the traffic volume (number of bytes or number of packets). By filtering the data, removing deviating values, calculating the dynamic baseline value, and comparing the actual value with the baseline value, the method can detect whether the current network traffic is abnormal.
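
    A minimal sketch of this kind of dynamic baseline check is given below; the per-time-slot history matrix, the median-based trimming and the k-sigma comparison are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def dynamic_baseline_alarm(history, current, k=3.0):
    """Per-time-slot traffic baseline with simple outlier trimming.

    history : (n_days, n_slots) traffic matrix (bytes or packets) collected
              at fixed intervals for one communication port
    current : (n_slots,) today's traffic for the same port
    Returns a boolean vector marking slots whose traffic deviates from the
    dynamic baseline by more than k standard deviations.
    """
    history = np.asarray(history, float)
    # Trim deviating days per slot before computing the baseline
    med = np.median(history, axis=0)
    mad = np.median(np.abs(history - med), axis=0) + 1e-9
    trimmed = np.where(np.abs(history - med) > 3 * 1.4826 * mad,
                       np.nan, history)
    baseline = np.nanmean(trimmed, axis=0)
    spread = np.nanstd(trimmed, axis=0) + 1e-9
    return np.abs(np.asarray(current, float) - baseline) > k * spread
```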

  2. The Quality of Methods Reporting in Parasitology Experiments

    PubMed Central

    Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy

    2014-01-01

    There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000–2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32–90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <−0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental

  3. The quality of methods reporting in parasitology experiments.

    PubMed

    Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy

    2014-01-01

    There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000-2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32-90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <-0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental infection

  4. Effects of Linking Methods on Detection of DIF.

    ERIC Educational Resources Information Center

    Kim, Seock-Ho; Cohen, Allan S.

    1992-01-01

    Effects of the following methods for linking metrics on detection of differential item functioning (DIF) were compared: (1) test characteristic curve method (TCC); (2) weighted mean and sigma method; and (3) minimum chi-square method. With large samples, results were essentially the same. With small samples, TCC was most accurate. (SLD)

  5. Management of low-grade cervical abnormalities detected at screening: which method do women prefer?

    PubMed

    Whynes, D K; Woolley, C; Philips, Z

    2008-12-01

    To establish whether women with low-grade abnormalities detected during screening for cervical cancer prefer to be managed by cytological surveillance or by immediate colposcopy. TOMBOLA (Trial of Management of Borderline and Other Low-grade Abnormal smears) is a randomized controlled trial comparing alternative management strategies following the screen-detection of low-grade cytological abnormalities. At exit, a sample of TOMBOLA women completed a questionnaire eliciting opinions on their management, contingent valuations (CV) of the management methods and preferences. Within-trial quality of life (EQ-5D) data collected for a sample of TOMBOLA women throughout their follow-up enabled the comparison of self-reported health at various time points, by management method. Once management had been initiated, self-reported health in the colposcopy arm rose relative to that in the surveillance arm, although the effect was short-term only. For the majority of women, the satisfaction ratings and the CV indicated approval of the management method to which they had been randomized. Of the minority manifesting a preference for the method which they had not experienced, relatively more would have preferred colposcopy than would have preferred surveillance. The findings must be interpreted in the light of sample bias with respect to preferences, whereby enthusiasm for colposcopy was probably over-represented amongst trial participants. The study suggests that neither of the management methods is preferred unequivocally; rather, individual women have individual preferences, although many would be indifferent between methods.

  6. Method of detecting genetic translocations identified with chromosomal abnormalities

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel; Tkachuk, Douglas

    2001-01-01

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML) and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  7. Method of detecting genetic deletions identified with chromosomal abnormalities

    DOEpatents

    Gray, Joe W; Pinkel, Daniel; Tkachuk, Douglas

    2013-11-26

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML) and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  8. MASQOT: a method for cDNA microarray spot quality control

    PubMed Central

    Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan

    2005-01-01

    Background cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to prevent the risk of receiving illusive analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior in identifying unreliable data compared to other evaluated methodologies. Conclusion The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality which can be utilized as weights in subsequent analysis procedures as well as to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442

  9. A Distributed Signature Detection Method for Detecting Intrusions in Sensor Systems

    PubMed Central

    Kim, Ilkyu; Oh, Doohwan; Yoon, Myung Kuk; Yi, Kyueun; Ro, Won Woo

    2013-01-01

    Sensor nodes in wireless sensor networks are easily exposed to open and unprotected regions. A security solution is strongly recommended to protect networks against malicious attacks. Although many intrusion detection systems have been developed, most systems are difficult to implement on sensor nodes owing to their limited computation resources. To address this problem, we develop a novel distributed network intrusion detection system based on the Wu–Manber algorithm. In the proposed system, the algorithm is divided into two steps: the first step is dedicated to a sensor node, and the second step is assigned to a base station. In addition, the first step is modified to achieve efficient performance under limited computation resources. We conduct evaluations with random string sets and actual intrusion signatures to show the performance improvement of the proposed method. The proposed method achieves a speedup factor of 25.96 and reduces packet transmissions to the base station by 43.94% compared with the previously proposed method. The system achieves efficient utilization of the sensor nodes and provides a structural basis for cooperative systems among the sensors. PMID:23529146

  10. A distributed signature detection method for detecting intrusions in sensor systems.

    PubMed

    Kim, Ilkyu; Oh, Doohwan; Yoon, Myung Kuk; Yi, Kyueun; Ro, Won Woo

    2013-03-25

    Sensor nodes in wireless sensor networks are easily exposed to open and unprotected regions. A security solution is strongly recommended to protect networks against malicious attacks. Although many intrusion detection systems have been developed, most systems are difficult to implement on sensor nodes owing to their limited computation resources. To address this problem, we develop a novel distributed network intrusion detection system based on the Wu-Manber algorithm. In the proposed system, the algorithm is divided into two steps: the first step is dedicated to a sensor node, and the second step is assigned to a base station. In addition, the first step is modified to achieve efficient performance under limited computation resources. We conduct evaluations with random string sets and actual intrusion signatures to show the performance improvement of the proposed method. The proposed method achieves a speedup factor of 25.96 and reduces packet transmissions to the base station by 43.94% compared with the previously proposed method. The system achieves efficient utilization of the sensor nodes and provides a structural basis for cooperative systems among the sensors.
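
    As a much-simplified illustration of the two-stage split described in the two records above (a cheap filter on the sensor node, full matching at the base station), the sketch below uses a naive prefix filter rather than the partitioned Wu-Manber algorithm itself; the signatures and prefix length are made up.

```python
# Illustrative two-stage signature matching, not the published system.
SIGNATURES = [b"GET /etc/passwd", b"\x90\x90\x90\x90", b"cmd.exe"]
PREFIX_LEN = 3
PREFIXES = {sig[:PREFIX_LEN] for sig in SIGNATURES}

def sensor_node_stage(packet: bytes) -> bool:
    """Stage 1 (sensor node): cheap scan for any signature prefix."""
    return any(packet[i:i + PREFIX_LEN] in PREFIXES
               for i in range(len(packet) - PREFIX_LEN + 1))

def base_station_stage(packet: bytes) -> list[bytes]:
    """Stage 2 (base station): exact matching on forwarded packets only."""
    return [sig for sig in SIGNATURES if sig in packet]

packet = b"....GET /etc/passwd HTTP/1.1...."
if sensor_node_stage(packet):          # packet is forwarded only if suspicious
    print(base_station_stage(packet))
```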

  11. A New Method for Ultrasound Detection of Interfacial Position in Gas-Liquid Two-Phase Flow

    PubMed Central

    Coutinho, Fábio Rizental; Ofuchi, César Yutaka; de Arruda, Lúcia Valéria Ramos; Jr., Flávio Neves; Morales, Rigoberto E. M.

    2014-01-01

    Ultrasonic measurement techniques for velocity estimation are currently widely used in fluid flow studies and applications. An accurate determination of the interfacial position in gas-liquid two-phase flows is still an open problem. The quality of this information directly affects the accuracy of void fraction measurement, and it provides a means of discriminating the velocity information of the two phases. The algorithm known as Velocity Matched Spectrum (VM Spectrum) is a velocity estimator that stands out from other methods by returning a spectrum of velocities for each interrogated volume sample. Interface detection of free-rising bubbles in quiescent liquid presents some difficulties due to abrupt changes in interface inclination. In this work a method based on the velocity-spectrum curve shape is used to generate a spatial-temporal mapping which, after spatial filtering, yields an accurate contour of the air-water interface. It is shown that the proposed technique yields an RMS error between 1.71 and 3.39 and a probability of detection failure and false detection between 0.89% and 11.9% in determining the spatial-temporal gas-liquid interface position in the flow of free-rising bubbles in stagnant liquid. This result is valid both for a free path and with the transducer emitting through a metallic plate or a Plexiglas pipe. PMID:24858961

  12. Method for detecting an image of an object

    DOEpatents

    Chapman, Leroy Dean; Thomlinson, William C.; Zhong, Zhong

    1999-11-16

    A method for detecting an absorption, refraction and scatter image of an object by independently analyzing, detecting, digitizing, and combining images acquired on a high and a low angle side of a rocking curve of a crystal analyzer. An x-ray beam which is generated by any suitable conventional apparatus can be irradiated upon either a Bragg type crystal analyzer or a Laue type crystal analyzer. Images of the absorption, refraction and scattering effects are detected, such as on an image plate, and then digitized. The digitized images are simultaneously solved, preferably on a pixel-by-pixel basis, to derive a combined visual image which has dramatically improved contrast and spatial resolution over an image acquired through conventional radiology methods.

  13. Method of Detecting Coliform Bacteria from Reflected Light

    NASA Technical Reports Server (NTRS)

    Vincent, Robert K. (Inventor)

    2014-01-01

    The present invention relates to a method of detecting coliform bacteria in water from reflected light, and also includes devices for the measurement, calculation and transmission of data relating to that method.

  14. Narrative methods in quality improvement research

    PubMed Central

    Greenhalgh, T; Russell, J; Swinglehurst, D

    2005-01-01

    

    This paper reviews and critiques the different approaches to the use of narrative in quality improvement research. The defining characteristics of narrative are chronology (unfolding over time); emplotment (the literary juxtaposing of actions and events in an implicitly causal sequence); trouble (that is, harm or the risk of harm); and embeddedness (the personal story nests within a particular social, historical and organisational context). Stories are about purposeful action unfolding in the face of trouble and, as such, have much to offer quality improvement researchers. But the quality improvement report (a story about efforts to implement change), which is common, must be distinguished carefully from narrative based quality improvement research (focused systematic enquiry that uses narrative methods to generate new knowledge), which is currently rare. We distinguish four approaches to the use of narrative in quality improvement research—narrative interview; naturalistic story gathering; organisational case study; and collective sense-making—and offer a rationale, describe how data can be collected and analysed, and discuss the strengths and limitations of each using examples from the quality improvement literature. Narrative research raises epistemological questions about the nature of narrative truth (characterised by sense-making and emotional impact rather than scientific objectivity), which has implications for how rigour should be defined (and how it might be achieved) in this type of research. We offer some provisional guidance for distinguishing high quality narrative research in a quality improvement setting from other forms of narrative account such as report, anecdote, and journalism. PMID:16326792

  15. Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems

    DTIC Science & Technology

    2009-03-01

    using the composite event detection method [Kerman, Jiang, Blumberg, and Buttrey, 2009]. Although the techniques and utility of the... aforementioned method have been clearly demonstrated, there is still much work and research to be conducted within the realm of event detection. This... detection methods. The paragraphs that follow summarize the discoveries of and lessons learned by multiple researchers and authors over many

  16. Moving target detection method based on improved Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Ma, J. Y.; Jie, F. R.; Hu, Y. J.

    2017-07-01

    The Gaussian mixture model is often employed to build the background model in background-difference methods for moving target detection. This paper puts forward an adaptive moving target detection algorithm based on an improved Gaussian mixture model. According to the gray-level convergence of each pixel, the number of Gaussian distributions used to learn and update the background model is chosen adaptively. A morphological reconstruction method is adopted to eliminate shadows. Experiments show that the proposed method not only has good robustness and detection performance, but also good adaptability: even in special cases such as large grayscale changes, it still performs well.
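    A minimal sketch of the general technique, using OpenCV's mixture-of-Gaussians background subtractor plus simple morphological clean-up; the adaptive selection of the number of Gaussians and the morphological reconstruction step described in the paper are not reproduced, and the video path and parameters are assumptions.

```python
# GMM background subtraction with OpenCV's MOG2 and morphological clean-up.
import cv2

cap = cv2.VideoCapture("input.avi")            # assumed input video path
subtractor = cv2.createBackgroundSubtractorMOG2(
    history=500, varThreshold=16, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = subtractor.apply(frame)               # per-pixel GMM background model
    fg[fg == 127] = 0                          # drop pixels flagged as shadow
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)    # remove small noise
    fg = cv2.morphologyEx(fg, cv2.MORPH_CLOSE, kernel)   # fill small holes
    cv2.imshow("foreground", fg)
    if cv2.waitKey(30) & 0xFF == 27:           # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```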

  17. Novel Methods for Detecting Buried Explosive Devices

    DTIC Science & Technology

    2007-04-10

    NQR), and semiotic data fusion. Bioreporter bacteria look promising for third-world humanitarian applications; they are inexpensive, and... demining, NQR is a promising method for detecting explosive substances; of 50,000 substances that have been tested, none has an NQR signature that can be... approach to a cheap mine detector for humanitarian use. Real-time wavelet processing appears to be a key to extending NQR bomb detection into mine

  18. Optimal DNA Isolation Method for Detection of Nontuberculous Mycobacteria by Polymerase Chain Reaction

    PubMed Central

    Mohammadi, Samira; Esfahani, Bahram Nasr; Moghim, Sharareh; Mirhendi, Hossein; Zaniani, Fatemeh Riyahi; Safaei, Hajieh Ghasemian; Fazeli, Hossein; Salehi, Mahshid

    2017-01-01

    Background: Nontuberculous mycobacteria (NTM) are a group of opportunistic pathogens that are widely dispersed in water and soil resources. Identification of mycobacterial isolates by conventional methods, including biochemical tests, growth rates, colony pigmentation, and the presence of acid-fast bacilli, is widely used, but these methods are time-consuming, labor-intensive, and may sometimes remain inconclusive. Materials and Methods: DNA was extracted from NTM cultures using CTAB, Chelex, Chelex + Nonidet P-40, FTA® Elute card, and boiling. The quantity and quality of the DNA extracted via these methods were determined using a UV photometer at 260 and 280 nm and by polymerase chain reaction (PCR) amplification of the heat-shock protein 65 gene with serially diluted DNA samples. Results: The CTAB method showed more positive results at dilutions of 1:10–1:100,000, at which the DNA amount was substantial. With the Chelex method of DNA extraction, PCR amplification was detected at 1:10 and 1:1000 dilutions. Conclusions: According to the electrophoresis results, the CTAB and Chelex DNA extraction methods were more successful than the others in producing suitable concentrations of DNA with a minimum of PCR inhibitors. PMID:29279831

  19. Human Health Water Quality Criteria and Methods for Toxics

    EPA Pesticide Factsheets

    Documents pertaining to Human Health Water Quality Criteria and Methods for Toxics. Includes the 2015 Update for Water Quality Criteria, the 2002 National Recommended Human Health Criteria, and the 2000 EPA Methodology.

  20. Sunglass detection method for automation of video surveillance system

    NASA Astrophysics Data System (ADS)

    Sikandar, Tasriva; Samsudin, Wan Nur Azhani W.; Hawari Ghazali, Kamarul; Mohd, Izzeldin I.; Fazle Rabbi, Mohammad

    2018-04-01

    Wearing sunglasses to hide the face from surveillance cameras is common in criminal incidents. Therefore, sunglass detection from surveillance video has become a demanding issue in the automation of security systems. In this paper we propose an image processing method to detect sunglasses in surveillance images. Specifically, a unique feature using facial height and width is employed to identify the covered region of the face. The presence of an area covered by sunglasses is evaluated using the facial height-width ratio, and a threshold on the percentage of covered area is used to classify glass-wearing faces. Two different types of glasses are considered, i.e. eyeglasses and sunglasses. The results of this study demonstrate that the proposed method is able to detect sunglasses under two different illumination conditions, namely room illumination and the presence of sunlight. In addition, due to the multi-level checking of the facial region, the method detects sunglasses with 100% accuracy. However, in an exceptional case where fabric surrounding the face has a similar color to skin, the correct detection rate for eyeglasses was found to be 93.33%.

  1. Chemical detection system and related methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caffrey, Augustine J.; Chichester, David L.; Egger, Ann E.

    2017-06-27

    A chemical detection system includes a frame, an emitter coupled to the frame, and a detector coupled to the frame proximate the emitter. The system also includes a shielding system coupled to the frame and positioned at least partially between the emitter and the detector, wherein the frame positions a sensing surface of the detector in a direction substantially parallel to a plane extending along a front portion of the frame. A method of analyzing composition of a suspect object includes directing neutrons at the object, detecting gamma rays emitted from the object, and communicating spectrometer information regarding the gamma rays. The method also includes presenting a GUI to a user with a dynamic status of an ongoing neutron spectroscopy process. The dynamic status includes a present confidence for a plurality of compounds being present in the suspect object responsive to changes in the spectrometer information during the ongoing process.

  2. Sensing Methods for Detecting Analog Television Signals

    NASA Astrophysics Data System (ADS)

    Rahman, Mohammad Azizur; Song, Chunyi; Harada, Hiroshi

    This paper introduces a unified method of spectrum sensing for all existing analog television (TV) signals, including NTSC, PAL and SECAM. We propose a correlation-based method (CBM) with a single reference signal for sensing any analog TV signal, and we also propose an improved energy detection method. The CBM approach has been implemented in a hardware prototype specially designed for participation in the Singapore TV white space (WS) test trial conducted by the Infocomm Development Authority (IDA) of the Singapore government. Analytical and simulation results for the CBM method are presented, as well as hardware testing results for sensing various analog TV signals. Both AWGN and fading channels are considered. It is shown that the theoretical results closely match those from simulations. Sensing performance of the hardware prototype is also presented in a fading environment by using a fading simulator. We present the performance of the proposed techniques in terms of probability of false alarm, probability of detection, sensing time, etc., and we also present a comparative study of the various techniques.
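    The two sensing ideas mentioned above can be sketched in a few lines: a plain energy detector and a correlation detector that matches the received samples against a stored reference waveform. The thresholds, the synthetic reference and the embedded test signal below are illustrative assumptions, not values from the paper or its hardware prototype.

```python
# Energy detection vs. correlation-based detection of a known reference.
import numpy as np

def energy_detect(samples, threshold):
    """Declare signal present if the average sample energy exceeds the threshold."""
    return np.mean(np.abs(samples) ** 2) > threshold

def correlation_detect(samples, reference, threshold):
    """Correlate against a reference template and test the normalized peak."""
    corr = np.abs(np.correlate(samples, reference, mode="valid"))
    norm = np.linalg.norm(samples) * np.linalg.norm(reference) + 1e-12
    return corr.max() / norm > threshold

rng = np.random.default_rng(0)
reference = np.cos(2 * np.pi * 0.1 * np.arange(256))    # assumed carrier template
received = rng.normal(scale=1.0, size=4096)
received[1000:1256] += 3.0 * reference                   # embed the "TV carrier"

print(energy_detect(received, threshold=1.1),
      correlation_detect(received, reference, threshold=0.15))
```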

  3. Automated detection of hospital outbreaks: A systematic review of methods

    PubMed Central

    Buckeridge, David L.; Lepelletier, Didier

    2017-01-01

    Objectives Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. Methods We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Results Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Conclusion Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results. PMID:28441422

  4. A bootstrap method for estimating uncertainty of water quality trends

    USGS Publications Warehouse

    Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura

    2015-01-01

    Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
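    The WBT itself is implemented in EGRETci, but the general idea of bootstrapping a trend estimate can be illustrated generically: resample residuals in blocks (to respect serial correlation), refit the trend, and report a confidence interval for the slope. The synthetic record, block length and linear trend model below are assumptions for illustration only, not the WRTDS Bootstrap Test.

```python
# Generic block-bootstrap confidence interval for a water-quality trend slope.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1990, 2020, dtype=float)
conc = 2.0 - 0.02 * (years - years[0]) + rng.normal(0, 0.15, years.size)

coeffs = np.polyfit(years, conc, 1)            # simple linear trend model
slope, fitted = coeffs[0], np.polyval(coeffs, years)
resid = conc - fitted

def block_bootstrap_slopes(t, fitted, resid, block=5, n_boot=2000):
    n = resid.size
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block + 1, size=int(np.ceil(n / block)))
        idx = np.concatenate([np.arange(s, s + block) for s in starts])[:n]
        slopes[b] = np.polyfit(t, fitted + resid[idx], 1)[0]
    return slopes

boot = block_bootstrap_slopes(years, fitted, resid)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"trend = {slope:.4f} per year, 95% CI [{lo:.4f}, {hi:.4f}]")
```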

  5. 77 FR 22282 - Draft Guidelines on Biologics Quality Monitoring: Testing for the Detection of Mycoplasma...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-13

    ...] Draft Guidelines on Biologics Quality Monitoring: Testing for the Detection of Mycoplasma Contamination... Detection of Mycoplasma Contamination.'' This draft guideline identifies stages of manufacture where... contamination. Because the guidelines apply to final product and master seed/cell testing in veterinary vaccines...

  6. A New Moving Object Detection Method Based on Frame-difference and Background Subtraction

    NASA Astrophysics Data System (ADS)

    Guo, Jiajia; Wang, Junping; Bai, Ruixue; Zhang, Yao; Li, Yong

    2017-09-01

    Although many methods of moving object detection have been proposed, moving object extraction is still the core task in video surveillance. With the complex scenes of the real world, however, false detections, missed detections and cavities inside the detected body still occur. In order to solve the problem of incomplete detection of moving objects, a new moving object detection method combining an improved frame difference with Gaussian mixture background subtraction is proposed in this paper. To make moving object detection more complete and accurate, image repair and morphological processing techniques, which act as spatial compensation, are applied in the proposed method. Experimental results show that our method can effectively eliminate ghosts and noise and fill the cavities of the moving object. Compared to four other moving object detection methods (GMM, ViBe, frame difference, and a method from the literature), the proposed method improves the efficiency and accuracy of detection.
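    The combination can be illustrated with a numpy-only sketch, under assumed conventions: a pixel is marked foreground when it differs either from the previous frame or from a running-average background model, and morphological operations suppress noise and fill cavities. This is not the paper's improved frame difference or its Gaussian mixture model; thresholds and the update rate are assumptions.

```python
# Illustrative combination of frame differencing and background subtraction.
import numpy as np
from scipy.ndimage import binary_closing, binary_opening

def detect_moving(frames, diff_thr=25, bg_thr=25, alpha=0.02):
    """frames: iterable of grayscale frames as 2D uint8 arrays."""
    frames = iter(frames)
    prev = next(frames).astype(np.float32)
    background = prev.copy()
    masks = []
    for frame in frames:
        cur = frame.astype(np.float32)
        frame_diff = np.abs(cur - prev) > diff_thr           # temporal change
        bg_diff = np.abs(cur - background) > bg_thr          # deviates from background
        mask = frame_diff | bg_diff                          # combined evidence
        mask = binary_opening(mask, np.ones((3, 3)))         # suppress isolated noise
        mask = binary_closing(mask, np.ones((7, 7)))         # fill cavities in the object
        background = (1 - alpha) * background + alpha * cur  # slow background update
        prev = cur
        masks.append(mask)
    return masks
```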

  7. Radiometric Method for the Detection of Coliform Organisms in Water

    PubMed Central

    Bachrach, Uriel; Bachrach, Zelilah

    1974-01-01

    A new radiometric method for the detection of coliform bacteria in water has been described. The method is based on the release of 14CO2 from [14C]lactose by bacteria suspended in growth medium and incubated at 37 C. The evolved 14CO2 is trapped by hyamine hydroxide and counted in a liquid scintillation spectrometer. The method permits the detection of 1 to 10 organisms within 6 h of incubation. Coliform bacteria suspended in water for several days recover from starvation and may be quantitated by the proposed method. Bacteria from water samples may also be concentrated by filtration through membrane filters and detected by the radiometric assay. PMID:4605007

  8. Halal and kosher slaughter methods and meat quality: a review.

    PubMed

    Farouk, M M; Al-Mazeedi, H M; Sabow, A B; Bekhit, A E D; Adeyemi, K D; Sazili, A Q; Ghani, A

    2014-11-01

    There are many slaughter procedures that religions and cultures use around the world. The two that are commercially relevant are the halal and kosher methods practiced by Muslims and Jews respectively. The global trade in red meat and poultry produced using these two methods is substantial, thus the importance of the quality of the meat produced using the methods. Halal and kosher slaughter per se should not affect meat quality more than their industrial equivalents, however, some of their associated pre- and post-slaughter processes do. For instance, the slow decline in blood pressure following a halal pre-slaughter head-only stun and neck cut causes blood splash (ecchymosis) in a range of muscles and organs of slaughtered livestock. Other quality concerns include bruising, hemorrhages, skin discoloration and broken bones particularly in poultry. In addition to these conventional quality issues, the "spiritual quality" of the meat can also be affected when the halal and kosher religious requirements are not fully met during the slaughter process. The nature, causes, importance and mitigations of these and other quality issues related to halal and kosher slaughtering and meat production using these methods are the subjects of this review. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. SiPM electro-optical detection system noise suppression method

    NASA Astrophysics Data System (ADS)

    Bi, Xiangli; Yang, Suhui; Hu, Tao; Song, Yiheng

    2014-11-01

    In this paper, the single-photon detection principle of the Silicon Photomultiplier (SiPM) device is introduced. The main noise factors that affect the sensitivity of the electro-optical detection system are analyzed, including background light noise, detector dark noise, preamplifier noise and signal light noise. Optical, electrical and thermodynamic methods are used to suppress the noise of the SiPM electro-optical detection system, which improves the response sensitivity of the detector. By using a SiPM optoelectronic detector with very high sensitivity, together with a small-field large-aperture optical system, high-cutoff narrow-bandwidth filters, a low-noise operational amplifier circuit, modular design of the functional circuits and semiconductor refrigeration technology, the sensitivity of the optical detection system is greatly improved, system noise is reduced and long-range detection of weak laser radiation signals is achieved. Theoretical analysis and experimental results show that the proposed methods are reasonable and efficient.

  10. Automated Detection of Salt Marsh Platforms : a Topographic Method

    NASA Astrophysics Data System (ADS)

    Goodwin, G.; Mudd, S. M.; Clubb, F. J.

    2017-12-01

    Monitoring the topographic evolution of coastal marshes is a crucial step toward improving the management of these valuable landscapes under the pressure of relative sea level rise and anthropogenic modification. However, determining their geometrically complex boundaries currently relies on spectral vegetation detection methods or requires labour-intensive field surveys and digitisation. We propose a novel method to reproducibly isolate saltmarsh scarps and platforms from a DEM. Field observations and numerical models show that saltmarshes mature into sub-horizontal platforms delineated by sub-vertical scarps: based on this premise, we identify scarps as lines of local maxima on a slope*relief raster, then fill landmasses from the scarps upward, thus isolating mature marsh platforms. Non-dimensional search parameters allow batch-processing of data without recalibration. We test our method using lidar-derived DEMs of six saltmarshes in England with varying tidal ranges and geometries, for which topographic platforms were manually isolated from tidal flats. Agreement between manual and automatic segregation exceeds 90% for resolutions of 1 m, with all but one site maintaining this performance for resolutions up to 3.5 m. For resolutions of 1 m, automatically detected platforms are comparable in surface area and elevation distribution to digitised platforms. We also find that our method allows the accurate detection of local block failures three times larger than the DEM resolution. Detailed inspection reveals that although tidal creeks were digitised as part of the marsh platform, automatic detection classifies them as part of the tidal flat, causing an increase in false negatives and overall platform perimeter. This suggests our method would benefit from being combined with existing creek detection algorithms. Fallen blocks and pioneer zones are inconsistently identified, particularly in macro-tidal marshes, leading to differences between digitisation and the automated method.
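    The sequence of operations (slope*relief metric, local-maxima scarp lines, then filling upward from the scarps) can be sketched roughly as below. Window sizes, the quantile threshold and the crude elevation-based fill are assumptions for illustration; they are not the published algorithm or its non-dimensional calibration.

```python
# Rough sketch: scarp candidates from a slope*relief raster, then a simple
# elevation-based proxy for "filling from the scarps upward".
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, label

def detect_platform(dem, cell_size=1.0, relief_window=9, scarp_quantile=0.95):
    gy, gx = np.gradient(dem, cell_size)
    slope = np.hypot(gx, gy)
    relief = (maximum_filter(dem, relief_window)
              - minimum_filter(dem, relief_window))           # local relief
    metric = slope * relief
    scarps = ((metric == maximum_filter(metric, 3))           # local maxima
              & (metric > np.quantile(metric, scarp_quantile)))
    scarp_elev = np.median(dem[scarps]) if scarps.any() else dem.mean()
    platform = dem >= scarp_elev                               # crude upward fill
    labels, n_regions = label(platform)
    return scarps, platform, labels
```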

  11. Method development of damage detection in asymmetric buildings

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Thambiratnam, David P.; Chan, Tommy H. T.; Nguyen, Andy

    2018-01-01

    Aesthetics and functionality requirements have caused most buildings to be asymmetric in recent times. Such buildings exhibit complex vibration characteristics under dynamic loads, as there is coupling between the lateral and torsional components of vibration, and are referred to as torsionally coupled buildings. These buildings require three-dimensional modelling and analysis. In spite of much recent research and some successful applications of vibration-based damage detection methods to civil structures in recent years, their application to asymmetric buildings has been a challenging task for structural engineers. There has been relatively little research on detecting and locating damage specific to torsionally coupled asymmetric buildings. This paper aims to compare the difference in vibration behaviour between symmetric and asymmetric buildings and then use the vibration characteristics for predicting damage in them. The need for developing a special method to detect damage in asymmetric buildings thus becomes evident. Towards this end, this paper modifies the traditional modal strain energy based damage index by decomposing the mode shapes into their lateral and vertical components to form component-specific damage indices. The improved approach is then developed by combining the modified strain energy based damage indices with the modal flexibility method, which is modified to suit three-dimensional structures, to form a new damage indicator. The procedure is illustrated through numerical studies conducted on three-dimensional five-story symmetric and asymmetric frame structures with the same layout, after validating the modelling techniques through experimental testing of a laboratory-scale asymmetric building model. Vibration parameters obtained from finite element analysis of the intact and damaged building models are then applied to the proposed algorithms for detecting and locating single and multiple damages in these buildings. The results

  12. Essential Limitations of the Standard THz TDS Method for Substance Detection and Identification and a Way of Overcoming Them

    PubMed Central

    Trofimov, Vyacheslav A.; Varentsova, Svetlana A.

    2016-01-01

    Low efficiency of the standard THz TDS method for the detection and identification of substances, which is based on comparing the spectrum of the signal under investigation with a standard signal spectrum, is demonstrated using physical experiments conducted under real conditions with a thick paper bag as well as with Si-based semiconductors under laboratory conditions. In fact, standard THz spectroscopy leads to false detection of hazardous substances in neutral samples which do not contain them. This disadvantage of the THz TDS method can be overcome by using time-dependent THz pulse spectrum analysis. To assess the presence of the spectral features of a standard substance in the signal under analysis, one may use time-dependent integral correlation criteria. PMID:27070617

  13. Comparison of four different methods for detection of biofilm formation by uropathogens.

    PubMed

    Panda, Pragyan Swagatika; Chaudhary, Uma; Dube, Surya K

    2016-01-01

    Urinary tract infection (UTI) is one of the most common infectious diseases encountered in clinical practice. Emerging resistance of uropathogens to antimicrobial agents due to biofilm formation is a matter of concern when treating symptomatic UTI. However, studies comparing different methods for the detection of biofilm formation by uropathogens are scarce. The aim was to compare four different methods for detection of biofilm formation by uropathogens in a prospective observational study conducted in a tertiary care hospital. A total of 300 isolates from urinary samples were analyzed for biofilm formation by four methods, that is, the tissue culture plate (TCP) method, the tube method (TM), the Congo Red Agar (CRA) method and the modified CRA (MCRA) method. The Chi-square test was applied when two or more sets of variables were compared, and P < 0.05 was considered statistically significant. Considering TCP to be the gold standard method for our study, we calculated the other statistical parameters. The rate of biofilm detection was 45.6% by TCP, 39.3% by TM, and 11% each by the CRA and MCRA methods. The difference between TCP and CRA/MCRA was significant, but not that between TCP and TM. There was no difference in the rate of biofilm detection between CRA and MCRA for other isolates, but MCRA is superior to CRA for detection of staphylococcal biofilm formation. The TCP method is the ideal method for detection of bacterial biofilm formation by uropathogens; MCRA is superior to CRA only for detection of staphylococcal biofilm formation.

  14. [Quality of the Early Cervical Cancer Detection Program in the State of Nuevo León].

    PubMed

    Salinas-Martínez, A M; Villarreal-Ríos, E; Garza-Elizondo, M E; Fraire-Gloria, J M; López-Franco, J J; Barboza-Quintana, O

    1997-01-01

    To determine the quality of the Early Cervical Cancer Detection Program in the state of Nuevo León, a random selection of 4791 cytologic reports was analyzed, issued by the State Ministry of Health, the University Hospital and the Mexican Institute for Social Security early cervical cancer detection modules. Pap tests of women with hysterectomy, current pregnancy, menopause or a positive result were excluded. Quality was measured against previously defined standards. The analysis included, besides univariate statistics, tests of significance for proportions and means. The quality of the program was fairly satisfactory at the state level. The quality of the sampling procedure was low: 39.9% of the tests contained endocervical cells. Quality of coverage was low: 15.6% were women aged 25+ years with a first-time Pap test. Quality of opportunity was high: 8.5 +/- 7 weekdays elapsed between the date of the Pap smear and the interpretation date. Strategies are needed to increase the impact of the state program, such as improving the sampling procedure and coverage quality levels.

  15. Traumatic Brain Injury Detection Using Electrophysiological Methods

    PubMed Central

    Rapp, Paul E.; Keyser, David O.; Albano, Alfonso; Hernandez, Rene; Gibson, Douglas B.; Zambon, Robert A.; Hairston, W. David; Hughes, John D.; Krystal, Andrew; Nichols, Andrew S.

    2015-01-01

    Measuring neuronal activity with electrophysiological methods may be useful in detecting neurological dysfunctions, such as mild traumatic brain injury (mTBI). This approach may be particularly valuable for rapid detection in at-risk populations including military service members and athletes. Electrophysiological methods, such as quantitative electroencephalography (qEEG) and recording event-related potentials (ERPs) may be promising; however, the field is nascent and significant controversy exists on the efficacy and accuracy of the approaches as diagnostic tools. For example, the specific measures derived from an electroencephalogram (EEG) that are most suitable as markers of dysfunction have not been clearly established. A study was conducted to summarize and evaluate the statistical rigor of evidence on the overall utility of qEEG as an mTBI detection tool. The analysis evaluated qEEG measures/parameters that may be most suitable as fieldable diagnostic tools, identified other types of EEG measures and analysis methods of promise, recommended specific measures and analysis methods for further development as mTBI detection tools, identified research gaps in the field, and recommended future research and development thrust areas. The qEEG study group formed the following conclusions: (1) Individual qEEG measures provide limited diagnostic utility for mTBI. However, many measures can be important features of qEEG discriminant functions, which do show significant promise as mTBI detection tools. (2) ERPs offer utility in mTBI detection. In fact, evidence indicates that ERPs can identify abnormalities in cases where EEGs alone are non-disclosing. (3) The standard mathematical procedures used in the characterization of mTBI EEGs should be expanded to incorporate newer methods of analysis including non-linear dynamical analysis, complexity measures, analysis of causal interactions, graph theory, and information dynamics. (4) Reports of high specificity in q

  16. Traumatic brain injury detection using electrophysiological methods.

    PubMed

    Rapp, Paul E; Keyser, David O; Albano, Alfonso; Hernandez, Rene; Gibson, Douglas B; Zambon, Robert A; Hairston, W David; Hughes, John D; Krystal, Andrew; Nichols, Andrew S

    2015-01-01

    Measuring neuronal activity with electrophysiological methods may be useful in detecting neurological dysfunctions, such as mild traumatic brain injury (mTBI). This approach may be particularly valuable for rapid detection in at-risk populations including military service members and athletes. Electrophysiological methods, such as quantitative electroencephalography (qEEG) and recording event-related potentials (ERPs) may be promising; however, the field is nascent and significant controversy exists on the efficacy and accuracy of the approaches as diagnostic tools. For example, the specific measures derived from an electroencephalogram (EEG) that are most suitable as markers of dysfunction have not been clearly established. A study was conducted to summarize and evaluate the statistical rigor of evidence on the overall utility of qEEG as an mTBI detection tool. The analysis evaluated qEEG measures/parameters that may be most suitable as fieldable diagnostic tools, identified other types of EEG measures and analysis methods of promise, recommended specific measures and analysis methods for further development as mTBI detection tools, identified research gaps in the field, and recommended future research and development thrust areas. The qEEG study group formed the following conclusions: (1) Individual qEEG measures provide limited diagnostic utility for mTBI. However, many measures can be important features of qEEG discriminant functions, which do show significant promise as mTBI detection tools. (2) ERPs offer utility in mTBI detection. In fact, evidence indicates that ERPs can identify abnormalities in cases where EEGs alone are non-disclosing. (3) The standard mathematical procedures used in the characterization of mTBI EEGs should be expanded to incorporate newer methods of analysis including non-linear dynamical analysis, complexity measures, analysis of causal interactions, graph theory, and information dynamics. (4) Reports of high specificity in q

  17. Research and Design of Rootkit Detection Method

    NASA Astrophysics Data System (ADS)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

    Rootkits are one of the most important security issues in network communication systems and concern the security and privacy of Internet users. Because of back doors in the operating system, a hacker can use a rootkit to attack and invade other people's computers and thus easily capture passwords and message traffic to and from these computers. With the development of rootkit technology, its applications are more and more extensive and it becomes increasingly difficult to detect. In addition, for various reasons such as trade secrecy and the difficulty of development, information on rootkit detection technology and effective tools remain relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software designed based on the proposed structure is much more efficient than other rootkit detection software.

  18. Optimal DNA Isolation Method for Detection of Nontuberculous Mycobacteria by Polymerase Chain Reaction.

    PubMed

    Mohammadi, Samira; Esfahani, Bahram Nasr; Moghim, Sharareh; Mirhendi, Hossein; Zaniani, Fatemeh Riyahi; Safaei, Hajieh Ghasemian; Fazeli, Hossein; Salehi, Mahshid

    2017-01-01

    Nontuberculous mycobacteria (NTM) are a group of opportunistic pathogens that are widely dispersed in water and soil resources. Identification of mycobacterial isolates by conventional methods, including biochemical tests, growth rates, colony pigmentation, and the presence of acid-fast bacilli, is widely used, but these methods are time-consuming, labor-intensive, and may sometimes remain inconclusive. DNA was extracted from NTM cultures using CTAB, Chelex, Chelex + Nonidet P-40, FTA® Elute card, and boiling. The quantity and quality of the DNA extracted via these methods were determined using a UV photometer at 260 and 280 nm and by polymerase chain reaction (PCR) amplification of the heat-shock protein 65 gene with serially diluted DNA samples. The CTAB method showed more positive results at dilutions of 1:10-1:100,000, at which the DNA amount was substantial. With the Chelex method of DNA extraction, PCR amplification was detected at 1:10 and 1:1000 dilutions. According to the electrophoresis results, the CTAB and Chelex DNA extraction methods were more successful than the others in producing suitable concentrations of DNA with a minimum of PCR inhibitors.

  19. Systems and methods for data quality control and cleansing

    DOEpatents

    Wenzel, Michael; Boettcher, Andrew; Drees, Kirk; Kummer, James

    2016-05-31

    A method for detecting and cleansing suspect building automation system data is shown and described. The method includes using processing electronics to automatically determine which of a plurality of error detectors and which of a plurality of data cleansers to use with building automation system data. The method further includes using processing electronics to automatically detect errors in the data and cleanse the data using a subset of the error detectors and a subset of the cleansers.
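    The general pattern described in the abstract can be sketched as a small pipeline: choose a subset of error detectors, flag suspect samples, and hand the flags to a cleanser. The detectors, thresholds and interpolation cleanser below are illustrative assumptions, not the patented selection logic.

```python
# Toy detect-and-cleanse pipeline for a stream of building automation data.
import numpy as np

def detect_out_of_range(x, lo=-10.0, hi=60.0):
    return (x < lo) | (x > hi)                  # e.g. implausible temperatures

def detect_stuck(x, tol=1e-9, run=12):
    flags = np.zeros_like(x, dtype=bool)
    for i in range(run, len(x)):
        if np.all(np.abs(np.diff(x[i - run:i + 1])) < tol):
            flags[i - run:i + 1] = True         # sensor reporting a frozen value
    return flags

def cleanse_interpolate(x, flags):
    good = ~flags
    return np.interp(np.arange(len(x)), np.flatnonzero(good), x[good])

def run_pipeline(x, detectors, cleanser):
    flags = np.zeros_like(x, dtype=bool)
    for det in detectors:                       # union of the selected detectors
        flags |= det(x)
    return cleanser(x, flags), flags

data = np.array([21.0, 21.5, 22.0, 99.0, 22.4, 22.6, 22.7, 22.9, 23.1])
cleaned, flags = run_pipeline(data, [detect_out_of_range, detect_stuck],
                              cleanse_interpolate)
print(cleaned, flags)
```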

  20. Samsung Salmonella Detection Kit. AOAC Performance Tested Method(SM) 021203.

    PubMed

    Li, Jun; Cheung, Win Den; Opdyke, Jason; Harvey, John; Chong, Songchun; Moon, Cheol Gon

    2012-01-01

    Salmonella, one of the most common causes of foodborne illness, is a significant public health concern worldwide. There is a need in the food industry for methods that are simple, rapid, and sensitive for the detection of foodborne pathogens. In this study, the Samsung Salmonella Detection Kit, a real-time PCR assay for the detection of Salmonella, was evaluated according to the current AOAC guidelines. The validation consisted of lot-to-lot consistency, stability, robustness, and inclusivity/exclusivity studies, as well as a method comparison across 10 different food matrixes. In the validation, the Samsung Salmonella Detection Kit was used in conjunction with the Applied Biosystems StepOnePlus PCR system and the Samsung Food Testing Software for the detection of Salmonella species. The performance of the assays was compared to the U.S. Department of Agriculture/Food Safety and Inspection Service-Microbiology Laboratory Guidebook (USDA/FSIS-MLG) 4.05: Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg, and Catfish and the U.S. Food and Drug Administration/Bacteriological Analytical Manual (FDA/BAM) Chapter 5 Salmonella reference methods. The validation was conducted using an unpaired study design for detection of Salmonella spp. in raw ground beef, raw pork, raw ground pork, raw chicken wings, raw salmon, alfalfa sprouts, pasteurized orange juice, peanut butter, pasteurized whole milk, and shell eggs. The Samsung Salmonella Detection Kit demonstrated lot-to-lot consistency among three independent lots as well as ruggedness to minor changes in enrichment incubation time, enrichment incubation temperature, and DNA sample volume for the PCR reaction. Stability was observed for 13 months at -20 degrees C and 3 months at 5 degrees C. For the inclusivity/exclusivity study, the Samsung Salmonella Detection Kit correctly identified 147 Salmonella species isolates out of 147 isolates tested from each of three different enrichment

  1. Recent developments in optical detection methods for microchip separations.

    PubMed

    Götz, Sebastian; Karst, Uwe

    2007-01-01

    This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments utilizing laser-induced fluorescence (LIF) and lamp-based fluorescence along with recent applications of light-emitting diodes (LED) as excitation sources are also covered in this paper. Since chemiluminescence detection can be achieved using extremely simple devices which no longer require light sources and optical components for focusing and collimation, interesting approaches based on this technique are presented, too. Although UV/vis absorbance is a detection method that is commonly used in standard desktop electrophoresis and liquid chromatography instruments, it has not yet reached the same level of popularity for microchip applications. Current applications of UV/vis absorbance detection to microchip separations and innovative approaches that increase sensitivity are described. This article, which contains 85 references, focuses on developments and applications published within the last three years, points out exciting new approaches, and provides future perspectives on this field.

  2. Emergency First Responders' Experience with Colorimetric Detection Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandra L. Fox; Keith A. Daum; Carla J. Miller

    2007-10-01

    Nationwide, first responders from state and federal support teams respond to hazardous materials incidents, industrial chemical spills, and potential weapons of mass destruction (WMD) attacks. Although first responders have sophisticated chemical, biological, radiological, and explosive detectors available for assessment of the incident scene, simple colorimetric detectors have a role in response actions. The large number of colorimetric chemical detection methods available on the market can make the selection of the proper methods difficult. Although each detector has unique aspects to provide qualitative or quantitative data about the unknown chemicals present, not all detectors provide consistent, accurate, and reliable results. Included here, in a consumer-report-style format, we provide “boots on the ground” information directly from first responders about how well colorimetric chemical detection methods meet their needs in the field and how they procure these methods.

  3. Impact of fair bowel preparation quality on adenoma and serrated polyp detection: data from the New Hampshire colonoscopy registry by using a standardized preparation-quality rating.

    PubMed

    Anderson, Joseph C; Butterly, Lynn F; Robinson, Christina M; Goodrich, Martha; Weiss, Julia E

    2014-09-01

    The effect of colon preparation quality on adenoma detection rates (ADRs) is unclear, partly because of lack of uniform colon preparation ratings in prior studies. The New Hampshire Colonoscopy Registry collects detailed data from colonoscopies statewide, by using a uniform preparation quality scale after the endoscopist has cleaned the mucosa. To compare the overall and proximal ADR and serrated polyp detection rates (SDR) in colonoscopies with differing levels of colon preparation quality. Cross-sectional. New Hampshire statewide registry. Patients undergoing colonoscopy. We examined colon preparation quality for 13,022 colonoscopies, graded by using specific descriptions provided to endoscopists. ADR and SDR are the number of colonoscopies with at least 1 adenoma or serrated polyp (excluding those in the rectum and/or sigmoid colon) detected divided by the total number of colonoscopies, for the preparation categories: optimal (excellent and/or good), fair, and poor. Overall/proximal ADR/SDR. The overall detection rates in examinations with fair colon preparation quality (SDR 8.9%; 95% confidence interval [CI], 7.4-10.7, ADR 27.1%; 95% CI, 24.6-30.0) were similar to rates observed in colonoscopies with optimal preparation quality (SDR 8.8%; 95% CI, 8.3-9.4, ADR 26.3%; 95% CI, 25.6-27.2). This finding also was observed for rates in the proximal colon. A logistic regression model (including withdrawal time) found that proximal ADR was statistically lower in the poor preparation category (odds ratio 0.45; 95% CI, 0.24-0.84; P < .01) than in adequately prepared colons. Homogeneous population. In our sample, there was no significant difference in overall or proximal ADR or SDR between colonoscopies with fair versus optimal colon preparation quality. Poor colon preparation quality may reduce the proximal ADR. Published by Mosby, Inc.

  4. [An automatic peak detection method for LIBS spectrum based on continuous wavelet transform].

    PubMed

    Chen, Peng-Fei; Tian, Di; Qiao, Shu-Jun; Yang, Guang

    2014-07-01

    Spectrum peak detection in laser-induced breakdown spectroscopy (LIBS) is an essential step, but the presence of background and noise seriously disturbs the accuracy of peak positions. This paper proposes an automatic peak detection method for LIBS spectra intended to enhance the ability to find overlapping peaks and to improve adaptivity. We introduce the ridge peak detection method based on the continuous wavelet transform to LIBS, discuss the choice of the mother wavelet, and optimize the scale factor and the shift factor. The method also improves ridge peak detection with a ridge-correction step. The experimental results show that, compared with other peak detection methods (the direct comparison method, the derivative method and the ridge peak search method), our method has a significant advantage in the ability to distinguish overlapping peaks and in the precision of peak detection, and it can be applied to data processing in LIBS.
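    CWT ridge-based peak picking is available off the shelf in SciPy, which is enough to illustrate the idea on a synthetic spectrum with two overlapping peaks; the widths, the default Ricker wavelet and the synthetic data are assumptions, not the paper's optimized mother wavelet or its ridge-correction procedure.

```python
# CWT (ridge) peak detection on a synthetic spectrum with overlapping peaks.
import numpy as np
from scipy.signal import find_peaks_cwt

x = np.linspace(0, 100, 2000)
spectrum = (np.exp(-0.5 * ((x - 40) / 1.5) ** 2)            # peak 1
            + 0.7 * np.exp(-0.5 * ((x - 44) / 1.5) ** 2)     # overlapping peak 2
            + 0.05 * np.random.default_rng(1).normal(size=x.size))

# ridge detection over a range of wavelet scales (Ricker wavelet by default)
peak_idx = find_peaks_cwt(spectrum, widths=np.arange(5, 60))
print(x[peak_idx])
```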

  5. Defining Instructional Quality by Employing the Total Quality Management (TQM) Method: A Research Project.

    ERIC Educational Resources Information Center

    Croker, Robert E.; And Others

    The feasibility of using W. E. Deming's total quality management (TQM) method to define instructional quality was examined by surveying three groups of students attending Idaho State University's College of Education and School of Applied Technology: 31 students seeking cosmetology certification; 75 undergraduates pursuing degrees in corporate…

  6. Explosives detection system and method

    DOEpatents

    Reber, Edward L.; Jewell, James K.; Rohde, Kenneth W.; Seabury, Edward H.; Blackwood, Larry G.; Edwards, Andrew J.; Derr, Kurt W.

    2007-12-11

    A method of detecting explosives in a vehicle includes providing a first rack on one side of the vehicle, the rack including a neutron generator and a plurality of gamma ray detectors; providing a second rack on another side of the vehicle, the second rack including a neutron generator and a plurality of gamma ray detectors; providing a control system, remote from the first and second racks, coupled to the neutron generators and gamma ray detectors; using the control system, causing the neutron generators to generate neutrons; and performing gamma ray spectroscopy on spectra read by the gamma ray detectors to look for a signature indicative of presence of an explosive. Various apparatus and other methods are also provided.

  7. Apnea Detection Method for Cheyne-Stokes Respiration Analysis on Newborn

    NASA Astrophysics Data System (ADS)

    Niimi, Taiga; Itoh, Yushi; Natori, Michiya; Aoki, Yoshimitsu

    2013-04-01

    Cheyne-Stokes respiration is especially prevalent in preterm newborns, but its severity may not be recognized. It is characterized by apnea and cyclical weakening and strengthening of the breathing. We developed a method for detecting apnea and this abnormal respiration and for estimating its malignancy. Apnea was detected based on a "difference" feature (calculated from wavelet coefficients) and a modified maximum displacement feature (related to the respiratory waveform shape). The waveform is calculated from vertical motion of the thoracic and abdominal region during respiration using a vision sensor. Our proposed detection method effectively detects apnea (sensitivity 88.4%, specificity 99.7%).

  8. Detection of ESBL among ampc producing enterobacteriaceae using inhibitor-based method

    PubMed Central

    Bakthavatchalu, Sasirekha; Shakthivel, Uma; Mishra, Tannu

    2013-01-01

    Introduction: The occurrence of multiple β-lactamases among bacteria not only limits the therapeutic options but also poses a challenge. A study using boronic acid (BA), an AmpC enzyme inhibitor, was designed to detect the combined expression of AmpC β-lactamases and extended-spectrum β-lactamases (ESBLs) in bacterial isolates; in addition, different phenotypic methods for detecting ESBL and AmpC were compared. Methods: A total of 259 clinical isolates of Enterobacteriaceae were isolated and screened for ESBL production by (i) the CLSI double-disk diffusion method, (ii) the cefepime-clavulanic acid method, and (iii) the boronic acid disk potentiation method. AmpC production was detected using cefoxitin alone and in combination with boronic acid, and confirmation was done by the three-dimensional disk method. Isolates were also subjected to detailed antibiotic susceptibility testing. Results: Among the 259 isolates, 20.46% were co-producers of ESBL and AmpC, 26.45% were ESBL producers and 5.40% were AmpC producers. All 53 AmpC and ESBL co-producers were accurately detected by the boronic acid disk potentiation method. Conclusion: The BA disk test using Clinical and Laboratory Standards Institute methodology is a simple and very efficient method that accurately detects isolates harboring both AmpCs and ESBLs. PMID:23504148

  9. Methods of use for sensor based fluid detection devices

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor)

    2001-01-01

    Methods of use and devices for detecting analyte in fluid. A system for detecting an analyte in a fluid is described comprising a substrate having a sensor comprising a first organic material and a second organic material where the sensor has a response to permeation by an analyte. A detector is operatively associated with the sensor. Further, a fluid delivery appliance is operatively associated with the sensor. The sensor device has information storage and processing equipment, which is operably connected with the device. This device compares a response from the detector with a stored ideal response to detect the presence of analyte. An integrated system for detecting an analyte in a fluid is also described where the sensing device, detector, information storage and processing device, and fluid delivery device are incorporated in a substrate. Methods for use for the above system are also described where the first organic material and a second organic material are sensed and the analyte is detected with a detector operatively associated with the sensor. The method provides for a device, which delivers fluid to the sensor and measures the response of the sensor with the detector. Further, the response is compared to a stored ideal response for the analyte to determine the presence of the analyte. In different embodiments, the fluid measured may be a gaseous fluid, a liquid, or a fluid extracted from a solid. Methods of fluid delivery for each embodiment are accordingly provided.

  10. Systems and methods of detecting force and stress using tetrapod nanocrystal

    DOEpatents

    Choi, Charina L.; Koski, Kristie J.; Sivasankar, Sanjeevi; Alivisatos, A. Paul

    2013-08-20

    Systems and methods of detecting force on the nanoscale including methods for detecting force using a tetrapod nanocrystal by exposing the tetrapod nanocrystal to light, which produces a luminescent response by the tetrapod nanocrystal. The method continues with detecting a difference in the luminescent response by the tetrapod nanocrystal relative to a base luminescent response that indicates a force between a first and second medium or stresses or strains experienced within a material. Such systems and methods find use with biological systems to measure forces in biological events or interactions.

  11. Slot angle detecting method for fiber fixed chip

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaquan; Wang, Jiliang; Zhou, Chaochao

    2018-04-01

    The slot angle of the fiber fixed chip has a significant impact on the performance of photoelectric devices. In order to solve this practical engineering problem, this paper puts forward a detection method based on image processing. Because the images have very low contrast and are hard to segment, an image segmentation method based on edge characteristics is proposed. The edge line slope k2 of the fixed chip and the line slope k1 of the fiber fixing slot are then obtained and used to calculate the slot angle. Finally, tests of the repeatability and accuracy of the system show that the method is fast and robust and satisfies the practical requirements of slot angle detection for fiber fixed chips.
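    The final computation implied above is simply the angle between two fitted lines with slopes k1 and k2. The slope values in the sketch are made up for illustration; in practice they would come from the edge-fitting step.

```python
# Angle between two lines given their slopes k1 and k2 (in degrees).
import math

def slot_angle_deg(k1, k2):
    return math.degrees(math.atan(abs((k1 - k2) / (1.0 + k1 * k2))))

print(slot_angle_deg(k1=0.12, k2=0.05))   # about 4 degrees for these assumed slopes
```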

  12. A novel visual saliency detection method for infrared video sequences

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Zhang, Yuzhen; Ning, Chen

    2017-12-01

    Infrared video applications such as target detection and recognition, moving target tracking, and so forth can benefit a lot from visual saliency detection, which is essentially a method to automatically localize the "important" content in videos. In this paper, a novel visual saliency detection method for infrared video sequences is proposed. Specifically, for infrared video saliency detection, both the spatial saliency and temporal saliency are considered. For spatial saliency, we adopt a mutual consistency-guided spatial cues combination-based method to capture the regions with obvious luminance contrast and contour features. For temporal saliency, a multi-frame symmetric difference approach is proposed to discriminate salient moving regions of interest from background motions. Then, the spatial saliency and temporal saliency are combined to compute the spatiotemporal saliency using an adaptive fusion strategy. Besides, to highlight the spatiotemporal salient regions uniformly, a multi-scale fusion approach is embedded into the spatiotemporal saliency model. Finally, a Gestalt theory-inspired optimization algorithm is designed to further improve the reliability of the final saliency map. Experimental results demonstrate that our method outperforms many state-of-the-art saliency detection approaches for infrared videos under various backgrounds.
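    An illustrative sketch of the two ingredients described above, under assumed conventions: a spatial saliency map from local luminance contrast and a temporal map from a symmetric difference of neighbouring frames, fused with a simple adaptive weight. It is not the paper's model (no multi-scale fusion, mutual-consistency guidance or Gestalt optimization).

```python
# Spatial + temporal saliency for an infrared frame triple, with adaptive fusion.
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_saliency(frame, win=15):
    local_mean = uniform_filter(frame.astype(np.float32), win)
    sal = np.abs(frame - local_mean)                   # luminance contrast
    return sal / (sal.max() + 1e-6)

def temporal_saliency(prev, cur, nxt):
    d1 = np.abs(cur.astype(np.float32) - prev)
    d2 = np.abs(nxt.astype(np.float32) - cur)
    sal = np.minimum(d1, d2)                           # symmetric frame difference
    return sal / (sal.max() + 1e-6)

def fuse(s_spatial, s_temporal):
    # adaptive weight: trust the temporal map more when motion is strong
    w = s_temporal.mean() / (s_temporal.mean() + s_spatial.mean() + 1e-6)
    return w * s_temporal + (1 - w) * s_spatial
```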

  13. A method for detecting fungal contaminants in wall cavities.

    PubMed

    Spurgeon, Joe C

    2003-01-01

    This article describes a practical method for detecting the presence of both fungal spores and culturable fungi in wall cavities. Culturable fungi were collected in 25 mm cassettes containing 0.8 microm mixed cellulose ester filters using aggressive sampling conditions. Both culturable fungi and fungal spores were collected in modified slotted-disk cassettes. The sample volume was 4 L. The filters were examined microscopically and dilution plated onto multiple culture media. Collecting airborne samples in filter cassettes was an effective method for assessing wall cavities for fungal contaminants, especially because this method allowed the sample to be analyzed by both microscopy and culture media. Assessment criteria were developed that allowed the sample results to be used to classify wall cavities as either uncontaminated or contaminated. As a criterion, wall cavities with concentrations of culturable fungi below the limit of detection (LOD) were classified as uncontaminated, whereas those cavities with detectable concentrations of culturable fungi were classified as contaminated. A total of 150 wall cavities was sampled as part of a field project. The concentrations of culturable fungi were below the LOD in 34% of the samples, whereas Aspergillus and/or Penicillium were the only fungal genera detected in 69% of the samples in which culturable fungi were detected. Spore counting resulted in the detection of Stachybotrys-like spores in 25% of the samples that were analyzed, whereas Stachybotrys chartarum colonies were only detected on 2% of malt extract agar plates and on 6% of corn meal agar plates.

  14. Apparatus and method for detecting leaks in piping

    DOEpatents

    Trapp, Donald J.

    1994-01-01

    A method and device for detecting the location of leaks along a wall or piping system, preferably in double-walled piping. The apparatus comprises a sniffer probe, a rigid cord such as a length of tube attached to the probe at one end and extending out of the piping at the other end, a source of pressurized air and a source of helium. The method comprises guiding the sniffer probe into the inner pipe to its distal end, purging the inner pipe with pressurized air, filling the annulus defined between the inner and outer pipes with helium, and then detecting the presence of helium within the inner pipe with the probe as it is pulled back through the inner pipe. The length of the tube at the point where a leak is detected determines the location of the leak in the pipe.

  15. Video quality assessment method motivated by human visual perception

    NASA Astrophysics Data System (ADS)

    He, Meiling; Jiang, Gangyi; Yu, Mei; Song, Yang; Peng, Zongju; Shao, Feng

    2016-11-01

    Research on video quality assessment (VQA) plays a crucial role in improving the efficiency of video coding and the performance of video processing. It is well acknowledged that the motion energy model generates motion energy responses in the middle temporal area by simulating the receptive fields of neurons in V1 for the motion perception of the human visual system. Motivated by this biological evidence for visual motion perception, a VQA method is proposed in this paper which comprises a motion perception quality index and a spatial index. To be more specific, the motion energy model is applied to evaluate the temporal distortion severity of each frequency component generated from a difference-of-Gaussian filter bank, which produces the motion perception quality index, and a gradient similarity measure is used to evaluate the spatial distortion of the video sequence to obtain the spatial quality index. The experimental results on the LIVE, CSIQ, and IVP video databases demonstrate that the random forests regression technique trained on the generated quality indices corresponds closely to human visual perception and offers significant improvements over comparable well-performing methods. The proposed method has higher consistency with subjective perception and higher generalization capability.
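    A gradient-similarity spatial index of the kind mentioned above can be sketched in a few lines, in the spirit of gradient-magnitude similarity measures; the Sobel filters and the stabilizing constant are assumptions, not the paper's exact formulation, and the temporal (motion energy) part is omitted.

```python
# Per-frame gradient-similarity score between a reference and a distorted frame.
import numpy as np
from scipy.ndimage import sobel

def gradient_similarity(ref, dist, c=170.0):
    ref = ref.astype(np.float32)
    dist = dist.astype(np.float32)
    g_ref = np.hypot(sobel(ref, axis=0), sobel(ref, axis=1))
    g_dist = np.hypot(sobel(dist, axis=0), sobel(dist, axis=1))
    gs = (2 * g_ref * g_dist + c) / (g_ref ** 2 + g_dist ** 2 + c)
    return gs.mean()     # closer to 1 means the frame is closer to the reference
```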

  16. Analysis and detection of functional outliers in water quality parameters from different automated monitoring stations in the Nalón river basin (Northern Spain).

    PubMed

    Piñeiro Di Blasi, J I; Martínez Torres, J; García Nieto, P J; Alonso Fernández, J R; Díaz Muñiz, C; Taboada, J

    2015-01-01

    The purposes and intent of the authorities in establishing water quality standards are to provide enhancement of water quality and prevention of pollution to protect the public health or welfare in accordance with the public interest for drinking water supplies, conservation of fish, wildlife and other beneficial aquatic life, and agricultural, industrial, recreational, and other reasonable and necessary uses as well as to maintain and improve the biological integrity of the waters. In this way, water quality controls involve a large number of variables and observations, often subject to some outliers. An outlier is an observation that is numerically distant from the rest of the data or that appears to deviate markedly from other members of the sample in which it occurs. An interesting analysis is to find those observations that produce measurements that are different from the pattern established in the sample. Therefore, identification of atypical observations is an important concern in water quality monitoring and a difficult task because of the multivariate nature of water quality data. Our study provides a new method for detecting outliers in water quality monitoring parameters, using turbidity, conductivity and ammonium ion as indicator variables. Until now, methods were based on considering the different parameters as a vector whose components were their concentration values. The innovation of this approach lies in considering water quality monitoring over time as continuous curves instead of discrete points; that is to say, the dataset is treated as time-dependent functions rather than as sets of discrete values at different time instants. This new methodology, which is based on the concept of functional depth, was applied successfully to the detection of outliers in water quality monitoring samples in the Nalón river basin. Results of this study are discussed here in terms of origin, causes, etc. Finally, the conclusions as well as advantages of
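    As a toy illustration of depth-based functional outlier detection, the sketch below treats each monitoring day as a curve, uses distance to the pointwise median curve as a crude stand-in for functional depth, and flags the lowest-depth curves. The synthetic data, the depth proxy and the 5% cutoff are assumptions; the paper's functional depth definition is not reproduced.

```python
# Toy depth-based outlier screening for daily water-quality curves.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 24, 96)                               # one day, ~15-min sampling
curves = 5 + np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.2, (50, t.size))
curves[7] += 3.0 * np.exp(-0.5 * ((t - 12) / 1.0) ** 2)  # inject an anomalous day

median_curve = np.median(curves, axis=0)
dist = np.sqrt(np.mean((curves - median_curve) ** 2, axis=1))   # curve-to-centre distance
depth = 1.0 / (1.0 + dist)                               # low depth = far from the centre

cutoff = np.quantile(depth, 0.05)
outliers = np.flatnonzero(depth <= cutoff)
print("flagged days:", outliers)                         # should include day 7
```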

  17. Development of a low-cost detection method for miRNA microarray.

    PubMed

    Li, Wei; Zhao, Botao; Jin, Youxin; Ruan, Kangcheng

    2010-04-01

    MicroRNA (miRNA) microarray is a powerful tool to explore the expression profiling of miRNA. The detection method currently used in miRNA microarrays is mainly fluorescence based, which usually requires a costly detection system such as a laser confocal scanner costing tens of thousands of dollars. Recently, we developed a low-cost yet sensitive detection method for miRNA microarrays based on an enzyme-linked assay. In this approach, the biotinylated miRNAs were captured by the corresponding oligonucleotide probes immobilized on the microarray slide, and the biotinylated miRNAs then captured streptavidin-conjugated alkaline phosphatase. A purple-black precipitate on each biotinylated miRNA spot was produced by the enzyme-catalyzed reaction. It could be easily detected by a charge-coupled device digital camera mounted on a microscope, which lowers the detection cost more than 100-fold compared with that of the fluorescence method. Our data showed that the signal intensity of a spot correlates well with the biotinylated miRNA concentration, that the detection limit for miRNAs is at least 0.4 fmol, and that the detection dynamic range spans about 2.5 orders of magnitude, which is comparable to that of the fluorescence method.

  18. A review on detection methods used for foodborne pathogens

    PubMed Central

    Priyanka, B.; Patil, Rajashekhar K.; Dwarakanath, Sulatha

    2016-01-01

    Foodborne pathogens have been a cause of a large number of diseases worldwide, and more so in developing countries, with a major economic impact. It is important to contain them, and to do so, early detection is crucial. Detection and diagnostics initially relied on culture-based methods and have since developed in parallel towards immunological methods such as enzyme-linked immunosorbent assays (ELISA) and molecular biology-based methods such as the polymerase chain reaction (PCR). The aim has always been to find a rapid, sensitive, specific and cost-effective method; from the culturing of microbes to futuristic biosensor technology, the methods have shared this common goal. This review summarizes the recent trends and brings together the methods that have been developed over the years. PMID:28139531

  19. Early detection of Aspergillus carbonarius and A. niger on table grapes: a tool for quality improvement.

    PubMed

    Ayoub, F; Reverberi, M; Ricelli, A; D'Onghia, A M; Yaseen, T

    2010-09-01

    Aspergillus carbonarius and the A. niger aggregate are the main fungal contaminants of table grapes. Besides their ability to cause black rot, they can produce ochratoxin A (OTA), a mycotoxin that has attracted increasing attention worldwide. The objective of this work was to set up a simple and rapid molecular method for the early detection of both fungi in table grapes before fungal development becomes evident. Polymerase chain reaction (PCR)-based assays were developed by designing species-specific primers based on the polyketide synthase (PKS) sequences of A. carbonarius and A. niger that have recently been shown to be involved in OTA biosynthesis. Three table grape varieties (Red Globe, Crimson Seedless, and Italia) were inoculated with OTA-producing A. carbonarius and A. niger aggregate strains. The DNA extracted from control (non-inoculated) and inoculated grapes was amplified by PCR using ACPKS2F-ACPKS2R for A. carbonarius and ANPKS5-ANPKS6 for the A. niger aggregate. Both primer pairs allowed clear detection, even in symptomless samples. PCR-based methods are considered a good alternative to traditional diagnostic means for the early detection of fungi in complex matrices because of their high specificity and sensitivity. The results obtained could be useful for the definition of a 'quality label' for tested grapes to improve the safety measures taken to guarantee the production of fresh table grapes.

  20. Systems and methods for detection of blowout precursors in combustors

    DOEpatents

    Lieuwen, Tim C.; Nair, Suraj

    2006-08-15

    The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and a blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on the received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.
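    The patent claims name spectral, statistical, and wavelet analyses of combustor pressure data. The sketch below shows a simplified spectral-plus-statistical pass over a simulated pressure record; the simulated signal, the frequency band, and the threshold logic are assumptions for illustration only.

```python
# Illustrative sketch only: a simplified spectral + statistical pass over a
# simulated combustor pressure record, in the spirit of the analyses named above.
import numpy as np
from scipy.signal import welch
from scipy.stats import kurtosis

fs = 10_000                                   # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(2)
# Simulated pressure: a weak acoustic tone buried in broadband noise,
# a crude stand-in for conditions approaching lean blowout.
pressure = 0.05 * np.sin(2 * np.pi * 210 * t) + rng.normal(0, 0.5, t.size)

f, psd = welch(pressure, fs=fs, nperseg=2048)
tone_band = (f > 180) & (f < 240)
tonal_fraction = psd[tone_band].sum() / psd.sum()   # spectral indicator
k = kurtosis(pressure)                              # statistical indicator

# Hypothetical precursor logic: weak tonal content plus near-Gaussian statistics
if tonal_fraction < 0.05 and abs(k) < 0.5:
    print("possible blowout precursor")
else:
    print("no precursor indicated")
```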

  1. Rapid Methods for the Detection of General Fecal Indicators

    EPA Science Inventory

    Specified that EPA should develop: (1) appropriate and effective indicators for improving detection in a timely manner of pathogens in coastal waters; and (2) appropriate, accurate, expeditious and cost-effective methods for the timely detection of pathogens in coastal waters.

  2. Oil defect detection of electrowetting display

    NASA Astrophysics Data System (ADS)

    Chiang, Hou-Chi; Tsai, Yu-Hsiang; Yan, Yung-Jhe; Huang, Ting-Wei; Mang, Ou-Yang

    2015-08-01

    In recent years, the transparent display has become an emerging topic in display technologies, with applications in many fields such as mobile devices and shopping or advertising windows. The Electrowetting Display (EWD) is one kind of potential transparent display technology, with the advantages of high transmittance, fast response time, high contrast and rich color from its pigment-based oil system. In the mass production of Electrowetting Displays, oil defects should be found by an Automated Optical Inspection (AOI) detection system, which is useful for determining panel defects for quality control. Based on the research of our group, we propose an AOI detection mechanism that detects the different kinds of oil defects caused by oil overflow or material deterioration after oil coating or driving. We tested our mechanism with a 6-inch Electrowetting Display panel from ITRI, using an Epson V750 scanner with 1200 dpi resolution. Two AOI algorithms were developed: a high-speed method and a high-precision method. With the high-precision method, oil jumping and non-recovered oil defects can be detected successfully. This AOI detection mechanism can be used to evaluate oil uniformity in the EWD panel process. In the future, our AOI detection system can be used for quality control in panel manufacturing for mass production.

  3. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Treesearch

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  4. ICP-MS: Analytical Method for Identification and Detection of Elemental Impurities.

    PubMed

    Mittal, Mohini; Kumar, Kapil; Anghore, Durgadas; Rawal, Ravindra K

    2017-01-01

    The aim of this article is to review and discuss ICP-MS, a quantitative analytical method currently used for the quality control of pharmaceutical products. The ICP-MS technique has several applications, such as the determination of single elements, multi-element analysis in synthetic drugs, heavy metals in environmental water, and the trace element content of selected fertilizers and dairy manures. ICP-MS is also used for the determination of toxic and essential elements in different varieties of food samples and of metal pollutants present in the environment. Pharmaceuticals may generate impurities at various stages of development, transportation and storage, which can make them unsafe to administer. Thus, it is essential that these impurities be detected and quantified. ICP-MS plays an important role in the identification and detection of elemental impurities. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Current methods for detecting ethylene in plants

    PubMed Central

    Cristescu, Simona M.; Mandon, Julien; Arslanov, Denis; De Pessemier, Jérôme; Hermans, Christian; Harren, Frans J. M.

    2013-01-01

    Background: In view of ethylene's critical developmental and physiological roles, the gaseous hormone remains an active research topic for plant biologists. Progress has been made in understanding the ethylene biosynthesis pathway and the mechanisms of perception and action; still, numerous questions need to be answered and findings to be validated. Monitoring gas production will very often complete the picture of any ethylene research topic. Therefore, the search for suitable ethylene measuring methods for various plant samples, whether in the field, greenhouses, laboratories or storage facilities, is strongly motivated. Scope: This review presents an update of the current methods for ethylene monitoring in plants. It focuses on the three most-used methods – gas chromatography detection, electrochemical sensing and optical detection – and compares them in terms of sensitivity, selectivity, time response and price. Guidelines are provided for proper selection and application of the described sensor methodologies, and some specific applications of a laser-based detector are illustrated for monitoring ethylene given off by Arabidopsis thaliana upon various nutritional treatments. Conclusions: Each method has its advantages and limitations. The choice of a suitable ethylene sensor needs careful consideration and is driven by the requirements of the specific application. PMID:23243188

  6. Imaging quality analysis of computer-generated holograms using the point-based method and slice-based method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Chen, Siqing; Zheng, Huadong; Sun, Tao; Yu, Yingjie; Gao, Hongyue; Asundi, Anand K.

    2017-06-01

    Computer holography has made notable progress in recent years. The point-based method and the slice-based method are the chief algorithms for generating holograms for holographic display. Although both methods have been validated numerically and optically, the differences in the imaging quality of these methods have not been specifically analyzed. In this paper, we analyze the imaging quality of computer-generated phase holograms generated by point-based Fresnel zone plates (PB-FZP), the point-based Fresnel diffraction algorithm (PB-FDA) and the slice-based Fresnel diffraction algorithm (SB-FDA). The calculation formulas and hologram generation with the three methods are demonstrated. In order to suppress speckle noise, sequential phase-only holograms are generated in our work. The numerically and experimentally reconstructed images are also exhibited. By comparing the imaging quality, the merits and drawbacks of the three methods are analyzed, and conclusions are drawn.

  7. Machine Learning Methods for Attack Detection in the Smart Grid.

    PubMed

    Ozay, Mete; Esnaola, Inaki; Yarman Vural, Fatos Tunay; Kulkarni, Sanjeev R; Poor, H Vincent

    2016-08-01

    Attack detection problems in the smart grid are posed as statistical learning problems for different attack scenarios in which the measurements are observed in batch or online settings. In this approach, machine learning algorithms are used to classify measurements as being either secure or attacked. An attack detection framework is provided to exploit any available prior knowledge about the system and surmount constraints arising from the sparse structure of the problem in the proposed approach. Well-known batch and online learning algorithms (supervised and semisupervised) are employed with decision- and feature-level fusion to model the attack detection problem. The relationships between statistical and geometric properties of attack vectors employed in the attack scenarios and learning algorithms are analyzed to detect unobservable attacks using statistical learning methods. The proposed algorithms are examined on various IEEE test systems. Experimental analyses show that machine learning algorithms can detect attacks with performances higher than attack detection algorithms that employ state vector estimation methods in the proposed attack detection framework.
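    A minimal sketch of the basic idea of posing attack detection as supervised classification of measurement vectors, here with an off-the-shelf SVM; the linearized measurement model, dimensions, and attack model below are assumptions, not the paper's framework.

```python
# Minimal sketch: classify smart-grid measurement vectors as secure or attacked.
# The measurement model z = H x + noise (+ sparse bias attack) is illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_meas, n_state, n_samples = 20, 8, 2000
H = rng.normal(size=(n_meas, n_state))               # assumed measurement matrix

x = rng.normal(scale=0.3, size=(n_samples, n_state)) # system states
z = x @ H.T + rng.normal(scale=0.1, size=(n_samples, n_meas))

labels = rng.integers(0, 2, n_samples)               # 1 = attacked sample
attack = np.zeros((n_samples, n_meas))
attack[:, :3] = 1.0                                  # bias injected into three meters
z_observed = z + attack * labels[:, None]

z_tr, z_te, y_tr, y_te = train_test_split(z_observed, labels, random_state=0)
clf = SVC(kernel="rbf").fit(z_tr, y_tr)
print("held-out accuracy:", clf.score(z_te, y_te))
```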

  8. Infrared thermal integrity testing quality assurance test method to detect drilled shaft defects.

    DOT National Transportation Integrated Search

    2011-06-01

    Thermal integrity profiling uses the measured temperature generated in curing concrete to assess the quality of cast in place concrete foundations (i.e. drilled shafts or ACIP piles) which can include effective shaft size (diameter and length), anoma...

  9. [Selecting methods of controls concentration for internal quality control and continuity of control chart between different reagent lots for HBsAg qualitative detection].

    PubMed

    Li, Jin-ming; Zheng, Huai-jing; Wang, Lu-nan; Deng, Wei

    2003-04-01

    To establish a model for choosing controls with a suitable concentration for internal quality control (IQC) of qualitative ELISA detection, and a method for continuing the Levey-Jennings control chart when the reagent kit lot is changed. First, a series of control sera with 0.2, 0.5, 1.0, 2.0 and 5.0 ng/ml HBsAg, respectively, were assessed for within-run and between-run precision according to the NCCLS EP5 document. Then, a linear regression equation (y = bx + a) with the best correlation coefficient (r > 0.99) was established based on the S/CO values of the series of control sera. Finally, a control could be chosen for IQC use if its S/CO value calculated from the equation (y = bx + a), minus the product of that S/CO value and three times the between-run CV, was still greater than 1.0. For consecutive plotting on the Levey-Jennings control chart when the ELISA kit lot was changed, the new lot kits were used to detect the same series of HBsAg control sera as above, and a new linear regression equation (y2 = b2x2 + a2) with the best correlation coefficient was obtained; the old equation (y1 = b1x1 + a1) could be obtained from the mean values of the above precision assessment. The S/CO value of a control serum detected by the new kit lot could then be converted to that detected by the old kit lot based on the factor y2/y1, so that plotting on the original Levey-Jennings control chart could be continued. The within-run coefficients of variation (CV) of the ELISA method for control sera with 0.2, 0.5, 1.0, 2.0 and 5.0 ng/ml HBsAg were 11.08%, 9.49%, 9.83%, 9.18% and 7.25%, respectively, and the between-run CVs were 13.25%, 14.03%, 15.11%, 13.29% and 9.92%. The linear regression equation with the best correlation coefficient from a randomly chosen test was y = 3.509x + 0.180. The suitable concentration of control serum for IQC could be 0.5 ng/ml or 1.0 ng/ml. The linear regression equations from the old lot and two other new lots of the ELISA kits were y1 = 3.550(x1) + 0.226, y2 = 3.238(x2) + 0.388, and y3 = 3.428(x3) + 0
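    The two decisions described above lend themselves to a small numeric sketch using only the regression coefficients and CVs quoted in the abstract; the helper names, the direction of the lot-conversion ratio, and the example S/CO value are our assumptions.

```python
# Minimal numeric sketch of (1) choosing an IQC control concentration and
# (2) rescaling S/CO values across a kit-lot change, using figures quoted above.
import numpy as np

conc = np.array([0.2, 0.5, 1.0, 2.0, 5.0])              # ng/ml HBsAg control sera
between_run_cv = np.array([0.1325, 0.1403, 0.1511, 0.1329, 0.0992])

# Step 1: with the quoted fit y = 3.509*x + 0.180, a control is acceptable if
# its predicted S/CO minus three between-run CVs of itself still exceeds 1.0.
b, a = 3.509, 0.180
sco = b * conc + a
usable = sco - sco * 3 * between_run_cv > 1.0
print("acceptable control concentrations (ng/ml):", conc[usable])

# Step 2: continue the Levey-Jennings chart across a kit-lot change by rescaling
# new-lot S/CO values with the ratio of the two fitted lines
# (old lot y1 = 3.550*x + 0.226, new lot y2 = 3.238*x + 0.388).
def to_old_lot_scale(sco_new, x):
    y1 = 3.550 * x + 0.226
    y2 = 3.238 * x + 0.388
    return sco_new * (y1 / y2)   # our reading of the y2/y1 conversion factor

print("new-lot S/CO 3.5 at 1.0 ng/ml plots as", round(to_old_lot_scale(3.5, 1.0), 2))
```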

  10. Detection and quantification limits of the EPA Enterococcus qPCR method

    EPA Science Inventory

    The U.S. EPA will be recommending a quantitative polymerase chain reaction (qPCR) method targeting Enterococcus spp. as an option for monitoring recreational beach water quality in 2013 and has published preliminary proposed water quality criteria guidelines for the method. An im...

  11. Comparing Two Methods for Reducing Variability in Voice Quality Measurements

    ERIC Educational Resources Information Center

    Kreiman, Jody; Gerratt, Bruce R.

    2011-01-01

    Purpose: Interrater disagreements in ratings of quality plague the study of voice. This study compared 2 methods for handling this variability. Method: Listeners provided multiple breathiness ratings for 2 sets of pathological voices, one including 20 male and 20 female voices unselected for quality and one including 20 breathy female voices.…

  12. Near-infrared microscopic methods for the detection and quantification of processed by-products of animal origin

    NASA Astrophysics Data System (ADS)

    Abbas, O.; Fernández Pierna, J. A.; Dardenne, P.; Baeten, V.

    2010-04-01

    Since the BSE crisis, research has mainly concerned the detection, identification, and quantification of meat and bone meal, with an important focus on the development of new analytical methods. Microscopy-based spectroscopic methods (NIR microscopy, NIRM, and/or NIR hyperspectral imaging) have been proposed as complementary methods to the official method, optical microscopy. NIR spectroscopy offers the advantage of being rapid, accurate and independent of the analyst's skills. The combination of an NIR detector with a microscope or a camera allows the collection of high-quality spectra from small feed particles larger than 50 μm. Several studies have demonstrated the clear potential of NIR microscopic methods for the detection of animal particles in both raw and sediment fractions. Samples are sieved and only the coarse fraction (larger than 250 μm) is investigated. The proposed methodologies have been developed to assure, with an acceptable level of confidence (95%), the detection of at least one animal particle when a feed sample is adulterated at a level of 0.1%. NIRM and NIR hyperspectral imaging have been running under ISO 17025 accreditation at CRA-W since 2005. A quantitative NIRM approach has been developed in order to fulfil the new requirements of European Commission policies. The capabilities of the NIRM method have been improved: only the raw fraction is analyzed, both the coarse and the fine fractions of the samples are considered, and the acquisition parameters (the aperture, the gap, and the composition of the animal feed) are optimized. A mapping method for faster collection of spectra has also been developed. The aim of this work is to present the new advances in the analytical methods developed in the framework of the feed ban applied in Europe.

  13. Methods for detection of GMOs in food and feed.

    PubMed

    Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca

    2008-10-01

    This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.

  14. Method and apparatus for vapor detection

    NASA Technical Reports Server (NTRS)

    Lerner, Melvin (Inventor); Hood, Lyal V. (Inventor); Rommel, Marjorie A. (Inventor); Pettitt, Bruce C. (Inventor); Erikson, Charles M. (Inventor)

    1980-01-01

    The method disclosed herein may be practiced by passing the vapors to be sampled along a path with halogen vapor, preferably chlorine vapor, heating the mixed vapors to halogenate those of the sampled vapors subject to halogenation, removing unreacted halogen vapor, and then sensing the vapors for organic halogenated compounds. The apparatus disclosed herein comprises means for flowing the vapors, both sample and halogen vapors, into a common path, means for heating the mixed vapors to effect the halogenation reaction, means for removing unreacted halogen vapor, and a sensing device for sensing halogenated compounds. By such a method and means, the vapors of low molecular weight hydrocarbons, ketones and alcohols, when present, such as methane, ethane, acetone, ethanol, and the like are converted, at least in part, to halogenated compounds, then the excess halogen removed or trapped, and the resultant vapors of the halogenated compounds sensed or detected. The system is highly sensitive. For example, acetone in a concentration of 30 parts per billion (volume) is readily detected.

  15. Cost-Effectiveness Analysis of Three Leprosy Case Detection Methods in Northern Nigeria

    PubMed Central

    Ezenduka, Charles; Post, Erik; John, Steven; Suraj, Abdulkarim; Namadi, Abdulahi; Onwujekwe, Obinna

    2012-01-01

    Background: Despite several leprosy control measures in Nigeria, the proportion of child cases and of grade 2 disability cases remains high while new cases have not significantly decreased, suggesting continuous spread of the disease. Hence, there is a need to review detection methods to enhance identification of early cases for effective control and prevention of permanent disability. This study evaluated the cost-effectiveness of three leprosy case detection methods in Northern Nigeria to identify the most cost-effective approach for the detection of leprosy. Methods: A cross-sectional study was carried out to evaluate the additional benefits of using several case detection methods in addition to routine practice in two north-eastern states of Nigeria. Primary and secondary data were collected from routine practice records and the Nigerian Tuberculosis and Leprosy Control Programme of 2009. The methods evaluated were Rapid Village Survey (RVS), Household Contact Examination (HCE) and the Traditional Healers incentive method (TH). Effectiveness was measured as the number of new leprosy cases detected, and cost-effectiveness was expressed as cost per case detected. Costs were measured from both the providers' and the patients' perspectives. Additional costs and effects of each method were estimated by comparing each method against routine practice and expressed as an incremental cost-effectiveness ratio (ICER). All costs were converted to U.S. dollars at the 2010 exchange rate. Univariate sensitivity analysis was used to evaluate uncertainties around the ICER. Results: The ICER for HCE was $142 per additional case detected at all contact levels, and it was the most cost-effective method. At an ICER of $194 per additional case detected, the TH method detected more cases at a lower cost than RVS, which was not cost-effective at $313 per additional case detected. Sensitivity analysis showed that varying the proportion of shared costs and the subsistence wage used for valuing unpaid time did not significantly change the results.
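    For clarity, the ICER arithmetic used above reduces to a one-line calculation; the numbers below are illustrative only and are not the study's cost or case counts.

```python
# Tiny sketch of the incremental cost-effectiveness ratio (ICER) arithmetic.
def icer(extra_cost_usd, extra_cases_detected):
    """Incremental cost per additional case detected, relative to routine practice."""
    return extra_cost_usd / extra_cases_detected

# e.g., a method costing $14,200 more than routine practice and finding
# 100 additional cases yields $142 per additional case detected.
print(icer(14_200, 100))
```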

  16. Automated detection of hospital outbreaks: A systematic review of methods.

    PubMed

    Leclère, Brice; Buckeridge, David L; Boëlle, Pierre-Yves; Astagneau, Pascal; Lepelletier, Didier

    2017-01-01

    Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results.

  17. Thin Cloud Detection Method by Linear Combination Model of Cloud Image

    NASA Astrophysics Data System (ADS)

    Liu, L.; Li, J.; Wang, Y.; Xiao, Y.; Zhang, W.; Zhang, S.

    2018-04-01

    Existing cloud detection methods in photogrammetry often extract image features directly from remote sensing images and then use them to classify images into cloud or other classes, but when the cloud is thin and small these methods become inaccurate. In this paper, a linear combination model of cloud images is proposed; by using this model, the underlying surface information of remote sensing images can be removed, so the cloud detection result becomes more accurate. The automatic cloud detection program first uses the linear combination model to separate the cloud information from the surface information in transparent (thin) cloud images, and then uses different image features to recognize the cloud parts. In consideration of computational efficiency, an AdaBoost classifier was introduced to combine the different features into a cloud classifier; AdaBoost can select the most effective features from many ordinary features, so the calculation time is largely reduced. Finally, we selected a cloud detection method based on a tree structure and a multiple-feature detection method using an SVM classifier for comparison with the proposed method; the experimental data show that the proposed cloud detection program has high accuracy and fast calculation speed.
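    A minimal sketch of the AdaBoost feature-combination step described above, using synthetic per-patch features; the feature choices, labelling rule, and data are assumptions, not the authors' implementation.

```python
# Minimal sketch: combine several per-patch image features with AdaBoost to
# label cloud vs. surface. Features and labels here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 2000
# Hypothetical feature columns: brightness, local variance, gradient energy
brightness = rng.uniform(0, 1, n)
variance = rng.uniform(0, 1, n)
gradient = rng.uniform(0, 1, n)
X = np.column_stack([brightness, variance, gradient])
# Toy labelling rule: bright, low-texture patches behave like (thin) cloud
y = ((brightness > 0.6) & (variance < 0.4)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = AdaBoostClassifier(n_estimators=100, random_state=0)  # stump base learners
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
print("feature importances:", clf.feature_importances_)
```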

  18. Apparatus and method for detecting leaks in piping

    DOEpatents

    Trapp, D.J.

    1994-12-27

    A method and device are disclosed for detecting the location of leaks along a wall or piping system, preferably in double-walled piping. The apparatus comprises a sniffer probe, a rigid cord such as a length of tube attached to the probe at one end and extending out of the piping at the other end, a source of pressurized air and a source of helium. The method comprises guiding the sniffer probe into the inner pipe to its distal end, purging the inner pipe with pressurized air, filling the annulus defined between the inner and outer pipes with helium, and then detecting the presence of helium within the inner pipe with the probe as it is pulled back through the inner pipe. The length of the tube at the point where a leak is detected determines the location of the leak in the pipe. 2 figures.

  19. Microbial detection method based on sensing molecular hydrogen

    NASA Technical Reports Server (NTRS)

    Wilkins, J. R.; Stoner, G. E.; Boykin, E. H.

    1974-01-01

    A simple method for detecting bacteria, based on the time of hydrogen evolution, was developed and tested against various members of the Enterobacteriaceae group. The test system consisted of (1) two electrodes, platinum and a reference electrode, (2) a buffer amplifier, and (3) a strip-chart recorder. Hydrogen evolution was measured by an increase in voltage in the negative (cathodic) direction. A linear relationship was established between inoculum size and the time hydrogen was detected (lag period). Lag times ranged from 1 h for 1 million cells/ml to 7 h for 1 cell/ml. For each 10-fold decrease in inoculum, length of the lag period increased 60 to 70 min. Based on the linear relationship between inoculum and lag period, these results indicate the potential application of the hydrogen-sensing method for rapidly detecting coliforms and other gas-producing microorganisms in a variety of clinical, food, and other samples.
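    A small sketch of how the reported linear lag-time/inoculum relationship could be used as a calibration curve; the exact lag values below are assumptions consistent with the ranges quoted in the abstract, not the original data.

```python
# Sketch of a calibration line: lag time (h) vs. log10(inoculum), roughly 1 h at
# 10^6 cells/ml and about one extra hour per 10-fold dilution, as reported above.
import numpy as np

log10_cells = np.array([0, 1, 2, 3, 4, 5, 6])                # cells/ml, 1 to 10^6
lag_hours = np.array([7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0])    # assumed calibration points

slope, intercept = np.polyfit(log10_cells, lag_hours, 1)

def estimate_log10_count(observed_lag_hours):
    """Invert the calibration line to estimate log10(cells/ml) from a lag time."""
    return (observed_lag_hours - intercept) / slope

print("slope (h per decade):", slope)
print("estimated log10 count for a 4.5 h lag:", estimate_log10_count(4.5))
```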

  20. Correlation fluorescence method of amine detection

    NASA Astrophysics Data System (ADS)

    Myslitsky, Valentin F.; Tkachuk, Svetlana S.; Rudeichuk, Volodimir M.; Strinadko, Miroslav T.; Slyotov, Mikhail M.; Strinadko, Marina M.

    1997-12-01

    The fluorescence spectra of amines stimulated by UV laser radiation are investigated in this paper. The fluorescence is stimulated by a coherent laser beam with a wavelength of 0.337 μm. At sufficient laser excitation energy, narrow peaks are detected in the fluorescence spectra in addition to the broad maximum. The relationship between the fluorescence intensity and the concentration of the amine solutions is investigated. The temporal dependence of the fluorescence intensity at a wavelength of 0.363 μm was found for a norepinephrine solution previously irradiated by the 0.337 μm UV laser. Computer-simulated and experimental investigations of the fluorescence spectra of adrenaline and norepinephrine mixtures were carried out. A correlation fluorescence method for amine detection is proposed.

  1. 40 CFR 280.43 - Methods of release detection for tanks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... two consecutive stick readings at both the beginning and ending of the period; (3) The equipment used... UNDERGROUND STORAGE TANKS (UST) Release Detection § 280.43 Methods of release detection for tanks. Each method... of the tank is made to the nearest one-eighth of an inch at least once a month. Note: Practices...

  2. 40 CFR 280.43 - Methods of release detection for tanks.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... two consecutive stick readings at both the beginning and ending of the period; (3) The equipment used... UNDERGROUND STORAGE TANKS (UST) Release Detection § 280.43 Methods of release detection for tanks. Each method... of the tank is made to the nearest one-eighth of an inch at least once a month. Note: Practices...

  3. Effect of freezing method and frozen storage duration on instrumental quality of lamb throughout display.

    PubMed

    Muela, E; Sañudo, C; Campo, M M; Medel, I; Beltrán, J A

    2010-04-01

    This study evaluated the effect of freezing method (FM) (air blast freezer, freezing tunnel, or nitrogen chamber) and frozen storage duration (FSD) (1, 3, or 6 months) on instrumental measurements of the quality of thawed lamb, aged for a total of 72 h, throughout a 10-d display period, compared to the quality of fresh meat. pH, colour, lipid oxidation, and thawing and cooking losses in the Longissimus thoracis and lumborum muscle were determined following standard methods. FM affected yellowness, FSD affected redness and thawing losses, and both affected oxidation (which increased as freezing rate decreased and/or as storage duration increased). When compared with fresh meat, the main differences appeared in oxidation (where a significant interaction between treatment (3 FM x 3 FSD + fresh meat) and display duration was detected) and in total losses (thaw + cook losses). Oxidation was lower in fresh meat, but values were not significantly different from those of meat stored frozen for 1 month. Fresh meat had smaller total losses than did thawed meat, but losses were not significantly different from those of meat frozen in the freezing tunnel and stored frozen for 1 month. Display duration had a greater effect on instrumental quality parameters than did FM or FSD. pH, b*, and oxidation increased, and L* and a* decreased, with an increasing number of days on display. In conclusion, neither freezing method nor frozen storage for up to 6 months extensively influenced the properties of lamb when instrumental measurements of quality were made on meat that had been displayed for 1 d after thawing. The small deterioration shown in this study should not give consumers concerns about frozen meat. 2009 Elsevier Ltd. All rights reserved.

  4. Radial line method for rear-view mirror distortion detection

    NASA Astrophysics Data System (ADS)

    Rahmah, Fitri; Kusumawardhani, Apriani; Setijono, Heru; Hatta, Agus M.; Irwansyah, .

    2015-01-01

    An image of an object can be distorted by a defect in a mirror. The rear-view mirror is an important component for vehicle safety, and one of its standard parameters is the distortion factor. This paper presents a radial line method for distortion detection in rear-view mirrors. The rear-view mirror was tested for distortion using a system consisting of a webcam sensor and an image-processing unit. In the image-processing unit, the captured image from the webcam was pre-processed using smoothing and sharpening techniques, and then the radial line method was used to determine the distortion factor. It was successfully demonstrated that the radial line method can be used to determine the distortion factor. This detection system is useful for implementation in, for example, the Indonesian automotive component industry, where manual inspection is still used.

  5. A deep learning approach for fetal QRS complex detection.

    PubMed

    Zhong, Wei; Liao, Lijuan; Guo, Xuemei; Wang, Guoli

    2018-04-20

    Non-invasive foetal electrocardiography (NI-FECG) has the potential to provide additional clinical information for detecting and diagnosing fetal diseases. We propose and demonstrate a deep learning approach for fetal QRS complex detection from raw NI-FECG signals using a convolutional neural network (CNN) model. The main objective is to investigate whether reliable fetal QRS complex detection performance can be obtained from features of single-channel NI-FECG signals without cancelling the maternal ECG (MECG). Firstly, we collect data from set-a of the PhysioNet/Computing in Cardiology Challenge database. The sample entropy method is used for signal quality assessment, and part of the poor-quality signals is excluded from further analysis. Secondly, the features of the raw NI-FECG signals are normalized before they are fed to a CNN classifier that performs fetal QRS complex detection. We use precision, recall, F-measure and accuracy as evaluation metrics. The proposed deep learning method achieves relatively high precision (75.33%), recall (80.54%), and F-measure (77.85%) compared with three other well-known pattern classification methods, namely KNN, naive Bayes and SVM. The proposed deep learning method can thus attain reliable fetal QRS complex detection from raw NI-FECG signals without cancelling MECG signals. In addition, the influence of different activation functions and of the signal quality assessment step on classification performance is evaluated; results show that ReLU outperforms sigmoid and tanh on this task, and that better classification performance is obtained with the signal quality assessment step.
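    A minimal sketch (in PyTorch) of a small 1-D CNN that labels fixed-length NI-FECG windows as containing a fetal QRS complex or not, in the spirit of the approach above; the window length, layer sizes, and random stand-in data are assumptions, not the authors' network.

```python
# Minimal sketch, not the authors' network: a small 1-D CNN for binary
# classification of single-channel NI-FECG windows (QRS / no QRS).
import torch
import torch.nn as nn

class QRSNet(nn.Module):
    def __init__(self, window_len=100):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * (window_len // 4), 2)

    def forward(self, x):              # x: (batch, 1, window_len), normalized
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = QRSNet()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in data: in practice these would be normalized NI-FECG windows and
# labels derived from reference fetal QRS annotations.
x = torch.randn(64, 1, 100)
y = torch.randint(0, 2, (64,))

for _ in range(5):                     # a few illustrative training steps
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
print("final training loss:", loss.item())
```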

  6. Median Filtering Methods for Non-volcanic Tremor Detection

    NASA Astrophysics Data System (ADS)

    Damiao, L. G.; Nadeau, R. M.; Dreger, D. S.; Luna, B.; Zhang, H.

    2016-12-01

    Various properties of median filtering over time and space are used to address challenges posed by the Non-volcanic tremor detection problem. As part of a "Big-Data" effort to characterize the spatial and temporal distribution of ambient tremor throughout the Northern San Andreas Fault system, continuous seismic data from multiple seismic networks with contrasting operational characteristics and distributed over a variety of regions are being used. Automated median filtering methods that are flexible enough to work consistently with these data are required. Tremor is characterized by a low-amplitude, long-duration signal-train whose shape is coherent at multiple stations distributed over a large area. There are no consistent phase arrivals or mechanisms in a given tremor's signal and even the durations and shapes among different tremors vary considerably. A myriad of masquerading noise, anthropogenic and natural-event signals must also be discriminated in order to obtain accurate tremor detections. We present here results of the median methods applied to data from four regions of the San Andreas Fault system in northern California (Geysers Geothermal Field, Napa, Bitterwater and Parkfield) to illustrate the ability of the methods to detect tremor under diverse conditions.
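    One way median filtering helps in this setting is that a long running median of a rectified trace tracks a slowly varying, long-duration tremor envelope while rejecting short impulsive arrivals. The sketch below illustrates that idea on a synthetic trace; the window lengths, threshold, and signal model are assumptions, not the authors' detector.

```python
# Illustrative sketch: a running median of the rectified trace follows an
# emergent tremor-like signal but ignores a single impulsive spike.
import numpy as np
from scipy.signal import medfilt

fs = 20.0                                       # samples per second (assumed)
t = np.arange(0, 600, 1 / fs)                   # 10 minutes of data
rng = np.random.default_rng(5)

trace = rng.normal(0, 1.0, t.size)              # background noise
trace[4000:8000] += rng.normal(0, 3.0, 4000)    # emergent, long-duration "tremor"
trace[9000] += 200.0                            # impulsive spike to be rejected

envelope = np.abs(trace)
smooth = medfilt(envelope, kernel_size=int(30 * fs) + 1)   # ~30 s running median

threshold = 2.0 * np.median(smooth)
detections = smooth > threshold
print("detected samples between", detections.argmax() / fs, "s and",
      (len(detections) - detections[::-1].argmax()) / fs, "s")
```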

  7. Methods of detecting and counting raptors: A review

    USGS Publications Warehouse

    Fuller, M.R.; Mosher, J.A.; Ralph, C. John; Scott, J. Michael

    1981-01-01

    Most raptors are wide-ranging, secretive, and occur at relatively low densities. These factors, in conjunction with the nocturnal activity of owls, cause the counting of raptors by most standard census and survey efforts to be very time-consuming and expensive. This paper reviews the most common methods of detecting and counting raptors. It is hoped that it will be of use to the ever-increasing number of biologists, land-use planners, and managers who must determine the occurrence, density, or population dynamics of raptors. Road counts of fixed station or continuous transect design are often used to sample large areas. Detection of spontaneous or elicited vocalizations, especially those of owls, provides a means of detecting and estimating raptor numbers. Searches for nests are accomplished from foot surveys, observations from automobiles and boats, or from aircraft when nest structures are conspicuous (e.g., Osprey). Knowledge of nest habitat, historic records, and inquiries of local residents are useful for locating nests. Often several of these techniques are combined to help find nest sites. Aerial searches have also been used to locate or count large raptors (e.g., eagles), or those that may be conspicuous in open habitats (e.g., tundra). Counts of birds entering or leaving nest colonies or colonial roosts have been attempted on a limited basis. Results from Christmas Bird Counts have provided an index of the abundance of some species. Trapping and banding generally have proven to be an inefficient method of detecting raptors or estimating their populations. Concentrations of migrants at strategically located points around the world afford the best opportunity to count many raptors in a relatively short period of time, but the influence of many unquantified variables has inhibited extensive interpretation of these counts. Few data exist to demonstrate the effectiveness of these methods. We believe more research on sampling techniques, rather than complete

  8. Methods, systems and devices for detecting and locating ferromagnetic objects

    DOEpatents

    Roybal, Lyle Gene [Idaho Falls, ID; Kotter, Dale Kent [Shelley, ID; Rohrbaugh, David Thomas [Idaho Falls, ID; Spencer, David Frazer [Idaho Falls, ID

    2010-01-26

    Methods for detecting and locating ferromagnetic objects in a security screening system. One method includes a step of acquiring magnetic data that includes magnetic field gradients detected during a period of time. Another step includes representing the magnetic data as a function of the period of time. Another step includes converting the magnetic data to being represented as a function of frequency. Another method includes a step of sensing a magnetic field for a period of time. Another step includes detecting a gradient within the magnetic field during the period of time. Another step includes identifying a peak value of the gradient detected during the period of time. Another step includes identifying a portion of time within the period of time that represents when the peak value occurs. Another step includes configuring the portion of time over the period of time to represent a ratio.
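    A rough sketch of the signal-processing steps named in the claims (gradient over time, conversion to the frequency domain, peak value and peak-time ratio); the simulated gradiometer record and sampling rate are assumptions.

```python
# Illustrative sketch of the claimed steps: represent the magnetic gradient as a
# function of time, convert it to frequency, and find the peak value and the
# portion of the period at which it occurs.
import numpy as np

fs = 200.0                                    # samples per second (assumed)
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(6)

# Simulated gradient: a transient bump as a ferromagnetic object passes the sensor
gradient = 5e-9 * np.exp(-((t - 2.5) / 0.2) ** 2) + rng.normal(0, 2e-10, t.size)

spectrum = np.abs(np.fft.rfft(gradient))      # magnetic data as a function of frequency
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak_value = gradient.max()                   # peak gradient during the period
peak_time = t[gradient.argmax()]
time_ratio = peak_time / t[-1]                # portion of the period at which it occurs

print("dominant frequency (Hz):", freqs[spectrum[1:].argmax() + 1])
print("peak gradient (T/m):", peak_value, "at ratio", round(time_ratio, 2))
```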

  9. Position detectors, methods of detecting position, and methods of providing positional detectors

    DOEpatents

    Weinberg, David M.; Harding, L. Dean; Larsen, Eric D.

    2002-01-01

    Position detectors, welding system position detectors, methods of detecting various positions, and methods of providing position detectors are described. In one embodiment, a welding system positional detector includes a base that is configured to engage and be moved along a curved surface of a welding work piece. At least one position detection apparatus is provided and is connected with the base and configured to measure angular position of the detector relative to a reference vector. In another embodiment, a welding system positional detector includes a weld head and at least one inclinometer mounted on the weld head. The one inclinometer is configured to develop positional data relative to a reference vector and the position of the weld head on a non-planar weldable work piece.

  10. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems; a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. As well, the methodology and results for background rejection methods optimized for the aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.

  11. Wavelength band selection method for multispectral target detection.

    PubMed

    Karlholm, Jörgen; Renhorn, Ingmar

    2002-11-10

    A framework is proposed for the selection of wavelength bands for multispectral sensors by use of hyperspectral reference data. Using results from detection theory, we derive a cost function that is minimized by a set of spectral bands that is optimal, in terms of detection performance, for discrimination between a class of small rare targets and clutter with a known spectral distribution. The method may be used, e.g., in the design of multispectral infrared search-and-track and electro-optical missile warning sensors, where a low false-alarm rate and a high detection probability for small targets against a clutter background are of critical importance, but where the required high frame rate prevents the use of hyperspectral sensors.

  12. Evaluating the Good Ontology Design Guideline (GoodOD) with the Ontology Quality Requirements and Evaluation Method and Metrics (OQuaRE)

    PubMed Central

    Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás

    2014-01-01

    Objective: To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. Background: In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. Methods: In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained for the previous studies based on the same data. Results: Our results show a significant effect of the GoodOD training over developed ontologies by topics: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. Conclusion: The GoodOD guideline had a significant effect over the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies. PMID:25148262

  13. Detection method of visible and invisible nipples on digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Chae, Seung-Hoon; Jeong, Ji-Wook; Lee, Sooyeul; Chae, Eun Young; Kim, Hak Hee; Choi, Young-Wook

    2015-03-01

    Digital breast tomosynthesis (DBT), which provides a 3D breast image, can improve the detection sensitivity for breast cancer over 2D mammography in dense breasts. The nipple location is needed to analyze DBT: it is invaluable information for registration and as a reference point for classifying masses or micro-calcification clusters. Since a nipple may be either visible or invisible in a 2D mammogram or DBT volume, nipple detection must handle both cases. Detecting a visible nipple using its shape information is simple and highly efficient; however, it is difficult to detect an invisible nipple because it has no prominent shape. Anatomically, the mammary glands in the breast connect to the nipple, so the nipple location can be found by analyzing the location of the mammary glands. In this paper, we therefore propose a method that detects the nipple on a breast with either a visible or an invisible nipple, using changes in the breast area and the mammary glands, respectively. The results show that our proposed method has an average error of 2.54 ± 1.47 mm.

  14. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    NASA Astrophysics Data System (ADS)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

    The marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the use of the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique for detecting irradiated MPV. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, with increasing dose, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually, so the difference between the two counts gradually increased with dose in all samples. It could be suggested that a DEFT/APC difference of over 2.0 log would be a criterion for judging whether an MPV had been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
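    The screening criterion stated above reduces to a simple log10 difference; a tiny sketch with illustrative counts follows.

```python
# Tiny sketch of the DEFT/APC screening criterion: a difference greater than
# 2.0 log10 units flags a sample as likely irradiated. Counts are illustrative.
import math

def deft_apc_difference(deft_count, apc_count):
    return math.log10(deft_count) - math.log10(apc_count)

diff = deft_apc_difference(1.0e7, 5.0e4)      # e.g., DEFT 10^7, APC 5x10^4 cfu/g
print("log difference:", round(diff, 2), "-> likely irradiated?", diff > 2.0)
```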

  15. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    NASA Astrophysics Data System (ADS)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improving image quality or reducing the radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 down to 25% of D0. A lesion of fixed size and contrast was inserted at different locations in the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20), in a signal-known-exactly (SKE), background-known-exactly-but-variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal-performance criterion to derive the potential dose reduction factor of IR. In general, there was good agreement in the relative AUC values of the different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than with RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low-dose protocol, lower than the standard dose due to the use of IR methods. At 75% of D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
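    A minimal sketch of a channelized Hotelling observer with a few rotationally symmetric difference-of-Gaussians channels, of the general kind used in the evaluation above; the image size, lesion and noise models, channel widths, and sample counts are illustrative assumptions, not the study's settings.

```python
# Minimal CHO sketch: channelize images, build the Hotelling template from
# training data, and estimate AUC on test data (Mann-Whitney estimate).
import numpy as np

rng = np.random.default_rng(7)
N = 64                                         # image side length
yy, xx = np.mgrid[:N, :N]
r2 = (xx - N / 2) ** 2 + (yy - N / 2) ** 2

def gauss(sigma):
    g = np.exp(-r2 / (2 * sigma ** 2))
    return g / g.sum()

# Three rotationally symmetric difference-of-Gaussians channels, flattened
sigmas = [2, 4, 8, 16]
channels = np.stack([(gauss(sigmas[i]) - gauss(sigmas[i + 1])).ravel()
                     for i in range(3)], axis=1)          # (N*N, 3)

signal = 0.5 * np.exp(-r2 / (2 * 3.0 ** 2))               # known, centered lesion
def make_images(n, present):
    imgs = rng.normal(0, 1.0, (n, N * N))                  # white-noise background
    if present:
        imgs += signal.ravel()
    return imgs

# Training: channelized template w = S^-1 (mean_present - mean_absent)
v0 = make_images(200, False) @ channels
v1 = make_images(200, True) @ channels
S = 0.5 * (np.cov(v0.T) + np.cov(v1.T))
w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))

# Testing: AUC from the two score distributions
t0 = make_images(200, False) @ channels @ w
t1 = make_images(200, True) @ channels @ w
auc = np.mean(t1[:, None] > t0[None, :]) + 0.5 * np.mean(t1[:, None] == t0[None, :])
print("CHO AUC:", round(auc, 3))
```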

  16. Monocular Vision-Based Underwater Object Detection

    PubMed Central

    Zhang, Zhen; Dai, Fengzhao; Bu, Yang; Wang, Huibin

    2017-01-01

    In this paper, we propose an underwater object detection method using monocular vision sensors. In addition to commonly used visual features such as color and intensity, we investigate the potential of underwater object detection using light transmission information. The global contrast of various features is used to initially identify the region of interest (ROI), which is then filtered by the image segmentation method, producing the final underwater object detection results. We test the performance of our method with diverse underwater datasets. Samples of the datasets are acquired by a monocular camera with different qualities (such as resolution and focal length) and setups (viewing distance, viewing angle, and optical environment). It is demonstrated that our ROI detection method is necessary and can largely remove the background noise and significantly increase the accuracy of our underwater object detection method. PMID:28771194

  17. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    PubMed

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Roka Listeria detection method using transcription mediated amplification to detect Listeria species in select foods and surfaces. Performance Tested Method(SM) 011201.

    PubMed

    Hua, Yang; Kaplan, Shannon; Reshatoff, Michael; Hu, Ernie; Zukowski, Alexis; Schweis, Franz; Gin, Cristal; Maroni, Brett; Becker, Michael; Wisniewski, Michele

    2012-01-01

    The Roka Listeria Detection Assay was compared to the reference culture methods for nine select foods and three select surfaces. The Roka method used Half-Fraser Broth for enrichment at 35 ± 2 degrees C for 24-28 h. Comparison of Roka's method to the reference methods required an unpaired approach. Each method had a total of 545 samples inoculated with a Listeria strain. Each food and surface was inoculated with a different strain of Listeria at two different levels per method. For the dairy products (Brie cheese, whole milk, and ice cream), our method was compared to AOAC Official Method(SM) 993.12. For the ready-to-eat meats (deli chicken, cured ham, chicken salad, and hot dogs) and environmental surfaces (sealed concrete, stainless steel, and plastic), the samples were compared to the U.S. Department of Agriculture/Food Safety and Inspection Service-Microbiology Laboratory Guidebook (USDA/FSIS-MLG) method MLG 8.07. Cold-smoked salmon and romaine lettuce were compared to the U.S. Food and Drug Administration/Bacteriological Analytical Manual, Chapter 10 (FDA/BAM) method. Roka's method had 358 positives out of 545 total inoculated samples, compared to 332 positives for the reference methods. Overall, probability-of-detection analysis of the results showed performance better than or equivalent to the reference methods.

  19. New sensitive high-performance liquid chromatography-tandem mass spectrometry method for the detection of horse and pork in halal beef.

    PubMed

    von Bargen, Christoph; Dojahn, Jörg; Waidelich, Dietmar; Humpf, Hans-Ulrich; Brockmeyer, Jens

    2013-12-11

    The accidental or fraudulent blending of meat from different species is a highly relevant aspect of food product quality control, especially for consumers with ethical concerns about species such as horse or pork. In this study, we present a sensitive mass spectrometric approach for the detection of trace contamination with horse meat and pork and demonstrate the specificity of the identified biomarker peptides against chicken, lamb, and beef. Biomarker peptides were identified by a shotgun proteomic approach using tryptic digests of protein extracts and were verified by the analysis of 21 different meat samples from the 5 species included in this study. For the most sensitive peptides, a multiple reaction monitoring (MRM) method was developed that allows for the detection of 0.55% horse or pork in a beef matrix. To enhance sensitivity, we applied MRM(3) experiments and were able to detect down to 0.13% pork contamination in beef. To the best of our knowledge, we present here the first rapid and sensitive mass spectrometric method for the detection of horse and pork by use of MRM and MRM(3).

  20. A surface acoustic wave response detection method for passive wireless torque sensor

    NASA Astrophysics Data System (ADS)

    Fan, Yanping; Kong, Ping; Qi, Hongli; Liu, Hongye; Ji, Xiaojun

    2018-01-01

    This paper presents an effective surface acoustic wave (SAW) response detection method for a passive wireless SAW torque sensor to improve measurement accuracy. An analysis was conducted on the relationship between the response energy-entropy and the bandwidth of the SAW resonator (SAWR). A self-correlation method was modified to suppress blurred white noise and highlight the attenuation characteristic of the wireless SAW response. The SAW response was detected according to both the variation and the duration of the energy-entropy rise of an acquired RF signal. Numerical simulation results showed that the SAW response can be detected even when the signal-to-noise ratio (SNR) is 6 dB. The proposed SAW response detection method was evaluated with several experiments under different conditions. The SAW response can be well distinguished from the sinusoidal signal and the noise. The performance of the SAW torque measurement system incorporating the detection method was tested. The obtained repeatability error was 0.23% and the linearity was 0.9934, indicating the validity of the detection method.
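
    The abstract does not give the exact energy-entropy formulation, so the sketch below is only one plausible reading: compute a short-time energy entropy over sliding windows and flag a response where the entropy rise exceeds a threshold and persists for a minimum duration. The window length, sub-frame count, `rise` threshold, and `min_len` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def short_time_energy_entropy(x, win=256, sub=8):
    """Energy entropy per analysis window, from the energy distribution
    over `sub` sub-frames (illustrative formulation)."""
    n_win = len(x) // win
    ent = np.empty(n_win)
    for i in range(n_win):
        frame = x[i * win:(i + 1) * win].reshape(sub, -1)
        e = np.sum(frame ** 2, axis=1) + 1e-12
        p = e / e.sum()
        ent[i] = -np.sum(p * np.log2(p))
    return ent

def detect_saw_response(x, win=256, rise=0.5, min_len=4):
    """Return (start, end) sample indices of segments where the
    energy-entropy rise exceeds `rise` for at least `min_len` windows."""
    ent = short_time_energy_entropy(x, win)
    hot = np.diff(ent, prepend=ent[0]) > rise
    segments, start = [], None
    for i, h in enumerate(np.append(hot, False)):   # sentinel closes a final run
        if h and start is None:
            start = i
        elif not h and start is not None:
            if i - start >= min_len:
                segments.append((start * win, i * win))
            start = None
    return segments
```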

  1. Evaluation of two methods for direct detection of Fusarium spp. in water.

    PubMed

    Graça, Mariana G; van der Heijden, Inneke M; Perdigão, Lauro; Taira, Cleison; Costa, Silvia F; Levin, Anna S

    2016-04-01

    Fusarium is a waterborne fungus that causes severe infections, especially in patients with prolonged neutropenia. Traditionally, the detection of Fusarium in water is done by culturing, which is difficult and time-consuming. A faster method is necessary to prevent exposure of susceptible patients to contaminated water. The objective of this study was to develop a molecular technique for the direct detection of Fusarium in water. A direct DNA extraction method from water was developed and coupled to a genus-specific PCR to detect three Fusarium species (F. verticillioides, F. oxysporum, and F. solani). The detection limits were 10 cells/L and 1 cell/L for the molecular and culture methods, respectively. To our knowledge, this is the first method developed to detect Fusarium directly from water. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Apparatus and methods for real-time detection of explosives devices

    DOEpatents

    Blackburn, Brandon W [Idaho Falls, ID; Hunt, Alan W [Pocatello, ID; Chichester, David L [Idaho Falls, ID

    2014-01-07

    The present disclosure relates, according to some embodiments, to apparatus, devices, systems, and/or methods for real-time detection of a concealed or camouflaged explosive device (e.g., EFPs and IEDs) from a safe stand-off distance. Apparatus, system and/or methods of the disclosure may also be operable to identify and/or spatially locate and/or detect an explosive device. An apparatus or system may comprise an x-ray generator that generates high-energy x-rays and/or electrons operable to contact and activate a metal comprised in an explosive device from a stand-off distance; and a detector operable to detect activation of the metal. Identifying an explosive device may comprise detecting characteristic radiation signatures emitted by metals specific to an EFP, an IED or a landmine. Apparatus and systems of the disclosure may be mounted on vehicles and methods of the disclosure may be performed while moving in the vehicle and from a safe stand-off distance.

  3. Detection Method of TOXOPLASMA GONDII Tachyzoites

    NASA Astrophysics Data System (ADS)

    Eassa, Souzan; Bose, Chhanda; Alusta, Pierre; Tarasenko, Olga

    2011-06-01

    Tachyzoites are considered to be the most important stage of Toxoplasma gondii, the causative agent of toxoplasmosis. T. gondii is an obligate intracellular parasite that infects a wide range of cells. The present study was designed to develop a method for the early detection of T. gondii tachyzoites. The method comprised a binding assay that was analyzed using principal component and cluster analyses. Our data showed that glycoconjugates GC1, GC2, GC3, and GC10 exhibit a significantly higher binding affinity for T. gondii tachyzoites compared to controls (T. gondii only, PAA only, GC 1, 2, 3, and 10 only).

  4. Micro-vibration detection with heterodyne holography based on time-averaged method

    NASA Astrophysics Data System (ADS)

    Qin, XiaoDong; Pan, Feng; Chen, ZongHui; Hou, XueQin; Xiao, Wen

    2017-02-01

    We propose a micro-vibration detection method by introducing heterodyne interferometry to time-averaged holography. This method compensates for the deficiency of time-average holography in quantitative measurements and widens its range of application effectively. Acousto-optic modulators are used to modulate the frequencies of the reference beam and the object beam. Accurate detection of the maximum amplitude of each point in the vibration plane is performed by altering the frequency difference of both beams. The range of amplitude detection of plane vibration is extended. In the stable vibration mode, the distribution of the maximum amplitude of each point is measured and the fitted curves are plotted. Hence the plane vibration mode of the object is demonstrated intuitively and detected quantitatively. We analyzed the method in theory and built an experimental system with a sine signal as the excitation source and a typical piezoelectric ceramic plate as the target. The experimental results indicate that, within a certain error range, the detected vibration mode agrees with the intrinsic vibration characteristics of the object, thus proving the validity of this method.

  5. Comparison of formant detection methods used in speech processing applications

    NASA Astrophysics Data System (ADS)

    Belean, Bogdan

    2013-11-01

    The paper describes time-frequency representations of the speech signal together with the significance of formants in speech processing applications. Speech formants can be used in emotion recognition, sex discrimination, or the diagnosis of different neurological diseases. Taking into account the various applications of formant detection in speech signals, two methods for detecting formants are presented. First, the poles resulting from a complex-root analysis of the LPC coefficients are used for formant detection. The second approach uses a Kalman filter for formant prediction along the speech signal. Results are presented for both approaches on real-life speech spectrograms. A comparison of the features of the proposed methods is also performed, in order to establish which method is more suitable for different speech processing applications.
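
    As a concrete illustration of the first approach, the sketch below estimates formants from the complex roots of an autocorrelation-method LPC polynomial. The LPC order, sampling rate, and the bandwidth cut-off used to discard spurious poles are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_coefficients(frame, order=12):
    """Autocorrelation-method LPC via the Toeplitz normal equations."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    a = solve_toeplitz(r[:order], r[1:order + 1])
    return np.concatenate(([1.0], -a))            # A(z) = 1 - sum a_k z^-k

def formants_from_lpc(frame, fs=16000, order=12, max_bw=400.0):
    """Formant estimates: angles of the LPC poles, keeping only poles whose
    estimated bandwidth is plausibly narrow enough to be a formant."""
    roots = np.roots(lpc_coefficients(frame, order))
    roots = roots[np.imag(roots) > 0]             # one root per conjugate pair
    freqs = np.angle(roots) * fs / (2 * np.pi)
    bws = -fs / np.pi * np.log(np.abs(roots))     # pole bandwidth estimate
    keep = (freqs > 90) & (bws < max_bw)
    return np.sort(freqs[keep])
```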

  6. Method and automated apparatus for detecting coliform organisms

    NASA Technical Reports Server (NTRS)

    Dill, W. P.; Taylor, R. E.; Jeffers, E. L. (Inventor)

    1980-01-01

    Method and automated apparatus are disclosed for determining the time of detection of metabolically produced hydrogen by coliform bacteria cultured in an electroanalytical cell, measured from the time the cell is inoculated with the bacteria. The detection time data provide bacteria concentration values. The apparatus is sequenced and controlled by a digital computer to discharge a spent sample, clean and sterilize the culture cell, provide a bacterial nutrient to the cell, control the temperature of the nutrient, inoculate the nutrient with a bacteria sample, measure the electrical potential difference produced by the cell, and measure the time of detection from inoculation.

  7. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    PubMed

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with a wide dynamic range and an excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl⁻¹ of DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Method and apparatus for detection of chemical vapors

    DOEpatents

    Mahurin, Shannon Mark [Knoxville, TN; Dai, Sheng [Knoxville, TN; Caja, Josip [Knoxville, TN

    2007-05-15

    The present invention is a gas detector and method for using the gas detector for detecting and identifying volatile organic and/or volatile inorganic substances present in unknown vapors in an environment. The gas detector comprises a sensing means and a detecting means for detecting the electrical capacitance variation of the sensing means and for further identifying the volatile organic and volatile inorganic substances. The sensing means comprises at least one sensing unit and a sensing material allocated within the sensing unit. The sensing material is an ionic liquid which is exposed to the environment and is capable of dissolving a quantity of said volatile substance upon exposure thereto. The sensing means constitutes an electrochemical capacitor and the detecting means is in electrical communication with the sensing means.

  9. Why conventional detection methods fail in identifying the existence of contamination events.

    PubMed

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and the linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variations. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. A cascade method for TFT-LCD defect detection

    NASA Astrophysics Data System (ADS)

    Yi, Songsong; Wu, Xiaojun; Yu, Zhiyang; Mo, Zhuoya

    2017-07-01

    In this paper, we propose a novel cascade detection algorithm which focuses on point and line defects on TFT-LCD panels. In the first step of the algorithm, we use the gray-level difference of sub-images to segment the abnormal area. The second step is based on the phase-only transform (POT), which corresponds to the discrete Fourier transform (DFT) normalized by its magnitude; it can remove regularities such as texture and noise. After that, we improve the method of setting regions of interest (ROI) with edge segmentation and polar transformation. The algorithm has outstanding performance in both computation speed and accuracy. It can handle most defect types, including dark points, light points, and dark lines.
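
    A minimal sketch of the phase-only transform step, assuming a grayscale image as a NumPy array: the spectrum is divided by its own magnitude so that periodic background texture flattens out and isolated point or line defects remain salient. The residual threshold (mean + k·std) is an illustrative choice, not the paper's.

```python
import numpy as np

def phase_only_transform(img):
    """DFT normalized by its magnitude, then inverse DFT; regular texture
    collapses toward a flat response while local anomalies stay strong."""
    f = np.fft.fft2(img.astype(float))
    pot = np.fft.ifft2(f / (np.abs(f) + 1e-9))
    return np.abs(pot)

def defect_mask(img, k=4.0):
    """Binary defect candidates: POT response above mean + k*std."""
    r = phase_only_transform(img)
    return r > r.mean() + k * r.std()
```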

  11. Adverse drug events with hyperkalaemia during inpatient stays: evaluation of an automated method for retrospective detection in hospital databases

    PubMed Central

    2014-01-01

    Background Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. Methods We used a set of complex detection rules to take account of the patient’s clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The complex detection rules’ analytical quality was evaluated for ADEs. Results In terms of recall, 89.5% of ADEs with hyperkalaemia “with or without an abnormal symptom” were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. Conclusions The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases. PMID:25212108

  12. Analysis of characteristics of Si in blast furnace pig iron and calibration methods in the detection by laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Mei, Yaguang; Cheng, Yuxin; Cheng, Shusen; Hao, Zhongqi; Guo, Lianbo; Li, Xiangyou; Zeng, Xiaoyan

    2017-10-01

    During the iron-making process in a blast furnace, the Si content in liquid pig iron is usually used to evaluate the quality of the liquid iron and the thermal state of the blast furnace. No effective method has been available for rapidly detecting the Si concentration of liquid iron. Laser-induced breakdown spectroscopy (LIBS) is an atomic emission spectrometry technique based on laser ablation. Its obvious advantage is rapid, in-situ, online analysis of element concentrations in open air without sample pretreatment. The characteristics of Si in liquid iron were analyzed from the perspectives of thermodynamic theory and metallurgical technology. The relationships between Si and C, Mn, S, P, and other alloying elements were revealed based on thermodynamic calculations. Subsequently, LIBS was applied to the rapid detection of Si in pig iron in this work. During the LIBS detection process, several groups of standard pig iron samples were employed to calibrate the Si content in pig iron. The calibration methods, including linear, quadratic, and cubic internal standard calibration, multivariate linear calibration, and partial least squares (PLS), were compared with each other. The results revealed that PLS improved by normalization was the best calibration method for Si detection by LIBS.
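
    A minimal sketch of a normalization-plus-PLS calibration of the kind compared in the study, using scikit-learn. The file names, the total-intensity normalization, the number of latent components, and the 5-fold cross-validation are all assumptions for illustration; the authors' normalization scheme and spectral preprocessing are not specified in this record.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical inputs: rows are averaged LIBS spectra of standard pig iron
# samples; si_ref holds the certified Si contents (wt%).
spectra = np.load("libs_standard_spectra.npy")    # shape (n_samples, n_channels)
si_ref = np.load("si_reference_wt_percent.npy")   # shape (n_samples,)

# Total-intensity normalization, one simple way to suppress shot-to-shot
# fluctuations of the plasma before calibration.
X = spectra / spectra.sum(axis=1, keepdims=True)

pls = PLSRegression(n_components=6)               # component count normally chosen by CV
si_pred = cross_val_predict(pls, X, si_ref, cv=5).ravel()
rmse = np.sqrt(np.mean((si_pred - si_ref) ** 2))
print(f"cross-validated RMSE of Si prediction: {rmse:.3f} wt%")
```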

  13. An improved EMD method for modal identification and a combined static-dynamic method for damage detection

    NASA Astrophysics Data System (ADS)

    Yang, Jinping; Li, Peizhen; Yang, Youfa; Xu, Dian

    2018-04-01

    Empirical mode decomposition (EMD) is a highly adaptable signal processing method. However, the EMD approach has certain drawbacks, including distortions from end effects and mode mixing. In the present study, these two problems are addressed using an end extension method based on the support vector regression machine (SVRM) and a modal decomposition method based on the characteristics of the Hilbert transform. The algorithm includes two steps: using the SVRM, the time series data are extended at both endpoints to reduce the end effects, and then, a modified EMD method using the characteristics of the Hilbert transform is performed on the resulting signal to reduce mode mixing. A new combined static-dynamic method for identifying structural damage is presented. This method combines the static and dynamic information in an equilibrium equation that can be solved using the Moore-Penrose generalized matrix inverse. The combination method uses the differences in displacements of the structure with and without damage and variations in the modal force vector. Tests on a four-story, steel-frame structure were conducted to obtain static and dynamic responses of the structure. The modal parameters are identified using data from the dynamic tests and improved EMD method. The new method is shown to be more accurate and effective than the traditional EMD method. Through tests with a shear-type test frame, the higher performance of the proposed static-dynamic damage detection approach, which can detect both single and multiple damage locations and the degree of the damage, is demonstrated. For structures with multiple damage, the combined approach is more effective than either the static or dynamic method. The proposed EMD method and static-dynamic damage detection method offer improved modal identification and damage detection, respectively, in structures.
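
    The combined equilibrium equation itself is not reproduced in this record, so the sketch below only illustrates the numerical core as described: stacking the static displacement differences and the modal force variations into one overdetermined linear system in the damage parameters and solving it with the Moore-Penrose generalized inverse (numpy.linalg.pinv). All matrices here are random placeholders standing in for the sensitivity blocks, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder sensitivity blocks relating elemental stiffness reductions
# (damage parameters alpha) to the measured quantities; in the combined
# static-dynamic formulation these would be derived from the static load
# cases and from the variation of the modal force vector, respectively.
S_static = rng.random((12, 8))      # static block (illustrative)
S_modal = rng.random((20, 8))       # dynamic (modal force) block (illustrative)
d_static = rng.random(12)           # measured static displacement differences
d_modal = rng.random(20)            # measured modal force variations

A = np.vstack([S_static, S_modal])  # stacked equilibrium equations
b = np.concatenate([d_static, d_modal])

alpha = np.linalg.pinv(A) @ b       # least-squares damage estimate
print("estimated damage parameters:", np.round(alpha, 3))
```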

  14. Evaluating the Good Ontology Design Guideline (GoodOD) with the ontology quality requirements and evaluation method and metrics (OQuaRE).

    PubMed

    Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás

    2014-01-01

    To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold-standard and competency-question-based evaluation methods, respectively. In the last decades many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained in the previous studies based on the same data. Our results show a significant effect of the GoodOD training on the developed ontologies by topic: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. The GoodOD guideline had a significant effect on the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies.

  15. Ozone Air Quality over North America: Part II-An Analysis of Trend Detection and Attribution Techniques.

    PubMed

    Porter, P Steven; Rao, S Trivikrama; Zurbenko, Igor G; Dunker, Alan M; Wolff, George T

    2001-02-01

    Assessment of regulatory programs aimed at improving ambient O3 air quality is of considerable interest to the scientific community and to policymakers. Trend detection, the identification of statistically significant long-term changes, and attribution, linking change to specific climatological and anthropogenic forcings, are instrumental to this assessment. Detection and attribution are difficult because changes in pollutant concentrations of interest to policymakers may be much smaller than natural variations due to weather and climate. In addition, there are considerable differences in reported trends seemingly based on similar statistical methods and databases. Differences arise from the variety of techniques used to reduce nontrend variation in time series, including mitigating the effects of meteorology and the variety of metrics used to track changes. In this paper, we review the trend assessment techniques being used in the air pollution field and discuss their strengths and limitations in discerning and attributing changes in O3 to emission control policies.

  16. Railway clearance intrusion detection method with binocular stereo vision

    NASA Astrophysics Data System (ADS)

    Zhou, Xingfang; Guo, Baoqing; Wei, Wei

    2018-03-01

    In the stages of railway construction and operation, objects intruding into the railway clearance greatly threaten the safety of railway operation. Real-time intrusion detection is of great importance. To address the shortcomings of single-image methods, namely insensitivity to depth and susceptibility to shadow interference, an intrusion detection method based on binocular stereo vision is proposed to reconstruct the 3D scene, locate objects, and judge clearance intrusion. The binocular cameras are calibrated with Zhang Zhengyou's method. In order to improve the 3D reconstruction speed, a suspicious region is first determined by a background-difference method applied to a single camera's image sequence. The image rectification, stereo matching, and 3D reconstruction steps are executed only when there is a suspicious region. A transformation matrix from the Camera Coordinate System (CCS) to the Track Coordinate System (TCS) is computed from the gauge constant and used to transfer the 3D point clouds into the TCS; the point clouds are then used to calculate the object position and intrusion in the TCS. Experiments in a railway scene show that the position precision is better than 10 mm. It is an effective approach to clearance intrusion detection and can satisfy the requirements of railway applications.
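
    A minimal OpenCV sketch of the rectify-match-reproject-transform chain described, assuming the stereo calibration has already been done (e.g. with Zhang's method) so that the rectification maps `maps`, the disparity-to-depth matrix `Q` (both obtainable from cv2.stereoRectify / cv2.initUndistortRectifyMap), and the camera-to-track transform `T_cam_to_track` are available. The SGBM settings are illustrative, not the paper's.

```python
import cv2
import numpy as np

def reconstruct_suspicious_region(img_l, img_r, maps, Q, T_cam_to_track):
    """Rectify the stereo pair, run semi-global matching, reproject the
    disparity to 3D camera coordinates, and move the points into the
    track coordinate system (TCS)."""
    rect_l = cv2.remap(img_l, maps["lx"], maps["ly"], cv2.INTER_LINEAR)
    rect_r = cv2.remap(img_r, maps["rx"], maps["ry"], cv2.INTER_LINEAR)

    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disp = sgbm.compute(rect_l, rect_r).astype(np.float32) / 16.0

    pts_cam = cv2.reprojectImageTo3D(disp, Q)      # HxWx3 points in the CCS
    pts = pts_cam[disp > 0].reshape(-1, 3)

    # 4x4 homogeneous transform from camera to track coordinates, assumed
    # to have been computed beforehand from the gauge constant.
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return (T_cam_to_track @ pts_h.T).T[:, :3]
```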

  17. Quality metrics for sensor images

    NASA Technical Reports Server (NTRS)

    Ahumada, AL

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftoning optimizing methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. Most image quality models are designed for static imagery

  18. Comparison of Soil Quality Index Using Three Methods

    PubMed Central

    Mukherjee, Atanu; Lal, Rattan

    2014-01-01

    Assessment of management-induced changes in soil quality is important to sustaining high crop yield. The large diversity of cultivated soils necessitates the identification and development of an appropriate soil quality index (SQI) based on relative soil properties and crop yield. Whereas numerous attempts have been made to estimate SQI for major soils across the world, no standard method has been established; thus, a strong need exists for developing a user-friendly and credible SQI through comparison of the various available methods. Therefore, the objective of this article is to compare three widely used methods to estimate SQI using data collected from 72 soil samples from three on-farm study sites in Ohio. Additionally, a challenge lies in establishing a correlation between crop yield and SQI calculated either depth-wise or for combined soil layers, as a standard methodology is not yet available and this has received little attention to date. Predominant soils of the study included one organic (Mc) and two mineral (CrB, Ko) soils. The three methods used to estimate SQI were: (i) simple additive SQI (SQI-1), (ii) weighted additive SQI (SQI-2), and (iii) statistically modeled SQI (SQI-3) based on principal component analysis (PCA). The SQI varied between treatments and soil types and ranged between 0 and 0.9 (1 being the maximum SQI). In general, SQIs did not differ significantly with depth under any method, suggesting that soil quality did not differ significantly between depths at the studied sites. Additionally, the data indicate that SQI-3 was most strongly correlated with crop yield, with correlation coefficients ranging between 0.74 and 0.78. All three SQIs were significantly correlated (r = 0.92–0.97) with each other and with crop yield (r = 0.65–0.79). Separate analyses by crop variety revealed that the correlation was low, indicating that some key aspects of soil quality related to crop response are important requirements for estimating SQI. PMID:25148036
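
    A minimal sketch of a weighted additive SQI in the spirit of SQI-2, with the weights taken from the absolute loadings of the first principal component, one common weighting choice. The 0-1 scoring function, the weighting scheme, and the made-up indicator values are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

def score_more_is_better(x):
    """Linear 0-1 scoring of an indicator for which higher values are better."""
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

def weighted_additive_sqi(indicators):
    """indicators: (n_samples, n_indicators) measured soil properties.
    Each indicator is scored to 0-1, then weighted by its normalized
    first-principal-component loading and summed into an SQI in [0, 1]."""
    scores = np.apply_along_axis(score_more_is_better, 0, indicators)
    loadings = np.abs(PCA().fit(scores).components_[0])
    weights = loadings / loadings.sum()
    return scores @ weights

# Usage with made-up values for four indicators (e.g. organic carbon,
# aggregate stability, available water capacity, inverted bulk density):
props = np.random.default_rng(0).random((72, 4))
print(weighted_additive_sqi(props)[:5])
```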

  19. Quality control and quality assurance in genotypic data for genome-wide association studies

    PubMed Central

    Laurie, Cathy C.; Doheny, Kimberly F.; Mirel, Daniel B.; Pugh, Elizabeth W.; Bierut, Laura J.; Bhangale, Tushar; Boehm, Frederick; Caporaso, Neil E.; Cornelis, Marilyn C.; Edenberg, Howard J.; Gabriel, Stacy B.; Harris, Emily L.; Hu, Frank B.; Jacobs, Kevin; Kraft, Peter; Landi, Maria Teresa; Lumley, Thomas; Manolio, Teri A.; McHugh, Caitlin; Painter, Ian; Paschall, Justin; Rice, John P.; Rice, Kenneth M.; Zheng, Xiuwen; Weir, Bruce S.

    2011-01-01

    Genome-wide scans of nucleotide variation in human subjects are providing an increasing number of replicated associations with complex disease traits. Most of the variants detected have small effects and, collectively, they account for a small fraction of the total genetic variance. Very large sample sizes are required to identify and validate findings. In this situation, even small sources of systematic or random error can cause spurious results or obscure real effects. The need for careful attention to data quality has been appreciated for some time in this field, and a number of strategies for quality control and quality assurance (QC/QA) have been developed. Here we extend these methods and describe a system of QC/QA for genotypic data in genome-wide association studies. This system includes some new approaches that (1) combine analysis of allelic probe intensities and called genotypes to distinguish gender misidentification from sex chromosome aberrations, (2) detect autosomal chromosome aberrations that may affect genotype calling accuracy, (3) infer DNA sample quality from relatedness and allelic intensities, (4) use duplicate concordance to infer SNP quality, (5) detect genotyping artifacts from dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allelic frequency, and (6) demonstrate sensitivity of principal components analysis (PCA) to SNP selection. The methods are illustrated with examples from the ‘Gene Environment Association Studies’ (GENEVA) program. The results suggest several recommendations for QC/QA in the design and execution of genome-wide association studies. PMID:20718045
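
    As an illustration of point (5), the sketch below runs a simple one-degree-of-freedom chi-square HWE test per SNP and tabulates the fraction of strongly failing SNPs by minor-allele-frequency bin, the kind of frequency dependence that flags genotyping artifacts. The test choice (an exact test is usually preferred for rare alleles), the 1e-4 cut-off, and the binning are assumptions, not the GENEVA pipeline's actual settings.

```python
import numpy as np
from scipy.stats import chi2

def hwe_chisq(n_aa, n_ab, n_bb):
    """Return (p-value, allele-A frequency) from a 1-d.f. chi-square HWE test."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)
    expected = np.array([n * p ** 2, 2 * n * p * (1 - p), n * (1 - p) ** 2])
    observed = np.array([n_aa, n_ab, n_bb])
    stat = np.sum((observed - expected) ** 2 / (expected + 1e-12))
    return chi2.sf(stat, df=1), p

def hwe_failure_by_maf_bin(genotype_counts, bins=10, alpha=1e-4):
    """Fraction of SNPs with HWE p < alpha in each minor-allele-frequency bin."""
    results = [hwe_chisq(*g) for g in genotype_counts]
    pvals = np.array([r[0] for r in results])
    freqs = np.array([r[1] for r in results])
    maf = np.minimum(freqs, 1 - freqs)
    idx = np.clip(np.digitize(maf, np.linspace(0, 0.5, bins + 1)) - 1, 0, bins - 1)
    return [float(np.mean(pvals[idx == b] < alpha)) if np.any(idx == b) else np.nan
            for b in range(bins)]
```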

  20. [Analgesic quality in a postoperative pain service: continuous assessment with the cumulative sum (cusum) method].

    PubMed

    Baptista Macaroff, W M; Castroman Espasandín, P

    2007-01-01

    The aim of this study was to assess the cumulative sum (cusum) method for evaluating the performance of our hospital's acute postoperative pain service. The period of analysis was 7 months. Analgesic failure was defined as a score of 3 points or more on a simple numerical scale. Acceptable failure (p0) was set at 20% of patients upon admission to the postanesthetic recovery unit and at 7% 24 hours after surgery. Unacceptable failure was set at double the p0 rate at each time (40% and 14%, respectively). The unit's patient records were used to generate a cusum graph for each evaluation. Nine hundred four records were included. The rate of failure was 31.6% upon admission to the unit and 12.1% at the 24-hour postoperative assessment. The curve rose rapidly to the value set for p0 at both evaluation times (n = 14 and n = 17, respectively), later leveled off, and began to fall after 721 and 521 cases, respectively. Our study shows the efficacy of the cusum method for monitoring a proposed quality standard. The graph also showed periods of suboptimal performance that would not have been evident from analyzing the data en bloc. Thus the cusum method would facilitate rapid detection of periods in which quality declines.
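
    A minimal sketch of a Bernoulli cusum built from the acceptable (p0) and unacceptable (p1 = 2·p0) failure rates described for the recovery-unit assessment. The log-likelihood-ratio increments, the decision limit h, and the reset-after-alarm behaviour are illustrative conventions; the study's own chart may have been constructed differently.

```python
import numpy as np

def bernoulli_cusum(failures, p0=0.20, p1=0.40, h=2.5):
    """Cusum over a sequence of 0/1 analgesic-failure outcomes.  Each outcome
    adds the log-likelihood ratio of p1 vs p0; an alarm is raised when the
    statistic crosses h (h = 2.5 is an illustrative decision limit)."""
    s_fail = np.log(p1 / p0)                # positive increment on failure
    s_ok = np.log((1 - p1) / (1 - p0))      # negative increment otherwise
    curve, alarms, s = [], [], 0.0
    for i, failed in enumerate(failures):
        s = max(0.0, s + (s_fail if failed else s_ok))
        if s > h:
            alarms.append(i)
            s = 0.0                         # restart monitoring after an alarm
        curve.append(s)
    return np.array(curve), alarms

# Usage with simulated records at the 20% baseline failure rate:
outcomes = np.random.default_rng(0).random(904) < 0.20
curve, alarms = bernoulli_cusum(outcomes)
print("alarm indices:", alarms[:5])
```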

  1. Methods and systems for remote detection of gases

    DOEpatents

    Johnson, Timothy J.

    2007-11-27

    Novel systems and methods for remotely detecting at least one constituent of a gas via infrared detection are provided. A system includes at least one extended source of broadband infrared radiation and a spectrally sensitive receiver positioned remotely from the source. The source and the receiver are oriented such that a surface of the source is in the field of view of the receiver. The source includes a heating component thermally coupled to the surface, and the heating component is configured to heat the surface to a temperature above ambient temperature. The receiver is operable to collect spectral infrared absorption data representative of a gas present between the source and the receiver. The invention advantageously overcomes significant difficulties associated with active infrared detection techniques known in the art, and provides an infrared detection technique with a much greater sensitivity than passive infrared detection techniques known in the art.

  2. Methods and systems for remote detection of gases

    DOEpatents

    Johnson, Timothy J

    2012-09-18

    Novel systems and methods for remotely detecting at least one constituent of a gas via infrared detection are provided. A system includes at least one extended source of broadband infrared radiation and a spectrally sensitive receiver positioned remotely from the source. The source and the receiver are oriented such that a surface of the source is in the field of view of the receiver. The source includes a heating component thermally coupled to the surface, and the heating component is configured to heat the surface to a temperature above ambient temperature. The receiver is operable to collect spectral infrared absorption data representative of a gas present between the source and the receiver. The invention advantageously overcomes significant difficulties associated with active infrared detection techniques known in the art, and provides an infrared detection technique with a much greater sensitivity than passive infrared detection techniques known in the art.

  3. 7 CFR 58.133 - Methods for quality and wholesomeness determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and Grading Service 1 Quality Specifications for Raw Milk § 58.133 Methods for quality and wholesomeness determination. (a) Appearance and odor. The appearance of acceptable raw milk shall be normal and...

  4. 7 CFR 58.133 - Methods for quality and wholesomeness determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Grading Service 1 Quality Specifications for Raw Milk § 58.133 Methods for quality and wholesomeness determination. (a) Appearance and odor. The appearance of acceptable raw milk shall be normal and...

  5. Rapid methods for the detection of foodborne bacterial pathogens: principles, applications, advantages and limitations

    PubMed Central

    Law, Jodi Woan-Fei; Ab Mutalib, Nurul-Syakima; Chan, Kok-Gan; Lee, Learn-Han

    2015-01-01

    The incidence of foodborne diseases has increased over the years, resulting in a major public health problem globally. Foodborne pathogens can be found in various foods, and it is important to detect foodborne pathogens to provide a safe food supply and to prevent foodborne diseases. The conventional methods used to detect foodborne pathogens are time-consuming and laborious. Hence, a variety of methods have been developed for the rapid detection of foodborne pathogens, as required in many food analyses. Rapid detection methods can be categorized into nucleic acid-based, biosensor-based, and immunological-based methods. This review emphasizes the principles and applications of recent rapid methods for the detection of foodborne bacterial pathogens. The detection methods included are simple polymerase chain reaction (PCR), multiplex PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), loop-mediated isothermal amplification (LAMP), and oligonucleotide DNA microarray, which are classified as nucleic acid-based methods; optical, electrochemical, and mass-based biosensors, which are classified as biosensor-based methods; and enzyme-linked immunosorbent assay (ELISA) and lateral flow immunoassay, which are classified as immunological-based methods. In general, rapid detection methods are time-efficient, sensitive, specific, and labor-saving. The development of rapid detection methods is vital to the prevention and treatment of foodborne diseases. PMID:25628612

  6. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance

    USGS Publications Warehouse

    Clare, John; McKinney, Shawn T.; DePue, John E.; Loftin, Cynthia S.

    2017-01-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture–recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. Remote cameras detected marten

  7. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance.

    PubMed

    Clare, John; McKinney, Shawn T; DePue, John E; Loftin, Cynthia S

    2017-10-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture-recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. Remote cameras detected marten

  8. Exploring the clinical potential of an automatic colonic polyp detection method based on the creation of energy maps.

    PubMed

    Fernández-Esparrach, Glòria; Bernal, Jorge; López-Cerón, Maria; Córdova, Henry; Sánchez-Montes, Cristina; Rodríguez de Miguel, Cristina; Sánchez, Francisco Javier

    2016-09-01

    Polyp miss-rate is a drawback of colonoscopy that increases significantly for small polyps. We explored the efficacy of an automatic computer-vision method for polyp detection. Our method relies on a model that defines polyp boundaries as valleys of image intensity. Valley information is integrated into energy maps that represent the likelihood of the presence of a polyp. In 24 videos containing polyps from routine colonoscopies, all polyps were detected in at least one frame. The mean of the maximum values on the energy map was higher for frames with polyps than without (P < 0.001). Performance improved in high-quality frames (AUC = 0.79 [95% CI 0.70–0.87] vs. 0.75 [95% CI 0.66–0.83]). With 3.75 set as the maximum threshold value, sensitivity and specificity for the detection of polyps were 70.4% (95% CI 60.3%–80.8%) and 72.4% (95% CI 61.6%–84.6%), respectively. Energy maps performed well for colonic polyp detection, indicating their potential applicability in clinical practice. © Georg Thieme Verlag KG Stuttgart · New York.
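
    The valley model itself is not specified in this record, so the sketch below uses a generic stand-in: the larger eigenvalue of a Gaussian-smoothed Hessian as a per-pixel intensity-valley response, locally accumulated into a smooth energy map whose frame maximum can be compared against a decision threshold. The Hessian formulation, the smoothing scales, and the scoring are assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def valley_energy_map(gray, sigma=3.0):
    """Per-pixel valley response from the larger eigenvalue of the smoothed
    Hessian (large and positive across dark, valley-like boundaries), then
    blurred so responses accumulate into a smooth energy map."""
    g = gray.astype(float)
    ixx = gaussian_filter(g, sigma, order=(0, 2))   # d2I/dx2
    iyy = gaussian_filter(g, sigma, order=(2, 0))   # d2I/dy2
    ixy = gaussian_filter(g, sigma, order=(1, 1))
    half_gap = np.sqrt(((ixx - iyy) / 2.0) ** 2 + ixy ** 2)
    lam_max = (ixx + iyy) / 2.0 + half_gap
    return gaussian_filter(np.maximum(lam_max, 0.0), 2 * sigma)

def frame_score(gray):
    """Per-frame score, analogous to thresholding the energy-map maximum."""
    return valley_energy_map(gray).max()
```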

  9. An Energy-Efficient Cluster-Based Vehicle Detection on Road Network Using Intention Numeration Method

    PubMed Central

    Devasenapathy, Deepa; Kannan, Kathiravan

    2015-01-01

    Traffic in the road network is increasing progressively to a great extent. Good knowledge of network traffic can minimize congestion using information pertaining to the road network obtained with the aid of communal callers, pavement detectors, and so on. Using these methods, low-featured information is generated with respect to the users of the road network. Although the existing schemes obtain urban traffic information, they fail to calculate the energy drain rate of nodes and to locate an equilibrium between the overhead and the quality of the routing protocol, which poses a great challenge. Thus, an energy-efficient cluster-based vehicle detection in road networks using the intention numeration method (CVDRN-IN) is developed. Initially, sensor nodes that detect a vehicle are grouped into separate clusters. Further, we approximate the strength of the node drain rate for a cluster using a polynomial regression function. In addition, the total node energy is estimated by taking the integral over the area. Finally, enhanced data aggregation is performed to reduce the amount of data transmission using a digital signature tree. The experimental performance is evaluated with the Dodgers loop sensor data set from the UCI repository, and the proposed method outperforms existing work in terms of energy consumption, clustering efficiency, and node drain rate. PMID:25793221

  10. An energy-efficient cluster-based vehicle detection on road network using intention numeration method.

    PubMed

    Devasenapathy, Deepa; Kannan, Kathiravan

    2015-01-01

    Traffic in the road network is increasing progressively to a great extent. Good knowledge of network traffic can minimize congestion using information pertaining to the road network obtained with the aid of communal callers, pavement detectors, and so on. Using these methods, low-featured information is generated with respect to the users of the road network. Although the existing schemes obtain urban traffic information, they fail to calculate the energy drain rate of nodes and to locate an equilibrium between the overhead and the quality of the routing protocol, which poses a great challenge. Thus, an energy-efficient cluster-based vehicle detection in road networks using the intention numeration method (CVDRN-IN) is developed. Initially, sensor nodes that detect a vehicle are grouped into separate clusters. Further, we approximate the strength of the node drain rate for a cluster using a polynomial regression function. In addition, the total node energy is estimated by taking the integral over the area. Finally, enhanced data aggregation is performed to reduce the amount of data transmission using a digital signature tree. The experimental performance is evaluated with the Dodgers loop sensor data set from the UCI repository, and the proposed method outperforms existing work in terms of energy consumption, clustering efficiency, and node drain rate.

  11. A multi points ultrasonic detection method for material flow of belt conveyor

    NASA Astrophysics Data System (ADS)

    Zhang, Li; He, Rongjun

    2018-03-01

    To address the large detection error of single-point ultrasonic ranging technology used for material flow detection on belt conveyors when the coal is unevenly distributed or lumpy, a material flow detection method for belt conveyors is designed based on multi-point ultrasonic counter ranging technology. The method calculates the approximate cross-sectional area of the material by locating multiple points on the surfaces of the material and the belt, and then obtains the material flow from the running speed of the belt conveyor. The test results show that the method has a smaller detection error than single-point ultrasonic ranging technology under the condition of large coal with uneven distribution.
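
    A minimal sketch of the flow computation implied here: the multi-point range readings are converted into a material depth profile across the belt, the profile is integrated (trapezoidal rule) into a cross-sectional area, and the area is multiplied by the belt speed to give volumetric flow. The sensor geometry, spacing, and readings below are made-up placeholders.

```python
import numpy as np

def material_flow(sensor_heights, ranges, belt_speed, spacing=0.1):
    """Approximate volumetric flow (m^3/s) on a belt conveyor.

    sensor_heights: heights of the ultrasonic sensors above the empty belt (m)
    ranges:         measured distances from each sensor to the material surface (m)
    belt_speed:     conveyor speed (m/s)
    spacing:        lateral distance between neighbouring measuring points (m)
    """
    depth = np.clip(np.asarray(sensor_heights) - np.asarray(ranges), 0.0, None)
    # trapezoidal integration of the depth profile across the belt width
    area = np.sum((depth[:-1] + depth[1:]) / 2.0) * spacing
    return area * belt_speed

# Usage with made-up readings from five measuring points across the belt:
print(material_flow([0.8] * 5, [0.75, 0.62, 0.55, 0.60, 0.74], belt_speed=2.5))
```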

  12. Towards a Quality Assessment Method for Learning Preference Profiles in Negotiation

    NASA Astrophysics Data System (ADS)

    Hindriks, Koen V.; Tykhonov, Dmytro

    In automated negotiation, information gained about an opponent's preference profile by means of learning techniques may significantly improve an agent's negotiation performance. It is therefore useful to gain a better understanding of how various negotiation factors influence the quality of learning. The quality of learning techniques in negotiation is typically assessed indirectly by means of comparing the utility levels of agreed outcomes and other more global negotiation parameters. An evaluation of learning based on such general criteria, however, does not provide any insight into the influence of various aspects of negotiation on the quality of the learned model itself. The quality may depend on such aspects as the domain of negotiation, the structure of the preference profiles, the negotiation strategies used by the parties, and others. To gain a better understanding of the performance of proposed learning techniques in the context of negotiation, and to be able to assess the potential to improve the performance of such techniques, a more systematic assessment method is needed. In this paper we propose such a systematic method to analyse the quality of the information gained about opponent preferences by learning in single-instance negotiations. The method includes measures to assess the quality of a learned preference profile and proposes an experimental setup to analyse the influence of various negotiation aspects on the quality of learning. We apply the method to a Bayesian learning approach for learning an opponent's preference profile and discuss our findings.

  13. Detection and enumeration of Salmonella enteritidis in homemade ice cream associated with an outbreak: comparison of conventional and real-time PCR methods.

    PubMed

    Seo, K H; Valentin-Bon, I E; Brackett, R E

    2006-03-01

    Salmonellosis caused by Salmonella Enteritidis (SE) is a significant cause of foodborne illness in the United States. Consumption of undercooked eggs and egg-containing products has been the primary risk factor for the disease. The importance of bacterial enumeration techniques has been strongly emphasized because of the quantitative risk analysis of SE in shell eggs. Traditional enumeration methods mainly depend on slow and tedious most-probable-number (MPN) methods. Therefore, specific, sensitive, and rapid methods for SE quantitation are needed to collect sufficient data for risk assessment and food safety policy development. We previously developed a real-time quantitative PCR assay for the direct detection and enumeration of SE and, in this study, applied it to naturally contaminated ice cream samples with and without enrichment. The detection limit of the real-time PCR assay was determined with artificially inoculated ice cream. When applied to the direct detection and quantification of SE in ice cream, the real-time PCR assay was as sensitive as the conventional plate count method in frequency of detection. However, populations of SE derived from real-time quantitative PCR were approximately 1 log higher than the MPN and CFU values obtained by conventional culture methods. The detection and enumeration of SE in naturally contaminated ice cream can be completed in 3 h by this real-time PCR method, whereas the cultural enrichment method requires 5 to 7 days. A commercial immunoassay for the specific detection of SE was also included in the study. The real-time PCR assay proved to be a valuable tool that may be useful to the food industry in monitoring its processes to improve product quality and safety.
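
    The assay's primers and standards are not given in this record, so the sketch below only shows the generic standard-curve arithmetic behind real-time PCR enumeration: fit Ct against log10 copy number for a dilution series, check the amplification efficiency, and invert the curve for unknown samples. All numbers are placeholders, not data from the study.

```python
import numpy as np

# Hypothetical standard curve from 10-fold dilutions of a quantified SE control.
log_copies_std = np.array([6, 5, 4, 3, 2, 1], dtype=float)   # log10(copies/reaction)
ct_std = np.array([15.2, 18.6, 22.0, 25.5, 28.9, 32.3])      # measured threshold cycles

slope, intercept = np.polyfit(log_copies_std, ct_std, 1)     # Ct = slope*log10(N) + b
efficiency = 10 ** (-1.0 / slope) - 1.0                      # ~1.0 (100%) for an ideal assay

def copies_from_ct(ct):
    """Invert the standard curve to estimate copies per reaction for a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"amplification efficiency ~ {efficiency:.0%}")
print(f"sample at Ct 24.1 -> {copies_from_ct(24.1):.0f} copies per reaction")
```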

  14. Application of a Subspace-Based Fault Detection Method to Industrial Structures

    NASA Astrophysics Data System (ADS)

    Mevel, L.; Hermans, L.; van der Auweraer, H.

    1999-11-01

    Early detection and localization of damage allow increased expectations of reliability, safety and reduction of the maintenance cost. This paper deals with the industrial validation of a technique to monitor the health of a structure in operating conditions (e.g. rotating machinery, civil constructions subject to ambient excitations, etc.) and to detect slight deviations in a modal model derived from in-operation measured data. In this paper, a statistical local approach based on covariance-driven stochastic subspace identification is proposed. The capabilities and limitations of the method with respect to health monitoring and damage detection are discussed and it is explained how the method can be practically used in industrial environments. After the successful validation of the proposed method on a few laboratory structures, its application to a sports car is discussed. The example illustrates that the method allows the early detection of a vibration-induced fatigue problem of a sports car.

  15. Modern methods for the quality management of high-rate melt solidification

    NASA Astrophysics Data System (ADS)

    Vasiliev, V. A.; Odinokov, S. A.; Serov, M. M.

    2016-12-01

    The quality management of high-rate melt solidification requires a combined solution, obtained with methods and approaches adapted to the specific situation. A technological audit is recommended to estimate the capabilities of the process. Statistical methods with a choice of key parameters are proposed. Numerical methods, which can be used to perform simulations under multifactor technological conditions, and an increase in the quality of decisions are of particular importance.

  16. A review of data quality assessment methods for public health information systems.

    PubMed

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-05-14

    High-quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and the data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and on well-known institutional websites. We found that the data dimension was the most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and the data collection process, inconsistency in the definition of data quality attributes, failure to address data users' concerns, and a lack of systematic procedures in data quality assessment. This review study is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research effort should be given to assessing the quality of data use and the quality of the data collection process.

  17. A Review of Data Quality Assessment Methods for Public Health Information Systems

    PubMed Central

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-01-01

    High-quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and the data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and on well-known institutional websites. We found that the data dimension was the most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and the data collection process, inconsistency in the definition of data quality attributes, failure to address data users' concerns, and a lack of systematic procedures in data quality assessment. This review study is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research effort should be given to assessing the quality of data use and the quality of the data collection process. PMID:24830450

  18. Minimum detectable gas concentration performance evaluation method for gas leak infrared imaging detection systems.

    PubMed

    Zhang, Xu; Jin, Weiqi; Li, Jiakun; Wang, Xia; Li, Shuo

    2017-04-01

    Thermal imaging technology is an effective means of detecting hazardous gas leaks. Much attention has been paid to evaluating the performance of gas leak infrared imaging detection systems owing to several potential applications. The minimum resolvable temperature difference (MRTD) and the minimum detectable temperature difference (MDTD) are commonly used as the main indicators of thermal imaging system performance. This paper establishes a minimum detectable gas concentration (MDGC) performance evaluation model based on the definition and derivation of the MDTD. We propose direct and equivalent calculation methods for the MDGC based on the MDTD measurement system. We built an experimental MDGC measurement system, which indicates that the MDGC model can describe the detection performance of a thermal imaging system for typical gases. The direct calculation, equivalent calculation, and direct measurement results are consistent. The MDGC and the minimum resolvable gas concentration (MRGC) models can effectively describe the performance of "detection" and "spatial detail resolution" of thermal imaging systems for gas leaks, respectively, and constitute the main performance indicators of gas leak detection systems.

  19. Methods for Detection of Mitochondrial and Cellular Reactive Oxygen Species

    PubMed Central

    Harrison, David G.

    2014-01-01

    Abstract Significance: Mitochondrial and cellular reactive oxygen species (ROS) play important roles in both physiological and pathological processes. Different ROS, such as superoxide (O2•−), hydrogen peroxide, and peroxynitrite (ONOO−), stimulate distinct cell-signaling pathways and lead to diverse outcomes depending on their amount and subcellular localization. A variety of methods have been developed for ROS detection; however, many of these methods are not specific, do not allow subcellular localization, and can produce artifacts. In this review, we will critically analyze ROS detection and present the advantages and shortcomings of several available methods. Recent Advances: In the past decade, a number of new fluorescent probes, electron-spin resonance approaches, and immunoassays have been developed. These new state-of-the-art methods provide improved selectivity and subcellular resolution for ROS detection. Critical Issues: Although new methods for HPLC superoxide detection, application of fluorescent boronate-containing probes, use of cell-targeted hydroxylamine spin probes, and immunospin trapping have been available for several years, there has been a lack of translation of these into biomedical research, limiting their widespread use. Future Directions: Additional studies to translate these new technologies from the test tube to physiological applications are needed and could lead to a wider application of these approaches to study mitochondrial and cellular ROS. Antioxid. Redox Signal. 20, 372–382. PMID:22978713

  20. Practices and exploration on competition of molecular biological detection technology among students in food quality and safety major.

    PubMed

    Chang, Yaning; Peng, Yuke; Li, Pengfei; Zhuang, Yingping

    2017-07-08

    With the increasing importance of molecular biological detection technology in the field of food safety, strengthening education in molecular biology experimental techniques has become more necessary for training students in the food quality and safety major. However, molecular biology experiments are not always included in the curricula of food quality and safety majors. This paper introduces a project, "competition of molecular biological detection technology for food safety among undergraduate sophomore students in the food quality and safety major". Students participating in this project first learn fundamental molecular biology experimental techniques, such as the principles of molecular biology experiments, genome extraction, PCR, and agarose gel electrophoresis analysis, and then design experiments in groups to identify the meat species in pork and beef products using molecular biological methods. The students complete an experimental report after the basic experiments, and write essays and give a presentation after the designed experiments are finished. This project aims to provide another way for food quality and safety majors to improve their knowledge of molecular biology, especially experimental technology, to familiarize them with scientific research activities, and to give them a chance to learn how to write a professional thesis. In addition, in line with the principle of an open laboratory, the project is also open to students in other majors at East China University of Science and Technology, in order to help them understand the fields of molecular biology and food safety. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(4):343-350, 2017.

  1. Method of Detecting Coliform Bacteria and Escherichia Coli Bacteria from Reflected Light

    NASA Technical Reports Server (NTRS)

    Vincent, Robert (Inventor)

    2013-01-01

    The present invention relates to a method of detecting coliform bacteria in water from reflected light and a method of detecting Escherichia coli bacteria in water from reflected light, and also includes devices for the measurement, calculation, and transmission of data relating to those methods.

  2. ULTRASONIC FLAW DETECTION METHOD AND MEANS

    DOEpatents

    Worlton, D.C.

    1961-08-15

    A method of detecting subsurface flaws in an object using ultrasonic waves is described. An ultrasonic wave of predetermined velocity and frequency is transmitted to engage the surface of the object at a predetermined angle of incidence thereto. The incident angle of the wave to the surface is determined with respect to phase velocity, incident wave velocity, incident wave frequency, and the estimated depth of the flaw, so that Lamb waves of a particular type and mode are induced only in the portion of the object between the flaw and the surface. These Lamb waves are then detected as they leave the object at an angle of exit equal to the angle of incidence. No waves will be generated in the object, and hence none received, if no flaw exists beneath the surface. (AEC)
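
    As an aside on the physics the patent relies on, Lamb waves are excited when the incident wave's trace velocity along the surface matches the phase velocity of the desired mode, i.e. sin(theta) = c_incident / c_phase (the coincidence, or Snell's-law, condition). The sketch below only illustrates that standard relation; it is not the patented procedure, and the velocities in the example are hypothetical.

```python
import math

def lamb_wave_incidence_angle(c_incident: float, c_phase: float) -> float:
    """Incidence angle (degrees) at which a wave of velocity c_incident couples
    into a Lamb mode of phase velocity c_phase, via sin(theta) = c_incident / c_phase."""
    ratio = c_incident / c_phase
    if ratio > 1.0:
        raise ValueError("no real incidence angle: c_incident exceeds c_phase")
    return math.degrees(math.asin(ratio))

# Hypothetical example: a water-coupled beam (~1480 m/s) exciting a Lamb mode
# whose phase velocity is ~3000 m/s in a thin plate.
print(round(lamb_wave_incidence_angle(1480.0, 3000.0), 1), "degrees")
```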

  3. Method and apparatus for acoustic plate mode liquid-solid phase transition detection

    DOEpatents

    Blair, Dianna S.; Freye, Gregory C.; Hughes, Robert C.; Martin, Stephen J.; Ricco, Antonio J.

    1993-01-01

    A method and apparatus for sensing a liquid-solid phase transition event is provided, comprising an acoustic plate mode detecting element placed in contact with a liquid or solid material. The detecting element generates a high-frequency acoustic wave that is attenuated to an extent based on the physical state of the material in contact with the detecting element. The attenuation caused by the material in contact with the acoustic plate mode detecting element is used to determine the physical state of the material being detected. The method and device are particularly suited for detecting conditions such as the icing and deicing of the wings of an aircraft. In another aspect of the present invention, a method is provided wherein the adhesion of a solid material to the detecting element can be measured using the apparatus of the invention.

  4. Method of enhancing radiation response of radiation detection materials

    DOEpatents

    Miller, Steven D.

    1997-01-01

    The present invention is a method of increasing radiation response of a radiation detection material for a given radiation signal by first pressurizing the radiation detection material. Pressurization may be accomplished by any means including mechanical and/or hydraulic. In this application, the term "pressure" includes fluid pressure and/or mechanical stress.

  5. Remote sensing image ship target detection method based on visual attention model

    NASA Astrophysics Data System (ADS)

    Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong

    2017-11-01

    Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively, even though the target usually occupies only a small fraction of the image, so they have high computational complexity for large-format visible image data. A bottom-up selective attention mechanism can allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. In view of this, a method of ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces computational complexity while improving detection accuracy, and thus improves the detection efficiency of ship targets in remote sensing images.
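
    The abstract does not state which bottom-up attention model was used, so the following is a minimal sketch of one common choice, a spectral-residual saliency map computed with NumPy/SciPy, used to limit the search to a small set of candidate regions instead of sliding a window over the whole image. The function name, smoothing sizes and the 0.5 threshold in the usage comment are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_residual_saliency(gray: np.ndarray) -> np.ndarray:
    """Bottom-up saliency via the spectral-residual approach: keep only the
    part of the log-amplitude spectrum that deviates from its local average,
    which tends to highlight small, rare objects such as ships."""
    f = np.fft.fft2(gray.astype(np.float64))
    log_amp = np.log1p(np.abs(f))
    phase = np.angle(f)
    residual = log_amp - uniform_filter(log_amp, size=3)
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    saliency = gaussian_filter(saliency, sigma=3)
    return (saliency - saliency.min()) / (np.ptp(saliency) + 1e-12)

# Usage: restrict the ship search to salient pixels only, e.g.
#   candidates = spectral_residual_saliency(image) > 0.5
```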

  6. Pornographic information of Internet views detection method based on the connected areas

    NASA Astrophysics Data System (ADS)

    Wang, Huibai; Fan, Ajie

    2017-01-01

    Online porn video broadcasting and downloading is now widespread. In view of this widespread Internet pornography, this paper proposes a new method of pornographic video detection based on connected areas. First, the video is decoded into a series of static images and skin color is detected on the extracted key frames. If the skin-color area reaches a certain threshold, the AdaBoost algorithm is used to detect human faces. Finally, the connectivity between the detected face and the large skin-color areas is judged to determine whether a sensitive region is present. The experimental results show that the method can effectively filter out non-pornographic videos that contain people wearing little clothing. This method improves the efficiency and reduces the workload of detection.
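
    A minimal sketch of the first two stages described above (skin-colour segmentation on a key frame, followed by face detection when the skin area is large), using OpenCV. The YCrCb thresholds, the 5% area criterion and the Haar cascade file are illustrative assumptions, not values taken from the paper; the final connectivity judgement between face and skin regions is not shown.

```python
import cv2
import numpy as np

def skin_and_face_regions(frame_bgr):
    """Return large skin-colour connected components and detected faces.
    The YCrCb thresholds, 5% area criterion and cascade file are illustrative."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb,
                       np.array([0, 133, 77], np.uint8),
                       np.array([255, 173, 127], np.uint8))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(skin)
    large = [stats[i] for i in range(1, n)
             if stats[i, cv2.CC_STAT_AREA] > 0.05 * skin.size]

    faces = []
    if large:  # only run the AdaBoost (Haar cascade) detector when needed
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return large, faces
```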

  7. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  8. "RCL-Pooling Assay": A Simplified Method for the Detection of Replication-Competent Lentiviruses in Vector Batches Using Sequential Pooling.

    PubMed

    Corre, Guillaume; Dessainte, Michel; Marteau, Jean-Brice; Dalle, Bruno; Fenard, David; Galy, Anne

    2016-02-01

    Nonreplicative recombinant HIV-1-derived lentiviral vectors (LV) are increasingly used in gene therapy of various genetic diseases, infectious diseases, and cancer. Before they are used in humans, preparations of LV must undergo extensive quality control testing. In particular, testing of LV must demonstrate the absence of replication-competent lentiviruses (RCL) with suitable methods, on representative fractions of vector batches. Current methods based on cell culture are challenging because high titers of vector batches translate into high volumes of cell culture to be tested in RCL assays. As vector batch size and titers are continuously increasing because of the improvement of production and purification methods, it became necessary for us to modify the current RCL assay based on the detection of p24 in cultures of indicator cells. Here, we propose a practical optimization of this method using a pairwise pooling strategy enabling easier testing of higher vector inoculum volumes. These modifications significantly decrease material handling and operator time, leading to a cost-effective method, while maintaining optimal sensitivity of the RCL testing. This optimized "RCL-pooling assay" improves the feasibility of the quality control of large-scale batches of clinical-grade LV while maintaining the same sensitivity.

  9. Current and Prospective Methods for Plant Disease Detection

    PubMed Central

    Fang, Yi; Ramasamy, Ramaraja P.

    2015-01-01

    Food losses due to crop infections from pathogens such as bacteria, viruses and fungi have been persistent issues in agriculture for centuries across the globe. In order to minimize the disease-induced damage in crops during growth, harvest and postharvest processing, as well as to maximize productivity and ensure agricultural sustainability, advanced disease detection and prevention in crops are imperative. This paper reviews the direct and indirect disease identification methods currently used in agriculture. Laboratory-based techniques such as polymerase chain reaction (PCR), immunofluorescence (IF), fluorescence in-situ hybridization (FISH), enzyme-linked immunosorbent assay (ELISA), flow cytometry (FCM) and gas chromatography-mass spectrometry (GC-MS) are some of the direct detection methods. Indirect methods include thermography, fluorescence imaging and hyperspectral techniques. Finally, the review also provides a comprehensive overview of biosensors based on highly selective bio-recognition elements such as enzymes, antibodies, DNA/RNA and bacteriophages as a new tool for the early identification of crop diseases. PMID:26287253

  10. Infrared video based gas leak detection method using modified FAST features

    NASA Astrophysics Data System (ADS)

    Wang, Min; Hong, Hanyu; Huang, Likun

    2018-03-01

    In order to detect invisible leaking gas, which is usually dangerous and can easily lead to fire or explosion, in a timely manner, many new technologies have arisen in recent years, among which infrared video based gas leak detection is widely recognized as a viable tool. However, with existing infrared video based gas leak detection methods, all moving regions of a video frame can be detected as leaking gas regions, without discriminating the property of each detected region; for example, a walking person in a video frame may also be detected as gas. To solve this problem, we propose a novel infrared video based gas leak detection method that is able to effectively suppress strong motion disturbances. Firstly, a Gaussian mixture model (GMM) is used to establish the background model. Then, based on the observation that the shapes of gas regions differ from those of most rigid moving objects, we modify the Features from Accelerated Segment Test (FAST) algorithm and use the modified FAST (mFAST) features to describe each connected component. In view of the fact that the statistical properties of the mFAST features extracted from gas regions differ from those of other motion regions, we propose the Pixel-Per-Points (PPP) condition to further select candidate connected components. Experimental results show that the algorithm is able to effectively suppress most strong motion disturbances and achieve real-time leaking gas detection.
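
    A minimal sketch of the pipeline's first stages under stated assumptions: GMM background subtraction (OpenCV's MOG2) followed by a corner-density test per connected component. Standard FAST features stand in for the paper's modified FAST (mFAST), and the ratio test below is only a rough stand-in for the Pixel-Per-Points (PPP) condition; all thresholds are placeholders.

```python
import cv2
import numpy as np

bg_model = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16,
                                              detectShadows=False)
fast = cv2.FastFeatureDetector_create(threshold=15)

def candidate_gas_regions(frame_gray, min_area=200, max_corners_per_px=0.02):
    """Foreground components whose corner density is low. Diffuse gas plumes
    tend to produce far fewer corner-like features than rigid moving objects
    (people, vehicles), so a low corners/area ratio marks a gas-like region."""
    fg = bg_model.apply(frame_gray)
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(fg)
    keypoints = fast.detect(frame_gray, fg)
    regions = []
    for i in range(1, n):
        area = stats[i, cv2.CC_STAT_AREA]
        if area < min_area:
            continue
        corners = sum(1 for kp in keypoints
                      if labels[int(kp.pt[1]), int(kp.pt[0])] == i)
        if corners / area < max_corners_per_px:
            regions.append(stats[i])
    return regions
```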

  11. Method for the detection of Salmonella enterica serovar Enteritidis

    DOEpatents

    Agron, Peter G.; Andersen, Gary L.; Walker, Richard L.

    2008-10-28

    Described herein is the identification of a novel Salmonella enterica serovar Enteritidis locus that serves as a marker for DNA-based identification of this bacterium. In addition, three primer pairs derived from this locus that may be used in a nucleotide detection method to detect the presence of the bacterium are also disclosed herein.

  12. Marine Biotoxins: Occurrence, Toxicity, and Detection Methods

    NASA Astrophysics Data System (ADS)

    Asakawa, M.

    2017-04-01

    This review summarizes the role of marine organisms as vectors of marine biotoxins and discusses the need for surveillance to protect public health and ensure the quality of seafood. I. Paralytic shellfish poison (PSP) and PSP-bearing organisms: PSP is produced by toxic dinoflagellate species belonging to the genera Alexandrium, Gymnodinium, and Pyrodinium. Traditionally, PSP monitoring programs have only considered filter-feeding molluscs that concentrate these toxic algae; however, increasing attention is now being paid to higher-order predators that carry PSP, such as carnivorous gastropods and crustaceans. II. Tetrodotoxin (TTX) and TTX-bearing organisms: TTX is the most common natural marine toxin causing food poisonings in Japan and poses a serious public health risk. TTX was long believed to be present only in pufferfish. However, TTX was detected in the eggs of the California newt Taricha torosa in 1964, and it has since been detected in a wide variety of species belonging to several different phyla. In this study, the main toxic components in the highly toxic ribbon worm Cephalothrix simula and the greater blue-ringed octopus Hapalochlaena lunulata from Japan were purified and analysed.

  13. Immunochemical Detection Methods for Gluten in Food Products: Where Do We Go from Here?

    PubMed

    Slot, I D Bruins; van der Fels-Klerx, H J; Bremer, M G E G; Hamer, R J

    2016-11-17

    Accurate and reliable quantification methods for gluten in food are necessary to ensure proper product labeling and thus safeguard the gluten sensitive consumer against exposure. Immunochemical detection is the method of choice, as it is sensitive, rapid and relatively easy to use. Although a wide range of detection kits are commercially available, there are still many difficulties in gluten detection that have not yet been overcome. This review gives an overview of the currently commercially available immunochemical detection methods, and discusses the problems that still exist in gluten detection in food. The largest problems are encountered in the extraction of gluten from food matrices, the choice of epitopes targeted by the detection method, and the use of a standardized reference material. By comparing the available techniques with the unmet needs in gluten detection, the possible benefit of a new multiplex immunoassay is investigated. This detection method would allow for the detection and quantification of multiple harmful gluten peptides at once and would, therefore, be a logical advancement in gluten detection in food.

  14. Water Quality Evaluation of the Yellow River Basin Based on Gray Clustering Method

    NASA Astrophysics Data System (ADS)

    Fu, X. Q.; Zou, Z. H.

    2018-03-01

    The water quality of 12 monitoring sections in the Yellow River Basin was evaluated comprehensively by the grey clustering method, based on water quality monitoring data from the Ministry of Environmental Protection of China for May 2016 and the environmental quality standard for surface water. The results reflect the water quality of the Yellow River Basin objectively. Furthermore, the evaluation results are basically the same as those obtained with the fuzzy comprehensive evaluation method. The results also show that the overall water quality of the Yellow River Basin is good and consistent with the actual situation of the basin. Overall, the grey clustering method for water quality evaluation is reasonable and feasible, and it is also convenient to calculate.
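
    A minimal sketch of grey clustering with centre-point triangular whitenization weight functions, which is one common formulation of the grey clustering evaluation described above; the class centres, indicator weights and sample values below are placeholders, not the paper's data or the surface-water standard limits.

```python
import numpy as np

def grey_cluster(sample, centres, weights):
    """Grey clustering with centre-point triangular whitenization functions.
    sample[j]     : measured value of indicator j (e.g. a pollutant concentration)
    centres[k, j] : centre of grey class k (water-quality grade k) for indicator j,
                    increasing with k for each indicator
    weights[j]    : clustering weight of indicator j (weights sum to 1)
    Returns the index of the grade with the largest clustering coefficient."""
    n_class, n_ind = centres.shape
    sigma = np.zeros(n_class)
    for k in range(n_class):
        for j in range(n_ind):
            x, c = sample[j], centres[k, j]
            if k == 0 and x <= c:                    # below the best grade's centre
                f = 1.0
            elif k == n_class - 1 and x >= c:        # above the worst grade's centre
                f = 1.0
            elif k > 0 and centres[k - 1, j] <= x <= c:
                f = (x - centres[k - 1, j]) / (c - centres[k - 1, j])
            elif k < n_class - 1 and c < x <= centres[k + 1, j]:
                f = (centres[k + 1, j] - x) / (centres[k + 1, j] - c)
            else:
                f = 0.0
            sigma[k] += f * weights[j]
    return int(np.argmax(sigma)), sigma

# Placeholder centres for two indicators over three grades (I-III):
centres = np.array([[1.0, 0.15],
                    [3.0, 0.50],
                    [6.0, 1.00]])
grade, coeffs = grey_cluster(np.array([2.5, 0.4]), centres, np.array([0.5, 0.5]))
```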

  15. Fluorescence-based methods for the detection of pressure-induced spore germination and inactivation

    NASA Astrophysics Data System (ADS)

    Baier, Daniel; Reineke, Kai; Doehner, Isabel; Mathys, Alexander; Knorr, Dietrich

    2011-03-01

    The application of high pressure (HP) provides an opportunity for the non-thermal preservation of high-quality foods, in which highly resistant bacterial endospores play an important role. It is known that the germination of spores can be initiated by the application of HP. Moreover, the resistance properties of spores are highly dependent on the physiological states that they pass through during germination. To distinguish between different physiological states and to detect the amount of germinated spores after HP treatments, two fluorescence-based methods were applied. A flow cytometric method using double staining with SYTO 16 as an indicator for germination and propidium iodide as an indicator for membrane damage was used to detect the different physiological states of the spores. During the first step of germination, the spore-specific dipicolinic acid (DPA) is released [P. Setlow, Spore germination, Curr. Opin. Microbiol. 6 (2003), pp. 550-556]. DPA reacts with added terbium to form a distinctive fluorescent complex. After measuring the fluorescence intensity at a 270 nm excitation wavelength in a fluorescence spectrophotometer, the amount of germinated spores can be determined. Spores of Bacillus subtilis were treated at pressures from 150 to 600 MPa and temperatures from 37 °C to 60 °C in 0.05 M ACES buffer solution (pH 7) for dwell times of up to 2 h. During the HP treatments, inactivation of up to 2 log10 cycles and thermally sensitive populations of up to 4 log10 cycles could be detected by plate counts. With an increasing number of thermally sensitive spores, an increased proportion of spores in germinated states was detected by flow cytometry. The released amount of DPA also increased during the dwell times. Moreover, a clear pressure-temperature-time dependency was shown by screening different conditions. The fluorescence-based measurement of the released DPA can provide the opportunity of an online monitoring of the germination of spores under HP inside

  16. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  17. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  18. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    USGS Publications Warehouse

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    volume grab samples were collected for direct-plating analyses for bacterial indicators and coliphage. After filtration, the viruses were eluted from the filter and further concentrated. The final concentrated sample volume (FCSV) was used for enteric virus analysis by use of two methods: cell culture and a molecular method, polymerase chain reaction (PCR). Quantitative PCR (qPCR) for DNA viruses and quantitative reverse-transcriptase PCR (qRT-PCR) for RNA viruses were used in this study. To support data interpretations, the assay limit of detection (ALOD) was set for each virus assay and used to determine sample reporting limits (SRLs). For qPCR and qRT-PCR the ALOD was an estimated value because it was not established according to standard method detection limit procedures. The SRLs were different for each sample because effective sample volumes (the volume of the original sample that was actually used in each analysis) were different for each sample. Effective sample volumes were much less than the original sample volumes because of reductions from processing steps and (or) from when dilutions were made to minimize the effects from PCR-inhibiting substances. Codes were used to further qualify the virus data and indicate the level of uncertainty associated with each measurement. Quality-control samples were used to support data interpretations. Field and laboratory blanks for bacteria, coliphage, and enteric viruses were all below detection, indicating that it was unlikely that samples were contaminated from equipment or processing procedures. The absolute value log differences (AVLDs) between concurrent replicate pairs were calculated to identify the variability associated with each measurement. For bacterial indicators and coliphage, the AVLD results indicated that concentrations <10 colony-forming units or plaque-forming units per 100 mL can differ between replicates by as much as 1 log, whereas higher concentrations can differ by as much as 0.3 log. The AVLD

  19. Tweets about hospital quality: a mixed methods study

    PubMed Central

    Greaves, Felix; Laverty, Antony A; Cano, Daniel Ramirez; Moilanen, Karo; Pulman, Stephen; Darzi, Ara; Millett, Christopher

    2014-01-01

    Background Twitter is increasingly being used by patients to comment on their experience of healthcare. This may provide information for understanding the quality of healthcare providers and improving services. Objective To examine whether tweets sent to hospitals in the English National Health Service contain information about quality of care. To compare sentiment on Twitter about hospitals with established survey measures of patient experience and standardised mortality rates. Design A mixed methods study including a quantitative analysis of all 198 499 tweets sent to English hospitals over a year and a qualitative directed content analysis of 1000 random tweets. Twitter sentiment and conventional quality metrics were compared using Spearman's rank correlation coefficient. Key results 11% of tweets to hospitals contained information about care quality, with the most frequent topic being patient experience (8%). Comments on effectiveness or safety of care were present, but less common (3%). 77% of tweets about care quality were positive in tone. Other topics mentioned in tweets included messages of support to patients, fundraising activity, self-promotion and dissemination of health information. No associations were observed between Twitter sentiment and conventional quality metrics. Conclusions Only a small proportion of tweets directed at hospitals discuss quality of care and there was no clear relationship between Twitter sentiment and other measures of quality, potentially limiting Twitter as a medium for quality monitoring. However, tweets did contain information useful to target quality improvement activity. Recent enthusiasm by policy makers to use social media as a quality monitoring and improvement tool needs to be carefully considered and subjected to formal evaluation. PMID:24748372

  20. Improvement of automatic hemorrhage detection methods using brightness correction on fundus images

    NASA Astrophysics Data System (ADS)

    Hatanaka, Yuji; Nakagawa, Toshiaki; Hayashi, Yoshinori; Kakogawa, Masakatsu; Sawada, Akira; Kawase, Kazuhide; Hara, Takeshi; Fujita, Hiroshi

    2008-03-01

    We have been developing several automated methods for detecting abnormalities in fundus images. The purpose of this study is to improve our automated hemorrhage detection method to help diagnose diabetic retinopathy. We propose a new method for preprocessing and false-positive elimination in the present study. The brightness of the fundus image was adjusted using a nonlinear curve applied to the brightness values of the hue-saturation-value (HSV) space. In order to emphasize brown regions, gamma correction was performed on each of the red, green, and blue channels. Subsequently, the histograms of the red, green, and blue channels were stretched. After that, the hemorrhage candidates were detected: the brown regions indicate hemorrhages and blood vessels, and their candidates were detected using density analysis. We then removed large candidates such as blood vessels. Finally, false positives were removed by using a 45-feature analysis. To evaluate the new method for the detection of hemorrhages, we examined 125 fundus images, including 35 images with hemorrhages and 90 normal images. The sensitivity and specificity for the detection of abnormal cases were 80% and 88%, respectively. These results indicate that the new method may effectively improve the performance of our computer-aided diagnosis system for hemorrhages.
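
    A minimal sketch of the per-channel preprocessing steps described above (gamma correction followed by histogram stretching) using NumPy. The gamma value and the 1st/99th-percentile stretch limits are illustrative assumptions; the candidate detection, blood-vessel removal and 45-feature false-positive analysis are not shown.

```python
import numpy as np

def gamma_correct(channel: np.ndarray, gamma: float = 1.5) -> np.ndarray:
    """Apply gamma correction to one 8-bit channel (the gamma value is illustrative)."""
    x = channel.astype(np.float64) / 255.0
    return np.uint8(np.clip(255.0 * x ** gamma, 0, 255))

def stretch_histogram(channel: np.ndarray) -> np.ndarray:
    """Linearly stretch the channel histogram to the full 0-255 range."""
    lo, hi = np.percentile(channel, (1, 99))
    out = (channel.astype(np.float64) - lo) / max(hi - lo, 1e-6) * 255.0
    return np.uint8(np.clip(out, 0, 255))

# rgb is an HxWx3 fundus image; emphasise brown (hemorrhage-like) regions by
# correcting and stretching each colour channel independently:
#   preprocessed = np.dstack([stretch_histogram(gamma_correct(rgb[..., c]))
#                             for c in range(3)])
```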

  1. Initial Results in Using a Self-Coherence Method for Detecting Sustained Oscillations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Dagle, Jeffery E.

    2015-01-01

    This paper develops a self-coherence method for detecting sustained oscillations using phasor measurement unit (PMU) data. Sustained oscillations decrease system performance and introduce potential reliability issues. Timely detection of the oscillations at an early stage provides the opportunity for taking remedial action. Using high-speed time-synchronized PMU data, the paper details a self-coherence method for detecting sustained oscillations even when the oscillation amplitude is lower than the ambient noise. Simulation and field measurement data are used to evaluate the proposed method's performance. It is shown that the proposed method can detect sustained oscillations and estimate oscillation frequencies at a low signal-to-noise ratio. Comparison with a power spectral density method also shows that the proposed self-coherence method performs better. Index terms: coherence, power spectral density, phasor measurement unit (PMU), oscillations, power system dynamics.
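
    One way to realise a self-coherence test is to compute the magnitude-squared coherence between a PMU channel and a time-delayed copy of itself: a sustained oscillation remains coherent with its own past, while ambient noise does not. The sketch below, using scipy.signal.coherence, illustrates that idea only; the delay, window length and synthetic test signal are assumptions, not the settings used in the paper.

```python
import numpy as np
from scipy.signal import coherence

def self_coherence(x, fs=30.0, delay_s=10.0, nperseg=512):
    """Magnitude-squared coherence between a PMU signal and a delayed copy of
    itself. A sustained oscillation stays coherent with its own past, while
    ambient noise does not, so peaks flag oscillation frequencies."""
    d = int(delay_s * fs)
    f, cxy = coherence(x[:-d], x[d:], fs=fs, nperseg=nperseg)
    return f, cxy

# Synthetic test: a low-amplitude 0.8 Hz oscillation in ambient noise,
# sampled at a typical PMU reporting rate of 30 frames/s.
fs = 30.0
t = np.arange(0, 600.0, 1.0 / fs)
x = 0.3 * np.sin(2 * np.pi * 0.8 * t) + np.random.randn(t.size)
f, cxy = self_coherence(x, fs=fs)
print("most coherent frequency (Hz):", f[np.argmax(cxy)])
```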

  2. Comparison of detection limits in environmental analysis--is it possible? An approach on quality assurance in the lower working range by verification.

    PubMed

    Geiss, S; Einax, J W

    2001-07-01

    The detection limit, reporting limit and limit of quantitation are analytical parameters which describe the power of analytical methods. These parameters are used internally for quality assurance and externally for comparisons between laboratories, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms have been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose an order on this variety of terms, this paper aims to provide a practical proposal for answering the analyst's main questions concerning the quality measures above. These main questions and the related parameters are explained and graphically demonstrated. Estimation and verification of these parameters are the two steps needed to obtain realistic measures. A rule for practical verification is given in a table, from which the analyst can read what to measure, what to estimate and which criteria have to be fulfilled. In this manner, the verified parameters detection limit, reporting limit and limit of quantitation become comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
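
    For illustration only, the widely used calibration-curve approach estimates the detection and quantitation limits from the residual scatter of a linear calibration (roughly LOD = 3.3*s/slope and LOQ = 10*s/slope). This is not the specific estimation-and-verification procedure proposed in the paper, and the calibration data in the example are hypothetical.

```python
import numpy as np

def lod_loq_from_calibration(conc, signal):
    """Estimate limits from a linear calibration: LOD ~ 3.3*s_res/slope and
    LOQ ~ 10*s_res/slope, where s_res is the residual standard deviation of
    the fit (an ICH-style shortcut shown only for illustration)."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s_res = np.sqrt(np.sum(residuals ** 2) / (conc.size - 2))
    return 3.3 * s_res / slope, 10.0 * s_res / slope

# Hypothetical trace-analysis calibration (concentration in ug/L vs response):
lod, loq = lod_loq_from_calibration([0.5, 1, 2, 4, 8],
                                    [2.1, 4.0, 8.3, 15.9, 32.4])
```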

  3. Application of laser to nondestructive detection of fruit quality

    NASA Astrophysics Data System (ADS)

    Li, Jing; Xue, Long; Liu, Muhua; Li, Zhanlong; Yang, Yong

    2008-12-01

    In this study, a hyperspectral imaging system using a laser source was developed and two experiments were carried out. The first experiment was the detection of pesticide residue on navel orange surfaces. We calculated the mean intensity of regions of interest to plot the curves between 629 nm and 638 nm. The analysis of the mean intensity curves showed that the mean intensity can be described by a characteristic Gaussian curve equation. The coefficients a in the characteristic equations of the 0%, 0.1% and 0.5% fenvalerate residue images were more than 2400, 1570-2400 and less than 1570, respectively. We therefore suggest using the equation coefficient a to detect pesticide residue on navel orange surfaces. The second experiment was the prediction of firmness, sugar content and vitamin C content of kiwi fruit. The optimal wavelength ranges of the linear regression prediction models for kiwi fruit firmness, sugar content and vitamin C content were 680-711 nm, 674-708 nm and 669-701 nm, respectively. The correlation coefficients (R) of the prediction models for firmness, sugar content and vitamin C content were 0.898, 0.932 and 0.918. The mean errors of the validation results were 0.35 × 10^5 Pa, 0.32% Brix and 7 mg/100 g. The experimental results indicate that a hyperspectral imaging system based on a laser source can detect fruit quality effectively.
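
    A minimal sketch of the residue classification described for the first experiment: fit a Gaussian to the 629-638 nm mean-intensity curve with scipy.optimize.curve_fit and classify the fenvalerate level from the fitted coefficient a using the thresholds quoted above (about 2400 and 1570). The exact functional form and fitting procedure used in the paper are assumed.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def classify_fenvalerate_residue(wavelengths_nm, mean_intensity):
    """Fit a Gaussian to the 629-638 nm mean-intensity curve and classify the
    residue level from coefficient a, using the thresholds quoted above."""
    p0 = [mean_intensity.max(),
          wavelengths_nm[np.argmax(mean_intensity)],
          3.0]
    (a, _, _), _ = curve_fit(gaussian, wavelengths_nm, mean_intensity, p0=p0)
    if a > 2400:
        return "approx. 0% residue"
    if a >= 1570:
        return "approx. 0.1% residue"
    return "approx. 0.5% residue"
```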

  4. Method and system for detecting an explosive

    DOEpatents

    Reber, Edward L.; Rohde, Kenneth W.; Blackwood, Larry G.

    2010-12-07

    A method and system for detecting at least one explosive in a vehicle using a neutron generator and a plurality of NaI detectors. Spectra read from the detectors are calibrated by performing Gaussian peak fitting to define peak regions, locating a Na peak and an annihilation peak doublet, assigning a predetermined energy level to one peak in the doublet, and predicting a hydrogen peak location based on the location of at least one peak of the doublet. The spectra are gain shifted to a common calibration, summed for respective groups of NaI detectors, and nitrogen detection analysis is performed on the summed spectra for each group.

  5. Development and validation of a multiplex real-time PCR method to simultaneously detect 47 targets for the identification of genetically modified organisms.

    PubMed

    Cottenet, Geoffrey; Blancpain, Carine; Sonnard, Véronique; Chuah, Poh Fong

    2013-08-01

    Considering the increase in the total cultivated land area dedicated to genetically modified organisms (GMO), consumers' perception of GMO and the need to comply with various local GMO legislations, efficient and accurate analytical methods are needed for their detection and identification. Considered the gold standard for GMO analysis, real-time polymerase chain reaction (RTi-PCR) technology was optimised to produce a high-throughput GMO screening method. Based on 24 simultaneous multiplex RTi-PCR reactions run on a ready-to-use 384-well plate, this new procedure allows the detection and identification of 47 targets in seven samples in duplicate. To comply with GMO analytical quality requirements, a negative and a positive control were analysed in parallel. In addition, an internal positive control was included in each reaction well for the detection of potential PCR inhibition. Tested on non-GM materials, on different GM events and on proficiency test samples, the method offered high specificity and sensitivity, with an absolute limit of detection between 1 and 16 copies depending on the target. Easy to use, fast and cost efficient, this multiplex approach fits the purpose of GMO testing laboratories.

  6. Dynamic Network Change Detection

    DTIC Science & Technology

    2008-12-01

    These methods are used in quality engineering to detect small changes in a process (Montgomery, 1991; Ryan, 2000).

  7. Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data.

    PubMed

    Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing

    2017-05-15

    Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. Our proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. From these results the ISMLP method exhibits better performance overall than other methods and can be effectively applied to hyperspectral sea ice detection.

  8. Sea Ice Detection Based on an Improved Similarity Measurement Method Using Hyperspectral Data

    PubMed Central

    Han, Yanling; Li, Jue; Zhang, Yun; Hong, Zhonghua; Wang, Jing

    2017-01-01

    Hyperspectral remote sensing technology can acquire nearly continuous spectrum information and rich sea ice image information, thus providing an important means of sea ice detection. However, the correlation and redundancy among hyperspectral bands reduce the accuracy of traditional sea ice detection methods. Based on the spectral characteristics of sea ice, this study presents an improved similarity measurement method based on linear prediction (ISMLP) to detect sea ice. First, the first original band with a large amount of information is determined based on mutual information theory. Subsequently, a second original band with the least similarity is chosen by the spectral correlation measuring method. Finally, subsequent bands are selected through the linear prediction method, and a support vector machine classifier model is applied to classify sea ice. In experiments performed on images of Baffin Bay and Bohai Bay, comparative analyses were conducted to compare the proposed method and traditional sea ice detection methods. Our proposed ISMLP method achieved the highest classification accuracies (91.18% and 94.22%) in both experiments. From these results the ISMLP method exhibits better performance overall than other methods and can be effectively applied to hyperspectral sea ice detection. PMID:28505135

  9. ELISA Methods for the Detection of Ebolavirus Infection.

    PubMed

    Cross, Robert W; Ksiazek, Thomas G

    2017-01-01

    Ebola viruses are high-priority pathogens first discovered in rural Africa associated with sporadic outbreaks of severe hemorrhagic disease in humans and nonhuman primates. Little is known about the disease ecology or the prevalence of past exposure of human populations to any of the five species of the genus Ebolavirus. The use of immunologic means of detection for either virus antigens or the host's immune response to antigen associated with prior infections offers a powerful approach at understanding the epidemiology and epizootiology of these agents. Here we describe methods for preparing antigen detection sandwich enzyme-linked immunosorbent assays (ELISAs) as well as IgG and IgM ELISAs for the detection of ebolavirus antigens or antibodies in biological samples.

  10. A scale-invariant change detection method for land use/cover change research

    NASA Astrophysics Data System (ADS)

    Xing, Jin; Sieber, Renee; Caelli, Terrence

    2018-07-01

    Land Use/Cover Change (LUCC) detection relies increasingly on comparing remote sensing images with different spatial and spectral scales. Based on scale-invariant image analysis algorithms from computer vision, we propose a scale-invariant LUCC detection method to identify changes from scale-heterogeneous images. This method is composed of an entropy-based spatial decomposition, two scale-invariant feature extraction methods, the Maximally Stable Extremal Region (MSER) and Scale-Invariant Feature Transformation (SIFT) algorithms, a spatial regression voting method to integrate the MSER and SIFT results, a Markov Random Field-based smoothing method, and a support vector machine classification method to assign LUCC labels. We test the scale invariance of our new method with a LUCC case study in Montreal, Canada, 2005-2012. We found that the scale-invariant LUCC detection method provides accuracy similar to the resampling-based approach, but avoids the LUCC distortion incurred by resampling.
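
    A minimal sketch of extracting the two scale-invariant representations the method combines (MSER regions and SIFT keypoints) with OpenCV; it assumes an OpenCV build in which cv2.SIFT_create is available. The entropy-based decomposition, spatial-regression voting, MRF smoothing and SVM labelling stages are not reproduced.

```python
import cv2

def scale_invariant_features(gray):
    """Extract the two scale-invariant representations the method combines:
    Maximally Stable Extremal Regions (MSER) and SIFT keypoints/descriptors."""
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)

    sift = cv2.SIFT_create()          # requires OpenCV >= 4.4 (or contrib builds)
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    return regions, keypoints, descriptors

# Features would be extracted from the 2005 and 2012 images separately
# (at their native resolutions) and then matched/voted on to flag change.
```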

  11. Quality control for normal liquid-based cytology: Rescreening, high-risk HPV targeted reviewing and/or high-risk HPV detection?

    PubMed Central

    Depuydt, Christophe E; Arbyn, Marc; Benoy, Ina H; Vandepitte, Johan; Vereecken, Annie J; Bogers, Johannes J

    2009-01-01

    The objective of this prospective study was to compare the number of CIN2+ cases detected in negative cytology by different quality control (QC) methods. Full rescreening, high-risk (HR) human papillomavirus (HPV)-targeted reviewing and HR HPV detection were compared. Randomly selected negative cytology detected by BD FocalPoint™ (NFR), by guided screening of the prescreened slides which needed further review (GS) and by manual screening (MS) was used. A 3-year follow-up period was available. Full rescreening of cytology detected only 23.5% of CIN2+ cases, whereas cytological rescreening of oncogenic-positive slides (high-risk HPV-targeted reviewing) detected 7 of 17 CIN2+ cases (41.2%). Quantitative real-time PCR for 15 oncogenic HPV types detected all CIN2+ cases. The relative sensitivity to detect histological CIN2+ was 0.24 for full rescreening, 0.41 for HR-targeted reviewing and 1.00 for HR HPV detection. In more than half of the reviewed negative cytological preparations associated with histological CIN2+ cases, no morphologically abnormal cells were detected despite a positive HPV test. The visual cut-off for the detection of abnormal cytology was established at 6.5 HR HPV copies/cell. High-risk HPV detection has a higher yield for the detection of CIN2+ cases compared with manual screening followed by 5% full review, or compared with targeted reviewing of smears positive for oncogenic HPV types, and shows diagnostic properties that support its use as a QC procedure in cytologic laboratories. PMID:18544049

  12. Exploring three faint source detections methods for aperture synthesis radio images

    NASA Astrophysics Data System (ADS)

    Peracaula, M.; Torrent, A.; Masias, M.; Lladó, X.; Freixenet, J.; Martí, J.; Sánchez-Sutil, J. R.; Muñoz-Arjonilla, A. J.; Paredes, J. M.

    2015-04-01

    Wide-field radio interferometric images often contain a large population of faint compact sources. Due to their low intensity-to-noise ratio, these objects can easily be missed by automated detection methods, which have classically been based on thresholding techniques after local noise estimation. The aim of this paper is to present and analyse the performance of several alternative or complementary techniques to thresholding. We compare three different algorithms to increase the detection rate of faint objects. The first technique consists of combining wavelet decomposition with local thresholding. The second technique is based on the structural behaviour of the neighbourhood of each pixel. Finally, the third algorithm uses local features extracted from a bank of filters and a boosting classifier to perform the detections. The methods' performances are evaluated using simulations and radio mosaics from the Giant Metrewave Radio Telescope and the Australia Telescope Compact Array. We show that the new methods perform better at detecting faint sources in radio interferometric images than well-known state-of-the-art methods such as SExtractor, SAD and DUCHAMP.

  13. An affordable and easy-to-use diagnostic method for keratoconus detection using a smartphone

    NASA Astrophysics Data System (ADS)

    Askarian, Behnam; Tabei, Fatemehsadat; Askarian, Amin; Chong, Jo Woon

    2018-02-01

    Smartphones have recently been used for disease diagnosis and healthcare. In this paper, we propose a novel, affordable diagnostic method for detecting keratoconus using a smartphone. Keratoconus is usually detected in clinics with ophthalmic devices, which are large, expensive, not portable, and need to be operated by trained technicians. In contrast, the proposed smartphone-based eye disease detection method is small, affordable, and portable, and it can be operated by patients in a convenient way. The results show that the proposed method detects severe, advanced, and moderate keratoconus with accuracies of 93%, 86%, and 67%, respectively. Given its convenience and these accuracies, the proposed method is expected to be applied to detecting keratoconus at an earlier stage in an affordable way.

  14. Land use change detection with LANDSAT-2 data for monitoring and predicting regional water quality degradation. [Arkansas

    NASA Technical Reports Server (NTRS)

    Macdonald, H.; Steele, K. (Principal Investigator); Waite, W.; Rice, R.; Shinn, M.; Dillard, T.; Petersen, C.

    1977-01-01

    The author has identified the following significant results. Comparison between LANDSAT 1 and 2 imagery of Arkansas provided evidence of significant land use changes during the 1972-75 time period. Analysis of Arkansas historical water quality information has shown conclusively that whereas point source pollution generally can be detected by use of water quality data collected by state and federal agencies, sampling methodologies for nonpoint source contamination attributable to surface runoff are totally inadequate. The expensive undertaking of monitoring all nonpoint sources for numerous watersheds can be lessened by implementing LANDSAT change detection analyses.

  15. An infrared spectroscopy method to detect ammonia in gastric juice.

    PubMed

    Giovannozzi, Andrea M; Pennecchi, Francesca; Muller, Paul; Balma Tivola, Paolo; Roncari, Silvia; Rossi, Andrea M

    2015-11-01

    Ammonia in gastric juice is considered a potential biomarker for Helicobacter pylori infection and as a factor contributing to gastric mucosal injury. High ammonia concentrations are also found in patients with chronic renal failure, peptic ulcer disease, and chronic gastritis. Rapid and specific methods for ammonia detection are urgently required by the medical community. Here we present a method to detect ammonia directly in gastric juice based on Fourier transform infrared spectroscopy. The ammonia dissolved in biological liquid samples as ammonium ion was released in air as a gas by the shifting of the pH equilibrium of the ammonium/ammonia reaction and was detected in line by a Fourier transform infrared spectroscopy system equipped with a gas cell for the quantification. The method developed provided high sensitivity and selectivity in ammonia detection both in pure standard solutions and in a simulated gastric juice matrix over the range of diagnostic concentrations tested. Preliminary analyses were also performed on real gastric juice samples from patients with gastric mucosal injury and with symptoms of H. pylori infection, and the results were in agreement with the clinicopathology information. The whole analysis, performed in less than 10 min, can be directly applied on the sample without extraction procedures and it ensures high specificity of detection because of the ammonia fingerprint absorption bands in the infrared spectrum. This method could be easily used with endoscopy instrumentation to provide information in real time and would enable the endoscopist to improve and integrate gastroscopic examinations.

  16. Method and system for detecting explosives

    DOEpatents

    Reber, Edward L [Idaho Falls, ID; Jewell, James K [Idaho Falls, ID; Rohde, Kenneth W [Idaho Falls, ID; Seabury, Edward H [Idaho Falls, ID; Blackwood, Larry G [Idaho Falls, ID; Edwards, Andrew J [Idaho Falls, ID; Derr, Kurt W [Idaho Falls, ID

    2009-03-10

    A method of detecting explosives in a vehicle includes providing a first rack on one side of the vehicle, the rack including a neutron generator and a plurality of gamma ray detectors; providing a second rack on another side of the vehicle, the second rack including a neutron generator and a plurality of gamma ray detectors; providing a control system, remote from the first and second racks, coupled to the neutron generators and gamma ray detectors; using the control system, causing the neutron generators to generate neutrons; and performing gamma ray spectroscopy on spectra read by the gamma ray detectors to look for a signature indicative of presence of an explosive. Various apparatus and other methods are also provided.

  17. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    PubMed

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.

  18. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    The detection of small visible optical space targets is one of the key issues in research on long-range early warning and space debris surveillance. The signal-to-noise ratio (SNR) of the target is very low because of noise introduced by the imaging device itself, and random noise and background movement further increase the difficulty of target detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we establish a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (Scale-Invariant Feature Transform) features from the image frames, calculate the transformation between frames, and use this transformation to compensate for the movement of the whole visual field. Thirdly, the influence of stars is removed using an inter-frame difference method, and a segmentation threshold that separates candidate targets from noise is found using the Otsu method. Finally, we calculate a statistical quantity at every pixel position to judge whether a target is present. Theoretical analysis shows the relationship between false-alarm probability and detection probability at different SNRs. Experimental results show that this method can detect targets efficiently, even when a target passes in front of stars.
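
    A minimal sketch of the motion-compensation and differencing stages described above: SIFT matching between consecutive frames, homography-based warping, and Otsu thresholding of the absolute difference, using OpenCV. The RANSAC tolerance is a placeholder, and the per-pixel statistical detection test applied afterwards is not shown.

```python
import cv2
import numpy as np

def register_and_difference(prev_gray, curr_gray):
    """Compensate whole-field motion by matching SIFT features between frames,
    warp the previous frame onto the current one, and threshold the absolute
    difference with Otsu's method to obtain candidate target pixels."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev_gray, None)
    kp2, des2 = sift.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    warped = cv2.warpPerspective(prev_gray, H,
                                 (curr_gray.shape[1], curr_gray.shape[0]))
    diff = cv2.absdiff(curr_gray, warped)
    _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask  # candidate pixels; the statistical per-pixel test follows
```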

  19. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    NASA Astrophysics Data System (ADS)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided that the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events and is therefore not limited to events occurring in similar regions and with similar focal mechanisms to those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
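
    A minimal sketch of a detection function built as a stack of per-station normalized cross-correlations between precomputed synthetic templates (e.g. strain-Green's-tensor seismograms) and continuous records, with detections declared where the stack exceeds a threshold set relative to its noise level. Array shapes, the MAD-based threshold and the normalization are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np
from scipy.signal import correlate

def detection_function(continuous, templates):
    """Stack per-station normalized cross-correlations between continuous
    records (n_sta x n_samples) and synthetic templates (n_sta x n_tmpl),
    e.g. strain-Green's-tensor seismograms for a trial source location."""
    n_sta, n_tmpl = templates.shape
    stack = np.zeros(continuous.shape[1] - n_tmpl + 1)
    for data, tmpl in zip(continuous, templates):
        cc = correlate(data, tmpl, mode="valid")
        # sliding data energy and template energy for normalization
        energy = np.sqrt(correlate(data ** 2, np.ones(n_tmpl), mode="valid"))
        stack += cc / (energy * np.linalg.norm(tmpl) + 1e-12)
    return stack / n_sta

def detect_events(stack, n_mads=8.0):
    """Sample indices where the detection function exceeds a threshold set
    relative to its own noise level (median absolute deviation)."""
    mad = np.median(np.abs(stack - np.median(stack)))
    return np.flatnonzero(stack > np.median(stack) + n_mads * mad)
```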

  20. Identifying Minefields and Verifying Clearance: Adapting Statistical Methods for UXO Target Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.; O'Brien, Robert F.; Wilson, John E.

    2003-09-01

    It may not be feasible to completely survey large tracts of land suspected of containing minefields. It is desirable to develop a characterization protocol that will confidently identify minefields within these large land tracts if they exist. Naturally, surveying areas of greatest concern and most likely locations would be necessary but will not provide the needed confidence that an unknown minefield had not eluded detection. Once minefields are detected, methods are needed to bound the area that will require detailed mine detection surveys. The US Department of Defense Strategic Environmental Research and Development Program (SERDP) is sponsoring the development of statistical survey methods and tools for detecting potential UXO targets. These methods may be directly applicable to demining efforts. Statistical methods are employed to determine the optimal geophysical survey transect spacing to have confidence of detecting target areas of a critical size, shape, and anomaly density. Other methods under development determine the proportion of a land area that must be surveyed to confidently conclude that there are no UXO present. Adaptive sampling schemes are also being developed as an approach for bounding the target areas. These methods and tools will be presented and the status of relevant research in this area will be discussed.

  1. Detectability of Wellbore CO2 Leakage using the Magnetotelluric Method

    NASA Astrophysics Data System (ADS)

    Yang, X.; Buscheck, T. A.; Mansoor, K.; Carroll, S.

    2016-12-01

    We assessed the effectiveness of the magnetotelluric (MT) method in detecting CO2 and brine leakage through a wellbore, which penetrates a CO2 storage reservoir, into overlying aquifers, 0 to 1720 m in depth, in support of the USDOE National Risk Assessment Partnership (NRAP) monitoring program. Synthetic datasets based on the Kimberlina site in the southern San Joaquin Basin, California were created using CO2 storage reservoir models, wellbore leakage models, and groundwater/geochemical models of the overlying aquifers. The species concentrations simulated with the groundwater/geochemical models were converted into bulk electrical conductivity (EC) distributions as the MT model input. Brine and CO2 leakage into the overlying aquifers increases ion concentrations, and thus results in an EC increase, which may be detected by the MT method. Our objective was to estimate and maximize the probability of leakage detection using the MT method. The MT method is an electromagnetic geophysical technique that images the subsurface EC distribution by measuring natural electric and magnetic fields in the frequency range from 0.01 Hz to 1 kHz with sensors on the ground surface. The ModEM software was used to predict electromagnetic responses from brine and CO2 leakage and to invert synthetic MT data for recovery of subsurface conductivity distribution. We are in the process of building 1000 simulations for ranges of permeability, leakage flux, and hydraulic gradient to study leakage detectability and to develop an optimization method to answer when, where and how an MT monitoring system should be deployed to maximize the probability of leakage detection. This work was sponsored by the USDOE Fossil Energy, National Energy Technology Laboratory, managed by Traci Rodosta and Andrea McNemar. This work was performed under the auspices of the USDOE by LLNL under contract DE-AC52-07NA27344. LLNL IM release number is LLNL-ABS-699276.

  2. Comparison of four DNA extraction methods for the detection of Mycobacterium leprae from Ziehl-Neelsen-stained microscopic slides.

    PubMed

    Ruiz-Fuentes, Jenny Laura; Díaz, Alexis; Entenza, Anayma Elena; Frión, Yahima; Suárez, Odelaisy; Torres, Pedro; de Armas, Yaxsier; Acosta, Lucrecia

    2015-12-01

    The diagnosis of leprosy has been a challenge due to the low sensitivity of conventional methods and the impossibility of culturing the causative organism. In this study, four methods for Mycobacterium leprae nucleic-acid extraction from Ziehl-Neelsen-stained slides (ZNS slides) were compared: phenol/chloroform, Chelex 100 resin, and two commercial kits (Wizard Genomic DNA Purification Kit and QIAamp DNA Mini Kit). DNA was extracted from four groups of slides: a high-codification-slide group (bacteriological index [BI]⩾4), a low-codification-slide group (BI=1), a negative-slide group (BI=0), and a negative-control-slide group (BI=0). Quality DNA was evidenced by the amplification of a specific repetitive element (RLEP) present in M. leprae genomic DNA using a nested polymerase chain reaction. This is the first report comparing four different extraction methods for obtaining M. leprae DNA from ZNS slides in Cuban patients and applying them to molecular diagnosis. Good-quality DNA and positive amplification were detected in the high-codification-slide group with all four methods, while in the low-codification-slide group only the QIAGEN and phenol/chloroform methods obtained amplification of M. leprae. In the negative-slide group, only the QIAGEN method was able to obtain DNA with sufficient quality for positive amplification of the RLEP region. No amplification was observed in the negative-control-slide group with any method. Patients with ZNS-negative slides can still transmit the infection, and molecular methods can help identify and treat them, interrupting the chain of transmission and preventing the onset of disabilities. The ZNS slides can be sent easily to reference laboratories for later molecular analysis, which can be useful not only to improve the diagnosis but also for the application of other molecular techniques. Copyright © 2015 Asian-African Society for Mycobacteriology. Published by Elsevier Ltd. All rights reserved.

  3. Sensitive and specific miRNA detection method using SplintR Ligase

    PubMed Central

    Jin, Jingmin; Vaud, Sophie; Zhelkovsky, Alexander M.; Posfai, Janos; McReynolds, Larry A.

    2016-01-01

    We describe a simple, specific and sensitive microRNA (miRNA) detection method that utilizes Chlorella virus DNA ligase (SplintR® Ligase). This two-step method involves ligation of adjacent DNA oligonucleotides hybridized to a miRNA followed by real-time quantitative PCR (qPCR). SplintR Ligase is 100X faster than either T4 DNA Ligase or T4 RNA Ligase 2 for RNA splinted DNA ligation. Only a 4–6 bp overlap between a DNA probe and miRNA was required for efficient ligation by SplintR Ligase. This property allows more flexibility in designing miRNA-specific ligation probes than methods that use reverse transcriptase for cDNA synthesis of miRNA. The qPCR SplintR ligation assay is sensitive; it can detect a few thousand molecules of miR-122. For miR-122 detection the SplintR qPCR assay, using a FAM labeled double quenched DNA probe, was at least 40× more sensitive than the TaqMan assay. The SplintR method, when coupled with NextGen sequencing, allowed multiplex detection of miRNAs from brain, kidney, testis and liver. The SplintR qPCR assay is specific; individual let-7 miRNAs that differ by one nucleotide are detected. The rapid kinetics and ability to ligate DNA probes hybridized to RNA with short complementary sequences makes SplintR Ligase a useful enzyme for miRNA detection. PMID:27154271

  4. Fatigue Crack Detection via Load-Differential Guided Wave Methods (Preprint)

    DTIC Science & Technology

    2011-11-01

    AFRL-RX-WP-TP-2011-4362. Fatigue Crack Detection via Load-Differential Guided Wave Methods (Preprint). Jennifer E. Michaels, Sang Jun Lee, ... Technical Paper, November 2011. Abstract (fragment): Detection of fatigue cracks originating from fastener holes is an important application for structural health ...

  5. Improvement of seawater salt quality by hydro-extraction and re-crystallization methods

    NASA Astrophysics Data System (ADS)

    Sumada, K.; Dewati, R.; Suprihatin

    2018-01-01

    Indonesia is one of the salt-producing countries that use seawater as the raw material source, so the quality of the salt produced is influenced by the quality of the seawater. The resulting salt typically contains 85-90% NaCl. The Indonesian National Standard (SNI) requires a sodium chloride content of 94.7% (dry basis) for human consumption salt and 98.5% for industrial salt. In this study, chemical-free re-crystallization and hydro-extraction methods were developed. The objective of this research was to choose the better method on the basis of efficiency. The results showed that the re-crystallization method can produce salt with an NaCl content of 99.21%, while the hydro-extraction method reached 99.34% NaCl. The salt produced by both methods can be used as consumption and industrial salt. The hydro-extraction method is more efficient than re-crystallization because re-crystallization requires heat energy.

  6. Optoelectronic method for detection of cervical intraepithelial neoplasia and cervical cancer

    NASA Astrophysics Data System (ADS)

    Pruski, D.; Przybylski, M.; Kędzia, W.; Kędzia, H.; Jagielska-Pruska, J.; Spaczyński, M.

    2011-12-01

    The optoelectronic method is one of the most promising concepts in the biophysical programme of diagnostics of CIN and cervical cancer. The objectives of this work were to evaluate the sensitivity and specificity of the optoelectronic method in the detection of CIN and cervical cancer. The paper shows the correlation between the pNOR number and the sensitivity/specificity of the optoelectronic method. The study included 293 patients with an abnormal cervical cytology result, who underwent the following examinations: examination with the optoelectronic method (Truscreen), colposcopic examination, and histopathologic biopsy. Specificity of the optoelectronic method for LGSIL was estimated at 65.70%, and for HGSIL and squamous cell carcinoma of the cervix it amounted to 90.38%. Specificity of the optoelectronic method in confirming the absence of cervical pathology was estimated at 78.89%. The area under the ROC curve for the optoelectronic method was estimated at 0.88 (95% CI, 0.84-0.92), which indicates the high diagnostic value of the test in the detection of HGSIL and squamous cell carcinoma. The optoelectronic method is characterised by high usefulness in the detection of CIN in the squamous epithelium and of squamous cell carcinoma of the cervix.
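
    As a reminder of how figures like these relate to raw study data, the sketch below computes sensitivity, specificity, and the area under the ROC curve from per-patient device scores and histopathology labels; the data are synthetic stand-ins, not the study's results, and the 0.5 cutoff is assumed.

    ```python
    import numpy as np
    from sklearn.metrics import confusion_matrix, roc_auc_score

    rng = np.random.default_rng(0)
    # Synthetic stand-ins: 1 = HGSIL/carcinoma confirmed by histopathology
    labels = rng.integers(0, 2, size=293)
    scores = 0.6 * labels + rng.normal(0.3, 0.25, size=293)   # device output
    calls = (scores > 0.5).astype(int)                         # device cutoff

    tn, fp, fn, tp = confusion_matrix(labels, calls).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    auc = roc_auc_score(labels, scores)
    print(f"sensitivity={sensitivity:.1%}  specificity={specificity:.1%}  AUC={auc:.2f}")
    ```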

  7. A method of immediate detection of objects with a near-zero apparent motion in series of CCD-frames

    NASA Astrophysics Data System (ADS)

    Savanevych, V. E.; Khlamov, S. V.; Vavilova, I. B.; Briukhovetskyi, A. B.; Pohorelov, A. V.; Mkrtichian, D. E.; Kudak, V. I.; Pakuliak, L. K.; Dikov, E. N.; Melnik, R. G.; Vlasenko, V. P.; Reichart, D. E.

    2018-01-01

    The paper deals with a computational method for the detection of Solar System objects (SSOs) whose inter-frame shifts in series of CCD-frames during the observation are commensurate with the errors in measuring their positions. These objects have velocities of apparent motion between CCD-frames not exceeding three rms errors (3σ) of the measurements of their positions. About 15% of objects have a near-zero apparent motion in CCD-frames, including objects beyond Jupiter's orbit as well as asteroids heading straight for the Earth. The proposed method for detecting an object's near-zero apparent motion in series of CCD-frames is based on the Fisher F-criterion instead of the traditional decision rules based on the maximum likelihood criterion. We analyzed the quality indicators of detection of near-zero apparent motion using statistical and in situ modeling techniques, in terms of the conditional probability of true detection of objects with a near-zero apparent motion. The efficiency of the method, implemented as a plugin for the Collection Light Technology (CoLiTec) software for automated asteroid and comet detection, has been demonstrated. Among the objects discovered with this plugin was the sungrazing comet C/2012 S1 (ISON). Within 26 min of observation, the comet's image moved by three pixels across a series of four CCD-frames (the velocity of its apparent motion at the moment of discovery was 0.8 pixels per CCD-frame; the image size on the frame was about five pixels). Subsequent verification on observations of asteroids with a near-zero apparent motion, conducted with small telescopes, confirmed the efficiency of the method even in poor conditions (strong backlight from the full Moon). We therefore recommend applying the proposed method to observation series of four or more frames.
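
    The decision rule in the paper is built on the Fisher F-criterion, but the exact statistic is not given in the abstract. The following is only a generic sketch of such a test: a fixed-position model is compared against a linear-motion fit over a short series of frames, and the F-statistic decides whether the slope (apparent motion) is significant. Frame times and pixel positions are made up.

    ```python
    import numpy as np
    from scipy.stats import f as f_dist

    def motion_f_test(t, x, alpha=0.05):
        """Test whether measured positions x(t) across CCD frames are better
        explained by linear motion than by a fixed position (nested models)."""
        n = len(x)
        rss_static = np.sum((x - x.mean()) ** 2)               # 1 parameter
        slope, intercept = np.polyfit(t, x, 1)                  # 2 parameters
        rss_linear = np.sum((x - (slope * t + intercept)) ** 2)
        # F-statistic for adding the motion (slope) parameter
        F = (rss_static - rss_linear) / (rss_linear / (n - 2))
        p = 1.0 - f_dist.cdf(F, 1, n - 2)
        return F, p, p < alpha   # True -> apparent motion detected

    # Four frames, total shift comparable to the position measurement error
    t = np.array([0.0, 8.7, 17.3, 26.0])        # minutes
    x = np.array([100.1, 100.9, 101.8, 103.0])  # pixel coordinate
    print(motion_f_test(t, x))
    ```

    A real pipeline would apply such a test jointly to both image coordinates and calibrate the threshold against the measured astrometric error, which is where the method's statistical modeling comes in.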

  8. Nanotechnology: a promising method for oral cancer detection and diagnosis.

    PubMed

    Chen, Xiao-Jie; Zhang, Xue-Qiong; Liu, Qi; Zhang, Jing; Zhou, Gang

    2018-06-11

    Oral cancer is a common and aggressive cancer with high morbidity, mortality, and recurrence rates globally. Early detection is of utmost importance for cancer prevention and disease management. Currently, tissue biopsy remains the gold standard for oral cancer diagnosis, but it is invasive, which may cause patient discomfort. The application of traditional noninvasive methods, such as vital staining, exfoliative cytology, and molecular imaging, is limited by insufficient sensitivity and specificity. Thus, there is an urgent need to explore noninvasive, highly sensitive, and specific diagnostic techniques. Nano detection systems are emerging noninvasive strategies that bring biomarker detection sensitivity to the nanoscale. Moreover, compared with current imaging contrast agents, nanoparticles are more biocompatible, easier to synthesize, and able to target specific surface molecules. Nanoparticles generate localized surface plasmon resonances at near-infrared wavelengths, providing higher image contrast and resolution. Therefore, nano-based techniques can help clinicians detect and better monitor diseases during different phases of oral malignancy. Here, we review the progress of nanotechnology-based methods in oral cancer detection and diagnosis.

  9. A Hyperspherical Adaptive Sparse-Grid Method for High-Dimensional Discontinuity Detection

    DOE PAGES

    Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max D.; ...

    2015-06-24

    This study proposes and analyzes a hyperspherical adaptive hierarchical sparse-grid method for detecting jump discontinuities of functions in high-dimensional spaces. The method is motivated by the theoretical and computational inefficiencies of well-known adaptive sparse-grid methods for discontinuity detection. Our novel approach constructs a function representation of the discontinuity hypersurface of an N-dimensional discontinuous quantity of interest, by virtue of a hyperspherical transformation. Then, a sparse-grid approximation of the transformed function is built in the hyperspherical coordinate system, whose value at each point is estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of the hypersurface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. In addition, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. Rigorous complexity analyses of the new method are provided, as are several numerical examples that illustrate the effectiveness of the approach.
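
    A rough sketch of the core idea, under stated assumptions: if the discontinuity hypersurface is star-shaped about the origin of the hyperspherical coordinate system, each angular direction reduces to a one-dimensional jump search along the radius. The toy code below does that search by bisection on a simple indicator function; the actual method builds an adaptive sparse grid over the angular variables rather than the uniform angle sampling shown here.

    ```python
    import numpy as np

    def spherical_to_cartesian(r, angles):
        """Map hyperspherical coordinates (r, theta_1..theta_{N-1}) to R^N."""
        x, s = [], 1.0
        for th in angles:
            x.append(r * s * np.cos(th))
            s *= np.sin(th)
        x.append(r * s)
        return np.array(x)

    def radial_jump(indicator, angles, r_max=2.0, tol=1e-6):
        """Bisection along one ray to locate the jump of a binary indicator,
        i.e. one 1-D discontinuity-detection problem in the transformed space."""
        lo, hi = 0.0, r_max
        inside_lo = indicator(spherical_to_cartesian(lo, angles))
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if indicator(spherical_to_cartesian(mid, angles)) == inside_lo:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)   # estimated radius of the discontinuity surface

    # Toy quantity of interest: discontinuous across the unit sphere in R^3
    indicator = lambda x: np.linalg.norm(x) < 1.0
    for theta in np.linspace(0.1, np.pi - 0.1, 4):
        print(theta, radial_jump(indicator, angles=[theta, 0.3]))
    ```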

  10. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
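
    For illustration only: a distance-based readout ultimately needs a calibration curve that maps the measured signal length back to a target concentration. The sketch below assumes a log-linear calibration and uses made-up standard values.

    ```python
    import numpy as np

    # Hypothetical calibration: bar lengths (mm) measured for known standards (uM)
    conc_std = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
    length_std = np.array([2.1, 8.0, 13.5, 30.2, 41.0])

    # Many distance-based readouts are roughly linear in log(concentration);
    # fit length = a*log10(conc) + b, then invert the fit for unknown samples.
    a, b = np.polyfit(np.log10(conc_std), length_std, 1)

    def length_to_concentration(length_mm):
        return 10 ** ((length_mm - b) / a)

    print(length_to_concentration(20.0))   # estimated concentration for a 20 mm bar
    ```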

  11. Detection of forced oscillations in power systems with multichannel methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.

    2015-09-30

    The increasing availability of high fidelity, geographically dispersed measurements in power systems improves the ability of researchers and engineers to study dynamic behaviors in the grid. One such behavior that is garnering increased attention is the presence of forced oscillations. Power system engineers are interested in forced oscillations because they are often symptomatic of the malfunction or misoperation of equipment. Though the resulting oscillation is not always large in amplitude, the root cause may be serious. In this report, multi-channel forced oscillation detection methods are developed. These methods leverage previously developed detection approaches based on the periodogram and spectral coherence. Making use of geographically distributed channels of data is shown to improve detection performance and shorten the delay before an oscillation can be detected in the online environment. Results from simulated and measured power system data are presented.
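
    The report builds on periodogram- and spectral-coherence-based detectors; the sketch below shows only the simplest multichannel idea, averaging per-channel periodograms so that a weak oscillation common to several channels stands out above the noise floor. The sampling rate, oscillation frequency, and threshold rule are all assumptions, not the report's settings.

    ```python
    import numpy as np
    from scipy.signal import periodogram

    fs = 30.0                        # assumed PMU reporting rate (samples/s)
    t = np.arange(0, 120, 1 / fs)
    rng = np.random.default_rng(1)

    # Hypothetical multichannel data: a weak 1.2 Hz forced oscillation buried in
    # noise, visible in several geographically separated channels.
    channels = [0.15 * np.sin(2 * np.pi * 1.2 * t + ph) + rng.normal(0, 1, t.size)
                for ph in (0.0, 0.7, 1.4)]

    # Average the per-channel periodograms; a peak present in every channel
    # reinforces, which is the basic benefit of multichannel detection.
    psds = []
    for x in channels:
        f, pxx = periodogram(x, fs=fs, detrend="constant")
        psds.append(pxx)
    avg_psd = np.mean(psds, axis=0)

    threshold = 8 * np.median(avg_psd)          # crude noise-floor threshold
    detected = f[avg_psd > threshold]
    print("candidate forced-oscillation frequencies (Hz):", detected)
    ```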

  12. QQ-SNV: single nucleotide variant detection at low frequency by comparing the quality quantiles.

    PubMed

    Van der Borght, Koen; Thys, Kim; Wetzels, Yves; Clement, Lieven; Verbist, Bie; Reumers, Joke; van Vlijmen, Herman; Aerssens, Jeroen

    2015-11-10

    Next generation sequencing enables studying heterogeneous populations of viral infections. When the sequencing is done at high coverage depth ("deep sequencing"), low frequency variants can be detected. Here we present QQ-SNV (http://sourceforge.net/projects/qqsnv), a logistic regression classifier model developed for the Illumina sequencing platforms that uses the quantiles of the quality scores to distinguish true single nucleotide variants from sequencing errors based on the estimated SNV probability. To train the model, we created a dataset of an in silico mixture of five HIV-1 plasmids. Testing of our method in comparison to the existing methods LoFreq, ShoRAH, and V-Phaser 2 was performed on two HIV and four HCV plasmid mixture datasets and one influenza H1N1 clinical dataset. For default application of QQ-SNV, variants were called using an SNV probability cutoff of 0.5 (QQ-SNV(D)). To improve sensitivity, we used an SNV probability cutoff of 0.0001 (QQ-SNV(HS)). To also increase specificity, SNVs called were overruled when their frequency was below the 80th percentile calculated on the distribution of error frequencies (QQ-SNV(HS-P80)). When comparing QQ-SNV versus the other methods on the plasmid mixture test sets, QQ-SNV(D) performed similarly to the existing approaches. QQ-SNV(HS) was more sensitive on all test sets but with more false positives. QQ-SNV(HS-P80) was found to be the most accurate method over all test sets by balancing sensitivity and specificity. When applied to a paired-end HCV sequencing study, with a lowest spiked-in true frequency of 0.5%, QQ-SNV(HS-P80) revealed a sensitivity of 100% (vs. 40-60% for the existing methods) and a specificity of 100% (vs. 98.0-99.7% for the existing methods). In addition, QQ-SNV required the least overall computation time to process the test sets. Finally, when testing on a clinical sample, four putative true variants with frequency below 0.5% were consistently detected by QQ-SNV(HS-P80) from different
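
    A minimal sketch of the kind of classifier described, not the published QQ-SNV model: quality-score quantiles of the reads supporting a candidate SNV serve as features of a logistic regression, and the resulting SNV probability is compared against either the default 0.5 cutoff or a permissive high-sensitivity cutoff. All training data below are simulated, and the quantile choices are assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    def quality_quantile_features(base_qualities):
        """Summarize the Phred quality scores supporting one candidate SNV as
        quantiles, the style of feature the classifier is built on."""
        return np.quantile(base_qualities, [0.1, 0.25, 0.5, 0.75, 0.9])

    # Synthetic training set: true variants tend to be supported by higher-quality
    # bases than sequencing errors (labels and quality values are made up).
    def simulate(n, mean_quality):
        return np.array([quality_quantile_features(rng.normal(mean_quality, 5, 200))
                         for _ in range(n)])

    X = np.vstack([simulate(300, 35), simulate(300, 25)])
    y = np.array([1] * 300 + [0] * 300)          # 1 = true SNV, 0 = error

    clf = LogisticRegression(max_iter=1000).fit(X, y)

    # Score a new candidate at the default cutoff (0.5) and a permissive,
    # high-sensitivity cutoff (analogous to the modes described above).
    candidate = quality_quantile_features(rng.normal(30, 5, 200)).reshape(1, -1)
    p = clf.predict_proba(candidate)[0, 1]
    print(p, p > 0.5, p > 0.0001)
    ```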

  13. A multiscale method for a robust detection of the default mode network

    NASA Astrophysics Data System (ADS)

    Baquero, Katherine; Gómez, Francisco; Cifuentes, Christian; Guldenmund, Pieter; Demertzi, Athena; Vanhaudenhuyse, Audrey; Gosseries, Olivia; Tshibanda, Jean-Flory; Noirhomme, Quentin; Laureys, Steven; Soddu, Andrea; Romero, Eduardo

    2013-11-01

    The Default Mode Network (DMN) is a resting-state network widely used for the analysis and diagnosis of mental disorders. It is normally detected in fMRI data, but for its detection in data corrupted by motion artefacts or low neuronal activity, the use of a robust analysis method is mandatory. In fMRI it has been shown that the signal-to-noise ratio (SNR) and the detection sensitivity of neuronal regions increase with different smoothing kernel sizes. Here we propose to use a multiscale decomposition based on a linear scale-space representation for the detection of the DMN. Three main points are proposed in this methodology: first, the use of fMRI data at different smoothing scale-spaces; second, detection of independent neuronal components of the DMN at each scale by using standard preprocessing methods and ICA decomposition at scale level; and finally, a weighted contribution of each scale by the Goodness of Fit measurement. This method was applied to a group of control subjects and was compared with a standard preprocessing baseline. The detection of the DMN was improved at the single-subject level and at the group level. Based on these results, we suggest using this methodology to enhance the detection of the DMN in data perturbed with artefacts or in subjects with low neuronal activity. Furthermore, the multiscale method could be extended to the detection of other resting-state neuronal networks.
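
    A compact sketch of the pipeline as described (smooth at several scales, run spatial ICA at each scale, score components against a DMN template with a goodness-of-fit measure, and combine scales by those weights). The data, template, goodness-of-fit definition, and scale values below are toy assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)

    # Synthetic stand-ins: data is (time, voxels) on a small 2-D grid, and
    # dmn_template marks the voxels expected to belong to the network.
    shape = (20, 20)
    dmn_template = np.zeros(shape, dtype=bool)
    dmn_template[5:9, 5:9] = True
    signal = np.outer(np.sin(np.linspace(0, 8 * np.pi, 120)), dmn_template.ravel())
    data = signal + rng.normal(0, 1.0, signal.shape)

    def goodness_of_fit(component, template):
        """Mean |weight| inside the template minus mean |weight| outside."""
        w = np.abs(component)
        return w[template.ravel()].mean() - w[~template.ravel()].mean()

    best_per_scale, gof_per_scale = [], []
    for sigma in (0.5, 1.0, 2.0):                       # smoothing scales (voxels)
        smoothed = np.array([gaussian_filter(v.reshape(shape), sigma).ravel()
                             for v in data])
        ica = FastICA(n_components=5, random_state=0)
        ica.fit(smoothed)                               # spatial maps in components_
        gofs = [goodness_of_fit(c, dmn_template) for c in ica.components_]
        best_per_scale.append(ica.components_[int(np.argmax(gofs))])
        gof_per_scale.append(max(gofs))

    # Weight each scale's best component by its goodness of fit and combine.
    weights = np.array(gof_per_scale) / np.sum(gof_per_scale)
    combined_map = np.sum([w * m for w, m in zip(weights, best_per_scale)], axis=0)
    print(combined_map.reshape(shape)[5:9, 5:9].mean(), combined_map.mean())
    ```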

  14. Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research

    PubMed Central

    Weng, Chunhua

    2013-01-01

    Objective To review the methods and dimensions of data quality assessment in the context of electronic health record (EHR) data reuse for research. Materials and methods A review of the clinical research literature discussing data quality assessment methodology for EHR data was performed. Using an iterative process, the aspects of data quality being measured were abstracted and categorized, as well as the methods of assessment used. Results Five dimensions of data quality were identified (completeness, correctness, concordance, plausibility, and currency), along with seven broad categories of data quality assessment methods: comparison with gold standards, data element agreement, data source agreement, distribution comparison, validity checks, log review, and element presence. Discussion Examination of the methods by which clinical researchers have investigated the quality and suitability of EHR data for research shows that there are fundamental features of data quality, which may be difficult to measure, as well as proxy dimensions. Researchers interested in the reuse of EHR data for clinical research are recommended to consider the adoption of a consistent taxonomy of EHR data quality, to remain aware of the task-dependence of data quality, to integrate work on data quality assessment from other fields, and to adopt systematic, empirically driven, statistically based methods of data quality assessment. Conclusion There is currently little consistency or potential generalizability in the methods used to assess EHR data quality. If the reuse of EHR data for clinical research is to become accepted, researchers should adopt validated, systematic methods of EHR data quality assessment. PMID:22733976
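
    To make two of the five dimensions concrete, here is a small sketch of completeness and plausibility/currency checks on a hypothetical EHR extract; the column names, value ranges, and cut-off date are illustrative assumptions.

    ```python
    import pandas as pd

    # Hypothetical EHR extract; column names are illustrative only.
    ehr = pd.DataFrame({
        "patient_id": [1, 2, 3, 4],
        "heart_rate": [72, 310, None, 88],                    # bpm
        "visit_date": ["2012-03-01", "2012-05-17", None, "2030-01-01"],
    })

    # Completeness: fraction of non-missing values per data element.
    completeness = ehr.notna().mean()

    # Plausibility / validity checks: values inside clinically sensible ranges,
    # and dates not beyond an assumed study cut-off (a currency-style check).
    plausible_hr = ehr["heart_rate"].between(20, 250)
    dates = pd.to_datetime(ehr["visit_date"], errors="coerce")
    current_dates = dates <= pd.Timestamp("2013-01-01")

    report = pd.DataFrame({
        "heart_rate_plausible": plausible_hr,
        "visit_date_current": current_dates,
    })
    print(completeness)
    print(report)
    ```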

  15. High-Accuracy Ultrasound Contrast Agent Detection Method for Diagnostic Ultrasound Imaging Systems.

    PubMed

    Ito, Koichi; Noro, Kazumasa; Yanagisawa, Yukari; Sakamoto, Maya; Mori, Shiro; Shiga, Kiyoto; Kodama, Tetsuya; Aoki, Takafumi

    2015-12-01

    An accurate method for detecting contrast agents using diagnostic ultrasound imaging systems is proposed. Contrast agents, such as microbubbles, passing through a blood vessel during ultrasound imaging are detected as blinking signals along the temporal axis, because their intensity values fluctuate continuously. Ultrasound contrast agents are therefore detected by evaluating the intensity variation of each pixel along the temporal axis. Conventional methods are based on simple subtraction of ultrasound images to detect ultrasound contrast agents; even if the subject moves only slightly, such a detection method introduces significant error. In contrast, the proposed technique employs spatiotemporal analysis of the pixel intensity variation over several frames. Experiments visualizing blood vessels in the mouse tail showed that the proposed method performs well compared with conventional approaches. We also report that the new technique is useful for observing temporal changes in microvessel density in subiliac lymph nodes containing tumors. The results are compared with those of contrast-enhanced computed tomography. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
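
    A simplified sketch of the spatiotemporal idea, not the authors' algorithm: instead of subtracting consecutive frames, each pixel's intensity fluctuation is measured over a short temporal window and compared against the background fluctuation level. The window length and threshold factor are assumptions.

    ```python
    import numpy as np

    def detect_blinking_pixels(frames, window=5, k=4.0):
        """Flag pixels whose intensity fluctuates strongly over a short temporal
        window, rather than simply subtracting consecutive frames.

        frames: array of shape (n_frames, height, width)
        Returns one boolean candidate map per window position.
        """
        n = frames.shape[0]
        maps = []
        for end in range(window, n + 1):
            win = frames[end - window:end]
            temporal_std = win.std(axis=0)              # fluctuation per pixel
            background = np.median(temporal_std)        # tissue / noise level
            maps.append(temporal_std > k * background)
        return np.array(maps)

    # Synthetic example: static tissue plus one intermittently bright "bubble" pixel.
    rng = np.random.default_rng(4)
    frames = rng.normal(100, 1, (30, 64, 64))
    frames[:, 32, 20] += 50 * (rng.random(30) > 0.5)
    print(detect_blinking_pixels(frames).any(axis=0)[32, 20])   # True expected
    ```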

  16. Land Use Land Cover Changes in Detection of Water Quality: A Study Based on Remote Sensing and Multivariate Statistics.

    PubMed

    Hua, Ang Kean

    2017-01-01

    Malacca River water quality is adversely affected by rapid urbanization. The present study related land use/land cover (LULC) changes to water quality in the Malacca River using LULC analysis, PCA, CCA, HCA, NHCA, and ANOVA. PCA confirmed DS, EC, salinity, turbidity, TSS, DO, BOD, COD, As, Hg, Zn, Fe, E. coli, and total coliform as the significant parameters. CCA grouped these 14 variables into two variates: the first associated with residential and industrial activities, and the second with agriculture, sewage treatment plants, and animal husbandry. HCA and NHCA indicate that cluster 1 occurs in the urban area with Hg, Fe, total coliform, and DO pollution; cluster 3 occurs in the suburban area with salinity, EC, and DS; and cluster 2 occurs in the rural area with salinity and EC. ANOVA between LULC and water quality data indicates that built-up areas significantly pollute the water through E. coli, total coliform, EC, BOD, COD, TSS, Hg, Zn, and Fe; agricultural activities cause EC, TSS, salinity, E. coli, total coliform, arsenic, and iron pollution; and open space causes contamination by turbidity, salinity, EC, and TSS. These findings provide useful information for identifying pollution sources and for relating LULC to river water quality, and can serve as a reference for policy makers in land-use management.
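
    As an illustration of the PCA step only: the sketch below standardizes a hypothetical station-by-parameter matrix and inspects the component loadings to see which water-quality parameters dominate each component. The data are random placeholders, not the study's measurements.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    # Hypothetical monitoring matrix: rows = sampling stations along the river,
    # columns = water-quality parameters (names follow the abstract).
    rng = np.random.default_rng(5)
    params = ["DS", "EC", "salinity", "turbidity", "TSS", "DO", "BOD", "COD",
              "As", "Hg", "Zn", "Fe", "E_coli", "total_coliform"]
    data = pd.DataFrame(rng.normal(size=(30, len(params))), columns=params)

    # Standardize, then extract principal components; large loadings indicate
    # which parameters dominate each component (i.e. likely pollution sources).
    pca = PCA(n_components=3).fit(StandardScaler().fit_transform(data))
    loadings = pd.DataFrame(pca.components_.T, index=params,
                            columns=["PC1", "PC2", "PC3"])
    print(pca.explained_variance_ratio_)
    print(loadings.round(2))
    ```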

  17. A Chemoenzymatic Histology Method for O-GlcNAc Detection.

    PubMed

    Aguilar, Aime Lopez; Hou, Xiaomeng; Wen, Liuqing; Wang, Peng G; Wu, Peng

    2017-12-14

    Modification of nuclear and cytoplasmic proteins by the addition or removal of O-GlcNAc dynamically impacts multiple biological processes. Here, we present the development of a chemoenzymatic histology method for the detection of O-GlcNAc in tissue specimens. We applied this method to screen murine organs, uncovering specific O-GlcNAc distribution patterns in different tissue structures. We then utilized our histology method for O-GlcNAc detection in human brain specimens from healthy donors and donors with Alzheimer's disease and found higher levels of O-GlcNAc in specimens from healthy donors. We also performed an analysis using a multiple cancer tissue array, uncovering different O-GlcNAc levels between healthy and cancerous tissues, as well as different O-GlcNAc cellular distributions within certain tissue specimens. This chemoenzymatic histology method therefore holds great potential for revealing the biology of O-GlcNAc in physiopathological processes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A whole-mount in situ hybridization method for microRNA detection in Caenorhabditis elegans

    PubMed Central

    Andachi, Yoshiki; Kohara, Yuji

    2016-01-01

    Whole-mount in situ hybridization (WISH) is an outstanding method to decipher the spatiotemporal expression patterns of microRNAs (miRNAs) and provides important clues for elucidating their functions. The first WISH method for miRNA detection was developed in zebrafish. Although this method was quickly adapted for other vertebrates and fruit flies, WISH analysis has not been successfully used to detect miRNAs in Caenorhabditis elegans. Here, we show a novel WISH method for miRNA detection in C. elegans. Using this method, mir-1 miRNA was detected in the body-wall muscle where the expression and roles of mir-1 miRNA have been previously elucidated. Application of the method to let-7 family miRNAs, let-7, mir-48, mir-84, and mir-241, revealed their distinct but partially overlapping expression patterns, indicating that miRNAs sharing a short common sequence were distinguishably detected. In pash-1 mutants that were depleted of mature miRNAs, signals of mir-48 miRNA were greatly reduced, suggesting that mature miRNAs were detected by the method. These results demonstrate the validity of WISH to detect mature miRNAs in C. elegans. PMID:27154969

  19. The estimation method on diffusion spot energy concentration of the detection system

    NASA Astrophysics Data System (ADS)

    Gao, Wei; Song, Zongxi; Liu, Feng; Dan, Lijun; Sun, Zhonghan; Du, Yunfei

    2016-09-01

    We propose a method to estimate the diffusion-spot energy concentration of a detection system. Outdoor observation experiments were conducted at Xinglong Observatory using a detection system whose diffusion-spot energy concentration had been estimated with this method (correlation coefficient approximately 0.9926). The system aperture is 300 mm and its limiting magnitude is 14.15 Mv. The observations show that the faintest magnitude detected by the evaluated system is 13.96 Mv, and the average detected magnitude is about 13.5 Mv. The results indicate that this method can be used to evaluate the diffusion-spot energy concentration of a detection system efficiently.

  20. Surface property detection apparatus and method

    DOEpatents

    Martens, J.S.; Ginley, D.S.; Hietala, V.M.; Sorensen, N.R.

    1995-08-08

    Apparatus and method for detecting, determining, and imaging surface resistance, corrosion, thin film growth, and oxide formation on the surface of conductors, or other electrical surface modifications. The invention comprises a modified confocal resonator structure with the sample remote from the radiating mirror. Surface resistance is determined by analyzing and imaging reflected microwaves; imaging reveals anomalies due to surface impurities, non-stoichiometry, and the like, in the surface of the superconductor, conductor, dielectric, or semiconductor. 4 figs.