Sample records for analysis detection processing

  1. Tornado detection data reduction and analysis

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1977-01-01

    Data processing and analysis were provided in support of tornado detection through analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.

  2. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  3. Combination of process and vibration data for improved condition monitoring of industrial systems working under variable operating conditions

    NASA Astrophysics Data System (ADS)

    Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.

    2016-01-01

    The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.
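
    A minimal sketch of CVA-style monitoring in the spirit of this abstract (the lag length, number of retained states, and the 99th-percentile training threshold are illustrative assumptions, not the authors' settings):

    ```python
    # Canonical Variate Analysis (CVA) monitoring sketch: past/future stacking,
    # canonical variates from an SVD, and a T^2 alarm threshold.
    import numpy as np

    def inv_sqrt(S, eps=1e-10):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(np.clip(w, eps, None))) @ V.T

    def stack(Y, l, future=False):
        # Y: (m, N) measured variables; returns (m*l, N-2l) stacked vectors
        m, N = Y.shape
        cols = [Y[:, t:t+l].ravel() if future else Y[:, t-l:t][:, ::-1].ravel()
                for t in range(l, N - l)]
        return np.array(cols).T

    def cva_train(Y, l=5, r=10):
        P, F = stack(Y, l), stack(Y, l, future=True)
        P = P - P.mean(1, keepdims=True); F = F - F.mean(1, keepdims=True)
        Spp, Sff = np.cov(P), np.cov(F)
        Sfp = F @ P.T / (P.shape[1] - 1)
        _, _, Vt = np.linalg.svd(inv_sqrt(Sff) @ Sfp @ inv_sqrt(Spp))
        J = Vt[:r] @ inv_sqrt(Spp)            # maps past vectors to states
        T2 = ((J @ P) ** 2).sum(0)            # states have ~unit variance
        return J, np.percentile(T2, 99)       # monitoring model + threshold

    def cva_monitor(Y_new, J, limit, l=5):
        P = stack(Y_new, l)
        P = P - P.mean(1, keepdims=True)
        return ((J @ P) ** 2).sum(0) > limit  # True where a fault is flagged
    ```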

  4. Batch process fault detection and identification based on discriminant global preserving kernel slow feature analysis.

    PubMed

    Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping

    2018-05-16

    As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variable after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables.

  5. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is proposed in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective bearing fault automatic detection method and gives a good basis for an integrated induction machine condition monitor.
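
    The kurtogram scans spectral kurtosis over many band/bandwidth combinations; the sketch below computes spectral kurtosis at a single STFT resolution and picks the most impulsive band, a simplified stand-in for the full kurtogram (window size and band width are assumptions):

    ```python
    # Spectral kurtosis from an STFT: bands excited by repetitive bearing
    # impacts show high kurtosis even when the raw spectrum looks flat.
    import numpy as np
    from scipy.signal import stft

    def spectral_kurtosis(x, fs, nperseg=256):
        f, _, Z = stft(x, fs=fs, nperseg=nperseg)
        p = np.abs(Z) ** 2
        sk = (p ** 2).mean(axis=1) / p.mean(axis=1) ** 2 - 2.0
        return f, sk          # ~0 for stationary Gaussian noise

    def most_impulsive_band(x, fs, nperseg=256, bw_bins=8):
        f, sk = spectral_kurtosis(x, fs, nperseg)
        # average SK over a sliding band and return the best band's edges
        band_sk = np.convolve(sk, np.ones(bw_bins) / bw_bins, mode='valid')
        i = int(np.argmax(band_sk))
        return f[i], f[i + bw_bins - 1]  # candidate band for envelope analysis
    ```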

  6. Retinal imaging analysis based on vessel detection.

    PubMed

    Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila

    2017-07-01

    With the advancement of digital imaging and computing power, computationally intelligent technologies are in high demand in ophthalmology care and treatment. In the current research, Retina Image Analysis (RIA) is developed for optometrists at the Eye Care Center of Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are served with various options like saving, processing, and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying detected vessels on the retina. The Agile Unified Process is adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help the optometrist gain a better understanding when analyzing the patient's retina. Finally, the Retina Image Analysis procedure is developed using MATLAB (R2011b). Promising results are attained that are comparable to the state of the art.

  7. WeaselBoard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mulder, John C.; Schwartz, Moses Daniel; Berg, Michael J.

    2013-10-01

    Critical infrastructures, such as electrical power plants and oil refineries, rely on programmable logic controllers (PLCs) to control essential processes. State of the art security cannot detect attacks on PLCs at the hardware or firmware level. This renders critical infrastructure control systems vulnerable to costly and dangerous attacks. WeaselBoard is a PLC backplane analysis system that connects directly to the PLC backplane to capture backplane communications between modules. WeaselBoard forwards inter-module traffic to an external analysis system that detects changes to process control settings, sensor values, module configuration information, firmware updates, and process control program (logic) updates. WeaselBoard provides zero-day exploit detection for PLCs by detecting changes in the PLC and the process. This approach to PLC monitoring is protected under U.S. Patent Application 13/947,887.

  8. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  9. SSME propellant path leak detection real-time

    NASA Technical Reports Server (NTRS)

    Crawford, R. A.; Smith, L. M.

    1994-01-01

    Included are four documents that outline the technical aspects of the research performed on NASA Grant NAG8-140: 'A System for Sequential Step Detection with Application to Video Image Processing'; 'Leak Detection from the SSME Using Sequential Image Processing'; 'Digital Image Processor Specifications for Real-Time SSME Leak Detection'; and 'A Color Change Detection System for Video Signals with Applications to Spectral Analysis of Rocket Engine Plumes'.

  10. Information theoretic analysis of edge detection in visual communication

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theoretic system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise, etc., that define the image gathering system. The edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there is no common tool for evaluating the performance of the different algorithms or for guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment provides such a tool, allowing us to compare the different edge detection operators in a common environment.

  11. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

    Early detection and treatment of lung cancer is one of the most effective means to reduce cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system allow respiratory kinetics to be obtained with a flat panel detector (FPD), an extension of chest X-ray radiography. Through such changes, functional evaluation of respiratory kinetics in the chest has become available. Its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for the purpose of detecting lung nodules and evaluating quantitative kinetics. Breathing chest radiographs obtained by the modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after a breath synchronization process utilizing diaphragmatic analysis of the vector movement. An artificial neural network used to analyze the density patterns detected the true nodules by analyzing these static images and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, the system showed sufficient detection capability and kinetic imaging function, with no statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.

  12. A capillary electrophoresis chip for the analysis of print and film photographic developing agents in commercial processing solutions using indirect fluorescence detection.

    PubMed

    Sirichai, S; de Mello, A J

    2001-01-01

    The separation and detection of both print and film developing agents (CD-3 and CD-4) in photographic processing solutions using chip-based capillary electrophoresis is presented. For simultaneous detection of both analytes under identical experimental conditions a buffer pH of 11.9 is used to partially ionise the analytes. Detection is made possible by indirect fluorescence, where the ions of the analytes displace the anionic fluorescing buffer ion to create negative peaks. Under optimal conditions, both analytes can be analyzed within 30 s. The limits of detection for CD-3 and CD-4 are 0.17 mM and 0.39 mM, respectively. The applicability of the method for the analysis of seasoned photographic processing developer solutions is also examined.

  13. Fast and objective detection and analysis of structures in downhole images

    NASA Astrophysics Data System (ADS)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses in the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour-intensive and hence expensive task, and as such is a significant bottleneck in data processing, as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow that complements geologists' intuition and experience in interpreting data, improving efficiency and assisting, rather than replacing, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis and further detection of structures, e.g. detection limited to specific orientations.
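
    Because a planar structure crossing a borehole traces depth(θ) = c + a·sin θ + b·cos θ in the unwrapped image, sinusoid fitting reduces to linear least squares. A minimal sketch (the RMS-misfit confidence gating is an assumption about one workable approach, not a description of the authors' algorithm):

    ```python
    # Least-squares sinusoid fit for a dipping plane in an unwrapped image.
    import numpy as np

    def fit_sinusoid(theta, depth, radius):
        # theta: azimuth (rad); depth: picked trace depths; radius: hole radius
        A = np.column_stack([np.ones_like(theta), np.sin(theta), np.cos(theta)])
        (c, a, b), res, *_ = np.linalg.lstsq(A, depth, rcond=None)
        amplitude = np.hypot(a, b)                 # r * tan(dip) for a plane
        dip = np.degrees(np.arctan2(amplitude, radius))
        rms = np.sqrt(res[0] / len(depth)) if res.size else 0.0
        return c, amplitude, dip, rms   # rms can gate a detection confidence
    ```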

  14. An integrated microfluidic analysis microsystems with bacterial capture enrichment and in-situ impedance detection

    NASA Astrophysics Data System (ADS)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic-chip dielectrophoresis technique and the electrochemical impedance detection principle. The microsystem comprises a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and handles data communication; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied to the foodborne-pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, on which the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample to a chicken sample solution, and the samples were tested on the dielectrophoresis capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10⁴ CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid real-time in-situ detection of bacteria.

  15. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, T.L.; Mullen, M.F.; Wangen, L.E.

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
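
    A minimal sketch of the two residual tests compared in the abstract, applied to model-minus-measurement residual vectors (the 3-sigma and chi-square cutoffs are conventional choices, not necessarily the authors'):

    ```python
    # Univariate z-score test per residual vs. simultaneous Mahalanobis test.
    import numpy as np
    from scipy.stats import chi2

    def residual_tests(R, mu, Sigma, z_crit=3.0, alpha=0.001):
        # R: (T, m) residuals; mu, Sigma: their normal-operation mean/covariance
        z = (R - mu) / np.sqrt(np.diag(Sigma))
        z_alarms = np.abs(z) > z_crit                       # per-variable alarms
        D = R - mu
        d2 = np.einsum('ti,ij,tj->t', D, np.linalg.inv(Sigma), D)
        m_alarms = d2 > chi2.ppf(1 - alpha, df=R.shape[1])  # joint alarm
        return z_alarms, m_alarms
    ```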

  16. New approach to gallbladder ultrasonic images analysis and lesions recognition.

    PubMed

    Bodzioch, Sławomir; Ogiela, Marek R

    2009-03-01

    This paper presents a new approach to gallbladder ultrasonic image processing and analysis aimed at detecting disease symptoms in processed images. First, a new method of filtering gallbladder contours from USG images is presented. A major stage in this filtration is segmenting and sectioning off the areas occupied by the organ. In most cases this procedure is based on filtration, which plays a key role in the process of diagnosing pathological changes. Unfortunately, ultrasound images are among the most troublesome to analyze owing to the echogenic inconsistency of the structures under observation. This paper provides an inventive algorithm for the holistic extraction of gallbladder image contours. The algorithm is based on rank filtration, as well as on the analysis of histogram sections of the examined organs. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms that analyze the object of such diagnosis and verify the occurrence of symptoms related to a given affliction. Usually the final stage is to make a diagnosis based on the detected symptoms. This last stage can be carried out through either dedicated expert systems or a more classic pattern analysis approach, such as using rules to determine the illness based on the detected symptoms. This paper discusses the pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ.
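
    A sketch of the rank-filtration step in the spirit of the abstract, using order-statistic filters to suppress speckle before contour extraction (filter sizes and percentiles are illustrative guesses):

    ```python
    # Order-statistic (rank) filtering of an ultrasound image.
    import numpy as np
    from scipy.ndimage import median_filter, percentile_filter

    def rank_preprocess(img, smooth=5, bg_size=31, bg_pct=10):
        smoothed = median_filter(img, size=smooth)          # speckle suppression
        background = percentile_filter(smoothed, percentile=bg_pct, size=bg_size)
        enhanced = np.clip(smoothed.astype(float) - background, 0, None)
        return enhanced  # histogram-section analysis and contour tracing follow
    ```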

  17. Indoor air quality inspection and analysis system based on gas sensor array

    NASA Astrophysics Data System (ADS)

    Gao, Xiang; Wang, Mingjiang; Fan, Binwen

    2017-08-01

    A detection and analysis system capable of measuring the concentrations of four major gases in indoor air is designed. Four gas sensors constitute a gas sensor array that detects the concentrations of the four indoor gases; the detection data are then further processed to reduce the cross-sensitivity between the gas sensors and improve detection accuracy.

  18. A Dynamic Causal Modeling Analysis of the Effective Connectivities Underlying Top-Down Letter Processing

    ERIC Educational Resources Information Center

    Liu, Jiangang; Li, Jun; Rieth, Cory A.; Huber, David E.; Tian, Jie; Lee, Kang

    2011-01-01

    The present study employed dynamic causal modeling to investigate the effective functional connectivity between regions of the neural network involved in top-down letter processing. We used an illusory letter detection paradigm in which participants detected letters while viewing pure noise images. When participants detected letters, the response…

  19. Video-processing-based system for automated pedestrian data collection and analysis when crossing the street

    NASA Astrophysics Data System (ADS)

    Mansouri, Nabila; Watelain, Eric; Ben Jemaa, Yousra; Motamed, Cina

    2018-03-01

    Computer-vision techniques for pedestrian detection and tracking have progressed considerably and become widely used in several applications. However, a quick glance at the literature shows minimal use of these techniques in pedestrian behavior and safety analysis, which might be due to the technical complexities of processing pedestrian videos. To extract pedestrian trajectories from a video automatically, all road users must be detected and tracked throughout the sequence, which is a challenging task, especially in a congested open-outdoor urban space. A multipedestrian tracker based on an interframe detection-association process was proposed and evaluated. The tracker results are used to implement an automatic, video-processing-based tool for collecting data on pedestrians crossing the street. Variations in instantaneous speed allowed the detection of the street crossing phases (approach, waiting, and crossing), addressed for the first time in pedestrian road safety analysis to illustrate the causal relationship between pedestrian behaviors in the different phases. A comparison with a manual data collection method, by computing the root mean square error and the Pearson correlation coefficient, confirmed that the proposed procedures have significant potential to automate the data collection process.
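
    A minimal sketch of deriving the three crossing phases from tracked positions via instantaneous speed (the speed threshold and curb-line convention are assumptions):

    ```python
    # Phase labelling from a pedestrian trajectory: approach / waiting / crossing.
    import numpy as np

    def crossing_phases(xy, t, wait_speed=0.3, curb_y=0.0):
        # xy: (N, 2) ground-plane positions (m); t: (N,) timestamps (s)
        v = np.linalg.norm(np.gradient(xy, t, axis=0), axis=1)  # speed, m/s
        phase = np.where(v < wait_speed, 'waiting',
                         np.where(xy[:, 1] > curb_y, 'crossing', 'approach'))
        return v, phase  # compare against manual annotation via RMSE/Pearson r
    ```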

  20. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    NASA Astrophysics Data System (ADS)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of more detailed research proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that combines existing techniques with novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient QRS detection process, an averaging filter using correlation techniques, and a RAZ detection step. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving all 12 leads.

  1. Performance of an image analysis processing system for hen tracking in an environmental preference chamber.

    PubMed

    Kashiha, Mohammad Amin; Green, Angela R; Sales, Tatiana Glogerley; Bahr, Claudia; Berckmans, Daniel; Gates, Richard S

    2014-10-01

    Image processing systems have been widely used in monitoring livestock for many applications, including identification, tracking, behavior analysis, occupancy rates, and activity calculations. The primary goal of this work was to quantify image processing performance when monitoring laying hens by comparing the length of stay in each compartment as detected by the image processing system with the actual occurrences registered by human observations. In this work, an image processing system was implemented and evaluated for use in an environmental animal preference chamber to detect hen navigation between 4 compartments of the chamber. One camera was installed above each compartment to produce top-view images of the whole compartment. An ellipse-fitting model was applied to captured images to detect whether the hen was present in a compartment. During a choice-test study, mean ± SD success detection rates of 95.9 ± 2.6% were achieved when considering total duration of compartment occupancy. These results suggest that the image processing system is currently suitable for determining the response measures for assessing environmental choices. Moreover, the image processing system offered a comprehensive analysis of occupancy while substantially reducing data processing time compared with the time-intensive alternative of manual video analysis. The above technique was used to monitor ammonia aversion in the chamber. As a preliminary pilot study, different levels of ammonia were applied to different compartments while hens were allowed to navigate between compartments. Using the automated monitoring tool to assess occupancy, a negative trend of compartment occupancy with ammonia level was revealed, though further examination is needed.
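
    A minimal OpenCV sketch of the ellipse-fitting occupancy test described above (the background-subtraction step and thresholds are assumptions about one workable pipeline, not the authors' exact implementation):

    ```python
    # Is a hen present in this compartment? Fit an ellipse to the largest blob.
    import cv2

    def hen_present(frame_gray, background_gray, min_area=2000):
        diff = cv2.absdiff(frame_gray, background_gray)
        _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        blobs = [c for c in contours
                 if cv2.contourArea(c) > min_area and len(c) >= 5]
        if not blobs:
            return None                          # compartment judged empty
        # returns ((cx, cy), (major, minor), angle); per-frame presence
        # accumulates into length-of-stay per compartment
        return cv2.fitEllipse(max(blobs, key=cv2.contourArea))
    ```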

  2. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    PubMed Central

    Bayır, Şafak

    2016-01-01

    With advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to automatically detect change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, which appear in the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes, in machine learning terms, information related to the optic disc and to hard exudates may be indistinguishable. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC. PMID:27110272

  3. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.

  4. MPI Runtime Error Detection with MUST: Advances in Deadlock Detection

    DOE PAGES

    Hilbrich, Tobias; Protze, Joachim; Schulz, Martin; ...

    2013-01-01

    The widely used Message Passing Interface (MPI) is complex and rich. As a result, application developers require automated tools to avoid and to detect MPI programming errors. We present the Marmot Umpire Scalable Tool (MUST) that detects such errors with significantly increased scalability. We present improvements to our graph-based deadlock detection approach for MPI, which cover future MPI extensions. Our enhancements also check complex MPI constructs that no previous graph-based detection approach handled correctly. Finally, we present optimizations for the processing of MPI operations that reduce runtime deadlock detection overheads. Existing approaches often require O(p) analysis time per MPI operation, for p processes. We empirically observe that our improvements lead to sub-linear or better analysis time per operation for a wide range of real world applications.
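
    The core of graph-based deadlock detection is finding a cycle in a wait-for graph. A much-simplified sketch (MUST itself uses a richer AND/OR wait-for graph to model constructs such as MPI_Waitany; this toy version assumes plain blocking dependencies):

    ```python
    # Depth-first search for a cycle in a wait-for graph of MPI ranks.
    def find_deadlock(waits_for):
        # waits_for: {rank: set of ranks whose operations it is blocked on}
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {r: WHITE for r in waits_for}

        def dfs(r, path):
            color[r] = GRAY
            path.append(r)
            for s in waits_for.get(r, ()):
                if color.get(s) == GRAY:        # back edge: cycle -> deadlock
                    return path[path.index(s):]
                if color.get(s, BLACK) == WHITE:
                    cycle = dfs(s, path)
                    if cycle:
                        return cycle
            path.pop()
            color[r] = BLACK
            return None

        for r in list(waits_for):
            if color[r] == WHITE:
                cycle = dfs(r, [])
                if cycle:
                    return cycle
        return None

    print(find_deadlock({0: {1}, 1: {2}, 2: {0}, 3: set()}))  # -> [0, 1, 2]
    ```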

  5. Detection of genetically modified organisms in foreign-made processed foods containing corn and potato.

    PubMed

    Monma, Kimio; Araki, Rie; Sagi, Naoki; Satoh, Masaki; Ichikawa, Hisatsugu; Satoh, Kazue; Tobe, Takashi; Kamata, Kunihiro; Hino, Akihiro; Saito, Kazuo

    2005-06-01

    Investigations of the validity of labeling regarding genetically modified (GM) products were conducted using polymerase chain reaction (PCR) methods on foreign-made processed foods made from corn and potato purchased in the Tokyo area and in the USA. Several kinds of GM crops were detected in 12 of 32 processed corn samples. More than two GM events for which safety reviews have been completed in Japan were simultaneously detected in 10 samples. GM events MON810 and Bt11 were most frequently detected in the samples by qualitative PCR methods: MON810 was detected in 11 of the 12 samples, and Bt11 in 6 of the 12 samples. In addition, Roundup Ready soy was detected in one of the 12 samples. On the other hand, CBH351, for which the safety assessment was withdrawn in Japan, was not detected in any of the 12 samples. A trial quantitative analysis was performed on six of the qualitatively GM-positive maize samples. The estimated amounts of GM maize in these samples ranged from 0.2 to 2.8%, except for one sample, which contained 24.1%; for this sample, the total amount found by event-specific quantitative analysis was 23.8%. Additionally, Roundup Ready soy was detected in one of 21 processed potato foods, although GM potatoes were not detected in any sample.

  6. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...

  7. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...

  8. Detectability of Granger causality for subsampled continuous-time neurophysiological processes.

    PubMed

    Barnett, Lionel; Seth, Anil K

    2017-01-01

    Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist. We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities. We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability "black spots" and "sweet spots", and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data. Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling. On the basis of a very general model that captures the structure of neurophysiological processes, we are able to help identify confounds, and offer practical insights, for successful detection of causal connectivity from neurophysiological recordings.
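
    A self-contained numerical illustration of the detectability decay (a discrete-time toy, not the authors' continuous-time state-space analysis): a bivariate VAR in which x drives y with a one-step lag, with the Granger log-likelihood-ratio statistic recomputed after subsampling by increasing factors:

    ```python
    # Granger statistic ln(RSS_restricted / RSS_full) for "x causes y",
    # evaluated on subsampled copies of a simulated coupled pair.
    import numpy as np

    def granger_stat(y, x, p=2):
        T = len(y)
        Yl = np.array([y[t-p:t][::-1] for t in range(p, T)])   # own lags
        Xl = np.array([x[t-p:t][::-1] for t in range(p, T)])   # driver lags
        tgt = y[p:]

        def rss(D):
            D = np.column_stack([D, np.ones(len(D))])
            beta, *_ = np.linalg.lstsq(D, tgt, rcond=None)
            r = tgt - D @ beta
            return r @ r

        return np.log(rss(Yl) / rss(np.hstack([Yl, Xl])))

    rng = np.random.default_rng(0)
    N = 20000
    x, y = np.zeros(N), np.zeros(N)
    for t in range(1, N):
        x[t] = 0.7 * x[t-1] + rng.standard_normal()
        y[t] = 0.5 * y[t-1] + 0.4 * x[t-1] + rng.standard_normal()

    for k in (1, 2, 5, 10, 20):                    # subsampling factor
        print(k, round(granger_stat(y[::k], x[::k]), 4))
    # the statistic typically shrinks toward 0 once k exceeds the causal delay
    ```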

  9. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. This work proposes a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge, and multivariate analysis for the quality control of herbal drugs.
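
    A sketch of the fault-diagnosis step using a discriminant PLS model on fingerprint peak features (scikit-learn's PLSRegression with one-hot class coding stands in here for the probabilistic discriminant PLS method named above):

    ```python
    # PLS-DA: map HPLC-MS fingerprint features to process-deviation classes.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import StandardScaler

    def fit_plsda(X, labels, n_components=3):
        # X: (batches, peak features); labels: deviation type per batch
        classes = sorted(set(labels))
        Y = np.array([[float(lab == c) for c in classes] for lab in labels])
        scaler = StandardScaler().fit(X)
        model = PLSRegression(n_components=n_components)
        model.fit(scaler.transform(X), Y)
        return scaler, model, classes

    def diagnose(scaler, model, classes, x_new):
        score = model.predict(scaler.transform(x_new.reshape(1, -1)))[0]
        return classes[int(np.argmax(score))]      # most likely root cause
    ```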

  10. A Review of Flow Analysis Methods for Determination of Radionuclides in Nuclear Wastes and Nuclear Reactor Coolants

    DOE PAGES

    Trojanowicz, Marek; Kolacinska, Kamila; Grate, Jay W.

    2018-02-13

    Here, the safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than it might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides, that have been reported to date, are primarily focused on their environmental applications. The benefits of the application of flow methods in both monitoring of the nuclear wastes and process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) of the flow analysis with the β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming operations of the sample processing. Compared to the radiometric detection, the mass spectrometry (MS) detection enables one to perform multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.

  11. A Review of Flow Analysis Methods for Determination of Radionuclides in Nuclear Wastes and Nuclear Reactor Coolants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trojanowicz, Marek; Kolacinska, Kamila; Grate, Jay W.

    Here, the safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than it might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides, that have been reported to date, are primarily focused on their environmental applications. The benefits of the application of flow methods in both monitoring of the nuclear wastes and process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) of the flow analysis with the β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming operations of the sample processing. Compared to the radiometric detection, the mass spectrometry (MS) detection enables one to perform multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.

  12. A review of flow analysis methods for determination of radionuclides in nuclear wastes and nuclear reactor coolants.

    PubMed

    Trojanowicz, Marek; Kołacińska, Kamila; Grate, Jay W

    2018-06-01

    The safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than it might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides, that have been reported to date, are primarily focused on their environmental applications. The benefits of the application of flow methods in both monitoring of the nuclear wastes and process analysis of the primary circuit coolants of light water nuclear reactors will also be discussed. The application of either continuous flow methods (CFA) or injection methods (FIA, SIA) of the flow analysis with the β-radiometric detection shortens the analysis time and improves the precision of determination due to mechanization of certain time-consuming operations of the sample processing. Compared to the radiometric detection, the mass spectrometry (MS) detection enables one to perform multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection.

  13. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    NASA Astrophysics Data System (ADS)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables sensors to be analyzed jointly across modalities. It also provides plugin-based analysis interfaces for developing sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector was developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
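
    A minimal sketch of the sensor-side detection rules (threshold values are illustrative; the framework's plugin interfaces and video extraction are not reproduced here):

    ```python
    # Flag hard brakes and bump/pothole crossings in accelerometer samples.
    import numpy as np

    def detect_events(acc_g, fs, brake_thresh=-0.45, shock_thresh=0.8):
        # acc_g: (N, 3) acceleration in g, vehicle frame (x forward, z vertical)
        hard_brake = acc_g[:, 0] < brake_thresh       # sustained deceleration
        window = max(int(fs), 1)
        z_mean = np.convolve(acc_g[:, 2], np.ones(window) / window, mode='same')
        shock = np.abs(acc_g[:, 2] - z_mean) > shock_thresh  # vertical impulse
        # event indices map back to timestamps, which select video segments
        return np.flatnonzero(hard_brake), np.flatnonzero(shock)
    ```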

  14. Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.

    ERIC Educational Resources Information Center

    De Grave, W. S.; And Others

    1996-01-01

    To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…

  15. Information theoretic analysis of linear shift-invariant edge-detection operators

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2012-06-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influence of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end information theory based system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise, etc., that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.

  16. Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution

    NASA Astrophysics Data System (ADS)

    Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin

    2018-06-01

    Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted the spike candidates by imposing restrictions on parameters regarding spike shape and eigenvalues reflecting detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates for which the pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting the score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the proposed method using two statistical measures of precision and recall with respect to detection performance. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects with the score threshold that maximized the F-measure, with 58.6  ±  36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could be used to perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.
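
    One plausible reading of the gradient-eigenvalue step, sketched under stated assumptions (single-channel amplitudes in microvolts, outlyingness measured along the gradient cloud's minor axis; the paper's exact shape restrictions and clustering stages are not reproduced):

    ```python
    # Candidate negative spikes from backward/forward gradient pairs.
    import numpy as np

    def spike_candidates(x, neg_amp=-60.0, k=5.0):
        # x: single-channel EEG (microvolts)
        g = np.column_stack([x[1:-1] - x[:-2], x[2:] - x[1:-1]])
        w, V = np.linalg.eigh(np.cov(g.T))        # w[0]: minor-axis variance
        proj = (g - g.mean(axis=0)) @ V[:, 0]     # position along minor axis
        sharp = np.abs(proj) > k * np.sqrt(w[0])  # atypically sharp transition
        negative = x[1:-1] < neg_amp              # high-amplitude negative peak
        return np.flatnonzero(sharp & negative) + 1
    ```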

  17. Neurophysiological analysis of echolocation in bats

    NASA Technical Reports Server (NTRS)

    Suga, N.

    1972-01-01

    An analysis of echolocation and signal processing in brown bats is presented. Data cover echo detection, echo ranging, echolocalization, and echo analysis. Efforts were also made to identify the part of the brain that carries out the most essential processing function for echolocation. Results indicate the inferior colliculus and the auditory nuclei function together to process this information.

  18. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    PubMed

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.

  19. Quality assessment of raw and processed Arctium lappa L. through multicomponent quantification, chromatographic fingerprint, and related chemometric analysis.

    PubMed

    Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang

    2015-05-01

    In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for discrimination between raw and processed herbs. The dried fruits of Arctium lappa L. and their processed products are widely used in traditional Chinese medicine, yet their therapeutic effects are different. In this study, a novel strategy using high-performance liquid chromatography with diode array detection coupled with multivariate statistical analysis to rapidly distinguish raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect raw and processed Fructus Arctii in marketed herbal medicinal products.

  20. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.

    2015-04-01

    Detection of supernovae (SNe) and, more generally, of transient events in large surveys can produce numerous false detections. In the case of deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year data deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible signal coordinate contamination. We developed a subtracted image stack treatment to reduce the number of non SN-like events using morphological component analysis. This technique exploits the morphological diversity of objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects while most spurious detections exhibit different shapes. A two-step procedure was necessary to have a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte-Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all faint ones. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.

  1. Fault detection of Tennessee Eastman process based on topological features and SVM

    NASA Astrophysics Data System (ADS)

    Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen

    2018-03-01

    Fault detection in industrial processes is a popular research topic. Although the distributed control system (DCS) has been introduced to monitor the state of industrial processes, it still cannot satisfy all the requirements for fault detection in all industrial systems. In this paper, we propose a novel method based on topological features and a support vector machine (SVM) for fault detection in industrial processes. The proposed method takes the global information of the measured variables into account through a complex network model and uses an SVM to predict whether a system has developed faults. The proposed method can be divided into four steps: network construction, network analysis, model training, and model testing. Finally, we apply the model to the Tennessee Eastman process (TEP). The results show that this method works well and can be a useful supplement for fault detection in industrial processes.
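
    A sketch of the network-construction and feature steps on sliding windows of measured variables (the correlation threshold and the specific topological features are illustrative choices, not necessarily the paper's):

    ```python
    # Correlation network per window -> topological features -> SVM.
    import numpy as np
    import networkx as nx
    from sklearn.svm import SVC

    def window_features(X, thresh=0.6):
        # X: (samples, variables) window of process measurements
        C = np.corrcoef(X.T)
        n = C.shape[0]
        G = nx.Graph()
        G.add_nodes_from(range(n))
        G.add_edges_from((i, j) for i in range(n) for j in range(i + 1, n)
                         if abs(C[i, j]) > thresh)
        deg = np.array([d for _, d in G.degree()], dtype=float)
        return [G.number_of_edges(), deg.mean(), deg.max(),
                nx.density(G), nx.average_clustering(G)]

    # training: F = [window_features(w) for w in windows]
    # clf = SVC(kernel='rbf').fit(F, labels)   # labels: normal vs. fault
    ```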

  2. Integrated Multi-process Microfluidic Systems for Automating Analysis

    PubMed Central

    Yang, Weichun; Woolley, Adam T.

    2010-01-01

    Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343

  3. Nonlinear, non-stationary image processing technique for eddy current NDE

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita

    2012-05-01

    Automatic analysis of eddy current (EC) data has facilitated the analysis of the large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction, and classification. Accurate ROI detection has been enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and a support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
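
    A sketch of an HHT feature extractor for the classification stage (EMD comes from the external PyEMD package, an added dependency; the per-IMF statistics chosen here are assumptions, with defect classes learned by an SVM as in the paper):

    ```python
    # Hilbert-Huang features: EMD into IMFs, then instantaneous amplitude
    # and frequency statistics per IMF.
    import numpy as np
    from scipy.signal import hilbert
    from PyEMD import EMD              # pip install EMD-signal

    def hht_features(x, fs, n_imfs=4):
        imfs = EMD().emd(x)[:n_imfs]
        feats = []
        for imf in imfs:
            z = hilbert(imf)
            amp = np.abs(z)
            inst_f = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
            feats += [amp.mean(), amp.std(), inst_f.mean(), inst_f.std()]
        return np.array(feats)         # input vector for an SVM classifier
    ```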

  4. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
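
    The core physics behind such an electronic-temperature monitor can be illustrated with the classical two-line Boltzmann ratio (the paper itself uses multiple species and the LPO sub-pixel peak estimator, which the sketch below does not attempt to reproduce). The line intensities are made up, and the atomic constants are only approximate.

        # Two-line Boltzmann estimate of the plasma electronic temperature:
        # I ~ (A * g / lambda) * exp(-E / (k_B * T)) for lines of one species.
        import numpy as np

        k_B = 8.617333e-5  # Boltzmann constant, eV/K

        def electron_temperature(I1, I2, lam1, lam2, A1, A2, g1, g2, E1, E2):
            ratio = (I1 * A2 * g2 * lam1) / (I2 * A1 * g1 * lam2)
            return (E2 - E1) / (k_B * np.log(ratio))

        # Approximate Ar I line constants; the intensities are placeholders.
        T = electron_temperature(I1=1200.0, I2=800.0,
                                 lam1=696.5, lam2=750.4,   # nm
                                 A1=6.4e6, A2=4.5e7,       # s^-1
                                 g1=3, g2=1,
                                 E1=13.33, E2=13.48)       # upper-level energies, eV
        print(f"estimated T_e ~ {T:.0f} K")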

  5. Integrated analysis of error detection and recovery

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y. H.

    1985-01-01

    An integrated modeling and analysis of error detection and recovery is presented. When fault latency and/or error latency exist, the system may suffer from multiple faults or error propagations which seriously deteriorate the fault-tolerant capability. Several detection models that enable analysis of the effect of detection mechanisms on the subsequent error handling operations and the overall system reliability were developed. Following detection of the faulty unit and reconfiguration of the system, the contaminated processes or tasks have to be recovered. The strategies of error recovery employed depend on the detection mechanisms and the available redundancy. Several recovery methods including the rollback recovery are considered. The recovery overhead is evaluated as an index of the capabilities of the detection and reconfiguration mechanisms.

  6. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.

  7. Computer-assisted image processing to detect spores from the fungus Pandora neoaphidis.

    PubMed

    Korsnes, Reinert; Westrum, Karin; Fløistad, Erling; Klingen, Ingeborg

    2016-01-01

    This contribution demonstrates an example of experimental automatic image analysis to detect spores prepared on microscope slides derived from trapping. The application is to monitor aerial spore counts of the entomopathogenic fungus Pandora neoaphidis, which may serve as a biological control agent for aphids. Automatic detection of such spores can therefore play a role in plant protection. The present approach for such detection is a modification of traditional manual microscopy of prepared slides, where autonomous image recording precedes computerised image analysis. The purpose of the present image analysis is to support human visual inspection of imagery data - not to replace it. The workflow has three components: (1) preparation of slides for microscopy; (2) image recording; and (3) computerised image processing, where the initial part is, as usual, segmentation depending on the actual data product, followed by identification of blobs, calculation of their principal axes, symmetry operations, and projection onto a three-parameter egg-shape space.

  8. Novel image processing approach to detect malaria

    NASA Astrophysics Data System (ADS)

    Mas, David; Ferrer, Belen; Cojoc, Dan; Finaurini, Sara; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev

    2015-09-01

    In this paper we present a novel image processing algorithm providing good preliminary capabilities for in vitro detection of malaria. The proposed concept is based upon analysis of the temporal variation of each pixel. Changes in dark pixels indicate that intracellular activity has occurred, signaling the presence of the malaria parasite inside the cell. Preliminary experimental results involving analysis of red blood cells, either healthy or infected with malaria parasites, validated the potential benefit of the proposed numerical approach.

  9. [Image processing applying in analysis of motion features of cultured cardiac myocyte in rat].

    PubMed

    Teng, Qizhi; He, Xiaohai; Luo, Daisheng; Wang, Zhengrong; Zhou, Beiyi; Yuan, Zhirun; Tao, Dachang

    2007-02-01

    Study of the mechanism of drug action, by quantitative analysis of cultured cardiac myocytes, is one of the cutting-edge research areas in myocyte dynamics and molecular biology. The characteristic of cardiac myocytes of beating spontaneously without external stimulation makes this research meaningful. Studying the morphology and motion of cardiac myocytes using image analysis can reveal the fundamental mechanisms of drug action, increase the accuracy of drug screening, and support the design of optimal drug formulations for the best medical treatments. A system of hardware and software has been built with a complete set of functions, including living cardiac myocyte image acquisition, image processing, motion image analysis, and image recognition. In this paper, theories and approaches are introduced for analysing living cardiac myocyte motion images and implementing quantitative analysis of cardiac myocyte features. A motion estimation algorithm is used for motion vector detection at particular points and for amplitude and frequency detection of a cardiac myocyte. Beatings of cardiac myocytes are sometimes very small; in such cases, it is difficult to detect the motion vectors from the particular points in a time sequence of images. For this reason, image correlation theory is employed to detect the beating frequencies. An active contour algorithm in terms of an energy function is proposed to approximate the boundary and detect changes in the edge of the myocyte.

  10. Evolution and Advances in Satellite Analysis of Volcanoes

    NASA Astrophysics Data System (ADS)

    Dean, K. G.; Dehn, J.; Webley, P.; Bailey, J.

    2008-12-01

    Over the past 20 years, satellite data used for monitoring and analysis of volcanic eruptions have evolved in terms of timeliness, access, distribution, resolution, and understanding of volcanic processes. Initially, satellite data were used for retrospective analysis, but they have since come to support proactive monitoring systems. Timely acquisition of data and the capability to distribute large data files paralleled advances in computer technology and were critical components for near-real-time monitoring. The sharing of these data and the resulting discussions have improved our understanding of eruption processes and, even more importantly, their impact on society. To illustrate this evolution, critical scientific discoveries will be highlighted, including detection of airborne ash and sulfur dioxide, cloud-height estimates, prediction of ash cloud movement, and detection of thermal anomalies as precursor signals to eruptions. AVO has been a leader in implementing many of these advances in an operational setting, such as automated eruption detection, database analysis systems, and remotely accessible web-based analysis systems. Finally, limitations resulting from trade-offs in resolution, and how they lead to weaknesses in detection techniques and hazard assessments, will be presented.

  11. Amplitude image processing by diffractive optics.

    PubMed

    Cagigal, Manuel P; Valle, Pedro J; Canales, V F

    2016-02-22

    In contrast to standard digital image processing, which operates on the detected image intensity, we propose to perform amplitude image processing. Amplitude processing, like low-pass or high-pass filtering, is carried out using diffractive optical elements (DOE), since they allow operations on the complex field amplitude before it has been detected. We show the procedure for designing the DOE that corresponds to each operation. Furthermore, we carry out an analysis of amplitude image processing performance. In particular, a DOE Laplacian filter is applied to simulated astronomical images for detecting two stars one Airy ring apart. We also verify by numerical simulation that the use of a Laplacian amplitude filter produces less noisy images than standard digital image processing.

  12. Steelmaking process control using remote ultraviolet atomic emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Arnold, Samuel

    Steelmaking in North America is a multi-billion dollar industry that has faced tremendous economic and environmental pressure over the past few decades. Fierce competition has driven steel manufacturers to improve process efficiency through the development of real-time sensors to reduce operating costs. In particular, much attention has been focused on end point detection through furnace off gas analysis. Typically, off-gas analysis is done with extractive sampling and gas analyzers such as Non-dispersive Infrared Sensors (NDIR). Passive emission spectroscopy offers a more attractive approach to end point detection as the equipment can be setup remotely. Using high resolution UV spectroscopy and applying sophisticated emission line detection software, a correlation was observed between metal emissions and the process end point during field trials. This correlation indicates a relationship between the metal emissions and the status of a steelmaking melt which can be used to improve overall process efficiency.

  13. Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process

    NASA Technical Reports Server (NTRS)

    Racette, Paul

    2010-01-01

    Characterization of non-stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non-stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby the mixing of calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non-stationary processes. The derived information contained in the dynamic stochastic moments of a process will enable many novel applications.

  14. Visual analysis of trash bin processing on garbage trucks in low resolution video

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Loibner, Gernot

    2015-03-01

    We present a system for trash can detection and counting from a camera mounted on a garbage collection truck. A working prototype has been successfully implemented and tested with several hours of real-world video. The detection pipeline consists of HOG detectors for two trash can sizes, plus meanshift tracking and low-level image processing for the analysis of the garbage disposal process. Considering the harsh environment and unfavorable imaging conditions, the process already works well enough that very useful measurements can be extracted from the video data. The false positive/false negative rate of the full processing pipeline is about 5-6% in fully automatic operation. Video data of a full day (about 8 hrs) can be processed in about 30 minutes on a standard PC.

  15. Statistical sensor fusion analysis of near-IR polarimetric and thermal imagery for the detection of minelike targets

    NASA Astrophysics Data System (ADS)

    Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.

    1999-02-01

    We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.

  16. Single-tube analysis of DNA methylation with silica superparamagnetic beads.

    PubMed

    Bailey, Vasudev J; Zhang, Yi; Keeley, Brian P; Yin, Chao; Pelosky, Kristen L; Brock, Malcolm; Baylin, Stephen B; Herman, James G; Wang, Tza-Huei

    2010-06-01

    DNA promoter methylation is a signature for the silencing of tumor suppressor genes. Most widely used methods to detect DNA methylation involve 3 separate, independent processes: DNA extraction, bisulfite conversion, and methylation detection via a PCR method, such as methylation-specific PCR (MSP). This method includes many disconnected steps with associated losses of material, potentially reducing the analytical sensitivity required for analysis of challenging clinical samples. Methylation on beads (MOB) is a new technique that integrates DNA extraction, bisulfite conversion, and PCR in a single tube via the use of silica superparamagnetic beads (SSBs) as a common DNA carrier for facilitating cell debris removal and buffer exchange throughout the entire process. In addition, PCR buffer is used to directly elute bisulfite-treated DNA from SSBs for subsequent target amplifications. The diagnostic sensitivity of MOB was evaluated by methylation analysis of the CDKN2A [cyclin-dependent kinase inhibitor 2A (melanoma, p16, inhibits CDK4); also known as p16(INK4a)] promoter in serum DNA of lung cancer patients and compared with that of conventional methods. Methylation analysis consisting of DNA extraction followed by bisulfite conversion and MSP was successfully carried out within 9 h in a single tube. The median pre-PCR DNA yield was 6.61-fold higher with the MOB technique than with conventional techniques. Furthermore, MOB increased the diagnostic sensitivity in our analysis of the CDKN2A promoter in patient serum by successfully detecting methylation in 74% of cancer patients, vs the 45% detection rate obtained with conventional techniques. The MOB technique successfully combined 3 processes into a single tube, thereby allowing ease in handling and an increased detection throughput. The increased pre-PCR yield in MOB allowed efficient, diagnostically sensitive methylation detection.

  17. Reliably detectable flaw size for NDE methods that use calibration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws, such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine reliably detectable flaw size.
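
    As a hedged illustration of what a POD analysis produces (not the mh1823 implementation itself), the sketch below fits a hit/miss POD curve by logistic regression of detection outcome on log flaw size and reads off the size detected with 90% probability; all inspection data are simulated.

        # Hit/miss POD curve via logistic regression on log flaw size.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        size = rng.uniform(0.2, 3.0, 300)                    # flaw size, mm
        p_true = 1 / (1 + np.exp(-(np.log(size) - np.log(0.8)) / 0.25))
        hit = (rng.uniform(size=300) < p_true).astype(int)   # simulated calls

        pod = LogisticRegression().fit(np.log(size)[:, None], hit)
        grid = np.linspace(0.2, 3.0, 500)
        prob = pod.predict_proba(np.log(grid)[:, None])[:, 1]
        a90 = grid[np.argmax(prob >= 0.9)]                   # first size with POD >= 90%
        print(f"a90 ~ {a90:.2f} mm")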

  18. Reliably Detectable Flaw Size for NDE Methods that Use Calibration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws, such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine reliably detectable flaw size.

  19. An Analysis of the Magneto-Optic Imaging System

    NASA Technical Reports Server (NTRS)

    Nath, Shridhar

    1996-01-01

    The Magneto-Optic Imaging system is being used for the detection of defects in airframes and other aircraft structures. The system has been successfully applied to detecting surface cracks, but has difficulty in the detection of sub-surface defects such as corrosion. The intent of the grant was to understand the physics of the MOI better, in order to use it effectively for detecting corrosion and for classifying surface defects. Finite element analysis, image classification, and image processing are addressed.

  20. Directional analysis and filtering for dust storm detection in NOAA-AVHRR imagery

    NASA Astrophysics Data System (ADS)

    Janugani, S.; Jayaram, V.; Cabrera, S. D.; Rosiles, J. G.; Gill, T. E.; Rivera Rivera, N.

    2009-05-01

    In this paper, we propose spatio-spectral processing techniques for the detection of dust storms and for automatically finding their transport direction in 5-band NOAA-AVHRR imagery. Previous methods that use simple band-math analysis have produced promising results but have drawbacks in producing consistent results when low signal-to-noise ratio (SNR) images are used. Moreover, in seeking to automate dust storm detection, the presence of clouds in the vicinity of the dust storm makes it challenging to distinguish these two types of image texture. This paper not only addresses the detection of the dust storm in the imagery, it also attempts to find the transport direction and the location of the sources of the dust storm. We propose a spatio-spectral processing approach with two components: visualization and automation. Both are based on digital image processing techniques, including directional analysis and filtering. The visualization technique is intended to enhance the image in order to locate the dust sources. The automation technique is proposed to detect the transport direction of the dust storm. These techniques can be used in a system to provide timely warnings of dust storms or hazard assessments for transportation, aviation, environmental safety, and public health.

  1. Pneumothorax detection in chest radiographs using local and global texture signatures

    NASA Astrophysics Data System (ADS)

    Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit

    2015-03-01

    A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure, following which local analysis values are incorporated into a novel global image representation. The global representation is used for training and detection of the abnormality at the image level. The presented global representation is designed based on the distinctive shape of the lung, taking into account the characteristics of typical pneumothorax abnormalities. A supervised learning process was performed on both the local and global data, leading to a trained detection system. The system was tested on a dataset of 108 upright chest radiographs. Several state-of-the-art texture feature sets were experimented with (Local Binary Patterns, Maximum Response filters). The optimal configuration yielded a sensitivity of 81% with a specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for further improvements and extensions.

  2. Glial brain tumor detection by using symmetry analysis

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Binaghi, Elisabetta; Balbi, Sergio; De Benedictis, Alessandro; Monti, Emanuele; Minotto, Renzo

    2012-02-01

    In this work a fully automatic algorithm to detect brain tumors using symmetry analysis is proposed. In recent years a great deal of research in the field of medical imaging has focused on brain tumor segmentation. Quantitative analysis of MRI brain tumors yields useful key indicators of disease progression. The complex problem of segmenting tumors in MRI can be successfully addressed by considering modular and multi-step approaches mimicking the human visual inspection process. Tumor detection is often an essential preliminary phase for solving the segmentation problem successfully. In visual analysis of MRI, the first step of the expert's cognitive process is the detection of an anomaly with respect to normal tissue, whatever its nature. A healthy brain has a strong sagittal symmetry that is weakened by the presence of a tumor. The comparison between the healthy and ill hemispheres, considering that tumors are generally not symmetrically placed in both hemispheres, was used to detect the anomaly. A clustering method based on energy minimization through Graph-Cut is applied to the volume computed as the difference between the left hemisphere and the right hemisphere mirrored across the symmetry plane. Differential analysis entails losing knowledge of which side the tumor is on; the ill hemisphere is recognized through a histogram analysis. Many experiments were performed to assess the performance of the detection strategy on MRI volumes with tumors varying in shape, position, and intensity level. The experiments showed good results also in complex situations.
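
    A minimal numpy sketch of the symmetry cue is given below, assuming the sagittal plane is already known and skipping the Graph-Cut clustering and histogram steps described above; the volume and the "tumor" are synthetic.

        # Difference between one hemisphere and the mirror of the other.
        import numpy as np

        rng = np.random.default_rng(1)
        volume = rng.normal(100, 5, size=(64, 64, 64))     # toy MRI volume
        volume[40:50, 30:40, 20:30] += 60                  # synthetic "tumor"

        mid = volume.shape[0] // 2
        left = volume[:mid].astype(float)
        right_mirrored = volume[mid:][::-1].astype(float)  # mirror across the plane
        diff = np.abs(left - right_mirrored)

        # Large asymmetric residuals flag candidate anomalies:
        mask = diff > diff.mean() + 3 * diff.std()
        print("asymmetric voxels flagged:", int(mask.sum()))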

  3. A practical approach to tramway track condition monitoring: vertical track defects detection and identification using time-frequency processing technique

    NASA Astrophysics Data System (ADS)

    Bocz, Péter; Vinkó, Ákos; Posgay, Zoltán

    2018-03-01

    This paper presents an automatic method for detecting vertical track irregularities in tramway operation using acceleration measurements on trams. For monitoring of tramway tracks, an unconventional measurement setup is developed, which records the data of 3-axis wireless accelerometers mounted on wheel discs. Accelerations are processed to obtain the vertical track irregularities and to determine whether the track needs to be repaired. The automatic detection algorithm is based on time-frequency distribution analysis and determines the defect locations. Admissible limits (thresholds) are given for detecting moderate and severe defects using statistical analysis. The method was validated on frequented tram lines in Budapest and accurately detected severe defects with a hit rate of 100%, with no false alarms. The methodology is also sensitive to moderate and small rail surface defects at low operational speeds.

  4. Fault detection and diagnosis in an industrial fed-batch cell culture process.

    PubMed

    Gunther, Jon C; Conner, Jeremy S; Seborg, Dale E

    2007-01-01

    A flexible process monitoring method was applied to industrial pilot plant cell culture data for the purpose of fault detection and diagnosis. Data from 23 batches, 20 normal operating conditions (NOC) and three abnormal, were available. A principal component analysis (PCA) model was constructed from 19 NOC batches, and the remaining NOC batch was used for model validation. Subsequently, the model was used to successfully detect (both offline and online) abnormal process conditions and to diagnose the root causes. This research demonstrates that data from a relatively small number of batches (approximately 20) can still be used to monitor for a wide range of process faults.
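
    The abstract does not spell out the monitoring statistics; a common recipe for this kind of PCA monitoring combines Hotelling's T² on the scores with the squared prediction error (SPE) on the residuals, as in the hedged sketch below (synthetic data, simple percentile thresholds rather than formal control limits).

        # PCA monitoring with Hotelling's T^2 and SPE statistics.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X_noc = rng.normal(size=(200, 10))                  # normal operating data
        X_new = rng.normal(size=(50, 10))
        X_new[25:] += 3                                     # fault after sample 25

        scaler = StandardScaler().fit(X_noc)
        pca = PCA(n_components=3).fit(scaler.transform(X_noc))

        def t2_spe(X):
            Z = scaler.transform(X)
            scores = pca.transform(Z)
            t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
            resid = Z - pca.inverse_transform(scores)
            return t2, np.sum(resid**2, axis=1)

        t2_lim, spe_lim = (np.percentile(s, 99) for s in t2_spe(X_noc))
        t2, spe = t2_spe(X_new)
        alarms = (t2 > t2_lim) | (spe > spe_lim)
        print("first alarm at sample:", int(np.argmax(alarms)))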

  5. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks

    NASA Astrophysics Data System (ADS)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes.

  6. Edge detection and localization with edge pattern analysis and inflection characterization

    NASA Astrophysics Data System (ADS)

    Jiang, Bo

    2012-05-01

    In general, edges are considered to be abrupt changes or discontinuities in two-dimensional image signal intensity distributions. The accuracy of front-end edge detection methods in image processing impacts the eventual success of higher-level pattern analysis downstream. To generalize edge detectors designed from a simple ideal step-function model to the real distortions found in natural images, this research on one-dimensional edge pattern analysis proposes an edge detection algorithm built from three basic edge patterns: ramp, impulse, and step. After mathematical analysis, general rules for edge representation based upon the classification of edge types into these three categories (ramp, impulse, and step; RIS) are developed to reduce detection and localization errors, especially the "double edge" effect, an important drawback of derivative methods. However, when applying one-dimensional edge patterns in two-dimensional image processing, a new issue naturally arises: the edge detector should correctly mark inflections or junctions of edges. Research on human visual perception of objects and on information theory points out that a pattern lexicon of "inflection micro-patterns" carries more information than a straight line. Research on scene perception likewise suggests that contours carrying more information are a more important factor in determining the success of scene categorization. Inflections and junctions are therefore extremely useful features, whose accurate description and reconstruction are significant in solving correspondence problems in computer vision. Consequently, in addition to edge pattern analysis, inflection and junction characterization is used to extend the traditional derivative edge detection algorithm. Experiments were conducted to test these propositions about edge detection and localization accuracy. The results support the idea that these improvements are effective in enhancing the accuracy of edge detection and localization.

  7. Progressive data transmission for anatomical landmark detection in a cloud.

    PubMed

    Sofka, M; Ralovich, K; Zhang, J; Zhou, S K; Comaniciu, D

    2012-01-01

    In the concept of cloud-computing-based systems, various authorized users have secure access to patient records from a number of care delivery organizations from any location. This creates a growing need for remote visualization, advanced image processing, state-of-the-art image analysis, and computer aided diagnosis. This paper proposes a system of algorithms for automatic detection of anatomical landmarks in 3D volumes in the cloud computing environment. The system addresses the inherent problem of limited bandwidth between a (thin) client, data center, and data analysis server. The problem of limited bandwidth is solved by a hierarchical sequential detection algorithm that obtains data by progressively transmitting only image regions required for processing. The client sends a request to detect a set of landmarks for region visualization or further analysis. The algorithm running on the data analysis server obtains a coarse level image from the data center and generates landmark location candidates. The candidates are then used to obtain image neighborhood regions at a finer resolution level for further detection. This way, the landmark locations are hierarchically and sequentially detected and refined. Only image regions surrounding landmark location candidates need to be transmitted during detection. Furthermore, the image regions are lossy compressed with JPEG 2000. Together, these properties amount to at least 30 times bandwidth reduction while achieving similar accuracy when compared to an algorithm using the original data. The hierarchical sequential algorithm with progressive data transmission considerably reduces bandwidth requirements in cloud-based detection systems.

  8. Signal Processing and Interpretation Using Multilevel Signal Abstractions.

    DTIC Science & Technology

    1986-06-01

    mappings expressed in the Fourier domain. Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data ... can be processed either as individual one-dimensional waveforms or as multichannel data ... for source detection and direction ... microphone data. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves ...

  9. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
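
    The graph formulation lends itself to a very small sketch: treating candidate detections as vertices and thresholded similarities as edges, connected components give event groups and summed edge weights give a crude confidence score. The similarity matrix below is a made-up stand-in for FAST-style detector output, not real seismic data.

        # Similarity-graph post-processing: grouping and scoring candidates.
        import numpy as np
        from scipy.sparse import csr_matrix
        from scipy.sparse.csgraph import connected_components

        sim = np.array([[0, .9, .8, 0, 0, 0],
                        [.9, 0, .7, 0, 0, 0],
                        [.8, .7, 0, 0, 0, 0],
                        [0, 0, 0, 0, .6, 0],
                        [0, 0, 0, .6, 0, 0],
                        [0, 0, 0, 0, 0, 0]])
        graph = csr_matrix(sim > 0.5)                 # keep edges above a threshold
        n_groups, labels = connected_components(graph, directed=False)

        confidence = sim.sum(axis=1)                  # crude per-candidate score
        print("event groups:", labels)
        print("confidence:", confidence)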

  10. Location precision analysis of stereo thermal anti-sniper detection system

    NASA Astrophysics Data System (ADS)

    He, Yuqing; Lu, Ya; Zhang, Xiaoyan; Jin, Weiqi

    2012-06-01

    Anti-sniper detection devices are an urgent requirement in modern warfare, and the precision of the anti-sniper detection system is especially important. This paper discusses the location precision analysis of an anti-sniper detection system based on dual thermal imaging. It mainly discusses two error sources: the digital quantization effects of the cameras, and the error in estimating the coordinates of the bullet trajectory from the infrared images during image matching. The error-analysis formula is derived from the stereovision model and the quantization effects of the cameras. From this, we obtain the relationship between the detection accuracy and the system's parameters. The analysis in this paper provides the theoretical basis for error compensation algorithms intended to improve the accuracy of 3D reconstruction of the bullet trajectory in anti-sniper detection devices.

  11. Real-time detection of hazardous materials in air

    NASA Astrophysics Data System (ADS)

    Schechter, Israel; Schroeder, Hartmut; Kompa, Karl L.

    1994-03-01

    A new detection system has been developed for real-time analysis of organic compounds in ambient air. It is based on multiphoton ionization by an unfocused laser beam in a single parallel-plate device. Thus, the ionization volume can be relatively large. The amount of laser-created ions is determined quantitatively from the induced total voltage drop between the biased plates (Q = ΔV·C). Mass information is obtained from computer analysis of the time-dependent signal. When a KrF laser (5 eV) is used, most organic compounds can be ionized in a two-photon process, but none of the standard components of atmospheric air are ionized by this process. Therefore, this instrument may be developed as a `sniffer' for organic materials. The method has been applied to benzene analysis in air. The detection limit is about 10 ppb. With a simple preconcentration technique the detection limit can be decreased to the sub-ppb range. Simple binary mixtures are also resolved.
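
    The charge relation is simple enough for a back-of-the-envelope reading: the measured voltage drop times the plate capacitance gives the created charge, and dividing by the elementary charge counts singly charged ions. The capacitance and voltage below are assumed values for illustration, not the instrument's.

        # Counting laser-created ions from Q = dV * C.
        C = 100e-12    # plate capacitance, farads (assumed)
        dV = 1.6e-3    # measured voltage drop, volts (assumed)
        e = 1.602e-19  # elementary charge, coulombs

        Q = dV * C
        print(f"charge: {Q:.2e} C -> ~{Q / e:.2e} singly charged ions")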

  12. Visual verification and analysis of cluster detection for molecular dynamics.

    PubMed

    Grottel, Sebastian; Reina, Guido; Vrabec, Jadran; Ertl, Thomas

    2007-01-01

    A current research topic in molecular thermodynamics is the condensation of vapor to liquid and the investigation of this process at the molecular level. Condensation is found in many physical phenomena, e.g. the formation of atmospheric clouds or the processes inside steam turbines, where a detailed knowledge of the dynamics of condensation processes will help to optimize energy efficiency and avoid problems with droplets of macroscopic size. The key properties of these processes are the nucleation rate and the critical cluster size. For the calculation of these properties it is essential to make use of a meaningful definition of molecular clusters, which is currently not a completely resolved issue. In this paper a framework capable of interactively visualizing molecular datasets of such nucleation simulations is presented, with an emphasis on the detected molecular clusters. To check the quality of the results of the cluster detection, our framework introduces the concept of flow groups to highlight potential cluster evolution over time which is not detected by the employed algorithm. To confirm the findings of the visual analysis, we coupled the rendering view with a schematic view of the clusters' evolution. This makes it possible to rapidly assess the quality of the molecular cluster detection algorithm and to identify locations in the simulation data, in space as well as in time, where the cluster detection fails. Thus, thermodynamics researchers can eliminate weaknesses in their cluster detection algorithms. Several examples for the effective and efficient usage of our tool are presented.

  13. A model of human decision making in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1982-01-01

    Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many such situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he believes events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
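
    As a hedged sketch of the modeling idea (not the paper's exact formulation), a linear discriminant classifier can turn monitored features into per-process event probabilities, and attention can then go to the process with the highest estimate; the features here are synthetic.

        # Discriminant analysis as a generator of event-probability estimates.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal(0.0, 1.0, size=(100, 2)),   # no event
                       rng.normal(2.0, 1.0, size=(100, 2))])  # event
        y = np.array([0] * 100 + [1] * 100)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        observations = rng.normal(1.0, 1.0, size=(5, 2))      # one row per process
        p_event = lda.predict_proba(observations)[:, 1]
        print("attend first to process", int(np.argmax(p_event)),
              "p =", p_event.round(2))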

  14. Change detection in satellite images

    NASA Astrophysics Data System (ADS)

    Thonnessen, U.; Hofele, G.; Middelmann, W.

    2005-05-01

    Change detection plays an important role in different military areas such as strategic reconnaissance, verification of armament and disarmament control, and damage assessment. It is the process of identifying differences in the state of an object or phenomenon by observing it at different times. The availability of spaceborne reconnaissance systems with high spatial resolution, multispectral capabilities, and short revisit times offers new perspectives for change detection. Before performing any kind of change detection it is necessary to separate changes of interest from changes caused by differences in data acquisition parameters; in these cases it is necessary to perform pre-processing to correct or normalize the data. Image registration and, corresponding to this task, the ortho-rectification of the image data are further prerequisites for change detection. If feasible, a 1-to-1 geometric correspondence should be sought. Change detection on an iconic level with subsequent interpretation of the changes by the observer is often proposed; nevertheless, an automatic knowledge-based analysis delivering the interpretation of the changes on a semantic level should be the aim for the future. We present first results of change detection on a structural level for urban areas. After pre-processing, the images are segmented into areas of interest, and structural analysis is applied to these regions to extract descriptions of urban infrastructure such as buildings, roads, and tanks of refineries. These descriptions are matched to detect changes and similarities.

  15. Design and Performance of the Astro-E/XRS Signal Processing System

    NASA Technical Reports Server (NTRS)

    Boyce, Kevin R.; Audley, M. D.; Baker, R. G.; Dumonthier, J. J.; Fujimoto, R.; Gendreau, K. C.; Ishisaki, Y.; Kelley, R. L.; Stahle, C. K.; Szymkowiak, A. E.

    1999-01-01

    We describe the signal processing system of the Astro-E XRS instrument. The Calorimeter Analog Processor (CAP) provides bias and power for the detectors and amplifies the detector signals by a factor of 20,000. The Calorimeter Digital Processor (CDP) performs the digital processing of the calorimeter signals, detecting X-ray pulses and analyzing them by optimal filtering. We describe the operation of pulse detection, pulse-height analysis, and risetime determination. We also discuss performance, including the three event grades (high-res, mid-res, and low-res), anticoincidence detection, counting-rate dependence, and noise rejection.

  16. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    PubMed

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than subjective analysis; these additional responses were manually verified to be true positives, indicating the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected, in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common to both analyses. This not only validates the software utility for giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).

  17. The Data Analysis in Gravitational Wave Detection

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-ge; Lebigot, Eric; Du, Zhi-hui; Cao, Jun-wei; Wang, Yun-yong; Zhang, Fan; Cai, Yong-zhi; Li, Mu-zi; Zhu, Zong-hong; Qian, Jin; Yin, Cong; Wang, Jian-bo; Zhao, Wen; Zhang, Yang; Blair, David; Ju, Li; Zhao, Chun-nong; Wen, Lin-qing

    2017-01-01

    Gravitational-wave (GW) astronomy, based on GW detection, is a rising interdisciplinary field and a new window for humanity to observe the universe, following traditional astronomy, which uses electromagnetic waves as its detection means. It is of great significance for studying the origin and evolution of the universe and for extending the field of astronomical research. The advent of the laser-interferometer GW detector has opened a new era of GW detection, and the processing and analysis of GW data have developed quickly around the world, providing sharp tools for GW astronomy. This paper systematically introduces the software tools commonly used for GW data analysis, and discusses in detail the basic methods used, such as time-frequency analysis, composite analysis, pulsar timing analysis, the matched filter, templates, the χ2 test, and Monte-Carlo simulation.
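
    Of the methods listed, the matched filter is compact enough for a toy sketch: correlate the data against a template and normalise by the template energy. The version below assumes white unit-variance noise (real GW pipelines whiten by the detector noise spectrum first) and uses a made-up template.

        # Toy matched filter via FFT cross-correlation.
        import numpy as np

        rng = np.random.default_rng(3)
        n, t0, m = 4096, 1500, 256
        template = np.sin(2 * np.pi * 50 * np.arange(m) / 1024.0) * np.hanning(m)
        data = rng.standard_normal(n)
        data[t0:t0 + m] += template                  # buried signal

        tpad = np.zeros(n)
        tpad[:m] = template                          # zero-pad to the data length
        corr = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(tpad)), n)
        snr = corr / np.sqrt(np.sum(template**2))    # white-noise normalisation
        print("peak SNR", round(float(snr.max()), 1),
              "at sample", int(np.argmax(snr)))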

  18. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    PubMed

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques: tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method with a detection accuracy of 82% and an average detection time of 9.64 days.
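
    For concreteness, a minimal EWMA chart of the kind evaluated above can be written in a few lines: the smoothed statistic is z_i = lambda*x_i + (1-lambda)*z_{i-1}, with the usual time-varying control limits. The smoothing constant, limit width, and simulated transfer times below are textbook-style assumptions, not the paper's settings.

        # EWMA control chart on simulated daily transfer times.
        import numpy as np

        rng = np.random.default_rng(4)
        x = np.concatenate([rng.normal(10, 1, 60),     # baseline transfer times (s)
                            rng.normal(12, 1, 30)])    # simulated health change
        mu, sigma = x[:60].mean(), x[:60].std()
        lam, L = 0.2, 3.0

        z = np.empty_like(x)
        z[0] = lam * x[0] + (1 - lam) * mu
        for k in range(1, x.size):
            z[k] = lam * x[k] + (1 - lam) * z[k - 1]

        idx = np.arange(1, x.size + 1)
        width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * idx)))
        out = (z > mu + width) | (z < mu - width)
        print("change flagged at day:", int(np.argmax(out)))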

  19. HPLC-PDA Combined with Chemometrics for Quantitation of Active Components and Quality Assessment of Raw and Processed Fruits of Xanthium strumarium L.

    PubMed

    Jiang, Hai; Yang, Liu; Xing, Xudong; Yan, Meiling; Guo, Xinyue; Yang, Bingyou; Wang, Qiuhong; Kuang, Haixue

    2018-01-25

    As a valuable herbal medicine, the fruits of Xanthium strumarium L. (Xanthii Fructus) have been widely used in raw and processed forms to achieve different therapeutic effects in practice. In this study, a comprehensive strategy was proposed for evaluating the active components in 30 batches of raw and processed Xanthii Fructus (RXF and PXF) samples, based on high-performance liquid chromatography coupled with photodiode array detection (HPLC-PDA). Twelve common peaks were detected and eight compounds of caffeoylquinic acids were simultaneously quantified in RXF and PXF. All the analytes were detected with satisfactory linearity (R² > 0.9991) over wide concentration ranges. Simultaneously, the chemically latent information was revealed by hierarchical cluster analysis (HCA) and principal component analysis (PCA). The results suggest that there were significant differences between RXF and PXF from different regions in terms of the content of eight caffeoylquinic acids. Potential chemical markers for XF were found during processing by chemometrics.

  20. Effective connectivities of cortical regions for top-down face processing: A Dynamic Causal Modeling study

    PubMed Central

    Li, Jun; Liu, Jiangang; Liang, Jimin; Zhang, Hongchuan; Zhao, Jizheng; Rieth, Cory A.; Huber, David E.; Li, Wu; Shi, Guangming; Ai, Lin; Tian, Jie; Lee, Kang

    2013-01-01

    To study top-down face processing, the present study used an experimental paradigm in which participants detected non-existent faces in pure noise images. Conventional BOLD signal analysis identified three regions involved in this illusory face detection. These regions included the left orbitofrontal cortex (OFC) in addition to the right fusiform face area (FFA) and right occipital face area (OFA), both of which were previously known to be involved in both top-down and bottom-up processing of faces. We used Dynamic Causal Modeling (DCM) and Bayesian model selection to further analyze the data, revealing both intrinsic and modulatory effective connectivities among these three cortical regions. Specifically, our results support the claim that the orbitofrontal cortex plays a crucial role in the top-down processing of faces by regulating the activities of the occipital face area, and the occipital face area in turn detects the illusory face features in the visual stimuli and then provides this information to the fusiform face area for further analysis. PMID:20423709

  1. Stability Analysis of Radial Turning Process for Superalloys

    NASA Astrophysics Data System (ADS)

    Jiménez, Alberto; Boto, Fernando; Irigoien, Itziar; Sierra, Basilio; Suarez, Alfredo

    2017-09-01

    Stability detection in machining processes is an essential component of the design of efficient machining processes. Automatic methods are able to determine when instability is occurring and prevent possible machine failures. In this work, a variety of methods are proposed for detecting stability anomalies based on the forces measured in the radial turning of superalloys. Two different methods are proposed to determine instabilities; each is tested on real data obtained in the machining of Waspaloy, Haynes 282 and Inconel 718. Experimental data, in both conventional and High Pressure Coolant (HPC) environments, are grouped into four different states depending on material grain size and hardness (LGA, LGS, SGA and SGS). Results reveal that the PCA method is useful for visualization of the process and detection of anomalies in online processes.

  2. Higher-Order Optical Modes and Nanostructures for Detection and Imaging Applications

    NASA Astrophysics Data System (ADS)

    Schultz, Zachary D.; Levin, Ira W.

    2010-08-01

    Raman spectroscopy offers a label-free, chemically specific, method of detecting molecules; however, the low cross-section attendant to this scattering process has hampered trace detection. The realization that scattering is enhanced at a metallic surface has enabled new techniques for spectroscopic and imaging analysis.

  3. In-TFT-array-process micro defect inspection using nonlinear principal component analysis.

    PubMed

    Liu, Yi-Hung; Wang, Chi-Kai; Ting, Yung; Lin, Wei-Zhi; Kang, Zhi-Hao; Chen, Ching-Shun; Hwang, Jih-Shang

    2009-11-20

    Defect inspection plays a critical role in thin film transistor liquid crystal display (TFT-LCD) manufacture, and has received much attention in the field of automatic optical inspection (AOI). Previously, most attention was paid to the problems of macro-scale Mura-defect detection in the cell process, but it has recently been found that the defects which substantially influence the yield rate of LCD panels are actually those in the TFT array process, which is the first process in TFT-LCD manufacturing. Defect inspection in the TFT array process is therefore considered a difficult task. This paper presents a novel inspection scheme based on the kernel principal component analysis (KPCA) algorithm, a nonlinear version of the well-known PCA algorithm. The inspection scheme can not only detect defects in the images captured from the surface of LCD panels, but also recognize the types of the detected defects automatically. Results, based on real images provided by an LCD manufacturer in Taiwan, indicate that the KPCA-based defect inspection scheme is able to achieve a defect detection rate of over 99% and a high defect classification rate of over 96% when the imbalanced support vector machine (ISVM) with 2-norm soft margin is employed as the classifier. More importantly, the inspection time is less than 1 s per input image.
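
    A rough scikit-learn sketch of the KPCA-then-classify idea is shown below; it substitutes an ordinary SVM with class weighting for the paper's imbalanced SVM, and uses random vectors as stand-ins for panel image patches.

        # KPCA feature extraction followed by SVM classification.
        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)
        X_ok = rng.normal(0, 1, size=(150, 64))          # defect-free patches
        X_bad = rng.normal(0, 1, size=(50, 64))
        X_bad[:, :8] += 2.5                              # synthetic defect signature
        X = np.vstack([X_ok, X_bad])
        y = np.array([0] * 150 + [1] * 50)

        model = make_pipeline(KernelPCA(n_components=10, kernel="rbf"),
                              SVC(class_weight="balanced"))  # crude imbalance handling
        model.fit(X[::2], y[::2])
        print("held-out accuracy:", model.score(X[1::2], y[1::2]))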

  4. Using terrestrial light detection and ranging (lidar) technology for land-surface analysis in the Southwest

    USGS Publications Warehouse

    Soulard, Christopher E.; Bogle, Rian

    2011-01-01

    Emerging technologies provide scientists with methods to measure Earth processes in new ways. One of these technologies--ultra-high-resolution, ground-based light detection and ranging (lidar)--is being used by USGS Western Geographic Science Center scientists to characterize the role of wind and fire processes in shaping desert landscapes of the Southwest United States.

  5. Advances of lab-on-a-chip in isolation, detection and post-processing of circulating tumour cells.

    PubMed

    Yu, Ling; Ng, Shu Rui; Xu, Yang; Dong, Hua; Wang, Ying Jun; Li, Chang Ming

    2013-08-21

    Circulating tumour cells (CTCs) are shed by primary tumours and are found in the peripheral blood of patients with metastatic cancers. Recent studies have shown that the number of CTCs corresponds with disease severity and prognosis. Therefore, detection and further functional analysis of CTCs are important for biomedical science, early diagnosis of cancer metastasis and tracking treatment efficacy in cancer patients, especially in point-of-care applications. Over the last few years, there has been an increasing shift towards not only capturing and detecting these rare cells, but also ensuring their viability for post-processing, such as cell culture and genetic analysis. High-throughput lab-on-a-chip (LOC) development has been spurred on to process and analyse heterogeneous real patient samples while gaining profound insights into cancer biology. In this review, we highlight how miniaturisation strategies together with nanotechnologies have been used to advance LOC for capturing, separating, enriching and detecting different CTCs efficiently, while meeting the challenges of cell viability, high-throughput multiplex or single-cell detection and post-processing. We begin this survey with an introduction to CTC biology, followed by a description of the use of various materials, microstructures and nanostructures for the design of LOC to achieve miniaturisation, as well as how various CTC capture or separation strategies can enhance cell capture and enrichment efficiencies, purity and viability. The significant progress of various nanotechnology-based detection techniques to achieve high sensitivities and low detection limits for viable CTCs and/or to enable CTC post-processing is presented and the fundamental insights are also discussed. Finally, the challenges and perspectives of the technologies are enumerated.

  6. StreakDet data processing and analysis pipeline for space debris optical observations

    NASA Astrophysics Data System (ADS)

    Virtanen, Jenni; Flohrer, Tim; Muinonen, Karri; Granvik, Mikael; Torppa, Johanna; Poikonen, Jonne; Lehti, Jussi; Santti, Tero; Komulainen, Tuomo; Naranen, Jyri

    We describe a novel data processing and analysis pipeline for optical observations of space debris. The monitoring of space object populations requires reliable acquisition of observational data to support the development and validation of space debris environment models, and the build-up and maintenance of a catalogue of orbital elements. In addition, data is needed for the assessment of conjunction events and for the support of contingency situations or launches. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly, and within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a “track before detect” problem, resulting in streaks, i.e., object trails of arbitrary lengths, in the images. The scope of the ESA-funded StreakDet (Streak detection and astrometric reduction) project is to investigate solutions for detecting and reducing streaks from optical images, particularly in the low signal-to-noise ratio (SNR) domain, where algorithms are not readily available yet. For long streaks, the challenge is to extract precise position information and related registered epochs with sufficient precision. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, there is a need to discuss and compare these approaches for space debris analysis, in order to develop and evaluate prototype implementations. In the StreakDet project, we develop algorithms applicable to single images (as opposed to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The proposed processing pipeline starts from the segmentation of the acquired image (i.e., the extraction of all sources), continues with the astrometric and photometric characterization of the candidate streaks, and ends with orbital validation of the detected streaks. A central concept of the pipeline is streak classification, which guides the actual characterization process by aiming to identify the interesting sources and to filter out the uninteresting ones, as well as by allowing the tailoring of algorithms for specific streak classes (e.g., point-like vs. long, disintegrated streaks). To validate the single-image detections, the processing is finalized by orbital analysis, resulting in a preliminary orbital classification (Earth-bound vs. non-Earth-bound orbit) for the detected streaks.
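
    The segmentation and streak-classification steps described above can be illustrated with a minimal sketch. The thresholds, the use of scikit-image, and the elongation criterion below are illustrative assumptions, not the StreakDet project's actual implementation.

```python
# A minimal sketch of source segmentation plus point/streak classification,
# assuming a background-subtracted image; all parameters are illustrative.
import numpy as np
from skimage import measure

def classify_sources(image, k_sigma=3.0, min_area=5, elong_cut=3.0):
    """Segment all sources, then split them into point-like and streak-like."""
    mask = image > image.mean() + k_sigma * image.std()   # k-sigma threshold
    labels = measure.label(mask)
    points, streaks = [], []
    for region in measure.regionprops(labels):
        if region.area < min_area:
            continue  # reject isolated noise pixels
        # Elongation = major/minor axis ratio of the best-fit ellipse.
        elong = region.major_axis_length / max(region.minor_axis_length, 1e-6)
        (streaks if elong > elong_cut else points).append(region)
    return points, streaks
```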

  7. Psychophysical Models for Signal Detection with Time Varying Uncertainty. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gai, E.

    1975-01-01

    Psychophysical models are developed for the behavior of the human operator in detection tasks that involve changes in detectability, correlation between observations, and deferred decisions. Classical Signal Detection Theory (SDT) is discussed and its emphasis on the sensory processes is contrasted with decision strategies. The analysis of decision strategies utilizes detection tasks with time-varying signal strength. The classical theory is modified to include such tasks and several optimal decision strategies are explored. Two methods of classifying strategies are suggested. The first method is similar to the analysis of ROC curves, while the second is based on the relation between the criterion level (CL) and the detectability. Experiments to verify the analysis of tasks with changes in signal strength are designed. The results show that subjects are aware of changes in detectability and tend to use strategies that involve changes in the CLs.

  8. Comparing single- and dual-process models of memory development.

    PubMed

    Hayes, Brett K; Dunn, John C; Joubert, Amy; Taylor, Robert

    2017-11-01

    This experiment examined single-process and dual-process accounts of the development of visual recognition memory. The participants, 6-7-year-olds, 9-10-year-olds and adults, were presented with a list of pictures which they encoded under shallow or deep conditions. They then made recognition and confidence judgments about a list containing old and new items. We replicated the main trends reported by Ghetti and Angelini in that recognition hit rates increased from 6 to 9 years of age, with larger age changes following deep than shallow encoding. Formal versions of the dual-process high threshold signal detection model and several single-process models (equal variance signal detection, unequal variance signal detection, mixture signal detection) were fit to the developmental data. The unequal variance and mixture signal detection models gave a better account of the data than either of the other models. A state-trace analysis found evidence for only one underlying memory process across the age range tested. These results suggest that single-process memory models based on memory strength are a viable alternative to dual-process models for explaining memory development. © 2016 John Wiley & Sons Ltd.

  9. Efficient Enrichment and Analysis of Vicinal-Diol-Containing Flavonoid Molecules Using Boronic-Acid-Functionalized Particles and Matrix-Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry.

    PubMed

    Kim, Eunjin; Kang, Hyunook; Choi, Insung; Song, Jihyeon; Mok, Hyejung; Jung, Woong; Yeo, Woon-Seok

    2018-05-09

    Detection and quantitation of flavonoids are relatively difficult compared to those of other small-molecule analytes because flavonoids undergo rapid metabolic processes, resulting in their elimination from the body. Here, we report an efficient enrichment method for facilitating the analysis of vicinal-diol-containing flavonoid molecules using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. In our strategy, boronic-acid-functionalized polyacrylamide particles were used, where boronic acids bound to vicinal diols to form boronate monoesters at basic pH. This complex remained intact during the enrichment processes, and the vicinal-diol-containing flavonoids were easily separated by centrifugation and subsequent acidic treatments. The selectivity and limit of detection of our strategy were confirmed by mass spectrometry analysis, and the validity was assessed by performing the detection and quantitation of quercetin in mouse organs.

  10. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Input to the event detection model of the information displayed to the experimental subjects allows comparison of the model's performance with the performance of the subjects.
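
    As a rough illustration of the modeling idea, a discriminant classifier can be fit to displayed process features and asked for the probability that an event has occurred, the quantity the model assumes the human generates. The features and training data below are invented for illustration.

```python
# Hedged sketch: discriminant analysis mapping displayed process features
# to an event probability. Features and training data are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X: one row per observation of a monitored process (e.g., mean and slope
# of the displayed signal); y: 1 if an event was present, else 0.
X_train = np.array([[0.1, 0.0], [0.2, 0.1], [1.5, 0.9], [1.8, 1.1]])
y_train = np.array([0, 0, 1, 1])

lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

# P(event | display): the quantity the model assumes the human generates.
p_event = lda.predict_proba(np.array([[1.2, 0.7]]))[0, 1]
print(f"estimated event probability: {p_event:.2f}")
```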

  11. Hidden Process Models

    DTIC Science & Technology

    2009-12-18

    cannot be detected with univariate techniques, but require multivariate analysis instead (Kamitani and Tong [2005]). Two other time series analysis ... learning for time series analysis. The historical record of DBNs can be traced back to Dean and Kanazawa [1988] and Dean and Wellman [1991], with ... Keywords: Hidden Process Models, probabilistic time series modeling, functional Magnetic Resonance Imaging

  12. Detecting 2LSB steganography using extended pairs of values analysis

    NASA Astrophysics Data System (ADS)

    Khalind, Omed; Aziz, Benjamin

    2014-05-01

    In this paper, we propose an extended pairs-of-values analysis to detect and estimate the amount of secret messages embedded with 2LSB replacement in digital images, based on the chi-square attack and the regularity rate in pixel values. The detection process is separated from the estimation of the hidden message length, as detection is the main requirement of any steganalysis method. Hence, the detection process acts as a discrete classifier, which classifies a given set of images into stego and clean classes. The method can accurately detect 2LSB replacement even when the message length is about 10% of the total capacity; it reaches its best performance, with an accuracy higher than 0.96 and a true positive rate of more than 0.997, when the amount of data is 20% to 100% of the total capacity. Moreover, the method makes no assumptions about either the image or the secret message, as it was tested with two sets of 3000 images, compressed and uncompressed, each embedded with a random message. This method of detection could also be used as an automated tool to analyse bulk images for hidden content, which could be used by digital forensics analysts in their investigation process.
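
    A minimal sketch of the chi-square side of such an analysis, adapted to 2LSB embedding, is given below: after 2LSB replacement, the four pixel values sharing the same upper six bits tend toward equal frequencies, so a chi-square test of quartet uniformity flags stego-like histograms. The quartet grouping and sparsity cutoff are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch of a chi-square uniformity test over value quartets
# (values 4k..4k+3 share their upper six bits and are equalized by 2LSB
# replacement). Grouping and cutoff are illustrative assumptions.
import numpy as np
from scipy.stats import chi2

def chi2_2lsb_pvalue(pixels):
    """Probability-style score: values near 1 suggest equalized quartets
    (stego-like); values near 0 suggest a clean histogram."""
    hist = np.bincount(pixels.ravel(), minlength=256).astype(float)
    quartets = hist.reshape(64, 4)
    expected = quartets.mean(axis=1, keepdims=True)
    valid = expected[:, 0] > 4            # skip sparsely populated quartets
    stat = ((quartets[valid] - expected[valid]) ** 2 / expected[valid]).sum()
    dof = 3 * int(valid.sum())            # three free counts per quartet
    return chi2.sf(stat, dof)
```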

  13. The use of multispectral sensing techniques to detect ponderosa pines trees under stress from insects or diseases

    NASA Technical Reports Server (NTRS)

    Heller, R. C.; Weber, F. P.; Zealear, K. A.

    1970-01-01

    The detection of stress induced by bark beetles in conifers is reviewed in two sections: (1) the analysis of very small scale aerial photographs taken by NASA's RB-57F aircraft on August 10, 1969, and (2) the analysis of multispectral imagery obtained by the optical-mechanical line scanner. Underexposure of all films taken from the RB-57 aircraft and inadequate flight coverage prevented drawing definitive conclusions regarding optimum scales and film combinations to detect the discolored infestations. Preprocessing of the scanner signals by both analog and digital computers improved the accuracy of target recognition. Selection and ranking of the best channels for signature recognition was the greatest contribution of digital processing. Improvements were made in separating hardwoods from conifers and old-kill pine trees from recent discolored trees and from healthy trees, but accuracy of detecting the green infested trees is still not acceptable on either the SPARC or thermal-contouring processor. From six years of experience in processing line scan data it is clear that the greatest gain in previsual detection of stress will occur when registered multispectral data from a single aperture or common instantaneous field of view scanner system can be collected and processed.

  14. Does the Butcher-on-the-Bus Phenomenon Require a Dual-Process Explanation? A Signal Detection Analysis

    PubMed Central

    Tunney, Richard J.; Mullett, Timothy L.; Moross, Claudia J.; Gardner, Anna

    2012-01-01

    The butcher-on-the-bus is a rhetorical device or hypothetical phenomenon that is often used to illustrate how recognition decisions can be based on different memory processes (Mandler, 1980). The phenomenon describes a scenario in which a person is recognized but the recognition is accompanied by a sense of familiarity or knowing characterized by an absence of contextual details such as the person’s identity. We report two recognition memory experiments that use signal detection analyses to determine whether this phenomenon is evidence for a recollection-plus-familiarity model of recognition or is better explained by a univariate signal detection model. We conclude that there is an interaction between confidence estimates and remember-know judgments that is not fully explained by either single-process signal detection or traditional dual-process models. PMID:22745631
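
    For readers unfamiliar with the underlying computation, the equal-variance signal detection quantities referred to in such analyses reduce to two numbers derived from hit and false-alarm rates; a minimal sketch follows, with made-up rates.

```python
# Minimal equal-variance signal detection computation: sensitivity d' and
# criterion c from hit and false-alarm rates. The rates are made up.
from scipy.stats import norm

def sdt_measures(hit_rate, fa_rate):
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_h - z_f, -0.5 * (z_h + z_f)   # (d', criterion c)

d, c = sdt_measures(hit_rate=0.80, fa_rate=0.20)
print(f"d' = {d:.2f}, c = {c:.2f}")        # d' = 1.68, c = 0.00
```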

  15. Comparison of formant detection methods used in speech processing applications

    NASA Astrophysics Data System (ADS)

    Belean, Bogdan

    2013-11-01

    The paper describes time-frequency representations of the speech signal together with the significance of formants in speech processing applications. Speech formants can be used in emotion recognition, sex discrimination, or the diagnosis of different neurological diseases. Taking into account the various applications of formant detection in speech signals, two methods for detecting formants are presented. First, the poles resulting from a complex analysis of LPC coefficients are used for formant detection. The second approach uses the Kalman filter for formant prediction along the speech signal. Results are presented for both approaches on real-life speech spectrograms. A comparison of the features of the proposed methods is also performed, in order to establish which method is more suitable for different speech processing applications.

  16. Analysis of Variance in Statistical Image Processing

    NASA Astrophysics Data System (ADS)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
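
    As a hedged illustration of the book's premise, the snippet below treats the pixels on either side of a candidate edge as ANOVA cells and uses the F statistic to test whether their means differ; the data are synthetic.

```python
# Synthetic illustration: one-way ANOVA on pixel samples either side of a
# candidate edge; a large F statistic rejects the 'no edge' hypothesis.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
left = rng.normal(100, 5, size=200)    # pixels left of the candidate edge
right = rng.normal(110, 5, size=200)   # pixels right of it

F, p = f_oneway(left, right)
print(f"F = {F:.1f}, p = {p:.3g}")
```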

  17. A Unified Mathematical Approach to Image Analysis.

    DTIC Science & Technology

    1987-08-31

    describes four instances of the paradigm in detail. Directions for ongoing and future research are also indicated. Keywords: Image processing; Algorithms; Segmentation; Boundary detection; Tomography; Global image analysis.

  18. Using failure mode and effects analysis to improve the safety of neonatal parenteral nutrition.

    PubMed

    Arenas Villafranca, Jose Javier; Gómez Sánchez, Araceli; Nieto Guindo, Miriam; Faus Felipe, Vicente

    2014-07-15

    Failure mode and effects analysis (FMEA) was used to identify potential errors and to enable the implementation of measures to improve the safety of neonatal parenteral nutrition (PN). FMEA was used to analyze the preparation and dispensing of neonatal PN from the perspective of the pharmacy service in a general hospital. A process diagram was drafted, illustrating the different phases of the neonatal PN process. Next, the failures that could occur in each of these phases were compiled and cataloged, and a questionnaire was developed in which respondents were asked to rate the following aspects of each error: incidence, detectability, and severity. The highest-scoring failures were considered high risk and identified as priority areas for improvement. The evaluation process detected a total of 82 possible failures. Among the phases with the highest number of possible errors were transcription of the medical order, formulation of the PN, and preparation of material for the formulation. After the classification of these 82 possible failures and of their relative importance, a checklist was developed to achieve greater control in the error-detection process. FMEA demonstrated that use of the checklist reduced the level of risk and improved the detectability of errors. FMEA was useful for detecting medication errors in the PN preparation process and enabling corrective measures to be taken. A checklist was developed to reduce errors in the most critical aspects of the process. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
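
    The scoring step described above is commonly reduced to a risk priority number (RPN), the product of severity, occurrence (incidence), and detectability scores; the sketch below shows that arithmetic with invented scores, not the study's actual ratings.

```python
# Hedged sketch of the usual FMEA arithmetic: severity x occurrence x
# detectability = risk priority number (RPN). All scores are invented.
failure_modes = {
    "transcription of the medical order": (8, 6, 5),
    "formulation of the PN":              (9, 4, 4),
    "preparation of material":            (6, 5, 3),
}
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{mode}: RPN = {score}")   # highest RPN = first priority
```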

  19. The effect of image processing on the detection of cancers in digital mammography.

    PubMed

    Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C

    2014-08-01

    OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.

  20. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices

    PubMed Central

    Golden, J.P.; Verbarg, J.; Howell, P.B.; Shriver-Lake, L.C.; Ligler, F.S.

    2012-01-01

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, was demonstrated for detection of Escherichia coli 0157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose–response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. PMID:22960010

  1. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices.

    PubMed

    Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S

    2013-02-15

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, was demonstrated for detection of Escherichia coli 0157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.

  2. Image classification of unlabeled malaria parasites in red blood cells.

    PubMed

    Zheng Zhang; Ong, L L Sharon; Kong Fang; Matthew, Athul; Dauwels, Justin; Ming Dao; Asada, Harry

    2016-08-01

    This paper presents a method to detect unlabeled malaria parasites in red blood cells. The current "gold standard" for malaria diagnosis is microscopic examination of a thick blood smear, a time-consuming process requiring extensive training. Our goal is to develop an automated process to identify malaria-infected red blood cells. Major issues in automated analysis of microscopy images of unstained blood smears include overlapping cells and oddly shaped cells. Our approach creates robust templates to detect infected and uninfected red cells. Histogram of Oriented Gradients (HOG) features are extracted from the templates and used to train a classifier offline. Next, the Viola-Jones object detection framework is applied to detect infected and uninfected red cells and the image background. Results show our approach outperforms classification approaches with PCA features by 50% and cell detection algorithms applying Hough transforms by 24%. The majority of related work is designed to automatically detect stained parasites in blood smears where the cells are fixed. Although it is more challenging to design algorithms for unstained parasites, our methods will allow analysis of parasite progression in live cells under different drug treatments.
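
    A minimal sketch of the offline training step follows: HOG features are extracted from fixed-size cell templates and used to fit a classifier. The template size, HOG parameters, and choice of a linear SVM are assumptions for illustration; the Viola-Jones detection stage is not reproduced.

```python
# Hedged sketch of offline template training: HOG features from fixed-size
# cell templates feed a linear SVM. Sizes, parameters, and the classifier
# are assumptions; the Viola-Jones stage is not shown.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(patches):
    """patches: iterable of equally sized grayscale cell templates."""
    return np.array([hog(p, orientations=9, pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)) for p in patches])

rng = np.random.default_rng(1)
infected = [rng.random((64, 64)) for _ in range(10)]    # stand-in patches
uninfected = [rng.random((64, 64)) for _ in range(10)]

X = hog_features(infected + uninfected)
y = np.array([1] * 10 + [0] * 10)
clf = SVC(kernel="linear").fit(X, y)   # trained offline, as described
```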

  3. Techniques of EMG signal analysis: detection, processing, classification and applications

    PubMed Central

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human computer interaction. A comparative study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  4. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    PubMed

    Lee, Jack; Zee, Benny Chung Ying; Li, Qing

    2013-01-01

    Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We propose a novel new-vessel detection method that combines statistical texture analysis (STA), high order spectrum analysis (HOS), and fractal analysis (FA); most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity, and accuracy (AUC) were obtained: 96.3%, 99.1%, and 98.5% (99.3%), respectively. The proposed method is found to improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.

  5. The visual analysis of emotional actions.

    PubMed

    Chouchourelou, Arieta; Matsuka, Toshihiko; Harber, Kent; Shiffrar, Maggie

    2006-01-01

    Is the visual analysis of human actions modulated by the emotional content of those actions? This question is motivated by a consideration of the neuroanatomical connections between visual and emotional areas. Specifically, the superior temporal sulcus (STS), known to play a critical role in the visual detection of action, is extensively interconnected with the amygdala, a center for emotion processing. To the extent that amygdala activity influences STS activity, one would expect to find systematic differences in the visual detection of emotional actions. A series of psychophysical studies tested this prediction. Experiment 1 identified point-light walker movies that convincingly depicted five different emotional states: happiness, sadness, neutral, anger, and fear. In Experiment 2, participants performed a walker detection task with these movies. Detection performance was systematically modulated by the emotional content of the gaits. Participants demonstrated the greatest visual sensitivity to angry walkers. The results of Experiment 3 suggest that local velocity cues to anger may account for high false alarm rates to the presence of angry gaits. These results support the hypothesis that the visual analysis of human action depends upon emotion processes.

  6. Hyperspectral data acquisition and analysis in imaging and real-time active MIR backscattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Jarvis, Jan; Haertelt, Marko; Hugger, Stefan; Butschek, Lorenz; Fuchs, Frank; Ostendorf, Ralf; Wagner, Joachim; Beyerer, Juergen

    2017-04-01

    In this work we present data analysis algorithms for the detection of hazardous substances in hyperspectral observations acquired using active mid-infrared (MIR) backscattering spectroscopy. We present a novel background extraction algorithm based on the adaptive target generation process proposed by Ren and Chang, called the adaptive background generation process (ABGP), which generates a robust and physically meaningful set of background spectra for operation of the well-known adaptive matched subspace detection (AMSD) algorithm. It is shown that the resulting AMSD-ABGP detection algorithm competes well with other widely used detection algorithms. The method is demonstrated on measurement data obtained by two fundamentally different active MIR hyperspectral data acquisition devices. A hyperspectral image sensor applicable in static scenes takes a wavelength-sequential approach to hyperspectral data acquisition, whereas a rapid wavelength-scanning single-element detector variant of the same principle uses spatial scanning to generate the hyperspectral observation. It is shown that the measurement timescale of the latter is sufficient for the application of the data analysis algorithms even in dynamic scenarios.
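
    The AMSD core referred to above admits a compact numpy sketch: a generalized likelihood ratio compares the residual energy of a spectrum projected away from the background subspace against its residual after projecting away the combined target-plus-background subspace. The ABGP background-selection step is omitted, and the function below is a textbook-style sketch rather than the authors' code.

```python
# Textbook-style sketch of the AMSD statistic (an assumption, not the
# authors' code): compare residual energies after projecting out the
# background subspace B versus the target+background subspace [T B].
import numpy as np

def perp_projector(S):
    """Projector onto the orthogonal complement of the column space of S."""
    return np.eye(S.shape[0]) - S @ np.linalg.pinv(S)

def amsd_statistic(x, T, B):
    """x: spectrum (n,); T, B: target/background spectra as columns."""
    P_b = perp_projector(B)
    P_tb = perp_projector(np.hstack([T, B]))
    return (x @ P_b @ x - x @ P_tb @ x) / (x @ P_tb @ x)  # large => target
```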

  7. Signal processing for the detection of explosive residues on varying substrates using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie

    2011-05-01

    Laser induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive, chemical analysis of substances with the benefit of little to no sample preparation. Therefore, LIBS is a viable technology for the detection of substances of interest in near real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues, however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection, however, the classification techniques developed on such data perform best against residue/substrate pairs that were included in model training but do not perform well when the residue/substrate pairs are not in the training set. Specifically residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model LIBS spectra resulting from the residue and substrate to attempt to separate the response from each of the two components. This separation process is performed jointly with classifier design to ensure that the classifier that is developed is able to detect residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to variations in substrate compared to standard chemometric techniques for residue detection.

  8. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) provide potential economic benefits and enable prognostic maintenance in wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive train CMS, enabling the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient for diagnosing machine faults under time-varying conditions. The current research trend in CMS for drive trains focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review the recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.

  9. MRI Post-processing in Pre-surgical Evaluation

    PubMed Central

    Wang, Z. Irene; Alexopoulos, Andreas V.

    2016-01-01

    Purpose of Review Advanced MRI post-processing techniques are increasingly used to complement visual analysis and elucidate structural epileptogenic lesions. This review summarizes recent developments in MRI post-processing in the context of epilepsy pre-surgical evaluation, with a focus on patients whose MRI is unremarkable by visual analysis (i.e., "nonlesional" MRI). Recent Findings Various methods of MRI post-processing have been reported to add clinical value in the following areas: (1) lesion detection on an individual level; (2) lesion confirmation for reducing the risk of over-reading the MRI; (3) detection of sulcal/gyral morphologic changes that are particularly difficult for visual analysis; and (4) delineation of cortical abnormalities extending beyond the visible lesion. Future directions to improve the performance of MRI post-processing include using higher magnetic field strength for better signal- and contrast-to-noise ratio, adopting a multi-contrast framework, and integration with other noninvasive modalities. Summary MRI post-processing can provide essential value to increase the yield of structural MRI and should be included as part of the presurgical evaluation of nonlesional epilepsies. MRI post-processing allows for more accurate identification/delineation of cortical abnormalities, which should then be more confidently targeted and mapped. PMID:26900745

  10. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and to estimate the trajectory and velocity of the foam that caused the accident.

  11. Techniques for fire detection

    NASA Technical Reports Server (NTRS)

    Bukowski, Richard W.

    1987-01-01

    An overview is given of the basis for an analysis of combustible materials and potential ignition sources in a spacecraft. First, the burning process is discussed in terms of the production of the fire signatures normally associated with detection devices. These include convected and radiated thermal energy, particulates, and gases. Second, the transport processes associated with the movement of these signatures from the fire to the detector, along with the important phenomena which cause the level of these signatures to be reduced, are described. Third, the operating characteristics of the individual types of detectors which influence their response to signals are presented. Finally, vulnerability analysis using predictive fire modeling techniques is discussed as a means to establish the necessary response of the detection system to provide the level of protection required in the application.

  12. Automatic Fringe Detection for Oil Film Interferometry Measurement of Skin Friction

    NASA Technical Reports Server (NTRS)

    Naughton, Jonathan W.; Decker, Robert K.; Jafari, Farhad

    2001-01-01

    This report summarizes two years of work on investigating algorithms for automatically detecting fringe patterns in images acquired using oil-drop interferometry for the determination of skin friction. Several different analysis methods were tested, and a combination of a windowed Fourier transform followed by a correlation was found to be most effective. The implementation of this method is discussed and details of the process are described. The results indicate that this method shows promise for automating the fringe detection process, but further testing is required.
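
    The first stage of that combination, a windowed Fourier transform used to localize the dominant fringe spatial frequency, can be sketched as below; the correlation stage is omitted and all parameters are illustrative.

```python
# Sketch of the windowed-Fourier stage on a synthetic fringe profile; the
# follow-up correlation stage is omitted and parameters are illustrative.
import numpy as np
from scipy.signal import stft

x = np.linspace(0, 1, 1024)
row = 1 + 0.5 * np.cos(2 * np.pi * 60 * x)   # synthetic interferogram row
row = row - row.mean()                       # drop DC so the fringe peak wins

f, pos, Z = stft(row, fs=1024, nperseg=128)  # windowed Fourier transform
peak = f[np.abs(Z).mean(axis=1).argmax()]    # dominant fringe frequency
print(f"fringe spacing ~ {1 / peak:.4f} of the row length")
```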

  13. An image-processing method to detect sub-optical features based on understanding noise in intensity measurements.

    PubMed

    Bhatia, Tripta

    2018-07-01

    Accurate quantitative analysis of image data requires that we distinguish, to the extent possible, between fluorescence intensity (true signal) and the noise inherent in its measurement. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures, using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening," to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR yields the positional accuracy needed to resolve features of interest whose width is below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected in this paper are of width [Formula: see text] and [Formula: see text] nm, respectively. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for detailed morphological analysis of features down to sub-optical length scales in images obtained by any kind of fluorescence intensity imaging in raster mode.

  14. A fast automatic target detection method for detecting ships in infrared scenes

    NASA Astrophysics Data System (ADS)

    Özertem, Kemal Arda

    2016-05-01

    Automatic target detection in infrared scenes is a vital task for many application areas such as defense, security, and border surveillance. For anti-ship missiles, having a fast and robust ship detection algorithm is crucial for overall system performance. In this paper, a straightforward yet effective ship detection method for infrared scenes is introduced. First, morphological grayscale reconstruction is applied to the input image, followed by automatic thresholding of the suppressed image. For the segmentation step, connected component analysis is employed to obtain target candidate regions. At this point, the detection is vulnerable to outliers such as small objects with relatively high intensity values or clouds. To deal with this drawback, a post-processing stage is introduced, using two different methods. First, noisy detection results are rejected with respect to target size. Second, the waterline is detected using the Hough transform, and detection results that lie above the waterline, beyond a small margin, are rejected. After the post-processing stage, undesired holes may still remain, which cause one object to be detected as multiple objects or prevent an object from being detected as a whole. To improve the detection performance, another automatic thresholding is applied only to the target candidate regions. Finally, the two detection results are fused and the post-processing stage is repeated to obtain the final detection result. The performance of the overall methodology is tested with real-world infrared test data.
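
    The pre-processing and segmentation stages lend themselves to a short sketch using scikit-image; the h-dome style reconstruction, Otsu threshold, and area cutoff below are illustrative stand-ins for the paper's exact parameters, and the waterline post-processing is omitted.

```python
# Hedged sketch of background suppression and segmentation, assuming an
# infrared image scaled to [0, 1]; h, the threshold rule, and min_area are
# illustrative stand-ins for the paper's parameters.
import numpy as np
from skimage import filters, measure, morphology

def ship_candidates(ir_image, h=0.2, min_area=20):
    # Reconstruction by dilation from (image - h) rebuilds the background;
    # subtracting it leaves bright structures ("h-domes") standing out.
    seed = np.clip(ir_image - h, 0, None)
    background = morphology.reconstruction(seed, ir_image, method='dilation')
    residue = ir_image - background
    mask = residue > filters.threshold_otsu(residue)   # automatic threshold
    labels = measure.label(mask)                       # connected components
    return [r for r in measure.regionprops(labels) if r.area >= min_area]
```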

  15. Feasibility of clinical detection of cervical dysplasia using angle-resolved low coherence interferometry measurements of depth-resolved nuclear morphology.

    PubMed

    Ho, Derek; Drake, Tyler K; Smith-McCune, Karen K; Darragh, Teresa M; Hwang, Loris Y; Wax, Adam

    2017-03-15

    This study sought to establish the feasibility of using in situ depth-resolved nuclear morphology measurements for detection of cervical dysplasia. Forty enrolled patients received routine cervical colposcopy with angle-resolved low coherence interferometry (a/LCI) measurements of nuclear morphology. a/LCI scans from 63 tissue sites were compared to histopathological analysis of co-registered biopsy specimens which were classified as benign, low-grade squamous intraepithelial lesion (LSIL), or high-grade squamous intraepithelial lesion (HSIL). Results were dichotomized as dysplastic (LSIL/HSIL) versus non-dysplastic and HSIL versus LSIL/benign to determine both accuracy and potential clinical utility of a/LCI nuclear morphology measurements. Analysis of a/LCI data was conducted using both traditional Mie theory based processing and a new hybrid algorithm that provides improved processing speed to ascertain the feasibility of real-time measurements. Analysis of depth-resolved nuclear morphology data revealed a/LCI was able to detect a significant increase in the nuclear diameter at the depth bin containing the basal layer of the epithelium for dysplastic versus non-dysplastic and HSIL versus LSIL/Benign biopsy sites (both p < 0.001). Both processing techniques resulted in high sensitivity and specificity (>0.80) in identifying dysplastic biopsies and HSIL. The hybrid algorithm demonstrated a threefold decrease in processing time at a slight cost in classification accuracy. The results demonstrate the feasibility of using a/LCI as an adjunctive clinical tool for detecting cervical dysplasia and guiding the identification of optimal biopsy sites. The faster speed from the hybrid algorithm offers a promising approach for real-time clinical analysis. © 2016 UICC.

  16. Feasibility of clinical detection of cervical dysplasia using angle-resolved low coherence interferometry measurements of depth-resolved nuclear morphology

    PubMed Central

    Ho, Derek; Drake, Tyler K.; Smith-McCune, Karen K.; Darragh, Teresa M.; Hwang, Loris Y.; Wax, Adam

    2017-01-01

    This study sought to establish the feasibility of using in situ depth-resolved nuclear morphology measurements for detection of cervical dysplasia. Forty (40) enrolled patients received routine cervical colposcopy with angle-resolved low coherence interferometry (a/LCI) measurements of nuclear morphology. a/LCI scans from 63 tissue sites were compared to histopathological analysis of co-registered biopsy specimens which were classified as benign, low-grade squamous intraepithelial lesion (LSIL), or high-grade squamous intraepithelial lesion (HSIL). Results were dichotomized as dysplastic (LSIL/HSIL) versus non-dysplastic and HSIL versus LSIL/benign to determine both accuracy and potential clinical utility of a/LCI nuclear morphology measurements. Analysis of a/LCI data was conducted using both traditional Mie theory based processing and a new hybrid algorithm that provides improved processing speed to ascertain the feasibility of real-time measurements. Analysis of depth-resolved nuclear morphology data revealed a/LCI was able to detect a significant increase in the nuclear diameter at the depth bin containing the basal layer of the epithelium for dysplastic versus non-dysplastic and HSIL versus LSIL/Benign biopsy sites (both p < 0.001). Both processing techniques resulted in high sensitivity and specificity (> 0.80) in identifying dysplastic biopsies and HSIL. The hybrid algorithm demonstrated a threefold decrease in processing time at a slight cost in classification accuracy. The results demonstrate the feasibility of using a/LCI as an adjunctive clinical tool for detecting cervical dysplasia and guiding the identification of optimal biopsy sites. The faster speed from the hybrid algorithm offers a promising approach for real-time clinical analysis. PMID:27883177

  17. In-TFT-Array-Process Micro Defect Inspection Using Nonlinear Principal Component Analysis

    PubMed Central

    Liu, Yi-Hung; Wang, Chi-Kai; Ting, Yung; Lin, Wei-Zhi; Kang, Zhi-Hao; Chen, Ching-Shun; Hwang, Jih-Shang

    2009-01-01

    Defect inspection plays a critical role in thin film transistor liquid crystal display (TFT-LCD) manufacture, and has received much attention in the field of automatic optical inspection (AOI). Previously, most focus was put on the problem of macro-scale Mura-defect detection in the cell process, but it has recently been found that the defects which substantially influence the yield rate of LCD panels are actually those in the TFT array process, which is the first process in TFT-LCD manufacturing. Defect inspection in the TFT array process is therefore critical, but is considered a difficult task. This paper presents a novel inspection scheme based on the kernel principal component analysis (KPCA) algorithm, a nonlinear version of the well-known PCA algorithm. The inspection scheme can not only detect defects in the images captured from the surface of LCD panels, but also recognize the types of the detected defects automatically. Results, based on real images provided by an LCD manufacturer in Taiwan, indicate that the KPCA-based defect inspection scheme is able to achieve a defect detection rate of over 99% and a high defect classification rate of over 96% when the imbalanced support vector machine (ISVM) with 2-norm soft margin is employed as the classifier. More importantly, the inspection time is less than 1 s per input image. PMID:20057957

  18. Phase analysis of coherent radial-breathing-mode phonons in carbon nanotubes: Implications for generation and detection processes

    NASA Astrophysics Data System (ADS)

    Shimura, Akihiko; Yanagi, Kazuhiro; Yoshizawa, Masayuki

    2018-01-01

    In time-resolved pump-probe spectroscopy of carbon nanotubes, the fundamental understanding of the optical generation and detection processes of radial-breathing-mode (RBM) phonons has been inconsistent among previous reports. In this study, a tunable-pumping/broadband-probing scheme was used to fully reveal the amplitude and phase of the phonon-modulated signals. We observed that signals detected off-resonantly with respect to excitonic transitions are delayed by π/2 radians relative to resonantly detected signals, which demonstrates that RBM phonons are detected through dynamic modulation of the linear response, not through adiabatic modulation of the light absorption. Furthermore, we found that the initial phases are independent of the pump detuning across the first (E11) and second (E22) excitonic resonances, evidencing that the RBM phonons are generated by displacive excitation rather than a stimulated Raman process.

  19. An effective fovea detection and automatic assessment of diabetic maculopathy in color fundus images.

    PubMed

    Medhi, Jyoti Prakash; Dandapat, Samarendra

    2016-07-01

    Prolonged diabetes causes severe damage to vision through leakage of blood and blood constituents over the retina. The effect of the leakage becomes more threatening when these abnormalities involve the macula. This condition is known as diabetic maculopathy, and it leads to blindness if not treated in time. Early detection and proper diagnosis can help prevent this irreversible damage, and the practical way to achieve this is retinal screening at regular intervals. But the ratio of ophthalmologists to patients is very small and the evaluation process is time consuming. Here, automatic methods for analyzing retinal/fundus images prove handy and help ophthalmologists screen at a faster rate. Motivated by this, an automated method for the detection and analysis of diabetic maculopathy is proposed in this work. The method is implemented in two stages. The first stage involves the preprocessing required to prepare the image for further analysis; the input image is enhanced and the optic disc is masked to avoid false detection during bright lesion identification. The second stage is maculopathy detection and its analysis. Here, retinal lesions including microaneurysms, hemorrhages, and exudates are identified by processing the green and hue color planes. The macula and fovea locations are determined using the intensity properties of the processed red-plane image. Different circular regions are then marked in the neighborhood of the macula, and the presence of lesions in these regions is identified to confirm positive maculopathy; this information is later used to evaluate severity. The principal advantage of the proposed algorithm is its use of the relationship of the blood vessels to the optic disc and macula, which enhances the detection process, together with its sequential use of information from the various color planes. The method is tested on various publicly available databases consisting of both normal and maculopathy images. The algorithm detects the fovea with an accuracy of 98.92% when applied to 1374 images. The average specificity and sensitivity of the proposed method for maculopathy detection are 98.05% and 98.86%, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.
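
    The macula-localization idea, finding the darkest smooth region of the processed red plane, can be sketched in a few lines; the smoothing scale is an illustrative assumption, not the paper's parameter.

```python
# Sketch of macula localization as the darkest region of a heavily blurred
# red plane; sigma is an illustrative assumption.
import numpy as np
from scipy.ndimage import gaussian_filter

def locate_macula(red_plane, sigma=15):
    smoothed = gaussian_filter(red_plane.astype(float), sigma=sigma)
    cy, cx = np.unravel_index(np.argmin(smoothed), smoothed.shape)
    return cy, cx   # centre of the darkest smooth region ~ macula/fovea
```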

  20. Music Perception in Dementia.

    PubMed

    Golden, Hannah L; Clark, Camilla N; Nicholas, Jennifer M; Cohen, Miriam H; Slattery, Catherine F; Paterson, Ross W; Foulkes, Alexander J M; Schott, Jonathan M; Mummery, Catherine J; Crutch, Sebastian J; Warren, Jason D

    2017-01-01

    Despite much recent interest in music and dementia, music perception has not been widely studied across dementia syndromes using an information processing approach. Here we addressed this issue in a cohort of 30 patients representing major dementia syndromes of typical Alzheimer's disease (AD, n = 16), logopenic aphasia (LPA, an Alzheimer variant syndrome; n = 5), and progressive nonfluent aphasia (PNFA; n = 9) in relation to 19 healthy age-matched individuals. We designed a novel neuropsychological battery to assess perception of musical patterns in the dimensions of pitch and temporal information (requiring detection of notes that deviated from the established pattern based on local or global sequence features) and musical scene analysis (requiring detection of a familiar tune within polyphonic harmony). Performance on these tests was referenced to generic auditory (timbral) deviance detection and recognition of familiar tunes and adjusted for general auditory working memory performance. Relative to healthy controls, patients with AD and LPA had group-level deficits of global pitch (melody contour) processing while patients with PNFA as a group had deficits of local (interval) as well as global pitch processing. There was substantial individual variation within syndromic groups. Taking working memory performance into account, no specific deficits of musical temporal processing, timbre processing, musical scene analysis, or tune recognition were identified. The findings suggest that particular aspects of music perception such as pitch pattern analysis may open a window on the processing of information streams in major dementia syndromes. The potential selectivity of musical deficits for particular dementia syndromes and particular dimensions of processing warrants further systematic investigation.

  1. Application of higher order SVD to vibration-based system identification and damage detection

    NASA Astrophysics Data System (ADS)

    Chao, Shu-Hsien; Loh, Chin-Hsiung; Weng, Jian-Huang

    2012-04-01

    Singular value decomposition (SVD) is a powerful linear algebra tool. It is widely used in many different signal processing methods, such as principal component analysis (PCA), singular spectrum analysis (SSA), frequency domain decomposition (FDD), and subspace and stochastic subspace identification methods (SI and SSI). In each case, the data are arranged appropriately in matrix form and SVD is used to extract the features of the data set. In this study three different algorithms for signal processing and system identification are considered: SSA, SSI-COV, and SSI-DATA. Based on the subspace and null-space extracted from the SVD of the data matrix, damage detection algorithms can be developed. The proposed algorithm is used to process shaking table test data from a 6-story steel frame. Features contained in the vibration data are extracted by the proposed method, and damage detection can then be investigated from the test data of the frame structure through subspace-based and null-space-based damage indices.
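
    The common core of the three algorithms, embedding the response in a trajectory (Hankel) matrix and reading features off its SVD, can be sketched as follows; the window length and test signal are illustrative.

```python
# Sketch of the shared SSA/SSI building block: a Hankel trajectory matrix
# whose SVD separates signal and noise subspaces. Window length is
# illustrative.
import numpy as np

def trajectory_svd(signal, window):
    n = len(signal) - window + 1
    H = np.column_stack([signal[i:i + window] for i in range(n)])
    return np.linalg.svd(H, full_matrices=False)   # U, s, Vt

t = np.linspace(0, 10, 1000)
x = np.sin(2 * np.pi * 1.5 * t)
x += 0.1 * np.random.default_rng(2).normal(size=t.size)
U, s, Vt = trajectory_svd(x, window=50)
print(s[:4] / s.sum())   # a sharp drop after two values betrays one sine mode
```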

  2. Research on a Defects Detection Method in the Ferrite Phase Shifter Cementing Process Based on a Multi-Sensor Prognostic and Health Management (PHM) System.

    PubMed

    Wan, Bo; Fu, Guicui; Li, Yanruoyue; Zhao, Youhu

    2016-08-10

    The cementing manufacturing process of ferrite phase shifters suffers from the defect that cementing strength is insufficient and fractures appear. A method for detecting these defects was studied utilizing multi-sensor Prognostic and Health Management (PHM) theory. The reasons that lead to these process defects are analyzed in this paper. The key process parameters were determined, and Differential Scanning Calorimetry (DSC) tests during the cure process of the resin cementing were carried out. In addition, to obtain data on changing cementing strength, multiple groups of cementing process tests with different key process parameters were designed and conducted. A relational model linking cementing strength to cure temperature, time, and pressure was established by combining the DSC and process-test data on the basis of the Avrami formula. Through sensitivity analysis of the three process parameters, the on-line detection decision criterion and the process parameters with an obvious impact on cementing strength were determined. A PHM system with multiple temperature and pressure sensors was established on this basis, and on-line detection, diagnosis, and control of ferrite phase shifter cementing process defects were realized. Subsequent production verified that the on-line detection system improved the reliability of the ferrite phase shifter cementing process and reduced the incidence of insufficient cementing strength defects.
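
    The Avrami relation the strength model is said to be built on has the standard form X(t) = 1 - exp(-k t^n); the sketch below evaluates it with invented constants, since the fitted parameters are not given in the abstract.

```python
# Standard Avrami form with invented constants; the paper's fitted
# parameters are not given in the abstract.
import numpy as np

def avrami_conversion(t, k, n):
    """Fraction transformed (degree of cure) after time t."""
    return 1.0 - np.exp(-k * t ** n)

t = np.linspace(0, 120, 5)               # cure time, e.g. minutes
print(avrami_conversion(t, k=0.001, n=1.8))
```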

  3. A simplified implementation of edge detection in MATLAB is faster and more sensitive than fast fourier transform for actin fiber alignment quantification.

    PubMed

    Kemeny, Steven Frank; Clyne, Alisa Morss

    2011-04-01

    Fiber alignment plays a critical role in the structure and function of cells and tissues. While fiber alignment quantification is important to experimental analysis and several different methods for quantifying fiber alignment exist, many studies focus on qualitative rather than quantitative analysis, perhaps due to the complexity of current fiber alignment methods. The speed and sensitivity of edge detection and the fast Fourier transform (FFT) were compared for measuring actin fiber alignment in cells exposed to shear stress. While edge detection using matrix multiplication was consistently more sensitive than FFT, image processing time was significantly longer. However, when MATLAB functions were used to implement edge detection, MATLAB's efficient element-by-element calculations and fast filtering techniques reduced the computation cost 100-fold compared to the matrix multiplication edge detection method. The new computation time was comparable to the FFT method, and MATLAB edge detection produced well-distributed fiber angle distributions that statistically distinguished aligned and unaligned fibers in half as many sample images. When the FFT sensitivity was improved by dividing images into smaller subsections, the processing time grew larger than that required for MATLAB edge detection. Implementation of edge detection in MATLAB is simpler, faster, and more sensitive than FFT for fiber alignment quantification.
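
    A gradient-based fiber-angle histogram in the same spirit as the MATLAB edge detection implementation can be sketched with scipy; the Sobel filter, magnitude percentile, and bin count are illustrative assumptions rather than the paper's settings.

```python
# Gradient-based fiber-angle histogram; Sobel filtering, the percentile
# cut, and bin count are illustrative assumptions.
import numpy as np
from scipy.ndimage import sobel

def fiber_angle_histogram(image, bins=36):
    gx = sobel(image.astype(float), axis=1)
    gy = sobel(image.astype(float), axis=0)
    mag = np.hypot(gx, gy)
    # Fiber orientation is perpendicular to the intensity gradient.
    angles = (np.degrees(np.arctan2(gy, gx)) + 90.0) % 180.0
    strong = mag > np.percentile(mag, 90)   # keep only clear edges
    hist, edges = np.histogram(angles[strong], bins=bins, range=(0, 180))
    return hist / hist.sum(), edges
```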

  4. Development of image processing method to detect noise in geostationary imagery

    NASA Astrophysics Data System (ADS)

    Khlopenkov, Konstantin V.; Doelling, David R.

    2016-10-01

    The Clouds and the Earth's Radiant Energy System (CERES) has incorporated imagery from 16 individual geostationary (GEO) satellites across five contiguous domains since March 2000. In order to derive broadband fluxes that are uniform across satellite platforms, it is important to ensure good quality of the input raw count data. GEO data obtained by older geostationary imagers (such as MTSAT-1, Meteosat-5, Meteosat-7, GMS-5, and GOES-9) are known to frequently contain various types of noise caused by transmission errors, sync errors, stray light contamination, and others. This work presents an image processing methodology designed to detect most kinds of noise and corrupt data in all bands of raw imagery from modern and historic GEO satellites. The algorithm is based on a set of different approaches to detecting abnormal image patterns, including inter-line and inter-pixel differences within a scanline, correlation between scanlines, analysis of spatial variance, and a 2D Fourier analysis of the image spatial frequencies. Despite its computational complexity, the described method is highly optimized for performance to facilitate volume processing of multi-year data and runs in fully automated mode. The reliability of this noise detection technique has been assessed by human supervision for each GEO dataset obtained during selected time periods in 2005 and 2006. This assessment demonstrated an overall detection accuracy of over 99.5% and a false alarm rate of under 0.3%. The described noise detection routine is currently used in volume processing of historical GEO imagery for subsequent production of global gridded data products and for cross-platform calibration.
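
    One of the listed checks, correlation between scanlines, is easy to sketch: lines whose correlation with their neighbor collapses are flagged as candidate noise. The threshold below is an illustrative assumption, not the operational value.

```python
# Flag scanlines whose correlation with the previous line collapses; the
# threshold is an illustrative assumption.
import numpy as np

def noisy_scanlines(image, min_corr=0.5):
    """Return indices of lines poorly correlated with the preceding line."""
    flagged = []
    for i in range(1, image.shape[0]):
        a = image[i - 1].astype(float)
        b = image[i].astype(float)
        if a.std() == 0 or b.std() == 0:
            continue   # flat lines carry no correlation information
        if np.corrcoef(a, b)[0, 1] < min_corr:
            flagged.append(i)
    return flagged
```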

  5. Fundamental deficits of auditory perception in Wernicke's aphasia.

    PubMed

    Robson, Holly; Grube, Manon; Lambon Ralph, Matthew A; Griffiths, Timothy D; Sage, Karen

    2013-01-01

    This work investigates the nature of the comprehension impairment in Wernicke's aphasia (WA) by examining the relationship between deficits in auditory processing of fundamental, non-verbal acoustic stimuli and auditory comprehension. WA, a condition resulting in severely disrupted auditory comprehension, primarily occurs following a cerebrovascular accident (CVA) to the left temporo-parietal cortex. Whilst damage to posterior superior temporal areas is associated with auditory linguistic comprehension impairments, functional imaging indicates that these areas may not be specific to speech processing but part of a network for generic auditory analysis. We examined analysis of basic acoustic stimuli in WA participants (n = 10) using auditory stimuli reflective of theories of cortical auditory processing and of speech cues. Auditory spectral, temporal and spectro-temporal analysis was assessed using pure-tone frequency discrimination, frequency modulation (FM) detection and the detection of dynamic modulation (DM) in "moving ripple" stimuli. All tasks used criterion-free, adaptive measures of threshold to ensure reliable results at the individual level. Participants with WA showed normal frequency discrimination but significant impairments in FM and DM detection, relative to age- and hearing-matched controls at the group level (n = 10). At the individual level, there was considerable variation in performance, and thresholds for both FM and DM detection correlated significantly with auditory comprehension abilities in the WA participants. These results demonstrate the co-occurrence of a deficit in fundamental auditory processing of temporal and spectro-temporal non-verbal stimuli in WA, which may causally contribute to the auditory language comprehension impairment. Results are discussed in the context of traditional neuropsychology and current models of cortical auditory processing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Robust crop and weed segmentation under uncontrolled outdoor illumination

    USDA-ARS?s Scientific Manuscript database

    A new machine vision algorithm for weed detection was developed from RGB color model images. Processes included in the detection algorithm were excessive green conversion, threshold value computation by statistical analysis, adaptive image segmentation by adjusting the threshold value, median filtering, ...
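
    A minimal sketch of the listed pipeline steps (excess green index conversion, statistical thresholding, median filtering), with the rule mean plus one standard deviation standing in for the unspecified statistical analysis:

        import numpy as np
        from scipy import ndimage

        def weed_mask(rgb):
            rgb = rgb.astype(float)
            s = rgb.sum(axis=2) + 1e-9
            r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
            exg = 2 * g - r - b                 # excess green index
            thresh = exg.mean() + exg.std()     # assumed statistical rule
            mask = (exg > thresh).astype(np.uint8)
            return ndimage.median_filter(mask, size=3).astype(bool)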

  7. IsoMS: automated processing of LC-MS data generated by a chemical isotope labeling metabolomics platform.

    PubMed

    Zhou, Ruokun; Tseng, Chiao-Li; Huan, Tao; Li, Liang

    2014-05-20

    A chemical isotope labeling or isotope coded derivatization (ICD) metabolomics platform uses a chemical derivatization method to introduce a mass tag to all of the metabolites having a common functional group (e.g., amine), followed by LC-MS analysis of the labeled metabolites. To apply this platform to metabolomics studies involving quantitative analysis of different groups of samples, automated data processing is required. Herein, we report a data processing method based on the use of a mass spectral feature unique to the chemical labeling approach, i.e., any differential-isotope-labeled metabolites are detected as peak pairs with a fixed mass difference in a mass spectrum. A software tool, IsoMS, has been developed to process the raw data generated from one or multiple LC-MS runs by peak picking, peak pairing, peak-pair filtering, and peak-pair intensity ratio calculation. The same peak pairs detected from multiple samples are then aligned to produce a CSV file that contains the metabolite information and peak ratios relative to a control (e.g., a pooled sample). This file can be readily exported for further data and statistical analysis, which is illustrated in an example of comparing the metabolomes of human urine samples collected before and after drinking coffee. To demonstrate that this method is reliable for data processing, five (13)C2-/(12)C2-dansyl labeled metabolite standards were analyzed by LC-MS. IsoMS was able to detect these metabolites correctly. In addition, in the analysis of a (13)C2-/(12)C2-dansyl labeled human urine, IsoMS detected 2044 peak pairs, and manual inspection of these peak pairs found 90 false peak pairs, representing a false positive rate of 4.4%. IsoMS for Windows running R is freely available for noncommercial use from www.mycompoundid.org/IsoMS.
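
    The peak-pairing feature is easy to sketch: dansyl labeling with 13C2 versus 12C2 shifts the label mass by two 13C-12C differences, so singly charged peak pairs are separated by a fixed delta m/z. The implementation below is a toy version of the idea, not IsoMS itself, and assumes charge 1 and centroided peaks.

        import numpy as np

        DELTA_M = 2 * 1.00336  # Da; two 12C -> 13C substitutions

        def find_peak_pairs(mz, intensity, tol=0.005):
            order = np.argsort(mz)
            mz, intensity = mz[order], intensity[order]
            pairs = []
            for i, m in enumerate(mz):
                # search for a heavy partner at m + DELTA_M within tolerance
                j = np.searchsorted(mz, m + DELTA_M - tol)
                while j < len(mz) and mz[j] <= m + DELTA_M + tol:
                    pairs.append((m, mz[j], intensity[i] / intensity[j]))
                    j += 1
            return pairs  # (light m/z, heavy m/z, light/heavy intensity ratio)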

  8. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.

  9. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  10. A detailed comparison of analysis processes for MCC-IMS data in disease classification—Automated methods can replace manual peak annotations

    PubMed Central

    Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven

    2017-01-01

    Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process.

    Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios.

    Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology. PMID:28910313
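
    Two of the winning analysis steps map directly onto standard library routines, so a skeletal version of the later pipeline stages can be sketched with scikit-learn; the parameters (eps, number of trees, CV folds) are placeholders, not the values tuned in the paper.

        import numpy as np
        from sklearn.cluster import DBSCAN
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def cluster_peaks(peak_coords, eps=0.05, min_samples=2):
            # Group peak positions (retention time, inverse mobility) across
            # measurements so each compound becomes one feature column.
            return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(peak_coords)

        def classification_auc(features, labels):
            # Final step: Random Forest, scored by cross-validated AUC.
            clf = RandomForestClassifier(n_estimators=500, random_state=0)
            return cross_val_score(clf, features, labels,
                                   scoring="roc_auc", cv=5).mean()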

  11. An FPGA-Based Rapid Wheezing Detection System

    PubMed Central

    Lin, Bor-Shing; Yen, Tian-Shiue

    2014-01-01

    Wheezing is often treated as a crucial indicator in the diagnosis of obstructive pulmonary diseases. A rapid wheezing detection system may help physicians to monitor patients over the long term. In this study, a portable wheezing detection system based on a field-programmable gate array (FPGA) is proposed. This system accelerates wheezing detection and can be used either as a stand-alone system or as an integrated part of another biomedical signal detection system. The system segments sound signals into 2-second units. A short-time Fourier transform was used to determine the relationship between the time and frequency components of the wheezing sound data. The spectrogram was processed using 2D bilateral filtering, edge detection, multithreshold image segmentation, morphological image processing, and image labeling, to extract wheezing features according to computerized respiratory sound analysis (CORSA) standards. These features were then used to train a support vector machine (SVM) and build the classification models. The trained model was used to analyze sound data to detect wheezing. The system runs on a Xilinx Virtex-6 FPGA ML605 platform. The experimental results revealed that the system offered excellent wheezing recognition performance (0.912). The detection process runs at a clock frequency of 51.97 MHz and is able to perform rapid wheezing classification. PMID:24481034
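
    A software analogue of the front end (2-second segmentation, spectrogram, crude tonal-ridge features, SVM) might look like the sketch below; the feature set is a simplification of the CORSA-based image-processing chain, not the paper's implementation.

        import numpy as np
        from scipy.signal import spectrogram
        from sklearn.svm import SVC

        def wheeze_features(x, fs, win_s=2.0):
            feats, seg = [], int(win_s * fs)
            for s in range(0, len(x) - seg + 1, seg):
                f, t, S = spectrogram(x[s:s + seg], fs=fs, nperseg=256)
                S = np.log(S + 1e-12)
                # wheezes appear as sustained narrow-band horizontal tracks
                ridge = S.max(axis=1) - np.median(S, axis=1)
                feats.append([ridge.max(), ridge.mean(), f[np.argmax(ridge)]])
            return np.asarray(feats)

        # Training pairs such features with expert labels:
        # clf = SVC(kernel="rbf").fit(train_feats, train_labels)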

  12. Improving Earth/Prediction Models to Improve Network Processing

    NASA Astrophysics Data System (ADS)

    Wagner, G. S.

    2017-12-01

    The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operating Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false alarm) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum likelihood probability of detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth and slowness corrections, and their associated uncertainties, are computed using a ground-truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.
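
    The ROC-based tuning mentioned above can be illustrated with a small sketch: given detector scores on labeled noise and signal windows, pick the threshold that maximizes detection probability under a false-alarm budget. The scores and the 1% budget are invented for the example.

        import numpy as np
        from sklearn.metrics import roc_curve

        def pick_threshold(scores, truth, max_false_alarm=0.01):
            fpr, tpr, thr = roc_curve(truth, scores)
            ok = fpr <= max_false_alarm           # stay inside the Type 1 budget
            best = np.argmax(tpr[ok])             # maximize detection probability
            return thr[ok][best], tpr[ok][best], fpr[ok][best]

        rng = np.random.default_rng(0)
        scores = np.r_[rng.normal(0, 1, 5000), rng.normal(2.5, 1, 500)]
        truth = np.r_[np.zeros(5000), np.ones(500)]
        print(pick_threshold(scores, truth))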

  13. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
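
    A compact sketch of the two stages (recurrence plot, then PCA on its flattened coefficients), under the simplifying assumption that each trace is a one-dimensional sequence of opcode codes:

        import numpy as np
        from sklearn.decomposition import PCA

        def recurrence_plot(trace, eps):
            # R[i, j] = 1 iff states i and j are closer than eps
            d = np.abs(trace[:, None] - trace[None, :])
            return (d < eps).astype(float)

        def rp_features(traces, eps=2.0, n_components=3):
            R = np.array([recurrence_plot(t, eps).ravel() for t in traces])
            return PCA(n_components=n_components).fit_transform(R)

        rng = np.random.default_rng(1)
        traces = [rng.integers(0, 40, 200).astype(float) for _ in range(10)]
        print(rp_features(traces).shape)  # (10, 3) coefficients per trace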

  14. Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.

    PubMed

    Mammone, Nadia; Morabito, Francesco Carlo

    2008-09-01

    Artifacts are disturbances that may occur during signal acquisition and may affect subsequent processing. The aim of this paper is to propose a technique for automatically detecting artifacts in electroencephalographic (EEG) recordings. In particular, a technique based on Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals, against the average 68.7% of the previous technique, on the studied available database. Moreover, Renyi's entropy is shown to be able to detect muscle and very-low-frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve efficient rejection of the artifacts while minimizing information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure very efficient isolation of the artifactual activity from signals deriving from other brain tasks.
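
    A minimal sketch of the two ingredients, with FastICA standing in for the ICA stage and a Renyi entropy (order 2) computed on each component's amplitude histogram; the outlier rule (two standard deviations from the ensemble mean) is an assumption, not the paper's exact criterion.

        import numpy as np
        from sklearn.decomposition import FastICA

        def renyi_entropy(x, alpha=2.0, bins=64):
            # H_a = log(sum(p^a)) / (1 - a) over the amplitude histogram
            p, _ = np.histogram(x, bins=bins)
            p = p[p > 0] / p.sum()
            return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

        def flag_artifact_components(eeg, n_components=16, z=2.0):
            # eeg: (n_samples, n_channels); returns a boolean mask over components
            S = FastICA(n_components=n_components, random_state=0).fit_transform(eeg)
            H = np.array([renyi_entropy(S[:, k]) for k in range(S.shape[1])])
            return np.abs(H - H.mean()) > z * H.std()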

  15. Covariance of lucky images: performance analysis

    NASA Astrophysics Data System (ADS)

    Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.

    2017-01-01

    The covariance of ground-based lucky images is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper, we analyse the relevance of the number of processed frames, the frames' quality, the atmospheric conditions and the detection noise to the companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than that of the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.

  16. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    NASA Astrophysics Data System (ADS)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth's surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTMs) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the process of generating landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free R development environment was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding (sketched in the code below), (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good success rates in objectively detecting landslides in high-resolution topography data by GEOBIA.
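
    Step (4) has a particularly small core. Under the assumption that steps (1)-(3) have produced a per-segment feature table, a k-means classification of segments could look like this (scikit-learn here stands in for the R/SAGA/GRASS tooling actually used):

        import numpy as np
        from sklearn.cluster import KMeans

        def classify_terrain_objects(seg_features, k=4):
            # seg_features: (n_segments, n_parameters), e.g. mean slope,
            # roughness and curvature per segment from steps (1)-(3)
            km = KMeans(n_clusters=k, n_init=10, random_state=0)
            return km.fit_predict(seg_features)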

  17. Vehicle tracking using fuzzy-based vehicle detection window with adaptive parameters

    NASA Astrophysics Data System (ADS)

    Chitsobhuk, Orachat; Kasemsiri, Watjanapong; Glomglome, Sorayut; Lapamonpinyo, Pipatphon

    2018-04-01

    In this paper, a fuzzy-based vehicle tracking system is proposed. The proposed system consists of two main processes: vehicle detection and vehicle tracking. In the first process, the Gradient-based Adaptive Threshold Estimation (GATE) algorithm is adopted to provide a suitable threshold value for Sobel edge detection. The estimated threshold adapts to changes in illumination conditions throughout the day, which leads to greater vehicle detection performance compared to a fixed, user-defined threshold. In the second process, this paper proposes a novel vehicle tracking algorithm, Fuzzy-based Vehicle Analysis (FBA), to reduce false estimates in vehicle tracking caused by the uneven edges of large vehicles and by vehicles changing lanes. The proposed FBA algorithm employs the average edge density and the Horizontal Moving Edge Detection (HMED) algorithm, adopting fuzzy rule-based algorithms to rectify the vehicle tracking. The experimental results demonstrate that the proposed system achieves a high vehicle detection accuracy of about 98.22% with a low false detection rate of about 3.92%.
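
    The GATE idea, as described, amounts to deriving the edge threshold from each frame's own gradient statistics. A plausible minimal reading (mean plus a tunable multiple of the standard deviation; the multiplier is our assumption) is:

        import numpy as np
        from scipy import ndimage

        def adaptive_sobel_edges(frame, k=1.5):
            g = frame.astype(float)
            mag = np.hypot(ndimage.sobel(g, axis=0), ndimage.sobel(g, axis=1))
            thresh = mag.mean() + k * mag.std()   # tracks illumination changes
            return mag > thresh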

  18. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, Carlos A.

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small-scale detection, but further work is necessary.

  19. Spectrometer gun

    DOEpatents

    Waechter, David A.; Wolf, Michael A.; Umbarger, C. John

    1985-01-01

    A hand-holdable, battery-operated, microprocessor-based spectrometer gun includes a low-power matrix display and sufficient memory to permit both real-time observation and extended analysis of detected radiation pulses. Universality of the incorporated signal processing circuitry permits operation with various detectors having differing pulse detection and sensitivity parameters.

  20. Algorithms used in the Airborne Lidar Processing System (ALPS)

    USGS Publications Warehouse

    Nagle, David B.; Wright, C. Wayne

    2016-05-23

    The Airborne Lidar Processing System (ALPS) analyzes Experimental Advanced Airborne Research Lidar (EAARL) data—digitized laser-return waveforms, position, and attitude data—to derive point clouds of target surfaces. A full-waveform airborne lidar system, the EAARL seamlessly and simultaneously collects mixed environment data, including submerged, sub-aerial bare earth, and vegetation-covered topographies. ALPS uses three waveform target-detection algorithms to determine target positions within a given waveform: centroid analysis, leading edge detection, and bottom detection using water-column backscatter modeling. The centroid analysis algorithm detects opaque hard surfaces. The leading edge algorithm detects topography beneath vegetation and shallow, submerged topography. The bottom detection algorithm uses water-column backscatter modeling for deeper submerged topography in turbid water. The report describes slant range calculations and explains how ALPS uses laser range and orientation measurements to project measurement points into the Universal Transverse Mercator coordinate system. Parameters used for coordinate transformations in ALPS are described, as are Interactive Data Language-based methods for gridding EAARL point cloud data to derive digital elevation models. Noise reduction in point clouds through use of a random consensus filter is explained, and detailed pseudocode, mathematical equations, and Yorick source code accompany the report.
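
    Of the three target-detection algorithms, centroid analysis is the simplest to sketch: the target position within the waveform is the intensity-weighted mean sample time. The baseline handling below is a generic placeholder, not ALPS's exact preprocessing.

        import numpy as np

        def waveform_centroid(w, dt_ns=1.0):
            w = w.astype(float) - np.median(w)   # crude baseline removal
            w = np.clip(w, 0.0, None)
            i = np.arange(len(w))
            return (i * w).sum() / (w.sum() + 1e-12) * dt_ns  # ns after trigger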

  1. A retrospective detection algorithm for extraction of weak targets in clutter and interference environments

    NASA Astrophysics Data System (ADS)

    Prengaman, R. J.; Thurber, R. E.; Bath, W. G.

    The usefulness of radar systems depends on the ability to distinguish between signals returned from desired targets and noise. A retrospective processor uses all contacts (or 'plots') from several past radar scans, taking into account all possible target trajectories formed from stored contacts for each input detection. The processor eliminates many false alarms while retaining those contacts describing reasonable trajectories. Employing a retrospective processor therefore makes it possible to obtain large improvements in detection sensitivity in certain important clutter environments. Attention is given to the retrospective processing concept, a theoretical analysis of the multiscan detection process, the experimental evaluation of the retrospective data filter, and aspects of retrospective data filter hardware implementation.
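
    A toy retrospective processor along the lines described: a contact in the newest scan is retained only if contacts in past scans line up with a near-constant-velocity track through it. The speed bound, residual tolerance and minimum hit count are illustrative.

        import numpy as np

        def retrospective_filter(scans, v_max=0.3, resid_tol=0.05, min_hits=4):
            # scans: list of (N_i, 2) contact position arrays, oldest first;
            # v_max is the largest plausible per-scan displacement.
            kept = []
            for p in scans[-1]:
                times, points = [len(scans) - 1], [p]
                for k in range(len(scans) - 2, -1, -1):
                    dt = (len(scans) - 1) - k
                    d = np.linalg.norm(scans[k] - p, axis=1)
                    if (d <= v_max * dt).any():          # a reachable contact exists
                        times.append(k)
                        points.append(scans[k][np.argmin(d)])
                if len(points) >= min_hits:
                    t = np.asarray(times, float)
                    P = np.asarray(points)
                    A = np.c_[t, np.ones_like(t)]        # straight-line fit vs scan index
                    resid = 0.0
                    for dim in range(2):
                        coef, *_ = np.linalg.lstsq(A, P[:, dim], rcond=None)
                        resid += ((A @ coef - P[:, dim]) ** 2).sum()
                    if np.sqrt(resid / len(points)) < resid_tol:
                        kept.append(p)                   # reasonable trajectory found
            return np.asarray(kept)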

  2. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other implementing a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  3. An Analysis of the Sensitivity of Proteogenomic Mapping of Somatic Mutations and Novel Splicing Events in Cancer.

    PubMed

    Ruggles, Kelly V; Tang, Zuojian; Wang, Xuya; Grover, Himanshu; Askenazi, Manor; Teubl, Jennifer; Cao, Song; McLellan, Michael D; Clauser, Karl R; Tabb, David L; Mertins, Philipp; Slebos, Robbert; Erdmann-Gilmore, Petra; Li, Shunqiang; Gunawardena, Harsha P; Xie, Ling; Liu, Tao; Zhou, Jian-Ying; Sun, Shisheng; Hoadley, Katherine A; Perou, Charles M; Chen, Xian; Davies, Sherri R; Maher, Christopher A; Kinsinger, Christopher R; Rodland, Karen D; Zhang, Hui; Zhang, Zhen; Ding, Li; Townsend, R Reid; Rodriguez, Henry; Chan, Daniel; Smith, Richard D; Liebler, Daniel C; Carr, Steven A; Payne, Samuel; Ellis, Matthew J; Fenyő, David

    2016-03-01

    Improvements in mass spectrometry (MS)-based peptide sequencing provide a new opportunity to determine whether polymorphisms, mutations, and splice variants identified in cancer cells are translated. Herein, we apply a proteogenomic data integration tool (QUILTS) to illustrate protein variant discovery using whole genome, whole transcriptome, and global proteome datasets generated from a pair of luminal and basal-like breast-cancer-patient-derived xenografts (PDX). The sensitivity of proteogenomic analysis for single nucleotide variant (SNV) expression and novel splice junction (NSJ) detection was probed using multiple MS/MS sample process replicates, defined here as independent tandem MS experiments using identical sample material. Despite analysis of over 30 sample process replicates, only about 10% of SNVs (somatic and germline) detected by both DNA and RNA sequencing were observed as peptides. An even smaller proportion of peptides corresponding to NSJ observed by RNA sequencing were detected (<0.1%). Peptides mapping to DNA-detected SNVs without a detectable mRNA transcript were also observed, suggesting that transcriptome coverage was incomplete (∼80%). In contrast to germline variants, somatic variants were less likely to be detected at the peptide level in the basal-like tumor than in the luminal tumor, raising the possibility of differential translation or protein degradation effects. In conclusion, this large-scale proteogenomic integration allowed us to determine the degree to which mutations are translated and identify gaps in sequence coverage, thereby benchmarking current technology and progress toward whole cancer proteome and transcriptome analysis. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  4. A robust human face detection algorithm

    NASA Astrophysics Data System (ADS)

    Raviteja, Thaluru; Karanam, Srikrishna; Yeduguru, Dinesh Reddy V.

    2012-01-01

    Human face detection plays a vital role in many applications, such as video surveillance, managing face image databases, and human-computer interfaces, among others. This paper proposes a robust algorithm for face detection in still color images that works well even in a crowded environment. The algorithm uses a conjunction of skin color histogram, morphological processing and geometrical analysis for detecting human faces. To reinforce the accuracy of face detection, we further identify mouth and eye regions to establish the presence or absence of a face in a particular region of interest.
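
    A sketch of the skin-color-plus-morphology front end, assuming OpenCV is available; the YCrCb bounds are common literature values rather than the paper's histogram model, and the eye/mouth verification stage is only indicated.

        import numpy as np
        import cv2

        def face_candidates(bgr):
            ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
            skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
            skin = cv2.morphologyEx(skin, cv2.MORPH_OPEN, kernel)
            skin = cv2.morphologyEx(skin, cv2.MORPH_CLOSE, kernel)
            contours, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            boxes = []
            for c in contours:
                x, y, w, h = cv2.boundingRect(c)
                if w * h > 400 and 0.6 < h / float(w) < 2.0:  # face-like geometry
                    boxes.append((x, y, w, h))  # eye/mouth checks would follow
            return boxes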

  5. Vision-based in-line fabric defect detection using yarn-specific shape features

    NASA Astrophysics Data System (ADS)

    Schneider, Dorian; Aach, Til

    2012-01-01

    We develop a methodology for automatic in-line flaw detection in industrial woven fabrics. Whereas state-of-the-art detection algorithms apply texture analysis methods to low-resolution (~200 ppi) image data, we describe here a process flow to segment single yarns in high-resolution (~1000 ppi) textile images. Four yarn shape features are extracted, allowing precise detection and measurement of defects. The degree of precision reached allows a classification of detected defects according to their nature, providing an innovation in the field of automatic fabric flaw detection. The design has been carried out to meet real-time requirements and to face adverse conditions caused by loom vibrations and dirt. The entire process flow is discussed, followed by an evaluation using a database of real-life industrial fabric images. This work pertains to the construction of an on-loom defect detection system to be used in manufacturing practice.

  6. Intelligent approach for analysis of respiratory signals and oxygen saturation in the sleep apnea/hypopnea syndrome.

    PubMed

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome (SAHS) from the analysis of respiratory signals and oxygen saturation in arterial blood (SaO2). To accomplish this task, the proposed approach makes use of different artificial intelligence techniques and reasoning processes able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows false positives to be identified. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints.

  7. ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.

    PubMed

    Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus

    2011-12-01

    The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
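
    The general shape of automated IBI artifact detection (not ARTiiFACT's exact algorithm) can be sketched as a local-median filter with a robust deviation test; the window length and cutoff are assumptions.

        import numpy as np

        def detect_ibi_artifacts(ibi_ms, win=11, z=4.0):
            ibi = np.asarray(ibi_ms, float)
            pad = win // 2
            padded = np.pad(ibi, pad, mode="edge")
            local_med = np.array([np.median(padded[i:i + win])
                                  for i in range(len(ibi))])
            mad = np.median(np.abs(ibi - np.median(ibi))) + 1e-9
            return np.abs(ibi - local_med) > z * 1.4826 * mad  # artifact mask

    Flagged beats are then corrected, for example by interpolation, before HRV measures are computed.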

  8. Intelligent Approach for Analysis of Respiratory Signals and Oxygen Saturation in the Sleep Apnea/Hypopnea Syndrome

    PubMed Central

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome (SAHS) from the analysis of respiratory signals and oxygen saturation in arterial blood (SaO2). To accomplish this task, the proposed approach makes use of different artificial intelligence techniques and reasoning processes able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows false positives to be identified. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712

  9. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.

  10. Automated face detection for occurrence and occupancy estimation in chimpanzees.

    PubMed

    Crunchant, Anne-Sophie; Egerer, Monika; Loos, Alexander; Burghardt, Tilo; Zuberbühler, Klaus; Corogenes, Katherine; Leinert, Vera; Kulik, Lars; Kühl, Hjalmar S

    2017-03-01

    Surveying endangered species is necessary to evaluate conservation effectiveness. Camera trapping and biometric computer vision are recent technological advances that have impacted the methods applicable to field surveys, and these methods have gained significant momentum over the last decade. Yet most researchers inspect footage manually, and few studies have used automated semantic processing of video trap data from the field. The particular aim of this study is to evaluate methods that incorporate automated face detection technology as an aid to estimate site use of two chimpanzee communities based on camera trapping. As a comparative baseline we employ traditional manual inspection of footage. Our analysis focuses specifically on the basic parameter of occurrence, where we assess the performance and practical value of chimpanzee face detection software. We found that semi-automated data processing required only 2-4% of the time needed for purely manual analysis. This is a non-negligible increase in efficiency that is critical when assessing the feasibility of camera trap occupancy surveys. Our evaluations suggest that our methodology estimates the proportion of sites used relatively reliably. Chimpanzees are mostly detected when they are present and when videos are filmed in high resolution: the highest recall rate was 77%, for a false alarm rate of 2.8%, for videos containing only chimpanzee frontal face views. Certainly, our study is only a first step towards transferring face detection software from the lab into field application. Our results are promising and indicate that the current limitation of detecting chimpanzees in camera trap footage due to lack of suitable face views can be easily overcome at the level of field data collection, that is, by the combined placement of multiple high-resolution cameras facing opposite directions. This will enable researchers to routinely conduct chimpanzee occupancy surveys based on camera trapping and semi-automated processing of footage. Using semi-automated ape face detection technology for processing camera trap footage requires only 2-4% of the time needed for manual analysis and allows site use by chimpanzees to be estimated relatively reliably. © 2017 Wiley Periodicals, Inc.

  11. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen as a sample target image application to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters optimized for minimum false positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as the shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
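
    Pixel duplication itself is a one-liner, which is precisely its appeal over interpolation; the comparison setup below (the same smoothing filter applied to both versions) mirrors the paper's experiment in spirit, with random data standing in for retinal images.

        import numpy as np
        from scipy import ndimage

        def pixel_duplicate(img, factor=2):
            # each pixel becomes a factor x factor block
            return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

        img = np.random.default_rng(2).random((64, 64))
        big = pixel_duplicate(img)
        smooth_small = ndimage.uniform_filter(img, 3)  # 3x3 averaging filter
        smooth_big = ndimage.uniform_filter(big, 3)    # same kernel on duplicated pixels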

  12. Identification and assessment of common errors in the admission process of patients in Isfahan Fertility and Infertility Center based on "failure modes and effects analysis".

    PubMed

    Dehghan, Ashraf; Abumasoudi, Rouhollah Sheikh; Ehsanpour, Soheila

    2016-01-01

    Infertility and errors in the process of its treatment have a negative impact on infertile couples. The present study aimed to identify and assess the common errors in the admission process by applying the "failure modes and effects analysis" (FMEA) approach. In this descriptive cross-sectional study, the admission process of the fertility and infertility center of Isfahan was selected for evaluation of its errors based on the team members' decision. First, the admission process was charted through observations and interviews with employees, holding multiple panels, and using the FMEA worksheet, which has been used in many studies all over the world, including in Iran. Its validity was evaluated through content and face validity, and its reliability was evaluated through review and confirmation of the obtained information by the FMEA team. Eventually, possible errors and their causes were determined along with three indicators (severity of effect, probability of occurrence, and probability of detection), and corrective actions were proposed. Data analysis was based on the risk priority number (RPN), which is calculated by multiplying the severity of effect, probability of occurrence, and probability of detection. Twenty-five errors with RPN ≥ 125 were detected in the admission process, of which six had high priority in terms of severity and occurrence probability and were identified as high-risk errors. The team-oriented FMEA method could be useful for the assessment of errors and for reducing their probability of occurrence.
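
    The RPN bookkeeping is simple enough to show directly; the failure modes and scores below are invented placeholders on the usual 1-10 scales, with the study's action threshold of 125.

        failure_modes = [
            # (description, severity, occurrence, detection) - hypothetical
            ("wrong patient file retrieved", 8, 4, 5),
            ("insurance data entered incorrectly", 5, 6, 4),
            ("appointment not registered", 7, 3, 6),
        ]

        for name, sev, occ, det in failure_modes:
            rpn = sev * occ * det          # risk priority number
            if rpn >= 125:                 # the study's action threshold
                print(f"high priority: {name} (RPN={rpn})")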

  13. Threshold detection in an on-off binary communications channel with atmospheric scintillation

    NASA Technical Reports Server (NTRS)

    Webb, W. E.; Marino, J. T., Jr.

    1974-01-01

    The optimum detection threshold in an on-off binary optical communications system operating in the presence of atmospheric turbulence was investigated assuming a Poisson detection process and log-normal scintillation. The dependence of the probability of bit error on log-amplitude variance and received signal strength was analyzed, and semi-empirical relationships to predict the optimum detection threshold were derived. On the basis of this analysis a piecewise-linear model for an adaptive threshold detection system is presented. Bit error probabilities for non-optimum threshold detection systems were also investigated.

  14. Threshold detection in an on-off binary communications channel with atmospheric scintillation

    NASA Technical Reports Server (NTRS)

    Webb, W. E.

    1975-01-01

    The optimum detection threshold in an on-off binary optical communications system operating in the presence of atmospheric turbulence was investigated assuming a Poisson detection process and log-normal scintillation. The dependence of the probability of bit error on log-amplitude variance and received signal strength was analyzed, and semi-empirical relationships to predict the optimum detection threshold were derived. On the basis of this analysis a piecewise-linear model for an adaptive threshold detection system is presented. The bit error probabilities for non-optimum threshold detection systems were also investigated.
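
    The optimization both reports describe can be reproduced numerically: for each integer count threshold, the bit error probability is half the false-alarm probability of the "off" slot plus half the miss probability of the "on" slot, with the "on" mean count faded by unit-mean log-normal scintillation. All rates below are illustrative, not values from the reports.

        import numpy as np
        from scipy import stats

        def bit_error_probability(k, n_bg, n_sig, sigma_chi, n_mc=20000, seed=0):
            rng = np.random.default_rng(seed)
            chi = rng.normal(0.0, sigma_chi, n_mc)          # log-amplitude samples
            fade = np.exp(2.0 * chi - 2.0 * sigma_chi**2)   # unit-mean intensity fade
            p_false = stats.poisson.sf(k - 1, n_bg)                        # P(N >= k | off)
            p_miss = stats.poisson.cdf(k - 1, n_bg + n_sig * fade).mean()  # P(N < k | on)
            return 0.5 * (p_false + p_miss)

        n_bg, n_sig, sigma_chi = 2.0, 20.0, 0.35
        ks = np.arange(1, 25)
        bers = [bit_error_probability(k, n_bg, n_sig, sigma_chi) for k in ks]
        print("optimum threshold:", ks[np.argmin(bers)], "min BER:", min(bers))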

  15. Statistical process control and verifying positional accuracy of a cobra motion couch using step-wedge quality assurance tool.

    PubMed

    Binny, Diana; Lancaster, Craig M; Trapp, Jamie V; Crowe, Scott B

    2017-09-01

    This study utilizes process control techniques to identify action limits for TomoTherapy couch positioning quality assurance tests. A test was introduced to monitor the accuracy of the applied couch offset detection in the TomoTherapy Hi-Art treatment system using the TQA "Step-Wedge Helical" module and the MVCT detector. Individual X-charts, process capability (cp), probability (P), and acceptability (cpk) indices were used to monitor 4 years of couch IEC offset data to detect systematic and random errors in couch positional accuracy for different action levels. Process capability tests were also performed on the retrospective data to define tolerances based on user-specified levels. A second study was carried out whereby physical couch offsets were applied using the TQA module and the MVCT detector was used to detect the observed variations. Random and systematic variations were observed for the SPC-based upper and lower control limits, and investigations were carried out to maintain the ongoing stability of the process over a 4-year and a three-monthly period. Local trend analysis showed mean variations of up to ±0.5 mm over the three-monthly analysis period for all IEC offset measurements. Variations were also observed between detected and applied offsets using the MVCT detector in the second study, largely in the vertical direction, and actions were taken to remediate this error. Based on the results, it was recommended that imaging shifts in each coordinate direction be applied only after assessing the machine for applied versus detected test results using the step-wedge helical module. User-specified tolerance levels of at least ±2 mm were recommended for a test frequency of once every 3 months to improve couch positional accuracy. SPC enables detection of systematic variations prior to reaching machine tolerance levels. Couch encoding system recalibrations reduced variations to user-specified levels, and a monitoring period of 3 months using SPC facilitated detection of systematic and random variations. SPC analysis of couch positional accuracy enabled greater control in the identification of errors, thereby increasing confidence levels in daily treatment setups. © 2017 Royal Brisbane and Women's Hospital, Metro North Hospital and Health Service. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
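
    The control-limit and capability formulas behind these tests are standard and easy to state in code; the offsets below are simulated, and the ±2 mm specification limits follow the recommendation above.

        import numpy as np

        def individuals_chart_limits(x):
            # X-chart: centre +/- 3 sigma, with sigma estimated from the
            # average moving range (d2 = 1.128 for subgroups of size 2)
            x = np.asarray(x, float)
            sigma = np.abs(np.diff(x)).mean() / 1.128
            return x.mean() - 3 * sigma, x.mean() + 3 * sigma

        def capability(x, lsl, usl):
            mu, s = np.mean(x), np.std(x, ddof=1)
            cp = (usl - lsl) / (6 * s)
            cpk = min(usl - mu, mu - lsl) / (3 * s)
            return cp, cpk

        offsets = np.random.default_rng(3).normal(0.1, 0.4, 48)  # mm, simulated
        print(individuals_chart_limits(offsets), capability(offsets, -2.0, 2.0))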

  16. What drives high flow events in the Swiss Alps? Recent developments in wavelet spectral analysis and their application to hydrology

    NASA Astrophysics Data System (ADS)

    Schaefli, B.; Maraun, D.; Holschneider, M.

    2007-12-01

    Extreme hydrological events are often triggered by exceptional co-variations of the relevant hydrometeorological processes and in particular by exceptional co-oscillations at various temporal scales. Wavelet and cross wavelet spectral analysis offers promising time-scale resolved analysis methods to detect and analyze such exceptional co-oscillations. This paper presents the state-of-the-art methods of wavelet spectral analysis, discusses related subtleties, potential pitfalls and recently developed solutions to overcome them and shows how wavelet spectral analysis, if combined to a rigorous significance test, can lead to reliable new insights into hydrometeorological processes for real-world applications. The presented methods are applied to detect potentially flood triggering situations in a high Alpine catchment for which a recent re-estimation of design floods encountered significant problems simulating the observed high flows. For this case study, wavelet spectral analysis of precipitation, temperature and discharge offers a powerful tool to help detecting potentially flood producing meteorological situations and to distinguish between different types of floods with respect to the prevailing critical hydrometeorological conditions. This opens very new perspectives for the analysis of model performances focusing on the occurrence and non-occurrence of different types of high flow events. Based on the obtained results, the paper summarizes important recommendations for future applications of wavelet spectral analysis in hydrology.

  17. [Research on a non-invasive pulse wave detection and analysis system].

    PubMed

    Li, Ting; Yu, Gang

    2008-10-01

    A novel non-invasive pulse wave detection and analysis system, including both software and hardware, has been developed. Two-channel signals can be acquired, stored and displayed on screen dynamically at the same time. Pulse waves can be redisplayed and printed after pulse wave analysis and pulse wave velocity analysis. The system comprises a computer, which handles fast data saving, analysis and processing, and a portable data sampling unit based on a single-chip microcontroller. Experimental results have shown that the system is stable and easy to use, and that the parameters are calculated accurately.
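
    Pulse wave velocity estimation reduces to a transit-time measurement between the two channels, which a cross-correlation sketch captures; the path length and sampling rate below are example values, not the system's specification.

        import numpy as np

        def pulse_transit_time(ch1, ch2, fs):
            a = (ch1 - ch1.mean()) / ch1.std()
            b = (ch2 - ch2.mean()) / ch2.std()
            xc = np.correlate(b, a, mode="full")
            lag = np.argmax(xc) - (len(a) - 1)   # samples ch2 lags behind ch1
            return lag / fs                      # seconds

        # e.g. PWV = 0.60 m / pulse_transit_time(proximal, distal, fs=1000)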

  18. Real-Time Plasma Process Condition Sensing and Abnormal Process Detection

    PubMed Central

    Yang, Ryan; Chen, Rongshun

    2010-01-01

    The plasma process is often used in the fabrication of semiconductor wafers. However, the lack of real-time etching control may result in unacceptable process performance, leading to significant waste and lower wafer yield. In order to maximize product wafer yield, timely and accurate detection of process faults or abnormalities in a plasma reactor is needed. Optical emission spectroscopy (OES) is one of the most frequently used metrologies for in-situ process monitoring. Even though OES has the advantage of non-invasiveness, it produces a huge amount of information. As a result, the data analysis of OES becomes a big challenge. To accomplish real-time detection, this work employed the sigma matching technique, based on the time series of the full-spectrum OES intensity. First, a response model of a healthy plasma spectrum was developed. Then, we defined a matching rate as an indicator for comparing the difference between the tested wafer's response and the healthy sigma model. The experimental results showed that the proposed method can detect process faults in real time, even in plasma etching tools. PMID:22219683
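
    Our reading of the sigma matching idea, sketched under the assumption that the healthy model is a per-time, per-wavelength mean-and-sigma band learned from good wafers, with the matching rate as the fraction of samples inside the band:

        import numpy as np

        def build_healthy_model(healthy_runs):
            # healthy_runs: (n_runs, n_times, n_wavelengths) OES intensities
            mu = healthy_runs.mean(axis=0)
            sd = healthy_runs.std(axis=0) + 1e-9
            return mu, sd

        def matching_rate(test_run, model, k=3.0):
            mu, sd = model
            return (np.abs(test_run - mu) <= k * sd).mean()  # low -> abnormal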

  19. Spectrometer gun

    DOEpatents

    Waechter, D.A.; Wolf, M.A.; Umbarger, C.J.

    1981-11-03

    A hand-holdable, battery-operated, microprocessor-based spectrometer gun is described that includes a low-power matrix display and sufficient memory to permit both real-time observation and extended analysis of detected radiation pulses. Universality of the incorporated signal processing circuitry permits operation with various detectors having differing pulse detection and sensitivity parameters.

  20. Methodological Variables in the Analysis of Cell-Free DNA.

    PubMed

    Bronkhorst, Abel Jacobus; Aucamp, Janine; Pretorius, Piet J

    2016-01-01

    In recent years, cell-free DNA (cfDNA) analysis has received increasing amounts of attention as a potential non-invasive screening tool for the early detection of genetic aberrations and a wide variety of diseases, especially cancer. However, except for some prenatal tests and BEAMing, a technique used to detect mutations in various genes of cancer patients, cfDNA analysis is not yet routinely applied in clinical practice. Although some confounding biological factors inherent to the in vivo setting play a key part, it is becoming increasingly clear that this struggle is mainly due to the lack of an analytical consensus, especially as regards quantitative analyses of cfDNA. In order to use quantitative analysis of cfDNA with confidence, process optimization and standardization are crucial. In this work we aim to elucidate the most confounding variables of each preanalytical step that must be considered for process optimization and equivalence of procedures.

  1. Effect of food processing on plant DNA degradation and PCR-based GMO analysis: a review.

    PubMed

    Gryson, Nicolas

    2010-03-01

    The applicability of a DNA-based method for GMO detection and quantification depends on the quality and quantity of the DNA. Important food-processing conditions, for example temperature and pH, may lead to degradation of the DNA, rendering PCR analysis impossible or GMO quantification unreliable. This review discusses the effect of several food processes on DNA degradation and subsequent GMO detection and quantification. The data show that, although many of these processes do indeed lead to the fragmentation of DNA, amplification of the DNA may still be possible. Length and composition of the amplicon may, however, affect the result, as may the method of extraction used. Also, many techniques are used to describe the behaviour of DNA in food processing, which occasionally makes it difficult to compare research results. Further research should be aimed at defining ingredients in terms of their DNA quality and PCR amplification ability, and at the elaboration of matrix-specific certified reference materials.

  2. Multi-fault clustering and diagnosis of gear system mined by spectrum entropy clustering based on higher order cumulants

    NASA Astrophysics Data System (ADS)

    Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei

    2013-02-01

    Higher order cumulant (HOC) analysis is a modern signal analysis theory and technique. Spectrum entropy clustering (SEC) is a statistical data mining method for extracting useful characteristics from a mass of nonlinear and non-stationary data. After discussing the characteristics of HOC theory and the SEC method, this paper introduces the associated signal processing techniques and the unique merits of nonlinear coupling characteristic analysis for processing random and non-stationary signals. A new clustering analysis and diagnosis method for detecting multiple damage types on gears is proposed by introducing the combination of HOC and SEC into the damage detection and diagnosis of the gear system. Noise is suppressed by HOC analysis, coupling features are extracted, and the characteristic signal is separated at different speeds and frequency bands. Under these circumstances, the weak signal characteristics in the system are emphasized and multi-fault characteristics are extracted. The SEC data mining method is then used to analyze and diagnose six signal classes at various running states (300 r/min, 900 r/min, 1200 r/min and 1500 r/min): no fault, short crack in the tooth root, long crack in the tooth root, short crack at the pitch circle, long crack at the pitch circle, and tooth wear. The research shows that this combined detection and diagnosis method can also identify the degree of damage of some faults. On this basis, a virtual instrument for gear system damage detection and fault diagnosis was developed by combining the advantages of MATLAB and VC++, employing Component Object Model (COM) technology, adopting mixed programming methods, and calling the program transformed from an *.m file under VC++. This software system provides functions for collecting and importing gear vibration signals, analyzing and processing signals, extracting features, visualizing graphics, detecting and diagnosing faults, and monitoring. Finally, testing and verification show that the developed system can effectively detect and diagnose faults in an actual operating gear transmission system.

  3. Multi-fault clustering and diagnosis of gear system mined by spectrum entropy clustering based on higher order cumulants.

    PubMed

    Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei

    2013-02-01

    Higher order cumulant (HOC) analysis is a modern signal analysis theory and technique. Spectrum entropy clustering (SEC) is a statistical data mining method for extracting useful characteristics from a mass of nonlinear and non-stationary data. After discussing the characteristics of HOC theory and the SEC method, this paper introduces the associated signal processing techniques and the unique merits of nonlinear coupling characteristic analysis for processing random and non-stationary signals. A new clustering analysis and diagnosis method for detecting multiple damage types on gears is proposed by introducing the combination of HOC and SEC into the damage detection and diagnosis of the gear system. Noise is suppressed by HOC analysis, coupling features are extracted, and the characteristic signal is separated at different speeds and frequency bands. Under these circumstances, the weak signal characteristics in the system are emphasized and multi-fault characteristics are extracted. The SEC data mining method is then used to analyze and diagnose six signal classes at various running states (300 r/min, 900 r/min, 1200 r/min and 1500 r/min): no fault, short crack in the tooth root, long crack in the tooth root, short crack at the pitch circle, long crack at the pitch circle, and tooth wear. The research shows that this combined detection and diagnosis method can also identify the degree of damage of some faults. On this basis, a virtual instrument for gear system damage detection and fault diagnosis was developed by combining the advantages of MATLAB and VC++, employing Component Object Model (COM) technology, adopting mixed programming methods, and calling the program transformed from an *.m file under VC++. This software system provides functions for collecting and importing gear vibration signals, analyzing and processing signals, extracting features, visualizing graphics, detecting and diagnosing faults, and monitoring. Finally, testing and verification show that the developed system can effectively detect and diagnose faults in an actual operating gear transmission system.
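
    The spectrum entropy at the heart of SEC can be written compactly: it is the Shannon entropy of the normalized power spectrum, which drops when a fault concentrates energy in a few bands. The windowing and FFT length here are generic choices, not those of the paper.

        import numpy as np

        def spectrum_entropy(x, nfft=1024):
            X = np.fft.rfft(x * np.hanning(len(x)), n=nfft)
            p = np.abs(X) ** 2
            p = p[p > 0] / p.sum()
            return -np.sum(p * np.log2(p))   # bits; lower = more tonal

        # Entropies computed per speed and frequency band form the feature
        # vectors that the clustering step then separates into fault classes.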

  4. Towards real-time medical diagnostics using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Bjorgan, Asgeir; Randeberg, Lise L.

    2015-07-01

    Hyperspectral imaging provides non-contact, high resolution spectral images which has a substantial diagnostic potential. This can be used for e.g. diagnosis and early detection of arthritis in finger joints. Processing speed is currently a limitation for clinical use of the technique. A real-time system for analysis and visualization using GPU processing and threaded CPU processing is presented. Images showing blood oxygenation, blood volume fraction and vessel enhanced images are among the data calculated in real-time. This study shows the potential of real-time processing in this context. A combination of the processing modules will be used in detection of arthritic finger joints from hyperspectral reflectance and transmittance data.

  5. EPIBLASTER-fast exhaustive two-locus epistasis detection strategy using graphical processing units

    PubMed Central

    Kam-Thong, Tony; Czamara, Darina; Tsuda, Koji; Borgwardt, Karsten; Lewis, Cathryn M; Erhardt-Lehmann, Angelika; Hemmer, Bernhard; Rieckmann, Peter; Daake, Markus; Weber, Frank; Wolf, Christiane; Ziegler, Andreas; Pütz, Benno; Holsboer, Florian; Schölkopf, Bernhard; Müller-Myhsok, Bertram

    2011-01-01

    Detection of epistatic interaction between loci has been postulated to provide a more in-depth understanding of the complex biological and biochemical pathways underlying human diseases. Studying the interaction between two loci is the natural progression following traditional and well-established single locus analysis. However, the added costs and time duration required for the computation involved have thus far deterred researchers from pursuing a genome-wide analysis of epistasis. In this paper, we propose a method allowing such analysis to be conducted very rapidly. The method, dubbed EPIBLASTER, is applicable to case–control studies and consists of a two-step process in which the difference in Pearson's correlation coefficients is computed between controls and cases across all possible SNP pairs as an indication of significant interaction warranting further analysis. For the subset of interactions deemed potentially significant, a second-stage analysis is performed using the likelihood ratio test from the logistic regression to obtain the P-value for the estimated coefficients of the individual effects and the interaction term. The algorithm is implemented using the parallel computational capability of commercially available graphical processing units to greatly reduce the computation time involved. In the current setup and example data sets (211 cases, 222 controls, 299468 SNPs; and 601 cases, 825 controls, 291095 SNPs), this coefficient evaluation stage can be completed in roughly 1 day. Our method allows for exhaustive and rapid detection of significant SNP pair interactions without imposing significant marginal effects of the single loci involved in the pair. PMID:21150885
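
    The first-stage screen described above reduces to a vectorized correlation-difference computation. A minimal sketch in Python/NumPy, assuming genotype matrices `cases` and `controls` of shape (individuals, SNPs) coded 0/1/2; the array names, sizes, and the "top ten pairs" cutoff are illustrative, not the paper's tuned thresholds (the paper runs this stage on GPUs; this CPU sketch only conveys the arithmetic):

        import numpy as np

        def delta_correlation(cases, controls):
            # Stage 1 of EPIBLASTER: all-pairs Pearson correlation computed
            # separately in cases and controls, then differenced.
            def corr(x):
                z = (x - x.mean(0)) / x.std(0)   # standardize each SNP column
                return (z.T @ z) / x.shape[0]    # SNP-by-SNP correlation matrix
            return corr(cases) - corr(controls)

        # Pairs with large |delta| would go to the second stage
        # (logistic-regression likelihood-ratio test on the interaction term).
        rng = np.random.default_rng(0)
        cases = rng.integers(0, 3, size=(211, 500)).astype(float)
        controls = rng.integers(0, 3, size=(222, 500)).astype(float)
        delta = delta_correlation(cases, controls)
        i, j = np.triu_indices_from(delta, k=1)
        top = np.argsort(-np.abs(delta[i, j]))[:10]
        print(list(zip(i[top], j[top])))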

  6. Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Darren

    2004-07-01

    MatSeis's infrasound analysis tool, Infra Tool, uses frequency slowness processing to deconstruct the array data into three outputs per processing step: correlation, azimuth, and slowness. Until now, infrasound signal detection was accomplished manually by an experienced analyst trained to recognize patterns in the signal processing outputs. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value. These two statements describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file lists the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg), and trace velocity (km/s). As an example, a ground truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array locations, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and the detection window closely correlated with the theoretical stratospheric arrival time. Further testing will be required for tuning of detection threshold parameters for different types of infrasound events.
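
    The core detection criterion, a flat azimuth trend coinciding with elevated correlation over consecutive processing steps, can be illustrated with a simple slope test. A minimal sketch assuming per-step `azimuth` (degrees) and `correlation` arrays from frequency slowness processing; the window length and thresholds are illustrative stand-ins for the tuned parameters discussed in the report:

        import numpy as np

        def detect_infrasound(azimuth, correlation, win=10,
                              max_slope=0.5, min_corr=0.6):
            # Flag window starts where the azimuth is nearly constant (small
            # representative slope) while mean correlation exceeds background.
            steps = np.arange(win)
            hits = []
            for k in range(len(azimuth) - win + 1):
                slope = np.polyfit(steps, azimuth[k:k + win], 1)[0]  # deg/step
                if abs(slope) < max_slope and correlation[k:k + win].mean() > min_corr:
                    hits.append(k)
            return hits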

  7. Detection of Subsurface Defects in Levees in Correlation to Weather Conditions Utilizing Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Martinez, I. A.; Eisenmann, D.

    2012-12-01

    Ground Penetrating Radar (GPR) has been used for many years for successful subsurface detection of conductive and non-conductive objects in all types of material, including different soils and concrete. Typical defect detection is based on subjective examination of processed scans, using data collection and analysis software to acquire and analyze the data, and often requires developed expertise or an awareness of how a GPR works while collecting data. Processing programs, such as GSSI's RADAN analysis software, are then used to validate the collected information. Iowa State University's Center for Nondestructive Evaluation (CNDE) has built a test site, resembling a typical levee used near rivers, which contains known sub-surface targets of varying size, depth, and conductivity. Scientists at CNDE have developed software with enhanced capabilities to decipher a hyperbola's magnitude and amplitude for GPR signal processing. With this enhanced capability, the signal processing and defect detection capabilities of GPR can be greatly improved. This study examines the effects of test parameters, antenna frequency (400 MHz), data manipulation methods (including data filters and restricting the depth range that the chosen antenna's signal can reach), and real-world conditions at this test site (such as varying weather conditions), with the goal of improving GPR test sensitivity for differing soil conditions.

  8. Fast assessment of planar chromatographic layers quality using pulse thermovision method.

    PubMed

    Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2014-12-19

    The main goal of this paper is to demonstrate the capability of pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary phase defects required signal processing protocols based on wavelet filtration, correlation analysis, and k-means 3D segmentation. This post-processing data handling approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases including silica, cellulose, aluminum oxide, polyamide, and octadecylsilane, coated with adsorbent layers ranging from 100 to 250 μm, were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied to online control of fabrication processes. Beyond planar chromatographic plates, this protocol can be used for the assessment of different planar separation tools, such as paper-based analytical devices or micro total analysis systems, consisting of organic and non-organic layers. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Applying shot boundary detection for automated crystal growth analysis during in situ transmission electron microscope experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moeglein, W. A.; Griswold, R.; Mehdi, B. L.

    In-situ (scanning) transmission electron microscopy (S/TEM) is being developed for numerous applications in the study of nucleation and growth under electrochemical driving forces. For this type of experiment, one of the key parameters is to identify when nucleation initiates. Typically, identifying the moment that crystals begin to form is a manual process requiring the user to perform an observation and respond accordingly (adjust focus, magnification, translate the stage, etc.). However, as the speed of the cameras used to perform these observations increases, the ability of a user to “catch” the important initial stage of nucleation decreases (there is more information available in the first few milliseconds of the process). Here we show that video shot boundary detection (SBD) can automatically detect frames where a change in the image occurs. We show that this method can be applied to quickly and accurately identify points of change during crystal growth. This technique allows for automated segmentation of a digital stream for further analysis and the assignment of arbitrary time stamps for the initiation of processes that are independent of the user's ability to observe and react.
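
    The SBD idea can be sketched with a frame-to-frame histogram distance; a jump in the distance marks a candidate change point such as the onset of nucleation. A minimal sketch assuming `frames` is an iterable of 8-bit grayscale NumPy arrays; the bin count and threshold are illustrative:

        import numpy as np

        def shot_boundaries(frames, bins=64, thresh=0.25):
            # Flag frame indices whose intensity histogram differs sharply
            # from the previous frame's (total-variation distance in [0, 1]).
            prev, changes = None, []
            for idx, frame in enumerate(frames):
                h, _ = np.histogram(frame, bins=bins, range=(0, 255))
                h = h / h.sum()
                if prev is not None and 0.5 * np.abs(h - prev).sum() > thresh:
                    changes.append(idx)
                prev = h
            return changes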

  10. Extraction and determination of biogenic amines in fermented sausages and other meat products using reversed-phase-HPLC.

    PubMed

    Straub, B; Schollenberger, M; Kicherer, M; Luckas, B; Hammes, W P

    1993-09-01

    A convenient method is described for the analysis of biogenic amines (BA) by means of reversed-phase HPLC. The method is characterized by multi-channel UV detection (diode array), subsequent post-column derivatization with o-phthaldialdehyde and 3-mercaptopropionic acid, and fluorescence detection. For the analysis of meat products, and especially fermented sausages, an optimized perchloric acid extraction process was introduced to determine putrescine, cadaverine, histamine, tyramine and 2-phenylethylamine. BA recoveries from meat ranged between 96 and 113% with a detection limit for amines of 0.5 mg/kg.

  11. The Chandra Source Catalog: Processing and Infrastructure

    NASA Astrophysics Data System (ADS)

    Evans, Janet; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Hall, Diane M.; Miller, Joseph B.; Plummer, David A.; Zografou, Panagoula; Primini, Francis A.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    Chandra Source Catalog processing recalibrates each observation using the latest available calibration data, and employs a wavelet-based source detection algorithm to identify all the X-ray sources in the field of view. Source properties are then extracted from each detected source that is a candidate for inclusion in the catalog. Catalog processing is completed by matching sources across multiple observations, merging common detections, and applying quality assurance checks. The Chandra Source Catalog processing system shares a common processing infrastructure and utilizes much of the functionality that is built into the Standard Data Processing (SDP) pipeline system that provides calibrated Chandra data to end-users. Other key components of the catalog processing system have been assembled from the portable CIAO data analysis package. Minimal new software tool development has been required to support the science algorithms needed for catalog production. Since processing pipelines must be instantiated for each detected source, the number of pipelines that are run during catalog construction is a factor of order 100 times larger than for SDP. The increased computational load, and inherent parallel nature of the processing, is handled by distributing the workload across a multi-node Beowulf cluster. Modifications to the SDP automated processing application to support catalog processing, and extensions to Chandra Data Archive software to ingest and retrieve catalog products, complete the upgrades to the infrastructure to support catalog processing.

  12. Supervised detection of exoplanets in high-contrast imaging sequences

    NASA Astrophysics Data System (ADS)

    Gomez Gonzalez, C. A.; Absil, O.; Van Droogenbroeck, M.

    2018-06-01

    Context. Post-processing algorithms play a key role in pushing the detection limits of high-contrast imaging (HCI) instruments. State-of-the-art image processing approaches for HCI enable the production of science-ready images relying on unsupervised learning techniques, such as low-rank approximations, for generating a model point spread function (PSF) and subtracting the residual starlight and speckle noise. Aims: In order to maximize the detection rate of HCI instruments and survey campaigns, advanced algorithms with higher sensitivities to faint companions are needed, especially for the speckle-dominated innermost region of the images. Methods: We propose a reformulation of the exoplanet detection task (for ADI sequences) that builds on well-established machine learning techniques to take HCI post-processing from an unsupervised to a supervised learning context. In this new framework, we present algorithmic solutions using two different discriminative models: SODIRF (random forests) and SODINN (neural networks). We test these algorithms on real ADI datasets from VLT/NACO and VLT/SPHERE HCI instruments. We then assess their performances by injecting fake companions and using receiver operating characteristic analysis. This is done in comparison with state-of-the-art ADI algorithms, such as ADI principal component analysis (ADI-PCA). Results: This study shows the improved sensitivity versus specificity trade-off of the proposed supervised detection approach. At the diffraction limit, SODINN improves the true positive rate by a factor ranging from 2 to 10 (depending on the dataset and angular separation) with respect to ADI-PCA when working at the same false-positive level. Conclusions: The proposed supervised detection framework outperforms state-of-the-art techniques in the task of discriminating planet signal from speckles. In addition, it offers the possibility of re-processing existing HCI databases to maximize their scientific return and potentially improve the demographics of directly imaged exoplanets.

  13. Computer analysis of gallbladder ultrasonic images towards recognition of pathological lesions

    NASA Astrophysics Data System (ADS)

    Ogiela, M. R.; Bodzioch, S.

    2011-06-01

    This paper presents a new approach to gallbladder ultrasonic image processing and analysis aimed at automatic detection and interpretation of disease symptoms in processed US images. First, a new heuristic method for filtering gallbladder contours from images is presented. A major stage in this filtration is segmenting and sectioning off the areas occupied by the organ. The paper provides an inventive algorithm for the holistic extraction of gallbladder image contours, based on rank filtration as well as on the analysis of line profile sections of the examined organ. The second part concerns detecting the most important lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms that analyze the object of such diagnosis and verify the occurrence of symptoms related to a given affection. The methodology of computer analysis of US gallbladder images presented here is clearly utilitarian in nature and, after standardization, can be used as a technique for supporting the diagnostics of selected gallbladder disorders using images of this organ.

  14. Nondestructive study of corrosion by the analysis of diffused light

    NASA Astrophysics Data System (ADS)

    Hogert, Elsa N.; Landau, Monica R.; Marengo, Jose A.; Ruiz Gale, Maria F.; Gaggioli, Nestor G.; Paiva, Raul D., Jr.; Soga, Diogo; Muramatsu, Mikiya

    1999-07-01

    This work describes the application of mean intensity diffusion analysis to detect and analyze metallic corrosion phenomena. We present some new results in the characterization of the corrosion process using a model based on electroerosion phenomena. Valuable information is provided about surface microrelief changes, which is also useful for numerous engineering applications. The quality of our results supports the idea that this technique can contribute to a better analysis of corrosion processes, in particular in real time.

  15. Dynamic Network Change Detection

    DTIC Science & Technology

    2008-12-01

    …(Fisher and Mackenzie, 1922). These methods are used in quality engineering to detect small changes in a process (Montgomery, 1991; Ryan, 2000). Larger… Social Network Modeling and Analysis: Workshop Summary and Papers, Ronald Breiger, Kathleen Carley, and Philippa Pattison (Eds.)

  16. No psychological effect of color context in a low level vision task

    PubMed Central

    Pedley, Adam; Wade, Alex R

    2013-01-01

    Background: A remarkable series of recent papers have shown that colour can influence performance in cognitive tasks. In particular, they suggest that viewing a participant number printed in red ink or other red ancillary stimulus elements improves performance in tasks requiring local processing and impedes performance in tasks requiring global processing whilst the reverse is true for the colour blue. The tasks in these experiments require high level cognitive processing such as analogy solving or remote association tests and the chromatic effect on local vs. global processing is presumed to involve widespread activation of the autonomic nervous system. If this is the case, we might expect to see similar effects on all local vs. global task comparisons. To test this hypothesis, we asked whether chromatic cues also influence performance in tasks involving low level visual feature integration. Methods: Subjects performed either local (contrast detection) or global (form detection) tasks on achromatic dynamic Glass pattern stimuli. Coloured instructions, target frames and fixation points were used to attempt to bias performance to different task types. Based on previous literature, we hypothesised that red cues would improve performance in the (local) contrast detection task but would impede performance in the (global) form detection task. Results: A two-way, repeated measures, analysis of covariance (2×2 ANCOVA) with gender as a covariate, revealed no influence of colour on either task, F(1,29) = 0.289, p = 0.595, partial η² = 0.002. Additional analysis revealed no significant differences in only the first attempts of the tasks or in the improvement in performance between trials. Discussion: We conclude that motivational processes elicited by colour perception do not influence neuronal signal processing in the early visual system, in stark contrast to their putative effects on processing in higher areas. PMID:25075280

  17. No psychological effect of color context in a low level vision task.

    PubMed

    Pedley, Adam; Wade, Alex R

    2013-01-01

    A remarkable series of recent papers have shown that colour can influence performance in cognitive tasks. In particular, they suggest that viewing a participant number printed in red ink or other red ancillary stimulus elements improves performance in tasks requiring local processing and impedes performance in tasks requiring global processing whilst the reverse is true for the colour blue. The tasks in these experiments require high level cognitive processing such as analogy solving or remote association tests and the chromatic effect on local vs. global processing is presumed to involve widespread activation of the autonomic nervous system. If this is the case, we might expect to see similar effects on all local vs. global task comparisons. To test this hypothesis, we asked whether chromatic cues also influence performance in tasks involving low level visual feature integration. Subjects performed either local (contrast detection) or global (form detection) tasks on achromatic dynamic Glass pattern stimuli. Coloured instructions, target frames and fixation points were used to attempt to bias performance to different task types. Based on previous literature, we hypothesised that red cues would improve performance in the (local) contrast detection task but would impede performance in the (global) form detection task. A two-way, repeated measures, analysis of covariance (2×2 ANCOVA) with gender as a covariate, revealed no influence of colour on either task, F(1,29) = 0.289, p = 0.595, partial η² = 0.002. Additional analysis revealed no significant differences in only the first attempts of the tasks or in the improvement in performance between trials. We conclude that motivational processes elicited by colour perception do not influence neuronal signal processing in the early visual system, in stark contrast to their putative effects on processing in higher areas.

  18. Evaluation of a dual-probe real time PCR system for detection of mandarin in commercial orange juice.

    PubMed

    Pardo, Miguel Angel

    2015-04-01

    A dual-probe real-time PCR assay, based on the simultaneous detection of two TaqMan® probes, was evaluated for the detection of mandarin in orange juice. A single conserved polymorphism, located at position 314 of the intron belonging to the chloroplast trnL gene, was confirmed by sequencing in 30 mandarin cultivars, 28 orange cultivars, and 13 hybrids. The assay was also successfully evaluated in a blind trial analysing 60 samples from different industrial processes in different countries around the world. The detection limit of the assay was established at 1% mandarin in processed orange juice, with 100% precision. The quantitative application of the assay on citrus mixtures was also investigated, showing that the number of chloroplast DNA copies is too variable for its possible use in quantitative analysis. This assay can be employed as a routine methodology to control accidental mixing during industrial processes and to deter intentional fraud. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Network Anomaly Detection Based on Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
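
    The modelling step can be sketched as fitting a coarse wavelet approximation to a traffic feature and flagging large residuals. A minimal sketch with PyWavelets, assuming `feature` is one of the fifteen traffic features sampled over time; the wavelet, decomposition level, and 3-sigma rule are illustrative choices, not the paper's exact system-identification model:

        import numpy as np
        import pywt

        def wavelet_anomalies(feature, wavelet="db4", level=4, k=3.0):
            # Baseline = reconstruction from approximation coefficients only;
            # detail coefficients (fast fluctuations) are zeroed out.
            coeffs = pywt.wavedec(feature, wavelet, level=level)
            coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
            baseline = pywt.waverec(coeffs, wavelet)[:len(feature)]
            resid = feature - baseline
            return np.where(np.abs(resid) > k * resid.std())[0]  # anomaly indices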

  20. Spatiotemporal Change Detection Using Landsat Imagery: the Case Study of Karacabey Flooded Forest, Bursa, Turkey

    NASA Astrophysics Data System (ADS)

    Akay, A. E.; Gencal, B.; Taş, İ.

    2017-11-01

    This short paper aims at spatiotemporal detection of land use/land cover change within the Karacabey Flooded Forest region. Change detection analysis was applied to a Landsat 5 TM image from July 2000 and a Landsat 8 OLI image from June 2017. Various image processing tools in ERDAS 9.2, ArcGIS 10.4.1, and ENVI were used to conduct the spatiotemporal change detection over these two images, including band selection, corrections, subsetting, classification, recoding, accuracy assessment, and change detection analysis. Image classification revealed five significant land use/land cover types: forest, flooded forest, swamp, water, and other lands (i.e., agriculture, sand, roads, settlement, and open areas). The results indicated that there was an increase in flooded forest, water, and other lands, while the cover of forest and swamp decreased.

  1. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.

  2. Bacterial Pathogens Associated with Community-acquired Pneumonia in Children Aged Below Five Years.

    PubMed

    Das, Anusmita; Patgiri, Saurav J; Saikia, Lahari; Dowerah, Pritikar; Nath, Reema

    2016-03-01

    To determine the spectrum of bacterial pathogens causing community-acquired pneumonia in children below 5 years of age, children satisfying the WHO criteria for pneumonia, severe pneumonia, or very severe pneumonia, and with lung infiltrates present on chest X-ray, were enrolled. Two respiratory samples, one for culture and the other for PCR analysis, and a blood sample for culture were collected from every child. Of the 180 samples processed, bacterial pathogens were detected in 64.4%. Streptococcus pneumoniae and Haemophilus influenzae were most frequently detected. The performance of PCR analysis and culture was identical for the typical bacterial pathogens; atypical pathogens were detected by PCR analysis only. S. pneumoniae and H. influenzae were the most commonly detected organisms from respiratory secretions of children with community-acquired pneumonia.

  3. Adaptive Locally Optimum Processing for Interference Suppression from Communication and Undersea Surveillance Signals

    DTIC Science & Technology

    1994-07-01

    1993. "Analysis of the 1730-1732. Track - Before - Detect Approach to Target Detection using Pixel Statistics", to appear in IEEE Transactions Scholz, J...large surveillance arrays. One approach to combining energy in different spatial cells is track - before - detect . References to examples appear in the next... track - before - detect problem. The results obtained are not expected to depend strongly on model details. In particular, the structure of the tracking

  4. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  5. A Novel Method for Block Size Forensics Based on Morphological Operations

    NASA Astrophysics Data System (ADS)

    Luo, Weiqi; Huang, Jiwu; Qiu, Guoping

    Passive forensics analysis aims to find out how multimedia data was acquired and processed without relying on pre-embedded or pre-registered information. Since most existing compression schemes for digital images are based on block processing, one of the fundamental steps for subsequent forensics analysis is to detect the presence of block artifacts and estimate the block size for a given image. In this paper, we propose a novel method for blind block size estimation. A 2×2 cross-differential filter is first applied to detect all possible block artifact boundaries, morphological operations are then used to remove the boundary effects caused by the edges of the actual image contents, and finally maximum-likelihood estimation (MLE) is employed to estimate the block size. The experimental results, evaluated on over 1,300 natural images, show the effectiveness of our proposed method. Compared with the existing gradient-based detection method, our method achieves over 39% accuracy improvement on average.
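
    The first stage, the 2×2 cross-difference that exposes periodic block boundaries, is easy to sketch; the morphological cleanup and MLE stages are omitted here. A minimal sketch assuming a grayscale image as a float NumPy array; the simple peak-spacing vote below stands in for the paper's MLE, which also resolves ties between a block size and its multiples:

        import numpy as np

        def block_size_candidate(img, max_block=32):
            # 2x2 cross-difference: |I(i,j) - I(i,j+1) - I(i+1,j) + I(i+1,j+1)|
            # is large along block-artifact boundaries.
            d = np.abs(img[:-1, :-1] - img[:-1, 1:]
                       - img[1:, :-1] + img[1:, 1:])
            col = d.mean(axis=0)           # column profile of boundary energy
            col = col - col.mean()
            # Score each candidate period by the mean profile at its boundaries.
            scores = {b: col[b - 1::b].mean() for b in range(2, max_block + 1)}
            return max(scores, key=scores.get)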

  6. BP fusion model for the detection of oil spills on the sea by remote sensing

    NASA Astrophysics Data System (ADS)

    Chen, Weiwei; An, Jubai; Zhang, Hande; Lin, Bin

    2003-06-01

    Oil spills are a very serious form of marine pollution in many countries. In order to detect and identify oil spilled on the sea by remote sensor, scientists must study the remote sensing images. For the detection of oil spills on the sea, edge detection is an important technology in image processing, and many edge detection algorithms have been developed, each with its own advantages and disadvantages. Based on the primary requirements of edge detection for oil spill images on the sea, computation time and detection accuracy, we developed a fusion model. The model employs a BP neural net to fuse the detection results of simple operators. We selected a BP neural net as the fusion technology because the relation between the edge gray levels produced by simple operators and the image's true edge gray level is nonlinear, and BP neural nets are good at solving nonlinear identification problems. We therefore trained a BP neural net on some oil spill images, then applied the BP fusion model to the edge detection of other oil spill images and obtained good results. In this paper the detection results of some gradient operators and the Laplacian operator are also compared with the result of the BP fusion model to analyze the fusion effect. Finally, the paper points out that the fusion model has higher accuracy and higher speed in processing oil spill image edge detection.
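
    The fusion idea, a small backpropagation network mapping the responses of several simple operators at each pixel to a fused edge gray level, can be sketched with a generic MLP. This is only an illustration under assumed choices: the operator set, network size, and training target below (a stand-in for a reference edge map) are hypothetical, not the paper's configuration:

        import numpy as np
        from scipy import ndimage
        from sklearn.neural_network import MLPRegressor

        def operator_stack(img):
            # One feature column per simple edge operator, one row per pixel.
            feats = [np.hypot(ndimage.sobel(img, 0), ndimage.sobel(img, 1)),
                     np.hypot(ndimage.prewitt(img, 0), ndimage.prewitt(img, 1)),
                     np.abs(ndimage.laplace(img))]
            return np.stack([f.ravel() for f in feats], axis=1)

        rng = np.random.default_rng(1)
        train_img = rng.random((64, 64))
        # Stand-in label: in practice this would be a reference edge map
        # (e.g. hand-labeled oil spill edges), not another operator output.
        target_edges = np.hypot(ndimage.sobel(train_img, 0),
                                ndimage.sobel(train_img, 1)).ravel()
        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
        net.fit(operator_stack(train_img), target_edges)
        fused = net.predict(operator_stack(train_img)).reshape(train_img.shape)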

  7. The simulation study on optical target laser active detection performance

    NASA Astrophysics Data System (ADS)

    Li, Ying-chun; Hou, Zhao-fei; Fan, Youchen

    2014-12-01

    According to the working principle of a laser active detection system, this paper establishes an optical target laser active detection simulation system and carries out a simulation study of the detection process and detection performance of the system, including performance models for laser emission, laser propagation in the atmosphere, reflection from the optical target, the receiver detection system, and signal processing and recognition. We focus on analyzing and modeling the relationship between the laser emission angle, the defocus amount, and the "cat-eye" effect echo in the reflection from the optical target. Further, performance indices of the system such as operating range, SNR, and detection probability are simulated. Parameters of the laser emission, the reflection from the optical target, and the laser propagation in the atmosphere have a great influence on the performance of the optical target laser active detection system. Finally, using object-oriented software design methods, a laser active detection simulation platform with an open architecture and complete functionality is implemented, realizing a process simulation in which the detection system detects and recognizes the optical target, completing the performance simulation of each subsystem, and generating data reports and graphs. The visible simulation process makes the performance models of the laser active detection system more intuitive, the simulation data provide a reference for adjusting system structure parameters, and the work provides theoretical and technical support for the top-level design and performance index optimization of the optical target laser active detection system.

  8. The time course of symbolic number adaptation: oscillatory EEG activity and event-related potential analysis.

    PubMed

    Hsu, Yi-Fang; Szűcs, Dénes

    2012-02-15

    Several functional magnetic resonance imaging (fMRI) studies have used neural adaptation paradigms to detect anatomical locations of brain activity related to number processing. However, currently not much is known about the temporal structure of number adaptation. In the present study, we used electroencephalography (EEG) to elucidate the time course of neural events in symbolic number adaptation. The numerical distance of deviants relative to standards was manipulated. In order to avoid perceptual confounds, all levels of deviants consisted of perceptually identical stimuli. Multiple successive numerical distance effects were detected in event-related potentials (ERPs). Analysis of oscillatory activity further showed at least two distinct stages of neural processes involved in the automatic analysis of numerical magnitude, with the earlier effect emerging at around 200 ms and the later effect appearing at around 400 ms. The findings support the hypothesis that numerical magnitude processing involves a succession of cognitive events. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  9. Atmospheric effects on microphone array analysis of aircraft vortex sound

    DOT National Transportation Integrated Search

    2006-05-08

    This paper provides the basis of a comprehensive analysis of vortex sound propagation through the atmosphere in order to assess real atmospheric effects on acoustic array processing. Such effects may impact vortex localization accuracy and detect...

  10. Disappearance of six pesticides in fresh and processed zucchini, bioavailability and health risk assessment.

    PubMed

    Oliva, J; Cermeño, S; Cámara, M A; Martínez, G; Barba, A

    2017-08-15

    A field study was carried out on the dissipation of three insecticides and three fungicides during the freezing of zucchini. A simultaneous residue analysis method is validated using QuEChERS extraction with acetonitrile and CG-MS and LC-MS analysis. The residues detected after field application never exceeded the established maximum residue limits. The processing factors calculated (fresh product/frozen product) are lower than 1, indicating a clear influence of the stages of the freezing process, especially the washing and blanching. The in vitro study of bioavailability establishes a low percentage of stomach absorption capacity. The level of residues detected in fresh zucchini and the Estimated Daily Intake calculated for Spain suggest that there is no risk of acute toxicity due to dietary exposure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Analysis of Space Shuttle Ground Support System Fault Detection, Isolation, and Recovery Processes and Resources

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.

    2009-01-01

    As part of the FDIR (Fault Detection, Isolation, and Recovery) Project for the Constellation Program, a task called the Legacy Benchmarking Task was designed to document as accurately as possible the FDIR processes and resources used by the Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.

  12. Detection of contamination on selected apple cultivars using reflectance hyperspectral and multispectral analysis

    NASA Astrophysics Data System (ADS)

    Mehl, Patrick M.; Chao, Kevin; Kim, Moon S.; Chen, Yud-Ren

    2001-03-01

    The presence of natural or exogenous contamination on apple cultivars is a food safety and quality concern affecting the general public and strongly affecting this commodity market. Accumulations of human pathogens are usually observed on surface lesions of commodities, so detection of either the lesions or the pathogens directly is essential for assuring the quality and safety of commodities. We present the application of hyperspectral image analysis toward the development of multispectral techniques for the detection of defects on chosen apple cultivars: Golden Delicious, Red Delicious, and Gala. Different apple cultivars possess different spectral characteristics, leading to different approaches for analysis. General preprocessing with morphological treatments is followed by different image treatments and condition analyses to highlight lesions and contamination on the apple cultivars. Good isolation of scabs, fungal and soil contamination, and bruises is observed with hyperspectral image processing, either using principal component analysis or utilizing the chlorophyll absorption peak. Application of the hyperspectral results to multispectral detection is limited by the spectral capabilities of our RGB camera using either specific band-pass filters or direct neutral filters. Good separation of defects is obtained for Golden Delicious apples, but is limited for the other cultivars. An extra near-infrared channel would increase the detection level by utilizing the chlorophyll absorption band, as demonstrated by the present hyperspectral imaging analysis.

  13. A Statistical Analysis of the Output Signals of an Acousto-Optic Spectrum Analyzer for CW (Continuous-Wave) Signals

    DTIC Science & Technology

    1988-10-01

    A statistical analysis of the output signals of an acousto-optic spectrum analyzer (AOSA) is performed for the case when the input signal is a…processing, Electronic warfare, Radar countermeasures, Acousto-optic, Spectrum analyzer, Statistical analysis, Detection, Estimation, Canada, Modelling.

  14. A novel method about detecting missing holes on the motor carling

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Tan, Hao; Li, Guirong

    2018-03-01

    After a deep analysis of how to use an image processing system to detect missing holes on the motor carling, we design the whole system in combination with the actual production conditions for the motor carling. We then explain the system's hardware and software in detail, introducing their general functions, and, having analyzed these general functions, we discuss the hardware and software modules and the theory behind their design. The measures used to confirm the region for image processing, the edge detection, and the randomized Hough transform for circle detection are explained in detail. Finally, the results of testing the system in the laboratory and in the factory are given.
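
    The circle-detection step can be illustrated with a Hough transform: detect hole candidates, then compare against the hole positions expected from the part drawing. A minimal sketch using OpenCV's gradient-based Hough variant in place of the randomized Hough transform named above; the image path, radius range, tolerances, and expected hole centres are all hypothetical:

        import cv2
        import numpy as np

        img = cv2.imread("carling.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image
        blur = cv2.medianBlur(img, 5)
        circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                                   param1=100, param2=30, minRadius=5, maxRadius=15)
        found = circles[0] if circles is not None else np.empty((0, 3))

        expected = [(50, 40), (120, 40), (190, 40)]  # hole centres from the drawing
        missing = [c for c in expected
                   if not any(np.hypot(c[0] - x, c[1] - y) < 10 for x, y, _ in found)]
        print("missing holes at:", missing)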

  15. Inflight and Preflight Detection of Pitot Tube Anomalies

    NASA Technical Reports Server (NTRS)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.
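
    Noise analysis of this kind infers a sensor's response time from the roll-off of its noise power spectrum: a blocked Pitot line slows the pneumatic response and pushes the spectral corner frequency down. A minimal sketch with SciPy's Welch estimator, assuming `noise` is the detrended transmitter signal; the first-order-lag model and half-power corner fit are illustrative simplifications, not the AMS Corp. procedure:

        import numpy as np
        from scipy import signal

        def response_time(noise, fs):
            # Estimate a first-order time constant tau = 1 / (2*pi*fc) from
            # the frequency where the PSD falls to half its low-f plateau.
            f, pxx = signal.welch(noise, fs=fs, nperseg=1024)
            f, pxx = f[1:], pxx[1:]              # drop the DC bin
            plateau = pxx[:10].mean()
            fc = f[np.argmax(pxx < plateau / 2)]
            return 1.0 / (2 * np.pi * fc)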

  16. Flow injection gas chromatography with sulfur chemiluminescence detection for the analysis of total sulfur in complex hydrocarbon matrixes.

    PubMed

    Hua, Yujuan; Hawryluk, Myron; Gras, Ronda; Shearer, Randall; Luong, Jim

    2018-01-01

    A fast and reliable analytical technique for the determination of total sulfur levels in complex hydrocarbon matrices is introduced. The method employed flow injection technique using a gas chromatograph as a sample introduction device and a gas phase dual-plasma sulfur chemiluminescence detector for sulfur quantification. Using the technique described, total sulfur measurement in challenging hydrocarbon matrices can be achieved in less than 10 s with sample-to-sample time <2 min. The high degree of selectivity and sensitivity toward sulfur compounds of the detector offers the ability to measure low sulfur levels with a detection limit in the range of 20 ppb w/w S. The equimolar response characteristic of the detector allows the quantitation of unknown sulfur compounds and simplifies the calibration process. Response is linear over a concentration range of five orders of magnitude, with a high degree of repeatability. The detector's lack of response to hydrocarbons enables direct analysis without the need for time-consuming sample preparation and chromatographic separation processes. This flow injection-based sulfur chemiluminescence detection technique is ideal for fast analysis or trace sulfur analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. ACVP-14: Next-Generation Multiplex vRNA and vDNA Lineage Specific In Situ Hybridization Detection With Immunohisto-Fluorescence or Chromogen in the Same Tissue Section with Quantitative Image Analysis in Fixed Tissues from Virally Infected Specimens | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The Tissue Analysis Core within the AIDS and Cancer Virus Program will process, embed and perform microtomy on fixed tissue samples presented in ethanol. HIV/SIV in situ hybridization for detection of vRNA and vDNA will be performed using the next-gene

  18. Applying Statistical Process Control to Clinical Data: An Illustration.

    ERIC Educational Resources Information Center

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
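
    The control-chart logic is compact: establish limits from a baseline phase and flag treatment-phase points that fall outside them. A minimal Shewhart-style sketch; treating `baseline` and `treatment` as session-score sequences and using 3-sigma limits are conventional assumptions, not details from the article:

        import numpy as np

        def control_chart(baseline, treatment, k=3.0):
            center = np.mean(baseline)
            sigma = np.std(baseline, ddof=1)
            lcl, ucl = center - k * sigma, center + k * sigma
            # Points beyond the limits signal a change from baseline behaviour.
            flags = [(i, x) for i, x in enumerate(treatment) if not lcl <= x <= ucl]
            return (lcl, ucl), flags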

  19. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output.The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces.In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  20. Lunar Landing Trajectory Design for Onboard Hazard Detection and Avoidance

    NASA Technical Reports Server (NTRS)

    Paschall, Steve; Brady, Tye; Sostaric, Ron

    2009-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing the software and hardware technology needed to support a safe and precise landing for the next generation of lunar missions. ALHAT provides this capability through terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard detection system to select safe landing locations, and an Autonomous Guidance, Navigation, and Control (AGNC) capability to process these measurements and safely direct the vehicle to a landing location. This paper focuses on the key trajectory design issues relevant to providing an onboard Hazard Detection and Avoidance (HDA) capability for the lander. Hazard detection can be accomplished by the crew visually scanning the terrain through a window, a sensor system imaging the terrain, or some combination of both. For ALHAT, this hazard detection activity is provided by a sensor system, which either augments the crew s perception or entirely replaces the crew in the case of a robotic landing. Detecting hazards influences the trajectory design by requiring the proper perspective, range to the landing site, and sufficient time to view the terrain. Following this, the trajectory design must provide additional time to process this information and make a decision about where to safely land. During the final part of the HDA process, the trajectory design must provide sufficient margin to enable a hazard avoidance maneuver. In order to demonstrate the effects of these constraints on the landing trajectory, a tradespace of trajectory designs was created for the initial ALHAT Design Analysis Cycle (ALDAC-1) and each case evaluated with these HDA constraints active. The ALHAT analysis process, described in this paper, narrows down this tradespace and subsequently better defines the trajectory design needed to support onboard HDA. Future ALDACs will enhance this trajectory design by balancing these issues and others in an overall system design process.

  1. A Bio-Inspired, Motion-Based Analysis of Crowd Behavior Attributes Relevance to Motion Transparency, Velocity Gradients, and Motion Patterns

    PubMed Central

    Raudies, Florian; Neumann, Heiko

    2012-01-01

    The analysis of motion crowds is concerned with the detection of potential hazards for individuals of the crowd. Existing methods analyze the statistics of pixel motion to classify non-dangerous or dangerous behavior, to detect outlier motions, or to estimate the mean throughput of people for an image region. We suggest a biologically inspired model for the analysis of motion crowds that extracts motion features indicative for potential dangers in crowd behavior. Our model consists of stages for motion detection, integration, and pattern detection that model functions of the primate primary visual cortex area (V1), the middle temporal area (MT), and the medial superior temporal area (MST), respectively. This model allows for the processing of motion transparency, the appearance of multiple motions in the same visual region, in addition to processing opaque motion. We suggest that motion transparency helps to identify “danger zones” in motion crowds. For instance, motion transparency occurs in small exit passages during evacuation. However, motion transparency occurs also for non-dangerous crowd behavior when people move in opposite directions organized into separate lanes. Our analysis suggests: The combination of motion transparency and a slow motion speed can be used for labeling of candidate regions that contain dangerous behavior. In addition, locally detected decelerations or negative speed gradients of motions are a precursor of danger in crowd behavior as are globally detected motion patterns that show a contraction toward a single point. In sum, motion transparency, image speeds, motion patterns, and speed gradients extracted from visual motion in videos are important features to describe the behavioral state of a motion crowd. PMID:23300930

  2. Driver drowsiness detection using ANN image processing

    NASA Astrophysics Data System (ADS)

    Vesselenyi, T.; Moca, S.; Rus, A.; Mitran, T.; Tătaru, B.

    2017-10-01

    The paper presents a study on the possibility of developing a drowsiness detection system for car drivers based on three types of methods: EEG signal processing, EOG signal processing, and driver image analysis. In previous works the authors have described research on the first two methods. In this paper the authors study the possibility of detecting the drowsy or alert state of the driver based on images taken during driving, by analyzing the state of the driver's eyes: opened, half-opened, and closed. For this purpose two kinds of artificial neural networks were employed: a one-hidden-layer network and an autoencoder network.

  3. Quantitative naturalistic methods for detecting change points in psychotherapy research: an illustration with alliance ruptures.

    PubMed

    Eubanks-Carter, Catherine; Gorman, Bernard S; Muran, J Christopher

    2012-01-01

    Analysis of change points in psychotherapy process could increase our understanding of mechanisms of change. In particular, naturalistic change point detection methods that identify turning points or breakpoints in time series data could enhance our ability to identify and study alliance ruptures and resolutions. This paper presents four categories of statistical methods for detecting change points in psychotherapy process: criterion-based methods, control chart methods, partitioning methods, and regression methods. Each method's utility for identifying shifts in the alliance is illustrated using a case example from the Beth Israel Psychotherapy Research program. Advantages and disadvantages of the various methods are discussed.
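
    Of the four families, partitioning methods are the easiest to illustrate: a single change point in the mean of a session-by-session alliance series can be located by a least-squares split. A minimal sketch (one breakpoint via binary segmentation; the variable names are illustrative):

        import numpy as np

        def one_change_point(series):
            # Return the split index minimising total within-segment squared
            # error, i.e. the best single break in the mean of the series.
            x = np.asarray(series, dtype=float)
            best_k, best_cost = None, np.inf
            for k in range(2, len(x) - 2):
                cost = (((x[:k] - x[:k].mean()) ** 2).sum()
                        + ((x[k:] - x[k:].mean()) ** 2).sum())
                if cost < best_cost:
                    best_k, best_cost = k, cost
            return best_k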

  4. Rapid detection of hazardous chemicals in textiles by direct analysis in real-time mass spectrometry (DART-MS).

    PubMed

    Antal, Borbála; Kuki, Ákos; Nagy, Lajos; Nagy, Tibor; Zsuga, Miklós; Kéki, Sándor

    2016-07-01

    Residues of chemicals on clothing products were examined by direct analysis in real-time (DART) mass spectrometry. Our experiments revealed the presence of more than 40 chemicals in 15 different clothing items. The identification was confirmed by DART tandem mass spectrometry (MS/MS) experiments for 14 compounds. The most commonly detected hazardous substances were nonylphenol ethoxylates (NPEs), phthalic acid esters (phthalates), amines released by azo dyes, and quinoline derivatives. DART-MS was able to detect NPEs on the skin of a person wearing a clothing item contaminated by NPE residues. An automated data acquisition and processing method was developed and tested for the recognition of NPE residues, thereby reducing the analysis time.

  5. TDLAS analysis of test signal strength under simulated environments

    NASA Astrophysics Data System (ADS)

    Li, Xin; Zhou, Tao; Jia, Xiaodong

    2014-12-01

    A TDLAS system uses the wavelength tuning characteristics of a laser diode to detect the absorption spectrum along a gas absorption line, measuring the temperature, pressure, flow rate, and concentration of the gas in the probed space. In the laboratory, TDLAS gas detection was used to experimentally simulate the water vapor and smoke produced by engine combustion; an optical lens system received the signal, and the signal acquisition and signal interference tests were analyzed. Interference from water vapor and smoke was simulated in the sample cell under the two different environments. In both experiments the effect of environmental interference on gas absorption was examined in the acquired optical signal, the variation of signal amplitude was analyzed, and the related signal data were recorded. These results provide ideal experimental data for signal acquisition under field conditions during the engine combustion process.

  6. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X.

    2015-12-01

    Hyperspectral reflectance imaging has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique is suboptimal owing to multiple limitations such as nonuniform illumination, a high-cost and bulky setup, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over a wavelength range of 600 to 800 nm and processed it by a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. Clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories: normal, inflammation, and high-grade lesion. These classification results coincided with those of an experienced gynecology oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.
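
    The wide gap second derivative at band λ with gap g is the discrete second difference R(λ − g) − 2R(λ) + R(λ + g), which cancels additive offsets and slowly varying illumination trends. A minimal sketch over a hyperspectral cube, assuming `cube` has shape (rows, cols, bands) with a matching `wavelengths` axis; the 40 nm gap is an illustrative value, not the study's parameter:

        import numpy as np

        def wide_gap_second_derivative(cube, wavelengths, band_nm, gap_nm=40):
            # d2R at band_nm: R(l - g) - 2 R(l) + R(l + g), evaluated at the
            # nearest sampled bands; flat and linear backgrounds cancel.
            idx = lambda nm: int(np.argmin(np.abs(wavelengths - nm)))
            lo, mid, hi = idx(band_nm - gap_nm), idx(band_nm), idx(band_nm + gap_nm)
            return cube[:, :, lo] - 2 * cube[:, :, mid] + cube[:, :, hi]

        # Classification features at the three reported wavelengths:
        # feats = [wide_gap_second_derivative(cube, wl, nm) for nm in (620, 696, 772)]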

  7. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia.

    PubMed

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X

    2015-12-01

    The hyperspectral reflectance imaging technique has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique is suboptimal owing to multiple limitations, such as nonuniform illumination, a high-cost and bulky setup, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over a wavelength range from 600 to 800 nm and processed it with a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. The clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories: normal, inflammation, and high-grade lesion. These classification results were consistent with those of an experienced gynecologic oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.
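
    The wide gap second derivative at the heart of the two records above is straightforward to reproduce. The sketch below is a minimal illustration, assuming only NumPy and a reflectance cube already resampled onto a uniform wavelength grid: for a band offset g it computes D2(b) = I(b + g) - 2*I(b) + I(b - g) along the spectral axis and then keeps the three wavelengths the authors report. The cube shape, the gap size, and the wide_gap_second_derivative helper are illustrative choices, not the paper's implementation.

    ```python
    import numpy as np

    def wide_gap_second_derivative(cube, gap):
        """Second derivative along the spectral axis with a wide band gap:
        D2(b) = I(b + gap) - 2*I(b) + I(b - gap); wider gaps suppress noise."""
        d2 = np.full(cube.shape, np.nan)
        d2[:, :, gap:-gap] = (cube[:, :, 2 * gap:]
                              - 2 * cube[:, :, gap:-gap]
                              + cube[:, :, :-2 * gap])
        return d2

    # Synthetic stand-in cube: 600-800 nm sampled every 4 nm (51 bands).
    rng = np.random.default_rng(0)
    wavelengths = np.arange(600, 801, 4)
    cube = rng.random((64, 64, wavelengths.size))
    d2 = wide_gap_second_derivative(cube, gap=4)            # 16 nm gap

    # Keep the three discriminative wavelengths reported by the authors.
    bands = [int(np.argmin(np.abs(wavelengths - w))) for w in (620, 696, 772)]
    features = d2[:, :, bands]    # (rows, cols, 3) inputs for classification
    ```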

  8. Parallel processing considerations for image recognition tasks

    NASA Astrophysics Data System (ADS)

    Simske, Steven J.

    2011-01-01

    Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images. From this standpoint, then, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that will be considered in this paper: parallel processing (1) by task; (2) by image region; and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows, as diverse as optical character recognition (OCR), document classification, and barcode reading, to parallel pipelines. This can substantially decrease time to completion for the document tasks. For this approach, each parallel pipeline is generally performing a different task. Parallel processing by image region allows a larger imaging task to be subdivided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach. Examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
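
    As a concrete illustration of the second category, parallel processing by image region, the sketch below splits one large image into strips, maps the same analysis over them in worker processes, and reduces the partial results. It assumes only NumPy and Python's standard concurrent.futures; the strip decomposition and the toy edge-pixel count are placeholder choices, not from the paper.

    ```python
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def analyze_tile(tile):
        """Per-tile 'map' work: a toy edge-pixel count via a gradient threshold."""
        gy, gx = np.gradient(tile.astype(float))
        return int((np.hypot(gx, gy) > 32).sum())

    def split_tiles(image, n):
        """Cut the image into n horizontal strips (the map inputs)."""
        return np.array_split(image, n, axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        image = (rng.random((2048, 2048)) * 255).astype(np.uint8)
        with ProcessPoolExecutor(max_workers=4) as pool:
            partials = list(pool.map(analyze_tile, split_tiles(image, 4)))
        print("edge pixels:", sum(partials))    # the 'reduce' step
    ```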

  9. Interdisciplinary Investigations in Support of Project DI-MOD

    NASA Technical Reports Server (NTRS)

    Starks, Scott A. (Principal Investigator)

    1996-01-01

    Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remotely sensed imagery. An approach to trend detection based upon the fractal analysis of power spectrum estimates is presented. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach that provides for the automation of the state monitoring process is presented.
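
    The abstract does not spell out the estimator, but fractal analysis of power spectrum estimates commonly reduces to fitting a power law P(f) proportional to f^(-beta) to the periodogram, with the exponent beta tied to the fractal dimension; a drift in beta along image transects can then flag a trend. The sketch below, assuming only NumPy, shows this common construction; the function name and the random-walk test signal are illustrative.

    ```python
    import numpy as np

    def spectral_exponent(series):
        """Fit the power-law exponent beta of the periodogram, P(f) ~ f**-beta,
        by linear regression in log-log coordinates."""
        series = series - series.mean()
        psd = np.abs(np.fft.rfft(series)) ** 2
        freqs = np.fft.rfftfreq(series.size)
        keep = freqs > 0                  # drop the DC bin before taking logs
        slope, _ = np.polyfit(np.log(freqs[keep]), np.log(psd[keep]), 1)
        return -slope

    rng = np.random.default_rng(2)
    walk = np.cumsum(rng.standard_normal(4096))  # random walk: beta close to 2
    print(round(spectral_exponent(walk), 2))
    ```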

  10. Review of Processing and Analytical Methods for Francisella ...

    EPA Pesticide Factsheets

    The etiological agent of tularemia, Francisella tularensis, is a resilient organism within the environment and can be acquired many ways (infectious aerosols and dust, contaminated food and water, infected carcasses, and arthropod bites). However, isolating F. tularensis from environmental samples can be challenging due to its nutritionally fastidious and slow-growing nature. In order to determine the current state of the science regarding available processing and analytical methods for detection and recovery of F. tularensis from water and soil matrices, a review of the literature was conducted. During the review, analysis via culture, immunoassays, and genomic identification were the most commonly found methods for F. tularensis detection within environmental samples. Other methods included combined culture and genomic analysis for rapid quantification of viable microorganisms and use of one assay to identify multiple pathogens from a single sample. Gaps in the literature that were identified during this review suggest that further work to integrate culture and genomic identification would advance our ability to detect and to assess the viability of Francisella spp. The optimization of DNA extraction, whole genome amplification with inhibition-resistant polymerases, and multiagent microarray detection would also advance biothreat detection.

  11. High accuracy position method based on computer vision and error analysis

    NASA Astrophysics Data System (ADS)

    Chen, Shihao; Shi, Zhongke

    2003-09-01

    High-accuracy positioning is becoming a research hotspot in the field of automatic control, and positioning is one of the most studied tasks in vision systems, so we address object locating with an image processing method. This paper describes a new high-accuracy positioning method based on a vision system. In the proposed method, an edge-detection filter is designed for a given running condition. The filter contains two main parts: an image-processing module, which implements edge detection and consists of multi-level self-adapting threshold segmentation, edge detection, and edge filtering; and an object-locating module, which determines the location of each object with high accuracy and is made up of median filtering and curve fitting. The paper gives an error analysis of the method to establish the feasibility of vision-based position detection. Finally, to verify the availability of the method, an example of positioning a worktable using the proposed method is given at the end of the paper. Results show that the method can accurately detect the position of the measured object and identify object attitude.

  12. Feature Transformation Detection Method with Best Spectral Band Selection Process for Hyper-spectral Imaging

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike; Brickhouse, Mark

    2015-11-01

    We present a newly developed feature transformation (FT) detection method for hyper-spectral imagery (HSI) sensors. In essence, the FT method, by transforming the original features (spectral bands) to a different feature domain, may considerably increase the statistical separation between the target and background probability density functions, and thus may significantly improve the target detection and identification performance, as evidenced by the test results in this paper. We show that by differentiating the original spectra, one can completely separate targets from the background using a single spectral band, leading to perfect detection results. In addition, we have proposed an automated best spectral band selection process with a double-threshold scheme that can rank the available spectral bands from the best to the worst for target detection. Finally, we have also proposed an automated cross-spectrum fusion process to further improve the detection performance in the lower spectral range (<1000 nm) by selecting the best spectral band pair with multivariate analysis. Promising detection performance has been achieved using a small background material signature library as a proof of concept, and has then been further evaluated and verified using a real background HSI scene collected by a HYDICE sensor.
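
    A minimal sketch of the two ideas named above, differentiating the spectra and ranking the resulting bands by target/background separation, is given below. It assumes only NumPy; the Fisher-style score and the synthetic spectra are stand-ins, since the paper's exact transformation and double-threshold ranking scheme are not reproduced here.

    ```python
    import numpy as np

    def rank_bands(target, background):
        """Differentiate the spectra, then rank derivative bands by a
        Fisher-style separation score between target and background."""
        dt, db = np.diff(target, axis=1), np.diff(background, axis=1)
        score = (dt.mean(0) - db.mean(0)) ** 2 / (dt.var(0) + db.var(0) + 1e-12)
        return np.argsort(score)[::-1], score    # best derivative band first

    rng = np.random.default_rng(3)
    background = rng.normal(0.3, 0.02, (200, 120))
    target = rng.normal(0.3, 0.02, (50, 120))
    target[:, 60:] += 0.05     # a spectral step only the target material has
    order, score = rank_bands(target, background)
    print("best derivative band:", order[0])     # expect 59, at the step
    ```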

  13. Online anomaly detection in wireless body area networks for reliable healthcare monitoring.

    PubMed

    Salem, Osman; Liu, Yaning; Mehaoua, Ahmed; Boutaba, Raouf

    2014-09-01

    In this paper, we propose a lightweight approach for online detection of faulty measurements by analyzing the data collected from medical wireless body area networks. The proposed framework performs sequential data analysis using a smartphone as a base station and takes into account the constrained resources of the smartphone, such as processing power and storage capacity. The main objective is to raise alarms only when patients enter an emergency situation, and to discard false alarms triggered by faulty measurements or ill-behaved sensors. The proposed approach is based on Haar wavelet decomposition, nonseasonal Holt-Winters forecasting, and the Hampel filter for spatial and temporal analysis. Our objective is to reduce false alarms resulting from unreliable measurements and to reduce unnecessary healthcare interventions. We apply our proposed approach to a real physiological dataset. Our experimental results prove the effectiveness of our approach in achieving good detection accuracy with a low false alarm rate. The simplicity and the processing speed of our proposed framework make it useful and efficient for real-time diagnosis.
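
    Of the three building blocks named above, the Hampel filter is the simplest to make concrete: a point is declared an outlier when it lies more than a few robust (median-absolute-deviation) scales from the median of a sliding window. The sketch below assumes only NumPy and operates on a univariate series; the window size, threshold, and toy heart-rate trace are illustrative rather than the authors' parameters.

    ```python
    import numpy as np

    def hampel(x, half_window=5, n_sigmas=3.0):
        """Replace points lying more than n_sigmas robust (MAD-based)
        deviations from the sliding-window median with that median,
        and flag them as faulty."""
        y, flags = x.copy(), np.zeros(x.size, dtype=bool)
        k = 1.4826                   # scales MAD to sigma for Gaussian data
        for i in range(x.size):
            lo, hi = max(0, i - half_window), min(x.size, i + half_window + 1)
            med = np.median(x[lo:hi])
            mad = k * np.median(np.abs(x[lo:hi] - med))
            if np.abs(x[i] - med) > n_sigmas * mad:
                y[i], flags[i] = med, True
        return y, flags

    hr = np.array([72, 71, 73, 180, 72, 74, 73, 71], dtype=float)  # one fault
    clean, faulty = hampel(hr, half_window=3)
    print(clean.astype(int), faulty)
    ```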

  14. Detailed analysis of complex single molecule FRET data with the software MASH

    NASA Astrophysics Data System (ADS)

    Hadzic, Mélodie C. A. S.; Kowerko, Danny; Börner, Richard; Zelger-Paulus, Susann; Sigel, Roland K. O.

    2016-04-01

    The processing and analysis of surface-immobilized single molecule FRET (Förster resonance energy transfer) data follows systematic steps (e.g. single molecule localization, clearance of different sources of noise, selection of the conformational and kinetic model, etc.) that require a solid knowledge of optics, photophysics, signal processing, and statistics. The present proceeding aims at standardizing and facilitating procedures for single molecule detection by guiding the reader through an optimization protocol for a particular experimental data set. Relevant features were determined from single molecule movies (SMM) of synthetically recreated Cy3- and Cy5-labeled Sc.ai5γ group II intron molecules to test the performance of four different detection algorithms. Up to 120 different parameterizations per method were routinely evaluated to finally establish an optimum detection procedure. The present protocol is adaptable to any movie displaying surface-immobilized molecules, and can be easily reproduced with our home-written software MASH (multifunctional analysis software for heterogeneous data) and script routines (both available in the download section of www.chem.uzh.ch/rna).

  15. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    NASA Astrophysics Data System (ADS)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level in these detections and the completeness of our candidate search.

  16. MO-D-213-02: Quality Improvement Through a Failure Mode and Effects Analysis of Pediatric External Beam Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, J; Lukose, R; Bronson, J

    2015-06-15

    Purpose: To conduct a failure mode and effects analysis (FMEA), as per AAPM Task Group 100, of clinical processes associated with teletherapy, and to develop mitigations for processes with identified high risk. Methods: A FMEA was conducted on clinical processes relating to teletherapy treatment plan development and delivery. Nine major processes were identified for analysis. These steps included CT simulation, data transfer, image registration and segmentation, treatment planning, plan approval and preparation, and initial and subsequent treatments. Process tree mapping was utilized to identify the steps contained within each process. Failure modes (FM) were identified and evaluated on a scale of 1–10 based upon three metrics: the severity of the effect, the probability of occurrence, and the detectability of the cause. The metrics were scored as follows: severity – no harm = 1, lethal = 10; probability – not likely = 1, certainty = 10; detectability – always detected = 1, undetectable = 10. The three metrics were combined multiplicatively to determine the risk priority number (RPN), which defined the overall score for each FM and the order in which process modifications should be deployed. Results: Eighty-nine procedural steps were identified, with 186 FM accompanied by 193 failure effects with 213 potential causes. Eighty-one of the FM were scored with an RPN > 10, and mitigations were developed for these FM. The initial treatment had the most FM requiring mitigation development (16), followed closely by treatment planning, segmentation, and plan preparation with fourteen each. The maximum RPN was 400 and involved target delineation. Conclusion: The FMEA process proved extremely useful in identifying previously unforeseen risks. New methods were developed and implemented for risk mitigation and error prevention. Similar to findings reported for adult patients, the process leading to the initial treatment carries the highest risk.
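
    The multiplicative scoring is easy to state in code. In the sketch below the failure modes and their scores are invented placeholders, not data from the study; only the severity x occurrence x detectability product and the RPN > 10 mitigation rule follow the abstract.

    ```python
    # Each entry: (description, severity, occurrence, detectability), scored 1-10.
    # These failure modes and scores are illustrative, not the study's data.
    failure_modes = [
        ("target delineation error", 10, 5, 8),
        ("wrong image set registered", 9, 2, 3),
        ("setup shift at first treatment", 6, 4, 4),
    ]

    def rpn(severity, occurrence, detectability):
        """Risk priority number: the three metrics combined multiplicatively."""
        return severity * occurrence * detectability

    for name, s, o, d in sorted(failure_modes, key=lambda fm: -rpn(*fm[1:])):
        score = rpn(s, o, d)
        action = "mitigate" if score > 10 else "accept"
        print(f"{name:32s} RPN={score:3d} -> {action}")
    ```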

  17. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    PubMed

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. At present, human inspection is often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there are generally six different types of defects, originating from various stages of tile manufacturing lines and having distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as the wavelet transform, filtering, morphology, and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks, and model-based algorithms, can be applied to extract surface defects. Statistical methods are often appropriate for identifying large defects such as spots, whereas techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    The emergence of spectral pixel-level color filters has enabled the development of hyper-spectral Full Motion Video (FMV) sensors operating in visible (EO) and infrared (IR) wavelengths. This new class of hyper-spectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time, while simultaneously providing an operator the benefit of enhanced-discrimination color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g. detection or classification. The second is presentation of the video to an operator, which can offer the best display of the content depending on the performed task, e.g. providing spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel, or they can utilize each other's results. The spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has been explored scarcely. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several of its concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to the combined multi-frame and multi-band processing.

  19. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions

    PubMed Central

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R

    2005-01-01

    Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities. PMID:15805453

  20. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions.

    PubMed

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R E

    2005-04-01

    Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities.

  1. An image processing pipeline to detect and segment nuclei in muscle fiber microscopic images.

    PubMed

    Guo, Yanen; Xu, Xiaoyin; Wang, Yuanyuan; Wang, Yaming; Xia, Shunren; Yang, Zhong

    2014-08-01

    Muscle fiber images play an important role in the medical diagnosis and treatment of many muscular diseases. The number of nuclei in skeletal muscle fiber images is a key bio-marker in the diagnosis of muscular dystrophy. In nuclei segmentation, one primary challenge is to correctly separate clustered nuclei. In this article, we developed an image processing pipeline to automatically detect, segment, and analyze nuclei in microscopic images of muscle fibers. The pipeline consists of image pre-processing, identification of isolated nuclei, identification and segmentation of clustered nuclei, and quantitative analysis. Nuclei are initially extracted from the background by using a local Otsu threshold. Based on analysis of morphological features of the isolated nuclei, including their areas, compactness, and major axis lengths, a Bayesian network is trained and applied to distinguish isolated nuclei from clustered nuclei and artifacts in all the images. Then a two-step refined watershed algorithm is applied to segment clustered nuclei. After segmentation, the nuclei can be quantified for statistical analysis. Comparing the segmented results with those of manual analysis and an existing technique, we find that our proposed image processing pipeline achieves good performance with high accuracy and precision. The presented image processing pipeline can therefore help biologists increase their throughput and objectivity in analyzing large numbers of nuclei in muscle fiber images. © 2014 Wiley Periodicals, Inc.
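
    A minimal sketch of the segmentation core, assuming scikit-image and SciPy are available: a global Otsu threshold stands in for the paper's local Otsu step, and a single distance-transform watershed stands in for the two-step refined watershed; the Bayesian-network classification of isolated versus clustered nuclei is omitted.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def segment_nuclei(gray):
        """Threshold the nuclei, then split touching nuclei with a
        distance-transform watershed seeded at distance peaks."""
        mask = gray > threshold_otsu(gray)
        distance = ndi.distance_transform_edt(mask)
        peaks = peak_local_max(distance, min_distance=7, labels=mask.astype(int))
        markers = np.zeros(gray.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        return watershed(-distance, markers, mask=mask)

    # Toy image: two overlapping bright disks stand in for clustered nuclei.
    yy, xx = np.mgrid[0:64, 0:64]
    img = ((np.hypot(yy - 30, xx - 24) < 10)
           | (np.hypot(yy - 30, xx - 38) < 10)).astype(float)
    labels = segment_nuclei(img)
    print("nuclei found:", labels.max())    # expect 2
    ```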

  2. Streak detection and analysis pipeline for space-debris optical images

    NASA Astrophysics Data System (ADS)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest in detecting fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects at lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for both scenarios for the bright streaks (SNR > 1), while in the low-SNR regime, the sensitivity is still 50% at SNR = 0.5.

  3. Single quantum dot analysis enables multiplexed point mutation detection by gap ligase chain reaction.

    PubMed

    Song, Yunke; Zhang, Yi; Wang, Tza-Huei

    2013-04-08

    Gene point mutations present important biomarkers for genetic diseases. However, existing point mutation detection methods suffer from low sensitivity, low specificity, and tedious assay processes. In this report, an assay technology is proposed which combines the outstanding specificity of gap ligase chain reaction (Gap-LCR), the high sensitivity of single-molecule coincidence detection, and the superior optical properties of quantum dots (QDs) for multiplexed detection of point mutations in genomic DNA. Mutant-specific ligation products are generated by Gap-LCR and subsequently captured by QDs to form DNA-QD nanocomplexes that are detected by single-molecule spectroscopy (SMS) through multi-color fluorescence burst coincidence analysis, allowing for multiplexed mutation detection in a separation-free format. The proposed assay is capable of detecting zeptomoles of KRAS codon 12 mutation variants with near 100% specificity. Its high sensitivity allows direct detection of KRAS mutation in crude genomic DNA without PCR pre-amplification. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Appropriate Handling, Processing and Analysis of Blood Samples Is Essential to Avoid Oxidation of Vitamin C to Dehydroascorbic Acid

    PubMed Central

    Pullar, Juliet M.; Carr, Anitra C.

    2018-01-01

    Vitamin C (ascorbate) is the major water-soluble antioxidant in plasma and its oxidation to dehydroascorbic acid (DHA) has been proposed as a marker of oxidative stress in vivo. However, controversy exists in the literature around the amount of DHA detected in blood samples collected from various patient cohorts. In this study, we report on DHA concentrations in a selection of different clinical cohorts (diabetes, pneumonia, cancer, and critically ill). All clinical samples were collected into EDTA anticoagulant tubes and processed at 4 °C prior to storage at −80 °C for subsequent analysis by HPLC with electrochemical detection. We also investigated the effects of different handling and processing conditions on short-term and long-term ascorbate and DHA stability in vitro and in whole blood and plasma samples. These conditions included metal chelation, anticoagulants (EDTA and heparin), and processing temperatures (ice, 4 °C and room temperature). Analysis of our clinical cohorts indicated very low to negligible DHA concentrations. Samples exhibiting haemolysis contained significantly higher concentrations of DHA. Metal chelation inhibited oxidation of vitamin C in vitro, confirming the involvement of contaminating metal ions. Although EDTA is an effective metal chelator, complexes with transition metal ions are still redox active, thus its use as an anticoagulant can facilitate metal ion-dependent oxidation of vitamin C in whole blood and plasma. Handling and processing blood samples on ice (or at 4 °C) delayed oxidation of vitamin C by a number of hours. A review of the literature regarding DHA concentrations in clinical cohorts highlighted the fact that studies using colourimetric or fluorometric assays reported significantly higher concentrations of DHA compared to those using HPLC with electrochemical detection. In conclusion, careful handling and processing of samples, combined with appropriate analysis, is crucial for accurate determination of ascorbate and DHA in clinical samples. PMID:29439480

  5. [Advances in automatic detection technology for images of thin blood film of malaria parasite].

    PubMed

    Juan-Sheng, Zhang; Di-Qiang, Zhang; Wei, Wang; Xiao-Guang, Wei; Zeng-Guo, Wang

    2017-05-05

    This paper reviews the computer vision and image analysis studies aiming at automated diagnosis or screening of malaria in microscope images of thin blood film smears. After introducing the background and significance of automatic detection technology, the existing detection technologies are summarized and divided into several steps, including image acquisition, pre-processing, morphological analysis, segmentation, counting, and pattern classification components. Then, the principles and implementation methods of each step are given in detail. In addition, the extension of automatic detection technology to thick blood film smears is put forward as a question worthy of study, and a perspective on future work toward the realization of automated microscopy diagnosis of malaria is provided.

  6. Analysis of digital communication signals and extraction of parameters

    NASA Astrophysics Data System (ADS)

    Al-Jowder, Anwar

    1994-12-01

    The signal classification performance of four types of electronic support measure (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, where one of these is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNRs).
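
    To make the comparison concrete, here is a minimal sketch of FFT-based narrowband energy detection applied to a BPSK signal in additive white Gaussian noise at a chosen SNR. It assumes only NumPy; the median noise-floor estimate and the 10 dB threshold are illustrative choices, not taken from the thesis.

    ```python
    import numpy as np

    def narrowband_energy_detect(x, fs, nfft=1024, threshold_db=10.0):
        """FFT-based detector: flag bins whose power exceeds the median
        noise floor by threshold_db, returning their frequencies in Hz."""
        psd = np.abs(np.fft.rfft(x, nfft)) ** 2
        floor = np.median(psd)
        hits = np.flatnonzero(10 * np.log10(psd / floor) > threshold_db)
        return hits * fs / nfft

    # BPSK at 1 kHz in white Gaussian noise at 10 dB SNR.
    rng = np.random.default_rng(4)
    fs, n = 8000.0, 4096
    t = np.arange(n) / fs
    bits = rng.choice([-1.0, 1.0], n // 16).repeat(16)   # 16 samples/symbol
    signal = bits * np.cos(2 * np.pi * 1000.0 * t)
    noise = rng.standard_normal(n) * np.sqrt(signal.var() / 10 ** (10 / 10))
    print(narrowband_energy_detect(signal + noise, fs))  # bins near 1000 Hz
    ```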

  7. A Candidate Functional Architecture Design for the Detection and Monitoring Process of a Counterdrug Joint Task Force

    DTIC Science & Technology

    1993-06-01

    completes the functional decomposition of the detection and monitoring requirements of the Counterdrug JTF. David Marca in his text SADT, Structural... September 1992. 12. Marca, D., McGowan, C., SADT: Structured Analysis and Design Technique, McGraw-Hill, 1988. 13. United States Department of

  8. Differentiation of cotton from other crops at different growth stages using spectral properties and discriminant analysis

    USDA-ARS?s Scientific Manuscript database

    Timely detection and remediation of volunteer cotton plants in both cultivated and non-cultivated habitats is critical for completing boll weevil eradication in Central and South Texas. However, timely detection of cotton plants over large areas and habitats is a challenging process. The spectral ...

  9. A Detection-Theoretic Model of Echo Inhibition

    ERIC Educational Resources Information Center

    Saberi, Kourosh; Petrosyan, Agavni

    2004-01-01

    A detection-theoretic analysis of the auditory localization of dual-impulse stimuli is described, and a model for the processing of spatial cues in the echo pulse is developed. Although for over 50 years "echo suppression" has been the topic of intense theoretical and empirical study within the hearing sciences, only a rudimentary understanding of…

  10. Application of reflectance spectroscopies (FTIR-ATR & FT-NIR) coupled with multivariate methods for robust in vivo detection of begomovirus infection in papaya leaves

    NASA Astrophysics Data System (ADS)

    Haq, Quazi M. I.; Mabood, Fazal; Naureen, Zakira; Al-Harrasi, Ahmed; Gilani, Sayed A.; Hussain, Javid; Jabeen, Farah; Khan, Ajmal; Al-Sabari, Ruqaya S. M.; Al-khanbashi, Fatema H. S.; Al-Fahdi, Amira A. M.; Al-Zaabi, Ahoud K. A.; Al-Shuraiqi, Fatma A. M.; Al-Bahaisi, Iman M.

    2018-06-01

    Nucleic acid- and serology-based methods have revolutionized plant disease detection; however, they are not very reliable at the asymptomatic stage, especially in the case of pathogens with systemic infection, and they need at least 1-2 days for sample harvesting, processing, and analysis. In this study, two reflectance spectroscopies, near-infrared reflectance spectroscopy (NIR) and Fourier-transform infrared spectroscopy with attenuated total reflection (FT-IR, ATR), coupled with multivariate exploratory methods, namely Principal Component Analysis (PCA) and Partial Least Squares Discriminant Analysis (PLS-DA), were deployed to detect begomovirus infection in papaya leaves. The application of these techniques demonstrates that they are very useful for robust in vivo detection of plant begomovirus infection. These methods are simple, sensitive, reproducible, and precise, and do not require any lengthy sample preparation procedures.
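
    A minimal sketch of the chemometric side, assuming scikit-learn is available: PCA supplies the exploratory scores, and PLS-DA is implemented, as is common, by regressing a 0/1 class label with PLSRegression and thresholding the prediction at 0.5. The synthetic "spectra" below are placeholders for the leaf reflectance data.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    # Synthetic stand-in for leaf spectra: 40 healthy, 40 infected samples,
    # with a small absorbance shift distinguishing the infected class.
    rng = np.random.default_rng(5)
    healthy = rng.normal(0.5, 0.03, (40, 200))
    infected = rng.normal(0.5, 0.03, (40, 200))
    infected[:, 80:120] += 0.04
    X = np.vstack([healthy, infected])
    y = np.array([0] * 40 + [1] * 40)

    # Exploratory view: PCA scores often separate the classes already.
    scores = PCA(n_components=2).fit_transform(X)

    # PLS-DA: regress the 0/1 class membership, threshold at 0.5.
    pls = PLSRegression(n_components=2).fit(X, y)
    pred = (pls.predict(X).ravel() > 0.5).astype(int)
    print("training accuracy:", (pred == y).mean())
    ```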

  11. An Adaptive and Time-Efficient ECG R-Peak Detection Algorithm.

    PubMed

    Qin, Qin; Li, Jianqing; Yue, Yinggao; Liu, Chengyu

    2017-01-01

    R-peak detection is crucial in electrocardiogram (ECG) signal analysis. This study proposed an adaptive and time-efficient R-peak detection algorithm for ECG processing. First, wavelet multiresolution analysis was applied to enhance the ECG signal representation. Then, the ECG was mirrored to convert large negative R-peaks to positive ones. After that, local maxima were calculated by the first-order forward differential approach and were pruned by amplitude and time-interval thresholds to locate the R-peaks. The algorithm performance, including detection accuracy and time consumption, was tested on the MIT-BIH arrhythmia database and the QT database. Experimental results showed that the proposed algorithm achieved mean sensitivity of 99.39%, positive predictivity of 99.49%, and accuracy of 98.89% on the MIT-BIH arrhythmia database and 99.83%, 99.90%, and 99.73%, respectively, on the QT database. By processing one ECG record, the mean time consumptions were 0.872 s and 0.763 s for the MIT-BIH arrhythmia database and QT database, respectively, yielding 30.6% and 32.9% time reductions compared to the traditional Pan-Tompkins method.

  12. An Adaptive and Time-Efficient ECG R-Peak Detection Algorithm

    PubMed Central

    Qin, Qin

    2017-01-01

    R-peak detection is crucial in electrocardiogram (ECG) signal analysis. This study proposed an adaptive and time-efficient R-peak detection algorithm for ECG processing. First, wavelet multiresolution analysis was applied to enhance the ECG signal representation. Then, the ECG was mirrored to convert large negative R-peaks to positive ones. After that, local maxima were calculated by the first-order forward differential approach and were pruned by amplitude and time-interval thresholds to locate the R-peaks. The algorithm performance, including detection accuracy and time consumption, was tested on the MIT-BIH arrhythmia database and the QT database. Experimental results showed that the proposed algorithm achieved mean sensitivity of 99.39%, positive predictivity of 99.49%, and accuracy of 98.89% on the MIT-BIH arrhythmia database and 99.83%, 99.90%, and 99.73%, respectively, on the QT database. By processing one ECG record, the mean time consumptions were 0.872 s and 0.763 s for the MIT-BIH arrhythmia database and QT database, respectively, yielding 30.6% and 32.9% time reductions compared to the traditional Pan-Tompkins method. PMID:29104745
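
    The detection logic of the two records above is compact enough to sketch. Below, assuming only NumPy, local maxima come from a first-order forward difference and are pruned by amplitude and time-interval (refractory) thresholds; the wavelet enhancement and mirroring steps are omitted, and the spike-train test signal and both threshold settings are illustrative values, not the authors'.

    ```python
    import numpy as np

    def detect_r_peaks(ecg, fs, refractory_s=0.25):
        """Local maxima from a first-order forward difference, pruned by
        amplitude and time-interval thresholds."""
        d = np.diff(ecg)
        maxima = np.flatnonzero((d[:-1] > 0) & (d[1:] <= 0)) + 1
        amp_thr = 0.6 * np.percentile(ecg[maxima], 98)  # amplitude threshold
        peaks, last = [], -np.inf
        for i in maxima[ecg[maxima] > amp_thr]:
            if (i - last) / fs > refractory_s:          # refractory threshold
                peaks.append(i)
                last = i
        return np.array(peaks)

    fs = 250
    ecg = np.zeros(10 * fs)
    ecg[::fs] = 1.0                                     # crude 60-bpm spikes
    ecg += 0.05 * np.random.default_rng(6).standard_normal(ecg.size)
    print("beats found:", detect_r_peaks(ecg, fs).size) # expect ~10
    ```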

  13. Rapid screening of basic colorants in processed vegetables through mass spectrometry using an interchangeable thermal desorption electrospray ionization source.

    PubMed

    Chao, Yu-Ying; Chen, Yen-Ling; Lin, Hong-Yi; Huang, Yeou-Lih

    2018-06-20

    Thermal desorption electrospray ionization/mass spectrometry (TD-ESI-MS) employing a quickly interchangeable ionization source is a relatively new ambient ionization mass spectrometric technique that has had, to date, only a limited number of applications related to food safety control. With reallocation of resources, this direct-analysis technique has had wider use in food analysis when operated in dual-working mode (pretreatment-free qualitative screening and conventional quantitative confirmation) after switching to an ambient ionization source from a traditional atmospheric pressure ionization source. Herein, we describe the benefits and challenges associated with the use of a TD-ESI source to detect adulterants in processed vegetables (PVs), as a proof-of-concept for the detection of basic colorants. While TD-ESI can offer direct qualitative screening analyses for PVs with detection capabilities lower than those provided with liquid chromatography/UV detection within 30 s, the use of TD-ESI for semi-quantification is applicable only for homogeneous food matrices. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Determination of dissolved-phase pesticides in surface water from the Yakima River basin, Washington, using the Goulden large-sample extractor and gas chromatography/mass spectrometer

    USGS Publications Warehouse

    Foster, Gregory D.; Gates, Paul M.; Foreman, William T.; McKenzie, Stuart W.; Rinella, Frank A.

    1993-01-01

    Concentrations of pesticides in the dissolved phase of surface water samples from the Yakima River basin, WA, were determined using preconcentration in the Goulden large-sample extractor (GLSE) and gas chromatography/mass spectrometry (GC/MS) analysis. Sample volumes ranging from 10 to 120 L were processed with the GLSE, and the results from the large-sample analyses were compared to those derived from 1-L continuous liquid-liquid extractions. Few of the 40 target pesticides were detected in 1-L samples, whereas large-sample preconcentration in the GLSE provided detectable levels for many of the target pesticides. The number of pesticides detected in GLSE-processed samples was usually directly proportional to sample volume, although the measured concentrations of the pesticides were generally lower at the larger sample volumes for the same water source. The GLSE can be used to provide lower detection levels relative to conventional liquid-liquid extraction in GC/MS analysis of pesticides in samples of surface water.

  15. Topological characterization and early detection of bifurcations and chaos in complex systems using persistent homology.

    PubMed

    Mittal, Khushboo; Gupta, Shalabh

    2017-05-01

    Early detection of bifurcations and chaos and understanding their topological characteristics are essential for safe and reliable operation of various electrical, chemical, physical, and industrial processes. However, the presence of non-linearity and high-dimensionality in system behavior makes this analysis a challenging task. The existing methods for dynamical system analysis provide useful tools for anomaly detection (e.g., the Bendixson-Dulac and Poincare-Bendixson criteria can detect the presence of limit cycles); however, they do not provide a detailed topological understanding of system evolution during bifurcations and chaos, such as the changes in the number of subcycles and their positions, lifetimes, and sizes. This paper addresses this research gap by using topological data analysis as a tool to study system evolution and develop a mathematical framework for detecting the topological changes in the underlying system using persistent homology. Using the proposed technique, topological features (e.g., the number of relevant k-dimensional holes, etc.) are extracted from nonlinear time series data, which are useful for deeper analysis of the system behavior and early detection of bifurcations and chaos. When applied to a Logistic map, a Duffing oscillator, and a real-life op-amp-based Jerk circuit, these features are shown to accurately characterize the system dynamics and detect the onset of chaos.
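
    A minimal sketch of the idea, assuming NumPy and the ripser package (pip install ripser) are available: delay-embed a scalar time series into a point cloud and read off the longest H1 (loop) lifetime, which is large for limit-cycle-like dynamics and small for noise. The embedding parameters and the test signals are illustrative, not the paper's settings.

    ```python
    import numpy as np
    from ripser import ripser   # pip install ripser

    def delay_embed(x, dim=3, tau=5):
        """Takens delay embedding: scalar series -> point cloud."""
        n = x.size - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

    def h1_max_lifetime(x):
        """Longest 1-dimensional persistence lifetime: a robust loop in the
        embedded cloud (e.g. a limit cycle) yields a large value."""
        dgm1 = ripser(delay_embed(x), maxdim=1)['dgms'][1]
        return float((dgm1[:, 1] - dgm1[:, 0]).max()) if dgm1.size else 0.0

    rng = np.random.default_rng(7)
    t = np.linspace(0, 20 * np.pi, 500)
    periodic = np.sin(t) + 0.1 * rng.standard_normal(t.size)
    noise = rng.standard_normal(t.size)
    print("periodic:", round(h1_max_lifetime(periodic), 3))  # large: loop
    print("noise   :", round(h1_max_lifetime(noise), 3))     # small: no cycle
    ```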

  16. eSensor: an electrochemical detection-based DNA microarray technology enabling sample-to-answer molecular diagnostics

    NASA Astrophysics Data System (ADS)

    Liu, Robin H.; Longiaru, Mathew

    2009-05-01

    DNA microarrays are becoming a widespread tool in life science and drug screening due to their many benefits of miniaturization and integration. Microarrays permit highly multiplexed DNA analysis. Recently, the development of new detection methods and simplified methodologies has rapidly expanded the use of microarray technologies from predominantly gene expression analysis into the arena of diagnostics. Osmetech's eSensor® is an electrochemical detection platform based on a low-to-medium density DNA hybridization array on a cost-effective printed circuit board substrate. eSensor® has been cleared by the FDA for warfarin sensitivity testing and cystic fibrosis carrier detection. Other genetic-based diagnostic and infectious disease detection tests are under development. The eSensor® platform eliminates the need for an expensive laser-based optical system and fluorescent reagents. It allows one to perform hybridization and detection in a single small instrument without any fluidic processing and handling. Furthermore, the eSensor® platform is readily adaptable to on-chip sample-to-answer genetic analyses using microfluidics technology. The eSensor® platform provides a cost-effective solution for direct sample-to-answer genetic analysis, and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.

  17. Revealing stable processing products from ribosome-associated small RNAs by deep-sequencing data analysis.

    PubMed

    Zywicki, Marek; Bakowska-Zywicka, Kamilla; Polacek, Norbert

    2012-05-01

    The exploration of the non-protein-coding RNA (ncRNA) transcriptome is currently focused on profiling of microRNA expression and detection of novel ncRNA transcription units. However, recent studies suggest that RNA processing can be a multi-layer process leading to the generation of ncRNAs of diverse functions from a single primary transcript. To date, no methodology has been presented to distinguish stable functional RNA species from rapidly degraded side products of nucleases. Thus the correct assessment of widespread RNA processing events is one of the major obstacles in transcriptome research. Here, we present a novel automated computational pipeline, named APART, providing a complete workflow for the reliable detection of RNA processing products from next-generation sequencing data. The major features include efficient handling of non-unique reads, detection of novel stable ncRNA transcripts and processing products, and annotation of known transcripts based on multiple sources of information. To disclose the potential of APART, we have analyzed a cDNA library derived from small ribosome-associated RNAs in Saccharomyces cerevisiae. By employing the APART pipeline, we were able to detect, and confirm by independent experimental methods, multiple novel stable RNA molecules differentially processed from well-known ncRNAs, like rRNAs, tRNAs, or snoRNAs, in a stress-dependent manner.

  18. Variable threshold method for ECG R-peak detection.

    PubMed

    Kew, Hsein-Ping; Jeong, Do-Un

    2011-10-01

    In this paper, a wearable belt-type ECG electrode, worn around the chest to measure the real-time ECG, was developed to minimize the inconvenience of wearing. The ECG signal is detected using a potential-measuring instrument system. The measured ECG signal is transmitted to a personal computer via an ultra-low-power wireless data communication unit using a Zigbee-compatible wireless sensor node. ECG signals carry a great deal of clinical information for a cardiologist, and R-peak detection is central to their analysis. R-peak detection generally uses a fixed threshold value; this produces detection errors when the baseline changes due to motion artifacts or when the signal amplitude changes. A preprocessing stage comprising differentiation and the Hilbert transform is used as the signal preprocessing algorithm. Thereafter, a variable threshold method is used to detect the R-peaks, which is more accurate and efficient than the fixed-threshold method. R-peak detection on the MIT-BIH databases and long-term real-time ECG recordings was performed in this research in order to evaluate the performance.

  19. Distributed Scene Analysis For Autonomous Road Vehicle Guidance

    NASA Astrophysics Data System (ADS)

    Mysliwetz, Birger D.; Dickmanns, E. D.

    1987-01-01

    An efficient distributed processing scheme has been developed for visual road boundary tracking by 'VaMoRs', a testbed vehicle for autonomous mobility and computer vision. Ongoing work described here is directed at improving the robustness of the road boundary detection process in the presence of shadows, ill-defined edges, and other disturbing real-world effects. The system structure and the techniques applied for real-time scene analysis are presented along with experimental results. All subfunctions of road boundary detection for vehicle guidance, such as edge extraction, feature aggregation, and camera pointing control, are executed in parallel by an onboard multiprocessor system. On the image processing level, local oriented edge extraction is performed in multiple 'windows', tightly controlled from a hierarchically higher, model-based level. The interpretation process, involving a geometric road model and the observer's relative position to the road boundaries, is capable of coping with ambiguity in measurement data. By using only selected measurements to update the model parameters, even high noise levels can be dealt with and misleading edges rejected.

  20. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    PubMed Central

    Qin, Kunming; Liu, Qidi; Cai, Hao; Cao, Gang; Lu, Tulin; Shen, Baojia; Shu, Yachun; Cai, Baochang

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat different diseases. Fructus Arctii, the dried fruit of Arctium lappa L. (Compositae), is widely used in TCM. Stir-frying is the most common processing method, which might modify the chemical composition of Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on the analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry. Results: The results indicated that there was less arctiin in stir-fried materials than in raw materials. However, there were higher levels of arctigenin in stir-fried materials than in raw materials. Conclusion: We suggest that arctiin content decreased significantly owing to the thermal conversion of arctiin to arctigenin. In conclusion, this finding may shed some light on understanding the differences in the therapeutic values of raw versus PFA in TCM. PMID:25422559

  1. Quantification of vocal fold motion using echography: application to recurrent nerve paralysis detection

    NASA Astrophysics Data System (ADS)

    Cohen, Mike-Ely; Lefort, Muriel; Bergeret-Cassagne, Héloïse; Hachi, Siham; Li, Ang; Russ, Gilles; Lazard, Diane; Menegaux, Fabrice; Leenhardt, Laurence; Trésallet, Christophe; Frouin, Frédérique

    2015-03-01

    Recurrent nerve paralysis (RP) is one of the most frequent complications of thyroid surgery. It reduces vocal fold mobility. Nasal endoscopy, a mini-invasive procedure, is the conventional way to detect RP. We suggest a new approach based on laryngeal ultrasound, with a specific data analysis designed to help with the automated detection of RP. Ten subjects were enrolled in this feasibility study: four controls, three patients with RP, and three patients without RP according to nasal endoscopy. The ultrasound protocol was based on a ten-second B-mode acquisition in a coronal plane during normal breathing. Image processing included three steps: 1) automated detection of two consecutive closing and opening images, corresponding to the extreme positions of the vocal folds in the sequence of B-mode images, using principal component analysis of the image sequence; 2) positioning of three landmarks and robust tracking of these points using a multi-pyramidal refined optical flow approach; 3) estimation of quantitative parameters indicating left and right fractions of mobility, and motion symmetry. Results provided by automated image processing were compared to those obtained by an expert. Detection of extreme images was accurate; tracking of landmarks was reliable in 80% of cases. Motion symmetry indices showed similar values for controls and patients without RP. The fraction of mobility was reduced in cases of RP. Thus, our CAD system helped in the detection of RP. Laryngeal ultrasound combined with appropriate image processing helped in the diagnosis of recurrent nerve paralysis and could be proposed as a first-line method.
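
    The tracking stage maps naturally onto standard tools. In the sketch below, pyramidal Lucas-Kanade optical flow from OpenCV stands in for the paper's multi-pyramidal refined tracker, and a drifting bright blob is a synthetic placeholder for the B-mode frames; the window size and pyramid depth are illustrative.

    ```python
    import numpy as np
    import cv2  # OpenCV; assumed available

    def track_landmarks(frames, points):
        """Track landmark points through an image sequence with pyramidal
        Lucas-Kanade optical flow; returns (n_frames, n_points, 2)."""
        pts = np.asarray(points, dtype=np.float32).reshape(-1, 1, 2)
        trajectory = [pts.reshape(-1, 2).copy()]
        for prev, nxt in zip(frames, frames[1:]):
            pts, status, _ = cv2.calcOpticalFlowPyrLK(
                prev, nxt, pts, None, winSize=(21, 21), maxLevel=3)
            trajectory.append(pts.reshape(-1, 2).copy())
        return np.stack(trajectory)

    # Synthetic sequence: a bright blob drifting right by 1 px per frame.
    frames = []
    for shift in range(10):
        img = np.zeros((128, 128), dtype=np.uint8)
        cv2.circle(img, (40 + shift, 64), 6, 255, -1)
        frames.append(img)
    traj = track_landmarks(frames, [(40, 64)])
    print(traj[0, 0], "->", traj[-1, 0])   # expect x to advance about 9 px
    ```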

  2. Enhancement of the laser-induced breakdown spectroscopy (LIBS) detection limit using a low-pressure and short-pulse laser-induced plasma process.

    PubMed

    Wang, Zhen Zhen; Deguchi, Yoshihiro; Kuwahara, Masakazu; Yan, Jun Jie; Liu, Ji Ping

    2013-11-01

    Laser-induced breakdown spectroscopy (LIBS) is an appealing technique compared with many other types of elemental analysis because of its fast response, high sensitivity, real-time operation, and noncontact features. One of the challenging targets of LIBS is the enhancement of the detection limit. In this study, the detection limit of gas-phase LIBS analysis has been improved by controlling the pressure and laser pulse width. In order to verify this method, low-pressure gas plasma was induced using nanosecond and picosecond lasers. The method was applied to the detection of Hg. The emission intensity ratio of the Hg atom to NO (IHg/INO) was analyzed to evaluate the LIBS detection limit, because the NO emission (interference signal) was formed during the plasma generation and cooling process of N2 and O2 in the air. It was demonstrated that the enhancement of IHg/INO was achieved by decreasing the pressure to a few kilopascals, and the IHg/INO of the picosecond breakdown was always much higher than that of the nanosecond breakdown at low buffer gas pressure. The enhancement of IHg/INO increased more than 10 times at 700 Pa using a picosecond laser with a 35 ps pulse width. The detection limit was enhanced to 0.03 ppm (parts per million). We also saw that the spectra from the center and edge parts of the plasma showed different features: in the picosecond laser breakdown process, IHg/INO of the edge spectra was higher than that of the central spectra.

  3. A comparative analysis of frequency modulation threshold extension techniques

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Loch, F. J.

    1970-01-01

    FM threshold extension for system performance improvement, comparing impulse noise elimination, correlation detection, and delta modulation signal processing techniques implemented at the demodulator output.

  4. Detection and measurement of the intracellular calcium variation in follicular cells.

    PubMed

    Herrera-Navarro, Ana M; Terol-Villalobos, Iván R; Jiménez-Hernández, Hugo; Peregrina-Barreto, Hayde; Gonzalez-Barboza, José-Joel

    2014-01-01

    This work presents a new method for measuring the variation of intracellular calcium in follicular cells. The proposal consists of two stages: (i) detection of the cells' nuclei and (ii) analysis of the fluorescence variations. The first stage is performed via a modified watershed transformation in which the labeling process is controlled. The detection process uses the contours of the cells as descriptors, enhanced with a morphological filter that homogenizes the luminance variation of the image. In the second stage, the fluorescence variations are modeled as an exponentially decreasing function, since these variations are highly correlated with the changes in intracellular free Ca(2+). Additionally, a new morphological filter, called the medium reconstruction process, is introduced to enhance the data for the modeling stage. This filter exploits the under-modeling and over-modeling properties of reconstruction operators such that it preserves the structure of the original signal. Finally, an experimental process shows evidence of the capabilities of the proposal.

  5. Detection and Measurement of the Intracellular Calcium Variation in Follicular Cells

    PubMed Central

    Herrera-Navarro, Ana M.; Terol-Villalobos, Iván R.; Jiménez-Hernández, Hugo; Peregrina-Barreto, Hayde; Gonzalez-Barboza, José-Joel

    2014-01-01

    This work presents a new method for measuring the variation of intracellular calcium in follicular cells. The proposal consists of two stages: (i) detection of the cells' nuclei and (ii) analysis of the fluorescence variations. The first stage is performed via a modified watershed transformation in which the labeling process is controlled. The detection process uses the contours of the cells as descriptors, enhanced with a morphological filter that homogenizes the luminance variation of the image. In the second stage, the fluorescence variations are modeled as an exponentially decreasing function, since these variations are highly correlated with the changes in intracellular free Ca2+. Additionally, a new morphological filter, called the medium reconstruction process, is introduced to enhance the data for the modeling stage. This filter exploits the under-modeling and over-modeling properties of reconstruction operators such that it preserves the structure of the original signal. Finally, an experimental process shows evidence of the capabilities of the proposal. PMID:25342958
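
    The temporal model in the two records above is a plain exponential decay, which can be fitted per cell once a nucleus has been segmented. Below is a minimal sketch assuming NumPy and SciPy; the trace, the parameter values, and the decay helper are synthetic illustrations, not data from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, a, k, c):
        """Exponential model for the fluorescence trace: F(t) = a*exp(-k*t) + c."""
        return a * np.exp(-k * t) + c

    # Synthetic fluorescence trace for one nucleus (arbitrary units).
    rng = np.random.default_rng(8)
    t = np.linspace(0, 30, 120)
    trace = decay(t, a=80.0, k=0.15, c=12.0) + rng.normal(0, 1.5, t.size)

    (a, k, c), _ = curve_fit(decay, t, trace, p0=(trace.max(), 0.1, trace.min()))
    print(f"decay rate k = {k:.3f} per unit time")  # proxy for Ca2+ dynamics
    ```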

  6. A method based on coffee-ring deposition and confocal Raman spectroscopy for analysis of melamine in milk

    NASA Astrophysics Data System (ADS)

    Tan, Zong; Chen, Da

    2016-10-01

    In this work, an economical and high-efficiency method for the detection of melamine in milk was developed. The enrichment effect of the coffee ring was combined with the micro-region analysis of confocal Raman spectroscopy, assisted by a chemometric algorithm. The resulting limit of detection (LOD) for melamine was 1 ppm, a notable result given that the sensitivity of conventional Raman detection is generally low. Furthermore, the whole process was performed under easily available conditions with almost no consumption of chemical reagents, and the substrates chosen for coffee-ring formation were reusable. Thus, the method is environmentally friendly and has great potential for application in food safety inspection.

  7. Study of interhemispheric asymmetries in electroencephalographic signals by frequency analysis

    NASA Astrophysics Data System (ADS)

    Zapata, J. F.; Garzón, J.

    2011-01-01

    This study provides a new method for the detection of interhemispheric asymmetries in patients under continuous video-electroencephalography (EEG) monitoring in the Intensive Care Unit (ICU), using wavelet energy. EEG signals were recorded in 42 patients with different pathologies and processed in Matlab; the abnormalities recorded in the neurophysiologist's report, the images of each patient, and the results of signal analysis with the discrete wavelet transform (DWT) were then compared. Conclusions: there is correspondence between the abnormalities found in the processing of the signals and the clinical findings reported for the patients; accordingly, the methodology used can be a useful tool for the diagnosis and early quantitative detection of interhemispheric asymmetries.

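    A compact sketch of the wavelet-energy comparison described above follows (an illustration of the general idea, not the study's Matlab code): compute the energy of DWT detail coefficients for homologous left/right channels and form a normalized asymmetry index. The channel data, wavelet choice, and decomposition level are assumptions.

    ```python
    # Illustrative sketch: compare wavelet energy of left- vs right-hemisphere
    # EEG channels with a discrete wavelet transform. The synthetic epochs and
    # the db4/level-5 choices are assumptions for illustration.
    import numpy as np
    import pywt

    def wavelet_energy(signal, wavelet="db4", level=5):
        """Total energy of the DWT detail coefficients of one EEG channel."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return sum(np.sum(c ** 2) for c in coeffs[1:])   # skip the approximation

    rng = np.random.default_rng(0)
    left_c3 = rng.standard_normal(2048)          # stand-in for a real C3 epoch
    right_c4 = 0.5 * rng.standard_normal(2048)   # attenuated to mimic asymmetry

    e_left, e_right = wavelet_energy(left_c3), wavelet_energy(right_c4)
    asymmetry = (e_left - e_right) / (e_left + e_right)
    print(f"asymmetry index = {asymmetry:+.2f}")  # near 0 would mean symmetric
    ```
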
  8. Streak detection and analysis pipeline for optical images

    NASA Astrophysics Data System (ADS)

    Virtanen, J.; Granvik, M.; Torppa, J.; Muinonen, K.; Poikonen, J.; Lehti, J.; Säntti, T.; Komulainen, T.; Flohrer, T.

    2014-07-01

    We describe a novel data processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models, and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest in detecting fainter objects corresponding to the small end of the size distribution. We focus on the low signal-to-noise (SNR) detection of objects with high angular velocities, which result in long and faint object trails, or streaks, in the optical images. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly and, particularly for satellites, within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a 'track-before-detect' problem, resulting in streaks of arbitrary lengths. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, algorithms are not readily available yet. In the ESA-funded StreakDet (Streak detection and astrometric reduction) project, we develop and evaluate an automated processing pipeline applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The algorithmic flow starts from the segmentation of the acquired image (i.e., the extraction of all sources), followed by the astrometric and photometric characterization of the candidate streaks, and ends with orbital validation of the detected streaks. For the low-SNR extraction of objects, we put forward an approach that does not rely on a priori information, such as the object velocities, a typical assumption in earlier implementations. Our algorithm is based on local grayscale mean difference evaluation, followed by a threshold operation and spatial filtering of black-and-white (1-bit) data to remove stars and other non-streak features. For long streaks, the challenge is to extract position information and related registered epochs with sufficient precision. Moreover, satellite streaks can show up in complex morphologies because of their fast and often irregular lightcurve variations. A central concept of the pipeline is streak classification, which guides the actual characterization process by aiming to identify the interesting sources and to filter out the uninteresting ones, as well as by allowing the tailoring of algorithms for specific streak classes (e.g., PSF fitting for point-like vs. long, disintegrated streaks). Finally, to validate the single-image detections, the processing is finalized by orbital analysis using our statistical inverse methods (see Muinonen et al., this conference), resulting in preliminary orbital classification (e.g., Earth-bound vs. non-Earth-bound orbits) for the detected streaks.

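    The low-SNR extraction step described above (local grayscale mean difference, thresholding to 1-bit data, and spatial filtering to remove stars) can be sketched in a few lines. The version below is a rough paraphrase of that flow, not the StreakDet implementation; the window size, threshold factor, and size criterion are assumptions.

    ```python
    # Rough sketch of streak-candidate extraction: local mean-difference
    # evaluation, thresholding to a 1-bit image, and size filtering to suppress
    # point-like stars. All parameters are illustrative assumptions.
    import numpy as np
    from scipy import ndimage

    def extract_streak_candidates(image, window=15, k=3.0, min_pixels=50):
        local_mean = ndimage.uniform_filter(image, size=window)
        diff = image - local_mean                     # local grayscale difference
        binary = diff > k * diff.std()                # threshold to 1-bit data
        labels, n = ndimage.label(binary)             # connected components
        sizes = ndimage.sum(binary, labels, range(1, n + 1))
        keep = np.isin(labels, np.nonzero(sizes >= min_pixels)[0] + 1)
        return keep                                   # small (star-like) blobs removed

    frame = np.random.default_rng(1).normal(100, 5, (256, 256))  # fake sky frame
    frame[100:103, 40:200] += 30                                 # synthetic streak
    mask = extract_streak_candidates(frame)
    print(mask.sum(), "candidate streak pixels")
    ```
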
  9. Combining vibrational biomolecular spectroscopy with chemometric techniques for the study of response and sensitivity of molecular structures/functional groups mainly related to lipid biopolymer to various processing applications.

    PubMed

    Yu, Gloria Qingyu; Yu, Peiqiang

    2015-09-01

    The objectives of this project were to (1) combine vibrational spectroscopy with chemometric multivariate techniques to determine the effect of processing applications on molecular structural changes of lipid biopolymer, mainly related to functional groups, in green- and yellow-type Crop Development Centre (CDC) pea varieties [CDC strike (green-type) vs. CDC meadow (yellow-type)] that occurred during various processing applications; (2) relatively quantify the effect of processing applications on the antisymmetric CH3 ("CH3as") and CH2 ("CH2as") (ca. 2960 and 2923 cm(-1), respectively) and symmetric CH3 ("CH3s") and CH2 ("CH2s") (ca. 2873 and 2854 cm(-1), respectively) functional groups and the carbonyl C=O ester (ca. 1745 cm(-1)) spectral intensities, as well as the ratios of antisymmetric CH3 to antisymmetric CH2 (CH3as to CH2as), symmetric CH3 to symmetric CH2 (CH3s to CH2s), and carbonyl C=O ester peak area to total CH peak area (C=O ester to CH); and (3) illustrate non-invasive techniques to detect the sensitivity of individual molecular functional groups to the various processing applications in the recently developed types of pea varieties. The hypothesis of this research was that processing applications modify the molecular structure profiles in the processed products as opposed to the original unprocessed pea seeds. The results showed that the different processing methods had different impacts on lipid molecular functional groups, and that different lipid functional groups had different sensitivities to the various heat processing applications. These changes were detected by advanced molecular spectroscopy with chemometric techniques and may be highly related to lipid utilization and availability. The multivariate molecular spectral analyses, cluster analysis (CLA), and principal component analysis (PCA) of the original spectra (without spectral parameterization) were unable to fully distinguish the structural differences in the antisymmetric and symmetric CH3 and CH2 spectral region (ca. 3001-2799 cm(-1)) and the carbonyl C=O ester band region (ca. 1771-1714 cm(-1)). This result indicated that the sensitivity to detect treatment differences by multivariate CLA and PCA might be lower than that of univariate molecular spectral analysis. In the future, more sensitive techniques such as discriminant analysis could be considered for discriminating and classifying structural differences. Molecular spectroscopy can be used as a non-invasive technique to study processing-induced structural changes related to lipid compounds in legume seeds.

  10. U.S. data processing for the IRAS project. [by Jet Propulsion Laboratory Scientific Data Analysis System

    NASA Technical Reports Server (NTRS)

    Duxbury, J. H.

    1983-01-01

    The JPL Scientific Data Analysis System (SDAS), which will process IRAS data and produce a catalogue of perhaps a million infrared sources in the sky, as well as other information for astronomical records, is described. The purposes of SDAS are discussed, and the major SDAS processors are shown in a block diagram. The catalogue processing is addressed, mentioning the basic processing steps that will be applied to raw detector data. Signal reconstruction and conversion to astrophysical units, source detection, source confirmation, data management, and survey data products are considered in detail.

  11. Foreign object detection and removal to improve automated analysis of chest radiographs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime

    2013-07-15

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassier clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.

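    The detection stage described above is per-pixel kNN classification followed by probability thresholding. A minimal sketch of that idea with scikit-learn follows; the two pixel features (raw intensity and local contrast) and all sizes are illustrative assumptions, not the authors' feature set.

    ```python
    # Minimal sketch of per-pixel kNN classification with scikit-learn. Real
    # systems would use richer pixel features; the two features here are
    # illustrative assumptions only.
    import numpy as np
    from scipy import ndimage
    from sklearn.neighbors import KNeighborsClassifier

    def pixel_features(image):
        contrast = image - ndimage.uniform_filter(image, size=9)
        return np.stack([image.ravel(), contrast.ravel()], axis=1)

    rng = np.random.default_rng(2)
    train_img = rng.normal(0.4, 0.05, (64, 64))
    train_img[20:30, 20:30] = 0.95             # bright "foreign object" patch
    labels = np.zeros((64, 64), dtype=int)
    labels[20:30, 20:30] = 1

    knn = KNeighborsClassifier(n_neighbors=15)
    knn.fit(pixel_features(train_img), labels.ravel())

    test_img = rng.normal(0.4, 0.05, (64, 64))
    test_img[40:45, 10:15] = 0.9
    proba = knn.predict_proba(pixel_features(test_img))[:, 1].reshape(64, 64)
    object_mask = proba > 0.5                  # threshold, then group pixels
    print("detected object pixels:", object_mask.sum())
    ```
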
  12. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  13. Highlight on Bottlenecks in Food Allergen Analysis: Detection and Quantification by Mass Spectrometry.

    PubMed

    Planque, Mélanie; Arnould, Thierry; Renard, Patricia; Delahaut, Philippe; Dieu, Marc; Gillard, Nathalie

    2017-07-01

    Food laboratories have developed methods for testing allergens in foods. The efficiency of qualitative and quantitative methods is of prime importance in protecting allergic populations. Unfortunately, food laboratories encounter barriers to developing efficient methods. Bottlenecks include the lack of regulatory thresholds, delays in the emergence of reference materials and guidelines, and the need to detect processed allergens. In this study, ultra-HPLC coupled to tandem MS was used to illustrate difficulties encountered in determining method performances. We measured the major influences of both processing and matrix effects on the detection of egg, milk, soy, and peanut allergens in foodstuffs. The main goals of this work were to identify difficulties that food laboratories still encounter in detecting and quantifying allergens and to sensitize researchers to them.

  14. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, based on real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. The methodology was then tested on a set of five batches in which disturbances of different natures were imposed to simulate real faulty situations. Some of these batches were used to optimize the model, while the remaining ones were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T 2 and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.

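    For readers unfamiliar with the two control charts named above, the sketch below shows how Hotelling's T2 and the squared prediction error (SPE, also called the Q statistic) can be computed from a PCA model fitted to normal-operation data. The synthetic data and the simple percentile control limits are assumptions for illustration; formal limits are usually derived from F and chi-squared approximations.

    ```python
    # Sketch of PCA-based MSPC statistics (Hotelling's T^2 and SPE/Q) for fault
    # detection. Data are synthetic; control limits are empirical percentiles.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    X_normal = rng.normal(size=(200, 50))           # normal-operation "spectra"
    pca = PCA(n_components=3).fit(X_normal)

    def t2_and_spe(x):
        t = pca.transform(x)
        t2 = np.sum(t ** 2 / pca.explained_variance_, axis=1)  # Hotelling's T^2
        residual = x - pca.inverse_transform(t)
        spe = np.sum(residual ** 2, axis=1)                    # SPE (Q statistic)
        return t2, spe

    t2_ref, spe_ref = t2_and_spe(X_normal)
    t2_lim, spe_lim = np.percentile(t2_ref, 99), np.percentile(spe_ref, 99)

    x_new = rng.normal(size=(1, 50)) + 1.5          # disturbed sample (mean shift)
    t2, spe = t2_and_spe(x_new)
    print(f"T2 = {t2[0]:.1f} (limit {t2_lim:.1f}), SPE = {spe[0]:.1f} (limit {spe_lim:.1f})")
    print("fault flagged:", t2[0] > t2_lim or spe[0] > spe_lim)
    ```
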
  15. Close-in detection system for the Mine Hunter/Killer program

    NASA Astrophysics Data System (ADS)

    Bishop, Steven S.; Campana, Stephen B.; Lang, David A.; Wiggins, Carl M.

    2000-08-01

    The Close-in Detection (CID) System is the vehicle-mounted multisensor landmine detection system for the Army CECOM Night Vision Electronic Sensors Directorate (NVESD) Mine Hunter/Killer (MH/K) Program. The CID System is being developed by BAE Systems in San Diego, CA. TRW Systems and Information Technology Group in Arlington, VA and a team of specialists from ERIM, E-OIR, SNL, and APL/JHU support NVESD in the development, analysis, and testing of the CID and the associated signal and data processing. The CID System includes two down-looking sensor arrays: a ground-penetrating radar (GPR) array and a set of electromagnetic induction (EMI) coils for metal detection. These arrays span a 3-meter wide swath in front of a high mobility, multipurpose wheeled vehicle. The system also includes a forward-looking IR imaging system mounted on the roof of the vehicle and covering a swath of the road ahead of the vehicle. Signals from each sensor are processed separately to detect and localize objects of interest. Features of candidate objects are integrated in a processor that uses them to discriminate between anti-tank mines and clutter. Mine locations are passed to the neutralization subsystem of MH/K. This paper reviews the design of the sensors and signal processing of the CID system and gives examples and analysis of recent test results at the NVESD mine lanes. The strengths and weaknesses of each sensor are discussed, and the application of multisensor fusion is illustrated.

  16. [Detection of genetically modified soy (Roundup-Ready) in processed food products].

    PubMed

    Hagen, M; Beneke, B

    2000-01-01

    In this study, the application of a qualitative and a quantitative method of analysis to detect genetically modified RR-Soy (Roundup-Ready soy) in processed foods is described. A total of 179 various products containing soy, such as baby food and diet products, soy drinks and desserts, tofu and tofu products, soy-based meat substitutes, soy protein, breads, flour, granules, cereals, noodles, soy bean sprouts, fats and oils, as well as condiments, were investigated following the pattern of the section 35 LMBG-method L 23.01.22-1. The DNA was extracted from the samples and analysed using a soybean-specific lectin gene PCR as well as a PCR specific for the genetic modification. Additionally, by means of PCR in combination with fluorescence detection (TaqMan 5'-nuclease assay), suspicious samples were subjected to real-time quantification of the percentage of genetically modified RR-Soy. The methods of analysis proved to be extremely sensitive and specific with regard to the food groups checked. The fats and oils, as well as the condiments, were the exceptions, in which amplifiable soy DNA could not be detected. The genetic modification of RR-Soy was detected in 34 samples, eight of which contained more than 1% of RR-Soy. It is necessary to determine the percentage of transgenic soy in order to assess whether genetically modified ingredients were deliberately added or whether they were caused by technically unavoidable contamination (for example, during transportation and processing).

  17. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    PubMed

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, its band detection accuracy is limited by a band detection algorithm that cannot adapt to variations in the input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (for the Android platform) and the Apple App Store (for the iOS platform). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

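    The search strategy named above combines Monte Carlo Tree Search with the UCB selection rule. A full tree search is beyond a short sketch, so the toy below, a deliberate simplification and not the GelApp implementation, treats a handful of candidate processing pipelines as bandit arms and applies the same UCB1 rule to allocate evaluation budget; the pipeline names and quality scores are invented.

    ```python
    # Highly simplified illustration of the UCB selection rule underlying
    # MCTS-UCB: pick the arm maximizing mean_score + c * sqrt(ln(N) / n_i).
    # The evaluate() function is a noisy stand-in for band-detection accuracy.
    import math, random

    pipelines = ["median+otsu", "gauss+adaptive", "tophat+otsu"]
    true_quality = {"median+otsu": 0.6, "gauss+adaptive": 0.8, "tophat+otsu": 0.7}

    counts = {p: 0 for p in pipelines}
    totals = {p: 0.0 for p in pipelines}

    def evaluate(p):                    # noisy stand-in for a real evaluation
        return true_quality[p] + random.gauss(0, 0.1)

    for trial in range(1, 301):
        def ucb(p):
            if counts[p] == 0:
                return float("inf")     # try every arm at least once
            return totals[p] / counts[p] + 1.4 * math.sqrt(math.log(trial) / counts[p])
        best = max(pipelines, key=ucb)
        counts[best] += 1
        totals[best] += evaluate(best)

    print("selected pipeline:", max(pipelines, key=lambda p: totals[p] / counts[p]))
    ```
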
  18. Image processing can cause some malignant soft-tissue lesions to be missed in digital mammography images.

    PubMed

    Warren, L M; Halling-Brown, M D; Looney, P T; Dance, D R; Wallis, M G; Given-Wilson, R M; Wilkinson, L; McAvinchey, R; Young, K C

    2017-09-01

    To investigate the effect of image processing on cancer detection in mammography, an observer study was performed using 349 digital mammography images of women with normal breasts, calcification clusters, or soft-tissue lesions, including 191 subtle cancers. Images underwent two types of processing: FlavourA (standard) and FlavourB (added enhancement). Six observers located features in the breast they suspected to be cancerous (4,188 observations). Data were analysed using jackknife alternative free-response receiver operating characteristic (JAFROC) analysis, and the characteristics of the cancers detected with each image processing type were investigated. For calcifications, the JAFROC figure of merit (FOM) was equal to 0.86 for both types of image processing. For soft-tissue lesions, the JAFROC FOM was better for FlavourA (0.81) than FlavourB (0.78); this difference was significant (p=0.001). Using FlavourA, a greater number of cancers of all grades and sizes were detected than with FlavourB, and FlavourA improved soft-tissue lesion detection in denser breasts (p=0.04 when volumetric density was over 7.5%). Conclusions: the detection of malignant soft-tissue lesions (which were primarily invasive) was significantly better with FlavourA than FlavourB image processing, despite FlavourB having a higher-contrast appearance often preferred by radiologists. It is important that the clinical choice of image processing is based on objective measures. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  19. The CRDS method application for study of the gas-phase processes in the hot CVD diamond thin film.

    NASA Astrophysics Data System (ADS)

    Makarov, Vladimir; Hidalgo, Arturo; Morell, Gerardo; Weiner, Brad; Buzaianu, Madalina

    2006-03-01

    For a detailed analysis of problems related to the growth of hot CVD carbon-containing nanomaterials, different intermediate species formed during the growth process have to be detected, and the dependence of the concentrations of these species on different experimental parameters (the concentrations of the stable chemical compounds CH4 and H2S, and the distance from the filament system to the substrate surface) has to be investigated. In the present study, the HS and CS radicals were detected using the cavity ring-down spectroscopy (CRDS) method in hot CVD diamond thin-film growth for a CH4 (0.4%) + H2 mixture doped with H2S (400 ppm). The absolute absorption density spectra of the HS and CS radicals were obtained as a function of the different experimental parameters. This study proves that the HS and CS radicals are intermediates formed during the hot-filament CVD process. A kinetics approach was developed for detailed analysis of the experimental data obtained; the kinetics scheme includes homogeneous and heterogeneous processes as well as transport of the chemical species in the CVD chamber.

  20. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.

    PubMed

    Nummenmaa, Lauri; Calvo, Manuel G

    2015-04-01

    Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques for resolving this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random effects meta-analysis was conducted to estimate effect sizes at the population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. A robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to the processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.

  1. Inferring mixed-culture growth from total biomass data in a wavelet approach

    NASA Astrophysics Data System (ADS)

    Ibarra-Junquera, V.; Escalante-Minakata, P.; Murguía, J. S.; Rosu, H. C.

    2006-10-01

    It is shown that the presence of mixed-culture growth in batch fermentation processes can be very accurately inferred from total biomass data by means of wavelet analysis for singularity detection. This is accomplished by considering simple phenomenological models for the mixed growth and the more complicated case of mixed growth on a mixture of substrates. The main quantity provided by the wavelet analysis is the Hölder exponent of the singularity, which we determine for our illustrative examples. The numerical results point to the possibility that Hölder exponents can be used to characterize the nature of mixed-culture growth in batch fermentation processes, with potential industrial applications. Moreover, the analysis of the same data affected by common additive Gaussian noise still leads to the wavelet detection of the singularities, although the Hölder exponent is no longer a useful parameter.

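    The key quantity above is the Hölder exponent extracted by wavelet analysis at a singularity. As a rough illustration (not the authors' code), the sketch below builds a biomass-like curve with a kink where a second culture takes over and reads the local regularity from the log-log slope of continuous-wavelet coefficient magnitudes across scales at that point; the exact offset between the slope and the Hölder exponent depends on the CWT normalization, so the slope serves only as a relative regularity measure here.

    ```python
    # Illustrative sketch: local-regularity (Hölder-type) estimate of a "total
    # biomass" curve from the scaling of CWT magnitudes across scales at the
    # singularity location. Signal, wavelet, and scale range are assumptions.
    import numpy as np
    import pywt

    t = np.linspace(0, 1, 1024)
    biomass = np.where(t < 0.5, t, 0.5 + 2.0 * (t - 0.5))  # kink: growth-rate switch

    scales = np.arange(2, 64)
    coefs, _ = pywt.cwt(biomass, scales, "gaus1")          # 1st-derivative wavelet

    idx = np.argmin(np.abs(t - 0.5))                       # singularity location
    mags = np.abs(coefs[:, idx])
    slope = np.polyfit(np.log(scales), np.log(mags), 1)[0]
    print(f"log-log scaling slope at the kink: {slope:.2f}")
    ```
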
  2. Detection of Pigment Networks in Dermoscopy Images

    NASA Astrophysics Data System (ADS)

    Eltayef, Khalid; Li, Yongmin; Liu, Xiaohui

    2017-02-01

    One of the most important structures in dermoscopy images is the pigment network, whose detection is also one of the most challenging and fundamental tasks for dermatologists in the early detection of melanoma. This paper presents an automatic system to detect pigment networks in dermoscopy images. The proposed algorithm consists of four stages. First, a pre-processing algorithm is carried out in order to remove noise and improve the quality of the image. Second, a bank of directional filters and morphological connected component analysis are applied to detect the pigment networks. Third, features are extracted from the detected image for use in the subsequent stage. Fourth, classification is performed by applying a feed-forward neural network to classify each region as either normal or abnormal skin. The method was tested on a dataset of 200 dermoscopy images from Hospital Pedro Hispano (Matosinhos), and better results were produced compared to previous studies.

  3. Development of yarn breakage detection software system based on machine vision

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu

    2017-10-01

    In spinning mills, yarn breakage often cannot be detected in a timely manner, which raises costs for textile enterprises. This paper presents a software system based on computer vision for the real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to carry out yarn breakage detection and management. The software running on the tablet PC is designed to collect yarn and location information for analysis and processing; the processed information is then sent over Wi-Fi via the HTTP protocol to the cloud server, where it is stored in a Microsoft SQL 2008 database for follow-up querying and management of yarn-break information. Finally, results are shown on the local display in real time to remind the operator to deal with the broken yarn. The experimental results show that the system has a missed detection rate of no more than 5‰ and no false detections.

  4. The remote supervisory and controlling experiment system of traditional Chinese medicine production based on Fieldbus

    NASA Astrophysics Data System (ADS)

    Zhan, Jinliang; Lu, Pei

    2006-11-01

    Since the quality of traditional Chinese medicine products is affected by raw materials, machining, and many other factors, it is difficult for the traditional Chinese medicine production process, especially the extraction process, to ensure steady and homogeneous quality. At the same time, there exist quality-control blind spots due to the lack of on-line quality detection means. However, if infrared spectrum analysis technology is used in the production process, on the basis of off-line analysis, to detect the quality of semi-manufactured goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality can be obtained. The on-line detection of the extraction process thus plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a traditional Chinese medicine extraction process monitoring experiment system based on the PROFIBUS-DP fieldbus, OPC, and Internet technology is introduced. The system integrates intelligent data-gathering nodes with a supervisory subsystem providing graphical configuration and remote supervision; during the production process it monitors temperature, pressure, quality, and other parameters, and it can be controlled by remote nodes in a VPN (Virtual Private Network). Experiments and applications have proved that the system fully reaches the anticipated effect, with the merits of operational stability, real-time response, reliability, and convenient, simple manipulation.

  5. Determination of diethanolamine or N-methyldiethanolamine in high ammonium concentration matrices by capillary electrophoresis with indirect UV detection: application to the analysis of refinery process waters.

    PubMed

    Bord, N; Crétier, G; Rocca, J-L; Bailly, C; Souchez, J-P

    2004-09-01

    Alkanolamines such as diethanolamine (DEA) and N-methyldiethanolamine (MDEA) are used in desulfurization processes in crude oil refineries. These compounds may be found in process waters following an accidental contamination. The analysis of alkanolamines in refinery process waters is very difficult due to the high ammonium concentration of the samples. This paper describes a method for the determination of DEA in high ammonium concentration refinery process waters by using capillary electrophoresis (CE) with indirect UV detection. The same method can be used for the determination of MDEA. Best results were achieved with a background electrolyte (BGE) comprising 10 mM histidine adjusted to pH 5.0 with acetic acid. The development of this electrolyte and the analytical performances are discussed. The quantification was performed by using internal standardization, by which triethanolamine (TEA) was used as internal standard. A matrix effect due to the high ammonium content has been highlighted and standard addition was therefore used. The developed method was characterized in terms of repeatability of migration times and corrected peak areas, linearity, and accuracy. Limits of detection (LODs) and quantification (LOQs) obtained were 0.2 and 0.7 ppm, respectively. The CE method was applied to the determination of DEA or MDEA in refinery process waters spiked with known amounts of analytes and it gave excellent results, since uncertainties obtained were 8 and 5%, respectively.

  6. Analysis of Android Device-Based Solutions for Fall Detection

    PubMed Central

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems, as well as hospitalization costs, among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors, and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm, and the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  7. Analysis of Android Device-Based Solutions for Fall Detection.

    PubMed

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems, as well as hospitalization costs, among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors, and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm, and the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  8. Clustering analysis of moving target signatures

    NASA Astrophysics Data System (ADS)

    Martone, Anthony; Ranney, Kenneth; Innocenti, Roberto

    2010-04-01

    Previously, we developed a moving target indication (MTI) processing approach to detect and track slow-moving targets inside buildings, which successfully detected moving targets (MTs) from data collected by a low-frequency, ultra-wideband radar. Our MTI algorithms include change detection, automatic target detection (ATD), clustering, and tracking. The MTI algorithms can be implemented in a real-time or near-real-time system; however, a person-in-the-loop is needed to select input parameters for the clustering algorithm. Specifically, the number of clusters to input into the cluster algorithm is unknown and requires manual selection. A critical need exists to automate all aspects of the MTI processing formulation. In this paper, we investigate two techniques that automatically determine the number of clusters: the adaptive knee-point (KP) algorithm and the recursive pixel finding (RPF) algorithm. The KP algorithm is based on a well-known heuristic approach for determining the number of clusters. The RPF algorithm is analogous to the pixel-labeling procedure in image processing. Both algorithms are used to analyze the false alarm and detection rates of three operational scenarios of personnel walking inside wood and cinderblock buildings.

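    The KP algorithm is described above only as a knee-point heuristic; a generic version of that idea is sketched below (the common elbow rule, not necessarily the authors' exact adaptive KP algorithm): run k-means over a range of k, normalize the distortion curve, and pick the k whose point lies farthest from the chord joining the curve's endpoints. The synthetic detections simulate three movers.

    ```python
    # Generic knee-point heuristic for choosing the number of clusters
    # automatically. Data, range of k, and the chord-distance rule are
    # illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    detections = np.vstack([rng.normal(c, 0.3, size=(40, 2))
                            for c in [(0, 0), (4, 4), (8, 0)]])  # 3 movers

    ks = np.arange(1, 9)
    inertia = np.array([KMeans(n_clusters=k, n_init=10, random_state=0)
                        .fit(detections).inertia_ for k in ks])

    # normalize both axes, then take the distance from the endpoint chord
    x = (ks - ks.min()) / (ks.max() - ks.min())
    y = (inertia - inertia.min()) / (inertia.max() - inertia.min())
    d = np.abs((y[-1] - y[0]) * (x - x[0]) - (x[-1] - x[0]) * (y - y[0]))
    print("estimated number of clusters:", ks[np.argmax(d)])
    ```
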
  9. Analysis of exhaled breath by laser detection

    NASA Astrophysics Data System (ADS)

    Thrall, Karla D.; Toth, James J.; Sharpe, Steven W.

    1996-04-01

    The goal of our work is twofold: (1) to develop a portable, rapid, laser-based breath analyzer for monitoring metabolic processes, and (2) to predict these metabolic processes through physiologically based pharmacokinetic (PBPK) modeling. Small infrared-active molecules such as ammonia, carbon monoxide, carbon dioxide, methane, and ethane are present in exhaled breath and can be readily detected by laser absorption spectroscopy. In addition, many of the stable isotopomers of these molecules can be accurately detected, making it possible to follow specific metabolic processes. Potential areas of application for this technology include the diagnosis of certain pathologies (e.g., Helicobacter pylori infection), detection of trauma due to either physical or chemical causes, and monitoring nutrient uptake (i.e., malnutrition). In order to understand the origin of these small molecules and elucidate the associated metabolic processes, we employ PBPK models. A PBPK model is founded on known physiological processes (i.e., blood flow rates, tissue volumes, breathing rate, etc.), chemical-specific processes (i.e., tissue solubility coefficients, molecular weight, chemical density, etc.), and metabolic processes (tissue site and rate of metabolic biotransformation). Since many of these processes are well understood, a PBPK model can be developed and validated against the more readily available experimental animal data; then, by extrapolating the parameters to apply to man, the model can predict chemical behavior in humans.

  10. Target Detection and Classification Using Seismic and PIR Sensors

    DTIC Science & Technology

    2012-06-01

    …time series analysis via wavelet-based partitioning," Signal Process… In this regard, this paper presents a wavelet-based method for target detection and classification. The proposed method has been validated on data sets of… The work reported in this paper makes use of a wavelet-based feature extraction method, called Symbolic Dynamic Filtering (SDF) [12]–[14]. The…

  11. Why Are People Bad at Detecting Randomness? A Statistical Argument

    ERIC Educational Resources Information Center

    Williams, Joseph J.; Griffiths, Thomas L.

    2013-01-01

    Errors in detecting randomness are often explained in terms of biases and misconceptions. We propose and provide evidence for an account that characterizes the contribution of the inherent statistical difficulty of the task. Our account is based on a Bayesian statistical analysis, focusing on the fact that a random process is a special case of…

  12. Point source detection in infrared astronomical surveys

    NASA Technical Reports Server (NTRS)

    Pelzmann, R. F., Jr.

    1977-01-01

    Data processing techniques useful for infrared astronomy data analysis systems are reported. This investigation is restricted to consideration of data from space-based telescope systems operating as survey instruments. In this report the theoretical background for specific point-source detection schemes is completed, and the development of specific algorithms and software for the broad range of requirements is begun.

  13. Wearable Networked Sensing for Human Mobility and Activity Analytics: A Systems Study.

    PubMed

    Dong, Bo; Biswas, Subir

    2012-01-01

    This paper presents implementation details, system characterization, and the performance of a wearable sensor network that was designed for human activity analysis. Specific machine learning mechanisms are implemented for recognizing a target set of activities with both out-of-body and on-body processing arrangements. Impacts of energy consumption by the on-body sensors are analyzed in terms of activity detection accuracy for out-of-body processing. Impacts of limited processing abilities in the on-body scenario are also characterized in terms of detection accuracy, by varying the background processing load in the sensor units. Through a rigorous systems study, it is shown that an efficient human activity analytics system can be designed and operated even under energy and processing constraints of tiny on-body wearable sensors.

  14. Processing changes across reading encounters.

    PubMed

    Levy, B A; Newell, S; Snyder, J; Timmins, K

    1986-10-01

    Five experiments examined changes in the processing of a text across reading encounters. Experiment 1 showed that reading speed increased systematically across encounters, with no loss in the extensiveness of analyses of the printed text, as indicated by the ability to detect nonword errors embedded within that passage. Experiment 2 replicated this improved reading fluency with experience and showed that it occurred even with typescript changes across trials, thus indicating that a primed visual operations explanation cannot account for the effect. The third and fourth experiments then extended the study of the familiarity effect to higher level processing, as indicated by the detection of word errors. Familiarity facilitated the detection of these violations at the syntactic-semantic levels. Finally, Experiment 5 showed that these higher level violations continued to be well detected over a series of reading encounters with the same text. The results indicate that prior experience improves reading speed, with no attenuation of analysis of the printed words or of the passage's message.

  15. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large amount of planetary imagery is being acquired. Because of the huge amount of acquired data, automatic and robust processing techniques are preferable for data analysis. Here, the aim is to achieve a robust and general methodology for crater detection, and a novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible-jump Markov chain Monte Carlo dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications; one such application area is image registration by matching the extracted features.

  16. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

    Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy, or frequency-domain quantities obtained using spectral analysis techniques. The objective here is to investigate an alternate approach, entirely different from traditional signal processing: to utilize the Shannon entropy as a tool for the processing of sonar signals, with emphasis on detection, classification, and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, or incoherently, depending upon the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique is based on the concept of the entropy of a random process. Under a constant energy constraint, the entropy of a received process bearing a finite number of sample points is maximum when hypothesis H0 (that the received process consists of noise alone) is true, and decreases when a correlated signal is present (H1). Therefore, the strategy used for detection is: (I) calculate the entropy of the received data; (II) compare the entropy with the maximum value; and (III) make the decision: H1 is assumed if the difference is large compared to a pre-assigned threshold, and H0 is assumed otherwise. The test statistic is the difference between the entropies under H0 and H1. We show simulated results for detecting stationary and non-stationary signals in noise, together with results on the detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.

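    A minimal numerical sketch of this H0/H1 entropy test follows; it is not from the paper. The received samples are power-normalized (the constant-energy constraint), the entropy is estimated from a fixed-bin histogram, and a drop relative to the noise-only reference flags a detection; the bin count, signal model, and threshold are all assumptions.

    ```python
    # Sketch of entropy-based detection: Gaussian noise maximizes entropy at
    # fixed power, so a correlated signal lowers the estimated entropy.
    import numpy as np

    def shannon_entropy(x, bins=64, support=(-4.0, 4.0)):
        x = x / x.std()                          # constant-energy constraint
        counts, _ = np.histogram(x, bins=bins, range=support)
        p = counts[counts > 0] / len(x)
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(5)
    noise_only = rng.standard_normal(4096)                 # hypothesis H0
    t = np.arange(4096)
    with_signal = noise_only + 3.0 * np.sin(0.05 * t)      # hypothesis H1

    h0, h1 = shannon_entropy(noise_only), shannon_entropy(with_signal)
    print(f"H0 entropy {h0:.2f} bits, H1 entropy {h1:.2f} bits")
    print("signal detected:", (h0 - h1) > 0.1)             # pre-assigned threshold
    ```
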
  17. Appropriate IMFs associated with cepstrum and envelope analysis for ball-bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Tsao, Wen-Chang; Pan, Min-Chun

    2014-03-01

    Traditional envelope analysis is an effective method for the fault detection of rolling bearings. However, all the resonant frequency bands must be examined during the bearing-fault detection process. To handle this deficiency, this paper proposes using empirical mode decomposition (EMD) to select a proper intrinsic mode function (IMF) for the subsequent detection tools; here, both envelope analysis and cepstrum analysis are employed and compared. By virtue of the band-pass filtering nature of EMD, the resonant frequency bands of the structure being measured are captured in the IMFs. As impulses arising from rolling elements striking bearing faults modulate with the structure resonance, proper IMFs can potentially characterize fault signatures. In the study, faulty ball bearings are used to validate the proposed method, and comparisons with traditional envelope analysis are made. After using IMFs to highlight faulty-bearing features, the performance of envelope analysis and cepstrum analysis in singling out bearing faults is objectively compared and addressed; it is noted that envelope analysis generally offers better performance.

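    To make the envelope-analysis step concrete, the sketch below simulates a resonance-modulated impulse train of the kind a well-chosen IMF would isolate, takes its Hilbert envelope, and locates the bearing fault frequency in the envelope spectrum. The sampling rate, fault frequency, resonance, and decay are illustrative assumptions, and the EMD step itself is assumed to have already been performed.

    ```python
    # Sketch of envelope analysis on a selected IMF: Hilbert envelope followed
    # by a spectrum peak search around the expected fault frequency.
    import numpy as np
    from scipy.signal import hilbert

    fs, fault_hz, resonance_hz = 20_000, 37.0, 3_000.0
    t = np.arange(0, 1.0, 1 / fs)

    impulses = (np.sin(2 * np.pi * fault_hz * t) > 0.999).astype(float)
    ring = np.exp(-400 * t[:200]) * np.sin(2 * np.pi * resonance_hz * t[:200])
    imf = np.convolve(impulses, ring, mode="same")
    imf += 0.05 * np.random.default_rng(8).standard_normal(len(t))

    envelope = np.abs(hilbert(imf))                   # demodulate the resonance
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), 1 / fs)

    band = (freqs > 5) & (freqs < 200)                # search the low-freq band
    peak = freqs[band][np.argmax(spectrum[band])]
    print(f"dominant envelope frequency: {peak:.1f} Hz (expected ~{fault_hz} Hz)")
    ```
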
  18. Self-Sealing Wet Chemistry Cell for Field Analysis

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2012-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low concentration organic molecules that may identify extraterrestrial life. Wet chemistry based instruments are the techniques of choice for most laboratory- based analysis of organic molecules due to several factors including less fragmentation of fragile biomarkers, and ability to concentrate target species resulting in much lower limits of detection. Development of an automated wet chemistry preparation system that can operate autonomously on Earth and is also designed to operate under Martian ambient conditions will demonstrate the technical feasibility of including wet chemistry on future missions. An Automated Sample Processing System (ASPS) has recently been developed that receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species, and delivers sample to multiple instruments for analysis (including for non-organic soluble species). The key to this system is a sample cell that can autonomously function under field conditions. As a result, a self-sealing sample cell was developed that can autonomously hermetically seal fines and powder into a container, regardless of orientation of the apparatus. The cap is designed with a beveled edge, which allows the cap to be self-righted as the capping motor engages. Each cap consists of a C-clip lock ring below a crucible O-ring that is placed into a groove cut into the sample cap.

  19. A Signal Processing Module for the Analysis of Heart Sounds and Heart Murmurs

    NASA Astrophysics Data System (ADS)

    Javed, Faizan; Venkatachalam, P. A.; H, Ahmad Fadzil M.

    2006-04-01

    In this paper, a Signal Processing Module (SPM) for the computer-aided analysis of heart sounds has been developed. The module reveals important information on cardiovascular disorders and can assist the general physician in reaching a more accurate and reliable diagnosis at early stages. It can help compensate for the shortage of expert doctors in rural as well as urban clinics and hospitals. The module has five main blocks: Data Acquisition & Pre-processing, Segmentation, Feature Extraction, Murmur Detection, and Murmur Classification. The heart sounds are first acquired using an electronic stethoscope, which is capable of transferring these signals to a nearby workstation over a wireless medium. The signals are then segmented into individual cycles as well as individual components using spectral analysis of the heart sounds, without any reference signal such as the ECG. Features are then extracted from the individual components using the spectrogram and used as input to an MLP (Multiple Layer Perceptron) neural network trained to detect the presence of heart murmurs. Once a murmur is detected, it is classified into one of seven classes depending on its timing within the cardiac cycle, using the smoothed pseudo Wigner-Ville distribution. The module has been tested with real heart sounds from 40 patients and has proved to be quite efficient and robust while dealing with a large variety of pathological conditions.

  20. Applications of independent component analysis in SAR images

    NASA Astrophysics Data System (ADS)

    Huang, Shiqi; Cai, Xinhua; Hui, Weihua; Xu, Ping

    2009-07-01

    The detection of faint, small, and hidden targets in synthetic aperture radar (SAR) images is still an issue for automatic target recognition (ATR) systems. How to effectively separate these targets from the complex background is the aim of this paper. Independent component analysis (ICA) theory can enhance SAR image targets and improve the signal-to-clutter ratio (SCR), which benefits the detection and recognition of faint targets. Therefore, this paper proposes a new SAR image target detection algorithm based on ICA. In the experiments, the fast ICA (FICA) algorithm is utilized. Finally, real SAR image data are used to test the method. The experimental results verify that the algorithm is feasible and that it can improve the SCR of SAR images and increase the detection rate for faint, small targets.

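    As a toy illustration of the FICA separation idea (not the paper's data or code), the sketch below mixes a sparse "target" pattern with speckle-like clutter into several observation channels and uses scikit-learn's FastICA to unmix them; the spikier (higher-kurtosis) component then corresponds to the target. Everything here is synthetic.

    ```python
    # Toy FastICA separation of a faint, sparse target from clutter. Mixing
    # matrix, noise model, and sizes are illustrative assumptions.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(6)
    clutter = rng.laplace(size=128 * 128)            # speckle-like background
    target = np.zeros(128 * 128)
    target[5000:5050] = 4.0                          # faint, small target

    mixing = np.array([[1.0, 0.1], [0.9, 0.3], [1.1, 0.2]])
    X = (mixing @ np.vstack([clutter, target])).T    # three mixed observations

    ica = FastICA(n_components=2, random_state=0)
    S = ica.fit_transform(X)                         # independent components
    kurts = [abs(((s - s.mean()) ** 4).mean() / s.var() ** 2 - 3) for s in S.T]
    print("component excess kurtosis (target is the spikier one):",
          [f"{k:.1f}" for k in kurts])
    ```
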
  1. Background estimation and player detection in badminton video clips using histogram of pixel values along temporal dimension

    NASA Astrophysics Data System (ADS)

    Peng, Yahui; Ma, Xiao; Gao, Xinyu; Zhou, Fangxu

    2015-12-01

    Computer vision is an important tool for sports video processing. However, its application in badminton match analysis is very limited. In this study, we propose straightforward but robust histogram-based background estimation and player detection methods for badminton video clips, and compare the results with the naive averaging method and the mixture-of-Gaussians method, respectively. The proposed method yielded better background estimation results than the naive averaging method and more accurate player detection results than the mixture-of-Gaussians player detection method. The preliminary results indicate that the proposed histogram-based method can estimate the background and extract the players accurately. We conclude that the proposed method can be used for badminton player tracking, and further studies are warranted for automated match analysis.

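    A minimal sketch of per-pixel temporal-histogram background estimation follows (an illustration of the general idea, not the authors' implementation): each pixel's background value is taken as the center of its modal intensity bin along the temporal dimension, and players are extracted as large deviations from that background. The clip size, bin count, and threshold are assumptions.

    ```python
    # Per-pixel temporal-histogram background estimation and player extraction.
    # All parameters are illustrative assumptions.
    import numpy as np

    def background_mode(frames, bins=32):
        """frames: (T, H, W) grayscale clip (0-255) -> (H, W) background."""
        idx = (frames.astype(int) * bins) // 256          # bin index per pixel
        counts = np.zeros((bins,) + frames.shape[1:], dtype=int)
        for b in range(bins):                             # votes per bin
            counts[b] = (idx == b).sum(axis=0)
        best = counts.argmax(axis=0)                      # modal bin per pixel
        return (best + 0.5) * (256 / bins)                # bin-center estimate

    rng = np.random.default_rng(7)
    clip = np.clip(rng.normal(120, 5, (50, 48, 64)), 0, 255)  # static court
    clip[10:20, 20:30, 30:40] = 220                           # passing "player"

    bg = background_mode(clip)
    player_mask = np.abs(clip[15] - bg) > 30
    print("player pixels in frame 15:", player_mask.sum())
    ```
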
  2. Squids in the Study of Cerebral Magnetic Field

    NASA Astrophysics Data System (ADS)

    Romani, G. L.; Narici, L.

    The following sections are included: * INTRODUCTION * HISTORICAL OVERVIEW * NEUROMAGNETIC FIELDS AND AMBIENT NOISE * DETECTORS * Room temperature sensors * SQUIDs * DETECTION COILS * Magnetometers * Gradiometers * Balancing * Planar gradiometers * Choice of the gradiometer parameters * MODELING * Current pattern due to neural excitations * Action potentials and postsynaptic currents * The current dipole model * Neural population and detected fields * Spherically bounded medium * SPATIAL CONFIGURATION OF THE SENSORS * SOURCE LOCALIZATION * Localization procedure * Experimental accuracy and reproducibility * SIGNAL PROCESSING * Analog Filtering * Bandpass filters * Line rejection filters * DATA ANALYSIS * Analysis of evoked/event-related responses * Simple average * Selected average * Recursive techniques * Similarity analysis * Analysis of spontaneous activity * Mapping and localization * EXAMPLES OF NEUROMAGNETIC STUDIES * Neuromagnetic measurements * Studies on the normal brain * Clinical applications * Epilepsy * Tinnitus * CONCLUSIONS * ACKNOWLEDGEMENTS * REFERENCES

  3. Critical Analysis of Dual-Probe Heat-Pulse Technique Applied to Measuring Thermal Diffusivity

    NASA Astrophysics Data System (ADS)

    Bovesecchi, G.; Coppa, P.; Corasaniti, S.; Potenza, M.

    2018-07-01

    The paper presents an analysis of the experimental parameters involved in the application of the dual-probe heat-pulse technique, followed by a critical review of methods for processing thermal response data (e.g., maximum detection and nonlinear least-squares regression) and the consequently obtainable uncertainty. Glycerol was selected as the test liquid, and its thermal diffusivity was evaluated over the temperature range from -20 °C to 60 °C. In addition, Monte Carlo simulation was used to assess the uncertainty propagation for maximum detection. It was concluded that the maximum-detection approach to processing thermal response data gives the closest results to the reference data, inasmuch as the nonlinear regression results are affected by major uncertainties due to partial correlation between the evaluated parameters. Besides, the interpolation of temperature data with a polynomial to find the maximum leads to a systematic difference between measured and reference data, as evidenced by the Monte Carlo simulations; through its correction, this systematic error can be reduced to a negligible value of about 0.8%.

  4. Event-Related Brain Potential Correlates of Emotional Face Processing

    ERIC Educational Resources Information Center

    Eimer, Martin; Holmes, Amanda

    2007-01-01

    Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was…

  5. Integrating the Medical Home into the EHDI Process

    ERIC Educational Resources Information Center

    Munoz, Karen F.; Nelson, Lauri; Bradham, Tamala S.; Hoffman, Jeff; Houston, K. Todd

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that examined 12 areas within state EHDI programs. Related to how the medical home is integrated into the EHDI process, 273 items were listed by 48 coordinators, and themes were identified…

  6. Alterations in the sarcoplasmic protein fraction of beef muscle with postmortem aging and hydrodynamic pressure processing

    USDA-ARS's Scientific Manuscript database

    Capillary electrophoresis (CE) and reversed-phase high performance liquid chromatography (RP-HPLC) analysis were utilized to detect differences in the sarcoplasmic protein profiles of beef strip loins subjected to aging and hydrodynamic pressure processing (HDP) treatments. At 48 h postmortem, stri...

  7. Time-frequency analysis of acoustic signals in the audio-frequency range generated during Hadfield's steel friction

    NASA Astrophysics Data System (ADS)

    Dobrynin, S. A.; Kolubaev, E. A.; Smolin, A. Yu.; Dmitriev, A. I.; Psakhie, S. G.

    2010-07-01

    Time-frequency analysis of sound waves detected by a microphone during the friction of Hadfield’s steel has been performed using wavelet transform and window Fourier transform methods. This approach reveals a relationship between the appearance of quasi-periodic intensity outbursts in the acoustic response signals and the processes responsible for the formation of wear products. It is shown that the time-frequency analysis of acoustic emission in a tribosystem can be applied, along with traditional approaches, to studying features in the wear and friction process.

  8. The Analysis of Detective Genre in Media Studies in the Student Audience

    ERIC Educational Resources Information Center

    Fedorov, Alexander

    2011-01-01

    Developing skills for the critical analysis of media texts is an important task of media education. However, media literacy practice shows that students have problems with the discussion/analysis of entertainment genres in the early stages of media studies, for example, difficulties in the process of understanding and interpreting the…

  9. Arc-Welding Spectroscopic Monitoring based on Feature Selection and Neural Networks.

    PubMed

    Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M

    2008-10-21

    A new spectral processing technique designed for application in the on-line detection and classification of arc-welding defects is presented in this paper. A noninvasive fiber sensor embedded within a TIG torch collects the plasma radiation originating during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in previous works, showing an improvement in the performance of the monitoring system.

  10. Removing external DNA contamination from arthropod predators destined for molecular gut-content analysis

    USDA-ARS's Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Field and laboratory experiments have demonstrated that mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, can lead to contamination of fed pred...

  11. Removing external DNA contamination from arthropod predators destined for molecular gut-content analysis

    USDA-ARS's Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Field and laboratory experiments have demonstrated that mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, can lead to contamination of fed pred...

  12. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    PubMed

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. ¹H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 L-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.

  13. Cryogenic mirror analysis

    NASA Technical Reports Server (NTRS)

    Nagy, S.

    1988-01-01

    Due to the extraordinary distances scanned by modern telescopes, optical surfaces in such telescopes must be manufactured to exacting standards of perfection, within a few thousandths of a centimeter. The detection of imperfections of less than 1/20 of a wavelength of light, for application in building the mirror for the Space Infrared Telescope Facility, was undertaken. Because the mirror must be kept very cold while in space, another factor comes into effect: cryogenics. The process for testing a specific mirror under cryogenic conditions is described, including the follow-up analysis accomplished by computer. To better illustrate the process and analysis, a Pyrex Hex-Core mirror is followed through the process, from laser interferometry in the lab to computer analysis via a program called FRINGE. This FRINGE analysis is detailed.

  14. Analysis of surface deformation during the eruptive process of El Hierro Island (Canary Islands, Spain): Detection, Evolution and Forecasting.

    NASA Astrophysics Data System (ADS)

    Berrocoso, M.; Fernandez-Ros, A.; Prates, G.; Martin, M.; Hurtado, R.; Pereda, J.; Garcia, M. J.; Garcia-Cañada, L.; Ortiz, R.; Garcia, A.

    2012-04-01

    Surface deformation has been an essential parameter for characterizing the onset and evolution of the eruptive process of El Hierro Island (October 2011), as well as for forecasting changes in seismic and volcanic activity during the crisis period. From GNSS-GPS observations, the reactivation was detected early by analyzing changes in the deformation relative to the regional geodynamics of El Hierro Island. Surface deformation changes were detected before the occurrence of seismic activity using the station FRON (GRAFCAN). The evolution of the process was studied through the analysis of time series of topocentric coordinates and of the variation in distance between stations on the island of El Hierro (GRAFCAN stations; the IGN network; and UCA-CSIC points) and the LPAL-IGS station on the island of La Palma. The main methodologies and their results are: the location (and its changes) of the lithospheric pressure source, obtained by applying the Mogi model; Kalman filtering of high-frequency time series, used to produce the forecasts issued for volcanic emergency management; and correlations between the deformation at the different GPS stations and their relationship with seismo-volcanic settings.

  15. A robust object-based shadow detection method for cloud-free high resolution satellite images over urban areas and water bodies

    NASA Astrophysics Data System (ADS)

    Tatar, Nurollah; Saadatseresht, Mohammad; Arefi, Hossein; Hadavand, Ahmad

    2018-06-01

    Unwanted contrast in high-resolution satellite images, such as shadow areas, directly affects the results of further processing in urban remote sensing. Detecting and finding the precise position of shadows is critical in different remote sensing processing chains such as change detection, image classification, and digital elevation model generation from stereo images. The spectral similarity between shadow areas, water bodies, and some dark asphalt roads makes the development of robust shadow detection algorithms challenging. In addition, most existing methods work at the pixel level and neglect the contextual information contained in neighboring pixels. In this paper, a new object-based shadow detection framework is introduced. In the proposed method, a pixel-level shadow mask is built by extending established thresholding methods with a new C4 index, which resolves the ambiguity between shadows and water bodies. The pixel-based results are then further processed in an object-based majority analysis to detect the final shadow objects. Four different high-resolution satellite images are used to validate the new approach. The results show the superiority of the proposed method over state-of-the-art shadow detection methods, with an average F-measure of 96%.
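
    The object-based majority analysis itself is straightforward to sketch: each image segment is relabelled by the majority vote of its pixels in the pixel-level shadow mask. The sketch below assumes a precomputed segment-label image; the function name and the 0.5 vote threshold are illustrative assumptions rather than the paper's exact procedure.

    ```python
    import numpy as np

    def object_majority(shadow_mask, segments):
        """Per-segment majority vote over a boolean pixel-level shadow mask."""
        out = np.zeros_like(shadow_mask, dtype=bool)
        for lab in np.unique(segments):
            px = segments == lab
            out[px] = shadow_mask[px].mean() > 0.5   # shadow if most pixels agree
        return out
    ```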

  16. Theory on data processing and instrumentation. [remote sensing

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1978-01-01

    A selection of NASA Earth observation programs is reviewed, emphasizing hardware capabilities. Sampling theory, noise and detection considerations, and image evaluation are discussed for remote sensor imagery. Vision and perception are considered, leading to numerical image processing. The use of multispectral scanners and of multispectral data processing systems, including digital image processing, is described. Multispectral sensing and analysis in application with land use and geographical data systems are also covered.

  17. An information-based approach to change-point analysis with applications to biophysics and cell biology.

    PubMed

    Wiggins, Paul A

    2015-07-21

    This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
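
    To make the closed-form flavour concrete, the toy below scores every possible single change in the mean of i.i.d. Gaussian data using the profiled log-likelihood, for which segment means and the pooled residual sum of squares are the maximum-likelihood estimators. It is only a sketch of the building block: the article's method also handles Wiener and Ornstein-Uhlenbeck noise and selects the number of change points with the frequentist information criterion, which is omitted here.

    ```python
    import numpy as np

    def best_single_change_point(x):
        """Exhaustive ML search for one change in the mean of Gaussian data."""
        n = len(x)
        best_k, best_ll = None, -np.inf
        for k in range(2, n - 1):
            left, right = x[:k], x[k:]
            # Residual sum of squares with per-segment means, pooled variance.
            rss = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
            ll = -0.5 * n * np.log(rss / n + 1e-300)
            if ll > best_ll:
                best_k, best_ll = k, ll
        return best_k, best_ll
    ```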

  18. In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions

    NASA Astrophysics Data System (ADS)

    Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh

    2005-09-01

    This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.

  19. Cascaded image analysis for dynamic crack detection in material testing

    NASA Astrophysics Data System (ADS)

    Hampel, U.; Maas, H.-G.

    Concrete specimens in civil engineering material testing often show fissures or hairline cracks. These cracks develop dynamically. Starting at a width of a few microns, they usually cannot be detected visually or in an image from a camera viewing the whole specimen. Conventional image analysis techniques will detect fissures only if they show a width on the order of one pixel. To be able to detect and measure fissures with a width of a fraction of a pixel at an early stage of their development, a cascaded image analysis approach has been developed, implemented, and tested. The basic idea of the approach is to detect discontinuities in dense surface deformation vector fields. These deformation vector fields between consecutive stereo image pairs, generated by cross correlation or least squares matching, show a precision on the order of 1/50 pixel. Hairline cracks can be detected and measured by applying edge detection techniques such as a Sobel operator to the results of the image matching process. Cracks show up as linear discontinuities in the deformation vector field and can be vectorized by edge chaining. In practical tests of the method, cracks with a width of 1/20 pixel could be detected, and their width could be determined at a precision of 1/50 pixel.
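
    The discontinuity-detection step can be pictured with a few lines of array code: apply a Sobel operator to each component of the dense deformation field and flag pixels where the combined gradient magnitude jumps. This assumes the matching stage has already produced displacement components u and v as 2-D arrays; the threshold is a hypothetical placeholder.

    ```python
    import numpy as np
    from scipy import ndimage

    def crack_candidates(u, v, thresh=0.05):
        """Mask of linear discontinuities in a dense deformation field (u, v)."""
        gu = np.hypot(ndimage.sobel(u, axis=0), ndimage.sobel(u, axis=1))
        gv = np.hypot(ndimage.sobel(v, axis=0), ndimage.sobel(v, axis=1))
        return (gu + gv) > thresh   # candidate crack pixels for edge chaining
    ```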

  20. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.

    Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as well as the best standard image processing methods available, but with better control over the spurious fragments in the image.

  1. Microfluidic devices with thick-film electrochemical detection

    DOEpatents

    Wang, Joseph; Tian, Baomin; Sahlin, Eskil

    2005-04-12

    An apparatus for conducting a microfluidic process and analysis, including at least one elongated microfluidic channel, fluidic transport means for transport of fluids through the microfluidic channel, and at least one thick-film electrode in fluidic connection with the outlet end of the microfluidic channel. The present invention includes an integrated on-chip combination reaction, separation and thick-film electrochemical detection microsystem, for use in detection of a wide range of analytes, and methods for the use thereof.

  2. A Bernoulli Gaussian Watermark for Detecting Integrity Attacks in Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weerakkody, Sean; Ozel, Omur; Sinopoli, Bruno

    We examine the merit of Bernoulli packet drops in actively detecting integrity attacks on control systems. The aim is to detect an adversary who delivers fake sensor measurements to a system operator in order to conceal their effect on the plant. Physical watermarks, or noisy additive Gaussian inputs, have been previously used to detect several classes of integrity attacks in control systems. In this paper, we consider the analysis and design of Gaussian physical watermarks in the presence of packet drops at the control input. On one hand, this enables analysis in a more general network setting. On the other hand, we observe that in certain cases, Bernoulli packet drops can improve detection performance relative to a purely Gaussian watermark. This motivates the joint design of a Bernoulli-Gaussian watermark which incorporates both an additive Gaussian input and a Bernoulli drop process. We characterize the effect of such a watermark on system performance as well as attack detectability in two separate design scenarios. Here, we consider a correlation detector for attack recognition. We then propose efficiently solvable optimization problems to intelligently select parameters of the Gaussian input and the Bernoulli drop process while addressing security and performance trade-offs. Finally, we provide numerical results which illustrate that a watermark with packet drops can indeed outperform a Gaussian watermark.
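
    A toy simulation conveys the correlation-detector idea: the applied input is the Gaussian watermark thinned by the Bernoulli drop process, and honest residuals correlate with the delayed applied input while an attacker's fabricated measurements do not. The one-step gain, noise levels, and drop probability below are arbitrary stand-ins, not the paper's system model or optimized design.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T = 5000
    watermark = rng.normal(0.0, 1.0, T)     # Gaussian physical watermark
    delivered = rng.random(T) > 0.2         # Bernoulli drops: 20% of packets lost
    u = watermark * delivered               # input actually applied to the plant

    # Honest residuals reflect the applied input one step later (toy gain 0.8);
    # a fake-measurement attack severs this dependence.
    res_honest = 0.8 * np.roll(u, 1) + rng.normal(0.0, 0.1, T)
    res_attack = rng.normal(0.0, 0.1, T)

    def corr_stat(residual):
        return float(residual @ np.roll(u, 1)) / T

    print(corr_stat(res_honest))   # near 0.8 * E[u**2] under normal operation
    print(corr_stat(res_attack))   # near 0 under attack
    ```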

  3. Detection and classification of concealed weapons using a magnetometer-based portal

    NASA Astrophysics Data System (ADS)

    Kotter, Dale K.; Roybal, Lyle G.; Polk, Robert E.

    2002-08-01

    A concealed weapons detection technology was developed through the support of the National Institute of Justice (NIJ) to provide a nonintrusive means for rapid detection, location, and archiving of data (including visual) on potential suspects and weapon threats. This technology, developed by the Idaho National Engineering and Environmental Laboratory (INEEL), has been applied in a portal-style weapons detection system using passive magnetic sensors as its basis. This paper reports on enhancements to the weapon detection system that enable weapon classification and discriminate threats from non-threats. Advanced signal processing algorithms were used to analyze the magnetic spectrum generated when a person passes through a portal. These algorithms analyzed multiple variables, including variance in the magnetic signature from random weapon placement and/or orientation. They perform pattern recognition and calculate the probability that the collected magnetic signature correlates with a known database of weapon versus non-weapon responses. Neural networks were used to further discriminate weapon type and identify controlled electronic items such as cell phones and pagers. False alarms were further reduced by analyzing the magnetic detector response with a joint time-frequency analysis digital signal processing technique, from which the frequency components and power spectrum for a given sensor response were derived. This unique fingerprint provided additional information to aid in signal analysis. This technology has the potential to produce major improvements in weapon detection and classification.

  4. Determination of volatile organic compounds in human breath for Helicobacter pylori detection by SPME-GC/MS.

    PubMed

    Ulanowska, Agnieszka; Kowalkowski, Tomasz; Hrynkiewicz, Katarzyna; Jackowski, Marek; Buszewski, Bogusław

    2011-03-01

    Helicobacter pylori living in the human stomach release volatile organic compounds (VOCs) that can be detected in expired air. The aim of the study was to apply breath analysis for bacteria detection, accomplished by determining VOCs characteristic of patients with H. pylori and by analyzing gases released by the bacteria in suspension. Solid-phase microextraction was applied as a selective technique for preconcentration and isolation of analytes. Gas chromatography coupled with mass spectrometry was used for the separation and identification of volatile analytes in breath samples and bacterial headspace. For data calculation and processing, discriminant and factor analyses were used. Endogenous substances such as isobutane, 2-butanone and ethyl acetate were detected in the breath of persons with H. pylori in the stomach and in the gaseous mixture released by the bacterial strain, but they were not identified in the breath of healthy volunteers. The canonical analysis of discriminant functions showed a strong difference between the three examined groups. Knowledge of the substances emitted by H. pylori, combined with an optimized breath analysis method, might become a very useful tool for noninvasive detection of this bacterium. Copyright © 2010 John Wiley & Sons, Ltd.

  5. A Small Leak Detection Method Based on VMD Adaptive De-Noising and Ambiguity Correlation Classification Intended for Natural Gas Pipelines.

    PubMed

    Xiao, Qiyang; Li, Jian; Bai, Zhiliang; Sun, Jiedi; Zhou, Nan; Zeng, Zhoumo

    2016-12-13

    In this study, a small leak detection method based on variational mode decomposition (VMD) and ambiguity correlation classification (ACC) is proposed. The signals acquired from sensors were decomposed using the VMD, and numerous components were obtained. According to the probability density function (PDF), an adaptive de-noising algorithm based on VMD is proposed for noise component processing and de-noised component reconstruction. Furthermore, the ambiguity function image was employed for analysis of the reconstructed signals. Based on the correlation coefficient, ACC is proposed to detect small pipeline leaks. The analysis of pipeline leakage signals, using 1 mm and 2 mm leaks, has shown that the proposed detection method can detect a small leak accurately and effectively. Moreover, the experimental results have shown that the proposed method achieved better performance than the support vector machine (SVM) and back propagation neural network (BP) methods.

  6. A Small Leak Detection Method Based on VMD Adaptive De-Noising and Ambiguity Correlation Classification Intended for Natural Gas Pipelines

    PubMed Central

    Xiao, Qiyang; Li, Jian; Bai, Zhiliang; Sun, Jiedi; Zhou, Nan; Zeng, Zhoumo

    2016-01-01

    In this study, a small leak detection method based on variational mode decomposition (VMD) and ambiguity correlation classification (ACC) is proposed. The signals acquired from sensors were decomposed using the VMD, and numerous components were obtained. According to the probability density function (PDF), an adaptive de-noising algorithm based on VMD is proposed for noise component processing and de-noised component reconstruction. Furthermore, the ambiguity function image was employed for analysis of the reconstructed signals. Based on the correlation coefficient, ACC is proposed to detect small pipeline leaks. The analysis of pipeline leakage signals, using 1 mm and 2 mm leaks, has shown that the proposed detection method can detect a small leak accurately and effectively. Moreover, the experimental results have shown that the proposed method achieved better performance than the support vector machine (SVM) and back propagation neural network (BP) methods. PMID:27983577

  7. Method for phosphorothioate antisense DNA sequencing by capillary electrophoresis with UV detection.

    PubMed

    Froim, D; Hopkins, C E; Belenky, A; Cohen, A S

    1997-11-01

    The progress of antisense DNA therapy demands the development of reliable and convenient methods for sequencing short single-stranded oligonucleotides. A method of phosphorothioate antisense DNA sequencing analysis using UV detection coupled to capillary electrophoresis (CE) has been developed, based on a modified chain termination sequencing method. The proposed method reduces the sequencing cost since it uses affordable CE-UV instrumentation, requires no labeling, and involves minimal sample processing before analysis. Cycle sequencing with ThermoSequenase generates quantities of sequencing products that are readily detectable by UV. Discrimination of undesired components from sequencing products in the reaction mixture, previously accomplished by fluorescent or radioactive labeling, is now achieved by bringing concentrations of undesired components below the UV detection range, which yields a 'clean', well-defined sequence. UV detection coupled with CE offers additional conveniences for sequencing since it can be accomplished with commercially available CE-UV equipment and is readily amenable to automation.

  8. Method for phosphorothioate antisense DNA sequencing by capillary electrophoresis with UV detection.

    PubMed Central

    Froim, D; Hopkins, C E; Belenky, A; Cohen, A S

    1997-01-01

    The progress of antisense DNA therapy demands the development of reliable and convenient methods for sequencing short single-stranded oligonucleotides. A method of phosphorothioate antisense DNA sequencing analysis using UV detection coupled to capillary electrophoresis (CE) has been developed, based on a modified chain termination sequencing method. The proposed method reduces the sequencing cost since it uses affordable CE-UV instrumentation, requires no labeling, and involves minimal sample processing before analysis. Cycle sequencing with ThermoSequenase generates quantities of sequencing products that are readily detectable by UV. Discrimination of undesired components from sequencing products in the reaction mixture, previously accomplished by fluorescent or radioactive labeling, is now achieved by bringing concentrations of undesired components below the UV detection range, which yields a 'clean', well-defined sequence. UV detection coupled with CE offers additional conveniences for sequencing since it can be accomplished with commercially available CE-UV equipment and is readily amenable to automation. PMID:9336449

  9. Analysis of characteristics of Si in blast furnace pig iron and calibration methods in the detection by laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Mei, Yaguang; Cheng, Yuxin; Cheng, Shusen; Hao, Zhongqi; Guo, Lianbo; Li, Xiangyou; Zeng, Xiaoyan

    2017-10-01

    During the iron-making process in a blast furnace, the Si content of the liquid pig iron is commonly used to evaluate the quality of the liquid iron and the thermal state of the blast furnace, yet no effective method has been available for rapidly detecting the Si concentration of liquid iron. Laser-induced breakdown spectroscopy (LIBS) is an atomic emission spectrometry technology based on laser ablation. Its main advantage is enabling rapid, in-situ, online analysis of element concentrations in open air without sample pretreatment. The characteristics of Si in liquid iron were analyzed from the standpoints of thermodynamic theory and metallurgical technology, and the relationships between Si and C, Mn, S, P, and other alloy elements were revealed based on thermodynamic calculation. Subsequently, LIBS was applied to the rapid detection of Si in pig iron in this work. During the LIBS detection process, several groups of standard pig iron samples were employed to calibrate the Si content. Calibration methods including linear, quadratic, and cubic internal-standard calibration, multivariate linear calibration, and partial least squares (PLS) were compared with each other. It was revealed that PLS combined with normalization was the best calibration method for Si detection by LIBS.
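
    For the multivariate route, a scikit-learn sketch shows the shape of a PLS calibration mapping normalized spectra to Si content. The arrays here are random stand-ins for the standard-sample spectra and their certified Si values, and in practice the component count would be chosen by cross-validation.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import normalize

    # Stand-ins: rows are LIBS spectra of standard pig-iron samples,
    # y holds their certified Si contents (wt%).
    X = normalize(np.random.rand(20, 1024))
    y = np.random.rand(20)

    pls = PLSRegression(n_components=5)    # tune by cross-validation in practice
    pls.fit(X, y)
    si_pred = pls.predict(normalize(np.random.rand(3, 1024)))   # unknown samples
    ```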

  10. Exploring inter-frame correlation analysis and wavelet-domain modeling for real-time caption detection in streaming video

    NASA Astrophysics Data System (ADS)

    Li, Jia; Tian, Yonghong; Gao, Wen

    2008-01-01

    In recent years, the amount of streaming video on the Web has grown rapidly. Retrieving these streaming videos poses the challenge of indexing and analyzing the media in real time, because the streams must be treated as effectively infinite in length, which precludes offline processing. Generally speaking, captions are important semantic clues for video indexing and retrieval. However, existing caption detection methods often struggle to achieve real-time detection for streaming video, and few of them address the differentiation of captions from scene texts and scrolling texts, even though these texts play different roles in streaming video retrieval. To overcome these difficulties, this paper proposes a novel approach that exploits inter-frame correlation analysis and wavelet-domain modeling for real-time caption detection in streaming video. In our approach, the inter-frame correlation information is used to distinguish caption texts from scene texts and scrolling texts. Moreover, wavelet-domain Generalized Gaussian Models (GGMs) are utilized to automatically remove non-text regions from each frame and keep only caption regions for further processing. Experimental results show that our approach offers real-time caption detection with high recall and a low false alarm rate, and can effectively discern caption texts from the other texts even at low resolutions.

  11. In-process fault detection for textile fabric production: onloom imaging

    NASA Astrophysics Data System (ADS)

    Neumann, Florian; Holtermann, Timm; Schneider, Dorian; Kulczycki, Ashley; Gries, Thomas; Aach, Til

    2011-05-01

    Constant and traceable high fabric quality is of great importance both for technical and for high-quality conventional fabrics. Usually, quality inspection is carried out by trained personnel, whose detection rate and maximum period of concentration are limited. Low-resolution automated fabric inspection machines using texture analysis were developed, and since 2003, systems for in-process inspection on weaving machines ("onloom") have been commercially available. These systems can detect defects but cannot measure them quantitatively and precisely, and most are also prone to inevitable machine vibrations. Feedback loops for fault prevention are not established. Technology has evolved since 2003: camera and computer prices have dropped, resolutions have been enhanced, and recording speeds have increased. These are the preconditions for real-time processing of high-resolution images, yet so far these new technological achievements are not used in textile fabric production. For efficient use, a measurement system must be integrated into the weaving process, and new algorithms for defect detection and measurement must be developed. The goal of the joint project is the development of a modern machine vision system for nondestructive onloom fabric inspection. The system consists of a vibration-resistant machine integration, a high-resolution machine vision system, and new, reliable, and robust algorithms with a quality database for defect documentation. The system is meant to detect, measure, and classify at least 80% of economically relevant defects. Concepts for feedback loops into the weaving process will also be pointed out.

  12. Potato Operation: automatic detection of potato diseases

    NASA Astrophysics Data System (ADS)

    Lefebvre, Marc; Zimmerman, Thierry; Baur, Charles; Guegerli, Paul; Pun, Thierry

    1995-01-01

    The Potato Operation is a collaborative, multidisciplinary project in the domain of destructive testing of agricultural products. It aims at automating pulp sampling of potatoes in order to detect possible viral diseases, which can decrease field productivity by a factor of up to ten. A machine composed of three conveyor belts, a vision system, and a robotic arm, controlled by a PC, has been built. Potatoes are brought one by one from a bulk to the vision system, where they are seized by a rotating holding device. The sprouts, where the viral activity is maximal, are then detected by an active vision process operating on multiple views. The 3D coordinates of the sampling point are communicated to the robot arm holding a drill. Some flesh is sampled by the drill and deposited into an ELISA plate; after sampling, the robot arm washes the drill in order to prevent any contamination. The PC simultaneously controls these processes: the conveying of the potatoes, the vision algorithms, and the sampling procedure. The master process, the vision procedure, makes use of three methods to achieve sprout detection. A profile analysis first locates the sprouts as protuberances. Two frontal analyses, based respectively on fluorescence and local variance, confirm the previous detection and provide the 3D coordinates of the sampling zone. The other two processes work by interruption of the master process.

  13. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not previously been demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five-point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the same six pairs of modalities were significantly different, but the JAFROC confidence intervals were about 32% smaller than the ROC confidence intervals. This study shows that image processing has a significant impact on the detection of microcalcifications in digital mammograms. Objective measurements, such as those described here, should be used by manufacturers to select the optimal image processing algorithm.

  14. Muon detection studied by pulse-height energy analysis: Novel converter arrangements.

    PubMed

    Holmlid, Leif; Olafsson, Sveinn

    2015-08-01

    Muons are conventionally measured by a plastic scintillator-photomultiplier detector. Muons from processes in ultra-dense hydrogen H(0) are detected here by a novel type of converter in front of a photomultiplier. The muon detection yield can be increased relative to that observed with a plastic scintillator by at least a factor of 100, using a converter of metal, semiconductor (Ge), or glass for interaction with the muons penetrating through the metal housing of the detector. This detection process is due to transient formation of excited nuclei by the well-known process of muon capture, giving beta decay. The main experimental results shown here are in the form of beta electron energy spectra detected directly by the photomultiplier. Events which give a high-energy tail in the energy spectra are probably due to gamma photons from the muons. Sharp and intense x-ray peaks from a muonic aluminium converter or housing material are observed. The detection conversion in glass and Ge converters has a time constant of the order of many minutes to reach the final conversion level, while the process in metal converters is stabilized faster. The time constants are not due to lifetimes of the excited nuclei or neutrons but are due to internal charging in the insulating converter material. Interaction of this charging with the high voltage in the photomultiplier is observed.

  15. Muon detection studied by pulse-height energy analysis: Novel converter arrangements

    NASA Astrophysics Data System (ADS)

    Holmlid, Leif; Olafsson, Sveinn

    2015-08-01

    Muons are conventionally measured by a plastic scintillator-photomultiplier detector. Muons from processes in ultra-dense hydrogen H(0) are detected here by a novel type of converter in front of a photomultiplier. The muon detection yield can be increased relative to that observed with a plastic scintillator by at least a factor of 100, using a converter of metal, semiconductor (Ge), or glass for interaction with the muons penetrating through the metal housing of the detector. This detection process is due to transient formation of excited nuclei by the well-known process of muon capture, giving beta decay. The main experimental results shown here are in the form of beta electron energy spectra detected directly by the photomultiplier. Events which give a high-energy tail in the energy spectra are probably due to gamma photons from the muons. Sharp and intense x-ray peaks from a muonic aluminium converter or housing material are observed. The detection conversion in glass and Ge converters has a time constant of the order of many minutes to reach the final conversion level, while the process in metal converters is stabilized faster. The time constants are not due to lifetimes of the excited nuclei or neutrons but are due to internal charging in the insulating converter material. Interaction of this charging with the high voltage in the photomultiplier is observed.

  16. Muon detection studied by pulse-height energy analysis: Novel converter arrangements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmlid, Leif, E-mail: holmlid@chem.gu.se; Olafsson, Sveinn

    2015-08-15

    Muons are conventionally measured by a plastic scintillator–photomultiplier detector. Muons from processes in ultra-dense hydrogen H(0) are detected here by a novel type of converter in front of a photomultiplier. The muon detection yield can be increased relative to that observed with a plastic scintillator by at least a factor of 100, using a converter of metal, semiconductor (Ge), or glass for interaction with the muons penetrating through the metal housing of the detector. This detection process is due to transient formation of excited nuclei by the well-known process of muon capture, giving beta decay. The main experimental results shown here are in the form of beta electron energy spectra detected directly by the photomultiplier. Events which give a high-energy tail in the energy spectra are probably due to gamma photons from the muons. Sharp and intense x-ray peaks from a muonic aluminium converter or housing material are observed. The detection conversion in glass and Ge converters has a time constant of the order of many minutes to reach the final conversion level, while the process in metal converters is stabilized faster. The time constants are not due to lifetimes of the excited nuclei or neutrons but are due to internal charging in the insulating converter material. Interaction of this charging with the high voltage in the photomultiplier is observed.

  17. Widely tunable quantum cascade lasers for spectroscopic sensing

    NASA Astrophysics Data System (ADS)

    Wagner, J.; Ostendorf, R.; Grahmann, J.; Merten, A.; Hugger, S.; Jarvis, J.-P.; Fuchs, F.; Boskovic, D.; Schenk, H.

    2015-01-01

    In this paper, recent advances in broadband-tunable mid-infrared (MIR) external-cavity quantum cascade laser (EC-QCL) technology are reported, as well as their use in spectroscopic process analysis and imaging stand-off detection of hazardous substances such as explosives and related precursors. First results are presented on a rapid-scan EC-QCL employing a custom-made MOEMS scanning grating in Littrow configuration as the wavelength-selective optical feedback element. In this way, a scanning rate of 1 kHz was achieved, which corresponds to 2000 full wavelength scans per second. Furthermore, exemplary case studies of EC-QCL-based MIR spectroscopy are presented. These include time-resolved analysis of catalytic reactions in chemical process control, as well as imaging backscattering spectroscopy for the detection of residues of explosives and related precursors in a relevant environment.

  18. Detecting misinformation and knowledge conflicts in relational data

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Jackobsen, Matthew; Riordan, Brian

    2014-06-01

    Information fusion is required for many mission-critical intelligence analysis tasks. Using knowledge extracted from various sources, including entities, relations, and events, intelligence analysts respond to commanders' information requests, integrate facts into summaries of current situations, augment existing knowledge with inferred information, make predictions about the future, and develop action plans. However, information fusion solutions often fail because of conflicting and redundant knowledge contained in multiple sources. Most knowledge conflicts in the past were due to translation errors and reporter bias, and thus could be managed. Current and future intelligence analysis, especially in denied areas, must deal with open-source data processing, where there is a much greater presence of intentional misinformation. In this paper, we describe a model for detecting conflicts in multi-source textual knowledge. Our model is based on constructing semantic graphs representing patterns of multi-source knowledge conflicts and anomalies, and detecting these conflicts by matching pattern graphs against a data graph constructed using soft co-reference between entities and events in multiple sources. The conflict detection process maintains the uncertainty throughout all phases, providing full traceability and enabling incremental updates of the detection results as new knowledge or modifications to previously analyzed information are obtained. Detected conflicts are presented to analysts for further investigation. In an experimental study with the SYNCOIN dataset, our algorithms achieved perfect conflict detection in the ideal situation (no missing data) while producing 82% recall and 90% precision in a realistic noise situation (15% missing attributes).

  19. Smartphone Cortex Controlled Real-Time Image Processing and Reprocessing for Concentration Independent LED Induced Fluorescence Detection in Capillary Electrophoresis.

    PubMed

    Szarka, Mate; Guttman, Andras

    2017-10-17

    We present the application of a smartphone-anatomy-based technology in the field of liquid-phase bioseparations, particularly capillary electrophoresis. A simple capillary electrophoresis system was built with LED-induced fluorescence detection and a credit-card-sized minicomputer to prove the concept of a real-time fluorescence imaging (zone-adjustable time-lapse fluorescence image processor) and separation controller. The system was evaluated by analyzing under- and overloaded aminopyrenetrisulfonate (APTS)-labeled oligosaccharide samples. The open-source-software-based image processing tool allowed undistorted signal modulation (reprocessing) when the signal was unsuitable for the detection system settings (too low or too high). This novel smart detection tool for fluorescently labeled biomolecules greatly expands the dynamic range and enables retrospective correction of injections with unsuitable signal levels without the need to repeat the analysis.

  20. Indirect competitive assays on DVD for direct multiplex detection of drugs of abuse in oral fluids.

    PubMed

    Zhang, Lingling; Li, Xiaochun; Li, Yunchao; Shi, Xiaoli; Yu, Hua-Zhong

    2015-02-03

    On-site oral fluid testing for drugs of abuse has become prominent as a means of taking immediate administrative action in an enforcement process. Herein, we report a DVD technology-based indirect competitive immunoassay platform for the quantitative detection of drugs of abuse. A microfluidic approach was adapted to prepare multiplex immunoassays on a standard DVD-R, with an unmodified multimode DVD/Blu-ray drive to read the signal and a free disc-quality analysis software program to process the data. The DVD assay platform was successfully demonstrated for the simultaneous, quantitative detection of drug candidates (morphine and cocaine) in oral fluids with high selectivity. The detection limits achieved were as low as 1.0 ppb for morphine and 5.0 ppb for cocaine, comparable with those of standard mass spectrometry and ELISA methods.

  1. 'Known Secure Sensor Measurements' for Critical Infrastructure Systems: Detecting Falsification of System State

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miles McQueen; Annarita Giani

    2011-09-01

    This paper describes a first investigation of a low-cost, low-false-alarm, reliable mechanism for detecting manipulation of critical physical processes and falsification of system state. We call this novel mechanism Known Secure Sensor Measurements (KSSM). The method moves beyond analysis of network traffic and host-based state information; instead, it uses physical measurements of the process being controlled to detect falsification of state. KSSM is intended to be incorporated into the design of new, resilient, cost-effective critical infrastructure control systems, and it can also be included in incremental upgrades of already installed systems for enhanced resilience. KSSM is based on known secure physical measurements for assessing the likelihood of an attack and demonstrates a practical approach to creating, transmitting, and using the known secure measurements for detection.

  2. Network and biosignature analysis for the integration of transcriptomic and metabolomic data to characterize leaf senescence process in sunflower.

    PubMed

    Moschen, Sebastián; Higgins, Janet; Di Rienzo, Julio A; Heinz, Ruth A; Paniego, Norma; Fernandez, Paula

    2016-06-06

    In recent years, high-throughput technologies have led to an increase in datasets from omics disciplines, enabling the understanding of the complex regulatory networks associated with biological processes. Leaf senescence is a complex mechanism controlled by multiple genetic and environmental variables, and it has a strong impact on crop yield. Transcription factors (TFs) are key proteins in the regulation of gene expression, regulating different signaling pathways; their function is crucial for triggering and/or regulating different aspects of the leaf senescence process. The study of TF interactions and their integration with metabolic profiles under different developmental conditions, especially for a non-model organism such as sunflower, will provide new insights into the details of gene regulation of leaf senescence. Weighted Gene Correlation Network Analysis (WGCNA) and BioSignature Discoverer (BioSD, Gnosis Data Analysis, Heraklion, Greece) were used to integrate transcriptomic and metabolomic data. WGCNA allowed the detection of 10 metabolites and 13 TFs, whereas BioSD allowed the detection of 1 metabolite and 6 TFs as potential biomarkers. The comparative analysis demonstrated that three transcription factors were detected through both methodologies, highlighting them as potentially robust biomarkers associated with leaf senescence in sunflower. The complementary use of network and BioSignature Discoverer analysis of transcriptomic and metabolomic data provided a useful tool for identifying candidate genes and metabolites which may have a role during the triggering and development of the leaf senescence process. The WGCNA tool allowed us to design and test a hypothetical network in order to infer relationships across selected transcription factor and metabolite candidate biomarkers involved in leaf senescence, whereas BioSignature Discoverer selected transcripts and metabolites which discriminate between different ages of sunflower plants. The methodology presented here would help to elucidate and predict novel networks and potential biomarkers of leaf senescence in sunflower.

  3. Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis

    NASA Astrophysics Data System (ADS)

    Che, E.; Olsen, M. J.

    2017-09-01

    Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common post-processing procedure that groups the point cloud into a number of clusters to simplify the data for the subsequent modelling and analysis needed in most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing the normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern used during acquisition of TLS data by most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimating the normal at each point, which limits the propagation of normal estimation errors into the segmentation. Both an indoor and an outdoor scene are used in experiments to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
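
    The normal variation idea maps cleanly onto the gridded scan: compare each cell's normal with its row and column neighbours, mark sharp deviations as edges, and group the remaining smooth cells. The sketch below substitutes connected-component labelling for the paper's modified region growing and assumes precomputed unit normals on the scan grid; the angle threshold is an assumption.

    ```python
    import numpy as np
    from scipy import ndimage

    def segment_gridded_normals(normals, angle_thresh_deg=10.0):
        """Edge/segment a gridded scan from per-cell unit normals, shape (H, W, 3)."""
        cos_t = np.cos(np.deg2rad(angle_thresh_deg))
        dot_r = np.sum(normals[:, 1:] * normals[:, :-1], axis=-1)   # row neighbours
        dot_c = np.sum(normals[1:, :] * normals[:-1, :], axis=-1)   # column neighbours
        edge = np.zeros(normals.shape[:2], dtype=bool)
        edge[:, 1:] |= dot_r < cos_t
        edge[1:, :] |= dot_c < cos_t
        labels, n_segments = ndimage.label(~edge)   # group smooth cells into segments
        return labels, n_segments
    ```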

  4. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials.

    PubMed

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typically complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated from responses evoked on the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, and the results were compared to event detection, latency, and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, the algorithm outperformed Trained Eye Analysis for predicting signal amplitude, although it produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when the signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable, and convenient approach to quantifying latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
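
    In essence, the approach slides a normalized template along the record and reads events off the correlation trace. The minimal sketch below returns the best-matching latency, its similarity score, and a template-projection magnitude for a single event; the published algorithm handles multiple dispersed events, and all names here are hypothetical.

    ```python
    import numpy as np

    def waveform_similarity(signal, template):
        """Cross-correlation template comparison: latency, similarity, magnitude."""
        t = template - template.mean()
        t /= np.linalg.norm(t) + 1e-12
        n = len(t)
        scores = np.empty(len(signal) - n + 1)
        for i in range(len(scores)):
            w = signal[i:i + n] - signal[i:i + n].mean()
            scores[i] = t @ w / (np.linalg.norm(w) + 1e-12)
        lag = int(np.argmax(scores))
        seg = signal[lag:lag + n]
        magnitude = t @ (seg - seg.mean())     # event size via template projection
        return lag, scores[lag], magnitude
    ```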

  5. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials

    PubMed Central

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typically complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated from responses evoked on the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, and the results were compared to event detection, latency, and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, the algorithm outperformed Trained Eye Analysis for predicting signal amplitude, although it produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when the signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable, and convenient approach to quantifying latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies. PMID:26325291

  6. RNA-Seq analysis reveals new evidence for inflammation-related changes in aged kidney

    PubMed Central

    Park, Daeui; Kim, Byoung-Chul; Kim, Chul-Hong; Choi, Yeon Ja; Jeong, Hyoung Oh; Kim, Mi Eun; Lee, Jun Sik; Park, Min Hi; Chung, Ki Wung; Kim, Dae Hyun; Lee, Jaewon; Im, Dong-Soon; Yoon, Seokjoo; Lee, Sunghoon; Yu, Byung Pal; Bhak, Jong; Chung, Hae Young

    2016-01-01

    Age-related dysregulated inflammation plays an essential role as a major risk factor underlying the pathophysiological aging process. To better understand how inflammatory processes are related to aging at the molecular level, we sequenced the transcriptome of young and aged rat kidney using RNA-Seq to detect known genes, novel genes, and alternative splicing events that are differentially expressed. By comparing young (6 months of age) and old (25 months of age) rats, we detected 722 up-regulated genes and 111 down-regulated genes. In the aged rats, we found 32 novel genes and 107 alternatively spliced genes. Notably, 6.6% of the up-regulated genes were related to inflammation (P < 2.2 × 10−16, Fisher exact t-test); 15.6% were novel genes with functional protein domains (P = 1.4 × 10−5); and 6.5% were genes showing alternative splicing events (P = 3.3 × 10−4). Based on the results of pathway analysis, we detected the involvement of inflammation-related pathways such as cytokines (P = 4.4 × 10−16), which were found up-regulated in the aged rats. Furthermore, an up-regulated inflammatory gene analysis identified the involvement of transcription factors, such as STAT4, EGR1, and FOSL1, which regulate cancer as well as inflammation in aging processes. Thus, RNA changes in these pathways support their involvement in the pro-inflammatory status during aging. We propose that whole RNA-Seq is a useful tool to identify novel genes and alternative splicing events by documenting broadly implicated inflammation-related genes involved in aging processes. PMID:27153548

  7. The Researches on Damage Detection Method for Truss Structures

    NASA Astrophysics Data System (ADS)

    Wang, Meng Hong; Cao, Xiao Nan

    2018-06-01

    This paper presents an effective method to detect damage in truss structures. Numerical simulation and experimental analysis were carried out on a damaged truss structure under instantaneous excitation. The ideal excitation point and appropriate hammering method were determined to extract time domain signals under two working conditions. The frequency response function and principal component analysis were used for data processing, and the angle between the frequency response function vectors was selected as a damage index to ascertain the location of a damaged bar in the truss structure. In the numerical simulation, the time domain signal of all nodes was extracted to determine the location of the damaged bar. In the experimental analysis, the time domain signal of a portion of the nodes was extracted on the basis of an optimal sensor placement method based on the node strain energy coefficient. The results of the numerical simulation and experimental analysis showed that the damage detection method based on the frequency response function and principal component analysis could locate the damaged bar accurately.
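
    In generic form, the damage index described here is the angle between frequency response function (FRF) vectors estimated for the reference and the possibly damaged states; the H1 estimator and the scipy routines below are illustrative assumptions, not the authors' exact procedure:

        import numpy as np
        from scipy.signal import csd, welch

        def frf(excitation, response, fs):
            """H1 estimator of the frequency response function."""
            f, Pxy = csd(excitation, response, fs=fs, nperseg=1024)
            _, Pxx = welch(excitation, fs=fs, nperseg=1024)
            return f, Pxy / Pxx

        def frf_angle_index(H_ref, H_test):
            """Angle (radians) between two FRF magnitude vectors; a larger
            angle indicates a larger change, i.e. possible damage."""
            a, b = np.abs(H_ref), np.abs(H_test)
            cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return np.arccos(np.clip(cos, -1.0, 1.0))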

  8. Bandwidth and Detection of Packet Length Covert Channels

    DTIC Science & Technology

    2011-03-01

    Shared Resource Matrix (SRM): Develop a matrix of all resources on one side and on the other all the processes. Then, determine which process uses which...system calls. This method is similar to that of the SRM. Covert channels have also been created by modulating packet timing, data and headers of network...analysis, noninterference analysis, SRM method, and the covert flow tree method [4]. These methods can be used during the design phase of a system. Less

  9. Protein arginine methylation: Cellular functions and methods of analysis.

    PubMed

    Pahlich, Steffen; Zakaryan, Rouzanna P; Gehring, Heinz

    2006-12-01

    During the last few years, new members of the growing family of protein arginine methyltransferases (PRMTs) have been identified and the role of arginine methylation in manifold cellular processes like signaling, RNA processing, transcription, and subcellular transport has been extensively investigated. In this review, we describe recent methods and findings that have yielded new insights into the cellular functions of arginine-methylated proteins, and we evaluate the currently used procedures for the detection and analysis of arginine methylation.

  10. Experience with dynamic reinforcement rates decreases resistance to extinction.

    PubMed

    Craig, Andrew R; Shahan, Timothy A

    2016-03-01

    The ability of organisms to detect reinforcer-rate changes in choice preparations is positively related to two factors: the magnitude of the change in rate and the frequency with which rates change. Gallistel (2012) suggested similar rate-detection processes are responsible for decreases in responding during operant extinction. Although effects of magnitude of change in reinforcer rate on resistance to extinction are well known (e.g., the partial-reinforcement-extinction effect), effects of frequency of changes in rate prior to extinction are unknown. Thus, the present experiments examined whether frequency of changes in baseline reinforcer rates impacts resistance to extinction. Pigeons pecked keys for variable-interval food under conditions where reinforcer rates were stable and where they changed within and between sessions. Overall reinforcer rates between conditions were controlled. In Experiment 1, resistance to extinction was lower following exposure to dynamic reinforcement schedules than to static schedules. Experiment 2 showed that resistance to presession feeding, a disruptor that should not involve change-detection processes, was unaffected by baseline-schedule dynamics. These findings are consistent with the suggestion that change detection contributes to extinction. We discuss implications of change-detection processes for extinction of simple and discriminated operant behavior and relate these processes to the behavioral momentum-based approach to understanding extinction. © 2016 Society for the Experimental Analysis of Behavior.

  11. Visual communications and image processing '92; Proceedings of the Meeting, Boston, MA, Nov. 18-20, 1992

    NASA Astrophysics Data System (ADS)

    Maragos, Petros

    The topics discussed at the conference include hierarchical image coding, motion analysis, feature extraction and image restoration, video coding, and morphological and related nonlinear filtering. Attention is also given to vector quantization, morphological image processing, fractals and wavelets, architectures for image and video processing, image segmentation, biomedical image processing, and model-based analysis. Papers are presented on affine models for motion and shape recovery, filters for directly detecting surface orientation in an image, tracking of unresolved targets in infrared imagery using a projection-based method, adaptive-neighborhood image processing, and regularized multichannel restoration of color images using cross-validation. (For individual items see A93-20945 to A93-20951)

  12. Detection of EEG-patterns associated with real and imaginary movements using detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Pavlov, Alexey N.; Runnova, Anastasiya E.; Maksimenko, Vladimir A.; Grishina, Daria S.; Hramov, Alexander E.

    2018-02-01

    Authentic recognition of specific patterns of electroencephalograms (EEGs) associated with real and imaginary movements is an important stage for the development of brain-computer interfaces. In experiments with untrained participants, the ability to detect motor-related brain activity based on multichannel EEG processing is demonstrated. Using detrended fluctuation analysis, changes in the EEG patterns during the imagination of hand movements are reported. It is discussed how the ability to recognize brain activity related to motor executions depends on the electrode position.
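
    Detrended fluctuation analysis itself is compact enough to sketch; this generic implementation (not the authors' code) estimates the scaling exponent from the log-log slope of fluctuation versus window size:

        import numpy as np

        def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
            """DFA: integrate the signal, split the profile into windows,
            remove a linear trend per window, and fit the slope of
            log F(n) versus log n."""
            y = np.cumsum(x - np.mean(x))          # integrated profile
            F = []
            for n in scales:
                m = len(y) // n
                windows = y[:m * n].reshape(m, n)
                t = np.arange(n)
                f2 = [np.mean((w - np.polyval(np.polyfit(t, w, 1), t)) ** 2)
                      for w in windows]
                F.append(np.sqrt(np.mean(f2)))
            alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
            return alpha   # ~0.5 white noise, ~1.0 1/f noise, ~1.5 Brownian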

  13. Space Shuttle solid rocket motor exposure monitoring

    NASA Technical Reports Server (NTRS)

    Brown, S. W.

    1993-01-01

    During the processing of Space Shuttle Solid Rocket Booster (SRB) segments at the Kennedy Space Center, an odor was detected around the solid propellant. An Industrial Hygiene survey was conducted to determine the chemical identity of the SRB offgassing constituents. Air samples were collected inside a forward SRB segment and analyzed to determine chemical composition. Specific chemical analysis for suspected offgassing constituents of the propellant indicated ammonia to be present. A gas chromatography/mass spectrometry (GC/MS) analysis of the air samples detected numerous high molecular weight hydrocarbons.

  14. Hydrocyclone/Filter for Concentrating Biomarkers from Soil

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian; Obenhuber, Donald

    2008-01-01

    The hydrocyclone-filtration extractor (HFE), now undergoing development, is a simple, robust apparatus for processing large amounts of soil to extract trace amounts of microorganisms, soluble organic compounds, and other biomarkers from soil and to concentrate the extracts in amounts sufficient to enable such traditional assays as cell culturing, deoxyribonucleic acid (DNA) analysis, and isotope analysis. Originally intended for incorporation into a suite of instruments for detecting signs of life on Mars, the HFE could also be used on Earth for similar purposes, including detecting trace amounts of biomarkers or chemical wastes in soils.

  15. Sequential Ideal-Observer Analysis of Visual Discriminations.

    ERIC Educational Resources Information Center

    Geisler, Wilson S.

    1989-01-01

    A new analysis, based on the concept of the ideal observer in signal detection theory, is described. It allows tracing of the flow of discrimination information through the initial physiological stages of visual processing for arbitrary spatio-chromatic stimuli, as well as measurement of the information content of those stimuli. (TJH)

  16. Early-stage detection of VE-cadherin during endothelial differentiation of human mesenchymal stem cells using SPR biosensor.

    PubMed

    Fathi, Farzaneh; Rezabakhsh, Aysa; Rahbarghazi, Reza; Rashidi, Mohammad-Reza

    2017-10-15

    Surface plasmon resonance (SPR) biosensors are most commonly applied for real-time dynamic analysis and measurement of interactions in bio-molecular studies and cell-surface analysis without the need for labeling processes. Until now, SPR has been underused in stem cell biology and biomedical sciences. Herein, a very simple and sensitive method was developed to evaluate human mesenchymal stem cell trans-differentiation to the endothelial lineage over a period of 14 days, based on the VE-cadherin biomarker. The SPR signals increased with the amount of VE-cadherin expression on the cell surface during the cell differentiation process. The method was able to detect ≈27 cells per mm². No significant effect was observed on cell viability during cell attachment to the surface of the immune-reactive biochips or during the SPR analysis. Using this highly sensitive SPR method, it was possible to sense the early stage of endothelial differentiation on day 3 in label-free form, whereas flow cytometry and fluorescent microscopy methods were unable to detect the cell differentiation at the same time point. Therefore, the proposed method can rapidly and accurately detect cell differentiation in live cells in a label-free manner without any need for cell breakage, and has great potential for both diagnostic and experimental approaches. Copyright © 2017. Published by Elsevier B.V.

  17. An application of cluster detection to scene analysis

    NASA Technical Reports Server (NTRS)

    Rosenfeld, A. H.; Lee, Y. H.

    1971-01-01

    Certain arrangements of local features in a scene tend to group together and to be seen as units. It is suggested that in some instances, this phenomenon might be interpretable as a process of cluster detection in a graph-structured space derived from the scene. This idea is illustrated using a class of scenes that contain only horizontal and vertical line segments.

  18. The analysis of the pilot's cognitive and decision processes

    NASA Technical Reports Server (NTRS)

    Curry, R. E.

    1975-01-01

    Articles are presented on pilot performance in zero-visibility precision approach, failure detection by pilots during automatic landing, experiments in pilot decision-making during simulated low visibility approaches, a multinomial maximum likelihood program, and a random search algorithm for laboratory computers. Other topics discussed include detection of system failures in multi-axis tasks and changes in pilot workload during an instrument landing.

  19. OPAD data analysis

    NASA Astrophysics Data System (ADS)

    Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.

    1993-06-01

    Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.

  20. The detectability half-life in predator-prey research: what it is, why we need it, how to measure it, and what it’s good for

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. However, gut-content assays produce qualitative results, necessitating care in using them to infer the impact of predators on prey populations. In order for gut-content assays to ...

  1. Robust approach to ocular fundus image analysis

    NASA Astrophysics Data System (ADS)

    Tascini, Guido; Passerini, Giorgio; Puliti, Paolo; Zingaretti, Primo

    1993-07-01

    The analysis of morphological and structural modifications of retinal blood vessels plays an important role both in establishing the presence of systemic diseases such as hypertension and diabetes and in studying their course. The paper describes a robust set of techniques developed to quantitatively evaluate morphometric aspects of the ocular fundus vascular and microvascular network. The following are defined: (1) the concept of 'Local Direction of a vessel' (LD); (2) a special form of edge detection, named Signed Edge Detection (SED), which uses LD to choose the convolution kernel in the edge detection process and is able to distinguish between the left and the right vessel edge; (3) an iterative tracking (IT) method. The developed techniques use both LD and SED intensively in: (a) the automatic detection of the number, position and size of blood vessels departing from the optical papilla; (b) the tracking of the body and edges of the vessels; (c) the recognition of vessel branches and crossings; (d) the extraction of a set of features such as blood vessel length and average diameter, artery and arteriole tortuosity, and crossing position and angle between two vessels. The algorithms, implemented in the C language, have an execution time that depends on the complexity of the currently processed vascular network.

  2. Multi-instrument observations of sub-minute quasi-periodic pulsations in solar flares

    NASA Astrophysics Data System (ADS)

    Dominique, Marie; Zhukov, Andrei; Dolla, Laurent

    2017-08-01

    For about a decade, quasi-periodic pulsations (QPPs) have regularly been reported in EUV and SXR observations of solar flares, whereas previously they were mostly observed at HXR and radio wavelengths. These new detections can be credited to a new generation of EUV space radiometers (SDO/EVE, PROBA2/LYRA, etc.) that significantly enhanced instrument performance in terms of signal-to-noise ratio and time resolution. These new instruments allow us to perform statistical analyses of QPPs, which could ultimately help solve the long-debated question of their origin. Recently, however, the methods used to detect QPPs in those wavelengths (mainly the way data are pre-processed and noise is accounted for) have been questioned. In this presentation, we will discuss our current understanding of QPPs and the difficulties inherent to their detection. We will particularly address sub-minute QPPs in the EUV and analyze them in the broader picture of multi-wavelength detection. How do they compare to the pulsations observed in other wavelength ranges? Are sub-minute QPPs and QPPs with longer periods produced by the same processes? What can we learn from the analysis of QPPs? Possible answers to these questions will be presented and discussed.

  3. Rapid Target Detection in High Resolution Remote Sensing Images Using Yolo Model

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Chen, X.; Gao, Y.; Li, Y.

    2018-04-01

    Object detection in high resolution remote sensing images is a fundamental and challenging problem in the field of remote sensing imagery analysis for civil and military applications, due to complex neighboring environments that can cause recognition algorithms to mistake irrelevant ground objects for target objects. The Deep Convolutional Neural Network (DCNN) is a hotspot in object detection owing to its powerful feature extraction, and has achieved state-of-the-art results in computer vision. The common pipeline of DCNN-based object detection consists of region proposal, CNN feature extraction, region classification and post-processing. The YOLO model frames object detection as a regression problem, using a single CNN to predict bounding boxes and class probabilities in an end-to-end way, which makes prediction faster. In this paper, a YOLO-based model is used for object detection in high resolution remote sensing images. Experiments on the NWPU VHR-10 dataset and our airport/airplane dataset gathered from Google Earth show that, compared with the common pipeline, the proposed model speeds up the detection process while achieving good accuracy.

  4. Addressing multi-label imbalance problem of surgical tool detection using CNN.

    PubMed

    Sahu, Manish; Mukhopadhyay, Anirban; Szengel, Angelika; Zachow, Stefan

    2017-06-01

    A fully automated surgical tool detection framework is proposed for endoscopic video streams. State-of-the-art surgical tool detection methods rely on supervised one-vs-all or multi-class classification techniques, completely ignoring the co-occurrence relationship of the tools and the associated class imbalance. In this paper, we formulate tool detection as a multi-label classification task where tool co-occurrences are treated as separate classes. In addition, imbalance on tool co-occurrences is analyzed and stratification techniques are employed to address the imbalance during convolutional neural network (CNN) training. Moreover, temporal smoothing is introduced as an online post-processing step to enhance runtime prediction. Quantitative analysis is performed on the M2CAI16 tool detection dataset to highlight the importance of stratification, temporal smoothing and the overall framework for tool detection. The analysis on tool imbalance, backed by the empirical results, indicates the need and superiority of the proposed framework over state-of-the-art techniques.
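
    The online temporal-smoothing step lends itself to a compact generic sketch: per-frame multi-label probabilities are averaged over a short causal window before thresholding. The window length and threshold are assumed values, not those of the paper:

        import numpy as np
        from collections import deque

        class TemporalSmoother:
            """Causal moving average over per-frame multi-label probabilities."""
            def __init__(self, window=5, threshold=0.5):
                self.buf = deque(maxlen=window)
                self.threshold = threshold

            def update(self, frame_probs):
                """frame_probs: per-tool probabilities for the current frame.
                Returns a boolean presence vector after smoothing."""
                self.buf.append(np.asarray(frame_probs, dtype=float))
                return np.mean(self.buf, axis=0) > self.threshold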

  5. Performance analysis of a multispectral system for mine detection in the littoral zone

    NASA Astrophysics Data System (ADS)

    Hargrove, John T.; Louchard, Eric

    2004-09-01

    Science & Technology International (STI) has developed, under contract with the Office of Naval Research, a system of multispectral airborne sensors and processing algorithms capable of detecting mine-like objects in the surf zone. STI has used this system to detect mine-like objects in a littoral environment as part of blind tests at Kaneohe Marine Corps Base, Hawaii, and Panama City, Florida. The airborne and ground subsystems are described. The detection algorithm is graphically illustrated. We report on the performance of the system configured to operate without a human in the loop. A subsurface mine detection capability (bottom-proud mines underwater in the surf zone and moored mines in shallow water) is demonstrated in the surf zone, and in shallow water with wave spillage and foam. Our analysis demonstrates that this STI-developed multispectral airborne mine detection system provides a technical foundation for a viable mine countermeasures system for use prior to an amphibious assault.

  6. Food Forensics: Using Mass Spectrometry To Detect Foodborne Protein Contaminants, as Exemplified by Shiga Toxin Variants and Prion Strains.

    PubMed

    Silva, Christopher J

    2018-06-13

    Food forensicists need a variety of tools to detect the many possible food contaminants. As a result of its analytical flexibility, mass spectrometry is one of those tools. The multiple reaction monitoring (MRM) method extends its use to quantitation as well as detection of infectious proteins (prions) and protein toxins, such as Shiga toxins. The sample processing steps inactivate prions and Shiga toxins; the proteins are digested with proteases to yield peptides suitable for MRM-based analysis. Prions are detected by their distinct physicochemical properties and differential covalent modification. Shiga toxin analysis is based on detecting peptides derived from the five identical binding B subunits comprising the toxin. 15N-labeled internal standards are prepared from cloned proteins. These examples illustrate the power of MRM, in that the same instrument can be used to safely detect and quantitate protein toxins, prions, and small molecules that might contaminate our food.

  7. Effects of sample injection amount and time-of-flight mass spectrometric detection dynamic range on metabolome analysis by high-performance chemical isotope labeling LC-MS.

    PubMed

    Zhou, Ruokun; Li, Liang

    2015-04-06

    The effect of sample injection amount on metabolome analysis in a chemical isotope labeling (CIL) liquid chromatography-mass spectrometry (LC-MS) platform was investigated. The performance of time-of-flight (TOF) mass spectrometers with and without a high-dynamic-range (HD) detection system was compared in the analysis of (12)C2/(13)C2-dansyl labeled human urine samples. An average of 1635 ± 21 (n = 3) peak pairs or putative metabolites was detected using the HD-TOF-MS, compared to 1429 ± 37 peak pairs from a conventional or non-HD TOF-MS. In both instruments, signal saturation was observed. However, in the HD-TOF-MS, signal saturation was mainly caused by the ionization process, while in the non-HD TOF-MS, it was caused by the detection process. To extend the MS detection range in the non-HD TOF-MS, an automated switch from using (12)C to (13)C-natural abundance peaks for peak ratio calculation when the (12)C peaks are saturated has been implemented in IsoMS, a software tool for processing CIL LC-MS data. This work illustrates that injecting an optimal sample amount is important to maximize the metabolome coverage while avoiding the sample carryover problem often associated with over-injection. A TOF mass spectrometer with an enhanced detection dynamic range can also significantly increase the number of peak pairs detected. In chemical isotope labeling (CIL) LC-MS, relative metabolite quantification is done by measuring the peak ratio of a (13)C2-/(12)C2-labeled peak pair for a given metabolite present in two comparative samples. The dynamic range of peak ratio measurement does not need to be very large, as only subtle changes of metabolite concentrations are encountered in most metabolomic studies where relative metabolome quantification of different groups of samples is performed. However, the absolute concentrations of different metabolites can be very different, requiring a technique to provide a wide detection dynamic range to allow the detection of as many peak pairs as possible. In this work, we demonstrated that controlling the sample injection amount into LC-MS was critical to achieve optimal detectability while avoiding the sample carry-over problem. In addition, the use of a high-dynamic-range TOF system increased the number of peak pairs detected, compared to a conventional TOF system. We also investigated the ionization and detection saturation factors limiting the dynamic range of detection. This article is part of a Special Issue entitled: Protein dynamics in health and disease. Guest Editors: Pierre Thibault and Anne-Claude Gingras. Copyright © 2014 Elsevier B.V. All rights reserved.
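
    The automated fallback described for IsoMS can be illustrated in generic form: when a (12)C peak saturates, the ratio is computed from an unsaturated natural-abundance isotope peak instead. The M+1 abundance fraction and the saturation flag below are illustrative assumptions, not IsoMS internals:

        def peak_pair_ratio(i12, i13_label, i12_m1, saturated, m1_fraction=0.02):
            """Relative quantification from a (13)C2/(12)C2 peak pair.

            i12         : intensity of the (12)C2-labeled peak
            i13_label   : intensity of the (13)C2-labeled peak
            i12_m1      : intensity of the (12)C peak's M+1 isotope peak
            saturated   : True if the (12)C2 peak exceeds the linear range
            m1_fraction : expected M+1/M natural-abundance ratio (placeholder)
            """
            if saturated:
                # Reconstruct the saturated (12)C peak from its M+1 isotope peak
                i12 = i12_m1 / m1_fraction
            return i13_label / i12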

  8. Forest fire autonomous decision system based on fuzzy logic

    NASA Astrophysics Data System (ADS)

    Lei, Z.; Lu, Jianhua

    2010-11-01

    The proposed system integrates GPS/pseudolite/IMU and a thermal camera in order to autonomously process images through identification, extraction, and tracking of forest fires or hot spots. The airborne detection platform, the graph-based algorithms and the signal processing frame are analyzed in detail; in particular, the rules of the decision function are expressed in terms of fuzzy logic, which is an appropriate method for expressing imprecise knowledge. The membership functions and weights of the rules are fixed through a supervised learning process. The perception system in this paper is based on a network of sensorial stations and central stations. The sensorial stations collect data including infrared and visual images and meteorological information. The central stations exchange data to perform distributed analysis. The experimental results show that the working procedure of the detection system is sound and that it can accurately output detection alarms and computations of infrared oscillations.
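
    A decision rule of the kind described can be expressed with triangular membership functions and the fuzzy AND (minimum) operator; the variables, breakpoints and rule below are illustrative assumptions, not the system's learned values:

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fire_alarm_degree(ir_intensity, oscillation_hz):
            """Rule: IF intensity is HIGH AND oscillation is FLAME-LIKE
            THEN alarm. AND = min; breakpoints are made-up placeholders."""
            high_intensity = tri(ir_intensity, 0.4, 0.8, 1.2)
            flame_like = tri(oscillation_hz, 1.0, 7.0, 13.0)
            return min(high_intensity, flame_like)   # alarm degree in [0, 1]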

  9. On-Line Loss of Control Detection Using Wavelets

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J. (Technical Monitor); Thompson, Peter M.; Klyde, David H.; Bachelder, Edward N.; Rosenthal, Theodore J.

    2005-01-01

    Wavelet transforms are used for on-line detection of aircraft loss of control. Wavelet transforms are compared with Fourier transform methods and shown to more rapidly detect changes in the vehicle dynamics. This faster response is due to a time window that decreases in length as the frequency increases. New wavelets are defined that further decrease the detection time by skewing the shape of the envelope. The wavelets are used for power spectrum and transfer function estimation. Smoothing is used to trade off the variance of the estimate against detection time. Wavelets are also used as a front-end to the eigensystem reconstruction algorithm. Stability metrics are estimated from the frequency response and models, and it is these metrics that are used for loss of control detection. A Matlab toolbox was developed for post-processing simulation and flight data using the wavelet analysis methods. A subset of these methods was implemented in real time and named the Loss of Control Analysis Tool Set, or LOCATS. A manual control experiment was conducted using a hardware-in-the-loop simulator for a large transport aircraft, in which the real time performance of LOCATS was demonstrated. The next step is to use these wavelet analysis tools for flight test support.
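
    The property exploited here, analysis windows that shrink as frequency grows, can be shown with a hand-rolled Morlet-style transform; this generic sketch is not the skewed wavelets or the LOCATS implementation described in the abstract:

        import numpy as np

        def morlet_row(f, fs, n_cycles=6.0):
            """Complex Morlet wavelet at frequency f; its support shrinks as
            f grows, which gives the fast response at high frequency."""
            dur = n_cycles / f                      # seconds spanned
            t = np.arange(-dur / 2, dur / 2, 1 / fs)
            gauss = np.exp(-0.5 * (t / (dur / 6)) ** 2)
            return gauss * np.exp(2j * np.pi * f * t)

        def scalogram(x, fs, freqs):
            """|CWT| via direct convolution, one frequency per row
            (assumes each wavelet is shorter than the signal)."""
            return np.array([np.abs(np.convolve(x, morlet_row(f, fs), mode="same"))
                             for f in freqs])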

  10. Description, characteristics and testing of the NASA airborne radar

    NASA Technical Reports Server (NTRS)

    Jones, W. R.; Altiz, O.; Schaffner, P.; Schrader, J. H.; Blume, H. J. C.

    1991-01-01

    Presented here is a description of a coherent radar scatterometer and its associated signal processing hardware, which have been specifically designed to detect microbursts and record their radar characteristics. Radar parameters, signal processing techniques and detection algorithms, all under computer control, combine to sense and process reflectivity, clutter, and microburst data. Also presented is the system's high-density, high-data-rate recording system. This digital system is capable of recording many minutes of the in-phase and quadrature components and corresponding receiver gains of the scattered returns for selected spatial regions, as well as other aircraft and hardware-related parameters of interest for post-flight analysis. Information is given in viewgraph form.

  11. Detection of defects in laser powder deposition (LPD) components by pulsed laser transient thermography

    NASA Astrophysics Data System (ADS)

    Santospirito, S. P.; Słyk, Kamil; Luo, Bin; Łopatka, Rafał; Gilmour, Oliver; Rudlin, John

    2013-05-01

    Detection of defects in components produced by Laser Powder Deposition (LPD) has been achieved by laser thermography. An automatic in-process NDT defect detection software system has been developed for the analysis of laser thermography to automatically detect, reliably measure and then sentence defects in individual beads of LPD components. A deposition path profile definition has been introduced so that all laser powder deposition beads can be modeled, and the inspection system has been developed to automatically generate an optimized inspection plan in which sampling images follow the deposition track, and to automatically control and communicate with robot arms, the source laser and cameras to implement image acquisition. Algorithms were developed so that the defect sizes can be correctly evaluated, and these have been confirmed using test samples. Individual inspection images can also be stitched together for a single bead, a layer of beads or multiple layers of beads so that defects can be mapped through the additive process. A mathematical model was built up to analyze and evaluate the movement of heat throughout the inspection bead. Inspection processes were developed, and positional and temporal gradient algorithms have been used to measure the flaw sizes. Defect analysis is then performed to determine if the defect(s) can be further classified (crack, lack of fusion, porosity), and the sentencing engine then compares the most significant defect or group of defects against the acceptance criteria - independent of human decisions. Testing on manufactured defects from the EC-funded INTRAPID project successfully detected and correctly sentenced all samples.

  12. Recent Advances of Malaria Parasites Detection Systems Based on Mathematical Morphology

    PubMed Central

    Di Ruberto, Cecilia; Kocher, Michel

    2018-01-01

    Malaria is an epidemic health disease and a rapid, accurate diagnosis is necessary for proper intervention. Generally, pathologists visually examine blood stained slides for malaria diagnosis. Nevertheless, this kind of visual inspection is subjective, error-prone and time-consuming. In order to overcome the issues, numerous methods of automatic malaria diagnosis have been proposed so far. In particular, many researchers have used mathematical morphology as a powerful tool for computer aided malaria detection and classification. Mathematical morphology is not only a theory for the analysis of spatial structures, but also a very powerful technique widely used for image processing purposes and employed successfully in biomedical image analysis, especially in preprocessing and segmentation tasks. Microscopic image analysis and particularly malaria detection and classification can greatly benefit from the use of morphological operators. The aim of this paper is to present a review of recent mathematical morphology based methods for malaria parasite detection and identification in stained blood smears images. PMID:29419781

  13. Recent Advances of Malaria Parasites Detection Systems Based on Mathematical Morphology.

    PubMed

    Loddo, Andrea; Di Ruberto, Cecilia; Kocher, Michel

    2018-02-08

    Malaria is an epidemic health disease and a rapid, accurate diagnosis is necessary for proper intervention. Generally, pathologists visually examine blood stained slides for malaria diagnosis. Nevertheless, this kind of visual inspection is subjective, error-prone and time-consuming. In order to overcome the issues, numerous methods of automatic malaria diagnosis have been proposed so far. In particular, many researchers have used mathematical morphology as a powerful tool for computer aided malaria detection and classification. Mathematical morphology is not only a theory for the analysis of spatial structures, but also a very powerful technique widely used for image processing purposes and employed successfully in biomedical image analysis, especially in preprocessing and segmentation tasks. Microscopic image analysis and particularly malaria detection and classification can greatly benefit from the use of morphological operators. The aim of this paper is to present a review of recent mathematical morphology based methods for malaria parasite detection and identification in stained blood smears images.
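
    A typical morphological enhancement/segmentation chain of the kind these reviews survey can be sketched with scipy.ndimage; the top-hat operator, structuring-element sizes and threshold here are generic choices, not a specific reviewed method:

        import numpy as np
        from scipy import ndimage as ndi

        def segment_blobs(gray, selem_size=9, threshold=0.15):
            """White top-hat to enhance small bright objects (invert first if
            the stained objects are dark), then threshold and clean up."""
            selem = np.ones((selem_size, selem_size))
            tophat = ndi.white_tophat(gray, footprint=selem)
            mask = tophat > threshold * tophat.max()
            mask = ndi.binary_opening(mask, structure=np.ones((3, 3)))
            mask = ndi.binary_closing(mask, structure=np.ones((3, 3)))
            labels, n = ndi.label(mask)
            return labels, n   # connected components = candidate parasites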

  14. Fault detection and classification in electrical power transmission system using artificial neural network.

    PubMed

    Jamil, Majid; Sharma, Sanjeev Kumar; Singh, Rajveer

    2015-01-01

    This paper focuses on the detection and classification of faults on electrical power transmission lines using artificial neural networks. The three-phase currents and voltages at one end are taken as inputs in the proposed scheme. A feed-forward neural network with the back-propagation algorithm was employed for detection and classification of faults, with each of the three phases analyzed in the process. A detailed analysis with a varying number of hidden layers was performed to validate the choice of the neural network. The simulation results show that the present neural-network-based method is efficient in detecting and classifying faults on transmission lines with satisfactory performance. Different faults were simulated with different parameters to check the versatility of the method. The proposed method can be extended to the distribution network of the power system. The various simulations and signal analyses were done in the MATLAB® environment.
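
    A feed-forward, back-propagation-trained classifier of this kind can be sketched with scikit-learn (rather than the paper's MATLAB setup); the feature layout, fault classes and random data are placeholders:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # X: rows of [Ia, Ib, Ic, Va, Vb, Vc] at one line end (placeholders)
        X = np.random.rand(1000, 6)
        # y: fault class, e.g. 0 = none, 1 = A-g, 2 = B-g, ... (placeholders)
        y = np.random.randint(0, 5, size=1000)

        clf = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(20, 20),   # two hidden layers
                          solver="adam", max_iter=2000))
        clf.fit(X, y)
        print(clf.predict(X[:3]))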

  15. FIND: difFerential chromatin INteractions Detection using a spatial Poisson process

    PubMed Central

    Chen, Yang; Zhang, Michael Q.

    2018-01-01

    Polymer-based simulations and experimental studies indicate the existence of a spatial dependency between the adjacent DNA fibers involved in the formation of chromatin loops. However, the existing strategies for detecting differential chromatin interactions assume that the interacting segments are spatially independent from the other segments nearby. To resolve this issue, we developed a new computational method, FIND, which considers the local spatial dependency between interacting loci. FIND uses a spatial Poisson process to detect differential chromatin interactions that show a significant difference in their interaction frequency and the interaction frequency of their neighbors. Simulation and biological data analysis show that FIND outperforms the widely used count-based methods and has a better signal-to-noise ratio. PMID:29440282
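
    At its core, a count-based differential test of the kind FIND improves upon can be written with a Poisson model; this deliberately simplified sketch ignores the spatial dependency that FIND models, and the depth normalization is an assumption:

        from scipy.stats import poisson

        def differential_p(count_a, count_b, depth_a, depth_b):
            """One-sided Poisson test: is count_b larger than expected
            given condition A's rate, after sequencing-depth scaling?"""
            expected_b = count_a * (depth_b / depth_a)
            return poisson.sf(count_b - 1, expected_b)   # P(X >= count_b)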

  16. Photoacoustic spectroscopy of CO2 laser in the detection of gaseous molecules

    NASA Astrophysics Data System (ADS)

    Lima, G. R.; Sthel, M. S.; da Silva, M. G.; Schramm, D. U. S.; de Castro, M. P. P.; Vargas, H.

    2011-01-01

    The detection of trace gases is very important for a variety of applications, including the monitoring of atmospheric pollutants, industrial process control, measuring air quality in workplaces, research into the physiological processes of fruits, and medical diagnosis of diseases through the analysis of exhaled gases. These and many other applications require gas sensors that offer high sensitivity and selectivity. In this work, a photoacoustic laser spectrometer with CO2 emission in the infrared range and a resonant photoacoustic cell was used. We obtained a resonance frequency of 2.4 kHz for the photoacoustic cell, and the detection limits of the spectrometer were estimated as 16 ppbV for ethylene (C2H4) and 42 ppbV for ammonia (NH3).

  17. Vision-based method for detecting driver drowsiness and distraction in driver monitoring system

    NASA Astrophysics Data System (ADS)

    Jo, Jaeik; Lee, Sung Joo; Jung, Ho Gi; Park, Kang Ryoung; Kim, Jaihie

    2011-12-01

    Most driver-monitoring systems have attempted to detect either driver drowsiness or distraction, although both factors should be considered for accident prevention. Therefore, we propose a new driver-monitoring method considering both factors. We make the following contributions. First, if the driver is looking ahead, drowsiness detection is performed; otherwise, distraction detection is performed. Thus, the computational cost and eye-detection error can be reduced. Second, we propose a new eye-detection algorithm that combines adaptive boosting, adaptive template matching, and blob detection with eye validation, thereby reducing the eye-detection error and processing time significantly, which is hardly achievable using a single method. Third, to enhance eye-detection accuracy, eye validation is applied after initial eye detection, using a support vector machine based on appearance features obtained by principal component analysis (PCA) and linear discriminant analysis (LDA). Fourth, we propose a novel eye state-detection algorithm that combines appearance features obtained using PCA and LDA, with statistical features such as the sparseness and kurtosis of the histogram from the horizontal edge image of the eye. Experimental results showed that the detection accuracies of the eye region and eye states were 99 and 97%, respectively. Both driver drowsiness and distraction were detected with a success rate of 98%.
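
    The eye-validation stage described, appearance features via PCA and LDA feeding an SVM, maps naturally onto a scikit-learn pipeline; the patch size, component counts and random placeholder data are assumptions:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        # X: flattened candidate eye patches; y: 1 = real eye, 0 = not
        X = np.random.rand(500, 24 * 24)
        y = np.random.randint(0, 2, size=500)

        validator = make_pipeline(
            PCA(n_components=50),                        # appearance subspace
            LinearDiscriminantAnalysis(n_components=1),  # class separation
            SVC(kernel="rbf"))                           # accept/reject
        validator.fit(X, y)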

  18. System reliability and recovery.

    DOT National Transportation Integrated Search

    1971-06-01

    The paper exhibits a variety of reliability techniques applicable to future ATC data processing systems. Presently envisioned schemes for error detection, error interrupt and error analysis are considered, along with methods of retry, reconfiguration...

  19. Arc-welding quality assurance by means of embedded fiber sensor and spectral processing combining feature selection and neural networks

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; García-Allende, P. B.; Cobo, A.; Conde, O.; López-Higuera, J. M.

    2007-07-01

    A new spectral processing technique designed for its application in the on-line detection and classification of arc-welding defects is presented in this paper. A non-invasive fiber sensor embedded within a TIG torch collects the plasma radiation originated during the welding process. The spectral information is then processed by means of two consecutive stages. A compression algorithm is first applied to the data allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which will be demonstrated to provide an efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in a previous paper, giving rise to an improvement in the performance of the monitoring system.

  20. Detection of the diurnal cycle in rainfall from the TRMM satellite

    NASA Technical Reports Server (NTRS)

    Bell, Thomas L.

    1989-01-01

    Consideration is given to the process of detecting the diurnal cycle from data that will be collected by the Tropical Rainfall Measuring Mission satellite. The analysis of data for the diurnal cycle is discussed, accounting for the fact that satellite visits will be irregularly spaced in time. The accuracy with which the first few harmonics of the diurnal cycle can be detected from several months of satellite data is estimated using rainfall statistics observed during the GARP Atlantic Tropical Experiment.
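
    Estimating the first few diurnal harmonics from irregularly timed samples reduces to least-squares regression on sine/cosine terms; this generic sketch (not the paper's estimator) assumes observation times expressed in hours of local day:

        import numpy as np

        def diurnal_harmonics(t_hours, rain, n_harmonics=3):
            """Least-squares fit of diurnal harmonics to irregular samples."""
            cols = [np.ones_like(t_hours)]
            for k in range(1, n_harmonics + 1):
                w = 2 * np.pi * k / 24.0
                cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
            A = np.column_stack(cols)
            coef, *_ = np.linalg.lstsq(A, rain, rcond=None)
            amps = np.hypot(coef[1::2], coef[2::2])   # harmonic amplitudes
            return coef, amps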

  1. Single hair analysis of small molecules using MALDI-triple quadrupole MS imaging and LC-MS/MS: investigations on opportunities and pitfalls.

    PubMed

    Poetzsch, Michael; Steuer, Andrea E; Roemmelt, Andreas T; Baumgartner, Markus R; Kraemer, Thomas

    2014-12-02

    Single hair analysis normally requires extensive sample preparation with microscale protocols that include time-consuming steps such as segmentation and extraction. Matrix-assisted laser desorption/ionization mass spectrometric imaging (MALDI-MSI) has been shown to be an alternative tool in single hair analysis, but questions remain. Therefore, the extraction process, the use of an internal standard (IS), and influences on the ionization process in MALDI-MSI single hair analysis were systematically investigated to enable its reliable application. Furthermore, single dose detection was assessed, MALDI image intensities were quantitatively correlated with single hair and hair strand LC-MS/MS results, and the performance was compared to LC-MS/MS single hair monitoring. The MALDI process was shown to be independent of natural hair color and not influenced by the presence of melanin. Ionization was shown to be reproducible along and between different hair samples. MALDI image intensities in single hair and hair snippets showed good semiquantitative correlation with zolpidem hair concentrations obtained from validated routine LC-MS/MS methods. MALDI-MSI is superior to LC-MS/MS analysis when fast, easy, and cheap sample preparation is necessary, whereas LC-MS/MS showed higher sensitivity, with the ability of single dose detection for zolpidem. MALDI-MSI and LC-MS/MS segmental single hair analysis showed good correlation, and both are suitable for consumption monitoring of drugs of abuse with high time resolution.

  2. Benefit from NASA

    NASA Image and Video Library

    1985-01-01

    NASA image processing technology, an advanced computer technique for enhancing images sent to Earth in digital form by distant spacecraft, helped develop a new vision screening process. The Ocular Vision Screening system, an important step in preventing vision impairment, is a portable device designed especially to detect eye problems in children through the analysis of retinal reflexes.

  3. All-integrated and highly sensitive paper based device with sample treatment platform for Cd2+ immunodetection in drinking/tap waters.

    PubMed

    López Marzo, Adaris M; Pons, Josefina; Blake, Diane A; Merkoçi, Arben

    2013-04-02

    Nowadays, the development of systems, devices, or methods that integrate several process steps into one multifunctional step for clinical, environmental, or industrial purposes constitutes a challenge for many ongoing research projects. Here, we present a new integrated paper-based cadmium (Cd(2+)) immunosensing system in lateral flow format, which integrates the sample treatment process with the analyte detection process. The principle of Cd(2+) detection is based on a competitive reaction between the cadmium-ethylenediaminetetraacetic acid-bovine serum albumin-gold nanoparticles (Cd-EDTA-BSA-AuNP) conjugate deposited on the conjugation pad strip and the Cd-EDTA complex formed in the analysis sample for the same binding sites of the 2A81G5 monoclonal antibody (mAb), specific to Cd-EDTA but not to free Cd(2+), which is immobilized onto the test line. This platform operates without any sample pretreatment step for Cd(2+) detection thanks to an extra conjugation pad that ensures Cd(2+) complexation with EDTA and interference masking through ovalbumin (OVA). The detection and quantification limits found for the device were 0.1 and 0.4 ppb, respectively, these being the lowest limits reported up to now for paper-based metal sensors. The accuracy of the device was evaluated by addition of known quantities of Cd(2+) to different drinking water samples and subsequent Cd(2+) content analysis. Sample recoveries ranged from 95 to 105% and the coefficient of variation for the intermediate precision assay was less than 10%. In addition, the results obtained here were compared with those obtained with the well-established inductively coupled plasma emission spectroscopy (ICPES) and the analysis of certified standard samples.

  4. Physicochemical characterization of spray-dried PLGA/PEG microspheres, and preliminary assessment of biological response.

    PubMed

    Javiya, Curie; Jonnalagadda, Sriramakamal

    2016-09-01

    This work concerns the use of spray-drying to prepare blended PLGA:PEG microspheres with lower immune detection. The objective was to study the physical properties, polymer miscibility and alveolar macrophage response of blended PLGA:PEG microspheres prepared by a laboratory-scale spray-drying process. Microspheres were prepared by spray-drying 0-20% w/w ratios of PLGA 65:35 and PEG 3350 in dichloromethane. Particle size and morphology were studied using scanning electron microscopy. Polymer miscibility and residual solvent levels were evaluated by thermal analysis (differential scanning calorimetry, DSC, and thermogravimetric analysis, TGA). Immunogenicity was assessed in vitro by the response of rat alveolar macrophages (NR8383) using the MTT-based cell viability assay and reactive oxygen species (ROS) detection. The spray-dried particles were spherical, with a size range of about 2-3 µm and a yield of 16-60%. The highest yield was obtained at 1% PEG concentration. Thermal analysis showed a melting peak at 59 °C (enthalpy: 170.61 J/g) and a degradation onset of 180 °C for PEG 3350. PLGA 65:35 was amorphous, with a Tg of 43 °C. Blended PLGA:PEG microspheres showed a delayed degradation onset of 280 °C, and a PEG enthalpy loss corresponding to 15% miscibility of PEG in PLGA. NR8383 viability studies and ROS detection suggested that blended PLGA:PEG microspheres containing 1 and 5% PEG are optimal in controlling cell proliferation and activation. This research establishes the feasibility of using a spray-drying process to prepare spherical particles (2-3 µm) of molecularly blended PLGA 65:35 and PEG 3350. A PEG concentration of 1-5% was optimal to maximize process yield, with minimal potential for immune detection.

  5. Improvement on Exoplanet Detection Methods and Analysis via Gaussian Process Fitting Techniques

    NASA Astrophysics Data System (ADS)

    Van Ross, Bryce; Teske, Johanna

    2018-01-01

    Planetary signals in radial velocity (RV) data are often accompanied by signals coming solely from stellar photo- or chromospheric variation. Such variation can reduce the precision of planet detection and mass measurements, and cause misidentification of planetary signals. Recently, several authors have demonstrated the utility of Gaussian Process (GP) regression for disentangling planetary signals in RV observations (Aigrain et al. 2012; Angus et al. 2017; Czekala et al. 2017; Faria et al. 2016; Gregory 2015; Haywood et al. 2014; Rajpaul et al. 2015; Foreman-Mackey et al. 2017). GP models the covariance of multivariate data to make predictions about likely underlying trends in the data, which can be applied to regions where there are no existing observations. GP has been used to infer stellar rotation periods; to model and disentangle time series spectra; and to determine physical aspects, populations, and detection of exoplanets, among other astrophysical applications. Here, we implement similar analysis techniques on time series of the Ca II H and K activity indicator measured simultaneously with RVs in a small sample of stars from the large Keck/HIRES RV planet search program. Our goal is to characterize the pattern(s) of non-planetary variation so as to know what is and is not a planetary signal. We investigated ten different GP kernels and their respective hyperparameters to determine the optimal combination (e.g., the lowest Bayesian Information Criterion value) in each stellar data set. To assess the hyperparameters' error, we sampled their posterior distributions using Markov chain Monte Carlo (MCMC) analysis on the optimized kernels. Our results demonstrate how GP analysis of stellar activity indicators alone can contribute to exoplanet detection in RV data, and highlight the challenges in applying GP analysis to relatively small, irregularly sampled time series.
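
    Kernel comparison by an information criterion, as described, can be sketched with scikit-learn's GP tools; the kernels tried, the synthetic series and the BIC form k ln n - 2 ln L are illustrative assumptions:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import (
            RBF, RationalQuadratic, ExpSineSquared, WhiteKernel)

        # Placeholder irregularly sampled "activity indicator" series
        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0, 100, 60))[:, None]
        y = np.sin(2 * np.pi * t.ravel() / 25.0) + 0.1 * rng.standard_normal(60)

        kernels = {
            "squared-exponential": 1.0 * RBF() + WhiteKernel(),
            "rational-quadratic": 1.0 * RationalQuadratic() + WhiteKernel(),
            "periodic": 1.0 * ExpSineSquared() + WhiteKernel(),
        }
        for name, kernel in kernels.items():
            gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)
            k = gp.kernel_.theta.size          # fitted hyperparameter count
            bic = k * np.log(len(y)) - 2 * gp.log_marginal_likelihood_value_
            print(f"{name}: BIC = {bic:.1f}")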

  6. Nature and mechanisms of hepatocyte apoptosis induced by D-galactosamine/lipopolysaccharide challenge in mice.

    PubMed

    Wu, Yi-Hang; Hu, Shao-Qing; Liu, Jun; Cao, Hong-Cui; Xu, Wei; Li, Yong-Jun; Li, Lan-Juan

    2014-06-01

    Apoptosis plays a role in the normal development of liver. However, overactivation thereof may lead to hepatocellular damage. The aim of this study was to assess D-galactosamine (D-GalN)/lipopolysaccharide (LPS)-induced hepatocyte apoptotic changes in mice and clarify the mechanisms involved in this process. DNA ladder detection was employed to determine the induction condition of hepatic apoptosis. An initial test indicated that typical hepatocyte apoptosis was observed at 6-10 h after the intraperitoneal injection of D-GalN (700 mg/kg) and LPS (10 µg/kg). Subsequently, we evaluated hepatocyte apoptosis at 8 h after administering D-GalN/LPS by histopathological analysis, terminal deoxynucleotidyl transferase-mediated dUTP nick end‑labeling (TUNEL) detection, flow cytometry and electron microscopy analysis. To clarify the apoptosis-related gene expression, the expression levels of tumor necrosis factor-α (TNF-α), transforming growth factor-β1 (TGF-β1), caspase-3, and Fas/Fas ligand (FasL) were determined by serum enzyme immunoassay, immunohistochemistry and western blot analysis. Strong apoptotic positive signals following D-GalN/LPS injection were observed from the results of the serum analysis, histopathological and immunohistochemical analyses, DNA ladder detection, TUNEL detection, flow cytometry and electron microscopy analysis. Additionally, apoptotic hepatocytes were mainly at the late stage of cell apoptosis. The expression of TNF-α, TGF-β1, caspase-3 and Fas/FasL was significantly increased. In conclusion, this study evaluated the D-GalN/LPS-induced hepatocyte apoptotic changes and clarified the apoptosis-related gene expression in mice. The hepatocyte apoptosis induced by D-GalN/LPS may be mainly regulated by the death receptor pathway. TGF-β signaling pathway may also play a vital role in this process of hepatocyte apoptosis.

  7. A Novel Biomedical Device Utilizing Light Emitting Nano-Structures

    NASA Technical Reports Server (NTRS)

    Varaljay, Vanessa A.

    2004-01-01

    This paper will discuss the development of a novel biomedical detection device that will be used to detect microorganisms in fluids such as water with the use of infrared fluorochrome polymers attached to antibodies. The fluorochrome polymers emit light in the near infrared region (NIR), at approximately 805 nm, when excited by an NIR laser at 778 nm. The device could remarkably change the way laboratory testing is done today. The testing process is usually performed on a time scale of days, while our device will be able to detect microorganisms in minutes. This type of time-efficient analysis is ideal for use aboard the International Space Station and the Space Shuttle (ISS/SS) and has many useful commercial applications, for instance at water treatment plants and food processing plants. With more research and experimentation the testing might also one day be used to detect bacteria and viruses in complex fluids such as blood, which would revolutionize blood analysis as it is performed today. My contribution to the project has been to develop a process which will allow an antibody/fluorescent dye pair to be conjugated to a specific bacterium or virus and then to be separated from a sample body of water for detection. The antibody being used in this experiment is anti-beta-galactosidase and its complement enzyme is beta-galactosidase, a non-harmful derivative of E. coli. The anti-beta-galactosidase has been conjugated to the fluorochrome polymer IRDye800, which emits at approximately 806 nm. The dye, when excited by the NIR laser, emits a signal which is detected by a spectrometer and then read by state-of-the-art computer software. The process includes incubating the anti-beta-galactosidase and beta-galactosidase in a phosphate buffer solution in a test tube, allowing the antibody to bind to specific sites on the enzyme. After the antibody is bound to the enzyme, it is centrifuged in specific filters that allow free antibody to wash away and leave the antibody-enzyme complexes in solution for testing and analysis. This solution is pipetted into a cuvette, a special plastic test tube, which is then excited by the laser. The signal read will tell us that an antibody is present and, since it is bound to the enzyme, that the bacterium is also present.

  8. Sample processing approach for detection of ricin in surface samples.

    PubMed

    Kane, Staci; Shah, Sanjiv; Erler, Anne Marie; Alfaro, Teneile

    2017-12-01

    With several ricin contamination incidents reported over the past decade, rapid and accurate methods are needed for environmental sample analysis, especially after decontamination. A sample processing method was developed for common surface sampling devices to improve the limit of detection and avoid false negative/positive results for ricin analysis. Potential assay interferents from the sample matrix (bleach residue, sample material, wetting buffer), including reference dust, were tested using a Time-Resolved Fluorescence (TRF) immunoassay. Test results suggested that the sample matrix did not cause the elevated background fluorescence sometimes observed when analyzing post-bleach decontamination samples from ricin incidents. Furthermore, sample particulates (80 mg/mL Arizona Test Dust) did not enhance background fluorescence or interfere with ricin detection by TRF. These results suggested that high background fluorescence in this immunoassay could be due to labeled antibody quality and/or quantity issues. Centrifugal ultrafiltration devices were evaluated for ricin concentration as a part of sample processing. Up to 30-fold concentration of ricin was observed by the devices, which serve to remove soluble interferents and could function as the front-end sample processing step to other ricin analytical methods. The procedure has the potential to be used with a broader range of environmental sample types and with other potential interferences and to be followed by other ricin analytical methods, although additional verification studies would be required. Published by Elsevier B.V.

  9. On-chip wavelength multiplexed detection of cancer DNA biomarkers in blood

    PubMed Central

    Cai, H.; Stott, M. A.; Ozcelik, D.; Parks, J. W.; Hawkins, A. R.; Schmidt, H.

    2016-01-01

    We have developed an optofluidic analysis system that processes biomolecular samples starting from whole blood and then analyzes and identifies multiple targets on a silicon-based molecular detection platform. We demonstrate blood filtration, sample extraction, target enrichment, and fluorescent labeling using programmable microfluidic circuits. We detect and identify multiple targets using a spectral multiplexing technique based on wavelength-dependent multi-spot excitation on an antiresonant reflecting optical waveguide chip. Specifically, we extract two types of melanoma biomarkers, the mutated cell-free nucleic acids BRAFV600E and NRAS, from whole blood. We detect and identify these two targets simultaneously using the spectral multiplexing approach with up to a 96% success rate. These results point the way toward a full front-to-back chip-based optofluidic compact system for high-performance analysis of complex biological samples. PMID:28058082

  10. Cellphone-based detection platform for rbST biomarker analysis in milk extracts using a microsphere fluorescence immunoassay.

    PubMed

    Ludwig, Susann K J; Zhu, Hongying; Phillips, Stephen; Shiledar, Ashutosh; Feng, Steve; Tseng, Derek; van Ginkel, Leendert A; Nielen, Michel W F; Ozcan, Aydogan

    2014-11-01

    Current contaminant and residue monitoring throughout the food chain is based on sampling, transport, administration, and analysis in specialized control laboratories. This is a highly inefficient and costly process since typically more than 99% of the samples are found to be compliant. On-site simplified prescreening may provide a scenario in which only samples that are suspect are transported and further processed. Such a prescreening can be performed using a small attachment on a cellphone. To this end, a cellphone-based imaging platform for a microsphere fluorescence immunoassay that detects the presence of anti-recombinant bovine somatotropin (rbST) antibodies in milk extracts was developed. RbST administration to cows increases their milk production, but is illegal in the EU and a public health concern in the USA. The cellphone monitors the presence of anti-rbST antibodies (rbST biomarker), which are endogenously produced upon administration of rbST and excreted in milk. The rbST biomarker present in milk extracts was captured by rbST covalently coupled to paramagnetic microspheres and labeled by quantum dot (QD)-coupled detection antibodies. The emitted fluorescence light from these captured QDs was then imaged using the cellphone camera. Additionally, a dark-field image was taken in which all microspheres present were visible. The fluorescence and dark-field microimages were analyzed using a custom-developed Android application running on the same cellphone. With this setup, the microsphere fluorescence immunoassay and cellphone-based detection were successfully applied to milk sample extracts from rbST-treated and untreated cows. An 80% true-positive rate and 95% true-negative rate were achieved using this setup. Next, the cellphone-based detection platform was benchmarked against a newly developed planar imaging array alternative and found to perform on par with the much more sophisticated alternative. Using cellphone-based on-site analysis in future residue monitoring can limit the number of samples for laboratory analysis already at an early stage. In this way, the entire monitoring process can become much more efficient and economical.

  11. Proposal on How To Conduct a Biopharmaceutical Process Failure Mode and Effect Analysis (FMEA) as a Risk Assessment Tool.

    PubMed

    Zimmermann, Hartmut F; Hentschel, Norbert

    2011-01-01

    With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to biopharmaceutical processes brings about some difficulties, because those processes differ from processes in the mechanical and electrical industries. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread: it can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site; it may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance, and important variables for process development, characterization, or validation can be identified. In this way the guideline helps pharmaceutical companies meet the expectation of health authorities around the world that risk be managed during development and manufacturing, by quickly identifying the aspects of a manufacturing process with the highest potential risks and reacting accordingly to improve the safety of medicines.

  12. Use of a systematic risk analysis method (FMECA) to improve quality in a clinical laboratory procedure.

    PubMed

    Serafini, A; Troiano, G; Franceschini, E; Calzoni, P; Nante, N; Scapellato, C

    2016-01-01

    Risk management is a set of actions to identify risks, errors and their consequences, and to take steps to counter them. The aim of our study was to apply FMECA (Failure Mode, Effects and Criticality Analysis) to the Activated Protein C Resistance (APCR) test in order to detect and avoid mistakes in this process. We created a team and divided the process into phases and sub-phases. For each phase we scored the probability of occurrence of an error (O), its detectability (D) and its severity (S); the product of these three indexes yields the RPN (Risk Priority Number), and phases with a higher RPN need corrective actions with higher priority. The calculation of the RPN showed that more than 20 activities have a score higher than 150 and need important preventive actions; 8 have a score between 100 and 150; only 23 actions obtained an acceptable score lower than 100. This was one of the first applications of FMECA to a laboratory process, and the first to apply the technique to the identification of factor V Leiden. Our results confirm that FMECA can be a simple, powerful and useful tool in risk management, helping to quickly identify the critical points in a laboratory process.
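
    As a minimal illustration of the RPN arithmetic described above, the sketch below scores some phases of a laboratory procedure and ranks them using the priority bands reported in the study (above 150, between 100 and 150, below 100). The phase names and scores are invented for illustration only.

        # Hedged sketch of FMECA risk ranking: RPN = O x S x D.
        from dataclasses import dataclass

        @dataclass
        class Activity:
            name: str
            occurrence: int     # O: probability of occurrence, scored 1-10
            severity: int       # S: severity of the effect, scored 1-10
            detectability: int  # D: difficulty of detection, scored 1-10

            @property
            def rpn(self) -> int:
                # Risk Priority Number as defined in the study: O x S x D
                return self.occurrence * self.severity * self.detectability

        def priority(rpn: int) -> str:
            # bands reported in the abstract: >150 urgent, 100-150 middle, <100 acceptable
            if rpn > 150:
                return "important preventive action needed"
            if rpn >= 100:
                return "intermediate priority"
            return "acceptable"

        phases = [  # invented phases and scores, for illustration only
            Activity("sample labelling", 7, 6, 5),
            Activity("plasma separation", 4, 7, 4),
            Activity("APCR reagent preparation", 3, 5, 6),
        ]
        for a in sorted(phases, key=lambda x: x.rpn, reverse=True):
            print(f"{a.name}: RPN={a.rpn} -> {priority(a.rpn)}")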

  13. OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization

    NASA Astrophysics Data System (ADS)

    Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian

    2018-03-01

    OIPAV (Ophthalmic Image Processing, Analysis and Visualization) is a cross-platform software system specially oriented to ophthalmic images. It provides a wide range of functionalities, including data I/O, image processing, interaction, ophthalmic disease detection, data analysis and visualization, to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color fundus photographs. It enables users to easily access ophthalmic image data produced by different imaging devices, facilitates workflows for processing ophthalmic images, and improves quantitative evaluation. In this paper, we present the system design and functional modules of the platform and demonstrate various applications. With satisfying functional scalability and expandability, we believe the software can be widely applied in the ophthalmology field.

  14. Analysis and improved design considerations for airborne pulse Doppler radar signal processing in the detection of hazardous windshear

    NASA Technical Reports Server (NTRS)

    Lee, Jonggil

    1990-01-01

    High-resolution windspeed profile measurements are needed to provide reliable detection of hazardous low-altitude windshear with an airborne pulse Doppler radar. The system phase noise in a Doppler weather radar may degrade the quality of spectrum moment estimation and the clutter cancellation capability, both of which are important in windshear detection. Also, bias due to skewness of the weather-return Doppler spectrum may cause large errors in pulse-pair spectral parameter estimates. These effects are analyzed to improve the signal processing design of an airborne Doppler weather radar. A method is presented for direct measurement of the windspeed gradient using a low pulse repetition frequency (PRF) radar; this spatial gradient is essential in obtaining the windshear hazard index. As an alternative, the modified Prony method is suggested as a spectrum mode estimator for both the clutter and the weather signal. Estimation of Doppler spectrum modes may provide the desired windshear hazard information without the need for preliminary processing such as clutter filtering. Results obtained by processing the output of a NASA simulation model support consideration of mode identification as one component of a windshear detection algorithm.
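
    As context for the pulse-pair estimates mentioned above, the sketch below shows the textbook pulse-pair estimator: the mean Doppler velocity follows from the phase of the lag-one autocorrelation of the complex (I/Q) returns. This is a generic illustration with synthetic data, not the paper's processing chain, and the sign convention is an assumption.

        import numpy as np

        def pulse_pair_velocity(iq: np.ndarray, prt: float, wavelength: float) -> float:
            """Mean Doppler velocity from I/Q samples of one range gate."""
            r1 = np.mean(iq[1:] * np.conj(iq[:-1]))   # lag-one autocorrelation
            # phase shift per pulse: 4*pi*v*prt/wavelength (sign convention assumed)
            return wavelength / (4.0 * np.pi * prt) * np.angle(r1)

        # synthetic return: 5 m/s target, 3.2 cm wavelength, 1 kHz PRF (Nyquist 8 m/s)
        prt, lam, v_true = 1e-3, 0.032, 5.0
        n = np.arange(64)
        iq = np.exp(1j * 4 * np.pi * v_true * prt * n / lam)
        iq += 0.1 * (np.random.randn(64) + 1j * np.random.randn(64))
        print(pulse_pair_velocity(iq, prt, lam))   # close to 5 m/s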

  15. Some comments on Hurst exponent and the long memory processes on capital markets

    NASA Astrophysics Data System (ADS)

    Sánchez Granero, M. A.; Trinidad Segovia, J. E.; García Pérez, J.

    2008-09-01

    The analysis of long memory processes in capital markets has long been a topic in finance, since the existence of market memory would call the efficient market hypothesis into question. These processes are studied through the Hurst exponent, and the most classical estimation method is R/S analysis. In this paper we discuss the efficiency of this methodology, as well as some of its more important modifications, for detecting long memory. We also propose the application of a classical geometrical method with slight modifications, and we compare both approaches.
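
    For readers unfamiliar with R/S analysis, the sketch below shows a minimal version of the classical estimator: the rescaled range grows roughly as c*s^H with block size s, so the Hurst exponent H is the slope of a log-log fit. The block sizes and the test series are illustrative choices, not those of the paper.

        import numpy as np

        def hurst_rs(x: np.ndarray, min_block: int = 8) -> float:
            n = len(x)
            sizes = np.unique(np.logspace(np.log10(min_block),
                                          np.log10(n // 2), 10).astype(int))
            rs_means = []
            for s in sizes:
                rs_vals = []
                for start in range(0, n - s + 1, s):   # non-overlapping blocks
                    block = x[start:start + s]
                    dev = np.cumsum(block - block.mean())
                    r = dev.max() - dev.min()          # range of cumulative deviations
                    sd = block.std()
                    if sd > 0:
                        rs_vals.append(r / sd)         # rescaled range R/S
                rs_means.append(np.mean(rs_vals))
            # E[R/S] ~ c * s**H, so H is the log-log slope
            slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
            return slope

        returns = np.random.randn(4096)   # uncorrelated series: H should be near 0.5
        print(hurst_rs(returns))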

  16. GridMass: a fast two-dimensional feature detection method for LC/MS.

    PubMed

    Treviño, Victor; Yañez-Garza, Irma-Luz; Rodriguez-López, Carlos E; Urrea-López, Rafael; Garza-Rodriguez, Maria-Lourdes; Barrera-Saldaña, Hugo-Alberto; Tamez-Peña, José G; Winkler, Robert; Díaz de-la-Garza, Rocío-Isabel

    2015-01-01

    One of the initial and critical procedures in the analysis of metabolomics data from liquid chromatography and mass spectrometry is feature detection: the process of detecting the boundaries of features on the mass surface from raw data, which consists of detected abundances arranged in a two-dimensional (2D) matrix of mass/charge versus elution time. MZmine 2 is one of the leading software environments providing a full analysis pipeline for these data. However, the feature detection algorithms provided in MZmine 2 are based mainly on analyzing one dimension at a time. We propose GridMass, an efficient algorithm for 2D feature detection. The algorithm is based on landing probes across the chromatographic space that are moved to find local maxima, providing accurate boundary estimations. We tested GridMass on a controlled marker experiment, on plasma samples, on plant fruits, and on a proteome sample. Compared with other algorithms, GridMass is faster and may achieve comparable or better sensitivity and specificity. As a proof of concept, GridMass has been implemented in Java under the MZmine 2 environment and is available at http://www.bioinformatica.mty.itesm.mx/GridMass and in MASSyPup. It has also been submitted to the MZmine 2 developer community. Copyright © 2015 John Wiley & Sons, Ltd.

  17. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    NASA Astrophysics Data System (ADS)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced and proven to be very effective. Unfortunately, monitoring the treatment procedure remains manual, and hence time consuming and prone to human error. In this research we propose an automatic image analysis approach to measure the size of an ulcer and track it over time to determine the effectiveness of the treatment process followed. In ophthalmology, an ulcer area is revealed for inspection via luminous excitation of a dye. In the imaging systems utilised for this purpose (i.e. a slit lamp with an appropriate dye), the ulcer area is excited to appear luminous green compared with the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially, a pre-processing stage carries out local histogram equalisation to bring back detail in over- or under-exposed areas. Secondly, we remove potential reflections from the affected areas by registering two candidate corneal images based on the detected corneal areas. Thirdly, the exact corneal boundary is detected by first fitting an ellipse to the candidate corneal boundary found via edge detection and then allowing the user to modify the boundary to overlap the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breaks in the corneal boundary due to occlusion, noise, or image quality degradation. The ratio of the ulcer area confined within the corneal area to the corneal area itself is used as the measure of comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure for corneal ulcers in patients by comparing the variation of ulcer size over time.
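
    A minimal sketch of the area-ratio measure is given below: it segments the fluorescein-green ulcer pixels inside a supplied corneal mask in HSV space and reports the ulcer-to-cornea area ratio. The hue limits are illustrative guesses, not values from the paper.

        import cv2
        import numpy as np

        def ulcer_ratio(bgr: np.ndarray, cornea_mask: np.ndarray) -> float:
            """cornea_mask: uint8 mask (255 inside the detected corneal boundary)."""
            hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
            # greenish hue band for the excited dye -- assumed, to be tuned per setup
            ulcer = cv2.inRange(hsv, (40, 60, 60), (85, 255, 255))
            ulcer = cv2.bitwise_and(ulcer, ulcer, mask=cornea_mask)
            ulcer_area = np.count_nonzero(ulcer)
            corneal_area = max(np.count_nonzero(cornea_mask), 1)
            return ulcer_area / corneal_area   # the measure compared over time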

  18. Implementation of a portable device for real-time ECG signal analysis.

    PubMed

    Jeon, Taegyun; Kim, Byoungho; Jeon, Moongu; Lee, Byung-Geun

    2014-12-10

    Cardiac disease is one of the main causes of catastrophic mortality, so detecting its symptoms as early as possible is important for increasing a patient's chance of survival. In this study, a compact and effective architecture for detecting atrial fibrillation (AFib) and myocardial ischemia is proposed. We developed a portable device using this architecture, which allows real-time electrocardiogram (ECG) signal acquisition and analysis for cardiac disease. The noisy ECG signal was preprocessed by an analog front-end consisting of analog filters and amplifiers before being converted into digital data. The analog front-end was minimized, to reduce device size and power consumption, by implementing some of its functions as digital filters realized in software. From the ECG data, we detected QRS complexes based on wavelet analysis and extracted features describing morphological shape and regularity using an ARM processor. A classifier for cardiac disease was constructed from features extracted from a training dataset using support vector machines; the classifier then categorized the ECG data into normal beats, AFib, and myocardial ischemia. A portable ECG device was implemented and successfully acquired and processed ECG signals. The performance of the device was also verified by comparing the processed ECG data with high-quality ECG data from a public cardiac database. Because of the reduced computational complexity, the ARM processor was able to process up to a thousand samples per second, which allowed real-time acquisition and diagnosis of heart disease. Experimental results showed that the device classified AFib and ischemia with a sensitivity of 95.1% and a specificity of 95.9%. Current home care and telemedicine systems use a separate device and diagnostic service system, which results in additional time and cost. Our proposed portable ECG device instead provides both the captured ECG data and the suspect waveforms for identifying sporadic and chronic cardiac events, and has been built and evaluated for high signal quality, low computational complexity, and accurate detection.

  19. Broadband external cavity quantum cascade laser based sensor for gasoline detection

    NASA Astrophysics Data System (ADS)

    Ding, Junya; He, Tianbo; Zhou, Sheng; Li, Jinsong

    2018-02-01

    A new tunable laser spectroscopy sensor based on an external cavity quantum cascade laser (ECQCL) and a quartz crystal tuning fork (QCTF) was used for quantitative analysis of volatile organic compounds. In this work, the sensor system was tested on the analysis of different gasoline samples. For signal processing, a self-established interpolation algorithm and a multiple linear regression model were used for quantitative analysis of the major volatile organic compounds in the gasoline samples. The results were highly consistent with the standard spectra taken from the Pacific Northwest National Laboratory (PNNL) database. In the future, the ECQCL sensor will be applied to trace explosive, chemical warfare agent, and toxic industrial chemical detection and spectroscopic analysis.
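
    As an illustration of the multiple-linear-regression step, the sketch below fits a measured spectrum as a linear combination of reference spectra by least squares; the returned coefficients act as concentration estimates. The reference shapes and mixture here are synthetic placeholders, not PNNL data.

        import numpy as np

        def quantify(measured: np.ndarray, references: np.ndarray) -> np.ndarray:
            """references: shape (n_points, n_compounds); returns one coefficient
            (concentration estimate) per reference compound."""
            coef, *_ = np.linalg.lstsq(references, measured, rcond=None)
            return coef

        # toy mixture of two Gaussian 'reference spectra'
        wl = np.linspace(0.0, 1.0, 200)
        refs = np.stack([np.exp(-((wl - 0.3) / 0.05) ** 2),
                         np.exp(-((wl - 0.7) / 0.08) ** 2)], axis=1)
        mix = refs @ np.array([0.6, 1.4]) + 0.01 * np.random.randn(200)
        print(quantify(mix, refs))   # approximately [0.6, 1.4]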

  20. CoLiTec software - detection of the near-zero apparent motion

    NASA Astrophysics Data System (ADS)

    Khlamov, Sergii V.; Savanevych, Vadym E.; Briukhovetskyi, Olexandr B.; Pohorelov, Artem V.

    2017-06-01

    In this article we describe the CoLiTec software for fully automated frame processing. CoLiTec can process large volumes ("Big Data") of observational results, including data that is formed continuously during observation. The tasks it solves include frame brightness equalization, moving-object detection, astrometry, photometry, etc. Along with high efficiency in Big Data processing, CoLiTec also ensures high accuracy of data measurements. A comparative analysis of functional characteristics and positional accuracy was performed between the CoLiTec and Astrometrica software, showing the benefits of CoLiTec with wide-field and low-quality frames. The efficiency of the CoLiTec software is evidenced by about 700,000 observations and over 1,500 preliminary discoveries.

  1. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery). Including cases that led to slightly modified but clinically equivalent plans, 89.1% were detected by the SCED method within 2°. Based on the type of check that detected the error, determination of error sources was achieved. With noise ranging from no random noise to four times the established noise value, the averaged relevant dose error detection rate of the SCED method was between 94.0% and 95.8% and that of gamma between 82.8% and 89.8%. An EPID-frame-based error detection process for VMAT deliveries was successfully designed and tested via simulations. The SCED method was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of relevant dose errors. Compared to a typical (3%, 3 mm) gamma analysis, the SCED method produced a higher detection rate for all introduced dose errors, identified errors in an earlier stage, displayed a higher robustness to noise variations, and indicated the error source. © 2017 American Association of Physicists in Medicine.
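
    To make the Swiss-cheese idea concrete, the sketch below chains independent checks in the order described above and reports the first layer that fails, which indicates the likely error source. The check internals and tolerances are stubs; in practice they would come from the commissioning data behind the published tolerances.

        from typing import Callable, Optional

        Check = tuple[str, Callable[[dict], bool]]

        def sced(frame: dict, checks: list[Check]) -> Optional[str]:
            for name, passed in checks:    # consecutively executed, independent layers
                if not passed(frame):
                    return name            # first failing layer ~ likely error source
            return None                    # all layers passed for this gantry interval

        checks: list[Check] = [  # stub tolerances, not the published values
            ("aperture", lambda f: f["out_of_field_dose"] < 0.01),
            ("output normalization", lambda f: abs(f["output_ratio"] - 1.0) < 0.03),
            ("global alignment", lambda f: f["shift_mm"] < 1.0),
            ("pixel intensity / gamma", lambda f: f["gamma_pass_rate"] > 0.95),
        ]
        frame = {"out_of_field_dose": 0.0, "output_ratio": 1.06,
                 "shift_mm": 0.2, "gamma_pass_rate": 0.99}
        print(sced(frame, checks))   # -> "output normalization"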

  2. A hierarchical model for estimating the spatial distribution and abundance of animals detected by continuous-time recorders

    USGS Publications Warehouse

    Dorazio, Robert; Karanth, K. Ullas

    2017-01-01

    Motivation: Several spatial capture-recapture (SCR) models have been developed to estimate animal abundance by analyzing the detections of individuals in a spatial array of traps. Most of these models do not use the actual dates and times of detection, even though this information is readily available when using continuous-time recorders, such as microphones or motion-activated cameras. Instead, most SCR models either partition the period of trap operation into a set of subjectively chosen discrete intervals and ignore multiple detections of the same individual within each interval, or they simply use the frequency of detections during the period of trap operation and ignore the observed times of detection. Both practices make inefficient use of potentially important information in the data. Model and data analysis: We developed a hierarchical SCR model to estimate the spatial distribution and abundance of animals detected with continuous-time recorders. Our model includes two kinds of point processes: a spatial process to specify the distribution of latent activity centers of individuals within the region of sampling, and a temporal process to specify temporal patterns in the detections of individuals. We illustrated this SCR model by analyzing spatial and temporal patterns evident in the camera-trap detections of tigers living in and around the Nagarahole Tiger Reserve in India. We also conducted a simulation study to examine the performance of our model when analyzing data sets of greater complexity than the tiger data. Benefits: Our approach provides three important benefits. First, it exploits all of the information in SCR data obtained using continuous-time recorders. Second, it is sufficiently versatile to allow the effects of both space use and behavior of animals to be specified as functions of covariates that vary over space and time. Third, it allows both the spatial distribution and abundance of individuals to be estimated, effectively providing a species distribution model, even in cases where spatial covariates of abundance are unknown or unavailable. We illustrated these benefits in the analysis of our data, which allowed us to quantify differences between nocturnal and diurnal activities of tigers and to estimate their spatial distribution and abundance across the study area. Our continuous-time SCR model allows an analyst to specify many of the ecological processes thought to be involved in the distribution, movement, and behavior of animals detected in a spatial trapping array of continuous-time recorders. We plan to extend this model to estimate the population dynamics of animals detected during multiple years of SCR surveys.

  3. Detection of illicit drugs with the technique of spectral fluorescence signatures (SFS)

    NASA Astrophysics Data System (ADS)

    Poryvkina, Larisa; Babichenko, Sergey

    2010-10-01

    The SFS technology has already proved its analytical capabilities in a variety of industrial and environmental tasks, and has recently been introduced for forensic applications. The key features of the SFS method, measuring a three-dimensional fluorescence spectrum of the sample (intensity versus excitation and emission wavelengths) followed by recognition of the specific SFS spectral patterns of individual drugs, provide an effective tool for the analysis of untreated seized samples, without any preparatory separation of the substance of interest from its mixture with accompanying cutting agents and diluents. In this approach, chemical analysis of the sample is replaced by analysis of the SFS matrix visualized as an optical image. The SFS technology of drug detection is realized in the NarTest® NTX2000 analyzer, a compact device intended to measure suspicious samples in liquid, solid and powder forms. It simplifies the detection process through fully automated SFS measurement procedures and an integrated expert system for recognition of spectral patterns. Presently the expert system of the NTX2000 is able to detect marijuana, cocaine, heroin, MDMA, amphetamine and methamphetamine, with a detection limit down to 5% drug concentration in various mixtures. Numerous tests with street samples confirmed that the SFS method provides reliable results with high sensitivity and selectivity for the identification of drugs of abuse. More than 3000 street samples of the aforesaid drugs were analyzed with the NTX2000 during the validation process, and the SFS results corresponded to the conclusions of standard forensic GC/MS analyses in 99.4% of cases.

  4. Real-Time and Secure Wireless Health Monitoring

    PubMed Central

    Dağtaş, S.; Pekhteryev, G.; Şahinoğlu, Z.; Çam, H.; Challa, N.

    2008-01-01

    We present a framework for a wireless health monitoring system using wireless networks such as ZigBee. Vital signals are collected and processed using a 3-tiered architecture. The first tier is the mobile device carried on the body, which runs a number of wired and wireless probes and is also designed to perform basic processing such as heart rate computation and fatal-failure detection. At the second tier, further processing is performed by a local server using the raw data transmitted continuously by the mobile device; the raw data is also stored at this server. The processed data and analysis results are then transmitted to the service provider center for diagnostic review as well as storage. The main advantages of the proposed framework are (1) the ability to detect signals wirelessly within a body sensor network (BSN), (2) low-power and reliable data transmission through ZigBee network nodes, (3) secure transmission of medical data over the BSN, (4) efficient channel allocation for medical data transmission over wireless networks, and (5) optimized analysis of data using an adaptive architecture that maximizes the utility of processing and computational capacity at each platform. PMID:18497866

  5. Diversity of sulfate-reducing bacteria in a plant using deep geothermal energy

    NASA Astrophysics Data System (ADS)

    Alawi, Mashal; Lerm, Stephanie; Vetter, Alexandra; Wolfgramm, Markus; Seibt, Andrea; Würdemann, Hilke

    2011-06-01

    Enhanced process understanding of engineered geothermal systems is a prerequisite for optimizing plant reliability and economy. We investigated microbial, geochemical and mineralogical aspects of a geothermal groundwater system located in the Molasse Basin by fluid analysis. The fluids are characterized by temperatures ranging from 61°C to 103°C, salinities from 600 to 900 mg/l, and a dissolved organic carbon (DOC) content between 6.4 and 19.3 mg C/l. The microbial population of the fluid samples was analyzed by genetic fingerprinting techniques based on PCR-amplified 16S rRNA and dissimilatory sulfite reductase genes. Despite the high temperatures, microbes were detected in all investigated fluids. Fingerprinting and DNA sequencing enabled a correlation to metabolic classes and biogeochemical processes. The analysis revealed a broad diversity of sulfate-reducing bacteria. Overall, the detection of microbes known to be involved in biocorrosion and mineral precipitation indicates that microorganisms could play an important role in understanding the processes in engineered geothermal systems.

  6. Applications of satellite image processing to the analysis of Amazonian cultural ecology

    NASA Technical Reports Server (NTRS)

    Behrens, Clifford A.

    1991-01-01

    This paper examines the application of satellite image processing towards identifying and comparing resource exploitation among indigenous Amazonian peoples. The use of statistical and heuristic procedures for developing land cover/land use classifications from Thematic Mapper satellite imagery will be discussed along with actual results from studies of relatively small (100 - 200 people) settlements. Preliminary research indicates that analysis of satellite imagery holds great potential for measuring agricultural intensification, comparing rates of tropical deforestation, and detecting changes in resource utilization patterns over time.

  7. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    PubMed

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

    A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems, and as such is capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from the metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology), and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise; the overall time needed for conversion, processing and searching of 30 raw data sets against 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained from the analysis of soil and plant samples, and the results are compared with the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than the standard processing and also adds a quantitative estimate. The software tool set, in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry, shows great potential as a highly automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Ultra-sensitive detection of tumorigenic cellular impurities in human cell-processed therapeutic products by digital analysis of soft agar colony formation.

    PubMed

    Kusakawa, Shinji; Yasuda, Satoshi; Kuroda, Takuya; Kawamata, Shin; Sato, Yoji

    2015-12-08

    Contamination with tumorigenic cellular impurities is one of the most pressing concerns for human cell-processed therapeutic products (hCTPs). The soft agar colony formation (SACF) assay, which is a well-known in vitro assay for the detection of malignant transformed cells, is applicable for the quality assessment of hCTPs. Here we established an image-based screening system for the SACF assay using a high-content cell analyzer termed the digital SACF assay. Dual fluorescence staining of formed colonies and the dissolution of soft agar led to accurate detection of transformed cells with the imaging cytometer. Partitioning a cell sample into multiple wells of culture plates enabled digital readout of the presence of colonies and elevated the sensitivity for their detection. In practice, the digital SACF assay detected impurity levels as low as 0.00001% of the hCTPs, i.e. only one HeLa cell contained in 10,000,000 human mesenchymal stem cells, within 30 days. The digital SACF assay saves time, is more sensitive than in vivo tumorigenicity tests, and would be useful for the quality control of hCTPs in the manufacturing process.
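
    The "digital" readout obtained by partitioning a sample across wells is analogous to digital PCR, so, under an assumed Poisson loading of impurities, the fraction of colony-positive wells yields a maximum-likelihood estimate of transformed cells per well. This framing is our illustration; the paper itself reports per-well colony detection by imaging.

        import math

        def cells_per_well(total_wells: int, positive_wells: int) -> float:
            """ML estimate of mean impurities per well, assuming Poisson loading."""
            p = positive_wells / total_wells
            if p >= 1.0:
                raise ValueError("all wells positive: concentration not estimable")
            return -math.log(1.0 - p)    # P(no impurity in a well) = exp(-lambda)

        # e.g. 9 colony-positive wells out of 96
        print(f"~{cells_per_well(96, 9):.3f} transformed cells per well on average")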

  10. Real-time traffic sign detection and recognition

    NASA Astrophysics Data System (ADS)

    Herbschleb, Ernst; de With, Peter H. N.

    2009-01-01

    The continuous growth of imaging databases increasingly requires analysis tools for feature extraction. In this paper, a new architecture for the detection of traffic signs is proposed. The architecture is designed to process a large database with tens of millions of images at resolutions up to 4,800x2,400 pixels. Because of the size of the database, both high reliability and high throughput are required. The novel architecture consists of a three-stage algorithm with multiple steps per stage, combining color and specific spatial information. The first stage contains an area-limitation step which is performance-critical for both the detection rate and the overall processing time. The second stage locates candidate traffic signs using recently published feature processing. The third stage contains a validation step to enhance the reliability of the algorithm; during this stage, the traffic signs are recognized. Experiments show a convincing detection rate of 99%. With respect to computational speed, the throughput is 35 Hz for line-of-sight images of 800×600 pixels and 4 Hz for panorama images. Our novel architecture outperforms existing algorithms with respect to both detection rate and throughput.

  11. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    PubMed

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulic simulation models, can outperform the signal-processing-based EDS. We also show that the model-based EDS facilitates the development of a Multi-Site EDS (MSEDS), which analyzes the data from all monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed in increased detection accuracy (more true-positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
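
    A minimal sketch of model-based, multi-site detection is shown below: standardized residuals between sensor readings and simulated expectations are combined across all stations into a joint chi-square statistic. The simulator outputs, noise levels and threshold here are assumptions for illustration, not the paper's implementation.

        import numpy as np
        from scipy.stats import chi2

        def multisite_alarm(readings: np.ndarray, simulated: np.ndarray,
                            sigma: np.ndarray, alpha: float = 0.001):
            """Joint test over all stations: under the no-event hypothesis the sum
            of squared standardized residuals is approximately chi-square(n)."""
            z = (readings - simulated) / sigma     # residuals vs. the simulation model
            stat = float(np.sum(z ** 2))
            return stat > chi2.ppf(1.0 - alpha, df=len(z)), stat

        readings = np.array([0.52, 0.48, 0.95, 0.50])   # e.g. chlorine [mg/l]
        simulated = np.array([0.50, 0.49, 0.51, 0.50])  # expected from the model
        sigma = np.full(4, 0.05)                        # assumed sensor noise levels
        print(multisite_alarm(readings, simulated, sigma))   # -> (True, ...)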

  12. Theoretical detection limit of PIXE analysis using 20 MeV proton beams

    NASA Astrophysics Data System (ADS)

    Ishii, Keizo; Hitomi, Keitaro

    2018-02-01

    Particle-induced X-ray emission (PIXE) analysis is usually performed using proton beams with energies in the range 2∼3 MeV, because at these energies the detection limit is low. The detection limit of PIXE analysis depends on the X-ray production cross-section, the continuous background of the PIXE spectrum, and experimental parameters such as the beam current and the solid angle and efficiency of the X-ray detector. Though the continuous background increases with projectile energy, the X-ray production cross-section increases as well; therefore, the detection limit of high-energy proton PIXE is not expected to worsen significantly. We calculated the cross-sections of continuous X-rays produced in several bremsstrahlung processes and estimated the detection limit of 20 MeV proton PIXE analysis by modelling the Compton tail of the γ-rays produced in nuclear reactions and the escape effect on the secondary-electron bremsstrahlung. We found that the Compton tail does not affect the detection limit when a thin X-ray detector is used, but the secondary-electron bremsstrahlung escape effect does have an impact. We also confirmed that the detection limit of PIXE analysis, with a 4 μm polyethylene backing film and an integrated beam current of 1 μC, is 0.4∼2.0 ppm for proton energies in the range 10∼30 MeV and elements with Z = 16-90. This result demonstrates the usefulness for PIXE analysis of the several-tens-of-MeV cyclotrons currently installed in positron emission tomography (PET) centers.

  13. Analysis of volatile organic compounds from illicit cocaine samples

    NASA Astrophysics Data System (ADS)

    Robins, W. H.; Wright, Bob W.

    1994-10-01

    Detection of illicit cocaine hydrochloride shipments can be improved if there is a greater understanding of the identity and quantity of volatile compounds present. This study provides preliminary data concerning the volatile organic compounds detected in a limited set of cocaine hydrochloride samples. In all cases, cocaine was one of the major volatile compounds detected. Other tropeines were detected in almost all samples. Low concentrations of compounds which may be residues of processing solvents were observed in some samples. The equilibrium emissivity of cocaine from cocaine hydrochloride was investigated and a value of 83 parts-per-trillion was determined.

  14. Information theoretic analysis of canny edge detection in visual communication

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In typical edge-detection evaluation, edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission and display processes, which do impact the quality of the acquired image and thus the resulting edge image. We propose a new information-theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. An edge detection algorithm is here considered to achieve high performance only if the information rate from the scene to the edge map approaches the maximum possible. Thus, by holding the initial conditions of the visual communication system constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift-variant, we perform the estimation for a set of different system environment conditions using simulations. In this paper we first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we assess the Canny operator using the information-theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.

  15. Fabricating PFPE Membranes for Microfluidic Valves and Pumps

    NASA Technical Reports Server (NTRS)

    Greer, Frank; White, Victor E.; Lee, Michael C.; Willis, Peter A.; Grunthaner, Frank J.; Rolland, Jason; Rolland, Jason

    2009-01-01

    A process has been developed for fabricating membranes of a perfluoropolyether (PFPE) and integrating them into valves and pumps in laboratory-on-a-chip microfluidic devices. Membranes of poly(tetrafluoroethylene) [PTFE] and poly(dimethylsiloxane) [PDMS] have been considered for this purpose and found wanting. By making it possible to use PFPE instead of PTFE or PDMS, the present process expands the array of options for further development of microfluidic devices for diverse applications, which could include detection of biochemicals of interest, detection of toxins and biowarfare agents, synthesis and analysis of proteins, medical diagnosis, and synthesis of fuels.

  16. Applications of ICA and fractal dimension in sEMG signal processing for subtle movement analysis: a review.

    PubMed

    Naik, Ganesh R; Arjunan, Sridhar; Kumar, Dinesh

    2011-06-01

    Surface electromyography (sEMG) signal separation and decomposition have always been interesting research topics in the field of rehabilitation and medical research. Subtle myoelectric control is an advanced technique concerned with the detection, processing, classification, and application of myoelectric signals to control human-assisting robots or rehabilitation devices. This paper reviews recent research and development in independent component analysis and fractal dimension analysis for sEMG pattern recognition, and presents state-of-the-art achievements in terms of their type, structure, and potential application. Directions for future research are also briefly outlined.

  17. A survey of the use of soy in processed Turkish meat products and detection of genetic modification.

    PubMed

    Ulca, Pelin; Balta, Handan; Senyuva, Hamide Z

    2014-01-01

    To screen for possible illegal use of soybeans in meat products, the performance characteristics of a commercial polymerase chain reaction (PCR) kit for the detection of soybean DNA in raw and cooked meat products were established. Minced chicken and beef products containing soybean at levels from 0.1% to 10.0% were analysed by real-time PCR to amplify the soybean lectin gene. The PCR method could reliably detect the addition of soybean at a level of 0.1%. A survey of 38 Turkish processed meat products found only six samples negative for the presence of soybean. Of the 32 (84%) positive samples, 13 (34%) contained soy levels above 0.1%. For the soybean-positive samples, further DNA analysis was conducted by real-time PCR to determine whether genetically modified (GM) soybean had been used. Of the 32 meat samples containing soybean, two were positive for genetic modification.

  18. The Architecture Design of Detection and Calibration System for High-voltage Electrical Equipment

    NASA Astrophysics Data System (ADS)

    Ma, Y.; Lin, Y.; Yang, Y.; Gu, Ch; Yang, F.; Zou, L. D.

    2018-01-01

    With the construction of the Material Quality Inspection Center of the Shandong electric power company, the Electric Power Research Institute has taken on more work in quality analysis and laboratory calibration for high-voltage electrical equipment, making informationization construction urgent. In this paper we design a consolidated system that implements electronic management and an online automated process for material sampling, test apparatus detection and field testing. In these three jobs we use QR-code scanning, online Word editing and electronic signatures. These techniques simplify the complex processes of warehouse management and test report transfer, and greatly reduce manual procedures. The construction of the standardized detection information platform realizes integrated management of high-voltage electrical equipment from networking and operation through periodic detection. According to a system operation evaluation, the speed of transferring reports is doubled, and querying data is also easier and faster.

  19. ECG Based Heart Arrhythmia Detection Using Wavelet Coherence and Bat Algorithm

    NASA Astrophysics Data System (ADS)

    Kora, Padmavathi; Sri Rama Krishna, K.

    2016-12-01

    Atrial fibrillation (AF) is a type of heart abnormality: during AF, electrical discharges in the atrium are rapid, resulting in an abnormal heart beat. The morphology of the ECG changes due to such abnormalities in the heart. The approach in this paper consists of three major steps for the detection of heart disease: signal pre-processing, feature extraction and classification. Feature extraction is the key process in detecting heart abnormality, and most ECG detection systems depend on time-domain features for cardiac signal classification. In this paper we propose a wavelet coherence (WTC) technique for ECG signal analysis. The WTC measures the similarity between two waveforms in the frequency domain, and parameters extracted from the WTC function are used as the features of the ECG signal. These features are optimized using the Bat algorithm, and a Levenberg-Marquardt neural network classifier is used to classify the optimized features. The performance of the classifier improves with the optimized features.
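
    As a simplified stand-in for the wavelet coherence features, the sketch below extracts band-averaged magnitude-squared coherence between a beat and a template using scipy; a full WTC would instead be computed from continuous wavelet transforms with a dedicated package. The sampling rate, band count and toy signals are illustrative assumptions.

        import numpy as np
        from scipy.signal import coherence

        def coherence_features(beat: np.ndarray, template: np.ndarray,
                               fs: float = 360.0, n_bands: int = 8) -> np.ndarray:
            """Band-averaged magnitude-squared coherence between beat and template."""
            _, cxy = coherence(beat, template, fs=fs, nperseg=128)
            bands = np.array_split(cxy, n_bands)      # equal-width frequency bands
            return np.array([b.mean() for b in bands])

        # toy usage with one-second windows at an assumed 360 Hz sampling rate
        t = np.arange(360) / 360.0
        template = np.sin(2 * np.pi * 10 * t)
        beat = template + 0.5 * np.random.randn(t.size)
        print(coherence_features(beat, template))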

  20. Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces.

    PubMed

    Abu-Alqumsan, Mohammad; Peer, Angelika

    2016-06-01

    Spatial filtering has proved to be a powerful pre-processing step in detection of steady-state visual evoked potentials and boosted typical detection rates both in offline analysis and online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters used thereby share many common foundations as they all build upon the second order statistics of the acquired Electroencephalographic (EEG) data, that is, its spatial autocovariance and cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters used thereby. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of the autoregressive spectral analysis in estimating the signal and noise power levels. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination methods were found to provide better estimates for different SNR levels. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.
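
    For reference, a minimal version of the standard CCA detector analyzed in the study is sketched below: for each candidate stimulus frequency, multichannel EEG is correlated against sine/cosine references (including harmonics), and the frequency with the largest canonical correlation is selected. Parameter values and the synthetic data are illustrative.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        def cca_score(eeg: np.ndarray, freq: float, fs: float,
                      n_harmonics: int = 2) -> float:
            """eeg: shape (n_samples, n_channels); returns the first canonical
            correlation against references at freq and its harmonics."""
            t = np.arange(eeg.shape[0]) / fs
            refs = np.column_stack([f(2 * np.pi * (h + 1) * freq * t)
                                    for h in range(n_harmonics)
                                    for f in (np.sin, np.cos)])
            u, v = CCA(n_components=1).fit_transform(eeg, refs)
            return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

        def detect_ssvep(eeg: np.ndarray, freqs: list, fs: float = 250.0):
            scores = [cca_score(eeg, f, fs) for f in freqs]
            return freqs[int(np.argmax(scores))]   # frequency the user attends to

        # synthetic 2-channel EEG while attending a 12 Hz stimulus
        fs, t = 250.0, np.arange(1000) / 250.0
        eeg = np.column_stack([np.sin(2 * np.pi * 12 * t),
                               np.cos(2 * np.pi * 12 * t)])
        eeg += 0.8 * np.random.randn(*eeg.shape)
        print(detect_ssvep(eeg, [8.0, 10.0, 12.0, 15.0]))   # -> 12.0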

  1. Automatic Detection and Classification of Audio Events for Road Surveillance Applications.

    PubMed

    Almaadeed, Noor; Asim, Muhammad; Al-Maadeed, Somaya; Bouridane, Ahmed; Beghdadi, Azeddine

    2018-06-06

    This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents, with an aim to improve safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy rate compared with methods that use individual temporal and spectral features.

  2. Towards the Automatic Detection of Pre-Existing Termite Mounds through UAS and Hyperspectral Imagery.

    PubMed

    Sandino, Juan; Wooler, Adam; Gonzalez, Felipe

    2017-09-24

    The increased technological development of Unmanned Aerial Vehicles (UAVs), combined with artificial intelligence and Machine Learning (ML) approaches, has opened the possibility of remote sensing of extensive areas of arid land. In this paper, a novel approach to the detection of termite mounds using a UAV, hyperspectral imagery, ML and digital image processing is presented. A new pipeline process is proposed to detect termite mounds automatically and, consequently, to reduce detection times. For the classification stage, the outcomes of several ML classification algorithms were studied, and support vector machines were selected as the best approach for classifying pre-existing termite mounds in images. Various test conditions were applied to the proposed algorithm, obtaining an overall accuracy of 68%. Images with satisfactory mound detection proved that the method is "resolution-dependent". Mounds were detected regardless of their rotation and position in the aerial image. However, image distortion reduced the number of detected mounds, due to the inclusion of a shape analysis method in the object detection phase, and image resolution remains determinant for obtaining accurate results. Hyperspectral imagery demonstrated better capabilities for classifying a large set of materials than implementing traditional segmentation methods on RGB images alone.

  3. Humour processing in frontotemporal lobar degeneration: A behavioural and neuroanatomical analysis

    PubMed Central

    Clark, Camilla N.; Nicholas, Jennifer M.; Henley, Susie M.D.; Downey, Laura E.; Woollacott, Ione O.; Golden, Hannah L.; Fletcher, Phillip D.; Mummery, Catherine J.; Schott, Jonathan M.; Rohrer, Jonathan D.; Crutch, Sebastian J.; Warren, Jason D.

    2015-01-01

    Humour is a complex cognitive and emotional construct that is vulnerable in neurodegenerative diseases, notably the frontotemporal lobar degenerations. However, humour processing in these diseases has been little studied. Here we assessed humour processing in patients with behavioural variant frontotemporal dementia (n = 22, mean age 67 years, four female) and semantic dementia (n = 11, mean age 67 years, five female) relative to healthy individuals (n = 21, mean age 66 years, 11 female), using a joint cognitive and neuroanatomical approach. We created a novel neuropsychological test requiring a decision about the humorous intent of nonverbal cartoons, in which we manipulated orthogonally humour content and familiarity of depicted scenarios. Structural neuroanatomical correlates of humour detection were assessed using voxel-based morphometry. Assessing performance in a signal detection framework and after adjusting for standard measures of cognitive function, both patient groups showed impaired accuracy of humour detection in familiar and novel scenarios relative to healthy older controls (p < .001). Patient groups showed similar overall performance profiles; however the behavioural variant frontotemporal dementia group alone showed a significant advantage for detection of humour in familiar relative to novel scenarios (p = .045), suggesting that the behavioural variant syndrome may lead to particular difficulty decoding novel situations for humour, while semantic dementia produces a more general deficit of humour detection that extends to stock comedic situations. Humour detection accuracy was associated with grey matter volume in a distributed network including temporo-parietal junctional and anterior superior temporal cortices, with predominantly left-sided correlates of processing humour in familiar scenarios and right-sided correlates of processing novel humour. The findings quantify deficits of core cognitive operations underpinning humour processing in frontotemporal lobar degenerations and suggest a candidate brain substrate in cortical hub regions processing incongruity and semantic associations. Humour is a promising candidate tool with which to assess complex social signal processing in neurodegenerative disease. PMID:25973788

  4. [Online endpoint detection algorithm for blending process of Chinese materia medica].

    PubMed

    Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang

    2017-03-01

    The blending process, an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. Since the official release of the Guidance for Industry PAT, online process analysis techniques have been reported more and more in applications to blending processes, but research on endpoint detection algorithms is still at an early stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm is proposed that extends plain MBSD from the off-line scenario to the online scenario; it was used to determine the endpoint of the blending process for Chinese medicine dispensing granules. Through online tuning of the window size, status changes of the materials during blending are reflected in the standard deviation calculation in real time. The proposed method was tested separately on the blending of dextrin and of three extracts of traditional Chinese medicine. All results show that, compared with the traditional MBSD method, the window-size changes of the proposed progressively-increasing-window MBSD more clearly reflect the status changes of the materials in the blending process, so it is suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
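
    A minimal sketch of plain MBSD endpoint detection is given below: the blend is declared homogeneous once the moving-window standard deviation of a monitored signal stays below a threshold for several consecutive blocks. The progressive window-size tuning that the paper proposes is not reproduced here; the window, threshold and hold values are placeholders.

        from typing import Optional
        import numpy as np

        def mbsd_endpoint(signal: np.ndarray, window: int = 10,
                          threshold: float = 0.01, hold: int = 5) -> Optional[int]:
            """Declare the endpoint once the moving-block standard deviation stays
            below threshold for `hold` consecutive blocks."""
            run = 0
            for i in range(window, len(signal) + 1):
                sd = np.std(signal[i - window:i])   # standard deviation of the block
                run = run + 1 if sd < threshold else 0
                if run >= hold:
                    return i - 1                    # sample index of declared endpoint
            return None                             # blend not yet homogeneous

        # toy feature: high variability while inhomogeneous, low once blended
        x = np.concatenate([np.random.randn(150) * 0.2,
                            np.random.randn(100) * 0.002])
        print(mbsd_endpoint(x))   # shortly after sample 150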

  5. Ship detection from high-resolution imagery based on land masking and cloud filtering

    NASA Astrophysics Data System (ADS)

    Jin, Tianming; Zhang, Junping

    2015-12-01

    High-resolution satellite images currently play an important role in target detection applications. This article focuses on ship target detection in high-resolution panchromatic images. Taking advantage of geographic information such as the coastline vector data provided by the NOAA Medium Resolution Coastline program, the land region, a main source of false alarms in the ship detection process, is masked out. After that, the algorithm deals with cloud noise, which appears frequently in ocean satellite images and is another cause of false alarms. Based on an analysis of the cloud noise's features in the frequency domain, we introduce a windowed noise filter to remove it. With the help of morphological processing algorithms adapted to target detection, we are able to acquire ship targets with well-defined shapes. In addition, we display extracted information such as the length and width of ship targets in a user-friendly way, i.e. as a KML file interpreted by Google Earth.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Yanmei; Li, Xinli; Bai, Yan

    The measurement of multiphase flow parameters is of great importance in a wide range of industries. In multiphase measurement, the signals from the sensors are extremely weak and often buried in strong background noise. It is thus desirable to develop effective signal processing techniques that can detect the weak signal in the sensor outputs. In this paper, two methods, the lock-in amplifier (LIA) and an improved Duffing chaotic oscillator, are compared for detecting and processing the weak signal. For a sinusoidal signal buried in noise, correlation detection with a sinusoidal reference signal is simulated using the LIA. The improved Duffing chaotic oscillator method, which is based on the Wigner transformation, can restore the signal waveform and detect its frequency. The two methods are then combined to detect and extract the weak signal. Simulation results show the effectiveness and accuracy of the proposed improved method. The comparative analysis shows that the improved Duffing chaotic oscillator method strongly suppresses noise, since it is sensitive to initial conditions.
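
    To illustrate the LIA branch of the comparison, the sketch below implements digital lock-in (correlation) detection: the noisy input is multiplied by quadrature references at the known frequency and low-pass filtered (here by simple averaging), recovering the weak sinusoid's amplitude well below the noise floor. The Duffing-oscillator detector is not reproduced; all parameter values are illustrative.

        import numpy as np

        def lock_in(x: np.ndarray, f_ref: float, fs: float) -> float:
            """Recover the amplitude of a known-frequency sinusoid buried in noise."""
            t = np.arange(x.size) / fs
            i = np.mean(x * np.cos(2 * np.pi * f_ref * t))   # in-phase product
            q = np.mean(x * np.sin(2 * np.pi * f_ref * t))   # quadrature product
            return 2.0 * np.hypot(i, q)    # averaging acts as the low-pass filter

        fs, f0, a = 10_000.0, 137.0, 0.01
        t = np.arange(1_000_000) / fs
        x = a * np.sin(2 * np.pi * f0 * t) + 1.0 * np.random.randn(t.size)
        print(lock_in(x, f0, fs))   # close to 0.01 although the power SNR is ~5e-5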

  7. Resonant photoacoustic cell for pulsed laser analysis of gases at high temperature

    NASA Astrophysics Data System (ADS)

    Sorvajärvi, Tapio; Manninen, Albert; Toivonen, Juha; Saarela, Jaakko; Hernberg, Rolf

    2009-12-01

    A new approach to high temperature gas analysis by means of photoacoustic (PA) spectroscopy is presented. The transverse modes of the resonant PA cell were excited with a pulsed laser and detected with a microphone. Changes in the properties of the PA cell resulting from a varying temperature are discussed and considered when processing the PA signal. The feasibility of the proposed method was demonstrated by studying PA response from saturated vapor of potassium chloride (KCl) in the temperature range extending from 410 to 691 °C. The PA spectrum, the detection limit, and the signal saturation of KCl vapor are discussed. At 245 nm excitation wavelength and 300 μJ pulse energy, the achieved detection limit for KCl is 15 ppb.

  8. [Application of the mixed programming with Labview and Matlab in biomedical signal analysis].

    PubMed

    Yu, Lu; Zhang, Yongde; Sha, Xianzheng

    2011-01-01

    This paper introduces a method of mixed programming with Labview and Matlab and applies it to a pulse wave pre-processing and feature detection system. The method has proved suitable, efficient and accurate, providing a new kind of approach for biomedical signal analysis.

  9. Signal Detection Methods and Discriminant Analysis Applied to Categorization of Newspaper and Government Documents: A Preliminary Study.

    ERIC Educational Resources Information Center

    Ng, Kwong Bor; Rieh, Soo Young; Kantor, Paul

    2000-01-01

    Discussion of natural language processing focuses on experiments using linear discriminant analysis to distinguish "Wall Street Journal" texts from "Federal Register" texts using information about the frequency of occurrence of word boundaries, sentence boundaries, and punctuation marks. Displays and interprets results in terms…

  10. Identifying Students with Learning Disabilities: Composite Profile Analysis Using the Cognitive Assessment System

    ERIC Educational Resources Information Center

    Huang, Leesa V.; Bardos, Achilles N.; D'Amato, Rik Carl

    2010-01-01

    The detection of cognitive patterns in children with learning disabilities (LD) has been a priority in the identification process. Subtest profile analysis from traditional cognitive assessment has drawn sharp criticism for inaccurate identification and weak connections to educational planning. Therefore, the purpose of this study is to use a new…

  11. Unnecessary roughness? Testing the hypothesis that predators destined for molecular gut-content analysis must be hand-collected to avoid cross-contamination

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables direct detection of arthropod predation with minimal disruption of on-going ecosystem processes. Mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, could lead to regurgitation or even rupturing of predators along with uneaten ...

  12. Surface Enhanced Raman Spectroscopy (SERS) and multivariate analysis as a screening tool for detecting Sudan I dye in culinary spices

    NASA Astrophysics Data System (ADS)

    Di Anibal, Carolina V.; Marsal, Lluís F.; Callao, M. Pilar; Ruisánchez, Itziar

    2012-02-01

    Raman spectroscopy combined with multivariate analysis was evaluated as a tool for detecting Sudan I dye in culinary spices. Three Raman modalities were studied: normal Raman, FT-Raman and SERS. The results show that SERS is the most appropriate modality, capable of providing a proper Raman signal when a complex matrix is analyzed. To remove spectral noise and background, Savitzky-Golay smoothing with polynomial baseline correction and the wavelet transform were applied. Finally, to check whether unadulterated samples can be differentiated from samples adulterated with Sudan I dye, an exploratory analysis, principal component analysis (PCA), was applied to the raw data and to the data processed with the two mentioned strategies. The PCA results show that Raman spectra need to be properly treated if useful information is to be obtained, and both spectral treatments are appropriate for processing the Raman signal. The proposed methodology shows that SERS combined with appropriate spectral treatment can be used as a practical screening tool to distinguish samples suspected of being adulterated with Sudan I dye.
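
    The processing chain (smoothing, baseline correction, exploratory PCA) can be sketched as below; random arrays stand in for the measured spectra, and the use of a long second Savitzky-Golay pass as a crude baseline estimate is an assumption, not the authors' exact treatment:

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

# rows: samples (adulterated and unadulterated spices); columns: Raman shifts
rng = np.random.default_rng(1)
spectra = rng.normal(size=(40, 1024))      # placeholder for real spectra

# Savitzky-Golay smoothing, then a long-window pass as a baseline estimate
smoothed = savgol_filter(spectra, window_length=15, polyorder=3, axis=1)
baseline = savgol_filter(smoothed, window_length=301, polyorder=2, axis=1)
corrected = smoothed - baseline

# exploratory PCA: adulterated samples should separate in score space
scores = PCA(n_components=2).fit_transform(corrected)
print(scores.shape)                        # (40, 2)
```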

  13. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI) play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the processing time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS integrating a microfluidic device, an automated stage, and electrical relays for high-throughput clinical screening. Based on these results, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical means of analyzing tissue samples in a clinical setting.

  14. Information Assurance Technology Analysis Center Information Assurance Tools Report Intrusion Detection

    DTIC Science & Technology

    1998-01-01

    … statistics such as central processing unit (CPU) usage, disk input/output (I/O), memory usage, user activity, and number of logins attempted. The statistics … EMERALD — commercial anomaly detection and system monitoring; SRI; porras@csl.sri.com; www.csl.sri.com/emerald/index.html. Gabriel — commercial system … sensors; it starts to protect the network with minimal configuration and maximum intelligence. … EMERALD (Event Monitoring …

  15. Track Score Processing of Multiple Dissimilar Sensors

    DTIC Science & Technology

    2007-06-01

    sensors (infrared and light detection and ranging system) and one radio frequency sensor (radar). The signal-to-noise ratio and design considerations … categorized as Johnson noise, shot noise, generation-recombination noise, temperature noise, microphonic noise, 1/f noise, and finally electronic … of 2.1 µm. The values of detectivity in this figure were derived from an analysis of commercial detectors, under background-limited conditions, at

  16. A novel sulfate-reducing bacteria detection method based on inhibition of cysteine protease activity.

    PubMed

    Qi, Peng; Zhang, Dun; Wan, Yi

    2014-11-01

    Sulfate-reducing bacteria (SRB) have been extensively studied in corrosion and environmental science. However, fast enumeration of an SRB population is still a difficult task. This work presents a novel specific SRB detection method based on the inhibition of cysteine protease activity. The hydrolytic activity of cysteine protease is inhibited by sulfide, the characteristic metabolic product of SRB, which attacks the active cysteine thiol group in the cysteine protease catalytic site. This active-thiol S-sulfhydration process can be used for SRB detection, since the amount of sulfide accumulated in the culture medium is highly correlated with the initial bacterial concentration. The working conditions of the cysteine protease were optimized to obtain better detection capability, and the SRB detection performance was evaluated in this work. The proposed SRB detection method based on inhibition of cysteine protease activity avoids the use of biological recognition elements. In addition, compared with the widely used most probable number (MPN) method, which takes at least 15 days to accomplish the whole detection process, the method based on inhibition of papain activity can detect SRB in 2 days, with a detection limit of 5.21×10² cfu mL⁻¹. The detection time for quantitative analysis of SRB populations is thereby greatly shortened. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Rapid processing of PET list-mode data for efficient uncertainty estimation and data analysis

    NASA Astrophysics Data System (ADS)

    Markiewicz, P. J.; Thielemans, K.; Schott, J. M.; Atkinson, D.; Arridge, S. R.; Hutton, B. F.; Ourselin, S.

    2016-07-01

    In this technical note we propose a rapid and scalable software solution for the processing of PET list-mode data, which allows the efficient integration of list-mode data processing into the workflow of image reconstruction and analysis. All processing is performed on the graphics processing unit (GPU), making use of streamed and concurrent kernel execution together with data transfers between disk and CPU memory as well as between CPU and GPU memory. This approach leads to fast generation of multiple bootstrap realisations, and when combined with fast image reconstruction and analysis, it enables assessment of uncertainties of any image statistic and of any component of the image generation process (e.g. random correction, image processing) within reasonable time frames (e.g. within five minutes per realisation). This is of particular value when handling complex chains of image generation and processing. The software outputs the following: (1) an estimate of expected random event data for noise reduction; (2) dynamic prompt and random sinograms of span-1 and span-11; and (3) variance estimates based on multiple bootstrap realisations of (1) and (2), assuming reasonable count levels for acceptable accuracy. In addition, the software produces statistics and visualisations for immediate quality control and crude motion detection, such as: (1) count rate curves; (2) centre-of-mass plots of the radiodistribution for motion detection; (3) video of dynamic projection views for fast visual list-mode skimming and inspection; (4) full normalisation factor sinograms. To demonstrate the software, we present an example of the above processing for fast uncertainty estimation of regional SUVR (standard uptake value ratio) calculation for a single PET scan of 18F-florbetapir using the Siemens Biograph mMR scanner.
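
    The bootstrap component can be illustrated independently of the GPU pipeline: each realisation resamples the event list with replacement and recomputes the statistic of interest, so the spread across realisations estimates its uncertainty. A plain numpy sketch with a toy statistic:

```python
import numpy as np

def bootstrap_statistic(events, stat, n_boot=20, seed=0):
    """Nonparametric bootstrap over a list of events.

    Resamples the event list with replacement n_boot times and returns the
    mean and standard deviation of the statistic across realisations.
    """
    rng = np.random.default_rng(seed)
    n = len(events)
    reps = [stat(events[rng.integers(0, n, n)]) for _ in range(n_boot)]
    return np.mean(reps), np.std(reps, ddof=1)

# toy example: uncertainty of a mean "uptake" value over one million events
events = np.random.default_rng(2).exponential(1.0, 1_000_000)
print(bootstrap_statistic(events, np.mean))
```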

  18. Multi-analyte profiling of inflammatory mediators in COPD sputum--the effects of processing.

    PubMed

    Pedersen, Frauke; Holz, Olaf; Lauer, Gereon; Quintini, Gianluca; Kiwull-Schöne, Heidrun; Kirsten, Anne-Marie; Magnussen, Helgo; Rabe, Klaus F; Goldmann, Torsten; Watz, Henrik

    2015-02-01

    Prior to using a new multi-analyte platform for the detection of markers in sputum, it is advisable to assess whether sputum processing, especially mucus homogenization by dithiothreitol (DTT), affects the analysis. In this study we tested a novel Human Inflammation Multi Analyte Profiling® Kit (v1.0 Luminex platform; xMAP®). Induced sputum samples of 20 patients with stable COPD (mean FEV1, 59.2% pred.) were processed in parallel using standard processing (with DTT) and a more time-consuming sputum dispersion method with phosphate buffered saline (PBS) only. A panel of 47 markers was analyzed in these sputum supernatants by the xMAP®. Twenty-five of the 47 analytes were detected in COPD sputum. Interestingly, seven markers were detected only in sputum processed with DTT, or were observed at significantly higher levels following DTT treatment (VDBP, α-2-macroglobulin, haptoglobin, α-1-antitrypsin, VCAM-1, and fibrinogen). However, standard DTT processing resulted in lower detectable concentrations of ferritin, TIMP-1, MCP-1, MIP-1β, ICAM-1, and complement C3. The correlation between processing methods for the different markers indicates that DTT processing does not introduce a bias by affecting individual sputum samples differently. In conclusion, our data demonstrate that the Luminex-based xMAP® panel can be used for multi-analyte profiling of COPD sputum using the routinely applied method of sputum processing with DTT. However, researchers need to be aware that the absolute concentrations of selected inflammatory markers can be affected by DTT. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. [Research on optimal modeling strategy for licorice extraction process based on near-infrared spectroscopy technology].

    PubMed

    Wang, Hai-Xia; Suo, Tong-Chuan; Yu, He-Shui; Li, Zheng

    2016-10-01

    The manufacture of traditional Chinese medicine (TCM) products always involves processing complex raw materials and real-time monitoring of the manufacturing process. In this study, we investigated different modeling strategies for the extraction process of licorice. Near-infrared spectra associated with the extraction time were used to determine the states of the extraction processes. Three modeling approaches, i.e., principal component analysis (PCA), partial least squares regression (PLSR) and parallel factor analysis-PLSR (PARAFAC-PLSR), were adopted for the prediction of the real-time status of the process. The overall results indicated that PCA, PLSR and PARAFAC-PLSR can effectively detect errors in the extraction procedure and predict the process trajectories, which is of great significance for the monitoring and control of extraction processes. Copyright© by the Chinese Pharmaceutical Association.
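
    As a sketch of the regression-based strategy, a PLSR model can map spectra to a process-state quantity such as elapsed extraction time; synthetic arrays replace the real licorice NIR data here, and the component count is arbitrary:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# X: spectra collected during extraction; y: a process-state quantity.
# Synthetic data: the "state" depends on a few spectral channels.
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 400))
y = 5 * X[:, 10] + 3 * X[:, 50] + rng.normal(0, 0.5, 120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R^2 on held-out spectra:", pls.score(X_te, y_te))
```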

  20. A Comparative Analysis for Selection of Appropriate Mother Wavelet for Detection of Stationary Disturbances

    NASA Astrophysics Data System (ADS)

    Kamble, Saurabh Prakash; Thawkar, Shashank; Gaikwad, Vinayak G.; Kothari, D. P.

    2017-12-01

    Detection of disturbances is the first step of mitigation. Power electronics plays a crucial role in the modern power system, making system operation efficient, but it also brings stationary disturbances and impurities into the supply. These arise from the non-linear loads used in modern power systems, which inject disturbances such as harmonics, flicker, and sag into the power grid. These impurities can damage equipment, so it is necessary to mitigate them very quickly, and digital signal processing techniques are incorporated for detection. Signal processing techniques such as the fast Fourier transform, short-time Fourier transform, and wavelet transform are widely used for the detection of disturbances. Among these, the wavelet transform is preferred because of its better detection capabilities, but which mother wavelet to use for detection remains an open question. Depending on their periodicity, disturbances are classified as stationary or non-stationary. This paper presents the importance of mother wavelet selection for analyzing stationary disturbances using the discrete wavelet transform. Signals with stationary disturbances of various frequencies are generated using MATLAB. These signals are analyzed using various mother wavelets, such as Daubechies and bi-orthogonal wavelets, and the measured root mean square (RMS) value of the stationary disturbance is obtained. The value measured by the discrete wavelet transform is compared with the exact RMS value of the frequency component, and the percentage differences are presented, which helps to select the optimum mother wavelet.
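
    The comparison can be reproduced in miniature with PyWavelets: decompose a signal containing a disturbance of known RMS and recover that RMS from the detail band covering its frequency (by Parseval's relation, exact only for orthonormal wavelets, which is the point of the comparison). The band bookkeeping below assumes fs = 6400 Hz and a 5-level decomposition, so the 250 Hz harmonic falls in D4:

```python
import numpy as np
import pywt

fs = 6400.0
t = np.arange(0, 0.2, 1 / fs)
# 50 Hz fundamental plus a stationary 250 Hz disturbance of known RMS
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 250 * t)
true_rms = 0.1 / np.sqrt(2)

for wavelet in ("db4", "db8", "bior3.5"):
    # level-5 DWT: coeffs = [A5, D5, D4, D3, D2, D1]; D4 spans 200-400 Hz
    coeffs = pywt.wavedec(signal, wavelet, level=5)
    d4 = coeffs[2]
    rms = np.sqrt(np.sum(d4**2) / len(signal))   # Parseval-based estimate
    print(f"{wavelet}: measured {rms:.4f} vs exact {true_rms:.4f}")
```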

  1. An adaptive confidence limit for periodic non-steady conditions fault detection

    NASA Astrophysics Data System (ADS)

    Wang, Tianzhen; Wu, Hao; Ni, Mengqi; Zhang, Milu; Dong, Jingjing; Benbouzid, Mohamed El Hachemi; Hu, Xiong

    2016-05-01

    System monitoring has become a major concern in batch processes because the failure rate under non-steady conditions is much higher than under steady ones. A series of approaches based on PCA have already solved problems such as data dimensionality reduction, multivariable decorrelation, and the processing of non-changing signals. However, if the data follow a non-Gaussian distribution or the variables contain signal changes, these approaches are not applicable. To address these concerns and to enhance performance in multiperiod data processing, this paper proposes a fault detection method using an adaptive confidence limit (ACL) for periodic non-steady conditions. The proposed ACL method achieves four main enhancements: Longitudinal-Standardization converts non-Gaussian sampling data to Gaussian data; the multiperiod PCA algorithm reduces dimensionality, removes correlation, and improves monitoring accuracy; the adaptive confidence limit detects faults under non-steady conditions; and the fault sections determination procedure selects the appropriate parameter of the adaptive confidence limit. The result analysis clearly shows that the proposed ACL method is superior to other fault detection approaches under periodic non-steady conditions.
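
    For context, the fixed-limit baseline that the ACL method adapts can be sketched as a standard PCA Hotelling T² monitor; the F-distribution limit below is the classical static confidence limit, not the adaptive one proposed in the paper:

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

def t2_monitor(X_train, X_test, n_comp=3, alpha=0.99):
    """PCA Hotelling T^2 monitoring with a static confidence limit.

    Fits PCA on in-control training data, scores the test data, and flags
    samples whose T^2 exceeds the classical F-distribution-based limit.
    """
    pca = PCA(n_components=n_comp).fit(X_train)
    scores = pca.transform(X_test)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    n = X_train.shape[0]
    limit = (n_comp * (n - 1) * (n + 1)) / (n * (n - n_comp)) \
        * stats.f.ppf(alpha, n_comp, n - n_comp)
    return t2, limit, t2 > limit
```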

  2. Employing image processing techniques for cancer detection using microarray images.

    PubMed

    Dehghan Khalilabad, Nastaran; Hassanpour, Hamid

    2017-02-01

    Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and disease detection. The image processing phase performs operations such as refining image rotation, gridding (locating genes) and extracting raw data from the images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, cancerous cells are recognized from the extracted data. To evaluate the performance of the proposed system, microarray data sets for breast cancer, myeloid leukemia and lymphoma from the Stanford Microarray Database are employed. The results indicate that the proposed system is able to identify the type of cancer from the data set with an accuracy of 95.45%, 94.11%, and 100%, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
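
    The gridding step can be illustrated with projection profiles: summing intensity along rows or columns yields peaks at the printed spot positions. A hedged sketch, not the authors' gridding algorithm:

```python
import numpy as np
from scipy.signal import find_peaks

def grid_positions(image, axis=0, min_gap=8):
    """Locate spot rows or columns in a microarray image by projection.

    Summing intensity along one axis gives a profile whose peaks lie on
    the printed grid; the peak positions serve as grid lines for locating
    genes (spots). `min_gap` is the minimum spot spacing in pixels.
    """
    profile = image.sum(axis=axis)
    peaks, _ = find_peaks(profile, distance=min_gap)
    return peaks
```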

  3. Processing Ocean Images to Detect Large Drift Nets

    NASA Technical Reports Server (NTRS)

    Veenstra, Tim

    2009-01-01

    A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose of this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to the shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.

  4. DOA Estimation for Underwater Wideband Weak Targets Based on Coherent Signal Subspace and Compressed Sensing.

    PubMed

    Li, Jun; Lin, Qiu-Hua; Kang, Chun-Yu; Wang, Kai; Yang, Xiu-Ting

    2018-03-18

    Direction of arrival (DOA) estimation is the basis for underwater target localization and tracking using towed line array sonar devices. A method of DOA estimation for underwater wideband weak targets based on coherent signal subspace (CSS) processing and compressed sensing (CS) theory is proposed. Under the CSS processing framework, wideband frequency focusing is accomplished by a two-sided correlation transformation, allowing the DOA of underwater wideband targets to be estimated from the spatial sparsity of the targets using a compressed sensing reconstruction algorithm. Through analysis and processing of simulation data and marine trial data, it is shown that this method can accomplish DOA estimation of underwater wideband weak targets. Results also show that this method can considerably improve the spatial spectrum of weak target signals, enhancing the ability to detect them. It can solve the problems of low directional resolution and unreliable weak-target detection in traditional beamforming technology. Compared with the conventional minimum variance distortionless response (MVDR) beamformer, this method has many advantages, such as higher directional resolution, a wider detection range, fewer required snapshots and more accurate detection of weak targets.
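
    The sparse-reconstruction step can be sketched with orthogonal matching pursuit on an angle grid. This narrowband, single-snapshot toy omits the wideband focusing stage, so it is a stand-in for, not a reproduction of, the CSS + CS pipeline:

```python
import numpy as np

def omp_doa(y, d=0.5, n_src=2, grid=None):
    """Greedy sparse DOA estimation on an angle grid (OMP).

    Columns of A are steering vectors of a uniform line array with element
    spacing d (in wavelengths); sources are picked greedily by correlation
    with the residual, followed by a least-squares update.
    """
    if grid is None:
        grid = np.arange(-90.0, 90.5, 0.5)
    k = np.arange(len(y))[:, None]
    A = np.exp(-2j * np.pi * d * k * np.sin(np.deg2rad(grid))[None, :])
    residual, support = y.astype(complex), []
    for _ in range(n_src):
        support.append(int(np.argmax(np.abs(A.conj().T @ residual))))
        sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ sol
    return sorted(grid[i] for i in support)

# two sources at -20 and 35 degrees observed by a 16-element array
m, rng = 16, np.random.default_rng(4)
k = np.arange(m)
y = sum(np.exp(-2j * np.pi * 0.5 * k * np.sin(np.deg2rad(a)))
        for a in (-20.0, 35.0))
y += 0.3 * (rng.normal(size=m) + 1j * rng.normal(size=m))
print(omp_doa(y))                          # approximately [-20.0, 35.0]
```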

  5. Rotation covariant image processing for biomedical applications.

    PubMed

    Skibbe, Henrik; Reisert, Marco

    2013-01-01

    With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial descriptions to fulfill this demand. This paper proposes a general mathematical framework based on concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences.

  6. Novel Diffusion-Weighted MRI for High-Grade Prostate Cancer Detection

    DTIC Science & Technology

    2016-10-01

    in image resolution and scale. This process is critical for evaluating new imaging modalities. Our initial findings illustrate the potential of the … eligible for analysis as determined by adequate pathologic processing and MR images deemed to be of adequate quality by the study team. The … histology samples have been requested from the UIC biorepository for digitization. All MR images have been collected and prepared for image processing

  7. Immersion lithography defectivity analysis at DUV inspection wavelength

    NASA Astrophysics Data System (ADS)

    Golan, E.; Meshulach, D.; Raccah, N.; Yeo, J. Ho.; Dassa, O.; Brandl, S.; Schwarz, C.; Pierson, B.; Montgomery, W.

    2007-03-01

    Significant effort has been directed in recent years towards the realization of immersion lithography at the 193 nm wavelength. Immersion lithography is likely a key enabling technology for the production of critical layers for 45 nm and 32 nm design rule (DR) devices. In spite of the significant progress in immersion lithography technology, several key technology issues remain, a critical one being immersion lithography process-induced defects. The benefits in optical resolution and depth of focus made possible by immersion lithography are well understood. Yet these benefits cannot come at the expense of increased defect counts and decreased production yield. Understanding the impact of immersion lithography process parameters on wafer defect formation and defect counts, together with the ability to monitor, control and minimize the defect counts down to acceptable levels, is imperative for the successful introduction of immersion lithography in the production of advanced DRs. In this report, we present experimental results of immersion lithography defectivity analysis focused on topcoat layer thickness parameters and resist bake temperatures. Wafers were exposed on the 1150i-α immersion scanner and the 1200B scanner (ASML); defect inspection was performed using a DUV inspection tool (UVision™, Applied Materials). Higher sensitivity was demonstrated at DUV through detection of small defects not detected at the visible wavelength, indicating the potential high-sensitivity benefits of DUV inspection for this layer. The analysis indicates that certain types of defects are associated with different immersion process parameters. This type of analysis at DUV wavelengths enables the optimization of immersion lithography processes, and thus the qualification of immersion processes for volume production.

  8. Seabed-Structure Interaction: Workshop Report and Recommendations for Future Research Held in Metairie, Louisiana on 5-6 November 1991.

    DTIC Science & Technology

    1992-02-01

    Measurements of Sediment Properties and Data Analysis … (A) Object Sensing Methods (Detection/Classification) and (B) Sediment Properties Measurements and Data Analysis. Although important to the understanding of S-SI … characterized by a variety of geological materials, seabed properties, and hydrodynamic processes, the problems of modeling, analysis, and prediction of S-SI

  9. The Objective Identification and Quantification of Interstitial Lung Abnormalities in Smokers.

    PubMed

    Ash, Samuel Y; Harmouche, Rola; Ross, James C; Diaz, Alejandro A; Hunninghake, Gary M; Putman, Rachel K; Onieva, Jorge; Martinez, Fernando J; Choi, Augustine M; Lynch, David A; Hatabu, Hiroto; Rosas, Ivan O; Estepar, Raul San Jose; Washko, George R

    2017-08-01

    Previous investigation suggests that visually detected interstitial changes in the lung parenchyma of smokers are highly clinically relevant and predict outcomes, including death. Visual subjective analysis to detect these changes is time-consuming, insensitive to subtle changes, and requires training to enhance reproducibility. Objective detection of such changes could provide a method of disease identification without these limitations. The goal of this study was to develop and test a fully automated image processing tool to objectively identify radiographic features associated with interstitial abnormalities in the computed tomography scans of a large cohort of smokers. An automated tool that uses local histogram analysis combined with distance from the pleural surface was used to detect radiographic features consistent with interstitial lung abnormalities in computed tomography scans from 2257 individuals from the Genetic Epidemiology of COPD study, a longitudinal observational study of smokers. The sensitivity and specificity of this tool was determined based on its ability to detect the visually identified presence of these abnormalities. The tool had a sensitivity of 87.8% and a specificity of 57.5% for the detection of interstitial lung abnormalities, with a c-statistic of 0.82, and was 100% sensitive and 56.7% specific for the detection of the visual subtype of interstitial abnormalities called fibrotic parenchymal abnormalities, with a c-statistic of 0.89. In smokers, a fully automated image processing tool is able to identify those individuals who have interstitial lung abnormalities with moderate sensitivity and specificity. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  10. Fault Detection and Diagnosis In Hall-Héroult Cells Based on Individual Anode Current Measurements Using Dynamic Kernel PCA

    NASA Astrophysics Data System (ADS)

    Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey

    2018-04-01

    Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
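
    A simplified moving-window kernel PCA monitoring statistic is sketched below; the dynamic (time-lagged) augmentation used in the paper is omitted, and the attribute names follow recent scikit-learn versions:

```python
import numpy as np
from sklearn.decomposition import KernelPCA

def kpca_t2(window, x_new, n_comp=5, gamma=0.1):
    """Moving-window kernel PCA monitoring index (simplified).

    Refits kernel PCA on the most recent in-control window so the model
    tracks the time-varying process, then scores a new sample: the sum of
    squared, variance-scaled kernel principal scores is a T^2-like index
    that can be checked against a control limit.
    """
    kpca = KernelPCA(n_components=n_comp, kernel="rbf", gamma=gamma)
    kpca.fit(window)                       # window: recent normal samples
    score = kpca.transform(x_new[None, :])[0]
    return np.sum(score**2 / kpca.eigenvalues_)
```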

  11. GaiaGrid : Its Implications and Implementation

    NASA Astrophysics Data System (ADS)

    Ansari, S. G.; Lammers, U.; Ter Linden, M.

    2005-12-01

    Gaia is an ESA space mission to determine the positions of 1 billion objects in the Galaxy at micro-arcsecond precision. The data analysis and processing requirements of the mission involve about 20 institutes across Europe, each providing specific algorithms for specific tasks, ranging from relativistic effects on positional determination to classification, astrometric binary star detection, photometric analysis, and spectroscopic analysis. Over the past three years, an initial study has been ongoing to determine the complexity of Gaia's data processing. Two processing categories have materialised: core and shell. While core tasks deal with routine data processing, shell tasks are algorithms for data analysis, which involve the Gaia community at large. For this latter category, we are currently experimenting with the use of Grid paradigms to allow access to the core data and to augment the processing power needed to simulate and analyse the data in preparation for the actual mission. We present preliminary results and discuss the sociological impact of distributing the tasks amongst the community.

  12. Mesocyclones in Central Europe as seen by radar

    NASA Astrophysics Data System (ADS)

    Wapler, Kathrin; Hengstebeck, Thomas; Groenemeijer, Pieter

    2016-02-01

    The occurrence and characteristics of mesocyclones in Central Europe as seen by radar are analysed. A three-year analysis shows an annual and diurnal cycle, with a broader maximum in the late afternoon/evening compared to the diurnal cycle of general thunderstorms. Analysis of F2 tornado events and of over a hundred hail storms shows the characteristics of the corresponding mesocyclones as seen by radar. For all six F2 tornadoes in the three-year period in Germany, a corresponding mesocyclone could be detected in radar data. Furthermore, the analysis reveals that about half of all hail storms in Germany are associated with a mesocyclone detected in radar data within 10 km and 10 min. Some attributes of the mesocyclone, e.g. depth and maximum shear, and of the associated convective cell, e.g. the reflectivity-related parameters VIL, VILD and echotop, have predictive skill for indicating the occurrence of hail. The mesocyclone detection algorithm may support the analysis and nowcasting of severe weather events and thus support the warning process.

  13. High performance liquid chromatography fingerprint analysis for quality control of brotowali (Tinospora crispa)

    NASA Astrophysics Data System (ADS)

    Syarifah, V. B.; Rafi, M.; Wahyuni, W. T.

    2017-05-01

    Brotowali (Tinospora crispa) is widely used in Indonesia as an ingredient of herbal medicine formulations. To ensure the quality, safety, and efficacy of herbal medicine products, their chemical constituents should be continuously evaluated. High performance liquid chromatography (HPLC) fingerprinting is a powerful technique for this quality control process. In this study, an HPLC fingerprint analysis method was developed for the quality control of brotowali. HPLC analysis was performed on a C18 column, and detection was performed using a photodiode array detector. The optimum mobile phase for the brotowali fingerprint was acetonitrile (ACN) and 0.1% formic acid in gradient elution mode at a flow rate of 1 mL/min. The number of peaks detected in the HPLC fingerprint of brotowali was 32 for stems and 23 for leaves. Berberine, used as a marker compound, was detected at a retention time of 20.525 minutes. Evaluation of analytical performance, including precision, reproducibility, and stability, proved that this HPLC fingerprint analysis is reliable and can be applied for the quality control of brotowali.

  14. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

    An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed in the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.

  15. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  16. A service relation model for web-based land cover change detection

    NASA Astrophysics Data System (ADS)

    Xing, Huaqiao; Chen, Jun; Wu, Hao; Zhang, Jun; Li, Songnian; Liu, Boyu

    2017-10-01

    Change detection with remotely sensed imagery is a critical step in land cover monitoring and updating. Although a variety of algorithms and models have been developed, none is universal for all cases. The selection of appropriate algorithms and the construction of processing workflows depend largely on expert knowledge of the "algorithm-data" relations between change detection algorithms and the imagery data used. This paper presents a service relation model for land cover change detection that integrates this expert knowledge about the "algorithm-data" relations into web-based geo-processing. The "algorithm-data" relations are mapped into a set of web service relations through the analysis of functional and non-functional service semantics. These service relations are further classified into three different levels, i.e., the interface, behavior and execution levels. A service relation model is then established using the Object and Relation Diagram (ORD) approach to represent the multi-granularity services and their relations for change detection. A set of semantic matching rules is built and used for deriving on-demand change detection service chains from the service relation model. A web-based prototype system is developed in the .NET development environment, which encapsulates nine change detection and pre-processing algorithms and represents their service relations as an ORD. Three test areas from Shandong and Hebei provinces, China, with different imagery conditions were selected for online change detection experiments, and the results indicate that on-demand service chains can be generated according to different users' demands.

  17. Quantitative, multiplexed workflow for deep analysis of human blood plasma and biomarker discovery by mass spectrometry.

    PubMed

    Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A

    2017-08-01

    Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.

  18. Liver CT image processing: a short introduction of the technical elements.

    PubMed

    Masutani, Y; Uozumi, K; Akahane, Masaaki; Ohtomo, Kuni

    2006-05-01

    In this paper, we describe the technical aspects of image analysis for liver diagnosis and treatment, including the state-of-the-art of liver image analysis and its applications. After discussion on modalities for liver image analysis, various technical elements for liver image analysis such as registration, segmentation, modeling, and computer-assisted detection are covered with examples performed with clinical data sets. Perspective in the imaging technologies is also reviewed and discussed.

  19. Palladium configuration dependence of hydrogen detection sensitivity based on graphene FET for breath analysis

    NASA Astrophysics Data System (ADS)

    Sakamoto, Yuri; Uemura, Kohei; Ikuta, Takashi; Maehashi, Kenzo

    2018-04-01

    We have succeeded in fabricating a hydrogen gas sensor based on palladium-modified graphene field-effect transistors (FETs). The negative-voltage shift in the transfer characteristics was observed with exposure to hydrogen gas, which was explained by the change in work function. The hydrogen concentration dependence of the voltage shift was investigated using graphene FETs with palladium deposited by three different evaporation processes. The results indicate that the hydrogen detection sensitivity of the palladium-modified graphene FETs is strongly dependent on the palladium configuration. Therefore, the palladium-modified graphene FET is a candidate for breath analysis.

  20. Identification, characterization, synthesis and HPLC quantification of new process-related impurities and degradation products in retigabine.

    PubMed

    Douša, Michal; Srbek, Jan; Rádl, Stanislav; Cerný, Josef; Klecán, Ondřej; Havlíček, Jaroslav; Tkadlecová, Marcela; Pekárek, Tomáš; Gibala, Petr; Nováková, Lucie

    2014-06-01

    Two new impurities in retigabine (RET) were described and determined using a gradient HPLC method with UV detection. Using LC-HRMS, NMR and IR analysis, the impurities were identified as RET-dimer I: diethyl {4,4'-diamino-6,6'-bis[(4-fluorobenzyl)amino]biphenyl-3,3'-diyl}biscarbamate and RET-dimer II: ethyl {2-amino-5-[{2-amino-4-[(4-fluorobenzyl)amino]phenyl}(ethoxycarbonyl)amino]-4-[(4-fluorobenzyl)amino]phenyl}carbamate. Reference standards of these impurities were synthesized, followed by semipreparative HPLC purification. The mechanism of formation of these impurities is also discussed. An HPLC method was optimized in order to separate, selectively detect and quantify all process-related impurities and degradation products of RET. The presented method, which was validated in terms of linearity, limit of detection (LOD), limit of quantification (LOQ) and selectivity, is very quick (less than 11 min including re-equilibration time) and therefore highly suitable for routine analysis of RET-related substances as well as stability studies. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Minimal disease detection of B-cell lymphoproliferative disorders by flow cytometry: multidimensional cluster analysis.

    PubMed

    Duque, Ricardo E

    2012-04-01

    Flow cytometric analysis of cell suspensions involves the sequential 'registration' of intrinsic and extrinsic parameters of thousands of cells in list mode files. Thus, it is almost irresistible to describe phenomena in numerical terms or by 'ratios' that have the appearance of 'accuracy' due to the presence of numbers obtained from thousands of cells. The concepts involved in the detection and characterization of B cell lymphoproliferative processes are revisited in this paper by identifying parameters that, when analyzed appropriately, are both necessary and sufficient. The neoplastic process (cluster) can be visualized easily because the parameters that distinguish it form a cluster in multidimensional space that is unique and distinguishable from neighboring clusters that are not of diagnostic interest but serve to provide a background. For B cell neoplasia it is operationally necessary to identify the multidimensional space occupied by a cluster whose kappa:lambda ratio is 100:0 or 0:100. Thus, the concept of kappa:lambda ratio is without meaning and would not detect B cell neoplasia in an unacceptably high number of cases.

  2. Towards a neural circuit model of verbal humor processing: an fMRI study of the neural substrates of incongruity detection and resolution.

    PubMed

    Chan, Yu-Chen; Chou, Tai-Li; Chen, Hsueh-Chih; Yeh, Yu-Chu; Lavallee, Joseph P; Liang, Keng-Chen; Chang, Kuo-En

    2013-02-01

    The present study builds on our previous study within the framework of Wyer and Collin's comprehension-elaboration theory of humor processing. In this study, an attempt is made to segregate the neural substrates of incongruity detection and incongruity resolution during the comprehension of verbal jokes. Although a number of fMRI studies have investigated the incongruity-resolution process, the differential neurological substrates of comprehension are still not fully understood. The present study utilized an event-related fMRI design incorporating three conditions (unfunny, nonsensical and funny) to examine distinct brain regions associated with the detection and resolution of incongruities. Stimuli in the unfunny condition contained no incongruities; stimuli in the nonsensical condition contained irresolvable incongruities; and stimuli in the funny condition contained resolvable incongruities. The results showed that the detection of incongruities was associated with greater activation in the right middle temporal gyrus and right medial frontal gyrus, and the resolution of incongruities with greater activation in the left superior frontal gyrus and left inferior parietal lobule. Further analysis based on participants' rating scores provided converging results. Our findings suggest a three-stage neural circuit model of verbal humor processing: incongruity detection and incongruity resolution during humor comprehension and inducement of the feeling of amusement during humor elaboration. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Hillslope to fluvial process domain transitions in headwater catchments

    NASA Astrophysics Data System (ADS)

    Williams, Karen Mary

    The landscape is partitioned into hillslopes and unchanneled valleys (hollows), and into colluvial (hillslope-controlled) and alluvial (self-formed) channels. The key issue for any study of headwater catchments is the rational distinction between these elements. Process domain transitions from hillslopes to hollows, hollows to colluvial channels, and colluvial to alluvial channels are not obvious either in the field or in topography derived from remotely sensed data such as laser-derived (LIDAR) digital elevation models. The research in this dissertation investigates the spatial arrangement of these landforms and how hillslope and fluvial process domains interact in two pairs of headwater catchments in southwest and central Montana, using LIDAR data. This dissertation uses digital terrain analysis of LIDAR-derived topography and field studies to investigate methods of detection, modeling, and prediction of process transitions from the hillslope to the fluvial domain and, within the fluvial domain, from colluvial to alluvial channel reaches. Inflections in the scaling relationships between landscape parameters such as flowpath length, unit stream power (a metric of the energy expended by the channel in doing work), and drainage area were used to detect transitions in flow regimes characteristic of hillslopes, unchanneled valleys, and channeled landforms. Using the scale-invariant properties of fluvial systems as a threshold condition, magnitude-frequency distributions of curvature and of the derivative of aspect were also used to detect hillslope, fluvial, and transitional process domains. Finally, within the channeled landforms, the transition from colluvial to alluvial channels was detected using the presence or absence of repeating patterns in the power spectra of fluvial energy and channel form parameters. LIDAR-derived scaling relations and magnitude-frequency distributions successfully detected and predicted the locations of mapped channel heads and hollows and the spatial regions of process transitions. Subreaches of arguably alluvial channel conditions were also identified in the power spectra. However, extrinsic forcing limits the ability to detect a clear transition from colluvial to fully alluvial conditions. Headwater catchments present a mosaic of process domains, in large part determined by local structure and lithology. Nonetheless, process domain transitions appear detectable and statistically, though not deterministically, predictable, irrespective of setting.

  4. Adsorption detection for polylysine biomolecules based on high-Q silica capillary whispering gallery mode microresonator

    NASA Astrophysics Data System (ADS)

    Wu, Jixuan; Liu, Bo; Zhang, Hao; Song, Binbin

    2017-11-01

    A silica-capillary-based whispering gallery mode (WGM) microresonator has been proposed and experimentally demonstrated for real-time monitoring of the polylysine adsorption process. The spectral characteristics of the WGM resonance dips, with their high quality factor and good wavelength selectivity, were investigated to evaluate the dynamics of polylysine binding to the capillary surface. The WGM transmission spectrum shows a regular shift as observation time increases, which can be exploited to analyse the polylysine adsorption process. The proposed WGM microresonator system possesses desirable qualities such as high sensitivity, fast response, label-free operation, high detection resolution and compactness, and could find promising applications in histology and related bioengineering areas.

  5. FIND: difFerential chromatin INteractions Detection using a spatial Poisson process.

    PubMed

    Djekidel, Mohamed Nadhir; Chen, Yang; Zhang, Michael Q

    2018-02-12

    Polymer-based simulations and experimental studies indicate the existence of a spatial dependency between the adjacent DNA fibers involved in the formation of chromatin loops. However, the existing strategies for detecting differential chromatin interactions assume that the interacting segments are spatially independent from the other segments nearby. To resolve this issue, we developed a new computational method, FIND, which considers the local spatial dependency between interacting loci. FIND uses a spatial Poisson process to detect differential chromatin interactions that show a significant difference in their interaction frequency and the interaction frequency of their neighbors. Simulation and biological data analysis show that FIND outperforms the widely used count-based methods and has a better signal-to-noise ratio. © 2018 Djekidel et al.; Published by Cold Spring Harbor Laboratory Press.

  6. SkinScan©: A PORTABLE LIBRARY FOR MELANOMA DETECTION ON HANDHELD DEVICES

    PubMed Central

    Wadhawan, Tarun; Situ, Ning; Lancaster, Keith; Yuan, Xiaojing; Zouridakis, George

    2011-01-01

    We have developed a portable library for automated detection of melanoma termed SkinScan© that can be used on smartphones and other handheld devices. Compared to desktop computers, embedded processors have limited processing speed, memory, and power, but they have the advantage of portability and low cost. In this study we explored the feasibility of running a sophisticated application for automated skin cancer detection on an Apple iPhone 4. Our results demonstrate that the proposed library with the advanced image processing and analysis algorithms has excellent performance on handheld and desktop computers. Therefore, deployment of smartphones as screening devices for skin cancer and other skin diseases can have a significant impact on health care delivery in underserved and remote areas. PMID:21892382

  7. Development of a variable structure-based fault detection and diagnosis strategy applied to an electromechanical system

    NASA Astrophysics Data System (ADS)

    Gadsden, S. Andrew; Kirubarajan, T.

    2017-05-01

    Signal processing techniques are prevalent in a wide range of fields: control, target tracking, telecommunications, robotics, fault detection and diagnosis, and even stock market analysis, to name a few. Although first introduced in the 1950s, the most popular method for signal processing and state estimation remains the Kalman filter (KF). The KF offers an optimal solution to the estimation problem under strict assumptions. Since then, a number of other estimation strategies and filters have been introduced to overcome robustness issues, such as the smooth variable structure filter (SVSF). In this paper, properties of the SVSF are explored in an effort to detect and diagnose faults in an electromechanical system. The results are compared with the KF method, and future work is discussed.
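
    For contrast with the SVSF, the innovation test of a plain Kalman filter is simple to sketch; the scalar random-walk model, noise values, and injected bias below are illustrative only:

```python
import numpy as np

def kf_fault_monitor(zs, q=1e-4, r=0.04, gate=4.0):
    """Scalar Kalman filter with innovation-based fault detection.

    A measurement is flagged when its innovation exceeds `gate` standard
    deviations of the predicted innovation covariance -- a common
    model-based fault indicator.
    """
    x, p, flags = 0.0, 1.0, []
    for z in zs:
        p += q                             # predict (random-walk model)
        s = p + r                          # innovation covariance
        nu = z - x                         # innovation
        flags.append(abs(nu) > gate * np.sqrt(s))
        k = p / s                          # Kalman gain
        x += k * nu                        # state update
        p *= 1 - k                         # covariance update
    return np.array(flags)

rng = np.random.default_rng(5)
z = rng.normal(0.0, 0.2, 500)
z[300:] += 1.5                             # injected sensor bias fault
print("first flagged sample:", int(np.argmax(kf_fault_monitor(z))))
```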

  8. Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines

    PubMed Central

    Mikut, Ralf

    2017-01-01

    Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. PMID:29095927

  9. Application of PCR-LDR-nucleic acid detection strip in detection of YMDD mutation in hepatitis B patients treated with lamivudine.

    PubMed

    Xu, Gaolian; You, Qimin; Pickerill, Sam; Zhong, Huayan; Wang, Hongying; Shi, Jian; Luo, Ying; You, Paul; Kong, Huimin; Lu, Fengmin; Hu, Lin

    2010-07-01

    Chronic hepatitis B virus (CHBV) infection causes cirrhosis and hepatocellular carcinoma. Lamivudine (LAM) has been successfully used to treat CHBV infections but prolonged use leads to the emergence of drug-resistant variants. This is primarily linked to a mutation in the tyrosine-methionine-aspartate-aspartate (YMDD) motif of the HBV polymerase gene at position 204. Rapid diagnosis of drug-resistant HBV is necessary for a prompt treatment response. Common diagnostic methods such as sequencing and restriction fragment length polymorphism (RFLP) analysis lack sensitivity and require significant processing. The aim of this study was to demonstrate the usefulness of a novel diagnostic method that combines polymerase chain reaction (PCR), ligase detection reaction (LDR) and a nucleic acid detection strip (NADS) in detecting site-specific mutations related to HBV LAM resistance. We compared this method (PLNA) to direct sequencing and RFLP analysis in 50 clinical samples from HBV infected patients. There was 90% concordance between all three results. PLNA detected more samples containing mutant variants than both sequencing and RFLP analysis and was more sensitive in detecting mixed variant populations. Plasmid standards indicated that the sensitivity of PLNA is at or below 3,000 copies per ml and that it can detect a minor variant at 5% of the total viral population. This warrants its further development and suggests that the PLNA method could be a useful tool in detecting LAM resistance. (c) 2010 Wiley-Liss, Inc.

  10. SCADA alarms processing for wind turbine component failure detection

    NASA Astrophysics Data System (ADS)

    Gonzalez, E.; Reder, M.; Melero, J. J.

    2016-09-01

    Wind turbine failure and downtime can often compromise the profitability of a wind farm due to their high impact on the operation and maintenance (O&M) costs. Early detection of failures can facilitate the changeover from corrective maintenance towards a predictive approach. This paper presents a cost-effective methodology to combine various alarm analysis techniques, using data from the Supervisory Control and Data Acquisition (SCADA) system, in order to detect component failures. The approach categorises the alarms according to a reviewed taxonomy, turning overwhelming data into valuable information to assess component status. Then, different alarm analysis techniques are applied for two purposes: the evaluation of the SCADA alarm system capability to detect failures, and the investigation of how faults in one component are followed by failures in others. Various case studies are presented and discussed. The study highlights the relationship between faulty behaviour in different components and between failures and adverse environmental conditions.
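    The categorisation step can be sketched with a toy example. The taxonomy mapping and alarm codes below are invented for illustration, not taken from the paper; the point is only how a raw SCADA alarm stream can be aggregated into per-component fault counts for status assessment.

        from collections import Counter

        # Hypothetical taxonomy: alarm code -> (component, alarm kind).
        TAXONOMY = {
            101: ("gearbox", "fault"), 102: ("gearbox", "warning"),
            201: ("pitch system", "fault"), 301: ("generator", "communication"),
        }

        def categorize(alarm_log):
            """Aggregate a raw alarm stream into per-component fault counts."""
            counts = Counter()
            for timestamp, code in alarm_log:
                component, kind = TAXONOMY.get(code, ("unknown", "other"))
                if kind == "fault":
                    counts[component] += 1
            return counts

        log = [("2016-01-03 04:12", 101), ("2016-01-03 04:15", 101),
               ("2016-01-04 09:01", 201), ("2016-01-05 17:40", 301)]
        print(categorize(log))   # Counter({'gearbox': 2, 'pitch system': 1})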

  11. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes based on the average temporal (foreshock-aftershock) relationship of child events to parents.
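    The detection step can be sketched for a single channel; the paper stacks this correlation over multiple channels, which is omitted here. The template shape, noise level and threshold are illustrative only.

        import numpy as np

        def normalized_xcorr(template, data):
            """Sliding normalized cross-correlation of a template against continuous data."""
            t = (template - template.mean()) / (template.std() * len(template))
            n = len(template)
            cc = np.empty(len(data) - n + 1)
            for i in range(len(cc)):
                w = data[i:i + n]
                s = w.std()
                cc[i] = 0.0 if s == 0 else np.sum(t * (w - w.mean())) / s
            return cc

        rng = np.random.default_rng(0)
        parent = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)  # template ('parent')
        data = rng.normal(0, 0.3, 5000)
        data[1200:1400] += parent                                          # hidden 'child' event
        cc = normalized_xcorr(parent, data)
        detections = np.flatnonzero(cc > 0.6)          # threshold chosen for illustration
        print("candidate onsets:", detections[:5], "peak cc:", cc.max().round(2))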

  12. Theory and Application of Magnetic Flux Leakage Pipeline Detection.

    PubMed

    Shi, Yan; Zhang, Chao; Li, Rui; Cai, Maolin; Jia, Guanwei

    2015-12-10

    Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetic sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted.

  13. Theory and Application of Magnetic Flux Leakage Pipeline Detection

    PubMed Central

    Shi, Yan; Zhang, Chao; Li, Rui; Cai, Maolin; Jia, Guanwei

    2015-01-01

    Magnetic flux leakage (MFL) detection is one of the most popular methods of pipeline inspection. It is a nondestructive testing technique which uses magnetic sensitive sensors to detect the magnetic leakage field of defects on both the internal and external surfaces of pipelines. This paper introduces the main principles, measurement and processing of MFL data. As the key point of a quantitative analysis of MFL detection, the identification of the leakage magnetic signal is also discussed. In addition, the advantages and disadvantages of different identification methods are analyzed. Then the paper briefly introduces the expert systems used. At the end of this paper, future developments in pipeline MFL detection are predicted. PMID:26690435

  14. Patient Safety in Complementary Medicine through the Application of Clinical Risk Management in the Public Health System

    PubMed Central

    Rossi, Elio G.; Picchi, Marco; Baccetti, Sonia; Monechi, Maria Valeria; Vuono, Catia; Sabatini, Federica; Traversi, Antonella; Di Stefano, Mariella; Firenzuoli, Fabio; Albolino, Sara; Tartaglia, Riccardo

    2017-01-01

    Aim: To develop a systematic approach to detect and prevent clinical risks in complementary medicine (CM) and increase patient safety through the analysis of activities in homeopathy and acupuncture centres in the Tuscan region using a significant event audit (SEA) and failure modes and effects analysis (FMEA). Methods: SEA is the selected tool for studying adverse events (AE) and detecting the best solutions to prevent future incidents in our Regional Healthcare Service (RHS). This requires the active participation of all the actors and external experts to validate the analysis. FMEA is a proactive risk assessment tool involving the selection of the clinical process, the input of a multidisciplinary group of experts, description of the process, identification of the failure modes (FMs) for each step, estimates of the frequency, severity, and detectability of FMs, calculation of the risk priority number (RPN), and prioritized improvement actions to prevent FMs. Results: In homeopathy, the greatest risk depends on the decision to switch from allopathic to homeopathic therapy. In acupuncture, major problems can arise, mainly from delayed treatment and from the modalities of needle insertion. Conclusions: The combination of SEA and FMEA can reveal potential risks for patients and suggest actions for safer and more reliable services in CM. PMID:29258191

  15. Patient Safety in Complementary Medicine through the Application of Clinical Risk Management in the Public Health System.

    PubMed

    Rossi, Elio G; Bellandi, Tommaso; Picchi, Marco; Baccetti, Sonia; Monechi, Maria Valeria; Vuono, Catia; Sabatini, Federica; Traversi, Antonella; Di Stefano, Mariella; Firenzuoli, Fabio; Albolino, Sara; Tartaglia, Riccardo

    2017-12-16

    Aim: To develop a systematic approach to detect and prevent clinical risks in complementary medicine (CM) and increase patient safety through the analysis of activities in homeopathy and acupuncture centres in the Tuscan region using a significant event audit (SEA) and failure modes and effects analysis (FMEA). Methods: SEA is the selected tool for studying adverse events (AE) and detecting the best solutions to prevent future incidents in our Regional Healthcare Service (RHS). This requires the active participation of all the actors and external experts to validate the analysis. FMEA is a proactive risk assessment tool involving the selection of the clinical process, the input of a multidisciplinary group of experts, description of the process, identification of the failure modes (FMs) for each step, estimates of the frequency, severity, and detectability of FMs, calculation of the risk priority number (RPN), and prioritized improvement actions to prevent FMs. Results: In homeopathy, the greatest risk depends on the decision to switch from allopathic to homeopathic therapy. In acupuncture, major problems can arise, mainly from delayed treatment and from the modalities of needle insertion. Conclusions: The combination of SEA and FMEA can reveal potential risks for patients and suggest actions for safer and more reliable services in CM.

  16. Document reconstruction by layout analysis of snippets

    NASA Astrophysics Data System (ADS)

    Kleber, Florian; Diem, Markus; Sablatnig, Robert

    2010-02-01

    Document analysis is done to analyze entire forms (e.g. intelligent form analysis, table detection) or to describe the layout/structure of a document. Also skew detection of scanned documents is performed to support OCR algorithms that are sensitive to skew. In this paper document analysis is applied to snippets of torn documents to calculate features for the reconstruction. Documents can be destroyed either intentionally, to make the printed content unavailable (e.g. tax fraud investigation, business crime), or through time-induced degeneration of ancient documents (e.g. bad storage conditions). Current reconstruction methods for manually torn documents deal with the shape, inpainting and texture synthesis techniques. In this paper, the possibility of using document analysis techniques on snippets to support the matching algorithm with additional features is shown. This implies a rotational analysis, a color analysis and a line detection. As future work, it is planned to extend the feature set with the paper type (blank, checked, lined), the type of the writing (handwritten vs. machine printed) and the text layout of a snippet (text size, line spacing). Preliminary results show that these pre-processing steps can be performed reliably on a real dataset consisting of 690 snippets.

  17. Sources of Infrasound events listed in IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick

    2017-04-01

    Until 2003 two waveform technologies, i.e. seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. In the beginning of 2010 the infrasound technology was reintroduced to the IDC operations and has contributed to both automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which will significantly improve event location. Example sources of REB events which were detected by the International Monitoring System (IMS) infrasound network were fireballs (e.g. Bangkok fireball, 2015), volcanic eruptions (e.g. Calbuco, Chile 2015) and large surface explosions (e.g. Tianjin, China 2015). Quarry blasts (e.g. Zheleznogorsk) and large earthquakes (e.g. Italy 2016) belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations to quarry blast events may help to obtain a better source location. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events which were missed in the automatic process. Open source materials may help to identify the nature of some events. Well recorded examples may be added to the Reference Infrasound Event Database to help in the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.

  18. Development and validation of an universal interface for compound-specific stable isotope analysis of chlorine (37Cl/35Cl) by GC-high-temperature conversion (HTC)-MS/IRMS.

    PubMed

    Renpenning, Julian; Hitzfeld, Kristina L; Gilevska, Tetyana; Nijenhuis, Ivonne; Gehre, Matthias; Richnow, Hans-Hermann

    2015-03-03

    Universal application of compound-specific isotope analysis of chlorine has thus far been limited by the availability of suitable analysis techniques. In this study, gas chromatography in combination with a high-temperature conversion interface (GC-HTC), converting organic chlorine in the presence of H2 to gaseous HCl, was coupled to a dual-detection system, combining an ion trap mass spectrometer (MS) and an isotope-ratio mass spectrometer (IRMS). The combination of the MS/IRMS detection enabled a detailed characterization, optimization, and online monitoring of the high-temperature conversion process via ion trap MS as well as a simultaneous chlorine isotope analysis by the IRMS. Using GC-HTC-MS/IRMS, chlorine isotope analysis at optimized conversion conditions resulted in very accurate isotope values (δ(37)Cl(SMOC)) for measured reference material with known isotope composition, including chlorinated ethylene, chloromethane, hexachlorocyclohexane, and trichloroacetic acid methyl ester. Respective detection limits were determined to be <15 nmol Cl on column with achieved precision of <0.3‰.
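    The reported δ(37)Cl(SMOC) values follow the standard delta notation, delta = (R_sample / R_standard - 1) x 1000 with R = 37Cl/35Cl. A minimal sketch, assuming the commonly cited 37Cl/35Cl reference ratio of about 0.319627 for Standard Mean Ocean Chloride (SMOC):

        def delta_37cl_permil(r_sample, r_standard=0.319627):
            """delta 37Cl in per mil relative to a reference ratio; the default is
            a commonly cited 37Cl/35Cl value for Standard Mean Ocean Chloride."""
            return (r_sample / r_standard - 1.0) * 1000.0

        # A sample whose 37Cl/35Cl ratio is 0.05% above SMOC reads as +0.5 per mil.
        print(round(delta_37cl_permil(0.319627 * 1.0005), 3))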

  19. Identifying and tracking dynamic processes in social networks

    NASA Astrophysics Data System (ADS)

    Chung, Wayne; Savell, Robert; Schütt, Jan-Peter; Cybenko, George

    2006-05-01

    The detection and tracking of embedded malicious subnets in an active social network can be computationally daunting due to the quantity of transactional data generated in the natural interaction of large numbers of actors comprising a network. In addition, detection of illicit behavior may be further complicated by evasive strategies designed to camouflage the activities of the covert subnet. In this work, we move beyond traditional static methods of social network analysis to develop a set of dynamic process models which encode various modes of behavior in active social networks. These models will serve as the basis for a new application of the Process Query System (PQS) to the identification and tracking of covert dynamic processes in social networks. We present a preliminary result from application of our technique in a real-world data stream: the Enron email corpus.

  20. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  1. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  2. [The error analysis and experimental verification of laser radar spectrum detection and terahertz time domain spectroscopy].

    PubMed

    Liu, Wen-Tao; Li, Jing-Wen; Sun, Zhi-Hui

    2010-03-01

    Terahertz waves (THz, T-ray) lie between the far-infrared and microwave regions of the electromagnetic spectrum, with frequencies from 0.1 to 10 THz. Many chemical agents and explosives show characteristic spectral features in the terahertz range. Compared with conventional methods of detecting a variety of threats, such as weapons and chemical agents, THz radiation is low frequency and non-ionizing, and does not give rise to safety concerns. The present paper summarizes the latest progress in the application of terahertz time domain spectroscopy (THz-TDS) to chemical agents and explosives. A device for laser radar detection and real-time spectrum measurement was designed, which measures the laser spectrum on the basis of Fourier optics and optical signal processing. A wedge interferometer was used as the beam splitter to remove the background light, detect the laser and measure the spectrum. The results indicate that a 10 ns laser radar pulse can be detected; many factors affecting the experiments are also discussed. The combination of laser radar spectrum detection, THz-TDS, modern pattern recognition and signal processing technology is the developing trend of remote detection for chemical agents and explosives.

  3. Coliphages as indicators of enteroviruses.

    PubMed Central

    Stetler, R E

    1984-01-01

    Coliphages were monitored in conjunction with indicator bacteria and enteroviruses in a drinking-water plant modified to reduce trihalomethane production. Coliphages could be detected in the source water by direct inoculation, and sufficient coliphages were detected in enterovirus concentrates to permit following the coliphage levels through different water treatment processes. The recovery efficiency by different filter types ranged from 1 to 53%. Statistical analysis of the data indicated that enterovirus isolates were better correlated with coliphages than with total coliforms, fecal coliforms, fecal streptococci, or standard plate count organisms. Coliphages were not detected in finished water. PMID:6093694

  4. Bayesian Inference for Signal-Based Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.

    2015-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http

  5. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  6. Tribology symposium 1995. PD-Volume 72

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masudi, H.

    After the keynote presentation by Professor Aaron Cohen of Texas A and M University, entitled Processes Used in Design, the program is divided into five major sessions: Research and Development -- Recent research and development of tribological components; Tribology in Manufacturing -- The impact of tribology on modern manufacturing; Design/Design Representation -- Aspects of design related to tribological systems; Tribo-Chemistry/Tribo-Physics -- Discussion of chemical and physical behavior of substances as related to tribology; and Failure Analysis -- An analysis of failure, failure detection, and failure monitoring as related to manufacturing processes. Papers have been processed separately for inclusion in the database.

  7. Evaluation of lidar-derived DEMs through terrain analysis and field comparison

    Treesearch

    Cody P. Gillin; Scott W. Bailey; Kevin J. McGuire; Stephen P. Prisley

    2015-01-01

    Topographic analysis of watershed-scale soil and hydrological processes using digital elevation models (DEMs) is commonplace, but most studies have used DEMs of 10 m resolution or coarser. Availability of higher-resolution DEMs created from light detection and ranging (lidar) data is increasing but their suitability for such applications has received little critical...

  8. The Problem of Sample Contamination in a Fluvial Geochemistry Research Experience for Undergraduates.

    ERIC Educational Resources Information Center

    Andersen, Charles B.

    2001-01-01

    Introduces the analysis of a river as an excellent way to teach geochemical techniques because of the relative ease of sample collection and speed of sample analysis. Focuses on the potential sources of sample contamination during sampling, filtering, and bottle cleaning processes, and reviews methods to reduce and detect contamination. Includes…

  9. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
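    The listed steps can be illustrated with a generic sketch: time-delay embedding of the digitized data followed by a simple nonlinear measure whose trend can be compared between states. This is not the patented method; the embedding parameters, the correlation-sum radius and the test signal are all illustrative.

        import numpy as np

        def delay_embed(x, dim=3, tau=5):
            """Time-delay embedding, the standard first step of chaotic
            time-series analysis."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

        def correlation_sum(emb, r):
            """Fraction of state-space point pairs closer than r: a simple
            nonlinear measure whose time-serial trend can be monitored."""
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            pairs = d[np.triu_indices(len(emb), k=1)]
            return np.mean(pairs < r)

        t = np.arange(2000) * 0.05
        x = np.sin(t) + 0.3 * np.sin(2.1 * t)          # stand-in for monitored data
        print(round(correlation_sum(delay_embed(x)[:400], r=0.5), 3))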

  10. A quantitative analysis of spectral mechanisms involved in auditory detection of coloration by a single wall reflection.

    PubMed

    Buchholz, Jörg M

    2011-07-01

    Coloration detection thresholds (CDTs) were measured for a single reflection as a function of spectral content and reflection delay for diotic stimulus presentation. The direct sound was a 320-ms long burst of bandpass-filtered noise with varying lower and upper cut-off frequencies. The resulting threshold data revealed that: (1) sensitivity decreases with decreasing bandwidth and increasing reflection delay and (2) high-frequency components contribute less to detection than low-frequency components. The auditory processes that may be involved in coloration detection (CD) are discussed in terms of a spectrum-based auditory model, which is conceptually similar to the pattern-transformation model of pitch (Wightman, 1973). Hence, the model derives an auto-correlation function of the input stimulus by applying a frequency analysis to an auditory representation of the power spectrum. It was found that, to successfully describe the quantitative behavior of the CDT data, three important mechanisms need to be included: (1) auditory bandpass filters with a narrower bandwidth than classic Gammatone filters, the increase in spectral resolution was here linked to cochlear suppression, (2) a spectral contrast enhancement process that reflects neural inhibition mechanisms, and (3) integration of information across auditory frequency bands. Copyright © 2011 Elsevier B.V. All rights reserved.
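    The model's key step, deriving an autocorrelation function from a frequency analysis of the power spectrum, follows the Wiener-Khinchin relation: the autocorrelation is the inverse transform of the power spectrum, so the spectral ripple caused by a reflection maps to an ACF peak at the reflection delay. A minimal sketch (delay, reflection gain and sample rate chosen arbitrarily; the auditory filterbank, contrast enhancement and across-band integration stages are not modelled):

        import numpy as np

        fs = 16000                                   # sample rate (Hz)
        rng = np.random.default_rng(1)
        direct = rng.normal(size=fs // 2)            # 0.5 s noise burst
        delay = int(0.004 * fs)                      # single reflection at 4 ms
        signal = direct.copy()
        signal[delay:] += 0.7 * direct[:-delay]      # add the attenuated reflection

        # Wiener-Khinchin: autocorrelation = inverse transform of the power spectrum.
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        acf = np.fft.irfft(spectrum)
        acf /= acf[0]
        print(f"ACF peak lag: {np.argmax(acf[1:200]) + 1} samples "
              f"(expected {delay}), value={acf[delay]:.2f}")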

  11. A near-infrared fluorescence-based surgical navigation system imaging software for sentinel lymph node detection

    NASA Astrophysics Data System (ADS)

    Ye, Jinzuo; Chi, Chongwei; Zhang, Shuang; Ma, Xibo; Tian, Jie

    2014-02-01

    In vivo detection of the sentinel lymph node (SLN) is vital in breast cancer surgery. New imaging software for a near-infrared fluorescence-based surgical navigation system (SNS), developed by our research group, is presented for SLN detection surgery in this paper. The software is based on the fluorescence-based surgical navigation hardware system (SNHS) developed in our lab and is designed specifically for intraoperative imaging and postoperative data analysis. The surgical navigation imaging software consists of the following modules: the control module, the image grabbing module, the real-time display module, the data saving module and the image processing module. Several algorithms have been designed to achieve the performance of the software, for example an image registration algorithm based on correlation matching. Some of the key features of the software include: setting the control parameters of the SNS; acquiring, displaying and storing the intraoperative imaging data automatically in real time; and analysis and processing of the saved image data. The developed software has been used to successfully detect the SLNs in 21 breast cancer patients. In the near future, we plan to improve the software performance, and it will be used extensively for clinical purposes.

  12. Fine-Scale Event Location and Error Analysis in NET-VISA

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2016-12-01

    NET-VISA is a generative probabilistic model for the occurrence of seismic, hydro, and atmospheric events, and the propagation of energy from these events through various mediums and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and mis-detections when forming events, and this allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and in each year it achieves a roughly 60% reduction in the number of missed events without increasing the false event rate as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.

  13. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery.

    PubMed

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W; Chen, Zhuo Georgia; Fei, Baowei

    2015-01-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. But the data from HSI often needs to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which includes image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.

  14. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    NASA Astrophysics Data System (ADS)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. But the data from HSI often needs to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which includes image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.
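    Two of the features named above, normalized reflectance and spectral derivatives, can be sketched per pixel as below. This is a simplified stand-in for the authors' framework; the white/dark reference handling, cube shape and band count are assumptions for illustration.

        import numpy as np

        def spectral_features(cube, white_ref, dark_ref):
            """Per-pixel features from a hyperspectral cube (rows, cols, bands):
            normalized reflectance plus its first derivative along wavelength."""
            # Flat-field style normalization against white/dark reference spectra.
            reflectance = (cube - dark_ref) / np.maximum(white_ref - dark_ref, 1e-9)
            derivative = np.gradient(reflectance, axis=-1)   # slope per band
            return np.concatenate([reflectance, derivative], axis=-1)

        rng = np.random.default_rng(0)
        cube = rng.uniform(0.1, 0.9, size=(4, 4, 91))        # toy cube, 450-900 nm in 5 nm steps
        features = spectral_features(cube, np.ones(91), np.zeros(91))
        print(features.shape)                                # (4, 4, 182): ready for a classifier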

  15. Knowledge Reasoning with Semantic Data for Real-Time Data Processing in Smart Factory

    PubMed Central

    Wang, Shiyong; Li, Di; Liu, Chengliang

    2018-01-01

    The application of high-bandwidth networks and cloud computing in manufacturing systems will be accompanied by massive amounts of data. Industrial data analysis plays important roles in condition monitoring, performance optimization, flexibility, and transparency of the manufacturing system. However, existing architectures are mainly designed for offline data analysis and are not suitable for real-time data processing. In this paper, we first define the smart factory as a cloud-assisted and self-organized manufacturing system in which physical entities such as machines, conveyors, and products organize production through intelligent negotiation and the cloud supervises this self-organized process for fault detection and troubleshooting based on data analysis. Then, we propose a scheme to integrate knowledge reasoning and semantic data where the reasoning engine processes the ontology model with real-time semantic data coming from the production process. Based on these ideas, we build a benchmarking system for a smart candy packing application that supports direct consumer customization and flexible hybrid production, and the data are collected and processed in real time for fault diagnosis and statistical analysis. PMID:29415444

  16. Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steve

    2011-01-01

    This slide presentation reviews the use of "soft computing" in image analysis; soft computing differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation. Soft computing provides flexible information processing to handle real-life ambiguous situations and achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. NASA applications that are reviewed are: Real-Time (RT) Anomaly Detection, Real-Time (RT) Moving Debris Detection and the Columbia Investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection and X-ray image enhancement.

  17. Detection of DNA Sequences Refractory to PCR Amplification Using a Biophysical SERRS Assay (Surface Enhanced Resonant Raman Spectroscopy)

    PubMed Central

    Feuillie, Cécile; Merheb, Maxime M.; Gillet, Benjamin; Montagnac, Gilles; Daniel, Isabelle; Hänni, Catherine

    2014-01-01

    The analysis of ancient or processed DNA samples is often a great challenge, because traditional Polymerase Chain Reaction (PCR)-based amplification is impeded by DNA damage. Blocking lesions such as abasic sites are known to block the bypass of DNA polymerases, thus stopping primer elongation. In the present work, we applied the SERRS-hybridization assay, a fully non-enzymatic method, to the detection of DNA refractory to PCR amplification. This method combines specific hybridization with detection by Surface Enhanced Resonant Raman Scattering (SERRS). It allows the detection of a series of double-stranded DNA molecules containing a varying number of abasic sites on both strands, when PCR failed to detect the most degraded sequences. Our SERRS approach can quickly detect DNA molecules without any need for DNA repair. This assay could be applied as a prerequisite analysis prior to enzymatic repair or amplification. A whole new set of samples, both forensic and archaeological, could then deliver information that was not yet available due to a high degree of DNA damage. PMID:25502338

  18. Detection of DNA sequences refractory to PCR amplification using a biophysical SERRS assay (Surface Enhanced Resonant Raman Spectroscopy).

    PubMed

    Feuillie, Cécile; Merheb, Maxime M; Gillet, Benjamin; Montagnac, Gilles; Daniel, Isabelle; Hänni, Catherine

    2014-01-01

    The analysis of ancient or processed DNA samples is often a great challenge, because traditional Polymerase Chain Reaction (PCR)-based amplification is impeded by DNA damage. Blocking lesions such as abasic sites are known to block the bypass of DNA polymerases, thus stopping primer elongation. In the present work, we applied the SERRS-hybridization assay, a fully non-enzymatic method, to the detection of DNA refractory to PCR amplification. This method combines specific hybridization with detection by Surface Enhanced Resonant Raman Scattering (SERRS). It allows the detection of a series of double-stranded DNA molecules containing a varying number of abasic sites on both strands, when PCR failed to detect the most degraded sequences. Our SERRS approach can quickly detect DNA molecules without any need for DNA repair. This assay could be applied as a prerequisite analysis prior to enzymatic repair or amplification. A whole new set of samples, both forensic and archaeological, could then deliver information that was not yet available due to a high degree of DNA damage.

  19. Quantitative Detection of Cracks in Steel Using Eddy Current Pulsed Thermography.

    PubMed

    Shi, Zhanqun; Xu, Xiaoyu; Ma, Jiaojiao; Zhen, Dong; Zhang, Hao

    2018-04-02

    Small cracks are common defects in steel and often lead to catastrophic accidents in industrial applications. Various nondestructive testing methods have been investigated for crack detection; however, most current methods focus on qualitative crack identification and image processing. In this study, eddy current pulsed thermography (ECPT) was applied for quantitative crack detection based on derivative analysis of temperature variation. The effects of the excitation parameters on the temperature variation were analyzed in the simulation study. The crack profile and position are identified in the thermal image based on the Canny edge detection algorithm. Then, one or more trajectories are drawn through the crack profile in order to determine the crack boundary from its temperature distribution. The slope curve along the trajectory is obtained. Finally, quantitative analysis of the crack sizes was performed by analyzing the features of the slope curves. The experimental verification showed that the crack sizes could be quantitatively detected with errors of less than 1%. Therefore, the proposed ECPT method was demonstrated to be a feasible and effective nondestructive approach for quantitative crack detection.
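    The edge-detection and slope-curve steps can be approximated with a short sketch using OpenCV's Canny implementation on a synthetic frame. The image, thresholds and trajectory below are invented for illustration and do not reproduce the authors' quantitative sizing procedure.

        import numpy as np
        import cv2  # OpenCV; pip install opencv-python

        # Synthetic 'thermal image': warm plate with a cooler vertical crack near column 64.
        frame = np.full((128, 128), 180, dtype=np.uint8)
        frame[:, 62:67] = 90
        frame = cv2.GaussianBlur(frame, (7, 7), 0)

        edges = cv2.Canny(frame, 50, 150)            # crack profile/position
        ys, xs = np.nonzero(edges)
        print("edge columns found near the crack:", np.unique(xs))

        # One trajectory across the crack: the temperature slope curve along a row.
        trajectory = frame[64, :].astype(float)
        slope = np.gradient(trajectory)
        print("steepest gradients at columns:", np.argsort(np.abs(slope))[-2:])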

  20. RIPPLELAB: A Comprehensive Application for the Detection, Analysis and Classification of High Frequency Oscillations in Electroencephalographic Signals

    PubMed Central

    Alvarado-Rojas, Catalina; Le Van Quyen, Michel; Valderrama, Mario

    2016-01-01

    High Frequency Oscillations (HFOs) in the brain have been associated with different physiological and pathological processes. In epilepsy, HFOs might reflect a mechanism of epileptic phenomena, serving as a biomarker of epileptogenesis and epileptogenicity. Despite the valuable information provided by HFOs, their correct identification is a challenging task. A comprehensive application, RIPPLELAB, was developed to facilitate the analysis of HFOs. RIPPLELAB provides a wide range of tools for manual and automatic HFO detection and visual validation; all of them are accessible from an intuitive graphical user interface. Four methods for automated detection, as well as several options for visualization and validation of detected events, were implemented and integrated in the application. Analysis of multiple files and channels is possible, and new options can be added by users. All features and capabilities implemented in RIPPLELAB for automatic detection were tested through the analysis of simulated signals and intracranial EEG recordings from epileptic patients (n = 16; 3,471 analyzed hours). Visual validation was also tested, and detected events were classified into different categories. Unlike other available software packages for EEG analysis, RIPPLELAB uniquely provides the appropriate graphical and algorithmic environment for HFO detection (visual and automatic) and validation, in such a way that the power of elaborated detection methods is available to a wide range of users (experts and non-experts) through the use of this application. We believe that this open-source tool will facilitate and promote the collaboration between clinical and research centers working in the HFO field. The tool is available under public license and is accessible through a dedicated web site. PMID:27341033

  1. Gene Expression Profile Change and Associated Physiological and Pathological Effects in Mouse Liver Induced by Fasting and Refeeding

    PubMed Central

    Zhang, Fang; Xu, Xiang; Zhou, Ben; He, Zhishui; Zhai, Qiwei

    2011-01-01

    Food availability regulates basal metabolism and progression of many diseases, and liver plays an important role in these processes. The effects of food availability on digital gene expression profile, physiological and pathological functions in liver are yet to be further elucidated. In this study, we applied high-throughput sequencing technology to detect digital gene expression profile of mouse liver in fed, fasted and refed states. In total, 12,162 genes were detected, and 2,305 genes were significantly regulated by food availability. Biological process and pathway analysis showed that fasting mainly affected lipid and carboxylic acid metabolic processes in liver. Moreover, the genes regulated by fasting and refeeding in liver were mainly enriched in lipid metabolic process or fatty acid metabolism. Network analysis demonstrated that fasting mainly regulated Drug Metabolism, Small Molecule Biochemistry and Endocrine System Development and Function, and the networks including Lipid Metabolism, Small Molecule Biochemistry and Gene Expression were affected by refeeding. In addition, FunDo analysis showed that liver cancer and diabetes mellitus were most likely to be affected by food availability. This study provides the digital gene expression profile of mouse liver regulated by food availability, and demonstrates the main biological processes, pathways, gene networks and potential hepatic diseases regulated by fasting and refeeding. These results show that food availability mainly regulates hepatic lipid metabolism and is highly correlated with liver-related diseases including liver cancer and diabetes. PMID:22096593

  2. Gene expression profile change and associated physiological and pathological effects in mouse liver induced by fasting and refeeding.

    PubMed

    Zhang, Fang; Xu, Xiang; Zhou, Ben; He, Zhishui; Zhai, Qiwei

    2011-01-01

    Food availability regulates basal metabolism and progression of many diseases, and liver plays an important role in these processes. The effects of food availability on digital gene expression profile, physiological and pathological functions in liver are yet to be further elucidated. In this study, we applied high-throughput sequencing technology to detect digital gene expression profile of mouse liver in fed, fasted and refed states. In total, 12,162 genes were detected, and 2,305 genes were significantly regulated by food availability. Biological process and pathway analysis showed that fasting mainly affected lipid and carboxylic acid metabolic processes in liver. Moreover, the genes regulated by fasting and refeeding in liver were mainly enriched in lipid metabolic process or fatty acid metabolism. Network analysis demonstrated that fasting mainly regulated Drug Metabolism, Small Molecule Biochemistry and Endocrine System Development and Function, and the networks including Lipid Metabolism, Small Molecule Biochemistry and Gene Expression were affected by refeeding. In addition, FunDo analysis showed that liver cancer and diabetes mellitus were most likely to be affected by food availability. This study provides the digital gene expression profile of mouse liver regulated by food availability, and demonstrates the main biological processes, pathways, gene networks and potential hepatic diseases regulated by fasting and refeeding. These results show that food availability mainly regulates hepatic lipid metabolism and is highly correlated with liver-related diseases including liver cancer and diabetes.

  3. The influence of operational and environmental loads on the process of assessing damages in beams

    NASA Astrophysics Data System (ADS)

    Furdui, H.; Muntean, F.; Minda, A. A.; Praisach, Z. I.; Gillich, N.

    2015-07-01

    Damage detection methods based on vibration analysis make use of changes in the modal parameters. Natural frequencies are the features that can be acquired most simply and inexpensively. But this parameter is influenced by environmental conditions, e.g. temperature, and by operational loads such as additional masses or axial loads induced by restrained displacements. The effect of these factors is not completely known, but much current research considers that they negatively affect the damage assessment process. This is justified by the small frequency changes occurring due to damage, which can be masked by the frequency shifts due to external loads. The paper intends to clarify the effect of external loads on the natural frequencies of beams and truss elements, and to show in which manner the damage detection process is affected by these loads. The finite element analysis, performed on diverse structures for a large range of temperature values, has shown that the temperature itself has a very limited effect on the frequency changes. In contrast, axial forces resulting from obstructed displacements can influence the frequency changes more substantially. These facts are demonstrated by experimental and theoretical studies. Finally, we succeeded in adapting a previously derived relation for the frequency changes due to damage to the case of known external loads. Since a new baseline for damage detection that considers the effect of temperature and external loads was found, this process can be performed without further complication.

  4. CT scan range estimation using multiple body parts detection: let PACS learn the CT image content.

    PubMed

    Wang, Chunliang; Lundström, Claes

    2016-02-01

    The aim of this study was to develop an efficient CT scan range estimation method that is based on the analysis of image data itself instead of metadata analysis. This makes it possible to quantitatively compare the scan range of two studies. In our study, 3D stacks are first projected to 2D coronal images via a ray casting-like process. Trained 2D body part classifiers are then used to recognize different body parts in the projected image. The detected candidate regions go into a structure grouping process to eliminate false-positive detections. Finally, the scale and position of the patient relative to the projected figure are estimated based on the detected body parts via a structural voting. The start and end lines of the CT scan are projected to a standard human figure. The position readout is normalized so that the bottom of the feet represents 0.0, and the top of the head is 1.0. Classifiers for 18 body parts were trained using 184 CT scans. The final application was tested on 136 randomly selected heterogeneous CT scans. Ground truth was generated by asking two human observers to mark the start and end positions of each scan on the standard human figure. When compared with the human observers, the mean absolute error of the proposed method is 1.2% (max: 3.5%) and 1.6% (max: 5.4%) for the start and end positions, respectively. We proposed a scan range estimation method using multiple body parts detection and relative structure position analysis. In our preliminary tests, the proposed method delivered promising results.
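    The final mapping step can be sketched as a least-squares fit between detected body-part rows and their normalized heights on a standard figure. The atlas values, detections and rows below are invented for illustration and are not the paper's trained model.

        import numpy as np

        # Hypothetical atlas: normalized height of each body part on a standard
        # figure (0.0 = soles of the feet, 1.0 = top of the head).
        ATLAS = {"head": 0.93, "lungs": 0.72, "pelvis": 0.45, "knees": 0.25}

        def fit_scan_range(detections, start_row, end_row):
            """Fit row ~ a * normalized_position + b from detected body parts,
            then map the scan's first/last image row onto the standard figure."""
            rows = np.array([r for _, r in detections], dtype=float)
            pos = np.array([ATLAS[name] for name, _ in detections])
            a, b = np.polyfit(pos, rows, 1)          # rows grow downward, so a < 0

            def to_pos(row):
                return (row - b) / a

            return to_pos(start_row), to_pos(end_row)

        dets = [("head", 40), ("lungs", 190), ("pelvis", 385)]   # (name, image row)
        print(fit_scan_range(dets, start_row=10, end_row=420))   # ~ (0.97, 0.40)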

  5. Colorimetric detection of uranium in water

    DOEpatents

    DeVol, Timothy A [Clemson, SC; Hixon, Amy E [Piedmont, SC; DiPrete, David P [Evans, GA

    2012-03-13

    Disclosed are methods, materials and systems that can be used to determine qualitatively or quantitatively the level of uranium contamination in water samples. Beneficially, disclosed systems are relatively simple and cost-effective. For example, disclosed systems can be utilized by consumers having little or no training in chemical analysis techniques. Methods generally include a concentration step and a complexation step. Uranium concentration can be carried out according to an extraction chromatographic process and complexation can chemically bind uranium with a detectable substance such that the formed substance is visually detectable. Methods can detect uranium contamination down to levels even below the MCL as established by the EPA.

  6. A generic nuclei detection method for histopathological breast images

    NASA Astrophysics Data System (ADS)

    Kost, Henning; Homeyer, André; Bult, Peter; Balkenhol, Maschenka C. A.; van der Laak, Jeroen A. W. M.; Hahn, Horst K.

    2016-03-01

    The detection of cell nuclei plays a key role in various histopathological image analysis problems. Considering the high variability of its applications, we propose a novel generic and trainable detection approach. Adaptation to specific nuclei detection tasks is done by providing training samples. A trainable deconvolution and classification algorithm is used to generate a probability map indicating the presence of a nucleus. The map is processed by an extended watershed segmentation step to identify the nuclei positions. We have tested our method on data sets with different stains and target nuclear types. We obtained F1-measures between 0.83 and 0.93.
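    The probability-map-plus-watershed stage can be sketched with standard scikit-image tools. The Gaussian blobs below stand in for the output of the trained deconvolution and classification step, and all thresholds are illustrative; the paper's extended watershed is not reproduced.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        # Toy 'probability map': two blurred blobs standing in for nuclei evidence.
        prob = np.zeros((64, 64))
        prob[20, 20] = prob[40, 44] = 1.0
        prob = ndi.gaussian_filter(prob, sigma=6)

        mask = prob > 0.3 * prob.max()                 # foreground evidence
        peaks = peak_local_max(prob, min_distance=5, labels=ndi.label(mask)[0])
        markers = np.zeros_like(prob, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

        # Watershed on the inverted map splits touching nuclei at probability valleys.
        labels = watershed(-prob, markers, mask=mask)
        print("nuclei found:", labels.max(), "at", peaks.tolist())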

  7. Research on the processing technology of elongated holes based on rotary ultrasonic drilling

    NASA Astrophysics Data System (ADS)

    Tong, Yi; Chen, Jianhua; Sun, Lipeng; Yu, Xin; Wang, Xin

    2014-08-01

    Optical glass is hard, brittle and difficult to process. Based on the method of rotary ultrasonic drilling, a single-factor study of drilling elongated holes in optical glass was carried out. The processing equipment was a DAMA ultrasonic machine, and the machining tools were electroplated with diamond. Through detection and analysis of the processing quality and surface roughness, the process parameters of rotary ultrasonic drilling (spindle speed, amplitude, feed rate) were investigated, and the influence of the processing parameters on surface roughness was obtained, which will provide a reference and basis for actual processing.

  8. Emission measurement and safety assessment for the production process of silicon nanoparticles in a pilot-scale facility

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Asbach, Christof; Fissan, Heinz; Hülser, Tim; Kaminski, Heinz; Kuhlbusch, Thomas A. J.; Pui, David Y. H.

    2012-03-01

    Emission into the workplace was measured for the production process of silicon nanoparticles in a pilot-scale facility at the Institute of Energy and Environmental Technology e.V. (IUTA). The silicon nanoparticles were produced in a hot-wall reactor and consisted of primary particles around 60 nm in diameter. We employed real-time aerosol instruments to measure particle number and lung-deposited surface area concentrations and size distribution; airborne particles were also collected for off-line electron microscopic analysis. Emission of silicon nanoparticles was not detected during the processes of synthesis, collection, and bagging. This was attributed to the completely closed production system and other safety measures against particle release which will be discussed briefly. Emission of silicon nanoparticles significantly above the detection limit was only observed during the cleaning process when the production system was open and manually cleaned. The majority of the detected particles was in the size range of 100-400 nm and were silicon nanoparticle agglomerates first deposited in the tubing then re-suspended during the cleaning process. Appropriate personal protection equipment is recommended for safety protection of the workers during cleaning.

  9. Online Continuous Trace Process Analytics Using Multiplexing Gas Chromatography.

    PubMed

    Wunsch, Marco R; Lehnig, Rudolf; Trapp, Oliver

    2017-04-04

    The analysis of impurities at a trace level in chemical products, nutrition additives, and drugs is highly important to guarantee safe products suitable for consumption. However, trace analysis in the presence of a dominating component can be a challenging task because of noncompatible linear detection ranges or strong signal overlap that suppresses the signal of interest. Here, we developed a technique for quantitative analysis using multiplexing gas chromatography (mpGC) for continuous and completely automated process trace analytics exemplified for the analysis of a CO2 stream in a production plant for detection of benzene, toluene, ethylbenzene, and the three structural isomers of xylene (BTEX) in the concentration range of 0-10 ppb. Additional minor components are methane and methanol with concentrations up to 100 ppm. The sample is injected up to 512 times according to a pseudorandom binary sequence (PRBS) with a mean frequency of 0.1 Hz into a gas chromatograph equipped with a flame ionization detector (FID). A superimposed chromatogram is recorded which is deconvoluted into an averaged chromatogram with Hadamard transformation. Novel algorithms to maintain the data acquisition rate of the detector by application of Hadamard transformation and to suppress correlation noise induced by components with much higher concentrations than the target substances are shown. Compared to conventional GC-FID, the signal-to-noise ratio has been increased by a factor of 10 with mpGC-FID. Correspondingly, the detection limits for BTEX in CO2 have been lowered from 10 to 1 ppb each. This has been achieved despite the presence of detectable components (methane and methanol) with a concentration about 1000 times higher than the target substances. The robustness and reliability of mpGC has been proven in a two-month field test in a chemical production plant.
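    The PRBS injection and Hadamard-type recovery can be sketched for a toy chromatogram under an idealized periodic model (edge effects, detector sampling and the paper's noise-suppression algorithms are omitted). For a maximal-length sequence s of period N, circularly cross-correlating the detector trace with the bipolar sequence 2s-1 collapses the superimposed injections back to a single chromatogram scaled by (N+1)/2. Sequence length, peak shape and noise level are illustrative.

        import numpy as np

        def lfsr_msequence(taps=(5, 3), length=31):
            """Maximal-length pseudorandom binary sequence from a small LFSR."""
            state = [1] * taps[0]
            seq = []
            for _ in range(length):
                seq.append(state[-1])
                fb = state[taps[0] - 1] ^ state[taps[1] - 1]
                state = [fb] + state[:-1]
            return np.array(seq)

        N = 31
        s = lfsr_msequence()                       # injection pattern (1 = inject)
        t = np.arange(N)
        x = np.exp(-0.5 * ((t - 8) / 1.5) ** 2)    # single-injection chromatogram

        # Detector sees the superposition of all overlapping injections (+ noise).
        y = np.real(np.fft.ifft(np.fft.fft(s) * np.fft.fft(x)))
        y += np.random.default_rng(0).normal(0, 0.02, N)

        # Hadamard-type recovery: circular cross-correlation with 2s-1, rescaled.
        b = 2 * s - 1
        x_hat = np.real(np.fft.ifft(np.conj(np.fft.fft(b)) * np.fft.fft(y))) * 2 / (N + 1)
        print("recovered peak position:", int(np.argmax(x_hat)))   # -> 8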

  10. Applicability of computer-aided comprehensive tool (LINDA: LINeament Detection and Analysis) and shaded digital elevation model for characterizing and interpreting morphotectonic features from lineaments

    NASA Astrophysics Data System (ADS)

    Masoud, Alaa; Koike, Katsuaki

    2017-09-01

    Detection and analysis of linear features related to surface and subsurface structures have been deemed necessary in natural resource exploration and earth surface instability assessment. Subjectivity in choosing control parameters required in conventional methods of lineament detection may cause unreliable results. To reduce this ambiguity, we developed LINDA (LINeament Detection and Analysis), an integrated tool with graphical user interface in Visual Basic. This tool automates processes of detection and analysis of linear features from grid data of topography (digital elevation model; DEM), gravity and magnetic surfaces, as well as data from remote sensing imagery. A simple interface with five display windows forms a user-friendly interactive environment. The interface facilitates grid data shading, detection and grouping of segments, lineament analyses for calculating strike and dip and estimating fault type, and interactive viewing of lineament geometry. Density maps of the center and intersection points of linear features (segments and lineaments) are also included. A systematic analysis of test DEMs and Landsat 7 ETM+ imagery datasets in the North and South Eastern Deserts of Egypt is implemented to demonstrate the capability of LINDA and correct use of its functions. Linear features from the DEM are superior to those from the imagery in terms of frequency, but both linear features agree with location and direction of V-shaped valleys and dykes and reference fault data. Through the case studies, LINDA applicability is demonstrated to highlight dominant structural trends, which can aid understanding of geodynamic frameworks in any region.
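    Automated lineament detection of this kind is commonly built from an edge detector followed by a line-segment transform. The sketch below uses scikit-image's Canny and probabilistic Hough transform on a toy surface; it is a generic stand-in, not LINDA's algorithm, and the surface and all parameters are invented for illustration.

        import numpy as np
        from skimage.feature import canny
        from skimage.transform import probabilistic_hough_line

        # Toy shaded 'DEM': a linear depression cut into a smooth gradient surface.
        dem = np.tile(np.linspace(0, 1, 128), (128, 1))
        rr = np.arange(128)
        dem[rr, np.clip(rr // 2 + 30, 0, 127)] -= 0.5

        edges = canny(dem, sigma=2)
        segments = probabilistic_hough_line(edges, threshold=10,
                                            line_length=30, line_gap=3)
        for (x0, y0), (x1, y1) in segments:
            strike = np.degrees(np.arctan2(y1 - y0, x1 - x0)) % 180
            print(f"segment ({x0},{y0})-({x1},{y1})  strike ~ {strike:.0f} deg")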

  11. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1976-01-01

    Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitization and analysis of the data; data reduction techniques for short pulse radar data; and the simulation of radar returns from the sea surface by computer models.

  12. Isothermal Amplification Methods for the Detection of Nucleic Acids in Microfluidic Devices

    PubMed Central

    Zanoli, Laura Maria; Spoto, Giuseppe

    2012-01-01

    Diagnostic tools for biomolecular detection need to fulfill specific requirements in terms of sensitivity, selectivity and high throughput in order to widen their applicability and to minimize the cost of the assay. Nucleic acid amplification is a key step in DNA detection assays, improving assay sensitivity by enabling the detection of small numbers of target molecules. The use of microfluidic devices to miniaturize amplification protocols reduces the required sample volume and analysis time and offers new possibilities for process automation and integration into a single device. The vast majority of miniaturized systems for nucleic acid analysis exploit the polymerase chain reaction (PCR) amplification method, which requires repeated cycles of three or two temperature-dependent steps during the amplification of the nucleic acid target sequence. In contrast, low-temperature isothermal amplification methods have no need for thermal cycling and thus require simpler microfluidic device features. Here, the use of miniaturized analysis systems employing isothermal amplification reactions for nucleic acid amplification is discussed. PMID:25587397

  13. Image Corruption Detection in Diffusion Tensor Imaging for Post-Processing and Real-Time Monitoring

    PubMed Central

    Li, Yue; Shea, Steven M.; Lorenz, Christine H.; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer from significant artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when multiple images are corrupted this method may no longer identify and reject them correctly. In this paper, we introduce a new criterion called "corrected Inter-Slice Intensity Discontinuity" (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies. PMID:24204551
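
    The exact cISID formulation is given in the paper; the sketch below implements only the underlying intuition, flagging slices whose mean intensity jumps abruptly relative to both neighbours, with a robust MAD-based threshold. The threshold and the per-slice statistic are illustrative assumptions.

        import numpy as np

        def inter_slice_discontinuity(volume):
            # Per-slice intensity discontinuity for a 3-D DWI volume shaped
            # (x, y, slice). A motion-corrupted slice typically shows an
            # abrupt jump in mean intensity relative to both neighbours.
            means = volume.mean(axis=(0, 1))      # mean of each axial slice
            d = np.zeros_like(means)
            # First and last slices have only one neighbour and are skipped:
            d[1:-1] = np.abs(means[1:-1] - 0.5 * (means[:-2] + means[2:]))
            return d

        def flag_corrupted(volume, n_sigma=3.0):
            # Flag slices whose discontinuity exceeds a robust
            # median-absolute-deviation threshold.
            d = inter_slice_discontinuity(volume)
            med = np.median(d)
            mad = np.median(np.abs(d - med)) + 1e-12
            return np.where(d > med + n_sigma * 1.4826 * mad)[0]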

  14. Non-invasive assessment of skeletal muscle activity

    NASA Astrophysics Data System (ADS)

    Merletti, Roberto; Orizio, Claudio; di Prampero, Pietro E.; Tesch, Per

    2005-10-01

    After the first 3 years (2002-2005), the MAP project has made available:
    - systems for electrodes, signal conditioning, and digital processing for multichannel, simultaneously detected EMG and MMG, as well as for simultaneous electrical stimulation and EMG detection with artifact cancellation;
    - innovative non-invasive techniques for the extraction of individual motor unit action potentials (MUAPs) and of individual motor unit and MMG contributions from the surface EMG interference signal and the MMG signal;
    - processing techniques for extracting indicators of progressive fatigue from the electrically elicited (M-wave) EMG signal (one such indicator is sketched below);
    - techniques for the analysis of dynamic multichannel EMG during cyclic or explosive exercise (in collaboration with project EXER/MAP-MED-027).
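
    As one concrete example of a fatigue indicator of the kind listed above (though not necessarily the MAP project's exact processing chain), the median frequency of the EMG power spectrum shifts downward as the muscle fatigues and can be tracked epoch by epoch; the sampling rate and epoch length below are hypothetical.

        import numpy as np
        from scipy.signal import welch

        def median_frequency(emg, fs=2048.0):
            # Median frequency of the EMG power spectrum: the frequency that
            # splits spectral power in half. It decreases with myoelectric
            # fatigue.
            f, pxx = welch(emg, fs=fs, nperseg=1024)
            cumulative = np.cumsum(pxx)
            return f[np.searchsorted(cumulative, 0.5 * cumulative[-1])]

        # Track the indicator over successive one-second epochs:
        # mdf = [median_frequency(emg[int(i*fs):int((i+1)*fs)], fs)
        #        for i in range(n_epochs)]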

  15. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  16. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: 1) environmental noise reduction, 2) neural cell segmentation, 3) neural cell classification based on the growth conditions of their dendrites, and 4) neuron feature extraction and measurement (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
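
    A minimal sketch of the first two steps (noise reduction and segmentation) with per-cell feature measurement is given below; it is a simplified stand-in for the paper's pipeline, and the trained feed-forward edge detector is not reproduced. All parameter values are illustrative.

        import numpy as np
        from skimage import filters, measure, morphology

        def segment_cells(img):
            # 1) Noise reduction by Gaussian smoothing.
            smooth = filters.gaussian(img, sigma=2.0)
            # 2) Segmentation by global Otsu thresholding, dropping specks.
            mask = smooth > filters.threshold_otsu(smooth)
            mask = morphology.remove_small_objects(mask, min_size=64)
            # 4) Per-cell features, e.g. body area, analogous to the
            # measurements listed in the abstract.
            labels = measure.label(mask)
            return [(r.label, r.area) for r in measure.regionprops(labels)]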

  17. Aspect level sentiment analysis using machine learning

    NASA Astrophysics Data System (ADS)

    Shubham, D.; Mithil, P.; Shobharani, Meesala; Sumathy, S.

    2017-11-01

    In the modern world, the development of the web and smartphones has increased the use of online shopping. Overall feedback about a product is generated with the help of sentiment analysis using text processing. Opinion mining, or sentiment analysis, is used to collect and categorize product reviews. The proposed system uses aspect-level detection, in which features are extracted from the datasets. The system performs pre-processing operations such as tokenization, part-of-speech tagging, and lemmatization on the data to find meaningful information, which is used to detect the polarity level and assign a rating to the product. The proposed model focuses on aspects to produce accurate results by avoiding spam reviews.
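
    A minimal sketch of the named pre-processing steps using NLTK follows; the resource names can vary slightly across NLTK versions, and the noun/adjective heuristics for aspect and opinion candidates are illustrative, not the proposed model.

        import nltk
        from nltk import pos_tag, word_tokenize
        from nltk.stem import WordNetLemmatizer

        # One-time resource downloads (skipped quietly if already present):
        for pkg in ("punkt", "averaged_perceptron_tagger", "wordnet"):
            nltk.download(pkg, quiet=True)

        lemmatizer = WordNetLemmatizer()

        def preprocess(review):
            # Tokenization, part-of-speech tagging, and lemmatization,
            # mirroring the pre-processing steps named in the abstract.
            # Nouns ('NN*') are candidate aspect terms; adjectives ('JJ*')
            # carry opinion polarity.
            tokens = word_tokenize(review.lower())
            lemmas = [(lemmatizer.lemmatize(w), t)
                      for w, t in pos_tag(tokens)]
            aspects = [w for w, t in lemmas if t.startswith("NN")]
            opinions = [w for w, t in lemmas if t.startswith("JJ")]
            return aspects, opinions

        # preprocess("The battery life is great but the screen scratches")
        # -> aspect candidates like 'battery', 'life', 'screen';
        #    opinion words like 'great'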

  18. Note: Durability analysis of optical fiber hydrogen sensor based on Pd-Y alloy film.

    PubMed

    Huang, Peng-cheng; Chen, You-ping; Zhang, Gang; Song, Han; Liu, Yi

    2016-02-01

    The Pd-Y alloy sensing film has excellent properties for hydrogen detection, but after only one month its sensing performance degrades severely. To study this failure, XPS spectral analysis was used to examine the chemical composition of the Pd-Y alloy film, and the results demonstrate that the yttrium was oxidized. The paper proposes that this oxidation process is the likely cause of the failure of the sensing film. With a better understanding of the failure mechanism, the manufacturing process can be improved to enhance the performance of the hydrogen sensor.

  19. Automatic Road Gap Detection Using Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.

    2011-09-01

    Automatic feature extraction from aerial and satellite images is a high-level data processing task that remains one of the most important research topics in the field. Most research focuses on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are among the mature methods. Yet none of these detection algorithms can extract a road network perfectly, and post-processing algorithms, which refine the road detection results, are not as well developed. The aim of this article is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of the following main steps:
    1) Short gap coverage: a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme.
    2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. To this end, a knowledge base of expert rules is designed and fired on gap candidates from the road detection results (a toy rule base is sketched below).
    3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial: shorter gaps are filled by line fitting, while longer ones are compensated by polynomials.
    4) Accuracy assessment: to evaluate the obtained results, accuracy assessment criteria are proposed, obtained by comparing the results with correctly compensated ones produced by a human expert. The complete evaluation of the results, with technical discussion, is presented in the full paper.
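
    The sketch below shows a Mamdani-style fuzzy inference step of the kind used for long gap detection; the inputs, membership shapes, and rules are hypothetical, since the paper's expert rule base is not reproduced here.

        import numpy as np

        def smf_up(x, a, b):
            # Rising shoulder membership: 0 below a, 1 above b.
            return float(np.clip((x - a) / (b - a), 0.0, 1.0))

        def smf_down(x, a, b):
            # Falling shoulder membership: 1 below a, 0 above b.
            return 1.0 - smf_up(x, a, b)

        def gap_likelihood(gap_len_px, angle_diff_deg, width_ratio):
            # Scores whether two road-segment endpoints bound a true gap.
            short   = smf_down(gap_len_px, 20.0, 60.0)     # short gap
            long_   = smf_up(gap_len_px, 40.0, 120.0)      # long gap
            aligned = smf_down(angle_diff_deg, 5.0, 25.0)  # collinear
            similar = smf_up(width_ratio, 0.6, 0.9)        # similar widths

            fire_gap   = min(aligned, similar)              # AND via min
            fire_nogap = max(1.0 - aligned, 1.0 - similar)  # OR via max
            # Weighted (centroid-like) defuzzification onto [0, 1]:
            score = (0.9 * fire_gap + 0.1 * fire_nogap) \
                  / (fire_gap + fire_nogap + 1e-9)
            # Downstream, `short` gaps would be bridged by line fitting and
            # `long_` ones by polynomial fitting (steps 1 and 3 above).
            return score

        # gap_likelihood(35.0, 8.0, 0.9) -> about 0.78: a likely true gap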

  20. Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)

    2001-01-01

    Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed SVD, or TSVD, which reduces the number of features by concentrating the energy density more compactly. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
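
    The sketch below shows the general time-frequency-plus-SVD idea: a spectrogram stands in for the paper's multiscale wavelet energy distribution (an assumption for simplicity), and a rank-k truncation keeps the principal features while suppressing noise.

        import numpy as np
        from scipy.signal import spectrogram

        def tf_svd_features(x, fs, k=3):
            # Time-frequency energy distribution followed by SVD. The
            # leading singular triplets capture the dominant modal features;
            # the fraction of energy they hold is a compactness measure.
            f, t, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
            U, sv, Vt = np.linalg.svd(Sxx, full_matrices=False)
            energy_fraction = sv[:k]**2 / np.sum(sv**2)
            denoised = (U[:, :k] * sv[:k]) @ Vt[:k, :]  # rank-k reconstruction
            return f, t, denoised, energy_fraction

        # Example on a chirp-like response with additive noise:
        fs = 1000.0
        t = np.arange(0.0, 5.0, 1.0 / fs)
        x = np.sin(2 * np.pi * (20 + 4 * t) * t) \
          + 0.5 * np.random.default_rng(0).standard_normal(t.size)
        f, tt, den, frac = tf_svd_features(x, fs)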
