COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
Surface Management System Departure Event Data Analysis
NASA Technical Reports Server (NTRS)
Monroe, Gilena A.
2010-01-01
This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results show a modest overall detection performance for push-back events and a significantly higher overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.
Electro-optical system for gunshot detection: analysis, concept, and performance
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.
2011-08-01
The paper discusses technical possibilities for building an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. First, an analysis of the three distinct phases of sniper activity is presented: before, during, and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined, which are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper's body and muzzle flash were analyzed as targets, the phenomena which make it possible to detect sniper activity in the infrared spectrum were described, and the physical limitations were analyzed. The infrared systems under consideration were simulated using NVTherm software. Calculations were performed for several cameras equipped with different lenses and detector types. Detection ranges were simulated for selected sniper detection scenarios. After analysis of the simulation results, the technical specifications of an infrared sniper detection system required to provide the assumed detection range were discussed. Finally, an infrared camera setup was proposed that can detect a sniper at a range of 1000 meters.
Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing
NASA Technical Reports Server (NTRS)
Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.
2008-01-01
Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics and the overall guidance, navigation and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.
From Pacemaker to Wearable: Techniques for ECG Detection Systems.
Kumar, Ashish; Komaragiri, Rama; Kumar, Manjeet
2018-01-11
With the alarming rise in deaths due to cardiovascular diseases (CVD), current medical research places notable importance on techniques and methods to detect CVDs. As noted by the World Health Organization, technological advances in the field of cardiac function assessment have become central to leading research studies on CVDs, in which electrocardiogram (ECG) analysis is the most functional and convenient tool used to test the range of heart-related irregularities. Most of the approaches in the literature on ECG signal analysis consider noise removal, rhythm-based analysis, and heartbeat detection to improve the performance of a cardiac pacemaker. Advancements achieved in the field of ECG segment detection and beat classification have had limited evaluation and still require clinical approval. In this paper, approaches and techniques to implement an on-chip ECG detector for a cardiac pacemaker system are discussed. Moreover, the different challenges regarding ECG signal morphology analysis reported in the medical literature are extensively reviewed. It is found that robustness to noise, wavelet parameter choice, numerical efficiency, and detection performance are essential performance indicators required of a state-of-the-art ECG detector. Furthermore, many algorithms described in the existing literature are not verified using ECG data from the standard databases. Some ECG detection algorithms show very high detection performance with respect to the total number of detected QRS complexes. However, the high detection performance of these algorithms is verified using only a few datasets. Finally, gaps in current advancements and testing are identified, and the primary challenge remains the implementation of a bullseye test for morphology analysis evaluation.
NASA Technical Reports Server (NTRS)
Guerreiro, Nelson M.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Lewis, Timothy A.
2016-01-01
A loss-of-separation (LOS) is said to occur when two aircraft are spatially too close to one another. A LOS is the fundamental unsafe event to be avoided in air traffic management, and conflict detection (CD) is the function that attempts to predict these LOS events. In general, the effectiveness of conflict detection relates to the overall safety and performance of an air traffic management concept. An abstract, parametric analysis was conducted to investigate the impact of surveillance quality, level of intent information, and quality of intent information on conflict detection performance. The data collected in this analysis can be used to estimate conflict detection performance under alternative future scenarios or alternative allocations of the conflict detection function, based on the quality of the surveillance and intent information under those conditions. Alternatively, these data could also be used to estimate the surveillance and intent information quality required to achieve some desired CD performance as part of the design of a new separation assurance system.
Datta, Niladri Sekhar; Dutta, Himadri Sekhar; Majumder, Koushik
2016-01-01
The contrast enhancement of retinal images plays a vital role in the detection of microaneurysms (MAs), which are an early sign of diabetic retinopathy. A retinal image contrast enhancement method is presented to improve the MA detection technique. The success rate on low-contrast, noisy retinal images shows the importance of the proposed method. Overall, 587 retinal input images were tested for performance analysis. The average sensitivity and specificity were 95.94% and 99.21%, respectively. The area under the curve was 0.932 in the receiver operating characteristic analysis. Classification of diabetic retinopathy disease is also performed. The experimental results show that the overall MA detection method performs better than current state-of-the-art MA detection algorithms.
NASA Astrophysics Data System (ADS)
Chen, Xinjia; Lacy, Fred; Carriere, Patrick
2015-05-01
Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as port scanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and the average detection time to arbitrary pre-specified accuracy.
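As an illustration of the kind of sequential test such algorithms build on, the following Python sketch runs a Wald sequential probability ratio test over a stream of connection outcomes and estimates its false-alarm probability and average detection time by Monte Carlo simulation rather than by the exact computational method described above; the function names, failure probabilities, and thresholds are illustrative assumptions, not taken from the paper.

```python
import math
import random

def sprt_portscan(outcomes, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test over a stream of connection outcomes.

    outcomes: iterable of 0/1, where 1 marks a failed connection attempt.
    p0 / p1:  assumed failure probabilities for a benign host / a scanner.
    Returns (decision, samples_used), decision in {"scanner", "benign", "undecided"}.
    """
    upper = math.log((1 - beta) / alpha)    # crossing it -> declare scanner
    lower = math.log(beta / (1 - alpha))    # crossing it -> declare benign
    llr, n = 0.0, 0
    for n, y in enumerate(outcomes, start=1):
        llr += math.log(p1 / p0) if y else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "scanner", n
        if llr <= lower:
            return "benign", n
    return "undecided", n

def simulate(p_true, trials=5000, max_len=200):
    """Monte Carlo estimate of the alarm rate and average decision time."""
    alarms, times = 0, []
    for _ in range(trials):
        stream = (1 if random.random() < p_true else 0 for _ in range(max_len))
        decision, n = sprt_portscan(stream)
        if decision == "scanner":
            alarms += 1
            times.append(n)
    avg_time = sum(times) / len(times) if times else float("nan")
    return alarms / trials, avg_time

pfa, _ = simulate(p_true=0.2)   # benign traffic: alarm rate approximates P(false alarm)
_, adt = simulate(p_true=0.8)   # scanning traffic: average detection time
print(f"P(false alarm) ~ {pfa:.4f}, average detection time ~ {adt:.1f} samples")
```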
Accurate mobile malware detection and classification in the cloud.
Wang, Xiaolei; Yang, Yuexiang; Zeng, Yingzhi
2015-01-01
As the dominant smartphone operating system, Android has attracted the attention of malware authors and researchers alike. The number of types of Android malware is increasing rapidly despite the considerable number of proposed malware analysis systems. In this paper, by taking advantage of the low false-positive rate of misuse detection and the ability of anomaly detection to detect zero-day malware, we propose a novel hybrid detection system based on a new open-source framework, CuckooDroid, which enables the use of Cuckoo Sandbox's features to analyze Android malware through dynamic and static analysis. Our proposed system mainly consists of two parts: an anomaly detection engine performing abnormal app detection through dynamic analysis, and a signature detection engine performing known malware detection and classification with a combination of static and dynamic analysis. We evaluate our system using 5560 malware samples and 6000 benign samples. Experiments show that our anomaly detection engine with dynamic analysis is capable of detecting zero-day malware with a low false negative rate (1.16%) and an acceptable false positive rate (1.30%); it is worth noting that our signature detection engine with hybrid analysis can accurately classify malware samples with an average true positive rate of 98.94%. Considering the intensive computing resources required by the static and dynamic analysis, our proposed detection system should be deployed off-device, such as in the cloud. The app store markets and ordinary users can access our detection system for malware detection through a cloud service.
A Pulsed Thermographic Imaging System for Detection and Identification of Cotton Foreign Matter
Kuzy, Jesse; Li, Changying
2017-01-01
Detection of foreign matter in cleaned cotton is instrumental to accurately grading cotton quality, which in turn impacts the marketability of the cotton. Current grading systems return estimates of the amount of foreign matter present, but provide no information about the identity of the contaminants. This paper explores the use of pulsed thermographic analysis to detect and identify cotton foreign matter. The design and implementation of a pulsed thermographic analysis system is described. A sample set of 240 foreign matter and cotton lint samples was collected. Hand-crafted waveform features and frequency-domain features were extracted and analyzed for statistical significance. Classification was performed on these features using linear discriminant analysis and support vector machines. Using waveform features and support vector machine classifiers, detection of cotton foreign matter was performed with 99.17% accuracy. Using frequency-domain features and linear discriminant analysis, identification was performed with 90.00% accuracy. These results demonstrate that pulsed thermographic imaging analysis produces data of significant utility for the detection and identification of cotton foreign matter. PMID:28273848
Methodology for fast detection of false sharing in threaded scientific codes
Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang
2014-11-25
A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.
Main propulsion functional path analysis for performance monitoring fault detection and annunciation
NASA Technical Reports Server (NTRS)
Keesler, E. L.
1974-01-01
A total of 48 operational flight instrumentation measurements were identified for use in performance monitoring and fault detection. The Operational Flight Instrumentation List contains all measurements identified for fault detection and annunciation. Some 16 controller data words were identified for use in fault detection and annunciation.
Performance analysis of hierarchical group key management integrated with adaptive intrusion detection in mobile ad hoc networks
2016-04-05
Applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on mobile ad hoc networks. Keywords: mobile ad hoc networks; intrusion detection; group communication systems.
Automatic adventitious respiratory sound analysis: A systematic review.
Pramono, Renard Xaviero Adhi; Bowyer, Stuart; Rodriguez-Villegas, Esther
2017-01-01
Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained by references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated, were included. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9 (11.69%) on rhonchi, and 18 (23.38%) on other sounds such as pleural rub, squawk, as well as the pathology. Instrumentation used to collect data included microphones, stethoscopes, and accelerometers. Several references obtained data from online repositories or book audio CD companions. Detection or classification methods used varied from empirically determined thresholds to more complex machine learning techniques. Performance reported in the surveyed works was converted to accuracy measures for data synthesis. Direct comparison of the performance of surveyed works cannot be performed as the input data used by each was different. A standard validation method has not been established, resulting in different works using different methods and performance measure definitions. A review of the literature was performed to summarise different analysis approaches, features, and methods used for the analysis. The performance of recent studies showed a high agreement with conventional non-automatic identification. This suggests that automated adventitious sound detection or classification is a promising solution to overcome the limitations of conventional auscultation and to assist in the monitoring of relevant diseases. PMID:28552969
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Godiwala, P. M.; Morrell, F. R.
1985-01-01
This paper presents the performance analysis results of a fault inferring nonlinear detection system (FINDS) using integrated avionics sensor flight data for the NASA ATOPS B-737 aircraft in a Microwave Landing System (MLS) environment. First, an overview of the FINDS algorithm structure is given. Then, aircraft state estimate time histories and statistics for the flight data sensors are discussed. This is followed by an explanation of modifications made to the detection and decision functions in FINDS to improve false alarm and failure detection performance. Next, the failure detection and false alarm performance of the FINDS algorithm are analyzed by injecting bias failures into fourteen sensor outputs over six repetitive runs of the five minutes of flight data. Results indicate that the detection speed, failure level estimation, and false alarm performance show a marked improvement over the previously reported simulation runs. In agreement with earlier results, detection speed is faster for filter measurement sensors such as MLS than for filter input sensors such as flight control accelerometers. Finally, the progress in modifications of the FINDS algorithm design to accommodate flight computer constraints is discussed.
Adam, Asrul; Ibrahim, Zuwairie; Mokhtar, Norrima; Shapiai, Mohd Ibrahim; Cumming, Paul; Mubin, Marizan
2016-01-01
Various peak models have been introduced to detect and analyze peaks in the time-domain analysis of electroencephalogram (EEG) signals. In general, a peak model in the time-domain analysis consists of a set of signal parameters, such as amplitude, width, and slope. Models including those proposed by Dumpala, Acir, Liu, and Dingle are routinely used to detect peaks in EEG signals acquired in clinical studies of epilepsy or eye blink. The optimal peak model is the one that gives the most reliable peak detection performance in a particular application. A fair measure of the performance of different models requires a common and unbiased platform. In this study, we evaluate the performance of the four different peak models using the extreme learning machine (ELM)-based peak detection algorithm. We found that the Dingle model gave the best performance, with 72% accuracy in the analysis of real EEG data. Statistical analysis confirmed that the Dingle model afforded significantly better mean testing accuracy than the Acir and Liu models, which were in the range 37-52%. Meanwhile, the Dingle model showed no significant difference compared to the Dumpala model.
NASA Technical Reports Server (NTRS)
Guerreiro, Nelson M.; Butler, Ricky W.; Maddalon, Jeffrey M.; Hagen, George E.; Lewis, Timothy A.
2015-01-01
The performance of the conflict detection function in a separation assurance system is dependent on the content and quality of the data available to perform that function. Specifically, data quality and data content available to the conflict detection function have a direct impact on the accuracy of the prediction of an aircraft's future state or trajectory, which, in turn, impacts the ability to successfully anticipate potential losses of separation (detect future conflicts). Consequently, other separation assurance functions that rely on the conflict detection function - namely, conflict resolution - are prone to negative performance impacts. The many possible allocations and implementations of the conflict detection function between centralized and distributed systems drive the need to understand the key relationships that impact conflict detection performance, with respect to differences in data available. This paper presents the preliminary results of an analysis technique developed to investigate the impacts of data quality and data content on conflict detection performance. Flight track data recorded from a day of the National Airspace System is time-shifted to create conflicts not present in the un-shifted data. A methodology is used to smooth and filter the recorded data to eliminate sensor fusion noise, data drop-outs and other anomalies in the data. The metrics used to characterize conflict detection performance are presented and a set of preliminary results is discussed.
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
2012-08-01
...subsequent chemical analysis (into acetonitrile for high-performance liquid chromatography [HPLC] analysis or hexane for gas chromatography [GC] analysis) is rapid and complete. In this work, PAHs were analyzed by Waters 2795 HPLC with fluorescent detection (USEPA Method 8310) and PCBs were analyzed by GC. ...Detection limits by direct water injection versus SPME with PDMS, and coefficient of variation and correlation coefficient for SPME. Analysis by HPLC...
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Godiwala, P. M.
1985-01-01
The performance analysis results of a fault inferring nonlinear detection system (FINDS) using sensor flight data for the NASA ATOPS B-737 aircraft in a Microwave Landing System (MLS) environment are presented. First, a statistical analysis of the flight-recorded sensor data was made in order to determine the characteristics of sensor inaccuracies. Next, modifications were made to the detection and decision functions in the FINDS algorithm in order to improve false alarm and failure detection performance under the real modelling errors present in the flight data. Finally, the failure detection and false alarm performance of the FINDS algorithm were analyzed by injecting bias failures into fourteen sensor outputs over six repetitive runs of the five minutes of flight data. In general, the detection speed, failure level estimation, and false alarm performance showed a marked improvement over the previously reported simulation runs. In agreement with earlier results, detection speed was faster for filter measurement sensors such as MLS than for filter input sensors such as flight control accelerometers.
Foreign object detection and removal to improve automated analysis of chest radiographs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime
2013-07-15
Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a per-pixel probability estimate of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
Cluster signal-to-noise analysis for evaluation of the information content in an image.
Weerawanich, Warangkana; Shimizu, Mayumi; Takeshita, Yohei; Okamura, Kazutoshi; Yoshida, Shoko; Yoshiura, Kazunori
2018-01-01
(1) To develop an observer-free method of analysing image quality related to the observer performance in the detection task and (2) to analyse observer behaviour patterns in the detection of small mass changes in cone-beam CT images. 13 observers detected holes in a Teflon phantom in cone-beam CT images. Using the same images, we developed a new method, cluster signal-to-noise analysis, to detect the holes by applying various cut-off values using ImageJ and reconstructing cluster signal-to-noise curves. We then evaluated the correlation between cluster signal-to-noise analysis and the observer performance test. We measured the background noise in each image to evaluate the relationship with false positive rates (FPRs) of the observers. Correlations between mean FPRs and intra- and interobserver variations were also evaluated. Moreover, we calculated true positive rates (TPRs) and accuracies from background noise and evaluated their correlations with TPRs from observers. Cluster signal-to-noise curves were derived in cluster signal-to-noise analysis. They yield the detection of signals (true holes) related to noise (false holes). This method correlated highly with the observer performance test (R² = 0.9296). In noisy images, increasing background noise resulted in higher FPRs and larger intra- and interobserver variations. TPRs and accuracies calculated from background noise had high correlation with actual TPRs from observers; R² was 0.9244 and 0.9338, respectively. Cluster signal-to-noise analysis can simulate the detection performance of observers and thus replace the observer performance test in the evaluation of image quality. Erroneous decision-making increased with increasing background noise.
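The cluster signal-to-noise analysis described above was carried out in ImageJ; the Python sketch below shows one way such a curve could be assembled, under the assumptions that holes appear as dark connected clusters and that their true locations are known from the phantom design. Function names, the darkness assumption, and the minimum cluster size are illustrative, not taken from the paper.

```python
import numpy as np
from scipy import ndimage

def cluster_sn_curve(image, hole_mask, cutoffs, min_size=4):
    """Trace a cluster signal-to-noise curve for one phantom image.

    image:     2-D array (one CBCT slice of the phantom region).
    hole_mask: boolean array marking where the true holes are.
    cutoffs:   grey-level thresholds to sweep.
    Returns a list of (cutoff, true_clusters, false_clusters).
    """
    curve = []
    for t in cutoffs:
        binary = image < t                      # assumed: holes are darker than background
        labels, n = ndimage.label(binary)       # connected-component labelling
        true_c, false_c = 0, 0
        for k in range(1, n + 1):
            cluster = labels == k
            if cluster.sum() < min_size:        # ignore tiny specks
                continue
            if (cluster & hole_mask).any():
                true_c += 1                     # overlaps a real hole -> signal cluster
            else:
                false_c += 1                    # lies in homogeneous material -> noise cluster
        curve.append((t, true_c, false_c))
    return curve
```

Plotting true clusters against false clusters over the threshold sweep yields the signal-versus-noise curve that can then be correlated with observer results.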
Comparison of public peak detection algorithms for MALDI mass spectrometry data analysis.
Yang, Chao; He, Zengyou; Yu, Weichuan
2009-01-06
In mass spectrometry (MS) based proteomic data analysis, peak detection is an essential step for subsequent analysis. Recently, there has been significant progress in the development of various peak detection algorithms. However, neither a comprehensive survey nor an experimental comparison of these algorithms is yet available. The main objective of this paper is to provide such a survey and to compare the performance of single-spectrum-based peak detection methods. In general, we can decompose a peak detection procedure into three consecutive parts: smoothing, baseline correction and peak finding. We first categorize existing peak detection algorithms according to the techniques used in different phases. Such a categorization reveals the differences and similarities among existing peak detection algorithms. Then, we choose five typical peak detection algorithms to conduct a comprehensive experimental study using both simulation data and real MALDI MS data. The results of the comparison show that the continuous wavelet-based algorithm provides the best average performance.
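The three-stage decomposition mentioned above (smoothing, baseline correction, peak finding) can be illustrated with a short Python sketch; the specific filters and window sizes below are illustrative choices rather than the configurations compared in the paper, and the final stage uses SciPy's continuous-wavelet-transform peak finder in the spirit of the best-performing algorithm.

```python
import numpy as np
from scipy.signal import savgol_filter, medfilt, find_peaks_cwt

def detect_peaks(spectrum, smooth_window=11, baseline_window=301,
                 cwt_widths=np.arange(1, 30)):
    """Three-stage peak detection: smoothing, baseline correction, peak finding."""
    # 1. smoothing: a Savitzky-Golay filter suppresses high-frequency noise
    smoothed = savgol_filter(spectrum, window_length=smooth_window, polyorder=3)
    # 2. baseline correction: a very wide median filter approximates the slow baseline
    baseline = medfilt(smoothed, kernel_size=baseline_window)
    corrected = smoothed - baseline
    # 3. peak finding: continuous-wavelet-transform based detection
    peak_idx = find_peaks_cwt(corrected, widths=cwt_widths)
    return np.asarray(peak_idx), corrected
```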
Performance analysis of a generalized upset detection procedure
NASA Technical Reports Server (NTRS)
Blough, Douglas M.; Masson, Gerald M.
1987-01-01
A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.
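To make the three performance measures concrete, the following sketch computes them for a simplified model in which each capture/analysis cycle flags an upset-containing block with probability d and an upset-free block with probability f. The geometric-waiting-time latency formula and the cycle-time model are simplifying assumptions made for illustration, not the rigorous characterization derived in the paper.

```python
def upset_monitor_metrics(d, f, n_blocks, block_time, analysis_time):
    """Illustrative versions of the three upset-monitoring performance measures.

    d: probability that a block containing an upset is flagged.
    f: probability that an upset-free block is flagged (false alarm per block).
    n_blocks: number of capture/analysis cycles in the monitoring interval.
    block_time, analysis_time: durations of one capture and one analysis pass;
        the analysis time would grow linearly, quadratically, or logarithmically
        with block length depending on the analysis algorithm.
    """
    cycle = block_time + analysis_time
    p_detect = 1.0 - (1.0 - d) ** n_blocks                    # detect within the interval
    expected_false_alarms = n_blocks * f                      # alarms on upset-free blocks
    expected_latency = cycle / d if d > 0 else float("inf")   # geometric waiting time
    return p_detect, expected_false_alarms, expected_latency
```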
RCS propulsion functional path analysis for performance monitoring fault detection and annunciation
NASA Technical Reports Server (NTRS)
Keesler, E. L.
1974-01-01
The operational flight instrumentation required for performance monitoring and fault detection are presented. Measurements by the burn through monitors are presented along with manifold and helium source pressures.
The Identification of Software Failure Regions
1990-06-01
In this thesis, failure regions are defined and a method of failure region analysis is described in detail. The thesis describes how this analysis may be used to detect non-obviously redundant test cases. A preliminary examination of the manual analysis method is performed with a set of programs. A failure is the termination of the ability of a functional unit to perform its required function (Glossary, 1983). The presence of faults in program code...
NASA Astrophysics Data System (ADS)
Li, Husheng; Betz, Sharon M.; Poor, H. Vincent
2007-05-01
This paper examines the performance of decision feedback based iterative channel estimation and multiuser detection in channel coded aperiodic DS-CDMA systems operating over multipath fading channels. First, explicit expressions describing the performance of channel estimation and parallel interference cancellation based multiuser detection are developed. These results are then combined to characterize the evolution of the performance of a system that iterates among channel estimation, multiuser detection and channel decoding. Sufficient conditions for convergence of this system to a unique fixed point are developed.
Metzger, Julia; Philipp, Ute; Lopes, Maria Susana; da Camara Machado, Artur; Felicetti, Michela; Silvestrelli, Maurizio; Distl, Ottmar
2013-07-18
Copy number variants (CNVs) have been shown to play an important role in genetic diversity of mammals and in the development of many complex phenotypic traits. The aim of this study was to perform a standard comparative evaluation of CNVs in horses using three different CNV detection programs and to identify genomic regions associated with body size in horses. Analysis was performed using the Illumina Equine SNP50 genotyping beadchip for 854 horses. CNVs were detected by three different algorithms, CNVPartition, PennCNV and QuantiSNP. Comparative analysis revealed 50 CNVs that affected 153 different genes mainly involved in sensory perception, signal transduction and cellular components. Genome-wide association analysis for body size showed highly significant deleted regions on ECA1, ECA8 and ECA9. Homologous regions to the detected CNVs on ECA1 and ECA9 have also been shown to be correlated with human height. Comparative analysis of CNV detection algorithms was useful to increase the specificity of CNV detection but had certain limitations dependent on the detection tool. GWAS revealed genome-wide associated CNVs for body size in horses.
2009-09-01
...OF A LINK-16/JTIDS COMPATIBLE WAVEFORM WITH NONCOHERENT DETECTION, DIVERSITY AND SIDE INFORMATION, by Ioannis Kagioglidis, September 2009. ...baseband waveforms and detected noncoherently. For noncoherent detection, only one five-bit symbol is transmitted on both the I and Q components...
NASA Astrophysics Data System (ADS)
Syarifah, V. B.; Rafi, M.; Wahyuni, W. T.
2017-05-01
Brotowali (Tinospora crispa) is widely used in Indonesia as an ingredient of herbal medicine formulations. To ensure the quality, safety, and efficacy of herbal medicine products, their chemical constituents should be continuously evaluated. High-performance liquid chromatography (HPLC) fingerprinting is a powerful technique for this quality control process. In this study, an HPLC fingerprint analysis method was developed for quality control of brotowali. HPLC analysis was performed on a C18 column, and detection was performed using a photodiode array detector. The optimum mobile phase for the brotowali fingerprint was acetonitrile (ACN) and 0.1% formic acid in gradient elution mode at a flow rate of 1 mL/min. The number of peaks detected in the HPLC fingerprint of brotowali was 32 for stems and 23 for leaves. Berberine, as a marker compound, was detected at a retention time of 20.525 minutes. Evaluation of analytical performance, including precision, reproducibility, and stability, proved that this HPLC fingerprint analysis is reliable and could be applied for quality control of brotowali.
Incipient fault detection study for advanced spacecraft systems
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.
1986-01-01
A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) eight different analysis techniques including signature analysis, high frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition are compared; and (6) small sample statistical analysis is used to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
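The small-sample comparison of probability of detection versus probability of false alarm can be sketched as follows: sweep a decision threshold over a scalar condition indicator measured in repeated baseline and faulted tests and count exceedances. The indicator values, ensemble sizes, and distributions in the example are illustrative assumptions, not data from the study.

```python
import numpy as np

def detection_performance(baseline_features, faulted_features, thresholds=None):
    """Empirical probability of detection vs. false alarm from repeated tests.

    baseline_features: 1-D array of a condition indicator (e.g. band RMS) from healthy runs.
    faulted_features:  the same indicator from runs with a seeded fault.
    Returns arrays (pfa, pod) traced over the candidate thresholds.
    """
    baseline = np.asarray(baseline_features, dtype=float)
    faulted = np.asarray(faulted_features, dtype=float)
    if thresholds is None:
        thresholds = np.unique(np.concatenate([baseline, faulted]))
    pfa = np.array([(baseline > t).mean() for t in thresholds])  # false alarms on healthy runs
    pod = np.array([(faulted > t).mean() for t in thresholds])   # detections on faulted runs
    return pfa, pod

# Example with a small ensemble (8 baseline runs, 8 faulted runs), illustrative values
rng = np.random.default_rng(0)
pfa, pod = detection_performance(rng.normal(1.0, 0.2, 8), rng.normal(1.6, 0.3, 8))
```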
Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel
2017-01-01
Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with folin ciocalteau reagent forming a blue-colored complex with an absorbance maximum at 745 nm. The total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results satisfactorily meeting the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Research on the strategy of underwater united detection fusion and communication using multi-sensor
NASA Astrophysics Data System (ADS)
Xu, Zhenhua; Huang, Jianguo; Huang, Hai; Zhang, Qunfei
2011-09-01
In order to solve the distributed detection fusion problem of underwater target detection when the signal-to-noise ratio (SNR) of the acoustic channel is low, a new strategy for united detection fusion and communication using multiple sensors was proposed. The performance of detection fusion was studied and compared based on the Neyman-Pearson principle when the binary phase shift keying (BPSK) and on-off keying (OOK) modes were used by the local sensors. A comparative simulation and analysis between the optimal likelihood ratio test and the proposed strategy was completed, and both the theoretical analysis and simulation indicate that the proposed new strategy can improve the detection performance effectively. In theory, the proposed strategy of united detection fusion and communication is of great significance to the establishment of an underwater target detection system.
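For reference, the classical benchmark for fusing local binary decisions is the Chair-Varshney log-likelihood-ratio rule; the sketch below implements that rule under the simplifying assumption of error-free reporting channels, so it deliberately ignores the BPSK/OOK acoustic-channel effects that are central to the paper. Local detection and false-alarm probabilities in the example are illustrative.

```python
import numpy as np

def chair_varshney_fusion(decisions, pd, pfa, threshold=0.0):
    """Fuse binary local decisions (1 = target) with the log-likelihood-ratio rule.

    decisions: array of 0/1 local decisions received from the sensors.
    pd, pfa:   each sensor's local detection / false-alarm probability.
    Declares a target when the weighted sum exceeds `threshold` (Neyman-Pearson
    style: the threshold is chosen to meet a global false-alarm constraint).
    """
    decisions = np.asarray(decisions)
    pd, pfa = np.asarray(pd, dtype=float), np.asarray(pfa, dtype=float)
    w1 = np.log(pd / pfa)                    # weight when a sensor reports "target"
    w0 = np.log((1.0 - pd) / (1.0 - pfa))    # weight when a sensor reports "no target"
    statistic = np.where(decisions == 1, w1, w0).sum()
    return statistic > threshold, statistic

detected, stat = chair_varshney_fusion([1, 0, 1, 1],
                                        pd=[0.9, 0.8, 0.85, 0.7],
                                        pfa=[0.05, 0.1, 0.05, 0.2])
```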
Spiral Bevel Gear Damage Detection Using Decision Fusion Analysis
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Handschuh, Robert F.; Afjeh, Abdollah A.
2002-01-01
A diagnostic tool for detecting damage to spiral bevel gears was developed. Two different monitoring technologies, oil debris analysis and vibration, were integrated using data fusion into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual monitoring technologies. This diagnostic tool was evaluated by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spiral Bevel Gear Fatigue Rigs. Data was collected during experiments performed in this test rig when pitting damage occurred. Results show that combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spiral bevel gears.
Detecting a periodic signal in the terrestrial cratering record
NASA Technical Reports Server (NTRS)
Grieve, Richard A. F.; Rupert, James D.; Goodacre, Alan K.; Sharpton, Virgil L.
1988-01-01
A time-series analysis of model periodic data, where the period and phase are known, has been performed in order to investigate whether a significant period can be detected consistently from a mix of random and periodic impacts. Special attention is given to the effect of age uncertainties and random ages in the detection of a periodic signal. An equivalent analysis is performed with observed data on crater ages and compared with the model data, and the effects of the temporal distribution of crater ages on the results from the time-series analysis are studied. Evidence for a consistent 30-m.y. period is found to be weak.
NASA Astrophysics Data System (ADS)
Wunderlich, Adam; Goossens, Bart
2014-03-01
The majority of the literature on task-based image quality assessment has focused on lesion detection tasks, using the receiver operating characteristic (ROC) curve, or related variants, to measure performance. However, since many clinical image evaluation tasks involve both detection and estimation (e.g., estimation of kidney stone composition, estimation of tumor size), there is a growing interest in performance evaluation for joint detection and estimation tasks. To evaluate observer performance on such tasks, Clarkson introduced the estimation ROC (EROC) curve, and the area under the EROC curve as a summary figure of merit. In the present work, we propose nonparametric estimators for practical EROC analysis from experimental data, including estimators for the area under the EROC curve and its variance. The estimators are illustrated with a practical example comparing MRI images reconstructed from different k-space sampling trajectories.
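One natural nonparametric estimate of the area under the EROC curve is a utility-weighted Wilcoxon statistic computed over all signal-present/signal-absent image pairs; the sketch below illustrates that idea under assumed variable names, but it is not a reproduction of the specific estimators or variance formulas proposed in the paper.

```python
import numpy as np

def eroc_auc(present_ratings, present_utilities, absent_ratings):
    """Utility-weighted Wilcoxon estimate of the area under the EROC curve.

    present_ratings:   detection ratings on signal-present images.
    present_utilities: utility u(theta_hat, theta) of the parameter estimate made
                       on each signal-present image (e.g. 1 if the size estimate
                       is within tolerance, else 0).
    absent_ratings:    detection ratings on signal-absent images.
    """
    t = np.asarray(present_ratings, dtype=float)
    u = np.asarray(present_utilities, dtype=float)
    s = np.asarray(absent_ratings, dtype=float)
    greater = (t[:, None] > s[None, :]).astype(float)  # pairwise rating comparisons
    ties = 0.5 * (t[:, None] == s[None, :])            # rating ties count half
    return float((u[:, None] * (greater + ties)).mean())
```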
Gonçalves, Luís Moreira; Magalhães, Paulo Jorge; Valente, Inês Maria; Pacheco, João Grosso; Dostálek, Pavel; Sýkora, David; Rodrigues, José António; Barros, Aquiles Araújo
2010-06-11
In this work, a recently developed extraction technique for sample preparation aimed at the analysis of volatile and semi-volatile compounds, named gas-diffusion microextraction (GDME), is applied to the chromatographic analysis of aldehydes in beer. Aldehydes, namely acetaldehyde (AA), methylpropanal (MA) and furfural (FA), were simultaneously extracted and derivatized with 2,4-dinitrophenylhydrazine (DNPH); the derivatives were then separated and analyzed by high-performance liquid chromatography with spectrophotometric detection (HPLC-UV). The identity of the eluted compounds was confirmed by high-performance liquid chromatography-atmospheric pressure chemical ionization-mass spectrometry detection in the negative ion mode (HPLC-APCI-MS). The developed methodology showed good repeatability (ca. 5%) and linearity, as well as good limits of detection (AA: 12.3, FA: 1.5 and MA: 5.4 μg/L) and quantification (AA: 41, FA: 4.9 and MA: 18 μg/L); it also appears to be competitive in terms of speed and cost of analysis. Copyright 2010 Elsevier B.V. All rights reserved.
Systems and methods for detection of blowout precursors in combustors
Lieuwen, Tim C.; Nair, Suraj
2006-08-15
The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device, performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on received pressure data, and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.
Reversed-phase high-performance liquid chromatography of sulfur mustard in water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raghuveeran, C.D.; Malhotra, R.C.; Dangi, R.S.
1993-01-01
A reversed-phase high-performance liquid chromatography method for the detection and quantitation of sulfur mustard (HD) in water is described, with detection at 200 nm. The solubility-based detection revealed that only extremely low quantities of HD (4 to 5 mg/L) are soluble in water. Experience shows that water is still the medium of choice for the analysis of HD in water and aqueous effluents, in spite of the minor handicap of its half-life of ca. 4 minutes, which only calls for speedy analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gundlach-Graham, Alexander W.; Dennis, Elise; Ray, Steven J.
An inductively coupled plasma distance-of-flight mass spectrometer (ICP-DOFMS) has been coupled with laser-ablation (LA) sample introduction for the elemental analysis of solids. ICP-DOFMS is well suited for the analysis of laser-generated aerosols because it offers both high-speed mass analysis and simultaneous multi-elemental detection. Here, we evaluate the analytical performance of the LA-ICP-DOFMS instrument, equipped with a microchannel plate-based imaging detector, for the measurement of steady-state LA signals, as well as transient signals produced from single LA events. Steady-state detection limits are 1 mg g⁻¹, and absolute single-pulse LA detection limits are 200 fg for uranium; the system is shown capable of performing time-resolved single-pulse LA analysis. By leveraging the benefits of simultaneous multi-elemental detection, we also attain a good shot-to-shot reproducibility of 6% relative standard deviation (RSD) and isotope-ratio precision of 0.3% RSD with a 10 s integration time.
NASA Astrophysics Data System (ADS)
Shorts, Vincient F.
1994-09-01
The Janus combat simulation offers the user a wide variety of weather effects options to employ during the execution of any simulation run, which can directly influence detection of opposing forces. Realistic weather effects are required if the simulation is to accurately reproduce 'real world' results. This thesis examines the mathematics of the Janus weather effects models. One weather effect option in Janus is the sky-to-ground brightness ratio (SGR). SGR affects an optical sensor's ability to detect targets. It is a measure of the sun angle in relation to the horizon. A review of the derivation of SGR is performed, and an analysis of SGR's effect on the number of optical detections and detection ranges is performed using an unmanned aerial vehicle (UAV) search scenario. For comparison, the UAVs are equipped with a combination of optical and thermal sensors.
Barcode DNA length polymorphisms vs fatty acid profiling for adulteration detection in olive oil.
Uncu, Ali Tevfik; Uncu, Ayse Ozgur; Frary, Anne; Doganlar, Sami
2017-04-15
The aim of this study was to compare the performance of a DNA-barcode assay with fatty acid profile analysis to authenticate the botanical origin of olive oil. To achieve this aim, we performed a PCR-capillary electrophoresis (PCR-CE) approach on olive oil: seed oil blends using the plastid trnL (UAA) intron barcode. In parallel to genomic analysis, we subjected the samples to gas chromatography analysis of fatty acid composition. While the PCR-CE assay proved equally efficient as gas chromatography analysis in detecting adulteration with soybean, palm, rapeseed, sunflower, sesame, cottonseed and peanut oils, it was superior to the widely utilized analytical chemistry approach in revealing the adulterant species and detecting small quantities of corn and safflower oils in olive oil. Moreover, the DNA-based test correctly identified all tested olive oil: hazelnut oil blends whereas it was not feasible to detect hazelnut oil adulteration through fatty acid profile analysis. Thus, the present research has shown the feasibility of a PCR-CE barcode assay to detect adulteration in olive oil. Copyright © 2016 Elsevier Ltd. All rights reserved.
Information theoretic analysis of canny edge detection in visual communication
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2011-06-01
In general edge detection evaluation, edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission and display processes that do impact the quality of the acquired image and thus the resulting edge image. We propose a new information-theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. The edge detection algorithm here is considered to achieve high performance only if the information rate from the scene to the edge approaches the maximum possible. Thus, by setting the initial conditions of the visual communication system as constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift-variant, we perform the estimation for a set of different system environment conditions using simulations. In our paper we will first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we will assess the Canny operator using information-theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.
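A Monte Carlo estimate of the operator's output power spectrum, in the spirit of the simulation-based PSD estimation described above, might look like the sketch below. The scene model, the use of scikit-image's Canny implementation, and the definition of the empirical PSD as the averaged periodogram of the edge maps are all assumptions made for illustration and not the paper's formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import canny

def empirical_output_psd(scene_generator, n_trials=100, sigma=1.0):
    """Average 2-D periodogram of Canny edge maps over many simulated scenes."""
    psd = None
    for _ in range(n_trials):
        edges = canny(scene_generator(), sigma=sigma).astype(float)
        spectrum = np.abs(np.fft.fft2(edges - edges.mean())) ** 2
        psd = spectrum if psd is None else psd + spectrum
    return np.fft.fftshift(psd / n_trials)

# Illustrative scene model: low-pass-filtered random field plus additive sensor noise
rng = np.random.default_rng(1)

def make_scene(size=128, blur=2.0, noise=0.05):
    clean = gaussian_filter(rng.standard_normal((size, size)), blur)
    return clean + noise * rng.standard_normal((size, size))

psd_estimate = empirical_output_psd(make_scene, n_trials=50)
```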
Di Segni, Mattia; de Soccio, Valeria; Cantisani, Vito; Bonito, Giacomo; Rubini, Antonello; Di Segni, Gabriele; Lamorte, Sveva; Magri, Valentina; De Vito, Corrado; Migliara, Giuseppe; Bartolotta, Tommaso Vincenzo; Metere, Alessio; Giacomelli, Laura; de Felice, Carlo; D'Ambrosio, Ferdinando
2018-06-01
To assess the diagnostic performance and the potential as a teaching tool of S-detect in the assessment of focal breast lesions. 61 patients (age 21-84 years) with benign breast lesions under follow-up or candidates for pathological sampling, or with suspicious lesions candidates for biopsy, were enrolled. The study comprised a prospective and a retrospective phase. In the prospective phase, after completion of baseline US by an experienced breast radiologist and S-detect assessment, 5 operators with different experience and dedication to breast radiology performed elastographic exams. In the retrospective phase, the 5 operators performed a retrospective assessment and categorized lesions with the BI-RADS 2013 lexicon. Integration of S-detect into the in-training operators' evaluations was performed by giving priority to S-detect analysis in case of disagreement. 2 × 2 contingency tables and ROC analysis were used to assess the diagnostic performances; inter-rater agreement was measured with Cohen's k; Bonferroni's test was used to compare performances. A significance threshold of p = 0.05 was adopted. All operators showed sensitivity > 90% and varying specificity (50-75%); S-detect showed sensitivity > 90% and 70.8% specificity, with inter-rater agreement ranging from moderate to good. Lower specificities were improved by the addition of S-detect. The addition of elastography did not lead to any improvement of the diagnostic performance. S-detect is a feasible tool for the characterization of breast lesions; it has potential as a teaching tool for less experienced operators.
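The diagnostic metrics used in this kind of study are straightforward to compute from the raw calls; the sketch below derives sensitivity and specificity from a 2 × 2 contingency table and Cohen's kappa for two raters, assuming binary (0 = benign, 1 = malignant) codings. It is a generic illustration, not the statistical pipeline used in the study.

```python
import numpy as np

def sens_spec(pred, truth):
    """Sensitivity and specificity from binary calls (1 = malignant) vs. pathology."""
    pred, truth = np.asarray(pred), np.asarray(truth)
    tp = np.sum((pred == 1) & (truth == 1))
    tn = np.sum((pred == 0) & (truth == 0))
    fp = np.sum((pred == 1) & (truth == 0))
    fn = np.sum((pred == 0) & (truth == 1))
    return tp / (tp + fn), tn / (tn + fp)

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary calls."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    po = np.mean(a == b)                                                 # observed agreement
    pe = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))   # chance agreement
    return (po - pe) / (1 - pe)
```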
Tao, Lingyan; Zhang, Qing; Wu, Yongjiang; Liu, Xuesong
2016-12-01
In this study, a fast and effective high-performance liquid chromatography method was developed to simultaneously obtain a fingerprint chromatogram and quantitative analysis of four index compounds, gallic acid, chlorogenic acid, albiflorin and paeoniflorin, of the traditional Chinese medicine Moluodan Concentrated Pill. The method was performed using a Waters X-bridge C18 reversed-phase column on an Agilent 1200S high-performance liquid chromatography system coupled with diode array detection. The mobile phase was composed of 20 mmol/L phosphate solution and acetonitrile at a flow rate of 1 mL/min, under a detection temperature of 30°C and a UV detection wavelength of 254 nm. After the methodology validation, 16 batches of Moluodan Concentrated Pill were analyzed by this high-performance liquid chromatography method, and both qualitative and quantitative evaluation results were achieved by similarity analysis, principal component analysis and hierarchical cluster analysis. The results of these three chemometric methods were in good agreement and all indicated that batch 10 and batch 16 showed significant differences from the other 14 batches. This suggests that the developed high-performance liquid chromatography method can be applied to the quality evaluation of Moluodan Concentrated Pill. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Information theoretic analysis of edge detection in visual communication
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2010-08-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information theory based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene, and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge detection algorithm is regarded to have high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There is no common tool that can be used to evaluate the performance of the different algorithms and to guide the selection of the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.
García Vicente, Ana María; Delgado-Bolton, Roberto C; Amo-Salas, Mariano; López-Fidalgo, Jesús; Caresia Aróztegui, Ana Paula; García Garzón, José Ramón; Orcajo Rincón, Javier; García Velloso, María José; de Arcocha Torres, María; Alvárez Ruíz, Soledad
2017-08-01
The detection of occult cancer in patients suspected of having a paraneoplastic neurological syndrome (PNS) poses a diagnostic challenge. The aim of our study was to perform a systematic review and meta-analysis to assess the diagnostic performance of FDG PET for the detection of occult malignant disease responsible for PNS. A systematic review of the literature (MEDLINE, EMBASE, Cochrane, and DARE) was undertaken to identify studies published in any language. The search strategy was structured after addressing clinical questions regarding the validity or usefulness of the test, following the PICO framework. Inclusion criteria were studies involving patients with PNS in whom FDG PET was performed to detect malignancy, and which reported sufficient primary data to allow calculation of diagnostic accuracy parameters. When possible, a meta-analysis was performed to calculate the joint sensitivity, specificity, and detection rate for malignancy (with 95% confidence intervals [CIs]), as well as a subgroup analysis based on patient characteristics (antibodies, syndrome). The comprehensive literature search revealed 700 references. Sixteen studies met the inclusion criteria and were ultimately selected. Most of the studies were retrospective (12/16). For the quality assessment, the QUADAS-2 tool was applied to assess the risk of bias. Across 16 studies (793 patients), the joint sensitivity, specificity, and detection rate for malignancy with FDG PET were 0.87 (95% CI: 0.80-0.93), 0.86 (95% CI: 0.83-0.89), and 14.9% (95% CI: 11.5-18.7), respectively. The area under the curve (AUC) of the summary ROC curve was 0.917. Homogeneity of results was observed for sensitivity but not for specificity. Some of the individual studies showed large 95% CIs as a result of small sample size. The results of our meta-analysis reveal high diagnostic performance of FDG PET in the detection of malignancy responsible for PNS, not affected by the presence of onconeural antibodies or clinical characteristics.
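For readers unfamiliar with the pooling step, a minimal fixed-effect sketch of joint sensitivity and specificity estimation on hypothetical 2 × 2 counts is shown below; the meta-analysis itself would typically use dedicated bivariate or SROC models, and the counts here are invented rather than taken from the 16 studies.

```python
import numpy as np

# Hypothetical per-study 2x2 counts: (TP, FN, TN, FP)
studies = [(20, 3, 40, 6), (15, 2, 55, 9), (30, 4, 70, 12)]

def pooled_proportion(successes, totals):
    """Inverse-variance fixed-effect pooling on the logit scale, with 95% CI."""
    p = (np.asarray(successes) + 0.5) / (np.asarray(totals) + 1.0)  # continuity correction
    logit = np.log(p / (1 - p))
    var = 1.0 / (np.asarray(totals) * p * (1 - p))
    w = 1.0 / var
    pooled = np.sum(w * logit) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    inv = lambda x: 1.0 / (1.0 + np.exp(-x))
    return inv(pooled), (inv(pooled - 1.96 * se), inv(pooled + 1.96 * se))

tp, fn, tn, fp = map(np.array, zip(*studies))
sens, sens_ci = pooled_proportion(tp, tp + fn)
spec, spec_ci = pooled_proportion(tn, tn + fp)
print(f"pooled sensitivity {sens:.2f} (95% CI {sens_ci[0]:.2f}-{sens_ci[1]:.2f}), "
      f"pooled specificity {spec:.2f} (95% CI {spec_ci[0]:.2f}-{spec_ci[1]:.2f})")
```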
Denaturing high-performance liquid chromatography for mutation detection and genotyping.
Fackenthal, Donna Lee; Chen, Pei Xian; Howe, Ted; Das, Soma
2013-01-01
Denaturing high-performance liquid chromatography (DHPLC) is an accurate and efficient screening technique used for detecting DNA sequence changes by heteroduplex analysis. It can also be used for genotyping of single nucleotide polymorphisms (SNPs). The high sensitivity of DHPLC has made this technique one of the most reliable approaches to mutation analysis and, therefore, used in various areas of genetics, both in the research and clinical arena. This chapter describes the methods used for mutation detection analysis and the genotyping of SNPs by DHPLC on the WAVE™ system from Transgenomic Inc. ("WAVE" and "DNASep" are registered trademarks, and "Navigator" is a trademark, of Transgenomic, used with permission. All other trademarks are property of the respective owners).
Early Improper Motion Detection in Golf Swings Using Wearable Motion Sensors: The First Approach
Stančin, Sara; Tomažič, Sašo
2013-01-01
This paper presents an analysis of a golf swing to detect improper motion in the early phase of the swing. Led by the desire to achieve a consistent shot outcome, a particular golfer would (in multiple trials) prefer to perform completely identical golf swings. In reality, some deviations from the desired motion are always present due to the comprehensive nature of the swing motion. Swing motion deviations that are not detrimental to performance are acceptable. This analysis is conducted using a golfer's leading arm kinematic data, which are obtained from a golfer wearing a motion sensor comprising gyroscopes and accelerometers. Applying principal component analysis (PCA) to the reference observations of properly performed swings, the PCA components of acceptable swing motion deviations are established. Using these components, the motion deviations in the observations of other swings are examined. Any unacceptable deviations that are detected indicate an improper swing motion. Arbitrarily long observations of an individual player's swing sequences can be included in the analysis. The results obtained for the considered example show an improper swing motion in the early phase of the swing, i.e., the first part of the backswing. An early detection method for improper swing motions that is conducted on an individual basis provides assistance for performance improvement. PMID:23752563
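A rough sketch of the idea (not the authors' implementation): fit PCA to reference swings, treat the retained components as the subspace of acceptable deviations, and flag a new swing whose residual outside that subspace is unusually large. The data sizes, component count, and threshold rule are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
reference = rng.normal(size=(30, 600))   # 30 proper swings x 600 kinematic samples (placeholder)
new_swing = rng.normal(size=600)

pca = PCA(n_components=5).fit(reference)                  # acceptable-deviation subspace
recon = pca.inverse_transform(pca.transform(new_swing[None, :]))[0]
residual = np.linalg.norm(new_swing - recon)

# Threshold taken from the residuals of the reference swings themselves
ref_recon = pca.inverse_transform(pca.transform(reference))
ref_residuals = np.linalg.norm(reference - ref_recon, axis=1)
threshold = ref_residuals.mean() + 3 * ref_residuals.std()

print("improper swing" if residual > threshold else "acceptable swing")
```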
An RFI Detection Algorithm for Microwave Radiometers Using Sparse Component Analysis
NASA Technical Reports Server (NTRS)
Mohammed-Tano, Priscilla N.; Korde-Patel, Asmita; Gholian, Armen; Piepmeier, Jeffrey R.; Schoenwald, Adam; Bradley, Damon
2017-01-01
Radio Frequency Interference (RFI) is a threat to passive microwave measurements and if undetected, can corrupt science retrievals. The sparse component analysis (SCA) for blind source separation has been investigated to detect RFI in microwave radiometer data. Various techniques using SCA have been simulated to determine detection performance with continuous wave (CW) RFI.
Smart ECG Monitoring Patch with Built-in R-Peak Detection for Long-Term HRV Analysis.
Lee, W K; Yoon, H; Park, K S
2016-07-01
Since heart rate variability (HRV) analysis is widely used to evaluate the physiological status of the human body, devices specifically designed for such applications are needed. To this end, we developed a smart electrocardiography (ECG) patch. The smart patch measures ECG using three electrodes integrated into the patch, filters the measured signals to minimize noise, performs analog-to-digital conversion, and detects R-peaks. The measured raw ECG data and the interval between the detected R-peaks can be recorded to enable long-term HRV analysis. Experiments were performed to evaluate the performance of the built-in R-wave detection, robustness of the device under motion, and applicability to the evaluation of mental stress. The R-peak detection results obtained with the device exhibited a sensitivity of 99.29%, a positive predictive value of 100.00%, and an error of 0.71%. The device also exhibited less motional noise than conventional ECG recording, being stable up to a walking speed of 5 km/h. When applied to mental stress analysis, the device evaluated the variation in HRV parameters in the same way as a normal ECG, with very little difference. This device can help users better understand their state of health and provide physicians with more reliable data for objective diagnosis.
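As a small illustration of the long-term HRV analysis the patch is meant to enable, common time-domain indices can be computed directly from the stored R-peak intervals; the RR series below is synthetic.

```python
import numpy as np

rr_ms = np.array([812, 795, 830, 805, 790, 845, 820, 800], dtype=float)  # synthetic RR intervals (ms)

mean_rr = rr_ms.mean()
sdnn = rr_ms.std(ddof=1)                               # overall RR variability
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))          # short-term (beat-to-beat) variability
pnn50 = np.mean(np.abs(np.diff(rr_ms)) > 50) * 100     # % of successive differences > 50 ms
heart_rate = 60000.0 / mean_rr

print(f"mean RR {mean_rr:.0f} ms, HR {heart_rate:.1f} bpm, SDNN {sdnn:.1f} ms, "
      f"RMSSD {rmssd:.1f} ms, pNN50 {pnn50:.0f}%")
```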
Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution
NASA Astrophysics Data System (ADS)
Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin
2018-06-01
Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted the spike candidates by imposing restrictions on parameters regarding spike shape and eigenvalues reflecting detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates for which the pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting the score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the proposed method using two statistical measures of precision and recall with respect to detection performance. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects with the score threshold that maximized the F-measure, with 58.6 ± 36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could be used to perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.
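A loose, single-reference-channel sketch of the pipeline's shape (gradient-based negative-peak candidates, then clustering of the 19-electrode amplitude pattern and removal of small clusters). This is only an illustration under invented parameters and synthetic data, not the published algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
fs = 200                                    # Hz (assumed sampling rate)
eeg = rng.normal(size=(19, 10 * fs))        # 19 electrodes x 10 s of noise (placeholder)

def negative_peak_candidates(x, grad_thresh=1.5):
    """Samples whose backward gradient is strongly negative and forward gradient
    strongly positive, i.e. sharp negative peaks."""
    g_back = np.diff(x)[:-1]        # x[t] - x[t-1]
    g_fwd = np.diff(x)[1:]          # x[t+1] - x[t]
    return np.where((g_back < -grad_thresh) & (g_fwd > grad_thresh))[0] + 1

# Collect candidates on one reference channel, then describe each candidate by the
# instantaneous amplitude distribution over all 19 electrodes and cluster those patterns.
cand = negative_peak_candidates(eeg[0])
if cand.size >= 3:
    patterns = eeg[:, cand].T                                # candidates x electrodes
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(patterns)
    counts = np.bincount(labels)
    keep = cand[np.isin(labels, np.where(counts >= 2)[0])]   # drop very small clusters
    print(f"{cand.size} candidates, {keep.size} kept after clustering")
else:
    print("too few candidates in this synthetic example")
```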
Dehghanian, Fatemeh; Silawi, Mohammad; Tabei, Seyed M B
2017-02-01
Deficiency of phenylalanine hydroxylase (PAH) enzyme and elevation of phenylalanine in body fluids cause phenylketonuria (PKU). The gold standard for confirming PKU and PAH deficiency is detecting causal mutations by direct sequencing of the coding exons and splicing-involved sequences of the PAH gene. Furthermore, haplotype analysis could be considered as an auxiliary approach for detecting PKU causative mutations before direct sequencing of the PAH gene, by comparing previously detected mutation-linked haplotypes with the haplotypes of new PKU cases with undetermined mutations. In this study, 13 unrelated classical PKU patients were enrolled for the detection of causative mutations. Mutations were identified by polymerase chain reaction (PCR) and direct sequencing in all patients. After that, haplotype analysis was performed by studying VNTR and PAHSTR markers (linked genetic markers of the PAH gene) through application of PCR and capillary electrophoresis (CE). Mutation analysis was performed successfully and the detected mutations were as follows: c.782G>A, c.754C>T, c.842C>G, c.113-115delTCT, c.688G>A, and c.696A>G. Additionally, PAHSTR/VNTR haplotypes were determined to discover the haplotypes linked to each mutation. Mutation detection is the best approach for confirming PAH enzyme deficiency in PKU patients. Due to the relatively large size of the PAH gene and the high cost of direct sequencing in developing countries, haplotype analysis could be used before DNA sequencing and mutation detection as a faster and cheaper way of identifying probably mutated exons.
Performance comparison of single and dual-excitation-wavelength resonance-Raman explosives detectors
NASA Astrophysics Data System (ADS)
Yellampalle, Balakishore; Martin, Robert; Witt, Kenneth; McCormick, William; Wu, Hai-Shan; Sluch, Mikhail; Ice, Robert; Lemoff, Brian
2017-05-01
Deep-ultraviolet Raman spectroscopy is a very useful approach for standoff detection of explosive traces. Using two simultaneous excitation wavelengths improves the specificity and sensitivity of standoff explosive detection. The High Technology Foundation developed a highly compact prototype resonance-Raman explosives detector. In this work, we discuss the relative performance of a dual-excitation sensor compared to a single-excitation sensor. We present a trade-space analysis comparing three representative Raman systems with similar size, weight, and power. The analysis takes into account cost, spectral resolution, detection/identification time, and overall system benefit.
Tornado detection data reduction and analysis
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1977-01-01
Data processing and analysis was provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.
NASA Technical Reports Server (NTRS)
Hopson, Charles B.
1987-01-01
The results of an analysis performed on seven successive Space Shuttle Main Engine (SSME) static test firings, utilizing envelope detection of external accelerometer data are discussed. The results clearly show the great potential for using envelope detection techniques in SSME incipient failure detection.
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor)
2009-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Reid, Ray D. (Inventor); Hug, William F. (Inventor)
2010-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.
Ergonomics for enhancing detection of machine abnormalities.
Illankoon, Prasanna; Abeysekera, John; Singh, Sarbjeet
2016-10-17
Detecting abnormal machine conditions is of great importance in an autonomous maintenance environment. Ergonomic aspects can be invaluable when detection of machine abnormalities using human senses is examined. This research outlines the ergonomic issues involved in detecting machine abnormalities and suggests how ergonomics would improve such detections. Cognitive Task Analysis was performed in a plant in Sri Lanka where Total Productive Maintenance is being implemented to identify sensory types that would be used to detect machine abnormalities and relevant Ergonomic characteristics. As the outcome of this research, a methodology comprising of an Ergonomic Gap Analysis Matrix for machine abnormality detection is presented.
Top-attack modeling and automatic target detection using synthetic FLIR scenery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Penn, Joseph A.
2004-09-01
A series of experiments have been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect imbedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver-Operating-Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that using these tools, a detailed, physically meaningful, target detection analysis is possible and that scenario specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require the extension of these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competitive detection algorithms by providing well defined, and controllable target detection scenarios, as well as for the training and testing of expert human observers.
49 CFR 240.309 - Railroad oversight responsibilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... reported train accidents attributed to poor safety performance by locomotive engineers; (3) The number and... and analysis concerning the administration of its program for responding to detected instances of poor... analysis shall involve: (1) The number and nature of the instances of detected poor safety conduct...
NASA Astrophysics Data System (ADS)
Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie
2011-05-01
Laser induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive, chemical analysis of substances with the benefit of little to no sample preparation. Therefore, LIBS is a viable technology for the detection of substances of interest in near real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues; however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection; however, the classification techniques developed on such data perform best against residue/substrate pairs that were included in model training but do not perform well when the residue/substrate pairs are not in the training set. Specifically, residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model LIBS spectra resulting from the residue and substrate to attempt to separate the response from each of the two components. This separation process is performed jointly with classifier design to ensure that the classifier that is developed is able to detect residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to variations in substrate compared to standard chemometric techniques for residue detection.
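For context, a minimal partial-least-squares discriminant-analysis baseline of the kind referenced above, run on synthetic "spectra"; the residue/substrate joint modelling proposed in the paper is not reproduced, and the injected spectral signature is invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 120, 300                       # samples x spectral channels (placeholder)
X = rng.normal(size=(n, p))
y = rng.integers(0, 2, size=n)        # 1 = residue present, 0 = substrate only
X[y == 1, 40:45] += 1.0               # inject a weak synthetic emission-line signature

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = (pls.predict(X_te).ravel() > 0.5).astype(int)   # threshold the PLS score at 0.5
print("test accuracy:", np.mean(y_hat == y_te))
```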
Berry, Nadine Kaye; Bain, Nicole L; Enjeti, Anoop K; Rowlings, Philip
2014-01-01
Aim To evaluate the role of whole genome comparative genomic hybridisation microarray (array-CGH) in detecting genomic imbalances as compared to conventional karyotype (GTG-analysis) or myeloma specific fluorescence in situ hybridisation (FISH) panel in a diagnostic setting for plasma cell dyscrasia (PCD). Methods A myeloma-specific interphase FISH (i-FISH) panel was carried out on CD138 PC-enriched bone marrow (BM) from 20 patients having BM biopsies for evaluation of PCD. Whole genome array-CGH was performed on reference (control) and neoplastic (test patient) genomic DNA extracted from CD138 PC-enriched BM and analysed. Results Comparison of techniques demonstrated a much higher detection rate of genomic imbalances using array-CGH. Genomic imbalances were detected in 1, 19 and 20 patients using GTG-analysis, i-FISH and array-CGH, respectively. Genomic rearrangements were detected in one patient using GTG-analysis and seven patients using i-FISH, while none were detected using array-CGH. I-FISH was the most sensitive method for detecting gene rearrangements and GTG-analysis was the least sensitive method overall. All copy number aberrations observed in GTG-analysis were detected using array-CGH and i-FISH. Conclusions We show that array-CGH performed on CD138-enriched PCs significantly improves the detection of clinically relevant and possibly novel genomic abnormalities in PCD, and thus could be considered as a standard diagnostic technique in combination with IGH rearrangement i-FISH. PMID:23969274
Rzeppa, S; Heinrich, G; Hemmersbach, P
2015-01-01
Improvements in doping analysis can be effected by speeding up analysis time and extending the detection time. Therefore, direct detection of phase II conjugates of doping agents, especially anabolic androgenic steroids (AAS), is proposed. Besides direct detection of conjugates with glucuronic acid, the analysis of sulfate conjugates, which are usually not part of the routine doping control analysis, can be of high interest. Sulfate conjugates of methandienone and methyltestosterone metabolites have already been identified as long-term metabolites. This study presents the synthesis of sulfate conjugates of six commonly used AAS and their metabolites: trenbolone, nandrolone, boldenone, methenolone, mesterolone, and drostanolone. In the following these sulfate conjugates were used for development of a fast and easy analysis method based on sample preparation using solid phase extraction with a mixed-mode sorbent and detection by high performance liquid chromatography coupled to tandem mass spectrometry (HPLC-MS/MS). Validation demonstrated the suitability of the method with regard to the criteria given by the technical documents of the World Anti-Doping Agency (WADA). In addition, suitability has been proven by successful detection of the synthesized sulfate conjugates in excretion urines and routine doping control samples. Copyright © 2015 John Wiley & Sons, Ltd.
Kirchner, Elsa A; Kim, Su Kyoung
2018-01-01
Event-related potentials (ERPs) are often used in brain-computer interfaces (BCIs) for communication or system control for enhancing or regaining control for motor-disabled persons. Especially results from single-trial EEG classification approaches for BCIs support correlations between single-trial ERP detection performance and ERP expression. Hence, BCIs can be considered as a paradigm shift contributing to new methods with strong influence on both neuroscience and clinical applications. Here, we investigate the relevance of the choice of training data and classifier transfer for the interpretability of results from single-trial ERP detection. In our experiments, subjects performed a visual-motor oddball task with motor-task relevant infrequent (targets), motor-task irrelevant infrequent (deviants), and motor-task irrelevant frequent (standards) stimuli. Under the dual-task condition, a secondary sensorimotor task was performed, in contrast to the simple-task condition. For evaluation, average ERP analysis and single-trial detection analysis with different numbers of electrodes were performed. Further, classifier transfer was investigated between the simple and dual task. Parietal positive ERPs evoked by target stimuli (but not by deviants) were expressed more strongly under the dual-task condition, which is discussed as an increase of task emphasis and of brain processes involved in task coordination and change of task set. Highest classification performance was found for targets irrespective of whether all 62, 6 or 2 parietal electrodes were used. Further, higher detection performance for targets compared to standards was achieved under the dual-task compared to the simple-task condition when training took place on data from 2 parietal electrodes, corresponding to the results of the ERP average analysis. Classifier transfer between tasks improves classification performance in case training took place on more varied examples (from the dual task). In summary, we showed that P300 and overlying parietal positive ERPs can successfully be detected while subjects are performing additional ongoing motor activity. This supports single-trial detection of ERPs evoked by target events to, e.g., infer a patient's attentional state during therapeutic intervention.
The Tissue Analysis Core within the AIDS and Cancer Virus Program will process, embed and perform microtomy on fixed tissue samples presented in ethanol. HIV/SIV in situ hybridization for detection of vRNA and vDNA will be performed using the next-gene
Borovcová, Lucie; Pauk, Volodymyr; Lemr, Karel
2018-05-01
New psychoactive substances represent serious social and health problem as tens of new compounds are detected in Europe annually. They often show structural proximity or even isomerism, which complicates their analysis. Two methods based on ultra high performance supercritical fluid chromatography and ultra high performance liquid chromatography with mass spectrometric detection were validated and compared. A simple dilute-filter-and-shoot protocol utilizing propan-2-ol or methanol for supercritical fluid or liquid chromatography, respectively, was proposed to detect and quantify 15 cathinones and phenethylamines in human urine. Both methods offered fast separation (<3 min) and short total analysis time. Precision was well <15% with a few exceptions in liquid chromatography. Limits of detection in urine ranged from 0.01 to 2.3 ng/mL, except for cathinone (5 ng/mL) in supercritical fluid chromatography. Nevertheless, this technique distinguished all analytes including four pairs of isomers, while liquid chromatography was unable to resolve fluoromethcathinone regioisomers. Concerning matrix effects and recoveries, supercritical fluid chromatography produced more uniform results for different compounds and at different concentration levels. This work demonstrates the performance and reliability of supercritical fluid chromatography and corroborates its applicability as an alternative tool for analysis of new psychoactive substances in biological matrixes. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Bueno, R. A.
1977-01-01
Results of the generalized likelihood ratio (GLR) technique for the detection of failures in aircraft application are presented, and its relationship to the properties of the Kalman-Bucy filter is examined. Under the assumption that the system is perfectly modeled, the detectability and distinguishability of four failure types are investigated by means of analysis and simulations. Detection of failures is found satisfactory, but problems in identifying correctly the mode of a failure may arise. These issues are closely examined as well as the sensitivity of GLR to modeling errors. The advantages and disadvantages of this technique are discussed, and various modifications are suggested to reduce its limitations in performance and computational complexity.
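A minimal one-dimensional illustration of the GLR idea discussed above (not the report's aircraft formulation): test the Kalman-filter innovation sequence for an unknown constant bias that begins at some unknown time; maximizing the likelihood ratio over the bias magnitude yields a closed-form statistic. The window, threshold, and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 1.0
innov = rng.normal(scale=sigma, size=400)
innov[250:] += 0.8          # a sensor bias appears at t = 250 (synthetic "failure")

def glr_bias(innovations, sigma, window=60):
    """GLR statistic for an unknown constant bias in white Gaussian innovations,
    maximised over candidate onset times within a sliding window."""
    t = len(innovations)
    best = np.zeros(t)
    for i in range(1, t):
        stats = []
        for k in range(max(0, i - window), i + 1):
            s = innovations[k:i + 1].sum()
            n = i + 1 - k
            stats.append(s * s / (2.0 * sigma**2 * n))   # LLR maximised over bias size
        best[i] = max(stats)
    return best

g = glr_bias(innov, sigma)
threshold = 10.0            # illustrative; in practice set from a false-alarm requirement
alarms = np.where(g > threshold)[0]
print("first alarm at sample:", alarms[0] if alarms.size else "none")
```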
Automatic detection of lexical change: an auditory event-related potential study.
Muller-Gass, Alexandra; Roye, Anja; Kirmse, Ursula; Saupe, Katja; Jacobsen, Thomas; Schröger, Erich
2007-10-29
We investigated the detection of rare task-irrelevant changes in the lexical status of speech stimuli. Participants performed a nonlinguistic task on word and pseudoword stimuli that occurred, in separate conditions, rarely or frequently. Task performance for pseudowords deteriorated relative to words, suggesting unintentional lexical analysis. Furthermore, rare word and pseudoword changes had a similar effect on the event-related potentials, starting as early as 165 ms. This is the first demonstration of the automatic detection of change in lexical status that is not based on a co-occurring acoustic change. We propose that, following lexical analysis of the incoming stimuli, a mental representation of the lexical regularity is formed and used as a template against which lexical change can be detected.
Signal Detection Analysis of Computer Enhanced Group Decision Making Strategies
2007-11-01
USDA-ARS?s Scientific Manuscript database
In this study, real-time RT-PCR assays were combined with high resolution melting (HRM) analysis for the simultaneous detection of Cherry necrotic rusty mottle virus (CNRMV) and Cherry green ring mottle virus (CGRMV) infection in sweet cherry trees. Detection of CNRMV and CGRMV was performed using a...
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
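A bare-bones version of a Euclidean-distance detector of the MED type discussed above: compare each new multi-parameter water-quality sample against a moving baseline and raise an event flag when the scaled distance exceeds a threshold. The parameter set, thresholds, and injected event are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)
# Columns stand in for standardised water-quality parameters (e.g. chlorine, turbidity, conductivity, pH)
normal = rng.normal(size=(500, 4))
event = rng.normal(size=(50, 4)) + np.array([1.5, 2.0, 0.5, 0.0])   # injected contamination signature
stream = np.vstack([normal, event])

def euclidean_distance_detector(data, window=100, threshold=3.0):
    flags = np.zeros(len(data), dtype=bool)
    for t in range(window, len(data)):
        baseline = data[t - window:t]
        centre, scale = baseline.mean(axis=0), baseline.std(axis=0) + 1e-9
        dist = np.linalg.norm((data[t] - centre) / scale)   # distance to the moving baseline
        flags[t] = dist > threshold
    return flags

flags = euclidean_distance_detector(stream)
print("flagged samples after injection:", int(flags[500:].sum()), "of 50")
```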
Transmission Bearing Damage Detection Using Decision Fusion Analysis
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Lewicki, David G.; Decker, Harry J.
2004-01-01
A diagnostic tool was developed for detecting fatigue damage to rolling element bearings in an OH-58 main rotor transmission. Two different monitoring technologies, oil debris analysis and vibration, were integrated using data fusion into a health monitoring system for detecting bearing surface fatigue pitting damage. This integrated system showed improved detection and decision-making capabilities as compared to using individual monitoring technologies. This diagnostic tool was evaluated by collecting vibration and oil debris data from tests performed in the NASA Glenn 500 hp Helicopter Transmission Test Stand. Data were collected during experiments performed in this test rig when two unanticipated bearing failures occurred. Results show that combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spiral bevel gear duplex ball bearings and spiral bevel pinion triplex ball bearings in a main rotor transmission.
Parameter Transient Behavior Analysis on Fault Tolerant Control System
NASA Technical Reports Server (NTRS)
Belcastro, Christine (Technical Monitor); Shin, Jong-Yeob
2003-01-01
In a fault tolerant control (FTC) system, a parameter varying FTC law is reconfigured based on fault parameters estimated by fault detection and isolation (FDI) modules. FDI modules require some time to detect fault occurrences in aero-vehicle dynamics. This paper illustrates analysis of a FTC system based on estimated fault parameter transient behavior which may include false fault detections during a short time interval. Using Lyapunov function analysis, the upper bound of an induced-L2 norm of the FTC system performance is calculated as a function of a fault detection time and the exponential decay rate of the Lyapunov function.
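For reference, the generic dissipation-inequality argument behind an induced-L2 bound of this kind (stated in standard form, not as the paper's specific Lyapunov condition involving the detection time or decay rate): if a storage function V satisfies the inequality below along trajectories, integrating from 0 to T bounds the output energy by the input energy scaled by γ.

```latex
\dot{V}(x(t)) \;\le\; \gamma^{2}\,\|u(t)\|^{2} - \|y(t)\|^{2}
\quad\Longrightarrow\quad
\int_{0}^{T}\|y\|^{2}\,dt \;\le\; \gamma^{2}\int_{0}^{T}\|u\|^{2}\,dt + V(x(0)),
\qquad \text{so } \|y\|_{L_2}\le \gamma\,\|u\|_{L_2}\ \text{when } x(0)=0.
```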
[Rapid detection of caffeine in blood by freeze-out extraction].
Bekhterev, V N; Gavrilova, S N; Kozina, E P; Maslakov, I V
2010-01-01
A new method for the detection of caffeine in blood has been proposed based on the combination of extraction and freezing-out to eliminate the influence of sample matrix. Metrological characteristics of the method are presented. Selectivity of detection is achieved by optimal conditions of analysis by high performance liquid chromatography. The method is technically simple and cost-efficient, it ensures rapid performance of the studies.
Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi
2014-06-01
A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography-tandem quadrupole time of flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma sample was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 µm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on a Triple TOF 5600 quadrupole time of flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma sample was performed by separation on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 µm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on an AB Q-TRAP 4500 triple quadrupole mass spectrometer utilizing an electrospray ionization (ESI) interface operated in negative ion mode and multiple-reaction monitoring (MRM) mode. The qualitative analysis results showed that amygdalin and its metabolite prunasin were detected in the plasma sample. The quantitative analysis results showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990 and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had good precision, with relative standard deviations (RSDs) lower than 9.20%, and the overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.
Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare M; Jonsson, Gudberg K; Anguera, M Teresa
2017-01-01
The influence of game location on performance has been widely examined in sport contexts. Concerning soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (the primary level of performance). This study aimed to detect the effect of game location on the primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions, in both home and away matches played by a top club (Serie A 2012/2013-First Leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with the nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis of complex patterns was performed to get in-depth information on the game structure. This study showed that game tactics were significantly different, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors, was detected in home matches than in the away ones. No significant differences were found in the number of events coded per game between the two conditions. THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going further than frequency-based analyses, making this method an effective tool in supporting sport performance analysis and training.
NASA Astrophysics Data System (ADS)
Bhushan, A.; Sharker, M. H.; Karimi, H. A.
2015-07-01
In this paper, we address outliers in spatiotemporal data streams obtained from sensors placed across geographically distributed locations. Outliers may appear in such sensor data due to various reasons such as instrumental error and environmental change. Real-time detection of these outliers is essential to prevent propagation of errors in subsequent analyses and results. Incremental Principal Component Analysis (IPCA) is one possible approach for detecting outliers in such type of spatiotemporal data streams. IPCA has been widely used in many real-time applications such as credit card fraud detection, pattern recognition, and image analysis. However, the suitability of applying IPCA for outlier detection in spatiotemporal data streams is unknown and needs to be investigated. To fill this research gap, this paper contributes by presenting two new IPCA-based outlier detection methods and performing a comparative analysis with the existing IPCA-based outlier detection methods to assess their suitability for spatiotemporal sensor data streams.
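A compact sketch of the IPCA-based idea on a synthetic stream: update the principal subspace batch by batch with scikit-learn's IncrementalPCA and flag samples whose reconstruction error is far above the bulk of the errors. This is a generic illustration, not either of the paper's two proposed methods, and the data, batch sizes, and robust threshold are assumptions.

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(6)
ipca = IncrementalPCA(n_components=3)
errors = []

for batch_idx in range(20):                       # simulated stream of sensor batches
    batch = rng.normal(size=(50, 8))              # 50 readings x 8 spatially distributed sensors
    if batch_idx == 15:
        batch[::10] += 6.0                        # inject a few gross outliers
    ipca.partial_fit(batch)                       # incrementally update the principal subspace
    recon = ipca.inverse_transform(ipca.transform(batch))
    errors.append(np.linalg.norm(batch - recon, axis=1))

all_err = np.concatenate(errors)
mad = np.median(np.abs(all_err - np.median(all_err)))
threshold = np.median(all_err) + 5 * mad          # robust reconstruction-error threshold
print("outliers flagged:", int((all_err > threshold).sum()))
```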
NASA Astrophysics Data System (ADS)
Fotin, Sergei V.; Yin, Yin; Haldankar, Hrishikesh; Hoffmeister, Jeffrey W.; Periaswamy, Senthil
2016-03-01
Computer-aided detection (CAD) has been used in screening mammography for many years and is likely to be utilized for digital breast tomosynthesis (DBT). Higher detection performance is desirable as it may have an impact on radiologists' decisions and clinical outcomes. Recently, algorithms based on deep convolutional architectures have been shown to achieve state-of-the-art performance in object classification and detection. Similarly, we trained a deep convolutional neural network directly on patches sampled from two-dimensional mammography and reconstructed DBT volumes and compared its performance to a conventional CAD algorithm that is based on computation and classification of hand-engineered features. The detection performance was evaluated on an independent test set of 344 DBT reconstructions (GE SenoClaire 3D, iterative reconstruction algorithm) containing 328 suspicious and 115 malignant soft tissue densities including masses and architectural distortions. Detection sensitivity was measured on a region of interest (ROI) basis at the rate of five detection marks per volume. Moving from the conventional to the deep learning approach resulted in an increase of ROI sensitivity from 0.832 +/- 0.040 to 0.893 +/- 0.033 for suspicious ROIs, and from 0.852 +/- 0.065 to 0.930 +/- 0.046 for malignant ROIs. These results indicate the high utility of deep feature learning in the analysis of DBT data and the high potential of the method for broader medical image analysis tasks.
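An illustrative, deliberately tiny patch classifier of the general kind described above, written with PyTorch; the patch size, layer widths, and training details are assumptions, not the network evaluated in the paper.

```python
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Small CNN mapping a 64x64 grayscale patch to a lesion-probability logit."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Sequential(nn.Flatten(), nn.Linear(64 * 8 * 8, 1))

    def forward(self, x):
        return self.classifier(self.features(x))

model = PatchCNN()
patches = torch.randn(4, 1, 64, 64)                 # placeholder mammography/DBT patches
labels = torch.tensor([1., 0., 1., 0.]).unsqueeze(1)
loss = nn.BCEWithLogitsLoss()(model(patches), labels)
loss.backward()                                     # one illustrative training step (no optimizer shown)
print("example loss:", float(loss))
```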
Fast linear feature detection using multiple directional non-maximum suppression.
Sun, C; Vallotton, P
2009-05-01
The capacity to detect linear features is central to image analysis, computer vision and pattern recognition and has practical applications in areas such as neurite outgrowth detection, retinal vessel extraction, skin hair removal, plant root analysis and road detection. Linear feature detection often represents the starting point for image segmentation and image interpretation. In this paper, we present a new algorithm for linear feature detection using multiple directional non-maximum suppression with symmetry checking and gap linking. Given its low computational complexity, the algorithm is very fast. We show in several examples that it performs very well in terms of both sensitivity and continuity of detected linear features.
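A heavily simplified sketch of the directional non-maximum suppression step (four directions only, without the symmetry checking and gap linking described above): a pixel is kept as a linear-feature candidate if it exceeds a threshold and is a local maximum along at least one tested direction. All parameters and the test image are invented.

```python
import numpy as np

def directional_nms(img, threshold=0.5):
    """Keep pixels that exceed `threshold` and are local maxima along at least
    one of four directions (horizontal, vertical, and the two diagonals)."""
    offsets = [(0, 1), (1, 0), (1, 1), (1, -1)]
    keep = np.zeros_like(img, dtype=bool)
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    for dy, dx in offsets:
        fwd = padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]   # neighbour in +direction
        bwd = padded[1 - dy:1 - dy + h, 1 - dx:1 - dx + w]   # neighbour in -direction
        keep |= (img >= fwd) & (img >= bwd)
    return keep & (img > threshold)

# Synthetic example: a bright vertical line in low-level noise
rng = np.random.default_rng(7)
img = rng.random((64, 64)) * 0.3
img[:, 32] += 0.6
mask = directional_nms(img)
print("line pixels recovered in column 32:", int(mask[:, 32].sum()), "of 64")
```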
Appropriate IMFs associated with cepstrum and envelope analysis for ball-bearing fault diagnosis
NASA Astrophysics Data System (ADS)
Tsao, Wen-Chang; Pan, Min-Chun
2014-03-01
The traditional envelope analysis is an effective method for the fault detection of rolling bearings. However, all the resonant frequency bands must be examined during the bearing-fault detection process. To handle this deficiency, this paper proposes using the empirical mode decomposition (EMD) to select a proper intrinsic mode function (IMF) for the subsequent detection tools; here both envelope analysis and cepstrum analysis are employed and compared. By virtue of the band-pass filtering nature of EMD, the resonant frequency bands of the structure to be measured are captured in the IMFs. As impulses arising from rolling elements striking bearing faults modulate with the structure resonance, proper IMFs can potentially characterize fault signatures. In the study, faulty ball bearings are used to justify the proposed method, and comparisons with the traditional envelope analysis are made. After selecting the IMFs that highlight faulty-bearing features, the performance of envelope analysis and cepstrum analysis in singling out bearing faults is objectively compared and addressed; it is noted that, generally, envelope analysis offers better performance.
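A minimal envelope-analysis sketch applied to one already-selected IMF or band-passed signal: take the Hilbert envelope, remove its mean, and look for the bearing fault frequency in its spectrum. The fault frequency, resonance, and signal below are synthetic stand-ins, not measured bearing data.

```python
import numpy as np
from scipy.signal import hilbert

fs = 12000                         # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
fault_freq = 107.0                 # Hz, synthetic ball-pass (fault) frequency
carrier = 3000.0                   # Hz, structural resonance excited by the impacts
# Impacts at the fault rate modulating the resonance, plus broadband noise
signal = (1 + np.cos(2 * np.pi * fault_freq * t)) * np.sin(2 * np.pi * carrier * t)
signal += 0.5 * np.random.default_rng(8).normal(size=t.size)

envelope = np.abs(hilbert(signal))                 # Hilbert envelope of the selected IMF/band
env = envelope - envelope.mean()
spectrum = np.abs(np.fft.rfft(env)) / env.size
freqs = np.fft.rfftfreq(env.size, 1 / fs)

band = (freqs > 10) & (freqs < 500)                # search the envelope spectrum below 500 Hz
peak = freqs[band][np.argmax(spectrum[band])]
print(f"dominant envelope-spectrum line: {peak:.1f} Hz (expected ~{fault_freq} Hz)")
```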
Data fusion for QRS complex detection in multi-lead electrocardiogram recordings
NASA Astrophysics Data System (ADS)
Ledezma, Carlos A.; Perpiñan, Gilberto; Severeyn, Erika; Altuve, Miguel
2015-12-01
Heart diseases are the main cause of death worldwide. The first step in the diagnosis of these diseases is the analysis of the electrocardiographic (ECG) signal. In turn, the ECG analysis begins with the detection of the QRS complex, which is the one with the most energy in the cardiac cycle. Numerous methods have been proposed in the bibliography for QRS complex detection, but few authors have analyzed the possibility of taking advantage of the information redundancy present in multiple ECG leads (simultaneously acquired) to produce accurate QRS detection. In our previous work we presented such an approach, proposing various data fusion techniques to combine the detections made by an algorithm on multiple ECG leads. In this paper we present further studies that show the advantages of this multi-lead detection approach, analyzing how many leads are necessary in order to observe an improvement in the detection performance. A well known QRS detection algorithm was used to test the fusion techniques on the St. Petersburg Institute of Cardiological Technics database. Results show improvement in the detection performance with as little as three leads, but the reliability of these results becomes interesting only after using seven or more leads. Results were evaluated using the detection error rate (DER). The multi-lead detection approach allows an improvement from DER = 3.04% to DER = 1.88%. Further work is needed to improve the detection performance by implementing additional fusion steps.
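To make the evaluation concrete, here is a toy majority-vote fusion of per-lead QRS detection times together with the DER figure of merit quoted above; the beat times, tolerance window, and vote rule are synthetic assumptions rather than any of the fusion techniques tested in the paper.

```python
import numpy as np

def fuse_detections(per_lead, tolerance=0.05, min_votes=2):
    """Merge per-lead detection times (seconds): a fused beat is declared where
    at least `min_votes` leads agree within `tolerance`."""
    events = np.sort(np.concatenate(per_lead))
    fused, i = [], 0
    while i < len(events):
        group = events[(events >= events[i]) & (events <= events[i] + tolerance)]
        if len(group) >= min_votes:
            fused.append(group.mean())
        i += len(group)
    return np.array(fused)

def detection_error_rate(detected, truth, tolerance=0.05):
    tp = sum(np.any(np.abs(detected - t) <= tolerance) for t in truth)
    fn = len(truth) - tp
    fp = len(detected) - tp
    return 100.0 * (fp + fn) / len(truth)            # DER = (FP + FN) / true beats

truth = np.arange(0.8, 10.0, 0.8)                    # true beat times (s)
lead1 = truth + 0.005
lead2 = np.delete(truth, 3) + 0.01                   # one missed beat on this lead
lead3 = np.concatenate([truth, [4.27]]) - 0.005      # one false detection on this lead

fused = fuse_detections([lead1, lead2, lead3])
print("DER (%):", round(detection_error_rate(fused, truth), 2))
```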
Information theoretic analysis of linear shift-invariant edge-detection operators
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2012-06-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influences of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end information theory based system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.
Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance
NASA Technical Reports Server (NTRS)
Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.
2016-01-01
Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.
Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer
2015-09-01
EPICOPY to obtain reliable copy number variation (CNV) data from the methylome array data, thereby decreasing the DNA requirements by half...in the R statistical environment. Samples were assessed for good performance on the array using detection p-values, a metric implemented by...Illumina to identify probes detected with confidence. Samples with less than 90% of probes detected were removed from the analysis, and probes undetected in any
A New Reassigned Spectrogram Method in Interference Detection for GNSS Receivers
Sun, Kewen; Jin, Tian; Yang, Dongkai
2015-01-01
Interference detection is very important for Global Navigation Satellite System (GNSS) receivers. Current work on interference detection in GNSS receivers has mainly focused on time-frequency (TF) analysis techniques, such as spectrogram and Wigner–Ville distribution (WVD), where the spectrogram approach presents the TF resolution trade-off problem, since the analysis window is used, and the WVD method suffers from the very serious cross-term problem, due to its quadratic TF distribution nature. In order to solve the cross-term problem and to preserve good TF resolution in the TF plane at the same time, in this paper, a new TF distribution by using a reassigned spectrogram has been proposed in interference detection for GNSS receivers. This proposed reassigned spectrogram method efficiently combines the elimination of the cross-term provided by the spectrogram itself according to its inherent nature and the improvement of the TF aggregation property achieved by the reassignment method. Moreover, a notch filter has been adopted in interference mitigation for GNSS receivers, where receiver operating characteristics (ROCs) are used as metrics for the characterization of interference mitigation performance. The proposed interference detection method by using a reassigned spectrogram is evaluated by experiments on GPS L1 signals in the disturbing scenarios in comparison to the state-of-the-art TF analysis approaches. The analysis results show that the proposed interference detection technique effectively overcomes the cross-term problem and also keeps good TF localization properties, which has been proven to be valid and effective to enhance the interference detection performance; in addition, the adoption of the notch filter in interference mitigation has shown a significant acquisition performance improvement in terms of ROC curves for GNSS receivers in jamming environments. PMID:26364637
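As a rough, hedged illustration (no GNSS front-end data are reproduced here), a librosa version that provides the reassigned_spectrogram routine can supply the sharpened TF representation; a synthetic swept-frequency jammer in noise is then flagged by thresholding the reassigned magnitudes. The sample rate, jammer model, and threshold are placeholders, and this is not the authors' detector.

```python
import numpy as np
import librosa

fs = 4096                                        # Hz, placeholder IF sampling rate
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(9)
noise = rng.normal(size=t.size)                  # the GNSS signal itself sits below this noise floor
jammer = 2.0 * np.sin(2 * np.pi * (600 * t + 200 * t**2))   # swept-frequency interference
x = (noise + jammer).astype(np.float32)

# Reassigned spectrogram: each STFT bin's energy is relocated to its instantaneous
# time/frequency, sharpening chirp-like interference relative to a plain spectrogram.
freqs, times, mags = librosa.reassigned_spectrogram(y=x, sr=fs, n_fft=256, hop_length=64)

power_db = librosa.amplitude_to_db(mags, ref=np.median(mags))
detected = power_db > 12.0                       # threshold above the noise floor (illustrative)
print(f"interference flagged in {detected.mean() * 100:.1f}% of TF bins")
```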
Pileggi, Claudia; Flotta, Domenico; Bianco, Aida; Nobile, Carmelo G A; Pavia, Maria
2014-07-01
Human papillomavirus (HPV) DNA testing has been proposed as an alternative to primary cervical cancer screening using cytological testing. Review of the evidence shows that the available data are conflicting in some respects. The overall goal of the study is to update the performance of HPV DNA as stand-alone testing in primary cervical cancer screening, focusing particularly on aspects related to the specificity profile of HPV DNA testing with respect to cytology. We performed a meta-analysis of randomized controlled clinical trials. Eight articles were included in the meta-analysis. Three outcomes were investigated: relative detection, relative specificity, and relative positive predictive value (PPV) of HPV DNA testing versus cytology. Overall evaluation of relative detection showed a significantly higher detection of CIN2+ and CIN3+ for HPV DNA testing versus cytology. Meta-analyses that considered all age groups showed a relative specificity that favored cytology in detecting both CIN2+ and CIN3+ lesions, whereas, in the ≥30 years group, the specificity of the HPV DNA and cytology tests was similar in detecting both CIN2+ and CIN3+ lesions. Results of the pooled analysis of relative PPV showed a non-significantly lower PPV of the HPV DNA test compared with cytology. A key finding of the study is that, in women aged ≥30, an almost overlapping specificity between the two screening tests was found in detecting CIN2 and higher-grade lesions. Therefore, primary screening of cervical cancer by HPV DNA testing appears to offer the right balance between maximum detection of CIN2+ and adequate specificity, if performed in the age group ≥30 years. © 2013 UICC.
Glial brain tumor detection by using symmetry analysis
NASA Astrophysics Data System (ADS)
Pedoia, Valentina; Binaghi, Elisabetta; Balbi, Sergio; De Benedictis, Alessandro; Monti, Emanuele; Minotto, Renzo
2012-02-01
In this work a fully automatic algorithm to detect brain tumors by using symmetry analysis is proposed. In recent years a great deal of research effort in the field of medical imaging has been focused on brain tumor segmentation. The quantitative analysis of MRI brain tumors makes it possible to obtain useful key indicators of disease progression. The complex problem of segmenting tumors in MRI can be successfully addressed by considering modular and multi-step approaches mimicking the human visual inspection process. Tumor detection is often an essential preliminary phase for solving the segmentation problem successfully. In visual analysis of MRI, the first step of the expert's cognitive process is the detection of an anomaly with respect to normal tissue, whatever its nature. A healthy brain has a strong sagittal symmetry, which is weakened by the presence of a tumor. The comparison between the healthy and ill hemispheres, considering that tumors are generally not symmetrically placed in both hemispheres, was used to detect the anomaly. A clustering method based on energy minimization through Graph-Cut is applied to the volume computed as the difference between the left hemisphere and the right hemisphere mirrored across the symmetry plane. Differential analysis entails losing the knowledge of which side the tumor is on; the ill hemisphere is then recognized through a histogram analysis. Many experiments were performed to assess the performance of the detection strategy on MRI volumes containing tumors varied in terms of shape, position and intensity level. The experiments showed good results also in complex situations.
Pneumothorax detection in chest radiographs using local and global texture signatures
NASA Astrophysics Data System (ADS)
Geva, Ofer; Zimmerman-Moreno, Gali; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit
2015-03-01
A novel framework for automatic detection of pneumothorax abnormality in chest radiographs is presented. The suggested method is based on a texture analysis approach combined with supervised learning techniques. The proposed framework consists of two main steps: first, a texture analysis process is performed for detection of local abnormalities. Labeled image patches are extracted in the texture analysis procedure, following which local analysis values are incorporated into a novel global image representation. The global representation is used for training and detection of the abnormality at the image level. The presented global representation is designed based on the distinctive shape of the lung, taking into account the characteristics of typical pneumothorax abnormalities. A supervised learning process was performed on both the local and global data, leading to a trained detection system. The system was tested on a dataset of 108 upright chest radiographs. Several state-of-the-art texture feature sets (Local Binary Patterns, Maximum Response filters) were evaluated. The optimal configuration yielded a sensitivity of 81% with a specificity of 87%. The results of the evaluation are promising, establishing the current framework as a basis for additional improvements and extensions.
An analysis of relational complexity in an air traffic control conflict detection task.
Boag, Christine; Neal, Andrew; Loft, Shayne; Halford, Graeme S
2006-11-15
Theoretical analyses of air traffic complexity were carried out using the Method for the Analysis of Relational Complexity. Twenty-two air traffic controllers examined static air traffic displays and were required to detect and resolve conflicts. Objective measures of performance included conflict detection time and accuracy. Subjective perceptions of mental workload were assessed by a complexity-sorting task and subjective ratings of the difficulty of different aspects of the task. A metric quantifying the complexity of pair-wise relations among aircraft was able to account for a substantial portion of the variance in the perceived complexity and difficulty of conflict detection problems, as well as reaction time. Other variables that influenced performance included the mean minimum separation between aircraft pairs and the amount of time that aircraft spent in conflict.
The simulation study on optical target laser active detection performance
NASA Astrophysics Data System (ADS)
Li, Ying-chun; Hou, Zhao-fei; Fan, Youchen
2014-12-01
Based on the working principle of a laser active detection system, this paper establishes an optical-target laser active detection simulation system and carries out a simulation study of its detection process and detection performance. Performance models are built for laser emission, laser propagation in the atmosphere, reflection from the optical target, the receiver detection system, and signal processing and recognition. We focus on analyzing and modeling the relationship between the laser emission angle, the defocus amount and the "cat-eye" effect echo laser reflected from the optical target. Performance indices such as operating range, SNR and detection probability are also simulated. The laser emission parameters, the reflection characteristics of the optical target and the laser propagation in the atmosphere strongly influence the performance of the optical target laser active detection system. Finally, using object-oriented software design methods, an open, fully functional simulation and operating platform was implemented that simulates the process by which the detection system detects and recognizes the optical target, performs the performance simulation of each subsystem, and generates data reports and graphs. The visualized simulation process makes the performance models of the laser active detection system more intuitive, and the simulation data provide a reference for adjusting the system parameters as well as theoretical and technical support for the top-level design and performance-index optimization of optical-target laser active detection systems.
Detection of white matter lesions in cerebral small vessel disease
NASA Astrophysics Data System (ADS)
Riad, Medhat M.; Platel, Bram; de Leeuw, Frank-Erik; Karssemeijer, Nico
2013-02-01
White matter lesions (WML) are diffuse white matter abnormalities commonly found in older subjects and are important indicators of stroke, multiple sclerosis, dementia and other disorders. We present an automated WML detection method and evaluate it on a dataset of small vessel disease (SVD) patients. In early SVD, small WMLs are expected to be of importance for the prediction of disease progression. Commonly used WML segmentation methods tend to ignore small WMLs and are mostly validated on the basis of total lesion load or a Dice coefficient for all detected WMLs. Therefore, in this paper, we present a method that is designed to detect individual lesions, large or small, and we validate the detection performance of our system with FROC (free-response ROC) analysis. For the automated detection, we use supervised classification making use of multimodal voxel based features from different magnetic resonance imaging (MRI) sequences, including intensities, tissue probabilities, voxel locations and distances, neighborhood textures and others. After preprocessing, including co-registration, brain extraction, bias correction, intensity normalization, and nonlinear registration, ventricle segmentation is performed and features are calculated for each brain voxel. A gentle-boost classifier is trained using these features from 50 manually annotated subjects to give each voxel a probability of being a lesion voxel. We perform ROC analysis to illustrate the benefits of using additional features to the commonly used voxel intensities; significantly increasing the area under the curve (Az) from 0.81 to 0.96 (p<0.05). We perform the FROC analysis by testing our classifier on 50 previously unseen subjects and compare the results with manual annotations performed by two experts. Using the first annotator's results as our reference, the second annotator performs at a sensitivity of 0.90 with an average of 41 false positives per subject while our automated method reached the same level of sensitivity at approximately 180 false positives per subject.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.
2010-02-28
Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner, so that mode estimation can be performed reliably. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
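For readers unfamiliar with the underlying step, the batch form of Prony analysis that the recursive algorithm builds on can be sketched in a few lines: a linear-prediction fit gives the characteristic polynomial, its roots give the mode frequencies and damping, and a Vandermonde least-squares fit gives the amplitudes. The sketch below is a generic textbook illustration, not the recursive implementation or the oscillation detector described in the paper; the 30 samples/s PMU rate and the 0.7 Hz mode are invented for the example.

    import numpy as np

    def prony_modes(y, dt, order=4):
        """Batch Prony analysis of a ringdown record y sampled every dt seconds.
        Returns per-mode frequency (Hz), damping (1/s) and complex amplitude."""
        N = len(y)
        # 1) Linear prediction: y[n] = -sum_k a[k] * y[n-1-k]
        A = np.column_stack([y[order - 1 - k:N - 1 - k] for k in range(order)])
        a, *_ = np.linalg.lstsq(A, -y[order:N], rcond=None)
        # 2) Discrete-time poles are roots of the characteristic polynomial
        z = np.roots(np.concatenate(([1.0], a)))
        s = np.log(z.astype(complex)) / dt       # continuous-time poles
        freq, damp = s.imag / (2 * np.pi), s.real
        # 3) Complex mode amplitudes from a Vandermonde least-squares fit
        V = np.vander(z, N, increasing=True).T   # V[n, i] = z_i ** n
        amp, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
        return freq, damp, amp

    # Synthetic ringdown: a single 0.7 Hz mode decaying at 0.25 1/s, 30 samples/s
    dt, n = 1 / 30.0, np.arange(300)
    y = np.exp(-0.25 * n * dt) * np.cos(2 * np.pi * 0.7 * n * dt)
    freq, damp, amp = prony_modes(y, dt, order=2)
    print(freq, damp)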
Spectroscopic chemical analysis methods and apparatus
NASA Technical Reports Server (NTRS)
Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)
2013-01-01
Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.
Cho, Hyun-Deok; Kim, Unyong; Suh, Joon Hyuk; Eom, Han Young; Kim, Junghyun; Lee, Seul Gi; Choi, Yong Seok; Han, Sang Beom
2016-04-01
Analytical methods using high-performance liquid chromatography with diode array and tandem mass spectrometry detection were developed for the discrimination of the rhizomes of four Atractylodes medicinal plants: A. japonica, A. macrocephala, A. chinensis, and A. lancea. A quantitative study was performed, selecting five bioactive components, including atractylenolide I, II, III, eudesma-4(14),7(11)-dien-8-one and atractylodin, on twenty-six Atractylodes samples of various origins. Sample extraction was optimized to sonication with 80% methanol for 40 min at room temperature. High-performance liquid chromatography with diode array detection was established using a C18 column with a water/acetonitrile gradient system at a flow rate of 1.0 mL/min, and the detection wavelength was set at 236 nm. Liquid chromatography with tandem mass spectrometry was applied to certify the reliability of the quantitative results. The developed methods were validated by ensuring specificity, linearity, limit of quantification, accuracy, precision, recovery, robustness, and stability. Results showed that cangzhu contained higher amounts of atractylenolide I and atractylodin than baizhu, and especially atractylodin contents showed the greatest variation between baizhu and cangzhu. Multivariate statistical analysis, such as principal component analysis and hierarchical cluster analysis, were also employed for further classification of the Atractylodes plants. The established method was suitable for quality control of the Atractylodes plants. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Road Anomalies Detection System Evaluation.
Silva, Nuno; Shah, Vaibhav; Soares, João; Rodrigues, Helena
2018-06-21
Anomalies on road pavement cause discomfort to drivers and passengers, and may cause mechanical failure or even accidents. Governments spend millions of Euros every year on road maintenance, often causing traffic jams and congestion on urban roads on a daily basis. This paper analyses the difference between the deployment of a road anomalies detection and identification system in a “conditioned” and a real world setup, where the system performed worse compared to the “conditioned” setup. It also presents a system performance analysis based on the analysis of the training data sets; on the analysis of the attributes complexity, through the application of PCA techniques; and on the analysis of the attributes in the context of each anomaly type, using acceleration standard deviation attributes to observe how different anomalies classes are distributed in the Cartesian coordinates system. Overall, in this paper, we describe the main insights on road anomalies detection challenges to support the design and deployment of a new iteration of our system towards the deployment of a road anomaly detection service to provide information about roads condition to drivers and government entities.
Kirchner, Elsa A.; Kim, Su Kyoung
2018-01-01
Event-related potentials (ERPs) are often used in brain-computer interfaces (BCIs) for communication or system control for enhancing or regaining control for motor-disabled persons. Especially results from single-trial EEG classification approaches for BCIs support correlations between single-trial ERP detection performance and ERP expression. Hence, BCIs can be considered as a paradigm shift contributing to new methods with strong influence on both neuroscience and clinical applications. Here, we investigate the relevance of the choice of training data and classifier transfer for the interpretability of results from single-trial ERP detection. In our experiments, subjects performed a visual-motor oddball task with motor-task relevant infrequent (targets), motor-task irrelevant infrequent (deviants), and motor-task irrelevant frequent (standards) stimuli. Under dual-task condition, a secondary senso-motor task was performed, compared to the simple-task condition. For evaluation, average ERP analysis and single-trial detection analysis with different numbers of electrodes were performed. Further, classifier transfer was investigated between simple and dual task. Parietal positive ERPs evoked by target stimuli (but not by deviants) were expressed stronger under dual-task condition, which is discussed as an increase of task emphasis and brain processes involved in task coordination and change of task set. Highest classification performance was found for targets irrespective whether all 62, 6 or 2 parietal electrodes were used. Further, higher detection performance of targets compared to standards was achieved under dual-task compared to simple-task condition in case of training on data from 2 parietal electrodes corresponding to results of ERP average analysis. Classifier transfer between tasks improves classification performance in case that training took place on more varying examples (from dual task). In summary, we showed that P300 and overlaying parietal positive ERPs can successfully be detected while subjects are performing additional ongoing motor activity. This supports single-trial detection of ERPs evoked by target events to, e.g., infer a patient's attentional state during therapeutic intervention. PMID:29636660
Jiménez, M; Mateo, R
1997-08-22
A method of analysis for trichothecenes (nivalenol, deoxynivalenol, 3- and 15-acetyldeoxynivalenol, diacetoxyscirpenol, neosolaniol, T-2 tetraol, T-2 and HT-2 toxins), zearalenone and zearalenols, and another method for determination of fumonisin B1 are described and applied to cultures of Fusarium isolated from bananas. Both methods were adapted from different techniques of extraction, clean-up and determination of these mycotoxins. The first method involves extraction with methanol-1% aqueous sodium chloride, clean-up of extracts by partition with hexane and dichloromethane, additional solid reversed-phase clean-up and analysis of two eluates by both high-performance liquid chromatography with ultraviolet detection and capillary gas chromatography. The method for fumonisin B1 implies extraction with aqueous methanol, concentration, clean-up with water and methanol on Amberlite XAD-2 column, formation of a fluorescent 4-fluoro-7-nitrobenzofurazan derivative and analysis by high-performance liquid chromatography with fluorescence detection. Both procedures give good limits of detection and recoveries, and are considered suitable for the detection and quantification of the studied toxins in corn and rice cultures of Fusarium spp. isolated from banana fruits.
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA based technique provides performance improvement over the conventional feature extraction methods.
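The eigen-image idea can be illustrated compactly: each observed motion is represented as a flattened time-frequency image, PCA extracts the dominant "eigen images", and a classifier operates on the projection coefficients. The sketch below uses random placeholder data and a linear SVM on the PCA scores as a stand-in for whatever classifier follows in the paper; none of the array sizes or parameters come from the paper.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # X: one row per motion event, each row a flattened radar time-frequency image
    # y: 1 = fall, 0 = non-fall (sitting, walking, bending, ...)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 64 * 64))          # placeholder for real radar data
    y = rng.integers(0, 2, size=120)

    clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="linear"))
    clf.fit(X[:90], y[:90])                      # train on the first 90 events
    print("held-out accuracy:", clf.score(X[90:], y[90:]))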
Performance analysis of a multispectral system for mine detection in the littoral zone
NASA Astrophysics Data System (ADS)
Hargrove, John T.; Louchard, Eric
2004-09-01
Science & Technology International (STI) has developed, under contract with the Office of Naval Research, a system of multispectral airborne sensors and processing algorithms capable of detecting mine-like objects in the surf zone. STI has used this system to detect mine-like objects in a littoral environment as part of blind tests at Kaneohe Marine Corps Base Hawaii, and Panama City, Florida. The airborne and ground subsystems are described. The detection algorithm is graphically illustrated. We report on the performance of the system configured to operate without a human in the loop. A subsurface (underwater bottom proud mine in the surf zone and moored mine in shallow water) mine detection capability is demonstrated in the surf zone, and in shallow water with wave spillage and foam. Our analysis demonstrates that this STI-developed multispectral airborne mine detection system provides a technical foundation for a viable mine counter-measures system for use prior to an amphibious assault.
Vision-based method for detecting driver drowsiness and distraction in driver monitoring system
NASA Astrophysics Data System (ADS)
Jo, Jaeik; Lee, Sung Joo; Jung, Ho Gi; Park, Kang Ryoung; Kim, Jaihie
2011-12-01
Most driver-monitoring systems have attempted to detect either driver drowsiness or distraction, although both factors should be considered for accident prevention. Therefore, we propose a new driver-monitoring method considering both factors. We make the following contributions. First, if the driver is looking ahead, drowsiness detection is performed; otherwise, distraction detection is performed. Thus, the computational cost and eye-detection error can be reduced. Second, we propose a new eye-detection algorithm that combines adaptive boosting, adaptive template matching, and blob detection with eye validation, thereby reducing the eye-detection error and processing time significantly, which is hardly achievable using a single method. Third, to enhance eye-detection accuracy, eye validation is applied after initial eye detection, using a support vector machine based on appearance features obtained by principal component analysis (PCA) and linear discriminant analysis (LDA). Fourth, we propose a novel eye state-detection algorithm that combines appearance features obtained using PCA and LDA, with statistical features such as the sparseness and kurtosis of the histogram from the horizontal edge image of the eye. Experimental results showed that the detection accuracies of the eye region and eye states were 99 and 97%, respectively. Both driver drowsiness and distraction were detected with a success rate of 98%.
Detection of mental stress due to oral academic examination via ultra-short-term HRV analysis.
Castaldo, R; Xu, W; Melillo, P; Pecchia, L; Santamaria, L; James, C
2016-08-01
Mental stress may cause cognitive dysfunctions, cardiovascular disorders and depression. Mental stress detection via short-term Heart Rate Variability (HRV) analysis has been widely explored in recent years, while ultra-short-term (less than 5 minutes) HRV analysis has not. This study aims to detect mental stress using linear and non-linear HRV features extracted from 3-minute ECG excerpts recorded from 42 university students during an oral examination (stress) and at rest after a vacation. HRV features were extracted and analyzed according to the literature using validated software tools. Statistical and data mining analyses were then performed on the extracted HRV features. The best performing machine learning method was the C4.5 tree algorithm, which discriminated between stress and rest with a sensitivity, specificity and accuracy rate of 78%, 80% and 79%, respectively.
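To make the pipeline concrete, the sketch below computes a few standard time-domain HRV features from RR-interval series and trains a decision tree; scikit-learn's CART tree stands in for the C4.5 algorithm reported above, and the RR series are random placeholders rather than the study's ECG excerpts.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    def hrv_features(rr_ms):
        """Basic time-domain HRV features from an RR-interval series in ms."""
        diff = np.diff(rr_ms)
        return [np.mean(rr_ms),                      # mean RR
                np.std(rr_ms, ddof=1),               # SDNN
                np.sqrt(np.mean(diff ** 2)),         # RMSSD
                100.0 * np.mean(np.abs(diff) > 50)]  # pNN50 (%)

    # Placeholder RR series, one per 3-minute excerpt (rest vs. oral-exam stress)
    rng = np.random.default_rng(1)
    rest   = [rng.normal(850, 60, 200) for _ in range(20)]
    stress = [rng.normal(700, 30, 240) for _ in range(20)]

    X = np.array([hrv_features(r) for r in rest + stress])
    y = np.array([0] * len(rest) + [1] * len(stress))
    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())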
Detecting chaos in particle accelerators through the frequency map analysis method.
Papaphilippou, Yannis
2014-06-01
The motion of beams in particle accelerators is dominated by a plethora of non-linear effects, which can enhance chaotic motion and limit their performance. The application of advanced non-linear dynamics methods for detecting and correcting these effects, and thereby increasing the region of beam stability, plays an essential role not only during the accelerator design phase but also during operation. After describing the nature of non-linear effects and their impact on the performance parameters of different particle accelerator categories, the theory of non-linear particle motion is outlined. The recent developments in the methods employed for the analysis of chaotic beam motion are detailed. In particular, the ability of the frequency map analysis method to detect chaotic motion and guide the correction of non-linear effects is demonstrated in particle tracking simulations as well as in experimental data.
[Analysis and experimental verification of sensitivity and SNR of laser warning receiver].
Zhang, Ji-Long; Wang, Ming; Tian, Er-Ming; Li, Xiao; Wang, Zhi-Bin; Zhang, Yue
2009-01-01
In order to counter the increasingly serious threat from hostile lasers in modern warfare, it is urgent to carry out research on laser warning technology and systems; sensitivity and signal-to-noise ratio (SNR) are two important performance parameters of a laser warning system. In the present paper, based on statistical signal detection theory, a method for calculating the sensitivity and SNR of a coherent-detection laser warning receiver (LWR) is proposed. First, the probability distributions of the laser signal and the receiver noise were analyzed. Second, based on threshold detection theory and the Neyman-Pearson criterion, the signal current equation was established by introducing a detection probability factor and a false alarm rate factor, and the mathematical expressions for sensitivity and SNR were then deduced. Finally, using this method, the sensitivity and SNR of the sinusoidal-grating laser warning receiver developed by our group were analyzed; the theoretical calculations and experimental results indicate that the SNR analysis method is feasible and can be used in the performance analysis of LWRs.
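For reference, under the common additive-Gaussian-noise assumption (which may differ in detail from the signal-current model actually derived in the paper), the Neyman-Pearson threshold relations used above take the generic form

    P_{FA} = Q\!\left(\frac{i_T}{\sigma_n}\right), \qquad
    P_{D} = Q\!\left(\frac{i_T - i_s}{\sigma_n}\right)
    \;\;\Longrightarrow\;\;
    i_{s,\min} = \sigma_n\left[Q^{-1}(P_{FA}) - Q^{-1}(P_{D})\right]

where Q is the Gaussian tail function, i_T the decision threshold, i_s the signal photocurrent and σ_n the RMS noise current; the receiver sensitivity then follows from the optical power required to generate i_{s,min}.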
Jamil, Majid; Sharma, Sanjeev Kumar; Singh, Rajveer
2015-01-01
This paper focuses on the detection and classification of the faults on electrical power transmission line using artificial neural networks. The three phase currents and voltages of one end are taken as inputs in the proposed scheme. The feed forward neural network along with back propagation algorithm has been employed for detection and classification of the fault for analysis of each of the three phases involved in the process. A detailed analysis with varying number of hidden layers has been performed to validate the choice of the neural network. The simulation results concluded that the present method based on the neural network is efficient in detecting and classifying the faults on transmission lines with satisfactory performances. The different faults are simulated with different parameters to check the versatility of the method. The proposed method can be extended to the Distribution network of the Power System. The various simulations and analysis of signals is done in the MATLAB(®) environment.
Real time automatic detection of bearing fault in induction machine using kurtogram analysis.
Tafinine, Farid; Mokrani, Karim
2012-11-01
A proposed signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective automatic bearing fault detection method and gives a good basis for an integrated induction machine condition monitor.
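As an illustration of the quantity behind the kurtogram, the spectral kurtosis of each STFT band can be computed directly: it is near zero for stationary Gaussian vibration and large in the bands excited by repetitive impacts. The full kurtogram additionally scans over window lengths; the sketch below performs only a single-resolution pass, and the sampling rate, resonance frequency and fault repetition rate are invented for the example.

    import numpy as np
    from scipy.signal import stft

    def spectral_kurtosis(x, fs, nperseg=256):
        """Spectral kurtosis per STFT band: ~0 for stationary Gaussian noise,
        large in bands carrying impulsive (fault-related) energy."""
        f, _, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
        p2 = np.mean(np.abs(Z) ** 2, axis=1)
        p4 = np.mean(np.abs(Z) ** 4, axis=1)
        return f, p4 / p2 ** 2 - 2.0

    # Synthetic outer-race-style fault: ~97 Hz bursts exciting a 4 kHz resonance
    fs = 20_000
    t = np.arange(fs) / fs                                   # 1 s of data
    bursts = (np.mod(t, 1 / 97.0) < 2e-4) * np.sin(2 * np.pi * 4000 * t)
    x = np.random.randn(t.size) + 5.0 * bursts
    f, sk = spectral_kurtosis(x, fs)
    print("most impulsive band (Hz):", f[np.argmax(sk)])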
Theoretical detection limit of PIXE analysis using 20 MeV proton beams
NASA Astrophysics Data System (ADS)
Ishii, Keizo; Hitomi, Keitaro
2018-02-01
Particle-induced X-ray emission (PIXE) analysis is usually performed using proton beams with energies in the range 2∼3 MeV because at these energies the detection limit is low. The detection limit of PIXE analysis depends on the X-ray production cross-section, the continuous background of the PIXE spectrum and experimental parameters such as the beam current and the solid angle and efficiency of the X-ray detector. Though the continuous background increases as the projectile energy increases, the X-ray production cross-section increases as well; therefore, the detection limit of high-energy proton PIXE is not expected to degrade significantly. We calculated the cross-sections of continuous X-rays produced in several bremsstrahlung processes and estimated the detection limit of a 20 MeV proton PIXE analysis by modelling the Compton tail of the γ-rays produced in the nuclear reactions and the escape effect on the secondary electron bremsstrahlung. We found that the Compton tail does not affect the detection limit when a thin X-ray detector is used, but the secondary electron bremsstrahlung escape effect does have an impact. We also confirmed that the detection limit of the PIXE analysis, when used with a 4 μm polyethylene backing film and an integrated beam current of 1 μC, is 0.4∼2.0 ppm for proton energies in the range 10∼30 MeV and elements with Z = 16-90. This result demonstrates the usefulness of cyclotrons of several tens of MeV for performing PIXE analysis. Cyclotrons with these properties are currently installed in positron emission tomography (PET) centers.
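The detection-limit figures quoted above follow the usual counting-statistics convention; under the common 3-sigma criterion (which may differ in detail from the convention used in the paper), the minimum detectable concentration of an element present at a known concentration C is

    C_{\mathrm{DL}} \;=\; \frac{3\sqrt{N_B}}{N_X}\,C

where N_X is the net characteristic X-ray count produced by the concentration C and N_B the continuous-background count integrated under the same peak region. Since N_X scales with the X-ray production cross-section and N_B with the bremsstrahlung and gamma-ray background cross-sections, the limit is governed by the ratio of the two, which is consistent with the roughly constant ppm-level limits reported above for 10∼30 MeV.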
Haraksingh, Rajini R; Abyzov, Alexej; Urban, Alexander Eckehart
2017-04-24
High-resolution microarray technology is routinely used in basic research and clinical practice to efficiently detect copy number variants (CNVs) across the entire human genome. A new generation of arrays combining high probe densities with optimized designs will comprise essential tools for genome analysis in the coming years. We systematically compared the genome-wide CNV detection power of all 17 available array designs from the Affymetrix, Agilent, and Illumina platforms by hybridizing the well-characterized genome of 1000 Genomes Project subject NA12878 to all arrays, and performing data analysis using both manufacturer-recommended and platform-independent software. We benchmarked the resulting CNV call sets from each array using a gold standard set of CNVs for this genome derived from 1000 Genomes Project whole genome sequencing data. The arrays tested comprise both SNP and aCGH platforms with varying designs and contain between ~0.5 to ~4.6 million probes. Across the arrays CNV detection varied widely in number of CNV calls (4-489), CNV size range (~40 bp to ~8 Mbp), and percentage of non-validated CNVs (0-86%). We discovered strikingly strong effects of specific array design principles on performance. For example, some SNP array designs with the largest numbers of probes and extensive exonic coverage produced a considerable number of CNV calls that could not be validated, compared to designs with probe numbers that are sometimes an order of magnitude smaller. This effect was only partially ameliorated using different analysis software and optimizing data analysis parameters. High-resolution microarrays will continue to be used as reliable, cost- and time-efficient tools for CNV analysis. However, different applications tolerate different limitations in CNV detection. Our study quantified how these arrays differ in total number and size range of detected CNVs as well as sensitivity, and determined how each array balances these attributes. This analysis will inform appropriate array selection for future CNV studies, and allow better assessment of the CNV-analytical power of both published and ongoing array-based genomics studies. Furthermore, our findings emphasize the importance of concurrent use of multiple analysis algorithms and independent experimental validation in array-based CNV detection studies.
Design and Performance of the Astro-E/XRS Signal Processing System
NASA Technical Reports Server (NTRS)
Boyce, Kevin R.; Audley, M. D.; Baker, R. G.; Dumonthier, J. J.; Fujimoto, R.; Gendreau, K. C.; Ishisaki, Y.; Kelley, R. L.; Stahle, C. K.; Szymkowiak, A. E.
1999-01-01
We describe the signal processing system of the Astro-E XRS instrument. The Calorimeter Analog Processor (CAP) provides bias and power for the detectors and amplifies the detector signals by a factor of 20,000. The Calorimeter Digital Processor (CDP) performs the digital processing of the calorimeter signals, detecting X-ray pulses and analyzing them by optimal filtering. We describe the operation of pulse detection, pulse height analysis, and risetime determination. We also discuss performance, including the three event grades (hi-res, mid-res, and low-res), anticoincidence detection, counting rate dependence, and noise rejection.
Gimenez, Thais; Braga, Mariana Minatel; Raggio, Daniela Procida; Deery, Chris; Ricketts, David N; Mendes, Fausto Medeiros
2013-01-01
Fluorescence-based methods have been proposed to aid caries lesion detection. Summarizing and analysing the findings of studies about fluorescence-based methods could clarify their real benefits. We aimed to perform a comprehensive systematic review and meta-analysis to evaluate the accuracy of fluorescence-based methods in detecting caries lesions. Two independent reviewers searched PubMed, Embase and Scopus through June 2012 to identify published articles. Other sources were checked to identify non-published literature. STUDY ELIGIBILITY CRITERIA, PARTICIPANTS AND DIAGNOSTIC METHODS: The eligibility criteria were studies that: (1) have assessed the accuracy of fluorescence-based methods of detecting caries lesions on occlusal, approximal or smooth surfaces, in both primary or permanent human teeth, in the laboratory or clinical setting; (2) have used a reference standard; and (3) have reported sufficient data relating to the sample size and the accuracy of the methods. A diagnostic 2×2 table was extracted from the included studies to calculate the pooled sensitivity, specificity and overall accuracy parameters (diagnostic odds ratio and summary receiver operating characteristic curve). The analyses were performed separately for each method and for different characteristics of the studies. The quality of the studies and heterogeneity were also evaluated. Seventy-five studies met the inclusion criteria from the 434 articles initially identified. The search of the grey or non-published literature did not identify any further studies. In general, the analysis demonstrated that the fluorescence-based methods tend to have similar accuracy for all types of teeth, dental surfaces or settings. There was a trend of better performance of fluorescence methods in detecting more advanced caries lesions. We also observed moderate to high heterogeneity and evidence of publication bias. Fluorescence-based devices have similar overall performance; however, better accuracy in detecting more advanced caries lesions has been observed.
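For reference, the accuracy parameters pooled from each study's 2×2 table reduce to the diagnostic odds ratio, a single threshold-independent summary of sensitivity and specificity that also underlies the summary ROC curve; the exact pooling model used in the review is not restated here.

    \mathrm{DOR} \;=\; \frac{TP \cdot TN}{FP \cdot FN}
    \;=\; \frac{\text{sensitivity}/(1-\text{sensitivity})}{(1-\text{specificity})/\text{specificity}}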
Detection of person borne IEDs using multiple cooperative sensors
NASA Astrophysics Data System (ADS)
MacIntosh, Scott; Deming, Ross; Hansen, Thorkild; Kishan, Neel; Tang, Ling; Shea, Jing; Lang, Stephen
2011-06-01
The use of multiple cooperative sensors for the detection of person-borne IEDs is investigated. The purpose of the effort is to evaluate the performance benefits of adding multiple sensor data streams into an aided threat detection algorithm, and to provide a quantitative analysis of which sensor data combinations improve overall detection performance. Testing includes both mannequins and human subjects with simulated suicide bomb devices of various configurations, materials, sizes and metal content. Aided threat recognition algorithms are being developed to test the detection performance of individual sensors against combined, fused sensor inputs. Sensors investigated include active and passive millimeter wave imaging systems, passive infrared, 3-D profiling sensors and acoustic imaging. The paper describes the experimental set-up and outlines the methodology behind a decision fusion algorithm based on the concept of a "body model".
Cao, Hongyou; Liu, Quanmin; Wahab, Magd Abdel
2017-01-01
Output-based structural damage detection is becoming increasingly appealing due to its potential in real engineering applications without any restriction regarding excitation measurements. A new transmissibility-based damage detection approach is presented in this study by combining transmissibility with correlation analysis in order to strengthen its performance in discriminating damaged from undamaged scenarios. From this perspective, damage detection strategies are established by constructing damage-sensitive indicators from a derived transmissibility. A cantilever beam is numerically analyzed to verify the feasibility of the proposed damage detection procedure, and an ASCE (American Society of Civil Engineers) benchmark is then used to validate its application to engineering structures. The results of both studies reveal a good performance of the proposed methodology in identifying damaged states from intact states. The comparison between the proposed indicator and the existing indicator also affirms its applicability in damage detection, which might be adopted in further structural health monitoring systems as a discrimination criterion. This study contributes an alternative criterion for transmissibility-based damage detection in addition to the conventional ones. PMID:28773218
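The quantities involved can be sketched briefly: a transmissibility function relates two measured responses without needing the excitation, and a correlation-type indicator compares the current function with a baseline from the intact structure. The sketch below uses the common cross-/auto-spectrum estimate and a simple 1 - |correlation| indicator as stand-ins for the specific derived transmissibility and damage-sensitive indicators constructed in the paper; the channels and parameters are illustrative.

    import numpy as np
    from scipy.signal import csd, welch

    def transmissibility(x_i, x_j, fs, nperseg=1024):
        """Output-only transmissibility T_ij(f) = S_ij(f) / S_jj(f)."""
        f, s_ij = csd(x_i, x_j, fs=fs, nperseg=nperseg)
        _, s_jj = welch(x_j, fs=fs, nperseg=nperseg)
        return f, s_ij / s_jj

    def damage_indicator(t_ref, t_cur):
        """1 - |correlation| between baseline and current |T(f)|: ~0 when unchanged."""
        r = np.corrcoef(np.abs(t_ref), np.abs(t_cur))[0, 1]
        return 1.0 - abs(r)

    # x_ref_i, x_ref_j: baseline acceleration records at two points; x_i, x_j: current records
    # f, t_ref = transmissibility(x_ref_i, x_ref_j, fs)
    # _, t_cur = transmissibility(x_i, x_j, fs)
    # print("damage indicator:", damage_indicator(t_ref, t_cur))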
2015-01-01
Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377
Bevilacqua, Elisa; Jani, Jacques C; Letourneau, Alexandra; Duiella, Silvia F; Kleinfinger, Pascale; Lohmann, Laurence; Resta, Serena; Cos Sanchez, Teresa; Fils, Jean-François; Mirra, Marilyn; Benachi, Alexandra; Costa, Jean-Marc
2018-06-13
To evaluate the failure rate and performance of cell-free DNA (cfDNA) testing, mainly in terms of detection rates for trisomy 21, performed by 2 laboratories using different analytical methods. cfDNA testing was performed on 2,870 pregnancies with the Harmony™ Prenatal Test using the targeted digital analysis of selected regions (DANSR) method, and on 2,635 pregnancies with the "Cerba test" using the genome-wide massively parallel sequencing (GW-MPS) method, with available outcomes. Propensity score analysis was used to match patients between the 2 groups. A comparison of the detection rates for trisomy 21 between the 2 laboratories was made. In all, 2,811 patients in the Harmony group and 2,530 patients in the Cerba group had no trisomy 21, 18, or 13. Postmatched comparisons of the patient characteristics indicated a higher no-result rate in the Harmony group (1.30%) than in the Cerba group (0.75%; p = 0.039). All 41 cases of trisomy 21 in the Harmony group and 93 cases in the Cerba group were detected. Both methods of cfDNA testing showed low no-result rates and a comparable performance in detecting trisomy 21; yet GW-MPS had a slightly lower no-result rate than the DANSR method. © 2018 S. Karger AG, Basel.
SSME propellant path leak detection real-time
NASA Technical Reports Server (NTRS)
Crawford, R. A.; Smith, L. M.
1994-01-01
Included are four documents that outline the technical aspects of the research performed on NASA Grant NAG8-140: 'A System for Sequential Step Detection with Application to Video Image Processing'; 'Leak Detection from the SSME Using Sequential Image Processing'; 'Digital Image Processor Specifications for Real-Time SSME Leak Detection'; and 'A Color Change Detection System for Video Signals with Applications to Spectral Analysis of Rocket Engine Plumes'.
Cell edge detection in JPEG2000 wavelet domain - analysis on sigmoid function edge model.
Punys, Vytenis; Maknickas, Ramunas
2011-01-01
Big virtual microscopy images (80K x 60K pixels and larger) are usually stored using the JPEG2000 image compression scheme. Diagnostic quantification based on image analysis might be faster if performed on compressed data (approximately 20 times less than the original amount), representing the coefficients of the wavelet transform. The analysis of possible edge detection without reverse wavelet transform is presented in the paper. Two edge detection methods, suitable for JPEG2000 bi-orthogonal wavelets, are proposed. The methods are adjusted according to the calculated parameters of a sigmoid edge model. The results of the model analysis indicate which method is more suitable for a given bi-orthogonal wavelet.
Funari, Mariana F A; Jorge, Alexander A L; Pinto, Emilia M; Arnhold, Ivo J P; Mendonca, Berenice B; Nishi, Mirian Y
2008-11-01
LWD is associated with SHOX haploinsufficiency, in most cases due to gene deletion. Generally, FISH and microsatellite analysis are used to identify SHOX deletions. MLPA is a new method of detecting gene copy number variation, allowing simultaneous analysis of several regions. Here we describe the presence of a SHOX intragenic deletion in a family with LWD, analyzed through different methodologies. Genomic DNA of 11 subjects from one family was studied by microsatellite analysis, direct sequencing and MLPA. FISH was performed in two affected individuals. Microsatellite analysis showed that all affected members shared the same haplotype, suggesting the involvement of SHOX. MLPA detected an intragenic deletion involving exons IV-VIa, which was not detected by FISH or microsatellite analysis. In conclusion, the MLPA technique proved to be the best option for detecting this small deletion; it has the advantage of being less laborious while also allowing the analysis of several regions simultaneously.
Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J
2001-08-01
Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be equally detected by FC and IH techniques. CD15 and CD117 are better detected by FC analysis and MPO is better detected by IH analysis.
Noh, Yun Hong; Jeong, Do Un
2014-07-15
In this paper, a packet generator using a pattern matching algorithm for real-time abnormal heartbeat detection is proposed. The packet generator creates a very small data packet which conveys sufficient crucial information for health condition analysis. The data packet envelopes real-time ECG signals and transmits them to a smartphone via Bluetooth. An Android application was developed specifically to decode the packet and extract ECG information for health condition analysis. Several graphical presentations are displayed on the smartphone. We evaluated the performance of abnormal heartbeat detection accuracy using the MIT/BIH Arrhythmia Database and real-time experiments. The experimental results confirm that abnormal heartbeat detection is practically possible. We also performed data compression ratio and signal restoration performance evaluations to establish the usefulness of the proposed packet generator, and the results were excellent.
Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.
Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei
2016-01-01
Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect on acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that the yellow intensity gradually weakened as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had a good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). Accuracy, precision and repeatability experiments revealed good performance for the detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of its high selectivity and sensitivity.
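The colour-density step can be illustrated as follows: the mean RGB of the assay spot is converted to CMYK, the yellow-channel density is taken as the response, and a small neural network maps it to concentration. The conversion below is the naive device-independent formula, the RGB readings and concentrations are invented placeholders, and scikit-learn's MLPRegressor stands in for whatever ANN architecture the paper trained.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def rgb_to_cmyk(rgb):
        """Naive RGB -> CMYK conversion; only the Y (yellow) density is used below."""
        r, g, b = (v / 255.0 for v in rgb)
        k = 1.0 - max(r, g, b)
        if k >= 1.0:
            return 0.0, 0.0, 0.0, 1.0
        return ((1 - r - k) / (1 - k), (1 - g - k) / (1 - k),
                (1 - b - k) / (1 - k), k)

    # Placeholder calibration points: mean spot RGB at known dichlorvos levels (mg/L)
    conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
    rgbs = [(250, 240, 60), (248, 238, 95), (246, 236, 125),
            (244, 234, 155), (242, 233, 185), (240, 232, 210)]
    yellow = np.array([[rgb_to_cmyk(c)[2]] for c in rgbs])   # Y density weakens with dose

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(yellow, conc)
    print(model.predict([[rgb_to_cmyk((245, 235, 140))[2]]]))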
High-performance liquid chromatography analysis of plant saponins: An update 2005-2010
Negi, Jagmohan S.; Singh, Pramod; Pant, Geeta Joshi Nee; Rawat, M. S. M.
2011-01-01
Saponins are widely distributed in the plant kingdom. In view of their wide range of biological activities and their occurrence as complex mixtures, saponins have been purified and separated by high-performance liquid chromatography using reverse-phase columns at low detection wavelengths. Most saponins are not detected by ultraviolet detectors due to their lack of chromophores. Electrospray ionization mass spectrometry, diode array detection, evaporative light scattering detection, and charged aerosol detection have been used to overcome the detection problem of saponins. PMID:22303089
Microwave photonic link with improved phase noise using a balanced detection scheme
NASA Astrophysics Data System (ADS)
Hu, Jingjing; Gu, Yiying; Tan, Wengang; Zhu, Wenwu; Wang, Linghua; Zhao, Mingshan
2016-07-01
A microwave photonic link (MPL) with improved phase noise performance using a dual-output Mach-Zehnder modulator (DP-MZM) and balanced detection is proposed and experimentally demonstrated. The fundamental concept of the approach is based on the two complementary outputs of the DP-MZM and the destructive combination of the photocurrents in a balanced photodetector (BPD). Theoretical analysis is performed to numerically evaluate the additive phase noise performance and shows good agreement with the experiment. Experimental results are presented for 4 GHz, 8 GHz and 12 GHz transmission links, and an 11 dB improvement of phase noise performance at 10 MHz offset is achieved compared to the conventional intensity-modulation and direct-detection (IMDD) MPL.
Xu, Ren; Jiang, Ning; Mrachacz-Kersting, Natalie; Dremstrup, Kim; Farina, Dario
2016-01-01
Brain-computer interfacing (BCI) has recently been applied as a rehabilitation approach for patients with motor disorders, such as stroke. In these closed-loop applications, a brain switch detects the motor intention from brain signals, e.g., scalp EEG, and triggers a neuroprosthetic device, either to deliver sensory feedback or to mimic real movements, thus re-establishing the compromised sensory-motor control loop and promoting neural plasticity. In this context, single trial detection of motor intention with short latency is a prerequisite. The performance of the event detection from EEG recordings is mainly determined by three factors: the type of motor imagery (e.g., repetitive, ballistic), the frequency band (or signal modality) used for discrimination (e.g., alpha, beta, gamma, and MRCP, i.e., movement-related cortical potential), and the processing technique (e.g., time-series analysis, sub-band power estimation). In this study, we investigated single trial EEG traces during movement imagination on healthy individuals, and provided a comprehensive analysis of the performance of a short-latency brain switch when varying these three factors. The morphological investigation showed a cross-subject consistency of a prolonged negative phase in MRCP, and a delayed beta rebound in sensory-motor rhythms during repetitive tasks. The detection performance had the greatest accuracy when using ballistic MRCP with time-series analysis. In this case, the true positive rate (TPR) was ~70% for a detection latency of ~200 ms. The results presented here are of practical relevance for designing BCI systems for motor function rehabilitation. PMID:26834551
NASA Technical Reports Server (NTRS)
Rick, R. C.; Lushbaugh, C. C.; Mcdow, E.; Frome, E.
1972-01-01
Changes in respiratory variance revealed by power spectral analysis of the pulmonary impedance pneumogram can be used to detect and measure stresses directly or indirectly affecting human respiratory function. When gastrointestinal distress occurred during a series of 5 total-body exposures of 30 R at a rate of 1.5 R/min, it was accompanied by typical shifts in pulmonary impedance power spectra. These changes did not occur after protracted exposure of 250 R (30 R daily) at 1.5 R/hr that failed to cause radiation sickness. This system for quantitating respiratory effort can also be used to detect alterations in one's ability to perform under controlled exercise conditions.
Video content analysis of surgical procedures.
Loukas, Constantinos
2018-02-01
In addition to its therapeutic benefits, minimally invasive surgery offers the potential for video recording of the operation. The videos may be archived and used later for reasons such as cognitive training, skills assessment, and workflow analysis. Methods from the major field of video content analysis and representation are increasingly applied in the surgical domain. In this paper, we review recent developments and analyze future directions in the field of content-based video analysis of surgical operations. The review was obtained from PubMed and Google Scholar search on combinations of the following keywords: 'surgery', 'video', 'phase', 'task', 'skills', 'event', 'shot', 'analysis', 'retrieval', 'detection', 'classification', and 'recognition'. The collected articles were categorized and reviewed based on the technical goal sought, type of surgery performed, and structure of the operation. A total of 81 articles were included. The publication activity is constantly increasing; more than 50% of these articles were published in the last 3 years. Significant research has been performed for video task detection and retrieval in eye surgery. In endoscopic surgery, the research activity is more diverse: gesture/task classification, skills assessment, tool type recognition, shot/event detection and retrieval. Recent works employ deep neural networks for phase and tool recognition as well as shot detection. Content-based video analysis of surgical operations is a rapidly expanding field. Several future prospects for research exist including, inter alia, shot boundary detection, keyframe extraction, video summarization, pattern discovery, and video annotation. The development of publicly available benchmark datasets to evaluate and compare task-specific algorithms is essential.
Pirulli, D; Giordano, M; Lessi, M; Spanò, A; Puzzer, D; Zezlina, S; Boniotto, M; Crovella, S; Florian, F; Marangella, M; Momigliano-Richiardi, P; Savoldi, S; Amoroso, A
2001-06-01
Primary hyperoxaluria type 1 is an autosomal recessive disorder of glyoxylate metabolism, caused by a deficiency of alanine:glyoxylate aminotransferase, which is encoded by a single-copy gene (AGXT). The aim of this research was to standardize denaturing high-performance liquid chromatography, a new, sensitive, relatively inexpensive, and automated technique, for the detection of AGXT mutations. Denaturing high-performance liquid chromatography was used for blinded analysis of the AGXT gene in 20 unrelated Italian patients with primary hyperoxaluria type 1 previously studied by other standard methods (single-strand conformation polymorphism analysis and direct sequencing) and in 50 controls. Denaturing high-performance liquid chromatography allowed us to identify 13 mutations and the polymorphism at position 154 in exon I of the AGXT gene. Hence the method is more sensitive and less time consuming than single-strand conformation polymorphism analysis for the detection of AGXT mutations, thus representing a useful and reliable tool for detecting the mutations responsible for primary hyperoxaluria type 1. The new technology could also be helpful in the search for healthy carriers of AGXT mutations amongst family members and their partners, and for screening of AGXT polymorphisms in patients with nephrolithiasis and healthy populations.
Detection of rebar delamination using modal analysis
NASA Astrophysics Data System (ADS)
Blodgett, David W.
2003-08-01
A non-destructive method for early detection of reinforcement steel bars (re-bar) delamination in concrete structures has been developed. This method, termed modal analysis, has been shown effective in both laboratory and field experiments. In modal analysis, an audio speaker is used to generate flexural resonant modes in the re-bar in reinforced concrete structures. Vibrations associated with these modes are coupled to the surrounding concrete and propagate to the surface where they are detected using a laser vibrometer and/or accelerometer. Monitoring both the frequency and amplitude of these vibrations provides information on the bonding state of the embedded re-bar. Laboratory measurements were performed on several specially prepared concrete blocks with re-bar of varying degrees of simulated corrosion. Field measurements were performed on an old bridge about to be torn down in Howard County, Maryland and the results compared with those obtained using destructive analysis of the bridge after demolition. Both laboratory and field test results show this technique to be sensitive to re-bar delamination.
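As background to the frequencies involved, the flexural resonances of a slender bar follow Euler-Bernoulli beam theory; for an idealized free-free length L of re-bar they lie near

    f_n \;=\; \frac{\lambda_n^{2}}{2\pi L^{2}}\sqrt{\frac{EI}{\rho A}},
    \qquad \lambda_1 \approx 4.730,\;\; \lambda_2 \approx 7.853

where E is the elastic modulus of the steel, I the second moment of area of the bar cross-section, ρ the density and A the cross-sectional area. This textbook free-free expression is only a rough guide: the boundary conditions of partially bonded re-bar differ, which is exactly why the measured mode frequencies and amplitudes shift with the bonding state.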
In-injection port thermal desorption for explosives trace evidence analysis.
Sigman, M E; Ma, C Y
1999-10-01
A gas chromatographic method utilizing thermal desorption of a dry surface wipe for the analysis of explosives trace chemical evidence has been developed and validated using electron capture and negative ion chemical ionization mass spectrometric detection. Thermal desorption was performed within a split/splitless injection port with minimal instrument modification. Surface-abraded Teflon tubing provided the solid support for sample collection and desorption. Performance was characterized by desorption efficiency, reproducibility, linearity of the calibration, and method detection and quantitation limits. Method validation was performed with a series of dinitrotoluenes, trinitrotoluene, two nitroester explosives, and one nitramine explosive. The method was applied to the sampling of a single piece of debris from an explosion containing trinitrotoluene.
Nilsson, Björn; Håkansson, Petra; Johansson, Mikael; Nelander, Sven; Fioretos, Thoas
2007-01-01
Ontological analysis facilitates the interpretation of microarray data. Here we describe new ontological analysis methods which, unlike existing approaches, are threshold-free and statistically powerful. We perform extensive evaluations and introduce a new concept, detection spectra, to characterize methods. We show that different ontological analysis methods exhibit distinct detection spectra, and that it is critical to account for this diversity. Our results argue strongly against the continued use of existing methods, and provide directions towards an enhanced approach. PMID:17488501
A New Pivoting and Iterative Text Detection Algorithm for Biomedical Images
Xu, Songhua; Krauthammer, Michael
2010-01-01
There is interest to expand the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating its performance on a set of manually labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. In this paper, we demonstrate that a projection histogram-based text detection approach is well suited for text detection in biomedical images, with an F score of 0.60. The approach performs better than comparable approaches for text detection. Further, we show that the iterative application of the algorithm boosts overall detection performance. A C++ implementation of our algorithm is freely available through email request for academic use. PMID:20887803
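As a rough illustration of the projection-histogram idea (not the authors' C++ implementation), the following Python sketch locates candidate text rows in a binarized image by thresholding the horizontal projection; the threshold value and the grouping logic are assumptions, and the paper's method applies this iteratively and in both directions.

```python
import numpy as np

def projection_text_rows(binary_img, min_fill=0.02):
    """Locate candidate text rows in a binarized image (text pixels == 1)
    by thresholding the horizontal projection histogram."""
    row_hist = binary_img.sum(axis=1) / binary_img.shape[1]  # fraction of text pixels per row
    is_text = row_hist > min_fill
    # Group consecutive text rows into (start, end) bands
    bands, start = [], None
    for i, flag in enumerate(is_text):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            bands.append((start, i))
            start = None
    if start is not None:
        bands.append((start, len(is_text)))
    return bands

# Hypothetical usage: bands = projection_text_rows(img > 0.5), then recurse on
# the vertical projection within each band to isolate individual text elements.
```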
Subspace Compressive GLRT Detector for MIMO Radar in the Presence of Clutter.
Bolisetti, Siva Karteek; Patwary, Mohammad; Ahmed, Khawza; Soliman, Abdel-Hamid; Abdel-Maguid, Mohamed
2015-01-01
The problem of optimising the target detection performance of MIMO radar in the presence of clutter is considered. The increased false alarm rate, which is a consequence of the presence of clutter returns, is known to seriously degrade the target detection performance of the radar target detector, especially under low SNR conditions. In this paper, a mathematical model is proposed to optimise the target detection performance of a MIMO radar detector in the presence of clutter. The number of samples that are required to be processed by a radar target detector regulates the amount of processing burden incurred in achieving a given detection reliability. While the Subspace Compressive GLRT (SSC-GLRT) detector is known to give optimised radar target detection performance with reduced computational complexity, it suffers a significant deterioration in target detection performance in the presence of clutter. In this paper, we provide evidence that the proposed mathematical model for the SSC-GLRT detector outperforms the existing detectors in the presence of clutter. The performance analysis of the existing detectors and the proposed SSC-GLRT detector for MIMO radar in the presence of clutter is provided in this paper.
Paleologos, E K; Kontominas, M G
2005-06-10
A method using normal phase high performance liquid chromatography (NP-HPLC) with UV detection was developed for the analysis of acrylamide and methacrylamide. The method relies on the chromatographic separation of these analytes on a polar HPLC column designed for the separation of organic acids. Identification of acrylamide and methacrylamide is approached dually, that is, directly in their protonated forms and, for confirmation, as their hydrolysis products acrylic and methacrylic acid, respectively. Detection and quantification are performed at 200 nm. The method is simple, allowing for clear resolution of the target peaks from any interfering substances. Detection limits of 10 microg L(-1) were obtained for both analytes, with the inter- and intra-day RSD for standard analysis lying below 1.0%. Use of acetonitrile in the elution solvent lowers detection limits and retention times without impairing resolution of peaks. The method was applied to the determination of acrylamide and methacrylamide in spiked food samples without native acrylamide, yielding recoveries between 95 and 103%. Finally, commercial samples of French and roasted fries, cookies, cocoa, and coffee were analyzed to assess the applicability of the method towards acrylamide, giving results similar to those reported in the literature.
A model of human event detection in multiple process monitoring situations
NASA Technical Reports Server (NTRS)
Greenstein, J. S.; Rouse, W. B.
1978-01-01
It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described, and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Input to the event detection model of the information displayed to the experimental subjects allows comparison of the model's performance with the performance of the subjects.
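Discriminant analysis maps observed process features to posterior event probabilities. The minimal Python sketch below illustrates this with scikit-learn's linear discriminant analysis on entirely hypothetical feature data; the features, class means, and sample sizes are assumptions, not the paper's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training data: feature vectors derived from each displayed
# process (e.g., deviation from nominal, local slope), labeled 1 if an
# event was present and 0 otherwise.
X_train = np.vstack([rng.normal(0.0, 1.0, (200, 2)),      # no event
                     rng.normal(2.0, 1.0, (200, 2))])     # event
y_train = np.concatenate([np.zeros(200), np.ones(200)])

lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)

# For new observations, the posterior P(event | features) plays the role of
# the subjective event probability generated by the human in the model.
X_new = rng.normal(1.0, 1.0, (5, 2))
p_event = lda.predict_proba(X_new)[:, 1]
print(p_event)
```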
Li, Zibo; Guo, Xinwu; Tang, Lili; Peng, Limin; Chen, Ming; Luo, Xipeng; Wang, Shouman; Xiao, Zhi; Deng, Zhongping; Dai, Lizhong; Xia, Kun; Wang, Jun
2016-10-01
Circulating cell-free DNA (cfDNA) has been considered as a potential biomarker for non-invasive cancer detection. To evaluate the methylation levels of six candidate genes (EGFR, GREM1, PDGFRB, PPM1E, SOX17, and WRN) in plasma cfDNA as biomarkers for breast cancer early detection, quantitative analysis of the promoter methylation of these genes from 86 breast cancer patients and 67 healthy controls was performed by using microfluidic-PCR-based target enrichment and next-generation bisulfite sequencing technology. The predictive performance of different logistic models based on the methylation status of candidate genes was investigated by means of the area under the ROC curve (AUC) and odds ratio (OR) analysis. Results revealed that EGFR, PPM1E, and 8 gene-specific CpG sites were significantly hypermethylated in cancer patients' plasma and significantly associated with breast cancer (OR ranging from 2.51 to 9.88). The AUC values for these biomarkers ranged from 0.66 to 0.75. Combinations of multiple hypermethylated genes or CpG sites substantially improved the predictive performance for breast cancer detection. Our study demonstrated the feasibility of quantitative measurement of candidate gene methylation in cfDNA by using microfluidic-PCR-based target enrichment and bisulfite next-generation sequencing, which is worthy of further validation and potentially benefits a broad range of applications in clinical oncology practice. Quantitative analysis of the methylation pattern of plasma cfDNA by next-generation sequencing might be a valuable non-invasive tool for early detection of breast cancer.
Optimizing a neural network for detection of moving vehicles in video
NASA Astrophysics Data System (ADS)
Fischer, Noëlle M.; Kruithof, Maarten C.; Bouma, Henri
2017-10-01
In the field of security and defense, it is extremely important to reliably detect moving objects, such as cars, ships, drones and missiles. Detection and analysis of moving objects in cameras near borders could be helpful to reduce illicit trading, drug trafficking, irregular border crossing, trafficking in human beings and smuggling. Many recent benchmarks have shown that convolutional neural networks perform well in the detection of objects in images. Most deep-learning research effort focuses on classification or detection in single images. However, the detection of dynamic changes (e.g., moving objects, actions and events) in streaming video is extremely relevant for surveillance and forensic applications. In this paper, we combine an end-to-end feedforward neural network for static detection with a recurrent Long Short-Term Memory (LSTM) network for multi-frame analysis. We present a practical guide with special attention to the selection of the optimizer and batch size. The end-to-end network is able to localize and recognize the vehicles in video from traffic cameras. We show an efficient way to collect relevant in-domain data for training with minimal manual labor. Our results show that the combination with LSTM improves performance for the detection of moving vehicles.
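The combination of a per-frame network with a recurrent network for multi-frame analysis can be sketched as follows in Python/PyTorch. This is a minimal stand-in, not the authors' architecture: the tiny convolutional backbone, layer sizes, and the use of the last LSTM output as a single score are all assumptions.

```python
import torch
import torch.nn as nn

class FrameCNN(nn.Module):
    """Small per-frame feature extractor (a stand-in for the paper's detector backbone)."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, feat_dim)

    def forward(self, x):                      # x: (B, 3, H, W)
        return self.fc(self.conv(x).flatten(1))

class MovingVehicleNet(nn.Module):
    """CNN features per frame, LSTM across frames, one 'moving vehicle' score per clip."""
    def __init__(self, feat_dim=128, hidden=64):
        super().__init__()
        self.cnn = FrameCNN(feat_dim)
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, clip):                   # clip: (B, T, 3, H, W)
        b, t = clip.shape[:2]
        feats = self.cnn(clip.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])           # score from the last time step

# Usage sketch: scores = MovingVehicleNet()(torch.randn(2, 8, 3, 64, 64))
```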
miRNAs as biomarkers for diagnosis of heart failure: A systematic review and meta-analysis.
Yan, Hualin; Ma, Fan; Zhang, Yi; Wang, Chuan; Qiu, Dajian; Zhou, Kaiyu; Hua, Yimin; Li, Yifei
2017-06-01
With the rapid development of molecular biology, microRNAs (miRNAs) have been shown to play an emerging role in both cardiac development and pathological processes. Thus, we conducted this meta-analysis to clarify the role of circulating miRNAs as biomarkers for detecting heart failure. We searched PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, and the World Health Organization clinical trials registry center to identify relevant studies up to August 2016. We performed a meta-analysis in a fixed/random-effect model using Meta-disc 1.4. We used STATA 14.0 to estimate publication bias and to perform meta-regression. In addition, SPSS 17.0 was used to evaluate variance between groups. Information on true positives, false positives, false negatives, and true negatives, as well as the quality of each study, was extracted. We used results from 10 articles to analyze the pooled accuracy. The overall performance of total mixed miRNAs (TmiRs) detection was: pooled sensitivity, 0.74 (95% confidence interval [CI], 0.72 to 0.75); pooled specificity, 0.69 (95% CI, 0.67 to 0.71); and area under the summary receiver operating characteristic curve (SROC), 0.7991. The performance of miRNA-423-5p (miR-423-5p) detection was: pooled sensitivity, 0.81 (95% CI, 0.76 to 0.85); pooled specificity, 0.67 (95% CI, 0.61 to 0.73); and SROC, 0.8600. However, for the same patient population, we extracted the BNP data for detecting heart failure and performed a meta-analysis, which yielded an acceptable SROC of 0.9291. In the variance analysis, the diagnostic performance of miR-423-5p showed significant advantages over the other pooled results. The combination of miRNAs and BNP could increase the accuracy of detecting heart failure; however, there was no dramatic advantage of miR-423-5p over the BNP protocol. Despite interstudy variability, the performance tests of miRNAs for detecting heart failure revealed that miR-423-5p has the potential to be a biomarker. Other miRNAs were not able to provide enough evidence of promising diagnostic value for heart failure based on the current data. Moreover, the combination of miRNAs and BNP could serve as a better detection method, but BNP remained the most convincing biomarker for this disease.
Cai, Pei-Shan; Li, Dan; Chen, Jing; Xiong, Chao-Mei; Ruan, Jin-Lan
2015-04-15
Two thin-film microextractions (TFME), octadecylsilane (ODS)-polyacrylonitrile (PAN)-TFME and polar enhanced phase (PEP)-PAN-TFME, have been proposed for the analysis of bisphenol-A, diethylstilbestrol and 17β-estradiol in aqueous tea extract and environmental water samples, followed by high performance liquid chromatography with ultraviolet detection. Both thin films were prepared by spraying. The influencing factors, including pH, extraction time, desorption solvent, desorption volume, desorption time, ionic strength and reusability, were investigated. Under the optimal conditions, the two TFME methods are similar in terms of the analytical performance evaluated by the standard addition method. The limits of detection for the three estrogens in environmental water and aqueous tea extract matrix ranged from 1.3 to 1.6 and 2.8 to 7.1 ng mL(-1) by the two TFME methods, respectively. Both approaches were applied to the analysis of the analytes in real aqueous tea extract and environmental water samples, presenting satisfactory recoveries ranging from 87.3% to 109.4% for the spiked samples. Copyright © 2014 Elsevier Ltd. All rights reserved.
Document reconstruction by layout analysis of snippets
NASA Astrophysics Data System (ADS)
Kleber, Florian; Diem, Markus; Sablatnig, Robert
2010-02-01
Document analysis is done to analyze entire forms (e.g. intelligent form analysis, table detection) or to describe the layout/structure of a document. Skew detection of scanned documents is also performed to support OCR algorithms that are sensitive to skew. In this paper, document analysis is applied to snippets of torn documents to calculate features for the reconstruction. Documents can either be destroyed with the intention of making the printed content unavailable (e.g. tax fraud investigation, business crime) or by time-induced degeneration of ancient documents (e.g. bad storage conditions). Current reconstruction methods for manually torn documents deal with the shape, inpainting and texture synthesis techniques. In this paper, the potential of document analysis techniques applied to snippets to support the matching algorithm with additional features is shown. This comprises a rotational analysis, a color analysis and a line detection. As future work, it is planned to extend the feature set with the paper type (blank, checked, lined), the type of the writing (handwritten vs. machine printed) and the text layout of a snippet (text size, line spacing). Preliminary results show that these pre-processing steps can be performed reliably on a real dataset consisting of 690 snippets.
A case study in nonconformance and performance trend analysis
NASA Technical Reports Server (NTRS)
Maloy, Joseph E.; Newton, Coy P.
1990-01-01
As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.
Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang
2017-01-01
Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much greater separation power for the analysis of complex samples and is thus increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide application of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized using a gamma distribution, and a new peak detection algorithm based on the normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches are introduced: fast Fourier transform (FFT) and the first-order and second-order delta methods (D1 and D2). The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs the best in terms of both computational expense and peak detection performance.
Drunk driving detection based on classification of multivariate time series.
Li, Zhenlong; Jin, Xue; Zhao, Xiaohua
2015-09-01
This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent the multivariate time series, and a bottom-up algorithm was then employed to segment them. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify the driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
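The segmentation-and-classification pipeline can be sketched in Python as follows. This is a simplified illustration, not the authors' implementation: the merge error threshold, the initial segment length, and the packaging of slope/duration features for the SVM are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def fit_error(t, y):
    """Residual sum of squares of a least-squares line through (t, y)."""
    A = np.vstack([t, np.ones_like(t)]).T
    _, res, *_ = np.linalg.lstsq(A, y, rcond=None)
    return res[0] if res.size else 0.0

def bottom_up_segments(y, max_error=0.5, init_len=4):
    """Merge adjacent segments while the merged linear-fit error stays small."""
    t = np.arange(len(y), dtype=float)
    segs = [(i, min(i + init_len, len(y))) for i in range(0, len(y), init_len)]
    merged = True
    while merged and len(segs) > 1:
        merged = False
        costs = [fit_error(t[a:c], y[a:c]) for (a, _), (_, c) in zip(segs, segs[1:])]
        i = int(np.argmin(costs))
        if costs[i] < max_error:
            segs[i] = (segs[i][0], segs[i + 1][1]); del segs[i + 1]
            merged = True
    return segs

def segment_features(y, segs):
    """Slope and duration of each segment, in the spirit of the paper's feature set."""
    t = np.arange(len(y), dtype=float)
    return [(np.polyfit(t[a:b], y[a:b], 1)[0], b - a) for a, b in segs]

# Hypothetical usage: build one fixed-length feature vector per drive from the
# lateral-position and steering-angle segments, labels 0=normal and 1=drunk, then
# clf = SVC(kernel='rbf').fit(X_train, y_train)
```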
Usefulness of MLPA in the detection of SHOX deletions.
Funari, Mariana F A; Jorge, Alexander A L; Souza, Silvia C A L; Billerbeck, Ana E C; Arnhold, Ivo J P; Mendonca, Berenice B; Nishi, Mirian Y
2010-01-01
SHOX haploinsufficiency causes a wide spectrum of short stature phenotypes, such as Leri-Weill dyschondrosteosis (LWD) and disproportionate short stature (DSS). SHOX deletions are responsible for approximately two thirds of isolated haploinsufficiency; therefore, it is important to determine the most appropriate methodology for detection of gene deletion. In this study, three methodologies for the detection of SHOX deletions were compared: the fluorescence in situ hybridization (FISH), microsatellite analysis and multiplex ligation-dependent probe amplification (MLPA). Forty-four patients (8 LWD and 36 DSS) were analyzed. The cosmid LLNOYCO3'M'34F5 was used as a probe for the FISH analysis and microsatellite analysis were performed using three intragenic microsatellite markers. MLPA was performed using commercial kits. Twelve patients (8 LWD and 4 DSS) had deletions in SHOX area detected by MLPA and 2 patients generated discordant results with the other methodologies. In the first case, the deletion was not detected by FISH. In the second case, both FISH and microsatellite analyses were unable to identify the intragenic deletion. In conclusion, MLPA was more sensitive, less expensive and less laborious; therefore, it should be used as the initial molecular method for the detection of SHOX gene deletion. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.
Latha, Indu; Reichenbach, Stephen E; Tao, Qingping
2011-09-23
Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Xue, Zhiyun; Antani, Sameer; Long, L. Rodney; Jeronimo, Jose; Thoma, George R.
2007-03-01
Cervicography is a technique for visual screening of uterine cervix images for cervical cancer. One of our research goals is the automated detection in these images of acetowhite (AW) lesions, which are sometimes correlated with cervical cancer. These lesions are characterized by the whitening of regions along the squamocolumnar junction on the cervix when treated with 5% acetic acid. Image preprocessing is required prior to invoking AW detection algorithms on cervicographic images for two reasons: (1) to remove Specular Reflections (SR) caused by camera flash, and (2) to isolate the cervix region-of-interest (ROI) from image regions that are irrelevant to the analysis. These image regions may contain medical instruments, film markup, or other non-cervix anatomy or regions, such as vaginal walls. We have qualitatively and quantitatively evaluated the performance of alternative preprocessing algorithms on a test set of 120 images. For cervix ROI detection, all approaches use a common feature set, but with varying combinations of feature weights, normalization, and clustering methods. For SR detection, while one approach uses a Gaussian Mixture Model on an intensity/saturation feature set, a second approach uses Otsu thresholding on a top-hat transformed input image. Empirical results are analyzed to derive conclusions on the performance of each approach.
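The second specular-reflection detection approach (Otsu thresholding of a top-hat transformed image) can be sketched with scikit-image as follows; the structuring-element radius and the file name are assumptions, and the Gaussian-mixture alternative and the cervix ROI step are not shown.

```python
import numpy as np
from skimage import io, color, morphology, filters

# Hypothetical input: an RGB cervicographic image scaled to floats in [0, 1].
rgb = io.imread('cervix.png') / 255.0
gray = color.rgb2gray(rgb)

# The white top-hat emphasizes small bright structures (specular highlights)
# relative to the smoother cervix background.
tophat = morphology.white_tophat(gray, morphology.disk(15))

# Otsu's threshold on the top-hat image separates SR pixels from background.
sr_mask = tophat > filters.threshold_otsu(tophat)
```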
Monir, Md. Mamun; Zhu, Jun
2017-01-01
Most of the genome-wide association studies (GWASs) for human complex diseases have ignored dominance, epistasis and ethnic interactions. We conducted comparative GWASs for total cholesterol using a full model and additive models, which illustrate the impact of ignoring these genetic effects on analysis results and demonstrate how the genetic effects of multiple loci can differ across ethnic groups. There were 15 quantitative trait loci, comprising 13 individual loci and 3 pairs of epistatic loci, identified by the full model, whereas only 14 loci (9 common loci and 5 different loci) were identified by the multi-loci additive model. In addition, 4 loci detected by the full model were not detected using the multi-loci additive model. PLINK analysis identified two loci and GCTA analysis detected only one locus with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some of the new genes are related to cholesterol-related chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive model in terms of detection power and unbiased estimation of genetic variants of complex traits. PMID:28079101
NASA Astrophysics Data System (ADS)
Wang, Xiao; Gao, Feng; Dong, Junyu; Qi, Qiang
2018-04-01
Synthetic aperture radar (SAR) imagery is independent of atmospheric conditions, making it an ideal image source for change detection. Existing methods directly analyze all regions of the speckle-noise-contaminated difference image, so their performance is easily affected by small noisy regions. In this paper, we propose a novel saliency-guided change detection framework based on pattern and intensity distinctiveness analysis. The saliency analysis step removes small noisy regions and therefore makes the proposed method more robust to speckle noise. In the proposed method, the log-ratio operator is first utilized to obtain a difference image (DI). Then, the saliency detection method based on pattern and intensity distinctiveness analysis is utilized to obtain the changed-region candidates. Finally, principal component analysis and k-means clustering are employed to analyze the pixels in the changed-region candidates. Thus, the final change map is obtained by classifying these pixels into the changed or unchanged class. The experimental results on two real SAR image datasets demonstrate the effectiveness of the proposed method.
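The final classification stage (log-ratio difference image, patch features reduced by PCA, k-means into changed/unchanged) might look roughly like the Python sketch below; the patch size, number of principal components, and the saliency-derived candidate mask supplied by the caller are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def change_map(img1, img2, candidate_mask, patch=5):
    """Classify candidate pixels of the log-ratio difference image into
    changed / unchanged using PCA patch features and k-means (k=2)."""
    di = np.abs(np.log((img1 + 1.0) / (img2 + 1.0)))        # log-ratio difference image
    pad = patch // 2
    dpad = np.pad(di, pad, mode='reflect')
    ys, xs = np.nonzero(candidate_mask)                     # pixels kept by the saliency step
    patches = np.array([dpad[y:y + patch, x:x + patch].ravel() for y, x in zip(ys, xs)])
    feats = PCA(n_components=3).fit_transform(patches)
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(feats)
    # Call the cluster with the larger mean difference the 'changed' class
    changed = labels == np.argmax([di[ys, xs][labels == k].mean() for k in (0, 1)])
    out = np.zeros_like(di, dtype=bool)
    out[ys[changed], xs[changed]] = True
    return out
```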
Doesch, Christina; Papavassiliu, Theano; Michaely, Henrik J; Attenberger, Ulrike I; Glielmi, Christopher; Süselbeck, Tim; Fink, Christian; Borggrefe, Martin; Schoenberg, Stefan O
2013-09-01
The purpose of this study was to compare automated, motion-corrected, color-encoded (AMC) perfusion maps with qualitative visual analysis of adenosine stress cardiovascular magnetic resonance imaging for detection of flow-limiting stenoses. Myocardial perfusion measurements applying the standard adenosine stress imaging protocol and a saturation-recovery temporal generalized autocalibrating partially parallel acquisition (t-GRAPPA) turbo fast low angle shot (Turbo FLASH) magnetic resonance imaging sequence were performed in 25 patients using a 3.0-T MAGNETOM Skyra (Siemens Healthcare Sector, Erlangen, Germany). Perfusion studies were analyzed using AMC perfusion maps and qualitative visual analysis. Angiographically detected coronary artery (CA) stenoses greater than 75% or 50% or more with a myocardial perfusion reserve index less than 1.5 were considered as hemodynamically relevant. Diagnostic performance and time requirement for both methods were compared. Interobserver and intraobserver reliability were also assessed. A total of 29 CA stenoses were included in the analysis. Sensitivity, specificity, positive predictive value, negative predictive value, and accuracy for detection of ischemia on a per-patient basis were comparable using the AMC perfusion maps compared to visual analysis. On a per-CA territory basis, the attribution of an ischemia to the respective vessel was facilitated using the AMC perfusion maps. Interobserver and intraobserver reliability were better for the AMC perfusion maps (concordance correlation coefficient, 0.94 and 0.93, respectively) compared to visual analysis (concordance correlation coefficient, 0.73 and 0.79, respectively). In addition, in comparison to visual analysis, the AMC perfusion maps were able to significantly reduce analysis time from 7.7 (3.1) to 3.2 (1.9) minutes (P < 0.0001). The AMC perfusion maps yielded a diagnostic performance on a per-patient and on a per-CA territory basis comparable with the visual analysis. Furthermore, this approach demonstrated higher interobserver and intraobserver reliability as well as a better time efficiency when compared to visual analysis.
Lee, Jack; Zee, Benny Chung Ying; Li, Qing
2013-01-01
Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complications and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We propose a novel new-vessel detection method that includes statistical texture analysis (STA), high order spectrum analysis (HOS), and fractal analysis (FA), and, most importantly, we show that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (and AUC) were obtained; they are 96.3%, 99.1% and 98.5% (AUC 99.3%), respectively. The proposed method is found to improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.
Yang, Li; Wang, Guobao; Qi, Jinyi
2016-04-01
Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
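For the indirect route, the pixel-by-pixel Patlak analysis amounts to a linear fit of C(t)/Cp(t) against the normalized integral of the plasma input over the late frames; a minimal Python sketch follows, where the frame timing, the start time t* of the linear phase, and the array shapes are assumptions.

```python
import numpy as np

def patlak_images(tac, cp, times, t_star=20.0):
    """Indirect Patlak analysis: per-pixel linear fit of C(t)/Cp(t) versus
    (integral of Cp up to t) / Cp(t) over frames with t >= t_star.
    tac: (T, N) time-activity curves, cp: (T,) plasma input, times: (T,) frame mid-times."""
    int_cp = np.concatenate([[0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(times))])
    late = times >= t_star
    x = int_cp[late] / cp[late]                       # Patlak 'stretched time'
    y = tac[late] / cp[late, None]                    # normalized tissue activity
    A = np.vstack([x, np.ones_like(x)]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # slope = Ki, intercept = V, per pixel
    ki, v = coef
    return ki, v
```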
Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin
2017-04-01
In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography with diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool to solve the problems of coelution, unknown interferences, and chromatographic shifts in the process of high-performance liquid chromatography analysis, making possible the determination of 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared to the quantitative analysis results from the classic high-performance liquid chromatography method, the statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method for the quantitative analysis of apple polyphenols was accurate, fast, universal, simple, and green, and it is expected to be developed as an attractive alternative method for simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Grewal, Dilraj S; Tanna, Angelo P
2013-03-01
With the rapid adoption of spectral domain optical coherence tomography (SDOCT) in clinical practice and the recent advances in software technology, there is a need for a review of the literature on glaucoma detection and progression analysis algorithms designed for the commercially available instruments. Peripapillary retinal nerve fiber layer (RNFL) thickness and macular thickness, including segmental macular thickness calculation algorithms, have been demonstrated to be repeatable and reproducible, and have a high degree of diagnostic sensitivity and specificity in discriminating between healthy and glaucomatous eyes across the glaucoma continuum. Newer software capabilities such as glaucoma progression detection algorithms provide an objective analysis of longitudinally obtained structural data that enhances our ability to detect glaucomatous progression. RNFL measurements obtained with SDOCT appear more sensitive than time domain OCT (TDOCT) for glaucoma progression detection; however, agreement with the assessments of visual field progression is poor. Over the last few years, several studies have been performed to assess the diagnostic performance of SDOCT structural imaging and its validity in assessing glaucoma progression. Most evidence suggests that SDOCT performs similarly to TDOCT for glaucoma diagnosis; however, SDOCT may be superior for the detection of early stage disease. With respect to progression detection, SDOCT represents an important technological advance because of its improved resolution and repeatability. Advancements in RNFL thickness quantification, segmental macular thickness calculation and progression detection algorithms, when used correctly, may help to improve our ability to diagnose and manage glaucoma.
Integrating Oil Debris and Vibration Gear Damage Detection Technologies Using Fuzzy Logic
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.; Afjeh, Abdollah A.
2002-01-01
A diagnostic tool for detecting damage to spur gears was developed. Two different measurement technologies, wear debris analysis and vibration, were integrated into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual measurement technologies. This diagnostic tool was developed and evaluated experimentally by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spur Gear Fatigue Test Rig. Experimental data were collected during experiments performed in this test rig with and without pitting. Results show combining the two measurement technologies improves the detection of pitting damage on spur gears.
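As a toy illustration of fuzzy-logic fusion of the two measurement technologies (not the NASA tool's actual rule base), the Python sketch below combines an oil-debris mass reading and a vibration metric into a single damage index; the membership thresholds and rule weights are invented for illustration only.

```python
import numpy as np

def ramp(x, lo, hi):
    """Membership that rises linearly from 0 at `lo` to 1 at `hi`."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def gear_damage_index(debris_mass_mg, vib_metric):
    """Toy fuzzy fusion: each measurement gives a 'damage' membership; the rule
    'damage if debris is high AND vibration is high' uses min, the weaker rule
    'possible damage if either is high' uses max, and the two are blended 70/30."""
    mu_debris = ramp(debris_mass_mg, 10.0, 40.0)   # assumed thresholds
    mu_vib = ramp(vib_metric, 1.0, 3.0)            # assumed thresholds
    both_high = min(mu_debris, mu_vib)
    either_high = max(mu_debris, mu_vib)
    return 0.7 * both_high + 0.3 * either_high

print(gear_damage_index(25.0, 2.5))   # mid-range evidence from both sensors
```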
On the performance of energy detection-based CR with SC diversity over IG channel
NASA Astrophysics Data System (ADS)
Verma, Pappu Kumar; Soni, Sanjay Kumar; Jain, Priyanka
2017-12-01
Cognitive radio (CR) is a viable 5G technology to address the scarcity of spectrum. Energy detection-based sensing is known to be the simplest method as far as hardware complexity is concerned. In this paper, the performance of the spectrum-sensing energy detection technique in CR networks over an inverse Gaussian channel with selection combining diversity is analysed. More specifically, accurate analytical expressions for the average detection probability under different detection scenarios, such as a single channel (no diversity) and diversity reception, are derived and evaluated. Further, the detection threshold parameter is optimised by minimising the probability of error over several diversity branches. The results clearly show a significant improvement in the probability of detection when the optimised threshold parameter is applied. The impact of shadowing parameters on the performance of the energy detector is studied in terms of the complementary receiver operating characteristic curve. To verify the correctness of our analysis, the derived analytical expressions are corroborated via exact results and Monte Carlo simulations.
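The basic energy-detection decision and its Monte Carlo evaluation can be sketched in Python as below. This simplified version works over an AWGN channel with a threshold set from a target false-alarm probability; the inverse Gaussian shadowing, selection combining, and threshold optimization of the paper are not modeled.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy_detect_pd(snr_db, n_samples=100, pfa=0.01, trials=20000):
    """Monte Carlo probability of detection for an energy detector over AWGN."""
    snr = 10 ** (snr_db / 10)
    # Threshold set from the target false-alarm rate on noise-only trials
    noise_energy = np.sum(rng.normal(size=(trials, n_samples)) ** 2, axis=1)
    thr = np.quantile(noise_energy, 1 - pfa)
    # Signal-present trials: random signal of the given SNR plus unit-variance noise
    sig = np.sqrt(snr) * rng.normal(size=(trials, n_samples))
    energy = np.sum((sig + rng.normal(size=(trials, n_samples))) ** 2, axis=1)
    return np.mean(energy > thr)

print(energy_detect_pd(snr_db=-5))
```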
Optimizing Probability of Detection Point Estimate Demonstration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, are the targets to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
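Under the binomial point-estimate framing, the probability of passing a demonstration follows directly from the binomial distribution. The short Python sketch below computes it for an assumed 29-of-29 acceptance rule; the actual acceptance criterion and flaw-size tolerances depend on the procedure being qualified.

```python
from scipy.stats import binom

def prob_pass_demo(true_pod, n_flaws=29, min_hits=29):
    """Probability of passing a point-estimate POD demonstration, i.e. detecting
    at least `min_hits` of `n_flaws` flaws when the true POD is `true_pod`."""
    return binom.sf(min_hits - 1, n_flaws, true_pod)

# A 29-of-29 demonstration: a procedure with true POD = 0.90 passes only rarely,
# while one with POD = 0.99 passes most of the time.
for p in (0.90, 0.95, 0.99):
    print(p, round(prob_pass_demo(p), 3))
```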
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
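A comparable non-linear regression with a small feed-forward network and early stopping can be sketched with scikit-learn as below; the synthetic regressors, network size, and validation fraction are assumptions, and the original study's specific architecture and cross-validation scheme are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical design: event regressors (inputs) and one voxel's BOLD signal (target).
n_scans = 400
X = rng.normal(size=(n_scans, 3))                     # three event-type regressors
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n_scans)

# A small feed-forward network as a non-linear regression model; early stopping on a
# held-out fraction guards against fitting autocorrelated noise, in the spirit of the
# paper's cross-validation / early-stopping procedure.
ann = MLPRegressor(hidden_layer_sizes=(16,), early_stopping=True,
                   validation_fraction=0.2, max_iter=2000, random_state=0)
ann.fit(X, y)
print(ann.score(X, y))
```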
The visual analysis of emotional actions.
Chouchourelou, Arieta; Matsuka, Toshihiko; Harber, Kent; Shiffrar, Maggie
2006-01-01
Is the visual analysis of human actions modulated by the emotional content of those actions? This question is motivated by a consideration of the neuroanatomical connections between visual and emotional areas. Specifically, the superior temporal sulcus (STS), known to play a critical role in the visual detection of action, is extensively interconnected with the amygdala, a center for emotion processing. To the extent that amygdala activity influences STS activity, one would expect to find systematic differences in the visual detection of emotional actions. A series of psychophysical studies tested this prediction. Experiment 1 identified point-light walker movies that convincingly depicted five different emotional states: happiness, sadness, neutral, anger, and fear. In Experiment 2, participants performed a walker detection task with these movies. Detection performance was systematically modulated by the emotional content of the gaits. Participants demonstrated the greatest visual sensitivity to angry walkers. The results of Experiment 3 suggest that local velocity cues to anger may account for high false alarm rates to the presence of angry gaits. These results support the hypothesis that the visual analysis of human action depends upon emotion processes.
Tsao, Chia-Wen; Yang, Zhi-Jie
2015-10-14
Desorption/ionization on silicon (DIOS) is a high-performance matrix-free mass spectrometry (MS) analysis method that involves using silicon nanostructures as a matrix for MS desorption/ionization. In this study, gold nanoparticles grafted onto a nanostructured silicon (AuNPs-nSi) surface were demonstrated as a DIOS-MS analysis approach with high sensitivity and high detection specificity for glucose detection. A glucose sample deposited on the AuNPs-nSi surface was directly catalyzed to negatively charged gluconic acid molecules on a single AuNPs-nSi chip for MS analysis. The AuNPs-nSi surface was fabricated using two electroless deposition steps and one electroless etching step. The effects of the electroless fabrication parameters on the glucose detection efficiency were evaluated. Practical application of AuNPs-nSi MS glucose analysis in urine samples was also demonstrated in this study.
NASA Astrophysics Data System (ADS)
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source of life and the essential foundation of all life. With the development of industrialization, water pollution is becoming more and more frequent, which directly affects human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which the partial least squares regression (PLSR) analysis method is becoming the predominant technology; however, in some special cases, PLSR analysis produces considerable errors. In order to solve this problem, the traditional principal component regression (PCR) analysis method was improved in this paper by using the principle of PLSR. The experimental results show that, for some data sets, the improved PCR analysis method performs better than PLSR. PCR and PLSR are the focus of this paper. Firstly, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components, which carry most of the original data information, are extracted by using the principle of PLSR. Secondly, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components can be obtained. Finally, the same water spectral data set is processed by both PLSR and improved PCR; analyzing and comparing the two results shows that improved PCR and PLSR are similar for most data, but improved PCR is better than PLSR for data near the detection limit. Both PLSR and improved PCR can be used in ultraviolet spectral analysis of water, but for data near the detection limit, the improved PCR results are better than those of PLSR.
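For readers outside MATLAB/SPSS, the two regression approaches compared here can be sketched in a few lines of Python with scikit-learn; the synthetic spectra, the number of components, and the absence of the paper's PLSR-guided component selection are all assumptions of this sketch.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical data: UV absorbance spectra (rows) and a measured concentration.
X = rng.normal(size=(60, 200))                 # 60 samples x 200 wavelengths
y = X[:, :10].sum(axis=1) + 0.1 * rng.normal(size=60)

# Principal component regression: PCA for dimensionality reduction, then ordinary
# least squares on the retained scores.
pcr = make_pipeline(PCA(n_components=5), LinearRegression()).fit(X, y)

# Partial least squares regression keeps components that covary with y.
pls = PLSRegression(n_components=5).fit(X, y)

print(pcr.score(X, y), pls.score(X, y))
```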
Cautela, Domenico; Laratta, Bruna; Santelli, Francesca; Trifirò, Antonio; Servillo, Luigi; Castaldo, Domenico
2008-07-09
The chemical composition of 30 samples of juices obtained from bergamot (Citrus bergamia Risso and Poit.) fruits is reported and compared to the genuineness parameters adopted by Association of the Industry of Juice and Nectars (AIJN) for lemon juice. It was found that the compositional differences between the two juices are distinguishable, although with difficulty. However, these differences are not strong enough to detect the fraudulent addition of bergamot juice to lemon juice. Instead, we found the high-performance liquid chromatography (HPLC) analysis of the flavanones naringin, neohesperidin, and neoeriocitrin, which are present in bergamot juice and practically absent in the lemon juice, is a convenient way to detect and quantify the fraudulent addition of bergamot juice. The method has been validated by calculating the detection and quantification limits according to Eurachem procedures. Employing neoeriocitrin (detection limit = 0.7 mg/L) and naringin (detection limit = 1 mg/L) as markers, it is possible to detect the addition of bergamot juice to lemon juice at the 1% level. When using neohesperidin as a marker (detection limit = 1 mg/L), the minimal percentage of detectable addition of bergamot juice was about 2%. Finally, it is reported that the pattern of flavonoid content of the bergamot juice is similar to those of chinotto (Citrus myrtifolia Raf) and bitter orange (Citrus aurantium L.) juices and that it is possible to distinguish the three kinds of juices by HPLC analysis.
Photoelectrochemical detection of benzaldehyde in foodstuffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaCourse, W.R.; Krull, I.S.
Photoelectrochemical detection (PED) coupled with high performance liquid chromatography was used to quantitatively determine benzaldehyde in extracts, beverages, and foodstuffs. Photoelectrochemical detection is responsive to alkyl and aryl ketones and aldehydes and offers the advantages of 2-3 orders of magnitude linearity, 5-1-ng limits of detection, and a high degree of selectivity without chemical derivatization. This is the first application of the PED to sample analysis.
NASA Astrophysics Data System (ADS)
Allec, N.; Abbaszadeh, S.; Scott, C. C.; Lewin, J. M.; Karim, K. S.
2012-12-01
In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined, however the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
NASA Astrophysics Data System (ADS)
Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Voit, Michael; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen
2017-05-01
Real-time motion video analysis is a challenging and exhausting task for the human observer, particularly in safety and security critical domains. Hence, customized video analysis systems providing functions for the analysis of subtasks like motion detection or target tracking are welcome. While such automated algorithms relieve the human operators from performing basic subtasks, they impose additional interaction duties on them. Prior work shows that, e.g., for interaction with target tracking algorithms, a gaze-enhanced user interface is beneficial. In this contribution, we present an investigation on interaction with an independent motion detection (IDM) algorithm. Besides identifying an appropriate interaction technique for the user interface - again, we compare gaze-based and traditional mouse-based interaction - we focus on the benefit an IDM algorithm might provide for an UAS video analyst. In a pilot study, we exposed ten subjects to the task of moving target detection in UAS video data twice, once performing with automatic support, once performing without it. We compare the two conditions considering performance in terms of effectiveness (correct target selections). Additionally, we report perceived workload (measured using the NASA-TLX questionnaire) and user satisfaction (measured using the ISO 9241-411 questionnaire). The results show that a combination of gaze input and automated IDM algorithm provides valuable support for the human observer, increasing the number of correct target selections up to 62% and reducing workload at the same time.
Browne, Richard W; Whitcomb, Brian W
2010-07-01
Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs), diluted to low levels; from LOB to 2-3 times LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates compared with the concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
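The parametric LOB and the LOD built on it reduce to simple formulas; the Python sketch below illustrates them with made-up replicate values. The 1.645 factor corresponds to a 5% error rate, and the replicate data are purely illustrative, not the study's measurements.

```python
import numpy as np

def limit_of_blank(blank_results):
    """Parametric LOB: mean of blank measurements plus 1.645 times their SD
    (the one-sided 95th percentile under a normal assumption)."""
    return np.mean(blank_results) + 1.645 * np.std(blank_results, ddof=1)

def limit_of_detection(lob, low_level_results):
    """LOD: LOB plus 1.645 times the SD of replicates of a low-level sample,
    so that a sample at the LOD exceeds the LOB about 95% of the time."""
    return lob + 1.645 * np.std(low_level_results, ddof=1)

# Hypothetical replicate data (same units as the analyte, e.g. micromol/L)
blanks = np.array([0.02, 0.01, 0.03, 0.00, 0.02, 0.01])
low_srm = np.array([0.08, 0.11, 0.09, 0.12, 0.10, 0.07])
lob = limit_of_blank(blanks)
print(lob, limit_of_detection(lob, low_srm))
```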
Cognitive markers of psychotic unipolar depression: a meta-analytic study.
Zaninotto, Leonardo; Guglielmo, Riccardo; Calati, Raffaella; Ioime, Lucia; Camardese, Giovanni; Janiri, Luigi; Bria, Pietro; Serretti, Alessandro
2015-03-15
The goal of the current meta-analysis was to review and examine in detail the features of cognitive performance in psychotic (MDDP) versus non-psychotic (MDD) major depressive disorder. An electronic literature search was performed to find studies comparing cognitive performance in MDDP versus MDD. A meta-analysis of broad cognitive domains (processing speed, reasoning/problem solving, verbal learning, visual learning, attention/working memory) and individual cognitive tasks was conducted on all included studies (n=12). Demographic and clinical features were investigated via meta-regression analysis as moderators of cognitive performance. No difference in socio-demographic and clinical variables was detected between groups. In general, poorer cognitive performance was detected in MDDP versus MDD subjects (ES=0.38), with a greater effect size in drug-free patients (ES=0.69). MDDP patients were more impaired in verbal learning (ES=0.67), visual learning (ES=0.62) and processing speed (ES=0.71) tasks. A significantly poorer performance was also detected in MDDP patients for individual tasks such as the Trail Making Test A, WAIS-R digit span backward and WAIS-R digit symbol. Age was found to have a negative effect on tasks involving working memory performance. In line with previous meta-analyses, our findings support an association between psychosis and cognitive deficits in the context of affective disorders. Psychosis during the course of MDD is associated with poorer cognitive performance in some specific cognitive domains, such as visual and verbal learning and executive functions. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Huber, Samuel; Dunau, Patrick; Wellig, Peter; Stein, Karin
2017-10-01
Background: In target detection, success rates depend strongly on human observer performance. Two prior studies tested the contributions of target detection algorithms and prior training sessions. The aim of this Swiss-German cooperation study was to evaluate the dependency of human observer performance on the quality of supporting image analysis algorithms. Methods: The participants were presented with 15 different video sequences. Their task was to detect all targets in the shortest possible time. Each video sequence showed a heavily cluttered simulated public area from a different viewing angle. In each video sequence, the number of avatars in the area was set to 100, 150 or 200 subjects. The proportion of targets was kept at 10%. The number of marked targets varied from 0, 5, 10, 20 up to 40 marked subjects while keeping the positive predictive value of the detection algorithm at 20%. During the task, workload level was assessed by applying an acoustic secondary task. Detection rates and detection times for the targets were analyzed using inferential statistics. Results: The study found Target Detection Time to increase and Target Detection Rates to decrease with increasing numbers of avatars. The same is true for the Secondary Task Reaction Time, while there was no effect on the Secondary Task Hit Rate. Furthermore, we found a trend toward a u-shaped correlation between the number of markings and the Secondary Task Reaction Time (RTST), indicating increased workload. Conclusion: The trial results may indicate useful criteria for the design of training and support of observers in observational tasks.
Improvement of automatic hemorrhage detection methods using brightness correction on fundus images
NASA Astrophysics Data System (ADS)
Hatanaka, Yuji; Nakagawa, Toshiaki; Hayashi, Yoshinori; Kakogawa, Masakatsu; Sawada, Akira; Kawase, Kazuhide; Hara, Takeshi; Fujita, Hiroshi
2008-03-01
We have been developing several automated methods for detecting abnormalities in fundus images. The purpose of this study is to improve our automated hemorrhage detection method to help diagnose diabetic retinopathy. We propose a new method for preprocessing and false positive elimination in the present study. The brightness of the fundus image was changed by a nonlinear curve applied to the brightness values of the hue saturation value (HSV) space. In order to emphasize brown regions, gamma correction was performed on each red-, green-, and blue-bit image. Subsequently, the histograms of each red-, green-, and blue-bit image were extended. After that, the hemorrhage candidates were detected. The brown regions indicated hemorrhages and blood vessels, and their candidates were detected using density analysis. We removed large candidates such as blood vessels. Finally, false positives were removed by using a 45-feature analysis. To evaluate the new method for the detection of hemorrhages, we examined 125 fundus images, including 35 images with hemorrhages and 90 normal images. The sensitivity and specificity for the detection of abnormal cases were 80% and 88%, respectively. These results indicate that the new method may effectively improve the performance of our computer-aided diagnosis system for hemorrhages.
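The per-channel brightness preprocessing (gamma correction followed by histogram extension) might be sketched in Python as below; the gamma value and the simple linear stretch are assumptions, and the subsequent density analysis and 45-feature false-positive step are only indicated in the final comment.

```python
import numpy as np

def gamma_correct(channel, gamma=1.5):
    """Apply gamma correction to one 8-bit channel (values darken for gamma > 1,
    which emphasizes dark brown hemorrhage regions)."""
    return (255.0 * (channel / 255.0) ** gamma).astype(np.uint8)

def stretch_histogram(channel):
    """Linearly extend the channel histogram to the full 0-255 range."""
    lo, hi = channel.min(), channel.max()
    return (255.0 * (channel - lo) / max(hi - lo, 1)).astype(np.uint8)

def preprocess_fundus(rgb):
    """Per-channel gamma correction followed by histogram extension."""
    return np.dstack([stretch_histogram(gamma_correct(rgb[..., c])) for c in range(3)])

# Usage sketch: enhanced = preprocess_fundus(img); hemorrhage candidates would then
# be found by density analysis and pruned with the 45-feature false-positive analysis.
```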
Fast and objective detection and analysis of structures in downhole images
NASA Astrophysics Data System (ADS)
Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick
2017-09-01
Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour-intensive and hence expensive task, and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data, improving efficiency and assisting, rather than replacing, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided for rapid analysis and additional detection of structures, e.g., detection limited to specific orientations.
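As background to the sinusoid detection step, a planar structure crossing a cylindrical borehole traces a sinusoid in the unrolled image, which can be fit by linear least squares. The sketch below illustrates this geometric fact on synthetic picks; it is not the authors' confidence-based detection algorithm, and the numbers are invented.

```python
# Minimal sketch: a plane crossing a borehole appears in the unrolled image as
# depth(theta) = z0 + A*cos(theta - phi), which is linear in (z0, a, b) because
# A*cos(theta - phi) = a*cos(theta) + b*sin(theta).
import numpy as np

def fit_sinusoid(theta, depth):
    """Fit depth = z0 + a*cos(theta) + b*sin(theta); return z0, amplitude, phase, residual SD."""
    design = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
    (z0, a, b), *_ = np.linalg.lstsq(design, depth, rcond=None)
    amplitude = np.hypot(a, b)
    phase = np.arctan2(b, a)
    residual = depth - design @ (z0, a, b)
    return z0, amplitude, phase, np.std(residual)

# synthetic example: a plane at 12.3 m depth, 0.4 m amplitude, plus picking noise
theta = np.linspace(0, 2 * np.pi, 72, endpoint=False)
depth = 12.3 + 0.4 * np.cos(theta - 1.0) + 0.02 * np.random.default_rng(1).standard_normal(theta.size)
print(fit_sinusoid(theta, depth))
```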
Development and preliminary testing of an instrumented object for force analysis during grasping.
Romeo, R A; Cordella, F; Zollo, L; Formica, D; Saccomandi, P; Schena, E; Carpino, G; Davalli, A; Sacchetti, R; Guglielmelli, E
2015-01-01
This paper presents the design and realization of an instrumented object for force analysis during grasping. The object, with spherical shape, has been constructed with three contact areas in order to allow performing a tripod grasp. Force Sensing Resistor (FSR) sensors have been employed for normal force measurements, while an accelerometer has been used for slip detection. An electronic board for data acquisition has been embedded into the object, so that only the cables for power supply exit from it. Validation tests have been carried out for: (i) comparing the force measurements with a ground truth; (ii) assessing the capability of the accelerometer to detect slippage for different roughness values; (iii) evaluating object performance in grasp trials performed by a human subject.
Rovira, Ericka; Parasuraman, Raja
2010-06-01
This study examined whether benefits of conflict probe automation would occur in a future air traffic scenario in which air traffic service providers (ATSPs) are not directly responsible for freely maneuvering aircraft but are controlling other nonequipped aircraft (mixed-equipage environment). The objective was to examine how the type of automation imperfection (miss vs. false alarm) affects ATSP performance and attention allocation. Research has shown that the type of automation imperfection leads to differential human performance costs. Participating in four 30-min scenarios were 12 full-performance-level ATSPs. Dependent variables included conflict detection and resolution performance, eye movements, and subjective ratings of trust and self confidence. ATSPs detected conflicts faster and more accurately with reliable automation, as compared with manual performance. When the conflict probe automation was unreliable, conflict detection performance declined with both miss (25% conflicts detected) and false alarm automation (50% conflicts detected). When the primary task of conflict detection was automated, even highly reliable yet imperfect automation (miss or false alarm) resulted in serious negative effects on operator performance. The further in advance that conflict probe automation predicts a conflict, the greater the uncertainty of prediction; thus, designers should provide users with feedback on the state of the automation or other tools that allow for inspection and analysis of the data underlying the conflict probe algorithm.
Integrated software for the detection of epileptogenic zones in refractory epilepsy.
Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia
2010-01-01
In this paper we present integrated software designed to help nuclear medicine physicians in the detection of epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. This tool was designed to be flexible, user-friendly, and efficient. A novel detection method was included (A-contrario) along with the classical detection method (Subtraction analysis). The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer and objective analysis of virtual brain phantom experiments by proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians in the detection of EZ in clinical practice.
ATAC Autocuer Modeling Analysis.
1981-01-01
the analysis of the simple rectangular segmentation (1) is based on detection and estimation theory (2). This approach uses the concept of maximum ... continuous wave forms. In order to develop the principles of maximum likelihood, it is convenient to develop the principles for the "classical ... the concept of maximum likelihood is significant in that it provides the optimum performance of the detection/estimation problem. With a knowledge of
Use of power analysis to develop detectable significance criteria for sea urchin toxicity tests
Carr, R.S.; Biedenbach, J.M.
1999-01-01
When sufficient data are available, the statistical power of a test can be determined using power analysis procedures. The term “detectable significance” has been coined to refer to this criterion based on power analysis and past performance of a test. This power analysis procedure has been performed with sea urchin (Arbacia punctulata) fertilization and embryological development data from sediment porewater toxicity tests. Data from 3100 and 2295 tests for the fertilization and embryological development tests, respectively, were used to calculate the criteria and regression equations describing the power curves. Using Dunnett's test, minimum significant differences (MSDs) (β = 0.05) of 15.5% and 19% for the fertilization test, and 16.4% and 20.6% for the embryological development test, were determined for α ≤ 0.05 and α ≤ 0.01, respectively. The use of this second criterion reduces type I (false positive) errors and helps to establish a critical level of difference based on the past performance of the test.
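A minimal illustration of the underlying power calculation is sketched below; it uses a simplified two-sample approximation (statsmodels' TTestIndPower) rather than Dunnett's multiple-comparison procedure, and the standard deviation and sample size are hypothetical.

```python
# Hedged sketch: the smallest detectable difference ("detectable significance") for a
# two-group comparison, given historical variability, via statsmodels power tools.
from statsmodels.stats.power import TTestIndPower

historical_sd = 12.0        # assumed pooled SD of % fertilization from past tests (hypothetical)
n_per_group = 5             # assumed replicates per treatment (hypothetical)
alpha, power = 0.05, 0.95   # power = 1 - beta, with beta = 0.05 as in the study

solver = TTestIndPower()
# solve for the standardized effect size detectable at the given alpha, power, and n
effect = solver.solve_power(nobs1=n_per_group, alpha=alpha, power=power,
                            ratio=1.0, alternative="larger")
msd_percent = effect * historical_sd  # convert back to percentage-point units
print(f"Minimum detectable difference ~ {msd_percent:.1f} % (standardized d = {effect:.2f})")
```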
A Longitudinal Study of Adenoma Detection Rate in Gastroenterology Fellowship Training.
Gianotti, Robert J; Oza, Sveta Shah; Tapper, Elliot B; Kothari, Darshan; Sheth, Sunil G
2016-10-01
Current guidelines suggest that a gastroenterology fellow in training needs to perform 140 colonoscopies to achieve competency. Data are limited regarding adenoma detection rate (ADR) in fellowship. Our aim was to assess how fellow ADR correlates with the number of colonoscopies performed. We performed a retrospective study examining consecutive colonoscopies performed by gastroenterology fellows. Fellow ADR before and after the 140-procedure benchmark was compared with the ADR of attending-only colonoscopies performed by the physicians with whom these fellows trained. A threshold for ideal procedure count was determined using ROC analysis. We analyzed 2021 average-risk colonoscopies performed by 10 gastroenterology fellows under the supervision of an attending physician. When fellows had performed <140 colonoscopies, the ADR was 27 % compared with an ADR of 36 % when fellows had performed >140 colonoscopies under attending supervision (p = 0.02). The ADR of fellows who had performed >140 colonoscopies under attending supervision was greater than that of attending-only colonoscopies (36 vs. 25 %, p < 0.0001). A threshold of >325 (male patients) and 539 (female patients) colonoscopies was determined to be ideal for achieving adequate ADR based on ROC analysis. Our data suggest that ADR increases after fellows perform >140 colonoscopies under attending supervision, and thereafter surpasses the ADR of attending-only colonoscopies. Some of the differences may be driven by detection of small adenomas. The findings of this study suggest that a higher threshold for number of colonoscopies performed under attending supervision may be needed to achieve adequate ADR during fellowship prior to independent practice.
Yerlikaya, Seda; Campillo, Ana; Gonzalez, Iveth J
2018-03-15
Despite the increased use and worldwide distribution of malaria rapid diagnostic tests (RDTs) which distinguish between Plasmodium falciparum and non-falciparum species, little is known about their performance for detecting Plasmodium knowlesi (Pk), Plasmodium malariae (Pm), and Plasmodium ovale (Po). The objective of this review is to analyze results of published studies evaluating the diagnostic accuracy of malaria RDTs in detecting Pk, Pm and Po mono-infections. MEDLINE, EMBASE, Web of Science and CENTRAL databases were systematically searched to identify studies which reported on the performance of RDTs in detecting Pk, Pm, and Po mono-infections. Among 40 studies included in the review, three reported on Pk, eight on Pm, five on Po, one on Pk and Pm, and 23 on Pm and Po infections. In the meta-analysis, estimates of sensitivities of RDTs in detecting Pk infections ranged from 2% to 48%. Test performances for Pm and Po infections were less accurate and highly heterogeneous, mainly due to the small number of samples tested. The limited data available suggest that malaria RDTs show suboptimal performance for detecting Pk, Pm, and Po infections. New, improved RDTs, as well as appropriately designed cross-sectional studies to demonstrate their usefulness in the detection of neglected Plasmodium species, are urgently needed.
Schmidt, Jürgen; Laarousi, Rihab; Stolzmann, Wolfgang; Karrer-Gauß, Katja
2018-06-01
In this article, we examine the performance of different eye blink detection algorithms under various constraints. The goal of the present study was to evaluate the performance of an electrooculogram- and camera-based blink detection process in both manually and conditionally automated driving phases. A further comparison between alert and drowsy drivers was performed in order to evaluate the impact of drowsiness on the performance of blink detection algorithms in both driving modes. Data snippets from 14 monotonous manually driven sessions (mean 2 h 46 min) and 16 monotonous conditionally automated driven sessions (mean 2 h 45 min) were used. In addition to comparing two data-sampling frequencies for the electrooculogram measures (50 vs. 25 Hz) and four different signal-processing algorithms for the camera videos, we compared the blink detection performance of 24 reference groups. The analysis of the videos was based on very detailed definitions of eyelid closure events. The correct detection rates for the alert and manual driving phases (maximum 94%) decreased significantly in the drowsy (minus 2% or more) and conditionally automated (minus 9% or more) phases. Blinking behavior is therefore significantly impacted by drowsiness as well as by automated driving, resulting in less accurate blink detection.
Microfluidic platform for multiplexed detection in single cells and methods thereof
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Meiye; Singh, Anup K.
The present invention relates to a microfluidic device and platform configured to conduct multiplexed analysis within the device. In particular, the device allows multiple targets to be detected on a single-cell level. Also provided are methods of performing multiplexed analyses to detect one or more target nucleic acids, proteins, and post-translational modifications.
Detecting Growth Shape Misspecifications in Latent Growth Models: An Evaluation of Fit Indexes
ERIC Educational Resources Information Center
Leite, Walter L.; Stapleton, Laura M.
2011-01-01
In this study, the authors compared the likelihood ratio test and fit indexes for detection of misspecifications of growth shape in latent growth models through a simulation study and a graphical analysis. They found that the likelihood ratio test, MFI, and root mean square error of approximation performed best for detecting model misspecification…
New technologies are creating the potential for using nucleic acid sequence detection to perform routine microbiological analyses of environmental samples. Our laboratory has recently reported on the development of a method for the quantitative detection of Stachybotrys chartarum...
A method for the detection of trace levels of N,N-diethyl-m-toluamide (DEET) in water is discussed. The method utilizes an on-line preconcentration column in series with high performance liquid chromatography (HPLC) and UV photodiode array detection. DEET, a common insect repel...
Reliability Assessment for Low-cost Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Freeman, Paul Michael
Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.
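The model-based detection idea can be illustrated with a very reduced sketch: compare the measured output against a nominal discrete-time model and flag persistent residuals. The matrices and threshold are invented for the example and do not represent the dissertation's robust filter design.

```python
# Illustrative sketch only: an output residual against a nominal model of an
# actuator/airframe channel; a persistent residual above a threshold flags a fault.
import numpy as np

A = np.array([[0.98, 0.05], [0.0, 0.95]])   # nominal state matrix (hypothetical)
B = np.array([[0.0], [0.1]])                # nominal input matrix (hypothetical)
C = np.array([[1.0, 0.0]])                  # output matrix (hypothetical)

def detect_fault(u, y, threshold=0.2, window=20):
    """Return time indices where the smoothed output residual exceeds the threshold."""
    x = np.zeros((2, 1))
    residuals = []
    for k in range(len(u)):
        residuals.append(abs(y[k] - (C @ x).item()))   # mismatch w.r.t. nominal model
        x = A @ x + B * u[k]                           # propagate the nominal model
    smoothed = np.convolve(residuals, np.ones(window) / window, mode="same")
    return np.where(smoothed > threshold)[0]
```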
Applications of Graph-Theoretic Tests to Online Change Detection
2014-05-09
... assessment, crime investigation, and environmental field analysis. Our work offers a new tool for change detection that can be employed in real time in very ... this paper such MSTs and bipartite matchings. Ruth (2009) reports run times for MNBM ensembles created using Derigs' (1998) algorithm on the order of
Analysis of the impact of error detection on computer performance
NASA Technical Reports Server (NTRS)
Shin, K. C.; Lee, Y. H.
1983-01-01
Conventionally, reliability analyses either assume that a fault/error is detected immediately following its occurrence, or neglect damages caused by latent errors. Though unrealistic, this assumption was imposed in order to avoid the difficulty of determining the respective probabilities that a fault induces an error and the error is then detected in a random amount of time after its occurrence. As a remedy for this problem a model is proposed to analyze the impact of error detection on computer performance under moderate assumptions. Error latency, the time interval between occurrence and the moment of detection, is used to measure the effectiveness of a detection mechanism. This model is used to: (1) predict the probability of producing an unreliable result, and (2) estimate the loss of computation due to fault and/or error.
Analysis of UAS DAA Surveillance in Fast-Time Simulations without DAA Mitigation
NASA Technical Reports Server (NTRS)
Thipphavong, David P.; Santiago, Confesor; Isaacson, David R.; Lee, Seung Man; Refai, Mohamad Said; Snow, James William
2015-01-01
Realization of the expected proliferation of Unmanned Aircraft System (UAS) operations in the National Airspace System (NAS) depends on the development and validation of performance standards for UAS Detect and Avoid (DAA) Systems. The RTCA Special Committee 228 is charged with leading the development of draft Minimum Operational Performance Standards (MOPS) for UAS DAA Systems. NASA, as a participating member of RTCA SC-228 is committed to supporting the development and validation of draft requirements for DAA surveillance system performance. A recent study conducted using NASA's ACES (Airspace Concept Evaluation System) simulation capability begins to address questions surrounding the development of draft MOPS for DAA surveillance systems. ACES simulations were conducted to study the performance of sensor systems proposed by the SC-228 DAA Surveillance sub-group. Analysis included but was not limited to: 1) number of intruders (both IFR and VFR) detected by all sensors as a function of UAS flight time, 2) number of intruders (both IFR and VFR) detected by radar alone as a function of UAS flight time, and 3) number of VFR intruders detected by all sensors as a function of UAS flight time. The results will be used by SC-228 to inform decisions about the surveillance standards of UAS DAA systems and future requirements development and validation efforts.
A Bernoulli Gaussian Watermark for Detecting Integrity Attacks in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weerakkody, Sean; Ozel, Omur; Sinopoli, Bruno
We examine the merit of Bernoulli packet drops in actively detecting integrity attacks on control systems. The aim is to detect an adversary who delivers fake sensor measurements to a system operator in order to conceal their effect on the plant. Physical watermarks, or noisy additive Gaussian inputs, have been previously used to detect several classes of integrity attacks in control systems. In this paper, we consider the analysis and design of Gaussian physical watermarks in the presence of packet drops at the control input. On one hand, this enables analysis in a more general network setting. On the other hand, we observe that in certain cases, Bernoulli packet drops can improve detection performance relative to a purely Gaussian watermark. This motivates the joint design of a Bernoulli-Gaussian watermark which incorporates both an additive Gaussian input and a Bernoulli drop process. We characterize the effect of such a watermark on system performance as well as attack detectability in two separate design scenarios. Here, we consider a correlation detector for attack recognition. We then propose efficiently solvable optimization problems to intelligently select parameters of the Gaussian input and the Bernoulli drop process while addressing security and performance trade-offs. Finally, we provide numerical results which illustrate that a watermark with packet drops can indeed outperform a Gaussian watermark.
A study on real-time low-quality content detection on Twitter from the users' perspective.
Chen, Weiling; Yeo, Chai Kiat; Lau, Chiew Tong; Lee, Bu Sung
2017-01-01
Detection techniques of malicious content such as spam and phishing on Online Social Networks (OSN) are common with little attention paid to other types of low-quality content which actually impacts users' content browsing experience most. The aim of our work is to detect low-quality content from the users' perspective in real time. To define low-quality content comprehensibly, Expectation Maximization (EM) algorithm is first used to coarsely classify low-quality tweets into four categories. Based on this preliminary study, a survey is carefully designed to gather users' opinions on different categories of low-quality content. Both direct and indirect features including newly proposed features are identified to characterize all types of low-quality content. We then further combine word level analysis with the identified features and build a keyword blacklist dictionary to improve the detection performance. We manually label an extensive Twitter dataset of 100,000 tweets and perform low-quality content detection in real time based on the characterized significant features and word level analysis. The results of our research show that our method has a high accuracy of 0.9711 and a good F1 of 0.8379 based on a random forest classifier with real time performance in the detection of low-quality content in tweets. Our work therefore achieves a positive impact in improving user experience in browsing social media content.
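A minimal sketch of the final classification stage is given below, assuming a handful of numeric features plus a blacklist-keyword count feed a random forest; the feature names and toy labels are hypothetical stand-ins for the paper's annotated data.

```python
# Hedged sketch of the classification stage: a random forest over a few per-tweet
# features. The features and labels here are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# columns: follower/friend ratio, URL count, hashtag count, blacklist-keyword hits
X = rng.random((1000, 4))
y = (X[:, 3] > 0.7).astype(int)   # toy labels standing in for manual annotation

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean())
print("CV F1:", cross_val_score(clf, X, y, cv=5, scoring="f1").mean())
```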
Locally optimum nonlinearities for DCT watermark detection.
Briassouli, Alexia; Strintzis, Michael G
2004-12-01
The issue of copyright protection of digital multimedia data has attracted a lot of attention during the last decade. An efficient copyright protection method that has been gaining popularity is watermarking, i.e., the embedding of a signature in a digital document that can be detected only by its rightful owner. Watermarks are usually blindly detected using correlating structures, which would be optimal in the case of Gaussian data. However, in the case of DCT-domain image watermarking, the data is more heavy-tailed and the correlator is clearly suboptimal. Nonlinear receivers have been shown to be particularly well suited for the detection of weak signals in heavy-tailed noise, as they are locally optimal. This motivates the use of the Gaussian-tailed zero-memory nonlinearity, as well as the locally optimal Cauchy nonlinearity for the detection of watermarks in DCT transformed images. We analyze the performance of these schemes theoretically and compare it to that of the traditionally used Gaussian correlator, but also to the recently proposed generalized Gaussian detector, which outperforms the correlator. The theoretical analysis and the actual performance of these systems is assessed through experiments, which verify the theoretical analysis and also justify the use of nonlinear structures for watermark detection. The performance of the correlator and the nonlinear detectors in the presence of quantization is also analyzed, using results from dither theory, and also verified experimentally.
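The contrast between the plain correlator and a locally optimal nonlinearity can be sketched as follows, assuming a Cauchy model for the DCT coefficients; the embedding strength and scale parameter are illustrative, not values from the paper.

```python
# Sketch of locally optimal detection in heavy-tailed noise: for a Cauchy density,
# the locally optimal nonlinearity is g(x) = 2x / (gamma^2 + x^2) (i.e., -f'/f),
# which replaces the raw coefficient used by the correlator.
import numpy as np

def correlator(coeffs, watermark):
    return np.sum(watermark * coeffs)

def cauchy_lo_detector(coeffs, watermark, gamma):
    g = 2.0 * coeffs / (gamma ** 2 + coeffs ** 2)   # -f'/f for a Cauchy density
    return np.sum(watermark * g)

rng = np.random.default_rng(1)
watermark = rng.choice([-1.0, 1.0], size=5000)
noise = rng.standard_cauchy(5000) * 5.0             # heavy-tailed "DCT coefficients"
alpha = 0.8                                         # hypothetical embedding strength
marked = noise + alpha * watermark

for name, stat in [("correlator", correlator),
                   ("Cauchy LO", lambda c, w: cauchy_lo_detector(c, w, gamma=5.0))]:
    print(f"{name}: statistic with watermark {stat(marked, watermark):.1f}, "
          f"without {stat(noise, watermark):.1f}")
```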
Cell nuclei and cytoplasm joint segmentation using the sliding band filter.
Quelhas, Pedro; Marcuzzo, Monica; Mendonça, Ana Maria; Campilho, Aurélio
2010-08-01
Microscopy cell image analysis is a fundamental tool for biological research. In particular, multivariate fluorescence microscopy is used to observe different aspects of cells in cultures. It is still common practice to perform analysis tasks by visual inspection of individual cells which is time consuming, exhausting and prone to induce subjective bias. This makes automatic cell image analysis essential for large scale, objective studies of cell cultures. Traditionally the task of automatic cell analysis is approached through the use of image segmentation methods for extraction of cells' locations and shapes. Image segmentation, although fundamental, is neither an easy task in computer vision nor is it robust to image quality changes. This makes image segmentation for cell detection semi-automated requiring frequent tuning of parameters. We introduce a new approach for cell detection and shape estimation in multivariate images based on the sliding band filter (SBF). This filter's design makes it adequate to detect overall convex shapes and as such it performs well for cell detection. Furthermore, the parameters involved are intuitive as they are directly related to the expected cell size. Using the SBF filter we detect cells' nucleus and cytoplasm location and shapes. Based on the assumption that each cell has the same approximate shape center in both nuclei and cytoplasm fluorescence channels, we guide cytoplasm shape estimation by the nuclear detections improving performance and reducing errors. Then we validate cell detection by gathering evidence from nuclei and cytoplasm channels. Additionally, we include overlap correction and shape regularization steps which further improve the estimated cell shapes. The approach is evaluated using two datasets with different types of data: a 20 images benchmark set of simulated cell culture images, containing 1000 simulated cells; a 16 images Drosophila melanogaster Kc167 dataset containing 1255 cells, stained for DNA and actin. Both image datasets present a difficult problem due to the high variability of cell shapes and frequent cluster overlap between cells. On the Drosophila dataset our approach achieved a precision/recall of 95%/69% and 82%/90% for nuclei and cytoplasm detection respectively and an overall accuracy of 76%.
Qumseya, Bashar J; Wang, Haibo; Badie, Nicole; Uzomba, Rosemary N; Parasa, Sravanthi; White, Donna L; Wolfsen, Herbert; Sharma, Prateek; Wallace, Michael B
2013-12-01
US guidelines recommend surveillance of patients with Barrett's esophagus (BE) to detect dysplasia. BE conventionally is monitored via white-light endoscopy (WLE) and a collection of random biopsy specimens. However, this approach does not definitively or consistently detect areas of dysplasia. Advanced imaging technologies can increase the detection of dysplasia and cancer. We investigated whether these imaging technologies can increase the diagnostic yield for the detection of neoplasia in patients with BE, compared with WLE and analysis of random biopsy specimens. We performed a systematic review, using Medline and Embase, to identify relevant peer-review studies. Fourteen studies were included in the final analysis, with a total of 843 patients. Our metameter (estimate) of interest was the paired-risk difference (RD), defined as the difference in yield of the detection of dysplasia or cancer using advanced imaging vs WLE. The estimated paired-RD and 95% confidence interval (CI) were obtained using random-effects models. Heterogeneity was assessed by means of the Q statistic and the I(2) statistic. An exploratory meta-regression was performed to look for associations between the metameter and potential confounders or modifiers. Overall, advanced imaging techniques increased the diagnostic yield for detection of dysplasia or cancer by 34% (95% CI, 20%-56%; P < .0001). A subgroup analysis showed that virtual chromoendoscopy significantly increased the diagnostic yield (RD, 0.34; 95% CI, 0.14-0.56; P < .0001). The RD for chromoendoscopy was 0.35 (95% CI, 0.13-0.56; P = .0001). There was no significant difference between virtual chromoendoscopy and chromoendoscopy, based on Student t test analysis (P = .45). Based on a meta-analysis, advanced imaging techniques such as chromoendoscopy or virtual chromoendoscopy significantly increase the diagnostic yield for identification of dysplasia or cancer in patients with BE. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
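The pooling step behind such an estimate can be sketched with a DerSimonian-Laird random-effects model; the per-study risk differences and variances below are invented for illustration and are not the fourteen included studies.

```python
# Hedged sketch of random-effects pooling of paired risk differences (DerSimonian-Laird).
import numpy as np

rd = np.array([0.30, 0.42, 0.18, 0.55])        # per-study risk differences (hypothetical)
var = np.array([0.010, 0.020, 0.015, 0.030])   # per-study variances (hypothetical)

w = 1.0 / var                                   # fixed-effect weights
rd_fe = np.sum(w * rd) / np.sum(w)
q = np.sum(w * (rd - rd_fe) ** 2)               # Cochran's Q
df = len(rd) - 1
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # DL tau^2

w_re = 1.0 / (var + tau2)                       # random-effects weights
rd_re = np.sum(w_re * rd) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
i2 = max(0.0, (q - df) / q) * 100
print(f"pooled RD = {rd_re:.2f} "
      f"(95% CI {rd_re - 1.96 * se_re:.2f} to {rd_re + 1.96 * se_re:.2f}), I^2 = {i2:.0f}%")
```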
Photocleavable DNA Barcoding Antibodies for Multiplexed Protein Analysis in Single Cells.
Ullal, Adeeti V; Weissleder, Ralph
2015-01-01
We describe a DNA-barcoded antibody sensing technique for single cell protein analysis in which the barcodes are photocleaved and digitally detected without amplification steps (Ullal et al., Sci Transl Med 6:219, 2014). After photocleaving the unique ~70 mer DNA barcodes we use a fluorescent hybridization technology for detection, similar to what is commonly done for nucleic acid readouts. This protocol offers a simple method for multiplexed protein detection using 100+ antibodies and can be performed on clinical samples as well as single cells.
Analysis of selected herbicide metabolites in surface and ground water of the United States
Scribner, E.A.; Thurman, E.M.; Zimmerman, L.R.
2000-01-01
One of the primary goals of the US Geological Survey (USGS) Laboratory in Lawrence, Kansas, is to develop analytical methods for the analysis of herbicide metabolites in surface and ground water that are vital to the study of herbicide fate and degradation pathways in the environment. Methods to measure metabolite concentrations from three major classes of herbicides - triazine, chloroacetanilide and phenyl-urea - have been developed. Methods for triazine metabolite detection cover nine compounds: six compounds are detected by gas chromatography/mass spectrometry; one is detected by high-performance liquid chromatography with diode-array detection; and eight are detected by liquid chromatography/mass spectrometry. Two metabolites of the chloroacetanilide herbicides - ethane sulfonic acid and oxanilic acid - are detected by high-performance liquid chromatography with diode-array detection and liquid chromatography/mass spectrometry. Alachlor ethane sulfonic acid also has been detected by solid-phase extraction and enzyme-linked immunosorbent assay. Six phenylurea metabolites are all detected by liquid chromatography/mass spectrometry; four of the six metabolites also are detected by gas chromatography/mass spectrometry. Additionally, surveys of herbicides and their metabolites in surface water, ground water, lakes, reservoirs, and rainfall have been conducted through the USGS laboratory in Lawrence. These surveys have been useful in determining herbicide and metabolite occurrence and temporal distribution and have shown that metabolites may be useful in evaluation of non-point-source contamination. Copyright (C) 2000 Elsevier Science B.V.
NASA Technical Reports Server (NTRS)
Roman, Monserrate C.; Jones, Kathy U.; Oubre, Cherie M.; Castro, Victoria; Ott, Mark C.; Birmele, Michele; Venkateswaran, Kasthuri J.; Vaishampayan, Parag A.
2013-01-01
Current methods for microbial detection: (a) labor- and time-intensive cultivation-based approaches that can fail to detect or characterize all cells present; (b) approaches that require collection of samples on orbit and transportation back to the ground for analysis. Disadvantages of the current detection methods: (a) inability to perform quick and reliable detection on orbit; (b) lengthy sampling intervals; (c) no microbe identification.
Niu, Guanghui; Shi, Qi; Xu, Mingjun; Lai, Hongjun; Lin, Qingyu; Liu, Kunping; Duan, Yixiang
2015-10-01
In this article, a novel and alternative method of laser-induced breakdown spectroscopy (LIBS) analysis of liquid samples is proposed, which involves the removal of metal ions from a liquid to a solid substrate using a cost-efficient adsorbent, dehydrated carbon, obtained using a dehydration reaction. Using this new technique, researchers can detect trace metal ions in solutions qualitatively and quantitatively, and the drawbacks of performing liquid analysis using LIBS can be avoided because the analysis is performed on a solid surface. To achieve better performance using this technique, we considered parameters potentially influencing both adsorption performance and LIBS analysis. The calibration curves were evaluated, and the limits of detection obtained for Cu(2+), Pb(2+), and Cr(3+) were 0.77, 0.065, and 0.46 mg/L, respectively, which are better than those reported in previous studies. In addition, compared to other adsorbents, the adsorbent used in this technique is much cheaper, easier to obtain, and contains few or no elements other than C, H, and O that could result in spectral interference during analysis. We also used the recommended method to analyze spiked samples, obtaining satisfactory results. Thus, this new technique is helpful and promising for use in wastewater analysis and management.
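As an illustration of how such detection limits are typically derived from a calibration curve, the sketch below applies the common 3σ/slope criterion to made-up standards; these are not the authors' calibration data.

```python
# Sketch of a limit-of-detection estimate from a calibration curve (3*sigma/slope).
# Concentrations and intensities below are invented for illustration.
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])          # mg/L standards (hypothetical)
intensity = np.array([120, 410, 705, 1290, 3010, 5980])   # peak areas (hypothetical)

slope, intercept = np.polyfit(conc, intensity, 1)
residual_sd = np.std(intensity - (slope * conc + intercept), ddof=2)
lod = 3.0 * residual_sd / slope
print(f"slope = {slope:.1f}, LOD ~ {lod:.2f} mg/L")
```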
Mata, Gardênia Márcia Silva Campos; Martins, Evandro; Machado, Solimar Gonçalves; Pinto, Maximiliano Soares; de Carvalho, Antônio Fernandes; Vanetti, Maria Cristina Dantas
2016-01-01
The ability of pathogens to survive cheese ripening is a food-safety concern. Therefore, this study aimed to evaluate the performance of two alternative methods of analysis of Listeria during the ripening of artisanal Minas cheese. These methods were tested and compared with the conventional method: Lateral Flow System™, in cheeses produced on laboratory scale using raw milk collected from different farms and inoculated with Listeria innocua; and VIDAS(®)-LMO, in cheese samples collected from different manufacturers in Serro, Minas Gerais, Brazil. These samples were also characterized in terms of lactic acid bacteria, coliforms and physical-chemical analysis. In the inoculated samples, L. innocua was detected by the Lateral Flow System™ method with 33% false-negative results and 68% accuracy. L. innocua was only detected in the inoculated samples by the conventional method at 60 days of cheese ripening. L. monocytogenes was not detected by the conventional and the VIDAS(®)-LMO methods in cheese samples collected from different manufacturers, which precluded evaluation of the performance of this alternative method. We concluded that the conventional method provided a better recovery of L. innocua throughout cheese ripening, being able to detect L. innocua at 60 days, the aging period required by the current legislation. Copyright © 2016 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.
Granja, Rodrigo H M M; Niño, Alfredo M Montes; Zucchetti, Roberto A M; Niño, Rosario E Montes; Salerno, Alessandro G
2008-01-01
Ethopabate is frequently used in the prophylaxis and treatment of coccidiosis in poultry. Residues of this drug in food present a potential risk to consumers. A simple, rapid, and sensitive column high-performance liquid chromatographic (HPLC) method with UV detection for determination of ethopabate in poultry liver is presented. The drug is extracted with acetonitrile. After evaporation, the residue is dissolved with an acetone-hexane mixture and cleaned up by solid-phase extraction using Florisil columns. The analyte is then eluted with methanol. LC analysis is carried out on a C18 5 microm Gemini column, 15 cm x 4.6 mm. Ethopabate is quantified by means of UV detection at 270 nm. Parameters such as decision limit, detection capability, precision, recovery, ruggedness, and measurement uncertainty were calculated according to method validation guidelines provided in 2002/657/EC and ISO/IEC 17025:2005. Decision limit and detection capability were determined to be 2 and 3 microg/kg, respectively. Average recoveries from poultry samples fortified with 10, 15, and 20 microg/kg levels of ethopabate were 100-105%. A complete statistical analysis was performed on the results obtained, including an estimation of the method uncertainty. The method is to be implemented into Brazil's residue monitoring and control program for ethopabate.
The Reusable Handheld Electrolyte and Lab Technology for Humans (rHEALTH) Sensor
NASA Technical Reports Server (NTRS)
Chan, Eugene
2015-01-01
The DNA Medicine Institute has produced a reusable microfluidic device that performs rapid, low-cost cell counts and measurements of electrolytes, proteins, and other biomarkers. The rHEALTH sensor is compact and portable, and it employs cutting-edge fluorescence detection optics, innovative microfluidics, and nanostrip reagents to perform a suite of hematology, chemistry, and biomarker assays from a single drop of blood. A handful of current portable POC devices provide generalized blood analysis, but they perform only a few tests at a time. These devices also rely on disposable components and depend on diverse detection technologies to complete routine tests-all ill-suited for space travelers on extended missions. In contrast, the rHEALTH sensor integrates sample introduction, processing, and detection with a compact, resource-conscious, and efficient design. Developed to monitor astronaut health on the International Space Station and during long-term space flight, this microscale lab analysis tool also has terrestrial applications that include POC diagnostics conducted at a patient's bedside, in a doctor's office, and in a hospital.
Mass spectrometric detection of siRNA in plasma samples for doping control purposes.
Kohler, Maxie; Thomas, Andreas; Walpurgis, Katja; Schänzer, Wilhelm; Thevis, Mario
2010-10-01
Small interfering ribonucleic acid (siRNA) molecules can affect the expression of any gene by inducing the degradation of mRNA. Therefore, these molecules can be of interest for illicit performance enhancement in sports by affecting different metabolic pathways. An example of an efficient performance-enhancing gene knockdown target is the myostatin gene, which regulates muscle growth. This study was carried out to provide a tool for the mass spectrometric detection of modified and unmodified siRNA from plasma samples. The oligonucleotides are purified by centrifugal filtration and the use of an miRNA purification kit, followed by flow-injection analysis using an Exactive mass spectrometer to yield the accurate masses of the sense and antisense strands. Although chromatography and sensitive mass spectrometric analysis of oligonucleotides are still challenging, a method was developed and validated that has adequate sensitivity (limit of detection 0.25-1 nmol mL(-1)) and performance (precision 11-21%, recovery 23-67%) for typical antisense oligonucleotides currently used in clinical studies.
Li, Xueqi; Woodman, Michael; Wang, Selina C
2015-08-01
Pheophytins and pyropheophytin are degradation products of chlorophyll pigments, and their ratios can be used as a sensitive indicator of stress during the manufacturing and storage of olive oil. They increase over time depending on the storage condition and if the oil is exposed to heat treatments during the refining process. The traditional analysis method includes solvent- and time-consuming steps of solid-phase extraction followed by analysis by high-performance liquid chromatography with ultraviolet detection. We developed an improved dilute/fluorescence method where multi-step sample preparation was replaced by a simple isopropanol dilution before the high-performance liquid chromatography injection. A quaternary solvent gradient method was used to include a fourth strong solvent wash on a quaternary gradient pump, which avoided the need to premix any solvents and greatly reduced the oil residues on the column from previous analysis. This new method not only reduces analysis cost and time but shows reliability, repeatability, and improved sensitivity, especially important for low-level samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Evaluation of field performance of poplar clones using selected competition indices.
Chandler Brodie; D.S. DeBell
2004-01-01
Use of competition indices in the analysis of forestry experiments may improve detection and understanding of treatment effects, and thereby improve the application of results. In this paper, we compared the performance of eight indices in an analysis of a spacing trial of four Populus clones planted in pure and mixed clonal plots. Indices were...
Heat-Energy Analysis for Solar Receivers
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1982-01-01
The heat-energy analysis program (HEAP) solves general heat-transfer problems, with some specific features that are "custom made" for analyzing solar receivers. It can be utilized not only to predict receiver performance under varying solar flux, ambient temperature and local heat-transfer rates but also to detect locations of hotspots and metallurgical difficulties and to predict performance sensitivity of neighboring component parameters.
NASA Astrophysics Data System (ADS)
Ni, Zhengyuan; Yan, Huimin; Ni, Xuxiang; Zhang, Xiuda
2017-10-01
The multifunctional analyzer, which integrates absorbance, fluorescence, time-resolved fluorescence, and biochemical luminescence detection methods, enables efficient detection and analysis of a variety of nutrients in the human body. This article focuses on the absorbance detection and fluorescence detection systems. The two systems are modular in design and controlled by an embedded system to achieve automatic measurement according to user settings. In the optical path design, a confocal arrangement improves the optical signal acquisition capability and reduces interference. A photon counter is used for detection, and a high-performance counter module is designed to measure its output. In the experiments, neutral density filters and potassium dichromate solution were used to test the absorbance detection system, and fluorescein isothiocyanate (FITC) was used to test the fluorescence detection system. The experimental results show that the absorbance detection system has a detection range of 0-4 OD with good linearity over that range, while the fluorescence detection system has a high sensitivity, down to a concentration of 1 pmol/L.
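The absorbance computation behind such a module follows Beer-Lambert from photon counts; the sketch below assumes illustrative count values and a simple dark-count correction, not the instrument's actual calibration.

```python
# Minimal sketch: absorbance from photon-counter readings, A = -log10(I_sample / I_reference),
# after subtracting the dark count. Count values are invented for illustration.
import math

def absorbance(sample_counts, reference_counts, dark_counts=0):
    i_s = max(sample_counts - dark_counts, 1)      # guard against zero counts
    i_r = max(reference_counts - dark_counts, 1)
    return -math.log10(i_s / i_r)

print(absorbance(sample_counts=1250, reference_counts=1_000_000, dark_counts=50))  # ~2.9 OD
```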
NASA Astrophysics Data System (ADS)
Kusrini, Elisa; Subagyo; Aini Masruroh, Nur
2016-01-01
This research continues the authors' earlier work on the design of integrated performance measurement between supply chain actors and the regulator. In the previous paper, the performance measurement was designed by combining the Balanced Scorecard - Supply Chain Operations Reference - Regulator Contribution model with Data Envelopment Analysis; this model is referred to as the B-S-Rc-DEA model. The combination has the disadvantage that all performance variables carry the same weight. This paper investigates whether giving weights to the performance variables produces a performance measurement that is more sensitive in detecting performance improvement. Therefore, this paper develops the B-S-Rc-DEA model by weighting its performance variables; this model is referred to as the Scale B-S-Rc-DEA model. To illustrate the development of the model, samples from small and medium enterprises in the leather craft industry supply chain in the province of Yogyakarta, Indonesia, are used. It is found that the Scale B-S-Rc-DEA model is more sensitive in detecting performance improvement than the B-S-Rc-DEA model.
A dedicated on-line detecting system for auto air dryers
NASA Astrophysics Data System (ADS)
Shi, Chao-yu; Luo, Zai
2013-10-01
In accordance with the relevant automobile industry standard and the manufacturer's requirements, this dedicated on-line detection system was designed to address the low automation and limited detection precision of domestic auto air dryer testing. Fast automatic detection is achieved by combining computer control, mechatronics, and pneumatic technology. The system tests the performance of the pressure-regulating valve and the sealing of the auto air dryer; it processes the test data on line and supports data saving and retrieval. Experimental analysis indicates that efficient and accurate detection of auto air dryer performance is realized, with test errors of less than 3%. Moreover, we carry out a Type A evaluation of the uncertainty in the test data based on Bayesian theory, and the results show that the test uncertainties of all performance parameters are less than 0.5 kPa, which fully meets the requirements of the industrial operating site.
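For reference, a classical (GUM-style) Type A evaluation reduces to the experimental standard deviation of the mean, as sketched below with invented readings; the paper's Bayesian variant is not reproduced here.

```python
# Sketch of a classical Type A uncertainty evaluation for repeated pressure readings:
# the standard uncertainty is the experimental standard deviation of the mean.
# The readings below are invented for illustration.
import numpy as np

readings_kpa = np.array([801.2, 800.8, 801.5, 800.9, 801.1, 801.3])  # repeated tests
mean = readings_kpa.mean()
s = readings_kpa.std(ddof=1)                  # experimental standard deviation
u_a = s / np.sqrt(readings_kpa.size)          # Type A standard uncertainty of the mean
print(f"mean = {mean:.2f} kPa, u_A = {u_a:.3f} kPa")
```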
Roever, Stefan
2012-01-01
A massively parallel, low cost molecular analysis platform will dramatically change the nature of protein, molecular and genomics research, DNA sequencing, and ultimately, molecular diagnostics. An integrated circuit (IC) with 264 sensors was fabricated using standard CMOS semiconductor processing technology. Each of these sensors is individually controlled with precision analog circuitry and is capable of single molecule measurements. Under electronic and software control, the IC was used to demonstrate the feasibility of creating and detecting lipid bilayers and biological nanopores using wild type α-hemolysin. The ability to dynamically create bilayers over each of the sensors will greatly accelerate pore development and pore mutation analysis. In addition, the noise performance of the IC was measured to be 30fA(rms). With this noise performance, single base detection of DNA was demonstrated using α-hemolysin. The data shows that a single molecule, electrical detection platform using biological nanopores can be operationalized and can ultimately scale to millions of sensors. Such a massively parallel platform will revolutionize molecular analysis and will completely change the field of molecular diagnostics in the future.
Influence analysis in quantitative trait loci detection.
Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko
2014-07-01
This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods-the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
No psychological effect of color context in a low level vision task.
Pedley, Adam; Wade, Alex R
2013-01-01
A remarkable series of recent papers have shown that colour can influence performance in cognitive tasks. In particular, they suggest that viewing a participant number printed in red ink or other red ancillary stimulus elements improves performance in tasks requiring local processing and impedes performance in tasks requiring global processing whilst the reverse is true for the colour blue. The tasks in these experiments require high level cognitive processing such as analogy solving or remote association tests and the chromatic effect on local vs. global processing is presumed to involve widespread activation of the autonomic nervous system. If this is the case, we might expect to see similar effects on all local vs. global task comparisons. To test this hypothesis, we asked whether chromatic cues also influence performance in tasks involving low level visual feature integration. Subjects performed either local (contrast detection) or global (form detection) tasks on achromatic dynamic Glass pattern stimuli. Coloured instructions, target frames and fixation points were used to attempt to bias performance to different task types. Based on previous literature, we hypothesised that red cues would improve performance in the (local) contrast detection task but would impede performance in the (global) form detection task. A two-way, repeated measures, analysis of covariance (2×2 ANCOVA) with gender as a covariate, revealed no influence of colour on either task, F(1,29) = 0.289, p = 0.595, partial η (2) = 0.002. Additional analysis revealed no significant differences in only the first attempts of the tasks or in the improvement in performance between trials. We conclude that motivational processes elicited by colour perception do not influence neuronal signal processing in the early visual system, in stark contrast to their putative effects on processing in higher areas.
Chen, J H K; She, K K K; Kwong, T-C; Wong, O-Y; Siu, G K H; Leung, C-C; Chang, K-C; Tam, C-M; Ho, P-L; Cheng, V C C; Yuen, K-Y; Yam, W-C
2015-09-01
The automated high-throughput Abbott RealTime MTB real-time PCR assay has been recently launched for Mycobacterium tuberculosis complex (MTBC) clinical diagnosis. This study evaluated its performance. We first compared its diagnostic performance with the Roche Cobas TaqMan MTB assay on 214 clinical respiratory specimens. Prospective analysis of a total of 520 specimens was then performed to further evaluate the Abbott assay. The Abbott assay showed a lower limit of detection at 22.5 AFB/ml, which was more sensitive than the Cobas assay (167.5 AFB/ml). The two assays demonstrated a significant difference in diagnostic performance (McNemar's test; P = 0.0034), with the Abbott assay presenting a significantly higher area under the curve (AUC) than the Cobas assay (1.000 vs 0.880; P = 0.0002). The Abbott assay demonstrated extremely low PCR inhibition on clinical respiratory specimens. The automated Abbott assay required only very short manual handling time (0.5 h), which could help to improve laboratory management. In the prospective analysis, the overall estimates for sensitivity and specificity of the Abbott assay were both 100 % among smear-positive specimens, whereas for smear-negative specimens they were 96.7 and 96.1 %, respectively. No cross-reactivity with non-tuberculosis mycobacterial species was observed. The superiority in sensitivity of the Abbott assay for detecting MTBC in smear-negative specimens could further minimize the risk of false-negative MTBC detection. The new Abbott RealTime MTB assay has good diagnostic performance and can be a useful diagnostic tool for rapid MTBC detection in clinical laboratories.
Futia, Gregory L; Schlaepfer, Isabel R; Qamar, Lubna; Behbakht, Kian; Gibson, Emily A
2017-07-01
Detection of circulating tumor cells (CTCs) in a blood sample is limited by the sensitivity and specificity of the biomarker panel used to identify CTCs over other blood cells. In this work, we present Bayesian theory that shows how test sensitivity and specificity set the rarity of cells that a test can detect. We perform our calculation of sensitivity and specificity on our image cytometry biomarker panel by testing on pure disease-positive (D+) populations (MCF7 cells) and pure disease-negative (D-) populations (leukocytes). In this system, we performed multi-channel confocal fluorescence microscopy to image biomarkers of DNA, lipids, CD45, and cytokeratin. Using custom software, we segmented our confocal images into regions of interest consisting of individual cells and computed the image metrics of total signal, second spatial moment, spatial frequency second moment, and the product of the spatial-spatial frequency moments. We present our analysis of these 16 features. The best performing of the 16 features produced an average separation of three standard deviations between D+ and D- and an average detectable rarity of ∼1 in 200. We performed multivariable regression and feature selection to combine multiple features for increased performance and showed an average separation of seven standard deviations between the D+ and D- populations, making the average detectable rarity ∼1 in 480. Histograms and receiver operating characteristic (ROC) curves for these features and regressions are presented. We conclude that simple regression analysis holds promise to further improve the separation of rare cells in cytometry applications. © 2017 International Society for Advancement of Cytometry.
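The Bayesian argument can be made concrete with a short sketch: given a panel's sensitivity and specificity, the rarest prevalence at which a positive call still reaches a chosen positive predictive value follows directly from Bayes' theorem. The sensitivity, specificity, and target PPV below are assumptions for illustration, not the paper's measured values.

```python
# Hedged illustration of how sensitivity and specificity bound the detectable rarity.
def ppv(sensitivity, specificity, prevalence):
    tp = sensitivity * prevalence
    fp = (1.0 - specificity) * (1.0 - prevalence)
    return tp / (tp + fp)

def detectable_rarity(sensitivity, specificity, target_ppv=0.5):
    """Smallest prevalence (expressed as 1-in-N) at which PPV still reaches target_ppv."""
    fp_rate = 1.0 - specificity
    # solve ppv(prevalence) = target_ppv analytically for prevalence
    prev = target_ppv * fp_rate / (sensitivity * (1.0 - target_ppv) + target_ppv * fp_rate)
    return 1.0 / prev

print(detectable_rarity(sensitivity=0.95, specificity=0.998))   # ~1 cell in ~476
```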
Schubert, Birthe; Oberacher, Herbert
2011-06-03
In this study, the impact of solvent conditions on the performance of μLC/MS for the analysis of basic drugs was investigated. Our aim was to find experimental conditions that enable high-performance chromatographic separation, particularly at overloading conditions, paired with a minimal loss of mass spectrometric detection sensitivity. A focus was put on the evaluation of the usability of different kinds of acidic modifiers (acetic acid (HOAc), formic acid (FA), methanesulfonic acid (CH₃SO₃H), trifluoroacetic acid (TFA), pentafluoropropionic acid (PFPA), and heptafluorobutyric acid (HFBA)). The test mixture consisted of eleven compounds (bunitrolol, caffeine, cocaine, codeine, diazepam, doxepin, haloperidol, 3,4-methylenedioxyamphetamine, morphine, nicotine, and zolpidem). Best chromatographic performance was obtained with the perfluorinated acids. Particularly, 0.010-0.050% HFBA (v/v) was found to represent a good compromise in terms of chromatographic performance and mass spectrometric detection sensitivity. Compared to HOAc, on average a 50% reduction of the peak widths was observed. The use of HFBA was particularly advantageous for polar compounds such as nicotine; only with such a hydrophobic ion-pairing reagent was chromatographic retention of nicotine observed. Best mass spectrometric performance was obtained with HOAc and FA. Loss of detection sensitivity induced by HFBA, however, was moderate and ranged from 0 to 40%, which clearly demonstrates that improved chromatographic performance is able to compensate to a large extent for the negative effect of reduced ionization efficiency on detection sensitivity. Applications of μLC/MS for the qualitative and quantitative analysis of clinical and forensic toxicological samples are presented. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.
2016-03-01
Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed at assessing the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using leave-one-out cross-validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
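The first, box-whisker approach boils down to flagging feature values outside the whiskers of the univariate distribution. A minimal sketch of that interquartile-range rule is given below; the feature values and the 1.5 whisker factor are illustrative assumptions rather than the paper's actual data or tuning.

```python
import numpy as np

def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (box-whisker rule)."""
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return (values < lo) | (values > hi)

# Hypothetical usage: one feature value per segmented subject.
feature = np.array([0.81, 0.79, 0.83, 0.78, 0.80, 0.31, 0.82, 0.44])
print(np.where(iqr_outliers(feature))[0])   # indices of likely segmentation failures
```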
NASA Astrophysics Data System (ADS)
Tian, Xiange; Xi Gu, James; Rehab, Ibrahim; Abdalla, Gaballa M.; Gu, Fengshou; Ball, A. D.
2018-02-01
Envelope analysis is a widely used method for rolling element bearing fault detection. To obtain high detection accuracy, it is critical to determine an optimal frequency narrowband for the envelope demodulation. However, many of the schemes which are used for the narrowband selection, such as the Kurtogram, can produce poor detection results because they are sensitive to random noise and aperiodic impulses which normally occur in practical applications. To achieve the purposes of denoising and frequency band optimisation, this paper proposes a novel modulation signal bispectrum (MSB) based robust detector for bearing fault detection. Because of its inherent noise suppression capability, the MSB allows effective suppression of both stationary random noise and discrete aperiodic noise. The high magnitude features that result from the use of the MSB also enhance the modulation effects of a bearing fault and can be used to provide optimal frequency bands for fault detection. The Kurtogram is generally accepted as a powerful means of selecting the most appropriate frequency band for envelope analysis, and as such it has been used as the benchmark comparator for performance evaluation in this paper. Both simulated and experimental data analysis results show that the proposed method produces more accurate and robust detection results than Kurtogram based approaches for common bearing faults under a range of representative scenarios.
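The classical baseline the MSB detector is compared against, envelope analysis over a fixed frequency band, can be sketched briefly: band-pass around a resonance, take the Hilbert envelope, and look for the fault frequency in the envelope spectrum. The band, fault frequency, and simulated impulse train below are assumptions for illustration; the MSB computation itself is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope_spectrum(x, fs, band):
    """Band-pass the vibration signal, take the Hilbert envelope, and return
    the envelope spectrum, where bearing fault frequencies show up as peaks."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, x)))
    env -= env.mean()
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    return freqs, np.abs(np.fft.rfft(env)) / len(env)

# Hypothetical outer-race fault: ~90 Hz impacts exciting a 3 kHz resonance.
fs = 20_000
t = np.arange(0, 1, 1 / fs)
impulses = np.zeros_like(t)
impulses[::int(fs / 90)] = 1.0
ring = np.sin(2 * np.pi * 3000 * t[:200]) * np.exp(-800 * t[:200])
x = np.convolve(impulses, ring, mode="same") + 0.05 * np.random.randn(len(t))

freqs, spec = envelope_spectrum(x, fs, band=(2500, 3500))
mask = freqs > 10
print(freqs[mask][np.argmax(spec[mask])])   # should land near the ~90 Hz fault rate
```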
Bank-Firm Credit Network in Japan: An Analysis of a Bipartite Network
Marotta, Luca; Miccichè, Salvatore; Fujiwara, Yoshi; Iyetomi, Hiroshi; Aoyama, Hideaki; Gallegati, Mauro; Mantegna, Rosario N.
2015-01-01
We investigate the networked nature of the Japanese credit market. Our investigation is performed with tools of network science. In our investigation we perform community detection with an algorithm that identifies communities composed of both banks and firms. We show that the communities obtained by directly working on the bipartite network carry information about the networked nature of the Japanese credit market. Our analysis is performed for each calendar year during the time period from 1980 to 2011. To investigate the time evolution of the networked structure of the credit market we introduce a new statistical method to track the time evolution of detected communities. We then characterize the time evolution of communities by detecting for each time evolving set of communities the over-expression of attributes of firms and banks. Specifically, we consider as attributes the economic sector and the geographical location of firms and the type of banks. In our 32-year-long analysis we detect a persistence of the over-expression of attributes of communities of banks and firms together with a slow dynamic of changes from some specific attributes to new ones. Our empirical observations show that the credit market in Japan is a networked market where the type of banks, geographical location of firms and banks, and economic sector of the firm play a role in shaping the credit relationships between banks and firms. PMID:25933413
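Community detection directly on a bipartite bank-firm graph can be illustrated with a generic modularity-based algorithm; this is a stand-in for, not the specific algorithm used in, the study, and the toy edge list is invented for illustration.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy bank-firm credit network; node names and edges are invented placeholders.
edges = [("bank_A", "firm_1"), ("bank_A", "firm_2"), ("bank_B", "firm_2"),
         ("bank_B", "firm_3"), ("bank_C", "firm_4"), ("bank_C", "firm_5")]
G = nx.Graph()
G.add_nodes_from({b for b, _ in edges}, bipartite=0)   # banks
G.add_nodes_from({f for _, f in edges}, bipartite=1)   # firms
G.add_edges_from(edges)

# Communities found directly on the bipartite graph mix both node types.
for i, community in enumerate(greedy_modularity_communities(G)):
    print(i, sorted(community))
```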
Detecting Solar-like Oscillations in Red Giants with Deep Learning
NASA Astrophysics Data System (ADS)
Hon, Marc; Stello, Dennis; Zinn, Joel C.
2018-05-01
Time-resolved photometry of tens of thousands of red giant stars from space missions like Kepler and K2 has created the need for automated asteroseismic analysis methods. The first and most fundamental step in such analysis is to identify which stars show oscillations. It is critical that this step be performed with no, or little, detection bias, particularly when performing subsequent ensemble analyses that aim to compare the properties of observed stellar populations with those from galactic models. However, an efficient, automated solution to this initial detection step still has not been found, meaning that expert visual inspection of data from each star is required to obtain the highest level of detections. Hence, to mimic how an expert eye analyzes the data, we use supervised deep learning to not only detect oscillations in red giants, but also to predict the location of the frequency at maximum power, νmax, by observing features in 2D images of power spectra. By training on Kepler data, we benchmark our deep-learning classifier against K2 data that are given detections by the expert eye, achieving a detection accuracy of 98% on K2 Campaign 6 stars and a detection accuracy of 99% on K2 Campaign 3 stars. We further find that the estimated uncertainty of our deep-learning-based νmax predictions is about 5%. This is comparable to human-level performance using visual inspection. When examining outliers, we find that the deep-learning results are more likely to provide robust νmax estimates than the classical model-fitting method.
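The detection step amounts to a binary image classifier over 2D renderings of the power spectra. The sketch below shows a small convolutional network of the kind such a classifier could use; the layer sizes, 128x128 input shape, and training call are assumptions, not the architecture reported in the paper.

```python
import tensorflow as tf

def build_classifier(input_shape=(128, 128, 1)):
    """Small 2D CNN of the kind an oscillation-detection classifier could use;
    layer sizes and input shape are assumptions, not the published architecture."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # P(oscillations present)
    ])

model = build_classifier()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(spectrum_images, labels, epochs=10, validation_split=0.2)
```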
NASA Technical Reports Server (NTRS)
Price, P. B.
1976-01-01
The design, experimental testing, and calibration (error analysis) of a high resolution Cerenkov-scintillation detector is presented. The detector is capable of detecting iron isotopes and heavy ions of cosmic rays, and of performing direct measurements of individual neighboring isotopes at charge resolution 26. It utilizes Lexan™ sheets, and has been used in flight packages of balloons and on the Skylab. The detector will be able to provide more information on violent astrophysical processes, such as thermonuclear reactions on neutron stars. Ground support and display equipment which are to be used in conjunction with the detector are also discussed.
High-Performance Liquid Chromatography (HPLC)-Based Detection and Quantitation of Cellular c-di-GMP.
Petrova, Olga E; Sauer, Karin
2017-01-01
The modulation of c-di-GMP levels plays a vital role in the regulation of various processes in a wide array of bacterial species. Thus, investigation of c-di-GMP regulation requires reliable methods for the assessment of c-di-GMP levels and turnover. Reversed-phase high-performance liquid chromatography (RP-HPLC) analysis has become a commonly used approach to accomplish these goals. The following describes the extraction and HPLC-based detection and quantification of c-di-GMP from Pseudomonas aeruginosa samples, a procedure that is amenable to modifications for the analysis of c-di-GMP in other bacterial species.
Validation of voxel-based morphometry (VBM) based on MRI
NASA Astrophysics Data System (ADS)
Yang, Xueyu; Chen, Kewei; Guo, Xiaojuan; Yao, Li
2007-03-01
Voxel-based morphometry (VBM) is an automated and objective image analysis technique for detecting differences in regional concentration or volume of brain tissue composition based on structural magnetic resonance (MR) images. VBM has been used widely to evaluate brain morphometric differences between different populations, but until now there has been no evaluation system for its validation. In this study, a quantitative and objective evaluation system was established in order to assess VBM performance. We recruited twenty normal volunteers (10 males and 10 females, age range 20-26 years, mean age 22.6 years). Firstly, several focal lesions (hippocampus, frontal lobe, anterior cingulate, back of hippocampus, back of anterior cingulate) were simulated in selected brain regions using real MRI data. Secondly, optimized VBM was performed to detect structural differences between groups. Thirdly, one-way ANOVA and post-hoc tests were used to assess the accuracy and sensitivity of VBM analysis. The results revealed that VBM was a good detection tool in the majority of brain regions, even in regions such as the hippocampus that have been controversial in VBM studies. Generally speaking, the more severe the focal lesion, the better the VBM performance. However, the size of the focal lesion had little effect on VBM analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
VanderNoot, Victoria A.; Haroldsen, Brent L.; Renzi, Ronald F.
2010-03-01
In a multiyear research agreement with Tenix Investments Pty. Ltd., Sandia has been developing field deployable technologies for detection of biotoxins in water supply systems. The unattended water sensor or UWS employs microfluidic chip based gel electrophoresis for monitoring biological analytes in a small integrated sensor platform. This instrument collects, prepares, and analyzes water samples in an automated manner. Sample analysis is done using the μChemLab™ analysis module. This report uses analysis results of two datasets collected using the UWS to estimate performance of the device. The first dataset is made up of samples containing ricin at varying concentrations and is used for assessing instrument response and detection probability. The second dataset is comprised of analyses of water samples collected at a water utility which are used to assess the false positive probability. The analyses of the two sets are used to estimate the Receiver Operating Characteristic or ROC curves for the device at one set of operational and detection algorithm parameters. For these parameters and based on a statistical estimate, the ricin probability of detection is about 0.9 at a concentration of 5 nM for a false positive probability of 1 × 10⁻⁶.
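Estimating an ROC curve from such paired datasets (spiked samples for the detection probability, utility-water samples for the false-positive probability) can be sketched as follows; the Gaussian detector responses, sample counts, and the 10⁻³ operating point are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical detector responses: utility-water blanks vs. ricin-spiked samples.
blank = np.random.normal(0.0, 1.0, 1000)
spiked = np.random.normal(3.0, 1.0, 200)
scores = np.concatenate([blank, spiked])
labels = np.concatenate([np.zeros_like(blank), np.ones_like(spiked)])

fpr, tpr, thresholds = roc_curve(labels, scores)
target_fpr = 1e-3        # illustrative; a 1e-6 point needs a statistical estimate
print(tpr[np.searchsorted(fpr, target_fpr)])   # probability of detection at that FPR
```

Reaching a 10⁻⁶ false-positive point from a finite blank set requires fitting a distribution to the blank responses rather than reading the empirical curve, which is why the report relies on a statistical estimate.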
NASA Astrophysics Data System (ADS)
Morschheuser, Lena; Wessels, Hauke; Pille, Christina; Fischer, Judith; Hünniger, Tim; Fischer, Markus; Paschke-Kratzin, Angelika; Rohn, Sascha
2016-05-01
Protein analysis using high-performance thin-layer chromatography (HPTLC) is not commonly used but can complement traditional electrophoretic and mass spectrometric approaches in a unique way. Due to various detection protocols and possibilities for hyphenation, HPTLC protein analysis is a promising alternative for e.g., investigating posttranslational modifications. This study exemplarily focused on the investigation of lysozyme, an enzyme which is occurring in eggs and technologically added to foods and beverages such as wine. The detection of lysozyme is mandatory, as it might trigger allergenic reactions in sensitive individuals. To underline the advantages of HPTLC in protein analysis, the development of innovative, highly specific staining protocols leads to improved sensitivity for protein detection on HPTLC plates in comparison to universal protein derivatization reagents. This study aimed at developing a detection methodology for HPTLC separated proteins using aptamers. Due to their affinity and specificity towards a wide range of targets, an aptamer based staining procedure on HPTLC (HPTLC-aptastaining) will enable manifold analytical possibilities. Besides the proof of its applicability for the very first time, (i) aptamer-based staining of proteins is applicable on different stationary phase materials and (ii) furthermore, it can be used as an approach for a semi-quantitative estimation of protein concentrations.
Polyp detection rate may predict adenoma detection rate: a meta-analysis.
Niv, Yaron
2018-03-01
Adenoma detection rate (ADR) is defined as the number of colonoscopies with at least one adenoma, expressed as a proportion of the total number of colonoscopies performed. Recently, an application of a conversion factor to estimate the ADR from the polyp detection rate (PDR) was described. In this meta-analysis, we examined the correlation between ADR and PDR in the published studies and assessed the relative ratio of these rates for a better and more accurate estimation. English medical literature searches were performed for 'PDR' AND 'ADR'. A meta-analysis was carried out for papers that fulfilled the inclusion criteria using comprehensive meta-analysis software. Twenty-five studies and 42 sets of data, including 31 623 patients, from nine countries published till 31 August 2017, were found. A funnel plot did not indicate a significant publication bias. The relative ratio for ADR calculated from PDR was 0.688 (95% confidence interval: 0.680-0.695, P < 0.0001) in the fixed-effect model of the meta-analysis. Heterogeneity (the proportion of inconsistency in individual studies) between studies was significant, with Q = 492.753, d.f.(Q) = 41, P < 0.0001, and I² = 91.679%. We found that the ratio of 0.688 can be used to calculate ADR from PDR for the individual endoscopist or for a group of endoscopists before receiving the formal results from the pathology department.
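A minimal sketch of how such a conversion factor can be pooled across studies with a fixed-effect, inverse-variance model is shown below; the per-study ratios and standard errors are invented for illustration and are not the values from the included studies.

```python
import numpy as np

# Sketch of inverse-variance fixed-effect pooling of per-study ADR/PDR ratios
# on the log scale; the study values below are made up for illustration.
ratios = np.array([0.70, 0.66, 0.71, 0.68])      # ADR/PDR per study
se_log = np.array([0.03, 0.05, 0.04, 0.02])      # standard errors of log(ratio)

w = 1.0 / se_log**2
pooled_log = np.sum(w * np.log(ratios)) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp(pooled_log + np.array([-1.96, 1.96]) * pooled_se)
print(np.exp(pooled_log), ci)                     # pooled ratio and its 95% CI
```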
Li, Bingsheng; Gan, Aihua; Chen, Xiaolong; Wang, Xinying; He, Weifeng; Zhang, Xiaohui; Huang, Renxiang; Zhou, Shuzhu; Song, Xiaoxiao; Xu, Angao
2016-01-01
DNA hypermethylation in blood is becoming an attractive candidate marker for colorectal cancer (CRC) detection. To assess the diagnostic accuracy of blood hypermethylation markers for CRC in different clinical settings, we conducted a meta-analysis of published reports. Of 485 publications obtained in the initial literature search, 39 studies were included in the meta-analysis. Hypermethylation markers in peripheral blood showed a high degree of accuracy for the detection of CRC. The summary sensitivity was 0.62 [95% confidence interval (CI), 0.56–0.67] and specificity was 0.91 (95% CI, 0.89–0.93). Subgroup analysis showed significantly greater sensitivity for the methylated Septin 9 gene (SEPT9) subgroup (0.75; 95% CI, 0.67–0.81) than for the non-methylated SEPT9 subgroup (0.58; 95% CI, 0.52–0.64). Sensitivity and specificity were not affected significantly by target gene number, CRC staging, study region, or methylation analysis method. These findings show that hypermethylation markers in blood are highly sensitive and specific for CRC detection, with methylated SEPT9 being particularly robust. The diagnostic performance of hypermethylation markers, which have varied across different studies, can be improved by marker optimization. Future research should examine variation in diagnostic accuracy according to non-neoplastic factors. PMID:27158984
Detection of burst suppression patterns in EEG using recurrence rate.
Liang, Zhenhu; Wang, Yinghua; Ren, Yongshao; Li, Duan; Voss, Logan; Sleigh, Jamie; Li, Xiaoli
2014-01-01
Burst suppression is a unique electroencephalogram (EEG) pattern commonly seen in cases of severely reduced brain activity such as overdose of general anesthesia. It is important to detect burst suppression reliably during the administration of anesthetic or sedative agents, especially for cerebral-protective treatments in various neurosurgical diseases. This study investigates recurrence plot (RP) analysis for the detection of the burst suppression pattern (BSP) in EEG. The RP analysis is applied to EEG data containing BSPs collected from 14 patients. Firstly, we obtain the best selection of parameters for RP analysis. Then, the recurrence rate (RR), determinism (DET), and entropy (ENTR) are calculated, and RR is selected as the best BSP index using one-way analysis of variance (ANOVA) and multiple comparison tests. Finally, the performance of RR analysis is compared with spectral analysis, bispectral analysis, approximate entropy, and the nonlinear energy operator (NLEO). ANOVA and multiple comparison tests showed that the RR could detect BSP and that it was superior to other measures with the highest sensitivity of suppression detection (96.49%, P = 0.03). Tracking BSPs is essential for clinical monitoring in critically ill and anesthetized patients. The proposed RR may provide an effective burst suppression detector for developing new patient monitoring systems.
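The recurrence rate itself is simple to compute: embed the EEG epoch in delay coordinates and count the fraction of point pairs closer than a radius. The sketch below is a minimal illustration with illustrative embedding parameters and synthetic suppression/burst epochs, not the parameter selection or patient data used in the study.

```python
import numpy as np

def recurrence_rate(x, dim=3, delay=1, radius=0.1):
    """Recurrence rate of a time-delay embedded signal: the fraction of point
    pairs whose distance is below the (absolute) radius. Parameters are
    illustrative, not the values selected in the study."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < radius).mean()

# Hypothetical EEG epochs (arbitrary units): suppression is flat and low
# amplitude, so its points recur often; burst activity recurs rarely.
suppression = 0.05 * np.random.randn(500)
burst = np.sin(np.linspace(0, 30, 500)) + 0.3 * np.random.randn(500)
print(recurrence_rate(suppression), recurrence_rate(burst))
```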
An Android malware detection system based on machine learning
NASA Astrophysics Data System (ADS)
Wen, Long; Yu, Haiyang
2017-08-01
The Android smartphone, with its open source character and excellent performance, has attracted many users. However, the convenience of the Android platform has also motivated the development of malware. Traditional signature-based detection methods are unable to detect unknown applications. The article proposes a machine learning-based lightweight system that is capable of identifying malware on Android devices. In this system, we extract features based on static analysis and dynamic analysis; a new feature selection approach based on principal component analysis (PCA) and Relief is then presented to reduce the dimensionality of the features. After that, a classification model is constructed with a support vector machine (SVM). Experimental results show that our system provides an effective method in Android malware detection.
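A scikit-learn pipeline captures the shape of this workflow: scale the extracted features, reduce dimensionality, and classify with an SVM. PCA stands in here for the paper's combined PCA-and-Relief selection step, and the feature matrix, labels, and component count are invented placeholders.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical feature matrix: rows are APKs, columns are static/dynamic features
# (permissions, API calls, ...); labels mark malware (1) vs. benign (0).
X = np.random.rand(200, 300)
y = np.random.randint(0, 2, 200)

clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5).mean())
```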
Dou, Maowei; Lopez, Juan; Rios, Misael; Garcia, Oscar; Xiao, Chuan; Eastman, Michael
2016-01-01
A cost-effective battery-powered spectrophotometric system (BASS) was developed for quantitative point-of-care (POC) analysis on a microfluidic chip. By using methylene blue as a model analyte, we first compared the performance of the BASS with a commercial spectrophotometric system, and further applied the BASS for loop-mediated isothermal amplification (LAMP) detection and subsequent quantitative nucleic acid analysis which exhibited a comparable limit of detection to that of Nanodrop. Compared to the commercial spectrophotometric system, our spectrophotometric system is lower-cost, consumes less reagents, and has a higher detection sensitivity. Most importantly, it does not rely on external power supplies. All these features make our spectrophotometric system highly suitable for a variety of POC analyses, such as field detection. PMID:27143408
Covariance of lucky images: performance analysis
NASA Astrophysics Data System (ADS)
Cagigal, Manuel P.; Valle, Pedro J.; Cagigas, Miguel A.; Villó-Pérez, Isidro; Colodro-Conde, Carlos; Ginski, C.; Mugrauer, M.; Seeliger, M.
2017-01-01
The covariance of ground-based lucky images is a robust and easy-to-use algorithm that allows us to detect faint companions surrounding a host star. In this paper, we analyse the influence of the number of processed frames, the frame quality, the atmospheric conditions and the detection noise on companion detectability. This analysis has been carried out using both experimental and computer-simulated imaging data. Although the technique allows the detection of faint companions, the camera detection noise and the use of a limited number of frames limit the minimum detectable companion intensity to around 1000 times fainter than that of the host star when placed at an angular distance corresponding to the first few Airy rings. The reachable contrast could be even larger when detecting companions with the assistance of an adaptive optics system.
Sniper detection using infrared camera: technical possibilities and limitations
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.
2010-04-01
The paper discusses technical possibilities to build an effective system for sniper detection using infrared cameras. Descriptions of phenomena which make it possible to detect sniper activities in infrared spectra as well as analysis of physical limitations were performed. Cooled and uncooled detectors were considered. Three phases of sniper activities were taken into consideration: before, during and after the shot. On the basis of experimental data the parameters defining the target were determined which are essential in assessing the capability of infrared camera to detect sniper activity. A sniper body and muzzle flash were analyzed as targets. The simulation of detection ranges was done for the assumed scenario of sniper detection task. The infrared sniper detection system was discussed, capable of fulfilling the requirements. The discussion of the results of analysis and simulations was finally presented.
Dong, Ming; Zheng, Chuantao; Miao, Shuzhuo; Zhang, Yu; Du, Qiaoling; Wang, Yiding; Tittel, Frank K
2017-09-27
A multi-gas sensor system was developed that uses a single broadband light source and multiple carbon monoxide (CO), carbon dioxide (CO₂) and methane (CH₄) pyroelectric detectors by use of the time division multiplexing (TDM) technique. A stepper motor-based rotating system and a single-reflection spherical optical mirror were designed and adopted to realize and enhance multi-gas detection. Detailed measurements under static detection mode (without rotation) and dynamic mode (with rotation) were performed to study the performance of the sensor system for the three gas species. Effects of the motor rotation period on sensor performance were also investigated, and a rotation speed of 0.4π rad/s was required to obtain a stable sensing performance, corresponding to a detection period of ~10 s to realize one round of detection. Based on an Allan deviation analysis, the 1σ detection limits under static operation are 2.96, 4.54 and 2.84 parts per million in volume (ppmv) for CO, CO₂ and CH₄, respectively, and the 1σ detection limits under dynamic operation are 8.83, 8.69 and 10.29 ppmv for the three gas species, respectively. The reported sensor has potential applications in various fields requiring CO, CO₂ and CH₄ detection such as in coal mines.
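Allan deviation analysis of a sensor's concentration time series is straightforward to sketch: average the readings over increasing bin lengths and take half the mean squared difference of successive bin averages. The non-overlapping version below is a generic illustration; the sampling rate, averaging times, and synthetic readout are assumptions, not the sensor's actual data.

```python
import numpy as np

def allan_deviation(y, taus, fs):
    """Non-overlapping Allan deviation of a concentration time series `y`
    sampled at `fs` Hz, for averaging times `taus` (seconds)."""
    devs = []
    for tau in taus:
        m = int(round(tau * fs))            # samples per averaging bin
        n_bins = len(y) // m
        means = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(means) ** 2)
        devs.append(np.sqrt(avar))
    return np.array(devs)

# Hypothetical sensor readout: 1 Hz CO concentrations with white noise.
y = 5.0 + 3.0 * np.random.randn(3600)
print(allan_deviation(y, taus=[1, 10, 100], fs=1.0))
```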
miRNAs as biomarkers for diagnosis of heart failure
Yan, Hualin; Ma, Fan; Zhang, Yi; Wang, Chuan; Qiu, Dajian; Zhou, Kaiyu; Hua, Yimin; Li, Yifei
2017-01-01
Abstract Background: With the rapid development of molecular biology, microRNAs (miRNAs) have emerged as playing a role in both cardiac development and pathological processes. Thus, we conducted this meta-analysis to assess the role of circulating miRNAs as biomarkers for detecting heart failure. Methods: We searched PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, and the World Health Organization clinical trials registry center to identify relevant studies up to August 2016. We performed meta-analysis in a fixed/random-effect model using Meta-disc 1.4. We used STATA 14.0 to estimate the publication bias and meta-regression. We also used SPSS 17.0 to evaluate variance between several groups. Information on true positives, false positives, false negatives, and true negatives, as well as study quality, was extracted. Results: We used results from 10 articles to analyze the pooled accuracy. The overall performance of total mixed miRNAs (TmiRs) detection was: pooled sensitivity, 0.74 (95% confidence interval [CI], 0.72 to 0.75); pooled specificity, 0.69 (95% CI, 0.67 to 0.71); and area under the summary receiver operating characteristic curve (SROC), 0.7991. The miRNA-423-5p (miR-423-5p) detection was: pooled sensitivity, 0.81 (95% CI, 0.76 to 0.85); pooled specificity, 0.67 (95% CI, 0.61 to 0.73); and SROC, 0.8600. However, for the same patient population, we extracted the BNP data for detecting heart failure and performed a meta-analysis, obtaining an acceptable SROC of 0.9291. In the variance analysis, the diagnostic performance of miR-423-5p showed significant advantages over the other pooled results. However, the combination of miRNAs and BNP could increase the accuracy of detecting heart failure. Unfortunately, there was no dramatic advantage of miR-423-5p compared to the BNP protocol. Conclusion: Despite interstudy variability, the performance test of miRNAs for detecting heart failure revealed that miR-423-5p demonstrated the potential to be a biomarker. However, other miRNAs were not able to provide enough evidence on promising diagnostic value for heart failure based on the current data. Moreover, the combination of miRNAs and BNP could serve as a better detection method. Nevertheless, BNP was still the most convincing biomarker for this disease. PMID:28562533
Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard
2013-09-06
Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample-scarcity or due to duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
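As a concrete illustration of one of the three approaches, the rank product statistic can be computed in a few lines: rank the fold changes within each replicate, take the geometric mean of the ranks per feature, and judge significance against permuted ranks. The sketch below is a basic version that does not include the missing-value handling the study adds, and the simulated fold-change matrix is purely illustrative.

```python
import numpy as np

def rank_product(fold_changes, n_perm=100):
    """Basic rank product for an up-regulation test on a features x replicates
    matrix of fold changes; p-values are estimated from permuted ranks."""
    n, k = fold_changes.shape
    # Rank 1 = strongest up-regulation within each replicate.
    ranks = np.argsort(np.argsort(-fold_changes, axis=0), axis=0) + 1
    rp = np.exp(np.mean(np.log(ranks), axis=1))        # geometric mean of ranks

    null_rps = []
    for _ in range(n_perm):
        perm = np.column_stack([np.random.permutation(ranks[:, j]) for j in range(k)])
        null_rps.append(np.exp(np.mean(np.log(perm), axis=1)))
    null_rps = np.sort(np.concatenate(null_rps))
    pvals = np.searchsorted(null_rps, rp, side="right") / len(null_rps)
    return rp, pvals

fc = np.random.randn(1000, 3)
fc[:20] += 2.0                                          # 20 truly up-regulated features
rp, p = rank_product(fc)
print(np.argsort(rp)[:10])                              # smallest RP = most significant
```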
Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness
NASA Astrophysics Data System (ADS)
Hardy, Tyler J.; Cain, Stephen C.
2016-05-01
The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel shifted Point Spread Functions (PSFs). A MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in a MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternate hypothesis prior probabilities. These probabilities are then implemented into a MHT algorithm, and the algorithm is then tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shull, D.
This report documents the initial feasibility tests performed using a commercial acoustic emission instrument for the purpose of detecting beetles in Department of Energy 9975 shipping packages. The device selected for this testing was a commercial handheld instrument and probe developed for the detection of termites, weevils, beetles and other insect infestations in wooden structures, trees, plants and soil. The results of two rounds of testing are presented. The first tests were performed by the vendor using only the hand-held instrument’s indications and real-time operator analysis of the audio signal content. The second tests included hands-free positioning of the instrument probe and post-collection analysis of the recorded audio signal content including audio background comparisons. The test results indicate that the system is promising for detecting the presence of drugstore beetles, however, additional work would be needed to improve the ease of detection and to automate the signal processing to eliminate the need for human interpretation. Mechanisms for hands-free positioning of the probe and audio background discrimination are also necessary for reliable detection and to reduce potential operator dose in radiation environments.
SWAN - Detection of explosives by means of fast neutron activation analysis
NASA Astrophysics Data System (ADS)
Gierlik, M.; Borsuk, S.; Guzik, Z.; Iwanowska, J.; Kaźmierczak, Ł.; Korolczuk, S.; Kozłowski, T.; Krakowski, T.; Marcinkowski, R.; Swiderski, L.; Szeptycka, M.; Szewiński, J.; Urban, A.
2016-10-01
In this work we report on SWAN, an experimental, portable device for explosives detection. The device was created as part of the EU Structural Funds Project "Accelerators & Detectors" (POIG.01.01.02-14-012/08-00), with the goal of increasing the beneficiary's expertise and competencies in the field of neutron activation analysis. Previous experience and budget limitations led toward a less advanced design based on fast neutron interactions and unsophisticated data analysis, with the emphasis on the latest gamma detection and spectrometry solutions. The final device has been designed as a portable, fast neutron activation analyzer, with the software optimized for detection of carbon, nitrogen and oxygen. SWAN's performance in the role of an explosives detector is elaborated in this paper. We demonstrate that the unique features offered by neutron activation analysis might not be impressive enough when confronted with the practical demands and expectations of a generic homeland security customer.
A new approach for SSVEP detection using PARAFAC and canonical correlation analysis.
Tello, Richard; Pouryazdian, Saeed; Ferreira, Andre; Beheshti, Soosan; Krishnan, Sridhar; Bastos, Teodiano
2015-01-01
This paper presents a new way for automatic detection of SSVEPs through correlation analysis between tensor models. 3-way EEG tensor of channel × frequency × time is decomposed into constituting factor matrices using PARAFAC model. PARAFAC analysis of EEG tensor enables us to decompose multichannel EEG into constituting temporal, spectral and spatial signatures. SSVEPs characterized with localized spectral and spatial signatures are then detected exploiting a correlation analysis between extracted signatures of the EEG tensor and the corresponding simulated signatures of all target SSVEP signals. The SSVEP that has the highest correlation is selected as the intended target. Two flickers blinking at 8 and 13 Hz were used as visual stimuli and the detection was performed based on data packets of 1 second without overlapping. Five subjects participated in the experiments and the highest classification rate of 83.34% was achieved, leading to the Information Transfer Rate (ITR) of 21.01 bits/min.
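A widely used, simpler relative of the paper's signature-correlation step is the canonical correlation analysis (CCA) detector, which correlates the multichannel EEG directly with sine/cosine references at each candidate stimulation frequency. It is shown here as a baseline sketch rather than the PARAFAC-based method itself, and the sampling rate, channel count, and synthetic 13 Hz response are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def ssvep_target(eeg, fs, freqs, n_harmonics=2):
    """Baseline CCA-based SSVEP detection: the stimulation frequency whose
    sine/cosine reference set correlates best with the multichannel EEG wins.
    `eeg` is samples x channels."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in freqs:
        ref = np.column_stack([func(2 * np.pi * f * h * t)
                               for h in range(1, n_harmonics + 1)
                               for func in (np.sin, np.cos)])
        u, v = CCA(n_components=1).fit_transform(eeg, ref)
        scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
    return freqs[int(np.argmax(scores))], scores

# Hypothetical 1 s segment at 256 Hz, 8 channels, with a 13 Hz response embedded.
fs, n = 256, 256
t = np.arange(n) / fs
eeg = 0.5 * np.random.randn(n, 8)
eeg[:, :3] += np.sin(2 * np.pi * 13 * t)[:, None]
print(ssvep_target(eeg, fs, freqs=np.array([8.0, 13.0])))
```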
NASA Technical Reports Server (NTRS)
Lee, Seung Man; Park, Chunki; Cone, Andrew Clayton; Thipphavong, David P.; Santiago, Confesor
2016-01-01
This presentation contains the analysis results of NAS-wide fast-time simulations with UAS and VFR traffic for a single day, carried out to evaluate the performance of Detect-and-Avoid (DAA) alerting and guidance systems. The purpose of this study was to help refine and validate MOPS alerting and guidance requirements. In this study, we generated plots of all performance metrics that are specified by the RTCA SC-228 Minimum Operational Performance Standards (MOPS): 1) to evaluate the sensitivity of the performance metrics of each DAA alert type (Preventive, Corrective, and Warning alerts) to the alerting parameters, and 2) to evaluate the effect of sensor uncertainty on DAA alerting and guidance performance.
Matsudate, Yoshihiro; Naruto, Takuya; Hayashi, Yumiko; Minami, Mitsuyoshi; Tohyama, Mikiko; Yokota, Kenji; Yamada, Daisuke; Imoto, Issei; Kubo, Yoshiaki
2017-06-01
Nevoid basal cell carcinoma syndrome (NBCCS) is an autosomal dominant disorder mainly caused by heterozygous mutations of PTCH1. In addition to characteristic clinical features, detection of a mutation in causative genes is reliable for the diagnosis of NBCCS; however, no mutations have been identified in some patients using conventional methods. To improve the method for the molecular diagnosis of NBCCS, we performed targeted exome sequencing (TES) analysis using a multi-gene panel, including PTCH1, PTCH2, SUFU, and other sonic hedgehog signaling pathway-related genes, based on next-generation sequencing (NGS) technology in 8 cases in whom possible causative mutations were not detected by previously performed conventional analysis and in 2 recent cases of NBCCS. Subsequent analysis of gross deletions within or around PTCH1 detected by TES was performed using chromosomal microarray (CMA). Through TES analysis, specific single nucleotide variants or small indels of PTCH1 causing inferred amino acid changes were identified in 2 novel cases and 2 undiagnosed cases, whereas gross deletions within or around PTCH1, validated by CMA, were found in 3 undiagnosed cases. However, no mutations were detected even by TES in 3 cases. Among the 3 cases with gross deletions of PTCH1, deletions containing the entire PTCH1 and additional neighboring genes were detected in 2 cases, one of which exhibited atypical clinical features, such as severe mental retardation, likely associated with genes located within the 4.3 Mb deleted region. TES-based simultaneous evaluation of sequence and copy number status in all targeted coding exons by NGS is likely to be more useful for the molecular diagnosis of NBCCS than conventional methods. CMA is recommended as a subsequent analysis for validation and detailed mapping of deleted regions, which may explain the atypical clinical features of NBCCS cases. Copyright © 2017 Japanese Society for Investigative Dermatology. Published by Elsevier B.V. All rights reserved.
"Dip-and-read" paper-based analytical devices using distance-based detection with color screening.
Yamada, Kentaro; Citterio, Daniel; Henry, Charles S
2018-05-15
An improved paper-based analytical device (PAD) using color screening to enhance device performance is described. Current detection methods for PADs relying on the distance-based signalling motif can be slow due to the assay time being limited by capillary flow rates that wick fluid through the detection zone. For traditional distance-based detection motifs, analysis can take up to 45 min for a channel length of 5 cm. By using a color screening method, quantification with a distance-based PAD can be achieved in minutes through a "dip-and-read" approach. A colorimetric indicator line deposited onto a paper substrate using inkjet-printing undergoes a concentration-dependent colorimetric response for a given analyte. This color intensity-based response has been converted to a distance-based signal by overlaying a color filter with a continuous color intensity gradient matching the color of the developed indicator line. As a proof-of-concept, Ni quantification in welding fume was performed as a model assay. The results of multiple independent user testing gave mean absolute percentage error and average relative standard deviations of 10.5% and 11.2% respectively, which were an improvement over analysis based on simple visual color comparison with a read guide (12.2%, 14.9%). In addition to the analytical performance comparison, an interference study and a shelf life investigation were performed to further demonstrate practical utility. The developed system demonstrates an alternative detection approach for distance-based PADs enabling fast (∼10 min), quantitative, and straightforward assays.
Hair analysis for long-term monitoring of buprenorphine intake in opiate withdrawal.
Pirro, Valentina; Fusari, Ivana; Di Corcia, Daniele; Gerace, Enrico; De Vivo, Enrico; Salomone, Alberto; Vincenti, Marco
2014-12-01
Buprenorphine (BUP) is a psychoactive pharmaceutical drug largely used to treat opiate addiction. Short-term therapeutic monitoring is supported by toxicological analysis of blood and urine samples, whereas long-term monitoring by means of hair analysis is rarely used. The aim of this work was to develop and validate a highly sensitive ultrahigh-performance liquid chromatography tandem mass spectrometry method to detect BUP and norbuprenorphine (NBUP) in head hair. The interindividual correlation between the oral dosage of BUP and the head hair concentration was investigated. Furthermore, an intra-individual study by means of segmental analysis was performed on subjects with variable maintenance dosage. Hair samples from a population of 79 patients in treatment for opiate addiction were analyzed. The validated ultrahigh-performance liquid chromatography tandem mass spectrometry protocol yielded limits of detection and quantification of 0.6 and 2.2 pg/mg for BUP and 5.0 and 17 pg/mg for NBUP, respectively. Validation criteria were satisfied, assuring selective analyte identification, high detection capability, and precise and accurate quantification. Significant positive correlation was found between constant oral BUP dosage (1-32 mg/d) and the summed head hair concentrations of BUP and NBUP. Nevertheless, substantial interindividual variability limits the chance of predicting the oral dosage taken by each subject from the measured concentrations in head hair. In contrast, strong correlation was observed in the results of the intra-individual segmental analysis, which proved reliable for detecting oral dosage variations during therapy. Remarkably, all hair samples yielded BUP concentrations higher than 10 pg/mg, even when the lowest dosage was administered. Thus, these results support the selection of 10 pg/mg as a cutoff value.
Levman, Jacob E D; Gallego-Ortiz, Cristina; Warner, Ellen; Causer, Petrina; Martel, Anne L
2016-02-01
Magnetic resonance imaging (MRI)-enabled cancer screening has been shown to be a highly sensitive method for the early detection of breast cancer. Computer-aided detection systems have the potential to improve the screening process by standardizing radiologists to a high level of diagnostic accuracy. This retrospective study was approved by the institutional review board of Sunnybrook Health Sciences Centre. This study compares the performance of a proposed method for computer-aided detection (based on the second-order spatial derivative of the relative signal intensity) with the signal enhancement ratio (SER) on MRI-based breast screening examinations. Comparison is performed using receiver operating characteristic (ROC) curve analysis as well as free-response receiver operating characteristic (FROC) curve analysis. A modified computer-aided detection system combining the proposed approach with the SER method is also presented. The proposed method provides improvements in the rates of false positive markings over the SER method in the detection of breast cancer (as assessed by FROC analysis). The modified computer-aided detection system that incorporates both the proposed method and the SER method yields ROC results equal to that produced by SER while simultaneously providing improvements over the SER method in terms of false positives per noncancerous exam. The proposed method for identifying malignancies outperforms the SER method in terms of false positives on a challenging dataset containing many small lesions and may play a useful role in breast cancer screening by MRI as part of a computer-aided detection system.
Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N
2017-10-01
Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective and number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection-tool. The spot detection-output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.
Train integrity detection risk analysis based on PRISM
NASA Astrophysics Data System (ADS)
Wen, Yuan
2018-04-01
GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.
NASA Astrophysics Data System (ADS)
Louchard, Eric; Farm, Brian; Acker, Andrew
2008-04-01
BAE Systems Sensor Systems Identification & Surveillance (IS) has developed, under contract with the Office of Naval Research, a multispectral airborne sensor system and processing algorithms capable of detecting mine-like objects in the surf zone and land mines in the beach zone. BAE Systems has used this system in a blind test at a test range established by the Naval Surface Warfare Center - Panama City Division (NSWC-PCD) at Eglin Air Force Base. The airborne and ground subsystems used in this test are described, with graphical illustrations of the detection algorithms. We report on the performance of the system configured to operate with a human operator analyzing data on a ground station. A subsurface (underwater bottom proud mine in the surf zone and moored mine in shallow water) mine detection capability is demonstrated in the surf zone. Surface float detection and proud land mine detection capability is also demonstrated. Our analysis shows that this BAE Systems-developed multispectral airborne sensor provides a robust technical foundation for a viable system for mine counter-measures, and would be a valuable asset for use prior to an amphibious assault.
Detection of unmanned aerial vehicles using a visible camera system.
Hu, Shuowen; Goldman, Geoffrey H; Borel-Donohue, Christoph C
2017-01-20
Unmanned aerial vehicles (UAVs) flown by adversaries are an emerging asymmetric threat to homeland security and the military. To help address this threat, we developed and tested a computationally efficient UAV detection algorithm consisting of horizon finding, motion feature extraction, blob analysis, and coherence analysis. We compare the performance of this algorithm against two variants, one using the difference image intensity as the motion features and another using higher-order moments. The proposed algorithm and its variants are tested using field test data of a group 3 UAV acquired with a panoramic video camera in the visible spectrum. The performance of the algorithms was evaluated using receiver operating characteristic curves. The results show that the proposed approach had the best performance compared to the two algorithmic variants.
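The motion-feature and blob-analysis stages can be sketched with frame differencing and contour extraction in OpenCV (the snippet assumes the OpenCV 4 findContours signature); horizon finding and coherence analysis are omitted, and the synthetic frame pair is an invented placeholder rather than the field-test imagery.

```python
import cv2
import numpy as np

def detect_moving_blobs(prev_gray, curr_gray, min_area=20):
    """Frame differencing followed by contour-based blob analysis; a sketch of
    the motion-feature and blob stages only (no horizon finding or coherence)."""
    diff = cv2.absdiff(curr_gray, prev_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

# Hypothetical pair of consecutive grayscale frames with one small bright mover.
prev = np.zeros((480, 640), np.uint8)
curr = prev.copy()
cv2.circle(curr, (320, 100), 6, 255, -1)
print(detect_moving_blobs(prev, curr))
```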
Vulnerability detection using data-flow graphs and SMT solvers
2016-10-31
concerns. The framework is modular and pipelined to allow scalable analysis on distributed systems. Our vulnerability detection framework employs machine...Design We designed the framework to be modular to enable flexible reuse and extendibility. In its current form, our framework performs the following
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
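The peak-heuristic idea, differentiating the acceleration to obtain jerk and picking prominent peaks as candidate events, can be sketched as follows; the thresholds, sampling rate, and synthetic impact signal are illustrative assumptions rather than the tuned values or data from the study.

```python
import numpy as np
from scipy.signal import find_peaks

def gait_events_from_accel(accel, fs, min_interval=0.4):
    """Differentiate acceleration to get jerk, then take prominent jerk peaks
    as candidate heel-strike events. Thresholds are illustrative only."""
    jerk = np.gradient(accel) * fs
    peaks, _ = find_peaks(np.abs(jerk),
                          height=3 * np.std(jerk),
                          distance=int(min_interval * fs))
    return peaks / fs                                   # event times in seconds

# Hypothetical vertical acceleration with a sharp impact every ~1 s of walking.
fs = 100
t = np.arange(0, 10, 1 / fs)
accel = 0.1 * np.random.randn(len(t))
accel[(t % 1.0) < 0.02] += 5.0
print(gait_events_from_accel(accel, fs))
```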
NASA Technical Reports Server (NTRS)
Wincheski, Buzz; Williams, Phillip; Simpson, John
2007-01-01
The use of eddy current techniques for the detection of outer diameter damage in tubing and many complex aerospace structures often requires the use of an inner diameter probe due to a lack of access to the outside of the part. In small bore structures the probe size and orientation are constrained by the inner diameter of the part, complicating the optimization of the inspection technique. Detection of flaws through a significant remaining wall thickness becomes limited not only by the standard depth of penetration, but also geometrical aspects of the probe. Recently, an orthogonal eddy current probe was developed for detection of such flaws in Space Shuttle Primary Reaction Control System (PRCS) Thrusters. In this case, the detection of deeply buried stress corrosion cracking by an inner diameter eddy current probe was sought. Probe optimization was performed based upon the limiting spatial dimensions, flaw orientation, and required detection sensitivity. Analysis of the probe/flaw interaction was performed through the use of finite and boundary element modeling techniques. Experimental data for the flaw detection capabilities, including a probability of detection study, will be presented along with the simulation data. The results of this work have led to the successful deployment of an inspection system for the detection of stress corrosion cracking in Space Shuttle Primary Reaction Control System (PRCS) Thrusters.
Ferry, Barbara; Gifu, Elena-Patricia; Sandu, Ioana; Denoroy, Luc; Parrot, Sandrine
2014-03-01
Electrochemical methods are very often used to detect catecholamine and indolamine neurotransmitters separated by conventional reverse-phase high performance liquid chromatography (HPLC). The present paper presents the development of a chromatographic method to detect monoamines present in low-volume brain dialysis samples using a capillary column filled with sub-2μm particles. Several parameters (repeatability, linearity, accuracy, limit of detection) for this new ultrahigh performance liquid chromatography (UHPLC) method with electrochemical detection were examined after optimization of the analytical conditions. Noradrenaline, adrenaline, serotonin, dopamine and its metabolite 3-methoxytyramine were separated in 1μL of injected sample volume; they were detected above concentrations of 0.5-1nmol/L, with 2.1-9.5% accuracy and intra-assay repeatability equal to or less than 6%. The final method was applied to very low volume dialysates from rat brain containing monoamine traces. The study demonstrates that capillary UHPLC with electrochemical detection is suitable for monitoring dialysate monoamines collected at high sampling rate. Copyright © 2014 Elsevier B.V. All rights reserved.
An Automated Directed Spectral Search Methodology for Small Target Detection
NASA Astrophysics Data System (ADS)
Grossman, Stanley I.
Much of the current effort in remote sensing tackles macro-level problems such as determining the extent of wheat in a field, the general health of vegetation, or the extent of mineral deposits in an area. However, for many of the remaining remote sensing challenges being studied currently, such as border protection, drug smuggling, treaty verification, and the war on terror, most targets are very small in nature - a vehicle or even a person. While in typical macro-level problems the object of interest (e.g., vegetation) is known to be in the scene, for small target detection problems it is usually not known whether the desired small target even exists in the scene, never mind finding it in abundance. The ability to find specific small targets, such as vehicles, typifies this problem. Complicating matters, the growing number of available sensors is generating mountains of imagery that outstrip analysts' ability to review them visually. This work presents the important factors influencing spectral exploitation using multispectral data and suggests a different approach to small target detection. The methodology of directed search is presented, including the use of scene-modeled spectral libraries, various search algorithms, and traditional statistical and ROC curve analysis. The work suggests a new metric to calibrate analysis, labeled the analytic sweet spot, as well as an estimation method for identifying the sweet-spot threshold for an image. It also suggests a new visualization aid for highlighting the target in its entirety, called nearest neighbor inflation (NNI). It brings these together to propose that these additions to the target detection arena allow for the construction of a fully automated target detection scheme. This dissertation next details experiments to support the hypothesis that the optimum detection threshold is the analytic sweet spot and that the estimation method adequately predicts it. Experimental results and analysis are presented for the proposed directed search techniques of spectral-image-based small target detection. It offers evidence of the functionality of the NNI visualization and also provides evidence that the increased spectral dimensionality of the 8-band Worldview-2 datasets provides noteworthy improvement in results over traditional 4-band multispectral datasets. The final experiment presents the results from a prototype fully automated target detection scheme in support of the overarching premise. This work establishes the analytic sweet spot as the optimum threshold, defined as the point where the detection error rate curves -- false detections vs. missed detections -- cross. At this point the errors are minimized while the detection rate is maximized. It then demonstrates that taking the first moment of the histogram of calculated target detection values, from a detection search with the test threshold set arbitrarily high, estimates the analytic sweet spot for that image. It also demonstrates that directed search techniques -- when utilized with appropriate scene-specific modeled signatures and atmospheric compensations -- perform at least as well as in-scene search techniques 88% of the time and grossly under-perform only 11% of the time; the in-scene search performs as well or better only 50% of the time. It further demonstrates the clear advantage increased multispectral dimensionality brings to detection searches, improving performance in 50% of the cases while performing at least as well 72% of the time.
Lastly, it presents evidence that a fully automated prototype performs as anticipated, laying the groundwork for further research into fully automated processes for small target detection.
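A minimal sketch of the sweet-spot estimation rule described above: the estimated threshold is the first moment (mean) of the histogram of per-pixel detection scores collected with a deliberately permissive test threshold. The detector, the score distribution, the bin count, and the function name are illustrative assumptions, not the dissertation's actual settings.

```python
import numpy as np

def estimate_sweet_spot(detection_scores):
    """Estimate the 'analytic sweet spot' threshold as the first moment
    (mean) of the histogram of detection scores, following the estimation
    rule described in the abstract. Hypothetical helper."""
    scores = np.asarray(detection_scores, dtype=float)
    counts, edges = np.histogram(scores, bins=256)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # First moment of the histogram == count-weighted mean of the bin centers.
    return np.sum(centers * counts) / np.sum(counts)

# Example: scores from a matched-filter style detector run with a very
# permissive (high) test threshold so nearly every pixel contributes.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.2, 0.05, 50_000),   # background pixels
                         rng.normal(0.7, 0.10, 50)])       # rare target pixels
print(f"estimated sweet-spot threshold: {estimate_sweet_spot(scores):.3f}")
```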
2004-02-01
UNCLASSIFIED − Conducted experiments to determine the usability of general-purpose anomaly detection algorithms to monitor a large, complex military ... reaction and detection modules to perform tailored analysis sequences to monitor environmental conditions, health hazards and physiological states ... scalability of lab-proven anomaly detection techniques for intrusion detection in real-world, high-volume environments. (Narrative excerpt, FY 2003.)
[Analysis of Arsenic Compounds in Blood and Urine by HPLC-ICP-MS].
Lin, L; Zhang, S J; Xu, W C; Luo, R X; Ma, D; Shen, M
2018-02-01
To establish an analysis method for the detection of 6 arsenic compounds [AsC, AsB, As(III), DMA, MMA and As(V)] in blood and urine by high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS), and to apply it to real cases. Triton was used to disrupt the cells, EDTA·2Na·2H2O was then used to complex the arsenic compounds in the cells, and sonication and protein precipitation with acetonitrile were performed for sample pretreatment. With a mobile phase consisting of ammonium carbonate and ultrapure water, gradient elution was performed to separate the arsenic compounds in the samples, which were analysed by ICP-MS after separation on a Hamilton PRP-X100 column. The limits of detection in blood were 1.66-10 ng/mL, while the lower limits of quantitation in blood ranged from 5 to 30 ng/mL. The limits of detection in urine were 0.5-10 ng/mL, while the lower limits of quantitation in urine were 5-30 ng/mL. The relative standard deviations of the inter-day and intra-day precisions were less than 10%. This method has been successfully applied to 3 cases. This study has established an analysis method for detecting 6 common arsenic compounds in blood and urine, which can be used to detect arsenic compounds in the blood and urine from arsenic poisoning cases as well as from patients under arsenic treatment. Copyright© by the Editorial Department of Journal of Forensic Medicine.
Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy
NASA Astrophysics Data System (ADS)
Mehrubeoglu, Mehrube; McLauchlan, Lifford
2010-08-01
In this paper, diabetic retinopathy is chosen as a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters optimized for minimum false-positive detection in the original and enlarged retinal images. The error analysis demonstrates the advantages as well as the shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
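To make the enlargement step concrete, the sketch below duplicates each pixel along both axes with NumPy; the duplication factor and the toy patch are arbitrary, and the paper's subsequent filtering and classification stages are not shown.

```python
import numpy as np

def pixel_duplicate(image, factor=2):
    """Enlarge an image by duplicating each pixel `factor` times along both
    axes (nearest-neighbour style), as an alternative to interpolation."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# A 3x3 toy patch enlarged 2x: every pixel becomes a 2x2 block, so small
# structures keep their original grey levels (no new values are invented).
patch = np.array([[10, 20, 30],
                  [40, 50, 60],
                  [70, 80, 90]], dtype=np.uint8)
print(pixel_duplicate(patch, 2))
```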
Mutation detection in the human HSP70B′ gene by denaturing high-performance liquid chromatography
Hecker, Karl H.; Asea, Alexzander; Kobayashi, Kaoru; Green, Stacy; Tang, Dan; Calderwood, Stuart K.
2000-01-01
Variances, particularly single nucleotide polymorphisms (SNPs), in the genomic sequence of individuals are the primary key to understanding gene function as it relates to differences in the susceptibility to disease, environmental influences, and therapy. In this report, the HSP70B′ gene is the target sequence for mutation detection in biopsy samples from human prostate cancer patients undergoing combined hyperthermia and radiation therapy at the Dana-Farber Cancer Institute, using temperature-modulated heteroduplex analysis (TMHA). The underlying principles of TMHA for mutation detection using DHPLC technology are discussed. The procedures involved in amplicon design for mutation analysis by DHPLC are detailed. The melting behavior of the complete coding sequence of the target gene is characterized using WAVEMAKER™ software. Four overlapping amplicons, which span the complete coding region of the HSP70B′ gene, amenable to mutation detection by DHPLC were identified based on the software-predicted melting profile of the target sequence. TMHA was performed on PCR products of individual amplicons of the HSP70B′ gene on the WAVE® Nucleic Acid Fragment Analysis System. The criteria for mutation calling by comparing wild-type and mutant chromatographic patterns are discussed. PMID:11189446
Krecsák, László; Micsik, Tamás; Kiszler, Gábor; Krenács, Tibor; Szabó, Dániel; Jónás, Viktor; Császár, Gergely; Czuni, László; Gurzó, Péter; Ficsor, Levente; Molnár, Béla
2011-01-18
The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole-slide digitalization supported by dedicated software tools allows quantization of the image objects (e.g. cell membranes, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens provided strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. The effectiveness of two connected semi-automated image analysis applications (the NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14) for determination of ER and PR status in formalin-fixed paraffin-embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First, the detection algorithm was calibrated to the scores provided by an independent assessor (a pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean number of 195 cells. Each cell was manually marked and scored according to the Allred system, combining frequency and intensity scores. The performance of the calibrated algorithm was tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986), up from slight or moderate agreement at the start of the study with the un-calibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa provided almost perfect agreement (κ = 0.981) between the two scoring schemes. The NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14 proved to be a reliable image analysis tool for pathologists testing ER and PR status in breast cancer.
Droplet-Based Segregation and Extraction of Concentrated Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buie, C R; Buckley, P; Hamilton, J
2007-02-23
Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.
Holmström, Oscar; Linder, Nina; Ngasala, Billy; Mårtensson, Andreas; Linder, Ewert; Lundin, Mikael; Moilanen, Hannu; Suutala, Antti; Diwan, Vinod; Lundin, Johan
2017-06-01
Microscopy remains the gold standard in the diagnosis of neglected tropical diseases. As resource-limited, rural areas often lack laboratory equipment and trained personnel, new diagnostic techniques are needed. Low-cost, point-of-care imaging devices show potential in the diagnosis of these diseases. Novel digital image analysis algorithms can be utilized to automate sample analysis. Evaluation of the imaging performance of a miniature digital microscopy scanner for the diagnosis of soil-transmitted helminths and Schistosoma haematobium, and training of a deep learning-based image analysis algorithm for automated detection of soil-transmitted helminths in the captured images. A total of 13 iodine-stained stool samples containing Ascaris lumbricoides, Trichuris trichiura and hookworm eggs and 4 urine samples containing Schistosoma haematobium were digitized using a reference whole slide-scanner and the mobile microscopy scanner. Parasites in the images were identified by visual examination and by analysis with a deep learning-based image analysis algorithm in the stool samples. Results were compared between the digital and visual analysis of the images showing helminth eggs. Parasite identification by visual analysis of digital slides captured with the mobile microscope was feasible for all analyzed parasites. Although the spatial resolution of the reference slide-scanner is higher, the resolution of the mobile microscope is sufficient for reliable identification and classification of all parasites studied. Digital image analysis of stool sample images captured with the mobile microscope showed high sensitivity for detection of all helminths studied (range of sensitivity = 83.3-100%) in the test set (n = 217) of manually labeled helminth eggs. In this proof-of-concept study, the imaging performance of a mobile, digital microscope was sufficient for visual detection of soil-transmitted helminths and Schistosoma haematobium. Furthermore, we show that deep learning-based image analysis can be utilized for the automated detection and classification of helminths in the captured images.
An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers
Sun, Kewen; Jin, Tian; Yang, Dongkai
2015-01-01
In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner–Ville distribution (RSPWVD) is proposed for interference detection in Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, a two-dimensional low-pass smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time the reassignment method is adopted to improve the TF concentration properties of the auto-terms of the signal components. The proposed interference detection method is evaluated by experiments on GPS L1 signals in interference scenarios and compared to state-of-the-art interference detection approaches. The analysis results show that the proposed technique effectively overcomes the cross-term problem while preserving good TF localization properties, and that it enhances the interference detection performance of GNSS receivers, particularly in jamming environments. PMID:25905704
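For orientation, a plain pseudo Wigner-Ville distribution (without the smoothing and reassignment steps of the RSPWVD) can be computed in a few lines of NumPy/SciPy; it exhibits exactly the cross-terms that the reassigned smoothed variant is designed to suppress. The window choice, its length, and the test signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert, get_window

def pseudo_wvd(x, win_len=127):
    """Pseudo Wigner-Ville distribution of a real 1-D signal (plain PWVD only;
    the paper's time-direction smoothing and reassignment are not included)."""
    z = hilbert(np.asarray(x, dtype=float))       # analytic signal
    N = len(z)
    L = win_len // 2
    w = get_window("hamming", win_len)            # lag window of the pseudo-WVD
    tfr = np.zeros((win_len, N))
    for n in range(N):
        taumax = min(n, N - 1 - n, L)
        tau = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(win_len, dtype=complex)
        # Place lag 0 at index 0 (negative lags wrap to the end) so the FFT is real.
        kernel[tau % win_len] = w[tau + L] * z[n + tau] * np.conj(z[n - tau])
        tfr[:, n] = 2.0 * np.real(np.fft.fft(kernel))
    return tfr

# Chirp-plus-tone test signal: the plain PWVD shows a cross-term midway between
# the two components, which is what the RSPWVD is designed to remove.
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
sig = np.cos(2 * np.pi * (50 * t + 100 * t**2)) + np.cos(2 * np.pi * 300 * t)
print(pseudo_wvd(sig).shape)
```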
Prediction of topographic and bathymetric measurement performance of airborne low-SNR lidar systems
NASA Astrophysics Data System (ADS)
Cossio, Tristan
Low signal-to-noise ratio (LSNR) lidar (light detection and ranging) is an alternative paradigm to traditional lidar based on the detection of return signals at the single photoelectron level. The objective of this work was to predict low altitude (600 m) LSNR lidar system performance with regards to elevation measurement and target detection capability in topographic (dry land) and bathymetric (shallow water) scenarios. A modular numerical sensor model has been developed to provide data for further analysis due to the dearth of operational low altitude LSNR lidar systems. This simulator tool is described in detail, with consideration given to atmospheric effects, surface conditions, and the effects of laser phenomenology. Measurement performance analysis of the simulated topographic data showed results comparable to commercially available lidar systems, with a standard deviation of less than 12 cm for calculated elevation values. Bathymetric results, although dependent largely on water turbidity, were indicative of meter-scale horizontal data spacing for sea depths less than 5 m. The high prevalence of noise in LSNR lidar data introduces significant difficulties in data analysis. Novel algorithms to reduce noise are described, with particular focus on their integration into an end-to-end target detection classifier for both dry and submerged targets (cube blocks, 0.5 m to 1.0 m on a side). The key characteristic exploited to discriminate signal and noise is the temporal coherence of signal events versus the random distribution of noise events. Target detection performance over dry earth was observed to be robust, reliably detecting over 90% of targets with a minimal false alarm rate. Comparable results were observed in waters of high clarity, where the investigated system was generally able to detect more than 70% of targets to a depth of 5 m. The results of the study show that CATS, the University of Florida's LSNR lidar prototype, is capable of high fidelity (decimeter-scale) coverage of the topographic zone with limited applicability to shallow waters less than 5 m deep. To increase the spatial-temporal contrast between signal and noise events, laser pulse rate is the optimal system characteristic to improve in future LSNR lidar units.
Validation of an automated seizure detection algorithm for term neonates
Mathieson, Sean R.; Stevenson, Nathan J.; Low, Evonne; Marnane, William P.; Rennie, Janet M.; Temko, Andrey; Lightbody, Gordon; Boylan, Geraldine B.
2016-01-01
Objective The objective of this study was to validate the performance of a seizure detection algorithm (SDA) developed by our group, on previously unseen, prolonged, unedited EEG recordings from 70 babies from 2 centres. Methods EEGs of 70 babies (35 seizure, 35 non-seizure) were annotated for seizures by experts as the gold standard. The SDA was tested on the EEGs at a range of sensitivity settings. Annotations from the expert and SDA were compared using event and epoch based metrics. The effect of seizure duration on SDA performance was also analysed. Results Between sensitivity settings of 0.5 and 0.3, the algorithm achieved seizure detection rates of 52.6–75.0%, with false detection (FD) rates of 0.04–0.36 FD/h for event based analysis, which was deemed to be acceptable in a clinical environment. Time based comparison of expert and SDA annotations using Cohen’s Kappa Index revealed a best performing SDA threshold of 0.4 (Kappa 0.630). The SDA showed improved detection performance with longer seizures. Conclusion The SDA achieved promising performance and warrants further testing in a live clinical evaluation. Significance The SDA has the potential to improve seizure detection and provide a robust tool for comparing treatment regimens. PMID:26055336
Hautvast, Gilion L T F; Salton, Carol J; Chuang, Michael L; Breeuwer, Marcel; O'Donnell, Christopher J; Manning, Warren J
2012-05-01
Quantitative analysis of short-axis functional cardiac magnetic resonance images can be performed using automatic contour detection methods. The resulting myocardial contours must be reviewed and possibly corrected, which can be time-consuming, particularly when performed across all cardiac phases. We quantified the impact of manual contour corrections on both analysis time and quantitative measurements obtained from left ventricular short-axis cine images acquired from 1555 participants of the Framingham Heart Study Offspring cohort using computer-aided contour detection methods. The total analysis time for a single case was 7.6 ± 1.7 min for an average of 221 ± 36 myocardial contours per participant. This included 4.8 ± 1.6 min for manual contour correction of 2% of all automatically detected endocardial contours and 8% of all automatically detected epicardial contours. However, the impact of these corrections on global left ventricular parameters was limited, introducing differences of 0.4 ± 4.1 mL for end-diastolic volume, -0.3 ± 2.9 mL for end-systolic volume, 0.7 ± 3.1 mL for stroke volume, and 0.3 ± 1.8% for ejection fraction. We conclude that left ventricular functional parameters can be obtained in under 5 min from short-axis functional cardiac magnetic resonance images using automatic contour detection methods. Manual correction more than doubles analysis time, with minimal impact on left ventricular volumes and ejection fraction. Copyright © 2011 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Ajadi, O. A.; Meyer, F. J.; Tello, M.
2015-12-01
This research presents a promising new method for the detection and tracking of oil spills from Synthetic Aperture Radar (SAR) data. The method presented here combines a number of advanced image processing techniques in order to overcome some common performance limitations of SAR-based oil spill detection. Principal among these limitations are: (1) the radar cross section of the ocean surface strongly depends on wind and wave activity and is therefore highly variable; (2) the radar cross section of oil-covered waters is often indistinguishable from other dark ocean features such as low-wind areas or oil lookalikes, leading to ambiguities in oil spill detection. In this paper, we introduce two novel image analysis techniques to largely mitigate these limitations: Lipschitz regularity (LR) and wavelet transforms. LR is an image texture parameter akin to the slope of the local power spectrum. We show that the LR parameter is much less sensitive to variations of wind and waves than the original image amplitude, lending itself well to normalizing image content. Beyond its benefit for image normalization, we also show that the LR transform enhances the contrast between oil-covered and oil-free ocean surfaces and therefore improves overall spill detection performance. To calculate LR, the SAR images are decomposed using a two-dimensional continuous wavelet transform (2D-CWT) and then transformed into Hölder space to measure LR. Finally, we demonstrate that the implementation of wavelet transforms provides additional benefits related to the adaptive reduction of speckle noise. We show how LR and CWT are integrated into our image analysis workflow for application to oil spill detection. To describe the performance of this approach under controlled conditions, we applied our method to simulated SAR data of wind-driven oceans containing oil spills of various properties. We also show applications to several real-life oil spill scenarios using a series of L-band ALOS PALSAR images and X-band TerraSAR-X images acquired during the Deepwater Horizon spill in the Gulf of Mexico in 2010. From our analysis, we conclude that LR and CWT have distinct advantages in oil spill detection and lead to high-performance spill mapping results.
Trojanowicz, Marek; Kołacińska, Kamila; Grate, Jay W
2018-06-01
The safety and security of nuclear power plant operations depend on the application of the most appropriate techniques and methods of chemical analysis, where modern flow analysis methods prevail. Nevertheless, the current status of the development of these methods is more limited than might be expected based on their genuine advantages. The main aim of this paper is to review the automated flow analysis procedures developed with various detection methods for the nuclear energy industry. The flow analysis methods for the determination of radionuclides that have been reported to date are primarily focused on environmental applications. The benefits of applying flow methods in both the monitoring of nuclear wastes and the process analysis of the primary-circuit coolants of light water nuclear reactors are also discussed. The application of either continuous flow analysis (CFA) or injection methods (FIA, SIA) with β-radiometric detection shortens the analysis time and improves the precision of determination owing to the mechanization of certain time-consuming sample-processing operations. Compared to radiometric detection, mass spectrometry (MS) detection enables multicomponent analyses as well as the determination of transuranic isotopes with much better limits of detection. Copyright © 2018 Elsevier B.V. All rights reserved.
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis has been developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is being performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
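The coherence analysis mentioned above can be illustrated with SciPy's magnitude-squared coherence estimate between two sensor channels; the synthetic signals, sampling rate, and the simple anomaly rule in the comments are illustrative and not tied to the ATMS implementation.

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic sensor channels: a shared 60 Hz component plus independent noise.
fs = 2048.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
shared = np.sin(2 * np.pi * 60 * t)
ch_a = shared + 0.5 * rng.standard_normal(t.size)
ch_b = 0.8 * shared + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence: values near 1 flag frequencies where the channels
# are linearly related; a drop at an expected frequency can be treated as an
# anomaly indicator (a simplistic stand-in for the automated diagnostic logic).
f, Cxy = coherence(ch_a, ch_b, fs=fs, nperseg=1024)
print(f"coherence near 60 Hz: {Cxy[np.argmin(np.abs(f - 60))]:.2f}")
```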
Automated Thermal Sample Acquisition with Applications
NASA Astrophysics Data System (ADS)
Kooshesh, K. A.; Lineberger, D. H.
2012-03-01
We created an Arduino®-based robot to detect samples subjected to an experiment, perform measurements once each sample is located, and store the results for further analysis. We then relate the robot's performance to an experiment on thermal inertia.
Field study evaluation of cepstrum coefficient speech analysis for fatigue in aviation cabin crew.
DOT National Transportation Integrated Search
2013-10-01
Impaired neurobehavioral performance induced by fatigue may compromise safety in 24-hr operational environments such as aviation. As such, non-invasive, reliable, and valid methods of objectively detecting compromised performance capacity in opera...
Nazarzadeh, Kimia; Arjunan, Sridhar P; Kumar, Dinesh K; Das, Debi Prasad
2016-08-01
In this study, we analyzed accelerometer data recorded during gait analysis of Parkinson's disease patients to detect freezing of gait (FOG) episodes. The proposed method filters the recordings to reduce noise in the leg-movement signals and computes wavelet coefficients to detect FOG events. A publicly available FOG database was used, and the technique was evaluated using receiver operating characteristic (ROC) analysis. Results show higher performance of the wavelet feature in discriminating FOG events from background activity when compared with the existing technique.
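A hedged sketch of the general approach (a wavelet-based energy feature on windowed accelerometer data, scored with ROC analysis); the wavelet family, decomposition level, window length, and synthetic data are assumptions rather than the study's settings.

```python
import numpy as np
import pywt
from sklearn.metrics import roc_auc_score

def wavelet_energy(window, wavelet="db4", level=4):
    """Energy of the detail coefficients of a windowed accelerometer trace;
    a generic stand-in for the wavelet feature used in the study."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return sum(float(np.sum(c**2)) for c in coeffs[1:])   # skip the approximation

# Synthetic example: "FOG-like" windows carry extra tremor energy around 6 Hz.
fs, n = 64, 256
rng = np.random.default_rng(2)
t = np.arange(n) / fs
windows, labels = [], []
for i in range(200):
    x = rng.standard_normal(n)
    if i % 2:                                  # label 1 = FOG-like window
        x += 2.0 * np.sin(2 * np.pi * 6 * t)
    windows.append(x)
    labels.append(i % 2)

scores = [wavelet_energy(w) for w in windows]
print(f"ROC AUC of the wavelet-energy feature: {roc_auc_score(labels, scores):.2f}")
```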
Almenoff, June S; LaCroix, Karol K; Yuen, Nancy A; Fram, David; DuMouchel, William
2006-01-01
There is increasing interest in using disproportionality-based signal detection methods to support postmarketing safety surveillance activities. Two commonly used methods, empirical Bayes multi-item gamma Poisson shrinker (MGPS) and proportional reporting ratio (PRR), perform differently with respect to the number and types of signals detected. The goal of this study was to compare and analyse the performance characteristics of these two methods, to understand why they differ and to consider the practical implications of these differences for a large, industry-based pharmacovigilance department. We compared the numbers and types of signals of disproportionate reporting (SDRs) obtained with MGPS and PRR using two postmarketing safety databases and a simulated database. We recorded signal counts and performed a qualitative comparison of the drug-event combinations signalled by the two methods as well as a sensitivity analysis to better understand how the thresholds commonly used for these methods impact their performance. PRR detected more SDRs than MGPS. We observed that MGPS is less subject to confounding by demographic factors because it employs stratification and is more stable than PRR when report counts are low. Simulation experiments performed using published empirical thresholds demonstrated that PRR detected false-positive signals at a rate of 1.1%, while MGPS did not detect any statistical false positives. In an attempt to separate the effect of choice of signal threshold from more fundamental methodological differences, we performed a series of experiments in which we modified the conventional threshold values for each method so that each method detected the same number of SDRs for the example drugs studied. This analysis, which provided quantitative examples of the relationship between the published thresholds for the two methods, demonstrates that the signalling criterion published for PRR has a higher signalling frequency than that published for MGPS. The performance differences between the PRR and MGPS methods are related to (i) greater confounding by demographic factors with PRR; (ii) a higher tendency of PRR to detect false-positive signals when the number of reports is small; and (iii) the conventional thresholds that have been adapted for each method. PRR tends to be more 'sensitive' and less 'specific' than MGPS. A high-specificity disproportionality method, when used in conjunction with medical triage and investigation of critical medical events, may provide an efficient and robust approach to applying quantitative methods in routine postmarketing pharmacovigilance.
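For reference, the PRR itself is a simple disproportionality statistic computed from a 2x2 table of report counts; the sketch below evaluates it. The example counts and the signalling rule quoted in the comment (PRR of at least 2 with at least 3 cases and a chi-square of at least 4) are commonly cited conventions, not necessarily the thresholds examined in this study.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from the 2x2 report-count table:
        a = reports with the drug and the event of interest
        b = reports with the drug, other events
        c = reports with other drugs and the event
        d = reports with other drugs, other events
    """
    return (a / (a + b)) / (c / (c + d))

# Toy counts; a commonly quoted signalling rule is PRR >= 2 with at least
# 3 cases and a chi-square >= 4 (hedged: thresholds vary between groups).
a, b, c, d = 12, 1500, 180, 90_000
print(f"PRR = {prr(a, b, c, d):.2f} from {a} cases")
```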
Detection of Bladder CA by Microsatellite Analysis (MSA) — EDRN Public Portal
Goal 1: To determine the sensitivity and specificity of microsatellite analysis (MSA) of urine sediment, using a panel of 15 microsatellite markers, in detecting bladder cancer in participants requiring cystoscopy. This technique will be compared to the diagnostic standard of cystoscopy, as well as to urine cytology. Goal 2: To determine the temporal performance characteristics of microsatellite analysis of urine sediment. Goal 3: To determine which of the 15 individual markers, or combinations of markers, that make up the MSA test are most predictive of the presence of bladder cancer.
2014-12-23
[Proceedings table-of-contents excerpt] ... Campbell, 225; Using Johnson Distribution for Automatic Threshold Setting in Wind Turbine Condition Monitoring System, Kun S. Marhadi and Georgios Alexandros ...; Victoria M. Catterson, Craig Love, and Andrew Robb, 725; Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis, Georgios ...; instance, in wind turbine gearbox analysis (Zappalà et al., 2012). Various other techniques for frequency domain analysis have been explored for ...
Castro-Suarez, John R; Pacheco-Londoño, Leonardo C; Vélez-Reyes, Miguel; Diem, Max; Tague, Thomas J; Hernandez-Rivera, Samuel P
2013-02-01
A standoff detection system was assembled by coupling a reflecting telescope to a Fourier transform infrared spectrometer equipped with a cryo-cooled mercury cadmium telluride detector, and used for detection of solid-phase samples deposited on substrates. Samples of highly energetic materials were deposited on aluminum substrates and detected at several collector-target distances by performing passive-mode, remote, infrared detection measurements on the heated analytes. Aluminum plates were used as the support material, and 2,4,6-trinitrotoluene (TNT) was used as the target. For the standoff detection experiments, the samples were placed at different distances (4 to 55 m). Several target surface temperatures were investigated. Partial least squares regression analysis was applied to the intensities of the spectra obtained. Overall, standoff detection in passive mode was useful for quantifying TNT deposited on the aluminum plates with high confidence up to target-collector distances of 55 m.
NASA Astrophysics Data System (ADS)
Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf
2016-11-01
This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of a SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis, by referring to the bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance compared to the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10⁻⁹, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.
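For context, SAC-OCDMA performance analyses commonly evaluate BER from SNR under a Gaussian approximation as BER = (1/2) erfc(sqrt(SNR/8)); the sketch below computes this relation numerically. It is a generic textbook expression, not the paper's exact SNR derivation for the EMD code.

```python
import numpy as np
from scipy.special import erfc

def ber_from_snr(snr_linear):
    """Gaussian-approximation BER commonly used in SAC-OCDMA analyses:
    BER = 0.5 * erfc(sqrt(SNR / 8)). The paper's SNR expression for the
    EMD code (function of subscribers, data rate, PIIN, etc.) is not shown."""
    return 0.5 * erfc(np.sqrt(snr_linear / 8.0))

for snr_db in (10, 15, 20, 25):
    snr = 10 ** (snr_db / 10)                  # dB to linear
    print(f"SNR = {snr_db:2d} dB -> BER ≈ {ber_from_snr(snr):.2e}")
```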
Automatic sentence extraction for the detection of scientific paper relations
NASA Astrophysics Data System (ADS)
Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.
2018-03-01
The relations between scientific papers are very useful for researchers to see the interconnections between scientific papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, performance improvements achieved to date, and the tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, while the paper relations are identified based on the citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
Neal, Andrew; Kwantes, Peter J
2009-04-01
The aim of this article is to develop a formal model of conflict detection performance. Our model assumes that participants iteratively sample evidence regarding the state of the world and accumulate it over time. A decision is made when the evidence reaches a threshold that changes over time in response to the increasing urgency of the task. Two experiments were conducted to examine the effects of conflict geometry and timing on response proportions and response time. The model is able to predict the observed pattern of response times, including a nonmonotonic relationship between distance at point of closest approach and response time, as well as effects of angle of approach and relative velocity. The results demonstrate that evidence accumulation models provide a good account of performance on a conflict detection task. Evidence accumulation models are a form of dynamic signal detection theory, allowing for the analysis of response times as well as response proportions, and can be used for simulating human performance on dynamic decision tasks.
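A minimal simulation of the kind of evidence-accumulation model described above, with a decision bound that collapses over time to reflect task urgency; the drift rates, noise level, and time step are illustrative values, not parameters fitted to the experiments.

```python
import numpy as np

def accumulate(drift, threshold0=1.0, collapse=0.002, noise=0.1,
               dt=0.05, max_steps=400, rng=None):
    """Single trial of an evidence-accumulation (sequential sampling) model with
    a collapsing decision bound. Returns (decision_time, bound_reached).
    All parameter values are illustrative."""
    if rng is None:
        rng = np.random.default_rng()
    evidence = 0.0
    for step in range(1, max_steps + 1):
        evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        bound = threshold0 - collapse * step        # urgency: bound shrinks with time
        if abs(evidence) >= bound:
            return step * dt, True
    return max_steps * dt, False

rng = np.random.default_rng(3)
# Stronger evidence (e.g. a clearly conflicting aircraft pair) -> faster responses.
for drift in (0.05, 0.2, 0.6):
    times = [accumulate(drift, rng=rng)[0] for _ in range(500)]
    print(f"drift {drift:.2f}: mean RT ≈ {np.mean(times):.2f} (arbitrary time units)")
```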
Fan, Shu-Han; Chou, Chia-Ching; Chen, Wei-Chen; Fang, Wai-Chi
2015-01-01
In this study, an effective real-time obstructive sleep apnea (OSA) detection method based on frequency analysis of the ECG-derived respiratory (EDR) signal and heart rate variability (HRV) is proposed. Compared to traditional polysomnography (PSG), which requires several physiological signals measured from patients, the proposed OSA detection method uses only ECG signals to determine the time intervals of OSA. To make the method feasible for hardware implementation, real-time detection and portable applications, a simplified Lomb periodogram is utilized to perform the frequency analysis of EDR and HRV in this study. The experimental results indicate that the overall accuracy can be effectively increased by integrating the EDR and HRV indices, achieving a specificity (Sp) of 91%, sensitivity (Se) of 95.7%, and accuracy of 93.2%.
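A hedged illustration of the frequency-analysis step, using SciPy's Lomb periodogram on an unevenly sampled RR-interval (HRV) series; the synthetic data, frequency grid, and band definition are assumptions and do not reproduce the paper's simplified hardware-oriented formulation.

```python
import numpy as np
from scipy.signal import lombscargle

# Unevenly sampled RR-interval series (HRV): beat times and interval values.
rng = np.random.default_rng(4)
beat_times = np.cumsum(0.8 + 0.05 * rng.standard_normal(512))      # seconds
rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.02 * beat_times) \
         + 0.01 * rng.standard_normal(beat_times.size)

# The Lomb periodogram avoids resampling the irregular beat series onto a uniform grid.
freqs = np.linspace(0.005, 0.5, 400)                                # Hz
power = lombscargle(beat_times, rr - rr.mean(), 2 * np.pi * freqs, normalize=True)

# OSA-related HRV analyses often inspect the very-low-frequency band (~0.01-0.04 Hz);
# the exact index definitions used in the paper are not reproduced here.
vlf = power[(freqs >= 0.01) & (freqs <= 0.04)].sum()
print(f"VLF band power (normalized): {vlf:.3f}")
```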
Bacterial Pathogens Associated with Community-acquired Pneumonia in Children Aged Below Five Years.
Das, Anusmita; Patgiri, Saurav J; Saikia, Lahari; Dowerah, Pritikar; Nath, Reema
2016-03-01
To determine the spectrum of bacterial pathogens causing community-acquired pneumonia in children below 5 years of age. Children aged below 5 years satisfying the WHO criteria for pneumonia, severe pneumonia or very severe pneumonia, and with the presence of lung infiltrates on chest X-ray, were enrolled. Two respiratory samples, one for culture and the other for PCR analysis, and a blood sample for culture were collected from every child. Of the 180 samples processed, bacterial pathogens were detected in 64.4%. Streptococcus pneumoniae and Haemophilus influenzae were the most frequently detected. The performance of PCR analysis and culture was identical for the typical bacterial pathogens; atypical pathogens were detected by PCR analysis only. S. pneumoniae and H. influenzae were the most commonly detected organisms from respiratory secretions of children with community-acquired pneumonia.
NASA Astrophysics Data System (ADS)
Wang, Z.; Quek, S. T.
2015-07-01
The performance of any structural health monitoring algorithm relies heavily on good measurement data. Hence, it is necessary to employ robust faulty-sensor detection approaches to isolate sensors with abnormal behaviour and exclude the highly inaccurate data from the subsequent analysis. Independent component analysis (ICA) is implemented to detect the presence of sensors showing abnormal behaviour. A normalized form of the relative partial decomposition contribution (rPDC) is proposed to identify the faulty sensor. Both additive and multiplicative types of faults are addressed, and the detectability is illustrated using a numerical and an experimental example. An empirical method to establish control limits for detecting and identifying the type of fault is also proposed. The results show the effectiveness of the ICA and rPDC method in identifying the faulty sensor, assuming that baseline cases are available.
2009-09-01
... prior to Traditional VT processing. This proves to be effective and provides more robust burst detection for −3 ≤ SNR ≤ 10 dB. Performance of a ... [Thesis table-of-contents fragment: TD and WD Dimensionality; Performance Sensitivity Analysis; Effect of Burst Location Error; Effect of Dissimilar Signal SNRs; Effect of Dissimilar Signal Types; Conclusion]
Characterization of slow and fast phase nystagmus
NASA Technical Reports Server (NTRS)
Lessard, Charles S.; Rodriguez-Garcia, Carlos A.; Wong, Wing Chan; Im, Jae J.; Schmidt, Glenn F.
1991-01-01
A review of the current literature on the analog and digital processing of vestibular and optokinetic nystagmus reveals little agreement in the methods used by various labs. The strategies for detection of saccades (the fast-phase velocity component of nystagmus) vary between labs, and most of the processes have not been evaluated and validated against a standard database. A survey was made of major vestibular labs in the U.S. that perform computer analyses of vestibular and optokinetic reflexes to stimuli, and a baseline was established from which to standardize data acquisition and analysis programs. The concept of an Error Index was employed as the criterion for evaluating the performance of the vestibular analysis software programs. The performance criterion is based on the detection of saccades and is the average of the percentages of missed detections and false detections. Evaluation of the programs produced results for lateral gaze with saccadic amplitudes of one, two, three, five, and ten degrees with various signal-to-noise ratios. In addition, results were obtained for sinusoidal pursuit at 0.05, 0.10, and 0.50 Hz with saccades from one to ten degrees at various signal-to-noise ratios. Selection of the best program was made from the performance in the lateral gaze with three degrees of saccadic amplitude and in the 0.10 Hz sinusoid with three degrees of saccadic amplitude.
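A small helper implementing the Error Index as described (the average of the missed-detection and false-detection percentages); the denominators used for the two percentages are an assumption, since the exact normalization is not given in the abstract.

```python
def error_index(n_missed, n_false, n_true_saccades, n_detected):
    """Error Index for scoring saccade-detection programs: the average of the
    percentage of missed detections and the percentage of false detections.
    Denominator conventions are an assumption; the survey's definition may differ."""
    pct_missed = 100.0 * n_missed / n_true_saccades if n_true_saccades else 0.0
    pct_false = 100.0 * n_false / n_detected if n_detected else 0.0
    return 0.5 * (pct_missed + pct_false)

# Example: 50 true saccades, the detector reports 52 events, 4 of them spurious,
# so 48 true saccades are found and 2 are missed.
print(f"Error Index = {error_index(2, 4, 50, 52):.1f}%")
```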
Detection of triglycerides using immobilized enzymes in food and biological samples
NASA Astrophysics Data System (ADS)
Raichur, Ashish; Lesi, Abiodun; Pedersen, Henrik
1996-04-01
A scheme for the determination of total triglyceride (fat) content in biomedical and food samples is being developed. The primary emphasis is to minimize the reagents used, simplify sample preparation and develop a robust system that would facilitate on-line monitoring. The new detection scheme developed thus far involves extracting triglycerides into an organic solvent (cyclohexane) and performing partial least squares (PLS) analysis on the NIR (1100 - 2500 nm) absorbance spectra of the solution. A training set using 132 spectra of known triglyceride mixtures was compiled. Eight PLS calibrations were generated and used to predict the total fat extracted from commercial samples such as mayonnaise, butter, corn oil and coconut oil. The results typically gave a correlation coefficient (r) of 0.99 or better. Predictions were typically within 90% accuracy, and better at higher concentrations. Experiments were also performed using an immobilized lipase reactor to hydrolyze the fat extracted into the organic solvent. Performing PLS analysis on the difference spectra of the substrate and product could enhance specificity; this is being verified experimentally. Further work with biomedical samples is to be performed. This scheme may be developed into a feasible detection method for triglycerides in the biomedical and food industries.
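A hedged sketch of the PLS calibration step using scikit-learn; the synthetic spectra, the number of latent components, and the concentration range stand in for the real NIR data and the eight calibrations described above.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR absorbance spectra: 132 mixtures with known
# triglyceride content and 200 wavelength channels (all values illustrative).
rng = np.random.default_rng(5)
n_samples, n_channels = 132, 200
concentration = rng.uniform(0, 50, n_samples)                  # e.g. g/L
basis = np.exp(-0.5 * ((np.arange(n_channels) - 120) / 15) ** 2)
spectra = np.outer(concentration, basis) + 0.05 * rng.standard_normal((n_samples, n_channels))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=8)                            # latent-variable count assumed
pls.fit(X_tr, y_tr)
r = np.corrcoef(y_te, pls.predict(X_te).ravel())[0, 1]
print(f"correlation coefficient on held-out mixtures: r = {r:.3f}")
```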
Kobayashi, Hajime; Ohkubo, Masaki; Narita, Akihiro; Marasinghe, Janaka C; Murao, Kohei; Matsumoto, Toru; Sone, Shusuke
2017-01-01
Objective: We propose the application of virtual nodules to evaluate the performance of computer-aided detection (CAD) of lung nodules in cancer screening using low-dose CT. Methods: The virtual nodules were generated based on the spatial resolution measured for a CT system used in an institution providing cancer screening and were fused into clinical lung images obtained at that institution, allowing site specificity. First, we validated virtual nodules as an alternative to artificial nodules inserted into a phantom. In addition, we compared the results of CAD analysis between the real nodules (n = 6) and the corresponding virtual nodules. Subsequently, virtual nodules of various sizes and contrasts between nodule density and background density (ΔCT) were inserted into clinical images (n = 10) and submitted for CAD analysis. Results: In the validation study, 46 of 48 virtual nodules had the same CAD results as artificial nodules (kappa coefficient = 0.913). Real nodules and the corresponding virtual nodules showed the same CAD results. The detection limits of the tested CAD system were determined in terms of size and density of peripheral lung nodules; we demonstrated that a nodule with a 5-mm diameter was detected when the nodule had a ΔCT > 220 HU. Conclusion: Virtual nodules are effective in evaluating CAD performance using site-specific scan/reconstruction conditions. Advances in knowledge: Virtual nodules can be an effective means of evaluating site-specific CAD performance. The methodology for guiding the detection limit for nodule size/density might be a useful evaluation strategy. PMID:27897029
Computerized analysis of sonograms for the detection of breast lesions
NASA Astrophysics Data System (ADS)
Drukker, Karen; Giger, Maryellen L.; Horsch, Karla; Vyborny, Carl J.
2002-05-01
With a renewed interest in using non-ionizing radiation for the screening of high-risk women, there is a clear role for a computerized detection aid in ultrasound. Thus, we are developing a computerized detection method for the localization of lesions on breast ultrasound images. The computerized detection scheme utilizes two methods. Firstly, a radial gradient index analysis is used to distinguish potential lesions from normal parenchyma. Secondly, an image skewness analysis is performed to identify posterior acoustic shadowing. We analyzed 400 cases (757 images) consisting of complex cysts, solid benign lesions, and malignant lesions. The detection method yielded an overall sensitivity of 95% by image, and 99% by case, at a false-positive rate of 0.94 per image. In 51% of all images only the lesion itself was detected, while in 5% of the images only the shadowing was identified. For malignant lesions these numbers were 37% and 9%, respectively. In summary, we have developed a computer detection method for lesions on ultrasound images of the breast, which may ultimately aid in breast cancer screening.
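To give a flavor of the skewness step, the sketch below flags a candidate posterior shadow from the skewness of grey levels in a strip beneath a candidate lesion; the sign convention, the threshold, and the synthetic strips are assumptions, and the radial gradient index stage is not shown.

```python
import numpy as np
from scipy.stats import skew

def shadow_candidate(column_strip, threshold=0.5):
    """Flag possible posterior acoustic shadowing from the skewness of the grey
    levels in the strip below a candidate lesion. Sign convention and threshold
    are assumptions, not the paper's calibrated settings."""
    return skew(column_strip.ravel()) > threshold

rng = np.random.default_rng(6)
normal_strip = rng.normal(120, 20, (60, 20))                    # roughly symmetric histogram
shadow_strip = np.clip(rng.exponential(15, (60, 20)), 0, 255)   # dark, right-skewed histogram
print(shadow_candidate(normal_strip), shadow_candidate(shadow_strip))
```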
Feasibility of fast neutron analysis for the detection of explosives buried in soil
NASA Astrophysics Data System (ADS)
Faust, A. A.; McFee, J. E.; Bowman, C. L.; Mosquera, C.; Andrews, H. R.; Kovaltchouk, V. D.; Ing, H.
2011-12-01
A commercialized thermal neutron analysis (TNA) sensor has been developed to confirm the presence of buried bulk explosives as part of a multi-sensor anti-tank landmine detection system. Continuing improvements to the TNA system have included the use of an electronic pulsed neutron generator that offers the possibility of applying fast neutron analysis (FNA) methods to improve the system's detection capability. This paper describes an investigation into the use of FNA as a complementary component in such a TNA system. The results of a modeling study using simple geometries and a full model of the TNA sensor head are presented, as well as preliminary results from an experimental associated particle imaging (API) system that supports the modeling study results. The investigation has concluded that the pulsed beam FNA approach would not improve the detection performance of a TNA system for landmine or buried IED detection in a confirmation role, and could not be made into a practical stand-alone detection system for buried anti-tank landmines. Detection of buried landmines and IEDs by FNA remains a possibility, however, through the use of the API technique.
Karageorgou, Eftychia; Christoforidou, Sofia; Ioannidou, Maria; Psomas, Evdoxios; Samouris, Georgios
2018-06-01
The present study was carried out to assess the detection sensitivity of four microbial inhibition assays (MIAs) in comparison with the results obtained by high-performance liquid chromatography with diode-array detection (HPLC-DAD) for antibiotics of the β-lactam group and chloramphenicol in fortified raw milk samples. The MIAs presented fairly good results when detecting β-lactams, whereas none were able to detect chloramphenicol at or above the permissible limits. HPLC analysis revealed high recoveries of the examined compounds, and all detection limits observed were lower than the respective maximum residue limit (MRL) values. The extraction and clean-up of the antibiotics was performed by a modified matrix solid-phase dispersion procedure using a mixture of Plexa by Agilent and QuEChERS as the sorbent. The HPLC method developed was validated by determining the accuracy, precision, linearity, decision limit, and detection capability. Both methods were used to monitor raw milk samples of several cows and sheep, obtained from producers in different regions of Greece, for the presence of the examined antibiotic residues. The results obtained showed that MIAs could be used effectively and routinely to detect antibiotic residues in several milk types. However, in some cases spoilage of the milk samples strongly affected the kits' sensitivity, whereas it did not affect the effectiveness of the HPLC-DAD analysis.
Detection, isolation and diagnosability analysis of intermittent faults in stochastic systems
NASA Astrophysics Data System (ADS)
Yan, Rongyi; He, Xiao; Wang, Zidong; Zhou, D. H.
2018-02-01
Intermittent faults (IFs) have the properties of unpredictability, non-determinacy, inconsistency and repeatability, switching systems between faulty and healthy states. In this paper, the fault detection and isolation (FDI) problem for IFs in a class of linear stochastic systems is investigated. The detection and isolation of IFs involves: (1) detecting all the appearing times and disappearing times of an IF; (2) detecting each appearing (disappearing) time of the IF before the subsequent disappearing (appearing) time; and (3) determining where the IFs happen. Based on the outputs of the observers we designed, a novel set of residuals is constructed using the sliding-time window technique, and two hypothesis tests are proposed to detect all the appearing and disappearing times of IFs. The isolation problem of IFs is also considered. Furthermore, within a statistical framework, the definition of the diagnosability of IFs is proposed, and a sufficient condition for the diagnosability of IFs is brought forward. Quantitative performance analysis results for the false alarm rate and missing detection rate are discussed, and the influences of some key parameters of the proposed scheme on performance indices such as the false alarm rate and missing detection rate are analysed rigorously. The effectiveness of the proposed scheme is illustrated via a simulation example of an unmanned helicopter longitudinal control system.
Change detection in satellite images
NASA Astrophysics Data System (ADS)
Thonnessen, U.; Hofele, G.; Middelmann, W.
2005-05-01
Change detection plays an important role in different military areas such as strategic reconnaissance, verification of armament and disarmament control, and damage assessment. It is the process of identifying differences in the state of an object or phenomenon by observing it at different times. The availability of spaceborne reconnaissance systems with high spatial resolution, multispectral capabilities, and short revisit times offers new perspectives for change detection. Before performing any kind of change detection it is necessary to separate changes of interest from changes caused by differences in data acquisition parameters. In these cases it is necessary to perform pre-processing to correct or normalize the data. Image registration and, corresponding to this task, the ortho-rectification of the image data are further prerequisites for change detection. If feasible, a 1-to-1 geometric correspondence should be sought. Change detection on an iconic level, with subsequent interpretation of the changes by an observer, is often proposed; nevertheless, an automatic knowledge-based analysis delivering an interpretation of the changes on a semantic level should be the aim for the future. We present first results of change detection on a structural level for urban areas. After pre-processing, the images are segmented into areas of interest and structural analysis is applied to these regions to extract descriptions of urban infrastructure such as buildings, roads and refinery tanks. These descriptions are matched to detect changes and similarities.
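As a minimal pre-processing-level illustration (not the structural analysis described above), the sketch below thresholds the difference between two co-registered images after a crude radiometric normalization; the normalization and the k-sigma threshold are illustrative assumptions.

```python
import numpy as np

def change_mask(img_t1, img_t2, k=3.0):
    """Pixel-level change detection on two co-registered images: threshold the
    absolute difference of z-scored images at k standard deviations above the
    mean difference. A simplistic stand-in for a full change-detection chain."""
    d1 = (img_t1 - img_t1.mean()) / img_t1.std()       # crude radiometric normalization
    d2 = (img_t2 - img_t2.mean()) / img_t2.std()
    diff = np.abs(d2 - d1)
    return diff > (diff.mean() + k * diff.std())

rng = np.random.default_rng(7)
t1 = rng.normal(100, 10, (128, 128))
t2 = t1 + rng.normal(0, 2, (128, 128))
t2[40:60, 40:60] += 60                                  # a "new structure" appears
print(f"changed pixels flagged: {change_mask(t1, t2).sum()}")
```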
A search for optical evidence for lightning on Venus with VIRTIS on Venus Express
NASA Astrophysics Data System (ADS)
Abildgaard, Sofie; Cardesin, Alejandro; Garcia Múnoz, Antonio; Piccioni, Giuseppe
2015-04-01
Lightning is known to occur in the atmospheres of Earth, Jupiter, Saturn, Uranus and Neptune, but although the occurrence of lightning in the Venusian atmosphere has been reported several times in past years, always on the basis of detected electromagnetic pulses, the subject is still controversial. It is generally agreed that an optical observation of the phenomenon would settle the issue. In this work we analyse the collection of hyper-spectral images produced by the Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS) on Venus Express, which has been observing the Venusian atmosphere continuously since 2006. A dedicated search algorithm for transient events was developed and a detailed analysis of the archive was performed at all wavelengths. The first preliminary analyses have been performed, and we have shown that transient events can easily be identified in the data. Work is ongoing to optimize the search parameters and to perform a statistical analysis. In this contribution we will present a summary of the data analysis process and some of the preliminary conclusions on lightning detection/non-detection.
SisFall: A Fall and Movement Dataset
Sucerquia, Angela; López, José David; Vargas-Bonilla, Jesús Francisco
2017-01-01
Research on fall and movement detection with wearable devices has witnessed promising growth. However, there are few publicly available datasets, all recorded with smartphones, which are insufficient for testing new proposals due to the absence of the target population, the limited set of performed activities, and limited information. Here, we present a dataset of falls and activities of daily living (ADLs) acquired with a self-developed device composed of two types of accelerometer and one gyroscope. It consists of 19 ADLs and 15 fall types performed by 23 young adults, 15 ADL types performed by 14 healthy and independent participants over 62 years old, and data from one 60-year-old participant who performed all ADLs and falls. These activities were selected based on a survey and a literature analysis. We test the dataset with widely used feature extraction and a simple-to-implement threshold-based classification, achieving up to 96% accuracy in fall detection. An individual activity analysis demonstrates that most errors are concentrated in a small number of activities on which new approaches could focus. Finally, validation tests with elderly people significantly reduced the fall detection performance of the tested features. This validates findings of other authors and encourages the development of new strategies with this new dataset as the benchmark. PMID:28117691
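An illustrative sketch of the kind of threshold-based classification the dataset is tested with; the single feature (peak acceleration magnitude) and the 2.5 g threshold are assumptions for demonstration, not the authors' exact features or values:

```python
import numpy as np

def detect_fall(acc_xyz, g=9.81, peak_threshold=2.5):
    """Flag a window of tri-axial accelerometer samples (m/s^2) as a fall
    when the peak acceleration magnitude exceeds peak_threshold (in g)."""
    mag = np.linalg.norm(np.asarray(acc_xyz, dtype=float), axis=1)
    return mag.max() / g > peak_threshold

# toy usage: a quiet ADL window vs. a window containing an impact-like spike
rng = np.random.default_rng(1)
adl = np.column_stack([rng.normal(0, 0.3, 200)] * 3) + [0, 0, 9.81]
fall = adl.copy()
fall[100] = [25.0, 5.0, 30.0]
print(detect_fall(adl), detect_fall(fall))   # expected: False True
```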
NASA Astrophysics Data System (ADS)
Song, YoungJae; Sepulveda, Francisco
2017-02-01
Objective. Self-paced EEG-based BCIs (SP-BCIs) have traditionally been avoided due to two sources of uncertainty: (1) precisely when an intentional command is sent by the brain, i.e., the command onset detection problem, and (2) how different the intentional command is when compared to non-specific (or idle) states. Performance evaluation is also a problem and there are no suitable standard metrics available. In this paper we attempted to tackle these issues. Approach. Self-paced covert sound-production cognitive tasks (i.e., high pitch and siren-like sounds) were used to distinguish between intentional commands (IC) and idle states. The IC states were chosen for their ease of execution and negligible overlap with common cognitive states. Band power and a digital wavelet transform were used for feature extraction, and the Davies-Bouldin index was used for feature selection. Classification was performed using linear discriminant analysis. Main results. Performance was evaluated under offline and simulated-online conditions. For the latter, a performance score called true-false-positive (TFP) rate, ranging from 0 (poor) to 100 (perfect), was created to take into account both classification performance and onset timing errors. Averaging the results from the best performing IC task for all seven participants, a 77.7% true-positive (TP) rate was achieved in offline testing. For simulated-online analysis the best IC average TFP score was 76.67% (87.61% TP rate, 4.05% false-positive rate). Significance. Results were promising when compared to previous IC onset detection studies using motor imagery, in which best TP rates were reported as 72.0% and 79.7%, and which, crucially, did not take timing errors into account. Moreover, based on our literature review, there is no previous covert sound-production onset detection system for SP-BCIs. Results showed that the proposed onset detection technique and TFP performance metric have good potential for use in SP-BCIs.
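A compact sketch of the band-power-plus-LDA part of the pipeline on synthetic epochs; the band definitions, sampling rate, and data are assumptions, and the wavelet features, Davies-Bouldin selection, and TFP scoring are omitted:

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def band_power_features(epochs, fs=256, bands=((4, 8), (8, 13), (13, 30))):
    """Average Welch power per channel in a few frequency bands.
    epochs: (n_trials, n_channels, n_samples); band edges are assumptions."""
    feats = []
    for ep in epochs:
        f, pxx = welch(ep, fs=fs, nperseg=fs)       # (n_channels, n_freqs)
        feats.append([pxx[:, (f >= lo) & (f < hi)].mean(axis=1) for lo, hi in bands])
    return np.array(feats).reshape(len(epochs), -1)

# synthetic demo: IC epochs carry extra 10 Hz activity, idle epochs do not
rng = np.random.default_rng(0)
t = np.arange(512) / 256
ic = rng.normal(0, 1, (40, 4, 512)) + 2 * np.sin(2 * np.pi * 10 * t)
idle = rng.normal(0, 1, (40, 4, 512))
X = band_power_features(np.vstack([ic, idle]))
y = np.r_[np.ones(40), np.zeros(40)]
clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```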
Detection of indoor biological hazards using the man-portable laser induced breakdown spectrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munson, Chase A.; Gottfried, Jennifer L.; Snyder, Emily Gibb
2008-11-01
The performance of a man-portable laser induced breakdown spectrometer was evaluated for the detection of biological powders on indoor office surfaces and wipe materials. Identification of pure unknown powders was performed by comparing against a library of spectra containing biological agent surrogates and confusant materials, such as dusts, diesel soot, natural and artificial sweeteners, and drink powders, using linear correlation analysis. Simple models constructed using a second technique, partial least squares discriminant analysis, successfully identified Bacillus subtilis (BG) spores on wipe materials and office surfaces. Furthermore, these models were able to identify BG on materials not used in the training of the model.
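A minimal sketch of the linear-correlation library-matching step: the measured spectrum is assigned to the library entry with the highest Pearson correlation. The library names and the synthetic "spectra" below are placeholders:

```python
import numpy as np

def identify_by_correlation(spectrum, library):
    """Return the library entry whose reference spectrum correlates best
    (Pearson) with the measured LIBS spectrum."""
    scores = {name: np.corrcoef(spectrum, ref)[0, 1] for name, ref in library.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# toy usage with made-up spectra
rng = np.random.default_rng(2)
ref_a, ref_b = rng.random(500), rng.random(500)
library = {"BG_spores": ref_a, "diesel_soot": ref_b}
measured = ref_a + rng.normal(0, 0.05, 500)      # noisy copy of the first entry
print(identify_by_correlation(measured, library))
```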
A New Pivoting and Iterative Text Detection Algorithm for Biomedical Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Songhua; Krauthammer, Prof. Michael
2010-01-01
There is interest to expand the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating the performance on a set of manually labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. We demonstrate that our projection histogram-based text detection approach is well suited for text detection in biomedical images, and that the iterative application of the algorithm boosts performance to an F score of 0.60. We provide a C++ implementation of our algorithm freely available for academic use.
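A simplified sketch of one projection-histogram pass; the published algorithm applies this idea iteratively and pivots between rows and columns, and the threshold below is an illustrative assumption:

```python
import numpy as np

def projection_bands(binary_img, axis=0, min_frac=0.02):
    """Split a binary (text = 1) image into bands along `axis` wherever the
    projection histogram exceeds `min_frac` foreground pixels. Applying this
    to rows, then to the columns of each row band, and iterating, yields
    candidate text boxes."""
    hist = binary_img.mean(axis=1 - axis)
    active = hist > min_frac
    bands, in_band, start = [], False, 0
    for i, a in enumerate(active):
        if a and not in_band:
            in_band, start = True, i
        elif not a and in_band:
            in_band = False
            bands.append((start, i))
    if in_band:
        bands.append((start, len(active)))
    return bands

# toy usage: an image with two "text lines"
img = np.zeros((60, 100), dtype=int)
img[10:18, 5:90] = 1
img[30:38, 5:60] = 1
print(projection_bands(img, axis=0))   # expected: [(10, 18), (30, 38)]
```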
Novo, P; Chu, V; Conde, J P
2014-07-15
The miniaturization of biosensors using microfluidics has potential in enabling the development of point-of-care devices, with the added advantages of reduced time and cost of analysis with limits-of-detection comparable to those obtained through traditional laboratory techniques. Interfacing microfluidic devices with the external world can be difficult, especially in aspects involving fluid handling and the need for simple sample insertion that avoids special equipment or trained personnel. In this work we present a point-of-care prototype system by integrating capillary microfluidics with a microfabricated photodiode array and electronic instrumentation into a hand-held unit. The capillary microfluidic device is capable of autonomous and sequential fluid flow, including control of the average fluid velocity at any given point of the analysis. To demonstrate the functionality of the prototype, a model chemiluminescence ELISA was performed. The performance of the integrated optical detection in the point-of-care prototype is equal to that obtained with traditional bench-top instrumentation. The photodiode signals were acquired, displayed and processed by a simple graphical user interface using a computer connected to the microcontroller through USB. The prototype performed integrated chemiluminescence ELISA detection in about 15 min with a limit-of-detection of ≈2 nM with an antibody-antigen affinity constant of ≈2×10⁷ M⁻¹. Copyright © 2014 Elsevier B.V. All rights reserved.
Robust Detection of Examinees with Aberrant Answer Changes
ERIC Educational Resources Information Center
Belov, Dmitry I.
2015-01-01
The statistical analysis of answer changes (ACs) has uncovered multiple testing irregularities on large-scale assessments and is now routinely performed at testing organizations. However, AC data has an uncertainty caused by technological or human factors. Therefore, existing statistics (e.g., number of wrong-to-right ACs) used to detect examinees…
NASA Technical Reports Server (NTRS)
Simpson, C.; Eisenhardt, P.
1998-01-01
We investigate the ability of the Space Infrared Telescope Facility's Infrared Array Camera to detect distant (z ≳ 3) galaxies and measure their photometric redshifts. Our analysis shows that changing the original long wavelength filter specifications provides significant improvements in performance in this and other areas.
An enhanced performance through agent-based secure approach for mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Bisen, Dhananjay; Sharma, Sanjeev
2018-01-01
This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc networks. In this approach, agent nodes are selected using an optimal node reliability factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility and optimal hello interval of the node. After selection of agent nodes, malicious behaviour detection is performed using a fuzzy-based secure architecture (FBSA). To evaluate the proposed approach, a comparative analysis against conventional schemes is carried out using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.
SSME Post Test Diagnostic System: Systems Section
NASA Technical Reports Server (NTRS)
Bickmore, Timothy
1995-01-01
An assessment of engine and component health is routinely made after each test firing or flight firing of a Space Shuttle Main Engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project - the SSME Post Test Diagnostic System (PTDS) - is to develop a computer program which automates the analysis of test data from the SSME in order to detect and diagnose anomalies. This report primarily covers work on the Systems Section of the PTDS, which automates the analyses performed by the systems/performance group at the Propulsion Branch of NASA Marshall Space Flight Center (MSFC). This group is responsible for assessing the overall health and performance of the engine, and detecting and diagnosing anomalies which involve multiple components (other groups are responsible for analyzing the behavior of specific components). The PTDS utilizes several advanced software technologies to perform its analyses. Raw test data is analyzed using signal processing routines which detect features in the data, such as spikes, shifts, peaks, and drifts. Component analyses are performed by expert systems, which use 'rules-of-thumb' obtained from interviews with the MSFC data analysts to detect and diagnose anomalies. The systems analysis is performed using case-based reasoning. Results of all analyses are stored in a relational database and displayed via an X-window-based graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
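A small sketch of the kind of feature detectors the signal-processing stage runs over raw test data; the window length, thresholds, and robust noise estimate are illustrative assumptions, not the PTDS routines themselves:

```python
import numpy as np

def find_spikes_and_shifts(x, win=25, spike_k=6.0, shift_k=4.0):
    """Flag spikes (isolated outliers vs. a robust baseline) and shifts
    (jumps between the mean of the preceding and following window)."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    sigma = 1.4826 * np.median(np.abs(x - med)) + 1e-12   # robust noise estimate
    spikes = np.flatnonzero(np.abs(x - med) > spike_k * sigma)
    shifts = []
    for k in range(win, len(x) - win):
        jump = x[k:k + win].mean() - x[k - win:k].mean()
        if abs(jump) > shift_k * sigma / np.sqrt(win):
            shifts.append(k)
    return spikes, np.array(shifts)

# toy trace: one isolated spike near sample 300 and a level shift at sample 600
rng = np.random.default_rng(3)
x = rng.normal(0, 0.1, 1000)
x[300] += 3.0
x[600:] += 1.0
spikes, shifts = find_spikes_and_shifts(x)
print(len(spikes), len(shifts))   # shift detections cluster around sample 600
```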
Torque Teno Virus in HIV-infected transgender in Surakarta, Indonesia
NASA Astrophysics Data System (ADS)
Hartono; Agung Prasetyo, Afiono; Fanani, Mohammad
2018-05-01
Torque Teno Virus (TTV) is a circular single-stranded DNA virus that may co-infect individuals with human immunodeficiency virus (HIV), especially in high-risk communities such as transgender persons performing high-risk behavior. TTV shows increased viremia in HIV patients and may influence HIV clinical progression. Blood samples collected from transgender persons performing high-risk behavior in Surakarta were tested by serological and molecular assays to detect the presence of HIV infection. The HIV-positive blood samples were then tested by nested polymerase chain reaction (PCR) to detect the presence of TTV DNA. The amplified PCR products were molecularly cloned and subjected to sequence analysis. TTV DNA was detected in 40.0% of the HIV-positive samples. The molecular characterization revealed that the most prevalent genogroup was genogroup 3, followed by genogroups 2 and 1, respectively. TTV was detected at a high infection rate in HIV-infected transgender persons performing high-risk behavior in Surakarta.
Islam, Jyoti; Zhang, Yanqing
2018-05-31
Alzheimer's disease is an incurable, progressive neurological brain disorder. Earlier detection of Alzheimer's disease can help with proper treatment and prevent brain tissue damage. Several statistical and machine learning models have been exploited by researchers for Alzheimer's disease diagnosis. Analyzing magnetic resonance imaging (MRI) is a common practice for Alzheimer's disease diagnosis in clinical research. Detection of Alzheimer's disease is challenging due to the similarity between Alzheimer's disease MRI data and standard healthy MRI data of older people. Recently, advanced deep learning techniques have successfully demonstrated human-level performance in numerous fields including medical image analysis. We propose a deep convolutional neural network for Alzheimer's disease diagnosis using brain MRI data analysis. While most of the existing approaches perform binary classification, our model can identify different stages of Alzheimer's disease and obtains superior performance for early-stage diagnosis. We conducted ample experiments to demonstrate that our proposed model outperformed comparative baselines on the Open Access Series of Imaging Studies dataset.
Live face detection based on the analysis of Fourier spectra
NASA Astrophysics Data System (ADS)
Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.
2004-08-01
Biometrics is a rapidly developing technology for identifying a person based on his or her physiological or behavioral characteristics. To ensure the correctness of authentication, a biometric system must be able to detect and reject the use of a copy of a biometric instead of the live biometric. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using structure and movement information of the live face, an effective live face detection algorithm is presented. Compared to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of Fourier spectra of a single face image or of face image sequences. Experimental results show that the proposed method has an encouraging performance.
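A toy illustration of the underlying intuition: a recaptured photo of a face tends to lose high-frequency content, so the ratio of high-frequency to total spectral energy is lower than for a live capture. The cutoff and the crude blur used to stand in for a photo are assumptions, not the paper's exact measure:

```python
import numpy as np

def high_frequency_ratio(gray_img, cutoff=0.25):
    """Ratio of Fourier energy outside a central low-frequency disc to the
    total energy; cutoff is the disc radius as a fraction of the half size."""
    f = np.fft.fftshift(np.fft.fft2(gray_img))
    power = np.abs(f) ** 2
    h, w = power.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    high = power[r > cutoff * min(h, w) / 2].sum()
    return high / power.sum()

# a live capture should give the larger ratio than a blurred print/replay
rng = np.random.default_rng(4)
live = rng.random((128, 128))
replay = np.array(live)
for _ in range(3):                    # crude blur stands in for a recaptured photo
    replay = (replay + np.roll(replay, 1, 0) + np.roll(replay, 1, 1)) / 3
print(high_frequency_ratio(live), high_frequency_ratio(replay))
```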
Analysis of Technology for Compact Coherent Lidar
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin
1997-01-01
Recent advances in the area of solid state and semiconductor lasers have created new possibilities for the development of compact and reliable coherent lidars for a wide range of applications. These applications include: Automated Rendezvous and Capture, wind shear and clear air turbulence detection, aircraft wake vortex detection, and automobile collision avoidance. The work performed by the UAH personnel under this Delivery Order concentrated on the design and analysis of a compact coherent lidar system capable of measuring range and velocity of hard targets and providing air mass velocity data. The scope of this work is as follows. a. Investigate various laser sources and optical signal detection configurations in support of a compact and lightweight coherent laser radar to be developed for precision range and velocity measurements of hard and fuzzy targets. Through interaction with MSFC engineers, the most suitable laser source and signal detection technique that can provide a reliable, compact and lightweight laser radar design will be selected. b. Analyze and specify the coherent laser radar system configuration and assist with its optical and electronic design efforts. Develop a system design including its optical layout. Specify all optical components and provide the general requirements of the electronic subsystems including laser beam modulator and demodulator drivers, detector electronic interface, and the signal processor. c. Perform a thorough performance analysis to predict the system measurement range and accuracy. This analysis will utilize various coherent laser radar sensitivity formulations and different target models.
NASA Astrophysics Data System (ADS)
Bandeira, Lourenço; Ding, Wei; Stepinski, Tomasz F.
2012-01-01
Counting craters is a paramount tool of planetary analysis because it provides relative dating of planetary surfaces. Dating surfaces with high spatial resolution requires counting a very large number of small, sub-kilometer size craters. Exhaustive manual surveys of such craters over extensive regions are impractical, sparking interest in designing crater detection algorithms (CDAs). As a part of our effort to design a CDA, which is robust and practical for planetary research analysis, we propose a crater detection approach that utilizes both shape and texture features to identify efficiently sub-kilometer craters in high resolution panchromatic images. First, a mathematical morphology-based shape analysis is used to identify regions in an image that may contain craters; only those regions - crater candidates - are the subject of further processing. Second, image texture features in combination with the boosting ensemble supervised learning algorithm are used to accurately classify previously identified candidates into craters and non-craters. The design of the proposed CDA is described and its performance is evaluated using a high resolution image of Mars for which sub-kilometer craters have been manually identified. The overall detection rate of the proposed CDA is 81%, the branching factor is 0.14, and the overall quality factor is 72%. This performance is a significant improvement over the previous CDA based exclusively on the shape features. The combination of performance level and computational efficiency offered by this CDA makes it attractive for practical application.
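A rough sketch of the first, shape-based candidate stage using basic morphological operations on a thresholded image; the structuring-element sizes, darkness percentile, and minimum area are placeholders, and the second (texture + boosting) stage is omitted:

```python
import numpy as np
from scipy import ndimage as ndi

def crater_candidates(image, dark_percentile=20, min_area=20):
    """Return bounding slices of dark, compact blobs (e.g. shadowed crater
    interiors) as candidate regions for later texture-based classification."""
    dark = image < np.percentile(image, dark_percentile)
    dark = ndi.binary_opening(dark, structure=np.ones((3, 3)))   # remove speckle
    dark = ndi.binary_closing(dark, structure=np.ones((5, 5)))   # fill small gaps
    labels, n = ndi.label(dark)
    slices = ndi.find_objects(labels)
    sizes = ndi.sum(dark, labels, index=np.arange(1, n + 1))
    return [sl for sl, s in zip(slices, sizes) if s >= min_area]

# toy usage: a flat scene containing two dark circular "craters"
yy, xx = np.mgrid[:200, :200]
img = np.full((200, 200), 0.8)
img[(yy - 60) ** 2 + (xx - 60) ** 2 < 15 ** 2] = 0.2
img[(yy - 150) ** 2 + (xx - 120) ** 2 < 10 ** 2] = 0.2
print(len(crater_candidates(img)))       # expected: 2
```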
Development and Measurements of a Mid-Infrared Multi-Gas Sensor System for CO, CO2 and CH4 Detection
Dong, Ming; Zheng, Chuantao; Miao, Shuzhuo; Zhang, Yu; Du, Qiaoling; Wang, Yiding
2017-01-01
A multi-gas sensor system was developed that uses a single broadband light source and multiple carbon monoxide (CO), carbon dioxide (CO2) and methane (CH4) pyroelectric detectors by use of the time division multiplexing (TDM) technique. A stepper motor-based rotating system and a single-reflection spherical optical mirror were designed and adopted to realize and enhance multi-gas detection. Detailed measurements under static detection mode (without rotation) and dynamic mode (with rotation) were performed to study the performance of the sensor system for the three gas species. Effects of the motor rotating period on sensor performances were also investigated and a rotation speed of 0.4π rad/s was required to obtain a stable sensing performance, corresponding to a detection period of ~10 s to realize one round of detection. Based on an Allan deviation analysis, the 1σ detection limits under static operation are 2.96, 4.54 and 2.84 parts per million in volume (ppmv) for CO, CO2 and CH4, respectively and the 1σ detection limits under dynamic operations are 8.83, 8.69 and 10.29 ppmv for the three gas species, respectively. The reported sensor has potential applications in various fields requiring CO, CO2 and CH4 detection such as in coal mines. PMID:28953260
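A minimal Allan-deviation computation of the kind used to derive such 1σ detection limits; the concentration time series below is synthetic:

```python
import numpy as np

def allan_deviation(x, max_m=None):
    """Classic (non-overlapping) Allan deviation of a concentration series x
    for averaging factors m = 1, 2, 4, ... samples."""
    x = np.asarray(x, dtype=float)
    max_m = max_m or len(x) // 4
    ms, adev, m = [], [], 1
    while m <= max_m:
        n = len(x) // m
        means = x[:n * m].reshape(n, m).mean(axis=1)      # block averages
        adev.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
        ms.append(m)
        m *= 2
    return np.array(ms), np.array(adev)

# white-noise demo: the Allan deviation should fall roughly as 1/sqrt(m)
rng = np.random.default_rng(5)
ppm = rng.normal(100.0, 3.0, 4096)        # synthetic CO readings, ppm
m, sigma = allan_deviation(ppm)
print(list(zip(m[:4], np.round(sigma[:4], 2))))
```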
CO and CO2 dual-gas detection based on mid-infrared wideband absorption spectroscopy
NASA Astrophysics Data System (ADS)
Dong, Ming; Zhong, Guo-qiang; Miao, Shu-zhuo; Zheng, Chuan-tao; Wang, Yi-ding
2018-03-01
A dual-gas sensor system is developed for CO and CO2 detection using a single broadband light source, pyroelectric detectors and time-division multiplexing (TDM) technique. A stepper motor based rotating system and a single-reflection spherical optical mirror are designed and adopted for realizing and enhancing dual-gas detection. Detailed measurements under static detection mode (without rotation) and dynamic mode (with rotation) are performed to study the performance of the sensor system for the two gas samples. The detection period is 7.9 s in one round of detection by scanning the two detectors. Based on an Allan deviation analysis, the 1σ detection limits under static operation are 3.0 parts per million (ppm) in volume and 2.6 ppm for CO and CO2, respectively, and those under dynamic operation are 9.4 ppm and 10.8 ppm for CO and CO2, respectively. The reported sensor has potential applications in various fields requiring CO and CO2 detection such as in the coal mine.
Sun, Xiange; Li, Bowei; Qi, Anjin; Tian, Chongguo; Han, Jinglong; Shi, Yajun; Lin, Bingcheng; Chen, Lingxin
2018-02-01
In this work, a novel rotational microfluidic paper-based device was developed to improve the accuracy and performance of multiplexed colorimetric detection by effectively avoiding the diffusion of colorimetric reagent on the detection zone. Integrated paper-based rotational valves were used to control the connection or disconnection between detection zones and fluid channels. Through manipulation of the rotational valves, this rotational paper-based device could prevent the random diffusion of colorimetric reagent and considerably reduce the error of quantitative analysis. Multiplexed colorimetric detection of the heavy metals Ni(II), Cu(II) and Cr(VI) was implemented on the rotational device, and the detection limits were found to be 4.8, 1.6, and 0.18 mg/L, respectively. The developed rotational device showed a clear advantage in improving detection accuracy and is expected to be a low-cost, portable analytical platform for on-site detection. Copyright © 2017 Elsevier B.V. All rights reserved.
Herbicide Orange Site Characterization Study Naval Construction Battalion Center
1987-01-01
... U.S. Testing Laboratories for analysis. Over 200 additional analyses were performed for a variety of quality assurance criteria. The resultant data... [Table 9, "NCBC Performance Audit Sample Analysis Summary (Series 1)": sample number, reported TCDD concentration (ppb), detection limit, relative ...] ...limit rather than estimating the variance of the results. The sample results were transformed using the natural logarithm. The Shapiro-Wilk W test...
A SVM-based quantitative fMRI method for resting-state functional network detection.
Song, Xiaomu; Chen, Nan-kuei
2014-09-01
Resting-state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in the connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to variation of data characteristics across sessions and subjects, and generates unreliable mapping results. In this study, a new method is presented for resting-state fMRI data analysis. Specifically, the resting-state network mapping is formulated as an outlier detection process that is implemented using one-class support vector machine (SVM). The results are refined by using a spatial-feature domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of functionally connected and unconnected instead of a threshold. Multiple features for resting-state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data. A comparison study was also performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting-state quantitative fMRI studies. Copyright © 2014 Elsevier Inc. All rights reserved.
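An illustrative sketch of the outlier-detection formulation with a one-class SVM, using random feature vectors in place of voxel features; the parameter values are assumptions, and the prototype-selection and two-class refinement steps are omitted:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(6)
# rows = voxels, columns = features (e.g. seed correlation, spectral power, ...)
background = rng.normal(0.0, 1.0, (5000, 4))        # unconnected voxels
network = rng.normal(3.0, 1.0, (100, 4))            # functionally connected voxels
features = np.vstack([background, network])

# the one-class SVM models the bulk (unconnected) distribution; the small
# fraction of network voxels then appears as outliers (prediction == -1)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(features)
outlier = ocsvm.predict(features) == -1
print("flagged voxels:", outlier.sum(), "of which true network:", outlier[-100:].sum())
```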
Wang, Junping; Xie, Xinfang; Feng, Jinsong; Chen, Jessica C; Du, Xin-jun; Luo, Jiangzhao; Lu, Xiaonan; Wang, Shuo
2015-07-02
Listeria monocytogenes is a facultatively anaerobic, Gram-positive, rod-shaped foodborne bacterium causing invasive infection, listeriosis, in susceptible populations. Rapid and high-throughput detection of this pathogen in dairy products is critical, as milk and other dairy products have been implicated as food vehicles in several outbreaks. Here we evaluated confocal micro-Raman spectroscopy (785 nm laser) coupled with chemometric analysis to distinguish six closely related Listeria species, including L. monocytogenes, in both liquid media and milk. Raman spectra of different Listeria species and other bacteria (i.e., Staphylococcus aureus, Salmonella enterica and Escherichia coli) were collected to create two independent databases for detection in media and milk, respectively. Unsupervised chemometric models including principal component analysis and hierarchical cluster analysis were applied to differentiate L. monocytogenes from other Listeria and other bacteria. To further evaluate the performance and reliability of the unsupervised chemometric analyses, supervised chemometrics were performed, including two discriminant analyses (DA) and soft independent modeling of class analogies (SIMCA). By analyzing Raman spectra via the two DA-based chemometric models, average identification accuracies of 97.78% and 98.33% for L. monocytogenes in media, and 95.28% and 96.11% in milk, were obtained, respectively. SIMCA analysis also resulted in satisfactory average classification accuracies (over 93% in both media and milk). This Raman spectroscopy-based detection of L. monocytogenes in media and milk can be finished within a few hours and requires no extensive sample preparation. Copyright © 2015 Elsevier B.V. All rights reserved.
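A small sketch of the unsupervised chemometric step (PCA followed by hierarchical clustering) on synthetic "spectra"; real Raman preprocessing such as baseline correction and normalization is omitted, and the peak positions are arbitrary:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
wavenumbers = np.linspace(600, 1800, 400)

def spectrum(peak):                       # crude single-peak stand-in for a Raman spectrum
    return np.exp(-((wavenumbers - peak) / 15.0) ** 2) + rng.normal(0, 0.02, 400)

spectra = np.array([spectrum(1000) for _ in range(10)] +   # species A replicates
                   [spectrum(1450) for _ in range(10)])     # species B replicates
scores = PCA(n_components=3).fit_transform(spectra)
clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(clusters)    # the two species should separate into two clusters
```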
Comparison of human and algorithmic target detection in passive infrared imagery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Hutchinson, Meredith
2003-09-01
We have designed an experiment that compares the performance of human observers and a scale-insensitive target detection algorithm that uses pixel-level information for the detection of ground targets in passive infrared imagery. The test database contains targets near clutter whose detectability ranged from easy to very difficult. Results indicate that human observers detect more "easy-to-detect" targets, and with far fewer false alarms, than the algorithm. For "difficult-to-detect" targets, human and algorithm detection rates are considerably degraded, and algorithm false alarms are excessive. Analysis of detections as a function of observer confidence shows that algorithm confidence attribution does not correspond to human attribution and does not adequately correlate with correct detections. The best target detection score for any human observer was 84%, as compared to 55% for the algorithm at the same false alarm rate. At 81%, the maximum detection score for the algorithm, the same human observer had 6 false alarms per frame as compared to 29 for the algorithm. Detector ROC curves and observer-confidence analysis benchmark the algorithm and provide insights into algorithm deficiencies and possible paths to improvement.
1988-10-01
A statistical analysis of the output signals of an acousto-optic spectrum analyzer (AOSA) is performed for the case when the input signal is a... [Keywords: signal processing, electronic warfare, radar countermeasures, acousto-optic, spectrum analyzer, statistical analysis, detection, estimation, Canada, modelling.]
Calcagni, Maria Lucia; Taralli, Silvia; Cardillo, Giuseppe; Graziano, Paolo; Ialongo, Pasquale; Mattoli, Maria Vittoria; Di Franco, Davide; Caldarella, Carmelo; Carleo, Francesco; Indovina, Luca; Giordano, Alessandro
2016-04-01
Solitary pulmonary nodule (SPN) still represents a diagnostic challenge. The aim of our study was to evaluate the diagnostic performance of ¹⁸F-fluorodeoxyglucose positron emission tomography-computed tomography in one of the largest samples of small SPNs, incidentally detected in subjects without a history of malignancy (a nonscreening population) and undetermined at computed tomography. One hundred and sixty-two small (>0.8 to 1.5 cm) and, for comparison, 206 large nodules (>1.5 to 3 cm) were retrospectively evaluated. The diagnostic performance of ¹⁸F-fluorodeoxyglucose visual analysis, receiver-operating characteristic (ROC) analysis for the maximum standardized uptake value (SUVmax), and Bayesian analysis were assessed using histology or radiological follow-up as the gold standard. In 162 small nodules, ¹⁸F-fluorodeoxyglucose visual and ROC analyses (SUVmax = 1.3) provided 72.6% and 77.4% sensitivity and 88.0% and 82.0% specificity, respectively. The prevalence of malignancy was 38%; Bayesian analysis provided 78.8% positive and 16.0% negative posttest probabilities of malignancy. In 206 large nodules, ¹⁸F-fluorodeoxyglucose visual and ROC analyses (SUVmax = 1.9) provided 89.5% and 85.1% sensitivity and 70.8% and 79.2% specificity, respectively. The prevalence of malignancy was 65%; Bayesian analysis provided 85.0% positive and 21.6% negative posttest probabilities of malignancy. In both groups, malignant nodules had a significantly higher SUVmax (p < 0.0001) than benign nodules. Only in the small group were malignant nodules significantly larger (p = 0.0054) than benign ones. ¹⁸F-fluorodeoxyglucose can be clinically relevant to rule in and rule out malignancy in undetermined small SPNs, incidentally detected in a nonscreening population with intermediate pretest probability of malignancy, as well as in larger ones. Visual analysis can be considered an optimal diagnostic criterion, adequately detecting a wide range of malignant nodules with different metabolic activity. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
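A short worked illustration of the Bayesian step: sensitivity, specificity, and pre-test prevalence are combined via Bayes' rule into positive and negative post-test probabilities of malignancy. The input numbers are those quoted above for the small-nodule visual analysis:

```python
def posttest_probabilities(sensitivity, specificity, prevalence):
    """Positive and negative post-test probabilities of disease via Bayes' rule."""
    p_pos = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    p_neg = ((1 - sensitivity) * prevalence) / (
        (1 - sensitivity) * prevalence + specificity * (1 - prevalence))
    return p_pos, p_neg

# small nodules, visual analysis: Se = 72.6%, Sp = 88.0%, prevalence = 38%
print(posttest_probabilities(0.726, 0.880, 0.38))   # ~ (0.788, 0.160), as reported
```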
NASA Astrophysics Data System (ADS)
Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan
2007-11-01
Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical property of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture the higher-order statistics. However, thus far there have been no researchers who have definitely proposed kernel FKT (KFKT) and researched its detection performance. For accurately detecting potential small targets from infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and the proposed framework is competent to automatically detect and track infrared point targets.
Korompoki, Eleni; Del Giudice, Angela; Hillmann, Steffi; Malzahn, Uwe; Gladstone, David J; Heuschmann, Peter; Veltkamp, Roland
2017-01-01
Background and purpose: The detection rate of atrial fibrillation has not been studied specifically in transient ischemic attack (TIA) patients, although extrapolation from ischemic stroke may be inadequate. We conducted a systematic review and meta-analysis to determine the rate of newly diagnosed atrial fibrillation using different methods of ECG monitoring in TIA. Methods: A comprehensive literature search was performed following a pre-specified protocol and the PRISMA statement. Prospective observational studies and randomized controlled trials were considered that included TIA patients who underwent cardiac monitoring for >12 h. The primary outcome was the frequency of detection of atrial fibrillation lasting ≥30 s. Analyses of subgroups and of duration and type of monitoring were performed. Results: Seventeen studies enrolling 1163 patients were included. The pooled atrial fibrillation detection rate for all methods was 4% (95% CI: 2-7%). The yield of monitoring was higher in selected cohorts (higher age, more extensive testing for arrhythmias before enrolment, or presumed cardioembolic/cryptogenic cause) than in unselected cohorts (7% vs 3%). Pooled mean atrial fibrillation detection rates rose with duration of monitoring: 4% (24 h), 5% (24 h to 7 days) and 6% (>7 days), respectively. The yield of non-invasive monitoring was significantly lower than that of invasive monitoring (4% vs. 11%). Significant heterogeneity was observed among studies (I² = 60.61%). Conclusion: This first meta-analysis of atrial fibrillation detection in TIA patients finds a lower atrial fibrillation detection rate in TIA than reported for ischemic stroke and TIA cohorts in previous meta-analyses. Prospective studies are needed to determine the actual prevalence of atrial fibrillation and the optimal diagnostic procedure for atrial fibrillation detection in TIA.
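A compact sketch of DerSimonian-Laird random-effects pooling of per-study detection proportions on the logit scale, with made-up study counts; the published analysis may differ in transformation, weighting, and software:

```python
import numpy as np

def pooled_proportion_dl(events, totals):
    """DerSimonian-Laird random-effects pooled proportion (logit scale)."""
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = (events + 0.5) / (totals + 1.0)                  # continuity-corrected
    y = np.log(p / (1 - p))                              # logit transform
    v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                            # random-effects weights
    y_pooled = np.sum(w_star * y) / np.sum(w_star)
    return 1.0 / (1.0 + np.exp(-y_pooled))               # back-transform

# hypothetical per-study AF detections / monitored TIA patients
print(pooled_proportion_dl([3, 5, 2, 8, 1], [60, 120, 45, 150, 40]))
```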
Inagaki, Shinsuke; Hirashima, Haruo; Taniguchi, Sayuri; Higashi, Tatsuya; Min, Jun Zhe; Kikura-Hanajiri, Ruri; Goda, Yukihiro; Toyo'oka, Toshimasa
2012-12-01
A rapid enantiomeric separation and simultaneous determination method based on ultra high performance liquid chromatography (UHPLC) was developed for phenethylamine-type abused drugs using (R)-(-)-4-(N,N-dimethylaminosulfonyl)-7-(3-isothiocyanatopyrrolidin-1-yl)-2,1,3-benzoxadiazole ((R)-(-)-DBD-Py-NCS) as the chiral fluorescent derivatization reagent. The derivatives were rapidly enantiomerically separated by reversed-phase UHPLC using a column of 2.3-µm octadecylsilica (ODS) particles by isocratic elution with water-methanol or water-acetonitrile systems as the mobile phase. The proposed method was applied to the analysis of products containing illicit drugs distributed in the Japanese market. Among the products, 1-(3,4-methylenedioxyphenyl)butan-2-amine (BDB) and 1-(2-methoxy-4,5-methylenedioxyphenyl)propan-2-amine (MMDA-2) were detected in racemic form. Furthermore, the method was successfully applied to the analysis of hair specimens from rats that were continuously dosed with diphenyl(pyrrolidin-2-yl)methanol (D2PM). Using UHPLC-fluorescence (FL) detection, (R)- and (S)-D2PM from hair specimens were enantiomerically separated and detected with high sensitivity. The detection limits of (R)- and (S)-D2PM were 0.12 and 0.21 ng/mg hair, respectively (signal-to-noise ratio (S/N) = 3). Copyright © 2012 John Wiley & Sons, Ltd.
Breath analysis based on micropreconcentrator for early cancer diagnosis
NASA Astrophysics Data System (ADS)
Lee, Sang-Seok
2018-02-01
We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been discussed previously. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable, and micropreconcentrators based on MEMS technology or nanotechnology are very promising for this purpose. A micropreconcentrator-based breath analysis technique also has advantages in terms of cost and availability for the diagnosis of various cancers. In this paper, we introduce the design, fabrication and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure whose shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection with our micropreconcentrators coupled to a standard gas chromatography system that, on its own, detects VOC in gas samples only at the ppm level. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a 115 times better concentration ratio than the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for a new cancer diagnosis approach using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.
Tavčar, Eva; Turk, Erika; Kreft, Samo
2012-01-01
The most commonly used technique for water content determination is Karl-Fischer titration with electrometric detection, requiring specialized equipment. When appropriate equipment is not available, the method can be performed through visual detection of a titration endpoint, which does not enable an analysis of colored samples. Here, we developed a method with spectrophotometric detection of a titration endpoint, appropriate for moisture determination of colored samples. The reaction takes place in a sealed 4 ml cuvette. Detection is performed at 520 nm. Titration endpoint is determined from the graph of absorbance plotted against titration volume. The method has appropriate reproducibility (RSD = 4.3%), accuracy, and linearity (R² = 0.997). PMID:22567558
NASA Astrophysics Data System (ADS)
Agafonova, N.; Aleksandrov, A.; Anokhina, A.; Aoki, S.; Ariga, A.; Ariga, T.; Bender, D.; Bertolin, A.; Bozza, C.; Brugnera, R.; Buonaura, A.; Buontempo, S.; Büttner, B.; Chernyavsky, M.; Chukanov, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; De Serio, M.; Del Amo Sanchez, P.; Di Crescenzo, A.; Di Ferdinando, D.; Di Marco, N.; Dmitrievski, S.; Dracos, M.; Duchesneau, D.; Dusini, S.; Dzhatdoev, T.; Ebert, J.; Ereditato, A.; Fini, R. A.; Fukuda, T.; Galati, G.; Garfagnini, A.; Giacomelli, G.; Göllnitz, C.; Goldberg, J.; Gornushkin, Y.; Grella, G.; Guler, M.; Gustavino, C.; Hagner, C.; Hara, T.; Hollnagel, A.; Hosseini, B.; Ishida, H.; Ishiguro, K.; Jakovcic, K.; Jollet, C.; Kamiscioglu, C.; Kamiscioglu, M.; Kawada, J.; Kim, J. H.; Kim, S. H.; Kitagawa, N.; Klicek, B.; Kodama, K.; Komatsu, M.; Kose, U.; Kreslo, I.; Lauria, A.; Lenkeit, J.; Ljubicic, A.; Longhin, A.; Loverre, P.; Malgin, A.; Malenica, M.; Mandrioli, G.; Matsuo, T.; Matveev, V.; Mauri, N.; Medinaceli, E.; Meregaglia, A.; Mikado, S.; Monacelli, P.; Montesi, M. C.; Morishima, K.; Muciaccia, M. T.; Naganawa, N.; Naka, T.; Nakamura, M.; Nakano, T.; Nakatsuka, Y.; Niwa, K.; Ogawa, S.; Okateva, N.; Olshevsky, A.; Omura, T.; Ozaki, K.; Paoloni, A.; Park, B. D.; Park, I. G.; Pasqualini, L.; Pastore, A.; Patrizii, L.; Pessard, H.; Pistillo, C.; Podgrudkov, D.; Polukhina, N.; Pozzato, M.; Pupilli, F.; Roda, M.; Rokujo, H.; Roganova, T.; Rosa, G.; Ryazhskaya, O.; Sato, O.; Schembri, A.; Shakiryanova, I.; Shchedrina, T.; Sheshukov, A.; Shibuya, H.; Shiraishi, T.; Shoziyoev, G.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Spinetti, M.; Stanco, L.; Starkov, N.; Stellacci, S. M.; Stipcevic, M.; Strauss, T.; Strolin, P.; Takahashi, S.; Tenti, M.; Terranova, F.; Tioukov, V.; Tufanli, S.; Vilain, P.; Vladimirov, M.; Votano, L.; Vuilleumier, J. L.; Wilquet, G.; Wonsak, B.; Yoon, C. S.; Zemskova, S.; Zghiche, A.
2014-08-01
The OPERA experiment, designed to perform the first observation of νμ → ντ oscillations in appearance mode through the detection of the τ leptons produced in charged current interactions, has collected data from 2008 to 2012. In the present paper, the procedure developed to detect particle decays, occurring over distances of the order of a millimetre from the neutrino interaction point, is described in detail and applied to the search for charmed hadrons, which show decay topologies similar to those of the τ lepton. In the analysed sample, 50 charm decay candidate events are observed, in agreement with the number expected from simulation, proving that the detector performance and the analysis chain applied to neutrino events are well reproduced by the OPERA simulation and thus validating the methods for ντ appearance detection.
Operational Performance Analysis of Passive Acoustic Monitoring for Killer Whales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzner, Shari; Fu, Tao; Ren, Huiying
2011-09-30
For the planned tidal turbine site in Puget Sound, WA, the main concern is to protect Southern Resident Killer Whales (SRKW) due to their Endangered Species Act status. A passive acoustic monitoring system is proposed because the whales emit vocalizations that can be detected by a passive system. The algorithm for detection is implemented in two stages. The first stage is an energy detector designed to detect candidate signals. The second stage is a spectral classifier that is designed to reduce false alarms. The evaluation of the detection algorithm presented here incorporates behavioral models of the species of interest and environmental models of noise levels and potential false alarm sources to provide a realistic characterization of expected operational performance.
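A toy two-stage detector in the spirit described above: an energy detector proposes candidate frames, then a crude spectral check rejects candidates whose power is not concentrated in the call band. The frame length, band edges, and thresholds are placeholders, not the system's parameters:

```python
import numpy as np
from scipy.signal import welch

def two_stage_detector(x, fs, frame_s=0.5, energy_k=4.0, band=(1000, 10000), band_frac=0.5):
    """Stage 1: flag frames whose RMS exceeds energy_k times the median RMS.
    Stage 2: keep only frames with >= band_frac of their power inside `band`."""
    n = int(frame_s * fs)
    frames = x[:len(x) // n * n].reshape(-1, n)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    candidates = np.flatnonzero(rms > energy_k * np.median(rms))
    detections = []
    for i in candidates:
        f, pxx = welch(frames[i], fs=fs, nperseg=min(n, 1024))
        in_band = pxx[(f >= band[0]) & (f <= band[1])].sum() / pxx.sum()
        if in_band >= band_frac:
            detections.append(i)
    return detections

# synthetic example: background noise plus one tonal "call" at 5 kHz in frame 6
fs = 48000
rng = np.random.default_rng(8)
x = rng.normal(0, 0.05, fs * 5)
t = np.arange(int(0.5 * fs)) / fs
x[3 * fs:3 * fs + len(t)] += 0.5 * np.sin(2 * np.pi * 5000 * t)
print(two_stage_detector(x, fs))          # expected: [6]
```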
NASA Technical Reports Server (NTRS)
Rounds, M. A.; Nielsen, S. S.; Mitchell, C. A. (Principal Investigator)
1993-01-01
The use of gradient anion-exchange HPLC, with a simple post-column detection system, is described for the separation of myo-inositol phosphates, including "phytic acid" (myo-inositol hexaphosphate). Hexa-, penta-, tetra-, tri- and diphosphate members of this homologous series are clearly resolved within 30 min. This method should facilitate analysis and quantitation of "phytic acid" and other inositol phosphates in plant, food, and soil samples.
Li, Shi-Rong; Wang, Zhen-Ming; Wang, Yu-Hui; Wang, Xi-Bo; Zhao, Jian-Qiang; Xue, Hai-Bin; Jiang, Fu-Guo
2015-01-01
Detection of cervical high grade lesions in patients with atypical squamous cells of undetermined significance (ASCUS) is still a challenge. Our study tested the efficacy of paired box gene 1 (PAX1) methylation analysis by methylation-sensitive high-resolution melting (MS-HRM) in the detection of high grade lesions in ASCUS and compared its performance with the hybrid capture 2 (HC2) human papillomavirus (HPV) test. A total of 463 consecutive ASCUS women from primary screening were selected. Their cervical scrapings were collected and assessed by PAX1 methylation analysis (MS-HRM) and the high-risk HPV-DNA test (HC2). All patients with ASCUS were admitted to colposcopy and cervical biopsies. The Chi-square test was used to test the differences in PAX1 methylation or HPV infection between groups. The specificity, sensitivity, and accuracy for detecting CIN2+ lesions were: 95.6%, 82.4%, and 94.6%, respectively, for the PAX1 MS-HRM test; and 59.7%, 64.7%, and 60.0% for the HC2 HPV test. The PAX1 methylation analysis by MS-HRM demonstrated a better performance than the high-risk HPV-DNA test for the detection of high grade lesions (CIN2+) in ASCUS cases. This approach could screen out the majority of low grade cases of ASCUS, and thus reduce the referral rate to colposcopy.
NASA Astrophysics Data System (ADS)
Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo
2017-07-01
Modal analysis is commonly considered an effective tool for obtaining the intrinsic characteristics of structures, including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) can be regarded as an effective numerical tool for performing modal analysis. In this paper, experimental strain modal analysis based on the CMIF is introduced. Moreover, a distributed fiber-optic sensor, as a dense measuring device, has been applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes could be obtained. In order to test the effectiveness of the method, a lumped mass, considered as a linear damage component, was attached to the surface of the beam, and damage detection based on strain mode shapes was carried out. The results show that strain modal parameters can be estimated effectively by utilizing the CMIF, based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.
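A minimal numerical illustration of the complex mode indicator function: at each frequency line the CMIF values are the singular values of the frequency response (here, strain FRF) matrix, and peaks across frequency indicate modes. The 2-DOF FRF below is synthetic and only exercises the computation:

```python
import numpy as np

def cmif(frf):
    """CMIF curves from an FRF array of shape (n_freq, n_outputs, n_inputs):
    the singular values of the FRF matrix at every frequency line."""
    return np.array([np.linalg.svd(frf[k], compute_uv=False) for k in range(frf.shape[0])])

# synthetic 2-DOF receptance FRF (made-up mode shapes and damping)
w = np.linspace(1, 100, 500)
wn, zeta = np.array([30.0, 70.0]), 0.02
modes = np.array([[1.0, 0.8], [0.6, -0.9]])          # outputs x modes
frf = np.zeros((len(w), 2, 2), dtype=complex)
for r in range(2):
    den = wn[r] ** 2 - w ** 2 + 2j * zeta * wn[r] * w
    frf += (modes[:, r, None] * modes[:, r][None, :])[None, :, :] / den[:, None, None]

curves = cmif(frf)
print(w[np.argmax(curves[:, 0])])    # the first CMIF curve peaks near a natural frequency
```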
Application of the MIDAS approach for analysis of lysine acetylation sites.
Evans, Caroline A; Griffiths, John R; Unwin, Richard D; Whetton, Anthony D; Corfe, Bernard M
2013-01-01
Multiple Reaction Monitoring Initiated Detection and Sequencing (MIDAS™) is a mass spectrometry-based technique for the detection and characterization of specific post-translational modifications (Unwin et al. 4:1134-1144, 2005), for example acetylated lysine residues (Griffiths et al. 18:1423-1428, 2007). The MIDAS™ technique has application for the discovery and analysis of acetylation sites. It is a hypothesis-driven approach that requires a priori knowledge of the primary sequence of the target protein and a proteolytic digest of this protein. MIDAS essentially performs a targeted search for the presence of modified, for example acetylated, peptides. The detection is based on the combination of the predicted molecular weight (measured as a mass-to-charge ratio) of the acetylated proteolytic peptide and a diagnostic fragment (product ion of m/z 126.1), which is generated by specific fragmentation of acetylated peptides during collision-induced dissociation performed in tandem mass spectrometry (MS) analysis. Sequence information is subsequently obtained, which enables acetylation site assignment. The technique of MIDAS was later trademarked by ABSciex for targeted protein analysis in which an MRM scan is combined with a full MS/MS product ion scan to enable sequence confirmation.
Quality Analysis of Chlorogenic Acid and Hyperoside in Crataegi fructus
Weon, Jin Bae; Jung, Youn Sik; Ma, Choong Je
2016-01-01
Background: Crataegi fructus is a herbal medicine used as a stomachic and for sterilization and alcohol detoxification. Chlorogenic acid and hyperoside are the major compounds in Crataegi fructus. Objective: In this study, we established a novel high-performance liquid chromatography (HPLC)-diode array detection analysis method for chlorogenic acid and hyperoside for the quality control of Crataegi fructus. Materials and Methods: HPLC analysis was achieved on a reverse-phase C18 column (5 μm, 4.6 mm × 250 mm) using water and acetonitrile as the mobile phase with a gradient system. The method was validated for linearity, precision, and accuracy. About 31 batches of Crataegi fructus samples collected from Korea and China were analyzed using the HPLC fingerprint of the developed method. The contents of chlorogenic acid and hyperoside were then compared for quality evaluation of Crataegi fructus. Results: The results showed that the average contents (w/w %) of chlorogenic acid and hyperoside in Crataegi fructus collected from Korea were 0.0438% and 0.0416%, respectively, and those in samples collected from China were 0.0399% and 0.0325%, respectively. Conclusion: The established HPLC analysis method was stable and could provide efficient quality evaluation for monitoring of commercial Crataegi fructus. SUMMARY: A quantitative analysis method for chlorogenic acid and hyperoside in Crataegi fructus is developed by high-performance liquid chromatography (HPLC)-diode array detection. The established HPLC analysis method is validated for linearity, precision, and accuracy. The developed method was successfully applied to the quantitative analysis of Crataegi fructus samples collected from Korea and China. Abbreviations used: HPLC: High-performance liquid chromatography, GC: Gas chromatography, MS: Mass spectrometer, LOD: Limits of detection, LOQ: Limits of quantification, RSD: Relative standard deviation, RRT: Relative retention time, RPA: Relative peak area. PMID:27076744
Eberlin, Livia S; Abdelnur, Patricia V; Passero, Alan; de Sa, Gilberto F; Daroda, Romeu J; de Souza, Vanderlea; Eberlin, Marcos N
2009-08-01
High performance thin layer chromatography (HPTLC) combined with on-spot detection and characterization via easy ambient sonic-spray ionization mass spectrometry (EASI-MS) is applied to the analysis of biodiesel (B100) and biodiesel-petrodiesel blends (BX). HPTLC provides chromatographic resolution of major components whereas EASI-MS allows on-spot characterization performed directly on the HPTLC surface at ambient conditions. Constituents (M) are detected by EASI-MS in a one component-one ion fashion as either [M + Na]⁺ or [M + H]⁺. For both B100 and BX samples, typical profiles of fatty acid methyl esters (FAME) detected as [FAME + Na]⁺ ions allow biodiesel typification. The spectrum of the petrodiesel spot displays a homologous series of protonated alkyl pyridines which are characteristic for petrofuels (natural markers). The spectrum for residual or admixture oil spots is characterized by sodiated triglycerides [TAG + Na]⁺. The application of HPTLC to analyze B100 and BX samples and its combination with EASI-MS for on-spot characterization and quality control is demonstrated.
Henriquez-Camacho, C; Villafuerte-Gutierrez, P; Pérez-Molina, J A; Losa, J; Gotuzzo, E; Cheyne, N
2017-07-01
International health agencies have promoted nontargeted universal (opt-out) HIV screening tests in different settings, including emergency departments (EDs). We performed a systematic review and meta-analysis to assess the testing uptake of strategies (opt-in targeted, opt-in nontargeted and opt-out) to detect new cases of HIV infection in EDs. We searched the Pubmed and Embase databases, from 1984 to April 2015, for opt-in and opt-out HIV diagnostic strategies used in EDs. Randomized controlled or quasi-experimental studies were included. We assessed the percentage of positive individuals among those tested for HIV infection in each programme (opt-in and opt-out strategies). The mean percentage was estimated by combining studies in a random-effect meta-analysis. The percentages of individuals tested in the programmes were compared in a random-effect meta-regression model. Data were analysed using Stata version 12. Quality assessments were performed using the Newcastle-Ottawa Scale. Of the 90 papers identified, 28 were eligible for inclusion. Eight trials used opt-out, 18 trials used opt-in, and two trials used both to detect new cases of HIV infection. The test was accepted and taken by 75 155 of 172 237 patients (44%) in the opt-out strategy, and by 73 581 of 382 992 patients (19%) in the opt-in strategy. The prevalence of HIV infection detected by the opt-out strategy was 0.40% (373 cases), that detected by the opt-in nontargeted strategy was 0.52% (419 cases), and that detected by the opt-in targeted strategy was 1.06% (52 cases). In this meta-analysis, the testing uptake of the opt-out strategy was not different from that of the opt-in strategy for detecting new cases of HIV infection in EDs. © 2016 British HIV Association.
Dong, Zongmei; Lou, Pei'an; Zhang, Pan; Chen, Peipei; Qiao, Cheng; Li, Ting
2015-12-01
To observe the relationship between alcohol dependence and newly detected hypertension in adult residents of Xuzhou city. Participants were sampled by a stratified multi-stage random cluster sampling method from February 2013 to June 2013 among permanent residents aged 18 and over in Xuzhou city. Alcohol dependence was defined with the Michigan Alcoholism Screening Test (MAST). Other information was obtained by questionnaire. Spearman correlation analysis and multivariate logistic regression analysis were performed to identify the relationship between alcohol dependence and newly detected hypertension. The alcohol dependence rate was 11.56% in the whole cohort (n=36 157): 22.02% (3 854/17 501) for males and 1.74% (324/18 656) for females (P<0.01). The newly detected hypertension rate was 9.46% (3 422/36 157) in the whole cohort and increased with the severity of alcohol dependence (P<0.01). Spearman correlation analysis showed that alcohol dependence was positively correlated with systolic blood pressure (r=0.071, P<0.01) and diastolic blood pressure (r=0.077, P<0.01). After adjusting for gender, age, marital status, body mass index, smoking status, physical activity level, educational level, income level and region, multivariate logistic regression analysis showed that alcohol dependence was an independent risk factor for hypertension (low alcohol dependence: OR=1.44, 95%CI 1.14-1.81, P<0.01; light alcohol dependence: OR=1.35, 95%CI 1.11-1.64, P<0.01; medium alcohol dependence: OR=1.83, 95%CI 1.40-2.41, P<0.01). Alcohol dependence is an independent risk factor for newly detected hypertension in adult residents of Xuzhou city, and intensive hypertension prevention and treatment strategies should be targeted at this population.
Bogani, Patrizia; Spiriti, Maria Michela; Lazzarano, Stefano; Arcangeli, Annarosa; Buiatti, Marcello; Minunni, Maria
2011-11-01
The World Anti-Doping Agency fears the use of gene doping to enhance athletic performance. Thus, a bioanalytical approach based on end-point PCR for detecting markers of transgenesis traceability was developed. A few sequences from two different vectors were selected using an animal model and traced in different tissues and at different times. In particular, the enhanced green fluorescent protein gene and a construct-specific new marker were targeted in the analysis. To make the developed detection approach open to future routine doping analysis, matrices such as urine and tears, as well as blood, were also tested. This study will have an impact on evaluating the traceability of vector transgenes for the detection of a gene doping event by non-invasive sampling.
High performance thin layer chromatography fingerprint analysis of guava (Psidium guajava) leaves
NASA Astrophysics Data System (ADS)
Astuti, M.; Darusman, L. K.; Rafi, M.
2017-05-01
High-performance thin layer chromatography (HPTLC) fingerprint analysis is commonly used for quality control of medicinal plants in terms of identification and authentication. In this study, we developed an HPTLC fingerprint analysis for the identification of guava (Psidium guajava) leaf raw material. A mixture of chloroform, acetone, and formic acid in the ratio 10:2:1 was used as the optimum mobile phase on an HPTLC silica plate, and 13 bands were detected. As reference markers we chose gallic acid (Rf = 0.21) and catechin (Rf = 0.11). The two compounds were detected as pale black bands at 366 nm after derivatization with sulfuric acid 10% v/v (in methanol) reagent. The method met the validation criteria, so the developed method can be used for quality control of guava leaves.
NASA Astrophysics Data System (ADS)
Fisher, Mark; Sikes, John; Prather, Mark
2004-09-01
The dog's nose is an effective, highly mobile sampling system, and the canine olfactory organs are an extremely sensitive detector. Having been trained to detect a wide variety of substances with exceptional results, canines are widely regarded as the 'gold standard' in chemical vapor detection. Historically, attempts to mimic the ability of dogs to detect vapors of explosives using electronic 'dog's noses' have proven difficult. However, recent advances in technology have resulted in the development of detection (i.e., sampling and sensor) systems with performance that is rapidly approaching that of trained canines. The Nomadics Fido was the first sensor to demonstrate under field conditions the detection of landmines with performance approaching that of canines. More recently, comparative testing of Fido against canines has revealed that electronic vapor detection, when coupled with effective sampling methods, can produce results comparable to those of highly trained canines. The results of these comparative tests will be presented, as will recent test results in which explosives hidden in cargo were detected using Fido with a high-volume sampling technique. Finally, the use of canines along with electronic sensors will be discussed as a means of improving the performance and expanding the capabilities of both methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.
2010-07-31
Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner to the ringdown data, so that mode estimation can be performed reliably. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis.
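As an illustration of the mode-estimation step described above, the following is a minimal Python/NumPy sketch of classical block Prony analysis on a synthetic ringdown. It is not the authors' recursive implementation, and the model order, sampling rate and test signal are assumed values chosen only for demonstration.

```python
import numpy as np

def prony_modes(y, dt, order):
    """Classical (block) Prony fit: estimate damped-sinusoid modes from ringdown data."""
    N = len(y)
    # Linear prediction: y[n] ~= -(a1*y[n-1] + ... + ap*y[n-p]) for n = p..N-1
    A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
    b = y[order:N]
    a, *_ = np.linalg.lstsq(A, -b, rcond=None)
    poles = np.roots(np.concatenate(([1.0], a)))   # discrete-time poles
    damping = np.log(np.abs(poles)) / dt           # 1/s (negative = decaying)
    freq_hz = np.angle(poles) / (2 * np.pi * dt)   # Hz
    V = np.vander(poles, N, increasing=True).T     # y[n] = sum_i amp_i * poles_i**n
    amp = np.linalg.lstsq(V, y.astype(complex), rcond=None)[0]
    return freq_hz, damping, amp

# Synthetic 0.3 Hz inter-area ringdown, 30 samples/s, decaying at 0.15 1/s
dt = 1.0 / 30
t = np.arange(0, 20, dt)
y = np.exp(-0.15 * t) * np.cos(2 * np.pi * 0.3 * t)
f, d, amp = prony_modes(y, dt, order=4)
dominant = np.argmax(np.abs(amp))
print(f[dominant], d[dominant])   # expected near +/-0.3 Hz and -0.15 1/s
```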
Jin, Bo; Krishnan, Balu; Adler, Sophie; Wagstyl, Konrad; Hu, Wenhan; Jones, Stephen; Najm, Imad; Alexopoulos, Andreas; Zhang, Kai; Zhang, Jianguo; Ding, Meiping; Wang, Shuang; Wang, Zhong Irene
2018-05-01
Focal cortical dysplasia (FCD) is a major pathology in patients undergoing surgical resection to treat pharmacoresistant epilepsy. Magnetic resonance imaging (MRI) postprocessing methods may provide essential help for detection of FCD. In this study, we utilized surface-based MRI morphometry and machine learning for automated lesion detection in a mixed cohort of patients with FCD type II from 3 different epilepsy centers. Sixty-one patients with pharmacoresistant epilepsy and histologically proven FCD type II were included in the study. The patients had been evaluated at 3 different epilepsy centers using 3 different MRI scanners. A T1 volumetric sequence was used for postprocessing. A normal database was constructed with 120 healthy controls. We also included 35 healthy test controls and 15 disease test controls with histologically confirmed hippocampal sclerosis to assess specificity. Features were calculated and incorporated into a nonlinear neural network classifier, which was trained to identify the lesional cluster. We optimized the threshold of the output probability map from the classifier by performing receiver operating characteristic (ROC) analyses. Success of detection was defined by overlap between the final cluster and the manual labeling. Performance was evaluated using k-fold cross-validation. A threshold of 0.9 showed an optimal sensitivity of 73.7% and specificity of 90.0%. The area under the curve for the ROC analysis was 0.75, which suggests a discriminative classifier. Sensitivity and specificity were not significantly different for patients from different centers, suggesting robustness of performance. The correct detection rate was significantly lower in patients with initially normal MRI than in patients with unequivocally positive MRI. Subgroup analysis showed that the size of the training group and of the normal control database impacted classifier performance. Automated surface-based MRI morphometry equipped with machine learning showed robust performance across cohorts from different centers and scanners. The proposed method may be a valuable tool to improve FCD detection in presurgical evaluation for patients with pharmacoresistant epilepsy. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.
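The threshold-optimization step mentioned above can be sketched in a few lines. This is a hedged illustration only: the per-cluster probabilities and labels below are synthetic placeholders, and the rule of picking the highest threshold that keeps specificity at 90% simply mirrors the reported 0.9 operating point rather than reproducing the study's procedure.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Hypothetical per-cluster classifier outputs (probabilities) and labels:
# 1 = cluster overlaps the manually labelled FCD lesion, 0 = spurious cluster.
rng = np.random.default_rng(0)
probs = np.concatenate([rng.beta(8, 2, 60), rng.beta(2, 6, 150)])
labels = np.concatenate([np.ones(60), np.zeros(150)])

fpr, tpr, thresholds = roc_curve(labels, probs)
print("AUC:", auc(fpr, tpr))

# One simple rule: the threshold giving the best sensitivity among operating
# points whose specificity is at least 90%.
ok = (1 - fpr) >= 0.90
best = thresholds[ok][np.argmax(tpr[ok])]
print("chosen threshold:", best, "sensitivity:", tpr[ok].max())
```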
Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals
NASA Astrophysics Data System (ADS)
Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam
A power analysis attack is a well-known side-channel attack, but its efficiency is frequently degraded by power components unrelated to the encryption that are included in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts from the measured power signals. Experimental results show that attacks using the preprocessed signals recover the correct keys with far fewer signals than conventional power analysis attacks.
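The following is a crude stand-in for the preprocessing idea, not the authors' method: it simply windows each trace around the region where the traces vary most across encryptions, on the assumption that data-dependent leakage dominates there. The trace array, window width and placeholder data are all assumptions.

```python
import numpy as np

def extract_encryption_window(traces, width):
    """Keep the window where the traces vary most across encryptions
    (a simple proxy for 'encryption-related' signal content)."""
    var = traces.var(axis=0)                          # per-sample variance across traces
    score = np.convolve(var, np.ones(width), "valid") # sliding-window variance sum
    start = int(score.argmax())
    return traces[:, start:start + width], start

# traces: (n_encryptions, n_samples) array of measured power consumption
traces = np.random.randn(200, 5000)                   # placeholder data
windowed, offset = extract_encryption_window(traces, width=512)
print(windowed.shape, offset)
```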
Method and system for detecting an explosive
Reber, Edward L.; Rohde, Kenneth W.; Blackwood, Larry G.
2010-12-07
A method and system for detecting at least one explosive in a vehicle using a neutron generator and a plurality of NaI detectors. Spectra read from the detectors are calibrated by performing Gaussian peak fitting to define peak regions, locating a Na peak and an annihilation peak doublet, assigning a predetermined energy level to one peak in the doublet, and predicting a hydrogen peak location based on the location of at least one peak of the doublet. The spectra are gain-shifted to a common calibration, summed for respective groups of NaI detectors, and nitrogen detection analysis is performed on the summed spectra for each group.
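A minimal sketch of the Gaussian peak-fitting step on a synthetic NaI spectrum is shown below (Python/SciPy). The channel numbers, peak position and window width are assumed values for illustration; once peak centroids are known, a linear gain shift to a common calibration can be derived from them.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, baseline):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + baseline

def fit_peak(channels, counts, guess_channel, window=30):
    """Fit a single Gaussian to a peak region of a spectrum.
    Returns (centroid, FWHM) in channel units."""
    sel = (channels > guess_channel - window) & (channels < guess_channel + window)
    x, y = channels[sel], counts[sel]
    p0 = [y.max() - y.min(), guess_channel, window / 4, y.min()]
    popt, _ = curve_fit(gaussian, x, y, p0=p0)
    return popt[1], 2.355 * abs(popt[2])

# Synthetic peak near channel 350 (e.g. a candidate annihilation line)
ch = np.arange(1024, dtype=float)
spec = 50 + gaussian(ch, 400, 350, 6, 0) + np.random.poisson(5, 1024)
print(fit_peak(ch, spec, guess_channel=352))
```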
Treglia, Giorgio; Bertagna, Francesco; Sadeghi, Ramin; Muoio, Barbara; Giovanella, Luca
2015-12-01
This study aimed to perform a meta-analysis on the prevalence and risk of malignancy of focal parotid incidental uptake (FPIU) detected by hybrid fluorine-18-fluorodeoxyglucose ((18)F-FDG) positron emission tomography/computed tomography (PET/CT) or (18)F-FDG PET alone. A comprehensive literature search of studies published up to July 2014 was performed. Records reporting at least 5 FPIUs were selected. Pooled prevalence and malignancy risk of FPIU were calculated, including 95% confidence intervals (95% CI). Twelve records were selected for our meta-analysis. Pooled prevalence of FPIU detected by (18)F-FDG PET or PET/CT was 0.6% (95% CI 0.4-0.7%), pooling data from 220 patients with FPIU. Overall, 181 FPIUs underwent further evaluation and 165 FPIUs were pathologically proven. Pooled risk of malignancy was 9.6% (95% CI 5.4-14.8%), 10.9% (95% CI 5.8-17.3%) and 20.4% (95% CI 12.3-30%), considering all FPIUs detected, only those which underwent further evaluation, and only those pathologically proven, respectively. Selection bias in the included studies, the heterogeneity among studies and publication bias are limitations of our meta-analysis. Overall, FPIUs are observed in about 1% of (18)F-FDG PET or PET/CT scans and are benign in most cases. Nevertheless, further evaluation is needed whenever FPIUs are detected by (18)F-FDG PET or PET/CT to exclude malignant lesions or lesions with possible malignant degeneration. Prospective studies are needed to confirm the findings reported by our meta-analysis.
Jin, H; Yuan, L; Li, C; Kan, Y; Hao, R; Yang, J
2014-03-01
The purpose of this study was to systematically review and perform a meta-analysis of published data regarding the diagnostic performance of positron emission tomography (PET) or PET/computed tomography (PET/CT) in prosthetic infection after arthroplasty. A comprehensive computer literature search of studies published through May 31, 2012 regarding PET or PET/CT in patients with suspected prosthetic infection was performed in the PubMed/MEDLINE, Embase and Scopus databases. Pooled sensitivity and specificity of PET or PET/CT in patients with suspected prosthetic infection were calculated on a per-prosthesis basis. The area under the receiver-operating characteristic (ROC) curve was calculated to measure the accuracy of PET or PET/CT in patients with suspected prosthetic infection. Fourteen studies comprising 838 prostheses with suspected infection after arthroplasty were included in this meta-analysis. The pooled sensitivity of PET or PET/CT in detecting prosthetic infection was 86% (95% confidence interval [CI] 82-90%) on a per-prosthesis basis. The pooled specificity of PET or PET/CT in detecting prosthetic infection was 86% (95% CI 83-89%) on a per-prosthesis basis. The area under the ROC curve was 0.93 on a per-prosthesis basis. In patients with suspected prosthetic infection, FDG PET or PET/CT demonstrated high sensitivity and specificity. FDG PET and PET/CT are accurate methods in this setting. Nevertheless, possible sources of false positive results and influencing factors should be kept in mind.
A fast automatic target detection method for detecting ships in infrared scenes
NASA Astrophysics Data System (ADS)
Özertem, Kemal Arda
2016-05-01
Automatic target detection in infrared scenes is a vital task for many application areas such as defense, security and border surveillance. For anti-ship missiles, having a fast and robust ship detection algorithm is crucial for overall system performance. In this paper, a straightforward yet effective ship detection method for infrared scenes is introduced. First, morphological grayscale reconstruction is applied to the input image, followed by automatic thresholding of the suppressed image. For the segmentation step, connected component analysis is employed to obtain target candidate regions. At this point, the detection is vulnerable to outliers such as small objects with relatively high intensity values or clouds. To deal with this drawback, a post-processing stage is introduced, consisting of two methods. First, noisy detection results are rejected with respect to target size. Second, the waterline is detected using the Hough transform, and detection results located above the waterline, with a small margin, are rejected. After the post-processing stage, undesired holes may still remain, which can cause one object to be detected as multiple objects or prevent an object from being detected as a whole. To improve the detection performance, another automatic thresholding is applied only to the target candidate regions. Finally, the two detection results are fused and the post-processing stage is repeated to obtain the final detection result. The performance of the overall methodology is tested with real-world infrared test data.
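A rough sketch of the first stages of such a pipeline (background suppression by grayscale reconstruction, automatic thresholding, connected-component analysis) is given below using scikit-image. The h parameter, minimum area and synthetic frame are assumptions, not values from the paper, and the waterline/Hough post-processing is omitted.

```python
import numpy as np
from skimage import filters, measure, morphology

def detect_ship_candidates(ir_image, h=0.15, min_area=20):
    """Suppress slowly varying background with reconstruction by dilation,
    threshold the residual automatically, and return candidate regions."""
    img = ir_image.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)
    seed = np.clip(img - h, 0, None)                      # h-dome style seed
    background = morphology.reconstruction(seed, img, method="dilation")
    residual = img - background                           # bright small structures remain
    mask = residual > filters.threshold_otsu(residual)    # automatic thresholding
    labels = measure.label(mask)                          # connected components
    regions = [r for r in measure.regionprops(labels) if r.area >= min_area]
    return [(r.bbox, r.area) for r in regions]

# Tiny synthetic example: one bright target on a dark sea background
frame = np.zeros((64, 64)); frame[30:34, 20:30] = 1.0
print(detect_ship_candidates(frame))
```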
Liquid Chromatography-Mass Spectrometry Interface for Detection of Extraterrestrial Organics
NASA Technical Reports Server (NTRS)
Southard, Adrian E.; Getty, Stephanie A.; Balvin, Manuel; Cook, Jamie E.; Espiritu, Ana Mellina; Kotecki, Carl; Towner, Deborah W.; Dworkin, J. P.; Glavin, Daniel P.; Mahaffy, Paul R.;
2014-01-01
The OASIS (Organics Analyzer for Sampling Icy surfaces) microchip enables electrospray or thermospray of analyte for subsequent analysis by the OASIS time-of-flight mass spectrometer. Electrospray of buffer solution containing the nucleobase adenine was performed using the microchip and detected by a commercial time-of-flight mass spectrometer. Future testing of thermospray and electrospray capability will be performed using a test fixture and vacuum chamber developed especially for optimization of ion spray at atmosphere and in low pressure environments.
2009-12-01
…with 32-chip baseband waveforms such as Walsh functions. Performance with both coherent and noncoherent detection is analyzed. For noncoherent detection, only one five-bit symbol is transmitted on the I and Q components of the carrier per symbol duration, so the data throughput for noncoherent […] for coherent and noncoherent demodulation, respectively, when Pb = 10^-5. Likewise, in an AWGN-only environment with a diversity of two, the proposed […]
Evaluation of asbestos-containing products and released fibers in home appliances.
Hwang, Sung Ho; Park, Wha Me
2016-09-01
The purpose of this study was to detect asbestos-containing products and released asbestos fibers from home appliances. The authors investigated a total of 414 appliances manufactured between 1986 and 2007. Appliances were divided into three categories: large-sized electric appliances, small-sized electric appliances, and household items. Analysis for asbestos-containing material (ACM) was performed using polarized light microscopy (PLM) and stereoscopic microscopy. Air sampling was performed to measure airborne concentrations of asbestos using a phase-contrast microscope (PCM). The results of the ACM analysis show that large-sized electric appliances (refrigerators, washing machines, kimchi-refrigerators) and household items (bicycles, motorcycles, gas boilers) contain asbestos material, whereas small-sized electric appliances do not. All appliances with detected asbestos material showed typical characteristics of chrysotile (7-50%) and tremolite (7-10%). No released ACM fibers were detected from the tested appliances while they were operating. This study provides basic information on the asbestos risk to people who use home appliances.
Sweet-spot training for early esophageal cancer detection
NASA Astrophysics Data System (ADS)
van der Sommen, Fons; Zinger, Svitlana; Schoon, Erik J.; de With, Peter H. N.
2016-03-01
Over the past decade, the imaging tools for endoscopists have improved drastically. This has enabled physicians to visually inspect the intestinal tissue for early signs of malignant lesions. Besides this, recent studies show the feasibility of supportive image analysis for endoscopists, but the analysis problem is typically approached as a segmentation task where binary ground truth is employed. In this study, we show that the detection of early cancerous tissue in the gastrointestinal tract cannot be approached as a binary segmentation problem and that it is crucial and clinically relevant to involve multiple experts for annotating early lesions. By employing the so-called sweet spot for training purposes as a metric, a much better detection performance can be achieved. Furthermore, a multi-expert-based ground truth, i.e., a golden standard, enables an improved validation of the resulting delineations. For this purpose, besides the sweet spot we also propose another novel metric, the Jaccard Golden Standard (JIGS), which can handle multiple ground-truth annotations. Our experiments involving these new metrics and based on the golden standard show that the performance of a detection algorithm for early neoplastic lesions in Barrett's esophagus can be increased significantly, demonstrating a 10 percentage point increase in the resulting F1 detection score.
NDE detectability of fatigue type cracks in high strength alloys
NASA Technical Reports Server (NTRS)
Christner, B. K.; Rummel, W. D.
1983-01-01
Specimens suitable for investigating the reliability of production nondestructive evaluation (NDE) in detecting tightly closed fatigue cracks in high strength alloys representative of those materials used in spacecraft engine/booster construction were produced. Inconel 718 was selected as representative of nickel-base alloys and Haynes 188 as representative of cobalt-base alloys used in this application. Cleaning procedures were developed to ensure the reusability of the test specimens, and a flaw detection reliability assessment of the fluorescent penetrant inspection method was performed using the test specimens produced, both to characterize their use for future reliability assessments and to provide additional NDE flaw detection reliability data for high strength alloys. Statistical analysis of the fluorescent penetrant inspection data was performed to determine the detection reliabilities for each inspection at a 90% probability/95% confidence level.
Method for predicting peptide detection in mass spectrometry
Kangas, Lars [West Richland, WA; Smith, Richard D [Richland, WA; Petritis, Konstantinos [Richland, WA
2010-07-13
A method of predicting whether a peptide present in a biological sample will be detected by analysis with a mass spectrometer. The method uses at least one mass spectrometer to perform repeated analysis of a sample containing peptides from proteins with known amino acid sequences. The method then generates a data set of peptides identified as contained within the sample by the repeated analysis. The method then calculates the probability that a specific peptide in the data set was detected in the repeated analysis. The method then creates a plurality of vectors, where each vector has a plurality of dimensions, and each dimension represents a property of one or more of the amino acids present in each peptide and adjacent peptides in the data set. Using these vectors, the method then generates an algorithm from the plurality of vectors and the calculated probabilities that specific peptides in the data set were detected in the repeated analysis. The algorithm is thus capable of calculating the probability that a hypothetical peptide, represented as a vector, will be detected by a mass spectrometry-based proteomic platform, given that the peptide is present in a sample introduced into the mass spectrometer.
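To make the vector-plus-classifier idea concrete, here is a minimal, hedged sketch: peptides are mapped to fixed-length vectors of simple residue properties and a small neural network is trained on toy detected/not-detected labels. The property table, feature choices, peptides and labels are all illustrative assumptions and much simpler than the per-residue encoding the method describes.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Illustrative residue property table (hydrophobicity, approx. residue mass)
AA_PROPS = {
    "A": (1.8, 71.1), "R": (-4.5, 156.2), "N": (-3.5, 114.1), "D": (-3.5, 115.1),
    "C": (2.5, 103.1), "E": (-3.5, 129.1), "Q": (-3.5, 128.1), "G": (-0.4, 57.1),
    "H": (-3.2, 137.1), "I": (4.5, 113.2), "L": (3.8, 113.2), "K": (-3.9, 128.2),
    "M": (1.9, 131.2), "F": (2.8, 147.2), "P": (-1.6, 97.1), "S": (-0.8, 87.1),
    "T": (-0.7, 101.1), "W": (-0.9, 186.2), "Y": (-1.3, 163.2), "V": (4.2, 99.1),
}

def peptide_vector(seq):
    """Length plus mean and sum of each residue property (a simplification)."""
    props = np.array([AA_PROPS[a] for a in seq])
    return np.concatenate([[len(seq)], props.mean(axis=0), props.sum(axis=0)])

# Toy labels: observed (1) / not observed (0) in repeated LC-MS/MS runs
peptides = ["ACDEFGHIK", "LMNPQR", "STVWYA", "GGSSAAK", "WWFFYY", "KKRRHH"]
labels = [1, 0, 1, 1, 0, 0]
X = np.array([peptide_vector(p) for p in peptides])
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, labels)
print(clf.predict_proba([peptide_vector("ACDLLK")])[0, 1])  # predicted detection probability
```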
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Henderson, Sammy W.; Frehlich, R. G.
1991-01-01
The performance and calibration of a focused, continuous-wave, coherent-detection CO2 lidar operated for the measurement of the atmospheric backscatter coefficient, B(m), were examined. This instrument functions by transmitting infrared (10 micron) light into the atmosphere and collecting the light which is scattered in the rearward direction. Two distinct modes of operation were considered. In volume mode (VM), the scattered light energy from many aerosols is detected simultaneously, whereas in single particle mode (SPM), the scattered light energy from a single aerosol is detected. The analysis considered possible sources of error for each of these two cases, and also the conditions under which each technique would have superior performance. The analysis showed that, within reasonable assumptions, the value of B(m) could be accurately measured by either the VM or the SPM method. The understanding of the theory developed during the analysis was also applied to a pulsed CO2 lidar. Preliminary results of field testing of a solid-state 2 micron lidar using a CW oscillator are included.
Dabbah, M A; Graham, J; Petropoulos, I N; Tavakoli, M; Malik, R A
2011-10-01
Diabetic peripheral neuropathy (DPN) is one of the most common long term complications of diabetes. Corneal confocal microscopy (CCM) image analysis is a novel non-invasive technique which quantifies corneal nerve fibre damage and enables diagnosis of DPN. This paper presents an automatic analysis and classification system for detecting nerve fibres in CCM images based on a multi-scale adaptive dual-model detection algorithm. The algorithm exploits the curvilinear structure of the nerve fibres and adapts itself to the local image information. Detected nerve fibres are then quantified and used as feature vectors for classification using random forest (RF) and neural networks (NNT) classifiers. We show, in a comparative study with other well known curvilinear detectors, that the best performance is achieved by the multi-scale dual model in conjunction with the NNT classifier. An evaluation of clinical effectiveness shows that the performance of the automated system matches that of ground-truth defined by expert manual annotation. Copyright © 2011 Elsevier B.V. All rights reserved.
Enhancing detection of steady-state visual evoked potentials using individual training data.
Wang, Yijun; Nakanishi, Masaki; Wang, Yu-Te; Jung, Tzyy-Ping
2014-01-01
Although the performance of steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) has improved gradually over the past decades, it still does not meet the requirement of a high communication speed in many applications. A major challenge is the interference of spontaneous background EEG activities in discriminating SSVEPs. An SSVEP BCI using frequency coding typically does not have a calibration procedure, since the frequency of SSVEPs can be recognized by power spectral density analysis (PSDA). However, the detection rate can be degraded by spontaneous EEG activities within the same frequency range, because phase information of SSVEPs is ignored in frequency detection. To address this problem, this study proposed to incorporate individual SSVEP training data into canonical correlation analysis (CCA) to improve the frequency detection of SSVEPs. An eight-class SSVEP dataset recorded from 10 subjects in a simulated online BCI experiment was used for performance evaluation. Compared to the standard CCA method, the proposed method obtained significantly improved detection accuracy (95.2% vs. 88.4%, p<0.05) and information transfer rate (ITR) (104.6 bits/min vs. 89.1 bits/min, p<0.05). The results suggest that the use of individual SSVEP training data can significantly improve the detection rate and thereby facilitate the implementation of a high-speed BCI.
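For reference, the standard CCA baseline mentioned above can be sketched as follows (Python/scikit-learn): the canonical correlation between the multi-channel EEG segment and sine/cosine references is computed for each candidate frequency, and the frequency with the highest correlation wins. The sampling rate, frequency set and synthetic EEG are assumptions, and the paper's extension using individual training data is not shown.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_ssvep_score(eeg, stim_freq, fs, n_harmonics=2):
    """Canonical correlation between EEG (n_samples x n_channels) and
    sine/cosine references at a stimulation frequency and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs += [np.sin(2 * np.pi * h * stim_freq * t),
                 np.cos(2 * np.pi * h * stim_freq * t)]
    Y = np.column_stack(refs)
    cca = CCA(n_components=1).fit(eeg, Y)
    u, v = cca.transform(eeg, Y)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

# Classify a 1-s synthetic segment among candidate stimulation frequencies
fs, freqs = 250, [8.0, 10.0, 12.0, 15.0]
t = np.arange(fs) / fs
eeg = np.column_stack([np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(fs)
                       for _ in range(8)])
scores = [cca_ssvep_score(eeg, f, fs) for f in freqs]
print(freqs[int(np.argmax(scores))])   # expected: 10.0
```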
Zhao, Ying-yong; Cheng, Xian-long; Zhang, Yongmin; Zhao, Ye; Lin, Rui-chao; Sun, Wen-ji
2010-02-01
Polyporus umbellatus is a widely used diuretic herbal medicine. In this study, a high-performance liquid chromatography coupled with atmospheric pressure chemical ionization-mass spectrometric detection (HPLC-APCI-MS) method was developed for qualitative and quantitative analysis of steroids, as well as for the quality control of Polyporus umbellatus. The selectivity, reproducibility and sensitivity were compared with HPLC with photodiode array detection and evaporative light scattering detection (ELSD). Selective ion monitoring in positive mode was used for qualitative and quantitative analysis of eight major components, and beta-ecdysterone was used as the internal standard. Limits of detection and quantification fell in the ranges 7-21 and 18-63 ng/mL for the eight analytes with an injection of 10 microL samples, and all calibration curves showed good linear regression (r(2) > 0.9919) within the test range. The quantitative results demonstrated that samples from different localities showed different qualities. Advantages, in comparison with conventional HPLC-diode array detection and HPLC-ELSD, are that reliable identification of target compounds could be achieved by accurate mass measurements along with characteristic retention times, and the great enhancement in selectivity and sensitivity allows identification and quantification of low levels of constituents in complex Polyporus umbellatus matrices. (c) 2009 John Wiley & Sons, Ltd.
Al-Nawashi, Malek; Al-Hazaimeh, Obaida M; Saraee, Mohamad
2017-01-01
Abnormal activity detection plays a crucial role in surveillance applications, and a surveillance system that can perform robustly in an academic environment has become an urgent need. In this paper, we propose a novel framework for an automatic real-time video-based surveillance system which can simultaneously perform tracking, semantic scene learning, and abnormality detection in an academic environment. To develop our system, we divided the work into three phases: a preprocessing phase, an abnormal human activity detection phase, and a content-based image retrieval phase. For moving-object detection, we used the temporal-differencing algorithm and then located the motion regions using a Gaussian function. Furthermore, a shape model based on the OMEGA equation was used as a filter for the detected objects (i.e., human and non-human). For object activity analysis, we evaluated and analyzed the human activities of the detected objects. We classified the human activities into two groups, normal and abnormal activities, using a support vector machine, which then provides an automatic warning in case of abnormal human activities. The system also embeds a method to retrieve the detected object from the database for object recognition and identification using content-based image retrieval. Finally, a software-based simulation using MATLAB was performed, and the results of the conducted experiments showed an excellent surveillance system that can simultaneously perform tracking, semantic scene learning, and abnormality detection in an academic environment with no human intervention.
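The temporal-differencing step for motion detection can be illustrated with a short OpenCV sketch (the original work used MATLAB; the kernel size, threshold and minimum area below are assumed values, and the OMEGA shape filter and SVM stages are not shown).

```python
import cv2
import numpy as np

def motion_mask(prev_gray, curr_gray, blur_ksize=11, thresh=25):
    """Temporal differencing with Gaussian smoothing, then connected
    components to obtain candidate moving-object regions."""
    diff = cv2.absdiff(curr_gray, prev_gray)
    diff = cv2.GaussianBlur(diff, (blur_ksize, blur_ksize), 0)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    # stats columns: x, y, width, height, area; drop tiny noise blobs
    boxes = [tuple(stats[i, :4]) for i in range(1, n) if stats[i, 4] > 100]
    return mask, boxes

# Usage with two consecutive grayscale frames from cv2.VideoCapture (hypothetical):
# mask, boxes = motion_mask(cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY),
#                           cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY))
```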
Yousefifard, Mahmoud; Baikpour, Masoud; Ghelichkhani, Parisa; Asady, Hadi; Shahsavari Nia, Kavous; Moghadas Jafari, Ali; Hosseini, Mostafa; Safari, Saeed
2016-01-01
The role of ultrasonography in detection of pleural effusion has long been a subject of interest, but controversial results have been reported. Accordingly, this study aims to conduct a systematic review of the available literature on the diagnostic value of ultrasonography and radiography in detection of pleural effusion through a meta-analytic approach. An extended search was done in the Medline, EMBASE, ISI Web of Knowledge, Scopus, Cochrane Library, and ProQuest databases. Two reviewers independently extracted the data and assessed the quality of the articles. Meta-analysis was performed using a mixed-effects binary regression model. Finally, subgroup analysis was carried out in order to find the sources of heterogeneity between the included studies. 12 studies were included in this meta-analysis (1554 subjects, 58.6% male). Pooled sensitivity of ultrasonography in detection of pleural effusion was 0.94 (95% CI: 0.88-0.97; I2 = 84.23, p<0.001) and its pooled specificity was calculated to be 0.98 (95% CI: 0.92-1.0; I2 = 88.65, p<0.001), while sensitivity and specificity of chest radiography were 0.51 (95% CI: 0.33-0.68; I2 = 91.76, p<0.001) and 0.91 (95% CI: 0.68-0.98; I2 = 92.86, p<0.001), respectively. Sensitivity of ultrasonography was found to be higher when the procedure was carried out by an intensivist or a radiologist using 5-10 MHz transducers. Chest ultrasonography, as a screening tool, has a higher diagnostic accuracy in identification of pleural effusion compared to radiography. The sensitivity of this imaging modality was found to be higher when performed by a radiologist or an intensivist and using 5-10 MHz probes.
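For readers unfamiliar with pooling, the sketch below shows a simple DerSimonian-Laird random-effects pooling of logit-transformed per-study sensitivities. This is a simpler model than the mixed-effects binary regression used in the abstract, and the per-study counts are hypothetical.

```python
import numpy as np

def pooled_proportion(events, totals):
    """DerSimonian-Laird random-effects pooling of logit-transformed proportions."""
    events = np.asarray(events, float)
    totals = np.asarray(totals, float)
    p = (events + 0.5) / (totals + 1.0)                      # continuity-corrected
    y = np.log(p / (1 - p))                                  # logit
    v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)     # within-study variance
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    inv = lambda x: 1 / (1 + np.exp(-x))
    return inv(y_re), (inv(y_re - 1.96 * se), inv(y_re + 1.96 * se))

# Hypothetical per-study true positives and diseased counts for ultrasonography
tp = [45, 30, 88, 19, 52]
n_diseased = [48, 33, 95, 20, 57]
print(pooled_proportion(tp, n_diseased))   # pooled sensitivity and 95% CI
```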
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other implementing a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Sequence information gain based motif analysis.
Maynou, Joan; Pairó, Erola; Marco, Santiago; Perera, Alexandre
2015-11-09
The detection of regulatory regions in candidate sequences is essential for the understanding of the regulation of a particular gene and the mechanisms involved. This paper proposes a novel methodology based on information-theoretic metrics for finding regulatory sequences in promoter regions. This methodology (SIGMA) has been tested on genomic sequence data for Homo sapiens and Mus musculus. SIGMA has been compared with different publicly available alternatives for motif detection, such as MEME/MAST, Biostrings (Bioconductor package), MotifRegressor, and previous work such as Qresiduals projections or information-theoretic based detectors. Comparative results, in the form of receiver operating characteristic curves, show how, in 70% of the studied transcription factor binding sites, the SIGMA detector performs better and behaves more robustly than the methods compared, while having a similar computational time. The performance of SIGMA can be explained by its parametric simplicity in the modelling of the non-linear co-variability in the binding motif positions. Sequence Information Gain based Motif Analysis is a generalisation of a non-linear model of cis-regulatory sequence detection based on information theory. This generalisation allows us to detect transcription factor binding sites with maximum performance regardless of the covariability observed in the positions of the training set of sequences. SIGMA is freely available to the public at http://b2slab.upc.edu.
Yoshizawa, Hidenori; Motooka, Daisuke; Matsumoto, Yuki; Katada, Ryuichi; Nakamura, Shota; Morii, Eiichi; Iida, Tetsuya; Matsumoto, Hiroshi
2018-05-01
Post-mortem detection of pathogenic microorganisms in deaths from severe infection is highly important for diagnosing the cause of death as well as for public health. However, it is difficult to recognize whether a microorganism detected in post-mortem materials is truly pathogenic or not. We report a case of severe soft tissue infection due to Streptococcus oralis subsp. tigurinus (S. tigurinus), a recently reported species, in which whole-genome analysis was performed to clarify its pathogenicity. A 46-year-old woman had died with symptoms of a severe infectious disease. A post-mortem examination was performed by a medical examiner. The external findings suggested a soft tissue infection; subsequently, pathological specimens sampled by necropsy revealed findings compatible with necrotizing fasciitis. In the post-mortem bacterial test, S. tigurinus was detected in the localized autopsy sample. Whole-genome sequencing was performed to analyze its pathogenicity, and gene annotation detected a strain of S. tigurinus with genetic determinants that are specific and unique to its highly virulent strains. Utilizing technologies such as whole-genome sequencing may be a powerful tool for diagnosing the cause of infectious death accurately and safely. © 2018 Japanese Society of Pathology and John Wiley & Sons Australia, Ltd.
NASA Astrophysics Data System (ADS)
Chen, Hai-Wen; McGurr, Mike; Brickhouse, Mark
2015-11-01
We present a newly developed feature transformation (FT) detection method for hyperspectral imagery (HSI) sensors. In essence, the FT method, by transforming the original features (spectral bands) to a different feature domain, may considerably increase the statistical separation between the target and background probability density functions, and thus may significantly improve target detection and identification performance, as evidenced by the test results in this paper. We show that by differentiating the original spectra, one can completely separate targets from the background using a single spectral band, leading to perfect detection results. In addition, we have proposed an automated best spectral band selection process with a double-threshold scheme that can rank the available spectral bands from the best to the worst for target detection. Finally, we have also proposed an automated cross-spectrum fusion process to further improve the detection performance in the lower spectral range (<1000 nm) by selecting the best spectral band pair with multivariate analysis. Promising detection performance has been achieved using a small background material signature library for proof of concept, and has then been further evaluated and verified using a real background HSI scene collected by a HYDICE sensor.
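A minimal sketch of the "differentiate, then rank bands" idea is shown below; it uses a simple Fisher-like separation score as a stand-in for the paper's double-threshold selection scheme, and all data and scoring details are assumptions.

```python
import numpy as np

def derivative_band_ranking(target_spectra, background_spectra):
    """Differentiate spectra along the wavelength axis and rank the
    differentiated bands by a two-class separation score."""
    dt = np.diff(target_spectra, axis=1)       # (n_target, n_bands-1)
    db = np.diff(background_spectra, axis=1)   # (n_background, n_bands-1)
    score = (dt.mean(0) - db.mean(0)) ** 2 / (dt.var(0) + db.var(0) + 1e-12)
    return np.argsort(score)[::-1]             # best differentiated-band indices first

# Toy example: 100-band spectra where the target has an extra absorption edge
bg = np.random.randn(50, 100) * 0.01 + 0.3
tg = bg[:20].copy()
tg[:, 60:] += 0.2                              # step -> large derivative near band 60
print(derivative_band_ranking(tg, bg)[:5])
```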
Lee, Sangyeop; Choi, Junghyun; Chen, Lingxin; Park, Byungchoon; Kyong, Jin Burm; Seong, Gi Hun; Choo, Jaebum; Lee, Yeonjung; Shin, Kyung-Hoon; Lee, Eun Kyu; Joo, Sang-Woo; Lee, Kyeong-Hee
2007-05-08
A rapid and highly sensitive trace analysis technique for determining malachite green (MG) in a polydimethylsiloxane (PDMS) microfluidic sensor was investigated using surface-enhanced Raman spectroscopy (SERS). A zigzag-shaped PDMS microfluidic channel was fabricated for efficient mixing between MG analytes and aggregated silver colloids. Under the optimal flow velocity, MG molecules were effectively adsorbed onto silver nanoparticles while flowing along the upper and lower zigzag-shaped PDMS channel. A quantitative analysis of MG was performed based on the measured peak height at 1615 cm(-1) in its SERS spectrum. The limit of detection, using the SERS microfluidic sensor, was found to be below the 1-2 ppb level, and this low detection limit is comparable to that of the LC-MS detection method. In the present study, we introduce a new conceptual detection technology, using a SERS microfluidic sensor, for the highly sensitive trace analysis of MG in water.
Detection of traffic incidents using nonlinear time series analysis
NASA Astrophysics Data System (ADS)
Fragkou, A. D.; Karakasidis, T. E.; Nathanail, E.
2018-06-01
In this study, we present results of the application of nonlinear time series analysis to traffic data for incident detection. More specifically, we analyze daily volume records of Attica Tollway (Greece) collected from sensors at various locations. The analysis was performed using the Recurrence Plot (RP) and Recurrence Quantification Analysis (RQA) methods on the volume data of the lane closest to the median. The results show that it is possible to identify, through the abrupt change in the dynamics of the system revealed by RPs and RQA, the occurrence of incidents on the freeway and to differentiate them from recurrent traffic congestion. The proposed methodology could be of interest for big-data traffic analysis.
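The core RP/RQA computation is compact enough to sketch directly (Python/NumPy). The embedding dimension, delay, recurrence threshold, window size and toy "regime change" series below are assumptions for illustration, not the study's settings.

```python
import numpy as np

def recurrence_matrix(series, dim=3, delay=1, eps=None):
    """Recurrence plot of a scalar series via time-delay embedding;
    eps defaults to 10% of the maximum phase-space distance."""
    n = len(series) - (dim - 1) * delay
    emb = np.column_stack([series[i * delay:i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dist.max()
    return (dist <= eps).astype(int)

def recurrence_rate(R):
    """Simplest RQA measure: fraction of recurrent points."""
    return R.mean()

# Sliding-window recurrence rate over a toy volume series with a regime change;
# an abrupt jump in the rate is the kind of signature used to flag an incident.
volume = np.concatenate([np.sin(np.linspace(0, 20, 300)) + 0.1 * np.random.randn(300),
                         0.2 * np.random.randn(120)])
rates = [recurrence_rate(recurrence_matrix(volume[i:i + 60])) for i in range(0, 360, 30)]
print(np.round(rates, 2))
```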
Ratiometric analysis of in vivo retinal layer thicknesses in multiple sclerosis
NASA Astrophysics Data System (ADS)
Bhaduri, Basanta; Nolan, Ryan M.; Shelton, Ryan L.; Pilutti, Lara A.; Motl, Robert W.; Boppart, Stephen A.
2016-09-01
We performed ratiometric analysis of retinal optical coherence tomography images for the first time in multiple sclerosis (MS) patients. The ratiometric analysis identified differences in several retinal layer thickness ratios in the cohort of MS subjects without a history of optic neuritis (ON) compared to healthy control (HC) subjects, and there was no difference in standard retinal nerve fiber layer thickness (RNFLT). The difference in such ratios between HC subjects and those with mild MS-disability, without a difference in RNFLT, further suggests the possibility of using layer ratiometric analysis for detecting early retinal changes in MS. Ratiometric analysis may be useful and potentially more sensitive for detecting disease changes in MS.
Booth, Marsilea Adela; Vogel, Robert; Curran, James M; Harbison, SallyAnn; Travas-Sejdic, Jadranka
2013-07-15
Despite the plethora of DNA sensor platforms available, a portable, sensitive, selective and economic sensor able to rival current fluorescence-based techniques would find use in many applications. In this research, probe oligonucleotide-grafted particles are used to detect target DNA in solution through a resistive pulse nanopore detection technique. Using carbodiimide chemistry, functionalized probe DNA strands are attached to carboxylated dextran-based magnetic particles. Subsequent incubation with complementary target DNA yields a change in surface properties as the two DNA strands hybridize. Particle-by-particle analysis with resistive pulse sensing is performed to detect these changes. A variable pressure method allows identification of changes in the surface charge of particles. As proof-of-principle, we demonstrate that target hybridization is selectively detected at micromolar concentrations (nanomoles of target) using resistive pulse sensing, confirmed by fluorescence and phase analysis light scattering as complementary techniques. The advantages, feasibility and limitations of using resistive pulse sensing for sample analysis are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingram, J.C.; Groenewold, G.S.; Appelhans, A.D.
1997-02-01
Direct surface analyses by static secondary ion mass spectrometry (SIMS) were performed for the following pesticides adsorbed on dandelion leaves, grass, soil, and stainless steel samples: alachlor, atrazine, captan, carbofuran, chlorpyrifos, chlorosulfuron, chlorthal-dimethyl, cypermethrin, 2,4-D, diuron, glyphosate, malathion, methomyl, methyl arsonic acid, mocap, norflurazon, oxyfluorfen, paraquat, temik, and trifluralin. The purpose of this study was to evaluate static SIMS as a tool for pesticide analysis, principally for use in screening samples for pesticides. The advantage of direct surface analysis compared with conventional pesticide analysis methods is the elimination of sample pretreatment including extraction, which streamlines the analysis substantially; total analysis time for SIMS analysis was ca. 10 min/sample. Detection of 16 of the 20 pesticides on all four substrates was achieved. Of the remaining four pesticides, only one (trifluralin) was not detected on any of the samples. The minimum detectable quantity was determined for paraquat on soil in order to evaluate the efficacy of using SIMS as a screening tool. Paraquat was detected at 3 pg/mm² (ca. 0.005 monolayers). The results of these studies suggest that SIMS is capable of direct surface detection of a range of pesticides, with low-volatility, polar pesticides being the most easily detected. 25 refs., 2 figs., 2 tabs.
NASA Astrophysics Data System (ADS)
Liu, Robin H.; Longiaru, Mathew
2009-05-01
DNA microarrays are becoming a widespread tool used in life science and drug screening due to their many benefits of miniaturization and integration. Microarrays permit highly multiplexed DNA analysis. Recently, the development of new detection methods and simplified methodologies has rapidly expanded the use of microarray technologies from predominantly gene expression analysis into the arena of diagnostics. Osmetech's eSensor® is an electrochemical detection platform based on a low-to-medium density DNA hybridization array on a cost-effective printed circuit board substrate. eSensor® has been cleared by the FDA for Warfarin sensitivity testing and Cystic Fibrosis Carrier Detection. Other genetic-based diagnostic and infectious disease detection tests are under development. The eSensor® platform eliminates the need for an expensive laser-based optical system and fluorescent reagents. It allows one to perform hybridization and detection in a single, small instrument without any fluidic processing and handling. Furthermore, the eSensor® platform is readily adaptable to on-chip sample-to-answer genetic analyses using microfluidics technology. The eSensor® platform provides a cost-effective solution for direct sample-to-answer genetic analysis and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.
Analysis of digital communication signals and extraction of parameters
NASA Astrophysics Data System (ADS)
Al-Jowder, Anwar
1994-12-01
The signal classification performance of four types of electronics support measure (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, where one of these is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNR's).
Biffi, E.; Ghezzi, D.; Pedrocchi, A.; Ferrigno, G.
2010-01-01
Neurons cultured in vitro on MicroElectrode Array (MEA) devices connect to each other, forming a network. To study electrophysiological activity and long-term plasticity effects, long-period recording and spike-sorting methods are needed. Therefore, on-line, real-time analysis, optimization of memory use and improvement of the data transmission rate become necessary. We developed an algorithm for amplitude-threshold spike detection, whose performance was verified with (a) statistical analysis on both simulated and real signals and (b) Big O notation. Moreover, we developed a PCA-hierarchical classifier, evaluated on simulated and real signals. Finally, we proposed a spike detection hardware design on FPGA, whose feasibility was verified in terms of CLB count, memory occupation and timing requirements; once realized, it will be able to execute on-line detection and real-time waveform analysis, reducing data storage problems. PMID:20300592
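A minimal software sketch of amplitude-threshold spike detection is given below; the noise-scaled threshold, refractory period and synthetic trace are assumed values and do not reproduce the authors' FPGA design.

```python
import numpy as np

def detect_spikes(signal, fs, k=5.0, refractory_ms=1.0):
    """Amplitude-threshold spike detection: threshold at k times a robust
    noise estimate (MAD), with a refractory period so each spike counts once."""
    noise = np.median(np.abs(signal)) / 0.6745      # robust std estimate
    thr = k * noise
    above = np.where(np.abs(signal) > thr)[0]
    spikes, last = [], -np.inf
    min_gap = int(refractory_ms * 1e-3 * fs)
    for idx in above:
        if idx - last > min_gap:
            spikes.append(idx)
            last = idx
    return np.array(spikes), thr

# Toy MEA trace: 1 s of noise plus three injected negative-going spikes
fs = 25000
x = 5e-6 * np.random.randn(fs)                      # volts
for pos in (5000, 12000, 20000):
    x[pos:pos + 20] -= 60e-6
idx, thr = detect_spikes(x, fs)
print(len(idx), idx)                                # expected: 3 spikes
```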
Sipkova, Zuzana; Lam, Fook Chang; Francis, Ian; Herold, Jim; Liu, Christopher
2013-04-01
To assess the use of serial computed tomography (CT) in the detection of osteo-odonto-lamina resorption in osteo-odonto-keratoprosthesis (OOKP) and to investigate the use of new volumetric software, Advanced Lung Analysis software (3D-ALA; GE Healthcare), for detecting changes in OOKP laminar volume. A retrospective assessment of the radiological databases and hospital records was performed for 22 OOKP patients treated at the National OOKP referral center in Brighton, United Kingdom. Three-dimensional surface reconstructions of the OOKP laminae were performed using stored CT data. For the 2-dimensional linear analysis, the linear dimensions of the reconstructed laminae were measured, compared with original measurements taken at the time of surgery, and then assigned a CT grade based on a predetermined resorption grading scale. The volumetric analysis involved calculating the laminar volumes using 3D-ALA. The effectiveness of 2-dimensional linear analysis, volumetric analysis, and clinical examination in detecting laminar resorption was compared. The mean change in laminar volume between the first and second scans was -6.67% (range, +10.13% to -24.86%). CT grades assigned to patients based on laminar dimension measurements remained the same, despite significant changes in laminar volumes. Clinical examination failed to identify 60% of patients who were found to have resorption on volumetric analysis. Currently, the detection of laminar resorption relies on clinical examination and the measurement of laminar dimensions on the 2- and 3-dimensional radiological images. Laminar volume measurement is a useful new addition to the armamentarium. It provides an objective tool that allows for a precise and reproducible assessment of laminar resorption.
RNA interference for performance enhancement and detection in doping control.
Kohler, Maxie; Schänzer, Wilhelm; Thevis, Mario
2011-10-01
RNA interference represents a comparatively new route for regulating and manipulating specific gene expression. Promising results were obtained in experimental therapies aimed at the treatment of different kinds of diseases including cancer, diabetes mellitus and Duchenne muscular dystrophy. While studies on down-regulation efficiency are often performed by analyzing the regulated protein, the direct detection of small interfering RNA molecules and antisense oligonucleotides is of great interest for the investigation of their metabolism and degradation, and also for the detection of a putative misuse of these molecules in sports. Myostatin down-regulation was shown to result in increased performance and muscle growth, and the regulation of several other proteins could be relevant for performance enhancement. This mini-review summarizes current approaches for the mass spectrometric analysis of siRNA and antisense oligonucleotides from biological matrices, and the available data on biodistribution, metabolism, and half-life of relevant substances are discussed. Copyright © 2011 John Wiley & Sons, Ltd.
An application of LOH analysis for detecting the genetic influences of space environmental radiation
NASA Astrophysics Data System (ADS)
Yatagai, F.; Umebayashi, Y.; Honma, M.; Abe, T.; Suzuki, H.; Shimazu, T.; Ishioka, N.; Iwaki, M.
To detect the genetic influence of space environmental radiation at the chromosome level, we proposed an application of a loss of heterozygosity (LOH) analysis system for mutations induced in human lymphoblastoid TK6 cells. Surprisingly, we succeeded in detecting mutations in frozen cells that had been exposed to a low dose (10 cGy) of carbon-ion beam irradiation. Mutation assays were performed within a few days, or after about one month of preservation at -80 °C, following irradiation. The results showed an increase in mutation frequency at the thymidine kinase (TK) gene locus of 1.6-fold (2.5 × 10^-6 to 3.9 × 10^-6) and 2.1-fold (2.5 × 10^-6 to 5.3 × 10^-6), respectively. Although the relative distributions of mutation classes were not changed by the radiation exposure in either assay, an interesting characteristic was detected using this LOH analysis system (two TK locus markers and eleven microsatellite loci spanning chromosome 17): radiation-specific patterns of interstitial deletions were observed in the hemizygous LOH mutants, which were considered a result of end-joining repair of carbon ion-induced DNA double-strand breaks. These results clearly demonstrate that this analysis can be used for the detection of low-dose ionizing radiation effects in frozen cells. In addition, we performed so-called adaptive response experiments in which TK6 cells were pre-irradiated with a low dose (2.5-10 cGy) of X-rays and then exposed to a challenging dose (2 Gy) of X-rays. Interestingly, the…
Lajnef, Tarek; Chaibi, Sahbi; Eichenlaub, Jean-Baptiste; Ruby, Perrine M.; Aguera, Pierre-Emmanuel; Samet, Mounir; Kachouri, Abdennaceur; Jerbi, Karim
2015-01-01
A novel framework for joint detection of sleep spindles and K-complex events, two hallmarks of sleep stage S2, is proposed. Sleep electroencephalography (EEG) signals are split into oscillatory (spindle) and transient (K-complex) components. This decomposition is conveniently achieved by applying morphological component analysis (MCA) to a sparse representation of EEG segments obtained by the recently introduced discrete tunable Q-factor wavelet transform (TQWT). Tuning the Q-factor provides a convenient and elegant tool to naturally decompose the signal into an oscillatory and a transient component. The actual detection step relies on thresholding (i) the transient component to reveal K-complexes and (ii) the time-frequency representation of the oscillatory component to identify sleep spindles. Optimal thresholds are derived from ROC-like curves (sensitivity vs. FDR) on training sets, and the performance of the method is assessed on test data sets. We assessed the performance of our method using full-night sleep EEG data we collected from 14 participants. In comparison to visual scoring (Expert 1), the proposed method detected spindles with a sensitivity of 83.18% and a false discovery rate (FDR) of 39%, while K-complexes were detected with a sensitivity of 81.57% and an FDR of 29.54%. Similar performances were obtained when using a second expert as benchmark. In addition, when the TQWT and MCA steps were excluded from the pipeline, the detection sensitivities dropped to 70% for spindles and to 76.97% for K-complexes, while the FDR rose to 43.62% and 49.09%, respectively. Finally, we also evaluated the performance of the proposed method on a set of publicly available sleep EEG recordings. Overall, the results we obtained suggest that the TQWT-MCA method may be a valuable alternative to existing spindle and K-complex detection methods. Paths for improvements and further validations with large-scale standard open-access benchmarking data sets are discussed. PMID:26283943
Lee, Gyeong-Hweon; Bang, Dae-Young; Lim, Jung-Hoon; Yoon, Seok-Min; Yea, Myeong-Jai; Chi, Young-Min
2017-10-15
In this study, a rapid method for simultaneous detection of ethyl carbamate (EC) and urea in Korean rice wine was developed. To achieve quantitative analysis of EC and urea, the conditions for ultra-performance liquid chromatography (UPLC) separation and atmospheric-pressure chemical ionization tandem mass spectrometry (APCI-MS/MS) detection were first optimized. Under the established conditions, the detection limit, relative standard deviation and linear range were 2.83 μg/L, 3.75-5.96%, and 0.01-10.0 mg/L, respectively, for urea; the corresponding values were 0.17 μg/L, 1.06-4.01%, and 1.0-50.0 μg/L, respectively, for EC. The correlation between the content of EC and that of its precursor urea was determined under specific pH (3.5 and 4.5) and temperature (4, 25, and 50 °C) conditions using the developed method. EC content increased with higher temperature and lower pH. In Korean rice wine, urea was detected at 0.19-1.37 mg/L and EC at 2.0-7.7 μg/L. The method developed in this study, which has the advantages of simplified sample preparation, low detection limits, and good selectivity, was successfully applied for the rapid analysis of EC and urea. Copyright © 2017 Elsevier B.V. All rights reserved.
Khandelwal, Siddhartha; Wickstrom, Nicholas
2016-12-01
Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis be carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that shows how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total.
Detection of Differential Item Functioning Using the Lasso Approach
ERIC Educational Resources Information Center
Magis, David; Tuerlinckx, Francis; De Boeck, Paul
2015-01-01
This article proposes a novel approach to detect differential item functioning (DIF) among dichotomously scored items. Unlike standard DIF methods that perform an item-by-item analysis, we propose the "LR lasso DIF method": a logistic regression (LR) model is formulated for all item responses. The model contains item-specific intercepts,…
Porto, Suely K S S; Nogueira, Thiago; Blanes, Lucas; Doble, Philip; Sabino, Bruno D; do Lago, Claudimir L; Angnes, Lúcio
2014-11-01
A method for the identification of 3,4-methylenedioxymethamphetamine (MDMA) and meta-chlorophenylpiperazine (mCPP) was developed employing capillary electrophoresis (CE) with capacitively coupled contactless conductivity detection (C4D). Sample extraction, separation, and detection of "Ecstasy" tablets were performed in <10 min without sample derivatization. The separation electrolyte was 20 mM TAPS/lithium, pH 8.7. Average minimal detectable amounts for MDMA and mCPP were 0.04 mg/tablet, several orders of magnitude lower than the minimum amount encountered in a tablet. Seven different Ecstasy tablets seized in Rio de Janeiro, Brazil, were analyzed by CE-C4D and compared against routine gas chromatography-mass spectrometry (GC-MS). The CE method demonstrated sufficient selectivity to discriminate the two target drugs, MDMA and mCPP, from the other drugs present in seizures, namely amphepramone, fenproporex, caffeine, lidocaine, and cocaine. Separation was performed in <90 s. The advantages of using C4D instead of traditional CE-UV methods for in-field analysis are also discussed. © 2014 American Academy of Forensic Sciences.
Performance analysis of robust road sign identification
NASA Astrophysics Data System (ADS)
Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.
2013-12-01
This study describes a performance analysis of a robust system for road sign identification that incorporates two stages with different algorithms. The proposed algorithms consist of HSV color filtering and PCA, used in the detection and recognition stages, respectively. The proposed algorithms are able to detect the three standard sign colors, namely red, yellow and blue. The hypothesis of the study is that road sign images can be used to detect and identify signs in the presence of occlusions and rotational changes. PCA is a feature extraction technique that reduces dimensionality. Sign images can be readily recognized and identified by the PCA method, as it has been used in many application areas. Based on the experimental results, the HSV stage is robust in road sign detection, with minimum success rates of 88% and 77% for non-occluded and partially occluded images, respectively. Successful recognition rates using PCA are in the range of 94-98%. All classes are recognized successfully at occlusion levels between 5% and 10%.
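A minimal OpenCV sketch of the HSV color-filtering stage for red signs is shown below; the hue/saturation/value bounds and the example file name are assumed values, and the PCA recognition stage is only indicated in a comment.

```python
import cv2
import numpy as np

def red_sign_mask(bgr_image):
    """HSV colour filtering for red road signs. Red hue wraps around 180
    in OpenCV, so two ranges are combined; bounds are assumed values."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 80, 60), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)
    # Small opening to remove isolated noise pixels before region extraction
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask

# Candidate regions from the mask would then be cropped, resized and projected
# onto the PCA eigenspace of the training signs for recognition.
# mask = red_sign_mask(cv2.imread("scene.jpg"))  # hypothetical file name
```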
Becker, A S; Blüthgen, C; Phi van, V D; Sekaggya-Wiltshire, C; Castelnuovo, B; Kambugu, A; Fehr, J; Frauenfelder, T
2018-03-01
To evaluate the feasibility of Deep Learning-based detection and classification of pathological patterns in a set of digital photographs of chest X-ray (CXR) images of tuberculosis (TB) patients. In this prospective, observational study, patients with previously diagnosed TB were enrolled. Photographs of their CXRs were taken using a consumer-grade digital still camera. The images were stratified by pathological patterns into classes: cavity, consolidation, effusion, interstitial changes, miliary pattern or normal examination. Image analysis was performed with commercially available Deep Learning software in two steps. Pathological areas were first localised; detected areas were then classified. Detection was assessed using receiver operating characteristics (ROC) analysis, and classification using a confusion matrix. The study cohort was 138 patients with human immunodeficiency virus (HIV) and TB co-infection (median age 34 years, IQR 28-40); 54 patients were female. Localisation of pathological areas was excellent (area under the ROC curve 0.82). The software could perfectly distinguish pleural effusions from intraparenchymal changes. The most frequent misclassifications were consolidations as cavitations, and miliary patterns as interstitial patterns (and vice versa). Deep Learning analysis of CXR photographs is a promising tool. Further efforts are needed to build larger, high-quality data sets to achieve better diagnostic performance.
Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios
2014-01-01
To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Shrestha, R; Shakya, R M; Khan, A A
2016-01-01
Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years of age with a provisional diagnosis of renal colic who had both a bedside ultrasound and a formal ultrasound performed were included. The presence of hydronephrosis on both ultrasounds, and the size and location of any ureteric stone on the formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, with the formal ultrasound performed by the radiologist as reference, were 90.8%, 78.3%, 85.5% and 85.7%, respectively. Bedside ultrasound and formal ultrasound both detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p < 0.001). Conclusion Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department to evaluate suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
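The reported sensitivity, specificity, PPV and NPV all follow from a 2x2 contingency table of bedside-ultrasound results against the radiologist reference. A minimal sketch, using hypothetical counts that are not the study's raw data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 contingency table."""
    sensitivity = tp / (tp + fn)   # detected among patients with hydronephrosis
    specificity = tn / (tn + fp)   # correctly ruled out among patients without it
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for illustration only (not the study's data).
print(diagnostic_metrics(tp=59, fp=10, fn=6, tn=36))
```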
Acoustic Analysis and Electroglottography in Elite Vocal Performers.
Villafuerte-Gonzalez, Rocio; Valadez-Jimenez, Victor M; Sierra-Ramirez, Jose A; Ysunza, Pablo Antonio; Chavarria-Villafuerte, Karen; Hernandez-Lopez, Xochiquetzal
2017-05-01
Acoustic analysis of voice (AAV) and electroglottography (EGG) have been used for assessing vocal quality in patients with voice disorders. The effectiveness of these procedures for detecting mild disturbances in vocal quality in elite vocal performers has been controversial. To compare acoustic parameters obtained by AAV and EGG before and after vocal training to determine the effectiveness of these procedures for detecting vocal improvements in elite vocal performers. Thirty-three elite vocal performers were studied. The study group included 14 males and 19 females, ages 18-40 years, without a history of voice disorders. Acoustic parameters were obtained through AAV and EGG before and after vocal training using the Linklater method. Nonsignificant differences (P > 0.05) were found between values of fundamental frequency (F0), shimmer, and jitter obtained by both procedures before vocal training. Mean F0 was similar after vocal training. Jitter percentage as measured by AAV showed nonsignificant differences (P > 0.05) before and after vocal training. Shimmer percentage as measured by AAV demonstrated a significant reduction (P < 0.05) after vocal training. As measured by EGG after vocal training, shimmer and jitter were significantly reduced (P < 0.05), open quotient was significantly increased (P < 0.05), and irregularity was significantly reduced (P < 0.05). AAV and EGG were effective for detecting improvements in vocal function in male and female elite vocal performers undergoing vocal training. EGG demonstrated better efficacy for detecting improvements and provided additional parameters compared with AAV. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
The Tissue Analysis Core (TAC) within the AIDS and Cancer Virus Program will process, embed, and perform microtomy on fixed tissue samples presented in ethanol. CD4 (DAB) and CD68/CD163 (FastRed) double immunohistochemistry will be performed, in whic
Analysis of Factors Affecting System Performance in the ASpIRE Challenge
2015-12-13
performance in the ASpIRE (Automatic Speech recognition In Reverberant Environments) challenge. In particular, overall word error rate (WER) of the solver...systems is analyzed as a function of room, distance between talker and microphone, and microphone type. We also analyze speech activity detection...analysis will inform the design of future challenges and provide insight into the efficacy of current solutions addressing noisy reverberant speech
A group filter algorithm for sea mine detection
NASA Astrophysics Data System (ADS)
Cobb, J. Tory; An, Myoung; Tolimieri, Richard
2005-06-01
Automatic detection of sea mines in coastal regions is a difficult task due to the highly variable sea bottom conditions present in the underwater environment. Detection systems must be able to discriminate objects which vary in size, shape, and orientation from naturally occurring and man-made clutter. Additionally, these automated systems must be computationally efficient to be incorporated into unmanned underwater vehicle (UUV) sensor systems characterized by high sensor data rates and limited processing abilities. Using noncommutative group harmonic analysis, a fast, robust sea mine detection system is created. A family of unitary image transforms associated to noncommutative groups is generated and applied to side scan sonar image files supplied by Naval Surface Warfare Center Panama City (NSWC PC). These transforms project key image features, geometrically defined structures with orientations, and localized spectral information into distinct orthogonal components or feature subspaces of the image. The performance of the detection system is compared against the performance of an independent detection system in terms of probability of detection (Pd) and probability of false alarm (Pfa).
Variable threshold method for ECG R-peak detection.
Kew, Hsein-Ping; Jeong, Do-Un
2011-10-01
In this paper, a wearable belt-type ECG electrode worn around the chest to measure the real-time ECG is produced in order to minimize the inconvenience of wearing it. The ECG signal is detected using a potential-measurement instrument system. The measured ECG signal is transmitted via an ultra-low-power wireless data communications unit to a personal computer using a Zigbee-compatible wireless sensor node. ECG signals carry a lot of clinical information for a cardiologist, especially the R-peaks. R-peak detection generally uses a fixed threshold value, which leads to errors in peak detection when the baseline changes due to motion artifacts or when the signal amplitude changes. A preprocessing stage that includes differentiation and the Hilbert transform is used as the signal preprocessing algorithm. Thereafter, a variable threshold method is used to detect the R-peaks, which is more accurate and efficient than the fixed-threshold method. R-peak detection using the MIT-BIH databases and long-term real-time ECG is performed in this research in order to evaluate the performance of the method.
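A minimal sketch of the preprocessing and variable-threshold idea is given below, assuming NumPy/SciPy; the paper's exact filter chain and threshold-update rule are not specified here, so the window length, threshold fraction and refractory period are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def detect_r_peaks(ecg, fs, win_sec=2.0, frac=0.6, refractory=0.25):
    """Differentiate, take the Hilbert envelope, then apply a per-window variable threshold."""
    diff = np.diff(ecg, prepend=ecg[0])          # emphasise the steep QRS slopes
    env = np.abs(hilbert(diff))                  # smooth, non-negative envelope
    win = int(win_sec * fs)
    peaks = []
    for start in range(0, len(env), win):
        seg = env[start:start + win]
        thr = frac * seg.max()                   # threshold adapts to the local amplitude
        i = 1
        while i < len(seg) - 1:
            if seg[i] > thr and seg[i] >= seg[i - 1] and seg[i] >= seg[i + 1]:
                peaks.append(start + i)
                i += int(refractory * fs)        # skip the refractory period after a peak
            else:
                i += 1
    return np.array(peaks)
```

Because the threshold is recomputed per window, baseline drift or amplitude changes shift the threshold with the signal instead of causing missed or spurious peaks.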
EEG analysis of seizure patterns using visibility graphs for detection of generalized seizures.
Wang, Lei; Long, Xi; Arends, Johan B A M; Aarts, Ronald M
2017-10-01
The traditional EEG features in the time and frequency domains show limited seizure detection performance in the epileptic population with intellectual disability (ID). In addition, the influence of EEG seizure patterns on detection performance has been less studied. A single-channel EEG signal can be mapped into visibility graphs (VGS), including the basic visibility graph (VG), horizontal VG (HVG), and difference VG (DVG). These graphs were used to characterize different EEG seizure patterns. To demonstrate their effectiveness in identifying EEG seizure patterns and detecting generalized seizures, EEG recordings of 615 h on one EEG channel from 29 epileptic patients with ID were analyzed. A novel feature set with discriminative power for seizure detection was obtained by using the VGS method. The degree distributions (DDs) of the DVG can clearly distinguish the EEG of each seizure pattern. The degree entropy and power-law degree power in the DVG are proposed here for the first time, and they show a significant difference between seizure and non-seizure EEG. The connecting structure measured by the HVG can better distinguish seizure EEG from background than those measured by the VG and DVG. A traditional EEG feature set based on frequency analysis was used here as a benchmark feature set. With a support vector machine (SVM) classifier, the seizure detection performance of the benchmark feature set (sensitivity of 24%, FDt/h of 1.8 s) can be improved by combining our proposed VGS features extracted from one EEG channel (sensitivity of 38%, FDt/h of 1.4 s). The proposed VGS-based features can help improve seizure detection for ID patients. Copyright © 2017 Elsevier B.V. All rights reserved.
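The basic (natural) visibility graph construction and one degree-based feature can be sketched as follows. This is a naive, quadratic-time implementation for clarity, and the degree-entropy shown is a plausible reading of the feature named above rather than the authors' exact formula.

```python
import numpy as np

def visibility_graph_degrees(x):
    """Naive natural visibility graph of a time series; returns the degree of each sample."""
    n = len(x)
    degree = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            # Node a "sees" node b if every intermediate sample lies below the connecting line.
            visible = all(
                x[c] < x[b] + (x[a] - x[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                degree[a] += 1
                degree[b] += 1
    return degree

def degree_entropy(degree):
    """Shannon entropy of the degree distribution P(k), a VGS-style scalar feature."""
    counts = np.bincount(degree)
    p = counts[counts > 0] / degree.size
    return -np.sum(p * np.log2(p))
```

Features of this kind would be computed per EEG epoch and fed, together with the benchmark frequency features, into the SVM classifier.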
Automated detection of fundus photographic red lesions in diabetic retinopathy.
Larsen, Michael; Godt, Jannik; Larsen, Nicolai; Lund-Andersen, Henrik; Sjølie, Anne Katrin; Agardh, Elisabet; Kalm, Helle; Grunkin, Michael; Owens, David R
2003-02-01
To compare a fundus image-analysis algorithm for automated detection of hemorrhages and microaneurysms with visual detection of retinopathy in patients with diabetes. Four hundred fundus photographs (35-mm color transparencies) were obtained in 200 eyes of 100 patients with diabetes who were randomly selected from the Welsh Community Diabetic Retinopathy Study. A gold standard reference was defined by classifying each patient as having or not having diabetic retinopathy based on overall visual grading of the digitized transparencies. A single-lesion visual grading was made independently, comprising meticulous outlining of all single lesions in all photographs and used to develop the automated red lesion detection system. A comparison of visual and automated single-lesion detection in replicating the overall visual grading was then performed. Automated red lesion detection demonstrated a specificity of 71.4% and a resulting sensitivity of 96.7% in detecting diabetic retinopathy when applied at a tentative threshold setting for use in diabetic retinopathy screening. The accuracy of 79% could be raised to 85% by adjustment of a single user-supplied parameter determining the balance between the screening priorities, for which a considerable range of options was demonstrated by the receiver-operating characteristic (area under the curve 90.3%). The agreement of automated lesion detection with overall visual grading (0.659) was comparable to the mean agreement of six ophthalmologists (0.648). Detection of diabetic retinopathy by automated detection of single fundus lesions can be achieved with a performance comparable to that of experienced ophthalmologists. The results warrant further investigation of automated fundus image analysis as a tool for diabetic retinopathy screening.
High-efficiency high performance liquid chromatographic analysis of red wine anthocyanins.
de Villiers, André; Cabooter, Deirdre; Lynen, Frédéric; Desmet, Gert; Sandra, Pat
2011-07-22
The analysis of anthocyanins in natural products is of significant relevance in recent times due to the recognised health benefits associated with their consumption. In red grapes and wines in particular, anthocyanins are known to contribute important properties to the sensory (colour and taste), anti-oxidant- and ageing characteristics. However, the detailed investigation of the alteration of these compounds during wine ageing is hampered by the challenges associated with the separation of grape-derived anthocyanins and their derived products. High performance liquid chromatography (HPLC) is primarily used for this purpose, often in combination with mass spectrometric (MS) detection, although conventional HPLC methods provide incomplete resolution. We have previously demonstrated how on-column inter-conversion reactions are responsible for poor chromatographic efficiency in the HPLC analysis of anthocyanins, and how an increase in temperature and decrease in particle size may improve the chromatographic performance. In the current contribution an experimental configuration for the high efficiency analysis of anthocyanins is derived using the kinetic plot method (KPM). Further, it is shown how analysis under optimal conditions, in combination with MS detection, delivers much improved separation and identification of red wine anthocyanins and their derived products. This improved analytical performance holds promise for the in-depth investigation of these influential compounds in wine during ageing. Copyright © 2011 Elsevier B.V. All rights reserved.
Nanostructure-enhanced surface plasmon resonance imaging (Conference Presentation)
NASA Astrophysics Data System (ADS)
Špašková, Barbora; Lynn, Nicholas S.; Slabý, Jiří; Bocková, Markéta; Homola, Jiří
2017-06-01
There remains a need for the multiplexed detection of biomolecules at extremely low concentrations in fields of medical diagnostics, food safety, and security. Surface plasmon resonance imaging is an established biosensing approach in which the measurement of the intensity of light across a sensor chip is correlated with the amount of target biomolecules captured by the respective areas on the chip. In this work, we present a new approach for this method allowing for enhanced bioanalytical performance via the introduction of nanostructured sensing chip and polarization contrast measurement, which enable the exploitation of both amplitude and phase properties of plasmonic resonances on the nanostructures. Here we will discuss a complex theoretical analysis of the sensor performance, whereby we investigate aspects related to both the optical performance as well as the transport of the analyte molecules to the functionalized surfaces. This analysis accounts for the geometrical parameters of the nanostructured sensing surface, the properties of functional coatings, and parameters related to the detection assay. Based on the results of the theoretical analysis, we fabricated sensing chips comprised of arrays of gold nanoparticles (by electron-beam lithography), which were modified by a biofunctional coating to allow for the selective capturing of the target biomolecules in the regions with high sensitivity. In addition, we developed a compact optical reader with an integrated microfluidic cell, allowing for the measurement from 50 independent sensing channels. The performance of this biosensor is demonstrated through the sensitive detection of short oligonucleotides down to the low picomolar level.
A Novel Technique to Detect Code for SAC-OCDMA System
NASA Astrophysics Data System (ADS)
Bharti, Manisha; Kumar, Manoj; Sharma, Ajay K.
2018-04-01
The main task of an optical code division multiple access (OCDMA) system is the detection of the code used by a user in the presence of multiple access interference (MAI). In this paper, a new method of detection known as XOR subtraction detection for spectral amplitude coding OCDMA (SAC-OCDMA) based on double weight codes is proposed and presented. As MAI is the main source of performance deterioration in OCDMA systems, the SAC technique is used in this paper to eliminate the effect of MAI to a large extent. A comparative analysis is then made between the proposed scheme and other conventional detection schemes, namely complementary subtraction detection, AND subtraction detection and NAND subtraction detection. The system performance is characterized by Q-factor, BER and received optical power (ROP) with respect to input laser power and fiber length. The theoretical and simulation investigations reveal that the proposed detection technique provides better quality factor, security and received power in comparison to the other conventional techniques. The wide opening of the eye diagram in the case of the proposed technique also proves its robustness.
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive material detection. Compared with commonly adopted detection methods based on classical statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times when analyzing spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the methodology of the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator using Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, represented respectively by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
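The flavor of a sequential Bayesian processor can be illustrated with a simplified sketch: assuming Poisson arrivals, the posterior probability of "source present" is updated after every recorded event from the exponential inter-arrival likelihoods. The background and source rates, prior and decision threshold below are hypothetical and do not correspond to the LaBr3(Ce) simulation parameters.

```python
import numpy as np

def sequential_bayes_detect(interarrival_times, rate_bkg, rate_src,
                            prior_source=0.5, threshold=0.95):
    """Update P(source present) after each photon arrival; stop once it crosses a threshold.

    Assumes Poisson arrivals, i.e. exponentially distributed inter-arrival times, with
    rate rate_bkg (background only) or rate_bkg + rate_src (source present).
    """
    lam0, lam1 = rate_bkg, rate_bkg + rate_src
    log_odds = np.log(prior_source / (1.0 - prior_source))
    history = []
    for k, dt in enumerate(interarrival_times, start=1):
        # Log-likelihood ratio contributed by one exponential inter-arrival time.
        log_odds += np.log(lam1 / lam0) - (lam1 - lam0) * dt
        p_source = 1.0 / (1.0 + np.exp(-log_odds))
        history.append(p_source)
        if p_source > threshold:
            return True, k, history
    return False, len(history), history

# Hypothetical event generator: 5 cps background plus a 3 cps source.
rng = np.random.default_rng(0)
dts = rng.exponential(1.0 / 8.0, size=200)
print(sequential_bayes_detect(dts, rate_bkg=5.0, rate_src=3.0)[:2])
```

The number of events needed to cross the threshold plays the role of the verification time discussed in the abstract.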
A comparative analysis of frequency modulation threshold extension techniques
NASA Technical Reports Server (NTRS)
Arndt, G. D.; Loch, F. J.
1970-01-01
FM threshold extension for system performance improvement, comparing impulse noise elimination, correlation detection and delta modulation signal processing techniques implemented at demodulator output
Temesi, David G; Martin, Scott; Smith, Robin; Jones, Christopher; Middleton, Brian
2010-06-30
Screening assays capable of performing quantitative analysis on hundreds of compounds per week are used to measure metabolic stability during early drug discovery. Modern orthogonal acceleration time-of-flight (OATOF) mass spectrometers equipped with analogue-to-digital signal capture (ADC) now offer performance levels suitable for many applications normally supported by triple quadrupole instruments operated in multiple reaction monitoring (MRM) mode. Herein the merits of MRM and OATOF with ADC detection are compared for more than 1000 compounds screened in rat and/or cryopreserved human hepatocytes over a period of 3 months. Statistical comparison of a structurally diverse subset indicated good agreement between the two detection methods. The overall success rate was higher using OATOF detection and data acquisition time was reduced by around 20%. Targeted metabolites of diazepam were detected in samples from a CLint determination performed at 1 microM. Data acquisition by positive and negative ion mode switching can be achieved on high-performance liquid chromatography (HPLC) peak widths as narrow as 0.2 min (at base), thus enabling a more comprehensive first-pass analysis with fast HPLC gradients. Unfortunately, most existing OATOF instruments lack the software tools necessary to rapidly convert the huge amounts of raw data into quantified results. Software with functionality similar to open access triple quadrupole systems is needed for OATOF to truly compete in a high-throughput screening environment. Copyright 2010 John Wiley & Sons, Ltd.
Design and analysis of APD photoelectric detecting circuit
NASA Astrophysics Data System (ADS)
Fang, R.; Wang, C.
2015-11-01
In a LADAR system, the photoelectric detecting circuit is the key part of the photoelectric conversion, and it determines the response speed, sensitivity and fidelity of the system. This paper presents the design of a matched APD photoelectric detecting circuit. The circuit accomplishes low-noise readout and high-gain amplification of the weak photoelectric signal. The main performance characteristics, especially the noise and transient response of the circuit, are analyzed. In order to obtain a large bandwidth, decompensated operational amplifiers are applied. Circuit simulations allow the architecture to be validated and the global performance to be predicted. The simulation results show that the gain of the detecting circuit is 630 kΩ with a bandwidth of 100 MHz, and a 28 dB dynamic range is achieved. Furthermore, the variation of the output pulse width is less than 0.9 ns.
Rapid detection of EBOLA VP40 in microchip immunofiltration assay
NASA Astrophysics Data System (ADS)
Miethe, Peter; Gary, Dominik; Hlawatsch, Nadine; Gad, Anne-Marie
2015-05-01
In the spring of 2014, the Ebola virus (EBOV) strain Zaire caused a dramatic outbreak in several regions of West Africa. The RT-PCR and antigen capture diagnostic proved to be effective for detecting EBOV in blood and serum. In this paper, we present data of a rapid antigen capture test for the detection of VP40. The test was performed in a microfluidic chip for immunofiltration analysis. The chip integrates all necessary assay components. The analytical sensitivity of the rapid test was 8 ng/ml for recombinant VP40. In serum and whole blood samples spiked with virus culture material, the detection limit was 2.2 × 10² PFU/ml. The performance data of the rapid test (15 min) are comparable to that of the VP40 laboratory ELISA.
NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios
NASA Astrophysics Data System (ADS)
Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.
2012-04-01
For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signatures. The IMS data are collected, processed into analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centers (NDC) giving technical advice concerning CTBT verification to their governments. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of NPE 2010 was on the radionuclide detection component and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections, which were calculated beforehand in secret by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and subsequently to analyze promising candidate events using their waveform signals. The study shows one possible solution of NPE 2010 as it was performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. Of the considered seismic events in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event, at Black Thunder Mine, Wyoming, on 23 October at 21:15 UTC, showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. An infrasonic one-station localization algorithm led to event localization results comparable in precision to the teleseismic localization. However, the analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as the source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, which is an additional confirmation that the event was correctly identified. The event analysis of NPE 2010 shown here serves as a successful example of data fusion between radionuclide detection supported by ATM, seismological methodology, and infrasound signal processing.
NASA Astrophysics Data System (ADS)
Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.
2013-05-01
In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct-detection, (ii) optically-preamplified PIN receiver, (iii) PIN-based coherent-detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally-validated detection statistics can be used as part of an end-to-end system model for projecting rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.
NASA Astrophysics Data System (ADS)
Hermawan, D.; Suwandri; Sulaeman, U.; Istiqomah, A.; Aboul-Enein, H. Y.
2017-02-01
A simple high performance liquid chromatography (HPLC) method has been developed in this study for the analysis of miconazole, an antifungal drug, in a powder sample. The optimized HPLC system using a C8 column was achieved with a mobile phase composition of methanol:water (85:15, v/v), a flow rate of 0.8 mL/min, and UV detection at 220 nm. The calibration graph was linear in the range from 10 to 50 mg/L with an r² of 0.9983. The limit of detection (LOD) and limit of quantitation (LOQ) obtained were 2.24 mg/L and 7.47 mg/L, respectively. The present HPLC method is applicable to the determination of miconazole in the powder sample with a recovery of 101.28% (RSD = 0.96%, n = 3). The developed HPLC method provides a short analysis time, high reproducibility and high sensitivity.
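The reported LOD and LOQ are consistent with the common ICH-style estimates of 3.3·sigma/slope and 10·sigma/slope from a linear calibration. A small sketch, with hypothetical calibration points; the paper's exact validation procedure is not restated here.

```python
import numpy as np

def calibration_lod_loq(conc, response):
    """Fit a straight calibration line and estimate LOD/LOQ from the residual noise.

    Uses the common ICH-style estimates LOD ≈ 3.3*sigma/slope and LOQ ≈ 10*sigma/slope,
    where sigma is the standard deviation of the regression residuals.
    """
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)                     # two fitted parameters
    r2 = 1 - (residuals ** 2).sum() / ((response - response.mean()) ** 2).sum()
    return slope, intercept, r2, 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration points (mg/L vs. peak area) for illustration only.
conc = np.array([10, 20, 30, 40, 50], dtype=float)
area = np.array([152, 300, 455, 598, 751], dtype=float)
print(calibration_lod_loq(conc, area))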
Detecting trap states in planar PbS colloidal quantum dot solar cells
Jin, Zhiwen; Wang, Aiji; Zhou, Qing; Wang, Yinshu; Wang, Jizheng
2016-01-01
The recently developed planar architecture (ITO/ZnO/PbS-TBAI/PbS-EDT/Au) has greatly improved the power conversion efficiency of colloidal quantum dot photovoltaics (QDPVs). However, the performance is still far below theoretical expectations, and trap states in the PbS-TBAI film are believed to be the major origin; characterization and understanding of the traps are highly demanded in order to develop strategies for continued performance improvement. Here, employing impedance spectroscopy, we detect trap states in planar PbS QDPVs. We determined a trap state about 0.34 eV below the conduction band with a density of around 3.2 × 10¹⁶ cm⁻³ eV⁻¹. Temperature-dependent open-circuit voltage analysis, temperature-dependent diode property analysis and temperature-dependent built-in potential analysis consistently indicate a below-bandgap activation energy of about 1.17-1.20 eV. PMID:27845392
Derivative Analysis of AVIRIS Data for Crop Stress Detection
NASA Technical Reports Server (NTRS)
Estep, Lee; Carter, Gregory A.; Berglund, Judith
2003-01-01
Low-altitude Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) hyperspectral imagery of a cornfield in Nebraska was used to determine whether derivative analysis methods provided enhanced plant stress detection compared with narrow-band ratios. The field was divided into 20 plots representing 4 replicates each of 5 nitrogen (N) fertilization treatments that ranged from 0 to 200 kg N/ha in 50 kg/ha increments. The imagery yielded a 3 m ground pixel size for 224 spectral bands. Derivative analysis provided no advantage in stress detection compared with the performance of narrow-band indices derived from the literature. This result was attributed to a high leaf area index at the time of overflight (LAI approximately 5 to 6) and the high signal-to-noise character of the narrow AVIRIS bands.
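As a minimal illustration of the two families of features compared, the sketch below computes a first-derivative spectrum by finite differences and a simple two-band ratio; the band centres are illustrative placeholders, not the specific indices the authors took from the literature.

```python
import numpy as np

def first_derivative(wavelengths, reflectance):
    """First-derivative spectrum by finite differences, computed band-wise per pixel."""
    return np.gradient(reflectance, wavelengths, axis=-1)

def narrow_band_ratio(wavelengths, reflectance, num_nm=695.0, den_nm=760.0):
    """Simple two-band ratio index; the band centres here are illustrative only."""
    i = np.argmin(np.abs(wavelengths - num_nm))
    j = np.argmin(np.abs(wavelengths - den_nm))
    return reflectance[..., i] / reflectance[..., j]
```

Either quantity would then be regressed against the nitrogen treatment levels plot by plot to gauge stress sensitivity.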
Zhao, Xiaoyong; Shen, Shanshan; Wu, Datong; Cai, Pengfei; Pan, Yuanjiang
2017-09-08
Analysis of carbohydrates based on matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is still challenging, and researchers have been devoting themselves to the discovery of efficient matrices. In the present study, the design, synthesis, and qualitative and quantitative performance of non-derivative ionic liquid matrices (ILMs) are reported. DHB/N-methylaniline (N-MA) and DHB/N-ethylaniline (N-EA), which performed best for carbohydrate detection, were screened out. The limits of detection for oligosaccharides provided by DHB/N-MA and DHB/N-EA were as low as 10 fmol. DHB/N-MA and DHB/N-EA showed significantly higher ion generation efficiency than DHB. A comparison of the capacity to probe polysaccharides between these two ILMs and DHB also revealed their powerful potential. Their outstanding performance was probably due to lower proton affinities and stronger UV absorption at λ = 355 nm. Moreover, taking DHB/N-MA as an example, quantitative analysis of fructo-oligosaccharide mixtures extracted and identified from rice noodles was accomplished sensitively using an internal standard method. Overall, DHB/N-MA and DHB/N-EA exhibited excellent performance and may serve as valuable matrices for carbohydrate analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Microcontroller-based real-time QRS detection.
Sun, Y; Suppappola, S; Wrublewski, T A
1992-01-01
The authors describe the design of a system for real-time detection of QRS complexes in the electrocardiogram based on a single-chip microcontroller (Motorola 68HC811). A systematic analysis of the instrumentation requirements for QRS detection and of the various design techniques is also given. Detection algorithms using different nonlinear transforms for the enhancement of QRS complexes are evaluated by using the ECG database of the American Heart Association. The results show that the nonlinear transform involving multiplication of three adjacent, sign-consistent differences in the time domain gives a good performance and a quick response. When implemented with an appropriate sampling rate, this algorithm is also capable of rejecting pacemaker spikes. The eight-bit single-chip microcontroller provides sufficient throughput and shows a satisfactory performance. Implementation of multiple detection algorithms in the same system improves flexibility and reliability. The low chip count in the design also favors maintainability and cost-effectiveness.
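The nonlinear transform described above (the product of three adjacent, sign-consistent first differences) can be sketched as follows; sampling-rate handling, pacemaker-spike rejection and the subsequent decision logic of the microcontroller implementation are omitted.

```python
import numpy as np

def sign_consistent_triple_product(ecg):
    """QRS-enhancing nonlinear transform: product of three adjacent first differences,
    emitted only where all three differences share the same sign (zero otherwise)."""
    d = np.diff(ecg)
    out = np.zeros(len(ecg))
    for n in range(2, len(d)):
        trio = d[n - 2], d[n - 1], d[n]
        if all(v > 0 for v in trio) or all(v < 0 for v in trio):
            out[n + 1] = trio[0] * trio[1] * trio[2]
    return out
```

Because isolated spikes produce differences of alternating sign, the sign-consistency requirement suppresses them while the steep monotone slopes of a QRS complex are strongly amplified.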
On-line early fault detection and diagnosis of municipal solid waste incinerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao Jinsong; Huang Jianchao; Sun Wei
A fault detection and diagnosis framework is proposed in this paper for early fault detection and diagnosis (FDD) of municipal solid waste incinerators (MSWIs) in order to improve the safety and continuity of production. In this framework, principal component analysis (PCA), one of the multivariate statistical technologies, is used for detecting abnormal events, while rule-based reasoning performs the fault diagnosis and consequence prediction, and also generates recommendations for fault mitigation once an abnormal event is detected. A software package, SWIFT, is developed based on the proposed framework, and has been applied in an actual industrial MSWI. The application shows that automated real-time abnormal situation management (ASM) of the MSWI can be achieved by using SWIFT, with an industrially acceptable low rate of misdiagnosis, resulting in improved process continuity and environmental performance of the MSWI.
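A minimal sketch of PCA-based abnormal event detection of the kind used in such frameworks: a PCA model is fitted on normal operating data, and new samples are flagged when the Hotelling T² statistic (inside the model subspace) or the squared prediction error (SPE, outside it) exceeds a control limit. The empirical percentile limits and fixed component count below are simplifying assumptions; SWIFT's actual implementation is not reproduced here.

```python
import numpy as np

class PCAMonitor:
    """PCA-based detection of abnormal events: T^2 in the model subspace, SPE outside it."""

    def fit(self, X_normal, n_components=3, alpha=0.99):
        self.mean = X_normal.mean(axis=0)
        self.std = X_normal.std(axis=0) + 1e-12
        Z = (X_normal - self.mean) / self.std
        U, S, Vt = np.linalg.svd(Z, full_matrices=False)
        self.P = Vt[:n_components].T                       # loading vectors
        self.var = (S[:n_components] ** 2) / (len(Z) - 1)  # retained component variances
        t2, spe = self._stats(Z)
        # Simple empirical control limits taken from the normal training data.
        self.t2_lim = np.quantile(t2, alpha)
        self.spe_lim = np.quantile(spe, alpha)
        return self

    def _stats(self, Z):
        scores = Z @ self.P
        t2 = ((scores ** 2) / self.var).sum(axis=1)
        resid = Z - scores @ self.P.T
        spe = (resid ** 2).sum(axis=1)
        return t2, spe

    def is_abnormal(self, X):
        Z = (X - self.mean) / self.std
        t2, spe = self._stats(Z)
        return (t2 > self.t2_lim) | (spe > self.spe_lim)
```

In the proposed framework, a sample flagged by either statistic would trigger the rule-based reasoning stage for diagnosis and mitigation advice.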
NASA Astrophysics Data System (ADS)
Pujiyanto; Yasin, M.; Rusydi, F.
2018-03-01
The development of lead ion detection systems is expected to offer advantages in terms of device simplicity and ease of lead ion concentration analysis with very high performance. One important part of a lead ion detection system is the electrical signal acquisition part. The electrical signal acquisition part uses the following main electronic components: a non-inverting op-amp, an instrumentation amplifier, a multiplier circuit and a logarithmic amplifier. Here we show the performance of the lead ion detection system when the electrical signal processing is built from commercial electronic components. The results of this experiment show that the developed lead ion sensor can be used to detect lead ions with a sensitivity of 10.48 mV/ppm, a linearity of 97.11%, and a measurement range of 0.1 ppm to 80 ppm.
Musharraf, Syed Ghulam; Ameer, Mariam; Ali, Arslan
2017-01-05
Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS), being a soft ionization technique, has become a method of choice for high-throughput analysis of proteins and peptides. In this study, we have explored the potential of the atypical anti-psychotic drug olanzapine (OLZ) as a matrix for MALDI-MS analysis of peptides, aided by theoretical studies. Seven small peptides were employed as target analytes to check the performance of olanzapine and compare it with the conventional MALDI matrix α-cyano-4-hydroxycinnamic acid (HCCA). All peptides were successfully detected when olanzapine was used as a matrix. Moreover, the peptides angiotensin I and angiotensin II were detected with better S/N ratio and resolution with this method than in their analysis with HCCA. Computational studies were performed to determine the thermochemical properties of olanzapine in order to further evaluate its similarity to MALDI matrices, and these were found to be in good agreement with the data for existing MALDI matrices. Copyright © 2016. Published by Elsevier B.V.
Takahashi, Makoto; Sakamaki, Shizuka; Fujita, Akira
2013-01-01
We developed and validated a new high-performance liquid chromatographic analysis for electrochemically detecting guaiacol and vanillin as important components of vanilla extract. Separation was achieved with a Capcell Pak C-18 MG column, the potential of the working electrode being set at +1000 mV. The respective calibration curves for guaiacol and vanillin were linear in the ranges of 1.60-460 µg/L and 5.90-1180 µg/L. The respective limits for quantifying guaiacol and vanillin were 1.60 µg/L and 2.36 µg/L. The relative standard deviations of the intra- and inter-day precision of the retention time and peak area were all less than 4%. The recoveries of guaiacol and vanillin were both more than 97%, and all of the validation data were within an acceptable range. This analytical method is well suited for the simultaneous and convenient analysis of guaiacol and vanillin in a vanilla extract to evaluate its quality.
MEG-guided analysis of 7T-MRI in patients with epilepsy.
Colon, A J; Osch, M J P van; Buijs, M; Grond, J V D; Hillebrand, A; Schijns, O; Wagner, G J; Ossenblok, P; Hofman, P; Buchem, M A V; Boon, P
2018-05-26
To study the possible detection of structural abnormalities on 7T MRI that were not detected on 3T MRI, and to estimate the added value of MEG guidance. For abnormalities found, the convergence between clinical, MEG and 7T MRI localization of suspected epileptogenic foci was analysed. In adult patients with well-documented localization-related epilepsy in whom a previous 3T MRI did not demonstrate an epileptogenic lesion but MEG indicated a plausible epileptogenic focus, 7T MRI was performed. Visual analysis of the 7T images was performed based on semiologic data, as well as guided by the prior MEG results. Correlation with other data from the patient charts, insofar as these were available, was analysed. To establish the level of concordance between the three observers, the generalized or Fleiss kappa was calculated. In 3/19 patients, abnormalities that, based on semiology, could plausibly represent an epileptogenic lesion were detected using 7T MRI. In an additional 3/19, an abnormality was detected after MEG guidance. However, in these latter cases there was no concordance among the three observers with regard to the presence of a structural abnormality. In one of these three cases intracranial recording was performed, proving the possible abnormality on 7T MRI to be the epileptogenic focus. In 32% of patients, 7T MRI showed abnormalities that could indicate an epileptogenic lesion whereas previous 3T MRI did not, especially when visual inspection was guided by the presence of focal interictal MEG abnormalities. Copyright © 2018 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
Shi, Yan; Xiong, Jing; Sun, Dongmei; Liu, Wei; Wei, Feng; Ma, Shuangcheng; Lin, Ruichao
2015-08-01
An accurate and sensitive high-performance liquid chromatography method coupled with ultraviolet detection and precolumn derivatization was developed for the simultaneous quantification of the major bile acids in artificial Calculus bovis, including cholic acid, hyodeoxycholic acid, chenodeoxycholic acid, and deoxycholic acid. The extraction, derivatization, chromatographic separation, and detection parameters were fully optimized. The samples were extracted with methanol by ultrasonic extraction. Then, 2-bromo-4'-nitroacetophenone and 18-crown-6 were used for derivatization. The chromatographic separation was performed on an Agilent SB-C18 column (250 × 4.6 mm id, 5 μm) at a column temperature of 30°C and a flow rate of 1.0 mL/min, using water and methanol as the mobile phase with gradient elution. The detection wavelength was 263 nm. The method was extensively validated by evaluating the linearity (r² ≥ 0.9980), recovery (94.24-98.91%), limits of detection (0.25-0.31 ng) and limits of quantification (0.83-1.02 ng). Seventeen samples were analyzed using the developed and validated method. The amounts of bile acids were then analyzed by hierarchical agglomerative clustering analysis and principal component analysis. The results of the chemometric analysis showed that the contents of these compounds reflect the intrinsic quality of artificial Calculus bovis, and two compounds (hyodeoxycholic acid and chenodeoxycholic acid) were the most important markers for quality evaluation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Analysis of the restricting factors of laser countermeasure active detection technology
NASA Astrophysics Data System (ADS)
Zhang, Yufa; Sun, Xiaoquan
2016-07-01
The detection effect of a laser active detection system is affected by various factors. In view of the application requirements of laser active detection, the influencing factors are analyzed. A mathematical model of the cat-eye target detection distance is built, and the influence of the parameters of the laser detection system and of the environment on the detection range and detection efficiency is analyzed. The constraints that the various parameters place on detection performance are simulated. The results show that the detection distance of laser active detection is affected by the laser divergence angle, the incident angle and the atmospheric visibility. For a given detection range, the laser divergence angle and the detection efficiency are mutually constrained. Therefore, for a specific application environment, it is necessary to select appropriate laser detection parameters to achieve the optimal detection effect.
Development of techniques for the analysis of isoflavones in soy foods and nutraceuticals.
Dentith, Susan; Lockwood, Brian
2008-05-01
For over 20 years, soy isoflavones have been investigated for their ability to prevent a wide range of cancers, cardiovascular problems, and numerous other disease states. This research is underpinned by the ability of researchers to analyse isoflavones in various forms in a range of raw materials and biological fluids. This review summarizes the techniques recently used in their analysis. The speed of high-performance liquid chromatography analysis has been improved, allowing analysis of more samples, and the increasing sensitivity of detection techniques allows quantification of isoflavones down to nanomoles-per-litre levels in biological fluids. The combination of high-performance liquid chromatography with immunoassay has allowed identification and estimation of low-level soy isoflavones. The use of soy isoflavone supplements has been shown to increase their circulating levels in plasma and urine, aiding investigation of their biological effects. The significance of the metabolite equol has spurred research into new areas, and recently the specific enantiomers have been studied. High-performance liquid chromatography, capillary electrophoresis and gas chromatography are widely used with a range of detection systems. Increasingly, immunoassay is being used because of its high sensitivity and low cost.
A new pivoting and iterative text detection algorithm for biomedical images.
Xu, Songhua; Krauthammer, Michael
2010-12-01
There is interest to expand the reach of literature mining to include the analysis of biomedical images, which often contain a paper's key findings. Examples include recent studies that use Optical Character Recognition (OCR) to extract image text, which is used to boost biomedical image retrieval and classification. Such studies rely on the robust identification of text elements in biomedical images, which is a non-trivial task. In this work, we introduce a new text detection algorithm for biomedical images based on iterative projection histograms. We study the effectiveness of our algorithm by evaluating the performance on a set of manually labeled random biomedical images, and compare the performance against other state-of-the-art text detection algorithms. We demonstrate that our projection histogram-based text detection approach is well suited for text detection in biomedical images, and that the iterative application of the algorithm boosts performance to an F score of 0.60. We provide a C++ implementation of our algorithm freely available for academic use. Copyright © 2010 Elsevier Inc. All rights reserved.
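The projection-histogram idea can be sketched as follows: binarized text pixels are summed along rows or columns, the image is split at empty valleys of the profile, and the procedure alternates direction over a few iterations. This is a simplified, hypothetical rendering of the pivoting recursion, not the authors' exact algorithm.

```python
import numpy as np

def split_by_projection(binary, axis, min_len=3):
    """Split a binary (text=1) image into bands separated by empty projection valleys."""
    profile = binary.sum(axis=1 - axis)          # row (axis=0) or column (axis=1) ink histogram
    filled = profile > 0
    bands, start = [], None
    for i, f in enumerate(filled):
        if f and start is None:
            start = i
        elif not f and start is not None:
            if i - start >= min_len:             # keep only bands of a minimum extent
                bands.append((start, i))
            start = None
    if start is not None:
        bands.append((start, len(filled)))
    return bands

def detect_text_blocks(binary, depth=2):
    """Alternate horizontal and vertical projection splits a few times (iterative pivoting)."""
    boxes = [(0, binary.shape[0], 0, binary.shape[1])]
    for d in range(depth):
        axis = d % 2
        new_boxes = []
        for r0, r1, c0, c1 in boxes:
            sub = binary[r0:r1, c0:c1]
            for a, b in split_by_projection(sub, axis):
                new_boxes.append((r0 + a, r0 + b, c0, c1) if axis == 0
                                 else (r0, r1, c0 + a, c0 + b))
        boxes = new_boxes
    return boxes
```

Each surviving box would then be passed to OCR, with non-text boxes filtered out by size and density heuristics.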
Driver fatigue alarm based on eye detection and gaze estimation
NASA Astrophysics Data System (ADS)
Sun, Xinghua; Xu, Lu; Yang, Jingyu
2007-11-01
The driver assistant system has attracted much attention as an essential component of intelligent transportation systems. One task of a driver assistant system is to prevent driver fatigue. For fatigue detection it is natural to utilize information about the eyes. Driver fatigue can be divided into two types: sleep with the eyes closed and sleep with the eyes open. Considering that fatigue detection involves prior knowledge and probabilistic statistics, a dynamic Bayesian network is used as the analysis tool to perform the fatigue reasoning. Two kinds of experiments are performed to verify the system's effectiveness, one based on video recorded in the laboratory and another based on video recorded in a real driving situation. Ten persons participated in the test, and the experimental result is that in the laboratory all the fatigue events can be detected, while in the practical vehicle the detection ratio is about 85%. Experiments show that in most situations the proposed system works and the corresponding performance is satisfactory.
Object tracking via background subtraction for monitoring illegal activity in crossroad
NASA Astrophysics Data System (ADS)
Ghimire, Deepak; Jeong, Sunghwan; Park, Sang Hyun; Lee, Joonwhoan
2016-07-01
In the field of intelligent transportation systems, a great number of vision-based techniques have been proposed to prevent pedestrians from being hit by vehicles. This paper presents a system that can perform pedestrian and vehicle detection and monitoring of illegal activity at zebra crossings. At a zebra crossing, according to the traffic light status, a driver or pedestrian should be warned early about any illegal movement in order to fully avoid a collision. In this research, we first detect the pedestrian traffic light status and monitor the crossroad for vehicle and pedestrian movements. Background-subtraction-based object detection and tracking is performed to detect pedestrians and vehicles at the crossroad. Shadow removal, blob segmentation, and trajectory analysis are used to improve the object detection and classification performance. We demonstrate the experiment on several video sequences recorded at different times and in different environments, such as daytime and nighttime, and sunny and rainy conditions. Our experimental results show that such a simple and efficient technique can be used successfully as a traffic surveillance system to prevent accidents at zebra crossings.
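A minimal sketch of the background-subtraction front end is given below, assuming OpenCV's MOG2 model with its built-in shadow marking; the trajectory analysis and pedestrian/vehicle classification stages of the system are not reproduced here.

```python
import cv2
import numpy as np

def monitor_crossing(video_path, min_area=400):
    """Background subtraction with shadow suppression, yielding blob bounding boxes per frame."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                    detectShadows=True)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        mask[mask == 127] = 0                      # MOG2 marks shadow pixels as 127; drop them
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        blobs = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > min_area]
        yield frame, blobs
    cap.release()
```

Each blob would then be associated across frames to form a trajectory, which is what the system checks against the traffic light status to flag illegal moves.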
Incipient Crack Detection in Composite Wind Turbine Blades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Stuart G.; Choi, Mijin; Jeong, Hyomi
2012-08-28
This paper presents some analysis results for incipient crack detection in a 9-meter CX-100 wind turbine blade that underwent fatigue loading to failure. The blade was manufactured to standard specifications, and it underwent harmonic excitation at its first resonance using a hydraulically-actuated excitation system until reaching catastrophic failure. This work investigates the ability of an ultrasonic guided wave approach to detect incipient damage prior to the surfacing of a visible, catastrophic crack. The blade was instrumented with piezoelectric transducers, which were used in an active, pitch-catch mode with guided waves over a range of excitation frequencies. The performance in detecting incipient crack formation in the fiberglass skin of the blade is assessed over the range of frequencies in order to determine the point at which the incipient crack became detectable. Higher excitation frequencies provide consistent results for paths along the rotor blade's carbon fiber spar cap, but performance falls off with increasing excitation frequency for paths off of the spar cap. Lower excitation frequencies provide more consistent performance across all sensor paths.
NASA Astrophysics Data System (ADS)
Dabiri, Mohammad Taghi; Sadough, Seyed Mohammad Sajad
2018-04-01
In free-space optical (FSO) links, atmospheric turbulence leads to scintillation in the received signal. Due to its ease of implementation, intensity modulation with direct detection (IM/DD) based on ON-OFF keying (OOK) is a popular signaling scheme in these systems. Over a turbulence channel, to detect OOK symbols in a blind way, i.e., without sending pilot symbols, an expectation-maximization (EM)-based detection method was recently proposed in the FSO communication literature. However, the performance of EM-based detection methods depends strongly on the length of the observation interval (Ls). To choose the optimum value of Ls at the target bit error rates (BERs) of FSO communications, which are commonly lower than 10⁻⁹, Monte-Carlo simulations would be very cumbersome and require a very long processing time. To facilitate performance evaluation, in this letter we derive analytic expressions for the BER and outage probability. Numerical results validate the accuracy of the derived analytic expressions. Our results may serve to evaluate the optimum value of Ls without resorting to time-consuming Monte-Carlo simulations.
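The blind-detection idea can be illustrated with a simplified sketch: the received samples in an observation window of length Ls are modeled as a two-component Gaussian mixture (OFF/ON symbols), the component parameters are estimated with EM, and the decision threshold is set between the estimated levels. This is an illustrative reading of an EM-based OOK detector, not the exact estimator or channel model analyzed in the letter.

```python
import numpy as np

def em_ook_threshold(r, n_iter=50):
    """Fit a 2-component Gaussian mixture to received samples r and return a decision threshold."""
    mu = np.array([r.min(), r.max()], dtype=float)   # rough initialization: OFF / ON levels
    var = np.full(2, r.var() / 4 + 1e-12)
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each symbol hypothesis for every received sample.
        lik = pi / np.sqrt(2 * np.pi * var) * np.exp(-(r[:, None] - mu) ** 2 / (2 * var))
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(r)
        mu = (resp * r[:, None]).sum(axis=0) / nk
        var = (resp * (r[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-12
    return 0.5 * (mu[0] + mu[1])                     # midpoint threshold between the two levels

def detect_ook(r, threshold):
    """Hard-decide OOK symbols against the EM-derived threshold."""
    return (r > threshold).astype(int)
```

The sensitivity of such a detector to the window length Ls is exactly what motivates the closed-form BER and outage expressions derived in the letter.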
Breath analysis using external cavity diode lasers: a review
NASA Astrophysics Data System (ADS)
Bayrakli, Ismail
2017-04-01
Most techniques that are used for diagnosis and therapy of diseases are invasive. Reliable noninvasive methods are always needed for the comfort of patients. Owing to its noninvasiveness, ease of use, and easy repeatability, exhaled breath analysis is a very good candidate for this purpose. Breath analysis can be performed using different techniques, such as gas chromatography mass spectrometry (MS), proton transfer reaction-MS, and selected ion flow tube-MS. However, these devices are bulky and require complicated procedures for sample collection and preconcentration. Therefore, these are not practical for routine applications in hospitals. Laser-based techniques with small size, robustness, low cost, low response time, accuracy, precision, high sensitivity, selectivity, low detection limit, real-time, and point-of-care detection have a great potential for routine use in hospitals. In this review paper, the recent advances in the fields of external cavity lasers and breath analysis for detection of diseases are presented.
Non-contact FBG sensing based steam turbine rotor dynamic balance vibration detection system
NASA Astrophysics Data System (ADS)
Li, Tianliang; Tan, Yuegang; Cai, Lin
2015-10-01
This paper proposes a non-contact vibration sensor based on fiber Bragg grating (FBG) sensing and applies it to detect the vibration of a steam turbine rotor dynamic balance experimental platform. The principle of the sensor is introduced, along with an experimental analysis; the performance of the non-contact FBG vibration sensor is analyzed in the experiment; in addition, a turbine rotor dynamic vibration detection system based on an eddy current displacement sensor and the non-contact FBG vibration sensor is built; finally, the results of the two signals are compared in the time domain and the frequency domain. The comparative analysis of the experimental data shows that the vibration signal analysis of the non-contact FBG vibration sensor is basically the same as the result of the eddy current displacement sensor, verifying that the sensor can be used for non-contact measurement of steam turbine rotor dynamic balance vibration.
Digital PCR Improves Mutation Analysis in Pancreas Fine Needle Aspiration Biopsy Specimens.
Sho, Shonan; Court, Colin M; Kim, Stephen; Braxton, David R; Hou, Shuang; Muthusamy, V Raman; Watson, Rabindra R; Sedarat, Alireza; Tseng, Hsian-Rong; Tomlinson, James S
2017-01-01
Applications of precision oncology strategies rely on accurate tumor genotyping from clinically available specimens. Fine needle aspirations (FNA) are frequently obtained in cancer management and often represent the only source of tumor tissues for patients with metastatic or locally advanced diseases. However, FNAs obtained from pancreas ductal adenocarcinoma (PDAC) are often limited in cellularity and/or tumor cell purity, precluding accurate tumor genotyping in many cases. Digital PCR (dPCR) is a technology with exceptional sensitivity and low DNA template requirement, characteristics that are necessary for analyzing PDAC FNA samples. In the current study, we sought to evaluate dPCR as a mutation analysis tool for pancreas FNA specimens. To this end, we analyzed alterations in the KRAS gene in pancreas FNAs using dPCR. The sensitivity of dPCR mutation analysis was first determined using serial dilution cell spiking studies. Single-cell laser-microdissection (LMD) was then utilized to identify the minimal number of tumor cells needed for mutation detection. Lastly, dPCR mutation analysis was performed on 44 pancreas FNAs (34 formalin-fixed paraffin-embedded (FFPE) and 10 fresh (non-fixed)), including samples highly limited in cellularity (100 cells) and tumor cell purity (1%). We found dPCR to detect mutations with allele frequencies as low as 0.17%. Additionally, a single tumor cell could be detected within an abundance of normal cells. Using clinical FNA samples, dPCR mutation analysis was successful in all preoperative FNA biopsies tested, and its accuracy was confirmed via comparison with resected tumor specimens. Moreover, dPCR revealed additional KRAS mutations representing minor subclones within a tumor that were not detected by the current clinical gold standard method of Sanger sequencing. In conclusion, dPCR performs sensitive and accurate mutation analysis in pancreas FNAs, detecting not only the dominant mutation subtype, but also the additional rare mutation subtypes representing tumor heterogeneity.
Stojanovic, Anja; Lämmerhofer, Michael; Kogelnig, Daniel; Schiesel, Simone; Sturm, Martin; Galanski, Markus; Krachler, Regina; Keppler, Bernhard K; Lindner, Wolfgang
2008-10-31
Several hydrophobic ionic liquids (ILs) based on long-chain aliphatic ammonium- and phosphonium cations and selected aromatic anions were analyzed by reversed-phase high-performance liquid chromatography (RP-HPLC) employing trifluoroacetic acid as ion-pairing additive to the acetonitrile-containing mobile phase and adopting a step-gradient elution mode. The coupling of charged aerosol detection (CAD) for the non-chromophoric aliphatic cations with diode array detection (DAD) for the aromatic anions allowed their simultaneous analysis in a set of new ILs derived from either tricaprylmethylammonium chloride (Aliquat 336) or trihexyltetradecylphosphonium chloride as precursors. Aliquat 336 is a mix of ammonium cations with distinct aliphatic chain lengths. In the course of the studies it turned out that CAD generates an identical detection response for all the distinct aliphatic cations. Due to the lack of single-component standards for the individual Aliquat 336 cation species, a unified calibration function was established for the quantitative analysis of the quaternary ammonium cations of the ILs. The developed method was validated according to ICH guidelines, which confirmed the validity of the unified calibration. The application of the method revealed molar ratios of cation to anion close to 1, indicating a quantitative exchange of the chloride ions of the precursors for the various aromatic anions in the course of the synthesis of new ILs. Anomalies of CAD observed for the detection of some aromatic anions (thiosalicylate and benzoate) are discussed.
Display conditions and lesion detectability: effect of background light
NASA Astrophysics Data System (ADS)
Razavi, Mahmood; Hall, Theodore R.; Aberle, Denise R.; Hayrapetian, Alek S.; Loloyan, Mansur; Eldredge, Sandra L.
1990-08-01
We assessed the effect of high background light on observer performance for the detection of a variety of chest radiographic abnormalities. Five observers reviewed 66 digital hard copy chest images formatted to 11 x 14 inch size under two display conditions: 1) on a specially prepared 11 x 14 inch illuminated panel with no peripheral light and 2) on a standard viewing panel designed for 14 x 17 inch radiographs. The images contained one or more of the following conditions: pneumothorax, interstitial disease, nodules, alveolar process, or no abnormality. The results of receiver operating characteristic analysis show that extraneous light does reduce observer performance and the detectability of nodules and interstitial disease.
Mitchell, J M; Yee, A J; McNab, W B; Griffiths, M W; McEwen, S A
1999-01-01
LacTek tests are competitive enzyme-linked immunosorbent assays intended for rapid detection of antimicrobial residues in bovine milk. In this study, the LacTek test protocol was modified for use with extracts of bovine tissue to detect beta-lactam, tetracycline, and sulfamethazine residues. Test performance characteristics--precision, accuracy, ruggedness, practicability, and analytical specificity and sensitivity--were investigated. Results suggest that LacTek tests can be easily adapted to detect antimicrobial residues in extracts of lean ground beef. However, positive samples may not contain residues at violative concentrations (i.e., Canadian maximum residue limits), and therefore, additional analysis would be required for final confirmation and quantitation (e.g., chromatography).
Amplitude image processing by diffractive optics.
Cagigal, Manuel P; Valle, Pedro J; Canales, V F
2016-02-22
In contrast to standard digital image processing, which operates on the detected image intensity, we propose to perform amplitude image processing. Amplitude processing, like low-pass or high-pass filtering, is carried out using diffractive optical elements (DOE), since these allow operating on the complex field amplitude before it has been detected. We show the procedure for designing the DOE that corresponds to each operation. Furthermore, we present an analysis of amplitude image processing performance. In particular, a DOE Laplacian filter is applied to simulated astronomical images for detecting two stars one Airy ring apart. We also check by numerical simulations that the use of a Laplacian amplitude filter produces less noisy images than standard digital image processing.
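The following Python sketch illustrates the distinction the abstract draws between filtering the complex field amplitude before detection and filtering the detected intensity, using a Fourier-domain Laplacian on a synthetic two-source field; the grid size, source shapes, and variable names are assumptions made for this example only.

```python
import numpy as np

n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]

# Synthetic complex field: two closely spaced point-like sources (Gaussians as stand-ins).
def source(cx):
    return np.exp(-((x - cx) ** 2 + y ** 2) / (2 * 2.0 ** 2))

field = (source(-3) + source(3)).astype(complex)

# Laplacian filter in the Fourier domain: multiplication by -(u^2 + v^2).
u = np.fft.fftfreq(n)
U, V = np.meshgrid(u, u)
laplacian_tf = -(2 * np.pi) ** 2 * (U ** 2 + V ** 2)

def laplacian(img):
    return np.fft.ifft2(np.fft.fft2(img) * laplacian_tf)

# Amplitude processing: filter the complex field, then detect (|.|^2).
amplitude_filtered = np.abs(laplacian(field)) ** 2
# Standard digital processing: detect first, then filter the intensity.
intensity_filtered = np.real(laplacian(np.abs(field) ** 2))

print("peak of amplitude-processed image :", amplitude_filtered.max())
print("peak of intensity-processed image :", intensity_filtered.max())
```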
Mental workload while driving: effects on visual search, discrimination, and decision making.
Recarte, Miguel A; Nunes, Luis M
2003-06-01
The effects of mental workload on visual search and decision making were studied in real traffic conditions with 12 participants who drove an instrumented car. Mental workload was manipulated by having participants perform several mental tasks while driving. A simultaneous visual-detection and discrimination test was used as the performance criterion. Mental tasks produced spatial gaze concentration and visual-detection impairment, although no tunnel vision occurred. According to the ocular behavior analysis, this impairment was due to late detection and poor identification more than to response selection. Verbal acquisition tasks were innocuous compared with production tasks, and complex conversations, whether by phone or with a passenger, are dangerous for road safety.
Alì, Greta; Proietti, Agnese; Pelliccioni, Serena; Niccoli, Cristina; Lupi, Cristiana; Sensi, Elisa; Giannini, Riccardo; Borrelli, Nicla; Menghi, Maura; Chella, Antonio; Ribechini, Alessandro; Cappuzzo, Federico; Melfi, Franca; Lucchi, Marco; Mussi, Alfredo; Fontanini, Gabriella
2014-11-01
Echinoderm microtubule associated proteinlike 4-anaplastic lymphoma receptor tyrosine kinase (EML4-ALK) translocation has been described in a subset of patients with non-small cell lung cancer (NSCLC) and has been shown to have oncogenic activity. Fluorescence in situ hybridization (FISH) is used to detect ALK-positive NSCLC, but it is expensive, time-consuming, and difficult for routine application. To evaluate the potential role of immunohistochemistry (IHC) as a screening tool to identify candidate cases for FISH analysis and for ALK inhibitor therapy in NSCLC. We performed FISH and IHC for ALK and mutational analysis for epidermal growth factor receptor (EGFR) and KRAS in 523 NSCLC specimens. We conducted IHC analysis with the monoclonal antibody D5F3 (Ventana Medical Systems, Tucson, Arizona) and a highly sensitive detection system. We also performed a MassARRAY-based analysis (Sequenom, San Diego, California) in a small subset of 11 samples to detect EML4-ALK rearrangement. Of the 523 NSCLC specimens, 20 (3.8%) were positive for ALK rearrangement by FISH analysis. EGFR and KRAS mutations were identified in 70 (13.4%) and 124 (23.7%) of the 523 tumor samples, respectively. ALK rearrangement and EGFR and KRAS mutations were mutually exclusive. Of 523 tumor samples analyzed, 18 (3.4%) were ALK(+) by IHC, 18 samples (3.4%) had concordant IHC and FISH results, and 2 ALK(+) cases (0.3%) by FISH failed to show ALK protein expression. In the 2 discrepant cases, we did not detect any mass peaks for the EML4-ALK variants by MassARRAY. Our results show that IHC may be a useful technique for selecting NSCLC cases to undergo ALK FISH analysis.
Wang, Huayin
2014-09-01
A new quantitative technique for the simultaneous quantification of individual anthocyanins based on the pH differential method and high-performance liquid chromatography with diode array detection is proposed in this paper. The six individual anthocyanins (cyanidin 3-glucoside, cyanidin 3-rutinoside, petunidin 3-glucoside, petunidin 3-rutinoside, and malvidin 3-rutinoside) from mulberry (Morus rubra) and Liriope platyphylla were used for demonstration and validation. The elution of anthocyanins was performed using a C18 column with stepwise gradient elution, and individual anthocyanins were identified by high-performance liquid chromatography with tandem mass spectrometry. Based on the pH differential method, the high-performance liquid chromatography peak areas at the maximum and reference absorption wavelengths of the anthocyanin extracts were used to quantify individual anthocyanins. The calibration curves for these anthocyanins were linear within the range of 10-5500 mg/L. The correlation coefficients (r(2)) all exceeded 0.9972, and the limits of detection were in the range of 1-4 mg/L at a signal-to-noise ratio ≥5 for these anthocyanins. The proposed quantitative analysis was reproducible, with good accuracy for all individual anthocyanins ranging from 96.3 to 104.2%, and relative recoveries were in the range 98.4-103.2%. The proposed technique is performed without anthocyanin standards and is a simple, rapid, accurate, and economical method to determine individual anthocyanin contents. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Van Hertem, T; Bahr, C; Schlageter Tello, A; Viazzi, S; Steensels, M; Romanini, C E B; Lokhorst, C; Maltz, E; Halachmi, I; Berckmans, D
2016-09-01
The objective of this study was to evaluate whether a multi-sensor system (milk, activity, body posture) was a better classifier for lameness than the single-sensor-based detection models. Between September 2013 and August 2014, 3629 cow observations were collected on a commercial dairy farm in Belgium. Human locomotion scoring was used as the reference for model development and evaluation. Cow behaviour and performance were measured with existing sensors that were already present at the farm. A prototype three-dimensional video recording system was used to automatically quantify the back posture of a cow. For the single-predictor comparisons, a receiver operating characteristic curve was made. For the multivariate detection models, logistic regression and generalized linear mixed models (GLMM) were developed. The best lameness classification model was obtained by the multi-sensor analysis (area under the receiver operating characteristic curve (AUC)=0.757±0.029), containing a combination of milk and milking variables, activity and gait and posture variables from videos. Second, the multivariate video-based system (AUC=0.732±0.011) performed better than the multivariate milk sensors (AUC=0.604±0.026) and the multivariate behaviour sensors (AUC=0.633±0.018). The video-based system performed better than the combined behaviour and performance-based detection model (AUC=0.669±0.028), indicating that it is worthwhile to consider a video-based lameness detection system, regardless of the presence of other existing sensors on the farm. The results suggest that Θ2, the feature variable for the back curvature around the hip joints, with an AUC of 0.719, is the best single predictor variable for lameness detection based on locomotion scoring. In general, this study showed that the video-based back posture monitoring system outperforms the behaviour and performance sensing techniques for locomotion scoring-based lameness detection. A GLMM with seven specific variables (walking speed, back posture measurement, daytime activity, milk yield, lactation stage, milk peak flow rate and milk peak conductivity) is the best combination of variables for lameness classification. The accuracy of four-level lameness classification was 60.3%. The accuracy improved to 79.8% for binary lameness classification. The binary GLMM obtained a sensitivity of 68.5% and a specificity of 87.6%, which both exceed the sensitivity (52.1%±4.7%) and specificity (83.2%±2.3%) of the multi-sensor logistic regression model. This shows that the repeated-measures analysis in the GLMM, taking into account the individual history of the animal, outperforms the classification when thresholds based on herd level (a statistical population) are used.
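A minimal sketch of the kind of single-predictor versus multivariate comparison described above, assuming synthetic stand-in sensor features; the feature values and effect sizes are invented, and the plain logistic regression shown here is not the authors' GLMM.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for a few of the sensor variables named in the study
# (walking speed, back posture, daytime activity, milk yield); values are invented.
n = 600
lame = rng.integers(0, 2, n)
X = np.column_stack([
    1.4 - 0.2 * lame + rng.normal(0, 0.15, n),    # walking speed (m/s)
    0.10 + 0.08 * lame + rng.normal(0, 0.05, n),  # back curvature feature (a.u.)
    180 - 25 * lame + rng.normal(0, 20, n),       # daytime activity (steps/h)
    28 - 2 * lame + rng.normal(0, 3, n),          # milk yield (kg/d)
])

# Single-predictor AUC (back posture feature) versus a multivariate model.
auc_single = roc_auc_score(lame, X[:, 1])
model = LogisticRegression(max_iter=1000).fit(X, lame)
auc_multi = roc_auc_score(lame, model.predict_proba(X)[:, 1])
print(f"single-predictor AUC: {auc_single:.3f}, multivariate AUC: {auc_multi:.3f}")
```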
Ryan, Denise S; Sia, Rose K; Stutzman, Richard D; Pasternak, Joseph F; Howard, Robin S; Howell, Christopher L; Maurer, Tana; Torres, Mark F; Bower, Kraig S
2017-01-01
To compare visual performance, marksmanship performance, and threshold target identification following wavefront-guided (WFG) versus wavefront-optimized (WFO) photorefractive keratectomy (PRK). In this prospective, randomized clinical trial, active duty U.S. military Soldiers, age 21 or over, electing to undergo PRK were randomized to undergo WFG (n = 27) or WFO (n = 27) PRK for myopia or myopic astigmatism. Binocular visual performance was assessed preoperatively and 1, 3, and 6 months postoperatively: Super Vision Test high contrast, Super Vision Test contrast sensitivity (CS), and 25% contrast acuity with night vision goggle filter. CS function was generated testing at five spatial frequencies. Marksmanship performance in low light conditions was evaluated in a firing tunnel. Target detection and identification performance was tested for probability of identification of varying target sets and probability of detection of humans in cluttered environments. Visual performance, CS function, marksmanship, and threshold target identification demonstrated no statistically significant differences over time between the two treatments. Exploratory regression analysis of firing range tasks at 6 months showed no significant differences or correlations between procedures. Regression analysis of vehicle and handheld probability of identification showed a significant association with pretreatment performance. Both WFG and WFO PRK results translate to excellent and comparable visual and military performance. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.
Trained neurons-based motion detection in optical camera communications
NASA Astrophysics Data System (ADS)
Teli, Shivani; Cahyadi, Willy Anugrah; Chung, Yeon Ho
2018-04-01
A concept of trained neurons-based motion detection (TNMD) in optical camera communications (OCC) is proposed. The proposed TNMD is based on neurons present in a neural network that perform repetitive analysis in order to provide efficient and reliable motion detection in OCC. This efficient motion detection can be considered another functionality of OCC in addition to two traditional functionalities of illumination and communication. To verify the proposed TNMD, the experiments were conducted in an indoor static downlink OCC, where a mobile phone front camera is employed as the receiver and an 8 × 8 red, green, and blue (RGB) light-emitting diode array as the transmitter. The motion is detected by observing the user's finger movement in the form of centroid through the OCC link via a camera. Unlike conventional trained neurons approaches, the proposed TNMD is trained not with motion itself but with centroid data samples, thus providing more accurate detection and far less complex detection algorithm. The experiment results demonstrate that the TNMD can detect all considered motions accurately with acceptable bit error rate (BER) performances at a transmission distance of up to 175 cm. In addition, while the TNMD is performed, a maximum data rate of 3.759 kbps over the OCC link is obtained. The OCC with the proposed TNMD combined can be considered an efficient indoor OCC system that provides illumination, communication, and motion detection in a convenient smart home environment.
USDA-ARS?s Scientific Manuscript database
Ultra-High performance liquid chromatography (UHPLC) with single wavelength (215 nm) detection was used to obtain chromatographic profiles of authentic skim milk powder (SMP) and synthetic mixtures of SMP with variable amounts of soy (SPI), pea (PPI), brown rice (BRP), and hydrolyzed wheat protein (...
ECG R-R peak detection on mobile phones.
Sufi, F; Fang, Q; Cosic, I
2007-01-01
Mobile phones have become an integral part of modern life. Due to ever increasing processing power, mobile phones are rapidly expanding their role from a sole telecommunication device to organizer, calculator, gaming device, web browser, music player, audio/video recording device, navigator, etc. The processing power of modern mobile phones has been utilized for many innovative purposes. In this paper, we propose the utilization of mobile phones for the monitoring and analysis of biosignals. The computation performed inside the mobile phone's processor can thus be exploited for healthcare delivery. We performed a literature review on R-R interval detection from ECG and selected a few PC-based algorithms. Three of those existing R-R interval detection algorithms were then programmed on the Java platform. Performance monitoring and comparison studies were carried out on three different mobile devices to determine their applicability in a real-time telemonitoring scenario.
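A minimal Python sketch (rather than the Java implementations used in the paper) of a simple threshold-plus-refractory-period R-peak detector on a synthetic trace; the threshold factor, refractory period, and synthetic ECG are assumptions for illustration and do not reproduce any of the specific PC-based algorithms the authors ported.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_factor=0.6, refractory_s=0.25):
    """Very simple R-peak detector: amplitude threshold plus a refractory
    period to suppress double detections of the same beat."""
    threshold = threshold_factor * np.max(ecg)
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)

# Synthetic "ECG": a 1 Hz train of sharp peaks plus noise (illustrative only).
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.exp(-((t % 1.0 - 0.5) ** 2) / (2 * 0.01 ** 2)) + 0.05 * np.random.randn(t.size)

peaks = detect_r_peaks(ecg, fs)
rr_intervals = np.diff(peaks) / fs
print("detected beats:", len(peaks), "mean R-R interval:", rr_intervals.mean(), "s")
```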
GICHD mine dog testing project : soil sample results #5.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, James L.; Phelan, James M.; Archuleta, Luisa M.
2004-01-01
A mine dog evaluation project initiated by the Geneva International Center for Humanitarian Demining is evaluating the capability and reliability of mine detection dogs. The performance of field-operational mine detection dogs will be measured in test minefields in Afghanistan containing actual, but unfused landmines. Repeated performance testing over two years through various seasonal weather conditions will provide data simulating near real world conditions. Soil samples will be obtained adjacent to the buried targets repeatedly over the course of the test. Chemical analysis results from these soil samples will be used to evaluate correlations between mine dog detection performance and seasonal weather conditions. This report documents the analytical chemical methods and results from the fifth batch of soils received. This batch contained samples from Kharga, Afghanistan collected in June 2003.
Polarization differences in airborne ground penetrating radar performance for landmine detection
NASA Astrophysics Data System (ADS)
Dogaru, Traian; Le, Calvin
2016-05-01
The U.S. Army Research Laboratory (ARL) has investigated the ultra-wideband (UWB) radar technology for detection of landmines, improvised explosive devices and unexploded ordnance, for over two decades. This paper presents a phenomenological study of the radar signature of buried landmines in realistic environments and the performance of airborne synthetic aperture radar (SAR) in detecting these targets as a function of multiple parameters: polarization, depression angle, soil type and burial depth. The investigation is based on advanced computer models developed at ARL. The analysis includes both the signature of the targets of interest and the clutter produced by rough surface ground. Based on our numerical simulations, we conclude that low depression angles and H-H polarization offer the highest target-to-clutter ratio in the SAR images and therefore the best radar performance of all the scenarios investigated.
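As a hedged illustration of one quantity discussed above, the following snippet computes a target-to-clutter ratio in dB from hypothetical SAR image chips; the chip sizes and amplitudes are invented and the calculation is not taken from the ARL models.

```python
import numpy as np

def target_to_clutter_db(target_chip, clutter_chip):
    """Target-to-clutter ratio: peak target power over mean clutter power, in dB."""
    t_power = np.max(np.abs(target_chip) ** 2)
    c_power = np.mean(np.abs(clutter_chip) ** 2)
    return 10.0 * np.log10(t_power / c_power)

rng = np.random.default_rng(1)
# Hypothetical complex SAR image chips; amplitudes are invented.
clutter = rng.normal(0, 1.0, (64, 64)) + 1j * rng.normal(0, 1.0, (64, 64))
target = clutter.copy()
target[32, 32] += 30.0  # a bright point response at the assumed landmine location

print(f"target-to-clutter ratio: {target_to_clutter_db(target, clutter):.1f} dB")
```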
Gray, Bobby P; Viljanto, Marjaana; Bright, Jane; Pearce, Clive; Maynard, Steve
2013-07-17
The detection of the abuse of anabolic steroids in equine sport is complicated by the endogenous nature of some of the abused steroids, such as testosterone and nandrolone. These steroids are commonly administered as intramuscular injections of esterified forms of the steroid, which prolongs their effects and improves bioavailability over oral dosing. The successful detection of an intact anabolic steroid ester therefore provides unequivocal proof of an illegal administration, as esterified forms are not found endogenously. Detection of intact anabolic steroid esters is possible in plasma samples but not, to date, in the traditional doping control matrix of urine. The analysis of equine mane hair for the detection of anabolic steroid esters has the potential to greatly extend the time period over which detection of abuse can be monitored. Equine mane hair samples were incubated in 0.1M phosphate buffer (pH 9.5) before anabolic steroids (testosterone, nandrolone, boldenone, trenbolone and stanozolol), anabolic steroid esters (esters of testosterone, nandrolone, boldenone and trenbolone) and associated compounds (fluticasone propionate and esters of hydroxyprogesterone) were extracted by liquid-liquid extraction with a mix of hexane and ethyl acetate (7:3, v:v). Further sample clean up by solid phase extraction was followed by derivatisation with methoxylamine HCL and analysis by UHPLC-MS/MS. Initial method development was performed on a representative suite of four testosterone esters (propionate, phenylpropionate, isocaproate and decanoate) and the method was later extended to include a further 18 compounds. The applicability of the method was demonstrated by the analysis of mane hair samples collected following the intramuscular administration of 500 mg of Durateston(®) (mixed testosterone esters) to a Thoroughbred mare (560 kg). The method was subsequently used to successfully detect boldenone undecylenate and stanozolol in hair samples collected following suspicious screening findings from post-race urine samples. The use of segmental analysis to potentially provide additional information on the timing of administration was also investigated. Copyright © 2013 Elsevier B.V. All rights reserved.
Changes in plasma protein levels as an early indication of a bloodstream infection
Joenväärä, Sakari; Kaartinen, Johanna; Järvinen, Asko; Renkonen, Risto
2017-01-01
Blood culture is the primary diagnostic test performed when a bloodstream infection is suspected, to detect the presence of microorganisms and direct the treatment. However, blood culture is a slow and time-consuming method for detecting bloodstream infections or separating septic and/or bacteremic patients from others with less serious febrile disease. Plasma proteomics, despite its challenges, remains an important source of early biomarkers for systemic diseases and might show changes before direct evidence from bacteria can be obtained. We have performed a plasma proteomic analysis, simultaneously at the time of blood culture sampling, from ten blood culture positive and ten blood culture negative patients, and quantified 172 proteins with two or more unique peptides. Principal components analysis, Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) and ROC curve analysis were performed to select protein features that can classify the two groups of samples. We propose a number of candidates which qualify as potential biomarkers to separate the blood culture positive cases from negative ones. Pathway analysis by two methods revealed complement activation, the phagocytosis pathway and alterations in lipid metabolism as enriched pathways which are relevant for the condition. Data are available via ProteomeXchange with identifier PXD005022. PMID:28235076
A deep learning approach for fetal QRS complex detection.
Zhong, Wei; Liao, Lijuan; Guo, Xuemei; Wang, Guoli
2018-04-20
Non-invasive foetal electrocardiography (NI-FECG) has the potential to provide additional clinical information for detecting and diagnosing fetal diseases. We propose and demonstrate a deep learning approach for fetal QRS complex detection from raw NI-FECG signals by using a convolutional neural network (CNN) model. The main objective is to investigate whether reliable fetal QRS complex detection performance can still be obtained from features of single-channel NI-FECG signals, without canceling maternal ECG (MECG) signals. A deep learning method is proposed for recognizing fetal QRS complexes. Firstly, we collect data from set-a of the PhysioNet/Computing in Cardiology Challenge database. The sample entropy method is used for signal quality assessment, and part of the bad-quality signals is excluded from further analysis. Secondly, in the proposed method, the features of raw NI-FECG signals are normalized before they are fed to a CNN classifier to perform fetal QRS complex detection. We use precision, recall, F-measure and accuracy as the evaluation metrics to assess the performance of fetal QRS complex detection. The proposed deep learning method can achieve relatively high precision (75.33%), recall (80.54%), and F-measure scores (77.85%) compared with three other well-known pattern classification methods, namely KNN, naive Bayes and SVM. The proposed deep learning method can attain reliable fetal QRS complex detection performance from the raw NI-FECG signals without canceling MECG signals. In addition, the influence of different activation functions and of signal quality assessment on classification performance is evaluated, and the results show that ReLU outperforms the Sigmoid and Tanh on this particular task, and that better classification performance is obtained with the signal quality assessment step in this study.
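A minimal PyTorch sketch of a 1-D CNN window classifier of the general kind described above; the window length, layer sizes, and random input are illustrative assumptions, not the architecture or data of the paper.

```python
import torch
import torch.nn as nn

class QRSNet(nn.Module):
    """Tiny 1-D CNN that labels a fixed-length NI-FECG window as
    fetal-QRS-present (1) or absent (0)."""
    def __init__(self, window=120):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(16 * (window // 4), 2)

    def forward(self, x):            # x: (batch, 1, window)
        h = self.features(x)
        return self.classifier(h.flatten(1))

# Forward pass on random, normalized single-channel windows (illustrative only).
model = QRSNet()
windows = torch.randn(4, 1, 120)
logits = model(windows)
print(logits.shape)  # torch.Size([4, 2])
```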
Analysis of variation in oil pressure in lubricating system
NASA Astrophysics Data System (ADS)
Sharma, Sumit; Upreti, Mritunjay; Sharma, Bharat; Poddar, Keshav
2018-05-01
Automotive maintenance of an engine contributes to its reliability, energy efficiency and repair cost reduction. Modeling of engine performance and fault detection requires a large amount of data, which is usually obtained on test benches. This report presents a methodical study on the analysis of variation in the lubrication system of various medium-speed engines. Further, this study is limited to the influence of engine oil pressure on frictional losses, torque analysis for various oil pressures, and an analytical analysis of the engine lubrication system. The data collected from various engines under diagnostics are represented graphically. Finally, the illustrated results were used as a viable source for the detection and troubleshooting of faults in the lubrication system of a regular passenger vehicle.
Wang, Z; Hennion, B; Urruty, L; Montury, M
2000-11-01
Solid-phase microextraction coupled with high-performance liquid chromatography has been studied for the analysis of methiocarb, napropamide, fenoxycarb and bupirimate in strawberries. The strawberries were blended and centrifuged. An aliquot of the resulting extract solution was then subjected to solid-phase microextraction (SPME) on a 60 microns polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre for 45 min at room temperature. The pesticides extracted on the SPME fibre were desorbed in the SPME/high-performance liquid chromatography (HPLC) interface for HPLC analysis with diode-array detection (DAD). The method is organic solvent-free for the whole extraction process and is simple and easy to perform. The detection limits were shown to be at the low microgram kg-1 level, and the linear response covered the range from 0.05 to 2 mg kg-1 of pesticides in strawberries with a regression coefficient larger than 0.99. Good repeatability, with RSDs between 2.92 and 9.25% depending on the compound, was obtained.
Guan, Y-G; Yu, P; Yu, S-J; Xu, X-B; Wu, X-L
2012-11-01
A simultaneous analysis of reducing sugars and 5-hydroxymethyl-2-furaldehyde in Maillard reaction products is detailed. It was based on a high-performance anion exchange chromatography system with an electrochemical detector and an HPLC system with a refractive index detector. Results showed that high-performance anion exchange chromatography with an electrochemical detector using a CarboPac PA-1 column (Dionex Corp., Sunnyvale, CA) was more suitable for the determination of reducing sugars and 5-hydroxymethyl-2-furaldehyde, especially for trace analysis. The lowest detection limit for reducing sugars and 5-hydroxymethyl-2-furaldehyde was 0.00005 mol/L in this experiment. However, HPLC with a refractive index detector always produces a tailing peak for 5-hydroxymethyl-2-furaldehyde, and mannose and fructose cannot be completely separated. The results of the present study could provide a more sensitive means for 5-hydroxymethyl-2-furaldehyde and reducing sugar detection. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Ziegler, Ronny; Brendel, Bernhard; Rinneberg, Herbert; Nielsen, Tim
2009-01-21
Using a statistical (chi-square) test on simulated data and a realistic noise model derived from the system's hardware we study the performance of diffuse optical tomography systems for fluorescence imaging. We compare the predicted smallest size of detectable lesions at various positions in slab and cup geometry and model how detection sensitivity depends on breast compression and lesion fluorescence contrast. Our investigation shows that lesion detection is limited by relative noise in slab geometry and by absolute noise in cup geometry.
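A hedged sketch of a chi-square detectability test of the type described, assuming a simple shot-noise-like Gaussian noise model and invented signal levels rather than the authors' hardware-derived noise model.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

n_detectors = 200
background = np.full(n_detectors, 100.0)        # expected counts without a lesion
lesion_signal = np.zeros(n_detectors)
lesion_signal[90:110] = 3.0                     # small fluorescence excess (arbitrary units)
sigma = np.sqrt(background)                     # simple shot-noise-like model

# Simulated measurement with the lesion present.
measured = background + lesion_signal + rng.normal(0, sigma)

# Chi-square statistic of the measurement against the lesion-free model.
stat = np.sum(((measured - background) / sigma) ** 2)
threshold = chi2.ppf(0.99, df=n_detectors)      # 1% false-alarm level
print(f"chi2 = {stat:.1f}, threshold = {threshold:.1f}, detected = {stat > threshold}")
```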
Portable point-of-care blood analysis system for global health (Conference Presentation)
NASA Astrophysics Data System (ADS)
Dou, James J.; Aitchison, James Stewart; Chen, Lu; Nayyar, Rakesh
2016-03-01
In this paper we present a portable blood analysis system based on a disposable cartridge and a hand-held reader. The platform can perform all the sample preparation, detection and waste collection required to complete a clinical test. In order to demonstrate the utility of this approach, a CD4 T cell enumeration was carried out. A handheld, point-of-care CD4 T cell system was developed based on this platform. In particular, we describe a pneumatic, active pumping method to control the on-chip fluidic actuation. Reagents for the CD4 T cell counting assay were dried on a reagent plug to eliminate the need for cold chain storage when used in the field. A micromixer based on the active fluidic actuation was designed to complete sample staining with the fluorescent dyes that were dried on the reagent plugs. A novel image detection and analysis algorithm was developed to detect and track the flight of target particles and cells during each analysis. The handheld, point-of-care CD4 testing system was benchmarked against a clinical flow cytometer, and the experimental results closely matched those of flow cytometry. The same platform can be further expanded into a bead-array detection system in which other types of biomolecules, such as proteins, can be detected using the same detection system.
Functional Analysis of Metabolomics Data.
Chagoyen, Mónica; López-Ibáñez, Javier; Pazos, Florencio
2016-01-01
Metabolomics aims at characterizing the repertory of small chemical compounds in a biological sample. As it becomes more massive and larger sets of compounds are detected, a functional analysis is required to convert these raw lists of compounds into biological knowledge. The most common way of performing such analysis is "annotation enrichment analysis," also used in transcriptomics and proteomics. This approach extracts the annotations overrepresented in the set of chemical compounds arisen in a given experiment. Here, we describe the protocols for performing such analysis as well as for visualizing a set of compounds in different representations of the metabolic networks, in both cases using free accessible web tools.
Dynamic biochemical tissue analysis detects functional L-selectin ligands on colon cancer tissues
Carlson, Grady E.; Martin, Eric W.; Shirure, Venktesh S.; Malgor, Ramiro; Resto, Vicente A.; Goetz, Douglas J.; Burdick, Monica M.
2017-01-01
A growing body of evidence suggests that L-selectin ligands presented on circulating tumor cells facilitate metastasis by binding L-selectin presented on leukocytes. Commonly used methods for detecting L-selectin ligands on tissues, e.g., immunostaining, are performed under static, no-flow conditions. However, such analysis does not assay for functional L-selectin ligands, specifically those ligands that promote adhesion under shear flow conditions. Recently our lab developed a method, termed dynamic biochemical tissue analysis (DBTA), to detect functional selectin ligands in situ by probing tissues with L-selectin-coated microspheres under hemodynamic flow conditions. In this investigation, DBTA was used to probe human colon tissues for L-selectin ligand activity. The detection of L-selectin ligands using DBTA was highly specific. Furthermore, DBTA reproducibly detected functional L-selectin ligands on diseased, e.g., cancerous or inflamed, tissues but not on noncancerous tissues. In addition, DBTA revealed a heterogeneous distribution of functional L-selectin ligands on colon cancer tissues. Most notably, detection of L-selectin ligands by immunostaining using HECA-452 antibody only partially correlated with functional L-selectin ligands detected by DBTA. In summation, the results of this study demonstrate that DBTA detects functional selectin ligands to provide a unique characterization of pathological tissue. PMID:28282455
Linke, Richard; Ulrich, Frank; Bechstein, Wolf O; Schnitzbauer, Andreas A
2015-01-01
Bile leakage testing may help to detect and reduce the incidence of biliary leakage after hepatic resection. This review was performed to investigate the value of the White-test in identifying intraoperative biliary leakage and avoiding postoperative leakage. A systematic review and meta-analysis was performed, with two researchers performing the literature search. The primary outcome measure was the incidence of post-hepatectomy biliary leakage; the secondary outcome measure was the ability to detect intraoperative biliary leakage with the help of the White-test. A total of 4 publications (including original data from our center) were included in the analysis. The included studies had a medium evidence level of 2b (individual cohort studies including low-quality randomized controlled trials). Use of the White-test led to a significant reduction of postoperative biliary leakage [OR: 0.3 (95% CI: 0.14, 0.63), p = 0.002] and to a significantly higher intraoperative detection of biliary leakages [OR: 0.03 (95% CI: 0.02, 0.07), p < 0.00001]. Existing evidence supports the use of the White-test after hepatic resection to identify bile leaks intraoperatively and thus reduce the incidence of postoperative biliary leakage. Nonetheless, a high-quality randomized controlled trial with an adequately powered sample size is required to confirm the findings from the above-described studies and further increase the evidence in this field.
NASA Astrophysics Data System (ADS)
Aab, A.; Abreu, P.; Aglietta, M.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Anastasi, G. A.; Anchordoqui, L.; Andrada, B.; Andringa, S.; Aramo, C.; Arqueros, F.; Arsene, N.; Asorey, H.; Assis, P.; Aublin, J.; Avila, G.; Badescu, A. M.; Balaceanu, A.; Barreira Luz, R. J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Biteau, J.; Blaess, S. G.; Blanco, A.; Blazek, J.; Bleve, C.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Borodai, N.; Botti, A. M.; Brack, J.; Brancus, I.; Bretz, T.; Bridgeman, A.; Briechle, F. L.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, L.; Cancio, A.; Canfora, F.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Chavez, A. G.; Chinellato, J. A.; Chudoba, J.; Clay, R. W.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Coutu, S.; Covault, C. E.; Cronin, J.; D'Amico, S.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; de Jong, S. J.; De Mauro, G.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; Debatin, J.; Deligny, O.; Di Giulio, C.; Di Matteo, A.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; D'Olivo, J. C.; dos Anjos, R. C.; Dova, M. T.; Dundovic, A.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Falcke, H.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Fick, B.; Figueira, J. M.; Filipčič, A.; Fratu, O.; Freire, M. M.; Fujii, T.; Fuster, A.; Gaior, R.; García, B.; Garcia-Pinto, D.; Gaté, F.; Gemmeke, H.; Gherghel-Lascu, A.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Głas, D.; Glaser, C.; Golup, G.; Gómez Berisso, M.; Gómez Vitale, P. F.; González, N.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Hasankiadeh, Q.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huege, T.; Hulsman, J.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Johnsen, J. A.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Katkov, I.; Keilhauer, B.; Kemp, E.; Kemp, J.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Kuempel, D.; Kukec Mezek, G.; Kunka, N.; Kuotb Awad, A.; LaHurd, D.; Lauscher, M.; Legumina, R.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; Lopes, L.; López, R.; López Casado, A.; Luce, Q.; Lucero, A.; Malacari, M.; Mallamaci, M.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Mariş, I. C.; Marsella, G.; Martello, D.; Martinez, H.; Martínez Bravo, O.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Melo, D.; Menshikov, A.; Messina, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Mockler, D.; Mollerach, S.; Montanet, F.; Morello, C.; Mostafá, M.; Müller, A. L.; Müller, G.; Muller, M. A.; Müller, S.; Mussa, R.; Naranjo, I.; Nellen, L.; Nguyen, P. H.; Niculescu-Oglinzanu, M.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, H.; Núñez, L. 
A.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pedreira, F.; Pȩkala, J.; Pelayo, R.; Peña-Rodriguez, J.; Pereira, L. A. S.; Perlín, M.; Perrone, L.; Peters, C.; Petrera, S.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Ramos-Pollan, R.; Rautenberg, J.; Ravignani, D.; Revenu, B.; Ridky, J.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rogozin, D.; Roncoroni, M. J.; Roth, M.; Roulet, E.; Rovero, A. C.; Ruehl, P.; Saffi, S. J.; Saftoiu, A.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santos, E. M.; Santos, E.; Sarazin, F.; Sarmento, R.; Sarmiento, C. A.; Sato, R.; Schauer, M.; Scherini, V.; Schieler, H.; Schimp, M.; Schmidt, D.; Scholten, O.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sigl, G.; Silli, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sonntag, S.; Sorokin, J.; Squartini, R.; Stanca, D.; Stanič, S.; Stasielak, J.; Stassi, P.; Strafella, F.; Suarez, F.; Suarez Durán, M.; Sudholz, T.; Suomijärvi, T.; Supanitsky, A. D.; Swain, J.; Szadkowski, Z.; Taboada, A.; Taborda, O. A.; Tapia, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Tomankova, L.; Tomé, B.; Torralba Elipe, G.; Torri, M.; Travnicek, P.; Trini, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Vergara Quispe, I. D.; Verzi, V.; Vicha, J.; Villaseñor, L.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weindl, A.; Wiencke, L.; Wilczyński, H.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Yang, L.; Yelos, D.; Yushkov, A.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zimmermann, B.; Ziolkowski, M.; Zong, Z.; Zuccarello, F.
2017-06-01
We report a multi-resolution search for anisotropies in the arrival directions of cosmic rays detected at the Pierre Auger Observatory with local zenith angles up to 80° and energies in excess of 4 EeV (4 × 10^18 eV). This search is conducted by measuring the angular power spectrum and performing a needlet wavelet analysis in two independent energy ranges. Both analyses are complementary, since the angular power spectrum achieves a better performance in identifying large-scale patterns while the needlet wavelet analysis, considering the parameters used in this work, presents a higher efficiency in detecting smaller-scale anisotropies, potentially providing directional information on any observed anisotropies. No deviation from isotropy is observed on any angular scale in the energy range between 4 and 8 EeV. Above 8 EeV, an indication of a dipole moment is captured, while no other deviation from isotropy is observed for moments beyond the dipole one. The corresponding p-values, obtained after accounting for searches blindly performed at several angular scales, are 1.3 × 10^-5 in the case of the angular power spectrum, and 2.5 × 10^-3 in the case of the needlet analysis. While these results are consistent with previous reports making use of the same data set, they provide extensions of the previous works through thorough scans of the angular scales.
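As a simplified, hedged illustration of a first-harmonic (Rayleigh-type) analysis in right ascension, related to but far cruder than the angular power spectrum and needlet analyses of the paper, the following snippet estimates a dipole-like amplitude from toy arrival directions; the modulation amplitude and event counts are invented.

```python
import numpy as np

def first_harmonic_amplitude(right_ascension_rad):
    """Rayleigh analysis: amplitude and phase of the first harmonic in
    right ascension, a proxy for the equatorial dipole component."""
    a = np.mean(np.cos(right_ascension_rad))
    b = np.mean(np.sin(right_ascension_rad))
    amplitude = 2.0 * np.hypot(a, b)
    phase = np.arctan2(b, a)
    return amplitude, phase

# Toy sky: mostly isotropic arrival directions with a small dipolar modulation.
rng = np.random.default_rng(3)
n = 30000
ra = rng.uniform(0, 2 * np.pi, n)
keep = rng.uniform(0, 1, n) < 0.5 * (1 + 0.06 * np.cos(ra - 1.0))
amp, phase = first_harmonic_amplitude(ra[keep])
print(f"first-harmonic amplitude ~ {amp:.3f}, phase ~ {np.degrees(phase):.1f} deg")
```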
Analysis of aromatic aldehydes in brandy and wine by high-performance capillary electrophoresis.
Panossian, A; Mamikonyan, G; Torosyan, M; Gabrielyan, E; Mkhitaryan, S
2001-09-01
A new method for the analysis of vanillin, syringaldehyde, coniferaldehyde, and sinapaldehyde in brandy and wine by high-performance capillary electrophoresis is described. Electrophoretic mobility of these compounds is achieved with a borate buffer at pH 9.3. At this pH, the sensitivity of UV detection of these phenolic aldehydes also increases. UV absorptions at 348, 362, 404, and 422 nm were selected for monitoring vanillin, syringaldehyde, coniferaldehyde, and sinapaldehyde, respectively. This procedure was performed simultaneously during one run using a diode array detector. Samples of brandy or wine were analyzed directly without concentration, extraction, or any other preliminary treatment of the test sample. The limits of detection were found to be 0.275, 0.1425, 0.1475, and 0.1975 ppm for syringaldehyde, coniferaldehyde, sinapaldehyde, and vanillin, respectively, which is acceptable for the analysis of both brandy and wine aged in oak barrels. The method has been shown to be linear in a range from 0.3 to 57 mg/L. Recoveries ranged between 99.9% and 107.7% for all of the compounds tested. Repeatability and reproducibility of the method were high, with relative standard deviations of approximately 3% and between 4.47% and 6.89%, respectively, for all tested compounds. The method is useful for the identification of counterfeit brandy, which is easy to recognize by the absence of sinapaldehyde, syringaldehyde, and coniferaldehyde, which are not detectable in false brandy.
Clinical Evaluation of a Loop-Mediated Amplification Kit for Diagnosis of Imported Malaria
Polley, Spencer D.; González, Iveth J.; Mohamed, Deqa; Daly, Rosemarie; Bowers, Kathy; Watson, Julie; Mewse, Emma; Armstrong, Margaret; Gray, Christen; Perkins, Mark D.; Bell, David; Kanda, Hidetoshi; Tomita, Norihiro; Kubota, Yutaka; Mori, Yasuyoshi; Chiodini, Peter L.; Sutherland, Colin J.
2013-01-01
Background. Diagnosis of malaria relies on parasite detection by microscopy or antigen detection; both fail to detect low-density infections. New tests providing rapid, sensitive diagnosis with minimal need for training would enhance both malaria diagnosis and malaria control activities. We determined the diagnostic accuracy of a new loop-mediated amplification (LAMP) kit in febrile returned travelers. Methods. The kit was evaluated in sequential blood samples from returned travelers sent for pathogen testing to a specialist parasitology laboratory. Microscopy was performed, and then malaria LAMP was performed using Plasmodium genus and Plasmodium falciparum–specific tests in parallel. Nested polymerase chain reaction (PCR) was performed on all samples as the reference standard. Primary outcome measures for diagnostic accuracy were sensitivity and specificity of LAMP results, compared with those of nested PCR. Results. A total of 705 samples were tested in the primary analysis. Sensitivity and specificity were 98.4% and 98.1%, respectively, for the LAMP P. falciparum primers and 97.0% and 99.2%, respectively, for the Plasmodium genus primers. Post hoc repeat PCR analysis of all 15 tests with discrepant results resolved 4 results in favor of LAMP, suggesting that the primary analysis had underestimated diagnostic accuracy. Conclusions. Malaria LAMP had a diagnostic accuracy similar to that of nested PCR, with a greatly reduced time to result, and was superior to expert microscopy. PMID:23633403
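A small, hedged computation of the headline diagnostic-accuracy metrics from a 2x2 table against a reference standard; the counts below are hypothetical stand-ins, not the study's data.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic accuracy of an index test against a reference standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical 2x2 table for a LAMP assay versus nested PCR (not the paper's counts).
tp, fn = 128, 2      # reference-positive samples: LAMP positive / negative
tn, fp = 565, 10     # reference-negative samples: LAMP negative / positive
sens, spec = sensitivity_specificity(tp, fn, tn, fp)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```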
Waites, Anthony B; Mannfolk, Peter; Shaw, Marnie E; Olsrud, Johan; Jackson, Graeme D
2007-02-01
Clinical functional magnetic resonance imaging (fMRI) occasionally fails to detect significant activation, often due to variability in task performance. The present study seeks to test whether a more flexible statistical analysis can better detect activation, by accounting for variance associated with variable compliance to the task over time. Experimental results and simulated data both confirm that even at 80% compliance to the task, such a flexible model outperforms standard statistical analysis when assessed using the extent of activation (experimental data), goodness of fit (experimental data), and area under the operator characteristic curve (simulated data). Furthermore, retrospective examination of 14 clinical fMRI examinations reveals that in patients where the standard statistical approach yields activation, there is a measurable gain in model performance in adopting the flexible statistical model, with little or no penalty in lost sensitivity. This indicates that a flexible model should be considered, particularly for clinical patients who may have difficulty complying fully with the study task.
Breast Mass Detection in Digital Mammogram Based on Gestalt Psychology
Bu, Qirong; Liu, Feihong; Zhang, Min; Ren, Yu; Lv, Yi
2018-01-01
Inspired by gestalt psychology, we combine human cognitive characteristics with the knowledge of radiologists in medical image analysis. In this paper, a novel framework is proposed to detect breast masses in digitized mammograms. It can be divided into three modules: sensation integration, semantic integration, and verification. After analyzing the process of radiologists' mammography screening, a series of visual rules based on the morphological characteristics of breast masses are presented and quantified by mathematical methods. The framework can be seen as an effective trade-off between bottom-up sensation and top-down recognition methods. This is a new exploratory method for the automatic detection of lesions. The experiments are performed on the Mammographic Image Analysis Society (MIAS) and Digital Database for Screening Mammography (DDSM) data sets. The sensitivity reached 92% at 1.94 false positives per image (FPI) on MIAS and 93.84% at 2.21 FPI on DDSM. Our framework achieved a better performance compared with other algorithms. PMID:29854359
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow
NASA Astrophysics Data System (ADS)
Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar
2018-03-01
Vision-based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract a large amount of information for analyzing traffic scenes. The rapidly growing number of vehicles on the road, together with the significant increase in cameras, has dictated the need for traffic surveillance systems. Such a system can take over the burdensome task performed by human operators in a traffic monitoring centre. The main technique proposed in this paper concentrates on multiple vehicle detection and segmentation, focusing on monitoring through Closed Circuit Television (CCTV) video. The system is able to automatically segment vehicles extracted from a heavy traffic scene by optical flow estimation together with a blob analysis technique in order to detect the moving vehicles. Prior to segmentation, the blob analysis technique computes the area of the region of interest corresponding to a moving vehicle, which is then used to create a bounding box on that particular vehicle. Experimental validation of the proposed system was performed, and the algorithm is demonstrated on various sets of traffic scenes.
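A hedged OpenCV sketch of the optical-flow-plus-blob-analysis idea described above; the Farnebäck parameters, magnitude threshold, minimum blob area, and the file name in the commented usage are assumptions for illustration, not the authors' settings.

```python
import cv2
import numpy as np

def detect_moving_vehicles(prev_gray, curr_gray, mag_thresh=1.0, min_area=400):
    """Dense optical flow between two grayscale frames, thresholded on motion
    magnitude, followed by blob analysis to get bounding boxes of moving regions."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    motion_mask = (magnitude > mag_thresh).astype(np.uint8)

    # Blob analysis: connected components with area filtering.
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(motion_mask)
    boxes = []
    for i in range(1, n_labels):                       # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:
            boxes.append((x, y, w, h))
    return boxes

# Usage sketch (frames would normally come from a CCTV video stream):
# cap = cv2.VideoCapture("traffic.mp4")
# ok, f0 = cap.read(); ok, f1 = cap.read()
# boxes = detect_moving_vehicles(cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY),
#                                cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY))
```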
NASA Technical Reports Server (NTRS)
Keesler, E. L.
1974-01-01
The functional paths of the Orbital Maneuver Subsystem (OMS) are defined. The operational flight instrumentation required for performance monitoring, fault detection, and annunciation is described. The OMS is a pressure-fed rocket engine propulsion subsystem. One complete OMS shares each of the two auxiliary propulsion subsystem pods with a reaction control subsystem. Each OMS is composed of a pressurization system, a propellant tanking system, and a gimbaled rocket engine. The design, development, and operation of the system are explained. Diagrams of the system are provided.
1990-09-01
military pilot acceptance of a safety network system would be based, as always, on the following: a. Do I really need such a system and will it be a... inferring pilot state based on computer analysis of pilot control inputs (or lack of). Having decided that the pilot is incapacitated, PMAS would alert... the advances being made in neural network computing machinery have necessitated a complete re-thinking of the conventional serial von Neumann machine
Process fault detection and nonlinear time series analysis for anomaly detection in safeguards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, T.L.; Mullen, M.F.; Wangen, L.E.
In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model-predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material from two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
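A minimal sketch of the residual-testing step described above, applying a univariate z-score test and a multivariate Mahalanobis-distance test to simulated residual vectors from a hypothetical three-variable (three-tank) model; the covariance, leak size, and thresholds are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_test(residuals, cov, alpha=0.01):
    """Flag time steps whose residual vector exceeds the chi-square threshold."""
    inv_cov = np.linalg.inv(cov)
    d2 = np.einsum('ij,jk,ik->i', residuals, inv_cov, residuals)
    return d2 > chi2.ppf(1 - alpha, df=residuals.shape[1])

rng = np.random.default_rng(4)
cov = np.diag([0.01, 0.01, 0.01])          # assumed measurement-noise covariance
residuals = rng.multivariate_normal(np.zeros(3), cov, size=500)
residuals[300:, 0] -= 0.05                 # a slow, unreplaced leak in tank 1

# Univariate z-score test on each tank level, and the multivariate Mahalanobis test.
z_alarm = np.abs(residuals / np.sqrt(np.diag(cov))) > 3.0
m_alarm = mahalanobis_test(residuals, cov)
print("z-score alarms:", z_alarm.any(axis=1).sum(), "Mahalanobis alarms:", m_alarm.sum())
```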
A normal incidence X-ray telescope
NASA Technical Reports Server (NTRS)
Golub, Leon
1987-01-01
The postflight performance evaluation of the X-ray telescope was summarized. All payload systems and subsystems performed well within acceptable limits, with the sole exception of the light-blocking prefilters. Launch, flight and recovery were performed in a fully satisfactory manner. The payload was recovered in a timely manner and in excellent condition. The prefilter performance analysis showed that no X-ray images were detected on the processed flight film. Recommendations for improved performance are listed.
NASA Astrophysics Data System (ADS)
Dalipi, Rogerta; Marguí, Eva; Borgese, Laura; Bilo, Fabjola; Depero, Laura E.
2016-06-01
Recent technological improvements have led to a widespread adoption of benchtop total reflection X-ray fluorescence (TXRF) systems for the analysis of liquid samples. However, benchtop TXRF systems usually present limited sensitivity compared with high-scale instrumentation, which can restrict their application in some fields. The aim of the present work was to evaluate and compare the analytical capabilities of two TXRF systems, equipped with low-power Mo and W target X-ray tubes, for multielemental analysis of wine samples. Using the Mo-TXRF system, the detection limits for most elements were one order of magnitude lower than those attained using the W-TXRF system. For the detection of high-Z elements like Cd and Ag, however, W-TXRF remains a very good option due to the possibility of K-line detection. Accuracy and precision of the obtained results were evaluated by analyzing spiked real wine samples and comparing the TXRF results with those obtained by inductively coupled plasma optical emission spectroscopy (ICP-OES). In general, good agreement was obtained between ICP-OES and TXRF results for the analysis of both red and white wine samples, except for light elements (i.e., K), for which the TXRF concentrations were underestimated. However, the analytical quality of the TXRF results can be further improved if the wine analysis is performed after dilution of the sample with de-ionized water.
NASA Astrophysics Data System (ADS)
Keyport, Ren N.; Oommen, Thomas; Martha, Tapas R.; Sajinkumar, K. S.; Gierke, John S.
2018-02-01
A comparative analysis of landslides detected by pixel-based and object-oriented analysis (OOA) methods was performed using very high-resolution (VHR) remotely sensed aerial images for San Juan La Laguna, Guatemala, which witnessed widespread devastation during the 2005 Hurricane Stan. A 3-band orthophoto of 0.5 m spatial resolution together with a field-based inventory of 115 landslides were used for the analysis. A binary reference was assigned, with a value of zero for landslide and unity for non-landslide pixels. The pixel-based analysis was performed using unsupervised classification, which resulted in 11 different trial classes. Detection of landslides using OOA includes 2-step K-means clustering to eliminate regions based on brightness, followed by elimination of false positives using object properties such as rectangular fit, compactness, length/width ratio, mean difference of objects, and slope angle. Both overall accuracy and F-score for the OOA methods outperformed the pixel-based unsupervised classification methods in both landslide and non-landslide classes. The overall accuracy for OOA and pixel-based unsupervised classification was 96.5% and 94.3%, respectively, whereas the best F-scores for landslide identification for the OOA and pixel-based unsupervised methods were 84.3% and 77.9%, respectively. The results indicate that the OOA is able to identify the majority of landslides with few false positives when compared to pixel-based unsupervised classification.
A CO trace gas detection system based on continuous wave DFB-QCL
NASA Astrophysics Data System (ADS)
Dang, Jingmin; Yu, Haiye; Sun, Yujing; Wang, Yiding
2017-05-01
A compact and mobile system was demonstrated for the detection of carbon monoxide (CO) at trace level. This system adopted a high-power, continuous wave (CW), distributed feedback quantum cascade laser (DFB-QCL) operating at ∼22 °C as the excitation source. Wavelength modulation spectroscopy (WMS) together with second harmonic detection was used to isolate the complex, overlapping spectral absorption features typical of ambient pressures and to achieve excellent specificity and high detection sensitivity. For the selected P(11) absorption line of the CO molecule, located at 2099.083 cm-1, a limit of detection (LoD) of 26 ppb by volume (ppbv) at atmospheric pressure was achieved with a 1 s acquisition time. Allan deviation analysis was performed to investigate the long-term performance of the CO detection system, and a measurement precision of 3.4 ppbv was observed at an optimal integration time of approximately 114 s, which verified the reliable and robust operation of the developed system.
Lieb, Florian; Stark, Hans-Georg; Thielemann, Christiane
2017-06-01
Spike detection from extracellular recordings is a crucial preprocessing step when analyzing neuronal activity. The decision whether a specific part of the signal is a spike or not is important for any subsequent processing step, such as spike sorting or burst detection, in order to reduce the number of erroneously identified spikes. Many spike detection algorithms have already been suggested, all working reasonably well whenever the signal-to-noise ratio is large enough. When the noise level is high, however, these algorithms perform poorly. In this paper we present two new spike detection algorithms. The first is based on a stationary wavelet energy operator and the second is based on the time-frequency representation of spikes. Both algorithms are more reliable than the most commonly used methods. The performance of the algorithms is confirmed using simulated data resembling original data recorded from cortical neurons with multielectrode arrays. In order to demonstrate that the performance of the algorithms is not restricted to one specific set of data, we also verify the performance using a publicly available simulated data set. We show that both proposed algorithms have the best performance among all tested methods, regardless of the signal-to-noise ratio, in both data sets. This contribution will benefit electrophysiological investigations of human cells. In particular, the spatial and temporal analysis of neural network communication is improved by using the proposed spike detection algorithms.
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and conventional frequentist analyses of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for cases in which the source is present only briefly (less than the count time), time-interval information is more sensitive for detecting a change than count information, since the source counts are averaged with the background counts over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
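The core idea of the Bayesian time-interval approach can be sketched as follows: under a Poisson model, inter-pulse intervals are exponentially distributed, so the posterior probability that a source is present can be updated after every pulse instead of waiting for a fixed count time. The Python illustration below is not the authors' R implementation; the background and source rates and the prior are assumed values.

```python
import numpy as np
from scipy.stats import expon, poisson

background_rate = 10.0      # counts per second (assumed)
source_rate = 20.0          # additional counts per second when a source is present
prior_source = 0.5

def posterior_after_intervals(intervals, b=background_rate, s=source_rate,
                              prior=prior_source):
    """Sequential Bayes update over observed inter-pulse time intervals."""
    post = prior
    for dt in intervals:
        like_bg = expon.pdf(dt, scale=1.0 / b)           # background only
        like_src = expon.pdf(dt, scale=1.0 / (b + s))    # background + source
        post = like_src * post / (like_src * post + like_bg * (1 - post))
    return post

def posterior_from_count(n, t, b=background_rate, s=source_rate,
                         prior=prior_source):
    """Bayes update from a single count n observed over a fixed count time t."""
    like_bg = poisson.pmf(n, b * t)
    like_src = poisson.pmf(n, (b + s) * t)
    return like_src * prior / (like_src * prior + like_bg * (1 - prior))

# Simulated example: pulses from background plus source over a short interval.
rng = np.random.default_rng(0)
intervals = rng.exponential(1.0 / (background_rate + source_rate), size=20)
print(posterior_after_intervals(intervals))
print(posterior_from_count(len(intervals), intervals.sum()))
```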
Jiang, Sha-Yi; Yang, Jing-Wei; Shao, Jing-Bo; Liao, Xue-Lian; Lu, Zheng-Hua; Jiang, Hui
2016-05-01
In this meta-analysis, we evaluated the diagnostic role of Epstein-Barr virus deoxyribonucleic acid detection and quantitation in the serum of pediatric and young adult patients with infectious mononucleosis. The primary outcome of this meta-analysis was the sensitivity and specificity of Epstein-Barr virus (EBV) deoxyribonucleic acid (DNA) detection and quantitation using polymerase chain reaction (PCR). A systematic review and meta-analysis was performed by searching for articles that were published through September 24, 2014 in the following databases: Medline, Cochrane, EMBASE, and Google Scholar. The following keywords were used for the search: "Epstein-Barr virus," "infectious mononucleosis," "children/young adults/infant/pediatric," and "polymerase chain reaction or PCR." Three studies were included in this analysis. We found that for detection by PCR, the pooled sensitivity for detecting EBV DNA was 77% (95%CI, 66-86%) and the pooled specificity was 98% (95%CI, 93-100%). Our findings indicate that this PCR-based assay has high specificity and good sensitivity for detecting EBV DNA, indicating that it may be useful for identifying patients with infectious mononucleosis. This assay may also be helpful for identifying young athletic patients or highly physically active pediatric patients who are at risk for splenic rupture due to acute infectious mononucleosis. © 2015 Wiley Periodicals, Inc.
Xiao, Qiyang; Li, Jian; Bai, Zhiliang; Sun, Jiedi; Zhou, Nan; Zeng, Zhoumo
2016-12-13
In this study, a small leak detection method based on variational mode decomposition (VMD) and ambiguity correlation classification (ACC) is proposed. The signals acquired from sensors were decomposed using VMD, and numerous components were obtained. Based on the probability density function (PDF), an adaptive de-noising algorithm built on VMD is proposed for processing the noise components and reconstructing the de-noised components. Furthermore, the ambiguity function image was employed for analysis of the reconstructed signals. Based on the correlation coefficient, ACC is proposed to detect small leaks in a pipeline. The analysis of pipeline leakage signals, using 1 mm and 2 mm leaks, has shown that the proposed detection method can detect a small leak accurately and effectively. Moreover, the experimental results have shown that the proposed method achieved better performance than support vector machine (SVM) and back-propagation neural network (BP) methods.
Giovannelli, D; Abballe, F
2005-08-26
A method has been developed which allows simultaneous determination of three linear alkyl trimethylammonium salts. Dodecyltrimethylammonium chloride, tetradecyltrimethylammonium bromide and hexadecyltrimethylammonium chloride are widely used as the main active ingredients of lysing reagents for blood cell analyzers, which perform white blood cell differential determination into two or more sub-populations by impedance analysis. Ion-pair chromatography on a styrene-divinylbenzene phase proved to be a suitable, reliable and long-term stable tool for the separation of such quaternary compounds. Detection based on suppressed conductivity was chosen because of the lack of significant chromophores. A micromembrane suppressor device compatible with high solvent concentrations (up to 80%) was used in order to minimize the conductivity background before detection. In the present work we show how chemical post-column derivatization also makes the alkyl chain detectable by direct UV detection at 210 nm.
Method 8321B describes procedures for preparation and analysis of solid, aqueous liquid, drinking water and wipe samples using high performance liquid chromatography and mass spectrometry for extractable non-volatile compounds.
García Vicente, A M; Soriano Castrejón, A; Cruz Mora, M Á; Ortega Ruiperez, C; Espinosa Aunión, R; León Martín, A; González Ageitos, A; Van Gómez López, O
2014-01-01
To assess the accuracy of dual time point 2-deoxy-2-[(18)F]fluoro-D-glucose ((18)F-FDG) PET/CT in nodal staging and in the detection of extra-axillary involvement. Dual time point (18)F-FDG PET/CT was performed in 75 patients. Visual and semiquantitative assessment of lymph nodes was performed. Semiquantitative measurement of SUV and ROC analysis were carried out to calculate the SUV(max) cut-off value with the best diagnostic performance. Axillary and extra-axillary lymph node chains were evaluated. Sensitivity and specificity of visual assessment were 87.3% and 75%, respectively. SUV(max) values with the best sensitivity were 0.90 and 0.95 for early and delayed PET, respectively. SUV(max) values with the best specificity were 1.95 and 2.75, respectively. Extra-axillary lymph node involvement was detected in 26.7%. FDG PET/CT detected extra-axillary lymph node involvement in one-fourth of the patients. Semiquantitative lymph node analysis did not show any advantage over the visual evaluation. Copyright © 2013 Elsevier España, S.L. and SEMNIM. All rights reserved.
Performance of RVGui sensor and Kodak Ektaspeed Plus film for proximal caries detection.
Abreu, M; Mol, A; Ludlow, J B
2001-03-01
A high-resolution charge-coupled device was used to compare the diagnostic performances obtained with Trophy's new RVGui sensor and Kodak Ektaspeed Plus film with respect to caries detection. Three acquisition modes of the Trophy RVGui sensor were compared with Kodak Ektaspeed Plus film. Images of the proximal surfaces of 40 extracted posterior teeth were evaluated by 6 observers. The presence or absence of caries was scored by means of a 5-point confidence scale. The actual caries status of each surface was determined through ground-section histology. Responses were evaluated by means of receiver operating characteristic analysis. Areas under receiver operating characteristic curves (A(Z)) were assessed through analysis of variance. The mean A(Z) scores were 0.85 for film, 0.84 for the high-resolution caries mode, and 0.82 for both the low-resolution caries mode and the high-resolution periodontal mode. These differences were not statistically significant (P = .70). The differences among observers also were not statistically significant (P = .23). The performance of the RVGui sensor in high- and low-resolution modes for proximal caries detection is comparable to that of Ektaspeed Plus film.
Improved wheal detection from skin prick test images
NASA Astrophysics Data System (ADS)
Bulan, Orhan
2014-03-01
Skin prick testing is a commonly used method for the diagnosis of allergic diseases (e.g., pollen allergy, food allergy, etc.) in allergy clinics. The results of this test are erythema and a wheal provoked on the skin where the test is applied. The sensitivity of the patient to a specific allergen is determined by the physical size of the wheal, which can be estimated from images captured by digital cameras. Accurate wheal detection from these images is an important step for precise estimation of wheal size. In this paper, we propose a method for improved wheal detection on prick test images captured by digital cameras. Our method operates by first localizing the test region by detecting calibration marks drawn on the skin. The luminance variation across the localized region is eliminated by applying a color transformation from RGB to YCbCr and discarding the luminance channel. We enhance the contrast of the captured images for the purpose of wheal detection by performing principal component analysis on the blue-difference (Cb) and red-difference (Cr) color channels. Finally, we perform morphological operations on the contrast-enhanced image to detect the wheal on the image plane. Our experiments performed on images acquired from 36 different patients show the efficiency of the proposed method for wheal detection from skin prick test images captured in an uncontrolled environment.
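The processing chain described above can be illustrated with a short sketch: convert to YCbCr, discard luminance, project the chrominance channels onto their first principal component, and apply morphological smoothing before thresholding. This is an assumption-laden illustration rather than the authors' implementation; the kernel size, the use of an Otsu threshold, and the helper name detect_wheal are hypothetical.

```python
import cv2
import numpy as np

def detect_wheal(bgr_image, kernel_size=15):
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb).astype(np.float32)
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]        # discard the luminance channel Y

    # PCA on the two chrominance channels: project each pixel onto the first
    # principal component to maximize chromatic contrast.
    x = np.stack([cb.ravel(), cr.ravel()], axis=1)
    x -= x.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    contrast = (x @ vt[0]).reshape(cb.shape)
    contrast = cv2.normalize(contrast, None, 0, 255,
                             cv2.NORM_MINMAX).astype(np.uint8)

    # Morphological closing/opening to suppress noise, then Otsu thresholding.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE,
                                       (kernel_size, kernel_size))
    smoothed = cv2.morphologyEx(contrast, cv2.MORPH_CLOSE, kernel)
    smoothed = cv2.morphologyEx(smoothed, cv2.MORPH_OPEN, kernel)
    _, mask = cv2.threshold(smoothed, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask
```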
[Exclusive use of blue dye to detect sentinel lymph nodes in breast cancer].
Bühler H, Simón; Rojas P, Hugo; Cayazzo M, Daniela; Cunill C, Eduardo; Vesperinas A, Gonzalo; Hamilton S, James
2008-08-01
The use of a dye and radiocolloid to detect sentinel lymph nodes in breast cancer increases the detection rates. However, the use of either method alone does not modify the false negative rate; therefore there is no formal contraindication for the exclusive use of dye to detect nodes. To report a prospective analysis of the exclusive blue dye technique for sentinel node biopsy in patients with early breast cancer. We analyzed the first 100 women with pathologically proven breast cancer who met the inclusion criteria. Patent blue dye was used as the colorant. In the first 25 cases the sentinel node was identified using radiocolloid and blue dye, and then an axillary dissection was performed. In the next 25 women, blue dye was used exclusively for detection and an axillary dissection was performed. In the next 50 cases, blue dye was used and only isolated sentinel node biopsy was performed. In 92 of the 100 women a sentinel node was successfully detected. In the first 50 women, the false negative rate of sentinel lymph node detection was 6.9%. No complications occurred. During follow-up, lasting three to 29 months, no axillary relapse was observed. Sentinel node biopsy in patients with early breast cancer using exclusively blue dye is feasible and safe.
Faverjon, C; Vial, F; Andersson, M G; Lecollinet, S; Leblond, A
2017-04-01
West Nile virus (WNV) is a growing public health concern in Europe and there is a need to develop more efficient early detection systems. Nervous signs in horses are considered to be an early indicator of WNV, and using them in a syndromic surveillance system might be relevant. In our study, we assessed whether or not data collected by the French passive surveillance system for equine diseases can be used routinely for the detection of WNV. We tested several pre-processing methods and detection algorithms based on regression. We evaluated system performances using simulated and authentic data and compared them to those of the surveillance system currently in place. Our results show that the current detection algorithm provided performances similar to those tested using simulated and real data. However, regression models can be easily and better adapted to surveillance objectives. The detection performances obtained were compatible with the early detection of WNV outbreaks in France (i.e. sensitivity 98%, specificity >94%, timeliness 2.5 weeks and around four false alarms per year), but further work is needed to determine the most suitable alarm threshold for WNV surveillance in France using cost-efficiency analysis.
al Jarad, N; Strickland, B; Bothamley, G; Lock, S; Logan-Sinclair, R; Rudd, R M
1993-01-01
BACKGROUND--Crackles are a prominent clinical feature of asbestosis and may be an early sign of the condition. Auscultation, however, is subjective and interexaminer disagreement is a problem. Computerised lung sound analysis can visualise, store, and analyse lung sounds and disagreement on the presence of crackles is minimal. High resolution computed tomography (HRCT) is superior to chest radiography in detecting early signs of asbestosis. The aim of this study was to compare clinical auscultation, time expanded wave form analysis (TEW), chest radiography, and HRCT in detecting signs of asbestosis in asbestos workers. METHODS--Fifty three asbestos workers (51 men and two women) were investigated. Chest radiography and HRCT were assessed by two independent readers for detection of interstitial opacities. HRCT was performed in the supine position with additional sections at the bases in the prone position. Auscultation for persistent fine inspiratory crackles was performed by two independent examiners unacquainted with the diagnosis. TEW analysis was obtained from a 33 second recording of lung sounds over the lung bases. TEW and auscultation were performed in a control group of 13 subjects who had a normal chest radiograph. There were 10 current smokers and three previous smokers. In asbestos workers the extent of pulmonary opacities on the chest radiograph was scored according to the International Labour Office (ILO) scale. Patients were divided into two groups: 21 patients in whom the chest radiograph was > 1/0 (group 1) and 32 patients in whom the chest radiograph was scored < or = 1/0 (group 2) on the ILO scale. RESULTS--In patients with an ILO score of < or = 1/0 repetitive mid to late inspiratory crackles were detected by auscultation in seven (22%) patients and by TEW in 14 (44%). HRCT detected definite interstitial opacities in 11 (34%) and gravity dependent subpleural lines in two (6%) patients. All but two patients with evidence of interstitial disease or gravity dependent subpleural lines on HRCT had crackles detected by TEW. In patients with an ILO score of > 1/0 auscultation and TEW revealed mid to late inspiratory crackles in all patients, whereas HRCT revealed gravity dependent subpleural lines in one patient and signs of definite interstitial fibrosis in the rest. In normal subjects crackles different from those detected in asbestosis were detected by TEW in three subjects but only in one subject by auscultation. These were early, fine inspiratory crackles. CONCLUSION--Mid to late inspiratory crackles in asbestos workers are detected by TEW more frequently than by auscultation. Signs of early asbestosis not apparent on the plain radiograph are detected by TEW and HRCT with similar frequency. PMID:8511731
NASA Astrophysics Data System (ADS)
Li, Hong; Ding, Xue
2017-03-01
This paper combines wavelet analysis and wavelet transform theory with artificial neural networks, preprocessing the point feature attributes used in intrusion detection so that they are suitable for an improved wavelet neural network. The resulting intrusion classification model achieves better adaptability and self-learning ability, enhances the wavelet neural network's capability for field intrusion detection, reduces storage space, improves the performance of the constructed network, and shortens training time. Finally, simulation experiments on the KDD Cup 99 data set show that this method not only reduces the complexity of constructing the wavelet neural network but also ensures the accuracy of intrusion classification.
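As a rough illustration of the general idea only (not the paper's wavelet neural network), the sketch below pretreats each numeric feature vector with a one-level discrete wavelet transform and then trains a small off-the-shelf neural network as a stand-in classifier; the wavelet choice, network size and function names are assumptions.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_pretreat(features):
    """Concatenate approximation and detail coefficients of each sample."""
    coeffs = [np.concatenate(pywt.dwt(row, "db1")) for row in features]
    return np.asarray(coeffs)

def train_intrusion_classifier(features, labels):
    """features: (n_samples, n_features) numeric array; labels: class labels."""
    x = wavelet_pretreat(features)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    return clf.fit(x, labels)
```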
Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira
2015-01-01
Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typically complex response of events dispersed over various latencies and the poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating the sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Even with an incorrectly specified template, Waveform Similarity Analysis outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when the signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
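A minimal sketch of a cross-correlation template comparison of this kind is shown below; it is not the authors' code, and the similarity threshold, the normalisation scheme and the function name waveform_similarity_events are assumptions made only for illustration.

```python
import numpy as np
from scipy.signal import correlate, find_peaks

def waveform_similarity_events(signal, template, fs, threshold=0.6):
    """Return (latencies_s, magnitudes) of events similar to `template`."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    n = len(t)
    # Sliding correlation between the signal and the normalized template,
    # scaled by the local standard deviation to approximate a similarity score.
    corr = correlate(signal - signal.mean(), t, mode="valid") / n
    local_std = np.array([signal[i:i + n].std() + 1e-12
                          for i in range(len(corr))])
    similarity = corr / local_std

    peaks, _ = find_peaks(similarity, height=threshold, distance=n // 2)
    latencies = peaks / fs                              # event onset latencies (s)
    magnitudes = np.array([np.ptp(signal[p:p + n]) for p in peaks])
    return latencies, magnitudes
```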
On the Detectability of CO Molecules in the Interstellar Medium via X-Ray Spectroscopy
NASA Technical Reports Server (NTRS)
Joachimi, Katerine; Gatuzz, Efrain; Garcia, Javier; Kallman, Timothy R.
2016-01-01
We present a study of the detectability of CO molecules in the Galactic interstellar medium using high-resolution X-ray spectra obtained with the XMM-Newton Reflection Grating Spectrometer. We analysed 10 bright low-mass X-ray binaries (LMXBs) to study the CO contribution in their lines of sight. A total of 25 observations were fitted with the ISMabs X-ray absorption model, which includes photoabsorption cross-sections for O I, O II, O III and CO. We performed a Monte Carlo (MC) simulation analysis of the goodness of fit in order to estimate the significance of the CO detection. We determine that the statistical analysis prevents a significant detection of CO molecular X-ray absorption features, except for the lines of sight towards XTE J1817-330 and 4U 1636-53. In the case of XTE J1817-330, this is the first report of the presence of CO along its line of sight. Our results reinforce the conclusion that molecules make a minor contribution to the absorption features in the O K-edge spectral region. We estimate that a CO column density lower limit of N(CO) greater than 6 x 10(exp 16) per sq cm is required to perform a significant detection with XMM-Newton for typical exposure times.
NASA Astrophysics Data System (ADS)
Carrer, Leonardo; Gerekos, Christopher; Bruzzone, Lorenzo
2018-03-01
Lunar lava tubes have attracted special interest as they would be suitable shelters for future human outposts on the Moon. Recent experimental results from optical images and gravitational anomalies have brought strong evidence of their existence, but such investigative means have very limited potential for global mapping of lava tubes. In this paper, we investigate the design requirements and feasibility of a radar sounder system specifically conceived for detecting subsurface lunar lava tubes from orbit. This is done by conducting a complete performance assessment and by simulating the electromagnetic signatures of lava tubes using a coherent 3D simulator. The results show that radar sounding of lava tubes is feasible with good performance margins in terms of signal-to-noise and signal-to-clutter ratio, and that a dual-frequency radar sounder would be able to detect the majority of lunar lava tubes on the basis of their expected dimensions, with some limitations for very small lava tubes having widths smaller than 250 m. The electromagnetic simulations show that lava tubes display a unique signature characterized by a signal phase inversion on the roof echo. The analysis is provided for different acquisition geometries with respect to the position of the sounded lava tube. This analysis confirms that an orbiting multi-frequency radar sounder can reliably and unambiguously detect and map the majority of lunar lava tubes.
Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Li, Jonathan Z; Chapman, Brad; Charlebois, Patrick; Hofmann, Oliver; Weiner, Brian; Porter, Alyssa J; Samuel, Reshmi; Vardhanabhuti, Saran; Zheng, Lu; Eron, Joseph; Taiwo, Babafemi; Zody, Michael C; Henn, Matthew R; Kuritzkes, Daniel R; Hide, Winston; Wilson, Cara C; Berzins, Baiba I; Acosta, Edward P; Bastow, Barbara; Kim, Peter S; Read, Sarah W; Janik, Jennifer; Meres, Debra S; Lederman, Michael M; Mong-Kryspin, Lori; Shaw, Karl E; Zimmerman, Louis G; Leavitt, Randi; De La Rosa, Guy; Jennings, Amy
2014-01-01
The impact of raltegravir-resistant HIV-1 minority variants (MVs) on raltegravir treatment failure is unknown. Illumina sequencing offers greater throughput than 454, but sequence analysis tools for viral sequencing are needed. We evaluated Illumina and 454 for the detection of HIV-1 raltegravir-resistant MVs. A5262 was a single-arm study of raltegravir and darunavir/ritonavir in treatment-naïve patients. Pre-treatment plasma was obtained from 5 participants with raltegravir resistance at the time of virologic failure. A control library was created by pooling integrase clones at predefined proportions. Multiplexed sequencing was performed with Illumina and 454 platforms at comparable costs. Illumina sequence analysis was performed with the novel snp-assess tool and 454 sequencing was analyzed with V-Phaser. Illumina sequencing resulted in significantly higher sequence coverage and a 0.095% limit of detection. Illumina accurately detected all MVs in the control library at ≥0.5% and 7/10 MVs expected at 0.1%. 454 sequencing failed to detect any MVs at 0.1% with 5 false positive calls. For MVs detected in the patient samples by both 454 and Illumina, the correlation in the detected variant frequencies was high (R2 = 0.92, P<0.001). Illumina sequencing detected 2.4-fold greater nucleotide MVs and 2.9-fold greater amino acid MVs compared to 454. The only raltegravir-resistant MV detected was an E138K mutation in one participant by Illumina sequencing, but not by 454. In participants of A5262 with raltegravir resistance at virologic failure, baseline raltegravir-resistant MVs were rarely detected. At comparable costs to 454 sequencing, Illumina demonstrated greater depth of coverage, increased sensitivity for detecting HIV MVs, and fewer false positive variant calls.
Tong, Yanhong; McCarthy, Kaitlin; Kong, Huimin; Lemieux, Bertrand
2013-01-01
We have developed a rapid and simple molecular test, the IsoGlow HSV Typing assay, for the detection and typing of herpes simplex virus (types 1 and 2) from genital or oral lesions. Clinical samples suspended in viral transport media are simply diluted and then added to a helicase-dependent amplification master mix. The amplification and detection were performed on a portable fluorescence detector called the FireFly instrument. Detection of amplification products is based on end-point analysis using cycling probe technology. An internal control nucleic acid was included in the amplification master mix to monitor for the presence of amplification inhibitors in the samples. Because the device has only two fluorescence detection channels, two strategies were developed and compared to detect the internal control template: internal control detection by melting curve analysis using a dual-labeled probe, versus internal control detection using end-point fluorescence release by a CPT probe at a lower temperature. Both have a total turnaround time of about 1 hour. Clinical performance relative to herpes viral culture was evaluated using 176 clinical specimens. Both formats of the IsoGlow HSV Typing assay had sensitivities comparable to that of the Food and Drug Administration-cleared IsoAmp HSV (BioHelix Corp., Beverly, MA) test and specificity for the two types of HSV comparable to that of ELVIS HSV (Diagnostic Hybrids, Athens, OH). PMID:22951487
Event Recognition for Contactless Activity Monitoring Using Phase-Modulated Continuous Wave Radar.
Forouzanfar, Mohamad; Mabrouk, Mohamed; Rajan, Sreeraman; Bolic, Miodrag; Dajani, Hilmi R; Groza, Voicu Z
2017-02-01
The use of remote sensing technologies such as radar is gaining popularity as a technique for contactless detection of physiological signals and analysis of human motion. This paper presents a methodology for classifying different events in a collection of phase-modulated continuous wave radar returns. The primary application of interest is to monitor inmates, where the presence of human vital signs amidst different interferences needs to be identified. A comprehensive set of features is derived through time and frequency domain analyses of the radar returns. The Bhattacharyya distance is used to preselect the features with the highest class separability as candidate features for use in the classification process. Uncorrelated linear discriminant analysis is performed to decorrelate, denoise, and reduce the dimension of the candidate feature set. Linear and quadratic Bayesian classifiers are designed to distinguish breathing, different human motions, and nonhuman motions. The performance of these classifiers is evaluated on a pilot dataset of radar returns that contained different events including breathing, stopped breathing, simple human motions, and movement of a fan and water. Our proposed pattern classification system achieved accuracies of up to 93% in stationary subject detection, 90% in stopped-breathing detection, and 86% in interference detection. Our proposed radar pattern recognition system was able to accurately distinguish the predefined events amidst interferences. Besides inmate monitoring and suicide attempt detection, this work can be extended to other radar applications such as home-based monitoring of elderly people, apnea detection, and home occupancy detection.
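The classification pipeline described above can be sketched as follows, under several simplifying assumptions: univariate Gaussian class models for the Bhattacharyya ranking, standard LDA in place of the uncorrelated LDA variant, a binary rather than multi-class problem, and an arbitrary number k of retained features. Feature extraction from the radar returns is not shown.

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)

def bhattacharyya_1d(x1, x2):
    """Bhattacharyya distance between two univariate Gaussian class models."""
    m1, m2 = x1.mean(), x2.mean()
    v1, v2 = x1.var() + 1e-12, x2.var() + 1e-12
    return (0.25 * np.log(0.25 * (v1 / v2 + v2 / v1 + 2.0))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))

def train_radar_event_classifier(features, labels, k=10):
    """features: (n_samples, n_features); labels: binary (e.g. breathing vs. not)."""
    scores = np.array([bhattacharyya_1d(features[labels == 0, j],
                                        features[labels == 1, j])
                       for j in range(features.shape[1])])
    selected = np.argsort(scores)[::-1][:k]         # top-k most separable features

    lda = LinearDiscriminantAnalysis(n_components=1)
    reduced = lda.fit_transform(features[:, selected], labels)

    qda = QuadraticDiscriminantAnalysis().fit(reduced, labels)
    return selected, lda, qda
```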
Yamada, Yoshitake; Jinzaki, Masahiro; Hashimoto, Masahiro; Shiomi, Eisuke; Abe, Takayuki; Kuribayashi, Sachio; Ogawa, Kenji
2013-08-01
To compare the diagnostic performance of tomosynthesis with that of chest radiography for the detection of pulmonary emphysema, using multidetector computed tomography (MDCT) as reference. Forty-eight patients with and 63 without pulmonary emphysema underwent chest MDCT, tomosynthesis and radiography on the same day. Two blinded radiologists independently evaluated the tomosynthesis images and radiographs for the presence of pulmonary emphysema. Axial and coronal MDCT images served as the reference standard and the percentage lung volume with attenuation values of -950 HU or lower (LAA-950) was evaluated to determine the extent of emphysema. Receiver-operating characteristic (ROC) analysis and a generalised estimating equations model were used. ROC analysis revealed significantly better performance (P < 0.0001) of tomosynthesis than radiography for the detection of pulmonary emphysema. The average sensitivity, specificity, positive predictive value and negative predictive value of tomosynthesis were 0.875, 0.968, 0.955 and 0.910, respectively, whereas the values for radiography were 0.479, 0.913, 0.815 and 0.697, respectively. For both tomosynthesis and radiography, the sensitivity increased with increasing LAA-950. The diagnostic performance of tomosynthesis was significantly superior to that of radiography for the detection of pulmonary emphysema. In both tomosynthesis and radiography, the sensitivity was affected by the LAA-950. • Tomosynthesis showed significantly better diagnostic performance for pulmonary emphysema than radiography. • Interobserver agreement for tomosynthesis was significantly higher than that for radiography. • Sensitivity increased with increasing LAA-950 in both tomosynthesis and radiography. • Tomosynthesis imparts a similar radiation dose to two-projection chest radiography. • Radiation dose and cost of tomosynthesis are lower than those of MDCT.
Recent enhancements of the PMCC infrasound signal detector
NASA Astrophysics Data System (ADS)
Brachet, N.; Mialle, P.; Matoza, R. S.; Le Pichon, A.; Cansi, Y.; Ceranna, L.
2010-12-01
The Progressive Multi-Channel Correlation (PMCC) method is an array processing technique commonly used by the scientific community for detecting coherent signals recorded on infrasound arrays. The PMCC detector, originally developed by CEA/DASE (Cansi, 1995), was installed in 2004 in the operational environment of the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna. During the last 5 years, several changes have been made by the IDC to enhance the PMCC source code and parameter configuration, and the detector has exhibited good performance in terms of detection sensitivity and robustness. Recent studies performed at CEA/DASE have shown that both the IDC version (DFX/Geotool-PMCC) and the DASE version (WinPMCC) of the PMCC software benefit from the implementation of an adaptive processing-window duration and log-spaced frequency bands. This tested configuration enables better detection and characterization of all received signals in their wave-front parameter space (e.g., frequency-azimuth space, frequency-trace-velocity space). A new release of the WinPMCC software - running under Windows or Linux operating systems - including fully configurable filtering and detection parameters is now available upon request. We present the results of a statistical analysis on 10 years of infrasound data recorded at the IMS stations IS26, Germany, and IS22, New Caledonia. A comparison is made between the automatic detections produced by the IDC and the reprocessed detections using the optimized filtering and detection configuration parameters. Work is also underway at CEA/DASE to determine more rigorously the azimuth and speed uncertainties. The current algorithm estimates the uncertainties based on statistical analysis of the distribution of PMCC detection pixels in the azimuth-speed space. The new code under consideration calculates infrasound measurement errors as a function of physical parameters, i.e. dependent on the array geometry and the wave properties.
Liu, Sifei; Zhang, Guangrui; Qiu, Ying; Wang, Xiaobo; Guo, Lihan; Zhao, Yanxin; Tong, Meng; Wei, Lan; Sun, Lixin
2016-12-01
In this study, we aimed to establish a comprehensive and practical quality evaluation system for Shenmaidihuang pills. A simple and reliable high-performance liquid chromatography method coupled with photodiode array detection was developed for both fingerprint analysis and quantitative determination. In the fingerprint analysis, relative retention time and relative peak area were used to identify the common peaks in the 18 samples under investigation. Twenty-one peaks were selected as the common peaks to evaluate the similarities of 18 Shenmaidihuang pills samples with different manufacture dates. Furthermore, similarity analysis was applied to evaluate the similarity of the samples. Hierarchical cluster analysis and principal component analysis were also performed to evaluate the variation of Shenmaidihuang pills. In the quantitative analysis, linear regression, injection precision, recovery, repeatability and sample stability were all tested and good results were obtained for the simultaneous determination of the seven identified compounds, namely 5-hydroxymethylfurfural, morroniside, loganin, paeonol, paeoniflorin, psoralen and isopsoralen. The contents of some analytes in different batches of samples showed significant differences, especially for 5-hydroxymethylfurfural. It was therefore concluded that the chromatographic fingerprint method obtained by high-performance liquid chromatography coupled with photodiode array detection, associated with multi-compound determination, is a powerful and meaningful tool for comprehensive quality control of Shenmaidihuang pills. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Guo, Yujie; Chen, Xi; Qi, Jin; Yu, Boyang
2016-07-01
A reliable method, combining qualitative analysis by high-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and quantitative assessment by high-performance liquid chromatography with photodiode array detection, has been developed to simultaneously analyze flavonoids and alkaloids in lotus leaf extracts. In the qualitative analysis, a total of 30 compounds, including 12 flavonoids, 16 alkaloids, and two proanthocyanidins, were identified. The fragmentation behaviors of four types of flavone glycoside and three types of alkaloid are summarized. The mass spectra of four representative components, quercetin 3-O-glucuronide, norcoclaurine, nuciferine, and neferine, are shown to illustrate their fragmentation pathways. Five pairs of isomers were detected and three of them were distinguished by comparing the elution order with reference substances and the mass spectrometry data with reported data. In the quantitative analysis, 30 lotus leaf samples from different regions were analyzed to investigate the proportion of eight representative compounds. Quercetin 3-O-glucuronide was found to be the predominant constituent of lotus leaf extracts. For further discrimination among the samples, hierarchical cluster analysis, and principal component analysis, based on the areas of the eight quantitative peaks, were carried out. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ground-based deep-space LADAR for satellite detection: A parametric study
NASA Astrophysics Data System (ADS)
Davey, Kevin F.
1989-12-01
The minimum performance requirements of a ground-based infrared LADAR designed to detect deep-space satellites are determined, and a candidate sensor design based on current technology is presented. The research examines LADAR techniques and detection methods to determine the optimum LADAR configuration, and then assesses the effects of atmospheric transmission, background radiance, and turbulence across the infrared region to find the optimum laser wavelengths. Diffraction theory is then used in a parametric analysis of the transmitted laser beam and received signal, using a Cassegrainian telescope design and heterodyne detection. The effects of beam truncation and obscuration, heterodyne misalignment, off-boresight detection, and image-pixel geometry are also included in the analysis. The derived equations are then used to assess the feasibility of several candidate designs under a wide range of detection conditions, including daylight operation through cirrus. The results show that successful detection is theoretically possible under most conditions by transmitting a high-power frequency-modulated pulse train from an isotopic 13CO2 laser radiating at 11.17 micrometers and utilizing post-detection integration and pulse compression techniques.
Combined Raman spectroscopy and autofluoresence imaging method for in vivo skin tumor diagnosis
NASA Astrophysics Data System (ADS)
Zakharov, V. P.; Bratchenko, I. A.; Myakinin, O. O.; Artemyev, D. N.; Khristoforova, Y. A.; Kozlov, S. V.; Moryatov, A. A.
2014-09-01
A combined fluorescence and Raman spectroscopy (RS) method for in vivo detection of malignant human skin tumors was demonstrated. Fluorescence analysis was used for detection of abnormalities during fast scanning of large tissue areas. In suspected cases of malignancy, Raman spectral analysis of the biological tissue was performed to determine the type of neoplasm. A special RS phase method was proposed for in vivo identification of skin tumors. Quadratic Discriminant Analysis was used for tumor type classification on phase planes. It was shown that application of the phase method provides a diagnosis of malignant melanoma with a sensitivity of 89% and a specificity of 87%.
Anastassiades, M; Schwack, W
1998-10-30
Simple methods for the analysis of carbendazim, benomyl and thiophanate methyl in fruits and vegetables and of 2,4-D in citrus fruits are presented. Sample preparation involves supercritical fluid extraction with carbon dioxide and further analysis is performed without any additional clean-up by GC-MS after derivatisation or directly by HPLC-diode array detection. The SFE methods presented are clearly faster and more cost effective than traditional solvent based approaches. The recoveries, detection limits and repeatabilities achieved, meet the needs of tolerance level monitoring of these compounds in fruits and vegetables.
NASA Astrophysics Data System (ADS)
Aktas, Metin; Maral, Hakan; Akgun, Toygar
2018-02-01
Extinction ratio is an inherent limiting factor that directly affects the detection performance of phase-OTDR-based distributed acoustic sensing systems. In this work we present a model-based analysis of Rayleigh scattering to simulate the effects of extinction ratio on the received signal under varying signal acquisition scenarios and system parameters. These signal acquisition scenarios are constructed to represent typically observed cases such as multiple vibration sources cluttered around the target vibration source to be detected, continuous wave light sources with center frequency drift, varying fiber optic cable lengths and varying ADC bit resolutions. Results show that an insufficient extinction ratio can result in a high optical noise floor and effectively hide the effects of elaborate system improvement efforts.
Inertial Sensor-Based Motion Analysis of Lower Limbs for Rehabilitation Treatments
Sun, Tongyang; Duan, Lihong; Wang, Yulong
2017-01-01
The diagnosis of the hemiplegic rehabilitation state performed by therapists can be biased by their subjective experience, which may deteriorate the rehabilitation effect. In order to improve this situation, a quantitative evaluation is proposed. Though many motion analysis systems are available, they are too complicated for practical application by therapists. In this paper, a method for detecting the motion of human lower limbs, including all degrees of freedom (DOFs), via inertial sensors is proposed, which permits analysis of the patient's motion ability. This method is applicable to arbitrary walking directions and tracks of the persons under study, and its results are unbiased compared to therapists' qualitative estimations. Using a simplified mathematical model of the human body, the rotation angles of each lower limb joint are calculated from the input signals acquired by the inertial sensors. Finally, the rotation angle versus joint displacement curves are constructed, and the estimated values of joint motion angle and motion ability are obtained. Experimental verification of the proposed motion detection and analysis method was performed, which proved that it can efficiently detect the differences between the motion behaviors of disabled and healthy persons and provide a reliable quantitative evaluation of the rehabilitation state. PMID:29065575
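A much-simplified illustration of one ingredient of such a method is given below: estimating a single joint's flexion angle in the sagittal plane from two inertial sensors by fusing integrated gyroscope rate with accelerometer inclination. The complementary-filter gain, sampling rate, axis conventions and function names are assumptions; the paper's full model covers all DOFs and arbitrary walking directions.

```python
import numpy as np

def segment_angle(gyro_y, acc_x, acc_z, fs=100.0, alpha=0.98):
    """gyro_y: angular rate about the mediolateral axis (rad/s);
    acc_x, acc_z: accelerations along/against gravity (m/s^2)."""
    dt = 1.0 / fs
    acc_angle = np.arctan2(acc_x, acc_z)        # inclination from gravity
    angle = np.zeros_like(gyro_y)
    angle[0] = acc_angle[0]
    for i in range(1, len(gyro_y)):
        # Complementary filter: trust the integrated gyro short-term,
        # the accelerometer inclination long-term.
        gyro_estimate = angle[i - 1] + gyro_y[i] * dt
        angle[i] = alpha * gyro_estimate + (1 - alpha) * acc_angle[i]
    return angle

def knee_flexion(thigh_imu, shank_imu, fs=100.0):
    """Each *_imu is a dict with 'gyro_y', 'acc_x', 'acc_z' arrays."""
    thigh = segment_angle(thigh_imu["gyro_y"], thigh_imu["acc_x"],
                          thigh_imu["acc_z"], fs)
    shank = segment_angle(shank_imu["gyro_y"], shank_imu["acc_x"],
                          shank_imu["acc_z"], fs)
    return np.degrees(shank - thigh)            # relative joint angle
```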
An online sleep apnea detection method based on recurrence quantification analysis.
Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen
2014-07-01
This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use different fixed-amount-of-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and, hence, to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a truly efficient sleep apnea detection system.
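To make the recurrence-plot step concrete, here is a small sketch that builds a fixed-amount-of-neighbours recurrence plot from an RR-interval series and computes two basic RQA statistics (recurrence rate and determinism). The embedding dimension, delay and neighbour count are assumed values, not those used in the paper, and the main diagonal is included for simplicity.

```python
import numpy as np

def recurrence_plot_fan(rr, dim=3, delay=1, n_neighbors=10):
    """Fixed-amount-of-neighbours (FAN) recurrence plot of an RR-interval series."""
    rr = np.asarray(rr, dtype=float)
    n = len(rr) - (dim - 1) * delay
    emb = np.column_stack([rr[i * delay:i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    rp = np.zeros((n, n), dtype=bool)
    for j in range(n):                 # each column gets the same number of
        rp[np.argsort(dist[:, j])[:n_neighbors], j] = True   # recurrent points
    return rp

def _diagonal_line_lengths(rp):
    lengths, n = [], rp.shape[0]
    for k in range(-(n - 1), n):
        run = 0
        for v in np.diagonal(rp, offset=k):
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    return np.asarray(lengths)

def rqa_statistics(rp, l_min=2):
    lengths = _diagonal_line_lengths(rp)
    det = lengths[lengths >= l_min].sum() / max(lengths.sum(), 1)
    return {"recurrence_rate": rp.mean(), "determinism": det}
```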
Zhu, Qiaohao; Carriere, K C
2016-01-01
Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on the detection and correction of publication bias in meta-analysis focuses mainly on funnel-plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed- and random-effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric trim-and-fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
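For the "small effect sizes go unpublished" case under a fixed-effect model, the truncated-distribution idea can be sketched as follows: observed effects are treated as normal draws truncated below at a known cutoff, the overall mean is recovered by maximum likelihood, and the truncation proportion follows from the normal CDF at the fitted mean. The common within-study standard deviation and the cutoff are assumptions of this toy example, not the estimators derived in the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def fit_truncated_fixed_effect(effects, sigma, cutoff):
    """MLE of the overall mean given effects observed only above `cutoff`."""
    def neg_log_lik(mu):
        dens = norm.logpdf(effects, loc=mu, scale=sigma)
        normaliser = norm.logsf(cutoff, loc=mu, scale=sigma)   # log P(Y > cutoff)
        return -np.sum(dens - normaliser)
    res = minimize_scalar(neg_log_lik, bounds=(-10, 10), method="bounded")
    mu_hat = res.x
    truncation_prop = norm.cdf(cutoff, loc=mu_hat, scale=sigma)  # unpublished share
    return mu_hat, truncation_prop

# Example on simulated data: true mean 0.2, only effects above 0.1 published.
rng = np.random.default_rng(1)
all_effects = rng.normal(0.2, 0.3, size=2000)
published = all_effects[all_effects > 0.1]
print(fit_truncated_fixed_effect(published, sigma=0.3, cutoff=0.1))
```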
Bendahl, Lars; Hansen, Steen Honoré; Gammelgaard, Bente; Sturup, Stefan; Nielsen, Camilla
2006-02-24
Ultra performance liquid chromatography (UPLC) was coupled to inductively coupled plasma mass spectrometry (ICP-MS) for fast analysis of three bromine-containing preservatives, monitoring the 79Br and 81Br isotopes simultaneously. Due to the efficiency of the 1.7 microm column packing material, the resolution of the test substances was only slightly affected when the linear flow velocity was increased from 0.5 to 1.9 mm s(-1). However, the sensitivity of ICP-MS detection decreased when the linear flow velocity was increased from 0.5 to 1.9 mm s(-1). Analytical figures of merit were determined at an intermediate and at a high linear velocity. The precision was better than 2.2% R.S.D. and regression analysis showed that a linear response was achieved at both flow rates (R2 > 0.9993, n = 36). The analysis time was less than 4.5 min at a flow rate of 50 microL min(-1) and limits of detection and quantification were better than 3.3 and 11 microg Br L(-1), respectively. The analysis time was reduced to 2.7 min when the flow rate was increased to 90 microL min(-1) and limits of detection and quantification were better than 20 and 65 microg Br L(-1), respectively. The method was applied for quantitative analysis of bromine-containing preservatives in commercially available cosmetic products.
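As a side note on the figures of merit quoted above, limits of detection and quantification are commonly estimated from a calibration line as 3.3 and 10 times the residual standard error divided by the slope; the short sketch below uses made-up calibration data and is not taken from the paper.

```python
import numpy as np

def lod_loq(concentrations, signals):
    """LOD = 3.3*s/b and LOQ = 10*s/b from a linear calibration (slope b, residual SE s)."""
    b, a = np.polyfit(concentrations, signals, 1)       # slope, intercept
    residuals = signals - (a + b * concentrations)
    s = residuals.std(ddof=2)                           # residual standard error
    return 3.3 * s / b, 10.0 * s / b

conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0])         # hypothetical microg Br L^-1
sig = np.array([120.0, 560.0, 1310.0, 2590.0, 5180.0])  # hypothetical ICP-MS counts
print(lod_loq(conc, sig))
```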
Sun, Liangliang; Zhu, Guijie; Yan, Xiaojing; Champion, Mathew M.
2014-01-01
The vast majority of proteomic studies employ reversed-phase high-performance liquid chromatography coupled with tandem mass spectrometry for analysis of the tryptic digest of a cellular lysate. This technology is quite mature, and typically provides identification of hundreds to thousands of peptides, which is used to infer the identity of hundreds to thousands of proteins. These approaches usually require milligrams to micrograms of starting material. Capillary zone electrophoresis provides an interesting alternative separation method based on a different separation mechanism than HPLC. Capillary electrophoresis received some attention for protein analysis beginning 25 years ago. Those efforts stalled because of the limited performance of the electrospray interfaces and the limited speed and sensitivity of mass spectrometers of that era. This review considers a new electrospray interface design coupled with Orbitrap Velos and linear Q-trap mass spectrometers. Capillary zone electrophoresis coupled with this interface and these detectors provides single shot detection of >1,250 peptides from an E. coli digest in less than one hour, identification of nearly 5,000 peptides from analysis of seven fractions produced by solid-phase extraction of the E. coli digest in a six hour total analysis time, low attomole detection limits for peptides generated from standard proteins, and high zeptomole detection limits for selected ion monitoring of peptides. Incorporation of an integrated on-line immobilized trypsin microreactor allows digestion and analysis of picogram amounts of a complex eukaryotic proteome. PMID:24277677
Gupta, Rahul; Audhkhasi, Kartik; Lee, Sungbok; Narayanan, Shrikanth
2017-01-01
Non-verbal communication involves the encoding, transmission and decoding of non-lexical cues and is realized using vocal (e.g. prosody) or visual (e.g. gaze, body language) channels during conversation. These cues perform the function of maintaining conversational flow, expressing emotions, and marking personality and interpersonal attitude. In particular, non-verbal cues in speech such as paralanguage and non-verbal vocal events (e.g. laughters, sighs, cries) are used to nuance meaning and convey emotions, mood and attitude. For instance, laughters are associated with affective expressions while fillers (e.g. um, ah) are used to hold the floor during a conversation. In this paper we present an automatic non-verbal vocal event detection system focusing on the detection of laughter and fillers. We extend our system presented during the Interspeech 2013 Social Signals Sub-challenge (the winning entry in the challenge) for frame-wise event detection and test several schemes for incorporating local context during detection. Specifically, we incorporate context at two separate levels in our system: (i) the raw frame-wise features and (ii) the output decisions. Furthermore, our system processes the output probabilities based on a few heuristic rules in order to reduce erroneous frame-based predictions. Our overall system achieves an Area Under the Receiver Operating Characteristic curve of 95.3% for detecting laughters and 90.4% for fillers on the test set drawn from the data specifications of the Interspeech 2013 Social Signals Sub-challenge. We perform further analysis to understand the interrelation between the features and the obtained results. Specifically, we conduct a feature sensitivity analysis and correlate it with each feature's stand-alone performance. The observations suggest that the trained system is more sensitive to features carrying higher discriminability, with implications towards a better system design. PMID:28713197
Nika, Varvara; Babyn, Paul; Zhu, Hongmei
2014-07-01
Automatic change detection methods for identifying the changes of serial MR images taken at different times are of great interest to radiologists. The majority of existing change detection methods in medical imaging, and those of brain images in particular, include many preprocessing steps and rely mostly on statistical analysis of magnetic resonance imaging (MRI) scans. Although most methods utilize registration software, tissue classification remains a difficult and overwhelming task. Recently, dictionary learning techniques are being used in many areas of image processing, such as image surveillance, face recognition, remote sensing, and medical imaging. We present an improved version of the EigenBlockCD algorithm, named the EigenBlockCD-2. The EigenBlockCD-2 algorithm performs an initial global registration and identifies the changes between serial MR images of the brain. Blocks of pixels from a baseline scan are used to train local dictionaries to detect changes in the follow-up scan. We use PCA to reduce the dimensionality of the local dictionaries and the redundancy of data. Choosing the appropriate distance measure significantly affects the performance of our algorithm. We examine the differences between [Formula: see text] and [Formula: see text] norms as two possible similarity measures in the improved EigenBlockCD-2 algorithm. We show the advantages of the [Formula: see text] norm over the [Formula: see text] norm both theoretically and numerically. We also demonstrate the performance of the new EigenBlockCD-2 algorithm for detecting changes of MR images and compare our results with those provided in the recent literature. Experimental results with both simulated and real MRI scans show that our improved EigenBlockCD-2 algorithm outperforms the previous methods. It detects clinical changes while ignoring the changes due to the patient's position and other acquisition artifacts.
Detection probability of EBPSK-MODEM system
NASA Astrophysics Data System (ADS)
Yao, Yu; Wu, Lenan
2016-07-01
Since the impacting-filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (Pd) for fluctuating and non-fluctuating targets has been deduced. Also, a comparison of the Pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is not smaller than 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.
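The threshold-detection trade-off mentioned above can be illustrated, under a simple assumed model in which the impacting-filter output is Gaussian noise under H0 and a non-fluctuating amplitude peak plus noise under H1, by the closed-form detection probability and a Monte Carlo check below; this is a generic illustration, not the paper's derivation, and the SNR values and false alarm probability are assumptions.

```python
import numpy as np
from scipy.stats import norm

def pd_nonfluctuating(snr_db, pfa):
    """Closed-form Pd for a fixed amplitude A in unit-variance Gaussian noise."""
    a_over_sigma = np.sqrt(10.0 ** (snr_db / 10.0))
    threshold = norm.isf(pfa)                  # normalised decision threshold
    return norm.sf(threshold - a_over_sigma)

def pd_monte_carlo(snr_db, pfa, trials=200_000, rng=np.random.default_rng(0)):
    """Empirical check of the detection probability by direct simulation."""
    a = np.sqrt(10.0 ** (snr_db / 10.0))
    threshold = norm.isf(pfa)
    samples = a + rng.standard_normal(trials)  # filter output under H1
    return np.mean(samples > threshold)

for snr in (6, 10, 13):
    print(snr, pd_nonfluctuating(snr, 1e-4), pd_monte_carlo(snr, 1e-4))
```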