Sample records for threshold fevt method

  1. Experimental and Automated Analysis Techniques for High-resolution Electrical Mapping of Small Intestine Slow Wave Activity

    PubMed Central

    Angeli, Timothy R; O'Grady, Gregory; Paskaranandavadivel, Niranchan; Erickson, Jonathan C; Du, Peng; Pullan, Andrew J; Bissett, Ian P

    2013-01-01

    Background/Aims Small intestine motility is governed by an electrical slow wave activity, and abnormal slow wave events have been associated with intestinal dysmotility. High-resolution (HR) techniques are necessary to analyze slow wave propagation, but progress has been limited by few available electrode options and laborious manual analysis. This study presents novel methods for in vivo HR mapping of small intestine slow wave activity. Methods Recordings were obtained from along the porcine small intestine using flexible printed circuit board arrays (256 electrodes; 4 mm spacing). Filtering options were compared, and analysis was automated through adaptations of the falling-edge variable-threshold (FEVT) algorithm and graphical visualization tools. Results A Savitzky-Golay filter was chosen with polynomial-order 9 and window size 1.7 seconds, which maintained 94% of slow wave amplitude, 57% of gradient and achieved a noise correction ratio of 0.083. Optimized FEVT parameters achieved 87% sensitivity and 90% positive-predictive value. Automated activation mapping and animation successfully revealed slow wave propagation patterns, and frequency, velocity, and amplitude were calculated and compared at 5 locations along the intestine (16.4 ± 0.3 cpm, 13.4 ± 1.7 mm/sec, and 43 ± 6 µV, respectively, in the proximal jejunum). Conclusions The methods developed and validated here will greatly assist small intestine HR mapping, and will enable experimental and translational work to evaluate small intestine motility in health and disease. PMID:23667749
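The core FEVT idea referenced above — flagging steep falling edges whose slope exceeds a locally adaptive threshold — can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation; the window size and multiplier `k` are arbitrary placeholder choices:

```python
def fevt_detect(signal, window=20, k=3.0):
    """Flag indices whose falling edge (negative first difference) exceeds
    k times the local median absolute first difference."""
    diffs = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    events = []
    for i, d in enumerate(diffs):
        lo, hi = max(0, i - window), min(len(diffs), i + window + 1)
        local = sorted(abs(x) for x in diffs[lo:hi])
        mad = local[len(local) // 2] or 1e-12  # guard against flat segments
        if -d > k * mad:  # steep *falling* edges only
            events.append(i)
    return events
```

Scaling the threshold by a local median absolute difference keeps the detector robust to slow baseline drift while staying sensitive to abrupt slow-wave deflections.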

  2. Clinically Useful Spirometry in Preschool-Aged Children: Evaluation of the 2007 American Thoracic Society Guidelines

    PubMed Central

    Gaffin, Jonathan M.; Shotola, Nancy Lichtenberg; Martin, Thomas R.; Phipatanakul, Wanda

    2010-01-01

    Rationale In 2007 the American Thoracic Society (ATS) recommended guidelines for acceptability and repeatability for assessing spirometry in preschool children. The authors aim to determine the feasibility of spirometry among children in this age group performing spirometry for the first time in a busy clinical practice. Methods First-time spirometry for children age 4 to 5 years old was selected from the Children’s Hospital Boston Pulmonary Function Test (PFT) database. Maneuvers were deemed acceptable if (1) the flow-volume loop showed rapid rise and smooth descent; (2) the back extrapolated volume (Vbe), the volume leaked by a subject prior to the forced maneuver, was ≤80 ml and ≤12.5% of forced vital capacity (FVC); and (3) cessation of expiratory flow was at a point ≤10% of peak expiratory flow rate (PEFR). Repeatability was determined by another acceptable maneuver with forced expiratory volume in t seconds (FEVt) and FVC within 10% or 0.1 L of the best acceptable maneuver. Post hoc analysis compared spirometry values for those with asthma and cystic fibrosis to normative values. Results Two hundred and forty-eight preschool children performed spirometry for the first time between August 26, 2006, and August 25, 2008. At least one technically acceptable maneuver was found in 82.3% (n = 204) of the tests performed. Overall, 54% of children were able to perform acceptable and repeatable spirometry based on the ATS criteria. Children with asthma or cystic fibrosis did not have spirometry values that differed significantly from healthy controls. However, up to 29% of the overall cohort displayed at least one abnormal spirometry value. Conclusions Many preschool-aged children are able to perform technically acceptable and repeatable spirometry under normal conditions in a busy clinical setting. Spirometry may be a useful screen for abnormal lung function in this age group. PMID:20653495

  3. Thresholding Based on Maximum Weighted Object Correlation for Rail Defect Detection

    NASA Astrophysics Data System (ADS)

    Li, Qingyong; Huang, Yaping; Liang, Zhengping; Luo, Siwei

    Automatic thresholding is an important technique for rail defect detection, but traditional methods are not well suited to the characteristics of this application. This paper proposes the Maximum Weighted Object Correlation (MWOC) thresholding method, which exploits the facts that rail images are unimodal and that the defect proportion is small. MWOC selects a threshold by optimizing the product of the object correlation and a weight term expressing the proportion of thresholded defects. Our experimental results demonstrate that MWOC achieves a misclassification error of 0.85% and outperforms other well-established thresholding methods, including Otsu's method, maximum correlation thresholding, maximum entropy thresholding, and the valley-emphasis method, for rail defect detection.
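MWOC itself is not reproduced here, but Otsu's method, the main baseline it is compared against, is simple enough to sketch (a minimal pure-Python version, assuming integer pixel values in `[0, levels)`; pixels ≤ the returned value form class 0):

```python
def otsu_threshold(values, levels=256):
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist = [0] * levels
    for v in values:
        hist[v] += 1
    total = len(values)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]          # class-0 pixel count
        if w0 == 0:
            continue
        w1 = total - w0        # class-1 pixel count
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Because Otsu maximizes between-class variance, it implicitly assumes a bimodal histogram with substantial classes on both sides — exactly the assumption the abstract says fails for unimodal rail images with a small defect proportion.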

  4. Chaotic Signal Denoising Based on Hierarchical Threshold Synchrosqueezed Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Bo; Jing, Yun-yu; Zhao, Yan-chao; Zhang, Lian-Hua; Wang, Xiang-Li

    2017-12-01

    To overcome the shortcomings of the single-threshold synchrosqueezed wavelet transform (SWT) denoising method, an adaptive hierarchical-threshold SWT chaotic-signal denoising method is proposed. First, a new SWT threshold function, twice continuously differentiable, is constructed based on Stein's unbiased risk estimate. Then, using the new threshold function, a thresholding process based on minimum mean square error is implemented, yielding the optimal estimate of each layer's threshold in SWT chaotic denoising. Experimental results on a simulated chaotic signal and on measured sunspot signals show that the proposed method filters the noise of a chaotic signal well and recovers the intrinsic chaotic characteristics of the original signal. Compared with the EEMD denoising method and the single-threshold SWT denoising method, the proposed method obtains better denoising results for chaotic signals.
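The paper's hierarchical thresholding is specific to the synchrosqueezed transform, but the underlying idea of wavelet-domain threshold denoising can be illustrated with a one-level Haar transform plus the standard soft-threshold function (a toy sketch, not the authors' method; the threshold value is an arbitrary assumption):

```python
import math

def soft(x, t):
    """Soft-thresholding: shrink a coefficient toward zero by t."""
    return math.copysign(max(abs(x) - t, 0.0), x)

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising: transform, soft-threshold the
    detail coefficients, inverse-transform. len(signal) must be even."""
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [soft(d, threshold) for d in detail]   # noise lives in the details
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s2, (a - d) / s2])
    return out
```

A hierarchical scheme like the paper's differs mainly in choosing a separate, risk-optimized threshold per decomposition level rather than one global value.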

  5. Self-Tuning Threshold Method for Real-Time Gait Phase Detection Based on Ground Contact Forces Using FSRs.

    PubMed

    Tang, Jing; Zheng, Jianbin; Wang, Yang; Yu, Lie; Zhan, Enqi; Song, Qiuzhi

    2018-02-06

    This paper presents a novel methodology for detecting the gait phase of human walking on level ground. The previous threshold method (TM) sets a threshold to divide the ground contact forces (GCFs) into on-ground and off-ground states. However, previous gait phase detection methods do not adapt to different people or different walking speeds. Therefore, this paper presents a self-tuning triple threshold algorithm (STTTA) that calculates adjustable thresholds to adapt to human walking. Two force-sensitive resistors (FSRs) were placed on the ball and heel to measure GCFs. Three thresholds (high, middle, and low) were used to track the maximum and minimum GCFs for the self-adjustment of the thresholds. The high threshold was the main threshold used to divide the GCFs into on-ground and off-ground states. The gait phases were then obtained through the gait phase detection algorithm (GPDA), which provides the rules underlying the STTTA's calculations. Finally, STTTA reliability was determined by comparing its results against the Mariani method, referenced as the timing analysis module (TAM), and the Lopez-Meyer method. Experimental results show that the proposed method can detect gait phases in real time and achieves high reliability compared with previous methods in the literature. In addition, the proposed method exhibits strong adaptability to different wearers walking at different speeds.
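A minimal sketch of threshold-with-hysteresis gait detection in the spirit of the STTTA, where the on/off cut-offs are derived from the observed GCF range rather than fixed a priori (the fractions 0.3 and 0.1 are illustrative assumptions, not the paper's tuned values):

```python
def gait_contact(gcf, high_frac=0.3, low_frac=0.1):
    """Classify each ground-contact-force sample as on-ground (True) or
    off-ground (False) using hysteresis thresholds derived from the
    signal's own range, so the cut-offs adapt to the wearer."""
    lo_v, hi_v = min(gcf), max(gcf)
    high = lo_v + high_frac * (hi_v - lo_v)   # must exceed this to go on-ground
    low = lo_v + low_frac * (hi_v - lo_v)     # must drop below this to go off-ground
    state, out = False, []
    for f in gcf:
        if not state and f >= high:
            state = True
        elif state and f <= low:
            state = False
        out.append(state)
    return out
```

Using two separated thresholds (hysteresis) prevents rapid on/off chattering when the force hovers near a single cut-off; the paper's third (middle) threshold additionally drives the online re-estimation of the extremes.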

  6. Comparison of image segmentation of lungs using methods: connected threshold, neighborhood connected, and threshold level set segmentation

    NASA Astrophysics Data System (ADS)

    Amanda, A. R.; Widita, R.

    2016-03-01

    The aim of this research is to compare several lung image segmentation methods based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR)). The methods compared were connected threshold, neighborhood connected, and threshold level set segmentation applied to lung images. All three methods require one important parameter, i.e., the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images was InsightToolkit-4.7.0 (ITK). Five lung images were analyzed, and the results were compared using the performance evaluation parameters computed in MATLAB. A segmentation method is considered good if it yields the smallest MSE and the highest PSNR. The results show that connected threshold performed best on four of the sample images, while threshold level set segmentation performed best on one. It can therefore be concluded that the connected threshold method is better than the other two methods for these cases.
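The two evaluation parameters used above are straightforward to compute; a minimal sketch, assuming images flattened to equal-length sequences with an 8-bit peak value of 255:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means a closer match."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10.0 * math.log10(peak ** 2 / e)
```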

  7. Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds

    PubMed Central

    Deeks, J.J.; Martin, E.C.; Riley, R.D.

    2017-01-01

    Introduction For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
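Rubin's rules, used above to combine the multiply imputed results, pool a set of per-imputation estimates and within-imputation variances as follows (a generic sketch, not tied to the MIDC implementation):

```python
def rubin_pool(estimates, variances):
    """Combine m imputed estimates with Rubin's rules: pooled estimate,
    and total variance = within-imputation + (1 + 1/m) * between-imputation."""
    m = len(estimates)
    qbar = sum(estimates) / m
    within = sum(variances) / m
    between = sum((q - qbar) ** 2 for q in estimates) / (m - 1)
    total = within + (1.0 + 1.0 / m) * between
    return qbar, total
```

The between-imputation term is what inflates the pooled standard errors, which is why the abstract notes MIDC's slightly larger standard errors and better coverage than single imputation.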

  8. Methods of Muscle Activation Onset Timing Recorded During Spinal Manipulation.

    PubMed

    Currie, Stuart J; Myers, Casey A; Krishnamurthy, Ashok; Enebo, Brian A; Davidson, Bradley S

    2016-05-01

    The purpose of this study was to determine electromyographic threshold parameters that most reliably characterize the muscular response to spinal manipulation and compare 2 methods that detect muscle activity onset delay: the double-threshold method and cross-correlation method. Surface and indwelling electromyography were recorded during lumbar side-lying manipulations in 17 asymptomatic participants. Muscle activity onset delays in relation to the thrusting force were compared across methods and muscles using a generalized linear model. The threshold combinations that resulted in the lowest Detection Failures were the "8 SD-0 milliseconds" threshold (Detection Failures = 8) and the "8 SD-10 milliseconds" threshold (Detection Failures = 9). The average muscle activity onset delay for the double-threshold method across all participants was 149 ± 152 milliseconds for the multifidus and 252 ± 204 milliseconds for the erector spinae. The average onset delay for the cross-correlation method was 26 ± 101 for the multifidus and 67 ± 116 for the erector spinae. There were no statistical interactions, and a main effect of method demonstrated that the delays were higher when using the double-threshold method compared with cross-correlation. The threshold parameters that best characterized activity onset delays were an 8-SD amplitude and a 10-millisecond duration threshold. The double-threshold method correlated well with visual supervision of muscle activity. The cross-correlation method provides several advantages in signal processing; however, supervision was required for some results, negating this advantage. These results help standardize methods when recording neuromuscular responses of spinal manipulation and improve comparisons within and across investigations. Copyright © 2016 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.
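The double-threshold idea — an amplitude threshold in baseline standard deviations plus a minimum-duration threshold — can be sketched as below. This is illustrative only: the 8 SD and 10-sample values echo the paper's "8 SD-10 milliseconds" setting, but the baseline window length is an assumption:

```python
def onset_index(emg, baseline_n=50, k=8.0, min_run=10):
    """Double-threshold onset detection: first sample where the EMG stays
    above (baseline mean + k * baseline SD) for at least min_run samples."""
    base = emg[:baseline_n]
    mu = sum(base) / baseline_n
    sd = (sum((x - mu) ** 2 for x in base) / baseline_n) ** 0.5
    amp = mu + k * sd          # amplitude threshold
    run = 0
    for i, x in enumerate(emg):
        run = run + 1 if x > amp else 0
        if run >= min_run:     # duration threshold satisfied
            return i - min_run + 1
    return None
```

Requiring a sustained run above the amplitude threshold is what suppresses single-sample noise spikes that would fool a single-threshold detector.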

  9. Bayesian methods for estimating GEBVs of threshold traits

    PubMed Central

    Wang, C-L; Ding, X-D; Wang, J-Y; Liu, J-F; Fu, W-X; Zhang, Z; Yin, Z-J; Zhang, Q

    2013-01-01

    Estimation of genomic breeding values is the key step in genomic selection (GS). Many methods have been proposed for continuous traits, but methods for threshold traits are still scarce. Here we introduced the threshold model into the framework of GS; specifically, we extended three Bayesian methods (BayesA, BayesB, and BayesCπ) on the basis of the threshold model for estimating genomic breeding values of threshold traits, terming the extended methods BayesTA, BayesTB, and BayesTCπ. Computing procedures for the three BayesT methods using a Markov chain Monte Carlo algorithm were derived. A simulation study was performed to investigate the benefit of the presented methods in the accuracy of genomic estimated breeding values (GEBVs) for threshold traits, and factors affecting the performance of the three BayesT methods were addressed. As expected, the three BayesT methods generally performed better than the corresponding normal Bayesian methods, in particular when the number of phenotypic categories was small. In the standard scenario (number of categories=2, incidence=30%, number of quantitative trait loci=50, h2=0.3), the accuracies were improved by 30.4, 2.4, and 5.7 percentage points, respectively. In most scenarios, BayesTB and BayesTCπ generated similar accuracies, and both performed better than BayesTA. In conclusion, our work showed that the threshold model fits well for predicting GEBVs of threshold traits, and BayesTCπ is the method of choice for GS of threshold traits. PMID:23149458

  10. Wavelet-based adaptive thresholding method for image segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Zikuan; Tao, Yang; Chen, Xin; Griffis, Carl

    2001-05-01

    A nonuniform background distribution may cause a global thresholding method to fail to segment objects. One solution is using a local thresholding method that adapts to local surroundings. In this paper, we propose a novel local thresholding method for image segmentation, using multiscale threshold functions obtained by wavelet synthesis with weighted detail coefficients. In particular, the coarse-to-fine synthesis with attenuated detail coefficients produces a threshold function corresponding to a high-frequency-reduced signal. This wavelet-based local thresholding method adapts to both local size and local surroundings, and its implementation can take advantage of the fast wavelet algorithm. We applied this technique to physical contaminant detection for poultry meat inspection using x-ray imaging. Experiments showed that inclusion objects in deboned poultry could be extracted at multiple resolutions despite their irregular sizes and uneven backgrounds.

  11. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract targets from complex backgrounds more quickly and accurately, and to further improve defect detection, a dual-threshold segmentation method using Arimoto entropy based on chaotic bee colony optimization is proposed. First, single-threshold selection based on Arimoto entropy was extended to dual-threshold selection so as to separate the target from the background more accurately. Then, the intermediate variables in the Arimoto entropy dual-threshold selection formulae were calculated recursively, eliminating redundant computation and reducing the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved with a chaotic sequence based on the tent map, enabling a fast search for the two optimal thresholds and substantially accelerating the optimization. Extensive experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation, and multi-threshold segmentation using reciprocal gray entropy, the proposed method segments the target more quickly and accurately, with superior segmentation quality. It proves to be a fast and effective method for image segmentation.

  12. Proposal on Calculation of Ventilation Threshold Using Non-contact Respiration Measurement with Pattern Light Projection

    NASA Astrophysics Data System (ADS)

    Aoki, Hirooki; Ichimura, Shiro; Fujiwara, Toyoki; Kiyooka, Satoru; Koshiji, Kohji; Tsuzuki, Keishi; Nakamura, Hidetoshi; Fujimoto, Hideo

    We propose a method of calculating the ventilation threshold using non-contact respiration measurement with dot-matrix pattern light projection during pedaling exercise. The validity and effectiveness of the proposed method were examined by simultaneous measurement with an expired gas analyzer. The experimental results showed a correlation between the quasi ventilation thresholds calculated by the proposed method and the ventilation thresholds calculated by the expired gas analyzer. This result indicates the possibility of non-contact measurement of the ventilation threshold by the proposed method.

  13. Optimal thresholds for the estimation of area rain-rate moments by the threshold method

    NASA Technical Reports Server (NTRS)

    Short, David A.; Shimizu, Kunio; Kedem, Benjamin

    1993-01-01

    Optimization of the threshold method, achieved by determination of the threshold that maximizes the correlation between an area-average rain-rate moment and the area coverage of rain rates exceeding the threshold, is demonstrated empirically and theoretically. Empirical results for a sequence of GATE radar snapshots show optimal thresholds of 5 and 27 mm/h for the first and second moments, respectively. Theoretical optimization of the threshold method by the maximum-likelihood approach of Kedem and Pavlopoulos (1991) predicts optimal thresholds near 5 and 26 mm/h for lognormally distributed rain rates with GATE-like parameters. The agreement between theory and observations suggests that the optimal threshold can be understood as arising due to sampling variations, from snapshot to snapshot, of a parent rain-rate distribution. Optimal thresholds for gamma and inverse Gaussian distributions are also derived and compared.
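The empirical optimization described above — scanning candidate thresholds and keeping the one whose fractional area coverage correlates best with the area-average rain rate across snapshots — can be sketched as follows (a simplified illustration; snapshots are flat lists of rain rates):

```python
def best_threshold(snapshots, thresholds):
    """Scan candidate thresholds; return the one whose fractional coverage
    (share of rain rates above the threshold) correlates best, across
    snapshots, with the area-average rain rate."""
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    means = [sum(s) / len(s) for s in snapshots]
    best_t, best_r = None, -2.0
    for t in thresholds:
        cover = [sum(1 for rate in s if rate > t) / len(s) for s in snapshots]
        if len(set(cover)) == 1:  # no variation: correlation undefined
            continue
        r = corr(cover, means)
        if r > best_r:
            best_t, best_r = t, r
    return best_t, best_r
```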

  14. Thermal detection thresholds in 5-year-old preterm born children; IQ does matter.

    PubMed

    de Graaf, Joke; Valkenburg, Abraham J; Tibboel, Dick; van Dijk, Monique

    2012-07-01

    Experiencing pain as a newborn may have consequences for somatosensory perception later in life. Children's perception of cold and warm stimuli may be determined with the Thermal Sensory Analyzer (TSA) device by two different methods. This pilot study in 5-year-old children born preterm aimed to establish whether the TSA method of limits, which depends on reaction time, and the method of levels, which is independent of reaction time, would yield different cold and warm detection thresholds. The second aim was to establish possible associations between intellectual ability and the detection thresholds obtained with either method. A convenience sample was drawn from the participants in an ongoing 5-year follow-up study of a randomized controlled trial on the effects of morphine during mechanical ventilation. Thresholds were assessed using both methods and statistically compared, and possible associations between the child's intelligence quotient (IQ) and threshold levels were analyzed. The method of levels yielded more sensitive thresholds than the method of limits, i.e. mean (SD) cold detection thresholds of 30.3 (1.4) versus 28.4 (1.7) (Cohen's d=1.2, P=0.001) and warm detection thresholds of 33.9 (1.9) versus 35.6 (2.1) (Cohen's d=0.8, P=0.04). IQ was statistically significantly associated only with the detection thresholds obtained with the method of limits (cold: r=0.64, warm: r=-0.52). The TSA method of levels is to be preferred over the method of limits in 5-year-old preterm born children, as it establishes more sensitive detection thresholds and is independent of IQ. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Error minimization algorithm for comparative quantitative PCR analysis: Q-Anal.

    PubMed

    O'Connor, William; Runquist, Elizabeth A

    2008-07-01

    Current methods for comparative quantitative polymerase chain reaction (qPCR) analysis, the threshold and extrapolation methods, either make assumptions about PCR efficiency that require an arbitrary threshold selection process or extrapolate to estimate relative levels of messenger RNA (mRNA) transcripts. Here we describe an algorithm, Q-Anal, that blends elements from current methods to bypass assumptions regarding PCR efficiency and improve the threshold selection process to minimize error in comparative qPCR analysis. This algorithm uses iterative linear regression to identify the exponential phase for both target and reference amplicons and then selects, by minimizing linear regression error, a fluorescence threshold where efficiencies for both amplicons have been defined. From this defined fluorescence threshold, the cycle time (Ct) and the error for both amplicons are calculated and used to determine the expression ratio. Ratios in complementary DNA (cDNA) dilution assays from qPCR data were analyzed by the Q-Anal method and compared with the threshold method and an extrapolation method. Dilution ratios determined by the Q-Anal and threshold methods were 86 to 118% of the expected cDNA ratios, but relative errors for the Q-Anal method were 4 to 10%, compared with 4 to 34% for the threshold method. In contrast, ratios determined by the extrapolation method were 32 to 242% of the expected cDNA ratios, with relative errors of 67 to 193%. Q-Anal will be a valuable and quick method for minimizing error in comparative qPCR analysis.
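Q-Anal's regression machinery is not reproduced here, but the basic step shared by threshold-based qPCR methods — reading a fractional Ct off an amplification curve at a fixed fluorescence threshold — can be sketched with log-linear interpolation (an illustrative simplification, assuming one fluorescence reading per cycle):

```python
import math

def cycle_threshold(fluor, threshold):
    """Fractional cycle at which fluorescence first crosses the threshold,
    by log-linear interpolation between the two bracketing cycles
    (exponential amplification is linear in log space)."""
    for i in range(1, len(fluor)):
        if fluor[i - 1] < threshold <= fluor[i]:
            lo, hi = math.log(fluor[i - 1]), math.log(fluor[i])
            return (i - 1) + (math.log(threshold) - lo) / (hi - lo)
    return None  # threshold never crossed
```

Q-Anal's contribution is choosing *where* to place that fluorescence threshold — inside the regression-identified exponential phase of both amplicons — rather than leaving the choice arbitrary.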

  16. Evaluation of different methods for determining growing degree-day thresholds in apricot cultivars

    NASA Astrophysics Data System (ADS)

    Ruml, Mirjana; Vuković, Ana; Milatović, Dragan

    2010-07-01

    The aim of this study was to examine different methods for determining growing degree-day (GDD) threshold temperatures for two phenological stages (full bloom and harvest) and to select the optimal thresholds for a number of apricot ( Prunus armeniaca L.) cultivars grown in the Belgrade region. A 10-year data series was used to conduct the study. Several commonly used methods for determining threshold temperatures from field observations were evaluated: (1) the least standard deviation in GDD; (2) the least standard deviation in days; (3) the least coefficient of variation in GDD; (4) the regression coefficient; (5) the least standard deviation in days with a mean temperature above the threshold; (6) the least coefficient of variation in days with a mean temperature above the threshold; and (7) the smallest root mean square error between the observed and predicted number of days. In addition, two methods for calculating daily GDD and two methods for calculating daily mean air temperature were tested, to emphasize the differences that can arise from different interpretations of the basic GDD equation. The best agreement with observations was attained by method (7). The lower threshold temperature obtained by this method differed among cultivars from -5.6 to -1.7°C for full bloom, and from -0.5 to 6.6°C for harvest. However, the “Null” method (lower threshold set to 0°C) and the “Fixed Value” method (lower threshold set to -2°C for full bloom and to 3°C for harvest) also gave very good results. The limitations of the widely used method (1) and of methods (5) and (6), which generally performed worst, are discussed in the paper.
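The GDD quantity underlying all of these threshold-fitting methods is simple to compute; a minimal sketch using one common formulation, the (Tmin+Tmax)/2 daily mean above a lower base threshold, floored at zero (the abstract notes the paper compares several variants of this basic equation):

```python
def gdd(tmin, tmax, base):
    """Daily growing degree-days: daily mean temperature above the base
    threshold, never negative."""
    return max((tmin + tmax) / 2.0 - base, 0.0)

def gdd_sum(daily, base):
    """Accumulate GDD over a season from (tmin, tmax) pairs."""
    return sum(gdd(lo, hi, base) for lo, hi in daily)
```

Threshold-fitting methods like those compared above then search over `base` values, scoring each by how consistently the accumulated GDD at a phenological stage repeats across years.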

  17. Anaerobic Threshold and Salivary α-amylase during Incremental Exercise.

    PubMed

    Akizuki, Kazunori; Yazaki, Syouichirou; Echizenya, Yuki; Ohashi, Yukari

    2014-07-01

    [Purpose] The purpose of this study was to clarify the validity of salivary α-amylase as a method of quickly estimating anaerobic threshold and to establish the relationship between salivary α-amylase and double-product breakpoint in order to create a way to adjust exercise intensity to a safe and effective range. [Subjects and Methods] Eleven healthy young adults performed an incremental exercise test using a cycle ergometer. During the incremental exercise test, oxygen consumption, carbon dioxide production, and ventilatory equivalent were measured using a breath-by-breath gas analyzer. Systolic blood pressure and heart rate were measured to calculate the double product, from which double-product breakpoint was determined. Salivary α-amylase was measured to calculate the salivary threshold. [Results] One-way ANOVA revealed no significant differences among workloads at the anaerobic threshold, double-product breakpoint, and salivary threshold. Significant correlations were found between anaerobic threshold and salivary threshold and between anaerobic threshold and double-product breakpoint. [Conclusion] As a method for estimating anaerobic threshold, salivary threshold was as good as or better than determination of double-product breakpoint because the correlation between anaerobic threshold and salivary threshold was higher than the correlation between anaerobic threshold and double-product breakpoint. Therefore, salivary threshold is a useful index of anaerobic threshold during an incremental workload.

  18. An integrative perspective of the anaerobic threshold.

    PubMed

    Sales, Marcelo Magalhães; Sousa, Caio Victor; da Silva Aguiar, Samuel; Knechtle, Beat; Nikolaidis, Pantelis Theodoros; Alves, Polissandro Mortoza; Simões, Herbert Gustavo

    2017-12-14

    The concept of anaerobic threshold (AT) was introduced during the nineteen sixties. Since then, several methods to identify the anaerobic threshold (AT) have been studied and suggested as novel 'thresholds' based upon the variable used for its detection (i.e. lactate threshold, ventilatory threshold, glucose threshold). These different techniques have brought some confusion about how we should name this parameter, for instance, anaerobic threshold or the physiological measure used (i.e. lactate, ventilation). On the other hand, the modernization of scientific methods and apparatus to detect AT, as well as the body of literature formed in the past decades, could provide a more cohesive understanding over the AT and the multiple physiological systems involved. Thus, the purpose of this review was to provide an integrative perspective of the methods to determine AT. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Reliability of TMS phosphene threshold estimation: Toward a standardized protocol.

    PubMed

    Mazzi, Chiara; Savazzi, Silvia; Abrahamyan, Arman; Ruzzoli, Manuela

    Phosphenes induced by transcranial magnetic stimulation (TMS) are a subjectively described visual phenomenon employed in basic and clinical research as an index of the excitability of retinotopically organized areas in the brain. Phosphene threshold (PT) estimation is a preliminary step in many TMS experiments in visual cognition for setting the appropriate TMS dose; however, the lack of a direct comparison of the available methods for PT estimation leaves their reliability for this purpose unresolved. The present work aims to fill this gap. We compared the most common methods for PT calculation, namely the Method of Constant Stimuli (MOCS), the Modified Binary Search (MOBS), and the Rapid Estimation of Phosphene Threshold (REPT). In two experiments we tested the reliability of PT estimation under each of the three methods, considering the day of administration, participants' expertise in phosphene perception, and the sensitivity of each method to the initial values used for the threshold calculation. We found that MOCS and REPT have comparable reliability when estimating phosphene thresholds, while MOBS estimates appear less stable. Based on our results, researchers and clinicians can estimate phosphene thresholds equally reliably with MOCS or REPT, depending on their specific investigation goals. We suggest several important factors to consider when calculating phosphene thresholds and describe strategies to adopt in experimental procedures. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. A threshold selection method based on edge preserving

    NASA Astrophysics Data System (ADS)

    Lou, Liantang; Dan, Wei; Chen, Jiaqi

    2015-12-01

    A method of automatic threshold selection for image segmentation is presented. An optimal threshold is selected so as to preserve image edges during segmentation. The shortcomings of Otsu's method, which is based on gray-level histograms, are analyzed. The edge energy function of a bivariate continuous function is expressed as a line integral, and the edge energy function of an image is approximated by discretizing that integral. An optimal thresholding method that maximizes the edge energy function is given. Several experimental results are also presented and compared with Otsu's method.

  21. Detecting wood surface defects with fusion algorithm of visual saliency and local threshold segmentation

    NASA Astrophysics Data System (ADS)

    Wang, Xuejuan; Wu, Shuhang; Liu, Yunpeng

    2018-04-01

    This paper presents a new method for wood defect detection that solves the over-segmentation problem of local threshold segmentation methods by effectively combining visual saliency with local threshold segmentation. First, defect areas are coarsely located by using the spectral residual method to calculate their global visual saliency. Then, threshold segmentation by the maximum inter-class variance (Otsu) method is applied around the coarsely located areas to position and segment the wood surface defects precisely. Finally, mathematical morphology is used to process the binary images after segmentation, reducing noise and small false objects. Experiments on test images of insect holes, dead knots, and sound knots show that the proposed method obtains good segmentation results and is superior to existing segmentation methods based on edge detection, Otsu thresholding, and plain threshold segmentation.

  2. California sea lion (Zalophus californianus) aerial hearing sensitivity measured using auditory steady-state response and psychophysical methods.

    PubMed

    Mulsow, Jason; Finneran, James J; Houser, Dorian S

    2011-04-01

    Although electrophysiological methods of measuring the hearing sensitivity of pinnipeds are not yet as refined as those for dolphins and porpoises, they appear to be a promising supplement to traditional psychophysical procedures. In order to further standardize electrophysiological methods with pinnipeds, a within-subject comparison of psychophysical and auditory steady-state response (ASSR) measures of aerial hearing sensitivity was conducted with a 1.5-yr-old California sea lion. The psychophysical audiogram was similar to those previously reported for otariids, with a U-shape, and thresholds near 10 dB re 20 μPa at 8 and 16 kHz. ASSR thresholds measured using both single and multiple simultaneous amplitude-modulated tones closely reproduced the psychophysical audiogram, although the mean ASSR thresholds were elevated relative to psychophysical thresholds. Differences between psychophysical and ASSR thresholds were greatest at the low- and high-frequency ends of the audiogram. Thresholds measured using the multiple ASSR method were not different from those measured using the single ASSR method. The multiple ASSR method was more rapid than the single ASSR method, and allowed for threshold measurements at seven frequencies in less than 20 min. The multiple ASSR method may be especially advantageous for hearing sensitivity measurements with otariid subjects that are untrained for psychophysical procedures.

  3. How to determine an optimal threshold to classify real-time crash-prone traffic conditions?

    PubMed

    Yang, Kui; Yu, Rongjie; Wang, Xuesong; Quddus, Mohammed; Xue, Lifang

    2018-08-01

    One of the proactive approaches to reducing traffic crashes is to identify hazardous traffic conditions that may lead to a crash, known as real-time crash prediction. Threshold selection is an essential step in real-time crash prediction: after a crash risk evaluation model has produced the probability of a crash occurring under a given traffic condition, the threshold provides the cut-off point on this posterior probability that separates potential crash warnings from normal traffic conditions. There is, however, a dearth of research on how to determine an optimal threshold effectively; the few studies that discuss the predictive performance of such models chose thresholds subjectively, and subjective methods cannot automatically identify optimal thresholds under different traffic and weather conditions in real applications. A theoretical method for selecting the threshold value is therefore needed to avoid subjective judgments. The purpose of this study is to provide such a method for automatically identifying the optimal threshold. Considering the random effects of variable factors across roadway segments, a mixed logit model was used to develop the crash risk evaluation model and evaluate the crash risk. Cross-entropy, between-class variance and other theories were investigated to identify the optimal threshold empirically, and K-fold cross-validation was used to validate the performance of the proposed threshold selection methods against several evaluation criteria. The results indicate that (i) the mixed logit model obtains good performance, and (ii) the classification performance of the threshold selected by the minimum cross-entropy method outperforms the other methods according to the criteria. This method can automatically identify thresholds in crash prediction by minimizing the cross-entropy between the original dataset, with its continuous crash probabilities, and the binarized dataset obtained after applying the threshold to separate potential crash warnings from normal traffic conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
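
    A minimum cross-entropy threshold selection of the kind described can be sketched as below (a Li-and-Lee-style criterion applied to predicted crash probabilities; the paper's exact formulation may differ, and the data here are synthetic):

```python
import numpy as np

def min_cross_entropy_threshold(p, n_candidates=200):
    """Cut-off on predicted crash probabilities chosen by minimizing a
    Li-and-Lee-style cross-entropy criterion between the values and their
    two class means."""
    p = np.asarray(p, float)
    candidates = np.linspace(p.min() + 1e-6, p.max() - 1e-6, n_candidates)
    best_t, best_eta = None, np.inf
    for t in candidates:
        lo, hi = p[p < t], p[p >= t]
        if lo.size == 0 or hi.size == 0:
            continue
        # criterion is minimized when both classes are homogeneous
        eta = -lo.sum() * np.log(lo.mean()) - hi.sum() * np.log(hi.mean())
        if eta < best_eta:
            best_t, best_eta = float(t), float(eta)
    return best_t

# Toy posterior probabilities: a large "normal" cluster near 0.05 and a
# smaller "crash-prone" cluster near 0.6; the cut-off should fall between.
rng = np.random.default_rng(1)
probs = np.concatenate([rng.uniform(0.01, 0.10, 900),
                        rng.uniform(0.50, 0.70, 100)])
cutoff = min_cross_entropy_threshold(probs)
```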

  4. Experimental and environmental factors affect spurious detection of ecological thresholds

    USGS Publications Warehouse

    Daily, Jonathan P.; Hitt, Nathaniel P.; Smith, David; Snyder, Craig D.

    2012-01-01

    Threshold detection methods are increasingly popular for assessing nonlinear responses to environmental change, but their statistical performance remains poorly understood. We simulated linear change in stream benthic macroinvertebrate communities and evaluated the performance of commonly used threshold detection methods based on model fitting (piecewise quantile regression [PQR]), data partitioning (nonparametric change point analysis [NCPA]), and a hybrid approach (significant zero crossings [SiZer]). We demonstrated that false detection of ecological thresholds (type I errors) and inferences on threshold locations are influenced by sample size, rate of linear change, and frequency of observations across the environmental gradient (i.e., sample-environment distribution, SED). However, the relative importance of these factors varied among statistical methods and between inference types. False detection rates were influenced primarily by user-selected parameters for PQR (τ) and SiZer (bandwidth) and secondarily by sample size (for PQR) and SED (for SiZer). In contrast, the location of reported thresholds was influenced primarily by SED. Bootstrapped confidence intervals for NCPA threshold locations revealed strong correspondence to SED. We conclude that the choice of statistical methods for threshold detection should be matched to experimental and environmental constraints to minimize false detection rates and avoid spurious inferences regarding threshold location.

  5. Peaks Over Threshold (POT): A methodology for automatic threshold estimation using goodness of fit p-value

    NASA Astrophysics Data System (ADS)

    Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.

    2017-04-01

    Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
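
    The core loop of such an automatic threshold search can be sketched as follows (a simplified illustration: method-of-moments GPD fitting and a fixed Anderson-Darling cut-off stand in for the paper's goodness-of-fit p-value procedure):

```python
import numpy as np

def gpd_fit_moments(excesses):
    """Method-of-moments fit of a Generalized Pareto (GPD) to excesses."""
    m, v = excesses.mean(), excesses.var()
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

def gpd_cdf(x, xi, sigma):
    if abs(xi) < 1e-9:
        return 1.0 - np.exp(-x / sigma)
    return 1.0 - np.maximum(1.0 + xi * x / sigma, 1e-12) ** (-1.0 / xi)

def anderson_darling(excesses, xi, sigma):
    """A^2 EDF-statistic of the excesses against the fitted GPD."""
    z = np.clip(gpd_cdf(np.sort(excesses), xi, sigma), 1e-9, 1 - 1e-9)
    n = z.size
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(z) + np.log(1.0 - z[::-1])))

def select_threshold(data, candidates, a2_max=2.0, min_exceedances=30):
    """Lowest candidate threshold whose excesses pass the A^2 screen."""
    for u in sorted(candidates):
        exc = data[data > u] - u
        if exc.size < min_exceedances:
            break
        xi, sigma = gpd_fit_moments(exc)
        if anderson_darling(exc, xi, sigma) < a2_max:
            return u
    return None

# Exponential data: excesses over any threshold are exactly GPD (xi = 0),
# so a low candidate should already be accepted.
rng = np.random.default_rng(2)
flows = rng.exponential(scale=2.0, size=2000)
u_star = select_threshold(flows, np.quantile(flows, [0.5, 0.7, 0.9]))
```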

  6. Verification of the tumor volume delineation method using a fixed threshold of peak standardized uptake value.

    PubMed

    Koyama, Kazuya; Mitsumoto, Takuya; Shiraishi, Takahiro; Tsuda, Keisuke; Nishiyama, Atsushi; Inoue, Kazumasa; Yoshikawa, Kyosan; Hatano, Kazuo; Kubota, Kazuo; Fukushi, Masahiro

    2017-09-01

    We aimed to determine the difference in tumor volume associated with the reconstruction model in positron-emission tomography (PET). To reduce the influence of the reconstruction model, we suggested a method to measure the tumor volume using the relative threshold method with a fixed threshold based on the peak standardized uptake value (SUVpeak). The efficacy of our method was verified using 18F-2-fluoro-2-deoxy-D-glucose PET/computed tomography images of 20 patients with lung cancer. The tumor volume was determined using the relative threshold method with a fixed threshold based on the SUVpeak. The PET data were reconstructed using the ordered-subset expectation maximization (OSEM) model, the OSEM + time-of-flight (TOF) model, and the OSEM + TOF + point-spread function (PSF) model. The volume differences associated with the reconstruction algorithm (%VD) were compared. For comparison, the tumor volume was measured using the relative threshold method based on the maximum SUV (SUVmax). For the OSEM and TOF models, the mean %VD values were -0.06 ± 8.07 and -2.04 ± 4.23% for the fixed 40% threshold according to the SUVmax and the SUVpeak, respectively. The effect of our method in this case seemed to be minor. For the OSEM and PSF models, the mean %VD values were -20.41 ± 14.47 and -13.87 ± 6.59% for the fixed 40% threshold according to the SUVmax and SUVpeak, respectively. Our new method enabled the measurement of tumor volume with a fixed threshold and reduced the influence of the changes in tumor volume associated with the reconstruction model.
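
    A fixed-threshold delineation of this kind can be sketched as below (a minimal illustration on synthetic data; a small cubic kernel stands in for the usual ~1 mL spherical SUVpeak region, and the 40% level is taken from the abstract):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def tumor_volume_fixed_peak(suv, voxel_ml, frac=0.40, peak_kernel=3):
    """Delineate voxels at or above frac * SUVpeak, where SUVpeak is the
    maximum local-mean uptake over a small neighborhood."""
    local_mean = uniform_filter(np.asarray(suv, float), size=peak_kernel)
    suv_peak = float(local_mean.max())
    mask = suv >= frac * suv_peak
    return mask.sum() * voxel_ml, suv_peak

# Synthetic uptake volume: background SUV 1.0, spherical "tumor" of SUV 8.0
# and radius 5 voxels, on a 4 mm grid (voxel volume 0.064 mL).
z, y, x = np.mgrid[:40, :40, :40]
r = np.sqrt((z - 20.0) ** 2 + (y - 20.0) ** 2 + (x - 20.0) ** 2)
suv = np.where(r <= 5.0, 8.0, 1.0)
volume_ml, suv_peak = tumor_volume_fixed_peak(suv, voxel_ml=0.064)
```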

  7. Comparison of software and human observers in reading images of the CDMAM test object to assess digital mammography systems

    NASA Astrophysics Data System (ADS)

    Young, Kenneth C.; Cook, James J. H.; Oduko, Jennifer M.; Bosmans, Hilde

    2006-03-01

    European Guidelines for quality control in digital mammography specify minimum and achievable standards of image quality in terms of threshold contrast, based on readings of images of the CDMAM test object by human observers. However, this is time-consuming and has large inter-observer error. To overcome these problems, a software program (CDCOM) is available to automatically read CDMAM images, but the optimal method of interpreting the output is not defined. This study evaluates methods of determining threshold contrast from the program, and compares these to human readings for a variety of mammography systems. The methods considered are (A) simple thresholding, (B) psychometric curve fitting, (C) smoothing and interpolation, and (D) smoothing and psychometric curve fitting. Each method leads to similar threshold contrasts but with different reproducibility. Method (A) had relatively poor reproducibility, with a standard error in threshold contrast of 18.1 ± 0.7%. This was reduced to 8.4% by using a contrast-detail curve fitting procedure. Method (D) had the best reproducibility, with an error of 6.7%, reducing to 5.1% with curve fitting. A panel of 3 human observers had an error of 4.4%, reduced to 2.9% by curve fitting. All automatic methods led to threshold contrasts that were lower than for humans. The ratio of human to program threshold contrasts varied with detail diameter and was 1.50 ± 0.04 (SEM) at 0.1 mm and 1.82 ± 0.06 at 0.25 mm for method (D). There were good correlations between the threshold contrast determined by humans and the automated methods.
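
    The psychometric-curve-fitting step (methods B and D) can be sketched as follows (a hedged illustration: a logistic curve with a 4-AFC guess rate fitted by grid search; the CDCOM scoring details are not reproduced, and the data are simulated):

```python
import numpy as np

def fit_threshold_contrast(contrasts, p_correct, guess=0.25):
    """Least-squares grid fit of a logistic psychometric curve
    p(c) = guess + (1 - guess) / (1 + exp(-(c - ct) / s)).
    With a 4-AFC guess rate of 0.25, the midpoint ct is the contrast at
    62.5% correct, a common threshold definition."""
    c = np.asarray(contrasts, float)
    p = np.asarray(p_correct, float)
    cts = np.linspace(c.min(), c.max(), 200)            # candidate thresholds
    ss = np.linspace(0.01 * np.ptp(c), np.ptp(c), 100)  # candidate slopes
    pred = guess + (1.0 - guess) / (1.0 + np.exp(
        -(c[None, None, :] - cts[:, None, None]) / ss[None, :, None]))
    sse = ((pred - p) ** 2).sum(axis=2)
    i, _ = np.unravel_index(sse.argmin(), sse.shape)
    return float(cts[i])

# Simulated fractions of correct detections rising around contrast ~0.10.
c = np.array([0.02, 0.05, 0.08, 0.11, 0.14, 0.20])
p = np.array([0.26, 0.30, 0.45, 0.70, 0.88, 0.97])
threshold_contrast = fit_threshold_contrast(c, p)
```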

  8. Threshold selection for classification of MR brain images by clustering method

    NASA Astrophysics Data System (ADS)

    Moldovanu, Simona; Obreja, Cristian; Moraru, Luminita

    2015-12-01

    Given a grey-intensity image, our method detects the optimal threshold for a suitable binarization of MR brain images. In MR brain image processing, the grey levels of pixels belonging to the object are not substantially different from the grey levels belonging to the background. Threshold optimization is an effective tool to separate objects from the background and, further, in classification applications. This paper gives a detailed investigation of the selection of thresholds. Our method does not use a well-known binarization method; instead, we perform a simple threshold optimization which, in turn, allows the best classification of the analyzed images into healthy and multiple sclerosis groups. The dissimilarity (the distance between classes) was established using a clustering method based on dendrograms. We tested our method using two classes of images: 20 T2-weighted and 20 proton-density (PD)-weighted scans from two healthy subjects and two patients with multiple sclerosis. For each image and for each threshold, the number of white pixels (the area of white objects in the binary image) was determined; these pixel counts are the objects in the clustering operation. The following optimum threshold values were obtained: T = 80 for PD images and T = 30 for T2w images. Each threshold clearly separates the clusters belonging to the two studied groups, healthy subjects and patients with multiple sclerosis.
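
    The white-pixel-count clustering described above can be sketched as follows (a minimal illustration on synthetic images; hierarchical average linkage stands in for the paper's dendrogram analysis, and all names are ours):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def white_pixel_counts(images, threshold):
    """Binarize each image at `threshold` and count white (object) pixels."""
    return np.array([(img >= threshold).sum() for img in images], float)

def separate_two_groups(images, threshold):
    """Hierarchically cluster the scans into two groups from their
    white-pixel counts (average linkage, as in a dendrogram analysis)."""
    feats = white_pixel_counts(images, threshold).reshape(-1, 1)
    return fcluster(linkage(feats, method="average"), t=2, criterion="maxclust")

# Toy scans: "lesion" images carry an extra bright patch, so they binarize
# to clearly more white pixels than the "healthy" images.
rng = np.random.default_rng(3)
healthy = [rng.uniform(0, 100, (32, 32)) for _ in range(4)]
lesion = []
for _ in range(4):
    img = rng.uniform(0, 100, (32, 32))
    img[:10, :10] = 95.0           # bright region mimicking lesions
    lesion.append(img)
labels = separate_two_groups(healthy + lesion, threshold=80)
```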

  9. Influenza surveillance in Europe: establishing epidemic thresholds by the Moving Epidemic Method

    PubMed Central

    Vega, Tomás; Lozano, Jose Eugenio; Meerhoff, Tamara; Snacken, René; Mott, Joshua; Ortiz de Lejarazu, Raul; Nunes, Baltazar

    2012-01-01

    Please cite this paper as: Vega et al. (2012) Influenza surveillance in Europe: establishing epidemic thresholds by the moving epidemic method. Influenza and Other Respiratory Viruses 7(4), 546–558. Background  Timely influenza surveillance is important to monitor influenza epidemics. Objectives  (i) To calculate the epidemic threshold for influenza‐like illness (ILI) and acute respiratory infections (ARI) in 19 countries, as well as the thresholds for different levels of intensity. (ii) To evaluate the performance of these thresholds. Methods  The moving epidemic method (MEM) has been developed to determine the baseline influenza activity and an epidemic threshold. False alerts, detection lags and timeliness of the detection of epidemics were calculated. The performance was evaluated using a cross‐validation procedure. Results  The overall sensitivity of the MEM threshold was 71.8% and the specificity was 95.5%. The median of the timeliness was 1 week (range: 0-4.5). Conclusions  The method produced a robust and specific signal to detect influenza epidemics. The good balance between the sensitivity and specificity of the epidemic threshold to detect seasonal epidemics and avoid false alerts has advantages for public health purposes. This method may serve as a standard to define the start of the annual influenza epidemic in countries in Europe. PMID:22897919
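
    One commonly described form of the MEM epidemic threshold can be sketched as below (an arithmetic simplification, which is our assumption; MEM itself works with the highest pre-epidemic values per season and confidence limits, with details beyond this abstract):

```python
import numpy as np

def mem_epidemic_threshold(past_season_pre_epidemic_rates, n_top=30, z=1.645):
    """Epidemic-start threshold as the one-sided 95% upper confidence limit
    of the mean of the n_top highest pre-epidemic weekly rates pooled over
    past seasons (a simplified stand-in for the MEM threshold)."""
    pooled = np.sort(np.concatenate(past_season_pre_epidemic_rates))
    top = pooled[-n_top:]
    return top.mean() + z * top.std(ddof=1) / np.sqrt(top.size)

# Five past seasons of weekly ILI rates outside the epidemic period.
rng = np.random.default_rng(9)
seasons = [rng.normal(20.0, 5.0, 20) for _ in range(5)]
epidemic_threshold = mem_epidemic_threshold(seasons)
```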

  10. Comparison of alternatives to amplitude thresholding for onset detection of acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Bai, F.; Gagar, D.; Foote, P.; Zhao, Y.

    2017-02-01

    Acoustic Emission (AE) monitoring can be used to detect the presence of damage as well as determine its location in Structural Health Monitoring (SHM) applications. Information on the time difference of the signal generated by the damage event arriving at different sensors in an array is essential in performing localisation. Currently, this is determined using a fixed threshold, which is particularly prone to errors when not set to optimal values. This paper presents three new methods for determining the onset of AE signals without the need for a predetermined threshold. The performance of the techniques is evaluated using AE signals generated during fatigue crack growth and compared to the established Akaike Information Criterion (AIC) and fixed threshold methods. It was found that the 1D location accuracy of the new methods was within the range of <1-7.1% of the monitored region, compared to 2.7% for the AIC method and a range of 1.8-9.4% for the conventional fixed threshold method at different threshold levels.
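
    The established AIC picker used here as a baseline can be sketched as follows (a standard variance-partition formulation; the signal is synthetic, and the paper's three new methods are not reproduced):

```python
import numpy as np

def aic_onset(signal):
    """AIC onset picker: choose the split index k minimizing
    AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]))."""
    x = np.asarray(signal, float)
    n = x.size
    ks = np.arange(2, n - 2)           # leave room for both variance estimates
    aic = np.array([k * np.log(x[:k].var() + 1e-20)
                    + (n - k - 1) * np.log(x[k:].var() + 1e-20) for k in ks])
    return int(ks[np.argmin(aic)])

# Pre-trigger noise followed by a decaying burst starting at sample 300.
rng = np.random.default_rng(4)
wave = rng.normal(0.0, 0.01, 1000)
t = np.arange(700)
wave[300:] += np.sin(2 * np.pi * 0.05 * t) * np.exp(-t / 200.0)
onset = aic_onset(wave)
```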

  11. Evaluation of Maryland abutment scour equation through selected threshold velocity methods

    USGS Publications Warehouse

    Benedict, S.T.

    2010-01-01

    The U.S. Geological Survey, in cooperation with the Maryland State Highway Administration, used field measurements of scour to evaluate the sensitivity of the Maryland abutment scour equation to the critical (or threshold) velocity variable. Four selected methods for estimating threshold velocity were applied to the Maryland abutment scour equation, and the predicted scour was compared to the field measurements. Results indicated that the performance of the Maryland abutment scour equation was sensitive to the threshold velocity, with some threshold velocity methods producing better estimates of predicted scour than others. In addition, results indicated that regional stream characteristics can affect the performance of the Maryland abutment scour equation, with moderate-gradient streams performing differently from low-gradient streams. On the basis of the findings of the investigation, guidance for selecting threshold velocity methods for application to the Maryland abutment scour equation is provided, and limitations are noted.

  12. Subsurface characterization with localized ensemble Kalman filter employing adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Delijani, Ebrahim Biniaz; Pishvaie, Mahmoud Reza; Boozarjomehry, Ramin Bozorgmehry

    2014-07-01

    The ensemble Kalman filter (EnKF), a Monte Carlo sequential data assimilation method, has emerged as a promising tool for subsurface media characterization during the past decade. Due to the high computational cost of large ensembles, EnKF is limited to small ensemble sets in practice. This results in spurious correlations in the covariance structure, leading to incorrect updates or probable divergence of the updated realizations. In this paper, a universal/adaptive thresholding method is presented to remove and/or mitigate the spurious correlation problem in the forecast covariance matrix. This method is then extended to regularize the Kalman gain directly. Four different thresholding functions have been considered to threshold the forecast covariance and gain matrices: hard, soft, lasso and Smoothly Clipped Absolute Deviation (SCAD) functions. Three benchmarks are used to evaluate the performance of these methods: a small 1D linear model and two 2D water-flooding cases (in petroleum reservoirs) with different levels of heterogeneity/nonlinearity. Besides adaptive thresholding, standard distance-dependent localization and bootstrap Kalman gain are also implemented for comparison purposes. We assessed each setup with different ensemble sets to investigate the sensitivity of each method to ensemble size. The results indicate that thresholding of the forecast covariance yields more reliable performance than thresholding of the Kalman gain. Among the thresholding functions, SCAD is the most robust for both covariance and gain estimation. Our analyses emphasize that not all assimilation cycles require thresholding, and that it should be performed wisely during the early assimilation cycles. The proposed adaptive thresholding scheme outperforms the other methods for subsurface characterization of the underlying benchmarks.
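
    The hard and soft (lasso-style) thresholding functions mentioned above can be sketched as below (elementwise shrinkage of off-diagonal sample covariances; the adaptive/universal choice of the threshold level and the SCAD function are not reproduced):

```python
import numpy as np

def hard_threshold_cov(cov, t):
    """Zero out off-diagonal covariance entries with magnitude below t."""
    out = np.where(np.abs(cov) >= t, cov, 0.0)
    np.fill_diagonal(out, np.diag(cov))      # variances are never thresholded
    return out

def soft_threshold_cov(cov, t):
    """Shrink off-diagonal entries toward zero by t (lasso-style)."""
    out = np.sign(cov) * np.maximum(np.abs(cov) - t, 0.0)
    np.fill_diagonal(out, np.diag(cov))
    return out

# Spurious-correlation demo: with only 20 members, 50 mutually independent
# state variables still show sizable sample covariances.
rng = np.random.default_rng(5)
ensemble = rng.normal(size=(20, 50))         # members x state variables
c = np.cov(ensemble, rowvar=False)
c_hard = hard_threshold_cov(c, t=0.4)
c_soft = soft_threshold_cov(c, t=0.4)
off_hard = c_hard - np.diag(np.diag(c_hard))
```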

  13. Accuracy of cancellous bone volume fraction measured by micro-CT scanning.

    PubMed

    Ding, M; Odgaard, A; Hvid, I

    1999-03-01

    Volume fraction, the single most important parameter in describing trabecular microstructure, can easily be calculated from three-dimensional reconstructions of micro-CT images. This study sought to quantify the accuracy of this measurement. One hundred and sixty human cancellous bone specimens which covered a large range of volume fraction (9.8-39.8%) were produced. The specimens were micro-CT scanned, and the volume fraction based on Archimedes' principle was determined as a reference. After scanning, all micro-CT data were segmented using individual thresholds determined by the scanner supplied algorithm (method I). A significant deviation of volume fraction from method I was found: both the y-intercept and the slope of the regression line were significantly different from those of the Archimedes-based volume fraction (p < 0.001). New individual thresholds were determined based on a calibration of volume fraction to the Archimedes-based volume fractions (method II). The mean thresholds of the two methods were applied to segment 20 randomly selected specimens. The results showed that volume fraction using the mean threshold of method I was underestimated by 4% (p = 0.001), whereas the mean threshold of method II yielded accurate values. The precision of the measurement was excellent. Our data show that care must be taken when applying thresholds in generating 3-D data, and that a fixed threshold may be used to obtain reliable volume fraction data. This fixed threshold may be determined from the Archimedes-based volume fraction of a subgroup of specimens. The threshold may vary between different materials, and so it should be determined whenever a study series is performed.

  14. Threshold-Voltage-Shift Compensation and Suppression Method Using Hydrogenated Amorphous Silicon Thin-Film Transistors for Large Active Matrix Organic Light-Emitting Diode Displays

    NASA Astrophysics Data System (ADS)

    Oh, Kyonghwan; Kwon, Oh-Kyong

    2012-03-01

    A threshold-voltage-shift compensation and suppression method for active matrix organic light-emitting diode (AMOLED) displays fabricated using a hydrogenated amorphous silicon thin-film transistor (TFT) backplane is proposed. The proposed method compensates for the threshold voltage variation of TFTs due to different threshold voltage shifts during emission time and extends the lifetime of the AMOLED panel. Measurement results show that the error range of emission current is from -1.1 to +1.7% when the threshold voltage of TFTs varies from 1.2 to 3.0 V.

  15. Reliability of the method of levels for determining cutaneous temperature sensitivity

    NASA Astrophysics Data System (ADS)

    Jakovljević, Miroljub; Mekjavić, Igor B.

    2012-09-01

    Determination of thermal thresholds is used clinically for evaluation of peripheral nervous system function. The aim of this study was to evaluate the reliability of the method of levels performed with a new, low-cost device for determining cutaneous temperature sensitivity. Nineteen male subjects were included in the study. Thermal thresholds were tested on the right side at the volar surface of the mid-forearm, the lateral surface of the mid-upper arm and the front of the mid-thigh. Thermal testing was carried out by the method of levels with an initial temperature step of 2°C. Variability of thermal thresholds was expressed by means of the ratio between the second and the first testing, coefficient of variation (CV), coefficient of repeatability (CR), intraclass correlation coefficient (ICC), mean difference between sessions (S1-S2diff), standard error of measurement (SEM) and minimally detectable change (MDC). There were no statistically significant changes between sessions for warm or cold thresholds, or between warm and cold thresholds. Within-subject CVs were acceptable. The CR estimates for warm thresholds ranged from 0.74°C to 1.06°C and from 0.67°C to 1.07°C for cold thresholds. The ICC values for intra-rater reliability ranged from 0.41 to 0.72 for warm thresholds and from 0.67 to 0.84 for cold thresholds. S1-S2diff ranged from -0.15°C to 0.07°C for warm thresholds, and from -0.08°C to 0.07°C for cold thresholds. SEM ranged from 0.26°C to 0.38°C for warm thresholds, and from 0.23°C to 0.38°C for cold thresholds. Estimated MDC values were between 0.60°C and 0.88°C for warm thresholds, and 0.53°C and 0.88°C for cold thresholds. The method of levels for determining cutaneous temperature sensitivity has acceptable reliability.

  16. Lower-upper-threshold correlation for underwater range-gated imaging self-adaptive enhancement.

    PubMed

    Sun, Liang; Wang, Xinwei; Liu, Xiaoquan; Ren, Pengdao; Lei, Pingshun; He, Jun; Fan, Songtao; Zhou, Yan; Liu, Yuliang

    2016-10-10

    In underwater range-gated imaging (URGI), enhancement of low-brightness and low-contrast images is critical for human observation. Traditional histogram equalization over-enhances images, with the result that details are lost. To suppress over-enhancement, a lower-upper-threshold correlation method is proposed for self-adaptive enhancement in URGI, based on double-plateau histogram equalization. The lower threshold determines image details and limits over-enhancement, and it is correlated with the upper threshold. First, the upper threshold is updated in real time by searching for the local maximum, and then the lower threshold is calculated from the upper threshold and the number of nonzero units selected from a filtered histogram. With this method, the backgrounds of underwater images are constrained while details are enhanced. Finally, validation experiments were performed. Peak signal-to-noise ratio, variance, contrast, and human visual properties are used to evaluate the objective quality of the global images and regions of interest. The evaluation results demonstrate that the proposed method adaptively selects proper upper and lower thresholds under different conditions, and contributes to URGI with effective image enhancement for human eyes.
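
    Double-plateau histogram equalization with a lower and an upper clipping threshold can be sketched as follows (a minimal illustration; the paper's real-time threshold-update rule is not reproduced, and the plateau values here are arbitrary):

```python
import numpy as np

def double_plateau_equalize(img, t_low, t_up):
    """Histogram equalization with the histogram clipped between two plateaus:
    bins above t_up are capped (restrains background over-enhancement) and
    nonzero bins below t_low are raised (protects sparse detail bins)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    nonzero = hist > 0
    hist[nonzero] = np.clip(hist[nonzero], t_low, t_up)
    cdf = np.cumsum(hist)
    lut = np.round(cdf / cdf[-1] * 255.0).astype(np.uint8)
    return lut[img]

# Low-contrast underwater-like frame: intensities squeezed into [60, 110).
rng = np.random.default_rng(6)
frame = rng.integers(60, 110, size=(120, 160)).astype(np.uint8)
enhanced = double_plateau_equalize(frame, t_low=5, t_up=200)
```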

  17. Defect Detection of Steel Surfaces with Global Adaptive Percentile Thresholding of Gradient Image

    NASA Astrophysics Data System (ADS)

    Neogi, Nirbhar; Mohanta, Dusmanta K.; Dutta, Pranab K.

    2017-12-01

    Steel strips are used extensively for white goods, auto bodies and other applications where surface defects are not acceptable. On-line surface inspection systems can effectively detect and classify defects and help in taking corrective actions. For defect detection, the use of gradients is very popular for highlighting and subsequently segmenting areas of interest in a surface inspection system. Most of the time, segmentation by a fixed-value threshold leads to unsatisfactory results. As defects can be both very small and large in size, segmentation of a gradient image based on a fixed percentile threshold can lead to inadequate or excessive segmentation of defective regions. A global adaptive percentile thresholding of the gradient image has been formulated for blister defects and water deposits (a pseudo defect) in steel strips. The developed method adaptively changes the percentile value used for thresholding depending on the number of pixels above specific gray levels of the gradient image. The method is able to segment defective regions selectively, preserving the characteristics of defects irrespective of their size. The developed method performs better than Otsu thresholding and an adaptive thresholding method based on local properties.

  18. A new edge detection algorithm based on Canny idea

    NASA Astrophysics Data System (ADS)

    Feng, Yingke; Zhang, Jinmin; Wang, Siming

    2017-10-01

    The traditional Canny algorithm has poor threshold self-adaptability and is sensitive to noise. To overcome these drawbacks, this paper proposes a new edge detection method based on the Canny algorithm. Firstly, median filtering and filtering based on Euclidean distance are applied to the image; secondly, the Frei-Chen algorithm is used to calculate the gradient amplitude; finally, the Otsu algorithm is applied to local regions of the gradient amplitude image to obtain threshold values, and the average of all calculated thresholds is taken: half of this average is used as the high threshold, and half of the high threshold as the low threshold. Experimental results show that the new method effectively suppresses noise, keeps edge information, and improves edge detection accuracy.
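
    The threshold-selection rule described above can be sketched as below (Otsu applied per block of the gradient-amplitude image, then averaged and halved; the filtering and Frei-Chen gradient steps are replaced here by a simple synthetic gradient, and the block size is our assumption):

```python
import numpy as np

def otsu(values, bins=256):
    """Otsu threshold of a 1-D sample; returns the max for uniform input."""
    v = np.asarray(values, float)
    if v.max() - v.min() < 1e-12:
        return float(v.max())
    hist, edges = np.histogram(v, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0, mu = np.cumsum(p), np.cumsum(p * centers)
    with np.errstate(divide="ignore", invalid="ignore"):
        sb = (mu[-1] * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return float(centers[np.nanargmax(sb)])

def canny_thresholds(grad_mag, block=32):
    """High/low hysteresis thresholds from the rule in the abstract:
    high = mean(per-block Otsu thresholds) / 2 and low = high / 2."""
    h, w = grad_mag.shape
    ts = [otsu(grad_mag[i:i + block, j:j + block].ravel())
          for i in range(0, h, block) for j in range(0, w, block)]
    high = float(np.mean(ts)) / 2.0
    return high, high / 2.0

# Gradient amplitude of a noisy vertical step edge.
rng = np.random.default_rng(10)
img = np.zeros((64, 64))
img[:, 32:] = 1.0
img += rng.normal(0.0, 0.05, img.shape)
gy, gx = np.gradient(img)
high, low = canny_thresholds(np.hypot(gx, gy))
```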

  19. Threshold selection for classification of MR brain images by clustering method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moldovanu, Simona; Dumitru Moţoc High School, 15 Milcov St., 800509, Galaţi; Obreja, Cristian

    Given a grey-intensity image, our method detects the optimal threshold for a suitable binarization of MR brain images. In MR brain image processing, the grey levels of pixels belonging to the object are not substantially different from the grey levels belonging to the background. Threshold optimization is an effective tool to separate objects from the background and, further, in classification applications. This paper gives a detailed investigation of the selection of thresholds. Our method does not use a well-known binarization method; instead, we perform a simple threshold optimization which, in turn, allows the best classification of the analyzed images into healthy and multiple sclerosis groups. The dissimilarity (the distance between classes) was established using a clustering method based on dendrograms. We tested our method using two classes of images: 20 T2-weighted and 20 proton-density (PD)-weighted scans from two healthy subjects and two patients with multiple sclerosis. For each image and for each threshold, the number of white pixels (the area of white objects in the binary image) was determined; these pixel counts are the objects in the clustering operation. The following optimum threshold values were obtained: T = 80 for PD images and T = 30 for T2w images. Each threshold clearly separates the clusters belonging to the two studied groups, healthy subjects and patients with multiple sclerosis.

  20. Psychophysics with children: Investigating the effects of attentional lapses on threshold estimates.

    PubMed

    Manning, Catherine; Jones, Pete R; Dekker, Tessa M; Pellicano, Elizabeth

    2018-03-26

    When assessing the perceptual abilities of children, researchers tend to use psychophysical techniques designed for use with adults. However, children's poorer attentiveness might bias the threshold estimates obtained by these methods. Here, we obtained speed discrimination threshold estimates in 6- to 7-year-old children in UK Key Stage 1 (KS1), 7- to 9-year-old children in Key Stage 2 (KS2), and adults using three psychophysical procedures: QUEST, a 1-up 2-down Levitt staircase, and Method of Constant Stimuli (MCS). We estimated inattentiveness using responses to "easy" catch trials. As expected, children had higher threshold estimates and made more errors on catch trials than adults. Lower threshold estimates were obtained from psychometric functions fit to the data in the QUEST condition than in the MCS and Levitt staircase conditions, and the threshold estimates obtained when fitting a psychometric function to the QUEST data were also lower than when using the QUEST mode. This suggests that threshold estimates cannot be compared directly across methods. Differences between the procedures did not vary significantly with age group. Simulations indicated that inattentiveness biased threshold estimates particularly when threshold estimates were computed as the QUEST mode or the average of staircase reversals. In contrast, thresholds estimated by post-hoc psychometric function fitting were less biased by attentional lapses. Our results suggest that some psychophysical methods are more robust to lapses in attentiveness than others, which has important implications for assessing perception in children and clinical groups.
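
    The post-hoc psychometric-function fitting that proved robust to lapses can be sketched as follows (a hedged illustration: a logistic with an explicit lapse parameter fitted by grid search; the paper's exact function and fitting procedure may differ, and the data are simulated):

```python
import numpy as np

def fit_with_lapse(x, p_obs, guess=0.5):
    """Grid fit of p(x) = guess + (1 - guess - lapse) * logistic((x - mu) / s):
    the lapse parameter absorbs attentional errors on easy trials so that the
    threshold mu is not inflated by them. Slope s is fixed for brevity."""
    x = np.asarray(x, float)
    p_obs = np.asarray(p_obs, float)
    mus = np.linspace(x.min(), x.max(), 120)
    lapses = np.linspace(0.0, 0.15, 16)
    s = np.ptp(x) / 8.0
    best = (np.inf, None, None)
    for lam in lapses:
        pred = guess + (1.0 - guess - lam) / (
            1.0 + np.exp(-(x[None, :] - mus[:, None]) / s))
        sse = ((pred - p_obs) ** 2).sum(axis=1)
        j = int(sse.argmin())
        if sse[j] < best[0]:
            best = (sse[j], float(mus[j]), float(lam))
    return best[1], best[2]

# 2AFC data with ~6% lapses: performance plateaus near 0.94 instead of 1.0.
x = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
p_obs = np.array([0.52, 0.60, 0.72, 0.85, 0.92, 0.94])
mu_hat, lapse_hat = fit_with_lapse(x, p_obs)
```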

  1. Cost-effectiveness thresholds: methods for setting and examples from around the world.

    PubMed

    Santos, André Soares; Guerra-Junior, Augusto Afonso; Godman, Brian; Morton, Alec; Ruas, Cristina Mariano

    2018-06-01

    Cost-effectiveness thresholds (CETs) are used to judge whether an intervention represents sufficient value for money to merit adoption in healthcare systems. The study was motivated by the Brazilian context of health technology assessment (HTA), where meetings are being conducted to decide on the definition of a threshold. Areas covered: An electronic search was conducted on Medline (via PubMed), Lilacs (via BVS) and ScienceDirect, followed by a complementary search of references of included studies, Google Scholar and conference abstracts. Cost-effectiveness thresholds are usually calculated through three different approaches: willingness-to-pay, representative of welfare economics; the precedent method, based on the value of an already funded technology; and the opportunity cost method, which links the threshold to the volume of health displaced. Most places have never formally adopted an explicit threshold. Some countries have defined thresholds, with some flexibility to consider other factors, and an implicit threshold could be determined by researching funded cases. Expert commentary: CETs have had an important role as a 'bridging concept' between the world of academic research and the 'real world' of healthcare prioritization. The definition of a cost-effectiveness threshold is paramount for the construction of a transparent and efficient HTA system.

  2. A study of the threshold method utilizing raingage data

    NASA Technical Reports Server (NTRS)

    Short, David A.; Wolff, David B.; Rosenfeld, Daniel; Atlas, David

    1993-01-01

    The threshold method for estimation of area-average rain rate relies on determination of the fractional area where rain rate exceeds a preset level of intensity. Previous studies have shown that the optimal threshold level depends on the climatological rain-rate distribution (RRD). It has also been noted, however, that the climatological RRD may be composed of an aggregate of distributions, one for each of several distinctly different synoptic conditions, each having its own optimal threshold. In this study, the impact of RRD variations on the threshold method is shown in an analysis of 1-min rain-rate data from a network of tipping-bucket gauges in Darwin, Australia. Data are analyzed for two distinct regimes: the premonsoon environment, having isolated intense thunderstorms, and the active monsoon rains, having organized convective cell clusters that generate large areas of stratiform rain. It is found that a threshold of 10 mm/h results in the same threshold coefficient for both regimes, suggesting an alternative definition of the optimal threshold as that which is least sensitive to distribution variations. The observed behavior of the threshold coefficient is well simulated by assuming lognormal distributions with different scale parameters and the same shape parameter.
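
    The threshold coefficient discussed above, S(tau) = (area-average rain rate) / (fractional area with rain rate above tau), can be explored with a small simulation. This is a hedged sketch, not the paper's analysis: the lognormal parameters below are hypothetical stand-ins for the premonsoon and monsoon regimes, chosen only to illustrate how S(tau) varies with the threshold and across distributions.

```python
import math
import random

def threshold_coefficient(rates, tau):
    """S(tau): area-average rain rate divided by the fraction of points
    exceeding tau. The threshold method then estimates the area average
    as S(tau) * F(tau) from the exceedance fraction alone."""
    frac = sum(r > tau for r in rates) / len(rates)
    return (sum(rates) / len(rates)) / frac if frac > 0 else float("nan")

def lognormal_rates(mu, sigma, n=50_000, seed=0):
    # Synthetic rain rates (mm/h); mu sets the scale, sigma the shape.
    rng = random.Random(seed)
    return [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]

# Hypothetical regimes: same shape parameter, different scale parameters.
premonsoon = lognormal_rates(mu=1.5, sigma=1.2)
monsoon = lognormal_rates(mu=0.8, sigma=1.2, seed=1)
for tau in (1.0, 5.0, 10.0, 20.0):
    print(tau, threshold_coefficient(premonsoon, tau),
          threshold_coefficient(monsoon, tau))
```

    Scanning tau and comparing the two regimes' coefficients is the kind of sensitivity analysis the study performs with gauge data.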

  3. Improvement in the measurement error of the specific binding ratio in dopamine transporter SPECT imaging due to exclusion of the cerebrospinal fluid fraction using the threshold of voxel RI count.

    PubMed

    Mizumura, Sunao; Nishikawa, Kazuhiro; Murata, Akihiro; Yoshimura, Kosei; Ishii, Nobutomo; Kokubo, Tadashi; Morooka, Miyako; Kajiyama, Akiko; Terahara, Atsuro

    2018-05-01

    In Japan, the Southampton method for dopamine transporter (DAT) SPECT is widely used to quantitatively evaluate striatal radioactivity. The specific binding ratio (SBR) is the ratio of specific to non-specific binding observed after placing pentagonal striatal voxels of interest (VOIs) as references. Although the method can reduce the partial volume effect, the SBR may fluctuate due to the presence of low-count areas of cerebrospinal fluid (CSF), caused by brain atrophy, in the striatal VOIs. We examined the effect of the exclusion of low-count VOIs on SBR measurement. We retrospectively reviewed DAT imaging of 36 patients with parkinsonian syndromes performed after injection of 123I-FP-CIT. SPECT data were reconstructed using three conditions. We defined the CSF area in each SPECT image after segmenting the brain tissues. A merged image of gray and white matter images was constructed from each patient's magnetic resonance imaging (MRI) to create an idealized brain image that excluded the CSF fraction (MRI-mask method). We calculated the SBR and asymmetric index (AI) in the MRI-mask method for each reconstruction condition. We then calculated the mean and standard deviation (SD) of voxel RI counts in the reference VOI without the striatal VOIs in each image, and determined the SBR by excluding the low-count voxels (threshold method) using five thresholds: mean-0.0SD, mean-0.5SD, mean-1.0SD, mean-1.5SD, and mean-2.0SD. We also calculated the AIs from the SBRs measured using the threshold method. We examined the correlation among the SBRs of the threshold method, between the uncorrected SBRs and the SBRs of the MRI-mask method, and between the uncorrected AIs and the AIs of the MRI-mask method. The intraclass correlation coefficient indicated an extremely high correlation among the SBRs and among the AIs of the MRI-mask and threshold methods at thresholds between mean-2.0SD and mean-1.0SD, regardless of the reconstruction correction. 
The differences among the SBRs and the AIs of the two methods were smallest at thresholds between mean-2.0SD and mean-1.0SD. The SBR calculated using the threshold method was highly correlated with the MRI-SBR. These results suggest that the CSF correction of the threshold method is effective for the calculation of idealized SBR and AI values.
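
    The exclusion step described above can be sketched as follows. This is a simplified illustration, not the Southampton software: the `k` multiplier plays the role of the mean-k*SD cutoffs tested in the study, and the SBR formula is reduced to its simplest specific-to-non-specific ratio without the VOI volume terms.

```python
import statistics

def thresholded_reference_mean(ref_counts, k=1.5):
    """Exclude low-count voxels (e.g., CSF) below mean - k*SD of the
    reference VOI, then recompute the mean over the surviving voxels."""
    m = statistics.mean(ref_counts)
    sd = statistics.pstdev(ref_counts)
    kept = [c for c in ref_counts if c >= m - k * sd]
    return statistics.mean(kept)

def specific_binding_ratio(striatal_mean, reference_mean):
    # Simplified SBR: specific binding relative to non-specific binding.
    return (striatal_mean - reference_mean) / reference_mean
```

    With CSF-like zero-count voxels in the reference VOI, the thresholded mean rises toward the true tissue level, and the SBR computed from it changes accordingly.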

  4. Stable Extraction of Threshold Voltage Using Transconductance Change Method for CMOS Modeling, Simulation and Characterization

    NASA Astrophysics Data System (ADS)

    Choi, Woo Young; Woo, Dong-Soo; Choi, Byung Yong; Lee, Jong Duk; Park, Byung-Gook

    2004-04-01

    We proposed a stable extraction algorithm for the threshold voltage using the transconductance change method by optimizing the node interval. With the algorithm, noise-free gm2 (=dgm/dVGS) profiles can be extracted within one-percent error, which leads to a more physically meaningful threshold voltage calculation by the transconductance change method. The extracted threshold voltage predicts the gate-to-source voltage at which the surface potential is within kT/q of φs=2φf+VSB. Our algorithm makes the transconductance change method more practical by overcoming the noise problem. This threshold voltage extraction algorithm yields the threshold roll-off behavior of nanoscale metal oxide semiconductor field effect transistors (MOSFETs) accurately and makes it possible to calculate the surface potential φs at any other point on the drain-to-source current (IDS) versus gate-to-source voltage (VGS) curve. It will provide a useful analysis tool in the field of device modeling, simulation and characterization.
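
    The idea of extracting VTH from the peak of gm2 can be illustrated numerically. A hedged sketch, not the authors' algorithm: the device model below (a softplus-shaped IDS-VGS curve with an assumed 0.5 V transition) is invented for demonstration, and the `h_nodes` parameter stands in for the node-interval optimization the paper describes.

```python
import math

def gm2_profile(ids, vgs, h_nodes=1):
    """Central-difference estimate of gm2 = d(gm)/dVGS = d2(IDS)/dVGS2.
    Widening the node interval trades VGS resolution for noise
    suppression, the trade-off the optimized algorithm addresses."""
    n = h_nodes
    dv = vgs[1] - vgs[0]                      # assumes uniform VGS spacing
    return [(ids[i + n] - 2.0 * ids[i] + ids[i - n]) / (n * dv) ** 2
            for i in range(n, len(ids) - n)]

def threshold_voltage(ids, vgs, h_nodes=1):
    # Transconductance change method: VTH is the VGS at the gm2 maximum.
    gm2 = gm2_profile(ids, vgs, h_nodes)
    peak = max(range(len(gm2)), key=gm2.__getitem__)
    return vgs[peak + h_nodes]

# Hypothetical smooth turn-on curve with its steepest gm change at 0.5 V.
vgs = [i * 0.01 for i in range(101)]
ids = [0.05 * math.log(1.0 + math.exp((v - 0.5) / 0.05)) for v in vgs]
```

    On noisy measured data, the plain second difference amplifies noise badly, which is why widening the node interval (or smoothing first) matters in practice.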

  5. Threshold Assessment of Gear Diagnostic Tools on Flight and Test Rig Data

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Mosher, Marianne; Huff, Edward M.

    2003-01-01

    A method was developed for defining thresholds for vibration-based algorithms that minimizes false alarms while maintaining sensitivity to gear damage. This analysis focused on two vibration-based gear damage detection algorithms, FM4 and MSA. This method was developed using vibration data collected during surface fatigue tests performed in a spur gearbox rig. The thresholds were defined based on damage progression during tests with damage. The thresholds' false-alarm rates were then evaluated on spur gear tests without damage. Next, the same thresholds were applied to flight data from an OH-58 helicopter transmission. Results showed that thresholds defined in test rigs can be used to define thresholds in flight to correctly classify the transmission operation as normal.
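
    The FM4 metric named above is, at its core, the normalized fourth moment (kurtosis) of the "difference signal", the time-synchronously averaged vibration signal with the regular gear-mesh components removed. The sketch below computes that statistic only: constructing the difference signal is out of scope, the synthetic data are invented, and the example threshold in `classify` is an arbitrary assumption, not a value from the study.

```python
import random

def fm4(diff_signal):
    """Kurtosis of the difference signal: near 3 for Gaussian-like
    (healthy) data, rising as localized tooth damage adds isolated peaks."""
    n = len(diff_signal)
    mean = sum(diff_signal) / n
    m2 = sum((x - mean) ** 2 for x in diff_signal) / n
    m4 = sum((x - mean) ** 4 for x in diff_signal) / n
    return m4 / (m2 * m2)

def classify(metric_value, threshold):
    # Exceeding the threshold flags possible damage; the threshold is
    # chosen to balance false alarms against sensitivity.
    return "damage" if metric_value > threshold else "normal"

rng = random.Random(0)
healthy = [rng.gauss(0.0, 1.0) for _ in range(20_000)]
damaged = healthy + [15.0] * 20          # a few high-amplitude impacts
```

    Setting the threshold from damage-progression data, then checking its false-alarm rate on healthy runs, mirrors the study's procedure.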

  6. Discriminating the precipitation phase based on different temperature thresholds in the Songhua River Basin, China

    NASA Astrophysics Data System (ADS)

    Zhong, Keyuan; Zheng, Fenli; Xu, Ximeng; Qin, Chao

    2018-06-01

    Different precipitation phases (rain, snow or sleet) differ greatly in their hydrological and erosional processes. Therefore, accurate discrimination of the precipitation phase is highly important when researching hydrologic processes and climate change in high-latitude and mountainous regions. The objective of this study was to identify suitable temperature thresholds for discriminating the precipitation phase in the Songhua River Basin (SRB) based on 20 years of daily precipitation data collected from 60 meteorological stations located in and around the basin. Two methods, the air temperature method (AT method) and the wet bulb temperature method (WBT method), were used to discriminate the precipitation phase. Thirteen temperature thresholds were used to discriminate snowfall in the SRB: air temperatures from 0 to 5.5 °C at intervals of 0.5 °C, and the wet bulb temperature (WBT). Three evaluation indices, the error percentage of discriminated snowfall days (Ep), the relative error of discriminated snowfall (Re) and the determination coefficient (R2), were applied to assess the discrimination accuracy. The results showed that 2.5 °C was the optimum threshold temperature for discriminating snowfall at the scale of the entire basin. Due to differences in the landscape conditions at the different stations, the optimum threshold varied by station, ranging from 1.5 to 4.0 °C; 19, 17 and 18 stations had optimal thresholds of 2.5 °C, 3.0 °C and 3.5 °C, respectively, together accounting for 90% of all stations. Compared with using a single temperature threshold to discriminate snowfall throughout the basin, it was more accurate to use the optimum threshold at each station to estimate snowfall in the basin. 
In addition, snowfall was underestimated when the temperature threshold was the WBT and when the temperature threshold was below 2.5 °C, whereas snowfall was overestimated when the temperature threshold exceeded 4.0 °C at most stations. The results of this study provide information for climate change research and hydrological process simulations in the SRB, as well as provide reference information for discriminating precipitation phase in other regions.

  7. DYNAMIC PATTERN RECOGNITION BY MEANS OF THRESHOLD NETS,

    DTIC Science & Technology

    A method is expounded for the recognition of visual patterns. A circuit diagram of a device is described which is based on a multilayer threshold structure synthesized in accordance with the proposed method. Coded signals received each time an image is displayed are transmitted to the threshold circuit which distinguishes the signs, and from there to the layers of threshold resolving elements. The image at each layer is made to correspond

  8. Rainfall Threshold for Flash Flood Early Warning Based on Rational Equation: A Case Study of Zuojiao Watershed in Yunnan Province

    NASA Astrophysics Data System (ADS)

    Li, Q.; Wang, Y. L.; Li, H. C.; Zhang, M.; Li, C. Z.; Chen, X.

    2017-12-01

    Rainfall thresholds play an important role in flash flood warning. A simple and easy method, using the Rational Equation to calculate the rainfall threshold, is proposed in this study. The critical rainfall equation was deduced from the Rational Equation. On the basis of the Manning equation and the results of the Chinese Flash Flood Survey and Evaluation (CFFSE) Project, the critical flow was obtained and the net rainfall was calculated. Three aspects of rainfall losses, i.e. depression storage, vegetation interception, and soil infiltration, were considered. The critical rainfall was the sum of the net rainfall and the rainfall losses. The rainfall threshold was then estimated from the critical rainfall after accounting for watershed soil moisture. To demonstrate the method, the Zuojiao watershed in Yunnan Province was chosen as the study area. The results showed that the rainfall thresholds calculated by the Rational Equation method approximated the rainfall thresholds obtained from CFFSE and were in accordance with the observed rainfall during flash flood events. Thus the calculated results are reasonable and the method is effective. This study provides a quick and convenient way for grassroots staff to calculate rainfall thresholds for flash flood warning and offers technical support for estimating rainfall thresholds.
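
    The core inversion can be sketched as below, with the caveat that this is a schematic reading of the abstract rather than the authors' implementation. The metric form of the Rational Equation, Q = 0.278 * C * i * A (Q in m3/s, i in mm/h, A in km2, with 0.278 = 1/3.6 as the unit conversion), is a standard convention assumed here; the treatment of losses and antecedent moisture is reduced to simple additions and subtractions.

```python
def critical_rain_intensity(q_critical_m3s, runoff_coeff, area_km2):
    """Invert the Rational Equation Q = 0.278 * C * i * A to obtain the
    critical rain intensity i (mm/h) producing the critical flow Q."""
    return q_critical_m3s / (0.278 * runoff_coeff * area_km2)

def rainfall_threshold(net_rain_mm, depression_mm, interception_mm,
                       infiltration_mm, antecedent_moisture_mm=0.0):
    """Critical rainfall = net rainfall + losses (depression storage,
    vegetation interception, soil infiltration), reduced by water
    already stored in the watershed before the storm."""
    losses = depression_mm + interception_mm + infiltration_mm
    return net_rain_mm + losses - antecedent_moisture_mm
```

    In practice the critical flow itself would come from the Manning equation and CFFSE survey cross-sections, as the abstract notes.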

  9. A threshold method for immunological correlates of protection

    PubMed Central

    2013-01-01

    Background Immunological correlates of protection are biological markers such as disease-specific antibodies which correlate with protection against disease and which are measurable with immunological assays. It is common in vaccine research and in setting immunization policy to rely on threshold values for the correlate where the accepted threshold differentiates between individuals who are considered to be protected against disease and those who are susceptible. Examples where thresholds are used include development of a new generation 13-valent pneumococcal conjugate vaccine which was required in clinical trials to meet accepted thresholds for the older 7-valent vaccine, and public health decision making on vaccination policy based on long-term maintenance of protective thresholds for Hepatitis A, rubella, measles, Japanese encephalitis and others. Despite widespread use of such thresholds in vaccine policy and research, few statistical approaches have been formally developed which specifically incorporate a threshold parameter in order to estimate the value of the protective threshold from data. Methods We propose a 3-parameter statistical model called the a:b model which incorporates parameters for a threshold and constant but different infection probabilities below and above the threshold estimated using profile likelihood or least squares methods. Evaluation of the estimated threshold can be performed by a significance test for the existence of a threshold using a modified likelihood ratio test which follows a chi-squared distribution with 3 degrees of freedom, and confidence intervals for the threshold can be obtained by bootstrapping. The model also permits assessment of relative risk of infection in patients achieving the threshold or not. Goodness-of-fit of the a:b model may be assessed using the Hosmer-Lemeshow approach. The model is applied to 15 datasets from published clinical trials on pertussis, respiratory syncytial virus and varicella. 
Results Highly significant thresholds with p-values less than 0.01 were found for 13 of the 15 datasets. Considerable variability was seen in the widths of the confidence intervals. Relative risks indicated around 70% or better protection in 11 datasets, supporting the relevance of the estimated thresholds as indicators of strong protection. Goodness-of-fit was generally acceptable. Conclusions The a:b model offers a formal statistical method for estimating thresholds that differentiate susceptible from protected individuals, which has previously depended on putative statements based on visual inspection of data. PMID:23448322
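
    A minimal profile-likelihood fit of the a:b model can be written directly from its definition: constant infection probability a below a candidate threshold and b at or above it, with the threshold chosen to maximize the binomial log-likelihood. This sketch is an assumption-laden reconstruction from the abstract; it omits the bootstrap confidence intervals and the modified likelihood ratio test.

```python
import math

def fit_ab_model(titers, infected):
    """Profile-likelihood fit of the a:b model. Returns
    (loglik, threshold, a, b) maximizing the binomial log-likelihood
    over candidate thresholds taken from the observed titer values."""
    def loglik(a, b, t):
        ll = 0.0
        for x, y in zip(titers, infected):
            p = a if x < t else b
            p = min(max(p, 1e-9), 1.0 - 1e-9)   # guard log(0)
            ll += math.log(p) if y else math.log(1.0 - p)
        return ll

    best = None
    for t in sorted(set(titers)):
        below = [y for x, y in zip(titers, infected) if x < t]
        above = [y for x, y in zip(titers, infected) if x >= t]
        if not below or not above:
            continue                 # threshold must split the data
        a = sum(below) / len(below)  # ML infection probability below
        b = sum(above) / len(above)  # ML infection probability at/above
        ll = loglik(a, b, t)
        if best is None or ll > best[0]:
            best = (ll, t, a, b)
    return best
```

    The ratio b/a (or a/b, depending on direction) corresponds to the relative risk of infection for individuals achieving the threshold, which the paper also reports.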

  10. Developing Bayesian adaptive methods for estimating sensitivity thresholds (d′) in Yes-No and forced-choice tasks

    PubMed Central

    Lesmes, Luis A.; Lu, Zhong-Lin; Baek, Jongsoo; Tran, Nina; Dosher, Barbara A.; Albright, Thomas D.

    2015-01-01

    Motivated by Signal Detection Theory (SDT), we developed a family of novel adaptive methods that estimate the sensitivity threshold—the signal intensity corresponding to a pre-defined sensitivity level (d′ = 1)—in Yes-No (YN) and Forced-Choice (FC) detection tasks. Rather than focus stimulus sampling to estimate a single level of %Yes or %Correct, the current methods sample psychometric functions more broadly, to concurrently estimate sensitivity and decision factors, and thereby estimate thresholds that are independent of decision confounds. Developed for four tasks—(1) simple YN detection, (2) cued YN detection, which cues the observer's response state before each trial, (3) rated YN detection, which incorporates a Not Sure response, and (4) FC detection—the qYN and qFC methods yield sensitivity thresholds that are independent of the task's decision structure (YN or FC) and/or the observer's subjective response state. Results from simulation and psychophysics suggest that 25 trials (and sometimes fewer) are sufficient to estimate YN thresholds with reasonable precision (s.d. = 0.10–0.15 decimal log units), but more trials are needed for FC thresholds. When the same subjects were tested across tasks of simple, cued, rated, and FC detection, adaptive threshold estimates exhibited excellent agreement with the method of constant stimuli (MCS), and with each other. These YN adaptive methods deliver criterion-free thresholds that have previously been exclusive to FC methods. PMID:26300798
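
    The sensitivity the qYN/qFC methods target is the standard SDT quantity; for reference, d′ and the criterion c can be computed from Yes-No hit and false-alarm rates in a few lines. This is textbook SDT arithmetic, not the adaptive procedure itself.

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Yes-No SDT sensitivity: d' = z(H) - z(FA), which separates
    sensitivity from the observer's response criterion."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    # Response criterion: c = -(z(H) + z(FA)) / 2; 0 means unbiased.
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(fa_rate)) / 2.0
```

    The point of the adaptive methods above is to estimate the intensity at which d′ = 1 while jointly tracking the criterion, so that threshold estimates do not shift with response bias.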

  11. Methods for SBS Threshold Reduction

    DTIC Science & Technology

    1994-01-30

    We have investigated methods for reducing the threshold for stimulated Brillouin scattering (SBS) using a frequency-narrowed Cr,Tm,Ho:YAG laser operating at 2.12 micrometers. The SBS medium was carbon disulfide. Single-focus SBS and threshold reduction by using two foci, a loop, and a ring have

  12. A new iterative triclass thresholding technique in image segmentation.

    PubMed

    Cai, Hongmin; Yang, Zhong; Cao, Xinhua; Xia, Weiming; Xu, Xiaoyin

    2014-03-01

    We present a new method in image segmentation that is based on Otsu's method but iteratively searches subregions of the image for segmentation, instead of treating the full image as a single region. The iterative method starts with Otsu's threshold and computes the mean values of the two classes separated by that threshold. Based on the Otsu threshold and the two mean values, the method separates the image into three classes instead of the two produced by the standard Otsu's method. The first two classes are determined as the foreground and background and are not processed further. The third class is denoted a to-be-determined (TBD) region that is processed at the next iteration. At the succeeding iteration, Otsu's method is applied to the TBD region to calculate a new threshold and two class means, and the TBD region is again separated into three classes, namely, foreground, background, and a new TBD region, which by definition is smaller than the previous TBD region. Then, the new TBD region is processed in a similar manner. The process stops when the difference between the Otsu thresholds calculated in two successive iterations is less than a preset value. Then, all the intermediate foreground and background regions are, respectively, combined to create the final segmentation result. Tests on synthetic and real images showed that the new iterative method can achieve better performance than the standard Otsu's method in many challenging cases, such as identifying weak objects and revealing fine structures of complex objects, while the added computational cost is minimal.
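
    The iteration described above can be condensed into a short sketch. This follows the abstract's description rather than the paper's exact implementation: `otsu` is the standard between-class-variance threshold, pixels are represented as (index, value) pairs instead of a 2-D image, and `eps` is the preset stopping tolerance on the threshold change.

```python
import collections

def otsu(values):
    """Standard Otsu threshold: choose t maximizing between-class variance."""
    hist = collections.Counter(values)
    total = len(values)
    levels = sorted(hist)
    best_t, best_var = levels[0], -1.0
    for t in levels[:-1]:
        w0 = sum(hist[l] for l in levels if l <= t) / total
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = sum(l * hist[l] for l in levels if l <= t) / (w0 * total)
        mu1 = sum(l * hist[l] for l in levels if l > t) / (w1 * total)
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def iterative_triclass(pixels, eps=1.0):
    """pixels: list of (index, value). Returns the set of foreground
    indices. Each pass thresholds the remaining TBD region with Otsu,
    keeps values above the upper-class mean as foreground and below the
    lower-class mean as background, and re-processes the in-between band
    until the Otsu threshold stabilizes (change < eps)."""
    foreground, tbd, prev_t = set(), list(pixels), None
    while tbd:
        t = otsu([v for _, v in tbd])
        if prev_t is not None and abs(t - prev_t) < eps:
            foreground |= {i for i, v in tbd if v > t}  # final binary split
            break
        upper = [v for _, v in tbd if v > t]
        lower = [v for _, v in tbd if v <= t]
        if not upper or not lower:
            if prev_t is not None:                      # degenerate band
                foreground |= {i for i, v in tbd if v > prev_t}
            break
        mu1, mu0 = sum(upper) / len(upper), sum(lower) / len(lower)
        foreground |= {i for i, v in tbd if v > mu1}
        tbd = [(i, v) for i, v in tbd if mu0 < v <= mu1]
        prev_t = t
    return foreground
```

    Because each new TBD band is strictly contained in the previous one, the loop shrinks toward the ambiguous mid-intensity pixels, which is where the method gains over single-pass Otsu on weak objects.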

  13. Quantifying ecological thresholds from response surfaces

    Treesearch

    Heather E. Lintz; Bruce McCune; Andrew N. Gray; Katherine A. McCulloh

    2011-01-01

    Ecological thresholds are abrupt changes of ecological state. While an ecological threshold is a widely accepted concept, most empirical methods detect them in time or across geographic space. Although useful, these approaches do not quantify the direct drivers of threshold response. Causal understanding of thresholds detected empirically requires their investigation...

  14. Threshold-adaptive canny operator based on cross-zero points

    NASA Astrophysics Data System (ADS)

    Liu, Boqi; Zhang, Xiuhua; Hong, Hanyu

    2018-03-01

    Canny edge detection[1] is a technique to extract useful structural information from vision objects while dramatically reducing the amount of data to be processed. It has been widely applied in various computer vision systems. Two thresholds have to be set before edges are separated from the background; usually, two static values are chosen as the thresholds based on developer experience[2]. In this paper, a novel automatic thresholding method is proposed. The relation between the thresholds and cross-zero points is analyzed, and an interpolation function is deduced to determine the thresholds. Comprehensive experimental results demonstrate the effectiveness of the proposed method and its advantage for stable edge detection under changing illumination.

  15. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
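
    The scheme can be paraphrased in a short sketch. This is a schematic reading of the abstract, not the patented implementation: the class, the single high/low threshold pair, and the qualification-width counter below are illustrative simplifications.

```python
class DriftCompensatedTrigger:
    """Periodically re-measure the quiescent level and shift both trigger
    thresholds by the measured drift; require `qualify_width` consecutive
    out-of-band samples before declaring a trigger, reducing false alarms."""

    def __init__(self, high, low, qualify_width=3):
        self.high, self.low = high, low
        self.baseline = 0.0          # last measured quiescent level
        self.qualify_width = qualify_width
        self._run = 0

    def recalibrate(self, quiescent_samples):
        # Offset computation: shift thresholds by the quiescent-level drift.
        level = sum(quiescent_samples) / len(quiescent_samples)
        drift = level - self.baseline
        self.high += drift
        self.low += drift
        self.baseline = level

    def process(self, sample):
        # Returns True when a qualified trigger occurs.
        if sample > self.high or sample < self.low:
            self._run += 1
            if self._run >= self.qualify_width:
                self._run = 0
                return True
        else:
            self._run = 0
        return False
```

    Calling `recalibrate` on a timer or sample counter corresponds to the time-based or counter-based re-computation criteria the abstract mentions.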

  16. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    PubMed

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required, which can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aims of this study were to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between the visual and automatic segmentation methods regarding root canal volume measurements (p=0.93) and root canal surface (p=0.79). Although both visual and automatic segmentation methods can be used to determine the threshold and calculate root canal volume and surface, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.

  17. Adaptive thresholding image series from fluorescence confocal scanning laser microscope using orientation intensity profiles

    NASA Astrophysics Data System (ADS)

    Feng, Judy J.; Ip, Horace H.; Cheng, Shuk H.

    2004-05-01

    Many grey-level thresholding methods based on histograms or other statistical information about the image of interest, such as maximum entropy, have been proposed in the past. However, most methods based on statistical analysis of the images pay little attention to the morphology of the objects of interest, which can provide important cues for finding the optimum threshold, especially for organisms with distinctive textural morphologies such as vasculature or neural networks in medical imaging. In this paper, we propose a novel method for thresholding the fluorescent vasculature image series recorded from a Confocal Scanning Laser Microscope. After extracting the basic orientation of the vessel slices inside a sub-region partitioned from the images, we analyze the intensity profiles perpendicular to the vessel orientation to get a reasonable initial threshold for each region. The threshold values of the regions neighboring the one of interest, both in the x-y and optical directions, are then referenced to obtain the final threshold of the region, which makes the whole stack of images look more continuous. The resulting images are characterized by suppression of both noise and non-interest tissues conglutinated to vessels, while improving vessel connectivity and edge definition. The value of the method for thresholding fluorescence images of biological objects is demonstrated by a comparison of the results of 3D vascular reconstruction.

  18. Analyses of Fatigue Crack Growth and Closure Near Threshold Conditions for Large-Crack Behavior

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1999-01-01

    A plasticity-induced crack-closure model was used to study fatigue crack growth and closure in thin 2024-T3 aluminum alloy under constant-R and constant-K(sub max) threshold testing procedures. Two methods of calculating crack-opening stresses were compared. One method was based on contact-K analyses and the other on crack-opening-displacement (COD) analyses. These methods gave nearly identical results under constant-amplitude loading, but under threshold simulations the contact-K analyses gave lower opening stresses than the contact COD method. Crack-growth predictions tend to support the use of contact-K analyses. Crack-growth simulations showed that remote closure can cause a rapid rise in opening stresses in the near-threshold regime for low constraint and high applied stress levels. Under low applied stress levels and high constraint, a rise in opening stresses was not observed near threshold conditions. However, crack-tip-opening displacements (CTOD) were of the order of measured oxide thicknesses in the 2024 alloy under constant-R simulations. In contrast, under constant-K(sub max) testing the CTOD near threshold conditions were an order of magnitude larger than measured oxide thicknesses. Residual-plastic deformations under both constant-R and constant-K(sub max) threshold simulations were several times larger than the expected oxide thicknesses. Thus, residual-plastic deformations, in addition to oxide and roughness, play an integral part in threshold development.

  19. Fitting psychometric functions using a fixed-slope parameter: an advanced alternative for estimating odor thresholds with data generated by ASTM E679.

    PubMed

    Peng, Mei; Jaeger, Sara R; Hautus, Michael J

    2014-03-01

    Psychometric functions are predominately used for estimating detection thresholds in vision and audition. However, the requirement of large data quantities for fitting psychometric functions (>30 replications) reduces their suitability in olfactory studies because olfactory response data are often limited (<4 replications) due to the susceptibility of human olfactory receptors to fatigue and adaptation. This article introduces a new method for fitting individual-judge psychometric functions to olfactory data obtained using the current standard protocol, American Society for Testing and Materials (ASTM) E679. The slope parameter of the individual-judge psychometric function is fixed to be the same as that of the group function; the same-shaped symmetrical sigmoid function is then fitted using only the intercept. This study evaluated the proposed method by comparing it with two available methods. Comparison with conventional psychometric functions (fitted slope and intercept) indicated that the assumption of a fixed slope did not compromise the precision of the threshold estimates. No systematic difference was obtained between the proposed method and the ASTM method in terms of group threshold estimates or threshold distributions, but there were changes in the rank, by threshold, of judges in the group. Overall, the fixed-slope psychometric function is recommended for obtaining relatively reliable individual threshold estimates when the quantity of data is limited.
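
    A minimal version of the fixed-slope fit can be sketched as follows, under loud assumptions: a plain logistic function with no guessing or lapse correction (ASTM E679 data are 3-AFC, which the paper handles through its group-function machinery), log10-transformed concentrations, and a brute-force grid search on the intercept rather than maximum-likelihood fitting.

```python
import math

def logistic(x, intercept, slope):
    return 1.0 / (1.0 + math.exp(-(slope * x + intercept)))

def fit_fixed_slope(concentrations, p_detect, slope):
    """Fit only the intercept of a logistic psychometric function whose
    slope is fixed (e.g., to the slope of the group function), by least
    squares over a grid; the threshold is the concentration at p = 0.5."""
    xs = [math.log10(c) for c in concentrations]
    best_sse, best_b = None, 0.0
    b = -10.0
    while b <= 10.0:
        sse = sum((logistic(x, b, slope) - p) ** 2
                  for x, p in zip(xs, p_detect))
        if best_sse is None or sse < best_sse:
            best_sse, best_b = sse, b
        b += 0.01
    return 10.0 ** (-best_b / slope)       # where slope*x + b = 0
```

    Fixing the slope leaves one free parameter per judge, which is why the method remains usable with only a handful of replications.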

  20. Statistical approaches for the definition of landslide rainfall thresholds and their uncertainty using rain gauge and satellite data

    NASA Astrophysics Data System (ADS)

    Rossi, M.; Luciani, S.; Valigi, D.; Kirschbaum, D.; Brunetti, M. T.; Peruccacci, S.; Guzzetti, F.

    2017-05-01

    Models for forecasting rainfall-induced landslides are mostly based on the identification of empirical rainfall thresholds obtained by exploiting rain gauge data. Despite their increased availability, satellite rainfall estimates are scarcely used for this purpose. Satellite data should be useful in ungauged and remote areas, and should provide a significant spatial and temporal reference in gauged areas. In this paper, the reliability of rainfall thresholds based on remotely sensed rainfall and rain gauge data for the prediction of landslide occurrence is analyzed. To date, the estimation of the uncertainty associated with empirical rainfall thresholds is mostly based on a bootstrap resampling of the rainfall duration and cumulated event rainfall pairs (D,E) characterizing rainfall events responsible for past failures. This estimation does not consider the measurement uncertainty associated with D and E. In the paper, we propose (i) a new automated procedure to reconstruct the ED conditions responsible for landslide triggering and their uncertainties, and (ii) three new methods to identify rainfall thresholds for possible landslide occurrence, exploiting rain gauge and satellite data. In particular, the proposed methods are based on Least Square (LS), Quantile Regression (QR) and Nonlinear Least Square (NLS) statistical approaches. We applied the new procedure and methods to define empirical rainfall thresholds and their associated uncertainties in the Umbria region (central Italy) using both rain-gauge measurements and satellite estimates. We finally validated the thresholds and tested the effectiveness of the different threshold definition methods with independent landslide information. Among the three, the NLS method performed best in calculating thresholds over the full range of rainfall durations. We found that the thresholds obtained from satellite data are lower than those obtained from rain gauge measurements. 
This is in agreement with the literature, where satellite rainfall data underestimate the "ground" rainfall registered by rain gauges.
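
    A crude version of threshold fitting on (D,E) pairs, assuming the common power-law form E = alpha * D**beta, can be sketched as follows. This is not the paper's LS/QR/NLS machinery: it fits ordinary least squares in log-log space and then shifts the intercept so a chosen fraction of events lies below the curve, a simple stand-in for a quantile-style threshold.

```python
import math

def fit_power_law_threshold(durations, events, quantile=0.05):
    """Fit log10(E) = log10(alpha) + beta*log10(D) by ordinary least
    squares, then shift the intercept down so that `quantile` of the
    (D,E) points fall below the curve. Returns (alpha, beta) for the
    threshold curve E = alpha * D**beta."""
    xs = [math.log10(d) for d in durations]
    ys = [math.log10(e) for e in events]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    a0 = my - beta * mx
    residuals = sorted(y - (a0 + beta * x) for x, y in zip(xs, ys))
    shift = residuals[int(quantile * n)]     # empirical lower quantile
    return 10.0 ** (a0 + shift), beta
```

    Fitting the same routine to gauge-based and satellite-based (D,E) pairs would reproduce, in miniature, the comparison that shows satellite thresholds plotting lower.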

  2. Optimal Threshold Determination for Interpreting Semantic Similarity and Particularity: Application to the Comparison of Gene Sets and Metabolic Pathways Using GO and ChEBI

    PubMed Central

    Bettembourg, Charles; Diot, Christian; Dameron, Olivier

    2015-01-01

    Background The analysis of gene annotations referencing the Gene Ontology plays an important role in the interpretation of high-throughput experiment results. This analysis typically involves semantic similarity and particularity measures that quantify the importance of the Gene Ontology annotations. However, there is currently no sound method supporting the interpretation of the similarity and particularity values in order to determine whether two genes are similar or whether one gene has some significant particular function. Interpretation is frequently based either on an implicit threshold or on an arbitrary one (typically 0.5). Here we investigate a method for determining thresholds supporting the interpretation of the results of a semantic comparison. Results We propose a method for determining the optimal similarity threshold by minimizing the proportions of false-positive and false-negative similarity matches. We compared the distributions of the similarity values of pairs of similar genes and pairs of non-similar genes. These comparisons were performed separately for all three branches of the Gene Ontology. In all situations, we found overlap between the similar and the non-similar distributions, indicating that some similar genes had a similarity value lower than that of some non-similar genes. We then extended this method to the semantic particularity measure and to a similarity measure applied to the ChEBI ontology. Thresholds were evaluated over the whole HomoloGene database. For each group of homologous genes, we computed all the similarity and particularity values between pairs of genes. Finally, we focused on the PPAR multigene family to show that the similarity and particularity patterns obtained with our thresholds were better at discriminating orthologs and paralogs than those obtained using default thresholds. Conclusion We developed a method for determining optimal semantic similarity and particularity thresholds. 
We applied this method to the GO and ChEBI ontologies. Qualitative analysis using the thresholds on the PPAR multigene family yielded biologically relevant patterns. PMID:26230274
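
    The optimization described above can be sketched as a grid search minimizing the sum of the two error proportions; the Gaussian score distributions are synthetic stand-ins for the overlapping similar/non-similar distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical similarity scores for pairs known to be similar and pairs
# known to be non-similar (synthetic, overlapping distributions).
similar = rng.normal(0.7, 0.1, 1000)
non_similar = rng.normal(0.4, 0.1, 1000)

def optimal_threshold(pos, neg, n_grid=501):
    """Cutoff minimizing false-negative + false-positive proportions."""
    grid = np.linspace(min(pos.min(), neg.min()), max(pos.max(), neg.max()), n_grid)
    # FN: similar pairs scoring below t; FP: non-similar pairs scoring >= t.
    cost = [(pos < t).mean() + (neg >= t).mean() for t in grid]
    return float(grid[int(np.argmin(cost))])

t_opt = optimal_threshold(similar, non_similar)
```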

  3. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338
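
    The background-noise threshold idea can be sketched numerically; the read counts, the mean-plus-k-standard-deviations form, and k = 3 are illustrative assumptions, not the paper's actual noise definition.

```python
import numpy as np

# Hypothetical per-locus noise read counts observed in negative controls
# (synthetic; the paper derives background noise from sequencing data).
noise_counts = np.array([3, 5, 2, 7, 4, 6, 3, 5, 4, 2, 8, 3])

# A common form of analytical threshold: mean noise plus k sample
# standard deviations (k = 3 is a conventional, conservative choice).
k = 3
at = noise_counts.mean() + k * noise_counts.std(ddof=1)

# Calls with read counts at or below `at` would be treated as noise.
is_signal = lambda reads: reads > at
```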

  4. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.

  5. A complementary graphical method for reducing and analyzing large data sets. Case studies demonstrating thresholds setting and selection.

    PubMed

    Jing, X; Cimino, J J

    2014-01-01

    Graphical displays can make data more understandable; however, large graphs can challenge human comprehension. We have previously described a filtering method to provide high-level summary views of large data sets. In this paper we demonstrate our method for setting and selecting thresholds to limit graph size while retaining important information, by applying it to large single and paired data sets taken from patient and bibliographic databases. Four case studies are used to illustrate our method. The data are either patient discharge diagnoses (coded using the International Classification of Diseases, Clinical Modification [ICD9-CM]) or Medline citations (coded using the Medical Subject Headings [MeSH]). We use combinations of different thresholds to obtain filtered graphs for detailed analysis. Threshold setting and selection (thresholds for node counts, class counts, ratio values, p values for difference data sets, and percentiles of selected class count thresholds) are demonstrated in detail in the case studies. The main steps include data preparation, data manipulation, computation, and threshold selection and visualization. We also describe the data models for different types of thresholds and the considerations for threshold selection. The filtered graphs are 1%-3% of the size of the original graphs. For our case studies, the graphs provide 1) the most heavily used ICD9-CM codes, 2) the codes with most patients in a research hospital in 2011, 3) a profile of publications on "heavily represented topics" in MEDLINE in 2011, and 4) validated knowledge about adverse effects of rosiglitazone and new interesting areas in the ICD9-CM hierarchy associated with patients taking pioglitazone. Our filtering method reduces large graphs to a manageable size by removing relatively unimportant nodes. 
The graphical method provides summary views based on computation of usage frequency and semantic context of hierarchical terminology. The method is applicable to large data sets (such as a hundred thousand records or more) and can be used to generate new hypotheses from data sets coded with hierarchical terminologies.
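
    The core filtering step, dropping nodes whose usage count falls below a node-count threshold and pruning edges that lose an endpoint, can be sketched as follows; the codes, counts, and threshold value are hypothetical.

```python
# Hypothetical usage counts for hierarchical terminology codes and edges
# between them (the codes and numbers below are made up for illustration).
counts = {"250.00": 910, "401.9": 850, "V58.69": 45, "780.79": 12}
edges = [("250.00", "401.9"), ("401.9", "V58.69"), ("V58.69", "780.79")]

# Keep only nodes meeting the node-count threshold, then keep only edges
# whose endpoints both survived the filter.
NODE_COUNT_THRESHOLD = 100
kept = {node for node, c in counts.items() if c >= NODE_COUNT_THRESHOLD}
kept_edges = [(a, b) for a, b in edges if a in kept and b in kept]
```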

  6. Flood Extent Delineation by Thresholding Sentinel-1 SAR Imagery Based on Ancillary Land Cover Information

    NASA Astrophysics Data System (ADS)

    Liang, J.; Liu, D.

    2017-12-01

    Emergency responses to floods require timely information on water extent, which can be produced by satellite-based remote sensing. As SAR images can be acquired under adverse illumination and weather conditions, they are particularly suitable for delineating water extent during a flood event. Thresholding SAR imagery is one of the most widely used approaches to delineate water extent. However, most studies apply only one threshold to separate water and dry land, without considering the complexity and variability of the different dry land surface types in an image. This paper proposes a new thresholding method for SAR images to delineate water from other land cover types. A probability distribution of SAR backscatter intensity is fitted for each land cover type (including water) before a flood event, and the intersection between two distributions is regarded as the threshold to classify the two. To extract water, a set of thresholds is applied to several pairs of land cover types (e.g., water and urban, or water and forest), and the resulting subsets are merged to form the water distribution for the SAR image during or after the flooding. Experiments show that this land-cover-based thresholding approach outperformed traditional single thresholding by about 5% to 15%. This method has great application potential given the broad acceptance of thresholding-based methods and the availability of land cover data, especially for heterogeneous regions.
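
    The per-pair threshold, taken as the intersection point of two fitted Gaussian backscatter distributions, can be sketched as follows; the water and forest statistics (in dB) are hypothetical, and the Gaussian form is an assumption.

```python
import numpy as np

def gaussian_intersection(mu1, s1, mu2, s2):
    """Backscatter value where two Gaussian PDFs intersect: solve the
    quadratic obtained by equating the two log-densities."""
    a = 1.0 / (2 * s1**2) - 1.0 / (2 * s2**2)
    b = mu2 / s2**2 - mu1 / s1**2
    c = mu1**2 / (2 * s1**2) - mu2**2 / (2 * s2**2) + np.log(s1 / s2)
    roots = np.roots([a, b, c]).real
    # Keep the root lying between the two class means.
    lo, hi = sorted((mu1, mu2))
    return float(roots[(roots >= lo) & (roots <= hi)][0])

# Hypothetical backscatter statistics: water (-18 dB) vs. forest (-9 dB).
t = gaussian_intersection(-18.0, 1.5, -9.0, 2.5)
```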

  7. Methods for threshold determination in multiplexed assays

    DOEpatents

    Tammero, Lance F. Bentley; Dzenitis, John M; Hindson, Benjamin J

    2014-06-24

    Methods for determining threshold values for the signatures included in an assay are described. Each signature enables detection of a target. The methods determine a probability density function of negative samples and a corresponding false positive rate curve. A false positive criterion is established, and the threshold for a signature is determined as the point at which the false positive rate curve intersects the false positive criterion. A method for quantitative analysis and interpretation of assay results, together with a method for determining a desired limit of detection of a signature in an assay, are also described.
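
    As a rough sketch of the patent's idea, the threshold can be read off the empirical false-positive-rate curve of known-negative samples; the gamma-distributed intensities and the 1% criterion are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical signature intensities measured on known-negative samples.
negatives = rng.gamma(shape=2.0, scale=10.0, size=5000)

# The threshold is the point where the empirical false-positive-rate curve
# P(negative > t) crosses the chosen false-positive criterion.
FPR_CRITERION = 0.01
threshold = float(np.quantile(negatives, 1.0 - FPR_CRITERION))

empirical_fpr = (negatives > threshold).mean()
```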

  8. First-principles simulation of the optical response of bulk and thin-film α-quartz irradiated with an ultrashort intense laser pulse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Kyung-Min; Min Kim, Chul; Moon Jeong, Tae, E-mail: jeongtm@gist.ac.kr

    A computational method based on a first-principles multiscale simulation has been used for calculating the optical response and the ablation threshold of an optical material irradiated with an ultrashort intense laser pulse. The method employs Maxwell's equations to describe laser pulse propagation and time-dependent density functional theory to describe the generation of conduction band electrons in an optical medium. Optical properties, such as reflectance and absorption, were investigated for laser intensities in the range 10^10 W/cm^2 to 2 × 10^15 W/cm^2 based on the theory of generation and spatial distribution of the conduction band electrons. The method was applied to investigate the changes in the optical reflectance of α-quartz bulk, half-wavelength thin-film, and quarter-wavelength thin-film and to estimate their ablation thresholds. Despite the adiabatic local density approximation used in calculating the exchange–correlation potential, the reflectance and the ablation threshold obtained from our method agree well with previous theoretical and experimental results. The method can be applied to estimate the ablation thresholds for optical materials in general. The ablation threshold data can be used to design ultra-broadband high-damage-threshold coating structures.

  9. Optimum threshold selection method of centroid computation for Gaussian spot

    NASA Astrophysics Data System (ADS)

    Li, Xuxu; Li, Xinyang; Wang, Caixia

    2015-10-01

    Centroid computation of a Gaussian spot is often conducted to get the exact position of a target or to measure wave-front slopes in the fields of target tracking and wave-front sensing. Center of Gravity (CoG) is the most traditional method of centroid computation, known for its low algorithmic complexity. However, both electronic noise from the detector and photonic noise from the environment reduce its accuracy. In order to improve the accuracy, thresholding before centroid computation is unavoidable, and an optimum threshold needs to be selected. In this paper, a model of the Gaussian spot is established to analyze the performance of the optimum threshold under different signal-to-noise ratio (SNR) conditions. Two optimum threshold selection methods are introduced: TmCoG, which uses m% of the maximum intensity of the spot as the threshold, and TkCoG, which uses μn + κσn as the threshold, where μn and σn are the mean and standard deviation of the background noise. First, their impact on the detection error under various SNR conditions is simulated to determine how to choose the value of κ or m. Then, a comparison between them is made. According to the simulation results, TmCoG is superior to TkCoG in the accuracy of the selected threshold and also yields a lower detection error.
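
    TmCoG can be sketched directly: subtract m% of the peak intensity, clip negatives to zero, and compute the center of gravity of what remains. The 20% fraction, spot parameters, and noise level below are assumed for illustration.

```python
import numpy as np

def tm_cog(img, m=0.2):
    """Center of gravity after subtracting a threshold of m * max(img)
    (the TmCoG rule, with an assumed m of 20%)."""
    t = m * img.max()
    w = np.clip(img - t, 0.0, None)        # subtract threshold, zero negatives
    ys, xs = np.indices(img.shape)
    return (w * xs).sum() / w.sum(), (w * ys).sum() / w.sum()

# Synthetic Gaussian spot centred at (x, y) = (12.3, 7.6) plus mild noise.
rng = np.random.default_rng(2)
ys, xs = np.indices((16, 24))
spot = np.exp(-((xs - 12.3) ** 2 + (ys - 7.6) ** 2) / (2 * 2.0**2))
noisy = spot + rng.normal(0.0, 0.02, spot.shape)

cx, cy = tm_cog(noisy)
```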

  10. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often used independently to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, the spatial verification metrics that are widely used for comparing mean states in most cases lack an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers users the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  11. Novel wavelet threshold denoising method in axle press-fit zone ultrasonic detection

    NASA Astrophysics Data System (ADS)

    Peng, Chaoyong; Gao, Xiaorong; Peng, Jianping; Wang, Ai

    2017-02-01

    Axles are important parts of railway locomotives and vehicles. Periodic ultrasonic inspection of axles can effectively detect and monitor axle fatigue cracks. However, in the axle press-fit zone, the complex interface contact condition reduces the signal-to-noise ratio (SNR), so the probability of false positives and false negatives increases. In this work, a novel wavelet threshold function is created to remove noise and suppress press-fit interface echoes in axle ultrasonic defect detection. The novel wavelet threshold function has two variables, designed to ensure the precision of the optimum search process. Based on the positive correlation between the correlation coefficient and SNR, and on the experimental observation that the defect echo and the press-fit interface echo have different axle-circumferential correlation characteristics, a discrete search for the two undetermined variables of the novel wavelet threshold function is conducted. The performance of the proposed method is assessed by comparing it with traditional threshold methods on real data. The statistical results for the amplitude and the peak SNR of defect echoes show that the proposed wavelet threshold denoising method not only maintains the amplitude of defect echoes but also achieves a higher peak SNR.
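
    The paper's two-variable threshold function is not public, so as a baseline for contrast, here is a minimal sketch of classical wavelet shrinkage: a one-level Haar transform with the universal soft threshold, assuming the noise level is known.

```python
import numpy as np

def haar_dwt(x):
    """One-level orthonormal Haar transform of an even-length signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(c, t):
    """Classical soft-threshold shrinkage of coefficients c at level t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(3)
n = 256
clean = np.sin(np.linspace(0.0, 6 * np.pi, n))
noisy = clean + rng.normal(0.0, 0.3, n)

a, d = haar_dwt(noisy)
t = 0.3 * np.sqrt(2 * np.log(n))           # universal threshold, sigma assumed known
denoised = haar_idwt(a, soft(d, t))
```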

  12. Regional rainfall thresholds for landslide occurrence using a centenary database

    NASA Astrophysics Data System (ADS)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

    This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes the rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using the receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and potential regression perform well in ROC metrics. However, the intermediate thresholds based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold are much more informative as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.
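
    A minimal sketch of ROC-style threshold calibration, using synthetic rainfall data and the true skill statistic (TPR - FPR) as the selection criterion; the paper's probabilistic intermediate thresholds are richer than this single-threshold picture.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical cumulated rainfall (mm): days without and with landslides.
rain_no_event = rng.gamma(2.0, 8.0, 2000)
rain_event = rng.gamma(2.0, 8.0, 60) + 40.0

def calibrate_threshold(pos, neg, candidates):
    """Pick the rainfall threshold maximizing TPR - FPR (true skill statistic)."""
    best_t, best_tss = None, -np.inf
    for t in candidates:
        tpr = (pos >= t).mean()   # landslide days correctly above threshold
        fpr = (neg >= t).mean()   # quiet days incorrectly above threshold
        if tpr - fpr > best_tss:
            best_t, best_tss = float(t), tpr - fpr
    return best_t, best_tss

t, tss = calibrate_threshold(rain_event, rain_no_event, np.arange(0.0, 150.0, 1.0))
```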

  13. Genetic variance of tolerance and the toxicant threshold model.

    PubMed

    Tanaka, Yoshinari; Mano, Hiroyuki; Tatsuta, Haruki

    2012-04-01

    A statistical genetics method is presented for estimating the genetic variance (heritability) of tolerance to pollutants on the basis of a standard acute toxicity test conducted on several isofemale lines of cladoceran species. To analyze the genetic variance of tolerance in the case when the response is measured as a few discrete states (quantal endpoints), the authors attempted to apply the threshold character model in quantitative genetics to the threshold model separately developed in ecotoxicology. The integrated threshold model (toxicant threshold model) assumes that the response of a particular individual occurs at a threshold toxicant concentration and that the individual tolerance characterized by the individual's threshold value is determined by genetic and environmental factors. As a case study, the heritability of tolerance to p-nonylphenol in the cladoceran species Daphnia galeata was estimated by using the maximum likelihood method and nested analysis of variance (ANOVA). Broad-sense heritability was estimated to be 0.199 ± 0.112 by the maximum likelihood method and 0.184 ± 0.089 by ANOVA; both results implied that the species examined had the potential to acquire tolerance to this substance by evolutionary change. Copyright © 2012 SETAC.
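
    A rough sketch of the ANOVA route to broad-sense heritability, using a simple one-way line/replicate layout with synthetic data; the paper's toxicant threshold model and maximum likelihood machinery are considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical tolerance scores (e.g. log threshold concentrations) for
# 5 isofemale lines with 8 replicates each (synthetic illustration).
n_lines, n_rep = 5, 8
line_effects = rng.normal(0.0, 0.5, n_lines)                 # genetic variation
data = line_effects[:, None] + rng.normal(0.0, 1.0, (n_lines, n_rep))

grand = data.mean()
msb = n_rep * ((data.mean(axis=1) - grand) ** 2).sum() / (n_lines - 1)
msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n_lines * (n_rep - 1))

# Variance components; broad-sense heritability as intraclass correlation.
var_g = max((msb - msw) / n_rep, 0.0)
H2 = var_g / (var_g + msw)
```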

  14. Thermal bistability-based method for real-time optimization of ultralow-threshold whispering gallery mode microlasers.

    PubMed

    Lin, Guoping; Candela, Y; Tillement, O; Cai, Zhiping; Lefèvre-Seguin, V; Hare, J

    2012-12-15

    A method based on thermal bistability for ultralow-threshold microlaser optimization is demonstrated. When sweeping the pump laser frequency across a pump resonance, the dynamic thermal bistability slows down the power variation. The resulting line shape modification enables a real-time monitoring of the laser characteristic. We demonstrate this method for a functionalized microsphere exhibiting a submicrowatt laser threshold. This approach is confirmed by comparing the results with a step-by-step recording in quasi-static thermal conditions.

  15. Adaptive spline autoregression threshold method in forecasting Mitsubishi car sales volume at PT Srikandi Diamond Motors

    NASA Astrophysics Data System (ADS)

    Susanti, D.; Hartini, E.; Permana, A.

    2017-01-01

    Growing sales competition between companies in Indonesia means that every company needs proper planning in order to win against its competitors. One way to support such planning is to forecast car sales for the next few periods, so that the inventory of cars is kept in proportion to the number of cars needed. One method that can be used to obtain accurate forecasts is Adaptive Spline Threshold Autoregression (ASTAR). This paper therefore focuses on using the ASTAR method to forecast the volume of car sales at PT Srikandi Diamond Motors using time series data. In this study, forecasts produced with the ASTAR method were approximately correct.

  16. In-air hearing of a diving duck: A comparison of psychoacoustic and auditory brainstem response thresholds

    USGS Publications Warehouse

    Crowell, Sara E.; Wells-Berlin, Alicia M.; Therrien, Ronald E.; Yannuzzi, Sally E.; Carr, Catherine E.

    2016-01-01

    Auditory sensitivity was measured in a species of diving duck that is not often kept in captivity, the lesser scaup. Behavioral (psychoacoustics) and electrophysiological [the auditory brainstem response (ABR)] methods were used to measure in-air auditory sensitivity, and the resulting audiograms were compared. Both approaches yielded audiograms with similar U-shapes and regions of greatest sensitivity (2000−3000 Hz). However, ABR thresholds were higher than psychoacoustic thresholds at all frequencies. This difference was least at the highest frequency tested using both methods (5700 Hz) and greatest at 1000 Hz, where the ABR threshold was 26.8 dB higher than the behavioral measure of threshold. This difference is commonly reported in studies involving many different species. These results highlight the usefulness of each method, depending on the testing conditions and availability of the animals.

  17. In-air hearing of a diving duck: A comparison of psychoacoustic and auditory brainstem response thresholds.

    PubMed

    Crowell, Sara E; Wells-Berlin, Alicia M; Therrien, Ronald E; Yannuzzi, Sally E; Carr, Catherine E

    2016-05-01

    Auditory sensitivity was measured in a species of diving duck that is not often kept in captivity, the lesser scaup. Behavioral (psychoacoustics) and electrophysiological [the auditory brainstem response (ABR)] methods were used to measure in-air auditory sensitivity, and the resulting audiograms were compared. Both approaches yielded audiograms with similar U-shapes and regions of greatest sensitivity (2000-3000 Hz). However, ABR thresholds were higher than psychoacoustic thresholds at all frequencies. This difference was least at the highest frequency tested using both methods (5700 Hz) and greatest at 1000 Hz, where the ABR threshold was 26.8 dB higher than the behavioral measure of threshold. This difference is commonly reported in studies involving many different species. These results highlight the usefulness of each method, depending on the testing conditions and availability of the animals.

  18. THRESHOLD LOGIC.

    DTIC Science & Technology

    synthesis procedures; a ’best’ method is definitely established. (2) ’Symmetry Types for Threshold Logic’ is a tutorial expositon including a careful...development of the Goto-Takahasi self-dual type ideas. (3) ’Best Threshold Gate Decisions’ reports a comparison, on the 2470 7-argument threshold ...interpretation is shown best. (4) ’ Threshold Gate Networks’ reviews the previously discussed 2-algorithm in geometric terms, describes our FORTRAN

  19. 3D SAPIV particle field reconstruction method based on adaptive threshold.

    PubMed

    Qu, Xiangju; Song, Yang; Jin, Ying; Li, Zhenhua; Wang, Xuezhen; Guo, ZhenYan; Ji, Yunjing; He, Anzhi

    2018-03-01

    Particle image velocimetry (PIV) is a necessary flow field diagnostic technique that provides instantaneous velocity information non-intrusively. Three-dimensional (3D) PIV methods can supply a full understanding of 3D structure, the complete stress tensor, and the vorticity vector in complex flows. In synthetic aperture particle image velocimetry (SAPIV), the flow field can be measured at high particle densities, imaged from the same direction by different cameras. During SAPIV particle reconstruction, particles are commonly reconstructed by manually setting a threshold to filter out unfocused particles in the refocused images. In this paper, the particle intensity distribution in refocused images is analyzed, and a SAPIV particle field reconstruction method based on an adaptive threshold is presented. By using the adaptive threshold to filter the 3D measurement volume integrally, the three-dimensional location information of the focused particles can be reconstructed. The cross correlations between the images captured by the cameras and the images projected by the reconstructed particle field are calculated for different threshold values. The optimal threshold is determined by cubic curve fitting and is defined as the threshold value at which the correlation coefficient reaches its maximum. A numerical simulation of a 16-camera array and a particle field at two adjacent time events quantitatively evaluates the performance of the proposed method. An experimental system consisting of an array of 16 cameras was used to reconstruct four adjacent frames in a vortex flow field. The results show that the proposed reconstruction method can effectively reconstruct the 3D particle fields.
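
    The threshold-versus-correlation search can be sketched as follows, with hypothetical correlation values standing in for the reprojection cross-correlations computed in the paper: fit a cubic through the sampled (threshold, correlation) points and take the threshold where the fitted curve peaks.

```python
import numpy as np

# Hypothetical correlation coefficients measured at a few trial thresholds
# (in practice each comes from reprojecting the reconstructed field).
thresholds = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
corr = np.array([0.61, 0.72, 0.80, 0.83, 0.78, 0.65])

# Cubic fit, then take the threshold where the fitted curve is maximal.
coeffs = np.polyfit(thresholds, corr, 3)
fine = np.linspace(thresholds[0], thresholds[-1], 1001)
t_opt = float(fine[int(np.argmax(np.polyval(coeffs, fine)))])
```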

  20. Automated tumour boundary delineation on 18F-FDG PET images using active contour coupled with shifted-optimal thresholding method

    NASA Astrophysics Data System (ADS)

    Khamwan, Kitiwat; Krisanachinda, Anchali; Pluempitiwiriyawej, Charnchai

    2012-10-01

    This study presents an automatic method to trace the boundary of a tumour in positron emission tomography (PET) images. It has been observed that Otsu's threshold value is biased when the within-class variances of the object and the background differ significantly. To solve the problem, a double-stage threshold search that minimizes the energy between the first Otsu threshold and the maximum intensity value is introduced. This shifted-optimal thresholding is embedded into a region-based active contour so that both algorithms are performed consecutively. The efficiency of the method is validated using six sphere inserts (0.52-26.53 cc volume) of the IEC/2001 torso phantom. The spheres and the phantom were filled with 18F solution, and PET images were acquired at four source-to-background ratios (SBRs). The results illustrate that the tumour volumes segmented by the combined algorithm are more accurate than those from the traditional active contour. The method has also been applied clinically in ten oesophageal cancer patients, with results evaluated against manual tracing by an experienced radiation oncologist. The advantage of the algorithm is the reduced erroneous delineation, which improves the precision and accuracy of PET tumour contouring. Moreover, the combined method is robust, independent of SBR threshold-volume curves, and does not require prior lesion size measurement.
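
    One plausible reading of the double-stage search can be sketched with standard Otsu as the first stage and a second Otsu restricted to the range above the first threshold; the synthetic data and the exact second-stage criterion are assumptions, not the paper's implementation.

```python
import numpy as np

def otsu(values, bins=256):
    """Standard Otsu threshold: maximize the between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = w[:i].sum(), w[i:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0 = (w[:i] * centers[:i]).sum() / w0
        m1 = (w[i:] * centers[i:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_t, best_var = centers[i], var
    return best_t

def shifted_otsu(values):
    """Second-stage search restricted to [first Otsu threshold, max]."""
    t1 = otsu(values)
    return t1, otsu(values[values >= t1])

# Synthetic PET-like intensities: a large dim background plus a small hot
# object, the unbalanced situation in which plain Otsu is biased.
rng = np.random.default_rng(6)
vals = np.concatenate([rng.normal(100.0, 15.0, 20000),
                       rng.normal(400.0, 40.0, 300)])
t1, t2 = shifted_otsu(vals)
```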

  1. Identification of ecological thresholds from variations in phytoplankton communities among lakes: contribution to the definition of environmental standards.

    PubMed

    Roubeix, Vincent; Danis, Pierre-Alain; Feret, Thibaut; Baudoin, Jean-Marc

    2016-04-01

    In aquatic ecosystems, the identification of ecological thresholds may be useful for managers as it can help to diagnose ecosystem health and to identify key levers to enable the success of preservation and restoration measures. A recent statistical method, gradient forest, based on random forests, was used to detect thresholds of phytoplankton community change in lakes along different environmental gradients. It performs exploratory analyses of multivariate biological and environmental data to estimate the location and importance of community thresholds along gradients. The method was applied to a data set of 224 French lakes which were characterized by 29 environmental variables and the mean abundances of 196 phytoplankton species. Results showed the high importance of geographic variables for the prediction of species abundances at the scale of the study. A second analysis was performed on a subset of lakes defined by geographic thresholds and presenting a higher biological homogeneity. Community thresholds were identified for the most important physico-chemical variables including water transparency, total phosphorus, ammonia, nitrates, and dissolved organic carbon. Gradient forest appeared as a powerful method at a first exploratory step, to detect ecological thresholds at large spatial scale. The thresholds that were identified here must be reinforced by the separate analysis of other aquatic communities and may be used then to set protective environmental standards after consideration of natural variability among lakes.

  2. Adaptive threshold shearlet transform for surface microseismic data denoising

    NASA Astrophysics Data System (ADS)

    Tang, Na; Zhao, Xian; Li, Yue; Zhu, Dan

    2018-06-01

    Random noise suppression plays an important role in microseismic data processing. Microseismic data is often corrupted by strong random noise, which directly influences the identification and location of microseismic events. The shearlet transform is a new multiscale transform that can effectively process low-magnitude microseismic data. In the shearlet domain, because valid signals and random noise have different distributions, shearlet coefficients can be shrunk by a threshold; the threshold is therefore vital in suppressing random noise. Conventional threshold denoising algorithms usually apply the same threshold to all coefficients, which causes inefficient noise suppression or loss of valid signals. In order to solve the above problems, we propose the adaptive threshold shearlet transform (ATST) for surface microseismic data denoising. In the new algorithm, we first calculate a fundamental threshold for each direction subband. In each direction subband, an adjustment factor is obtained from each subband coefficient and its neighboring coefficients, in order to adaptively regulate the fundamental threshold for different shearlet coefficients. Finally, we apply the adaptive threshold to the different shearlet coefficients. Experimental denoising results on synthetic records and field data illustrate that the proposed method performs better at suppressing random noise and preserving valid signal than the conventional shearlet denoising method.

  3. Twelve automated thresholding methods for segmentation of PET images: a phantom study.

    PubMed

    Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M

    2012-06-21

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing images in high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical (18)F-filled objects of different volumes were acquired on clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.

  4. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    NASA Astrophysics Data System (ADS)

    Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M.

    2012-06-01

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing images in high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.

  5. Differences in two-point discrimination and sensory threshold in the blind between braille and text reading: a pilot study.

    PubMed

    Noh, Ji-Woong; Park, Byoung-Sun; Kim, Mee-Young; Lee, Lim-Kyu; Yang, Seung-Min; Lee, Won-Deok; Shin, Yong-Sub; Kang, Ji-Hye; Kim, Ju-Hyun; Lee, Jeong-Uk; Kwak, Taek-Yong; Lee, Tae-Hyun; Kim, Ju-Young; Kim, Junghwan

    2015-06-01

    [Purpose] This study investigated two-point discrimination (TPD) and the electrical sensory threshold in the blind to define the effect of Braille use on the tactile and electrical senses. [Subjects and Methods] Twenty-eight blind participants were divided equally into a text-reading and a Braille-reading group. We measured tactile and electrical sensory thresholds using the TPD method and a transcutaneous electrical nerve stimulator. [Results] The left-palm TPD values differed significantly between the groups. The electrical sensory threshold in the left hand and the electrical pain thresholds in both hands were significantly lower in the Braille group than in the text group. [Conclusion] Apart from the palms, the differences in tactility between the groups are difficult to explain. However, our data show that Braille use can enhance development of the sensory median nerve in the blind, particularly in terms of the electrical sensory and pain thresholds.

  6. Differences in two-point discrimination and sensory threshold in the blind between braille and text reading: a pilot study

    PubMed Central

    Noh, Ji-Woong; Park, Byoung-Sun; Kim, Mee-Young; Lee, Lim-Kyu; Yang, Seung-Min; Lee, Won-Deok; Shin, Yong-Sub; Kang, Ji-Hye; Kim, Ju-Hyun; Lee, Jeong-Uk; Kwak, Taek-Yong; Lee, Tae-Hyun; Kim, Ju-Young; Kim, Junghwan

    2015-01-01

    [Purpose] This study investigated two-point discrimination (TPD) and the electrical sensory threshold in the blind to define the effect of Braille use on the tactile and electrical senses. [Subjects and Methods] Twenty-eight blind participants were divided equally into a text-reading and a Braille-reading group. We measured tactile and electrical sensory thresholds using the TPD method and a transcutaneous electrical nerve stimulator. [Results] The left-palm TPD values differed significantly between the groups. The electrical sensory threshold in the left hand and the electrical pain thresholds in both hands were significantly lower in the Braille group than in the text group. [Conclusion] Apart from the palms, the differences in tactility between the groups are difficult to explain. However, our data show that Braille use can enhance development of the sensory median nerve in the blind, particularly in terms of the electrical sensory and pain thresholds. PMID:26180348

  7. Definition of temperature thresholds: the example of the French heat wave warning system.

    PubMed

    Pascal, Mathilde; Wagner, Vérène; Le Tertre, Alain; Laaidi, Karine; Honoré, Cyrille; Bénichou, Françoise; Beaudeau, Pascal

    2013-01-01

    Heat-related deaths should be largely preventable. In France, prevention measures are activated when minimum and maximum temperatures averaged over three days reach city-specific thresholds. The current thresholds were computed from a descriptive analysis of past heat waves and local expert judgement. We tested whether a different method would confirm these thresholds. The study was set in the six cities of Paris, Lyon, Marseille, Nantes, Strasbourg and Limoges between 1973 and 2003. For each city, we estimated the excess mortality associated with different temperature thresholds, using a generalised additive model controlling for long-term trends, seasons and days of the week. These models were used to compute the mortality predicted by different percentiles of temperature, and the thresholds were chosen as the percentiles associated with a significant excess mortality. In all cities, there was good agreement between the current thresholds and the thresholds derived from the models, with differences of 0°C to 3°C for averaged maximum temperatures. Both sets of thresholds were able to anticipate the main periods of excess mortality during the summers of 1973 to 2003. A simple method relying on descriptive analysis and expert judgement is thus sufficient to define protective temperature thresholds and to prevent heat wave mortality. As temperatures increase with climate change and adaptation proceeds, more research is required to understand whether and when thresholds should be modified.

  8. Calculation of photoionization cross section near auto-ionizing lines and magnesium photoionization cross section near threshold

    NASA Technical Reports Server (NTRS)

    Moore, E. N.; Altick, P. L.

    1972-01-01

    The research performed is briefly reviewed. A simple method was developed for the calculation of continuum states of atoms when autoionization is present. The method was employed to give the first theoretical cross section for beryllium and magnesium; the results indicate that the values used previously at threshold were sometimes seriously in error. These threshold values have potential applications in astrophysical abundance estimates.

  9. A de-noising method using the improved wavelet threshold function based on noise variance estimation

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Wang, Weida; Xiang, Changle; Han, Lijin; Nie, Haizhao

    2018-01-01

    Precise and efficient noise variance estimation is very important when using the wavelet transform to analyze signals and extract signal features. In view of the problem that the accuracy of traditional noise variance estimation is greatly affected by fluctuations in the noise values, this study puts forward the strategy of using a two-state Gaussian mixture model to classify the high-frequency wavelet coefficients at the finest scale, taking both efficiency and accuracy into account. Based on the noise variance estimate, a novel improved wavelet threshold function is proposed that combines the advantages of the hard and soft threshold functions, and on the basis of the noise variance estimation algorithm and the improved threshold function, a wavelet threshold de-noising method is put forward. The method is tested and validated using random signals and bench-test data of an electro-mechanical transmission system. The test results indicate that the wavelet threshold de-noising method based on the noise variance estimation performs well on the test signals of the electro-mechanical transmission system: it effectively eliminates transient interference in the voltage, current, and oil-pressure signals while favorably maintaining their dynamic characteristics.
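    Two of the ingredients are standard enough to sketch: a robust noise estimate from the finest-scale detail coefficients (the Donoho-Johnstone MAD estimator, used here as a stand-in for the paper's Gaussian-mixture scheme) and a threshold function that interpolates between hard and soft thresholding (the paper's exact function may differ):

```python
import numpy as np

def mad_sigma(detail_coeffs):
    """Robust noise std estimate from finest-scale wavelet detail
    coefficients: sigma = median(|d|) / 0.6745 (Gaussian consistency)."""
    return np.median(np.abs(detail_coeffs)) / 0.6745

def improved_threshold(c, t, alpha=0.5):
    """Compromise shrinkage: alpha=0 reproduces hard thresholding,
    alpha=1 reproduces soft thresholding; 0<alpha<1 interpolates."""
    return np.where(np.abs(c) > t,
                    np.sign(c) * (np.abs(c) - alpha * t),
                    0.0)
```

    The MAD estimator is insensitive to the few large (signal-bearing) coefficients, which is exactly the fluctuation problem the abstract describes.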

  10. Using pyramids to define local thresholds for blob detection.

    PubMed

    Shneier, M

    1983-03-01

    A method of detecting blobs in images is described. The method involves building a succession of lower resolution images and looking for spots in these images. A spot in a low resolution image corresponds to a distinguished compact region in a known position in the original image. Further, it is possible to calculate thresholds in the low resolution image, using very simple methods, and to apply those thresholds to the region of the original image corresponding to the spot. Examples are shown in which variations of the technique are applied to several images.
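    The pyramid construction and the "spot" test can be sketched directly. The 2x2 averaging and strict 8-neighbour-maximum criterion below are one simple choice of reduction and spot definition, not necessarily those of the original paper:

```python
import numpy as np

def build_pyramid(img, levels):
    """Successively halve resolution by 2x2 block averaging."""
    pyr = [np.asarray(img, dtype=float)]
    for _ in range(levels):
        a = pyr[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2
        a = a[:h, :w]
        coarse = (a[0::2, 0::2] + a[1::2, 0::2]
                  + a[0::2, 1::2] + a[1::2, 1::2]) / 4.0
        pyr.append(coarse)
    return pyr

def find_spots(level, margin=1):
    """A 'spot': pixel strictly brighter than all 8 neighbours.
    Each spot at level k corresponds to a compact region of roughly
    2^k x 2^k pixels at a known position in the original image."""
    spots = []
    a = level
    for i in range(margin, a.shape[0] - margin):
        for j in range(margin, a.shape[1] - margin):
            neigh = a[i - 1:i + 2, j - 1:j + 2].copy()
            neigh[1, 1] = -np.inf
            if a[i, j] > neigh.max():
                spots.append((i, j))
    return spots
```

    A local threshold for the corresponding region of the original image could then be derived from the spot value and its coarse-level surround, in the spirit of the abstract.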

  11. Multi-mode ultrasonic welding control and optimization

    DOEpatents

    Tang, Jason C.H.; Cai, Wayne W

    2013-05-28

    A system and method for providing multi-mode control of an ultrasonic welding system. In one embodiment, the control modes include the energy of the weld, the duration of the welding process, and the compression displacement of the parts being welded. The method includes providing a threshold for each mode and terminating the welding process when the thresholds for all modes, for more than one mode, or for any single mode have been reached. The welding control can be either open-loop or closed-loop: the open-loop process provides the mode thresholds and terminates welding once one or more of them is reached, while the closed-loop control feeds back the weld energy and/or the compression displacement so that the weld power and/or weld pressure can be increased or decreased accordingly.
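    The multi-mode termination logic reduces to checking the measured quantities against their thresholds. A minimal sketch of the open-loop case (the names, units, and the any/all switch are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class Thresholds:
    energy_j: float          # weld energy threshold (joules)
    time_s: float            # weld duration threshold (seconds)
    displacement_mm: float   # compression displacement threshold (mm)

def should_terminate(energy_j, time_s, displacement_mm, th, mode="any"):
    """Open-loop multi-mode termination: stop when any one threshold is
    reached ('any'), or only when all thresholds are reached ('all')."""
    hits = [energy_j >= th.energy_j,
            time_s >= th.time_s,
            displacement_mm >= th.displacement_mm]
    return any(hits) if mode == "any" else all(hits)
```

    A closed-loop controller would additionally adjust power or pressure from the fed-back energy and displacement before re-evaluating this check each cycle.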

  12. Cloud Detection of Optical Satellite Images Using Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Lee, Kuan-Yi; Lin, Chao-Hung

    2016-06-01

    Cloud cover is generally present in optical remote-sensing images, which limits the usage of acquired images and increases the difficulty of data analysis, such as image compositing, correction of atmospheric effects, calculation of vegetation indices, land cover classification, and land cover change detection. In previous studies, thresholding has been a common and useful method for cloud detection. However, a selected threshold is usually suitable only for certain cases or local study areas and may fail in others; in other words, thresholding-based methods are data-sensitive. Moreover, there are many exceptions to handle and the environment changes dynamically, so using the same threshold value on various data is not effective. In this study, a threshold-free method based on the Support Vector Machine (SVM) is proposed to avoid these problems: a statistical model is adopted to detect clouds instead of a subjective thresholding-based method, which is the main idea of this study. The features used in a classifier are the key to successful classification. The Automatic Cloud Cover Assessment (ACCA) algorithm, which is based on the physical characteristics of clouds, is used to distinguish clouds from other objects; similarly, the Fmask algorithm (Zhu et al., 2012) uses many thresholds and criteria to screen clouds, cloud shadows, and snow. The feature extraction is therefore based on the ACCA algorithm and Fmask. Spatial and temporal information are also important for satellite images, so the co-occurrence matrix and the temporal variance with uniformity of the major principal axis are used in the proposed method. We aim to classify images into three groups: cloud, non-cloud, and others. In experiments, images acquired by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+), containing landscapes of agriculture, snow areas, and islands, are tested. Experimental results demonstrate that the detection accuracy of the proposed method is better than that of related methods.

  13. Imposed Power of Breathing Associated With Use of an Impedance Threshold Device

    DTIC Science & Technology

    2007-02-01

    threshold device and a sham impedance threshold device. DESIGN: Prospective randomized blinded protocol. SETTING: University medical center. PATIENTS...for males). METHODS: The volunteers completed 2 trials of breathing through a face mask fitted with an active impedance threshold device set to open...at -7cmH 2 O pressure, or with a sham impedance threshold device, which was identical to the active device except that it did not contain an

  14. LCAMP: Location Constrained Approximate Message Passing for Compressed Sensing MRI

    PubMed Central

    Sung, Kyunghyun; Daniel, Bruce L; Hargreaves, Brian A

    2016-01-01

    Iterative thresholding methods have been extensively studied as faster alternatives to convex optimization methods for solving large-sized problems in compressed sensing. A novel iterative thresholding method called LCAMP (Location Constrained Approximate Message Passing) is presented for reducing computational complexity and improving reconstruction accuracy when a nonzero location (or sparse support) constraint can be obtained from view shared images. LCAMP modifies the existing approximate message passing algorithm by replacing the thresholding stage with a location constraint, which avoids adjusting regularization parameters or thresholding levels. This work is first compared with other conventional reconstruction methods using random 1D signals and then applied to dynamic contrast-enhanced breast MRI to demonstrate the excellent reconstruction accuracy (less than 2% absolute difference) and low computation time (5 - 10 seconds using Matlab) with highly undersampled 3D data (244 × 128 × 48; overall reduction factor = 10). PMID:23042658
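    The general shape of support-constrained iterative thresholding is easy to sketch. The code below illustrates the idea on plain ISTA rather than approximate message passing, so it is a simplification of LCAMP: when the nonzero locations are known, the thresholding step becomes a projection onto the support and no threshold level needs tuning:

```python
import numpy as np

def ista_support(A, y, support=None, t=0.1, iters=200, step=None):
    """Iterative thresholding for y = A x with sparse x.

    With support=None this is ordinary ISTA (soft thresholding with
    level t). If the nonzero locations are known, the thresholding
    stage is replaced by a projection onto the support, so no
    regularization parameter or threshold level has to be adjusted.
    """
    m, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the LS gradient
    x = np.zeros(n)
    for _ in range(iters):
        g = x + step * A.T @ (y - A @ x)          # gradient step
        if support is not None:
            x = np.zeros(n)
            x[support] = g[support]               # keep known locations only
        else:
            x = np.sign(g) * np.maximum(np.abs(g) - step * t, 0.0)
    return x
```

    With an exact support constraint the iteration is just projected gradient descent on the restricted least-squares problem, which converges linearly for well-conditioned sub-matrices.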

  15. Effects of threshold on the topology of gene co-expression networks.

    PubMed

    Couto, Cynthia Martins Villar; Comin, César Henrique; Costa, Luciano da Fontoura

    2017-09-26

    Several developments regarding the analysis of gene co-expression profiles using complex network theory have been reported recently. Such approaches usually start with the construction of an unweighted gene co-expression network, therefore requiring the selection of a suitable threshold defining which pairs of vertices will be connected. We aimed at addressing such an important problem by suggesting and comparing five different approaches for threshold selection. Each of the methods considers a respective biologically-motivated criterion for electing a potentially suitable threshold. A set of 21 microarray experiments from different biological groups was used to investigate the effect of applying the five proposed criteria to several biological situations. For each experiment, we used the Pearson correlation coefficient to measure the relationship between each gene pair, and the resulting weight matrices were thresholded considering several values, generating respective adjacency matrices (co-expression networks). Each of the five proposed criteria was then applied in order to select the respective threshold value. The effects of these thresholding approaches on the topology of the resulting networks were compared by using several measurements, and we verified that, depending on the database, the impact on the topological properties can be large. However, a group of databases was verified to be similarly affected by most of the considered criteria. Based on such results, it can be suggested that when the generated networks present similar measurements, the thresholding method can be chosen with greater freedom. If the generated networks are markedly different, the thresholding method that better suits the interests of each specific research study represents a reasonable choice.
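    The construction step shared by all five criteria, thresholding a Pearson correlation matrix into an unweighted adjacency matrix, can be sketched as follows (the density helper is one hypothetical example of a topological measurement to compare across thresholds, not one of the paper's specific criteria):

```python
import numpy as np

def coexpression_network(expr, threshold):
    """Build an unweighted gene co-expression network.

    expr: genes x samples expression matrix.
    An edge connects two genes when |Pearson r| >= threshold.
    """
    r = np.corrcoef(expr)                  # gene-gene Pearson correlations
    adj = (np.abs(r) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)               # no self-loops
    return adj

def network_density(adj):
    """Fraction of possible edges present (illustrative measurement)."""
    n = adj.shape[0]
    return adj.sum() / (n * (n - 1))
```

    Sweeping `threshold` and recomputing such measurements is exactly how the topological impact of each selection criterion can be compared.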

  16. Influence of aging on thermal and vibratory thresholds of quantitative sensory testing.

    PubMed

    Lin, Yea-Huey; Hsieh, Song-Chou; Chao, Chi-Chao; Chang, Yang-Chyuan; Hsieh, Sung-Tsang

    2005-09-01

    Quantitative sensory testing has become a common approach to evaluate thermal and vibratory thresholds in various types of neuropathies. To understand the effect of aging on sensory perception, we measured warm, cold, and vibratory thresholds by performing quantitative sensory testing on a population of 484 normal subjects (175 males and 309 females), aged 48.61 +/- 14.10 (range 20-86) years. Sensory thresholds of the hand and foot were measured with two algorithms: the method of limits (Limits) and the method of level (Level). Thresholds measured by Limits are reaction-time-dependent, while those measured by Level are independent of reaction time. In addition, we explored (1) the correlations of thresholds between these two algorithms, (2) the effect of age on differences in thresholds between algorithms, and (3) differences in sensory thresholds between the two test sites. Age was consistently and significantly correlated with sensory thresholds of all tested modalities measured by both algorithms on multivariate regression analysis compared with other factors, including gender, body height, body weight, and body mass index. When thresholds were plotted against age, slopes differed between sensory thresholds of the hand and those of the foot: for the foot, slopes were steeper compared with those for the hand for each sensory modality. Sensory thresholds of both test sites measured by Level were highly correlated with those measured by Limits, and thresholds measured by Limits were higher than those measured by Level. Differences in sensory thresholds between the two algorithms were also correlated with age: thresholds of the foot were higher than those of the hand for each sensory modality. This difference in thresholds (measured with both Level and Limits) between the hand and foot was also correlated with age. 
These findings suggest that age is the most significant factor in determining sensory thresholds compared with the other factors of gender and anthropometric parameters, and this provides a foundation for investigating the neurobiologic significance of aging on the processing of sensory stimuli.

  17. Global gray-level thresholding based on object size.

    PubMed

    Ranefall, Petter; Wählby, Carolina

    2016-04-01

    In this article, we propose a fast and robust global gray-level thresholding method based on object size, where the selection of threshold level is based on recall and maximum precision with regard to objects within a given size interval. The method relies on the component tree representation, which can be computed in quasi-linear time. Feature-based segmentation is especially suitable for biomedical microscopy applications where objects often vary in number, but have limited variation in size. We show that for real images of cell nuclei and synthetic data sets mimicking fluorescent spots the proposed method is more robust than all standard global thresholding methods available for microscopy applications in ImageJ and CellProfiler. The proposed method, provided as ImageJ and CellProfiler plugins, is simple to use and the only required input is an interval of the expected object sizes. © 2016 International Society for Advancement of Cytometry.
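    The paper's criterion operates on the component tree; as a simplified stand-in, one can score every candidate global threshold by how many connected components it yields inside the expected size interval (brute-force labeling below, rather than the quasi-linear component-tree algorithm):

```python
import numpy as np
from collections import deque

def component_sizes(mask):
    """Sizes of 4-connected foreground components in a boolean mask."""
    lab = -np.ones(mask.shape, dtype=int)
    sizes = []
    for si in range(mask.shape[0]):
        for sj in range(mask.shape[1]):
            if mask[si, sj] and lab[si, sj] < 0:
                q, size = deque([(si, sj)]), 0
                lab[si, sj] = len(sizes)
                while q:
                    i, j = q.popleft()
                    size += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = i + di, j + dj
                        if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                                and mask[a, b] and lab[a, b] < 0):
                            lab[a, b] = len(sizes)
                            q.append((a, b))
                sizes.append(size)
    return sizes

def size_based_threshold(img, smin, smax, candidates):
    """Pick the global threshold yielding the most objects whose size
    falls inside [smin, smax]."""
    best_t, best_n = candidates[0], -1
    for t in candidates:
        n = sum(smin <= s <= smax for s in component_sizes(img >= t))
        if n > best_n:
            best_t, best_n = t, n
    return best_t
```

    A threshold that merges objects into large background structures, or erodes them below the size interval, scores poorly, which is the intuition behind the recall/precision criterion.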

  18. Study on the threshold of a stochastic SIR epidemic model and its extensions

    NASA Astrophysics Data System (ADS)

    Zhao, Dianli

    2016-09-01

    This paper provides a simple but effective method for estimating the threshold of a class of stochastic epidemic models by use of the nonnegative semimartingale convergence theorem. First, the threshold R0SIR is obtained for the stochastic SIR model with a saturated incidence rate; whether its value lies below or above 1 completely determines whether the disease goes extinct or prevails, for any intensity of the white noise. Moreover, when R0SIR > 1, the system is proved to be convergent in time mean. The thresholds of the stochastic SIVS models with or without a saturated incidence rate are then established by the same method. Compared with the previously known literature, the related results are improved and the method is simpler than before.

  19. Uncertainty in determining extreme precipitation thresholds

    NASA Astrophysics Data System (ADS)

    Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili

    2013-10-01

    Extreme precipitation events are rare and occur mostly on a relatively small, local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study assessed the applicability of non-parametric, parametric, and detrended fluctuation analysis (DFA) methods in determining the extreme precipitation threshold (EPT), and the certainty of the EPTs from each method. Analyses from this study show that the non-parametric absolute-critical-value method is easy to use but unable to reflect differences in the spatial rainfall distribution. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and subject to the selection of a percentile, making it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the aptest description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the selection of probability distribution functions, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which are unable to provide EPTs with certainty, the DFA method, although computationally involved, proved to be the most appropriate, providing a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution. The consistency of the spatial distribution of the DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of daily precipitation further indicates that EPTs determined by the DFA method are reasonable and applicable for the Pearl River Basin.
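    A bare-bones version of the DFA computation referenced above can be sketched as follows (first-order detrending over non-overlapping windows; applying it to precipitation-threshold determination involves further steps not shown here):

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis.

    Returns the fluctuation function F(s) for each window size s; the
    slope of log F(s) vs log s estimates the scaling exponent alpha.
    """
    y = np.cumsum(x - np.mean(x))                # integrated profile
    fluct = []
    for s in scales:
        n_win = len(y) // s
        ms = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, order)      # local polynomial trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(ms)))
    return np.array(fluct)

def dfa_exponent(x, scales):
    f = dfa(x, scales)
    slope, _ = np.polyfit(np.log(scales), np.log(f), 1)
    return slope
```

    For uncorrelated white noise the exponent is 0.5; persistent long-range correlations push it above 0.5, which is the kind of structure the basin-wide threshold analysis exploits.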

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Yuyu; Smith, Steven J.; Elvidge, Christopher

    Accurate information on urban areas at regional and global scales is important for both the science and policy-making communities. The Defense Meteorological Satellite Program/Operational Linescan System (DMSP/OLS) nighttime stable light (NTL) data provide a potential way to map urban areas and their dynamics economically and in a timely manner. In this study, we developed a cluster-based method to estimate optimal thresholds and map urban extents from the DMSP/OLS NTL data in five major steps: data preprocessing, urban cluster segmentation, logistic model development, threshold estimation, and urban extent delineation. Unlike previous fixed-threshold methods, which suffer from over- and under-estimation, in our method the optimal thresholds are estimated from the cluster size and the overall nightlight magnitude in the cluster, and they vary between clusters. Two large countries, the United States and China, with different urbanization patterns were selected for mapping urban extents using the proposed method. The results indicate that urbanized area occupies about 2% of total land area in the US, ranging from below 0.5% to above 10% at the state level, and less than 1% in China, ranging from below 0.1% to about 5% at the province level, with some municipalities as high as 10%. The derived thresholds and urban extents were evaluated using high-resolution land cover data at the cluster and regional levels, and the method was found to map urban areas in both countries efficiently and accurately. Compared to previous threshold techniques, our method reduces over- and under-estimation when mapping urban extent over a large area. More importantly, it shows potential for mapping global urban extents and temporal dynamics from the DMSP/OLS NTL data in a timely, cost-effective way.

  1. Intraoperative detection of 18F-FDG-avid tissue sites using the increased probe counting efficiency of the K-alpha probe design and variance-based statistical analysis with the three-sigma criteria

    PubMed Central

    2013-01-01

    Background Intraoperative detection of 18F-FDG-avid tissue sites during 18F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of 18F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Methods Of 58 patients undergoing 18F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine 18F-FDG-avid tissue sites (from amongst seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging; each of these sites underwent attempted in situ intraoperative detection concurrently with three gamma detection probe systems (the K-alpha probe and two commercially available PET-probe systems) and was subsequently surgically excised. Results The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2–15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0–2.1) and 1.0 (± 0, range 1.0–1.0), respectively, for the two commercially available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of 18F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes tested by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, with the three-sigma method being significantly better than the ratiometric method for determining probe positivity for the K-alpha probe (P = 0.05).
Conclusions Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three-sigma statistical threshold criteria method can allow for improved detection of 18F-FDG-avid tissue sites when a low in situ T/B ratio is encountered. PMID:23496877
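    The three-sigma statistical threshold criterion, as described, amounts to a counting-statistics test against the background. A minimal sketch under a Poisson assumption (sigma taken as the square root of the mean background counts, which is an assumption of this sketch), with a fixed-ratio criterion shown for contrast:

```python
import math

def probe_positive_three_sigma(target_counts, bkg_counts, k=3.0):
    """Variance-based criterion: positive if the target count exceeds the
    background by k standard deviations, assuming Poisson counting
    statistics (sigma = sqrt(mean counts))."""
    return target_counts > bkg_counts + k * math.sqrt(bkg_counts)

def probe_positive_ratiometric(target_counts, bkg_counts, ratio=1.5):
    """Fixed target-to-background ratio criterion, for comparison.
    The 1.5 default is illustrative, not a clinical standard."""
    return target_counts / bkg_counts >= ratio
```

    With a background of 100 counts, a site at 135 counts clears the three-sigma cutoff (130) yet has a T/B ratio of only 1.35, illustrating how the statistical criterion can flag sites that a fixed-ratio rule misses at low T/B.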

  2. The effect of condoms on penile vibrotactile sensitivity thresholds in young, heterosexual men

    PubMed Central

    Hill, Brandon J.; Janssen, Erick; Kvam, Peter; Amick, Erick E.; Sanders, Stephanie A.

    2013-01-01

    Introduction Investigating the ways in which barrier methods such as condoms may affect penile sensory thresholds has potential relevance to the development of interventions in men who experience negative effects of condoms on sexual response and sensation. A quantitative, psychophysiological investigation examining the degree to which sensations are altered by condoms has, to date, not been conducted. Aim The objective of this study was to examine penile vibrotactile sensitivity thresholds in both flaccid and erect penises with and without a condom, while comparing men who do and those who do not report condom-associated erection problems (CAEP). Methods Penile vibrotactile sensitivity thresholds were assessed among a total of 141 young, heterosexual men using biothesiometry. An incremental two-step staircase method was used and repeated three times for each of four conditions. Intra-class correlation coefficients (ICC) were calculated for all vibratory assessments. Penile vibratory thresholds were compared using a mixed-model Analysis of Variance (ANOVA). Main Outcome Measures Penile vibrotactile sensitivity thresholds with and without a condom, erectile function measured by International Index of Erectile Function Questionnaire (IIEF), and self-reported degree of erection. Results Significant main effects of condoms (yes/no) and erection (yes/no) were found. No main or interaction effects of CAEP were found. Condoms were associated with higher penile vibrotactile sensitivity thresholds (F(1, 124)=17.11, p<.001). Penile vibrotactile thresholds were higher with an erect than with a flaccid penis (F(1, 124)=4.21, p=.042). Conclusion The current study demonstrates the feasibility of measuring penile vibratory thresholds with and without a condom in both erect and flaccid experimental conditions. As might be expected, condoms increased penile vibrotactile sensitivity thresholds. Interestingly, erections were associated with the highest thresholds. 
Thus, this study was the first to document that erect penises are less sensitive to vibrotactile stimulation than flaccid penises. PMID:24168347

  3. CHOW PARAMETERS IN THRESHOLD LOGIC,

    DTIC Science & Technology

    respect to threshold functions, they provide the optimal test-synthesis method for completely specified 7-argument (or less) functions, reflect the...signs and relative magnitudes of realizing weights and threshold , and can be used themselves as approximating weights. Results are reproved in a

  4. Robust optimization of the laser induced damage threshold of dielectric mirrors for high power lasers.

    PubMed

    Chorel, Marine; Lanternier, Thomas; Lavastre, Éric; Bonod, Nicolas; Bousquet, Bruno; Néauport, Jérôme

    2018-04-30

    We report on a numerical optimization of the laser-induced damage threshold (LIDT) of multi-dielectric high-reflection mirrors in the sub-picosecond regime. We highlight the interplay of the electric field distribution, the refractive indices, and the intrinsic LIDTs of the materials in setting the overall LIDT of the multilayer. We describe an optimization method that minimizes the field enhancement in the high-refractive-index materials while preserving near-perfect reflectivity. This method yields a significant improvement in damage resistance: an increase of up to 40% in the overall LIDT of the multilayer.
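    The reflectivity (and field) calculations underlying such optimizations use the standard characteristic-matrix method. A minimal normal-incidence reflectance sketch is below; the quarter-wave stack and indices in the test are illustrative of a generic high/low-index pair, not the paper's materials or design:

```python
import numpy as np

def stack_reflectance(n0, ns, n_layers, d_layers, wavelength):
    """Normal-incidence reflectance of a dielectric multilayer computed
    with the characteristic (transfer) matrix method.

    n0: incident-medium index; ns: substrate index;
    n_layers/d_layers: refractive index and physical thickness of each
    layer, ordered from the incident medium toward the substrate.
    """
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2.0 * np.pi * n * d / wavelength        # phase thickness
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, ns])                      # surface admittance
    r = (n0 * B - C) / (n0 * B + C)                     # amplitude reflectance
    return abs(r) ** 2
```

    An LIDT-oriented optimizer would evaluate, in the same matrix framework, the standing-wave field inside each layer and then perturb layer thicknesses to push field maxima out of the high-index (lower intrinsic LIDT) material.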

  5. [The analysis of threshold effect using Empower Stats software].

    PubMed

    Lin, Lin; Chen, Chang-zhong; Yu, Xiao-dan

    2013-11-01

    In many biomedical studies of a factor's (x) influence on an outcome variable (y), the factor has no influence, or a positive effect, only within a certain range; beyond a certain threshold value, the size and/or direction of the effect changes, which is called a threshold effect. Whether a threshold effect of x on y exists can be checked by fitting a smooth curve to see whether a piecewise-linear relationship is present, and then analyzed using a segmented regression model, a likelihood-ratio test (LRT), and the bootstrap resampling method. EmpowerStats software, developed by X&Y Solutions Inc. (USA), provides a threshold-effect analysis module. The user may specify a threshold value at which the segmented model is fitted, or leave the threshold unspecified and let the software determine the optimal threshold automatically and calculate its confidence interval.
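    The segmented-regression step can be sketched as an ordinary least-squares grid search over candidate breakpoints. This is a generic illustration, not EmpowerStats' internal algorithm, and the LRT and bootstrap steps are omitted:

```python
import numpy as np

def fit_segmented(x, y, candidates):
    """Two-segment linear model via breakpoint grid search.

    For each candidate breakpoint k, fit
        y ~ b0 + b1*x + b2*max(x - k, 0)
    by least squares; keep the k with the smallest residual sum of
    squares. b2 is the change in slope at the threshold.
    """
    best = None
    for k in candidates:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - k, 0.0)])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        rss = np.sum((y - X @ beta) ** 2)
        if best is None or rss < best[0]:
            best = (rss, k, beta)
    return best[1], best[2]     # estimated threshold, coefficients
```

    Comparing this fit against a single straight line (e.g. with a likelihood-ratio test) is what establishes whether the threshold effect is real rather than noise.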

  6. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    NASA Astrophysics Data System (ADS)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

Good-quality electrocardiogram (ECG) recordings are utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be mixed with various noises, such as baseline wander, power line interference, and electromagnetic interference, during the gathering and recording process. As ECG signals are non-stationary physiological signals, the wavelet transform has proven to be an effective tool for discarding noise from corrupted signals. A new compromising threshold function, a sigmoid-function-based thresholding scheme, is adopted in processing ECG signals. Compared with other methods such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages in the noise reduction of ECG signals. It overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding denoising is shown to be more efficient than existing algorithms for ECG signal denoising. The signal-to-noise ratio, mean square error, and percent root-mean-square difference are calculated as quantitative measures of denoising performance. The experimental results reveal that, with the proposed method, the P, Q, R, and S waves of the denoised ECG signals coincide with those of the original ECG signals.
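
The paper's exact function is not reproduced here; a representative sigmoid-weighted shrinkage with the properties described (continuous at ±T, unlike hard thresholding, and asymptotically unbiased for large coefficients, unlike soft thresholding) can be sketched as:

```python
import numpy as np

def sigmoid_threshold(w, T, k=10.0):
    """Sigmoid-weighted shrinkage of wavelet coefficients w: the weight
    1/(1 + exp(-k(|w| - T))) is continuous everywhere, tends to 0 well
    below T, and tends to 1 well above T, so large coefficients pass
    almost unchanged (no fixed bias of T). The steepness k is a free
    parameter of this sketch."""
    return w / (1.0 + np.exp(-k * (np.abs(w) - T)))

w = np.array([0.05, 0.5, 5.0])
out = sigmoid_threshold(w, T=1.0)
```

At |w| = T the weight is exactly 1/2, so the function passes continuously through the transition instead of jumping as the hard threshold does.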

  7. Objective definition of rainfall intensity-duration thresholds for the initiation of post-fire debris flows in southern California

    USGS Publications Warehouse

    Staley, Dennis; Kean, Jason W.; Cannon, Susan H.; Schmidt, Kevin M.; Laber, Jayme L.

    2012-01-01

    Rainfall intensity–duration (ID) thresholds are commonly used to predict the temporal occurrence of debris flows and shallow landslides. Typically, thresholds are subjectively defined as the upper limit of peak rainstorm intensities that do not produce debris flows and landslides, or as the lower limit of peak rainstorm intensities that initiate debris flows and landslides. In addition, peak rainstorm intensities are often used to define thresholds, as data regarding the precise timing of debris flows and associated rainfall intensities are usually not available, and rainfall characteristics are often estimated from distant gauging locations. Here, we attempt to improve the performance of existing threshold-based predictions of post-fire debris-flow occurrence by utilizing data on the precise timing of debris flows relative to rainfall intensity, and develop an objective method to define the threshold intensities. We objectively defined the thresholds by maximizing the number of correct predictions of debris flow occurrence while minimizing the rate of both Type I (false positive) and Type II (false negative) errors. We identified that (1) there were statistically significant differences between peak storm and triggering intensities, (2) the objectively defined threshold model presents a better balance between predictive success, false alarms and failed alarms than previous subjectively defined thresholds, (3) thresholds based on measurements of rainfall intensity over shorter duration (≤60 min) are better predictors of post-fire debris-flow initiation than longer duration thresholds, and (4) the objectively defined thresholds were exceeded prior to the recorded time of debris flow at frequencies similar to or better than subjective thresholds. Our findings highlight the need to better constrain the timing and processes of initiation of landslides and debris flows for future threshold studies. 
In addition, the methods used to define rainfall thresholds in this study represent a computationally simple means of deriving critical values for other studies of nonlinear phenomena characterized by thresholds.
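
The paper's objective function is paraphrased here; a common stand-in that jointly penalizes false alarms (Type I) and failed alarms (Type II) is the true skill statistic, sensitivity + specificity - 1, maximized over candidate thresholds:

```python
import numpy as np

def objective_threshold(intensity, event):
    """Scan candidate intensity thresholds and keep the one maximizing
    sensitivity + specificity - 1 (true skill statistic), i.e., the best
    balance between failed alarms and false alarms."""
    best_t, best_tss = None, -np.inf
    for t in np.unique(intensity):
        pred = intensity >= t
        sens = np.mean(pred[event])    # fraction of debris flows predicted
        spec = np.mean(~pred[~event])  # fraction of quiet storms passed
        if sens + spec - 1.0 > best_tss:
            best_t, best_tss = t, sens + spec - 1.0
    return best_t

# Hypothetical short-duration peak intensities (mm/h) and debris-flow record
i15 = np.array([4.0, 8.0, 12.0, 16.0, 20.0, 24.0, 28.0, 32.0])
flow = np.array([False, False, False, False, True, True, True, True])
t_star = objective_threshold(i15, flow)
```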

  8. Method for early detection of cooling-loss events

    DOEpatents

    Bermudez, Sergio A.; Hamann, Hendrik; Marianno, Fernando J.

    2015-06-30

A method of detecting cooling-loss events early is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined relative humidity limit; generating a warning in the event the measured relative humidity is outside the defined relative humidity limit; and determining whether a change in the measured relative humidity is less than the defined change threshold for the given space, generating an alarm in the event the change is greater than the defined change threshold.
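
The two-tier logic of the claim (a warning on a limit violation, an alarm on a rapid change) can be sketched as follows; the numeric limits are hypothetical, not taken from the patent:

```python
def assess_reading(rh, prev_rh, low=35.0, high=60.0, change_threshold=5.0):
    """Warning if relative humidity leaves [low, high]; alarm if the
    change since the previous reading exceeds change_threshold (a cooling
    loss typically shows up first as a rapid humidity swing)."""
    alerts = []
    if not (low <= rh <= high):
        alerts.append("warning")
    if prev_rh is not None and abs(rh - prev_rh) > change_threshold:
        alerts.append("alarm")
    return alerts

out_of_band = assess_reading(70.0, 63.0)  # outside limit and fast-changing
steady = assess_reading(50.0, 49.0)       # inside limit, slow drift
```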

  9. Method for early detection of cooling-loss events

    DOEpatents

    Bermudez, Sergio A.; Hamann, Hendrik F.; Marianno, Fernando J.

    2015-12-22

A method of detecting cooling-loss events early is provided. The method includes defining a relative humidity limit and a change threshold for a given space; measuring relative humidity in the given space; determining, with a processing unit, whether the measured relative humidity is within the defined relative humidity limit; generating a warning in the event the measured relative humidity is outside the defined relative humidity limit; and determining whether a change in the measured relative humidity is less than the defined change threshold for the given space, generating an alarm in the event the change is greater than the defined change threshold.

  10. Binarization of Gray-Scaled Digital Images Via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve; Voska, Ned (Technical Monitor)

    2002-01-01

A new fast-computational technique based on a fuzzy entropy measure has been developed to find an optimal binary image threshold. In this method, the image pixel membership functions depend on the threshold value and reflect the distribution of pixel values in two classes; thus, this technique minimizes the classification error. This new method is compared with two of the best-known threshold selection techniques, Otsu and Huang-Wang. The performance of the proposed method surpasses that of the Huang-Wang and Otsu methods when the image consists of textured background and poor printing quality. The three methods perform well but yield different binarization approaches if the background and foreground of the image have well-separated gray-level ranges.
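
As background, a minimal sketch of Huang-Wang-style fuzzy-entropy thresholding, one common formulation in which each pixel's membership depends on its distance to its class mean (the NTRS report's exact membership functions may differ):

```python
import numpy as np

def huang_wang_threshold(img):
    """For each candidate threshold t, assign each pixel a membership
    u = 1/(1 + |g - mu_class|/C) to its class mean, then score the split
    by total fuzzy (Shannon) entropy; return the t minimizing it."""
    g = img.ravel().astype(float)
    C = g.max() - g.min()
    best_t, best_e = None, np.inf
    for t in np.unique(g)[:-1]:
        lo, hi = g[g <= t], g[g > t]
        mu0, mu1 = lo.mean(), hi.mean()
        u = np.where(g <= t,
                     1.0 / (1.0 + np.abs(g - mu0) / C),
                     1.0 / (1.0 + np.abs(g - mu1) / C))
        u = np.clip(u, 1e-12, 1.0 - 1e-12)
        e = -np.sum(u * np.log(u) + (1.0 - u) * np.log(1.0 - u))
        if e < best_e:
            best_t, best_e = t, e
    return best_t

# Bimodal toy "image": dark cluster near 10-12, bright cluster near 200-202
img = np.concatenate([np.full(50, 10.0), np.full(50, 12.0),
                      np.full(50, 200.0), np.full(50, 202.0)])
t = huang_wang_threshold(img)
```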

  11. Binarization of Gray-Scaled Digital Images Via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve; Voska, Ned (Technical Monitor)

    2002-01-01

A new fast-computational technique based on a fuzzy entropy measure has been developed to find an optimal binary image threshold. In this method, the image pixel membership functions depend on the threshold value and reflect the distribution of pixel values in two classes; thus, this technique minimizes the classification error. This new method is compared with two of the best-known threshold selection techniques, Otsu and Huang-Wang. The performance of the proposed method surpasses that of the Huang-Wang and Otsu methods when the image consists of textured background and poor printing quality. The three methods perform well but yield different binarization approaches if the background and foreground of the image have well-separated gray-level ranges.

  12. Comparison of Threshold Detection Methods for the Generalized Pareto Distribution (GPD): Application to the NOAA-NCDC Daily Rainfall Dataset

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas

    2015-04-01

One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the changing point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) on the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u above which a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds.
Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State.
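
As an illustration of the graphical class of methods, the mean residual life plot: for a GPD tail, the mean excess e(u) = E[X - u | X > u] is linear in u, so one looks for the lowest u above which the plot is roughly linear. A numpy sketch on exponential toy data (a GPD with zero shape, for which e(u) is constant):

```python
import numpy as np

def mean_excess(x, thresholds):
    """Mean residual life e(u) = E[X - u | X > u] at each candidate u.
    For a GPD tail, e(u) is linear in u; departure from linearity below
    some u suggests the GPD model only applies above that threshold."""
    return np.array([(x[x > u] - u).mean() for u in thresholds])

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)   # exact GPD tail, shape = 0
e = mean_excess(x, [0.0, 1.0, 2.0, 3.0])       # should be flat near 2.0
```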

  13. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    PubMed

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.
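
The brief method amounts to a single ascending staircase; a trivial sketch (the stimulus values are hypothetical, and real use would step by the 2-4 mA increments discussed above):

```python
def peak1_nfr_threshold(intensities_mA, reflex_present):
    """Single ascending series: the NFR threshold estimate is the first
    stimulus intensity at which a reflex is detected."""
    for mA, reflex in zip(intensities_mA, reflex_present):
        if reflex:
            return mA
    return None  # no reflex evoked within the tested range

t = peak1_nfr_threshold([2, 4, 6, 8, 10], [False, False, False, True, True])
```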

  14. The role of explicit and implicit standards in visual speed discrimination.

    PubMed

    Norman, J Farley; Pattison, Kristina F; Norman, Hideko F; Craft, Amy E; Wiesemann, Elizabeth Y; Taylor, M Jett

    2008-01-01

Five experiments were designed to investigate visual speed discrimination. Variations of the method of constant stimuli were used to obtain speed discrimination thresholds in experiments 1, 2, 4, and 5, while the method of single stimuli was used in experiment 3. The observers' thresholds were significantly influenced by the choice of psychophysical method and by changes in the standard speed. The observers' judgments were unaffected, however, by changes in the magnitude of random variations in stimulus duration, reinforcing the conclusions of Lappin et al (1975 Journal of Experimental Psychology: Human Perception and Performance 1 383-394). When an implicit standard was used, the observers produced relatively low discrimination thresholds (7.0% of the standard speed), verifying the results of McKee (1981 Vision Research 21 491-500). When an explicit standard was used in a 2AFC variant of the method of constant stimuli, however, the observers' discrimination thresholds increased by 74% (to 12.2%), resembling the high thresholds obtained by Mandriota et al (1962 Science 138 437-438). A subsequent signal-detection analysis revealed that the observers' actual sensitivities to differences in speed were in fact equivalent for both psychophysical methods. The formation of an implicit standard in the method of single stimuli allows human observers to make judgments of speed that are as precise as those obtained when explicit standards are available.

  15. Using ROC Curves to Choose Minimally Important Change Thresholds when Sensitivity and Specificity Are Valued Equally: The Forgotten Lesson of Pythagoras. Theoretical Considerations and an Example Application of Change in Health Status

    PubMed Central

    Froud, Robert; Abel, Gary

    2014-01-01

    Background Receiver Operator Characteristic (ROC) curves are being used to identify Minimally Important Change (MIC) thresholds on scales that measure a change in health status. In quasi-continuous patient reported outcome measures, such as those that measure changes in chronic diseases with variable clinical trajectories, sensitivity and specificity are often valued equally. Notwithstanding methodologists agreeing that these should be valued equally, different approaches have been taken to estimating MIC thresholds using ROC curves. Aims and objectives We aimed to compare the different approaches used with a new approach, exploring the extent to which the methods choose different thresholds, and considering the effect of differences on conclusions in responder analyses. Methods Using graphical methods, hypothetical data, and data from a large randomised controlled trial of manual therapy for low back pain, we compared two existing approaches with a new approach that is based on the addition of the sums of squares of 1-sensitivity and 1-specificity. Results There can be divergence in the thresholds chosen by different estimators. The cut-point selected by different estimators is dependent on the relationship between the cut-points in ROC space and the different contours described by the estimators. In particular, asymmetry and the number of possible cut-points affects threshold selection. Conclusion Choice of MIC estimator is important. Different methods for choosing cut-points can lead to materially different MIC thresholds and thus affect results of responder analyses and trial conclusions. An estimator based on the smallest sum of squares of 1-sensitivity and 1-specificity is preferable when sensitivity and specificity are valued equally. Unlike other methods currently in use, the cut-point chosen by the sum of squares method always and efficiently chooses the cut-point closest to the top-left corner of ROC space, regardless of the shape of the ROC curve. 
PMID:25474472
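
The proposed estimator picks the cut-point closest to the top-left corner of ROC space; by Pythagoras, minimizing (1-sensitivity)^2 + (1-specificity)^2 minimizes the Euclidean distance to the point (0, 1). A sketch on hypothetical change scores:

```python
import numpy as np

def sum_of_squares_cutpoint(score, improved):
    """MIC cut-point minimizing (1-sensitivity)^2 + (1-specificity)^2,
    i.e., the squared Euclidean distance from the ROC point to the
    top-left corner, valuing sensitivity and specificity equally."""
    best_c, best_d = None, np.inf
    for c in np.unique(score):
        pred = score >= c
        sens = np.mean(pred[improved])
        spec = np.mean(~pred[~improved])
        d = (1.0 - sens) ** 2 + (1.0 - spec) ** 2
        if d < best_d:
            best_c, best_d = c, d
    return best_c

# Hypothetical change scores; patients scoring >= 5 reported improvement
score = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
improved = score >= 5.0
mic = sum_of_squares_cutpoint(score, improved)
```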

  16. The effect of the stability threshold on time to stabilization and its reliability following a single leg drop jump landing.

    PubMed

    Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H

    2016-02-08

We aimed to provide insight in how threshold selection affects time to stabilization (TTS) and its reliability to support selection of methods to determine TTS. Eighty-two elite youth soccer players performed six single leg drop jump landings. The TTS was calculated based on four processed signals: raw ground reaction force (GRF) signal (RAW), moving root mean square window (RMS), sequential average (SA) or unbounded third order polynomial fit (TOP). For each trial and processing method a wide range of thresholds was applied. Per threshold, reliability of the TTS was assessed through intra-class correlation coefficients (ICC) for the vertical (V), anteroposterior (AP) and mediolateral (ML) direction of force. Low thresholds resulted in a sharp increase of TTS values and in the percentage of trials in which TTS exceeded trial duration. The TTS and ICC were essentially similar for RAW and RMS in all directions; ICCs were mostly 'insufficient' (<0.4) to 'fair' (0.4-0.6) for the entire range of thresholds. The SA signals resulted in the most stable ICC values across thresholds, being 'substantial' (>0.8) for V, and 'moderate' (0.6-0.8) for AP and ML. The ICCs for TOP were 'substantial' for V, 'moderate' for AP, and 'fair' for ML. The present findings did not reveal an optimal threshold to assess TTS in elite youth soccer players following a single leg drop jump landing. Irrespective of threshold selection, the SA and TOP methods yielded sufficiently reliable TTS values, while for RAW and RMS the reliability was insufficient to differentiate between players. Copyright © 2016 Elsevier Ltd. All rights reserved.
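
Of the processing options compared, the sequential average is the simplest to sketch. Here TTS is taken as the time after which the cumulative mean of the signal stays within a band around its final value; this is an illustrative stability criterion on a toy trace, not the paper's exact definition:

```python
import numpy as np

def time_to_stabilization(signal, fs, band):
    """TTS via sequential average: cumulative mean of the GRF signal,
    considered stabilized once it remains within +/- band of its final
    value; returns the time (s) of the last excursion outside the band."""
    sa = np.cumsum(signal) / np.arange(1, signal.size + 1)
    outside = np.flatnonzero(np.abs(sa - sa[-1]) > band)
    return 0.0 if outside.size == 0 else (outside[-1] + 1) / fs

grf = np.concatenate([np.full(10, 5.0), np.zeros(90)])  # toy landing transient
tts = time_to_stabilization(grf, fs=100.0, band=0.5)
```

As the abstract notes, shrinking the band (threshold) inflates TTS sharply, since the sequential average approaches its final value only asymptotically.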

  17. Continuous Seismic Threshold Monitoring

    DTIC Science & Technology

    1992-05-31

    Continuous threshold monitoring is a technique for using a seismic network to monitor a geographical area continuously in time. The method provides...area. Two approaches are presented. Site-specific monitoring: By focusing a seismic network on a specific target site, continuous threshold monitoring...recorded events at the site. We define the threshold trace for the network as the continuous time trace of computed upper magnitude limits of seismic

  18. Automatic threshold optimization in nonlinear energy operator based spike detection.

    PubMed

    Malik, Muhammad H; Saeed, Maryam; Kamboh, Awais M

    2016-08-01

In neural spike sorting systems, the performance of the spike detector has to be maximized because it affects the performance of all subsequent blocks. The non-linear energy operator (NEO) is a popular spike detector due to its detection accuracy and its hardware-friendly architecture. However, it involves a thresholding stage, whose value is usually approximated and is thus not optimal. This approximation deteriorates performance in real-time systems where signal-to-noise ratio (SNR) estimation is a challenge, especially at lower SNRs. In this paper, we propose an automatic and robust threshold calculation method using an empirical gradient technique. The method is tested on two different datasets. The results show that our optimized threshold improves the detection accuracy for both high-SNR and low-SNR signals. Boxplots are presented that provide a statistical analysis of the improvements in accuracy; for instance, the 75th percentile was at 98.7% and 93.5% for the optimized NEO threshold and the traditional NEO threshold, respectively.
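
A minimal NEO detector with the conventional scaled-mean threshold; the fixed scaling constant k is exactly the approximation the paper replaces with an empirically optimized value:

```python
import numpy as np

def neo(x):
    """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1] * x[n+1].
    Emphasizes samples that are both large and rapidly changing."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_spikes(x, k=8.0):
    """Classic NEO detection: threshold psi at k times its mean; k is
    conventionally a fixed constant (k = 8 is a common textbook choice)."""
    psi = neo(x)
    return np.flatnonzero(psi > k * psi.mean())

x = np.zeros(100)
x[50] = 1.0  # one idealized spike on a silent baseline
idx = detect_spikes(x)
```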

  19. 78 FR 71566 - Takes of Marine Mammals Incidental to Specified Activities; Taking Marine Mammals Incidental to a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... species or stock(s) for subsistence uses (where relevant). Further, the permissible methods of taking and... Thresholds During Pile Installation Distance Area (sq. Pile type Method Threshold (m)\\1\\ km)\\2\\ Steel (sheet... methods of taking pursuant to such activity, and other means of effecting the least practicable impact on...

  20. Nonlinear threshold effect in the Z-scan method of characterizing limiters for high-intensity laser light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tereshchenko, S. A., E-mail: tsa@miee.ru; Savelyev, M. S.; Podgaetsky, V. M.

A threshold model is described which permits one to determine the properties of limiters for high-powered laser light. It takes into account the threshold characteristics of the nonlinear optical interaction between the laser beam and the limiter working material. The traditional non-threshold model is a particular case of the threshold model when the limiting threshold is zero. The nonlinear characteristics of carbon nanotubes in liquid and solid media are obtained from experimental Z-scan data. Specifically, the nonlinear threshold effect was observed for aqueous dispersions of nanotubes, but not for nanotubes in solid polymethylmethacrylate. The threshold model fits the experimental Z-scan data better than the non-threshold model. Output characteristics were obtained that integrally describe the nonlinear properties of the optical limiters.

  1. Method and system for controlling a rotational speed of a rotor of a turbogenerator

    DOEpatents

    Stahlhut, Ronnie Dean; Vuk, Carl Thomas

    2008-12-30

    A system and method controls a rotational speed of a rotor or shaft of a turbogenerator in accordance with a present voltage level on a direct current bus. A lower threshold and a higher threshold are established for a speed of a rotor or shaft of a turbogenerator. A speed sensor determines speed data or a speed signal for the rotor or shaft associated with a turbogenerator. A voltage regulator adjusts a voltage level associated with a direct current bus within a target voltage range if the speed data or speed signal indicates that the speed is above the higher threshold or below the lower threshold.
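
The control condition in the claim reduces to a simple band check; the threshold values below are hypothetical:

```python
def regulator_should_act(rotor_speed_rpm, lower=28000.0, upper=32000.0):
    """Per the abstract: the voltage regulator adjusts the DC-bus voltage
    (within its target range) only when the rotor speed is above the
    higher threshold or below the lower threshold."""
    return rotor_speed_rpm > upper or rotor_speed_rpm < lower

overspeed = regulator_should_act(33000.0)
in_band = regulator_should_act(30000.0)
```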

  2. Nonlinear Dynamic Modeling of Neuron Action Potential Threshold During Synaptically Driven Broadband Intracellular Activity

    PubMed Central

    Roach, Shane M.; Song, Dong; Berger, Theodore W.

    2012-01-01

    Activity-dependent variation of neuronal thresholds for action potential (AP) generation is one of the key determinants of spike-train temporal-pattern transformations from presynaptic to postsynaptic spike trains. In this study, we model the nonlinear dynamics of the threshold variation during synaptically driven broadband intracellular activity. First, membrane potentials of single CA1 pyramidal cells were recorded under physiologically plausible broadband stimulation conditions. Second, a method was developed to measure AP thresholds from the continuous recordings of membrane potentials. It involves measuring the turning points of APs by analyzing the third-order derivatives of the membrane potentials. Four stimulation paradigms with different temporal patterns were applied to validate this method by comparing the measured AP turning points and the actual AP thresholds estimated with varying stimulation intensities. Results show that the AP turning points provide consistent measurement of the AP thresholds, except for a constant offset. It indicates that 1) the variation of AP turning points represents the nonlinearities of threshold dynamics; and 2) an optimization of the constant offset is required to achieve accurate spike prediction. Third, a nonlinear dynamical third-order Volterra model was built to describe the relations between the threshold dynamics and the AP activities. Results show that the model can predict threshold accurately based on the preceding APs. Finally, the dynamic threshold model was integrated into a previously developed single neuron model and resulted in a 33% improvement in spike prediction. PMID:22156947
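
The turning-point measurement can be sketched as locating where the third derivative of the membrane potential rises sharply before a spike. Below, a threshold on d3V/dt3 marks the kink; this is a simplified stand-in for the paper's peak-finding, applied to a toy trace whose third derivative jumps at t = 1 s:

```python
import numpy as np

def turning_point_index(v, dt, frac=0.5):
    """Index where the third derivative of the membrane potential first
    exceeds frac times its maximum: the kink marking AP onset."""
    d3 = np.gradient(np.gradient(np.gradient(v, dt), dt), dt)
    return int(np.argmax(d3 > frac * d3.max()))

dt = 0.01
t = np.arange(0.0, 2.0, dt)
v = np.where(t < 1.0, 0.0, (t - 1.0) ** 3)  # smooth until d3 jumps at t = 1
idx = turning_point_index(v, dt)
```

As the abstract notes, such a turning point tracks the true threshold up to a constant offset, which must be optimized separately for accurate spike prediction.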

  3. Photoacoustic signals denoising of the glucose aqueous solutions using an improved wavelet threshold method

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Xiong, Zhihua

    2016-10-01

The denoising of the photoacoustic signals of glucose is one of the most important steps in the quality identification of fruit, because the real-time photoacoustic signals of glucose are easily contaminated by various noises. To remove the noises and useless information, an improved wavelet threshold function is proposed. Compared with the traditional wavelet hard and soft threshold functions, the improved wavelet threshold function can overcome the pseudo-oscillation effect in the denoised photoacoustic signals, owing to its continuity, and the error between the denoised signals and the original signals is decreased. To validate the feasibility of denoising with the improved wavelet threshold function, denoising simulation experiments were performed in MATLAB. In the simulation experiments, a standard test signal was used, and three other denoising methods were compared with the improved wavelet threshold function. The signal-to-noise ratio (SNR) and root-mean-square error (RMSE) were used to evaluate the denoising performance. The experimental results demonstrate that the SNR of the improved wavelet threshold function is the largest and its RMSE the smallest, verifying that denoising with the improved wavelet threshold function is feasible. Finally, the improved wavelet threshold function was used to remove the noises from the photoacoustic signals of the glucose solutions, also with very good results. Therefore, the improved wavelet threshold function denoising proposed in this paper has potential value in the field of photoacoustic signal denoising.
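
The paper's improved function is described only qualitatively here; one standard continuous compromise with exactly the properties it claims (continuity at ±T, vanishing bias for large coefficients) is the non-negative garrote threshold, shown as a stand-in:

```python
import numpy as np

def garrote_threshold(w, T):
    """Non-negative garrote shrinkage: zero inside [-T, T] and
    w - T^2/w outside. It is continuous at |w| = T (both sides give 0),
    and its bias T^2/|w| vanishes for large coefficients, unlike the
    fixed bias T of soft thresholding."""
    w = np.asarray(w, dtype=float)
    safe = np.where(w == 0.0, 1.0, w)  # avoid 0/0 in the unused branch
    return np.where(np.abs(w) > T, w - T ** 2 / safe, 0.0)

shrunk = garrote_threshold([-2.0, 0.5, 1.0, 2.0], T=1.0)
```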

  4. Method to acquire regions of fruit, branch and leaf from image of red apple in orchard

    NASA Astrophysics Data System (ADS)

    Lv, Jidong; Xu, Liming

    2017-07-01

This work proposed a method to acquire the regions of fruit, branch and leaf from images of red apples in an orchard. To acquire the fruit image, the R-G image was extracted from the RGB image for erosion, hole filling, subregion removal, dilation and an opening operation, in that order. Finally, the fruit image was acquired by threshold segmentation. To acquire the leaf image, the fruit image was subtracted from the RGB image before extracting the 2G-R-B image. Then, the leaf image was acquired by subregion removal and threshold segmentation. To acquire the branch image, dynamic threshold segmentation was conducted on the R-G image. Then, the segmented image was added to the fruit image to acquire an adding-fruit image, which was subtracted from the RGB image along with the leaf image. Finally, the branch image was acquired by an opening operation, subregion removal and threshold segmentation after extracting the R-G image from the subtracted image. Compared with previous methods, more complete images of fruit, leaf and branch can be acquired from red apple images with this method.
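
The first step, thresholding the R-G difference image, can be sketched as follows; the threshold value is hypothetical and the morphological cleanup steps are omitted:

```python
import numpy as np

def red_fruit_mask(rgb, threshold=40):
    """R-G difference image: red apples give strongly positive R-G
    values, while green leaves and brown branches do not; thresholding
    it yields a rough fruit mask (before the morphological cleanup)."""
    rg = rgb[..., 0].astype(np.int16) - rgb[..., 1].astype(np.int16)
    return rg > threshold

pixels = np.array([[[210, 40, 40],    # apple-like red pixel
                    [60, 150, 60]]],  # leaf-like green pixel
                  dtype=np.uint8)
mask = red_fruit_mask(pixels)
```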

  5. Ethanol catalytic optical driven deposition for 1D and 2D materials with ultra-low power threshold of 0 dBm

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Chen, Bohua; Xiao, Xu; Guo, Chaoshi; Zhang, Xiaoyan; Wang, Jun; Jiang, Meng; Wu, Kan; Chen, Jianping

    2018-01-01

We have demonstrated a generalized optical driven deposition method, the ethanol catalytic deposition (ECD) method, which is widely applicable to the deposition of a broad range of one-dimensional (1D) and two-dimensional (2D) materials with common deposition parameters. Using the ECD method, deposition of the 1D material carbon nanotubes and the 2D materials MoS2, MoSe2, WS2 and WSe2 on tapered fiber has been demonstrated with a threshold power as low as 0 dBm. To our knowledge, this is the lowest threshold power ever reported for optical driven deposition; conventional optical driven deposition has a threshold typically near 15 dBm. The ECD method thus significantly reduces the power requirement and simplifies the setup of optical driven deposition, and its wide applicability to different materials benefits research on the optical nonlinearity and ultrafast photonics of 1D and 2D materials.

  6. Prediction of spatially explicit rainfall intensity–duration thresholds for post-fire debris-flow generation in the western United States

    USGS Publications Warehouse

    Staley, Dennis M.; Negri, Jacquelyn; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2017-01-01

    Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity–duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity–duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity–duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions that are of similar accuracy, and in some cases outperform, previously published regional intensity–duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity–duration thresholds do not currently exist.

  7. Prediction of spatially explicit rainfall intensity-duration thresholds for post-fire debris-flow generation in the western United States

    NASA Astrophysics Data System (ADS)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2017-02-01

    Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity-duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity-duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity-duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions that are of similar accuracy, and in some cases outperform, previously published regional intensity-duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity-duration thresholds do not currently exist.

  8. THRESHOLD ELEMENTS AND THE DESIGN OF SEQUENTIAL SWITCHING NETWORKS.

    DTIC Science & Technology

    The report covers research performed from March 1966 to March 1967. The major topics treated are: (1) methods for finding weight-threshold vectors...that realize a given switching function in multi-threshold linear logic; (2) synthesis of sequential machines by means of shift registers and simple

  9. Permanent laser conditioning of thin film optical materials

    DOEpatents

    Wolfe, C. Robert; Kozlowski, Mark R.; Campbell, John H.; Staggs, Michael; Rainer, Frank

    1995-01-01

    The invention comprises a method for producing optical thin films with a high laser damage threshold and the resulting thin films. The laser damage threshold of the thin films is permanently increased by irradiating the thin films with a fluence below an unconditioned laser damage threshold.

  10. The Influence of Head Impact Threshold for Reporting Data in Contact and Collision Sports: Systematic Review and Original Data Analysis.

    PubMed

    King, D; Hume, P; Gissane, C; Brughelli, M; Clark, T

    2016-02-01

    Head impacts and resulting head accelerations cause concussive injuries. There is no standard for reporting head impact data in sports to enable comparison between studies. The aim was to outline methods for reporting head impact acceleration data in sport and the effect of the acceleration thresholds on the number of impacts reported. A systematic review of accelerometer systems utilised to report head impact data in sport was conducted. The effect of using different thresholds on a set of impact data from 38 amateur senior rugby players in New Zealand over a competition season was calculated. Of the 52 studies identified, 42% reported impacts using a >10-g threshold, where g is the acceleration of gravity. Studies reported descriptive statistics as mean ± standard deviation, median, 25th to 75th interquartile range, and 95th percentile. Application of the varied impact thresholds to the New Zealand data set resulted in 20,687 impacts of >10 g, 11,459 (45% less) impacts of >15 g, and 4024 (81% less) impacts of >30 g. Linear and angular raw data were most frequently reported. Metrics combining raw data may be more useful; however, validity of the metrics has not been adequately addressed for sport. Differing data collection methods and descriptive statistics for reporting head impacts in sports limit inter-study comparisons. Consensus on data analysis methods for sports impact assessment is needed, including thresholds. Based on the available data, the 10-g threshold is the most commonly reported impact threshold and should be reported as the median with 25th and 75th interquartile ranges as the data are non-normally distributed. Validation studies are required to determine the best threshold and metrics for impact acceleration data collection in sport. Until in-field validation studies are completed, it is recommended that head impact data should be reported as median and interquartile ranges using the 10-g impact threshold.
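The effect of moving the reporting threshold can be reproduced in a few lines. The sketch below filters a set of peak accelerations at a chosen g threshold and reports the median with the 25th/75th percentiles, as the review recommends for non-normally distributed impact data (the sample values are illustrative, not the New Zealand data set):

```python
import statistics

def summarize_impacts(peak_g, threshold_g=10.0):
    """Filter head-impact peak accelerations by a g threshold and report
    median with 25th/75th percentiles, as recommended for skewed data."""
    kept = sorted(g for g in peak_g if g > threshold_g)
    q1, q2, q3 = statistics.quantiles(kept, n=4)  # quartiles
    return {"count": len(kept), "median": q2, "iqr": (q1, q3)}

# Illustrative (not study) data: counts shrink as the threshold rises.
impacts = [8, 11, 12, 14, 16, 18, 22, 31, 45, 60]
print(summarize_impacts(impacts, 10))
print(summarize_impacts(impacts, 30))
```

Raising the threshold from 10 g to 30 g discards most recorded impacts, which is exactly why inter-study comparisons fail when thresholds differ.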

  11. Simplified pupal surveys of Aedes aegypti (L.) for entomologic surveillance and dengue control.

    PubMed

    Barrera, Roberto

    2009-07-01

    Pupal surveys of Aedes aegypti (L.) are useful indicators of risk for dengue transmission, although sample sizes for reliable estimations can be large. This study explores two methods for making pupal surveys more practical yet reliable and used data from 10 pupal surveys conducted in Puerto Rico during 2004-2008. The number of pupae per person for each sampling followed a negative binomial distribution, thus showing aggregation. One method found a common aggregation parameter (k) for the negative binomial distribution, a finding that enabled the application of a sequential sampling method requiring few samples to determine whether the number of pupae/person was above a vector density threshold for dengue transmission. A second approach used the finding that the mean number of pupae/person is correlated with the proportion of pupa-infested households and calculated equivalent threshold proportions of pupa-positive households. A sequential sampling program was also developed for this method to determine whether observed proportions of infested households were above threshold levels. These methods can be used to validate entomological thresholds for dengue transmission.

  12. Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images

    PubMed Central

    Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu

    2013-01-01

    With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and has a reliable performance in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior over existing thresholding methods. PMID:23525856

  13. Observed physical processes in mechanical tests of PBX9501 and recommendations for experiments to explore a possible plasticity/damage threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buechler, Miles A.

    2012-05-02

    This memo discusses observations made during a series of monotonic and cyclic uniaxial experiments performed on PBX9501 by Darla Thompson with Enhanced Surveillance Campaign support. The observations discussed in the section "Cyclic compression observations" strongly suggest the presence of viscoelastic, plastic, and damage phenomena in the mechanical response of the material. In the section "Uniaxial data analysis and observations," methods are discussed for separating out the viscoelastic effects. A crude application of those methods suggests the possibility of a critical stress below which plasticity and damage may be negligible. This threshold should be explored because, if it exists, it will be an important feature of any constitutive model. Additionally, if the threshold exists, then modifications of experimental methods may be feasible which could simplify future experiments or provide higher quality data from them. A set of experiments to explore the threshold stress is proposed in the section "Exploratory tests program for identifying threshold stress."

  14. A comparison of underwater hearing sensitivity in bottlenose dolphins (Tursiops truncatus) determined by electrophysiological and behavioral methods.

    PubMed

    Houser, Dorian S; Finneran, James J

    2006-09-01

    Variable stimulus presentation methods are used in auditory evoked potential (AEP) estimates of cetacean hearing sensitivity, each of which might affect stimulus reception and hearing threshold estimates. This study quantifies differences in underwater hearing thresholds obtained by AEP and behavioral means. For AEP estimates, a transducer embedded in a suction cup (jawphone) was coupled to the dolphin's lower jaw for stimulus presentation. Underwater AEP thresholds were obtained for three dolphins in San Diego Bay and for one dolphin in a quiet pool. Thresholds were estimated from the envelope following response at carrier frequencies ranging from 10 to 150 kHz. One animal, with an atypical audiogram, demonstrated significantly greater hearing loss in the right ear than in the left. Across test conditions, the range and average difference between AEP and behavioral threshold estimates were consistent with published comparisons between underwater behavioral and in-air AEP thresholds. AEP thresholds for one animal obtained in-air and in a quiet pool demonstrated a range of differences of -10 to 9 dB (mean = 3 dB). Results suggest that for the frequencies tested, the presentation of sound stimuli through a jawphone, underwater and in-air, results in acceptable differences to AEP threshold estimates.

  15. Multi-threshold de-noising of electrical imaging logging data based on the wavelet packet transform

    NASA Astrophysics Data System (ADS)

    Xie, Fang; Xiao, Chengwen; Liu, Ruilin; Zhang, Lili

    2017-08-01

    A key problem of effectiveness evaluation for fractured-vuggy carbonatite reservoir is how to accurately extract fracture and vug information from electrical imaging logging data. Drill bits quaked during drilling and resulted in rugged surfaces of borehole walls and thus conductivity fluctuations in electrical imaging logging data. The occurrence of the conductivity fluctuations (formation background noise) directly affects the fracture/vug information extraction and reservoir effectiveness evaluation. We present a multi-threshold de-noising method based on wavelet packet transform to eliminate the influence of rugged borehole walls. The noise is present as fluctuations in button-electrode conductivity curves and as pockmarked responses in electrical imaging logging static images. The noise has responses in various scales and frequency ranges and has low conductivity compared with fractures or vugs. Our de-noising method is to decompose the data into coefficients with wavelet packet transform on a quadratic spline basis, then shrink high-frequency wavelet packet coefficients in different resolutions with minimax threshold and hard-threshold function, and finally reconstruct the thresholded coefficients. We use electrical imaging logging data collected from fractured-vuggy Ordovician carbonatite reservoir in Tarim Basin to verify the validity of the multi-threshold de-noising method. Segmentation results and extracted parameters are shown as well to prove the effectiveness of the de-noising procedure.
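The pipeline described is: decompose, shrink the high-frequency coefficients with a hard-threshold function, and reconstruct. A greatly simplified sketch of that pattern, using a single-level Haar transform in place of the authors' wavelet packet decomposition on a quadratic spline basis, and a fixed threshold in place of the minimax rule:

```python
import math

def haar_decompose(x):
    """One-level Haar transform: approximation and detail coefficients."""
    a = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    return a, d

def haar_reconstruct(a, d):
    """Inverse one-level Haar transform."""
    x = []
    for ai, di in zip(a, d):
        x.append((ai + di) / math.sqrt(2))
        x.append((ai - di) / math.sqrt(2))
    return x

def hard_threshold(coeffs, thr):
    """Hard-threshold function: zero coefficients at or below thr in magnitude."""
    return [c if abs(c) > thr else 0.0 for c in coeffs]

def denoise(x, thr):
    """Suppress small high-frequency fluctuations; keep large features."""
    a, d = haar_decompose(x)
    return haar_reconstruct(a, hard_threshold(d, thr))

# Small conductivity fluctuations (details) are suppressed; the step survives.
signal = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]
print(denoise(signal, thr=0.3))
```

The real method applies this shrink-and-reconstruct step across multiple resolutions of the wavelet packet tree, with a separate threshold per level.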

  16. I. RENAL THRESHOLDS FOR HEMOGLOBIN IN DOGS

    PubMed Central

    Lichty, John A.; Havill, William H.; Whipple, George H.

    1932-01-01

    We use the term "renal threshold for hemoglobin" to indicate the smallest amount of hemoglobin which given intravenously will effect the appearance of recognizable hemoglobin in the urine. The initial renal threshold level for dog hemoglobin is established by the methods employed at an average value of 155 mg. hemoglobin per kilo body weight with maximal values of 210 and minimal of 124. Repeated daily injections of hemoglobin will depress this initial renal threshold level on the average 46 per cent with maximal values of 110 and minimal values of 60 mg. hemoglobin per kilo body weight. This minimal or depression threshold is relatively constant if the injections are continued. Rest periods without injections cause a return of the renal threshold for hemoglobin toward the initial threshold levels—recovery threshold level. Injections of hemoglobin below the initial threshold level but above the minimal or depression threshold will eventually reduce the renal threshold for hemoglobin to its depression threshold level. We believe the depression threshold or minimal renal threshold level due to repeated hemoglobin injections is a little above the glomerular threshold which we assume is the base line threshold for hemoglobin. Our reasons for this belief in the glomerular threshold are given above and in the other papers of this series. PMID:19870016

  17. A simple plug-in bagging ensemble based on threshold-moving for classifying binary and multiclass imbalanced data.

    PubMed

    Collell, Guillem; Prelec, Drazen; Patil, Kaustubh R

    2018-01-31

    Class imbalance presents a major hurdle in the application of classification methods. A commonly taken approach is to learn ensembles of classifiers using rebalanced data. Examples include bootstrap averaging (bagging) combined with either undersampling or oversampling of the minority class examples. However, rebalancing methods entail asymmetric changes to the examples of different classes, which in turn can introduce their own biases. Furthermore, these methods often require specifying the performance measure of interest a priori, i.e., before learning. An alternative is to employ the threshold-moving technique, which applies a threshold to the continuous output of a model, offering the possibility to adapt to a performance measure a posteriori, i.e., a plug-in method. Surprisingly, little attention has been paid to this combination of a bagging ensemble and threshold-moving. In this paper, we study this combination and demonstrate its competitiveness. Contrary to the other resampling methods, we preserve the natural class distribution of the data, resulting in well-calibrated posterior probabilities. Additionally, we extend the proposed method to handle multiclass data. We validated our method on binary and multiclass benchmark data sets using both decision trees and neural networks as base classifiers. We perform analyses that provide insights into the proposed method.
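The plug-in step itself is simple: given the (well-calibrated) probabilities averaged over the bagged ensemble, sweep candidate cutoffs a posteriori and keep the one that maximizes the chosen measure. A self-contained sketch using F1 as the measure (the probabilities and labels are invented for illustration, not taken from the paper's benchmarks):

```python
def f1_at_threshold(probs, labels, thr):
    """F1 score obtained by classifying prob >= thr as positive."""
    preds = [1 if p >= thr else 0 for p in probs]
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    if tp == 0:
        return 0.0
    prec, rec = tp / (tp + fp), tp / (tp + fn)
    return 2 * prec * rec / (prec + rec)

def move_threshold(probs, labels):
    """Plug-in threshold-moving: pick the cutoff on the ensemble's
    averaged probabilities that maximizes F1 a posteriori."""
    return max(set(probs), key=lambda t: f1_at_threshold(probs, labels, t))

# Illustrative imbalanced data: the default 0.5 cutoff misses positives;
# a lower, data-driven cutoff recovers them.
probs  = [0.9, 0.45, 0.4, 0.35, 0.3, 0.2, 0.15, 0.1, 0.05, 0.02]
labels = [1,   1,    1,   0,    0,   0,   0,    0,   0,    0]
best = move_threshold(probs, labels)
print(best, f1_at_threshold(probs, labels, best))
```

Because the cutoff is chosen after training, the same fitted ensemble can be re-tuned for a different performance measure without refitting.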

  18. Comparison between ABR with click and narrow band chirp stimuli in children.

    PubMed

    Zirn, Stefan; Louza, Julia; Reiman, Viktor; Wittlinger, Natalie; Hempel, John-Martin; Schuster, Maria

    2014-08-01

    Click and chirp-evoked auditory brainstem responses (ABR) are applied for the estimation of hearing thresholds in children. The present study analyzes ABR thresholds across a large sample of children's ears obtained with both methods. The aim was to demonstrate the correlation between both methods using narrow band chirp and click stimuli. Click and chirp evoked ABRs were measured in 253 children aged from 0 to 18 years to determine their individual auditory threshold. The delay-compensated stimuli were narrow band CE chirps with either 2000 Hz or 4000 Hz center frequencies. Measurements were performed consecutively during natural sleep, and under sedation or general anesthesia. Threshold estimation was performed for each measurement by two experienced audiologists. Pearson-correlation analysis revealed highly significant correlations (r=0.94) between click and chirp derived thresholds for both 2 kHz and 4 kHz chirps. No considerable differences were observed either between different age ranges or gender. Comparing the thresholds estimated using ABR with click stimuli and chirp stimuli, only 0.8-2% for the 2000 Hz NB-chirp and 0.4-1.2% of the 4000 Hz NB-chirp measurements differed more than 15 dB for different degrees of hearing loss or normal hearing. The results suggest that either NB-chirp or click ABR is sufficient for threshold estimation. This holds for the chirp frequencies of 2000 Hz and 4000 Hz. The use of either click- or chirp-evoked ABR allows a reduction of recording time in young infants. Nevertheless, to cross-check the results of one of the methods, we recommend measurements with the other method as well. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Critical thresholds and recovery of Chihuahuan Desert grasslands: Insights from long-term data

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods: Desertification and other harmful state transitions in drylands are expected to accelerate with global change. Ecologists are called upon to devise methods to anticipate critical thresholds and promote recovery of desired states. As in other drylands, transitions in sem...

  20. Double Threshold Energy Detection Based Cooperative Spectrum Sensing for Cognitive Radio Networks with QoS Guarantee

    NASA Astrophysics Data System (ADS)

    Hu, Hang; Yu, Hong; Zhang, Yongzhi

    2013-03-01

    Cooperative spectrum sensing, which can greatly improve the ability of discovering the spectrum opportunities, is regarded as an enabling mechanism for cognitive radio (CR) networks. In this paper, we employ a double threshold detection method in energy detector to perform spectrum sensing, only the CR users with reliable sensing information are allowed to transmit one bit local decision to the fusion center. Simulation results will show that our proposed double threshold detection method could not only improve the sensing performance but also save the bandwidth of the reporting channel compared with the conventional detection method with one threshold. By weighting the sensing performance and the consumption of system resources in a utility function that is maximized with respect to the number of CR users, it has been shown that the optimal number of CR users is related to the price of these Quality-of-Service (QoS) requirements.
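A minimal sketch of the double-threshold rule: each CR user compares its energy statistic against two thresholds, reports a one-bit decision only when the statistic falls outside the uncertain region between them, and the fusion center combines the reports (the thresholds and energy values below are illustrative, not from the paper):

```python
def local_decision(energy, lam0, lam1):
    """Double-threshold rule: report 1 above lam1, 0 below lam0,
    abstain (None) in the uncertain region between the thresholds."""
    if energy >= lam1:
        return 1
    if energy <= lam0:
        return 0
    return None  # unreliable: do not spend reporting-channel bandwidth

def fusion(decisions, k=1):
    """k-out-of-N fusion over the CR users that actually reported."""
    reported = [d for d in decisions if d is not None]
    return int(sum(reported) >= k)

energies = [0.4, 1.8, 1.1, 2.3, 0.2]  # illustrative per-user test statistics
decisions = [local_decision(e, lam0=0.8, lam1=1.5) for e in energies]
print(decisions)            # the user at 1.1 abstains
print(fusion(decisions))    # OR-rule (k=1) global decision
```

The bandwidth saving comes from the abstentions: only users with confident statistics transmit their one-bit decision to the fusion center.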

  1. Threshold secret sharing scheme based on phase-shifting interferometry.

    PubMed

    Deng, Xiaopeng; Shi, Zhengang; Wen, Wei

    2016-11-01

    We propose a new method for secret image sharing with the (3,N) threshold scheme based on phase-shifting interferometry. The secret image, which is multiplied with an encryption key in advance, is first encrypted by using Fourier transformation. Then, the encoded image is shared into N shadow images based on the recording principle of phase-shifting interferometry. Based on the reconstruction principle of phase-shifting interferometry, any three or more shadow images can retrieve the secret image, while any two or fewer shadow images cannot obtain any information of the secret image. Thus, a (3,N) threshold secret sharing scheme can be implemented. Compared with our previously reported method, the algorithm of this paper is suited for not only a binary image but also a gray-scale image. Moreover, the proposed algorithm can obtain a larger threshold value t. Simulation results are presented to demonstrate the feasibility of the proposed method.
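The (3,N) threshold property itself, independent of the interferometric construction, can be made concrete with a generic Shamir polynomial scheme over a prime field. This is a different mechanism from the paper's optical method, shown only to illustrate that any three shares suffice while fewer reveal nothing:

```python
import random

PRIME = 2_147_483_647  # Mersenne prime 2**31 - 1

def make_shares(secret, n, k=3, prime=PRIME):
    """Split `secret` into n shares of a random degree-(k-1) polynomial;
    any k shares reconstruct it (Shamir's scheme)."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % prime
                den = den * (xi - xj) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = make_shares(12345, n=5)   # a (3,5) scheme
print(reconstruct(shares[:3]))     # any 3 shares suffice
print(reconstruct(shares[2:5]))
```

With only two shares the degree-2 polynomial is underdetermined, so every candidate secret remains equally consistent, which is the information-theoretic core of any (3,N) scheme, optical or algebraic.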

  2. Pressure Flammability Thresholds in Oxygen of Selected Aerospace Materials

    NASA Technical Reports Server (NTRS)

    Hirsch, David; Williams, Jim; Harper, Susana; Beeson, Harold; Ruff, Gary; Pedley, Mike

    2010-01-01

    The experimental approach consisted of concentrating the testing in the flammability transition zone following the Bruceton Up-and-Down Method. For attribute data, the method has been shown to be very repeatable and most efficient. Other methods for characterization of critical levels (Karber and Probit) were also considered. The data yielded the upward limiting pressure index (ULPI), the pressure level where approximately 50% of materials self-extinguish in a given environment. Parametric flammability thresholds other than oxygen concentration can be determined with the methodology proposed for evaluating the MOC when extinguishment occurs. In this case, a pressure threshold in 99.8% oxygen was determined with the methodology and found to be 0.4 to 0.9 psia for typical spacecraft materials. Correlation of flammability thresholds obtained with chemical, hot wire, and other ignition sources will be conducted to provide recommendations for using alternate ignition sources to evaluate flammability of aerospace materials.
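The Bruceton Up-and-Down procedure steps the stimulus level down after each positive outcome (burn) and up after each negative one (self-extinguish), so trials concentrate near the ~50% level. A toy sketch with a deterministic stand-in for the test article (the 0.65 psia threshold, start level, and step size are invented for illustration):

```python
def bruceton(outcomes, start, step):
    """Bruceton up-and-down staircase: step the stimulus down after a
    'go' (burn) and up after a 'no-go', then estimate the ~50% level
    as the mean of the levels tested once the staircase has settled.

    `outcomes` is a callable returning True (go) at a given level;
    here it is a deterministic stand-in for a real test article.
    """
    level, levels = start, []
    for _ in range(20):
        levels.append(level)
        level = level - step if outcomes(level) else level + step
    # Discard early trials so the estimate reflects the oscillation zone.
    settled = levels[5:]
    return sum(settled) / len(settled)

# Illustrative deterministic threshold at 0.65 psia (not study data).
estimate = bruceton(lambda p: p > 0.65, start=2.0, step=0.2)
print(round(estimate, 2))
```

Real attribute data are stochastic near the threshold, which is why the method's efficiency matters: it spends almost all trials in the transition zone rather than far above or below it.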

  3. A Method For Assessing Economic Thresholds of Hardwood Competition

    Treesearch

    Steven A. Knowe

    2002-01-01

    A procedure was developed for computing economic thresholds for hardwood competition in pine plantations. The economic threshold represents the break-even level of competition above which hardwood control is a financially attractive treatment. Sensitivity analyses were conducted to examine the relative importance of biological and economic factors in determining...

  4. Permanent laser conditioning of thin film optical materials

    DOEpatents

    Wolfe, C.R.; Kozlowski, M.R.; Campbell, J.H.; Staggs, M.; Rainer, F.

    1995-12-05

    The invention comprises a method for producing optical thin films with a high laser damage threshold and the resulting thin films. The laser damage threshold of the thin films is permanently increased by irradiating the thin films with a fluence below an unconditioned laser damage threshold. 9 figs.

  5. [Detection of auditory impairment in the offspring caused by drug treatment of the dams].

    PubMed

    Kameyama, T; Nabeshima, T; Itoh, J

    1982-12-01

    To study the auditory impairment induced by prenatal administration of aminoglycosides in the offspring, the shuttle box method to measure the auditory threshold of rats (Kameyama et al., Folia pharmacol. japon. 77, 15, 1981) was employed. Four groups of pregnant rats were administered 200 mg/kg kanamycin sulfate (KM), 200 mg/kg dihydrostreptomycin sulfate (DHSM), 100 mg/kg neomycin sulfate (NM), or 1 ml/kg saline intramuscularly from the 10th to the 19th day of pregnancy. The auditory threshold of the offspring could be measured by the shuttle box method in about 90% of the live born rats at the age of 100 days. The auditory thresholds of the groups were as follows (mean +/- S.E.): saline group, 53.8 +/- 0.6 dB (N = 36); KM group, 63.8 +/- 1.1 dB (N = 34); DHSM group, 60.0 +/- 1.2 dB (N = 29); NM group, 62.4 +/- 1.2 dB (N = 24). Auditory thresholds of drug-treated groups were significantly higher than that of the saline group. However, no increase in the auditory threshold of the mother rat was detected after treatment with aminoglycosides. In addition, the experimental procedure of the shuttle box method is very easy, and the auditory threshold of a large number of rats could be measured in a short period. These findings suggest that this method is a very useful one for screening for auditory impairment induced by prenatal drug treatment in rat offspring.

  6. Evaluation of bone formation in calcium phosphate scaffolds with μCT-method validation using SEM.

    PubMed

    Lewin, S; Barba, A; Persson, C; Franch, J; Ginebra, M-P; Öhman-Mägi, C

    2017-10-05

    There is a plethora of calcium phosphate (CaP) scaffolds used as synthetic substitutes for bone grafts. Scaffold performance is often evaluated from the quantity of bone formed within or in direct contact with the scaffold. Micro-computed tomography (μCT) allows three-dimensional evaluation of bone formation inside scaffolds. However, the almost identical x-ray attenuation of CaP and bone hinders the separation of these phases in μCT images. Commonly, segmentation of bone in μCT images is based on gray scale intensity, with manually determined global thresholds. However, image analysis methods, and methods for manual thresholding in particular, lack standardization and may consequently suffer from subjectivity. The aim of the present study was to provide a methodological framework for addressing these issues. Bone formation in two types of CaP scaffold architectures (foamed and robocast), obtained from a larger animal study (a 12-week canine model), was evaluated by μCT. In addition, cross-sectional scanning electron microscopy (SEM) images were acquired as references to determine thresholds and to validate the result. μCT datasets were registered to the corresponding SEM reference. Global thresholds were then determined by quantitatively correlating the area fractions in the μCT image with the area fractions in the corresponding SEM image. For comparison, area fractions were also quantified using global thresholds determined manually by two different approaches. In the validation, the manually determined thresholds resulted in large average errors in area fraction (up to 17%), whereas for the evaluation using SEM references, the errors were estimated to be less than 3%. Furthermore, it was found that basing the thresholds on one single SEM reference gave lower errors than determining them manually. This study provides an objective, robust and less error-prone method to determine global thresholds for the evaluation of bone formation in CaP scaffolds.
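The core of the SEM-referenced approach is to select the global threshold whose resulting foreground area fraction matches the reference fraction measured on the registered SEM image. A toy sketch of that selection rule (the image and reference fraction are invented; the study's correlation procedure is more involved):

```python
def area_fraction(image, thr):
    """Fraction of pixels at or above a global gray-level threshold."""
    flat = [v for row in image for v in row]
    return sum(1 for v in flat if v >= thr) / len(flat)

def threshold_from_reference(image, reference_fraction, levels=range(256)):
    """Pick the global threshold whose foreground area fraction best
    matches a reference fraction (e.g., measured on a registered SEM
    image), instead of choosing the threshold by eye."""
    return min(levels,
               key=lambda t: abs(area_fraction(image, t) - reference_fraction))

# Toy 4x4 "image": the SEM reference says 25% of the area is bone.
img = [[10, 20, 30, 200],
       [15, 25, 35, 210],
       [12, 22, 32, 220],
       [18, 28, 38, 230]]
print(threshold_from_reference(img, 0.25))
```

Anchoring the threshold to an independently measured quantity is what removes the subjectivity of manual gray-level selection.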

  7. Rejection thresholds in solid chocolate-flavored compound coating.

    PubMed

    Harwood, Meriel L; Ziegler, Gregory R; Hayes, John E

    2012-10-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers compared to melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate, a bitter and generally recognized as safe additive. Paired preference tests (blank compared to spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between 2 self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (P= 0.01). Conversely, eating style did not affect group rejection thresholds (P= 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (P= 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. This work makes use of the rejection threshold method to study market segmentation, extending its use to solid foods. 
We believe this method has broad applicability to the sensory specialist and product developer by providing a process to identify how much is too much when formulating products, even in the context of specific market segments. We illustrate this in solid chocolate-flavored compound coating, identifying substantial differences in the amount of acceptable bitterness in those who prefer milk chocolate compared to dark chocolate. This method provides a direct means to answer the question of how much is too much. © 2012 Institute of Food Technologists®
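Computationally, a group rejection threshold reduces to interpolating the dose-response curve of blank preference to a chosen criterion. A sketch with an illustrative 75% criterion and invented proportions (the paper's own fitting procedure and criterion may differ):

```python
def rejection_threshold(concentrations, prop_prefer_blank, criterion=0.75):
    """Linear interpolation of the dose-response curve to find the
    concentration where preference for the blank crosses `criterion`.
    The 0.75 criterion and all data here are illustrative."""
    pts = sorted(zip(concentrations, prop_prefer_blank))
    for (c0, p0), (c1, p1) in zip(pts, pts[1:]):
        if p0 < criterion <= p1:
            return c0 + (criterion - p0) * (c1 - c0) / (p1 - p0)
    return None  # criterion never crossed in the tested range

concs = [0.1, 0.3, 1.0, 3.0, 10.0]      # illustrative spike concentrations
props = [0.50, 0.55, 0.65, 0.85, 0.95]  # proportion preferring the blank
print(rejection_threshold(concs, props))
```

At low spike levels the group is at chance (0.50, no detectable difference); the threshold is the concentration where dislike becomes decisive.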

  8. A Continuous Threshold Expectile Model.

    PubMed

    Zhang, Feipeng; Li, Qunhua

    2017-12-01

    Expectile regression is a useful tool for exploring the relation between the response and the explanatory variables beyond the conditional mean. A continuous threshold expectile regression is developed for modeling data in which the effect of a covariate on the response variable is linear but varies below and above an unknown threshold in a continuous way. The estimators for the threshold and the regression coefficients are obtained using a grid search approach. The asymptotic properties for all the estimators are derived, and the estimator for the threshold is shown to achieve root-n consistency. A weighted CUSUM type test statistic is proposed for the existence of a threshold at a given expectile, and its asymptotic properties are derived under both the null and the local alternative models. This test only requires fitting the model under the null hypothesis in the absence of a threshold, thus it is computationally more efficient than the likelihood-ratio type tests. Simulation studies show that the proposed estimators and test have desirable finite sample performance in both homoscedastic and heteroscedastic cases. The application of the proposed method on a Dutch growth data and a baseball pitcher salary data reveals interesting insights. The proposed method is implemented in the R package cthreshER.
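A stripped-down version of the grid search can be sketched with ordinary least squares standing in for the paper's asymmetric expectile loss: for each candidate threshold, fit the continuous bent-line model and keep the threshold with the smallest loss (the data below are synthetic with a known kink at x = 5):

```python
def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def bentline_sse(x, y, tau):
    """Least-squares fit of y = a + b*x + c*max(x - tau, 0): linear below
    tau, with a continuous slope change above it. Returns the SSE.
    (OLS stands in here for the asymmetric expectile loss of the paper.)"""
    cols = [[1.0] * len(x), list(x), [max(xi - tau, 0.0) for xi in x]]
    A = [[sum(u * v for u, v in zip(cols[i], cols[j])) for j in range(3)]
         for i in range(3)]
    rhs = [sum(u * yi for u, yi in zip(cols[i], y)) for i in range(3)]
    a, b, c = solve3(A, rhs)
    return sum((yi - (a + b * xi + c * max(xi - tau, 0.0))) ** 2
               for xi, yi in zip(x, y))

def grid_search_threshold(x, y, taus):
    """Profile the threshold: pick tau minimizing the fitted loss."""
    return min(taus, key=lambda t: bentline_sse(x, y, t))

# Synthetic data with a kink at x = 5: slope 1 below, slope 3 above.
xs = [i * 0.5 for i in range(21)]                  # 0.0 .. 10.0
ys = [xi + 2.0 * max(xi - 5.0, 0.0) for xi in xs]
taus = [i * 0.5 for i in range(1, 20)]             # candidate thresholds
print(grid_search_threshold(xs, ys, taus))
```

The continuity constraint is built into the basis function max(x − τ, 0): the fitted line bends at τ without jumping, which is what "continuous threshold" means in the model.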

  9. Dual photon excitation microscopy and image threshold segmentation in live cell imaging during compression testing.

    PubMed

    Moo, Eng Kuan; Abusara, Ziad; Abu Osman, Noor Azuan; Pingguan-Murphy, Belinda; Herzog, Walter

    2013-08-09

    Morphological studies of live connective tissue cells are imperative to understanding cellular responses to mechanical stimuli. However, photobleaching is a persistent obstacle to accurate and reliable live cell fluorescent imaging, and various image thresholding methods have been adopted to account for photobleaching effects. Previous studies showed that dual photon excitation (DPE) techniques are superior to conventional one photon excitation (OPE) confocal techniques in minimizing photobleaching. In this study, we investigated the effects of photobleaching resulting from OPE and DPE on the morphology of in situ articular cartilage chondrocytes across repeat laser exposures. Additionally, we compared the effectiveness of three commonly used image thresholding methods in accounting for photobleaching effects, with and without tissue loading through compression. In general, photobleaching leads to an apparent volume reduction in subsequent image scans. Performing seven consecutive scans of chondrocytes in unloaded cartilage, we found that the apparent cell volume loss caused by DPE microscopy is much smaller than that observed using OPE microscopy. Applying scan-specific image thresholds did not prevent the photobleaching-induced volume loss, and volume reductions were non-uniform over the seven repeat scans. During cartilage loading through compression, cell fluorescence increased and, depending on the thresholding method used, led to different volume changes. Therefore, different conclusions on cell volume changes may be drawn during tissue compression, depending on the image thresholding methods used. In conclusion, our findings confirm that photobleaching directly affects cell morphology measurements, and that DPE causes fewer photobleaching artifacts than OPE for uncompressed cells. When cells are compressed during tissue loading, a complicated interplay between photobleaching effects and compression-induced fluorescence increase may lead to interpretations of cell responses to mechanical stimuli that depend on the microscopic approach and the thresholding methods used, and may result in contradictory conclusions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Is "No-Threshold" a "Non-Concept"?

    NASA Astrophysics Data System (ADS)

    Schaeffer, David J.

    1981-11-01

    A controversy prominent in scientific literature that has carried over to newspapers, magazines, and popular books is having serious social and political expressions today: “Is there, or is there not, a threshold below which exposure to a carcinogen will not induce cancer?” The distinction between establishing the existence of this threshold (which is a theoretical question) and its value (which is an experimental one) gets lost in the scientific arguments. Establishing the existence of this threshold has now become a philosophical question (and an emotional one). In this paper I qualitatively outline theoretical reasons why a threshold must exist, discuss experiments which measure thresholds on two chemicals, and describe and apply a statistical method for estimating the threshold value from exposure-response data.

  11. Estimation of signal coherence threshold and concealed spectral lines applied to detection of turbofan engine combustion noise.

    PubMed

    Miles, Jeffrey Hilton

    2011-05-01

    Combustion noise from turbofan engines has become important as the noise from sources such as the fan and jet is reduced. An aligned and un-aligned coherence technique has been developed to determine a threshold level for the coherence and thereby help separate the coherent combustion noise source from other noise sources measured with far-field microphones. This method is compared with a statistics-based coherence threshold estimation method. In addition, the un-aligned coherence procedure simultaneously reveals periodicities, spectral lines, and undamped sinusoids hidden by broadband turbofan engine noise. In calculating the coherence threshold using a statistical method, one may use either the number of independent records or a larger number corresponding to the number of overlapped records used to create the average. Using data from a turbofan engine and a simulation, this paper shows that applying the Fisher z-transform to the un-aligned coherence can aid in making the proper selection of samples and produce a reasonable statistics-based coherence threshold. Examples are presented showing that the underlying tonal and coherent broadband structure buried under random broadband noise and jet noise can be determined. The method also shows the possible presence of indirect combustion noise.
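The statistics-based threshold the abstract compares against can be sketched from a standard result (often attributed to Carter) for magnitude-squared coherence estimated from n_d independent, non-overlapped averaged segments: under zero true coherence, the estimate exceeds 1 - alpha**(1/(n_d - 1)) with probability alpha. The segment counts below are illustrative, not the paper's; they only show why the choice between independent and overlapped record counts matters.

```python
def coherence_threshold(n_segments, alpha=0.05):
    """Significance threshold for magnitude-squared coherence
    estimated from n_segments independent averaged segments."""
    if n_segments < 2:
        raise ValueError("need at least 2 averaged segments")
    return 1.0 - alpha ** (1.0 / (n_segments - 1))

# With 50% overlap the raw segment count roughly doubles, but the
# segments are no longer independent; using the raw count gives a
# lower (more permissive) threshold than the independent count does.
thr_independent = coherence_threshold(100)   # e.g. 100 independent records
thr_overlapped = coherence_threshold(199)    # raw count with 50% overlap
```

Picking the larger, overlapped count lowers the threshold and risks declaring spurious coherence significant, which is the sample-selection issue the paper addresses with the Fisher z-transform.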

  12. A simple method to estimate threshold friction velocity of wind erosion in the field

    USDA-ARS?s Scientific Manuscript database

    Nearly all wind erosion models require the specification of threshold friction velocity (TFV). Yet determining TFV of wind erosion in field conditions is difficult as it depends on both soil characteristics and distribution of vegetation or other roughness elements. While several reliable methods ha...

  13. Low-Threshold Active Teaching Methods for Mathematic Instruction

    ERIC Educational Resources Information Center

    Marotta, Sebastian M.; Hargis, Jace

    2011-01-01

    In this article, we present a large list of low-threshold active teaching methods categorized so the instructor can efficiently access and target the deployment of conceptually based lessons. The categories include teaching strategies for lecture on large and small class sizes; student action individually, in pairs, and groups; games; interaction…

  14. Smeared spectrum jamming suppression based on generalized S transform and threshold segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xin; Wang, Chunyang; Tan, Ming; Fu, Xiaolong

    2018-04-01

    Smeared Spectrum (SMSP) jamming is an effective jamming technique against linear frequency modulation (LFM) radar. Based on the time-frequency distribution difference between the jamming and the echo, a jamming suppression method using the Generalized S transform (GST) and threshold segmentation is proposed. The sub-pulse period is first estimated from the autocorrelation function. Then, the time-frequency image and the corresponding gray-scale image are obtained with the GST. Finally, the Tsallis cross entropy is used to compute the optimized segmentation threshold, and the jamming suppression filter is constructed from that threshold. Simulation results show that the proposed method performs well in suppressing the false targets produced by SMSP.

  15. Bilevel thresholding of sliced image of sludge floc.

    PubMed

    Chu, C P; Lee, D J

    2004-02-15

    This work examined the feasibility of employing various thresholding algorithms to determine the optimal bilevel thresholding value for estimating the geometric parameters of sludge flocs from microtome-sliced images and from confocal laser scanning microscope images. Morphological information extracted from images depends on the bilevel thresholding value. Based on an evaluation of luminescence-inverted images and fractal curves (the quadric Koch curve and Sierpinski carpet), Otsu's method yields more stable performance than other histogram-based algorithms and was chosen to obtain the porosity. The maximum convex perimeter method, however, can probe the shapes and spatial distribution of the pores among the biomass granules in real sludge flocs. A combined algorithm is recommended for probing sludge floc structure.
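Otsu's method, which the study selects for porosity estimation, scans all candidate gray levels and keeps the one maximizing the between-class variance of the resulting background/foreground split. A minimal sketch on a toy bimodal "image" (the pixel values below are invented, standing in for a sliced floc image):

```python
def otsu_threshold(pixels, levels=256):
    """Return the gray level that maximizes between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))

    sum_bg, weight_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        weight_bg += hist[t]           # pixels at or below t: background
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg  # remaining pixels: foreground
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        # between-class variance for this candidate threshold
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# dark background around gray level 40, bright granules around 200
pixels = [40] * 500 + [45] * 300 + [195] * 200 + [205] * 100
t = otsu_threshold(pixels)
```

The returned level separates the two histogram modes; pixels above it would be classified as biomass, below it as pore space.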

  16. Comparison of MRI segmentation techniques for measuring liver cyst volumes in autosomal dominant polycystic kidney disease.

    PubMed

    Farooq, Zerwa; Behzadi, Ashkan Heshmatzadeh; Blumenfeld, Jon D; Zhao, Yize; Prince, Martin R

    To compare MRI segmentation methods for measuring liver cyst volumes in autosomal dominant polycystic kidney disease (ADPKD). Liver cyst volumes in 42 ADPKD patients were measured using region growing, thresholding, and cyst diameter techniques. Manual segmentation was the reference standard. Root mean square deviation was 113, 155, and 500 for cyst diameter, thresholding, and region growing, respectively. Thresholding error for cyst volumes below 500 ml was 550%, vs 17% for cyst volumes above 500 ml (p < 0.001). For measuring the volume of a small number of cysts, the cyst diameter and manual segmentation methods are recommended. For severe disease with numerous, large hepatic cysts, thresholding is an acceptable alternative. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Factors affecting perception thresholds of vertical whole-body vibration in recumbent subjects: Gender and age of subjects, and vibration duration

    NASA Astrophysics Data System (ADS)

    Matsumoto, Y.; Maeda, S.; Iwane, Y.; Iwata, Y.

    2011-04-01

    Factors that may affect human perception thresholds of vertical whole-body vibration were investigated in two laboratory experiments with recumbent subjects. In the first experiment, the effects of subject gender and age on perception were investigated with three groups of 12 subjects: young males, young females, and old males. For continuous sinusoidal vibrations at 2, 4, 8, 16, 31.5 and 63 Hz, there were no significant differences in perception thresholds between male and female subjects, while the thresholds of young subjects tended to be significantly lower than those of old subjects. In the second experiment, the effect of vibration duration was investigated using sinusoidal vibrations at the same frequencies, modulated by Hanning windows of different lengths (0.5, 1.0, 2.0 and 4.0 s), for 12 subjects. The peak acceleration at the threshold tended to decrease with increasing vibration duration. The perception thresholds were also evaluated using the running root-mean-square (rms) acceleration and the fourth-power acceleration method defined in the current standards. The differences in the threshold of transient vibrations of different durations were smaller with the fourth-power acceleration method. Additionally, the effect of the integration time on the threshold was investigated for the running rms acceleration and the fourth-power acceleration. The integration time that yielded the smallest differences in threshold across vibration durations depended on the frequency of vibration.
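Both evaluation measures can be sketched as one running-window statistic parameterized by the exponent: the running rms uses power 2, the fourth-power method uses power 4. This is a simplified form of the standardized definitions, and the signals, window length, and amplitudes below are invented, not the experimental values.

```python
def running_measure(samples, dt, window, power):
    """Peak of the running mean-power measure over a sliding window."""
    n = max(1, int(window / dt))
    best = 0.0
    for i in range(len(samples)):
        seg = samples[max(0, i - n + 1): i + 1]
        val = (sum(abs(x) ** power for x in seg) * dt / window) ** (1.0 / power)
        best = max(best, val)
    return best

dt = 0.01                                # 100 Hz sampling
short = [1.0] * 50 + [0.0] * 100         # 0.5 s burst
long_ = [1.0] * 100 + [0.0] * 50         # 1.0 s burst, same amplitude
rms_short = running_measure(short, dt, 1.0, 2)
rms_long = running_measure(long_, dt, 1.0, 2)
p4_short = running_measure(short, dt, 1.0, 4)
p4_long = running_measure(long_, dt, 1.0, 4)
```

With these toy signals the fourth-power measure differs less between the short and long bursts than the rms does, mirroring the study's observation that duration effects on the threshold are smaller under the fourth-power method.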

  18. Demand for Colonoscopy in Colorectal Cancer Screening Using a Quantitative Fecal Immunochemical Test and Age/Sex-Specific Thresholds for Test Positivity.

    PubMed

    Chen, Sam Li-Sheng; Hsu, Chen-Yang; Yen, Amy Ming-Fang; Young, Graeme P; Chiu, Sherry Yueh-Hsia; Fann, Jean Ching-Yuan; Lee, Yi-Chia; Chiu, Han-Mo; Chiou, Shu-Ti; Chen, Hsiu-Hsi

    2018-06-01

    Background: Despite age and sex differences in fecal hemoglobin (f-Hb) concentrations, most fecal immunochemical test (FIT) screening programs use population-average cut-points for test positivity. The impact of age/sex-specific thresholds on FIT accuracy and colonoscopy demand for colorectal cancer screening is unknown. Methods: Using data from 723,113 participants enrolled in a Taiwanese population-based colorectal cancer screening program with a single FIT between 2004 and 2009, sensitivity and specificity were estimated for various f-Hb thresholds for test positivity. This included estimates based on a "universal" threshold, a receiver-operating-characteristic-curve-derived threshold, a targeted sensitivity, a targeted false-positive rate, and a colonoscopy-capacity-adjusted method integrating colonoscopy workload, with and without age/sex adjustments. Results: Optimal age/sex-specific thresholds were found to be equal to or lower than the universal 20 μg Hb/g threshold. For older males, a higher threshold (24 μg Hb/g) was identified using a 5% false-positive rate. Importantly, a nonlinear relationship was observed between sensitivity and colonoscopy workload, with workload rising disproportionately to sensitivity at 16 μg Hb/g. At this "colonoscopy-capacity-adjusted" threshold, the test positivity (colonoscopy workload) was 4.67% and sensitivity was 79.5%, compared with a lower 4.0% workload and a lower 78.7% sensitivity using 20 μg Hb/g. When constrained on capacity, age/sex-adjusted estimates were generally lower. However, optimizing age/sex-adjusted thresholds increased colonoscopy demand across models by 17% or more compared with a universal threshold. Conclusions: Age/sex-specific thresholds improve FIT accuracy with modest increases in colonoscopy demand. Impact: Colonoscopy-capacity-adjusted and age/sex-specific f-Hb thresholds may be useful in optimizing individual screening programs based on detection accuracy, population characteristics, and clinical capacity. 
Cancer Epidemiol Biomarkers Prev; 27(6); 704-9. ©2018 American Association for Cancer Research.
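A capacity-adjusted threshold choice of the kind described above can be sketched as: among candidate f-Hb cut-offs, keep the most sensitive one whose test-positive rate (colonoscopy workload) stays within capacity. The tiny dataset, candidate cut-offs, and capacity figure below are invented for illustration, not the study's values.

```python
def capacity_adjusted_cutoff(fhb, has_cancer, candidates, max_positivity):
    """Most sensitive cut-off whose positivity rate fits within capacity."""
    n = len(fhb)
    pos = sum(has_cancer)
    best = None
    for c in sorted(candidates):
        positives = [f >= c for f in fhb]
        positivity = sum(positives) / n               # colonoscopy workload
        sens = sum(1 for p, y in zip(positives, has_cancer) if p and y) / pos
        if positivity <= max_positivity:
            if best is None or sens > best[1]:
                best = (c, sens, positivity)
    return best

fhb = [2, 5, 8, 15, 18, 22, 30, 40]   # f-Hb, ug Hb/g feces (invented)
cancer = [False, False, False, False, True, False, True, True]
cutoff, sensitivity, workload = capacity_adjusted_cutoff(
    fhb, cancer, candidates=[10, 16, 20, 24], max_positivity=0.5)
```

Lowering the cut-off raises sensitivity but also workload; the constraint caps how far the cut-off can be pushed down.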

  19. Trunk muscle activation during golf swing: Baseline and threshold.

    PubMed

    Silva, Luís; Marta, Sérgio; Vaz, João; Fernandes, Orlando; Castro, Maria António; Pezarat-Correia, Pedro

    2013-10-01

    There is a lack of studies regarding EMG temporal analysis during dynamic and complex motor tasks such as the golf swing. The aim of this study was to analyze EMG onset during the golf swing by comparing two different threshold methods. The Method A threshold was determined using the baseline activity recorded between two maximum voluntary contractions (MVCs). The Method B threshold was calculated from the mean EMG activity over a 1000 ms window ending 500 ms before the start of the backswing. Two different clubs were also studied. Three-way repeated-measures ANOVA was used to compare methods, muscles and clubs. A two-way mixed intraclass correlation coefficient (ICC) with absolute agreement was used to determine the reliability of the methods. Club type had no influence on onset detection. The rectus abdominis (RA) showed the highest agreement between methods. The erector spinae (ES), on the other hand, showed very low agreement, which might be related to postural activity before the swing. The external oblique (EO) was the first muscle activated, at 1295 ms prior to impact. Activation times were similar between the right and left muscle sides, although the right EO showed better agreement between methods than the left. Therefore, the choice of algorithm is task- and muscle-dependent. Copyright © 2013 Elsevier Ltd. All rights reserved.
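Baseline-referenced EMG onset detection of the kind compared here is commonly implemented as: onset is the first sample whose rectified amplitude exceeds the baseline mean plus k standard deviations. The sketch below follows that generic scheme; the signal values and k = 3 are invented, not the study's parameters.

```python
import math

def emg_onset(signal, baseline, k=3.0):
    """Index of first sample exceeding baseline mean + k*SD, else None."""
    mean = sum(baseline) / len(baseline)
    var = sum((x - mean) ** 2 for x in baseline) / len(baseline)
    thr = mean + k * math.sqrt(var)
    for i, x in enumerate(signal):
        if abs(x) > thr:
            return i
    return None

baseline = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0]   # resting activity (a.u.)
signal = [1.0, 1.1, 0.9, 1.2, 5.0, 6.5]     # burst begins at index 4
onset = emg_onset(signal, baseline)
```

The choice of baseline window is exactly what distinguishes Methods A and B above: the same rule gives different onsets if the baseline segment itself contains postural activity, as the study suggests for the erector spinae.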

  20. Comparative performance of two quantitative safety signalling methods: implications for use in a pharmacovigilance department.

    PubMed

    Almenoff, June S; LaCroix, Karol K; Yuen, Nancy A; Fram, David; DuMouchel, William

    2006-01-01

    There is increasing interest in using disproportionality-based signal detection methods to support postmarketing safety surveillance activities. Two commonly used methods, empirical Bayes multi-item gamma Poisson shrinker (MGPS) and proportional reporting ratio (PRR), perform differently with respect to the number and types of signals detected. The goal of this study was to compare and analyse the performance characteristics of these two methods, to understand why they differ and to consider the practical implications of these differences for a large, industry-based pharmacovigilance department. We compared the numbers and types of signals of disproportionate reporting (SDRs) obtained with MGPS and PRR using two postmarketing safety databases and a simulated database. We recorded signal counts and performed a qualitative comparison of the drug-event combinations signalled by the two methods as well as a sensitivity analysis to better understand how the thresholds commonly used for these methods impact their performance. PRR detected more SDRs than MGPS. We observed that MGPS is less subject to confounding by demographic factors because it employs stratification and is more stable than PRR when report counts are low. Simulation experiments performed using published empirical thresholds demonstrated that PRR detected false-positive signals at a rate of 1.1%, while MGPS did not detect any statistical false positives. In an attempt to separate the effect of choice of signal threshold from more fundamental methodological differences, we performed a series of experiments in which we modified the conventional threshold values for each method so that each method detected the same number of SDRs for the example drugs studied. 
This analysis, which provided quantitative examples of the relationship between the published thresholds for the two methods, demonstrates that the signalling criterion published for PRR has a higher signalling frequency than that published for MGPS. The performance differences between the PRR and MGPS methods are related to (i) greater confounding by demographic factors with PRR; (ii) a higher tendency of PRR to detect false-positive signals when the number of reports is small; and (iii) the conventional thresholds that have been adapted for each method. PRR tends to be more 'sensitive' and less 'specific' than MGPS. A high-specificity disproportionality method, when used in conjunction with medical triage and investigation of critical medical events, may provide an efficient and robust approach to applying quantitative methods in routine postmarketing pharmacovigilance.

  1. [A cloud detection algorithm for MODIS images combining Kmeans clustering and multi-spectral threshold method].

    PubMed

    Wang, Wei; Song, Wei-Guo; Liu, Shi-Xing; Zhang, Yong-Ming; Zheng, Hong-Yang; Tian, Wei

    2011-04-01

    An improved method for cloud detection combining Kmeans clustering and a multi-spectral threshold approach is described. On the basis of landmark spectrum analysis, MODIS data are first divided into two major classes by the Kmeans method. The first class includes cloud, smoke and snow, and the second class includes vegetation, water and land. A multi-spectral threshold detection is then applied to the first class to eliminate interference such as smoke and snow. The method was tested with MODIS data acquired at different times over different underlying surface conditions. Visual inspection of the results showed that the algorithm can effectively detect small areas of cloud pixels and exclude interference from the underlying surface, which provides a good foundation for a subsequent fire detection approach.
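The two-stage idea can be sketched as: a 1-D K-means split on brightness separates the bright class (cloud/smoke/snow) from the dark class (vegetation/water/land), then a spectral threshold filters the bright class. The band values and the 0.4 shortwave-IR cut-off below are invented for illustration and are not the paper's actual MODIS thresholds.

```python
def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means; returns (labels, centers)."""
    c0, c1 = min(values), max(values)
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - c0) <= abs(v - c1) else 1 for v in values]
        g0 = [v for v, lab in zip(values, labels) if lab == 0]
        g1 = [v for v, lab in zip(values, labels) if lab == 1]
        if g0: c0 = sum(g0) / len(g0)
        if g1: c1 = sum(g1) / len(g1)
    return labels, (c0, c1)

# Each pixel: (visible reflectance, shortwave-IR reflectance).
# Snow is bright in the visible but dark in shortwave IR; cloud is bright in both.
pixels = [(0.9, 0.8), (0.85, 0.75), (0.8, 0.1), (0.1, 0.05), (0.15, 0.1)]
vis = [p[0] for p in pixels]
labels, (c0, c1) = kmeans_1d(vis)
bright = 1 if c1 > c0 else 0
# Stage 2: within the bright cluster, require shortwave-IR reflectance > 0.4
cloud_mask = [lab == bright and p[1] > 0.4 for p, lab in zip(pixels, labels)]
```

In this toy scene the snow pixel (bright visible, dark shortwave IR) ends up in the bright cluster after stage 1 but is correctly rejected by the stage-2 spectral threshold.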

  2. Comparison of the diagnostic accuracy, sensitivity and specificity of four odontological methods for age evaluation in Italian children at the age threshold of 14 years using ROC curves.

    PubMed

    Pinchi, Vilma; Pradella, Francesco; Vitale, Giulia; Rugo, Dario; Nieri, Michele; Norelli, Gian-Aristide

    2016-01-01

    The age threshold of 14 years is relevant in Italy as the minimum age of criminal responsibility. It is of utmost importance to evaluate the diagnostic accuracy of every odontological method for age estimation, considering both sensitivity (the ability to identify true positive cases) and specificity (the ability to identify true negative cases). This research compares the specificity and sensitivity of four commonly adopted methods of dental age estimation - Demirjian, Haavikko, Willems and Cameriere - in a sample of Italian children aged between 11 and 16 years, at the age threshold of 14 years, using receiver operating characteristic curves and the area under the curve (AUC). In addition, new decision criteria are developed to increase the accuracy of the methods. Among the four methods, the Cameriere method showed the highest AUC in both female and male cohorts and a high degree of accuracy at the 14-year threshold. To estimate the 14-year threshold more accurately with the Cameriere method, however, it is suggested - according to the Youden index - that the decision criterion be set at the lower values of 12.928 years for females and 13.258 years for males, giving a sensitivity of 85% and specificity of 88% in females, and a sensitivity of 77% and specificity of 92% in males. If a specificity level >90% is needed, the cut-off should be set at 12.959 years (82% sensitivity) for females. © The Author(s) 2015.
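The Youden index used above is J = sensitivity + specificity - 1, maximized over candidate cut-offs. A generic sketch (the dental-age scores and true-status labels below are invented, not the study's sample):

```python
def youden_cutoff(scores, is_positive):
    """Scan candidate cut-offs (score >= c classified positive);
    return the cut-off maximizing the Youden index J."""
    best_c, best_j = None, -1.0
    pos = sum(is_positive)
    neg = len(is_positive) - pos
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, is_positive) if y and s >= c)
        tn = sum(1 for s, y in zip(scores, is_positive) if not y and s < c)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# Dental-age estimates vs. true status "chronological age >= 14"
scores = [12.1, 12.8, 13.1, 13.3, 13.9, 14.2, 14.8, 15.1]
truth = [False, False, False, True, True, True, True, True]
cutoff, j = youden_cutoff(scores, truth)
```

Note the optimal cut-off can fall well below the nominal 14-year threshold, which is exactly the pattern the study reports for the Cameriere method.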

  3. Cost Savings Threshold Analysis of a Capacity-Building Program for HIV Prevention Organizations

    ERIC Educational Resources Information Center

    Dauner, Kim Nichols; Oglesby, Willie H.; Richter, Donna L.; LaRose, Christopher M.; Holtgrave, David R.

    2008-01-01

    Although the incidence of HIV each year remains steady, prevention funding is increasingly competitive. Programs need to justify costs in terms of evaluation outcomes, including economic ones. Threshold analyses set performance standards to determine program effectiveness relative to that threshold. This method was used to evaluate the potential…

  4. Histogram-based automatic thresholding for bruise detection of apples by structured-illumination reflectance imaging

    USDA-ARS?s Scientific Manuscript database

    Thresholding is an important step in the segmentation of image features, and the existing methods are not all effective when the image histogram exhibits a unimodal pattern, which is common in defect detection of fruit. This study was aimed at developing a general automatic thresholding methodology ...

  5. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    NASA Astrophysics Data System (ADS)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, Canny edge detection is used for initial edge detection. Tensor voting is then applied to the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on it, the refined edges. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency-weighted foreground and background histograms are then created and used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then interpolated to generate a threshold for each pixel. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.

  6. Network meta-analysis of diagnostic test accuracy studies identifies and ranks the optimal diagnostic tests and thresholds for health care policy and decision-making.

    PubMed

    Owen, Rhiannon K; Cooper, Nicola J; Quinn, Terence J; Lees, Rosalind; Sutton, Alex J

    2018-07-01

    Network meta-analyses (NMA) have been used extensively to compare the effectiveness of multiple interventions for health care policy and decision-making. However, methods for evaluating the performance of multiple diagnostic tests are less established. In a decision-making context, we are often interested in comparing and ranking the performance of multiple diagnostic tests, at varying levels of test thresholds, in one simultaneous analysis. Motivated by an example of cognitive impairment diagnosis following stroke, we synthesized data from 13 studies assessing the efficiency of two diagnostic tests: the Mini-Mental State Examination (MMSE) and the Montreal Cognitive Assessment (MoCA), at two test thresholds each: MMSE <25/30 and <27/30, and MoCA <22/30 and <26/30. Using Markov chain Monte Carlo (MCMC) methods, we fitted a bivariate network meta-analysis model incorporating constraints on increasing test threshold and accounting for the correlations between multiple test accuracy measures from the same study. We developed and successfully fitted a model comparing multiple test/threshold combinations while imposing threshold constraints. Using this model, we found that MoCA at threshold <26/30 appeared to have the best true positive rate, whereas MMSE at threshold <25/30 appeared to have the best true negative rate. The combined analysis of multiple tests at multiple thresholds allows more rigorous comparisons between competing diagnostic tests for decision making. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  7. 48 CFR 2913.201 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACQUISITION PROCEDURES Actions at or Below the Micro-Purchase Threshold 2913.201 General. The Government... micro-purchase threshold. Other small purchase methods (blanket purchase agreements, third party drafts...

  8. Threshold-free high-power methods for the ontological analysis of genome-wide gene-expression studies

    PubMed Central

    Nilsson, Björn; Håkansson, Petra; Johansson, Mikael; Nelander, Sven; Fioretos, Thoas

    2007-01-01

    Ontological analysis facilitates the interpretation of microarray data. Here we describe new ontological analysis methods which, unlike existing approaches, are threshold-free and statistically powerful. We perform extensive evaluations and introduce a new concept, detection spectra, to characterize methods. We show that different ontological analysis methods exhibit distinct detection spectra, and that it is critical to account for this diversity. Our results argue strongly against the continued use of existing methods, and provide directions towards an enhanced approach. PMID:17488501

  9. Rainfall threshold calculation for debris flow early warning in areas with scarcity of data

    NASA Astrophysics Data System (ADS)

    Pan, Hua-Li; Jiang, Yuan-Jun; Wang, Jun; Ou, Guo-Qiang

    2018-05-01

    Debris flows are natural disasters that frequently occur in mountainous areas, usually accompanied by serious loss of life and property. One of the most commonly used approaches to mitigating debris flow risk is the implementation of early warning systems based on well-calibrated rainfall thresholds. However, many mountainous areas have little rainfall or hazard data, especially in debris-flow-forming regions, so the traditional statistical approach of deriving an empirical relationship between rainstorms and debris flow events cannot be used to calculate reliable rainfall thresholds there. After the severe Wenchuan earthquake, large volumes of loose material were deposited in the gullies, resulting in several debris flow events, and the triggering rainfall threshold decreased markedly. To obtain a reliable and accurate rainfall threshold and improve the accuracy of debris flow early warning, this paper develops a quantitative method, suited to debris flow triggering mechanisms in meizoseismal areas, for identifying rainfall thresholds in areas with scarce data, based on the initiation mechanism of hydraulically driven debris flow. First, we studied the characteristics of the study area, including its meteorology, hydrology, topography, and the physical characteristics of the loose solid materials. The rainfall threshold was then calculated from the initiation mechanism of hydraulic debris flow; the proposed threshold curve is a function of the antecedent precipitation index (API) and 1 h rainfall. To test the proposed method, we selected the Guojuanyan gully, a typical debris flow valley in the meizoseismal area of the Wenchuan earthquake that experienced several debris flow events during the 2008-2013 period, as a case study. 
The comparison with other threshold models and configurations shows that the proposed approach is a promising starting point for further studies on debris flow early warning systems in areas with scarce data.

  10. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    NASA Astrophysics Data System (ADS)

    Sa, Qila; Wang, Zhihui

    2018-03-01

    At present, content-based video retrieval (CBVR) is the mainstream video retrieval method, using features of the video itself to perform automatic identification and retrieval. This method relies on a key technology: shot segmentation. In this paper, a method for automatic video shot boundary detection using K-means clustering and improved adaptive dual-threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm: frames with significant change and frames with no significant change. Then, the improved adaptive dual-threshold comparison is applied to the classification results to detect both abrupt and gradual shot boundaries. Together these steps yield an automatic video shot boundary detection system.
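The classic dual-threshold (twin-comparison) idea that the paper builds on can be sketched as: a frame difference above a high threshold marks an abrupt cut, while a run of differences between the low and high thresholds whose accumulated sum exceeds the high threshold marks a gradual transition. The difference values and thresholds below are invented, and this is the baseline scheme, not the paper's adaptive variant.

```python
def detect_shots(diffs, t_low, t_high):
    """Return (abrupt cut indices, list of (start, end) gradual transitions)."""
    cuts, graduals = [], []
    acc, start = 0.0, None
    for i, d in enumerate(diffs):
        if d >= t_high:                 # single large jump: abrupt cut
            cuts.append(i)
            acc, start = 0.0, None
        elif d >= t_low:                # moderate change: candidate gradual
            if start is None:
                start = i
            acc += d
            if acc >= t_high:           # accumulated change is large enough
                graduals.append((start, i))
                acc, start = 0.0, None
        else:                           # quiet frame: reset the candidate
            acc, start = 0.0, None
    return cuts, graduals

# One abrupt cut at frame 3, one gradual transition over frames 6-8
diffs = [0.1, 0.2, 0.1, 5.0, 0.1, 0.2, 1.5, 1.6, 1.8, 0.1]
cuts, graduals = detect_shots(diffs, t_low=1.0, t_high=4.0)
```

The paper's contribution is making t_low and t_high adaptive (guided by the K-means pre-classification) rather than fixed as here.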

  11. Determination of sensation threshold from small pulse trains of 2.01μm laser light

    NASA Astrophysics Data System (ADS)

    Dugan, Daniel C.; Johnson, Thomas E.

    2009-02-01

    The determination of sensation thresholds has applications ranging from uses in the medical community, such as neural pathway mapping and the diagnosis of diabetic neuropathy, to potential uses in setting safety standards. This study sought to determine the sensation threshold, and the distribution of sensation probabilities, for pulse trains ranging from two to nine 10 ms pulses of 2.01 μm laser light incident on a human forearm and chest. Threshold was defined as the energy density that elicits sensation 50% of the time (ED50). A method-of-levels approach was used in conjunction with a monovariate binary response model to determine the ED50 and a distribution of threshold probabilities. For smaller pulse trains, the threshold was found to depend largely on the total energy deposited, and thus to be independent of the number of pulses. Total energy becomes less important as the number of pulses increases, however, and a lower threshold was measured for a nine-pulse train than for one- through four-pulse trains. We have thus demonstrated that this method is a useful and straightforward way to determine sensation thresholds for a 2.01 μm laser for possible clinical use, and that lower-power pulsed lasers can elicit sensation at levels comparable to higher-power single-pulse lasers.

  12. Threshold Determination for Local Instantaneous Sea Surface Height Derivation with Icebridge Data in Beaufort Sea

    NASA Astrophysics Data System (ADS)

    Zhu, C.; Zhang, S.; Xiao, F.; Li, J.; Yuan, L.; Zhang, Y.; Zhu, T.

    2018-05-01

    The NASA Operation IceBridge (OIB) mission, initiated in 2009, is currently the largest airborne polar remote sensing program, collecting measurements that bridge the gap between NASA's ICESat and the upcoming ICESat-2 mission. This paper develops an improved method that optimizes the selection of Digital Mapping System (DMS) images and uses an optimal threshold, obtained from experiments in the Beaufort Sea, to calculate the local instantaneous sea surface height in this area. The optimal threshold was determined by comparing manual selection with the lowest Airborne Topographic Mapper (ATM) L1B elevation thresholds of 2%, 1%, 0.5%, 0.2%, 0.1% and 0.05% in sections A, B and C; the means of the mean differences were 0.166 m, 0.124 m, 0.083 m, 0.018 m, 0.002 m and -0.034 m, respectively. Our study shows that the lowest 0.1% of L1B data is the optimal threshold. The optimal threshold and manual selections were also used to calculate the instantaneous sea surface height over images with leads, and the improved method agrees more closely with the L1B manual selections. For images without leads, the local instantaneous sea surface height was estimated using linear relations between distance and the sea surface heights calculated over images with leads.

  13. Perfect Detection of Spikes in the Linear Sub-threshold Dynamics of Point Neurons

    PubMed Central

    Krishnan, Jeyashree; Porta Mana, PierGianLuca; Helias, Moritz; Diesmann, Markus; Di Napoli, Edoardo

    2018-01-01

    Spiking neuronal networks are usually simulated with one of three main schemes: the classical time-driven and event-driven schemes, and the more recent hybrid scheme. All three schemes evolve the state of a neuron through a series of checkpoints: equally spaced in the first scheme and determined neuron-wise by spike events in the latter two. The time-driven and the hybrid scheme determine whether the membrane potential of a neuron crosses a threshold at the end of the time interval between consecutive checkpoints. Threshold crossing can, however, occur within the interval even if this test is negative, so spikes can be missed. The present work offers an alternative geometric point of view on neuronal dynamics, and derives, implements, and benchmarks a method for perfect retrospective spike detection. This method can be applied to neuron models with affine or linear subthreshold dynamics. The idea behind the method is to propagate the threshold with a time-inverted dynamics, testing whether the threshold crosses the neuron state to be evolved, rather than vice versa. Algebraically, this translates into a set of inequalities that are necessary and sufficient for threshold crossing. This test is slower than the imperfect one, but can be optimized in several ways. Comparison confirms earlier results that the imperfect tests rarely miss spikes (fewer than 1 in 10^8 spikes missed) in biologically relevant settings. PMID:29379430

  14. Uncertainty Estimates of Psychoacoustic Thresholds Obtained from Group Tests

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Christian, Andrew

    2016-01-01

    Adaptive psychoacoustic test methods, in which the next signal level depends on the response to the previous signal, are the most efficient for determining psychoacoustic thresholds of individual subjects. In many tests conducted in the NASA psychoacoustic labs, the goal is to determine thresholds representative of the general population. To do this economically, non-adaptive testing methods are used in which three or four subjects are tested at the same time with predetermined signal levels. This approach requires us to identify techniques for assessing the uncertainty in the resulting group-average psychoacoustic thresholds. In this presentation we examine the Delta Method of frequentist statistics, the Generalized Linear Model (GLM), the Nonparametric Bootstrap (also a frequentist method), and Markov Chain Monte Carlo Posterior Estimation (a Bayesian approach). Each technique is exercised on a manufactured, theoretical dataset and then on datasets from two psychoacoustics facilities at NASA. The Delta Method is the simplest to implement and accurate for the cases studied. The GLM is found to be the least robust, and the Bootstrap takes the longest to calculate. The Bayesian Posterior Estimate is the most versatile technique examined because it allows the inclusion of prior information.

  15. An objective method for measuring face detection thresholds using the sweep steady-state visual evoked response

    PubMed Central

    Ales, Justin M.; Farzin, Faraz; Rossion, Bruno; Norcia, Anthony M.

    2012-01-01

    We introduce a sensitive method for measuring face detection thresholds rapidly, objectively, and independently of low-level visual cues. The method is based on the swept parameter steady-state visual evoked potential (ssVEP), in which a stimulus is presented at a specific temporal frequency while parametrically varying (“sweeping”) the detectability of the stimulus. Here, the visibility of a face image was increased by progressive derandomization of the phase spectra of the image in a series of equally spaced steps. Alternations between face and fully randomized images at a constant rate (3/s) elicit a robust first harmonic response at 3 Hz specific to the structure of the face. High-density EEG was recorded from 10 human adult participants, who were asked to respond with a button-press as soon as they detected a face. The majority of participants produced an evoked response at the first harmonic (3 Hz) that emerged abruptly between 30% and 35% phase-coherence of the face, which was most prominent on right occipito-temporal sites. Thresholds for face detection were estimated reliably in single participants from 15 trials, or on each of the 15 individual face trials. The ssVEP-derived thresholds correlated with the concurrently measured perceptual face detection thresholds. This first application of the sweep VEP approach to high-level vision provides a sensitive and objective method that could be used to measure and compare visual perception thresholds for various object shapes and levels of categorization in different human populations, including infants and individuals with developmental delay. PMID:23024355

  16. Quantification of pulmonary vessel diameter in low-dose CT images

    NASA Astrophysics Data System (ADS)

    Rudyanto, Rina D.; Ortiz de Solórzano, Carlos; Muñoz-Barrutia, Arrate

    2015-03-01

    Accurate quantification of vessel diameter in low-dose Computed Tomography (CT) images is important to study pulmonary diseases, in particular for the diagnosis of vascular diseases and the characterization of morphological vascular remodeling in Chronic Obstructive Pulmonary Disease (COPD). In this study, we objectively compare several vessel diameter estimation methods using a physical phantom. Five solid tubes of differing diameters (from 0.898 to 3.980 mm) were embedded in foam, simulating vessels in the lungs. To measure the diameters, we first extracted the vessels using either of two approaches: vessel enhancement by multi-scale Hessian matrix computation, or explicit segmentation by intensity thresholding. We implemented six methods to quantify the diameter: three estimating diameter as a function of the scale used to calculate the Hessian matrix; two calculating an equivalent diameter from the cross-section area obtained by thresholding the intensity and the vesselness response, respectively; and finally, estimating the diameter of the object using the Full Width at Half Maximum (FWHM). We find that the accuracy of frequently used methods estimating vessel diameter from the multi-scale vesselness filter depends on the range and the number of scales used. Moreover, these methods still yield a significant error margin on the challenging estimation of the smallest diameters (on the order of, or below, the size of the CT point spread function). Obviously, the performance of the thresholding-based methods depends on the value of the threshold. Finally, we observe that a simple adaptive thresholding approach can achieve a robust and accurate estimation of the diameter of the smallest vessels.
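
A minimal sketch of the FWHM diameter estimate on a 1-D intensity profile across a vessel (illustrative only: real use samples profiles perpendicular to the vessel axis, and FWHM is biased for vessels near the scanner's point-spread-function width, as the study notes).

```python
def fwhm(xs, ys):
    """Full Width at Half Maximum of a single-peaked 1-D profile,
    with linear interpolation at the two half-maximum crossings."""
    half = min(ys) + (max(ys) - min(ys)) / 2.0
    i = next(k for k in range(len(ys)) if ys[k] >= half)              # rising edge
    left = xs[i] if i == 0 else \
        xs[i-1] + (half - ys[i-1]) / (ys[i] - ys[i-1]) * (xs[i] - xs[i-1])
    j = next(k for k in range(len(ys) - 1, -1, -1) if ys[k] >= half)  # falling edge
    right = xs[j] if j == len(ys) - 1 else \
        xs[j] + (half - ys[j]) / (ys[j+1] - ys[j]) * (xs[j+1] - xs[j])
    return right - left

# Triangular profile peaking at x = 5: the FWHM is exactly 5.0
xs = list(range(11))
ys = [0, 20, 40, 60, 80, 100, 80, 60, 40, 20, 0]
print(fwhm(xs, ys))  # 5.0
```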

  17. Tactile Acuity Charts: A Reliable Measure of Spatial Acuity

    PubMed Central

    Bruns, Patrick; Camargo, Carlos J.; Campanella, Humberto; Esteve, Jaume; Dinse, Hubert R.; Röder, Brigitte

    2014-01-01

    For assessing tactile spatial resolution it has recently been recommended to use tactile acuity charts which follow the design principles of the Snellen letter charts for visual acuity and involve active touch. However, it is currently unknown whether acuity thresholds obtained with this newly developed psychophysical procedure are in accordance with established measures of tactile acuity that involve passive contact with fixed duration and control of contact force. Here we directly compared tactile acuity thresholds obtained with the acuity charts to traditional two-point and grating orientation thresholds in a group of young healthy adults. For this purpose, two types of charts, using either Braille-like dot patterns or embossed Landolt rings with different orientations, were adapted from previous studies. Measurements with the two types of charts were equivalent, but generally more reliable with the dot pattern chart. A comparison with the two-point and grating orientation task data showed that the test-retest reliability of the acuity chart measurements after one week was superior to that of the passive methods. Individual thresholds obtained with the acuity charts agreed reasonably with the grating orientation threshold, but less so with the two-point threshold that yielded relatively distinct acuity estimates compared to the other methods. This potentially considerable amount of mismatch between different measures of tactile acuity suggests that tactile spatial resolution is a complex entity that should ideally be measured with different methods in parallel. The simple test procedure and high reliability of the acuity charts makes them a promising complement and alternative to the traditional two-point and grating orientation thresholds. PMID:24504346

  18. ECG signal performance de-noising assessment based on threshold tuning of dual-tree wavelet transform.

    PubMed

    El B'charri, Oussama; Latif, Rachid; Elmansouri, Khalifa; Abenaou, Abdenbi; Jenkal, Wissam

    2017-02-07

    Since the electrocardiogram (ECG) signal has a low frequency and a weak amplitude, it is sensitive to miscellaneous mixed noises, which may reduce the diagnostic accuracy and hinder the physician's correct decision on patients. The dual-tree wavelet transform (DT-WT) is one of the most recent enhanced versions of the discrete wavelet transform. However, threshold tuning on this method for noise removal from the ECG signal has not been investigated yet. In this work, we provide a comprehensive study of the impact of the choice of threshold algorithm, threshold value, and wavelet decomposition level on ECG signal de-noising performance. A set of simulations is performed on both synthetic and real ECG signals. First, the synthetic ECG signal is used to observe the algorithm response. The evaluation on synthetic ECG signals corrupted by various types of noise has shown that the modified unified threshold and the wavelet hyperbolic threshold de-noising methods perform better under realistic and colored noises. The tuned threshold is then used on real ECG signals from the MIT-BIH database. The results have shown that the proposed method achieves higher performance than the ordinary dual-tree wavelet transform for all kinds of noise removal from the ECG signal. The simulation results indicate that the algorithm is robust for all kinds of noises with varying degrees of input noise, providing a high-quality clean signal. Moreover, the algorithm is quite simple and can be used in real-time ECG monitoring.
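
The DT-WT itself is beyond a short sketch, but the tuned step, thresholding of detail coefficients, can be illustrated with a plain Haar wavelet and soft thresholding. This is an assumption-laden stand-in: the paper's transform, threshold rules, and parameter values differ.

```python
import math, random

def haar_dwt(x):
    """One level of the orthonormal Haar transform -> (approx, detail)."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar level."""
    out = []
    for ai, di in zip(a, d):
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out

def soft(coeffs, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(x, levels=3, t=0.5):
    """Decompose, soft-threshold each detail band, reconstruct."""
    a, details = list(x), []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(soft(d, t))
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

# Demo: a step signal (ideal for Haar) plus bounded uniform noise.
rng = random.Random(7)
clean = [1.0] * 128 + [-1.0] * 128
noisy = [c + rng.uniform(-0.3, 0.3) for c in clean]
out = denoise(noisy)
rmse = lambda u, v: math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)) / len(u))
```

The de-noised reconstruction is closer to the clean signal than the noisy input because the detail-band noise is shrunk away while the step survives in the approximation band.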

  19. A critique of the use of indicator-species scores for identifying thresholds in species responses

    USGS Publications Warehouse

    Cuffney, Thomas F.; Qian, Song S.

    2013-01-01

    Identification of ecological thresholds is important both for theoretical and applied ecology. Recently, Baker and King (2010; King and Baker 2010) proposed a method, threshold indicator taxa analysis (TITAN), to calculate species and community thresholds based on indicator species scores adapted from Dufrêne and Legendre (1997). We tested the ability of TITAN to detect thresholds using models with (broken-stick, disjointed broken-stick, dose-response, step-function, Gaussian) and without (linear) definitive thresholds. TITAN accurately and consistently detected thresholds in step-function models, but not in models characterized by abrupt changes in response slopes or response direction. Threshold detection in TITAN was very sensitive to the distribution of 0 values, which caused TITAN to identify thresholds associated with relatively small differences in the distribution of 0 values while ignoring thresholds associated with large changes in abundance. Threshold identification and tests of statistical significance were based on the same data permutations, resulting in inflated estimates of statistical significance. Application of bootstrapping to the split-point problem that underlies TITAN led to underestimates of the confidence intervals of thresholds. Bias in the derivation of the z-scores used to identify TITAN thresholds, together with skewness in the distribution of data along the gradient, produced TITAN thresholds that were much more similar to one another than the actual thresholds were. This tendency may account for the synchronicity of thresholds reported in TITAN analyses. The thresholds identified by TITAN represented disparate characteristics of species responses; this, coupled with the inability of TITAN to identify thresholds accurately and consistently, does not support the aggregation of individual species thresholds into a community threshold.
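
The split-point problem underlying TITAN can be made concrete with a simple change-point fit: choose the split along the gradient that minimizes the pooled sum of squared errors of a two-segment step-function model. This is a generic sketch, not TITAN's indicator-score machinery.

```python
def best_split(xs, ys):
    """Split point minimizing the pooled sum of squared errors of a
    two-segment (step-function) fit: the classic split-point problem."""
    def sse(v):
        m = sum(v) / len(v)
        return sum((u - m) ** 2 for u in v)
    pairs = sorted(zip(xs, ys))
    best = None
    for i in range(1, len(pairs)):
        cost = sse([y for _, y in pairs[:i]]) + sse([y for _, y in pairs[i:]])
        if best is None or cost < best[0]:
            # place the split midway between the bracketing gradient values
            best = (cost, (pairs[i-1][0] + pairs[i][0]) / 2)
    return best[1]

# A clean step from 1 to 5 between x = 9 and x = 10
xs = list(range(20))
ys = [1.0] * 10 + [5.0] * 10
print(best_split(xs, ys))  # 9.5
```

Resampling-based confidence intervals for this split are exactly where the critique applies: bootstrapping the split-point estimator tends to understate its uncertainty.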

  20. Method to improve reliability of a fuel cell system using low performance cell detection at low power operation

    DOEpatents

    Choi, Tayoung; Ganapathy, Sriram; Jung, Jaehak; Savage, David R.; Lakshmanan, Balasubramanian; Vecasey, Pamela M.

    2013-04-16

    A system and method for detecting a low performing cell in a fuel cell stack using measured cell voltages. The method includes determining that the fuel cell stack is running, the stack coolant temperature is above a certain temperature and the stack current density is within a relatively low power range. The method further includes calculating the average cell voltage, and determining whether the difference between the average cell voltage and the minimum cell voltage is greater than a predetermined threshold. If the difference between the average cell voltage and the minimum cell voltage is greater than the predetermined threshold and the minimum cell voltage is less than another predetermined threshold, then the method increments a low performing cell timer. A ratio of the low performing cell timer and a system run timer is calculated to identify a low performing cell.
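
The monitoring logic described in the abstract translates almost directly into code. The numeric limits below (temperature, current-density window, voltage thresholds) are illustrative placeholders, not the patent's calibrated values.

```python
def update_low_cell_timer(cell_voltages, coolant_temp, current_density,
                          low_timer, run_timer, dt,
                          temp_min=40.0, density_range=(0.05, 0.2),
                          delta_thresh=0.15, v_min_thresh=0.55):
    """One monitoring step of the low-performing-cell detection logic.
    Returns updated timers and their ratio (dt must be > 0)."""
    run_timer += dt
    # only evaluate when the stack is warm and at low power
    running = coolant_temp > temp_min and \
        density_range[0] <= current_density <= density_range[1]
    if running:
        avg_v = sum(cell_voltages) / len(cell_voltages)
        min_v = min(cell_voltages)
        # both conditions must hold to count time against a weak cell
        if (avg_v - min_v) > delta_thresh and min_v < v_min_thresh:
            low_timer += dt
    return low_timer, run_timer, low_timer / run_timer

# One weak cell (0.5 V) among nine healthy cells (0.7 V)
print(update_low_cell_timer([0.7] * 9 + [0.5], 60.0, 0.1, 0.0, 0.0, 1.0))
```

A persistently high ratio of low-cell time to run time flags the low-performing cell.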

  1. Variability of argon laser-induced sensory and pain thresholds on human oral mucosa and skin.

    PubMed Central

    Svensson, P.; Bjerring, P.; Arendt-Nielsen, L.; Kaaber, S.

    1991-01-01

    The variability of laser-induced pain perception on human oral mucosa and hairy skin was investigated in order to establish a new method for evaluation of pain in the orofacial region. A high-energy argon laser was used for experimental pain stimulation, and sensory and pain thresholds were determined. The intra-individual coefficients of variation for oral thresholds were comparable to cutaneous thresholds. However, inter-individual variation was smaller for oral thresholds, which could be due to larger variation in cutaneous optical properties. The short-term and 24-hr changes in thresholds on both surfaces were less than 9%. The results indicate that habituation to laser thresholds may account for part of the intra-individual variation observed. However, the subjective ratings of the intensity of the laser stimuli were constant. Thus, oral thresholds may, like cutaneous thresholds, be used for assessment and quantification of analgesic efficacies and to investigate various pain conditions. PMID:1814248

  2. SeaWiFS Technical Report Series. Volume 7: Cloud screening for polar orbiting visible and infrared (IR) satellite sensors

    NASA Technical Reports Server (NTRS)

    Darzi, Michael; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor)

    1992-01-01

    Methods for detecting and screening cloud contamination from satellite-derived visible and infrared data are reviewed in this document. The methods are applicable to past, present, and future polar orbiting satellite radiometers. Such instruments include the Coastal Zone Color Scanner (CZCS), operational from 1978 through 1986; the Advanced Very High Resolution Radiometer (AVHRR); the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), scheduled for launch in August 1993; and the Moderate Resolution Imaging Spectrometer (MODIS). Constant threshold methods are the least demanding computationally, and often provide adequate results. An improvement to these methods is to determine the thresholds dynamically by adjusting them according to the areal and temporal distributions of the surrounding pixels. Spatial coherence methods set thresholds based on the expected spatial variability of the data. Other statistically derived methods and various combinations of basic methods are also reviewed. The complexity of the methods is ultimately limited by the computing resources. Finally, some criteria for evaluating cloud screening methods are discussed.
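
The contrast between constant and dynamically adjusted thresholds can be sketched on a toy brightness image. This is purely illustrative: operational screening works on calibrated radiance or brightness-temperature channels with larger windows and channel combinations.

```python
def cloud_mask_constant(img, t):
    """Constant-threshold screening: flag any pixel brighter than t."""
    return [[v > t for v in row] for row in img]

def cloud_mask_dynamic(img, k=2.0, half=1):
    """Dynamic screening: each pixel's threshold is mean + k*std of its
    surrounding window, adapting to regional brightness statistics."""
    h, w = len(img), len(img[0])
    mask = [[False] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            win = [img[a][b]
                   for a in range(max(0, i - half), min(h, i + half + 1))
                   for b in range(max(0, j - half), min(w, j + half + 1))]
            m = sum(win) / len(win)
            s = (sum((v - m) ** 2 for v in win) / len(win)) ** 0.5
            mask[i][j] = img[i][j] > m + k * s
    return mask

# A single bright (cloudy) pixel on a dark (clear) background
img = [[10, 10, 10], [10, 50, 10], [10, 10, 10]]
dyn = cloud_mask_dynamic(img)
```

The dynamic rule flags only the outlier pixel without any hand-picked global level, which is the attraction when scene brightness varies regionally.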

  3. Variable threshold method for ECG R-peak detection.

    PubMed

    Kew, Hsein-Ping; Jeong, Do-Un

    2011-10-01

    In this paper, a wearable belt-type ECG electrode, worn around the chest to measure the ECG in real time, is produced in order to minimize the inconvenience of wearing. The ECG signal is detected using a potential instrumentation system and transmitted to a personal computer via an ultra-low-power wireless data communications unit with a Zigbee-compatible wireless sensor node. ECG signals carry a great deal of clinical information for a cardiologist, and R-peak detection is especially important. R-peak detection generally uses a fixed threshold value, which produces detection errors when the baseline changes due to motion artifacts or when the signal amplitude changes. Differentiation and the Hilbert transform are used for signal preprocessing. Thereafter, a variable threshold method is used to detect the R-peak, which is more accurate and efficient than the fixed-threshold method. R-peak detection on the MIT-BIH databases and on long-term real-time ECG recordings is performed in this research in order to evaluate the performance.
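
A minimal variable-threshold peak picker over a preprocessed (rectified/envelope) signal. The seed fraction, decay rate, and refractory period are illustrative guesses, not the paper's values, and the seed assumes at least one beat in the first second.

```python
def detect_r_peaks(sig, fs, init_frac=0.6, decay=0.125, refractory=0.25):
    """Variable-threshold R-peak picking: the threshold tracks a running
    fraction of recent peak amplitudes instead of staying fixed, so
    baseline drift and amplitude changes are tolerated."""
    thresh = init_frac * max(sig[:int(fs)])   # seed from the first second
    peaks, last = [], -1e9
    for i in range(1, len(sig) - 1):
        is_local_max = sig[i] >= sig[i-1] and sig[i] > sig[i+1]
        if sig[i] >= thresh and is_local_max and (i - last) / fs > refractory:
            peaks.append(i)
            last = i
            # exponential update toward a fraction of the latest peak
            thresh = (1 - decay) * thresh + decay * init_frac * sig[i]
    return peaks

# Synthetic envelope: four beats of varying amplitude at 100 Hz
fs = 100
sig = [0.0] * 800
for idx, amp in zip((50, 250, 450, 650), (1.0, 0.8, 0.9, 0.7)):
    sig[idx] = amp
print(detect_r_peaks(sig, fs))  # [50, 250, 450, 650]
```

A fixed threshold of, say, 0.85 would miss the 0.8 and 0.7 beats; the adaptive threshold follows the amplitude down.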

  4. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting pixel (HPC) detectors. This approach is based on the Photon Transfer Curve (PTC), i.e., the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity, and the remnant errors of flat-fielding techniques. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The FPN, described quantitatively by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique), and the approach is demonstrated to identify the settings that give the best image quality from a commercial or an R&D detector.

  5. A Connection Admission Control Method for Web Server Systems

    NASA Astrophysics Data System (ADS)

    Satake, Shinsuke; Inai, Hiroshi; Saito, Tomoya; Arai, Tsuyoshi

    Most browsers establish multiple connections and download files in parallel to reduce the response time. On the other hand, a web server limits the total number of connections to prevent itself from being overloaded. That keeps the response time down, but increases the loss probability, i.e., the probability that a newly arriving client is rejected. This paper proposes a connection admission control method which accepts only one connection from a newly arriving client when the number of connections exceeds a threshold, but accepts multiple connections from a new client when the number of connections is below the threshold. Our method is aimed at reducing the response time by allowing as many clients as possible to establish multiple connections, while also reducing the loss probability. In order to reduce the time web server administrators spend determining an adequate threshold, we introduce a procedure which approximately calculates the loss probability for a given threshold. Via simulation, we validate the approximation and show the effectiveness of the admission control.
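
The admission policy itself is simple enough to sketch directly; the limit, threshold, and per-client connection count below are arbitrary illustrative numbers.

```python
class AdmissionControl:
    """Sketch of the proposed policy: a new client gets its full set of
    parallel connections while the server is below the threshold, only a
    single connection once the threshold is exceeded, and is rejected
    only when the hard connection limit is reached."""

    def __init__(self, limit, threshold, per_client=4):
        self.limit, self.threshold, self.per_client = limit, threshold, per_client
        self.active = 0          # currently open connections

    def admit(self):
        """Number of connections granted to a newly arriving client."""
        if self.active >= self.limit:
            return 0                                        # reject: hard limit
        if self.active < self.threshold:
            grant = min(self.per_client, self.limit - self.active)
        else:
            grant = 1                                       # over threshold
        self.active += grant
        return grant

ac = AdmissionControl(limit=10, threshold=8, per_client=4)
print([ac.admit() for _ in range(5)])  # [4, 4, 1, 1, 0]
```

The first two clients get parallel downloads; later arrivals are squeezed to one connection rather than rejected outright, which is how the method trades response time against loss probability.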

  6. Signal processing system for electrotherapy applications

    NASA Astrophysics Data System (ADS)

    Płaza, Mirosław; Szcześniak, Zbigniew

    2017-08-01

    A system of signal processing for electrotherapeutic applications is proposed in the paper. The system makes it possible to model the curve of threshold human sensitivity to current (Dalziel's curve) over the full medium-frequency range (1 kHz-100 kHz). Tests based on the proposed solution were conducted and their results were compared with those obtained under the assumptions of the High Tone Power Therapy method and referred to optimum values. The proposed system has high dynamics and precision in mapping the curve of threshold human sensitivity to current and can be used in all methods where threshold curves are modelled.

  7. Development of a Method to Determine the Audiogram of the Guinea Pig for Threshold Shift Studies,

    DTIC Science & Technology

    1984-01-01

    USAARL Report No. 84-4: Development of a Method to Determine the Audiogram of the Guinea Pig for Threshold Shift Studies. By Carlos Comperatore, US Army Aeromedical Research Laboratory, Fort Rucker, AL.

  8. Northwest Manufacturing Initiative

    DTIC Science & Technology

    2012-03-27

    crack growth and threshold stress corrosion cracking evaluation. Threshold stress corrosion cracking was done using the rising step load method with... Group Technology methods to establish manufacturing cells for production efficiency, to develop internal Lean Champions, and to implement rapid... different levels, advisory, core, etc. VI. Core steering committee composed of members that have a significant vested interest. Action Item: Draft

  9. OPTOELECTRONICS, FIBER OPTICS, AND OTHER ASPECTS OF QUANTUM ELECTRONICS: Interference-threshold storage of optical data

    NASA Astrophysics Data System (ADS)

    Efimkov, V. F.; Zubarev, I. G.; Kolobrodov, V. V.; Sobolev, V. B.

    1989-08-01

    A method for the determination of the spatial characteristics of a laser beam is proposed and implemented. This method is based on the interaction of an interference field of two laser beams, which are spatially similar to the one being investigated, with a light-sensitive material characterized by a sensitivity threshold.

  10. Machine Learning Approach to Extract Diagnostic and Prognostic Thresholds: Application in Prognosis of Cardiovascular Mortality

    PubMed Central

    Mena, Luis J.; Orozco, Eber E.; Felix, Vanessa G.; Ostos, Rodolfo; Melgarejo, Jesus; Maestre, Gladys E.

    2012-01-01

    Machine learning has become a powerful tool for analysing medical domains, assessing the importance of clinical parameters, and extracting medical knowledge for outcomes research. In this paper, we present a machine learning method for extracting diagnostic and prognostic thresholds, based on a symbolic classification algorithm called REMED. We evaluated the performance of our method by determining new prognostic thresholds for well-known and potential cardiovascular risk factors that are used to support medical decisions in the prognosis of fatal cardiovascular diseases. Our approach predicted 36% of cardiovascular deaths with 80% specificity and 75% general accuracy. The new method provides an innovative approach that might be useful to support decisions about medical diagnoses and prognoses. PMID:22924062

  11. Prefixed-threshold real-time selection method in free-space quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Xu, Feihu; Lo, Hoi-Kwong

    2018-03-01

    Free-space quantum key distribution allows two parties to share a random key with unconditional security, between ground stations, between mobile platforms, and even in satellite-ground quantum communications. Atmospheric turbulence causes fluctuations in transmittance, which further affect the quantum bit error rate and the secure key rate. Previous postselection methods to combat atmospheric turbulence require a threshold value determined after all quantum transmission. In contrast, here we propose a method where we predetermine the optimal threshold value even before quantum transmission. Therefore, the receiver can discard useless data immediately, thus greatly reducing data storage requirements and computing resources. Furthermore, our method can be applied to a variety of protocols, including, for example, not only single-photon BB84 but also asymptotic and finite-size decoy-state BB84, which can greatly increase its practicality.
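
The selection step itself is a one-line filter once the threshold is fixed in advance. The toy channel model and the threshold value below are illustrative assumptions; the paper's contribution is the optimization that picks the threshold before transmission.

```python
import random

def prts_filter(transmittances, eta_min):
    """Prefixed-threshold real-time selection: slots whose observed
    channel transmittance falls below the predetermined threshold
    eta_min are discarded immediately, so they never need storing."""
    return [eta for eta in transmittances if eta >= eta_min]

# Toy turbulent channel: log-normal transmittance fluctuations
rng = random.Random(0)
slots = [min(1.0, rng.lognormvariate(-1.5, 0.6)) for _ in range(10000)]
kept = prts_filter(slots, 0.25)
print(len(kept) / len(slots))     # fraction of slots retained
print(sum(kept) / len(kept))      # mean transmittance after selection
```

Because low-transmittance slots contribute disproportionately to the quantum bit error rate, discarding them in real time raises the average quality of the retained data while shrinking storage needs.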

  12. What is the best way to contour lung tumors on PET scans? Multiobserver validation of a gradient-based method using a NSCLC digital PET phantom.

    PubMed

    Werner-Wasik, Maria; Nelson, Arden D; Choi, Walter; Arai, Yoshio; Faulhaber, Peter F; Kang, Patrick; Almeida, Fabio D; Xiao, Ying; Ohri, Nitin; Brockway, Kristin D; Piper, Jonathan W; Nelson, Aaron S

    2012-03-01

    To evaluate the accuracy and consistency of a gradient-based positron emission tomography (PET) segmentation method, GRADIENT, compared with manual (MANUAL) and constant threshold (THRESHOLD) methods. Contouring accuracy was evaluated with sphere phantoms and clinically realistic Monte Carlo PET phantoms of the thorax. The sphere phantoms were 10-37 mm in diameter and were acquired at five institutions emulating clinical conditions. One institution also acquired a sphere phantom with multiple source-to-background ratios of 2:1, 5:1, 10:1, 20:1, and 70:1. One observer segmented (contoured) each sphere with GRADIENT and THRESHOLD from 25% to 50% at 5% increments. Subsequently, seven physicians segmented 31 lesions (7-264 mL) from 25 digital thorax phantoms using GRADIENT, THRESHOLD, and MANUAL. For spheres <20 mm in diameter, GRADIENT was the most accurate with a mean absolute % error in diameter of 8.15% (10.2% SD) compared with 49.2% (51.1% SD) for 45% THRESHOLD (p < 0.005). For larger spheres, the methods were statistically equivalent. For varying source-to-background ratios, GRADIENT was the most accurate for spheres >20 mm (p < 0.065) and <20 mm (p < 0.015). For digital thorax phantoms, GRADIENT was the most accurate (p < 0.01), with a mean absolute % error in volume of 10.99% (11.9% SD), followed by 25% THRESHOLD at 17.5% (29.4% SD), and MANUAL at 19.5% (17.2% SD). GRADIENT had the least systematic bias, with a mean % error in volume of -0.05% (16.2% SD) compared with 25% THRESHOLD at -2.1% (34.2% SD) and MANUAL at -16.3% (20.2% SD; p value <0.01). Interobserver variability was reduced using GRADIENT compared with both 25% THRESHOLD and MANUAL (p value <0.01, Levene's test). GRADIENT was the most accurate and consistent technique for target volume contouring. GRADIENT was also the most robust for varying imaging conditions. GRADIENT has the potential to play an important role for tumor delineation in radiation therapy planning and response assessment. 

  13. A multi-threshold sampling method for TOF-PET signal processing

    NASA Astrophysics Data System (ADS)

    Kim, H.; Kao, C. M.; Xie, Q.; Chen, C. T.; Zhou, L.; Tang, F.; Frisch, H.; Moses, W. W.; Choong, W. S.

    2009-04-01

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multi-threshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to eight threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25×6.25×25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and a ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with a ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.
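
The kind of data such a discriminator-plus-TDC front end produces can be sketched in software: for each user-defined threshold, the leading- and trailing-edge crossing times of a pulse, here recovered from a sampled waveform by linear interpolation. This is a single-pulse illustration, not the prototype's firmware.

```python
def threshold_crossings(ts, vs, levels):
    """Leading- and trailing-edge crossing times of one pulse for each
    threshold level, with linear interpolation between samples."""
    out = {}
    for lvl in levels:
        lead = trail = None
        for i in range(1, len(vs)):
            # rising edge: first crossing from below
            if vs[i-1] < lvl <= vs[i] and lead is None:
                lead = ts[i-1] + (lvl - vs[i-1]) / (vs[i] - vs[i-1]) * (ts[i] - ts[i-1])
            # falling edge: last crossing from above
            if vs[i-1] >= lvl > vs[i]:
                trail = ts[i-1] + (lvl - vs[i-1]) / (vs[i] - vs[i-1]) * (ts[i] - ts[i-1])
        out[lvl] = (lead, trail)
    return out

# Triangular test pulse peaking at t = 5
ts = list(range(11))
vs = [0, 2, 4, 6, 8, 10, 8, 6, 4, 2, 0]
print(threshold_crossings(ts, vs, [5]))  # {5: (2.5, 7.5)}
```

From a few such (lead, trail) pairs one can estimate event timing (from the lowest-threshold leading edge) and energy (from the time-over-threshold pattern), which is the idea behind the multi-threshold scheme.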

  14. Effective Identification of Functional Hearing Loss Using Behavioral Threshold Measures

    ERIC Educational Resources Information Center

    Schlauch, Robert S.; Koerner, Tess K.; Marshall, Lynne

    2015-01-01

    Purpose: Four functional hearing loss protocols were evaluated. Method: For each protocol, 30 participants feigned a hearing loss first on an audiogram and then for a screening test that began a threshold search from extreme levels (-10 or 90 dB HL). Two-tone and 3-tone protocols compared thresholds for ascending and descending tones for 2 (0.5…

  15. Underwater hearing in the loggerhead turtle (Caretta caretta): a comparison of behavioral and auditory evoked potential audiograms.

    PubMed

    Martin, Kelly J; Alessi, Sarah C; Gaspard, Joseph C; Tucker, Anton D; Bauer, Gordon B; Mann, David A

    2012-09-01

    The purpose of this study was to compare underwater behavioral and auditory evoked potential (AEP) audiograms in a single captive adult loggerhead sea turtle (Caretta caretta). The behavioral audiogram was collected using a go/no-go response procedure and a modified staircase method of threshold determination. AEP thresholds were measured using subdermal electrodes placed beneath the frontoparietal scale, dorsal to the midbrain. Both methods showed the loggerhead sea turtle to have low frequency hearing with best sensitivity between 100 and 400 Hz. AEP testing yielded thresholds from 100 to 1131 Hz with best sensitivity at 200 and 400 Hz (110 dB re. 1 μPa). Behavioral testing using 2 s tonal stimuli yielded underwater thresholds from 50 to 800 Hz with best sensitivity at 100 Hz (98 dB re. 1 μPa). Behavioral thresholds averaged 8 dB lower than AEP thresholds from 100 to 400 Hz and 5 dB higher at 800 Hz. The results suggest that AEP testing can be a good alternative to measuring a behavioral audiogram with wild or untrained marine turtles and when time is a crucial factor.
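
The staircase procedure used for the behavioral audiogram can be sketched generically: a textbook one-up/one-down rule with reversal averaging. The step size, stopping rule, and response model below are illustrative, not the study's actual modified protocol.

```python
def staircase(respond, start, step, reversals_needed=6):
    """Simple up-down staircase: lower the level after a detection,
    raise it after a miss; the threshold estimate is the mean of the
    levels at which the direction of travel reversed.
    `respond(level)` returns True when the subject detects the signal."""
    level, prev, reversals = start, None, []
    while len(reversals) < reversals_needed:
        d = -1 if respond(level) else +1   # down on hit, up on miss
        if prev is not None and d != prev:
            reversals.append(level)        # direction changed: a reversal
        prev = d
        level += d * step
    return sum(reversals) / len(reversals)

# Deterministic "subject" with a true threshold of 100 dB
print(staircase(lambda lvl: lvl >= 100, 110, 2))  # 99.0
```

With a 2 dB step the levels oscillate between 98 and 100 dB around the true threshold, and the reversal average lands between them.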

  16. Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates

    PubMed Central

    Malone, Brian J.

    2017-01-01

    Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis. PMID:28877194
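
The two-step correction can be sketched as a flood-fill over the gain-thresholded STRF followed by a cluster-mass test. This toy version handles one sign only (run it again on the negated STRF for inhibitory subfields) and uses arbitrary thresholds rather than ones calibrated against chance.

```python
def cluster_mass_threshold(strf, gain_thresh, mass_thresh):
    """Two-step correction: keep pixels above the gain threshold only if
    they belong to a 4-connected cluster whose summed value exceeds the
    cluster-mass threshold."""
    h, w = len(strf), len(strf[0])
    keep = [[False] * w for _ in range(h)]
    seen = [[False] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if seen[i][j] or strf[i][j] < gain_thresh:
                continue
            # flood-fill one suprathreshold cluster
            stack, cluster = [(i, j)], []
            seen[i][j] = True
            while stack:
                a, b = stack.pop()
                cluster.append((a, b))
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    x, y = a + da, b + db
                    if 0 <= x < h and 0 <= y < w and not seen[x][y] \
                            and strf[x][y] >= gain_thresh:
                        seen[x][y] = True
                        stack.append((x, y))
            if sum(strf[a][b] for a, b in cluster) >= mass_thresh:
                for a, b in cluster:
                    keep[a][b] = True
    return keep

# A two-pixel cluster (mass 6) survives; an isolated pixel (mass 3) does not
strf = [[3, 3, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 3],
        [0, 0, 0, 0]]
keep = cluster_mass_threshold(strf, gain_thresh=2, mass_thresh=5)
```

The isolated pixel passes the gain threshold but fails the mass test, which is exactly the kind of spurious speckle the second step suppresses.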

  17. Estimating parameters for probabilistic linkage of privacy-preserved datasets.

    PubMed

    Brown, Adrian P; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Boyd, James H

    2017-07-10

    Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20% error. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error. 
Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded results for F-measure that were only slightly below the highest possible for those probabilities. The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, the approach is a major step towards providing a complete operational approach for probabilistic linkage of privacy-preserved datasets.
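    The threshold-selection step described above can be illustrated in a few lines: given pairwise similarity scores and known match labels (available in the synthetic experiments), scan candidate cut-offs and keep the one maximizing the F-measure. This is only an illustrative sketch, not the paper's EM-based extension; the function names and candidate grid are hypothetical.

```python
def f_measure(precision, recall):
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def best_threshold(scores, is_match, candidates):
    """Return (threshold, F) maximizing the F-measure over candidate cut-offs.

    scores    : per-pair similarity scores
    is_match  : ground-truth match labels (known only for synthetic data)
    candidates: candidate threshold values to scan
    """
    best = (None, -1.0)
    for t in candidates:
        tp = sum(1 for s, m in zip(scores, is_match) if s >= t and m)
        fp = sum(1 for s, m in zip(scores, is_match) if s >= t and not m)
        fn = sum(1 for s, m in zip(scores, is_match) if s < t and m)
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f = f_measure(p, r)
        if f > best[1]:
            best = (t, f)
    return best
```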

  18. Comparison of in-air evoked potential and underwater behavioral hearing thresholds in four bottlenose dolphins (Tursiops truncatus).

    PubMed

    Finneran, James J; Houser, Dorian S

    2006-05-01

    Traditional behavioral techniques for hearing assessment in marine mammals are limited by the time and access required to train subjects. Electrophysiological methods, where passive electrodes are used to measure auditory evoked potentials (AEPs), are attractive alternatives to behavioral techniques; however, there have been few attempts to compare AEP and behavioral results for the same subject. In this study, behavioral and AEP hearing thresholds were compared in four bottlenose dolphins. AEP thresholds were measured in air using a piezoelectric sound projector embedded in a suction cup to deliver amplitude-modulated tones to the dolphin through the lower jaw. Evoked potentials were recorded noninvasively using surface electrodes. Adaptive procedures allowed AEP hearing thresholds to be estimated from 10 to 150 kHz in a single ear in about 45 min. Behavioral thresholds were measured in a quiet pool and in San Diego Bay. AEP and behavioral threshold estimates agreed closely as to the upper cutoff frequency beyond which thresholds increased sharply. AEP thresholds were strongly correlated with pool behavioral thresholds across the range of hearing; differences between AEP and pool behavioral thresholds increased with threshold magnitude and ranged from 0 to +18 dB.

  19. Analysis of Waveform Retracking Methods in Antarctic Ice Sheet Based on CRYOSAT-2 Data

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Li, F.; Zhang, S.; Hao, W.; Yuan, L.; Zhu, T.; Zhang, Y.; Zhu, C.

    2017-09-01

    Satellite altimetry plays an important role in many geoscientific and environmental studies of the Antarctic ice sheet. Ranging accuracy is degraded near coasts and over non-ocean surfaces due to waveform contamination. A post-processing technique, known as waveform retracking, can be used to retrack the corrupted waveforms and in turn improve the ranging accuracy. In 2010, the CryoSat-2 satellite was launched with the Synthetic Aperture Interferometric Radar Altimeter (SIRAL) onboard. This paper discusses satellite altimetry waveform retracking methods. Six retracking methods, including the OCOG method, the threshold method with 10%, 25% and 50% threshold levels, and the linear and exponential 5-β parametric methods, are used to retrack CryoSat-2 waveforms over the transect from Zhongshan Station to Dome A. The results show that the threshold retracker performs best when both the waveform retracking success rate and the RMS of the retracking distance corrections are considered. The linear 5-β parametric retracker gives the best retracking precision, but cannot make full use of the waveform data.
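    A threshold retracker of the kind compared above can be sketched simply: the retracking gate is where the waveform's leading edge first crosses a chosen fraction of the peak amplitude above the noise floor. This is a minimal sketch with hypothetical names; operational CryoSat-2 processing involves many additional corrections.

```python
def threshold_retrack(waveform, level=0.5, noise_gates=4):
    """Return the (fractional) gate index where the leading edge first
    crosses `level` of the peak amplitude above the noise floor.

    The noise floor is estimated from the first `noise_gates` samples.
    """
    noise = sum(waveform[:noise_gates]) / noise_gates
    peak = max(waveform)
    t_level = noise + level * (peak - noise)
    for i in range(1, len(waveform)):
        if waveform[i] >= t_level > waveform[i - 1]:
            # linear interpolation between gates i-1 and i
            return (i - 1) + (t_level - waveform[i - 1]) / (waveform[i] - waveform[i - 1])
    return float(len(waveform) - 1)
```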

  20. Critical review and hydrologic application of threshold detection methods for the generalized Pareto (GP) distribution

    NASA Astrophysics Data System (ADS)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto

    2016-04-01

    Estimation of extreme rainfall from data constitutes one of the most important issues in statistical hydrology, as it is associated with the design of hydraulic structures and flood water management. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a generalized Pareto (GP) distribution model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches: non-parametric methods intended to locate the changing point between the extreme and non-extreme regions of the data, graphical methods in which one studies the dependence of the GP distribution parameters (or related metrics) on the threshold level u, and Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u for which a GP distribution model is applicable. In this work, we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 daily rainfall records from the NOAA-NCDC open-access database, each with more than 110 years of data. We find that non-parametric methods intended to locate the changing point between the extreme and non-extreme regions of the data are generally not reliable, while methods based on asymptotic properties of the upper distribution tail lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low, on the order of 0.1 to 0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on pre-asymptotic properties of the GP distribution. 
For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the empirical records, as well as variations in their size, constitute the two most important factors that may significantly affect the accuracy of the obtained results. Acknowledgments: The research project was implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and co-financed by the European Social Fund (ESF) and the Greek State. The work conducted by Roberto Deidda was funded under the Sardinian Regional Law 7/2007 (funding call 2013).
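    Once a threshold u has been chosen, fitting a GP model to the excesses can be illustrated with the method-of-moments estimators: with m and v the mean and variance of the excesses, the shape is ξ = (1 - m²/v)/2 and the scale is σ = m(1 + m²/v)/2. This sketch is not one of the threshold-detection methods reviewed above, and the names are hypothetical.

```python
def gp_fit_mom(data, u):
    """Method-of-moments fit of a generalized Pareto distribution to
    excesses over threshold u. Returns (shape xi, scale sigma)."""
    exc = [x - u for x in data if x > u]
    n = len(exc)
    m = sum(exc) / n
    v = sum((e - m) ** 2 for e in exc) / n  # population variance of excesses
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (1.0 + m * m / v)
    return xi, sigma
```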

  1. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is high-spatial-resolution (2 m GSD) remote sensing satellite data comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistics of each image using an Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata in the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. In the pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel re-examination, and a cross-band filter method are applied in sequence to determine the cloud statistics. In the post-processing analysis, a box-counting fractal method is applied. In other words, the cloud statistics are first determined via the pre-processing analysis, and their correctness across the different spectral bands is then cross-examined qualitatively and quantitatively via the post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and the GE methods both perform better than the others for Formosat-2 images. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
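    Since the choice of thresholding method is central to the comparison above, here is a minimal sketch of Otsu's method operating on a gray-level histogram: it returns the level that maximizes the between-class variance. The function name and histogram format are illustrative, not the paper's implementation.

```python
def otsu_threshold(hist):
    """Otsu's threshold on a gray-level histogram.

    hist[i] is the pixel count at gray level i; the returned level is
    the last one assigned to the background class.
    """
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_b = 0.0   # background weight (pixel count so far)
    sum_b = 0.0  # background intensity sum
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w_b += h
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * h
        m_b = sum_b / w_b                 # background mean
        m_f = (sum_all - sum_b) / w_f     # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```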

  2. Energy Calibration of a Silicon-Strip Detector for Photon-Counting Spectral CT by Direct Usage of the X-ray Tube Spectrum

    NASA Astrophysics Data System (ADS)

    Liu, Xuejin; Chen, Han; Bornefalk, Hans; Danielsson, Mats; Karlsson, Staffan; Persson, Mats; Xu, Cheng; Huber, Ben

    2015-02-01

    The variation among energy thresholds in a multibin detector for photon-counting spectral CT can lead to ring artefacts in the reconstructed images. Calibration of the energy thresholds can be used to achieve homogeneous threshold settings or to develop compensation methods to reduce the artefacts. We have developed an energy-calibration method for the different comparator thresholds employed in a photon-counting silicon-strip detector. In our case, this corresponds to specifying the linear relation between the threshold positions in units of mV and the actual deposited photon energies in units of keV. This relation is determined by gain and offset values that differ for different detector channels due to variations in the manufacturing process. Typically, the calibration is accomplished by correlating the peak positions of obtained pulse-height spectra to known photon energies, e.g. with the aid of mono-energetic x rays from synchrotron radiation, radioactive isotopes or fluorescence materials. Instead of mono-energetic x rays, the calibration method presented in this paper makes use of a broad x-ray spectrum provided by commercial x-ray tubes. Gain and offset as the calibration parameters are obtained by a regression analysis that adjusts a simulated spectrum of deposited energies to a measured pulse-height spectrum. Besides the basic photon interactions such as Rayleigh scattering, Compton scattering and photo-electric absorption, the simulation takes into account the effect of pulse pileup, charge sharing and the electronic noise of the detector channels. We verify the method for different detector channels with the aid of a table-top setup, where we find the uncertainty of the keV-value of a calibrated threshold to be between 0.1 and 0.2 keV.
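    The linear mV-to-keV relation described above, parameterized by a per-channel gain and offset, can be illustrated with an ordinary least-squares fit. This is a toy sketch of the linear model only, not the paper's regression against a simulated spectrum; the names and data are hypothetical.

```python
def fit_gain_offset(mv, kev):
    """Least-squares fit of mv = gain * kev + offset for one channel's
    comparator thresholds; returns (gain, offset)."""
    n = len(mv)
    mx = sum(kev) / n
    my = sum(mv) / n
    sxx = sum((x - mx) ** 2 for x in kev)
    sxy = sum((x - mx) * (y - my) for x, y in zip(kev, mv))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset
```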

  3. Application of automatic threshold in dynamic target recognition with low contrast

    NASA Astrophysics Data System (ADS)

    Miao, Hua; Guo, Xiaoming; Chen, Yu

    2014-11-01

    A hybrid photoelectric joint transform correlator can achieve automatic real-time recognition with high precision through the combination of optical and electronic devices. When recognizing low-contrast targets with a photoelectric joint transform correlator, differences in attitude, brightness, and grayscale between target and template mean that only four to five frames of a dynamic target can be recognized without any processing. A CCD camera capturing 25 frames per second is used to acquire the dynamic target images. Automatic thresholding has many advantages, such as fast processing speed, effective shielding of noise interference, enhancement of the diffraction energy of useful information, and better preservation of the outlines of target and template, so it plays an important role in target recognition by optical correlation. However, a threshold obtained automatically by a program cannot achieve the best recognition results for dynamic targets, because outline information is broken to some extent; in most cases the optimal threshold is obtained by manual intervention. To address the characteristics of dynamic targets, an improved automatic thresholding procedure was implemented by multiplying the OTSU threshold of target and template by a scale coefficient of the processed image, combined with mathematical morphology. The optimal threshold can then be obtained automatically for dynamic low-contrast target images. The recognition rate of dynamic targets is improved by reducing the effect of background noise and increasing the correlation information. A series of dynamic tank images moving at about 70 km/h was adopted as the target set. Without any processing, the 1st frame of this series can correlate only up to the 3rd frame. With OTSU thresholding, the 80th frame can be recognized, and with the improved automatic threshold processing of the joint images, this number increases to 89 frames. Experimental results show that the improved automatic thresholding has particular application value for recognizing dynamic targets with low contrast.

  4. Laboratory validation of an in-home method for assessing circadian phase using dim light melatonin onset (DLMO).

    PubMed

    Pullman, Rebecca E; Roepke, Stephanie E; Duffy, Jeanne F

    2012-06-01

    To determine whether an accurate circadian phase assessment could be obtained from saliva samples collected by patients at home, twenty-four individuals with a complaint of sleep initiation or sleep maintenance difficulty were studied for two evenings. Each participant received instructions for collecting eight hourly saliva samples in dim light at home. On the following evening they spent 9 h in a laboratory room with controlled dim (<20 lux) light where hourly saliva samples were collected. The circadian phase of dim light melatonin onset (DLMO) was determined using both an absolute threshold (3 pg/ml) and a relative threshold (two standard deviations above the mean of three baseline values). Neither threshold method worked well for one participant, who was a "low secretor". In four cases the participants' in-lab melatonin levels rose much earlier or were much higher than their at-home levels, and one participant appeared to take the at-home samples out of order. Overall, the at-home and in-lab DLMO values were significantly correlated using both methods, and differed on average by 37 (±19) min using the absolute threshold and by 54 (±36) min using the relative threshold. The at-home assessment procedure was able to determine an accurate DLMO using an absolute threshold in 62.5% of the participants. Thus, an at-home procedure for assessing circadian phase could be practical for evaluating patients for circadian rhythm sleep disorders. Copyright © 2012 Elsevier B.V. All rights reserved.
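    The two threshold rules above can be expressed directly: an absolute cut-off of 3 pg/ml crossed between hourly samples, and a relative cut-off of two standard deviations above the mean of the baseline samples. A sketch under stated assumptions; the function names and the linear-interpolation detail are not from the paper.

```python
def dlmo(times, melatonin, absolute=3.0):
    """Time at which melatonin first reaches the absolute threshold
    (pg/ml), linearly interpolated between successive samples."""
    for i in range(1, len(melatonin)):
        if melatonin[i] >= absolute > melatonin[i - 1]:
            frac = (absolute - melatonin[i - 1]) / (melatonin[i] - melatonin[i - 1])
            return times[i - 1] + frac * (times[i] - times[i - 1])
    return None  # threshold never crossed (e.g. a "low secretor")

def relative_threshold(baseline):
    """Relative threshold: mean of baseline samples plus two sample
    standard deviations."""
    n = len(baseline)
    m = sum(baseline) / n
    sd = (sum((x - m) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return m + 2 * sd
```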

  5. Application of a Threshold Method to the TRMM Radar for the Estimation of Space-Time Rain Rate Statistics

    NASA Technical Reports Server (NTRS)

    Meneghini, Robert; Jones, Jeffrey A.

    1997-01-01

    One of the TRMM radar products of interest is the monthly-averaged rain rate over 5 x 5 degree cells. Clearly, the most direct way of calculating these and similar statistics is to compute them from the individual estimates made over the instantaneous field of view of the instrument (4.3 km horizontal resolution). An alternative approach is the use of a threshold method. It has been established that, over sufficiently large regions, the fractional area above a rain-rate threshold and the area-average rain rate are well correlated for particular choices of the threshold [e.g., Kedem et al., 1990]. A straightforward application of this method to the TRMM data would consist of converting the individual reflectivity factors to rain rates, followed by calculating the fraction of these that exceed a particular threshold. Previous results indicate that for thresholds near or at 5 mm/h, the correlation between this fractional area and the area-average rain rate is high. There are several drawbacks to this approach, however. At the TRMM radar frequency of 13.8 GHz the signal suffers attenuation, so the negative bias of the high-resolution rain-rate estimates will increase as the path attenuation increases. To establish a quantitative relationship between fractional area and area-average rain rate, an independent means of calculating the area-average rain rate is needed, such as an array of rain gauges. This type of calibration procedure, however, is difficult for a spaceborne radar such as TRMM. To estimate a statistic other than the mean of the distribution requires, in general, a different choice of threshold and a different set of tuning parameters.
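    The core of the threshold method above is the fractional area at or above a rain-rate cut-off, which is then scaled by an independently calibrated slope to estimate the area-average rain rate. A minimal sketch with hypothetical names; the slope here stands in for the gauge-calibrated relationship the abstract says is hard to obtain for a spaceborne radar.

```python
def fractional_area(rain_rates, tau=5.0):
    """Fraction of fields of view with rain rate at or above the
    threshold tau (mm/h)."""
    return sum(1 for r in rain_rates if r >= tau) / len(rain_rates)

def area_average_estimate(rain_rates, tau=5.0, slope=1.0):
    """Area-average rain rate estimated as slope * fractional area.

    `slope` must be calibrated independently (e.g. against rain gauges).
    """
    return slope * fractional_area(rain_rates, tau)
```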

  6. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, computerized volumetry based on these edge models tends to differ from manual segmentation results produced by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer-measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that the DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.

  7. A New Multi-Spectral Threshold Normalized Difference Water Index (MST-NDWI) Water Extraction Method - A Case Study in the Yanhe Watershed

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Zhao, H.; Hao, H.; Wang, C.

    2018-05-01

    Accurate remote sensing water extraction is one of the primary tasks in watershed ecological environment studies. The Yanhe water system is characterized by a small water volume and narrow river channels, which makes conventional water extraction methods such as the Normalized Difference Water Index (NDWI) difficult to apply. A new Multi-Spectral Threshold segmentation of the NDWI (MST-NDWI) water extraction method is proposed to achieve accurate water extraction in the Yanhe watershed. For the MST-NDWI method, the spectral characteristics of water bodies and typical backgrounds in Landsat/TM images of the Yanhe watershed were evaluated. Multi-spectral thresholds (TM1, TM4, TM5) based on maximum likelihood were applied before NDWI water extraction to separate built-up land and small linear rivers. With the proposed method, a water map was extracted from 2010 Landsat/TM images of the watershed. An accuracy assessment compared the proposed method with conventional water indexes such as the NDWI, Modified NDWI (MNDWI), Enhanced Water Index (EWI), and Automated Water Extraction Index (AWEI). The results show that the MST-NDWI method achieves better water extraction accuracy in the Yanhe watershed and can effectively suppress confusing background objects compared to the conventional water indexes. The MST-NDWI method integrates the NDWI with multi-spectral threshold segmentation, yielding richer information and markedly better accuracy for water extraction in the Yanhe watershed.
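    The NDWI underlying the method above is the normalized difference of the green and near-infrared reflectances, with water flagged above a cut-off. A minimal per-pixel sketch; the default threshold of 0 is a common convention, not the paper's value, and the MST-NDWI method additionally applies multi-spectral (TM1/TM4/TM5) threshold segmentation first.

```python
def ndwi(green, nir):
    """Normalized Difference Water Index for one pixel:
    (green - nir) / (green + nir)."""
    return (green - nir) / (green + nir)

def is_water(green, nir, threshold=0.0):
    """Simple NDWI water mask: pixels with NDWI above the threshold."""
    return ndwi(green, nir) > threshold
```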

  8. Novel image processing method study for a label-free optical biosensor

    NASA Astrophysics Data System (ADS)

    Yang, Chenhao; Wei, Li'an; Yang, Rusong; Feng, Ying

    2015-10-01

    Optical biosensors are generally divided into labeled and label-free types. The former mainly comprises fluorescence-labeled and radioactive-labeled methods, of which the fluorescence-labeled method is the more mature in application. The main image processing methods for fluorescence-labeled biosensors include smoothing filters, manual gridding, and constant thresholding. Since some fluorescent molecules may influence the biological reaction, label-free methods have become the main development direction of optical biosensors. The use of a wider field of view and a larger angle of incidence, which can effectively improve the sensitivity of a label-free biosensor, also brings more difficulties in image processing compared with fluorescence-labeled biosensors. Otsu's method, widely applied in machine vision, chooses the threshold that minimizes the intraclass variance of the thresholded black and white pixels; as a global threshold segmentation, it is limited when the image intensity distribution is asymmetric. To overcome the irregularity of light intensity on the transducer, we improved the algorithm. In this paper, we present a new image processing algorithm based on a reflectance modulation biosensor platform, comprising a sliding normalization algorithm for image rectification and an improved Otsu's method for image segmentation, in order to implement automatic recognition of target areas. Finally, an adaptive gridding method was used to extract the target parameters for analysis. These methods improve the efficiency of image processing, reduce human intervention, enhance the reliability of experiments, and lay the foundation for high-throughput label-free optical biosensors.

  9. Evaluation of Wavelet Denoising Methods for Small-Scale Joint Roughness Estimation Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Bitenc, M.; Kieffer, D. S.; Khoshelham, K.

    2015-08-01

    The precision of Terrestrial Laser Scanning (TLS) data depends mainly on the inherent random range error, which hinders extraction of small details from TLS measurements. New post-processing algorithms have been developed that reduce or eliminate the noise and therefore enable modelling details at a smaller scale than one would traditionally expect. The aim of this research is to find the optimum denoising method such that the corrected TLS data provide a reliable estimation of small-scale rock joint roughness. Two wavelet-based denoising methods are considered, namely the Discrete Wavelet Transform (DWT) and the Stationary Wavelet Transform (SWT), in combination with different thresholding procedures. The question is which technique provides more accurate roughness estimates considering (i) the wavelet transform (SWT or DWT), (ii) the thresholding method (fixed-form or penalised low) and (iii) the thresholding mode (soft or hard). The performance of the denoising methods is tested by two analyses, namely method noise and method sensitivity to noise. The reference data are precise Advanced TOpometric Sensor (ATOS) measurements obtained on a 20 × 30 cm rock joint sample, which for the second analysis are corrupted by different levels of noise. Such controlled noise levels make it possible to evaluate the methods' performance for the different amounts of noise that might be present in TLS data. Qualitative visual checks of denoised surfaces and quantitative parameters such as grid height and roughness are considered in a comparative analysis of the denoising methods. Results indicate that the preferred method for realistic roughness estimation is DWT with penalised low hard thresholding.
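    The soft and hard thresholding modes compared above act on the wavelet detail coefficients: hard thresholding zeroes coefficients below the cut-off, while soft thresholding additionally shrinks the survivors toward zero. A minimal sketch of the two rules applied to a coefficient list; the wavelet transform itself is omitted.

```python
def hard_threshold(coeffs, t):
    """Hard thresholding: zero all coefficients with magnitude below t,
    keep the rest unchanged."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Soft thresholding: zero small coefficients and shrink the
    surviving ones toward zero by t."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) >= t else 0.0
            for c in coeffs]
```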

  10. The Comparison Study of Quadratic Infinite Beam Program on Optimization of Intensity Modulated Radiation Therapy Treatment Planning (IMRTP) between Threshold and Exponential Scatter Methods with CERR® in the Case of Lung Cancer

    NASA Astrophysics Data System (ADS)

    Hardiyanti, Y.; Haekal, M.; Waris, A.; Haryanto, F.

    2016-08-01

    This research compares quadratic optimization programs for Intensity Modulated Radiation Therapy Treatment Planning (IMRTP) using the Computational Environment for Radiotherapy Research (CERR) software. Treatment plans with 9 and 13 beams were assumed, using a beam energy of 6 MV and a Source-Skin Distance (SSD) of 100 cm from the target volume. Dose calculation used the Quadratic Infinite Beam (QIB) method from CERR. CERR was used to compare the Gauss primary threshold method with the Gauss primary exponential method. For the lung cancer case, threshold values of 0.01 and 0.004 were used. The dose distributions were analyzed in the form of DVHs from CERR. With the exponential method and 9 beams, the maximum dose distributions were obtained on the Planning Target Volume (PTV), Clinical Target Volume (CTV), Gross Tumor Volume (GTV), liver, and skin. With the threshold method and 13 beams, the maximum dose distributions were obtained on the PTV, GTV, heart, and skin.

  11. A newly identified calculation discrepancy of the Sunset semi-continuous carbon analyzer

    NASA Astrophysics Data System (ADS)

    Zheng, G.; Cheng, Y.; He, K.; Duan, F.; Ma, Y.

    2014-01-01

    The Sunset Semi-Continuous Carbon Analyzer (SCCA) is an instrument widely used for carbonaceous aerosol measurement. Despite previous validation work, here we identified a new type of SCCA calculation discrepancy caused by the default multi-point baseline correction method. When exceeding a certain threshold carbon load, multi-point correction could cause significant total carbon (TC) underestimation. This calculation discrepancy was characterized for both sucrose and ambient samples with three temperature protocols. For ambient samples, 22%, 36% and 12% of TC was underestimated by the three protocols, respectively, with the corresponding thresholds being ~0, 20 and 25 μg C. For sucrose, however, such discrepancy was observed with only one of these protocols, indicating the need for a more refractory SCCA calibration substance. The discrepancy was less significant for the NIOSH (National Institute for Occupational Safety and Health)-like protocol compared with the other two protocols based on IMPROVE (Interagency Monitoring of PROtected Visual Environments). Although the calculation discrepancy could be largely reduced by the single-point baseline correction method, the instrumental blanks of the single-point method were higher. The proposed correction method was to use multi-point corrected data below the determined threshold and single-point results beyond it. The effectiveness of this correction method was supported by correlation with optical data.

  12. A newly identified calculation discrepancy of the Sunset semi-continuous carbon analyzer

    NASA Astrophysics Data System (ADS)

    Zheng, G. J.; Cheng, Y.; He, K. B.; Duan, F. K.; Ma, Y. L.

    2014-07-01

    The Sunset semi-continuous carbon analyzer (SCCA) is an instrument widely used for carbonaceous aerosol measurement. Despite previous validation work, in this study we identified a new type of SCCA calculation discrepancy caused by the default multipoint baseline correction method. When exceeding a certain threshold carbon load, multipoint correction could cause significant total carbon (TC) underestimation. This calculation discrepancy was characterized for both sucrose and ambient samples, with two protocols based on IMPROVE (Interagency Monitoring of PROtected Visual Environments) (i.e., IMPshort and IMPlong) and one NIOSH (National Institute for Occupational Safety and Health)-like protocol (rtNIOSH). For ambient samples, the IMPshort, IMPlong and rtNIOSH protocols underestimated 22, 36 and 12% of TC, respectively, with the corresponding thresholds being ~ 0, 20 and 25 μgC. For sucrose, however, such discrepancy was observed only with the IMPshort protocol, indicating the need for a more refractory SCCA calibration substance. Although the calculation discrepancy could be largely reduced by the single-point baseline correction method, the instrumental blanks of the single-point method were higher. The correction method proposed was to use multipoint-corrected data below the determined threshold and single-point results beyond it. The effectiveness of this correction method was supported by correlation with optical data.
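    The proposed correction reduces to a one-line rule: report the multipoint-corrected value below the protocol's threshold carbon load, and the single-point value beyond it. A trivial sketch with hypothetical names.

```python
def corrected_tc(tc_multipoint, tc_singlepoint, load, threshold):
    """Select the baseline-correction result by carbon load: multipoint
    below the protocol's threshold, single-point at or beyond it."""
    return tc_multipoint if load < threshold else tc_singlepoint
```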

  13. [Extraction of temperate vegetation phenology thresholds in North America based on flux tower observation data].

    PubMed

    Zhao, Jing-Jing; Liu, Liang-Yun

    2013-02-01

    The flux tower method can effectively monitor seasonal and phenological variation in vegetation. At present, the differences among various phenology extraction methods in detection and quantitative evaluation have not been well validated and quantified. Based on gross primary productivity (GPP) and net ecosystem productivity (NEP) data for temperate forests from 9 forest FLUXNET sites in North America, and using the start dates (SOS) and end dates (EOS) of the growing season extracted by different phenology threshold methods, combined with the carbon source/sink function of the forest ecosystems, this paper analyzed the effects of different threshold standards on vegetation phenology extraction. The results showed that the effects of different threshold standards on the stability of the extracted phenology were smaller for deciduous broadleaved forest (DBF) than for evergreen needleleaved forest (ENF). Among the absolute and relative GPP thresholds, for DBF the threshold of daily GPP = 2 g C m-2 d-1 agreed best with daily GPP = 20% of maximum GPP (GPPmax); the phenological metrics with a threshold of daily GPP = 4 g C m-2 d-1 were close to those between daily GPP = 20% GPPmax and daily GPP = 50% GPPmax; and the start date of the ecosystem carbon sink function was close to the SOS metrics between daily GPP = 4 g C m-2 d-1 and daily GPP = 20% GPPmax. For ENF, the phenological metrics with thresholds of daily GPP = 2 g C m-2 d-1 and daily GPP = 4 g C m-2 d-1 agreed best with daily GPP = 20% GPPmax and daily GPP = 50% GPPmax, respectively, and the start date of the ecosystem carbon sink function was close to the SOS metrics between daily GPP = 2 g C m-2 d-1 and daily GPP = 10% GPPmax.
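    A relative GPP threshold of the kind compared above can be sketched as follows: the start and end of season are the first and last days on which daily GPP reaches a given fraction of the seasonal maximum. An illustrative sketch only; the function name and the index-based day representation are assumptions.

```python
def season_bounds(gpp, frac=0.2):
    """Start/end of growing season by a relative GPP threshold: the first
    and last day indices on which daily GPP >= frac * max(GPP).
    Returns (sos_index, eos_index)."""
    level = frac * max(gpp)
    days = [i for i, g in enumerate(gpp) if g >= level]
    return days[0], days[-1]
```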

  14. A human visual based binarization technique for histological images

    NASA Astrophysics Data System (ADS)

    Shreyas, Kamath K. M.; Rajendran, Rahul; Panetta, Karen; Agaian, Sos

    2017-05-01

    In the field of vision-based systems for object detection and classification, thresholding is a key pre-processing step and a well-known technique for image segmentation. Segmentation of medical images, such as Computed Axial Tomography (CAT), Magnetic Resonance Imaging (MRI), X-ray, phase-contrast microscopy, and histological images, presents problems such as high variability in human anatomy and variation across modalities. Recent advances in computer-aided diagnosis of histological images help facilitate the detection and classification of diseases. Since most pathology diagnosis depends on the expertise and ability of the pathologist, there is a clear need for an automated assessment system. Histological images are stained to specific colors to differentiate each component in the tissue. Segmentation and analysis of such images is problematic, as they present high variability in color and cell clusters. This paper presents an adaptive thresholding technique that aims at segmenting cell structures from Haematoxylin and Eosin stained images. The thresholded result can then be used by pathologists to perform effective diagnosis. The effectiveness of the proposed method is analyzed by visually comparing the results to state-of-the-art thresholding methods such as Otsu, Niblack, Sauvola, Bernsen, and Wolf. Computer simulations demonstrate the efficiency of the proposed method in segmenting critical information.

  15. Variable Threshold Method for Determining the Boundaries of Imaged Subvisible Particles.

    PubMed

    Cavicchi, Richard E; Collett, Cayla; Telikepalli, Srivalli; Hu, Zhishang; Carrier, Michael; Ripple, Dean C

    2017-06-01

    An accurate assessment of particle characteristics and concentrations in pharmaceutical products by flow imaging requires accurate particle sizing and morphological analysis. Analysis of images begins with the definition of particle boundaries. Commonly a single threshold defines the level for a pixel in the image to be included in the detection of particles, but depending on the threshold level, this results in either missing translucent particles or oversizing of less transparent particles due to the halos and gradients in intensity near the particle boundaries. We have developed an imaging analysis algorithm that sets the threshold for a particle based on the maximum gray value of the particle. We show that this results in tighter boundaries for particles with high contrast, while conserving the number of highly translucent particles detected. The method is implemented as a plugin for FIJI, an open-source image analysis software. The method is tested for calibration beads in water and glycerol/water solutions, a suspension of microfabricated rods, and stir-stressed aggregates made from IgG. The result is that appropriate thresholds are automatically set for solutions with a range of particle properties, and that improved boundaries will allow for more accurate sizing results and potentially improved particle classification studies. Published by Elsevier Inc.
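    The per-particle thresholding idea above — detect with a permissive global level, then redraw each particle's boundary at a level tied to its own maximum gray value — might be sketched like this; the parameter names and values are illustrative, not those of the published FIJI plugin:

```python
import numpy as np
from scipy import ndimage

def variable_threshold_labels(img, background, detect_offset=3.0, frac=0.5):
    """Two-pass sketch of variable thresholding: detect particles with a
    permissive global level, then redraw each particle's boundary at a
    fraction of its own peak deviation from background."""
    dev = np.abs(img.astype(float) - background)
    labels, n = ndimage.label(dev > detect_offset)   # permissive detection pass
    out = np.zeros_like(labels)
    for i in range(1, n + 1):
        mask = labels == i
        local_thr = frac * dev[mask].max()           # per-particle threshold
        out[mask & (dev >= local_thr)] = i           # tighten the boundary
    return out

# A translucent particle and an opaque particle with a bright halo
img = np.full((20, 40), 100.0)
img[5:10, 5:12] += 8        # translucent: low contrast, kept whole
img[7:15, 24:34] += 10      # halo around the opaque particle
img[8:14, 25:33] += 50      # opaque core (total +60 over background)
seg = variable_threshold_labels(img, background=100.0)
```

    The high-contrast particle's halo falls below its own (high) threshold and is trimmed away, while the translucent particle, whose peak is low, keeps its full extent — the behavior the abstract describes.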

  16. Ecological thresholds: The key to successful environmental management or an important concept with no practical application?

    USGS Publications Warehouse

    Groffman, P.M.; Baron, Jill S.; Blett, T.; Gold, A.J.; Goodman, I.; Gunderson, L.H.; Levinson, B.M.; Palmer, Margaret A.; Paerl, H.W.; Peterson, G.D.; Poff, N.L.; Rejeski, D.W.; Reynolds, J.F.; Turner, M.G.; Weathers, K.C.; Wiens, J.

    2006-01-01

    An ecological threshold is the point at which there is an abrupt change in an ecosystem quality, property or phenomenon, or where small changes in an environmental driver produce large responses in the ecosystem. Analysis of thresholds is complicated by nonlinear dynamics and by multiple factor controls that operate at diverse spatial and temporal scales. These complexities have challenged the use and utility of threshold concepts in environmental management despite great concern about preventing dramatic state changes in valued ecosystems, the need for determining critical pollutant loads and the ubiquity of other threshold-based environmental problems. In this paper we define the scope of the thresholds concept in ecological science and discuss methods for identifying and investigating thresholds using a variety of examples from terrestrial and aquatic environments, at ecosystem, landscape and regional scales. We end with a discussion of key research needs in this area.

  17. Critical thresholds in sea lice epidemics: evidence, sensitivity and subcritical estimation

    PubMed Central

    Frazer, L. Neil; Morton, Alexandra; Krkošek, Martin

    2012-01-01

    Host density thresholds are a fundamental component of the population dynamics of pathogens, but empirical evidence and estimates are lacking. We studied host density thresholds in the dynamics of ectoparasitic sea lice (Lepeophtheirus salmonis) on salmon farms. Empirical examples include a 1994 epidemic in Atlantic Canada and a 2001 epidemic in Pacific Canada. A mathematical model suggests dynamics of lice are governed by a stable endemic equilibrium until the critical host density threshold drops owing to environmental change, or is exceeded by stocking, causing epidemics that require rapid harvest or treatment. Sensitivity analysis of the critical threshold suggests variation in dependence on biotic parameters and high sensitivity to temperature and salinity. We provide a method for estimating the critical threshold from parasite abundances at subcritical host densities and estimate the critical threshold and transmission coefficient for the two epidemics. Host density thresholds may be a fundamental component of disease dynamics in coastal seas where salmon farming occurs. PMID:22217721

  18. Influence of surgical gloves on haptic perception thresholds.

    PubMed

    Hatzfeld, Christian; Dorsch, Sarah; Neupert, Carsten; Kupnik, Mario

    2018-02-01

    Impairment of haptic perception by surgical gloves could reduce requirements on haptic systems for surgery. While grip forces and manipulation capabilities were not impaired in previous studies, no data is available for perception thresholds. Absolute and differential thresholds (20 dB above threshold) of 24 subjects were measured for frequencies of 25 and 250 Hz with a Ψ-method. Effects of wearing a surgical glove, moisture on the contact surface and subject's experience with gloves were incorporated in a full-factorial experimental design. Absolute thresholds of 12.8 dB and -29.6 dB (means for 25 and 250 Hz, respectively) and differential thresholds of -12.6 dB and -9.5 dB agree with previous studies. A relevant effect of the frequency on absolute thresholds was found. Comparisons of glove- and no-glove-conditions did not reveal a significant mean difference. Wearing a single surgical glove does not affect absolute and differential haptic perception thresholds. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Value of information and pricing new healthcare interventions.

    PubMed

    Willan, Andrew R; Eckermann, Simon

    2012-06-01

    Previous applications of value-of-information methods to optimal clinical trial design have predominantly taken a societal decision-making perspective, implicitly assuming that healthcare costs are covered through public expenditure and trial research is funded by government or donation-based philanthropic agencies. In this paper, we consider the interaction between interrelated perspectives of a societal decision maker (e.g. the National Institute for Health and Clinical Excellence [NICE] in the UK) charged with the responsibility for approving new health interventions for reimbursement and the company that holds the patent for a new intervention. We establish optimal decision making from societal and company perspectives, allowing for trade-offs between the value and cost of research and the price of the new intervention. Given the current level of evidence, there exists a maximum (threshold) price acceptable to the decision maker. Submission for approval with prices above this threshold will be refused. Given the current level of evidence and the decision maker's threshold price, there exists a minimum (threshold) price acceptable to the company. If the decision maker's threshold price exceeds the company's, then current evidence is sufficient since any price between the thresholds is acceptable to both. On the other hand, if the decision maker's threshold price is lower than the company's, then no price is acceptable to both and the company's optimal strategy is to commission additional research. The methods are illustrated using a recent example from the literature.

  20. Effects of tubing length and coupling method on hearing threshold and real-ear to coupler difference measures.

    PubMed

    Gustafson, Samantha; Pittman, Andrea; Fanning, Robert

    2013-06-01

    This tutorial demonstrates the effects of tubing length and coupling type (i.e., foam tip or personal earmold) on hearing threshold and real-ear-to-coupler difference (RECD) measures. Hearing thresholds from 0.25 kHz through 8 kHz are reported at various tubing lengths for 28 normal-hearing adults between the ages of 22 and 31 years. RECD values are reported for 14 of the adults. All measures were made with an insert earphone coupled to a standard foam tip and with an insert earphone coupled to each participant's personal earmold. Threshold and RECD measures obtained with a personal earmold were significantly different from those obtained with a foam tip on repeated measures analyses of variance. One-sample t tests showed these differences to vary systematically with increasing tubing length, with the largest average differences (7-8 dB) occurring at 4 kHz. This systematic examination demonstrates the equal and opposite effects of tubing length on threshold and acoustic measures. Specifically, as tubing length increased, sound pressure level in the ear canal decreased, affecting both hearing thresholds and the real-ear portion of the RECDs. This demonstration shows that when the same coupling method is used to obtain the hearing thresholds and RECD, equal and accurate estimates of real-ear sound pressure level are obtained.

  1. Position Estimation for Switched Reluctance Motor Based on the Single Threshold Angle

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Pang; Yu, Yue

    2017-05-01

    This paper presents a position estimation model for the switched reluctance motor based on a single threshold angle. In view of the relationship between inductance and rotor position, the position is estimated by comparing the real-time dynamic flux linkage with the flux linkage at the threshold-angle position (a 7.5° threshold angle for a 12/8 SRM). The sensorless model was built in Matlab/Simulink, simulations were carried out under both steady-state and transient conditions, and the validity and feasibility of the method were verified.

  2. Negative Difference Resistance and Its Application to Construct Boolean Logic Circuits

    NASA Astrophysics Data System (ADS)

    Nikodem, Maciej; Bawiec, Marek A.; Surmacz, Tomasz R.

    Electronic circuits based on nanodevices and quantum effects are the future of logic circuit design. Today's technology allows constructing resonant tunneling diodes, quantum cellular automata, and nanowires/nanoribbons that are the elementary components of threshold gates. However, synthesizing a threshold circuit for an arbitrary logic function is still a challenging task for which no efficient algorithms exist. This paper focuses on Generalised Threshold Gates (GTG), giving an overview of threshold circuit synthesis methods and presenting an algorithm that considerably simplifies the task in the case of GTG circuits.
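    The building block that GTGs generalize is the plain linear threshold gate, which fires when a weighted sum of its inputs reaches a threshold. A minimal sketch:

```python
def threshold_gate(weights, t):
    """Linear threshold gate: outputs 1 iff the weighted sum of the
    binary inputs reaches threshold t."""
    return lambda *x: int(sum(w * xi for w, xi in zip(weights, x)) >= t)

# Same weights, different thresholds realize different Boolean functions:
AND = threshold_gate([1, 1], 2)
OR = threshold_gate([1, 1], 1)
MAJ = threshold_gate([1, 1, 1], 2)   # 3-input majority, a classic threshold function
```

    A single such gate can only realize linearly separable functions (XOR, for instance, needs a network of them), which is one reason synthesis for arbitrary functions is hard.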

  3. Relationship between behavioral and physiological spectral-ripple discrimination.

    PubMed

    Won, Jong Ho; Clinard, Christopher G; Kwon, Seeyoun; Dasika, Vasant K; Nie, Kaibao; Drennan, Ward R; Tremblay, Kelly L; Rubinstein, Jay T

    2011-06-01

    Previous studies have found a significant correlation between spectral-ripple discrimination and speech and music perception in cochlear implant (CI) users. This relationship could be of use to clinicians and scientists who are interested in using spectral-ripple stimuli in the assessment and habilitation of CI users. However, previous psychoacoustic tasks used to assess spectral discrimination are not suitable for all populations, and it would be beneficial to develop methods that could be used to test all age ranges, including pediatric implant users. Additionally, it is important to understand how ripple stimuli are processed in the central auditory system and how their neural representation contributes to behavioral performance. For this reason, we developed a single-interval, yes/no paradigm that could potentially be used both behaviorally and electrophysiologically to estimate spectral-ripple threshold. In experiment 1, behavioral thresholds obtained using the single-interval method were compared to thresholds obtained using a previously established three-alternative forced-choice method. A significant correlation was found (r = 0.84, p = 0.0002) in 14 adult CI users. The spectral-ripple threshold obtained using the new method also correlated with speech perception in quiet and noise. In experiment 2, the effect of the number of vocoder-processing channels on the behavioral and physiological threshold in normal-hearing listeners was determined. Behavioral thresholds, using the new single-interval method, as well as cortical P1-N1-P2 responses changed as a function of the number of channels. Better behavioral and physiological performance (i.e., better discrimination ability at higher ripple densities) was observed as more channels were added. In experiment 3, the relationship between behavioral and physiological data was examined. Amplitudes of the P1-N1-P2 "change" responses were significantly correlated with d' values from the single-interval behavioral procedure. Results suggest that the single-interval procedure with spectral-ripple phase inversion in ongoing stimuli is a valid approach for measuring behavioral or physiological spectral resolution.

  4. Noise reduction algorithm with the soft thresholding based on the Shannon entropy and bone-conduction speech cross- correlation bands.

    PubMed

    Na, Sung Dae; Wei, Qun; Seong, Ki Woong; Cho, Jin Ho; Kim, Myoung Nam

    2018-01-01

    The conventional methods of speech enhancement, noise reduction, and voice activity detection are based on suppressing the noise or non-speech components of the target air-conduction signals. However, air-conducted speech is hard to differentiate from babble or white noise. To overcome this problem, the proposed algorithm uses bone-conduction speech signals and soft thresholding based on the Shannon entropy principle and the cross-correlation of air- and bone-conduction signals. A new algorithm for speech detection and noise reduction is proposed that uses the Shannon entropy principle and cross-correlation with the bone-conduction speech signal to threshold the wavelet packet coefficients of the noisy speech. Each threshold is generated by the entropy and cross-correlation approaches in the bands produced by wavelet packet decomposition. Performance was evaluated in MATLAB simulations using objective quality measures (PESQ, RMSE, correlation, and SNR), and the method's feasibility was verified by comparing the air- and bone-conduction speech signals and their spectra. The results confirm the high performance of the proposed method, which makes it well suited to future applications in communication devices, noisy environments, construction, and military operations.
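    A minimal sketch of the soft-thresholding step follows. The entropy-scaled band threshold here is an assumption standing in for the paper's entropy/cross-correlation rule, and the bone-conduction cross-correlation term is omitted:

```python
import numpy as np

def soft_threshold(coeffs, thr):
    """Soft thresholding: shrink coefficients toward zero by thr;
    anything smaller in magnitude is zeroed."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def entropy_scaled_threshold(coeffs):
    """Illustrative band threshold: a universal-style noise threshold
    scaled by the normalized Shannon entropy of the band's coefficient
    energies (flat, noise-like bands get a larger relative weight)."""
    e = coeffs ** 2
    p = e / e.sum()
    h = -(p[p > 0] * np.log2(p[p > 0])).sum()        # Shannon entropy, bits
    h_norm = h / np.log2(coeffs.size)                # 1.0 for a flat band
    sigma = np.median(np.abs(coeffs)) / 0.6745       # robust noise estimate
    return h_norm * sigma * np.sqrt(2 * np.log(coeffs.size))

rng = np.random.default_rng(0)
band = np.concatenate([rng.normal(0, 0.1, 120), [3.0, -2.5]])  # noise + "speech"
denoised = soft_threshold(band, entropy_scaled_threshold(band))
```

    In the full algorithm this would be applied per band of the wavelet packet decomposition, with each band's threshold modulated by its cross-correlation with the bone-conduction signal.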

  5. Evaluation of automated threshold selection methods for accurately sizing microscopic fluorescent cells by image analysis.

    PubMed Central

    Sieracki, M E; Reichenbach, S E; Webb, K L

    1989-01-01

    The accurate measurement of bacterial and protistan cell biomass is necessary for understanding their population and trophic dynamics in nature. Direct measurement of fluorescently stained cells is often the method of choice. The tedium of making such measurements visually on the large numbers of cells required has prompted the use of automatic image analysis for this purpose. Accurate measurements by image analysis require an accurate, reliable method of segmenting the image, that is, distinguishing the brightly fluorescing cells from a dark background. This is commonly done by visually choosing a threshold intensity value which most closely coincides with the outline of the cells as perceived by the operator. Ideally, an automated method based on the cell image characteristics should be used. Since the optical nature of edges in images of light-emitting, microscopic fluorescent objects is different from that of images generated by transmitted or reflected light, it seemed that automatic segmentation of such images may require special considerations. We tested nine automated threshold selection methods using standard fluorescent microspheres ranging in size and fluorescence intensity and fluorochrome-stained samples of cells from cultures of cyanobacteria, flagellates, and ciliates. The methods included several variations based on the maximum intensity gradient of the sphere profile (first derivative), the minimum in the second derivative of the sphere profile, the minimum of the image histogram, and the midpoint intensity. Our results indicated that thresholds determined visually and by first-derivative methods tended to overestimate the threshold, causing an underestimation of microsphere size. The method based on the minimum of the second derivative of the profile yielded the most accurate area estimates for spheres of different sizes and brightnesses and for four of the five cell types tested. 
A simple model of the optical properties of fluorescing objects and the video acquisition system is described which explains how the second derivative best approximates the position of the edge. PMID:2516431

  6. Fluorescently labeled bevacizumab in human breast cancer: defining the classification threshold

    NASA Astrophysics Data System (ADS)

    Koch, Maximilian; de Jong, Johannes S.; Glatz, Jürgen; Symvoulidis, Panagiotis; Lamberts, Laetitia E.; Adams, Arthur L. L.; Kranendonk, Mariëtte E. G.; Terwisscha van Scheltinga, Anton G. T.; Aichler, Michaela; Jansen, Liesbeth; de Vries, Jakob; Lub-de Hooge, Marjolijn N.; Schröder, Carolien P.; Jorritsma-Smit, Annelies; Linssen, Matthijs D.; de Boer, Esther; van der Vegt, Bert; Nagengast, Wouter B.; Elias, Sjoerd G.; Oliveira, Sabrina; Witkamp, Arjen J.; Mali, Willem P. Th. M.; Van der Wall, Elsken; Garcia-Allende, P. Beatriz; van Diest, Paul J.; de Vries, Elisabeth G. E.; Walch, Axel; van Dam, Gooitzen M.; Ntziachristos, Vasilis

    2017-07-01

    Breast cancer specimens containing an in-vivo fluorescently labelled drug (bevacizumab) were obtained from patients. We propose a new structured method to determine the optimal classification threshold in targeted fluorescence intra-operative imaging.

  7. Vessel extraction in retinal images using automatic thresholding and Gabor Wavelet.

    PubMed

    Ali, Aziah; Hussain, Aini; Wan Zaki, Wan Mimi Diyana

    2017-07-01

    Retinal image analysis has been widely used for early detection and diagnosis of multiple systemic diseases. Accurate vessel extraction from retinal images is a crucial step towards a fully automated diagnosis system. This work presents an efficient unsupervised method for extracting blood vessels from retinal images by combining the existing Gabor Wavelet (GW) method with automatic thresholding. The green channel is extracted from the color retinal image and used to produce a Gabor feature image using GW. Both the green channel image and the Gabor feature image undergo a vessel-enhancement step in order to highlight blood vessels. Next, the two vessel-enhanced images are transformed to binary images using automatic thresholding before being combined to produce the final vessel output. Combining the images significantly improves blood vessel extraction performance compared with using either image alone. The effectiveness of the proposed method was demonstrated via comparative analysis with existing methods on the publicly available DRIVE database.
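    The combination step — binarize each vessel-enhanced image with an automatic threshold, then merge the binary maps — can be sketched with Otsu's method. The union rule below is an assumption; the paper's exact combination may differ:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's method: pick the histogram cut maximizing between-class variance."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    w = hist.cumsum().astype(float)            # class-0 pixel count per cut
    m = (hist * centers).cumsum()              # class-0 intensity mass per cut
    w_total, m_total = w[-1], m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        mu0 = m / w
        mu1 = (m_total - m) / (w_total - w)
        between = w * (w_total - w) * (mu0 - mu1) ** 2
    return centers[np.nanargmax(between)]

def combine_vessel_maps(enhanced_green, gabor_feature):
    """Binarize each vessel-enhanced image with its own Otsu threshold and
    keep pixels flagged by either map (union rule)."""
    b1 = enhanced_green > otsu_threshold(enhanced_green)
    b2 = gabor_feature > otsu_threshold(gabor_feature)
    return b1 | b2

# Toy maps: the "green channel" sees a vertical structure, the "Gabor
# feature" a horizontal one; the union keeps both.
g = np.zeros((10, 10)); g[:, 5:] = 1.0
f = np.zeros((10, 10)); f[2:4, :] = 1.0
vessels = combine_vessel_maps(g, f)
```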

  8. Constructing financial network based on PMFG and threshold method

    NASA Astrophysics Data System (ADS)

    Nie, Chun-Xiao; Song, Fu-Tie

    2018-04-01

    Based on planar maximally filtered graph (PMFG) and threshold method, we introduced a correlation-based network named PMFG-based threshold network (PTN). We studied the community structure of PTN and applied ISOMAP algorithm to represent PTN in low-dimensional Euclidean space. The results show that the community corresponds well to the cluster in the Euclidean space. Further, we studied the dynamics of the community structure and constructed the normalized mutual information (NMI) matrix. Based on the real data in the market, we found that the volatility of the market can lead to dramatic changes in the community structure, and the structure is more stable during the financial crisis.
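    Setting the PMFG filtering aside, the bare threshold-network construction on a correlation matrix looks like this (the asset series and threshold value are illustrative):

```python
import numpy as np

def threshold_network(returns, theta=0.5):
    """Adjacency matrix of a correlation threshold network: connect assets
    i and j when their return correlation reaches theta. (The PMFG
    filtering step from the paper is omitted in this sketch.)"""
    c = np.corrcoef(returns)          # assets in rows, time in columns
    return (c >= theta) & ~np.eye(c.shape[0], dtype=bool)

rng = np.random.default_rng(1)
market = rng.normal(size=250)                  # common market factor
a = market + 0.3 * rng.normal(size=250)
b = market + 0.3 * rng.normal(size=250)
c_ = rng.normal(size=250)                      # idiosyncratic asset
adj = threshold_network(np.vstack([a, b, c_]), theta=0.5)
```

    The two market-driven assets end up connected while the idiosyncratic one stays isolated; community detection and the NMI comparison in the paper operate on networks built this way, combined with PMFG.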

  9. Estimation of Signal Coherence Threshold and Concealed Spectral Lines Applied to Detection of Turbofan Engine Combustion Noise

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2010-01-01

    Combustion noise from turbofan engines has become important, as the noise from sources like the fan and jet are reduced. An aligned and un-aligned coherence technique has been developed to determine a threshold level for the coherence and thereby help to separate the coherent combustion noise source from other noise sources measured with far-field microphones. This method is compared with a statistics based coherence threshold estimation method. In addition, the un-aligned coherence procedure at the same time also reveals periodicities, spectral lines, and undamped sinusoids hidden by broadband turbofan engine noise. In calculating the coherence threshold using a statistical method, one may use either the number of independent records or a larger number corresponding to the number of overlapped records used to create the average. Using data from a turbofan engine and a simulation this paper shows that applying the Fisher z-transform to the un-aligned coherence can aid in making the proper selection of samples and produce a reasonable statistics based coherence threshold. Examples are presented showing that the underlying tonal and coherent broad band structure which is buried under random broadband noise and jet noise can be determined. The method also shows the possible presence of indirect combustion noise. Copyright 2011 Acoustical Society of America. This article may be downloaded for personal use only. Any other use requires prior permission of the author and the Acoustical Society of America.
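    The statistics-based coherence threshold discussed above is commonly computed from the number of averages. This sketch uses the standard significance formula for independent (non-overlapping) records, which is exactly why the choice between independent and overlapped record counts matters:

```python
def coherence_threshold(n_segments, alpha=0.05):
    """Significance threshold for magnitude-squared coherence estimated by
    averaging n_segments independent records: estimates below this level
    are consistent with zero true coherence at significance alpha."""
    return 1.0 - alpha ** (1.0 / (n_segments - 1))

# Counting overlapped records as if independent lowers (understates) the
# threshold; e.g. 100 independent records vs. a 50%-overlap count of 190:
thr_independent = coherence_threshold(100)
thr_overlapped = coherence_threshold(190)
```

    With 100 independent averages the 5% threshold is about 0.030; treating 190 overlapped records as independent drops it to about 0.016, which risks declaring spurious coherence significant.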

  10. Salicylate-induced changes in auditory thresholds of adolescent and adult rats.

    PubMed

    Brennan, J F; Brown, C A; Jastreboff, P J

    1996-01-01

    Shifts in auditory intensity thresholds after salicylate administration were examined in postweanling and adult pigmented rats at frequencies ranging from 1 to 35 kHz. A total of 132 subjects from both age levels were tested under two-way active avoidance or one-way active avoidance paradigms. Estimated thresholds were inferred from behavioral responses to presentations of descending and ascending series of intensities for each test frequency value. Reliable threshold estimates were found under both avoidance conditioning methods, and compared to controls, subjects at both age levels showed threshold shifts at selective higher frequency values after salicylate injection, and the extent of shifts was related to salicylate dose level.

  11. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    PubMed

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
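    The exhaustive search whose cost motivates the metaheuristic can be written down directly: maximize Otsu's between-class variance over all threshold combinations, which is tractable only for small numbers of thresholds and coarse histograms. A sketch, with an illustrative trimodal image:

```python
import numpy as np
from itertools import combinations

def otsu_multilevel_exhaustive(img, k=2, nbins=64):
    """Brute-force multilevel Otsu: test every combination of k histogram
    cuts and keep the one maximizing between-class variance. Cost grows as
    C(nbins-1, k), which is what metaheuristics like the flower
    pollination algorithm avoid."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    p = hist / hist.sum()
    mu_total = (p * centers).sum()
    best, best_thr = -1.0, None
    for cuts in combinations(range(1, nbins), k):
        var_b, prev = 0.0, 0
        for cut in (*cuts, nbins):
            w = p[prev:cut].sum()
            if w > 0:
                mu = (p[prev:cut] * centers[prev:cut]).sum() / w
                var_b += w * (mu - mu_total) ** 2
            prev = cut
        if var_b > best:
            best, best_thr = var_b, cuts
    return [centers[c - 1] for c in best_thr]   # thresholds as gray levels

# Trimodal toy image with modes at 10, 100 and 200
img = np.concatenate([np.full(300, 10.0), np.full(300, 100.0), np.full(300, 200.0)])
thr = otsu_multilevel_exhaustive(img, k=2)      # two thresholds separating the modes
```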

  12. Digital Image Sensor-Based Assessment of the Status of Oat (Avena sativa L.) Crops after Frost Damage

    PubMed Central

    Macedo-Cruz, Antonia; Pajares, Gonzalo; Santos, Matilde; Villegas-Romero, Isidro

    2011-01-01

    The aim of this paper is to classify the land covered with oat crops, and the quantification of frost damage on oats, while plants are still in the flowering stage. The images are taken by a digital colour camera CCD-based sensor. Unsupervised classification methods are applied because the plants present different spectral signatures, depending on two main factors: illumination and the affected state. The colour space used in this application is CIELab, based on the decomposition of the colour in three channels, because it is the closest to human colour perception. The histogram of each channel is successively split into regions by thresholding. The best threshold to be applied is automatically obtained as a combination of three thresholding strategies: (a) Otsu’s method, (b) Isodata algorithm, and (c) Fuzzy thresholding. The fusion of these automatic thresholding techniques and the design of the classification strategy are some of the main findings of the paper, which allows an estimation of the damages and a prediction of the oat production. PMID:22163940

  13. Universal phase transition in community detectability under a stochastic block model.

    PubMed

    Chen, Pin-Yu; Hero, Alfred O

    2015-03-01

    We prove the existence of an asymptotic phase-transition threshold on community detectability for the spectral modularity method [M. E. J. Newman, Phys. Rev. E 74, 036104 (2006) and Proc. Natl. Acad. Sci. (USA) 103, 8577 (2006)] under a stochastic block model. The phase transition on community detectability occurs as the intercommunity edge connection probability p grows. This phase transition separates a subcritical regime of small p, where modularity-based community detection successfully identifies the communities, from a supercritical regime of large p where successful community detection is impossible. We show that, as the community sizes become large, the asymptotic phase-transition threshold p* is equal to √(p1p2), where pi (i = 1, 2) is the within-community edge connection probability. Thus the phase-transition threshold is universal in the sense that it does not depend on the ratio of community sizes. The universal phase-transition phenomenon is validated by simulations for moderately sized communities. Using the derived expression for the phase-transition threshold, we propose an empirical method for estimating this threshold from real-world data.

  14. Digital image sensor-based assessment of the status of oat (Avena sativa L.) crops after frost damage.

    PubMed

    Macedo-Cruz, Antonia; Pajares, Gonzalo; Santos, Matilde; Villegas-Romero, Isidro

    2011-01-01

    The aim of this paper is to classify the land covered with oat crops, and the quantification of frost damage on oats, while plants are still in the flowering stage. The images are taken by a digital colour camera CCD-based sensor. Unsupervised classification methods are applied because the plants present different spectral signatures, depending on two main factors: illumination and the affected state. The colour space used in this application is CIELab, based on the decomposition of the colour in three channels, because it is the closest to human colour perception. The histogram of each channel is successively split into regions by thresholding. The best threshold to be applied is automatically obtained as a combination of three thresholding strategies: (a) Otsu's method, (b) Isodata algorithm, and (c) Fuzzy thresholding. The fusion of these automatic thresholding techniques and the design of the classification strategy are some of the main findings of the paper, which allows an estimation of the damages and a prediction of the oat production.

  15. Polynomial sequences for bond percolation critical thresholds

    DOE PAGES

    Scullard, Christian R.

    2011-09-22

    In this paper, I compute the inhomogeneous (multi-probability) bond critical surfaces for the (4, 6, 12) and (3^4, 6) lattices using the linearity approximation described in Scullard and Ziff (J. Stat. Mech. 03021), implemented as a branching process of lattices. I find the estimates for the bond percolation thresholds, pc(4, 6, 12) = 0.69377849... and pc(3^4, 6) = 0.43437077..., compared with Parviainen's numerical results of pc = 0.69373383... and pc = 0.43430621... . These deviations are of the order 10^-5, as is standard for this method. Deriving thresholds in this way for a given lattice leads to a polynomial with integer coefficients, the root in [0, 1] of which gives the estimate for the bond threshold, and I show how the method can be refined, leading to a series of higher-order polynomials making predictions that likely converge to the exact answer. Finally, I discuss how this fact hints that for certain graphs, such as the kagome lattice, the exact bond threshold may not be the root of any polynomial with integer coefficients.

  16. Psychophysical Measurement of Rod and Cone Thresholds in Stargardt Disease with Full-Field Stimuli

    PubMed Central

    Collison, Frederick T.; Fishman, Gerald A.; McAnany, J. Jason; Zernant, Jana; Allikmets, Rando

    2014-01-01

    Purpose To investigate psychophysical thresholds in Stargardt disease with the full-field stimulus test (FST). Methods Visual acuity (VA), spectral-domain optical coherence tomography (SD-OCT), full-field electroretinogram (ERG), and FST measurements were made in one eye of 24 patients with Stargardt disease. Dark-adapted rod FST thresholds were measured with short-wavelength stimuli, and cone FST thresholds were obtained from the cone plateau phase of dark adaptation using long-wavelength stimuli. Correlation coefficients were calculated for FST thresholds versus macular thickness, VA and ERG amplitudes. Results Stargardt patient FST cone thresholds correlated significantly with VA, macular thickness, and ERG cone-response amplitudes (all P<0.01). The patients’ FST rod thresholds correlated with ERG rod-response amplitudes (P<0.01), but not macular thickness (P=0.05). All Stargardt disease patients with flecks confined to the macula and most of the patients with flecks extending outside of the macula had normal FST thresholds. All patients with extramacular atrophic changes had elevated FST cone thresholds and most had elevated FST rod thresholds. Conclusion FST rod and cone threshold elevation in Stargardt disease patients correlated well with measures of structure and function, as well as ophthalmoscopic retinal appearance. FST appears to be a useful tool for assessing rod and cone function in Stargardt disease. PMID:24695063

  17. Threshold-driven optimization for reference-based auto-planning

    NASA Astrophysics Data System (ADS)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.
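    The two-component voxel penalty described above (a reference dose threshold plus a weight for under- and for over-dosing) is a one-liner; TORA's point is that updating the thresholds, rather than the weights, drives the plan toward the reference DVH. A minimal sketch with illustrative dose values:

```python
import numpy as np

def quadratic_penalty(dose, t_under, t_over, w_under=1.0, w_over=1.0):
    """Voxel-based quadratic penalty with separate under- and over-dose
    reference thresholds: only shortfall below t_under and excess above
    t_over are penalized, each with its own weight."""
    under = np.maximum(t_under - dose, 0.0)
    over = np.maximum(dose - t_over, 0.0)
    return float((w_under * under ** 2 + w_over * over ** 2).sum())

dose = np.array([58.0, 60.0, 62.0, 66.0])   # illustrative voxel doses (Gy)
penalty = quadratic_penalty(dose, t_under=60.0, t_over=64.0)  # (60-58)^2 + (66-64)^2
```

    Conventional planning would now iterate on w_under and w_over; threshold-driven planning instead reassigns t_under and t_over per voxel from the reference DVH and re-solves.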

  18. High-resolution audiometry: an automated method for hearing threshold acquisition with quality control.

    PubMed

    Bian, Lin

    2012-01-01

In clinical practice, hearing thresholds are measured at only five to six frequencies at octave intervals. Thus, the audiometric configuration cannot closely reflect the actual status of the auditory structures. In addition, differential diagnosis requires quantitative comparison of behavioral thresholds with physiological measures, such as otoacoustic emissions (OAEs), which are usually measured at higher resolution. The purpose of this research was to develop a method to improve the frequency resolution of the audiogram. A repeated-measures design was used in the study to evaluate the reliability of the threshold measurements. A total of 16 participants with clinically normal hearing and mild hearing loss were recruited from a population of university students. No intervention was involved in the study. A custom-developed system and software were used for threshold acquisition with quality control (QC). With real-ear calibration and monitoring of test signals, the system provided accurate and individualized measures of hearing thresholds that were determined by an analysis based on signal detection theory (SDT). The reliability of the threshold measure was assessed by correlation and differences between the repeated measures. The audiometric configurations were diverse and unique to each individual ear. The accuracy, within-subject reliability, and between-test repeatability were relatively high. With QC, high-resolution audiograms can be reliably and accurately measured. Hearing thresholds measured as ear canal sound pressures with higher frequency resolution can provide more customized hearing-aid fitting. The test system may be integrated with other physiological measures, such as OAEs, into a comprehensive evaluative tool. American Academy of Audiology.

  19. Noise reduction in Lidar signal using correlation-based EMD combined with soft thresholding and roughness penalty

    NASA Astrophysics Data System (ADS)

    Chang, Jianhua; Zhu, Lingyan; Li, Hongxu; Xu, Fan; Liu, Binggang; Yang, Zhenbo

    2018-01-01

Empirical mode decomposition (EMD) is widely used to analyze non-linear and non-stationary signals for noise reduction. In this study, a novel EMD-based denoising method, referred to as EMD with soft thresholding and roughness penalty (EMD-STRP), is proposed for Lidar signal denoising. With the proposed method, the relevant and irrelevant intrinsic mode functions are first distinguished via a correlation coefficient. Then, the soft thresholding technique is applied to the irrelevant modes, and the roughness penalty technique is applied to the relevant modes to extract as much information as possible. The effectiveness of the proposed method was evaluated using three typical signals contaminated by white Gaussian noise. The denoising performance was then compared to the denoising capabilities of other techniques, such as correlation-based EMD partial reconstruction, correlation-based EMD hard thresholding, and wavelet transform. The use of EMD-STRP on the measured Lidar signal resulted in the noise being efficiently suppressed, with an improved signal-to-noise ratio of 22.25 dB and an extended detection range of 11 km.
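The mode classification and soft-thresholding steps can be sketched as below (a simplified illustration: the EMD decomposition itself, which would come from a library such as PyEMD, and the roughness-penalty smoothing of relevant modes are omitted, and the cutoff values are assumptions):

```python
import numpy as np

def soft_threshold(x, thr):
    """Wavelet-style soft rule: shrink coefficients toward zero by thr."""
    return np.sign(x) * np.clip(np.abs(x) - thr, 0.0, None)

def denoise_imfs(signal, imfs, corr_cut=0.3, thr=0.5):
    """Keep IMFs well correlated with the signal (relevant modes),
    soft-threshold the weakly correlated, noise-dominated ones, and
    reconstruct by summation."""
    out = []
    for imf in imfs:
        r = np.corrcoef(signal, imf)[0, 1]
        out.append(imf if abs(r) >= corr_cut else soft_threshold(imf, thr))
    return np.sum(out, axis=0)

t = np.linspace(0, 2 * np.pi, 200)
clean = np.sin(t)                   # stand-in for a relevant mode
noise_mode = 0.05 * np.cos(37 * t)  # high-frequency, noise-like mode
denoised = denoise_imfs(clean, [clean, noise_mode])
print(np.allclose(denoised, clean))  # → True: the noise mode is removed
```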

  20. The Hilbert-Huang Transform-Based Denoising Method for the TEM Response of a PRBS Source Signal

    NASA Astrophysics Data System (ADS)

    Hai, Li; Guo-qiang, Xue; Pan, Zhao; Hua-sen, Zhong; Khan, Muhammad Younis

    2016-08-01

The denoising process is critical in processing transient electromagnetic (TEM) sounding data. For the full waveform pseudo-random binary sequence (PRBS) response, an inadequate noise estimation may result in an erroneous interpretation. We consider the Hilbert-Huang transform (HHT) and its application to suppress the noise in the PRBS response. The focus is on the thresholding scheme to suppress the noise and the analysis of the signal based on its Hilbert time-frequency representation. The method first decomposes the signal into intrinsic mode functions, and then, inspired by the thresholding scheme in wavelet analysis, an adaptive interval thresholding is conducted to set to zero all the components of the intrinsic mode functions that are lower than a threshold related to the noise level. The algorithm is based on the characteristics of the PRBS response. The HHT-based denoising scheme is tested on synthetic and field data with different noise levels. The results show that the proposed method has a good capability for denoising and detail preservation.

  1. An adaptive threshold detector and channel parameter estimator for deep space optical communications

    NASA Technical Reports Server (NTRS)

    Arabshahi, P.; Mukai, R.; Yan, T. -Y.

    2001-01-01

This paper presents a method for optimal adaptive setting of pulse-position-modulation pulse detection thresholds, which minimizes the total probability of error for the dynamically fading optical free-space channel.

  2. Comparison of memory thresholds for planar qudit geometries

    NASA Astrophysics Data System (ADS)

    Marks, Jacob; Jochym-O'Connor, Tomas; Gheorghiu, Vlad

    2017-11-01

We introduce and analyze a new type of decoding algorithm called general color clustering, based on renormalization group methods, to be used in qudit color codes. The performance of this decoder is analyzed under a generalized bit-flip error model, and is used to obtain the first memory threshold estimates for qudit 6-6-6 color codes. The proposed decoder is compared with similar decoding schemes for qudit surface codes as well as the current leading qubit decoders for both sets of codes. We find that, as with surface codes, clustering performs sub-optimally for qubit color codes, giving a threshold of 5.6% compared to the 8.0% obtained through surface projection decoding methods. However, the threshold rate increases by up to 112% for large qudit dimensions, plateauing around 11.9%. All the analysis is performed using QTop, a new open-source software package for simulating and visualizing topological quantum error correcting codes.

  3. Low Temperature Polycrystalline Silicon Thin Film Transistor Pixel Circuits for Active Matrix Organic Light Emitting Diodes

    NASA Astrophysics Data System (ADS)

    Fan, Ching-Lin; Lin, Yu-Sheng; Liu, Yan-Wei

A new pixel design and driving method for active matrix organic light emitting diode (AMOLED) displays that use low-temperature polycrystalline silicon thin-film transistors (LTPS-TFTs) with a voltage programming method are proposed and verified using the SPICE simulator. We employed an appropriate TFT model in the SPICE simulation to demonstrate the performance of the pixel circuit. The OLED anode voltage variation error rates are below 0.35% under driving-TFT threshold voltage deviation (ΔVth = ±0.33 V). The OLED current non-uniformity caused by the OLED threshold voltage degradation (ΔVTO = +0.33 V) is significantly reduced (below 6%). The simulation results show that the pixel design can improve display image non-uniformity by compensating for the threshold voltage deviation in the driving TFT and the OLED threshold voltage degradation at the same time.

  4. Development of Matched (migratory Analytical Time Change Easy Detection) Method for Satellite-Tracked Migratory Birds

    NASA Astrophysics Data System (ADS)

    Doko, Tomoko; Chen, Wenbo; Higuchi, Hiroyoshi

    2016-06-01

Satellite tracking technology has been used to reveal the migration patterns and flyways of migratory birds. In general, bird migration can be classified according to migration status. These statuses include the wintering period, spring migration, breeding period, and autumn migration. To determine the migration status, the periods of these statuses should be individually determined, but there is no objective method to define 'a threshold date' for when an individual bird changes its status. The research objective is to develop an effective and objective method to determine threshold dates of migration status based on satellite-tracked data. The developed method was named the "MATCHED (Migratory Analytical Time Change Easy Detection) method". The MATCHED method is composed of six steps: 1) dataset preparation, 2) time frame creation, 3) automatic identification, 4) visualization of change points, 5) interpretation, and 6) manual correction. Accuracy was tested. In general, the MATCHED method proved powerful for identifying the change points between migration statuses as well as stopovers. Nevertheless, identifying "exact" threshold dates is still challenging. Limitations and applications of this method are discussed.
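Step 3 (automatic identification) amounts to finding the dates at which the movement state changes; a minimal stand-in, not the authors' algorithm, is to threshold daily displacement and report where the stationary/migrating state flips (the threshold and data below are hypothetical):

```python
import numpy as np

def change_points(daily_km, move_thresh=50.0):
    """Indices where the bird switches between stationary and migrating,
    judged by daily displacement against a movement threshold."""
    moving = np.asarray(daily_km) > move_thresh
    return np.flatnonzero(np.diff(moving.astype(int))) + 1

# Ten days: wintering site, a spring migration burst, then settled again.
disp = [3, 5, 2, 180, 220, 150, 4, 2, 3, 1]
print(change_points(disp))  # → [3 6]
```

In the full method these candidate change points would then be visualized (step 4), interpreted (step 5), and manually corrected (step 6).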

  5. Identifying the Threshold of Dominant Controls on Fire Spread in a Boreal Forest Landscape of Northeast China

    PubMed Central

    Liu, Zhihua; Yang, Jian; He, Hong S.

    2013-01-01

The relative importance of fuel, topography, and weather on fire spread varies at different spatial scales, but how the relative importance of these controls responds to changing spatial scales is poorly understood. We designed a “moving window” resampling technique that allowed us to quantify the relative importance of controls on fire spread at continuous spatial scales using boosted regression tree methods. This quantification allowed us to identify the threshold value for fire size at which the dominant control switches from fuel at small sizes to weather at large sizes. Topography had a fluctuating effect on fire spread across the spatial scales, explaining 20–30% of relative importance. With increasing fire size, the dominant control switched from bottom-up controls (fuel and topography) to top-down controls (weather). Our analysis suggested that there is a threshold for fire size, above which fires are driven primarily by weather and more likely lead to larger fire sizes. We suggest that this threshold, which may be ecosystem-specific, can be identified using our “moving window” resampling technique. Although the threshold derived from this analytical method may rely heavily on the sampling technique, our study introduced an easily implemented approach to identify scale thresholds in wildfire regimes. PMID:23383247

  6. [The application of cortical and subcortical stimulation threshold in identifying the motor pathway and guiding the resection of gliomas in the functional areas].

    PubMed

    Ren, X H; Yang, X C; Huang, W; Yang, K Y; Liu, L; Qiao, H; Guo, L J; Cui, Y; Lin, S

    2018-03-06

Objective: This study aimed to analyze the application of cortical and subcortical stimulation thresholds in identifying the motor pathway and guiding the resection of gliomas in the functional areas, and to determine the minimal safe threshold by the ROC method. Methods: Fifty-seven patients with gliomas in the functional areas were enrolled in the study at Beijing Tiantan Hospital from 2015 to 2017. Anesthesia was maintained intravenously with propofol 10% and remifentanil. Throughout the resection process, the cortical or subcortical stimulation threshold was determined along the tumor border using monopolar or bipolar electrodes. The motor pathway was identified and protected from resection according to the stimulation threshold and transcranial MEPs. The minimal threshold in each case was recorded. Results: Total resection was achieved in 32 cases (56.1%), sub-total resection in 22 cases (38.6%), and partial resection in 3 cases (5.3%). Pre-operative motor disability was found in 9 cases. Compared with pre-operative motor scores, 19 patients exhibited impaired motor function on day 1 after surgery, 5 had quick recovery by day 7 after surgery, and 7 had late recovery by 3 months after surgery. At 3 months, 7 still had impaired motor function. The frequency of intraoperative seizure was 1.8% (1/57). No other side effects were found during intraoperative monitoring. The ROC curve revealed that the minimal safe monopolar subcortical threshold was 5.70 mA for strength deterioration on day 1 and day 7 after surgery. Univariate analysis revealed that decreased transcranial MEPs and a minimal subcortical threshold ≤5.7 mA were correlated with postoperative strength deterioration. Conclusions: Cortical and subcortical stimulation thresholds are valuable for identifying the motor pathway and guiding the resection of tumors within the functional areas. 5.7 mA can be used as the minimal safe threshold to protect the motor pathway from injury.
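A minimal-safe-threshold cutoff of the kind reported (5.70 mA) can be derived from an ROC analysis, for example by maximizing Youden's J statistic; the data below are hypothetical and this is not the authors' actual analysis:

```python
import numpy as np

def youden_cutoff(values, outcome):
    """Pick the cutoff maximizing sensitivity + specificity - 1
    (Youden's J) for predicting a binary outcome from a continuous
    measurement. Here `values` would be minimal subcortical stimulation
    thresholds (mA) and `outcome` 1 if strength deteriorated."""
    values = np.asarray(values, float)
    outcome = np.asarray(outcome, bool)
    best_j, best_c = -1.0, None
    for c in np.unique(values):
        pred = values <= c               # a low threshold predicts deficit
        sens = np.mean(pred[outcome])    # true-positive rate
        spec = np.mean(~pred[~outcome])  # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_c = j, c
    return best_c

# Hypothetical minimal thresholds (mA) and deficit outcomes:
mA = [2.0, 3.5, 5.0, 5.7, 6.5, 8.0, 9.0]
deficit = [1, 1, 1, 1, 0, 0, 0]
print(youden_cutoff(mA, deficit))  # → 5.7
```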

  7. Efficient method for calculations of ro-vibrational states in triatomic molecules near dissociation threshold: Application to ozone

    NASA Astrophysics Data System (ADS)

    Teplukhin, Alexander; Babikov, Dmitri

    2016-09-01

A method for calculations of rotational-vibrational states of triatomic molecules up to the dissociation threshold (and scattering resonances above it) is devised that combines hyper-spherical coordinates, a sequential diagonalization-truncation procedure, an optimized-grid DVR, and a complex absorbing potential. Efficiency and accuracy of the method and new code are tested by computing the spectrum of ozone up to the dissociation threshold, using two different potential energy surfaces. In both cases good agreement with the results of previous studies is obtained for the lower-energy states localized in the deep (~10 000 cm-1) covalent well. The upper part of the bound state spectrum, within 600 cm-1 below the dissociation threshold, is also computed and analyzed in detail. It is found that long progressions of symmetric-stretching and bending states (up to 8 and 11 quanta, respectively) survive up to the dissociation threshold and even above it, whereas excitations of the asymmetric-stretching overtones couple to the local vibration modes, making assignments difficult. Within 140 cm-1 below the dissociation threshold, large-amplitude vibrational states of a floppy complex O⋯O2 are formed over the shallow van der Waals plateau. These are assigned using two local modes: the rocking-motion and dissociative-motion progressions, up to 6 quanta in each, both with frequency ~20 cm-1. Many of these plateau states are mixed with states of the covalent well. Interestingly, excitation of the rocking motion helps keep these states localized within the plateau region by raising the effective barrier.

  8. Evaluation and comparison of 50 Hz current threshold of electrocutaneous sensations using different methods

    PubMed Central

    Lindenblatt, G.; Silny, J.

    2006-01-01

Leakage currents, tiny currents flowing from an everyday-life appliance through the body to the ground, can cause a non-adequate perception (called electrocutaneous sensation, ECS) or even pain and should be avoided. Safety standards for the low-frequency range are based on experimental results for current thresholds of electrocutaneous sensations, which however show a wide range between about 50 μA (rms) and 1000 μA (rms). In order to explain these differences, the perception threshold was measured repeatedly in experiments with test persons under an identical experimental setup, but by means of different methods (measuring strategies), namely: direct adjustment, the classical threshold as the amperage of 50% perception probability, and the confidence rating procedure of signal detection theory. The current is injected using a 1 cm² electrode at the highly touch-sensitive part of the index fingertip. These investigations show for the first time that the threshold of electrocutaneous sensations is influenced both by adaptation to the non-adequate stimulus and by individual, emotional factors. Therefore, classical methods, on which the majority of safety investigations are based, cannot be used to determine a leakage current threshold. The confidence rating procedure of modern signal detection theory yields a value of 179.5 μA (rms) at the 50 Hz power supply net frequency as the lower end of the 95% confidence range, considering the variance in the investigated group. This value is expected to be free of adaptation influences, is distinctly lower than the European limits, and supports the stricter regulations of Canada and the USA. PMID:17111461

  9. Mechanical sensibility in free and island flaps of the foot.

    PubMed

    Rautio, J; Kekoni, J; Hämäläinen, H; Härmä, M; Asko-Seljavaara, S

    1989-04-01

Mechanical sensibility in 20 free skin flaps and four dorsalis pedis island flaps, used for the reconstruction of foot defects, was analyzed with conventional clinical methods and by determining sensibility thresholds to vibration frequencies of 20, 80, and 240 Hz. To eliminate inter-individual variability, a score was calculated for each frequency by dividing the thresholds determined for each flap by the values obtained from the corresponding area on the uninjured foot. The soft tissue stability of the reconstruction was assessed. Patients were divided into three groups according to the scores. In the group of flaps with the best sensibility, the threshold increases were low at all frequencies. In the group with intermediate sensibility, the relative threshold increases were greater the higher the frequency. In the group with the poorest sensibility, no thresholds were obtained at the 240 Hz frequency and the threshold increases were very high at all frequencies. Sensibility was not related to the length of follow-up time, nor to the type or size of the flap. However, flap sensibility was closely associated with that of the recipient area, where sensibility was usually inferior to that of normal skin. The island flaps generally had better sensibility than the free flaps. There was good correspondence between the levels of sensibility determined by clinical and quantitative methods. The quantitative data on the level of sensibility obtained with the psychophysical method were found to be reliable and free from observer bias, and are therefore recommended for future studies. The degree of sensibility may have contributed to, but was not essential for, good soft-tissue stability of the reconstruction.

  10. Cloud detection method for Chinese moderate high resolution satellite imagery (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Zhong, Bo; Chen, Wuhan; Wu, Shanlong; Liu, Qinhuo

    2016-10-01

Cloud detection in satellite imagery is very important for quantitative remote sensing research and remote sensing applications. However, many satellite sensors do not have enough bands for quick, accurate, and simple detection of clouds. In particular, the newly launched moderate-to-high spatial resolution satellite sensors of China, such as the charge-coupled device on board the Chinese Huan Jing 1 (HJ-1/CCD) and the wide field of view (WFV) sensor on board the Gao Fen 1 (GF-1), have only four available bands (blue, green, red, and near infrared), which is far from the requirements of most cloud detection methods. In order to solve this problem, an improved and automated cloud detection method for Chinese satellite sensors called OCM (Object-oriented Cloud and cloud-shadow Matching method) is presented in this paper. It first modifies the Automatic Cloud Cover Assessment (ACCA) method, which was developed for Landsat-7 data, to get an initial cloud map. The modified ACCA method is mainly threshold-based, and different threshold settings produce different cloud maps: a strict threshold produces a cloud map with high confidence and a large amount of cloud omission, while a loose threshold produces a cloud map with low confidence and a large amount of commission. Secondly, a corresponding cloud-shadow map is produced using a threshold on the near-infrared band. Thirdly, the cloud maps and cloud-shadow map are converted to cloud objects and cloud-shadow objects. Cloud and cloud-shadow usually occur in pairs; consequently, the final cloud and cloud-shadow maps are made based on the relationship between cloud and cloud-shadow objects. The OCM method was tested using almost 200 HJ-1/CCD images across China, and the overall accuracy of cloud detection is close to 90%.
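The strict/loose two-threshold idea can be illustrated with a hysteresis-style mask in which low-confidence (loose-threshold) pixels are kept only when connected to a high-confidence (strict-threshold) seed. This is a simplified stand-in for OCM's object-based cloud/shadow matching, with illustrative values:

```python
import numpy as np
from collections import deque

def hysteresis_cloud_mask(brightness, strict=0.8, loose=0.6):
    """Two-threshold cloud mask: loose-threshold pixels survive only if
    4-connected to a high-confidence (strict-threshold) seed pixel."""
    seed = brightness >= strict
    cand = brightness >= loose
    mask = np.zeros_like(cand)
    q = deque(zip(*np.nonzero(seed)))      # start a flood fill at seeds
    while q:
        r, c = q.popleft()
        if mask[r, c] or not cand[r, c]:
            continue
        mask[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < cand.shape[0] and 0 <= cc < cand.shape[1]:
                q.append((rr, cc))
    return mask

img = np.array([[0.9, 0.7, 0.2],
                [0.3, 0.65, 0.2],
                [0.2, 0.2, 0.7]])
print(hysteresis_cloud_mask(img).astype(int))
# → [[1 1 0]
#    [0 1 0]
#    [0 0 0]]   (the isolated 0.7 pixel has no seed and is rejected)
```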

  11. Thresholds in chemical respiratory sensitisation.

    PubMed

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease, chemical respiratory allergy develops in two phases. In the first (induction) phase, exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is subsequently exposed to the same chemical allergen via inhalation. A secondary immune response will be provoked in the respiratory tract, resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article, attention is focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have a threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available, or clinical/epidemiological data, that permit the identification of such thresholds. This document briefly reviews relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens.
The main conclusion drawn is that although there is evidence that the acquisition of sensitisation to chemical respiratory allergens is a dose-related phenomenon, and that thresholds exist, it is frequently difficult to define accurate numerical values for threshold exposure levels. Nevertheless, based on occupational exposure data it may sometimes be possible to derive levels of exposure in the workplace which are safe. An additional observation is the current lack of suitable experimental methods for both routine hazard characterisation and the measurement of thresholds, and that such methods are still some way off. Given the current trajectory of toxicology, and the move towards the use of non-animal (in vitro and/or in silico) methods, there is a need to consider the development of alternative approaches for the identification and characterisation of respiratory sensitisation hazards, and for risk assessment. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. A fuzzy optimal threshold technique for medical images

    NASA Astrophysics Data System (ADS)

    Thirupathi Kannan, Balaji; Krishnasamy, Krishnaveni; Pradeep Kumar Kenny, S.

    2012-01-01

A new fuzzy-based thresholding method for medical images, especially cervical cytology images having blob and mosaic structures, is proposed in this paper. Many existing thresholding algorithms may segment either blob or mosaic images, but there is no single algorithm that can do both. In this paper, an input cervical cytology image is binarized and preprocessed, and the pixel value with the minimum Fuzzy Gaussian Index is identified as the optimal threshold value and used for segmentation. The proposed technique is tested on various cervical cytology images having blob or mosaic structures, compared with various existing algorithms, and shown to perform better than the existing algorithms.
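The abstract does not define the Fuzzy Gaussian Index, so the sketch below assumes one plausible form: Gaussian membership of each pixel to its class mean, scored by Shannon fuzziness, with the threshold chosen to minimize the index. Every name and formula here is an assumption for illustration only:

```python
import numpy as np

def fuzzy_gaussian_index(pixels, t, sigma=None):
    """Fuzziness of threshold t: Gaussian membership of each pixel to its
    class mean; memberships near 0.5 are maximally ambiguous. (Assumed
    form; the paper's actual index is not given in the abstract.)"""
    pixels = np.asarray(pixels, float)
    if sigma is None:
        sigma = pixels.std() + 1e-9
    lo, hi = pixels[pixels <= t], pixels[pixels > t]
    if len(lo) == 0 or len(hi) == 0:
        return np.inf
    mu = np.where(pixels <= t,
                  np.exp(-((pixels - lo.mean()) ** 2) / (2 * sigma ** 2)),
                  np.exp(-((pixels - hi.mean()) ** 2) / (2 * sigma ** 2)))
    mu = np.clip(mu, 1e-9, 1 - 1e-9)
    # Shannon-style fuzziness: high when memberships hover around 0.5.
    return float(-np.mean(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)))

def optimal_threshold(pixels):
    """Candidate thresholds are the pixel values (excluding the maximum);
    return the one minimizing the fuzziness index."""
    cands = np.unique(np.asarray(pixels, float))[:-1]
    return min(cands, key=lambda t: fuzzy_gaussian_index(pixels, t))

pix = [10, 12, 11, 90, 95, 92, 13, 94]  # two clear intensity clusters
print(optimal_threshold(pix))  # → 13.0
```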

  13. Effect of density of localized states on the ovonic threshold switching characteristics of the amorphous GeSe films

    NASA Astrophysics Data System (ADS)

    Ahn, Hyung-Woo; Seok Jeong, Doo; Cheong, Byung-ki; Lee, Hosuk; Lee, Hosun; Kim, Su-dong; Shin, Sang-Yeol; Kim, Donghwan; Lee, Suyoun

    2013-07-01

We investigated the effect of nitrogen (N) doping on the threshold voltage of an ovonic threshold switching (OTS) device using amorphous GeSe. Using spectroscopic ellipsometry, we found that the addition of N brought about significant changes in the electronic structure of GeSe, such as the density of localized states and the band gap energy. In addition, it was observed that the characteristics of OTS devices strongly depended on the N doping, which could be attributed to those changes in electronic structure, suggesting a method to modulate the threshold voltage of the device.

  14. Method for depositing layers of high quality semiconductor material

    DOEpatents

    Guha, Subhendu; Yang, Chi C.

    2001-08-14

    Plasma deposition of substantially amorphous semiconductor materials is carried out under a set of deposition parameters which are selected so that the process operates near the amorphous/microcrystalline threshold. This threshold varies as a function of the thickness of the depositing semiconductor layer; and, deposition parameters, such as diluent gas concentrations, must be adjusted as a function of layer thickness. Also, this threshold varies as a function of the composition of the depositing layer, and in those instances where the layer composition is profiled throughout its thickness, deposition parameters must be adjusted accordingly so as to maintain the amorphous/microcrystalline threshold.

  15. A Non-parametric Cutout Index for Robust Evaluation of Identified Proteins*

    PubMed Central

    Serang, Oliver; Paulo, Joao; Steen, Hanno; Steen, Judith A.

    2013-01-01

    This paper proposes a novel, automated method for evaluating sets of proteins identified using mass spectrometry. The remaining peptide-spectrum match score distributions of protein sets are compared to an empirical absent peptide-spectrum match score distribution, and a Bayesian non-parametric method reminiscent of the Dirichlet process is presented to accurately perform this comparison. Thus, for a given protein set, the process computes the likelihood that the proteins identified are correctly identified. First, the method is used to evaluate protein sets chosen using different protein-level false discovery rate (FDR) thresholds, assigning each protein set a likelihood. The protein set assigned the highest likelihood is used to choose a non-arbitrary protein-level FDR threshold. Because the method can be used to evaluate any protein identification strategy (and is not limited to mere comparisons of different FDR thresholds), we subsequently use the method to compare and evaluate multiple simple methods for merging peptide evidence over replicate experiments. The general statistical approach can be applied to other types of data (e.g. RNA sequencing) and generalizes to multivariate problems. PMID:23292186

  16. Evaluation of Mandarin Chinese Speech Recognition in Adults with Cochlear Implants Using the Spectral Ripple Discrimination Test

    PubMed Central

    Dai, Chuanfu; Zhao, Zeqi; Zhang, Duo; Lei, Guanxiong

    2018-01-01

Background The aim of this study was to explore the value of the spectral ripple discrimination test in speech recognition evaluation among a deaf (post-lingual) Mandarin-speaking population in China following cochlear implantation. Material/Methods The study included 23 Mandarin-speaking adult subjects with normal hearing (normal-hearing group) and 17 deaf adults who were former Mandarin speakers, with cochlear implants (cochlear implantation group). The normal-hearing subjects were divided into men (n=10) and women (n=13). The spectral ripple discrimination thresholds between the groups were compared. The correlation between spectral ripple discrimination thresholds and Mandarin speech recognition rates in the cochlear implantation group was studied. Results Spectral ripple discrimination thresholds did not correlate with age (r=−0.19; p=0.22), and there was no significant difference in spectral ripple discrimination thresholds between the male and female groups (p=0.654). Spectral ripple discrimination thresholds of deaf adults with cochlear implants were significantly correlated with monosyllabic recognition rates (r=0.84; p=0.000). Conclusions In a Mandarin Chinese-speaking population, spectral ripple discrimination thresholds of normal-hearing individuals were unaffected by both gender and age. Spectral ripple discrimination thresholds were correlated with Mandarin monosyllabic recognition rates in Mandarin-speaking post-lingual deaf adults with cochlear implants. The spectral ripple discrimination test is a promising method for speech recognition evaluation in adults following cochlear implantation in China. PMID:29806954

  17. Evaluation of Mandarin Chinese Speech Recognition in Adults with Cochlear Implants Using the Spectral Ripple Discrimination Test.

    PubMed

    Dai, Chuanfu; Zhao, Zeqi; Shen, Weidong; Zhang, Duo; Lei, Guanxiong; Qiao, Yuehua; Yang, Shiming

    2018-05-28

BACKGROUND The aim of this study was to explore the value of the spectral ripple discrimination test in speech recognition evaluation among a deaf (post-lingual) Mandarin-speaking population in China following cochlear implantation. MATERIAL AND METHODS The study included 23 Mandarin-speaking adult subjects with normal hearing (normal-hearing group) and 17 deaf adults who were former Mandarin speakers, with cochlear implants (cochlear implantation group). The normal-hearing subjects were divided into men (n=10) and women (n=13). The spectral ripple discrimination thresholds between the groups were compared. The correlation between spectral ripple discrimination thresholds and Mandarin speech recognition rates in the cochlear implantation group was studied. RESULTS Spectral ripple discrimination thresholds did not correlate with age (r=-0.19; p=0.22), and there was no significant difference in spectral ripple discrimination thresholds between the male and female groups (p=0.654). Spectral ripple discrimination thresholds of deaf adults with cochlear implants were significantly correlated with monosyllabic recognition rates (r=0.84; p=0.000). CONCLUSIONS In a Mandarin Chinese-speaking population, spectral ripple discrimination thresholds of normal-hearing individuals were unaffected by both gender and age. Spectral ripple discrimination thresholds were correlated with Mandarin monosyllabic recognition rates in Mandarin-speaking post-lingual deaf adults with cochlear implants. The spectral ripple discrimination test is a promising method for speech recognition evaluation in adults following cochlear implantation in China.

  18. Examination of a Method to Determine the Reference Region for Calculating the Specific Binding Ratio in Dopamine Transporter Imaging.

    PubMed

    Watanabe, Ayumi; Inoue, Yusuke; Asano, Yuji; Kikuchi, Kei; Miyatake, Hiroki; Tokushige, Takanobu

    2017-01-01

The specific binding ratio (SBR) was first reported by Tossici-Bolt et al. as a quantitative indicator for dopamine transporter (DAT) imaging. It is defined as the ratio of the specific binding concentration of the striatum to the non-specific binding concentration of the whole brain other than the striatum. The non-specific binding concentration is calculated based on a region of interest (ROI) set 20 mm inside the outer contour, which is defined by a threshold technique. Tossici-Bolt et al. used a 50% threshold, but with a 50% threshold we sometimes could not define the ROI for the non-specific binding concentration (the reference region) and calculate the SBR appropriately. Therefore, we sought a new method for determining the reference region when calculating the SBR. We used data from 20 patients who had undergone DAT imaging in our hospital to calculate the non-specific binding concentration by two methods: fixing the threshold that defines the reference region at specific values (the fixing method), and visually optimizing the reference region at every examination (the visual optimization method). First, we assessed the reference region of each method visually, and afterward we quantitatively compared the SBR calculated by each method. In the visual assessment, the scores of the fixing method at 30% and the visual optimization method were higher than the scores of the fixing method at other values, with or without scatter correction. In the quantitative assessment, the SBR obtained by visual optimization of the reference region, based on the consensus of three radiological technologists, was used as a baseline (the standard method). The values of SBR showed good agreement between the standard method and both the fixing method at 30% and the visual optimization method, with or without scatter correction. Therefore, the fixing method at 30% and the visual optimization method were equally suitable for determining the reference region.
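The SBR definition quoted above, together with the fixed-fraction ("fixing method") contour threshold, can be sketched as follows (illustrative names and numbers; the 20 mm inward offset from the outer contour is omitted):

```python
import numpy as np

def specific_binding_ratio(striatal_conc, reference_conc):
    """SBR as defined above: specific striatal binding (striatal minus
    non-specific concentration) over the non-specific concentration."""
    return (striatal_conc - reference_conc) / reference_conc

def reference_region(counts, threshold_frac=0.30):
    """Candidate reference-region mask: voxels at or above a fixed
    fraction of the image maximum (the fixing method at 30%)."""
    counts = np.asarray(counts, float)
    return counts >= threshold_frac * counts.max()

print(specific_binding_ratio(60.0, 12.0))  # → 4.0
print(reference_region([100, 40, 20]))     # → [ True  True False]
```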

  19. Decision curve analysis: a novel method for evaluating prediction models.

    PubMed

    Vickers, Andrew J; Elkin, Elena B

    2006-01-01

    Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities over which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies, with advantages over other commonly used measures and techniques.
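    The net-benefit quantity at the heart of decision curve analysis can be sketched as follows (a minimal illustration with invented data, not the authors' code):

```python
import numpy as np

def net_benefit(y_true, y_prob, pt):
    """Net benefit of treating patients whose predicted probability of the
    event is at least the threshold probability pt."""
    y_true = np.asarray(y_true)
    treat = np.asarray(y_prob) >= pt
    n = len(y_true)
    tp = np.sum(treat & (y_true == 1))   # treated patients who had the event
    fp = np.sum(treat & (y_true == 0))   # treated patients who did not
    # False positives are weighted by the odds at the threshold probability,
    # which encodes how the patient trades off the two kinds of error
    return tp / n - fp / n * pt / (1 - pt)

# Plotting net_benefit against pt for each model (plus "treat all" and
# "treat none" strategies) yields the decision curve.
nb = net_benefit([1, 1, 0, 0], [0.8, 0.6, 0.7, 0.2], 0.5)
```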

  20. A robustness test of the braided device foreshortening algorithm

    NASA Astrophysics Data System (ADS)

    Moyano, Raquel Kale; Fernandez, Hector; Macho, Juan M.; Blasco, Jordi; San Roman, Luis; Narata, Ana Paula; Larrabide, Ignacio

    2017-11-01

    Different computational methods have recently been proposed to simulate the virtual deployment of a braided stent inside a patient's vasculature. These methods are primarily based on segmentation of the region of interest to obtain local vessel morphology descriptors. The goal of this work is to evaluate the influence of segmentation quality on the method named "Braided Device Foreshortening" (BDF). METHODS: We used the 3DRA images of 10 patients with aneurysms (cases). Each case was segmented by applying a marching cubes algorithm with a broad range of thresholds to generate 10 surface models per case. We selected a braided device and applied the BDF algorithm to each surface model. The range of computed flow diverter lengths for each case was obtained to quantify the variability of the method against the segmentation threshold values. RESULTS: An evaluation over the 10 clinical cases indicates that the final length of the deployed flow diverter in each vessel model is stable, yielding a maximum difference of 11.19% in vessel diameter and of 9.14% in the simulated stent length across the threshold values. The average coefficient of variation was 4.08%. CONCLUSION: A study evaluating how the segmentation threshold affects the simulated length of the deployed flow diverter was presented. The segmentation algorithm used for intracranial aneurysm 3D angiography images produces only small variation in the resulting stent simulation.

  1. Drug Adverse Event Detection in Health Plan Data Using the Gamma Poisson Shrinker and Comparison to the Tree-based Scan Statistic

    PubMed Central

    Brown, Jeffrey S.; Petronis, Kenneth R.; Bate, Andrew; Zhang, Fang; Dashevsky, Inna; Kulldorff, Martin; Avery, Taliser R.; Davis, Robert L.; Chan, K. Arnold; Andrade, Susan E.; Boudreau, Denise; Gunter, Margaret J.; Herrinton, Lisa; Pawloski, Pamala A.; Raebel, Marsha A.; Roblin, Douglas; Smith, David; Reynolds, Robert

    2013-01-01

    Background: Drug adverse event (AE) signal detection using the Gamma Poisson Shrinker (GPS) is commonly applied in spontaneous reporting. AE signal detection using large observational health plan databases can expand medication safety surveillance. Methods: Using data from nine health plans, we conducted a pilot study to evaluate the implementation and findings of the GPS approach for two antifungal drugs, terbinafine and itraconazole, and two diabetes drugs, pioglitazone and rosiglitazone. We evaluated 1676 diagnosis codes grouped into 183 different clinical concepts and four levels of granularity. Several signaling thresholds were assessed. GPS results were compared to findings from a companion study using the identical analytic dataset but an alternative statistical method—the tree-based scan statistic (TreeScan). Results: We identified 71 statistical signals across two signaling thresholds and two methods, including closely-related signals of overlapping diagnosis definitions. Initial review found that most signals represented known adverse drug reactions or confounding. About 31% of signals met the highest signaling threshold. Conclusions: The GPS method was successfully applied to observational health plan data in a distributed data environment as a drug safety data mining method. There was substantial concordance between the GPS and TreeScan approaches. Key method implementation decisions relate to defining exposures and outcomes and informed choice of signaling thresholds. PMID:24300404

  2. Sentinel lymph node detection by an optical method using scattered photons

    PubMed Central

    Tellier, Franklin; Ravelo, Rasata; Simon, Hervé; Chabrier, Renée; Steibel, Jérôme; Poulet, Patrick

    2010-01-01

    We present a new near-infrared optical probe for sentinel lymph node detection, based on the recording of scattered photons. A two-wavelength setup was developed to improve the detection threshold of an injected dye, Patent Blue V. The method consists of modulating each laser diode at a given frequency; a Fast Fourier Transform of the recorded signal separates the two components, and the signal amplitudes are used to compute the relative Patent Blue V concentration. Results on the probe using phantom models and small-animal experiments exhibit a sensitivity threshold of 3.2 µmol/L, thirtyfold better than the threshold visible to the eye. PMID:21258517
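    The frequency-multiplexing idea can be illustrated with synthetic data; the sampling rate, modulation frequencies, and amplitudes below are hypothetical, not the probe's actual parameters.

```python
import numpy as np

fs = 10_000.0                       # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)
f1, f2 = 130.0, 170.0               # modulation frequencies of the two diodes
a1, a2 = 1.0, 0.4                   # detected amplitudes at the two wavelengths

# The detector sees the sum of both modulated signals
signal = a1 * np.sin(2 * np.pi * f1 * t) + a2 * np.sin(2 * np.pi * f2 * t)

# FFT separates the components; read each amplitude off its own frequency bin
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
amp = 2 * np.abs(spectrum) / len(t)     # single-sided amplitude spectrum

amp1 = amp[np.argmin(np.abs(freqs - f1))]
amp2 = amp[np.argmin(np.abs(freqs - f2))]
ratio = amp2 / amp1                     # proxy for relative dye concentration
```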

  3. Total protein of whole saliva as a biomarker of anaerobic threshold.

    PubMed

    Bortolini, Miguel Junior Sordi; De Agostini, Guilherme Gularte; Reis, Ismair Teodoro; Lamounier, Romeu Paulo Martins Silva; Blumberg, Jeffrey B; Espindola, Foued Salmen

    2009-09-01

    Saliva provides a convenient and noninvasive matrix for assessing specific physiological parameters, including some biomarkers of exercise. We investigated whether the total protein concentration of whole saliva (TPWS) would reflect the anaerobic threshold during an incremental exercise test. After a warm-up period, 13 nonsmoking men performed a maximal incremental exercise test on a cycle ergometer. Blood and stimulated saliva were collected during the test. The TPWS anaerobic threshold (PAT) was determined using the Dmax method. The PAT was correlated with the blood lactate anaerobic threshold (AT; r = .93, p < .05). No significant difference (p = .16) was observed between PAT and AT. Thus, TPWS provides a convenient and noninvasive means for determining the anaerobic threshold during incremental exercise tests.
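    The Dmax idea can be sketched as follows: the threshold is the measurement lying farthest, in the perpendicular sense, from the chord joining the first and last points of the concentration curve. This is a minimal version with invented data; published protocols usually fit a smooth curve to the measurements first.

```python
import numpy as np

def dmax_threshold(workload, value):
    """Return the workload at the point with the largest perpendicular
    distance from the straight line joining the first and last samples."""
    x, y = np.asarray(workload, float), np.asarray(value, float)
    p0, p1 = np.array([x[0], y[0]]), np.array([x[-1], y[-1]])
    d = p1 - p0                                   # chord direction
    # Perpendicular distance of each point from the chord (cross product)
    dist = np.abs(d[0] * (p0[1] - y) - d[1] * (p0[0] - x)) / np.hypot(*d)
    return x[np.argmax(dist)]
```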

  4. Sub-Volumetric Classification and Visualization of Emphysema Using a Multi-Threshold Method and Neural Network

    NASA Astrophysics Data System (ADS)

    Tan, Kok Liang; Tanaka, Toshiyuki; Nakamura, Hidetoshi; Shirahata, Toru; Sugiura, Hiroaki

    Chronic Obstructive Pulmonary Disease is a disease in which the airways and tiny air sacs (alveoli) inside the lung are partially obstructed or destroyed. Emphysema occurs as more and more of the walls between air sacs are destroyed. The goal of this paper is to produce a more practical emphysema-quantification algorithm that correlates more highly with the parameters of pulmonary function tests than classical methods do. The use of thresholds ranging from approximately -900 Hounsfield Units to -990 Hounsfield Units for extracting emphysema from CT has been reported in many papers. Our experiments show that a threshold that is optimal for one CT data set might not be optimal for others, due to subtle radiographic variations in the CT images. Consequently, we propose a multi-threshold method that utilizes ten thresholds between and including -900 and -990 Hounsfield Units for identifying the different potential emphysematous regions in the lung. We then divide the lung into eight sub-volumes. From each sub-volume, we calculate the ratio of voxels with intensity below each threshold. The respective ratios of voxels below the ten thresholds are employed as the features for classifying the sub-volumes into four emphysema severity classes. A neural network is used as the classifier, trained on 80 training sub-volumes. The performance of the classifier is assessed by classifying 248 test sub-volumes of the lung obtained from 31 subjects. Actual diagnoses of the sub-volumes are hand-annotated and consensus-classified by radiologists. The four-class classification accuracy of the proposed method is 89.82%. The sub-volumetric classification results produced in this study encompass not only the emphysema severity but also the distribution of emphysema severity from the top to the bottom of the lung. We hypothesize that besides emphysema severity, the distribution of emphysema severity in the lung also plays an important role in the assessment of the overall functionality of the lung. We confirm our hypothesis by showing that the proposed sub-volumetric classification results correlate with the parameters of pulmonary function tests better than classical methods. We also visualize emphysema using a technique called the transparent lung model.
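    The feature-extraction step, the fraction of sub-volume voxels below each of the ten thresholds, can be sketched as follows (the array shape and random values are synthetic placeholders for real CT sub-volumes):

```python
import numpy as np

# Ten thresholds between -900 HU and -990 HU, as in the paper
thresholds = np.linspace(-900, -990, 10)

def emphysema_features(subvolume_hu):
    """Fraction of voxels below each threshold: a 10-element feature vector
    fed to the severity classifier."""
    v = subvolume_hu.ravel()
    return np.array([(v < t).mean() for t in thresholds])

# Hypothetical 16x16x16 sub-volume with uniformly distributed HU values
rng = np.random.default_rng(0)
sub = rng.uniform(-1000, -700, size=(16, 16, 16))
features = emphysema_features(sub)
```

    Because the thresholds decrease from -900 to -990 HU, the feature vector is monotonically non-increasing: fewer voxels fall below each successively lower threshold.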

  5. Rainfall thresholds as a landslide indicator for engineered slopes on the Irish Rail network

    NASA Astrophysics Data System (ADS)

    Martinović, Karlo; Gavin, Kenneth; Reale, Cormac; Mangan, Cathal

    2018-04-01

    Rainfall thresholds express the minimum levels of rainfall that must be reached or exceeded for landslides to occur in a particular area. They are a common tool for expressing the temporal component of landslide hazard analysis. Numerous rainfall thresholds have been developed for different areas worldwide; however, none of them focuses on landslides occurring on the engineered slopes of transport infrastructure networks. This paper uses an empirical method to develop rainfall thresholds for landslides on the earthworks of the Irish Rail network. For comparison, rainfall thresholds are also developed for natural terrain in Ireland. The results show that thresholds involving relatively low rainfall intensities are applicable for Ireland, owing to its specific climate. Furthermore, the comparison shows that rainfall thresholds for engineered slopes are lower than those for landslides on natural terrain. This has serious implications, as it indicates a significant risk in using generic weather alerts (developed largely for natural terrain) for infrastructure management, and it highlights the need to develop railway- and road-specific rainfall thresholds for landslides.

  6. Electron-Atom Ionization Calculations using Propagating Exterior Complex Scaling

    NASA Astrophysics Data System (ADS)

    Bartlett, Philip

    2007-10-01

    The exterior complex scaling method (Science 286 (1999) 2474), pioneered by Rescigno, McCurdy and coworkers, provided highly accurate ab initio solutions for electron-hydrogen collisions by directly solving the time-independent Schrödinger equation in coordinate space. An extension of this method, propagating exterior complex scaling (PECS), was developed by Bartlett and Stelbovics (J. Phys. B 37 (2004) L69, J. Phys. B 39 (2006) R379) and has been demonstrated to provide computationally efficient and accurate calculations of ionization and scattering cross sections over a large range of energies below, above and near the ionization threshold. An overview of the PECS method for three-body collisions and the computational advantages of its propagation and iterative coupling techniques will be presented along with results of: (1) near-threshold ionization of electron-hydrogen collisions and the Wannier threshold laws, (2) scattering cross section resonances below the ionization threshold, and (3) total and differential cross sections for electron collisions with excited targets and hydrogenic ions from low through to high energies. Recently, the PECS method has been extended to solve four-body collisions using time-independent methods in coordinate space and has initially been applied to the s-wave model for electron-helium collisions. A description of the extensions made to the PECS method to facilitate these significantly more computationally demanding calculations will be given, and results will be presented for elastic, single-excitation, double-excitation, single-ionization and double-ionization collisions.

  7. Meta-analysis of diagnostic accuracy studies in mental health

    PubMed Central

    Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J

    2015-01-01

    Objectives To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies --such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model --jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
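    The threshold trade-off described above can be seen in a toy example: raising the positivity cutoff lowers sensitivity and raises specificity. The scores below are invented for illustration only.

```python
import numpy as np

# Hypothetical questionnaire scores for people with and without the condition
diseased = np.array([6, 7, 8, 9, 9, 10])
healthy = np.array([2, 3, 4, 5, 6, 7])

results = {}
for cutoff in (5, 7, 9):
    sensitivity = (diseased >= cutoff).mean()   # positives among the diseased
    specificity = (healthy < cutoff).mean()     # negatives among the healthy
    results[cutoff] = (sensitivity, specificity)
```

    Plotting these (sensitivity, 1 - specificity) pairs across all cutoffs traces the ROC curve that the bivariate and HSROC models summarise across studies.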

  8. Methods, apparatus and system for notification of predictable memory failure

    DOEpatents

    Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2017-01-03

    A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.

  9. Self-adjusting threshold mechanism for pixel detectors

    NASA Astrophysics Data System (ADS)

    Heim, Timon; Garcia-Sciveres, Maurice

    2017-09-01

    Readout chips of hybrid pixel detectors use a low-power amplifier and threshold discrimination to process charge deposited in semiconductor sensors. Due to transistor mismatch, each pixel circuit needs to be calibrated individually to achieve response uniformity. Traditionally this is addressed by programmable threshold trimming in each pixel, which must be robust against radiation effects, temperature, and time. In this paper a self-adjusting threshold mechanism is presented, which corrects the threshold for both spatial inequality and time variation and maintains a constant response. It exploits the electrical noise as a relative measure of the threshold and automatically adjusts the threshold of each pixel to always achieve a uniform frequency of noise hits. A digital implementation of the method in the form of an up/down counter and a combinatorial logic filter is presented. The behavior of this circuit has been simulated to evaluate its performance and compare it to traditional calibration results. The simulation results show that this mechanism can perform equally well, but eliminates instability over time and is immune to single-event upsets.
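    A toy simulation of the up/down idea (all circuit parameters below are invented, not taken from the paper): the threshold leaks down a small step every cycle and jumps up on every noise hit, so it settles where the noise-hit frequency equals the ratio of the two steps.

```python
import random

random.seed(1)

def noise_hit(threshold_mv, baseline_mv=0.0, noise_rms_mv=5.0):
    """Hypothetical pixel front end: a noise hit occurs when Gaussian
    electrical noise crosses the discriminator threshold."""
    return random.gauss(baseline_mv, noise_rms_mv) > threshold_mv

threshold = 40.0                   # starting threshold (mV), deliberately high
down_step, up_step = 0.01, 1.0     # step ratio sets the target hit frequency

for _ in range(200_000):
    threshold -= down_step         # constant downward leak
    if noise_hit(threshold):
        threshold += up_step       # each noise hit pushes the threshold up
```

    In equilibrium the hit probability per cycle approaches down_step / up_step = 1%, which for Gaussian noise of 5 mV RMS corresponds to a threshold near 2.3 sigma, about 12 mV, regardless of the starting value.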

  10. Threshold network of a financial market using the P-value of correlation coefficients

    NASA Astrophysics Data System (ADS)

    Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun

    2015-06-01

    Threshold methods in financial networks are important tools for extracting information about the state of a market. Previously, absolute thresholds on correlation coefficients have been used; however, they bear no relation to the length of the time window. We assign a threshold value that depends on the size of the time window by using the P-value concept from statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties such as edge density, clustering coefficient, assortativity coefficient, and modularity. We find that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information depends on the length of the time window used when constructing the TN. We apply the same technique to the Standard and Poor's 500 (S&P500) and observe similar results.
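    The window-size dependence can be made explicit with the standard t-test for a Pearson correlation: at a fixed P-value, the critical |r| shrinks as the window grows. This is a sketch of the general statistical relationship, not the authors' code.

```python
from scipy import stats

def critical_correlation(window_size, alpha=0.01):
    """Smallest |r| significant at level alpha for a window of n returns,
    from the t-test for a Pearson correlation with n - 2 degrees of freedom."""
    df = window_size - 2
    t_c = stats.t.ppf(1 - alpha / 2, df)
    return t_c / (df + t_c ** 2) ** 0.5

# The same significance level implies a much lower correlation threshold
# for a longer time window:
r_short = critical_correlation(20)    # roughly 0.56
r_long = critical_correlation(250)    # roughly 0.16
```

    Using a fixed absolute threshold for both window sizes therefore applies very different effective significance levels, which motivates the P-value-based thresholding used in the paper.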

  11. Threshold units: A correct metric for reaction time?

    PubMed Central

    Zele, Andrew J.; Cao, Dingcai; Pokorny, Joel

    2007-01-01

    Purpose To compare reaction time (RT) to rod incremental and decremental stimuli expressed in physical contrast units or psychophysical threshold units. Methods Rod contrast detection thresholds and suprathreshold RTs were measured for Rapid-On and Rapid-Off ramp stimuli. Results Threshold sensitivity to Rapid-Off stimuli was higher than to Rapid-On stimuli. Suprathreshold RTs specified in Weber contrast for Rapid-Off stimuli were shorter than for Rapid-On stimuli. Reaction time data expressed in multiples of threshold reversed the outcomes: Reaction times for Rapid-On stimuli were shorter than those for Rapid-Off stimuli. The use of alternative contrast metrics also failed to equate RTs. Conclusions A case is made that the interpretation of RT data may be confounded when expressed in threshold units. Stimulus energy or contrast is the only metric common to the response characteristics of the cells underlying speeded responses. The use of threshold metrics for RT can confuse the interpretation of an underlying physiological process. PMID:17240416

  12. Watershed safety and quality control by safety threshold method

    NASA Astrophysics Data System (ADS)

    Da-Wei Tsai, David; Mengjung Chou, Caroline; Ramaraj, Rameshprabu; Liu, Wen-Cheng; Honglay Chen, Paris

    2014-05-01

    Taiwan has been identified as one of the countries most at risk by the IPCC and the World Bank. On such a perilous island, we launched strategic research on land-use management for catastrophe prevention and environmental protection. This study applied watershed management by the "safety threshold method" to restore watersheds and to prevent disasters and pollution on the island. For flood prevention, the restoration strategy reduced total runoff by an amount equivalent to 59.4% of the annual infiltration. For sediment management, safety threshold management could reduce sediment below the equilibrium of the natural sediment cycle. For water quality, the best strategies produced significant total load reductions of 10% in carbon (BOD5), 15% in nitrogen (nitrate) and 9% in phosphorus (TP). We found that water quality could meet the BOD target with a 50% peak reduction under management. All the simulations demonstrated that the safety threshold method helps keep loadings within the safe range for both disasters and environmental quality. Moreover, the historical data for the whole island show that past deforestation policy and misguided economic projects were the prime culprits. Consequently, this study presents a practical method for managing both disasters and pollution at the watershed scale through land-use management.

  13. On plant detection of intact tomato fruits using image analysis and machine learning methods.

    PubMed

    Yamamoto, Kyosuke; Guo, Wei; Yoshioka, Yosuke; Ninomiya, Seishi

    2014-07-09

    Fully automated yield estimation of intact fruits prior to harvesting provides various benefits to farmers. Until now, several studies have been conducted to estimate fruit yield using image-processing technologies. However, most of these techniques require thresholds for features such as color, shape and size. In addition, their performance strongly depends on the thresholds used, although optimal thresholds tend to vary with images. Furthermore, most of these techniques have attempted to detect only mature and immature fruits, although the number of young fruits is more important for the prediction of long-term fluctuations in yield. In this study, we aimed to develop a method to accurately detect individual intact tomato fruits including mature, immature and young fruits on a plant using a conventional RGB digital camera in conjunction with machine learning approaches. The developed method did not require an adjustment of threshold values for fruit detection from each image because image segmentation was conducted based on classification models generated in accordance with the color, shape, texture and size of the images. The results of fruit detection in the test images showed that the developed method achieved a recall of 0.80, while the precision was 0.88. The recall values of mature, immature and young fruits were 1.00, 0.80 and 0.78, respectively.

  14. An Active Contour Model Based on Adaptive Threshold for Extraction of Cerebral Vascular Structures.

    PubMed

    Wang, Jiaxin; Zhao, Shifeng; Liu, Zifeng; Tian, Yun; Duan, Fuqing; Pan, Yutong

    2016-01-01

    Cerebral vessel segmentation is essential for clinical diagnosis and related research. However, automatic segmentation of brain vessels remains challenging because of variable vessel shapes and the high complexity of vessel geometry. This study proposes a new active contour model (ACM), implemented by the level-set method, for segmenting vessels from TOF-MRA data. The energy function of the new model, combining both region intensity and boundary information, is composed of two region terms, one boundary term, and one penalty term. A global threshold, representing the lower gray-level boundary of the target object obtained by maximum intensity projection (MIP), is defined in the first region term and is used to guide the segmentation of the thick vessels. In the second term, a dynamic intensity threshold is employed to extract the tiny vessels. The boundary term drives the contours to evolve towards boundaries with high gradients, and the penalty term avoids reinitialization of the level-set function. Experimental results on 10 clinical brain data sets demonstrate that our method not only achieves a better Dice similarity coefficient than the global-threshold-based method and the localized hybrid level-set method but also extracts whole cerebral vessel trees, including the thin vessels.

  15. SU-C-BRE-07: Sensitivity Analysis of the Threshold Energy for the Creation of Strand Breaks and of Single and Double Strand Break Clustering Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pater, P

    Purpose: To analyse the sensitivity of the creation of strand breaks (SB) to the threshold energy (Eth) and thresholding method, and to quantify the impact of clustering conditions on single strand break (SSB) and double strand break (DSB) yields. Methods: Monte Carlo simulations using Geant4-DNA were conducted for electron tracks of 280 eV to 220 keV in a geometrical DNA model composed of nucleosomes of 396 phospho-diester groups (PDGs) each. A strand break was created inside a PDG when the sum of all energy deposits (method 1) or energy transfers (method 2) was higher than Eth, or when at least one interaction deposited (method 3) or transferred (method 4) an energy higher than Eth. SBs were then clustered into SSBs and DSBs using clustering scoring criteria from the literature and compared to our own. Results: The total number of SBs decreases as Eth is increased. In addition, thresholding on the energy transfers (methods 2 and 4) produces a higher SB count than thresholding on energy deposits (methods 1 and 3). Method 2 produces a step-like function and should be avoided when attempting to optimize Eth. When SBs are grouped into damage patterns, clustering conditions can underestimate SSBs by up to 18% and overestimate DSBs by up to 12% compared to our own implementation. Conclusion: We show that two often underreported simulation parameters have a non-negligible effect on overall DNA damage yields. First, more SBs are counted when thresholding on energy transfers to the PDG rather than on energy deposits. Also, SBs grouped according to different clustering conditions can change reported SSB and DSB yields by as much as 20%. Careful handling of these parameters is required when comparing DNA damage yields from different authors. Research funding from the governments of Canada and Quebec. PP acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290).
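    The four thresholding methods can be sketched for a single phospho-diester group as follows. The energy values are invented for illustration; in the simulations they come from Geant4-DNA electron tracks.

```python
def strand_break(deposits, transfers, eth, method):
    """Toy version of the four SB thresholding methods for one PDG.
    deposits/transfers are per-interaction energies (eV, hypothetical)."""
    if method == 1:
        return sum(deposits) > eth            # summed energy deposits
    if method == 2:
        return sum(transfers) > eth           # summed energy transfers
    if method == 3:
        return any(e > eth for e in deposits)   # any single deposit
    if method == 4:
        return any(e > eth for e in transfers)  # any single transfer
    raise ValueError(f"unknown method {method}")

# Transfers include energy that later escapes the volume, so they are never
# smaller than the deposits; methods 2 and 4 therefore count at least as
# many SBs as methods 1 and 3, consistent with the paper's observation.
deposits = [4.0, 6.0]
transfers = [5.0, 8.0]
hit_by_method = {m: strand_break(deposits, transfers, 10.5, m) for m in (1, 2, 3, 4)}
```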

  16. Results of FM-TV threshold reduction investigation for the ATS F trust experiment

    NASA Technical Reports Server (NTRS)

    Brown, J. P.

    1972-01-01

    An investigation of threshold effects in FM TV was initiated to determine whether any simple, low-cost techniques were available that could reduce the subjective video threshold, applicable to low-cost community TV reception via satellite. Two methods of eliminating these effects were examined: the use of standard video pre-emphasis, and the use of an additional circuit to blank the picture tube during the retrace period.

  17. Threshold of transverse mode coupling instability with arbitrary space charge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balbekov, V.

    The threshold of the transverse mode coupling instability is calculated in the framework of the square well model at an arbitrary value of the space charge tune shift. A new method of calculation is developed beyond the traditional expansion technique. Square, resistive, and exponential wakes are investigated. It is shown that the instability threshold rises indefinitely as the tune shift increases. Finally, a comparison with the conventional case of the parabolic potential well is performed.

  18. Threshold of transverse mode coupling instability with arbitrary space charge

    DOE PAGES

    Balbekov, V.

    2017-11-30

    The threshold of the transverse mode coupling instability is calculated in the framework of the square well model at an arbitrary value of the space charge tune shift. A new method of calculation is developed beyond the traditional expansion technique. Square, resistive, and exponential wakes are investigated. It is shown that the instability threshold rises indefinitely as the tune shift increases. Finally, a comparison with the conventional case of the parabolic potential well is performed.

  19. Effect of microgravity on visual contrast threshold during STS Shuttle missions: Visual Function Tester-Model 2 (VFT-2)

    NASA Technical Reports Server (NTRS)

    Oneal, Melvin R.; Task, H. Lee; Genco, Louis V.

    1992-01-01

    Viewgraphs on the effect of microgravity on visual contrast threshold during STS Shuttle missions are presented. The purpose, methods, and results are discussed. The Visual Function Tester Model 2 (VFT-2) is used.

  20. On-Orbit Reconfigurable Solar Array

    NASA Technical Reports Server (NTRS)

    Levy, Robert K. (Inventor)

    2017-01-01

    In one or more embodiments, the present disclosure teaches a method for reconfiguring a solar array. The method involves providing, for the solar array, at least one string of solar cells. The method further involves deactivating at least a portion of at least one of the strings of solar cells of the solar array when power produced by the solar array reaches a maximum power allowance threshold. In addition, the method involves activating at least a portion of at least one of the strings of the solar cells in the solar array when the power produced by the solar array reaches a minimum power allowance threshold.
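    The activate/deactivate logic described above can be sketched as a simple hysteresis controller. This is an illustrative reading of the patent's claim, with invented parameter names and power values, not the patented implementation.

```python
def reconfigure(strings_active, total_strings, power_w, p_max_w, p_min_w):
    """Deactivate a string when produced power reaches the maximum power
    allowance threshold; reactivate one when it falls to the minimum."""
    if power_w >= p_max_w and strings_active > 0:
        return strings_active - 1     # shed one string to limit power
    if power_w <= p_min_w and strings_active < total_strings:
        return strings_active + 1     # bring a string back online
    return strings_active             # inside the band: no change

# Hypothetical array of 10 strings with 1000 W / 400 W allowance thresholds
state = reconfigure(8, 10, 1200.0, 1000.0, 400.0)
```

    The gap between the two thresholds prevents the controller from oscillating when the produced power hovers near a single setpoint.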

  1. Sampling Based Influence Maximization on Linear Threshold Model

    NASA Astrophysics Data System (ADS)

    Jia, Su; Chen, Ling

    2018-04-01

    A sampling-based influence maximization method for the linear threshold (LT) model is presented. The method samples routes in the possible worlds of the social network and uses the Chernoff bound to estimate the number of samples needed so that the error is constrained within a given bound. The activation probabilities of the routes in the possible worlds are then calculated and used to compute the influence spread of each node in the network. Our experimental results show that our method can effectively select an appropriate seed node set that spreads larger influence than other similar methods.
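    The sample-count step can be sketched with the standard Hoeffding-type form of the Chernoff bound (the paper's exact bound may differ): to estimate a probability within additive error epsilon with failure probability at most delta, the number of sampled possible worlds must satisfy the following.

```python
import math

def samples_needed(epsilon, delta):
    """Hoeffding/Chernoff-style bound: number of samples so that the
    empirical estimate of a probability deviates from its true value by
    more than epsilon with probability at most delta."""
    return math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))
```

    Tightening either the error bound or the confidence raises the required sample count, with the quadratic dependence on epsilon dominating.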

  2. Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices

    DOEpatents

    Chassin, David P [Pasco, WA; Donnelly, Matthew K [Kennewick, WA; Dagle, Jeffery E [Richland, WA

    2011-12-06

    Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices are described. In one aspect, an electrical power distribution control method includes providing electrical energy from an electrical power distribution system, applying the electrical energy to a load, providing a plurality of different values for a threshold at a plurality of moments in time and corresponding to an electrical characteristic of the electrical energy, and adjusting an amount of the electrical energy applied to the load responsive to an electrical characteristic of the electrical energy triggering one of the values of the threshold at the respective moment in time.

  3. Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices

    DOEpatents

    Chassin, David P.; Donnelly, Matthew K.; Dagle, Jeffery E.

    2006-12-12

    Electrical power distribution control methods, electrical energy demand monitoring methods, and power management devices are described. In one aspect, an electrical power distribution control method includes providing electrical energy from an electrical power distribution system, applying the electrical energy to a load, providing a plurality of different values for a threshold at a plurality of moments in time and corresponding to an electrical characteristic of the electrical energy, and adjusting an amount of the electrical energy applied to the load responsive to an electrical characteristic of the electrical energy triggering one of the values of the threshold at the respective moment in time.

  4. Thresher: an improved algorithm for peak height thresholding of microbial community profiles.

    PubMed

    Starke, Verena; Steele, Andrew

    2014-11-15

    This article presents Thresher, an improved technique for finding peak height thresholds for automated rRNA intergenic spacer analysis (ARISA) profiles. We argue that thresholds must be sample dependent, taking community richness into account. In most previous fragment analyses, a common threshold is applied to all samples simultaneously, ignoring richness variations among samples and thereby compromising cross-sample comparison. Our technique solves this problem, and at the same time provides a robust method for outlier rejection, selecting for removal any replicate pairs that are not valid replicates. Thresholds are calculated individually for each replicate in a pair, and separately for each sample. The thresholds are selected to be the ones that minimize the dissimilarity between the replicates after thresholding. If a choice of threshold results in the two replicates in a pair failing a quantitative test of similarity, either that threshold or that sample must be rejected. We compare thresholded ARISA results with sequencing results, and demonstrate that the Thresher algorithm outperforms conventional thresholding techniques. The software is implemented in R, and the code is available at http://verenastarke.wordpress.com or by contacting the author (vstarke@ciw.edu). Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
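The core idea of choosing per-replicate thresholds that minimize post-thresholding dissimilarity can be sketched as a grid search with Bray-Curtis dissimilarity. This is a simplification: the published algorithm and its outlier-rejection test are more involved, and the dissimilarity measure here is an assumption.

```python
def bray_curtis(a, b):
    # Bray-Curtis dissimilarity between two equal-length abundance profiles.
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den if den else 0.0

def apply_threshold(profile, thr):
    # Zero out peaks below the peak-height threshold.
    return [h if h >= thr else 0.0 for h in profile]

def best_thresholds(rep1, rep2, candidates):
    """Grid-search the per-replicate threshold pair that minimizes
    dissimilarity between the two replicates after thresholding."""
    best = None
    for t1 in candidates:
        for t2 in candidates:
            d = bray_curtis(apply_threshold(rep1, t1),
                            apply_threshold(rep2, t2))
            if best is None or d < best[0]:
                best = (d, t1, t2)
    return best
```

If even the best threshold pair leaves the replicates dissimilar beyond a chosen cutoff, the sample would be rejected as an invalid replicate pair.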

  5. Standardised method of determining vibratory perception thresholds for diagnosis and screening in neurological investigation.

    PubMed Central

    Goldberg, J M; Lindblom, U

    1979-01-01

    Vibration threshold determinations were made by means of an electromagnetic vibrator at three sites (carpal, tibial, and tarsal), which were primarily selected for examining patients with polyneuropathy. Because of the vast variation demonstrated for both vibrator output and tissue damping, the thresholds were expressed in terms of amplitude of stimulator movement measured by means of an accelerometer, instead of applied voltage which is commonly used. Statistical analysis revealed a higher power of discrimination for amplitude measurements at all three stimulus sites. Digital read-out gave the best statistical result and was also most practical. Reference values obtained from 110 healthy males, 10 to 74 years of age, were highly correlated with age for both upper and lower extremities. The variance of the vibration perception threshold was less than that of the disappearance threshold, and determination of the perception threshold alone may be sufficient in most cases. PMID:501379

  6. Global Motion Perception in 2-Year-Old Children: A Method for Psychophysical Assessment and Relationships With Clinical Measures of Visual Function

    PubMed Central

    Yu, Tzu-Ying; Jacobs, Robert J.; Anstice, Nicola S.; Paudel, Nabin; Harding, Jane E.; Thompson, Benjamin

    2013-01-01

    Purpose. We developed and validated a technique for measuring global motion perception in 2-year-old children, and assessed the relationship between global motion perception and other measures of visual function. Methods. Random dot kinematogram (RDK) stimuli were used to measure motion coherence thresholds in 366 children at risk of neurodevelopmental problems at 24 ± 1 months of age. RDKs of variable coherence were presented and eye movements were analyzed offline to grade the direction of the optokinetic reflex (OKR) for each trial. Motion coherence thresholds were calculated by fitting psychometric functions to the resulting datasets. Test–retest reliability was assessed in 15 children, and motion coherence thresholds were measured in a group of 10 adults using OKR and behavioral responses. Standard age-appropriate optometric tests also were performed. Results. Motion coherence thresholds were measured successfully in 336 (91.8%) children using the OKR technique, but only 31 (8.5%) using behavioral responses. The mean threshold was 41.7 ± 13.5% for 2-year-old children and 3.3 ± 1.2% for adults. Within-assessor reliability and test–retest reliability were high in children. Children's motion coherence thresholds were significantly correlated with stereoacuity (LANG I & II test, ρ = 0.29, P < 0.001; Frisby, ρ = 0.17, P = 0.022), but not with binocular visual acuity (ρ = 0.11, P = 0.07). In adults OKR and behavioral motion coherence thresholds were highly correlated (intraclass correlation = 0.81, P = 0.001). Conclusions. Global motion perception can be measured in 2-year-old children using the OKR. This technique is reliable and data from adults suggest that motion coherence thresholds based on the OKR are related to motion perception. Global motion perception was related to stereoacuity in children. PMID:24282224

  7. A novel approach to estimation of the time to biomarker threshold: applications to HIV.

    PubMed

    Reddy, Tarylee; Molenberghs, Geert; Njagi, Edmund Njeru; Aerts, Marc

    2016-11-01

    In longitudinal studies of biomarkers, an outcome of interest is the time at which a biomarker reaches a particular threshold. The CD4 count is a widely used marker of human immunodeficiency virus progression. Because of the inherent variability of this marker, a single CD4 count below a relevant threshold should be interpreted with caution. Several studies have applied persistence criteria, designating the outcome as the time to the occurrence of two consecutive measurements less than the threshold. In this paper, we propose a method to estimate the time to attainment of two consecutive CD4 counts less than a meaningful threshold, which takes into account the patient-specific trajectory and measurement error. An expression for the expected time to threshold is presented, which is a function of the fixed effects, random effects and residual variance. We present an application to human immunodeficiency virus-positive individuals from a seroprevalent cohort in Durban, South Africa. Two thresholds are examined, and 95% bootstrap confidence intervals are presented for the estimated time to threshold. Sensitivity analysis revealed that results are robust to truncation of the series and variation in the number of visits considered for most patients. Caution should be exercised when interpreting the estimated times for patients who exhibit very slow rates of decline and patients who have less than three measurements. We also discuss the relevance of the methodology to the study of other diseases and present such applications. We demonstrate that the method proposed is computationally efficient and offers more flexibility than existing frameworks. Copyright © 2016 John Wiley & Sons, Ltd.
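The paper derives a closed-form expected time from the mixed-model components; a Monte Carlo version under a patient-specific linear trajectory with Gaussian measurement error is a simple stand-in for illustrating the two-consecutive-measurements criterion. All parameter names here are illustrative.

```python
import random

def time_to_two_below(b0, b1, sigma, threshold, visit_times, rng, n_sim=2000):
    """Monte Carlo estimate of the visit time at which two consecutive
    measurements of a declining marker (true value b0 + b1*t, plus
    Gaussian measurement error with sd sigma) first fall below threshold."""
    times = []
    for _ in range(n_sim):
        below_prev = False
        for t in visit_times:
            y = b0 + b1 * t + rng.gauss(0, sigma)
            if y < threshold:
                if below_prev:          # second consecutive sub-threshold value
                    times.append(t)
                    break
                below_prev = True
            else:
                below_prev = False
        else:
            times.append(visit_times[-1])  # censored at the last visit
    return sum(times) / len(times)
```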

  8. Noxious heat threshold temperature and pronociceptive effects of allyl isothiocyanate (mustard oil) in TRPV1 or TRPA1 gene-deleted mice.

    PubMed

    Tékus, Valéria; Horváth, Ádám; Hajna, Zsófia; Borbély, Éva; Bölcskei, Kata; Boros, Melinda; Pintér, Erika; Helyes, Zsuzsanna; Pethő, Gábor; Szolcsányi, János

    2016-06-01

    To investigate the roles of TRPV1 and TRPA1 channels in baseline and allyl isothiocyanate (AITC)-evoked nociceptive responses by comparing wild-type and gene-deficient mice. In contrast to conventional methods of thermonociception measuring reflex latencies, we used our novel methods to determine the noxious heat threshold. It was revealed that the heat threshold of the tail measured by an increasing-temperature water bath is significantly higher in TRPV1(-/-), but not TRPA1(-/-), mice compared to respective wild-types. There was no difference between the noxious heat thresholds of the hind paw as measured by an increasing-temperature hot plate in TRPV1(-/-), TRPA1(-/-) and the corresponding wild-type mice. The withdrawal latency of the tail from 0°C water was prolonged in TRPA1(-/-), but not TRPV1(-/-), mice compared to respective wild-types. In wild-type animals, dipping the tail or paw into 1% AITC induced an 8-14°C drop of the noxious heat threshold (heat allodynia) of both the tail and paw, and 40-50% drop of the mechanonociceptive threshold (mechanical allodynia) of the paw measured by dynamic plantar esthesiometry. These AITC-evoked responses were diminished in TRPV1(-/-), but not TRPA1(-/-), mice. Tail withdrawal latency to 1% AITC was significantly prolonged in both gene-deleted strains. Different heat sensors determine the noxious heat threshold in distinct areas: a pivotal role for TRPV1 on the tail is contrasted with no involvement of either TRPV1 or TRPA1 on the hind paw. Noxious heat threshold measurement appears appropriate for preclinical screening of TRP channel ligands as novel analgesics. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Application of a Threshold Method to Airborne-Spaceborne Attenuating-Wavelength Radars for the Estimation of Space-Time Rain-Rate Statistics.

    NASA Astrophysics Data System (ADS)

    Meneghini, Robert

    1998-09-01

    A method is proposed for estimating the area-average rain-rate distribution from attenuating-wavelength spaceborne or airborne radar data. Because highly attenuated radar returns yield unreliable estimates of the rain rate, these are eliminated by means of a proxy variable, Q, derived from the apparent radar reflectivity factors and a power law relating the attenuation coefficient and the reflectivity factor. In determining the probability distribution function of areawide rain rates, the elimination of attenuated measurements at high rain rates and the loss of data at light rain rates, because of low signal-to-noise ratios, lead to truncation of the distribution at the low and high ends. To estimate it over all rain rates, a lognormal distribution is assumed, the parameters of which are obtained from a nonlinear least squares fit to the truncated distribution. Implementation of this type of threshold method depends on the method used to obtain the high-resolution rain-rate estimates (e.g., either the standard Z-R or the Hitschfeld-Bordan estimate) and on the type of rain-rate estimate (either point or path averaged). To test the method, measured drop size distributions are used to characterize the rain along the radar beam. Comparisons with the standard single-threshold method or with the sample mean, taken over the high-resolution estimates, show that the present method usually provides more accurate determinations of the area-averaged rain rate if the values of the threshold parameter, QT, are chosen in the range from 0.2 to 0.4.
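Fitting a lognormal to a truncated empirical distribution can be sketched as follows. The paper uses a nonlinear least-squares fit; a grid search over the lognormal parameters is a simple stand-in, and the binning is an illustrative assumption.

```python
import math

def lognorm_cdf(x, mu, sigma):
    # CDF of a lognormal with log-scale mean mu and log-scale sd sigma.
    return 0.5 * (1 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))

def fit_truncated(bin_edges, probs, mu_grid, sigma_grid):
    """Fit (mu, sigma) of a lognormal to a truncated empirical distribution
    by least squares on bin probabilities, searched over a parameter grid.
    bin_edges: list of (lo, hi) rain-rate bins covering the observable range;
    probs: observed probability mass in each bin."""
    best = None
    for mu in mu_grid:
        for sigma in sigma_grid:
            err = 0.0
            for (lo, hi), p in zip(bin_edges, probs):
                model = lognorm_cdf(hi, mu, sigma) - lognorm_cdf(lo, mu, sigma)
                err += (model - p) ** 2
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    return best[1], best[2]
```

Once (mu, sigma) are recovered, moments of the full (untruncated) lognormal give the area-average rain rate.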

  10. Threshold-free method for three-dimensional segmentation of organelles

    NASA Astrophysics Data System (ADS)

    Chan, Yee-Hung M.; Marshall, Wallace F.

    2012-03-01

    An ongoing challenge in the field of cell biology is how to quantify the size and shape of organelles within cells. Automated image analysis methods often utilize thresholding for segmentation, but the calculated surface of objects depends sensitively on the exact threshold value chosen, and this problem is generally worse at the upper and lower z boundaries because of the anisotropy of the point spread function. We present here a threshold-independent method for extracting the three-dimensional surface of vacuoles in budding yeast whose limiting membranes are labeled with a fluorescent fusion protein. These organelles typically exist as a clustered set of 1-10 sphere-like compartments. Vacuole compartments and center points are identified manually within z-stacks taken using a spinning disk confocal microscope. A set of rays is defined originating from each center point and radiating outwards in random directions. Intensity profiles are calculated at coordinates along these rays, and intensity maxima are taken as the points where the rays cross the limiting membrane of the vacuole. These points are then fit with a weighted sum of basis functions to define the surface of the vacuole, after which parameters such as volume and surface area are calculated. This method is able to determine the volume and surface area of spherical beads (0.96 to 2 micron diameter) with less than 10% error, and validation using model convolution methods produces similar results. Thus, this method provides an accurate, automated method for measuring the size and morphology of organelles and can be generalized to measure cells and other objects on biologically relevant length-scales.
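The ray-casting step can be sketched as follows: cast random rays from a manually chosen center, take the intensity maximum along each ray as the membrane crossing, and estimate volume from the resulting radii. The basis-function surface fit is omitted; for a near-spherical compartment, averaging r³ over rays is a crude illustrative substitute.

```python
import math
import random

def surface_radii(intensity, center, num_rays, r_max, dr, rng):
    """Cast random 3-D rays from a center point and return, for each ray,
    the radius at which the intensity is maximal (the membrane crossing)."""
    radii = []
    for _ in range(num_rays):
        # Draw a random direction on the unit sphere.
        while True:
            d = [rng.gauss(0, 1) for _ in range(3)]
            n = math.sqrt(sum(x * x for x in d))
            if n > 1e-9:
                d = [x / n for x in d]
                break
        best_r, best_i = 0.0, -1.0
        for k in range(1, int(r_max / dr) + 1):
            r = k * dr
            p = [c + r * x for c, x in zip(center, d)]
            i = intensity(p)
            if i > best_i:
                best_r, best_i = r, i
        radii.append(best_r)
    return radii

def sphere_volume_estimate(radii):
    # For a near-spherical compartment, average r^3 over the rays.
    return 4.0 / 3.0 * math.pi * sum(r ** 3 for r in radii) / len(radii)
```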

  11. Measuring financial protection against catastrophic health expenditures: methodological challenges for global monitoring.

    PubMed

    Hsu, Justine; Flores, Gabriela; Evans, David; Mills, Anne; Hanson, Kara

    2018-05-31

    Monitoring financial protection against catastrophic health expenditures is important to understand how health financing arrangements in a country protect its population against high costs associated with accessing health services. While catastrophic health expenditures are generally defined as occurring when household expenditures for health exceed a given threshold of household resources, there is no gold standard; several methods are used to define both the threshold and household resources. These different approaches to constructing the indicator might give different pictures of a country's progress towards financial protection. In order for monitoring to effectively provide policy insight, it is critical to understand the sensitivity of measurement to these choices. This paper examines the impact of varying two methodological choices by analysing household expenditure data from a sample of 47 countries. We assess sensitivity of cross-country comparisons to a range of thresholds by testing for restricted dominance. We further assess sensitivity of comparisons to different methods for defining household resources (i.e. total expenditure, non-food expenditure and non-subsistence expenditure) by conducting correlation tests of country rankings. We found that country rankings are robust to the choice of threshold in a tenth to a quarter of comparisons within the 5-85% threshold range and this increases to half of comparisons if the threshold is restricted to 5-40%, following those commonly used in the literature. Furthermore, correlations of country rankings using different methods to define household resources were moderate to high; thus, this choice makes less difference from a measurement perspective than from an ethical perspective as different definitions of available household resources reflect varying concerns for equity. 
Interpreting comparisons from global monitoring based on a single threshold should be done with caution as these may not provide reliable insight into relative country progress. We therefore recommend financial protection against catastrophic health expenditures be measured across a range of thresholds using a catastrophic incidence curve as shown in this paper. We further recommend evaluating financial protection in relation to a country's health financing system arrangements in order to better understand the extent of protection and better inform future policy changes.
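The recommended catastrophic incidence curve is simply the headcount of catastrophic spending evaluated across a range of thresholds. A minimal sketch, assuming each household is summarized by its health-spending share of resources:

```python
def catastrophic_incidence(health_shares, thresholds):
    """Fraction of households whose health-spending share of household
    resources exceeds each threshold: the catastrophic incidence curve."""
    n = len(health_shares)
    return {t: sum(s > t for s in health_shares) / n for t in thresholds}
```

Plotting the returned values against the thresholds gives the incidence curve; comparing two countries across the whole curve, rather than at a single threshold, is the dominance check the paper recommends.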

  12. Development of Thresholds and Exceedance Probabilities for Influent Water Quality to Meet Drinking Water Regulations

    NASA Astrophysics Data System (ADS)

    Reeves, K. L.; Samson, C.; Summers, R. S.; Balaji, R.

    2017-12-01

    Drinking water treatment utilities (DWTU) are tasked with the challenge of meeting disinfection and disinfection byproduct (DBP) regulations to provide safe, reliable drinking water under changing climate and land surface characteristics. DBPs form in drinking water when disinfectants, commonly chlorine, react with organic matter as measured by total organic carbon (TOC), and physical removal of pathogen microorganisms are achieved by filtration and monitored by turbidity removal. Turbidity and TOC in influent waters to DWTUs are expected to increase due to variable climate and more frequent fires and droughts. Traditional methods for forecasting turbidity and TOC require catchment specific data (i.e. streamflow) and have difficulties predicting them under non-stationary climate. A modelling framework was developed to assist DWTUs with assessing their risk for future compliance with disinfection and DBP regulations under changing climate. A local polynomial method was developed to predict surface water TOC using climate data collected from NOAA, Normalized Difference Vegetation Index (NDVI) data from the IRI Data Library, and historical TOC data from three DWTUs in diverse geographic locations. Characteristics from the DWTUs were used in the EPA Water Treatment Plant model to determine thresholds for influent TOC that resulted in DBP concentrations within compliance. Lastly, extreme value theory was used to predict probabilities of threshold exceedances under the current climate. Results from the utilities were used to produce a generalized TOC threshold approach that only requires water temperature and bromide concentration. The threshold exceedance model will be used to estimate probabilities of exceedances under projected climate scenarios. 
Initial results show that TOC can be forecasted using widely available data via statistical methods, where temperature, precipitation, Palmer Drought Severity Index, and NDVI with various lags were shown to be important predictors of TOC, and TOC thresholds can be determined using water temperature and bromide concentration. Results include a model to predict influent turbidity and turbidity thresholds, similar to the TOC models, as well as probabilities of threshold exceedances for TOC and turbidity under changing climate.
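The extreme-value step (probability of influent TOC or turbidity exceeding a treatment threshold) can be sketched with a simple peaks-over-threshold model. Modelling excesses as exponential is a special case of the generalized Pareto distribution used in extreme value theory; the data and threshold here are illustrative.

```python
import math

def exceedance_prob(data, u, x):
    """Peaks-over-threshold sketch: estimate P(X > x) for x > u by
    combining the empirical rate of exceeding u with an exponential
    model for the excesses over u."""
    excesses = [d - u for d in data if d > u]
    rate = len(excesses) / len(data)          # empirical P(X > u)
    mean_excess = sum(excesses) / len(excesses)
    return rate * math.exp(-(x - u) / mean_excess)
```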

  13. Probe-specific mixed-model approach to detect copy number differences using multiplex ligation-dependent probe amplification (MLPA)

    PubMed Central

    González, Juan R; Carrasco, Josep L; Armengol, Lluís; Villatoro, Sergi; Jover, Lluís; Yasui, Yutaka; Estivill, Xavier

    2008-01-01

    Background The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a method for the normalization procedure based on a non-linear mixed-model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed-model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which targeted regions are variable in copy number in individuals suffering from different disorders such as Prader-Willi syndrome, DiGeorge syndrome, or autism, in which it showed the best performance. Conclusion Using the proposed mixed-model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific for each individual, incorporating experimental variability, resulting in improved sensitivity and specificity as the examples with real data have revealed. PMID:18522760
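The notion of a sample-specific threshold built from that sample's own error variability can be sketched crudely as follows. This is not the paper's mixed-model tolerance interval; it is a mean ± k·sd interval on normalized log-ratios, with k standing in for the tolerance-interval factor.

```python
def altered_probes(log_ratios, k=3.0):
    """Per-sample threshold sketch: a probe is called altered when its
    normalized log-ratio lies outside mean +/- k*sd computed from that
    same sample's probes (k is an illustrative stand-in)."""
    m = sum(log_ratios) / len(log_ratios)
    sd = (sum((x - m) ** 2 for x in log_ratios)
          / (len(log_ratios) - 1)) ** 0.5
    return [abs(x - m) > k * sd for x in log_ratios]
```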

  14. Multiratio fusion change detection with adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Hytla, Patrick C.; Balster, Eric J.; Vasquez, Juan R.; Neuroth, Robert M.

    2017-04-01

    A ratio-based change detection method known as multiratio fusion (MRF) is proposed and tested. The MRF framework builds on other change detection components proposed in this work: dual ratio (DR) and multiratio (MR). The DR method involves two ratios coupled with adaptive thresholds to maximize detected changes and minimize false alarms. The use of two ratios is shown to outperform the single ratio case when the means of the image pairs are not equal. MR change detection builds on the DR method by including negative imagery to produce four total ratios with adaptive thresholds. Inclusion of negative imagery is shown to improve detection sensitivity and to boost detection performance in certain target and background cases. MRF further expands this concept by fusing together the ratio outputs using a routine in which detections must be verified by two or more ratios to be classified as a true changed pixel. The proposed method is tested with synthetically generated test imagery and real datasets with results compared to other methods found in the literature. DR is shown to significantly outperform the standard single ratio method. MRF produces excellent change detection results that exhibit up to a 22% performance improvement over other methods from the literature at low false-alarm rates.
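The dual-ratio (DR) component can be sketched as follows: compute both image ratios and flag pixels that exceed an adaptive threshold on either. The mean + k·sd adaptive rule and the flat pixel lists are illustrative choices, not the paper's exact formulation.

```python
def dual_ratio_change(img1, img2, k=2.0, eps=1e-6):
    """Dual-ratio change detection sketch over flat pixel lists: a pixel
    is flagged when either ratio exceeds an adaptive threshold of
    mean + k*std computed from that ratio image."""
    r1 = [(a + eps) / (b + eps) for a, b in zip(img1, img2)]
    r2 = [(b + eps) / (a + eps) for a, b in zip(img1, img2)]

    def thr(r):
        m = sum(r) / len(r)
        v = sum((x - m) ** 2 for x in r) / len(r)
        return m + k * v ** 0.5

    t1, t2 = thr(r1), thr(r2)
    return [x > t1 or y > t2 for x, y in zip(r1, r2)]
```

The MR extension adds the same pair of ratios on negative imagery, and MRF fuses the four outputs by requiring agreement between two or more ratios.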

  15. MEthods of ASsessing blood pressUre: identifying thReshold and target valuEs (MeasureBP): a review & study protocol.

    PubMed

    Blom, Kimberly C; Farina, Sasha; Gomez, Yessica-Haydee; Campbell, Norm R C; Hemmelgarn, Brenda R; Cloutier, Lyne; McKay, Donald W; Dawes, Martin; Tobe, Sheldon W; Bolli, Peter; Gelfer, Mark; McLean, Donna; Bartlett, Gillian; Joseph, Lawrence; Featherstone, Robin; Schiffrin, Ernesto L; Daskalopoulou, Stella S

    2015-04-01

    Despite progress in automated blood pressure measurement (BPM) technology, there is limited research linking hard outcomes to automated office BPM (OBPM) treatment targets and thresholds. Equivalences for automated BPM devices have been estimated from approximations of standardized manual measurements of 140/90 mmHg. Until outcome-driven targets and thresholds become available for automated measurement methods, deriving evidence-based equivalences between automated methods and standardized manual OBPM is the next best solution. The MeasureBP study group was initiated by the Canadian Hypertension Education Program to close this critical knowledge gap. MeasureBP aims to define evidence-based equivalent values between standardized manual OBPM and automated BPM methods by synthesizing available evidence using a systematic review and individual subject-level data meta-analyses. This manuscript provides a review of the literature and the MeasureBP study protocol. These results will lay the evidence-based foundation to resolve uncertainties within blood pressure guidelines which, in turn, will improve the management of hypertension.

  16. Tunable architecture for aircraft fault detection

    NASA Technical Reports Server (NTRS)

    Ganguli, Subhabrata (Inventor); Papageorgiou, George (Inventor); Glavaski-Radovanovic, Sonja (Inventor)

    2012-01-01

    A method for detecting faults in an aircraft is disclosed. The method involves predicting at least one state of the aircraft and tuning at least one threshold value to tightly upper bound the size of a mismatch between the at least one predicted state and a corresponding actual state of the non-faulted aircraft. If the mismatch between the at least one predicted state and the corresponding actual state is greater than or equal to the at least one threshold value, the method indicates that at least one fault has been detected.
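The detection rule in the claim (compare each predicted state against the actual state and flag a fault when a mismatch reaches its tuned threshold) can be sketched as follows. The per-state representation is an illustrative assumption.

```python
def detect_fault(predicted, actual, thresholds):
    """Flag a fault when the mismatch between any predicted and actual
    state component is >= its tuned threshold, where each threshold
    tightly upper-bounds the no-fault mismatch for that component."""
    return any(abs(p - a) >= t
               for p, a, t in zip(predicted, actual, thresholds))
```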

  17. Denoising in digital speckle pattern interferometry using wave atoms.

    PubMed

    Federico, Alejandro; Kaufmann, Guillermo H

    2007-05-15

    We present an effective method for speckle noise removal in digital speckle pattern interferometry, which is based on a wave-atom thresholding technique. Wave atoms are a variant of 2D wavelet packets with a parabolic scaling relation and improve the sparse representation of fringe patterns when compared with traditional expansions. The performance of the denoising method is analyzed by using computer-simulated fringes, and the results are compared with those produced by wavelet and curvelet thresholding techniques. An application of the proposed method to reduce speckle noise in experimental data is also presented.

  18. Sparse Covariance Matrix Estimation With Eigenvalue Constraints

    PubMed Central

    LIU, Han; WANG, Lie; ZHAO, Tuo

    2014-01-01

    We propose a new approach for estimating high-dimensional, positive-definite covariance matrices. Our method extends the generalized thresholding operator by adding an explicit eigenvalue constraint. The estimated covariance matrix simultaneously achieves sparsity and positive definiteness. The estimator is rate optimal in the minimax sense and we develop an efficient iterative soft-thresholding and projection algorithm based on the alternating direction method of multipliers. Empirically, we conduct thorough numerical experiments on simulated datasets as well as real data examples to illustrate the usefulness of our method. Supplementary materials for the article are available online. PMID:25620866
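The two ingredients of the estimator are entrywise (generalized) thresholding and an eigenvalue constraint; the paper alternates them via ADMM. A minimal sketch of the soft-thresholding step, with a closed-form 2x2 eigenvalue check for the positive-definiteness constraint (the full iterative projection is omitted):

```python
import math

def soft_threshold(S, lam):
    """Entrywise soft-thresholding of a sample covariance matrix;
    the diagonal is left untouched, as is standard for covariance
    thresholding."""
    n = len(S)
    return [[S[i][j] if i == j
             else math.copysign(max(abs(S[i][j]) - lam, 0.0), S[i][j])
             for j in range(n)] for i in range(n)]

def min_eig_2x2(M):
    # Closed-form smallest eigenvalue of a symmetric 2x2 matrix, used to
    # check the eigenvalue constraint in this toy setting.
    a, b, d = M[0][0], M[0][1], M[1][1]
    tr, det = a + d, a * d - b * b
    return tr / 2 - math.sqrt(max((tr / 2) ** 2 - det, 0.0))
```

In the full method, whenever thresholding drives the smallest eigenvalue below the constraint, an eigendecomposition-based projection lifts it back, and ADMM iterates the two steps to convergence.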

  19. A Framework for Optimizing Phytosanitary Thresholds in Seed Systems.

    PubMed

    Choudhury, Robin Alan; Garrett, Karen A; Klosterman, Steven J; Subbarao, Krishna V; McRoberts, Neil

    2017-10-01

    Seedborne pathogens and pests limit production in many agricultural systems. Quarantine programs help prevent the introduction of exotic pathogens into a country, but few regulations directly apply to reducing the reintroduction and spread of endemic pathogens. Use of phytosanitary thresholds helps limit the movement of pathogen inoculum through seed, but the costs associated with rejected seed lots can be prohibitive for voluntary implementation of phytosanitary thresholds. In this paper, we outline a framework to optimize thresholds for seedborne pathogens, balancing the cost of rejected seed lots and the benefit of reduced inoculum levels. The method requires relatively small amounts of data, and the accuracy and robustness of the analysis improve over time as data accumulate from seed testing. We first describe the method and then illustrate it with a case study of seedborne oospores of Peronospora effusa, the causal agent of spinach downy mildew. A seed lot threshold of 0.23 oospores per seed could reduce the overall number of oospores entering the production system by 90% while removing 8% of seed lots destined for distribution. Alternative mitigation strategies may result in lower economic losses to seed producers, but have uncertain efficacy. We discuss future challenges and prospects for implementing this approach.
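The trade-off evaluated at each candidate threshold (share of seed lots rejected versus share of inoculum removed) can be sketched as follows; the lot data below are hypothetical, not the spinach case-study figures.

```python
def threshold_tradeoff(lot_oospore_levels, thr):
    """For a candidate per-seed oospore threshold, return the fraction
    of seed lots rejected and the fraction of total oospores removed.
    Each entry in lot_oospore_levels is one lot's oospores per seed."""
    total = sum(lot_oospore_levels)
    rejected = [x for x in lot_oospore_levels if x > thr]
    return (len(rejected) / len(lot_oospore_levels),
            sum(rejected) / total)
```

Sweeping `thr` over a grid and plotting the two fractions against each other gives the cost-benefit curve from which an optimal threshold is chosen.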

  20. A safe and easy method for building consensus HIV sequences from 454 massively parallel sequencing data.

    PubMed

    Fernández-Caballero Rico, Jose Ángel; Chueca Porcuna, Natalia; Álvarez Estévez, Marta; Mosquera Gutiérrez, María Del Mar; Marcos Maeso, María Ángeles; García, Federico

    2018-02-01

    To show how to generate a consensus sequence from the information of massive parallel sequences data obtained from routine HIV anti-retroviral resistance studies, and that may be suitable for molecular epidemiology studies. Paired Sanger (Trugene-Siemens) and next-generation sequencing (NGS) (454 GSJunior-Roche) HIV RT and protease sequences from 62 patients were studied. NGS consensus sequences were generated using Mesquite, using 10%, 15%, and 20% thresholds. Molecular evolutionary genetics analysis (MEGA) was used for phylogenetic studies. At a 10% threshold, NGS-Sanger sequences from 17/62 patients were phylogenetically related, with a median bootstrap-value of 88% (IQR83.5-95.5). Association increased to 36/62 sequences, median bootstrap 94% (IQR85.5-98)], using a 15% threshold. Maximum association was at the 20% threshold, with 61/62 sequences associated, and a median bootstrap value of 99% (IQR98-100). A safe method is presented to generate consensus sequences from HIV-NGS data at 20% threshold, which will prove useful for molecular epidemiological studies. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
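Building a consensus at a frequency threshold can be sketched as follows: at each alignment column, every base present at or above the threshold contributes, and mixtures map to IUPAC ambiguity codes. This is an illustrative reconstruction, not the authors' Mesquite workflow; only two-base codes are tabulated here.

```python
# Two-base IUPAC ambiguity codes; extend with 3- and 4-base codes as needed.
IUPAC = {frozenset('A'): 'A', frozenset('C'): 'C',
         frozenset('G'): 'G', frozenset('T'): 'T',
         frozenset('AG'): 'R', frozenset('CT'): 'Y',
         frozenset('GC'): 'S', frozenset('AT'): 'W',
         frozenset('GT'): 'K', frozenset('AC'): 'M'}

def consensus(columns, threshold=0.20):
    """Consensus base per alignment column from NGS read counts: bases at
    a frequency >= threshold are kept, and the kept set maps to a base or
    IUPAC code ('N' for anything not in the table)."""
    out = []
    for counts in columns:  # counts: dict base -> read count at this column
        total = sum(counts.values())
        kept = frozenset(b for b, c in counts.items()
                         if c / total >= threshold)
        out.append(IUPAC.get(kept, 'N'))
    return ''.join(out)
```

At the 20% threshold, minority variants present in at least a fifth of reads appear as ambiguity codes, which is what drove the phylogenetic association with the paired Sanger sequences.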

  1. Symmetry, stability, and computation of degenerate lasing modes

    NASA Astrophysics Data System (ADS)

    Liu, David; Zhen, Bo; Ge, Li; Hernandez, Felipe; Pick, Adi; Burkhardt, Stephan; Liertzer, Matthias; Rotter, Stefan; Johnson, Steven G.

    2017-02-01

    We present a general method to obtain the stable lasing solutions for the steady-state ab initio lasing theory (SALT) for the case of a degenerate symmetric laser in two dimensions (2D). We find that under most regimes (with one pathological exception), the stable solutions are clockwise and counterclockwise circulating modes, generalizing previously known results of ring lasers to all 2D rotational symmetry groups. Our method uses a combination of semianalytical solutions close to lasing threshold and numerical solvers to track the lasing modes far above threshold. Near threshold, we find closed-form expressions for both circulating modes and other types of lasing solutions as well as for their linearized Maxwell-Bloch eigenvalues, providing a simple way to determine their stability without having to do a full nonlinear numerical calculation. Above threshold, we show that a key feature of the circulating mode is its "chiral" intensity pattern, which arises from spontaneous symmetry breaking of mirror symmetry, and whose symmetry group requires that the degeneracy persists even when nonlinear effects become important. Finally, we introduce a numerical technique to solve the degenerate SALT equations far above threshold even when spatial discretization artificially breaks the degeneracy.

  2. Wearable Lactate Threshold Predicting Device is Valid and Reliable in Runners.

    PubMed

    Borges, Nattai R; Driller, Matthew W

    2016-08-01

    Borges, NR and Driller, MW. Wearable lactate threshold predicting device is valid and reliable in runners. J Strength Cond Res 30(8): 2212-2218, 2016-A commercially available device claiming to be the world's first wearable lactate threshold predicting device (WLT), using near-infrared LED technology, has entered the market. The aim of this study was to determine the levels of agreement between the WLT-derived lactate threshold workload and traditional methods of lactate threshold (LT) calculation and the interdevice and intradevice reliability of the WLT. Fourteen (7 male, 7 female; mean ± SD; age: 18-45 years, height: 169 ± 9 cm, mass: 67 ± 13 kg, V̇O2max: 53 ± 9 ml·kg⁻¹·min⁻¹) subjects ranging from recreationally active to highly trained athletes completed an incremental exercise test to exhaustion on a treadmill. Blood lactate samples were taken at the end of each 3-minute stage during the test to determine lactate threshold using 5 traditional methods from blood lactate analysis which were then compared against the WLT predicted value. In a subset of the population (n = 12), repeat trials were performed to determine both inter-reliability and intrareliability of the WLT device. Intraclass correlation coefficient (ICC) found high to very high agreement between the WLT and traditional methods (ICC > 0.80), with TEMs and mean differences ranging between 3.9-10.2% and 1.3-9.4%. Both interdevice and intradevice reliability resulted in highly reproducible and comparable results (CV < 1.2%, TEM < 0.2 km·h⁻¹, ICC > 0.97). This study suggests that the WLT is a practical, reliable, and noninvasive tool for use in predicting LT in runners.

  3. Measuring patient tolerance for future adverse events in low-risk emergency department chest pain patients.

    PubMed

    Chen, Jennifer C; Cooper, Richelle J; Lopez-O'Sullivan, Ana; Schriger, David L

    2014-08-01

    We assess emergency department (ED) patients' risk thresholds for preferring admission versus discharge when presenting with chest pain and determine how the method of information presentation affects patients' choices. In this cross-sectional survey, we enrolled a convenience sample of lower-risk acute chest pain patients from an urban ED. We presented patients with a hypothetical value for the risk of adverse outcome that could be decreased by hospitalization and asked them to identify the risk threshold at which they preferred admission versus discharge. We randomized patients to a method of numeric presentation (natural frequency or percentage) and the initial risk presented (low or high) and followed each numeric assessment with an assessment based on visually depicted risks. We enrolled 246 patients and analyzed data on 234 with complete information. The geometric mean risk threshold with numeric presentation was 1 in 736 (1 in 233 with a percentage presentation; 1 in 2,425 with a natural frequency presentation) and 1 in 490 with a visual presentation. Fifty-nine percent of patients (137/234) chose the lowest or highest risk values offered. One hundred fourteen patients chose different thresholds for numeric and visual risk presentations. We observed strong anchoring effects; patients starting with the lowest risk chose a lower threshold than those starting with the highest risk possible and vice versa. Using an expected utility model to measure patients' risk thresholds does not seem to work, either to find a stable risk preference within individuals or in groups. Further work in measurement of patients' risk tolerance or methods of shared decisionmaking not dependent on assessment of risk tolerance is needed. Copyright © 2014 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.

  4. A novel threshold criterion in transcranial motor evoked potentials during surgery for gliomas close to the motor pathway.

    PubMed

    Abboud, Tammam; Schaper, Miriam; Dührsen, Lasse; Schwarz, Cindy; Schmidt, Nils Ole; Westphal, Manfred; Martens, Tobias

    2016-10-01

OBJECTIVE Warning criteria for monitoring of motor evoked potentials (MEP) after direct cortical stimulation during surgery for supratentorial tumors have been well described. However, little is known about the value of MEP after transcranial electrical stimulation (TES) in predicting postoperative motor deficit when the threshold level is monitored. The authors aimed to evaluate the feasibility and value of this method in glioma surgery by using a new approach for interpreting changes in threshold level that involves contra- and ipsilateral MEP. METHODS Between November 2013 and December 2014, 93 patients underwent TES-MEP monitoring during resection of gliomas located close to central motor pathways but not involving the primary motor cortex. The MEP were elicited by transcranial repetitive anodal train stimulation. Bilateral MEP were continuously evaluated to assess the percentage increase of the threshold level (the minimum voltage needed to evoke a stable motor response from each of the muscles being monitored) from the baseline set before dural opening. An increase in threshold level on the contralateral side (facial, arm, or leg muscles contralateral to the affected hemisphere) of more than 20% beyond the percentage increase on the ipsilateral side (facial, arm, or leg muscles ipsilateral to the affected hemisphere) was considered a significant alteration. Recorded alterations were subsequently correlated with postoperative neurological deterioration and MRI findings. RESULTS TES-MEP could be elicited in all patients, including those with recurrent glioma (31 patients) and preoperative paresis (20 patients). Five of 73 patients without preoperative paresis showed a significant increase in threshold level, and all of them developed new paresis postoperatively (transient in 4 patients and permanent in 1 patient). 
Eight of 20 patients with preoperative paresis showed a significant increase in threshold level, and all of them developed postoperative neurological deterioration (transient in 4 patients and permanent in 4 patients). In 80 patients no significant change in threshold level was detected, and none of them showed postoperative neurological deterioration. The specificity and sensitivity in this series were estimated at 100%. Postoperative MRI revealed gross-total tumor resection in 56 of 82 patients (68%) in whom complete tumor resection was attainable; territorial ischemia was detected in 4 patients. CONCLUSIONS The novel threshold criterion has made TES-MEP a useful method for predicting postoperative motor deficit in patients who undergo glioma surgery, and has been feasible in patients with preoperative paresis as well as in patients with recurrent glioma. Including contra- and ipsilateral changes in threshold level has led to a high sensitivity and specificity.
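
    The warning criterion reduces to a comparison of percentage rises in threshold voltage; a minimal sketch (function name and the example voltages are illustrative, not study data):

```python
def significant_mep_alteration(contra_base, contra_now,
                               ipsi_base, ipsi_now, margin=20.0):
    """Flag a significant TES-MEP alteration per the threshold criterion:
    the percentage increase of the contralateral threshold level must
    exceed the ipsilateral increase by more than `margin` percent (20% in
    the study). Threshold level = minimum stimulation voltage needed to
    evoke a stable motor response from the monitored muscle.
    """
    contra_rise = 100.0 * (contra_now - contra_base) / contra_base
    ipsi_rise = 100.0 * (ipsi_now - ipsi_base) / ipsi_base
    return contra_rise - ipsi_rise > margin
```

    Subtracting the ipsilateral rise is the point of the bilateral design: a global drift (anesthesia, electrode impedance) raises both sides and cancels out, so only a lateralized change triggers the warning.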

  5. Correlations of stock price fluctuations under multi-scale and multi-threshold scenarios

    NASA Astrophysics Data System (ADS)

    Sui, Guo; Li, Huajiao; Feng, Sida; Liu, Xueyong; Jiang, Meihui

    2018-01-01

The multi-scale method is widely used in analyzing time series of financial markets and it can provide market information for different economic entities who focus on different periods. Through constructing multi-scale networks of price fluctuation correlation in the stock market, we can detect the topological relationship between each time series. Previous research has not addressed the problem that the original fluctuation correlation networks are fully connected networks, which contain more information than is currently being utilized. Here we use listed coal companies as a case study. First, we decompose the original stock price fluctuation series into different time scales. Second, we construct the stock price fluctuation correlation networks at different time scales. Third, we delete the edges of the network based on thresholds and analyze the network indicators. Through combining the multi-scale method with the multi-threshold method, we bring to light the implicit information of fully connected networks.
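
    The edge-deletion step can be sketched as follows, using a plain Pearson correlation and a sparsifying threshold (function names and the example threshold are illustrative):

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

def threshold_network(series, threshold):
    """Sparsify the fully connected correlation network: keep an edge
    i-j only if |corr(series_i, series_j)| meets the threshold."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if abs(pearson(series[i], series[j])) >= threshold:
                edges.add((i, j))
    return edges
```

    Repeating this for a range of thresholds (the "multi-threshold" part) shows which correlations survive as the network is progressively pruned.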

  6. Optical spectral singularities as threshold resonances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mostafazadeh, Ali

    2011-04-15

    Spectral singularities are among generic mathematical features of complex scattering potentials. Physically they correspond to scattering states that behave like zero-width resonances. For a simple optical system, we show that a spectral singularity appears whenever the gain coefficient coincides with its threshold value and other parameters of the system are selected properly. We explore a concrete realization of spectral singularities for a typical semiconductor gain medium and propose a method of constructing a tunable laser that operates at threshold gain.

  7. A Ratiometric Threshold for Determining Presence of Cancer During Fluorescence-guided Surgery

    PubMed Central

    Warram, Jason M; de Boer, Esther; Moore, Lindsay S.; Schmalbach, Cecelia E; Withrow, Kirk P; Carroll, William R; Richman, Joshua S; Morlandt, Anthony B; Brandwein-Gensler, Margaret; Rosenthal, Eben L

    2015-01-01

Background & Objective Fluorescence-guided imaging to assist in identification of malignant margins has the potential to dramatically improve oncologic surgery. However, a standardized method for quantitative assessment of disease-specific fluorescence has not been investigated. Introduced here is a ratiometric threshold derived from mean fluorescent tissue intensity that can be used to semi-quantitatively delineate tumor from normal tissue. Methods Open-field and closed-field imaging devices were used to quantify fluorescence in punch biopsy tissues sampled from primary tumors collected during a phase 1 trial evaluating the safety of cetuximab-IRDye800 in patients (n=11) undergoing surgical intervention for head and neck cancer. Fluorescence ratios were calculated using mean fluorescence intensity (MFI) from punch biopsies normalized by MFI of patient-matched tissues. Ratios were compared to pathological assessment and a ratiometric threshold was established to predict presence of cancer. Results During open-field imaging using an intraoperative device, the threshold for muscle-normalized tumor fluorescence was found to be 2.7, which produced a sensitivity of 90.5% and specificity of 78.6% for delineating diseased tissue. The skin-normalized threshold generated greater sensitivity (92.9%) and specificity (81.0%). Conclusion Successful implementation of a semi-quantitative threshold can provide a scientific methodology for delineating disease from normal tissue during fluorescence-guided resection of cancer. PMID:26074273
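
    The decision rule reduces to a normalized-intensity comparison; a minimal sketch (the function name is illustrative; 2.7 is the study's muscle-normalized open-field threshold):

```python
def is_tumor(biopsy_mfi, reference_mfi, threshold=2.7):
    """Call a punch biopsy positive for cancer when its mean fluorescence
    intensity (MFI), normalized by a patient-matched reference tissue
    (muscle for the open-field device), meets the ratiometric threshold."""
    return biopsy_mfi / reference_mfi >= threshold
```

    Normalizing by patient-matched tissue is what makes the threshold transferable across patients: absolute MFI varies with dose and device settings, but the tumor-to-reference ratio is far more stable.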

  8. Color difference threshold determination for acrylic denture base resins.

    PubMed

    Ren, Jiabao; Lin, Hong; Huang, Qingmei; Liang, Qifan; Zheng, Gang

    2015-01-01

This study aimed to set evaluation indicators, i.e., perceptibility and acceptability color difference thresholds, of color stability for acrylic denture base resins for a spectrophotometric assessment method, which offered an alternative to the visual method described in ISO 20795-1:2013. A total of 291 disk specimens 50±1 mm in diameter and 0.5±0.1 mm thick were prepared (ISO 20795-1:2013) and processed through radiation tests in an accelerated aging chamber (ISO 7491:2000) for increasing times of 0 to 42 hours. Color alterations were measured with a spectrophotometer and evaluated using the CIE L*a*b* colorimetric system. Color differences were calculated through the CIEDE2000 color difference formula. Thirty-two dental professionals without color vision deficiencies completed perceptibility and acceptability assessments under controlled conditions in vitro. An S-curve fitting procedure was used to analyze the 50:50% perceptibility and acceptability thresholds. Furthermore, perceptibility and acceptability against the differences of the three color attributes, lightness, chroma, and hue, were also investigated. According to the S-curve fitting procedure, the 50:50% perceptibility threshold was 1.71 ΔE00 (r² = 0.88) and the 50:50% acceptability threshold was 4.00 ΔE00 (r² = 0.89). Within the limitations of this study, 1.71/4.00 ΔE00 could be used as perceptibility/acceptability thresholds for acrylic denture base resins.
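
    The study located the 50:50% points with an S-curve fit; as a simpler stand-in that illustrates the idea, one can linearly interpolate between the two observations bracketing 50% acceptance (function name and data below are illustrative, not study data):

```python
def fifty_fifty_threshold(delta_e00, prop_accepting):
    """Find the CIEDE2000 color difference at which acceptance (or
    perceptibility) falls to 50%, by linear interpolation between the
    two bracketing observations. The published analysis used a full
    S-curve (sigmoid) fit instead; this is only the crossing-point idea."""
    pairs = list(zip(delta_e00, prop_accepting))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 >= 0.5 >= p1:
            return x0 + (p0 - 0.5) * (x1 - x0) / (p0 - p1)
    raise ValueError("acceptance never crosses 50%")
```

    A fitted S-curve uses all observations and smooths observer noise, which is why the study reports r² for the fit rather than a raw crossing point.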

  9. A new laser pain threshold model detects a faster onset of action from a liquid formulation of 1 g paracetamol than an equivalent tablet formulation

    PubMed Central

    Sutton, J A; Gillin, W P; Grattan, T J; Clarke, G D; Kilminster, S G

    2002-01-01

    Aims To discover whether a new infra-red laser method could detect a change in pain threshold after as mild an analgesic as paracetamol and whether an effervescent liquid formulation produced a faster onset of action than tablets. Methods This double-blind, placebo controlled randomized study used a portable, infra-red laser to measure ‘first pain’ thresholds on the nondominant forearm in 12 normal volunteers before and after 1 g of paracetamol or placebo. The mean of six recordings was determined three times before dosing, the first being used as a familiarization procedure, and 14 times after dosing. Results We detected a small (2%), statistically significant difference in pain threshold between a liquid formulation of paracetamol and placebo at 30 and 60 min (P = 0.004 and P = 0.001), but not between tablets and placebo. Liquid also increased the threshold significantly compared with tablets at 60 min (P = 0.01). Conclusions To detect such a small increase in pain threshold requires a highly consistent measure and the coefficient of variation was 2% for the study overall, surprisingly low for a subjective phenomenon. The reasons for this include minimizing reflectance by blacking the skin, using a nonhairy site, averaging six data points at each sample time and controlling closely the ambient conditions and the subjects’ preparation for studies. PMID:11849194

  10. Auditory-steady-state response reliability in the audiological diagnosis after neonatal hearing screening.

    PubMed

    Núñez-Batalla, Faustino; Noriega-Iglesias, Sabel; Guntín-García, Maite; Carro-Fernández, Pilar; Llorente-Pendás, José Luis

    2016-01-01

Conventional audiometry is the gold standard for quantifying and describing hearing loss. Alternative methods become necessary to assess subjects who are too young to respond reliably. Auditory evoked potentials constitute the most widely used method for determining hearing thresholds objectively; however, this stimulus is not frequency specific. The advent of the auditory steady-state response (ASSR) leads to more specific threshold determination. The current study describes and compares ASSR, auditory brainstem response (ABR) and conventional behavioural tone audiometry thresholds in a group of infants with various degrees of hearing loss. A comparison was made between ASSR, ABR and behavioural hearing thresholds in 35 infants detected in the neonatal hearing screening program. Mean difference scores (±SD) between ABR and high-frequency ABR thresholds were 11.2 dB (±13) and 10.2 dB (±11). Pearson correlations between the ASSR and audiometry thresholds were 0.80 and 0.91 (500 Hz); 0.84 and 0.82 (1000 Hz); 0.85 and 0.84 (2000 Hz); and 0.83 and 0.82 (4000 Hz). The ASSR technique is a valuable extension of the clinical test battery for hearing-impaired children. Copyright © 2015 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. All rights reserved.

  11. Measurement precision of the anaerobic threshold by means of a portable calorimeter.

    PubMed

    Nogueira, Fernando dos Santos; Pompeu, Fernando Augusto Monteiro Sabóia

    2010-09-01

Many methods are used for determining the Anaerobic Threshold (AT) by means of sophisticated ergospirometers. The aim was to test the variation in the AT, detected by mathematical models and visual inspection, when a low-cost ergospirometer intended for clinical application is used. Seventy-nine apparently healthy subjects, 57 of them men, volunteered for this study. The VO₂(max) and the ventilatory threshold were determined by indirect, open-circuit calorimetry. The electro-enzymatic method was used for analyzing the lactacidemia and direct determination of the Lactate Threshold (LT). The AT was determined by two mathematical methods (MM(RSS) and MM(slope)), based on gas exchange, and the LT was determined by the log-log visual method. Two independent investigators determined the AT through visual inspection of three graphs, considering two methods (AT₋(a) = V-slope, EqV; and AT₋(b) = V-slope, EqV and ExCO₂). The data were analyzed by means of parametric statistics to determine the differences between AT₋(a) versus ExCO₂, MM(RSS) and MM(slope); AT₋(b) versus MM(RSS) and MM(slope); and LT versus AT₋(a), AT₋(b), MM(RSS) and MM(slope). The MM(slope) was the only method that presented a significant difference between AT₋(a) and AT₋(b) (p=0.001), with CV% > 15. LT versus MM(slope) did not present a significant difference (p=0.274); however, a high CV (24%) was observed. It was concluded that, with the low-cost equipment, the MM(RSS) and AT₋(a) methods can be used for determining the AT. The MM(slope) method did not present satisfactory precision to be employed with this equipment.

  12. Noninvasive determination of anaerobic threshold by monitoring the %SpO2 changes and respiratory gas exchange.

    PubMed

    Nikooie, Roohollah; Gharakhanlo, Reza; Rajabi, Hamid; Bahraminegad, Morteza; Ghafari, Ali

    2009-10-01

The purpose of this study was to determine the validity of noninvasive anaerobic threshold (AT) estimation using %SpO2 (arterial oxyhemoglobin saturation) changes and respiratory gas exchanges. Fifteen active, healthy males performed 2 graded exercise tests on a motor-driven treadmill in 2 separate sessions. Respiratory gas exchanges, heart rate (HR), lactate concentration, and %SpO2 were measured continuously throughout the test. Anaerobic threshold was determined based on blood lactate concentration (lactate-AT), %SpO2 changes (%SpO2-AT), respiratory exchange ratio (RER-AT), the V-slope method (V-slope-AT), and the ventilatory equivalent for O2 (EqO2-AT). Blood lactate measurement was considered the gold standard assessment of AT and was applied to confirm the validity of the other, noninvasive methods. The mean VO2 corresponding to lactate-AT, %SpO2-AT, RER-AT, V-slope-AT, and EqO2-AT was 2176.6 +/- 206.4, 1909.5 +/- 221.4, 2141.2 +/- 245.6, 1933.7 +/- 216.4, and 1975 +/- 232.4, respectively. Intraclass correlation coefficient (ICC) analysis indicated a significant correlation between the 4 noninvasive methods and the criterion method. Bland-Altman plots showed good agreement between the VO2 corresponding to AT in each method and lactate-AT (95% confidence interval [CI]). Our results indicate that the noninvasive and easy procedure of monitoring %SpO2 is a valid method for estimating AT. Also, in the present study, the respiratory exchange ratio (RER) method seemed to be the best respiratory index for noninvasive estimation of the anaerobic threshold, and the heart rate corresponding to AT predicted by this method can be used by coaches and athletes to define training zones.
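
    Agreement with the lactate criterion was assessed with Bland-Altman plots; the underlying statistics are easy to sketch (function name and example data are illustrative):

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for two paired measurement
    methods: the bias (mean difference) and the 95% limits of agreement
    (bias +/- 1.96 * SD of the differences)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)
    return bias, bias - spread, bias + spread
```

    Unlike a correlation coefficient, which can be high even when one method reads systematically higher, the bias and limits of agreement show how far an individual noninvasive estimate may sit from the lactate value.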

  13. Effects of Acupuncture on Sensory Perception: A Systematic Review and Meta-Analysis

    PubMed Central

    Baeumler, Petra I.; Fleckenstein, Johannes; Takayama, Shin; Simang, Michael; Seki, Takashi; Irnich, Dominik

    2014-01-01

Background The effect of acupuncture on sensory perception has never been systematically reviewed, although studies on acupuncture mechanisms are frequently based on the idea that changes in sensory thresholds reflect its effect on the nervous system. Methods Pubmed, EMBASE and Scopus were screened for studies investigating the effect of acupuncture on thermal or mechanical detection or pain thresholds in humans published in English or German. A meta-analysis of high quality studies was performed. Results Out of 3007 identified articles, 85 were included. Sixty-five studies showed that acupuncture affects at least one sensory threshold. Most studies assessed the pressure pain threshold, of which 80% reported an increase after acupuncture. Significant short- and long-term effects on the pressure pain threshold in pain patients were revealed by two meta-analyses including four and two high quality studies, respectively. In over 60% of studies, acupuncture reduced sensitivity to noxious thermal stimuli, but measuring methods might influence results. Few but consistent data indicate that acupuncture reduces pin-prick-like pain but not mechanical detection. Results on thermal detection are heterogeneous. Sensory threshold changes were reported equally frequently after manual acupuncture as after electroacupuncture. Among 48 sham-controlled studies, 25 showed stronger effects on sensory thresholds through verum than through sham acupuncture, but in 9 studies significant threshold changes were also observed after sham acupuncture. Overall, there is a lack of high quality acupuncture studies applying comprehensive assessments of sensory perception. Conclusions Our findings indicate that acupuncture affects sensory perception. Results are most compelling for the pressure pain threshold, especially in pain conditions associated with tenderness. Sham acupuncture can also cause such effects. 
Future studies should incorporate comprehensive, standardized assessments of sensory profiles in order to fully characterize its effect on sensory perception and to explore the predictive value of sensory profiles for the effectiveness of acupuncture. PMID:25502787

  14. Segment and fit thresholding: a new method for image analysis applied to microarray and immunofluorescence data.

    PubMed

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B

    2015-10-06

    Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
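
    As a loose illustration of the segment-statistics idea only, and not the published SFT algorithm (which fits trends between the segment statistics), one can tile the image, treat the quietest tiles as background, and derive a threshold from their distribution; the tile size, the quietest-half rule, and k = 3 are all assumptions:

```python
from statistics import mean, pstdev

def sft_threshold(image, tile=4, k=3.0):
    """Toy segment-statistics thresholding on a 2-D list of intensities.

    Compute (mean, SD) per tile, take the lowest-variation half of the
    tiles as background (assumption; SFT instead fits trends between the
    statistics), and return background mean + k * background SD as the
    signal threshold."""
    h, w = len(image), len(image[0])
    tiles = []
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            vals = [image[i][j] for i in range(r, min(r + tile, h))
                                for j in range(c, min(c + tile, w))]
            tiles.append((mean(vals), pstdev(vals)))
    tiles.sort(key=lambda t: t[1])                 # quietest tiles first
    background = tiles[: max(1, len(tiles) // 2)]
    bg_mean = mean(m for m, _ in background)
    bg_sd = mean(s for _, s in background)
    return bg_mean + k * bg_sd                     # pixels above = signal
```

    The appeal of working per segment is that background statistics are estimated locally, so a bright signal region does not inflate the global threshold the way it would for a single whole-image statistic.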

  15. Determination of vessel cross-sectional area by thresholding in Radon space

    PubMed Central

    Gao, Yu-Rong; Drew, Patrick J

    2014-01-01

The cross-sectional area of a blood vessel determines its resistance, and thus is a regulator of local blood flow. However, the cross-sections of penetrating vessels in the cortex can be non-circular, and dilation and constriction can change the shape of the vessels. We show that observed vessel shape changes can introduce large errors in flux calculations when using a single diameter measurement. Because of these shape changes, typical diameter measurement approaches, such as the full-width at half-maximum (FWHM), that depend on a single diameter axis will generate erroneous results, especially when calculating flux. Here, we present an automated method—thresholding in Radon space (TiRS)—for determining the cross-sectional area of a convex object, such as a penetrating vessel observed with two-photon laser scanning microscopy (2PLSM). The vessel image is transformed into Radon space and thresholded there; the thresholded image is transformed back to image space and contiguous pixels are segmented. The TiRS method is analogous to taking the FWHM across multiple axes and is more robust to noise and shape changes than FWHM and thresholding methods. We demonstrate the superior precision of the TiRS method with in vivo 2PLSM measurements of vessel diameter. PMID:24736890
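
    A drastically simplified sketch of the idea: threshold each projection at half its maximum (the FWHM analogue) and intersect the back-projected masks. The real TiRS uses the full Radon transform over many angles and then segments contiguous pixels; only the 0° and 90° projections (row and column sums) are used here to keep the example short:

```python
def tirs_area_two_angles(image, frac=0.5):
    """Toy TiRS on a 2-D list: project at 0 and 90 degrees, threshold
    each projection at `frac` of its maximum, and count the pixels in
    the intersection of the back-projected row/column masks."""
    rows = [sum(row) for row in image]
    cols = [sum(row[j] for row in image) for j in range(len(image[0]))]
    rmask = [v >= frac * max(rows) for v in rows]
    cmask = [v >= frac * max(cols) for v in cols]
    return sum(rmask) * sum(cmask)   # intersection is a rectangle here
```

    With only two angles the recovered region is a bounding rectangle; adding more projection angles, as TiRS does, lets the intersection tighten around a non-circular convex cross-section.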

  16. Combining multiple thresholding binarization values to improve OCR output

    NASA Astrophysics Data System (ADS)

    Lund, William B.; Kennard, Douglas J.; Ringger, Eric K.

    2013-01-01

For noisy, historical documents, a high optical character recognition (OCR) word error rate (WER) can render the OCR text unusable. Since image binarization is often the method used to identify foreground pixels, a body of research seeks to improve image-wide binarization directly. Instead of relying on any one imperfect binarization technique, our method incorporates information from multiple simple thresholding binarizations of the same image to improve text output. Using a new corpus of 19th century newspaper grayscale images for which the text transcription is known, we observe WERs of 13.8% and higher using current binarization techniques and a state-of-the-art OCR engine. Our novel approach combines the OCR outputs from multiple thresholded images by aligning the text output and producing a lattice of word alternatives from which a lattice word error rate (LWER) is calculated. Our results show an LWER of 7.6% when aligning two threshold images and an LWER of 6.8% when aligning five. From the word lattice we commit to one hypothesis by applying the methods of Lund et al. (2011), achieving an improvement over the original OCR output and an 8.41% WER result on this data set.
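
    The combination step can be caricatured as per-position voting over a lattice of word alternatives. The real pipeline aligns OCR outputs of unequal length and applies the trained selection methods of Lund et al. (2011); the sketch below assumes the outputs are already aligned token-for-token:

```python
from collections import Counter

def combine_hypotheses(hypotheses):
    """Toy multi-binarization OCR combination: build a 'lattice' of word
    alternatives per position (one Counter per slot) and commit to the
    majority word in each slot. Assumes pre-aligned, equal-length
    hypotheses, which real alignment must first produce."""
    lattice = [Counter(words) for words in zip(*hypotheses)]
    return [slot.most_common(1)[0][0] for slot in lattice]
```

    The intuition behind the paper's LWER numbers is visible even in this toy: different binarization thresholds make different mistakes, so an error must be shared by a majority of thresholded images to survive the vote.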

  17. Segment and Fit Thresholding: A New Method for Image Analysis Applied to Microarray and Immunofluorescence Data

    PubMed Central

    Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M.; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E.; Allen, Peter J.; Sempere, Lorenzo F.; Haab, Brian B.

    2016-01-01

    Certain experiments involve the high-throughput quantification of image data, thus requiring algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multi-color, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu’s method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978

  18. Interplay between the local information based behavioral responses and the epidemic spreading in complex networks.

    PubMed

    Liu, Can; Xie, Jia-Rong; Chen, Han-Shuang; Zhang, Hai-Feng; Tang, Ming

    2015-10-01

The spreading of an infectious disease can trigger human behavioral responses to the disease, which in turn play a crucial role in the spreading of the epidemic. In this study, to illustrate the impacts of human behavioral responses, a new class of individuals, S(F), is introduced into the classical susceptible-infected-recovered model. In the model, the S(F) state represents susceptible individuals who take self-initiated protective measures to lower the probability of being infected; a susceptible individual may go to the S(F) state with a response rate when contacting an infectious neighbor. Via the percolation method, the theoretical formulas for the epidemic threshold as well as the prevalence of the epidemic are derived. Our finding indicates that, as the response rate increases, the epidemic threshold is enhanced and the prevalence of the epidemic is reduced. The analytical results are also verified by numerical simulations. In addition, we demonstrate that, because the mean field method neglects the dynamic correlations, it yields a wrong result: the epidemic threshold is not related to the response rate, i.e., the additional S(F) state has no impact on the epidemic threshold.
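
    A minimal well-mixed (mean-field) sketch of the extended model is below; note that per the authors this mean-field picture misses the threshold's dependence on the response rate, though it does reproduce the reduced prevalence. All parameter values are illustrative:

```python
def final_epidemic_size(beta=0.5, beta_f=0.1, w=0.2, gamma=0.25,
                        i0=1e-3, dt=0.05, steps=2000):
    """Euler-integrate a well-mixed SIR model with an extra S_F state:
    susceptibles adopt protection at response rate w per contact with an
    infectious individual, and S_F individuals are infected at the
    reduced rate beta_f < beta. Returns the final recovered fraction
    (the epidemic prevalence)."""
    s, sf, i, r = 1.0 - i0, 0.0, i0, 0.0
    for _ in range(steps):
        ds = -beta * s * i - w * s * i
        dsf = w * s * i - beta_f * sf * i
        di = beta * s * i + beta_f * sf * i - gamma * i
        s, sf, i, r = s + dt * ds, sf + dt * dsf, i + dt * di, r + dt * gamma * i
    return r
```

    Raising w drains susceptibles into the better-protected S_F pool early in the outbreak, which is why the final size shrinks as the response rate grows.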

  19. Exploring three faint source detection methods for aperture synthesis radio images

    NASA Astrophysics Data System (ADS)

    Peracaula, M.; Torrent, A.; Masias, M.; Lladó, X.; Freixenet, J.; Martí, J.; Sánchez-Sutil, J. R.; Muñoz-Arjonilla, A. J.; Paredes, J. M.

    2015-04-01

Wide-field radio interferometric images often contain a large population of faint compact sources. Due to their low intensity/noise ratio, these objects can be easily missed by automated detection methods, which have classically been based on thresholding techniques after local noise estimation. The aim of this paper is to present and analyse the performance of several alternative or complementary techniques to thresholding. We compare three different algorithms to increase the detection rate of faint objects. The first technique consists of combining wavelet decomposition with local thresholding. The second technique is based on the structural behaviour of the neighbourhood of each pixel. Finally, the third algorithm uses local features extracted from a bank of filters and a boosting classifier to perform the detections. The methods' performances are evaluated using simulations and radio mosaics from the Giant Metrewave Radio Telescope and the Australia Telescope Compact Array. We show that the new methods perform better than well-known state-of-the-art methods such as SEXTRACTOR, SAD and DUCHAMP at detecting faint sources in radio interferometric images.

  20. Pollutant threshold concentration determination in marine ecosystems using an ecological interaction endpoint.

    PubMed

    Wang, Changyou; Liang, Shengkang; Guo, Wenting; Yu, Hua; Xing, Wenhui

    2015-09-01

The threshold concentrations of pollutants are conventionally determined by extrapolating single-species effect data to community-level effects. This extrapolation relies on the most sensitive endpoint of the individual life cycle and on the species sensitivity distribution derived from single-species toxicity tests, thus ignoring ecological interactions. The uncertainties due to this extrapolation can be partially overcome using the equilibrium point of a customized ecosystem. This method incorporates ecological interactions and integrates the effects on growth, survival, and ingestion into a single effect measure, the equilibrium point excursion in the customized ecosystem, in order to describe the toxic effects on plankton. A case study showed that the threshold concentration of copper calculated with the endpoint of the equilibrium point was 10 μg L(-1), which is significantly different from the threshold calculated with a single-species endpoint. The endpoint calculated using this method provides a more relevant measure of the ecological impact than any single individual-level endpoint. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Mathematics of quantitative kinetic PCR and the application of standard curves.

    PubMed

    Rutledge, R G; Côté, C

    2003-08-15

Fluorescent monitoring of DNA amplification is the basis of real-time PCR, from which target DNA concentration can be determined from the fractional cycle at which a threshold amount of amplicon DNA is produced. Absolute quantification can be achieved using a standard curve constructed by amplifying known amounts of target DNA. In this study, the mathematics of quantitative PCR are examined in detail, from which several fundamental aspects of the threshold method and the application of standard curves are illustrated. The construction of five replicate standard curves for two pairs of nested primers was used to examine the reproducibility and degree of quantitative variation using SYBR Green I fluorescence. Based upon this analysis, the application of a single, well-constructed standard curve could provide an estimated precision of ±6-21%, depending on the number of cycles required to reach threshold. A simplified method for absolute quantification is also proposed, in which quantitative scale is determined by DNA mass at threshold.
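
    The standard-curve arithmetic can be sketched as an ordinary least-squares fit of threshold cycle against log10 input amount; with perfect doubling each cycle the slope is -1/log10(2) ≈ -3.32 and the amplification efficiency is 100%. Function names and the synthetic ideal data are illustrative:

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares fit of Ct against log10 input copies.
    Returns (slope, intercept, efficiency), with the usual
    efficiency estimate E = 10**(-1/slope) - 1."""
    n = len(ct_values)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
             / sum((x - mx) ** 2 for x in log10_copies))
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Read an unknown's starting copy number off the standard curve."""
    return 10 ** ((ct - intercept) / slope)
```

    The slope is the practically useful diagnostic: a slope much shallower than -3.32 signals sub-100% amplification efficiency, which is one reason the abstract's precision estimate varies with the number of cycles needed to reach threshold.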

  2. What Temperature of Coffee Exceeds the Pain Threshold? Pilot Study of a Sensory Analysis Method as Basis for Cancer Risk Assessment.

    PubMed

    Dirler, Julia; Winkler, Gertrud; Lachenmeier, Dirk W

    2018-06-01

The International Agency for Research on Cancer (IARC) evaluates "very hot (>65 °C) beverages" as probably carcinogenic to humans. However, there is a lack of research regarding what temperatures consumers actually perceive as "very hot" or as "too hot". A method for sensory analysis of such threshold temperatures was developed. The participants were asked to mix a very hot coffee step by step into a cooler coffee, so that the temperature of the coffee to be tasted increased incrementally during the test. The participants took a sip at every addition, until they perceived the beverage as too hot for consumption. The protocol was evaluated in the form of a pilot study with 87 participants. Interestingly, the average pain threshold of the test group (67 °C) and the preferred drinking temperature (63 °C) bracketed the IARC threshold for carcinogenicity (65 °C). The developed methodology was found fit for purpose and may be applied in larger studies.

  3. Rejection Thresholds in Solid Chocolate-Flavored Compound Coating

    PubMed Central

    Harwood, Meriel L.; Ziegler, Gregory R.; Hayes, John E.

    2012-01-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers versus melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate (SOA), a bitter GRAS additive. Paired preference tests (blank vs. spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between two self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (p = 0.01). Conversely, eating style did not affect group rejection thresholds (p = 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (p = 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. PMID:22924788

  4. Establishing seasonal and alert influenza thresholds in Cambodia using the WHO method: implications for effective utilization of influenza surveillance in the tropics and subtropics.

    PubMed

    Ly, Sovann; Arashiro, Takeshi; Ieng, Vanra; Tsuyuoka, Reiko; Parry, Amy; Horwood, Paul; Heng, Seng; Hamid, Sarah; Vandemaele, Katelijn; Chin, Savuth; Sar, Borann; Arima, Yuzo

    2017-01-01

    To establish seasonal and alert thresholds and transmission intensity categories for influenza to provide timely triggers for preventive measures or upscaling control measures in Cambodia. Using Cambodia's influenza-like illness (ILI) and laboratory-confirmed influenza surveillance data from 2009 to 2015, three parameters were assessed to monitor influenza activity: the proportion of ILI patients among all outpatients, proportion of ILI samples positive for influenza and the product of the two. With these parameters, four threshold levels (seasonal, moderate, high and alert) were established and transmission intensity was categorized based on a World Health Organization alignment method. Parameters were compared against their respective thresholds. Distinct seasonality was observed using the two parameters that incorporated laboratory data. Thresholds established using the composite parameter, combining syndromic and laboratory data, had the least number of false alarms in declaring season onset and were most useful in monitoring intensity. Unlike in temperate regions, the syndromic parameter was less useful in monitoring influenza activity or for setting thresholds. Influenza thresholds based on appropriate parameters have the potential to provide timely triggers for public health measures in a tropical country where monitoring and assessing influenza activity has been challenging. Based on these findings, the Ministry of Health plans to raise general awareness regarding influenza among the medical community and the general public. Our findings have important implications for countries in the tropics/subtropics and in resource-limited settings, and categorized transmission intensity can be used to assess severity of potential pandemic influenza as well as seasonal influenza.
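
    As a rough illustration of how thresholds can be derived from multi-season surveillance data, the sketch below uses synthetic weekly values of the composite parameter and sets the levels from simple percentiles. This is only a stand-in for the WHO average-curve alignment method; the season window, percentile choices, and all data are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
weeks = np.arange(52)

# Synthetic weekly composite parameter (%ILI x %positive) for six seasons,
# each with a Gaussian-shaped epidemic peaking near week 30.
seasons = np.array([
    2.0 + 8.0 * np.exp(-0.5 * ((weeks - peak) / 4.0) ** 2)
    + rng.normal(0, 0.3, 52)
    for peak in (30, 28, 32, 29, 31, 30)
])

# Crude fixed season window (weeks 20-40) standing in for onset alignment.
in_mask = (weeks >= 20) & (weeks <= 40)
off_season = seasons[:, ~in_mask]
in_season = seasons[:, in_mask]

# Seasonal threshold from off-season values; intensity levels from
# percentiles of in-season values (percentile choices are arbitrary here).
seasonal_threshold = np.percentile(off_season, 90)
moderate, high, alert = np.percentile(in_season, [40, 90, 97.5])
```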

  5. Validity of the Talk Test for exercise prescription after myocardial revascularization.

    PubMed

    Zanettini, Renzo; Centeleghe, Paola; Franzelli, Cristina; Mori, Ileana; Benna, Stefania; Penati, Chiara; Sorlini, Nadia

    2013-04-01

    For exercise prescription, rating of perceived exertion is the subjective tool most frequently used in addition to methods based on percentage of peak exercise variables. The aim of this study was to validate a subjective method, the Talk Test (TT), for optimization of training intensity in patients with recent myocardial revascularization. Fifty patients with recent myocardial revascularization (17 by coronary artery bypass grafting and 33 by percutaneous coronary intervention) were enrolled in a cardiac rehabilitation programme. Each patient underwent three repetitions of the TT during three different exercise sessions to evaluate the within-patient and between-operator reliability in assessing the workload (WL) at TT thresholds. These parameters were then compared with the data of a final cardiopulmonary exercise test, and the WL range between the individual aerobic threshold (AeT) and anaerobic threshold (AnT) was considered the optimal training zone. The within-patient and between-operator reliability in assessing TT thresholds were satisfactory. No significant differences were found between patients' and physiotherapists' evaluations of WL at the different TT thresholds. WL at the Last TT+ was between AeT and AnT in 88% of patients and slightly

  6. Influence of drug load on dissolution behavior of tablets containing a poorly water-soluble drug: estimation of the percolation threshold.

    PubMed

    Wenzel, Tim; Stillhart, Cordula; Kleinebudde, Peter; Szepes, Anikó

    2017-08-01

    Drug load plays an important role in the development of solid dosage forms, since it can significantly influence both processability and final product properties. The percolation threshold of the active pharmaceutical ingredient (API) corresponds to a critical concentration above which an abrupt change in drug product characteristics can occur. The objective of this study was to identify the percolation threshold of a poorly water-soluble drug with regard to its dissolution behavior from immediate-release tablets. The influence of the API particle size on the percolation threshold was also studied. Formulations with increasing drug loads were manufactured via roll compaction using constant process parameters and subsequent tableting. Drug dissolution was investigated in biorelevant medium. The percolation threshold was estimated via a model-dependent and a model-independent method based on the dissolution data. The intragranular concentration of mefenamic acid had a significant effect on granule and tablet characteristics, such as particle size distribution, compactibility and tablet disintegration. Increasing the intragranular drug concentration of the tablets resulted in lower dissolution rates. A percolation threshold of approximately 20% v/v could be determined for both API particle sizes, above which an abrupt decrease of the dissolution rate occurred. However, the increasing drug load had a more pronounced effect on the dissolution rate of tablets containing the micronized API, which can be attributed to the high agglomeration tendency of micronized substances during manufacturing steps such as roll compaction and tableting. Both methods applied for the estimation of the percolation threshold provided comparable values.

  7. Evaluation of quantification methods for real-time PCR minor groove binding hybridization probe assays.

    PubMed

    Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V

    2007-02-01

    Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the threshold crossing method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.
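
    The compared estimators can be contrasted on a simulated amplification curve. The sketch below implements an interpolated threshold crossing and a discrete second-derivative-maximum Cq; the logistic curve and its parameters are illustrative, not data from the HHV6 or EBV assays, and the sigmoidal curve-fit variant is omitted:

```python
import numpy as np

# Simulated amplification curve: fluorescence vs. cycle (logistic shape).
cycles = np.arange(1, 41, dtype=float)
f_max, c_half, slope = 100.0, 22.0, 1.5
fluor = f_max / (1.0 + np.exp(-(cycles - c_half) / slope))

def cq_threshold_crossing(cycles, fluor, threshold):
    """Cq by linear interpolation at the first threshold crossing."""
    i = int(np.argmax(fluor >= threshold))
    frac = (threshold - fluor[i - 1]) / (fluor[i] - fluor[i - 1])
    return float(cycles[i - 1] + frac * (cycles[i] - cycles[i - 1]))

def cq_second_derivative_max(cycles, fluor):
    """Cq at the cycle of maximum discrete second derivative."""
    d2 = np.diff(fluor, 2)                  # centered at cycles[i + 1]
    return float(cycles[int(np.argmax(d2)) + 1])

cq1 = cq_threshold_crossing(cycles, fluor, 10.0)
cq2 = cq_second_derivative_max(cycles, fluor)
```

The second-derivative estimate lands later than a low threshold crossing, near the steepest acceleration of the curve, which is why it needs no user-chosen threshold.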

  8. Using Reanalysis Data for the Prediction of Seasonal Wind Turbine Power Losses Due to Icing

    NASA Astrophysics Data System (ADS)

    Burtch, D.; Mullendore, G. L.; Delene, D. J.; Storm, B.

    2013-12-01

    The Northern Plains region of the United States is home to a significant amount of potential wind energy. However, in winter months capturing this potential power is severely impacted by the meteorological conditions, in the form of icing. Predicting the expected loss in power production due to icing is a valuable parameter that can be used in wind turbine operations, determination of wind turbine site locations and long-term energy estimates which are used for financing purposes. Currently, losses due to icing must be estimated when developing predictions for turbine feasibility and financing studies, while icing maps, a tool commonly used in Europe, are lacking in the United States. This study uses the Modern-Era Retrospective Analysis for Research and Applications (MERRA) dataset in conjunction with turbine production data to investigate various methods of predicting seasonal losses (October-March) due to icing at two wind turbine sites located 121 km apart in North Dakota. The prediction of icing losses is based on temperature and relative humidity thresholds and is accomplished using three methods. For each of the three methods, the required atmospheric variables are determined in one of two ways: using industry-specific software to correlate anemometer data in conjunction with the MERRA dataset and using only the MERRA dataset for all variables. For each season, a percentage of the total expected generated power lost due to icing is determined and compared to observed losses from the production data. An optimization is performed in order to determine the relative humidity threshold that minimizes the difference between the predicted and observed values. Eight seasons of data are used to determine an optimal relative humidity threshold, and a further three seasons of data are used to test this threshold. 
Preliminary results have shown that the optimized relative humidity threshold for the northern turbine is higher than the southern turbine for all methods. For the three test seasons, the optimized thresholds tend to under-predict the icing losses. However, the threshold determined using boundary layer similarity theory most closely predicts the power losses due to icing versus the other methods. For the northern turbine, the average predicted power loss over the three seasons is 4.65 % while the observed power loss is 6.22 % (average difference of 1.57 %). For the southern turbine, the average predicted power loss and observed power loss over the same time period are 4.43 % and 6.16 %, respectively (average difference of 1.73 %). The three-year average, however, does not clearly capture the variability that exists season-to-season. On examination of each of the test seasons individually, the optimized relative humidity threshold methodology performs better than fixed power loss estimates commonly used in the wind energy industry.
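
    The optimization step described above reduces to a one-dimensional search over candidate relative humidity thresholds. A minimal sketch with entirely synthetic weather and production numbers (the fixed temperature threshold, the flagged-hours power model, and every value are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly season: temperature (deg C), relative humidity (%),
# and potential power (arbitrary units).
temp = rng.uniform(-20.0, 10.0, 4000)
rh = rng.uniform(40.0, 100.0, 4000)
power = rng.uniform(0.0, 1.0, 4000)

def predicted_loss_pct(rh_thresh, temp, rh, power, t_thresh=0.0):
    """Percent of seasonal energy falling in hours flagged as icing
    (T <= t_thresh and RH >= rh_thresh)."""
    icing = (temp <= t_thresh) & (rh >= rh_thresh)
    return 100.0 * power[icing].sum() / power.sum()

observed_loss = 6.2   # % of seasonal energy, synthetic "production data"

# Grid search: pick the RH threshold minimizing |predicted - observed|.
candidates = np.arange(80.0, 100.0, 0.5)
errors = [abs(predicted_loss_pct(c, temp, rh, power) - observed_loss)
          for c in candidates]
best_rh = float(candidates[int(np.argmin(errors))])
```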

  9. SU-C-9A-01: Parameter Optimization in Adaptive Region-Growing for Tumor Segmentation in PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, S; Huazhong University of Science and Technology, Wuhan, Hubei; Xue, M

    Purpose: To design a reliable method to determine the optimal parameter in the adaptive region-growing (ARG) algorithm for tumor segmentation in PET. Methods: The ARG uses an adaptive similarity criterion m − fσ ≤ I_PET ≤ m + fσ, so that a neighboring voxel is appended to the region based on its similarity to the current region. When increasing the relaxing factor f (f ≥ 0), the resulting volumes monotonically increased, with a sharp increase when the region just grew into the background. The optimal f that separates the tumor from the background is defined as the first point with the local maximum curvature on an Error function fitted to the f-volume curve. The ARG was tested on a tumor segmentation benchmark that includes ten lung cancer patients with 3D pathologic tumor volume as ground truth. For comparison, the widely used 42% and 50% SUVmax thresholding, Otsu optimal thresholding, Active Contours (AC), Geodesic Active Contours (GAC), and Graph Cuts (GC) methods were tested. The dice similarity index (DSI), volume error (VE), and maximum axis length error (MALE) were calculated to evaluate the segmentation accuracy. Results: The ARG provided the highest accuracy among all tested methods. Specifically, the ARG has an average DSI, VE, and MALE of 0.71, 0.29, and 0.16, respectively, better than the absolute 42% thresholding (DSI=0.67, VE=0.57, and MALE=0.23), the relative 42% thresholding (DSI=0.62, VE=0.41, and MALE=0.23), the absolute 50% thresholding (DSI=0.62, VE=0.48, and MALE=0.21), the relative 50% thresholding (DSI=0.48, VE=0.54, and MALE=0.26), Otsu (DSI=0.44, VE=0.63, and MALE=0.30), AC (DSI=0.46, VE=0.85, and MALE=0.47), GAC (DSI=0.40, VE=0.85, and MALE=0.46), and GC (DSI=0.66, VE=0.54, and MALE=0.21) methods. Conclusions: The results suggest that the proposed method reliably identified the optimal relaxing factor in ARG for tumor segmentation in PET.
This work was supported in part by National Cancer Institute Grant R01 CA172638. The dataset is provided by AAPM TG211.

  10. Edge detection based on adaptive threshold b-spline wavelet for optical sub-aperture measuring

    NASA Astrophysics Data System (ADS)

    Zhang, Shiqi; Hui, Mei; Liu, Ming; Zhao, Zhu; Dong, Liquan; Liu, Xiaohua; Zhao, Yuejin

    2015-08-01

    In research on optical synthetic aperture imaging systems, phase congruency is the main problem, and it is necessary to detect the sub-aperture phase. The edge of the sub-aperture system is more complex than in a traditional optical imaging system. Because of the steep slope of large-aperture optical components, interference fringes may be quite dense in interference imaging, and a deep phase gradient may cause a loss of phase information. An efficient edge detection method is therefore needed. Wavelet analysis is a powerful tool widely used in image processing. Owing to its multi-scale transform properties, edge regions are detected with high precision at small scales, while noise is progressively suppressed as the scale increases, so the transform has a certain noise-suppression effect. Moreover, an adaptive threshold method, which sets different thresholds in different regions, can separate edge points from noise. First, the fringe pattern is obtained and a cubic b-spline wavelet is adopted as the smoothing function. After multi-scale wavelet decomposition of the whole image, the local modulus maxima in the gradient directions are found. Because these still contain noise, the adaptive threshold method is used to select the modulus maxima: points greater than the threshold value are taken as boundary points. Finally, erosion and dilation are applied to the resulting image to obtain consecutive image boundaries.

  11. Ventilatory thresholds determined from HRV: comparison of 2 methods in obese adolescents.

    PubMed

    Quinart, S; Mourot, L; Nègre, V; Simon-Rigaud, M-L; Nicolet-Guénat, M; Bertrand, A-M; Meneveau, N; Mougin, F

    2014-03-01

    The development of personalised training programmes is crucial in the management of obesity. We evaluated the ability of 2 heart rate variability analyses to determine ventilatory thresholds (VT) in obese adolescents. 20 adolescents (mean age 14.3±1.6 years and body mass index z-score 4.2±0.1) performed an incremental test to exhaustion before and after a 9-month multidisciplinary management programme. The first (VT1) and second (VT2) ventilatory thresholds were identified by the reference method (gas exchanges). We recorded RR intervals to estimate VT1 and VT2 from heart rate variability using time-domain analysis and time-varying spectral-domain analysis. The correlation coefficients between thresholds were higher with spectral-domain analysis than with time-domain analysis (heart rate at VT1: r=0.91 vs. r=0.66; at VT2: r=0.91 vs. r=0.66; power at VT1: r=0.91 vs. r=0.74; at VT2: r=0.93 vs. r=0.78; spectral-domain vs. time-domain analysis, respectively). No systematic bias in heart rate at VT1 and VT2, with standard deviations <6 bpm, was found, confirming that spectral-domain analysis could replace the reference method for the detection of ventilatory thresholds. Furthermore, this technique is sensitive to rehabilitation and re-training, which underlines its utility in clinical practice. This inexpensive and non-invasive tool is promising for prescribing physical activity programs in obese adolescents. © Georg Thieme Verlag KG Stuttgart · New York.

  12. Psychophysical assessment of low visual function in patients with retinal degenerative diseases (RDDs) with the Diagnosys full-field stimulus threshold (D-FST).

    PubMed

    Klein, M; Birch, D G

    2009-12-01

    To determine whether the Diagnosys full-field stimulus threshold (D-FST) is a valid, sensitive and repeatable psychophysical method of measuring and following visual function in low-vision subjects. Fifty-three affected eyes of 42 subjects with severe retinal degenerative diseases (RDDs) were tested with achromatic stimuli on the D-FST. Included were subjects who were either unable to perform static perimetry or had non-detectable or sub-microvolt electroretinograms (ERGs). A subset of 21 eyes of 17 subjects was tested on both the D-FST and the FST2, a previously established full-field threshold test. Seven eyes of 7 normal control subjects were tested on both the D-FST and the FST2. Results for the two methods were compared with the Bland-Altman test. On the D-FST, a threshold could successfully be determined for 13 of 14 eyes with light perception (LP) only (median 0.9 +/- 1.4 log cd/m2), and all eyes determined to be counting fingers (CF; median 0.3 +/- 1.8 log cd/m2). The median full-field threshold for the normal controls was -4.3 +/- 0.6 log cd/m2 on the D-FST and -4.8 +/- 0.9 log cd/m2 on the FST2. The D-FST offers a commercially available method with a robust psychophysical algorithm and is a useful tool for following visual function in low vision subjects.

  13. Extraction of Extended Small-Scale Objects in Digital Images

    NASA Astrophysics Data System (ADS)

    Volkov, V. Y.

    2015-05-01

    The problem of detecting and localizing extended small-scale objects of different shapes arises in radio observation systems that use SAR, infra-red, lidar, and television cameras. An intensive non-stationary background is the main difficulty for processing. Another challenge is the low quality of the images, with blobs and blurred boundaries; in addition, SAR images suffer from serious intrinsic speckle noise. Background statistics are not normal, with evident skewness and heavy tails in the probability density, so the background is hard to identify. The problem of extracting small-scale objects is solved here on the basis of directional filtering, adaptive thresholding, and morphological analysis. A new kind of mask is used, open-ended at one side, which makes it possible to extract the ends of line segments of unknown length. An advanced method of dynamic adaptive threshold setting is investigated, based on the extraction of isolated fragments after thresholding. A hierarchy of isolated fragments in the binary image is proposed for the analysis of segmentation results; it includes small-scale objects of different shape, size, and orientation. The method extracts isolated fragments in the binary image and counts the points in these fragments. The number of points in the extracted fragments, normalized to the total number of points for a given threshold, is used as the effectiveness of extraction for those fragments. The new method for adaptive threshold setting and control maximizes this effectiveness of extraction. It has optimality properties for object extraction in a normal noise field and shows effective results for real SAR images.

  14. A longitudinal study on the ammonia threshold in junior cyclists

    PubMed Central

    Yuan, Y; Chan, K

    2004-01-01

    Objectives: To identify the effect of a one year non-specific training programme on the ammonia threshold of a group of junior cyclists and to correlate ammonia threshold with other common physiological variables. Methods: The cyclists performed tests at three time points (T1, T2, T3) during the year. Follow up tests were conducted every six months after the original test. Ammonia threshold was obtained from a graded exercise with four minute steps. Results: The relatively non-specific one year training programme was effective in inducing an increase in peak VO2 (60.6 (5.9), 65.9 (7.4), and 64.6 (6.5) ml/min/kg at T1, T2, and T3 respectively) and endurance time (18.3 (4.5), 20.1 (5.2), and 27.0 (6.1) minutes at T1, T2, and T3 respectively), but was not effective for the sprint related variables. Ammonia threshold, together with lactate threshold and ventilatory threshold, was not significantly different at the three test times. Only endurance time correlated significantly with ammonia threshold (r  =  0.915, p  =  0.001). Conclusions: The findings suggest that a relatively non-specific one year training programme does not modify the ammonia threshold of junior cyclists. The significant correlation between ammonia threshold and endurance time further confirms that ammonia threshold is a measure of the ability to sustain exercise at submaximal intensities. PMID:15039242

  15. Outlier detection for particle image velocimetry data using a locally estimated noise variance

    NASA Astrophysics Data System (ADS)

    Lee, Yong; Yang, Hua; Yin, ZhouPing

    2017-03-01

    This work describes an adaptive spatially variable threshold outlier detection algorithm for raw gridded particle image velocimetry data using a locally estimated noise variance. The method is an iterative procedure, and each iteration is composed of a reference vector field reconstruction step and an outlier detection step. We construct the reference vector field using a weighted adaptive smoothing method (Garcia 2010 Comput. Stat. Data Anal. 54 1167-78), and the weights are determined in the outlier detection step using a modified outlier detector (Ma et al 2014 IEEE Trans. Image Process. 23 1706-21). A hard decision on the final weights of the iteration produces the outlier labels of the field. The technical contribution is that the spatially variable threshold, with a locally estimated noise variance, is embedded in the modified outlier detector in an iterative framework for the first time. A spatially variable threshold turns out to be preferable to a single constant threshold in complicated flows such as vortex flows or turbulent flows. Synthetic cellular vortical flows with simulated scattered or clustered outliers are adopted to evaluate the performance of the proposed method in comparison with popular validation approaches, and the method also proves beneficial in a real PIV measurement of turbulent flow. The experimental results demonstrate that the proposed method yields competitive performance in terms of outlier under-detection and over-detection counts. In addition, the outlier detection method is computationally efficient and adaptive, requires no user-defined parameters, and corresponding implementations are provided in the supplementary materials.
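
    A much-simplified, non-iterative stand-in for such a detector (closer to a normalized-median test with a locally estimated noise scale than to the authors' full iterative weighted-smoothing scheme; window size, k, and eps are assumptions):

```python
import numpy as np

def outlier_mask(u, k=2.0, eps=0.1):
    """Flag vectors whose residual from the local 3x3 median exceeds k times
    a locally estimated noise scale (median absolute deviation + eps)."""
    ny, nx = u.shape
    mask = np.zeros(u.shape, dtype=bool)
    for i in range(ny):
        for j in range(nx):
            win = u[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2].ravel()
            med = np.median(win)
            scale = np.median(np.abs(win - med)) + eps   # local noise estimate
            mask[i, j] = abs(u[i, j] - med) / scale > k
    return mask

# Smooth synthetic velocity component with one planted outlier.
u = 0.1 * np.outer(np.arange(6), np.ones(6))
u[3, 3] = 5.0
mask = outlier_mask(u)
```

Because the scale is estimated per window, the effective threshold varies over the field, which is the core idea behind preferring a spatially variable threshold in non-uniform flows.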

  16. An improved TV caption image binarization method

    NASA Astrophysics Data System (ADS)

    Jiang, Mengdi; Cheng, Jianghua; Chen, Minghui; Ku, Xishu

    2018-04-01

    TV video caption image binarization has an important influence on semantic video retrieval. An improved binarization method for caption images is proposed in this paper. To overcome the ghosting and broken-stroke problems of the traditional Niblack method, the proposed method considers both the global and the local information of the image. First, traditional Otsu and Niblack thresholds are used for initial binarization. Second, the difference between the maximum and minimum values in a local window is introduced as a third threshold, generating two images. Finally, a logical AND operation on the two images yields the result. Experimental results show that the proposed method is reliable and effective.
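
    A hedged sketch of the three-threshold idea for bright captions on a dark background, combining a global Otsu mask, a Niblack-style local mask, and a local max-min contrast gate with a logical AND (the window size, Niblack k, contrast value, and bright-text orientation are all assumptions; the paper's exact combination may differ):

```python
import numpy as np

def otsu_threshold(img):
    """Global Otsu threshold for an 8-bit grayscale image."""
    p = np.bincount(img.ravel(), minlength=256).astype(float)
    p /= p.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu[-1] * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return int(np.argmax(sigma_b))

def binarize_caption(img, k=-0.2, w=7, range_min=30):
    """AND-combine Otsu, Niblack (T = m + k*s), and a contrast gate that
    suppresses flat regions whose local max-min range is small (anti-ghosting)."""
    pad = w // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    mean = np.empty(img.shape); std = np.empty(img.shape); rng_ = np.empty(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = padded[i:i + w, j:j + w]
            mean[i, j], std[i, j] = win.mean(), win.std()
            rng_[i, j] = win.max() - win.min()
    otsu_mask = img > otsu_threshold(img)
    niblack_mask = img > (mean + k * std)
    contrast_mask = rng_ >= range_min
    return otsu_mask & niblack_mask & contrast_mask

# Synthetic caption: one bright stroke on a dark, flat background.
img = np.full((20, 20), 40, dtype=np.uint8)
img[8:12, 2:18] = 220
binary = binarize_caption(img)
```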

  17. A Multi-Channel Method for Detecting Periodic Forced Oscillations in Power Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.; Tuffner, Francis K.

    2016-11-14

    Forced oscillations in electric power systems are often symptomatic of equipment malfunction or improper operation. Detecting and addressing the cause of the oscillations can improve overall system operation. In this paper, a multi-channel method of detecting forced oscillations and estimating their frequencies is proposed. The method operates by comparing the sum of scaled periodograms from various channels to a threshold. A method of setting the threshold to specify the detector's probability of false alarm while accounting for the correlation between channels is also presented. Results from simulated and measured power system data indicate that the method outperforms its single-channel counterpart and is suitable for real-world applications.
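
    The detection statistic can be sketched as a sum of noise-scaled periodograms compared with a threshold. In this sketch the threshold is a crude multiple of the mean statistic standing in for the paper's calibrated probability-of-false-alarm computation, and the three "channels" are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 30.0, 1800                  # illustrative phasor rate (Hz) and length
t = np.arange(n) / fs
f_forced = 1.2                      # Hz, injected forced-oscillation frequency

# Three channels sharing a common sinusoid buried in white noise.
channels = [0.3 * np.sin(2 * np.pi * f_forced * t + ph) + rng.standard_normal(n)
            for ph in (0.0, 1.0, 2.0)]

def detect_forced_oscillation(channels, fs, factor=5.0):
    """Sum per-channel periodograms scaled by a robust noise-level estimate,
    then flag frequencies whose summed statistic exceeds the threshold."""
    n = len(channels[0])
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    stat = np.zeros(len(freqs))
    for x in channels:
        pxx = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n   # periodogram
        stat += pxx / np.median(pxx)                       # noise scaling
    thresh = factor * stat.mean()
    return freqs[(stat > thresh) & (freqs > 0)]

detected = detect_forced_oscillation(channels, fs)
```

Summing across channels concentrates the common oscillation's energy while per-channel noise averages out, which is why the multi-channel statistic outperforms a single-channel test.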

  18. Planning Target Margin Calculations for Prostate Radiotherapy Based on Intrafraction and Interfraction Motion Using Four Localization Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beltran, Chris; Herman, Michael G.; Davis, Brian J.

    2008-01-01

    Purpose: To determine planning target volume (PTV) margins for prostate radiotherapy based on the internal margin (IM) (intrafractional motion) and the setup margin (SM) (interfractional motion) for four daily localization methods: skin marks (tattoo), pelvic bony anatomy (bone), intraprostatic gold seeds using a 5-mm action threshold, and using no threshold. Methods and Materials: Forty prostate cancer patients were treated with external radiotherapy according to an online localization protocol using four intraprostatic gold seeds and electronic portal images (EPIs). Daily localization and treatment EPIs were obtained. These data allowed inter- and intrafractional analysis of prostate motion. The SM for the four daily localization methods and the IM were determined. Results: A total of 1532 fractions were analyzed. Tattoo localization requires a SM of 6.8 mm left-right (LR), 7.2 mm inferior-superior (IS), and 9.8 mm anterior-posterior (AP). Bone localization requires 3.1, 8.9, and 10.7 mm, respectively. The 5-mm threshold localization requires 4.0, 3.9, and 3.7 mm. No threshold localization requires 3.4, 3.2, and 3.2 mm. The intrafractional prostate motion requires an IM of 2.4 mm LR, 3.4 mm IS and AP. The PTV margin using the 5-mm threshold, including interobserver uncertainty, IM, and SM, is 4.8 mm LR, 5.4 mm IS, and 5.2 mm AP. Conclusions: Localization based on EPI with implanted gold seeds allows a large PTV margin reduction when compared with tattoo localization. Except for the LR direction, bony anatomy localization does not decrease the margins compared with tattoo localization. Intrafractional prostate motion is a limiting factor on margin reduction.

  19. Predicting the susceptibility to gully initiation in data-poor regions

    NASA Astrophysics Data System (ADS)

    Dewitte, Olivier; Daoudi, Mohamed; Bosco, Claudio; Van Den Eeckhaut, Miet

    2015-01-01

    Permanent gullies are common features in many landscapes and quite often they represent the dominant soil erosion process. Once a gully has initiated, field evidence shows that gully channel formation and headcut migration rapidly occur. In order to prevent the undesired effects of gullying, there is a need to predict the places where new gullies might initiate. From detailed field measurements, studies have demonstrated strong inverse relationships between slope gradient of the soil surface (S) and drainage area (A) at the point of channel initiation across catchments in different climatic and morphological environments. Such slope-area thresholds (S-A) can be used to predict locations in the landscape where gullies might initiate. However, acquiring S-A requires detailed field investigations and accurate high resolution digital elevation data, which are usually difficult to acquire. To circumvent this issue, we propose a two-step method that uses published S-A thresholds and a logistic regression analysis (LR). S-A thresholds from the literature are used as proxies of field measurement. The method is calibrated and validated on a watershed, close to the town of Algiers, northern Algeria, where gully erosion affects most of the slopes. The gullies extend up to several kilometres in length and cover 16% of the study area. First we reconstruct the initiation areas of the existing gullies by applying S-A thresholds for similar environments. Then, using the initiation area map as the dependent variable with combinations of topographic and lithological predictor variables, we calibrate several LR models. It provides relevant results in terms of statistical reliability, prediction performance, and geomorphological significance. This method using S-A thresholds with data-driven assessment methods like LR proves to be efficient when applied to common spatial data and establishes a methodology that will allow similar studies to be undertaken elsewhere.
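
    The first step amounts to applying a power-law slope-area threshold, S ≥ aA⁻ᵇ, to flag candidate initiation cells. The coefficients below are placeholders rather than values from the Algiers study area, and the subsequent logistic-regression step is not shown:

```python
import numpy as np

# Placeholder slope-area threshold coefficients (S >= a * A**-b).
a, b = 0.08, 0.4

# Synthetic cells: drainage area (ha) and local slope (m/m).
area = np.array([0.5, 2.0, 5.0, 20.0, 50.0])
slope = np.array([0.30, 0.10, 0.09, 0.02, 0.06])

# A cell is a candidate gully-initiation site when its slope meets the
# threshold for its drainage area.
threshold_slope = a * area ** (-b)
initiation = slope >= threshold_slope
```

The resulting binary map plays the role of the dependent variable for the logistic regression calibrated in the second step.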

  20. Optimum parameters of image preprocessing method for Shack-Hartmann wavefront sensor in different SNR condition

    NASA Astrophysics Data System (ADS)

    Wei, Ping; Li, Xinyang; Luo, Xi; Li, Jianfeng

    2018-02-01

    The centroid method is commonly adopted to locate the spot in the sub-apertures of the Shack-Hartmann wavefront sensor (SH-WFS); image preprocessing is required before calculating the spot location because the centroid method is extremely sensitive to noise. In this paper, the SH-WFS image was simulated according to the characteristics of the noise, background, and intensity distribution. Optimal parameters of the SH-WFS image preprocessing method were put forward for different signal-to-noise ratio (SNR) conditions, with the wavefront reconstruction error used as the evaluation index. Two image preprocessing methods, thresholding and windowing combined with thresholding, were compared by studying the applicable range of SNR and analyzing the stability of the two methods, respectively.
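
    A minimal sketch of why the thresholding step matters for centroiding: a uniform background biases the raw intensity-weighted centroid toward the frame center, and subtracting a threshold largely removes that bias. The Gaussian spot, background level, noise, and threshold value are all assumptions:

```python
import numpy as np

def centroid(img, threshold=0.0):
    """Intensity-weighted centroid after threshold subtraction and clipping."""
    im = np.clip(img.astype(float) - threshold, 0.0, None)
    ys, xs = np.indices(im.shape)
    total = im.sum()
    return (xs * im).sum() / total, (ys * im).sum() / total

# Synthetic sub-aperture spot at (x, y) = (12.3, 8.7) plus background + noise.
rng = np.random.default_rng(2)
ys, xs = np.indices((24, 24))
spot = 100.0 * np.exp(-((xs - 12.3) ** 2 + (ys - 8.7) ** 2) / (2 * 2.0 ** 2))
img = spot + 5.0 + rng.standard_normal((24, 24))

x_raw, y_raw = centroid(img)                 # biased toward the frame center
x_thr, y_thr = centroid(img, threshold=8.0)  # background mostly removed
```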

  1. Detection and quantification system for monitoring instruments

    DOEpatents

    Dzenitis, John M [Danville, CA; Hertzog, Claudia K [Houston, TX; Makarewicz, Anthony J [Livermore, CA; Henderer, Bruce D [Livermore, CA; Riot, Vincent J [Oakland, CA

    2008-08-12

    A method of detecting real events by obtaining a set of recent signal results, calculating measures of the noise or variation based on the set of recent signal results, calculating an expected baseline value based on the set of recent signal results, determining sample deviation, calculating an allowable deviation by multiplying the sample deviation by a threshold factor, setting an alarm threshold from the baseline value plus or minus the allowable deviation, and determining whether the signal results exceed the alarm threshold.
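
    The claimed steps map directly onto a few lines of code. In this sketch the baseline is taken to be the median of the recent window and the variation measure the sample standard deviation; the patent claim leaves the exact estimators open, so these choices are assumptions:

```python
import numpy as np

def alarm_thresholds(recent, k=3.0):
    """Baseline +/- k * sample deviation from a window of recent results."""
    recent = np.asarray(recent, dtype=float)
    baseline = np.median(recent)          # expected baseline value
    sample_dev = recent.std(ddof=1)       # measure of noise/variation
    allowable = k * sample_dev            # allowable deviation
    return baseline - allowable, baseline + allowable

def is_real_event(value, recent, k=3.0):
    """A signal result is a candidate real event when it exceeds either alarm threshold."""
    lo, hi = alarm_thresholds(recent, k)
    return value < lo or value > hi

recent = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]
```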

  2. Application of Key Events and Analysis to Chemical Carcinogens and Noncarcinogens

    EPA Science Inventory

    The existence of thresholds for toxicants is a matter of debate in chemical risk assessment and regulation. Current risk assessment methods are based on the assumption that, in the absence of sufficient data, carcinogenesis does not have a threshold, while non-carcinogenic endpoi...

  3. Swarm: robust and fast clustering method for amplicon-based studies.

    PubMed

    Mahé, Frédéric; Rognes, Torbjørn; Quince, Christopher; de Vargas, Colomban; Dunthorn, Micah

    2014-01-01

    Popular de novo amplicon clustering methods suffer from two fundamental flaws: arbitrary global clustering thresholds, and input-order dependency induced by centroid selection. Swarm was developed to address these issues by first clustering nearly identical amplicons iteratively using a local threshold, and then by using clusters' internal structure and amplicon abundances to refine its results. This fast, scalable, and input-order independent approach reduces the influence of clustering parameters and produces robust operational taxonomic units.
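
    A toy version of local-threshold, input-order-robust clustering: grow each cluster by single linkage, admitting any unassigned amplicon within edit distance d of any current member. This illustrates only the first stage; Swarm's abundance-based refinement of cluster internal structure is omitted:

```python
from collections import deque

def levenshtein(a, b):
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def swarm_like_clusters(amplicons, d=1):
    """Iteratively link amplicons within a LOCAL distance d of any cluster
    member, instead of a global radius around one chosen centroid."""
    unassigned = set(amplicons)
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        cluster, frontier = {seed}, deque([seed])
        while frontier:
            cur = frontier.popleft()
            linked = {a for a in unassigned if levenshtein(cur, a) <= d}
            unassigned -= linked
            cluster |= linked
            frontier.extend(linked)
        clusters.append(cluster)
    return clusters

seqs = ["ACGT", "ACGA", "ACGAA", "TTTT", "TTTA"]
clusters = swarm_like_clusters(seqs)   # chain-links ACGT-ACGA-ACGAA
```

Because membership depends only on the set of pairwise distances, the partition is independent of input order, unlike centroid-based greedy clustering.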

  4. Swarm: robust and fast clustering method for amplicon-based studies

    PubMed Central

    Rognes, Torbjørn; Quince, Christopher; de Vargas, Colomban; Dunthorn, Micah

    2014-01-01

    Popular de novo amplicon clustering methods suffer from two fundamental flaws: arbitrary global clustering thresholds, and input-order dependency induced by centroid selection. Swarm was developed to address these issues by first clustering nearly identical amplicons iteratively using a local threshold, and then by using clusters’ internal structure and amplicon abundances to refine its results. This fast, scalable, and input-order independent approach reduces the influence of clustering parameters and produces robust operational taxonomic units. PMID:25276506

  5. Threshold regression to accommodate a censored covariate.

    PubMed

    Qian, Jing; Chiou, Sy Han; Maye, Jacqueline E; Atem, Folefac; Johnson, Keith A; Betensky, Rebecca A

    2018-06-22

    In several common study designs, regression modeling is complicated by the presence of censored covariates. Examples of such covariates include maternal age of onset of dementia that may be right censored in an Alzheimer's amyloid imaging study of healthy subjects, metabolite measurements that are subject to limit of detection censoring in a case-control study of cardiovascular disease, and progressive biomarkers whose baseline values are of interest, but are measured post-baseline in longitudinal neuropsychological studies of Alzheimer's disease. We propose threshold regression approaches for linear regression models with a covariate that is subject to random censoring. Threshold regression methods allow for immediate testing of the significance of the effect of a censored covariate. In addition, they provide for unbiased estimation of the regression coefficient of the censored covariate. We derive the asymptotic properties of the resulting estimators under mild regularity conditions. Simulations demonstrate that the proposed estimators have good finite-sample performance, and often offer improved efficiency over existing methods. We also derive a principled method for selection of the threshold. We illustrate the approach in application to an Alzheimer's disease study that investigated brain amyloid levels in older individuals, as measured through positron emission tomography scans, as a function of maternal age of dementia onset, with adjustment for other covariates. We have developed an R package, censCov, for implementation of our method, available at CRAN. © 2018, The International Biometric Society.

  6. A wavelet-based adaptive fusion algorithm of infrared polarization imaging

    NASA Astrophysics Data System (ADS)

    Yang, Wei; Gu, Guohua; Chen, Qian; Zeng, Haifang

    2011-08-01

    The purpose of infrared polarization imaging is to highlight man-made targets against a complex natural background. Because infrared polarization images can significantly distinguish targets from backgrounds with different features, this paper presents a wavelet-based infrared polarization image fusion algorithm. The method mainly processes the high-frequency portion of the signal; for the low-frequency portion, the original weighted-average method is applied. The high-frequency part is processed as follows: first, the high-frequency information of the source images is extracted by wavelet transform; then the signal strength of a 3*3 window area is calculated, and the ratio of regional signal intensities between the source images is used as a matching measure. The extraction method and decision mode for the details are determined by a decision-making module, and the fusion effect is closely related to the threshold set in that module. In place of the commonly used experimental approach, a quadratic interpolation optimization algorithm is proposed in this paper to obtain the threshold: the endpoints and midpoint of the threshold search interval are set as the initial interpolation nodes, and the minimum of the quadratic interpolation function is computed. The best threshold is obtained by comparing these minima. A series of image quality evaluations shows that the method improves the fusion effect; moreover, it is effective not only for individual images but also for large numbers of images.
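
    A single quadratic-interpolation step over the search interval can be sketched as below (an illustrative stand-in for the paper's optimization; the cost function is supplied by the caller and the example cost is made up):

    ```python
    def quadratic_interp_min(cost, a, b):
        """Fit a parabola through the endpoints and midpoint of the
        threshold search interval [a, b] and return the abscissa of its
        minimum (the candidate best threshold)."""
        m = 0.5 * (a + b)
        fa, fm, fb = cost(a), cost(m), cost(b)
        denom = fa - 2.0 * fm + fb      # curvature term of the parabola
        if denom <= 0:                  # flat or concave: no interior minimum
            return m
        # vertex of the parabola through (a, fa), (m, fm), (b, fb)
        return m + 0.25 * (b - a) * (fa - fb) / denom
    ```

    For an exactly quadratic cost the vertex is recovered in one step; in practice one would iterate, shrinking the interval around the returned point.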

  7. Joint Dictionary Learning for Multispectral Change Detection.

    PubMed

    Lu, Xiaoqiang; Yuan, Yuan; Zheng, Xiangtao

    2017-04-01

    Change detection is one of the most important applications of remote sensing technology. It is a challenging task due to the obvious variations in the radiometric values of spectral signatures and the limited capability of utilizing spectral information. In this paper, an improved sparse coding method for change detection is proposed. The intuition of the proposed method is that unchanged pixels in different images can be well reconstructed by the joint dictionary, which encodes knowledge of unchanged pixels, while changed pixels cannot. First, a query image pair is projected onto the joint dictionary to constitute the knowledge of unchanged pixels. Then the reconstruction error is used to discriminate between the changed and unchanged pixels in the different images. To select proper thresholds for determining changed regions, an automatic threshold selection strategy is presented that minimizes the reconstruction errors of the changed pixels. Extensive experiments on multispectral data were conducted, and comparison with state-of-the-art methods demonstrates the superiority of the proposed method. The contributions of the proposed method can be summarized as follows: 1) joint dictionary learning is proposed to explore the intrinsic information of different images for change detection, so that change detection can be cast as a sparse representation problem; to the authors' knowledge, few publications utilize joint dictionary learning in change detection; 2) an automatic threshold selection strategy is presented that minimizes the reconstruction errors of the changed pixels without prior assumptions about the spectral signature, so the threshold value provided by the proposed method can adapt to different data owing to the characteristics of joint dictionary learning; and 3) the proposed method makes no prior assumptions in the modeling and handling of the spectral signature, and so can be adapted to different data.
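
    The reconstruction-error test can be caricatured with a sparsity-one "dictionary" in which each atom is a (before, after) pixel-value pair; this is a hedged toy, not the paper's joint dictionary learning:

    ```python
    def reconstruction_error(pair, dictionary):
        """Sparsity-1 stand-in for sparse coding: the error of a pixel
        pair is its distance to the nearest joint-dictionary atom, each
        atom being an unchanged (before, after) intensity pair."""
        return min(((pair[0] - a) ** 2 + (pair[1] - b) ** 2) ** 0.5
                   for a, b in dictionary)

    def changed(pair, dictionary, threshold):
        """A pair the unchanged-pixel dictionary cannot reconstruct well
        is declared changed."""
        return reconstruction_error(pair, dictionary) > threshold
    ```

    A pixel that keeps roughly the same value in both images lies near some atom and is flagged unchanged, while a pixel that jumps (e.g. dark before, bright after) lands far from every atom.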

  8. Field hearing measurements of the Atlantic sharpnose shark Rhizoprionodon terraenovae.

    PubMed

    Casper, B M; Mann, D A

    2009-12-01

    Field measurements of hearing thresholds were obtained from the Atlantic sharpnose shark Rhizoprionodon terraenovae using the auditory evoked potential (AEP) method. The fish had most sensitive hearing at 20 Hz, the lowest frequency tested, with decreasing sensitivity at higher frequencies. Hearing thresholds were lower than AEP thresholds previously measured for the nurse shark Ginglymostoma cirratum and yellow stingray Urobatis jamaicensis at frequencies <200 Hz, and similar at 200 Hz and above. Rhizoprionodon terraenovae represents the closest comparison, in terms of pelagic lifestyle, to the sharks that have been observed in acoustic field attraction experiments. The sound pressure levels that would be equivalent to the particle acceleration thresholds of R. terraenovae were much higher than the sound levels that attracted closely related sharks, suggesting a discrepancy between the hearing threshold experiments and the field attraction experiments.

  9. Continuous adjustment of threshold voltage in carbon nanotube field-effect transistors through gate engineering

    NASA Astrophysics Data System (ADS)

    Zhong, Donglai; Zhao, Chenyi; Liu, Lijun; Zhang, Zhiyong; Peng, Lian-Mao

    2018-04-01

    In this letter, we report a gate engineering method to adjust threshold voltage of carbon nanotube (CNT) based field-effect transistors (FETs) continuously in a wide range, which makes the application of CNT FETs especially in digital integrated circuits (ICs) easier. Top-gated FETs are fabricated using solution-processed CNT network films with stacking Pd and Sc films as gate electrodes. By decreasing the thickness of the lower layer metal (Pd) from 20 nm to zero, the effective work function of the gate decreases, thus tuning the threshold voltage (Vt) of CNT FETs from -1.0 V to 0.2 V. The continuous adjustment of threshold voltage through gate engineering lays a solid foundation for multi-threshold technology in CNT based ICs, which then can simultaneously provide high performance and low power circuit modules on one chip.

  10. Effects of visual erotic stimulation on vibrotactile detection thresholds in men.

    PubMed

    Jiao, Chuanshu; Knight, Peter K; Weerakoon, Patricia; Turman, A Bulent

    2007-12-01

    This study examined the effects of sexual arousal on vibration detection thresholds in the right index finger of 30 healthy, heterosexual males who reported no sexual dysfunction. Vibrotactile detection thresholds at frequencies of 30, 60, and 100 Hz were assessed before and after watching erotic and control videos using a forced-choice, staircase method. A mechanical stimulator was used to produce the vibratory stimulus. Results were analyzed using repeated measures analysis of variance. After watching the erotic video, the vibrotactile detection thresholds at 30, 60, and 100 Hz were significantly reduced (p < .01). No changes in thresholds were detected at any frequency following exposure to the non-erotic stimulus. The results show that sexual arousal resulted in an increase in vibrotactile sensitivity to low frequency stimuli in the index finger of sexually functional men.

  11. Study on the efficacy of ELA-Max (4% liposomal lidocaine) compared with EMLA cream (eutectic mixture of local anesthetics) using thermosensory threshold analysis in adult volunteers.

    PubMed

    Tang, M B Y; Goon, A T J; Goh, C L

    2004-04-01

    ELA-Max and EMLA cream are topical anesthetics that have been shown to have similar anesthetic efficacy in previous studies. To evaluate the analgesic efficacy of ELA-Max in comparison with EMLA cream using a novel method of thermosensory threshold analysis. A thermosensory analyzer was used to assess warmth- and heat-induced pain thresholds. No statistically significant difference was found in pain thresholds using either formulation. However, EMLA cream increased the heat-induced pain threshold to a greater extent than ELA-Max. Thermosensory measurement and analysis was well tolerated and no adverse events were encountered. EMLA cream may be superior to ELA-Max for heat-induced pain. This study suggests that thermosensory measurement may be another suitable tool for future topical anesthetic efficacy studies.

  12. Thresholds for the perception of whole-body linear sinusoidal motion in the horizontal plane

    NASA Technical Reports Server (NTRS)

    Mah, Robert W.; Young, Laurence R.; Steele, Charles R.; Schubert, Earl D.

    1989-01-01

    An improved linear sled has been developed to provide precise motion stimuli without generating perceptible extraneous motion cues (a noiseless environment). A modified adaptive forced-choice method was employed to determine perceptual thresholds to whole-body linear sinusoidal motion in 25 subjects. Thresholds for the detection of movement in the horizontal plane were found to be lower than those reported previously. At frequencies of 0.2 to 0.5 Hz, thresholds were shown to be independent of frequency, while at frequencies of 1.0 to 3.0 Hz, thresholds showed a decreasing sensitivity with increasing frequency, indicating that the perceptual process is not sensitive to the rate change of acceleration of the motion stimulus. The results suggest that the perception of motion behaves as an integrating accelerometer with a bandwidth of at least 3 Hz.

  13. Thresholding functional connectomes by means of mixture modeling.

    PubMed

    Bielczyk, Natalia Z; Walocha, Fabian; Ebel, Patrick W; Haak, Koen V; Llera, Alberto; Buitelaar, Jan K; Glennon, Jeffrey C; Beckmann, Christian F

    2018-05-01

    Functional connectivity has been shown to be a very promising tool for studying the large-scale functional architecture of the human brain. In network research in fMRI, functional connectivity is considered as a set of pair-wise interactions between the nodes of the network. These interactions are typically operationalized through the full or partial correlation between all pairs of regional time series. Estimating the structure of the latent underlying functional connectome from the set of pair-wise partial correlations, however, remains an open research problem. Typically, this thresholding problem is approached by proportional thresholding, or by means of parametric or non-parametric permutation testing across a cohort of subjects at each possible connection. As an alternative, we propose a data-driven thresholding approach for network matrices on the basis of mixture modeling. This approach allows for creating subject-specific sparse connectomes by modeling the full set of partial correlations as a mixture of low correlation values associated with weak or unreliable edges in the connectome and a sparse set of reliable connections. Consequently, we propose an alternative thresholding strategy based on the model fit, using pseudo-False Discovery Rates derived from the empirical null estimated as part of the mixture distribution. We evaluate the method on synthetic benchmark fMRI datasets where the underlying network structure is known, and demonstrate that it gives improved performance with respect to the alternative methods for thresholding connectomes, given the canonical thresholding levels. We also demonstrate that mixture modeling gives highly reproducible results when applied to the functional connectomes of the visual system derived from the n-back Working Memory task in the Human Connectome Project. 
The sparse connectomes obtained from mixture modeling are further discussed in the light of the previous knowledge of the functional architecture of the visual system in humans. We also demonstrate that with use of our method, we are able to extract similar information on the group level as can be achieved with permutation testing even though these two methods are not equivalent. We demonstrate that with both of these methods, we obtain functional decoupling between the two hemispheres in the higher order areas of the visual cortex during visual stimulation as compared to the resting state, which is in line with previous studies suggesting lateralization in the visual processing. However, as opposed to permutation testing, our approach does not require inference at the cohort level and can be used for creating sparse connectomes at the level of a single subject. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
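
    The mixture-based edge decision can be sketched with fixed two-component Gaussian parameters (illustrative values, not fitted; a real implementation would estimate them from the data, e.g. by EM):

    ```python
    import math

    def gauss(x, mu, sd):
        """Gaussian probability density."""
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

    def keep_edge(r, null=(0.0, 0.1), alt=(0.5, 0.15), w_null=0.9):
        """Keep a partial correlation r as a real edge when its posterior
        probability under the 'reliable connection' mixture component
        exceeds that of the empirical-null component."""
        p_null = w_null * gauss(r, *null)
        p_alt = (1.0 - w_null) * gauss(r, *alt)
        return p_alt / (p_null + p_alt) > 0.5
    ```

    With these parameters, a correlation near 0.5 is retained while one near 0.05 is absorbed into the null component, yielding a subject-specific sparse connectome without cohort-level inference.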

  14. CT image segmentation methods for bone used in medical additive manufacturing.

    PubMed

    van Eijnatten, Maureen; van Dijk, Roelof; Dobbe, Johannes; Streekstra, Geert; Koivisto, Juha; Wolff, Jan

    2018-01-01

    The accuracy of additive manufactured medical constructs is limited by errors introduced during image segmentation. The aim of this study was to review the existing literature on different image segmentation methods used in medical additive manufacturing. Thirty-two publications that reported on the accuracy of bone segmentation based on computed tomography images were identified using PubMed, ScienceDirect, Scopus, and Google Scholar. The advantages and disadvantages of the different segmentation methods used in these studies were evaluated and reported accuracies were compared. The spread between the reported accuracies was large (0.04 mm - 1.9 mm). Global thresholding was the most commonly used segmentation method with accuracies under 0.6 mm. The disadvantage of this method is the extensive manual post-processing required. Advanced thresholding methods could improve the accuracy to under 0.38 mm. However, such methods are currently not included in commercial software packages. Statistical shape model methods resulted in accuracies from 0.25 mm to 1.9 mm but are only suitable for anatomical structures with moderate anatomical variations. Thresholding remains the most widely used segmentation method in medical additive manufacturing. To improve the accuracy and reduce the costs of patient-specific additive manufactured constructs, more advanced segmentation methods are required. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
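
    Global thresholding itself is one comparison per voxel; a minimal sketch (the 300 HU cutoff is an illustrative value, not one taken from the reviewed studies):

    ```python
    def segment_bone(ct_slice, threshold_hu=300):
        """Global thresholding of a 2-D CT slice: voxels at or above the
        HU cutoff are labelled bone (1), all others background (0)."""
        return [[1 if hu >= threshold_hu else 0 for hu in row]
                for row in ct_slice]
    ```

    The extensive manual post-processing mentioned above arises because a single global cutoff inevitably mislabels thin cortical bone and metal-artifact regions.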

  15. How to select a proper early warning threshold to detect infectious disease outbreaks based on the China infectious disease automated alert and response system (CIDARS).

    PubMed

    Wang, Ruiping; Jiang, Yonggen; Michael, Engelgau; Zhao, Genming

    2017-06-12

    The Chinese Center for Disease Control and Prevention (China CDC) developed the China Infectious Disease Automated Alert and Response System (CIDARS) in 2005. The CIDARS is used to strengthen infectious disease surveillance and aid in the early warning of outbreaks, and it has been integrated into the routine outbreak monitoring efforts of CDCs at all levels in China. The early warning threshold is crucial for outbreak detection in the CIDARS, but CDCs at all levels currently use thresholds recommended by the China CDC, and these recommended thresholds have recognized limitations. Our study therefore seeks to explore an operational method for selecting the proper early warning threshold according to the epidemic features of local infectious diseases. The data used in this study were extracted from the web-based Nationwide Notifiable Infectious Diseases Reporting Information System (NIDRIS), and data for infectious disease cases were organized by calendar week (1-52) and year (2009-2015) in Excel format. Px was calculated using a percentile-based moving window (moving window [5 week*5 year], x), where x represents one of 12 centiles (0.40, 0.45, 0.50, ..., 0.95). Outbreak signals for the 12 Px were calculated using the moving percentile method (MPM) based on data from the CIDARS. When the outbreak signals generated by the 'mean + 2SD' gold standard were in line with the Px-generated outbreak signal for each week during 2014, that Px was defined as the proper threshold for the infectious disease. Finally, the performance of the newly selected thresholds for each infectious disease was evaluated using simulated outbreak signals based on 2015 data. Six infectious diseases were selected for this study (chickenpox, mumps, hand, foot and mouth disease (HFMD), scarlet fever, influenza and rubella). Proper thresholds for chickenpox (P75), mumps (P80), influenza (P75), rubella (P45), HFMD (P75), and scarlet fever (P80) were identified. 
The selected proper thresholds for these 6 infectious diseases could detect almost all simulated outbreaks within a shorter time period compared to thresholds recommended by the China CDC. It is beneficial to select the proper early warning threshold to detect infectious disease aberrations based on characteristics and epidemic features of local diseases in the CIDARS.
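
    The percentile threshold over the 5-week x 5-year window can be sketched as follows (nearest-rank percentile over the 25 historical counts; a simplified illustration of the moving percentile method, not the CIDARS implementation):

    ```python
    def percentile(values, x):
        """Nearest-rank x-th percentile (x in [0, 1]) of a list of counts."""
        s = sorted(values)
        return s[round(x * (len(s) - 1))]

    def outbreak_signal(current_count, window_counts, x=0.75):
        """Flag an early warning signal when the current weekly case count
        exceeds the Px of the 25 counts in the 5-week x 5-year window."""
        return current_count > percentile(window_counts, x)
    ```

    Lowering x (e.g. P45 for rubella) makes the system more sensitive; raising it (e.g. P80 for mumps) suppresses signals from routine fluctuation.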

  16. Obstacle or Opportunity? Digital Thresholds in Professional Development

    ERIC Educational Resources Information Center

    McGowan, Susannah

    2012-01-01

    Using the Threshold Concepts Framework, I explore places where faculty frequently get stuck when attempting to adopt new technologies. They may be held back by preconceptions that technology is superfluous to traditional teaching methods or believe that they must understand the technology perfectly before introducing it into their teaching.…

  17. Development of a plant based threshold for tarnished plant bug (Hemiptera: miridae) in cotton

    USDA-ARS?s Scientific Manuscript database

    The tarnished plant bug, Lygus lineolaris (Palisot de Beauvois), is the most important insect pest of cotton, Gossypium hirsutum L., in the midsouthern United States. It is almost exclusively controlled with foliar insecticide applications, and sampling methods and thresholds need to be revisited. ...

  18. Method for photon activation positron annihilation analysis

    DOEpatents

    Akers, Douglas W.

    2006-06-06

    A non-destructive testing method comprises providing a specimen having at least one positron emitter therein; determining a threshold energy for activating the positron emitter; and determining whether a half-life of the positron emitter is less than a selected half-life. If the half-life of the positron emitter is greater than or equal to the selected half-life, then activating the positron emitter by bombarding the specimen with photons having energies greater than the threshold energy and detecting gamma rays produced by annihilation of positrons in the specimen. If the half-life of the positron emitter is less than the selected half-life, then alternately activating the positron emitter by bombarding the specimen with photons having energies greater than the threshold energy and detecting gamma rays produced by positron annihilation within the specimen.

  19. Development of Image Segmentation Methods for Intracranial Aneurysms

    PubMed Central

    Qian, Yi; Morgan, Michael

    2013-01-01

    Though vascular segmentation provides vital means for the visualization, diagnosis, and quantification of decision-making processes in the treatment of vascular pathologies, it remains a process marred by numerous challenges. In this study, we validate eight aneurysm models via the use of two existing segmentation methods: the Region Growing Threshold and the Chan-Vese model. These methods were evaluated by comparing their results against a manually performed segmentation. Based upon this validation study, we propose a new Threshold-Based Level Set (TLS) method in order to overcome the existing problems. With divergent methods of segmentation, we discovered that the volumes of the aneurysm models reached a maximum difference of 24%. The local anatomical shapes of the arteries at the aneurysms were likewise found to significantly influence the results of these simulations. In contrast, the volume differences calculated via use of the TLS method remained relatively low, at only around 5%, thereby revealing the existence of inherent limitations in the application of cerebrovascular segmentation. The proposed TLS method holds the potential for utilisation in automatic aneurysm segmentation without the setting of a seed point or intensity threshold. This technique will further enable the segmentation of anatomically complex cerebrovascular shapes, thereby allowing for more accurate and efficient simulations of medical imagery. PMID:23606905

  20. An adaptive design for updating the threshold value of a continuous biomarker

    PubMed Central

    Spencer, Amy V.; Harbron, Chris; Mander, Adrian; Wason, James; Peers, Ian

    2017-01-01

    Potential predictive biomarkers are often measured on a continuous scale, but in practice, a threshold value to divide the patient population into biomarker ‘positive’ and ‘negative’ is desirable. Early phase clinical trials are increasingly using biomarkers for patient selection, but at this stage, it is likely that little will be known about the relationship between the biomarker and the treatment outcome. We describe a single-arm trial design with adaptive enrichment, which can increase power to demonstrate efficacy within a patient subpopulation, the parameters of which are also estimated. Our design enables us to learn about the biomarker and optimally adjust the threshold during the study, using a combination of generalised linear modelling and Bayesian prediction. At the final analysis, a binomial exact test is carried out, allowing the hypothesis that ‘no population subset exists in which the novel treatment has a desirable response rate’ to be tested. Through extensive simulations, we are able to show increased power over fixed threshold methods in many situations without increasing the type-I error rate. We also show that estimates of the threshold, which defines the population subset, are unbiased and often more precise than those from fixed threshold studies. We provide an example of the method applied (retrospectively) to publicly available data from a study of the use of tamoxifen after mastectomy by the German Breast Study Group, where progesterone receptor is the biomarker of interest. PMID:27417407

  1. Trapping volumetric measurement by multidetector CT in chronic obstructive pulmonary disease: Effect of CT threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xiaohua; Yuan, Huishu; Duan, Jianghui

    2013-08-15

    Purpose: The purpose of this study was to evaluate the effect of various computed tomography (CT) thresholds on trapping volumetric measurements by multidetector CT in chronic obstructive pulmonary disease (COPD). Methods: Twenty-three COPD patients were scanned with a 64-slice CT scanner in both the inspiratory and expiratory phase. CT thresholds of −950 Hu in inspiration and −950 to −890 Hu in expiration were used, after which trapping volumetric measurements were made using computer software. Trapping volume percentage (Vtrap%) under the different CT thresholds in the expiratory phase and below −950 Hu in the inspiratory phase was compared and correlated with lung function. Results: Mean Vtrap% was similar under −930 Hu in the expiratory phase and below −950 Hu in the inspiratory phase, being 13.18 ± 9.66 and 13.95 ± 6.72 (both lungs), respectively; this difference was not significant (P = 0.240). Vtrap% under −950 Hu in the inspiratory phase and below the −950 to −890 Hu threshold in the expiratory phase was moderately negatively correlated with the ratio of forced expiratory volume in one second to forced vital capacity and the measured value of forced expiratory volume in one second as a percentage of the predicted value. Conclusions: Trapping volumetric measurement with multidetector CT is a promising method for the quantification of COPD. It is important to know the effect of various CT thresholds on trapping volumetric measurements.
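
    The Vtrap% measurement reduces to counting lung voxels whose attenuation falls below the chosen CT threshold; a minimal sketch (the voxel list stands in for a segmented lung region):

    ```python
    def vtrap_percent(lung_hu, threshold=-930):
        """Trapping volume percentage: share of lung voxels (HU values)
        below the expiratory CT threshold, as a percentage."""
        trapped = sum(1 for hu in lung_hu if hu < threshold)
        return 100.0 * trapped / len(lung_hu)
    ```

    Shifting the threshold between -950 and -890 Hu directly changes which voxels count as trapped, which is why the choice of cutoff matters for quantification.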

  2. Identifying Thresholds for Ecosystem-Based Management

    PubMed Central

    Samhouri, Jameal F.; Levin, Phillip S.; Ainsworth, Cameron H.

    2010-01-01

    Background One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. Methodology/Principal Findings To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. Conclusions/Significance For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management. PMID:20126647

  3. Do Optimal Prognostic Thresholds in Continuous Physiological Variables Really Exist? Analysis of Origin of Apparent Thresholds, with Systematic Review for Peak Oxygen Consumption, Ejection Fraction and BNP

    PubMed Central

    Leong, Tora; Rehman, Michaela B.; Pastormerlo, Luigi Emilio; Harrell, Frank E.; Coats, Andrew J. S.; Francis, Darrel P.

    2014-01-01

    Background Clinicians are sometimes advised to make decisions using thresholds in measured variables, derived from prognostic studies. Objectives We studied why there are conflicting apparently-optimal prognostic thresholds, for example in exercise peak oxygen uptake (pVO2), ejection fraction (EF), and Brain Natriuretic Peptide (BNP) in heart failure (HF). Data Sources and Eligibility Criteria Studies testing pVO2, EF or BNP prognostic thresholds in heart failure, published between 1990 and 2010, listed on Pubmed. Methods First, we examined studies testing pVO2, EF or BNP prognostic thresholds. Second, we created repeated simulations of 1500 patients to identify whether an apparently-optimal prognostic threshold indicates step change in risk. Results 33 studies (8946 patients) tested a pVO2 threshold. 18 found it prognostically significant: the actual reported threshold ranged widely (10–18 ml/kg/min) but was overwhelmingly controlled by the individual study population's mean pVO2 (r = 0.86, p<0.00001). In contrast, the 15 negative publications were testing thresholds 199% further from their means (p = 0.0001). Likewise, of 35 EF studies (10220 patients), the thresholds in the 22 positive reports were strongly determined by study means (r = 0.90, p<0.0001). Similarly, in the 19 positives of 20 BNP studies (9725 patients): r = 0.86 (p<0.0001). Second, survival simulations always discovered a “most significant” threshold, even when there was definitely no step change in mortality. With linear increase in risk, the apparently-optimal threshold was always near the sample mean (r = 0.99, p<0.001). Limitations This study cannot report the best threshold for any of these variables; instead it explains how common clinical research procedures routinely produce false thresholds. Key Findings First, shifting (and/or disappearance) of an apparently-optimal prognostic threshold is strongly determined by studies' average pVO2, EF or BNP. 
Second, apparently-optimal thresholds always appear, even with no step in prognosis. Conclusions Emphatic therapeutic guidance based on thresholds from observational studies may be ill-founded. We should not assume that optimal thresholds, or any thresholds, exist. PMID:24475020

  4. Threshold resummation of soft gluons in hadronic reactions - an introduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berger, E. L.

    The author discusses the motivation for resummation of the effects of initial-state soft-gluon radiation, to all orders in the strong coupling strength, for processes in which the near-threshold region in the partonic subenergy is important. He summarizes the method of perturbative resummation and its application to the calculation of the total cross section for top quark production at hadron colliders. Comments are included on the differences between the treatment of subleading logarithmic terms in this method and in other approaches.

  5. Large signal-to-noise ratio quantification in MLE for ARARMAX models

    NASA Astrophysics Data System (ADS)

    Zou, Yiqun; Tang, Xiafei

    2014-06-01

    It has been shown that closed-loop linear system identification by the indirect method can generally be transformed into open-loop ARARMAX (AutoRegressive AutoRegressive Moving Average with eXogenous input) estimation. For such models, gradient-related optimisation with a large enough signal-to-noise ratio (SNR) can avoid potential local convergence in maximum likelihood estimation. To ease the application of this condition, the threshold SNR needs to be quantified. In this paper, we construct the amplitude coefficient, which is equivalent to the SNR, and prove the finiteness of the threshold amplitude coefficient within the stability region. The quantification of the threshold is achieved by minimisation of an elaborately designed multi-variable cost function that unifies all the restrictions on the amplitude coefficient. The corresponding algorithm, based on two sets of physically realisable system input-output data, details the minimisation and also shows how to use the gradient-related method to estimate ARARMAX parameters when a local minimum is present because the SNR is small. The algorithm is then tested on a theoretical AutoRegressive Moving Average with eXogenous input model for derivation of the threshold, and on a real gas turbine engine system for model identification. Finally, graphical validation of the threshold on a two-dimensional plot is discussed.

  6. Building rainfall thresholds for large-scale landslides by extracting occurrence times of landslides from seismic records

    NASA Astrophysics Data System (ADS)

    Yen, Hsin-Yi; Lin, Guan-Wei

    2017-04-01

    Understanding the rainfall conditions that trigger mass movement on hillslopes is the key to forecasting rainfall-induced slope hazards, and the exact time of landslide occurrence is one of the basic inputs for rainfall statistics. In this study, we focused on large-scale landslides (LSLs) with disturbed areas larger than 10 ha and conducted a series of analyses, including the recognition of landslide-induced ground motions and the evaluation of different forms of rainfall thresholds. More than 10 heavy typhoons during the period 2005-2014 in Taiwan triggered hundreds of LSLs and provided the opportunity to characterize the rainfall conditions that trigger LSLs. A total of 101 landslide-induced seismic signals were identified from the records of the Taiwan seismic network. These signals revealed the occurrence times of the landslides, allowing the associated rainfall conditions to be assessed. Rainfall analyses showed that LSLs occurred when cumulative rainfall exceeded 500 mm. The results of the rainfall-threshold analyses revealed that it is difficult to distinguish LSLs from small-scale landslides (SSLs) by the I-D and R-D methods, but the I-R method can achieve this discrimination. In addition, an enhanced three-factor threshold incorporating deep water content was proposed as the rainfall threshold for LSLs.
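    The simplest of the criteria above, the 500 mm cumulative-rainfall condition, can be sketched as a running-total check; the storm series and helper name below are hypothetical:

```python
# Sketch: flag potential large-scale landslide (LSL) conditions using the
# cumulative-rainfall criterion reported above (>= 500 mm). The hourly
# series and function name are illustrative, not from the study.

def exceeds_lsl_threshold(hourly_rain_mm, threshold_mm=500.0):
    """Return True once cumulative rainfall reaches the LSL threshold."""
    total = 0.0
    for r in hourly_rain_mm:
        total += r
        if total >= threshold_mm:
            return True
    return False

storm_a = [20.0] * 30   # 600 mm total: exceeds the threshold
storm_b = [10.0] * 30   # 300 mm total: does not
print(exceeds_lsl_threshold(storm_a), exceeds_lsl_threshold(storm_b))
```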

  7. An innovative iterative thresholding algorithm for tumour segmentation and volumetric quantification on SPECT images: Monte Carlo-based methodology and validation.

    PubMed

    Pacilio, M; Basile, C; Shcherbinin, S; Caselli, F; Ventroni, G; Aragno, D; Mango, L; Santini, E

    2011-06-01

    Positron emission tomography (PET) and single-photon emission computed tomography (SPECT) imaging play an important role in the segmentation of functioning parts of organs or tumours, but an accurate and reproducible delineation is still a challenging task. In this work, an innovative iterative thresholding method for tumour segmentation has been proposed and implemented for a SPECT system. This method, which is based on experimental threshold-volume calibrations, also incorporates the recovery coefficients (RC) of the imaging system, and has therefore been called the recovering iterative thresholding method (RIThM). The possibility of employing Monte Carlo (MC) simulations for system calibration was also investigated. The RIThM is an iterative algorithm coded in MATLAB: after an initial rough estimate of the volume of interest, the following calculations are repeated: (i) the corresponding source-to-background ratio (SBR) is measured and corrected by means of the RC curve; (ii) the threshold corresponding to the amended SBR value and the volume estimate is then found using threshold-volume data; (iii) a new volume estimate is obtained by image thresholding. The process is repeated until convergence. The RIThM was implemented for an Infinia Hawkeye 4 (GE Healthcare) SPECT/CT system, using a Jaszczak phantom and several test objects. Two MC codes were tested to simulate the calibration images: SIMIND and SimSet. For validation, test images consisting of hot spheres and some anatomical structures of the Zubal head phantom were simulated with the SIMIND code. Additional test objects (flasks and vials) were also imaged experimentally. Finally, the RIThM was applied to evaluate three cases of brain metastases and two cases of high-grade gliomas. Comparing experimental thresholds and those obtained by MC simulations, a maximum difference of about 4% was found, within the errors (+/- 2% and +/- 5% for volumes >= 5 ml and < 5 ml, respectively). 
Also for the RC data, the comparison showed differences (up to 8%) within the assigned error (+/- 6%). ANOVA test demonstrated that the calibration results (in terms of thresholds or RCs at various volumes) obtained by MC simulations were indistinguishable from those obtained experimentally. The accuracy in volume determination for the simulated hot spheres was between -9% and 15% in the range 4-270 ml, whereas for volumes less than 4 ml (in the range 1-3 ml) the difference increased abruptly, reaching values greater than 100%. For the Zubal head phantom, errors ranged between 9% and 18%. For the experimental test images, the accuracy level was within +/- 10% for volumes in the range 20-110 ml. The preliminary test of application on patients demonstrated the suitability of the method in a clinical setting. The MC-guided delineation of tumour volume may reduce the acquisition time required for the experimental calibration. Analysis of images of several simulated and experimental test objects, the Zubal head phantom, and clinical cases demonstrated the robustness, suitability, accuracy, and speed of the proposed method. Nevertheless, studies concerning tumours of irregular shape and/or nonuniform distribution of the background activity are still in progress.
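    The iterative loop in steps (i)-(iii) can be sketched as follows; the calibration curves, RC model, and toy "measurement" functions are hypothetical stand-ins for the system-specific calibration data used by the RIThM:

```python
# Minimal sketch of the RIThM iteration, under hypothetical calibration data.

def recovery_coefficient(volume_ml):
    # Hypothetical RC curve: partial-volume effects suppress small objects.
    return volume_ml / (volume_ml + 2.0)

def calib_threshold(sbr, volume_ml):
    # Hypothetical threshold-volume calibration surface.
    return min(0.8, 0.3 + 0.02 * volume_ml + 0.05 * sbr)

def rithm(measure_volume, measure_sbr, v0, tol=0.01, max_iter=50):
    """Iterate threshold -> volume until the volume estimate converges."""
    v = v0
    for _ in range(max_iter):
        sbr = measure_sbr(v) / recovery_coefficient(v)  # (i) RC-corrected SBR
        t = calib_threshold(sbr, v)                     # (ii) look up threshold
        v_new = measure_volume(t)                       # (iii) re-threshold image
        if abs(v_new - v) < tol:
            break
        v = v_new
    return v_new

# Toy "measurements": volume shrinks linearly as the threshold rises,
# and the raw SBR read-out is held constant for simplicity.
measured = rithm(measure_volume=lambda t: 20.0 * (1.0 - t),
                 measure_sbr=lambda v: 5.0,
                 v0=10.0)
```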

  8. Automated segmentation of linear time-frequency representations of marine-mammal sounds.

    PubMed

    Dadouchi, Florian; Gervaise, Cedric; Ioana, Cornel; Huillery, Julien; Mars, Jérôme I

    2013-09-01

    Many marine mammals produce highly nonlinear frequency modulations. Determining the time-frequency support of these sounds offers various applications, which include recognition, localization, and density estimation. This study introduces a low-parameterized, automated spectrogram segmentation method that is based on a theoretical probabilistic framework. In the first step, the background noise in the spectrogram is fitted with a Chi-squared distribution and thresholded using a Neyman-Pearson approach. In the second step, the number of false detections in time-frequency regions is modeled as a binomial distribution, and then through a Neyman-Pearson strategy, the time-frequency bins are gathered into regions of interest. The proposed method is validated on real data of large sequences of whistles from common dolphins, collected in the Bay of Biscay (France). The proposed method is also compared with two alternative approaches: the first is smoothing and thresholding of the spectrogram; the second is thresholding of the spectrogram followed by the use of morphological operators to gather the time-frequency bins and to remove false positives. This method is shown to increase the probability of detection for the same probability of false alarms.
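    The first stage (chi-squared background fit plus a Neyman-Pearson threshold) can be sketched on synthetic noise; for a two-degree-of-freedom spectrogram bin the chi-squared reduces to an exponential, and all numbers below are illustrative:

```python
import numpy as np

# First-stage detector sketch: under Gaussian background noise, spectrogram
# bins follow a scaled chi-squared (exponential for 2 degrees of freedom),
# so a Neyman-Pearson threshold at false-alarm probability p_fa is the
# inverse survival function of the fitted distribution.
rng = np.random.default_rng(0)

noise_power = 2.0
spec = rng.exponential(noise_power, size=(256, 256))  # synthetic noise-only spectrogram

p_fa = 0.01
sigma_hat = spec.mean()                 # fit the exponential scale to the background
threshold = -sigma_hat * np.log(p_fa)   # P(X > t) = exp(-t / sigma) = p_fa

detections = spec > threshold
print(detections.mean())  # empirical false-alarm rate, close to p_fa
```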

  9. Method of Improved Fuzzy Contrast Combined Adaptive Threshold in NSCT for Medical Image Enhancement

    PubMed Central

    Yang, Jie; Kasabov, Nikola

    2017-01-01

    Noises and artifacts are introduced to medical images due to acquisition techniques and systems. This interference leads to low contrast and distortion in images, which not only impacts the effectiveness of the medical image but also seriously affects the clinical diagnoses. This paper proposes an algorithm for medical image enhancement based on the nonsubsampled contourlet transform (NSCT), which combines adaptive threshold and an improved fuzzy set. First, the original image is decomposed into the NSCT domain with a low-frequency subband and several high-frequency subbands. Then, a linear transformation is adopted for the coefficients of the low-frequency component. An adaptive threshold method is used for the removal of high-frequency image noise. Finally, the improved fuzzy set is used to enhance the global contrast and the Laplace operator is used to enhance the details of the medical images. Experiments and simulation results show that the proposed method is superior to existing methods of image noise removal, improves the contrast of the image significantly, and obtains a better visual effect. PMID:28744464
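    The contrast-enhancement idea can be illustrated with the classic fuzzy intensification (INT) operator; this is a simplified stand-in, not the paper's improved fuzzy set, and the toy image is invented:

```python
import numpy as np

# Simplified fuzzy contrast enhancement: map gray levels to [0, 1]
# memberships, push memberships away from 0.5 with the INT operator,
# then map back to the original gray range.

def fuzzy_contrast(img):
    g_min, g_max = float(img.min()), float(img.max())
    mu = (img - g_min) / (g_max - g_min)                        # fuzzification
    hi = mu >= 0.5
    mu_int = np.where(hi, 1 - 2 * (1 - mu) ** 2, 2 * mu ** 2)  # INT operator
    return g_min + mu_int * (g_max - g_min)                    # defuzzification

img = np.array([[10.0, 60.0, 110.0], [160.0, 210.0, 250.0]])
out = fuzzy_contrast(img)  # dark pixels get darker, bright pixels brighter
```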

  10. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    NASA Astrophysics Data System (ADS)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviations in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively unexplored research topic.
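    The core of the SIGMA criterion, flagging rainfall that exceeds the series mean by a multiple of the standard deviation, can be sketched as follows; the rainfall series and the 2-sigma multiple are illustrative, not values from the study:

```python
import statistics

# Sketch of the SIGMA idea: a rainfall accumulation is "extraordinary" when
# it exceeds the historical mean by k standard deviations. The operational
# model tunes the multiple per duration and warning level; k=2 is arbitrary.

def sigma_threshold(series, k=2.0):
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    return mu + k * sigma

historic = [12, 18, 25, 9, 30, 22, 15, 11, 27, 20]  # e.g. 3-day rainfall totals, mm
t = sigma_threshold(historic)
is_extraordinary = 75 > t  # a 75 mm accumulation would trip the threshold
```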

  11. Thresholds for activation of rabbit retinal ganglion cells with an ultrafine, extracellular microelectrode.

    PubMed

    Jensen, Ralph J; Rizzo, Joseph F; Ziv, Ofer R; Grumet, Andrew; Wyatt, John

    2003-08-01

    To determine electrical thresholds required for extracellular activation of retinal ganglion cells as part of a project to develop an epiretinal prosthesis. Retinal ganglion cells were recorded extracellularly in retinas isolated from adult New Zealand White rabbits. Electrical current pulses of 100-µs duration were delivered to the inner surface of the retina from a 5-µm-long electrode. In about half of the cells, the point of lowest threshold was found by searching with anodal current pulses; in the other cells, cathodal current pulses were used. Threshold measurements were obtained near the cell bodies of 20 ganglion cells and near the axons of 19 ganglion cells. Both cathodal and anodal stimuli evoked a neural response in the ganglion cells that consisted of a single action potential of near-constant latency that persisted when retinal synaptic transmission was blocked with cadmium chloride. For cell bodies, but not axons, thresholds for both cathodal and anodal stimulation were dependent on the search method used to find the point of lowest threshold. With search and stimulation of matching polarity, cathodal stimuli evoked a ganglion cell response at lower currents (approximately one seventh to one tenth of the anodal threshold) than did anodal stimuli for both cell bodies and axons. With cathodal search and stimulation, cell body median thresholds were somewhat lower (approximately one half) than the axonal median thresholds. With anodal search and stimulation, cell body median thresholds were approximately the same as the axonal median thresholds. The results suggest that cathodal stimulation should produce lower thresholds, more localized stimulation, and somewhat better selectivity for cell bodies over axons than would anodal stimulation.

  12. Objectivity and validity of EMG method in estimating anaerobic threshold.

    PubMed

    Kang, S-K; Kim, J; Kwon, M; Eom, H

    2014-08-01

    The purposes of this study were to verify and compare the performance of anaerobic threshold (AT) point estimates among different filtering intervals (9, 15, 20, 25, 30 s) and to investigate the interrelationships of AT point estimates obtained by the ventilatory threshold (VT) and muscle fatigue thresholds using electromyographic (EMG) activity during incremental exercise on a cycle ergometer. 69 untrained male university students who nevertheless exercised regularly volunteered to participate in this study. The incremental exercise protocol was applied with a consistent stepwise increase in power output of 20 watts per minute until exhaustion. The AT point was also estimated in the same manner using the V-slope program with gas-exchange parameters. In general, the estimated values of AT point-time computed by the EMG method were more consistent across the 5 filtering intervals and demonstrated higher correlations among themselves when compared with the values obtained by the VT method. The results of the present study suggest that EMG signals could be used as an alternative or a new option for estimating the AT point. Also, the proposed computing procedure implemented in Matlab for the analysis of EMG signals appeared to be valid and reliable, as it produced nearly identical values and high correlations with VT estimates. © Georg Thieme Verlag KG Stuttgart · New York.

  13. Non-invasive indices for the estimation of the anaerobic threshold of oarsmen.

    PubMed

    Erdogan, A; Cetin, C; Karatosun, H; Baydar, M L

    2010-01-01

    This study compared four common non-invasive indices with an invasive index for determining the anaerobic threshold (AT) in 22 adult male rowers using a Concept2 rowing ergometer. A criterion-standard progressive incremental test (invasive method) measured blood lactate concentrations to determine the 4 mmol/l threshold (La4-AT) and Dmax AT (Dm-AT). This was compared with three indices obtained by analysis of respiratory gases and one based on the heart rate (HR) deflection point (HRDP), all of which used the Conconi test (non-invasive methods). In the Conconi test, the HRDP was determined whilst continuously increasing the power output (PO) by 25 W/min and measuring respiratory gases and HR. The La4-AT and Dm-AT values differed slightly with respect to oxygen uptake, PO, and HR; however, the AT values significantly correlated with each other and with the four non-invasive methods. In conclusion, the non-invasive indices were comparable with the invasive index and could, therefore, be used in the assessment of AT during rowing ergometer use. In this population of elite rowers, the Conconi threshold (Con-AT), based on the measurement of the HRDP, tended to be the most adequate way of estimating AT for training regulation purposes.

  14. Gap Detection and Temporal Modulation Transfer Function as Behavioral Estimates of Auditory Temporal Acuity Using Band-Limited Stimuli in Young and Older Adults

    PubMed Central

    Shen, Yi

    2015-01-01

    Purpose Gap detection and the temporal modulation transfer function (TMTF) are 2 common methods to obtain behavioral estimates of auditory temporal acuity. However, the agreement between the 2 measures is not clear. This study compares results from these 2 methods and their dependencies on listener age and hearing status. Method Gap detection thresholds and the parameters that describe the TMTF (sensitivity and cutoff frequency) were estimated for young and older listeners who were naive to the experimental tasks. Stimuli were 800-Hz-wide noises with upper frequency limits of 2400 Hz, presented at 85 dB SPL. A 2-track procedure (Shen & Richards, 2013) was used for the efficient estimation of the TMTF. Results No significant correlation was found between gap detection threshold and the sensitivity or the cutoff frequency of the TMTF. No significant effect of age and hearing loss on either the gap detection threshold or the TMTF cutoff frequency was found, while the TMTF sensitivity improved with increasing hearing threshold and worsened with increasing age. Conclusion Estimates of temporal acuity using gap detection and TMTF paradigms do not seem to provide a consistent description of the effects of listener age and hearing status on temporal envelope processing. PMID:25087722

  15. Methodological issues when comparing hearing thresholds of a group with population standards: the case of the ferry engineers.

    PubMed

    Dobie, Robert A

    2006-10-01

    To discuss appropriate and inappropriate methods for comparing distributions of hearing thresholds of a study group with distributions in population standards and to determine whether the thresholds of Washington State Ferries engineers are different from those of men in the general population, using both frequency-by-frequency comparisons and analysis of audiometric shape. The most recent hearing conservation program audiograms of 321 noise-exposed engineers, ages 35 to 64, were compared with the predictions of Annexes A, B, and C from ANSI S3.44. There was no screening by history or otoscopy; all audiograms were included. 95% confidence intervals (95% CIs) were calculated for the engineers' median thresholds for each ear, for the better ear (defined two ways), and for the binaural average. For Annex B, where 95% CIs are also available, it was possible to calculate z scores for the differences between Annex B and the engineers' better ears. Bulge depth, an audiometric shape statistic, measured curvature between 1 and 6 kHz. Engineers' better-ear median thresholds were worse than those in Annex A but (except at 1 kHz) were as good as or better than those in Annexes B and C, which are more appropriate for comparison to an unscreened noise-exposed group like the engineers. Average bulge depth for the engineers was similar to that of the Annex B standard (no added occupational noise) and was much less than that of audiograms created by using the standard with added occupational noise between 90 and 100 dBA. Audiograms from groups selected for a particular exposure, but without regard to its severity, can appropriately be compared with population standards if certain pitfalls are avoided. 
For unscreened study groups with large age-sex subgroups, a simple method to assess statistical significance, taking into consideration uncertainties in both the study group and the comparison standard, is the calculation of z scores for the proportion of better-ear thresholds above the Annex B median. A less powerful method combines small age-sex subgroups after age correction. Small threshold differences, even if statistically significant, may not be due to genuine differences in hearing sensitivity between study group and standard. Audiometric shape analysis offers an independent dimension of comparison between the study group and audiograms predicted from the ANSI S3.44 standard, with and without occupational noise exposure. Important pitfalls in comparison to population standards include nonrandom selection of study groups, inappropriate choice of population standard, use of the right and left ear thresholds instead of the better-ear threshold for comparison to Annex B, and comparing means with medians. The thresholds of the engineers in this study were similar to published standards for an unscreened population.

  16. Near-Infrared Spectrum Detection of Wheat Gluten Protein Content Based on a Combined Filtering Method.

    PubMed

    Cai, Jian-Hua

    2017-09-01

    To eliminate the random error of the derivative near-IR (NIR) spectrum and to improve model stability and the prediction accuracy of the gluten protein content, a combined method is proposed for pretreatment of the NIR spectrum based on both empirical mode decomposition and the wavelet soft-threshold method. The principle and the steps of the method are introduced and the denoising effect is evaluated. The wheat gluten protein content is calculated based on the denoised spectrum, and the results are compared with those of the nine-point smoothing method and the wavelet soft-threshold method. Experimental results show that the proposed combined method is effective in completing pretreatment of the NIR spectrum, and the proposed method improves the accuracy of detection of wheat gluten protein content from the NIR spectrum.
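    The wavelet soft-threshold step at the heart of the combined method can be sketched as follows; the empirical-mode-decomposition stage is omitted, and the universal threshold rule shown is a common choice rather than the paper's exact setting:

```python
import numpy as np

# Core of wavelet soft-threshold denoising: shrink detail coefficients
# toward zero by the threshold and zero out anything below it.

def soft_threshold(coeffs, t):
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# Universal threshold (Donoho-Johnstone): sigma * sqrt(2 * ln(n)), with the
# noise level sigma estimated from the median absolute deviation.
def universal_threshold(detail):
    sigma = np.median(np.abs(detail)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(detail.size))

d = np.array([0.1, -0.05, 2.0, -1.5, 0.02])  # toy detail coefficients
t = universal_threshold(d)
denoised = soft_threshold(d, t)  # small coefficients vanish, large ones shrink
```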

  17. A method for managing re-identification risk from small geographic areas in Canada

    PubMed Central

    2010-01-01

    Background A common disclosure control practice for health datasets is to identify small geographic areas and either suppress records from these small areas or aggregate them into larger ones. A recent study provided a method for deciding when an area is too small based on the uniqueness criterion. The uniqueness criterion stipulates that an area is no longer too small when the proportion of individuals who are unique on the relevant variables (the quasi-identifiers) approaches zero. However, a uniqueness value of zero is quite a stringent threshold, and is only suitable when the risks from data disclosure are quite high. Other uniqueness thresholds that have been proposed for health data are 5% and 20%. Methods We estimated uniqueness for urban Forward Sortation Areas (FSAs) by using the 2001 long-form Canadian census data representing 20% of the population. We then constructed two logistic regression models to predict when the uniqueness is greater than the 5% and 20% thresholds, and validated their predictive accuracy using 10-fold cross-validation. Predictor variables included the population size of the FSA and the maximum number of possible values on the quasi-identifiers (the number of equivalence classes). Results All model parameters were significant and the models had very high prediction accuracy, with specificity above 0.9, and sensitivity at 0.87 and 0.74 for the 5% and 20% threshold models, respectively. The application of the models was illustrated with an analysis of the Ontario newborn registry and an emergency department dataset. At the higher thresholds, considerably fewer records compared to the 0% threshold would be considered to be in small areas and therefore undergo disclosure control actions. 
We have also included concrete guidance for data custodians in deciding which one of the three uniqueness thresholds to use (0%, 5%, 20%), depending on the mitigating controls that the data recipients have in place, the potential invasion of privacy if the data is disclosed, and the motives and capacity of the data recipient to re-identify the data. Conclusion The models we developed can be used to manage the re-identification risk from small geographic areas. Being able to choose among three possible thresholds, a data custodian can adjust the definition of "small geographic area" to the nature of the data and recipient. PMID:20361870
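    The uniqueness criterion underlying all three thresholds can be sketched directly; the records below are synthetic and the quasi-identifiers are illustrative:

```python
from collections import Counter

# Sketch of the uniqueness criterion: the share of records that are unique
# on their quasi-identifier values. A dataset "fails" a threshold when its
# uniqueness exceeds that threshold.

def uniqueness(records, quasi_identifiers):
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    counts = Counter(keys)
    unique = sum(1 for k in keys if counts[k] == 1)
    return unique / len(records)

records = [
    {"fsa": "K1A", "age": 34, "sex": "F"},
    {"fsa": "K1A", "age": 34, "sex": "F"},
    {"fsa": "K1A", "age": 61, "sex": "M"},
    {"fsa": "M5V", "age": 29, "sex": "F"},
]
u = uniqueness(records, ["fsa", "age", "sex"])  # 2 of 4 records are unique
too_small = u > 0.05   # exceeds the 5% uniqueness threshold
```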

  18. A geographic analysis of population density thresholds in the influenza pandemic of 1918–19

    PubMed Central

    2013-01-01

    Background Geographic variables play an important role in the study of epidemics. The role of one such variable, population density, in the spread of influenza is controversial. Prior studies have tested for such a role using arbitrary thresholds for population density above or below which places are hypothesized to have higher or lower mortality. The results of such studies are mixed. The objective of this study is to estimate, rather than assume, a threshold level of population density that separates low-density regions from high-density regions on the basis of population loss during an influenza pandemic. We study the case of the influenza pandemic of 1918–19 in India, where over 15 million people died in the short span of less than one year. Methods Using data from six censuses for 199 districts of India (n=1194), the country with the largest number of deaths from the influenza of 1918–19, we use a sample-splitting method embedded within a population growth model that explicitly quantifies population loss from the pandemic to estimate a threshold level of population density that separates low-density districts from high-density districts. Results The results demonstrate a threshold level of population density of 175 people per square mile. A concurrent finding is that districts on the low side of the threshold experienced rates of population loss (3.72%) that were lower than districts on the high side of the threshold (4.69%). Conclusions This paper introduces a useful analytic tool to the health geographic literature. It illustrates an application of the tool to demonstrate that it can be useful for pandemic awareness and preparedness efforts. Specifically, it estimates a level of population density above which policies to socially distance, redistribute or quarantine populations are likely to be more effective than they are for areas with population densities that lie below the threshold. PMID:23425498
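    The sample-splitting idea can be illustrated in a stripped-down form: grid-search the density threshold that minimizes the within-group sum of squared errors of population loss. The data points below are synthetic, and the actual study embeds the split in a population growth model rather than using raw loss rates:

```python
# Toy sample-splitting: for each candidate split of districts sorted by
# population density, compute the within-group SSE of population loss and
# keep the split (midpoint between neighbors) that minimizes it.

def best_split(density, loss):
    pairs = sorted(zip(density, loss))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        lo = [l for _, l in pairs[:i]]
        hi = [l for _, l in pairs[i:]]
        sse = sum((x - sum(lo) / len(lo)) ** 2 for x in lo) \
            + sum((x - sum(hi) / len(hi)) ** 2 for x in hi)
        if sse < best[0]:
            best = (sse, (pairs[i - 1][0] + pairs[i][0]) / 2)
    return best[1]

density = [90, 120, 150, 160, 200, 240, 300]   # people per square mile (synthetic)
loss = [3.6, 3.8, 3.7, 4.6, 4.7, 4.8, 4.6]     # % population loss (synthetic)
threshold = best_split(density, loss)          # splits between 150 and 160
```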

  19. A new method for automated discontinuity trace mapping on rock mass 3D surface model

    NASA Astrophysics Data System (ADS)

    Li, Xiaojun; Chen, Jianqin; Zhu, Hehua

    2016-04-01

    This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.
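    The angle and distance tests used in the trace segment connection step can be sketched as follows; the geometry helpers and example coordinates are illustrative, with the thresholds drawn from the ranges reported above:

```python
import math

# Sketch of the segment-connection test implied by the method: join two
# trace segments when the angle between their direction vectors is below
# the angle threshold and the endpoint gap is below a multiple of the mean
# triangular mesh element size.

def angle_deg(v1, v2):
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def can_connect(end_a, start_b, dir_a, dir_b,
                angle_thresh=60.0, mesh_size=0.055, dist_factor=15):
    gap = math.dist(end_a, start_b)
    return (angle_deg(dir_a, dir_b) <= angle_thresh
            and gap <= dist_factor * mesh_size)

# Nearly collinear segments with a small gap connect; distant ones do not.
ok = can_connect((1.0, 0.0, 0.0), (1.3, 0.05, 0.0),
                 (1.0, 0.0, 0.0), (0.9, 0.1, 0.0))
```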

  20. Stroke-model-based character extraction from gray-level document images.

    PubMed

    Ye, X; Cheriet, M; Suen, C Y

    2001-01-01

    Global gray-level thresholding techniques such as Otsu's method, and local gray-level thresholding techniques such as edge-based segmentation or the adaptive thresholding method are powerful in extracting character objects from simple or slowly varying backgrounds. However, they are found to be insufficient when the backgrounds include sharply varying contours or fonts in different sizes. A stroke-model is proposed to depict the local features of character objects as double-edges in a predefined size. This model enables us to detect thin connected components selectively, while ignoring relatively large backgrounds that appear complex. Meanwhile, since the stroke width restriction is fully factored in, the proposed technique can be used to extract characters in predefined font sizes. To process large volumes of documents efficiently, a hybrid method is proposed for character extraction from various backgrounds. Using the measurement of class separability to differentiate images with simple backgrounds from those with complex backgrounds, the hybrid method can process documents with different backgrounds by applying the appropriate methods. Experiments on extracting handwriting from a check image, as well as machine-printed characters from scene images demonstrate the effectiveness of the proposed model.
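    Otsu's method, cited above as the baseline global technique, can be sketched in a few lines: choose the gray level that maximizes the between-class variance of the histogram. The toy bimodal image below is invented:

```python
import numpy as np

# Global Otsu thresholding: for each candidate threshold k, compute the
# between-class variance from cumulative histogram statistics and return
# the k that maximizes it.

def otsu_threshold(img, levels=256):
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                      # class-0 probability
    mu = np.cumsum(p * np.arange(levels))     # cumulative mean
    mu_t = mu[-1]                             # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b[np.isnan(sigma_b)] = 0.0          # empty classes carry no variance
    return int(np.argmax(sigma_b))

# Bimodal toy image: dark background around 40, bright strokes around 200.
rng = np.random.default_rng(1)
img = rng.normal(40, 10, (64, 64))
img[20:40, 20:40] = rng.normal(200, 10, (20, 20))
img = np.clip(img, 0, 255).astype(np.uint8)
t = otsu_threshold(img)  # lands between the two modes
```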

  1. An objective method and measuring equipment for noise control and acoustic diagnostics of motorcars. [acoustic diagnostics on automobile engines

    NASA Technical Reports Server (NTRS)

    Kacprowski, J.; Motylewski, J.; Miazga, J.

    1974-01-01

    An objective method and apparatus for noise control and acoustic diagnostics of motorcar engines are reported. The method and apparatus indicate whether the noisiness of the vehicle under test exceeds the admissible threshold levels given by the appropriate standards and, if so, what the main source of the excessive noise is. The method consists of measuring both the overall noise level and the sound pressure levels in definite frequency bands while the engine speed is controlled and may be fixed at prescribed values. Whenever the individually adjusted threshold level is exceeded in any frequency band, a self-sustaining control signal is sent.

  2. Effects of acute hypoxia on the determination of anaerobic threshold using the heart rate-work rate relationships during incremental exercise tests.

    PubMed

    Ozcelik, O; Kelestimur, H

    2004-01-01

    Anaerobic threshold, which describes the onset of the systematic increase in blood lactate concentration, is a widely used concept in clinical and sports medicine. A deflection point in the heart rate-work rate relationship has been introduced to determine the anaerobic threshold non-invasively. However, some researchers have consistently reported a heart rate deflection at higher work rates, while others have not. The present study was designed to investigate whether the heart rate deflection point accurately predicts the anaerobic threshold under the condition of acute hypoxia. Eight untrained males performed two incremental exercise tests using an electromagnetically braked cycle ergometer: one breathing room air and one breathing 12% O2. The anaerobic threshold was estimated using the V-slope method and determined from the increase in blood lactate and the decrease in standard bicarbonate concentration. This threshold was also estimated from the deflection point in the heart rate-work rate relationship. Not all subjects exhibited a heart rate deflection. Only two subjects in the control condition and four subjects in the hypoxia condition showed a heart rate deflection. Additionally, the heart rate deflection point overestimated the anaerobic threshold. In conclusion, the heart rate deflection point was not an accurate predictor of the anaerobic threshold, and acute hypoxia did not systematically affect the heart rate-work rate relationship.

  3. Quantifying fracture geometry with X-ray tomography: Technique of Iterative Local Thresholding (TILT) for 3D image segmentation

    DOE PAGES

    Deng, Hang; Fitts, Jeffrey P.; Peters, Catherine A.

    2016-02-01

    This paper presents a new method, the Technique of Iterative Local Thresholding (TILT), for processing 3D X-ray computed tomography (xCT) images for visualization and quantification of rock fractures. The TILT method includes the following advancements. First, custom masks are generated by a fracture-dilation procedure, which significantly amplifies the fracture signal on the intensity histogram used for local thresholding. Second, TILT is particularly well suited for fracture characterization in granular rocks because the multi-scale Hessian fracture (MHF) filter has been incorporated to distinguish fractures from pores in the rock matrix. Third, TILT wraps the thresholding and fracture isolation steps in an optimized iterative routine for binary segmentation, minimizing human intervention and enabling automated processing of large 3D datasets. As an illustrative example, we applied TILT to 3D xCT images of reacted and unreacted fractured limestone cores. Other segmentation methods were also applied to provide insights regarding variability in image processing. The results show that TILT significantly enhanced separability of grayscale intensities, outperformed the other methods in automation, and was successful in isolating fractures from the porous rock matrix. Because the other methods are more likely to misclassify fracture edges as void and/or have limited capacity in distinguishing fractures from pores, those methods estimated larger fracture volumes (up to 80%), surface areas (up to 60%), and roughness (up to a factor of 2). In conclusion, these differences in fracture geometry would lead to significant disparities in hydraulic permeability predictions, as determined by 2D flow simulations.

  4. Cryptic diversity and discordance in single-locus species delimitation methods within horned lizards (Phrynosomatidae: Phrynosoma).

    PubMed

    Blair, Christopher; Bryson, Robert W

    2017-11-01

    Biodiversity reduction and loss continues to progress at an alarming rate, and thus, there is widespread interest in utilizing rapid and efficient methods for quantifying and delimiting taxonomic diversity. Single-locus species delimitation methods have become popular, in part due to the adoption of the DNA barcoding paradigm. These techniques can be broadly classified into tree-based and distance-based methods depending on whether species are delimited based on a constructed genealogy. Although the relative performance of these methods has been tested repeatedly with simulations, additional studies are needed to assess congruence with empirical data. We compiled a large data set of mitochondrial ND4 sequences from horned lizards (Phrynosoma) to elucidate congruence using four tree-based (single-threshold GMYC, multiple-threshold GMYC, bPTP, mPTP) and one distance-based (ABGD) species delimitation models. We were particularly interested in cases with highly uneven sampling and/or large differences in intraspecific diversity. Results showed a high degree of discordance among methods, with multiple-threshold GMYC and bPTP suggesting an unrealistically high number of species (29 and 26 species within the P. douglasii complex alone). The single-threshold GMYC model was the most conservative, likely a result of difficulty in locating the inflection point in the genealogies. mPTP and ABGD appeared to be the most stable across sampling regimes and suggested the presence of additional cryptic species that warrant further investigation. These results suggest that the mPTP model may be preferable in empirical data sets with highly uneven sampling or large differences in effective population sizes of species. © 2017 John Wiley & Sons Ltd.

  5. An integrated use of topography with RSI in gully mapping, Shandong Peninsula, China.

    PubMed

    He, Fuhong; Wang, Tao; Gu, Lijuan; Li, Tao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    Taking Quickbird optical satellite imagery of the small watershed of Beiyanzigou valley, Qixia city, Shandong province, as the study data, we propose a new method that uses an image fusing topography with remote sensing imagery (RSI) to achieve high-precision interpretation of gully edge lines. The technique first transforms the remote sensing imagery from RGB to HSV color space. Slope threshold values for the gully edge line and the gully thalweg are then obtained through field survey, and the slope data are segmented by thresholding. Based on the fused image combined with the gully thalweg threshold vectors, the gully thalweg vectors are amended. Lastly, the gully edge line is interpreted from the amended gully thalweg vectors, the fused image, the gully edge line threshold vectors, and the slope data. A testing region was selected within the study area to assess accuracy: the gully information interpreted from the remote sensing imagery alone and from the fused image was evaluated using the deviation, the kappa coefficient, and the overall accuracy of the error matrix. Compared with interpreting the remote sensing imagery alone, the overall accuracy and kappa coefficient increased by 24.080% and 264.364%, respectively, and the average deviations of the gully head and gully edge line decreased by 60.448% and 67.406%, respectively. These results show that the thematic and positional accuracy of gullies interpreted by the new method is significantly higher. Finally, the error sources affecting interpretation accuracy for the two methods were analyzed.
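    The slope-thresholding step reduces to simple mask generation over a slope raster. A minimal sketch; the threshold values below are hypothetical placeholders, since the paper derives its values from its field survey:

```python
import numpy as np

# Hypothetical slope thresholds (degrees); the study's actual values
# come from field survey and are not reported in the abstract.
EDGE_SLOPE_DEG = 25.0
THALWEG_SLOPE_DEG = 5.0

def segment_slope(slope_deg, edge_thresh=EDGE_SLOPE_DEG,
                  thalweg_thresh=THALWEG_SLOPE_DEG):
    """Binary-segment a slope raster: steep cells mark candidate gully
    edge lines, near-flat cells mark the gully thalweg."""
    edge_mask = slope_deg >= edge_thresh
    thalweg_mask = slope_deg <= thalweg_thresh
    return edge_mask, thalweg_mask

slope = np.array([[2.0, 30.0],
                  [10.0, 4.0]])
edges, thalweg = segment_slope(slope)
```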

  6. An Integrated Use of Topography with RSI in Gully Mapping, Shandong Peninsula, China

    PubMed Central

    He, Fuhong; Wang, Tao; Gu, Lijuan; Li, Tao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    Taking Quickbird optical satellite imagery of the small watershed of Beiyanzigou valley, Qixia city, Shandong province, as the study data, we propose a new method that uses an image fusing topography with remote sensing imagery (RSI) to achieve high-precision interpretation of gully edge lines. The technique first transforms the remote sensing imagery from RGB to HSV color space. Slope threshold values for the gully edge line and the gully thalweg are then obtained through field survey, and the slope data are segmented by thresholding. Based on the fused image combined with the gully thalweg threshold vectors, the gully thalweg vectors are amended. Lastly, the gully edge line is interpreted from the amended gully thalweg vectors, the fused image, the gully edge line threshold vectors, and the slope data. A testing region was selected within the study area to assess accuracy: the gully information interpreted from the remote sensing imagery alone and from the fused image was evaluated using the deviation, the kappa coefficient, and the overall accuracy of the error matrix. Compared with interpreting the remote sensing imagery alone, the overall accuracy and kappa coefficient increased by 24.080% and 264.364%, respectively, and the average deviations of the gully head and gully edge line decreased by 60.448% and 67.406%, respectively. These results show that the thematic and positional accuracy of gullies interpreted by the new method is significantly higher. Finally, the error sources affecting interpretation accuracy for the two methods were analyzed. PMID:25302333

  7. Using a Photon Beam for Thermal Nociceptive Threshold Experiments

    NASA Astrophysics Data System (ADS)

    Walker, Azida; Anderson, Jeffery; Sherwood, Spencer

    In humans, the risk of diabetes and diabetic complications increases with age and with the duration of the prediabetic state. In an effort to understand the progression of this disease, scientists have evaluated the deterioration of the nervous system. One of the current methods used in this evaluation is the thermal threshold experiment. An incremental Hot/Cold Plate Analgesia Meter (IITC Life Science, CA) is used to increase the plate temperature linearly at a rate of 10 ºC min-1 with a cutoff temperature of 55 ºC. The hind limb heat pain threshold (HPT) is defined as the plate temperature at which the animal abruptly withdraws one of its hind feet from the plate surface in a sharp move, typically followed by licking of the lifted paw. One disadvantage of this hot-plate method is determining the true temperature at which the paw was withdrawn: while the temperature of the plate is known, the position of the paw on the surface may vary, occasionally being cupped, resulting in a temperature difference between the plate and the paw. In addition, during experiments the rats may urinate onto the plate, changing the surface temperature and again reducing the accuracy of the withdrawal threshold. We propose here a new method for somatic nociceptive heat pain threshold experiments. This design employs a photon beam to detect the thermal response of an animal model. The details of this design are presented. Funded by the Undergraduate Research Council at the University of Central Arkansas.

  8. Pain Intensity Recognition Rates via Biopotential Feature Patterns with Support Vector Machines

    PubMed Central

    Gruss, Sascha; Treister, Roi; Werner, Philipp; Traue, Harald C.; Crawcour, Stephen; Andrade, Adriano; Walter, Steffen

    2015-01-01

    Background The clinically used methods of pain diagnosis do not allow for objective and robust measurement, and physicians must rely on the patient’s report of the pain sensation. Verbal scales, visual analog scales (VAS) and numeric rating scales (NRS) are among the most common tools, but they are restricted to patients with normal mental abilities. Instruments also exist for pain assessment in people with verbal and/or cognitive impairments, and in people who are sedated and mechanically ventilated. However, all these diagnostic methods either have limited reliability and validity or are very time-consuming. In contrast, biopotentials can be automatically analyzed with machine learning algorithms to provide a surrogate measure of pain intensity. Methods In this context, we created a database of biopotentials to advance an automated pain recognition system, determine its theoretical testing quality, and optimize its performance. Eighty-five participants were subjected to painful heat stimuli (baseline, pain threshold, two intermediate thresholds, and pain tolerance threshold) under controlled conditions, and the signals of electromyography, skin conductance level, and electrocardiography were collected. A total of 159 features were extracted from the mathematical groupings of amplitude, frequency, stationarity, entropy, linearity, variability, and similarity. Results We achieved classification rates of 90.94% for baseline vs. pain tolerance threshold and 79.29% for baseline vs. pain threshold. The most frequently selected pain features stemmed from the amplitude and similarity groups and were derived from facial electromyography. Conclusion The machine learning measurement of pain in patients could provide valuable information for a clinical team and thus support treatment assessment. PMID:26474183

  9. Is heart rate variability a feasible method to determine anaerobic threshold in progressive resistance exercise in coronary artery disease?

    PubMed Central

    Sperling, Milena P. R.; Simões, Rodrigo P.; Caruso, Flávia C. R.; Mendes, Renata G.; Arena, Ross; Borghi-Silva, Audrey

    2016-01-01

    ABSTRACT Background Recent studies have shown that the magnitude of the metabolic and autonomic responses during progressive resistance exercise (PRE) is associated with the determination of the anaerobic threshold (AT). AT is an important parameter to determine intensity in dynamic exercise. Objectives To investigate the metabolic and cardiac autonomic responses during dynamic resistance exercise in patients with Coronary Artery Disease (CAD). Method Twenty men (age = 63±7 years) with CAD [Left Ventricular Ejection Fraction (LVEF) = 60±10%] underwent a PRE protocol on a leg press until maximal exertion. The protocol began at 10% of One Repetition Maximum Test (1-RM), with subsequent increases of 10% until maximal exhaustion. Heart Rate Variability (HRV) indices from Poincaré plots (SD1, SD2, SD1/SD2) and time domain (rMSSD and RMSM), and blood lactate were determined at rest and during PRE. Results Significant alterations in HRV and blood lactate were observed starting at 30% of 1-RM (p<0.05). Bland-Altman plots revealed a consistent agreement between blood lactate threshold (LT) and rMSSD threshold (rMSSDT) and between LT and SD1 threshold (SD1T). Relative values of 1-RM in all LT, rMSSDT and SD1T did not differ (29%±5 vs 28%±5 vs 29%±5 Kg, respectively). Conclusion HRV during PRE could be a feasible noninvasive method of determining AT in CAD patients to plan intensities during cardiac rehabilitation. PMID:27556384
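    The HRV indices named above can be computed directly from an RR-interval series. A minimal sketch with hypothetical RR values (ms); SD1 of the Poincaré plot is closely related to rMSSD:

```python
import numpy as np

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    d = np.diff(np.asarray(rr_ms, dtype=float))
    return np.sqrt(np.mean(d ** 2))

def poincare_sd1(rr_ms):
    """SD1 of the Poincare plot: dispersion perpendicular to the line
    of identity, approximately rMSSD / sqrt(2)."""
    d = np.diff(np.asarray(rr_ms, dtype=float))
    return np.std(d) / np.sqrt(2)

rr = [800, 810, 790, 805, 795]  # hypothetical RR intervals in ms
```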

  10. [Determination of the anaerobic threshold by the rate of ventilation and cardio interval variability].

    PubMed

    Seluianov, V N; Kalinin, E M; Pak, G D; Maevskaia, V I; Konrad, A H

    2011-01-01

    The aim of this work was to develop methods for determining the anaerobic threshold from the rate of ventilation and from cardio-interval variability during tests with stepwise increasing load on a cycle ergometer and a treadmill. In the first phase, a method for determining the anaerobic threshold from lung ventilation was developed: 49 highly skilled skiers performed a treadmill ski-walking test with poles, with the slope increasing gradually from 0 to 25 degrees by one degree every minute. In the second phase, a method was developed for determining the anaerobic threshold from the dynamics of cardio-interval variability during the test. This study included 86 athletes of different sports specialties who pedaled on a "Monarch" cycle ergometer; the initial power output was 25 W, increased by 25 W every 2 min at a steady cadence of 75 rev/min. Pulmonary ventilation and oxygen and carbon dioxide content were measured using a COSMED K4 gas analyzer. Arterial blood was sampled from the ear lobe or finger, and blood lactate concentration was determined using an "Akusport" instrument. RR-intervals were recorded with a Polar s810i heart rate monitor. As a result, it was shown that the graphical method for determining the onset of the ventilatory anaerobic threshold (VAnP) coincides with a blood lactate accumulation of 3.8 +/- 0.1 mmol/l when testing on the treadmill and 4.1 +/- 0.6 mmol/l on the cycle ergometer. For the connection between oxygen consumption at VAnP and the dispersion of cardio intervals (SD1), a regression equation was derived: VO2AnT = 0.35 + 0.01SD1W + 0.0016SD1HR + 0.106SD1(ms), l/min (R = 0.98, estimation error 0.26 l/min, p < 0.001), where W is power (W), HR is heart rate (beats/min), and SD1 is the cardio-interval dispersion (ms) at the moment the cardio-interval threshold is registered.

  11. Cloud cover over the equatorial eastern Pacific derived from July 1983 International Satellite Cloud Climatology Project data using a hybrid bispectral threshold method

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Harrison, Edwin F.; Gibson, Gary G.

    1987-01-01

    A set of visible and IR data obtained with GOES from July 17-31, 1983 is analyzed using a modified version of the hybrid bispectral threshold method developed by Minnis and Harrison (1984). This methodology can be divided into a set of procedures or optional techniques to determine the appropriate clear-sky temperature or IR threshold. The various optional techniques are described; the options are: standard, low-temperature limit, high-reflectance limit, low-reflectance limit, coldest pixel and thermal adjustment limit, IR-only low-cloud temperature limit, IR clear-sky limit, and IR overcast limit. Variations in the cloud parameters and the characteristics and diurnal cycles of trade cumulus and stratocumulus clouds over the eastern equatorial Pacific are examined. It is noted that the new method produces substantial changes in about one third of the cloud amount retrievals, and that low-cloud retrievals are affected most by the new constraints.

  12. The ship edge feature detection based on high and low threshold for remote sensing image

    NASA Astrophysics Data System (ADS)

    Li, Xuan; Li, Shengyang

    2018-05-01

    In this paper, a method based on high and low thresholds is proposed to detect ship edge features, addressing the low accuracy caused by noise. The relationship between the human visual system and the target features is analyzed, and the ship target is determined by detecting its edge features. First, a second-order differential method is used to enhance image quality. Second, to improve the edge operator, high and low thresholds are introduced to sharpen the contrast between edge and non-edge points; treating edge points as foreground and non-edge points as background, image segmentation then achieves edge detection and removes false edges. Finally, the edge features are described based on the detection results, and the ship target is determined. The experimental results show that the proposed method effectively reduces the number of false edges in edge detection and achieves high accuracy in remote sensing ship edge detection.
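    The high/low-threshold idea is essentially hysteresis thresholding: strong responses seed the edge map, and weak responses survive only if connected to a strong one, which suppresses isolated false edges. A minimal sketch on a toy gradient map (all values hypothetical):

```python
import numpy as np
from collections import deque

def hysteresis_threshold(grad, low, high):
    """Classify pixels with two thresholds: values >= high are strong
    edges; values in [low, high) are kept only if 4-connected (directly
    or transitively) to a strong edge. Weak responses not linked to a
    strong edge are discarded as false edges."""
    strong = grad >= high
    weak = (grad >= low) & ~strong
    keep = strong.copy()
    q = deque(zip(*np.nonzero(strong)))   # BFS from strong-edge pixels
    rows, cols = grad.shape
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and weak[rr, cc] and not keep[rr, cc]:
                keep[rr, cc] = True
                q.append((rr, cc))
    return keep

g = np.array([
    [0.9, 0.5, 0.1],
    [0.1, 0.5, 0.1],
    [0.1, 0.5, 0.2],
])
edges = hysteresis_threshold(g, low=0.4, high=0.8)
```

    Here the weak 0.5 responses are kept because they chain back to the strong 0.9 pixel, while the isolated low responses are dropped.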

  13. Have the temperature time series a structural change after 1998?

    NASA Astrophysics Data System (ADS)

    Werner, Rolf; Valev, Dimitare; Danov, Dimitar

    2012-07-01

    The global and hemispheric GISS and HadCRUT3 temperature time series were analysed for structural changes. We postulate continuity of the temperature function over time. Slopes are calculated for a sequence of segments delimited by time thresholds, using a standard method: restricted linear regression with dummy variables. We performed the calculations and tests for different numbers of thresholds; the thresholds are searched continuously within specified time intervals, and the F-statistic is used to identify the time points of the structural changes.
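    One variant of the dummy-variable approach can be sketched as follows: the unrestricted model adds a slope-change regressor max(t − τ, 0), which enforces continuity at the threshold, and an F-statistic compares it against the single-slope restricted model. The data below are synthetic, not the GISS/HadCRUT3 series:

```python
import numpy as np

def sse(y, X):
    """Sum of squared residuals of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def chow_f(t, y, tau):
    """F-statistic for one continuous structural break at time tau:
    restricted model = single line; unrestricted model adds the
    hinge regressor max(t - tau, 0) (one extra parameter)."""
    n = len(t)
    X_r = np.column_stack([np.ones(n), t])
    X_u = np.column_stack([np.ones(n), t, np.clip(t - tau, 0.0, None)])
    sse_r, sse_u = sse(y, X_r), sse(y, X_u)
    q = 1                       # number of restrictions
    df = n - X_u.shape[1]       # residual degrees of freedom
    return (sse_r - sse_u) / q / (sse_u / df)

rng = np.random.default_rng(1)
t = np.arange(30.0)
# Slope 0.1 before the break at t = 15, slope 0.5 after, plus noise.
y = np.where(t < 15, 0.1 * t, 1.5 + 0.5 * (t - 15)) + rng.normal(0, 0.05, 30)
f_at_break = chow_f(t, y, 15.0)
f_elsewhere = chow_f(t, y, 5.0)
```

    Scanning τ over a time interval and taking the maximum F locates the most likely change point.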

  14. An evaluation of the effect of recent temperature variability on the prediction of coral bleaching events.

    PubMed

    Donner, Simon D

    2011-07-01

    Over the past 30 years, warm thermal disturbances have become commonplace on coral reefs worldwide. These periods of anomalous sea surface temperature (SST) can lead to coral bleaching, a breakdown of the symbiosis between the host coral and the symbiotic dinoflagellates which reside in coral tissue. The onset of bleaching is typically predicted to occur when the SST exceeds a local climatological maximum by 1 degree C for a month or more. However, recent evidence suggests that the threshold at which bleaching occurs may depend on thermal history. This study uses global SST data sets (HadISST and NOAA AVHRR) and mass coral bleaching reports (from ReefBase) to examine the effect of historical SST variability on the accuracy of bleaching prediction. Two variability-based bleaching prediction methods are developed from global analysis of seasonal and interannual SST variability. The first method employs a local bleaching threshold derived from the historical variability in maximum annual SST to account for spatial variability in past thermal disturbance frequency. The second method uses a different formula to estimate the local climatological maximum to account for the low seasonality of SST in the tropics. The new prediction methods are tested against the common globally fixed threshold method using the observed bleaching reports. The results show that estimating the bleaching threshold from local historical SST variability delivers the highest predictive power, but also a higher rate of Type I errors. The second method has the lowest predictive power globally, though regional analysis suggests that it may be applicable in equatorial regions. The historical data analysis suggests that the bleaching threshold may have appeared to be constant globally because the magnitude of interannual variability in maximum SST is similar for many of the world's coral reef ecosystems. For example, the results show that an SST anomaly of 1 degree C is equivalent to 1.73-2.94 standard deviations of the maximum monthly SST for two-thirds of the world's coral reefs. Coral reefs in the few regions that experience anomalously high interannual SST variability, like the equatorial Pacific, could prove critical to understanding how coral communities acclimate or adapt to frequent and/or severe thermal disturbances.
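    The two threshold definitions can be contrasted in a few lines: the fixed rule adds 1 °C to the climatological maximum, while a variability-based rule scales the exceedance by the historical standard deviation of the annual maximum SST. The site data and the value of k below are hypothetical (the study reports 1 °C falling between 1.73 and 2.94 standard deviations for most reefs):

```python
import numpy as np

def bleaching_threshold_fixed(clim_max_sst):
    """Common rule: bleaching predicted when SST exceeds the local
    climatological maximum by a fixed 1 degree C."""
    return clim_max_sst + 1.0

def bleaching_threshold_variability(annual_max_sst, k=2.0):
    """Variability-based rule (sketch): scale the exceedance by the
    historical interannual variability of the annual maximum SST, so
    low-variability sites bleach at smaller absolute anomalies.
    k is a hypothetical number of standard deviations."""
    clim_max = np.mean(annual_max_sst)
    sigma = np.std(annual_max_sst, ddof=1)
    return clim_max + k * sigma

# Hypothetical annual-maximum SSTs (degrees C) for one low-variability site.
annual_max = np.array([29.1, 29.4, 29.0, 29.3, 29.2])
fixed = bleaching_threshold_fixed(float(np.mean(annual_max)))
adaptive = bleaching_threshold_variability(annual_max, k=2.0)
```

    For this low-variability site the adaptive threshold sits well below the fixed one, so bleaching would be predicted at a smaller anomaly.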

  15. Lack of concordance amongst measurements of individual anaerobic threshold and maximal lactate steady state on a cycle ergometer.

    PubMed

    Arratibel-Imaz, Iñaki; Calleja-González, Julio; Emparanza, Jose Ignacio; Terrados, Nicolas; Mjaanes, Jeffrey M; Ostojic, Sergej M

    2016-01-01

    The exercise intensity at which a change occurs in the metabolic processes that supply the energy to maintain physical work has been defined as the anaerobic threshold (AT). Direct determination of the maximal lactate steady state (MLSS) requires exertion at several intensities over long periods with sufficient rest between them, which is impractical in daily testing, so many protocols have been used for the indirect calculation of MLSS. The aim of this study was to determine whether the results of 12 different AT calculation methods and calculation software [Keul, Simon, Stegmann, Bunc, Dickhuth (TKM and WLa), Dmax, Freiburg, Geiger-Hille, Log-Log, Lactate Minimum] can be used interchangeably, including the fixed-threshold method of Mader/OBLA at 4 mmol/l, and then to compare them with the direct measurement of MLSS. There were two parts to this research. Phase 1: results from 162 exertion tests chosen at random from 1560 tests. Phase 2: sixteen athletes (n = 16) carried out different tests on five consecutive days. There was very high concordance among all the methods [intraclass correlation coefficient (ICC) > 0.90], except Log-Log in relation to Stegmann, Dmax, Dickhuth-WLa and Geiger-Hille. The Dickhuth-TKM showed a high tendency towards concordance with Dmax (2.2 W) and Dickhuth-WLa (0.1 W). The Dickhuth-TKM method also presented a high tendency to concordance with Dickhuth-WLa (0.5 W), Freiburg (7.4 W), MLSS (2.0 W), Bunc (8.9 W), and Dmax (0.1 W). The calculation of MLSS power showed a high tendency to concordance with Dickhuth-TKM (2 W), Dmax (2.1 W), and Dickhuth-WLa (1.5 W). The fixed threshold of 4 mmol/l (OBLA) produces slightly different, higher results than all the other methods analyzed, including MLSS, meaning an overestimation of power at the individual anaerobic threshold. The Dickhuth-TKM, Dmax and Dickhuth-WLa methods showed high concordance on a cycle ergometer, and also with the power calculated for the MLSS.
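    Of the methods compared, Dmax is simple to sketch: fit a polynomial to the lactate-power curve and take the point farthest (in perpendicular distance) from the chord joining the curve's endpoints. The power and lactate values below are hypothetical:

```python
import numpy as np

def dmax_threshold(power, lactate):
    """Dmax method (sketch): fit a 3rd-order polynomial to the lactate
    curve, then return the power at which the fitted curve is farthest
    from the straight line joining its endpoints."""
    coef = np.polyfit(power, lactate, 3)
    x = np.linspace(power[0], power[-1], 1000)
    y = np.polyval(coef, x)
    x0, y0, x1, y1 = x[0], y[0], x[-1], y[-1]
    # Perpendicular distance to the chord, up to a constant factor
    # (the normalizing denominator does not change the argmax).
    dist = np.abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0)
    return x[np.argmax(dist)]

power = np.array([50.0, 100.0, 150.0, 200.0, 250.0, 300.0])   # W
lactate = np.array([1.0, 1.1, 1.3, 2.0, 3.8, 7.0])            # mmol/l, hypothetical
at_power = dmax_threshold(power, lactate)
```

    The returned power is where the curve's slope equals the chord's slope, i.e. where lactate begins rising disproportionately.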

  16. Computer-Assisted Experiments with a Laser Diode

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2011-01-01

    A laser diode from an inexpensive laser pen (laser pointer) is used in simple experiments. The radiant output power and efficiency of the laser are measured, and polarization of the light beam is shown. The "h/e" ratio is available from the threshold of spontaneous emission. The lasing threshold is found using several methods. With a…

  17. Cortical and Sensory Causes of Individual Differences in Selective Attention Ability among Listeners with Normal Hearing Thresholds

    ERIC Educational Resources Information Center

    Shinn-Cunningham, Barbara

    2017-01-01

    Purpose: This review provides clinicians with an overview of recent findings relevant to understanding why listeners with normal hearing thresholds (NHTs) sometimes suffer from communication difficulties in noisy settings. Method: The results from neuroscience and psychoacoustics are reviewed. Results: In noisy settings, listeners focus their…

  18. 76 FR 55865 - Fisheries Off West Coast States; Notice of Availability for Secretarial Amendment 1 to the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... methods: Electronic Submission: Submit all electronic public comments via the Federal e-Rulemaking Portal http://www.regulations.gov . To submit comments via the e-Rulemaking Portal, first click the ``submit a...-frame. The overfished threshold would also be revised. The overfished threshold or minimum stock size...

  19. Use of change-point detection for friction-velocity threshold evaluation in eddy-covariance studies

    Treesearch

    A.G. Barr; A.D. Richardson; D.Y. Hollinger; D. Papale; M.A. Arain; T.A. Black; G. Bohrer; D. Dragoni; M.L. Fischer; L. Gu; B.E. Law; H.A. Margolis; J.H. McCaughey; J.W. Munger; W. Oechel; K. Schaeffer

    2013-01-01

    The eddy-covariance method often underestimates fluxes under stable, low-wind conditions at night when turbulence is not well developed. The most common approach to resolve the problem of nighttime flux underestimation is to identify and remove the deficit periods using friction-velocity (u∗) threshold filters (u∗

  20. Application of Terrestrial Geomorphic Threshold Theory to the Analysis of Small Channels on Mars

    NASA Technical Reports Server (NTRS)

    Rosenshein, E. B.; Greeley, R.; Arrowsmith, J. R.

    2001-01-01

    New terrestrial work on geomorphic thresholds for channel initiation uses the drainage area above a channel head vs. the slope at the channel head to delineate surface-process types. This method has been used to characterize martian landscapes. Additional information is contained in the original extended abstract.

  1. The temporal dimension of regime shifts: How long can ecosystems operate beyond critical thresholds before transitions become irreversible?

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods: Ecosystem thresholds are often identified by observing or inducing slow changes in different driver variables and investigating changes in the asymptotic state of the system, such as the response of lakes to nutrient loading or biome responses to climate change. Yet ma...

  2. Comparative advantages of novel algorithms using MSR threshold and MSR difference threshold for biclustering gene expression data.

    PubMed

    Das, Shyama; Idicula, Sumam Mary

    2011-01-01

    The goal of biclustering in a gene expression data matrix is to find a submatrix such that the genes in the submatrix show highly correlated activities across all conditions in the submatrix. A measure called mean squared residue (MSR) is used to simultaneously evaluate the coherence of rows and columns within the submatrix. The MSR difference is the incremental increase in MSR when a gene or condition is added to the bicluster. In this chapter, three biclustering algorithms using an MSR threshold (MSRT) and an MSR difference threshold (MSRDT) are evaluated and compared. All these methods use seeds generated by the K-Means clustering algorithm, which are then enlarged by adding more genes and conditions. The first algorithm makes use of MSRT alone. The second and third algorithms make use of MSRT together with the newly introduced MSRDT concept, which yields highly coherent biclusters. In the third algorithm, a different method is used to calculate the MSRDT. The results obtained on benchmark datasets show that these algorithms outperform many metaheuristic algorithms.
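    The MSR and MSR-difference quantities follow the standard Cheng-Church residue definition and are easy to state in code; a minimal numpy sketch:

```python
import numpy as np

def mean_squared_residue(A):
    """Mean squared residue (Cheng-Church) of a bicluster submatrix A:
    residue(i,j) = a_ij - rowmean_i - colmean_j + allmean.
    Low MSR means rows and columns fluctuate coherently."""
    row_mean = A.mean(axis=1, keepdims=True)
    col_mean = A.mean(axis=0, keepdims=True)
    resid = A - row_mean - col_mean + A.mean()
    return float((resid ** 2).mean())

def msr_difference(A, new_row):
    """Incremental change in MSR when a candidate row is appended."""
    B = np.vstack([A, new_row])
    return mean_squared_residue(B) - mean_squared_residue(A)

# A perfectly additive submatrix has MSR == 0; a flat row breaks it.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 3.0, 4.0]])
```

    A row would be admitted only if its MSR difference stays under the chosen MSRDT.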

  3. Research on energy stock market associated network structure based on financial indicators

    NASA Astrophysics Data System (ADS)

    Xi, Xian; An, Haizhong

    2018-01-01

    A financial market is a complex system consisting of many interacting units. In general, because of the various types of information exchange within an industry, there are relationships between stocks that reveal clear structural characteristics. Complex network methods are powerful tools for studying the internal structure and function of the stock market, which allows us to understand it better. Applying complex network methodology, a stock association network model based on financial indicators is created. We then set a threshold value and use modularity to detect communities in the network, and we analyze the network structure and community clustering characteristics for different threshold values. The study finds that a threshold value of 0.7 is the abrupt-change point of the network, and that as the threshold value increases, the independence of the communities strengthens. This study provides a method of researching the stock market based on financial indicators, exploring the structural similarity of stocks' financial indicators, and offers guidance for investment and corporate financial management.
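    Thresholding a similarity matrix into an adjacency matrix is the core construction of such a network. A toy sketch with a hypothetical four-stock correlation matrix; the 0.7 cut echoes the abrupt-change point reported in the study:

```python
import numpy as np

def threshold_network(corr, threshold=0.7):
    """Adjacency matrix of a stock network: connect two stocks when the
    absolute correlation of their indicator series reaches the
    threshold; self-loops are removed."""
    adj = (np.abs(corr) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

# Hypothetical correlation matrix for four stocks.
corr = np.array([
    [1.0, 0.9, 0.2, 0.1],
    [0.9, 1.0, 0.3, 0.0],
    [0.2, 0.3, 1.0, 0.8],
    [0.1, 0.0, 0.8, 1.0],
])
adj = threshold_network(corr)
n_edges = adj.sum() // 2
```

    Community detection (e.g. modularity optimization) would then run on `adj`; raising the threshold prunes edges and tends to split the network into more independent communities.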

  4. Estimation of Crack Initiation and Propagation Thresholds of Confined Brittle Coal Specimens Based on Energy Dissipation Theory

    NASA Astrophysics Data System (ADS)

    Ning, Jianguo; Wang, Jun; Jiang, Jinquan; Hu, Shanchao; Jiang, Lishuai; Liu, Xuesheng

    2018-01-01

    A new energy-dissipation method to identify crack initiation and propagation thresholds is introduced. Conventional and cyclic loading-unloading triaxial compression tests and acoustic emission experiments were performed for coal specimens from a 980-m deep mine with different confining pressures of 10, 15, 20, 25, 30, and 35 MPa. Stress-strain relations, acoustic emission patterns, and energy evolution characteristics obtained during the triaxial compression tests were analyzed. The majority of the input energy stored in the coal specimens took the form of elastic strain energy. After the elastic-deformation stage, part of the input energy was consumed by stable crack propagation. However, with an increase in stress levels, unstable crack propagation commenced, and the energy dissipation and coal damage were accelerated. The variation in the pre-peak energy-dissipation ratio was consistent with the coal damage. This new method demonstrates that the crack initiation threshold was proportional to the peak stress (σp), ranging from 0.4351 to 0.4753 σp, and the crack damage threshold ranged from 0.8087 to 0.8677 σp.

  5. Adaptive thresholding and dynamic windowing method for automatic centroid detection of digital Shack-Hartmann wavefront sensor.

    PubMed

    Yin, Xiaoming; Li, Xiang; Zhao, Liping; Fang, Zhongping

    2009-11-10

    A Shack-Hartmann wavefront sensor (SHWS) splits the incident wavefront into many subsections and transforms distorted-wavefront detection into a centroid measurement, so the accuracy of the centroid measurement determines the accuracy of the SHWS. Many methods have been presented to improve the accuracy of the wavefront centroid measurement. However, most of these methods are discussed from the point of view of optics, based on the assumption that the spot intensity of the SHWS has a Gaussian distribution, which is not applicable to the digital SHWS. In this paper, we present a centroid measurement algorithm based on adaptive thresholding and a dynamic windowing method, utilizing image processing techniques, for practical application of the digital SHWS in surface profile measurement. The method can detect the centroid of each focal spot precisely and robustly by eliminating the influence of various noise sources, such as diffraction of the digital SHWS, unevenness and instability of the light source, and deviation between the centroid of the focal spot and the center of the detection area. The experimental results demonstrate that the algorithm has better precision, repeatability, and stability than other commonly used centroid methods, such as the statistical averaging, thresholding, and windowing algorithms.
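    The thresholding-plus-windowing idea can be sketched as follows: within each detection window, pixels below an adaptive fraction of the peak are suppressed before taking the intensity-weighted centroid. This is a generic sketch, not the paper's algorithm, and the window data and threshold fraction are hypothetical:

```python
import numpy as np

def spot_centroid(window, rel_threshold=0.2):
    """Centroid of one focal spot inside its detection window.
    Pixels below rel_threshold * max are zeroed (adaptive thresholding
    against background noise) before the intensity-weighted centroid
    is computed. Returns (cx, cy) in pixel coordinates."""
    w = window.astype(float)
    w[w < rel_threshold * w.max()] = 0.0
    total = w.sum()
    ys, xs = np.mgrid[0:w.shape[0], 0:w.shape[1]]
    return (xs * w).sum() / total, (ys * w).sum() / total

# Hypothetical 4x4 detection window around one focal spot.
spot = np.array([
    [1,  2,  1, 0],
    [2, 50, 10, 1],
    [1, 10,  5, 0],
    [0,  1,  0, 0],
], dtype=float)
cx, cy = spot_centroid(spot)
```

    Without the threshold, the faint background pixels would bias the centroid away from the bright core.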

  6. Analysis of Critical Mass in Threshold Model of Diffusion

    NASA Astrophysics Data System (ADS)

    Kim, Jeehong; Hur, Wonchang; Kang, Suk-Ho

    2012-04-01

    Why does diffusion sometimes show cascade phenomena but at other times is impeded? In addressing this question, we considered a threshold model of diffusion, focusing on the formation of a critical mass, which enables diffusion to be self-sustaining. Performing an agent-based simulation, we found that the diffusion model produces only two outcomes: Almost perfect adoption or relatively few adoptions. In order to explain the difference, we considered the various properties of network structures and found that the manner in which thresholds are arrayed over a network is the most critical factor determining the size of a cascade. On the basis of the results, we derived a threshold arrangement method effective for generation of a critical mass and calculated the size required for perfect adoption.
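    The two outcomes described, near-perfect adoption or very few adoptions, appear even in a tiny synchronous threshold-model simulation. A sketch; the ring network and threshold values are illustrative:

```python
import numpy as np

def cascade_size(thresholds, adjacency, seeds):
    """Synchronous threshold-model diffusion: a node adopts once the
    fraction of its neighbors that have adopted reaches its threshold.
    Iterate to a fixed point and return the number of adopters."""
    adopted = np.zeros(len(thresholds), dtype=bool)
    adopted[list(seeds)] = True
    deg = adjacency.sum(axis=1)
    while True:
        frac = (adjacency @ adopted) / np.maximum(deg, 1)
        new = adopted | (frac >= thresholds)
        if np.array_equal(new, adopted):
            return int(adopted.sum())
        adopted = new

# Ring of 6 nodes, one seed. Low thresholds let the cascade reach
# everyone; high thresholds stop it immediately.
n = 6
A = np.zeros((n, n), dtype=int)
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
low = cascade_size(np.full(n, 0.5), A, seeds=[0])
high = cascade_size(np.full(n, 0.9), A, seeds=[0])
```

    How the thresholds are arrayed over the network, not just their average, determines whether the seed set forms a critical mass.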

  7. Quantum secret sharing using orthogonal multiqudit entangled states

    NASA Astrophysics Data System (ADS)

    Bai, Chen-Ming; Li, Zhi-Hui; Liu, Cheng-Ji; Li, Yong-Ming

    2017-12-01

    In this work, we investigate the distinguishability of orthogonal multiqudit entangled states under restricted local operations and classical communication. Based on these properties, we propose a quantum secret sharing scheme to realize three types of access structures, i.e., the (n, n)-threshold, the restricted (3, n)-threshold and the restricted (4, n)-threshold schemes (called LOCC-QSS schemes). All cooperating players in the restricted threshold schemes are drawn from two disjoint groups. In the proposed protocol, the participants use computational-basis measurement and classical communication to distinguish between the orthogonal states and reconstruct the original secret. Furthermore, we analyze the security of our scheme against four primary quantum attacks and give a simple encoding method to better prevent the participant conspiracy attack.

  8. Estimating the epidemic threshold on networks by deterministic connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Kezan, E-mail: lkzzr@sohu.com; Zhu, Guanghu; Fu, Xinchu

    2014-12-15

    For many epidemic networks some connections between nodes are treated as deterministic, while the remainder are random and have different connection probabilities. By applying spectral analysis to several constructed models, we find that one can estimate the epidemic thresholds of these networks by investigating information from only the deterministic connections. Moreover, in these models, generic nonuniform stochastic connections and heterogeneous community structure are also considered. The estimation of epidemic thresholds is achieved via inequalities with upper and lower bounds, which are found to be in very good agreement with numerical simulations. Since these deterministic connections are easier to detect than the stochastic connections, this work provides a feasible and effective method to estimate the epidemic thresholds in real epidemic networks.
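    For reference, the classical spectral estimate this line of work builds on can be sketched directly: for the standard SIS model on a contact network, the epidemic threshold is the reciprocal of the adjacency matrix's spectral radius. This is the textbook result, not the paper's specific deterministic-connection bounds:

```python
import numpy as np

def sis_epidemic_threshold(adj):
    """Classical spectral estimate: for the SIS model on a contact
    network, the epidemic threshold is 1 / rho(A), the reciprocal of
    the adjacency matrix's spectral radius."""
    rho = max(abs(np.linalg.eigvals(np.asarray(adj, dtype=float))))
    return 1.0 / rho

# Complete graph on 4 nodes: rho(A) = 3, so the threshold is 1/3.
A = np.ones((4, 4)) - np.eye(4)
print(round(sis_epidemic_threshold(A), 4))  # 0.3333
```

    The paper's contribution is to bound this quantity above and below using only the deterministic part of the network, which is cheaper to observe than the full adjacency structure.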

  9. Experimental and Finite Element Modeling of Near-Threshold Fatigue Crack Growth for the K-Decreasing Test Method

    NASA Technical Reports Server (NTRS)

    Smith, Stephen W.; Seshadri, Banavara R.; Newman, John A.

    2015-01-01

    The experimental methods to determine near-threshold fatigue crack growth rate data are prescribed in ASTM standard E647. To produce near-threshold data at a constant stress ratio (R), the applied stress-intensity factor (K) is decreased as the crack grows based on a specified K-gradient. Consequently, as the fatigue crack growth rate threshold is approached and the crack tip opening displacement decreases, remote crack wake contact may occur due to the plastically deformed crack wake surfaces and shield the growing crack tip, resulting in a reduced crack tip driving force and non-representative crack growth rate data. If such data are used to predict the life of a component, the evaluation could yield highly non-conservative predictions. Although this anomalous behavior has been shown to be affected by K-gradient, starting K level, residual stresses, environmentally assisted cracking, specimen geometry, and material type, the specifications within the standard to avoid this effect are limited to a maximum fatigue crack growth rate and a suggestion for the K-gradient value. This paper provides parallel experimental and computational simulations for the K-decreasing method for two materials (an aluminum alloy, AA 2024-T3, and a titanium alloy, Ti 6-2-2-2-2) to aid in establishing a clear understanding of appropriate testing requirements. These simulations investigate the effect of K-gradient, the maximum value of stress-intensity factor applied, and material type. A material-independent term is developed to guide the selection of appropriate test conditions for most engineering alloys. With the use of such a term, near-threshold fatigue crack growth rate tests can be performed at accelerated rates, near-threshold data can be acquired in days instead of weeks without having to establish testing criteria through trial and error, and these data can be acquired for most engineering materials, even those that are produced in relatively small product forms.

  10. Modified cable equation incorporating transverse polarization of neuronal membranes for accurate coupling of electric fields.

    PubMed

    Wang, Boshuo; Aberra, Aman S; Grill, Warren M; Peterchev, Angel V

    2018-04-01

    We present a theory and computational methods to incorporate transverse polarization of neuronal membranes into the cable equation to account for the secondary electric field generated by the membrane in response to transverse electric fields. The effect of transverse polarization on nonlinear neuronal activation thresholds is quantified and discussed in the context of previous studies using linear membrane models. The response of neuronal membranes to applied electric fields is derived under two time scales and a unified solution of transverse polarization is given for spherical and cylindrical cell geometries. The solution is incorporated into the cable equation re-derived using an asymptotic model that separates the longitudinal and transverse dimensions. Two numerical methods are proposed to implement the modified cable equation. Several common neural stimulation scenarios are tested using two nonlinear membrane models to compare thresholds of the conventional and modified cable equations. The implementations of the modified cable equation incorporating transverse polarization are validated against previous results in the literature. The test cases show that transverse polarization has limited effect on activation thresholds. The transverse field only affects thresholds of unmyelinated axons for short pulses and in low-gradient field distributions, whereas myelinated axons are mostly unaffected. The modified cable equation captures the membrane's behavior on different time scales and models more accurately the coupling between electric fields and neurons. It addresses the limitations of the conventional cable equation and allows sound theoretical interpretations. The implementation provides simple methods that are compatible with current simulation approaches to study the effect of transverse polarization on nonlinear membranes. 
The minimal influence by transverse polarization on axonal activation thresholds for the nonlinear membrane models indicates that predictions of stronger effects in linear membrane models with a fixed activation threshold are inaccurate. Thus, the conventional cable equation works well for most neuroengineering applications, and the presented modeling approach is well suited to address the exceptions.

  11. Differential equation models for sharp threshold dynamics.

    PubMed

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. Published by Elsevier Inc.

  12. Rapid detection of pandemic influenza in the presence of seasonal influenza

    PubMed Central

    2010-01-01

    Background Key to the control of pandemic influenza are surveillance systems that raise alarms rapidly and sensitively. In addition, they must minimise false alarms during a normal influenza season. We develop a method that uses historical syndromic influenza data from the existing surveillance system 'SERVIS' (Scottish Enhanced Respiratory Virus Infection Surveillance) for influenza-like illness (ILI) in Scotland. Methods We develop an algorithm based on the weekly case ratio (WCR) of reported ILI cases to generate an alarm for pandemic influenza. From the seasonal influenza data from 13 Scottish health boards, we estimate the joint probability distribution of the country-level WCR and the number of health boards showing synchronous increases in reported influenza cases over the previous week. Pandemic cases are sampled with various case reporting rates from simulated pandemic influenza infections and overlaid with seasonal SERVIS data from 2001 to 2007. Using this combined time series we test our method for speed of detection, sensitivity and specificity. The 2008-09 SERVIS ILI cases are also used to test the detection performance of the three methods against real pandemic data. Results We compare our method, based on our simulation study, to the moving-average Cumulative Sums (Mov-Avg Cusum) and ILI rate threshold methods and find it to be more sensitive and rapid. For 1% case reporting and detection specificity of 95%, our method is 100% sensitive and has median detection time (MDT) of 4 weeks while the Mov-Avg Cusum and ILI rate threshold methods are, respectively, 97% and 100% sensitive with MDT of 5 weeks. At 99% specificity, our method remains 100% sensitive with MDT of 5 weeks. Although the threshold method maintains its sensitivity of 100% with MDT of 5 weeks, sensitivity of Mov-Avg Cusum declines to 92% with increased MDT of 6 weeks.
For a two-fold decrease in the case reporting rate (0.5%) and 99% specificity, the WCR and threshold methods, respectively, have MDT of 5 and 6 weeks with both having sensitivity close to 100% while the Mov-Avg Cusum method can only manage sensitivity of 77% with MDT of 6 weeks. However, the WCR and Mov-Avg Cusum methods outperform the ILI threshold method by 1 week in retrospective detection of the 2009 pandemic in Scotland. Conclusions While computationally and statistically simple to implement, the WCR algorithm is capable of raising alarms, rapidly and sensitively, for influenza pandemics against a background of seasonal influenza. Although the algorithm was developed using the SERVIS data, it has the capacity to be used at other geographic scales and for different disease systems where buying some early extra time is critical. PMID:21106071
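    A toy sketch of the weekly-case-ratio idea follows. The alarm thresholds and data shapes here are illustrative placeholders; the paper calibrates the alarm from the estimated joint distribution of the country-level WCR and health-board synchrony:

```python
def wcr_alarm(weekly_cases, wcr_threshold=2.0, sync_threshold=7):
    """Sketch of a weekly-case-ratio (WCR) alarm: flag a week where the
    country-level case ratio and the number of health boards showing a
    week-on-week increase both exceed (illustrative) thresholds.

    weekly_cases: list of weeks, each a list of per-board case counts."""
    alarms = []
    for week in range(1, len(weekly_cases)):
        prev, curr = weekly_cases[week - 1], weekly_cases[week]
        total_prev = sum(prev)
        wcr = sum(curr) / total_prev if total_prev else float("inf")
        n_sync = sum(c > p for p, c in zip(prev, curr))  # boards increasing
        alarms.append(wcr >= wcr_threshold and n_sync >= sync_threshold)
    return alarms

# 13 boards; a synchronous doubling from week 1 to week 2 raises the alarm.
week1 = [10] * 13
week2 = [20] * 13
print(wcr_alarm([week1, week2]))  # [True]
```

    Requiring both a large country-level ratio and broad synchrony across boards is what suppresses false alarms from locally noisy seasonal counts.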

  13. Estimation of frequency offset in mobile satellite modems

    NASA Technical Reports Server (NTRS)

    Cowley, W. G.; Rice, M.; Mclean, A. N.

    1993-01-01

    In mobilesat applications, frequency offset on the received signal must be estimated and removed prior to further modem processing. A straightforward method of estimating the carrier frequency offset is to raise the received MPSK signal to the M-th power and then estimate the location of the peak spectral component. An analysis of the lower signal-to-noise threshold of this method is carried out for BPSK signals. Predicted thresholds are compared to simulation results. It is shown how the method can be extended to pi/M MPSK signals. A real-time implementation of frequency offset estimation for the Australian mobile satellite system is described.
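    The M-th-power estimator described above can be sketched as follows (BPSK case, M = 2; sampling rate, offset, and signal length are illustrative):

```python
import numpy as np

def estimate_freq_offset(rx, fs, M=2):
    """Raise the MPSK signal to the M-th power to strip the modulation,
    then locate the spectral peak; the peak lies at M times the offset."""
    spectrum = np.fft.fft(rx ** M)
    freqs = np.fft.fftfreq(len(rx), d=1.0 / fs)
    return freqs[np.argmax(np.abs(spectrum))] / M

# BPSK (M = 2) with a 100 Hz carrier offset, sampled at 8 kHz.
fs, f_off, n = 8000.0, 100.0, 4000
t = np.arange(n) / fs
bits = np.sign(np.random.default_rng(0).standard_normal(n))  # +/-1 symbols
rx = bits * np.exp(2j * np.pi * f_off * t)
print(estimate_freq_offset(rx, fs))  # 100.0
```

    Squaring the +/-1 BPSK symbols removes the modulation entirely, leaving a pure tone at twice the offset; the estimator's resolution is fs/n, and the SNR threshold analysed in the paper arises because noise-times-signal cross terms can bury this tone.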

  14. Impact of view reduction in CT on radiation dose for patients

    NASA Astrophysics Data System (ADS)

    Parcero, E.; Flores, L.; Sánchez, M. G.; Vidal, V.; Verdú, G.

    2017-08-01

    Iterative methods have become a hot topic of research in computed tomography (CT) imaging because of their capacity to resolve the reconstruction problem from a limited number of projections. This allows the reduction of radiation exposure on patients during the data acquisition. The reconstruction time and the high radiation dose imposed on patients are the two major drawbacks in CT. To solve them effectively we adapted the method for sparse linear equations and sparse least squares (LSQR) with soft threshold filtering (STF) and the fast iterative shrinkage-thresholding algorithm (FISTA) to computed tomography reconstruction. The feasibility of the proposed methods is demonstrated numerically.
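    The shrinkage step at the core of both STF and FISTA is the soft-threshold operator. A generic FISTA sketch for a sparse least-squares problem follows; it is illustrative only — the paper applies these updates with CT projection matrices, not the toy problem below:

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-threshold (shrinkage) operator: shrink each coefficient
    toward zero by lam, zeroing small ones to promote sparsity."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def fista(A, b, lam, n_iter=200):
    """Minimal FISTA for min 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        # Gradient step on the smooth term, then shrinkage on the l1 term.
        x_new = soft_threshold(z - A.T @ (A @ z - b) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + (t - 1) / t_new * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

A = np.eye(3)
b = np.array([3.0, 0.05, -2.0])
print(np.round(fista(A, b, lam=0.1), 2))  # approx [2.9, 0.0, -1.9]
```

    The momentum extrapolation is what distinguishes FISTA from plain iterative shrinkage and gives its faster convergence rate, which is what makes view reduction practical within clinical reconstruction times.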

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu Shioumin; Kruijs, Robbert van de; Zoethout, Erwin

    Ion sputtering yields for Ru, Mo, and Si under Ar{sup +} ion bombardment in the near-threshold energy range have been studied using an in situ weight-loss method with a Kaufman ion source, Faraday cup, and quartz crystal microbalance. The results are compared to theoretical models. The accuracy of the in situ weight-loss method was verified by thickness-decrease measurements using grazing incidence x-ray reflectometry, and results from both methods are in good agreement. These results provide accurate data sets for theoretical modeling in the near-threshold sputter regime and are of relevance for (optical) surfaces exposed to plasmas, as, for instance, in extreme ultraviolet photolithography.

  16. Threshold Velocity for Saltation Activity in the Taklimakan Desert

    NASA Astrophysics Data System (ADS)

    Yang, Xinghua; He, Qing; Matimin, Ali; Yang, Fan; Huo, Wen; Liu, Xinchun; Zhao, Tianliang; Shen, Shuanghe

    2017-12-01

    The threshold velocity is an indicator of a soil's susceptibility to saltation activity and is also an important parameter in dust emission models. In this study, the saltation activity, atmospheric conditions, and soil conditions were measured from 1 August 2008 to 31 July 2009 in the Taklimakan Desert, China. The threshold velocity was estimated using the Gaussian time fraction equivalence method. At 2 m height, the 1-min averaged threshold velocity varied between 3.5 and 10.9 m/s, with a mean of 5.9 m/s. Threshold velocities varying between 4.5 and 7.5 m/s accounted for about 91.4% of all measurements. The average threshold velocity displayed clear seasonal variations in the following sequence: winter (5.1 m/s) < autumn (5.8 m/s) < spring (6.1 m/s) < summer (6.5 m/s). A regression equation of threshold velocity was established based on the relations between daily mean threshold velocity and air temperature, specific humidity, and soil volumetric moisture content. High or moderate positive correlations were found between threshold velocity and air temperature, specific humidity, and soil volumetric moisture content (air temperature r = 0.75; specific humidity r = 0.59; and soil volumetric moisture content r = 0.55; sample size = 251). In the study area, the observed horizontal dust flux was 4198.0 kg/m during the whole period of observation, while the horizontal dust flux calculated using the threshold velocity from the regression equation was 4675.6 kg/m. The correlation coefficient between the calculated result and the observations was 0.91. These results indicate that atmospheric and soil conditions should not be neglected in parameterization schemes for threshold velocity.

  17. Calculating the dim light melatonin onset: the impact of threshold and sampling rate.

    PubMed

    Molina, Thomas A; Burgess, Helen J

    2011-10-01

    The dim light melatonin onset (DLMO) is the most reliable circadian phase marker in humans, but the cost of assaying samples is relatively high. Therefore, the authors examined differences between DLMOs calculated from hourly versus half-hourly sampling and differences between DLMOs calculated with two recommended thresholds (a fixed threshold of 3 pg/mL and a variable "3k" threshold equal to the mean plus two standard deviations of the first three low daytime points). The authors calculated these DLMOs from salivary dim light melatonin profiles collected from 122 individuals (64 women) at baseline. DLMOs derived from hourly sampling occurred on average only 6-8 min earlier than the DLMOs derived from half-hourly saliva sampling, and they were highly correlated with each other (r ≥ 0.89, p < .001). However, in up to 19% of cases the DLMO derived from hourly sampling was >30 min from the DLMO derived from half-hourly sampling. The 3 pg/mL threshold produced significantly less variable DLMOs than the 3k threshold. However, the 3k threshold was significantly lower than the 3 pg/mL threshold (p < .001). The DLMOs calculated with the 3k method were significantly earlier (by 22-24 min) than the DLMOs calculated with the 3 pg/mL threshold, regardless of sampling rate. These results suggest that in large research studies and clinical settings, the more affordable and practical option of hourly sampling is adequate for a reasonable estimate of circadian phase. Although the 3 pg/mL fixed threshold is less variable than the 3k threshold, it produces estimates of the DLMO that are further from the initial rise of melatonin.
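    The two thresholds compared above can be computed directly; a small sketch follows (helper names and the sample melatonin profile are illustrative, not taken from the study):

```python
def dlmo_thresholds(levels):
    """Return the two thresholds from the abstract: the fixed 3 pg/mL
    value and the '3k' threshold (mean + 2 SD of the first three low
    daytime samples)."""
    first3 = levels[:3]
    mean = sum(first3) / 3
    sd = (sum((x - mean) ** 2 for x in first3) / 2) ** 0.5  # sample SD (n-1)
    return 3.0, mean + 2 * sd

def dlmo_time(times, levels, threshold):
    """Clock time at which melatonin first rises through the threshold,
    by linear interpolation between successive samples."""
    for i in range(1, len(levels)):
        y0, y1 = levels[i - 1], levels[i]
        if y0 < threshold <= y1:
            t0, t1 = times[i - 1], times[i]
            return t0 + (threshold - y0) / (y1 - y0) * (t1 - t0)
    return None

levels = [0.5, 0.7, 0.6, 1.0, 2.0, 6.0, 12.0]          # pg/mL, half-hourly
times = [20.0, 20.5, 21.0, 21.5, 22.0, 22.5, 23.0]     # clock hours
fixed, three_k = dlmo_thresholds(levels)
print(round(three_k, 2), dlmo_time(times, levels, fixed))  # 0.8 22.125
```

    Because the 3k threshold sits lower on the rising melatonin curve than the fixed 3 pg/mL value (0.8 vs. 3.0 here), it is crossed earlier, consistent with the 22-24 min earlier DLMOs the study reports.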

  18. Cool, warm, and heat-pain detection thresholds: testing methods and inferences about anatomic distribution of receptors.

    PubMed

    Dyck, P J; Zimmerman, I; Gillen, D A; Johnson, D; Karnes, J L; O'Brien, P C

    1993-08-01

    We recently found that vibratory detection threshold is greatly influenced by the algorithm of testing. Here, we study the influence of stimulus characteristics and algorithm of testing and estimating threshold on cool (CDT), warm (WDT), and heat-pain (HPDT) detection thresholds. We show that continuously decreasing (for CDT) or increasing (for WDT) thermode temperature to the point at which cooling or warming is perceived and signaled by depressing a response key ("appearance" threshold) overestimates threshold with rapid rates of thermal change. The mean of the appearance and disappearance thresholds also does not perform well for insensitive sites and patients. Pyramidal (or flat-topped pyramidal) stimuli ranging in magnitude, in 25 steps, from near skin temperature to 9 degrees C for 10 seconds (for CDT), from near skin temperature to 45 degrees C for 10 seconds (for WDT), and from near skin temperature to 49 degrees C for 10 seconds (for HPDT) provide ideal stimuli for use in several algorithms of testing and estimating threshold. Near threshold, only the initial direction of thermal change from skin temperature is perceived, and not its return to baseline. Use of steps of stimulus intensity allows the subject or patient to take the needed time to decide whether the stimulus was felt or not (in 4, 2, and 1 stepping algorithms), or whether it occurred in stimulus interval 1 or 2 (in two-alternative forced-choice testing). Thermal thresholds were generally significantly lower with a large (10 cm2) than with a small (2.7 cm2) thermode.(ABSTRACT TRUNCATED AT 250 WORDS)

  19. Correlations among within-channel and between-channel auditory gap-detection thresholds in normal listeners.

    PubMed

    Phillips, Dennis P; Smith, Jennifer C

    2004-01-01

    We obtained data on within-channel and between-channel auditory temporal gap-detection acuity in the normal population. Ninety-five normal listeners were tested for gap-detection thresholds, for conditions in which the gap was bounded by spectrally identical, and by spectrally different, acoustic markers. Separate thresholds were obtained with the use of an adaptive tracking method, for gaps delimited by narrowband noise bursts centred on 1.0 kHz, noise bursts centred on 4.0 kHz, and for gaps bounded by a leading marker of 4.0 kHz noise and a trailing marker of 1.0 kHz noise. Gap thresholds were lowest for silent periods bounded by identical markers--'within-channel' stimuli. Gap thresholds were significantly longer for the between-channel stimulus--silent periods bounded by unidentical markers (p < 0.0001). Thresholds for the two within-channel tasks were highly correlated (R = 0.76). Thresholds for the between-channel stimulus were weakly correlated with thresholds for the within-channel stimuli (1.0 kHz, R = 0.39; and 4.0 kHz, R = 0.46). The relatively poor predictability of between-channel thresholds from the within-channel thresholds is new evidence on the separability of the mechanisms that mediate performance of the two tasks. The data confirm that the acuity difference for the tasks, which has previously been demonstrated in only small numbers of highly trained listeners, extends to a population of untrained listeners. The acuity of the between-channel mechanism may be relevant to the formation of voice-onset time-category boundaries in speech perception.

  20. Thermoreception and nociception of the skin: a classic paper of Bessou and Perl and analyses of thermal sensitivity during a student laboratory exercise.

    PubMed

    Kuhtz-Buschbeck, Johann P; Andresen, Wiebke; Göbel, Stephan; Gilster, René; Stick, Carsten

    2010-06-01

    About four decades ago, Perl and collaborators were the first ones who unambiguously identified specifically nociceptive neurons in the periphery. In their classic work, they recorded action potentials from single C-fibers of a cutaneous nerve in cats while applying carefully graded stimuli to the skin (Bessou P, Perl ER. Response of cutaneous sensory units with unmyelinated fibers to noxious stimuli. J Neurophysiol 32: 1025-1043, 1969). They discovered polymodal nociceptors, which responded to mechanical, thermal, and chemical stimuli in the noxious range, and differentiated them from low-threshold thermoreceptors. Their classic findings form the basis of the present method that undergraduate medical students experience during laboratory exercises of sensory physiology, namely, quantitative testing of the thermal detection and pain thresholds. This diagnostic method examines the function of thin afferent nerve fibers. We collected data from nearly 300 students that showed that 1) women are more sensitive to thermal detection and thermal pain at the thenar than men, 2) habituation shifts thermal pain thresholds during repetitive testing, 3) the cold pain threshold is rather variable and lower when tested after heat pain than in the reverse case (order effect), and 4) ratings of pain intensity on a visual analog scale are correlated with the threshold temperature for heat pain but not for cold pain. Median group results could be reproduced in a retest. Quantitative sensory testing of thermal thresholds is feasible and instructive in the setting of a laboratory exercise and is appreciated by the students as a relevant and interesting technique.

  1. Age effects on pain thresholds, temporal summation and spatial summation of heat and pressure pain.

    PubMed

    Lautenbacher, Stefan; Kunz, Miriam; Strate, Peter; Nielsen, Jesper; Arendt-Nielsen, Lars

    2005-06-01

    Experimental data on age-related changes in pain perception have so far been contradictory. It has appeared that the type of pain induction method is critical in this context, with sensitivity to heat pain being decreased whereas sensitivity to pressure pain may be even enhanced in the elderly. Furthermore, it has been shown that temporal summation of heat pain is more pronounced in the elderly but it has remained unclear whether age differences in temporal summation are also evident when using other pain induction methods. No studies on age-related changes in spatial summation of pain have so far been conducted. The aim of the present study was to provide a comprehensive survey on age-related changes in pain perception, i.e. in somatosensory thresholds (warmth, cold, vibration), pain thresholds (heat, pressure) and spatial and temporal summation of heat and pressure pain. We investigated 20 young (mean age 27.1 years) and 20 elderly (mean age 71.6 years) subjects. Our results confirmed and extended previous findings by showing that somatosensory thresholds for non-noxious stimuli increase with age whereas pressure pain thresholds decrease and heat pain thresholds show no age-related changes. Apart from an enhanced temporal summation of heat pain, pain summation was not found to be critically affected by age. The results of the present study provide evidence for stimulus-specific changes in pain perception in the elderly, with deep tissue (muscle) nociception being affected differently by age than superficial tissue (skin) nociception. Summation mechanisms contribute only moderately to age changes in pain perception.

  2. An adaptive design for updating the threshold value of a continuous biomarker.

    PubMed

    Spencer, Amy V; Harbron, Chris; Mander, Adrian; Wason, James; Peers, Ian

    2016-11-30

    Potential predictive biomarkers are often measured on a continuous scale, but in practice, a threshold value to divide the patient population into biomarker 'positive' and 'negative' is desirable. Early phase clinical trials are increasingly using biomarkers for patient selection, but at this stage, it is likely that little will be known about the relationship between the biomarker and the treatment outcome. We describe a single-arm trial design with adaptive enrichment, which can increase power to demonstrate efficacy within a patient subpopulation, the parameters of which are also estimated. Our design enables us to learn about the biomarker and optimally adjust the threshold during the study, using a combination of generalised linear modelling and Bayesian prediction. At the final analysis, a binomial exact test is carried out, allowing the hypothesis that 'no population subset exists in which the novel treatment has a desirable response rate' to be tested. Through extensive simulations, we are able to show increased power over fixed threshold methods in many situations without increasing the type-I error rate. We also show that estimates of the threshold, which defines the population subset, are unbiased and often more precise than those from fixed threshold studies. We provide an example of the method applied (retrospectively) to publicly available data from a study of the use of tamoxifen after mastectomy by the German Breast Study Group, where progesterone receptor is the biomarker of interest. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  3. Optimal Design for the Precise Estimation of an Interaction Threshold: The Impact of Exposure to a Mixture of 18 Polyhalogenated Aromatic Hydrocarbons

    PubMed Central

    Yeatts, Sharon D.; Gennings, Chris; Crofton, Kevin M.

    2014-01-01

    Traditional additivity models provide little flexibility in modeling the dose–response relationships of the single agents in a mixture. While the flexible single chemical required (FSCR) methods allow greater flexibility, its implicit nature is an obstacle in the formation of the parameter covariance matrix, which forms the basis for many statistical optimality design criteria. The goal of this effort is to develop a method for constructing the parameter covariance matrix for the FSCR models, so that (local) alphabetic optimality criteria can be applied. Data from Crofton et al. are provided as motivation; in an experiment designed to determine the effect of 18 polyhalogenated aromatic hydrocarbons on serum total thyroxine (T4), the interaction among the chemicals was statistically significant. Gennings et al. fit the FSCR interaction threshold model to the data. The resulting estimate of the interaction threshold was positive and within the observed dose region, providing evidence of a dose-dependent interaction. However, the corresponding likelihood-ratio-based confidence interval was wide and included zero. In order to more precisely estimate the location of the interaction threshold, supplemental data are required. Using the available data as the first stage, the Ds-optimal second-stage design criterion was applied to minimize the variance of the hypothesized interaction threshold. Practical concerns associated with the resulting design are discussed and addressed using the penalized optimality criterion. Results demonstrate that the penalized Ds-optimal second-stage design can be used to more precisely define the interaction threshold while maintaining the characteristics deemed important in practice. PMID:22640366

  4. Effects of isoconcentration surface threshold values on the characteristics of needle-shaped precipitates in atom probe tomography data from an aged Al-Mg-Si alloy.

    PubMed

    Aruga, Yasuhiro; Kozuka, Masaya

    2016-04-01

    Needle-shaped precipitates in an aged Al-0.62Mg-0.93Si (mass%) alloy were identified using a compositional threshold method, an isoconcentration surface, in atom probe tomography (APT). The influence of thresholds on the morphological and compositional characteristics of the precipitates was investigated. Utilizing optimum parameters for the concentration space, a reliable number density of the precipitates is obtained without dependence on the elemental concentration threshold in comparison with evaluation by transmission electron microscopy (TEM). It is suggested that careful selection of the concentration space in APT can lead to a reasonable average Mg/Si ratio for the precipitates. It was found that the maximum length and maximum diameter of the precipitates are affected by the elemental concentration threshold. Adjustment of the concentration threshold gives better agreement with the precipitate dimensions measured by TEM. © The Author 2015. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Effect of Age and Severity of Facial Palsy on Taste Thresholds in Bell's Palsy Patients

    PubMed Central

    Park, Jung Min; Kim, Myung Gu; Jung, Junyang; Kim, Sung Su; Jung, A Ra; Kim, Sang Hoon

    2017-01-01

    Background and Objectives To investigate whether taste thresholds, as determined by electrogustometry (EGM) and chemical taste tests, differ by age and the severity of facial palsy in patients with Bell's palsy. Subjects and Methods This study included 29 patients diagnosed with Bell's palsy between January 2014 and May 2015 in our hospital. Patients were grouped by age and by severity of facial palsy, as determined by the House-Brackmann Scale, and their taste thresholds were assessed by EGM and chemical taste tests. Results EGM showed that taste thresholds at four locations on the tongue and one location on the central soft palate, 1 cm from the palatine uvula, were significantly higher in Bell's palsy patients than in controls (p<0.05). In contrast, chemical taste tests showed no significant differences in taste thresholds between the two groups (p>0.05). The severity of facial palsy did not affect taste thresholds, as determined by both EGM and chemical taste tests (p>0.05). The overall mean electrical taste thresholds on EGM were higher in younger Bell's palsy patients than in healthy subjects, with the difference at the back-right area of the tongue reaching significance (p<0.05). In older individuals, however, no significant differences in taste thresholds were observed between Bell's palsy patients and healthy subjects (p>0.05). Conclusions Electrical taste thresholds were higher in Bell's palsy patients than in controls. These differences were observed in younger, but not in older, individuals. PMID:28417103

  6. A simplified focusing and astigmatism correction method for a scanning electron microscope

    NASA Astrophysics Data System (ADS)

    Lu, Yihua; Zhang, Xianmin; Li, Hai

    2018-01-01

    Defocus and astigmatism can lead to blurred images and poor resolution. This paper presents a simplified method for focusing and astigmatism correction of a scanning electron microscope (SEM). The method consists of two steps. In the first step, the fast Fourier transform (FFT) of the SEM image is performed and the FFT is subsequently processed with a threshold to achieve a suitable result. In the second step, the threshold FFT is used for ellipse fitting to determine the presence of defocus and astigmatism. The proposed method clearly provides the relationships between the defocus, the astigmatism and the direction of stretching of the FFT, and it can determine the astigmatism in a single image. Experimental studies are conducted to demonstrate the validity of the proposed method.
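    The two-step idea (threshold the FFT, then fit an ellipse) can be sketched with a moment-based fit. This is an illustrative reconstruction under assumed details (threshold fraction, moment-based fitting), not the authors' implementation:

```python
import numpy as np

def fft_axis_ratio(image, frac=0.5):
    """Step 1: threshold the centered FFT magnitude of the image.
    Step 2: fit an ellipse to the surviving points via second moments.
    Returns the minor/major axis ratio: ~1.0 means an isotropic spectrum
    (no astigmatism); << 1.0 means a stretched, astigmatic spectrum."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = mag.shape[0] // 2, mag.shape[1] // 2
    mag[cy, cx] = 0.0                                   # suppress the DC term
    ys, xs = np.nonzero(mag > frac * mag.max())         # thresholded FFT
    pts = np.vstack([xs - cx, ys - cy]).astype(float)
    evals = np.linalg.eigvalsh(np.cov(pts))             # ascending eigenvalues
    evals = np.maximum(evals, 0.0)
    return float(np.sqrt(evals[0] / evals[1])) if evals[1] > 0 else 1.0

n = 64
y, x = np.mgrid[0:n, 0:n]
iso = np.cos(2 * np.pi * 8 * x / n) + np.cos(2 * np.pi * 8 * y / n)
stretched = np.cos(2 * np.pi * 8 * x / n)   # spectrum elongated along one axis
print(fft_axis_ratio(iso), fft_axis_ratio(stretched))  # 1.0 0.0
```

    The ellipse's orientation (from the eigenvectors, omitted here) would indicate the astigmatism direction, and the overall spectral extent the degree of defocus.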

  7. Vehicle Speed and Length Estimation Using Data from Two Anisotropic Magneto-Resistive (AMR) Sensors

    PubMed Central

    Markevicius, Vytautas; Navikas, Dangirutis; Valinevicius, Algimantas; Zilys, Mindaugas

    2017-01-01

    Methods for estimating a car’s length are presented in this paper, as well as the results achieved by using a self-designed system equipped with two anisotropic magneto-resistive (AMR) sensors, which were placed on a road lane. The purpose of the research was to compare the lengths of mid-size cars, i.e., family cars (hatchbacks), saloons (sedans), station wagons and SUVs. Four methods were used in the research: a simple threshold based method, a threshold method based on moving average and standard deviation, a two-extreme-peak detection method and a method based on the amplitude and time normalization using linear extrapolation (or interpolation). The results were achieved by analyzing changes in the magnitude and in the absolute z-component of the magnetic field as well. The tests, which were performed in four different Earth directions, show differences in the values of estimated lengths. When cars drove from south to north, the magnitude-based results were up to 1.2 m higher than the results achieved using the threshold methods. Smaller differences in lengths were observed when the distances were measured between two extreme peaks in the car magnetic signatures. The results were summarized in tables and the errors of estimated lengths were presented. The maximal errors, related to real lengths, were up to 22%. PMID:28771171
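    The simplest of the four methods, a threshold on deviation from a quiet-road baseline, can be sketched as follows (the deviation multiplier and the sample data are illustrative assumptions):

```python
import statistics

def detect_vehicle(signal, baseline, k=3.0):
    """Flag samples where the magnetic-field magnitude deviates from the
    quiet-road baseline by more than k standard deviations (a sketch of
    the mean/standard-deviation threshold method; k is illustrative)."""
    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline)
    return [abs(s - mean) > k * sd for s in signal]

baseline = [50.0, 50.2, 49.8, 50.1, 49.9]        # quiet road (uT)
signal = [50.0, 50.1, 56.0, 57.5, 55.0, 50.2]    # a car passes the sensor
print(detect_vehicle(signal, baseline))  # [False, False, True, True, True, False]
```

    The car's length then follows from the flagged occupancy time multiplied by the speed, which the two-sensor setup estimates from the time difference between the sensors' detections.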

  8. Measuring hearing in the harbor seal (Phoca vitulina): Comparison of behavioral and auditory brainstem response techniques

    NASA Astrophysics Data System (ADS)

    Wolski, Lawrence F.; Anderson, Rindy C.; Bowles, Ann E.; Yochem, Pamela K.

    2003-01-01

    Auditory brainstem response (ABR) and standard behavioral methods were compared by measuring in-air audiograms for an adult female harbor seal (Phoca vitulina). Behavioral audiograms were obtained using two techniques: the method of constant stimuli and the staircase method. Sensitivity was tested from 0.250 to 30 kHz. The seal showed good sensitivity from 6 to 12 kHz [best sensitivity 8.1 dB RMS (re 20 μPa) at 8 kHz]. The staircase method yielded thresholds that were lower by 10 dB on average than those from the method of constant stimuli. ABRs were recorded at 2, 4, 8, 16, and 22 kHz and showed a similar best range (8-16 kHz). ABR thresholds averaged 5.7 dB higher than behavioral thresholds at 2, 4, and 8 kHz. ABR thresholds were at least 7 dB lower at 16 kHz, and approximately 3 dB higher at 22 kHz. The better sensitivity of ABRs at higher frequencies could have reflected differences in the seal's behavior during ABR testing and/or bandwidth characteristics of the test stimuli. These results agree with comparisons of ABR and behavioral methods performed in other recent studies and indicate that ABR methods represent a good alternative for estimating hearing range and sensitivity in pinnipeds, particularly when time is a critical factor and animals are untrained.

  9. Electrical leakage detection circuit

    DOEpatents

    Wild, Arthur

    2006-09-05

    A method is provided for detecting electrical leakage between a power supply and a frame of a vehicle or machine. The disclosed method includes coupling a first capacitor between the frame and a first terminal of the power supply for a predetermined period of time. The current flowing between the frame and the first capacitor is limited to a predetermined current limit. It is then determined whether the voltage across the first capacitor exceeds a threshold voltage. A first output signal is provided when the voltage across the capacitor exceeds the threshold voltage.
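
    The detection principle reduces to RC charging mathematics; the component values below are hypothetical, not taken from the patent:

```python
import math

def cap_voltage(v_supply, r_leak, c, t):
    """RC charging through a leakage path: v(t) = Vs * (1 - exp(-t / (R*C)))."""
    return v_supply * (1.0 - math.exp(-t / (r_leak * c)))

def leakage_detected(v_supply, r_leak, c, t_test, v_threshold):
    """Leakage flagged if the sense capacitor crosses the threshold in time."""
    return cap_voltage(v_supply, r_leak, c, t_test) > v_threshold
```

    A low leakage resistance gives a short time constant, so the capacitor charges past the threshold within the test window; good insulation leaves it nearly uncharged.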

  10. Methods of scaling threshold color difference using printed samples

    NASA Astrophysics Data System (ADS)

    Huang, Min; Cui, Guihua; Liu, Haoxue; Luo, M. Ronnier

    2012-01-01

    A series of printed samples on a semi-gloss paper substrate, spanning the magnitude of threshold color difference, was prepared for scaling the visual color difference and for evaluating the performance of different methods. Perceptibility probabilities were normalized to Z-scores, and the different color differences were scaled against these Z-scores. The resulting visual color-difference scale was checked with the STRESS factor. The results indicated that only the scales changed; the relative scales between pairs in the data were preserved.

  11. A Purkinje shift in the spectral sensitivity of grey squirrels

    PubMed Central

    Silver, Priscilla H.

    1966-01-01

    1. The light-adapted spectral sensitivity of the grey squirrel has been determined by an automated training method at a level about 6 log units above the squirrel's absolute threshold. 2. The maximum sensitivity is near 555 nm, under light-adapted conditions, compared with the dark-adapted maximum near 500 nm found by a similar method. 3. Neither the light-adapted nor the dark-adapted behavioural threshold agrees with electrophysiological findings using single flash techniques, but there is agreement with e.r.g. results obtained with sinusoidal stimuli. PMID:5972118

  12. Rate-Compatible Protograph LDPC Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.

  13. The Impact of Clinical History on the Threshold Estimation of Auditory Brainstem Response Results for Infants

    ERIC Educational Resources Information Center

    Zaitoun, Maha; Cumming, Steven; Purcell, Alison; O'Brien, Katie

    2017-01-01

    Purpose: This study assesses the impact of patient clinical history on audiologists' performance when interpreting auditory brainstem response (ABR) results. Method: Fourteen audiologists' accuracy in estimating hearing threshold for 16 infants through interpretation of ABR traces was compared on 2 occasions at least 5 months apart. On the 1st…

  14. Methods for Assessing Item, Step, and Threshold Invariance in Polytomous Items Following the Partial Credit Model

    ERIC Educational Resources Information Center

    Penfield, Randall D.; Myers, Nicholas D.; Wolfe, Edward W.

    2008-01-01

    Measurement invariance in the partial credit model (PCM) can be conceptualized in several different but compatible ways. In this article the authors distinguish between three forms of measurement invariance in the PCM: step invariance, item invariance, and threshold invariance. Approaches for modeling these three forms of invariance are proposed,…

  15. A Multinomial Model for Identifying Significant Pure-Tone Threshold Shifts

    ERIC Educational Resources Information Center

    Schlauch, Robert S.; Carney, Edward

    2007-01-01

    Purpose: Significant threshold differences on retest for pure-tone audiometry are often evaluated by application of ad hoc rules, such as a shift in a pure-tone average or in 2 adjacent frequencies that exceeds a predefined amount. Rules that are so derived do not consider the probability of observing a particular audiogram. Methods: A general…

  16. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    ERIC Educational Resources Information Center

    Townsend, Lori; Hofer, Amy R.; Hanick, Silvia Lin; Brunetti, Korey

    2016-01-01

    This study used the Delphi method to engage expert practitioners on the topic of threshold concepts--core ideas and processes in a discipline that students need to grasp in order to progress in their learning, but that are often unspoken or unrecognized by expert practitioners--for information literacy. A panel of experts considered two questions:…

  17. Fuel cell flooding detection and correction

    DOEpatents

    DiPierno Bosco, Andrew; Fronk, Matthew Howard

    2000-08-15

    Method and apparatus for monitoring H₂-O₂ PEM fuel cells to detect and correct flooding. The pressure drop across a given H₂ or O₂ flow field is monitored and compared to predetermined thresholds of unacceptability. If the pressure drop exceeds a threshold of unacceptability, corrective measures are automatically initiated.

  18. Extraction of hadron interactions above inelastic threshold in lattice QCD.

    PubMed

    Aoki, Sinya; Ishii, Noriyoshi; Doi, Takumi; Hatsuda, Tetsuo; Ikeda, Yoichi; Inoue, Takashi; Murano, Keiko; Nemura, Hidekatsu; Sasaki, Kenji

    2011-01-01

    We propose a new method to extract hadron interactions above inelastic threshold from the Nambu-Bethe-Salpeter amplitude in lattice QCD. We consider the scattering such as A + B → C + D, where A, B, C, D are names of different 1-particle states. An extension to cases where particle productions occur during scatterings is also discussed.

  19. Optimal Clustering in Graphs with Weighted Edges: A Unified Approach to the Threshold Problem.

    ERIC Educational Resources Information Center

    Goetschel, Roy; Voxman, William

    1987-01-01

    Relations on a finite set V are viewed as weighted graphs. Using the language of graph theory, two methods of partitioning V are examined: selecting threshold values and applying them to a maximal weighted spanning forest, and using a parametric linear program to obtain a most adhesive partition. (Author/EM)
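
    The first partitioning method (thresholding a maximal weighted spanning forest) can be sketched as follows; the union-find machinery is our implementation choice:

```python
def threshold_partition(n, edges, theta):
    """edges: list of (weight, u, v) on vertices 0..n-1.
    Build a maximal-weight spanning forest (Kruskal on descending weight),
    drop forest edges below theta, and return the connected components."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    forest = []
    for w, u, v in sorted(edges, reverse=True):   # heaviest edges first
        if find(u) != find(v):
            parent[find(u)] = find(v)
            forest.append((w, u, v))

    parent = list(range(n))                       # reset; keep edges >= theta
    for w, u, v in forest:
        if w >= theta:
            parent[find(u)] = find(v)

    comps = {}
    for i in range(n):
        comps.setdefault(find(i), []).append(i)
    return sorted(comps.values())
```

    Raising the threshold splits the forest into more, tighter clusters; lowering it merges them, which is exactly the family of partitions the threshold-selection view describes.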

  20. Diagnosis of ADHD in Adults: What Is the Appropriate "DSM-5" Symptom Threshold for Hyperactivity-Impulsivity?

    ERIC Educational Resources Information Center

    Solanto, Mary V.; Wasserstein, Jeanette; Marks, David J.; Mitchell, Katherine J.

    2012-01-01

    Objective: To empirically identify the appropriate symptom threshold for hyperactivity-impulsivity for diagnosis of ADHD in adults. Method: Participants were 88 adults (M [SD] age = 41.69 [11.78] years, 66% female, 16% minority) meeting formal "DSM-IV" criteria for ADHD combined or predominantly inattentive subtypes based on a structured…

  1. Diagnostic performance of different measurement methods for lung nodule enhancement at quantitative contrast-enhanced computed tomography

    NASA Astrophysics Data System (ADS)

    Wormanns, Dag; Klotz, Ernst; Dregger, Uwe; Beyer, Florian; Heindel, Walter

    2004-05-01

    Lack of angiogenesis virtually excludes malignancy of a pulmonary nodule; assessment with quantitative contrast-enhanced CT (QECT) requires a reliable enhancement measurement technique. The diagnostic performance of different measurement methods in the distinction between malignant and benign nodules was evaluated. QECT (an unenhanced scan and 4 post-contrast scans) was performed in 48 pulmonary nodules (12 malignant, 12 benign, 24 indeterminate). Nodule enhancement was the difference between the highest nodule density at any post-contrast scan and the unenhanced scan. Enhancement was determined with: A) the standard 2D method; B) a 3D method consisting of segmentation, removal of peripheral structures and density averaging. Enhancement curves were evaluated for their plausibility using a predefined set of criteria. Sensitivity and specificity were 100% and 33% for the 2D method, and 92% and 55% for the 3D method, using a threshold of 20 HU. One malignant nodule did not show significant enhancement with method B due to adjacent atelectasis, which disappeared within the few minutes of the QECT examination. Better discrimination between benign and malignant lesions was achieved with a slightly higher threshold than proposed in the literature. Application of plausibility criteria to the enhancement curves revealed fewer plausibility faults with the 3D method. The new 3D method for analysis of QECT scans yielded fewer artefacts and better specificity in the discrimination between benign and malignant pulmonary nodules when using an appropriate enhancement threshold. Nevertheless, QECT results must be interpreted with care.
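
    The enhancement criterion can be stated in a few lines (the 20 HU cutoff appears in the abstract; the density values in the test are hypothetical):

```python
def nodule_enhancement(unenhanced_hu, post_contrast_hu):
    """Enhancement = highest post-contrast density minus unenhanced density."""
    return max(post_contrast_hu) - unenhanced_hu

def suspicious(unenhanced_hu, post_contrast_hu, threshold_hu=20.0):
    """Enhancement at or above the threshold: cannot be dismissed as benign."""
    return nodule_enhancement(unenhanced_hu, post_contrast_hu) >= threshold_hu
```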

  2. Cavitation and non-cavitation regime for large-scale ultrasonic standing wave particle separation systems--In situ gentle cavitation threshold determination and free radical related oxidation.

    PubMed

    Johansson, Linda; Singh, Tanoj; Leong, Thomas; Mawson, Raymond; McArthur, Sally; Manasseh, Richard; Juliano, Pablo

    2016-01-01

    We suggest a novel and straightforward approach, based on the sonochemiluminescent chemical luminol, to guide the design of liter-scale ultrasonic standing-wave particle manipulation systems in terms of frequency and acoustic power for operation in either cavitation or non-cavitation regimes. We show that this method offers a simple way of determining the cavitation threshold in situ for a selected separation vessel geometry. Since the pressure field is system-specific, the cavitation threshold is also system-specific (within the parameter range examined). In this study we discuss cavitation effects and also measure one implication of cavitation for the application of milk fat separation: the degree of milk fat lipid oxidation, assessed by headspace volatile measurements. For the evaluated vessel, 2 MHz operation, as opposed to 1 MHz, enabled operation in non-cavitation or low-cavitation conditions as measured by the luminol intensity threshold method. In all cases the lipid-oxidation-derived volatiles were below the human sensory detection level. Ultrasound treatment did not significantly influence the oxidative changes in milk for either 1 MHz (doses of 46 kJ/L and 464 kJ/L) or 2 MHz (doses of 37 kJ/L and 373 kJ/L) operation.

  3. A universal approach to determine footfall timings from kinematics of a single foot marker in hoofed animals

    PubMed Central

    Clayton, Hilary M.

    2015-01-01

    The study of animal movement commonly requires the segmentation of continuous data streams into individual strides. The use of forceplates and foot-mounted accelerometers readily allows the detection of the foot-on and foot-off events that define a stride. However, when relying on optical methods such as motion capture, there is a lack of validated, robust, universally applicable stride event detection methods. To date, no method has been validated for movement on a circle, while algorithms are commonly specific to front/hind limbs or gait. In this study, we aimed to develop and validate kinematic stride segmentation methods applicable to movement on a straight line and a circle at walk and trot, which rely exclusively on a single, dorsal hoof marker. The advantage of such marker placement is its robustness to marker loss and occlusion. Eight horses walked and trotted on a straight line and in a circle over an array of multiple forceplates. Kinetic events were detected based on the vertical force profile and used as the reference values. Kinematic events were detected based on displacement, velocity or acceleration signals of the dorsal hoof marker, depending on the algorithm, using (i) defined thresholds associated with the derived movement signals and (ii) specific events in the derived movement signals. Method comparison was performed by calculating limits of agreement, accuracy, between-horse precision and within-horse precision based on differences between kinetic and kinematic events. In addition, we examined the effect of force thresholds ranging from 50 to 150 N on the timings of kinetic events. The two approaches resulted in very good and comparable performance: of the 3,074 processed footfall events, 95% of individual foot-on and foot-off events differed by no more than 26 ms from the kinetic event, with average accuracy between −11 and 10 ms and average within- and between-horse precision ≤8 ms. While the event-based method may be less likely to suffer from scaling effects, on soft ground the threshold-based method may prove more valuable. While we found that use of velocity thresholds for foot-on detection results in biased event estimates for the foot on the inside of the circle at trot, adjusting thresholds for this condition negated the effect. For the final four algorithms, we found no noteworthy bias between conditions or between front- and hind-foot timings. Different force thresholds in the range of 50 to 150 N had the greatest systematic effect on foot-off estimates in the hind limbs (up to 16 ms on average per condition), greater than the effect on foot-on estimates or foot-off estimates in the forelimbs (up to ±7 ms on average per condition). PMID:26157641
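
    A minimal sketch of the threshold-based idea (the sampling rate, synthetic signal, and 0.3 m/s cutoff are illustrative; the validated algorithms differ in detail): foot-on is taken where hoof-marker speed falls below a threshold, foot-off where it rises back above it:

```python
import numpy as np

def stance_events(pos, fs, v_thresh=0.3):
    """pos: 1-D hoof-marker position (m), fs: sampling rate (Hz).
    Returns (foot_on_times, foot_off_times) in seconds."""
    speed = np.abs(np.gradient(pos) * fs)   # finite-difference speed, m/s
    slow = speed < v_thresh                 # stance = slow phase
    d = np.diff(slow.astype(int))
    foot_on = (np.nonzero(d == 1)[0] + 1) / fs    # entering the slow phase
    foot_off = (np.nonzero(d == -1)[0] + 1) / fs  # leaving the slow phase
    return foot_on, foot_off
```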

  4. Observations Regarding Scatter Fraction and NEC Measurements for Small Animal PET

    NASA Astrophysics Data System (ADS)

    Yang, Yongfeng; Cherry, S. R.

    2006-02-01

    The goal of this study was to evaluate the magnitude and origin of scattered radiation in a small-animal PET scanner and to assess the impact of these findings on noise equivalent count rate (NECR) measurements, a metric often used to optimize scanner acquisition parameters and to compare one scanner with another. The scatter fraction (SF) was measured for line sources in air and line sources placed within a mouse-sized phantom (25 mm ⌀ × 70 mm) and a rat-sized phantom (60 mm ⌀ × 150 mm) on the microPET II small-animal PET scanner. Measurements were performed for lower energy thresholds ranging from 150-450 keV and a fixed upper energy threshold of 750 keV. Four different methods were compared for estimating the SF. Significant scatter fractions were measured with just the line source in the field of view, with the spatial distribution of these events consistent with scatter from the gantry and room environment. For mouse imaging, this component dominates over object scatter, and the measured SF is strongly method dependent. The environmental SF rapidly increases as the lower energy threshold decreases and can be more than 30% for an open energy window of 150-750 keV. The object SF originating from the mouse phantom is about 3-4% and does not change significantly as the lower energy threshold increases. The object SF for the rat phantom ranges from 10 to 35% for different energy windows and increases as the lower energy threshold decreases. Because the measured SF is highly dependent on the method, and there is as yet no agreed upon standard for animal PET, care must be exercised when comparing NECR for small objects between different scanners. Differences may be methodological rather than reflecting any relevant difference in the performance of the scanner. Furthermore, these results have implications for scatter correction methods when the majority of the detected scatter does not arise from the object itself.
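
    The quantities being compared follow the standard count-rate definitions (a sketch; the 2fR randoms term follows the common NEMA-style NEC formula and is our assumption, as the abstract does not spell it out):

```python
def scatter_fraction(scattered, trues):
    """SF = S / (S + T): fraction of coincidences that are scattered."""
    return scattered / (scattered + trues)

def nec_rate(trues, scattered, randoms, f=1.0):
    """Noise-equivalent count rate: T^2 / (T + S + 2*f*R),
    with f the fraction of randoms falling inside the object window."""
    return trues ** 2 / (trues + scattered + 2.0 * f * randoms)
```

    Because S here includes whatever environmental scatter the chosen estimation method attributes to the object, a method-dependent SF propagates directly into a method-dependent NECR, which is the abstract's caution.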

  5. Comparing the ISO-recommended and the cumulative data-reduction algorithms in S-on-1 laser damage test by a reverse approach method

    NASA Astrophysics Data System (ADS)

    Zorila, Alexandru; Stratan, Aurel; Nemes, George

    2018-01-01

    We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.
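
    The underlying data reduction can be sketched as follows (simplified relative to both the ISO and the cumulative algorithms): estimate the damage probability at each fluence as the damaged fraction of sites, then extrapolate a linear fit to zero probability:

```python
import numpy as np

def damage_threshold(fluences, damaged, total):
    """Linear fit of damage probability vs fluence; the zero-probability
    intercept estimates the damage threshold fluence."""
    p = np.asarray(damaged, float) / np.asarray(total, float)
    slope, intercept = np.polyfit(fluences, p, 1)
    return -intercept / slope
```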

  6. Whole body vibration training improves vibration perception threshold in healthy young adults: A randomized clinical trial pilot study

    PubMed Central

    Hernandez-Mocholi, M.A.; Dominguez-Muñoz, F.J.; Corzo, H.; Silva, S.C.S.; Adsuar, J.C.; Gusi, N.

    2016-01-01

    Objectives: Loss of foot sensitivity is a relevant parameter to assess and prevent in several diseases. It is crucial to determine the vibro-tactile sensitivity threshold response to acute conditions in order to explore innovative monitoring tools and interventions to prevent and treat this challenge. The aims were: 1) to analyze the acute effects of a single whole body vibration session (4 min, 18 Hz, 4 mm) on the vibro-tactile perception threshold in healthy young adults; 2) to analyze the 48-hour effects of 3 whole body vibration sessions on the vibro-tactile perception threshold in healthy young adults. Methods: A randomized controlled clinical trial over 3 sessions of whole body vibration intervention or 3 sessions of placebo intervention. Twenty-eight healthy young adults were included: 11 in the experimental group and 12 in the placebo group. The experimental group performed 3 sessions of WBV, while the placebo group performed 3 sessions of placebo intervention. Results: The vibro-tactile threshold increased immediately after a single WBV session in comparison with placebo. Nevertheless, 48 hours after 3 whole body vibration sessions, the threshold had decreased to values lower than the initial ones. Conclusions: The acute response of the vibro-tactile threshold to one whole body vibration session was an increase, but the 48-hour short-term response was a decrease in healthy young adults. PMID:26944818

  7. Impacts of selected stimulation patterns on the perception threshold in electrocutaneous stimulation

    PubMed Central

    2011-01-01

    Background Consistency is one of the most important concerns to convey stable artificially induced sensory feedback. However, the constancy of perceived sensations cannot be guaranteed, as the artificially evoked sensation is a function of the interaction of stimulation parameters. The hypothesis of this study is that the selected stimulation parameters in multi-electrode cutaneous stimulation have significant impacts on the perception threshold. Methods The investigated parameters included the stimulated location, the number of active electrodes, the number of pulses, and the interleaved time between a pair of electrodes. Biphasic, rectangular pulses were applied via five surface electrodes placed on the forearm of 12 healthy subjects. Results Our main findings were: 1) the perception thresholds at the five stimulated locations were significantly different (p < 0.0001), 2) dual-channel simultaneous stimulation lowered the perception thresholds and led to smaller variance in perception thresholds compared to single-channel stimulation, 3) the perception threshold was inversely related to the number of pulses, and 4) the perception threshold increased with increasing interleaved time when the interleaved time between two electrodes was below 500 μs. Conclusions To maintain a consistent perception threshold, our findings indicate that dual-channel simultaneous stimulation with at least five pulses should be used, and that the interleaved time between two electrodes should be longer than 500 μs. We believe that these findings have implications for design of reliable sensory feedback codes. PMID:21306616

  8. Intraoperative identification of the facial nerve by needle electromyography stimulation with a burr

    PubMed Central

    KHAMGUSHKEEVA, N.N.; ANIKIN, I.A.; KORNEYENKOV, A.A.

    2016-01-01

    The purpose of this research is to improve the safety of surgery for patients with pathology of the middle and inner ear by preventing damage to the facial nerve through intraoperative monitoring of the facial nerve by needle electromyography with continuous stimulation with a burr. Patients and Methods The clinical part of the prospective study was carried out on 48 patients diagnosed with suppurative otitis media. After the surgery with intraoperative monitoring, the facial nerve with an intact bone wall was stimulated electrically at the sites where damage is most likely. The minimum (threshold) stimulation current (mA) of the facial nerve eliciting an EMG event of 100 μV was registered. The anatomical part of the study was carried out on 30 unformalinized adult cadaver temporal bones. The statistical analysis of the obtained data was carried out with parametric methods (Student's t-test), non-parametric correlation (Spearman's method) and regression analysis. Results It was found that 1 mA of threshold amperage corresponded to a 0.8 mm thickness of the bone wall of the facial canal. Values of transosseous threshold stimulation in potentially dangerous sections for injury to the facial nerve were obtained. Conclusion These data lower the risk of paresis (paralysis) of the facial muscles during otologic surgery. PMID:27142821

  9. Barostat testing of rectal sensation and compliance in humans: comparison of results across two centres and overall reproducibility.

    PubMed

    Cremonini, F; Houghton, L A; Camilleri, M; Ferber, I; Fell, C; Cox, V; Castillo, E J; Alpers, D H; Dewit, O E; Gray, E; Lea, R; Zinsmeister, A R; Whorwell, P J

    2005-12-01

    We assessed the reproducibility of measurements of rectal compliance and sensation in health in studies conducted at two centres, and estimated the sample sizes necessary to show clinically meaningful changes in future studies. We performed rectal barostat tests three times (day 1, day 1 after 4 h, and 14-17 days later) in 34 healthy participants. We measured compliance and pressure thresholds for first sensation, urgency, discomfort and pain using the ascending method of limits, and symptom ratings for gas, urgency, discomfort and pain during four phasic distensions (12, 24, 36 and 48 mmHg) in random order. Results obtained at the two centres differed minimally. Reproducibility of sensory end points varies with the type of sensation, pressure level and method of distension. The pressure threshold for pain and sensory ratings for non-painful sensations at 36 and 48 mmHg distension were the most reproducible at the two centres. Sample size calculations suggested that a crossover design is preferable in therapeutic trials: for each dose of medication tested, a sample of 21 should be sufficient to demonstrate 30% changes in all sensory thresholds and almost all sensory ratings. We conclude that reproducibility varies with sensation type, pressure level and distension method, but in a two-centre study, differences in observed results of sensation are minimal, and the pressure threshold for pain and sensory ratings at 36-48 mmHg of distension are reproducible.

  10. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.

  11. Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.

    PubMed

    Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles

    2015-11-01

    Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS) automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared to determine the optimal scale in a scale-space framework. Then, the segmentation stage amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed. The local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploits backgrounds extracted from real microscopy images. We have carried out an extensive comparative evaluation on several data sets with ground-truth, which demonstrates that ATLAS outperforms existing methods. ATLAS does not need any fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.
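
    A simplified sketch of LoG-based spot detection (ATLAS adapts the threshold locally from a user-set PFA and autoselects the scale; the single global k-sigma threshold and fixed sigma here are our simplifications):

```python
import numpy as np

def log_kernel(sigma):
    """Laplacian-of-Gaussian kernel, made exactly zero-sum so that a flat
    background produces zero response."""
    size = int(6 * sigma) | 1
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    g = np.exp(-r2 / (2 * sigma ** 2))
    ker = (r2 / sigma ** 4 - 2 / sigma ** 2) * g
    return ker - ker.mean()

def detect_spots(img, sigma=2.0, k=3.0):
    """Threshold the (sign-flipped) LoG response at mean + k*std."""
    ker = log_kernel(sigma)
    kpad = np.zeros_like(img, dtype=float)
    kh, kw = ker.shape
    kpad[:kh, :kw] = ker
    kpad = np.roll(kpad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    resp = -np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kpad)))
    return resp > resp.mean() + k * resp.std()   # bright blobs -> positive resp
```

    Matching sigma to the dominant spot radius (the scale-selection step in ATLAS) maximizes the response of in-focus vesicles relative to background texture.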

  12. T wave alternans during exercise and atrial pacing in humans

    NASA Technical Reports Server (NTRS)

    Hohnloser, S. H.; Klingenheben, T.; Zabel, M.; Li, Y. G.; Albrecht, P.; Cohen, R. J.

    1997-01-01

    INTRODUCTION: Evidence is accumulating that microvolt T wave alternans (TWA) is a marker of increased risk for ventricular tachyarrhythmias. Initially, atrial pacing was used to elevate heart rate and elicit TWA. More recently, a noninvasive approach has been developed that elevates heart rate using exercise. METHODS AND RESULTS: In 30 consecutive patients with a history of ventricular tachyarrhythmias, the spectral method was used to detect TWA during both atrial pacing and submaximal exercise testing. The concordance rate for the presence or absence of TWA using the two measurement methods was 84%. There was a patient-specific heart rate threshold for the detection of TWA that averaged 100 +/- 14 beats/min during exercise compared with 97 +/- 9 beats/min during right atrial pacing (P = NS). Beyond this threshold, there was a significant and comparable increase in the level of TWA with decreasing pacing cycle length and increasing exercise heart rates. CONCLUSIONS: The present study is the first to demonstrate that microvolt TWA can be assessed reliably and noninvasively during exercise stress. There is a patient-specific heart rate threshold beyond which TWA continues to increase with increasing heart rates. Heart rate thresholds for the onset of TWA measured during atrial pacing and exercise stress were comparable, indicating that heart rate alone appears to be the main factor determining the onset of TWA during submaximal exercise stress.
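
    The spectral method can be sketched as follows (the noise band and window length are our choices): beat-to-beat T-wave amplitudes are Fourier transformed, alternans appears as power at exactly 0.5 cycles/beat, and the alternans ratio compares that bin with a nearby noise band:

```python
import numpy as np

def alternans_ratio(t_amp):
    """t_amp: T-wave amplitude per beat. Returns the alternans ratio:
    (power at 0.5 cycles/beat - noise mean) / noise std."""
    x = np.asarray(t_amp, float)
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    p_alt = spec[-1]                    # 0.5 cycles/beat (Nyquist bin)
    noise = spec[len(spec) // 2:-1]     # reference band just below 0.5
    return (p_alt - noise.mean()) / (noise.std() + 1e-12)
```

    A ratio well above the noise band (commonly a cutoff around 3 is quoted) indicates significant microvolt alternans; the heart-rate threshold in the abstract is the rate at which this ratio first becomes significant.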

  13. Validation of quantitative light-induced fluorescence-digital (QLF-D) for the detection of approximal caries in vitro.

    PubMed

    Ko, Hae-Youn; Kang, Si-Mook; Kim, Hee Eun; Kwon, Ho-Keun; Kim, Baek-Il

    2015-05-01

    Detection of approximal caries lesions can be difficult due to their anatomical position. This study aimed to assess the ability of quantitative light-induced fluorescence-digital (QLF-D) to detect approximal caries, and to compare its performance with those of the International Caries Detection and Assessment System II (ICDAS II) and digital radiography (DR). Extracted permanent teeth (n=100) were selected and mounted in pairs. The simulated pairs were assessed by one calibrated dentist using each detection method. After all the examinations, the teeth (n=95) were sectioned and examined histologically as the gold standard. The modalities were compared in terms of sensitivity, specificity, and area under the receiver operating characteristic curve (AUROC) at the enamel (D1) and dentine (D3) levels. Intra-examiner reliability was assessed for all modalities. At the D1 threshold, ICDAS II presented the highest sensitivity (0.80) while DR showed the highest specificity (0.89); however, the methods with the greatest AUROC values at the D1 threshold were DR and QLF-D (0.80 and 0.80, respectively). At the D3 threshold, the methods with the highest sensitivity were ICDAS II and QLF-D (0.64 and 0.64, respectively), while the method with the lowest sensitivity was DR (0.50). With regard to the AUROC values at the D3 threshold, QLF-D presented the highest value (0.76). All modalities showed excellent intra-examiner reliability. The newly developed QLF-D was not only able to detect approximal caries, but also performed comparably to visual inspection and radiography. QLF-D has the potential to be a useful detection method for approximal caries.

  14. DTFP-Growth: Dynamic Threshold-Based FP-Growth Rule Mining Algorithm Through Integrating Gene Expression, Methylation, and Protein-Protein Interaction Profiles.

    PubMed

    Mallik, Saurav; Bhadra, Tapas; Mukherji, Ayan

    2018-04-01

    Association rule mining is an important technique for identifying interesting relationships between gene pairs in a biological data set. Earlier methods generally work on a single biological data set and, in most cases, apply a single minimum support cutoff globally, i.e., across all genesets/itemsets. To overcome this limitation, in this paper we propose a dynamic threshold-based FP-growth rule mining algorithm that integrates gene expression, methylation, and protein-protein interaction profiles based on weighted shortest distance to find novel associations among different pairs of genes in multi-view data sets. For this purpose, we introduce three new thresholds, namely, Distance-based Variable/Dynamic Supports (DVS), Distance-based Variable Confidences (DVC), and Distance-based Variable Lifts (DVL) for each rule, by integrating the co-expression, co-methylation, and protein-protein interactions present in the multi-omics data set. We develop the proposed algorithm utilizing these three novel multiple-threshold measures. In the proposed algorithm, the values of DVS, DVC, and DVL are computed for each rule separately, and it is subsequently verified whether the support, confidence, and lift of each evolved rule are greater than or equal to the corresponding individual DVS, DVC, and DVL values, respectively. If all three conditions hold for a rule, the rule is treated as a resultant rule. One of the major advantages of the proposed method compared with other related state-of-the-art methods is that it considers both the quantitative and interactive significance among all pairwise genes belonging to each rule. Moreover, the proposed method generates fewer rules, takes less running time, and provides greater biological significance for the resultant top-ranking rules compared to previous methods.
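
    The per-rule filtering step described above can be sketched as follows. This is a minimal illustration of the dynamic-threshold idea only (the dictionary keys and the way DVS/DVC/DVL values are attached to each rule are assumptions, not the authors' data structures):

```python
# Hypothetical sketch of the per-rule filtering step in a dynamic
# threshold-based FP-growth pipeline: each rule carries its own
# cutoffs (DVS, DVC, DVL) derived from a distance-based weight, and
# is kept only if support, confidence, and lift all meet them.

def filter_rules(rules):
    """rules: list of dicts with keys
    support, confidence, lift, dvs, dvc, dvl."""
    kept = []
    for r in rules:
        if (r["support"] >= r["dvs"]
                and r["confidence"] >= r["dvc"]
                and r["lift"] >= r["dvl"]):
            kept.append(r)  # all three dynamic conditions hold
    return kept

rules = [
    {"support": 0.4, "confidence": 0.9, "lift": 1.5,
     "dvs": 0.3, "dvc": 0.8, "dvl": 1.2},   # passes all three
    {"support": 0.2, "confidence": 0.9, "lift": 1.5,
     "dvs": 0.3, "dvc": 0.8, "dvl": 1.2},   # fails its DVS cutoff
]
print(len(filter_rules(rules)))  # 1
```

The contrast with a classical global minimum-support cutoff is that each rule here is tested against its own thresholds.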

  15. A comparison of bivariate, multivariate random-effects, and Poisson correlated gamma-frailty models to meta-analyze individual patient data of ordinal scale diagnostic tests.

    PubMed

    Simoneau, Gabrielle; Levis, Brooke; Cuijpers, Pim; Ioannidis, John P A; Patten, Scott B; Shrier, Ian; Bombardier, Charles H; de Lima Osório, Flavia; Fann, Jesse R; Gjerdingen, Dwenda; Lamers, Femke; Lotrakul, Manote; Löwe, Bernd; Shaaban, Juwita; Stafford, Lesley; van Weert, Henk C P M; Whooley, Mary A; Wittkampf, Karin A; Yeung, Albert S; Thombs, Brett D; Benedetti, Andrea

    2017-11-01

    Individual patient data (IPD) meta-analyses are increasingly common in the literature. In the context of estimating the diagnostic accuracy of ordinal or semi-continuous scale tests, sensitivity and specificity are often reported for a given threshold or a small set of thresholds, and a meta-analysis is conducted via a bivariate approach to account for their correlation. When IPD are available, sensitivity and specificity can be pooled for every possible threshold. Our objective was to compare the bivariate approach, which can be applied separately at every threshold, to two multivariate methods: the ordinal multivariate random-effects model and the Poisson correlated gamma-frailty model. Our comparison was empirical, using IPD from 13 studies that evaluated the diagnostic accuracy of the 9-item Patient Health Questionnaire depression screening tool, and included simulations. The empirical comparison showed that the implementation of the two multivariate methods is more laborious in terms of computational time and sensitivity to user-supplied values compared to the bivariate approach. Simulations showed that ignoring the within-study correlation of sensitivity and specificity across thresholds did not worsen inferences with the bivariate approach compared to the Poisson model. The ordinal approach was not suitable for simulations because the model was highly sensitive to user-supplied starting values. We tentatively recommend the bivariate approach rather than more complex multivariate methods for IPD diagnostic accuracy meta-analyses of ordinal scale tests, although the limited type of diagnostic data considered in the simulation study restricts the generalization of our findings. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Quantitative somatosensory testing of the penis: optimizing the clinical neurological examination.

    PubMed

    Bleustein, Clifford B; Eckholdt, Haftan; Arezzo, Joseph C; Melman, Arnold

    2003-06-01

    Quantitative somatosensory testing, including vibration, pressure, spatial perception and thermal thresholds of the penis, has demonstrated neuropathy in patients with a history of erectile dysfunction of all etiologies. We evaluated which measurement of neurological function of the penis was best at predicting erectile dysfunction and examined the impact of location on the penis for quantitative somatosensory testing measurements. A total of 107 patients were evaluated. All patients were required to complete the erectile function domain of the International Index of Erectile Function (IIEF) questionnaire, of whom 24 had no complaints of erectile dysfunction and scored within the "normal" range on the IIEF. Patients were subsequently tested on the ventral middle penile shaft, proximal dorsal midline penile shaft and glans penis (with foreskin retracted) for vibration, pressure, spatial perception, and warm and cold thermal thresholds. Mixed-models repeated measures analysis of variance controlling for age, diabetes and hypertension revealed that method of measurement (quantitative somatosensory testing) was predictive of IIEF score (F = 209, df = 4,1315, p <0.001), while site of measurement on the penis was not. To determine the best method of measurement, we used hierarchical regression, which revealed that warm temperature was the best predictor of erectile dysfunction with pseudo-R² = 0.19, p <0.0007. There was no significant improvement in predicting erectile dysfunction when another test was added. Using 37°C and greater as the warm thermal threshold yielded a sensitivity of 88.5%, a specificity of 70.0% and a positive predictive value of 85.5%. Quantitative somatosensory testing using warm thermal threshold measurements taken at the glans penis can be used alone to assess the neurological status of the penis. Warm thermal thresholds alone offer a quick, noninvasive and accurate method of evaluating penile neuropathy in an office setting.

  17. Prediction of spatially explicit rainfall intensity-duration thresholds for post-fire debris-flow generation in the western United States

    NASA Astrophysics Data System (ADS)

    Staley, Dennis; Negri, Jacquelyn; Kean, Jason

    2016-04-01

    Population expansion into fire-prone steeplands has resulted in an increase in post-fire debris-flow risk in the western United States. Logistic regression methods for determining debris-flow likelihood and the calculation of empirical rainfall intensity-duration thresholds for debris-flow initiation represent two common approaches for characterizing hazard and reducing risk. Logistic regression models are currently being used to rapidly assess debris-flow hazard in response to design storms of known intensities (e.g. a 10-year recurrence interval rainstorm). Empirical rainfall intensity-duration thresholds comprise a major component of the United States Geological Survey (USGS) and the National Weather Service (NWS) debris-flow early warning system at a regional scale in southern California. However, these two modeling approaches remain independent, with each approach having limitations that do not allow for synergistic local-scale (e.g. drainage-basin scale) characterization of debris-flow hazard during intense rainfall. The current logistic regression equations consider rainfall a unique independent variable, which prevents the direct calculation of the relation between rainfall intensity and debris-flow likelihood. Regional (e.g. mountain range or physiographic province scale) rainfall intensity-duration thresholds fail to provide insight into the basin-scale variability of post-fire debris-flow hazard and require an extensive database of historical debris-flow occurrence and rainfall characteristics. Here, we present a new approach that combines traditional logistic regression and intensity-duration threshold methodologies. 
This method allows for local characterization of the likelihood that a debris flow will occur at a given rainfall intensity, direct calculation of the rainfall rates that will result in a given likelihood, and calculation of spatially explicit rainfall intensity-duration thresholds for debris-flow generation in recently burned areas. Our approach synthesizes the two methods by incorporating measured rainfall intensity into each model variable (based on measures of topographic steepness, burn severity and surface properties) within the logistic regression equation. This approach provides a more realistic representation of the relation between rainfall intensity and debris-flow likelihood, as likelihood values asymptotically approach zero as rainfall intensity approaches 0 mm/h, and increase with more intense rainfall. Model performance was evaluated by comparing predictions to several existing regional thresholds. The model, trained on data collected in southern California, USA, accurately predicted rainfall intensity-duration thresholds for other areas in the western United States that were not included in the original training dataset. In addition, the improved logistic regression model shows promise for emergency planning purposes and real-time, site-specific early warning. With further validation, this model may permit the prediction of spatially explicit intensity-duration thresholds for debris-flow generation in areas where empirically derived regional thresholds do not exist. This improvement would permit the expansion of the early-warning system into other regions susceptible to post-fire debris flows.
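
    The key structural idea, scaling every predictor term by rainfall intensity so the logistic model can be inverted for a threshold intensity, can be sketched as below. The coefficients and predictor values are made up for illustration; they are not the published model:

```python
import math

# Hedged sketch (not the authors' published coefficients): a logistic
# model in which every predictor term is scaled by rainfall intensity I,
# so likelihood -> ~0 as I -> 0, and the threshold intensity for any
# target likelihood has a closed-form solution.

def debris_flow_likelihood(I, beta0, betas, x):
    """I: rainfall intensity (mm/h); x: basin predictors
    (e.g. steepness, burn severity, surface properties)."""
    z = beta0 + I * sum(b * xi for b, xi in zip(betas, x))
    return 1.0 / (1.0 + math.exp(-z))

def threshold_intensity(p, beta0, betas, x):
    """Invert the model: intensity at which likelihood equals p."""
    s = sum(b * xi for b, xi in zip(betas, x))
    return (math.log(p / (1.0 - p)) - beta0) / s

# Illustrative (made-up) coefficients and basin predictors:
beta0, betas, x = -3.6, [0.07, 0.02, 0.05], [10.0, 2.5, 4.0]
I50 = threshold_intensity(0.5, beta0, betas, x)
print(round(debris_flow_likelihood(I50, beta0, betas, x), 3))  # 0.5
```

Because the predictors multiply intensity rather than entering as separate terms, the likelihood is pinned near zero at zero rainfall, which is the behavior the abstract highlights.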

  18. Influence of Injury Risk Thresholds on the Performance of an Algorithm to Predict Crashes with Serious Injuries

    PubMed Central

    Bahouth, George; Digges, Kennerly; Schulman, Carl

    2012-01-01

    This paper presents methods to estimate crash injury risk based on crash characteristics captured by some passenger vehicles equipped with Advanced Automatic Crash Notification technology. The resulting injury risk estimates could be used within an algorithm to optimize rescue care. Regression analysis was applied to the National Automotive Sampling System / Crashworthiness Data System (NASS/CDS) to determine how variations in a specific injury risk threshold would influence the accuracy of predicting crashes with serious injuries. The recommended thresholds for classifying crashes with severe injuries are 0.10 for frontal crashes and 0.05 for side crashes. The regression analysis of NASS/CDS indicates that these thresholds will provide sensitivity above 0.67 while maintaining a positive predictive value in the range of 0.20. PMID:23169132

  19. s-wave threshold in electron attachment - Observations and cross sections in CCl4 and SF6 at ultralow electron energies

    NASA Technical Reports Server (NTRS)

    Chutjian, A.; Alajajian, S. H.

    1985-01-01

    The threshold photoionization method was used to study low-energy electron attachment phenomena in and cross sections of CCl4 and SF6 compounds, which have applications in the design of gaseous dielectrics and diffuse discharge opening switches. Measurements were made at electron energies from below threshold to 140 meV at resolutions of 6 and 8 meV. A narrow resolution-limited structure was observed in electron attachment to CCl4 and SF6 at electron energies below 10 meV, which is attributed to the divergence of the attachment cross section in the s-wave (l = 0) limit as the electron energy approaches zero. The results are compared with experimental collisional-ionization results, electron-swarm unfolded cross sections, and earlier threshold photoionization data.

  20. Determination of Cross-Sectional Area of Focused Picosecond Gaussian Laser Beam

    NASA Technical Reports Server (NTRS)

    Ledesma, Rodolfo; Fitz-Gerald, James; Palmieri, Frank; Connell, John

    2018-01-01

    Measurement of the waist diameter of a focused Gaussian beam at the 1/e² intensity, also referred to as the spot size, is key to determining the fluence in laser processing experiments. Spot size measurements are also helpful for calculating the threshold energy and threshold fluence of a given material. This work reports an application of a conventional method, analyzing single laser-ablated spots for different laser pulse energies, to determine the cross-sectional area of a focused Gaussian beam with a nominal pulse width of approximately 10 ps. Polished tungsten was used as the target material, due to its low surface roughness and low ablation threshold, to measure the beam waist diameter. From the ablative spot measurements, the ablation threshold fluence of the tungsten substrate was also calculated.
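
    The conventional single-spot analysis referred to above is commonly implemented as a linear fit of squared ablated diameter against the logarithm of pulse energy (often called the Liu method). A minimal sketch with synthetic numbers, not the tungsten data from the abstract:

```python
import math
import numpy as np

# For a Gaussian fluence profile the squared ablated diameter grows as
#   D^2 = 2 * w0^2 * ln(E / E_th),
# so a linear fit of D^2 versus ln(E) yields both the 1/e^2 waist w0
# (from the slope) and the ablation threshold energy E_th (from the
# intercept). Values below are synthetic for illustration.

w0_true, Eth_true = 15.0, 2.0               # um, uJ (assumed)
E = np.array([3.0, 5.0, 8.0, 12.0, 20.0])   # pulse energies (uJ)
D2 = 2 * w0_true**2 * np.log(E / Eth_true)  # squared ablated diameters

slope, intercept = np.polyfit(np.log(E), D2, 1)
w0 = math.sqrt(slope / 2.0)        # waist from slope = 2*w0^2
Eth = math.exp(-intercept / slope) # threshold from x-intercept
print(round(w0, 2), round(Eth, 2))  # 15.0 2.0
```

With the waist in hand, the spot area and hence the threshold fluence follow directly (e.g. F_th = 2*E_th / (pi*w0^2) for a Gaussian beam).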

  1. Methods for improving the damage performance of fused silica polished by magnetorheological finishing

    DOE PAGES

    Kafka, Kyle R. P.; Hoffman, Brittany N.; Papernov, Semyon; ...

    2017-12-11

    The laser-induced damage threshold of fused-silica samples processed via magnetorheological finishing is investigated for polishing compounds depending on the type of abrasive material and the post-polishing surface roughness. The effectiveness of laser conditioning is examined using a ramped pre-exposure with the same 351-nm, 3-ns Gaussian pulses. Lastly, we examine chemical etching of the surface and correlate the resulting damage threshold to the etching protocol. A combination of etching and laser conditioning is found to improve the damage threshold by a factor of ~3, while maintaining <1-nm surface roughness.

  2. Methods for improving the damage performance of fused silica polished by magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Kafka, K. R. P.; Hoffman, B.; Papernov, S.; DeMarco, M. A.; Hall, C.; Marshall, K. L.; Demos, S. G.

    2017-12-01

    The laser-induced damage threshold of fused-silica samples processed via magnetorheological finishing is investigated for polishing compounds depending on the type of abrasive material and the post-polishing surface roughness. The effectiveness of laser conditioning is examined using a ramped pre-exposure with the same 351-nm, 3-ns Gaussian pulses. Finally, we examine chemical etching of the surface and correlate the resulting damage threshold to the etching protocol. A combination of etching and laser conditioning is found to improve the damage threshold by a factor of 3, while maintaining <1-nm surface roughness.

  3. Comparison of an adaptive local thresholding method on CBCT and µCT endodontic images

    NASA Astrophysics Data System (ADS)

    Michetti, Jérôme; Basarab, Adrian; Diemer, Franck; Kouame, Denis

    2018-01-01

    Root canal segmentation on cone beam computed tomography (CBCT) images is difficult because of the noise level, resolution limitations, beam hardening and dental morphological variations. An image processing framework, based on an adaptive local threshold method, was evaluated on CBCT images acquired from extracted teeth. A comparison with high-quality segmented endodontic images on micro computed tomography (µCT) images acquired from the same teeth was carried out using a dedicated registration process. Each segmented tooth was evaluated according to volume and root canal sections through the area and the Feret's diameter. The proposed method is shown to overcome the limitations of CBCT and to provide an automated and adaptive complete endodontic segmentation. Despite a slight underestimation (-4.08%), the local threshold segmentation method based on edge detection was shown to be fast and accurate. Strong correlations between CBCT and µCT segmentations were found for both the root canal area and diameter (0.98 and 0.88, respectively). Our findings suggest that combining CBCT imaging with this image processing framework may benefit experimental endodontology and teaching, and could represent a first development step towards the clinical use of endodontic CBCT segmentation during pulp cavity treatment.

  4. 3D GGO candidate extraction in lung CT images using multilevel thresholding on supervoxels

    NASA Astrophysics Data System (ADS)

    Huang, Shan; Liu, Xiabi; Han, Guanghui; Zhao, Xinming; Zhao, Yanfeng; Zhou, Chunwu

    2018-02-01

    Early detection of ground glass opacity (GGO) is of great importance since GGOs are more likely to be malignant than solid nodules. However, the detection of GGO is a difficult task in lung cancer screening. This paper proposes a novel GGO candidate extraction method, which performs multilevel thresholding on supervoxels in 3D lung CT images. Firstly, we segment the lung parenchyma based on the Otsu algorithm. Secondly, voxels which are adjacent in 3D discrete space and share similar grayscale are clustered into supervoxels. This procedure is used to enhance GGOs and reduce computational complexity. Thirdly, the Hessian matrix is used to emphasize focal GGO candidates. Lastly, an improved adaptive multilevel thresholding method is applied to the segmented clusters to extract GGO candidates. The proposed method was evaluated on a set of 19 lung CT scans containing 166 GGO lesions from the Lung CT Imaging Signs (LISS) database. The experimental results show that our proposed GGO candidate extraction method is effective, with a sensitivity of 100% and 26.3 false positives per scan (665 GGO candidates, 499 non-GGO regions and 166 GGO regions). It can handle both focal GGOs and diffuse GGOs.
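
    The first step in the pipeline above, Otsu thresholding of the parenchyma, picks the gray level that maximizes between-class variance of the intensity histogram. A minimal NumPy sketch of that step only (not the authors' full supervoxel pipeline):

```python
import numpy as np

# Minimal Otsu threshold on a grayscale histogram: scan all candidate
# thresholds and keep the one maximizing between-class variance.

def otsu_threshold(image, nbins=256):
    hist, edges = np.histogram(image, bins=nbins)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(hist)                      # class-0 (dark) weight
    w1 = w0[-1] - w0                          # class-1 (bright) weight
    m0 = np.cumsum(hist * centers)
    mu0 = m0 / np.where(w0 == 0, 1, w0)       # class-0 mean
    mu1 = (m0[-1] - m0) / np.where(w1 == 0, 1, w1)
    between = w0 * w1 * (mu0 - mu1) ** 2      # between-class variance
    return centers[np.argmax(between)]

# Bimodal test data: dark background plus a bright tissue class.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(50, 5, 5000), rng.normal(200, 5, 5000)])
t = otsu_threshold(img)
print(50 < t < 200)  # True
```

On CT data the same routine would run on Hounsfield-unit intensities; the histogram bin count is a tunable assumption here.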

  5. Wavelet median denoising of ultrasound images

    NASA Astrophysics Data System (ADS)

    Macey, Katherine E.; Page, Wyatt H.

    2002-05-01

    Ultrasound images are contaminated with both additive and multiplicative noise, which is modeled by Gaussian and speckle noise respectively. Distinguishing small features such as fallopian tubes in the female genital tract in the noisy environment is problematic. A new method for noise reduction, Wavelet Median Denoising, is presented. Wavelet Median Denoising consists of performing a standard noise reduction technique, median filtering, in the wavelet domain. The new method is tested on 126 images, comprised of 9 original images each with 14 levels of Gaussian or speckle noise. Results for both separable and non-separable wavelets are evaluated, relative to soft-thresholding in the wavelet domain, using the signal-to-noise ratio and subjective assessment. The performance of Wavelet Median Denoising is comparable to that of soft-thresholding. Both methods are more successful in removing Gaussian noise than speckle noise. Wavelet Median Denoising outperforms soft-thresholding for a larger number of cases of speckle noise reduction than of Gaussian noise reduction. Noise reduction is more successful using non-separable wavelets than separable wavelets. When both methods are applied to ultrasound images obtained from a phantom of the female genital tract a small improvement is seen; however, a substantial improvement is required prior to clinical use.

  6. SU-E-I-96: A Study About the Influence of ROI Variation On Tumor Segmentation in PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, L; Tan, S; Lu, W

    2014-06-01

    Purpose: To study the influence of different regions of interest (ROI) on tumor segmentation in PET. Methods: The experiments were conducted on a cylindrical phantom. Six spheres with different volumes (0.5 ml, 1 ml, 6 ml, 12 ml, 16 ml and 20 ml) were placed inside a cylindrical container to mimic tumors of different sizes. The spheres were filled with 11C solution as sources and the cylindrical container was filled with 18F-FDG solution as the background. The phantom was continuously scanned in a Biograph-40 True Point/True View PET/CT scanner, and 42 images were reconstructed with source-to-background ratio (SBR) ranging from 16:1 to 1.8:1. We took a large and a small ROI for each sphere, each of which contained the whole sphere and no other spheres. Six other ROIs of different sizes were then taken between the large and the small ROI. For each ROI, all images were segmented by eight thresholding methods and eight advanced methods, respectively. The segmentation results were evaluated by the dice similarity index (DSI), classification error (CE) and volume error (VE). The robustness of different methods to ROI variation was quantified using the interrun variation and a generalized Cohen's kappa. Results: With the change of ROI, the segmentation results of all tested methods changed to varying degrees. Compared with the advanced methods, thresholding methods were less affected by the ROI change. In addition, most of the thresholding methods produced more accurate segmentation results for all sphere sizes. Conclusion: The results showed that the segmentation performance of all tested methods was affected by the change of ROI. Thresholding methods were more robust to this change and can segment PET images more accurately. This work was supported in part by the National Natural Science Foundation of China (NNSFC), under Grant Nos. 60971112 and 61375018, and the Fundamental Research Funds for the Central Universities, under Grant No. 2012QN086. Wei Lu was supported in part by the National Institutes of Health (NIH) Grant No. R01 CA172638.
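
    The Dice similarity index (DSI) used above to score each segmentation against ground truth is DSI = 2|A ∩ B| / (|A| + |B|). A minimal sketch on binary masks (the tiny arrays are illustrative):

```python
import numpy as np

# Dice similarity index between two binary segmentation masks:
# 1.0 for perfect overlap, 0.0 for no overlap.

def dice(a, b):
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

seg   = np.array([[0, 1, 1], [0, 1, 0]])   # predicted mask
truth = np.array([[0, 1, 1], [1, 1, 0]])   # ground-truth mask
print(round(dice(seg, truth), 3))  # 0.857
```

Classification error and volume error are computed analogously from the same intersection and per-mask voxel counts.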

  7. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach

    PubMed Central

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2016-01-01

    Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on "one-time" release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods on each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and correlations or overlapping users between the snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
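
    The fixed-threshold strategy (DSFT) can be sketched as below. This is a simplified illustration: the distance comparison itself is done on raw histograms here, whereas a real differentially private implementation must also perturb or privatize that comparison, and the budget accounting is omitted:

```python
import numpy as np

# Sketch of a DSFT-style release loop: publish a new Laplace-noised
# histogram only when the current snapshot's L1 distance from the last
# released snapshot exceeds a threshold; otherwise republish the
# previous noisy release and save privacy budget. Parameter names are
# illustrative; the privatized distance test is omitted for brevity.

def dsft_release(snapshots, epsilon, threshold):
    releases, last = [], None
    for h in snapshots:
        if last is None or np.abs(h - last[0]).sum() > threshold:
            noisy = h + np.random.laplace(0.0, 1.0 / epsilon, size=h.shape)
            last = (h.copy(), noisy)          # (raw, released) pair
        releases.append(last[1])
    return releases

snaps = [np.array([10., 20., 30.]),
         np.array([10., 21., 30.]),   # small change: reuse release
         np.array([40.,  5., 30.])]   # big change: fresh release
out = dsft_release(snaps, epsilon=1.0, threshold=5.0)
print(out[0] is out[1], out[1] is out[2])  # True False
```

DSAT replaces the constant `threshold` with one adjusted by a feedback controller tracking recent data dynamics.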

  8. Estimating the extreme low-temperature event using nonparametric methods

    NASA Astrophysics Data System (ADS)

    D'Silva, Anisha

    This thesis presents a new method of estimating the one-in-N low temperature threshold using a non-parametric statistical method called kernel density estimation applied to daily average wind-adjusted temperatures. We apply our One-in-N Algorithm to local gas distribution companies (LDCs), as they have to forecast the daily natural gas needs of their consumers. In winter, demand for natural gas is high. Extreme low temperature events are not directly related to an LDC's gas demand forecasting, but knowledge of extreme low temperatures is important to ensure that an LDC has enough capacity to meet customer demands when extreme low temperatures are experienced. We present a detailed explanation of our One-in-N Algorithm and compare it to the methods using the generalized extreme value distribution, the normal distribution, and the variance-weighted composite distribution. We show that our One-in-N Algorithm estimates the one-in-N low temperature threshold more accurately than the methods using the generalized extreme value distribution, the normal distribution, and the variance-weighted composite distribution according to the root mean square error (RMSE) measure at a 5% level of significance. The One-in-N Algorithm is tested by counting the number of times the daily average wind-adjusted temperature is less than or equal to the one-in-N low temperature threshold.
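
    The kernel-density idea behind such a threshold estimate can be sketched as follows: fit a Gaussian KDE to the temperature sample, then invert its CDF at probability 1/N. The bandwidth rule, quantile target, and synthetic data are illustrative assumptions, not the thesis's exact algorithm:

```python
import math
import numpy as np

# Sketch: Gaussian kernel density estimate of daily average
# (wind-adjusted) temperatures, then bisection on the KDE's CDF to
# find the temperature whose estimated probability of being reached
# or undercut is 1/N -- a one-in-N low-temperature threshold.

def kde_cdf(t, data, h):
    """KDE CDF = mean of Gaussian-kernel CDFs centered on the data."""
    z = (t - data) / h
    return float(np.mean([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z]))

def one_in_n_threshold(data, n, tol=1e-6):
    data = np.asarray(data, float)
    h = 1.06 * data.std() * len(data) ** -0.2   # Silverman's rule (assumed)
    lo, hi = data.min() - 5 * h, data.max() + 5 * h
    target = 1.0 / n
    while hi - lo > tol:                        # bisection on the CDF
        mid = 0.5 * (lo + hi)
        if kde_cdf(mid, data, h) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(1)
temps = rng.normal(-5.0, 8.0, 2000)   # synthetic winter temperatures
t20 = one_in_n_threshold(temps, 20)   # 5th-percentile cold threshold
print(t20 < np.median(temps))  # True
```

Unlike the parametric alternatives compared in the thesis, no distributional family is assumed; the estimate follows the sample's empirical shape.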

  9. Generalized analog thresholding for spike acquisition at ultralow sampling rates

    PubMed Central

    He, Bryan D.; Wein, Alex; Varshney, Lav R.; Kusuma, Julius; Richardson, Andrew G.

    2015-01-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712

  10. Determination of Acoustic Cavitation Probabilities and Thresholds Using a Single Focusing Transducer to Induce and Detect Acoustic Cavitation Events: I. Method and Terminology.

    PubMed

    Haller, Julian; Wilkens, Volker; Shaw, Adam

    2018-02-01

    A method to determine acoustic cavitation probabilities in tissue-mimicking materials (TMMs) is described that uses a high-intensity focused ultrasound (HIFU) transducer for both inducing and detecting the acoustic cavitation events. The method was evaluated by studying acoustic cavitation probabilities in agar-based TMMs with and without scatterers and for different sonication modes like continuous wave, single pulses (microseconds to milliseconds) and repeated burst signals. Acoustic cavitation thresholds (defined here as the peak rarefactional in situ pressure at which the acoustic cavitation probability reaches 50%) at a frequency of 1.06 MHz were observed between 1.1 MPa (for 1 s of continuous wave sonication) and 4.6 MPa (for 1 s of a repeated burst signal with 25-cycle burst length and 10-ms burst period) in a 3% (by weight) agar phantom without scatterers. The method and its evaluation are described, and general terminology useful for standardizing the description of insonation conditions and comparing results is provided. In the accompanying second part, the presented method is used to systematically study the acoustic cavitation thresholds in the same material for a range of sonication modes. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Vibration perception threshold for sight-threatening retinopathy screening in type 2 diabetic outpatients.

    PubMed

    Shen, Jing; Hu, Yanyun; Liu, Fang; Zeng, Hui; Li, Lianxi; Zhao, Jun; Zhao, Jungong; Zheng, Taishan; Lu, Huijuan; Lu, Fengdi; Bao, Yuqian; Jia, Weiping

    2013-10-01

    We investigated the relationship between vibration perception threshold and diabetic retinopathy and verified the screening value of vibration perception threshold for severe diabetic retinopathy. A total of 955 patients with type 2 diabetes were recruited and divided into three groups according to their fundus oculi photography results: no diabetic retinopathy (n = 654, 68.48%), non-sight-threatening diabetic retinopathy (n = 189, 19.79%) and sight-threatening diabetic retinopathy (n = 112, 11.73%). Their clinical and biochemical characteristics, vibration perception threshold and the diabetic retinopathy grades were detected and compared. There were significant differences in diabetes duration and blood glucose levels among three groups (all p < 0.05). The values of vibration perception threshold increased with the rising severity of retinopathy, and the vibration perception threshold level of sight-threatening diabetic retinopathy group was significantly higher than both non-sight-threatening diabetic retinopathy and no diabetic retinopathy groups (both p < 0.01). The prevalence of sight-threatening diabetic retinopathy in vibration perception threshold >25 V group was significantly higher than those in 16-24 V group (p < 0.01). The severity of diabetic retinopathy was positively associated with diabetes duration, blood glucose indexes and vibration perception threshold (all p < 0.01). Multiple stepwise regression analysis proved that glycosylated haemoglobin (β = 0.385, p = 0.000), diabetes duration (β = 0.275, p = 0.000) and vibration perception threshold (β = 0.180, p = 0.015) were independent risk factors for diabetic retinopathy. Receiver operating characteristic analysis further revealed that vibration perception threshold higher than 18 V was the optimal cut point for reflecting high risk of sight-threatening diabetic retinopathy (odds ratio = 4.20, 95% confidence interval = 2.67-6.59). 
There was a close association between vibration perception threshold and the severity of diabetic retinopathy. Vibration perception threshold is a potential screening method for diabetic retinopathy, and its optimal cut-off for prompting high risk of sight-threatening retinopathy was 18 V. Copyright © 2013 John Wiley & Sons, Ltd.
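
    The receiver operating characteristic analysis used to pick the 18 V cut point can be illustrated with a threshold sweep maximizing the Youden index (sensitivity + specificity - 1). The data below are synthetic, not the study's measurements:

```python
import numpy as np

# Illustrative ROC-style cutoff search: sweep candidate thresholds,
# compute sensitivity and specificity at each, and keep the cutoff
# maximizing the Youden index J = sensitivity + specificity - 1.

def best_cutoff(values, disease):
    values, disease = np.asarray(values, float), np.asarray(disease, bool)
    best = None
    for c in np.unique(values):
        pred = values >= c                    # "high VPT" flags disease
        sens = (pred & disease).sum() / disease.sum()
        spec = (~pred & ~disease).sum() / (~disease).sum()
        j = sens + spec - 1.0
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

rng = np.random.default_rng(2)
vpt = np.concatenate([rng.normal(12, 4, 300),    # no/mild retinopathy
                      rng.normal(24, 5, 100)])   # sight-threatening
y = np.array([False] * 300 + [True] * 100)
j, cut, sens, spec = best_cutoff(vpt, y)
print(j > 0.5)  # True
```

Reporting an odds ratio with confidence interval, as the study does, would then quantify the risk gradient above versus below the chosen cutoff.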

  12. The potential advantages of (18)FDG PET/CT-based target volume delineation in radiotherapy planning of head and neck cancer.

    PubMed

    Moule, Russell N; Kayani, Irfan; Moinuddin, Syed A; Meer, Khalda; Lemon, Catherine; Goodchild, Kathleen; Saunders, Michele I

    2010-11-01

    This study investigated two fixed threshold methods to delineate the target volume using (18)FDG PET/CT before and during a course of radical radiotherapy in locally advanced squamous cell carcinoma of the head and neck. Patients were enrolled into the study between March 2006 and May 2008. (18)FDG PET/CT scans were carried out 72 h prior to the start of radiotherapy and then at 10, 44 and 66 Gy. Functional volumes were delineated according to the SUV Cut Off (SUVCO) (2.5, 3.0, 3.5, and 4.0 bwg/ml) and percentage of the SUVmax (30%, 35%, 40%, 45%, and 50%) thresholds. The background (18)FDG uptake and the SUVmax within the volumes were also assessed. Primary and lymph node volumes for the eight patients significantly reduced with each increase in the delineation threshold (for example 2.5-3.0 bwg/ml SUVCO) compared to the baseline threshold at each imaging point. There was a significant reduction in the volume (p ⩽ 0.0001-0.01) after 36 Gy compared to 0 Gy by the SUVCO method. There was a negative correlation between the SUVmax within the primary and lymph node volumes and delivered radiation dose (p ⩽ 0.0001-0.011) but no difference in the SUV within the background reference region. The volumes delineated by the PTSUVmax method increased with the increase in the delivered radiation dose after 36 Gy because the SUVmax within the region of interest used to define the edge of the volume was equal to or less than the background (18)FDG uptake, and the software was unable to effectively differentiate between tumour and background uptake. The changes in the target volumes delineated by the SUVCO method were less susceptible to background (18)FDG uptake compared to those delineated by the PTSUVmax method and may be more helpful in radiotherapy planning. The best method and threshold have still to be determined within institutions, both nationally and internationally. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  13. New method to evaluate the 7Li(p, n)7Be reaction near threshold

    NASA Astrophysics Data System (ADS)

    Herrera, María S.; Moreno, Gustavo A.; Kreiner, Andrés J.

    2015-04-01

    In this work a complete description of the 7Li(p, n)7Be reaction near threshold is given using center-of-mass and relative coordinates. It is shown that this standard approach, not previously used in this context, leads to a simple mathematical representation that gives easy access to all relevant quantities in the reaction and allows a precise numerical implementation. It also provides a simple way to include proton beam-energy spread effects. The method, implemented as a C++ code, was validated against both numerical and experimental data, with good agreement. The tool is also used here to analyze scattered published measurements such as (p, n) cross sections and differential and total neutron yields for thick targets. Using these data we derive a consistent set of parameters to evaluate neutron production near threshold. The sensitivity of the results to data uncertainty and the possibility of incorporating new measurements are also discussed.
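    For orientation, the lab-frame threshold energy itself follows from the same center-of-mass bookkeeping. A nonrelativistic sketch (the paper's treatment is more complete; the function name and rounded masses are assumptions here):

    ```python
    def threshold_energy(q_value_mev, m_projectile, m_target):
        """Nonrelativistic lab-frame threshold for A(a, b)B: at
        threshold the products are at rest in the center of mass, so
        the beam must supply |Q| plus the center-of-mass recoil,
        giving E_th = -Q * (m_a + m_A) / m_A."""
        return -q_value_mev * (m_projectile + m_target) / m_target

    # 7Li(p, n)7Be has Q = -1.644 MeV; masses in atomic mass units
    e_th = threshold_energy(-1.644, 1.00728, 7.01600)  # ~1.880 MeV
    ```

    The result reproduces the well-known 1.880 MeV threshold of the 7Li(p, n)7Be reaction.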

  14. Photoionization of atomic barium subshells in the 4d threshold region using the relativistic multiconfiguration Tamm-Dancoff approximation

    NASA Astrophysics Data System (ADS)

    Ganesan, Aarthi; Deshmukh, P. C.; Manson, S. T.

    2017-03-01

    Photoionization cross sections and photoelectron angular distribution asymmetry parameters are calculated for the 4d10, 5s2, 5p6, and 6s2 subshells of atomic barium as a test of the relativistic multiconfiguration Tamm-Dancoff (RMCTD) method. The shape resonance present in the near-threshold region of the 4d subshell is studied in detail in the 4d photoionization, along with the 5s, 5p, and 6s subshells in the region of the 4d thresholds, as the 4d shape resonance strongly influences these subshells in its vicinity. The results are compared with available experimental and other many-body theoretical results in an effort to assess the capabilities of the RMCTD methodology. The electron correlations addressed in the RMCTD method give relatively good agreement with the experimental data, indicating that the important many-body correlations are included correctly.

  15. An adaptive embedded mesh procedure for leading-edge vortex flows

    NASA Technical Reports Server (NTRS)

    Powell, Kenneth G.; Beer, Michael A.; Law, Glenn W.

    1989-01-01

    A procedure for solving the conical Euler equations on an adaptively refined mesh is presented, along with a method for determining which cells to refine. The solution procedure is a central-difference cell-vertex scheme. The adaptation procedure is made up of a parameter on which the refinement decision is based, and a method for choosing a threshold value of the parameter. The refinement parameter is a measure of mesh-convergence, constructed by comparison of locally coarse- and fine-grid solutions. The threshold for the refinement parameter is based on the curvature of the curve relating the number of cells flagged for refinement to the value of the refinement threshold. Results for three test cases are presented. The test problem is that of a delta wing at angle of attack in a supersonic free-stream. The resulting vortices and shocks are captured efficiently by the adaptive code.

  16. Optimizing the rapid measurement of detection thresholds in infants

    PubMed Central

    Jones, Pete R.; Kalwarowsky, Sarah; Braddick, Oliver J.; Atkinson, Janette; Nardini, Marko

    2015-01-01

    Accurate measures of perceptual threshold are difficult to obtain in infants. In a clinical context, the challenges are particularly acute because the methods must yield meaningful results quickly and within a single individual. The present work considers how best to maximize speed, accuracy, and reliability when testing infants behaviorally and suggests some simple principles for improving test efficiency. Monte Carlo simulations, together with empirical (visual acuity) data from 65 infants, are used to demonstrate how psychophysical methods developed with adults can produce misleading results when applied to infants. The statistical properties of an effective clinical infant test are characterized, and based on these, it is shown that (a) a reduced (false-positive) guessing rate can greatly increase test efficiency, (b) the ideal threshold to target is often below 50% correct, and (c) simply taking the max correct response can often provide the best measure of an infant's perceptual sensitivity. PMID:26237298
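    The role of the guessing rate can be made concrete with a standard psychometric function. This is a generic logistic sketch with invented parameters, not the authors' model: lowering the floor (the false-positive guessing rate) leaves usable dynamic range below 50% correct, which is why the ideal target threshold can sit below that level.

    ```python
    import math

    def p_correct(level, threshold, slope, guess, lapse=0.02):
        """Probability of a correct response: floor at the guessing
        rate, ceiling at 1 - lapse, logistic transition around the
        threshold."""
        f = 1.0 / (1.0 + math.exp(-slope * (level - threshold)))
        return guess + (1.0 - guess - lapse) * f

    # same sub-threshold stimulus level, two different guessing rates
    low_guess = p_correct(-1.0, 0.0, 1.0, guess=0.05)   # ~0.30
    high_guess = p_correct(-1.0, 0.0, 1.0, guess=0.50)  # ~0.63
    ```

    With a 50% floor, all sub-threshold performance is compressed between 0.5 and 1.0; with a 5% floor, responses well below 50% correct remain clearly above chance and therefore informative.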

  17. A diffusion modelling approach to understanding contextual cueing effects in children with ADHD

    PubMed Central

    Weigard, Alexander; Huang-Pollock, Cynthia

    2014-01-01

    Background: Strong theoretical models suggest implicit learning deficits may exist among children with Attention Deficit Hyperactivity Disorder (ADHD). Method: We examine implicit contextual cueing (CC) effects among children with ADHD (n=72) and non-ADHD controls (n=36). Results: Using Ratcliff's drift diffusion model, we found that among controls, the CC effect is due to improvements in attentional guidance and to reductions in response threshold. Children with ADHD did not show a CC effect; although they were able to use implicitly acquired information to deploy attentional focus, they had more difficulty adjusting their response thresholds. Conclusions: Improvements in attentional guidance and reductions in response threshold together underlie the CC effect. Results are consistent with neurocognitive models of ADHD that posit subcortical dysfunction but intact spatial attention, and encourage the use of alternative data analytic methods when dealing with reaction time data. PMID:24798140

  18. Ultralow percolation threshold of single walled carbon nanotube-epoxy composites synthesized via an ionic liquid dispersant/initiator

    NASA Astrophysics Data System (ADS)

    Watters, Arianna L.; Palmese, Giuseppe R.

    2014-09-01

    Uniform dispersion of single walled carbon nanotubes (SWNTs) in an epoxy was achieved by a streamlined mechano-chemical processing method. SWNT-epoxy composites were synthesized using a room-temperature ionic liquid (IL) with an imidazolium cation and a dicyanamide anion. The novel approach of using an ionic liquid that acts as both a dispersant for the SWNTs and an initiator for epoxy polymerization greatly simplifies nanocomposite synthesis. The material was processed using simple and scalable three-roll milling. The SWNT dispersion of the resultant composite was evaluated by electron microscopy and by electrical conductivity measurements in conjunction with percolation theory. Processing conditions were optimized to achieve the lowest possible percolation threshold, 4.29 × 10⁻⁵ volume fraction SWNTs. This percolation threshold is among the best reported in the literature, yet it was obtained using a streamlined method that greatly simplifies processing.
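    A percolation analysis of this kind typically fits conductivity data to the power law σ = σ₀(φ − φc)^t above the threshold. One generic way to estimate φc from conductivity-versus-loading data (a sketch on synthetic numbers, not the authors' dataset or fitting code) is to scan candidate thresholds and fit the exponent in log-log space:

    ```python
    import numpy as np

    def fit_percolation(phi, sigma, phi_c_grid):
        """Scan candidate percolation thresholds phi_c, fitting
        sigma = s0 * (phi - phi_c)**t by least squares in log-log
        space, and keep the phi_c with the smallest residual."""
        best = None
        for phi_c in phi_c_grid:
            x = phi - phi_c
            if np.any(x <= 0):      # threshold must lie below all data
                continue
            t, log_s0 = np.polyfit(np.log(x), np.log(sigma), 1)
            sse = float(np.sum((np.log(sigma) - t*np.log(x) - log_s0)**2))
            if best is None or sse < best[0]:
                best = (sse, phi_c, t, float(np.exp(log_s0)))
        _, phi_c, t, s0 = best
        return phi_c, t, s0

    # synthetic conductivity data with a known threshold and exponent
    true_pc, true_t = 4.29e-5, 2.0
    phi = np.array([1e-4, 3e-4, 1e-3, 3e-3, 1e-2])   # SWNT volume fraction
    sigma = 10.0 * (phi - true_pc)**true_t           # conductivity, S/m
    pc, t, s0 = fit_percolation(phi, sigma, np.arange(1e-5, 9e-5, 1e-7))
    ```

    On clean synthetic data the scan recovers the threshold and exponent it was built from; on real measurements the residual surface is flatter and the uncertainty on φc correspondingly larger.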

  19. Investigating resonances above and below the threshold in nuclear reactions of astrophysical interest and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    La Cognata, M., E-mail: lacognata@lns.infn.it; Kiss, G. G.; Mukhamedzhanov, A. M.

    2015-10-15

    Resonances in nuclear cross sections dramatically change their trends. Therefore, the presence of unexpected resonances might lead to unpredicted consequences in astrophysics and nuclear physics. In nuclear physics, resonances allow one to study states in the intermediate compound systems and to evaluate their cluster structure, especially in energy regions approaching particle decay thresholds. In astrophysics, resonances might lead to changes in the nucleosynthesis flow, determining different isotopic compositions of the nuclear burning ashes. For these reasons, the Trojan Horse method has been modified to investigate resonant reactions. Thanks to this novel approach, for the first time normalization to direct data might be avoided. Moreover, in the case of subthreshold resonances, the Trojan Horse method modified to investigate resonances allows one to deduce the asymptotic normalization coefficient, showing the close connection between the two indirect approaches.

  20. Improvement in Brightness Uniformity by Compensating for the Threshold Voltages of Both the Driving Thin-Film Transistor and the Organic Light-Emitting Diode for Active-Matrix Organic Light-Emitting Diode Displays

    NASA Astrophysics Data System (ADS)

    Fan, Ching-Lin; Lai, Hui-Lung; Chang, Jyu-Yu

    2010-05-01

    In this paper, we propose a novel pixel design and driving method for active-matrix organic light-emitting diode (AM-OLED) displays using low-temperature polycrystalline silicon thin-film transistors (LTPS-TFTs). The proposed threshold voltage compensation circuit, which comprises five transistors and two capacitors, has been verified to supply uniform output current by simulation using the automatic integrated circuit modeling simulation program with integrated circuit emphasis (AIM-SPICE) simulator. The driving scheme of this voltage-programming method includes four periods: precharging, compensation, data input, and emission. The simulated results demonstrate excellent properties such as a low error rate of OLED anode voltage variation (<1%) and high output current. The proposed pixel circuit shows high immunity to threshold voltage deviation in both the driving poly-Si TFT and the OLED.

  1. A New Low Temperature Polycrystalline Silicon Thin Film Transistor Pixel Circuit for Active Matrix Organic Light Emitting Diode

    NASA Astrophysics Data System (ADS)

    Fan, Ching-Lin; Lin, Yi-Yan; Chang, Jyu-Yu; Sun, Bo-Jhang; Liu, Yan-Wei

    2010-06-01

    This study presents a novel compensation pixel design and driving method for active matrix organic light-emitting diode (AMOLED) displays that use low-temperature polycrystalline silicon thin-film transistors (LTPS-TFTs) with a voltage feed-back method; the design is verified by SPICE simulation. The measurement and simulation of LTPS-TFT characteristics demonstrate a good fit. The proposed circuit consists of four TFTs and two capacitors with an additional signal line. The error rate of OLED anode voltage variation is below 0.3% under a threshold voltage deviation of the driving TFT (ΔVTH = ±0.33 V). The simulation results show that the pixel design can improve display image non-uniformity by compensating for the threshold voltage deviation of the driving TFT and the degradation of the OLED threshold voltage at the same time.

  3. Method and apparatus for monitoring a hydrocarbon-selective catalytic reduction device

    DOEpatents

    Schmieg, Steven J; Viola, Michael B; Cheng, Shi-Wai S; Mulawa, Patricia A; Hilden, David L; Sloane, Thompson M; Lee, Jong H

    2014-05-06

    A method for monitoring a hydrocarbon-selective catalytic reactor device of an exhaust aftertreatment system of an internal combustion engine operating lean of stoichiometry includes injecting a reductant into an exhaust gas feedstream upstream of the hydrocarbon-selective catalytic reactor device at a predetermined mass flowrate of the reductant, and determining a space velocity associated with a predetermined forward portion of the hydrocarbon-selective catalytic reactor device. When the space velocity exceeds a predetermined threshold space velocity, a temperature differential across the predetermined forward portion of the hydrocarbon-selective catalytic reactor device is determined, and a threshold temperature as a function of the space velocity and the mass flowrate of the reductant is determined. If the temperature differential across the predetermined forward portion of the hydrocarbon-selective catalytic reactor device is below the threshold temperature, operation of the engine is controlled to regenerate the hydrocarbon-selective catalytic reactor device.

  4. Rainfall thresholds for possible landslide occurrence in Italy

    NASA Astrophysics Data System (ADS)

    Peruccacci, Silvia; Brunetti, Maria Teresa; Gariano, Stefano Luigi; Melillo, Massimo; Rossi, Mauro; Guzzetti, Fausto

    2017-08-01

    The large physiographic variability and the abundance of landslide and rainfall data make Italy an ideal site to investigate variations in the rainfall conditions that can result in rainfall-induced landslides. We used landslide information obtained from multiple sources and rainfall data captured by 2228 rain gauges to build a catalogue of 2309 rainfall events with - mostly shallow - landslides in Italy between January 1996 and February 2014. For each rainfall event with landslides, we reconstructed the rainfall history that presumably caused the slope failure, and we determined the corresponding rainfall duration D (in hours) and cumulated event rainfall E (in mm). Adopting a power law threshold model, we determined cumulated event rainfall-rainfall duration (ED) thresholds, at 5% exceedance probability, and their uncertainty. We defined a new national threshold for Italy, and 26 regional thresholds for environmental subdivisions based on topography, lithology, land-use, land cover, climate, and meteorology, and we used the thresholds to study the variations of the rainfall conditions that can result in landslides in different environments in Italy. We found that the national and the environmental thresholds cover a small part of the possible (D, E) domain. The finding supports the use of empirical rainfall thresholds for landslide forecasting in Italy, but poses an empirical limitation to the possibility of defining thresholds for small geographical areas. We observed differences between some of the thresholds. With increasing mean annual precipitation (MAP), the thresholds become higher and steeper, indicating that more rainfall is needed to trigger landslides where the MAP is high than where it is low. This suggests that the landscape adjusts to the regional meteorological conditions. We also observed that the thresholds are higher for stronger rocks, and that forested areas require more rainfall than agricultural areas to initiate landslides. Finally, we observed that a 20% exceedance probability national threshold was capable of predicting all the rainfall-induced landslides with casualties between 1996 and 2014, and we suggest that this threshold can be used to forecast fatal rainfall-induced landslides in Italy. We expect the method proposed in this work for defining and comparing thresholds to have an impact on the definition of new rainfall thresholds for possible landslide occurrence in Italy, and elsewhere.
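    The ED threshold idea can be sketched as a power-law curve E = αD^γ placed so that a chosen fraction of the observed landslide-triggering events falls below it. The following is a simplified quantile-shift illustration on synthetic data; the study's frequentist fitting procedure and uncertainty estimation are more involved than this sketch:

    ```python
    import numpy as np

    def ed_threshold(D, E, exceed_prob=0.05):
        """Fit E = alpha * D**gamma in log-log space, then shift the
        line down so only `exceed_prob` of the events fall below it."""
        logD, logE = np.log(D), np.log(E)
        gamma, intercept = np.polyfit(logD, logE, 1)
        resid = logE - (gamma * logD + intercept)
        alpha = float(np.exp(intercept + np.quantile(resid, exceed_prob)))
        return alpha, gamma

    # synthetic rainfall events that triggered landslides
    rng = np.random.default_rng(1)
    D = rng.uniform(1.0, 200.0, 500)                  # duration, hours
    E = 8.0 * D**0.6 * rng.lognormal(0.0, 0.4, 500)   # event rainfall, mm
    alpha, gamma = ed_threshold(D, E)
    frac_below = float(np.mean(E < alpha * D**gamma))  # ~0.05 by design
    ```

    Regional thresholds then amount to repeating the same fit on environmental subsets of the catalogue and comparing the resulting (α, γ) pairs.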

  5. The Relationship between the Behavioral Hearing Thresholds and Maximum Bilirubin Levels at Birth in Children with a History of Neonatal Hyperbilirubinemia

    PubMed Central

    Panahi, Rasool; Jafari, Zahra; Sheibanizade, Abdoreza; Salehi, Masoud; Esteghamati, Abdoreza; Hasani, Sara

    2013-01-01

    Introduction: Neonatal hyperbilirubinemia is one of the most important factors affecting the auditory system and can cause sensorineural hearing loss. This study investigated the relationship between behavioral hearing thresholds in children with a history of jaundice and the maximum level of bilirubin concentration in the blood. Materials and Methods: This study was performed on 18 children with a mean age of 5.6 years and a history of neonatal hyperbilirubinemia. Behavioral hearing thresholds, transient evoked emissions, and brainstem evoked responses were evaluated in all children. Results: Six children (33.3%) had normal hearing thresholds and the remaining 12 (66.7%) had some degree of hearing loss. There was no significant relationship (r=-0.28, P=0.09) between the mean total bilirubin levels and behavioral hearing thresholds across all samples. A transient evoked emission was seen only in children with normal hearing thresholds; in eight cases, brainstem evoked responses were not detected. Conclusion: Increased blood levels of bilirubin in the neonatal period are potentially one of the causes of hearing loss. The lack of a direct relationship between neonatal bilirubin levels and average hearing thresholds emphasizes the necessity of monitoring the various bilirubin levels. PMID:24303432

  6. An Objective Estimation of Air-Bone-Gap in Cochlear Implant Recipients with Residual Hearing Using Electrocochleography.

    PubMed

    Koka, Kanthaiah; Saoji, Aniket A; Attias, Joseph; Litvak, Leonid M

    2017-01-01

    Although cochlear implants (CI) traditionally have been used to treat individuals with bilateral profound sensorineural hearing loss, a recent trend is to implant individuals with residual low-frequency hearing. Notably, many of these individuals demonstrate an air-bone gap (ABG) in low-frequency, pure-tone thresholds following implantation. An ABG is the difference between audiometric thresholds measured using air conduction (AC) and bone conduction (BC) stimulation. Although behavioral AC thresholds are straightforward to assess, BC thresholds can be difficult to measure in individuals with severe-to-profound hearing loss because of vibrotactile responses to high-level, low-frequency stimulation and the potential contribution of hearing in the contralateral ear. Because of these technical barriers to measuring behavioral BC thresholds in implanted patients with residual hearing, it would be helpful to have an objective method for determining the ABG. This study evaluated an innovative technique for measuring electrocochleographic (ECochG) responses using the cochlear microphonic (CM) response to assess AC and BC thresholds in implanted patients with residual hearing. Results showed high correlations between CM thresholds and behavioral audiograms for AC and BC conditions, thereby demonstrating the feasibility of using ECochG as an objective tool for quantifying the ABG in CI recipients.

  7. A Universal Threshold for the Assessment of Load and Output Residuals of Strain-Gage Balance Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2017-01-01

    A new universal residual threshold for the detection of load and gage output residual outliers of wind tunnel strain-gage balance data was developed. The threshold works with both the Iterative and Non-Iterative Methods that are used in the aerospace testing community to analyze and process balance data. It also supports all known load and gage output formats that are traditionally used to describe balance data. The threshold's definition is based on an empirical electrical constant. First, the constant is used to construct a threshold for the assessment of gage output residuals. Then, the related threshold for the assessment of load residuals is obtained by multiplying the empirical electrical constant by the sum of the absolute values of all first partial derivatives of a given load component. The empirical constant equals 2.5 microV/V for the assessment of balance calibration or check load data residuals. A value of 0.5 microV/V is recommended for the evaluation of repeat point residuals because, by design, the calculation of these residuals removes errors that are associated with the regression analysis of the data itself. Data from a calibration of a six-component force balance are used to illustrate the application of the new threshold definitions to real-world balance calibration data.
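    In code form the two-step recipe reads roughly as follows. This is a sketch of the stated rule only; the function names and the partial-derivative values are hypothetical:

    ```python
    def output_threshold(calibration=True):
        """Empirical electrical constant, in microV/V: 2.5 for
        calibration or check-load residuals, 0.5 for repeat-point
        residuals."""
        return 2.5 if calibration else 0.5

    def load_threshold(partials, calibration=True):
        """Load-residual threshold: the constant times the sum of the
        absolute first partial derivatives of the load component."""
        return output_threshold(calibration) * sum(abs(p) for p in partials)

    # hypothetical sensitivities (load units per microV/V) of one load
    # component with respect to the six gage outputs
    partials = [0.41, -0.02, 0.003, 0.07, -0.015, 0.001]
    tol = load_threshold(partials)   # 2.5 * 0.519 = 1.2975 load units
    ```

    Any calibration-point residual larger than `tol` would then be flagged as an outlier for that load component.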

  8. Impact of rainfall spatial variability on Flash Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Douinot, Audrey; Roux, Hélène; Garambois, Pierre-André; Larnier, Kevin

    2014-05-01

    According to the United States National Hazard Statistics database, flooding and flash flooding have caused the largest number of deaths of any weather-related phenomenon over the last 30 years (Flash Flood Guidance Improvement Team, 2003). Like the storms that cause them, flash floods are very variable and non-linear phenomena in time and space, with the result that understanding and anticipating flash flood genesis is far from straightforward. In the U.S., the Flash Flood Guidance (FFG) estimates the average number of inches of rainfall for given durations required to produce flash flooding in the indicated county. In Europe, flash flood often occurred on small catchments (approximately 100 km2) and it has been shown that the spatial variability of rainfall has a great impact on the catchment response (Le Lay and Saulnier, 2007). Therefore, in this study, based on the Flash flood Guidance method, rainfall spatial variability information is introduced in the threshold estimation. As for FFG, the threshold is the number of millimeters of rainfall required to produce a discharge higher than the discharge corresponding to the first level (yellow) warning of the French flood warning service (SCHAPI: Service Central d'Hydrométéorologie et d'Appui à la Prévision des Inondations). The indexes δ1 and δ2 of Zoccatelli et al. (2010), based on the spatial moments of catchment rainfall, are used to characterize the rainfall spatial distribution. Rainfall spatial variability impacts on warning threshold and on hydrological processes are then studied. The spatially distributed hydrological model MARINE (Roux et al., 2011), dedicated to flash flood prediction is forced with synthetic rainfall patterns of different spatial distributions. This allows the determination of a warning threshold diagram: knowing the spatial distribution of the rainfall forecast and therefore the 2 indexes δ1 and δ2, the threshold value is read on the diagram. 
A warning threshold diagram is built for each studied catchment. The proposed methodology is applied on three Mediterranean catchments often submitted to flash floods. The new forecasting method as well as the Flash Flood Guidance method (uniform rainfall threshold) are tested on 25 flash floods events that had occurred on those catchments. Results show a significant impact of rainfall spatial variability. Indeed, it appears that the uniform rainfall threshold (FFG threshold) always overestimates the observed rainfall threshold. The difference between the FFG threshold and the proposed threshold ranges from 8% to 30%. The proposed methodology allows the calculation of a threshold more representative of the observed one. However, results strongly depend on the related event duration and on the catchment properties. For instance, the impact of the rainfall spatial variability seems to be correlated with the catchment size. According to these results, it seems to be interesting to introduce information on the catchment properties in the threshold calculation. Flash Flood Guidance Improvement Team, 2003. River Forecast Center (RFC) Development Management Team. Final Report. Office of Hydrologic Development (OHD), Silver Spring, Mary-land. Le Lay, M. and Saulnier, G.-M., 2007. Exploring the signature of climate and landscape spatial variabilities in flash flood events: Case of the 8-9 September 2002 Cévennes-Vivarais catastrophic event. Geophysical Research Letters, 34(L13401), doi:10.1029/2007GL029746. Roux, H., Labat, D., Garambois, P.-A., Maubourguet, M.-M., Chorda, J. and Dartus, D., 2011. A physically-based parsimonious hydrological model for flash floods in Mediterranean catchments. Nat. Hazards Earth Syst. Sci. J1 - NHESS, 11(9), 2567-2582. Zoccatelli, D., Borga, M., Zanon, F., Antonescu, B. and Stancalie, G., 2010. Which rainfall spatial information for flash flood response modelling? A numerical investigation based on data from the Carpathian range, Romania. 
Journal of Hydrology, 394(1-2), 148-161.

  9. A SVM-based quantitative fMRI method for resting-state functional network detection.

    PubMed

    Song, Xiaomu; Chen, Nan-kuei

    2014-09-01

    Resting-state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in the connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to variation of data characteristics across sessions and subjects, and generates unreliable mapping results. In this study, a new method is presented for resting-state fMRI data analysis. Specifically, the resting-state network mapping is formulated as an outlier detection process that is implemented using a one-class support vector machine (SVM). The results are refined by using a spatial-feature domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of being functionally connected and unconnected, rather than by applying a threshold. Multiple features for resting-state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data. A comparison study was also performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting-state quantitative fMRI studies. Copyright © 2014 Elsevier Inc. All rights reserved.
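    The one-class SVM step can be sketched with scikit-learn's `OneClassSVM` on invented two-feature voxel data; the study's actual feature set, prototype selection, and two-class refinement stage are omitted here:

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(2)
    # invented per-voxel features (e.g. seed correlation, low-frequency
    # power): 900 "unconnected" background voxels, 100 "connected" ones
    background = rng.normal(0.0, 0.1, size=(900, 2))
    connected = rng.normal(0.8, 0.1, size=(100, 2))
    X = np.vstack([background, connected])

    # the one-class SVM models the dominant (unconnected) distribution;
    # voxels labelled -1 (outliers) become candidate network members,
    # so no fixed correlation threshold is ever chosen
    clf = OneClassSVM(nu=0.1, kernel="rbf", gamma="scale").fit(X)
    labels = clf.predict(X)                  # +1 inlier, -1 outlier
    hit_rate = float(np.mean(labels[900:] == -1))
    false_rate = float(np.mean(labels[:900] == -1))
    ```

    Because `nu` bounds the fraction of training points treated as outliers rather than fixing a score cut-off, the boundary adapts to each session's data distribution, which is the motivation given in the abstract.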

  10. Graded-threshold parametric response maps: towards a strategy for adaptive dose painting

    NASA Astrophysics Data System (ADS)

    Lausch, A.; Jensen, N.; Chen, J.; Lee, T. Y.; Lock, M.; Wong, E.

    2014-03-01

    Purpose: To modify the single-threshold parametric response map (ST-PRM) method for predicting treatment outcomes in order to facilitate its use for guidance of adaptive dose painting in intensity-modulated radiotherapy. Methods: Multiple graded thresholds were used to extend the ST-PRM method (Nat. Med. 2009;15(5):572-576) such that the full functional change distribution within tumours could be represented with respect to multiple confidence interval estimates for functional changes in similar healthy tissue. The ST-PRM and graded-threshold PRM (GT-PRM) methods were applied to functional imaging scans of 5 patients treated for hepatocellular carcinoma. Pre- and post-radiotherapy arterial blood flow (ABF) maps were generated from CT-perfusion scans of each patient. ABF maps were rigidly registered based on aligning tumour centres of mass. ST-PRM and GT-PRM analyses were then performed on overlapping tumour regions within the registered ABF maps. Main findings: The ST-PRMs contained many disconnected clusters of voxels classified as having a significant change in function. While this may be useful to predict treatment response, it may pose challenges for identifying boost volumes or for informing dose-painting by numbers strategies. The GT-PRMs included all of the same information as ST-PRMs but also visualized the full tumour functional change distribution. Heterogeneous clusters in the ST-PRMs often became more connected in the GT-PRMs by voxels with similar functional changes. Conclusions: GT-PRMs provided additional information which helped to visualize relationships between significant functional changes identified by ST-PRMs. This may enhance ST-PRM utility for guiding adaptive dose painting.
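    A minimal version of the graded classification can be written with `numpy.digitize`. The confidence-interval half-widths below are hypothetical; in the study they would come from functional changes observed in similar healthy tissue:

    ```python
    import numpy as np

    def graded_prm(delta_abf, ci_half_widths):
        """Assign each voxel a signed grade: 0 inside the narrowest
        healthy-tissue confidence interval, with |grade| rising as the
        functional change clears successively wider intervals."""
        grades = np.digitize(np.abs(delta_abf), np.asarray(ci_half_widths))
        return np.sign(delta_abf) * grades

    # hypothetical post-minus-pre ABF changes and CI half-widths
    delta = np.array([-30.0, -8.0, 2.0, 12.0, 25.0])
    grades = graded_prm(delta, ci_half_widths=[5.0, 15.0, 22.0])
    ```

    A single-threshold PRM corresponds to the special case of one half-width, collapsing the map to grades -1, 0, and +1; the graded map preserves the full change distribution between those extremes.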

  11. Establishing Ion Ratio Thresholds Based on Absolute Peak Area for Absolute Protein Quantification using Protein Cleavage Isotope Dilution Mass Spectrometry

    PubMed Central

    Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.

    2014-01-01

    Quantitative mass spectrometry has become central to the fields of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation have been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. The findings suggest that the absolute ion abundance of any given transition must be evaluated in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3 year period, suggesting that they should be assessed as close as possible to the time at which data are collected for quantification. PMID:25154770
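    The ion abundance ratio bookkeeping is simple to state in code. The transition names, peak areas, and the fixed 10% tolerance below are invented for illustration; the study's point is precisely that such tolerances should depend on the absolute ion abundance rather than being arbitrary:

    ```python
    def relative_abundances(peak_areas):
        """RA of each product-ion transition: its peak area divided by
        the summed area of all monitored product ions."""
        total = sum(peak_areas.values())
        return {ion: area / total for ion, area in peak_areas.items()}

    def contamination_flags(sample_ra, reference_ra, tol=0.10):
        """Flag transitions whose RA deviates from the pure-standard
        RA by more than a fixed tolerance window."""
        return {ion: abs(sample_ra[ion] - reference_ra[ion]) > tol
                for ion in reference_ra}

    reference = relative_abundances({"y7": 5.0e6, "y6": 3.0e6, "y4": 2.0e6})
    sample = relative_abundances({"y7": 4.0e6, "y6": 3.1e6, "y4": 4.5e6})
    flags = contamination_flags(sample, reference)   # y7, y4 flagged
    ```

    Here the inflated y4 area dilutes every other RA, so both y4 and y7 fall outside the window, signalling a contaminated transition.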

  12. The importance of reference materials in doping-control analysis.

    PubMed

    Mackay, Lindsey G; Kazlauskas, Rymantas

    2011-08-01

    Currently a large range of pure substance reference materials are available for calibration of doping-control methods. These materials enable traceability to the International System of Units (SI) for the results generated by World Anti-Doping Agency (WADA)-accredited laboratories. Only a small number of prohibited substances have threshold limits for which quantification is highly important. For these analytes only the highest quality reference materials that are available should be used. Many prohibited substances have no threshold limits and reference materials provide essential identity confirmation. For these reference materials the correct identity is critical and the methods used to assess identity in these cases should be critically evaluated. There is still a lack of certified matrix reference materials to support many aspects of doping analysis. However, in key areas a range of urine matrix materials have been produced for substances with threshold limits, for example 19-norandrosterone and testosterone/epitestosterone (T/E) ratio. These matrix-certified reference materials (CRMs) are an excellent independent means of checking method recovery and bias and will typically be used in method validation and then regularly as quality-control checks. They can be particularly important in the analysis of samples close to threshold limits, in which measurement accuracy becomes critical. Some reference materials for isotope ratio mass spectrometry (IRMS) analysis are available and a matrix material certified for steroid delta values is currently under production. In other new areas, for example the Athlete Biological Passport, peptide hormone testing, designer steroids, and gene doping, reference material needs still need to be thoroughly assessed and prioritised.

  13. The Sensory Difference Threshold of Menthol Odor in Flavored Tobacco Determined by Combining Sensory and Chemical Analysis.

    PubMed

    Krüsemann, Erna J Z; Cremers, Johannes W J M; Visser, Wouter F; Punter, Pieter H; Talhout, Reinskje

    2017-03-01

    Cigarettes are a widely used consumer product, and flavor is an important determinant of their appeal. Cigarettes with strong nontobacco flavors are popular among young people and may facilitate smoking initiation. Discriminating flavors in tobacco is important for regulatory purposes, for instance to set upper limits on the levels of important flavor additives. We provide a simple and fast method to determine the human odor difference threshold for flavor additives in a tobacco matrix, using a combination of chemical and sensory analysis. As an example, the human difference threshold for menthol odor, one of the most frequently used tobacco flavors, was determined. A consumer panel consisting of 20 women compared different concentrations of menthol-flavored tobacco to unflavored cigarette tobacco using the 2-alternative forced choice method. Components contributing to menthol odor were quantified using headspace GC-MS. The sensory difference threshold of menthol odor corresponded to a mixture of 43 (37-50)% menthol-flavored tobacco, containing 1.8 (1.6-2.1) mg menthol, 2.7 (2.3-3.1) µg menthone, and 1.0 (0.9-1.2) µg neomenthyl acetate per gram of tobacco. Such a method is important in the context of the European Tobacco Product Directive and the US Food and Drug Administration Tobacco Control Act, which both prohibit cigarettes and roll-your-own tobacco with a characterizing flavor other than tobacco. Our method can also be adapted for matrices other than tobacco, such as food. © The Author 2016. Published by Oxford University Press.
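A difference threshold from a 2-alternative forced choice task can be estimated roughly as the stimulus level at which the panel answers correctly 75% of the time (midway between 50% chance and perfect performance). The sketch below uses simple linear interpolation under that convention; the mixture levels and proportions correct are made-up numbers, not the study's data, and real analyses would fit a psychometric function instead.

```python
# Hypothetical sketch: interpolating a 2-AFC difference threshold.
# Levels and proportions correct are illustrative, not the study's data.

def threshold_2afc(levels, p_correct, criterion=0.75):
    """Linearly interpolate the stimulus level at the criterion proportion
    correct (0.75 is conventional for 2-AFC, where chance is 0.5)."""
    points = list(zip(levels, p_correct))
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if p0 <= criterion <= p1:
            return x0 + (criterion - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("criterion not bracketed by the data")

# Fraction of menthol-flavored tobacco in the mixture vs. proportion correct.
levels = [0.20, 0.35, 0.50, 0.65]
p_correct = [0.55, 0.65, 0.80, 0.95]

print(round(threshold_2afc(levels, p_correct), 3))  # → 0.45
```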

  14. Cost-effectiveness of different strategies for selecting and treating individuals at increased risk of osteoporosis or osteopenia: a systematic review.

    PubMed

    Müller, Dirk; Pulm, Jannis; Gandjour, Afschin

    2012-01-01

    To compare cost-effectiveness modeling analyses of strategies to prevent osteoporotic and osteopenic fractures, based either on fixed thresholds using bone mineral density alone or on variable thresholds combining bone mineral density with clinical risk factors. A systematic review was performed using the MEDLINE database and reference lists from previous reviews. On the basis of predefined inclusion/exclusion criteria, we identified relevant studies published since January 2006. Articles included in the review were assessed for their methodological quality and results. The literature search yielded 24 analyses, 14 using a fixed-threshold approach and 10 using a variable-threshold approach. On average, 70% of the criteria for methodological quality were fulfilled, but almost half of the analyses did not include medication adherence in the base case. The results of variable-threshold strategies were more homogeneous and showed more favorable incremental cost-effectiveness ratios than those based on a fixed bone mineral density threshold. For analyses with fixed thresholds, incremental cost-effectiveness ratios ranged from €80,000 per quality-adjusted life-year in women aged 55 years to cost saving in women aged 80 years; for analyses with variable thresholds, the range was €47,000 to cost saving. Risk assessment using variable thresholds appears to be more cost-effective than selecting high-risk individuals by fixed thresholds. Although the overall quality of the studies was fairly good, future economic analyses should further improve their methods, particularly by including more fracture types, incorporating medication adherence, and including or discussing unrelated costs during added life-years. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
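The incremental cost-effectiveness ratio (ICER) compared throughout this review is the incremental cost per quality-adjusted life-year (QALY) gained. A minimal sketch follows; the cost and QALY figures are invented for illustration and do not come from the reviewed studies.

```python
# Minimal illustration of an ICER calculation; the numbers are made up.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    """ICER = (incremental cost) / (incremental QALYs gained).
    If the new strategy costs no more and gains QALYs, it is dominant
    (cost saving) and no ratio is reported."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "cost saving (dominant)"
    return d_cost / d_qaly

# Hypothetical screening strategy vs. no screening, per woman.
print(icer(cost_new=12_000, cost_old=10_000, qaly_new=10.05, qaly_old=10.00))
```

With these illustrative figures, €2,000 extra spent for 0.05 extra QALYs gives an ICER of €40,000 per QALY, which would then be judged against a willingness-to-pay threshold.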

  15. Audiometric Notch and Extended High-Frequency Hearing Threshold Shift in Relation to Total Leisure Noise Exposure: An Exploratory Analysis

    PubMed Central

    Wei, Wenjia; Heinze, Stefanie; Gerstner, Doris G.; Walser, Sandra M.; Twardella, Dorothee; Reiter, Christina; Weilnhammer, Veronika; Perez-Alvarez, Carmelo; Steffens, Thomas; Herr, Caroline E.W.

    2017-01-01

    Background: Studies investigating the effect of leisure noise on extended high-frequency hearing are scarce and have produced inconsistent results. The aim of this study was to investigate whether extended high-frequency hearing threshold shift is related to audiometric notch, and whether total leisure noise exposure is associated with extended high-frequency hearing threshold shift. Materials and Methods: A questionnaire from the Ohrkan cohort study was used to collect information on demographics and leisure-time activities. Conventional and extended high-frequency audiometry was performed. We performed logistic regression analyses relating extended high-frequency hearing threshold shift to audiometric notch, and total leisure noise exposure to extended high-frequency hearing threshold shift. Potential confounders (sex, school type, and firecracker use) were included. Results: Data from 278 participants (aged 18–23 years, 53.2% female) were analyzed. Hearing threshold shifts at 10, 11.2, 12.5, and 14 kHz were associated with audiometric notch, with a higher prevalence of threshold shift at these four frequencies than of the notch itself. However, we found no association between total leisure noise exposure and hearing threshold shift at any extended high frequency. Conclusion: This exploratory analysis suggests that while extended high-frequency hearing threshold shifts are not related to total leisure noise exposure, they are strongly associated with audiometric notch. This supports the hypothesis that extended high-frequency threshold shift might indicate the appearance of an audiometric notch at a later time point, which can be investigated in future follow-ups of the Ohrkan cohort. PMID:29319010

  16. Metabolic Tumor Volume and Total Lesion Glycolysis in Oropharyngeal Cancer Treated With Definitive Radiotherapy: Which Threshold Is the Best Predictor of Local Control?

    PubMed

    Castelli, Joël; Depeursinge, Adrien; de Bari, Berardino; Devillers, Anne; de Crevoisier, Renaud; Bourhis, Jean; Prior, John O

    2017-06-01

    In the context of oropharyngeal cancer treated with definitive radiotherapy, the aim of this retrospective study was to identify the best threshold value for computing metabolic tumor volume (MTV) and/or total lesion glycolysis (TLG) to predict local-regional control (LRC) and disease-free survival. One hundred twenty patients with locally advanced oropharyngeal cancer from 2 institutions treated with definitive radiotherapy underwent FDG PET/CT before treatment. Various MTVs and TLGs were defined based on 2 segmentation methods: (i) an absolute SUV threshold (0-20 g/mL) or (ii) a relative threshold as a percentage of SUVmax (0%-100%). The parameters' predictive capabilities for disease-free survival and LRC were assessed using the Harrell C-index and Cox regression models. Relative thresholds between 40% and 68% and absolute thresholds between 5.5 and 7 had similar predictive value for LRC (C-index = 0.65 and 0.64, respectively). MTV had a higher predictive value than gross tumor volume (C-index = 0.61) and SUVmax (C-index = 0.54). MTV computed with a relative threshold of 51% of SUVmax was the best predictor of disease-free survival (hazard ratio, 1.23 [per 10 mL], P = 0.009) and LRC (hazard ratio, 1.22 [per 10 mL], P = 0.02). The use of different thresholds within a reasonable range (between 5.5 and 7 for an absolute threshold and between 40% and 68% for a relative threshold) seems to have no major impact on the predictive value of MTV. This parameter may be used to identify patients at high risk of recurrence who may benefit from treatment intensification.
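The two segmentation rules compared in this abstract can be sketched on a flat list of voxel SUVs. This is an illustrative simplification: the voxel values and voxel volume below are invented, and real MTV computation operates on a segmented 3-D PET volume.

```python
# Hypothetical sketch of the two MTV segmentation rules described above,
# applied to a 1-D list of voxel SUVs. All values are illustrative.

def mtv_absolute(suv, threshold, voxel_ml):
    """MTV from voxels whose SUV meets an absolute cutoff (e.g. 5.5)."""
    return sum(1 for v in suv if v >= threshold) * voxel_ml

def mtv_relative(suv, fraction, voxel_ml):
    """MTV from voxels above a fraction of SUVmax (e.g. 51%)."""
    cutoff = fraction * max(suv)
    return sum(1 for v in suv if v >= cutoff) * voxel_ml

suv = [2.0, 4.5, 6.0, 8.0, 12.0, 15.0]  # illustrative voxel SUVs
voxel_ml = 0.1                          # illustrative voxel volume (mL)

print(mtv_absolute(suv, threshold=5.5, voxel_ml=voxel_ml))   # SUV >= 5.5
print(mtv_relative(suv, fraction=0.51, voxel_ml=voxel_ml))   # >= 51% of SUVmax
```

With these toy numbers the absolute rule keeps 4 voxels and the relative rule keeps 3, showing how the two definitions can select different volumes from the same image.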

  17. Resonances in positron-potassium (e +-K) system with natural and unnatural parities

    NASA Astrophysics Data System (ADS)

    Umair, M.; Jonsell, S.

    2016-01-01

    We present an investigation of resonances with natural and unnatural parities in the positron-potassium system using the complex scaling method. A model potential is used to represent the interaction between the core and the valence electron. Explicitly correlated Gaussian wave functions are used to represent the correlation effects between the valence electron, the positron, and the K+ core. For natural parity, resonance energies and widths for two partial waves (S- and P-wave) below the K(4p, 5s, 5p, 4d, 4f) excitation thresholds and the positronium n = 2 formation threshold are calculated. For unnatural parity, resonance states of P^e symmetry below the K(4d) excitation threshold and the positronium n = 2 and n = 3 formation thresholds are calculated, which have not been previously reported. Below both positronium thresholds we have found a dipole series of resonances, with binding energies scaling in good agreement with expectations from an analytical calculation. The present results are compared with those in the literature.

  18. New Measures of Masked Text Recognition in Relation to Speech-in-Noise Perception and Their Associations with Age and Cognitive Abilities

    ERIC Educational Resources Information Center

    Besser, Jana; Zekveld, Adriana A.; Kramer, Sophia E.; Ronnberg, Jerker; Festen, Joost M.

    2012-01-01

    Purpose: In this research, the authors aimed to increase the analogy between Text Reception Threshold (TRT; Zekveld, George, Kramer, Goverts, & Houtgast, 2007) and Speech Reception Threshold (SRT; Plomp & Mimpen, 1979) and to examine the TRT's value in estimating cognitive abilities that are important for speech comprehension in noise. Method: The…

  19. Auditory Brainstem Response Thresholds to Air- and Bone-Conducted CE-Chirps in Neonates and Adults

    ERIC Educational Resources Information Center

    Cobb, Kensi M.; Stuart, Andrew

    2016-01-01

    Purpose The purpose of this study was to compare auditory brainstem response (ABR) thresholds to air- and bone-conducted CE-Chirps in neonates and adults. Method Thirty-two neonates with no physical or neurologic challenges and 20 adults with normal hearing participated. ABRs were acquired with a starting intensity of 30 dB normal hearing level…

  20. Combining the Finite Element Method with Structural Connectome-based Analysis for Modeling Neurotrauma:Connectome Neurotrauma Mechanics

    DTIC Science & Technology

    2012-08-16

    …death threshold. Using an injury threshold of 18% strain, 161 edges were removed. Watts and Strogatz [66] define the small-world network based on the… (cited works recoverable from this fragment: [65] Latora V, Marchiori M (2001) Efficient behavior of small-world networks. Phys Rev Lett 87: 198701; [66] Watts DJ, Strogatz SH)
