Sample records for established threshold values

  1. The limits of thresholds: silica and the politics of science, 1935 to 1990.

    PubMed Central

    Markowitz, G; Rosner, D

    1995-01-01

    Since the 1930s threshold limit values have been presented as an objectively established measure of US industrial safety. However, there have been important questions raised regarding the adequacy of these thresholds for protecting workers from silicosis. This paper explores the historical debates over silica threshold limit values and the intense political negotiation that accompanied their establishment. In the 1930s and early 1940s, a coalition of business, public health, insurance, and political interests formed in response to a widely perceived "silicosis crisis." Part of the resulting program aimed at containing the crisis was the establishment of threshold limit values. Yet silicosis cases continued to be documented. By the 1960s these cases had become the basis for a number of revisions to the thresholds. In the 1970s, following a National Institute for Occupational Safety and Health recommendation to lower the threshold limit value for silica and to eliminate sand as an abrasive in blasting, industry fought attempts to make the existing values more stringent. This paper traces the process by which threshold limit values became part of a compromise between the health of workers and the economic interests of industry. PMID:7856788

  2. Is "No-Threshold" a "Non-Concept"?

    NASA Astrophysics Data System (ADS)

    Schaeffer, David J.

    1981-11-01

    A controversy prominent in scientific literature that has carried over to newspapers, magazines, and popular books is having serious social and political expressions today: “Is there, or is there not, a threshold below which exposure to a carcinogen will not induce cancer?” The distinction between establishing the existence of this threshold (which is a theoretical question) and its value (which is an experimental one) gets lost in the scientific arguments. Establishing the existence of this threshold has now become a philosophical question (and an emotional one). In this paper I qualitatively outline theoretical reasons why a threshold must exist, discuss experiments which measure thresholds on two chemicals, and describe and apply a statistical method for estimating the threshold value from exposure-response data.

  3. I. RENAL THRESHOLDS FOR HEMOGLOBIN IN DOGS

    PubMed Central

    Lichty, John A.; Havill, William H.; Whipple, George H.

    1932-01-01

    We use the term "renal threshold for hemoglobin" to indicate the smallest amount of hemoglobin which given intravenously will effect the appearance of recognizable hemoglobin in the urine. The initial renal threshold level for dog hemoglobin is established by the methods employed at an average value of 155 mg. hemoglobin per kilo body weight with maximal values of 210 and minimal of 124. Repeated daily injections of hemoglobin will depress this initial renal threshold level on the average 46 per cent with maximal values of 110 and minimal values of 60 mg. hemoglobin per kilo body weight. This minimal or depression threshold is relatively constant if the injections are continued. Rest periods without injections cause a return of the renal threshold for hemoglobin toward the initial threshold levels—recovery threshold level. Injections of hemoglobin below the initial threshold level but above the minimal or depression threshold will eventually reduce the renal threshold for hemoglobin to its depression threshold level. We believe the depression threshold or minimal renal threshold level due to repeated hemoglobin injections is a little above the glomerular threshold which we assume is the base line threshold for hemoglobin. Our reasons for this belief in the glomerular threshold are given above and in the other papers of this series. PMID:19870016

  4. Pressure and cold pain threshold reference values in a large, young adult, pain-free population.

    PubMed

    Waller, Robert; Smith, Anne Julia; O'Sullivan, Peter Bruce; Slater, Helen; Sterling, Michele; McVeigh, Joanne Alexandra; Straker, Leon Melville

    2016-10-01

    Currently there is a lack of large population studies that have investigated pain sensitivity distributions in healthy pain free people. The aims of this study were: (1) to provide sex-specific reference values of pressure and cold pain thresholds in young pain-free adults; (2) to examine the association of potential correlates of pain sensitivity with pain threshold values. This study investigated sex specific pressure and cold pain threshold estimates for young pain free adults aged 21-24 years. A cross-sectional design was utilised using participants (n=617) from the Western Australian Pregnancy Cohort (Raine) Study at the 22-year follow-up. The association of site, sex, height, weight, smoking, health related quality of life, psychological measures and activity with pain threshold values was examined. Pressure pain threshold (lumbar spine, tibialis anterior, neck and dorsal wrist) and cold pain threshold (dorsal wrist) were assessed using standardised quantitative sensory testing protocols. Reference values for pressure pain threshold (four body sites) stratified by sex and site, and cold pain threshold (dorsal wrist) stratified by sex are provided. Statistically significant, independent correlates of increased pressure pain sensitivity measures were site (neck, dorsal wrist), sex (female), higher waist-hip ratio and poorer mental health. Statistically significant, independent correlates of increased cold pain sensitivity measures were, sex (female), poorer mental health and smoking. These data provide the most comprehensive and robust sex specific reference values for pressure pain threshold specific to four body sites and cold pain threshold at the dorsal wrist for young adults aged 21-24 years. Establishing normative values in this young age group is important given that the transition from adolescence to adulthood is a critical temporal period during which trajectories for persistent pain can be established. These data will provide an important research resource to enable more accurate profiling and interpretation of pain sensitivity in clinical pain disorders in young adults. The robust and comprehensive data can assist interpretation of future clinical pain studies and provide further insight into the complex associations of pain sensitivity that can be used in future research. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  5. CHANGES IN THE ANAEROBIC THRESHOLD IN AN ANNUAL CYCLE OF SPORT TRAINING OF YOUNG SOCCER PLAYERS

    PubMed Central

    Andrzejewski, M.; Wieczorek, A.; Barinow-Wojewódzki, A.; Jadczak, Ł.; Adrian, S.; Pietrzak, M.; Wieczorek, S.

    2013-01-01

    The aim of the study was to assess changes in the anaerobic threshold of young soccer players in an annual training cycle. A group of highly trained 15-18 year old players of KKS Lech Poznań were tested. The tests included an annual training macrocycle, and its individual stages resulted from the time structure of the sports training. In order to assess the level of exercise capacities of the players, a field exercise test of increasing intensity was carried out on a soccer pitch. The test made it possible to determine the 4 millimolar lactate threshold (T LA 4 mmol · l-1) on the basis of the lactate concentration in blood [LA], to establish the threshold running speed and the threshold heart rate [HR]. The threshold running speed at the level of the 4 millimolar lactate threshold was established using the two-point form of the equation of a straight line. The obtained indicators of the threshold running speed allowed for precise establishment of effort intensity used in individual training in developing aerobic endurance. In order to test the significance of differences in mean values between four dates of tests, a non-parametric Friedman ANOVA test was used. The significance of differences between consecutive dates of tests was determined using a post-hoc Friedman ANOVA test. The tests showed significant differences in values of selected indicators determined at the anaerobic threshold in various stages of an annual training cycle of young soccer players. The most beneficial changes in terms of the threshold running speed were noted on the fourth date of tests, when the participants had the highest values of 4.01 m · s-1 for older juniors, and 3.80 m · s-1 for younger juniors. This may be indicative of effective application of an individualized programme of training loads and of good preparation of teams for competition in terms of players’ aerobic endurance. PMID:24744480

  6. Changes in the anaerobic threshold in an annual cycle of sport training of young soccer players.

    PubMed

    Sliwowski, R; Andrzejewski, M; Wieczorek, A; Barinow-Wojewódzki, A; Jadczak, L; Adrian, S; Pietrzak, M; Wieczorek, S

    2013-06-01

    The aim of the study was to assess changes in the anaerobic threshold of young soccer players in an annual training cycle. A group of highly trained 15-18 year old players of KKS Lech Poznań were tested. The tests included an annual training macrocycle, and its individual stages resulted from the time structure of the sports training. In order to assess the level of exercise capacities of the players, a field exercise test of increasing intensity was carried out on a soccer pitch. The test made it possible to determine the 4 millimolar lactate threshold (T LA 4 mmol · l(-1)) on the basis of the lactate concentration in blood [LA], to establish the threshold running speed and the threshold heart rate [HR]. The threshold running speed at the level of the 4 millimolar lactate threshold was established using the two-point form of the equation of a straight line. The obtained indicators of the threshold running speed allowed for precise establishment of effort intensity used in individual training in developing aerobic endurance. In order to test the significance of differences in mean values between four dates of tests, a non-parametric Friedman ANOVA test was used. The significance of differences between consecutive dates of tests was determined using a post-hoc Friedman ANOVA test. The tests showed significant differences in values of selected indicators determined at the anaerobic threshold in various stages of an annual training cycle of young soccer players. The most beneficial changes in terms of the threshold running speed were noted on the fourth date of tests, when the participants had the highest values of 4.01 m · s(-1) for older juniors, and 3.80 m · s(-1) for younger juniors. This may be indicative of effective application of an individualized programme of training loads and of good preparation of teams for competition in terms of players' aerobic endurance.
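
    The two records above derive the threshold running speed from the two-point form of a straight line between the test stages that bracket 4 mmol/L blood lactate. The sketch below only illustrates that interpolation and is not the authors' code; the stage speeds and lactate values are hypothetical.

      # Illustrative sketch (not the authors' code): estimating the running speed at a
      # 4 mmol/L blood lactate threshold by linear interpolation between the two test
      # stages that bracket the threshold, i.e. the "two-point form" of a straight line.
      # Stage speeds and lactate values below are hypothetical.

      def threshold_speed(speeds, lactates, threshold=4.0):
          """Interpolate the speed (m/s) at which blood lactate crosses `threshold` (mmol/L)."""
          for (v1, la1), (v2, la2) in zip(zip(speeds, lactates), zip(speeds[1:], lactates[1:])):
              if la1 <= threshold <= la2:
                  # Two-point form of the line through (v1, la1) and (v2, la2), solved for v.
                  return v1 + (threshold - la1) * (v2 - v1) / (la2 - la1)
          raise ValueError("lactate never crosses the threshold within the test range")

      if __name__ == "__main__":
          stage_speeds = [2.8, 3.2, 3.6, 4.0, 4.4]    # m/s, hypothetical test stages
          stage_lactate = [1.6, 2.1, 3.0, 4.6, 7.2]   # mmol/L, hypothetical samples
          print(round(threshold_speed(stage_speeds, stage_lactate), 2))  # ~3.85 m/s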

  7. [The new German general threshold limit value for dust--pro and contra the adoption in Austria].

    PubMed

    Godnic-Cvar, Jasminka; Ponocny, Ivo

    2004-01-01

    Since it has been realised that inhalation of inert dust is one of the important confounding variables for the development of chronic bronchitis, the threshold values for occupational exposure to these dusts need to be further decreased. The German Commission for the Investigation of Health Hazards of Chemical Compounds in the Work Area (MAK-Commission) set a new threshold (MAK-Value) for inert dusts (4 mg/m3 for inhalable dust, 1.5 mg/m3 for respirable dust) in 1997. This value is much lower than the threshold values currently used world-wide. The aim of the present article is to assess the scientific plausibility of the methodology (databases and statistics) used to set these new German MAK-Values, regarding their adoption in Austria. Although we believe that it is essential to lower the MAK-Value for inert dust in order to prevent the development of chronic bronchitis as a consequence of occupational exposure to inert dusts, the methodology used by the German MAK-Commission in 1997 to set the new MAK-Values does not justify the reduction of the threshold limit value. A carefully designed study to establish an appropriate scientific basis for setting a new threshold value for inert dusts in the workplace should be carried out. Meanwhile, at least the currently internationally applied threshold values should be adopted in Austria.

  8. 48 CFR 8.405-6 - Limiting sources.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... or BPA with an estimated value exceeding the micro-purchase threshold not placed or established in... Schedule ordering procedures. The original order or BPA must not have been previously issued under sole... order or BPA exceeding the simplified acquisition threshold. (2) Posting. (i) Within 14 days after...

  9. 48 CFR 8.405-6 - Limiting sources.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... or BPA with an estimated value exceeding the micro-purchase threshold not placed or established in... Schedule ordering procedures. The original order or BPA must not have been previously issued under sole... order or BPA exceeding the simplified acquisition threshold. (2) Posting. (i) Within 14 days after...

  10. 48 CFR 8.405-6 - Limiting sources.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... or BPA with an estimated value exceeding the micro-purchase threshold not placed or established in... Schedule ordering procedures. The original order or BPA must not have been previously issued under sole... order or BPA exceeding the simplified acquisition threshold. (2) Posting. (i) Within 14 days after...

  11. 48 CFR 8.405-6 - Limiting sources.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... or BPA with an estimated value exceeding the micro-purchase threshold not placed or established in... Schedule ordering procedures. The original order or BPA must not have been previously issued under sole... order or BPA exceeding the simplified acquisition threshold. (2) Posting. (i) Within 14 days after...

  12. Lead toxicity thresholds in 17 Chinese soils based on substrate-induced nitrification assay.

    PubMed

    Li, Ji; Huang, Yizong; Hu, Ying; Jin, Shulan; Bao, Qiongli; Wang, Fei; Xiang, Meng; Xie, Huiting

    2016-06-01

    The influence of soil properties on toxicity threshold values for Pb toward soil microbial processes is poorly recognized. The impact of leaching on the Pb threshold has not been assessed systematically. Lead toxicity was screened in 17 Chinese soils using a substrate-induced nitrification (SIN) assay under both leached and unleached conditions. The effective concentration of added Pb causing 50% inhibition (EC50) ranged from 185 to >2515mg/kg soil for leached soil and 130 to >2490mg/kg soil for unleached soil. These results represented >13- and >19-fold variations among leached and unleached soils, respectively. Leaching significantly reduced Pb toxicity for 70% of both alkaline and acidic soils tested, with an average leaching factor of 3.0. Soil pH and CEC were the two most useful predictors of Pb toxicity in soils, explaining over 90% of variance in the unleached EC50 value. The relationships established in the present study predicted Pb toxicity within a factor of two of measured values. These relationships between Pb toxicity and soil properties could be used to establish site-specific guidance on Pb toxicity thresholds. Copyright © 2016. Published by Elsevier B.V.
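
    As an illustration of the kind of relationship the abstract describes (soil pH and CEC as predictors of log EC50, judged by a within-a-factor-of-two criterion), the sketch below fits a log-linear model to hypothetical soil data; the coefficients and data are invented and are not those reported in the study.

      # Illustrative sketch only: a log-linear regression of the general form used in such
      # soil toxicity studies, log10(EC50) ~ b0 + b1*pH + b2*log10(CEC). The coefficients
      # are fitted to hypothetical data; they are NOT the values from the paper.
      import numpy as np

      # Hypothetical training data: (pH, CEC in cmol/kg, measured EC50 in mg Pb/kg).
      soils = np.array([
          [5.1,  8.0,  180.0],
          [6.0, 12.0,  420.0],
          [6.8, 18.0,  900.0],
          [7.5, 25.0, 1700.0],
          [8.1, 30.0, 2400.0],
      ])
      X = np.column_stack([np.ones(len(soils)), soils[:, 0], np.log10(soils[:, 1])])
      y = np.log10(soils[:, 2])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)

      def predict_ec50(ph, cec):
          """Predict an EC50 (mg Pb/kg) for a soil from its pH and CEC (hypothetical model)."""
          return 10 ** (coef[0] + coef[1] * ph + coef[2] * np.log10(cec))

      print(f"predicted EC50 ~ {predict_ec50(6.5, 15.0):.0f} mg/kg")
      # A "within a factor of two" check, as used to judge such relationships:
      print(all(0.5 <= predict_ec50(ph, cec) / ec50 <= 2.0 for ph, cec, ec50 in soils))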

  13. Investigating Over Critical Thresholds of Forest Megafires Danger Conditions in Europe Utilising the ECMWF ERA-Interim Reanalysis

    NASA Astrophysics Data System (ADS)

    Petroliagkis, Thomas I.; Camia, Andrea; Liberta, Giorgio; Durrant, Tracy; Pappenberger, Florian; San-Miguel-Ayanz, Jesus

    2014-05-01

    The European Forest Fire Information System (EFFIS) has been established by the Joint Research Centre (JRC) and the Directorate General for Environment (DG ENV) of the European Commission (EC) to support the services in charge of the protection of forests against fires in the EU and neighbour countries, and also to provide the EC services and the European Parliament with information on forest fires in Europe. Within its applications, EFFIS provides current and forecast meteorological fire danger maps up to 6 days. Weather plays a key role in affecting wildfire occurrence and behaviour. Meteorological parameters can be used to derive meteorological fire weather indices that provide estimations of fire danger level at a given time over a specified area of interest. In this work, we investigate the suitability of critical thresholds of fire danger to provide an early warning for megafires (fires > 500 ha) over Europe. Past trends of fire danger are analysed by computing daily fire danger from weather data taken from re-analysis fields for a period of 31 years (1980 to 2010). Re-analysis global data sets coming from the construction of high-quality climate records, which combine past observations collected from many different observing and measuring platforms, are capable of describing how Fire Danger Indices have evolved over time at a global scale. The latest and most updated ERA-Interim dataset of the European Centre for Medium-Range Weather Forecasts (ECMWF) was used to extract meteorological variables needed to compute daily values of the Canadian Fire Weather Index (CFWI) over Europe, with a horizontal resolution of about 75x75 km. Daily time series of CFWI were constructed and analysed over a total of 1,071 European NUTS3 centroids, resulting in a set of percentiles and critical thresholds. Such percentiles could be used as thresholds to help fire services establish a measure of the significance of CFWI outputs as they relate to levels of fire potential, fuel conditions and fire danger. Median percentile values of fire days accumulated over the 31-year period were compared to median values of all days from that period. As expected, the CFWI time series exhibit different values on fire days than on all days. In addition, a percentile analysis was performed in order to determine the behaviour of index values corresponding to fire events falling into the megafire category. This analysis resulted in a set of critical thresholds based on percentiles. By utilising such thresholds, an initial framework of an early warning system has been established. By lowering the value of any of these thresholds, the number of hits could be increased until all extremes were captured (resulting in zero misses). However, in doing so, the number of false alarms tends to increase significantly. Consequently, an optimal trade-off between hits and false alarms has to be established when setting different (critical) CFWI thresholds.
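
    The percentile-threshold idea in this abstract can be illustrated with a small sketch: derive a candidate warning threshold as a high percentile of a daily index series and tabulate hits, misses and false alarms as the threshold is varied. The data below are synthetic, not ERA-Interim or CFWI values.

      # Illustrative sketch (synthetic data, not ERA-Interim/CFWI): deriving a
      # percentile-based danger threshold from a daily index series and tabulating the
      # hit / miss / false-alarm trade-off described in the abstract.
      import numpy as np

      rng = np.random.default_rng(0)
      fwi = rng.gamma(shape=2.0, scale=10.0, size=365 * 31)               # daily index values
      event_day = rng.random(fwi.size) < 0.001 * (fwi / fwi.max())        # rare events, likelier on high-index days

      def contingency(threshold):
          warn = fwi >= threshold
          hits = int(np.sum(warn & event_day))
          misses = int(np.sum(~warn & event_day))
          false_alarms = int(np.sum(warn & ~event_day))
          return hits, misses, false_alarms

      for pct in (90, 95, 98):
          thr = np.percentile(fwi, pct)
          print(pct, "th percentile:", round(thr, 1), contingency(thr))
      # Lowering the percentile captures more events (fewer misses) at the cost of many
      # more false alarms, which is the trade-off the abstract highlights.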

  14. 78 FR 38074 - Announcement Regarding a Change in Eligibility for Unemployment Insurance (UI) Claimants in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-25

    ...Announcement regarding a change in eligibility for Unemployment Insurance (UI) claimants in Alabama, Alaska, Delaware, Illinois, Louisiana, Michigan, Mississippi, Ohio, the Virgin Islands and Wisconsin in the Emergency Unemployment Compensation (EUC08) program, and the Federal-State Extended Benefits (EB) program. The U.S. Department of Labor (Department) produces trigger notices indicating which states qualify for both EB and EUC08 benefits, and provides the beginning and ending dates of payable periods for each qualifying state. The trigger notices covering state eligibility for these programs can be found at: http://ows.doleta.gov/unemploy/claims-- arch.asp. The following changes have occurred since the publication of the last notice regarding states EUC08 and EB trigger status: Alabama's trigger value had fallen below the 7.0% threshold and has triggered ``off'' Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted total unemployment rate (TUR) in Alabama was 6.9%, falling below the 7.0% trigger threshold necessary to remain ``on'' Tier 3 of EUC08. The week ending April 13, 2013, was the last week in which EUC08 claimants in Alabama could exhaust Tier 2 and establish Tier 3 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had for Tier 3 after April 13, 2013. Alaska's insured unemployment rate (IUR) has fallen below the 6.0% trigger threshold and has triggered ``off'' of EB. Based on data from Alaska for the week ending April 13, 2013, the 13 week IUR in Alaska fell below the 6.0% trigger threshold necessary to remain ``on'' EB. The payable period in EB for Alaska ended May 4, 2013. Alaska's IUR has fallen below the 6.0% trigger threshold and has triggered ``off'' Tier 4 of EUC08. Based on data from Alaska for the week ending April 13, 2013, the 13 week IUR in Alaska fell below the 6.0% trigger rate threshold to remain ``on'' Tier 4 of EUC08. The week ending May 4, 2013, was the last week in which EUC08 claimants in Alaska could exhaust Tier 3, and establish Tier 4 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had for Tier 4 after May 4, 2013. Delaware's trigger value exceeds the 7.0% trigger threshold and has triggered ``on'' Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted TUR in Delaware was 7.1%, exceeding the 7.0% threshold necessary to trigger ``on'' Tier 3 of EUC08. The week beginning April 7, 2013, was the first week in which EUC08 claimants in Delaware who had exhausted Tier 2, and are otherwise eligible, could establish Tier 3 eligibility. Illinois' trigger value met the 9.0% trigger threshold and has triggered ``on'' Tier 4 of EUC08. Based on data released by the Bureau of Labor Statistics on March 29, 2013, the three month average, seasonally adjusted TUR in Illinois met the 9.0% trigger threshold to trigger ``on'' Tier 4 of EUC08. The week beginning April 14, 2013, was the first week in which EUC08 claimants in Illinois who had exhausted Tier 3, and were otherwise eligible, could establish Tier 4 eligibility. Louisiana's trigger value has fallen below the 6.0% trigger threshold and has triggered ``off'' Tier 2 of EUC08. 
Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted TUR in Louisiana was 5.8%, falling below the 6.0% trigger threshold to remain ``on'' Tier 2 of EUC08. The week ending April 13, 2013, was the last week in which EUC08 claimants in Louisiana could exhaust Tier 1, and establish Tier 2 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had in Tier 2 after April 13, 2013. Michigan's trigger value has fallen below the 9.0% trigger threshold and has triggered ``off'' Tier 4 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted TUR for Michigan was 8.9%, falling below the 9.0% trigger threshold to remain ``on'' Tier 4 of EUC08. The week ending April 13, 2013, was the last week in which EUC08 claimants in Michigan could exhaust Tier 3, and establish Tier 4 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had in Tier 4 after April 13, 2013. Mississippi's trigger value exceeds the 9.0% trigger threshold and has triggered ``on'' Tier 4 of EUC08. Based on data released by the Bureau of Labor Statistics on March 29, 2013, the three month average, seasonally adjusted TUR in Mississippi was 9.3%, exceeding the 9.0% trigger threshold to trigger ``on'' Tier 4 of EUC08. The week beginning April 14, 2013, was the first week in which EUC08 claimants in Mississippi who had exhausted Tier 3, and are otherwise eligible, could establish Tier 4 eligibility. Ohio's trigger value met the 7.0% trigger threshold and has triggered ``on'' Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on April 19, 2013, the three month average, seasonally adjusted total unemployment rate in Ohio had met 7.0% trigger threshold to trigger ``on'' in Tier 3 of EUC08. The week beginning May 5, 2013, was the first week in which EUC08 claimants in Ohio who had exhausted Tier 2, and were otherwise eligible, could establish Tier 3 eligibility. The Virgin Islands' estimated trigger rate fell below the 6.0% threshold and has triggered ``off'' both Tier 2 and Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on March 8, 2013, the estimated three month average, seasonally adjusted TUR in the Virgin Islands fell below the 6.0% trigger threshold rate to remain ``on'' both Tier 2 and Tier 3 of EUC08. That triggered the Virgin Islands off both Tier 2 and Tier 3 of EUC08. The week ending March, 30 2013, was the last week in which EUC08 claimants in the Virgin Islands could exhaust Tier 1 and establish Tier 2 eligibility, or exhaust Tier 2 and establish Tier 3 eligibility. Wisconsin's trigger value met the 7.0% threshold and has triggered ``on'' Tier 3 of EUC08, however mandatory 13 week ``off'' period delayed effective date. Based on data released by the Bureau of Labor Statistics on April 19, 2013, the three month average, seasonally adjusted TUR for Wisconsin has met the 7.0% trigger rate threshold to trigger ``on'' Tier 3 of EUC08. However, Wisconsin was in a 13 week mandatory ``off'' period that started February 9, 2013, and did not conclude until May 11, 2013. As a result, Wisconsin remained in an ``off'' period for Tier 3 of EUC08 through May 11, 2013, and triggered ``on'' Tier 3 of EUC08 effective May 12, 2013. 
The week beginning May 12, 2013, was the first week in which EUC08 claimants in Wisconsin who have exhausted Tier 2, and are otherwise eligible, can establish Tier 3 eligibility.

  15. Could ecological thresholds of toxicological concern (eco-TTCs) be used to support development of ambient water quality criteria?

    EPA Science Inventory

    The Threshold of Toxicologic Concern (TTC) is an approach used for decades in human hazard assessment. A TTC establishes an exposure level for a chemical below which no appreciable risk to human health is expected based upon a de minimis value for toxicity identified for many ...

  16. Establishing Ion Ratio Thresholds Based on Absolute Peak Area for Absolute Protein Quantification using Protein Cleavage Isotope Dilution Mass Spectrometry

    PubMed Central

    Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.

    2014-01-01

    Quantitative mass spectrometry has become central to the field of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation has been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3 year period, suggesting that these values should be assessed as close as possible to the time at which data is collected for quantification. PMID:25154770
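
    A minimal sketch of the relative-abundance check described above, assuming hypothetical product-ion peak areas and reference ratios: the RA comparison is only attempted once the summed absolute abundance of the transition exceeds a minimum value, mirroring the paper's finding that RA thresholds are only informative above an absolute ion abundance threshold.

      # Illustrative sketch (hypothetical numbers, not the authors' assay): flagging a
      # possibly contaminated SRM transition by comparing each product ion's relative
      # abundance (RA) with a reference RA, but only once the absolute summed abundance
      # exceeds a minimum value below which RA is too noisy to evaluate.

      def check_transitions(peak_areas, reference_ra, ra_tolerance=0.20, min_total_area=5e4):
          """peak_areas / reference_ra map product-ion name -> measured area / expected RA."""
          total = sum(peak_areas.values())
          if total < min_total_area:
              return "below absolute abundance threshold - RA check not reliable"
          flagged = []
          for ion, area in peak_areas.items():
              ra = area / total
              expected = reference_ra[ion]
              if abs(ra - expected) > ra_tolerance * expected:
                  flagged.append(ion)
          return flagged or "all product ions within RA tolerance"

      areas = {"y4": 3.1e5, "y5": 1.9e5, "y6": 0.9e5}     # hypothetical peak areas
      reference = {"y4": 0.52, "y5": 0.33, "y6": 0.15}    # hypothetical reference RAs
      print(check_transitions(areas, reference))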

  17. Otoliths - Accelerometer and seismometer; Implications in Vestibular Evoked Myogenic Potential (VEMP).

    PubMed

    Grant, Wally; Curthoys, Ian

    2017-09-01

    Vestibular otolithic organs are recognized as transducers of head acceleration and they function as such up to their corner frequency or undamped natural frequency. It is well recognized that these organs respond to frequencies above their corner frequency up to the 2-3 kHz range (Curthoys et al., 2016). A mechanics model for the transduction of these organs is developed that predicts the response below the undamped natural frequency as an accelerometer and above that frequency as a seismometer. The model is converted to a transfer function using hair cell bundle deflection. Measured threshold acceleration stimuli are used along with threshold deflections for threshold transfer function values. These are compared to model predicted values, both below and above their undamped natural frequency. Threshold deflection values are adjusted to match the model transfer function. The resulting threshold deflection values were well within measured threshold bundle deflection ranges. Vestibular Evoked Myogenic Potentials (VEMPs) today routinely use stimulus frequencies of 500 and 1000 Hz, and otoliths have been established incontrovertibly by clinical and neural evidence as the stimulus source. The mechanism for stimuli at these frequencies above the undamped natural frequency of otoliths is presented, in which otoliths utilize a seismometer mode of response for VEMP transduction. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Relation between the structural parameters of metallic glasses at the onset crystallization temperatures and threshold values of the effective diffusion coefficients

    NASA Astrophysics Data System (ADS)

    Tkatch, V. I.; Svyrydova, K. A.; Vasiliev, S. V.; Kovalenko, O. V.

    2017-08-01

    Using the results of differential scanning calorimetry and X-ray diffractometry, an analysis has been carried out of the initial stages of the eutectic and primary mechanisms of crystallization of a series of metallic glasses based on Fe and Al with the established temperature dependences of the effective diffusion coefficients. Analytical relationships, which relate the volume density of crystallites formed in the glasses at the temperatures of the onset of crystallization with the values of the effective diffusion coefficients at these temperatures have been proposed. It has been established that, in the glasses, the crystallization of which begins at the lower boundary of the threshold values of the effective diffusion coefficients ( 10-20 m2/s), structures are formed with the volume density of crystallites on the order of 1023-1024 m-3 and, at the upper boundary (10-18 m2/s), of the order of 1018 and 1020 m-3 in the glasses that are crystallized via the eutectic and primary mechanisms, respectively. Good agreement between the calculated and experimental estimates indicates that the threshold values of the effective diffusion coefficients are the main factors that determine the structure of glasses at the initial stages of crystallization.

  19. A Quantitative Approach to Distinguish Pneumonia From Atelectasis Using Computed Tomography Attenuation.

    PubMed

    Edwards, Rachael M; Godwin, J David; Hippe, Dan S; Kicska, Gregory

    2016-01-01

    It is known that atelectasis demonstrates greater contrast enhancement than pneumonia on computed tomography (CT). However, the effectiveness of using a Hounsfield unit (HU) threshold to distinguish pneumonia from atelectasis has never been shown. The objective of the study is to demonstrate that an HU threshold can be quantitatively used to effectively distinguish pneumonia from atelectasis. Retrospectively identified CT pulmonary angiogram examinations that did not show pulmonary embolism but contained nonaerated lungs were classified as atelectasis or pneumonia based on established clinical criteria. The HU attenuation was measured in these nonaerated lungs. Receiver operating characteristic (ROC) analysis was performed to determine the area under the ROC curve, sensitivity, and specificity of using the attenuation to distinguish pneumonia from atelectasis. Sixty-eight nonaerated lungs were measured in 55 patients. The mean (SD) enhancement was 62 (18) HU in pneumonia and 119 (24) HU in atelectasis (P < 0.001). A threshold of 92 HU diagnosed pneumonia with 97% sensitivity (confidence interval [CI], 80%-99%) and 85% specificity (CI, 70-93). Accuracy, measured as area under the ROC curve, was 0.97 (CI, 0.89-0.99). We have established that a threshold HU value can be used to confidently distinguish pneumonia from atelectasis with our standard CT pulmonary angiogram imaging protocol and patient population. This suggests that a similar threshold HU value may be determined for other scanning protocols, and application of this threshold may facilitate a more confident diagnosis of pneumonia and thus speed treatment.
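
    The following sketch illustrates how a single Hounsfield-unit threshold of the kind reported above (92 HU) yields sensitivity and specificity figures; the attenuation values and labels are invented for the example and are not the study data.

      # Illustrative sketch (synthetic attenuation values, not the study data): applying a
      # Hounsfield-unit threshold to call pneumonia vs. atelectasis in enhanced, nonaerated
      # lung, and computing the sensitivity and specificity of that threshold.

      def classify(hu, threshold=92.0):
          """Below the threshold -> pneumonia (weak enhancement); at/above -> atelectasis."""
          return "pneumonia" if hu < threshold else "atelectasis"

      # (measured HU, true label) pairs - hypothetical examples
      cases = [(55, "pneumonia"), (70, "pneumonia"), (88, "pneumonia"),
               (101, "atelectasis"), (125, "atelectasis"), (140, "atelectasis"), (90, "atelectasis")]

      tp = sum(1 for hu, lab in cases if lab == "pneumonia" and classify(hu) == "pneumonia")
      fn = sum(1 for hu, lab in cases if lab == "pneumonia" and classify(hu) != "pneumonia")
      tn = sum(1 for hu, lab in cases if lab == "atelectasis" and classify(hu) == "atelectasis")
      fp = sum(1 for hu, lab in cases if lab == "atelectasis" and classify(hu) != "atelectasis")

      print("sensitivity:", tp / (tp + fn))   # fraction of pneumonia correctly called
      print("specificity:", tn / (tn + fp))   # fraction of atelectasis correctly called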

  20. 32 CFR 3.8 - DoD access to records policy.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to use an IPA: the business unit's name, address and the expected value of its award. When the clause... business unit that will perform the OT agreement, or a subawardee, meets the criteria for an audit pursuant... paragraph (c) of this section. The value establishing the threshold is the total value of the agreement...

  21. 32 CFR 3.8 - DoD access to records policy.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to use an IPA: the business unit's name, address and the expected value of its award. When the clause... business unit that will perform the OT agreement, or a subawardee, meets the criteria for an audit pursuant... paragraph (c) of this section. The value establishing the threshold is the total value of the agreement...

  22. Economic evaluation and cost-effectiveness thresholds: signals to firms and implications for R & D investment and innovation.

    PubMed

    Vernon, John A; Goldberg, Robert; Golec, Joseph

    2009-01-01

    In this article we describe how reimbursement cost-effectiveness thresholds, per unit of health benefit, whether set explicitly or observed implicitly via historical reimbursement decisions, serve as a signal to firms about the commercial viability of their R&D projects (including candidate products for in-licensing). Traditional finance methods for R&D project valuations, such as net present value analyses (NPV), incorporate information from these payer reimbursement signals to help determine which R&D projects should be continued and which should be terminated (in the case of the latter because they yield an NPV < 0). Because the influence these signals have for firm R&D investment decisions is so significant, we argue that it is important for reimbursement thresholds to reflect the economic value of the unit of health benefit being considered for reimbursement. Thresholds set too low (below the economic value of the health benefit) will result in R&D investment levels that are too low relative to the economic value of R&D (on the margin). Similarly, thresholds set too high (above the economic value of the health benefit) will result in inefficiently high levels of R&D spending. The US in particular, which represents approximately half of the global pharmaceutical market (based on sales), and which seems poised to begin undertaking cost effectiveness in a systematic way, needs to exert caution in setting policies that explicitly or implicitly establish cost-effectiveness reimbursement thresholds for healthcare products and technologies, such as pharmaceuticals.

  23. Evaluating links between forest harvest and stream temperature threshold exceedances: the value of spatial and temporal data

    Treesearch

    Jeremiah D. Groom; Sherri L. Johnson; Joshua D. Seeds; George G. Ice

    2017-01-01

    We present the results of a replicated before-after-control-impact study on 33 streams to test the effectiveness of riparian rules for private and State forests at meeting temperature criteria in streams in western Oregon. Many states have established regulatory temperature thresholds, referred to as numeric criteria, to protect cold-water fishes such as salmon and...

  24. Existence of infinitely many stationary solutions of the L2-subcritical and critical NLSE on compact metric graphs

    NASA Astrophysics Data System (ADS)

    Dovetta, Simone

    2018-04-01

    We investigate the existence of stationary solutions for the nonlinear Schrödinger equation on compact metric graphs. In the L2-subcritical setting, we prove the existence of an infinite number of such solutions, for every value of the mass. In the critical regime, the existence of infinitely many solutions is established if the mass is lower than a threshold value, while global minimizers of the NLS energy exist if and only if the mass is lower or equal to the threshold. Moreover, the relation between this threshold and the topology of the graph is characterized. The investigation is based on variational techniques and some new versions of Gagliardo-Nirenberg inequalities.

  25. Development of an epiphyte indicator of nutrient enrichment ...

    EPA Pesticide Factsheets

    Metrics of epiphyte load on macrophytes were evaluated for use as quantitative biological indicators for nutrient impacts in estuarine waters, based on review and analysis of the literature on epiphytes and macrophytes, primarily seagrasses, but including some brackish and freshwater rooted macrophyte species. An approach is presented that empirically derives threshold epiphyte loads which are likely to cause specified levels of decrease in macrophyte response metrics such as biomass, shoot density, percent cover, production and growth. Data from 36 studies of 10 macrophyte species were pooled to derive relationships between epiphyte load and -25 and -50% seagrass response levels, which are proposed as the primary basis for establishment of critical threshold values. Given multiple sources of variability in the response data, threshold ranges based on the range of values falling between the median and the 75th quantiles of observations at a given seagrass response level are proposed rather than single, critical point values. Four epiphyte load threshold categories - low, moderate, high, very high - are proposed. Comparison of values of epiphyte loads associated with 25 and 50% reductions in light to macrophytes suggests that the threshold ranges are realistic both in terms of the principal mechanism of impact to macrophytes and in terms of the magnitude of resultant impacts expressed by the macrophytes. Some variability in response levels was observed among
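
    A minimal sketch of the quantile-based threshold-range idea described above, using invented epiphyte-load observations: the proposed range runs from the median to the 75th percentile of loads associated with a given level of seagrass response.

      # Illustrative sketch (synthetic data): deriving a threshold *range* for epiphyte load
      # from the median to the 75th percentile of observations associated with a given level
      # of seagrass response, rather than a single critical point value.
      import numpy as np

      # Hypothetical epiphyte loads (g epiphyte / g seagrass) observed in studies where the
      # seagrass response (e.g. biomass) declined by ~25%.
      loads_at_25pct_decline = np.array([0.4, 0.7, 0.9, 1.1, 1.3, 1.6, 2.0, 2.4, 3.1])

      low, high = np.percentile(loads_at_25pct_decline, [50, 75])
      print(f"proposed threshold range for a 25% response: {low:.2f}-{high:.2f} g/g")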

  26. A new function for estimating local rainfall thresholds for landslide triggering

    NASA Astrophysics Data System (ADS)

    Cepeda, J.; Nadim, F.; Høeg, K.; Elverhøi, A.

    2009-04-01

    The widely used power law for establishing rainfall thresholds for triggering of landslides was first proposed by N. Caine in 1980. The most updated global thresholds presented by F. Guzzetti and co-workers in 2008 were derived using Caine's power law and a rigorous and comprehensive collection of global data. Caine's function is defined as I = α×D^β, where I and D are the mean intensity and total duration of rainfall, and α and β are parameters estimated for a lower boundary curve to most or all the positive observations (i.e., landslide triggering rainfall events). This function does not account for the effect of antecedent precipitation as a conditioning factor for slope instability, an approach that may be adequate for global or regional thresholds that include landslides in surface geologies with a wide range of subsurface drainage conditions and pore-pressure responses to sustained rainfall. However, at a local scale and in geological settings dominated by a narrow range of drainage conditions and behaviours of pore-pressure response, the inclusion of antecedent precipitation in the definition of thresholds becomes necessary in order to ensure their optimum performance, especially when used as part of early warning systems (i.e., false alarms and missed events must be kept to a minimum). Some authors have incorporated the effect of antecedent rainfall in a discrete manner by first comparing the accumulated precipitation during a specified number of days against a reference value and then using a Caine's function threshold only when that reference value is exceeded. Other authors have instead calculated threshold values as linear combinations of several triggering and antecedent parameters. The present study aims to propose a new threshold function based on a generalisation of Caine's power law. The proposed function has the form I = (α1×An^α2)×D^β, where I and D are defined as previously. The expression in parentheses is equivalent to Caine's α parameter. α1, α2 and β are parameters estimated for the threshold. An is the n-days cumulative rainfall. The suggested procedure to estimate the threshold is as follows: (1) Given N storms, assign one of the following flags to each storm: nL (non-triggering storms), yL (triggering storms), uL (uncertain-triggering storms). Successful predictions correspond to nL and yL storms occurring below and above the threshold, respectively. Storms flagged as uL are actually assigned either an nL or yL flag using a randomization procedure. (2) Establish a set of values of ni (e.g. 1, 4, 7, 10, 15 days, etc.) to test for accumulated precipitation. (3) For each storm and each ni value, obtain the antecedent accumulated precipitation in ni days Ani. (4) Generate a 3D grid of values of α1, α2 and β. (5) For a certain value of ni, generate confusion matrices for the N storms at each grid point and estimate an evaluation metrics parameter EMP (e.g., accuracy, specificity, etc.). (6) Repeat the previous step for the whole set of ni values. (7) From the 3D grid corresponding to each ni value, search for the optimum grid point EMPopti (global minimum or maximum of the parameter). (8) Search for the optimum value of ni in the space ni vs EMPopti. (9) The threshold is defined by the value of ni obtained in the previous step and the corresponding values of α1, α2 and β.
The procedure is illustrated using rainfall data and landslide observations from the San Salvador volcano, where a rainfall-triggered debris flow destroyed a neighbourhood in the capital city of El Salvador in 19 September, 1982, killing not less than 300 people.
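
    The grid-search procedure in steps (4)-(9) can be sketched as follows; the storm data, candidate parameter values and the use of accuracy as the evaluation metric are all assumptions for illustration, not the authors' calibration.

      # Illustrative sketch (synthetic storms, coarse grids): the kind of grid search the
      # abstract outlines for the generalised threshold I = (a1 * An**a2) * D**beta, scoring
      # each (a1, a2, beta) candidate with a confusion-matrix metric such as accuracy.
      import itertools
      import random

      random.seed(1)
      # Each storm: mean intensity I (mm/h), duration D (h), antecedent n-day rainfall An (mm),
      # and whether it triggered a landslide. Values are hypothetical.
      storms = [(random.uniform(1, 30), random.uniform(1, 48), random.uniform(1, 200),
                 random.random() < 0.3) for _ in range(200)]

      def accuracy(a1, a2, beta):
          correct = 0
          for I, D, An, triggered in storms:
              threshold = a1 * (An ** a2) * (D ** beta)
              predicted = I >= threshold          # storm plots above the threshold curve
              correct += (predicted == triggered)
          return correct / len(storms)

      grid = itertools.product([2.0, 5.0, 10.0],      # a1 candidates
                               [-0.3, -0.1, 0.0],     # a2 candidates (more antecedent rain -> lower threshold)
                               [-0.6, -0.4, -0.2])    # beta candidates
      best = max(grid, key=lambda p: accuracy(*p))
      print("best (a1, a2, beta):", best, "accuracy:", round(accuracy(*best), 2))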

  27. FDI technology spillover and threshold effect of the technology gap: regional differences in the Chinese industrial sector.

    PubMed

    Wang, Hui; Liu, Huifang; Cao, Zhiyong; Wang, Bowen

    2016-01-01

    This paper presents a new perspective that there is a double-threshold effect in terms of the technology gap existing in the foreign direct investment (FDI) technology spillover process in different regional Chinese industrial sectors. In this paper, a double-threshold regression model was established to examine the relation between the threshold effect of the technology gap and technology spillover. Based on the provincial panel data of Chinese industrial sectors from 2000 to 2011, the empirical results reveal that there are two threshold values, which are 1.254 and 2.163, in terms of the technology gap in the industrial sector in eastern China. There are also two threshold values in both the central and western industrial sector, which are 1.516, 2.694 and 1.635, 2.714, respectively. The technology spillover is a decreasing function of the technology gap in both the eastern and western industrial sectors, but a concave function of the technology gap in the central industrial sectors. Furthermore, the FDI technology spillover has increased gradually in recent years. Based on the empirical results, suggestions were proposed to guide the introduction of FDI and the improvement of industrial added value in different regions of China.

  28. The excretion of theobromine in Thoroughbred racehorses after feeding compounded cubes containing cocoa husk--establishment of a threshold value in horse urine.

    PubMed

    Haywood, P E; Teale, P; Moss, M S

    1990-07-01

    Thoroughbred geldings were fed racehorse cubes containing a predetermined concentration of theobromine in the form of cocoa husk. They were offered 7 kg of cubes per day, divided between morning and evening feed, and food consumption was monitored. Urinary concentrations of theobromine were determined following the consumption of cubes containing 11.5, 6.6, 2.0 and 1.2 mg per kg of theobromine, to verify whether or not such concentrations would produce positive urine tests. Pre-dose urine samples were collected to verify the absence of theobromine before each experiment. It became apparent from the results of the first three administrations that the limit of detection of theobromine, using such procedures, would be reached at a feed level of about 1 mg per kg theobromine. Therefore the final administration, using cubes containing 1.2 mg per kg theobromine, was singled out for additional analytical work and quantitative procedures were developed to measure urinary concentrations of theobromine. It was anticipated that the results would form a basis for discussions relating to the establishment of a threshold value for theobromine in horse urine. The Stewards of the Jockey Club subsequently gave notice that they had established a threshold level for theobromine in urine of 2 micrograms/ml.

  29. Analysis of novel stochastic switched SILI epidemic models with continuous and impulsive control

    NASA Astrophysics Data System (ADS)

    Gao, Shujing; Zhong, Deming; Zhang, Yan

    2018-04-01

    In this paper, we establish two new stochastic switched epidemic models with continuous and impulsive control. The stochastic perturbations are considered for the natural death rate in each equation of the models. Firstly, a stochastic switched SILI model with continuous control schemes is investigated. By using the Lyapunov-Razumikhin method, the sufficient conditions for extinction in mean are established. Our result shows that the disease could theoretically die out if the threshold value R is less than one, regardless of whether the disease-free solutions of the corresponding subsystems are stable or unstable. Then, a stochastic switched SILI model with continuous control schemes and pulse vaccination is studied. The threshold value R is derived. The global attractivity of the model is also obtained. Finally, numerical simulations are carried out to support our results.

  30. Value of information and pricing new healthcare interventions.

    PubMed

    Willan, Andrew R; Eckermann, Simon

    2012-06-01

    Previous application of value-of-information methods to optimal clinical trial design have predominantly taken a societal decision-making perspective, implicitly assuming that healthcare costs are covered through public expenditure and trial research is funded by government or donation-based philanthropic agencies. In this paper, we consider the interaction between interrelated perspectives of a societal decision maker (e.g. the National Institute for Health and Clinical Excellence [NICE] in the UK) charged with the responsibility for approving new health interventions for reimbursement and the company that holds the patent for a new intervention. We establish optimal decision making from societal and company perspectives, allowing for trade-offs between the value and cost of research and the price of the new intervention. Given the current level of evidence, there exists a maximum (threshold) price acceptable to the decision maker. Submission for approval with prices above this threshold will be refused. Given the current level of evidence and the decision maker's threshold price, there exists a minimum (threshold) price acceptable to the company. If the decision maker's threshold price exceeds the company's, then current evidence is sufficient since any price between the thresholds is acceptable to both. On the other hand, if the decision maker's threshold price is lower than the company's, then no price is acceptable to both and the company's optimal strategy is to commission additional research. The methods are illustrated using a recent example from the literature.

  31. A Mathematical Model of Anthrax Transmission in Animal Populations.

    PubMed

    Saad-Roy, C M; van den Driessche, P; Yakubu, Abdul-Aziz

    2017-02-01

    A general mathematical model of anthrax (caused by Bacillus anthracis) transmission is formulated that includes live animals, infected carcasses and spores in the environment. The basic reproduction number R0 is calculated, and existence of a unique endemic equilibrium is established for R0 above the threshold value 1. Using data from the literature, elasticity indices for R0 and type reproduction numbers are computed to quantify anthrax control measures. Including only herbivorous animals, anthrax is eradicated if R0 is below the threshold value 1. For these animals, oscillatory solutions arising from Hopf bifurcations are numerically shown to exist for certain parameter values above this threshold and to have periodicity as observed from anthrax data. Including carnivores and assuming no disease-related death, anthrax again goes extinct below the threshold. Local stability of the endemic equilibrium is established above the threshold; thus, periodic solutions are not possible for these populations. It is shown numerically that oscillations in spore growth may drive oscillations in animal populations; however, the total number of infected animals remains about the same as with constant spore growth.

  32. Investigation of Adaptive-threshold Approaches for Determining Area-Time Integrals from Satellite Infrared Data to Estimate Convective Rain Volumes

    NASA Technical Reports Server (NTRS)

    Smith, Paul L.; VonderHaar, Thomas H.

    1996-01-01

    The principal goal of this project is to establish relationships that would allow application of area-time integral (ATI) calculations based upon satellite data to estimate rainfall volumes. The research is being carried out as a collaborative effort between the two participating organizations, with the satellite data analysis to determine values for the ATIs being done primarily by the STC-METSAT scientists and the associated radar data analysis to determine the 'ground-truth' rainfall estimates being done primarily at the South Dakota School of Mines and Technology (SDSM&T). Synthesis of the two separate kinds of data and investigation of the resulting rainfall-versus-ATI relationships is then carried out jointly. The research has been pursued using two different approaches, which for convenience can be designated as the 'fixed-threshold approach' and the 'adaptive-threshold approach'. In the former, an attempt is made to determine a single temperature threshold in the satellite infrared data that would yield ATI values for identifiable cloud clusters which are closely related to the corresponding rainfall amounts as determined by radar. Work on the second, or 'adaptive-threshold', approach for determining the satellite ATI values has explored two avenues: (1) one attempt involved choosing IR thresholds to match the satellite ATI values with ones separately calculated from the radar data on a case-by-case basis; and (2) another involved a straightforward screening analysis to determine the (fixed) offset that would lead to the strongest correlation and lowest standard error of estimate in the relationship between the satellite ATI values and the corresponding rainfall volumes.
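
    A minimal sketch of the fixed-threshold ATI calculation described above, assuming synthetic infrared brightness-temperature images, a hypothetical pixel size and a 235 K threshold; the rainfall-volume step would come from a separately fitted ATI-versus-rain-volume relationship.

      # Illustrative sketch (synthetic grid, not satellite data): computing an area-time
      # integral (ATI) for a cloud cluster by summing, over successive images, the area of
      # pixels colder than a fixed infrared brightness-temperature threshold.
      import numpy as np

      rng = np.random.default_rng(3)
      # A sequence of hypothetical IR brightness-temperature images (K), 50 x 50 pixels.
      images = 230.0 + 25.0 * rng.random((12, 50, 50))
      pixel_area_km2 = 16.0     # e.g. ~4 km x 4 km pixels (assumed)
      time_step_h = 0.5         # half-hourly images (assumed)

      def area_time_integral(images, threshold_k=235.0):
          """ATI (km^2 * h): area colder than the threshold, integrated over the image times."""
          cold_area = (images < threshold_k).sum(axis=(1, 2)) * pixel_area_km2
          return float(cold_area.sum() * time_step_h)

      print("ATI:", round(area_time_integral(images), 1), "km^2 h")
      # A rain-volume estimate would then come from a fitted ATI-vs-rain-volume relationship.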

  33. Extrinsic regime shifts drive abrupt changes in regeneration dynamics at upper treeline in the Rocky Mountains, U.S.A.

    PubMed

    Elliott, Grant P

    2012-07-01

    Given the widespread and often dramatic influence of climate change on terrestrial ecosystems, it is increasingly common for abrupt threshold changes to occur, yet explicitly testing for climate and ecological regime shifts is lacking in climatically sensitive upper treeline ecotones. In this study, quantitative evidence based on empirical data is provided to support the key role of extrinsic, climate-induced thresholds in governing the spatial and temporal patterns of tree establishment in these high-elevation environments. Dendroecological techniques were used to reconstruct a 420-year history of regeneration dynamics within upper treeline ecotones along a latitudinal gradient (approximately 44-35 degrees N) in the Rocky Mountains. Correlation analysis was used to assess the possible influence of minimum and maximum temperature indices and cool-season (November-April) precipitation on regional age-structure data. Regime-shift analysis was used to detect thresholds in tree establishment during the entire period of record (1580-2000), temperature variables significantly Correlated with establishment during the 20th century, and cool-season precipitation. Tree establishment was significantly correlated with minimum temperature during the spring (March-May) and cool season. Regime-shift analysis identified an abrupt increase in regional tree establishment in 1950 (1950-1954 age class). Coincident with this period was a shift toward reduced cool-season precipitation. The alignment of these climate conditions apparently triggered an abrupt increase in establishment that was unprecedented during the period of record. Two main findings emerge from this research that underscore the critical role of climate in governing regeneration dynamics within upper treeline ecotones. (1) Regional climate variability is capable of exceeding bioclimatic thresholds, thereby initiating synchronous and abrupt changes in the spatial and temporal patterns of tree establishment at broad regional scales. (2) The importance of climate parameters exceeding critical threshold values and triggering a regime shift in tree establishment appears to be contingent on the alignment of favorable temperature and moisture regimes. This research suggests that threshold changes in the climate system can fundamentally alter regeneration dynamics within upper treeline ecotones and, through the use of regime-shift analysis, reveals important climate-vegetation linkages.

  34. Optimum threshold selection method of centroid computation for Gaussian spot

    NASA Astrophysics Data System (ADS)

    Li, Xuxu; Li, Xinyang; Wang, Caixia

    2015-10-01

    Centroid computation of a Gaussian spot is often conducted to get the exact position of a target or to measure wave-front slopes in the fields of target tracking and wave-front sensing. Center of Gravity (CoG) is the most traditional method of centroid computation, known for its low algorithmic complexity. However, both electronic noise from the detector and photonic noise from the environment reduce its accuracy. In order to improve the accuracy, thresholding is unavoidable before centroid computation, and an optimum threshold needs to be selected. In this paper, a model of the Gaussian spot is established to analyze the performance of the optimum threshold under different Signal-to-Noise Ratio (SNR) conditions. Besides, two optimum threshold selection methods are introduced: TmCoG (using m % of the maximum intensity of the spot as the threshold), and TkCoG (using μn + κσn as the threshold), where μn and σn are the mean value and standard deviation of the background noise. Firstly, their impact on the detection error under various SNR conditions is simulated respectively to find the way to decide the value of κ or m. Then, a comparison between them is made. According to the simulation result, TmCoG is superior to TkCoG in the accuracy of the selected threshold, and its detection error is also lower.
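
    The two threshold rules named in the abstract (TmCoG and TkCoG) can be sketched as below on a synthetic Gaussian spot; the spot parameters, noise level, m and κ values, and the corner-patch noise estimate are all assumptions made for illustration.

      # Illustrative sketch of the two thresholding rules named in the abstract, applied to
      # a synthetic Gaussian spot before a centre-of-gravity (CoG) centroid computation.
      import numpy as np

      rng = np.random.default_rng(7)
      y, x = np.mgrid[0:64, 0:64]
      spot = 200.0 * np.exp(-((x - 30.3) ** 2 + (y - 33.7) ** 2) / (2 * 4.0 ** 2))
      image = spot + rng.normal(0.0, 5.0, spot.shape)          # add background/detector noise

      def cog(img, threshold):
          """Centre of gravity of pixels above the threshold (threshold-subtracted weighting)."""
          w = np.clip(img - threshold, 0.0, None)
          total = w.sum()
          return (np.sum(w * x) / total, np.sum(w * y) / total)

      # TmCoG: threshold at m% of the spot's maximum intensity (m chosen here arbitrarily).
      m = 10.0
      t_m = m / 100.0 * image.max()
      # TkCoG: threshold at mean + kappa * std of the background noise (corner patch used
      # as a crude noise estimate; kappa chosen here arbitrarily).
      kappa = 3.0
      t_k = image[:10, :10].mean() + kappa * image[:10, :10].std()

      print("TmCoG centroid:", tuple(round(c, 2) for c in cog(image, t_m)))
      print("TkCoG centroid:", tuple(round(c, 2) for c in cog(image, t_k)))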

  35. Methods for threshold determination in multiplexed assays

    DOEpatents

    Tammero, Lance F. Bentley; Dzenitis, John M; Hindson, Benjamin J

    2014-06-24

    Methods for determination of threshold values of signatures comprised in an assay are described. Each signature enables detection of a target. The methods determine a probability density function of negative samples and a corresponding false positive rate curve. A false positive criterion is established and a threshold for that signature is determined as a point at which the false positive rate curve intersects the false positive criterion. A method for quantitative analysis and interpretation of assay results together with a method for determination of a desired limit of detection of a signature in an assay are also described.
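
    A minimal sketch of the scheme summarised above, under the assumption that the negative-sample signal distribution is estimated empirically: the threshold is taken where the false-positive-rate curve of the negatives meets the chosen false-positive criterion.

      # Illustrative sketch (synthetic signals): estimate the distribution of a signature's
      # signal in known-negative samples, build the false-positive-rate curve, and set the
      # threshold where that curve meets a chosen false-positive criterion.
      import numpy as np

      rng = np.random.default_rng(11)
      negative_signals = rng.normal(loc=100.0, scale=15.0, size=5000)   # hypothetical negatives

      def threshold_for_fpr(negatives, fpr_criterion=0.01):
          """Smallest threshold whose empirical false-positive rate does not exceed the criterion."""
          # The empirical FPR at threshold t is the fraction of negative samples >= t,
          # so the (1 - fpr) quantile of the negatives is the required threshold.
          return float(np.quantile(negatives, 1.0 - fpr_criterion))

      t = threshold_for_fpr(negative_signals, 0.01)
      print("threshold:", round(t, 1))
      print("achieved FPR:", np.mean(negative_signals >= t))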

  16. An integrated evaluation of some faecal indicator bacteria (FIB) and chemical markers as potential tools for monitoring sewage contamination in subtropical estuaries.

    PubMed

    Cabral, Ana Caroline; Stark, Jonathan S; Kolm, Hedda E; Martins, César C

    2018-04-01

    Sewage input and the relationship between chemical markers (linear alkylbenzenes and coprostanol) and fecal indicator bacteria (FIB, Escherichia coli and enterococci) were evaluated in order to establish threshold values for chemical markers in suspended particulate matter (SPM) as indicators of sewage contamination in two subtropical estuaries in South Atlantic Brazil. Neither chemical marker showed a linear relationship with FIB, owing to high spatial microbiological variability; however, microbiological water quality was related to coprostanol values when analyzed by logistic regression, indicating that linear models may not be the best representation of the relationship between the two classes of indicators. Logistic regression was performed with all data and separately for the two sampling seasons, using 800 and 100 MPN 100 mL⁻¹ of E. coli and enterococci, respectively, as the microbiological limits of sewage contamination. Threshold values of coprostanol varied depending on the FIB and season, ranging between 1.00 and 2.23 μg g⁻¹ SPM. The range of threshold values of coprostanol for SPM is relatively higher and more variable than the values suggested in the literature for sediments (0.10-0.50 μg g⁻¹), probably due to the higher concentration of coprostanol in SPM than in sediment. Temperature may affect the relationship between microbiological indicators and coprostanol, since the threshold value of coprostanol found here was similar to values from tropical areas but lower than those found during winter in temperate areas, reinforcing the idea that threshold values should be calibrated for different climatic conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
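
    The logistic-regression step can be sketched compactly: regress a binary indicator of FIB exceedance on log coprostanol and read off the concentration at which the predicted probability reaches 0.5. Everything in the example below (the simulated data, the microbiological limit, the 0.5 cut-off and the use of scikit-learn) is an illustrative assumption rather than the study's actual fit.

    ```python
    # Coprostanol threshold from a logistic model of FIB-limit exceedance.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(7)
    coprostanol = rng.lognormal(mean=0.3, sigma=0.8, size=200)            # ug/g SPM
    p_exceed = 1.0 / (1.0 + np.exp(-3.0 * (np.log(coprostanol) - np.log(1.5))))
    fib_over_limit = rng.random(200) < p_exceed        # e.g. E. coli > 800 MPN/100 mL

    X = np.log(coprostanol).reshape(-1, 1)
    model = LogisticRegression().fit(X, fib_over_limit)

    log_threshold = -model.intercept_[0] / model.coef_[0, 0]   # p = 0.5 point
    print(f"coprostanol threshold ~ {np.exp(log_threshold):.2f} ug/g SPM")
    ```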

  17. Thresholds of information leakage for speech security outside meeting rooms.

    PubMed

    Robinson, Matthew; Hopkins, Carl; Worrall, Ken; Jackson, Tim

    2014-09-01

    This paper describes an approach to provide speech security outside meeting rooms where a covert listener might attempt to extract confidential information. Decision-based experiments are used to establish a relationship between an objective measurement of the Speech Transmission Index (STI) and a subjective assessment relating to the threshold of information leakage. This threshold is defined for a specific percentage of English words that are identifiable with a maximum safe vocal effort (e.g., "normal" speech) used by the meeting participants. The results demonstrate that it is possible to quantify an offset that links STI with a specific threshold of information leakage which describes the percentage of words identified. The offsets for male talkers are shown to be approximately 10 dB larger than for female talkers. Hence for speech security it is possible to determine offsets for the threshold of information leakage using male talkers as the "worst case scenario." To define a suitable threshold of information leakage, the results show that a robust definition can be based upon 1%, 2%, or 5% of words identified. For these percentages, results are presented for offset values corresponding to different STI values in a range from 0.1 to 0.3.

  18. Effects of street traffic noise in the night

    NASA Technical Reports Server (NTRS)

    Wehrli, B.; Nemecek, J.; Turrian, V.; Hoffman, R.; Wanner, H.

    1980-01-01

    The relationship between automobile traffic noise and the degree of disturbance experienced at night was explored through a random sample survey of 1600 individuals in rural and urban areas. The data obtained were used to establish threshold values.

  19. Standard deviation index for stimulated Brillouin scattering suppression with different homogeneities.

    PubMed

    Ran, Yang; Su, Rongtao; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Si, Lei

    2016-05-10

    We present a new quantitative index, the standard deviation, to measure the homogeneity of spectral lines in a fiber amplifier system and thereby find the relation between the stimulated Brillouin scattering (SBS) threshold and the homogeneity of the corresponding spectral lines. A theoretical model is built and a simulation framework has been established to estimate the SBS threshold when input spectra with different homogeneities are set. In our experiment, by setting the phase modulation voltage to a constant value and the modulation frequency to different values, spectral lines with different homogeneities can be obtained. The experimental results show that the SBS threshold is negatively correlated with the standard deviation of the modulated spectrum, i.e., the threshold decreases as the standard deviation increases, which is in good agreement with the theoretical results. When the phase modulation voltage is confined to 10 V and the modulation frequency is set to 80 MHz, the standard deviation of the modulated spectrum equals 0.0051, the lowest value in our experiment; at this point the highest SBS threshold is achieved. This standard deviation can serve as a good quantitative index for evaluating the power scaling potential of a fiber amplifier system and as a design guideline for better SBS suppression.

  20. Determination of Cost-Effectiveness Threshold for Health Care Interventions in Malaysia.

    PubMed

    Lim, Yen Wei; Shafie, Asrul Akmal; Chua, Gin Nie; Ahmad Hassali, Mohammed Azmi

    2017-09-01

    One major challenge in prioritizing health care using cost-effectiveness (CE) information is when alternatives are more expensive but more effective than existing technology. In such a situation, an external criterion in the form of a CE threshold that reflects the willingness to pay (WTP) per quality-adjusted life-year is necessary. To determine a CE threshold for health care interventions in Malaysia. A cross-sectional, contingent valuation study was conducted using a stratified multistage cluster random sampling technique in four states in Malaysia. One thousand thirteen respondents were interviewed in person for their socioeconomic background, quality of life, and WTP for a hypothetical scenario. The CE thresholds established using the nonparametric Turnbull method ranged from MYR12,810 to MYR22,840 (~US $4,000-US $7,000), whereas those estimated with the parametric interval regression model were between MYR19,929 and MYR28,470 (~US $6,200-US $8,900). Key factors that affected the CE thresholds were education level, estimated monthly household income, and the description of health state scenarios. These findings suggest that there is no single WTP value for a quality-adjusted life-year. The CE threshold estimated for Malaysia was found to be lower than the threshold value recommended by the World Health Organization. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  1. A Life Below the Threshold?: Examining Conflict Between Ethical Principles and Parental Values in Neonatal Treatment Decision Making.

    PubMed

    Cunningham, Thomas V

    2016-01-01

    Three common ethical principles for establishing the limits of parental authority in pediatric treatment decision-making are the harm principle, the principle of best interest, and the threshold view. This paper considers how these principles apply to a case of a premature neonate with multiple significant co-morbidities whose mother wanted all possible treatments, and whose health care providers wondered whether it would be ethically permissible to allow him to die comfortably despite her wishes. Whether and how these principles help in understanding what was morally right for the child is questioned. The paper concludes that the principles were of some value in understanding the moral geography of the case; however, this case reveals that common bioethical principles for medical decision-making are problematically value-laden because they are inconsistent with the widespread moral value of medical vitalism.

  2. Study on the threshold of a stochastic SIR epidemic model and its extensions

    NASA Astrophysics Data System (ADS)

    Zhao, Dianli

    2016-09-01

    This paper provides a simple but effective method for estimating the threshold of a class of stochastic epidemic models by use of the nonnegative semimartingale convergence theorem. First, the threshold R0SIR is obtained for the stochastic SIR model with a saturated incidence rate; whether its value is below or above 1 completely determines whether the disease goes extinct or prevails, for any intensity of the white noise. Moreover, when R0SIR > 1, the system is proved to be convergent in time mean. Then, the thresholds of the stochastic SIVS models with and without a saturated incidence rate are established by the same method. Compared with previously known results in the literature, the related results are improved and the method is simpler than before.
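
    For orientation only, the fragment below sketches the general shape that such stochastic thresholds take in this literature: a deterministic reproduction number reduced by a term proportional to the white-noise intensity. The symbols and the exact form are illustrative assumptions and are not the paper's R0SIR for the saturated-incidence model.

    ```latex
    % Schematic stochastic threshold (illustrative, not the paper's expression):
    % beta = transmission rate, gamma = recovery rate, mu = removal rate,
    % sigma = white-noise intensity, S_0 = disease-free susceptible level.
    \[
      \widetilde{R}_0 \;=\; \frac{\beta S_0}{\mu + \gamma}
        \;-\; \frac{\sigma^{2} S_0^{2}}{2\,(\mu + \gamma)},
      \qquad
      \widetilde{R}_0 < 1 \;\Rightarrow\; \text{extinction},
      \qquad
      \widetilde{R}_0 > 1 \;\Rightarrow\; \text{persistence in mean}.
    \]
    ```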

  3. Comparison of alternatives to amplitude thresholding for onset detection of acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Bai, F.; Gagar, D.; Foote, P.; Zhao, Y.

    2017-02-01

    Acoustic Emission (AE) monitoring can be used to detect the presence of damage as well as determine its location in Structural Health Monitoring (SHM) applications. Information on the time difference between the arrival of the signal generated by a damage event at different sensors in an array is essential for performing localisation. Currently, this is determined using a fixed threshold, which is particularly prone to errors when not set to an optimal value. This paper presents three new methods for determining the onset of AE signals without the need for a predetermined threshold. The performance of the techniques is evaluated using AE signals generated during fatigue crack growth and compared to the established Akaike Information Criterion (AIC) and fixed-threshold methods. It was found that the 1D location accuracy of the new methods was within the range of <1-7.1% of the monitored region, compared to 2.7% for the AIC method and 1.8-9.4% for the conventional fixed-threshold method at different threshold levels.
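
    The established AIC picker used as the benchmark above is short enough to show in full. The sketch below applies the standard two-segment AIC function to a synthetic burst; the waveform and onset position are invented for illustration, and the paper's three new threshold-free methods are not reproduced.

    ```python
    # Akaike Information Criterion (AIC) onset picker:
    # AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])), onset at the minimum.
    import numpy as np

    def aic_onset(x):
        n = x.size
        aic = np.full(n, np.inf)
        for k in range(2, n - 2):
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            if v1 > 0 and v2 > 0:
                aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
        return int(np.argmin(aic))

    rng = np.random.default_rng(3)
    signal = rng.normal(0, 0.05, 1000)                                   # background noise
    t = np.arange(400)
    signal[300:700] += np.sin(2 * np.pi * 0.05 * t) * np.exp(-t / 120.0) # AE-like burst at 300

    print("picked onset sample:", aic_onset(signal))                     # expected near 300
    ```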

  4. Visible lesion laser thresholds in Cynomolgus (Macaca fascicularis) retina with a 1064 nm 12-ns pulsed laser

    NASA Astrophysics Data System (ADS)

    Oliver, Jeffrey W.; Stolarski, David J.; Noojin, Gary D.; Hodnett, Harvey M.; Imholte, Michelle L.; Rockwell, Benjamin A.; Kumru, Semih S.

    2007-02-01

    A series of experiments in a new animal model for retinal damage, the cynomolgus monkey (Macaca fascicularis), has been conducted to determine the damage threshold for 12.5-nanosecond laser exposures at 1064 nm. These results provide a direct comparison to threshold values obtained in the rhesus monkey (Macaca mulatta), which is the model historically used in establishing retinal maximum permissible exposure (MPE) limits. In this study, the irradiance level of a collimated Gaussian laser beam of 2.5 mm diameter at the cornea was randomly varied to produce a rectangular grid of exposures on the retina. Exposure sites were fundoscopically evaluated at post-irradiance intervals of 1 hour and 24 hours. Probit analysis was performed on dose-response data to obtain probability-of-response curves. The 50% probability of damage (ED50) values for 1 and 24 hours post-exposure are 28.5 (22.7-38.4) μJ and 17.0 (12.9-21.8) μJ, respectively. These values compare favorably to data obtained with the rhesus model, 28.7 (22.3-39.3) μJ and 19.1 (13.6-24.4) μJ, suggesting that the cynomolgus monkey may be a suitable replacement for the rhesus monkey in photoacoustic minimum visible lesion threshold studies.
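
    The Probit/ED50 step can be sketched in a few lines. The example below uses simulated lesion outcomes, an assumed log-dose probit model and no fiducial limits; statsmodels is used simply as one convenient probit fitter, not as the study's software.

    ```python
    # Probit dose-response fit and ED50 (50% damage probability) estimate.
    import numpy as np
    from scipy.stats import norm
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    dose_uJ = np.exp(rng.uniform(np.log(5.0), np.log(100.0), size=200))  # pulse energies (uJ)
    p_true = norm.cdf(4.0 * (np.log(dose_uJ) - np.log(20.0)))            # hidden dose-response
    lesion = (rng.random(dose_uJ.size) < p_true).astype(int)             # 1 = visible lesion

    X = sm.add_constant(np.log(dose_uJ))       # probit on log dose
    fit = sm.Probit(lesion, X).fit(disp=0)
    a, b = fit.params
    ed50_uJ = np.exp(-a / b)                   # log dose where damage probability is 50%
    print(f"estimated ED50 ~ {ed50_uJ:.1f} uJ (simulated truth: 20 uJ)")
    ```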

  5. Man-systems evaluation of moving base vehicle simulation motion cues. [human acceleration perception involving visual feedback

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, M.; Brye, R. G.

    1974-01-01

    A motion cue investigation program is reported that deals with human factors aspects of high-fidelity vehicle simulation. General data on non-visual motion thresholds and specific threshold values are established for use as washout parameters in vehicle simulation. A general-purpose simulator is used to test the contradictory-cue hypothesis that acceleration sensitivity is reduced during a vehicle control task involving visual feedback. The simulator provides varying acceleration levels. The method of forced choice is based on the theory of signal detectability.

  6. Experimental and Finite Element Modeling of Near-Threshold Fatigue Crack Growth for the K-Decreasing Test Method

    NASA Technical Reports Server (NTRS)

    Smith, Stephen W.; Seshadri, Banavara R.; Newman, John A.

    2015-01-01

    The experimental methods to determine near-threshold fatigue crack growth rate data are prescribed in ASTM standard E647. To produce near-threshold data at a constant stress ratio (R), the applied stress-intensity factor (K) is decreased as the crack grows based on a specified K-gradient. Consequently, as the fatigue crack growth rate threshold is approached and the crack tip opening displacement decreases, remote crack wake contact may occur due to the plastically deformed crack wake surfaces and shield the growing crack tip resulting in a reduced crack tip driving force and non-representative crack growth rate data. If such data are used to life a component, the evaluation could yield highly non-conservative predictions. Although this anomalous behavior has been shown to be affected by K-gradient, starting K level, residual stresses, environmental assisted cracking, specimen geometry, and material type, the specifications within the standard to avoid this effect are limited to a maximum fatigue crack growth rate and a suggestion for the K-gradient value. This paper provides parallel experimental and computational simulations for the K-decreasing method for two materials (an aluminum alloy, AA 2024-T3 and a titanium alloy, Ti 6-2-2-2-2) to aid in establishing clear understanding of appropriate testing requirements. These simulations investigate the effect of K-gradient, the maximum value of stress-intensity factor applied, and material type. A material independent term is developed to guide in the selection of appropriate test conditions for most engineering alloys. With the use of such a term, near-threshold fatigue crack growth rate tests can be performed at accelerated rates, near-threshold data can be acquired in days instead of weeks without having to establish testing criteria through trial and error, and these data can be acquired for most engineering materials, even those that are produced in relatively small product forms.

  7. Regulation of non-relevant metabolites of plant protection products in drinking and groundwater in the EU: Current status and way forward.

    PubMed

    Laabs, V; Leake, C; Botham, P; Melching-Kollmuß, S

    2015-10-01

    Non-relevant metabolites are defined in the EU regulation for plant protection product authorization and a detailed definition of non-relevant metabolites is given in an EU Commission DG Sanco (now DG SANTE - Health and Food Safety) guidance document. However, in water legislation at EU and member state level non-relevant metabolites of pesticides are either not specifically regulated or diverse threshold values are applied. Based on their inherent properties, non-relevant metabolites should be regulated based on substance-specific and toxicity-based limit values in drinking and groundwater like other anthropogenic chemicals. Yet, if a general limit value for non-relevant metabolites in drinking and groundwater is favored, an application of a Threshold of Toxicological Concern (TTC) concept for Cramer class III compounds leads to a threshold value of 4.5 μg L⁻¹. This general value is exemplarily shown to be protective for non-relevant metabolites, based on individual drinking water limit values derived for a set of 56 non-relevant metabolites. A consistent definition of non-relevant metabolites of plant protection products, as well as their uniform regulation in drinking and groundwater in the EU, is important to achieve legal clarity for all stakeholders and to establish planning security for development of plant protection products for the European market. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Comparison of the anticonvulsant potency of various diuretic drugs in the maximal electroshock-induced seizure threshold test in mice.

    PubMed

    Załuska, Katarzyna; Kondrat-Wróbel, Maria W; Łuszczki, Jarogniew J

    2018-05-01

    The coexistence of seizures and arterial hypertension requires an adequate and efficacious treatment involving both protection from seizures and reduction of high arterial blood pressure. Accumulating evidence indicates that some diuretic drugs (with a well-established position in the treatment of arterial hypertension) also possess anticonvulsant properties in various experimental models of epilepsy. The aim of this study was to assess the anticonvulsant potency of 6 commonly used diuretic drugs (i.e., amiloride, ethacrynic acid, furosemide, hydrochlorothiazide, indapamide, and spironolactone) in the maximal electroshock-induced seizure threshold (MEST) test in mice. Doses of the studied diuretics and their corresponding threshold increases were linearly related, allowing for the determination of doses which increase the threshold for electroconvulsions in drug-treated animals by 20% (TID20 values) over the threshold in control animals. Amiloride, hydrochlorothiazide and indapamide administered systemically (intraperitoneally - i.p.) increased the threshold for maximal electroconvulsions in mice, and the experimentally-derived TID20 values in the maximal electroshock seizure threshold test were 30.2 mg/kg for amiloride, 68.2 mg/kg for hydrochlorothiazide and 3.9 mg/kg for indapamide. In contrast, ethacrynic acid (up to 100 mg/kg), furosemide (up to 100 mg/kg) and spironolactone (up to 50 mg/kg) administered i.p. had no significant impact on the threshold for electroconvulsions in mice. The studied diuretics can be arranged with respect to their anticonvulsant potency in the MEST test as follows: indapamide > amiloride > hydrochlorothiazide. No anticonvulsant effects were observed for ethacrynic acid, furosemide or spironolactone in the MEST test in mice.

  9. Measuring single-shot, picosecond optical damage threshold in Ge, Si, and sapphire with a 5.1-μm laser

    DOE PAGES

    Agustsson, R.; Pogorelsky, I.; Arab, E.; ...

    2015-11-18

    Optical photonic structures driven by picosecond, GW-class lasers are emerging as promising novel sources of electron beams and high quality X-rays. Due to the quadratic dependence of the laser ponderomotive potential on wavelength, the performance of such sources scales very favorably towards longer drive laser wavelengths. However, to take full advantage of photonic structures in the mid-IR spectral region, it is important to determine the optical breakdown limits of common optical materials. To this end, an experimental study was carried out at a wavelength of 5 µm, using a frequency-doubled CO2 laser source with a 5 ps pulse length. Single-shot optical breakdowns were detected and characterized at different laser intensities, and damage threshold values of 0.2, 0.3, and 7.0 J/cm² were established for Ge, Si, and sapphire, respectively. The measured damage threshold values were stable and repeatable within individual data sets and across varying experimental conditions.

  10. Measuring single-shot, picosecond optical damage threshold in Ge, Si, and sapphire with a 5.1-μm laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agustsson, R.; Pogorelsky, I.; Arab, E.

    Optical photonic structures driven by picosecond, GW-class lasers are emerging as promising novel sources of electron beams and high quality X-rays. Due to the quadratic dependence of the laser ponderomotive potential on wavelength, the performance of such sources scales very favorably towards longer drive laser wavelengths. However, to take full advantage of photonic structures in the mid-IR spectral region, it is important to determine the optical breakdown limits of common optical materials. To this end, an experimental study was carried out at a wavelength of 5 µm, using a frequency-doubled CO2 laser source with a 5 ps pulse length. Single-shot optical breakdowns were detected and characterized at different laser intensities, and damage threshold values of 0.2, 0.3, and 7.0 J/cm² were established for Ge, Si, and sapphire, respectively. The measured damage threshold values were stable and repeatable within individual data sets and across varying experimental conditions.

  11. Masking Misfit in Confirmatory Factor Analysis by Increasing Unique Variances: A Cautionary Note on the Usefulness of Cutoff Values of Fit Indices

    ERIC Educational Resources Information Center

    Heene, Moritz; Hilbert, Sven; Draxler, Clemens; Ziegler, Matthias; Buhner, Markus

    2011-01-01

    Fit indices are widely used in order to test the model fit for structural equation models. In a highly influential study, Hu and Bentler (1999) showed that certain cutoff values for these indices could be derived, which, over time, has led to the reification of these suggested thresholds as "golden rules" for establishing the fit or other aspects…

  12. Partial Roc Reveals Superiority of Mutual Rank of Pearson's Correlation Coefficient as a Coexpression Measure to Elucidate Functional Association of Genes

    NASA Astrophysics Data System (ADS)

    Obayashi, Takeshi; Kinoshita, Kengo

    2013-01-01

    Gene coexpression analysis is a powerful approach to elucidate gene function. We have established and developed this approach using a vast amount of publicly available gene expression data measured by microarray techniques. The coexpressed genes are used to estimate the function of a guide gene or to construct gene coexpression networks. To construct gene networks, researchers must introduce an arbitrary threshold of gene coexpression, because coexpression values are continuous. From the viewpoint of introducing a common coexpression threshold, we previously reported that the rank of Pearson's correlation coefficient (PCC) is more useful than the original PCC value. In this manuscript, we re-assessed the measures of gene coexpression used to construct gene coexpression networks and found that the mutual rank (MR) of PCC showed better performance than the rank of PCC and the original PCC at low false positive rates.
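
    The mutual rank measure referred to above is conventionally defined as the geometric mean of the two directional coexpression ranks, MR(a,b) = sqrt(rank_a(b) * rank_b(a)). The sketch below computes it from a PCC matrix; the expression matrix is random placeholder data, and the geometric-mean definition is the usual one from the coexpression literature rather than a detail stated in this abstract.

    ```python
    # Mutual rank (MR) of Pearson's correlation coefficient for gene pairs.
    import numpy as np

    rng = np.random.default_rng(5)
    expression = rng.normal(size=(50, 30))      # 50 genes x 30 samples (placeholder)

    pcc = np.corrcoef(expression)               # gene-by-gene Pearson correlations
    np.fill_diagonal(pcc, -np.inf)              # ignore self-correlation when ranking

    # rank[i, j] = position of gene j in gene i's partner list sorted by PCC (1 = best)
    order = np.argsort(-pcc, axis=1)
    rank = np.empty_like(order)
    rows = np.arange(pcc.shape[0])[:, None]
    rank[rows, order] = np.arange(1, pcc.shape[1] + 1)

    mutual_rank = np.sqrt(rank * rank.T)        # symmetric MR matrix
    print("MR between gene 0 and gene 1:", mutual_rank[0, 1])
    ```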

  13. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    PubMed

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its superseded document, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach in establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in a "Bottom-Up" approach adopted by us are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at 99% one-tailed confidence level by multiplying the standard uncertainty with an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, will be presented.
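
    The combination and decision steps described above reduce to a short worked example. In the sketch below, every number (the component uncertainties, the coverage factor and the threshold) is illustrative rather than taken from the laboratory's protocol.

    ```python
    # "Bottom-up" combination of standard uncertainties and the compliance rule.
    import math

    threshold = 100.0                          # threshold concentration (e.g. ng/mL)
    measured = 112.0                           # determined concentration in the sample

    standard_uncertainties = [4.0, 1.0, 2.5, 0.8]        # components, same units
    largest = max(standard_uncertainties)
    retained = [u for u in standard_uncertainties if u > 0.30 * largest]

    combined = math.sqrt(sum(u * u for u in retained))   # quadrature combination
    k = 2.33                                   # ~99% one-tailed coverage (normal assumption)
    expanded = k * combined

    is_positive = measured > threshold + expanded
    print(f"combined u = {combined:.2f}, expanded U = {expanded:.2f}, positive: {is_positive}")
    ```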

  14. Further studies to extend and test the area-time-integral technique applied to satellite data

    NASA Technical Reports Server (NTRS)

    Smith, Paul L.; Vonderhaar, Thomas H.

    1993-01-01

    The principal goal of this project is to establish relationships that would allow application of area-time integral (ATI) calculations based upon satellite data to estimate rainfall volumes. The research has been pursued using two different approaches, which for convenience can be designated as the 'fixed-threshold approach' and the 'variable-threshold approach'. In the former approach, an attempt is made to determine a single temperature threshold in the satellite infrared data that would yield ATI values for identifiable cloud clusters which are most closely related to the corresponding rainfall amounts. Results thus far have indicated that a strong correlation exists between the rain volumes and the satellite ATI values, but the optimum threshold for this relationship seems to differ from one geographic location to another. The difference is probably related to differences in the basic precipitation mechanisms that dominate in the different regions. The average rainfall rate associated with each cloudy pixel is also found to vary across the spectrum of ATI values. Work on the second, or 'variable-threshold', approach for determining the satellite ATI values was essentially suspended during this period due to exhaustion of project funds. Most of the ATI work thus far has dealt with cloud clusters from the Lagrangian or 'floating-target' point of view. For many purposes, however, the Eulerian or 'fixed-target' perspective is more appropriate. For a very large target area encompassing entire cluster life histories, the rain volume-ATI relationship would obviously be the same in either case. The important question for the Eulerian perspective is how small the fixed area can be made while maintaining consistency in that relationship.

  15. Normative behavioral thresholds for short tone-bursts.

    PubMed

    Beattie, R C; Rochverger, I

    2001-10-01

    Although tone-bursts have been commonly used in auditory brainstem response (ABR) evaluations for many years, national standards describing normal calibration values have not been established. This study was designed to gather normative threshold data to establish a physical reference for tone-burst stimuli that can be reproduced across clinics and laboratories. More specifically, we obtained norms for 3-msec tone-bursts presented at two repetition rates (9.3/sec and 39/sec), two gating functions (Trapezoid and Blackman), and four frequencies (500, 1000, 2000, and 4000 Hz). Our results are specified using three physical references: dB peak sound pressure level, dB peak-to-peak equivalent sound pressure level, and dB SPL (fast meter response, rate = 50 stimuli/sec). These data are offered for consideration when calibrating ABR equipment. The 39/sec stimulus rate yielded tone-burst thresholds that were approximately 3 dB lower than the 9.3/sec rate. The improvement in threshold with increasing stimulus rate may reflect the ability of the auditory system to integrate energy that occurs within a time interval of 200 to 500 msec (temporal integration). The Trapezoid gating function yielded thresholds that averaged 1.4 dB lower than the Blackman function. Although these differences are small and of little clinical importance, the cumulative effects of several instrument and/or procedural variables may yield clinically important differences.

  16. High-wafer-yield, high-performance vertical cavity surface-emitting lasers

    NASA Astrophysics Data System (ADS)

    Li, Gabriel S.; Yuen, Wupen; Lim, Sui F.; Chang-Hasnain, Constance J.

    1996-04-01

    Vertical cavity surface-emitting lasers (VCSELs) with a very low threshold current of 340 μA and a threshold voltage of 1.5 V are achieved. The wafers are grown by molecular beam epitaxy using a highly accurate, low-cost and versatile pre-growth calibration technique. One hundred percent VCSEL wafer yield is obtained. The low threshold current is achieved with a native-oxide-confined structure with excellent current confinement. Single transverse mode operation with a stable, predetermined polarization direction up to 18 times threshold is also achieved, due to the stable index guiding provided by the structure. This is the highest value reported to date for VCSELs. We have established that p-contact annealing in these devices is crucial for low-voltage operation, contrary to the general belief. Uniform doping in the mirrors also appears not to be inferior to complicated doping engineering. With these design rules, very low threshold voltage VCSELs are achieved with very simple growth and fabrication steps.

  17. Minimum Transendothelial Electrical Resistance Thresholds for the Study of Small and Large Molecule Drug Transport in a Human in Vitro Blood-Brain Barrier Model.

    PubMed

    Mantle, Jennifer L; Min, Lie; Lee, Kelvin H

    2016-12-05

    A human cell-based in vitro model that can accurately predict drug penetration into the brain, as well as metrics to assess such in vitro models, are valuable for the development of new therapeutics. Here, human induced pluripotent stem cells (hPSCs) are differentiated into a polarized monolayer that expresses blood-brain barrier (BBB)-specific proteins and has transendothelial electrical resistance (TEER) values greater than 2500 Ω·cm². By assessing the permeabilities of several known drugs, a benchmarking system to evaluate brain permeability of drugs was established. Furthermore, relationships between TEER and permeability to both small and large molecules were established, demonstrating that different minimum TEER thresholds must be achieved to study the brain transport of these two classes of drugs. This work demonstrates that this hPSC-derived BBB model exhibits an in vivo-like phenotype, and the benchmarks established here are useful for assessing the functionality of other in vitro BBB models.

  18. A class of stochastic delayed SIR epidemic models with generalized nonlinear incidence rate and temporary immunity

    NASA Astrophysics Data System (ADS)

    Fan, Kuangang; Zhang, Yan; Gao, Shujing; Wei, Xiang

    2017-09-01

    A class of SIR epidemic models with a generalized nonlinear incidence rate is presented in this paper. Temporary immunity and stochastic perturbation are also considered. The existence and uniqueness of the global positive solution is proved. Sufficient conditions guaranteeing the extinction and persistence of the epidemic disease are established. Moreover, the threshold behavior is discussed, and the threshold value R0 is obtained. We show that if R0 < 1, the disease eventually becomes extinct with probability one, whereas if R0 > 1, the system remains permanent in the mean.

  19. Establishing a rainfall threshold for flash flood warnings in China's mountainous areas based on a distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Miao, Qinghua; Yang, Dawen; Yang, Hanbo; Li, Zhe

    2016-10-01

    Flash flooding is one of the most common natural hazards in China, particularly in mountainous areas, and usually causes heavy damage and casualties. However, the forecasting of flash flooding in mountainous regions remains challenging because of the short response time and limited monitoring capacity. This paper aims to establish a strategy for flash flood warnings in mountainous ungauged catchments across humid, semi-humid and semi-arid regions of China. First, we implement a geomorphology-based hydrological model (GBHM) in four mountainous catchments with drainage areas that range from 493 to 1601 km². The results show that the GBHM can simulate flash floods appropriately in these four study catchments. We propose a method to determine the rainfall threshold for flood warning by using frequency analysis and binary classification based on long-term GBHM simulations forced by historical rainfall data, to create a practically easy and straightforward approach for flash flood forecasting in ungauged mountainous catchments with drainage areas from tens to hundreds of square kilometers. The results show that the rainfall threshold value decreases significantly with increasing antecedent soil moisture in humid regions, while it decreases only slightly with increasing soil moisture in semi-humid and semi-arid regions. We also find that accumulative rainfall over a certain time span (or rainfall over a long time span) is an appropriate threshold for flash flood warnings in humid regions because the runoff is dominated by saturation excess. However, the rainfall intensity (or rainfall over a short time span) is more suitable in semi-humid and semi-arid regions because infiltration excess dominates the runoff in these regions. We conduct a comprehensive evaluation of the rainfall threshold and find that the proposed method produces reasonably accurate flash flood warnings in the study catchments. An evaluation of the performance at uncalibrated interior points in the four gauged catchments provides results that are indicative of the expected performance at ungauged locations. We also find that an insufficient historical data length (13 years with a 5-year flood return period in this study) may introduce uncertainty in the estimation of the flood/rainfall threshold because of the small number of flood events used in the binary classification. A data sample that contains enough flood events exceeding the threshold value (10 events are suggested in the present study) is necessary to obtain acceptable results from the binary classification.
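
    The binary-classification step for fixing a warning threshold can be illustrated with synthetic events. In the sketch below, the event data, the flood labels and the use of the critical success index as the objective are all assumptions standing in for the paper's long-term GBHM simulations and its own classification procedure.

    ```python
    # Pick the rainfall threshold that maximises the critical success index (CSI)
    # over a set of (event rainfall, flood / no-flood) pairs.
    import numpy as np

    rng = np.random.default_rng(9)
    rain = rng.gamma(shape=2.0, scale=25.0, size=400)          # event rainfall (mm)
    flooded = rain + rng.normal(0, 15, rain.size) > 90.0       # synthetic flood labels

    def csi(threshold, rain, flooded):
        warn = rain >= threshold
        hits = np.sum(warn & flooded)
        misses = np.sum(~warn & flooded)
        false_alarms = np.sum(warn & ~flooded)
        denom = hits + misses + false_alarms
        return hits / denom if denom else 0.0

    candidates = np.linspace(rain.min(), rain.max(), 200)
    scores = [csi(t, rain, flooded) for t in candidates]
    best = candidates[int(np.argmax(scores))]
    print(f"rainfall threshold ~ {best:.0f} mm (CSI = {max(scores):.2f})")
    ```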

  20. On flows of viscoelastic fluids under threshold-slip boundary conditions

    NASA Astrophysics Data System (ADS)

    Baranovskii, E. S.

    2018-03-01

    We investigate a boundary-value problem for the steady isothermal flow of an incompressible viscoelastic fluid of Oldroyd type in a 3D bounded domain with impermeable walls. We use the Fujita threshold-slip boundary condition. This condition states that the fluid can slip along a solid surface when the shear stresses reach a certain critical value; otherwise the slipping velocity is zero. Assuming that the flow domain is not rotationally symmetric, we prove an existence theorem for the corresponding slip problem in the framework of weak solutions. The proof uses methods for solving variational inequalities with pseudo-monotone operators and convex functionals, the method of introduction of auxiliary viscosity, as well as a passage-to-limit procedure based on energy estimates of approximate solutions, Korn’s inequality, and compactness arguments. Also, some properties and estimates of weak solutions are established.

  1. Security of a single-state semi-quantum key distribution protocol

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Qiu, Daowen; Mateus, Paulo

    2018-06-01

    Semi-quantum key distribution protocols allow two users to set up a secure secret key. Compared with their full quantum counterparts, one of the two users is restricted to performing only "classical" or "semi-quantum" operations, which potentially makes the protocols easier to realize with fewer quantum resources. However, semi-quantum key distribution protocols mainly rely on a two-way quantum channel, so the eavesdropper has two opportunities to intercept the quantum states transmitted in the quantum communication stage. This may allow the eavesdropper to obtain more information and makes the security analysis more complicated. In the past ten years, many semi-quantum key distribution protocols have been proposed and proved to be robust. However, there are few works concerning their unconditional security, and it remains open how secure the semi-quantum protocols are and how much noise they can tolerate while still establishing a secure secret key. In this paper, we prove the unconditional security of a single-state semi-quantum key distribution protocol proposed by Zou et al. (Phys Rev A 79:052312, 2009). We present a complete proof from the information-theoretic point of view by deriving a lower bound on the protocol's key rate in the asymptotic scenario. Using this bound, we obtain an error threshold value such that, for all error rates below this threshold, a secure secret key can definitely be established between the legitimate users; otherwise, the users should abort the protocol. We illustrate the protocol under the circumstance that the reverse quantum channel is a depolarizing one with parameter q. Additionally, we compare the error threshold value with those of some full quantum protocols and several existing semi-quantum ones whose unconditional security proofs have been provided recently.

  2. Normal and abnormal tissue identification system and method for medical images such as digital mammograms

    NASA Technical Reports Server (NTRS)

    Heine, John J. (Inventor); Clarke, Laurence P. (Inventor); Deans, Stanley R. (Inventor); Stauduhar, Richard Paul (Inventor); Cullers, David Kent (Inventor)

    2001-01-01

    A system and method for analyzing a medical image to determine whether an abnormality is present, for example, in digital mammograms, includes the application of a wavelet expansion to a raw image to obtain subspace images of varying resolution. At least one subspace image is selected that has a resolution commensurate with a desired predetermined detection resolution range. A functional form of a probability distribution function is determined for each selected subspace image, and an optimal statistical normal image region test is determined for each selected subspace image. A threshold level for the probability distribution function is established from the optimal statistical normal image region test for each selected subspace image. A region size comprising at least one sector is defined, and an output image is created that includes a combination of all regions for each selected subspace image. Each region has a first value when the region intensity level is above the threshold and a second value when the region intensity level is below the threshold. This permits the localization of a potential abnormality within the image.

  3. On the threshold conditions for electron beam damage of asbestos amosite fibers in the transmission electron microscope (TEM).

    PubMed

    Martin, Joannie; Beauparlant, Martin; Sauvé, Sébastien; L'Espérance, Gilles

    2016-12-01

    Asbestos amosite fibers were investigated to evaluate the damage caused by the electron beam of a transmission electron microscope (TEM). Since elemental x-ray intensity ratios obtained by energy dispersive x-ray spectroscopy (EDS) are commonly used for asbestos identification, the impact of beam damage on these ratios was evaluated. It was determined that the magnesium/silicon ratio best represented the damage caused to the fiber. Various tests showed that most fibers have a current density threshold above which the chemical composition of the fiber is modified. The value of this threshold current density varied from fiber to fiber, regardless of fiber diameter, and in some cases could not be determined. The existence of a threshold electron dose was also demonstrated. This value was dependent on the current density used and can be increased by providing a recovery period between exposures to the electron beam. This study also established that the electron beam current is directly related to the damage rate above a current density of 165 A/cm². The large number of different results obtained suggests that, in order to ensure that the amosite fibers are not damaged, analysis should be conducted below a current density of 100 A/cm².

  4. Using instrumental (CIE and reflectance) measures to predict consumers' acceptance of beef colour.

    PubMed

    Holman, Benjamin W B; van de Ven, Remy J; Mao, Yanwei; Coombs, Cassius E O; Hopkins, David L

    2017-05-01

    We aimed to establish colorimetric thresholds based upon the capacity for instrumental measures to predict consumer satisfaction with beef colour. A web-based survey was used to distribute standardised photographs of beef M. longissimus lumborum with known colorimetrics (L*, a*, b*, hue, chroma, ratio of reflectance at 630nm and 580nm, and estimated deoxymyoglobin, oxymyoglobin and metmyoglobin concentrations) for scrutiny. Consumer demographics and perceived importance of colour to beef value were also evaluated. It was found that a* provided the most simple and robust prediction of beef colour acceptability. Beef colour was considered acceptable (with 95% acceptance) when a* values were equal to or above 14.5. Demographic effects on this threshold were negligible, but consumer nationality and gender did contribute to variation in the relative importance of colour to beef value. These results provide future beef colour studies with context to interpret objective colour measures in terms of consumer acceptance and market appeal. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  5. Consolidation of Bimetallic Nanosized Particles and Formation of Nanocomposites Depending on Conditions of Shock Wave Compaction

    NASA Astrophysics Data System (ADS)

    Vorozhtsov, S. A.; Kudryashova, O. B.; Lerner, M. I.; Vorozhtsov, A. B.; Khrustalyov, A. P.; Pervikov, A. V.

    2017-11-01

    The authors consider and evaluate the physical parameters and regularities of the process of consolidation of Fe-Cu, Cu-Nb, Ag-Ni and Fe-Pb nanoparticles when creating composite materials by means of shock wave compaction. From a theoretical consideration of the explosive compaction process, the physical conditions of the process were established and discussed, and a number of threshold pressure values corresponding to different target indicators of the state of the compact were established. The duration of the shock wave impact on the powders required for powder consolidation was also estimated.

  6. Mapping change of older forest with nearest-neighbor imputation and Landsat time-series

    Treesearch

    Janet L. Ohmann; Matthew J. Gregory; Heather M. Roberts; Warren B. Cohen; Robert E. Kennedy; Zhiqiang Yang

    2012-01-01

    The Northwest Forest Plan (NWFP), which aims to conserve late-successional and old-growth forests (older forests) and associated species, established new policies on federal lands in the Pacific Northwest USA. As part of monitoring for the NWFP, we tested nearest-neighbor imputation for mapping change in older forest, defined by threshold values for forest attributes...

  7. Home blood pressure monitoring. Current knowledge and directions for future research.

    PubMed

    Reims, H; Fossum, E; Kjeldsen, S E; Julius, S

    2001-01-01

    Home blood pressure (BP) monitoring has become popular in clinical practice and several automated devices for home BP measurement are now recommendable. Home BP is generally lower than clinic BP, and similar to daytime ambulatory BP. Home BP measurement eliminates the white coat effect and provides a high number of readings, and it is considered more accurate and reproducible than clinic BP. It can improve the sensitivity and statistical power of clinical drug trials and may have a higher prognostic value than clinic BP. Home monitoring may improve compliance and BP control, and reduce costs of hypertension management. Diagnostic thresholds and treatment target values for home BP remain to be established by longitudinal studies. Until then, home BP monitoring is to be considered a supplement. However, high home BP may support or confirm the diagnosis made in the doctor's office, and low home BP may warrant ambulatory BP monitoring. During long-term follow-up, home BP monitoring provides an opportunity for close attention to BP levels and variations. The first international guidelines have established a consensus document with recommendations, including a proposal of preliminary diagnostic thresholds, but further research is needed to define the precise role of home BP monitoring in clinical practice.

  8. Resonances and thresholds in the Rydberg-level population of multiply charged ions at solid surfaces

    NASA Astrophysics Data System (ADS)

    Nedeljković, Lj. D.; Nedeljković, N. N.

    1998-12-01

    We present a theoretical study of resonances and thresholds, two specific features of Rydberg-state formation of multiply charged ions (Z=6, 7, and 8) escaping a solid surface at intermediate velocities (v~1 a.u.) in the normal emergence geometry. The resonances are recognized in pronounced maxima of the experimentally observed population curves of Ar VIII ions for resonant values of the principal quantum number n=nres=11 and for the angular momentum quantum numbers l=1 and 2. Absence of optical signals in detectors of beam-foil experiments for n>nthr of S VI and Cl VII ions (with l=0, 1, and 2) and Ar VIII for l=0 is interpreted as a threshold phenomenon. An interplay between resonance and threshold effects is established within the framework of quantum dynamics of the low angular momentum Rydberg-state formation, based on a generalization of Demkov-Ostrovskii's charge-exchange model. In the model proposed, the Ar VIII resonances appear as a consequence of electron tunneling in the very vicinity of the ion-surface potential barrier top and at some critical ion-surface distances Rc. The observed thresholds are explained by means of a decay mechanism of ionic Rydberg states formed dominantly above the Fermi level EF of a solid conduction band. The theoretically predicted resonant and threshold values, nres and nthr of the principal quantum number n, as well as the obtained population probabilities Pnl=Pnl(v,Z), are in sufficiently good agreement with all available experimental findings.

  9. Application of the threshold of toxicological concern concept to pharmaceutical manufacturing operations.

    PubMed

    Dolan, David G; Naumann, Bruce D; Sargent, Edward V; Maier, Andrew; Dourson, Michael

    2005-10-01

    A scientific rationale is provided for estimating acceptable daily intake values (ADIs) for compounds with limited or no toxicity information to support pharmaceutical manufacturing operations. These ADIs are based on application of the "thresholds of toxicological concern" (TTC) principle, in which levels of human exposure are estimated that pose no appreciable risk to human health. The same concept has been used by the US Food and Drug Administration (FDA) to establish "thresholds of regulation" for indirect food additives and adopted by the Joint FAO/WHO Expert Committee on Food Additives for flavoring substances. In practice, these values are used as a statement of safety and indicate when no actions need to be taken in a given exposure situation. Pharmaceutical manufacturing relies on ADIs for cleaning validation of process equipment and atypical extraneous matter investigations. To provide practical guidance for handling situations where relatively unstudied compounds with limited or no toxicity data are encountered, recommendations are provided on ADI values that correspond to three categories of compounds: (1) compounds that are likely to be carcinogenic, (2) compounds that are likely to be potent or highly toxic, and (3) compounds that are not likely to be potent, highly toxic or carcinogenic. Corresponding ADIs for these categories of materials are 1, 10, and 100 μg/day, respectively.

  10. Derivation of groundwater threshold values for analysis of impacts predicted at potential carbon sequestration sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Last, G. V.; Murray, C. J.; Bott, Y.

    2016-06-01

    The U.S. Department of Energy's (DOE's) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts to groundwater quality due to carbon dioxide (CO2) or brine leakage, should it occur from deep CO2 storage reservoirs. These efforts targeted two classes of aquifer: an unconfined fractured carbonate aquifer based on the Edwards Aquifer in Texas, and a confined alluvium aquifer based on the High Plains Aquifer in Kansas. Hypothetical leakage scenarios focus on wellbores as the most likely conduits from the storage reservoir to an underground source of drinking water (USDW). To facilitate evaluation of potential degradation of the USDWs, threshold values, below which there would be no predicted impacts, were determined for each of these two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency's Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities. Results demonstrate the importance of establishing baseline groundwater quality conditions that capture the spatial and temporal variability of the USDWs prior to CO2 injection and storage.
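
    One common way to turn pooled baseline data into a "no predicted impact" threshold is an upper statistical limit on the background distribution. The sketch below uses a normal-theory 95% upper prediction limit purely as an illustration; the EPA Unified Guidance describes several interwell limit constructions, and the NRAP adaptation referenced above is not reproduced here.

    ```python
    # Background threshold as a 95% upper prediction limit on baseline samples.
    import numpy as np
    from scipy import stats

    baseline = np.array([12.1, 13.4, 11.8, 14.0, 12.6, 13.1, 12.9, 13.7,
                         11.5, 12.8, 13.3, 12.2])   # e.g. TDS in mg/L, placeholder values

    n = baseline.size
    mean, sd = baseline.mean(), baseline.std(ddof=1)
    t95 = stats.t.ppf(0.95, df=n - 1)
    upper_prediction_limit = mean + t95 * sd * np.sqrt(1.0 + 1.0 / n)

    print(f"background threshold (95% UPL): {upper_prediction_limit:.2f} mg/L")
    ```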

  11. Cartographic quality of ERTS-1 images

    NASA Technical Reports Server (NTRS)

    Welch, R. I.

    1973-01-01

    Analyses of simulated and operational ERTS images have provided initial estimates of resolution, ground resolution, detectability thresholds and other measures of image quality of interest to earth scientists and cartographers. Based on these values, including an approximate ground resolution of 250 meters for both RBV and MSS systems, the ERTS-1 images appear suited to the production and/or revision of planimetric and photo maps of 1:500,000 scale and smaller for which map accuracy standards are compatible with the imaged detail. Thematic mapping, although less constrained by map accuracy standards, will be influenced by measurement thresholds and errors which have yet to be accurately determined for ERTS images. This study also indicates the desirability of establishing a quantitative relationship between image quality values and map products which will permit both engineers and cartographers/earth scientists to contribute to the design requirements of future satellite imaging systems.

  12. Threshold selection for classification of MR brain images by clustering method

    NASA Astrophysics Data System (ADS)

    Moldovanu, Simona; Obreja, Cristian; Moraru, Luminita

    2015-12-01

    Given a grey-intensity image, our method detects the optimal threshold for a suitable binarization of MR brain images. In MR brain image processing, the grey levels of pixels belonging to the object are not substantially different from the grey levels belonging to the background. Threshold optimization is an effective tool for separating objects from the background and, further, for classification applications. This paper gives a detailed investigation of the selection of thresholds. Our method does not use the well-known binarization methods; instead, we perform a simple threshold optimization which, in turn, allows the best classification of the analyzed images into healthy and multiple sclerosis classes. The dissimilarity (or the distance between classes) has been established using a clustering method based on dendrograms. We tested our method using two classes of images, consisting of 20 T2-weighted and 20 proton density (PD)-weighted scans from two healthy subjects and two patients with multiple sclerosis. For each image and each threshold, the number of white pixels (i.e., the area of white objects in the binary image) was determined. These pixel counts are the objects used in the clustering operation. The following optimum threshold values are obtained: T = 80 for PD images and T = 30 for T2w images. Each of these thresholds clearly separates the clusters belonging to the studied groups, healthy subjects and patients with multiple sclerosis.
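
    The record's procedure (binarise at a candidate threshold, count white pixels, cluster the counts) can be mocked up as follows. The synthetic "images", group sizes and agreement score are illustrative assumptions; only the overall shape of the method follows the abstract.

    ```python
    # Evaluate candidate thresholds by how well the white-pixel counts cluster
    # into the two known groups (hierarchical clustering, as in a dendrogram).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(2)
    healthy = rng.normal(100, 20, size=(4, 64, 64))
    disease = rng.normal(120, 20, size=(4, 64, 64))    # brighter lesions, placeholder
    images = np.concatenate([healthy, disease])
    labels = np.array([0] * 4 + [1] * 4)

    def separation_score(threshold):
        counts = np.array([[(img > threshold).sum()] for img in images], dtype=float)
        clusters = fcluster(linkage(counts, method="average"), t=2, criterion="maxclust")
        # agreement between the 2-cluster split and the true groups (permutation-safe)
        return max(np.mean((clusters == 1) == (labels == 0)),
                   np.mean((clusters == 1) == (labels == 1)))

    for t in (80, 100, 120, 140):
        print(f"threshold {t}: group separation {separation_score(t):.2f}")
    ```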

  13. 76 FR 43149 - Approval and Promulgation of Air Quality Implementation Plans; New Mexico; Prevention of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... of Significant Deterioration (PSD) program to establish appropriate emission thresholds for... Mexico's December 1, 2010, proposed SIP revision establishes appropriate emissions thresholds for... appropriate thresholds for GHG permitting applicability into New Mexico's SIP, then paragraph (d) in Sec. 52...

  14. Sub-lethal Ocular Trauma (SLOT): Establishing a Standardized Blast Threshold to Facilitate Diagnostic, Early Treatment, and Recovery Studies for Blast Injuries to the Eye and Optic Nerve

    DTIC Science & Technology

    2013-09-01

  15. Natural environment application for NASP-X-30 design and mission planning

    NASA Technical Reports Server (NTRS)

    Johnson, D. L.; Hill, C. K.; Brown, S. C.; Batts, G. W.

    1993-01-01

    The NASA/MSFC Mission Analysis Program has recently been utilized in various National Aero-Space Plane (NASP) mission and operational planning scenarios. This paper focuses on presenting various atmospheric constraint statistics for assumed NASP mission phases using established natural environment design, parametric, and threshold values. Probabilities of no-go are calculated using atmospheric parameters such as temperature, humidity, density altitude, peak/steady-state winds, cloud cover/ceiling, thunderstorms, and precipitation. The program, although developed to evaluate test or operational missions after flight constraints have been established, can provide valuable information in the design phase of the NASP X-30 program. By inputting the design values as flight constraints, the Mission Analysis Program returns the probability of no-go, or launch delay, by hour and by month. This output tells the X-30 program manager whether the design values are stringent enough to meet the required test flight schedules.

  16. Determination of irritant threshold concentrations of multiple tree, grass, weed and mould allergens for intradermal testing of horses residing in the southern USA.

    PubMed

    Lane, Martha J; Pucheu-Haston, Cherie M; Kearney, Michael T; Woodward, Michelle

    2017-12-01

    Appropriate allergen threshold concentrations (TCs) for intradermal testing (IDT) have not been established in horses for many pollen and mould allergens. The objectives were to determine the TCs in non-allergic horses and to describe the frequency of late-phase reactions for 26 allergens, including trees, grasses, weeds and moulds, in horses residing in the southern United States. Twenty-four clinically normal horses in the southern United States were studied. Threshold concentrations for different allergens were determined using IDT subjective measurements at 30 minutes. Delayed reactions were evaluated at 4 and 24 h. Threshold concentrations (all PNU/mL) were established for eight tree allergens (black willow 1,000, box elder 1,000, live oak 1,000, pecan 2,000, white ash 4,000, red oak 4,000, red mulberry 2,000 and green ash 2,000); two grass allergens (Johnson grass 250 PNU/mL and Kentucky blue grass 500 PNU/mL); two weeds (carelessweed 1,000 PNU/mL, great ragweed 500 PNU/mL) and one mould (Curvularia 8,000 PNU/mL). The TC was not determined for bahia and perennial rye grass due to excessive reactivity at the lowest concentration tested (1,000 PNU/mL). Eleven other allergens did not meet the criteria to establish a TC when evaluated at 30 min due to a lack of positive reactions. Multiple allergens caused positive reactions in ≥10% of horses at 4 h. Reactions at 24 h were rare, with the exception of one horse. This study identified intradermal TCs for multiple pollen and mould allergens in horses. These values may prove useful for optimizing allergen concentrations for IDT of allergic horses. © 2017 ESVD and ACVD.

  17. Threshold exceedance risk assessment in complex space-time systems

    NASA Astrophysics Data System (ADS)

    Angulo, José M.; Madrid, Ana E.; Romero, José L.

    2015-04-01

    Environmental and health impact risk assessment studies most often involve analysis and characterization of complex spatio-temporal dynamics. Recent developments in this context are addressed, among other objectives, to proper representation of structural heterogeneities, heavy-tailed processes, long-range dependence, intermittency, scaling behaviour, etc. Extremal behaviour related to spatial threshold exceedances can be described in terms of geometrical characteristics and distribution patterns of excursion sets, which are the basis for construction of risk-related quantities, such as in the evolutionary study of 'hotspots' and long-term indicators of occurrence of extremal episodes. Derivation of flexible techniques, suitable both for application under general conditions and for interpretation of singularities, is important in practice. Modern risk theory, a developing discipline motivated by the need to establish solid general mathematical-probabilistic foundations for rigorous definition and characterization of risk measures, has led to the introduction of a variety of classes and families, ranging from some conceptually inspired by specific fields of application to some intended to provide generality and flexibility to risk analysts under parametric specifications. Quantile-based risk measures, such as Value-at-Risk (VaR), Average Value-at-Risk (AVaR), and their generalization to spectral measures, are of particular interest for assessment under very general conditions. In this work, we study the application of quantile-based risk measures in the spatio-temporal context in relation to certain geometrical characteristics of spatial threshold exceedance sets. In particular, we establish a closed-form relationship between VaR, AVaR, and the expected value of threshold exceedance areas and excess volumes. Conditional simulation allows us, by means of empirical global and local spatial cumulative distributions, to derive various statistics of practical interest and subsequently to construct dynamic risk maps. Further, we study the implementation of static and dynamic spatial deformation under this setup, which is meaningful, among other aspects, for incorporation of heterogeneities and/or covariate effects, or consideration of external factors for risk measurement. We illustrate this approach through Environment and Health applications. This work is partially supported by grant MTM2012-32666 of the Spanish Ministry of Economy and Competitiveness (co-financed by FEDER).
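
    The quantile-based measures named above are straightforward to compute empirically. The sketch below (not from the paper) estimates Value-at-Risk and Average Value-at-Risk for a sample of simulated field values at one location; the distribution, sample size, and confidence level are illustrative assumptions.

```python
import numpy as np

def var_avar(samples, alpha=0.95):
    """Empirical Value-at-Risk (alpha-quantile) and Average Value-at-Risk
    (mean of the values at or above VaR, also known as expected shortfall)."""
    x = np.sort(np.asarray(samples, dtype=float))
    var = np.quantile(x, alpha)
    avar = x[x >= var].mean()
    return var, avar

# Toy use: heavy-tailed values of a simulated field at one location.
rng = np.random.default_rng(0)
values = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
print(var_avar(values, alpha=0.95))
```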

  18. Degree of Phosphorus Saturation and Soil Phosphorus Thresholds in an Ultisol Amended with Triple Superphosphate and Phosphate Rocks

    PubMed Central

    Gikonyo, E. W.; Zaharah, A. R.; Hanafi, M. M.; Anuar, A. R.

    2011-01-01

    Soil phosphorus (P) release capability could be assessed through the degree of P saturation (DPS). Our main objective was to determine DPS and, hence, P threshold DPS values of an Ultisol treated with triple superphosphate (TSP), Gafsa phosphate rocks (GPR), or Christmas Island phosphate rocks (CIPR), plus or minus manure. P release was determined by the iron oxide-impregnated paper strip (strip P), while DPS was determined from ammonium oxalate-extractable aluminum (Al), iron (Fe), and P. Soils were sampled from a closed incubation study involving soils treated with TSP, GPR, and CIPR at 0–400 mg P kg-1, and a field study where soils were fertilized with the same P sources at 100–300 kg P ha-1 plus or minus manure. The DPS was significantly influenced by P source × P rate, P source × manure (incubated soils), and by P source × P rate × time (field-sampled soils). Incubated soil results indicated that both initial P and total strip P were related to DPS by exponential functions: initial strip P = 1.38·exp(0.18·DPS), R² = 0.82**, and total strip P = 8.01·exp(0.13·DPS), R² = 0.65**. Initial strip P was linearly related to total P: total P = 2.45 × initial P + 8.41, R² = 0.85**. The threshold DPS value established was about 22% (incubated soil). Field soils had lower DPS values (<12%), and strip P was related to initial DPS and average DPS by strip P = 2.6·exp(0.44·DPS), R² = 0.77**, and strip P = 1.1·DPS² - 2.4·DPS + 6.2, R² = 0.58**, respectively. The threshold values were both at ≈8%, and P release was 11–14 mg P kg-1. The results show that DPS can be used to predict P release, but the threshold values are environmentally sensitive; hence, recommendations should be based on field trials. PMID:21805012
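
    As a purely illustrative aid, the sketch below simply evaluates the reported incubated-soil regression equations to show how a given DPS value translates into predicted strip P; the function names and example DPS values are ours, not the authors'.

```python
import numpy as np

def initial_strip_p(dps):
    """Reported incubated-soil relation: initial strip P = 1.38 * exp(0.18 * DPS)."""
    return 1.38 * np.exp(0.18 * dps)

def total_strip_p(dps):
    """Reported incubated-soil relation: total strip P = 8.01 * exp(0.13 * DPS)."""
    return 8.01 * np.exp(0.13 * dps)

for dps in (10, 22, 30):  # 22% is the threshold reported for the incubated soil
    print(f"DPS {dps}%: initial strip P ~ {initial_strip_p(dps):.1f}, "
          f"total strip P ~ {total_strip_p(dps):.1f} mg P kg-1")
```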

  19. Degree of phosphorus saturation and soil phosphorus thresholds in an ultisol amended with triple superphosphate and phosphate rocks.

    PubMed

    Gikonyo, E W; Zaharah, A R; Hanafi, M M; Anuar, A R

    2011-07-28

    Soil phosphorus (P) release capability could be assessed through the degree of P saturation (DPS). Our main objective was to determine DPS and, hence, P threshold DPS values of an Ultisol treated with triple superphosphate (TSP), Gafsa phosphate rocks (GPR), or Christmas Island phosphate rocks (CIPR), plus or minus manure. P release was determined by the iron oxide-impregnated paper strip (strip P), while DPS was determined from ammonium oxalate-extractable aluminum (Al), iron (Fe), and P. Soils were sampled from a closed incubation study involving soils treated with TSP, GPR, and CIPR at 0-400 mg P kg-1, and a field study where soils were fertilized with the same P sources at 100-300 kg P ha-1 plus or minus manure. The DPS was significantly influenced by P source × P rate, P source × manure (incubated soils), and by P source × P rate × time (field-sampled soils). Incubated soil results indicated that both initial P and total strip P were related to DPS by exponential functions: initial strip P = 1.38·exp(0.18·DPS), R² = 0.82**, and total strip P = 8.01·exp(0.13·DPS), R² = 0.65**. Initial strip P was linearly related to total P: total P = 2.45 × initial P + 8.41, R² = 0.85**. The threshold DPS value established was about 22% (incubated soil). Field soils had lower DPS values (<12%), and strip P was related to initial DPS and average DPS by strip P = 2.6·exp(0.44·DPS), R² = 0.77**, and strip P = 1.1·DPS² - 2.4·DPS + 6.2, R² = 0.58**, respectively. The threshold values were both approximately equal to 8%, and P release was 11-14 mg P kg-1. The results show that DPS can be used to predict P release, but the threshold values are environmentally sensitive; hence, recommendations should be based on field trials.

  20. Higher certainty of the laser-induced damage threshold test with a redistributing data treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Lars; Mrohs, Marius; Gyamfi, Mark

    2015-10-15

    As a consequence of its statistical nature, the measurement of the laser-induced damage threshold always carries the risk of over- or underestimating the real threshold value. The S-on-1 (and 1-on-1) tests outlined in the corresponding ISO standard 21254 are among the established measurement procedures; their results depend on the number of data points and their distribution over the fluence scale. With the limited space on a test sample as well as the requirements on test-site separation and beam sizes, the amount of data from one test is restricted. This paper reports on a way to treat damage test data in order to reduce the statistical error and therefore the measurement uncertainty. Three simple assumptions allow for the assignment of one data point to multiple data bins and therefore virtually increase the available data base.

  1. The difference between “equivalent” and “not different”

    DOE PAGES

    Anderson-Cook, Christine M.; Borror, Connie M.

    2015-10-27

    Often, experimenters wish to establish that populations of units can be considered equivalent to each other, in order to leverage improved knowledge about one population for characterizing the new population, or to establish the comparability of items. Equivalence tests have existed for many years, but their use in industry seems to have been largely restricted to biomedical applications, such as assessing the equivalence of two drugs or protocols. We present the fundamentals of equivalence tests, compare them to traditional two-sample and ANOVA tests that are better suited to establishing differences in populations, and propose the use of a graphical summary to compare p-values across different thresholds of practically important differences.
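
    For readers unfamiliar with equivalence testing, the following is a minimal sketch of the standard two one-sided tests (TOST) procedure for two independent samples. It is a generic textbook construction, not the authors' code; the equivalence margin delta and the toy data are invented.

```python
import numpy as np
from scipy import stats

def tost_equivalence(x, y, delta):
    """Two one-sided tests (TOST): is the mean difference of x and y within +/- delta?
    Returns the TOST p-value (equivalence is declared when it is below alpha)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    diff = x.mean() - y.mean()
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    se = np.sqrt(sp2 * (1.0 / nx + 1.0 / ny))
    df = nx + ny - 2
    p_lower = 1.0 - stats.t.cdf((diff + delta) / se, df)  # H0: diff <= -delta
    p_upper = stats.t.cdf((diff - delta) / se, df)        # H0: diff >= +delta
    return max(p_lower, p_upper)

rng = np.random.default_rng(1)
a = rng.normal(10.0, 1.0, 40)
b = rng.normal(10.1, 1.0, 40)
print(tost_equivalence(a, b, delta=0.5))
```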

  2. Phonation threshold pressure predictions using viscoelastic properties up to 1,400 Hz of injectables intended for Reinke's space.

    PubMed

    Klemuk, Sarah A; Lu, Xiaoying; Hoffman, Henry T; Titze, Ingo R

    2010-05-01

    Viscoelastic properties of numerous vocal fold injectables have been reported but not at speaking frequencies. For materials intended for Reinke's space, ramifications of property values are of great concern because of their impact on ease of voice onset. Our objectives were: 1) to measure viscoelastic properties of a new nonresorbing carbomer and well-known vocal fold injectables at vocalization frequencies using established and new instrumentation, and 2) to predict phonation threshold pressures using a computer model with intended placement in Reinke's space. Rheology and phonation threshold pressure calculations. Injectables were evaluated with a traditional rotational rheometer and a new piezo-rotary vibrator. Using these data at vocalization frequencies, phonation threshold pressures (PTP) were calculated for each biomaterial, assuming a low dimensional model with supraglottic coupling and adjusted vocal fold length and thickness at each frequency. Results were normalized to a nominal PTP value. Viscoelastic data were acquired at vocalization frequencies as high as 363 to 1,400 Hz for six new carbomer hydrogels, Hylan B, and Extracel intended for vocal fold Reinke's space injection and for Cymetra (lateral injection). Reliability was confirmed with good data overlap when measuring with either rheometer. PTP predictions ranged from 0.001 to 16 times the nominal PTP value of 0.283 kPa. Accurate viscoelastic measurements of vocal fold injectables are now possible at physiologic frequencies. Hylan B, Extracel, and the new carbomer hydrogels should generate easy vocal onset and sustainable vocalization based on their rheologic properties if injected into Reinke's space. Applications may vary depending on desired longevity of implant. Laryngoscope, 2010.

  3. K-edge energy-based calibration method for photon counting detectors

    NASA Astrophysics Data System (ADS)

    Ge, Yongshuai; Ji, Xu; Zhang, Ran; Li, Ke; Chen, Guang-Hong

    2018-01-01

    In recent years, potential applications of energy-resolved photon counting detectors (PCDs) in the x-ray medical imaging field have been actively investigated. Unlike conventional x-ray energy integration detectors, PCDs count the number of incident x-ray photons within certain energy windows. For PCDs, the interactions between x-ray photons and photoconductor generate electronic voltage pulse signals. The pulse height of each signal is proportional to the energy of the incident photons. By comparing the pulse height with the preset energy threshold values, x-ray photons with specific energies are recorded and sorted into different energy bins. To quantitatively understand the meaning of the energy threshold values, and thus to assign an absolute energy value to each energy bin, energy calibration is needed to establish the quantitative relationship between the threshold values and the corresponding effective photon energies. In practice, the energy calibration is not always easy, due to the lack of well-calibrated energy references for the working energy range of the PCDs. In this paper, a new method was developed to use the precise knowledge of the characteristic K-edge energy of materials to perform energy calibration. The proposed method was demonstrated using experimental data acquired from three K-edge materials (viz., iodine, gadolinium, and gold) on two different PCDs (Hydra and Flite, XCounter, Sweden). Finally, the proposed energy calibration method was further validated using a radioactive isotope (Am-241) with a known decay energy spectrum.
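
    A hedged sketch of the calibration idea: once the threshold setting at which each material's K-edge appears has been located in the counts-versus-threshold scan, a simple fit maps detector threshold units to photon energy. Only the K-edge energies below are physical constants; the DAC readings are invented placeholders, and the linear response model is an assumption made for illustration.

```python
import numpy as np

# Known K-edge energies (keV) of the calibration materials mentioned in the abstract.
k_edges_keV = {"iodine": 33.2, "gadolinium": 50.2, "gold": 80.7}

# Hypothetical threshold settings (detector DAC units) at which the K-edge feature
# was located for each material; real values depend on the detector and readout chain.
dac_at_edge = {"iodine": 412.0, "gadolinium": 596.0, "gold": 928.0}

x = np.array([dac_at_edge[m] for m in k_edges_keV])
y = np.array([k_edges_keV[m] for m in k_edges_keV])

gain, offset = np.polyfit(x, y, 1)  # assumed linear calibration E = gain * DAC + offset
print(f"E(keV) ~ {gain:.4f} * threshold + {offset:.2f}")
print("threshold 700 maps to about", round(gain * 700 + offset, 1), "keV")
```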

  4. A Cyfip2-Dependent Excitatory Interneuron Pathway Establishes the Innate Startle Threshold.

    PubMed

    Marsden, Kurt C; Jain, Roshan A; Wolman, Marc A; Echeverry, Fabio A; Nelson, Jessica C; Hayer, Katharina E; Miltenberg, Ben; Pereda, Alberto E; Granato, Michael

    2018-04-17

    Sensory experiences dynamically modify whether animals respond to a given stimulus, but it is unclear how innate behavioral thresholds are established. Here, we identify molecular and circuit-level mechanisms underlying the innate threshold of the zebrafish startle response. From a forward genetic screen, we isolated five mutant lines with reduced innate startle thresholds. Using whole-genome sequencing, we identify the causative mutation for one line to be in the fragile X mental retardation protein (FMRP)-interacting protein cyfip2. We show that cyfip2 acts independently of FMRP and that reactivation of cyfip2 restores the baseline threshold after phenotype onset. Finally, we show that cyfip2 regulates the innate startle threshold by reducing neural activity in a small group of excitatory hindbrain interneurons. Thus, we identify a selective set of genes critical to establishing an innate behavioral threshold and uncover a circuit-level role for cyfip2 in this process. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  5. 76 FR 59899 - Approval and Promulgation of Air Quality Implementation Plans; Indiana; Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Significant Deterioration (PSD) program to establish appropriate emission thresholds for determining which new... emissions above the thresholds established in the PSD regulations. DATES: This final rule is effective on... of GHG, and that do not limit PSD applicability to GHGs to the higher thresholds in the Tailoring...

  6. Normal standards for computer-ECG programs for prognostically and diagnostically important ECG variables derived from a large ethnically diverse female cohort: the Women's Health Initiative (WHI).

    PubMed

    Rautaharju, Pentti M; Zhang, Zhu-ming; Gregg, Richard E; Haisty, Wesley K; Z Vitolins, Mara; Curtis, Anne B; Warren, James; Horaĉek, Milan B; Zhou, Sophia H; Soliman, Elsayed Z

    2013-01-01

    Substantial new information has emerged recently about the prognostic value of a variety of new ECG variables. The objective of the present study was to establish reference standards for these novel risk predictors in a large, ethnically diverse cohort of healthy women from the Women's Health Initiative (WHI) study. The study population consisted of 36,299 healthy women. Racial differences in rate-adjusted QT end (QT(ea)) and QT peak (QT(pa)) intervals as linear functions of RR were small, leading to the conclusion that 450 and 390 ms are applicable as thresholds for prolonged and shortened QT(ea) and, similarly, 365 and 295 ms for prolonged and shortened QT(pa), respectively. As a threshold for increased dispersion of global repolarization (T(peak)T(end) interval), 110 ms was established for white and Hispanic women and 120 ms for African-American and Asian women. ST elevation and depression values for the monitoring leads of each person, with limb electrodes at Mason-Likar positions and chest leads at the level of V1 and V2, were first computed from standard leads using lead transformation coefficients derived from 892 body surface maps, and subsequently normal standards were determined for the monitoring leads, including vessel-specific bipolar left anterior descending, left circumflex artery and right coronary artery leads. The results support the choice of 150 μV as a tentative threshold for abnormal ST-onset elevation for all monitoring leads. Body mass index (BMI) had a profound effect on Cornell voltage and Sokolow-Lyon voltage in all racial groups, and their utility for left ventricular hypertrophy classification remains open. Common thresholds for all racial groups are applicable for QT(ea) and QT(pa) intervals and ST elevation. Race-specific normal standards are required for many other ECG parameters. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Decomposition of Fuzzy Soft Sets with Finite Value Spaces

    PubMed Central

    Jun, Young Bae

    2014-01-01

    The notion of fuzzy soft sets is a hybrid soft computing model that integrates both gradualness and parameterization methods in harmony to deal with uncertainty. The decomposition of fuzzy soft sets is of great importance in both theory and practical applications with regard to decision making under uncertainty. This study aims to explore decomposition of fuzzy soft sets with finite value spaces. Scalar uni-product and int-product operations of fuzzy soft sets are introduced and some related properties are investigated. Using t-level soft sets, we define level equivalent relations and show that the quotient structure of the unit interval induced by level equivalent relations is isomorphic to the lattice consisting of all t-level soft sets of a given fuzzy soft set. We also introduce the concepts of crucial threshold values and complete threshold sets. Finally, some decomposition theorems for fuzzy soft sets with finite value spaces are established, illustrated by an example concerning the classification and rating of multimedia cell phones. The obtained results extend some classical decomposition theorems of fuzzy sets, since every fuzzy set can be viewed as a fuzzy soft set with a single parameter. PMID:24558342

  8. Decomposition of fuzzy soft sets with finite value spaces.

    PubMed

    Feng, Feng; Fujita, Hamido; Jun, Young Bae; Khan, Madad

    2014-01-01

    The notion of fuzzy soft sets is a hybrid soft computing model that integrates both gradualness and parameterization methods in harmony to deal with uncertainty. The decomposition of fuzzy soft sets is of great importance in both theory and practical applications with regard to decision making under uncertainty. This study aims to explore decomposition of fuzzy soft sets with finite value spaces. Scalar uni-product and int-product operations of fuzzy soft sets are introduced and some related properties are investigated. Using t-level soft sets, we define level equivalent relations and show that the quotient structure of the unit interval induced by level equivalent relations is isomorphic to the lattice consisting of all t-level soft sets of a given fuzzy soft set. We also introduce the concepts of crucial threshold values and complete threshold sets. Finally, some decomposition theorems for fuzzy soft sets with finite value spaces are established, illustrated by an example concerning the classification and rating of multimedia cell phones. The obtained results extend some classical decomposition theorems of fuzzy sets, since every fuzzy set can be viewed as a fuzzy soft set with a single parameter.
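
    The t-level soft set construction used in both papers is easy to illustrate. The toy example below (the cell-phone ratings are invented) shows how each threshold t turns a fuzzy soft set into an ordinary, crisp soft set.

```python
# A fuzzy soft set assigns to each parameter a fuzzy subset of the universe U;
# the t-level soft set keeps, per parameter, the objects with membership >= t.

U = ["phone1", "phone2", "phone3", "phone4"]

fuzzy_soft_set = {  # hypothetical ratings in [0, 1]
    "camera":  {"phone1": 0.9, "phone2": 0.4, "phone3": 0.7, "phone4": 0.2},
    "battery": {"phone1": 0.3, "phone2": 0.8, "phone3": 0.6, "phone4": 0.5},
}

def t_level_soft_set(F, t):
    return {param: {u for u, mu in fs.items() if mu >= t} for param, fs in F.items()}

for t in (0.5, 0.7):
    print(t, t_level_soft_set(fuzzy_soft_set, t))
```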

  9. Threshold selection for classification of MR brain images by clustering method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moldovanu, Simona; Dumitru Moţoc High School, 15 Milcov St., 800509, Galaţi; Obreja, Cristian

    Given a grey-intensity image, our method detects the optimal threshold for a suitable binarization of MR brain images. In MR brain image processing, the grey levels of pixels belonging to the object are not substantially different from the grey levels belonging to the background. Threshold optimization is an effective tool to separate objects from the background and, further, in classification applications. This paper gives a detailed investigation of the selection of thresholds. Our method does not use the well-known method for binarization. Instead, we perform a simple threshold optimization which, in turn, allows the best classification of the analyzed images into healthy and multiple sclerosis disease. The dissimilarity (or the distance between classes) has been established using a clustering method based on dendrograms. We tested our method using two classes of images: 20 T2-weighted and 20 proton density (PD)-weighted scans from two healthy subjects and from two patients with multiple sclerosis. For each image and for each threshold, the number of white pixels (i.e., the area of white objects in the binary image) was determined. These pixel counts represent the objects in the clustering operation. The following optimum threshold values were obtained: T = 80 for PD images and T = 30 for T2w images. Each threshold clearly separates the clusters belonging to the two studied groups, healthy subjects and patients with multiple sclerosis.
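
    A minimal sketch of the feature-extraction step described above: for each candidate threshold, the grey-level image is binarized and the white pixels are counted; these counts are what the dendrogram-based clustering would then operate on. The toy image and threshold grid are illustrative, not the study's data.

```python
import numpy as np

def white_pixel_counts(image, thresholds):
    """For each candidate threshold, binarize the grey-level image and count white pixels."""
    img = np.asarray(image)
    return {t: int((img > t).sum()) for t in thresholds}

# Toy grey-level "image"; a real study would load the MR scans instead.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
print(white_pixel_counts(img, thresholds=range(20, 101, 10)))
```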

  10. Normal Threshold Size of Stimuli in Children Using a Game-Based Visual Field Test.

    PubMed

    Wang, Yanfang; Ali, Zaria; Subramani, Siddharth; Biswas, Susmito; Fenerty, Cecilia; Henson, David B; Aslam, Tariq

    2017-06-01

    The aim of this study was to demonstrate and explore the ability of novel game-based perimetry to establish normal visual field thresholds in children. One hundred and eighteen children (aged 8.0 ± 2.8 years) with no history of visual field loss or significant medical history were recruited. Each child had one eye tested using a game-based visual field test, 'Caspar's Castle', at four retinal locations 12.7° (N = 118) from fixation. Thresholds were established repeatedly using up/down staircase algorithms with stimuli of varying diameter (luminance 20 cd/m², duration 200 ms, background luminance 10 cd/m²). Relationships between threshold and age were determined along with measures of intra- and intersubject variability. The game-based visual field test was able to establish threshold estimates over the full range of children tested. Threshold size decreased with increasing age. Intrasubject and intersubject variability were inversely related to age. Normal visual field thresholds were established for specific locations in children using a novel game-based visual field test. These could be used as a foundation for developing a game-based perimetry screening test for children.

  11. A national survey of HDR source knowledge among practicing radiation oncologists and residents: Establishing a willingness-to-pay threshold for cobalt-60 usage.

    PubMed

    Mailhot Vega, Raymond; Talcott, Wesley; Ishaq, Omar; Cohen, Patrice; Small, Christina J; Duckworth, Tamara; Sarria Bardales, Gustavo; Perez, Carmen A; Schiff, Peter B; Small, William; Harkenrider, Matthew M

    Ir-192 is the predominant source for high-dose-rate (HDR) brachytherapy in United States markets. Co-60, with a longer half-life and fewer source exchanges, has been piloted abroad with comparable clinical dosimetry but increased shielding requirements. We sought to identify practitioner knowledge of Co-60 and establish acceptable willingness-to-pay (WTP) thresholds for additional shielding requirements for use in future cost-benefit analysis. A nationwide survey of U.S. radiation oncologists was conducted from June to July 2015, assessing knowledge of HDR sources, brachytherapy unit shielding, and factors that may influence source-selection decision-making. Self-identified decision makers in radiotherapy equipment purchase and acquisition were asked their WTP on shielding should a more cost-effective source become available. Four hundred forty surveys were completed and included. Forty-four percent were ABS members. Twenty percent of respondents identified Co-60 as an HDR source. Respondents who identified Co-60 were significantly more likely to be ABS members, have attended a national brachytherapy conference, and be involved in brachytherapy selection. Sixty-six percent of self-identified decision makers stated that their facility would switch to a more cost-effective source than Ir-192, if available. Cost and experience were the most common reasons provided for not switching. The most common WTP value selected by respondents was <$25,000. A majority of respondents were unaware of Co-60 as a commercially available HDR source. This investigation was novel in directly assessing decision makers to establish WTP for shielding costs that a source change to Co-60 may require. These results will be used to establish a WTP threshold for future cost-benefit analysis. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  12. 41 CFR 102-73.40 - What happens if the dollar value of the project exceeds the prospectus threshold?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... dollar value of the project exceeds the prospectus threshold? 102-73.40 Section 102-73.40 Public... § 102-73.40 What happens if the dollar value of the project exceeds the prospectus threshold? Projects... the prospectus threshold. To obtain this approval, the Administrator of General Services will transmit...

  13. 41 CFR 102-73.40 - What happens if the dollar value of the project exceeds the prospectus threshold?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... dollar value of the project exceeds the prospectus threshold? 102-73.40 Section 102-73.40 Public... § 102-73.40 What happens if the dollar value of the project exceeds the prospectus threshold? Projects... the prospectus threshold. To obtain this approval, the Administrator of General Services will transmit...

  14. 41 CFR 102-73.40 - What happens if the dollar value of the project exceeds the prospectus threshold?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... dollar value of the project exceeds the prospectus threshold? 102-73.40 Section 102-73.40 Public... § 102-73.40 What happens if the dollar value of the project exceeds the prospectus threshold? Projects... the prospectus threshold. To obtain this approval, the Administrator of General Services will transmit...

  15. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
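
    The sketch below is one possible reading of the described logic, not the patented implementation: thresholds are periodically offset by the measured drift of the quiescent level, and a qualification counter requires the trigger criterion to be met several times before a recording event is declared. The class name and all parameters are hypothetical.

```python
from collections import deque

class DriftCompensatedTrigger:
    """Toy sketch: trigger thresholds that follow slow drift of the quiescent level,
    with a qualification counter so a single excursion cannot start a recording."""

    def __init__(self, high, low, qual_width=3, recal_every=1000):
        self.quiescent = 0.0                  # assumed initial quiescent level
        self.high, self.low = high, low
        self.qual_width = qual_width          # times the criterion must be met in a row
        self.recal_every = recal_every        # samples between threshold re-computations
        self.history = deque(maxlen=recal_every)
        self.qual_count = 0
        self.n = 0

    def update(self, sample):
        self.history.append(sample)
        self.n += 1
        if self.n % self.recal_every == 0:    # periodic (time/counter based) re-computation
            drift = sum(self.history) / len(self.history) - self.quiescent
            self.high += drift                # offset thresholds by the measured drift
            self.low += drift
            self.quiescent += drift
        if sample > self.high or sample < self.low:
            self.qual_count += 1
        else:
            self.qual_count = 0
        return self.qual_count >= self.qual_width   # True -> start a data recording event

trig = DriftCompensatedTrigger(high=5.0, low=-5.0)
print(any(trig.update(s) for s in [0.1, 0.2, 6.0, 6.5, 7.0, 0.0]))
```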

  16. How to Assess the Value of Medicines?

    PubMed Central

    Simoens, Steven

    2010-01-01

    This study aims to discuss approaches to assessing the value of medicines. Economic evaluation assesses value by means of the incremental cost-effectiveness ratio (ICER). Health is maximized by selecting medicines with increasing ICERs until the budget is exhausted. The budget size determines the value of the threshold ICER and vice versa. Alternatively, the threshold value can be inferred from pricing/reimbursement decisions, although such values vary between countries. Threshold values derived from the value-of-life literature depend on the technique used. The World Health Organization has proposed a threshold value tied to the national GDP. As decision makers may wish to consider multiple criteria, variable threshold values and weighted ICERs have been suggested. Other approaches (i.e., replacement approach, program budgeting and marginal analysis) have focused on improving resource allocation, rather than maximizing health subject to a budget constraint. Alternatively, the generalized optimization framework and multi-criteria decision analysis make it possible to consider other criteria in addition to value. PMID:21607066

  17. How to assess the value of medicines?

    PubMed

    Simoens, Steven

    2010-01-01

    This study aims to discuss approaches to assessing the value of medicines. Economic evaluation assesses value by means of the incremental cost-effectiveness ratio (ICER). Health is maximized by selecting medicines with increasing ICERs until the budget is exhausted. The budget size determines the value of the threshold ICER and vice versa. Alternatively, the threshold value can be inferred from pricing/reimbursement decisions, although such values vary between countries. Threshold values derived from the value-of-life literature depend on the technique used. The World Health Organization has proposed a threshold value tied to the national GDP. As decision makers may wish to consider multiple criteria, variable threshold values and weighted ICERs have been suggested. Other approaches (i.e., replacement approach, program budgeting and marginal analysis) have focused on improving resource allocation, rather than maximizing health subject to a budget constraint. Alternatively, the generalized optimization framework and multi-criteria decision analysis make it possible to consider other criteria in addition to value.
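
    A toy sketch of the ICER-and-budget logic summarized above: fund options in order of increasing ICER until the budget runs out, and read the implied threshold off the last option funded. All costs, effects, and the budget are invented numbers; real reimbursement decisions weigh many additional criteria.

```python
# Incremental cost-effectiveness ratio (ICER) = extra cost / extra health gain (QALYs).
medicines = [
    {"name": "A", "extra_cost": 2_000,  "extra_qaly": 0.10},
    {"name": "B", "extra_cost": 15_000, "extra_qaly": 0.50},
    {"name": "C", "extra_cost": 40_000, "extra_qaly": 0.80},
]
for m in medicines:
    m["icer"] = m["extra_cost"] / m["extra_qaly"]

budget = 50_000
chosen, spent = [], 0
for m in sorted(medicines, key=lambda m: m["icer"]):  # fund lowest ICER first
    if spent + m["extra_cost"] <= budget:
        chosen.append(m["name"])
        spent += m["extra_cost"]

implied_threshold = max(m["icer"] for m in medicines if m["name"] in chosen)
print(chosen, spent, round(implied_threshold))
```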

  18. Soil contamination in landfills: a case study of a landfill in Czech Republic

    NASA Astrophysics Data System (ADS)

    Adamcová, D.; Vaverková, M. D.; Bartoň, S.; Havlíček, Z.; Břoušková, E.

    2016-02-01

    A phytotoxicity test was performed to assess the ecotoxicity of landfill soil. Sinapis alba L. was used as a bioindicator of heavy metals. Soil samples 1-8, which were taken from the landfill body, edge of the landfill body, and its vicinity, meet the limits for heavy metals Co, Cd, Pb, and Zn specified in the applicable legislation. Hg and Mn threshold values are not established in legislation, but values have been determined for the needs of the landfill operator. For the heavy metals Cr, Cu, and Ni, sample 2 exceeded the threshold values and attained the highest values of all the samples tested. For Cr and Ni the values were several times higher than those of the other samples. Samples 6 and 7 showed the second highest values for Cr, Cu, and Ni; both samples exceeded the set limits. An increase in plant biomass was observed in plants growing on plates with soil samples, but no changes in appearance, slow growth, or necrotic lesions appeared. Ecotoxicity tests show that tested soils (concentration of 50 %) collected from the landfill body, edge of the landfill body, and its vicinity reach high percentage values of germination capacity of seeds of Sinapis alba L. (101-137 %). At a concentration of 25 %, tested soil samples exhibit lower values of germination capacity - in particular samples 3 to 8 - yet the seed germination capacity in all eight samples of tested soils ranges between 86 and 137 %.

  19. Soil contaminations in landfill: a case study of the landfill in Czech Republic

    NASA Astrophysics Data System (ADS)

    Adamcová, D.; Vaverková, M. D.; Bartoň, S.; Havlíček, Z.; Břoušková, E.

    2015-10-01

    A phytotoxicity test was performed to assess the ecotoxicity of landfill soil. Sinapis alba L. was used as a bioindicator of heavy metals. Soil samples 1-8, which were taken from the landfill body, edge of the landfill body and its vicinity, meet the limits for heavy metals Co, Cd, Pb, and Zn specified in the applicable legislation. Hg and Mn threshold values are not established in legislation, but values have been determined for the needs of the landfill operator. For the heavy metals Cr, Cu, and Ni, sample 2 exceeded the threshold values and attained the highest values of all the samples tested. For Cr and Ni the values were several times higher than those of the other samples. Samples 6 and 7 showed the second highest values for Cr, Cu, and Ni; both samples exceeded the set limits. An increase in plant biomass was observed in plants growing on plates with soil samples, but no changes in appearance, slow growth or necrotic lesions appeared. Ecotoxicity tests show that tested soils (concentration of 50 %) collected from the landfill body, edge of the landfill body and its vicinity reach high percentage values of germination capacity of seeds of Sinapis alba L. (101-137 %). At a concentration of 25 %, tested soil samples exhibit lower values of germination capacity, in particular samples 3 to 8, yet the seed germination capacity in all 8 samples of tested soils ranges between 86 and 137 %.

  20. A strategy to minimize the energy offset in carrier injection from excited dyes to inorganic semiconductors for efficient dye-sensitized solar energy conversion.

    PubMed

    Fujisawa, Jun-Ichi; Osawa, Ayumi; Hanaya, Minoru

    2016-08-10

    Photoinduced carrier injection from dyes to inorganic semiconductors is a crucial process in various dye-sensitized solar energy conversions such as photovoltaics and photocatalysis. It has been reported that an energy offset larger than 0.2-0.3 eV (threshold value) is required for efficient electron injection from excited dyes to metal-oxide semiconductors such as titanium dioxide (TiO2). Because the energy offset directly causes loss in the potential of injected electrons, minimizing the energy offset is a crucial issue for efficient solar energy conversion. However, a fundamental understanding of the energy offset, especially the threshold value, has not yet been obtained. In this paper, we report the origin of the threshold value of the energy offset, solving the long-standing questions of why such a large energy offset is necessary for electron injection and which factors govern the threshold value, and suggest a strategy to minimize the threshold value. The threshold value is determined by the sum of two reorganization energies in the one-electron reduction of semiconductors and typically used donor-acceptor (D-A) dyes. In fact, the estimated values (0.21-0.31 eV) for several D-A dyes are in good agreement with the threshold value, supporting our conclusion. In addition, our results reveal that the threshold value can be reduced by enlarging the π-conjugated system of the acceptor moiety in dyes and enhancing its structural rigidity. Furthermore, we extend the analysis to hole injection from excited dyes to semiconductors. In this case, the threshold value is given by the sum of two reorganization energies in the one-electron oxidation of semiconductors and D-A dyes.

  1. Two new competing pathways establish the threshold for cyclin-B-Cdk1 activation at the meiotic G2/M transition.

    PubMed

    Hiraoka, Daisaku; Aono, Ryota; Hanada, Shin-Ichiro; Okumura, Eiichi; Kishimoto, Takeo

    2016-08-15

    Extracellular ligands control biological phenomena. Cells distinguish physiological stimuli from weak noise stimuli by establishing a ligand-concentration threshold. Hormonal control of the meiotic G2/M transition in oocytes is essential for reproduction. However, the mechanism for threshold establishment is unclear. In starfish oocytes, maturation-inducing hormones activate the PI3K-Akt pathway through the Gβγ complex of heterotrimeric G-proteins. Akt directly phosphorylates both Cdc25 phosphatase and Myt1 kinase, resulting in activation of cyclin-B-Cdk1, which then induces meiotic G2/M transition. Here, we show that cyclin-B-Cdk1 is partially activated after subthreshold hormonal stimuli, but this triggers negative feedback, resulting in dephosphorylation of Akt sites on Cdc25 and Myt1, thereby canceling the signal. We also identified phosphatase activity towards Akt substrates that exists independent of stimuli. In contrast to these negative regulatory activities, an atypical Gβγ-dependent pathway enhances PI3K-Akt-dependent phosphorylation. Based on these findings, we propose a model for threshold establishment in which hormonal dose-dependent competition between these new pathways establishes a threshold; the atypical Gβγ-pathway becomes predominant over Cdk-dependent negative feedback when the stimulus exceeds this threshold. Our findings provide a regulatory connection between cell cycle and signal transduction machineries. © 2016. Published by The Company of Biologists Ltd.

  2. ACR appropriateness criteria(®) on abnormal vaginal bleeding.

    PubMed

    Bennett, Genevieve L; Andreotti, Rochelle F; Lee, Susanna I; Dejesus Allison, Sandra O; Brown, Douglas L; Dubinsky, Theodore; Glanc, Phyllis; Mitchell, Donald G; Podrasky, Ann E; Shipp, Thomas D; Siegel, Cary Lynn; Wong-You-Cheong, Jade J; Zelop, Carolyn M

    2011-07-01

    In evaluating a woman with abnormal vaginal bleeding, imaging cannot replace definitive histologic diagnosis but often plays an important role in screening, characterization of structural abnormalities, and directing appropriate patient care. Transvaginal ultrasound (TVUS) is generally the initial imaging modality of choice, with endometrial thickness a well-established predictor of endometrial disease in postmenopausal women. Endometrial thickness measurements of ≤5 mm and ≤4 mm have been advocated as appropriate upper threshold values to reasonably exclude endometrial carcinoma in postmenopausal women with vaginal bleeding; however, the best upper threshold endometrial thickness in the asymptomatic postmenopausal patient remains a subject of debate. Endometrial thickness in a premenopausal patient is a less reliable indicator of endometrial pathology since this may vary widely depending on the phase of menstrual cycle, and an upper threshold value for normal has not been well-established. Transabdominal ultrasound is generally an adjunct to TVUS and is most helpful when TVUS is not feasible or there is poor visualization of the endometrium. Hysterosonography may also allow for better delineation of both the endometrium and focal abnormalities in the endometrial cavity, leading to hysteroscopically directed biopsy or resection. Color and pulsed Doppler may provide additional characterization of a focal endometrial abnormality by demonstrating vascularity. MRI may also serve as an important problem-solving tool if the endometrium cannot be visualized on TVUS and hysterosonography is not possible, as well as for pretreatment planning of patients with suspected endometrial carcinoma. CT is generally not warranted for the evaluation of patients with abnormal bleeding, and an abnormal endometrium incidentally detected on CT should be further evaluated with TVUS. Copyright © 2011 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  3. The polymyalgia rheumatica activity score in daily use: proposal for a definition of remission.

    PubMed

    Leeb, Burkhard F; Rintelen, Bernhard; Sautner, Judith; Fassl, Christian; Bird, Howard A

    2007-06-15

    To confirm the reliability and applicability of the Polymyalgia Rheumatica Disease Activity Score (PMR-AS), and to establish a threshold for remission. First, 78 patients with PMR (50 women/28 men, mean age 65.97 years) were enrolled in a cross-sectional evaluation. The PMR-AS, patient's satisfaction with disease status (PATSAT; range 1-5), erythrocyte sedimentation rate (ESR; first hour), and a visual analog scale of patients' general health assessment (VAS patient global; range 0-100) were recorded. Subsequently, another 39 PMR patients (24 women/15 men, mean age 68.12 years) were followed longitudinally. Relationships between the PMR-AS, PATSAT, ESR, and VAS patient global were analyzed by the Kruskal-Wallis test, Spearman's rank correlation, and kappa statistics. PMR-AS values in patients with a PATSAT score of 1 and a VAS patient global <10 formed the basis to establish a remission threshold. PMR-AS values were significantly related to PATSAT (P < 0.001), VAS patient global (P < 0.001), and ESR (P < 0.01). PATSAT and VAS patient global were reasonably different (kappa = 0.226). The median PMR-AS score in patients with PATSAT score 1 and VAS patient global <10 was 0.7 (range 0-3.3), and the respective 75th percentile was 1.3. To enhance applicability, a range from 0 to 1.5 was proposed to define remission in PMR. The median ESR in these patients was 10 mm/hour (range 3-28), indicating external validity. We demonstrated the reliability, validity, and applicability of the PMR-AS in daily routine. Moreover, we proposed a remission threshold (0-1.5) founded on patient-dependent parameters.

  4. Structured decision making as a conceptual framework to identify thresholds for conservation and management

    USGS Publications Warehouse

    Martin, J.; Runge, M.C.; Nichols, J.D.; Lubow, B.C.; Kendall, W.L.

    2009-01-01

    Thresholds and their relevance to conservation have become a major topic of discussion in the ecological literature. Unfortunately, in many cases the lack of a clear conceptual framework for thinking about thresholds may have led to confusion in attempts to apply the concept of thresholds to conservation decisions. Here, we advocate a framework for thinking about thresholds in terms of a structured decision making process. The purpose of this framework is to promote a logical and transparent process for making informed decisions for conservation. Specification of such a framework leads naturally to consideration of definitions and roles of different kinds of thresholds in the process. We distinguish among three categories of thresholds. Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. Utility thresholds are components of management objectives (determined by human values) and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. The approach that we present focuses directly on the objectives of management, with an aim to providing decisions that are optimal with respect to those objectives. This approach clearly distinguishes the components of the decision process that are inherently subjective (management objectives, potential management actions) from those that are more objective (system models, estimates of system state). Optimization based on these components then leads to decision matrices specifying optimal actions to be taken at various values of system state variables. Values of state variables separating different actions in such matrices are viewed as decision thresholds. Utility thresholds are included in the objectives component, and ecological thresholds may be embedded in models projecting consequences of management actions. Decision thresholds are determined by the above-listed components of a structured decision process. These components may themselves vary over time, inducing variation in the decision thresholds inherited from them. These dynamic decision thresholds can then be determined using adaptive management. We provide numerical examples (that are based on patch occupancy models) of structured decision processes that include all three kinds of thresholds. © 2009 by the Ecological Society of America.

  5. Defining Spino-Pelvic Alignment Thresholds: Should Operative Goals in Adult Spinal Deformity Surgery Account for Age?

    PubMed

    Lafage, Renaud; Schwab, Frank; Challier, Vincent; Henry, Jensen K; Gum, Jeffrey; Smith, Justin; Hostin, Richard; Shaffrey, Christopher; Kim, Han J; Ames, Christopher; Scheer, Justin; Klineberg, Eric; Bess, Shay; Burton, Douglas; Lafage, Virginie

    2016-01-01

    Retrospective review of prospective, multicenter database. The aim of the study was to determine age-specific spino-pelvic parameters, to extrapolate age-specific Oswestry Disability Index (ODI) values from published Short Form (SF)-36 Physical Component Score (PCS) data, and to propose age-specific realignment thresholds for adult spinal deformity (ASD). The Scoliosis Research Society-Schwab classification offers a framework for defining alignment in patients with ASD. Although age-specific changes in spinal alignment and patient-reported outcomes have been established in the literature, their relationship in the setting of ASD operative realignment has not been reported. ASD patients who received operative or nonoperative treatment were consecutively enrolled. Patients were stratified by age, consistent with published US-normative values (Norms) of the SF-36 PCS (<35, 35-44, 45-54, 55-64, 65-74, >75 y old). At baseline, relationships between radiographic spino-pelvic parameters (lumbar-pelvic mismatch [PI-LL], pelvic tilt [PT], sagittal vertical axis [SVA], and T1 pelvic angle [TPA]), age, and PCS were established using linear regression analysis; normative PCS values were then used to establish age-specific targets. Correlation analysis with ODI and PCS was used to determine age-specific ideal alignment. Baseline analysis included 773 patients (53.7 y old, 54% operative, 83% female). There was a strong correlation between ODI and PCS (r = 0.814, P < 0.001), allowing for the extrapolation of US-normative ODI by age group. Linear regression analysis (all with r > 0.510, P < 0.001) combined with US-normative PCS values demonstrated that ideal spino-pelvic values increased with age, ranging from PT = 10.9 degrees, PI-LL = -10.5 degrees, and SVA = 4.1 mm for patients under 35 years to PT = 28.5 degrees, PI-LL = 16.7 degrees, and SVA = 78.1 mm for patients over 75 years. Clinically, older patients had greater compensation, more degenerative loss of lordosis, and were more pitched forward. This study demonstrated that sagittal spino-pelvic alignment varies with age. Thus, operative realignment targets should account for age, with younger patients requiring more rigorous alignment objectives.

  6. Randomness fault detection system

    NASA Technical Reports Server (NTRS)

    Russell, B. Don (Inventor); Aucoin, B. Michael (Inventor); Benner, Carl L. (Inventor)

    1996-01-01

    A method and apparatus are provided for detecting a fault on a power line carrying a line parameter such as a load current. The apparatus monitors and analyzes the load current to obtain an energy value. The energy value is compared to a threshold value stored in a buffer. If the energy value is greater than the threshold value, a first counter is incremented. If the energy value is greater than a high-value threshold or less than a low-value threshold, then a second counter is incremented. If the difference between two subsequent energy values is greater than a constant, then a third counter is incremented. A fault signal is issued if the first counter is greater than a counter limit value and either the second counter is greater than a second limit value or the third counter is greater than a third limit value.
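
    A hedged restatement of the three-counter logic in code form; the threshold, high/low band, jump constant, and limit values are hypothetical tuning parameters, not values from the patent.

```python
def detect_fault(energies, thresh, high, low, delta, limit1, limit2, limit3):
    """Issue a fault signal when the first counter exceeds its limit and either the
    second or the third counter exceeds its own limit (toy version of the abstract)."""
    c1 = c2 = c3 = 0
    prev = None
    for e in energies:
        if e > thresh:
            c1 += 1                      # energy above the stored threshold
        if e > high or e < low:
            c2 += 1                      # energy outside the high/low band
        if prev is not None and abs(e - prev) > delta:
            c3 += 1                      # large jump between consecutive energy values
        prev = e
        if c1 > limit1 and (c2 > limit2 or c3 > limit3):
            return True                  # fault signal
    return False

print(detect_fault([1, 2, 9, 1, 8, 9, 9], thresh=5, high=8, low=-8, delta=6,
                   limit1=2, limit2=1, limit3=0))
```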

  7. Minimally important difference for the Expanded Prostate Cancer Index Composite Short Form.

    PubMed

    Skolarus, Ted A; Dunn, Rodney L; Sanda, Martin G; Chang, Peter; Greenfield, Thomas K; Litwin, Mark S; Wei, John T

    2015-01-01

    To establish a score threshold that constitutes a clinically relevant change for each domain of the Expanded Prostate Cancer Index Composite (EPIC) Short Form (EPIC-26). Although its use in clinical practice and clinical trials has increased worldwide, the clinical interpretation of this 26-item disease-specific patient-reported quality of life questionnaire for men with localized prostate cancer would be facilitated by characterization of score thresholds for clinically relevant change (the minimally important differences [MIDs]). We used distribution- and anchor-based approaches to establish the MID range for each EPIC-26 domain (urinary, sexual, bowel, and vitality/hormonal) based on a prospective multi-institutional cohort of 1201 men treated for prostate cancer between 2003 and 2006 and followed up for 3 years after treatment. For the anchor-based approach, we compared within-subject and between-subject score changes for each domain to an external "anchor" measure of overall cancer treatment satisfaction. We found the bowel and vitality/hormonal domains to have the lowest MID range (a 4-6 point change should be considered clinically relevant), whereas the sexual domain had the greatest MID values (10-12). Urinary incontinence appeared to have a greater MID range (6-9) than the urinary irritation/obstruction domain (5-7). Using 2 independent approaches, we established the MIDs for each EPIC-26 domain. A definition of these MID values is essential for the researcher or clinician to understand when changes in symptom burden among prostate cancer survivors are clinically relevant. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Phytoavailability of Cadmium (Cd) to Pak Choi (Brassica chinensis L.) Grown in Chinese Soils: A Model to Evaluate the Impact of Soil Cd Pollution on Potential Dietary Toxicity

    PubMed Central

    Yang, Xiaoe; Xiao, Wendan; Stoffella, Peter J.; Saghir, Aamir; Azam, Muhammad; Li, Tingqiang

    2014-01-01

    Food chain contamination by soil cadmium (Cd) through vegetable consumption poses a threat to human health. Therefore, an understanding is needed of the relationship between the phytoavailability of Cd in soils and its uptake in edible tissues of vegetables. The purpose of this study was to establish soil Cd thresholds of representative Chinese soils based on dietary toxicity to humans and to develop a model to evaluate the phytoavailability of Cd to Pak choi (Brassica chinensis L.) based on soil properties. Mehlich-3 extractable Cd thresholds were more suitable for Stagnic Anthrosols, Calcareous, Ustic Cambosols, Typic Haplustalfs, Udic Ferrisols and Periudic Argosols, with values of 0.30, 0.25, 0.18, 0.16, 0.15 and 0.03 mg kg−1, respectively, while total Cd is an adequate threshold for Mollisols, with a value of 0.86 mg kg−1. A stepwise regression model indicated that Cd phytoavailability to Pak choi was significantly influenced by soil pH, organic matter, and total zinc and Cd concentrations in soil. Therefore, since Cd accumulation in Pak choi varied with soil characteristics, these should be considered while assessing the environmental quality of soils to ensure hygienically safe food production. PMID:25386790

  9. Phytoavailability of cadmium (Cd) to Pak choi (Brassica chinensis L.) grown in Chinese soils: a model to evaluate the impact of soil Cd pollution on potential dietary toxicity.

    PubMed

    Rafiq, Muhammad Tariq; Aziz, Rukhsanda; Yang, Xiaoe; Xiao, Wendan; Stoffella, Peter J; Saghir, Aamir; Azam, Muhammad; Li, Tingqiang

    2014-01-01

    Food chain contamination by soil cadmium (Cd) through vegetable consumption poses a threat to human health. Therefore, an understanding is needed of the relationship between the phytoavailability of Cd in soils and its uptake in edible tissues of vegetables. The purpose of this study was to establish soil Cd thresholds of representative Chinese soils based on dietary toxicity to humans and to develop a model to evaluate the phytoavailability of Cd to Pak choi (Brassica chinensis L.) based on soil properties. Mehlich-3 extractable Cd thresholds were more suitable for Stagnic Anthrosols, Calcareous, Ustic Cambosols, Typic Haplustalfs, Udic Ferrisols and Periudic Argosols, with values of 0.30, 0.25, 0.18, 0.16, 0.15 and 0.03 mg kg-1, respectively, while total Cd is an adequate threshold for Mollisols, with a value of 0.86 mg kg-1. A stepwise regression model indicated that Cd phytoavailability to Pak choi was significantly influenced by soil pH, organic matter, and total zinc and Cd concentrations in soil. Therefore, since Cd accumulation in Pak choi varied with soil characteristics, these should be considered while assessing the environmental quality of soils to ensure hygienically safe food production.

  10. Type I and Type II error concerns in fMRI research: re-balancing the scale

    PubMed Central

    Cunningham, William A.

    2009-01-01

    Statistical thresholding (i.e. P-values) in fMRI research has become increasingly conservative over the past decade in an attempt to diminish Type I errors (i.e. false alarms) to a level traditionally allowed in behavioral science research. In this article, we examine the unintended negative consequences of this single-minded devotion to Type I errors: increased Type II errors (i.e. missing true effects), a bias toward studying large rather than small effects, a bias toward observing sensory and motor processes rather than complex cognitive and affective processes and deficient meta-analyses. Power analyses indicate that the reductions in acceptable P-values over time are producing dramatic increases in the Type II error rate. Moreover, the push for a mapwide false discovery rate (FDR) of 0.05 is based on the assumption that this is the FDR in most behavioral research; however, this is an inaccurate assessment of the conventions in actual behavioral research. We report simulations demonstrating that combined intensity and cluster size thresholds such as P < 0.005 with a 10 voxel extent produce a desirable balance between Types I and II error rates. This joint threshold produces high but acceptable Type II error rates and produces a FDR that is comparable to the effective FDR in typical behavioral science articles (while a 20 voxel extent threshold produces an actual FDR of 0.05 with relatively common imaging parameters). We recommend a greater focus on replication and meta-analysis rather than emphasizing single studies as the unit of analysis for establishing scientific truth. From this perspective, Type I errors are self-erasing because they will not replicate, thus allowing for more lenient thresholding to avoid Type II errors. PMID:20035017
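
    The joint intensity-and-extent threshold discussed above can be sketched as follows: keep only voxels below the p-value threshold that belong to connected clusters of at least k voxels. This is a generic illustration using scipy, not the authors' pipeline, and the toy p-value map is random.

```python
import numpy as np
from scipy import ndimage

def cluster_extent_threshold(p_map, p_thresh=0.005, k=10):
    """Keep voxels with p < p_thresh that lie in connected clusters of at least k voxels."""
    supra = p_map < p_thresh
    labels, n = ndimage.label(supra)          # connected components of suprathreshold voxels
    keep = np.zeros_like(supra)
    for i in range(1, n + 1):
        cluster = labels == i
        if cluster.sum() >= k:
            keep |= cluster
    return keep

rng = np.random.default_rng(0)
p_map = rng.uniform(size=(20, 20, 20))        # toy p-value volume
print(cluster_extent_threshold(p_map).sum(), "voxels survive the joint threshold")
```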

  11. Prediction of Fracture Initiation in Hot Compression of Burn-Resistant Ti-35V-15Cr-0.3Si-0.1C Alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Saifei; Zeng, Weidong; Zhou, Dadi; Lai, Yunjin

    2015-11-01

    An important concern in hot working of metals is whether the desired deformation can be accomplished without fracture of the material. This paper builds a fracture prediction model for hot compression of the burn-resistant, beta-stabilized titanium alloy Ti-35V-15Cr-0.3Si-0.1C using a combined approach of upsetting experiments, theoretical failure criteria, and finite element (FE) simulation techniques. A series of isothermal compression experiments on cylindrical specimens was first conducted in the temperature range of 900-1150 °C and at strain rates of 0.01-10 s-1 to obtain fracture samples and primary reduction data. Based on that, eight commonly used theoretical failure criteria were compared, and the Oh criterion was selected and coded into a subroutine. FE simulation of the upsetting experiments on cylindrical specimens was then performed to determine the fracture threshold values of the Oh criterion. By building a correlation between the threshold values and the deformation parameters (temperature and strain rate, or the Zener-Hollomon parameter), a new fracture prediction model based on the Oh criterion was established. The new model shows an exponential decay relationship between the threshold values and the Zener-Hollomon parameter (Z), and the relative error of the model is less than 15%. This model was then applied successfully in the cogging of a Ti-35V-15Cr-0.3Si-0.1C billet.
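
    As a hedged illustration of the damage-integral idea behind the Oh criterion (fracture is predicted where the integral of the maximum-principal-to-effective stress ratio over effective strain reaches a material threshold C), the sketch below integrates a made-up stress/strain history. It is not the paper's calibrated model, which additionally ties the threshold value to the Zener-Hollomon parameter.

```python
import numpy as np

def oh_damage(sigma_max, sigma_eff, eff_strain):
    """Oh criterion damage value: integral of (sigma_max / sigma_eff) d(effective strain),
    evaluated here with the trapezoidal rule. Fracture is predicted when it reaches C."""
    ratio = np.asarray(sigma_max, float) / np.asarray(sigma_eff, float)
    de = np.diff(np.asarray(eff_strain, float))
    return float(np.sum(0.5 * (ratio[1:] + ratio[:-1]) * de))

# Made-up history of one element from an upsetting simulation (illustrative only).
strain = np.linspace(0.0, 0.8, 50)
sigma_eff = 120.0 - 30.0 * strain   # MPa, illustrative flow stress
sigma_max = 0.4 * sigma_eff         # MPa, illustrative tensile stress at the bulge surface
print(oh_damage(sigma_max, sigma_eff, strain))
```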

  12. Stable and selective self-assembly of α-lipoic acid on Ge(001) for biomolecule immobilization

    NASA Astrophysics Data System (ADS)

    Kazmierczak, M.; Flesch, J.; Mitzloff, J.; Capellini, G.; Klesse, W. M.; Skibitzki, O.; You, C.; Bettenhausen, M.; Witzigmann, B.; Piehler, J.; Schroeder, T.; Guha, S.

    2018-05-01

    We demonstrate a novel method for the stable and selective surface functionalization of germanium (Ge) embedded in silicon dioxide. The Ge(001) surface is functionalized using α-lipoic acid (ALA), which can potentially be utilized for the immobilization of a wide range of biomolecules. We present a detailed pH-dependence study to establish the effect of the incubation pH value on the adsorption layer of the ALA molecules. A threshold pH value for functionalization is identified, dividing the examined pH range into two regions. Below a pH value of 7, the formation of a disordered ALA multilayer is observed, whereas a stable well-ordered ALA mono- to bi-layer on Ge(001) is achieved at higher pH values. Furthermore, we analyze the stability of the ALA layer under ambient conditions, revealing the most stable functionalized Ge(001) surface to effectively resist oxidation for up to one week. Our established functionalization method paves the way towards the successful immobilization of biomolecules in future Ge-based biosensors.

  13. A new edge detection algorithm based on Canny idea

    NASA Astrophysics Data System (ADS)

    Feng, Yingke; Zhang, Jinmin; Wang, Siming

    2017-10-01

    The traditional Canny algorithm has a poorly self-adapting threshold and is sensitive to noise. To overcome these drawbacks, this paper proposes a new edge detection method based on the Canny algorithm. Firstly, median filtering and a filter based on Euclidean distance are applied to the image; secondly, the Frei-Chen operator is used to calculate the gradient amplitude; finally, the Otsu algorithm is applied to local regions of the gradient amplitude to obtain threshold values, the average of all calculated thresholds is taken, half of that average is used as the high threshold, and half of the high threshold is used as the low threshold. Experimental results show that the new method effectively suppresses noise, preserves edge information, and improves edge detection accuracy.
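
    A minimal sketch of the threshold-selection scheme just described (Otsu applied block-wise to the gradient magnitude, the block thresholds averaged, high threshold = half the average, low threshold = half the high threshold). The Sobel operator stands in for Frei-Chen here, and the block size and function names are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage.filters import threshold_otsu
    from skimage.feature import canny

    def adaptive_canny_thresholds(image, block=64):
        """Derive high/low Canny thresholds from block-wise Otsu values
        computed on the gradient magnitude of the (filtered) image."""
        gx = ndimage.sobel(image, axis=0, output=float)
        gy = ndimage.sobel(image, axis=1, output=float)
        grad = np.hypot(gx, gy)
        otsu_values = []
        for i in range(0, grad.shape[0], block):
            for j in range(0, grad.shape[1], block):
                tile = grad[i:i + block, j:j + block]
                if tile.size and tile.max() > tile.min():  # Otsu needs variation
                    otsu_values.append(threshold_otsu(tile))
        high = np.mean(otsu_values) / 2.0   # half of the average Otsu value
        low = high / 2.0                    # half of the high threshold
        return low, high

    # Usage on a hypothetical float grayscale array `img`:
    # low, high = adaptive_canny_thresholds(img)
    # edges = canny(img, sigma=1.4, low_threshold=low, high_threshold=high)
    ```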

  14. [Quality assurance of fine-needle aspiration cytology of the organized mammography screening].

    PubMed

    Bak, Mihály; Konyár, Eva; Schneider, Ferenc; Bidlek, Mária; Szabó, Eva; Nyári, Tibor; Godény, Mária; Kásler, Miklós

    2010-08-08

    The National Public Health Program has established organized mammography screening in Hungary. The aim of our study was to assess the quality assurance of breast aspiration cytology. Cytology results were rated in 5 categories (C1, C2, C3, C4 and C5). All cytology reports were compared with the final histology diagnosis. 1361 women had an aspiration cytology diagnosis performed from a total of 47718 non-negative mammography lesions. There were 805 (59.1%) benign and 187 (13.7%) malignant alterations. Sensitivity was 91%, specificity 88%, positive predictive value 96.6%, and negative predictive value 71% (p<0.001). The audit values of fine needle aspiration cytology in our laboratory meet, or in certain aspects exceed, the proposed minimum threshold values.

  15. Behavior of a stochastic SIR epidemic model with saturated incidence and vaccination rules

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Li, Yang; Zhang, Qingling; Li, Aihua

    2018-07-01

    In this paper, the threshold behavior of a susceptible-infected-recovered (SIR) epidemic model with stochastic perturbation is investigated. Firstly, it is shown that the system has a unique global positive solution for any positive initial value. The random effect may lead to disease extinction under a simple condition. Subsequently, a sufficient condition for persistence of the disease in the mean is established. Finally, some numerical simulations are carried out to confirm the analytical results.
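
    To make the setting concrete, a minimal Euler-Maruyama simulation of a stochastic SIR model with saturated incidence is sketched below; the incidence form, noise structure, and all parameter values are illustrative assumptions rather than the paper's exact model.

    ```python
    import numpy as np

    # Illustrative parameters (not taken from the paper)
    beta, a = 0.5, 0.1        # transmission rate, saturation coefficient
    gamma, mu = 0.1, 0.01     # recovery rate, birth/death rate
    sigma = 0.05              # intensity of the white-noise perturbation
    dt, steps = 0.01, 50_000

    rng = np.random.default_rng(1)
    S, I, R = 0.9, 0.1, 0.0
    for _ in range(steps):
        incidence = beta * S * I / (1.0 + a * I)      # saturated incidence
        dW = rng.normal(0.0, np.sqrt(dt))             # Brownian increment
        dS = (mu - incidence - mu * S) * dt - sigma * S * I * dW
        dI = (incidence - (gamma + mu) * I) * dt + sigma * S * I * dW
        dR = (gamma * I - mu * R) * dt
        S, I, R = max(S + dS, 0.0), max(I + dI, 0.0), max(R + dR, 0.0)

    print(f"I at the end of the run: {I:.4f} (values near zero suggest extinction)")
    ```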

  16. 76 FR 33170 - Defense Federal Acquisition Regulation Supplement; Inclusion of Option Amounts in Limitations on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-08

    ...DoD is issuing this final rule amending the Defense Federal Acquisition Regulation Supplement (DFARS) to implement section 826 of the National Defense Authorization Act for Fiscal Year 2011. Section 826 amended the DoD pilot program for transition to follow-on contracting after use of other transaction authority, to establish that the threshold limitation of $50 million for contracts and subcontracts under the program includes the dollar value of all options.

  17. A summary of the lateral cutoff analysis and results from NASA's Farfield Investigation of No-boom Thresholds

    NASA Astrophysics Data System (ADS)

    Cliatt, Larry J.; Hill, Michael A.; Haering, Edward A.; Arnac, Sarah R.

    2015-10-01

    In support of the ongoing effort by the National Aeronautics and Space Administration (NASA) to bring supersonic commercial travel to the public, NASA, in partnership with other industry organizations, conducted a flight research experiment to analyze acoustic propagation at the lateral edge of the sonic boom carpet. The name of the effort was the Farfield Investigation of No-boom Thresholds (FaINT). The research from FaINT determined an appropriate metric for sonic boom waveforms in the transition and shadow zones called Perceived Sound Exposure Level, established a value of 65 dB as a limit for the acoustic lateral extent of a sonic boom's noise region, analyzed change in sonic boom levels near lateral cutoff, and compared between real sonic boom measurements and numerical predictions.

  18. Effect of harmane on the convulsive threshold in epilepsy models in mice.

    PubMed

    Aricioglu, Feyza; Yillar, Okan; Korcegez, Eylem; Berkman, Kemal

    2003-12-01

    The study investigated the activity of harmane on maximal electroshock seizures (MES) and seizures induced by pentilentetrazole (PTZ) in mice. Initial studies established convulsive current 50 (CC(50)) values or MES and effective dose 50 (ED(50)) for PTZ to produce seizures. Harmane (2.5, 5.0, or 10 mg/kg intraperitoneally) increased the threshold of seizures in MES dose-dependently. The convulsions produced by PTZ were decreased by the low dose of harmane (2.5 mg/kg), but the high dose of harmane (10 mg/kg) resulted in worse grade V convulsions followed by more lethality compared with PTZ alone. Therefore, harmane seems to be protective against grand mal seizures in the MES model but not against a petit mal seizure model (PTZ) in mice.

  19. A Matter of Millimeters: Defining the Processes for Critical Clearances on Curiosity

    NASA Technical Reports Server (NTRS)

    Florow, Brandon

    2013-01-01

    The Mars Science Laboratory (MSL) mission presents an immense packaging problem in that it takes a rover the size of a car with a sky crane landing system and packs it tightly into a spacecraft. This creates many areas of close and critical clearances. Critical Clearances are defined as hardware-to-hardware or hardware-to-envelope clearances which fall below a pre-established location dependent threshold and pose a risk of hardware to hardware contact during events such as launch, entry, landing, and operations. Close Clearances, on the other hand, are defined as any clearance value that is chosen to be tracked but is larger than the critical clearance threshold for its region. Close clearances may be tracked for various reasons including uncertainty in design, large expected dynamic motion, etc.

  20. A Summary of the Lateral Cutoff Analysis and Results from Nasa's Farfield Investigation of No-Boom Thresholds

    NASA Technical Reports Server (NTRS)

    Cliatt, Larry J., II; Hill, Michael A.; Haering, Edward A., Jr.; Arnac, Sarah R.

    2015-01-01

    In support of the ongoing effort by the National Aeronautics and Space Administration (NASA) to bring supersonic commercial travel to the public, NASA, in partnership with other industry organizations, conducted a flight research experiment to analyze acoustic propagation at the lateral edge of the sonic boom carpet. The name of the effort was the Farfield Investigation of No-boom Thresholds (FaINT). The research from FaINT determined an appropriate metric for sonic boom waveforms in the transition and shadow zones called Perceived Sound Exposure Level, established a value of 65 dB as a limit for the acoustic lateral extent of a sonic boom's noise region, analyzed change in sonic boom levels near lateral cutoff, and compared between real sonic boom measurements and numerical predictions.

  1. Dependence of intravoxel incoherent motion diffusion MR threshold b-value selection for separating perfusion and diffusion compartments and liver fibrosis diagnostic performance.

    PubMed

    Wáng, Yì Xiáng J; Li, Yáo T; Chevallier, Olivier; Huang, Hua; Leung, Jason Chi Shun; Chen, Weitian; Lu, Pu-Xuan

    2018-01-01

    Background: Intravoxel incoherent motion (IVIM) tissue parameters depend on the threshold b-value. Purpose: To explore how the threshold b-value impacts PF (f), Dslow (D), and Dfast (D*) values and their performance for liver fibrosis detection. Material and Methods: Fifteen healthy volunteers and 33 hepatitis B patients were included. With a 1.5-T magnetic resonance (MR) scanner and respiration gating, IVIM data were acquired with ten b-values of 10, 20, 40, 60, 80, 100, 150, 200, 400, and 800 s/mm². Signal measurement was performed on the right liver. Segmented-unconstrained analysis was used to compute IVIM parameters, and six threshold b-values in the range of 40-200 s/mm² were compared. PF, Dslow, and Dfast values were placed along the x-axis, y-axis, and z-axis, and a plane was defined to separate volunteers from patients. Results: Higher threshold b-values were associated with higher PF measurements, while lower threshold b-values led to higher Dslow and Dfast measurements. The dependence of PF, Dslow, and Dfast on the threshold b-value differed between healthy livers and fibrotic livers, with the healthy livers showing a higher dependence. A threshold b-value of 60 s/mm² showed the largest mean distance between healthy-liver and fibrotic-liver datapoints, and a classification and regression tree showed that a combination of PF (PF < 9.5%), Dslow (Dslow < 1.239 × 10⁻³ mm²/s), and Dfast (Dfast < 20.85 × 10⁻³ mm²/s) differentiated healthy individuals from all individual fibrotic livers with an area under the logistic regression curve (AUC) of 1. Conclusion: For segmented-unconstrained analysis, the selection of a threshold b-value of 60 s/mm² improves IVIM differentiation between healthy livers and fibrotic livers.
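
    A minimal sketch of a segmented (two-step) IVIM fit of the kind described, using the study's b-values and a threshold b-value of 60 s/mm²; the fitting details and the synthetic example are a generic illustration, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    b = np.array([10, 20, 40, 60, 80, 100, 150, 200, 400, 800], float)  # s/mm^2

    def segmented_ivim(signal, b_values, b_thresh=60.0):
        """Two-step fit: high-b data (b > b_thresh) give Dslow and PF,
        then Dfast (D*) is fitted on the full curve with those fixed."""
        s0 = signal[np.argmin(b_values)]
        hi = b_values > b_thresh
        slope, intercept = np.polyfit(b_values[hi], np.log(signal[hi]), 1)
        d_slow = -slope
        pf = 1.0 - np.exp(intercept) / s0
        def model(bv, d_fast):
            return s0 * ((1 - pf) * np.exp(-bv * d_slow) + pf * np.exp(-bv * d_fast))
        (d_fast,), _ = curve_fit(model, b_values, signal, p0=[20e-3])
        return pf, d_slow, d_fast

    # Synthetic signal with plausible liver-like parameters (illustrative only)
    true_pf, true_ds, true_df = 0.12, 1.2e-3, 30e-3
    sig = 1000 * ((1 - true_pf) * np.exp(-b * true_ds) + true_pf * np.exp(-b * true_df))
    print(segmented_ivim(sig, b))
    ```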

  2. T1-Thresholds in Black Holes Increase Clinical-Radiological Correlation in Multiple Sclerosis Patients.

    PubMed

    Thaler, Christian; Faizy, Tobias; Sedlacik, Jan; Holst, Brigitte; Stellmann, Jan-Patrick; Young, Kim Lea; Heesen, Christoph; Fiehler, Jens; Siemonsen, Susanne

    2015-01-01

    Magnetic Resonance Imaging (MRI) is an established tool in diagnosing and evaluating disease activity in Multiple Sclerosis (MS). While clinical-radiological correlations are limited in general, hypointense T1 lesions (also known as Black Holes (BH)) have shown some promising results. The definition of BHs is very heterogeneous and depends on subjective visual evaluation. We aimed to improve clinical-radiological correlations by defining BHs using T1 relaxation time (T1-RT) thresholds to achieve the best possible correlation between BH lesion volume and clinical disability. 40 patients with mainly relapsing-remitting MS underwent MRI including 3-dimensional fluid attenuated inversion recovery (FLAIR), magnetization-prepared rapid gradient echo (MPRAGE) before and after Gadolinium (GD) injection and double inversion-contrast magnetization-prepared rapid gradient echo (MP2RAGE) sequences. BHs (BHvis) were marked by two raters on native T1-weighted (T1w)-MPRAGE, contrast-enhancing lesions (CE lesions) on T1w-MPRAGE after GD, and FLAIR lesions (total-FLAIR lesions) were detected separately. BHvis and total-FLAIR lesion maps were registered to MP2RAGE images, and the mean T1-RT was calculated for all lesion ROIs. Mean T1 values of the cortex (CTX) were calculated for each patient. Subsequently, Spearman rank correlations between clinical scores (Expanded Disability Status Scale and Multiple Sclerosis Functional Composite) and lesion volume were determined for different T1-RT thresholds. Significant differences in T1-RT were obtained between all different lesion types, with the highest T1 values in visually marked BHs (BHvis: 1453.3±213.4 ms, total-FLAIR lesions: 1394.33±187.38 ms, CTX: 1305.6±35.8 ms; p<0.05). Significant correlations between BHvis/total-FLAIR lesion volume and clinical disability were obtained for a wide range of T1-RT thresholds. The highest correlations for the BHvis and total-FLAIR lesion masks were found at T1-RT>1500 ms (Expanded Disability Status Scale vs. lesion volume: rBHvis = 0.442 and rtotal-FLAIR = 0.497, p<0.05; Multiple Sclerosis Functional Composite vs. lesion volume: rBHvis = -0.53 and rtotal-FLAIR = -0.627, p<0.05). Clinical-radiological correlations in MS patients are increased by application of T1-RT thresholds. With the short acquisition time of the MP2RAGE sequences, quantitative T1 maps could be easily established in clinical studies.

  3. Developing a driving Safety Index using a Delphi stated preference experiment.

    PubMed

    Jamson, Samantha; Wardman, Mark; Batley, Richard; Carsten, Oliver

    2008-03-01

    Whilst empirical evidence is available concerning the effect of some aspects of driving behaviour on safety (e.g. speed choice), there is scant knowledge about safety thresholds, i.e. the point at which behaviour can be considered unsafe. Furthermore, it is almost impossible to ascertain the interaction between various aspects of driving behaviour. For example, how might drivers' lateral control of a vehicle be mediated by their speed choice-are the effects additive or do they cancel each other out. Complex experimental or observational studies would need to be undertaken to establish the nature of such effects. As an alternative, a Delphi study was undertaken to use expert judgement as a way of deriving a first approximation of these threshold and combinatory effects. Using a stated preference technique, road safety professionals make judgements about drivers' safe or unsafe behaviour. The aim was to understand the relative weightings that are assigned to a number of driver behaviours and thereby to construct a Safety Index. As expected, experts were able to establish thresholds, above (or below) which changes to the behavioural parameters had minimal impact on safety. This provided us with a Safety Index, based on a model that had face validity and a convincing range of values. However, the experts found the task of combining these driver behaviours more difficult, reflecting the elusive nature of safety estimates. Suggestions for future validation of our Safety Index are provided.

  4. An approach to derive groundwater and stream threshold values for total nitrogen and ensure good ecological status of associated aquatic ecosystems - example from a coastal catchment to a vulnerable Danish estuary.

    NASA Astrophysics Data System (ADS)

    Hinsby, Klaus; Markager, Stiig; Kronvang, Brian; Windolf, Jørgen; Sonnenborg, Torben; Sørensen, Lærke

    2015-04-01

    Nitrate, which typically makes up the major part (~>90%) of dissolved inorganic nitrogen in groundwater and surface water, is the most frequent pollutant responsible for European groundwater bodies failing to meet the good status objectives of the European Water Framework Directive, generally when comparing groundwater monitoring data with the nitrate quality standard of the Groundwater Directive (50 mg/l = the WHO drinking water standard). Still, while more than 50 % of the European surface water bodies do not meet the objective of good ecological status, "only" 25 % of groundwater bodies do not meet the objective of good chemical status according to the river basin management plans reported by the EU member states. However, based on a study on interactions between groundwater, streams and a Danish estuary, we argue that nitrate threshold values for aerobic groundwater often need to be significantly below the nitrate quality standard to ensure good ecological status of associated surface water bodies, and hence that the chemical status of European groundwater is worse than indicated by the present assessments. Here we suggest a methodology for derivation of groundwater and stream threshold values for total nitrogen ("nitrate") in a coastal catchment based on assessment of maximum acceptable nitrogen loadings (thresholds) to the associated vulnerable estuary. The applied method uses existing information on agricultural practices and point-source emissions in the catchment, together with groundwater and stream quantity and quality monitoring data, all of which feed an integrated groundwater and surface water modelling tool that enables an assessment of total nitrogen loads and of the threshold concentrations derived to ensure/restore good ecological status of the investigated estuary. For the catchment to the Horsens estuary in Denmark we estimate the stream and groundwater thresholds for total nitrogen to be about 13 and 27 mg/l (~ 12 and 25 mg/l of nitrate). The example shown of deriving nitrogen threshold concentrations is for groundwater and streams in a coastal catchment discharging to a vulnerable estuary in Denmark, but the principles may be applied to large river basins with sub-catchments in several countries, such as the Danube or the Rhine. In this case the relevant countries need to collaborate on derivation of nitrogen thresholds based on e.g. maximum acceptable nitrogen loadings to the Black Sea / the North Sea, and finally agree on thresholds for different parts of the river basin. Phosphorus is another nutrient which frequently results in or contributes to the eutrophication of surface waters. The transport and retention processes of total phosphorus (TP) are more complex than for nitrate (or alternatively total N), and presently we are able to establish TP thresholds for streams but not for groundwater. Derivation of TP thresholds is covered in an accompanying paper by Kronvang et al.

  5. Threshold network of a financial market using the P-value of correlation coefficients

    NASA Astrophysics Data System (ADS)

    Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun

    2015-06-01

    Threshold methods in financial networks are important tools for obtaining important information about the financial state of a market. Previously, absolute thresholds of correlation coefficients have been used; however, they have no relation to the length of time. We assign a threshold value depending on the size of the time window by using the P-value concept of statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties, such as the edge density, clustering coefficient, assortativity coefficient, and modularity. We determine that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information depends on the length of the time window when constructing the TN. We apply the same technique to Standard and Poor's 500 (S&P500) and observe similar results.
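
    The window-length dependence can be made concrete by inverting the t-test for a Pearson correlation: at a fixed P-value, the critical |r| shrinks as the window grows, so the same threshold admits more edges for longer windows. A minimal sketch with illustrative window sizes follows.

    ```python
    import numpy as np
    from scipy import stats

    def correlation_threshold(n_obs, p_value=0.01):
        """Critical |r| whose two-sided P-value equals `p_value` for a
        Pearson correlation computed from a window of n_obs returns."""
        t_c = stats.t.ppf(1.0 - p_value / 2.0, df=n_obs - 2)
        return t_c / np.sqrt(n_obs - 2 + t_c**2)

    # Same P-value, different window lengths -> different |r| thresholds.
    for window in (125, 250, 500):          # illustrative window sizes (days)
        print(window, round(correlation_threshold(window), 3))

    # Threshold network: keep edge (i, j) when |corr[i, j]| exceeds the
    # window-dependent threshold, e.g.
    # adjacency = (np.abs(corr_matrix) > correlation_threshold(window)).astype(int)
    ```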

  6. Comparison of conventional versus three-dimensional ultrasound in fetal renal pelvis measurement and their potential prediction of neonatal uropathies.

    PubMed

    Duin, L K; Nijhuis, J G; Scherjon, S A; Vossen, M; Willekes, C

    2016-01-01

    To establish a threshold value for fetal renal pelvis dilatation measured by automatic volume calculation (SonoAVC) in the third trimester of pregnancy to predict neonatal uropathies, and to compare these results with conventional antero-posterior (AP) measurement, fetal kidney 3D volume and renal parenchymal thickness. In a prospective cohort study, 125 fetuses with renal pelvis AP diameter of ≥5 mm both at 20 weeks of gestation and in the third trimester, underwent an additional 3D volume measurement of the fetal kidney in the third trimester. Receiver operating characteristic (ROC) curves for establishing threshold values for fetal renal pelvis volume, AP measurement, fetal kidney volume and renal parenchymal thickness to predict neonatal uropathies were analyzed. Also, sensitivity, specificity, area under the curve (AUC) and likelihood ratios were calculated. A cut-off point of 1.58 cm³ was identified in the third trimester of pregnancy (AUC 0.865 (95% CI 0.789-0.940), sensitivity 76.3%, specificity 87.4%, LR+ 6.06, LR- 0.27) for measurements with SonoAVC. A cut-off value of 11.5 mm was established in the third trimester of pregnancy (AUC 0.828 (95% CI 0.737-0.918), sensitivity 71.1%, specificity 85.1%, LR+ 4.77, LR- 0.34) for the conventional AP measurement. A cut-off point for fetal kidney volume was calculated at 13.29 cm³ (AUC 0.769 (95% CI 0.657-0.881), sensitivity 71%, specificity 66%, LR+ 2.09, LR- 0.44). For renal parenchymal thickness, a cut-off point of 8.4 mm was established (AUC 0.216 (95% CI 0.117-0.315), sensitivity 31.6%, specificity 32.6%, LR+ 0.47, LR- 2.10). This study demonstrates that 3D fetal renal pelvis volume measurements and AP measurements both have a good and comparable diagnostic performance, fetal renal volume a fair accuracy and renal parenchymal thickness a poor accuracy in predicting postnatal renal outcome.

  7. Comparability of children's sedentary time estimates derived from wrist worn GENEActiv and hip worn ActiGraph accelerometer thresholds.

    PubMed

    Boddy, Lynne M; Noonan, Robert J; Kim, Youngwon; Rowlands, Alex V; Welk, Greg J; Knowles, Zoe R; Fairclough, Stuart J

    2018-03-28

    To examine the comparability of children's free-living sedentary time (ST) derived from raw acceleration thresholds for wrist mounted GENEActiv accelerometer data, with ST estimated using the waist mounted ActiGraph 100 count·min⁻¹ threshold. Secondary data analysis. 108 10-11-year-old children (n=43 boys) from Liverpool, UK wore one ActiGraph GT3X+ and one GENEActiv accelerometer on their right hip and left wrist, respectively, for seven days. Signal vector magnitude (SVM; mg) was calculated using the ENMO approach for GENEActiv data. ST was estimated from hip-worn ActiGraph data, applying the widely used 100 count·min⁻¹ threshold. ROC analysis using 10-fold hold-out cross-validation was conducted to establish a wrist-worn GENEActiv threshold comparable to the hip ActiGraph 100 count·min⁻¹ threshold. GENEActiv data were also classified using three empirical wrist thresholds and equivalence testing was completed. Analysis indicated that a GENEActiv SVM value of 51 mg demonstrated fair to moderate agreement (Kappa: 0.32-0.41) with the 100 count·min⁻¹ threshold. However, the generated and empirical thresholds for GENEActiv devices were not significantly equivalent to ActiGraph 100 count·min⁻¹. GENEActiv data classified using the 35.6 mg threshold intended for ActiGraph devices generated significantly equivalent ST estimates as the ActiGraph 100 count·min⁻¹. The newly generated and empirical GENEActiv wrist thresholds do not provide equivalent estimates of ST to the ActiGraph 100 count·min⁻¹ approach. More investigation is required to assess the validity of applying ActiGraph cutpoints to GENEActiv data. Future studies are needed to examine the backward compatibility of ST data and to produce a robust method of classifying SVM-derived ST. Copyright © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
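
    A minimal sketch of an ENMO-style signal vector magnitude calculation and sedentary-time classification of the kind referred to above; the sampling rate, epoch length, and function names are assumptions, and this is not the study's processing pipeline.

    ```python
    import numpy as np

    def enmo_mg(acc_xyz, fs=100, epoch_s=5):
        """Euclidean Norm Minus One: vector magnitude of the raw tri-axial
        signal minus 1 g, negatives clipped to zero, averaged per epoch,
        expressed in milli-g (a simplified sketch of the ENMO approach)."""
        svm = np.sqrt((acc_xyz ** 2).sum(axis=1)) - 1.0     # in g
        svm = np.clip(svm, 0.0, None)
        n = (len(svm) // (fs * epoch_s)) * (fs * epoch_s)
        epochs = svm[:n].reshape(-1, fs * epoch_s).mean(axis=1)
        return epochs * 1000.0                              # g -> mg

    # Classify epochs as sedentary with a wrist cut-point such as the 51 mg
    # value from the ROC analysis (or the 35.6 mg ActiGraph-derived cut-point):
    # sedentary_epochs = enmo_mg(raw_signal) < 51.0
    ```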

  8. Estimation of the geochemical threshold and its statistical significance

    USGS Publications Warehouse

    Miesch, A.T.

    1981-01-01

    A statistic is proposed for estimating the geochemical threshold and its statistical significance, or it may be used to identify a group of extreme values that can be tested for significance by other means. The statistic is the maximum gap between adjacent values in an ordered array after each gap has been adjusted for the expected frequency. The values in the ordered array are geochemical values transformed by either ln(?? - ??) or ln(?? - ??) and then standardized so that the mean is zero and the variance is unity. The expected frequency is taken from a fitted normal curve with unit area. The midpoint of an adjusted gap that exceeds the corresponding critical value may be taken as an estimate of the geochemical threshold, and the associated probability indicates the likelihood that the threshold separates two geochemical populations. The adjusted gap test may fail to identify threshold values if the variation tends to be continuous from background values to the higher values that reflect mineralized ground. However, the test will serve to identify other anomalies that may be too subtle to have been noted by other means. © 1981.
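
    One plausible reading of the adjusted-gap statistic is sketched below: log-transform and standardize the values, then weight each gap between adjacent ordered values by the standard-normal density at its midpoint, so that gaps that are expected to be wide in the sparse tails are discounted. The exact transformation and adjustment in the original paper may differ, and the simulated data are illustrative only.

    ```python
    import numpy as np
    from scipy import stats

    def adjusted_gap_threshold(values):
        """Return a threshold estimate at the midpoint of the largest
        frequency-adjusted gap in the ordered, standardized log-values."""
        logv = np.log(values)
        z = np.sort((logv - logv.mean()) / logv.std(ddof=1))
        gaps = np.diff(z)
        midpoints = (z[:-1] + z[1:]) / 2.0
        adjusted = gaps * stats.norm.pdf(midpoints)   # discount tail gaps
        k = int(np.argmax(adjusted))
        threshold = np.exp(midpoints[k] * logv.std(ddof=1) + logv.mean())
        return threshold, adjusted[k]

    rng = np.random.default_rng(3)
    background = rng.lognormal(mean=1.0, sigma=0.3, size=200)
    anomalous = rng.lognormal(mean=2.5, sigma=0.2, size=15)
    print(adjusted_gap_threshold(np.concatenate([background, anomalous])))
    ```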

  9. Net reclassification index at event rate: properties and relationships.

    PubMed

    Pencina, Michael J; Steyerberg, Ewout W; D'Agostino, Ralph B

    2017-12-10

    The net reclassification improvement (NRI) is an attractively simple summary measure quantifying improvement in performance because of addition of new risk marker(s) to a prediction model. Originally proposed for settings with well-established classification thresholds, it quickly extended into applications with no thresholds in common use. Here we aim to explore properties of the NRI at event rate. We express this NRI as a difference in performance measures for the new versus old model and show that the quantity underlying this difference is related to several global as well as decision analytic measures of model performance. It maximizes the relative utility (standardized net benefit) across all classification thresholds and can be viewed as the Kolmogorov-Smirnov distance between the distributions of risk among events and non-events. It can be expressed as a special case of the continuous NRI, measuring reclassification from the 'null' model with no predictors. It is also a criterion based on the value of information and quantifies the reduction in expected regret for a given regret function, casting the NRI at event rate as a measure of incremental reduction in expected regret. More generally, we find it informative to present plots of standardized net benefit/relative utility for the new versus old model across the domain of classification thresholds. Then, these plots can be summarized with their maximum values, and the increment in model performance can be described by the NRI at event rate. We provide theoretical examples and a clinical application on the evaluation of prognostic biomarkers for atrial fibrillation. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.

  10. 30 CFR 71.700 - Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 Inhalation hazards; threshold limit values for... SURFACE WORK AREAS OF UNDERGROUND COAL MINES Airborne Contaminants § 71.700 Inhalation hazards; threshold... containing quartz, and asbestos dust) in excess of, on the basis of a time-weighted average, the threshold...

  11. Effect of active-region “volume” on the radiative properties of laser heterostructures with radiation output through the substrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nekorkin, S. M.; Zvonkov, B. N.; Baidus, N. V.

    2017-01-15

    The radiative properties of InGaAs/GaAs/InGaP laser structures with radiation output through the substrate, as a function of the number of quantum wells in the active region, and of laser diodes based on these structures, are investigated. It is established that six to eight quantum wells in the active region is the optimum from the viewpoint of the observed threshold current and output optical power of the lasers.

  12. Assessing the potential risk of Zika virus epidemics in temperate areas with established Aedes albopictus populations.

    PubMed

    Guzzetta, Giorgio; Poletti, Piero; Montarsi, Fabrizio; Baldacchino, Frederic; Capelli, Gioia; Rizzoli, Annapaola; Rosà, Roberto; Merler, Stefano

    2016-04-14

    Based on 2015 abundance of Aedes albopictus in nine northern Italian municipalities with temperate continental/oceanic climate, we estimated the basic reproductive number R0 for Zika virus (ZIKV) to be systematically below the epidemic threshold in most scenarios. Results were sensitive to the value of the probability of mosquito infection after biting a viraemic host. Therefore, further studies are required to improve models and predictions, namely evaluating vector competence and potential non-vector transmissions.

  13. Tornado risks and design windspeeds for the Oak Ridge Plant Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1975-08-01

    The effects of tornadoes and other extreme winds should be considered in establishing design criteria for structures to resist wind loads. Design standards that are incorporated in building codes do not normally include the effects of tornadoes in their wind load criteria. Some tornado risk models ignore the presence of nontornadic extreme winds. The purpose of this study is to determine the probability of tornadic and straight winds exceeding a threshold value in the geographical region surrounding the Oak Ridge, Tennessee plant site.

  14. Threshold Concepts in Finance: Student Perspectives

    ERIC Educational Resources Information Center

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  15. Thresholds for conservation and management: structured decision making as a conceptual framework

    USGS Publications Warehouse

    Nichols, James D.; Eaton, Mitchell J.; Martin, Julien; Edited by Guntenspergen, Glenn R.

    2014-01-01

    Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. They are frequently incorporated into ecological models used to project system responses to management actions. Utility thresholds are components of management objectives and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. Decision thresholds are derived from the other components of the decision process. We advocate a structured decision making (SDM) approach within which the following components are identified: objectives (possibly including utility thresholds), potential actions, models (possibly including ecological thresholds), monitoring program, and a solution algorithm (which produces decision thresholds). Adaptive resource management (ARM) is described as a special case of SDM developed for recurrent decision problems that are characterized by uncertainty. We believe that SDM, in general, and ARM, in particular, provide good approaches to conservation and management. Use of SDM and ARM also clarifies the distinct roles of ecological thresholds, utility thresholds, and decision thresholds in informed decision processes.

  16. Establishing seasonal and alert influenza thresholds in Cambodia using the WHO method: implications for effective utilization of influenza surveillance in the tropics and subtropics.

    PubMed

    Ly, Sovann; Arashiro, Takeshi; Ieng, Vanra; Tsuyuoka, Reiko; Parry, Amy; Horwood, Paul; Heng, Seng; Hamid, Sarah; Vandemaele, Katelijn; Chin, Savuth; Sar, Borann; Arima, Yuzo

    2017-01-01

    To establish seasonal and alert thresholds and transmission intensity categories for influenza to provide timely triggers for preventive measures or upscaling control measures in Cambodia. Using Cambodia's influenza-like illness (ILI) and laboratory-confirmed influenza surveillance data from 2009 to 2015, three parameters were assessed to monitor influenza activity: the proportion of ILI patients among all outpatients, proportion of ILI samples positive for influenza and the product of the two. With these parameters, four threshold levels (seasonal, moderate, high and alert) were established and transmission intensity was categorized based on a World Health Organization alignment method. Parameters were compared against their respective thresholds. Distinct seasonality was observed using the two parameters that incorporated laboratory data. Thresholds established using the composite parameter, combining syndromic and laboratory data, had the least number of false alarms in declaring season onset and were most useful in monitoring intensity. Unlike in temperate regions, the syndromic parameter was less useful in monitoring influenza activity or for setting thresholds. Influenza thresholds based on appropriate parameters have the potential to provide timely triggers for public health measures in a tropical country where monitoring and assessing influenza activity has been challenging. Based on these findings, the Ministry of Health plans to raise general awareness regarding influenza among the medical community and the general public. Our findings have important implications for countries in the tropics/subtropics and in resource-limited settings, and categorized transmission intensity can be used to assess severity of potential pandemic influenza as well as seasonal influenza.

  17. AREA RADIATION MONITOR

    DOEpatents

    Manning, F.W.; Groothuis, S.E.; Lykins, J.H.; Papke, D.M.

    1962-06-12

    An improved area radiation dose monitor is described that continuously compensates for background radiation below a threshold dose rate and gives warning when the dose integral of an above-threshold radiation excursion exceeds a selected value. This is accomplished by providing means for continuously charging an ionization chamber. The chamber provides a first current proportional to the incident radiation dose rate. Means are provided for generating a second current, including means for nulling out the first current with the second current at all values of the first current corresponding to dose rates below a selected threshold dose rate. The second current has a maximum value corresponding to that of the first current at the threshold dose rate. The excess of the first current over the second current, which occurs above the threshold, is integrated, and an alarm is given at a selected integrated value of the excess corresponding to a selected radiation dose. (AEC)

  18. Skin notation in the context of workplace exposure standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scansetti, G.; Piolatto, G.; Rubino, G.F.

    1988-01-01

    In the establishment of workplace exposure standards, the potential for cutaneous absorption is taken into consideration through the addition of skin notation to the relevant substance. In the TLVs Documentation (ACGIH, 1986) dermal lethal dose to 50% (LD50) or human data are the bases for the assignment of skin notation to 91 of 168 substances. For the other substances, the skin attribution seems to be based on undocumented statements in 24 (14.5%), skin effects in 13 (8%), and analogy in 7 (4%), while in the remaining 33 (20%) any reference is lacking as to the basis for notation of the cutaneous route of entry. Furthermore, since the established cut-off value of 2 g/kg is sometimes bypassed when a notation is added or omitted, the use of dermal LD50 is perplexing. Given the relevance of the skin notation for the validation of threshold limit values (TLVs) in the workplace, a full examination and citation of all available scientific data are recommended when establishing the TLV of substances absorbable through the skin.

  19. It is time to develop ecological thresholds of toxicological concern to assist environmental hazard assessment.

    PubMed

    Belanger, Scott E; Sanderson, Hans; Embry, Michelle R; Coady, Katie; DeZwart, Dick; Farr, Brianna A; Gutsell, Steve; Halder, Marlies; Sternberg, Robin; Wilson, Peter

    2015-12-01

    The threshold of toxicological concern (TTC) concept is well established for assessing human safety of food-contact substances and has been reapplied for a variety of endpoints, including carcinogenicity, teratogenicity, and reproductive toxicity. The TTC establishes an exposure level for chemicals below which no appreciable risk to human health or the environment is expected, based on a de minimis value for toxicity identified for many chemicals. Threshold of toxicological concern approaches have benefits for screening-level risk assessments, including the potential for rapid decision-making, fully utilizing existing knowledge, reasonable conservativeness for chemicals used in lower volumes (low production volume chemicals, e.g., < 1 t/yr), and reduction or elimination of unnecessary animal tests. Higher production volume chemicals (>1 t/yr) would in principle always require specific information because of the presumed higher exposure potential. The TTC approach has found particular favor in the assessment of chemicals used in cosmetics and personal care products, as well as other chemicals traditionally used in low volumes. Use of the TTC in environmental safety is just beginning, and initial attempts are being published. Key questions focus on hazard extrapolation of diverse taxa across trophic levels, importance of mode of action, and whether safe concentrations for ecosystems estimated from acute or chronic toxicity data are equally useful and in what contexts. The present study provides an overview of the theoretical basis for developing an ecological (eco)-TTC, with an initial exploration of chemical assessment and boundary conditions for use. An international collaboration under the International Life Sciences Institute Health and Environmental Sciences Institute has been established to address challenges related to developing and applying useful eco-TTC concepts. © 2015 SETAC.

  20. Infrared laser damage thresholds in corneal tissue phantoms using femtosecond laser pulses

    NASA Astrophysics Data System (ADS)

    Boretsky, Adam R.; Clary, Joseph E.; Noojin, Gary D.; Rockwell, Benjamin A.

    2018-02-01

    Ultrafast lasers have become a fixture in many biomedical, industrial, telecommunications, and defense applications in recent years. These sources are capable of generating extremely high peak power that can cause laser-induced tissue breakdown through the formation of a plasma upon exposure. Despite the increasing prevalence of such lasers, current safety standards (ANSI Z136.1-2014) do not include maximum permissible exposure (MPE) values for the cornea with pulse durations less than one nanosecond. This study was designed to measure damage thresholds in corneal tissue phantoms in the near-infrared and mid-infrared to identify the wavelength dependence of laser damage thresholds from 1200-2500 nm. A high-energy regenerative amplifier and optical parametric amplifier outputting 100 femtosecond pulses with pulse energies up to 2 mJ were used to perform exposures and determine damage thresholds in transparent collagen gel tissue phantoms. Three-dimensional imaging, primarily optical coherence tomography, was used to evaluate tissue phantoms following exposure to determine ablation characteristics at the surface and within the bulk material. The determination of laser damage thresholds in the near-IR and mid-IR for ultrafast lasers will help to guide safety standards and establish the appropriate MPE levels for exposure sensitive ocular tissue such as the cornea. These data will help promote the safe use of ultrafast lasers for a wide range of applications.

  1. Doctoral conceptual thresholds in cellular and molecular biology

    NASA Astrophysics Data System (ADS)

    Feldon, David F.; Rates, Christopher; Sun, Chongning

    2017-12-01

    In the biological sciences, very little is known about the mechanisms by which doctoral students acquire the skills they need to become independent scientists. In the postsecondary biology education literature, identification of specific skills and effective methods for helping students to acquire them are limited to undergraduate education. To establish a foundation from which to investigate the developmental trajectory of biologists' research skills, it is necessary to identify those skills which are integral to doctoral study and distinct from skills acquired earlier in students' educational pathways. In this context, the current study engages the framework of threshold concepts to identify candidate skills that are both obstacles and significant opportunities for developing proficiency in conducting research. Such threshold concepts are typically characterised as transformative, integrative, irreversible, and challenging. The results from interviews and focus groups with current and former doctoral students in cellular and molecular biology suggest two such threshold concepts relevant to their subfield: the first is an ability to effectively engage primary research literature from the biological sciences in a way that is critical without dismissing the value of its contributions. The second is the ability to conceptualise appropriate control conditions necessary to design and interpret the results of experiments in an efficient and effective manner for research in the biological sciences as a discipline. Implications for prioritising and sequencing graduate training experiences are discussed on the basis of the identified thresholds.

  2. Crystal growth, perfection, linear and nonlinear optical, photoconductivity, dielectric, thermal and laser damage threshold properties of 4-methylimidazolium picrate: an interesting organic crystal for photonic and optoelectronic devices

    NASA Astrophysics Data System (ADS)

    Rajesh, K.; Arun, A.; Mani, A.; Praveen Kumar, P.

    2016-10-01

    The 4-methylimidazolium picrate has been synthesized and characterized successfully. Single and powder x-ray diffraction studies were conducted which confirmed the crystal structure, and the value of the strain was calculated. The crystal perfection was determined by a HRXR diffractometer. The transmission spectrum exhibited a better transmittance of the crystal in the entire visible region with a lower cut-off wavelength of 209 nm. The linear absorption value was calculated by the optical limiting method. A birefringence study was also carried out. Second and third order nonlinear optical properties of the crystal were found by second harmonic generation and the z-scan technique. The crystals were also characterized by dielectric measurement and a photoconductivity analyzer to determine the dielectric property and the optical conductivity of the crystal. The laser damage threshold activity of the grown crystal was studied by a Q-switched Nd:YAG laser beam. Thermal studies established that the compound did not undergo a phase transition and was stable up to 240 °C.

  3. Suppressing epidemic spreading by risk-averse migration in dynamical networks

    NASA Astrophysics Data System (ADS)

    Yang, Han-Xin; Tang, Ming; Wang, Zhen

    2018-01-01

    In this paper, we study the interplay between individual behaviors and epidemic spreading in a dynamical network. We distribute agents on a square-shaped region with periodic boundary conditions. Every agent is regarded as a node of the network and a wireless link is established between two agents if their geographical distance is less than a certain radius. At each time step, every agent assesses the epidemic situation and decides whether to stay in or leave its current place. An agent will leave its current place at a given speed if the number of infected neighbors reaches or exceeds a critical value E. Owing to the movement of agents, the network's structure is dynamical. Interestingly, we find that there exists an optimal value of E leading to the maximum epidemic threshold. This means that epidemic spreading can be effectively controlled by risk-averse migration. Besides, we find that the epidemic threshold increases as the recovery rate increases, decreases as the contact radius increases, and is maximized by an optimal moving speed. Our findings offer a deeper understanding of epidemic spreading in dynamical networks.

  4. Comparison of 2 real-time PCR assays for diagnosis of Pneumocystis jirovecii pneumonia in human immunodeficiency virus (HIV) and non-HIV immunocompromised patients.

    PubMed

    Montesinos, Isabel; Brancart, Françoise; Schepers, Kinda; Jacobs, Frederique; Denis, Olivier; Delforge, Marie-Luce

    2015-06-01

    A total of 120 bronchoalveolar lavage specimens from HIV and non-HIV immunocompromised patients, positive for Pneumocystis jirovecii by an "in house" real-time polymerase chain reaction (PCR), were evaluated by the Bio-Evolution Pneumocystis real-time PCR, a commercial quantitative assay. Patients were classified into 2 categories based on clinical and radiological findings: definite and unlikely Pneumocystis pneumonia (PCP). For the "in house" PCR, a cycle threshold of 34 was established as the cut-off value to discriminate definite from unlikely PCP, with a sensitivity of 65% and a specificity of 85%. For the Bio-Evolution quantitative PCR, a cut-off value of 2.8×10⁵ copies/mL was defined, with a sensitivity of 72% and a specificity of 82%. Overlapping zones of results for definite and unlikely PCP were observed. Quantitative PCR is probably a useful tool for PCP diagnosis. However, for optimal management of PCP in non-HIV immunocompromised patients, operational thresholds should be assessed according to underlying diseases and other clinical and radiological parameters. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. MRI and Diffusion-weighted MRI Volumetry for Identification of Complete Tumor Responders After Preoperative Chemoradiotherapy in Patients With Rectal Cancer: A Bi-institutional Validation Study.

    PubMed

    Lambregts, Doenja M J; Rao, Sheng-Xiang; Sassen, Sander; Martens, Milou H; Heijnen, Luc A; Buijsen, Jeroen; Sosef, Meindert; Beets, Geerard L; Vliegen, Roy A; Beets-Tan, Regina G H

    2015-12-01

    Retrospective single-center studies have shown that diffusion-weighted magnetic resonance imaging (DWI) is promising for identification of patients with rectal cancer with a complete tumor response after neoadjuvant chemoradiotherapy (CRT), using certain volumetric thresholds. This study aims to validate the diagnostic value of these volume thresholds in a larger, independent, and bi-institutional patient cohort. A total of 112 patients with locally advanced rectal cancer (2 centers) treated with a long course of CRT were enrolled. Patients underwent standard T2W-magnetic resonance imaging and DWI, both pre- and post-CRT. Two experienced readers independently determined pre-CRT and post-CRT tumor volumes (cm³) on T2W-magnetic resonance images and diffusion-weighted magnetic resonance images by means of freehand tumor delineation. Tumor volume reduction rates (Δvolume) were calculated. Previously determined T2W and DWI threshold values for prevolume, postvolume, and Δvolume were tested to "prospectively" assess their respective diagnostic value in discriminating patients with a complete tumor response from patients with residual tumor. Twenty patients had a complete response. Using the average measurements between the 2 readers, the areas under the curve for the pre-/post-/Δvolumes were 0.73/0.82/0.78 for T2W-magnetic resonance imaging and 0.77/0.92/0.86 for DWI, respectively. For T2W-volumetry, sensitivity and specificity using the predefined volume thresholds were 55% and 74% for pre-, 60% and 89% for post-, and 60% and 86% for Δvolume. For DWI volumetry, sensitivity and specificity were 65% and 76% for pre-, 70% and 98% for post-, and 70% and 93% for Δvolume. Previously established DWI volume thresholds can be reproduced with good results. Post-CRT DWI volumetry offers the best results for the detection of patients with a complete response after CRT with an area under the curve of 0.92, sensitivity of 70%, and specificity of 98%.

  6. Effect of background correction on peak detection and quantification in online comprehensive two-dimensional liquid chromatography using diode array detection.

    PubMed

    Allen, Robert C; John, Mallory G; Rutan, Sarah C; Filgueira, Marcelo R; Carr, Peter W

    2012-09-07

    A singular value decomposition-based background correction (SVD-BC) technique is proposed for the reduction of background contributions in online comprehensive two-dimensional liquid chromatography (LC×LC) data. The SVD-BC technique was compared to simply subtracting a blank chromatogram from a sample chromatogram and to a previously reported background correction technique for one dimensional chromatography, which uses an asymmetric weighted least squares (AWLS) approach. AWLS was the only background correction technique to completely remove the background artifacts from the samples as evaluated by visual inspection. However, the SVD-BC technique greatly reduced or eliminated the background artifacts as well and preserved the peak intensity better than AWLS. The loss in peak intensity by AWLS resulted in lower peak counts at the detection thresholds established using standards samples. However, the SVD-BC technique was found to introduce noise which led to detection of false peaks at the lower detection thresholds. As a result, the AWLS technique gave more precise peak counts than the SVD-BC technique, particularly at the lower detection thresholds. While the AWLS technique resulted in more consistent percent residual standard deviation values, a statistical improvement in peak quantification after background correction was not found regardless of the background correction technique used. Copyright © 2012 Elsevier B.V. All rights reserved.
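
    A generic sketch of one way an SVD-based background correction can be set up: estimate a low-rank spectral background subspace from chromatogram rows known to contain no analyte, then project that subspace out of every spectrum. The published SVD-BC procedure will differ in its details; the row selection and rank below are assumptions.

    ```python
    import numpy as np

    def svd_background_correction(data, background_rows, rank=2):
        """data: time x wavelength matrix from the diode-array detector.
        Remove the component of every spectrum lying in the background
        subspace spanned by the leading right-singular vectors of the
        analyte-free rows."""
        bg = data[background_rows, :]
        _, _, vt = np.linalg.svd(bg, full_matrices=False)
        v = vt[:rank].T                    # spectral background basis
        return data - data @ v @ v.T       # subtract the projection

    # Usage on a hypothetical chromatogram, taking the first 50 time points
    # (before the dead time) as analyte-free:
    # corrected = svd_background_correction(chrom, background_rows=slice(0, 50))
    ```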

  7. Determination of the measurement threshold in gamma-ray spectrometry.

    PubMed

    Korun, M; Vodenik, B; Zorko, B

    2017-03-01

    In gamma-ray spectrometry the measurement threshold describes the lower boundary of the interval of peak areas originating in the response of the spectrometer to gamma-rays from the sample measured. In this sense it presents a generalization of the net indication corresponding to the decision threshold, which is the measurement threshold at the quantity value zero for a predetermined probability for making errors of the first kind. Measurement thresholds were determined for peaks appearing in the spectra of the radon daughters ²¹⁴Pb and ²¹⁴Bi by measuring the spectrum 35 times under repeatable conditions. For the calculation of the measurement threshold the probability for detection of the peaks and the mean relative uncertainty of the peak area were used. The relative measurement thresholds, the ratios between the measurement threshold and the mean peak area uncertainty, were determined for 54 peaks where the probability for detection varied between a few percent and about 95% and the relative peak area uncertainty between 30% and 80%. The relative measurement thresholds vary considerably from peak to peak, although the nominal value of the sensitivity parameter defining the sensitivity for locating peaks was equal for all peaks. At the value of the sensitivity parameter used, the peak analysis does not locate peaks corresponding to the decision threshold with a probability in excess of 50%. This implies that peaks in the spectrum may not be located, although the true value of the measurand exceeds the decision threshold. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Thermal detection thresholds in 5-year-old preterm born children; IQ does matter.

    PubMed

    de Graaf, Joke; Valkenburg, Abraham J; Tibboel, Dick; van Dijk, Monique

    2012-07-01

    Experiencing pain at newborn age may have consequences on one's somatosensory perception later in life. Children's perception for cold and warm stimuli may be determined with the Thermal Sensory Analyzer (TSA) device by two different methods. This pilot study in 5-year-old children born preterm aimed at establishing whether the TSA method of limits, which is dependent on reaction time, and the method of levels, which is independent of reaction time, would yield different cold and warm detection thresholds. The second aim was to establish possible associations between intellectual ability and the detection thresholds obtained with either method. A convenience sample was drawn from the participants in an ongoing 5-year follow-up study of a randomized controlled trial on effects of morphine during mechanical ventilation. Thresholds were assessed using both methods and statistically compared. Possible associations between the child's intelligence quotient (IQ) and threshold levels were analyzed. The method of levels yielded more sensitive thresholds than did the method of limits, i.e. mean (SD) cold detection thresholds: 30.3 (1.4) versus 28.4 (1.7) (Cohen's d=1.2, P=0.001) and warm detection thresholds: 33.9 (1.9) versus 35.6 (2.1) (Cohen's d=0.8, P=0.04). IQ was statistically significantly associated only with the detection thresholds obtained with the method of limits (cold: r=0.64, warm: r=-0.52). The TSA method of levels is to be preferred over the method of limits in 5-year-old preterm born children, as it establishes more sensitive detection thresholds and is independent of IQ. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. THRESHOLD LOGIC.

    DTIC Science & Technology

    synthesis procedures; a 'best' method is definitely established. (2) 'Symmetry Types for Threshold Logic' is a tutorial exposition including a careful...development of the Goto-Takahasi self-dual type ideas. (3) 'Best Threshold Gate Decisions' reports a comparison, on the 2470 7-argument threshold ...interpretation is shown best. (4) 'Threshold Gate Networks' reviews the previously discussed 2-algorithm in geometric terms, describes our FORTRAN

  10. Midline Shift Threshold Value for Hemiparesis in Chronic Subdural Hematoma.

    PubMed

    Juković, Mirela F; Stojanović, Dejan B

    2015-01-01

    Chronic subdural hematoma (CSDH) has a variety of clinical presentations, with numerous neurological symptoms and signs. Hemiparesis is one of the leading signs that potentially indicates CSDH. The purpose of this study was to determine the threshold (cut-off) value of midsagittal line (MSL) shift above which hemiparesis is likely to appear. The study evaluated 83 patients with 53 unilateral and 30 bilateral CSDHs over a period of three years. The evaluated computed tomography (CT) findings in patients with CSDH were the diameter of the hematoma and the midsagittal line shift, measured on non-contrast CT scans in relation to the occurrence of hemiparesis. Threshold values of MSL shift for both types of CSDH were obtained at the point of maximal (equal) sensitivity and specificity (the intersection of the curves). MSL shift is a good predictor of hemiparesis occurrence (total sample, AUROC 0.75, p=0.0001). Unilateral and bilateral CSDHs had different threshold values of MSL shift for hemiparesis development. The results suggested that in unilateral CSDH the threshold value of MSL shift could be set at 10 mm (AUROC=0.65; p=0.07). For bilateral CSDH the threshold level of MSL shift was 4.5 mm (AUROC=0.77; p=0.01). Our study points to the phenomenon that midsagittal line shift can predict hemiparesis occurrence. Hemiparesis in patients with bilateral CSDH was more closely related to midsagittal line shift than in patients with unilateral CSDH. When the midsagittal line shift exceeds the threshold level, hemiparesis occurs with a certain probability.
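
    A minimal sketch of the cut-off criterion described above (the threshold at which sensitivity and specificity are equal, i.e. where the two curves intersect), applied to small invented data; the numbers are for demonstration only and are not the study's measurements.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    def equal_sens_spec_cutoff(y, shift_mm):
        """Threshold along the ROC curve where |sensitivity - specificity|
        is smallest, i.e. where the two curves cross."""
        fpr, tpr, thr = roc_curve(y, shift_mm)
        sens, spec = tpr, 1.0 - fpr
        k = int(np.argmin(np.abs(sens - spec)))
        return thr[k], sens[k], spec[k], roc_auc_score(y, shift_mm)

    # Invented example: 1 = hemiparesis present, values are MSL shift in mm.
    y = np.array([0, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1])
    shift = np.array([2, 3, 4, 5, 5, 6, 9, 10, 8, 11, 12, 14], float)
    print(equal_sens_spec_cutoff(y, shift))
    ```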

  11. Reduced rank regression via adaptive nuclear norm penalization

    PubMed Central

    Chen, Kun; Dong, Hongbo; Chan, Kung-Sik

    2014-01-01

    Summary We propose an adaptive nuclear norm penalization approach for low-rank matrix approximation, and use it to develop a new reduced rank estimation method for high-dimensional multivariate regression. The adaptive nuclear norm is defined as the weighted sum of the singular values of the matrix, and it is generally non-convex under the natural restriction that the weight decreases with the singular value. However, we show that the proposed non-convex penalized regression method has a global optimal solution obtained from an adaptively soft-thresholded singular value decomposition. The method is computationally efficient, and the resulting solution path is continuous. The rank consistency of and prediction/estimation performance bounds for the estimator are established for a high-dimensional asymptotic regime. Simulation studies and an application in genetics demonstrate its efficacy. PMID:25045172
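
    A minimal sketch of adaptive soft-thresholding of singular values, applied here to an ordinary least-squares coefficient matrix as a simplified stand-in for the full penalized-regression estimator; the weight choice w_i = d_i^(-gamma) is an assumed adaptive-lasso-style weighting and the data are simulated for illustration.

    ```python
    import numpy as np

    def adaptive_soft_threshold_svd(C, lam=1.0, gamma=2.0):
        """Shrink singular values with weights that decrease as the singular
        value grows, so large components are penalized less."""
        u, d, vt = np.linalg.svd(C, full_matrices=False)
        w = d ** (-gamma)                         # data-driven weights
        d_shrunk = np.maximum(d - lam * w, 0.0)   # weighted soft-thresholding
        return (u * d_shrunk) @ vt

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))
    B_true = rng.normal(size=(8, 2)) @ rng.normal(size=(2, 6))   # rank-2 truth
    Y = X @ B_true + 0.1 * rng.normal(size=(100, 6))
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    B_rr = adaptive_soft_threshold_svd(B_ols, lam=0.5)
    print("estimated rank:", np.linalg.matrix_rank(B_rr, tol=1e-8))
    ```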

  12. Additional value of anaerobic threshold in a general mortality prediction model in a urban patient cohort with Chagas cardiomyopathy.

    PubMed

    Silva, Roberto Ribeiro da; Reis, Michel Silva; Pereira, Basílio de Bragança; Nascimento, Emilia Matos do; Pedrosa, Roberto Coury

    2017-12-01

    The anaerobic threshold (AT) is recognized as an objective and direct measurement that reflects variations in the metabolism of skeletal muscles during exercise. Its prognostic value in heart diseases of non-chagasic etiology is well established, and the assessment of the risk of death in Chagas heart disease is relatively well established by the Rassi score. However, the added value that AT can bring to the Rassi score has not yet been studied. The aim was to assess whether AT adds to the Rassi score in patients with chronic Chagas' heart disease. This was a prospective, dynamic cohort study based on a review of 150 medical records; 45 records of patients who underwent cardiopulmonary exercise testing between 1996 and 1997 and were followed until September 2015 were selected for the cohort. Associations between the studied variables were assessed using a logistic regression model. The suitability of the models was verified using ROC curves and the coefficient of determination R². Eight patients (17.78%) died by September 2015, 7 of them (87.5%) from cardiovascular causes, of whom 4 (57.14%) were considered at high risk by the Rassi score. With the Rassi score as the independent variable and death as the outcome, we obtained an area under the curve (AUC) of 0.711, with R²=0.214. With AT as the independent variable, we found AUC=0.706, with R²=0.078. With the Rassi score and AT together as independent variables, we obtained AUC=0.800 and R²=0.263. When AT is included in the logistic regression, the explained variation (R²) for death estimation increases by about 5 percentage points. Copyright © 2017 Sociedade Portuguesa de Cardiologia. Publicado por Elsevier España, S.L.U. All rights reserved.

  13. Sesame allergy: role of specific IgE and skin-prick testing in predicting food challenge results.

    PubMed

    Permaul, Perdita; Stutius, Lisa M; Sheehan, William J; Rangsithienchai, Pitud; Walter, Jolan E; Twarog, Frank J; Young, Michael C; Scott, Jordan E; Schneider, Lynda C; Phipatanakul, Wanda

    2009-01-01

    There are conflicting data regarding the diagnostic value of sesame-specific IgE and the sesame skin test. Currently, there are no established thresholds that predict clinical reactivity. We examined the correlation of sesame ImmunoCAP and skin-prick test (SPT) results with oral challenge outcomes in children suspected of having a sesame food allergy. We conducted a retrospective chart review of children, aged 2-12 years, receiving a sesame ImmunoCAP level, SPT, and food challenge from January 2004 to August 2008 at Children's Hospital Boston and affiliated allergy clinics. Food challenges were conducted in cases of questionable clinical history or a negative ImmunoCAP and/or negative SPT despite a convincing history. Thirty-three oral sesame challenges were conducted; 21% (n = 7) failed and 79% (n = 26) passed. A sesame-specific IgE level of ≥7 kUA/L showed a specificity of >90%, and an SPT wheal size of ≥6 mm showed a specificity of >90%. Receiver operating characteristic (ROC) curve analysis for sesame-specific IgE revealed an area under the curve (AUC) of 0.56; ROC curve analysis for SPT wheal size revealed an AUC of 0.67. To our knowledge, this study represents the largest number of sesame challenges performed to evaluate the diagnostic value of both sesame-specific IgE and SPT. Based on our sample, neither test is a good predictor of true sesame allergy as determined by an oral challenge. We were unable to establish a threshold with a 95% positive predictive value for either sesame-specific IgE or SPT.

  14. [The analysis of threshold effect using Empower Stats software].

    PubMed

    Lin, Lin; Chen, Chang-zhong; Yu, Xiao-dan

    2013-11-01

    In many biomedical studies, a factor has no influence on the outcome variable, or has a positive effect, only within a certain range; beyond a certain threshold value, the size and/or direction of the effect changes. This is called a threshold effect. Whether there is a threshold effect in the relationship between a factor (x) and the outcome variable (y) can be examined by fitting a smooth curve and checking for a piecewise linear relationship; the threshold effect can then be analyzed using a segmented regression model, a likelihood ratio test (LRT), and bootstrap resampling. The Empower Stats software developed by X & Y Solutions Inc. (USA) includes a threshold effect analysis module. The user can either specify a threshold value for the segmented fit, or leave it unspecified and let the software determine the optimal threshold automatically and calculate its confidence interval.
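
    A generic illustration of the segmented-regression step behind such a threshold-effect analysis: candidate breakpoints are scanned and the one minimizing the residual sum of squares is taken as the estimated threshold, which can then be compared against a single-line model. This is a sketch on simulated data, not the Empower Stats implementation.

        import numpy as np

        rng = np.random.default_rng(2)
        x = np.sort(rng.uniform(0, 10, 300))
        true_threshold = 6.0
        # Flat effect below the threshold, positive slope above it, plus noise.
        y = np.where(x < true_threshold, 1.0, 1.0 + 0.8 * (x - true_threshold))
        y = y + rng.normal(0, 0.3, x.size)

        def piecewise_sse(k):
            """Fit y ~ 1 + x + max(x - k, 0) and return the residual sum of squares."""
            X = np.column_stack([np.ones_like(x), x, np.maximum(x - k, 0.0)])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return float(resid @ resid)

        candidates = np.linspace(np.quantile(x, 0.05), np.quantile(x, 0.95), 200)
        sse = np.array([piecewise_sse(k) for k in candidates])
        k_hat = candidates[np.argmin(sse)]

        # Likelihood-ratio-type comparison against a single straight line.
        sse_linear = piecewise_sse(x.max())       # hinge term vanishes -> one line
        lrt_stat = x.size * np.log(sse_linear / sse.min())
        print("estimated threshold:", round(k_hat, 2), " LRT-type statistic:", round(lrt_stat, 1))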

  15. Catch bonds govern adhesion through L-selectin at threshold shear.

    PubMed

    Yago, Tadayuki; Wu, Jianhua; Wey, C Diana; Klopocki, Arkadiusz G; Zhu, Cheng; McEver, Rodger P

    2004-09-13

    Flow-enhanced cell adhesion is an unexplained phenomenon that might result from a transport-dependent increase in on-rates or a force-dependent decrease in off-rates of adhesive bonds. L-selectin requires a threshold shear to support leukocyte rolling on P-selectin glycoprotein ligand-1 (PSGL-1) and other vascular ligands. Low forces decrease L-selectin-PSGL-1 off-rates (catch bonds), whereas higher forces increase off-rates (slip bonds). We determined that a force-dependent decrease in off-rates dictated flow-enhanced rolling of L-selectin-bearing microspheres or neutrophils on PSGL-1. Catch bonds enabled increasing force to convert short-lived tethers into longer-lived tethers, which decreased rolling velocities and increased the regularity of rolling steps as shear rose from the threshold to an optimal value. As shear increased above the optimum, transitions to slip bonds shortened tether lifetimes, which increased rolling velocities and decreased rolling regularity. Thus, force-dependent alterations of bond lifetimes govern L-selectin-dependent cell adhesion below and above the shear optimum. These findings establish the first biological function for catch bonds as a mechanism for flow-enhanced cell adhesion.

  16. Threshold concepts: implications for the management of natural resources

    USGS Publications Warehouse

    Guntenspergen, Glenn R.; Gross, John

    2014-01-01

    Threshold concepts can have broad relevance in natural resource management. However, the concept of ecological thresholds has not been widely incorporated or adopted in management goals. This largely stems from the uncertainty surrounding threshold levels and the post hoc analyses that have generally been used to identify them. Natural resource managers need new tools and approaches to help them assess the existence and detection of conditions that demand management actions. Additional threshold concepts include utility thresholds (which are based on human values about ecological systems) and decision thresholds (which reflect management objectives and values and include ecological knowledge about a system), alongside ecological thresholds. Together, these concepts provide a framework for considering the use of thresholds in natural resource decision making.

  17. Validity of Simpson-Angus Scale (SAS) in a naturalistic schizophrenia population.

    PubMed

    Janno, Sven; Holi, Matti M; Tuisku, Katinka; Wahlbeck, Kristian

    2005-03-17

    The Simpson-Angus Scale (SAS) is an established instrument for neuroleptic-induced parkinsonism (NIP), but its statistical properties have been studied insufficiently, and some shortcomings concerning its content have been suggested. According to a recent report, the widely used SAS mean score cut-off value of 0.3 for NIP detection may be too low. Our aim was to evaluate SAS against DSM-IV diagnostic criteria for NIP and an objective motor assessment (actometry). Ninety-nine chronic institutionalised schizophrenia patients were evaluated during the same interview by standardised actometric recording and SAS. The diagnosis of NIP was based on DSM-IV criteria. Internal consistency measured by Cronbach's alpha, convergence with actometry, and the capacity for NIP case detection were assessed. Cronbach's alpha for the scale was 0.79. SAS discriminated between DSM-IV NIP and non-NIP patients. The actometric findings did not correlate with SAS. ROC analysis yielded good case detection power for the SAS mean score. The optimal threshold value of the SAS mean score was between 0.65 and 0.95, i.e. clearly higher than the previously suggested threshold value. We conclude that SAS seems to be a reliable and valid instrument. The previously common cut-off mean score of 0.3 has been too low, resulting in low specificity, and we suggest a new cut-off value of 0.65, whereby specificity could be doubled without losing sensitivity.

  18. Validity of Simpson-Angus Scale (SAS) in a naturalistic schizophrenia population

    PubMed Central

    Janno, Sven; Holi, Matti M; Tuisku, Katinka; Wahlbeck, Kristian

    2005-01-01

    Background The Simpson-Angus Scale (SAS) is an established instrument for neuroleptic-induced parkinsonism (NIP), but its statistical properties have been studied insufficiently, and some shortcomings concerning its content have been suggested. According to a recent report, the widely used SAS mean score cut-off value of 0.3 for NIP detection may be too low. Our aim was to evaluate SAS against DSM-IV diagnostic criteria for NIP and an objective motor assessment (actometry). Methods Ninety-nine chronic institutionalised schizophrenia patients were evaluated during the same interview by standardised actometric recording and SAS. The diagnosis of NIP was based on DSM-IV criteria. Internal consistency measured by Cronbach's α, convergence with actometry, and the capacity for NIP case detection were assessed. Results Cronbach's α for the scale was 0.79. SAS discriminated between DSM-IV NIP and non-NIP patients. The actometric findings did not correlate with SAS. ROC analysis yielded good case detection power for the SAS mean score. The optimal threshold value of the SAS mean score was between 0.65 and 0.95, i.e. clearly higher than the previously suggested threshold value. Conclusion We conclude that SAS seems to be a reliable and valid instrument. The previously common cut-off mean score of 0.3 has been too low, resulting in low specificity, and we suggest a new cut-off value of 0.65, whereby specificity could be doubled without losing sensitivity. PMID:15774006

  19. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    PubMed

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and established risk factors, potentially representing an important step forward in the translation of quantitative CMR perfusion analysis to the clinical setting. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Insulin resistance (HOMA-IR) cut-off values and the metabolic syndrome in a general adult population: effect of gender and age: EPIRCE cross-sectional study

    PubMed Central

    2013-01-01

    Background Insulin resistance has been associated with metabolic and hemodynamic alterations and higher cardiometabolic risk. There is great variability in the threshold homeostasis model assessment of insulin resistance (HOMA-IR) levels used to define insulin resistance. The purpose of this study was to describe the influence of age and gender on the estimation of HOMA-IR optimal cut-off values to identify subjects with higher cardiometabolic risk in a general adult population. Methods It included 2459 adults (range 20–92 years, 58.4% women) in a random Spanish population sample. As an accurate indicator of cardiometabolic risk, Metabolic Syndrome (MetS), defined both by International Diabetes Federation criteria and by Adult Treatment Panel III criteria, was used. The effect of age was analyzed separately in individuals with and without diabetes mellitus. ROC regression methodology was used to evaluate the effect of age on HOMA-IR performance in classifying cardiometabolic risk. Results In the Spanish population, the threshold value of HOMA-IR drops from 3.46 using the 90th percentile criterion to 2.05 when MetS components are taken into account. In non-diabetic women, but not in men, we found a significant non-linear effect of age on the accuracy of HOMA-IR. In non-diabetic men, the cut-off value was 1.85. All values lie between the 70th and 75th percentiles of HOMA-IR levels in the adult Spanish population. Conclusions Basing the HOMA-IR cut-off points on cardiometabolic risk, rather than on a percentile of the population distribution, would increase its clinical utility in identifying those patients in whom the presence of multiple metabolic risk factors imparts an increased metabolic and cardiovascular risk. The threshold levels must be modified by age in non-diabetic women. PMID:24131857

  1. Insulin resistance (HOMA-IR) cut-off values and the metabolic syndrome in a general adult population: effect of gender and age: EPIRCE cross-sectional study.

    PubMed

    Gayoso-Diz, Pilar; Otero-González, Alfonso; Rodriguez-Alvarez, María Xosé; Gude, Francisco; García, Fernando; De Francisco, Angel; Quintela, Arturo González

    2013-10-16

    Insulin resistance has been associated with metabolic and hemodynamic alterations and higher cardiometabolic risk. There is great variability in the threshold homeostasis model assessment of insulin resistance (HOMA-IR) levels used to define insulin resistance. The purpose of this study was to describe the influence of age and gender on the estimation of HOMA-IR optimal cut-off values to identify subjects with higher cardiometabolic risk in a general adult population. It included 2459 adults (range 20-92 years, 58.4% women) in a random Spanish population sample. As an accurate indicator of cardiometabolic risk, Metabolic Syndrome (MetS), defined both by International Diabetes Federation criteria and by Adult Treatment Panel III criteria, was used. The effect of age was analyzed separately in individuals with and without diabetes mellitus. ROC regression methodology was used to evaluate the effect of age on HOMA-IR performance in classifying cardiometabolic risk. In the Spanish population, the threshold value of HOMA-IR drops from 3.46 using the 90th percentile criterion to 2.05 when MetS components are taken into account. In non-diabetic women, but not in men, we found a significant non-linear effect of age on the accuracy of HOMA-IR. In non-diabetic men, the cut-off value was 1.85. All values lie between the 70th and 75th percentiles of HOMA-IR levels in the adult Spanish population. Basing the HOMA-IR cut-off points on cardiometabolic risk, rather than on a percentile of the population distribution, would increase its clinical utility in identifying those patients in whom the presence of multiple metabolic risk factors imparts an increased metabolic and cardiovascular risk. The threshold levels must be modified by age in non-diabetic women.

  2. The evolution of altruism in spatial threshold public goods games via an insurance mechanism

    NASA Astrophysics Data System (ADS)

    Zhang, Jianlei; Zhang, Chunyan

    2015-05-01

    The persistence of cooperation in public goods situations has become an important puzzle for researchers. This paper considers threshold public goods games in which the option of insurance is provided to players from the standpoint of diversification of risk, envisaging the possibility of multiple strategies in such scenarios. In this setting, the provision point is defined as the minimum number of contributors in one threshold public goods game, below which the game fails. In the presence of risk and insurance, more contributions are motivated if (1) only cooperators can opt to be insured, so that their contribution loss in aborted games can be (partly or fully) covered by the insurance; and (2) insured cooperators obtain larger compensation, at lower values of the threshold point (the required minimum number of contributors). Moreover, the results suggest the dominance of insured defectors, who gain a stronger advantage from the more profitable benefits drawn from insurance. We provide results of extensive computer simulations in the realm of spatial games (random regular networks and scale-free networks), and support this study with analytical results for well-mixed populations. Our study is expected to establish a causal link between widespread altruistic behaviors and the existing insurance system.

  3. Photoacoustic signals denoising of the glucose aqueous solutions using an improved wavelet threshold method

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Xiong, Zhihua

    2016-10-01

    The denoising of the photoacoustic signals of glucose is one of the most important steps in quality identification of fruit, because the real-time photoacoustic signals of glucose are easily contaminated by various kinds of noise. To remove the noise and some useless information, an improved wavelet threshold function is proposed. Compared with the traditional wavelet hard and soft threshold functions, the improved wavelet threshold function can overcome the pseudo-oscillation effect in the denoised photoacoustic signals because of its continuity, and the error between the denoised signals and the original signals is decreased. To validate the feasibility of denoising with the improved wavelet threshold function, denoising simulation experiments based on MATLAB programming were performed. In the simulation experiments, a standard test signal was used, and three different denoising methods were compared with the improved wavelet threshold function. The signal-to-noise ratio (SNR) and the root-mean-square error (RMSE) were used to evaluate the denoising performance. The experimental results demonstrate that the SNR of the improved wavelet threshold function is the largest and the RMSE is the smallest, which verifies that denoising with the improved wavelet threshold function is feasible. Finally, the improved wavelet threshold function was used to remove the noise from the photoacoustic signals of the glucose solutions, with very good results. Therefore, the improved wavelet threshold function denoising proposed in this paper has potential value in the field of denoising photoacoustic signals.
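
    The contrast between hard, soft, and an "improved" threshold function can be made concrete with a small sketch. The improved rule used below, sign(c)(|c| - alpha*t^2/|c|) for |c| > t, is one common continuous compromise between hard and soft thresholding assumed here for illustration; it is not necessarily the exact function proposed in the paper. PyWavelets (pywt) is assumed to be available.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        def improved_threshold(c, t, alpha=1.0):
            """One common 'improved' rule: continuous at the threshold (like soft
            thresholding) but approaching the identity for large coefficients
            (like hard thresholding); smaller alpha moves toward hard."""
            shrunk = np.sign(c) * (np.abs(c) - alpha * t ** 2 / np.maximum(np.abs(c), 1e-12))
            return np.where(np.abs(c) <= t, 0.0, shrunk)

        def snr_db(clean, est):
            return 10 * np.log10(np.sum(clean ** 2) / np.sum((clean - est) ** 2))

        rng = np.random.default_rng(3)
        x = np.linspace(0, 1, 1024)
        clean = np.sin(2 * np.pi * 5 * x) + 0.5 * np.sin(2 * np.pi * 12 * x)
        noisy = clean + 0.3 * rng.normal(size=clean.size)

        coeffs = pywt.wavedec(noisy, 'db4', level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
        t = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold

        rules = {'hard': lambda c: pywt.threshold(c, t, mode='hard'),
                 'soft': lambda c: pywt.threshold(c, t, mode='soft'),
                 'improved': lambda c: improved_threshold(c, t)}
        for name, rule in rules.items():
            den_coeffs = [coeffs[0]] + [rule(c) for c in coeffs[1:]]
            den = pywt.waverec(den_coeffs, 'db4')[:clean.size]
            rmse = np.sqrt(np.mean((clean - den) ** 2))
            print(f"{name}: SNR={snr_db(clean, den):.1f} dB, RMSE={rmse:.3f}")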

  4. 76 FR 77128 - Alternate Tonnage Threshold for Oil Spill Response Vessels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ...The Coast Guard is establishing an alternate size threshold based on the measurement system established under the International Convention on Tonnage Measurement of Ships, 1969, for Oil Spill Response Vessels (OSRVs), which are properly certificated under 46 CFR subchapter L. The present size threshold of 500 gross registered tons is based on the U.S. regulatory measurement system. This rule provides an alternative for owners and operators of offshore supply vessels (OSVs) that may result in an increase in oil spill response capacity and capability.

  5. Determination and validation of soil thresholds for cadmium based on food quality standard and health risk assessment.

    PubMed

    Ding, Changfeng; Ma, Yibing; Li, Xiaogang; Zhang, Taolin; Wang, Xingxiang

    2018-04-01

    Cadmium (Cd) is an environmental toxicant with high rates of soil-plant transfer. It is essential to establish an accurate soil threshold for the implementation of soil management practices. This study takes root vegetable as an example to derive soil thresholds for Cd based on the food quality standard as well as health risk assessment using species sensitivity distribution (SSD). A soil type-specific bioconcentration factor (BCF, ratio of Cd concentration in plant to that in soil) generated from soil with a proper Cd concentration gradient was calculated and applied in the derivation of soil thresholds instead of a generic BCF value to minimize the uncertainty. The sensitivity variations of twelve root vegetable cultivars for accumulating soil Cd and the empirical soil-plant transfer model were investigated and developed in greenhouse experiments. After normalization, the hazardous concentrations from the fifth percentile of the distribution based on added Cd (HC5add) were calculated from the SSD curves fitted by the Burr Type III distribution. The derived soil thresholds were presented as continuous or scenario criteria depending on the combination of soil pH and organic carbon content. The soil thresholds based on the food quality standard were on average 0.7-fold of those based on health risk assessment, and were further validated to be reliable using independent data from field survey and published articles. The results suggested that deriving soil thresholds for Cd using the SSD method is robust and also applicable to other crops as well as other trace elements that have the potential to cause health risk issues. Copyright © 2017 Elsevier B.V. All rights reserved.
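
    The HC5 step (fit a sensitivity distribution, take its 5th percentile) can be sketched generically. The example below fits a log-normal SSD instead of the Burr Type III distribution used in the study, simply because it is the more widely available default, and the cultivar sensitivity values are invented for illustration.

        import numpy as np
        from scipy import stats

        # Hypothetical cultivar sensitivity endpoints (added-Cd effect concentrations,
        # mg/kg) standing in for the normalized values derived in the study.
        ec_values = np.array([0.8, 1.1, 1.4, 1.6, 2.0, 2.3, 2.9, 3.4, 4.1, 5.0, 6.2, 7.5])

        # Fit a log-normal SSD (the study used Burr Type III) and take the 5th
        # percentile of the fitted distribution as HC5_add.
        shape, loc, scale = stats.lognorm.fit(ec_values, floc=0)
        hc5_add = stats.lognorm.ppf(0.05, shape, loc=loc, scale=scale)
        print(f"HC5_add ~ {hc5_add:.2f} mg/kg added Cd")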

  6. Chronic Migraine Is Associated With Sustained Elevation of Somatosensory Temporal Discrimination Thresholds.

    PubMed

    Vuralli, Doga; Evren Boran, H; Cengiz, Bulent; Coskun, Ozlem; Bolay, Hayrunnisa

    2016-10-01

    Migraine headache attacks have been shown to be accompanied by significant prolongation of somatosensory temporal discrimination threshold values, supporting signs of disrupted sensory processing in migraine. Chronic migraine is one of the most debilitating and challenging headache disorders, with no available biomarker. We aimed to test the diagnostic value of somatosensory temporal discrimination for chronic migraine in this prospective, controlled study. Fifteen chronic migraine patients and 15 healthy controls completed the study. Chronic migraine patients were evaluated twice, during a headache and during a headache-free period. Somatosensory temporal discrimination threshold values were evaluated in both hands. Duration of migraine and of chronic migraine, headache intensity, clinical features accompanying headache such as nausea, photophobia, phonophobia, and osmophobia, and pressure pain thresholds were also recorded. In the chronic migraine group, somatosensory temporal discrimination threshold values on the headache day (138.8 ± 21.8 ms for the right hand and 141.2 ± 17.4 ms for the left hand) were significantly higher than those on the headache-free day (121.5 ± 13.8 ms for the right hand and 122.8 ± 12.6 ms for the left hand, P = .003 and P < .0001, respectively) and those of healthy volunteers (35.4 ± 5.5 ms for the right hand and 36.4 ± 5.4 ms for the left hand, P < .0001 and P < .0001, respectively). Somatosensory temporal discrimination threshold values of chronic migraine patients on the headache-free day were significantly prolonged compared to those of the control group (121.5 ± 13.8 ms vs 35.4 ± 5.5 ms for the right hand, P < .0001, and 122.8 ± 12.6 ms vs 36.4 ± 5.4 ms for the left hand, P < .0001). Somatosensory temporal discrimination threshold values of the hand contralateral to the headache lateralization (153.3 ± 13.7 ms) were significantly higher (P < .0001) than those of the ipsilateral hand (118.2 ± 11.9 ms) in chronic migraine patients when the headache was lateralized. The headache intensity of chronic migraine patients, rated on a visual analog scale, was positively correlated with the contralateral somatosensory temporal discrimination threshold values. Somatosensory temporal discrimination thresholds remain elevated during headache-free intervals in patients with chronic migraine. By providing evidence for the first time of unremitting disruption of central sensory processing, the somatosensory temporal discrimination test stands out as a promising neurophysiological biomarker for chronic migraine. © 2016 American Headache Society.

  7. Spatially implicit approaches to understand the manipulation of mating success for insect invasion management

    Treesearch

    Takehiko Yamanaka; Andrew M. Liebhold

    2009-01-01

    Recent work indicates that Allee effects (the positive relationship between population size and per capita growth rate) are critical in determining the successful establishment of invading species. Allee effects may create population thresholds, and failure to establish is likely if invading populations fall below these thresholds. There are many mechanisms that may...

  8. Challenges in devising economic spray thresholds for a major pest of Australian canola, the redlegged earth mite (Halotydeus destructor).

    PubMed

    Arthur, Aston L; Hoffmann, Ary A; Umina, Paul A

    2015-10-01

    A key component for spray decision-making in IPM programmes is the establishment of economic injury levels (EILs) and economic thresholds (ETs). We aimed to establish an EIL for the redlegged earth mite (Halotydeus destructor Tucker) on canola. Complex interactions between mite numbers, feeding damage and plant recovery were found, highlighting the challenges in linking H. destructor numbers to yield. A guide of 10 mites plant(-1) was established at the first-true-leaf stage; however, simple relationships were not evident at other crop development stages, making it difficult to establish reliable EILs based on mite number. Yield was, however, strongly associated with plant damage and plant densities, reflecting the impact of mite feeding damage and indicating a plant-based alternative for establishing thresholds for H. destructor. Drawing on data from multiple field trials, we show that plant densities below 30-40 plants m(-2) could be used as a proxy for mite damage when reliable estimates of mite densities are not possible. This plant-based threshold provides a practical tool that avoids the difficulties of accurately estimating mite densities. The approach may be applicable to other situations where production conditions are unpredictable and interactions between pests and plant hosts are complex. © 2015 Society of Chemical Industry.

  9. Thresholds of Toxicological Concern - Setting a threshold for testing below which there is little concern.

    PubMed

    Hartung, Thomas

    2017-01-01

    Low dose, low risk; very low dose, no real risk. Setting a pragmatic threshold below which concerns become negligible is the purpose of thresholds of toxicological concern (TTC). The idea is that such threshold values do not need to be established for each and every chemical based on experimental data, but that by analyzing the distribution of lowest or no-effect doses of many chemicals, a TTC can be defined - typically using the 5th percentile of this distribution and lowering it by an uncertainty factor of, e.g., 100. In doing so, TTC aims to compare exposure information (dose) with a threshold below which any hazard manifestation is very unlikely to occur. The history and current developments of this concept are reviewed and the application of TTC for different regulated products and their hazards is discussed. TTC lends itself as a pragmatic filter to deprioritize testing needs whenever real-life exposures are much lower than levels where hazard manifestation would be expected, a situation that is called "negligible exposure" in the REACH legislation, though the TTC concept has not been fully incorporated in its implementation (yet). Other areas and regulations - especially in the food sector and for pharmaceutical impurities - are more proactive. Large, curated databases on toxic effects of chemicals provide us with the opportunity to set TTC for many hazards and substance classes and thus offer a precautionary second tier for risk assessments if hazard cannot be excluded. This allows focusing testing efforts better on relevant exposures to chemicals.
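
    The core arithmetic of a TTC derivation as described here, taking the 5th percentile of a distribution of no-effect doses and dividing by an uncertainty factor such as 100, is simple enough to show directly; the dose values below are invented for illustration only.

        import numpy as np

        # Hypothetical no-observed-effect levels (mg/kg body weight/day) for a class
        # of chemicals; in practice these come from curated toxicity databases.
        noel = np.array([0.5, 1.2, 3.0, 4.5, 8.0, 12.0, 15.0, 22.0, 40.0, 75.0,
                         120.0, 200.0, 350.0, 600.0, 900.0])

        p5 = np.percentile(noel, 5)        # 5th percentile of the NOEL distribution
        uncertainty_factor = 100
        ttc = p5 / uncertainty_factor      # threshold of toxicological concern
        print(f"5th percentile: {p5:.2f} mg/kg bw/day -> TTC ~ {ttc:.4f} mg/kg bw/day")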

  10. The formation of continuous opinion dynamics based on a gambling mechanism and its sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Alexandre Wang, Qiuping; Li, Wei; Cai, Xu

    2017-09-01

    The formation of continuous opinion dynamics is investigated based on a virtual gambling mechanism in which agents compete for a limited resource. We propose a model with agents holding opinions between -1 and 1. Agents are segregated into two cliques according to the sign of their opinions. Local communication happens only when the opinion distance between the corresponding agents is no larger than a pre-defined confidence threshold. Theoretical analysis of special cases provides a deep understanding of the roles of both the resource allocation parameter and the confidence threshold in the formation of opinion dynamics. For a sparse network, the evolution of opinion dynamics is negligible in the region of low confidence threshold when mindless agents are absent. Numerical results also imply that, in the presence of economic agents, a high confidence threshold is required for apparent clustering of agents in opinion. Moreover, a consensus state is generated only when the following three conditions are satisfied simultaneously: mindless agents are absent, the resource is concentrated in one clique, and the confidence threshold tends to a critical value (= 1.25 + 2/k_a, with k_a > 8/3, where k_a is the average number of friends of individual agents). For a fixed confidence threshold and resource allocation parameter, the most chaotic steady state of the dynamics occurs when the fraction of mindless agents is about 0.7. It is also demonstrated that economic agents are more likely to win at gambling than mindless ones. Finally, the importance of the three parameters involved in establishing the uncertainty of the model response is quantified by means of Latin hypercube sampling-based sensitivity analysis.

  11. Study of blur discrimination for 3D stereo viewing

    NASA Astrophysics Data System (ADS)

    Subedar, Mahesh; Karam, Lina J.

    2014-03-01

    Blur is an important attribute in the study and modeling of the human visual system. Blur discrimination was studied extensively using 2D test patterns. In this study, we present the details of subjective tests performed to measure blur discrimination thresholds using stereoscopic 3D test patterns. Specifically, the effect of disparity on the blur discrimination thresholds is studied on a passive stereoscopic 3D display. The blur discrimination thresholds are measured using stereoscopic 3D test patterns with positive, negative and zero disparity values, at multiple reference blur levels. A disparity value of zero represents the 2D viewing case where both the eyes will observe the same image. The subjective test results indicate that the blur discrimination thresholds remain constant as we vary the disparity value. This further indicates that binocular disparity does not affect blur discrimination thresholds and the models developed for 2D blur discrimination thresholds can be extended to stereoscopic 3D blur discrimination thresholds. We have presented fitting of the Weber model to the 3D blur discrimination thresholds measured from the subjective experiments.

  12. Reference guide to odor thresholds for hazardous air pollutants listed in the Clean Air Act amendments of 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, W.S.; Shoaf, C.R.; Velasquez, S.F.

    1992-03-01

    In response to numerous requests for information related to odor thresholds, this document was prepared by the Air Risk Information Support Center in its role of providing technical assistance to State and Local government agencies on risk assessment of air pollutants. A discussion of basic concepts related to olfactory function and the measurement of odor thresholds is presented. A detailed discussion of criteria which are used to evaluate the quality of published odor threshold values is provided. The use of odor threshold information in risk assessment is discussed. The results of a literature search and review of odor threshold information for the chemicals listed as hazardous air pollutants in the Clean Air Act amendments of 1990 are presented. The published odor threshold values are critically evaluated based on the criteria discussed, and the values of acceptable quality are used to determine a geometric mean or best estimate.

  13. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    PubMed Central

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed at several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period, the dosimeters were collected and processed to determine the radiation level. Results At none of the points selected for measurement did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  14. Pseudo-diode based on protonic/electronic hybrid oxide transistor

    NASA Astrophysics Data System (ADS)

    Fu, Yang Ming; Liu, Yang Hui; Zhu, Li Qiang; Xiao, Hui; Song, An Ran

    2018-01-01

    Current rectification behavior has proved essential in modern electronics. Here, a pseudo-diode is proposed based on a protonic/electronic hybrid indium-gallium-zinc oxide electric-double-layer (EDL) transistor. The oxide EDL transistors are fabricated using a phosphorous silicate glass (PSG) based proton-conducting electrolyte as the gate dielectric. A diode operation mode is established on the transistor, originating from field-configurable proton fluxes within the PSG electrolyte. Current rectification ratios have been modulated to values ranging between ~4 and ~50 000 with the gate electrode biased at voltages between -0.7 V and 0.1 V. Interestingly, the proposed pseudo-diode also exhibits field-reconfigurable threshold voltages. When the gate is biased at -0.5 V and 0.3 V, the threshold voltages are set to ~-1.3 V and -0.55 V, respectively. The proposed pseudo-diode may find potential applications in brain-inspired platforms and low-power portable systems.

  15. The rhesus monkey (Macaca mulatta) as a flight candidate

    NASA Technical Reports Server (NTRS)

    Debourne, M. N. G.; Bourne, G. H.; Mcclure, H. M.

    1977-01-01

    The intelligence and ruggedness of rhesus monkeys, as well as the abundance of normative data on their anatomy, physiology, and biochemistry, and the availability of captive-bred animals, qualify them for selection as candidates for orbital flight and weightlessness studies. Baseline data discussed include: physical characteristics, auditory thresholds, visual acuity, blood, serological taxonomy, immunogenetics, cytogenetics, circadian rhythms, respiration, cardiovascular values, corticosteroid response to chair restraint, microscopy of tissues, pathology, nutrition, and learning skills. Results from various tests used to establish the baseline data are presented in tables.

  16. Qualitative analysis of a stochastic epidemic model with specific functional response and temporary immunity

    NASA Astrophysics Data System (ADS)

    Hattaf, Khalid; Mahrouf, Marouane; Adnani, Jihad; Yousfi, Noura

    2018-01-01

    In this paper, we propose a stochastic delayed epidemic model with a specific functional response. The time delay represents the temporary immunity period, i.e., the time from recovery to becoming susceptible again. We first show that the proposed model is mathematically and biologically well-posed. Moreover, the extinction of the disease and the persistence in the mean are established in terms of a threshold value R0^S, which is smaller than the basic reproduction number R0 of the corresponding deterministic system.

  17. AnimalFinder: A semi-automated system for animal detection in time-lapse camera trap images

    USGS Publications Warehouse

    Price Tack, Jennifer L.; West, Brian S.; McGowan, Conor P.; Ditchkoff, Stephen S.; Reeves, Stanley J.; Keever, Allison; Grand, James B.

    2017-01-01

    Although the use of camera traps in wildlife management is well established, technologies to automate image processing have been much slower in development, despite their potential to drastically reduce personnel time and cost required to review photos. We developed AnimalFinder in MATLAB® to identify animal presence in time-lapse camera trap images by comparing individual photos to all images contained within the subset of images (i.e. photos from the same survey and site), with some manual processing required to remove false positives and collect other relevant data (species, sex, etc.). We tested AnimalFinder on a set of camera trap images and compared the presence/absence results with manual-only review with white-tailed deer (Odocoileus virginianus), wild pigs (Sus scrofa), and raccoons (Procyon lotor). We compared abundance estimates, model rankings, and coefficient estimates of detection and abundance for white-tailed deer using N-mixture models. AnimalFinder performance varied depending on a threshold value that affects program sensitivity to frequently occurring pixels in a series of images. Higher threshold values led to fewer false negatives (missed deer images) but increased manual processing time, but even at the highest threshold value, the program reduced the images requiring manual review by ~40% and correctly identified >90% of deer, raccoon, and wild pig images. Estimates of white-tailed deer were similar between AnimalFinder and the manual-only method (~1–2 deer difference, depending on the model), as were model rankings and coefficient estimates. Our results show that the program significantly reduced data processing time and may increase efficiency of camera trapping surveys.

  18. Novel methodologies for spectral classification of exon and intron sequences

    NASA Astrophysics Data System (ADS)

    Kwan, Hon Keung; Kwan, Benjamin Y. M.; Kwan, Jennifer Y. Y.

    2012-12-01

    Digital processing of a nucleotide sequence requires it to be mapped to a numerical sequence in which the choice of nucleotide to numeric mapping affects how well its biological properties can be preserved and reflected from nucleotide domain to numerical domain. Digital spectral analysis of nucleotide sequences unfolds a period-3 power spectral value which is more prominent in an exon sequence as compared to that of an intron sequence. The success of a period-3 based exon and intron classification depends on the choice of a threshold value. The main purposes of this article are to introduce novel codes for 1-sequence numerical representations for spectral analysis and compare them to existing codes to determine appropriate representation, and to introduce novel thresholding methods for more accurate period-3 based exon and intron classification of an unknown sequence. The main findings of this study are summarized as follows: Among sixteen 1-sequence numerical representations, the K-Quaternary Code I offers an attractive performance. A windowed 1-sequence numerical representation (with window length of 9, 15, and 24 bases) offers a possible speed gain over non-windowed 4-sequence Voss representation which increases as sequence length increases. A winner threshold value (chosen from the best among two defined threshold values and one other threshold value) offers a top precision for classifying an unknown sequence of specified fixed lengths. An interpolated winner threshold value applicable to an unknown and arbitrary length sequence can be estimated from the winner threshold values of fixed length sequences with a comparable performance. In general, precision increases as sequence length increases. The study contributes an effective spectral analysis of nucleotide sequences to better reveal embedded properties, and has potential applications in improved genome annotation.
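
    The period-3 measure underlying this kind of classification can be sketched with the 4-sequence Voss representation: map the sequence to four binary indicator sequences, take the DFT of each, and sum the power at the N/3 frequency bin. The threshold below is an arbitrary illustrative cut-off, not one of the winner or interpolated thresholds derived in the paper.

        import numpy as np

        def period3_power(seq):
            """Sum of |DFT|^2 at the N/3 bin over the four Voss indicator sequences,
            normalized by the total spectral power (excluding the DC term)."""
            seq = seq.upper()
            k = len(seq) // 3                          # period-3 frequency bin
            p3, total = 0.0, 0.0
            for base in "ACGT":
                indicator = np.array([1.0 if c == base else 0.0 for c in seq])
                spectrum = np.abs(np.fft.fft(indicator)) ** 2
                p3 += spectrum[k]
                total += spectrum[1:].sum()
            return p3 / total if total > 0 else 0.0

        # Toy example: a repetitive codon-like pattern versus a scrambled sequence.
        exon_like = "ATGGCC" * 40
        rng = np.random.default_rng(4)
        intron_like = "".join(rng.choice(list("ACGT"), size=len(exon_like)))

        threshold = 0.05                               # arbitrary illustrative cut-off
        for label, s in [("exon-like", exon_like), ("intron-like", intron_like)]:
            score = period3_power(s)
            print(f"{label}: period-3 score={score:.4f} -> "
                  f"{'exon' if score > threshold else 'intron'}")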

  19. Olfactory Threshold of Chlorine in Oxygen.

    DTIC Science & Technology

    1977-09-01

    The odor threshold of chlorine in oxygen was determined. Measurements were conducted in an altitude chamber, which provided an odor-free and noise-free background. Human male volunteers, with no previous olfactory acuity testing experience, served as panelists. Threshold values were affected by time intervals between trials and by age differences. The mean threshold value for 11 subjects was 0.08 ppm, obtained by positive responses to the lowest detectable level of chlorine in oxygen, 50% of the time. (Author)

  20. 77 FR 5700 - Approval and Promulgation of Implementation Plans; New Hampshire: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... appropriate emission thresholds for determining which new stationary sources and modification projects become... affects major stationary sources in New Hampshire that have GHG emissions above the thresholds established... higher thresholds in the Tailoring Rule, EPA published a final rule on December 30, 2010, narrowing its...

  1. 77 FR 60907 - Approval and Promulgation of Implementation Plans; Vermont: Prevention of Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

    ...) program to establish appropriate emission thresholds for determining which new stationary sources and.... This action affects major stationary sources in Vermont that have GHG emissions above the thresholds... of GHG, and do not limit PSD applicability to GHGs to the higher thresholds in the Tailoring Rule...

  2. SU-D-9A-02: Relative Effects of Threshold Choice and Spatial Resolution Modeling On SUV and Volume Quantification in F18-FDG PET Imaging of Anal Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, F; Shandong Cancer Hospital and Insititute, Jinan, Shandong; Bowsher, J

    2014-06-01

    Purpose: PET imaging with F18-FDG is utilized for treatment planning, treatment assessment, and prognosis. A region of interest (ROI) encompassing the tumor may be determined on the PET image, often by a threshold T on the PET standard uptake values (SUVs). Several studies have shown prognostic value for relevant ROI properties including the maximum SUV value (SUVmax), metabolic tumor volume (MTV), and total glycolytic activity (TGA). The choice of threshold T may affect the mean SUV value (SUVmean), MTV, and TGA. Recently, spatial resolution modeling (SRM) has been introduced on many PET systems; SRM may also affect these ROI properties. The purpose of this work is to investigate the relative influence of SRM and threshold choice T on SUVmean, MTV, TGA, and SUVmax. Methods: For 9 anal cancer patients, 18F-FDG PET scans were performed prior to treatment. PET images were reconstructed by 2 iterations of Ordered Subsets Expectation Maximization (OSEM), with and without SRM. ROI contours were generated by 5 different SUV threshold values T: 2.5, 3.0, 30%, 40%, and 50% of SUVmax. Paired-samples t tests were used to compare SUVmean, MTV, and TGA (a) for SRM on versus off and (b) between each pair of threshold values T. SUVmax was also compared for SRM on versus off. Results: For almost all (57/60) comparisons of two different threshold values, SUVmean, MTV, and TGA showed statistically significant variation. For comparison of SRM on versus off, there were no statistically significant changes in SUVmax and TGA, but there were statistically significant changes in MTV for T=2.5 and T=3.0 and in SUVmean for all T. Conclusion: The near-universal statistical significance of threshold choice T suggests that, regarding harmonization across sites, threshold choice may be a greater concern than the choice of SRM. However, broader study is warranted, e.g., other iterations of OSEM should be considered.
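
    The ROI properties compared in this abstract follow directly from the chosen SUV threshold: voxels with SUV at or above T form the ROI, MTV is their total volume, SUVmean is their average, and TGA is SUVmean times MTV. The sketch below applies this to a synthetic SUV array with an assumed voxel volume; it is illustrative only.

        import numpy as np

        def roi_metrics(suv, threshold, voxel_volume_ml):
            """SUVmean, MTV (mL) and TGA for all voxels with SUV >= threshold."""
            roi = suv[suv >= threshold]
            if roi.size == 0:
                return 0.0, 0.0, 0.0
            mtv = roi.size * voxel_volume_ml
            return float(roi.mean()), mtv, float(roi.mean()) * mtv

        rng = np.random.default_rng(5)
        suv = rng.gamma(2.0, 1.5, size=(40, 40, 20))      # synthetic SUV map
        suv[15:25, 15:25, 8:12] += 8.0                    # synthetic "tumor"
        voxel_volume_ml = 0.064                           # assumed 4x4x4 mm voxels

        for T in [2.5, 3.0, 0.3 * suv.max(), 0.4 * suv.max(), 0.5 * suv.max()]:
            suv_mean, mtv, tga = roi_metrics(suv, T, voxel_volume_ml)
            print(f"T={T:5.2f}: SUVmean={suv_mean:.2f}, MTV={mtv:.1f} mL, TGA={tga:.1f}")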

  3. Thresholds for the cost-effectiveness of interventions: alternative approaches.

    PubMed

    Marseille, Elliot; Larson, Bruce; Kazi, Dhruv S; Kahn, James G; Rosen, Sydney

    2015-02-01

    Many countries use the cost-effectiveness thresholds recommended by the World Health Organization's Choosing Interventions that are Cost-Effective project (WHO-CHOICE) when evaluating health interventions. This project sets the threshold for cost-effectiveness as the cost of the intervention per disability-adjusted life-year (DALY) averted less than three times the country's annual gross domestic product (GDP) per capita. Highly cost-effective interventions are defined as meeting a threshold per DALY averted of once the annual GDP per capita. We argue that reliance on these thresholds reduces the value of cost-effectiveness analyses and makes such analyses too blunt to be useful for most decision-making in the field of public health. Use of these thresholds has little theoretical justification, skirts the difficult but necessary ranking of the relative values of locally-applicable interventions and omits any consideration of what is truly affordable. The WHO-CHOICE thresholds set such a low bar for cost-effectiveness that very few interventions with evidence of efficacy can be ruled out. The thresholds have little value in assessing the trade-offs that decision-makers must confront. We present alternative approaches for applying cost-effectiveness criteria to choices in the allocation of health-care resources.

  4. NIS/publications

    Science.gov Websites

    Reaction Q-Values and Thresholds: this tool computes reaction Q-values and thresholds; uncertainties and correlations are given using 30 energy ranges. Simple tables of reaction uncertainties are also available.
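
    The quantities such a tool reports can be illustrated with the standard two-body formulas: Q = (sum of initial masses - sum of final masses) c^2, and, for an endothermic reaction, the laboratory threshold kinetic energy is approximately E_th = -Q (1 + m_projectile/m_target) in the non-relativistic limit. A small worked example for the 14N(alpha,p)17O reaction, using rounded atomic masses:

        U_TO_MEV = 931.494  # MeV per atomic mass unit

        # Atomic masses (u) for 14N(alpha, p)17O, rounded from standard tables.
        m_alpha, m_n14 = 4.002602, 14.003074
        m_p, m_o17 = 1.007825, 16.999132

        q_value = (m_alpha + m_n14 - m_p - m_o17) * U_TO_MEV
        # Non-relativistic lab-frame threshold for an endothermic reaction (Q < 0):
        e_threshold = -q_value * (1 + m_alpha / m_n14) if q_value < 0 else 0.0
        print(f"Q = {q_value:.3f} MeV, threshold kinetic energy ~ {e_threshold:.3f} MeV")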

  5. Non-equilibrium relaxation in a stochastic lattice Lotka-Volterra model

    NASA Astrophysics Data System (ADS)

    Chen, Sheng; Täuber, Uwe C.

    2016-04-01

    We employ Monte Carlo simulations to study a stochastic Lotka-Volterra model on a two-dimensional square lattice with periodic boundary conditions. If the (local) prey carrying capacity is finite, there exists an extinction threshold for the predator population that separates a stable active two-species coexistence phase from an inactive state wherein only prey survive. Holding all other rates fixed, we investigate the non-equilibrium relaxation of the predator density in the vicinity of the critical predation rate. As expected, we observe critical slowing-down, i.e., a power law dependence of the relaxation time on the predation rate, and algebraic decay of the predator density at the extinction critical point. The numerically determined critical exponents are in accord with the established values of the directed percolation universality class. Following a sudden predation rate change to its critical value, one finds critical aging for the predator density autocorrelation function that is also governed by universal scaling exponents. This aging scaling signature of the active-to-absorbing state phase transition emerges at significantly earlier times than the stationary critical power laws, and could thus serve as an advanced indicator of the (predator) population’s proximity to its extinction threshold.

  6. Concurrent segregation and erosion effects in medium-energy iron beam patterning of silicon surfaces

    NASA Astrophysics Data System (ADS)

    Redondo-Cubero, A.; Lorenz, K.; Palomares, F. J.; Muñoz, A.; Castro, M.; Muñoz-García, J.; Cuerno, R.; Vázquez, L.

    2018-07-01

    We have bombarded crystalline silicon targets with a 40 keV Fe+ ion beam at different incidence angles. The resulting surfaces have been characterized by atomic force, current-sensing and magnetic force microscopies, scanning electron microscopy, and x-ray photoelectron spectroscopy. We have found that there is a threshold angle smaller than 40° for the formation of ripple patterns, which is definitely lower than those frequently reported for noble gas ion beams. We compare our observations with estimates of the value of the critical angle and of additional basic properties of the patterning process, which are based on a continuum model whose parameters are obtained from binary collision simulations. We have further studied experimentally the ripple structures and measured how the surface slopes change with the ion incidence angle. We explore in particular detail the fluence dependence of the pattern for an incidence angle value (40°) close to the threshold. Initially, rimmed holes appear randomly scattered on the surface, which evolve into large, bug-like structures. Further increasing the ion fluence induces a smooth, rippled background morphology. By means of microscopy techniques, a correlation between the morphology of these structures and their metal content can be unambiguously established.

  7. No-Impact Threshold Values for NRAP's Reduced Order Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Last, George V.; Murray, Christopher J.; Brown, Christopher F.

    2013-02-01

    The purpose of this study was to develop methodologies for establishing baseline datasets and statistical protocols for determining statistically significant changes between background concentrations and predicted concentrations that would be used to represent a contamination plume in the Gen II models being developed by NRAP's Groundwater Protection team. The initial effort examined selected portions of two aquifer systems: the urban shallow unconfined aquifer system of the Edwards-Trinity Aquifer System (being used to develop the ROM for carbonate-rock aquifers), and a portion of the High Plains Aquifer (an unconsolidated and semi-consolidated sand and gravel aquifer, being used to develop the ROM for sandstone aquifers). Threshold values were determined for Cd, Pb, As, pH, and TDS that could be used to identify contamination due to predicted impacts from carbon sequestration storage reservoirs, based on recommendations found in the EPA's "Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities" (US Environmental Protection Agency, 2009). Results from this effort can be used to inform a "no change" scenario with respect to groundwater impacts, rather than the use of an MCL that could be significantly higher than existing concentrations in the aquifer.

  8. Thresholds of Extinction: Simulation Strategies in Environmental Values Education.

    ERIC Educational Resources Information Center

    Glew, Frank

    1990-01-01

    Describes a simulation exercise for campers and an accompanying curriculum unit--"Thresholds of Extinction"--that addresses the issues of endangered species. Uses this context to illustrate steps in the process of values development: awareness, gathering data, resolution (decision making), responsibility (acting on values), and…

  9. Comparison of edge detection techniques for M7 subtype Leukemic cell in terms of noise filters and threshold value

    NASA Astrophysics Data System (ADS)

    Salam, Afifah Salmi Abdul; Isa, Mohd. Nazrin Md.; Ahmad, Muhammad Imran; Che Ismail, Rizalafande

    2017-11-01

    This paper focuses on studying and identifying suitable threshold values for two commonly used edge detection techniques, Sobel and Canny edge detection. The aim is to determine which values give accurate results in identifying a particular leukemic cell. In addition, evaluating the suitability of the edge detectors is essential because feature extraction of the cell depends greatly on the image segmentation (edge detection). An image of the M7 subtype of acute myelocytic leukemia (AML) was chosen because its diagnosis has been found lacking. To enhance image quality, noise filters are applied, and useful information is obtained by comparing images with no filter, a median filter, and an average filter. Threshold values of 0, 0.25, and 0.5 are tested. The investigation found that, without any filter, Canny with a threshold value of 0.5 yields the best result.
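
    A generic sketch of the comparison described here, using scikit-image rather than whatever toolbox the authors used: a synthetic cell-like image is optionally median-filtered, then Sobel and Canny edges are extracted at the tested threshold values (0, 0.25, and 0.5, interpreted here as quantile thresholds, which is an assumption).

        import numpy as np
        from scipy.ndimage import median_filter
        from skimage import feature, filters

        # Synthetic "cell": a bright disk on a noisy background (a stand-in for an
        # M7 blast image, which is not reproduced here).
        rng = np.random.default_rng(6)
        yy, xx = np.mgrid[0:128, 0:128]
        image = ((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2).astype(float)
        image += 0.3 * rng.normal(size=image.shape)

        for filt_name, img in [("no filter", image),
                               ("median 3x3", median_filter(image, size=3))]:
            for t in [0.0, 0.25, 0.5]:
                grad = filters.sobel(img)                       # Sobel gradient magnitude
                sobel_edges = grad > np.quantile(grad, t)       # quantile threshold
                canny_edges = feature.canny(img, sigma=2.0,
                                            low_threshold=t / 2, high_threshold=t,
                                            use_quantiles=True)
                print(f"{filt_name}, t={t}: Sobel pixels={int(sobel_edges.sum())}, "
                      f"Canny pixels={int(canny_edges.sum())}")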

  10. The stability of color discrimination threshold determined using pseudoisochromatic test plates

    NASA Astrophysics Data System (ADS)

    Zutere, B.; Jurasevska Luse, K.; Livzane, A.

    2014-09-01

    Congenital red-green color vision deficiency is one of the most common genetic disorders. A previously printed set of pseudoisochromatic plates (KAMS test, 2012) was created for individual discrimination threshold determination in cases of mild congenital red-green color vision deficiency using neutral colors (colors confused with gray). Color-blind subjects were diagnosed with the Richmond HRR test (4th edition, 2002) and the Oculus HMC anomaloscope, and further examination was performed using the KAMS test. Four male subjects aged 20 to 24 years participated in the study; all of them were diagnosed with deuteranomaly. Owing to the design of the plates, the threshold of every subject in each trial was defined as the plate total color difference value ΔE at which the stimulus was detected 75% of the time, so the just-noticeable difference (jnd) was calculated in CIE LAB ΔE units. The authors performed repeated discrimination threshold measurements (5 times) for all four subjects under controlled illumination conditions. Psychophysical data were collected by sampling an observer's performance on a psychophysical task at a number of different stimulus saturation levels. Results show that a total color difference threshold ΔE exists for each individual tested with the KAMS pseudoisochromatic plates, and this threshold value does not change significantly across repeated measurements. Deuteranomal threshold values acquired using the greenish plates of the KAMS test are significantly higher than thresholds acquired using the reddish plates. A strong positive correlation exists between the anomaloscope matching range (MR) and the deuteranomal thresholds acquired with the KAMS test (R=0.94), and between the error score in the Richmond HRR test and the thresholds acquired with the KAMS test (R=0.81).

  11. Metabolic Tumor Volume and Total Lesion Glycolysis in Oropharyngeal Cancer Treated With Definitive Radiotherapy: Which Threshold Is the Best Predictor of Local Control?

    PubMed

    Castelli, Joël; Depeursinge, Adrien; de Bari, Berardino; Devillers, Anne; de Crevoisier, Renaud; Bourhis, Jean; Prior, John O

    2017-06-01

    In the context of oropharyngeal cancer treated with definitive radiotherapy, the aim of this retrospective study was to identify the best threshold value to compute metabolic tumor volume (MTV) and/or total lesion glycolysis to predict local-regional control (LRC) and disease-free survival. One hundred twenty patients with a locally advanced oropharyngeal cancer from 2 different institutions treated with definitive radiotherapy underwent FDG PET/CT before treatment. Various MTVs and total lesion glycolysis were defined based on 2 segmentation methods: (i) an absolute threshold of SUV (0-20 g/mL) or (ii) a relative threshold for SUVmax (0%-100%). The parameters' predictive capabilities for disease-free survival and LRC were assessed using the Harrell C-index and Cox regression model. Relative thresholds between 40% and 68% and absolute threshold between 5.5 and 7 had a similar predictive value for LRC (C-index = 0.65 and 0.64, respectively). Metabolic tumor volume had a higher predictive value than gross tumor volume (C-index = 0.61) and SUVmax (C-index = 0.54). Metabolic tumor volume computed with a relative threshold of 51% of SUVmax was the best predictor of disease-free survival (hazard ratio, 1.23 [per 10 mL], P = 0.009) and LRC (hazard ratio: 1.22 [per 10 mL], P = 0.02). The use of different thresholds within a reasonable range (between 5.5 and 7 for an absolute threshold and between 40% and 68% for a relative threshold) seems to have no major impact on the predictive value of MTV. This parameter may be used to identify patient with a high risk of recurrence and who may benefit from treatment intensification.

  12. Experimental Protocol to Determine the Chloride Threshold Value for Corrosion in Samples Taken from Reinforced Concrete Structures

    PubMed Central

    Angst, Ueli M.; Boschmann, Carolina; Wagner, Matthias; Elsener, Bernhard

    2017-01-01

    The aging of reinforced concrete infrastructure in developed countries imposes an urgent need for methods to reliably assess the condition of these structures. Corrosion of the embedded reinforcing steel is the most frequent cause for degradation. While it is well known that the ability of a structure to withstand corrosion depends strongly on factors such as the materials used or the age, it is common practice to rely on threshold values stipulated in standards or textbooks. These threshold values for corrosion initiation (Ccrit) are independent of the actual properties of a certain structure, which clearly limits the accuracy of condition assessments and service life predictions. The practice of using tabulated values can be traced to the lack of reliable methods to determine Ccrit on-site and in the laboratory. Here, an experimental protocol to determine Ccrit for individual engineering structures or structural members is presented. A number of reinforced concrete samples are taken from structures and laboratory corrosion testing is performed. The main advantage of this method is that it ensures real conditions concerning parameters that are well known to greatly influence Ccrit, such as the steel-concrete interface, which cannot be representatively mimicked in laboratory-produced samples. At the same time, the accelerated corrosion test in the laboratory permits the reliable determination of Ccrit prior to corrosion initiation on the tested structure; this is a major advantage over all common condition assessment methods that only permit estimating the conditions for corrosion after initiation, i.e., when the structure is already damaged. The protocol yields the statistical distribution of Ccrit for the tested structure. This serves as a basis for probabilistic prediction models for the remaining time to corrosion, which is needed for maintenance planning. This method can potentially be used in material testing of civil infrastructures, similar to established methods used for mechanical testing. PMID:28892023

  13. Experimental Protocol to Determine the Chloride Threshold Value for Corrosion in Samples Taken from Reinforced Concrete Structures.

    PubMed

    Angst, Ueli M; Boschmann, Carolina; Wagner, Matthias; Elsener, Bernhard

    2017-08-31

    The aging of reinforced concrete infrastructure in developed countries imposes an urgent need for methods to reliably assess the condition of these structures. Corrosion of the embedded reinforcing steel is the most frequent cause for degradation. While it is well known that the ability of a structure to withstand corrosion depends strongly on factors such as the materials used or the age, it is common practice to rely on threshold values stipulated in standards or textbooks. These threshold values for corrosion initiation (Ccrit) are independent of the actual properties of a certain structure, which clearly limits the accuracy of condition assessments and service life predictions. The practice of using tabulated values can be traced to the lack of reliable methods to determine Ccrit on-site and in the laboratory. Here, an experimental protocol to determine Ccrit for individual engineering structures or structural members is presented. A number of reinforced concrete samples are taken from structures and laboratory corrosion testing is performed. The main advantage of this method is that it ensures real conditions concerning parameters that are well known to greatly influence Ccrit, such as the steel-concrete interface, which cannot be representatively mimicked in laboratory-produced samples. At the same time, the accelerated corrosion test in the laboratory permits the reliable determination of Ccrit prior to corrosion initiation on the tested structure; this is a major advantage over all common condition assessment methods that only permit estimating the conditions for corrosion after initiation, i.e., when the structure is already damaged. The protocol yields the statistical distribution of Ccrit for the tested structure. This serves as a basis for probabilistic prediction models for the remaining time to corrosion, which is needed for maintenance planning. This method can potentially be used in material testing of civil infrastructures, similar to established methods used for mechanical testing.

  14. A study of surface ozone variability over the Iberian Peninsula during the last fifty years

    NASA Astrophysics Data System (ADS)

    Fernández-Fernández, M. I.; Gallego, M. C.; García, J. A.; Acero, F. J.

    2011-02-01

    There is good evidence for an increase in the global surface level of ozone over the past century. In this work we present an analysis of 18 surface ozone series over the Iberian Peninsula, considering the target values of ozone for the protection of human health and for the protection of vegetation, as well as the information and alert thresholds established by the current European Directive on ambient air quality and cleaner air for Europe (Directive 2008/50/EC). The results show that the stations located on the Cantabrian coast exceeded neither the target value for the protection of human health nor the target value for the protection of vegetation. The information threshold was exceeded at most of the stations, while the alert threshold was exceeded at only one. The seasonal and daily evolution of ozone concentrations was as expected. A trend analysis of three surface ozone concentration indices (monthly median, 98th percentile, and monthly maximum of the daily maximum 8-h mean) was performed both for the whole period of each station and for the common period from 2001 to 2007 for all the months of the year. In general, the south of the Iberian Peninsula presented increasing trends for the three indices, especially in the last six months of the year, and the north decreasing trends. Finally, a correlation analysis was performed between the daily maximum 8-h mean and both daily mean temperature and daily mean solar radiation for the whole and the common periods. For all stations, there was a significant positive association at the 5% significance level between the daily maximum 8-h mean and the two meteorological variables, with correlations of up to approximately 0.5. The spatial distribution of these association values from 2001 to 2007 showed a positive northwest-to-southeast gradient over the Iberian Peninsula.

  15. Determination of optimum threshold values for EMG time domain features; a multi-dataset investigation

    NASA Astrophysics Data System (ADS)

    Nlandu Kamavuako, Ernest; Scheme, Erik Justin; Englehart, Kevin Brian

    2016-08-01

    Objective. For over two decades, Hudgins' set of time domain features has been applied extensively for the classification of hand motions. The calculation of the slope sign change and zero crossing features uses a threshold to attenuate the effect of background noise. However, there is no consensus on the optimum threshold value. In this study, we investigate for the first time the effect of threshold selection on the feature space and classification accuracy using multiple datasets. Approach. In the first part, four datasets were used, and classification error (CE), separability index, scatter matrix separability criterion, and cardinality of the features were used as performance measures. In the second part, data from eight classes were collected during two separate days, with two days in between, from eight able-bodied subjects. The threshold for each feature was computed as a factor (R = 0:0.01:4) times the average root mean square of the data during rest. For each day, we quantified the CE for R = 0 (CEr0) and the minimum error (CEbest). Moreover, a cross-day threshold validation was applied where, for example, the CE of day two (CEodt) is computed based on the optimum threshold from day one and vice versa. Finally, we quantified the effect of the threshold when using training data from one day and test data from the other. Main results. All performance metrics generally degraded with increasing threshold values. On average, CEbest (5.26 ± 2.42%) was significantly better than CEr0 (7.51 ± 2.41%, P = 0.018) and CEodt (7.50 ± 2.50%, P = 0.021). During the two-fold validation between days, CEbest performed similarly to CEr0. Interestingly, when the threshold values optimized per subject from day one and day two, respectively, were used for cross-day classification, performance decreased. Significance. We have demonstrated that the threshold value has a strong impact on the feature space and that an optimum threshold can be quantified. However, this optimum threshold is highly data and subject driven and thus does not generalize well. There is strong evidence that R = 0 provides a good trade-off between system performance and generalization. These findings are important for the practical use of pattern recognition based myoelectric control.
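
    A minimal sketch of how the thresholded zero crossing and slope sign change features and the rest-based threshold described above might be computed; the exact gating convention and the function names are assumptions, since the abstract does not spell out the implementation.

```python
import numpy as np

def zero_crossings(x, thr):
    """Count zero crossings whose amplitude step exceeds thr (noise gate)."""
    x = np.asarray(x, dtype=float)
    sign_change = x[:-1] * x[1:] < 0
    big_step = np.abs(x[:-1] - x[1:]) >= thr
    return int(np.sum(sign_change & big_step))

def slope_sign_changes(x, thr):
    """Count slope sign changes where at least one adjacent step exceeds thr."""
    x = np.asarray(x, dtype=float)
    d1 = x[1:-1] - x[:-2]
    d2 = x[1:-1] - x[2:]
    return int(np.sum((d1 * d2 > 0) & ((np.abs(d1) >= thr) | (np.abs(d2) >= thr))))

def threshold_from_rest(rest_signal, R):
    """Threshold as a factor R times the RMS of a resting-baseline recording,
    mirroring the R = 0:0.01:4 sweep described in the abstract."""
    rest_signal = np.asarray(rest_signal, dtype=float)
    return R * np.sqrt(np.mean(np.square(rest_signal)))
```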

  16. Determination of optimum threshold values for EMG time domain features; a multi-dataset investigation.

    PubMed

    Kamavuako, Ernest Nlandu; Scheme, Erik Justin; Englehart, Kevin Brian

    2016-08-01

    For over two decades, Hudgins' set of time domain features has been applied extensively for the classification of hand motions. The calculation of the slope sign change and zero crossing features uses a threshold to attenuate the effect of background noise. However, there is no consensus on the optimum threshold value. In this study, we investigate for the first time the effect of threshold selection on the feature space and classification accuracy using multiple datasets. In the first part, four datasets were used, and classification error (CE), separability index, scatter matrix separability criterion, and cardinality of the features were used as performance measures. In the second part, data from eight classes were collected during two separate days, with two days in between, from eight able-bodied subjects. The threshold for each feature was computed as a factor (R = 0:0.01:4) times the average root mean square of the data during rest. For each day, we quantified the CE for R = 0 (CEr0) and the minimum error (CEbest). Moreover, a cross-day threshold validation was applied where, for example, the CE of day two (CEodt) is computed based on the optimum threshold from day one and vice versa. Finally, we quantified the effect of the threshold when using training data from one day and test data from the other. All performance metrics generally degraded with increasing threshold values. On average, CEbest (5.26 ± 2.42%) was significantly better than CEr0 (7.51 ± 2.41%, P = 0.018) and CEodt (7.50 ± 2.50%, P = 0.021). During the two-fold validation between days, CEbest performed similarly to CEr0. Interestingly, when the threshold values optimized per subject from day one and day two, respectively, were used for cross-day classification, performance decreased. We have demonstrated that the threshold value has a strong impact on the feature space and that an optimum threshold can be quantified. However, this optimum threshold is highly data and subject driven and thus does not generalize well. There is strong evidence that R = 0 provides a good trade-off between system performance and generalization. These findings are important for the practical use of pattern recognition based myoelectric control.

  17. Comparison of magnetic resonance spectroscopy, proton density fat fraction and histological analysis in the quantification of liver steatosis in children and adolescents

    PubMed Central

    Di Martino, Michele; Pacifico, Lucia; Bezzi, Mario; Di Miscio, Rossella; Sacconi, Beatrice; Chiesa, Claudio; Catalano, Carlo

    2016-01-01

    AIM To establish a threshold value for liver fat content between healthy children and those with non-alcoholic fatty liver disease (NAFLD) by using magnetic resonance imaging (MRI), with liver biopsy serving as the reference standard. METHODS The study was approved by the local ethics committee, and written informed consent was obtained from all participants and their legal guardians before the study began. Twenty-seven children with NAFLD underwent liver biopsy to assess the presence of nonalcoholic steatohepatitis. The assessment of the liver fat fraction was performed using MRI with a high-field magnet, a 2D gradient-echo, multiple-echo T1-weighted sequence with low flip angle, and single-voxel point-resolved ¹H MR spectroscopy (¹H-MRS), corrected for T1 and T2* decays. Receiver operating characteristic curve analysis was used to determine the best cut-off value. Lin's concordance coefficient was used to evaluate the agreement between histology, MRS, and MRI-PDFF. A Mann-Whitney U-test and multivariate analysis were performed to analyze the continuous variables. RESULTS According to MRS, the threshold value separating healthy children from those with NAFLD is 6%; using MRI-PDFF, a cut-off value of 3.5% is suggested. The Lin analysis revealed a good fit between histology and both MRS and MRI-PDFF. CONCLUSION MRS is an accurate and precise method for detecting NAFLD in children. PMID:27818597

  18. Identifying optimal threshold statistics for elimination of hookworm using a stochastic simulation model.

    PubMed

    Truscott, James E; Werkman, Marleen; Wright, James E; Farrell, Sam H; Sarkar, Rajiv; Ásbjörnsdóttir, Kristjana; Anderson, Roy M

    2017-06-30

    There is an increased focus on whether mass drug administration (MDA) programmes alone can interrupt the transmission of soil-transmitted helminths (STH). Mathematical models can be used to model these interventions and are increasingly being implemented to inform investigators about expected trial outcomes and the choice of optimum study design. One key factor is the choice of threshold for detecting elimination. However, there are currently no thresholds defined for STH regarding breaking transmission. We develop a simulation of an elimination study, based on the DeWorm3 project, using an individual-based stochastic disease transmission model in conjunction with models of MDA, sampling, diagnostics, and the construction of study clusters. The simulation is then used to analyse the relationship between the study end-point elimination threshold and whether elimination is achieved in the long term within the model. We analyse the quality of a range of statistics in terms of their positive predictive values (PPV) and how they depend on a range of covariates, including threshold values, baseline prevalence, measurement time point, and how clusters are constructed. End-point infection prevalence performs well in discriminating between villages that achieve interruption of transmission and those that do not, although the quality of the threshold is sensitive to baseline prevalence and threshold value. The optimal post-treatment prevalence threshold for determining elimination is in the range of 2% or less when the baseline prevalence range is broad. For multiple clusters of communities, both the probability of elimination and the ability of thresholds to detect it are strongly dependent on the size of the cluster and the size distribution of the constituent communities. The number of communities in a cluster is a key indicator of the probability of elimination and the PPV. Extending the time after the study endpoint at which the threshold statistic is measured improves the PPV in discriminating between clusters that eliminate and those that bounce back. The probability of elimination and the PPV are very sensitive to baseline prevalence for individual communities. However, most studies and programmes are constructed on the basis of clusters. Since elimination occurs within smaller population sub-units, the construction of clusters introduces new sensitivities of elimination threshold values to cluster size and the underlying population structure. Study simulation offers an opportunity to investigate key sources of sensitivity for elimination studies and programme designs in advance and to tailor interventions to prevailing local or national conditions.
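
    The positive predictive value of a prevalence threshold, as used above, can be sketched as follows; the 2% default comes from the abstract, while the function name and the input format (one simulated end-point prevalence and one long-term outcome per cluster) are assumptions made for the example.

```python
import numpy as np

def threshold_ppv(endpoint_prevalence, eliminated_long_term, threshold=0.02):
    """PPV of an end-point prevalence threshold: among clusters whose measured
    prevalence falls at or below the threshold (declared 'eliminated'), the
    fraction that truly interrupts transmission in the long run."""
    endpoint_prevalence = np.asarray(endpoint_prevalence, dtype=float)
    eliminated_long_term = np.asarray(eliminated_long_term, dtype=bool)
    declared = endpoint_prevalence <= threshold
    if declared.sum() == 0:
        return float("nan")  # the threshold was never triggered
    return float(eliminated_long_term[declared].mean())

# Example: four simulated clusters, two declared eliminated, one of which bounces back.
print(threshold_ppv([0.01, 0.015, 0.05, 0.10], [True, False, False, False]))
```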

  19. Influenza surveillance in Europe: establishing epidemic thresholds by the Moving Epidemic Method

    PubMed Central

    Vega, Tomás; Lozano, Jose Eugenio; Meerhoff, Tamara; Snacken, René; Mott, Joshua; Ortiz de Lejarazu, Raul; Nunes, Baltazar

    2012-01-01

    Background  Timely influenza surveillance is important to monitor influenza epidemics. Objectives  (i) To calculate the epidemic threshold for influenza‐like illness (ILI) and acute respiratory infections (ARI) in 19 countries, as well as the thresholds for different levels of intensity. (ii) To evaluate the performance of these thresholds. Methods  The moving epidemic method (MEM) has been developed to determine the baseline influenza activity and an epidemic threshold. False alerts, detection lags and timeliness of the detection of epidemics were calculated. The performance was evaluated using a cross‐validation procedure. Results  The overall sensitivity of the MEM threshold was 71·8% and the specificity was 95·5%. The median of the timeliness was 1 week (range: 0–4·5). Conclusions  The method produced a robust and specific signal to detect influenza epidemics. The good balance between the sensitivity and specificity of the epidemic threshold to detect seasonal epidemics and avoid false alerts has advantages for public health purposes. This method may serve as standard to define the start of the annual influenza epidemic in countries in Europe. PMID:22897919

  20. Urine stability and steroid profile: towards a screening index of urine sample degradation for anti-doping purpose.

    PubMed

    Mazzarino, Monica; Abate, Maria Gabriella; Alocci, Roberto; Rossi, Francesca; Stinchelli, Raffaella; Molaioni, Francesco; de la Torre, Xavier; Botrè, Francesco

    2011-01-10

    The presence of microorganisms in urine samples, under favourable conditions of storage and transportation, may alter the concentration of steroid hormones, thus compromising the correct evaluation of the urinary steroid profile in doping control analysis. According to the rules of the World Anti-Doping Agency (WADA technical document TD2004EAAS), a testosterone deconjugation higher than 5% and the presence of 5α-androstane-3,17-dione and 5β-androstane-3,17-dione in the deconjugated fraction are reliable indicators of urine degradation. The determination of these markers would require an additional quantitative analysis, since the steroid screening analysis in anti-doping laboratories is performed on the total (free + conjugated) fraction. The aim of this work is therefore to establish reliable threshold values for some representative compounds (namely 5α-androstane-3,17-dione and 5β-androstane-3,17-dione) in the total fraction in order to predict, directly at the screening stage, the potential microbial degradation of the urine samples. Preliminary evidence on the most suitable degradation indexes was obtained by measuring the urinary concentrations of testosterone, epitestosterone, 5α-androstane-3,17-dione and 5β-androstane-3,17-dione by gas chromatography-mass spectrometry every day for 15 days in the deconjugated, glucuronide and total fractions of 10 pools of urine from 60 healthy subjects, stored under different pH and temperature conditions, and isolating the samples with one or more markers of degradation according to the WADA technical document TD2004EAAS. The threshold values for 5α-androstane-3,17-dione and 5β-androstane-3,17-dione were then obtained by correlating the testosterone deconjugation rate with the urinary concentrations of 5α-androstane-3,17-dione and 5β-androstane-3,17-dione in the total fraction. The threshold values suggested as indexes of urine degradation in the total fraction were 10 ng mL⁻¹ for 5α-androstane-3,17-dione and 20 ng mL⁻¹ for 5β-androstane-3,17-dione. The validity of this approach was confirmed by the analysis of routine samples over more than five months (i.e., a total of more than 4000 urine samples): samples with a concentration of total 5α-androstane-3,17-dione and 5β-androstane-3,17-dione higher than the threshold values showed a percentage of free testosterone higher than 5% of its total amount, whereas free testosterone at a percentage higher than 5% of its total amount was not detected in urines with a concentration of total 5α-androstane-3,17-dione and 5β-androstane-3,17-dione lower than the threshold values.

  1. Thresholding Based on Maximum Weighted Object Correlation for Rail Defect Detection

    NASA Astrophysics Data System (ADS)

    Li, Qingyong; Huang, Yaping; Liang, Zhengping; Luo, Siwei

    Automatic thresholding is an important technique for rail defect detection, but traditional methods are not competent enough to fit the characteristics of this application. This paper proposes the Maximum Weighted Object Correlation (MWOC) thresholding method, which exploits the facts that rail images are unimodal and that the defect proportion is small. MWOC selects a threshold by optimizing the product of the object correlation and a weight term that expresses the proportion of thresholded defects. Our experimental results demonstrate that MWOC achieves a misclassification error of 0.85% and outperforms the other well-established thresholding methods, including the Otsu, maximum correlation, maximum entropy and valley-emphasis methods, for the application of rail defect detection.
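
    The abstract does not give the MWOC objective itself, but all of the methods it compares against share the same structure: sweep every candidate gray level and keep the one that maximizes a criterion. The sketch below shows that sweep with Otsu's between-class variance, one of the named baselines, as the criterion; swapping in the MWOC objective would follow the same pattern.

```python
import numpy as np

def otsu_threshold(image, levels=256):
    """Exhaustively search the gray level maximizing Otsu's between-class
    variance; the same sweep structure underlies MWOC, maximum-correlation,
    maximum-entropy and valley-emphasis thresholding (only the criterion changes)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist.astype(float) / hist.sum()
    omega = np.cumsum(p)                      # probability of class 0 at each threshold
    mu = np.cumsum(p * np.arange(levels))     # cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)          # undefined values (empty classes) -> 0
    return int(np.argmax(sigma_b))
```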

  2. Defining operating rules for mitigation of drought effects on water supply systems

    NASA Astrophysics Data System (ADS)

    Rossi, G.; Caporali, E.; Garrote, L.; Federici, G. V.

    2012-04-01

    Reservoirs play a pivotal role in the regulation and management of water supply systems, especially during drought periods. Optimization of reservoir releases related to drought mitigation rules is particularly needed. The hydrologic state of the system is evaluated by defining threshold values expressed in probabilistic terms. Risk-deficit curves are used to reduce the ensemble of possible rules for simulation. Threshold values can be linked to specific actions in an operational context at different levels of severity, i.e., normal, pre-alert, alert and emergency scenarios. A simplified model of the water resources system is built to evaluate the threshold values and the management rules. The threshold values are defined considering the probability of satisfying a given fraction of the demand within a certain time horizon, and are validated with a long-term simulation that takes into account the characteristics of the evaluated system. The threshold levels determine curves that define reservoir releases as a function of the existing storage volume. A demand reduction is associated with each threshold level. The rules to manage the system in drought conditions, the threshold levels and the reductions are optimized using long-term simulations with different hypothesized states of the system. Synthetic sequences of flows with the same statistical properties as the historical ones are produced to evaluate the system behaviour. The performance of different reduction values and different threshold curves is evaluated using different objective functions and performance indices. The methodology is applied to the urban area Firenze-Prato-Pistoia in central Tuscany, Italy. The demand centres considered are Firenze and Bagno a Ripoli, which, according to the 2001 ISTAT census, have a total of 395,000 inhabitants.

  3. 75 FR 32845 - Consultative Examination-Annual Onsite Review of Medical Providers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    .... ACTION: Final rules. SUMMARY: We are revising the threshold billing amount that triggers annual on-site... titles II and XVI of the Social Security Act (Act). The revision will raise the threshold amount to reflect the increase in billing amounts since we first established the threshold amount in 1991. We expect...

  4. A new threshold of apparent diffusion coefficient values in white matter after successful tissue plasminogen activator treatment for acute brain ischemia.

    PubMed

    Sato, Atsushi; Shimizu, Yusaku; Koyama, Junichi; Hongo, Kazuhiro

    2017-06-01

    Tissue plasminogen activator (tPA) is effective for the treatment of acute brain ischemia, but may trigger fatal brain edema or hemorrhage if the brain ischemia results in a large infarct. Herein, we attempted to predict the extent of infarcts by determining the optimal threshold of apparent diffusion coefficient (ADC) values on diffusion-weighted imaging (DWI) that predictively distinguishes between infarct and reversible areas, and by reconstructing color-coded images based on this threshold. The study subjects consisted of 36 patients with acute brain ischemia in whom MRA had confirmed reopening of the occluded arteries within a short time (mean: 99 min) after tPA treatment. We measured the ADC values in several small regions of interest over the white matter within high-intensity areas on the initial DWI; then, by comparing the findings with the follow-up images, we obtained the optimal threshold of ADC values using receiver-operating characteristic analysis. The threshold obtained (583 × 10⁻⁶ mm²/s) was lower than those previously reported; this threshold could distinguish between infarct and reversible areas with considerable accuracy (sensitivity: 0.87, specificity: 0.94). The threshold obtained and the reconstructed images were predictive of the final radiological result of tPA treatment, and this threshold may be helpful in determining the appropriate management of patients with acute brain ischemia.
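
    The receiver-operating characteristic step described above can be sketched as a search for the ADC cut-off maximizing Youden's J statistic; whether the authors used Youden's J or another operating-point rule is not stated, so this choice, along with the function and variable names, is an illustrative assumption.

```python
import numpy as np

def best_adc_threshold(adc_values, became_infarct):
    """Pick the ADC cut-off maximizing Youden's J (sensitivity + specificity - 1),
    where an ADC value at or below the cut-off is taken to predict infarction."""
    adc = np.asarray(adc_values, dtype=float)
    y = np.asarray(became_infarct, dtype=bool)
    best_t, best_j = None, -1.0
    for t in np.unique(adc):                       # every observed value is a candidate cut-off
        pred = adc <= t
        sens = (pred & y).sum() / max(y.sum(), 1)
        spec = (~pred & ~y).sum() / max((~y).sum(), 1)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```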

  5. Getting the message across: using ecological integrity to communicate with resource managers

    USGS Publications Warehouse

    Mitchell, Brian R.; Tierney, Geraldine L.; Schweiger, E. William; Miller, Kathryn M.; Faber-Langendoen, Don; Grace, James B.

    2014-01-01

    This chapter describes and illustrates how concepts of ecological integrity, thresholds, and reference conditions can be integrated into a research and monitoring framework for natural resource management. Ecological integrity has been defined as a measure of the composition, structure, and function of an ecosystem in relation to the system’s natural or historical range of variation, as well as perturbations caused by natural or anthropogenic agents of change. Using ecological integrity to communicate with managers requires five steps, often implemented iteratively: (1) document the scale of the project and the current conceptual understanding and reference conditions of the ecosystem, (2) select appropriate metrics representing integrity, (3) define externally verified assessment points (metric values that signify an ecological change or need for management action) for the metrics, (4) collect data and calculate metric scores, and (5) summarize the status of the ecosystem using a variety of reporting methods. While we present the steps linearly for conceptual clarity, actual implementation of this approach may require addressing the steps in a different order or revisiting steps (such as metric selection) multiple times as data are collected. Knowledge of relevant ecological thresholds is important when metrics are selected, because thresholds identify where small changes in an environmental driver produce large responses in the ecosystem. Metrics with thresholds at or just beyond the limits of a system’s range of natural variability can be excellent, since moving beyond the normal range produces a marked change in their values. Alternatively, metrics with thresholds within but near the edge of the range of natural variability can serve as harbingers of potential change. Identifying thresholds also contributes to decisions about selection of assessment points. In particular, if there is a significant resistance to perturbation in an ecosystem, with threshold behavior not occurring until well beyond the historical range of variation, this may provide a scientific basis for shifting an ecological assessment point beyond the historical range. We present two case studies using ongoing monitoring by the US National Park Service Vital Signs program that illustrate the use of an ecological integrity approach to communicate ecosystem status to resource managers. The Wetland Ecological Integrity in Rocky Mountain National Park case study uses an analytical approach that specifically incorporates threshold detection into the process of establishing assessment points. The Forest Ecological Integrity of Northeastern National Parks case study describes a method for reporting ecological integrity to resource managers and other decision makers. We believe our approach has the potential for wide applicability for natural resource management.

  6. Numerical simulation and experimental investigation of Ti-6Al-4V melted by CW fiber laser at different pressures

    NASA Astrophysics Data System (ADS)

    Tabassum, Aasma; Zhou, Jie; Han, Bing; Ni, Xiao-wu; Sardar, Maryam

    2017-07-01

    The interaction of a continuous wave (CW) fiber laser with Ti-6Al-4V alloy is investigated numerically and experimentally at different laser fluence values and ambient pressures of an N2 atmosphere to determine the melting time threshold of Ti-6Al-4V alloy. A 2D-axisymmetric numerical model considering heat transfer and laminar flow is established to describe the melting process. The simulation results indicate that, for the same laser fluence, the material melts earlier, within several milliseconds, at lower pressure (8.0 Pa) than at higher pressure (8.8×10⁴ Pa). The experimental results demonstrate that the melting time threshold at high laser fluence (above 1.89×10⁸ W/m²) is shorter at lower pressure (vacuum), which is consistent with the simulation, whereas the melting time threshold at low laser fluence (below 1.89×10⁸ W/m²) is shorter at higher pressure. The possible aspects that can affect the melting process include the increased heat loss induced by heat conduction between the metal surface and the ambient gas as the pressure increases, and the absorption variation of the coarse surface resulting from chemical reaction.

  7. Novel threshold pressure sensors based on nonlinear dynamics of MEMS resonators

    NASA Astrophysics Data System (ADS)

    Hasan, Mohammad H.; Alsaleem, Fadi M.; Ouakad, Hassen M.

    2018-06-01

    Triggering an alarm in a car for low air-pressure in the tire or tripping an HVAC compressor if the refrigerant pressure is lower than a threshold value are examples for applications where measuring the amount of pressure is not as important as determining if the pressure has exceeded a threshold value for an action to occur. Unfortunately, current technology still relies on analog pressure sensors to perform this functionality by adding a complex interface (extra circuitry, controllers, and/or decision units). In this paper, we demonstrate two new smart tunable-threshold pressure switch concepts that can reduce the complexity of a threshold pressure sensor. The first concept is based on the nonlinear subharmonic resonance of a straight double cantilever microbeam with a proof mass and the other concept is based on the snap-through bi-stability of a clamped-clamped MEMS shallow arch. In both designs, the sensor operation concept is simple. Any actuation performed at a certain pressure lower than a threshold value will activate a nonlinear dynamic behavior (subharmonic resonance or snap-through bi-stability) yielding a large output that would be interpreted as a logic value of ONE, or ON. Once the pressure exceeds the threshold value, the nonlinear response ceases to exist, yielding a small output that would be interpreted as a logic value of ZERO, or OFF. A lumped, single degree of freedom model for the double cantilever beam, that is validated using experimental data, and a continuous beam model for the arch beam, are used to simulate the operation range of the proposed sensors by identifying the relationship between the excitation signal and the critical cut-off pressure.

  8. Optimal Threshold Determination for Interpreting Semantic Similarity and Particularity: Application to the Comparison of Gene Sets and Metabolic Pathways Using GO and ChEBI

    PubMed Central

    Bettembourg, Charles; Diot, Christian; Dameron, Olivier

    2015-01-01

    Background The analysis of gene annotations referencing back to Gene Ontology plays an important role in the interpretation of high-throughput experiments results. This analysis typically involves semantic similarity and particularity measures that quantify the importance of the Gene Ontology annotations. However, there is currently no sound method supporting the interpretation of the similarity and particularity values in order to determine whether two genes are similar or whether one gene has some significant particular function. Interpretation is frequently based either on an implicit threshold, or an arbitrary one (typically 0.5). Here we investigate a method for determining thresholds supporting the interpretation of the results of a semantic comparison. Results We propose a method for determining the optimal similarity threshold by minimizing the proportions of false-positive and false-negative similarity matches. We compared the distributions of the similarity values of pairs of similar genes and pairs of non-similar genes. These comparisons were performed separately for all three branches of the Gene Ontology. In all situations, we found overlap between the similar and the non-similar distributions, indicating that some similar genes had a similarity value lower than the similarity value of some non-similar genes. We then extend this method to the semantic particularity measure and to a similarity measure applied to the ChEBI ontology. Thresholds were evaluated over the whole HomoloGene database. For each group of homologous genes, we computed all the similarity and particularity values between pairs of genes. Finally, we focused on the PPAR multigene family to show that the similarity and particularity patterns obtained with our thresholds were better at discriminating orthologs and paralogs than those obtained using default thresholds. Conclusion We developed a method for determining optimal semantic similarity and particularity thresholds. We applied this method on the GO and ChEBI ontologies. Qualitative analysis using the thresholds on the PPAR multigene family yielded biologically-relevant patterns. PMID:26230274
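
    A minimal sketch of the threshold-selection idea described above: given similarity scores for known similar and known non-similar gene pairs, pick the cut-off that minimizes the summed proportions of false negatives and false positives. The function name and the simple grid search are assumptions; the published method may weight the two error types differently.

```python
import numpy as np

def optimal_similarity_threshold(sim_similar, sim_nonsimilar, grid=None):
    """Cut-off minimizing false-negative rate (similar pairs scoring below the
    cut-off) plus false-positive rate (non-similar pairs scoring at or above it)."""
    s = np.asarray(sim_similar, dtype=float)
    n = np.asarray(sim_nonsimilar, dtype=float)
    if grid is None:
        grid = np.unique(np.concatenate([s, n]))   # candidate cut-offs: observed scores
    errors = [(s < t).mean() + (n >= t).mean() for t in grid]
    return float(grid[int(np.argmin(errors))])

# Example with synthetic, overlapping score distributions.
rng = np.random.default_rng(1)
print(optimal_similarity_threshold(rng.beta(5, 2, 500), rng.beta(2, 5, 500)))
```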

  9. Self-Supervised Learning to Visually Detect Terrain Surfaces for Autonomous Robots Operating in Forested Terrain

    DTIC Science & Technology

    2012-01-01

    values of E_AFP, E_AFN, and E_AF can be compared with three user-defined threshold values, T_AFP, T_AFN, and T_AF. These threshold values determine the update... values were chosen as T_AFP = E⁰_AFP + 0.02, T_AFN = E⁰_AFN + 0.02, and T_AF = E⁰_AF + 0.02). We called the value of 0.02 the margin of error tolerance. In

  10. Verification and Enhancement of VIIRS Day-Night Band Power Outage Detection Product

    NASA Astrophysics Data System (ADS)

    Burke, A.; Schultz, L. A.; Omitaomu, O.; Molthan, A.; Cole, T.; Griffin, R.

    2017-12-01

    The NASA SPoRT (Short-term Prediction Research and Transition) Center has collaborated with scientists at NASA Goddard Space Flight Center to create a power outage detection product from radiance data obtained by the VIIRS (Visible Infrared Imaging Radiometer Suite) sensor aboard the Suomi-NPP satellite. This product uses a composite of pre-event radiance values from the VIIRS Day-Night Band to establish a baseline of "normal" nighttime lights for a study area. Then, after a severe weather event or other disaster, post-event images are compared to the composite to generate a percent-of-normal radiance product that identifies areas experiencing outages, aids disaster response, and supports recovery monitoring. This project will use ground-truth county-level outage data provided by Oak Ridge National Laboratory (ORNL) in order to validate the product and to establish a percent-of-normal threshold for identifying power outages. Once a threshold is found, ORNL's LandScan Global population data will be combined with the product to estimate how many electrical customers are affected by power outages after a disaster. Two case studies will be explored to examine power outage recovery after severe weather events: Hurricane Matthew in 2016 and the Washington, D.C. derecho event of 2012.
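
    A rough sketch of the percent-of-normal computation and the threshold-based outage flagging described above; the 50% cut-off and the function names are placeholders, since the project derives the actual threshold from ORNL's ground-truth outage data.

```python
import numpy as np

def percent_of_normal(post_event, pre_event_composite, eps=1e-6):
    """Percent-of-normal nighttime radiance: post-event image divided by the
    pre-event composite baseline, expressed in percent."""
    post = np.asarray(post_event, dtype=float)
    base = np.asarray(pre_event_composite, dtype=float)
    return 100.0 * post / np.maximum(base, eps)   # eps guards against division by zero

def outage_mask(post_event, pre_event_composite, threshold_pct=50.0):
    """Flag pixels whose radiance dropped below a percent-of-normal threshold
    (the 50% value here is purely illustrative)."""
    return percent_of_normal(post_event, pre_event_composite) < threshold_pct
```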

  11. Usage of fMRI for pre-surgical planning in brain tumor and vascular lesion patients: task and statistical threshold effects on language lateralization.

    PubMed

    Nadkarni, Tanvi N; Andreoli, Matthew J; Nair, Veena A; Yin, Peng; Young, Brittany M; Kundu, Bornali; Pankratz, Joshua; Radtke, Andrew; Holdsworth, Ryan; Kuo, John S; Field, Aaron S; Baskaya, Mustafa K; Moritz, Chad H; Meyerand, M Elizabeth; Prabhakaran, Vivek

    2015-01-01

    Functional magnetic resonance imaging (fMRI) is a non-invasive pre-surgical tool used to assess localization and lateralization of language function in brain tumor and vascular lesion patients in order to guide neurosurgeons as they devise a surgical approach to treat these lesions. We investigated the effect of varying the statistical thresholds as well as the type of language tasks on functional activation patterns and language lateralization. We hypothesized that language lateralization indices (LIs) would be threshold- and task-dependent. Imaging data were collected from brain tumor patients (n = 67, average age 48 years) and vascular lesion patients (n = 25, average age 43 years) who received pre-operative fMRI scanning. Both patient groups performed expressive (antonym and/or letter-word generation) and receptive (tumor patients performed text-reading; vascular lesion patients performed text-listening) language tasks. A control group (n = 25, average age 45 years) performed the letter-word generation task. Brain tumor patients showed left-lateralization during the antonym-word generation and text-reading tasks at high threshold values and bilateral activation during the letter-word generation task, irrespective of the threshold values. Vascular lesion patients showed left-lateralization during the antonym and letter-word generation, and text-listening tasks at high threshold values. Our results suggest that the type of task and the applied statistical threshold influence LI and that the threshold effects on LI may be task-specific. Thus identifying critical functional regions and computing LIs should be conducted on an individual subject basis, using a continuum of threshold values with different tasks to provide the most accurate information for surgical planning to minimize post-operative language deficits.
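
    A small sketch of the laterality index computation implied above, evaluated over a continuum of statistical thresholds as the authors recommend; the LI = (L − R)/(L + R) form is the standard definition based on counts of suprathreshold voxels, while the array-based interface is an assumption.

```python
import numpy as np

def laterality_index(left_stat, right_stat, threshold):
    """LI = (L - R) / (L + R), where L and R are the numbers of voxels whose
    statistic exceeds the threshold in left- and right-hemisphere language ROIs."""
    L = float(np.sum(np.asarray(left_stat) > threshold))
    R = float(np.sum(np.asarray(right_stat) > threshold))
    return (L - R) / (L + R) if (L + R) > 0 else 0.0

def li_curve(left_stat, right_stat, thresholds):
    """Evaluate the LI over a continuum of thresholds, as recommended for
    individual-subject pre-surgical mapping."""
    return [laterality_index(left_stat, right_stat, t) for t in thresholds]
```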

  12. Verification of the tumor volume delineation method using a fixed threshold of peak standardized uptake value.

    PubMed

    Koyama, Kazuya; Mitsumoto, Takuya; Shiraishi, Takahiro; Tsuda, Keisuke; Nishiyama, Atsushi; Inoue, Kazumasa; Yoshikawa, Kyosan; Hatano, Kazuo; Kubota, Kazuo; Fukushi, Masahiro

    2017-09-01

    We aimed to determine the difference in tumor volume associated with the reconstruction model in positron-emission tomography (PET). To reduce the influence of the reconstruction model, we suggested a method to measure the tumor volume using the relative threshold method with a fixed threshold based on the peak standardized uptake value (SUVpeak). The efficacy of our method was verified using ¹⁸F-2-fluoro-2-deoxy-D-glucose PET/computed tomography images of 20 patients with lung cancer. The tumor volume was determined using the relative threshold method with a fixed threshold based on the SUVpeak. The PET data were reconstructed using the ordered-subset expectation maximization (OSEM) model, the OSEM + time-of-flight (TOF) model, and the OSEM + TOF + point-spread function (PSF) model. The volume differences associated with the reconstruction algorithm (%VD) were compared. For comparison, the tumor volume was also measured using the relative threshold method based on the maximum SUV (SUVmax). For the OSEM and TOF models, the mean %VD values were -0.06 ± 8.07% and -2.04 ± 4.23% for the fixed 40% threshold according to the SUVmax and the SUVpeak, respectively. The effect of our method in this case seemed to be minor. For the OSEM and PSF models, the mean %VD values were -20.41 ± 14.47% and -13.87 ± 6.59% for the fixed 40% threshold according to the SUVmax and SUVpeak, respectively. Our new method enabled the measurement of tumor volume with a fixed threshold and reduced the influence of changes in tumor volume associated with the reconstruction model.

  13. Is there a threshold level of maternal education sufficient to reduce child undernutrition? Evidence from Malawi, Tanzania and Zimbabwe.

    PubMed

    Makoka, Donald; Masibo, Peninah Kinya

    2015-08-22

    Maternal education is strongly associated with young child nutrition outcomes. However, the threshold level of maternal education that reduces undernutrition in children is not well established. This paper investigates the threshold level of maternal education that influences child nutrition outcomes using Demographic and Health Survey data from Malawi (2010), Tanzania (2009-10) and Zimbabwe (2005-06). The weighted samples comprised 4,563 children in Malawi, 4,821 in Tanzania, and 3,473 in Zimbabwe. Using three measures of child nutritional status (stunting, wasting and underweight), we employ survey logistic regression to analyse the influence of various levels of maternal education on child nutrition outcomes. In Malawi, 45% of the children were stunted, compared with 42% in Tanzania and 33% in Zimbabwe. Twelve percent of children were underweight in Malawi and Zimbabwe, and 16% in Tanzania. The level of wasting was 6% of children in Malawi, 5% in Tanzania and 4% in Zimbabwe. Stunting was significantly (p < 0.0001) associated with the mother's educational level in all three countries. Higher levels of maternal education reduced the odds of child stunting, underweight and wasting in the three countries. The maternal education threshold for stunting is more than ten years of schooling; wasting and underweight have lower threshold levels. These results imply that free primary education in the three African countries may not be sufficient, and policies to keep girls in school beyond primary school hold more promise for addressing child undernutrition.

  14. Defect Detection of Steel Surfaces with Global Adaptive Percentile Thresholding of Gradient Image

    NASA Astrophysics Data System (ADS)

    Neogi, Nirbhar; Mohanta, Dusmanta K.; Dutta, Pranab K.

    2017-12-01

    Steel strips are used extensively for white goods, auto bodies and other purposes where surface defects are not acceptable. On-line surface inspection systems can effectively detect and classify defects and help in taking corrective actions. For defect detection, the use of gradients is very popular for highlighting and subsequently segmenting areas of interest in a surface inspection system. Most of the time, segmentation by a fixed threshold value leads to unsatisfactory results. Because defects can be both very small and very large in size, segmentation of a gradient image based on a fixed percentile threshold can lead to inadequate or excessive segmentation of defective regions. A global adaptive percentile thresholding of the gradient image has been formulated for blister defects and water deposits (a pseudo defect) in steel strips. The developed method adaptively changes the percentile value used for thresholding depending on the number of pixels above specific gray-level values of the gradient image. The method is able to segment defective regions selectively, preserving the characteristics of defects irrespective of their size. The developed method performs better than the Otsu method of thresholding and an adaptive thresholding method based on local properties.
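
    The sketch below illustrates the general idea of adapting the thresholding percentile to the amount of strong-gradient content in the image; the specific adaptation rule (a linear decrease of the percentile with the fraction of strong-gradient pixels) and all parameter values are illustrative assumptions, not the published formulation.

```python
import numpy as np
from scipy import ndimage

def adaptive_percentile_threshold(image, high_grad_level=40.0,
                                  base_pct=99.5, min_pct=97.0, slope_per_fraction=50.0):
    """Segment defect candidates by thresholding the gradient image at a
    percentile that is lowered when many pixels carry strong gradients
    (large defects) and kept high otherwise (small defects)."""
    img = np.asarray(image, dtype=float)
    gx = ndimage.sobel(img, axis=0)
    gy = ndimage.sobel(img, axis=1)
    grad = np.hypot(gx, gy)                               # gradient magnitude
    frac_strong = float(np.mean(grad > high_grad_level))  # share of strong-gradient pixels
    pct = max(min_pct, base_pct - slope_per_fraction * frac_strong)
    return grad >= np.percentile(grad, pct)               # boolean defect-candidate mask
```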

  15. The asymmetry of U.S. monetary policy: Evidence from a threshold Taylor rule with time-varying threshold values

    NASA Astrophysics Data System (ADS)

    Zhu, Yanli; Chen, Haiqiang

    2017-05-01

    In this paper, we revisit the issue of whether U.S. monetary policy is asymmetric by estimating a forward-looking threshold Taylor rule with quarterly data from 1955 to 2015. In order to capture the potential heterogeneity of the regime-shift mechanism under different economic conditions, we modify the threshold model by treating the threshold value as a latent variable that follows an autoregressive (AR) dynamic process. We use the unemployment rate as the threshold variable and separate the sample into two periods: expansion periods and recession periods. Our findings support the view that U.S. monetary policy operations are asymmetric in these two regimes. More precisely, the monetary authority tends to implement an active Taylor rule with a weaker response to the inflation gap (the deviation of inflation from its target) and a stronger response to the output gap (the deviation of output from its potential level) in recession periods. The threshold value, interpreted as the targeted unemployment rate of the monetary authority, exhibits significant time-varying properties, confirming the conjecture that policy makers may adjust their reference point for the unemployment rate to reflect their view of the health of the general economy.
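
    One way to write down the kind of rule described above is sketched below in LaTeX: a forward-looking Taylor rule whose coefficients switch according to whether the unemployment rate lies above or below a latent, time-varying threshold that follows an AR(1) process. The exact functional form, smoothing terms and coefficient naming are assumptions; only the two-regime structure and the autoregressive threshold come from the abstract.

```latex
% Sketch (assumed specification): forward-looking threshold Taylor rule with a
% latent AR(1) threshold. i_t: policy rate; \pi_{t+1|t}: expected inflation;
% \pi^*: inflation target; y_t: output gap; u_t: unemployment rate; \tau_t: threshold.
\begin{align}
  i_t &=
  \begin{cases}
    \alpha_1 + \beta_1\,(\pi_{t+1\mid t}-\pi^{*}) + \gamma_1\,y_t + \rho_1\,i_{t-1} + \varepsilon_t,
      & u_t \le \tau_t \quad \text{(expansion regime)}\\[4pt]
    \alpha_2 + \beta_2\,(\pi_{t+1\mid t}-\pi^{*}) + \gamma_2\,y_t + \rho_2\,i_{t-1} + \varepsilon_t,
      & u_t > \tau_t \quad \text{(recession regime)}
  \end{cases}\\
  \tau_t &= \phi_0 + \phi_1\,\tau_{t-1} + \eta_t
\end{align}
```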

  16. Evaluation of the stability indices for the thunderstorm forecasting in the region of Belgrade, Serbia

    NASA Astrophysics Data System (ADS)

    Vujović, D.; Paskota, M.; Todorović, N.; Vučković, V.

    2015-07-01

    The pre-convective atmosphere over Serbia during a ten-year period (2001-2010) was investigated using radiosonde data from one meteorological station and thunderstorm observations from thirteen SYNOP meteorological stations. Several stability indices were examined in order to verify their ability to forecast thunderstorms. Rank sum scores (RSSs) were used to identify indices and parameters that can differentiate between thunderstorm and no-thunderstorm events. The following indices had the best RSS values: Lifted index (LI), K index (KI), Showalter index (SI), Boyden index (BI), Total totals (TT), dew-point temperature and mixing ratio. A threshold value test was used to determine the appropriate threshold values for these variables, and the threshold with the best skill scores was chosen as optimal. The thresholds were validated in two ways: through a control data set, and by comparing the calculated index thresholds with the index values for a randomly chosen day with an observed thunderstorm. The index with the highest skill for thunderstorm forecasting was the LI, followed by the SI, KI and TT. The BI had the poorest skill scores.
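
    A minimal sketch of the threshold value test described above: build the contingency table for the rule "forecast a thunderstorm when the index crosses the candidate threshold" and score it, here with POD, FAR and the Heidke skill score as illustrative metrics (the abstract does not list the exact skill scores used). The direction flag is needed because storms are associated with low LI/SI values but high KI/TT values.

```python
import numpy as np

def skill_scores(index_values, observed_storm, threshold, storm_if_above=True):
    """Contingency-table scores for the rule 'forecast a thunderstorm when the
    index crosses the threshold'."""
    x = np.asarray(index_values, dtype=float)
    obs = np.asarray(observed_storm, dtype=bool)
    fcst = x >= threshold if storm_if_above else x <= threshold
    a = np.sum(fcst & obs)       # hits
    b = np.sum(fcst & ~obs)      # false alarms
    c = np.sum(~fcst & obs)      # misses
    d = np.sum(~fcst & ~obs)     # correct negatives
    pod = a / (a + c) if a + c else np.nan
    far = b / (a + b) if a + b else np.nan
    hss_den = (a + c) * (c + d) + (a + b) * (b + d)
    hss = 2.0 * (a * d - b * c) / hss_den if hss_den else np.nan
    return {"POD": pod, "FAR": far, "HSS": hss}

def best_threshold(index_values, observed_storm, candidates, storm_if_above=True):
    """Pick the candidate threshold with the highest Heidke skill score."""
    return max(candidates,
               key=lambda t: skill_scores(index_values, observed_storm, t,
                                          storm_if_above)["HSS"])
```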

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Tanmoy; Singh, Harkirat; Mitra, Chiranjib

    Violation of a Bell's inequality test has been established as an efficient tool to determine the presence of entanglement in quantum spin-1/2 magnets. Herein, macroscopic thermodynamic quantities, namely magnetic susceptibility and specific heat, have been employed to perform a Bell's inequality test for NH₄CuPO₄·H₂O, a spin-1/2 antiferromagnet with nearest-neighbor interactions. The mean value of the Bell operator is quantified and plotted as a function of temperature. The threshold temperature above which Bell's inequality is not violated is determined, and good consistency is found between the analyses performed on the magnetic and thermal data.

  18. Occupational exposure limits for carcinogens--variant approaches by different countries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, W.A.

    1989-09-01

    The differences in treatment of occupational exposure limits for carcinogens by 24 countries is described along with a discussion of the American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit values (TLV) treatment, the similar treatment of the new Occupational Safety and Health Administration (OSHA) standard, and the treatment by provinces of Canada. The unique listing by the Federal Republic of Germany of so-called technical guiding concentrations of a group of carcinogens is discussed with the note that Austria used this same system. Publications on justification for establishing occupational exposure limits for certain carcinogens are discussed also.

  19. Rapid and Reliable Damage Proxy Map from InSAR Coherence

    NASA Technical Reports Server (NTRS)

    Yun, Sang-Ho; Fielding, Eric; Simons, Mark; Agram, Piyush; Rosen, Paul; Owen, Susan; Webb, Frank

    2012-01-01

    Future radar satellites will visit Southern California within a day after a disaster event. Data acquisition latency in 2015-2020 is expected to be 8 to approximately 15 hours, while data transfer latency, which often involves human/agency intervention, far exceeds the data acquisition latency; interagency cooperation is needed to establish an automatic pipeline for data transfer. The algorithm is tested with ALOS PALSAR data of Pasadena, California. Quantitative quality assessment is being pursued: meetings with Pasadena City Hall computer engineers will provide a complete list of demolition/construction projects in order to (1) estimate the probability of detection and the probability of false alarm and (2) estimate the optimal threshold value.

  20. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    NASA Astrophysics Data System (ADS)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in a 116 km² area of northern Tuscany. The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in a 116 km² area for which a database of 1200 landslides was available for the period 2000-2012, and the results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
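
    For the intensity-duration approach described above, a generic way to obtain a lower-bound threshold of the form I = αD^(−β) is sketched below: fit a power law to the triggering rainstorms in log-log space and shift it down to a low quantile of the residuals. This is a common, generic recipe, not necessarily the procedure implemented in MaCumBA.

```python
import numpy as np

def id_threshold(durations_h, intensities_mmh, quantile=0.05):
    """Fit an intensity-duration threshold I = alpha * D**(-beta) as a lower
    envelope of rainfall conditions that triggered landslides: ordinary least
    squares in log-log space, then shift the intercept down to the requested
    quantile of the residuals."""
    D = np.log10(np.asarray(durations_h, dtype=float))
    I = np.log10(np.asarray(intensities_mmh, dtype=float))
    slope, intercept = np.polyfit(D, I, 1)
    resid = I - (slope * D + intercept)
    intercept_env = intercept + np.quantile(resid, quantile)  # lower envelope
    alpha = 10.0 ** intercept_env
    beta = -slope
    return alpha, beta   # threshold curve: I_thr(D) = alpha * D**(-beta)
```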

  1. A critique of the use of indicator-species scores for identifying thresholds in species responses

    USGS Publications Warehouse

    Cuffney, Thomas F.; Qian, Song S.

    2013-01-01

    Identification of ecological thresholds is important for both theoretical and applied ecology. Recently, Baker and King (2010; King and Baker 2010) proposed a method, threshold indicator analysis (TITAN), to calculate species and community thresholds based on indicator species scores adapted from Dufrêne and Legendre (1997). We tested the ability of TITAN to detect thresholds using models with (broken-stick, disjointed broken-stick, dose-response, step-function, Gaussian) and without (linear) definitive thresholds. TITAN accurately and consistently detected thresholds in step-function models, but not in models characterized by abrupt changes in response slopes or response direction. Threshold detection in TITAN was very sensitive to the distribution of zero values, which caused TITAN to identify thresholds associated with relatively small differences in the distribution of zero values while ignoring thresholds associated with large changes in abundance. Threshold identification and tests of statistical significance were based on the same data permutations, resulting in inflated estimates of statistical significance. Application of bootstrapping to the split-point problem that underlies TITAN led to underestimates of the confidence intervals of thresholds. Bias in the derivation of the z-scores used to identify TITAN thresholds, together with skewness in the distribution of data along the gradient, produced TITAN thresholds that were much more similar to one another than the actual thresholds. This tendency may account for the synchronicity of thresholds reported in TITAN analyses. The thresholds identified by TITAN represented disparate characteristics of species responses; this, coupled with the inability of TITAN to identify thresholds accurately and consistently, does not support the aggregation of individual species thresholds into a community threshold.

  2. Salicylate-induced changes in auditory thresholds of adolescent and adult rats.

    PubMed

    Brennan, J F; Brown, C A; Jastreboff, P J

    1996-01-01

    Shifts in auditory intensity thresholds after salicylate administration were examined in postweanling and adult pigmented rats at frequencies ranging from 1 to 35 kHz. A total of 132 subjects from both age levels were tested under two-way active avoidance or one-way active avoidance paradigms. Estimated thresholds were inferred from behavioral responses to presentations of descending and ascending series of intensities for each test frequency value. Reliable threshold estimates were found under both avoidance conditioning methods, and compared to controls, subjects at both age levels showed threshold shifts at selective higher frequency values after salicylate injection, and the extent of shifts was related to salicylate dose level.

  3. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    PubMed

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
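
    The fitness that such multilevel thresholding methods typically maximize is Otsu's between-class variance generalized to several thresholds; exhaustively searching every threshold combination is what makes the problem expensive and motivates metaheuristics such as the flower pollination algorithm. A sketch of that fitness function is given below; the interface and the example threshold vector are assumptions.

```python
import numpy as np

def otsu_multilevel_fitness(hist, thresholds):
    """Between-class variance for a set of thresholds splitting the histogram
    into K+1 classes; this is the fitness a metaheuristic maximizes over
    candidate threshold vectors."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    levels = np.arange(p.size)
    mu_total = np.sum(levels * p)
    edges = [0] + sorted(int(t) for t in thresholds) + [p.size]
    fitness = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                               # class probability
        if w > 0:
            mu = np.sum(levels[lo:hi] * p[lo:hi]) / w    # class mean
            fitness += w * (mu - mu_total) ** 2
    return fitness

# Example: evaluate a candidate 3-threshold vector on an image histogram.
# hist, _ = np.histogram(image, bins=256, range=(0, 256))
# print(otsu_multilevel_fitness(hist, [60, 120, 180]))
```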

  4. Evaluation of different radon guideline values based on characterization of ecological risk and visualization of lung cancer mortality trends in British Columbia, Canada.

    PubMed

    Branion-Calles, Michael C; Nelson, Trisalyn A; Henderson, Sarah B

    2015-11-19

    There is no safe concentration of radon gas, but guideline values provide threshold concentrations that are used to map areas at higher risk. These values vary between regions, countries, and organizations, which can lead to differential classification of risk. For example, the World Health Organization suggests a value of 100 Bq m⁻³, while Health Canada recommends 200 Bq m⁻³. Our objective was to describe how different thresholds characterized ecological radon risk and their visual association with lung cancer mortality trends in British Columbia, Canada. Eight threshold values between 50 and 600 Bq m⁻³ were identified, and classes of radon vulnerability were defined based on whether the observed 95th percentile radon concentration was above or below each value. A balanced random forest algorithm was used to model vulnerability, and the results were mapped. We compared high-vulnerability areas, their estimated populations, and differences in lung cancer mortality trends stratified by smoking prevalence and sex. Classification accuracy improved as the threshold concentrations decreased and the area classified as high vulnerability increased. The majority of the population lived within areas of lower vulnerability regardless of the threshold value. Thresholds as low as 50 Bq m⁻³ were associated with higher lung cancer mortality, even in areas with low smoking prevalence. Temporal trends in lung cancer mortality were increasing for women while decreasing for men. Radon contributes to lung cancer in British Columbia. The results of the study provide evidence supporting the use of a reference level lower than the current guideline of 200 Bq m⁻³ for the province.

  5. Accelerometer thresholds: Accounting for body mass reduces discrepancies between measures of physical activity for individuals with overweight and obesity.

    PubMed

    Raiber, Lilian; Christensen, Rebecca A G; Jamnik, Veronica K; Kuk, Jennifer L

    2017-01-01

    The objective of this study was to explore whether accelerometer thresholds that are adjusted to account for differences in body mass influence discrepancies between self-reported and accelerometer-measured physical activity (PA) volume for individuals with overweight and obesity. We analyzed 6164 adults from the National Health and Nutrition Examination Survey between 2003-2006. Established accelerometer thresholds were adjusted to account for differences in body mass so as to produce a similar energy expenditure (EE) rate as in individuals with normal weight. Moderate-, vigorous-, and moderate- to vigorous-intensity PA (MVPA) durations were measured using established and adjusted accelerometer thresholds and compared with self-report. Self-reported MVPA durations were longer than accelerometer-measured durations using established thresholds (normal weight: 57.8 ± 2.4 vs 9.0 ± 0.5 min/day, overweight: 56.1 ± 2.7 vs 7.4 ± 0.5 min/day, and obesity: 46.5 ± 2.2 vs 3.7 ± 0.3 min/day). Durations of subjective and objective PA were negatively associated with body mass index (BMI) (P < 0.05). Using adjusted thresholds increased MVPA durations and reduced discrepancies between accelerometer and self-report measures for the overweight and obese groups by 6.0 ± 0.3 min/day and 17.7 ± 0.8 min/day, respectively (P < 0.05). Using accelerometer thresholds that represent equal EE rates across BMI categories reduced the discrepancies between durations of subjective and objective PA for the overweight and obese groups. However, accelerometer-measured PA generally remained shorter than self-reported durations within all BMI categories. Further research may be necessary to improve analytical approaches when using objective measures of PA for individuals with overweight or obesity.
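
    The abstract does not give the exact adjustment formula, but the idea of scaling a count cut-point so that it corresponds to the same absolute energy-expenditure rate as in a normal-weight reference person can be sketched as below. The proportional scaling, the 70 kg reference mass, and the example cut-points are illustrative assumptions, not the study's model.

    ```python
    def adjusted_count_threshold(established_cpm, body_mass_kg, reference_mass_kg=70.0):
        """Scale an accelerometer cut-point (counts per minute) so it corresponds to
        the same absolute EE rate (kcal/min) as in a reference-mass individual.

        Assumes counts are roughly proportional to MET level and that absolute EE
        at a given MET level is proportional to body mass (simplifying assumptions)."""
        return established_cpm * reference_mass_kg / body_mass_kg

    # Illustrative cut-points only: a heavier person gets a lower adjusted cut-point.
    moderate_cpm = 2020
    print(adjusted_count_threshold(moderate_cpm, body_mass_kg=95.0))
    ```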

  6. Sympathetic Release of Splenic Monocytes Promotes Recurring Anxiety Following Repeated Social Defeat

    PubMed Central

    McKim, Daniel B.; Patterson, Jenna M.; Wohleb, Eric S.; Jarrett, Brant; Reader, Brenda; Godbout, Jonathan P.; Sheridan, John F.

    2015-01-01

    Background Neuroinflammatory signaling may contribute to the pathophysiology of chronic anxiety disorders. Previous work showed that repeated social defeat (RSD) in mice promoted stress-sensitization that was characterized by the recurrence of anxiety following sub-threshold stress 24 days after RSD. Furthermore, splenectomy following RSD prevented the recurrence of anxiety in stress-sensitized (SS) mice. We hypothesized that the spleen of RSD-exposed mice became a reservoir of primed monocytes that were released following neuroendocrine activation by sub-threshold stress. Methods Mice were subjected to sub-threshold stress (i.e., a single cycle of social defeat) 24 days after RSD, and immune and behavioral measures were taken. Results Sub-threshold stress 24 days after RSD re-established anxiety-like behavior that was associated with egress of Ly6Chi monocytes from the spleen. Moreover, splenectomy prior to RSD blocked monocyte trafficking to the brain and prevented anxiety-like behavior following sub-threshold stress. Splenectomy, however, had no effect on monocyte accumulation or anxiety when determined 14 hours after RSD. In addition, splenocytes cultured 24 days after RSD exhibited a primed inflammatory phenotype. Peripheral sympathetic inhibition prior to sub-threshold stress blocked monocyte trafficking from the spleen to the brain and prevented the re-establishment of anxiety in RSD-sensitized mice. Last, β-adrenergic antagonism also prevented splenic monocyte egress after acute stress. Conclusion The spleen served as a unique reservoir of primed monocytes that were readily released following sympathetic activation by sub-threshold stress that promoted the re-establishment of anxiety. Collectively, the long-term storage of primed monocytes in the spleen may have a profound influence on recurring anxiety disorders. PMID:26281717

  7. Calculation of photoionization cross section near auto-ionizing lines and magnesium photoionization cross section near threshold

    NASA Technical Reports Server (NTRS)

    Moore, E. N.; Altick, P. L.

    1972-01-01

    The research performed is briefly reviewed. A simple method was developed for the calculation of continuum states of atoms when autoionization is present. The method was employed to give the first theoretical cross section for beryllium and magnesium; the results indicate that the values used previously at threshold were sometimes seriously in error. These threshold values have potential applications in astrophysical abundance estimates.

  8. The Montreal Cognitive Assessment and the mini-mental state examination as screening instruments for cognitive impairment: item analyses and threshold scores.

    PubMed

    Damian, Anne M; Jacobson, Sandra A; Hentz, Joseph G; Belden, Christine M; Shill, Holly A; Sabbagh, Marwan N; Caviness, John N; Adler, Charles H

    2011-01-01

    To perform an item analysis of the Montreal Cognitive Assessment (MoCA) versus the Mini-Mental State Examination (MMSE) in the prediction of cognitive impairment, and to examine the characteristics of different MoCA threshold scores. 135 subjects enrolled in a longitudinal clinicopathologic study were administered the MoCA by a single physician and the MMSE by a trained research assistant. Subjects were classified as cognitively impaired or cognitively normal based on independent neuropsychological testing. 89 subjects were found to be cognitively normal, and 46 cognitively impaired (20 with dementia, 26 with mild cognitive impairment). The MoCA was superior in both sensitivity and specificity to the MMSE, although not all MoCA tasks were of equal predictive value. A MoCA threshold score of 26 had a sensitivity of 98% and a specificity of 52% in this population. In a population with a 20% prevalence of cognitive impairment, a threshold of 24 was optimal (negative predictive value 96%, positive predictive value 47%). This analysis suggests the potential for creating an abbreviated MoCA. For screening in primary care, the MoCA threshold of 26 appears optimal. For testing in a memory disorders clinic, a lower threshold has better predictive value. Copyright © 2011 S. Karger AG, Basel.
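
    The shift in predictive values with prevalence reported above follows directly from Bayes' rule; a minimal sketch is shown below using the abstract's sensitivity and specificity at the threshold of 26, with the prevalence values treated as assumptions for illustration.

    ```python
    def predictive_values(sensitivity, specificity, prevalence):
        """Positive and negative predictive values via Bayes' rule."""
        tp = sensitivity * prevalence
        fp = (1 - specificity) * (1 - prevalence)
        fn = (1 - sensitivity) * prevalence
        tn = specificity * (1 - prevalence)
        return tp / (tp + fp), tn / (tn + fn)   # (PPV, NPV)

    # Sensitivity 0.98 and specificity 0.52 at the MoCA threshold of 26;
    # predictive values shift as the assumed prevalence of impairment changes.
    print(predictive_values(0.98, 0.52, prevalence=0.20))
    print(predictive_values(0.98, 0.52, prevalence=0.34))  # roughly 46/135 in the study sample
    ```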

  9. Implications of the new Centers for Disease Control and Prevention blood lead reference value.

    PubMed

    Burns, Mackenzie S; Gerstenberger, Shawn L

    2014-06-01

    The Centers for Disease Control and Prevention recently established a new reference value (≥ 5 μg/dL) as the standard for identifying children with elevated blood lead levels (EBLs). At present, 535,000 US children aged 1 to 5 years (2.6%) are estimated to have EBLs according to the new standard, versus 0.8% according to the previous standard (≥ 10 μg/dL). Because EBLs signify the threshold for public health intervention, this new definition increases demands on lead poisoning prevention efforts. Primary prevention has been proven to reduce lead poisoning cases and is also cost effective; however, federal budget cuts threaten the existence of such programs. Protection for the highest-risk children necessitates a reinstatement of federal funding to previous levels.

  10. The criteria for establishing an acceptable range of chemical, physical and biological indicators for the purpose of ecological standards developing

    NASA Astrophysics Data System (ADS)

    Evdokimova, Maria; Glazunov, Gennady; Yakovlev, Aleksandr

    2017-04-01

    The development of standards for soil quality is based on the assessment of soils' resistance to external influences. The main criterion for assessing the environmental sustainability of soils and lands is their ability to perform their ecological functions (Nkonya et al, 2011, 2013; Costanza et al, 2014; Dobrovolsky and Nikitin, 1990; Yakovlev, Evdokimova, 2011). The limiting value of an indicator of the state of the environment (physical, chemical, biological or other) corresponds to the value at which the stability of environmental components is preserved (the ability to heal itself). The threshold for the effect of a stressor should be identified by the methods of bioindication and biotesting. The analysis based on these methods aims to identify the highest values of the physical or chemical impact indicator (concentration or dose of the stressor) at which no reliably established negative changes occur in the organism, population of organisms or community. Using a theoretical model (Yakovlev et al, 2009; Gendugov, 2013), the problem of finding the threshold concentration is reduced to finding the singular points characterizing the macroscopic "kinetics" of the response in the phase space of the dependence of the response rate upon the impact indicator. The singular points are determined by the analysis of derivatives. The theoretical model allows calculation of its six singular points, one of which, the maximum point, corresponds to the highest concentration of the stressor at which no adverse effects on the test organisms are observed. This point also corresponds to the lowest concentration of the stressor at which it no longer has a stimulatory (hormesis) effect. The six singular points divide the whole range of stressor values (concentrations) into seven bands, each with a unique set of values of the "macrokinetic" indicators of the living cells' response to the stressor. Thus, the use of the theoretical equations allowed us (1) to establish categories (borders) of soil quality on an empirical scale of environmental quality and (2) to detail the category of quality in the range of hormesis, that is, in the range of weak positive effects of the stressor. The solution of the equation in the phase space of the dependence of response upon exposure is q = C/z^b * exp(-K/z), where q is the measurable response of living organisms to exposure to the stressor, whose concentration is z; C is the constant of integration, a coefficient that scales the value of q; b is the coefficient of the growth rate of the response with increasing z; and K is the coefficient of the decline rate of q with increasing z. The coefficients C, b and K are found by fitting the model to the experimental data by nonlinear regression using an available software package. The abscissa of the maximum point is of particular interest, because it corresponds to (1) the lowest concentration of the stressor at which it no longer manifests a stimulatory (hormesis) effect and, at the same time, (2) the largest concentration of the stressor at which it has not shown a negative effect. That is, it meets the requirements for a threshold concentration of the stressor and can be used in the development of environmental quality standards. Acknowledgments: This study was supported by the Russian Science Foundation, project no. 143800023.
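
    A minimal numerical sketch of the fitting step is shown below. Setting dq/dz = 0 for q = C·z^(-b)·exp(-K/z) gives the abscissa of the maximum point in closed form as z = K/b, which the abstract identifies with the candidate threshold concentration. The data, starting values and solver settings are placeholders, not the study's.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def response(z, C, b, K):
        """Hormesis-type exposure-response model q = C * z**(-b) * exp(-K / z)."""
        return C * z**(-b) * np.exp(-K / z)

    # Placeholder exposure-response data (stressor concentration z, measured response q).
    z = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
    q = np.array([0.10, 0.45, 0.80, 0.95, 0.75, 0.50, 0.30])

    (C, b, K), _ = curve_fit(response, z, q, p0=(1.0, 0.5, 1.0), maxfev=10000)

    # dq/dz = 0 gives the abscissa of the maximum point analytically: z = K / b.
    z_max = K / b
    print(f"fitted C={C:.3g}, b={b:.3g}, K={K:.3g}; maximum (candidate threshold) at z={z_max:.3g}")
    ```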

  11. Threshold-based insulin-pump interruption for reduction of hypoglycemia.

    PubMed

    Bergenstal, Richard M; Klonoff, David C; Garg, Satish K; Bode, Bruce W; Meredith, Melissa; Slover, Robert H; Ahmann, Andrew J; Welsh, John B; Lee, Scott W; Kaufman, Francine R

    2013-07-18

    The threshold-suspend feature of sensor-augmented insulin pumps is designed to minimize the risk of hypoglycemia by interrupting insulin delivery at a preset sensor glucose value. We evaluated sensor-augmented insulin-pump therapy with and without the threshold-suspend feature in patients with nocturnal hypoglycemia. We randomly assigned patients with type 1 diabetes and documented nocturnal hypoglycemia to receive sensor-augmented insulin-pump therapy with or without the threshold-suspend feature for 3 months. The primary safety outcome was the change in the glycated hemoglobin level. The primary efficacy outcome was the area under the curve (AUC) for nocturnal hypoglycemic events. Two-hour threshold-suspend events were analyzed with respect to subsequent sensor glucose values. A total of 247 patients were randomly assigned to receive sensor-augmented insulin-pump therapy with the threshold-suspend feature (threshold-suspend group, 121 patients) or standard sensor-augmented insulin-pump therapy (control group, 126 patients). The changes in glycated hemoglobin values were similar in the two groups. The mean AUC for nocturnal hypoglycemic events was 37.5% lower in the threshold-suspend group than in the control group (980 ± 1200 mg per deciliter [54.4 ± 66.6 mmol per liter] × minutes vs. 1568 ± 1995 mg per deciliter [87.0 ± 110.7 mmol per liter] × minutes, P<0.001). Nocturnal hypoglycemic events occurred 31.8% less frequently in the threshold-suspend group than in the control group (1.5 ± 1.0 vs. 2.2 ± 1.3 per patient-week, P<0.001). The percentages of nocturnal sensor glucose values of less than 50 mg per deciliter (2.8 mmol per liter), 50 to less than 60 mg per deciliter (3.3 mmol per liter), and 60 to less than 70 mg per deciliter (3.9 mmol per liter) were significantly reduced in the threshold-suspend group (P<0.001 for each range). After 1438 instances at night in which the pump was stopped for 2 hours, the mean sensor glucose value was 92.6 ± 40.7 mg per deciliter (5.1 ± 2.3 mmol per liter). Four patients (all in the control group) had a severe hypoglycemic event; no patients had diabetic ketoacidosis. This study showed that over a 3-month period the use of sensor-augmented insulin-pump therapy with the threshold-suspend feature reduced nocturnal hypoglycemia, without increasing glycated hemoglobin values. (Funded by Medtronic MiniMed; ASPIRE ClinicalTrials.gov number, NCT01497938.).
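
    The nocturnal hypoglycemia AUC (mg per deciliter × minutes) can be illustrated as the area between the sensor glucose trace and a hypoglycemia threshold, accumulated while glucose is below that threshold. The 65 mg/dL cut-off and the trapezoidal integration in this sketch are assumptions; the trial's exact computation is not described in the abstract.

    ```python
    import numpy as np

    def hypoglycemia_auc(times_min, glucose_mg_dl, threshold=65.0):
        """Area (mg/dL x min) between an assumed hypoglycemia threshold and the
        sensor glucose trace, counted only while glucose is below the threshold."""
        t = np.asarray(times_min, float)
        g = np.asarray(glucose_mg_dl, float)
        depth = np.clip(threshold - g, 0.0, None)   # mg/dL below threshold, else 0
        return float(np.trapz(depth, t))

    # Example: 5-minute CGM samples dipping below the threshold overnight.
    t = np.arange(0, 60, 5)
    g = [80, 72, 66, 60, 55, 52, 58, 64, 70, 75, 80, 85]
    print(hypoglycemia_auc(t, g))
    ```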

  12. Assessing the nutrient intake of a low-carbohydrate, high-fat (LCHF) diet: a hypothetical case study design

    PubMed Central

    Zinn, Caryn; Rush, Amy; Johnson, Rebecca

    2018-01-01

    Objective The low-carbohydrate, high-fat (LCHF) diet is increasingly employed in clinical dietetic practice as a means to manage many health-related conditions. Yet it remains contentious in nutrition circles due to a belief that the diet is devoid of nutrients and concern around its saturated fat content. This work aimed to assess the micronutrient intake of the LCHF diet under two conditions of saturated fat thresholds. Design In this descriptive study, two LCHF meal plans were designed for two hypothetical cases representing the average Australian male and female weight-stable adult. Nationally documented heights, a body mass index of 22.5 to establish weight and a 1.6 activity factor were used to estimate total energy intake using the Schofield equation. Carbohydrate was limited to <130 g, protein was set at 15%–25% of total energy and fat supplied the remaining calories. One version of the diet aligned with the national saturated fat guideline threshold of <10% of total energy and the other included saturated fat ad libitum. Primary outcomes The primary outcomes included all micronutrients, which were assessed using FoodWorks dietary analysis software against national Australian/New Zealand nutrient reference value (NRV) thresholds. Results All of the meal plans exceeded the minimum NRV thresholds, apart from iron in the female meal plans, which achieved 86%–98% of the threshold. Saturated fat intake in the male plan could not practically be reduced below the 10% threshold; it exceeded the threshold by 2 g (0.6%). Conclusion Despite macronutrient proportions not aligning with current national dietary guidelines, a well-planned LCHF meal plan can be considered micronutrient replete. This is an important finding for health professionals, consumers and critics of LCHF nutrition, as it dispels the myth that these diets are suboptimal in their micronutrient supply. As with any diet, for optimal nutrient achievement, meals need to be well formulated. PMID:29439004

  13. Forest Loss and the Biodiversity Threshold: An Evaluation Considering Species Habitat Requirements and the Use of Matrix Habitats

    PubMed Central

    Estavillo, Candelaria; Pardini, Renata; da Rocha, Pedro Luís Bernardo

    2013-01-01

    Habitat loss is the main driver of the current biodiversity crisis, a landscape-scale process that affects the survival of spatially-structured populations. Although it is well-established that species responses to habitat loss can be abrupt, the existence of a biodiversity threshold is still the cause of much controversy in the literature and would require that most species respond similarly to the loss of native vegetation. Here we test the existence of a biodiversity threshold, i.e. an abrupt decline in species richness, with habitat loss. We draw on a spatially-replicated dataset on Atlantic forest small mammals, consisting of 16 sampling sites divided between forests and matrix habitats in each of five 3600-ha landscapes (varying from 5% to 45% forest cover), and on an a priori classification of species into habitat requirement categories (forest specialists, habitat generalists and open-area specialists). Forest specialists declined abruptly below 30% forest cover, and spillover to the matrix occurred only in more forested landscapes. Generalists responded positively to landscape heterogeneity, peaking at intermediate levels of forest cover. Open-area specialists dominated the matrix and did not spill over into forests. As a result of these distinct responses, we observed a biodiversity threshold for the small mammal community below 30% forest cover, and a peak in species richness just above this threshold. Our results highlight that cross-habitat spillover may be asymmetrical and contingent on landscape context, occurring mainly from forests to the matrix and only in more forested landscapes. Moreover, they indicate the potential for biodiversity thresholds in human-modified landscapes, and the importance of landscape heterogeneity to biodiversity. Since forest loss affected not only the conservation value of forest patches, but also the potential for biodiversity-mediated services in anthropogenic habitats, our work indicates the importance of proactive measures to prevent human-modified landscapes from crossing this threshold. PMID:24324776

  14. Prediction of area under the curve for a p-glycoprotein, a CYP3A4 and a CYP2C9 substrate using a single time point strategy: assessment using fexofenadine, itraconazole and losartan and metabolites.

    PubMed

    Srinivas, Nuggehally R

    2016-01-01

    In the present age of polypharmacy, a limited sampling strategy becomes important to verify whether drug levels are within the prescribed threshold limits from efficacy and safety considerations. The need to establish reliable single-time-point concentration-dependent models to predict exposure is also important from cost and time perspectives. A simple unweighted linear regression model was developed to describe the relationship between Cmax and AUC for fexofenadine, losartan, EXP3174, itraconazole and hydroxyitraconazole. The fold difference, defined as the quotient of the observed and predicted AUC values, was evaluated along with a statistical comparison of the predicted versus observed values. The correlation between Cmax and AUC was well established for all five drugs, with a correlation coefficient (r) ranging from 0.9130 to 0.9997. The majority of the predicted values for all five drugs (77%) were contained within a narrow boundary of 0.75- to 1.5-fold difference. The r values for observed versus predicted AUC were 0.9653 (n = 145), 0.8342 (n = 76), 0.9524 (n = 88), 0.9339 (n = 89) and 0.9452 (n = 66) for fexofenadine, losartan, EXP3174, itraconazole and hydroxyitraconazole, respectively. Cmax versus AUC relationships were established for all drugs and were amenable to a limited sampling strategy for AUC prediction. However, fexofenadine, EXP3174 and hydroxyitraconazole may be the most amenable to AUC prediction from a single time-point concentration, as judged by the various criteria applied in this study.
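
    A minimal sketch of the single-time-point approach described above: fit an unweighted least-squares line of AUC on Cmax, predict AUC from Cmax alone, and flag predictions outside the 0.75- to 1.5-fold band. The paired values below are placeholders, not data from the study.

    ```python
    import numpy as np

    # Placeholder paired observations (Cmax, AUC) for one drug.
    cmax = np.array([150., 220., 310., 400., 520., 610.])
    auc  = np.array([900., 1300., 1900., 2500., 3100., 3800.])

    # Unweighted least-squares line AUC = slope * Cmax + intercept.
    slope, intercept = np.polyfit(cmax, auc, 1)
    pred = slope * cmax + intercept

    # Fold difference = observed / predicted; 0.75- to 1.5-fold treated as acceptable.
    fold = auc / pred
    within = (fold >= 0.75) & (fold <= 1.5)
    r = np.corrcoef(auc, pred)[0, 1]
    print(f"slope={slope:.3g}, intercept={intercept:.3g}, r={r:.4f}, "
          f"{within.mean():.0%} within 0.75-1.5x")
    ```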

  15. Digital audio watermarking using moment-preserving thresholding

    NASA Astrophysics Data System (ADS)

    Choi, DooSeop; Jung, Hae Kyung; Choi, Hyuk; Kim, Taejeong

    2007-09-01

    The moment-preserving thresholding (MPT) technique has been used in digital image processing for decades, especially in image binarization and image compression. Its main strength is that the two output values it produces, called representative values, are usually unaffected when the signal being thresholded goes through a signal processing operation. The two representative values, together with the threshold value, are obtained by solving the system of preservation equations for the first, second, and third moments. Relying on this robustness of the representative values to the various signal processing attacks considered in the watermarking context, this paper proposes a new watermarking scheme for audio signals. The watermark is embedded in the root-sum-square (RSS) of the two representative values of each signal block using the quantization technique. As a result, the RSS values are modified by scaling the signal according to the watermark bit sequence under the constraint of inaudibility relative to the human psycho-acoustic model. We also address and suggest solutions to the problems of synchronization and power scaling attacks. Experimental results show that the proposed scheme maintains high audio quality and robustness to various attacks including MP3 compression, re-sampling, jittering, and DA/AD conversion.
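
    A generic formulation of two-level moment-preserving thresholding can be sketched by solving the moment-preservation equations numerically for the two representative values and the class fraction, and then taking their root-sum-square as the quantity to be quantized. This is an illustrative reading of the standard MPT construction, not the authors' exact implementation; the random test block is a placeholder.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    def mpt_representatives(x):
        """Solve p0*z0**j + (1 - p0)*z1**j = m_j for j = 1, 2, 3, where m_j are
        the sample moments of the block x (generic two-level MPT)."""
        m1, m2, m3 = (np.mean(x**j) for j in (1, 2, 3))

        def equations(v):
            p0, z0, z1 = v
            return (p0*z0 + (1 - p0)*z1 - m1,
                    p0*z0**2 + (1 - p0)*z1**2 - m2,
                    p0*z0**3 + (1 - p0)*z1**3 - m3)

        std = np.std(x) + 1e-12
        p0, z0, z1 = fsolve(equations, (0.5, m1 - std, m1 + std))
        return p0, min(z0, z1), max(z0, z1)

    # Root-sum-square of the representatives: the quantity the watermark quantizes.
    block = np.random.default_rng(0).normal(0.0, 0.2, 1024)
    p0, z0, z1 = mpt_representatives(block)
    print(p0, z0, z1, np.hypot(z0, z1))
    ```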

  16. Spatial difference in genetic variation for fenitrothion tolerance between local populations of Daphnia galeata in Lake Kasumigaura, Japan.

    PubMed

    Mano, Hiroyuki; Tanaka, Yoshinari

    2017-12-01

    This study examines the spatial difference in genetic variation for tolerance to a pesticide, fenitrothion, in Daphnia galeata at field sites in Lake Kasumigaura, Japan. We estimated genetic values of isofemale lines, established from dormant eggs of D. galeata collected at field sampling sites, by applying the toxicant threshold model to acute toxicity data. We compared genetic values and variances and broad-sense heritability across different sites in the lake. Results showed that the mean values of tolerance to fenitrothion did not differ spatially. The variance in genetic value and the heritability of fenitrothion tolerance significantly differed between sampling sites, revealing that the long-term ecological risk of fenitrothion may differ between local populations in the lake. These results have implications for aquatic toxicology research, suggesting that differences in genetic variation of tolerance to a chemical among local populations must be considered for understanding the long-term ecological risks of the chemical over a large geographic area.

  17. Caries-removal effectiveness of a papain-based chemo-mechanical agent: A quantitative micro-CT study.

    PubMed

    Neves, Aline A; Lourenço, Roseane A; Alves, Haimon D; Lopes, Ricardo T; Primo, Laura G

    2015-01-01

    The aim of this study was to assess the effectiveness and specificity of a papain-based chemo-mechanical caries-removal agent in providing minimum residual caries after cavity preparation. To this end, extracted carious molars were selected and scanned in a micro-CT before and after caries-removal procedures with the papain-based gel. Similar parameters for acquisition and reconstruction of the image stacks were used between the scans. After classification of the dentin substrate based on mineral density intervals and establishment of a carious tissue threshold, volumetric parameters related to effectiveness (mineral density of removed dentin volume and residual dentin tissue) and specificity (relation between carious dentin in removed volume and initial caries) of this caries-removal agent were obtained. In general, the removed dentin volume was similar to or higher than the initial carious volume, indicating that the method was able to effectively remove dentin tissue. Samples with an almost perfect accuracy in carious dentin removal also showed an increased removal of caries-affected tissue. In contrast, little or no affected dentin was removed in samples where some carious tissue was left in residual dentin. Mineral density values in residual dentin were always higher than or similar to the threshold for mineral density values in carious dentin. In conclusion, the papain-based gel was effective in removing carious dentin up to a conservative in vitro threshold. Lesion characteristics, such as activity and the morphology of the enamel lesion, may also influence the caries-removal properties of the method. © Wiley Periodicals, Inc.

  18. Phenotypes and clinical context of hypercontractility in high resolution esophageal pressure topography (EPT)

    PubMed Central

    Roman, Sabine; Pandolfino, John E; Chen, Joan; Boris, Lubomyr; Luger, Daniel; Kahrilas, Peter J

    2013-01-01

    Background & Aims This study aimed to refine the criteria for esophageal hypercontractility in high resolution esophageal pressure topography (EPT) and examine the clinical context in which it occurs. Subjects & Methods 72 control subjects were used to define the threshold for hypercontractility as a distal contractile integral (DCI) greater than that observed in normal subjects. 2,000 consecutive EPT studies were reviewed to find patients exceeding this threshold. Concomitant EPT and clinical variables were explored. Results The greatest DCI value observed in any swallow among the control subjects was 7,732 mmHg-s-cm; the threshold for hypercontractility was established as a swallow with DCI >8,000 mmHg-s-cm. 44 patients were identified with a median maximal DCI of 11,077 mmHg-s-cm, all with normal contractile propagation and normal distal contractile latency, thereby excluding achalasia and distal esophageal spasm. Hypercontractility was associated with multipeaked contractions in 82% of instances, leading to the name "Jackhammer Esophagus." Dysphagia was the dominant symptom, although subsets of patients had hypercontractility in the context of EGJ outflow obstruction, reflux disease, or as an apparent primary motility disorder. Conclusion We describe an extreme phenotype of hypercontractility characterized in EPT by the occurrence of at least a single contraction with DCI > 8,000 mmHg-s-cm, a value not encountered in control subjects. This phenomenon, branded "Jackhammer Esophagus," was usually accompanied by dysphagia and occurred both in association with other esophageal pathology (EGJ outflow obstruction, reflux disease) and as an isolated motility disturbance. Further studies are required to define the pathophysiology and treatment of this disorder. PMID:21931377

  19. 77 FR 38729 - Alternate Tonnage Threshold for Oil Spill Response Vessels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ...The Coast Guard is establishing an alternate size threshold based on the measurement system established under the International Convention on Tonnage Measurement of Ships, 1969, for oil spill response vessels, which are properly certificated under 46 CFR chapter I, subchapter L. The present size threshold of 500 gross register tons is based on the U.S. regulatory measurement system. This final rule provides an alternative for owners and operators of offshore supply vessels that may result in an increase in oil spill response capacity and capability. This final rule adopts, without change, the interim rule amending 46 CFR part 126 published in the Federal Register on Monday, December 12, 2011.

  20. Oxygen saturation in healthy children aged 5 to 16 years residing in Huayllay, Peru at 4340 m.

    PubMed

    Schult, Sandra; Canelo-Aybar, Carlos

    2011-01-01

    Hypoxemia is a major life-threatening complication of childhood pneumonia. The threshold points for hypoxemia vary with altitude. However, few published data describe that normal range of variation. The purpose of this study was to establish reference values of normal mean Sao(2) levels and an approximate cutoff point to define hypoxemia for clinical purposes above 4300 meters above sea level (masl). Children aged 5 to 16 yr were examined during primary care visits at the Huayllay Health Center. Huayllay is a rural community located at 4340 m in the province of Pasco in the Peruvian Andes. We collected basic sociodemographic data and evaluated three outcomes: arterial oxygen saturation (Sao(2)) with a pulse oximeter, heart rate, and respiratory rate. Comparisons of main outcomes among age groups (5-6, 7-8, 9-10, 11-12, 13-14, and 15-16 yr) and sex were performed using linear regression models. The correlation of Sao(2) with heart rate and respiration rate was established by Pearson's correlation test. We evaluated 583 children, of whom 386 were included in the study. The average age was 10.3 yr; 55.7% were female. The average Sao(2), heart rate, and respiratory rate were 85.7% (95% CI: 85.2-86.2), 80.4/min (95% CI: 79.0-81.9), and 19.9/min (95% CI: 19.6-20.2), respectively. Sao(2) increased with age (p < 0.001). No differences by sex were observed. The mean minus two standard deviations of Sao(2) (threshold point for hypoxemia) ranged from 73.8% to 81.8% by age group. At 4300 m, the reference values for hypoxemia may be 14.2% lower than at sea level. This difference must be considered when diagnosing hypoxemia or deciding oxygen supplementation at high altitude. Other studies are needed to determine whether this reference value is appropriate for clinical use.

  1. Dependence of Interfacial Excess on the Threshold Value of the Isoconcentration Surface

    NASA Technical Reports Server (NTRS)

    Yoon, Kevin E.; Noebe, Ronald D.; Hellman, Olof C.; Seidman, David N.

    2004-01-01

    The proximity histogram (or proxigram for short) is used for analyzing data collected by a three-dimensional atom probe microscope. The interfacial excess of Re (2.41 +/- 0.68 atoms/sq nm) is calculated by employing a proxigram in a completely geometrically independent way for gamma/gamma' interfaces in Rene N6, a third-generation single-crystal Ni-based superalloy. A possible dependence of interfacial excess on the variation of the threshold value of an isoconcentration surface is investigated using the data collected for Rene N6 alloy. It is demonstrated that the dependence of the interfacial excess value on the threshold value of the isoconcentration surface is weak.

  2. Sequential monitoring of beach litter using webcams.

    PubMed

    Kako, Shin'ichiro; Isobe, Atsuhiko; Magome, Shinya

    2010-05-01

    This study attempts to establish a system for the sequential monitoring of beach litter using webcams placed at the Ookushi beach, Goto Islands, Japan, and to determine the temporal variability in the quantities of beach litter every 90 min over a one-and-a-half-year period. The time series of the quantities of beach litter, computed by counting pixels with a lightness greater than a threshold value in the photographs, shows that litter does not increase monotonically on the beach, but fluctuates mainly on a monthly time scale or less. To investigate what factors influence this variability, the time derivative of the quantity of beach litter is compared with satellite-derived wind speeds. It is found that the beach litter quantities vary largely with winds, but there may be other influencing factors. (c) 2010 Elsevier Ltd. All rights reserved.
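
    The litter proxy described above, counting pixels brighter than a lightness threshold in each webcam frame, can be sketched as follows. The grayscale conversion and the threshold of 200 are assumptions for illustration, not the values used in the study.

    ```python
    import numpy as np
    from PIL import Image

    def bright_pixel_count(image_path, lightness_threshold=200):
        """Count pixels whose lightness exceeds a threshold (proxy for litter cover).

        Lightness is approximated by an 8-bit grayscale conversion; the threshold
        of 200 is an illustrative value, not the study's."""
        gray = np.asarray(Image.open(image_path).convert("L"))
        return int((gray > lightness_threshold).sum())

    # A litter time series is then this count evaluated for each 90-minute frame.
    ```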

  3. Simplified 4-item criteria for polycystic ovary syndrome: A bridge too far?

    PubMed

    Indran, Inthrani R; Huang, Zhongwei; Khin, Lay Wai; Chan, Jerry K Y; Viardot-Foucault, Veronique; Yong, Eu Leong

    2018-05-30

    Although the Rotterdam 2003 polycystic ovarian syndrome (PCOS) diagnostic criteria are widely used, the need to consider multiple variables makes them unwieldy in clinical practice. We propose simplified PCOS criteria wherein a diagnosis is made if two of the following three items are present: (i) oligomenorrhoea, (ii) anti-mullerian hormone (AMH) above threshold and/or (iii) hyperandrogenism defined as either testosterone above threshold and/or the presence of hirsutism. This prospective cross-sectional study consists of healthy women (n = 157) recruited at an annual hospital health screen for staff and volunteers from the university community, and a patient cohort (n = 174) comprising women referred for suspected PCOS. We used the healthy cohort to establish threshold values for serum testosterone, antral follicle counts (AFC), ovarian volume (OV) and AMH. Women from the patient cohort, classified as PCOS by the simplified PCOS criteria, AMH alone and Rotterdam 2003, were compared with respect to prevalence of oligomenorrhoea, hyperandrogenism and metabolic indices. In healthy women, testosterone ≥1.89 nmol/L, AFC ≥22 follicles and OV ≥8.44 mL best predicted oligomenorrhoea and were used as threshold values for the PCOS criteria. An AMH level ≥37.0 pmol/L best predicted polycystic ovarian morphology. AMH alone as a single biomarker demonstrated poor specificity (58.9%) for PCOS compared to Rotterdam 2003. In contrast, there was a 94% overlap in women selected as PCOS by the simplified PCOS criteria and Rotterdam 2003. The population characteristics of these two groups of PCOS women showed no significant mean differences in androgenic, ovarian, AMH and metabolic (BMI, HOMA-IR) variables. Our data support the simplified PCOS criteria with population-specific thresholds for diagnosis of PCOS. Its ability to replace ovarian ultrasound biometry with the highly correlated variable AMH, and its use of testosterone as a single marker for hyperandrogenaemia alongside the key symptoms of oligomenorrhoea and hirsutism, confers significant clinical potential for the diagnosis of PCOS. © 2018 John Wiley & Sons Ltd.
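
    The abstract reports population-specific cut-offs that "best predicted" oligomenorrhoea without stating the selection criterion; a common choice in such analyses is the ROC-derived Youden index, sketched below under that assumption.

    ```python
    import numpy as np

    def youden_threshold(values, labels):
        """Pick the cut-off maximizing sensitivity + specificity - 1 (Youden's J).

        values: biomarker (e.g., AMH); labels: 1 = condition present, 0 = absent."""
        values, labels = np.asarray(values, float), np.asarray(labels, int)
        best_t, best_j = None, -np.inf
        for t in np.unique(values):
            pred = values >= t
            sens = (pred & (labels == 1)).sum() / max((labels == 1).sum(), 1)
            spec = (~pred & (labels == 0)).sum() / max((labels == 0).sum(), 1)
            j = sens + spec - 1
            if j > best_j:
                best_t, best_j = t, j
        return best_t, best_j
    ```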

  4. A fuzzy optimal threshold technique for medical images

    NASA Astrophysics Data System (ADS)

    Thirupathi Kannan, Balaji; Krishnasamy, Krishnaveni; Pradeep Kumar Kenny, S.

    2012-01-01

    A new fuzzy-based thresholding method for medical images, especially cervical cytology images with blob and mosaic structures, is proposed in this paper. Many existing thresholding algorithms can segment either blob or mosaic images, but no single algorithm can do both. In the proposed method, an input cervical cytology image is binarized and preprocessed, the pixel value with the minimum Fuzzy Gaussian Index is identified as the optimal threshold value, and this threshold is used for segmentation. The proposed technique is tested on various cervical cytology images with blob or mosaic structures and is shown to perform better than various existing algorithms.

  5. Low-threshold field emission in planar cathodes with nanocarbon materials

    NASA Astrophysics Data System (ADS)

    Zhigalov, V.; Petukhov, V.; Emelianov, A.; Timoshenkov, V.; Chaplygin, Yu.; Pavlov, A.; Shamanaev, A.

    2016-12-01

    Nanocarbon materials are of great interest as field emission cathodes due to their low threshold voltage. In this work, the current-voltage characteristics of nanocarbon electrodes were studied. Low-threshold emission was found in planar samples where field enhancement is negligible (<10). Electron work function values calculated from Fowler-Nordheim theory are anomalously low (<1 eV) and conflict with the directly measured work function values of the fabricated planar samples (4.1-4.4 eV). The non-applicability of Fowler-Nordheim theory to these nanocarbon materials was thus confirmed. The reasons for low-threshold emission in nanocarbon materials are discussed.

  6. Visions for a Pan-European digital data infrastructure for groundwater quantity and quality data relevant for implementation of the Water Framework Directive.

    NASA Astrophysics Data System (ADS)

    Hinsby, Klaus; Broers, Hans Peter

    2014-05-01

    The EU Water Framework and Groundwater Directives stipulate that EU member states (MS) should ensure good groundwater chemical and quantitative status by 2015. For the assessment of good chemical status the MS have to establish Natural Background Levels (NBLs) and Threshold Values (TVs) for groundwater bodies at risk and compare current concentration levels to these. In addition the MS shall ensure trend reversals in cases where contaminants or water levels show critical increasing or decreasing trends. The EU MS have to demonstrate that the quantitative and chemical status of their groundwater bodies does not put drinking water, ecosystems or other legitimate uses at risk. Easy on-line access to relevant visualizations of groundwater quality and quantity data of e.g. nitrate, chloride, arsenic and water tables in Europe's major aquifer types, compiled from national databases, would be of great importance for managers, authorities and scientists conducting risk and status assessments. The Water Resources Expert Group of the EuroGeoSurveys proposes to develop Pan-European interactive on-line digital maps and visualizations of concentration levels and trends, as well as calculated natural background levels and threshold values for the most important aquifer types of Europe, derived mainly from principles established in the former EU project "BRIDGE" - Background cRiteria for the IDentification of Groundwater Thresholds. Further, we propose to develop Pan-European digital and dynamic maps and cross sections in close collaboration with ecologists, which delineate dependent or associated terrestrial and aquatic ecosystems across Europe where groundwater quantity and quality play a significant role in sustaining good ecological status of the ecosystem, and where the water resources and ecosystems are most vulnerable to climate change. Finally, integrated water resources management requires integrated consideration of both deep and shallow groundwater and surface water issues and their interaction. It is therefore proposed to map regions of Europe that use coupled groundwater-surface water models in integrated water resources and river basin management. In the presentation we will show selected examples of data visualizations of importance to integrated water resources and river basin management and the implementation of the Water Framework Directive.

  7. Bayesian methods for estimating GEBVs of threshold traits

    PubMed Central

    Wang, C-L; Ding, X-D; Wang, J-Y; Liu, J-F; Fu, W-X; Zhang, Z; Yin, Z-J; Zhang, Q

    2013-01-01

    Estimation of genomic breeding values is the key step in genomic selection (GS). Many methods have been proposed for continuous traits, but methods for threshold traits are still scarce. Here we introduced the threshold model into the framework of GS; specifically, we extended the three Bayesian methods BayesA, BayesB and BayesCπ on the basis of the threshold model for estimating genomic breeding values of threshold traits, and the extended methods are correspondingly termed BayesTA, BayesTB and BayesTCπ. Computing procedures for the three BayesT methods using a Markov chain Monte Carlo algorithm were derived. A simulation study was performed to investigate the benefit of the presented methods in the accuracy of the genomic estimated breeding values (GEBVs) for threshold traits. Factors affecting the performance of the three BayesT methods were addressed. As expected, the three BayesT methods generally performed better than the corresponding normal Bayesian methods, in particular when the number of phenotypic categories was small. In the standard scenario (number of categories=2, incidence=30%, number of quantitative trait loci=50, h2=0.3), the accuracies were improved by 30.4, 2.4, and 5.7 percentage points, respectively. In most scenarios, BayesTB and BayesTCπ generated similar accuracies and both performed better than BayesTA. In conclusion, our work showed that the threshold model fits well for predicting GEBVs of threshold traits, and BayesTCπ appears to be the method of choice for GS of threshold traits. PMID:23149458

  8. Method and apparatus for analog pulse pile-up rejection

    DOEpatents

    De Geronimo, Gianluigi

    2013-12-31

    A method and apparatus for pulse pile-up rejection are disclosed. The apparatus comprises a delay value application constituent configured to receive a threshold-crossing time value, and provide an adjustable value according to a delay value and the threshold-crossing time value; and a comparison constituent configured to receive a peak-occurrence time value and the adjustable value, compare the peak-occurrence time value with the adjustable value, indicate pulse acceptance if the peak-occurrence time value is less than or equal to the adjustable value, and indicate pulse rejection if the peak-occurrence time value is greater than the adjustable value.

  9. Method and apparatus for analog pulse pile-up rejection

    DOEpatents

    De Geronimo, Gianluigi

    2014-11-18

    A method and apparatus for pulse pile-up rejection are disclosed. The apparatus comprises a delay value application constituent configured to receive a threshold-crossing time value, and provide an adjustable value according to a delay value and the threshold-crossing time value; and a comparison constituent configured to receive a peak-occurrence time value and the adjustable value, compare the peak-occurrence time value with the adjustable value, indicate pulse acceptance if the peak-occurrence time value is less than or equal to the adjustable value, and indicate pulse rejection if the peak-occurrence time value is greater than the adjustable value.
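
    The acceptance rule described in the two records above reduces to a single comparison. The sketch below assumes the adjustable value is the threshold-crossing time plus the configured delay, which is one plausible reading of "according to a delay value and the threshold-crossing time value"; the numeric delay is a placeholder.

    ```python
    def accept_pulse(threshold_crossing_time, peak_time, delay):
        """Accept a pulse if its peak occurs no later than the adjustable value
        (here taken as threshold-crossing time + delay); otherwise flag pile-up."""
        adjustable = threshold_crossing_time + delay
        return peak_time <= adjustable

    # Example: with a 120 ns delay, a peak 200 ns after threshold crossing is rejected.
    print(accept_pulse(threshold_crossing_time=0.0, peak_time=200e-9, delay=120e-9))  # False
    ```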

  10. Psychometric properties of the Chinese version of resilience scale specific to cancer: an item response theory analysis.

    PubMed

    Ye, Zeng Jie; Liang, Mu Zi; Zhang, Hao Wei; Li, Peng Fei; Ouyang, Xue Ren; Yu, Yuan Liang; Liu, Mei Ling; Qiu, Hong Zhong

    2018-06-01

    Classical test theory has been used to develop and validate the 25-item Resilience Scale Specific to Cancer (RS-SC) in Chinese patients with cancer. This study was designed to provide additional information about the discriminative value of the individual items, tested with an item response theory analysis. A two-parameter graded response model was fitted to examine whether any items of the RS-SC exhibited problems with the ordering and steps of thresholds, as well as the ability of items to discriminate between patients with different resilience levels using item characteristic curves. A sample of 214 Chinese patients with a cancer diagnosis was analyzed. The established three-dimensional structure of the RS-SC was confirmed. Several items showed problematic thresholds or discrimination ability and require further revision. Some problematic items should be refined, and a short form of the RS-SC may be feasible in clinical settings to reduce the burden on patients. However, the generalizability of these findings warrants further investigation.

  11. Threshold Dynamics of a Temperature-Dependent Stage-Structured Mosquito Population Model with Nested Delays.

    PubMed

    Wang, Xiunan; Zou, Xingfu

    2018-05-21

    Mosquito-borne diseases remain a significant threat to public health and economies. Since mosquitoes are quite sensitive to temperature, global warming may not only worsen disease transmission in current endemic areas but also help mosquito populations, together with pathogens, become established in new regions. Therefore, understanding mosquito population dynamics under the impact of temperature is considerably important for making disease control policies. In this paper, we develop a stage-structured mosquito population model in the environment of a temperature-controlled experiment. The model turns out to be a system of periodic delay differential equations with periodic delays. We show that the basic reproduction number is a threshold parameter that determines whether the mosquito population goes extinct or remains persistent. We then estimate the parameter values for Aedes aegypti, the mosquito that transmits dengue virus. We verify the analytic result by numerical simulations with temperature data from Colombo, Sri Lanka, where a dengue outbreak occurred in 2017.

  12. Experimental establishment of the erosion nature of the pulsed low-threshold optical breakdown of air near the surface

    NASA Astrophysics Data System (ADS)

    Min'ko, L. Ia.; Chumakov, A. N.; Chivel', Iu. A.

    1988-08-01

    Nanosecond kinetic spectroscopy methods are used to establish the erosion nature of the pulsed low-threshold optical breakdown of air near the surface upon exposure of certain metals (indium, lead) to microsecond neodymium and CO2 laser radiation. It is shown that this optical breakdown of air by CO2 laser radiation is accompanied by the formation of a plasma spectrum which is optically thin in the visible range.

  13. Threshold Phenomenon for the Quintic Wave Equation in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Krieger, Joachim; Nakanishi, Kenji; Schlag, Wilhelm

    2014-04-01

    For the critical focusing wave equation in the radial case, we establish the role of the "center stable" manifold constructed in Krieger and Schlag (Am J Math 129(3):843-913, 2007) near the ground state (W, 0) as a threshold between blowup and scattering to zero, thereby settling a conjecture going back to numerical work by Bizoń et al. (Nonlinearity 17(6):2187-2201, 2004). The underlying topology is stronger than the energy norm.

  14. Can adaptive threshold-based metabolic tumor volume (MTV) and lean body mass corrected standard uptake value (SUL) predict prognosis in head and neck cancer patients treated with definitive radiotherapy/chemoradiotherapy?

    PubMed

    Akagunduz, Ozlem Ozkaya; Savas, Recep; Yalman, Deniz; Kocacelebi, Kenan; Esassolak, Mustafa

    2015-11-01

    To evaluate the predictive value of adaptive threshold-based metabolic tumor volume (MTV), maximum standardized uptake value (SUVmax) and maximum lean body mass corrected SUV (SULmax) measured on pretreatment positron emission tomography and computed tomography (PET/CT) imaging in head and neck cancer patients treated with definitive radiotherapy/chemoradiotherapy. Pretreatment PET/CT scans of the 62 patients with locally advanced head and neck cancer who were treated consecutively between May 2010 and February 2013 were reviewed retrospectively. The maximum FDG uptake of the primary tumor was defined according to SUVmax and SULmax. Multiple threshold levels between 60% and 10% of the SUVmax and SULmax were tested with intervals of 5% to 10% in order to define the most suitable threshold value for the metabolic activity of each patient's tumor (adaptive threshold). MTV was calculated according to this value. We evaluated the relationship of mean values of MTV, SUVmax and SULmax with treatment response, local recurrence, distant metastasis and disease-related death. Receiver-operating characteristic (ROC) curve analysis was done to obtain optimal predictive cut-off values for MTV and SULmax, which were found to have a predictive value. Local recurrence-free (LRFS), disease-free (DFS) and overall survival (OS) were examined according to these cut-offs. Forty-six patients had complete response, 15 had partial response, and 1 had stable disease 6 weeks after the completion of treatment. Median follow-up of the entire cohort was 18 months. Of the 46 complete responders, 10 had local recurrence, and of the 16 partial or non-responders, 10 had local progression. Eighteen patients died. Adaptive threshold-based MTV had significant predictive value for treatment response (p=0.011), local recurrence/progression (p=0.050), and disease-related death (p=0.024). SULmax had a predictive value for local recurrence/progression (p=0.030). ROC curve analysis revealed a cut-off value of 14.00 mL for MTV and 10.15 for SULmax. Three-year LRFS and DFS rates were significantly lower in patients with MTV ≥ 14.00 mL (p=0.026, p=0.018 respectively), and SULmax ≥ 10.15 (p=0.017, p=0.022 respectively). SULmax did not have a significant predictive value for OS whereas MTV did (p=0.025). Adaptive threshold-based MTV and SULmax could have a role in predicting local control and survival in head and neck cancer patients. Copyright © 2015 Elsevier Inc. All rights reserved.
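
    Once a patient-specific (adaptive) threshold fraction has been chosen, the MTV reduces to counting voxels at or above that fraction of SUVmax (or SULmax) and multiplying by the voxel volume. The fraction, the synthetic volume of interest and the voxel size in this sketch are placeholders, not the study's values.

    ```python
    import numpy as np

    def metabolic_tumor_volume(suv_map, threshold_fraction, voxel_volume_ml):
        """MTV (mL): volume of voxels with SUV >= threshold_fraction * SUVmax.

        suv_map: 3-D array of SUV (or SUL) values inside the tumor VOI.
        threshold_fraction: patient-specific ("adaptive") fraction, e.g. 0.40."""
        cutoff = threshold_fraction * suv_map.max()
        return (suv_map >= cutoff).sum() * voxel_volume_ml

    # Example with a synthetic VOI and 4 x 4 x 4 mm voxels (0.064 mL each).
    voi = np.random.default_rng(1).uniform(0.5, 12.0, size=(20, 20, 20))
    print(metabolic_tumor_volume(voi, threshold_fraction=0.40, voxel_volume_ml=0.064))
    ```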

  15. Heavy metals in intensive greenhouse vegetable production systems along Yellow Sea of China: Levels, transfer and health risk.

    PubMed

    Hu, Wenyou; Huang, Biao; Tian, Kang; Holm, Peter E; Zhang, Yanxia

    2017-01-01

    Recently, greenhouse vegetable production (GVP) has grown rapidly and accounts for a large proportion of vegetable production in China. In this study, the accumulation, health risk and threshold values of selected heavy metals were evaluated systematically. A total of 120 paired soil and vegetable samples were collected from three typical intensive GVP systems along the Yellow Sea of China. Mean concentrations of Cd, As, Hg, Pb, Cu and Zn in greenhouse soils were 0.21, 7.12, 0.05, 19.81, 24.95 and 94.11 mg kg(-1), respectively. Compared to rootstalk and fruit vegetables, leafy vegetables had relatively high concentrations and transfer factors of heavy metals. The accumulation of heavy metals in soils was affected by soil pH and soil organic matter. The calculated hazard quotients (HQ) of the heavy metals from vegetable consumption decreased in the order leafy > rootstalk > fruit vegetables, with hazard index (HI) values of 0.61, 0.33 and 0.26, respectively. The HI values were all below 1, which indicates a low risk from greenhouse vegetable consumption. Soil threshold values (STVs) of heavy metals in the GVP system were established according to the health risk assessment. The relatively lower transfer factors of rootstalk and fruit vegetables and the higher STVs suggest that these types of vegetables are more suitable for cultivation in greenhouse soils. This study will provide a useful reference for controlling heavy metals and developing sustainable GVP. Copyright © 2016 Elsevier Ltd. All rights reserved.
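
    The hazard quotient and hazard index follow the usual health-risk formulation: the estimated daily intake of each metal through vegetable consumption divided by its reference dose, summed over metals. The intake rate, body weight, concentrations and reference doses in this sketch are placeholders used only to show the structure of the calculation, not the study's parameters.

    ```python
    def hazard_index(concentrations_mg_per_kg, reference_doses_mg_per_kg_day,
                     intake_kg_per_day=0.3, body_weight_kg=60.0):
        """Hazard quotients (HQ = daily intake / RfD) per metal and their sum (HI).

        concentrations: metal -> mg/kg fresh vegetable; reference_doses: metal -> RfD.
        Intake and body weight are illustrative defaults, not the study's values."""
        hq = {}
        for metal, c in concentrations_mg_per_kg.items():
            daily_intake = c * intake_kg_per_day / body_weight_kg   # mg/kg bw/day
            hq[metal] = daily_intake / reference_doses_mg_per_kg_day[metal]
        return hq, sum(hq.values())

    # Placeholder numbers only, to show the structure of the calculation.
    hq, hi = hazard_index({"Cd": 0.02, "Pb": 0.05}, {"Cd": 1e-3, "Pb": 3.5e-3})
    print(hq, hi)
    ```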

  16. Early pregnancy assessment with transvaginal ultrasound scanning.

    PubMed Central

    Daya, S; Woods, S; Ward, S; Lappalainen, R; Caco, C

    1991-01-01

    OBJECTIVE: To establish normal parameters in early pregnancy through transvaginal ultrasonography so that gestational age can be determined and to correlate the sonographic findings with serum human chorionic gonadotropin (hCG) levels calibrated against the first international reference preparation standard. SETTING: Infertility clinic. PATIENTS: Thirty-five women with normal intrauterine pregnancy. INTERVENTIONS: Serial measurement of the serum hCG level and the diameter of the gestational sac through transvaginal ultrasonography. MAIN RESULTS: The gestational sac could not be visualized when the hCG level was less than 1100 IU/L. The average growth rate of the sac was 0.9 mm/d. The threshold values for sac diameter, serum hCG level and gestational age below which the yolk sac was not visible were 3.7 mm, 1900 IU/L and 36 days respectively; the corresponding values above which the yolk sac was always visible were 6.7 mm, 5800 IU/L and 40 days. The threshold values below which cardiac activity was not visible were 8.3 mm, 9200 IU/L and 41 days respectively, and the corresponding values above which cardiac activity was always visible were 14.0 mm, 24,000 IU/L and 46 days. The mean gestational ages and the 95% confidence and prediction intervals were tabulated so that measurement of the gestational sac diameter could be used to estimate gestational age early in normal pregnancy. CONCLUSIONS: Transvaginal ultrasonography enables detection of an intrauterine sac and reliable estimation of gestational age on the basis of sac dimensions before an embryo can be seen. PMID:1993291

  17. The impact of manual threshold selection in medical additive manufacturing.

    PubMed

    van Eijnatten, Maureen; Koivisto, Juha; Karhu, Kalle; Forouzanfar, Tymour; Wolff, Jan

    2017-04-01

    Medical additive manufacturing requires standard tessellation language (STL) models. Such models are commonly derived from computed tomography (CT) images using thresholding. Threshold selection can be performed manually or automatically. The aim of this study was to assess the impact of manual and default threshold selection on the reliability and accuracy of skull STL models using different CT technologies. One female and one male human cadaver head were imaged using multi-detector row CT, dual-energy CT, and two cone-beam CT scanners. Four medical engineers manually thresholded the bony structures on all CT images. The lowest and highest selected mean threshold values and the default threshold value were used to generate skull STL models. Geometric variations between all manually thresholded STL models were calculated. Furthermore, in order to calculate the accuracy of the manually and default thresholded STL models, all STL models were superimposed on an optical scan of the dry female and male skulls ("gold standard"). The intra- and inter-observer variability of the manual threshold selection was good (intra-class correlation coefficients >0.9). All engineers selected grey values closer to soft tissue to compensate for bone voids. Geometric variations between the manually thresholded STL models were 0.13 mm (multi-detector row CT), 0.59 mm (dual-energy CT), and 0.55 mm (cone-beam CT). All STL models demonstrated inaccuracies ranging from -0.8 to +1.1 mm (multi-detector row CT), -0.7 to +2.0 mm (dual-energy CT), and -2.3 to +4.8 mm (cone-beam CT). This study demonstrates that manual threshold selection results in better STL models than default thresholding. The use of dual-energy CT and cone-beam CT technology in its present form does not deliver reliable or accurate STL models for medical additive manufacturing. New approaches are required that are based on pattern recognition and machine learning algorithms.

  18. Health impacts due to particulate air pollution in Volos City, Greece.

    PubMed

    Moustris, Konstantinos P; Proias, George T; Larissi, Ioanna K; Nastos, Panagiotis T; Koukouletsos, Konstantinos V; Paliatsos, Athanasios G

    2016-01-01

    There is broad consensus in the scientific community that suspended particulate matter is one of the most harmful pollutants, particularly inhalable particulate matter with an aerodynamic diameter of less than 10 μm (PM10), which causes respiratory health problems and heart disorders. Average daily concentrations exceeding established standard values appear to be the main cause of such health effects, occurring in particular during Saharan dust episodes, a natural phenomenon that degrades air quality in the urban area of Volos. In this study the AirQ2.2.3 model, developed by the World Health Organization (WHO) European Center for Environment and Health, was used to evaluate adverse health effects of PM10 pollution in the city of Volos during a 5-year period (2007-2011). Volos is a medium-sized coastal city in the Thessaly region. The city is located on the northern side of the Gulf of Pagassitikos, on the east coast of Central Greece. Air pollution data were obtained from a fully automated monitoring station, which was established by the Municipal Water Supply and Sewage Department in the Greater Area of Volos, located in the centre of the city. The results of the current study indicate that when the mean annual PM10 concentration exceeds the corresponding European Union (EU) threshold value, the number of hospital admissions for respiratory disease (HARD) is increased by 25% on average. There is also an estimated increase of about 2.5% in HARD compared to the expected annual HARD cases for Volos. Finally, a strong correlation was found between the number of days exceeding the EU daily threshold concentration ([PM10] ≥ 50 μg m(-3)) and the annual HARD cases.

  19. Drawing a baseline in aesthetic quality assessment

    NASA Astrophysics Data System (ADS)

    Rubio, Fernando; Flores, M. Julia; Puerta, Jose M.

    2018-04-01

    Aesthetic classification of images is an inherently subjective task. No validated collection of images or photographs labeled by experts as having good or bad quality exists. At present, the closest approximation is to use databases of photos in which a group of users rates each image. Hence, there is not a unique good/bad label but a rating distribution produced by user voting. Due to this peculiarity, the problem of binary aesthetic supervised classification cannot be stated as directly as in other computer vision tasks. Recent literature follows an approach in which researchers take the average rating from the users for each image and establish an arbitrary threshold to determine its class or label. In this way, images above the threshold are considered of good quality, while images below the threshold are considered of bad quality. This paper analyzes the current literature and reviews the attributes able to represent an image, differentiating them into three families: specific, general and deep features. Among those that have proved most competitive, we selected a representative subset, our main goal being to establish a clear experimental framework. Finally, once the features were selected, we used them on the full AVA dataset. For validation we report not only accuracy, which is not very informative in this case, but also metrics able to evaluate classification power on imbalanced datasets. We conducted a series of experiments in which several well-known classifiers were learned from the data; in this way, the paper provides valid baseline results for the given problem.

  20. Assessment of the performances of AcuStar HIT and the combination with heparin-induced multiple electrode aggregometry: a retrospective study.

    PubMed

    Minet, V; Bailly, N; Douxfils, J; Osselaer, J C; Laloy, J; Chatelain, C; Elalamy, I; Chatelain, B; Dogné, J M; Mullier, F

    2013-09-01

    Early diagnosis of immune heparin-induced thrombocytopenia (HIT) is challenging. HemosIL® AcuStar HIT and heparin-induced multiple electrode aggregometry (HIMEA) were recently proposed as rapid diagnostic methods. We conducted a study to assess the performance of AcuStar HIT-IgG (PF4-H) and AcuStar HIT-Ab (PF4-H). The secondary objective was to compare the performance of the combination of AcuStar HIT and HIMEA with standardised clinical diagnosis. Sera of 104 suspected HIT patients were retrospectively tested with AcuStar HIT. HIMEA was performed on available sera (n=81). The clinical diagnosis was established by analysing the patients' medical records in a standardised manner. These tests were also compared with PF4-Enhanced®, LTA, and SRA in subsets of patients. Thresholds were determined using ROC curve analysis with clinical outcome as reference. Using the recommended thresholds (1.00 AU), the negative predictive values (NPV) of HIT-IgG and HIT-Ab were 100.0% (95% CI: 95.9%-100.0% and 95.7%-100.0%). The positive predictive values (PPV) were 64.3% (95% CI: 35.1%-87.2%) and 45.0% (95% CI: 23.2%-68.6%), respectively. Using our thresholds (HIT-IgG: 2.89 AU, HIT-Ab: 9.41 AU), NPV of HIT-IgG and HIT-Ab were 100.0% (95% CI: 96.0%-100.0% and 96.1%-100.0%). PPV were 75.0% (95% CI: 42.7%-94.5%) and 81.8% (95% CI: 48.3%-97.7%), respectively. Of the 79 patients with a medium-high pretest probability score, 67 were negative using the HIT-IgG (PF4-H) test at our thresholds. HIMEA was performed on HIT-IgG positive patients. Using this combination, only one patient out of 79 was incorrectly diagnosed. AcuStar HIT showed good performance in excluding the diagnosis of HIT. Combination with HIMEA improves PPV. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. SU-F-R-11: Designing Quality and Safety Informatics Through Implementation of a CT Radiation Dose Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC

    2016-06-15

    Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015, totaling 27880, 28502, and 30631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%, 2.28%], [0.76%, 1.8%], and [0.94%, 1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.
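
    The yellow/amber alert scheme above derives thresholds from the 95th and 99th percentiles of historical dose data, stratified by protocol, scanner, and patient size. The sketch below shows one way such percentile thresholds could be computed; the record layout and names are assumptions for illustration, not the authors' implementation.

      import numpy as np
      from collections import defaultdict

      def dose_alert_thresholds(records):
          """Derive yellow (95th percentile) and amber (99th percentile) dose alert
          thresholds per (protocol, scanner, size_group) from historical exams.
          Each record is assumed to be (protocol, scanner, size_group, dose)."""
          grouped = defaultdict(list)
          for protocol, scanner, size_group, dose in records:
              grouped[(protocol, scanner, size_group)].append(dose)
          return {key: (np.percentile(doses, 95), np.percentile(doses, 99))
                  for key, doses in grouped.items()}

      # Illustrative use with synthetic historical data.
      history = [("chest", "CT1", "medium", d) for d in np.random.gamma(8, 1.5, 500)]
      print(dose_alert_thresholds(history)[("chest", "CT1", "medium")])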

  2. Cadence (steps/min) and intensity during ambulation in 6-20 year olds: the CADENCE-kids study.

    PubMed

    Tudor-Locke, Catrine; Schuna, John M; Han, Ho; Aguiar, Elroy J; Larrivee, Sandra; Hsia, Daniel S; Ducharme, Scott W; Barreira, Tiago V; Johnson, William D

    2018-02-26

    Steps/day is widely utilized to estimate the total volume of ambulatory activity, but it does not directly reflect intensity, a central tenet of public health guidelines. Cadence (steps/min) represents an overlooked opportunity to describe the intensity of ambulatory activity. We sought to establish thresholds linking directly observed cadence with objectively measured intensity in 6-20 year olds. One hundred twenty participants completed multiple 5-min bouts on a treadmill, from 13.4 m/min (0.80 km/h) to 134.0 m/min (8.04 km/h). The protocol was terminated when participants naturally transitioned to running, or if they chose to not continue. Steps were visually counted and intensity was objectively measured using a portable metabolic system. Youth metabolic equivalents (METy) were calculated for 6-17 year olds, with moderate intensity defined as ≥4 and < 6 METy, and vigorous intensity as ≥6 METy. Traditional METs were calculated for 18-20 year olds, with moderate intensity defined as ≥3 and < 6 METs, and vigorous intensity defined as ≥6 METs. Optimal cadence thresholds for moderate and vigorous intensity were identified using segmented random coefficients models and receiver operating characteristic (ROC) curves. Participants were on average (± SD) aged 13.1 ± 4.3 years, weighed 55.8 ± 22.3 kg, and had a BMI z-score of 0.58 ± 1.21. Moderate intensity thresholds (from regression and ROC analyses) ranged from 128.4 steps/min among 6-8 year olds to 87.3 steps/min among 18-20 year olds. Comparable values for vigorous intensity ranged from 157.7 steps/min among 6-8 year olds to 119.3 steps/min among 18-20 year olds. Considering both regression and ROC approaches, heuristic cadence thresholds (i.e., evidence-based, practical, rounded) ranged from 125 to 90 steps/min for moderate intensity, and 155 to 125 steps/min for vigorous intensity, with higher cadences for younger age groups. Sensitivities and specificities for these heuristic thresholds ranged from 77.8 to 99.0%, indicating fair to excellent classification accuracy. These heuristic cadence thresholds may be used to prescribe physical activity intensity in public health recommendations. In the research and clinical context, these heuristic cadence thresholds have apparent value for accelerometer-based analytical approaches to determine the intensity of ambulatory activity.
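
    One of the two approaches used above to locate cadence thresholds is ROC analysis. The sketch below shows a generic ROC-based threshold pick using Youden's J; the example data and the use of scikit-learn are illustrative assumptions, not the study's analysis code.

      import numpy as np
      from sklearn.metrics import roc_curve

      def optimal_cadence_threshold(cadence, is_moderate_or_above):
          """Pick the cadence (steps/min) that best separates moderate-or-higher
          intensity bouts from lighter ones by maximizing Youden's J = TPR - FPR."""
          fpr, tpr, thresholds = roc_curve(is_moderate_or_above, cadence)
          return thresholds[np.argmax(tpr - fpr)]

      # Illustrative bouts: observed cadence and whether intensity reached moderate.
      cadence = np.array([80, 95, 110, 120, 128, 135, 150])
      labels  = np.array([0,  0,   0,   1,   1,   1,   1])
      print(optimal_cadence_threshold(cadence, labels))  # -> 120.0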

  3. Robust crop and weed segmentation under uncontrolled outdoor illumination

    USDA-ARS?s Scientific Manuscript database

    A new machine vision for weed detection was developed from RGB color model images. Processes included in the algorithm for the detection were excessive green conversion, threshold value computation by statistical analysis, adaptive image segmentation by adjusting the threshold value, median filter, ...
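
    The steps listed in this record (excess-green conversion, a statistically derived threshold, adaptive segmentation and median filtering) follow a common vegetation-segmentation pattern. Below is a generic sketch of such a pipeline; the mean-plus-standard-deviation threshold rule and the fixed filter size are assumptions, not the USDA algorithm itself.

      import numpy as np
      from scipy.ndimage import median_filter

      def segment_vegetation(rgb):
          """Generic excess-green segmentation sketch.
          rgb: float array in [0, 1] with shape (H, W, 3)."""
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          exg = 2.0 * g - r - b                    # excess green index
          threshold = exg.mean() + exg.std()       # assumed statistical rule
          mask = exg > threshold                   # binary vegetation mask
          return median_filter(mask, size=3)       # suppress speckle noise

      print(segment_vegetation(np.random.rand(64, 64, 3)).sum())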

  4. [The prognostic value of cardio-pulmonary exercise test parameters in patients with asymptomatic ischemic heart dysfunction during 2-years observation].

    PubMed

    Skrzypek, Agnieszka; Nessler, Jadwiga

    2015-01-01

    Measurement of oxygen uptake at maximal exercise (VO2max) in the cardio-pulmonary exercise test provides the most reliable information about exercise tolerance. The aim was to establish the value of VO2peak, VE/CO2 and AT in the early diagnosis of asymptomatic heart dysfunction in patients with coronary artery disease (CAD) and their prognostic value during a 2-year observation. The study population comprised 57 patients (35 M) with CAD, without any signs or symptoms of heart dysfunction and without any features of myocardial infarction, aged 51.08 +/- 4.01 years. The analysis was performed twice: at baseline and after 2 years of observation. It included physical examinations, echocardiographic parameters [assessment of systolic and diastolic dysfunction of the left ventricle (LV)] and spiroergometric parameters (VO2peak, VE/CO2 at AT). On the basis of the echocardiographic examination, two groups of patients were created: Group A, patients with normal LV function (n=32; 56.2%; 23 M), aged 50.9 +/- 4 years, with VO2peak 28.8 +/- 6 ml/kg/min, VE/CO2 28.8 +/- 4.9 and AT 18 +/- 2.5; and Group B, patients with diastolic heart dysfunction (n=22; 38.6%; 10 M), aged 51.2 +/- 4.3 years, with VO2peak 26 +/- 3.4 ml/kg/min, VE/CO2 31.2 +/- 5.1 and AT 16 +/- 2.5. At baseline, a significant difference was established between the anaerobic threshold and the degree of heart dysfunction (p=0.039). (1) VO2 at AT and VE/CO2 were observed to depend on the LV filling profile and also on systolic LV function; the anaerobic threshold depended significantly on the LV filling pattern. (2) In asymptomatic patients with LV diastolic dysfunction and VO2peak < or = 18.4 ml/kg/min, progression of LV diastolic dysfunction was observed during the two years.

  5. Extended high-frequency thresholds in college students: effects of music player use and other recreational noise.

    PubMed

    Le Prell, Colleen G; Spankovich, Christopher; Lobariñas, Edward; Griffiths, Scott K

    2013-09-01

    Human hearing is sensitive to sounds from as low as 20 Hz to as high as 20,000 Hz in normal ears. However, clinical tests of human hearing rarely include extended high-frequency (EHF) threshold assessments at frequencies extending beyond 8000 Hz. EHF thresholds have been suggested for use in monitoring the earliest effects of noise on the inner ear, although the clinical usefulness of EHF threshold testing is not well established for this purpose. The primary objective of this study was to determine if EHF thresholds in healthy, young adult college students vary as a function of recreational noise exposure. A retrospective analysis of a laboratory database was conducted; all participants with both EHF threshold testing and noise history data were included. The potential for "preclinical" EHF deficits was assessed based on the measured thresholds, with the noise surveys used to estimate recreational noise exposure. EHF thresholds measured during participation in other ongoing studies were available from 87 participants (34 male and 53 female); all participants had hearing within normal clinical limits (≤25 dB HL) at conventional frequencies (0.25-8 kHz). EHF thresholds closely matched standard reference thresholds [ANSI S3.6 (1996) Annex C]. There were statistically reliable threshold differences in participants who used music players, with 3-6 dB worse thresholds at the highest test frequencies (10-16 kHz) in participants who reported long-term use of music player devices (>5 yr) or higher listening levels during music player use. It should be possible to detect small changes in high-frequency hearing for patients or participants who undergo repeated testing at periodic intervals. However, the increased population-level variability in thresholds at the highest frequencies will make it difficult to identify the presence of small but potentially important deficits in otherwise normal-hearing individuals who do not have previously established baseline data. American Academy of Audiology.

  6. Comparative effectiveness and cost-effectiveness analyses frequently agree on value.

    PubMed

    Glick, Henry A; McElligott, Sean; Pauly, Mark V; Willke, Richard J; Bergquist, Henry; Doshi, Jalpa; Fleisher, Lee A; Kinosian, Bruce; Perfetto, Eleanor; Polsky, Daniel E; Schwartz, J Sanford

    2015-05-01

    The Patient-Centered Outcomes Research Institute, known as PCORI, was established by Congress as part of the Affordable Care Act (ACA) to promote evidence-based treatment. Provisions of the ACA prohibit the use of a cost-effectiveness analysis threshold and quality-adjusted life-years (QALYs) in PCORI comparative effectiveness studies, which has been understood as a prohibition on support for PCORI's conducting conventional cost-effectiveness analyses. This constraint complicates evidence-based choices where incremental improvements in outcomes are achieved at increased costs of care. How frequently this limitation inhibits efficient cost containment, also a goal of the ACA, depends on how often more effective treatment is not cost-effective relative to less effective treatment. We examined the largest database of studies of comparisons of effectiveness and cost-effectiveness to see how often there is disagreement between the more effective treatment and the cost-effective treatment, for various thresholds that may define good value. We found that under the benchmark assumption, disagreement between the two types of analyses occurs in 19 percent of cases. Disagreement is more likely to occur if a treatment intervention is musculoskeletal and less likely to occur if it is surgical or involves secondary prevention, or if the study was funded by a pharmaceutical company. Project HOPE—The People-to-People Health Foundation, Inc.

  7. Quantity of Candida Colonies in Saliva: A Diagnostic Evaluation for Oral Candidiasis.

    PubMed

    Zhou, Pei Ru; Hua, Hong; Liu, Xiao Song

    To investigate the relationship between the quantity of Candida colonies in saliva and oral candidiasis (OC), as well as to identify the threshold for distinguishing oral candidiasis from healthy carriage. A diagnostic test was conducted in 197 patients with different oral problems. The diagnosis of OC was established based on clinical features. Whole saliva samples from the subjects were cultured for Candida species. Receiver operating characteristic (ROC) curve analysis was used in this study. OC patients had significantly more Candida colony-forming units per millilitre saliva (795 cfu/ml) than asymptomatic carriers (40 cfu/ml; P < 0.05). Among different types of candidiasis, the quantity of Candida colonies differed. The number of Candida colonies in pseudomembranous type was significantly higher than that in the erythematous type (P < 0.05). Candida albicans was the predominant species of Candida. The cut-off point with the best fit for OC diagnosis was calculated to be 266 cfu/ml. The sensitivity and specificity were 0.720 and 0.825, respectively. Analysis of the ROC curve indicated that Candida colonies had a high diagnostic value for OC, as demonstrated by the area under the curve (AUC = 0.873). Based on this study, the value of 270 cfu/ml was considered a threshold for distinguishing OC from carriage.

  8. Development of a commercially viable piezoelectric force sensor system for static force measurement

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Luo, Xinwei; Liu, Jingcheng; Li, Min; Qin, Lan

    2017-09-01

    A compensation method for measuring static force with a commercial piezoelectric force sensor is proposed to disprove the theory that piezoelectric sensors and generators can only operate under dynamic force. After studying the model of the piezoelectric force sensor measurement system, the principle of static force measurement using a piezoelectric material or piezoelectric force sensor is analyzed. Then, the distribution law of the decay time constant of the measurement system and the variation law of the measurement system's output are studied, and a compensation method based on the time interval threshold Δt and attenuation threshold Δu_th is proposed. By calibrating the system and considering the influences of the environment and the hardware, a suitable Δu_th value is determined, and the system's output attenuation is compensated based on the Δu_th value to realize the measurement. Finally, a static force measurement system with a piezoelectric force sensor is developed based on the compensation method. The experimental results confirm the successful development of a simple compensation method for static force measurement with a commercial piezoelectric force sensor. In addition, it is established that, contrary to the current perception, a piezoelectric force sensor system can be used to measure static force through further calibration.

  9. Pre-impact fall detection system using dynamic threshold and 3D bounding box

    NASA Astrophysics Data System (ADS)

    Otanasap, Nuth; Boonbrahm, Poonpong

    2017-02-01

    Fall prevention and detection systems must overcome many challenges before an efficient system can be developed. Among the difficult problems in vision-based systems are obtrusion, occlusion and overlay. Other associated issues are privacy, cost, noise, computational complexity and the definition of threshold values. Estimating human motion with vision-based methods usually involves partial overlay, caused either by the viewing direction between objects or body parts and the camera, and these issues have to be taken into consideration. This paper proposes a posture analysis method based on a dynamic threshold and a bounding box, using a multiple-Kinect camera setup, for human posture analysis and fall detection. The proposed work uses only two Kinect cameras for acquiring distributed values and differentiating between normal activities and falls. If the peak value of head velocity is greater than the dynamic threshold value, bounding box posture analysis is used to confirm that a fall has occurred. Furthermore, information captured by multiple Kinects placed at a right angle addresses the skeleton overlay problem that arises with a single Kinect. This work contributes the fusion of multiple Kinect-based skeletons, combined with dynamic threshold and bounding box posture analysis, which has not been reported in previous research.
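
    A minimal sketch of the two-stage decision described above follows: a dynamic threshold test on peak head velocity, then a bounding-box posture check. The rule used to derive the dynamic threshold from recent velocity statistics, the axis convention, and all numeric constants are assumptions for illustration only.

      import numpy as np

      def detect_fall(head_positions, dt, joint_points, base_threshold=1.5, k=2.0):
          """Two-stage check: (1) peak head velocity against a dynamic threshold
          derived from the velocity history (assumed rule), (2) 3D bounding-box
          posture confirmation. joint_points: (N, 3) skeleton joints, y vertical."""
          velocities = np.linalg.norm(np.diff(head_positions, axis=0), axis=1) / dt
          dynamic_threshold = max(base_threshold,
                                  velocities.mean() + k * velocities.std())
          if velocities.max() <= dynamic_threshold:
              return False                          # no sufficiently fast motion
          width, height, depth = joint_points.max(axis=0) - joint_points.min(axis=0)
          return height < max(width, depth)         # lying posture: box is flat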

  10. Local bleaching thresholds established by remote sensing techniques vary among reefs with deviating bleaching patterns during the 2012 event in the Arabian/Persian Gulf.

    PubMed

    Shuail, Dawood; Wiedenmann, Jörg; D'Angelo, Cecilia; Baird, Andrew H; Pratchett, Morgan S; Riegl, Bernhard; Burt, John A; Petrov, Peter; Amos, Carl

    2016-04-30

    A severe bleaching event affected coral communities off the coast of Abu Dhabi, UAE in August/September, 2012. In Saadiyat and Ras Ghanada reefs ~40% of the corals showed signs of bleaching. In contrast, only 15% of the corals were affected on Delma reef. Bleaching threshold temperatures for these sites were established using remotely sensed sea surface temperature (SST) data recorded by MODIS-Aqua. The calculated threshold temperatures varied between locations (34.48 °C, 34.55 °C, 35.05 °C), resulting in site-specific deviations in the numbers of days during which these thresholds were exceeded. Hence, the less severe bleaching of Delma reef might be explained by the lower relative heat stress experienced by this coral community. However, the dominance of Porites spp. that is associated with the long-term exposure of Delma reef to elevated temperatures, as well as the more pristine setting may have additionally contributed to the higher coral bleaching threshold for this site. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  11. Threshold to N-methyl-D-aspartate-induced seizures in mice undergoing chronic nutritional magnesium deprivation is lowered in a way partly responsive to acute magnesium and antioxidant administrations.

    PubMed

    Maurois, Pierre; Pages, Nicole; Bac, Pierre; German-Fattal, Michèle; Agnani, Geneviève; Delplanque, Bernadette; Durlach, Jean; Poupaert, Jacques; Vamecq, Joseph

    2009-02-01

    Magnesium deficiency may be induced by a diet impoverished in magnesium. This nutritional deficit promotes chronic inflammatory and oxidative stresses, hyperexcitability and, in mice, susceptibility to audiogenic seizures. Potentiation by low-magnesium concentrations of the opening of N-methyl-D-aspartate (NMDA) receptor/calcium channel in in vitro and ex vivo studies, and responsiveness to magnesium of in vivo brain injury states are now well established. By contrast, little or no specific attention has been, however, paid to the in vivo NMDA receptor function/excitability in magnesium deficiency. The present work reports for the first time that, in mice undergoing chronic nutritional deprivation in magnesium (35 v. 930 parts per million for 27 d in OF1 mice), NMDA-induced seizure threshold is significantly decreased (38 % of normal values). The attenuation in the drop of NMDA seizure threshold (percentage of reversal) was 58 and 20 % upon acute intraperitoneal administrations of magnesium chloride hexahydrate (28 mg magnesium/kg) and the antioxidant ebselen (20 mg/kg), respectively. In nutritionally magnesium-deprived animals, audiogenic seizures are completely prevented by these compound doses. Taken as a whole, our data emphasise that chronic magnesium deprivation in mice is a nutritional in vivo model for a lowered NMDA receptor activation threshold. This nutritional model responds remarkably to acute magnesium supply and moderately to acute antioxidant administration.

  12. Tectonic uplift, threshold hillslopes, and denudation rates in a developing mountain range

    USGS Publications Warehouse

    Binnie, S.A.; Phillips, W.M.; Summerfield, M.A.; Fifield, L.K.

    2007-01-01

    Studies across a broad range of drainage basins have established a positive correlation between mean slope gradient and denudation rates. It has been suggested, however, that this relationship breaks down for catchments where slopes are at their threshold angle of stability because, in such cases, denudation is controlled by the rate of tectonic uplift through the rate of channel incision and frequency of slope failure. This mechanism is evaluated for the San Bernardino Mountains, California, a nascent range that incorporates both threshold hillslopes and remnants of pre-uplift topography. Concentrations of in situ-produced cosmogenic 10Be in alluvial sediments are used to quantify catchment-wide denudation rates and show a broadly linear relationship with mean slope gradient up to ~30°; above this value denudation rates vary substantially for similar mean slope gradients. We propose that this decoupling in the slope gradient-denudation rate relationship marks the emergence of threshold topography and coincides with the transition from transport-limited to detachment-limited denudation. The survival in the San Bernardino Mountains of surfaces formed prior to uplift provides information on the topographic evolution of the range, in particular the transition from slope-gradient-dependent rates of denudation to a regime where denudation rates are controlled by rates of tectonic uplift. This type of transition may represent a general model for the denudational response to orogenic uplift and topographic evolution during the early stages of mountain building. © 2007 The Geological Society of America.

  13. The conventional tuning fork as a quantitative tool for vibration threshold.

    PubMed

    Alanazy, Mohammed H; Alfurayh, Nuha A; Almweisheer, Shaza N; Aljafen, Bandar N; Muayqil, Taim

    2018-01-01

    This study was undertaken to describe a method for quantifying vibration when using a conventional tuning fork (CTF) in comparison to a Rydel-Seiffer tuning fork (RSTF) and to provide reference values. Vibration thresholds at index finger and big toe were obtained in 281 participants. Spearman's correlations were performed. Age, weight, and height were analyzed for their covariate effects on vibration threshold. Reference values at the fifth percentile were obtained by quantile regression. The correlation coefficients between CTF and RSTF values at finger/toe were 0.59/0.64 (P = 0.001 for both). Among covariates, only age had a significant effect on vibration threshold. Reference values for CTF at finger/toe for the age groups 20-39 and 40-60 years were 7.4/4.9 and 5.8/4.6 s, respectively. Reference values for RSTF at finger/toe for the age groups 20-39 and 40-60 years were 6.9/5.5 and 6.2/4.7, respectively. CTF provides quantitative values that are as good as those provided by RSTF. Age-stratified reference data are provided. Muscle Nerve 57: 49-53, 2018. © 2017 Wiley Periodicals, Inc.

  14. Reading for Integration, Identifying Complementary Threshold Concepts: The ACRL "Framework" in Conversation with "Naming What We Know: Threshold Concepts of Writing"

    ERIC Educational Resources Information Center

    Johnson, Brittney; McCracken, I. Moriah

    2016-01-01

    In 2015, threshold concepts formed the foundation of two disciplinary documents: The "ACRL Framework for Information Literacy" (2015) and "Naming What We Know: Threshold Concepts of Writing Studies" (2015). While there is no consensus in the fields about the value of threshold concepts in teaching, reading the six Frames in the…

  15. Inclusion of Exercise Intensities Above the Lactate Threshold in VO2/Running Speed Regression Does not Improve the Precision of Accumulated Oxygen Deficit Estimation in Endurance-Trained Runners

    PubMed Central

    Reis, Victor M.; Silva, António J.; Ascensão, António; Duarte, José A.

    2005-01-01

    The present study intended to verify if the inclusion of intensities above lactate threshold (LT) in the VO2/running speed regression (RSR) affects the estimation error of accumulated oxygen deficit (AOD) during a treadmill running performed by endurance-trained subjects. Fourteen male endurance-trained runners performed a sub maximal treadmill running test followed by an exhaustive supra maximal test 48h later. The total energy demand (TED) and the AOD during the supra maximal test were calculated from the RSR established on first testing. For those purposes two regressions were used: a complete regression (CR) including all available sub maximal VO2 measurements and a sub threshold regression (STR) including solely the VO2 values measured during exercise intensities below LT. TED mean values obtained with CR and STR were not significantly different under the two conditions of analysis (177.71 ± 5.99 and 174.03 ± 6.53 ml·kg-1, respectively). Also the mean values of AOD obtained with CR and STR did not differ under the two conditions (49.75 ± 8.38 and 45.89 ± 9.79 ml·kg-1, respectively). Moreover, the precision of those estimations was also similar under the two procedures. The mean error for TED estimation was 3.27 ± 1.58 and 3.41 ± 1.85 ml·kg-1 (for CR and STR, respectively) and the mean error for AOD estimation was 5.03 ± 0.32 and 5.14 ± 0.35 ml·kg-1 (for CR and STR, respectively). The results indicated that the inclusion of exercise intensities above LT in the RSR does not improve the precision of the AOD estimation in endurance-trained runners. However, the use of STR may induce an underestimation of AOD comparatively to the use of CR. Key Points It has been suggested that the inclusion of exercise intensities above the lactate threshold in the VO2/power regression can significantly affect the estimation of the energy cost and, thus, the estimation of the AOD. However data on the precision of those AOD measurements is rarely provided. We have evaluated the effects of the inclusion of those exercise intensities on the AOD precision. The results have indicated that the inclusion of exercise intensities above the lactate threshold in the VO2/running speed regression does not improve the precision of AOD estimation in endurance-trained runners. However, the use of sub threshold regressions may induce an underestimation of AOD comparatively to the use of complete regressions. PMID:24501560

  16. A threshold method for immunological correlates of protection

    PubMed Central

    2013-01-01

    Background Immunological correlates of protection are biological markers such as disease-specific antibodies which correlate with protection against disease and which are measurable with immunological assays. It is common in vaccine research and in setting immunization policy to rely on threshold values for the correlate where the accepted threshold differentiates between individuals who are considered to be protected against disease and those who are susceptible. Examples where thresholds are used include development of a new generation 13-valent pneumococcal conjugate vaccine which was required in clinical trials to meet accepted thresholds for the older 7-valent vaccine, and public health decision making on vaccination policy based on long-term maintenance of protective thresholds for Hepatitis A, rubella, measles, Japanese encephalitis and others. Despite widespread use of such thresholds in vaccine policy and research, few statistical approaches have been formally developed which specifically incorporate a threshold parameter in order to estimate the value of the protective threshold from data. Methods We propose a 3-parameter statistical model called the a:b model which incorporates parameters for a threshold and constant but different infection probabilities below and above the threshold estimated using profile likelihood or least squares methods. Evaluation of the estimated threshold can be performed by a significance test for the existence of a threshold using a modified likelihood ratio test which follows a chi-squared distribution with 3 degrees of freedom, and confidence intervals for the threshold can be obtained by bootstrapping. The model also permits assessment of relative risk of infection in patients achieving the threshold or not. Goodness-of-fit of the a:b model may be assessed using the Hosmer-Lemeshow approach. The model is applied to 15 datasets from published clinical trials on pertussis, respiratory syncytial virus and varicella. Results Highly significant thresholds with p-values less than 0.01 were found for 13 of the 15 datasets. Considerable variability was seen in the widths of confidence intervals. Relative risks indicated around 70% or better protection in 11 datasets and relevance of the estimated threshold to imply strong protection. Goodness-of-fit was generally acceptable. Conclusions The a:b model offers a formal statistical method of estimation of thresholds differentiating susceptible from protected individuals which has previously depended on putative statements based on visual inspection of data. PMID:23448322
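
    The a:b model described above has three parameters: a threshold and two constant infection probabilities, one below and one at or above the threshold. A least-squares sketch of fitting it by profiling over candidate thresholds is shown below; the function and variable names are illustrative, and the profile-likelihood variant and bootstrap confidence intervals mentioned in the abstract are not reproduced here.

      import numpy as np

      def fit_ab_threshold(titres, infected, candidate_thresholds):
          """Fit the 3-parameter a:b model by least squares: constant infection
          probability a below the threshold and b at or above it; return the
          candidate threshold minimizing the squared residuals."""
          titres = np.asarray(titres, float)
          infected = np.asarray(infected, float)
          best = None
          for t in candidate_thresholds:
              below = titres < t
              if below.all() or (~below).all():
                  continue                          # both groups must be populated
              a, b = infected[below].mean(), infected[~below].mean()
              sse = np.sum((infected - np.where(below, a, b)) ** 2)
              if best is None or sse < best[0]:
                  best = (sse, t, a, b)
          return best[1:]                           # (threshold, a, b)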

  17. Differences in two-point discrimination and sensory threshold in the blind between braille and text reading: a pilot study.

    PubMed

    Noh, Ji-Woong; Park, Byoung-Sun; Kim, Mee-Young; Lee, Lim-Kyu; Yang, Seung-Min; Lee, Won-Deok; Shin, Yong-Sub; Kang, Ji-Hye; Kim, Ju-Hyun; Lee, Jeong-Uk; Kwak, Taek-Yong; Lee, Tae-Hyun; Kim, Ju-Young; Kim, Junghwan

    2015-06-01

    [Purpose] This study investigated two-point discrimination (TPD) and the electrical sensory threshold of the blind to define the effect of using Braille on the tactile and electrical senses. [Subjects and Methods] Twenty-eight blind participants were divided equally into a text-reading and a Braille-reading group. We measured tactile sensory and electrical thresholds using the TPD method and a transcutaneous electrical nerve stimulator. [Results] The left palm TPD values were significantly different between the groups. The values of the electrical sensory threshold in the left hand, the electrical pain threshold in the left hand, and the electrical pain threshold in the right hand were significantly lower in the Braille group than in the text group. [Conclusion] These findings make it difficult to explain the difference in tactility between groups, excluding both palms. However, our data show that using Braille can enhance development of the sensory median nerve in the blind, particularly in terms of the electrical sensory and pain thresholds.

  18. Differences in two-point discrimination and sensory threshold in the blind between braille and text reading: a pilot study

    PubMed Central

    Noh, Ji-Woong; Park, Byoung-Sun; Kim, Mee-Young; Lee, Lim-Kyu; Yang, Seung-Min; Lee, Won-Deok; Shin, Yong-Sub; Kang, Ji-Hye; Kim, Ju-Hyun; Lee, Jeong-Uk; Kwak, Taek-Yong; Lee, Tae-Hyun; Kim, Ju-Young; Kim, Junghwan

    2015-01-01

    [Purpose] This study investigated two-point discrimination (TPD) and the electrical sensory threshold of the blind to define the effect of using Braille on the tactile and electrical senses. [Subjects and Methods] Twenty-eight blind participants were divided equally into a text-reading and a Braille-reading group. We measured tactile sensory and electrical thresholds using the TPD method and a transcutaneous electrical nerve stimulator. [Results] The left palm TPD values were significantly different between the groups. The values of the electrical sensory threshold in the left hand, the electrical pain threshold in the left hand, and the electrical pain threshold in the right hand were significantly lower in the Braille group than in the text group. [Conclusion] These findings make it difficult to explain the difference in tactility between groups, excluding both palms. However, our data show that using Braille can enhance development of the sensory median nerve in the blind, particularly in terms of the electrical sensory and pain thresholds. PMID:26180348

  19. A Specimen Size Effect on the Fatigue Crack Growth Rate Threshold of IN 718

    NASA Technical Reports Server (NTRS)

    Garr, K. R.; Hresko, G. C., III

    1998-01-01

    Fatigue crack growth rate (FCGR) tests were conducted on IN 718 in the solution annealed and aged condition at room temperature in accordance with E647-87. As part of each test, the FCGR threshold was measured using the decreasing Delta K method. A new heat of material was being tested and some of this material was sent to a different laboratory which wanted to use a specimen with a 127 mm width. Threshold data previously had been established on specimens with a width of 50.8 mm. As a check of the laboratory, tests were conducted at room temperature and R equal to 0.1 for comparison with the earlier data. The results were a threshold significantly higher than previously observed. Interchanging of specimen sizes and laboratories showed that the results were not due to a heat-to-heat or lab-to-lab variation. The results to be presented here are those obtained at the original laboratory. Growth rates were measured using the electric potential drop technique at R values of 0.1, 0.7, and 0.9. Compact tension specimen sizes with planar dimensions of 25.4 mm, 50.8 mm, and 127 mm were used. Crack growth rates at threshold were generally below 2.5 X 10(exp -8) mm / cycle. Closure measurements were made on some of the specimens by a manual procedure using a clip gage. When the crack growth rate data for the specimens tested at R equal to 0.1 were plotted as a function of applied Delta K, the thresholds varied with specimen width. The larger the width, the higher the threshold. The thresholds varied from 6.5 MPa-m(exp 1/2) for the 25.4 mm specimen to 15.4 MPa-m(exp 1/2) for the 127 mm specimen. At R equal to 0.7, the 25.4 mm and 50.8 mm specimens had essentially the same threshold, about 2.9 MPa-m(exp 1/2), while the 127 mm specimen had a threshold of 4.5 MPa-m(exp 1/2). When plotted as a function of effective Delta K, the R equal to 0.1 data are essentially normalized. Various aspects of the test procedure will be discussed as well as the results of analysis of the data using some different closure models.

  20. Health hazards of ultrafine metal and metal oxide powders

    NASA Technical Reports Server (NTRS)

    Boylen, G. W., Jr.; Chamberlin, R. I.; Viles, F. J.

    1969-01-01

    Study reveals that suggested threshold limit values are from two to fifty times lower than current recommended threshold limit values. Proposed safe limits of exposure to the ultrafine dusts are based on known toxic potential of various materials as determined in particle size ranges.

  1. 48 CFR 41.401 - Monthly and annual review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... values exceeding the simplified acquisition threshold, on an annual basis. Annual reviews of accounts with annual values at or below the simplified acquisition threshold shall be conducted when deemed... services to each facility under the utility's most economical, applicable rate and to examine competitive...

  2. Identification and classification of carcinogens: procedures of the Chemical Substances Threshold Limit Value Committee, ACGIH. American Conference of Governmental Industrial Hygienists.

    PubMed Central

    Spirtas, R; Steinberg, M; Wands, R C; Weisburger, E K

    1986-01-01

    The Chemical Substances Threshold Limit Value Committee of the American Conference of Governmental Industrial Hygienists has refined its procedures for evaluating carcinogens. Types of epidemiologic and toxicologic evidence used are reviewed and a discussion is presented on how the Committee evaluates data on carcinogenicity. Although it has not been conclusively determined whether biological thresholds exist for all types of carcinogens, the Committee will continue to develop guidelines for permissible exposures to carcinogens. The Committee will continue to use the safety factor approach to setting Threshold Limit Values for carcinogens, despite its shortcomings. A compilation has been developed for lists of substances considered to be carcinogenic by several scientific groups. The Committee will use this information to help to identify and classify carcinogens for its evaluation. PMID:3752326

  3. Identification and classification of carcinogens: procedures of the Chemical Substances Threshold Limit Value Committee, ACGIH. American Conference of Governmental Industrial Hygienists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spirtas, R.; Steinberg, M.; Wands, R.C.

    1986-10-01

    The Chemical Substances Threshold Limit Value Committee of the American Conference of Governmental Industrial Hygienists has refined its procedures for evaluating carcinogens. Types of epidemiologic and toxicologic evidence used are reviewed and a discussion is presented on how the Committee evaluates data on carcinogenicity. Although it has not been conclusively determined whether biological thresholds exist for all types of carcinogens, the Committee will continue to develop guidelines for permissible exposures to carcinogens. The Committee will continue to use the safety factor approach to setting Threshold Limit Values for carcinogens, despite its shortcomings. A compilation has been developed for lists of substances considered to be carcinogenic by several scientific groups. The Committee will use this information to help to identify and classify carcinogens for its evaluation.

  4. 20 CFR 404.1670 - General.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND DISABILITY INSURANCE (1950... three established threshold levels, one being performance accuracy, for two consecutive quarters, and... period. During this 3-month period we will not require the State agency to meet the threshold levels...

  5. 20 CFR 416.1070 - General.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED... three established threshold levels, one being performance accuracy, for two consecutive quarters, and... period. During this 3-month period we will not require the State agency to meet the threshold levels...

  6. On the thresholds in modeling of high flows via artificial neural networks - A bootstrapping analysis

    NASA Astrophysics Data System (ADS)

    Panagoulia, D.; Trichakis, I.

    2012-04-01

    Considering the growing interest in simulating hydrological phenomena with artificial neural networks (ANNs), it is useful to establish the potential and limits of these models. In this study, the main objective is to examine how to improve the ability of an ANN model to simulate extreme values of flow by utilizing a priori knowledge of threshold values. A three-layer feedforward ANN was trained using the back propagation algorithm and the logistic function as activation function. Using the thresholds, the flow was partitioned into low (x < μ), medium (μ ≤ x ≤ μ + 2σ) and high (x > μ + 2σ) values. The ANN model was trained both on the high-flow partition and on the full flow dataset. The developed methodology was implemented over a mountainous river catchment (the Mesochora catchment in northwestern Greece). The ANN model received as inputs pseudo-precipitation (rain plus melt) and previously observed flow data. After the training was completed, the bootstrapping methodology was applied to calculate the ANN confidence intervals (CIs) for a 95% nominal coverage. The calculated CIs included only the uncertainty that comes from the calibration procedure. The results showed that an ANN model trained specifically for high flows, with a priori knowledge of the thresholds, can simulate these extreme values much better (RMSE is 31.4% lower) than an ANN model trained with all data of the available time series using a posteriori threshold values. On the other hand, the width of the CIs increases by 54.9%, with a simultaneous increase of 64.4% in the actual coverage for the high flows (a priori partition). The narrower CIs of the high flows trained with all data may be attributed to the smoothing effect produced by the use of the full data sets. Overall, the results suggest that an ANN model trained with a priori knowledge of the threshold values has an increased ability to simulate extreme values compared with an ANN model trained with all the data and a posteriori knowledge of the thresholds.
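
    The flow partition above is defined entirely by the series mean and standard deviation. A small sketch of that partition step follows; the synthetic flow series is purely illustrative.

      import numpy as np

      def partition_flows(flow):
          """Split a flow series into low (x < mu), medium (mu <= x <= mu + 2*sigma)
          and high (x > mu + 2*sigma) classes, as in the thresholds quoted above."""
          mu, sigma = flow.mean(), flow.std()
          low = flow < mu
          high = flow > mu + 2 * sigma
          medium = ~low & ~high
          return low, medium, high

      flow = np.random.lognormal(mean=2.0, sigma=0.6, size=1000)   # synthetic series
      low, medium, high = partition_flows(flow)
      print(low.sum(), medium.sum(), high.sum())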

  7. Setting nutrient thresholds to support an ecological assessment based on nutrient enrichment, potential primary production and undesirable disturbance.

    PubMed

    Devlin, Michelle; Painting, Suzanne; Best, Mike

    2007-01-01

    The EU Water Framework Directive recognises that ecological status is supported by the prevailing physico-chemical conditions in each water body. This paper describes an approach to providing guidance on setting thresholds for nutrients taking account of the biological response to nutrient enrichment evident in different types of water. Indices of pressure, state and impact are used to achieve a robust nutrient (nitrogen) threshold by considering each individual index relative to a defined standard, scale or threshold. These indices include winter nitrogen concentrations relative to a predetermined reference value; the potential of the waterbody to support phytoplankton growth (estimated as primary production); and detection of an undesirable disturbance (measured as dissolved oxygen). Proposed reference values are based on a combination of historical records, offshore (limited human influence) nutrient concentrations, literature values and modelled data. Statistical confidence is based on a number of attributes, including distance of confidence limits away from a reference threshold and how well the model is populated with real data. This evidence based approach ensures that nutrient thresholds are based on knowledge of real and measurable biological responses in transitional and coastal waters.

  8. Noise contribution to the correlation between temperature-induced localized reflectance of diabetic skin and blood glucose.

    PubMed

    Lowery, Michael G; Calfin, Brenda; Yeh, Shu-Jen; Doan, Tao; Shain, Eric; Hanna, Charles; Hohs, Ronald; Kantor, Stan; Lindberg, John; Khalil, Omar S

    2006-01-01

    We used the effect of temperature on the localized reflectance of human skin to assess the role of noise sources on the correlation between temperature-induced fractional change in optical density of human skin (DeltaOD(T)) and blood glucose concentration [BG]. Two temperature-controlled optical probes at 30 degrees C contacted the skin, one was then cooled by -10 degrees C; the other was heated by +10 degrees C. DeltaOD(T) upon cooling or heating was correlated with capillary [BG] of diabetic volunteers over a period of three days. Calibration models in the first two days were used to predict [BG] in the third day. We examined the conditions where the correlation coefficient (R2) for predicting [BG] in a third day ranked higher than R2 values resulting from fitting permutations of randomized [BG] to the same DeltaOD(T) values. It was possible to establish a four-term linear regression correlation between DeltaOD(T) upon cooling and [BG] with a correlation coefficient higher than that of an established noise threshold in diabetic patients that were mostly females with less than 20 years of diabetes duration. The ability to predict [BG] values with a correlation coefficient above biological and body-interface noise varied between the cases of cooling and heating.

  9. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    NASA Astrophysics Data System (ADS)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

    Cable overheating reduces the cable insulation level, accelerates insulation aging, and can even cause short-circuit faults. Identifying and warning of cable overheating risk is therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system to improve the accuracy of impedance parameter estimation. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from future forecasting data provided by the distribution SCADA system. Thirdly, a library of cable overheating risk warning rules is established; the forecast cable impedance and its rate of change are calculated, and the overheating risk of the cable line is then flagged according to the warning rules library, based on the relationship between impedance and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in the distribution network. The overheating risk warning results can provide a basis for operation, maintenance, and repair decisions.
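
    As a rough illustration of the first two steps above, the sketch below estimates a cable impedance by least squares from paired voltage-drop and current samples and compares a forecast value against a threshold derived from historical estimates. The linear model and all names are simplifying assumptions, not the paper's actual estimation model.

      import numpy as np

      def estimate_impedance(voltage_drop, current):
          """Least-squares estimate of impedance magnitude Z from SCADA samples,
          assuming the simplified linear model voltage_drop = Z * current."""
          current = np.asarray(current, float).reshape(-1, 1)
          voltage_drop = np.asarray(voltage_drop, float)
          z, *_ = np.linalg.lstsq(current, voltage_drop, rcond=None)
          return float(z[0])

      def overheating_alert(z_forecast, historical_z, quantile=99):
          """Flag overheating risk when the forecast impedance exceeds a threshold
          taken from the upper quantile of historical impedance estimates."""
          return z_forecast > np.percentile(historical_z, quantile)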

  10. Establishing a threshold for the number of missing days using 7 d pedometer data.

    PubMed

    Kang, Minsoo; Hart, Peter D; Kim, Youngdeok

    2012-11-01

    The purpose of this study was to examine the threshold for the number of missing days that can be recovered using the individual information (II)-centered approach. Data for this study came from 86 participants, aged 17 to 79 years, who had 7 consecutive days of complete pedometer (Yamax SW 200) wear. Missing datasets (1 d through 5 d missing) were created by a SAS random process 10,000 times each. All missing values were replaced using the II-centered approach. A 7 d average was calculated for each dataset, including the complete dataset. Repeated measures ANOVA was used to determine the differences between the 1 d through 5 d missing datasets and the complete dataset. Mean absolute percentage error (MAPE) was also computed. The mean (SD) daily step count for the complete 7 d dataset was 7979 (3084). Mean (SD) values for the 1 d through 5 d missing datasets were 8072 (3218), 8066 (3109), 7968 (3273), 7741 (3050) and 8314 (3529), respectively (p > 0.05). The lowest MAPEs were estimated for 1 d missing (5.2%, 95% confidence interval (CI) 4.4-6.0) and 2 d missing (8.4%, 95% CI 7.0-9.8), while all others were greater than 10%. The results of this study show that the 1 d through 5 d missing datasets, with replaced values, were not significantly different from the complete dataset. Based on the MAPE results, it is not recommended to replace more than two days of missing step counts.
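
    The MAPE criterion used above compares the 7-day average from the complete data against the average after missing days have been replaced. A minimal sketch of that comparison follows; the array names and example values are illustrative.

      import numpy as np

      def mape(complete_means, imputed_means):
          """Mean absolute percentage error between 7-day step-count averages from
          complete data and from data with missing days replaced."""
          complete = np.asarray(complete_means, float)
          imputed = np.asarray(imputed_means, float)
          return 100.0 * np.mean(np.abs(imputed - complete) / complete)

      # Illustrative check: averages from 5 participants, complete vs. imputed.
      print(mape([7979, 8200, 6500, 9100, 7300], [8066, 8350, 6400, 9050, 7500]))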

  11. Higher criticism thresholding: Optimal feature selection when useful features are rare and weak.

    PubMed

    Donoho, David; Jin, Jiashun

    2008-09-30

    In important application fields today (genomics and proteomics are examples), selecting a small subset of useful features is crucial for success of Linear Classification Analysis. We study feature selection by thresholding of feature Z-scores and introduce a principle of threshold selection, based on the notion of higher criticism (HC). For i = 1, 2, ..., p, let π_i denote the two-sided P-value associated with the ith feature Z-score and π_(i) denote the ith order statistic of the collection of P-values. The HC threshold is the absolute Z-score corresponding to the P-value maximizing the HC objective (i/p − π_(i)) / sqrt((i/p)(1 − i/p)). We consider a rare/weak (RW) feature model, where the fraction of useful features is small and the useful features are each too weak to be of much use on their own. HC thresholding (HCT) has interesting behavior in this setting, with an intimate link between maximizing the HC objective and minimizing the error rate of the designed classifier, and very different behavior from popular threshold selection procedures such as false discovery rate thresholding (FDRT). In the most challenging RW settings, HCT uses an unconventionally low threshold; this keeps the missed-feature detection rate under better control than FDRT and yields a classifier with improved misclassification performance. Replacing cross-validated threshold selection in the popular Shrunken Centroid classifier with the computationally less expensive and simpler HCT reduces the variance of the selected threshold and the error rate of the constructed classifier. Results on standard real datasets and in asymptotic theory confirm the advantages of HCT.
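
    The HC objective above can be maximized directly over the sorted P-values. The following sketch implements that selection rule; restricting the search to the smallest half of the P-values (alpha0 = 0.5) is a common convention in the HC literature, assumed here rather than taken from this abstract, and the example data are synthetic.

      import numpy as np
      from scipy.stats import norm

      def hc_threshold(z_scores, alpha0=0.5):
          """Higher-criticism threshold: convert Z-scores to two-sided P-values,
          sort them, maximize (i/p - p_(i)) / sqrt((i/p)(1 - i/p)) over the
          smallest alpha0 fraction, and return the |Z| of the maximizing feature."""
          z = np.asarray(z_scores, float)
          pvals = 2.0 * norm.sf(np.abs(z))               # two-sided P-values
          order = np.argsort(pvals)
          n = len(pvals)
          k = max(1, int(alpha0 * n))                    # restricted search range
          i = np.arange(1, k + 1)
          hc = (i / n - pvals[order][:k]) / np.sqrt((i / n) * (1 - i / n))
          return float(np.abs(z[order[int(np.argmax(hc))]]))

      # Illustrative mix: a few strong features among many null ones.
      z = np.concatenate([np.random.normal(0, 1, 990), np.random.normal(4, 1, 10)])
      print(hc_threshold(z))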

  12. Higher criticism thresholding: Optimal feature selection when useful features are rare and weak

    PubMed Central

    Donoho, David; Jin, Jiashun

    2008-01-01

    In important application fields today—genomics and proteomics are examples—selecting a small subset of useful features is crucial for success of Linear Classification Analysis. We study feature selection by thresholding of feature Z-scores and introduce a principle of threshold selection, based on the notion of higher criticism (HC). For i = 1, 2, …, p, let πi denote the two-sided P-value associated with the ith feature Z-score and π(i) denote the ith order statistic of the collection of P-values. The HC threshold is the absolute Z-score corresponding to the P-value maximizing the HC objective (i/p − π(i)) / √((i/p)(1 − i/p)). We consider a rare/weak (RW) feature model, where the fraction of useful features is small and the useful features are each too weak to be of much use on their own. HC thresholding (HCT) has interesting behavior in this setting, with an intimate link between maximizing the HC objective and minimizing the error rate of the designed classifier, and very different behavior from popular threshold selection procedures such as false discovery rate thresholding (FDRT). In the most challenging RW settings, HCT uses an unconventionally low threshold; this keeps the missed-feature detection rate under better control than FDRT and yields a classifier with improved misclassification performance. Replacing cross-validated threshold selection in the popular Shrunken Centroid classifier with the computationally less expensive and simpler HCT reduces the variance of the selected threshold and the error rate of the constructed classifier. Results on standard real datasets and in asymptotic theory confirm the advantages of HCT. PMID:18815365

  13. Relationships between annual plant productivity, nitrogen deposition and fire size in low-elevation California desert scrub

    USGS Publications Warehouse

    Rao, Leela E.; Matchett, John R.; Brooks, Matthew L.; Johns, Robert; Minnich, Richard A.; Allen, Edith B.

    2014-01-01

    Although precipitation is correlated with fire size in desert ecosystems and is typically used as an indirect surrogate for fine fuel load, a direct link between fine fuel biomass and fire size has not been established. In addition, nitrogen (N) deposition can affect fire risk through its fertilisation effect on fine fuel production. In this study, we examine the relationships between fire size and precipitation, N deposition and biomass with emphasis on identifying biomass and N deposition thresholds associated with fire spreading across the landscape. We used a 28-year fire record of 582 burns from low-elevation desert scrub to evaluate the relationship of precipitation, N deposition and biomass with the distribution of fire sizes using quantile regression. We found that models using annual biomass have similar predictive ability to those using precipitation and N deposition at the lower to intermediate portions of the fire size distribution. No distinct biomass threshold was found, although within the 99th percentile of the distribution fire size increased with greater than 125 g m–2 of winter fine fuel production. The study did not produce an N deposition threshold, but did validate the value of 125 g m–2 of fine fuel for spread of fires.

  14. Usage of fMRI for pre-surgical planning in brain tumor and vascular lesion patients: Task and statistical threshold effects on language lateralization

    PubMed Central

    Nadkarni, Tanvi N.; Andreoli, Matthew J.; Nair, Veena A.; Yin, Peng; Young, Brittany M.; Kundu, Bornali; Pankratz, Joshua; Radtke, Andrew; Holdsworth, Ryan; Kuo, John S.; Field, Aaron S.; Baskaya, Mustafa K.; Moritz, Chad H.; Meyerand, M. Elizabeth; Prabhakaran, Vivek

    2014-01-01

    Background and purpose Functional magnetic resonance imaging (fMRI) is a non-invasive pre-surgical tool used to assess localization and lateralization of language function in brain tumor and vascular lesion patients in order to guide neurosurgeons as they devise a surgical approach to treat these lesions. We investigated the effect of varying the statistical thresholds as well as the type of language tasks on functional activation patterns and language lateralization. We hypothesized that language lateralization indices (LIs) would be threshold- and task-dependent. Materials and methods Imaging data were collected from brain tumor patients (n = 67, average age 48 years) and vascular lesion patients (n = 25, average age 43 years) who received pre-operative fMRI scanning. Both patient groups performed expressive (antonym and/or letter-word generation) and receptive (tumor patients performed text-reading; vascular lesion patients performed text-listening) language tasks. A control group (n = 25, average age 45 years) performed the letter-word generation task. Results Brain tumor patients showed left-lateralization during the antonym-word generation and text-reading tasks at high threshold values and bilateral activation during the letter-word generation task, irrespective of the threshold values. Vascular lesion patients showed left-lateralization during the antonym and letter-word generation, and text-listening tasks at high threshold values. Conclusion Our results suggest that the type of task and the applied statistical threshold influence LI and that the threshold effects on LI may be task-specific. Thus identifying critical functional regions and computing LIs should be conducted on an individual subject basis, using a continuum of threshold values with different tasks to provide the most accurate information for surgical planning to minimize post-operative language deficits. PMID:25685705

  15. Largely ignored: the impact of the threshold value for a QALY on the importance of a transferability factor.

    PubMed

    Vemer, Pepijn; Rutten-van Mölken, Maureen P M H

    2011-10-01

    Recently, several checklists systematically assessed factors that affect the transferability of cost-effectiveness (CE) studies between jurisdictions. The role of the threshold value for a QALY has been given little consideration in these checklists, even though the importance of a factor as a cause of between country differences in CE depends on this threshold. In this paper, we study the impact of the willingness-to-pay (WTP) per QALY on the importance of transferability factors in the case of smoking cessation support (SCS). We investigated, for several values of the WTP, how differences between six countries affect the incremental net monetary benefit (INMB) of SCS. The investigated factors were demography, smoking prevalence, mortality, epidemiology and costs of smoking-related diseases, resource use and unit costs of SCS, utility weights and discount rates. We found that when the WTP decreased, factors that mainly affect health outcomes became less important and factors that mainly effect costs became more important. With a WTP below 1,000, the factors most responsible for between country differences in INMB were resource use and unit costs of SCS and the costs of smoking-related diseases. Utility values had little impact. At a threshold above 10,000, between country differences were primarily due to different discount rates, utility weights and epidemiology of smoking-related diseases. Costs of smoking-related diseases had little impact. At all thresholds, demography had little impact. We concluded that, when judging the transferability of a CE study, we should consider the between country differences in WTP threshold values.

  16. Conception, fabrication and characterization of a silicon based MEMS inertial switch with a threshold value of 5 g

    NASA Astrophysics Data System (ADS)

    Zhang, Fengtian; Wang, Chao; Yuan, Mingquan; Tang, Bin; Xiong, Zhuang

    2017-12-01

    Most of the MEMS inertial switches developed in recent years are intended for shock and impact sensing with threshold values above 50 g. To meet the requirement of detecting linear acceleration signals at the low-g level, a silicon based MEMS inertial switch with a threshold value of 5 g was designed, fabricated and characterized. The switch consists of a large proof mass supported by circular spiral springs. An analytical model of the structure stiffness of the proposed switch was derived and verified by finite-element simulation. The structure was fabricated on a customized double-buried-layer silicon-on-insulator wafer and encapsulated with glass wafers. Centrifugal and nanoindentation experiments were performed to measure the threshold value and the structure stiffness. The actual threshold values were measured to be 0.1-0.3 g lower than the pre-designed value of 5 g, owing to dimension loss during non-contact lithography processing. For the reliability assessment, a series of environmental experiments was conducted and the switches remained operational without excessive errors. However, both the random vibration and the shock tests indicate that metal particles generated during collision of the contact parts might affect contact reliability and long-term stability. These results suggest that a detailed study of switch contact behavior should be included in future research.

  17. Self-Organization on Social Media: Endo-Exo Bursts and Baseline Fluctuations

    PubMed Central

    Oka, Mizuki; Hashimoto, Yasuhiro; Ikegami, Takashi

    2014-01-01

    A salient dynamic property of social media is bursting behavior. In this paper, we study bursting behavior in terms of the temporal relation between a preceding baseline fluctuation and the successive burst response using a frequency time series of 3,000 keywords on Twitter. We found that there is a fluctuation threshold up to which the burst size increases as the fluctuation increases and that above the threshold, there appears a variety of burst sizes. We call this threshold the critical threshold. Investigating this threshold in relation to endogenous bursts and exogenous bursts based on peak ratio and burst size reveals that the bursts below this threshold are endogenously caused and above this threshold, exogenous bursts emerge. Analysis of the 3,000 keywords shows that all the nouns have both endogenous and exogenous origins of bursts and that each keyword has a critical threshold in the baseline fluctuation value to distinguish between the two. Having a threshold for an input value for activating the system implies that Twitter is an excitable medium. These findings are useful for characterizing how excitable a keyword is on Twitter and could be used, for example, to predict the response to particular information on social media. PMID:25329610

  18. Cost-effectiveness thresholds: methods for setting and examples from around the world.

    PubMed

    Santos, André Soares; Guerra-Junior, Augusto Afonso; Godman, Brian; Morton, Alec; Ruas, Cristina Mariano

    2018-06-01

    Cost-effectiveness thresholds (CETs) are used to judge if an intervention represents sufficient value for money to merit adoption in healthcare systems. The study was motivated by the Brazilian context of HTA, where meetings are being conducted to decide on the definition of a threshold. Areas covered: An electronic search was conducted on Medline (via PubMed), Lilacs (via BVS) and ScienceDirect followed by a complementary search of references of included studies, Google Scholar and conference abstracts. Cost-effectiveness thresholds are usually calculated through three different approaches: the willingness-to-pay, representative of welfare economics; the precedent method, based on the value of an already funded technology; and the opportunity cost method, which links the threshold to the volume of health displaced. An explicit threshold has never been formally adopted in most places. Some countries have defined thresholds, with some flexibility to consider other factors. An implicit threshold could be determined by research of funded cases. Expert commentary: CETs have had an important role as a 'bridging concept' between the world of academic research and the 'real world' of healthcare prioritization. The definition of a cost-effectiveness threshold is paramount for the construction of a transparent and efficient Health Technology Assessment system.

  19. Threshold Values for Identification of Contamination Predicted by Reduced-Order Models

    DOE PAGES

    Last, George V.; Murray, Christopher J.; Bott, Yi-Ju; ...

    2014-12-31

    The U.S. Department of Energy’s (DOE’s) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts on underground sources of drinking water (USDWs) if CO2 or brine leaks from deep CO2 storage reservoirs. Threshold values, below which there would be no predicted impacts, were determined for portions of two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency’s Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities.

  20. Processing circuitry for single channel radiation detector

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D. (Inventor); Delaune, Paul B. (Inventor); Turner, Kathryn M. (Inventor)

    2009-01-01

    Processing circuitry is provided for a high voltage operated radiation detector. An event detector utilizes a comparator configured to produce an event signal based on a leading edge threshold value. A preferred event detector does not produce another event signal until a trailing edge threshold value is satisfied. The event signal can be utilized for counting the number of particle hits and also for controlling data collection operation for a peak detect circuit and timer. The leading edge threshold value is programmable such that it can be reprogrammed by a remote computer. A digital high voltage control is preferably operable to monitor and adjust high voltage for the detector.
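
    For illustration, the leading-edge/trailing-edge logic described above behaves like a comparator with hysteresis: an event is registered when the signal crosses the leading-edge threshold, and no further event can be registered until the signal falls back below the trailing-edge threshold. The sketch below is a software analogue of that behavior under stated assumptions, not the patented circuitry.

    def count_events(samples, leading_thr, trailing_thr):
        """Count events with hysteresis: an event fires on crossing leading_thr
        and the detector re-arms only after the signal drops below trailing_thr."""
        events = 0
        armed = True
        for s in samples:
            if armed and s >= leading_thr:
                events += 1
                armed = False           # ignore further crossings until re-armed
            elif not armed and s < trailing_thr:
                armed = True            # trailing-edge threshold satisfied
        return events

    print(count_events([0, 2, 8, 9, 3, 1, 7, 2], leading_thr=5, trailing_thr=2))  # 2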

  1. Response of algal metrics to nutrients and physical factors and identification of nutrient thresholds in agricultural streams

    USGS Publications Warehouse

    Black, R.W.; Moran, P.W.; Frankforter, J.D.

    2011-01-01

    Many streams within the United States are impaired due to nutrient enrichment, particularly in agricultural settings. The present study examines the response of benthic algal communities in agricultural and minimally disturbed sites from across the western United States to a suite of environmental factors, including nutrients, collected at multiple scales. The first objective was to identify the relative importance of nutrients, habitat and watershed features, and macroinvertebrate trophic structure to explain algal metrics derived from deposition and erosion habitats. The second objective was to determine if thresholds in total nitrogen (TN) and total phosphorus (TP) related to algal metrics could be identified and how these thresholds varied across metrics and habitats. Nutrient concentrations within the agricultural areas were elevated and greater than published threshold values. All algal metrics examined responded to nutrients as hypothesized. Although nutrients typically were the most important variables in explaining the variation in each of the algal metrics, environmental factors operating at multiple scales also were important. Calculated thresholds for TN or TP based on the algal metrics generated from samples collected from erosion and deposition habitats were not significantly different. Little variability in threshold values for each metric for TN and TP was observed. The consistency of the threshold values measured across multiple metrics and habitats suggest that the thresholds identified in this study are ecologically relevant. Additional work to characterize the relationship between algal metrics, physical and chemical features, and nuisance algal growth would be of benefit to the development of nutrient thresholds and criteria. ?? 2010 The Author(s).

  2. Response of algal metrics to nutrients and physical factors and identification of nutrient thresholds in agricultural streams.

    PubMed

    Black, Robert W; Moran, Patrick W; Frankforter, Jill D

    2011-04-01

    Many streams within the United States are impaired due to nutrient enrichment, particularly in agricultural settings. The present study examines the response of benthic algal communities in agricultural and minimally disturbed sites from across the western United States to a suite of environmental factors, including nutrients, collected at multiple scales. The first objective was to identify the relative importance of nutrients, habitat and watershed features, and macroinvertebrate trophic structure to explain algal metrics derived from deposition and erosion habitats. The second objective was to determine if thresholds in total nitrogen (TN) and total phosphorus (TP) related to algal metrics could be identified and how these thresholds varied across metrics and habitats. Nutrient concentrations within the agricultural areas were elevated and greater than published threshold values. All algal metrics examined responded to nutrients as hypothesized. Although nutrients typically were the most important variables in explaining the variation in each of the algal metrics, environmental factors operating at multiple scales also were important. Calculated thresholds for TN or TP based on the algal metrics generated from samples collected from erosion and deposition habitats were not significantly different. Little variability in threshold values for each metric for TN and TP was observed. The consistency of the threshold values measured across multiple metrics and habitats suggest that the thresholds identified in this study are ecologically relevant. Additional work to characterize the relationship between algal metrics, physical and chemical features, and nuisance algal growth would be of benefit to the development of nutrient thresholds and criteria.

  3. An assessment of air quality reflecting the chemosensory irritation impact of mixtures of volatile organic compounds.

    PubMed

    Abraham, Michael H; Gola, Joelle M R; Cometto-Muñiz, J Enrique

    2016-01-01

    We present a method to assess the air quality of an environment based on the chemosensory irritation impact of mixtures of volatile organic compounds (VOCs) present in such environment. We begin by approximating the sigmoid function that characterizes psychometric plots of probability of irritation detection (Q) versus VOC vapor concentration to a linear function. First, we apply an established equation that correlates and predicts human sensory irritation thresholds (SIT) (i.e., nasal and eye irritation) based on the transfer of the VOC from the gas phase to biophases, e.g., nasal mucus and tear film. Second, we expand the equation to include other biological data (e.g., odor detection thresholds) and to include further VOCs that act mainly by "specific" effects rather than by transfer (i.e., "physical") effects as defined in the article. Then we show that, for 72 VOCs in common, Q values based on our calculated SITs are consistent with the Threshold Limit Values (TLVs) listed for those same VOCs on the basis of sensory irritation by the American Conference of Governmental Industrial Hygienists (ACGIH). Third, we set two equations to calculate the probability (Qmix) that a given air sample containing a number of VOCs could elicit chemosensory irritation: one equation based on response addition (Qmix scale: 0.00 to 1.00) and the other based on dose addition (1000*Qmix scale: 0 to 2000). We further validate the applicability of our air quality assessment method by showing that both Qmix scales provide values consistent with the expected sensory irritation burden from VOC mixtures present in a wide variety of indoor and outdoor environments as reported on field studies in the literature. These scales take into account both the concentration of VOCs at a particular site and the propensity of the VOCs to evoke sensory irritation. Copyright © 2015 Elsevier Ltd. All rights reserved.
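
    For context, the response-addition combination referenced above is, in its usual form, Qmix = 1 − ∏(1 − Qi): the probability that at least one VOC in the mixture is detected as irritating, treating detections as independent. The sketch below implements only that standard formula; the linearized psychometric functions and the dose-addition scale from the article are not reproduced, and the Qi values are invented.

    from functools import reduce

    def qmix_response_addition(q_values):
        """Response addition: Qmix = 1 - prod(1 - Qi), each Qi in [0, 1]."""
        return 1.0 - reduce(lambda acc, q: acc * (1.0 - q), q_values, 1.0)

    # Three VOCs with individual irritation-detection probabilities:
    print(qmix_response_addition([0.10, 0.05, 0.20]))  # ~0.316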

  4. Brief communication: Using averaged soil moisture estimates to improve the performances of a regional-scale landslide early warning system

    NASA Astrophysics Data System (ADS)

    Segoni, Samuele; Rosi, Ascanio; Lagomarsino, Daniela; Fanti, Riccardo; Casagli, Nicola

    2018-03-01

    We communicate the results of a preliminary investigation aimed at improving a state-of-the-art RSLEWS (regional-scale landslide early warning system) based on rainfall thresholds by integrating mean soil moisture values averaged over the territorial units of the system. We tested two approaches. The simplest can be easily applied to improve other RSLEWS: it is based on a soil moisture threshold value under which rainfall thresholds are not used because landslides are not expected to occur. Another approach deeply modifies the original RSLEWS: thresholds based on antecedent rainfall accumulated over long periods are substituted with soil moisture thresholds. A back analysis demonstrated that both approaches consistently reduced false alarms, while the second approach reduced missed alarms as well.
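
    The simpler of the two approaches amounts to a one-line gating rule: rainfall thresholds are only evaluated when the soil moisture averaged over the territorial unit exceeds a minimum value. The sketch below is a schematic of that rule; the parameter names and numbers are illustrative assumptions, not the operational settings of the system.

    def landslide_warning(rainfall_mm, soil_moisture, rain_thr_mm, moisture_thr):
        """Issue a warning only if averaged soil moisture exceeds its threshold
        AND rainfall exceeds the rainfall threshold for the territorial unit."""
        if soil_moisture < moisture_thr:
            return False                # too dry: landslides are not expected
        return rainfall_mm >= rain_thr_mm

    print(landslide_warning(80, soil_moisture=0.15, rain_thr_mm=60, moisture_thr=0.25))  # False
    print(landslide_warning(80, soil_moisture=0.35, rain_thr_mm=60, moisture_thr=0.25))  # True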

  5. Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates

    PubMed Central

    Malone, Brian J.

    2017-01-01

    Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis. PMID:28877194
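
    The two-step correction described above can be sketched as: threshold the STRF pixel-wise by a gain criterion, label contiguous suprathreshold clusters, and retain only clusters whose summed (absolute) gain exceeds a cluster-mass criterion. The code below is a schematic of that idea using scipy.ndimage labeling; it is not the authors' implementation, and the threshold values are placeholders.

    import numpy as np
    from scipy import ndimage

    def cluster_mass_threshold(strf, gain_thr, mass_thr):
        """Zero out STRF pixels outside contiguous clusters whose summed
        absolute gain exceeds mass_thr."""
        mask = np.abs(strf) > gain_thr            # step 1: pixel-wise gain threshold
        labels, n_clusters = ndimage.label(mask)  # step 2: contiguous clusters
        cleaned = np.zeros_like(strf)
        for k in range(1, n_clusters + 1):
            cluster = labels == k
            if np.abs(strf[cluster]).sum() >= mass_thr:   # step 3: cluster-mass test
                cleaned[cluster] = strf[cluster]
        return cleaned

    strf = np.random.default_rng(1).normal(size=(32, 64))
    print(np.count_nonzero(cluster_mass_threshold(strf, gain_thr=2.0, mass_thr=5.0)))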

  6. Detection and quantification system for monitoring instruments

    DOEpatents

    Dzenitis, John M [Danville, CA; Hertzog, Claudia K [Houston, TX; Makarewicz, Anthony J [Livermore, CA; Henderer, Bruce D [Livermore, CA; Riot, Vincent J [Oakland, CA

    2008-08-12

    A method of detecting real events by obtaining a set of recent signal results, calculating measures of the noise or variation based on the set of recent signal results, calculating an expected baseline value based on the set of recent signal results, determining sample deviation, calculating an allowable deviation by multiplying the sample deviation by a threshold factor, setting an alarm threshold from the baseline value plus or minus the allowable deviation, and determining whether the signal results exceed the alarm threshold.
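
    The claimed detection logic maps directly onto a few lines of code: estimate the expected baseline and a measure of variation from recent signal results, scale that variation by a threshold factor to get the allowable deviation, and flag values outside baseline ± allowable deviation. The sketch below is an illustrative rendering of that logic, not the patented implementation.

    import statistics

    def exceeds_alarm_threshold(recent_results, new_value, threshold_factor=3.0):
        """True if new_value falls outside baseline +/- allowable deviation,
        where baseline and sample deviation come from recent signal results."""
        baseline = statistics.mean(recent_results)      # expected baseline value
        sample_dev = statistics.stdev(recent_results)   # measure of noise/variation
        allowable = threshold_factor * sample_dev       # allowable deviation
        return abs(new_value - baseline) > allowable

    recent = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]
    print(exceeds_alarm_threshold(recent, 10.4))   # False: within allowable deviation
    print(exceeds_alarm_threshold(recent, 12.5))   # True: likely a real event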

  7. Implications of the New Centers for Disease Control and Prevention Blood Lead Reference Value

    PubMed Central

    Burns, Mackenzie S.; Gerstenberger, Shawn L.

    2014-01-01

    The Centers for Disease Control and Prevention recently established a new reference value (≥ 5 μg/dL) as the standard for identifying children with elevated blood lead levels (EBLs). At present, 535 000 US children aged 1 to 5 years (2.6%) are estimated to have EBLs according to the new standard, versus 0.8% according to the previous standard (≥ 10 μg/dL). Because EBLs signify the threshold for public health intervention, this new definition increases demands on lead poisoning prevention efforts. Primary prevention has been proven to reduce lead poisoning cases and is also cost effective; however, federal budget cuts threaten the existence of such programs. Protection for the highest-risk children necessitates a reinstatement of federal funding to previous levels. PMID:24825227

  8. Influence of taekwondo as security martial arts training on anaerobic threshold, cardiorespiratory fitness, and blood lactate recovery.

    PubMed

    Kim, Dae-Young; Seo, Byoung-Do; Choi, Pan-Am

    2014-04-01

    [Purpose] This study was conducted to determine the influence of Taekwondo as security martial arts training on anaerobic threshold, cardiorespiratory fitness, and blood lactate recovery. [Subjects and Methods] Fourteen healthy university students were recruited and divided into an exercise group and a control group (n = 7 in each group). The subjects who participated in the experiment were subjected to an exercise loading test in which anaerobic threshold, value of ventilation, oxygen uptake, maximal oxygen uptake, heart rate, and maximal values of ventilation / heart rate were measured during the exercise, immediately after maximum exercise loading, and at 1, 3, 5, 10, and 15 min of recovery. [Results] At the anaerobic threshold time point, the exercise group showed a significantly longer time to reach anaerobic threshold. The exercise group showed significantly higher values for the time to reach VO2max, maximal values of ventilation, maximal oxygen uptake and maximal values of ventilation / heart rate. Significant changes were observed in the value of ventilation volumes at the 1- and 5-min recovery time points within the exercise group; oxygen uptake and maximal oxygen uptake were significantly different at the 5- and 10-min time points; heart rate was significantly different at the 1- and 3-min time points; and maximal values of ventilation / heart rate was significantly different at the 5-min time point. The exercise group showed significant decreases in blood lactate levels at the 15- and 30-min recovery time points. [Conclusion] The study results revealed that Taekwondo as a security martial arts training increases the maximal oxygen uptake and anaerobic threshold and accelerates an individual's recovery to the normal state of cardiorespiratory fitness and blood lactate level. These results are expected to contribute to the execution of more effective security services in emergencies in which violence can occur.

  9. 78 FR 79283 - Community Reinvestment Act Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-30

    ... thresholds used to define ``small bank'' or ``small savings association'' and ``intermediate small bank'' or ``intermediate small savings association.'' As required by the CRA regulations, the adjustment to the threshold... Agencies' CRA regulations establish CRA performance standards for small and intermediate small banks and...

  10. Detection thresholds for gaps, overlaps, and no-gap-no-overlaps.

    PubMed

    Heldner, Mattias

    2011-07-01

    Detection thresholds for gaps and overlaps, that is acoustic and perceived silences and stretches of overlapping speech in speaker changes, were determined. Subliminal gaps and overlaps were categorized as no-gap-no-overlaps. The established gap and overlap detection thresholds both corresponded to the duration of a long vowel, or about 120 ms. These detection thresholds are valuable for mapping the perceptual speaker change categories gaps, overlaps, and no-gap-no-overlaps into the acoustic domain. Furthermore, the detection thresholds allow generation and understanding of gaps, overlaps, and no-gap-no-overlaps in human-like spoken dialogue systems. © 2011 Acoustical Society of America

  11. Geochemistry of soils along a transect from Central Mexico to the Pacific Coast: a pilot study for continental-scale geochemical mapping

    USGS Publications Warehouse

    Chiprés, J.A.; de la Calleja,; Tellez, J.I.; Jiménez, F.; Cruz, Carlos; Guerrero, E.G.; Castro, J.; Monroy, M.G.; Salinas, J.C.

    2009-01-01

    The Mexican Geological Survey (SGM), the National Institute of Statistics, Geography and Informatics (INEGI) and the Autonomous University of San Luis Potosi (UASLP) have established a multidisciplinary team with the objective of creating a national program of geochemical mapping of soils in Mexico. This is being done as part of the North American Soil Geochemical Landscapes Project in partnership with the US Geological Survey and the Geological Survey of Canada. As the first step, a pilot study was conducted over a transect that extends from the Mexico–US border near Ciudad Juarez in the north to the Pacific Ocean in the south. This pilot transect was conducted in two phases, and this paper presents results from the first phase, which sampled soils at about a 40-km spacing along a 730-km transect beginning in Central Mexico and ending at the Pacific Coast. Samples were collected from the A and C horizons at each site and 60 elements were analyzed. This pilot study demonstrates that geochemical mapping based on a 40-km spacing is adequate to identify broad-scale geochemical patterns. Geologic influence (i.e., soil parent material) was the most important factor influencing the distribution of elements along the transect, followed by the influence of regional mineralization. The study also showed that influence by human activities over the transect is minimal except possibly in large mining districts. A comparison of element abundance in the A horizon with the environmental soil guidelines in Mexico showed that the natural concentrations of the studied soils were lower than the established threshold for soil restoration with the exception of V and As. The former had a median value (75 mg/kg) approximately equal to the value established in Mexico for soil restoration in agricultural and residential lands (78 mg/kg), and the latter had three values higher than the 22 mg/kg threshold for soil restoration in agricultural and residential lands. These cases demonstrate the importance of knowing the national- and regional-scale geochemistry of Mexican soils as a support for the decision-making process, particularly for the proper formulation and application of soil guidelines designed to protect human and ecosystem health.

  12. A new concept for the environmental risk assessment of poorly water soluble compounds and its application to consumer products.

    PubMed

    Tolls, Johannes; Müller, Martin; Willing, Andreas; Steber, Josef

    2009-07-01

    Many consumer products contain lipophilic, poorly soluble ingredients representing large-volume substances whose aquatic toxicity cannot be adequately determined with standard methods for a number of reasons. In such cases, a recently developed approach can be used to define an aquatic exposure threshold of no concern (ETNCaq; i.e., a concentration below which no adverse effects on the environment are to be expected). A risk assessment can be performed by comparing the ETNCaq value with the aquatic exposure levels of poorly soluble substances. Accordingly, the aquatic exposure levels of substances with water solubility below the ETNCaq will not exceed the ecotoxicological no-effect concentration; therefore, their risk can be assessed as being negligible. The ETNCaq value relevant for substances with a narcotic mode of action is 1.9 microg/L. To apply the above risk assessment strategy, the solubility in water needs to be known. Most frequently, this parameter is estimated by means of quantitative structure/activity relationships based on the log octanol-water partition coefficient (log Kow). The predictive value of several calculation models for water solubility has been investigated by this method with the use of more recent experimental solubility data for lipophilic compounds. A linear regression model was shown to be the most suitable for providing correct predictions without underestimation of real water solubility. To define a log Kow threshold suitable for reliably predicting a water solubility of less than 1.9 microg/L, a confidence limit was established by statistical comparison of the experimental solubility data with their log Kow. It was found that a threshold of log Kow = 7 generally allows discrimination between substances with solubility greater than and less than 1.9 microg/L. Accordingly, organic substances with a baseline toxicity and log Kow > 7 do not require further testing to prove that they have low environmental risk. In applying this concept, the uncertainty of the prediction of water solubility can be accounted for. If the predicted solubility in water is to be below ETNCaq with a probability of 95%, the corresponding log Kow value is 8.
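
    As a worked illustration of the screening rule above (the numeric cut-offs are taken from the abstract; the function itself and its name are assumptions): a substance with baseline (narcotic) toxicity whose log Kow exceeds 7 is predicted to have a water solubility below the ETNCaq of 1.9 microg/L, and a log Kow of at least 8 is needed if that prediction is to hold with 95% probability.

    def etnc_screen(log_kow, require_95_percent_confidence=False):
        """Return True if the substance is predicted to have water solubility
        below the ETNCaq of 1.9 microg/L, so its aquatic risk can be assessed
        as negligible without further ecotoxicity testing."""
        cutoff = 8.0 if require_95_percent_confidence else 7.0
        return log_kow > cutoff

    print(etnc_screen(7.5))                                      # True
    print(etnc_screen(7.5, require_95_percent_confidence=True))  # False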

  13. Temporal and Spatial Development of dB/dt During Substorms

    NASA Astrophysics Data System (ADS)

    Weygand, J. M.; Chu, X.

    2017-12-01

    Ground induced currents (GICs) due to space weather are a threat to high voltage power transmission systems. However, knowledge of ground conductivity is the largest source of errors in the determination of GICs. A good proxy for GICs is dB/dt obtained from the Bx and By components of the magnetic field fluctuations. It is known that dB/dt values associated with magnetic storms can reach dangerous levels for power transmission systems. On the other hand, it is not uncommon for dB/dt values associated with substorms to exceed critical thresholds of 1.5 nT/s [Pulkkinen et al., 2011; 2013] and 5 nT/s [Molinski et al., 2000], yet the temporal and spatial changes of the dB/dt associated with substorms, unlike those of storms, are not well understood. Using two dimensional maps of dB/dt over North America and Greenland derived from the spherical elementary currents [Weygand et al., 2011], we investigate the temporal and spatial change of dB/dt for both a single substorm event and a two dimensional superposed epoch analysis of many substorms. Both the single event and the statistical analysis show a sudden increase of dB/dt at substorm onset followed by an expansion poleward, westward, and eastward after the onset during the expansion phase. This temporal and spatial development of the dB/dt resembles the temporal and spatial change of the auroral emissions. Substorm values of dB/dt peak shortly after the auroral onset time and in at least one event exceeded 6.5 nT/s for a non-storm time substorm. In many of our 24 cases the area that exceeds the Pulkkinen et al. [2011; 2013] threshold of 1.5 nT/s extends over several million square kilometers, and after about 30 minutes the dB/dt values fall below the threshold level. These results address one of the goals of the Space Weather Action Plan, which is to establish benchmarks for space weather events and improve modeling and prediction of their impacts on infrastructure.
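
    For context, the dB/dt proxy mentioned above is commonly taken as the magnitude of the horizontal time derivative, |dB/dt| = sqrt((dBx/dt)^2 + (dBy/dt)^2), compared against the 1.5 nT/s and 5 nT/s thresholds. The sketch below makes that assumption explicit; the synthetic time series is purely illustrative.

    import numpy as np

    def dbdt_exceedance(bx, by, dt_seconds, threshold=1.5):
        """Horizontal |dB/dt| in nT/s from Bx, By series (nT), plus a mask of
        samples exceeding the GIC-relevant threshold."""
        dbx = np.gradient(bx, dt_seconds)
        dby = np.gradient(by, dt_seconds)
        dbdt = np.hypot(dbx, dby)
        return dbdt, dbdt > threshold

    t = np.arange(0, 600, 10.0)               # 10-s cadence, 10 minutes
    bx = 50 * np.sin(2 * np.pi * t / 300)     # synthetic substorm-like variation
    by = 20 * np.cos(2 * np.pi * t / 300)
    dbdt, flags = dbdt_exceedance(bx, by, 10.0)
    print(round(dbdt.max(), 2), int(flags.sum()))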

  14. You Spin my Head Right Round: Threshold of Limited Immersion for Rotation Gains in Redirected Walking.

    PubMed

    Schmitz, Patric; Hildebrandt, Julian; Valdez, Andre Calero; Kobbelt, Leif; Ziefle, Martina

    2018-04-01

    In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user's virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation-or getting cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly-controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications. While these studies suggest that only relatively low degrees of manipulation are tolerable, we claim that, besides establishing detection thresholds, it is important to know when the user's immersion breaks. We hypothesize that the degree of unnoticed manipulation is significantly different from the detection threshold when the user is immersed in a task. We conducted three studies: a) to devise an experimental paradigm to measure the threshold of limited immersion (TLI), b) to measure the TLI for slowly decreasing and increasing rotation gains, and c) to establish a baseline of cybersickness for our experimental setup. For rotation gains greater than 1.0, we found that immersion breaks quite late after the gain is detectable. However, for gains lesser than 1.0, some users reported a break of immersion even before established detection thresholds were reached. Apparently, the developed metric measures an additional quality of user experience. This article contributes to the development of effective spatial compression methods by utilizing the break of immersion as a benchmark for redirection techniques.

  15. A compilation of safety impact information for extractables associated with materials used in pharmaceutical packaging, delivery, administration, and manufacturing systems.

    PubMed

    Jenke, Dennis; Carlson, Tage

    2014-01-01

    Demonstrating suitability for intended use is necessary to register packaging, delivery/administration, or manufacturing systems for pharmaceutical products. During their use, such systems may interact with the pharmaceutical product, potentially adding extraneous entities to those products. These extraneous entities, termed leachables, have the potential to affect the product's performance and/or safety. To establish the potential safety impact, drug products and their packaging, delivery, or manufacturing systems are tested for leachables or extractables, respectively. This generally involves testing a sample (either the extract or the drug product) by a means that produces a test method response and then correlating the test method response with the identity and concentration of the entity causing the response. Oftentimes, analytical tests produce responses that cannot readily establish the associated entity's identity. Entities associated with un-interpretable responses are termed unknowns. Scientifically justifiable thresholds are used to establish those individual unknowns that represent an acceptable patient safety risk and thus which do not require further identification and, conversely, those unknowns whose potential safety impact require that they be identified. Such thresholds are typically based on the statistical analysis of datasets containing toxicological information for more or less relevant compounds. This article documents toxicological information for over 540 extractables identified in laboratory testing of polymeric materials used in pharmaceutical applications. Relevant toxicological endpoints, such as NOELs (no observed effects), NOAELs (no adverse effects), TDLOs (lowest published toxic dose), and others were collated for these extractables or their structurally similar surrogates and were systematically assessed to produce a risk index, which represents a daily intake value for life-long intravenous administration. This systematic approach uses four uncertainty factors, each assigned a factor of 10, which consider the quality and relevance of the data, differences in route of administration, non-human species to human extrapolations, and inter-individual variation among humans. In addition to the risk index values, all extractables and most of their surrogates were classified for structural safety alerts using Cramer rules and for mutagenicity alerts using an in silico approach (Benigni/Bossa rule base for mutagenicity via Toxtree). Lastly, in vitro mutagenicity data (Ames Salmonella typimurium and Mouse Lymphoma tests) were collected from available databases (Chemical Carcinogenesis Research Information and Carcinogenic Potency Database). The frequency distributions of the resulting data were established; in general risk index values were normally distributed around a band ranging from 5 to 20 mg/day. The risk index associated with 95% level of the cumulative distribution plot was approximately 0.1 mg/day. Thirteen extractables in the dataset had individual risk index values less than 0.1 mg/day, although four of these had additional risk indices, based on multiple different toxicological endpoints, above 0.1 mg/day. Additionally, approximately 50% of the extractables were classified in Cramer Class 1 (low risk of toxicity) and approximately 35% were in Cramer Class 3 (no basis to assume safety). Lastly, roughly 20% of the extractables triggered either an in vitro or in silico alert for mutagenicity. 
When Cramer classifications and the mutagenicity alerts were compared to the risk indices, extractables with safety alerts generally had lower risk index values, although the differences in the risk index distributions between extractables with and without alerts were small and subtle. Leachables from packaging systems, manufacturing systems, or delivery devices can accumulate in drug products and potentially affect the drug product. Although drug products can be analyzed for leachables (and material extracts can be analyzed for extractables), not all leachables or extractables can be fully identified. Safety thresholds can be used to establish whether the unidentified substances can be deemed to be safe or whether additional analytical efforts need to be made to secure the identities. These thresholds are typically based on the statistical analysis of datasets containing toxicological information for more or less relevant compounds. This article contains safety data for over 500 extractables that were identified in laboratory characterizations of polymers used in pharmaceutical applications. The safety data consists of structural toxicity classifications of the extractables as well as calculated risk indices, where the risk indices were obtained by subjecting toxicological safety data, such as NOELs (no observed effects), NOAELs (no adverse effects), TDLOs (lowest published toxic dose), and others, to a systematic evaluation process using appropriate uncertainty factors. Thus, the risk index values represent daily exposures for the lifetime intravenous administration of drugs. The frequency distributions of the risk indices and Cramer classifications were examined. The risk index values were normally distributed around a range of 5 to 20 mg/day, and the risk index associated with the 95% level of the cumulative frequency plot was 0.1 mg/day. Approximately 50% of the extractables were in Cramer Class 1 (low risk of toxicity) and approximately 35% were in Cramer Class 3 (high risk of toxicity). Approximately 20% of the extractables produced an in vitro or in silico mutagenicity alert. In general, the distribution of risk index values was not strongly correlated with either the extractables' Cramer classification or their mutagenicity alerts. However, extractables with either in vitro or in silico alerts were somewhat more likely to have low risk index values. © PDA, Inc. 2014.
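
    The risk index calculation described above reduces to dividing the selected toxicological endpoint by the product of the four uncertainty factors of 10. The sketch below shows that arithmetic with a hypothetical extractable and endpoint value; it is a simplification of the systematic assessment, not a reproduction of it.

    def risk_index(endpoint_mg_per_day, uncertainty_factors=(10, 10, 10, 10)):
        """Risk index (mg/day for lifelong intravenous exposure) = toxicological
        endpoint divided by the product of the uncertainty factors (data quality
        and relevance, route of administration, interspecies extrapolation, and
        inter-individual variation)."""
        divisor = 1
        for factor in uncertainty_factors:
            divisor *= factor
        return endpoint_mg_per_day / divisor

    # Hypothetical extractable with a NOAEL-derived endpoint of 100 mg/day:
    print(risk_index(100))   # 0.01 mg/day, below the 0.1 mg/day level noted above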

  16. Evaluation of induced color changes in chicken breast meat during simulation of pink color defect.

    PubMed

    Holownia, K; Chinnan, M S; Reynolds, A E; Koehler, P E

    2003-06-01

    The objective of the study was to establish a pink threshold and simulate the pink defect in cooked chicken breast meat with treatment combinations that would induce significant changes in the color of raw and cooked meat. The subjective pink threshold used in judging pink discoloration was established at a* = 3.8. Samples of three color groups (normal, lighter than normal, and darker than normal) of boneless, skinless chicken breast muscles were selected based on instrumental color values. The in situ changes were induced using sodium chloride, sodium tripolyphosphate, sodium erythorbate, and sodium nitrite at two levels: present and not present. Fillets in all treatments were subjected to individual injections, followed by tumbling, cooking, and chilling. Samples were analyzed for color [lightness (L*), red/green axis (a*), yellow/blue axis (b*)] and reflectance spectra. Simulation of the pink defect was achieved in eight of the 16 treatment combinations when sodium nitrite was present and in an additional two treatment combinations when it was absent. Pinking in cooked samples was affected (P < 0.05) by L* of raw meat color. Results confirmed that it was possible to simulate the undesired pinking in cooked chicken white meat when in situ conditions were induced by sodium chloride, sodium tripolyphosphate, and sodium nitrite. The continuation of the simulation study can aid in developing alternative processing methods to eliminate potential pink defects.

  17. Exploratory and spatial data analysis (EDA-SDA) for determining regional background levels and anomalies of potentially toxic elements in soils from Catorce-Matehuala, Mexico

    USGS Publications Warehouse

    Chiprés, J.A.; Castro-Larragoitia, J.; Monroy, M.G.

    2009-01-01

    The threshold between geochemical background and anomalies can be influenced by the methodology selected for its estimation. Environmental evaluations, particularly those conducted in mineralized areas, must consider this when trying to determinate the natural geochemical status of a study area, quantifying human impacts, or establishing soil restoration values for contaminated sites. Some methods in environmental geochemistry incorporate the premise that anomalies (natural or anthropogenic) and background data are characterized by their own probabilistic distributions. One of these methods uses exploratory data analysis (EDA) on regional geochemical data sets coupled with a geographic information system (GIS) to spatially understand the processes that influence the geochemical landscape in a technique that can be called a spatial data analysis (SDA). This EDA-SDA methodology was used to establish the regional background range from the area of Catorce-Matehuala in north-central Mexico. Probability plots of the data, particularly for those areas affected by human activities, show that the regional geochemical background population is composed of smaller subpopulations associated with factors such as soil type and parent material. This paper demonstrates that the EDA-SDA method offers more certainty in defining thresholds between geochemical background and anomaly than a numeric technique, making it a useful tool for regional geochemical landscape analysis and environmental geochemistry studies.

  18. De Minimis Thresholds for Federal Building Metering Appropriateness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, Jordan W.

    2015-03-31

    The U.S. Department of Energy (DOE) is required by statute and Presidential Memorandum to establish guidelines for agencies to meter their Federal buildings for energy (electricity, natural gas, and steam) and water. See 42 U.S.C. § 8253(e). DOE issued guidance in February 2006 on the installation of electric meters in Federal buildings. A recent update to the 2006 guidance accounts for more current metering practices within the Federal Government. The updated metering guidance specifies that all Federal buildings shall be considered “appropriate” for energy or water metering unless identified for potential exclusion. In the statute underlying the updated guidance, Congress also directed DOE to (among other things) establish exclusions from the metering requirements based on the de minimis quantity of energy use of a Federal building, industrial process, or structure. This paper discusses the method used to identify de minimis values.

  19. Applied regional monitoring of the vernal advancement and retrogradation (green wave effect) of natural vegetation in the Great Plains corridor. [Southwest Texas

    NASA Technical Reports Server (NTRS)

    Harlan, J. C.; Deering, D. W. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. Rangelands in southwest Texas were used to establish threshold values and limitations on measuring herbaceous biomass under typical arid and semi-arid range conditions. Previous regression relationships established between ND6 and green biomass for two different ecosystems were similar. The west Texas data set for brush-free sites was too small to be statistically conclusive. It appears that a line with a third (and steeper) slope would be best for the west Texas data, and that line would intersect the other two. Results show that similar relationships exist between ND6 and green biomass under low brush canopy cover conditions, but local variations require a calibration to determine the best fit for an ecosystem. The brush canopy has a detrimental effect on the ND6 vs. herbaceous green biomass relationship.

  20. [Clinical experiences with four newly developed, surface modified stimulation electrodes].

    PubMed

    Winter, U J; Fritsch, J; Liebing, J; Höpp, H W; Hilger, H H

    1993-05-01

    Newly developed pacing electrodes with so-called porous surfaces promise significantly improved post-operative pacing and sensing thresholds. We therefore investigated four newly developed leads (ELA-PMCF-860 n = 10; Biotronik-60/4-DNP n = 10, CPI-4010 n = 10, Intermedics-421-03-Biopore n = 6) connected to two different pacing devices (Intermedics NOVA II, Medtronic PASYS) in 36 patients (18 men, 18 women, age: 69.7 +/- 9.8 years) suffering from symptomatic bradycardia. The individual electrode maturation process was investigated by means of repeated measurements of pacing threshold and electrode impedance in the acute, subacute, and chronic phases, as well as energy consumption and sensing behavior in the chronic phase. With the exception of the 4010, the investigated leads showed widely varying pacing threshold values, with individual peaks occurring from the second up to the 13th week. All leads had similar chronic pacing thresholds (PMCF 0.13 +/- 0.07; DNP 0.25 +/- 0.18; Biopore 0.15 +/- 0.05; 4010 0.14 +/- 0.05 ms). Impedance measurements revealed higher, but not significantly different, values for the DNP (PMCF 582 +/- 112, DNP 755 +/- 88, Biopore 650 +/- 15, 4010 718 +/- 104 Ohm). Despite differing values for pacing threshold and impedance, the energy consumption in the chronic phase during threshold-adapted but secure stimulation (3 * impulse width at pacing threshold) was comparable.

  1. Characterizing Decision-Analysis Performances of Risk Prediction Models Using ADAPT Curves.

    PubMed

    Lee, Wen-Chung; Wu, Yun-Chun

    2016-01-01

    The area under the receiver operating characteristic curve is a widely used index to characterize the performance of diagnostic tests and prediction models. However, the index does not explicitly acknowledge the utilities of risk predictions. Moreover, for most clinical settings, what counts is whether a prediction model can guide therapeutic decisions in a way that improves patient outcomes, rather than to simply update probabilities. Based on decision theory, the authors propose an alternative index, the "average deviation about the probability threshold" (ADAPT). An ADAPT curve (a plot of ADAPT value against the probability threshold) neatly characterizes the decision-analysis performances of a risk prediction model. Several prediction models can be compared for their ADAPT values at a chosen probability threshold, for a range of plausible threshold values, or for the whole ADAPT curves. This should greatly facilitate the selection of diagnostic tests and prediction models.

  2. Trends in Medicare Part D Medication Therapy Management Eligibility Criteria

    PubMed Central

    Wang, Junling; Shih, Ya-Chen Tina; Qin, Yolanda; Young, Theo; Thomas, Zachary; Spivey, Christina A.; Solomon, David K.; Chisholm-Burns, Marie

    2015-01-01

    Background To increase the enrollment rate of medication therapy management (MTM) programs in Medicare Part D plans, the US Centers for Medicare & Medicaid Services (CMS) lowered the allowable eligibility thresholds based on the number of chronic diseases and Part D drugs for Medicare Part D plans for 2010 and after. However, an increase in MTM enrollment rates has not been realized. Objectives To describe trends in MTM eligibility thresholds used by Medicare Part D plans and to identify patterns that may hinder enrollment in MTM programs. Methods This study analyzed data extracted from the Medicare Part D MTM Programs Fact Sheets (2008–2014). The annual percentages of utilizing each threshold value of the number of chronic diseases and Part D drugs, as well as other aspects of MTM enrollment practices, were analyzed among Medicare MTM programs that were established by Medicare Part D plans. Results For 2010 and after, increased proportions of Medicare Part D plans set their eligibility thresholds at the maximum numbers allowable. For example, in 2008, 48.7% of Medicare Part D plans (N = 347:712) opened MTM enrollment to Medicare beneficiaries with only 2 chronic disease states (specific diseases varied between plans), whereas the other half restricted enrollment to patients with a minimum of 3 to 5 chronic disease states. After 2010, only approximately 20% of plans opened their MTM enrollment to patients with 2 chronic disease states, with the remaining 80% restricting enrollment to patients with 3 or more chronic diseases. Conclusion The policy change by CMS for 2010 and after is associated with increased proportions of plans setting their MTM eligibility thresholds at the maximum numbers allowable. Changes to the eligibility thresholds by Medicare Part D plans might have acted as a barrier for increased MTM enrollment. Thus, CMS may need to identify alternative strategies to increase MTM enrollment in Medicare plans. PMID:26380030

  3. Relationship between behavioral and physiological spectral-ripple discrimination.

    PubMed

    Won, Jong Ho; Clinard, Christopher G; Kwon, Seeyoun; Dasika, Vasant K; Nie, Kaibao; Drennan, Ward R; Tremblay, Kelly L; Rubinstein, Jay T

    2011-06-01

    Previous studies have found a significant correlation between spectral-ripple discrimination and speech and music perception in cochlear implant (CI) users. This relationship could be of use to clinicians and scientists who are interested in using spectral-ripple stimuli in the assessment and habilitation of CI users. However, previous psychoacoustic tasks used to assess spectral discrimination are not suitable for all populations, and it would be beneficial to develop methods that could be used to test all age ranges, including pediatric implant users. Additionally, it is important to understand how ripple stimuli are processed in the central auditory system and how their neural representation contributes to behavioral performance. For this reason, we developed a single-interval, yes/no paradigm that could potentially be used both behaviorally and electrophysiologically to estimate spectral-ripple threshold. In experiment 1, behavioral thresholds obtained using the single-interval method were compared to thresholds obtained using a previously established three-alternative forced-choice method. A significant correlation was found (r = 0.84, p = 0.0002) in 14 adult CI users. The spectral-ripple threshold obtained using the new method also correlated with speech perception in quiet and noise. In experiment 2, the effect of the number of vocoder-processing channels on the behavioral and physiological threshold in normal-hearing listeners was determined. Behavioral thresholds, using the new single-interval method, as well as cortical P1-N1-P2 responses changed as a function of the number of channels. Better behavioral and physiological performance (i.e., better discrimination ability at higher ripple densities) was observed as more channels added. In experiment 3, the relationship between behavioral and physiological data was examined. Amplitudes of the P1-N1-P2 "change" responses were significantly correlated with d' values from the single-interval behavioral procedure. Results suggest that the single-interval procedure with spectral-ripple phase inversion in ongoing stimuli is a valid approach for measuring behavioral or physiological spectral resolution.

  4. Using natural range of variation to set decision thresholds: a case study for great plains grasslands

    USGS Publications Warehouse

    Symstad, Amy J.; Jonas, Jayne L.; Edited by Guntenspergen, Glenn R.

    2014-01-01

    Natural range of variation (NRV) may be used to establish decision thresholds or action assessment points when ecological thresholds are either unknown or do not exist for attributes of interest in a managed ecosystem. The process for estimating NRV involves identifying spatial and temporal scales that adequately capture the heterogeneity of the ecosystem; compiling data for the attributes of interest via study of historic records, analysis and interpretation of proxy records, modeling, space-for-time substitutions, or analysis of long-term monitoring data; and quantifying the NRV from those data. At least 19 National Park Service (NPS) units in North America’s Great Plains are monitoring plant species richness and evenness as indicators of vegetation integrity in native grasslands, but little information on natural, temporal variability of these indicators is available. In this case study, we use six long-term vegetation monitoring datasets to quantify the temporal variability of these attributes in reference conditions for a variety of Great Plains grassland types, and then illustrate the implications of using different NRVs based on these quantities for setting management decision thresholds. Temporal variability of richness (as measured by the coefficient of variation, CV) is fairly consistent across the wide variety of conditions occurring in Colorado shortgrass prairie to Minnesota tallgrass sand savanna (CV 0.20–0.45) and generally less than that of production at the same sites. Temporal variability of evenness spans a greater range of CV than richness, and it is greater than that of production in some sites but less in other sites. This natural temporal variability may mask undesirable changes in Great Plains grasslands vegetation. Consequently, we suggest that managers consider using a relatively narrow NRV (interquartile range of all richness or evenness values observed in reference conditions) for designating a surveillance threshold, at which greater attention to the situation would be paid, and a broader NRV for designating management thresholds, at which action would be instigated.
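
    The surveillance-versus-management threshold idea above can be sketched numerically: take the richness (or evenness) values observed under reference conditions, use their interquartile range as the narrow NRV for surveillance thresholds, and use a broader band for management thresholds. The interquartile range follows the text; the specific quantiles for the broader band and all numbers below are illustrative assumptions.

    import numpy as np

    def nrv_thresholds(reference_values, broad_quantiles=(0.05, 0.95)):
        """Return (surveillance_band, management_band): the surveillance band is
        the interquartile range of reference-condition values, the management
        band a broader, user-chosen quantile range."""
        surveillance = tuple(np.quantile(reference_values, [0.25, 0.75]))
        management = tuple(np.quantile(reference_values, broad_quantiles))
        return surveillance, management

    richness = [18, 22, 25, 19, 27, 24, 21, 23, 26, 20]   # synthetic reference data
    print(nrv_thresholds(richness))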

  5. Are all intertidal wetlands naturally created equal? Bottlenecks, thresholds and knowledge gaps to mangrove and saltmarsh ecosystems

    USGS Publications Warehouse

    Friess, Daniel A.; Krauss, Ken W.; Horstman, Erik M.; Balke, Thorsten; Bouma, Tjeerd J.; Galli, Demis; Webb, Edward L.

    2011-01-01

    Intertidal wetlands such as saltmarshes and mangroves provide numerous important ecological functions, though they are in rapid and global decline. To better conserve and restore these wetland ecosystems, we need an understanding of the fundamental natural bottlenecks and thresholds to their establishment and long-term ecological maintenance. Despite inhabiting similar intertidal positions, the biological traits of these systems differ markedly in structure, phenology, life history, phylogeny and dispersal, suggesting large differences in biophysical interactions. By providing the first systematic comparison between saltmarshes and mangroves, we unravel how the interplay between species-specific life-history traits, biophysical interactions and biogeomorphological feedback processes determine where, when and what wetland can establish, the thresholds to long-term ecosystem stability, and constraints to genetic connectivity between intertidal wetland populations at the landscape level. To understand these process interactions, research into the constraints to wetland development, and biological adaptations to overcome these critical bottlenecks and thresholds requires a truly interdisciplinary approach.

  6. A Connection Admission Control Method for Web Server Systems

    NASA Astrophysics Data System (ADS)

    Satake, Shinsuke; Inai, Hiroshi; Saito, Tomoya; Arai, Tsuyoshi

    Most browsers establish multiple connections and download files in parallel to reduce the response time. On the other hand, a web server limits the total number of connections to prevent from being overloaded. That could decrease the response time, but would increase the loss probability, the probability of which a newly arriving client is rejected. This paper proposes a connection admission control method which accepts only one connection from a newly arriving client when the number of connections exceeds a threshold, but accepts new multiple connections when the number of connections is less than the threshold. Our method is aimed at reducing the response time by allowing as many clients as possible to establish multiple connections, and also reducing the loss probability. In order to reduce spending time to examine an adequate threshold for web server administrators, we introduce a procedure which approximately calculates the loss probability under a condition that the threshold is given. Via simulation, we validate the approximation and show effectiveness of the admission control.
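
    The admission rule of the proposed method can be summarized in a few lines: a newly arriving client is granted multiple parallel connections while the server-wide connection count is below the threshold, is granted a single connection once the count exceeds the threshold, and is rejected only when the server's hard connection limit is reached. The sketch below is a schematic of that rule; the parameter names and the handling of the hard limit are assumptions.

    def admit_client(current_connections, threshold, max_connections, requested=4):
        """Number of connections granted to a newly arriving client: multiple
        parallel connections below the threshold, a single connection between
        the threshold and the hard limit, zero (rejection) at the limit."""
        if current_connections >= max_connections:
            return 0                    # rejected; contributes to the loss probability
        if current_connections >= threshold:
            return 1                    # overload regime: one connection only
        return min(requested, max_connections - current_connections)

    print(admit_client(30, threshold=80, max_connections=100))   # 4
    print(admit_client(85, threshold=80, max_connections=100))   # 1
    print(admit_client(100, threshold=80, max_connections=100))  # 0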

  7. On the need for a time- and location-dependent estimation of the NDSI threshold value for reducing existing uncertainties in snow cover maps at different scales

    NASA Astrophysics Data System (ADS)

    Härer, Stefan; Bernhardt, Matthias; Siebers, Matthias; Schulz, Karsten

    2018-05-01

    Knowledge of current snow cover extent is essential for characterizing energy and moisture fluxes at the Earth's surface. The snow-covered area (SCA) is often estimated by using optical satellite information in combination with the normalized-difference snow index (NDSI). The NDSI is compared against a threshold value to define whether a satellite pixel is classified as snow covered or snow free. The spatiotemporal representativeness of the standard threshold of 0.4 is, however, questionable at the local scale. Here, we use local snow cover maps derived from ground-based photography to continuously calibrate the NDSI threshold values (NDSIthr) of Landsat satellite images at two European mountain sites over the period from 2010 to 2015. The Research Catchment Zugspitzplatt (RCZ, Germany) and Vernagtferner area (VF, Austria) are both located within a single Landsat scene. Nevertheless, the long-term analysis demonstrated that the NDSIthr values at these sites are not correlated (r = 0.17) and differ from the standard threshold of 0.4. For further comparison, a dynamic and locally optimized NDSI threshold was used, as well as another locally optimized literature threshold value (0.7). It was shown that large uncertainties in the prediction of the SCA, of up to 24.1 %, exist in satellite snow cover maps in cases where the standard threshold of 0.4 is used, but a newly developed calibrated quadratic polynomial model which accounts for seasonal threshold dynamics can reduce this error. The model reduced the SCA uncertainties at the calibration site VF by 50 % in the evaluation period and also significantly improved the results at RCZ. Additionally, a scaling experiment shows that the positive effect of a locally adapted threshold diminishes at pixel sizes of 500 m or larger, underlining the general applicability of the standard threshold at larger scales.
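
    For reference, the NDSI is computed from the green and shortwave-infrared reflectances as NDSI = (green - SWIR) / (green + SWIR), and a pixel is classified as snow covered when the NDSI exceeds the chosen threshold. The sketch below uses that standard definition with generic band names and synthetic reflectances; it does not reproduce the study's calibrated, seasonally varying threshold model.

    import numpy as np

    def snow_cover_mask(green, swir, ndsi_threshold=0.4):
        """Classify pixels as snow covered where
        NDSI = (green - swir) / (green + swir) exceeds the threshold."""
        green = np.asarray(green, dtype=float)
        swir = np.asarray(swir, dtype=float)
        ndsi = (green - swir) / (green + swir + 1e-12)   # guard against division by zero
        return ndsi > ndsi_threshold

    green = np.array([[0.60, 0.30], [0.55, 0.10]])   # synthetic reflectances
    swir = np.array([[0.10, 0.25], [0.40, 0.08]])
    print(snow_cover_mask(green, swir))                       # standard 0.4 threshold
    print(snow_cover_mask(green, swir, ndsi_threshold=0.7))   # locally adapted threshold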

  8. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was twofold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
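    At its core, the automated log-file analysis reduces to comparing each recorded deviation against a tolerance threshold. The short sketch below illustrates only that comparison step; the tolerance values and record layout are placeholders and are not taken from the consortium's software.

```python
# Illustrative sketch of flagging log-file deviations that exceed QA tolerances.
# Tolerances and the record format are assumptions, not published values.

TOLERANCES_MM = {"mlc_leaf": 1.0, "jaw": 1.0, "gantry_sag": 0.5}

def flag_deviations(records, tolerances=TOLERANCES_MM):
    """records: iterable of (parameter, expected_mm, actual_mm) tuples.
    Returns the subset whose |expected - actual| exceeds the tolerance."""
    flagged = []
    for parameter, expected, actual in records:
        deviation = abs(expected - actual)
        if deviation > tolerances.get(parameter, 0.0):
            flagged.append((parameter, deviation))
    return flagged

print(flag_deviations([("mlc_leaf", 10.0, 10.3), ("jaw", 50.0, 51.5)]))
# -> [('jaw', 1.5)]
```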

  9. Enhancement of the Daytime MODIS Based Aircraft Icing Potential Algorithm Using Mesoscale Model Data

    DTIC Science & Technology

    2006-03-01

    [Excerpt from the report's list of figures and tables] Figure 25 and Figure 26: ROC curves using 3 hour PIREPs and Alexander Tmap with symbols plotted at the 0.5 threshold values. Table 4: Results using T icing potential values from the Alexander Tmap and 3 hour PIREPs.

  10. Measurand transient signal suppressor

    NASA Technical Reports Server (NTRS)

    Bozeman, Richard J., Jr. (Inventor)

    1994-01-01

    A transient signal suppressor for use in a control system is presented. The system is adapted to respond to a change in a physical parameter whenever it crosses a predetermined threshold value in a selected direction of increasing or decreasing values with respect to the threshold value and is sustained for a selected discrete time interval. The suppressor includes a sensor transducer for sensing the physical parameter and generating an electrical input signal whenever the sensed physical parameter crosses the threshold level in the selected direction. A manually operated switch is provided for adapting the suppressor to produce an output drive signal whenever the physical parameter crosses the threshold value in the selected direction of increasing or decreasing values. A time delay circuit is selectively adjustable for suppressing the transducer input signal for a preselected one of a plurality of available discrete suppression times, producing an output signal only if the input signal is sustained for a time greater than the selected suppression time. An electronic gate is coupled to receive the transducer input signal and the timer output signal and produce an output drive signal for energizing a control relay whenever the transducer input is a non-transient signal which is sustained beyond the selected time interval.
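    The behaviour described above is essentially a threshold crossing followed by a hold-time check. The following Python sketch is a software analogue of that logic, not a model of the patented circuit; the parameter names and the sample signal are illustrative.

```python
# Software analogue of the suppressor: an input crossing the threshold only
# produces an output if it stays across the threshold for the hold time.

def suppress_transients(samples, threshold, hold_time, dt, rising=True):
    """samples: sequence of measurand values taken every dt seconds.
    Returns a list of booleans: True where the output drive signal is asserted."""
    output = []
    elapsed = 0.0
    for value in samples:
        crossed = value > threshold if rising else value < threshold
        elapsed = elapsed + dt if crossed else 0.0
        output.append(elapsed >= hold_time)   # assert only for sustained crossings
    return output

# A 2-sample spike is suppressed; a sustained crossing is passed through.
signal = [0, 0, 5, 5, 0, 0, 5, 5, 5, 5]
print(suppress_transients(signal, threshold=3, hold_time=0.3, dt=0.1))
# [False, False, False, False, False, False, False, False, True, True]
```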

  11. In-Network Processing of an Iceberg Join Query in Wireless Sensor Networks Based on 2-Way Fragment Semijoins

    PubMed Central

    Kang, Hyunchul

    2015-01-01

    We investigate the in-network processing of an iceberg join query in wireless sensor networks (WSNs). An iceberg join is a special type of join where only those joined tuples whose cardinality exceeds a certain threshold (called iceberg threshold) are qualified for the result. Processing such a join involves the value matching for the join predicate as well as the checking of the cardinality constraint for the iceberg threshold. In the previous scheme, the value matching is carried out as the main task for filtering non-joinable tuples while the iceberg threshold is treated as an additional constraint. We take an alternative approach, meeting the cardinality constraint first and matching values next. In this approach, with a logical fragmentation of the join operand relations on the aggregate counts of the joining attribute values, the optimal sequence of 2-way fragment semijoins is generated, where each fragment semijoin employs a Bloom filter as a synopsis of the joining attribute values. This sequence filters non-joinable tuples in an energy-efficient way in WSNs. Through implementation and a set of detailed experiments, we show that our alternative approach considerably outperforms the previous one. PMID:25774710
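    The core of the "cardinality first, values next" idea can be illustrated with a small centralized sketch: per-value counts decide whether a joining value can possibly satisfy the iceberg threshold, and a tiny Bloom filter stands in for the fragment synopsis that would be shipped between sensor nodes. The class, sizes, and example data are illustrative assumptions and do not reproduce the distributed protocol in the paper.

```python
import hashlib
from collections import Counter

class BloomFilter:
    """Minimal Bloom filter used as a synopsis of joining attribute values."""
    def __init__(self, size_bits=1024, hashes=3):
        self.size, self.hashes, self.bits = size_bits, hashes, 0

    def _positions(self, item):
        for i in range(self.hashes):
            digest = hashlib.sha1(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

def iceberg_join_values(r_values, s_values, threshold):
    """Return joining values whose joined cardinality count_R(v) * count_S(v)
    meets the iceberg threshold. A Bloom-filter false positive only triggers a
    harmless extra count check (the count of an absent value is zero)."""
    r_counts, s_counts = Counter(r_values), Counter(s_values)
    synopsis = BloomFilter()
    for v in r_counts:                      # synopsis of R's joining values
        synopsis.add(v)
    return {v for v, c in s_counts.items()
            if v in synopsis and r_counts[v] * c >= threshold}

print(iceberg_join_values([1, 1, 2, 3, 3, 3], [1, 3, 3, 4], threshold=4))  # {3}
```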

  12. [Evaluation of signal noise ratio on analysis of clear cell renal cell carcinoma using DWI with multi-b values].

    PubMed

    Ding, Jiule; Xing, Wei; Chen, Jie; Dai, Yongming; Sun, Jun; Li, Dengfa

    2014-01-21

    To explore the influence of signal-to-noise ratio (SNR) on the analysis of clear cell renal cell carcinoma (CCRCC) using DWI with multiple b values. The images of 17 cases with CCRCC were analyzed, including 17 masses and 9 pure cysts. The signal intensity of the cysts and masses was measured separately on DWI for each b value. The minimal SNR at which the signal curve still manifested as a single exponential line was recorded as the threshold. The SNR of the CCRCC was calculated on DWI for each b value and compared with the threshold by an independent two-sample t-test. The signal decreased on DWI with increasing b factors for both pure cysts and CCRCC. The threshold is 1.29 ± 0.17, and the signal intensity of the cysts on DWI with multiple b values followed a single exponential line when b ≤ 800 s/mm². For the CCRCC, the SNR is similar to the threshold when b = 1000 s/mm² (t = 0.40, P = 0.69), and is lower when b = 1200 s/mm² (t = -2.38, P = 0.03). The SNR should be sufficient for quantitative analysis of DWI, and the maximal b value is 1000 s/mm² for CCRCC.

  13. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits.

    PubMed

    Asimit, Jennifer L; Panoutsopoulou, Kalliope; Wheeler, Eleanor; Berndt, Sonja I; Cordell, Heather J; Morris, Andrew P; Zeggini, Eleftheria; Barroso, Inês

    2015-12-01

    Diseases often cooccur in individuals more often than expected by chance, and may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes' factors (BFs) do, and may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches with overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis. © 2015 The Authors. *Genetic Epidemiology published by Wiley Periodicals, Inc.
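    One widely used way of approximating a Bayes factor from GWAS summary statistics is Wakefield's approximate Bayes factor, which needs only the effect estimate, its standard error, and an assumed prior variance for the true effect. The sketch below illustrates that calculation; the prior standard deviation of 0.2 and the example numbers are arbitrary choices, and this is offered as a generic illustration rather than as the authors' exact procedure.

```python
import math

def approx_log10_bf(beta_hat, se, prior_sd=0.2):
    """Wakefield-style approximate Bayes factor (alternative vs. null) from a
    GWAS effect estimate and its standard error. prior_sd is the assumed
    N(0, prior_sd^2) prior on the true effect; 0.2 is an arbitrary example."""
    V = se ** 2                 # sampling variance of beta_hat
    W = prior_sd ** 2           # prior variance under the alternative
    z2 = (beta_hat / se) ** 2
    # log BF_10 = -log BF_01, with BF_01 = sqrt((V+W)/V) * exp(-z^2 W / (2(V+W)))
    log_bf10 = 0.5 * math.log(V / (V + W)) + z2 * W / (2 * (V + W))
    return log_bf10 / math.log(10)

# Same z-score, different study sizes: the larger study (smaller standard
# error) gives slightly weaker evidence, which is why a fixed BF threshold
# implies a shrinking type I error rate as study size grows.
print(round(approx_log10_bf(0.10, 0.025), 2))   # 2.51 (larger study)
print(round(approx_log10_bf(0.20, 0.050), 2))   # 2.65 (smaller study)
```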

  14. Lower thresholds for lifetime health effects in mammals from high-LET radiation - Comparison with chronic low-LET radiation.

    PubMed

    Sazykina, Tatiana G; Kryshev, Alexander I

    2016-12-01

    Lower threshold dose rates and confidence limits are quantified for lifetime radiation effects in mammalian animals from internally deposited alpha-emitting radionuclides. Extensive datasets on effects from internal alpha-emitters are compiled from the International Radiobiological Archives. In total, the compiled database includes 257 records, which are analyzed by means of non-parametric order statistics. The generic lower threshold for alpha-emitters in mammalian animals (combined datasets) is 6.6·10⁻⁵ Gy day⁻¹. Thresholds for individual alpha-emitting elements differ considerably: plutonium and americium, 2.0·10⁻⁵ Gy day⁻¹; radium, 2.1·10⁻⁴ Gy day⁻¹. The threshold for chronic low-LET radiation was previously estimated at 1·10⁻³ Gy day⁻¹. For low exposures, the following values of the alpha radiation weighting factor w_R for internally deposited alpha-emitters in mammals are quantified: w_R(α) = 15 as a generic value for the whole group of alpha-emitters; w_R(Pu) = 50 for plutonium; w_R(Am) = 50 for americium; w_R(Ra) = 5 for radium. These values are proposed to serve as radiation weighting factors in calculations of equivalent doses to non-human biota. The lower threshold dose rate for long-lived mammals (dogs) is significantly lower than the threshold for short-lived mammals (mice): 2.7·10⁻⁵ Gy day⁻¹ and 2.0·10⁻⁴ Gy day⁻¹, respectively. The difference in thresholds exactly reflects the relationship between the natural longevity of these two species. A graded scale of severity of lifetime radiation effects in mammals is developed, based on the compiled datasets. Placed on the severity scale, the effects of internal alpha-emitters fall in zones of considerably lower dose rates than effects of the same severity caused by low-LET radiation. RBE values, calculated for effects of equal severity, are found to depend on the intensity of chronic exposure: different RBE values are characteristic for low, moderate, and high lifetime exposures (30, 70, and 13, respectively). The results of the study provide a basis for selecting correct values of radiation weighting factors in dose assessment for non-human biota. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.

  16. Mercury demethylation in waterbird livers: Dose-response thresholds and differences among species

    USGS Publications Warehouse

    Eagles-Smith, Collin A.; Ackerman, Joshua T.; Julie, Y.E.E.; Adelsbach, T.L.

    2009-01-01

    We assessed methylmercury (MeHg) demethylation in the livers of adults and chicks of four waterbird species that commonly breed in San Francisco Bay: American avocets, black-necked stilts, Caspian terns, and Forster's terns. In adults (all species combined), we found strong evidence for a threshold model where MeHg demethylation occurred above a hepatic total mercury concentration threshold of 8.51 ± 0.93 µg/g dry weight, and there was a strong decline in %MeHg values as total mercury (THg) concentrations increased above 8.51 µg/g dry weight. Conversely, there was no evidence for a demethylation threshold in chicks, and we found that %MeHg values declined linearly with increasing THg concentrations. For adults, we also found taxonomic differences in the demethylation responses, with avocets and stilts showing a higher demethylation rate than that of terns when concentrations exceeded the threshold, whereas terns had a lower demethylation threshold (7.48 ± 1.48 µg/g dry wt) than that of avocets and stilts (9.91 ± 1.29 µg/g dry wt). Finally, we assessed the role of selenium (Se) in the demethylation process. Selenium concentrations were positively correlated with inorganic Hg in livers of birds above the demethylation threshold but not below. This suggests that Se may act as a binding site for demethylated Hg and may reduce the potential for secondary toxicity. Our findings indicate that waterbirds demethylate mercury in their livers if exposure exceeds a threshold value and suggest that taxonomic differences in demethylation ability may be an important factor in evaluating species-specific risk to MeHg exposure. Further, we provide strong evidence for a threshold of approximately 8.5 µg/g dry weight of THg in the liver where demethylation is initiated. © 2009 SETAC.
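    A threshold (breakpoint) model of this kind, flat below a breakpoint and declining above it, can be fit with a simple nonlinear least-squares routine. The sketch below uses SciPy's curve_fit on fabricated data purely to illustrate the shape of such a model; it is not the statistical model or the data used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def hockey_stick(thg, threshold, plateau, slope):
    """%MeHg stays flat below the THg threshold and changes linearly above it."""
    return plateau + slope * np.maximum(thg - threshold, 0.0)

# Fabricated example data (THg in µg/g dry weight vs. %MeHg), illustration only.
thg = np.array([2, 4, 6, 8, 9, 10, 12, 15, 20], dtype=float)
pct_mehg = np.array([95, 96, 94, 95, 90, 85, 75, 60, 40], dtype=float)

params, _ = curve_fit(hockey_stick, thg, pct_mehg, p0=[8.0, 95.0, -5.0])
print(params)   # estimated threshold, plateau level, post-threshold slope
```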

  17. Oil-in-Water Emulsion Exhibits Bitterness-Suppressing Effects in a Sensory Threshold Study.

    PubMed

    Torrico, Damir Dennis; Sae-Eaw, Amporn; Sriwattana, Sujinda; Boeneke, Charles; Prinyawiwatkul, Witoon

    2015-06-01

    Little is known about how emulsion characteristics affect saltiness/bitterness perception. Sensory detection and recognition thresholds of NaCl, caffeine, and KCl in aqueous solution compared with oil-in-water emulsion systems were evaluated. For emulsions, NaCl, KCl, or caffeine were dissolved in water + emulsifier and mixed with canola oil (20% by weight). Two emulsions were prepared: emulsion 1 (viscosity = 257 cP) and emulsion 2 (viscosity = 59 cP). The forced-choice ascending concentration series method of limits (ASTM E-679-04) was used to determine detection and/or recognition thresholds at 25 °C. Group best estimate threshold (GBET) geometric means were expressed as g/100 mL. Comparing NaCl with KCl, there were no significant differences in detection GBET values for all systems (0.0197 - 0.0354). For saltiness recognition thresholds, KCl GBET values were higher compared with NaCl GBET (0.0822 - 0.1070 compared with 0.0471 - 0.0501). For NaCl and KCl, emulsion 1 and/or emulsion 2 did not significantly affect the saltiness recognition threshold compared with that of the aqueous solution. However, the bitterness recognition thresholds of caffeine and KCl in solution were significantly lower than in the emulsions (0.0242 - 0.0586 compared with 0.0754 - 0.1025). Gender generally had a marginal effect on threshold values. This study showed that, compared with the aqueous solutions, emulsions did not significantly affect the saltiness recognition threshold of NaCl and KCl, but exhibited bitterness-suppressing effects on KCl and/or caffeine. © 2015 Institute of Food Technologists®
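    In the ASTM E-679 style of analysis referenced above, a group best estimate threshold is the geometric mean of the panelists' individual best-estimate thresholds. The sketch below shows only that aggregation step, with made-up individual thresholds; the standard's detailed rules for deriving each individual's best estimate from the forced-choice series are not reproduced here.

```python
import math

def group_best_estimate_threshold(individual_bets):
    """Geometric mean of individual best-estimate thresholds (g/100 mL),
    as used for the GBET values reported above. Inputs are illustrative."""
    logs = [math.log(t) for t in individual_bets]
    return math.exp(sum(logs) / len(logs))

# Hypothetical individual best-estimate thresholds for a panel of five:
panel = [0.012, 0.025, 0.018, 0.035, 0.022]
print(round(group_best_estimate_threshold(panel), 4))   # 0.0211
```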

  18. The effect of the stability threshold on time to stabilization and its reliability following a single leg drop jump landing.

    PubMed

    Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H

    2016-02-08

    We aimed to provide insight in how threshold selection affects time to stabilization (TTS) and its reliability to support selection of methods to determine TTS. Eighty-two elite youth soccer players performed six single leg drop jump landings. The TTS was calculated based on four processed signals: raw ground reaction force (GRF) signal (RAW), moving root mean square window (RMS), sequential average (SA) or unbounded third order polynomial fit (TOP). For each trial and processing method a wide range of thresholds was applied. Per threshold, reliability of the TTS was assessed through intra-class correlation coefficients (ICC) for the vertical (V), anteroposterior (AP) and mediolateral (ML) direction of force. Low thresholds resulted in a sharp increase of TTS values and in the percentage of trials in which TTS exceeded trial duration. The TTS and ICC were essentially similar for RAW and RMS in all directions; ICC's were mostly 'insufficient' (<0.4) to 'fair' (0.4-0.6) for the entire range of thresholds. The SA signals resulted in the most stable ICC values across thresholds, being 'substantial' (>0.8) for V, and 'moderate' (0.6-0.8) for AP and ML. The ICC's for TOP were 'substantial' for V, 'moderate' for AP, and 'fair' for ML. The present findings did not reveal an optimal threshold to assess TTS in elite youth soccer players following a single leg drop jump landing. Irrespective of threshold selection, the SA and TOP methods yielded sufficiently reliable TTS values, while for RAW and RMS the reliability was insufficient to differentiate between players. Copyright © 2016 Elsevier Ltd. All rights reserved.
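    Time to stabilization is, in essence, the first instant after which a processed ground reaction force signal stays within a chosen threshold band. Definitions differ between studies, so the sketch below is a deliberately simplified illustration under the assumption that stability is judged relative to the end-of-trial value, together with the sequential-average processing mentioned above; it is not the exact procedure used in the paper.

```python
import numpy as np

def sequential_average(signal):
    """Cumulative mean of the signal, one of the processing options above."""
    signal = np.asarray(signal, dtype=float)
    return np.cumsum(signal) / np.arange(1, len(signal) + 1)

def time_to_stabilization(signal, fs, threshold):
    """Simplified TTS: first sample after which the processed signal stays
    within `threshold` of its end-of-trial value. Returns seconds, or None if
    the signal never stabilizes within the trial."""
    signal = np.asarray(signal, dtype=float)
    inside = np.abs(signal - signal[-1]) <= threshold
    for i in range(len(signal)):
        if inside[i:].all():       # every later sample stays inside the band
            return i / fs
    return None

# Synthetic decaying GRF sampled at 1 kHz.
fs = 1000.0
grf = np.concatenate([np.linspace(3.0, 1.0, 500), np.full(1500, 1.0)])
print(time_to_stabilization(sequential_average(grf), fs, threshold=0.05))  # 1.666
```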

  19. Evaluation of the most suitable threshold value for modelling snow glacier melt through T- index approach: the case study of Forni Glacier (Italian Alps)

    NASA Astrophysics Data System (ADS)

    Senese, Antonella; Maugeri, Maurizio; Vuillermoz, Elisa; Smiraglia, Claudio; Diolaiuti, Guglielmina

    2014-05-01

    Glacier melt occurs whenever the surface temperature is at the melting point (273.15 K) and the net energy budget is positive. These conditions can be assessed by analyzing meteorological and energy data acquired by a supraglacial Automatic Weather Station (AWS). If such a station is not present at the glacier surface, assessing actual melting conditions and evaluating the melt amount is difficult, and degree-day (also named T-index) models are applied. These approaches require the choice of a correct temperature threshold. In fact, melt does not necessarily occur at daily air temperatures higher than 273.15 K, since it is determined by the energy budget, which in turn is only indirectly affected by air temperature. This is the case in the late spring period, when ablation processes start at the glacier surface and progressively reduce snow thickness. In this study, to detect the air temperature threshold most indicative of melt conditions in the April-June period, we analyzed air temperature data recorded from 2006 to 2012 by a supraglacial AWS (at 2631 m a.s.l.) on the ablation tongue of the Forni Glacier (Italy), and by a weather station located near the studied glacier (at Bormio, 1225 m a.s.l.). Moreover, we evaluated the glacier energy budget (which gives the actual melt, Senese et al., 2012) and the snow water equivalent values during this time-frame. The ablation amount was then estimated both from the surface energy balance (MEB, from supraglacial AWS data) and from the degree-day method (MT-INDEX, in this latter case applying the mean tropospheric lapse rate to temperature data acquired at Bormio and varying the air temperature threshold), and the results were compared. We found that the mean tropospheric lapse rate permits a good and reliable reconstruction of daily glacier air temperature conditions, and that the major uncertainty in the computation of snow melt from degree-day models is driven by the choice of an appropriate air temperature threshold. To assess the most suitable threshold, we first analyzed hourly MEB values to detect whether ablation occurs and how long this phenomenon lasts (number of hours per day). The largest part of the melting (97.7%) occurred on days featuring at least 6 melting hours, suggesting that the minimum average daily temperature of such days (268.1 K) be considered a suitable threshold. We then ran a simple T-index model applying different threshold values. The threshold that best reproduces snow melting is 268.1 K. In summary, using a threshold value 5.0 K lower than the widely applied 273.15 K permits the best reconstruction of glacier melt, in agreement with findings by van den Broeke et al. (2010) for the Greenland ice sheet. The choice of a 268 K threshold for computing degree-day amounts could therefore probably be generalized and applied not only to Greenland glaciers but also to mid-latitude and Alpine ones. This work was carried out under the umbrella of the SHARE Stelvio Project funded by the Lombardy Region and managed by FLA and the EvK2-CNR Committee.
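    The degree-day (T-index) relationship itself is compact enough to show directly. The sketch below sums positive degree-days above an adjustable threshold and multiplies by a degree-day factor; the degree-day factor used here is a placeholder for illustration, not the calibrated value from the study.

```python
def degree_day_melt(daily_temps_k, threshold_k=268.1, ddf=4.5):
    """Positive-degree-day melt estimate (mm w.e.) from daily mean temperatures (K).

    threshold_k : air-temperature threshold discussed above (268.1 K at the
                  study site vs. the commonly used 273.15 K).
    ddf         : degree-day factor in mm w.e. K^-1 day^-1; the value here is
                  only an illustrative placeholder.
    """
    return sum(ddf * max(t - threshold_k, 0.0) for t in daily_temps_k)

temps = [266.0, 269.5, 272.0, 274.3]
print(degree_day_melt(temps, threshold_k=273.15))  # only the last day melts
print(degree_day_melt(temps, threshold_k=268.1))   # three days contribute
```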

  20. 78 FR 46799 - Use of Market Economy Input Prices in Nonmarket Economy Proceedings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ...The Department of Commerce (``Department'') is modifying its regulation which states that the Department normally will use the price that a nonmarket economy (``NME'') producer pays to a market economy supplier when a factor of production is purchased from a market economy supplier and paid for in market economy currency, in the calculation of normal value (``NV'') in antidumping proceedings involving NME countries. The rule establishes a requirement that the input at issue be produced in one or more market economy countries, and a revised threshold requiring that ``substantially all'' (i.e., 85 percent) of an input be purchased from one or more market economy suppliers before the Department uses the purchase price paid to value the entire factor of production. The Department is making this change because it finds that a market economy input price is not the best available information for valuing all purchases of that input when market economy purchases of an input do not account for substantially all purchases of the input.

  1. CO32- concentration and pCO2 thresholds for calcification and dissolution on the Molokai reef flat, Hawaii

    USGS Publications Warehouse

    Yates, K.K.; Halley, R.B.

    2006-01-01

    The severity of the impact of elevated atmospheric pCO2 on coral reef ecosystems depends, in part, on how seawater pCO2 affects the balance between calcification and dissolution of carbonate sediments. Presently, there are insufficient published data that relate concentrations of pCO2 and CO32- to in situ rates of reef calcification in natural settings to accurately predict the impact of elevated atmospheric pCO2 on calcification and dissolution processes. Rates of net calcification and dissolution, CO32- concentrations, and pCO2 were measured, in situ, on patch reefs, bare sand, and coral rubble on the Molokai reef flat in Hawaii. Rates of calcification ranged from 0.03 to 2.30 mmol CaCO3 m⁻² h⁻¹ and dissolution ranged from -0.05 to -3.3 mmol CaCO3 m⁻² h⁻¹. Calcification and dissolution varied diurnally with net calcification primarily occurring during the day and net dissolution occurring at night. These data were used to calculate threshold values for pCO2 and CO32- at which rates of calcification and dissolution are equivalent. Results indicate that calcification and dissolution are linearly correlated with both CO32- and pCO2. Threshold pCO2 and CO32- values for individual substrate types showed considerable variation. The average pCO2 threshold value for all substrate types was 654 ± 195 µatm and ranged from 467 to 1003 µatm. The average CO32- threshold value was 152 ± 24 µmol kg⁻¹, ranging from 113 to 184 µmol kg⁻¹. Ambient seawater measurements of pCO2 and CO32- indicate that CO32- and pCO2 threshold values for all substrate types were both exceeded, simultaneously, 13% of the time at present day atmospheric pCO2 concentrations. It is predicted that atmospheric pCO2 will exceed the average pCO2 threshold value for calcification and dissolution on the Molokai reef flat by the year 2100.
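    Because calcification was found to vary linearly with pCO2, the threshold at which calcification and dissolution balance is simply the point where the fitted line crosses zero. The sketch below shows that calculation on fabricated data; the numbers are made up for illustration and are not the measurements from the Molokai reef flat.

```python
import numpy as np

def pco2_threshold(pco2, net_calcification):
    """Fit G = a + b * pCO2 and return the pCO2 at which net calcification is
    zero (calcification balances dissolution), i.e. -a / b."""
    b, a = np.polyfit(pco2, net_calcification, 1)   # slope, intercept
    return -a / b

pco2 = np.array([350, 450, 550, 700, 900], dtype=float)   # µatm (fabricated)
g = np.array([1.6, 1.1, 0.6, -0.2, -1.2])                 # mmol CaCO3 m^-2 h^-1
print(round(pco2_threshold(pco2, g), 0))                  # 664.0 for these data
```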

  2. Comparison of automatic and visual methods used for image segmentation in Endodontics: a microCT study.

    PubMed

    Queiroz, Polyane Mazucatto; Rovaris, Karla; Santaella, Gustavo Machado; Haiter-Neto, Francisco; Freitas, Deborah Queiroz

    2017-01-01

    To calculate root canal volume and surface area in microCT images, image segmentation by selecting threshold values is required; these values can be determined by visual or automatic methods. Visual determination is influenced by the operator's visual acuity, while the automatic method is done entirely by computer algorithms. The aim was to compare visual and automatic segmentation, and to determine the influence of the operator's visual acuity on the reproducibility of root canal volume and area measurements. Images from 31 extracted human anterior teeth were scanned with a μCT scanner. Three experienced examiners performed visual image segmentation, and threshold values were recorded. Automatic segmentation was done using the "Automatic Threshold Tool" available in the dedicated software provided by the scanner's manufacturer. Volume and area measurements were performed using the threshold values determined both visually and automatically. The paired Student's t-test showed no significant difference between visual and automatic segmentation methods regarding root canal volume measurements (p=0.93) and root canal surface area (p=0.79). Although visual and automatic segmentation methods can both be used to determine the threshold and calculate root canal volume and surface area, the automatic method may be the most suitable for ensuring the reproducibility of threshold determination.

  3. Dissolved Oxygen Thresholds to Protect Designated Aquatic Life Uses in Estuaries

    EPA Science Inventory

    Most if not all coastal states in the US have established numeric thresholds for dissolved oxygen (DO) to protect aquatic life in estuaries. Some are in the process, or have recently completed, revisions of their criteria based on newer science. Often, a toxicological approach ...

  4. Final Rule: Extremely Hazardous Substance List and Threshold Planning Quantities; Emergency Planning and Release Notification Requirements (52 FR 13378)

    EPA Pesticide Factsheets

    April 22, 1987: This FR established the list of extremely hazardous substances (EHSs) and their threshold planning quantities (TPQs). Also codified reporting and notification requirements for facilities with EHS. Do not use for current compliance purposes.

  5. Aquatic Rational Threshold Value (RTV) Concepts for Army Environmental Impact Assessment.

    DTIC Science & Technology

    1979-07-01

    ...irreversible impacts. Examination of the etymology of "rational threshold value"... In aquatic systems, both the possible cause-effect relationships... dynamics, aqueous chemistry, toxicology, and aquatic ecology... a driving function... The shading effects of riparian... In addition, when man's use...

  6. 30 CFR 71.700 - Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... gases, dust, fumes, mists, and vapors. 71.700 Section 71.700 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY HEALTH STANDARDS-SURFACE COAL MINES AND... limit values adopted by the American Conference of Governmental Industrial Hygienists in “Threshold...

  7. Rainfall threshold definition using an entropy decision approach and radar data

    NASA Astrophysics Data System (ADS)

    Montesarchio, V.; Ridolfi, E.; Russo, F.; Napolitano, F.

    2011-07-01

    Flash flood events are floods characterised by a very rapid response of basins to storms, often resulting in loss of life and property damage. Due to the specific space-time scale of this type of flood, the lead time available for triggering civil protection measures is typically short. Rainfall threshold values specify the amount of precipitation for a given duration that generates a critical discharge in a given river cross section. If the threshold values are exceeded, it can produce a critical situation in river sites exposed to alluvial risk. It is therefore possible to directly compare the observed or forecasted precipitation with critical reference values, without running online real-time forecasting systems. The focus of this study is the Mignone River basin, located in Central Italy. The critical rainfall threshold values are evaluated by minimising a utility function based on the informative entropy concept and by using a simulation approach based on radar data. The study concludes with a system performance analysis, in terms of correctly issued warnings, false alarms and missed alarms.

  8. A score-statistic approach for determining threshold values in QTL mapping.

    PubMed

    Kao, Chen-Hung; Ho, Hsiang-An

    2012-06-01

    Issues in determining the threshold values of QTL mapping have so far been investigated mostly for the backcross and F2 populations, which have relatively simple genome structures. The investigation of these issues in the progeny populations after F2 (advanced populations), which have relatively more complicated genomes, has been generally inadequate. As these advanced populations have been widely used in QTL mapping, it is important to address these issues for them in more detail. Due to an increasing number of meiosis cycles, the genomes of the advanced populations can be very different from the backcross and F2 genomes. Therefore, special devices that consider the specific genome structures present in the advanced populations are required to resolve these issues. By considering the differences in genome structure between populations, we formulate more general score test statistics and Gaussian processes to evaluate their threshold values. In general, we found that, given a significance level and a genome size, threshold values for QTL detection are higher for denser marker maps and for more advanced populations. Simulations were performed to validate our approach.

  9. Economic values under inappropriate normal distribution assumptions.

    PubMed

    Sadeghi-Sefidmazgi, A; Nejati-Javaremi, A; Moradi-Shahrbabak, M; Miraei-Ashtiani, S R; Amer, P R

    2012-08-01

    The objectives of this study were to quantify the errors in economic values (EVs) for traits affected by cost or price thresholds when skewed or kurtotic distributions of varying degree are assumed to be normal and when data with a normal distribution are subject to censoring. EVs were estimated for a continuous trait with dichotomous economic implications because of a price premium or penalty arising from a threshold ranging between -4 and 4 standard deviations from the mean. To evaluate the impacts of skewness and of positive and negative excess kurtosis, the standard skew normal, Pearson, and raised cosine distributions were used, respectively. For the various evaluable levels of skewness and kurtosis, the results showed that EVs can be underestimated or overestimated by more than 100% when price-determining thresholds fall within a range from the mean that might be expected in practice. Estimates of EVs were very sensitive to censoring or missing data. In contrast to practical genetic evaluation, economic evaluation is very sensitive to lack of normality and missing data. Although in some special situations the presence of multiple thresholds may attenuate the combined effect of errors at each threshold point, in practical situations there is a tendency for a few key thresholds to dominate the EV, and there are many situations where errors could be compounded across multiple thresholds. In the development of breeding objectives for non-normal continuous traits influenced by value thresholds, it is necessary to select a transformation that will resolve problems of non-normality or to consider alternative methods that are less sensitive to non-normality.

  10. Age structure and mortality of walleyes in Kansas reservoirs: Use of mortality caps to establish realistic management objectives

    USGS Publications Warehouse

    Quist, M.C.; Stephen, J.L.; Guy, C.S.; Schultz, R.D.

    2004-01-01

    Age structure, total annual mortality, and mortality caps (maximum mortality thresholds established by managers) were investigated for walleye Sander vitreus (formerly Stizostedion vitreum) populations sampled from eight Kansas reservoirs during 1991-1999. We assessed age structure by examining the relative frequency of different ages in the population; total annual mortality of age-2 and older walleyes was estimated by use of a weighted catch curve. To evaluate the utility of mortality caps, we modeled threshold values of mortality by varying growth rates and management objectives. Estimated mortality thresholds were then compared with observed growth and mortality rates. The maximum age of walleyes varied from 5 to 11 years across reservoirs. Age structure was dominated (≥72%) by walleyes age 3 and younger in all reservoirs, corresponding to ages that were not yet vulnerable to harvest. Total annual mortality rates varied from 40.7% to 59.5% across reservoirs and averaged 51.1% overall (SE = 2.3). Analysis of mortality caps indicated that a management objective of 500 mm for the mean length of walleyes harvested by anglers was realistic for all reservoirs with a 457-mm minimum length limit but not for those with a 381-mm minimum length limit. For a 500-mm mean length objective to be realized for reservoirs with a 381-mm length limit, managers must either reduce mortality rates (e.g., through restrictive harvest regulations) or increase growth of walleyes. When the assumed objective was to maintain the mean length of harvested walleyes at current levels, the observed annual mortality rates were below the mortality cap for all reservoirs except one. Mortality caps also provided insight on management objectives expressed in terms of proportional stock density (PSD). Results indicated that a PSD objective of 20-40 was realistic for most reservoirs. This study provides important walleye mortality information that can be used for monitoring or for inclusion into population models; these results can also be combined with those of other studies to investigate large-scale differences in walleye mortality. Our analysis illustrates the utility of mortality caps for monitoring walleye populations and for establishing realistic management goals.

  11. Threshold Haemoglobin Levels and the Prognosis of Stable Coronary Disease: Two New Cohorts and a Systematic Review and Meta-Analysis

    PubMed Central

    Shah, Anoop D.; Nicholas, Owen; Timmis, Adam D.; Feder, Gene; Abrams, Keith R.; Chen, Ruoling; Hingorani, Aroon D.; Hemingway, Harry

    2011-01-01

    Background Low haemoglobin concentration has been associated with adverse prognosis in patients with angina and myocardial infarction (MI), but the strength and shape of the association and the presence of any threshold has not been precisely evaluated. Methods and findings A retrospective cohort study was carried out using the UK General Practice Research Database. 20,131 people with a new diagnosis of stable angina and no previous acute coronary syndrome, and 14,171 people with first MI who survived for at least 7 days were followed up for a mean of 3.2 years. Using semi-parametric Cox regression and multiple adjustment, there was evidence of threshold haemoglobin values below which mortality increased in a graded continuous fashion. For men with MI, the threshold value was 13.5 g/dl (95% confidence interval [CI] 13.2–13.9); the 29.5% of patients with haemoglobin below this threshold had an associated hazard ratio for mortality of 2.00 (95% CI 1.76–2.29) compared to those with haemoglobin values in the lowest risk range. Women tended to have lower threshold haemoglobin values (e.g, for MI 12.8 g/dl; 95% CI 12.1–13.5) but the shape and strength of association did not differ between the genders, nor between patients with angina and MI. We did a systematic review and meta-analysis that identified ten previously published studies, reporting a total of only 1,127 endpoints, but none evaluated thresholds of risk. Conclusions There is an association between low haemoglobin concentration and increased mortality. A large proportion of patients with coronary disease have haemoglobin concentrations below the thresholds of risk defined here. Intervention trials would clarify whether increasing the haemoglobin concentration reduces mortality. Please see later in the article for the Editors' Summary PMID:21655315

  12. Deterministic Approach for Estimating Critical Rainfall Threshold of Rainfall-induced Landslide in Taiwan

    NASA Astrophysics Data System (ADS)

    Chung, Ming-Chien; Tan, Chih-Hao; Chen, Mien-Min; Su, Tai-Wei

    2013-04-01

    Taiwan is an active mountain belt created by the oblique collision between the northern Luzon arc and the Asian continental margin. The inherent complexities of geological nature create numerous discontinuities through rock masses and relatively steep hillsides on the island. In recent years, the increase in the frequency and intensity of extreme natural events due to global warming or climate change has brought significant landslides. The causes of landslides in these slopes are attributed to a number of factors. As is well known, rainfall is one of the most significant triggering factors for landslide occurrence. In general, rainfall infiltration results in changing the suction and the moisture of soil, raising the unit weight of soil, and reducing the shear strength of soil in the colluvium of a landslide. The stability of a landslide is closely related to the groundwater pressure in response to rainfall infiltration, the geological and topographical conditions, and the physical and mechanical parameters. To assess the potential susceptibility to landslide, an effective modeling of rainfall-induced landslide is essential. In this paper, a deterministic approach is adopted to estimate the critical rainfall threshold of the rainfall-induced landslide. The critical rainfall threshold is defined as the accumulated rainfall at which the safety factor of the slope is equal to 1.0. First, the deterministic approach establishes the hydrogeological conceptual model of the slope based on a series of in-situ investigations, including geological drilling, surface geological investigation, geophysical investigation, and borehole explorations. The material strength and hydraulic properties of the model were obtained from field and laboratory tests. Second, the hydraulic and mechanical parameters of the model are calibrated with the long-term monitoring data. Furthermore, a two-dimensional numerical program, GeoStudio, was employed to perform the modelling. Finally, the critical rainfall threshold of the slope can be obtained by the coupled analysis of rainfall, infiltration, seepage, and slope stability. As an example, we consider the slope located at 50k+650 on Tainan County Road No. 174, in the Zeng-Wun River watershed in southern Taiwan; it is an active landslide triggered by typhoon events. Coordinates for the case study site are 194925, 2567208 (TWD97). The site was selected based on the results of previous reports and a geological survey. According to the Central Weather Bureau, the annual precipitation is about 2,450 mm, the highest monthly value is in August with 630 mm, and the lowest value is in November with 13 mm. The results show that the critical rainfall threshold of the study case is around 640 mm. This means that an alarm should be issued when the accumulated rainfall exceeds 640 mm. Our preliminary results appear to be useful for rainfall-induced landslide hazard assessments. The findings also provide a good reference for establishing a landslide early warning system and for developing prevention strategies.

  13. Variability of argon laser-induced sensory and pain thresholds on human oral mucosa and skin.

    PubMed Central

    Svensson, P.; Bjerring, P.; Arendt-Nielsen, L.; Kaaber, S.

    1991-01-01

    The variability of laser-induced pain perception on human oral mucosa and hairy skin was investigated in order to establish a new method for evaluation of pain in the orofacial region. A high-energy argon laser was used for experimental pain stimulation, and sensory and pain thresholds were determined. The intra-individual coefficients of variation for oral thresholds were comparable to cutaneous thresholds. However, inter-individual variation was smaller for oral thresholds, which could be due to larger variation in cutaneous optical properties. The short-term and 24-hr changes in thresholds on both surfaces were less than 9%. The results indicate that habituation to laser thresholds may account for part of the intra-individual variation observed. However, the subjective ratings of the intensity of the laser stimuli were constant. Thus, oral thresholds may, like cutaneous thresholds, be used for assessment and quantification of analgesic efficacies and to investigate various pain conditions. PMID:1814248

  14. Revising two-point discrimination assessment in normal aging and in patients with polyneuropathies.

    PubMed

    van Nes, S I; Faber, C G; Hamers, R M T P; Harschnitz, O; Bakkers, M; Hermans, M C E; Meijer, R J; van Doorn, P A; Merkies, I S J

    2008-07-01

    To revise the static and dynamic normative values for the two-point discrimination test and to examine its applicability and validity in patients with a polyneuropathy. Two-point discrimination threshold values were assessed in 427 healthy controls and 99 patients mildly affected by a polyneuropathy. The controls were divided into seven age groups ranging from 20-29, 30-39,..., up to 80 years and older; each group consisted of at least 30 men and 30 women. Two-point discrimination examination took place under standardised conditions on the index finger. Correlation studies were performed between the scores obtained and the values derived from the Weinstein Enhanced Sensory Test (WEST) and the arm grade of the Overall Disability SumScore (ODSS) in the patients' group (validity studies). Finally, the sensitivity to detect patients mildly affected by a polyneuropathy was evaluated for static and dynamic assessments. There was a significant age-dependent increase in the two-point discrimination values. No significant gender difference was found. The dynamic threshold values were lower than the static scores. The two-point discrimination values obtained correlated significantly with the arm grade of the ODSS (static values: r = 0.33, p = 0.04; dynamic values: r = 0.37, p = 0.02) and the scores of the WEST in patients (static values: r = 0.58, p = 0.0001; dynamic values: r = 0.55, p = 0.0002). The sensitivity for the static and dynamic threshold values was 28% and 33%, respectively. This study provides age-related normative two-point discrimination threshold values using a two-point discriminator (an aesthesiometer). This easily applicable instrument could be used as part of a more extensive neurological sensory evaluation.

  15. Effects of threshold on single-target detection by using modified amplitude-modulated joint transform correlator

    NASA Astrophysics Data System (ADS)

    Kaewkasi, Pitchaya; Widjaja, Joewono; Uozumi, Jun

    2007-03-01

    Effects of the threshold value on the detection performance of the modified amplitude-modulated joint transform correlator are quantitatively studied using computer simulation. Fingerprint and human face images are used as test scenes in the presence of noise and a contrast difference. Simulation results demonstrate that this correlator improves detection performance for both types of image used, but more so for human face images. Optimal detection of low-contrast human face images obscured by strong noise can be obtained by selecting an appropriate threshold value.

  16. Life satisfaction, QALYs, and the monetary value of health.

    PubMed

    Huang, Li; Frijters, Paul; Dalziel, Kim; Clarke, Philip

    2018-06-18

    The monetary value of a quality-adjusted life-year (QALY) is frequently used to assess the benefits of health interventions and inform funding decisions. However, there is little consensus on methods for the estimation of this monetary value. In this study, we use life satisfaction as an indicator of 'experienced utility', and estimate the dollar equivalent value of a QALY using a fixed effect model with instrumental variable estimators. Using a nationally representative longitudinal survey including 28,347 individuals followed during 2002-2015 in Australia, we estimate that individuals' willingness to pay for one QALY is approximately A$42,000-A$67,000, and the willingness to pay for not having a long-term condition is approximately A$2000 per year. As the estimates are derived using population-level data and a wellbeing measurement of life satisfaction, the approach has the advantage of being socially inclusive and recognizes the significant meaning of people's subjective valuations of health. The method could be particularly useful for nations where QALY thresholds are not yet validated or established. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Supra-threshold epidermis injury from near-infrared laser radiation prior to ablation onset

    NASA Astrophysics Data System (ADS)

    DeLisi, Michael P.; Peterson, Amanda M.; Lile, Lily A.; Noojin, Gary D.; Shingledecker, Aurora D.; Stolarski, David J.; Zohner, Justin J.; Kumru, Semih S.; Thomas, Robert J.

    2017-02-01

    With continued advancement of solid-state laser technology, high-energy lasers operating in the near-infrared (NIR) band are being applied in an increasing number of manufacturing techniques and medical treatments. Safety-related investigations of potentially harmful laser interaction with skin are commonplace, consisting of establishing the maximum permissible exposure (MPE) thresholds under various conditions, often utilizing the minimally-visible lesion (MVL) metric as an indication of damage. Likewise, characterization of ablation onset and velocity is of interest for therapeutic and surgical use, and concerns exceptionally high irradiance levels. However, skin injury response between these two exposure ranges is not well understood. This study utilized a 1070-nm Yb-doped, diode-pumped fiber laser to explore the response of excised porcine skin tissue to high-energy exposures within the supra-threshold injury region without inducing ablation. Concurrent high-speed videography was employed to assess the effect on the epidermis, with a dichotomous response determination given for three progressive damage event categories: observable permanent distortion on the surface, formation of an epidermal bubble due to bounded intra-cutaneous water vaporization, and rupture of said bubble during laser exposure. ED50 values were calculated for these categories under various pulse configurations and beam diameters, and logistic regression models predicted injury events with approximately 90% accuracy. The distinction of skin response into categories of increasing degrees of damage expands the current understanding of high-energy laser safety while also underlining the unique biophysical effects during induced water phase change in tissue. These observations could prove useful in augmenting biothermomechanical models of laser exposure in the supra-threshold region.
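    An ED50 of the kind reported above is commonly obtained by fitting a logistic regression to the dichotomous damage outcomes and taking the dose at which the predicted probability is 50%, i.e. -b0/b1. The sketch below illustrates that calculation with the statsmodels library on fabricated dose-response data; it is not the study's fitting procedure or data.

```python
import numpy as np
import statsmodels.api as sm

def ed50_from_logistic_fit(doses, responses):
    """Fit P(damage) = logistic(b0 + b1*dose) and return ED50 = -b0/b1,
    the dose at which damage is predicted for 50% of exposures."""
    X = sm.add_constant(np.asarray(doses, dtype=float))
    model = sm.Logit(np.asarray(responses, dtype=float), X).fit(disp=False)
    b0, b1 = model.params
    return -b0 / b1

doses = [5, 7, 9, 11, 13, 15, 17, 19]   # e.g., radiant exposure (fabricated units)
damage = [0, 0, 0, 1, 0, 1, 1, 1]       # dichotomous damage outcome (fabricated)
print(round(ed50_from_logistic_fit(doses, damage), 1))
```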

  18. Concerns Around Budget Impact Thresholds: Not All Drugs Are the Same.

    PubMed

    Ciarametaro, Michael; Abedi, Susan; Sohn, Adam; Ge, Colin Fan; Odedara, Neel; Dubois, Robert

    2017-02-01

    The use of budget thresholds is a recent development in the United States (e.g., the Institute for Clinical and Economic Review drug assessments). Budget thresholds establish limits that require some type of budgetary action if exceeded. This research focused on the advisability of using product-level budget thresholds as fixed spending caps by examining whether they are likely to improve or worsen market efficiency over status quo. The aim of this study was to determine whether fixed product-level spending caps are advisable for biopharmaceuticals. We systematically examined 5-year, postlaunch revenue for drugs that launched in the United States between 2003 and 2014 using the IMS MIDAS database. For products launched between 2011 and 2014, we used historical revenue as the baseline and trended out 60 months postlaunch based on exponential smoothing. Forecasted fifth-year revenue was compared to analyst reports. Fifth-year revenue was compared against a hypothetical $904 million spending cap to determine the amount of annual spending that might require reallocation. Descriptive statistics of 5-year, postlaunch revenue and annual spending requiring reallocation were calculated. Adhering to a $904 million product-level spending cap requires that approximately one-third of new drug spending be reallocated to other goods and services that have the potential to be less cost-effective due to significant barriers. Fixed product-level spending caps have the potential to reduce market efficiency due to their independence from value and the presence of important operational challenges. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Invertebrate and fish assemblage relations to dissolved Oxygen minima in lowland streams of southwestern Louisiana

    USGS Publications Warehouse

    Justus, B.G.; Mize, Scott V.; Kroes, Daniel; Wallace, James E.

    2012-01-01

    Dissolved oxygen (DO) concentrations in lowland streams are naturally lower than those in upland streams; however, in some regions where monitoring data are lacking, DO criteria originally established for upland streams have been applied to lowland streams. This study investigated the DO concentrations at which fish and invertebrate assemblages at 35 sites located on lowland streams in southwestern Louisiana began to demonstrate biological thresholds. Average threshold values for taxa richness, diversity and abundance metrics were 2.6 and 2.3 mg/L for the invertebrate and fish assemblages, respectively. These thresholds are approximately twice the DO concentration that some native fish species are capable of tolerating and are comparable with DO criteria that have been recently applied to some coastal streams in Louisiana and Texas. DO minima >2.5 mg/L were favoured for all but extremely tolerant taxa. Extremely tolerant taxa had respiratory adaptations that gave them a competitive advantage, and their success when DO minima were <2 mg/L could be related more to reductions in competition or predation than to DO concentration directly. DO generally had an inverse relation to the amount of agriculture in the buffer area; however, DO concentrations at sites with both low and high amounts of agriculture (including three least-disturbed sites) declined to <2.5 mg/L. Thus, although DO fell below a concentration that was identified as an approximate biological threshold, sources of this condition were sometimes natural (allochthonous material) and had little relation to anthropogenic activity.

  20. Anaerobic Threshold and Salivary α-amylase during Incremental Exercise.

    PubMed

    Akizuki, Kazunori; Yazaki, Syouichirou; Echizenya, Yuki; Ohashi, Yukari

    2014-07-01

    [Purpose] The purpose of this study was to clarify the validity of salivary α-amylase as a method of quickly estimating anaerobic threshold and to establish the relationship between salivary α-amylase and double-product breakpoint in order to create a way to adjust exercise intensity to a safe and effective range. [Subjects and Methods] Eleven healthy young adults performed an incremental exercise test using a cycle ergometer. During the incremental exercise test, oxygen consumption, carbon dioxide production, and ventilatory equivalent were measured using a breath-by-breath gas analyzer. Systolic blood pressure and heart rate were measured to calculate the double product, from which double-product breakpoint was determined. Salivary α-amylase was measured to calculate the salivary threshold. [Results] One-way ANOVA revealed no significant differences among workloads at the anaerobic threshold, double-product breakpoint, and salivary threshold. Significant correlations were found between anaerobic threshold and salivary threshold and between anaerobic threshold and double-product breakpoint. [Conclusion] As a method for estimating anaerobic threshold, salivary threshold was as good as or better than determination of double-product breakpoint because the correlation between anaerobic threshold and salivary threshold was higher than the correlation between anaerobic threshold and double-product breakpoint. Therefore, salivary threshold is a useful index of anaerobic threshold during an incremental workload.

  1. Ecological Threshold for Toxicological Concern (eco-TTC): exploring the importance of non-standard species

    EPA Science Inventory

    The Threshold for Toxicological Concern (TTC) is well-established for assessing human safety of indirect food-contact substances and has been applied to a variety of endpoints. Recently, we have proposed an extension to the human safety TTC concept for environmental applications,...

  2. 76 FR 22070 - Federal Acquisition Regulation; Service Contracts Reporting Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-20

    ... be required to report this information if the order meets the thresholds established in FAR 4.1603 (e... Definitions. 4.1602 Applicability. 4.1603 Contractor reporting requirements. 4.1604 Contracting officer..., and orders for services that meet or exceed the thresholds at 4.1603; and (3) Contractors and first...

  3. [Research on the threshold of Chl-a in Lake Taihu based on microcystins].

    PubMed

    Wei, Dai-chun; Su, Jing; Ji, Dan-feng; Fu, Xiao-yong; Wang, Ji; Huo, Shou-liang; Cui, Chi-fei; Tang, Jun; Xi, Bei-dou

    2014-12-01

    Water samples were collected in Lake Taihu from June to October in 2013 in order to investigate the threshold of chlorophyll a (Chl-a). The concentrations of three microcystin isomers (MC-LR, MC-RR, MC-YR) were detected by means of solid phase extraction and high performance liquid chromatography-tandem mass spectrometry. The correlations between various MCs and eutrophication factors, such as total nitrogen (TN), total phosphorus (TP), chlorophyll a and the permanganate index, were analyzed. The threshold of Chl-a was studied based on the relationships between MC-LR, MCs and Chl-a. The results showed that Lake Taihu was severely polluted by MCs and its spatial distribution could be described as follows: the concentration in Meiliang Bay was the highest, followed by Gonghu Bay and Western Lake, and Lake Center; the least polluted areas were in Lake Xuhu and Southern Lake. The concentration of MC-LR was the highest among the three MCs. The correlation analysis indicated that MC-LR, MC-RR, MC-YR and MCs were strongly positively correlated with the permanganate index, TN, TP and Chl-a (P < 0.01). The threshold value of Chl-a was 12.26 mg/m3 according to the standard thresholds of MC-LR and MCs in drinking water. The threshold value of Chl-a in Lake Taihu was very close to the standard in the State of North Carolina, which demonstrated that the threshold value provided in this study was reasonable.

  4. Bilevel thresholding of sliced image of sludge floc.

    PubMed

    Chu, C P; Lee, D J

    2004-02-15

    This work examined the feasibility of employing various thresholding algorithms to determine the optimal bilevel thresholding value for estimating the geometric parameters of sludge flocs from microtome-sliced images and from confocal laser scanning microscope images. Morphological information extracted from images depends on the bilevel thresholding value. According to the evaluation on the luminescence-inverted images and fractal curves (quadric Koch curve and Sierpinski carpet), Otsu's method yields more stable performance than other histogram-based algorithms and is chosen to obtain the porosity. The maximum convex perimeter method, however, can probe the shapes and spatial distribution of the pores among the biomass granules in real sludge flocs. A combined algorithm is recommended for probing the sludge floc structure.
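
    As a concrete illustration of the histogram-based approach discussed above, the sketch below implements Otsu's method in plain NumPy and applies it to a synthetic bimodal intensity array standing in for a sliced-image histogram; the simulated intensities, bin count and porosity definition are assumptions for illustration, not the authors' pipeline.

        import numpy as np

        def otsu_threshold(gray, nbins=256):
            """Return the bilevel threshold that maximizes between-class variance (Otsu)."""
            hist, edges = np.histogram(gray.ravel(), bins=nbins)
            prob = hist.astype(float) / hist.sum()
            centers = 0.5 * (edges[:-1] + edges[1:])
            w0 = np.cumsum(prob)                      # weight of the "dark" class
            w1 = 1.0 - w0
            mu0 = np.cumsum(prob * centers)           # unnormalized cumulative class mean
            mu_total = mu0[-1]
            with np.errstate(divide="ignore", invalid="ignore"):
                between = w0 * w1 * (mu0 / w0 - (mu_total - mu0) / w1) ** 2
            return centers[np.nanargmax(between)]

        # Synthetic bimodal "slice": dark pores around 60, bright biomass around 180
        rng = np.random.default_rng(0)
        img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
        t = otsu_threshold(img)
        porosity = np.mean(img < t)                   # fraction of pixels below the threshold
        print(f"Otsu threshold: {t:.1f}, pore fraction: {porosity:.2f}")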

  5. The threshold of a stochastic delayed SIR epidemic model with temporary immunity

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Chen, Qingmei; Jiang, Daqing

    2016-05-01

    This paper is concerned with the asymptotic properties of a stochastic delayed SIR epidemic model with temporary immunity. Sufficient conditions for extinction and persistence in the mean of the epidemic are established. The threshold between persistence in the mean and extinction of the epidemic is obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R0 of the deterministic system.

  6. 77 FR 546 - Adjustment of Nationwide Significant Risk Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-05

    ...In accordance with Appendix D to Title 49 Code of Federal Regulations (CFR) Part 222, Use of Locomotive Horns at Highway-Rail Grade Crossings, FRA is updating the Nationwide Significant Risk Threshold (NSRT). This action is needed to ensure that the public has the proper threshold of permissible risk for calculating quiet zones established in relationship to the NSRT. This is the fifth update to the NSRT, which has fallen from 14,007 to 13,722.

  7. 75 FR 82136 - Adjustment of Nationwide Significant Risk Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-29

    ...In accordance with Appendix D to Title 49 Code of Federal Regulations (CFR) Part 222, Use of Locomotive Horns at Highway-Rail Grade Crossings, FRA is updating the Nationwide Significant Risk Threshold (NSRT). This action is needed to ensure that the public has the proper threshold of permissible risk for calculating quiet zones established in relationship to the NSRT. This is the fourth update to the NSRT, which has fallen from 18,775 to 14,007.

  8. 78 FR 70623 - Adjustment of Nationwide Significant Risk Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ...In accordance with appendix D to title 49 Code of Federal Regulations (CFR) part 222, Use of Locomotive Horns at Public Highway- Rail Grade Crossings, FRA is updating the Nationwide Significant Risk Threshold (NSRT). This action is needed to ensure that the public has the proper threshold of permissible risk for calculating quiet zones established in relationship to the NSRT. This is the sixth update to the NSRT, which is increasing from 13,722 to 14,347.

  9. Thresholds of probable problematic gambling involvement for the German population: Results of the Pathological Gambling and Epidemiology (PAGE) Study.

    PubMed

    Brosowski, Tim; Hayer, Tobias; Meyer, Gerhard; Rumpf, Hans-Jürgen; John, Ulrich; Bischof, Anja; Meyer, Christian

    2015-09-01

    Consumption measures in gambling research may help to establish thresholds of low-risk gambling as one part of evidence-based responsible gambling strategies. The aim of this study is to replicate existing Canadian thresholds of probable low-risk gambling (Currie et al., 2006) in a representative dataset of German gambling behavior (Pathological Gambling and Epidemiology [PAGE]; N = 15,023). Receiver-operating characteristic curves applied in a training dataset (60%) extracted robust thresholds of low-risk gambling across 4 nonexclusive definitions of gambling problems (1+ to 4+ Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition [DSM-5] Composite International Diagnostic Interview [CIDI] symptoms), different indicators of gambling involvement (across all game types; form-specific) and different timeframes (lifetime; last year). Logistic regressions applied in a test dataset (40%) to cross-validate the heuristics of probable low-risk gambling incorporated confounding covariates (age, gender, education, migration, and unemployment) and confirmed the strong concurrent validity of the thresholds. Moreover, it was possible to establish robust form-specific thresholds of low-risk gambling (only for gaming machines and poker). Possible implications for early detection of problem gamblers in offline or online environments are discussed. Results substantiate international knowledge about problem gambling prevention and contribute to a German discussion about empirically based guidelines of low-risk gambling. (c) 2015 APA, all rights reserved.

  10. Low latency counter event indication

    DOEpatents

    Gara, Alan G [Mount Kisco, NY; Salapura, Valentina [Chappaqua, NY

    2008-09-16

    A hybrid counter array device for counting events with interrupt indication includes a first counter portion comprising N counter devices, each for counting signals representing event occurrences and providing a first count value representing lower order bits. An overflow bit device associated with each respective counter device is additionally set in response to an overflow condition. The hybrid counter array includes a second counter portion comprising a memory array device having N addressable memory locations in correspondence with the N counter devices, each addressable memory location for storing a second count value representing higher order bits. An operatively coupled control device monitors each associated overflow bit device and initiates incrementing a second count value stored at a corresponding memory location in response to a respective overflow bit being set. The incremented second count value is compared to an interrupt threshold value stored in a threshold register, and, when the second counter value is equal to the interrupt threshold value, a corresponding "interrupt arm" bit is set to enable a fast interrupt indication. On a subsequent roll-over of the lower bits of that counter, the interrupt will be fired.
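
    The following toy Python model paraphrases the counting scheme described above: low-order counters with overflow bits, high-order counts kept in a memory array, and an "interrupt arm" bit that fires the interrupt on the next low-order roll-over. The class name, bit widths and event counts are invented for illustration and are not taken from the patent.

        class HybridCounterArray:
            """Toy model of a hybrid counter array with fast interrupt indication."""

            def __init__(self, n_counters, low_bits, interrupt_threshold):
                self.low_max = 1 << low_bits             # roll-over value of the low-order counter
                self.low = [0] * n_counters              # first counter portion (lower-order bits)
                self.high = [0] * n_counters             # second counter portion (memory array)
                self.overflow = [False] * n_counters     # overflow bit per counter
                self.armed = [False] * n_counters        # "interrupt arm" bits
                self.threshold = interrupt_threshold

            def count_event(self, i):
                """Count one event on counter i; return True if an interrupt fires."""
                fired = False
                self.low[i] += 1
                if self.low[i] == self.low_max:          # low-order roll-over
                    self.low[i] = 0
                    self.overflow[i] = True
                    if self.armed[i]:                    # armed earlier -> fast interrupt now
                        fired = True
                        self.armed[i] = False
                if self.overflow[i]:                     # control device services the overflow
                    self.overflow[i] = False
                    self.high[i] += 1
                    if self.high[i] == self.threshold:   # compare against interrupt threshold
                        self.armed[i] = True             # arm; fires on the next roll-over
                return fired

        counters = HybridCounterArray(n_counters=4, low_bits=4, interrupt_threshold=2)
        fired_at = [n for n in range(1, 60) if counters.count_event(0)]
        print(fired_at)   # event count at which the interrupt fires for counter 0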

  11. Low latency counter event indication

    DOEpatents

    Gara, Alan G.; Salapura, Valentina

    2010-08-24

    A hybrid counter array device for counting events with interrupt indication includes a first counter portion comprising N counter devices, each for counting signals representing event occurrences and providing a first count value representing lower order bits. An overflow bit device associated with each respective counter device is additionally set in response to an overflow condition. The hybrid counter array includes a second counter portion comprising a memory array device having N addressable memory locations in correspondence with the N counter devices, each addressable memory location for storing a second count value representing higher order bits. An operatively coupled control device monitors each associated overflow bit device and initiates incrementing a second count value stored at a corresponding memory location in response to a respective overflow bit being set. The incremented second count value is compared to an interrupt threshold value stored in a threshold register, and, when the second counter value is equal to the interrupt threshold value, a corresponding "interrupt arm" bit is set to enable a fast interrupt indication. On a subsequent roll-over of the lower bits of that counter, the interrupt will be fired.

  12. Establishment of stream nutrient criteria by comparing reference conditions with ecological thresholds in a typical eutrophic lake basin.

    PubMed

    Cao, Xiaofeng; Wang, Jie; Jiang, Dalin; Sun, Jinhua; Huang, Yi; Luan, Shengji

    2017-12-13

    The establishment of numeric nutrient criteria is essential for controlling nutrient pollution and for protecting and restoring healthy ecological conditions. However, it is necessary to determine whether regional nutrient criteria can be defined in stream ecosystems with a poor ecological status. A database of periphytic diatom samples was collected in July and August 2011 and 2012. In total 172 samples were included in the database with matching environmental variables. Here, percentile estimates, nonparametric change-point analysis (nCPA) and Threshold Indicator Taxa ANalysis (TITAN) were conducted to detect the reference conditions and ecological thresholds along total nitrogen (TN), total phosphorus (TP) and ammonia nitrogen (NH3-N) gradients for the development of nutrient criteria in the streams of the Lake Dianchi basin. The results highlighted the possibility of establishing regional criteria for nutrient concentrations, which we recommended to be no more than 1.39 mg/L for TN, 0.04 mg/L for TP and 0.17 mg/L for NH3-N to prevent nuisance growths of tolerant taxa, and 0.38 mg/L for TN, 0.02 mg/L for TP and 0.02 mg/L for NH3-N to maintain high-quality waters in streams. Additionally, the influence of excessive background nutrient enrichment on the threshold response, and the ecological interaction with other stressors (HQI, etc.) in the nutrient dynamic process, need to be considered to establish the eventual nutrient criteria, regardless of which technique is applied.
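
    A minimal sketch of the idea behind nonparametric change-point analysis is given below: the candidate nutrient value whose split of a biological metric most reduces the summed squared deviations is taken as the change point. The TN values, diatom metric and breakpoint location are simulated assumptions, not data from the Lake Dianchi study.

        import numpy as np

        def ncpa_changepoint(x, y):
            """Change point of x that maximizes the reduction in squared deviations of y."""
            order = np.argsort(x)
            x, y = np.asarray(x, float)[order], np.asarray(y, float)[order]
            total = np.sum((y - y.mean()) ** 2)
            best_x, best_gain = None, -np.inf
            for i in range(5, len(x) - 5):               # require a few points on each side
                left, right = y[:i], y[i:]
                dev = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
                if total - dev > best_gain:
                    best_gain, best_x = total - dev, x[i]
            return best_x

        # Hypothetical diatom metric responding to TN with a breakpoint near 1.4 mg/L
        rng = np.random.default_rng(11)
        tn = rng.uniform(0.2, 4.0, 172)
        metric = np.where(tn < 1.4, 80, 45) + rng.normal(0, 8, tn.size)
        print(f"estimated TN change point: {ncpa_changepoint(tn, metric):.2f} mg/L")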

  13. Sparing of normal urothelium in hexyl-aminolevulinate-mediated photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Vaucher, Laurent; Jichlinski, Patrice; Lange, Norbert; Ritter-Schenk, Celine; van den Bergh, Hubert; Kucera, Pavel

    2005-04-01

    This work determines, in an in vitro porcine urothelium model, the threshold values of different parameters, such as photosensitizer concentration, irradiation parameters and production of reactive oxygen species, needed to control the damage to normal urothelium and spare about 50% of the normal mucosa. For a three-hour HAL incubation time, these threshold values were 0.75 J/cm2 at 75 mW/cm2 or 0.15 J/cm2 at 30 mW/cm2 with blue light, and 0.55 J/cm2 at 30 mW/cm2 with white light. This means that, for identical fluence rates, the threshold value for white light irradiation may be 3 times higher than for blue light irradiation.

  14. Dynamics of a network-based SIS epidemic model with nonmonotone incidence rate

    NASA Astrophysics Data System (ADS)

    Li, Chun-Hsien

    2015-06-01

    This paper studies the dynamics of a network-based SIS epidemic model with nonmonotone incidence rate. This type of nonlinear incidence can be used to describe the psychological effect of certain diseases spread in a contact network at high infective levels. We first find a threshold value for the transmission rate. This value completely determines the dynamics of the model and interestingly, the threshold is not dependent on the functional form of the nonlinear incidence rate. Furthermore, if the transmission rate is less than or equal to the threshold value, the disease will die out. Otherwise, it will be permanent. Numerical experiments are given to illustrate the theoretical results. We also consider the effect of the nonlinear incidence on the epidemic dynamics.

  15. Trace elements in raw milk of buffaloes (Bubalus bubalis) from Campania, Italy.

    PubMed

    Esposito, Mauro; Miedico, Oto; Cavallo, Stefania; Pellicanò, Roberta; Rosato, Guido; Baldi, Loredana; Chiaravalle, A Eugenio

    2017-10-15

    The profile of 18 trace elements was traced in 68 milk samples collected from buffalo farms in the territory known as the "Land of Fires" in the Campania region (Italy). This area has been polluted by the illegal dumping of industrial or domestic waste in fields, which is sometimes then burned, spreading toxic contaminants. Milk from buffaloes raised on rural farms might be a good indicator of environmental contamination risk in the human food chain. Trace element analysis in milk was performed using mass spectrometry. One milk sample was found to be non-compliant due to high Pb concentration. In the absence of legally established threshold values for these elements, the results were compared with similar studies from other countries, and in most cases the content determined in this study was in agreement with values reported elsewhere and does not represent a risk to human health. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Excitation functions of nuclear reactions induced by alpha particles up to 42 MeV on natTi for monitoring purposes and TLA

    NASA Astrophysics Data System (ADS)

    Hermanne, A.; Sonck, M.; Takács, S.; Szelecsényi, F.; Tárkányi, F.

    1999-05-01

    Excitation functions for the reactions induced by alpha particles on natTi foils and leading to the formation of 44m,44g,46,47,48Sc; 48,51Cr and 48V were determined using the stacked foil technique for energies from the respective reaction thresholds up to 42 MeV. The new experimental values are compared to earlier literature values and generally good agreement is found. It appears that the natTi(α,x)51Cr reaction is particularly useful for monitoring α-beams in the 10-20 MeV region, while for energies above 20 MeV the natTi(α,x)47Sc reaction or the natTi(α,x)48V reaction are better suited. The excitation functions established can be used to determine calibration curves for thin layer activation (TLA) as well.

  17. Is the introduction of another variable to the strength-duration curve necessary in neurostimulation?

    PubMed

    Abejón, David; Rueda, Pablo; del Saz, Javier; Arango, Sara; Monzón, Eva; Gilsanz, Fernando

    2015-04-01

    Neurostimulation is the process and technology derived from the application of electricity with different parameters to activate or inhibit nerve pathways. Pulse width (Pw) is the duration of each electrical impulse and, along with amplitude (I), determines the total energy charge of the stimulation. The aim of the study was to test Pw values to find the most adequate pulse widths in rechargeable systems to obtain the largest coverage of the painful area, the most comfortable paresthesia, and the greatest patient satisfaction. A study of the parameters was performed, varying Pw while maintaining a fixed frequency at 50 Hz. Data on perception threshold (Tp), discomfort threshold (Td), and therapeutic threshold (Tt) were recorded, applying 14 increasing Pw values ranging from 50 µsec to 1000 µsec. Lastly, the behavior of the therapeutic range (TR), the coverage of the painful area, the subjective patient perception of paresthesia, and the degree of patient satisfaction were assessed. The findings after analyzing the different thresholds were as follows: When varying the Pw, the differences obtained at each threshold (Tp, Tt, and Td) were statistically significant (p < 0.05). The differences among the resulting Tp values and among the resulting Tt values were statistically significant when varying Pw from 50 up to 600 µsec (p < 0.05). For Pw levels 600 µsec and up, no differences were observed in these thresholds. In the case of Td, significant differences existed as Pw increased from 50 to 700 µsec (p ≤ 0.05). The coverage increased in a statistically significant way (p < 0.05) from Pw values of 50 µsec to 300 µsec. Good or very good subjective perception was shown at about Pw 300 µsec. The patient paresthesia coverage was introduced as an extra variable in the chronaxie-rheobase curve, allowing the adjustment of Pw values for optimal programming. The coverage of the patient against the current chronaxie-rheobase formula will be represented on three axes; an extra axis (z) will appear, multiplying each combination of Pw value and amplitude by the percentage of coverage corresponding to those values. Using this new comparison of chronaxie-rheobase curve vs. coverage, maximum Pw values will be obtained that differ from those obtained by classic methods. © 2014 International Neuromodulation Society.
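
    To make the chronaxie-rheobase relationship concrete, the sketch below uses the classical Lapicque strength-duration curve and scales each (Pw, amplitude) pair by a coverage fraction, mimicking the extra z-axis proposed above; the rheobase, chronaxie and coverage values are assumed for illustration and are not the study's measurements.

        import numpy as np

        def lapicque_threshold(pw_us, rheobase_ma, chronaxie_us):
            """Classical strength-duration (chronaxie-rheobase) curve:
            threshold amplitude as a function of pulse width."""
            return rheobase_ma * (1.0 + chronaxie_us / pw_us)

        pw = np.array([50, 100, 200, 300, 450, 600, 800, 1000], dtype=float)   # µs
        i_thr = lapicque_threshold(pw, rheobase_ma=2.0, chronaxie_us=250.0)

        # Hypothetical coverage fractions per pulse width (the extra z-axis)
        coverage = np.array([0.35, 0.50, 0.65, 0.80, 0.82, 0.83, 0.83, 0.83])
        weighted = i_thr * coverage        # each (Pw, amplitude) pair scaled by its coverage
        for w, i, c, z in zip(pw, i_thr, coverage, weighted):
            print(f"Pw={w:6.0f} µs  I_thr={i:5.2f} mA  coverage={c:.2f}  weighted={z:5.2f}")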

  18. Inventing Wastewater: The Social and Scientific Construction of Effluent in the Northeastern United States

    NASA Astrophysics Data System (ADS)

    Brideau, J. M.; Ng, M.; Hoover, J. H.; Hale, R. L.; Thomas, B.; Vogel, R. M.; Northeast Consortium Hydrologic Synthesis Summer Institute, 2010--Biogeochemistry

    2010-12-01

    Title: Inventing Wastewater: The Social and Scientific Construction of Effluent in the Northeastern United States. Authors: Jeffrey Brideau, Melissa Ng, Joseph Hoover, Rebecca Hale, Brian Thomas, and Richard Vogel. Presented by: Jeffrey Brideau, B.A., M.A., PhD Candidate, Department of History, University of Maryland. Regulation of pollution is a prevalent part of contemporary American society. Scientists and policy makers have established acceptable effluent thresholds, with the ostensible goal of protecting human and stream health. However, this ubiquity of regulation is a recent phenomenon, and institutional mechanisms for effluent control were virtually non-existent in the early 20th century. Nonetheless, these same decades witnessed the emergence of nascent efforts at water pollution abatement. This project aims to explore social and scientific perceptions of wastewater, and begins with the simple premise that socio-cultural values underlie human decision-making in water management, and that wastewater is imbued with a matrix of human values that are continuously renegotiated. So what were the primary motivations for abatement efforts? Were they aesthetic and olfactory, or scientific concern for public and stream health? This paper proposes that there are social as well as scientific thresholds for pollutant loads. Collaborating with a team of interdisciplinary researchers, we have created and aggregated discrete data sets to model, using export coefficient and linear regression modeling techniques, historic pollutant loading in the Northeastern United States. Concurrently, we have drawn on historical narratives of agitation by abatement advocates, nuisance laws, regulatory regimes, and changing scientific understanding; and contrasting the modeling results with these narratives allows this project to quantitatively determine where social thresholds lay in relation to their scientific counterparts. This project's novelty lies in its use of existing narratives of wastewater and remediation efforts in tandem with the scientific quantification of pollutant loads in affected streams. In essence, the success of this project was predicated on the ability of the associated researchers to contribute their expertise, perform collaborative analysis, and, ultimately, produce a product that transcends traditional disciplinary boundaries. This paper represents one facet of that larger project. Determining the social thresholds of pollution loading, and where they converge with or diverge from scientific thresholds, provides insight into why, when, and where various pollutants became offensive.

  19. Forecasting residential solar photovoltaic deployment in California

    DOE PAGES

    Dong, Changgui; Sigrin, Benjamin; Brinkman, Gregory

    2016-12-06

    Residential distributed photovoltaic (PV) deployment in the United States has experienced robust growth, and policy changes impacting the value of solar are likely to occur at the federal and state levels. To establish a credible baseline and evaluate impacts of potential new policies, this analysis employs multiple methods to forecast residential PV deployment in California, including a time-series forecasting model, a threshold heterogeneity diffusion model, a Bass diffusion model, and National Renewable Energy Laboratory's dSolar model. As a baseline, the residential PV market in California is modeled to peak in the early 2020s, with a peak annual installation of 1.5-2 GW across models. We then use the baseline results from the dSolar model and the threshold model to gauge the impact of the recent federal investment tax credit (ITC) extension, the newly approved California net energy metering (NEM) policy, and a hypothetical value-of-solar (VOS) compensation scheme. We find that the recent ITC extension may increase annual PV installations by 12%-18% (roughly 500 MW) for the California residential sector in 2019-2020. The new NEM policy only has a negligible effect in California due to the relatively small new charges (< 100 MW in 2019-2020). Moreover, impacts of the VOS compensation scheme (0.12 cents per kilowatt-hour) are larger, reducing annual PV adoption by 32% (or 900-1300 MW) in 2019-2020.
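
    Of the forecasting approaches listed, the Bass diffusion model is compact enough to sketch. The snippet below computes cumulative and annual adoption for assumed innovation (p) and imitation (q) coefficients and an assumed market size, purely to illustrate how a peak installation year emerges; the dSolar and threshold models, and the actual California parameters, are not reproduced here.

        import numpy as np

        def bass_adoption(t, p, q, m):
            """Cumulative adopters under the Bass diffusion model with market size m,
            innovation coefficient p and imitation coefficient q."""
            e = np.exp(-(p + q) * t)
            return m * (1 - e) / (1 + (q / p) * e)

        years = np.arange(0, 16)
        cum = bass_adoption(years, p=0.01, q=0.4, m=2_000_000)   # assumed parameters
        annual = np.diff(cum, prepend=0.0)
        peak_year = years[np.argmax(annual)]
        print(f"modeled adoption peaks in year {peak_year} at {annual.max():,.0f} installs")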

  20. Dose and Effect Thresholds for Early Key Events in a Mode of ...

    EPA Pesticide Factsheets

    ABSTRACT Strategies for predicting adverse health outcomes of environmental chemicals are centered on early key events in toxicity pathways. However, quantitative relationships between early molecular changes in a given pathway and later health effects are often poorly defined. The goal of this study was to evaluate short-term key event indicators using qualitative and quantitative methods in an established pathway of mouse liver tumorigenesis mediated by peroxisome proliferator-activated receptor-alpha (PPARα). Male B6C3F1 mice were exposed for 7 days to di(2-ethylhexyl) phthalate (DEHP), di-n-octyl phthalate (DNOP), and n-butyl benzyl phthalate (BBP), which vary in PPARα activity and liver tumorigenicity. Each phthalate increased expression of select PPARα target genes at 7 days, while only DEHP significantly increased liver cell proliferation labeling index (LI). Transcriptional benchmark dose (BMDT) estimates for dose-related genomic markers stratified phthalates according to hypothetical tumorigenic potencies, unlike BMDs for non-genomic endpoints (liver weights or proliferation). The 7-day BMDT values for Acot1 as a surrogate measure for PPARα activation were 29, 370, and 676 mg/kg-d for DEHP, DNOP, and BBP, respectively, distinguishing DEHP (liver tumor BMD of 35 mg/kg-d) from non-tumorigenic DNOP and BBP. Effect thresholds were generated using linear regression of DEHP effects at 7 days and 2-year tumor incidence values to anchor early response molec

  1. Novel high/low solubility classification methods for new molecular entities.

    PubMed

    Dave, Rutwij A; Morris, Marilyn E

    2016-09-10

    This research describes a rapid solubility classification approach that could be used in the discovery and development of new molecular entities. Compounds (N = 635) were divided into two groups based on information available in the literature: high solubility (BDDCS/BCS 1/3) and low solubility (BDDCS/BCS 2/4). We established decision rules for determining solubility classes using measured log solubility in molar units (MLogSM) or measured solubility (MSol) in mg/mL units. ROC curve analysis was applied to determine statistically significant threshold values of MSol and MLogSM. Results indicated that NMEs with MLogSM > -3.05 or MSol > 0.30 mg/mL will have ≥85% probability of being highly soluble and new molecular entities with MLogSM ≤ -3.05 or MSol ≤ 0.30 mg/mL will have ≥85% probability of being poorly soluble. When comparing solubility classification using the threshold values of MLogSM or MSol with BDDCS, we were able to correctly classify 85% of compounds. We also evaluated solubility classification of an independent set of 108 orally administered drugs using MSol (0.3 mg/mL) and our method correctly classified 81% and 95% of compounds into high and low solubility classes, respectively. The high/low solubility classification using MLogSM or MSol is novel and independent of traditionally used dose number criteria. Copyright © 2016 Elsevier B.V. All rights reserved.
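
    The sketch below shows one common way to pick a cutoff from an ROC curve (Youden's J) using scikit-learn; the simulated solubility distributions are assumptions chosen so the cutoff lands near the reported value, and the paper's 85%-probability criterion is not reproduced.

        import numpy as np
        from sklearn.metrics import roc_curve

        # Hypothetical data: 1 = high solubility (BDDCS/BCS 1/3), 0 = low (2/4),
        # with measured log molar solubility (MLogSM) as the classifier score.
        rng = np.random.default_rng(42)
        high = rng.normal(-2.0, 1.0, 300)     # assumed distributions, illustration only
        low = rng.normal(-4.2, 1.0, 300)
        mlogsm = np.concatenate([high, low])
        labels = np.concatenate([np.ones(300), np.zeros(300)])

        fpr, tpr, thresholds = roc_curve(labels, mlogsm)
        youden = tpr - fpr                                    # Youden's J statistic
        cutoff = thresholds[np.argmax(youden)]
        print(f"optimal MLogSM cutoff (Youden): {cutoff:.2f}")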

  2. A city scale study on the effects of intensive groundwater heat pump systems on heavy metal contents in groundwater.

    PubMed

    García-Gil, Alejandro; Epting, Jannis; Garrido, Eduardo; Vázquez-Suñé, Enric; Lázaro, Jesús Mateo; Sánchez Navarro, José Ángel; Huggenberger, P; Calvo, Miguel Ángel Marazuela

    2016-12-01

    As a result of the increasing use of shallow geothermal resources, hydraulic, thermal and chemical impacts affecting groundwater quality can be observed with ever-increasing frequency (Possemiers et al., 2014). To overcome the uncertainty associated with chemical impacts, a city-scale study on the effects of intensive geothermal resource use by groundwater heat pump systems on groundwater quality, with special emphasis on heavy metal contents, was performed. Statistical analysis of geochemical data obtained from several field campaigns made it possible to study the spatiotemporal relationship between temperature anomalies in the aquifer and the trace element composition of groundwater. The relationship between temperature and the concentrations of trace elements resulted in weak correlations, indicating that temperature changes are not the driving factor in enhancing heavy metal contamination. Regression models established for these correlations showed a very low reactivity or response of heavy metal contents to temperature changes. The change rates of heavy metal contents with respect to temperature indicate a low risk of exceeding quality threshold values under the exploitation regimes used, neither producing nor significantly enhancing contamination. However, modification of pH, redox potential, electrical conductivity, dissolved oxygen and alkalinity correlated with the concentrations of heavy metals. In this case, the change rates of heavy metal contents are higher, with a greater risk of exceeding threshold values. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. The origin of shallow landslides in Moravia (Czech Republic) in the spring of 2006

    NASA Astrophysics Data System (ADS)

    Bíl, Michal; Müller, Ivo

    2008-07-01

    At the end of March 2006, the Czech Republic (CZ) witnessed a fast thawing of an unusually thick snow cover in conjunction with massive rainfall. Most watercourses suffered floods, and more than 90 shallow landslides occurred in the Moravian region of Eastern CZ, primarily in non-forested areas. This region, geologically part of the Outer Western Carpathians, is prone to landslides because the bedrock is highly erodible Mesozoic and Tertiary flysch. The available meteorological data (depth of snow, water equivalent of the snow, cumulative rainfall, air and soil temperatures) from five local weather stations were used to construct indices quantitatively describing the snow thaw. Among these, the Total Cumulative Precipitation (TCP) combines the amount of water from both thawing snow and rainfall. This concurrence of rain and runoff from snow melt was the decisive factor in triggering the landslides in the spring. The TCP index was applied to data of snow thaw periods for the last 20 years, when no landslides were recorded. This was to establish the safe threshold of TCP without landslides. The calculated safe threshold value for the region is ca. 100 mm of water delivered to the soil during the spring thaw (corresponding to ca. 11 mm per day). In 2006, 10% of the landslides occurred under or at 100 mm of TCP. The upper value of 155 mm covered all of the landslides.
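
    A minimal reading of the TCP index is sketched below: rainfall plus the daily loss of snow-water equivalent, summed over the thaw period and compared against the ~100 mm safe threshold. The daily SWE and rainfall series are invented, and the exact TCP definition used by the authors may differ.

        import numpy as np

        def total_cumulative_precipitation(rain_mm, swe_mm):
            """Sum rainfall and daily snow-water-equivalent losses (melt) over a thaw period."""
            swe = np.asarray(swe_mm, dtype=float)
            melt = np.clip(-np.diff(swe, prepend=swe[0]), 0.0, None)   # SWE decreases -> melt water
            return float(np.sum(rain_mm) + np.sum(melt))

        # Hypothetical 9-day thaw: SWE dropping from 80 mm to 0, plus rain on several days
        swe = [80, 72, 60, 45, 30, 18, 8, 2, 0]
        rain = [0, 0, 5, 12, 20, 8, 0, 3, 0]
        tcp = total_cumulative_precipitation(rain, swe)
        print(f"TCP = {tcp:.0f} mm ->", "above" if tcp > 100 else "below", "the ~100 mm safe threshold")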

  4. Forecasting residential solar photovoltaic deployment in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Changgui; Sigrin, Benjamin; Brinkman, Gregory

    Residential distributed photovoltaic (PV) deployment in the United States has experienced robust growth, and policy changes impacting the value of solar are likely to occur at the federal and state levels. To establish a credible baseline and evaluate impacts of potential new policies, this analysis employs multiple methods to forecast residential PV deployment in California, including a time-series forecasting model, a threshold heterogeneity diffusion model, a Bass diffusion model, and National Renewable Energy Laboratory's dSolar model. As a baseline, the residential PV market in California is modeled to peak in the early 2020s, with a peak annual installation of 1.5-2 GW across models. We then use the baseline results from the dSolar model and the threshold model to gauge the impact of the recent federal investment tax credit (ITC) extension, the newly approved California net energy metering (NEM) policy, and a hypothetical value-of-solar (VOS) compensation scheme. We find that the recent ITC extension may increase annual PV installations by 12%-18% (roughly 500 MW) for the California residential sector in 2019-2020. The new NEM policy only has a negligible effect in California due to the relatively small new charges (< 100 MW in 2019-2020). Moreover, impacts of the VOS compensation scheme (0.12 cents per kilowatt-hour) are larger, reducing annual PV adoption by 32% (or 900-1300 MW) in 2019-2020.

  5. EUNIS habitat's thresholds for the Western coast of the Iberian Peninsula - A Portuguese case study

    NASA Astrophysics Data System (ADS)

    Monteiro, Pedro; Bentes, Luis; Oliveira, Frederico; Afonso, Carlos M. L.; Rangel, Mafalda O.; Gonçalves, Jorge M. S.

    2015-06-01

    The European Nature Information System (EUNIS) has been implemented for the establishment of a marine European habitats inventory. Its hierarchical classification is defined and relies on environmental variables which primarily constrain biological communities (e.g. substrate types, sea energy level, depth and light penetration). The EUNIS habitat classification scheme relies on thresholds (e.g. fraction of light and energy) which are based on expert judgment or on the empirical analysis of the above environmental data. The present paper proposes to establish and validate an appropriate threshold for energy classes (high, moderate and low) and for subtidal biological zonation (infralittoral and circalittoral) suitable for EUNIS habitat classification of the Western Iberian coast. Kinetic wave-induced energy and the fraction of photosynthetically available light exerted on the marine bottom were respectively assigned to the presence of kelp (Saccorhiza polyschides, Laminaria hyperborea and Laminaria ochroleuca) and seaweed species in general. Both data were statistically described, ordered from the largest to the smallest and percentile analyses were independently performed. The threshold between infralittoral and circalittoral was based on the first quartile while the 'moderate energy' class was established between the 12.5 and 87.5 percentiles. To avoid data dependence on sampling locations and assess the confidence interval a bootstrap technique was applied. According to this analysis, more than 75% of seaweeds are present at locations where more than 3.65% of the surface light reaches the sea bottom. The range of energy levels estimated using S. polyschides data, indicate that on the Iberian West coast the 'moderate energy' areas are between 0.00303 and 0.04385 N/m2 of wave-induced energy. The lack of agreement between different studies in different regions of Europe suggests the need for more standardization in the future. However, the obtained thresholds in the present study will be very useful in the near future to implement and establish the Iberian EUNIS habitats inventory.
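
    The percentile-plus-bootstrap approach described above can be sketched as follows; the light-availability sample is simulated and the 25th percentile stands in for the infralittoral/circalittoral split, so the numbers are illustrative only.

        import numpy as np

        def bootstrap_percentile_ci(x, q, n_boot=2000, alpha=0.05, seed=0):
            """Bootstrap confidence interval for the q-th percentile of x."""
            rng = np.random.default_rng(seed)
            x = np.asarray(x, dtype=float)
            stats = [np.percentile(rng.choice(x, size=x.size, replace=True), q)
                     for _ in range(n_boot)]
            return np.percentile(stats, 100 * alpha / 2), np.percentile(stats, 100 * (1 - alpha / 2))

        # Hypothetical % of surface light reaching the bottom at seaweed-occupied sites
        rng = np.random.default_rng(3)
        light = np.exp(rng.normal(np.log(8.0), 0.9, 400))          # skewed sample, illustration only
        q25 = np.percentile(light, 25)                              # infralittoral/circalittoral split
        lo, hi = bootstrap_percentile_ci(light, 25)
        print(f"25th percentile = {q25:.2f}% light, 95% CI [{lo:.2f}, {hi:.2f}]")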

  6. Isoscalar ππ, KK̄, ηη scattering and the σ, f0, f2 mesons from QCD

    NASA Astrophysics Data System (ADS)

    Briceño, Raul A.; Dudek, Jozef J.; Edwards, Robert G.; Wilson, David J.; Hadron Spectrum Collaboration

    2018-03-01

    We present the first lattice QCD study of coupled isoscalar ππ, KK̄, ηη S- and D-wave scattering extracted from discrete finite-volume spectra computed on lattices which have a value of the light quark mass corresponding to mπ ≈ 391 MeV. In the J^P = 0^+ sector we find analogues of the experimental σ and f0(980) states, where the σ appears as a stable bound state below the ππ threshold, and, similar to what is seen in experiment, the f0(980) manifests itself as a dip in the ππ cross section in the vicinity of the KK̄ threshold. For J^P = 2^+ we find two states resembling the f2(1270) and f2'(1525), observed as narrow peaks, with the lighter state dominantly decaying to ππ and the heavier state to KK̄. The presence of all these states is determined rigorously by finding the pole singularity content of scattering amplitudes, and their couplings to decay channels are established using the residues of the poles.

  7. Nonequilibrium electronic transport in a one-dimensional Mott insulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heidrich-Meisner, F.; Gonzalez, Ivan; Al-Hassanieh, K. A.

    2010-01-01

    We calculate the nonequilibrium electronic transport properties of a one-dimensional interacting chain at half filling, coupled to noninteracting leads. The interacting chain is initially in a Mott insulator state that is driven out of equilibrium by applying a strong bias voltage between the leads. For bias voltages above a certain threshold we observe the breakdown of the Mott insulator state and the establishment of a steady-state electronic current through the system. Based on extensive time-dependent density-matrix renormalization-group simulations, we show that this steady-state current always has the same functional dependence on voltage, independent of the microscopic details of the model, and we relate the value of the threshold to the Lieb-Wu gap. We frame our results in terms of the Landau-Zener dielectric breakdown picture. Finally, we also discuss the real-time evolution of the current, and characterize the current-carrying state resulting from the breakdown of the Mott insulator by computing the double occupancy, the spin structure factor, and the entanglement entropy.

  8. Economic Evaluation of Community-Based Case Management of Patients Suffering From Chronic Obstructive Pulmonary Disease.

    PubMed

    Sørensen, Sabrina Storgaard; Pedersen, Kjeld Møller; Weinreich, Ulla Møller; Ehlers, Lars

    2017-06-01

    To analyse the cost effectiveness of community-based case management for patients suffering from chronic obstructive pulmonary disease (COPD). The study took place in the third largest municipality in Denmark and was conducted as a randomised controlled trial with 12 months of follow-up. A total of 150 patients with COPD were randomised into two groups, receiving either usual care or case management in addition to usual care. Case management included, among other things, self-care proficiency, medication compliance, and care coordination. The outcome measure for the analysis was the incremental cost-effectiveness ratio (ICER) as cost per quality-adjusted life year (QALY) from the perspective of the healthcare sector. Costs were valued in British Pounds (£) at 2016 prices. Scenario analyses and probabilistic sensitivity analyses were conducted in order to assess uncertainty of the ICER estimate. The intervention resulted in a QALY improvement of 0.0146 (95% CI -0.0216; 0.0585), and a cost increase of £494 (95% CI -1778; 2766) per patient. No statistically significant difference was observed either in costs or effects. The ICER was £33,865 per QALY gained. Scenario analyses confirmed the robustness of the result and revealed slightly lower ICERs of £28,100-£31,340 per QALY. Analysis revealed that case management led to a positive incremental QALY, but was more costly than usual care. The highly uncertain ICER somewhat exceeds, for instance, the threshold value used by the National Institute for Health and Care Excellence (NICE). No formally established Danish threshold value exists. ClinicalTrials.gov Identifier: NCT01512836.

  9. Relationship between consumer ranking of lamb colour and objective measures of colour.

    PubMed

    Khliji, S; van de Ven, R; Lamb, T A; Lanza, M; Hopkins, D L

    2010-06-01

    Given the lack of data that relates consumer acceptance of lamb colour to instrument measures, a study was undertaken to establish the acceptability thresholds for fresh and displayed meat. Consumers (n=541) were asked to score 20 samples of lamb loin (m. longissimus thoracis et lumborum; LL) on an ordinal scale of 1 (very acceptable) to 5 (very unacceptable). A sample was considered acceptable by a consumer if it scored three or less. Ten samples were used for testing consumer response to fresh colour and 10 to test consumer response to colour during display of up to 4 days. The colour of fresh meat was measured using a Minolta chromameter with a closed cone and a Hunter Lab Miniscan was used for measuring meat on display. For fresh meat, when the a* (redness) and L* (lightness) values are equal to or exceed 9.5 and 34, respectively, on average consumers will consider the meat colour acceptable. However, a* and L* values must be much higher (14.5 and 44, respectively) to have 95% confidence that a randomly selected consumer will consider a sample acceptable. For aged meat, when the wavelength ratio (630/580 nm) and the a* values are equal to or greater than 3.3 and 14.8, respectively, on average consumers will consider the meat acceptable. These thresholds need to be increased to 6.8 for the ratio (630/580 nm) and 21.7 for a* to be 95% confident that a randomly selected consumer will consider a sample acceptable. Crown Copyright 2010. Published by Elsevier Ltd. All rights reserved.

  10. Homocysteine threshold value based on cystathionine beta synthase and paraoxonase 1 activities in mice.

    PubMed

    Hamelet, J; Aït-Yahya-Graison, E; Matulewicz, E; Noll, C; Badel-Chagnon, A; Camproux, A-C; Demuth, K; Paul, J-L; Delabar, J M; Janel, N

    2007-12-01

    Hyperhomocysteinaemia is a metabolic disorder associated with the development of premature atherosclerosis. Among the determinants which predispose to premature thromboembolic and atherothrombotic events, serum activity of paraoxonase 1, mainly synthesized in the liver, has been shown to be a predictor of cardiovascular disease and to be negatively correlated with serum homocysteine levels in humans. Even though treatments of hyperhomocysteinaemic patients with ongoing cardiovascular complications are commonly used, it still remains unclear above which homocysteine level a preventive therapy should be started. In order to establish a threshold of plasma homocysteine concentration, we have analyzed the hepatic cystathionine beta synthase and paraoxonase 1 activities in a moderate to intermediate murine model of hyperhomocysteinaemia. Using wild type and heterozygous cystathionine beta synthase deficient mice fed a methionine enriched diet or a control diet, we first studied the link between cystathionine beta synthase and paraoxonase 1 activities and plasma homocysteine concentration. Among the animals used in this study, we observed a negative correlation between plasma homocysteine level and cystathionine beta synthase activity (rho=-0.52, P=0.0008) or paraoxonase 1 activity (rho=-0.49, P=0.002). Starting from these results, a homocysteine cut-off value of 15 µmol/L has been found for both cystathionine beta synthase (P=0.0003) and paraoxonase 1 (P=0.0007) activities. Our results suggest that both cystathionine beta synthase and paraoxonase 1 activities are significantly decreased in mice with a plasma homocysteine value greater than 15 µmol/L. In an attempt to set up preventive treatment for cardiovascular disease, our results indicate that treatments should be started from 15 µmol/L of plasma homocysteine.

  11. Mixing effects on apparent reaction rates and isotope fractionation during denitrification in a heterogeneous aquifer

    USGS Publications Warehouse

    Green, Christopher T.; Böhlke, John Karl; Bekins, Barbara A.; Phillips, Steven P.

    2010-01-01

    Gradients in contaminant concentrations and isotopic compositions commonly are used to derive reaction parameters for natural attenuation in aquifers. Differences between field-scale (apparent) estimated reaction rates and isotopic fractionations and local-scale (intrinsic) effects are poorly understood for complex natural systems. For a heterogeneous alluvial fan aquifer, numerical models and field observations were used to study the effects of physical heterogeneity on reaction parameter estimates. Field measurements included major ions, age tracers, stable isotopes, and dissolved gases. Parameters were estimated for the O2 reduction rate, denitrification rate, O2 threshold for denitrification, and stable N isotope fractionation during denitrification. For multiple geostatistical realizations of the aquifer, inverse modeling was used to establish reactive transport simulations that were consistent with field observations and served as a basis for numerical experiments to compare sample-based estimates of "apparent" parameters with "true" (intrinsic) values. For this aquifer, non-Gaussian dispersion reduced the magnitudes of apparent reaction rates and isotope fractionations to a greater extent than Gaussian mixing alone. Apparent and true rate constants and fractionation parameters can differ by an order of magnitude or more, especially for samples subject to slow transport, long travel times, or rapid reactions. The effect of mixing on apparent N isotope fractionation potentially explains differences between previous laboratory and field estimates. Similarly, predicted effects on apparent O2 threshold values for denitrification are consistent with previous reports of higher values in aquifers than in the laboratory. These results show that hydrogeological complexity substantially influences the interpretation and prediction of reactive transport.

  12. An avoidance behavior model for migrating whale populations

    NASA Astrophysics Data System (ADS)

    Buck, John R.; Tyack, Peter L.

    2003-04-01

    A new model is presented for the avoidance behavior of migrating marine mammals in the presence of a noise stimulus. This model assumes that each whale will adjust its movement pattern near a sound source to maintain its exposure below its own individually specific maximum received sound-pressure level, called its avoidance threshold. The probability distribution function (PDF) of this avoidance threshold across individuals characterizes the migrating population. The avoidance threshold PDF may be estimated by comparing the distribution of migrating whales during playback and control conditions at their closest point of approach to the sound source. The proposed model was applied to the January 1998 experiment, which placed a single acoustic source from the U.S. Navy SURTASS-LFA system in the migration corridor of grey whales off the California coast. This analysis found that the median avoidance threshold for this migrating grey whale population was 135 dB, with 90% confidence that the median threshold was within +/-3 dB of this value. This value is less than the 141 dB value for 50% avoidance obtained when the 1984 "Probability of Avoidance" model of Malme et al. was applied to the same data. [Work supported by ONR.]

  13. The (in)famous GWAS P-value threshold revisited and updated for low-frequency variants.

    PubMed

    Fadista, João; Manning, Alisa K; Florez, Jose C; Groop, Leif

    2016-08-01

    Genome-wide association studies (GWAS) have long relied on proposed statistical significance thresholds to be able to differentiate true positives from false positives. Although the genome-wide significance P-value threshold of 5 × 10^-8 has become a standard for common-variant GWAS, it has not been updated to cope with the lower allele frequency spectrum used in many recent array-based GWAS studies and sequencing studies. Using a whole-genome- and -exome-sequencing data set of 2875 individuals of European ancestry from the Genetics of Type 2 Diabetes (GoT2D) project and a whole-exome-sequencing data set of 13 000 individuals from five ancestries from the GoT2D and T2D-GENES (Type 2 Diabetes Genetic Exploration by Next-generation sequencing in multi-Ethnic Samples) projects, we describe guidelines for genome- and exome-wide association P-value thresholds needed to correct for multiple testing, explaining the impact of linkage disequilibrium thresholds for distinguishing independent variants, minor allele frequency and ancestry characteristics. We emphasize the advantage of studying recent genetic isolate populations when performing rare and low-frequency genetic association analyses, as the multiple testing burden is diminished due to higher genetic homogeneity.
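
    The arithmetic behind such thresholds is a Bonferroni-style correction: the familiar 5 × 10^-8 corresponds to roughly one million independent common variants, and adding low-frequency variants raises the effective test count. The counts below are assumptions for illustration, not the paper's empirically derived numbers.

        # Bonferroni-style genome-wide threshold: alpha divided by the effective
        # number of independent tests.
        alpha = 0.05
        for n_independent in (1_000_000, 2_000_000, 10_000_000):
            print(f"{n_independent:>10,d} independent tests -> P < {alpha / n_independent:.1e}")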

  14. Do poison center triage guidelines affect healthcare facility referrals?

    PubMed

    Benson, B E; Smith, C A; McKinney, P E; Litovitz, T L; Tandberg, W D

    2001-01-01

    The purpose of this study was to determine the extent to which poison center triage guidelines influence healthcare facility referral rates for acute, unintentional acetaminophen-only poisoning and acute, unintentional adult formulation iron poisoning. Managers of US poison centers were interviewed by telephone to determine their center's triage threshold value (mg/kg) for acute iron and acute acetaminophen poisoning in 1997. Triage threshold values and healthcare facility referral rates were fit to a univariate logistic regression model for acetaminophen and iron using maximum likelihood estimation. Triage threshold values ranged from 120-201 mg/kg (acetaminophen) and 16-61 mg/kg (iron). Referral rates ranged from 3.1% to 24% (acetaminophen) and 3.7% to 46.7% (iron). There was a statistically significant inverse relationship between the triage value and the referral rate for acetaminophen (p < 0.001) and iron (p = 0.0013). The model explained 31.7% of the referral variation for acetaminophen but only 4.1% of the variation for iron. There is great variability in poison center triage values and referral rates for iron and acetaminophen poisoning. Guidelines can account for a meaningful proportion of referral variation. Their influence appears to be substance dependent. These data suggest that efforts to determine and utilize the highest, safe, triage threshold value could substantially decrease healthcare costs for poisonings as long as patient medical outcomes are not compromised.
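
    A sketch of the kind of univariate logistic model described above is given below, fitted to simulated case-level data in which referral probability falls as the center's triage threshold rises. The data-generating parameters are assumptions, and the study itself modeled center-level referral rates rather than this simulated individual-level outcome.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical data: each row is a reported exposure, with the handling
        # center's triage threshold (mg/kg) and whether the case was referred (1) or not (0).
        rng = np.random.default_rng(7)
        thresholds = rng.uniform(120, 200, 2000)                 # acetaminophen triage values
        p_refer = 1 / (1 + np.exp(0.05 * (thresholds - 150)))    # assumed inverse relation
        referred = rng.binomial(1, p_refer)

        model = LogisticRegression().fit(thresholds.reshape(-1, 1), referred)
        print(f"log-odds change per mg/kg: {model.coef_[0][0]:+.3f}")   # negative -> inverse relation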

  15. Vertical-cavity surface-emitting lasers come of age

    NASA Astrophysics Data System (ADS)

    Morgan, Robert A.; Lehman, John A.; Hibbs-Brenner, Mary K.

    1996-04-01

    This manuscript reviews our efforts in demonstrating state-of-the-art planar, batch-fabricable, high-performance vertical-cavity surface-emitting lasers (VCSELs). All performance requirements for short-haul data communication applications are clearly established. We concentrate on the flexibility of the established proton-implanted AlGaAs-based (emitting near 850 nm) technology platform, focusing on a standard device design. This structure is shown to meet or exceed performance and producibility requirements. These include > 99% device yield across 3-in-dia. metal-organic vapor phase epitaxy (MOVPE)-grown wafers and wavelength operation across a > 100-nm range. Recent progress in device performance [low threshold voltage (Vth = 1.53 V); threshold current (Ith = 0.68 mA); continuous wave (CW) power (Pcw = 59 mW); maximum and minimum CW lasing temperatures (T = 200 °C and 10 K); and wall-plug efficiency (ηwp = 28%)] should enable great advances in VCSEL-based technologies. We also discuss the viability of VCSELs in cryogenic and avionic/military environments. Also reviewed is a novel technique, modifying this established platform, to engineer low-threshold, high-speed, single-mode VCSELs.

  16. Validation and evaluation of epistemic uncertainty in rainfall thresholds for regional scale landslide forecasting

    NASA Astrophysics Data System (ADS)

    Gariano, Stefano Luigi; Brunetti, Maria Teresa; Iovine, Giulio; Melillo, Massimo; Peruccacci, Silvia; Terranova, Oreste Giuseppe; Vennari, Carmela; Guzzetti, Fausto

    2015-04-01

    Prediction of rainfall-induced landslides can rely on empirical rainfall thresholds. These are obtained from the analysis of past rainfall events that have (or have not) resulted in slope failures. Accurate prediction requires reliable thresholds, which need to be validated before their use in operational landslide warning systems. Despite the clear relevance of validation, only a few studies have addressed the problem, and have proposed and tested robust validation procedures. We propose a validation procedure that allows for the definition of optimal thresholds for early warning purposes. The validation is based on contingency tables, skill scores, and receiver operating characteristic (ROC) analysis. To establish the optimal threshold, which maximizes the correct landslide predictions and minimizes the incorrect predictions, we propose an index that results from the linear combination of three weighted skill scores. Selection of the optimal threshold depends on the scope and the operational characteristics of the early warning system. The choice is made by selecting the weights appropriately, and by searching for the optimal (maximum) value of the index. We discuss weaknesses in the validation procedure caused by the inherent lack of information (epistemic uncertainty) on landslide occurrence typical of large study areas. When working at the regional scale, landslides may have occurred and not been reported. This results in biases and variations in the contingencies and the skill scores. We introduce two parameters to represent the unknown proportion of rainfall events (above and below the threshold) for which landslides occurred and went unreported. We show that even a very small underestimation in the number of landslides can result in a significant decrease in the performance of a threshold measured by the skill scores. We show that the variations in the skill scores differ depending on whether the uncertainty concerns events above or below the threshold. This has consequences in the ROC analysis. We applied the proposed procedure to a catalogue of rainfall conditions that have resulted in landslides, and to a set of rainfall events that - presumably - have not resulted in landslides, in Sicily, in the period 2002-2012. First, we determined regional event duration-cumulated event (ED) rainfall thresholds for shallow landslide occurrence using 200 rainfall conditions that have resulted in 223 shallow landslides in Sicily in the period 2002-2011. Next, we validated the thresholds using 29 rainfall conditions that have triggered 42 shallow landslides in Sicily in 2012, and 1250 rainfall events that presumably have not resulted in landslides in the same year. We performed a back analysis simulating the use of the thresholds in a hypothetical landslide warning system operating in 2012.
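
    The contingency-table bookkeeping behind such a validation can be sketched as follows; the particular scores, weights and counts below are assumptions for illustration rather than the combination actually used by the authors.

        import numpy as np

        def skill_scores(hits, misses, false_alarms, correct_negatives):
            """Common skill scores from a 2x2 threshold-exceedance contingency table."""
            pod = hits / (hits + misses)                               # probability of detection
            pofd = false_alarms / (false_alarms + correct_negatives)  # probability of false detection
            far = false_alarms / (hits + false_alarms)                 # false alarm ratio
            hk = pod - pofd                                            # Hanssen-Kuipers skill score
            return pod, pofd, far, hk

        def weighted_index(scores, weights):
            """Linear combination of skill scores used to rank candidate thresholds
            (the weights encode the priorities of the warning system)."""
            return float(np.dot(scores, weights))

        pod, pofd, far, hk = skill_scores(hits=35, misses=7, false_alarms=60, correct_negatives=1190)
        idx = weighted_index([pod, 1 - far, hk], weights=[0.4, 0.3, 0.3])
        print(f"POD={pod:.2f} POFD={pofd:.3f} FAR={far:.2f} HK={hk:.2f} index={idx:.2f}")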

  17. Ultrasonically triggered ignition at liquid surfaces.

    PubMed

    Simon, Lars Hendrik; Meyer, Lennart; Wilkens, Volker; Beyer, Michael

    2015-01-01

    Ultrasound is considered to be an ignition source according to international standards, which set a threshold value of 1 mW/mm^2 [1], a value based on theoretical estimates that lacks experimental verification. Therefore, it is assumed that this threshold includes a large safety margin. At the same time, ultrasound is used in a variety of industrial applications where it can come into contact with explosive atmospheres. However, until now, no explosion accidents have been reported in connection with ultrasound, so it has been unclear if the current threshold value is reasonable. Within this paper, it is shown that focused ultrasound coupled into a liquid can in fact ignite explosive atmospheres if a specific target positioned at a liquid's surface converts the acoustic energy into a hot spot. Based on ignition tests, conditions could be derived that are necessary for an ultrasonically triggered explosion. These conditions show that the current threshold value can be raised significantly. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Physical Screening Predictors for Success in Completing Air Force Phase II Air Liaison Officer Aptitude Assessment.

    PubMed

    McGee, John Christopher; Wilson, Eric; Barela, Haley; Blum, Sharon

    2017-03-01

    Air Liaison Officer Aptitude Assessment (AAA) attrition is often associated with a lack of candidate physical preparation. The Functional Movement Screen, Tactical Fitness Assessment, and fitness metrics were collected (n = 29 candidates) to determine what physical factors could predict a candidate's success in completing AAA. Between-group comparisons were made between candidates who completed AAA and those who did not (p < 0.05). Upper 50% thresholds were established for all variables with R^2 < 0.8 and the data were converted to a binary form (0 = did not attain threshold, 1 = attained threshold). Odds ratios, pre/post-test probabilities and positive likelihood ratios were computed and logistic regression applied to explain model variance. The following variables provided the most predictive value for AAA completion: Pull-ups (p = 0.01), Sit-ups (p = 0.002), Relative Powerball Toss (p = 0.017), and Pull-ups × Sit-ups interaction (p = 0.016). Minimum recommended guidelines for AAA screening are Pull-ups (10 maximum), Sit-ups (76/2 minutes), and a Relative Powerball Toss of 0.6980 ft × lb/BW. Associated benefits could include higher graduation rates and cost savings from temporary duty and possible injury care for nonselected candidates. Recommended guidelines should be validated in future class cycles. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.

  19. Fraction of Electrons Consumed in Electron Acceptor Reduction and Hydrogen Thresholds as Indicators of Halorespiratory Physiology

    PubMed Central

    Löffler, Frank E.; Tiedje, James M.; Sanford, Robert A.

    1999-01-01

    Measurements of the hydrogen consumption threshold and the tracking of electrons transferred to the chlorinated electron acceptor (fe) reliably detected chlororespiratory physiology in both mixed cultures and pure cultures capable of using tetrachloroethene, cis-1,2-dichloroethene, vinyl chloride, 2-chlorophenol, 3-chlorobenzoate, 3-chloro-4-hydroxybenzoate, or 1,2-dichloropropane as an electron acceptor. Hydrogen was consumed to significantly lower threshold concentrations of less than 0.4 ppmv compared with the values obtained for the same cultures without a chlorinated compound as an electron acceptor. The fe values ranged from 0.63 to 0.7, values which are in good agreement with theoretical calculations based on the thermodynamics of reductive dechlorination as the terminal electron-accepting process. In contrast, a mixed methanogenic culture that cometabolized 3-chlorophenol exhibited a significantly lower fe value, 0.012. PMID:10473415

  20. Normalizing rainfall/debris-flow thresholds along the U.S. Pacific coast for long-term variations in precipitation climate

    USGS Publications Warehouse

    Wilson, Raymond C.

    1997-01-01

    Broad-scale variations in long-term precipitation climate may influence rainfall/debris-flow threshold values along the U.S. Pacific coast, where both the mean annual precipitation (MAP) and the number of rainfall days (#RDs) are controlled by topography, distance from the coastline, and geographic latitude. Previous authors have proposed that rainfall thresholds are directly proportional to MAP, but this appears to hold only within limited areas (< 1° latitude), where rainfall frequency (#RDs) is nearly constant. MAP-normalized thresholds underestimate the critical rainfall when applied to areas to the south, where the #RDs decrease, and overestimate threshold rainfall when applied to areas to the north, where the #RDs increase. For normalization between climates where both MAP and #RDs vary significantly, thresholds may best be described as multiples of the rainy-day normal, RDN = MAP/#RDs. Using data from several storms that triggered significant debris-flow activity in southern California, the San Francisco Bay region, and the Pacific Northwest, peak 24-hour rainfalls were plotted against RDN values, displaying a linear relationship with a lower bound at about 14 RDN. RDN ratios in this range may provide a threshold for broad-scale regional forecasting of debris-flow activity.
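
    The rainy-day normal defined above, RDN = MAP/#RDs, and the reported lower bound of about 14 RDN translate directly into a simple calculation. The Python sketch below assumes a hypothetical station's MAP and rainfall-day count; it illustrates the normalization only, not the authors' forecasting procedure.

        # Rainy-day normal normalization from the abstract: RDN = MAP / #RDs,
        # with a debris-flow threshold expressed as a multiple of RDN (about 14 here).
        def rainy_day_normal(map_mm: float, rainfall_days: int) -> float:
            return map_mm / rainfall_days

        def debris_flow_threshold(map_mm: float, rainfall_days: int, multiple: float = 14.0) -> float:
            """Peak 24-h rainfall (mm) needed to approach the reported lower bound."""
            return multiple * rainy_day_normal(map_mm, rainfall_days)

        # Hypothetical station: 750 mm mean annual precipitation over 60 rainy days.
        print(debris_flow_threshold(750.0, 60))   # about 175 mm in 24 h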

  1. Cost-effectiveness thresholds: pros and cons.

    PubMed

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.

  2. Spreading dynamics of a SIQRS epidemic model on scale-free networks

    NASA Astrophysics Data System (ADS)

    Li, Tao; Wang, Yuanmei; Guan, Zhi-Hong

    2014-03-01

    In order to investigate the influence of the heterogeneity of the underlying networks and of the quarantine strategy on epidemic spreading, a SIQRS epidemic model on scale-free networks is presented. Using mean-field theory, the spreading dynamics of the virus are analyzed, and the critical spreading threshold and the equilibria are derived. Theoretical results indicate that the critical threshold value depends strongly on the topology of the underlying networks and on the quarantine rate, and that the existence of equilibria is determined by the threshold value. The stability of the disease-free equilibrium and the permanence of the disease are proved. Numerical simulations confirm the analytical results.
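
    The abstract does not reproduce the paper's threshold expression. As a rough mean-field illustration only, the classic spreading threshold on a heterogeneous network scales as <k>/<k²>, which a quarantine rate would further modify; the sketch below computes that ratio for assumed power-law degree distributions and is not the SIQRS result itself.

        import numpy as np

        # Illustrative mean-field calculation (not the paper's exact SIQRS formula):
        # on a heterogeneous network the classic spreading threshold scales as
        # <k> / <k^2>, which shrinks as the degree distribution gets heavier-tailed.
        def degree_moments(gamma: float, k_min: int = 3, k_max: int = 1000):
            k = np.arange(k_min, k_max + 1, dtype=float)
            p = k ** (-gamma)
            p /= p.sum()                      # normalized power-law degree distribution
            return (k * p).sum(), (k ** 2 * p).sum()

        for gamma in (2.2, 2.5, 3.0):
            k1, k2 = degree_moments(gamma)
            print(f"gamma={gamma}: <k>={k1:.2f}, <k^2>={k2:.1f}, threshold ~ {k1 / k2:.4f}")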

  3. Optimal Measurement Level and Ulnar Nerve Cross-Sectional Area Cutoff Threshold for Identifying Ulnar Neuropathy at the Elbow by MRI and Ultrasonography.

    PubMed

    Terayama, Yasushi; Uchiyama, Shigeharu; Ueda, Kazuhiko; Iwakura, Nahoko; Ikegami, Shota; Kato, Yoshiharu; Kato, Hiroyuki

    2018-06-01

    Imaging criteria for diagnosing compressive ulnar neuropathy at the elbow (UNE) have recently been established as the maximum ulnar nerve cross-sectional area (UNCSA) upon magnetic resonance imaging (MRI) and/or ultrasonography (US). However, the levels of maximum UNCSA and diagnostic cutoff values have not yet been established. We therefore analyzed UNCSA by MRI and US in patients with UNE and in controls. We measured UNCSA at 7 levels in 30 patients with UNE and 28 controls by MRI and at 15 levels in 12 patients with UNE and 24 controls by US. We compared UNCSA as determined by MRI or US and determined optimal diagnostic cutoff values based on receiver operating characteristic curve analysis. The UNCSA was significantly larger in the UNE group than in controls at 3, 2, 1, and 0 cm proximal and 1, 2, and 3 cm distal to the medial epicondyle for both modalities. The UNCSA was maximal at 1 cm proximal to the medial epicondyle for MRI (16.1 ± 3.5 mm²) as well as for US (17 ± 7 mm²). A cutoff value of 11.0 mm² for MRI and US was found to be optimal for differentiating between patients with UNE and controls, with an area under the receiver operating characteristic curve of 0.95 for MRI and 0.96 for US. The UNCSA measured by MRI was not significantly different from that by US. Intra-rater and interrater reliabilities for UNCSA were all greater than 0.77. The UNCSA in the severe nerve dysfunction group of 18 patients was significantly larger than that in the mild nerve dysfunction group of 12 patients. By measuring UNCSA with MRI or US at 1 cm proximal to the medial epicondyle, patients with and without UNE could be discriminated at a cutoff threshold of 11.0 mm² with high sensitivity, specificity, and reliability. Level of evidence: Diagnostic III. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
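
    A cutoff such as the reported 11.0 mm² is read off an ROC curve. The sketch below illustrates one common selection rule (the Youden index) on synthetic cross-sectional areas; the distributions and the use of scikit-learn are assumptions for illustration, not the authors' data or method.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(0)
        # Synthetic cross-sectional areas (mm^2); distributions are made up for illustration.
        controls = rng.normal(8.5, 1.5, 28)
        une      = rng.normal(16.0, 3.5, 30)

        y = np.r_[np.zeros(controls.size), np.ones(une.size)]
        x = np.r_[controls, une]

        fpr, tpr, thresholds = roc_curve(y, x)
        cutoff = thresholds[np.argmax(tpr - fpr)]        # Youden index J = sens + spec - 1
        print(f"AUC={roc_auc_score(y, x):.2f}, cutoff ~ {cutoff:.1f} mm^2")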

  4. Correlation of new hypervelocity impact data by threshold penetration relations

    NASA Technical Reports Server (NTRS)

    Hayduk, R. J.; Gough, P. S.; Alfaro-Bou, E.

    1973-01-01

    Threshold penetration data were established by impacting spherical projectiles onto 2024 aluminum single-wall targets. Nylon and cadmium projectiles were used at impact velocities from 3.0 to 6.8 km/s and from 7.9 to 8.5 km/s, respectively. These data are combined with existing data and compared with three threshold relations to assess their respective validities over a wide range of projectile densities. Two of these relations were validated over the extended range of projectile densities.

  5. Pulse oximeter based mobile biotelemetry application.

    PubMed

    Işik, Ali Hakan; Güler, Inan

    2012-01-01

    Quality and features of tele-homecare are improved by information and communication technologies. In this context, a pulse oximeter-based mobile biotelemetry application was developed. With this application, patients can measure their own oxygen saturation and heart rate at home with a Bluetooth pulse oximeter. The Bluetooth virtual serial port protocol is used to send the test results from the pulse oximeter to a smart phone. These data are converted into XML and transmitted to a remote web server database via the smart phone, using GPRS, WLAN, or 3G. A rule-based algorithm is used in the decision-making process. By default, the oxygen saturation threshold is 80% and the heart rate thresholds are 40 and 150 beats/min. If the patient's heart rate falls outside these thresholds or the oxygen saturation drops below its threshold, an emergency SMS is sent to the doctor, who can then direct an ambulance to the patient. The doctor can change these threshold values for individual patients. The conversion of the evaluated data into the SMS XML template is done on the web server. Another important component of the application is web-based monitoring of the pulse oximeter data: the web page provides access to all patient data, so doctors can follow their patients and send e-mail related to the evaluation of the disease, and patients can review their own data. Eight patients took part in the procedure. It is believed that the developed application will facilitate pulse oximeter-based measurement from anywhere and at any time.
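
    The rule-based decision step described above reduces to a few comparisons against the configured thresholds. The sketch below uses the default limits quoted in the abstract (oxygen saturation below 80, heart rate outside 40-150); the function names and the SMS hook are hypothetical placeholders for the server-side components.

        DEFAULT_LIMITS = {"spo2_min": 80, "hr_min": 40, "hr_max": 150}  # defaults from the abstract

        def needs_alert(spo2: float, heart_rate: float, limits: dict = DEFAULT_LIMITS) -> bool:
            """True when the reading breaches the patient's configured thresholds."""
            return (spo2 < limits["spo2_min"]
                    or heart_rate < limits["hr_min"]
                    or heart_rate > limits["hr_max"])

        def handle_reading(spo2: float, heart_rate: float, send_sms) -> None:
            # send_sms stands in for the server-side SMS/XML template step.
            if needs_alert(spo2, heart_rate):
                send_sms(f"ALERT: SpO2={spo2}%, HR={heart_rate} bpm")

        handle_reading(76, 132, send_sms=print)   # example reading that triggers an alert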

  6. A generalized methodology for identification of threshold for HRU delineation in SWAT model

    NASA Astrophysics Data System (ADS)

    M, J.; Sudheer, K.; Chaubey, I.; Raj, C.

    2016-12-01

    The Soil and Water Assessment Tool (SWAT) is a comprehensive distributed hydrological model widely used to support various decisions. The simulation accuracy of a distributed hydrological model depends on the mechanism used to subdivide the watershed. SWAT subdivides the watershed and its sub-basins into small computing units known as hydrologic response units (HRUs). HRUs are delineated from unique combinations of land use, soil type, and slope within the sub-watersheds and are not spatially contiguous. Computations in SWAT are carried out at the HRU level and then aggregated to the sub-basin outlet, which is routed through the stream system. To reduce computation time, the modeler specifies threshold percentages of land use, soil, and slope for HRU delineation; these thresholds set the minimum area for constructing an HRU. In current practice, any land use, soil, or slope class within a sub-basin that falls below the predefined threshold is absorbed into the dominant class, which introduces some ambiguity into the process simulations through an inappropriate representation of the area. The information lost as the threshold values vary depends strongly on the purpose of the study. This research therefore examines the effect of HRU-delineation threshold values on SWAT sediment simulations and suggests guidelines for selecting threshold values that preserve sediment simulation accuracy. A preliminary study was carried out on the Illinois watershed by assigning different thresholds for land use and soil. A general methodology is proposed for identifying an appropriate threshold for HRU delineation in SWAT that balances computational time and simulation accuracy; it can be applied to any watershed using a single model run with a zero-zero threshold.
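
    The thresholding step discussed above can be pictured as dropping minor classes and renormalizing the rest. The sketch below is a simplified stand-in for SWAT's HRU delineation, with hypothetical land-use fractions and a 10% threshold; the actual SWAT implementation also handles soil and slope classes and the associated area bookkeeping.

        def apply_hru_threshold(fractions: dict, threshold: float) -> dict:
            """Drop classes below the areal-fraction threshold and renormalize the rest."""
            kept = {name: f for name, f in fractions.items() if f >= threshold}
            total = sum(kept.values())
            return {name: f / total for name, f in kept.items()}

        # Hypothetical sub-basin land-use fractions with a 10% threshold.
        landuse = {"forest": 0.55, "pasture": 0.32, "urban": 0.08, "wetland": 0.05}
        print(apply_hru_threshold(landuse, threshold=0.10))
        # minor classes are absorbed, illustrating the information loss discussed above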

  7. 78 FR 16921 - Physical Protection of Byproduct Material

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-19

    ... 1 and category 2 thresholds are based on the quantities established by the International Atomic... orders that were issued to licensees using a graded approach based on the relative risk and quantity of... 5,400 Ytterbium-169 300 8,100 3 81.0 These materials and thresholds are based on the IAEA Code of...

  8. Photosynthetically active radiation (PAR) x ultraviolet radiation (UV) interact to initiate solar injury in apple

    USDA-ARS?s Scientific Manuscript database

    Sunburn or solar injury (SI) in apple is associated with high temperature, high visible light and ultraviolet radiation (UV). Fruit surface temperature (FST) thresholds for SI related disorders have been developed but there are no thresholds established for solar radiation. The objectives of the s...

  9. 48 CFR 8.405-2 - Ordering procedures for services requiring a statement of work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (3) For proposed orders exceeding the maximum order threshold or when establishing a BPA. In addition... is reasonable. Place the order, or establish the BPA, with the schedule contractor that represents...

  10. 75 FR 76729 - Market Access Agreement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-09

    ... falls below pre-established financial performance thresholds. The draft amendment (MAA Amendment) is... System banks and the Funding Corporation that establishes certain financial performance criteria. Under... Agreement). FOR FURTHER INFORMATION CONTACT: Chris Wilson, Financial Analyst, Office of Regulatory Policy...

  11. The Vulnerability of People to Landslides: A Case Study on the Relationship between the Casualties and Volume of Landslides in China.

    PubMed

    Lin, Qigen; Wang, Ying; Liu, Tianxue; Zhu, Yingqi; Sui, Qi

    2017-02-21

    The lack of a detailed landslide inventory makes research on the vulnerability of people to landslides highly limited. In this paper, the authors collect information on landslides that have caused casualties in China and established the Landslides Casualties Inventory of China. One hundred landslide cases from 2003 to 2012 were used to develop an empirical relationship between the volume of a landslide event and the casualties it caused. Error bars were used to describe the uncertainty in casualties resulting from landslides and to establish a threshold curve for landslide casualties in China. The threshold curve was then applied to landslide cases that occurred in 2013 and 2014. The validation results show that the casualties estimated from the threshold curve agreed well with the actual casualties, with only small deviations. The threshold curve can therefore be used to estimate potential casualties and landslide vulnerability, which is valuable for emergency rescue operations after a landslide and for risk assessment research.

  12. The Vulnerability of People to Landslides: A Case Study on the Relationship between the Casualties and Volume of Landslides in China

    PubMed Central

    Lin, Qigen; Wang, Ying; Liu, Tianxue; Zhu, Yingqi; Sui, Qi

    2017-01-01

    The lack of a detailed landslide inventory makes research on the vulnerability of people to landslides highly limited. In this paper, the authors collect information on landslides that have caused casualties in China and established the Landslides Casualties Inventory of China. One hundred landslide cases from 2003 to 2012 were used to develop an empirical relationship between the volume of a landslide event and the casualties it caused. Error bars were used to describe the uncertainty in casualties resulting from landslides and to establish a threshold curve for landslide casualties in China. The threshold curve was then applied to landslide cases that occurred in 2013 and 2014. The validation results show that the casualties estimated from the threshold curve agreed well with the actual casualties, with only small deviations. The threshold curve can therefore be used to estimate potential casualties and landslide vulnerability, which is valuable for emergency rescue operations after a landslide and for risk assessment research. PMID:28230810

  13. 41 CFR 102-73.40 - What happens if the dollar value of the project exceeds the prospectus threshold?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What happens if the dollar value of the project exceeds the prospectus threshold? 102-73.40 Section 102-73.40 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 73-REAL ESTATE...

  14. 30 CFR 71.700 - Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors. 71.700 Section 71.700 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY HEALTH STANDARDS-SURFACE COAL MINES AND SURFACE WORK AREAS OF UNDERGROUND...

  15. 30 CFR 71.700 - Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors. 71.700 Section 71.700 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY HEALTH STANDARDS-SURFACE COAL MINES AND SURFACE WORK AREAS OF UNDERGROUND...

  16. 41 CFR 102-73.40 - What happens if the dollar value of the project exceeds the prospectus threshold?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What happens if the dollar value of the project exceeds the prospectus threshold? 102-73.40 Section 102-73.40 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 73-REAL ESTATE...

  17. 30 CFR 71.700 - Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Inhalation hazards; threshold limit values for gases, dust, fumes, mists, and vapors. 71.700 Section 71.700 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR COAL MINE SAFETY AND HEALTH MANDATORY HEALTH STANDARDS-SURFACE COAL MINES AND SURFACE WORK AREAS OF UNDERGROUND...

  18. An Early Model for Value and Sustainability in Health Information Exchanges: Qualitative Study

    PubMed Central

    2018-01-01

    Background The primary value relative to health information exchange has been seen in terms of cost savings relative to laboratory and radiology testing, emergency department expenditures, and admissions. However, models are needed to statistically quantify value and sustainability and better understand the dependent and mediating factors that contribute to value and sustainability. Objective The purpose of this study was to provide a basis for early model development for health information exchange value and sustainability. Methods A qualitative study was conducted with 21 interviews of eHealth Exchange participants across 10 organizations. Using a grounded theory approach and 3.0 as a relative frequency threshold, 5 main categories and 16 subcategories emerged. Results This study identifies 3 core current perceived value factors and 5 potential perceived value factors—how interviewees predict health information exchanges may evolve as there are more participants. These value factors were used as the foundation for early model development for sustainability of health information exchange. Conclusions Using the value factors from the interviews, the study provides the basis for early model development for health information exchange value and sustainability. This basis includes factors from the research: fostering consumer engagement; establishing a provider directory; quantifying use, cost, and clinical outcomes; ensuring data integrity through patient matching; and increasing awareness, usefulness, interoperability, and sustainability of eHealth Exchange. PMID:29712623

  19. An Early Model for Value and Sustainability in Health Information Exchanges: Qualitative Study.

    PubMed

    Feldman, Sue S

    2018-04-30

    The primary value relative to health information exchange has been seen in terms of cost savings relative to laboratory and radiology testing, emergency department expenditures, and admissions. However, models are needed to statistically quantify value and sustainability and better understand the dependent and mediating factors that contribute to value and sustainability. The purpose of this study was to provide a basis for early model development for health information exchange value and sustainability. A qualitative study was conducted with 21 interviews of eHealth Exchange participants across 10 organizations. Using a grounded theory approach and 3.0 as a relative frequency threshold, 5 main categories and 16 subcategories emerged. This study identifies 3 core current perceived value factors and 5 potential perceived value factors-how interviewees predict health information exchanges may evolve as there are more participants. These value factors were used as the foundation for early model development for sustainability of health information exchange. Using the value factors from the interviews, the study provides the basis for early model development for health information exchange value and sustainability. This basis includes factors from the research: fostering consumer engagement; establishing a provider directory; quantifying use, cost, and clinical outcomes; ensuring data integrity through patient matching; and increasing awareness, usefulness, interoperability, and sustainability of eHealth Exchange. ©Sue S Feldman. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 30.04.2018.

  20. 10 CFR Appendix E to Part 20 - Nationally Tracked Source Thresholds

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Nationally Tracked Source Thresholds E Appendix E to Part 20 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION Pt. 20, App. E Appendix E to Part 20— Nationally Tracked Source Thresholds The Terabecquerel (TBq) values are the...

  1. 10 CFR Appendix E to Part 20 - Nationally Tracked Source Thresholds

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Nationally Tracked Source Thresholds E Appendix E to Part 20 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION Pt. 20, App. E Appendix E to Part 20— Nationally Tracked Source Thresholds The Terabecquerel (TBq) values are the...

  2. 10 CFR Appendix E to Part 20 - Nationally Tracked Source Thresholds

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Nationally Tracked Source Thresholds E Appendix E to Part 20 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION Pt. 20, App. E Appendix E to Part 20— Nationally Tracked Source Thresholds The Terabecquerel (TBq) values are the...

  3. 10 CFR Appendix E to Part 20 - Nationally Tracked Source Thresholds

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Nationally Tracked Source Thresholds E Appendix E to Part 20 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION Pt. 20, App. E Appendix E to Part 20— Nationally Tracked Source Thresholds The Terabecquerel (TBq) values are the...

  4. 10 CFR Appendix E to Part 20 - Nationally Tracked Source Thresholds

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Nationally Tracked Source Thresholds E Appendix E to Part 20 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION Pt. 20, App. E Appendix E to Part 20— Nationally Tracked Source Thresholds The Terabecquerel (TBq) values are the...

  5. Percolation of disordered jammed sphere packings

    NASA Astrophysics Data System (ADS)

    Ziff, Robert M.; Torquato, Salvatore

    2017-02-01

    We determine the site and bond percolation thresholds for a system of disordered jammed sphere packings in the maximally random jammed state, generated by the Torquato-Jiao algorithm. For the site threshold, which gives the fraction of conducting versus non-conducting spheres necessary for percolation, we find pc = 0.3116(3), consistent with the 1979 value of Powell, 0.310(5), and identical within errors to the threshold for the simple-cubic lattice, 0.311608, which shares the same average coordination number of 6. In terms of the volume fraction φ, the threshold corresponds to a critical value φc = 0.199. For the bond threshold, which apparently was not measured before, we find pc = 0.2424(3). To find these thresholds, we considered two shape-dependent universal ratios involving the size of the largest cluster, fluctuations in that size, and the second moment of the size distribution; we confirmed the ratios' universality by also studying the simple-cubic lattice with a similar cubic boundary. The results are applicable to many problems including conductivity in random mixtures, glass formation, and drug loading in pharmaceutical tablets.
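
    The jammed-packing analysis and universal-ratio method are not reproduced here. As a toy illustration of locating a site percolation threshold, the sketch below tracks the largest-cluster fraction on a simple-cubic lattice (whose known threshold, 0.311608, the paper uses for comparison); the lattice size and occupation probabilities are arbitrary choices.

        import numpy as np
        from scipy import ndimage

        # Toy illustration: largest-cluster fraction for site percolation on a
        # simple-cubic lattice (threshold ~0.3116), not the jammed-packing analysis itself.
        rng = np.random.default_rng(1)
        L = 40
        structure = ndimage.generate_binary_structure(3, 1)   # 6-connected neighbours

        for p in (0.25, 0.31, 0.37):
            occupied = rng.random((L, L, L)) < p
            labels, n = ndimage.label(occupied, structure=structure)
            largest = np.bincount(labels.ravel())[1:].max() if n else 0
            print(f"p={p}: largest cluster fraction = {largest / occupied.sum():.3f}")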

  6. Reliability of the method of levels for determining cutaneous temperature sensitivity

    NASA Astrophysics Data System (ADS)

    Jakovljević, Miroljub; Mekjavić, Igor B.

    2012-09-01

    Determination of the thermal thresholds is used clinically for evaluation of peripheral nervous system function. The aim of this study was to evaluate reliability of the method of levels performed with a new, low cost device for determining cutaneous temperature sensitivity. Nineteen male subjects were included in the study. Thermal thresholds were tested on the right side at the volar surface of mid-forearm, lateral surface of mid-upper arm and front area of mid-thigh. Thermal testing was carried out by the method of levels with an initial temperature step of 2°C. Variability of thermal thresholds was expressed by means of the ratio between the second and the first testing, coefficient of variation (CV), coefficient of repeatability (CR), intraclass correlation coefficient (ICC), mean difference between sessions (S1-S2diff), standard error of measurement (SEM) and minimally detectable change (MDC). There were no statistically significant changes between sessions for warm or cold thresholds, or between warm and cold thresholds. Within-subject CVs were acceptable. The CR estimates for warm thresholds ranged from 0.74°C to 1.06°C and from 0.67°C to 1.07°C for cold thresholds. The ICC values for intra-rater reliability ranged from 0.41 to 0.72 for warm thresholds and from 0.67 to 0.84 for cold thresholds. S1-S2diff ranged from -0.15°C to 0.07°C for warm thresholds, and from -0.08°C to 0.07°C for cold thresholds. SEM ranged from 0.26°C to 0.38°C for warm thresholds, and from 0.23°C to 0.38°C for cold thresholds. Estimated MDC values were between 0.60°C and 0.88°C for warm thresholds, and 0.53°C and 0.88°C for cold thresholds. The method of levels for determining cutaneous temperature sensitivity has acceptable reliability.
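
    The SEM and MDC figures quoted above follow the standard formulas SEM = SD × sqrt(1 − ICC) and MDC95 = 1.96 × sqrt(2) × SEM. The sketch below applies them to assumed inputs; the SD and ICC values are placeholders, not the study's measurements.

        import math

        # Standard formulas behind the SEM and MDC figures quoted above:
        #   SEM = SD * sqrt(1 - ICC)        (standard error of measurement)
        #   MDC = 1.96 * sqrt(2) * SEM      (minimal detectable change, 95% confidence)
        def sem(sd: float, icc: float) -> float:
            return sd * math.sqrt(1.0 - icc)

        def mdc95(sd: float, icc: float) -> float:
            return 1.96 * math.sqrt(2.0) * sem(sd, icc)

        # Hypothetical warm-threshold example: between-subject SD of 0.6 degC, ICC of 0.7.
        print(f"SEM={sem(0.6, 0.7):.2f} degC, MDC={mdc95(0.6, 0.7):.2f} degC")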

  7. Automatic detection of malaria parasite in blood images using two parameters.

    PubMed

    Kim, Jong-Dae; Nam, Kyeong-Min; Park, Chan-Young; Kim, Yu-Seop; Song, Hye-Jeong

    2015-01-01

    Malaria must be diagnosed quickly and accurately at the initial infection stage and treated early to be cured properly. Microscope-based malaria diagnosis requires considerable labor and time from a skilled expert, and the results vary greatly between individual diagnosticians. Therefore, to measure malaria parasite infection quickly and accurately, studies of automated classification techniques using various parameters have been conducted. In this study, classification performance was measured as two parameters were varied in order to determine the values that best distinguish normal from Plasmodium-infected red blood cells. To reduce the stain deviation of the acquired images, a principal component analysis (PCA) grayscale conversion method was used, and the two parameters were the infected-area threshold and the threshold value used in binarization. The best-performing parameter values were obtained by selecting the infected-area value (72) corresponding to the lowest error rate at a cell binarization threshold of 128 for detecting Plasmodium-infected red blood cells.
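
    The two parameters described above amount to a per-pixel binarization threshold (128) and an infected-area threshold (72). The sketch below applies that rule to a synthetic grayscale patch; the assumption that stained parasite pixels fall below the binarization threshold, and the random test image, are illustrative only.

        import numpy as np

        # Sketch of a two-parameter rule: binarize the PCA grayscale cell image at one
        # threshold, then call the cell infected if the stained area exceeds an area
        # threshold. The values 128 and 72 are the ones reported above; the darker-pixel
        # convention is an assumption.
        def is_infected(gray_cell: np.ndarray, pixel_threshold: int = 128,
                        area_threshold: int = 72) -> bool:
            stained = gray_cell < pixel_threshold
            return int(stained.sum()) > area_threshold

        rng = np.random.default_rng(2)
        fake_cell = rng.integers(100, 255, size=(40, 40))   # hypothetical grayscale patch
        print(is_infected(fake_cell))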

  8. Model for Estimating the Threshold Mechanical Stability of Structural Cartilage Grafts Used in Rhinoplasty

    PubMed Central

    Zemek, Allison; Garg, Rohit; Wong, Brian J. F.

    2014-01-01

    Objectives/Hypothesis Characterizing the mechanical properties of structural cartilage grafts used in rhinoplasty is valuable because softer engineered tissues are more time- and cost-efficient to manufacture. The aim of this study is to quantitatively identify the threshold mechanical stability (e.g., Young’s modulus) of columellar, L-strut, and alar cartilage replacement grafts. Study Design Descriptive, focus group survey. Methods Ten mechanical phantoms of identical size (5 × 20 × 2.3 mm) and varying stiffness (0.360 to 0.85 MPa in 0.05 MPa increments) were made from urethane. A focus group of experienced rhinoplasty surgeons (n = 25, 5 to 30 years in practice) were asked to arrange the phantoms in order of increasing stiffness. Then, they were asked to identify the minimum acceptable stiffness that would still result in favorable surgical outcomes for three clinical applications: columellar, L-strut, and lateral crural replacement grafts. Available surgeons were tested again after 1 week to evaluate intra-rater consistency. Results For each surgeon, the threshold stiffness for each clinical application differed from the threshold values derived by logistic regression by no more than 0.05 MPa (accuracy to within 10%). Specific thresholds were 0.56, 0.59, and 0.49 MPa for columellar, L-strut, and alar grafts, respectively. For comparison, human nasal septal cartilage is approximately 0.8 MPa. Conclusions There was little inter- and intra-rater variation of the identified threshold values for adequate graft stiffness. The identified threshold values will be useful for the design of tissue-engineered or semisynthetic cartilage grafts for use in structural nasal surgery. PMID:20513022

  9. Integration of community structure data reveals observable effects below sediment guideline thresholds in a large estuary.

    PubMed

    Tremblay, Louis A; Clark, Dana; Sinner, Jim; Ellis, Joanne I

    2017-09-20

    The sustainable management of estuarine and coastal ecosystems requires robust frameworks due to the presence of multiple physical and chemical stressors. In this study, we assessed whether ecological health decline, based on community structure composition changes along a pollution gradient, occurred at levels below guideline threshold values for copper, zinc and lead. Canonical analysis of principal coordinates (CAP) was used to characterise benthic communities along a metal contamination gradient. The analysis revealed changes in benthic community distribution at levels below the individual guideline values for the three metals. These results suggest that field-based measures of ecological health analysed with multivariate tools can provide additional information to single metal guideline threshold values to monitor large systems exposed to multiple stressors.

  10. Using multi-date satellite imagery to monitor invasive grass species distribution in post-wildfire landscapes: An iterative, adaptable approach that employs open-source data and software

    USGS Publications Warehouse

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Kumar, Sunil; Swallow, Aaron; Luizza, Matthew; Chignell, Steve

    2017-01-01

    Among the most pressing concerns of land managers in post-wildfire landscapes are the establishment and spread of invasive species. Land managers need accurate maps of invasive species cover for targeted management post-disturbance that are easily transferable across space and time. In this study, we sought to develop an iterative, replicable methodology based on limited invasive species occurrence data, freely available remotely sensed data, and open source software to predict the distribution of Bromus tectorum (cheatgrass) in a post-wildfire landscape. We developed four species distribution models using eight spectral indices derived from five months of Landsat 8 Operational Land Imager (OLI) data in 2014. These months corresponded to both cheatgrass growing period and time of field data collection in the study area. The four models were improved using an iterative approach in which a threshold for cover was established, and all models had high sensitivity values when tested on an independent dataset. We also quantified the area at highest risk for invasion in future seasons given 2014 distribution, topographic covariates, and seed dispersal limitations. These models demonstrate the effectiveness of using derived multi-date spectral indices as proxies for species occurrence on the landscape, the importance of selecting thresholds for invasive species cover to evaluate ecological risk in species distribution models, and the applicability of Landsat 8 OLI and the Software for Assisted Habitat Modeling for targeted invasive species management.

  11. Using multi-date satellite imagery to monitor invasive grass species distribution in post-wildfire landscapes: An iterative, adaptable approach that employs open-source data and software

    NASA Astrophysics Data System (ADS)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Kumar, Sunil; Swallow, Aaron; Luizza, Matthew W.; Chignell, Stephen M.

    2017-07-01

    Among the most pressing concerns of land managers in post-wildfire landscapes are the establishment and spread of invasive species. Land managers need accurate maps of invasive species cover for targeted management post-disturbance that are easily transferable across space and time. In this study, we sought to develop an iterative, replicable methodology based on limited invasive species occurrence data, freely available remotely sensed data, and open source software to predict the distribution of Bromus tectorum (cheatgrass) in a post-wildfire landscape. We developed four species distribution models using eight spectral indices derived from five months of Landsat 8 Operational Land Imager (OLI) data in 2014. These months corresponded to both cheatgrass growing period and time of field data collection in the study area. The four models were improved using an iterative approach in which a threshold for cover was established, and all models had high sensitivity values when tested on an independent dataset. We also quantified the area at highest risk for invasion in future seasons given 2014 distribution, topographic covariates, and seed dispersal limitations. These models demonstrate the effectiveness of using derived multi-date spectral indices as proxies for species occurrence on the landscape, the importance of selecting thresholds for invasive species cover to evaluate ecological risk in species distribution models, and the applicability of Landsat 8 OLI and the Software for Assisted Habitat Modeling for targeted invasive species management.

  12. Quantitative assessment model for gastric cancer screening

    PubMed Central

    Chen, Kun; Yu, Wei-Ping; Song, Liang; Zhu, Yi-Min

    2005-01-01

    AIM: To set up a mathematical model for gastric cancer screening and to evaluate its usefulness in mass screening for gastric cancer. METHODS: A case-control study was carried out on 66 patients and 198 normal people, and the risk and protective factors for gastric cancer were determined, including heavy manual work, foods such as small yellow-fin tuna, dried small shrimps, squills, crabs, mothers suffering from gastric diseases, spouse alive, use of refrigerators, and hot food. According to principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, statistically significant factors were selected and a weight coefficient was calculated for each one by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric-cancer fuzzy subset, a mathematical model was established for each subset, and a mathematical expression for the attribute degree (AD) was obtained. RESULTS: Based on data from 63 patients and 693 normal people, the AD of each subject was calculated. Considering sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. With these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%. Moreover, a statistical test showed that the identification outcomes of the two calculation methods were identical (P > 0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible, economical and can be used to determine individual and population risks of gastric cancer. PMID:15655813

  13. A novel fusion method of improved adaptive LTP and two-directional two-dimensional PCA for face feature extraction

    NASA Astrophysics Data System (ADS)

    Luo, Yuan; Wang, Bo-yu; Zhang, Yi; Zhao, Li-ming

    2018-03-01

    In this paper, to address the problem that, under varying illumination and random noise, the local texture features of a face image cannot be described completely because the threshold of the local ternary pattern (LTP) cannot be calculated adaptively, a local three-value model based on an improved adaptive local ternary pattern (IALTP) is proposed. Firstly, a difference function between the center pixel and the neighborhood pixel weights is established to obtain the statistical characteristics of the central pixel and its neighborhood. Secondly, an adaptive gradient-descent iterative function is established to calculate the difference coefficient, which is defined as the threshold of the IALTP operator. Finally, the mean and standard deviation of the pixel weights in the local region are used as the coding mode of IALTP. In order to reflect the overall properties of the face and reduce the dimension of the features, two-directional two-dimensional PCA ((2D)²PCA) is adopted. The IALTP is used to extract local texture features of the eye and mouth areas. After combining the global features and local features, the fusion features (IALTP+) are obtained. The experimental results on the Extended Yale B and AR standard face databases indicate that, under different illuminations and random noises, the proposed algorithm is more robust than others and the feature dimension is smaller. The shortest running time reaches 0.3296 s, and the highest recognition rate reaches 97.39%.

  14. The effect of adsorbed liquid and material density on saltation threshold: Insight from laboratory and wind tunnel experiments

    NASA Astrophysics Data System (ADS)

    Yu, Xinting; Hörst, Sarah M.; He, Chao; Bridges, Nathan T.; Burr, Devon M.; Sebree, Joshua A.; Smith, James K.

    2017-11-01

    Saltation threshold, the minimum wind speed for sediment transport, is a fundamental parameter in aeolian processes. Measuring this threshold using boundary layer wind tunnels, in which particles are mobilized by flowing air, for a subset of different planetary conditions can inform our understanding of physical processes of sediment transport. The presence of liquid, such as water on Earth or methane on Titan, may affect the threshold values to a great extent. Sediment density is also crucial for determining threshold values. Here we provide quantitative data on density and water content of common wind tunnel materials (including chromite, basalt, quartz sand, beach sand, glass beads, gas chromatograph packing materials, walnut shells, iced tea powder, activated charcoal, instant coffee, and glass bubbles) that have been used to study conditions on Earth, Titan, Mars, and Venus. The measured density values for low density materials are higher compared to literature values (e.g., ∼30% for walnut shells), whereas for the high density materials, there is no such discrepancy. We also find that low density materials have much higher water content and longer atmospheric equilibration timescales compared to high density sediments. We used thermogravimetric analysis (TGA) to quantify surface and internal water and found that over 80% of the total water content is surface water for low density materials. In the Titan Wind Tunnel (TWT), where Reynolds number conditions similar to those on Titan can be achieved, we performed threshold experiments with the standard walnut shells (125-150 μm, 7.2% water by mass) and dried walnut shells, in which the water content was reduced to 1.7%. The threshold results for the two scenarios are almost the same, which indicates that humidity had a negligible effect on threshold for walnut shells in this experimental regime. When the water content is lower than 11.0%, the interparticle forces are dominated by adsorption forces, whereas at higher values the interparticle forces are dominated by much larger capillary forces. For materials with low equilibrium water content, like quartz sand, capillary forces dominate. When the interparticle forces are dominated by adsorption forces, the threshold does not increase with increasing relative humidity (RH) or water content. Only when the interparticle forces are dominated by capillary forces does the threshold start to increase with increasing RH/water content. Since tholins have a low methane content (0.3% at saturation, [Curtis, D. B., Hatch, C. D., Hasenkopf, C. A., et al., 2008. Laboratory studies of methane and ethane adsorption and nucleation onto organic particles: Application to Titan's clouds. Icarus, 195, 792. http://dx.doi.org/10.1016/j.icarus.2008.02.003]), we believe tholins would behave similarly to quartz sand when subjected to methane moisture.

  15. MISR RCCM Products

    Atmospheric Science Data Center

    2017-10-11

    ... new inland water class for RCCM calculation and changed threshold and surface classification datasets accordingly. Modified land second ... 06/21/2000 First version of RCCM. Pre-launch threshold values are used. New ancillary files: ...

  16. Modeling spatially-varying landscape change points in species occurrence thresholds

    USGS Publications Warehouse

    Wagner, Tyler; Midway, Stephen R.

    2014-01-01

    Predicting species distributions at scales of regions to continents is often necessary, as large-scale phenomena influence the distributions of spatially structured populations. Land use and land cover are important large-scale drivers of species distributions, and landscapes are known to create species occurrence thresholds, where small changes in a landscape characteristic result in abrupt changes in occurrence. The value of the landscape characteristic at which this change occurs is referred to as a change point. We present a hierarchical Bayesian threshold model (HBTM) that allows for estimating spatially varying parameters, including change points. Our model also allows for modeling estimated parameters in an effort to understand large-scale drivers of variability in land use and land cover on species occurrence thresholds. We use range-wide detection/nondetection data for the eastern brook trout (Salvelinus fontinalis), a stream-dwelling salmonid, to illustrate our HBTM for estimating and modeling spatially varying threshold parameters in species occurrence. We parameterized the model for investigating thresholds in landscape predictor variables that are measured as proportions, and which are therefore restricted to values between 0 and 1. Our HBTM estimated spatially varying thresholds in brook trout occurrence for the proportions of both agricultural and urban land use. There was relatively little spatial variation in change point estimates, although there was spatial variability in the overall shape of the threshold response and associated uncertainty. In addition, regional mean stream water temperature was correlated with the change point parameters for the proportion of urban land use, with the change point value increasing with increasing mean stream water temperature. We present a framework for quantifying macrosystem variability in spatially varying threshold model parameters in relation to important large-scale drivers such as land use and land cover. Although the model presented is a logistic HBTM, it can easily be extended to accommodate other statistical distributions for modeling species richness or abundance.

  17. Practical determination of aortic valve calcium volume score on contrast-enhanced computed tomography prior to transcatheter aortic valve replacement and impact on paravalvular regurgitation: Elucidating optimal threshold cutoffs.

    PubMed

    Bettinger, Nicolas; Khalique, Omar K; Krepp, Joseph M; Hamid, Nadira B; Bae, David J; Pulerwitz, Todd C; Liao, Ming; Hahn, Rebecca T; Vahl, Torsten P; Nazif, Tamim M; George, Isaac; Leon, Martin B; Einstein, Andrew J; Kodali, Susheel K

    The threshold for the optimal computed tomography (CT) number in Hounsfield units (HU) to quantify aortic valvular calcium on contrast-enhanced scans has not been standardized. Our aim was to find the most accurate threshold to predict paravalvular regurgitation (PVR) after transcatheter aortic valve replacement (TAVR). 104 patients who underwent TAVR with the CoreValve prosthesis were studied retrospectively. Luminal attenuation (LA) in HU was measured at the level of the aortic annulus. The calcium volume score for the aortic valvular complex was measured using 6 threshold cutoffs (650 HU, 850 HU, LA × 1.25, LA × 1.5, LA + 50, LA + 100). Receiver-operating characteristic (ROC) analysis was performed to assess the predictive value for > mild PVR (n = 16). Multivariable analysis was performed to determine the accuracy to predict > mild PVR after adjustment for depth and perimeter oversizing. ROC analysis showed lower area under the curve (AUC) values for fixed threshold cutoffs (650 or 850 HU) compared to thresholds relative to LA. The LA + 100 threshold had the highest AUC (0.81), and its AUC was higher than that of all other studied protocols except the LA × 1.25 and LA + 50 protocols, where the difference approached statistical significance (p = 0.05 and 0.068, respectively). Multivariable analysis showed calcium volume determined by the LA × 1.25, LA × 1.5, LA + 50, and LA + 100 HU protocols to independently predict PVR. Calcium volume scoring thresholds that are relative to LA are more predictive of PVR post-TAVR than those using fixed cutoffs. A threshold of LA + 100 HU had the highest predictive value. Copyright © 2017 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
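
    A calcium volume score with a threshold relative to luminal attenuation reduces to counting voxels above LA plus an offset and multiplying by the voxel volume. The sketch below uses the best-performing offset reported above (LA + 100 HU) on a synthetic region of interest; the voxel size and attenuation values are assumptions, not the study's data.

        import numpy as np

        # Sketch of a relative-threshold calcium volume score: voxels brighter than
        # luminal attenuation + 100 HU are counted as calcium and converted to a volume.
        def calcium_volume_mm3(hu_volume: np.ndarray, luminal_attenuation: float,
                               voxel_volume_mm3: float, offset_hu: float = 100.0) -> float:
            calcified = hu_volume > (luminal_attenuation + offset_hu)
            return float(calcified.sum()) * voxel_volume_mm3

        # Hypothetical annular region of interest: 0.4 x 0.4 x 0.5 mm voxels, LA of 350 HU.
        rng = np.random.default_rng(3)
        roi = rng.normal(300, 120, size=(60, 60, 20))
        print(f"{calcium_volume_mm3(roi, 350.0, 0.4 * 0.4 * 0.5):.0f} mm^3")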

  18. 78 FR 7785 - Request for Comments and Information on Initiating a Risk Assessment for Establishing Food...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-0711] Request for Comments and Information on Initiating a Risk Assessment for Establishing Food Allergen Thresholds; Establishment of Docket; Extension of Comment Period AGENCY: Food and Drug Administration, HHS...

  19. Predictive value of neuron-specific enolase for prognosis in patients with moderate or severe traumatic brain injury: a systematic review and meta-analysis

    PubMed Central

    Mercier, Eric; Boutin, Amélie; Shemilt, Michèle; Lauzier, François; Zarychanski, Ryan; Fergusson, Dean A.; Moore, Lynne; McIntyre, Lauralyn A.; Archambault, Patrick; Légaré, France; Rousseau, François; Lamontagne, François; Nadeau, Linda; Turgeon, Alexis F.

    2016-01-01

    Background: Prognosis is difficult to establish early after moderate or severe traumatic brain injury despite representing an important concern for patients, families and medical teams. Biomarkers, such as neuron-specific enolase, have been proposed as potential early prognostic indicators. Our objective was to determine the association between neuron-specific enolase and clinical outcomes, and the prognostic value of neuron-specific enolase after a moderate or severe traumatic brain injury. Methods: We searched MEDLINE, Embase, The Cochrane Library and Biosis Previews, and reviewed reference lists of eligible articles to identify studies. We included cohort studies and randomized controlled trials that evaluated the prognostic value of neuron-specific enolase to predict mortality or Glasgow Outcome Scale score in patients with moderate or severe traumatic brain injury. Two reviewers independently collected data. The pooled mean differences were analyzed using random-effects models. We assessed risk of bias using a customized Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. Subgroup and sensitivity analyses were performed based on a priori hypotheses. Results: We screened 5026 citations from which 30 studies (involving 1321 participants) met our eligibility criteria. We found a significant positive association between neuron-specific enolase serum levels and mortality (10 studies, n = 474; mean difference [MD] 18.46 µg/L, 95% confidence interval [CI] 10.81 to 26.11 µg/L; I2 = 83%) and a Glasgow Outcome Scale ≤ 3 (14 studies, n = 603; MD 17.25 µg/L, 95% CI 11.42 to 23.07 µg/L; I2 = 82%). We were unable to determine a clinical threshold value using the available patient data. Interpretation: In patients with moderate or severe traumatic brain injury, increased neuron-specific enolase serum levels are associated with unfavourable outcomes. The optimal neuron-specific enolase threshold value to predict unfavourable prognosis remains unknown and clinical decision-making is currently not recommended until additional studies are made available. PMID:27975043

  20. The characteristics of vibrotactile perception threshold among shipyard workers in a tropical environment.

    PubMed

    Tamrin, Shamsul Bahri Mohd; Jamalohdin, Mohd Nazri; Ng, Yee Guan; Maeda, Setsuo; Ali, Nurul Asyiqin Mohd

    2012-01-01

    The objectives of this study were to determine the prevalence of hand-arm vibration syndrome (HAVS) and the characteristics of the vibrotactile perception threshold (VPT) among users of hand-held vibrating tools working in a tropical environment. A cross-sectional study was done among 47 shipyard workers using instruments and a questionnaire to determine HAVS-related symptoms. The vibration acceleration magnitude was determined using a Human Vibration Meter (Maestro). A P8 Pallesthesiometer (EMSON-MAT, Poland) was used to determine the VPT of the index and little fingers at frequencies of 31.5 Hz and 125 Hz. The mean reference threshold shift was determined from the reference threshold shift derived from the VPT value. The results show a moderate prevalence of HAVS (49%) among the shipyard workers. They were exposed to a similarly high vibration intensity (mean = 4.19 ± 1.94 m/s²) from the use of vibrating hand-held tools. The VPT values were found to be higher for both fingers and both frequencies (index, 31.5 Hz = 110.91 ± 7.36 dB, 125 Hz = 117.0 ± 10.25 dB; little, 31.5 Hz = 110.70 ± 6.75 dB, 125 Hz = 117.71 ± 10.25 dB) compared to the normal healthy population, with a mean threshold shift of between 9.20 and 10.61 decibels. The frequency of 31.5 Hz had a higher percentage of positive mean reference threshold shift (index finger = 93.6%, little finger = 100%) compared to 125 Hz (index finger = 85.1%, little finger = 78.7%). In conclusion, the prevalence of HAVS was lower than among those working in a cold environment; however, all workers had a higher mean VPT value compared to the normal population, and all those reported as having HAVS showed a positive mean reference threshold shift of the VPT value.

  1. Using ROC Curves to Choose Minimally Important Change Thresholds when Sensitivity and Specificity Are Valued Equally: The Forgotten Lesson of Pythagoras. Theoretical Considerations and an Example Application of Change in Health Status

    PubMed Central

    Froud, Robert; Abel, Gary

    2014-01-01

    Background Receiver Operator Characteristic (ROC) curves are being used to identify Minimally Important Change (MIC) thresholds on scales that measure a change in health status. In quasi-continuous patient reported outcome measures, such as those that measure changes in chronic diseases with variable clinical trajectories, sensitivity and specificity are often valued equally. Notwithstanding methodologists agreeing that these should be valued equally, different approaches have been taken to estimating MIC thresholds using ROC curves. Aims and objectives We aimed to compare the different approaches used with a new approach, exploring the extent to which the methods choose different thresholds, and considering the effect of differences on conclusions in responder analyses. Methods Using graphical methods, hypothetical data, and data from a large randomised controlled trial of manual therapy for low back pain, we compared two existing approaches with a new approach that is based on the addition of the sums of squares of 1-sensitivity and 1-specificity. Results There can be divergence in the thresholds chosen by different estimators. The cut-point selected by different estimators is dependent on the relationship between the cut-points in ROC space and the different contours described by the estimators. In particular, asymmetry and the number of possible cut-points affects threshold selection. Conclusion Choice of MIC estimator is important. Different methods for choosing cut-points can lead to materially different MIC thresholds and thus affect results of responder analyses and trial conclusions. An estimator based on the smallest sum of squares of 1-sensitivity and 1-specificity is preferable when sensitivity and specificity are valued equally. Unlike other methods currently in use, the cut-point chosen by the sum of squares method always and efficiently chooses the cut-point closest to the top-left corner of ROC space, regardless of the shape of the ROC curve. PMID:25474472
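
    The estimator favoured above selects the cut-point minimising (1 − sensitivity)² + (1 − specificity)², i.e. the ROC point closest to the top-left corner. The sketch below implements that rule on synthetic change scores; the data and the use of scikit-learn's roc_curve are illustrative assumptions, not the trial analysis.

        import numpy as np
        from sklearn.metrics import roc_curve

        # Choose the cut-point minimising (1 - sensitivity)^2 + (1 - specificity)^2,
        # i.e. the ROC point closest to (0, 1); fpr equals 1 - specificity.
        def mic_cutpoint(improved, change_score):
            fpr, tpr, thresholds = roc_curve(improved, change_score)
            distance_sq = (1.0 - tpr) ** 2 + fpr ** 2
            return thresholds[np.argmin(distance_sq)]

        # Synthetic example (not trial data): change scores for improved vs non-improved patients.
        rng = np.random.default_rng(4)
        change = np.r_[rng.normal(8, 5, 200), rng.normal(1, 5, 200)]
        improved = np.r_[np.ones(200), np.zeros(200)]
        print(f"MIC threshold ~ {mic_cutpoint(improved, change):.1f}")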

  2. Better Informing Decision Making with Multiple Outcomes Cost-Effectiveness Analysis under Uncertainty in Cost-Disutility Space

    PubMed Central

    McCaffrey, Nikki; Agar, Meera; Harlum, Janeane; Karnon, Jonathon; Currow, David; Eckermann, Simon

    2015-01-01

    Introduction Comparing multiple, diverse outcomes with cost-effectiveness analysis (CEA) is important, yet challenging in areas like palliative care where domains are unamenable to integration with survival. Generic multi-attribute utility values exclude important domains and non-health outcomes, while partial analyses—where outcomes are considered separately, with their joint relationship under uncertainty ignored—lead to incorrect inference regarding preferred strategies. Objective The objective of this paper is to consider whether such decision making can be better informed with alternative presentation and summary measures, extending methods previously shown to have advantages in multiple strategy comparison. Methods Multiple outcomes CEA of a home-based palliative care model (PEACH) relative to usual care is undertaken in cost-disutility (CDU) space and compared with analysis on the cost-effectiveness plane. Summary measures developed for comparing strategies across potential threshold values for multiple outcomes include: expected net loss (ENL) planes quantifying differences in expected net benefit; the ENL contour identifying preferred strategies minimising ENL and their expected value of perfect information; and cost-effectiveness acceptability planes showing the probability of strategies minimising ENL. Results Conventional analysis suggests PEACH is cost-effective when the threshold value per additional day at home exceeds $1,068, or is dominated by usual care when only the proportion of home deaths is considered. In contrast, neither alternative dominates in CDU space, where cost and outcomes are jointly considered, with the optimal strategy depending on threshold values. For example, PEACH minimises ENL when the threshold value per additional day at home and the threshold value for dying at home are both $2,000, with a 51.6% chance of PEACH being cost-effective. Conclusion Comparison in CDU space and the associated summary measures have distinct advantages over multiple-domain comparisons, aiding transparent and robust joint comparison of costs and multiple effects under uncertainty across potential threshold values for effect, better informing net benefit assessment and related reimbursement and research decisions. PMID:25751629

  3. Impact response of graphite-epoxy flat laminates using projectiles that simulate aircraft engine encounters

    NASA Technical Reports Server (NTRS)

    Preston, J. L., Jr.; Cook, T. S.

    1975-01-01

    An investigation of the response of a graphite-epoxy material to foreign object impact was made by impacting spherical projectiles of gelatin, ice, and steel normally on flat panels. The observed damage was classified as transverse (stress wave delamination and cracking), penetrative, or structural (gross failure): the minimum, or threshold, velocity to cause each class of damage was established as a function of projectile characteristics. Steel projectiles had the lowest transverse damage threshold, followed by gelatin and ice. Making use of the threshold velocities and assuming that the normal component of velocity produces the damage in nonnormal impacts, a set of impact angles and velocities was established for each projectile material which would result in damage to composite fan blades. Analysis of the operating parameters of a typical turbine fan blade shows that small steel projectiles are most likely to cause delamination and penetration damage to unprotected graphite-epoxy composite fan blades.

  4. Establishment of model of visceral pain due to colorectal distension and its behavioral assessment in rats

    PubMed Central

    Yang, Jian-Ping; Yao, Ming; Jiang, Xing-Hong; Wang, Li-Na

    2006-01-01

    AIM: To establish a visceral pain model via colorectal distension (CRD) and to evaluate the efficiency of behavioral responses to CRD by measuring the abdominal withdrawal reflex (AWR) score in rats. METHODS: Thirty-eight male SD rats weighing 180-240 g were used to establish the visceral pain model. A 7 cm long flexible latex balloon was inserted intra-anally under ether anesthesia, and colorectal distensions were produced by inflating the balloon with air 30 min after recovery from the anesthesia. Five AWR scores (AWR0 to AWR4) were used to assess the intensity of noxious visceral stimuli. The minimal distension pressure (kPa) that induced abdominal flattening was regarded as the visceral pain threshold. RESULTS: A vigorous AWR to distension of the descending colon and rectum was found in 100% of the awake rats tested. The higher the distension pressure, the higher the AWR score. Distension pressures of 0, 2.00, 3.33, 5.33 and 8.00 kPa produced different AWR scores (P < 0.05). The pain threshold of the AWR was constant for up to 80 min after the initial windup (first 1-3 distensions); the mean threshold was 3.69 ± 0.35 kPa. Systemic administration of morphine sulfate elevated the visceral pain threshold in a dose-dependent and naloxone-reversible manner. CONCLUSION: Scoring the AWR during colorectal distensions can assess the intensity of noxious visceral stimuli. Flattening of the abdomen (AWR 3) in response to CRD is a clear, constant and reliable visceral pain threshold. This pain model and its behavioral assessment are suitable for research on visceral pain and analgesics. PMID:16718770

  5. Age correction in monitoring audiometry: method to update OSHA age-correction tables to include older workers

    PubMed Central

    Dobie, Robert A; Wojcik, Nancy C

    2015-01-01

    Objectives The US Occupational Safety and Health Administration (OSHA) Noise Standard provides the option for employers to apply age corrections to employee audiograms to consider the contribution of ageing when determining whether a standard threshold shift has occurred. Current OSHA age-correction tables are based on 40-year-old data, with small samples and an upper age limit of 60 years. By comparison, recent data (1999–2006) show that hearing thresholds in the US population have improved. Because hearing thresholds have improved, and because older people are increasingly represented in noisy occupations, the OSHA tables no longer represent the current US workforce. This paper presents 2 options for updating the age-correction tables and extending values to age 75 years using recent population-based hearing survey data from the US National Health and Nutrition Examination Survey (NHANES). Both options provide scientifically derived age-correction values that can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Methods Regression analysis was used to derive new age-correction values using audiometric data from the 1999–2006 US NHANES. Using the NHANES median, better-ear thresholds fit to simple polynomial equations, new age-correction values were generated for both men and women for ages 20–75 years. Results The new age-correction values are presented as 2 options. The preferred option is to replace the current OSHA tables with the values derived from the NHANES median better-ear thresholds for ages 20–75 years. The alternative option is to retain the current OSHA age-correction values up to age 60 years and use the NHANES-based values for ages 61–75 years. Conclusions Recent NHANES data offer a simple solution to the need for updated, population-based, age-correction tables for OSHA. The options presented here provide scientifically valid and relevant age-correction values which can be easily adopted by OSHA to expand their regulatory guidance to include older workers. PMID:26169804
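
    A minimal sketch of the regression step described above, assuming a small table of median better-ear thresholds by age for one sex and one audiometric frequency; the values are placeholders rather than NHANES estimates, and numpy's polynomial fit stands in for the paper's regression analysis.

    ```python
    import numpy as np

    # Placeholder (age, median better-ear threshold in dB HL) pairs; real values
    # would come from the 1999-2006 NHANES audiometric data.
    ages = np.array([20, 30, 40, 50, 60, 70, 75])
    median_threshold_db = np.array([3, 4, 6, 10, 16, 25, 31])

    # Fit a simple polynomial (here quadratic) to the medians, as in the paper's approach.
    coeffs = np.polyfit(ages, median_threshold_db, deg=2)
    fitted = np.poly1d(coeffs)

    def age_correction(age_baseline, age_current):
        """Age-correction value: expected threshold change between two ages."""
        return fitted(age_current) - fitted(age_baseline)

    print(round(age_correction(30, 61), 1))  # expected ageing-related shift, in dB
    ```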

  6. Age correction in monitoring audiometry: method to update OSHA age-correction tables to include older workers.

    PubMed

    Dobie, Robert A; Wojcik, Nancy C

    2015-07-13

    The US Occupational Safety and Health Administration (OSHA) Noise Standard provides the option for employers to apply age corrections to employee audiograms to consider the contribution of ageing when determining whether a standard threshold shift has occurred. Current OSHA age-correction tables are based on 40-year-old data, with small samples and an upper age limit of 60 years. By comparison, recent data (1999-2006) show that hearing thresholds in the US population have improved. Because hearing thresholds have improved, and because older people are increasingly represented in noisy occupations, the OSHA tables no longer represent the current US workforce. This paper presents 2 options for updating the age-correction tables and extending values to age 75 years using recent population-based hearing survey data from the US National Health and Nutrition Examination Survey (NHANES). Both options provide scientifically derived age-correction values that can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Regression analysis was used to derive new age-correction values using audiometric data from the 1999-2006 US NHANES. Using the NHANES median, better-ear thresholds fit to simple polynomial equations, new age-correction values were generated for both men and women for ages 20-75 years. The new age-correction values are presented as 2 options. The preferred option is to replace the current OSHA tables with the values derived from the NHANES median better-ear thresholds for ages 20-75 years. The alternative option is to retain the current OSHA age-correction values up to age 60 years and use the NHANES-based values for ages 61-75 years. Recent NHANES data offer a simple solution to the need for updated, population-based, age-correction tables for OSHA. The options presented here provide scientifically valid and relevant age-correction values which can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  7. Landslide susceptibility and early warning model for shallow landslide in Taiwan

    NASA Astrophysics Data System (ADS)

    Huang, Chun-Ming; Wei, Lun-Wei; Chi, Chun-Chi; Chang, Kan-Tsun; Lee, Chyi-Tyi

    2017-04-01

    This study aims to develop a regional susceptibility model and warning thresholds, and to establish an early warning system, in order to prevent and reduce the losses caused by rainfall-induced shallow landslides in Taiwan. For practical application, Taiwan is divided into nearly 185,000 slope units. The susceptibility and warning threshold of each slope unit were analyzed as basic information for disaster prevention. The geological characteristics, failure mechanisms and occurrence times of more than 900 landslide cases were recorded through field investigation and interviews with residents in order to examine the relationship between landslides and rainfall. Logistic regression analysis was performed to evaluate landslide susceptibility, and an I3-R24 rainfall threshold model was proposed for the early warning of landslides. Validation against recent landslide cases shows that the model is suitable for the warning of regional shallow landslides, and most cases could be warned 3 to 6 hours in advance. We also propose a slope-unit area-weighted method to establish local rainfall thresholds for landslides in vulnerable villages in order to improve practical application. Validation of the local rainfall thresholds also shows good agreement with the occurrence times reported by newspapers. Finally, a web-based "Rainfall-induced Landslide Early Warning System" was built and connected to real-time radar rainfall data so that real-time landslide warning can be achieved. Keywords: landslide, susceptibility analysis, rainfall threshold
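
    The abstract names an I3-R24 threshold model (commonly, short-duration rainfall intensity plotted against 24-hour accumulated rainfall) but does not give the threshold line itself, so the sketch below assumes a simple linear boundary purely for illustration; the intercept and slope are invented and would differ per slope unit.

    ```python
    def landslide_warning(i3_mm_per_hr, r24_mm, intercept=80.0, slope=-0.25):
        """Illustrative I3-R24 check: warn when the point (R24, I3) lies on or above an
        assumed linear threshold line I3 = intercept + slope * R24."""
        threshold_i3 = intercept + slope * r24_mm
        return i3_mm_per_hr >= threshold_i3

    # Example: heavy short-term intensity on top of large accumulated rainfall.
    print(landslide_warning(i3_mm_per_hr=45.0, r24_mm=220.0))  # True for these illustrative numbers
    ```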

  8. The use of webcam images to determine tourist-climate aptitude: favourable weather types for sun and beach tourism on the Alicante coast (Spain)

    NASA Astrophysics Data System (ADS)

    Ibarra, Emilio Martínez

    2011-05-01

    Climate has an obvious influence on tourism as a resource and as a location factor for tourist activities. Consequently, the tourist phenomenon in general is heavily controlled by meteorological conditions—in short, by the climate. In this article, the author proposes a set of weather types with which to establish the climate aptitude for sun and beach tourism. To determine these types, the density of use of one of the beaches with the lowest seasonality in continental Europe, the Levante Beach in Benidorm (Alicante, Spain), was analysed. Beach attendance was monitored using a webcam installed by the "Agencia Valenciana de Turismo". The relationship between the density of use of the lower and upper beach areas on the one hand, and meteorological variables on the other, allowed comfort (physiological equivalent temperature) and enjoyment (fractions of solar radiation) thresholds to be established. The appropriate hydric comfort values were obtained by comparing the ranges proposed by Besancenot in 1989 [Besancenot (1989) Climat et tourismes. Masson, Paris] with numbers of visitors to the beach. The wind velocity and precipitation thresholds were selected following consultation with the literature and considering the climatic characteristics of the environment under analysis. Based on a combination of these thresholds, weather types suitable for this specific tourist activity are defined. Thus, this article presents a method for assessing the extent to which a day on the beach can be enjoyed. This has a number of applications, for planners, the tourism business and consumers alike. The use of this (filter) method in climate databases and meteorological forecasts could help determine the tourist season, the suitability of setting up a business associated with sun and beach tourism, as well as help plan holidays and program a day's leisure activities. Thus, the article seeks to improve our understanding of the climate preferences of that tourist activity par excellence: sun and beach tourism.

  9. The use of webcam images to determine tourist-climate aptitude: favourable weather types for sun and beach tourism on the Alicante coast (Spain).

    PubMed

    Ibarra, Emilio Martínez

    2011-05-01

    Climate has an obvious influence on tourism as a resource and as a location factor for tourist activities. Consequently, the tourist phenomenon in general is heavily controlled by meteorological conditions, in short, by the climate. In this article, the author proposes a set of weather types with which to establish the climate aptitude for sun and beach tourism. To determine these types, the density of use of one of the beaches with the lowest seasonality in continental Europe, the Levante Beach in Benidorm (Alicante, Spain), was analysed. Beach attendance was monitored using a webcam installed by the "Agencia Valenciana de Turismo". The relationship between the density of use of the lower and upper beach areas on the one hand, and meteorological variables on the other, allowed comfort (physiological equivalent temperature) and enjoyment (fractions of solar radiation) thresholds to be established. The appropriate hydric comfort values were obtained by comparing the ranges proposed by Besancenot in 1989 [Besancenot (1989) Climat et tourismes. Masson, Paris] with numbers of visitors to the beach. The wind velocity and precipitation thresholds were selected following consultation with the literature and considering the climatic characteristics of the environment under analysis. Based on a combination of these thresholds, weather types suitable for this specific tourist activity are defined. Thus, this article presents a method for assessing the extent to which a day on the beach can be enjoyed. This has a number of applications, for planners, the tourism business and consumers alike. The use of this (filter) method in climate databases and meteorological forecasts could help determine the tourist season, the suitability of setting up a business associated with sun and beach tourism, as well as help plan holidays and program a day's leisure activities. Thus, the article seeks to improve our understanding of the climate preferences of that tourist activity par excellence: sun and beach tourism.

  10. Pesticides and PCBs in sediments and fish from the Salton Sea, California, USA.

    PubMed

    Sapozhnikova, Yelena; Bawardi, Ola; Schlenk, Daniel

    2004-05-01

    The Salton Sea, the largest manmade lake in California, is officially designated by the State of California as an agricultural drainage reservoir. The purpose of this study was to determine organochlorine and organophosphorus pesticide, as well as polychlorinated biphenyl (PCB), concentrations in sediments and fish tissues in the Salton Sea and to evaluate the relative ecological risk of these compounds. Sediment samples were taken during 2000-2001 and fish tissues (Tilapia mossambica, Cynoscion xanthulus) were collected in May 2001. All samples were analyzed for 12 chlorinated pesticides, 6 organophosphorus pesticides, and 55 polychlorinated biphenyl (PCB) congeners. ΣDichlorodiphenyltrichloroethane (ΣDDT) and total PCB concentrations observed in sediments ranged from 10 to 40 and 116 to 304 ng/g dry wt, respectively. DDT/DDD ratios in sediments and fish tissues of the northern Sea in 2001 indicated recent DDT exposure. Lindane, dieldrin, dichlorodiphenyldichloroethylene (DDE) and total PCB concentrations detected in sediments exceeded probable effect levels established for freshwater ecosystems, and p,p'-DDE and total PCB concentrations were higher than effect range-median values developed for marine and estuarine sediments. In fish liver, concentrations of endrin and ΣDDT exceeded the threshold effect level established for invertebrates. ΣDDT concentrations detected in fish tissues were higher than threshold concentrations for the protection of wildlife consumers of aquatic biota. DDE concentrations in fish muscle tissues were above the 50 ng/g concentration threshold for the protection of predatory birds. Dimethoate, diazinon, malathion, chlorpyrifos and disulfoton varied from ≤0.15 to 9.5 ng/g dry wt in sediments and from ≤0.1 to 80.3 ng/g wet wt in fish tissues. Disulfoton was found in relatively high concentrations (up to 80.3 ng/g) in all organs from Tilapia and Corvina. These results demonstrate continued contamination of specific organochlorine compounds in sediments and resident fish species of the Salton Sea.

  11. An Approach to Industrial Stormwater Benchmarks: Establishing and Using Site-Specific Threshold Criteria at Lawrence Livermore National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, C G; Mathews, S

    2006-09-07

    Current regulatory schemes use generic or industrial sector-specific benchmarks to evaluate the quality of industrial stormwater discharges. While benchmarks can be a useful tool for facility stormwater managers in evaluating the quality of stormwater runoff, benchmarks typically do not take into account site-specific conditions, such as soil chemistry, atmospheric deposition, seasonal changes in water source, and upstream land use. Failing to account for these factors may lead to unnecessary costs to trace a source of natural variation, or potentially to missing a significant local water quality problem. Site-specific water quality thresholds, established upon the statistical evaluation of historic data, take these factors into account, are a better tool for the direct evaluation of runoff quality, and provide a more cost-effective trigger to investigate anomalous results. Lawrence Livermore National Laboratory (LLNL), a federal facility, established stormwater monitoring programs to comply with the requirements of the industrial stormwater permit and Department of Energy orders, which require the evaluation of the impact of effluent discharges on the environment. LLNL recognized the need to create a tool to evaluate and manage stormwater quality that would allow analysts to identify trends in stormwater quality and recognize anomalous results so that trace-back and corrective actions could be initiated. LLNL created the site-specific water quality threshold tool to better understand the nature of the stormwater influent and effluent, to establish a technical basis for determining when facility operations might be impacting the quality of stormwater discharges, and to provide "action levels" to initiate follow-up to analytical results. The threshold criteria were based on a statistical analysis of the historic stormwater monitoring data and a review of relevant water quality objectives.

  12. Transfusion-associated circulatory overload in a pediatric intensive care unit: different incidences with different diagnostic criteria.

    PubMed

    De Cloedt, Lise; Emeriaud, Guillaume; Lefebvre, Émilie; Kleiber, Niina; Robitaille, Nancy; Jarlot, Christine; Lacroix, Jacques; Gauvin, France

    2018-04-01

    The incidence of transfusion-associated circulatory overload (TACO) is not well known in children, especially in pediatric intensive care unit (PICU) patients. All consecutive patients admitted over 1 year to the PICU of CHU Sainte-Justine were included after they received their first red blood cell transfusion. TACO was diagnosed using the criteria of the International Society of Blood Transfusion, with two different ways of defining abnormal values: 1) using normal pediatric values published in the Nelson Textbook of Pediatrics and 2) using the patient as his or her own control and comparing pre- and posttransfusion values with either a 10% or a 20% difference threshold. We monitored for TACO up to 24 hours posttransfusion. A total of 136 patients were included. Using the "normal pediatric values" definition, we diagnosed 63, 88, and 104 patients with TACO at 6, 12, and 24 hours posttransfusion, respectively. Using the "10% threshold" definition, we detected 4, 15, and 27 TACO cases in the same periods, respectively; using the "20% threshold" definition, the number of TACO cases was 2, 6, and 17, respectively. Chest radiograph was the most frequent missing item, especially at 6 and 12 hours posttransfusion. Overall, the incidence of TACO varied from 1.5% to 76% depending on the definition. A more operational definition of TACO is needed in PICU patients. Using a relative-change threshold may be more appropriate, but more studies are needed to confirm the best threshold. © 2018 AABB.
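
    A minimal sketch of the "patient as own control" definitions, flagging a variable as abnormal when the post-transfusion value differs from the pre-transfusion value by more than a 10% or 20% relative threshold; the vital-sign numbers are illustrative.

    ```python
    def exceeds_relative_change(pre_value, post_value, threshold=0.10):
        """True when the post-transfusion value differs from the pre-transfusion
        value by more than the given relative threshold (e.g. 0.10 or 0.20)."""
        if pre_value == 0:
            raise ValueError("pre-transfusion value must be non-zero")
        return abs(post_value - pre_value) / abs(pre_value) > threshold

    # Example: respiratory rate rising from 24 to 28 breaths/min after transfusion.
    print(exceeds_relative_change(24, 28, threshold=0.10))  # True  (about a 16.7% increase)
    print(exceeds_relative_change(24, 28, threshold=0.20))  # False
    ```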

  13. Somatosensory temporal discrimination in essential tremor and isolated head and voice tremors.

    PubMed

    Conte, Antonella; Ferrazzano, Gina; Manzo, Nicoletta; Leodori, Giorgio; Fabbrini, Giovanni; Fasano, Alfonso; Tinazzi, Michele; Berardelli, Alfredo

    2015-05-01

    The aim of this study was to investigate the somatosensory temporal discrimination threshold in patients with essential tremor (sporadic and familial) and to evaluate whether somatosensory temporal discrimination threshold values differ depending on the body parts involved by tremor. We also investigated the somatosensory temporal discrimination in patients with isolated voice tremor. We enrolled 61 patients with tremor: 48 patients with essential tremor (31 patients with upper limb tremor alone, nine patients with head tremor alone, and eight patients with upper limb plus head tremor; 22 patients with familial vs. 26 sporadic essential tremor), 13 patients with isolated voice tremor, and 45 healthy subjects. Somatosensory temporal discrimination threshold values were normal in patients with familial essential tremor, whereas they were higher in patients with sporadic essential tremor. When we classified patients according to tremor distribution, somatosensory temporal discrimination threshold values were normal in patients with upper limb tremor and abnormal only in patients with isolated head tremor. Temporal discrimination threshold values were also abnormal in patients with isolated voice tremor. Somatosensory temporal discrimination processing is normal in patients with familial as well as in patients with sporadic essential tremor involving the upper limbs. By contrast, somatosensory temporal discrimination is altered in patients with isolated head tremor and voice tremor. This study with somatosensory temporal discrimination suggests that isolated head and voice tremors might possibly be considered as separate clinical entities from essential tremor. © 2015 International Parkinson and Movement Disorder Society.

  14. Influence of the chemical structure on odor qualities and odor thresholds of halogenated guaiacol-derived odorants

    NASA Astrophysics Data System (ADS)

    Juhlke, Florian; Lorber, Katja; Wagenstaller, Maria; Buettner, Andrea

    2017-12-01

    Chlorinated guaiacol derivatives are found in waste water of pulp mills using chlorine in the bleaching process of wood pulp. They can also be detected in fish tissue, possibly causing off-odors. To date, there is no systematic investigation on the odor properties of halogenated guaiacol derivatives. To close this gap, odor thresholds in air and odor qualities of 14 compounds were determined by gas chromatography-olfactometry. Overall, the investigated compounds elicited smells that are characteristic of guaiacol, namely smoky, sweet and vanilla-like, but also medicinal and plaster-like. Their odor thresholds in air were, however, very low, ranging from 0.00072 to 23 ng/L air. The lowest thresholds were found for 5-chloro- and 5-bromoguaiacol, followed by 4,5-dichloro- and 6-chloroguaiacol. Moreover, some inter-individual differences in odor threshold values could be observed, with the highest variations having been recorded for the individual values of 5-iodo- and 4-bromoguaiacol.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bachman, D., E-mail: bachman@ualberta.ca; Fedosejevs, R.; Tsui, Y. Y.

    An optical damage threshold for crystalline silicon from single femtosecond laser pulses was determined by detecting a permanent change in the refractive index of the material. This index change could be detected with unprecedented sensitivity by measuring the resonant wavelength shift of silicon integrated optics microring resonators irradiated with femtosecond laser pulses at 400 nm and 800 nm wavelengths. The threshold for permanent index change at 400 nm wavelength was determined to be 0.053 ± 0.007 J/cm², which agrees with previously reported threshold values for femtosecond laser modification of crystalline silicon. However, the threshold for index change at 800 nm wavelength was found to be 0.044 ± 0.005 J/cm², which is five times lower than the previously reported threshold values for visual change on the silicon surface. The discrepancy is attributed to possible modification of the crystallinity of silicon below the melting temperature that has not been detected before.

  16. A derivation of the stable cavitation threshold accounting for bubble-bubble interactions.

    PubMed

    Guédra, Matthieu; Cornu, Corentin; Inserra, Claude

    2017-09-01

    The subharmonic emission of sound coming from the nonlinear response of a bubble population is the most used indicator for stable cavitation. When driven at twice their resonance frequency, bubbles can exhibit subharmonic spherical oscillations if the acoustic pressure amplitude exceeds a threshold value. Although various theoretical derivations exist for the subharmonic emission by free or coated bubbles, they all rest on the single bubble model. In this paper, we propose an analytical expression of the subharmonic threshold for interacting bubbles in a homogeneous, monodisperse cloud. This theory predicts a shift of the subharmonic resonance frequency and a decrease of the corresponding pressure threshold due to the interactions. For a given sonication frequency, these results show that an optimal value of the interaction strength (i.e. the number density of bubbles) can be found for which the subharmonic threshold is minimum, which is consistent with recently published experiments conducted on ultrasound contrast agents. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Threshold values and management options for nutrients in a catchment of a temperate estuary with poor ecological status

    NASA Astrophysics Data System (ADS)

    Hinsby, K.; Markager, S.; Kronvang, B.; Windolf, J.; Sonnenborg, T. O.; Thorling, L.

    2012-02-01

    Intensive farming has severe impacts on the chemical status of groundwater and streams and consequently on the ecological status of dependent ecosystems. Eutrophication is a widespread problem in lakes and marine waters. Common problems are hypoxia, algal blooms and fish kills, and loss of water clarity, underwater vegetation, biodiversity, and recreational value. In this paper we evaluate the nitrogen (N) and phosphorus (P) chemistry of groundwater and surface water in a coastal catchment, the loadings and sources of N and P and their effect on the ecological status of an estuary. We calculate the necessary reductions in N and P loadings to the estuary for obtaining a good ecological status, which we define based on the number of days with N and P limitation, and the equivalent stream and groundwater threshold values assuming two different management options. The calculations are performed by the combined use of empirical models and a physically based 3-D integrated hydrological model of the whole catchment. The assessment of the ecological status indicates that the N and P loads to the investigated estuary should be reduced to 52% and 56% of current loads, respectively, to restore good ecological status. Model estimates show that threshold total N concentrations should be in the range of 2.9 to 3.1 mg l⁻¹ in inlet freshwater to Horsens Estuary and 6.0 to 9.3 mg l⁻¹ in shallow aerobic groundwater (∼27-41 mg l⁻¹ of nitrate), depending on the management measures implemented in the catchment. The situation for total P is more complex but data indicate that groundwater threshold values are not needed. The inlet freshwater threshold value for total P to Horsens Estuary for the selected management options is 0.084 mg l⁻¹. Regional climate models project increasing winter precipitation and runoff in the investigated region resulting in increasing runoff and nutrient loads to coastal waters if present land use and farming practices continue. Hence, lower threshold values are required in the future to ensure good status of all water bodies and ecosystems.

  18. Threshold values and management options for nutrients in a catchment of a temperate estuary with poor ecological status

    NASA Astrophysics Data System (ADS)

    Hinsby, K.; Markager, S.; Kronvang, B.; Windolf, J.; Sonnenborg, T. O.; Thorling, L.

    2012-08-01

    Intensive farming has severe impacts on the chemical status of groundwater and streams and consequently on the ecological status of dependent ecosystems. Eutrophication is a widespread problem in lakes and marine waters. Common problems are hypoxia, algal blooms, fish kills, and loss of water clarity, underwater vegetation, biodiversity and recreational value. In this paper we evaluate the nitrogen (N) and phosphorus (P) concentrations of groundwater and surface water in a coastal catchment, the loadings and sources of N and P, and their effect on the ecological status of an estuary. We calculate the necessary reductions in N and P loadings to the estuary for obtaining a good ecological status, which we define based on the number of days with N and P limitation, and the corresponding stream and groundwater threshold values assuming two different management options. The calculations are performed by the combined use of empirical models and a physically based 3-D integrated hydrological model of the whole catchment. The assessment of the ecological status indicates that the N and P loads to the investigated estuary should be reduced to levels corresponding to 52 and 56% of the current loads, respectively, to restore good ecological status. Model estimates show that threshold total N (TN) concentrations should be in the range of 2.9 to 3.1 mg l⁻¹ in inlet freshwater (streams) to Horsens estuary and 6.0 to 9.3 mg l⁻¹ in shallow aerobic groundwater (∼ 27-41 mg l⁻¹ of nitrate), depending on the management measures implemented in the catchment. The situation for total P (TP) is more complex, but data indicate that groundwater threshold values are not needed. The stream threshold value for TP to Horsens estuary for the selected management options is 0.084 mg l⁻¹. Regional climate models project increasing winter precipitation and runoff in the investigated region resulting in increasing runoff and nutrient loads to the Horsens estuary and many other coastal waters if present land use and farming practices continue. Hence, lower threshold values are required in many coastal catchments in the future to ensure good status of water bodies and ecosystems.

  19. Bevacizumab for Metastatic Colorectal Cancer: A Global Cost-Effectiveness Analysis.

    PubMed

    Goldstein, Daniel A; Chen, Qiushi; Ayer, Turgay; Chan, Kelvin K W; Virik, Kiran; Hammerman, Ariel; Brenner, Baruch; Flowers, Christopher R; Hall, Peter S

    2017-06-01

    In the U.S., the addition of bevacizumab to first-line chemotherapy in metastatic colorectal cancer (mCRC) has been demonstrated to provide 0.10 quality-adjusted life years (QALYs) at an incremental cost-effectiveness ratio (ICER) of $571,000/QALY. Due to variability in pricing, value for money may be different in other countries. Our objective was to establish the cost-effectiveness of bevacizumab in mCRC in the U.S., U.K., Canada, Australia, and Israel. We performed the analysis using a previously established Markov model for mCRC. Input data for efficacy, adverse events, and quality of life were considered to be generalizable and therefore identical for all countries. We used country-specific prices for medications, administration, and other health service costs. All costs were converted from local currency to U.S. dollars at the exchange rates in March 2016. We conducted one-way and probabilistic sensitivity analyses (PSA) to assess the model robustness across parameter uncertainties. Base case results demonstrated that the highest ICER was in the U.S. ($571,000/QALY) and the lowest was in Australia ($277,000/QALY). In Canada, the U.K., and Israel, ICERs ranged between $351,000 and $358,000 per QALY. PSA demonstrated 0% likelihood of bevacizumab being cost-effective in any country at a willingness to pay threshold of $150,000 per QALY. The addition of bevacizumab to first-line chemotherapy for mCRC consistently fails to be cost-effective in all five countries. There are large differences in cost-effectiveness between countries. This study provides a framework for analyzing the value of a cancer drug from the perspectives of multiple international payers. The cost-effectiveness of bevacizumab varies significantly between multiple countries. By conventional thresholds, bevacizumab is not cost-effective in metastatic colon cancer in the U.S., the U.K., Australia, Canada, and Israel. © AlphaMed Press 2017.

  20. Implications of Proposed University of Maryland System Patenting Policy Change.

    ERIC Educational Resources Information Center

    Clinch, Richard

    As a result of actual and anticipated growth in the level of entrepreneurial activities within the University of Maryland System (UMS), and corresponding growth in licensing and royalty revenues, a threshold policy was recommended in the Joint Chairmen's Report of 1996. Such a policy would establish a maximum threshold beyond which a portion of…

  1. 75 FR 5716 - Federal Acquisition Regulation; FAR Case 2008-024, Inflation Adjustment of Acquisition-Related...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ... Consumer Price Index for all urban consumers, except for Davis-Bacon Act, Service Contract Act, and trade... consumers, except for Davis-Bacon Act, Service Contract Act, and trade agreements thresholds (see FAR 1.109... acquisition-related thresholds established by the Davis-Bacon Act, the Service Contract Act, or the United...

  2. Northwest Manufacturing Initiative

    DTIC Science & Technology

    2012-03-27

    crack growth and threshold stress corrosion cracking evaluation. Threshold stress corrosion cracking was done using the rising step load method with...Group Technology methods to establish manufacturing cells for production efficiency, to develop internal Lean Champions, and to implement rapid... different levels, advisory, core, etc. VI. Core steering committee composed of members that have a significant vested interest. Action Item: Draft

  3. 76 FR 41839 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... Effectiveness of Proposed Rule Change Relating to PAR Official Fees in Volatility Index Options July 7, 2011... threshold tiers for the assessment of PAR Official Fees in Volatility Index Options classes based on the..., 2011 to establish volume threshold tiers for the assessment of PAR Official Fees in Volatility Index...

  4. 78 FR 80369 - Federal Acquisition Regulation; Service Contracts Reporting Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-31

    ...-and- materials, and labor-hour contracts and orders above the simplified acquisition threshold (SAT... thresholds established in FAR 4.1703 (e.g., above the SAT for cost-reimbursement, time-and-materials, and... reporting will be made at www.sam.gov (See section 3.10 of the SAM User Guide at https://www.sam.gov/sam/SAM...

  5. A Review of Physical Fitness as It Pertains to the Military Services

    DTIC Science & Technology

    1985-07-01

    muscle metabolic capacity. The results of this research have led to a measure which is commonly referred to as "anaerobic threshold" (40). This is an...unfortunate term in that it is really a measure of aerobic metabolic capacity, not anaerobic. Anaerobic threshold is defined as the point of exercise...the individual can exercise without lactate accumulation, the higher the anaerobic threshold value. Untrained individuals have a threshold at a work

  6. Adaptive thresholding image series from fluorescence confocal scanning laser microscope using orientation intensity profiles

    NASA Astrophysics Data System (ADS)

    Feng, Judy J.; Ip, Horace H.; Cheng, Shuk H.

    2004-05-01

    Many grey-level thresholding methods based on histograms or other statistical information about the image of interest, such as maximum entropy, have been proposed in the past. However, most methods based on statistical analysis of the images take little account of the morphology of the objects of interest, which can provide important indications for finding the optimum threshold, especially for structures with distinctive morphologies such as vasculature and neural networks in medical imaging. In this paper, we propose a novel method for thresholding fluorescent vasculature image series recorded with a confocal scanning laser microscope. After extracting the basic orientation of the vessels inside each sub-region partitioned from the images, we analyse the intensity profiles perpendicular to the vessel orientation to obtain a reasonable initial threshold for each region. The threshold values of neighbouring regions, both in the x-y plane and in the optical direction, are then referenced to obtain the final threshold for each region, which makes the whole stack of images more continuous. The resulting images are characterized by suppression of both noise and non-target tissues adhering to vessels, while vessel connectivity and edge definition are improved. The value of the method for thresholding fluorescence images of biological objects is demonstrated by comparing the results of 3D vascular reconstruction.
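
    A simplified sketch of the idea (not the authors' implementation): estimate the dominant orientation of a sub-region from its intensity gradients, sample an intensity profile across that orientation, and derive a region threshold from the profile's background and peak. The structure-tensor estimate, the 10th-percentile background and the halfway rule are all assumptions made for illustration.

    ```python
    import numpy as np
    from scipy import ndimage

    def region_threshold(region, n_samples=41):
        """Estimate a threshold for one image sub-region from an intensity profile
        taken across the dominant (vessel-like) structure."""
        gy, gx = np.gradient(region.astype(float))
        # Structure-tensor entries averaged over the region.
        jxx, jyy, jxy = (gx * gx).mean(), (gy * gy).mean(), (gx * gy).mean()
        # Direction of strongest intensity variation, i.e. perpendicular to the structure.
        theta = 0.5 * np.arctan2(2 * jxy, jxx - jyy)
        # Sample a profile through the region centre along that direction.
        cy, cx = (np.array(region.shape) - 1) / 2.0
        t = np.linspace(-min(region.shape) / 2 + 1, min(region.shape) / 2 - 1, n_samples)
        ys, xs = cy + t * np.sin(theta), cx + t * np.cos(theta)
        profile = ndimage.map_coordinates(region.astype(float), [ys, xs], order=1, mode="nearest")
        background, peak = np.percentile(profile, 10), profile.max()
        return background + 0.5 * (peak - background)  # halfway between background and peak

    # Illustrative use: a synthetic bright horizontal stripe on a noisy background.
    rng = np.random.default_rng(1)
    img = rng.normal(10, 2, (64, 64))
    img[30:34, :] += 40
    thr = region_threshold(img)
    mask = img > thr
    print(round(thr, 1), mask.sum())
    ```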

  7. Gradient-driven flux-tube simulations of ion temperature gradient turbulence close to the non-linear threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peeters, A. G.; Rath, F.; Buchholz, R.

    2016-08-15

    It is shown that Ion Temperature Gradient turbulence close to the threshold exhibits a long time behaviour, with smaller heat fluxes at later times. This reduction is connected with the slow growth of long wave length zonal flows, and consequently, the numerical dissipation on these flows must be sufficiently small. Close to the nonlinear threshold for turbulence generation, a relatively small dissipation can maintain a turbulent state with a sizeable heat flux, through the damping of the zonal flow. Lowering the dissipation causes the turbulence, for temperature gradients close to the threshold, to be subdued. The heat flux then does not go smoothly to zero when the threshold is approached from above. Rather, a finite minimum heat flux is obtained below which no fully developed turbulent state exists. The threshold value of the temperature gradient length at which this finite heat flux is obtained is up to 30% larger compared with the threshold value obtained by extrapolating the heat flux to zero, and the cyclone base case is found to be nonlinearly stable. Transport is subdued when a fully developed staircase structure in the E × B shearing rate forms. Just above the threshold, an incomplete staircase develops, and transport is mediated by avalanche structures which propagate through the marginally stable regions.

  8. An alternative method for quantifying coronary artery calcification: the multi-ethnic study of atherosclerosis (MESA).

    PubMed

    Liang, C Jason; Budoff, Matthew J; Kaufman, Joel D; Kronmal, Richard A; Brown, Elizabeth R

    2012-07-02

    Extent of atherosclerosis measured by amount of coronary artery calcium (CAC) in computed tomography (CT) has been traditionally assessed using thresholded scoring methods, such as the Agatston score (AS). These thresholded scores have value in clinical prediction, but important information might exist below the threshold, which would have important advantages for understanding genetic, environmental, and other risk factors in atherosclerosis. We developed a semi-automated threshold-free scoring method, the spatially weighted calcium score (SWCS) for CAC in the Multi-Ethnic Study of Atherosclerosis (MESA). Chest CT scans were obtained from 6814 participants in the Multi-Ethnic Study of Atherosclerosis (MESA). The SWCS and the AS were calculated for each of the scans. Cox proportional hazards models and linear regression models were used to evaluate the associations of the scores with CHD events and CHD risk factors. CHD risk factors were summarized using a linear predictor. Among all participants and participants with AS > 0, the SWCS and AS both showed similar strongly significant associations with CHD events (hazard ratios, 1.23 and 1.19 per doubling of SWCS and AS; 95% CI, 1.16 to 1.30 and 1.14 to 1.26) and CHD risk factors (slopes, 0.178 and 0.164; 95% CI, 0.162 to 0.195 and 0.149 to 0.179). Even among participants with AS = 0, an increase in the SWCS was still significantly associated with established CHD risk factors (slope, 0.181; 95% CI, 0.138 to 0.224). The SWCS appeared to be predictive of CHD events even in participants with AS = 0, though those events were rare as expected. The SWCS provides a valid, continuous measure of CAC suitable for quantifying the extent of atherosclerosis without a threshold, which will be useful for examining novel genetic and environmental risk factors for atherosclerosis.
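
    For contrast with the threshold-free idea, the sketch below computes a conventional Agatston-style thresholded score (130 HU threshold, density weights 1-4) next to a simple threshold-free weighted sum. The SWCS itself is defined in the paper and is not reproduced here, so the threshold-free variant and the synthetic slice are illustrative only.

    ```python
    import numpy as np
    from scipy import ndimage

    def agatston_style_score(hu_slice, pixel_area_mm2, threshold_hu=130):
        """Thresholded score: per lesion, area (mm^2) times a density weight based on
        the lesion's peak attenuation (1: 130-199, 2: 200-299, 3: 300-399, 4: >=400 HU)."""
        mask = hu_slice >= threshold_hu
        labels, n = ndimage.label(mask)
        score = 0.0
        for lesion in range(1, n + 1):
            lesion_mask = labels == lesion
            peak = hu_slice[lesion_mask].max()
            weight = min(4, int(peak // 100))  # 130-199 -> 1, 200-299 -> 2, ...
            score += lesion_mask.sum() * pixel_area_mm2 * weight
        return score

    def thresholdfree_weighted_score(hu_slice, pixel_area_mm2):
        """Illustrative threshold-free alternative: a weighted sum of attenuation above
        a soft reference level, retaining information below the 130 HU cut-off."""
        weights = np.clip((hu_slice - 100.0) / 100.0, 0.0, None)
        return float(weights.sum() * pixel_area_mm2)

    hu = np.zeros((64, 64)); hu[10:14, 10:14] = 250; hu[40:42, 40:45] = 120
    print(agatston_style_score(hu, pixel_area_mm2=0.25))          # only the >=130 HU lesion counts
    print(thresholdfree_weighted_score(hu, pixel_area_mm2=0.25))  # sub-threshold focus also contributes
    ```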

  9. The Electromyographic Threshold in Girls and Women.

    PubMed

    Long, Devon; Dotan, Raffy; Pitt, Brynlynn; McKinlay, Brandon; O'Brien, Thomas D; Tokuno, Craig; Falk, Bareket

    2017-02-01

    The electromyographic threshold (EMGTh) is thought to reflect increased high-threshold/type-II motor-unit (MU) recruitment and has been shown to be higher in boys than in men. Women differ from men in muscular function, so the aim was to establish whether females' EMGTh, and the girl-woman difference, differ from those of males. Nineteen women (22.9 ± 3.3 yrs) and 20 girls (10.3 ± 1.1 yrs) had surface EMG recorded from the right and left vastus lateralis muscles during ramped cycle-ergometry to exhaustion. EMG root-mean-square values were averaged per pedal revolution. EMGTh was determined as the division of the data into two regression lines giving the least residual sum of squares, provided the trace rose ≥ 3 SD above its regression line. EMGTh was expressed as a percentage of final power output (%Pmax) and of power at VO2pk (%PVO2pk). EMGTh was detected in 13 (68%) of the women, but only 9 (45%) of the girls (p < .005), and tended to be higher in the girls (%Pmax = 88.6 ± 7.0 vs. 83.0 ± 6.9%, p = .080; %PVO2pk = 101.6 ± 17.6 vs. 90.6 ± 7.8%, p = .063). When EMGTh was undetected it was assumed to occur at 100% Pmax or beyond. Consequently, EMGTh values were significantly higher in girls than in women (94.8 ± 7.4 vs. 88.4 ± 9.9 %Pmax, p = .026; and 103.2 ± 11.7 vs. 95.2 ± 9.9 %PVO2pk, p = .028). During progressive exercise, girls appear to rely less on higher-threshold/type-II MUs than do women, suggesting a differential muscle activation strategy.
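
    A minimal sketch of the breakpoint search described above: every candidate division of the EMG-power data into two segments is fitted with two regression lines, and the division with the least combined residual sum of squares is kept; the ramp-test numbers are simulated, not study data.

    ```python
    import numpy as np

    def emg_threshold(power, emg_rms, min_points=5):
        """Return the candidate breakpoint (power value) whose two-segment linear fit
        gives the smallest combined residual sum of squares."""
        power = np.asarray(power, float); emg_rms = np.asarray(emg_rms, float)

        def sse(x, y):
            slope, intercept = np.polyfit(x, y, 1)
            resid = y - (slope * x + intercept)
            return float(resid @ resid)

        best_power, best_sse = None, np.inf
        for k in range(min_points, len(power) - min_points):
            total = sse(power[:k], emg_rms[:k]) + sse(power[k:], emg_rms[k:])
            if total < best_sse:
                best_power, best_sse = power[k], total
        return best_power

    # Illustrative ramp test: EMG rises slowly, then faster above the simulated inflection.
    power = np.arange(20, 201, 5)
    emg = 0.2 * power + np.where(power > 160, 1.5 * (power - 160), 0.0)
    print(emg_threshold(power, emg))  # near the simulated inflection at 160 W
    ```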

  10. Cost–effectiveness thresholds: pros and cons

    PubMed Central

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  11. Quantitative Analysis of Swallowing Function Between Dysphagia Patients and Healthy Subjects Using High-Resolution Manometry

    PubMed Central

    2017-01-01

    Objective To compare swallowing function between healthy subjects and patients with pharyngeal dysphagia using high resolution manometry (HRM) and to evaluate the usefulness of HRM for detecting pharyngeal dysphagia. Methods Seventy-five patients with dysphagia and 28 healthy subjects were included in this study. Diagnosis of dysphagia was confirmed by a videofluoroscopy. HRM was performed to measure pressure and timing information at the velopharynx (VP), tongue base (TB), and upper esophageal sphincter (UES). HRM parameters were compared between dysphagia and healthy groups. Optimal threshold values of significant HRM parameters for dysphagia were determined. Results VP maximal pressure, TB maximal pressure, UES relaxation duration, and UES resting pressure were lower in the dysphagia group than those in healthy group. UES minimal pressure was higher in dysphagia group than in the healthy group. Receiver operating characteristic (ROC) analyses were conducted to validate optimal threshold values for significant HRM parameters to identify patients with pharyngeal dysphagia. With maximal VP pressure at a threshold value of 144.0 mmHg, dysphagia was identified with 96.4% sensitivity and 74.7% specificity. With maximal TB pressure at a threshold value of 158.0 mmHg, dysphagia was identified with 96.4% sensitivity and 77.3% specificity. At a threshold value of 2.0 mmHg for UES minimal pressure, dysphagia was diagnosed at 74.7% sensitivity and 60.7% specificity. Lastly, UES relaxation duration of <0.58 seconds had 85.7% sensitivity and 65.3% specificity, and UES resting pressure of <75.0 mmHg had 89.3% sensitivity and 90.7% specificity for identifying dysphagia. Conclusion We present evidence that HRM could be a useful evaluation tool for detecting pharyngeal dysphagia. PMID:29201816

  12. Quantitative Analysis of Swallowing Function Between Dysphagia Patients and Healthy Subjects Using High-Resolution Manometry.

    PubMed

    Park, Chul-Hyun; Kim, Don-Kyu; Lee, Yong-Taek; Yi, Youbin; Lee, Jung-Sang; Kim, Kunwoo; Park, Jung Ho; Yoon, Kyung Jae

    2017-10-01

    To compare swallowing function between healthy subjects and patients with pharyngeal dysphagia using high resolution manometry (HRM) and to evaluate the usefulness of HRM for detecting pharyngeal dysphagia. Seventy-five patients with dysphagia and 28 healthy subjects were included in this study. Diagnosis of dysphagia was confirmed by a videofluoroscopy. HRM was performed to measure pressure and timing information at the velopharynx (VP), tongue base (TB), and upper esophageal sphincter (UES). HRM parameters were compared between dysphagia and healthy groups. Optimal threshold values of significant HRM parameters for dysphagia were determined. VP maximal pressure, TB maximal pressure, UES relaxation duration, and UES resting pressure were lower in the dysphagia group than those in healthy group. UES minimal pressure was higher in dysphagia group than in the healthy group. Receiver operating characteristic (ROC) analyses were conducted to validate optimal threshold values for significant HRM parameters to identify patients with pharyngeal dysphagia. With maximal VP pressure at a threshold value of 144.0 mmHg, dysphagia was identified with 96.4% sensitivity and 74.7% specificity. With maximal TB pressure at a threshold value of 158.0 mmHg, dysphagia was identified with 96.4% sensitivity and 77.3% specificity. At a threshold value of 2.0 mmHg for UES minimal pressure, dysphagia was diagnosed at 74.7% sensitivity and 60.7% specificity. Lastly, UES relaxation duration of <0.58 seconds had 85.7% sensitivity and 65.3% specificity, and UES resting pressure of <75.0 mmHg had 89.3% sensitivity and 90.7% specificity for identifying dysphagia. We present evidence that HRM could be a useful evaluation tool for detecting pharyngeal dysphagia.

  13. Style-by-style analysis of two sporadic self-compatible Solanum chacoense lines supports a primary role for S-RNases in determining pollen rejection thresholds.

    PubMed

    Qin, Xike; Liu, Bolin; Soulard, Jonathan; Morse, David; Cappadocia, Mario

    2006-01-01

    A method for the quantification of S-RNase levels in single styles of self-incompatible Solanum chacoense was developed and applied toward an experimental determination of the S-RNase threshold required for pollen rejection. It was found that, when single style values are averaged, accumulated levels of the S(11)- and S(12)-RNases can differ by up to 10-fold within a genotype, while accumulated levels of the S(12)-RNase can differ by over 3-fold when different genotypes are compared. Surprisingly, the amount of S(12)-RNase accumulated in different styles of the same plant can differ by over 20-fold. The lowest level measured in individual styles of fully incompatible plants was 160 ng S-RNase, while the highest level measured in a sporadic self-compatible (SSC) line during a bout of complete compatibility was 68 ng, suggesting that these values bracket the threshold level of S-RNase needed for pollen rejection. Remarkably, correlations of S-RNase values with average fruit set in plant lines displaying SSC to different extents, as well as with fruit set in immature flowers, are all consistent with a threshold value of 80 ng S(12)-RNase. Taken together, these results suggest that S-RNase levels alone are the principal determinant of the incompatibility phenotype. Interestingly, while the S-RNase threshold required for rejection of S(12)-pollen from a given genetic background is the same in styles of different genetic backgrounds, it is different when pollen donors of different genetic backgrounds are used. These results reveal a previously unsuspected level of complexity in the incompatibility reaction.

  14. MRI assessment of relapsed glioblastoma during treatment with bevacizumab: volumetric measurement of enhanced and FLAIR lesions for evaluation of response and progression--a pilot study.

    PubMed

    Pichler, Josef; Pachinger, Corinna; Pelz, Manuela; Kleiser, Raimund

    2013-05-01

    To develop a magnetic resonance imaging (MRI) metric that is useful for therapy monitoring in patients with relapsed glioblastoma (GBM) during treatment with the antiangiogenic monoclonal antibody bevacizumab (Bev). We evaluated the feasibility of tumour volume measurement with our software tool in clinical routine and tried to establish reproducible and quantitative parameters for surveillance of patients on treatment with antiangiogenic drugs. In this retrospective institutional pilot study, 18 patients (11 men, 7 women; mean age 53.5) with recurrent GBM received bevacizumab and irinotecan every two weeks as second-line therapy. Follow-up scans were assessed every two to four months. Data were collected on a 1.5 T MR system (Siemens Symphony) with the standard head coil using our standardized tumour protocol. Volumetric measurement was performed with commercially available software (stroketool) on FLAIR and contrast-enhanced T1 (T1-c) imaging with the following procedure: pre-processing involved cutting noise and applying a 3 × 3 Gaussian filter to smooth the images; a region of interest (ROI) was selected in a healthy brain area of the contralateral side and its intensity value quantified; and 20% was added to this value to define the threshold level, so that only values above this threshold, corresponding to the tumour lesion, were retained. For the volumetric measurement, the detected tumour area was outlined in all slices, the areas were summed, and the sum was multiplied by the slice thickness to obtain the whole volume. With the Macdonald criteria, progression was indicated in 14 out of 18 patients. In contrast, volumetric measurement showed an increase of contrast enhancement of >25%, defined as the threshold for progression, in 11 patients (78%) and an increase in FLAIR volume in 12 patients (85%). In 6 patients the volumes on MRI increased earlier than at the last scan, which had originally been defined as the date of progression with the Macdonald criteria, changing PFS after re-evaluation of the tumour volumes from 6.8 to 5.6 months. In this pilot study, the applied imaging method objectively estimates tumour response and progression compared with the bi-dimensional measurement. The quantitative parameters are reproducible and also applicable to diffusely infiltrating lesions. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
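
    A minimal sketch of the volumetric procedure outlined above, assuming a stack of 2-D slices, a reference ROI in healthy contralateral tissue, a threshold of the ROI intensity plus 20%, and summation of supra-threshold areas multiplied by slice thickness; the Gaussian smoothing width, array sizes and intensities are illustrative, not the study's settings.

    ```python
    import numpy as np
    from scipy import ndimage

    def lesion_volume(slices, roi_mask, voxel_area_mm2, slice_thickness_mm, margin=0.20):
        """Volume (mm^3) of supra-threshold tissue across a stack of 2-D slices.
        The threshold is the mean intensity in a healthy reference ROI plus 20%."""
        total_area_mm2 = 0.0
        for img in slices:
            smoothed = ndimage.gaussian_filter(img.astype(float), sigma=1.0)  # simple smoothing step
            reference = smoothed[roi_mask].mean()
            threshold = reference * (1.0 + margin)
            total_area_mm2 += (smoothed > threshold).sum() * voxel_area_mm2
        return total_area_mm2 * slice_thickness_mm

    # Illustrative stack: two slices with a bright lesion, reference ROI in the other hemisphere.
    rng = np.random.default_rng(2)
    stack = [rng.normal(100, 5, (128, 128)) for _ in range(2)]
    for s in stack:
        s[60:70, 90:100] += 60          # synthetic enhancing lesion
    roi = np.zeros((128, 128), bool); roi[60:70, 20:30] = True
    print(round(lesion_volume(stack, roi, voxel_area_mm2=1.0, slice_thickness_mm=5.0)))
    ```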

  15. Delineating riparian zones for entire river networks using geomorphological criteria

    NASA Astrophysics Data System (ADS)

    Fernández, D.; Barquín, J.; Álvarez-Cabria, M.; Peñas, F. J.

    2012-03-01

    Riparian zone delineation is a central issue for riparian and river ecosystem management; however, the criteria used to delineate riparian zones are still under debate. The area inundated by a 50-yr flood has been indicated as an optimal hydrological descriptor for riparian areas. This detailed hydrological information is, however, not usually available for entire river corridors, and is only available for populated areas at risk of flooding. One of the requirements for catchment planning is to establish the most appropriate location of zones to conserve or restore riparian buffer strips for whole river networks. This issue could be solved by using geomorphological criteria extracted from Digital Elevation Models. In this work we have explored the adjustment of surfaces developed under two different geomorphological criteria with respect to the flooded area covered by the 50-yr flood, in an attempt to rapidly delineate hydrologically-meaningful riparian zones for entire river networks. The first geomorphological criterion is based on the surface that intersects valley walls at a given number of bankfull depths above the channel (BFDAC), while the second is based on the surface defined by a threshold value indicating the relative cost of moving from the stream up to the valley, accounting for slope and elevation change (path distance). As the relationship between local geomorphology and the 50-yr flood has been suggested to be river-type dependent, we have performed our analyses distinguishing between three river types corresponding with three valley morphologies: open, shallow vee and deep vee valleys (in increasing degree of valley confinement). Adjustment between the surfaces derived from geomorphological and hydrological criteria has been evaluated using two different methods: one based on exceeding areas (minimum exceeding score) and the other on the similarity among total area values. Both methods have pointed out the same surfaces when looking for those that best match the 50-yr flood. Results have shown that the BFDAC approach obtains a slightly better adjustment than the path-distance approach. However, BFDAC requires regional regressions for bankfull depth along the considered river network. Results have also confirmed that unconstrained valleys require lower threshold values than constrained valleys when deriving surfaces using geomorphological criteria. Moreover, this study provides: (i) guidance on the selection of the proper geomorphological criterion and associated threshold values, and (ii) an easy calibration framework to evaluate the adjustment with respect to hydrologically-meaningful surfaces.

  16. Ovarian stimulation length, number of follicles higher than 17 mm and estradiol on the day of human chorionic gonadotropin administration are risk factors for multiple pregnancy in intrauterine insemination

    PubMed Central

    MELO, MARCO A.B.; SIMÓN, CARLOS; REMOHÍ, JOSÉ; PELLICER, ANTONIO; MESEGUER, MARCOS

    2007-01-01

    Aim:  The aim of the present study was to identify the risk factors, their prognostic value on multiple pregnancies (MP) prediction and their thresholds in women undergoing controlled ovarian hyperstimulation (COH) with follicle stimulating hormone (FSH) and intrauterine insemination (IUI). Methods:  A case‐control study was carried out by identifying in our database all the pregnancies reached by donor and conjugal IUI (DIUI and CIUI, respectively), and compared cycle features, patients’ characteristics and sperm analysis results between women achieving single pregnancy (SP) versus MP. The number of gestational sacs, follicular sizes and estradiol levels on the human chorionic gonadotropin (hCG) administration day, COH length and semen parameters were obtained from each cycle and compared. Student's t‐tests for mean comparisons, receiver–operator curve (ROC) analysis to determine the predictive value of each parameter on MP achievement and multiple regression analysis to determine single parameter influence were carried out. Results:  Women with MP in IUI stimulated cycles reached the adequate size of the dominant follicle (17 mm) significantly earlier than those achieving SP. Also, the mean follicles number, and estradiol levels on the hCG day were higher in the CIUI and DIUI MP group. Nevertheless, only ROC curve analysis revealed good prognostic value for estradiol and follicles higher than 17 mm. Multiple regression analysis confirmed these results. No feature of the basic sperm analysis, either in the ejaculate or in the prepared sample, was different or predictive of MP. When using donor sperm, different thresholds of follicle number, stimulation length and estradiol in the prediction of MP were noted, in comparison with CIUI. Conclusions:  MP in stimulated IUI cycles are closely associated to stimulation length, number of developed follicles higher than 17 mm on the day of hCG administration and estradiol levels. Also, estradiol has a good predictive value over MP in IUI stimulated cycles. The establishment of clinical thresholds will certainly help in the management of these couples to avoid undesired multiple pregnancies by canceling cycles or converting them into in vitro fertilization procedures. (Reprod Med Biol 2007; 6: 19–26) PMID:29699262

  17. Optical ranked-order filtering using threshold decomposition

    DOEpatents

    Allebach, Jan P.; Ochoa, Ellen; Sweeney, Donald W.

    1990-01-01

    A hybrid optical/electronic system performs median filtering and related ranked-order operations using threshold decomposition to encode the image. Threshold decomposition transforms the nonlinear neighborhood ranking operation into a linear space-invariant filtering step followed by a point-to-point threshold comparison step. Spatial multiplexing allows parallel processing of all the threshold components as well as recombination by a second linear, space-invariant filtering step. An incoherent optical correlation system performs the linear filtering, using a magneto-optic spatial light modulator as the input device and a computer-generated hologram in the filter plane. Thresholding is done electronically. By adjusting the value of the threshold, the same architecture is used to perform median, minimum, and maximum filtering of images. A totally optical system is also disclosed.
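
    A minimal sketch of the threshold-decomposition idea, implemented in software rather than optics: each binary slice of an integer image is filtered by a linear, space-invariant (local mean) step and re-thresholded point-wise, and the slices are summed. The function name and the use of scipy are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def median_by_threshold_decomposition(img, size=3, levels=None):
    """Median-filter a non-negative integer image via threshold decomposition.
    Each binary slice is filtered linearly (local mean) and re-thresholded at
    0.5 (majority vote), then the slices are summed; the stacking property
    makes the result equal to a direct median filter for odd window counts."""
    if levels is None:
        levels = int(img.max())
    out = np.zeros_like(img, dtype=np.int64)
    for k in range(1, levels + 1):
        binary = (img >= k).astype(float)            # threshold decomposition
        local_mean = uniform_filter(binary, size)    # linear, space-invariant step
        out += (local_mean > 0.5).astype(np.int64)   # point-wise threshold comparison
    return out
```

    Changing the point-wise comparison (any pixel set in the window versus all pixels set) turns the same pipeline into a maximum or minimum filter, mirroring the flexibility described in the abstract.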

  18. A Universal Threshold for the Assessment of Load and Output Residuals of Strain-Gage Balance Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2017-01-01

    A new universal residual threshold for the detection of load and gage output residual outliers of wind tunnel strain-gage balance data was developed. The threshold works with both the Iterative and Non-Iterative Methods that are used in the aerospace testing community to analyze and process balance data. It also supports all known load and gage output formats that are traditionally used to describe balance data. The threshold's definition is based on an empirical electrical constant. First, the constant is used to construct a threshold for the assessment of gage output residuals. Then, the related threshold for the assessment of load residuals is obtained by multiplying the empirical electrical constant with the sum of the absolute values of all first partial derivatives of a given load component. The empirical constant equals 2.5 microV/V for the assessment of balance calibration or check load data residuals. A value of 0.5 microV/V is recommended for the evaluation of repeat point residuals because, by design, the calculation of these residuals removes errors that are associated with the regression analysis of the data itself. Data from a calibration of a six-component force balance is used to illustrate the application of the new threshold definitions to real-world balance calibration data.
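
    The two threshold definitions quoted above reduce to a one-line formula each; a hedged sketch (function and argument names are placeholders) might look like:

```python
def output_residual_threshold(constant_uV_per_V=2.5):
    """Threshold for gage-output residuals, in microV/V.
    The abstract suggests 2.5 microV/V for calibration or check-load data
    and 0.5 microV/V for repeat-point residuals."""
    return constant_uV_per_V

def load_residual_threshold(partial_derivatives, constant_uV_per_V=2.5):
    """Threshold for the residuals of one load component: the empirical
    constant times the sum of the absolute values of all first partial
    derivatives of that load with respect to the gage outputs."""
    return constant_uV_per_V * sum(abs(d) for d in partial_derivatives)
```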

  19. Research on energy stock market associated network structure based on financial indicators

    NASA Astrophysics Data System (ADS)

    Xi, Xian; An, Haizhong

    2018-01-01

    A financial market is a complex system consisting of many interacting units. In general, due to the various types of information exchange within the industry, there is a relationship between stocks that can reveal clear structural characteristics. Complex network methods are powerful tools for studying the internal structure and function of the stock market, which allows us to better understand it. Applying complex network methodology, a stock associated network model based on financial indicators is created. Accordingly, we set a threshold value and use modularity to detect the community structure of the network, and we analyze the network structure and community cluster characteristics under different threshold settings. The study finds that the threshold value of 0.7 is the abrupt change point of the network. At the same time, as the threshold value increases, the independence of the communities strengthens. This study provides a method for researching the stock market based on financial indicators, exploring the structural similarity of stocks' financial indicators. It also provides guidance for investment and corporate financial management.
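
    A minimal sketch of the thresholded-network construction and community analysis described above, using networkx. The similarity matrix and ticker list are hypothetical inputs, and greedy modularity maximisation stands in for whichever community-detection routine the authors used.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

def threshold_network(similarity, labels, threshold=0.7):
    """Build an undirected stock network: connect two stocks when the
    similarity of their financial indicators exceeds the threshold."""
    g = nx.Graph()
    g.add_nodes_from(labels)
    n = len(labels)
    for i in range(n):
        for j in range(i + 1, n):
            if similarity[i, j] >= threshold:
                g.add_edge(labels[i], labels[j], weight=similarity[i, j])
    return g

# Hypothetical usage: similarity from indicator correlations, then communities
# similarity = np.corrcoef(indicator_matrix)
# g = threshold_network(similarity, tickers, threshold=0.7)
# parts = community.greedy_modularity_communities(g)
# q = community.modularity(g, parts)
```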

  20. Development of water quality thresholds during dredging for the protection of benthic primary producer habitats.

    PubMed

    Sofonia, Jeremy J; Unsworth, Richard K F

    2010-01-01

    Given the potential for adverse effects of ocean dredging on marine organisms, particularly benthic primary producer communities, the management and monitoring of those activities which cause elevated turbidity and sediment loading is critical. In practice, however, this has proven challenging, as the development of water quality threshold values, upon which management responses are based, is subject to a large number of physical and biological parameters that are spatially and temporally specific. As a consequence, monitoring programs to date have taken a wide range of different approaches, most focusing on measures of turbidity reported as nephelometric turbidity units (NTU). This paper presents a potential approach to the determination of water quality thresholds which utilises data gathered through the long-term deployment of in situ water instruments, but suggests a focus on photosynthetically active radiation (PAR) rather than NTU, as it is more relevant biologically and inclusive of other site conditions. A simple mathematical approach to data interpretation is also presented which expresses thresholds not as individual concentration values over specific intervals, but as an equation that may be utilized in numerical modelling.

  1. Rich Sliding Motion and Dynamics in a Filippov Plant-Disease System

    NASA Astrophysics Data System (ADS)

    Chen, Can; Chen, Xi

    In order to reduce the spread of plant diseases and maintain the number of infected trees below an economic threshold, we choose the number of infected trees and the number of susceptible plants as the control indices that determine whether to implement control strategies. A Filippov plant-disease model incorporating cutting off infected branches and replanting susceptible trees is then proposed. Based on the theory of Filippov systems, the sliding mode dynamics and conditions for the existence of all the possible equilibria and Lotka-Volterra cycles are presented. We find that model solutions ultimately approach the positive equilibrium that lies in the region above the infected threshold value TI, or the periodic trajectories that lie in the region below TI, or the pseudo-attractor ET = (TS,TI), as we vary the susceptible and infected threshold values. This indicates that the plant-disease transmission is tolerable if the trajectories approach ET = (TS,TI) or if the periodic trajectories lie in the region below TI. Hence an acceptable level of the number of infected trees can be achieved when the susceptible and infected threshold values are chosen appropriately.

  2. Application of a Threshold Method to the TRMM Radar for the Estimation of Space-Time Rain Rate Statistics

    NASA Technical Reports Server (NTRS)

    Meneghini, Robert; Jones, Jeffrey A.

    1997-01-01

    One of the TRMM radar products of interest is the monthly-averaged rain rates over 5 x 5 degree cells. Clearly, the most direct way of calculating these and similar statistics is to compute them from the individual estimates made over the instantaneous field of view of the instrument (4.3 km horizontal resolution). An alternative approach is the use of a threshold method. It has been established that over sufficiently large regions the fractional area above a rain rate threshold and the area-average rain rate are well correlated for particular choices of the threshold [e.g., Kedem et al., 1990]. A straightforward application of this method to the TRMM data would consist of the conversion of the individual reflectivity factors to rain rates followed by a calculation of the fraction of these that exceed a particular threshold. Previous results indicate that for thresholds near or at 5 mm/h, the correlation between this fractional area and the area-average rain rate is high. There are several drawbacks to this approach, however. At the TRMM radar frequency of 13.8 GHz the signal suffers attenuation, so the negative bias of the high resolution rain rate estimates will increase as the path attenuation increases. To establish a quantitative relationship between fractional area and area-average rain rate, an independent means of calculating the area-average rain rate is needed, such as an array of rain gauges. This type of calibration procedure, however, is difficult for a spaceborne radar such as TRMM. To estimate a statistic other than the mean of the distribution requires, in general, a different choice of threshold and a different set of tuning parameters.
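
    The threshold-method workflow sketched in the abstract, reduced to code: compute the fractional area above a rain-rate threshold for each cell, then calibrate a relation against independently known area averages. The array names and the linear form of the fit are assumptions.

```python
import numpy as np

def fractional_area(rain_rates, threshold=5.0):
    """Fraction of the fields of view within a grid cell whose rain rate
    (mm/h) exceeds the chosen threshold."""
    rain_rates = np.asarray(rain_rates)
    return np.mean(rain_rates > threshold)

def calibrate(frac_areas, area_avg_rain):
    """Fit a linear relation <R> ~ a * F(threshold) + b, given independently
    derived area-average rain rates (e.g. from a rain-gauge array)."""
    a, b = np.polyfit(frac_areas, area_avg_rain, 1)
    return a, b
```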

  3. Regional rainfall thresholds for landslide occurrence using a centenary database

    NASA Astrophysics Data System (ADS)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

    This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes the rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using the receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and potential regression perform well in ROC metrics. However, the intermediate thresholds based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold are much more informative as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.

  4. A lumped mucosal wave model of the vocal folds revisited: Recent extensions and oscillation hysteresis

    PubMed Central

    Lucero, Jorge C.; Koenig, Laura L.; Lourenço, Kelem G.; Ruty, Nicolas; Pelorson, Xavier

    2011-01-01

    This paper examines an updated version of a lumped mucosal wave model of the vocal fold oscillation during phonation. Threshold values of the subglottal pressure and the mean (DC) glottal airflow for the oscillation onset are determined. Depending on the nonlinear characteristics of the model, an oscillation hysteresis phenomenon may occur, with different values for the oscillation onset and offset threshold. The threshold values depend on the oscillation frequency, but the occurrence of the hysteresis is independent of it. The results are tested against pressure data collected from a mechanical replica of the vocal folds, and oral airflow data collected from speakers producing intervocalic ∕h∕. In the human speech data, observed differences between voice onset and offset may be attributed to variations in voice pitch, with a very small or inexistent hysteresis phenomenon. PMID:21428520

  5. The Dubna-Mainz-Taipei Dynamical Model for πN Scattering and π Electromagnetic Production

    NASA Astrophysics Data System (ADS)

    Yang, Shin Nan

    Some of the featured results of the Dubna-Mainz-Taipei (DMT) dynamical model for πN scattering and π0 electromagnetic production are summarized. These include results for threshold π0 production, deformation of Δ(1232), and the extracted properties of higher resonances below 2 GeV. The excellent agreement of the DMT model's predictions with threshold π0 production data, including the recent precision measurements from MAMI, establishes the DMT model results as a benchmark for experimentalists and theorists dealing with threshold pion production.

  6. Digital music exposure reliably induces temporary threshold shift in normal-hearing human subjects.

    PubMed

    Le Prell, Colleen G; Dell, Shawna; Hensley, Brittany; Hall, James W; Campbell, Kathleen C M; Antonelli, Patrick J; Green, Glenn E; Miller, James M; Guire, Kenneth

    2012-01-01

    One of the challenges for evaluating new otoprotective agents for potential benefit in human populations is the availability of an established clinical paradigm with real-world relevance. These studies were explicitly designed to develop a real-world digital music exposure that reliably induces temporary threshold shift (TTS) in normal-hearing human subjects. Thirty-three subjects participated in studies that measured effects of digital music player use on hearing. Subjects selected either rock or pop music, which was then presented at 93 to 95 (n = 10), 98 to 100 (n = 11), or 100 to 102 (n = 12) dBA in-ear exposure level for a period of 4 hr. Audiograms and distortion product otoacoustic emissions (DPOAEs) were measured before and after music exposure. Postmusic tests were initiated 15 min, 1 hr 15 min, 2 hr 15 min, and 3 hr 15 min after the exposure ended. Additional tests were conducted the following day and 1 week later. Changes in thresholds after the lowest-level exposure were difficult to distinguish from test-retest variability; however, TTS was reliably detected after higher levels of sound exposure. Changes in audiometric thresholds had a "notch" configuration, with the largest changes observed at 4 kHz (mean = 6.3 ± 3.9 dB; range = 0-14 dB). Recovery was largely complete within the first 4 hr postexposure, and all subjects showed complete recovery of both thresholds and DPOAE measures when tested 1 week postexposure. These data provide insight into the variability of TTS induced by music-player use in a healthy, normal-hearing, young adult population, with music playlist, level, and duration carefully controlled. These data confirm the likelihood of temporary changes in auditory function after digital music-player use. Such data are essential for the development of a human clinical trial protocol that provides a highly powered design for evaluating novel therapeutics in human clinical trials. Care must be taken to fully inform potential subjects in future TTS studies, including protective agent evaluations, that some noise exposures have resulted in neural degeneration in animal models, even when both audiometric thresholds and DPOAE levels returned to pre-exposure values.

  7. Method and system for controlling a rotational speed of a rotor of a turbogenerator

    DOEpatents

    Stahlhut, Ronnie Dean; Vuk, Carl Thomas

    2008-12-30

    A system and method controls a rotational speed of a rotor or shaft of a turbogenerator in accordance with a present voltage level on a direct current bus. A lower threshold and a higher threshold are established for a speed of a rotor or shaft of a turbogenerator. A speed sensor determines speed data or a speed signal for the rotor or shaft associated with a turbogenerator. A voltage regulator adjusts a voltage level associated with a direct current bus within a target voltage range if the speed data or speed signal indicates that the speed is above the higher threshold or below the lower threshold.
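
    A hedged sketch of the two-threshold control idea in the abstract (not the patented controller): when the measured rotor speed leaves the band between the lower and higher thresholds, the DC-bus voltage set-point is nudged within its target range. The sign convention, that loading the generator more heavily slows the rotor, is an assumption; all names are placeholders.

```python
def regulate_bus_voltage(speed_rpm, bus_voltage,
                         lower_rpm, higher_rpm,
                         target_low_v, target_high_v, step_v=1.0):
    """Adjust the DC-bus voltage set-point when the measured rotor speed
    falls outside the established speed thresholds (control idea only)."""
    if speed_rpm > higher_rpm:
        # Rotor above the higher threshold: raise the bus voltage
        # (assumed to load the generator more and slow the rotor).
        return min(bus_voltage + step_v, target_high_v)
    if speed_rpm < lower_rpm:
        # Rotor below the lower threshold: lower the bus voltage
        # (assumed to unload the generator and let the rotor speed up).
        return max(bus_voltage - step_v, target_low_v)
    return bus_voltage  # within thresholds: leave the set-point unchanged
```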

  8. The Search for Significance: A Few Peculiarities in the Distribution of P Values in Experimental Psychology Literature.

    PubMed

    Krawczyk, Michał

    2015-01-01

    In this project I investigate the use and possible misuse of p values in papers published in five (high-ranked) journals in experimental psychology. I use a data set of over 135,000 p values from more than five thousand papers. I inspect (1) the way in which the p values are reported and (2) their distribution. The main findings are as follows: first, it appears that some authors choose the mode of reporting their results in an arbitrary way. Moreover, they often end up doing it in such a way that makes their findings seem more statistically significant than they really are (which is well known to improve the chances for publication). Specifically, they frequently report p values "just above" significance thresholds directly, whereas other values are reported by means of inequalities (e.g. "p<.1"); they round p values down more eagerly than up; and they appear to choose between significance thresholds, and between one- and two-sided tests, only after seeing the data. Further, about 9.2% of reported p values are inconsistent with their underlying statistics (e.g. F or t), and there appear to be "too many" "just significant" values. One interpretation of this is that researchers tend to choose the model, or include/discard observations, to bring the p value to the right side of the threshold.
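
    One of the consistency checks described, recomputing the p value implied by a reported test statistic, can be sketched with scipy. The tolerance used to flag a mismatch is an arbitrary placeholder; real checks would account for rounding of the reported values.

```python
from scipy import stats

def p_from_statistic(kind, value, df1, df2=None, two_sided=True):
    """Recompute the p value implied by a reported t or F statistic."""
    if kind == "t":
        p = stats.t.sf(abs(value), df1)
        return 2 * p if two_sided else p
    if kind == "F":
        return stats.f.sf(value, df1, df2)
    raise ValueError("unsupported statistic")

def is_consistent(reported_p, kind, value, df1, df2=None, tol=0.005):
    """Flag a reported p value that disagrees with its own test statistic."""
    return abs(p_from_statistic(kind, value, df1, df2) - reported_p) <= tol
```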

  9. Establishing nonlinearity thresholds with ultraintense X-ray pulses

    NASA Astrophysics Data System (ADS)

    Szlachetko, Jakub; Hoszowska, Joanna; Dousse, Jean-Claude; Nachtegaal, Maarten; Błachucki, Wojciech; Kayser, Yves; Sà, Jacinto; Messerschmidt, Marc; Boutet, Sebastien; Williams, Garth J.; David, Christian; Smolentsev, Grigory; van Bokhoven, Jeroen A.; Patterson, Bruce D.; Penfold, Thomas J.; Knopp, Gregor; Pajek, Marek; Abela, Rafael; Milne, Christopher J.

    2016-09-01

    X-ray techniques have evolved over decades to become highly refined tools for a broad range of investigations. Importantly, these approaches rely on X-ray measurements that depend linearly on the number of incident X-ray photons. The advent of X-ray free electron lasers (XFELs) is opening the ability to reach extremely high photon numbers within ultrashort X-ray pulse durations and is leading to a paradigm shift in our ability to explore nonlinear X-ray signals. However, the enormous increase in X-ray peak power is a double-edged sword with new and exciting methods being developed but at the same time well-established techniques proving unreliable. Consequently, accurate knowledge about the threshold for nonlinear X-ray signals is essential. Herein we report an X-ray spectroscopic study that reveals important details on the thresholds for nonlinear X-ray interactions. By varying both the incident X-ray intensity and photon energy, we establish the regimes at which the simplest nonlinear process, two-photon X-ray absorption (TPA), can be observed. From these measurements we can extract the probability of this process as a function of photon energy and confirm both the nature and sub-femtosecond lifetime of the virtual intermediate electronic state.

  10. Dental ceramics: a CIEDE2000 acceptability thresholds for lightness, chroma and hue differences.

    PubMed

    Perez, María Del Mar; Ghinea, Razvan; Herrera, Luis Javier; Ionescu, Ana Maria; Pomares, Héctor; Pulgar, Rosa; Paravina, Rade D

    2011-12-01

    To determine the visual 50:50% acceptability thresholds for lightness, chroma and hue for dental ceramics using the CIEDE2000(K(L):K(C):K(H)) formula, and to evaluate the formula performance using different parametric factors. A 30-observer panel evaluated three subsets of ceramic samples: lightness subset (|ΔL'/ΔE(00)| ≥ 0.9), chroma subset (|ΔC'/ΔE(00)| ≥ 0.9) and hue subset (|ΔH'/ΔE(00)| ≥ 0.9). A Takagi-Sugeno-Kang Fuzzy Approximation was used as fitting procedure, and the 50:50% acceptability thresholds were calculated. A t-test was used in statistical analysis of the threshold values. The performance of the CIEDE2000(1:1:1) and CIEDE2000(2:1:1) colour difference formulas against visual results was tested using the PF/3 performance factor. The 50:50% CIEDE2000 acceptability thresholds were ΔL' = 2.92 (95% CI 1.22-4.96; r(2) = 0.76), ΔC' = 2.52 (95% CI 1.31-4.19; r(2) = 0.71) and ΔH' = 1.90 (95% CI 1.63-2.15; r(2) = 0.88). The 50:50% acceptability threshold for colour difference (ΔE') for CIEDE2000(1:1:1) was 1.87, whilst the corresponding value for CIEDE2000(2:1:1) was 1.78. The PF/3 values were 139.86 for CIEDE2000(1:1:1), and 132.31 for CIEDE2000(2:1:1). There was a statistically significant difference amongst the CIEDE2000 50:50% acceptability thresholds for lightness, chroma and hue differences for dental ceramics. The CIEDE2000(2:1:1) formula performed better than CIEDE2000(1:1:1). Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Alphasatellitidae: a new family with two subfamilies for the classification of geminivirus- and nanovirus-associated alphasatellites.

    PubMed

    Briddon, Rob W; Martin, Darren P; Roumagnac, Philippe; Navas-Castillo, Jesús; Fiallo-Olivé, Elvira; Moriones, Enrique; Lett, Jean-Michel; Zerbini, F Murilo; Varsani, Arvind

    2018-05-09

    Nanoviruses and geminiviruses are circular, single stranded DNA viruses that infect many plant species around the world. Nanoviruses and certain geminiviruses that belong to the Begomovirus and Mastrevirus genera are associated with additional circular, single stranded DNA molecules (~ 1-1.4 kb) that encode a replication-associated protein (Rep). These Rep-encoding satellite molecules are commonly referred to as alphasatellites, and here we communicate the establishment of the family Alphasatellitidae to which these have been assigned. Within the family Alphasatellitidae two subfamilies, Geminialphasatellitinae and Nanoalphasatellitinae, have been established to accommodate the geminivirus- and nanovirus-associated alphasatellites, respectively. Whereas the pairwise nucleotide sequence identity distribution of all the known geminialphasatellites (n = 628) displayed troughs at ~ 70% and 88% pairwise identity, that of the known nanoalphasatellites (n = 54) had troughs at ~ 67% and ~ 80% pairwise identity. We use these pairwise identity values as thresholds, together with phylogenetic analyses, to establish four genera and 43 species of geminialphasatellites and seven genera and 19 species of nanoalphasatellites. Furthermore, a divergent alphasatellite associated with coconut foliar decay disease is assigned to a species but not a subfamily, as it likely represents a new alphasatellite subfamily that could be established once other closely related molecules are discovered.

  12. Systematic analysis of esophageal pressure topography in high-resolution manometry of 68 normal volunteers.

    PubMed

    Niebisch, S; Wilshire, C L; Peters, J H

    2013-01-01

    The introduction of high-resolution manometry (HRM) has been a significant advance in esophageal diagnostics. Normative values, however, are currently based upon a single set of published reference values, and multiple new metrics have been added over the past several years. Our goal was to provide a second set of 'normal values' and to include all current metrics suggested by the 2012 Chicago classification. Sixty-eight subjects without foregut symptoms or previous surgery (median age 25.5 years, ranging from 20-58 years, 53% female) underwent esophageal motility assessment via an established standardized protocol. Normative thresholds were calculated for esophago-gastric junction (EGJ) characteristics (resting, relaxation, intrabolus pressure, and lengths) as well as for esophageal body strength (contraction amplitudes at multiple levels, distal contractile integral, integrity of peristalsis) and wave propagation (contractile front velocity, distal latency). Overall, our findings were strikingly similar to the previously described metrics derived from 75 control subjects of the Northwestern group. This suggests a high degree of reproducibility of HRM. © 2013 Wiley Periodicals, Inc. and the International Society for Diseases of the Esophagus.

  13. Two-threshold model for scaling laws of noninteracting snow avalanches

    USGS Publications Warehouse

    Faillettaz, J.; Louchet, F.; Grasso, J.-R.

    2004-01-01

    A two-threshold model was proposed for scaling laws of noninteracting snow avalanches. It was found that the sizes of the largest avalanches just preceding the lattice system were power-law distributed. The proposed model reproduced the range of power-law exponents observed for land, rock or snow avalanches by tuning the maximum value of the ratio of the two failure thresholds. A two-threshold 2D cellular automaton was introduced to study the scaling for gravity-driven systems.

  14. The Utility of Selection for Military and Civilian Jobs

    DTIC Science & Technology

    1989-07-01

    parsimonious use of information; the relative ease in making threshold (break-even) judgments compared to estimating actual SDy values higher than a... threshold value, even though judges are unlikely to agree on the exact point estimate for the SDy parameter; and greater understanding of how even small...ability, spatial ability, introversion, anxiety) considered to vary or differ across individuals. A construct (sometimes called a latent variable) is not

  15. An entropy decision approach in flash flood warning: rainfall thresholds definition

    NASA Astrophysics Data System (ADS)

    Montesarchio, V.; Napolitano, F.; Ridolfi, E.

    2009-09-01

    Flash flood events are floods characterised by a very rapid response of the basins to storms, often involving loss of life and damage to public and private property. Due to the specific space-time scale of this kind of flood, generally only a short lead time is available for triggering civil protection measures. Threshold values specify the precipitation amount for a given duration that generates a critical discharge in a given cross section. The exceedance of these values could produce a critical situation in river sites exposed to alluvial risk, so it is possible to compare the observed or forecasted precipitation directly with the critical reference values, without running real-time forecasting systems online. This study is focused on the Mignone River basin, located in Central Italy. The critical rainfall threshold values are evaluated by minimising a utility function based on the informative entropy concept. The study concludes with a system performance analysis, in terms of correctly issued warnings, false alarms and missed alarms.
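
    The closing performance analysis, counting correctly issued warnings, false alarms and missed alarms for a candidate threshold, can be sketched as below. The event lists are hypothetical, and the POD/FAR summary scores are standard verification measures rather than the entropy-based utility function used in the study.

```python
def warning_performance(rainfall, critical, flood_occurred):
    """Count correctly issued warnings, false alarms and missed alarms for
    one critical rainfall threshold value."""
    hits = false_alarms = misses = 0
    for rain, flood in zip(rainfall, flood_occurred):
        warned = rain >= critical
        if warned and flood:
            hits += 1
        elif warned and not flood:
            false_alarms += 1
        elif flood:
            misses += 1
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    return {"hits": hits, "false_alarms": false_alarms,
            "misses": misses, "POD": pod, "FAR": far}
```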

  16. A comparison of the performance of threshold criteria for binary classification in terms of predicted prevalence and Kappa

    Treesearch

    Elizabeth A. Freeman; Gretchen G. Moisen

    2008-01-01

    Modelling techniques used in binary classification problems often result in a predicted probability surface, which is then translated into a presence - absence classification map. However, this translation requires a (possibly subjective) choice of threshold above which the variable of interest is predicted to be present. The selection of this threshold value can have...
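
    A sketch of the kind of comparison the paper describes: translate a predicted probability surface into presence-absence at each candidate threshold and record the predicted prevalence and Kappa. The data arrays are hypothetical and the threshold grid is arbitrary.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def compare_thresholds(probabilities, presence,
                       thresholds=np.arange(0.05, 1.0, 0.05)):
    """For each candidate threshold, convert the predicted probability surface
    into presence/absence and report predicted prevalence and Kappa."""
    observed_prev = float(np.mean(presence))
    results = []
    for t in thresholds:
        predicted = (np.asarray(probabilities) >= t).astype(int)
        results.append({
            "threshold": float(t),
            "predicted_prevalence": float(predicted.mean()),
            "prevalence_error": float(abs(predicted.mean() - observed_prev)),
            "kappa": cohen_kappa_score(presence, predicted),
        })
    return results
```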

  17. Threshold-adaptive canny operator based on cross-zero points

    NASA Astrophysics Data System (ADS)

    Liu, Boqi; Zhang, Xiuhua; Hong, Hanyu

    2018-03-01

    Canny edge detection[1] is a technique to extract useful structural information from different vision objects while dramatically reducing the amount of data to be processed. It has been widely applied in various computer vision systems. Two thresholds have to be set before edges can be segregated from the background. Usually, two static values are chosen as the thresholds based on the developers' experience[2]. In this paper, a novel automatic thresholding method is proposed. The relation between the thresholds and cross-zero points is analyzed, and an interpolation function is deduced to determine the thresholds. Comprehensive experimental results demonstrate the effectiveness of the proposed method and its advantage for stable edge detection under changing illumination.
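
    The paper derives its thresholds from cross-zero points; as a simpler stand-in that illustrates automatic thresholding of the Canny operator, the sketch below uses a widely cited median-based heuristic with OpenCV. It is not the authors' method, and the sigma value is an arbitrary default.

```python
import cv2
import numpy as np

def auto_canny(gray, sigma=0.33):
    """Canny edge detection with the two thresholds derived from the image
    median, a common heuristic replacement for hand-tuned static thresholds
    (not the cross-zero-point method of the paper)."""
    median = float(np.median(gray))
    low = int(max(0, (1.0 - sigma) * median))
    high = int(min(255, (1.0 + sigma) * median))
    return cv2.Canny(gray, low, high)

# Hypothetical usage on a grayscale frame:
# edges = auto_canny(cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE))
```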

  18. [Improved euler algorithm for trend forecast model and its application to oil spectrum analysis].

    PubMed

    Zheng, Chang-song; Ma, Biao

    2009-04-01

    The oil atomic spectrometric analysis technology is one of the most important methods for fault diagnosis and state monitoring of large machine equipment, and the gray method is well suited to trend forecasting. Using oil atomic spectrometric analysis results together with gray forecast theory, the present paper established a gray forecast model of the Fe/Cu concentration trend in a power-shift steering transmission. To address a shortcoming of the gray method in trend forecasting, an improved Euler algorithm was put forward for the first time to resolve the problem that the forecast value of the old gray model depends on the first test value, which limits its precision. The new method makes the forecast value more precise, as shown in the example. Combined with the threshold value of the oil atomic spectrometric analysis, the new method was applied to the Fe/Cu concentration forecast and early warning of fault information was obtained, so that steps can be taken to prevent the fault. The algorithm can be extended to state monitoring in industry.

  19. Comprehensive analysis of sperm DNA fragmentation by five different assays: TUNEL assay, SCSA, SCD test and alkaline and neutral Comet assay.

    PubMed

    Ribas-Maynou, J; García-Peiró, A; Fernández-Encinas, A; Abad, C; Amengual, M J; Prada, E; Navarro, J; Benet, J

    2013-09-01

    Sperm DNA fragmentation (SDF) is becoming an important test to assess male infertility. Several different tests are available, but no consensus has yet been reached as to which tests are most predictive of infertility. Few publications have reported a comprehensive analysis comparing these methods within the same population. The objective of this study was to analyze the differences between the five most common methodologies, to study their correlations and to establish their cut-off values, sensitivity and specificity in predicting male infertility. We found differences in SDF between fertile donors and infertile patients with the TUNEL, SCSA, SCD and alkaline Comet assays, but none with the neutral Comet assay. The alkaline Comet assay was the best in predicting male infertility, followed by TUNEL, SCD and SCSA, whereas the neutral Comet assay had no predictive power. For our patient population, threshold values for infertility were 20.05% for the TUNEL assay, 18.90% for SCSA, 22.75% for the SCD test, 45.37% for the alkaline Comet and 34.37% for the neutral Comet. This comprehensive study establishes that all techniques except the neutral Comet are useful to distinguish fertile from infertile men. © 2013 American Society of Andrology and European Academy of Andrology.

  20. Structure-based thresholds of toxicological concern-guidance for application to substances present at low levels in the diet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renwick, A.G.

    Health-based guidance values, such as the ADI, use chemical-specific data to determine the highest intake that would be without significant adverse health effects. A threshold of toxicological concern (TTC) is a level of intake predicted to be without adverse effects based on the toxicity of structurally related compounds. The main advantage of the use of TTCs is that the risk of low exposures can be evaluated without the need for chemical-specific animal toxicity data. TTCs have been used for many years for screening the safety of packaging migrants by the FDA in the USA, and of flavoring substances by the JECFA. A recent reassessment of the use of TTCs, organized by ILSI Europe, has developed a decision tree which allows a systematic approach to the evaluation of low levels of diverse chemicals in food. The decision tree incorporates a series of increasing TTC values into a step-wise approach. Potentially genotoxic carcinogens are considered first, based on the presence of known structural alerts. Aflatoxin-like, azoxy- and nitroso-compounds are removed from consideration because they are the most potent, and a practical TTC could not be established. Other compounds with structural alerts for genotoxicity are allocated a TTC of 0.15 μg/person per day. Compounds without structural alerts for genotoxicity are evaluated based on chemical structure and intake using a series of TTC values derived by the application of a 100-fold uncertainty factor to the 5th percentile of the distribution of NOAELs from chronic studies on compounds sharing similar structural characteristics.
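
    A minimal sketch of the step-wise decision tree described above. The 0.15 μg/person per day figure comes from the abstract; the class-based tier values for compounds without structural alerts (1800, 540 and 90 μg/person per day for Cramer classes I-III) are commonly cited figures supplied here as assumptions, since the abstract does not list them.

```python
def ttc_decision(intake_ug_per_day, structural_alert_genotoxic=False,
                 high_potency_group=False, cramer_class=None):
    """Step-wise TTC screen for a substance present at low levels in the diet.

    high_potency_group: aflatoxin-like, azoxy- or nitroso-compounds, for which
    no practical TTC can be set.
    cramer_class: "I", "II" or "III"; the tier values below are commonly cited
    figures and are an assumption, not taken from the abstract."""
    if high_potency_group:
        return "no TTC applicable - compound-specific data required"
    if structural_alert_genotoxic:
        limit = 0.15  # micrograms/person per day (from the abstract)
    else:
        tiers = {"I": 1800.0, "II": 540.0, "III": 90.0}  # assumed tier values
        if cramer_class not in tiers:
            return "assign a structural (Cramer) class before screening"
        limit = tiers[cramer_class]
    return "below TTC" if intake_ug_per_day <= limit else "requires further evaluation"
```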
