76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...This direct final rule amends the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose of raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).
Data Assimilation Experiments using Quality Controlled AIRS Version 5 Temperature Soundings
NASA Technical Reports Server (NTRS)
Susskind, Joel
2008-01-01
The AIRS Science Team Version 5 retrieval algorithm has been finalized and is now operational at the Goddard DAAC in the processing (and reprocessing) of all AIRS data. Version 5 contains accurate case-by-case error estimates for most derived products, which are also used for quality control. We have conducted forecast impact experiments assimilating AIRS quality-controlled temperature profiles using the NASA GEOS-5 data assimilation system, consisting of the NCEP GSI analysis coupled with the NASA FVGCM. Assimilation of quality-controlled temperature profiles resulted in significantly improved forecast skill in both the Northern Hemisphere and Southern Hemisphere Extra-Tropics, compared to that obtained from analyses in which all data used operationally by NCEP, except AIRS data, were assimilated. Experiments using different quality control thresholds for assimilation of AIRS temperature retrievals showed that a medium quality control threshold performed better than a tighter threshold, which provided better overall sounding accuracy, or a looser threshold, which provided better spatial coverage of accepted soundings. We are conducting more experiments to further optimize this balance of spatial coverage and sounding accuracy from the data assimilation perspective. In all cases, temperature soundings were assimilated well below cloud level in partially cloudy cases. The positive impact of assimilating AIRS-derived atmospheric temperatures all but vanished when only AIRS stratospheric temperatures were assimilated. Forecast skill resulting from assimilation of AIRS radiances uncontaminated by clouds, instead of AIRS temperature soundings, was only slightly better than that resulting from assimilation of only stratospheric AIRS temperatures. This reduction in forecast skill is most likely the result of significant loss of tropospheric information when only AIRS radiances unaffected by clouds are used in the data assimilation process.
Kang, Deqiang; Hua, Haiqin; Peng, Nan; Zhao, Jing; Wang, Zhiqun
2017-04-01
We aim to improve the image quality of coronary computed tomography angiography (CCTA) by using a personalized, weight- and height-dependent scan trigger threshold. This study was divided into two parts. First, we performed and analyzed 100 scheduled CCTA examinations, which were acquired using a body mass index-dependent Smart Prep sequence (trigger threshold ranging from 80 HU to 250 HU based on body mass index). By identifying the cases with high-quality images, a linear regression equation was established to determine the correlation among the Smart Prep threshold, height, and body weight. Furthermore, a quick-reference table was generated for the weight- and height-dependent Smart Prep threshold in CCTA scanning. Second, to evaluate the effectiveness of the new individualized threshold method, an additional 100 consecutive patients were divided into two groups: an individualized group (n = 50) with the weight- and height-dependent threshold and a control group (n = 50) with the conventional constant threshold of 150 HU. Image quality was compared between the two groups by measuring the enhancement in the coronary arteries, aorta, left and right ventricles, and inferior vena cava. Image quality scores based on visual inspection were also compared between the two groups. The regression equation relating Smart Prep threshold (K, HU), height (H, cm), and body weight (BW, kg) was K = 0.811 × H + 1.917 × BW - 99.341. When compared to the control group, the individualized group presented an average overall increase of 12.30% in enhancement in the left main coronary artery, 12.94% in the proximal right coronary artery, and 10.6% in the aorta. Correspondingly, the contrast-to-noise ratios increased by 26.03%, 27.08%, and 23.17%, respectively, and the contrast between the aorta and left ventricle increased by 633.1%. Meanwhile, the individualized group showed an average overall decrease of 22.7% in enhancement of the right ventricle and 32.7% in the inferior vena cava. There was no significant difference in image noise between the two groups (P > .05). By visual inspection, the image quality score of the individualized group was higher than that of the control group. Using a personalized, weight- and height-dependent Smart Prep threshold to adjust the scan trigger time can significantly improve the image quality of CCTA. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
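As an illustration of the reported regression, the short Python sketch below computes a patient-specific Smart Prep trigger threshold from height and weight. It is a minimal sketch, not the authors' software; clamping the result to the 80-250 HU range is an assumption carried over from the body mass index-dependent protocol in the first part of the study.

```python
def smart_prep_threshold(height_cm: float, weight_kg: float) -> float:
    """Patient-specific scan trigger threshold (HU) from the reported regression
    K = 0.811 * H + 1.917 * BW - 99.341. Clamping to 80-250 HU is an assumption,
    not a rule stated for the individualized protocol."""
    k = 0.811 * height_cm + 1.917 * weight_kg - 99.341
    return max(80.0, min(250.0, k))

# Example: a 170 cm, 70 kg patient triggers at roughly 173 HU
print(round(smart_prep_threshold(170, 70), 1))
```

For a 170 cm, 70 kg patient the equation gives 0.811 × 170 + 1.917 × 70 - 99.341 ≈ 172.7 HU, which would replace the fixed 150 HU trigger used in the control group.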
Liu, Zhao; Zheng, Chaorong; Wu, Yue
2017-09-01
Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes. However, the accuracy of their observations is limited by various noise sources and disturbances and therefore needs to be further improved. In this paper, data measured under strong wind conditions using a 1290-MHz boundary layer profiler (BLP) are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated during precipitation is also evaluated. It is found that, to ensure high accuracy and a high rate of usable data, the optimal range of the consensus subsets is 4 m/s. Although the number of data points rejected by the combined algorithm of vertical shear examination and small median test is quite limited, the algorithm proves quite useful for recognizing outliers with large discrepancies, and the optimal wind shear threshold T3 is recommended to be 5 m/s per 100 m. During patchy precipitation, the quality of the data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
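The composite QC procedure itself is not reproduced in the abstract, but the vertical shear check it refers to can be illustrated with a minimal Python sketch: gates whose shear against the neighboring gate exceeds the recommended 5 m/s per 100 m are flagged. The simple two-point difference and the data layout are assumptions; the full procedure also involves consensus averaging and a small median test.

```python
import numpy as np

def shear_flags(speeds_ms, heights_m, threshold_per_100m=5.0):
    """Flag profiler gates whose vertical wind shear exceeds the threshold.

    speeds_ms, heights_m: 1-D sequences of wind speed (m/s) and gate height (m).
    threshold_per_100m: maximum allowed shear, 5 m/s per 100 m as recommended above.
    """
    speeds = np.asarray(speeds_ms, dtype=float)
    heights = np.asarray(heights_m, dtype=float)
    shear = np.abs(np.diff(speeds)) / (np.diff(heights) / 100.0)  # m/s per 100 m
    flags = np.zeros(speeds.size, dtype=bool)
    flags[1:] = shear > threshold_per_100m  # flag the upper gate of each offending pair
    return flags

# A suspicious jump between the 400 m and 500 m gates is flagged
print(shear_flags([8, 9, 16, 10], [300, 400, 500, 600]))  # [False False  True  True]
```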
Mindlis, I; Morales-Raveendran, E; Goodman, E; Xu, K; Vila-Castelar, C; Keller, K; Crawford, G; James, S; Katz, C L; Crowley, L E; de la Hoz, R E; Markowitz, S; Wisnivesky, J P
2017-09-01
Using data from a cohort of World Trade Center (WTC) rescue and recovery workers with asthma, we assessed whether meeting criteria for post-traumatic stress disorder (PTSD), sub-threshold PTSD, or specific PTSD symptom dimensions is associated with increased asthma morbidity. Participants underwent a Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) to assess the presence of PTSD during in-person interviews between December 2013 and April 2015. We defined sub-threshold PTSD as meeting criteria for two of three symptom dimensions: re-experiencing, avoidance, or hyper-arousal. Asthma control, acute asthma-related healthcare utilization, and asthma-related quality of life data were collected using validated scales. Unadjusted and multiple regression analyses were performed to assess the relationship of sub-threshold PTSD and PTSD symptom domains with asthma morbidity measures. Of the 181 WTC workers with asthma recruited into the study, 28% had PTSD and 25% had sub-threshold PTSD. Patients with PTSD showed worse asthma control, higher rates of inpatient healthcare utilization, and poorer asthma quality of life than those with sub-threshold or no PTSD. After adjusting for potential confounders, among patients not meeting the criteria for full PTSD, those presenting symptoms of re-experiencing exhibited poorer quality of life (p = 0.003). Avoidance was associated with increased acute healthcare use (p = 0.05). Sub-threshold PTSD was not associated with asthma morbidity (p > 0.05 for all comparisons). There may be benefit in assessing asthma control in patients with sub-threshold PTSD symptoms as well as those with full PTSD to more effectively identify ongoing asthma symptoms and target management strategies.
Watershed safety and quality control by safety threshold method
NASA Astrophysics Data System (ADS)
Da-Wei Tsai, David; Mengjung Chou, Caroline; Ramaraj, Rameshprabu; Liu, Wen-Cheng; Honglay Chen, Paris
2014-05-01
Taiwan has been identified by the IPCC and the World Bank as one of the countries most at risk from natural disasters. On such an exceptional and perilous island, we launched strategic research on land-use management for catastrophe prevention and environmental protection. This study used watershed management by the "safety threshold method" to restore watersheds and to prevent disasters and pollution on the island. For flood prevention, this study applied the restoration strategy to reduce total runoff, which was in equilibrium with 59.4% of the infiltration each year. For sediment management, safety threshold management could reduce sediment loads below the equilibrium of the natural sediment cycle. For water quality, the best strategies achieved significant total load reductions of 10% in carbon (BOD5), 15% in nitrogen (nitrate), and 9% in phosphorus (TP). We found that the water quality could meet the BOD target with a 50% peak reduction under management. All the simulations demonstrated that the safety threshold method was helpful for keeping loadings within the safe range for disasters and environmental quality. Moreover, historical data for the whole island indicate that past deforestation policy and misguided economic projects were the prime culprits. Consequently, this study presents a practical method for managing both disasters and pollution at the watershed scale through land-use management.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Goldrath, Dara A.; Wright, Michael T.; Belitz, Kenneth
2010-01-01
Groundwater quality in the 188-square-mile Colorado River study unit (COLOR) was investigated October through December 2007 as part of the Priority Basin Project of the California State Water Resources Control Board (SWRCB) Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and the U.S. Geological Survey (USGS) is the technical project lead. The Colorado River study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within COLOR, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 28 wells in three study areas in San Bernardino, Riverside, and Imperial Counties. Twenty wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study unit; these wells are termed 'grid wells'. Eight additional wells were selected to evaluate specific water-quality issues in the study area; these wells are termed 'understanding wells'. The groundwater samples were analyzed for organic constituents (volatile organic compounds [VOC], gasoline oxygenates and degradates, pesticides and pesticide degradates, pharmaceutical compounds), constituents of special interest (perchlorate, 1,4-dioxane, and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), and radioactive constituents. Concentrations of naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, approximately 220 constituents and water-quality indicators were investigated. Quality-control samples (blanks, replicates, and matrix spikes) were collected at approximately 30 percent of the wells, and the results were used to evaluate the quality of the data obtained from the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a significant source of bias in the data. Differences between replicate samples were within acceptable ranges, and matrix-spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw groundwater typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared to regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH) and to thresholds established for aesthetic concerns by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and do not indicate compliance or noncompliance with those thresholds. The concentrations of most constituents detected in groundwater samples were below drinking-water thresholds.
Volatile organic compounds (VOC) were detected in approximately 35 percent of grid well samples; all concentrations were below health-based thresholds. Pesticides and pesticide degradates were detected in about 20 percent of all samples; detections were below health-based thresholds. No concentrations of constituents of special interest or nutrients were detected above health-based thresholds. Most of the major and minor ion constituents sampled do not have health-based thresholds; the exception is chloride. Concentrations of chloride, sulfate, and total dis
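The threshold comparisons described above can be illustrated with a minimal Python sketch that expresses each detected concentration as a fraction of a health-based benchmark. The benchmark values and constituent names below are placeholders for illustration only; they are not the regulatory thresholds used in the GAMA report, and such comparisons provide context only, not compliance determinations.

```python
# Placeholder benchmarks in micrograms per liter; illustrative only.
BENCHMARKS_UG_PER_L = {"arsenic": 10.0, "perchlorate": 6.0, "chloride": 250000.0}

def benchmark_fractions(sample_ug_per_l):
    """Return concentration / benchmark for each constituent that has a benchmark."""
    return {name: conc / BENCHMARKS_UG_PER_L[name]
            for name, conc in sample_ug_per_l.items()
            if name in BENCHMARKS_UG_PER_L}

# Fractions greater than 1 would indicate a concentration above the benchmark
print(benchmark_fractions({"arsenic": 2.5, "perchlorate": 1.2}))
```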
Jensen-Dahm, Christina; Madsen, Caspar Skau; Waldemar, Gunhild; Ballegaard, Martin; Hejl, Anne-Mette; Johnsen, Birger; Jensen, Troels Staehelin
2016-04-01
Clinical studies have found that patients with Alzheimer's disease report pain of less intensity and with a lower affective response, which has been thought to be due to altered pain processing. The authors wished to examine the cerebral processing of non-painful and painful stimuli using somatosensory evoked potentials and contact heat evoked potentials in patients with Alzheimer's disease and in healthy elderly controls. In this case-control study, 20 outpatients with mild to moderate Alzheimer's disease and 17 age- and gender-matched healthy controls were included. Contact heat evoked potentials and somatosensory evoked potentials were recorded in all subjects. Furthermore, warmth detection threshold and heat pain threshold were assessed. Patients and controls also rated the quality and intensity of the stimuli. The authors found no difference in contact heat evoked potential amplitude (P = 0.59) or latency of the N2 or P2 wave (P = 0.62 and P = 0.75, respectively) between patients and controls. In addition, there was no difference in regard to pain intensity scores or pain quality. The patients and controls had similar warmth detection thresholds and heat pain thresholds. Somatosensory evoked potential amplitudes and latencies were within normal range and similar for the two groups. The findings suggest that the processing of non-painful and painful stimuli is preserved in patients with mild to moderate Alzheimer's disease. © 2015 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Fram, Miranda S.; Belitz, Kenneth
2007-01-01
Ground-water quality in the approximately 1,800 square-mile Southern Sierra study unit (SOSA) was investigated in June 2006 as part of the Statewide Basin Assessment Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Statewide Basin Assessment Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Southern Sierra study was designed to provide a spatially unbiased assessment of raw ground-water quality within SOSA, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from fifty wells in Kern and Tulare Counties. Thirty-five of the wells were selected using a randomized grid-based method to provide statistical representation of the study area, and fifteen were selected to evaluate changes in water chemistry along ground-water flow paths. The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates, pharmaceutical compounds, and wastewater-indicator compounds], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), and 1,2,3-trichloropropane (1,2,3-TCP)], naturally occurring inorganic constituents [nutrients, major and minor ions, and trace elements], radioactive constituents, and microbial indicators. Naturally occurring isotopes [tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water], and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected for approximately one-eighth of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control information resulted in censoring of less than 0.2 percent of the data collected for ground-water samples. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs and pesticides were detected in less than one-third of the grid wells, and all detections in samples from SOSA wells were below health-based thresholds. All detections of trace elements and nutrients in samples from SOSA wells were below health-based thresholds, with the exception of four detections of arsenic that were above the USEPA maximum contaminant level (MCL-US) and one detection of boron that was above the CDPH notification level (NL-CA). All detections of radioactive constituents were below health-based thresholds, although four samples had activities of radon-222 above the proposed MCL-US.
Most of the samples from SOSA wells had concentrations of major elements, total dissolved solids, and trace elements below the non-enforceable thresholds set for aesthetic concerns. A few samples contained iron, manganese, or total dissolved solids at concentrations above the SMCL-CA thresholds.
Implementation guide for turbidity threshold sampling: principles, procedures, and analysis
Jack Lewis; Rand Eads
2009-01-01
Turbidity Threshold Sampling uses real-time turbidity and river stage information to automatically collect water quality samples for estimating suspended sediment loads. The system uses a programmable data logger in conjunction with a stage measurement device, a turbidity sensor, and a pumping sampler. Specialized software enables the user to control the sampling...
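A minimal Python sketch of the triggering idea follows: the logger compares each real-time turbidity reading with a ladder of thresholds and pumps a physical sample whenever a new threshold has been crossed since the last sample. The threshold values and the simple crossing rule are illustrative assumptions, not the published Turbidity Threshold Sampling algorithm.

```python
def should_sample(turbidity_ntu, last_sampled_ntu, thresholds_ntu):
    """Trigger the pumping sampler if any threshold lies between the turbidity
    at the last physical sample and the current reading (rising or falling)."""
    lo, hi = sorted((last_sampled_ntu, turbidity_ntu))
    return any(lo < t <= hi for t in thresholds_ntu)

thresholds = [20, 50, 100, 200, 400]          # NTU; illustrative values only
print(should_sample(130, 60, thresholds))     # True: the 100 NTU threshold was crossed
print(should_sample(55, 60, thresholds))      # False: no threshold crossed
```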
Land, Michael; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 460 square mile San Fernando-San Gabriel study unit (SFSG) was investigated between May and July 2005 as part of the Priority Basin Assessment Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Assessment Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The San Fernando-San Gabriel study was designed to provide a spatially unbiased assessment of raw ground-water quality within SFSG, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 52 wells in Los Angeles County. Thirty-five of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and seventeen wells were selected to aid in the evaluation of specific water-quality issues or changes in water chemistry along a historic ground-water flow path (understanding wells). The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), 1,2,3-trichloropropane (1,2,3-TCP), and 1,4-dioxane], naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen, oxygen, and carbon), and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, samples for matrix spikes) were collected at approximately one-fifth (11 of 52) of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control results showed that the data had very little bias or variability and resulted in censoring of less than 0.7 percent (32 of 4,484 measurements) of the data collected for ground-water samples. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs were detected in more than 90 percent (33 of 35) of grid wells. For all wells sampled for SFSG, nearly all VOC detections were below health-based thresholds, and most were less than one-tenth of the threshold values. Samples from seven wells had at least one detection of PCE, TCE, tetrachloromethane, NDMA, or 1,2,3-TCP at or above a health-based threshold. Pesticides were detected in about 90 percent (31 of 35) of grid wells, and all detections in samples from SFSG wells were below health-based thresholds.
Major ions, trace elements, and nutrients in samples from 17 SFSG wells were all below health-based thresholds, with the exception of one detection of nitrate that was above the USEPA maximum contaminant level (MCL-US). With the exception of 14 samples having radon-222 above the proposed MCL-US, radioactive constituents were below health-based thresholds for 16 of the SFSG wells sampled. Total dissolved solids in 6 of the 24 SFSG wells that were sampled ha
Bian, Lin
2012-01-01
In clinical practice, hearing thresholds are measured at only five to six frequencies at octave intervals. Thus, the audiometric configuration cannot closely reflect the actual status of the auditory structures. In addition, differential diagnosis requires quantitative comparison of behavioral thresholds with physiological measures, such as otoacoustic emissions (OAEs), which are usually measured at higher resolution. The purpose of this research was to develop a method to improve the frequency resolution of the audiogram. A repeated-measure design was used in the study to evaluate the reliability of the threshold measurements. A total of 16 participants with clinically normal hearing or mild hearing loss were recruited from a population of university students. No intervention was involved in the study. A custom-developed system and software were used for threshold acquisition with quality control (QC). With real-ear calibration and monitoring of test signals, the system provided an accurate and individualized measure of hearing thresholds, which were determined by an analysis based on signal detection theory (SDT). The reliability of the threshold measure was assessed by correlation and differences between the repeated measures. The audiometric configurations were diverse and unique to each individual ear. The accuracy, within-subject reliability, and between-test repeatability are relatively high. With QC, high-resolution audiograms can be reliably and accurately measured. Hearing thresholds measured as ear canal sound pressures with higher frequency resolution can provide more customized hearing-aid fitting. The test system may be integrated with other physiological measures, such as OAEs, into a comprehensive evaluative tool. American Academy of Audiology.
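The abstract does not detail the SDT analysis, so the Python sketch below shows one conventional way a detection-theoretic threshold could be defined: the lowest presentation level at which d', computed from hit and false-alarm rates (the latter from catch trials), reaches a criterion of 1.0. The criterion, the proportion correction, and the trial counts are all assumptions for illustration, not the authors' procedure.

```python
from statistics import NormalDist

def dprime(hit_rate, fa_rate, n_signal, n_catch):
    """d' with a simple 1/(2N) correction for proportions of 0 or 1 (assumption)."""
    h = min(max(hit_rate, 0.5 / n_signal), 1 - 0.5 / n_signal)
    f = min(max(fa_rate, 0.5 / n_catch), 1 - 0.5 / n_catch)
    z = NormalDist().inv_cdf
    return z(h) - z(f)

def sdt_threshold(levels_db, hits, n_signal, false_alarms, n_catch, criterion=1.0):
    """Lowest presentation level whose d' meets the criterion; None if none does."""
    fa_rate = false_alarms / n_catch
    for level, k in sorted(zip(levels_db, hits)):
        if dprime(k / n_signal, fa_rate, n_signal, n_catch) >= criterion:
            return level
    return None

# 10 signal trials per level, 20 catch trials with 1 false alarm
print(sdt_threshold([10, 15, 20, 25], [2, 5, 8, 10], 10, 1, 20))  # -> 15
```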
THE MAQC PROJECT: ESTABLISHING QC METRICS AND THRESHOLDS FOR MICROARRAY QUALITY CONTROL
Microarrays represent a core technology in pharmacogenomics and toxicogenomics; however, before this technology can successfully and reliably be applied in clinical practice and regulatory decision-making, standards and quality measures need to be developed. The Microarray Qualit...
Cloud Motion Vectors from MISR using Sub-pixel Enhancements
NASA Technical Reports Server (NTRS)
Davies, Roger; Horvath, Akos; Moroney, Catherine; Zhang, Banglin; Zhu, Yanqiu
2007-01-01
The operational retrieval of height-resolved cloud motion vectors (CMVs) by the Multiangle Imaging SpectroRadiometer (MISR) on the Terra satellite has been significantly improved by using sub-pixel approaches to co-registration and disparity assessment, and by imposing stronger quality control based on the agreement between independent forward and aft triplet retrievals. Analysis of the fore-aft differences indicates that CMVs pass the basic operational quality control 67% of the time, with rms differences of 2.4 m/s in speed, 17 deg in direction, and 290 m in height assignment. The use of enhanced quality control thresholds reduces these rms values to 1.5 m/s, 17 deg, and 165 m, respectively, at the cost of reducing coverage to 45%. Use of the enhanced thresholds also eliminates a tendency for the rms differences to increase with height. Comparison of CMVs from an earlier operational version, which had slightly weaker quality control, with 6-hour forecast winds from the Global Modeling and Assimilation Office yielded very low bias values and an rms vector difference that ranged from 5 m/s for low clouds to 10 m/s for high clouds.
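The fore-aft agreement screening can be illustrated with a minimal Python sketch that keeps a cloud motion vector only when the independent forward and aft retrievals agree within chosen tolerances. The tolerance values below are illustrative assumptions, not the MISR operational or enhanced QC thresholds quoted above.

```python
def keep_cmv(fore, aft, max_dspeed=3.0, max_ddir=30.0, max_dheight=500.0):
    """Screen a CMV by forward/aft triplet agreement.

    fore, aft: dicts with 'speed' (m/s), 'direction' (deg), and 'height' (m).
    """
    dspeed = abs(fore["speed"] - aft["speed"])
    ddir = abs((fore["direction"] - aft["direction"] + 180.0) % 360.0 - 180.0)  # wrap to [-180, 180]
    dheight = abs(fore["height"] - aft["height"])
    return dspeed <= max_dspeed and ddir <= max_ddir and dheight <= max_dheight

print(keep_cmv({"speed": 12.0, "direction": 250.0, "height": 3200.0},
               {"speed": 10.5, "direction": 262.0, "height": 3050.0}))  # True
```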
Methods to achieve high interrater reliability in data collection from primary care medical records.
Liddy, Clare; Wiens, Miriam; Hogg, William
2011-01-01
We assessed interrater reliability (IRR) of chart abstractors within a randomized trial of cardiovascular care in primary care. We report our findings, and outline issues and provide recommendations related to determining sample size, frequency of verification, and minimum thresholds for 2 measures of IRR: the κ statistic and percent agreement. We designed a data quality monitoring procedure having 4 parts: use of standardized protocols and forms, extensive training, continuous monitoring of IRR, and a quality improvement feedback mechanism. Four abstractors checked a 5% sample of charts at 3 time points for a predefined set of indicators of the quality of care. We set our quality threshold for IRR at a κ of 0.75, a percent agreement of 95%, or both. Abstractors reabstracted a sample of charts in 16 of 27 primary care practices, checking a total of 132 charts with 38 indicators per chart. The overall κ across all items was 0.91 (95% confidence interval, 0.90-0.92) and the overall percent agreement was 94.3%, signifying excellent agreement between abstractors. We gave feedback to the abstractors to highlight items that had a κ of less than 0.70 or a percent agreement less than 95%. No practice had to have its charts abstracted again because of poor quality. A 5% sampling of charts for quality control using IRR analysis yielded κ and agreement levels that met or exceeded our quality thresholds. Using 3 time points during the chart audit phase allows for early quality control as well as ongoing quality monitoring. Our results can be used as a guide and benchmark for other medical chart review studies in primary care.
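The two IRR measures and the study's pass criteria translate directly into code. The Python sketch below computes simple percent agreement and Cohen's κ for one indicator abstracted by two raters and checks them against the thresholds used above (κ ≥ 0.75 or agreement ≥ 95%); the example ratings are invented for illustration.

```python
def percent_agreement(a, b):
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(a)
    labels = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    p_exp = sum((a.count(lab) / n) * (b.count(lab) / n) for lab in labels)
    return (p_obs - p_exp) / (1 - p_exp)

rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
kappa, agree = cohens_kappa(rater1, rater2), percent_agreement(rater1, rater2)
# Study threshold: kappa >= 0.75 or percent agreement >= 95%
print(f"kappa={kappa:.2f}, agreement={agree:.0f}%, pass={kappa >= 0.75 or agree >= 95}")
```

On this toy example κ ≈ 0.78 with 90% agreement, so the indicator passes on the κ criterion alone.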
Bennett, Peter A.; Bennett, George L.; Belitz, Kenneth
2009-01-01
Groundwater quality in the approximately 1,180-square-mile Northern Sacramento Valley study unit (REDSAC) was investigated in October 2007 through January 2008 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within REDSAC and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 66 wells in Shasta and Tehama Counties. Forty-three of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 23 were selected to aid in evaluation of specific water-quality issues (understanding wells). The groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate and N-nitrosodimethylamine [NDMA]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial constituents. Naturally occurring isotopes (tritium, carbon-14, stable isotopes of nitrogen and oxygen in nitrate, and stable isotopes of hydrogen and oxygen of water), and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. In total, over 275 constituents and field water-quality indicators were investigated. Three types of quality-control samples (blanks, replicates, and matrix spikes) were collected at approximately 8 to 11 percent of the wells, and the results for these samples were used to evaluate the quality of the data obtained from the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a noticeable source of bias in the data for the groundwater samples. Differences between replicate samples were within acceptable ranges for nearly all compounds, indicating acceptably low variability. Matrix-spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw groundwater typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and with aesthetic and technical thresholds established by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and do not indicate compliance or noncompliance with those thresholds. The concentrations of most constituents detected in groundwater samples from REDSAC were below drinking-water thresholds.
Volatile organic compounds (VOC) and pesticides were detected in less than one-quarter of the samples and were generally less than a hundredth of any health-based thresholds. NDMA was detected in one grid well above the NL-CA. Concentrations of all nutrients and trace elements in samples from REDSAC wells were below the health-based thresholds except those of arsenic in three samples, which were above the USEPA maximum contaminant level (MCL-US). However
Fram, Miranda S.; Munday, Cathy; Belitz, Kenneth
2009-01-01
Groundwater quality in the approximately 460-square-mile Tahoe-Martis study unit was investigated in June through September 2007 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within the Tahoe-Martis study unit (Tahoe-Martis) and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 52 wells in El Dorado, Placer, and Nevada Counties. Forty-one of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 11 were selected to aid in evaluation of specific water-quality issues (understanding wells). The groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate and N-nitrosodimethylamine [NDMA]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, strontium isotope ratio, and stable isotopes of hydrogen and oxygen of water), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, 240 constituents and water-quality indicators were investigated. Three types of quality-control samples (blanks, replicates, and samples for matrix spikes) were each collected at 12 percent of the wells, and the results obtained from these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that data for the groundwater samples were not compromised by possible contamination during sample collection, handling, or analysis. Differences between replicate samples were within acceptable ranges. Matrix spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw water typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH), and with aesthetic and technical thresholds established by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and do not indicate compliance or noncompliance with regulatory thresholds. The concentrations of most constituents detected in groundwater samples from the Tahoe-Martis wells were below drinking-water thresholds.
Organic compounds (VOCs and pesticides) were detected in about 40 percent of the samples from grid wells, and most concentrations were less than 1/100th of regulatory and nonregulatory health-based thresholds, although the concentration of perchloroethene in one sample was above the USEPA maximum contaminant level (MCL-US). Concentrations of all trace elements and nutrients in samples from grid wells were below regulatory and nonregulatory health-based thresholds, with five exceptions. Concentra
Quality Aware Compression of Electrocardiogram Using Principal Component Analysis.
Gupta, Rajarshi
2016-05-01
Electrocardiogram (ECG) compression finds wide application in patient monitoring. Quality control in ECG compression ensures reconstruction quality and its clinical acceptance for diagnostic decision making. In this paper, a quality-aware compression method for single-lead ECG is described using principal component analysis (PCA). After pre-processing, beat extraction, and PCA decomposition, two independent quality criteria, namely bit rate control (BRC) and error control (EC), were set to select the optimal principal components, eigenvectors, and their quantization levels to achieve the desired bit rate or error measure. The selected principal components and eigenvectors were finally compressed using a modified delta and Huffman encoder. The algorithms were validated with 32 records from the MIT-BIH Arrhythmia Database (mitdb) and with 60 normal and 30 diagnostic ECG records from the PTB Diagnostic ECG Database (ptbdb), all at 1 kHz sampling. For BRC with a CR threshold of 40, an average compression ratio (CR), percentage root mean squared difference normalized (PRDN), and maximum absolute error (MAE) of 50.74, 16.22%, and 0.243 mV, respectively, were obtained. For EC with an upper limit of 5% PRDN and 0.1 mV MAE, an average CR, PRDN, and MAE of 9.48, 4.13%, and 0.049 mV, respectively, were obtained. For mitdb record 117, the reconstruction quality could be preserved up to a CR of 68.96 by extending the BRC threshold. The proposed method yields better results than recently published works on quality-controlled ECG compression.
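The distortion measures reported above can be computed with a short Python sketch. PRDN is taken here as the RMS difference normalized to the mean-removed original signal, which is a common definition and assumed to match the paper's usage; the test signal is synthetic, not an ECG record.

```python
import numpy as np

def prdn(original, reconstructed):
    """Percentage RMS difference, normalized to the mean-removed original."""
    x, y = np.asarray(original, dtype=float), np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum((x - x.mean()) ** 2))

def max_abs_error(original, reconstructed):
    return float(np.max(np.abs(np.asarray(original) - np.asarray(reconstructed))))

def compression_ratio(original_bits, compressed_bits):
    return original_bits / compressed_bits

# Toy example: a slightly noisy reconstruction of a smooth test signal
x = np.sin(np.linspace(0, 2 * np.pi, 200))
y = x + 0.01 * np.random.default_rng(0).standard_normal(200)
print(f"PRDN = {prdn(x, y):.2f}%, MAE = {max_abs_error(x, y):.4f}")
```

In an error-controlled mode like the EC criterion above, components and quantization levels would be added until PRDN and MAE fall below the chosen limits (5% and 0.1 mV in the paper).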
Sleep quality and arousal in migraine and tension-type headache: the headache-sleep study.
Engstrøm, M; Hagen, K; Bjørk, M H; Stovner, L J; Sand, T
2014-01-01
The present paper summarizes and compares data from our studies on subjective and objective sleep quality and pain thresholds in tension-type headache (TTH), migraine, and controls. In a blinded controlled explorative study, we recorded polysomnography (PSG) and pressure, heat, and cold pain thresholds in 34 controls, 20 TTH patients, and 53 migraine patients. Sleep quality was assessed by questionnaires, sleep diaries, and PSG. Migraineurs who had their recordings more than 2 days from an attack were classified as interictal, while the rest were classified as either preictal or postictal. Interictal migraineurs (n=33) were also divided into two groups according to whether their headache onsets were mainly during sleep and awakening (sleep migraine, SM) or during daytime with no regular onset pattern (non-sleep migraine, NSM). TTH patients were divided into a chronic or episodic group according to headache days per month. Compared to controls, all headache groups reported more anxiety and sleep-related symptoms. TTH and NSM patients reported more daytime tiredness and tended to have lower pain thresholds. Despite normal sleep times in the diaries, TTH and NSM patients had increased slow-wave sleep, as seen after sleep deprivation. Migraineurs in the preictal phase had shorter latency to sleep onset than controls. Except for a slightly but significantly increased awakening index in SM patients, the patient groups differed little from controls in objective measurements. We hypothesize that TTH and NSM patients on average need more sleep than healthy controls. SM patients seem more susceptible to sleep disturbances. Inadequate rest might be an attack-precipitating and hyperalgesia-inducing factor. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Methods of Statistical Control for Groundwater Quality Indicators
NASA Astrophysics Data System (ADS)
Yankovich, E.; Nevidimova, O.; Yankovich, K.
2016-06-01
The article describes the results of groundwater quality control. The controlled quality indicators included the following microelements: barium, manganese, iron, mercury, iodine, chromium, strontium, etc. Quality control charts (X-bar charts and R charts) were built. The maximum permissible concentration of each component in water was selected as the upper threshold limit, and the lower limit of its biologically significant concentration was selected as the lower threshold limit. Analysis of the charts showed that the levels of microelement content in the water of the study area are stable. Most elements in the groundwater are present at concentrations that are significant for the human organisms consuming the water. For example, elements such as Ba, Mn, and Fe have concentrations that exceed the maximum permissible levels for drinking water.
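The X-bar and R chart construction can be illustrated with a minimal Python sketch. Following the study's choice of fixed limits, subgroup means are checked against an upper limit set at the maximum permissible concentration and a lower limit at the biologically significant concentration; the manganese data and both limit values below are placeholders, not the authors' numbers.

```python
import numpy as np

def xbar_r_points(samples):
    """Subgroup means (X-bar chart) and ranges (R chart); one row per sampling occasion."""
    s = np.asarray(samples, dtype=float)
    return s.mean(axis=1), s.max(axis=1) - s.min(axis=1)

def flag_out_of_limits(xbar, lower_limit, upper_limit):
    """Flag subgroup means outside the fixed limits used as control boundaries."""
    return (xbar < lower_limit) | (xbar > upper_limit)

# Illustrative manganese concentrations (mg/L), three measurements per occasion
mn = [[0.06, 0.08, 0.07], [0.09, 0.11, 0.10], [0.07, 0.06, 0.08]]
xbar, r = xbar_r_points(mn)
print(xbar.round(3), r.round(3), flag_out_of_limits(xbar, lower_limit=0.01, upper_limit=0.10))
```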
Setting limits: Using air pollution thresholds to protect and restore US ecosystems
Fenn, Mark E.; Lambert, Kathleen F.; Blett, Tamara F.; Burns, Douglas A.; Pardo, Linda H.; Lovett, Gary M.; Haeuber, Richard A.; Evers, David C.; Driscoll, Charles T.; Jeffries, Dean S.
2011-01-01
More than four decades of research provide unequivocal evidence that sulfur, nitrogen, and mercury pollution have altered, and will continue to alter, our nation's lands and waters. The emission and deposition of air pollutants harm native plants and animals, degrade water quality, affect forest productivity, and are damaging to human health. Many air quality policies limit emissions at the source but these control measures do not always consider ecosystem impacts. Air pollution thresholds at which ecological effects are observed, such as critical loads, are effective tools for assessing the impacts of air pollution on essential ecosystem services and for informing public policy. U.S. ecosystems can be more effectively protected and restored by using a combination of emissions-based approaches and science-based thresholds of ecosystem damage.
Yoo, Sun K; Kim, D K; Jung, S M; Kim, E-K; Lim, J S; Kim, J H
2004-01-01
A Web-based, real-time tele-ultrasound consultation system was designed. The system employed ActiveX control, MPEG-4 coding of full-resolution ultrasound video (640 x 480 pixels at 30 frames/s) and H.320 videoconferencing. It could be used via a Web browser. The system was evaluated over three types of commercial line: a cable connection, ADSL and VDSL. Three radiologists assessed the quality of compressed and uncompressed ultrasound video-sequences from 16 cases (10 abnormal livers, four abnormal kidneys and two abnormal gallbladders). The radiologists' scores showed that, at a given frame rate, increasing the bit rate was associated with increasing quality; however, beyond a certain threshold bit rate the quality did not increase significantly. The peak signal to noise ratio (PSNR) was also measured between the compressed and uncompressed images. In most cases, the PSNR increased as the bit rate increased, and increased as the number of dropped frames increased. There was a threshold bit rate, at a given frame rate, above which the PSNR did not improve significantly. Taking into account both sets of threshold values, a bit rate of more than 0.6 Mbit/s, at 30 frames/s, is suggested as the threshold for the maintenance of diagnostic image quality.
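The PSNR comparison between compressed and uncompressed frames is a standard calculation; the Python sketch below assumes 8-bit pixel depth and uses a synthetic frame with added noise rather than real ultrasound video.

```python
import numpy as np

def psnr(original, compressed, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two same-sized frames (8-bit assumed)."""
    x, y = np.asarray(original, dtype=float), np.asarray(compressed, dtype=float)
    mse = np.mean((x - y) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_value ** 2 / mse)

# Toy 640 x 480 frame with mild compression-like noise
rng = np.random.default_rng(1)
frame = rng.integers(0, 256, size=(480, 640)).astype(float)
degraded = np.clip(frame + rng.normal(0, 2, frame.shape), 0, 255)
print(f"{psnr(frame, degraded):.1f} dB")
```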
Shelton, Jennifer L.; Pimentel, Isabel; Fram, Miranda S.; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 3,000 square-mile Kern County Subbasin study unit (KERN) was investigated from January to March, 2006, as part of the Priority Basin Assessment Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Assessment project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the California State Water Resources Control Board (SWRCB) in collaboration with the U.S. Geological Survey (USGS) and the Lawrence Livermore National Laboratory (LLNL). The Kern County Subbasin study was designed to provide a spatially unbiased assessment of raw (untreated) ground-water quality within KERN, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 50 wells within the San Joaquin Valley portion of Kern County. Forty-seven of the wells were selected using a randomized grid-based method to provide a statistical representation of the ground-water resources within the study unit. Three additional wells were sampled to aid in the evaluation of changes in water chemistry along regional ground-water flow paths. The ground-water samples were analyzed for a large number of man-made organic constituents (volatile organic compounds [VOCs], pesticides, and pesticide degradates), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen, oxygen, nitrogen, and carbon) and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, and laboratory matrix spikes) were collected and analyzed at approximately 10 percent of the wells, and the results for these samples were used to evaluate the quality of the data from the ground-water samples. Assessment of the quality-control information resulted in censoring of less than 0.4 percent of the data collected for ground-water samples. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw ground water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply, not to the raw ground water, but to treated water that is served to the consumer. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH), as well as with thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs and pesticides each were detected in approximately 60 percent of the grid wells, and detections of all compounds but one were below health-based thresholds. The fumigant 1,2-dibromo-3-chloropropane (DBCP) was detected above the USEPA maximum contaminant level (MCL-US) in one sample. Detections of most inorganic constituents were also below health-based thresholds.
Constituents detected above health-based thresholds include nitrate (MCL-US, 2 samples), arsenic (MCL-US, 2 samples), and vanadium (California notification level, NL-CA, 1 sample). All detections of radioactive constituents were below health-based thresholds, although nine samples had activities of radon-222 above the lower proposed MCL-US. Most of the samples from KERN wells had concentrations of major elements, total dissolved solids, and trace elements below the non-enforceable thresholds set for aesthetic concerns.
Yang, Chihae; Barlow, Susan M; Muldoon Jacobs, Kristi L; Vitcheva, Vessela; Boobis, Alan R; Felter, Susan P; Arvidson, Kirk B; Keller, Detlef; Cronin, Mark T D; Enoch, Steven; Worth, Andrew; Hollnagel, Heli M
2017-11-01
A new dataset of cosmetics-related chemicals for the Threshold of Toxicological Concern (TTC) approach has been compiled, comprising 552 chemicals with 219, 40, and 293 chemicals in Cramer Classes I, II, and III, respectively. Data were integrated and curated to create a database of No-/Lowest-Observed-Adverse-Effect Level (NOAEL/LOAEL) values, from which the final COSMOS TTC dataset was developed. Criteria for study inclusion and NOAEL decisions were defined, and rigorous quality control was performed for study details and assignment of Cramer classes. From the final COSMOS TTC dataset, human exposure thresholds of 42 and 7.9 μg/kg-bw/day were derived for Cramer Classes I and III, respectively. The size of Cramer Class II was insufficient for derivation of a TTC value. The COSMOS TTC dataset was then federated with the dataset of Munro and colleagues, previously published in 1996, after updating the latter using the quality control processes for this project. This federated dataset expands the chemical space and provides more robust thresholds. The 966 substances in the federated database comprise 245, 49 and 672 chemicals in Cramer Classes I, II and III, respectively. The corresponding TTC values of 46, 6.2 and 2.3 μg/kg-bw/day are broadly similar to those of the original Munro dataset. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
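The derivation of the class-specific thresholds is not spelled out in the abstract; the Python sketch below follows the Munro-style approach that the federated dataset builds on, taking the 5th percentile of the NOAEL distribution for a Cramer class and dividing by an uncertainty factor of 100. Using an empirical percentile rather than a fitted lognormal distribution is a simplification, and the NOAEL values shown are illustrative, not entries from the COSMOS or Munro datasets.

```python
import numpy as np

def ttc_from_noaels(noaels_mg_per_kg_day, uncertainty_factor=100.0):
    """Human exposure threshold (ug/kg-bw/day) from a set of NOAELs:
    5th percentile of the NOAEL distribution divided by an uncertainty factor."""
    p5_mg = np.percentile(np.asarray(noaels_mg_per_kg_day, dtype=float), 5)
    return 1000.0 * p5_mg / uncertainty_factor  # mg -> ug

# Illustrative NOAELs (mg/kg-bw/day) only
print(round(ttc_from_noaels([3.0, 8.5, 15.0, 40.0, 120.0, 300.0]), 1))  # ~43.8
```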
Goldrath, Dara A.; Wright, Michael T.; Belitz, Kenneth
2009-01-01
Ground-water quality in the approximately 820 square-mile Coachella Valley Study Unit (COA) was investigated during February and March 2007 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of raw ground water used for public-water supplies within the Coachella Valley, and to facilitate statistically consistent comparisons of ground-water quality throughout California. Samples were collected from 35 wells in Riverside County. Nineteen of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study unit (grid wells). Sixteen additional wells were sampled to evaluate changes in water chemistry along selected ground-water flow paths, examine land use effects on ground-water quality, and to collect water-quality data in areas where little exists. These wells were referred to as 'understanding wells'. The ground-water samples were analyzed for a large number of organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, pharmaceutical compounds, and potential wastewater-indicator compounds), constituents of special interest (perchlorate and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (uranium, tritium, carbon-14, and stable isotopes of hydrogen, oxygen, and boron), and dissolved noble gases (the last in collaboration with Lawrence Livermore National Laboratory) also were measured to help identify the source and age of the sampled ground water. A quality-control sample (blank, replicate, or matrix spike) was collected at approximately one quarter of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control information resulted in V-coding less than 0.1 percent of the data collected. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is supplied to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH) and thresholds established for aesthetic purposes (secondary maximum contaminant levels, SMCL-CA) by CDPH. Most constituents detected in ground-water samples were at concentrations below drinking-water thresholds. Volatile organic compounds, pesticides, and pesticide degradates were detected in less than one-third of the grid well samples collected. All VOC and pesticide concentrations measured were below health-based thresholds. Potential waste-water indicators were detected in less than half of the wells sampled, and no detections were above health-based thresholds. 
Perchlorate was detected in seven grid wells; concentrations from two wells were above the CDPH maximum contaminant level (MCL-CA). Most detections of trace elements in samples collected from COA Study Unit wells were below water-quality thresholds. Exceptions include five samples of arsenic that were above the USEPA maximum contaminant level (MCL-US), two detections of boron above the CDPH notification level (NL-CA), and two detections of mol
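The GAMA study-unit reports in this listing all rely on the same reporting device: concentrations measured in raw ground water are compared against health-based or aesthetic benchmarks and exceedances are flagged. The sketch below illustrates that comparison only; the constituent names, concentrations, and benchmark values are invented placeholders, not data or thresholds from these reports, and this is not USGS code.

```python
# Minimal sketch of a benchmark comparison for raw ground-water results.
# All values below are hypothetical and for illustration only.

HEALTH_BENCHMARKS_UG_PER_L = {   # assumed benchmark values, not regulatory numbers
    "arsenic": 10.0,
    "perchlorate": 6.0,
    "boron": 1000.0,
}

def flag_exceedances(sample):
    """Return the constituents whose concentration meets or exceeds its benchmark."""
    return {
        name: conc
        for name, conc in sample.items()
        if name in HEALTH_BENCHMARKS_UG_PER_L
        and conc >= HEALTH_BENCHMARKS_UG_PER_L[name]
    }

if __name__ == "__main__":
    well_sample = {"arsenic": 12.3, "perchlorate": 2.1, "boron": 1500.0}  # made-up values
    print(flag_exceedances(well_sample))   # {'arsenic': 12.3, 'boron': 1500.0}
```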
Groffman, P.M.; Baron, Jill S.; Blett, T.; Gold, A.J.; Goodman, I.; Gunderson, L.H.; Levinson, B.M.; Palmer, Margaret A.; Paerl, H.W.; Peterson, G.D.; Poff, N.L.; Rejeski, D.W.; Reynolds, J.F.; Turner, M.G.; Weathers, K.C.; Wiens, J.
2006-01-01
An ecological threshold is the point at which there is an abrupt change in an ecosystem quality, property or phenomenon, or where small changes in an environmental driver produce large responses in the ecosystem. Analysis of thresholds is complicated by nonlinear dynamics and by multiple factor controls that operate at diverse spatial and temporal scales. These complexities have challenged the use and utility of threshold concepts in environmental management despite great concern about preventing dramatic state changes in valued ecosystems, the need for determining critical pollutant loads and the ubiquity of other threshold-based environmental problems. In this paper we define the scope of the thresholds concept in ecological science and discuss methods for identifying and investigating thresholds using a variety of examples from terrestrial and aquatic environments, at ecosystem, landscape and regional scales. We end with a discussion of key research needs in this area.
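As a rough operational companion to this definition, one common way to locate such a threshold in data is to find the driver value at which a two-segment fit best explains the response. The sketch below is a generic illustration on synthetic data, not a method taken from the paper.

```python
import numpy as np

def two_segment_sse(x, y, breakpoint):
    """Sum of squared errors of two independent linear fits split at `breakpoint`."""
    sse = 0.0
    for mask in (x <= breakpoint, x > breakpoint):
        if mask.sum() < 2:
            return np.inf
        coef = np.polyfit(x[mask], y[mask], 1)
        sse += np.sum((np.polyval(coef, x[mask]) - y[mask]) ** 2)
    return sse

def estimate_threshold(x, y):
    """Grid-search the driver value where a two-segment fit best explains the response."""
    candidates = np.quantile(x, np.linspace(0.1, 0.9, 81))
    return min(candidates, key=lambda b: two_segment_sse(x, y, b))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 10, 200))                          # environmental driver
    y = np.where(x < 6, 1.0, 5.0) + rng.normal(0, 0.3, x.size)    # abrupt shift near x = 6
    print(round(estimate_threshold(x, y), 2))
```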
Landon, Matthew K.; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 1,695-square-mile Central Eastside study unit (CESJO) was investigated from March through June 2006 as part of the Statewide Basin Assessment Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Statewide Basin Assessment project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the California State Water Resources Control Board (SWRCB) in collaboration with the U.S. Geological Survey (USGS) and the Lawrence Livermore National Laboratory (LLNL). The study was designed to provide a spatially unbiased assessment of raw ground-water quality within CESJO, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 78 wells in Merced and Stanislaus Counties. Fifty-eight of the 78 wells were selected using a randomized grid-based method to provide statistical representation of the study unit (grid wells). Twenty of the wells were selected to evaluate changes in water chemistry along selected lateral or vertical ground-water flow paths in the aquifer (flow-path wells). The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), gasoline oxygenates and their degradates, pesticides and pesticide degradates], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), and 1,2,3-trichloropropane (1,2,3-TCP)], inorganic constituents that can occur naturally [nutrients, major and minor ions, and trace elements], radioactive constituents, and microbial indicators. Naturally occurring isotopes [tritium, carbon-14, and uranium isotopes and stable isotopes of hydrogen, oxygen, nitrogen, sulfur, and carbon], and dissolved noble and other gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, samples for matrix spikes) were collected for approximately one-sixth of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control results showed that the environmental data were of good quality, with low bias and low variability, and resulted in censoring of less than 0.3 percent of the detections found in ground-water samples. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CADPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CADPH. VOCs and pesticides were detected in approximately half of the grid wells, and all detections in samples from CESJO wells were below health-based thresholds. All detections of nutrients and major elements in grid wells also were below health-based thresholds. Most detections of constituents of special interest, trace elements, and radioactive constituents in samples from grid wells were below health-based thresholds. 
Exceptions included two detections of arsenic that were above the USEPA maximum contaminant level (MCL-US), one detection of lead above the USEPA action level (AL-US), and one detection of vanadium and three detections of 1,2,3-TCP that were above the CADPH notification levels (NL-CA). All detections of radioactive constituents were below health-based thresholds, although fourteen samples had activities of radon-222 above the lower proposed MCL-US. Most of th
Ferrari, Matthew J.; Fram, Miranda S.; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 950 square kilometer (370 square mile) Central Sierra study unit (CENSIE) was investigated in May 2006 as part of the Priority Basin Assessment project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Assessment project was developed in response to the Ground-Water Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). This study was designed to provide a spatially unbiased assessment of the quality of raw ground water used for drinking-water supplies within CENSIE, and to facilitate statistically consistent comparisons of ground-water quality throughout California. Samples were collected from thirty wells in Madera County. Twenty-seven of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and three were selected to aid in evaluation of specific water-quality issues (understanding wells). Ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], gasoline oxygenates and degradates, pesticides and pesticide degradates), constituents of special interest (N-nitrosodimethylamine, perchlorate, and 1,2,3-trichloropropane), naturally occurring inorganic constituents [nutrients, major and minor ions, and trace elements], radioactive constituents, and microbial indicators. Naturally occurring isotopes [tritium, carbon-14, and stable isotopes of hydrogen, oxygen, nitrogen, and carbon], and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. In total, over 250 constituents and water-quality indicators were investigated. Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected at approximately one-sixth of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Results from field blanks indicated contamination was not a noticeable source of bias in the data for ground-water samples. Differences between replicate samples were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most constituents. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH), and thresholds established for aesthetic concerns (Secondary Maximum Contaminant Levels, SMCL-CA) by CDPH. Therefore, any comparisons of the results of this study to drinking-water standards are for illustrative purposes only and are not indicative of compliance or non-compliance with those standards. Most constituents that were detected in ground-water samples were found at concentrations below drinking-water standards or thresholds. Six constituents (fluoride, arsenic, molybdenum, uranium, gross-alpha radioactivity, and radon-222) were detected at concentrations higher than thresholds set for health-based regulatory purposes. Three additional constituents (pH, iron, and manganese) were detected at concentrations above thresholds set for aesthetic concerns. Volatile organic compounds (VOCs) and pesticides were detected in less than one-third of the samples and generally at less than one one-hundredth of a health-based threshold.
Schmitt, Stephen J.; Milby Dawson, Barbara J.; Belitz, Kenneth
2009-01-01
Groundwater quality in the approximately 1,600 square-mile Antelope Valley study unit (ANT) was investigated from January to April 2008 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within ANT, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 57 wells in Kern, Los Angeles, and San Bernardino Counties. Fifty-six of the wells were selected using a spatially distributed, randomized, grid-based method to provide statistical representation of the study area (grid wells), and one additional well was selected to aid in evaluation of specific water-quality issues (understanding well). The groundwater samples were analyzed for a large number of organic constituents (volatile organic compounds [VOCs], gasoline additives and degradates, pesticides and pesticide degradates, fumigants, and pharmaceutical compounds), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), and radioactive constituents (gross alpha and gross beta radioactivity, radium isotopes, and radon-222). Naturally occurring isotopes (strontium, tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, 239 constituents and water-quality indicators (field parameters) were investigated. Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected at 12 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a noticeable source of bias in the data for the groundwater samples. Differences between replicate samples generally were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most compounds. This study did not evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only, and are not indicative of compliance or non-compliance with drinking water standards.
Most constituents that were detected in groundwater samples were found at concentrations below drinking-water thresholds. Volatile organic compounds (VOCs) were detected in about one-half of the samples and pesticides were detected in about one-third of the samples; all detections of these constituents were below health-based thresholds. Most detections of trace elements and nutrients in samples from ANT wells were below health-based thresholds. Exceptions include: one detection of nitrite plus nitr
Effects of Acupuncture on Sensory Perception: A Systematic Review and Meta-Analysis
Baeumler, Petra I.; Fleckenstein, Johannes; Takayama, Shin; Simang, Michael; Seki, Takashi; Irnich, Dominik
2014-01-01
Background: The effect of acupuncture on sensory perception has never been systematically reviewed, although studies on acupuncture mechanisms are frequently based on the idea that changes in sensory thresholds reflect its effect on the nervous system. Methods: Pubmed, EMBASE and Scopus were screened for studies investigating the effect of acupuncture on thermal or mechanical detection or pain thresholds in humans, published in English or German. A meta-analysis of high quality studies was performed. Results: Out of 3007 identified articles, 85 were included. Sixty-five studies showed that acupuncture affects at least one sensory threshold. Most studies assessed the pressure pain threshold, of which 80% reported an increase after acupuncture. Significant short- and long-term effects on the pressure pain threshold in pain patients were revealed by two meta-analyses including four and two high quality studies, respectively. In over 60% of studies, acupuncture reduced sensitivity to noxious thermal stimuli, but measuring methods might influence results. Few but consistent data indicate that acupuncture reduces pin-prick like pain but not mechanical detection. Results on thermal detection are heterogeneous. Sensory threshold changes were reported equally frequently after manual acupuncture as after electroacupuncture. Among 48 sham-controlled studies, 25 showed stronger effects on sensory thresholds through verum than through sham acupuncture, but in 9 studies significant threshold changes were also observed after sham acupuncture. Overall, there is a lack of high quality acupuncture studies applying comprehensive assessments of sensory perception. Conclusions: Our findings indicate that acupuncture affects sensory perception. Results are most compelling for the pressure pain threshold, especially in pain conditions associated with tenderness. Sham acupuncture can also cause such effects. Future studies should incorporate comprehensive, standardized assessments of sensory profiles in order to fully characterize the effect of acupuncture on sensory perception and to explore the predictive value of sensory profiles for the effectiveness of acupuncture. PMID:25502787
Multi-criteria decision making approaches for quality control of genome-wide association studies.
Malovini, Alberto; Rognoni, Carla; Puca, Annibale; Bellazzi, Riccardo
2009-03-01
Experimental errors in the genotyping phases of a Genome-Wide Association Study (GWAS) can lead to false positive findings and spurious associations. An appropriate quality control phase can minimize the effects of such errors. Several filtering criteria can be used to perform quality control, but no formal methods have been proposed for taking these criteria and the experimenter's preferences into account simultaneously. In this paper we propose two strategies, based on Multi-Criteria Decision Making theory, for setting appropriate genotyping rate thresholds for GWAS quality control. We applied our method to a real dataset composed of 734 individuals affected by Arterial Hypertension (AH) and 486 nonagenarians without a history of AH. The proposed strategies appear to handle GWAS quality control in a sound way, as they rationalize and make explicit the experimenter's choices, thus providing more reproducible results.
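To illustrate the kind of filter such thresholds drive, the sketch below applies a fixed genotyping (call-rate) threshold to a toy genotype matrix. The data conventions (0/1/2 allele counts, NaN for missing calls) and the 0.95 cutoff are assumptions for illustration; the paper's multi-criteria strategies for choosing the threshold are not reproduced here.

```python
import numpy as np

def call_rates(genotypes):
    """Per-SNP call rate; `genotypes` is an (n_individuals, n_snps) array with np.nan for missing calls."""
    return 1.0 - np.mean(np.isnan(genotypes), axis=0)

def filter_snps(genotypes, threshold=0.95):
    """Keep only SNPs whose call rate meets the chosen genotyping-rate threshold."""
    keep = call_rates(genotypes) >= threshold
    return genotypes[:, keep], keep

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    g = rng.integers(0, 3, size=(100, 5)).astype(float)   # toy 0/1/2 allele counts
    g[rng.random(g.shape) < 0.08] = np.nan                 # inject ~8% missingness
    filtered, kept = filter_snps(g, threshold=0.95)
    print(kept, filtered.shape)
```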
Summary of the effects of engine throttle response on airplane formation-flying qualities
NASA Technical Reports Server (NTRS)
Walsh, Kevin R.
1992-01-01
A flight evaluation was conducted to determine the effect of engine throttle response characteristics on precision formation-flying qualities. A variable electronic throttle control system was developed and flight-tested on a TF-104G airplane with a J79-11B engine at the NASA Dryden Flight Research Facility. Ten research flights were flown to evaluate the effects of throttle gain, time delay, and fuel control rate limiting on engine handling qualities during a demanding precision wing formation task. Handling quality effects of lag filters and lead compensation time delays were also evaluated. Data from pilot ratings and comments indicate that throttle control system time delays and rate limits cause significant degradations in handling qualities. Threshold values of these key variables for satisfactory (level 1) and adequate (level 2) handling qualities are presented.
Fleetcroft, Robert; Steel, Nicholas; Cookson, Richard; Howe, Amanda
2008-06-17
The 2003 revision of the UK GMS contract rewards general practices for performance against clinical quality indicators. Practices can exempt patients from treatment, and can receive maximum payment for less than full coverage of eligible patients. This paper aims to estimate the gap between the percentage of maximum incentive gained and the percentage of patients receiving indicated care (the pay-performance gap), and to estimate how much of the gap is attributable respectively to thresholds and to exception reporting. Quality and Outcomes Framework data in the National Primary Care Database and exception reporting data from the Information Centre were analysed for 8407 practices in England in 2005-06. The main outcome measures were the gap between the percentage of maximum incentive gained and the percentage of patients receiving indicated care at the practice level, both for individual indicators and a combined composite score. An additional outcome was the percentage of that gap attributable respectively to exception reporting and maximum threshold targets set at less than 100%. The mean pay-performance gap for the 65 aggregated clinical indicators was 13.3% (range 2.9% to 48%). Of this gap, 52% (6.9% of eligible patients) is attributable to thresholds being set at less than 100%, and 48% to patients being exception reported. The gap was greater than 25% in 9 indicators: beta blockers and cholesterol control in heart disease; cholesterol control in stroke; influenza immunization in asthma; blood pressure, sugar and cholesterol control in diabetes; seizures in epilepsy; and treatment of hypertension. Threshold targets and exception reporting introduce an incentive ceiling, which substantially reduces the percentage of eligible patients that UK practices need to treat in order to receive maximum incentive payments for delivering that care. There are good clinical reasons for exception reporting, but after unsuitable patients have been exempted from treatment, there is no reason why all maximum thresholds should not be 100%, whilst retaining the current lower thresholds to provide incentives for lower performing practices.
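The gap decomposition reported above is simple arithmetic; the sketch below merely reproduces it from the quoted figures (mean gap and attribution shares) for illustration.

```python
# Reproducing the pay-performance gap decomposition from the figures quoted above.
mean_gap_pct_points = 13.3   # mean gap between % of maximum incentive gained and % of patients treated
share_thresholds = 0.52      # share of the gap attributed to maximum thresholds below 100%
share_exceptions = 0.48      # share attributed to exception reporting

gap_from_thresholds = mean_gap_pct_points * share_thresholds
gap_from_exceptions = mean_gap_pct_points * share_exceptions

print(f"thresholds: {gap_from_thresholds:.1f} percentage points")   # ~6.9, matching the abstract
print(f"exceptions: {gap_from_exceptions:.1f} percentage points")   # ~6.4
```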
Characterizing air quality data from complex network perspective.
Fan, Xinghua; Wang, Li; Xu, Huihui; Li, Shasha; Tian, Lixin
2016-02-01
Air quality depends mainly on changes in the emission of pollutants and their precursors, and understanding its characteristics is key to predicting and controlling air quality. In this study, complex networks were built to analyze the topological characteristics of air quality data using a correlation coefficient method. First, PM2.5 (particulate matter with aerodynamic diameter less than 2.5 μm) indexes from eight monitoring sites in Beijing were selected as samples covering January 2013 to December 2014. Second, the C-C method was applied to determine the structure of the phase space; points in the reconstructed phase space were treated as nodes of the mapped network. Edges were then drawn between nodes whose correlation exceeded a critical threshold. Three properties of the constructed networks (degree distribution, clustering coefficient, and modularity) were used to determine the optimal value of the critical threshold. Finally, by analyzing and comparing topological properties, we show that similarities and differences among the constructed complex networks reveal influencing factors and their different roles in the real air quality system.
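A minimal sketch of this kind of construction is shown below: a univariate index is delay-embedded, the embedded vectors become nodes, and an edge is added when the Pearson correlation between two vectors exceeds a chosen threshold. The embedding parameters and the threshold are fixed arbitrarily here (the study selects them via the C-C method and via degree distribution, clustering, and modularity), and the input series is synthetic, not Beijing data.

```python
import numpy as np
import networkx as nx

def build_correlation_network(series, delay=1, dim=3, threshold=0.9):
    """Delay-embed a univariate series into `dim`-dimensional vectors (nodes) and
    connect pairs whose Pearson correlation exceeds `threshold` (edges)."""
    n = len(series) - (dim - 1) * delay
    vectors = np.array([series[i:i + (dim - 1) * delay + 1:delay] for i in range(n)])
    corr = np.corrcoef(vectors)                      # pairwise correlations of embedded vectors
    graph = nx.Graph()
    graph.add_nodes_from(range(n))
    idx_i, idx_j = np.where(np.triu(corr, k=1) > threshold)
    graph.add_edges_from((int(i), int(j)) for i, j in zip(idx_i, idx_j))
    return graph

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pm25 = np.abs(rng.normal(80, 30, 500))           # synthetic index values, not real monitoring data
    g = build_correlation_network(pm25, delay=1, dim=3, threshold=0.9)
    print(g.number_of_nodes(), g.number_of_edges(), round(nx.average_clustering(g), 3))
```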
Boonruab, Jurairat; Nimpitakpong, Netraya; Damjuti, Watchara
2018-01-01
This randomized controlled trial aimed to compare outcomes after treatment with hot herbal compress, hot compress, and topical diclofenac. Participants were divided equally into three groups receiving hot herbal compress, hot compress, or topical diclofenac, with the topical diclofenac group serving as the control. After the treatment courses, the Visual Analog Scale and the 36-Item Short Form Health Survey were used to assess pain intensity and quality of life, respectively. In addition, cervical range of motion and pressure pain threshold were examined to identify effects on motion. All treatments significantly decreased pain intensity and increased cervical range of motion, while the two compress groups showed greater improvements than the topical diclofenac group in pressure pain threshold and quality of life. In summary, hot herbal compress holds promise as an efficacious treatment comparable to hot compress and topical diclofenac.
Milby Dawson, Barbara J.; Bennett, George L.; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 2,100 square-mile Southern Sacramento Valley study unit (SSACV) was investigated from March to June 2005 as part of the Statewide Basin Assessment Project of Ground-Water Ambient Monitoring and Assessment (GAMA) Program. This study was designed to provide a spatially unbiased assessment of raw ground-water quality within SSACV, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 83 wells in Placer, Sacramento, Solano, Sutter, and Yolo Counties. Sixty-seven of the wells were selected using a randomized grid-based method to provide statistical representation of the study area. Sixteen of the wells were sampled to evaluate changes in water chemistry along ground-water flow paths. Four additional samples were collected at one of the wells to evaluate water-quality changes with depth. The GAMA Statewide Basin Assessment project was developed in response to the Ground-Water Quality Monitoring Act of 2001 and is being conducted by the California State Water Resources Control Board (SWRCB) in collaboration with the U.S. Geological Survey (USGS) and the Lawrence Livermore National Laboratory (LLNL). The ground-water samples were analyzed for a large number of man-made organic constituents (volatile organic compounds [VOCs], pesticides and pesticide degradates, pharmaceutical compounds, and wastewater-indicator constituents), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, and carbon-14, and stable isotopes of hydrogen, oxygen, and carbon), and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, matrix spikes) were collected at ten percent of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control data resulted in censoring of less than 0.03 percent of the analyses of ground-water samples. This study did not evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Health Services (CADHS) (Maximum Contaminant Levels [MCLs], notification levels [NLs], or lifetime health advisories [HA-Ls]) and thresholds established for aesthetic concerns (Secondary Maximum Contaminant Levels [SMCLs]). All wells were sampled for organic constituents and selected general water quality parameters; subsets of wells were sampled for inorganic constituents, nutrients, and radioactive constituents. Volatile organic compounds were detected in 49 out of 83 wells sampled and pesticides were detected in 35 out of 82 wells; all detections were below health-based thresholds, with the exception of 1 detection of 1,2,3-trichloropropane above a NL. 
Of the 43 wells sampled for trace elements, 27 had no detections of a trace element above a health-based threshold and 16 had at least one detection above. Of the 18 trace elements with health-based thresholds, 3 (arsenic, barium, and boron) were detected at concentrations higher than an MCL. Of the 43 wells sampled for nitrate, only 1 well had a detection above the MCL. Twenty wells were sampled for radioactive constituents; only 1 (radon-222) was measured at activiti
Shelton, Jennifer L.; Fram, Miranda S.; Belitz, Kenneth
2009-01-01
Groundwater quality in the approximately 860-square-mile Madera-Chowchilla study unit (MADCHOW) was investigated in April and May 2008 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within MADCHOW, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 35 wells in Madera, Merced, and Fresno Counties. Thirty of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and five more were selected to provide additional sampling density to aid in understanding processes affecting groundwater quality (flow-path wells). Detection summaries in the text and tables are given for grid wells only, to avoid over-representation of the water quality in areas adjacent to flow-path wells. Groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], low-level 1,2-dibromo-3-chloropropane [DBCP] and 1,2-dibromoethane [EDB], pesticides and pesticide degradates, polar pesticides and metabolites, and pharmaceutical compounds), constituents of special interest (N-nitrosodimethylamine [NDMA], perchlorate, and low-level 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), and radioactive constituents (uranium isotopes, and gross alpha and gross beta particle activities). Naturally occurring isotopes and geochemical tracers (stable isotopes of hydrogen, oxygen, and carbon, and activities of tritium and carbon-14), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, approximately 300 constituents and field water-quality indicators were investigated. Three types of quality-control samples (blanks, replicates, and samples for matrix spikes) each were collected at approximately 11 percent of the wells sampled for each analysis, and the results obtained from these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that data for the groundwater samples were not compromised by possible contamination during sample collection, handling or analysis. Differences between replicate samples were within acceptable ranges. Matrix spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, raw groundwater typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to raw groundwater. However, to provide some context for the results, concentrations of constituents measured in the raw groundwater were compared with regulatory and non-regulatory health-based thresholds established by the U.S. 
Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH), and with aesthetic and technical thresholds established by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only, and are not indicative of compliance or non-compliance with regulatory thresholds. The concentrations of most constituents detected in groundwater samples from MADCHOW wells were below drinking-water thresholds. Organic compounds (VOCs and pesticides
The importance of reference materials in doping-control analysis.
Mackay, Lindsey G; Kazlauskas, Rymantas
2011-08-01
Currently a large range of pure substance reference materials are available for calibration of doping-control methods. These materials enable traceability to the International System of Units (SI) for the results generated by World Anti-Doping Agency (WADA)-accredited laboratories. Only a small number of prohibited substances have threshold limits for which quantification is highly important. For these analytes only the highest quality reference materials that are available should be used. Many prohibited substances have no threshold limits and reference materials provide essential identity confirmation. For these reference materials the correct identity is critical and the methods used to assess identity in these cases should be critically evaluated. There is still a lack of certified matrix reference materials to support many aspects of doping analysis. However, in key areas a range of urine matrix materials have been produced for substances with threshold limits, for example 19-norandrosterone and testosterone/epitestosterone (T/E) ratio. These matrix-certified reference materials (CRMs) are an excellent independent means of checking method recovery and bias and will typically be used in method validation and then regularly as quality-control checks. They can be particularly important in the analysis of samples close to threshold limits, in which measurement accuracy becomes critical. Some reference materials for isotope ratio mass spectrometry (IRMS) analysis are available and a matrix material certified for steroid delta values is currently under production. In other new areas, for example the Athlete Biological Passport, peptide hormone testing, designer steroids, and gene doping, reference material needs still need to be thoroughly assessed and prioritised.
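Where a certified matrix material is used as a quality-control check, the comparison it supports is a simple bias calculation against the certified value. The sketch below is a generic illustration with made-up replicate values and an assumed 5% acceptance criterion; it is not taken from WADA, a certifying body, or the article above.

```python
def relative_bias(measured_values, certified_value):
    """Relative bias (%) of a method's mean result against a certified reference value."""
    mean_measured = sum(measured_values) / len(measured_values)
    return 100.0 * (mean_measured - certified_value) / certified_value

if __name__ == "__main__":
    replicates = [4.1, 3.9, 4.3, 4.0]   # made-up replicate results on a certified matrix material
    certified = 4.0                     # hypothetical certified value, arbitrary units
    bias = relative_bias(replicates, certified)
    # The 5% acceptance limit below is an assumption for illustration, not a published criterion.
    print(f"bias = {bias:+.1f}%", "OK" if abs(bias) <= 5.0 else "investigate")
```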
NASA Astrophysics Data System (ADS)
Young, Kenneth C.; Cook, James J. H.; Oduko, Jennifer M.; Bosmans, Hilde
2006-03-01
European Guidelines for quality control in digital mammography specify minimum and achievable standards of image quality in terms of threshold contrast, based on readings of images of the CDMAM test object by human observers. However, this is time-consuming and has large inter-observer error. To overcome these problems, a software program (CDCOM) is available to automatically read CDMAM images, but the optimal method of interpreting the output is not defined. This study evaluates methods of determining threshold contrast from the program, and compares these to human readings for a variety of mammography systems. The methods considered are (A) simple thresholding, (B) psychometric curve fitting, (C) smoothing and interpolation, and (D) smoothing and psychometric curve fitting. Each method leads to similar threshold contrasts but with different reproducibility. Method (A) had relatively poor reproducibility with a standard error in threshold contrast of 18.1 +/- 0.7%. This was reduced to 8.4% by using a contrast-detail curve fitting procedure. Method (D) had the best reproducibility with an error of 6.7%, reducing to 5.1% with curve fitting. A panel of 3 human observers had an error of 4.4%, reduced to 2.9% by curve fitting. All automatic methods led to threshold contrasts that were lower than for humans. The ratio of human to program threshold contrasts varied with detail diameter and was 1.50 +/- 0.04 (sem) at 0.1 mm and 1.82 +/- 0.06 at 0.25 mm for method (D). There were good correlations between the threshold contrast determined by humans and the automated methods.
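Psychometric curve fitting of the kind used in methods (B) and (D) generally amounts to fitting a detection-probability curve over contrast and reading off the contrast at a fixed probability. The sketch below assumes a four-alternative forced-choice style curve with a 25% guess rate and a threshold defined at 62.5% correct, with invented data points; the exact functional form and threshold definition used by CDCOM and this study may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(contrast, c_t, slope):
    """Detection probability vs. contrast; c_t is the contrast giving 62.5% correct (assumed 4-AFC form)."""
    return 0.25 + 0.75 / (1.0 + np.exp(-slope * (np.log(contrast) - np.log(c_t))))

if __name__ == "__main__":
    contrasts = np.array([0.05, 0.08, 0.12, 0.20, 0.35, 0.60])   # nominal contrast steps (made up)
    detected = np.array([0.28, 0.35, 0.55, 0.78, 0.93, 0.99])    # fraction correctly detected (made up)
    (c_t, slope), _ = curve_fit(psychometric, contrasts, detected, p0=(0.15, 3.0))
    print(round(float(c_t), 3))   # estimated threshold contrast at 62.5% correct
```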
Lauche, Romy; Cramer, Holger; Choi, Kyung-Eun; Rampp, Thomas; Saha, Felix Joyonto; Dobos, Gustav J; Musial, Frauke
2011-08-15
In this preliminary trial we investigated the effects of dry cupping, an ancient method for treating pain syndromes, on patients with chronic non-specific neck pain. Sensory mechanical thresholds and the participants' self-reported outcome measures of pain and quality of life were evaluated. Fifty patients (50.5 ± 11.9 years) were randomised to a treatment group (TG) or a waiting-list control group (WL). Patients in the TG received a series of 5 cupping treatments over a period of 2 weeks; the control group did not. Self-reported outcome measures before and after the cupping series included the following: Pain at rest (PR) and maximal pain related to movement (PM) on a 100-mm visual analogue scale (VAS), pain diary (PD) data on a 0-10 numeric rating scale (NRS), Neck Disability Index (NDI), and health-related quality of life (SF-36). In addition, the mechanical-detection thresholds (MDT), vibration-detection thresholds (VDT), and pressure-pain thresholds (PPT) were determined at pain-related and control areas. Patients of the TG had significantly less pain after cupping therapy than patients of the WL group (PR: Δ-22.5 mm, p = 0.00002; PM: Δ-17.8 mm, p = 0.01). Pain diaries (PD) revealed that neck pain decreased gradually in the TG patients and that pain reported by the two groups differed significantly after the fifth cupping session (Δ-1.1, p = 0.001). There were also significant differences in the SF-36 subscales for bodily pain (Δ13.8, p = 0.006) and vitality (Δ10.2, p = 0.006). Group differences in PPT were significant at pain-related and control areas (all p < 0.05), but were not significant for MDT or VDT. A series of five dry cupping treatments appeared to be effective in relieving chronic non-specific neck pain. Not only subjective measures improved, but also mechanical pain sensitivity differed significantly between the two groups, suggesting that cupping has an influence on functional pain processing. The trial was registered at clinicaltrials.gov (NCT01289964).
Montrella, Joseph; Belitz, Kenneth
2009-01-01
Ground-water quality in the approximately 460-square-mile Santa Clara River Valley study unit (SCRV) was investigated from April to June 2007 as part of the statewide Priority Basin project of the Ground-Water Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of the quality of raw ground water used for public water supplies within SCRV, and to facilitate a statistically consistent basis for comparing water quality throughout California. Fifty-seven ground-water samples were collected from 53 wells in Ventura and Los Angeles Counties. Forty-two wells were selected using a randomized grid-based method to provide statistical representation of the study area (grid wells). Eleven wells (understanding wells) were selected to further evaluate water chemistry in particular parts of the study area, and four depth-dependent ground-water samples were collected from one of the eleven understanding wells to help understand the relation between water chemistry and depth. The ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, potential wastewater-indicator compounds, and pharmaceutical compounds), a constituent of special interest (perchlorate), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial constituents. Naturally occurring isotopes (tritium, carbon-13, carbon-14 [abundance], stable isotopes of hydrogen and oxygen in water, stable isotopes of nitrogen and oxygen in nitrate, chlorine-37, and bromine-81), and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks or replicates, or samples for matrix spikes) were collected from approximately 26 percent of the wells, and the analyses of these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control results showed that the quality of the environmental data was good, with low bias and low variability, and as a result, less than 0.1 percent of the analytes detected in ground-water samples were censored. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is delivered (or, supplied) to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and non-regulatory thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Most constituents that were detected in ground-water samples were reported at concentrations below their established health-based thresholds. 
VOCs, pesticides and pesticide degradates, and potential wastewater-indicator compounds were detected in about 33 percent or less of the 42 SCRV grid wells. Concentrations of all detected organic constituents were below established health-based thresholds. Perchlorate was detected in approximately 12 percent of the SCRV grid wells; all concentrations reported were below the NL-CA threshold. Additional constituents, including major ions, trace elements, and nutrients were collected at 26 wells (16 grid wells and 10 understanding wells) of the 53 wells sampled f
Groundwater Quality Data in the Mojave Study Unit, 2008: Results from the California GAMA Program
Mathany, Timothy M.; Belitz, Kenneth
2009-01-01
Groundwater quality in the approximately 1,500 square-mile Mojave (MOJO) study unit was investigated from February to April 2008, as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). MOJO was the 23rd of 37 study units to be sampled as part of the GAMA Priority Basin Project. The MOJO study was designed to provide a spatially unbiased assessment of the quality of untreated ground water used for public water supplies within MOJO, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 59 wells in San Bernardino and Los Angeles Counties. Fifty-two of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and seven were selected to aid in evaluation of specific water-quality issues (understanding wells). The groundwater samples were analyzed for a large number of organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates, and pharmaceutical compounds], constituents of special interest (perchlorate and N-nitrosodimethylamine [NDMA]) naturally occurring inorganic constituents (nutrients, dissolved organic carbon [DOC], major and minor ions, silica, total dissolved solids [TDS], and trace elements), and radioactive constituents (gross alpha and gross beta radioactivity, radium isotopes, and radon-222). Naturally occurring isotopes (stable isotopes of hydrogen, oxygen, and carbon, stable isotopes of nitrogen and oxygen in nitrate, and activities of tritium and carbon-14), and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. In total, over 230 constituents and water-quality indicators (field parameters) were investigated. Three types of quality-control samples (blanks, replicates, and matrix spikes) each were collected at approximately 5-8 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a significant source of bias in the data for the groundwater samples. Differences between replicate samples generally were within acceptable ranges, indicating acceptable analytical reproducibility. Matrix spike recoveries were within acceptable ranges for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, untreated groundwater typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to untreated ground water. However, to provide some context for the results, concentrations of constituents measured in the untreated ground water were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic and technical concerns by CDPH. 
Comparisons between data collected for this study and thresholds for drinking-water are for illustrative purposes only, and are not indicative of compliance or non-compliance with those thresholds. Most constituents that were detected in groundwater samples in the 59 wells in MOJO were found at concentrations below drinking-water thresholds. In MOJO's 52 grid wells, volatile organic compounds (VOCs) were detected in 40 percent of the wells, and pesticides and pesticide degradates were detected in 23 percent of the grid wel
Ray, Mary C.; Kulongoski, Justin T.; Belitz, Kenneth
2009-01-01
Ground-water quality in the approximately 620-square-mile San Francisco Bay study unit (SFBAY) was investigated from April through June 2007 as part of the Priority Basin project of the Ground-Water Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The study was designed to provide a spatially unbiased assessment of raw ground-water quality, as well as a statistically consistent basis for comparing water quality throughout California. Samples in SFBAY were collected from 79 wells in San Francisco, San Mateo, Santa Clara, Alameda, and Contra Costa Counties. Forty-three of the wells sampled were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study unit (grid wells). Thirty-six wells were sampled to aid in evaluation of specific water-quality issues (understanding wells). The ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, pharmaceutical compounds, and potential wastewater-indicator compounds), constituents of special interest (perchlorate and N-nitrosodimethylamine [NDMA]), naturally occurring inorganic constituents (nutrients, major and minor ions, trace elements, chloride and bromide isotopes, and uranium and strontium isotopes), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14 isotopes, and stable isotopes of hydrogen, oxygen, nitrogen, boron, and carbon), and dissolved noble gases (noble gases were analyzed in collaboration with Lawrence Livermore National Laboratory) also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blank samples, replicate samples, matrix spike samples) were collected for approximately one-third of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control information from the field blanks resulted in applying 'V' codes to approximately 0.1 percent of the data collected for ground-water samples (meaning a constituent was detected in blanks as well as the corresponding environmental data). See the Appendix section 'Quality-Control-Sample Results'. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is delivered to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs were detected in about one-half of the grid wells, while pesticides were detected in about one-fifth of the grid wells. Concentrations of all VOCs and pesticides detected in samples from all SFBAY wells were below health-based thresholds. 
No pharmaceutical compounds were detected in any SFBAY well. One potential wastewater-indicator compound, caffeine, was detected in one grid well in SFBAY. Concentrations of most trace elements and nutrients detected in samples from all SFBAY wells were below health-based thresholds. Exceptions include nitrate, detected above the USEPA maximum contaminant level (MCL-US) in 3 samples; arsenic, above the USEPA maximum contaminant level (MCL-US) in 3 samples; c
Schmitt, Stephen J.; Fram, Miranda S.; Milby Dawson, Barbara J.; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 3,340 square mile Middle Sacramento Valley study unit (MSACV) was investigated from June through September, 2006, as part of the California Groundwater Ambient Monitoring and Assessment (GAMA) program. The GAMA Priority Basin Assessment project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Middle Sacramento Valley study was designed to provide a spatially unbiased assessment of raw ground-water quality within MSACV, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 108 wells in Butte, Colusa, Glenn, Sutter, Tehama, Yolo, and Yuba Counties. Seventy-one wells were selected using a randomized grid-based method to provide statistical representation of the study unit (grid wells), 15 wells were selected to evaluate changes in water chemistry along ground-water flow paths (flow-path wells), and 22 were shallow monitoring wells selected to assess the effects of rice agriculture, a major land use in the study unit, on ground-water chemistry (RICE wells). The ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], gasoline oxygenates and degradates, pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], and 1,2,3-trichloropropane [1,2,3-TCP]), inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, and carbon-14, and stable isotopes of hydrogen, oxygen, nitrogen, and carbon), and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. Quality-control samples (blanks, replicates, laboratory matrix spikes) were collected at approximately 10 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a noticeable source of bias in the data for the ground-water samples. Differences between replicate samples were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most constituents. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, or blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only and are not indicative of compliance or noncompliance with regulatory thresholds. 
Most constituents that were detected in ground-water samples were found at concentrations below drinking-water thresholds. VOCs were detected in less than one-third and pesticides and pesticide degradates in just over one-half of the grid wells, and all detections of these constituents in samples from all wells of the MSACV study unit were below health-based thresholds. All detections of trace elements in samples from MSACV grid wells were below health-based thresholds, with the exceptions of arsenic and boro
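As an illustration of the kinds of quality-control calculations summarized above, the following minimal Python sketch computes a relative percent difference for a replicate pair and a matrix-spike recovery; the function names, example values, and acceptance limits are assumptions for demonstration only and are not the GAMA program's actual criteria.

```python
# Illustrative quality-control calculations for replicate and matrix-spike samples.
# Example values and any implied acceptance limits are assumptions, not GAMA criteria.

def relative_percent_difference(primary, replicate):
    """RPD between a primary sample and its field replicate, in percent."""
    mean = (primary + replicate) / 2.0
    if mean == 0:
        return 0.0
    return abs(primary - replicate) / mean * 100.0

def spike_recovery(spiked_result, unspiked_result, amount_added):
    """Matrix-spike recovery: percent of the added amount that is measured."""
    return (spiked_result - unspiked_result) / amount_added * 100.0

# Example: a nitrate replicate pair (mg/L) and a pesticide matrix spike (µg/L).
print(relative_percent_difference(2.10, 2.04))   # ~2.9 % -> acceptably low variability
print(spike_recovery(0.95, 0.10, 1.00))          # 85 % recovery
```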
Multi-Criteria Decision Making Approaches for Quality Control of Genome-Wide Association Studies
Malovini, Alberto; Rognoni, Carla; Puca, Annibale; Bellazzi, Riccardo
2009-01-01
Experimental errors in the genotyping phases of a Genome-Wide Association Study (GWAS) can lead to false positive findings and to spurious associations. An appropriate quality control phase could minimize the effects of this kind of error. Several filtering criteria can be used to perform quality control. Currently, no formal methods have been proposed for taking these criteria and the experimenter's preferences into account at the same time. In this paper we propose two strategies for setting appropriate genotyping rate thresholds for GWAS quality control. These two approaches are based on Multi-Criteria Decision Making theory. We have applied our method to a real dataset composed of 734 individuals affected by Arterial Hypertension (AH) and 486 nonagenarians without a history of AH. The proposed strategies appear to deal with GWAS quality control in a sound way, as they rationalize and make explicit the experimenter's choices, thus providing more reproducible results. PMID:21347174
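The following is a minimal sketch of one generic multi-criteria (weighted-sum) way to score candidate genotyping-rate thresholds; the criteria, weights, and numbers are hypothetical, and this is not the authors' specific MCDM formulation.

```python
import numpy as np

# Hypothetical per-threshold criteria after filtering at each candidate genotyping
# call-rate threshold. A weighted-sum score (one basic MCDM approach) trades them off
# according to the experimenter's preferences. This is NOT the paper's exact method.
thresholds       = np.array([0.90, 0.95, 0.97, 0.99])
snps_retained    = np.array([0.99, 0.97, 0.93, 0.80])  # stricter threshold -> fewer SNPs
samples_retained = np.array([0.98, 0.96, 0.94, 0.88])  # stricter threshold -> fewer samples
data_reliability = np.array([0.70, 0.85, 0.92, 0.99])  # stricter threshold -> cleaner genotypes

weights = np.array([0.3, 0.3, 0.4])            # experimenter's preferences (sum to 1)
criteria = np.vstack([snps_retained, samples_retained, data_reliability])

scores = weights @ criteria                     # weighted-sum score per candidate threshold
best = thresholds[np.argmax(scores)]
print(dict(zip(thresholds, scores.round(3))), "-> chosen threshold:", best)
```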
Abejón, David; Rueda, Pablo; Vallejo, Ricardo
2016-04-01
Pulse frequency (Fc) is one of the most important parameters in neurostimulation, together with pulse width (Pw) and amplitude (I). Up to a certain Fc, increasing the number of pulses will generate action potentials in neighboring neural structures and may facilitate deeper penetration of the electromagnetic fields. In addition, changes in frequency modify the patient's sensation with stimulation. Fifty patients previously implanted with rechargeable, current-controlled spinal cord stimulation systems were studied. With pulse width fixed at 300 μsec, we stimulated at 26 different Fc values between 40 and 1200 Hz and determined the influence of these changes on different stimulation thresholds: perception threshold (Tp), therapeutic perception (Tt), and discomfort threshold (Td). Simultaneously, paresthesia coverage of the painful area and the patient's sensation and satisfaction related to the quality of stimulation were recorded. Pulse Fc is inversely proportional to stimulation thresholds, and this influence is statistically significant (p < 0.05). As pulse Fc increased from 40 to 1200 Hz, the mean threshold decreased from 7.25 to 1.38 mA (Tp), 8.17 to 1.63 (Tt), and 9.20 to 1.85 (Td). Significant differences began at 750 Hz for Tp and Tt and at 650 Hz for Td. No significant influence was found regarding paresthesia coverage. As expected, Fc significantly affects the patient's sensation and satisfaction. Changes in Fc affect the quality of paresthesias. Within the evaluated parameters, higher frequencies are inversely proportional to stimulation thresholds and Tt. It seems that Fc is a vital parameter for achieving therapeutic success. Changing Fc is a useful way to modulate the patient's sensory perception. Fc can be successfully used to adjust the quality of the paresthesias and to modify the patient's subjective sensation. We showed that as the frequency increases, the patient's satisfaction with the perceived sensation decreases, suggesting that higher Fc may need to be set at subthreshold amplitude to achieve a positive response. © 2016 International Neuromodulation Society.
FDA's Critical Path Initiative identifies pharmacogenomics and toxicogenomics as key opportunities in advancing medical product development and personalized medicine, and the Guidance for Industry: Pharmacogenomic Data Submissions has been released. Microarrays represent a co...
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topographic data providers to create standard, up-to-date and complete data sets within a sustainable framework. Data quality has been studied and researched for more than two decades. There are countless references on its semantics, its conceptual and logical representations, and many applications to spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. Research on spatial data quality has identified several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. Industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model-building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging class, object-non-belonging class, and class-class relationships to be taken into account during data capturing. The post-data capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice. Turning spatial data quality concepts into development and application requires the conceptual, logical and, most importantly, physical existence of the data model, rules and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First, we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then, our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to establish the metrics, measures and thresholds of the quality definitions is discussed. In this paper, geometry and semantics quality in particular, and the quality control procedures that can be performed by the producers, are discussed. Some applicable best practices that we have experienced regarding quality-control techniques, regulations that define the objectives, and data production procedures are given in the final remarks.
These quality control procedures should include visual checks of the source data, the captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks performed in interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.
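As a small illustration of post-capture geometric and attribute checks of the kind discussed above, the sketch below uses the shapely library; the minimum-area threshold, attribute schema, and feature records are illustrative assumptions, not MGCP/DFDD rules.

```python
# Minimal sketch of automatic post-capture QC checks on vector features:
# geometry validity, a minimum-area threshold, and mandatory attributes.
# Thresholds, schema and records are assumptions for demonstration only.
from shapely.geometry import Polygon

MIN_AREA = 2500.0                              # assumed minimum collectable area (map units^2)
REQUIRED_ATTRIBUTES = {"FCODE", "NAME"}        # assumed mandatory attributes

features = [
    {"attributes": {"FCODE": "AL020", "NAME": "Town A"},
     "geometry": Polygon([(0, 0), (100, 0), (100, 100), (0, 100)])},
    {"attributes": {"FCODE": "AL020"},                                # missing NAME
     "geometry": Polygon([(0, 0), (10, 0), (0, 10), (10, 10)])},      # self-intersecting
]

for i, f in enumerate(features):
    geom, errors = f["geometry"], []
    if not geom.is_valid:
        errors.append("invalid geometry (e.g. self-intersection)")
    if geom.area < MIN_AREA:
        errors.append(f"area {geom.area:.0f} below threshold {MIN_AREA:.0f}")
    missing = REQUIRED_ATTRIBUTES - f["attributes"].keys()
    if missing:
        errors.append(f"missing attributes: {sorted(missing)}")
    print(f"feature {i}: {'OK' if not errors else errors}")
```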
Honoré, Margaux; Leboeuf-Yde, Charlotte; Gagey, Olivier
2018-01-01
Spinal manipulation (SM) has been shown to have an effect on pain perception. More knowledge is needed on this phenomenon and it would be relevant to study its effect in asymptomatic subjects. To compare regional effect of SM on pressure pain threshold (PPT) vs. sham, inactive control, mobilisation, another SM, and some type of physical therapy. In addition, we reported the results for the three different spinal regions. A systematic search of literature was done using PubMed, Embase and Cochrane. Search terms were ((spinal manipulation) AND (experimental pain)); ((spinal manipulative therapy OR spinal manipulation) AND ((experimental pain OR quantitative sensory testing OR pressure pain threshold OR pain threshold)) (Final search: June 13th 2017). The inclusion criteria were SM performed anywhere in the spine; the use of PPT, PPT tested in an asymptomatic region and on the same day as the SM. Studies had to be experimental with at least one external or internal control group. Studies on only spinal motion or tenderness, other reviews, case reports, and less than 15 invited participants in each group were excluded. Evidence tables were constructed with information relevant to each research question and by spinal region. Results were reported in relation to statistical significance and were interpreted taking into account their quality. Only 12 articles of 946 were accepted. The quality of studies was generally good. In 8 sham controlled studies, a psychologically and physiologically "credible" sham was found in only 2 studies. A significant difference was noted between SM vs. Sham, and between SM and an inactive control. No significant difference in PPT was found between SM and another SM, mobilisation or some type of physical therapy. The cervical region more often obtained significant findings as compared to studies in the thoracic or lumbar regions. SM has an effect regionally on pressure pain threshold in asymptomatic subjects. The clinical significance of this must be quantified. More knowledge is needed in relation to the comparison of different spinal regions and different types of interventions.
Red blood cell transfusion for people undergoing hip fracture surgery.
Brunskill, Susan J; Millette, Sarah L; Shokoohi, Ali; Pulford, E C; Doree, Carolyn; Murphy, Michael F; Stanworth, Simon
2015-04-21
The incidence of hip fracture is increasing and it is more common with increasing age. Surgery is used for almost all hip fractures. Blood loss occurs as a consequence of both the fracture and the surgery and thus red blood cell transfusion is frequently used. However, red blood cell transfusion is not without risks. Therefore, it is important to identify the evidence for the effective and safe use of red blood cell transfusion in people with hip fracture. To assess the effects (benefits and harms) of red blood cell transfusion in people undergoing surgery for hip fracture. We searched the Cochrane Bone, Joint and Muscle Trauma Group Specialised Register (31 October 2014), the Cochrane Central Register of Controlled Trials (The Cochrane Library, 2014, Issue 10), MEDLINE (January 1946 to 20 November 2014), EMBASE (January 1974 to 20 November 2014), CINAHL (January 1982 to 20 November 2014), British Nursing Index Database (January 1992 to 20 November 2014), the Systematic Review Initiative's Transfusion Evidence Library, PubMed for e-publications, various other databases and ongoing trial registers. Randomised controlled trials comparing red blood cell transfusion versus no transfusion or an alternative to transfusion, different transfusion protocols or different transfusion thresholds in people undergoing surgery for hip fracture. Three review authors independently assessed each study's risk of bias and extracted data using a study-specific form. We pooled data where there was homogeneity in the trial comparisons and the timing of outcome measurement. We used GRADE criteria to assess the quality (low, moderate or high) of the evidence for each outcome. We included six trials (2722 participants): all compared two thresholds for red blood cell transfusion: a 'liberal' strategy to maintain a haemoglobin concentration of usually 10 g/dL versus a more 'restrictive' strategy based on symptoms of anaemia or a lower haemoglobin concentration, usually 8 g/dL. The exact nature of the transfusion interventions, types of surgery and participants varied between trials. The mean age of participants ranged from 81 to 87 years and approximately 24% of participants were men. The largest trial enrolled 2016 participants, over 60% of whom had a history of cardiovascular disease. The percentage of participants receiving a red blood cell transfusion ranged from 74% to 100% in the liberal transfusion threshold group and from 11% to 45% in the restrictive transfusion threshold group. There were no results available for the smallest trial (18 participants). All studies were at some risk of bias, in particular performance bias relating to the absence of blinding of personnel. We judged that the evidence for all outcomes, except myocardial infarction, was of low quality, reflecting risk of bias primarily from imbalances in protocol violations in the largest trial and imprecision, often because of insufficient events. Thus, further research is likely to have an important impact on these results. There was no evidence of a difference between a liberal versus restricted threshold transfusion in mortality at 30 days post hip fracture surgery (risk ratio (RR) 0.92, 95% confidence interval (CI) 0.67 to 1.26; five trials; 2683 participants; low quality evidence) or at 60 days post surgery (RR 1.08, 95% CI 0.80 to 1.44; three trials; 2283 participants; low quality evidence).
Assuming an illustrative baseline risk of 50 deaths per 1000 participants in the restricted threshold group at 30 days, these data equate to four fewer (95% CI 17 fewer to 14 more) deaths per 1000 in the liberal threshold group at 30 days. There was no evidence of a difference between a liberal versus restricted threshold transfusion in functional recovery at 60 days, assessed in terms of the inability to walk 10 feet (3 m) without human assistance (RR 1.00, 95% CI 0.87 to 1.15; two trials; 2083 participants; low quality evidence). There was low quality evidence of no difference between the transfusion thresholds in postoperative morbidity for the following complications: thromboembolism (RR 1.15 favouring a restrictive threshold, 95% CI 0.56 to 2.37; four trials; 2416 participants), stroke (RR 2.40 favouring a restrictive threshold, 95% CI 0.85 to 6.79; four trials; 2416 participants), wound infection (RR 1.61 favouring a restrictive threshold, 95% CI 0.77 to 3.35; three trials; 2332 participants), respiratory infection (pneumonia) (RR 1.35 favouring a restrictive threshold, 95% CI 0.95 to 1.92; four trials; 2416 participants) and new diagnosis of congestive heart failure (RR 0.77 favouring a liberal threshold, 95% CI 0.48 to 1.23; three trials; 2332 participants). There was very low quality evidence of a lower risk of myocardial infarction in the liberal compared with the restrictive transfusion threshold group (RR 0.59, 95% CI 0.36 to 0.96; three trials; 2217 participants). Assuming an illustrative baseline risk of myocardial infarction of 24 per 1000 participants in the restricted threshold group, this result was compatible with between one and 15 fewer myocardial infarctions in the liberal threshold group. We found low quality evidence of no difference in mortality, functional recovery or postoperative morbidity between 'liberal' versus 'restrictive' thresholds for red blood cell transfusion in people undergoing surgery for hip fracture. Although further research may change the estimates of effect, the currently available evidence does not support the use of liberal red blood cell transfusion thresholds based on a 10 g/dL haemoglobin trigger in preference to more restrictive transfusion thresholds based on lower haemoglobin levels or symptoms of anaemia in these people. Future research needs to address the effectiveness of red blood cell transfusions at different time points in the surgical pathway, whether pre-operative, peri-operative or postoperative. In particular, such research would need to consider people who are symptomatic or haemodynamically unstable who were excluded from most of these trials.
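The absolute effects quoted above follow from applying the risk ratio to the assumed baseline risk; a brief worked version of that arithmetic, rounded as in the review, is shown below.

```latex
% Absolute effect implied by a risk ratio (RR) applied to an assumed baseline risk
% (baseline: 50 deaths per 1000 in the restrictive-threshold group at 30 days).
\[
  \text{risk}_{\text{liberal}} = \text{risk}_{\text{restrictive}} \times \mathrm{RR}
  = 50 \times 0.92 = 46 \text{ per } 1000
  \;\Rightarrow\; \text{4 fewer deaths per } 1000 .
\]
\[
  \text{95\% CI:} \quad 50 \times 0.67 = 33.5 \;\;(\approx 17 \text{ fewer}), \qquad
  50 \times 1.26 = 63 \;\;(\approx 13\text{--}14 \text{ more, i.e. the 14 quoted after rounding}).
\]
```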
MASQOT: a method for cDNA microarray spot quality control
Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan
2005-01-01
Background: cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to prevent the risk of receiving illusive analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results: A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior in identifying unreliable data compared to other evaluated methodologies. Conclusion: The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality which can be utilized as weights in subsequent analysis procedures as well as to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
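A minimal sketch in the spirit of discriminant-analysis-based spot scoring is shown below: a classifier trained on labelled spots yields a continuous quality value that can be thresholded or used as a weight downstream. The descriptors, synthetic data, and cut-off are hypothetical, not the MASQOT descriptors or suggested threshold values.

```python
# Sketch of discriminant-analysis spot quality scoring: train on spots labelled
# reliable/unreliable, output a continuous score, then threshold it.
# Descriptors, data and the 0.5 cut-off are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Hypothetical spot descriptors: [signal-to-noise, circularity, size deviation]
good = rng.normal([8.0, 0.9, 0.1], 0.5, size=(200, 3))
bad  = rng.normal([2.0, 0.5, 0.6], 0.5, size=(200, 3))
X = np.vstack([good, bad])
y = np.r_[np.ones(200), np.zeros(200)]          # 1 = reliable spot, 0 = unreliable

lda = LinearDiscriminantAnalysis().fit(X, y)
quality = lda.predict_proba(X)[:, 1]             # continuous quality score in [0, 1]

THRESHOLD = 0.5                                  # assumed cut-off for discarding spots
keep = quality >= THRESHOLD
print(f"kept {keep.sum()} of {len(keep)} spots; scores usable as weights downstream")
```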
NASA Astrophysics Data System (ADS)
Quinn, J. D.; Reed, P. M.; Keller, K.
2015-12-01
Recent multi-objective extensions of the classical shallow lake problem are useful for exploring the conceptual and computational challenges that emerge when managing irreversible water quality tipping points. Building on this work, we explore a four objective version of the lake problem where a hypothetical town derives economic benefits from polluting a nearby lake, but at the risk of irreversibly tipping the lake into a permanently polluted state. The trophic state of the lake exhibits non-linear threshold dynamics; below some critical phosphorus (P) threshold it is healthy and oligotrophic, but above this threshold it is irreversibly eutrophic. The town must decide how much P to discharge each year, a decision complicated by uncertainty in the natural P inflow to the lake. The shallow lake problem provides a conceptually rich set of dynamics, low computational demands, and a high level of mathematical difficulty. These properties maximize its value for benchmarking the relative merits and limitations of emerging decision support frameworks, such as Direct Policy Search (DPS). Here, we explore the use of DPS as a formal means of developing robust environmental pollution control rules that effectively account for deeply uncertain system states and conflicting objectives. The DPS reformulation of the shallow lake problem shows promise in formalizing pollution control triggers and signposts, while dramatically reducing the computational complexity of the multi-objective pollution control problem. More broadly, the insights from the DPS variant of the shallow lake problem formulated in this study bridge emerging work related to socio-ecological systems management, tipping points, robust decision making, and robust control.
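For orientation, a compact sketch of the stochastic shallow-lake phosphorus dynamics commonly used in this literature is given below, with a fixed annual release standing in for a full Direct Policy Search rule; the parameter values and inflow distribution are illustrative assumptions rather than this study's formulation.

```python
# Sketch of Carpenter-style shallow-lake phosphorus dynamics:
#   X[t+1] = X[t] + a[t] + Y[t] + X[t]^q / (1 + X[t]^q) - b * X[t]
# Parameter values (b, q) and the natural-inflow distribution are assumed here.
import numpy as np

b, q, T = 0.42, 2.0, 100          # P removal rate, recycling steepness, years simulated
rng = np.random.default_rng(1)

def simulate(policy_release, X0=0.0):
    """One century of lake P under a fixed annual release (a simple stand-in for a
    Direct Policy Search rule, which would instead map lake state -> release)."""
    X, trajectory = X0, []
    for _ in range(T):
        a_t = policy_release                                  # town's P discharge this year
        Y_t = rng.lognormal(mean=np.log(0.02), sigma=0.25)    # uncertain natural inflow
        X = X + a_t + Y_t + X**q / (1.0 + X**q) - b * X
        trajectory.append(X)
    return np.array(trajectory)

traj = simulate(policy_release=0.02)
print("final lake P:", round(traj[-1], 3), "| ever above 0.5?", bool((traj > 0.5).any()))
```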
Li, Yangfan; Li, Yi; Wu, Wei
2016-01-01
The concept of thresholds shows important implications for environmental and resource management. Here we derived potential landscape thresholds which indicated abrupt changes in water quality or the dividing points between exceeding and failing to meet national surface water quality standards for a rapidly urbanizing city on the Eastern Coast in China. The analysis of landscape thresholds was based on regression models linking each of the seven water quality variables to each of the six landscape metrics for this coupled land-water system. We found substantial and accelerating urban sprawl at the suburban areas between 2000 and 2008, and detected significant nonlinear relations between water quality and landscape pattern. This research demonstrated that a simple modeling technique could provide insights on environmental thresholds to support more-informed decision making in land use, water environmental and resilience management. Copyright © 2015 Elsevier Ltd. All rights reserved.
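A minimal sketch of deriving such a landscape threshold (the landscape-metric value at which a fitted water-quality model crosses a national standard) is given below; the data, quadratic model form, and standard value are hypothetical.

```python
# Sketch of a "landscape threshold": the landscape-metric value at which a fitted
# water-quality curve crosses a surface-water standard. Data, the quadratic form
# and the standard below are hypothetical, not the study's models or standards.
import numpy as np

urban_pct = np.array([5, 10, 15, 20, 25, 30, 35, 40, 45, 50], dtype=float)
total_n   = np.array([0.6, 0.7, 0.8, 1.0, 1.3, 1.7, 2.2, 2.8, 3.5, 4.3])  # TN, mg/L

coeffs = np.polyfit(urban_pct, total_n, deg=2)      # simple nonlinear (quadratic) fit
STANDARD = 2.0                                      # illustrative TN limit, mg/L

# Threshold = smallest positive root of fitted_curve(x) - STANDARD = 0
roots = np.roots(coeffs - np.array([0.0, 0.0, STANDARD]))
threshold = min(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)
print(f"urban land-cover threshold ~ {threshold:.1f} % for a TN standard of {STANDARD} mg/L")
```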
Summary of the effects of engine throttle response on airplane formation-flying qualities
NASA Technical Reports Server (NTRS)
Walsh, Kevin R.
1993-01-01
A flight evaluation was conducted to determine the effect of engine throttle response characteristics on precision formation-flying qualities. A variable electronic throttle control system was developed and flight-tested on a TF-104G airplane with a J79-11B engine at the NASA Dryden Flight Research Facility. This airplane was chosen because of its known, very favorable thrust response characteristics. Ten research flights were flown to evaluate the effects of throttle gain, time delay, and fuel control rate limiting on engine handling qualities during a demanding precision wing formation task. Handling quality effects of lag filters and lead compensation time delays were also evaluated. The Cooper and Harper Pilot Rating Scale was used to assign levels of handling quality. Data from pilot ratings and comments indicate that throttle control system time delays and rate limits cause significant degradations in handling qualities. Threshold values for satisfactory (level 1) and adequate (level 2) handling qualities of these key variables are presented. These results may provide engine manufacturers with guidelines to assure satisfactory handling qualities in future engine designs.
da Silva, Mariana Moreira; Albertini, Regiane; de Tarso Camillo de Carvalho, Paulo; Leal-Junior, Ernesto Cesar Pinto; Bussadori, Sandra Kalil; Vieira, Stella Sousa; Bocalini, Danilo Sales; de Oliveira, Luis Vicente Franco; Grandinetti, Vanessa; Silva, José Antonio; Serra, Andrey Jorge
2018-02-01
This study evaluated the role of the phototherapy and exercise training (EXT) as well as the combined treatment in general symptoms, pain, and quality of life in women suffering from fibromyalgia (FM). A total of 160 women were enrolled and measures were carried out in two sets: it was sought to identify the acute effect for a single phototherapy and EXT session (Set 1); long-term effect (10 weeks) of the interventions (Set 2). Phototherapy irradiation was performed at 11 locations in their bodies, employing a cluster with nine diodes (one super-pulsed infrared 905 nm, four light-emitting diodes [LEDs] of 640 nm, and four LEDs of 875 nm, 39.3 J per location). Algometry and VAS instrument were applied to evaluate pain. The FM symptoms were evaluated with Fibromyalgia Impact Questionnaire (FIQ) and Research Diagnostic Criteria (RDC) instruments. Quality of life was assessed through SF-36 survey. Set 1: pain threshold was improved with the phototherapy, and EXT improved the pain threshold for temporomandibular joint (right and left body side) and occipital site (right body side). Set 2: there was improved pain threshold in several tender points with the phototherapy and EXT. There was an overlap of therapies to reduce the tender point numbers, anxiety, depression, fatigue, sleep, and difficulty sleeping on FIQ/RDC scores. Moreover, quality of life was improved with both therapies. The phototherapy and EXT improved the pain threshold in FM women. A more substantial effect was noticed for the combined therapy, in which pain relief was accomplished by improving VAS and FIQ scores as well as quality of life.
Gandjour, Afschin
2015-01-01
In Germany, the Institute for Quality and Efficiency in Health Care (IQWiG) makes recommendations for reimbursement prices of drugs on the basis of a proportional relationship between costs and health benefits. This paper analyzed the potential of IQWiG's decision rule to control health expenditures and used a cost-per-quality-adjusted life year (QALY) rule as a comparison. A literature search was conducted, and a theoretical model of health expenditure growth was built. The literature search shows that the median incremental cost-effectiveness ratio of German cost-effectiveness analyses was €7650 per QALY gained, thus yielding a much lower threshold cost-effectiveness ratio for IQWiG's rule than an absolute rule at €30 000 per QALY. The theoretical model shows that IQWiG's rule is able to contain the long-term growth of health expenditures under the conservative assumption that future health increases at a constant absolute rate and that the threshold incremental cost-effectiveness ratio increases at a smaller rate than health expenditures. In contrast, an absolute rule offers the potential for manufacturers to raise drug prices in response to the threshold, thus resulting in an initial spike in expenditures. Results suggest that IQWiG's proportional rule will lead to lower drug prices and a slower growth of health expenditures than an absolute cost-effectiveness threshold at €30 000 per QALY. This finding is surprising as IQWiG's rule, in contrast to a cost-per-QALY rule, does not start from a fixed budget. Copyright © 2014 John Wiley & Sons, Ltd.
1998-05-01
ROG: reactive organic compound; RONA: Record of Non-applicability; RTV: rational threshold value; RWQCB: Regional Water Quality Control Board; SARA ...over water. The ranges are either scheduled via a designated military or civilian controlling agency (for restricted or warning areas) or are used...operations areas (MOAs), and air traffic control authorized airspace (ATCAA). Airspace designations throughout the United States are controlled by the Federal
Mathany, Timothy M.; Land, Michael; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 860 square-mile Coastal Los Angeles Basin study unit (CLAB) was investigated from June to November of 2006 as part of the Statewide Basin Assessment Project of the Ground-Water Ambient Monitoring and Assessment (GAMA) Program. The GAMA Statewide Basin Assessment was developed in response to the Ground-Water Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Coastal Los Angeles Basin study was designed to provide a spatially unbiased assessment of raw ground-water quality within CLAB, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 69 wells in Los Angeles and Orange Counties. Fifty-five of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area ('grid wells'). Fourteen additional wells were selected to evaluate changes in ground-water chemistry or to gain a greater understanding of the ground-water quality within a specific portion of the Coastal Los Angeles Basin study unit ('understanding wells'). Ground-water samples were analyzed for: a large number of synthetic organic constituents [volatile organic compounds (VOCs), gasoline oxygenates and their degradates, pesticides, polar pesticides, and pesticide degradates, pharmaceutical compounds, and potential wastewater-indicators]; constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), 1,4-dioxane, and 1,2,3-trichloropropane (1,2,3-TCP)]; inorganic constituents that can occur naturally [nutrients, major and minor ions, and trace elements]; radioactive constituents [gross-alpha and gross-beta radiation, radium isotopes, and radon-222]; and microbial indicators. Naturally occurring isotopes [stable isotopic ratios of hydrogen and oxygen, and activities of tritium and carbon-14] and dissolved noble gases also were measured to help identify the sources and ages of the sampled ground water. Quality-control samples (blanks, replicates, and samples for matrix spikes) were collected at approximately one-fourth of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a significant source of bias. Differences between replicate samples were within acceptable ranges, indicating acceptably low variability. Matrix spike recoveries were within acceptable ranges for most compounds. Assessment of the quality-control information resulted in applying 'V' codes to approximately 0.1 percent of the data collected for ground-water samples (meaning a constituent was detected in blanks as well as the corresponding environmental data). This study did not attempt to evaluate the quality of drinking water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable drinking-water quality. Regulatory thresholds are applied to the treated drinking water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and non-regulatory health-based thresholds established by the U.S.
Environmental Protection Agency (USEPA), California Department of Public Health (CDPH, formerly California Department of Health Services [CADHS]) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Comparisons between data collected for this study and drinking-water thresholds are for illustrative purposes only, and are not indicative of compliance or non-compliance with those thresholds. VOCs were detected in alm
Wetzel, Hermann
2006-01-01
In a large number of mostly retrospective association studies, a statistical relationship between volume and quality of health care has been reported. However, the relevance of these results is frequently limited by methodological shortcomings. In this article, criteria for the evidence and definition of thresholds for volume-outcome relations are proposed, e.g. the specification of relevant outcomes for quality indicators, analysis of volume as a continuous variable with an adequate case-mix and risk adjustment, accounting for cluster effects and considering mathematical models for the derivation of cut-off values. Moreover, volume thresholds are regarded as surrogate parameters for the indirect classification of the quality of care, whose diagnostic validity and effectiveness in improving health care quality need to be evaluated in prospective studies.
Spirometry in 3-5-year-old children with asthma.
Nève, Véronique; Edmé, Jean-Louis; Devos, Patrick; Deschildre, Antoine; Thumerelle, Caroline; Santos, Clarisse; Methlin, Catherine-Marie; Matran, Murielle; Matran, Régis
2006-08-01
Spirometry with incentive games was applied to 207 2-5-year-old preschool children (PSC) with asthma in order to refine the quality-control criteria proposed by Aurora et al. (Am J Respir Crit Care Med 2004;169:1152-1159). The data set in our study was much larger compared to that in Aurora et al. (Am J Respir Crit Care Med 2004;169:1152-1159), where 42 children with cystic fibrosis and 37 healthy controls were studied. At least two acceptable maneuvers were obtained in 178 (86%) children. Data were focused on 3-5-year-old children (n = 171). The proportion of children achieving a larger number of thresholds for each quality-control criterion (backward-extrapolated volume (Vbe), Vbe in percent of forced vital capacity (FVC, Vbe/FVC), time-to-peak expiratory flow (time-to-PEF), and difference (Delta) between the two FVCs (DeltaFVC), forced expiratory volume in 1 sec (DeltaFEV(1)), and forced expiratory volume in 0.5 sec (DeltaFEV(0.5)) from the two "best" curves) was calculated, and cumulative plots were obtained. The optimal threshold was determined for all ages from the derivative of the success rate versus threshold curves, close to the inflexion point. The following thresholds were defined for acceptability: Vbe
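A minimal sketch of locating a quality-control threshold near the inflexion point of a success-rate versus threshold curve via a numerical derivative is shown below; the Vbe/FVC values and success rates are hypothetical, not the study's data.

```python
# Sketch: pick the quality-control threshold near the inflexion point of the
# "proportion of children succeeding" versus "threshold value" curve, i.e. where
# the numerical first derivative is largest. Values below are hypothetical.
import numpy as np

vbe_fvc_threshold = np.array([2.5, 5.0, 7.5, 10.0, 12.5, 15.0, 17.5, 20.0])  # percent
success_rate      = np.array([0.18, 0.35, 0.62, 0.83, 0.92, 0.96, 0.98, 0.99])

slope = np.gradient(success_rate, vbe_fvc_threshold)   # first derivative of the curve
idx = int(np.argmax(slope))                            # steepest rise ~ inflexion point
print(f"threshold near the inflexion point: Vbe/FVC ~ {vbe_fvc_threshold[idx]:.1f} %")
```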
Yang, Kun; Yu, Zhenyu; Luo, Yi; Yang, Yang; Zhao, Lei; Zhou, Xiaolu
2018-05-15
Global warming and rapid urbanization in China have caused a series of ecological problems. One consequence has involved the degradation of lake water environments. Lake surface water temperatures (LSWTs) significantly shape water ecological environments and are highly correlated with the watershed ecosystem features and biodiversity levels. Analysing and predicting spatiotemporal changes in LSWT and exploring the corresponding impacts on water quality is essential for controlling and improving the ecological water environment of watersheds. In this study, Dianchi Lake was examined through an analysis of 54 water quality indicators from 10 water quality monitoring sites from 2005 to 2016. Support vector regression (SVR), Principal Component Analysis (PCA) and Back Propagation Artificial Neural Network (BPANN) methods were applied to form a hybrid forecasting model. A geospatial analysis was conducted to observe historical LSWTs and water quality changes for Dianchi Lake from 2005 to 2016. Based on the constructed model, LSWTs and changes in water quality were simulated for 2017 to 2020. The relationship between LSWTs and water quality thresholds was studied. The results show limited errors and highly generalized levels of predictive performance. In addition, a spatial visualization analysis shows that from 2005 to 2020, the chlorophyll-a (Chla), chemical oxygen demand (COD) and total nitrogen (TN) diffused from north to south and that ammonia nitrogen (NH3-N) and total phosphorus (TP) levels are increasing in the northern part of Dianchi Lake, where the LSWT levels exceed 17°C. The LSWT threshold is 17.6-18.53°C, which falls within the threshold for nutritional water quality, but COD and TN levels fall below V class water quality standards. Transparency (Trans), COD, biochemical oxygen demand (BOD) and Chla levels present a close relationship with LSWT, and LSWTs are found to fundamentally affect lake cyanobacterial blooms. Copyright © 2017 Elsevier B.V. All rights reserved.
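A minimal sketch of a PCA-plus-SVR stage similar in spirit to the hybrid model described above is given below (the BPANN component is omitted); the synthetic data merely stand in for the 54 water-quality indicators, and nothing here reproduces the study's actual model.

```python
# Sketch of a PCA + SVR forecasting stage (BPANN component omitted). The synthetic
# data stand in for 54 water-quality indicators; the target mimics an LSWT series.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 54))                                # 300 observations x 54 indicators
y = 17.0 + X[:, :3].sum(axis=1) + rng.normal(0, 0.3, 300)     # synthetic LSWT target (deg C)

model = make_pipeline(StandardScaler(), PCA(n_components=10), SVR(kernel="rbf", C=10.0))
model.fit(X[:250], y[:250])                                   # train on the first 250 rows
pred = model.predict(X[250:])                                 # predict the hold-out rows
rmse = np.sqrt(np.mean((pred - y[250:]) ** 2))
print(f"hold-out RMSE: {rmse:.2f} deg C")
```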
Burton, Carmen A.; Belitz, Kenneth
2008-01-01
Ground-water quality in the approximately 3,800 square-mile Southeast San Joaquin Valley study unit (SESJ) was investigated from October 2005 through February 2006 as part of the Priority Basin Assessment Project of the Ground-Water Ambient Monitoring and Assessment (GAMA) Program. The GAMA Statewide Basin Assessment project was developed in response to the Ground-Water Quality Monitoring Act of 2001 and is being conducted by the California State Water Resources Control Board (SWRCB) in collaboration with the U.S. Geological Survey (USGS) and the Lawrence Livermore National Laboratory (LLNL). The SESJ study was designed to provide a spatially unbiased assessment of raw ground-water quality within SESJ, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 99 wells in Fresno, Tulare, and Kings Counties, 83 of which were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 16 of which were sampled to evaluate changes in water chemistry along ground-water flow paths or across alluvial fans (understanding wells). The ground-water samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (perchlorate, N-nitrosodimethylamine, and 1,2,3-trichloropropane), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, and carbon-14, and stable isotopes of hydrogen, oxygen, nitrogen, and carbon), and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. Quality-control samples (blanks, replicates, samples for matrix spikes) were collected at approximately 10 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the ground-water samples. Assessment of the quality-control data resulted in censoring of less than 1 percent of the detections of constituents measured in ground-water samples. This study did not attempt to evaluate the quality of drinking water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable drinking-water quality. Regulatory thresholds apply to the treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and other health-based thresholds established by the U.S. Environmental Protection Agency and California Department of Public Health (CDPH) and thresholds established for aesthetic concerns by CDPH. Two VOCs were detected above health-based thresholds: 1,2-dibromo-3-chloropropane (DBCP) and benzene. DBCP was detected above the U.S. Environmental Protection Agency's maximum contaminant level (MCL-US) in three grid wells and five understanding wells. Benzene was detected above the CDPH's maximum contaminant level (MCL-CA) in one grid well. All pesticide detections were below health-based thresholds. Perchlorate was detected above its maximum contaminant level for California in one grid well.
Nitrate was detected above the MCL-US in six samples from understanding wells, of which one was a public supply well. Two trace elements were detected above MCLs-US: arsenic and uranium. Arsenic was detected above the MCL-US in four grid wells and two understanding wells; uranium was detected above the MCL-US in one grid well and one understanding well. Gross alpha radiation was detected above MCLs-US in five samples; four of them understanding wells, and uranium isotope activity was greater than the MCL-US for one understanding well
Cost-Effectiveness of Evaluating the New Technologies.
ERIC Educational Resources Information Center
Kastner, Theodore A.
1997-01-01
This commentary on a study comparing use of the brand name drug Depakene with generic valproic acid to control seizures in people with mental retardation focuses on issues of cost-effectiveness. It notes existing guidelines for pharmacoeconomic evaluation and suggests a possible model to include a threshold price (per quality-adjusted life year)…
Ertl, Peter; Kruse, Annika; Tilp, Markus
2016-10-01
The aim of the current paper was to systematically review the relevant existing electromyographic threshold concepts within the literature. The electronic databases MEDLINE and SCOPUS were screened for papers published between January 1980 and April 2015 including the keywords: neuromuscular fatigue threshold, anaerobic threshold, electromyographic threshold, muscular fatigue, aerobic-anaerobic transition, ventilatory threshold, exercise testing, and cycle-ergometer. 32 articles were assessed with regard to their electromyographic methodologies, description of results, statistical analysis and test protocols. Only one article was of very good quality, 21 were of good quality, and two articles were of very low quality. The review process revealed that: (i) there is consistent evidence of one or two non-linear increases of EMG that might reflect the additional recruitment of motor units (MU) or different fiber types during fatiguing cycle ergometer exercise, (ii) most studies reported no statistically significant difference between electromyographic and metabolic thresholds, (iii) one minute protocols with increments between 10 and 25 W appear most appropriate to detect the muscular threshold, (iv) threshold detection from the vastus medialis, vastus lateralis, and rectus femoris is recommended, and (v) there is a great variety in study protocols, measurement techniques, and data processing. Therefore, we recommend further research and standardization in the detection of EMGTs. Copyright © 2016 Elsevier Ltd. All rights reserved.
[A quality controllable algorithm for ECG compression based on wavelet transform and ROI coding].
Zhao, An; Wu, Baoming
2006-12-01
This paper presents an ECG compression algorithm based on wavelet transform and region of interest (ROI) coding. The algorithm achieves near-lossless coding in the ROI and quality-controllable lossy coding outside of the ROI. After mean removal of the original signal, multi-layer orthogonal discrete wavelet transform is performed. Simultaneously, feature extraction is performed on the original signal to find the position of the ROI. The coefficients related to the ROI are treated as important coefficients and kept. For the coefficients outside the ROI, the energy loss of the transform domain is calculated according to the target PRDBE (Percentage Root-mean-square Difference with Baseline Eliminated), and then the threshold for the coefficients outside of the ROI is determined according to that loss of energy. The important coefficients, which include the coefficients of the ROI and the coefficients outside the ROI that are larger than the threshold, are put into a linear quantizer. The map, which records the positions of the important coefficients in the original wavelet coefficient vector, is compressed with a run-length encoder. Huffman coding has been applied to improve the compression ratio. ECG signals taken from the MIT/BIH arrhythmia database are tested, and satisfactory results in terms of clinical information preservation, quality, and compression ratio are obtained.
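A minimal sketch of wavelet-domain compression with ROI protection is shown below using the pywt package; the synthetic signal, ROI location, and magnitude threshold are assumptions, and the PRDBE-driven threshold search, quantization, run-length and Huffman coding stages of the paper are omitted.

```python
# Sketch of wavelet-domain ECG compression with region-of-interest (ROI) protection:
# coefficients whose support falls inside the ROI are kept; coefficients outside it
# are zeroed when below a magnitude threshold. Signal, ROI and threshold are
# illustrative; this is not the paper's exact algorithm.
import numpy as np
import pywt

fs, n = 360, 1024
t = np.arange(n) / fs
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t)                  # synthetic baseline wander
ecg[500:520] += np.hanning(20) * 1.5                     # a crude "QRS" bump
ecg -= ecg.mean()                                        # mean removal

roi = (480, 540)                                         # ROI sample range around the bump
coeffs = pywt.wavedec(ecg, "db4", level=5)

threshold = 0.02                                         # assumed magnitude cut-off
kept = [coeffs[0]]                                       # keep the approximation intact
for detail in coeffs[1:]:
    scale = len(detail) / n                              # map sample index -> coeff index
    lo, hi = int(roi[0] * scale), int(roi[1] * scale) + 1
    mask = np.zeros_like(detail, dtype=bool)
    mask[lo:hi] = True                                   # ROI coefficients: always keep
    mask |= np.abs(detail) > threshold                   # large coefficients: keep
    kept.append(np.where(mask, detail, 0.0))

recon = pywt.waverec(kept, "db4")[:n]
prd = 100 * np.linalg.norm(ecg - recon) / np.linalg.norm(ecg)
nonzero = sum(int(np.count_nonzero(c)) for c in kept)
print(f"PRD ~ {prd:.2f} %, non-zero coefficients: {nonzero} of {sum(len(c) for c in coeffs)}")
```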
Assessing Personal Qualities in Medical School Admissions.
ERIC Educational Resources Information Center
Albanese, Mark A.; Snow, Mikel H.; Skochelak, Susan E.; Huggett, Kathryn N.; Farrell, Philip M.
2003-01-01
Analyzes the challenges to using academic measures (MCAT scores and GPAs) as thresholds for medical school admissions and, for applicants exceeding the threshold, using personal qualities for admission decisions; reviews the literature on using the medical school interview and other admission data to assess personal qualities of applicants;…
Müller, Dirk; Pulm, Jannis; Gandjour, Afschin
2012-01-01
To compare cost-effectiveness modeling analyses of strategies to prevent osteoporotic and osteopenic fractures either based on fixed thresholds using bone mineral density or based on variable thresholds including bone mineral density and clinical risk factors. A systematic review was performed by using the MEDLINE database and reference lists from previous reviews. On the basis of predefined inclusion/exclusion criteria, we identified relevant studies published since January 2006. Articles included for the review were assessed for their methodological quality and results. The literature search resulted in 24 analyses, 14 of them using a fixed-threshold approach and 10 using a variable-threshold approach. On average, 70% of the criteria for methodological quality were fulfilled, but almost half of the analyses did not include medication adherence in the base case. The results of variable-threshold strategies were more homogeneous and showed more favorable incremental cost-effectiveness ratios compared with those based on a fixed threshold with bone mineral density. For analyses with fixed thresholds, incremental cost-effectiveness ratios varied from €80,000 per quality-adjusted life-year in women aged 55 years to cost saving in women aged 80 years. For analyses with variable thresholds, the range was €47,000 to cost savings. Risk assessment using variable thresholds appears to be more cost-effective than selecting high-risk individuals by fixed thresholds. Although the overall quality of the studies was fairly good, future economic analyses should further improve their methods, particularly in terms of including more fracture types, incorporating medication adherence, and including or discussing unrelated costs during added life-years. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Bowes, M J; Loewenthal, M; Read, D S; Hutchins, M G; Prudhomme, C; Armstrong, L K; Harman, S A; Wickham, H D; Gozzard, E; Carvalho, L
2016-11-01
River phytoplankton blooms can pose a serious risk to water quality and the structure and function of aquatic ecosystems. Developing a greater understanding of the physical and chemical controls on the timing, magnitude and duration of blooms is essential for the effective management of phytoplankton development. Five years of weekly water quality monitoring data along the River Thames, southern England were combined with hourly chlorophyll concentration (a proxy for phytoplankton biomass), flow, temperature and daily sunlight data from the mid-Thames. Weekly chlorophyll data was of insufficient temporal resolution to identify the causes of short term variations in phytoplankton biomass. However, hourly chlorophyll data enabled identification of thresholds in water temperature (between 9 and 19°C) and flow (<30 m³ s⁻¹) that explained the development of phytoplankton populations. Analysis showed that periods of high phytoplankton biomass and growth rate only occurred when flow and temperature conditions were within these thresholds, and coincided with periods of long sunshine duration, indicating multiple stressor controls. Nutrient concentrations appeared to have no impact on the timing or magnitude of phytoplankton bloom development, but severe depletion of dissolved phosphorus and silicon during periods of high phytoplankton biomass may have contributed to some bloom collapses through nutrient limitation. This study indicates that for nutrient enriched rivers such as the Thames, manipulating residence time (through removing impoundments) and light/temperature (by increasing riparian tree shading) may offer more realistic solutions than reducing phosphorus concentrations for controlling excessive phytoplankton biomass. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
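A minimal sketch of flagging bloom-favourable hours from the reported temperature and flow thresholds, plus an assumed sunshine criterion, is given below with synthetic data.

```python
# Sketch: flag "bloom-favourable" conditions using the thresholds reported above
# (water temperature between 9 and 19 deg C, flow below 30 m^3/s) plus an ASSUMED
# sunshine criterion. The hourly data here are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_hours = 24 * 30
temp_c   = rng.uniform(4, 24, size=n_hours)       # hourly water temperature, deg C
flow_m3s = rng.uniform(10, 80, size=n_hours)      # hourly river flow, m^3/s
sun_h    = rng.uniform(0, 14, size=n_hours)       # daily sunshine hours, repeated hourly

favourable = (temp_c > 9) & (temp_c < 19) & (flow_m3s < 30) & (sun_h > 8)
print(f"{favourable.mean():.1%} of hours meet the combined bloom-favourable criteria")
```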
Setting limits: Using air pollution thresholds to protect and restore U.S. ecosystems
Fenn, M.E.; Lambert, K.F.; Blett, T.F.; Burns, Douglas A.; Pardo, L.H.; Lovett, Gary M.; Haeuber, R. A.; Evers, D.C.; Driscoll, C.T.; Jeffries, D.S.
2011-01-01
More than four decades of research provide unequivocal evidence that sulfur, nitrogen, and mercury pollution have altered, and will continue to alter, our nation's lands and waters. The emission and deposition of air pollutants harm native plants and animals, degrade water quality, affect forest productivity, and are damaging to human health. Many air quality policies limit emissions at the source but these control measures do not always consider ecosystem impacts. Air pollution thresholds at which ecological effects are observed, such as critical loads, are effective tools for assessing the impacts of air pollution on essential ecosystem services and for informing public policy. U.S. ecosystems can be more effectively protected and restored by using a combination of emissions-based approaches and science-based thresholds of ecosystem damage. Based on the results of a comprehensive review of air pollution thresholds, we conclude: • Ecosystem services such as air and water purification, decomposition and detoxification of waste materials, climate regulation, regeneration of soil fertility, production and biodiversity maintenance, as well as crop, timber and fish supplies are impacted by deposition of nitrogen, sulfur, mercury and other pollutants. The consequences of these changes may be difficult or impossible to reverse as impacts cascade throughout affected ecosystems. • The effects of too much nitrogen are common across the U.S. and include altered plant and lichen communities, enhanced growth of invasive species, eutrophication and acidification of lands and waters, and habitat deterioration for native species, including endangered species. • Lake, stream and soil acidification is widespread across the eastern United States. Up to 65% of lakes within sensitive areas receive acid deposition that exceeds critical loads. • Mercury contamination adversely affects fish in many inland and coastal waters. Fish consumption advisories for mercury exist in all 50 states and on many tribal lands. High concentrations of mercury in wildlife are also widespread and have multiple adverse effects. • Air quality programs, such as those stemming from the 1990 Clean Air Act Amendments, have helped decrease air pollution even as population and energy demand have increased. Yet, they do not adequately protect ecosystems from long-term damage. Moreover they do not address ammonia emissions. • A stronger ecosystem basis for air pollutant policies could be established through adoption of science-based thresholds. Existing monitoring programs track vital information needed to measure the response to policies, and could be expanded to include appropriate chemical and biological indicators for terrestrial and aquatic ecosystems and establishment of a national ecosystem monitoring network for mercury. The development and use of air pollution thresholds for ecosystem protection and management is increasing in the United States, yet threshold approaches remain underutilized. Ecological thresholds for air pollution, such as critical loads for nitrogen and sulfur deposition, are not currently included in the formal regulatory process for emissions controls in the United States, although they are now considered in local management decisions by the National Park Service and U.S. Forest Service. Ecological thresholds offer a scientifically sound approach to protecting and restoring U.S. ecosystems and an important tool for natural resource management and policy. © The Ecological Society of America.
NASA Astrophysics Data System (ADS)
Oby, Emily R.; Perel, Sagi; Sadtler, Patrick T.; Ruff, Douglas A.; Mischel, Jessica L.; Montez, David F.; Cohen, Marlene R.; Batista, Aaron P.; Chase, Steven M.
2016-06-01
Objective. A traditional goal of neural recording with extracellular electrodes is to isolate action potential waveforms of an individual neuron. Recently, in brain-computer interfaces (BCIs), it has been recognized that threshold crossing events of the voltage waveform also convey rich information. To date, the threshold for detecting threshold crossings has been selected to preserve single-neuron isolation. However, the optimal threshold for single-neuron identification is not necessarily the optimal threshold for information extraction. Here we introduce a procedure to determine the best threshold for extracting information from extracellular recordings. We apply this procedure in two distinct contexts: the encoding of kinematic parameters from neural activity in primary motor cortex (M1), and visual stimulus parameters from neural activity in primary visual cortex (V1). Approach. We record extracellularly from multi-electrode arrays implanted in M1 or V1 in monkeys. Then, we systematically sweep the voltage detection threshold and quantify the information conveyed by the corresponding threshold crossings. Main Results. The optimal threshold depends on the desired information. In M1, velocity is optimally encoded at higher thresholds than speed; in both cases the optimal thresholds are lower than are typically used in BCI applications. In V1, information about the orientation of a visual stimulus is optimally encoded at higher thresholds than is visual contrast. A conceptual model explains these results as a consequence of cortical topography. Significance. How neural signals are processed impacts the information that can be extracted from them. Both the type and quality of information contained in threshold crossings depend on the threshold setting. There is more information available in these signals than is typically extracted. Adjusting the detection threshold to the parameter of interest in a BCI context should improve our ability to decode motor intent, and thus enhance BCI control. Further, by sweeping the detection threshold, one can gain insights into the topographic organization of the nearby neural tissue.
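A minimal sketch of the threshold-sweeping idea is given below: synthetic extracellular traces are thresholded at several levels and the discriminability of the resulting crossing counts between two conditions serves as a crude stand-in for the information analyses described; none of the data or the d'-style metric comes from the paper.

```python
# Sketch: sweep a voltage detection threshold and ask how well the resulting
# threshold-crossing counts separate two stimulus/movement conditions.
# Synthetic recordings; the d'-style metric is only a crude proxy for information.
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_samples = 200, 3000
condition = rng.integers(0, 2, n_trials)                 # two conditions, 0 or 1

# Synthetic extracellular traces: unit-variance noise plus condition-dependent troughs.
traces = rng.normal(0, 1.0, size=(n_trials, n_samples))
for i, c in enumerate(condition):
    n_events = rng.poisson(20 + 25 * c)                  # condition 1 -> more events
    idx = rng.integers(0, n_samples, n_events)
    traces[i, idx] -= rng.uniform(2.5, 6.0, n_events)    # spike-like negative deflections

for k in (-2.0, -3.0, -4.0, -5.0):                       # thresholds in units of noise RMS
    below = traces < k
    crossings = (below[:, 1:] & ~below[:, :-1]).sum(axis=1)   # downward crossings per trial
    a, b = crossings[condition == 0], crossings[condition == 1]
    d_prime = abs(a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))
    print(f"threshold {k:+.1f} x RMS: discriminability d' ~ {d_prime:.2f}")
```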
Densmore, Jill N.; Fram, Miranda S.; Belitz, Kenneth
2009-01-01
Ground-water quality in the approximately 1,630 square-mile Owens and Indian Wells Valleys study unit (OWENS) was investigated in September-December 2006 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in collaboration with the California State Water Resources Control Board (SWRCB). The Owens and Indian Wells Valleys study was designed to provide a spatially unbiased assessment of raw ground-water quality within the OWENS study unit, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 74 wells in Inyo, Kern, Mono, and San Bernardino Counties. Fifty-three of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study area (grid wells), and 21 wells were selected to evaluate changes in water chemistry in areas of interest (understanding wells). The ground-water samples were analyzed for a large number of synthetic organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates, pharmaceutical compounds, and potential wastewater-indicator compounds], constituents of special interest [perchlorate, N-nitrosodimethylamine (NDMA), and 1,2,3-trichloropropane (1,2,3-TCP)], naturally occurring inorganic constituents [nutrients, major and minor ions, and trace elements], radioactive constituents, and microbial indicators. Naturally occurring isotopes [tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water] and dissolved noble gases also were measured to help identify the source and age of the sampled ground water. This study evaluated the quality of raw ground water in the aquifer in the OWENS study unit and did not attempt to evaluate the quality of treated water delivered to consumers. Water supplied to consumers typically is treated after withdrawal from the ground, disinfected, and blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH) and non-regulatory thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. VOCs and pesticides were detected in samples from less than one-third of the grid wells; all detections were below health-based thresholds, and most were less than one one-hundredth of threshold values. All detections of perchlorate and nutrients in samples from OWENS were below health-based thresholds. Most detections of trace elements in ground-water samples from OWENS wells were below health-based thresholds. In samples from the 53 grid wells, three constituents were detected at concentrations above USEPA maximum contaminant levels: arsenic in 5 samples, uranium in 4 samples, and fluoride in 1 sample.
Two constituents were detected at concentrations above CDPH notification levels (boron in 9 samples and vanadium in 1 sample), and two were above USEPA lifetime health advisory levels (molybdenum in 3 samples and strontium in 1 sample). Most of the samples from OWENS wells had concentrations of major elements, TDS, and trace elements below the non-enforceable standards set for aesthetic concerns. Samples from nine grid wells had concentrations of manganese, iron, or TDS above the SMCL-CAs.
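The screening step common to these GAMA assessments — comparing raw ground-water concentrations against health-based thresholds — reduces to a table lookup. The sketch below is illustrative only: the well results are invented, and the threshold values (taken from the USEPA MCL list for arsenic, uranium, and fluoride) should be verified against current regulations before any real use.

```python
# Minimal screening sketch: flag concentrations that exceed a health-based threshold.
# Thresholds and sample values are illustrative; verify against current USEPA/CDPH tables.
mcl_ug_per_L = {"arsenic": 10.0, "uranium": 30.0, "fluoride": 4000.0}

samples = {  # hypothetical grid-well results, micrograms per liter
    "well_01": {"arsenic": 12.3, "uranium": 5.1, "fluoride": 800.0},
    "well_02": {"arsenic": 2.0, "uranium": 41.0, "fluoride": 300.0},
}

for well, results in samples.items():
    for analyte, conc in results.items():
        threshold = mcl_ug_per_L[analyte]
        status = "ABOVE" if conc > threshold else "below"
        print(f"{well}: {analyte} = {conc:g} ug/L ({status} MCL of {threshold:g} ug/L)")
```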
Kent, Robert; Belitz, Kenneth
2009-01-01
Ground-water quality in the approximately 1,000-square-mile Upper Santa Ana Watershed study unit (USAW) was investigated from November 2006 through March 2007 as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project was developed in response to the Groundwater Quality Monitoring Act of 2001, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Upper Santa Ana Watershed study was designed to provide a spatially unbiased assessment of raw ground-water quality within USAW, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 99 wells in Riverside and San Bernardino Counties. Ninety of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study unit (grid wells). Nine wells were selected to provide additional understanding of specific water-quality issues identified within the basin (understanding wells). The ground-water samples were analyzed for a large number of organic constituents (volatile organic compounds [VOCs], pesticides and pesticide degradates, pharmaceutical compounds, and potential wastewater-indicator compounds), constituents of special interest (perchlorate, N-nitrosodimethylamine [NDMA], 1,4-dioxane, and 1,2,3-trichloropropane [1,2,3-TCP]), naturally occurring inorganic constituents (nutrients, major and minor ions, and trace elements), radioactive constituents, and microbial indicators. Naturally occurring isotopes (tritium, carbon-14, and stable isotopes of hydrogen and oxygen in water) and dissolved noble gases also were measured to help identify sources and ages of the sampled ground water. Dissolved gases and isotopes of nitrogen gas and of dissolved nitrate also were measured in order to investigate the sources and occurrence of nitrate in the study unit. In total, nearly 400 constituents and water-quality indicators were investigated for this study. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain acceptable water quality. Regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. However, to provide some context for the results, concentrations of constituents measured in the raw ground water were compared with regulatory and non-regulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and the California Department of Public Health (CDPH) and thresholds established for aesthetic concerns (secondary maximum contaminant levels, SMCL-CA) by CDPH. Volatile organic compounds (VOCs) were detected in more than 80 percent of USAW grid wells. Most VOCs detected were at concentrations far less than thresholds established for drinking water to protect human health; however, six wells had VOC concentrations above health-based thresholds. Twenty-four of the 85 VOCs investigated were detected in the study unit; 11 were detected in more than 10 percent of the wells. The VOCs detected above health-based thresholds in at least one well were dibromochloropropane (DBCP), tetrachloroethene (PCE), trichloroethene (TCE), carbon tetrachloride, and 1,1-dichloroethene. Pesticide compounds were detected in more than 75 percent of the grid wells.
However, of the 134 different pesticide compounds investigated, 13 were detected at concentrations greater than their respective long-term method detection limits, and only 7 compounds (all herbicides or herbicide degradates) were detected in more than 10 percent of the wells. No pesticide compound was detected above its health-based threshold, although thresholds exist for fewer than half of the pesticide compounds investigated.
Radcliffe, Michael J; Lewith, George T; Turner, Richard G; Prescott, Philip; Church, Martin K; Holgate, Stephen T
2003-08-02
To assess the efficacy of enzyme potentiated desensitisation in the treatment of severe summer hay fever poorly controlled by pharmacotherapy. Double blind randomised placebo controlled parallel group study. Hospital in Hampshire. 183 participants aged between 18 and 64 with a history of severe summer hay fever for at least two years; all were skin prick test positive to timothy grass pollen. 90 randomised to active treatment; 93 randomised to placebo. Active treatment: two injections of enzyme potentiated desensitisation, given between eight and 11 weeks apart, each comprising 200 Fishman units of beta glucuronidase, 50 pg 1,3-cyclohexanediol, 50 ng protamine sulphate, and a mixed inhaled allergen extract (pollen mixes for trees, grasses, and weeds; allergenic fungal spores; cat and dog danders; dust and storage mites) in a total volume of 0.05 ml of buffered saline. Placebo: two injections of 0.05 ml buffered saline solution. Proportion of problem-free days; global rhinoconjunctivitis quality of life scores assessed weekly during pollen season. The active treatment group and the placebo group did not differ in the proportion of problem-free days, quality of life scores, symptom severity scores, change in quantitative skin prick provocation threshold, or change in conjunctival provocation threshold. No clinically significant adverse reactions occurred. Enzyme potentiated desensitisation showed no treatment effect in this study.
Hübner, Nils-Olaf; Fleßa, Steffen; Jakisch, Ralf; Assadian, Ojan; Kramer, Axel
2012-01-01
In the care of patients, the prevention of nosocomial infections is crucial. For it to be successful, cross-sectoral, interface-oriented hygiene quality management is necessary. The goal is to apply the HACCP (Hazard Analysis and Critical Control Points) concept to hospital hygiene, in order to create a multi-dimensional hygiene control system based on hygiene indicators that will overcome the limitations of a procedurally non-integrated and non-cross-sectoral view of hygiene. Three critical risk dimensions can be identified for the implementation of three-dimensional quality control of hygiene in clinical routine: the constitution of the person concerned, the surrounding physical structures and technical equipment, and the medical procedures. In these dimensions, the establishment of indicators and threshold values enables a comprehensive assessment of hygiene quality. Thus, the cross-sectoral evaluation of the quality of structure, processes and results is decisive for the success of integrated infection prophylaxis. This study lays the foundation for hygiene indicator requirements and develops initial concepts for evaluating quality management in hygiene. PMID:22558049
Healy, Sinead; McMahon, Jill; Owens, Peter; Dockery, Peter; FitzGerald, Una
2018-02-01
Image segmentation is often imperfect, particularly in complex image sets such as z-stack micrographs of slice cultures, and there is a need for sufficient details of parameters used in quantitative image analysis to allow independent repeatability and appraisal. For the first time, we have critically evaluated, quantified and validated the performance of different segmentation methodologies using z-stack images of ex vivo glial cells. The BioVoxxel toolbox plugin, available in FIJI, was used to measure the relative quality, accuracy, specificity and sensitivity of 16 global and 9 local automatic thresholding algorithms. Automatic thresholding yields improved binary representation of glial cells compared with the conventional user-chosen single threshold approach for confocal z-stacks acquired from ex vivo slice cultures. The performance of threshold algorithms varies considerably in quality, specificity, accuracy and sensitivity, with entropy-based thresholds scoring highest for fluorescent staining. We have used the BioVoxxel toolbox to correctly and consistently select the best automated threshold algorithm to segment z-projected images of ex vivo glial cells for downstream digital image analysis and to define segmentation quality. The automated OLIG2 cell count was validated using stereology. As image segmentation and feature extraction can quite critically affect the performance of successive steps in the image analysis workflow, it is becoming increasingly necessary to consider the quality of digital segmenting methodologies. Here, we have applied, validated and extended an existing performance-check methodology in the BioVoxxel toolbox to z-projected images of ex vivo glial cells. Copyright © 2017 Elsevier B.V. All rights reserved.
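A comparison of global automatic thresholds of the kind run with the BioVoxxel toolbox can be approximated outside FIJI with scikit-image, which implements several of the same algorithms. The sketch below scores a few global methods against a known reference mask using sensitivity, specificity, and accuracy; it is a simplified stand-in for the toolbox's relative-quality assessment, with entirely synthetic data.

```python
import numpy as np
from skimage.filters import threshold_li, threshold_otsu, threshold_triangle, threshold_yen

rng = np.random.default_rng(1)

# Synthetic "fluorescence" image: bright blobs (cells) on a dim, noisy background.
truth = np.zeros((128, 128), dtype=bool)
truth[30:50, 30:50] = truth[80:110, 60:90] = True
image = rng.normal(20, 5, truth.shape) + 80 * truth

methods = {"otsu": threshold_otsu, "yen": threshold_yen,
           "li": threshold_li, "triangle": threshold_triangle}

for name, method in methods.items():
    mask = image > method(image)        # global threshold, then binarize
    tp = np.sum(mask & truth)
    tn = np.sum(~mask & ~truth)
    fp = np.sum(mask & ~truth)
    fn = np.sum(~mask & truth)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / truth.size
    print(f"{name:8s} sensitivity={sensitivity:.3f} "
          f"specificity={specificity:.3f} accuracy={accuracy:.3f}")
```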
ERIC Educational Resources Information Center
Setodji, Claude Messan; Le, Vi-Nhuan; Schaack, Diana
2013-01-01
Research linking high-quality child care programs and children's cognitive development has contributed to the growing popularity of child care quality benchmarking efforts such as quality rating and improvement systems (QRIS). Consequently, there has been an increased interest in and a need for approaches to identifying thresholds, or cutpoints,…
Harbison, Justin E; Zazra, Dave; Henry, Marlon; Xamplas, Christopher; Kafensztok, Ruth
2015-09-01
Because it is often logistically impossible to monitor all catch basins within an operational area, local mosquito control programs will preemptively treat catch basins with larvicides each season. However, these larvicides can, ostensibly, be considered water quality pollutants. To experimentally reduce the use of larvicides toward improving water quality, 30 basins within a small 0.7-km² residential area were monitored weekly for the presence of larvae and pupae for 14 wk in the summer of 2013. Once a basin was found to reach a threshold of 12 mosquitoes per dip sample, it received a FourStar® Briquet (a 180-day briquet formulation of 6% Bacillus sphaericus and 1% B. thuringiensis israelensis). Each week a FourStar-treated basin surpassed this threshold, it was treated with an application of CocoBear™ oil (10% mineral oil). By the end of trials, all but one basin received a briquet and 13 required at least 4 treatments of CocoBear, suggesting that preemptive treatment is appropriate for the study area.
Banca, Paula; Vestergaard, Martin D; Rankov, Vladan; Baek, Kwangyeol; Mitchell, Simon; Lapa, Tatyana; Castelo-Branco, Miguel; Voon, Valerie
2015-03-13
The compulsive behaviour underlying obsessive-compulsive disorder (OCD) may be related to abnormalities in decision-making. The inability to commit to ultimate decisions, for example, patients unable to decide whether their hands are sufficiently clean, may reflect failures in accumulating sufficient evidence before a decision. Here we investigate the process of evidence accumulation in OCD in perceptual discrimination, hypothesizing enhanced evidence accumulation relative to healthy volunteers. Twenty-eight OCD patients and thirty-five controls were tested with a low-level visual perceptual task (random-dot-motion task, RDMT) and two response conflict control tasks. Regression analysis across different motion coherence levels and Hierarchical Drift Diffusion Modelling (HDDM) were used to characterize response strategies between groups in the RDMT. Patients required more evidence under high uncertainty perceptual contexts, as indexed by longer response time and higher decision boundaries. HDDM, which defines a decision when accumulated noisy evidence reaches a decision boundary, further showed slower drift rate towards the decision boundary reflecting poorer quality of evidence entering the decision process in patients under low uncertainty. With monetary incentives emphasizing speed and penalty for slower responses, patients decreased the decision thresholds relative to controls, accumulating less evidence in low uncertainty. These findings were unrelated to visual perceptual deficits and response conflict. This study provides evidence for impaired decision-formation processes in OCD, with a differential influence of high and low uncertainty contexts on evidence accumulation (decision threshold) and on the quality of evidence gathered (drift rates). It further emphasizes that OCD patients are sensitive to monetary incentives heightening speed in the speed-accuracy tradeoff, improving evidence accumulation.
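The drift-diffusion framework used here (HDDM) treats a decision as noisy evidence accumulating toward one of two boundaries: drift rate captures the quality of the evidence and boundary separation captures how much evidence is demanded before responding. The sketch below simulates that generative process directly; it is not the hierarchical Bayesian fit used in the study, just an illustration of how boundary and drift trade off against response time and accuracy.

```python
import numpy as np

def simulate_ddm(drift, boundary, n_trials=2000, dt=0.001, noise=1.0, seed=0):
    """Simulate a symmetric two-boundary drift-diffusion process.

    Returns mean response time (s) and accuracy (fraction of trials that hit
    the upper, "correct" boundary). The start point is midway between bounds.
    """
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t)
        correct.append(x >= boundary)
    return np.mean(rts), np.mean(correct)

# Raising the boundary (demanding more evidence) slows responses but raises accuracy;
# lowering the drift rate (poorer evidence quality) slows responses and lowers accuracy.
for boundary in (1.0, 1.5, 2.0):
    rt, acc = simulate_ddm(drift=1.0, boundary=boundary)
    print(f"boundary={boundary:.1f}  mean RT={rt:.3f}s  accuracy={acc:.3f}")
```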
Mate choice when males are in patches: optimal strategies and good rules of thumb.
Hutchinson, John M C; Halupka, Konrad
2004-11-07
In standard mate-choice models, females encounter males sequentially and decide whether to inspect the quality of another male or to accept a male already inspected. What changes when males are clumped in patches and there is a significant cost to travel between patches? We use stochastic dynamic programming to derive optimum strategies under various assumptions. With zero costs to returning to a male in the current patch, the optimal strategy accepts males above a quality threshold which is constant whenever one or more males in the patch remain uninspected; this threshold drops when inspecting the last male in the patch, so returns may occur only then and are never to a male in a previously inspected patch. With non-zero within-patch return costs, such a two-threshold rule still performs extremely well, but a more gradual decline in acceptance threshold is optimal. Inability to return at all need not decrease performance by much. The acceptance threshold should also decline if it gets harder to discover the last males in a patch. Optimal strategies become more complex when mean male quality varies systematically between patches or years, and females estimate this in a Bayesian manner through inspecting male qualities. It can then be optimal to switch patch before inspecting all males on a patch, or, exceptionally, to return to an earlier patch. We compare performance of various rules of thumb in these environments and in ones without a patch structure. A two-threshold rule performs excellently, as do various simplifications of it. The best-of-N rule outperforms threshold rules only in non-patchy environments with between-year quality variation. The cutoff rule performs poorly.
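The two-threshold heuristic that performs well in these analyses is straightforward to simulate: accept any male above a high threshold while uninspected males remain in the patch, and drop to a lower threshold for the last one. The sketch below compares that rule with a best-of-N rule on randomly generated patches; the quality distribution and threshold values are arbitrary assumptions, and no inspection or travel costs are modelled, so it illustrates the decision logic rather than the paper's dynamic-programming results.

```python
import numpy as np

rng = np.random.default_rng(2)

def two_threshold_rule(patch, high, low):
    """Accept the first male above `high`; for the last male accept above `low`,
    otherwise settle for him (no between-patch travel modelled)."""
    for i, quality in enumerate(patch):
        is_last = i == len(patch) - 1
        if quality >= (low if is_last else high):
            return quality
    return patch[-1]

def best_of_n(patch):
    """Inspect every male in the patch and take the best one."""
    return max(patch)

n_patches, patch_size = 10_000, 5
patches = rng.uniform(0.0, 1.0, (n_patches, patch_size))

tt = np.mean([two_threshold_rule(p, high=0.8, low=0.5) for p in patches])
bn = np.mean([best_of_n(p) for p in patches])
print(f"two-threshold rule: mean accepted quality = {tt:.3f}")
print(f"best-of-N rule:     mean accepted quality = {bn:.3f}")
```

With free inspection, best-of-N trivially dominates; the interesting comparisons in the paper arise once inspection and return costs are charged, which is where simple threshold rules close most of the gap.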
Marshall, Charla; Sturk-Andreaggi, Kimberly; Daniels-Higginbotham, Jennifer; Oliver, Robert Sean; Barritt-Ross, Suzanne; McMahon, Timothy P
2017-11-01
Next-generation ancient DNA technologies have the potential to assist in the analysis of degraded DNA extracted from forensic specimens. Mitochondrial genome (mitogenome) sequencing, specifically, may be of benefit to samples that fail to yield forensically relevant genetic information using conventional PCR-based techniques. This report summarizes the Armed Forces Medical Examiner System's Armed Forces DNA Identification Laboratory's (AFMES-AFDIL) performance evaluation of a Next-Generation Sequencing protocol for degraded and chemically treated past accounting samples. The procedure involves hybridization capture for targeted enrichment of mitochondrial DNA, massively parallel sequencing using Illumina chemistry, and an automated bioinformatic pipeline for forensic mtDNA profile generation. A total of 22 non-probative samples and associated controls were processed in the present study, spanning a range of DNA quantity and quality. Data were generated from over 100 DNA libraries by ten DNA analysts over the course of five months. The results show that the mitogenome sequencing procedure is reliable and robust, sensitive to low template (1 ng of control DNA) as well as degraded DNA, and specific to the analysis of the human mitogenome. Haplotypes were overall concordant between NGS replicates and with previously generated Sanger control region data. Due to the inherent risk for contamination when working with low-template, degraded DNA, a contamination assessment was performed. The consumables were shown to be void of human DNA contaminants and suitable for forensic use. Reagent blanks and negative controls were analyzed to determine the background signal of the procedure. This background signal was then used to set analytical and reporting thresholds, which were designated at 4.0X (limit of detection) and 10.0X (limit of quantitation) average coverage across the mitogenome, respectively. Nearly all human samples exceeded the reporting threshold, although coverage was reduced in chemically treated samples, resulting in a ∼58% passing rate for these poor-quality samples. A concordance assessment demonstrated the reliability of the NGS data when compared to known Sanger profiles. One case sample was shown to be mixed with a co-processed sample, and two reagent blanks indicated the presence of DNA above the analytical threshold. This contamination was attributed to sequencing crosstalk from simultaneously sequenced high-quality samples, including the positive control. Overall, this study demonstrated that hybridization capture and Illumina sequencing provide a viable method for mitogenome sequencing of degraded and chemically treated skeletal DNA samples, yet may require alternative measures of quality control. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
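The coverage-based thresholds described here are applied to the average read depth across the mitogenome. A minimal quality-control check of that kind is sketched below; the per-position coverage profiles are hypothetical, and only the 4.0X and 10.0X cut-offs are taken from the text.

```python
import numpy as np

ANALYTICAL_THRESHOLD = 4.0   # limit of detection (mean coverage), per the text
REPORTING_THRESHOLD = 10.0   # limit of quantitation (mean coverage), per the text

def classify_sample(per_position_coverage):
    """Classify a sample by its mean mitogenome coverage."""
    mean_cov = float(np.mean(per_position_coverage))
    if mean_cov >= REPORTING_THRESHOLD:
        status = "reportable"
    elif mean_cov >= ANALYTICAL_THRESHOLD:
        status = "detected, below reporting threshold"
    else:
        status = "below analytical threshold"
    return mean_cov, status

# Hypothetical coverage profiles over the 16,569 positions of the human mitogenome.
rng = np.random.default_rng(3)
samples = {
    "degraded_bone": rng.poisson(12, 16_569),
    "chem_treated": rng.poisson(6, 16_569),
    "reagent_blank": rng.poisson(0.5, 16_569),
}
for name, coverage in samples.items():
    mean_cov, status = classify_sample(coverage)
    print(f"{name:14s} mean coverage = {mean_cov:5.1f}X -> {status}")
```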
Vanamail, P; Subramanian, S; Srividya, A; Ravi, R; Krishnamoorthy, K; Das, P K
2006-08-01
Lot quality assurance sampling (LQAS) with a two-stage sampling plan was applied for rapid monitoring of coverage after every round of mass drug administration (MDA). A Primary Health Centre (PHC) consisting of 29 villages in Thiruvannamalai district, Tamil Nadu was selected as the study area. Two threshold levels of coverage were used: threshold A (maximum: 60%; minimum: 40%) and threshold B (maximum: 80%; minimum: 60%). Based on these thresholds, one sampling plan each for A and B was derived with the necessary sample size and the number of allowable defectives (i.e., those who have not received the drug). Using data generated through simple random sampling (SRSI) of 1,750 individuals in the study area, LQAS was validated with the above two sampling plans for its diagnostic and field applicability. Simultaneously, a household survey (SRSH) was conducted for validation and cost-effectiveness analysis. Based on the SRSH survey, the estimated coverage was 93.5% (CI: 91.7-95.3%). LQAS with threshold A revealed that by sampling a maximum of 14 individuals and by allowing four defectives, the coverage was ≥60% in >90% of villages at the first stage. Similarly, with threshold B, by sampling a maximum of nine individuals and by allowing four defectives, the coverage was ≥80% in >90% of villages at the first stage. These analyses suggest that the sampling plan (14, 4, 52, 25) of threshold A may be adopted in MDA to assess if a minimum coverage of 60% has been achieved. However, to achieve the goal of elimination, the sampling plan (9, 4, 42, 29) of threshold B can identify villages in which the coverage is <80% so that remedial measures can be taken. Cost-effectiveness analysis showed that both options of LQAS are more cost-effective than SRSH to detect a village with a given level of coverage. The cost per village was US dollars 76.18 under SRSH. The cost of LQAS was US dollars 65.81 and 55.63 per village for thresholds A and B respectively. The total financial cost of classifying a village correctly with the given threshold level of LQAS could be reduced by 14% and 26% of the cost of the conventional SRSH method.
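The operating characteristics of an LQAS plan — the probability of classifying a village as having adequate coverage given its true coverage — follow from the binomial distribution: with a sample of n individuals and up to d allowable defectives, the acceptance probability is P(X ≤ d) for X ~ Binomial(n, 1 − coverage). The sketch below evaluates the first-stage plans quoted above (n = 14, d = 4 and n = 9, d = 4); it is a standard LQAS calculation, not a reproduction of the study's two-stage analysis.

```python
from scipy.stats import binom

def acceptance_probability(n, d, coverage):
    """P(defectives in the sample <= d) when each sampled person is
    uncovered with probability 1 - coverage."""
    return binom.cdf(d, n, 1.0 - coverage)

plans = {"threshold A (n=14, d=4)": (14, 4), "threshold B (n=9, d=4)": (9, 4)}
for label, (n, d) in plans.items():
    print(label)
    for coverage in (0.40, 0.60, 0.80, 0.95):
        p = acceptance_probability(n, d, coverage)
        print(f"  true coverage {coverage:.0%}: P(accept) = {p:.3f}")
```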
Seizure threshold increases can be predicted by EEG quality in right unilateral ultrabrief ECT.
Gálvez, Verònica; Hadzi-Pavlovic, Dusan; Waite, Susan; Loo, Colleen K
2017-12-01
Increases in seizure threshold (ST) over a course of brief pulse ECT can be predicted by decreases in EEG quality, informing ECT dose adjustment to maintain adequate supra-threshold dosing. ST increases also occur over a course of right unilateral ultrabrief (RUL UB) ECT, but no data exist on the relationship between ST increases and EEG indices. This study (n = 35) investigated if increases in ST over RUL UB ECT treatments could be predicted by a decline in seizure quality. ST titration was performed at ECT session one and seven, with treatment dosing maintained stable (at 6-8 times ST) in intervening sessions. Seizure quality indices (slow-wave onset, mid-ictal amplitude, regularity, stereotypy, and post-ictal suppression) were manually rated at the first supra-threshold treatment, and last supra-threshold treatment before re-titration, using a structured rating scale, by a single trained rater blinded to the ECT session being rated. Twenty-one subjects (60%) had a ST increase. The association between ST changes and EEG quality indices was analysed by logistic regression, yielding a significant model (p < 0.001). Initial ST (p < 0.05) and percentage change in mid-ictal amplitude (p < 0.05) were significant predictors of change in ST. Percentage change in post-ictal suppression reached trend level significance (p = 0.065). Increases in ST over a RUL UB ECT course may be predicted by decreases in seizure quality, specifically decline in mid-ictal amplitude and potentially in post-ictal suppression. Such EEG indices may be able to inform when dose adjustments are necessary to maintain adequate supra-threshold dosing in RUL UB ECT.
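The modelling step described — a logistic regression of whether seizure threshold increased on initial ST and percentage changes in EEG quality indices — can be sketched with scikit-learn. Everything below is randomly generated placeholder data; only the structure of the model follows the text.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 35  # sample size reported in the study

# Hypothetical predictors: initial seizure threshold (mC) and percentage changes
# in two seizure-quality indices between the first and last rated treatments.
initial_st = rng.choice([25.2, 50.4, 75.6, 100.8], size=n)
d_amplitude = rng.normal(-10, 15, n)     # % change in mid-ictal amplitude
d_suppression = rng.normal(-5, 10, n)    # % change in post-ictal suppression

# Hypothetical outcome: ST increase made more likely by declining amplitude.
logit = -1.0 + 0.01 * initial_st - 0.08 * d_amplitude
st_increased = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([initial_st, d_amplitude, d_suppression])
model = LogisticRegression().fit(X, st_increased)
print("coefficients (initial ST, d amplitude, d suppression):", model.coef_.round(3))
print("intercept:", model.intercept_.round(3))
```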
Betran, Ana Pilar; Torloni, Maria Regina; Zhang, Jun; Ye, Jiangfeng; Mikolajczyk, Rafael; Deneux-Tharaux, Catherine; Oladapo, Olufemi Taiwo; Souza, João Paulo; Tunçalp, Özge; Vogel, Joshua Peter; Gülmezoglu, Ahmet Metin
2015-06-21
In 1985, WHO stated that there was no justification for caesarean section (CS) rates higher than 10-15% at the population level. While CS rates worldwide have continued to increase in an unprecedented manner over the subsequent three decades, concern has been raised about the validity of the 1985 landmark statement. We conducted a systematic review to identify, critically appraise and synthesize the analyses of the ecologic association between CS rates and maternal, neonatal and infant outcomes. Four electronic databases were searched for ecologic studies published between 2000 and 2014 that analysed the possible association between CS rates and maternal, neonatal or infant mortality or morbidity. Two reviewers performed study selection, data extraction and quality assessment independently. We identified 11,832 unique citations and eight studies were included in the review. Seven studies correlated CS rates with maternal mortality, five with neonatal mortality, four with infant mortality, two with low birthweight (LBW) and one with stillbirths. Except for one, all studies were cross-sectional in design and five were global analyses of national-level CS rates versus mortality outcomes. Although the overall quality of the studies was acceptable, only two studies controlled for socio-economic factors and none controlled for clinical or demographic characteristics of the population. In unadjusted analyses, authors found a strong inverse relationship between CS rates and the mortality outcomes, so that maternal, neonatal and infant mortality decrease as CS rates increase up to a certain threshold. In the eight studies included in this review, this threshold was at CS rates between 9 and 16%. However, in the two studies that adjusted for socio-economic factors, this relationship was either weakened or disappeared after controlling for these confounders. CS rates above the threshold of 9-16% were not associated with decreases in mortality outcomes regardless of adjustments. Our findings could be interpreted to mean that at CS rates below this threshold, socio-economic development may be driving the ecologic association between CS rates and mortality. On the other hand, at rates higher than this threshold, there is no association between CS and mortality outcomes regardless of adjustment. The ecological association between CS rates and relevant morbidity outcomes needs to be evaluated before drawing more definite conclusions at the population level.
Mathany, Timothy M.; Kulongoski, Justin T.; Ray, Mary C.; Belitz, Kenneth
2009-01-01
Groundwater quality in the approximately 653-square-mile South Coast Interior Basins (SCI) study unit was investigated from August to December 2008, as part of the Priority Basins Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basins Project was developed in response to Legislative mandates (Supplemental Report of the 1999 Budget Act 1999-00 Fiscal Year; and, the Groundwater-Quality Monitoring Act of 2001 [Sections 10780-10782.3 of the California Water Code, Assembly Bill 599]) to assess and monitor the quality of groundwater used as public supply for municipalities in California, and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). SCI was the 27th study unit to be sampled as part of the GAMA Priority Basins Project. This study was designed to provide a spatially unbiased assessment of the quality of untreated groundwater used for public water supplies within SCI, and to facilitate statistically consistent comparisons of groundwater quality throughout California. Samples were collected from 54 wells within the three study areas [Livermore, Gilroy, and Cuyama] of SCI in Alameda, Santa Clara, San Benito, Santa Barbara, Ventura, and Kern Counties. Thirty-five of the wells were selected using a spatially distributed, randomized grid-based method to provide statistical representation of the study unit (grid wells), and 19 were selected to aid in evaluation of specific water-quality issues (understanding wells). The groundwater samples were analyzed for organic constituents [volatile organic compounds (VOCs), pesticides and pesticide degradates, polar pesticides and metabolites, and pharmaceutical compounds], constituents of special interest [perchlorate and N-nitrosodimethylamine (NDMA)], naturally occurring inorganic constituents [trace elements, nutrients, major and minor ions, silica, total dissolved solids (TDS), and alkalinity], and radioactive constituents [gross alpha and gross beta radioactivity and radon-222]. Naturally occurring isotopes [stable isotopes of hydrogen, oxygen, and carbon, and activities of tritium and carbon-14] and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. In total, 288 constituents and water-quality indicators (field parameters) were investigated. Three types of quality-control samples (blanks, replicates, and matrix spikes) each were collected at approximately 4-11 percent of the wells, and the results for these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination was not a significant source of bias in the data obtained from the groundwater samples. Differences between replicate samples generally were less than 10 percent relative standard deviation, indicating acceptable analytical reproducibility. Matrix spike recoveries were within the acceptable range (70 to 130 percent) for most compounds. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, untreated groundwater typically is treated, disinfected, and/or blended with other waters to maintain water quality. Regulatory thresholds apply to water that is served to the consumer, not to untreated groundwater. 
However, to provide some context for the results, concentrations of constituents measured in the untreated groundwater were compared with regulatory and nonregulatory health-based thresholds established by the U.S. Environmental Protection Agency (USEPA) and California Department of Public Health (CDPH), and to nonregulatory thresholds established for aesthetic and technical concerns by CDPH. Comparisons between data collected for this study and thresholds for drinking water are for illustrative purposes only, and are not indicative of compliance.
Estimation of Effect Thresholds for the Development of Water Quality Criteria
Biological and ecological effect thresholds can be used for determining safe levels of nontraditional stressors. The U.S. EPA Framework for Developing Suspended and Bedded Sediments (SABS) Water Quality Criteria (WQC) [36] uses a risk assessment approach to estimate effect thre...
Vollono, Catello; Testani, Elisa; Losurdo, Anna; Mazza, Salvatore; Della Marca, Giacomo
2013-06-10
We discuss the hypothesis proposed by Engstrom and coworkers that Migraineurs have a relative sleep deprivation, which lowers the pain threshold and predisposes them to attacks. Previous data indicate that Migraineurs have a reduction of Cyclic Alternating Pattern (CAP), an essential mechanism of NREM sleep regulation that dampens the effect of incoming disruptive stimuli and protects sleep. The modifications of CAP observed in Migraineurs are similar to those observed in patients with impaired arousal (narcolepsy) and after sleep deprivation. The impairment of this mechanism makes Migraineurs more vulnerable to stimuli triggering attacks during sleep, and represents part of a more general vulnerability to incoming stimuli.
Setodji, Claude Messan; Le, Vi-Nhuan; Schaack, Diana
2013-04-01
Research linking high-quality child care programs and children's cognitive development has contributed to the growing popularity of child care quality benchmarking efforts such as quality rating and improvement systems (QRIS). Consequently, there has been an increased interest in and a need for approaches to identifying thresholds, or cutpoints, in the child care quality measures used in these benchmarking efforts that differentiate between different levels of children's cognitive functioning. To date, research has provided little guidance to policymakers as to where these thresholds should be set. Using the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B) data set, this study explores the use of generalized additive modeling (GAM) as a method of identifying thresholds on the Infant/Toddler Environment Rating Scale (ITERS) in relation to toddlers' performance on the Mental Development subscale of the Bayley Scales of Infant Development (the Bayley Mental Development Scale Short Form-Research Edition, or BMDSF-R). The present findings suggest that simple linear models do not always correctly depict the relationships between ITERS scores and BMDSF-R scores and that GAM-derived thresholds were more effective at differentiating among children's performance levels on the BMDSF-R. Additionally, the present findings suggest that there is a minimum threshold on the ITERS that must be exceeded before significant improvements in children's cognitive development can be expected. There may also be a ceiling threshold on the ITERS, such that beyond a certain level, only marginal increases in children's BMDSF-R scores are observed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
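The threshold-detection idea — fit a flexible curve relating quality scores to child outcomes and locate where the slope turns from flat to positive (a floor) and back to flat (a ceiling) — can be illustrated with a simple smoother. The sketch below uses a cubic smoothing spline instead of the authors' generalized additive models, and entirely synthetic data, so it demonstrates the logic of locating cutpoints rather than the study's findings.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(5)

# Synthetic quality/outcome data with a floor near 3 and a ceiling near 5.5
# on a 1-7 quality scale (purely illustrative).
quality = rng.uniform(1, 7, 400)
signal = np.clip(quality, 3.0, 5.5)            # outcome flat below 3 and above 5.5
outcome = 10 * signal + rng.normal(0, 4, quality.size)

order = np.argsort(quality)
spline = UnivariateSpline(quality[order], outcome[order], k=3, s=16 * quality.size)

grid = np.linspace(1, 7, 121)
slope = spline.derivative()(grid)

# Treat the slope as "meaningful" where it exceeds a quarter of its peak value.
active = slope > 0.25 * slope.max()
print("estimated floor threshold  :", round(grid[active][0], 2))
print("estimated ceiling threshold:", round(grid[active][-1], 2))
```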
Use of Quality Controlled AIRS Temperature Soundings to Improve Forecast Skill
NASA Technical Reports Server (NTRS)
Susskind, Joel; Reale, Oreste; Iredell, Lena
2010-01-01
AIRS was launched on EOS Aqua on May 4, 2002, together with AMSU-A and HSB, to form a next generation polar orbiting infrared and microwave atmospheric sounding system. The primary products of AIRS/AMSU-A are twice daily global fields of atmospheric temperature-humidity profiles, ozone profiles, sea/land surface skin temperature, and cloud related parameters including OLR. Also included are the clear column radiances used to derive these products, which are representative of the radiances AIRS would have seen if there were no clouds in the field of view. All products also have error estimates. The sounding goals of AIRS are to produce 1 km tropospheric layer mean temperatures with an rms error of 1K, and layer precipitable water with an rms error of 20 percent, in cases with up to 90 percent effective cloud cover. The products are designed for data assimilation purposes for the improvement of numerical weather prediction, as well as for the study of climate and meteorological processes. With regard to data assimilation, one can use either the products themselves or the clear column radiances from which the products were derived. The AIRS Version 5 retrieval algorithm is now being used operationally at the Goddard DISC in the routine generation of geophysical parameters derived from AIRS/AMSU data. A major innovation in Version 5 is the ability to generate case-by-case, level-by-level error estimates for retrieved quantities and clear column radiances, and the use of these error estimates for Quality Control. The temperature profile error estimates are used to determine a case-by-case characteristic pressure pbest, down to which the profile is considered acceptable for data assimilation purposes. The characteristic pressure pbest is determined by comparing the case-dependent error estimate δT(p) to the threshold values ΔT(p). The AIRS Version 5 data set provides error estimates of T(p) at all levels, and also profile-dependent values of pbest based on use of a standard profile-dependent threshold ΔT(p). These standard thresholds were designed as a compromise between optimal use for data assimilation purposes, which requires highest accuracy (tighter Quality Control), and climate purposes, which require more spatial coverage (looser Quality Control). Subsequent research using Version 5 soundings and error estimates showed that tighter Quality Control performs better for data assimilation purposes, while looser Quality Control (better spatial coverage) performs better for climate purposes. We conducted a number of data assimilation experiments using the NASA GEOS-5 Data Assimilation System as a step toward finding an optimum balance of spatial coverage and sounding accuracy with regard to improving forecast skill. The model was run at a horizontal resolution of 0.5 degree latitude x 0.67 degree longitude with 72 vertical levels. These experiments were run during four different seasons, each using a different year. The AIRS temperature profiles were presented to the GEOS-5 analysis as rawinsonde profiles, and the profile error estimates δT(p) were used as the uncertainty for each measurement in the data assimilation process.
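The quality-control step described above — accept a retrieved temperature profile down to the deepest pressure at which the case-dependent error estimate δT(p) still lies within the threshold ΔT(p) — reduces to a scan over pressure levels. The sketch below is schematic, with invented error and threshold profiles; it is not the Version 5 algorithm itself.

```python
import numpy as np

# Pressure levels from the top of the atmosphere downward (hPa), illustrative grid.
pressure = np.array([10, 50, 100, 200, 300, 500, 700, 850, 925, 1000], dtype=float)

# Hypothetical threshold profile Delta T(p) and case-dependent error delta T(p), in K.
threshold = np.array([1.5, 1.5, 1.25, 1.0, 1.0, 1.0, 1.25, 1.5, 1.75, 2.0])
error_est = np.array([0.6, 0.7, 0.8, 0.9, 0.9, 1.1, 1.6, 1.9, 2.2, 2.5])

def find_p_best(pressure, error_est, threshold):
    """Deepest pressure down to which delta T(p) <= Delta T(p) at every level above."""
    acceptable = error_est <= threshold
    if not acceptable[0]:
        return None                                   # reject the whole profile
    first_bad = acceptable.size if acceptable.all() else int(np.argmin(acceptable))
    return pressure[first_bad - 1]

print("p_best =", find_p_best(pressure, error_est, threshold), "hPa")
```

Tightening the threshold profile pushes p_best higher in the atmosphere (fewer accepted levels, higher accuracy), while loosening it accepts deeper levels at the cost of accuracy, which is the same coverage/accuracy trade-off explored in the assimilation experiments.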
Color difference thresholds in dentistry.
Paravina, Rade D; Ghinea, Razvan; Herrera, Luis J; Bona, Alvaro D; Igiel, Christopher; Linninger, Mercedes; Sakai, Maiko; Takahashi, Hidekazu; Tashkandi, Esam; Perez, Maria del Mar
2015-01-01
The aim of this prospective multicenter study was to determine the 50:50% perceptibility threshold (PT) and 50:50% acceptability threshold (AT) of dental ceramic under simulated clinical settings. The spectral radiance of 63 monochromatic ceramic specimens was determined using a non-contact spectroradiometer. A total of 60 specimen pairs, divided into 3 sets of 20 specimen pairs (medium to light shades, medium to dark shades, and dark shades), were selected for the psychophysical experiment. The coordinating center and seven research sites obtained Institutional Review Board (IRB) approvals prior to the beginning of the experiment. Each research site had 25 observers, divided into five groups of five observers: dentists-D, dental students-S, dental auxiliaries-A, dental technicians-T, and lay persons-L. There were 35 observers per group (five observers per group at each site × 7 sites), for a total of 175 observers. Visual color comparisons were performed using a viewing booth. Takagi-Sugeno-Kang (TSK) fuzzy approximation was used for fitting the data points. The 50:50% PT and 50:50% AT were determined in CIELAB and CIEDE2000. The t-test was used to evaluate the statistical significance of threshold differences. The CIELAB 50:50% PT was ΔEab = 1.2, whereas the 50:50% AT was ΔEab = 2.7. Corresponding CIEDE2000 (ΔE00) values were 0.8 and 1.8, respectively. The 50:50% PT by observer group revealed differences among groups D, A, T, and L as compared with the 50:50% PT for all observers. The 50:50% AT for all observers was statistically different than the 50:50% AT in groups T and L. The 50:50% perceptibility and acceptability thresholds were significantly different. The same is true for the differences between the two color difference formulas (ΔE00/ΔEab). Observer groups and sites showed a high level of statistical difference in all thresholds. Visual color difference thresholds can serve as a quality control tool to guide the selection of esthetic dental materials, evaluate clinical performance, and interpret visual and instrumental findings in clinical dentistry, dental research, and subsequent standardization. The importance of quality control in dentistry is reinforced by the increased esthetic demands of patients and dental professionals. © 2015 Wiley Periodicals, Inc.
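The CIELAB colour difference underlying these thresholds is ΔEab = sqrt(ΔL*² + Δa*² + Δb*²). The sketch below computes it for a hypothetical specimen pair and classifies the result against the 50:50% thresholds reported above (PT = 1.2, AT = 2.7 ΔEab units); the CIEDE2000 formula is considerably more involved and is omitted here.

```python
import math

PT = 1.2  # 50:50% perceptibility threshold (Delta E*ab), from the study
AT = 2.7  # 50:50% acceptability threshold (Delta E*ab), from the study

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triplets."""
    return math.dist(lab1, lab2)

# Hypothetical measurements of a restoration and the adjacent tooth.
restoration = (72.4, 1.8, 18.2)
target_tooth = (73.5, 1.2, 20.1)

de = delta_e_ab(restoration, target_tooth)
if de <= PT:
    verdict = "imperceptible to the average observer"
elif de <= AT:
    verdict = "perceptible but acceptable"
else:
    verdict = "perceptible and unacceptable"
print(f"Delta E*ab = {de:.2f} -> {verdict}")
```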
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cain, W.S.; Shoaf, C.R.; Velasquez, S.F.
1992-03-01
In response to numerous requests for information related to odor thresholds, this document was prepared by the Air Risk Information Support Center in its role in providing technical assistance to State and Local government agencies on risk assessment of air pollutants. A discussion of basic concepts related to olfactory function and the measurement of odor thresholds is presented. A detailed discussion of criteria which are used to evaluate the quality of published odor threshold values is provided. The use of odor threshold information in risk assessment is discussed. The results of a literature search and review of odor threshold information for the chemicals listed as hazardous air pollutants in the Clean Air Act amendments of 1990 is presented. The published odor threshold values are critically evaluated based on the criteria discussed, and the values of acceptable quality are used to determine a geometric mean or best estimate.
Yazla, Semih; Özmen, Süay; Kıyıcı, Sinem; Yıldız, Demet; Haksever, Mehmet; Gencay, Sündüz
2018-03-01
Olfaction and gustation in patients with diabetes mellitus have great significance on quality of life, and their impairment may result in possible hazards. A limited number of studies have been performed to determine the alteration of both gustatory and olfactory function in type 2 diabetic patients with diabetic peripheral neuropathy (DPN). The aim of this study was to determine whether type 2 diabetic patients, with and without DPN, exhibit major olfactory and gustatory dysfunction using validated and dependable techniques. An observational-analytical case-control study was conducted. Sixty patients with type 2 diabetes mellitus (T2DM) and 30 healthy control subjects with a mean age of 57.1 ± 8.4 were included in the study. Patients with T2DM were recruited from the endocrinology outpatient clinic. After clinical evaluation and electromyography examination, patients with T2DM were divided into the 2 groups, with and without DPN. After a 10-hour fasting period, blood samples were taken for the measurement of serum creatinine, lipids, and HbA1c. For the quantitative assessment of olfactory function, all participants underwent butanol threshold test and odour identification test. Gustatory function was tested administering a whole-mouth above-threshold test using sucrose solutions. The control subjects showed significantly higher Sniffin' sticks and butanol threshold scores than the diabetic patients without DPN (P = .001 and P = .009). No significant difference was found in the gustatory function test between these 2 groups (P = .116). Diabetic patients with DPN had lower Sniffin' sticks scores, butanol threshold scores, and higher sucrose thresholds compared to the controls (P < .001, P < .001, and P = .002). There were no significant differences between diabetic patients with or without DPN regarding Sniffin' sticks scores, butanol threshold, and sucrose thresholds (P = .302, P = .181, and P = .118). In conclusion, this study demonstrates that T2DM is associated with olfactory and gustatory dysfunction. The fact that there was no difference between the diabetic patients with and without DPN elicits the idea of central neuropathy. This novel finding might facilitate the addition of olfactory and gustatory tests to the methodological spectrum of afferent pathway investigations. Copyright © 2017 John Wiley & Sons, Ltd.
Fernandes, Giovana; Jennings, Fabio; Nery Cabral, Michele Vieira; Pirozzi Buosi, Ana Letícia; Natour, Jamil
2016-08-01
To evaluate the effect of swimming on pain, functional capacity, aerobic capacity, and quality of life in patients with fibromyalgia (FM). Randomized controlled trial. Rheumatology outpatient clinics of a university hospital. Women with FM (N=75; age range, 18-60y) randomly assigned to a swimming group (SG) (n=39) or a walking group (WG) (n=36). The SG performed 50 minutes of swimming 3 times a week for 12 weeks, with a heart rate at 11 beats under the anaerobic threshold. The WG performed walking with a heart rate at the anaerobic threshold, with the same duration and frequency as the SG. Participants were evaluated before the exercise protocols (t0), at 6 weeks (t6), and at 12 weeks (t12) after the onset of the protocols. The primary outcome measure was the visual analog scale for pain. The secondary measurements were the Fibromyalgia Impact Questionnaire and the Medical Outcomes Study 36-Item Short-Form Health Survey for quality of life; a spiroergometric test for cardiorespiratory variables; and the timed Up & Go test for functional performance. Patients in both groups experienced improvement in pain after the 12-week program, with no difference between groups (P=.658). The same results were found regarding functional capacity and quality of life. Moreover, no statistical difference between groups was found regarding aerobic capacity over time. Swimming, like walking, is an effective method for reducing pain and improving both functional capacity and quality of life in patients with FM. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
[Continuous quality improvement in anesthesia].
Gaitini, L; Vaida, S; Madgar, S
1998-01-01
Slow continuous quality improvement (SCQI) in anesthesia is a process that allows identification of problems and their causes. Implementing measures to correct them and continuous monitoring to ensure that the problems have been eliminated are necessary. The basic assumption of CQI is that the employees of an organization are competent and working to the best of their abilities. If problems occur, they are the consequence of inadequacies in the process rather than in the individual. The CQI program is a dynamic but gradual system that invokes a slower rate of response in comparison with other quality methods, like quality assurance. Spectacular results following a system change are not to be expected, and the ideal is slow and continuous improvement. An SCQI program was adapted by our department in May 1994, according to the recommendations of the American Society of Anesthesiologists. Problem identification was based on 65 clinical indicators, reflecting negative events related to anesthesia. Data were collected using a specially designed computer database. Four events were identified as crossing previously established thresholds (hypertension, hypotension, hypoxia and inadequate nerve block). Statistical process control was used to establish stability of the system and whether negative events were influenced only by common causes. The causes responsible for these negative events were identified using specific SCQI tools, such as control charts, cause-effect diagrams and Pareto diagrams. Hypertension and inadequate nerve block were successfully managed. The implementation of corrective measures for the other events that crossed the threshold is still ongoing. This program requires considerable dedication on the part of the staff, and it is hoped that it will improve our clinical performance.
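Statistical process control of the kind described — tracking the rate of a negative event per period and checking it against limits derived from the process itself — can be sketched as a simple p-chart. The monthly counts below are invented; only the chart logic (a centre line and upper control limit at three standard errors for a proportion) is standard.

```python
import numpy as np

# Hypothetical monthly counts: anesthetics performed and hypotension events.
cases = np.array([210, 195, 220, 205, 198, 230, 215, 202, 208, 225])
events = np.array([6, 5, 7, 4, 6, 19, 5, 6, 7, 5])

rate = events / cases
p_bar = events.sum() / cases.sum()           # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / cases)
ucl = p_bar + 3 * sigma                      # upper control limit, per month

for month, (r, u) in enumerate(zip(rate, ucl), start=1):
    flag = "OUT OF CONTROL" if r > u else "in control"
    print(f"month {month:2d}: rate = {r:.3f} (UCL = {u:.3f}) {flag}")
```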
Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences
NASA Technical Reports Server (NTRS)
Watson, Andrew B. (Inventor)
2002-01-01
A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.
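The core operations of the DVQ metric — block DCT, conversion of the coefficient differences to threshold units by dividing by per-coefficient visual thresholds, and pooling into a single number — can be outlined in a few lines. The threshold matrix, block size, and Minkowski exponent below are placeholders; the actual method also includes colour transforms, temporal filtering, local-contrast conversion, and contrast masking, all omitted here.

```python
import numpy as np
from scipy.fft import dctn

BLOCK = 8
rng = np.random.default_rng(6)

# Reference (R) and processed (T) frames (grayscale, illustrative).
R = rng.random((64, 64))
T = R + rng.normal(0, 0.02, R.shape)          # mildly distorted copy

# Placeholder per-coefficient visual thresholds; the real DVQ derives these from a
# model of human sensitivity. Here higher-frequency coefficients get larger thresholds.
u, v = np.meshgrid(np.arange(BLOCK), np.arange(BLOCK), indexing="ij")
thresholds = 0.05 * (1 + u + v)

def block_dct(frame):
    """8x8 block DCT of a frame whose sides are multiples of BLOCK."""
    h, w = frame.shape
    blocks = frame.reshape(h // BLOCK, BLOCK, w // BLOCK, BLOCK).swapaxes(1, 2)
    return dctn(blocks, axes=(-2, -1), norm="ortho")

# Coefficient errors in threshold units, pooled with a Minkowski norm (beta = 4).
err = (block_dct(R) - block_dct(T)) / thresholds
beta = 4
pooled = np.mean(np.abs(err) ** beta) ** (1 / beta)
print(f"pooled perceptual error (threshold units): {pooled:.3f}")
```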
Image quality, threshold contrast and mean glandular dose in CR mammography
NASA Astrophysics Data System (ADS)
Jakubiak, R. R.; Gamba, H. R.; Neves, E. B.; Peixoto, J. E.
2013-09-01
In many countries, computed radiography (CR) systems represent the majority of equipment used in digital mammography. This study presents a method for optimizing image quality and dose in CR mammography of patients with breast thicknesses between 45 and 75 mm. Initially, clinical images of 67 patients (group 1) were analyzed by three experienced radiologists, reporting on anatomical structures, noise and contrast in low and high pixel value areas, and image sharpness and contrast. Exposure parameters (kV, mAs and target/filter combination) used in the examinations of these patients were reproduced to determine the contrast-to-noise ratio (CNR) and mean glandular dose (MGD). The parameters were also used to radiograph a CDMAM (version 3.4) phantom (Artinis Medical Systems, The Netherlands) for image threshold contrast evaluation. After that, different breast thicknesses were simulated with polymethylmethacrylate layers and various sets of exposure parameters were used in order to determine optimal radiographic parameters. For each simulated breast thickness, optimal beam quality was defined as giving a target CNR that reached the threshold contrast of CDMAM images at an acceptable MGD. These results were used for adjustments in the automatic exposure control (AEC) by the maintenance team. Using optimized exposure parameters, clinical images of 63 patients (group 2) were evaluated as described above. Threshold contrast, CNR and MGD for such exposure parameters were also determined. Results showed that the proposed optimization method was effective for all breast thicknesses studied in phantoms. The best result was found for breasts of 75 mm. While in group 1 there was no detection of the 0.1 mm critical diameter detail with threshold contrast below 23%, after the optimization, detection occurred in 47.6% of the images. There was also an average MGD reduction of 7.5%. The clinical image quality criteria were met in 91.7% of cases for all breast thicknesses evaluated in both patient groups. Finally, this study also concluded that the use of AEC based on a constant dose to the detector may make it difficult for CR systems to operate under optimal conditions. More studies must be performed so that the compatibility between systems and optimization methodologies, as well as this optimization method, can be evaluated. Most methods are developed for phantoms, so comparative studies including clinical images must be developed.
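The contrast-to-noise ratio used as the figure of merit above is commonly computed from a region of interest in the detail of interest and one in the background: CNR = |mean_signal − mean_background| / σ_background. The sketch below applies that generic definition to synthetic ROI pixel values; the paper does not spell out its exact ROI protocol, so this is a textbook formulation rather than the authors' measurement procedure.

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio from two regions of interest (pixel value arrays)."""
    signal_roi = np.asarray(signal_roi, dtype=float)
    background_roi = np.asarray(background_roi, dtype=float)
    return abs(signal_roi.mean() - background_roi.mean()) / background_roi.std(ddof=1)

rng = np.random.default_rng(7)
# Hypothetical ROI pixel values (arbitrary units) before and after optimization.
baseline = cnr(rng.normal(520, 40, 500), rng.normal(400, 40, 500))
optimized = cnr(rng.normal(560, 38, 500), rng.normal(400, 38, 500))
print(f"CNR before optimization: {baseline:.2f}")
print(f"CNR after optimization:  {optimized:.2f}")
```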
NASA Astrophysics Data System (ADS)
Muench, R.; Jones, M.; Herndon, K. E.; Bell, J. R.; Anderson, E. R.; Markert, K. N.; Molthan, A.; Adams, E. C.; Shultz, L.; Cherrington, E. A.; Flores, A.; Lucey, R.; Munroe, T.; Layne, G.; Pulla, S. T.; Weigel, A. M.; Tondapu, G.
2017-12-01
On August 25, 2017, Hurricane Harvey made landfall between Port Aransas and Port O'Connor, Texas, bringing with it unprecedented amounts of rainfall and flooding. In times of natural disasters of this nature, emergency responders require timely and accurate information about the hazard in order to assess and plan for disaster response. Due to the extreme flooding impacts associated with Hurricane Harvey, delineations of water extent were crucial to inform resource deployment. Through the USGS's Hazards Data Distribution System, government and commercial vendors were able to acquire and distribute various satellite imagery to analysts to create value-added products that can be used by these emergency responders. Rapid-response water extent maps were created through a collaborative multi-organization and multi-sensor approach. One team of researchers created Synthetic Aperture Radar (SAR) water extent maps using modified Copernicus Sentinel data (2017), processed by ESA. This group used backscatter images, pre-processed by the Alaska Satellite Facility's Hybrid Pluggable Processing Pipeline (HyP3), to identify and apply a threshold to identify water in the image. Quality control was conducted by manually examining the image and correcting for potential errors. Another group of researchers and graduate student volunteers derived water masks from high resolution DigitalGlobe and SPOT images. Through a system of standardized image processing, quality control measures, and communication channels the team provided timely and fairly accurate water extent maps to support a larger NASA Disasters Program response. The optical imagery was processed through a combination of various band thresholds and by using Normalized Difference Water Index (NDWI), Modified Normalized Water Index (MNDWI), Normalized Difference Vegetation Index (NDVI), and cloud masking. Several aspects of the pre-processing and image access were run on internal servers to expedite the provision of images to analysts who could focus on manipulating thresholds and quality control checks for maximum accuracy within the time constraints. The combined results of the radar- and optical-derived value-added products through the coordination of multiple organizations provided timely information for emergency response and recovery efforts.
NASA Technical Reports Server (NTRS)
Muench, Rebekke; Jones, Madeline; Herndon, Kelsey; Schultz, Lori; Bell, Jordan; Anderson, Eric; Markert, Kel; Molthan, Andrew; Adams, Emily; Cherrington, Emil;
2017-01-01
On August 25, 2017, Hurricane Harvey made landfall between Port Aransas and Port O'Connor, Texas, bringing with it unprecedented amounts of rainfall and record flooding. In times of natural disasters of this nature, emergency responders require timely and accurate information about the hazard in order to assess and plan for disaster response. Due to the extreme flooding impacts associated with Hurricane Harvey, delineations of water extent were crucial to inform resource deployment. Through the USGS's Hazards Data Distribution System, government and commercial vendors were able to acquire and distribute various satellite imagery to analysts to create value-added products that can be used by these emergency responders. Rapid-response water extent maps were created through a collaborative multi-organization and multi-sensor approach. One team of researchers created Synthetic Aperture Radar (SAR) water extent maps using modified Copernicus Sentinel data (2017), processed by ESA. This group used backscatter images, pre-processed by the Alaska Satellite Facility's Hybrid Pluggable Processing Pipeline (HyP3), to identify and apply a threshold to identify water in the image. Quality control was conducted by manually examining the image and correcting for potential errors. Another group of researchers and graduate student volunteers derived water masks from high resolution DigitalGlobe and SPOT images. Through a system of standardized image processing, quality control measures, and communication channels the team provided timely and fairly accurate water extent maps to support a larger NASA Disasters Program response. The optical imagery was processed through a combination of various band thresholds and by using Normalized Difference Water Index (NDWI), Modified Normalized Water Index (MNDWI), Normalized Difference Vegetation Index (NDVI), and cloud masking. Several aspects of the pre-processing and image access were run on internal servers to expedite the provision of images to analysts who could focus on manipulating thresholds and quality control checks for maximum accuracy within the time constraints. The combined results of the radar- and optical-derived value-added products through the coordination of multiple organizations provided timely information for emergency response and recovery efforts.
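The optical water-extent step mentioned in both of these records — computing NDWI from the green and near-infrared bands and thresholding it — is a one-liner once the bands are in memory. The snippet below applies the standard McFeeters NDWI definition, (green − NIR) / (green + NIR), to synthetic reflectance arrays; the 0.0 cut-off is a common starting point that analysts adjust per scene, as the text describes, and the input data here are invented.

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """Boolean water mask from the Normalized Difference Water Index."""
    green = green.astype(float)
    nir = nir.astype(float)
    ndwi = (green - nir) / (green + nir)
    return ndwi > threshold

# Synthetic reflectance tiles: water is relatively bright in green and dark in NIR.
rng = np.random.default_rng(8)
green = rng.uniform(0.05, 0.15, (100, 100))
nir = rng.uniform(0.20, 0.40, (100, 100))
green[40:, :60] += 0.10   # a flooded patch
nir[40:, :60] -= 0.18

mask = ndwi_water_mask(green, nir)
print(f"estimated water fraction of the tile: {mask.mean():.1%}")
```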
Radcliffe, Michael J; Lewith, George T; Turner, Richard G; Prescott, Philip; Church, Martin K; Holgate, Stephen T
2003-01-01
Objective: To assess the efficacy of enzyme potentiated desensitisation in the treatment of severe summer hay fever poorly controlled by pharmacotherapy. Design: Double blind randomised placebo controlled parallel group study. Setting: Hospital in Hampshire. Participants: 183 participants aged between 18 and 64 with a history of severe summer hay fever for at least two years; all were skin prick test positive to timothy grass pollen. 90 randomised to active treatment; 93 randomised to placebo. Interventions: Active treatment: two injections of enzyme potentiated desensitisation, given between eight and 11 weeks apart, each comprising 200 Fishman units of β glucuronidase, 50 pg 1,3-cyclohexanediol, 50 ng protamine sulphate, and a mixed inhaled allergen extract (pollen mixes for trees, grasses, and weeds; allergenic fungal spores; cat and dog danders; dust and storage mites) in a total volume of 0.05 ml of buffered saline. Placebo: two injections of 0.05 ml buffered saline solution. Main outcome measures: Proportion of problem-free days; global rhinoconjunctivitis quality of life scores assessed weekly during pollen season. Results: The active treatment group and the placebo group did not differ in the proportion of problem-free days, quality of life scores, symptom severity scores, change in quantitative skin prick provocation threshold, or change in conjunctival provocation threshold. No clinically significant adverse reactions occurred. Conclusions: Enzyme potentiated desensitisation showed no treatment effect in this study. PMID:12896934
Maestú, Ceferino; Blanco, Manuel; Nevado, Angel; Romero, Julia; Rodríguez-Rubio, Patricia; Galindo, Javier; Lorite, Juan Bautista; de las Morenas, Francisco; Fernández-Argüelles, Pedro
2013-01-01
BACKGROUND: Exposure to electromagnetic fields has been reported to have analgesic and antinociceptive effects in several organisms. OBJECTIVE: To test the effect of very low-intensity transcranial magnetic stimulation on symptoms associated with fibromyalgia syndrome. METHODS: A double-blinded, placebo-controlled clinical trial was performed in the Sagrado Corazón Hospital, Seville, Spain. Female fibromyalgia patients (22 to 50 years of age) were randomly assigned to either a stimulation group or a sham group. The stimulation group (n=28) was stimulated using 8 Hz pulsed magnetic fields of very low intensity, while the sham group (n=26) underwent the same protocol without stimulation. Pressure pain thresholds before and after stimulation were determined using an algometer during the eight consecutive weekly sessions of the trial. In addition, blood serotonin levels were measured and patients completed questionnaires to monitor symptom evolution. RESULTS: A repeated-measures ANOVA indicated statistically significant improvement in the stimulation group compared with the control group with respect to somatosensory pain thresholds, ability to perform daily activities, perceived chronic pain and sleep quality. While improvement in pain thresholds was apparent after the first stimulation session, improvement in the other three measures occurred after the sixth week. No significant between-group differences were observed in scores of depression, fatigue, severity of headaches or serotonin levels. No adverse side effects were reported in any of the patients. CONCLUSIONS: Very low-intensity magnetic stimulation may represent a safe and effective treatment for chronic pain and other symptoms associated with fibromyalgia. PMID:24308025
Towards the control of the modal energy transfer in transverse mode instabilities
NASA Astrophysics Data System (ADS)
Stihler, Christoph; Jauregui, Cesar; Tünnermann, Andreas; Limpert, Jens
2018-02-01
Thermally-induced refractive index gratings (RIG) in high-power fiber laser systems lead to transverse mode instabilities (TMI) above a certain average power threshold. The effect of TMI is currently the main limitation for the further average power scaling of fiber lasers and amplifiers with nearly diffraction-limited beam quality. In this work we experimentally investigate, for the first time, the growth of the RIG strength by introducing a phase-shift between the RIG and the modal interference pattern in a fiber amplifier. The experiments reveal that the RIG is strong enough to couple energy between different transverse modes even at powers significantly below the TMI threshold, provided that the introduced phase-shift is high enough. This indicates that, as the strength of the RIG further increases with increasing average output power, the RIG becomes more and more sensitive to even small noise-induced phase-shifts, which ultimately trigger TMI. Furthermore, it is shown that beam cleaning also occurs when a positive phase-shift is introduced, even above the TMI threshold. This finding will pave the way for the development of a new class of mitigation strategies for TMI, whose key feature is the control of the introduced phase-shift.
Determination of Cost-Effectiveness Threshold for Health Care Interventions in Malaysia.
Lim, Yen Wei; Shafie, Asrul Akmal; Chua, Gin Nie; Ahmad Hassali, Mohammed Azmi
2017-09-01
One major challenge in prioritizing health care using cost-effectiveness (CE) information is when alternatives are more expensive but more effective than existing technology. In such a situation, an external criterion in the form of a CE threshold that reflects the willingness to pay (WTP) per quality-adjusted life-year is necessary. To determine a CE threshold for health care interventions in Malaysia. A cross-sectional, contingent valuation study was conducted using a stratified multistage cluster random sampling technique in four states in Malaysia. One thousand thirteen respondents were interviewed in person for their socioeconomic background, quality of life, and WTP for a hypothetical scenario. The CE thresholds established using the nonparametric Turnbull method ranged from MYR12,810 to MYR22,840 (~US $4,000-US $7,000), whereas those estimated with the parametric interval regression model were between MYR19,929 and MYR28,470 (~US $6,200-US $8,900). Key factors that affected the CE thresholds were education level, estimated monthly household income, and the description of health state scenarios. These findings suggest that there is no single WTP value for a quality-adjusted life-year. The CE threshold estimated for Malaysia was found to be lower than the threshold value recommended by the World Health Organization. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Relationship Between Consumer Acceptability and Pungency-Related Flavor Compounds of Vidalia Onions.
Kim, Ha-Yeon; Jackson, Daniel; Adhikari, Koushik; Riner, Cliff; Sanchez-Brambila, Gabriela
2017-10-01
A consumer study was conducted to evaluate preferences in Vidalia onions, and define consumer acceptability thresholds for commonly analyzed flavor compounds associated with pungency. Two varieties of Vidalia onions (Plethora and Sapelo Sweet) were grown at 3 fertilizer application rates (37.5 and 0; 134.5 and 59.4; and 190 and 118.8 kg/ha of nitrogen and sulfur, respectively), creating 6 treatments with various flavor attributes to use in the study. Bulb soluble solids, sugars, pyruvic acid, lachrymatory factor (LF; propanethial S-oxide), and methyl thiosulfinate (MT) content were determined and compared to sensory responses for overall liking, intensity of the sharp/pungent/burning sensation (SPB), and intent to buy provided by 142 consumers. Onion pyruvate, LF, MT, and sugar content increased as fertilization rate increased, regardless of onion variety. Consumer responses showed participants preferred onions with low SPB, which correlated positively to lower pyruvate, LF and MT concentrations, but showed no relationship to total sugars in the onion bulb. Regression analyses revealed that the majority of consumers (≥55%) found the flavor of Vidalia onions acceptable when the concentrations of LF, pyruvic acid, and MT within the bulbs were below 2.21, 4.83, and 0.43 nmol/mL, respectively. These values will support future studies aimed at identifying the optimal cultivation practices for production of sweet Vidalia onions, and can serve as an industry benchmark for quality control, thus ensuring the flavor of Vidalia onions will be acceptable to the majority of consumers. This study identified the relationship between consumer preferences and commonly analyzed flavor compounds in Vidalia onions, and established thresholds for these compounds at concentrations which the majority of consumers will find desirable. These relationships and thresholds will support future research investigating how cultural practices impact onion quality, and can be used to assist growers in variety selection decisions. In addition, this information will provide a benchmark to Vidalia onion producers for quality control of the sweet onions produced, ensuring that the onions are consistently of a desired quality, thereby increasing consumer's reliability in the Vidalia onion brand. © 2017 Institute of Food Technologists®.
Water quality and bed sediment quality in the Albemarle Sound, North Carolina, 2012–14
Moorman, Michelle C.; Fitzgerald, Sharon A.; Gurley, Laura N.; Rhoni-Aref, Ahmed; Loftin, Keith A.
2017-01-23
The Albemarle Sound region was selected in 2012 as one of two demonstration sites in the Nation to test and improve the design of the National Water Quality Monitoring Council’s National Monitoring Network (NMN) for U.S. Coastal Waters and Tributaries. The goal of the NMN for U.S. Coastal Waters and Tributaries is to provide information about the health of our oceans, coastal ecosystems, and inland influences on coastal waters for improved resource management. The NMN is an integrated, multidisciplinary, and multi-organizational program using multiple sources of data and information to augment current monitoring programs. This report presents and summarizes selected water-quality and bed sediment-quality data collected as part of the demonstration project conducted in two phases. The first phase was an occurrence and distribution study to assess nutrients, metals, pesticides, cyanotoxins, and phytoplankton communities in the Albemarle Sound during the summer of 2012 at 34 sites in Albemarle Sound, nearby sounds, and various tributaries. The second phase consisted of monthly sampling over a year (March 2013 through February 2014) to assess seasonality in a more limited set of constituents including nutrients, cyanotoxins, and phytoplankton communities at a subset (eight) of the sites sampled in the first phase. During the summer of 2012, few constituent concentrations exceeded published water-quality thresholds; however, elevated levels of chlorophyll a and pH were observed in the northern embayments and in Currituck Sound. Chlorophyll a and metals (copper, iron, and zinc) were detected above a water-quality threshold. The World Health Organization provisional guideline based on cyanobacterial density for high recreational risk was exceeded in approximately 50 percent of water samples collected during the summer of 2012. Cyanobacteria capable of producing toxins were present, but only low levels of cyanotoxins below human health benchmarks were detected. Finally, 12 metals in surficial bed sediments were detected at levels above a published sediment-quality threshold. These metals included chromium, mercury, copper, lead, arsenic, nickel, and cadmium. Sites with several metal concentrations above the respective thresholds had relatively high concentrations of organic carbon or fine sediment (silt plus clay), or both, and were predominantly located in the western and northwestern parts of the Albemarle Sound. Results from the second phase were generally similar to those of the first in that relatively few constituents exceeded a water-quality threshold, both pH and chlorophyll a were detected above the respective water-quality thresholds, and many of these elevated concentrations occurred in the northern embayments and in Currituck Sound. In contrast to the results from phase one, the cyanotoxin microcystin was detected at more than 10 times the water-quality threshold during a phytoplankton bloom on the Chowan River at Mount Gould, North Carolina, in August of 2013. This was the only cyanotoxin concentration measured during the entire study that exceeded a respective water-quality threshold. The information presented in this report can be used to improve understanding of water-quality conditions in the Albemarle Sound, particularly when evaluating causal and response variables that are indicators of eutrophication.
In particular, this information can be used by State agencies to help develop water-quality criteria for nutrients, and to understand factors like cyanotoxins that may affect fisheries and recreation in the Albemarle Sound region.
Lauche, Romy; Cramer, Holger; Hohmann, Claudia; Choi, Kyung-Eun; Rampp, Thomas; Saha, Felix Joyonto; Musial, Frauke; Langhorst, Jost; Dobos, Gustav
2012-01-01
Introduction. Cupping has been used since antiquity in the treatment of pain conditions. In this pilot study, we investigated the effect of traditional cupping therapy on chronic nonspecific neck pain (CNP) and mechanical sensory thresholds. Methods. Fifty CNP patients were randomly assigned to treatment (TG, n = 25) or waiting list control group (WL, n = 25). TG received a single cupping treatment. Pain at rest (PR), pain related to movement (PM), quality of life (SF-36), Neck Disability Index (NDI), mechanical detection (MDT), vibration detection (VDT), and pressure pain thresholds (PPT) were measured before and three days after a single cupping treatment. Patients also kept a pain and medication diary (PaDi, MeDi) during the study. Results. Baseline characteristics were similar in the two groups. After cupping, TG reported significantly less pain (PR: −17.9 mm VAS, 95%CI −29.2 to −6.6; PM: −19.7, 95%CI −32.2 to −7.2; PaDi: −1.5 points on NRS, 95%CI −2.5 to −0.4; all P < 0.05) and higher quality of life than WL (SF-36, Physical Functioning: 7.5, 95%CI 1.4 to 13.5; Bodily Pain: 14.9, 95%CI 4.4 to 25.4; Physical Component Score: 5.0, 95%CI 1.4 to 8.5; all P < 0.05). No significant effect was found for NDI, MDT, or VDT, but TG showed significantly higher PPT at pain-areas than WL (in lg(kPa); pain-maximum: 0.088, 95%CI 0.029 to 0.148, pain-adjacent: 0.118, 95%CI 0.038 to 0.199; both P < 0.01). Conclusion. A single application of traditional cupping might be an effective treatment for improving pain, quality of life, and hyperalgesia in CNP. PMID:22203873
Scale Control and Quality Management of Printed Image Parameters
NASA Astrophysics Data System (ADS)
Novoselskaya, O. A.; Kolesnikov, V. L.; Solov'eva, T. V.; Nagornova, I. V.; Babluyk, E. B.; Trapeznikova, O. V.
2017-06-01
The article compares the main evaluation techniques for the regulated printability parameter of offset paper under the current standards GOST 24356 and ISO 3783:2006. The results of the development and implementation of a complex test scale for managing and controlling the quality of printed production are presented. The estimation scale is introduced. It includes normalized parameters of print optical density, print uniformity, picking-out speed, the value of dot gain, and print contrast, with the added criteria of microtext minimization, paper slip, resolution threshold, and effusing ability of the paper surface. The results of the analysis allow the surface properties of the substrate to be formed in a directed way so that the required printed image parameters are achieved, i.e., a print optical density of at least 1.3 and print uniformity with dot-gain deviation on the order of 10 percent.
A low-threshold high-index-contrast grating (HCG)-based organic VCSEL
NASA Astrophysics Data System (ADS)
Shayesteh, Mohammad Reza; Darvish, Ghafar; Ahmadi, Vahid
2015-12-01
We propose a low-threshold high-index-contrast grating (HCG)-based organic vertical-cavity surface-emitting laser (OVCSEL). The device is designed to allow both electrical and optical excitation. The microcavity of the laser is a hybrid photonic crystal (HPC) in which the top distributed Bragg reflector (DBR) is replaced by a sub-wavelength high-contrast-grating layer, which provides a high quality factor. The simulated quality factor of the microcavity is shown to be as high as 282,000. We also investigate the threshold behavior and the dynamics of the OVCSEL optically pumped with sub-picosecond pulses. Results from numerical simulation show that the lasing threshold is 75 nJ/cm2.
Rosecrans, Celia Z.; Nolan, Bernard T.; Gronberg, JoAnn M.
2018-01-31
The purpose of the prediction grids for selected redox constituents—dissolved oxygen and dissolved manganese—are intended to provide an understanding of groundwater-quality conditions at the domestic and public-supply drinking water depths. The chemical quality of groundwater and the fate of many contaminants is influenced by redox processes in all aquifers, and understanding the redox conditions horizontally and vertically is critical in evaluating groundwater quality. The redox condition of groundwater—whether oxic (oxygen present) or anoxic (oxygen absent)—strongly influences the oxidation state of a chemical in groundwater. The anoxic dissolved oxygen thresholds of <0.5 milligram per liter (mg/L), <1.0 mg/L, and <2.0 mg/L were selected to apply broadly to regional groundwater-quality investigations. Although the presence of dissolved manganese in groundwater indicates strongly reducing (anoxic) groundwater conditions, it is also considered a “nuisance” constituent in drinking water, making drinking water undesirable with respect to taste, staining, or scaling. Three dissolved manganese thresholds, <50 micrograms per liter (µg/L), <150 µg/L, and <300 µg/L, were selected to create predicted probabilities of exceedances in depth zones used by domestic and public-supply water wells. The 50 µg/L event threshold represents the secondary maximum contaminant level (SMCL) benchmark for manganese (U.S. Environmental Protection Agency, 2017; California Division of Drinking Water, 2014), whereas the 300 µg/L event threshold represents the U.S. Geological Survey (USGS) health-based screening level (HBSL) benchmark, used to put measured concentrations of drinking-water contaminants into a human-health context (Toccalino and others, 2014). The 150 µg/L event threshold represents one-half the USGS HBSL. The resultant dissolved oxygen and dissolved manganese prediction grids may be of interest to water-resource managers, water-quality researchers, and groundwater modelers concerned with the occurrence of natural and anthropogenic contaminants related to anoxic conditions. Prediction grids for selected redox constituents and thresholds were created by the USGS National Water-Quality Assessment (NAWQA) modeling and mapping team.
Ryan, Andrew; Sutton, Matthew; Doran, Tim
2014-01-01
Objective: To test whether receiving a financial bonus for quality in the Premier Hospital Quality Incentive Demonstration (HQID) stimulated subsequent quality improvement. Data: Hospital-level data on process-of-care quality from Hospital Compare for the treatment of acute myocardial infarction (AMI), heart failure, and pneumonia for 260 hospitals participating in the HQID from 2004 to 2006; receipt of quality bonuses in the first 3 years of HQID from the Premier Inc. website; and hospital characteristics from the 2005 American Hospital Association Annual Survey. Study Design: Under the HQID, hospitals received a 1 percent bonus on Medicare payments for scoring between the 80th and 90th percentiles on a composite quality measure, and a 2 percent bonus for scoring at the 90th percentile or above. We used a regression discontinuity design to evaluate whether hospitals with quality scores just above these payment thresholds improved more in the subsequent year than hospitals with quality scores just below the thresholds. In alternative specifications, we examined samples of hospitals scoring within 3, 5, and 10 percentage point “bandwidths” of the thresholds. We used a Generalized Linear Model to estimate whether the relationship between quality and lagged quality was discontinuous at the lagged thresholds required for quality bonuses. Principal Findings: There were no statistically significant associations between receipt of a bonus and subsequent quality performance, with the exception of the 2 percent bonus for AMI in 2006 using the 5 percentage point bandwidth (0.8 percentage point increase, p < .01), and the 1 percent bonus for pneumonia in 2005 using all bandwidths (3.7 percentage point increase using the 3 percentage point bandwidth, p < .05). Conclusions: We found little evidence that hospitals' receipt of quality bonuses was associated with subsequent improvement in performance. This raises questions about whether winning in pay-for-performance programs, such as Hospital Value-Based Purchasing, will lead to subsequent quality improvement. PMID:23909992
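For readers unfamiliar with the regression discontinuity idea, the hedged sketch below compares hospitals just above and just below a lagged bonus cutoff within a chosen bandwidth. It is a generic OLS illustration with synthetic data; the variable names, cutoff, and bandwidth are assumptions, not the authors' Generalized Linear Model specification.

# Hedged sketch of a regression-discontinuity-style check around a bonus threshold:
# within a +/- bandwidth of the lagged composite score cutoff, regress subsequent
# quality on lagged quality plus a "received bonus" indicator. The cutoff and
# bandwidth values are placeholders for illustration.
import numpy as np

def rd_bonus_effect(lagged_score, next_score, cutoff, bandwidth):
    lagged_score, next_score = np.asarray(lagged_score), np.asarray(next_score)
    in_band = np.abs(lagged_score - cutoff) <= bandwidth
    x, y = lagged_score[in_band], next_score[in_band]
    bonus = (x >= cutoff).astype(float)
    # OLS: next = b0 + b1*lagged + b2*bonus; b2 estimates the discontinuity
    X = np.column_stack([np.ones_like(x), x, bonus])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[2]

rng = np.random.default_rng(0)
lagged = rng.uniform(60, 100, 500)
nxt = 0.8 * lagged + rng.normal(0, 3, 500)        # no true bonus effect here
print(rd_bonus_effect(lagged, nxt, cutoff=80.0, bandwidth=5.0))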
Reactive power and voltage control strategy based on dynamic and adaptive segment for DG inverter
NASA Astrophysics Data System (ADS)
Zhai, Jianwei; Lin, Xiaoming; Zhang, Yongjun
2018-03-01
The inverter of distributed generation (DG) can supply reactive power to help solve the problem of out-of-limit voltage in an active distribution network (ADN). Therefore, a reactive power and voltage control strategy based on dynamic and adaptive segmentation of the Q(U) droop curve for the DG inverter is put forward in this paper to control voltage actively. The proposed strategy adjusts the segmented voltage thresholds of the Q(U) droop curve dynamically and adaptively according to the voltage at the grid-connected point and the power direction of the adjacent downstream line. The reactive power reference of the DG inverter is then obtained through the modified Q(U) control strategy, and the reactive power of the inverter is controlled to track this reference value. The proposed control strategy not only regulates the local voltage at the grid-connected point but also helps to maintain voltage within the qualified range, considering the terminal voltage of the distribution feeder and the reactive support for the adjacent downstream DG. A scheme using the proposed strategy is compared with a scheme without reactive support from the DG inverter and with a scheme using a Q(U) control strategy with constant segmented voltage thresholds. The simulation results suggest that the proposed method significantly mitigates out-of-limit voltage, restrains voltage variation, and improves voltage quality.
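A minimal sketch of a piecewise Q(U) droop characteristic with adjustable segment thresholds is shown below; all per-unit values are illustrative assumptions, and the dynamic adaptation described in the paper would be layered on top by shifting the thresholds u1..u4 each control cycle.

# Hedged sketch of a piecewise Q(U) droop characteristic with adjustable segment
# thresholds, in the spirit of the strategy above. The dead band, limits, and
# q_max are illustrative assumptions, not the authors' settings.
def q_reference(u_pu, u1=0.97, u2=0.99, u3=1.01, u4=1.03, q_max=0.4):
    """Reactive power reference (p.u.) from grid-connected-point voltage (p.u.)."""
    if u_pu <= u1:                      # low voltage: inject maximum reactive power
        return q_max
    if u_pu < u2:                       # ramp injection down toward the dead band
        return q_max * (u2 - u_pu) / (u2 - u1)
    if u_pu <= u3:                      # dead band: no reactive exchange
        return 0.0
    if u_pu < u4:                       # ramp absorption up above the dead band
        return -q_max * (u_pu - u3) / (u4 - u3)
    return -q_max                       # high voltage: absorb maximum reactive power

# A "dynamic and adaptive" variant could shift u1..u4 each control cycle, e.g.
# based on downstream power direction, before calling q_reference().
for u in (0.95, 0.98, 1.00, 1.02, 1.05):
    print(u, round(q_reference(u), 3))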
Toward a perceptual video-quality metric
NASA Astrophysics Data System (ADS)
Watson, Andrew B.
1998-07-01
The advent of widespread distribution of digital video creates a need for automated methods for evaluating the visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics, and the economic need to reduce bit-rate to the lowest level that yields acceptable quality. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. Here I describe a new video quality metric that is an extension of these still image metrics into the time domain. Like the still image metrics, it is based on the Discrete Cosine Transform. An effort has been made to minimize the amount of memory and computation required by the metric, in order that it might be applied in the widest range of applications. To calibrate the basic sensitivity of this metric to spatial and temporal signals we have made measurements of visual thresholds for temporally varying samples of DCT quantization noise.
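As a toy illustration of DCT-domain, threshold-weighted error pooling (not the metric described above), one might compute blockwise DCT differences between reference and test frames, normalize them by per-coefficient visibility thresholds, and pool them with a Minkowski sum; the flat threshold matrix and the exponent below are assumptions.

# Toy illustration: pool 8x8 block-DCT error between a reference and test frame
# after dividing by per-coefficient visibility thresholds, so that sub-threshold
# quantization noise contributes little. Not the metric described above.
import numpy as np
from scipy.fft import dctn

def blockwise_dct_error(ref: np.ndarray, test: np.ndarray,
                        thresholds: np.ndarray, beta: float = 4.0) -> float:
    h, w = ref.shape
    errs = []
    for i in range(0, h - 7, 8):
        for j in range(0, w - 7, 8):
            d = dctn(test[i:i+8, j:j+8] - ref[i:i+8, j:j+8], norm='ortho')
            errs.append(np.abs(d) / thresholds)        # error in threshold units
    e = np.concatenate([a.ravel() for a in errs])
    return float((e ** beta).mean() ** (1.0 / beta))   # Minkowski pooling

ref = np.random.default_rng(1).random((16, 16))
test = ref + 0.01 * np.random.default_rng(2).standard_normal((16, 16))
print(blockwise_dct_error(ref, test, thresholds=np.full((8, 8), 0.02)))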
2013-01-01
We discuss the hypothesis proposed by Engstrom and coworkers that migraineurs have a relative sleep deprivation, which lowers the pain threshold and predisposes to attacks. Previous data indicate that migraineurs have a reduction of Cyclic Alternating Pattern (CAP), an essential mechanism of NREM sleep regulation that dampens the effect of incoming disruptive stimuli and protects sleep. The modifications of CAP observed in migraineurs are similar to those observed in patients with impaired arousal (narcolepsy) and after sleep deprivation. The impairment of this mechanism makes migraineurs more vulnerable to stimuli triggering attacks during sleep, and represents part of a more general vulnerability to incoming stimuli. PMID:23758606
Wavelet methodology to improve single unit isolation in primary motor cortex cells
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.
2016-01-01
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein’s unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root mean square, and signal to noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
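A hedged sketch of one of the scheme/rule combinations named above (fixed-form threshold with soft thresholding) is given below, applied to a noisy spike-like waveform using the PyWavelets library. The db4 wavelet, decomposition level, and MAD-based noise estimate are illustrative choices, not the authors' spike-isolation pipeline.

# Hedged sketch: wavelet denoising with a fixed-form (universal) threshold and
# soft thresholding. Illustrative only; not the paper's spike-sorting method.
import numpy as np
import pywt

def wavelet_denoise(signal: np.ndarray, wavelet: str = 'db4', level: int = 4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate (MAD)
    thr = sigma * np.sqrt(2.0 * np.log(signal.size))      # fixed-form threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:signal.size]

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
clean = np.exp(-((t - 0.5) / 0.01) ** 2)                  # a spike-like waveform
noisy = clean + 0.1 * rng.standard_normal(t.size)
print(np.mean((wavelet_denoise(noisy) - clean) ** 2))     # mean-squared error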
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, C G; Mathews, S
2006-09-07
Current regulatory schemes use generic or industrial sector specific benchmarks to evaluate the quality of industrial stormwater discharges. While benchmarks can be a useful tool for facility stormwater managers in evaluating the quality of stormwater runoff, benchmarks typically do not take into account site-specific conditions, such as: soil chemistry, atmospheric deposition, seasonal changes in water source, and upstream land use. Failing to account for these factors may lead to unnecessary costs to trace a source of natural variation, or potentially missing a significant local water quality problem. Site-specific water quality thresholds, established through statistical evaluation of historic data, take these factors into account and are a better tool for the direct evaluation of runoff quality, and a more cost-effective trigger to investigate anomalous results. Lawrence Livermore National Laboratory (LLNL), a federal facility, established stormwater monitoring programs to comply with the requirements of the industrial stormwater permit and Department of Energy orders, which require the evaluation of the impact of effluent discharges on the environment. LLNL recognized the need to create a tool to evaluate and manage stormwater quality that would allow analysts to identify trends in stormwater quality and recognize anomalous results so that trace-back and corrective actions could be initiated. LLNL created the site-specific water quality threshold tool to better understand the nature of the stormwater influent and effluent, to establish a technical basis for determining when facility operations might be impacting the quality of stormwater discharges, and to provide "action levels" to initiate follow-up to analytical results. The threshold criteria were based on a statistical analysis of the historic stormwater monitoring data and a review of relevant water quality objectives.
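One simple way to turn historic monitoring data into a site-specific action level is an upper-percentile rule, sketched below. This is an illustration of the general idea, not LLNL's documented statistical procedure; the 95th-percentile choice and the minimum-sample rule are assumptions.

# Hedged illustration: derive a site-specific "action level" as an upper percentile
# of historic stormwater results, then compare a new result against it.
import numpy as np

def action_level(historic_results, percentile=95.0, min_samples=8):
    x = np.asarray(historic_results, dtype=float)
    x = x[~np.isnan(x)]
    if x.size < min_samples:
        return None                      # not enough history to set a threshold
    return float(np.percentile(x, percentile))

historic_zinc_ug_per_l = [12.0, 18.5, 9.7, 22.1, 15.3, 11.8, 30.2, 14.6, 19.9]
level = action_level(historic_zinc_ug_per_l)
new_result = 41.0
print(level, "follow up" if new_result > level else "within historic range")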
Ekici, Gamze; Unal, Edibe; Akbayrak, Turkan; Vardar-Yagli, Naciye; Yakut, Yavuz; Karabulut, Erdem
2017-01-01
The authors of this study compared the effects of pilates exercises and connective tissue massage (CTM) on pain intensity; pain-pressure threshold; and tolerance, anxiety, progress, and health-related quality of life in females with fibromyalgia. It was a pilot, assessor masked, randomized controlled trial conducted between January and August of 2013. Twenty-one women with fibromyalgia were randomly assigned to the pilates exercise program (six of whom did not complete the program), and 22 were randomly assigned to CTM (one of whom did not complete this program). Each group received the assigned intervention three times per week during a 4-week period. The Visual Analogue Scale, algometry, State-Trait Anxiety Inventory, Fibromyalgia Impact Questionnaire, and Nottingham Health Profile were used at baseline and at the end of treatments. Significant improvements were found in both groups for all parameters. However, the scores for pain-pressure threshold were significantly elevated and the symptoms of anxiety were significantly diminished in the exercise group compared to the massage group. Thus, exercise and massage might be used to provide improvements in women with fibromyalgia. The exercise group showed more advantages than the massage group and thus might be preferred for patients with fibromyalgia. However, an adequately powered trial is required to determine this with certainty.
Castro Sánchez, Adelaida M; García López, Hector; Fernández Sánchez, Manuel; Pérez Mármol, José Manuel; Aguilar-Ferrándiz, María Encarnación; Luque Suárez, Alejandro; Matarán Peñarrocha, Guillermo Adolfo
2018-04-23
To compare the effectiveness of dry needling versus myofascial release on myofascial trigger points pain in cervical muscles, quality of life, impact of symptoms pain, quality of sleep, anxiety, depression, and fatigue in patients with fibromyalgia syndrome. A single-blind randomized controlled trial was conducted. Sixty-four subjects with fibromyalgia were randomly assigned to a dry needling group or a myofascial release group. Pain pressure thresholds of myofascial trigger points were evaluated in the cervical muscles. In addition, quality of life, impact of fibromyalgia symptoms, quality of sleep, intensity of pain, anxiety and depression symptoms, impact of fatigue at baseline and post treatment after four weeks of intervention were evaluated. Significant improvement was found in most pain pressure thresholds of the myofascial trigger points in cervical muscles in the dry needling group compared to myofascial release (p < 0.05). Similarly, these differences between groups were found for the components of quality of life of physical function (F = 12.74, p = 0.001), physical role (F = 11.24, p = 0.001), body pain (F =30.26, p < 0.001), general health (F = 15.83, p < 0.001), vitality (F = 13.51, p = 0.001), social function (F = 4.73, p = 0.034), emotional role (F = 8.01, p = 0.006), and mental health (F = 4.95, p = 0.030). Similar results were achieved for total impact of FMS symptoms (F = 42.91, p < 0.001), quality of sleep (F = 11.96, p = 0.001), state anxiety (F = 7.40, p = 0.009), and trait anxiety (F = -14.63, p < 0.001), hospital anxiety and depression (F = 20.60, p < 0.001), general pain intensity (F = 29.59, p < 0.001), and fatigue (F = -25.73, p < 0.001). The dry needling therapy showed higher improvements in comparison with myofascial release therapy for pain pressure thresholds, the components of quality of life of physical role, body pain, vitality and social function, as well as the total impact of FMS symptoms, quality of sleep, state and trait anxiety, hospital anxiety-depression, general pain intensity and fatigue. Implications for rehabilitation Dry needling therapy reduces myofascial trigger point pain in the short term in patients with fibromyalgia syndrome. This therapeutic approach improves anxiety, depression, fatigue symptoms, quality of life, and sleep after treatment. Dry needling and myofascial release therapies decrease intensity of pain, and the impact of fibromyalgia symptoms in this population. These intervention approaches should be considered in an independent manner as complementary therapies within a multidisciplinary setting.
Radiation-induced changes in taste acuity in cancer patients [gamma rays]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mossman, K.L.; Henkin, R.I.
1978-01-01
Changes in taste acuity were measured in 27 patients with various forms of cancer who received radiation to the head and neck region. In 9 of these patients (group I), measurements of taste acuity were made more than 1 year after completion of radiation therapy. In the other 18 patients (group II), taste measurements were made before, during, and approximately 1 month after radiation therapy. Taste acuity was measured for four taste qualities (salt, sweet, sour, and bitter) by a forced choice-three stimulus drop technique which measured detection and recognition thresholds and by a forced scaling technique which measured taste intensity responsiveness. In group II patients, impaired acuity, as indicated by elevated detection and recognition thresholds, was observed approximately 3 weeks after initiation of radiotherapy. The bitter and salt qualities showed the earliest and greatest impairment and the sweet quality the least. Taste intensity responsiveness also was impaired in group II patients. As for thresholds, scaling impairment was most severe for bitter and salt taste qualities. Scaling impairment occurred before changes in either detection or recognition thresholds. Detection and recognition thresholds determined in group I patients also showed salt and bitter qualities were affected more severely than either sweet or sour qualities. Zinc administration to group I patients in an uncontrolled study suggested that zinc therapy may be useful in ameliorating taste impairment in some patients. These results suggest that taste loss may be a factor in the anorexia and weight loss that is observed commonly in patients who have undergone radiation treatment. Correction of this abnormality may be useful in aiding the nutritional status of these patients.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... of Significant Deterioration (PSD) program to establish appropriate emission thresholds for... Mexico's December 1, 2010, proposed SIP revision establishes appropriate emissions thresholds for... appropriate thresholds for GHG permitting applicability into New Mexico's SIP, then paragraph (d) in Sec. 52...
NASA Astrophysics Data System (ADS)
Natarajan, Ajay; Hansen, John H. L.; Arehart, Kathryn Hoberg; Rossi-Katz, Jessica
2005-12-01
This study describes a new noise suppression scheme for hearing aid applications based on the auditory masking threshold (AMT) in conjunction with a modified generalized minimum mean square error estimator (GMMSE) for individual subjects with hearing loss. The representation of cochlear frequency resolution is achieved in terms of auditory filter equivalent rectangular bandwidths (ERBs). Estimation of AMT and spreading functions for masking is implemented in two ways: with normal auditory thresholds and normal auditory filter bandwidths (GMMSE-AMT[ERB]-NH) and with elevated thresholds and broader auditory filters characteristic of cochlear hearing loss (GMMSE-AMT[ERB]-HI). Evaluation is performed using speech corpora with objective quality measures (segmental SNR, Itakura-Saito), along with formal listener evaluations of speech quality rating and intelligibility. While no measurable changes in intelligibility occurred, evaluations showed quality improvement with both algorithm implementations. However, the customized formulation based on individual hearing losses was similar in performance to the formulation based on the normal auditory system.
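The ERB-based filter bandwidths mentioned above are commonly approximated with the Glasberg and Moore formula ERB(f) = 24.7(4.37 f/1000 + 1) Hz; the short sketch below evaluates it for a few center frequencies. This is a generic illustration of how such a front end could size its auditory filters, not the paper's implementation.

# Approximate ERB of the auditory filter at a given center frequency (Hz),
# using the standard Glasberg and Moore formula.
def erb_hz(center_freq_hz: float) -> float:
    return 24.7 * (4.37 * center_freq_hz / 1000.0 + 1.0)

for f in (250, 1000, 4000):
    print(f, round(erb_hz(f), 1))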
UrQt: an efficient software for the Unsupervised Quality trimming of NGS data.
Modolo, Laurent; Lerat, Emmanuelle
2015-04-29
Quality control is a necessary step of any Next Generation Sequencing analysis. Although customary, this step still requires manual interventions to empirically choose tuning parameters according to various quality statistics. Moreover, current quality control procedures that provide a "good quality" data set, are not optimal and discard many informative nucleotides. To address these drawbacks, we present a new quality control method, implemented in UrQt software, for Unsupervised Quality trimming of Next Generation Sequencing reads. Our trimming procedure relies on a well-defined probabilistic framework to detect the best segmentation between two segments of unreliable nucleotides, framing a segment of informative nucleotides. Our software only requires one user-friendly parameter to define the minimal quality threshold (phred score) to consider a nucleotide to be informative, which is independent of both the experiment and the quality of the data. This procedure is implemented in C++ in an efficient and parallelized software with a low memory footprint. We tested the performances of UrQt compared to the best-known trimming programs, on seven RNA and DNA sequencing experiments and demonstrated its optimality in the resulting tradeoff between the number of trimmed nucleotides and the quality objective. By finding the best segmentation to delimit a segment of good quality nucleotides, UrQt greatly increases the number of reads and of nucleotides that can be retained for a given quality objective. UrQt source files, binary executables for different operating systems and documentation are freely available (under the GPLv3) at the following address: https://lbbe.univ-lyon1.fr/-UrQt-.html .
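For intuition about threshold-based trimming, the toy sketch below keeps the single contiguous stretch of a read whose phred scores best exceed a user-chosen threshold t (a maximum-scoring subarray of q - t). It is a simplification for illustration only and is not UrQt's probabilistic segmentation.

# Toy read trimming: keep the contiguous segment maximizing the sum of (q - t),
# i.e. the stretch that best exceeds the phred threshold t. Not UrQt's algorithm.
def trim_read(seq: str, phred: list[int], t: int = 20) -> str:
    best = (0, 0, 0)                  # (score, start, end)
    score, start = 0, 0
    for i, q in enumerate(phred):
        score += q - t
        if score <= 0:
            score, start = 0, i + 1   # restart segment after a low-quality run
        elif score > best[0]:
            best = (score, start, i + 1)
    return seq[best[1]:best[2]]

print(trim_read("ACGTACGTAC", [5, 8, 30, 32, 31, 29, 33, 7, 6, 4], t=20))  # -> "GTACG"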
Using satellite data to guide emission control strategies for surface ozone pollution
NASA Astrophysics Data System (ADS)
Jin, X.; Fiore, A. M.
2017-12-01
Surface ozone (O3) has adverse effects on public health, agriculture and ecosystems. As a secondary pollutant, ozone is not emitted directly. Ozone forms from two classes of precursors: NOx and VOCs. We use satellite observations of formaldehyde (a marker of VOCs) and NO2 (a marker of NOx) to identify areas which would benefit more from reducing NOx emissions (NOx-limited) versus areas where reducing VOC emissions would lead to lower ozone (VOC-limited). We use a global chemical transport model (GEOS-Chem) to develop a set of threshold values that separate the NOx-limited and VOC-limited conditions. Combining these threshold values with a decadal record of satellite observations, we find that U.S. cities (e.g. New York, Chicago) have shifted from VOC-limited to NOx-limited ozone production regimes in the warm season. This transition reflects the NOx emission controls implemented over the past decade. Increasing NOx sensitivity implies that regional NOx emission control programs will improve O3 air quality more now than it would have a decade ago.
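A minimal sketch of the regime classification idea follows: compare the formaldehyde-to-NO2 column ratio against lower and upper cutoffs. The cutoff values in the code are placeholder assumptions, not the thresholds the authors derived from GEOS-Chem.

# Hedged sketch: classify the ozone production regime from the HCHO/NO2 column
# ratio. The 1.0/2.0 cutoffs are illustrative placeholders only.
def ozone_regime(hcho_column: float, no2_column: float,
                 low: float = 1.0, high: float = 2.0) -> str:
    fnr = hcho_column / no2_column
    if fnr < low:
        return "VOC-limited"        # reducing VOC emissions lowers ozone more
    if fnr > high:
        return "NOx-limited"        # reducing NOx emissions lowers ozone more
    return "transitional"

print(ozone_regime(hcho_column=8.0e15, no2_column=1.0e16))   # -> VOC-limited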
The effect of state policies on nursing home resident outcomes.
Mor, Vincent; Gruneir, Andrea; Feng, Zhanlian; Grabowski, David C; Intrator, Orna; Zinn, Jacqueline
2011-01-01
To test the effect of changes in Medicaid reimbursement on clinical outcomes of long-stay nursing home (NH) residents. Longitudinal, retrospective study of NHs, merging aggregated resident-level quality measures with facility characteristics and state policy survey data. All free-standing NHs in urban counties with at least 20 long-stay residents per quarter (length of stay > 90 days) in the continental United States between 1999 and 2005. Long-stay NH residents. Annual state Medicaid average per diem reimbursement and the presence of case-mix reimbursement in each year. Quarterly facility-aggregated, risk-adjusted quality-of-care measures surpassing a threshold for functional (activity of daily living) decline, physical restraint use, pressure ulcer incidence or worsening, and persistent pain. All outcomes showed an improvement trend over the study period, particularly physical restraint use. Facility fixed-effect regressions revealed that a $10 increase in Medicaid payment increased the likelihood of a NH meeting quality thresholds by 9% for functional decline, 5% for pain control, and 2% for pressure ulcers but not reduced use of physical restraints. Facilities in states that increased Medicaid payment most showed the greatest improvement in outcomes. The introduction of case-mix reimbursement was unrelated to quality improvement. Improvements in the clinical quality of NH care have been achieved, particularly where Medicaid payment has increased, generally from a lower baseline. Although this is a positive finding, challenges to implementing efficient reimbursement policies remain. © 2010, Copyright the Authors. Journal compilation © 2010, The American Geriatrics Society.
Ris, I; Søgaard, K; Gram, B; Agerbo, K; Boyle, E; Juul-Kristensen, B
2016-12-01
To investigate the effect of combining pain education, specific exercises and graded physical activity training (exercise) compared with pain education alone (control) on physical health-related quality of life (HR-QoL) in chronic neck pain patients. A multicentre randomised controlled trial of 200 neck pain patients receiving pain education. The exercise group received additional exercises for neck/shoulder, balance and oculomotor function, plus graded physical activity training. Patient-reported outcome measures (Short Form-36 Physical and Mental component summary scores, EuroQol-5D, Beck Depression Inventory-II, Neck Disability Index, Pain Bothersomeness, Patient-Specific Functioning Scale, Tampa Scale of Kinesiophobia, Global Perceived Effect) and clinical tests (Aastrand Physical Fitness, cervical Range of Motion, Pressure Pain Threshold at infraspinatus, tibialis anterior and cervical spine, Cranio-cervical Flexion, Cervical Extension muscle function, and oculomotion) were recorded at baseline and after 4 months. The exercise group showed statistically significant improvement in physical HR-QoL, mental HR-QoL, depression, cervical pressure pain threshold, cervical extension movement, muscle function, and oculomotion. Per protocol analyses confirmed these results with additional significant improvements in the exercise group compared with controls. This multimodal intervention may be an effective intervention for chronic neck pain patients. The trial was registered on www.ClinicalTrials.govNCT01431261 and at the Regional Scientific Ethics Committee of Southern Denmark S-20100069. Copyright © 2016 Elsevier Ltd. All rights reserved.
Pfeifle, C.A.; Cain, J.L.; Rasmussen, R.B.
2017-09-27
Surface-water supplies are important sources of drinking water for residents in the Triangle area of North Carolina, which is located within the upper Cape Fear and Neuse River Basins. Since 1988, the U.S. Geological Survey and a consortium of local governments have tracked water-quality conditions and trends in several of the area’s water-supply lakes and streams. This report summarizes data collected through this cooperative effort, known as the Triangle Area Water Supply Monitoring Project, during October 2013 through September 2014 (water year 2014) and October 2014 through September 2015 (water year 2015). Major findings for this period include: More than 5,500 individual measurements of water quality were made at a total of 15 sites—4 in the Neuse River Basin and 11 in the Cape Fear River Basin. Thirty water-quality properties or constituents were measured; State water-quality thresholds exist for 11 of these. All observations met State water-quality thresholds for temperature, hardness, chloride, fluoride, sulfate, and nitrate plus nitrite. North Carolina water-quality thresholds were exceeded one or more times for dissolved oxygen, dissolved-oxygen percent saturation, pH, turbidity, and chlorophyll a.
Quast, Troy
2013-01-01
The Patient Protection and Affordable Care Act (PPACA) includes a provision that penalizes insurance companies if their Medical Loss Ratio (MLR) falls below a specified threshold. The MLR is roughly measured as the ratio of health care expenses to premiums paid by enrollees. I investigate whether there is a relationship between MLRs and the quality of care provided by insurance companies. I employ a ten-year sample of market-level financial data and quality variables for Texas insurers, as well as relevant control variables, in regression analyses that utilize insurer and market fixed effects. Of the 15 quality measures, only one has a statistically significant relationship with the MLR. For this measure, the relationship is negative. Although the MLR provision may provide incentives for insurance companies to lower premiums, this sample does not suggest that there is likely to be a beneficial effect on quality.
A Data Quality Filter for PMU Measurements: Description, Experience, and Examples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Follum, James D.; Amidan, Brett G.
Networks of phasor measurement units (PMUs) continue to grow, and along with them, the amount of data available for analysis. With so much data, it is impractical to identify and remove poor quality data manually. The data quality filter described in this paper was developed for use with the Data Integrity and Situation Awareness Tool (DISAT), which analyzes PMU data to identify anomalous system behavior. The filter operates based only on the information included in the data files, without supervisory control and data acquisition (SCADA) data, state estimator values, or system topology information. Measurements are compared to preselected thresholds to determine if they are reliable. Along with the filter's description, examples of data quality issues from application of the filter to nine months of archived PMU data are provided. The paper is intended to aid the reader in recognizing and properly addressing data quality issues in PMU data.
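As a rough illustration of threshold-based screening of PMU samples (not the DISAT filter's actual rules or limits), a sketch might look like the following; the frequency and voltage limits are assumptions.

# Hedged sketch: flag PMU samples whose frequency or voltage magnitude falls
# outside preselected limits, using only the measurement itself.
def flag_pmu_sample(freq_hz: float, v_mag_pu: float,
                    f_lim=(59.0, 61.0), v_lim=(0.8, 1.2)) -> list[str]:
    flags = []
    if not (f_lim[0] <= freq_hz <= f_lim[1]):
        flags.append("frequency out of range")
    if not (v_lim[0] <= v_mag_pu <= v_lim[1]):
        flags.append("voltage magnitude out of range")
    return flags                      # empty list means the sample looks reliable

print(flag_pmu_sample(60.01, 1.02))   # -> []
print(flag_pmu_sample(58.2, 0.75))    # -> both flags raised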
Land cover controls on summer discharge and runoff solution chemistry of semi-arid urban catchments
NASA Astrophysics Data System (ADS)
Gallo, Erika L.; Brooks, Paul D.; Lohse, Kathleen A.; McLain, Jean E. T.
2013-04-01
Recharge of urban runoff to groundwater as a stormwater management practice has gained importance in semi-arid regions where water resources are scarce and urban centers are growing. Despite this trend, the importance of land cover in controlling semi-arid catchment runoff quantity and quality remains unclear. Here we address the question: How do land cover characteristics control the amount and quality of storm runoff in semi-arid urban catchments? We monitored summertime runoff quantity and quality from five catchments dominated by distinct urban land uses: low, medium, and high density residential, mixed use, and commercial. Increasing urban land cover increased runoff duration and the likelihood that a rainfall event would result in runoff, but did not increase the time to peak discharge of episodic runoff. The effect of urban land cover on hydrologic responses was tightly coupled to the magnitude of rainfall. At distinct rainfall thresholds, roads, percent impervious cover and the stormwater drainage network controlled runoff frequency, runoff depth and runoff ratios. Contrary to initial expectations, runoff quality did not vary in response to impervious cover or land use. We identified four major mechanisms controlling runoff quality: (1) variable solute sourcing due to land use heterogeneity and above ground catchment connectivity; (2) the spatial extent of pervious and biogeochemically active areas; (3) the efficiency of overland flow and runoff mobilization; and (4) solute flushing and dilution. Our study highlights the importance of the stormwater drainage system's characteristics in controlling urban runoff quantity and quality and suggests that enhanced wetting and in-stream processes may control solute sourcing and retention. Finally, we suggest that the characteristics of the stormwater drainage system should be integrated into stormwater management approaches.
Janzic, Andrej; Kos, Mitja
2015-04-01
Vitamin K antagonists, such as warfarin, are standard treatments for stroke prophylaxis in patients with atrial fibrillation. Patient outcomes depend on quality of warfarin management, which includes regular monitoring and dose adjustments. Recently, novel oral anticoagulants (NOACs) that do not require regular monitoring offer an alternative to warfarin. The aim of this study was to evaluate whether cost effectiveness of NOACs for stroke prevention in atrial fibrillation depends on the quality of warfarin control. We developed a Markov decision model to simulate warfarin treatment outcomes in relation to the quality of anticoagulation control, expressed as percentage of time in the therapeutic range (TTR). Standard treatment with adjusted-dose warfarin and improved anticoagulation control by genotype-guided dosing were compared with dabigatran, rivaroxaban, apixaban and edoxaban. The analysis was performed from the Slovenian healthcare payer perspective using 2014 costs. In the base case, the incremental cost-effectiveness ratio for apixaban, dabigatran and edoxaban was below the threshold of €25,000 per quality-adjusted life-years compared with adjusted-dose warfarin with a TTR of 60%. The probability that warfarin was a cost-effective option was around 1%. This percentage rises as the quality of anticoagulation control improves. At a TTR of 70%, warfarin was the preferred treatment in half the iterations. The cost effectiveness of NOACs for stroke prevention in patients with nonvalvular atrial fibrillation who are at increased risk for stroke is highly sensitive to warfarin anticoagulation control. NOACs are more likely to be cost-effective options in settings with poor warfarin management than in settings with better anticoagulation control, where they may not represent good value for money.
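The underlying comparison is an incremental cost-effectiveness ratio (ICER) checked against a willingness-to-pay threshold; the sketch below illustrates that calculation with made-up numbers, using the €25,000-per-QALY threshold mentioned above. The costs and QALYs are placeholders, not the study's results.

# Hedged sketch of the basic ICER-versus-threshold comparison.
def icer(cost_new: float, cost_old: float, qaly_new: float, qaly_old: float) -> float:
    return (cost_new - cost_old) / (qaly_new - qaly_old)

wtp_threshold = 25_000.0                       # EUR per QALY, as in the study setting
ratio = icer(cost_new=9_500.0, cost_old=4_000.0, qaly_new=8.45, qaly_old=8.20)
print(round(ratio), "cost-effective" if ratio <= wtp_threshold else "not cost-effective")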
NASA Technical Reports Server (NTRS)
Peterman, M.; McCrory, J. L.; Sharkey, N. A.; Piazza, S.; Cavanagh, P. R.
1999-01-01
The human zero-gravity locomotion simulator and the cadaver simulator offer a powerful combination for the study of the implications of exercise for maintaining bone quality during space flight. Such studies, when compared with controlled in-flight exercise programs, could help in the identification of a strain threshold for the prevention of bone loss during space flight.
Cost-Effectiveness of Intensive versus Standard Blood-Pressure Control.
Bress, Adam P; Bellows, Brandon K; King, Jordan B; Hess, Rachel; Beddhu, Srinivasan; Zhang, Zugui; Berlowitz, Dan R; Conroy, Molly B; Fine, Larry; Oparil, Suzanne; Morisky, Donald E; Kazis, Lewis E; Ruiz-Negrón, Natalia; Powell, Jamie; Tamariz, Leonardo; Whittle, Jeff; Wright, Jackson T; Supiano, Mark A; Cheung, Alfred K; Weintraub, William S; Moran, Andrew E
2017-08-24
In the Systolic Blood Pressure Intervention Trial (SPRINT), adults at high risk for cardiovascular disease who received intensive systolic blood-pressure control (target, <120 mm Hg) had significantly lower rates of death and cardiovascular disease events than did those who received standard control (target, <140 mm Hg). On the basis of these data, we wanted to determine the lifetime health benefits and health care costs associated with intensive control versus standard control. We used a microsimulation model to apply SPRINT treatment effects and health care costs from national sources to a hypothetical cohort of SPRINT-eligible adults. The model projected lifetime costs of treatment and monitoring in patients with hypertension, cardiovascular disease events and subsequent treatment costs, treatment-related risks of serious adverse events and subsequent costs, and quality-adjusted life-years (QALYs) for intensive control versus standard control of systolic blood pressure. We determined that the mean number of QALYs would be 0.27 higher among patients who received intensive control than among those who received standard control and would cost approximately $47,000 more per QALY gained if there were a reduction in adherence and treatment effects after 5 years; the cost would be approximately $28,000 more per QALY gained if the treatment effects persisted for the remaining lifetime of the patient. Most simulation results indicated that intensive treatment would be cost-effective (51 to 79% below the willingness-to-pay threshold of $50,000 per QALY and 76 to 93% below the threshold of $100,000 per QALY), regardless of whether treatment effects were reduced after 5 years or persisted for the remaining lifetime. In this simulation study, intensive systolic blood-pressure control prevented cardiovascular disease events and prolonged life and did so at levels below common willingness-to-pay thresholds per QALY, regardless of whether benefits were reduced after 5 years or persisted for the patient's remaining lifetime. (Funded by the National Heart, Lung, and Blood Institute and others; SPRINT ClinicalTrials.gov number, NCT01206062 .).
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-28
... Significant Deterioration (PSD) program to establish appropriate emission thresholds for determining which new... emissions above the thresholds established in the PSD regulations. DATES: This final rule is effective on... of GHG, and that do not limit PSD applicability to GHGs to the higher thresholds in the Tailoring...
A conditional probability analysis (CPA) approach has been developed for identifying biological thresholds of impact for use in the development of geographic-specific water quality criteria for protection of aquatic life. This approach expresses the threshold as the likelihood ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raja, R. Subramaniyan; Babu, G. Anandha; Ramasamy, P., E-mail: ramasamyp@ssn.edu.in
2016-05-23
Good quality single crystals of the pure hydrocarbon 1,3,5-triphenylbenzene (TPB) have been successfully grown from toluene solution by the controlled slow-cooling solution growth technique. TPB crystallizes in an orthorhombic structure with the space group Pna2₁. The structural perfection of the grown crystal has been analysed by high resolution X-ray diffraction measurements. The range and percentage of the optical transmission are ascertained by recording the UV-vis spectrum. Thermogravimetric analysis (TGA) and differential thermal analysis (DTA) were used to study its thermal properties. Powder second harmonic generation studies were carried out to explore its NLO properties. The laser damage threshold value has been determined using an Nd:YAG laser operating at 1064 nm.
Wavelet methodology to improve single unit isolation in primary motor cortex cells.
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A
2015-05-15
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. Copyright © 2015. Published by Elsevier B.V.
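As a concrete illustration of wavelet-coefficient thresholding, the sketch below applies a fixed-form (universal) threshold with a soft or hard rule to a 1-D recording. It assumes the PyWavelets package and a Daubechies-4 mother wavelet; it is a generic denoising recipe, not the authors' spike-isolation pipeline.

```python
# A minimal sketch of wavelet-domain denoising with a fixed-form (universal) threshold,
# assuming the PyWavelets package; not the authors' exact processing chain.
import numpy as np
import pywt

def denoise(signal, wavelet="db4", level=4, mode="soft"):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # Fixed-form ("universal") threshold of Donoho and Johnstone: sigma * sqrt(2 ln N).
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Keep the approximation coefficients, threshold the detail coefficients (soft or hard rule).
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode=mode) for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]
```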
Quantitative sensory studies in complex regional pain syndrome type 1/RSD.
Tahmoush, A J; Schwartzman, R J; Hopp, J L; Grothusen, J R
2000-12-01
Patients with complex regional pain syndrome type 1 (CRPS1) may have thermal allodynia after application of a non-noxious thermal stimulus to the affected limb. We measured the warm-detection, cold-detection, heat-evoked pain, and cold-evoked pain thresholds in 16 control subjects and in the affected area of patients with CRPS1/RSD to test the hypothesis that allodynia results from an abnormality in sensory physiology. A contact thermode was used to apply a constant 1°C/second increasing (warm and heat-evoked pain) or decreasing (cold and cold-evoked pain) thermal stimulus until the patient pressed the response button to show that a temperature change was felt. Student's t test was used to compare thresholds between patients and control subjects. The cold-evoked pain threshold in patients with CRPS1/RSD was significantly decreased compared with the threshold in control subjects (p <0.001); that is, a smaller decrease in temperature was necessary to elicit cold pain in patients with CRPS1/RSD than in control subjects. The heat-evoked pain threshold in patients with CRPS1/RSD was also significantly decreased compared with the threshold in control subjects (p <0.05). The warm- and cold-detection thresholds in patients with CRPS1/RSD were similar to the thresholds in control subjects. This study suggests that thermal allodynia in patients with CRPS1/RSD results from decreased cold-evoked and heat-evoked pain thresholds. The thermal pain thresholds are reset (decreased) so that non-noxious thermal stimuli are perceived as pain (allodynia).
Ding, Changfeng; Li, Xiaogang; Zhang, Taolin; Ma, Yibing; Wang, Xingxiang
2014-10-01
Soil environmental quality standards for heavy metals in farmland should be established considering both their effects on crop yield and their accumulation in the edible part. A greenhouse experiment was conducted to investigate the effects of chromium (Cr) on biomass production and Cr accumulation in carrot plants grown in a wide range of soils. The results revealed that carrot yield decreased significantly in 18 of the 20 soils when Cr was added at the level of the current soil environmental quality standard of China. The Cr content of carrot grown in the five soils with pH>8.0 exceeded the maximum allowable level (0.5 mg kg-1) according to the Chinese General Standard for Contaminants in Foods. The relationship between carrot Cr concentration and soil pH could be well fitted (R2=0.70, P<0.0001) by a linear-linear segmented regression model. The addition of Cr to soil affected carrot yield before it affected food quality. The major soil factors controlling Cr phytotoxicity and the prediction models were further identified and developed using path analysis and stepwise multiple linear regression analysis. Soil Cr thresholds that protect against phytotoxicity while ensuring food safety were then derived on the condition of a 10 percent yield reduction. Copyright © 2014 Elsevier Inc. All rights reserved.
McClellan, Sean R; Wu, Frances M; Snowden, Lonnie R
2012-06-01
Title VI of the 1964 Civil Rights Act prohibits federal funds recipients from providing care to limited English proficiency (LEP) persons more limited in scope or lower in quality than care provided to others. In 1999, the California Department of Mental Health implemented a "threshold language access policy" to meet its Title VI obligations. Under this policy, Medi-Cal agencies must provide language assistance programming in a non-English language where a county's Medi-Cal population contains either 3000 residents or 5% who speak that language. We examine the impact of threshold language policy-required language assistance programming on LEP persons' access to mental health services by analyzing the county-level penetration rate of services for Russian, Spanish, and Vietnamese speakers across 34 California counties, over 10 years of quarterly data. Using a time-series design with a nonequivalent control group, we studied this phenomenon using linear regression with random county effects to account for trends over time. Threshold language policy-required assistance programming led to an immediate and significant increase in the penetration rate of mental health services for Russian (8.2, P < 0.01) and Vietnamese (3.3, P < 0.01) language speaking persons. Threshold language assistance programming was effective in increasing mental health access for Russian and Vietnamese, but not for Spanish-speaking LEP persons.
NASA Astrophysics Data System (ADS)
Ward, V. L.; Singh, R.; Reed, P. M.; Keller, K.
2014-12-01
As water resources problems typically involve several stakeholders with conflicting objectives, multi-objective evolutionary algorithms (MOEAs) are now key tools for understanding management tradeoffs. Given the growing complexity of water planning problems, it is important to establish if an algorithm can consistently perform well on a given class of problems. This knowledge allows the decision analyst to focus on eliciting and evaluating appropriate problem formulations. This study proposes a multi-objective adaptation of the classic environmental economics "Lake Problem" as a computationally simple but mathematically challenging MOEA benchmarking problem. The lake problem abstracts a fictional town on a lake which hopes to maximize its economic benefit without degrading the lake's water quality to a eutrophic (polluted) state through excessive phosphorus loading. The problem poses the challenge of maintaining economic activity while confronting the uncertainty of potentially crossing a nonlinear and potentially irreversible pollution threshold beyond which the lake is eutrophic. Objectives for optimization are maximizing economic benefit from lake pollution, maximizing water quality, maximizing the reliability of remaining below the environmental threshold, and minimizing the probability that the town will have to drastically change pollution policies in any given year. The multi-objective formulation incorporates uncertainty with a stochastic phosphorus inflow abstracting non-point source pollution. We performed comprehensive diagnostics using 6 algorithms: Borg, MOEAD, eMOEA, eNSGAII, GDE3, and NSGAII to ascertain their controllability, reliability, efficiency, and effectiveness. The lake problem abstracts elements of many current water resources and climate related management applications where there is the potential for crossing irreversible, nonlinear thresholds. We show that many modern MOEAs can fail on this test problem, indicating its suitability as a useful and nontrivial benchmarking problem.
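For readers unfamiliar with the benchmark, the sketch below simulates a commonly used form of the lake phosphorus dynamics: pollution decays at a rate b but is recycled nonlinearly once concentrations are high, which produces the threshold behavior described above. The parameter values and the truncated-normal natural inflow are illustrative simplifications, not the exact formulation or parameters used in the study.

```python
# A minimal sketch of lake phosphorus dynamics of the kind underlying the benchmark;
# values are illustrative, not the study's parameters.
import numpy as np

def simulate_lake(loading, b=0.42, q=2.0, natural_mean=0.03, natural_std=0.005, seed=0):
    rng = np.random.default_rng(seed)
    x, trajectory = 0.0, []
    for a in loading:
        # Stochastic natural (non-point source) phosphorus inflow, truncated at zero.
        natural = max(rng.normal(natural_mean, natural_std), 0.0)
        # Phosphorus decays at rate b but is recycled nonlinearly at high concentrations,
        # creating the potentially irreversible eutrophication threshold.
        x = x + a + natural + x**q / (1.0 + x**q) - b * x
        trajectory.append(x)
    return np.asarray(trajectory)

concentrations = simulate_lake(loading=[0.05] * 100)
print(concentrations[-1])
```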
NASA Astrophysics Data System (ADS)
Juhlke, Florian; Lorber, Katja; Wagenstaller, Maria; Buettner, Andrea
2017-12-01
Chlorinated guaiacol derivatives are found in waste water of pulp mills using chlorine in the bleaching process of wood pulp. They can also be detected in fish tissue, possibly causing off-odors. To date, there is no systematic investigation on the odor properties of halogenated guaiacol derivatives. To close this gap, odor thresholds in air and odor qualities of 14 compounds were determined by gas chromatography-olfactometry. Overall, the investigated compounds elicited smells that are characteristic for guaiacol, namely smoky, sweet, vanilla-like, but also medicinal and plaster-like. Their odor thresholds in air were, however, very low, ranging from 0.00072 to 23 ng/L air. The lowest thresholds were found for 5-chloro- and 5-bromoguaiacol, followed by 4,5-dichloro- and 6-chloroguaiacol. Moreover, some inter-individual differences in odor threshold values could be observed, with the highest variations having been recorded for the individual values of 5-iodo- and 4-bromoguaiacol.
ChronQC: a quality control monitoring system for clinical next generation sequencing.
Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C
2018-05-15
ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
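To make the rule-based tracking concrete, the sketch below checks a QC metric time series against two common Westgard rules (one point beyond 3 SD; two consecutive points beyond 2 SD on the same side). It is a generic illustration of the idea, not ChronQC's actual implementation, and the metric values are made up.

```python
# A generic sketch of two common Westgard rules (1_3s and 2_2s) applied to a series of
# QC metric values; illustrative only, not ChronQC's implementation.
import numpy as np

def westgard_flags(values, mean, sd):
    z = (np.asarray(values, dtype=float) - mean) / sd
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                           # 1_3s: one point beyond 3 SD
            flags.append((i, "1_3s"))
        if i > 0 and z[i - 1] > 2 and zi > 2:     # 2_2s: two consecutive points beyond +2 SD
            flags.append((i, "2_2s"))
        if i > 0 and z[i - 1] < -2 and zi < -2:   # ... or two consecutive points beyond -2 SD
            flags.append((i, "2_2s"))
    return flags

print(westgard_flags([10.1, 10.3, 13.8, 10.2], mean=10.0, sd=1.0))  # [(2, '1_3s')]
```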
Multi-mode ultrasonic welding control and optimization
Tang, Jason C.H.; Cai, Wayne W
2013-05-28
A system and method for providing multi-mode control of an ultrasonic welding system. In one embodiment, the control modes include the energy of the weld, the time of the welding process and the compression displacement of the parts being welded during the welding process. The method includes providing thresholds for each of the modes, and terminating the welding process after the threshold for each mode has been reached, the threshold for more than one mode has been reached or the threshold for one of the modes has been reached. The welding control can be either open-loop or closed-loop, where the open-loop process provides the mode thresholds and once one or more of those thresholds is reached the welding process is terminated. The closed-loop control provides feedback of the weld energy and/or the compression displacement so that the weld power and/or weld pressure can be increased or decreased accordingly.
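A minimal open-loop sketch of the multi-mode termination logic is shown below: the weld stops once the configured combination of energy, time, and displacement thresholds has been reached, either any one of them or all of them. The function name, threshold values, and units are illustrative assumptions, not taken from the patent.

```python
# A minimal open-loop sketch of multi-mode weld termination; names and values are illustrative.
def should_terminate(energy_J, time_s, displacement_mm, thresholds, require="any"):
    reached = [
        energy_J >= thresholds["energy_J"],
        time_s >= thresholds["time_s"],
        displacement_mm >= thresholds["displacement_mm"],
    ]
    # "any": stop when one mode reaches its threshold; "all": stop only when every mode has.
    return all(reached) if require == "all" else any(reached)

limits = {"energy_J": 1200.0, "time_s": 0.8, "displacement_mm": 0.35}
print(should_terminate(1250.0, 0.42, 0.20, limits, require="any"))  # True: energy threshold reached
```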
Sofonia, Jeremy J; Unsworth, Richard K F
2010-01-01
Given the potential for adverse effects of ocean dredging on marine organisms, particularly benthic primary producer communities, the management and monitoring of those activities which cause elevated turbidity and sediment loading is critical. In practice, however, this has proven challenging, as the development of water quality threshold values, upon which management responses are based, is subject to a large number of physical and biological parameters that are spatially and temporally specific. As a consequence, monitoring programs to date have taken a wide range of different approaches, most focusing on measures of turbidity reported as nephelometric turbidity units (NTU). This paper presents a potential approach to the determination of water quality thresholds which utilises data gathered through the long-term deployment of in situ water instruments, but suggests a focus on photosynthetically active radiation (PAR) rather than NTU as it is more biologically relevant and inclusive of other site conditions. A simple mathematical approach to data interpretation is also presented, which facilitates the development of threshold values not as individual concentration values over specific intervals, but as an equation which may be utilized in numerical modelling.
Modeling Source Water Threshold Exceedances with Extreme Value Theory
NASA Astrophysics Data System (ADS)
Rajagopalan, B.; Samson, C.; Summers, R. S.
2016-12-01
Variability in surface water quality, influenced by seasonal and long-term climate changes, can impact drinking water quality and treatment. In particular, temperature and precipitation can impact surface water quality directly or through their influence on streamflow and dilution capacity. Furthermore, they also impact land surface factors, such as soil moisture and vegetation, which can in turn affect surface water quality, in particular levels of organic matter in surface waters, which are of concern. All of these will be exacerbated by anthropogenic climate change. While some source water quality parameters, particularly Total Organic Carbon (TOC) and bromide concentrations, are not directly regulated for drinking water, these parameters are precursors to the formation of disinfection byproducts (DBPs), which are regulated in drinking water distribution systems. These DBPs form when a disinfectant, added to the water to protect public health against microbial pathogens, most commonly chlorine, reacts with dissolved organic matter (DOM), measured as TOC or dissolved organic carbon (DOC), and inorganic precursor materials, such as bromide. Therefore, understanding and modeling the extremes of TOC and bromide concentrations is of critical interest for drinking water utilities. In this study we develop nonstationary extreme value analysis models for threshold exceedances of source water quality parameters, specifically TOC and bromide concentrations. Here, the threshold exceedances are modeled with a Generalized Pareto Distribution (GPD) whose parameters vary as a function of climate and land surface variables, thus enabling the temporal nonstationarity to be captured. We apply these models to threshold exceedances of source water TOC and bromide concentrations at two locations with different climates and find very good performance.
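The core of the peaks-over-threshold approach can be sketched with scipy: exceedances above a chosen threshold are fitted with a Generalized Pareto Distribution. This stationary version omits the covariate-dependent parameters that the study uses to capture nonstationarity, and the function names are illustrative.

```python
# A minimal stationary sketch of peaks-over-threshold modeling with a GPD; the study's
# model additionally lets the GPD parameters depend on climate and land-surface covariates.
import numpy as np
from scipy.stats import genpareto

def fit_exceedances(concentrations, threshold):
    data = np.asarray(concentrations, dtype=float)
    exceedances = data[data > threshold] - threshold
    # Fit the GPD to the exceedances, with the location parameter fixed at zero.
    shape, _loc, scale = genpareto.fit(exceedances, floc=0)
    return shape, scale, exceedances.size

def exceedance_quantile(threshold, shape, scale, p):
    # Concentration exceeded with probability p, conditional on exceeding the threshold.
    return threshold + genpareto.ppf(1 - p, shape, loc=0, scale=scale)
```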
Rowland, Michelle R; Lesnock, Jamie L; Farris, Coreen; Kelley, Joseph L; Krivak, Thomas C
2015-06-01
Treatment for advanced-stage epithelial ovarian cancer (AEOC) includes primary debulking surgery (PDS) or neoadjuvant chemotherapy (NACT). A randomized controlled trial comparing these treatments resulted in comparable overall survival (OS). Studies report more complications and lower chemotherapy completion rates in patients 65 years old or older receiving PDS. We sought to evaluate the cost implications of NACT relative to PDS in AEOC patients 65 years old or older. A 5 year Markov model was created. Arm 1 modeled PDS followed by 6 cycles of carboplatin and paclitaxel (CT). Arm 2 modeled 3 cycles of CT, followed by interval debulking surgery and then 3 additional cycles of CT. Parameters included OS, surgical complications, probability of treatment initiation, treatment cost, and quality of life (QOL). OS was assumed to be equal based on the findings of the international randomized control trial. Differences in surgical complexity were accounted for in base surgical cost plus add-on procedure costs weighted by occurrence rates. Hospital cost was a weighted average of diagnosis-related group costs weighted by composite estimates of complication rates. Sensitivity analyses were performed. Assuming equal survival, NACT produces a cost savings of $5616. If PDS improved median OS by 1.5 months or longer, PDS would be cost effective (CE) at a $100,000/quality-adjusted life-year threshold. If PDS improved OS by 3.2 months or longer, it would be CE at a $50,000 threshold. The model was robust to variation in costs and complication rates. Moderate decreases in the QOL with NACT would result in PDS being CE. A model based on the RCT comparing NACT and PDS showed NACT is a cost-saving treatment compared with PDS for AEOC in patients 65 years old or older. Small increases in OS with PDS or moderate declines in QOL with NACT would result in PDS being CE at the $100,000/quality-adjusted life-year threshold. Our results support further evaluation of the effects of PDS on OS, QOL and complications in AEOC patients 65 years old or older. Copyright © 2015 Elsevier Inc. All rights reserved.
Ding, Changfeng; Ma, Yibing; Li, Xiaogang; Zhang, Taolin; Wang, Xingxiang
2018-04-01
Cadmium (Cd) is an environmental toxicant with high rates of soil-plant transfer. It is essential to establish an accurate soil threshold for the implementation of soil management practices. This study takes root vegetables as an example to derive soil thresholds for Cd based on the food quality standard as well as health risk assessment using species sensitivity distribution (SSD). A soil type-specific bioconcentration factor (BCF, ratio of Cd concentration in plant to that in soil) generated from soil with a proper Cd concentration gradient was calculated and applied in the derivation of soil thresholds instead of a generic BCF value to minimize the uncertainty. The sensitivity variations of twelve root vegetable cultivars in accumulating soil Cd and the empirical soil-plant transfer model were investigated and developed in greenhouse experiments. After normalization, the hazardous concentrations from the fifth percentile of the distribution based on added Cd (HC5add) were calculated from the SSD curves fitted by a Burr Type III distribution. The derived soil thresholds were presented as continuous or scenario criteria depending on the combination of soil pH and organic carbon content. The soil thresholds based on the food quality standard were on average 0.7 times those based on health risk assessment, and were further validated to be reliable using independent data from field surveys and published articles. The results suggested that deriving soil thresholds for Cd using the SSD method is robust and also applicable to other crops as well as other trace elements that have the potential to cause health risk issues. Copyright © 2017 Elsevier B.V. All rights reserved.
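The SSD step can be illustrated with a short sketch: cultivar-level toxicity values are fitted with a distribution and the 5th percentile is taken as the hazardous concentration (HC5). The study fitted a Burr Type III distribution to normalized added-Cd values; the log-normal fit and the toxicity values below are simplified, illustrative stand-ins.

```python
# A simplified HC5 sketch using a log-normal SSD fitted by moments of the log toxicity values;
# the paper used a Burr Type III distribution and normalized added-Cd data.
import numpy as np
from scipy.stats import lognorm

def hc5(toxicity_values):
    log_values = np.log(np.asarray(toxicity_values, dtype=float))
    # Fit a log-normal SSD and take its 5th percentile (the HC5).
    return float(lognorm.ppf(0.05, s=np.std(log_values, ddof=1),
                             scale=np.exp(np.mean(log_values))))

# Illustrative cultivar toxicity values (mg/kg), not data from the study.
ec10_mg_per_kg = [1.8, 2.5, 3.1, 4.0, 4.6, 5.9, 7.2, 8.8, 10.5, 12.0, 15.3, 18.1]
print(hc5(ec10_mg_per_kg))
```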
Electrically controllable liquid crystal random lasers below the Fréedericksz transition threshold.
Lee, Chia-Rong; Lin, Jia-De; Huang, Bo-Yuang; Lin, Shih-Hung; Mo, Ting-Shan; Huang, Shuan-Yu; Kuo, Chie-Tong; Yeh, Hui-Chen
2011-01-31
This investigation elucidates for the first time electrically controllable random lasers below the threshold voltage in dye-doped liquid crystal (DDLC) cells with and without an added azo-dye. Experimental results show that the lasing intensities and the energy thresholds of the random lasers can be decreased and increased, respectively, by increasing the applied voltage below the Fréedericksz transition threshold. The below-threshold electrical controllability of the random lasers is attributable to the effective decrease of the spatial fluctuation of the orientational order, and thus of the dielectric tensor of the LCs, caused by increasing the electric-field-aligned order of the LCs below the threshold; this increases the diffusion constant and decreases the scattering strength of the fluorescence photons in their recurrent multiple scattering, which in turn decreases the lasing intensity of the random lasers and increases their energy thresholds. Furthermore, the addition of an azo-dye to the DDLC cell can reduce the range of working voltage below the threshold over which the random laser can be controlled.
Color vision testing with a computer graphics system: preliminary results.
Arden, G; Gündüz, K; Perry, S
1988-06-01
We report a method for computer enhancement of color vision tests. In our graphics system 256 colors are selected from a much larger range and displayed on a screen divided into 768 x 288 pixels. Eight-bit digital-to-analogue converters drive a high quality monitor with separate inputs to the red, green, and blue amplifiers and calibrated gun chromaticities. The graphics are controlled by a PASCAL program written for a personal computer, which calculates the values of the red, green, and blue signals and specifies them in Commission Internationale de l'Eclairage X, Y, and Z fundamentals, so changes in chrominance occur without changes in luminance. The system for measuring color contrast thresholds with gratings is more than adequate in normal observers. In patients with mild retinal damage in whom other tests of visual function are normal, this method of testing color vision shows specific increases in contrast thresholds along tritan color-confusion lines. By the time the Hardy-Rand-Rittler and Farnsworth-Munsell 100-hue tests disclose abnormalities, gross defects in color contrast threshold can be seen with our system.
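The key computation, holding luminance fixed while varying chromaticity, can be sketched as below: a target CIE chromaticity (x, y) and luminance Y are converted to XYZ and then to linear gun values. The paper's system used the monitor's own calibrated gun chromaticities; the sRGB primary matrix here is only an assumed stand-in for such a calibration.

```python
# A sketch of generating equiluminant colors: xyY -> XYZ -> linear RGB with Y held fixed.
# The sRGB matrix is an assumed stand-in for the monitor's calibrated gun chromaticities.
import numpy as np

XYZ_TO_LINEAR_RGB = np.array([[ 3.2406, -1.5372, -0.4986],
                              [-0.9689,  1.8758,  0.0415],
                              [ 0.0557, -0.2040,  1.0570]])

def equiluminant_rgb(x, y, Y):
    # xyY -> XYZ: X = xY/y, Z = (1 - x - y)Y/y, so the luminance Y is held constant by design.
    xyz = np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])
    rgb = XYZ_TO_LINEAR_RGB @ xyz
    if np.any(rgb < 0) or np.any(rgb > 1):
        raise ValueError("chromaticity out of the display gamut at this luminance")
    return rgb

print(equiluminant_rgb(0.31, 0.33, 0.4))  # near-neutral test patch
```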
Economic Analysis Supporting the Increase of the Unspecified Minor Military Construction Threshold
2016-06-01
By Clifford L. Kelsey; Advisors: Philip Candreva and Amilcar Menichini. This report analyzes the economical, technological, and environmental challenges U.S. Navy engineers face in constructing quality, usable ...
Del Paggio, Joseph C; Sullivan, Richard; Schrag, Deborah; Hopman, Wilma M; Azariah, Biju; Pramesh, C S; Tannock, Ian F; Booth, Christopher M
2017-07-01
The American Society of Clinical Oncology (ASCO) and the European Society for Medical Oncology (ESMO) have developed frameworks that quantify survival gains in light of toxicity and quality of life to assess the benefits of cancer therapies. We applied these frameworks to a cohort of contemporary randomised controlled trials to explore agreement between the two approaches and to assess the relation between treatment benefit and cost. We identified all randomised controlled trials of systemic therapies in non-small-cell lung cancer, breast cancer, colorectal cancer, and pancreatic cancer published between Jan 1, 2011, and Dec 31, 2015, and assessed their abstracts and methods. Trials were eligible for inclusion in our cohort if significant differences favouring the experimental group in a prespecified primary or secondary outcome were reported (secondary outcomes were assessed only if primary outcomes were not significant). We assessed trial endpoints with the ASCO and ESMO frameworks at two timepoints 3 months apart to confirm intra-rater reliability. Cohen's κ statistic was calculated to establish agreement between the two frameworks on the basis of the median ASCO score, which was used as an arbitrary threshold of benefit, and the framework-recommended ESMO threshold. Differences in monthly drug cost between the experimental and control groups of each randomised controlled trial (ie, incremental drug cost) were derived from 2016 average wholesale prices. 109 randomised controlled trials were eligible for inclusion, 42 (39%) in non-small-cell lung cancer, 36 (33%) in breast cancer, 25 (23%) in colorectal cancer, and six (6%) in pancreatic cancer. ASCO scores ranged from 2 to 77; median score was 25 (IQR 16-35). 41 (38%) trials met the benefit thresholds in the ESMO framework. Agreement between the two frameworks was fair (κ=0·326). Among the 100 randomised controlled trials for which drug costing data were available, ASCO benefit score and monthly incremental drug costs were negatively correlated (ρ=-0·207; p=0·039). Treatments that met ESMO benefit thresholds had a lower median incremental drug cost than did those that did not meet benefit thresholds (US$2981 [IQR 320-9059] vs $8621 [1174-13 930]; p=0·018). There is only fair correlation between these two major value care frameworks, and negative correlations between framework outputs and drug costs. Delivery of optimal cancer care in a sustainable health system will necessitate future oncologists, investigators, and policy makers to reconcile the disconnect between drug cost and clinical benefit. None. Copyright © 2017 Elsevier Ltd. All rights reserved.
Serón, P; Riedemann, P; Muñoz, S; Doussoulin, A; Villarroel, P; Cea, X
2005-11-01
Chronic airflow limitation (CAL) is a significant cause of illness and death. Inspiratory muscle training has been described as a technique for managing CAL. The aim of the present study was to evaluate the effectiveness of inspiratory muscle training on improving physiological and functional variables. Randomized controlled trial in which 35 patients with CAL were assigned to receive either an experimental (n=17) or control (n=18) intervention. The experimental intervention consisted of 2 months of inspiratory muscle training using a device that administered a resistive load of 40% of maximal static inspiratory mouth pressure (PImax). Inspiratory muscle strength, exercise tolerance, respiratory function, and quality of life were assessed. Significant improvement in inspiratory muscle strength was observed in the experimental training group (P=.02). All patients improved over time in both groups (P<.001). PImax increased by 8.9 cm H2O per month of training. Likewise, the health-related quality of life scores improved by 0.56 points. Use of a threshold loading device is effective for strengthening inspiratory muscles as measured by PImax after the first month of training in patients with CAL. The long-term effectiveness of such training and its impact on quality of life should be studied in a larger number of patients.
Murphy, Colin W; Parker, Nathan C
2014-02-18
Air pollution emissions regulation can affect the location, size, and technology choice of potential biofuel production facilities. Difficulty in obtaining air pollutant emission permits and the cost of air pollution control devices have been cited by some fuel producers as barriers to development. This paper expands on the Geospatial Bioenergy Systems Model (GBSM) to evaluate the effect of air pollution control costs on the availability, cost, and distribution of U.S. biofuel production by subjecting potential facility locations within U.S. Clean Air Act nonattainment areas, which exceed thresholds for healthy air quality, to additional costs. This paper compares three scenarios: one with air quality costs included, one without air quality costs, and one in which conversion facilities were prohibited in Clean Air Act nonattainment areas. While air quality regulation may substantially affect local decisions regarding siting or technology choices, their effect on the system as a whole is small. Most biofuel facilities are expected to be sited near to feedstock supplies, which are seldom in nonattainment areas. The average cost per unit of produced energy is less than 1% higher in the scenarios with air quality compliance costs than in scenarios without such costs. When facility construction is prohibited in nonattainment areas, the costs increase by slightly over 1%, due to increases in the distance feedstock is transported to facilities in attainment areas.
Ong, Song-Quan; Ahmad, Hamdan; Jaal, Zairi; Rus, Adanan; Fadzlah, Fadhlina Hazwani Mohd
2017-01-01
Determining the control threshold for a pest is common prior to initiating a pest control program; however, previous studies related to the house fly control threshold for a poultry farm are insufficient for determining such a threshold. This study aimed to predict changes in house fly populations by comparing the intrinsic rate of increase (rm) for different house fly densities in a simulated system. The study first defined the knee points of a known population growth curve as a control threshold by comparing the rm of five densities of house flies under simulated conditions. Later, to understand the interactions between the larval and adult populations, the correlation between the larval and adult capacity rates (rc) was studied. The rm values at densities of 300 and 500 flies were significantly higher than those at densities of 50 and 100 flies, indicating these densities as candidate indices for a control threshold. The rc of the larval and adult populations were negatively correlated at densities of fewer than 300 flies; this indicated that adult populations of fewer than 300 flies were declining while the larval population was growing, so control approaches should focus on the immature stages. The results of the present study suggest a control threshold for house fly populations. Future work should focus on calibrating the threshold indices under field conditions. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
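As a simple illustration of the population-growth index compared above, the intrinsic rate of increase can be estimated from counts over time as the slope of ln N versus t under exponential growth. The sketch below uses made-up counts, not the study's simulated data, and ignores life-table (Euler-Lotka) refinements.

```python
# A minimal sketch: under exponential growth N(t) = N0 * exp(r_m * t), r_m is the slope of
# ln N against time. Counts are illustrative, not data from the study.
import numpy as np

def intrinsic_rate(days, counts):
    slope, _intercept = np.polyfit(days, np.log(counts), 1)
    return slope  # per day

days = [0, 7, 14, 21, 28]
counts = [300, 410, 560, 760, 1040]
print(round(intrinsic_rate(days, counts), 3))  # ~0.044 per day
```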
Weber, Anderson; Thewes, Fabio Rodrigo; Anese, Rogerio de Oliveira; Both, Vanderlei; Pavanello, Elizandra Pivotto; Brackmann, Auri
2017-11-15
The objective of the present work was to evaluate the appropriate respiratory quotient (RQ) value to achieve a safe lowest oxygen limit (LOL), during storage of 'Fuji Suprema' apples, in dynamic controlled atmosphere (DCA), treated with or without 1-methylcyclopropene (1-MCP). The apples were stored in DCA-RQ, a new technology for storing fruits, and were compared with the HarvestWatch™, a system based on chlorophyll fluorescence DCA (DCA-CF), and static controlled atmosphere. DCA-RQ1.5 is the most suited for the storage of 'Fuji Suprema' apples. In this condition fermentative products were induced, which reduced ethylene production and respiration rate; however, it did not increase physiological disorders, and the concentration of ethyl acetate was below the odour threshold. 1-MCP application maintained higher flesh firmness and reduced the anaerobic metabolism, although it decreased fruit quality due to the occurrence of cavities, therefore its application is not recommended for 'Fuji Suprema' apple stored in DCA conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
LinkImputeR: user-guided genotype calling and imputation for non-model organisms.
Money, Daniel; Migicovsky, Zoë; Gardner, Kyle; Myles, Sean
2017-07-10
Genomic studies such as genome-wide association and genomic selection require genome-wide genotype data. All existing technologies used to create these data result in missing genotypes, which are often then inferred using genotype imputation software. However, existing imputation methods most often make use only of genotypes that are successfully inferred after having passed a certain read depth threshold. Because of this, any read information for genotypes that did not pass the threshold, and were thus set to missing, is ignored. Most genomic studies also choose read depth thresholds and quality filters without investigating their effects on the size and quality of the resulting genotype data. Moreover, almost all genotype imputation methods require ordered markers and are therefore of limited utility in non-model organisms. Here we introduce LinkImputeR, a software program that exploits the read count information that is normally ignored, and makes use of all available DNA sequence information for the purposes of genotype calling and imputation. It is specifically designed for non-model organisms since it requires neither ordered markers nor a reference panel of genotypes. Using next-generation DNA sequence (NGS) data from apple, cannabis and grape, we quantify the effect of varying read count and missingness thresholds on the quantity and quality of genotypes generated from LinkImputeR. We demonstrate that LinkImputeR can increase the number of genotype calls by more than an order of magnitude, can improve genotyping accuracy by several percent and can thus improve the power of downstream analyses. Moreover, we show that the effects of quality and read depth filters can differ substantially between data sets and should therefore be investigated on a per-study basis. By exploiting DNA sequence data that is normally ignored during genotype calling and imputation, LinkImputeR can significantly improve both the quantity and quality of genotype data generated from NGS technologies. It enables the user to quickly and easily examine the effects of varying thresholds and filters on the number and quality of the resulting genotype calls. In this manner, users can decide on thresholds that are most suitable for their purposes. We show that LinkImputeR can significantly augment the value and utility of NGS data sets, especially in non-model organisms with poor genomic resources.
Estimating parameters for probabilistic linkage of privacy-preserved datasets.
Brown, Adrian P; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Boyd, James H
2017-07-10
Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20% error. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error. Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded results for F-measure that were only slightly below the highest possible for those probabilities. The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, the approach is a major step towards providing a complete operational approach for probabilistic linkage of privacy-preserved datasets.
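The threshold-selection idea can be illustrated with a short sketch that scores candidate cut-offs by F-measure over scored record pairs. In the paper these quantities are estimated from privacy-preserved data via an extension of the EM algorithm, because true match status is unknown; the labeled-pair version below is only a simplified illustration.

```python
# A simplified sketch of choosing a linkage score threshold by F-measure, assuming labeled
# candidate pairs; the paper estimates these quantities without labels via EM.
def f_measure(scores, labels, threshold):
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and not y)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y)
    if tp == 0:
        return 0.0
    precision, recall = tp / (tp + fp), tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def best_threshold(scores, labels):
    # Evaluate every distinct score as a candidate cut-off and keep the best one.
    return max(sorted(set(scores)), key=lambda t: f_measure(scores, labels, t))
```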
Extraction of Extended Small-Scale Objects in Digital Images
NASA Astrophysics Data System (ADS)
Volkov, V. Y.
2015-05-01
The problem of detecting and localizing extended small-scale objects of different shapes arises in observation systems that use SAR, infrared, lidar, and television cameras. An intense non-stationary background is the main difficulty for processing. Another challenge is the low quality of the images (blobs and blurred boundaries); in addition, SAR images suffer from serious intrinsic speckle noise. The background statistics are not normal, with evident skewness and heavy tails in the probability density, so the background is hard to characterize. The problem of extracting small-scale objects is solved here on the basis of directional filtering, adaptive thresholding, and morphological analysis. A new kind of mask, open-ended at one side, is used, which makes it possible to extract the ends of line segments of unknown length. An advanced method of dynamic adaptive threshold setting, based on the extraction of isolated fragments after thresholding, is investigated. A hierarchy of isolated fragments in the binary image, covering small-scale objects of different shapes, sizes, and orientations, is proposed for analyzing the segmentation results. The method extracts isolated fragments in the binary image and counts the points in them; the number of points in the extracted fragments, normalized to the total number of points for a given threshold, is used as the effectiveness of extraction for these fragments. The new method of adaptive threshold setting and control maximizes this effectiveness of extraction; it has optimality properties for object extraction in a normal noise field and shows effective results for real SAR images.
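The adaptive-threshold idea can be sketched as follows: for each candidate threshold the binary image is labeled into isolated fragments, and the threshold is scored by the fraction of above-threshold pixels that fall into small fragments. The scoring rule and the fragment-size cut-off below are simplified assumptions standing in for the paper's effectiveness measure.

```python
# A simplified sketch of scoring candidate thresholds by the fraction of above-threshold
# pixels that land in small isolated fragments; not the paper's exact effectiveness measure.
import numpy as np
from scipy import ndimage

def extraction_effectiveness(image, threshold, max_fragment_size=200):
    binary = image > threshold
    labels, n = ndimage.label(binary)          # isolated fragments of the binary image
    if n == 0:
        return 0.0
    sizes = ndimage.sum(binary, labels, index=range(1, n + 1))
    small = sizes[sizes <= max_fragment_size].sum()
    # Points in small fragments, normalized by all points above the threshold.
    return float(small) / float(binary.sum())

def adaptive_threshold(image, candidates):
    return max(candidates, key=lambda t: extraction_effectiveness(image, t))
```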
75 FR 53129 - Federal Acquisition Regulation; Inflation Adjustment of Acquisition-Related Thresholds
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... contractors pay very low wages and benefits, work quality can suffer and the Government may bear hidden costs because of the need to provide income assistance to low income families. The threshold for subcontracting...
Spatiotemporal video deinterlacing using control grid interpolation
NASA Astrophysics Data System (ADS)
Venkatesan, Ragav; Zwart, Christine M.; Frakes, David H.; Li, Baoxin
2015-03-01
With the advent of progressive format display and broadcast technologies, video deinterlacing has become an important video-processing technique. Numerous approaches exist in the literature to accomplish deinterlacing. While most earlier methods were simple linear filtering-based approaches, the emergence of faster computing technologies and even dedicated video-processing hardware in display units has allowed higher quality but also more computationally intense deinterlacing algorithms to become practical. Most modern approaches analyze motion and content in video to select different deinterlacing methods for various spatiotemporal regions. We introduce a family of deinterlacers that employs spectral residue to choose between and weight control grid interpolation based spatial and temporal deinterlacing methods. The proposed approaches perform better than the prior state-of-the-art based on peak signal-to-noise ratio, other visual quality metrics, and simple perception-based subjective evaluations conducted by human viewers. We further study the advantages of using soft and hard decision thresholds on the visual performance.
NASA Astrophysics Data System (ADS)
Febriani, Ika Kartika; Hadiyanto
2018-02-01
The problem of environmental pollution, especially urban water pollution, has become a major issue in Indonesia. Water pollution is caused not only by the disposal of industrial waste but also by other factors. One such factor is agricultural activity that uses pesticides in amounts exceeding the threshold. As regulated in Government Regulation No. 82/2001 on Water Quality Management and Water Pollution Control, it is necessary to manage water quality and control water pollution wisely by taking into account the interests of current and future generations as well as the ecological balance. To overcome the problem of water pollution due to agricultural activities, it is necessary to conduct research on phytoremediation techniques utilizing eceng gondok (water hyacinth). It is expected that this phytoremediation technique can reduce the problem of water pollution due to the use of pesticides in agricultural activities.
Electrically Injected UV-Visible Nanowire Lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, George T.; Li, Changyi; Li, Qiming
2015-09-01
There is strong interest in minimizing the volume of lasers to enable ultracompact, low-power, coherent light sources. Nanowires represent an ideal candidate for such nanolasers as stand-alone optical cavities and gain media, and optically pumped nanowire lasing has been demonstrated in several semiconductor systems. Electrically injected nanowire lasers are needed to realize actual working devices but have been elusive due to limitations of current methods to address the requirement for nanowire device heterostructures with high material quality, controlled doping and geometry, low optical loss, and efficient carrier injection. In this project we proposed to demonstrate electrically injected single nanowire lasers emitting in the important UV to visible wavelengths. Our approach to simultaneously address these challenges is based on high quality III-nitride nanowire device heterostructures with precisely controlled geometries and strong gain and mode confinement to minimize lasing thresholds, enabled by a unique top-down nanowire fabrication technique.
Legg Ditterline, Bonnie E; Aslan, Sevda C; Randall, David C; Harkema, Susan J; Castillo, Camilo; Ovechkin, Alexander V
2018-03-01
To evaluate the effects of pressure threshold respiratory training (RT) on heart rate variability and baroreflex sensitivity in persons with chronic spinal cord injury (SCI). Before-after intervention case-controlled clinical study. SCI research center and outpatient rehabilitation unit. Participants (N=44) consisted of persons with chronic SCI ranging from C2 to T11 who participated in RT (n=24), and untrained control subjects with chronic SCI ranging from C2 to T9 (n=20). A total of 21±2 RT sessions performed 5 days a week during a 4-week period using a combination of pressure threshold inspiratory and expiratory devices. Forced vital capacity (FVC), forced expiratory volume in 1 second (FEV1), and beat-to-beat arterial blood pressure and heart rate changes during the 5-second-long maximum expiratory pressure maneuver (5s MEP) and the sit-up orthostatic stress test, acquired before and after the RT program. In contrast to the untrained controls, individuals in the RT group experienced significantly increased FVC and FEV1 (both P<.01) in association with improved quality of sleep, cough, and speech. Sympathetically (phase II) and parasympathetically (phase IV) mediated baroreflex sensitivity both significantly (P<.05) increased during the 5s MEP. During the orthostatic stress test, improved autonomic control over heart rate was associated with significantly increased sympathetic and parasympathetic modulation (low- and high-frequency change: P<.01 and P<.05, respectively). Inspiratory-expiratory pressure threshold RT is a promising technique to positively affect both respiratory and cardiovascular dysregulation observed in persons with chronic SCI. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Lecordier, J; Heluin, Y; Plivard, C; Bureau, A; Mouawad, C; Chaillot, B; Lahet, J-J
2011-02-01
We present a way to integrate gravimetric control (GC) into a centralized cytotoxic drug preparation unit. Two different modalities are described. In the first strategy, the balance is located inside the isolator, whereas in the second it is located outside in order to remove many technical and ergonomic constraints. These two modalities are compared in terms of benefits and limits. GC consists of comparing the observed weight variation with the expected weight variation using a precision balance. With the B-in strategy, this variation is directly attributable to the weight of the cytotoxic solution injected, whereas with the B-out strategy, the weight of various additional components must be taken into account. Five hundred and seventy-seven preparations were weighed. For the "B-in" strategy, the 95% confidence interval is [1.02-1.14%] and every preparation is below the threshold of 5%. For the "B-out" strategy, the 95% confidence interval is [2.34-2.63%] and 94% of preparations are below the threshold of 5%. The B-in strategy is distinctly more precise than the B-out strategy and can be applied to all preparations. However, the B-out strategy is a feasible option in practice and enables the detection of a major error. All in all, results obtained from the B-out strategy can be considered a quality indicator in the production line. Results of GC are helpful in the final step of release, for which the pharmacist is responsible. Its many contributions to the quality assurance policy could justify the use of GC in every unit. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
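The check itself is simple enough to sketch: the observed weight change of the preparation is compared with the expected change (prescribed volume times solution density), and the preparation is flagged when the relative deviation exceeds the 5% threshold. The density, volume, and function name below are illustrative assumptions, not values from the unit described.

```python
# A minimal sketch of a gravimetric check against the 5% threshold; values are illustrative.
def gravimetric_check(weight_before_g, weight_after_g, volume_ml, density_g_per_ml,
                      tolerance=0.05):
    observed = weight_after_g - weight_before_g      # measured weight gain of the preparation
    expected = volume_ml * density_g_per_ml          # prescribed volume x solution density
    deviation = abs(observed - expected) / expected  # relative deviation
    return deviation <= tolerance, deviation

ok, dev = gravimetric_check(152.30, 172.65, volume_ml=20.0, density_g_per_ml=1.012)
print(ok, round(dev, 4))  # True, ~0.0054
```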
Air pollution in Boston bars before and after a smoking ban.
Repace, James L; Hyde, James N; Brugge, Doug
2006-10-27
We quantified the air quality benefits of a smoke-free workplace law in Boston, Massachusetts, U.S.A., by measuring air pollution from secondhand smoke (SHS) in 7 pubs before and after the law, comparing actual ventilation practices to engineering society (ASHRAE) recommendations, and assessing SHS levels using health and comfort indices. We performed real-time measurements of respirable particle (RSP) air pollution and particulate polycyclic aromatic hydrocarbons (PPAH) in 7 pubs and outdoors in a model-based design yielding air exchange rates for RSP removal. We also assessed ventilation rates from carbon dioxide concentrations. We compared RSP air pollution to the federal Air Quality Index (AQI) and the National Ambient Air Quality Standard (NAAQS) to assess health risks, and assessed odor and irritation levels using published SHS-RSP thresholds. Pre-smoking-ban RSP levels in 6 pubs (one pub with a non-SHS air quality problem was excluded) averaged 179 microg/m3, 23 times higher than post-ban levels (which averaged 7.7 microg/m3) and nearly 4-fold above the NAAQS for fine particle pollution (PM2.5). Pre-smoking ban levels of fine particle air pollution in all 7 of the pubs were in the Unhealthy to Hazardous range of the AQI. In the same 6 pubs, pre-ban indoor carcinogenic PPAH averaged 61.7 ng/m3, nearly 10 times higher than post-ban levels of 6.32 ng/m3. Post-ban particulate air pollution levels were in the Good AQI range, except for 1 venue with a defective gas-fired deep-fat fryer, while post-ban carcinogen levels in all 7 pubs were lower than outdoors. During smoking, although pub ventilation rates per occupant were within ASHRAE design parameters for the control of carbon dioxide levels for the number of occupants present, they failed to control SHS carcinogens or RSP. Nonsmokers' SHS odor and irritation sensory thresholds were massively exceeded. Post-ban air pollution measurements showed 90% to 95% reductions in PPAH and RSP respectively, differing little from outdoor concentrations. Ventilation failed to control SHS, leading to increased risk of the diseases of air pollution for nonsmoking workers and patrons. Boston's smoking ban eliminated this risk.
Frederix, Geert W J; Hövels, Anke M; Severens, Johan L; Raaijmakers, Jan A M; Schellens, Jan H M
2015-01-01
There is increasing discussion in the Netherlands about the introduction of a threshold value for the costs per extra year of life when reimbursing the costs of new drugs. The Medicines Committee ('Commissie Geneesmiddelen'), a division of the Netherlands National Healthcare Institute ('Zorginstituut Nederland'), advises on reimbursement of the costs of new drugs. This advice is based upon the determination of the therapeutic value of the drug and the results of economic evaluations. Mathematical models that predict future costs and effectiveness are often used in economic evaluations; these models can vary greatly in transparency and quality due to author assumptions. Standardisation of cost-effectiveness models is one solution to overcome this unwanted variation in quality. Discussions about the introduction of a threshold value can only be meaningful if all those involved are adequately informed and if cost-effectiveness research, and particularly economic evaluations, is of high quality. Collaboration and discussion between medical specialists, patients or patient organisations, health economists and policy makers, both in the development of methods and in standardisation, are essential to improve the quality of decision making.
Estimating economic thresholds for pest control: an alternative procedure.
Ramirez, O A; Saunders, J L
1999-04-01
An alternative methodology to determine profit-maximizing economic thresholds is developed and illustrated. An optimization problem based on the main biological and economic relations involved in determining a profit-maximizing economic threshold is first advanced. From it, a more manageable model of 2 nonsimultaneous reduced-form equations is derived, which represents a simpler but conceptually and statistically sound alternative. The model recognizes that yields and pest control costs are a function of the economic threshold used. Higher (less strict) economic thresholds can result in lower yields and, therefore, a lower gross income from the sale of the product, but could also be less costly to maintain. The highest possible profits will be obtained by using the economic threshold that results in the maximum difference between the gross income and pest control cost functions.
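The reduced-form logic can be sketched numerically: yield and pest control cost are both treated as functions of the economic threshold T, and the profit-maximizing threshold maximizes price x yield(T) - cost(T). The functional forms and coefficients below are invented for illustration, not the relations estimated in the paper.

```python
# A numerical sketch of choosing the threshold T that maximizes price * yield(T) - cost(T);
# functional forms and coefficients are illustrative, not the paper's estimates.
import numpy as np

def optimal_threshold(price, yield_fn, cost_fn, candidates):
    profits = [price * yield_fn(t) - cost_fn(t) for t in candidates]
    best = int(np.argmax(profits))
    return candidates[best], profits[best]

yield_fn = lambda t: 4000.0 - 15.0 * t         # kg/ha: laxer thresholds lose some yield
cost_fn = lambda t: 600.0 * np.exp(-0.08 * t)  # $/ha: stricter thresholds cost more to maintain
best_t, best_profit = optimal_threshold(0.5, yield_fn, cost_fn, candidates=np.arange(0, 41))
print(best_t, round(best_profit, 2))
```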
Ramsay, A; Bonnet, M; Gagnidze, L; Githui, W; Varaine, F; Guérin, P J
2009-05-01
Urban clinic, Nairobi. To evaluate the impact of specimen quality and different smear-positive tuberculosis (TB) case (SPC) definitions on SPC detection by sex. Prospective study among TB suspects. A total of 695 patients were recruited: 644 produced ≥1 specimen for microscopy. The male/female sex ratio was 0.8. There were no significant differences in the numbers of men and women submitting three specimens (274/314 vs. 339/380, P = 0.43). Significantly more men than women produced a set of three 'good' quality specimens (175/274 vs. 182/339, P = 0.01). Lowering the definition thresholds to include scanty smears resulted in increases in SPC detection in both sexes; the increase was significantly higher for women. The revised World Health Organization (WHO) case definition was associated with the highest detection rates in women. When the analysis was restricted to patients submitting 'good' quality specimen sets, the difference in detection between sexes was on the threshold of significance (P = 0.05). Higher SPC notification rates in men are commonly reported by TB control programmes. The revised WHO SPC definition may reduce sex disparities in notification. This should be considered when evaluating other interventions aimed at reducing these disparities. Further study is required on the effects of the human immunodeficiency virus and of instructed specimen collection on the sex-specific impact of the new SPC definition.
Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn
2017-06-01
To formulate convex planning objectives of treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that more closely relate to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
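Mean-tail-dose can be made concrete with a short sketch: the upper mean-tail-dose at volume fraction v is the average dose over the hottest v fraction of a structure's voxels, and it bounds the corresponding dose-at-volume D_v from above, which is what links the convex objective to the DVH statistic. The dose values below are randomly generated for illustration, and the helper names are assumptions rather than the paper's notation.

```python
# A minimal sketch of dose-at-volume (D_v) and upper mean-tail-dose for a voxel dose array;
# dose values are randomly generated for illustration.
import numpy as np

def dose_at_volume(doses, v):
    # Minimum dose received by the hottest fraction v of voxels (the DVH statistic D_v).
    return np.quantile(doses, 1.0 - v)

def upper_mean_tail_dose(doses, v):
    doses = np.asarray(doses, dtype=float)
    hottest = doses[doses >= dose_at_volume(doses, v)]
    return hottest.mean()  # always >= D_v, which is why it can stand in for it convexly

doses = np.random.default_rng(1).normal(60.0, 3.0, size=10_000)  # illustrative voxel doses (Gy)
print(dose_at_volume(doses, 0.05), upper_mean_tail_dose(doses, 0.05))
```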
Physiopathology of megarectum: the association of megarectum with encopresis.
Meunier, P; Mollard, P; Marechal, J M
1976-01-01
Studies of both the rectosphincteric reflex threshold and the conscious rectal sensitivity threshold were performed on 15 control subjects and 61 children with a radiological megarectum, 70% of whom were encopretic. In the control subjects, the reflex threshold and the sensitivity threshold were obtained with a comparable volume of rectal distension. In the megarectum patients, sensitivity was often considerably reduced, the incidence of encopresis increasing proportionally with the decrease in conscious rectal sensitivity. Patients were segregated into three functional groups according to measurements of the sensitivity threshold. PMID:1269991
Zhao, Changsen; Yang, Shengtian; Liu, Junguo; Liu, Changming; Hao, Fanghua; Wang, Zhonggen; Zhang, Huitong; Song, Jinxi; Mitrovic, Simon M; Lim, Richard P
2018-05-15
The survival of aquatic biota in stream ecosystems depends on both water quantity and quality, and is particularly susceptible to degraded water quality in regulated rivers. Maintenance of environmental flows (e-flows) for aquatic biota with optimum water quantity and quality is essential for sustainable ecosystem services, especially in developing regions with insufficient stream monitoring of hydrology, water quality and aquatic biota. Few e-flow methods are available that closely link aquatic biota tolerances to pollutant concentrations in a simple and practical manner. In this paper a new method was proposed to assess e-flows that aimed to satisfy the requirements of aquatic biota for both the quantity and quality of the streamflow by linking fish tolerances to water quality criteria, or the allowable concentration of pollutants. For better operation of water projects and control of pollutants discharged into streams, this paper presented two coefficients for streamflow adjustment and pollutant control. Assessment of e-flows in the Wei River, the largest tributary of the Yellow River, shows that streamflow in dry seasons failed to meet e-flow requirements. Pollutant influx exerted a large pressure on the aquatic ecosystem, with pollutant concentrations much higher than that of the fish tolerance thresholds. We found that both flow velocity and water temperature exerted great influences on the pollutant degradation rate. Flow velocity had a much greater influence on pollutant degradation than did the standard deviation of flow velocity. This study provides new methods to closely link the tolerance of aquatic biota to water quality criteria for e-flow assessment. The recommended coefficients for streamflow adjustment and pollutant control, to dynamically regulate streamflow and control pollutant discharge, are helpful for river management and ecosystems rehabilitation. The relatively low data requirement also makes the method easy to use efficiently in developing regions, and thus this study has significant implications for managing flows in polluted and regulated rivers worldwide. Copyright © 2018. Published by Elsevier Ltd.
Zhang, Li; Liu, Haiyu; Qin, Lingling; Zhang, Zhixin; Wang, Qing; Zhang, Qingqing; Lu, Zhiwei; Wei, Shengli; Gao, Xiaoyan; Tu, Pengfei
2015-02-01
A global chemical profiling based quality evaluation approach using ultra performance liquid chromatography with tandem quadrupole time-of-flight mass spectrometry was developed for the quality evaluation of three rhubarb species, including Rheum palmatum L., Rheum tanguticum Maxim. ex Balf., and Rheum officinale Baill. Considering that comprehensive detection of chemical components is crucial for the global profile, a systemic column performance evaluation method was developed. Based on this, a Cortecs column was used to acquire the chemical profile, and Chempattern software was employed to conduct similarity evaluation and hierarchical cluster analysis. The results showed R. tanguticum could be differentiated from R. palmatum and R. officinale at the similarity value 0.65, but R. palmatum and R. officinale could not be distinguished effectively. Therefore, a common pattern based on three rhubarb species was developed to conduct the quality evaluation, and the similarity value 0.50 was set as an appropriate threshold to control the quality of rhubarb. A total of 88 common peaks were identified by their accurate mass and fragmentation, and partially verified by reference standards. Through the verification, the newly developed method could be successfully used for evaluating the holistic quality of rhubarb. It would provide a reference for the quality control of other herbal medicines. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bai, Xiaohui; Zhi, Xinghua; Zhu, Huifeng; Meng, Mingqun; Zhang, Mingde
2015-01-01
This study investigates the effect of chloramine residual on bacterial growth and regrowth and the relationship between heterotrophic plate counts (HPCs) and the concentration of chloramine residual in the Shanghai drinking water distribution system (DWDS). In this study, models to control HPCs in the water distribution system and at consumer taps are also developed. Real-time ArcGIS was applied to show the distribution of, and changes in, the chloramine residual concentration in the pipe system using these models. Residual regression analysis was used to derive a reasonable range of threshold values that allow the chloramine residual to efficiently inhibit bacterial growth in the Shanghai DWDS; the threshold values should be between 0.45 and 0.5 mg/L in pipe water and between 0.2 and 0.25 mg/L in tap water. The low residual chloramine limit (0.05 mg/L) in the Chinese drinking water quality standard may pose a potential microbial health risk and should be revised. Disinfection by-products (DBPs) were detected, but no health risk was identified.
Seeing and identifying with a virtual body decreases pain perception.
Hänsel, Alexander; Lenggenhager, Bigna; von Känel, Roland; Curatolo, Michele; Blanke, Olaf
2011-09-01
Pain and the conscious mind (or the self) are experienced in our body. Both are intimately linked to the subjective quality of conscious experience. Here, we used virtual reality technology and visuo-tactile conflicts in healthy subjects to test whether experimentally induced changes of bodily self-consciousness (self-location; self-identification) lead to changes in pain perception. We found that visuo-tactile stroking of a virtual body but not of a control object led to increased pressure pain thresholds and self-location. This increase was not modulated by the synchrony of stroking as predicted based on earlier work. This differed for self-identification where we found as predicted that synchrony of stroking increased self-identification with the virtual body (but not a control object), and positively correlated with an increase in pain thresholds. We discuss the functional mechanisms of self-identification, self-location, and the visual perception of human bodies with respect to pain perception. Copyright © 2011 European Federation of International Association for the Study of Pain Chapters. Published by Elsevier Ltd. All rights reserved.
Method for enhanced control of welding processes
Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin
2000-01-01
Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100×100 pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus, create a uniform weld bead and high quality weld.
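A minimal sketch of the thresholding step described above, assuming an 8-bit grayscale backside image (at least 100×100 pixels) from which the infrared component has already been filtered; the region-of-interest placement, threshold value, and pixel scale are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def weld_pool_metrics(frame, threshold=128, mm_per_pixel=0.1):
    """Estimate backside weld pool area and width from a filtered grayscale frame.

    frame: 2D uint8 array (infrared already removed by the short-pass filter).
    threshold and mm_per_pixel are illustrative, not values from the patent.
    """
    # Focus on a 100x100 pixel region of interest around the image centre.
    cy, cx = frame.shape[0] // 2, frame.shape[1] // 2
    roi = frame[cy - 50:cy + 50, cx - 50:cx + 50]

    pool = roi > threshold                              # pixels brighter than the cutoff are taken as molten pool
    area_mm2 = pool.sum() * mm_per_pixel ** 2           # pool area
    width_mm = pool.sum(axis=1).max() * mm_per_pixel    # largest count of pool pixels in any row (approximate width)
    return area_mm2, width_mm

# A feedback controller would compare area_mm2 against a setpoint and adjust the welding current.
```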
Sliding mode control of outbreaks of emerging infectious diseases.
Xiao, Yanni; Xu, Xiaxia; Tang, Sanyi
2012-10-01
This paper proposes and analyzes a mathematical model of an infectious disease system with a piecewise control function concerning threshold policy for disease management strategy. The proposed models extend the classic models by including a piecewise incidence rate to represent control or precautionary measures being triggered once the number of infected individuals exceeds a threshold level. The long-term behaviour of the proposed non-smooth system under this strategy consists of the so-called sliding motion, a very rapid switching between application and interruption of the control action. Model solutions ultimately approach either one of two endemic states for two structures or the sliding equilibrium on the switching surface, depending on the threshold level. Our findings suggest that proper combinations of threshold densities and control intensities based on threshold policy can either preclude outbreaks or lead the number of infected to a previously chosen level.
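As a rough illustration of a threshold-triggered piecewise incidence rate, the sketch below reduces transmission only while the infected count exceeds a threshold; the SIR structure, parameter values, and control factor are assumptions chosen for illustration, not the authors' exact model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir_threshold_policy(t, y, beta=0.4, gamma=0.1, I_c=50.0, eps=0.5):
    """SIR dynamics with a piecewise incidence rate: the control factor eps
    is applied only while the infected count I exceeds the threshold I_c."""
    S, I, R = y
    control = eps if I > I_c else 1.0          # piecewise switch on the incidence term
    new_infections = control * beta * S * I / (S + I + R)
    return [-new_infections, new_infections - gamma * I, gamma * I]

sol = solve_ivp(sir_threshold_policy, (0.0, 200.0), [990.0, 10.0, 0.0], max_step=0.1)
print(f"peak infected: {sol.y[1].max():.1f}")  # chattering near I_c mimics the sliding motion
```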
Munford, Luke A; Sidaway, Mark; Blakemore, Amy; Sutton, Matt; Bower, Pete
2017-01-01
Background: Community assets are promoted as a way to improve quality of life and reduce healthcare usage. However, the quantitative impact of participation in community assets on these outcomes is not known. Methods: We examined the association between participation in community assets and health-related quality of life (HRQoL) (EuroQol-5D-5L) and healthcare usage in 3686 individuals aged ≥65 years. We estimated the unadjusted differences in EuroQol-5D-5L scores and healthcare usage between participants and non-participants in community assets and then used multivariate regression to examine scores adjusted for sociodemographic and limiting long-term health conditions. We derived the net benefits of participation using a range of threshold values for a quality-adjusted life year (QALY). Results: 50% of individuals reported participation in community assets. Their EuroQol-5D-5L scores were 0.094 (95% CI 0.077 to 0.111) points higher than non-participants. Controlling for sociodemographic characteristics reduced this differential to 0.081 (95% CI 0.064 to 0.098). Further controlling for limiting long-term conditions reduced this effect to 0.039 (95% CI 0.025 to 0.052). Once we adjusted for sociodemographic and limiting long-term conditions, the reductions in healthcare usage and costs associated with community asset participation were not statistically significant. Based on a threshold value of £20 000 per QALY, the net benefits of participation in community assets were £763 (95% CI £478 to £1048) per participant per year. Conclusions: Participation in community assets is associated with substantially higher HRQoL but is not associated with lower healthcare costs. The social value of developing community assets is potentially substantial. PMID:28183807
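For readers unfamiliar with the net-benefit framework used above, figures of this kind are typically computed with the standard incremental net monetary benefit formula; the definition below is the conventional one, with λ denoting the QALY threshold value (here £20 000), and is not quoted from the abstract itself.

```latex
\[
  \mathrm{NMB} \;=\; \lambda\,\Delta\mathrm{QALY} \;-\; \Delta\mathrm{Cost},
  \qquad \lambda = 20\,000\ \text{GBP per QALY}.
\]
```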
Riga, Maria G; Chelis, Leonidas; Kakolyris, Stylianos; Papadopoulos, Stergios; Stathakidou, Sofia; Chamalidou, Eleni; Xenidis, Nikolaos; Amarantidis, Kyriakos; Dimopoulos, Prokopios; Danielides, Vasilios
2013-02-01
Ototoxicity is a common and irreversible adverse effect of cisplatin treatment with great impact on the patients' quality of life. N-acetylcysteine is a low-molecular-weight agent which has shown substantial otoprotective activity. The role of transtympanic infusions of N-acetylcysteine was examined in a cohort of patients treated with cisplatin-based regimens. Twenty cisplatin-treated patients were subjected, under local anesthesia, to transtympanic N-acetylcysteine (10%) infusions in 1 ear, during the hydration procedure preceding intravenous infusion of cisplatin. The contralateral ear was used as control. The number of transtympanic infusions corresponded to the number of administered cycles. Hearing acuity was evaluated before each cycle with pure tone audiometry by an audiologist blinded to the treated ear. A total of 84 transtympanic infusions were performed. In treated ears, no significant changes in auditory thresholds were recorded. In the control ears, cisplatin induced a significant decrease of auditory thresholds at the 8000 Hz frequency band (P=0.008). At the same frequency (8000 Hz), the changes in auditory thresholds were significantly larger for the control ears than the treated ones (P=0.005). An acute pain starting shortly after the injection and lasting for a few minutes seemed to be the only significant adverse effect. Transtympanic injections of N-acetylcysteine seem to be a feasible and effective otoprotective strategy for the prevention of cisplatin-induced ototoxicity. Additional studies are required to further clarify the efficiency of this treatment and determine the optimal dosage and protocol.
Threshold automatic selection hybrid phase unwrapping algorithm for digital holographic microscopy
NASA Astrophysics Data System (ADS)
Zhou, Meiling; Min, Junwei; Yao, Baoli; Yu, Xianghua; Lei, Ming; Yan, Shaohui; Yang, Yanlong; Dan, Dan
2015-01-01
The conventional quality-guided (QG) phase unwrapping algorithm is hard to apply to digital holographic microscopy because of its long execution time. In this paper, we present a threshold automatic selection hybrid phase unwrapping algorithm that combines the existing QG algorithm and the flood-fill (FF) algorithm to solve this problem. The original wrapped phase map is divided into high- and low-quality sub-maps by selecting a threshold automatically, and then the FF and QG unwrapping algorithms are used on each sub-map to unwrap the phase, respectively. The feasibility of the proposed method is demonstrated by experimental results, and the execution speed is shown to be much faster than that of the original QG unwrapping algorithm.
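A schematic of the partitioning step only, assuming a phase-quality map with values in [0, 1]; taking the map's mean as the automatic threshold is an illustrative assumption, not the selection rule of the paper, and the FF/QG unwrappers themselves are left out.

```python
import numpy as np

def split_by_quality(quality_map):
    """Partition a wrapped-phase quality map into high- and low-quality regions.

    The high-quality region would be handled by the fast flood-fill (FF) pass
    and the low-quality region by the quality-guided (QG) pass.
    """
    threshold = float(quality_map.mean())   # illustrative automatic choice
    high = quality_map >= threshold
    return threshold, high, ~high

rng = np.random.default_rng(0)
thr, high_mask, low_mask = split_by_quality(rng.random((256, 256)))
print(f"threshold={thr:.3f}, high-quality fraction={high_mask.mean():.2f}")
```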
NASA Astrophysics Data System (ADS)
Cheng, Siyang; An, Xingqin; Zhou, Lingxi; Tans, Pieter P.; Jacobson, Andy
2017-06-01
In order to explore where sources and sinks have the greatest impact on the CO2 background concentration at Waliguan (WLG) station, a statistical method is here proposed to calculate the representative source-sink region. The key to this method is to find the best footprint threshold, and the study is carried out in four parts. Firstly, transport climatology, expressed by the total monthly footprint, was simulated by FLEXPART on a 7-day time scale. Surface CO2 emissions in Eurasia were frequently transported to WLG station. WLG station was mainly influenced by the westerlies in winter and partly controlled by the Southeast Asian monsoon in summer. Secondly, CO2 concentrations simulated by CT2015 were processed and analyzed through data quality control, screening, fitting and comparison. CO2 concentrations displayed obvious seasonal variation, with the maximum and minimum concentrations appearing in April and August, respectively. The correlation between simulated and observed CO2 fitted background concentrations was R² = 0.91. The temporal patterns were mainly correlated with biosphere-atmosphere CO2 exchange, human activities and air transport. Thirdly, for the monthly CO2 fitted background concentrations from CT2015, the best footprint threshold was found based on correlation analysis and numerical iteration using the footprint and emission data. The grid cells where monthly footprints were greater than the best footprint threshold constituted the best threshold area, corresponding to the representative source-sink region. The representative source-sink region of the maximum CO2 concentration in April was primarily located in Qinghai province, but the minimum CO2 concentration in August was mainly influenced by emissions in a wider region. Finally, we briefly presented the CO2 source-sink characteristics in the best threshold area. Generally, the best threshold area was a carbon sink. The major sources and sinks were relatively weak owing to the limited human activity and few vegetation types in this high-altitude area. CO2 concentrations were more influenced by human activities when air masses passed through many urban areas in summer. Therefore, the combination of footprints and emissions is an effective approach for assessing the source-sink region representativeness of CO2 background concentration.
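A sketch of the threshold search paraphrased from the abstract, assuming gridded monthly footprints, surface fluxes, and an observed (or fitted) background series are already available as arrays; the variable names and the correlation-maximising selection rule as coded here are illustrative assumptions.

```python
import numpy as np

def best_footprint_threshold(footprints, fluxes, observed, candidates):
    """Pick the footprint threshold whose masked footprint*flux signal correlates
    best with the background concentration series.

    footprints, fluxes: arrays of shape (n_months, ny, nx)
    observed: array of shape (n_months,)
    candidates: iterable of threshold values to test
    """
    best_thr, best_r = None, -np.inf
    for thr in candidates:
        mask = footprints > thr                          # grid cells above the candidate threshold
        signal = (footprints * fluxes * mask).sum(axis=(1, 2))
        r = np.corrcoef(signal, observed)[0, 1]
        if r > best_r:
            best_thr, best_r = thr, r
    return best_thr, best_r

# Tiny synthetic check with hypothetical data.
rng = np.random.default_rng(1)
fp, fx = rng.random((12, 10, 10)), rng.random((12, 10, 10))
obs = (fp * fx).sum(axis=(1, 2)) + 0.1 * rng.standard_normal(12)
print(best_footprint_threshold(fp, fx, obs, np.linspace(0.0, 0.9, 19)))
```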
Stokes, E A; Wordsworth, S; Bargo, D; Pike, K; Rogers, C A; Brierley, R C M; Angelini, G D; Murphy, G J; Reeves, B C
2016-01-01
Objective: To assess the incremental cost and cost-effectiveness of a restrictive versus a liberal red blood cell transfusion threshold after cardiac surgery. Design: A within-trial cost-effectiveness analysis with a 3-month time horizon, based on a multicentre superiority randomised controlled trial from the perspective of the National Health Service (NHS) and personal social services in the UK. Setting: 17 specialist cardiac surgery centres in UK NHS hospitals. Participants: 2003 patients aged >16 years undergoing non-emergency cardiac surgery with a postoperative haemoglobin of <9 g/dL. Interventions: Restrictive (transfuse if haemoglobin <7.5 g/dL) or liberal (transfuse if haemoglobin <9 g/dL) threshold during hospitalisation after surgery. Main outcome measures: Health-related quality of life measured using the EQ-5D-3L to calculate quality-adjusted life years (QALYs). Results: The total costs from surgery up to 3 months were £17 945 and £18 127 in the restrictive and liberal groups (mean difference is −£182, 95% CI −£1108 to £744). The cost difference was largely attributable to the difference in the cost of red blood cells. Mean QALYs to 3 months were 0.18 in both groups (restrictive minus liberal difference is 0.0004, 95% CI −0.0037 to 0.0045). The point estimate for the base-case cost-effectiveness analysis suggested that the restrictive group was slightly more effective and slightly less costly than the liberal group and, therefore, cost-effective. However, there is great uncertainty around these results partly due to the negligible differences in QALYs gained. Conclusions: We conclude that there is no clear difference in the cost-effectiveness of restrictive and liberal thresholds for red blood cell transfusion after cardiac surgery. Trial registration number: ISRCTN70923932; Results. PMID:27481621
Almeida, Osvaldo P; Marsh, Kylie; Murray, Karen; Hickey, Martha; Sim, Moira; Ford, Andrew; Flicker, Leon
2016-10-01
To determine if health coaching (HC) decreases the incidence of depression, reduces the severity of symptoms, and increases quality of life during the menopausal transition (MT). Parallel, single-blinded, randomised controlled trial of 6 sessions of phone-delivered HC compared with usual care. Participants were 351 community-dwelling women free of major depression going through the MT, of whom 180 were assigned the intervention and 171 usual care. The primary outcome of interest was the incidence of clinically significant depressive symptoms over 52 weeks. Other study measures included the Hospital Anxiety and Depression Scale, quality of life (SF-12), the Menopause Rating Scale (MRS), diet, body mass index, alcohol use, smoking and physical activity. We considered that women with Patient Health Questionnaire (PHQ-9) scores between 5 and 14 (inclusive) had sub-threshold depressive symptoms. Nine women developed clinically significant symptoms of depression during the study; 2 had been assigned HC (odds ratio, OR=0.26, 95%CI=0.05, 1.29; p=0.099). Intention-to-treat showed that, compared with usual care, the intervention led to a greater decline in depressive scores, most markedly for participants with sub-threshold depressive symptoms. Similar, but less pronounced, benefits were noticed for anxiety scores and the mental component summary of the SF-12. The intervention led to a decline in MRS scores by week 26 and subtle improvements in body mass, consumption of vegetables and smoking. HC addressing relevant risk factors for depression during the MT improves mental health measures. Our findings indicate that women with sub-threshold depressive symptoms may benefit the most from such interventions, and suggest that HC could play a useful role in minimizing mental health disturbance for women going through the MT. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Schmitz, Patric; Hildebrandt, Julian; Valdez, Andre Calero; Kobbelt, Leif; Ziefle, Martina
2018-04-01
In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user's virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation, or getting cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications. While these studies suggest that only relatively low degrees of manipulation are tolerable, we claim that, besides establishing detection thresholds, it is important to know when the user's immersion breaks. We hypothesize that the degree of unnoticed manipulation is significantly different from the detection threshold when the user is immersed in a task. We conducted three studies: a) to devise an experimental paradigm to measure the threshold of limited immersion (TLI), b) to measure the TLI for slowly decreasing and increasing rotation gains, and c) to establish a baseline of cybersickness for our experimental setup. For rotation gains greater than 1.0, we found that immersion breaks quite late after the gain is detectable. However, for gains less than 1.0, some users reported a break of immersion even before established detection thresholds were reached. Apparently, the developed metric measures an additional quality of user experience. This article contributes to the development of effective spatial compression methods by utilizing the break of immersion as a benchmark for redirection techniques.
Alali, Aziz S; Burton, Kirsteen; Fowler, Robert A; Naimark, David M J; Scales, Damon C; Mainprize, Todd G; Nathens, Avery B
2015-07-01
Economic evaluations provide a unique opportunity to identify the optimal strategies for the diagnosis and management of traumatic brain injury (TBI), for which uncertainty is common and the economic burden is substantial. The objective of this study was to systematically review and examine the quality of contemporary economic evaluations in the diagnosis and management of TBI. Two reviewers independently searched MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials, NHS Economic Evaluation Database, Health Technology Assessment Database, EconLit, and the Tufts CEA Registry for comparative economic evaluations published from 2000 onward (last updated on August 30, 2013). Data on methods, results, and quality were abstracted in duplicate. The results were summarized quantitatively and qualitatively. Of 3539 citations, 24 economic evaluations met our inclusion criteria. Nine were cost-utility, five were cost-effectiveness, three were cost-minimization, and seven were cost-consequences analyses. Only six studies were of high quality. Current evidence from high-quality studies suggests the economic attractiveness of the following strategies: a low medical threshold for computed tomography (CT) scanning of asymptomatic infants with possible inflicted TBI, selective CT scanning of adults with mild TBI as per the Canadian CT Head Rule, management of severe TBI according to the Brain Trauma Foundation guidelines, management of TBI in dedicated neurocritical care units, and early transfer of patients with TBI with nonsurgical lesions to neuroscience centers. Threshold-guided CT scanning, adherence to Brain Trauma Foundation guidelines, and care for patients with TBI, including those with nonsurgical lesions, in specialized settings appear to be economically attractive strategies. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Liu, Zhihua; Yang, Jian; He, Hong S.
2013-01-01
The relative importance of fuel, topography, and weather on fire spread varies at different spatial scales, but how the relative importance of these controls respond to changing spatial scales is poorly understood. We designed a “moving window” resampling technique that allowed us to quantify the relative importance of controls on fire spread at continuous spatial scales using boosted regression trees methods. This quantification allowed us to identify the threshold value for fire size at which the dominant control switches from fuel at small sizes to weather at large sizes. Topography had a fluctuating effect on fire spread across the spatial scales, explaining 20–30% of relative importance. With increasing fire size, the dominant control switched from bottom-up controls (fuel and topography) to top-down controls (weather). Our analysis suggested that there is a threshold for fire size, above which fires are driven primarily by weather and more likely lead to larger fire size. We suggest that this threshold, which may be ecosystem-specific, can be identified using our “moving window” resampling technique. Although the threshold derived from this analytical method may rely heavily on the sampling technique, our study introduced an easily implemented approach to identify scale thresholds in wildfire regimes. PMID:23383247
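A compact sketch of the "moving window" idea, with scikit-learn's gradient boosting standing in for the original boosted-regression-tree implementation; the synthetic data, window size, and predictor set (fuel, topography, weather) are illustrative assumptions only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 2000
fire_size = np.sort(rng.lognormal(3.0, 1.0, n))      # hypothetical fire sizes
X = rng.random((n, 3))                               # columns: fuel, topography, weather
# Synthetic response in which weather matters more as fires get larger.
w = np.clip(np.log(fire_size) / np.log(fire_size).max(), 0.0, 1.0)
y = (1 - w) * X[:, 0] + 0.2 * X[:, 1] + w * X[:, 2] + 0.05 * rng.standard_normal(n)

window, step = 500, 250                              # the "moving window" over fire size
for start in range(0, n - window + 1, step):
    idx = slice(start, start + window)
    model = GradientBoostingRegressor(random_state=0).fit(X[idx], y[idx])
    fuel, topo, weather = model.feature_importances_
    print(f"mean size {fire_size[idx].mean():8.1f}: "
          f"fuel={fuel:.2f} topo={topo:.2f} weather={weather:.2f}")
```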
Self-assembled dye-doped polymer microspheres as whispering gallery mode lasers
NASA Astrophysics Data System (ADS)
Chen, Xiaogang; Sun, Hongyi; Yang, Hongqin; Wu, Xiang; Xie, Shusen
2016-10-01
Microlasers based on high-Q whispering-gallery-mode (WGM) resonances are promising low-threshold laser sources for bio-sensing and imaging applications. In this talk, we demonstrate a cost-effective approach to obtain size-controllable polymer microspheres, which can serve as good WGM microcavities. By injecting SU-8 solution into a low-refractive-index UV polymer, self-assembled spherical droplets with smooth surfaces can be created inside the elastic medium and then solidified by UV exposure. The size of the microspheres can be tuned from several to hundreds of microns. WGM lasing has been achieved by optically pumping the dye-doped microspheres with ns lasers. Experimental results show that the microsphere lasers have high quality factors and low lasing thresholds. The self-assembled dye-doped polymer microspheres would provide an excellent platform for micro-laser sources in on-chip biosensing and imaging systems.
Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm
NASA Astrophysics Data System (ADS)
Elahi, Sana; kaleem, Muhammad; Omer, Hammad
2018-01-01
Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of images from a very limited number of samples in k-space. This significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct the MR images from the under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI. This algorithm directly cancels the incoherent artifacts produced because of the undersampling in k-space. This paper introduces an improved iterative algorithm based on a p-thresholding technique for CS-MRI image reconstruction. The use of the p-thresholding function promotes sparsity in the image, which is a key factor for CS based image reconstruction. The p-thresholding based iterative algorithm is a modification of ISTA, and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover a fully sampled image from the under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log thresholding, soft thresholding and hard thresholding techniques at different reduction factors.
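A compact sketch of the iterative-thresholding idea behind ISTA-style CS-MRI reconstruction, assuming a random undersampling mask and sparsity in the image domain for brevity; the generalized shrinkage rule with p < 1 is a simple stand-in for a non-convex p-thresholding operator, not the exact operator of the paper.

```python
import numpy as np

def p_shrink(x, lam, p=1.0):
    """Generalized shrinkage: p = 1 gives the usual soft threshold used by ISTA,
    p < 1 is an illustrative non-convex variant (not the paper's exact rule)."""
    mag = np.maximum(np.abs(x), 1e-12)
    shrunk = np.maximum(mag - lam * mag ** (p - 1), 0.0)
    return shrunk * np.exp(1j * np.angle(x))

def iterative_threshold_recon(kspace, mask, lam=0.01, p=1.0, n_iter=50):
    """Alternate a k-space data-consistency step with a thresholding step."""
    x = np.fft.ifft2(kspace * mask)                      # zero-filled starting image
    for _ in range(n_iter):
        resid = mask * (np.fft.fft2(x) - kspace)         # error on the acquired samples
        x = x - np.fft.ifft2(resid)                      # data-consistency (gradient) step
        x = p_shrink(x, lam, p)                          # promote sparsity
    return x

# Example: retrospectively undersample a smooth synthetic image and reconstruct it.
img = np.outer(np.hanning(64), np.hanning(64))
mask = (np.random.default_rng(0).random((64, 64)) < 0.4).astype(float)
recon = iterative_threshold_recon(np.fft.fft2(img) * mask, mask, lam=0.005, p=0.8)
```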
Jackson, Simon A.; Kleitman, Sabina; Howie, Pauline; Stankov, Lazar
2016-01-01
In this paper, we investigate whether individual differences in performance on heuristic and biases tasks can be explained by cognitive abilities, monitoring confidence, and control thresholds. Current theories explain individual differences in these tasks by the ability to detect errors and override automatic but biased judgments, and deliberative cognitive abilities that help to construct the correct response. Here we retain cognitive abilities but disentangle error detection, proposing that lower monitoring confidence and higher control thresholds promote error checking. Participants (N = 250) completed tasks assessing their fluid reasoning abilities, stable monitoring confidence levels, and the control threshold they impose on their decisions. They also completed seven typical heuristic and biases tasks such as the cognitive reflection test and Resistance to Framing. Using structural equation modeling, we found that individuals with higher reasoning abilities, lower monitoring confidence, and higher control threshold performed significantly and, at times, substantially better on the heuristic and biases tasks. Individuals with higher control thresholds also showed lower preferences for risky alternatives in a gambling task. Furthermore, residual correlations among the heuristic and biases tasks were reduced to null, indicating that cognitive abilities, monitoring confidence, and control thresholds accounted for their shared variance. Implications include the proposal that the capacity to detect errors does not differ between individuals. Rather, individuals might adopt varied strategies that promote error checking to different degrees, regardless of whether they have made a mistake or not. The results support growing evidence that decision-making involves cognitive abilities that construct actions and monitoring and control processes that manage their initiation. PMID:27790170
NASA Astrophysics Data System (ADS)
Davis, T. S.; Wark, H. A. C.; Hutchinson, D. T.; Warren, D. J.; O'Neill, K.; Scheinblum, T.; Clark, G. A.; Normann, R. A.; Greger, B.
2016-06-01
Objective. An important goal of neuroprosthetic research is to establish bidirectional communication between the user and new prosthetic limbs that are capable of controlling >20 different movements. One strategy for achieving this goal is to interface the prosthetic limb directly with efferent and afferent fibres in the peripheral nervous system using an array of intrafascicular microelectrodes. This approach would provide access to a large number of independent neural pathways for controlling high degree-of-freedom prosthetic limbs, as well as evoking multiple-complex sensory percepts. Approach. Utah Slanted Electrode Arrays (USEAs, 96 recording/stimulating electrodes) were implanted for 30 days into the median (Subject 1-M, 31 years post-amputation) or ulnar (Subject 2-U, 1.5 years post-amputation) nerves of two amputees. Neural activity was recorded during intended movements of the subject’s phantom fingers and a linear Kalman filter was used to decode the neural data. Microelectrode stimulation of varying amplitudes and frequencies was delivered via single or multiple electrodes to investigate the number, size and quality of sensory percepts that could be evoked. Device performance over time was assessed by measuring: electrode impedances, signal-to-noise ratios (SNRs), stimulation thresholds, number and stability of evoked percepts. Main results. The subjects were able to proportionally control individual fingers of a virtual robotic hand, with 13 different movements decoded offline (r = 0.48) and two movements decoded online. Electrical stimulation across one USEA evoked >80 sensory percepts. Varying the stimulation parameters modulated percept quality. Devices remained intrafascicularly implanted for the duration of the study with no significant changes in the SNRs or percept thresholds. Significance. This study demonstrated that an array of 96 microelectrodes can be implanted into the human peripheral nervous system for durations of up to 1 month. Such an array could provide intuitive control of a virtual prosthetic hand with broad sensory feedback.
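A toy sketch of the linear Kalman filter decoding step mentioned above, with a two-dimensional kinematic state and hypothetical neural feature vectors; all matrices, dimensions, and noise levels are arbitrary illustrative values, not those fitted in the study.

```python
import numpy as np

def kalman_decode(Z, A, W, H, Q, x0, P0):
    """Decode a state sequence (e.g., finger kinematics) from neural features Z with a
    standard linear Kalman filter: x_t = A x_{t-1} + w,  z_t = H x_t + q."""
    x, P = x0, P0
    decoded = []
    for z in Z:
        x = A @ x                                   # predict
        P = A @ P @ A.T + W
        S = H @ P @ H.T + Q                         # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        x = x + K @ (z - H @ x)                     # update with the new neural sample
        P = (np.eye(len(x)) - K @ H) @ P
        decoded.append(x.copy())
    return np.array(decoded)

# Hypothetical 2-D state (position, velocity) decoded from 10 neural channels.
rng = np.random.default_rng(0)
A = np.array([[1.0, 0.05], [0.0, 1.0]]); W = 1e-3 * np.eye(2)
H = rng.standard_normal((10, 2));        Q = 0.1 * np.eye(10)
states = kalman_decode(rng.standard_normal((200, 10)), A, W, H, Q, np.zeros(2), np.eye(2))
```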
NASA Astrophysics Data System (ADS)
Medjoubi, K.; Dawiec, A.
2017-12-01
A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting pixel (HPC) detectors. This approach is based on the Photon Transfer Curve (PTC), which corresponds to measuring the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors of flat-fielding techniques. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The FPN, quantified by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is shown to be useful for identifying the settings that yield the best image quality from a commercial or R&D detector.
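For context, a commonly used textbook form of the photon transfer relation is shown below, with the signal S expressed in counts so that the shot-noise variance equals S, and with fixed pattern noise growing linearly with signal and summarised by the PRNU; this generic decomposition (the read-noise term being essentially absent for photon-counting detectors) is added for illustration and is not quoted from the paper.

```latex
\[
  \sigma_{\mathrm{tot}}^{2}(S) \;=\; \sigma_{\mathrm{read}}^{2} \;+\; S \;+\; \left(\mathrm{PRNU}\cdot S\right)^{2},
  \qquad
  \mathrm{PRNU} \;=\; \frac{\sigma_{\mathrm{FPN}}}{\bar{S}}
\]
```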
Major controlling factors and prediction models for arsenic uptake from soil to wheat plants.
Dai, Yunchao; Lv, Jialong; Liu, Ke; Zhao, Xiaoyan; Cao, Yingfei
2016-08-01
The application of current Chinese agriculture soil quality standards fails to evaluate the land utilization functions appropriately due to the diversity of soil properties and plant species. Therefore, the standards should be amended. A greenhouse experiment was conducted to investigate arsenic (As) enrichment in various soils from 18 Chinese provinces in parallel with As transfer to 8 wheat varieties. The goal of the study was to build and calibrate soil-wheat threshold models to forecast the As threshold of wheat soils. In Shaanxi soils, Wanmai and Jimai were the most and least sensitive wheat varieties, respectively; in Jiangxi soils, Zhengmai and Xumai were the most and least sensitive wheat varieties, respectively. Relationships between soil properties and the bioconcentration factor (BCF) were built based on stepwise multiple linear regressions. Soil pH was the best predictor of BCF, and after normalizing the regression equation (log BCF = 0.2054 pH - 3.2055, R² = 0.8474, n = 14, p < 0.001), we obtained a calibrated model. Using the calibrated model, a continuous soil-wheat threshold equation (HC5 = 10^(-0.2054 pH + 2.9935) + 9.2) was obtained for the species sensitivity distribution curve, which was built on Chinese food safety standards. The threshold equation is a helpful tool that can be applied to estimate As uptake from soil to wheat. Copyright © 2016 Elsevier Inc. All rights reserved.
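A quick numerical illustration of how the reported threshold equation is applied; the chosen pH values and the assumption that HC5 is expressed in mg/kg are illustrative only.

```python
# Soil-wheat As threshold from the reported equation: HC5 = 10^(-0.2054*pH + 2.9935) + 9.2
def hc5(ph):
    return 10 ** (-0.2054 * ph + 2.9935) + 9.2

for ph in (5.5, 7.0, 8.5):
    print(f"pH {ph}: HC5 = {hc5(ph):.1f}")  # more acidic soils yield a higher threshold (units assumed mg/kg)
```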
de Bekker-Grob, Esther W; de Kok, Inge M C M; Bulten, Johan; van Rosmalen, Joost; Vedder, Judith E M; Arbyn, Marc; Klinkhamer, Paul J J M; Siebers, Albertus G; van Ballegooijen, Marjolein
2012-08-01
Cervical cancer screening with liquid-based cytology (LBC) has been developed as an alternative to the conventional Papanicolaou (CP) smear. Cost-effectiveness is one of the issues when evaluating LBC. Based on the results of a Dutch randomised controlled trial, we conducted cost-effectiveness threshold analyses to investigate under what circumstances manually screened ThinPrep LBC is cost-effective for screening. The MISCAN-Cervix microsimulation model and data from the Dutch NETHCON trial (including 89,784 women) were used to estimate the costs and (quality-adjusted) life years ((QA)LYs) gained for EU screening schedules, varying cost-effectiveness threshold values. Screening strategies were primary cytological screening with LBC or CP, and triage with human papillomavirus (HPV) testing. Threshold analyses showed that screening with LBC as a primary test can be cost-effective if LBC is less than
Effect of Sleep Hygiene Education on Sleep Quality in Hemodialysis Patients
Soleimani, Farzaneh; Hasanpour-Dehkordi, Ali
2016-01-01
Introduction: Sleep is a regular, recurring and easily reversible state of the organism, characterized by relative immobility and a significant increase in the response threshold to environmental stimuli. Sleep disorders are common among haemodialysis patients. Aim: The aim of this study was to investigate the effect of sleep hygiene education on sleep quality in haemodialysis patients. Materials and Methods: This study is a randomized controlled clinical trial. The participants of this study were 60 haemodialysis patients admitted to the Dialysis Center of Shahid Ayatollah Madani Hospital of Khoy, affiliated with the Urmia University of Medical Sciences. Sampling was done randomly and the participants were randomly divided into an intervention group (30 patients) and a control group (30 patients). Sleep quality of participants was measured before and after the intervention by the Pittsburgh Sleep Quality Index (PSQI). Training in sleep hygiene behaviours was presented to the participants face-to-face. The data were analysed using SPSS 16. Results: A significant difference in the mean (standard deviation) score for PSQI (p<0.001) was observed before and after the intervention in the intervention group, while in the control group the difference was not significant (p=0.704). In addition, a significant difference was observed in the mean (standard deviation) score for PSQI between the intervention and control groups after the educational intervention (p=0.034). Conclusion: Sleep hygiene education, alongside other approaches, is a low-cost, accessible and practical method which can be implemented within a short period of time. PMID:28208884
Leimbach, Friederike; Georgiev, Dejan; Litvak, Vladimir; Antoniades, Chrystalina; Limousin, Patricia; Jahanshahi, Marjan; Bogacz, Rafal
2018-06-01
During a decision process, the evidence supporting alternative options is integrated over time, and the choice is made when the accumulated evidence for one of the options reaches a decision threshold. Humans and animals have an ability to control the decision threshold, that is, the amount of evidence that needs to be gathered to commit to a choice, and it has been proposed that the subthalamic nucleus (STN) is important for this control. Recent behavioral and neurophysiological data suggest that, in some circumstances, the decision threshold decreases with time during choice trials, allowing overcoming of indecision during difficult choices. Here we asked whether this within-trial decrease of the decision threshold is mediated by the STN and if it is affected by disrupting information processing in the STN through deep brain stimulation (DBS). We assessed 13 patients with Parkinson disease receiving bilateral STN DBS six or more months after the surgery, 11 age-matched controls, and 12 young healthy controls. All participants completed a series of decision trials, in which the evidence was presented in discrete time points, which allowed more direct estimation of the decision threshold. The participants differed widely in the slope of their decision threshold, ranging from constant threshold within a trial to steeply decreasing. However, the slope of the decision threshold did not depend on whether STN DBS was switched on or off and did not differ between the patients and controls. Furthermore, there was no difference in accuracy and RT between the patients in the on and off stimulation conditions and healthy controls. Previous studies that have reported modulation of the decision threshold by STN DBS or unilateral subthalamotomy in Parkinson disease have involved either fast decision-making under conflict or time pressure or in anticipation of high reward. Our findings suggest that, in the absence of reward, decision conflict, or time pressure for decision-making, the STN does not play a critical role in modulating the within-trial decrease of decision thresholds during the choice process.
Allegrini, Maria-Cristina; Canullo, Roberto; Campetella, Giandiego
2009-04-01
Knowledge of accuracy and precision rates is particularly important for long-term studies. Vegetation assessments include many sources of error related to overlooking and misidentification, which are usually influenced by factors such as cover estimate subjectivity, observer-biased species lists and the experience of the botanist. The vegetation assessment protocol adopted in the Italian forest monitoring programme (CONECOFOR) contains a Quality Assurance programme. The paper presents the different phases of QA and identifies the 5 main critical points of the whole protocol as sources of random or systematic error. Examples of Measurement Quality Objectives (MQOs) expressed as Data Quality Limits (DQLs) are given for vascular plant cover estimates, in order to establish the reproducibility of the data. Quality control activities were used to determine the "distance" between the surveyor teams and the control team. Selected data were acquired during the training and inter-calibration courses. In particular, an index of average cover by species groups was used to evaluate the random error (CV 4%) as the dispersion around the "true values" of the control team. The systematic error in the evaluation of species composition, caused by overlooking or misidentification of species, was calculated from the pseudo-turnover rate; detailed species censuses on smaller sampling units were accepted, as the pseudo-turnover always fell below the established 25% threshold; species density scores recorded at community level (100 m² surface) rarely exceeded that limit.
NASA Astrophysics Data System (ADS)
Lowe, A. T.; Roberts, E. A.; Galloway, A. W. E.
2016-02-01
Coastal regions around the world are changing rapidly, generating many physiological stressors for marine organisms. Food availability, a major factor determining physiological condition of marine organisms, in these systems reflects the influence of biological and environmental factors, and will likely respond dramatically to long-term changes. Using observations of phytoplankton, detritus, and their corresponding fatty acids and stable isotopes of carbon, nitrogen and sulfur, we identified environmental drivers of pelagic food availability and quality along a salinity gradient in a large tidally influenced estuary (San Juan Archipelago, Salish Sea, USA). Variation in chlorophyll a (Chl a), biomarkers and environmental conditions exhibited a similar range at both tidal and seasonal scales, highlighting a tide-related mechanism controlling productivity that is important to consider for long-term monitoring. Multiple parameters of food availability were inversely and non-linearly correlated to salinity, such that availability of high-quality (based on abundance, essential fatty acid concentration and C:N) seston increased below a salinity threshold of 30. The increased marine productivity was associated with increased pH and dissolved oxygen (DO) at lower salinity. Based on this observation we predicted that a decrease of salinity to below the threshold would result in higher Chl a, temperature, DO and pH across a range of temporal and spatial scales, and tested the prediction with a meta-analysis of available data. At all scales, these variables showed significant and consistent increases related to the salinity threshold. This finding provides important context to the increased frequency of below-threshold salinity over the last 71 years in this region, suggesting greater food availability with positive feedbacks on DO and pH. Together, these findings indicate that many of the environmental factors predicted to increase physiological stress to benthic suspension feeders (e.g. decreased salinity) may simultaneously and paradoxically improve conditions for benthic organisms.
IgE Immunoadsorption Knocks Down the Risk of Food-Related Anaphylaxis.
Dahdah, Lamia; Ceccarelli, Stefano; Amendola, Silvia; Campagnano, Pietro; Cancrini, Caterina; Mazzina, Oscar; Fiocchi, Alessandro
2015-12-01
The effects of an immunoadsorption procedure, specifically designed to remove immunoglobulin E (IgE), on food-induced anaphylaxis have never been evaluated. We evaluate the effects of IgE removal on the allergic thresholds to foods. A 6-year-old boy with anaphylaxis to multiple foods and steroid-resistant unstable allergic asthma displayed serum IgE levels of 2800 to 3500 kU/L. To lower IgE serum concentrations, which could be overridden by a high dose of omalizumab, 1.5 plasma volumes were exchanged in 8 apheresis sessions. During the procedure, serum IgE levels fell to 309 kU/L. After the procedure, the threshold of reactivity to baked milk increased from 0.125 to 5 g of milk protein (full tolerance) after the first session, and the threshold of reactivity to hazelnut increased from 0.037 to 0.142 g of protein after the first session, 0.377 g after the eighth, and 1.067 g (full tolerance) after the first administration of omalizumab. Immediately after the sixth IgE immunoadsorption, we started omalizumab therapy. In the next 40 days, the threshold of reactivity to hazelnut increased to 7.730 (full tolerance). Asthma control was obtained, treatment with montelukast was stopped, and fluticasone was tapered from 500 to 175 μg/day. The boy became partially or fully tolerant to all the tested foods, and quality of life was improved. IgE immunoadsorption, used to establish the starting basis for omalizumab administration, is able to increase the tolerance threshold to foods. Copyright © 2015 by the American Academy of Pediatrics.
Chen, Kai; Zhou, Lian; Chen, Xiaodong; Bi, Jun; Kinney, Patrick L.
2017-01-01
Background: Few multicity studies have addressed the health effects of ozone in China due to the scarcity of ozone monitoring data. A critical scientific and policy-relevant question is whether a threshold exists in the ozone-mortality relationship. Methods: Using a generalized additive model and a univariate random-effects meta-analysis, this research evaluated the relationship between short-term ozone exposure and daily total mortality in seven cities of Jiangsu Province, China during 2013–2014. Spline, subset, and threshold models were applied to further evaluate whether a safe threshold level exists. Results: This study found strong evidence that short-term ozone exposure is significantly associated with premature total mortality. A 10 μg/m3 increase in the average of the current and previous days’ maximum 8-h average ozone concentration was associated with a 0.55% (95% posterior interval: 0.34%, 0.76%) increase of total mortality. This finding is robust when considering the confounding effect of PM2.5, PM10, NO2, and SO2. No consistent evidence was found for a threshold in the ozone-mortality concentration-response relationship down to concentrations well below the current Chinese Ambient Air Quality Standard (CAAQS) level 2 standard (160 μg/m3). Conclusions: Our findings suggest that ozone concentrations below the current CAAQS level 2 standard could still induce increased mortality risks in Jiangsu Province, China. Continuous air pollution control measures could yield important health benefits in Jiangsu Province, China, even in cities that meet the current CAAQS level 2 standard. PMID:28231551
Pain Intensity Recognition Rates via Biopotential Feature Patterns with Support Vector Machines
Gruss, Sascha; Treister, Roi; Werner, Philipp; Traue, Harald C.; Crawcour, Stephen; Andrade, Adriano; Walter, Steffen
2015-01-01
Background: The clinically used methods of pain diagnosis do not allow for objective and robust measurement, and physicians must rely on the patient’s report on the pain sensation. Verbal scales, visual analog scales (VAS) or numeric rating scales (NRS) count among the most common tools, which are restricted to patients with normal mental abilities. There also exist instruments for pain assessment in people with verbal and / or cognitive impairments and instruments for pain assessment in people who are sedated and automatically ventilated. However, all these diagnostic methods either have limited reliability and validity or are very time-consuming. In contrast, biopotentials can be automatically analyzed with machine learning algorithms to provide a surrogate measure of pain intensity. Methods: In this context, we created a database of biopotentials to advance an automated pain recognition system, determine its theoretical testing quality, and optimize its performance. Eighty-five participants were subjected to painful heat stimuli (baseline, pain threshold, two intermediate thresholds, and pain tolerance threshold) under controlled conditions and the signals of electromyography, skin conductance level, and electrocardiography were collected. A total of 159 features were extracted from the mathematical groupings of amplitude, frequency, stationarity, entropy, linearity, variability, and similarity. Results: We achieved classification rates of 90.94% for baseline vs. pain tolerance threshold and 79.29% for baseline vs. pain threshold. The most selected pain features stemmed from the amplitude and similarity group and were derived from facial electromyography. Conclusion: The machine learning measurement of pain in patients could provide valuable information for a clinical team and thus support the treatment assessment. PMID:26474183
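A schematic of the classification setup described above, using scikit-learn's SVC on synthetic feature vectors; the feature dimensionality (159) and participant count (85) follow the abstract, but the data, kernel, and cross-validation scheme are assumptions rather than the authors' pipeline.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, n_features = 85, 159
# Synthetic stand-ins for biopotential feature vectors at two stimulation levels.
X = np.vstack([rng.normal(0.0, 1.0, (n_subjects, n_features)),    # baseline
               rng.normal(0.8, 1.0, (n_subjects, n_features))])   # pain tolerance threshold
y = np.repeat([0, 1], n_subjects)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"baseline vs. tolerance accuracy: {scores.mean():.2f}")
```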
Setting objective thresholds for rare event detection in flow cytometry
Richards, Adam J.; Staats, Janet; Enzor, Jennifer; McKinnon, Katherine; Frelinger, Jacob; Denny, Thomas N.; Weinhold, Kent J.; Chan, Cliburn
2014-01-01
The accurate identification of rare antigen-specific cytokine positive cells from peripheral blood mononuclear cells (PBMC) after antigenic stimulation in an intracellular staining (ICS) flow cytometry assay is challenging, as cytokine positive events may be fairly diffusely distributed and lack an obvious separation from the negative population. Traditionally, the approach by flow operators has been to manually set a positivity threshold to partition events into cytokine-positive and cytokine-negative. This approach suffers from subjectivity and inconsistency across different flow operators. The use of statistical clustering methods does not remove the need to find an objective threshold between positive and negative events since consistent identification of rare event subsets is highly challenging for automated algorithms, especially when there is distributional overlap between the positive and negative events (“smear”). We present a new approach, based on the Fβ measure, that is similar to manual thresholding in providing a hard cutoff, but has the advantage of being determined objectively. The performance of this algorithm is compared with results obtained by expert visual gating. Several ICS data sets from the External Quality Assurance Program Oversight Laboratory (EQAPOL) proficiency program were used to make the comparisons. We first show that visually determined thresholds are difficult to reproduce and pose a problem when comparing results across operators or laboratories, and we illustrate problems that occur with the use of commonly employed clustering algorithms. In contrast, a single parameterization for the Fβ method performs consistently across different centers, samples, and instruments because it optimizes the precision/recall tradeoff by using both negative and positive controls. PMID:24727143
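A minimal sketch of choosing a hard cutoff by maximising the Fβ measure against labelled events from positive and negative control samples; the single-channel setup, β value, and candidate grid are illustrative assumptions, not the EQAPOL implementation.

```python
import numpy as np

def f_beta(tp, fp, fn, beta=1.0):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

def best_threshold(intensity, is_positive, candidates, beta=1.0):
    """Return the cutoff on a cytokine channel that maximises F-beta, using events
    with known labels (e.g., from positive and negative control samples)."""
    scores = []
    for t in candidates:
        called = intensity >= t
        tp = np.sum(called & is_positive)
        fp = np.sum(called & ~is_positive)
        fn = np.sum(~called & is_positive)
        scores.append(f_beta(tp, fp, fn, beta))
    return candidates[int(np.argmax(scores))]

# Synthetic example: a diffuse negative "smear" plus a small positive population.
rng = np.random.default_rng(0)
intensity = np.concatenate([rng.normal(1, 1, 5000), rng.normal(4, 1, 50)])
labels = np.concatenate([np.zeros(5000, dtype=bool), np.ones(50, dtype=bool)])
print(best_threshold(intensity, labels, np.linspace(0, 6, 61)))
```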
Should we expect population thresholds for wildlife disease?
Lloyd-Smith, James O.; Cross, P.C.; Briggs, C.J.; Daugherty, M.; Getz, W.M.; Latto, J.; Sanchez, M.; Smith, A.; Swei, A.
2005-01-01
Host population thresholds for invasion or persistence of infectious disease are core concepts of disease ecology, and underlie on-going and controversial disease control policies based on culling and vaccination. Empirical evidence for these thresholds in wildlife populations has been sparse, however, though recent studies have narrowed this gap. Here we review the theoretical bases for population thresholds for disease, revealing why they are difficult to measure and sometimes are not even expected, and identifying important facets of wildlife ecology left out of current theories. We discuss strengths and weaknesses of selected empirical studies that have reported disease thresholds for wildlife, identify recurring obstacles, and discuss implications of our imperfect understanding of wildlife thresholds for disease control policy.
Quality assessment of color images based on the measure of just noticeable color difference
NASA Astrophysics Data System (ADS)
Chou, Chun-Hsien; Hsu, Yun-Hsiang
2014-01-01
Accurate assessment of the quality of color images is an important step in many image processing systems that convey visual information of the reproduced images. An accurate objective image quality assessment (IQA) method is expected to give assessment results that agree highly with subjective assessment. To assess the quality of color images, many approaches simply apply the metric for assessing the quality of gray scale images to each of the three color channels of the color image, neglecting the correlation among the three color channels. In this paper, a metric for assessing color images' quality is proposed, in which the model of variable just-noticeable color difference (VJNCD) is employed to estimate the visibility thresholds of distortion inherent in each color pixel. With the estimated visibility thresholds of distortion, the proposed metric measures the average perceptible distortion in terms of the quantized distortion according to a perceptual error map similar to that defined by the National Bureau of Standards (NBS) for converting the color difference enumerated by CIEDE2000 to the objective score of perceptual quality assessment. The perceptual error map in this case is designed for each pixel according to the visibility threshold estimated by the VJNCD model. The performance of the proposed metric is verified by assessing the test images in the LIVE database, and is compared with those of many well-known IQA metrics. Experimental results indicate that the proposed metric is an effective IQA method that can accurately predict the image quality of color images in terms of the correlation between objective scores and subjective evaluation.
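A toy sketch of the general idea of scoring only distortion that exceeds a visibility threshold; a single global just-noticeable difference of 2.3 ΔE units stands in for the per-pixel VJNCD thresholds and the NBS-style error map, so this is only a simplified analogue of the proposed metric.

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def perceptible_distortion(ref_rgb, test_rgb, jnd=2.3):
    """Average colour error above a visibility threshold (lower is better quality).

    ref_rgb, test_rgb: float RGB images in [0, 1]; jnd is a global stand-in
    for the per-pixel VJNCD visibility thresholds of the paper.
    """
    delta_e = deltaE_ciede2000(rgb2lab(ref_rgb), rgb2lab(test_rgb))
    return float(np.maximum(delta_e - jnd, 0.0).mean())
```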
QuickEval: a web application for psychometric scaling experiments
NASA Astrophysics Data System (ADS)
Van Ngo, Khai; Storvik, Jehans J.; Dokkeberg, Christopher A.; Farup, Ivar; Pedersen, Marius
2015-01-01
QuickEval is a web application for carrying out psychometric scaling experiments. It offers the possibility of running controlled experiments in a laboratory, or large-scale experiments over the web for people all over the world. To the best of our knowledge, it is the first software of its kind in the image quality field, and the first to support the three most common scaling methods: paired comparison, rank order, and category judgement. A hoped-for side effect of this newly created software is that it will lower the threshold for performing psychometric experiments, improve the quality of the experiments being carried out, make it easier to reproduce experiments, and increase research on image quality both in academia and industry. The web application is available at www.colourlab.no/quickeval.
Rethinking the Clinically Based Thresholds of TransCelerate BioPharma for Risk-Based Monitoring.
Zink, Richard C; Dmitrienko, Anastasia; Dmitrienko, Alex
2018-01-01
The quality of data from clinical trials has received a great deal of attention in recent years. Of central importance is the need to protect the well-being of study participants and maintain the integrity of final analysis results. However, traditional approaches to assess data quality have come under increased scrutiny as providing little benefit for the substantial cost. Numerous regulatory guidance documents and industry position papers have described risk-based approaches to identify quality and safety issues. In particular, the position paper of TransCelerate BioPharma recommends defining risk thresholds to assess safety and quality risks based on past clinical experience. This exercise can be extremely time-consuming, and the resulting thresholds may only be relevant to a particular therapeutic area, patient or clinical site population. In addition, predefined thresholds cannot account for safety or quality issues where the underlying rate of observing a particular problem may change over the course of a clinical trial, and often do not consider varying patient exposure. In this manuscript, we appropriate rules commonly utilized for funnel plots to define a traffic-light system for risk indicators based on statistical criteria that consider the duration of patient follow-up. Further, we describe how these methods can be adapted to assess changing risk over time. Finally, we illustrate numerous graphical approaches to summarize and communicate risk, and discuss hybrid clinical-statistical approaches to allow for the assessment of risk at sites with low patient enrollment. We illustrate the aforementioned methodologies for a clinical trial in patients with schizophrenia. Funnel plots are a flexible graphical technique that can form the basis for a risk-based strategy to assess data integrity, while considering site sample size, patient exposure, and changing risk across time.
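As a rough illustration of the funnel-plot idea described above, the sketch below draws approximate control limits around a pooled event proportion as a function of site size and flags sites falling outside them; the normal approximation, the z-values, and all input numbers are assumptions for illustration, not the authors' traffic-light rules.

```python
import numpy as np

def funnel_limits(p0, n, z=1.96):
    """Approximate control limits for a proportion p0 at sample size / exposure n."""
    se = np.sqrt(p0 * (1.0 - p0) / n)
    return p0 - z * se, p0 + z * se

# Example: flag sites whose adverse-event proportion falls outside ~99.8% limits.
p_overall = 0.12                        # pooled event proportion across all sites
site_n = np.array([20, 55, 130, 400])   # patients (or exposure units) per site
site_events = np.array([6, 7, 14, 50])
site_p = site_events / site_n
lo, hi = funnel_limits(p_overall, site_n, z=3.09)  # ~99.8% two-sided limits
flags = (site_p < lo) | (site_p > hi)
print(list(zip(site_p.round(3), flags)))
```

Because the limits widen as n shrinks, small sites are not flagged for proportions that would be alarming at a large site, which is what makes the funnel approach attractive for sites with low enrollment.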
Method for depositing layers of high quality semiconductor material
Guha, Subhendu; Yang, Chi C.
2001-08-14
Plasma deposition of substantially amorphous semiconductor materials is carried out under a set of deposition parameters which are selected so that the process operates near the amorphous/microcrystalline threshold. This threshold varies as a function of the thickness of the depositing semiconductor layer; and, deposition parameters, such as diluent gas concentrations, must be adjusted as a function of layer thickness. Also, this threshold varies as a function of the composition of the depositing layer, and in those instances where the layer composition is profiled throughout its thickness, deposition parameters must be adjusted accordingly so as to maintain the amorphous/microcrystalline threshold.
Implementation of statistical process control for proteomic experiments via LC MS/MS.
Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J
2014-04-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
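A minimal sketch of the kind of empirically derived thresholds such control charts rely on: limits set from an initial block of user-defined QC runs as mean ± 3 standard deviations, with later runs flagged when they fall outside. The ±3σ rule, the metric chosen, and the example values are illustrative assumptions, not the tool's source code.

```python
import numpy as np

def control_limits(qc_values, k=3.0):
    """Derive Shewhart-style control limits from an initial set of QC runs."""
    mu, sigma = np.mean(qc_values), np.std(qc_values, ddof=1)
    return mu - k * sigma, mu + k * sigma

# Example: retention times (min) of a QC peptide over the first 20 runs,
# then flag subsequent runs that drift outside the empirical limits.
rng = np.random.default_rng(7)
baseline_rt = rng.normal(21.4, 0.05, size=20)
lo, hi = control_limits(baseline_rt)
new_rt = [21.41, 21.38, 21.62]
flags = [not (lo <= rt <= hi) for rt in new_rt]
print(round(lo, 3), round(hi, 3), flags)
```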
Sun, Jun; Duan, Yizhou; Li, Jiangtao; Liu, Jiaying; Guo, Zongming
2013-01-01
In the first part of this paper, we derive a source model describing the relationship between the rate, distortion, and quantization steps of the dead-zone plus uniform threshold scalar quantizers with nearly uniform reconstruction quantizers for generalized Gaussian distribution. This source model consists of rate-quantization, distortion-quantization (D-Q), and distortion-rate (D-R) models. In this part, we first rigorously confirm the accuracy of the proposed source model by comparing the calculated results with the coding data of JM 16.0. Efficient parameter estimation strategies are then developed to better employ this source model in our two-pass rate control method for H.264 variable bit rate coding. Based on our D-Q and D-R models, the proposed method is of high stability, low complexity and is easy to implement. Extensive experiments demonstrate that the proposed method achieves: 1) average peak signal-to-noise ratio variance of only 0.0658 dB, compared to 1.8758 dB of JM 16.0's method, with an average rate control error of 1.95% and 2) significant improvement in smoothing the video quality compared with the latest two-pass rate control method.
Effects of sleep bruxism related tinnitus on quality of life.
Saltürk, Ziya; Özçelik, Erdinç; Kumral, Tolgar Lütfi; Çakır, Ozan; Kasımoğlu, Şeref; Atar, Yavuz; Yıldırım, Güven; Berkiten, Güler; Göker, Ayşe Enise; Uyar, Yavuz
2015-01-01
This study aims to analyze the subjective and objective characteristics of tinnitus in sleep bruxism patients. The study included 57 patients (12 males, 45 females; mean age 33.89±12.50 years; range 19 to 55 years) with sleep bruxism and tinnitus (sleep bruxism group) and 24 patients (6 males, 18 females; mean age 43.75±16.19 years; range 21 to 58 years) with tinnitus only (control group). Sleep bruxism was diagnosed by the diagnostic criteria of the American Academy of Sleep Medicine. Patients underwent pure tone audiometry to detect hearing thresholds at standard and high frequencies. Tinnitus frequency and loudness were assessed. Subjective aspects of tinnitus were identified by the tinnitus handicap inventory. The statistical analysis revealed that the sleep bruxism group had significantly lower hearing thresholds except at 1000 Hz and 2000 Hz. Tinnitus frequency was between 3000 Hz and 18000 Hz in the sleep bruxism group and between 6000 and 16000 Hz in the control group, with no statistically significant difference (p=0.362). The sleep bruxism group had significantly lower tinnitus loudness and tinnitus handicap inventory scores in comparison to the control group (p=0.024 and p=0.000, respectively). Tinnitus associated with sleep bruxism and temporomandibular joint issues thus has a higher frequency and lower loudness than tinnitus in patients without sleep bruxism.
Controlled sub-nanometer tuning of photonic crystal resonator by carbonaceous nano-dots.
Seo, Min-Kyo; Park, Hong-Gyu; Yang, Jin-Kyu; Kim, Ju-Young; Kim, Se-Heon; Lee, Yong-Hee
2008-06-23
We propose and demonstrate a scheme that enables spectral tuning of a photonic crystal high-quality resonant mode, in steps finer than 0.2 nm, via electron beam induced deposition of carbonaceous nano-dots. The position and size of a nano-dot with a diameter of <100 nm are controlled to an accuracy on the order of nanometers. The possibility of selective modal tuning is also demonstrated by placing nano-dots at locations pre-determined by theoretical computation. The lasing threshold of a photonic crystal mode tends to increase when a nano-dot is grown at the point of strong electric field, showing the absorptive nature of the nano-dot.
Clinical Practice Guidelines From the AABB: Red Blood Cell Transfusion Thresholds and Storage.
Carson, Jeffrey L; Guyatt, Gordon; Heddle, Nancy M; Grossman, Brenda J; Cohn, Claudia S; Fung, Mark K; Gernsheimer, Terry; Holcomb, John B; Kaplan, Lewis J; Katz, Louis M; Peterson, Nikki; Ramsey, Glenn; Rao, Sunil V; Roback, John D; Shander, Aryeh; Tobian, Aaron A R
2016-11-15
More than 100 million units of blood are collected worldwide each year, yet the indication for red blood cell (RBC) transfusion and the optimal length of RBC storage prior to transfusion are uncertain. To provide recommendations for the target hemoglobin level for RBC transfusion among hospitalized adult patients who are hemodynamically stable and the length of time RBCs should be stored prior to transfusion. Reference librarians conducted a literature search for randomized clinical trials (RCTs) evaluating hemoglobin thresholds for RBC transfusion (1950-May 2016) and RBC storage duration (1948-May 2016) without language restrictions. The results were summarized using the Grading of Recommendations Assessment, Development and Evaluation method. For RBC transfusion thresholds, 31 RCTs included 12 587 participants and compared restrictive thresholds (transfusion not indicated until the hemoglobin level is 7-8 g/dL) with liberal thresholds (transfusion not indicated until the hemoglobin level is 9-10 g/dL). The summary estimates across trials demonstrated that restrictive RBC transfusion thresholds were not associated with higher rates of adverse clinical outcomes, including 30-day mortality, myocardial infarction, cerebrovascular accident, rebleeding, pneumonia, or thromboembolism. For RBC storage duration, 13 RCTs included 5515 participants randomly allocated to receive fresher blood or standard-issue blood. These RCTs demonstrated that fresher blood did not improve clinical outcomes. It is good practice to consider the hemoglobin level, the overall clinical context, patient preferences, and alternative therapies when making transfusion decisions regarding an individual patient. Recommendation 1: a restrictive RBC transfusion threshold in which the transfusion is not indicated until the hemoglobin level is 7 g/dL is recommended for hospitalized adult patients who are hemodynamically stable, including critically ill patients, rather than when the hemoglobin level is 10 g/dL (strong recommendation, moderate quality evidence). A restrictive RBC transfusion threshold of 8 g/dL is recommended for patients undergoing orthopedic surgery, cardiac surgery, and those with preexisting cardiovascular disease (strong recommendation, moderate quality evidence). The restrictive transfusion threshold of 7 g/dL is likely comparable with 8 g/dL, but RCT evidence is not available for all patient categories. These recommendations do not apply to patients with acute coronary syndrome, severe thrombocytopenia (patients treated for hematological or oncological reasons who are at risk of bleeding), and chronic transfusion-dependent anemia (not recommended due to insufficient evidence). Recommendation 2: patients, including neonates, should receive RBC units selected at any point within their licensed dating period (standard issue) rather than limiting patients to transfusion of only fresh (storage length: <10 days) RBC units (strong recommendation, moderate quality evidence). Research in RBC transfusion medicine has significantly advanced the science in recent years and provides high-quality evidence to inform guidelines. A restrictive transfusion threshold is safe in most clinical settings and the current blood banking practices of using standard-issue blood should be continued.
Tosh, J; Dixon, S; Carter, A; Daley, A; Petty, J; Roalfe, A; Sharrack, B; Saxton, J M
2014-07-01
Exercise is a safe, non-pharmacological adjunctive treatment for people with multiple sclerosis but cost-effective approaches to implementing exercise within health care settings are needed. The objective of this paper is to assess the cost effectiveness of a pragmatic exercise intervention in conjunction with usual care compared to usual care only in people with mild to moderate multiple sclerosis. A cost-utility analysis of a pragmatic randomised controlled trial over nine months of follow-up was conducted. A total of 120 people with multiple sclerosis were randomised (1:1) to the intervention or usual care. Exercising participants received 18 supervised and 18 home exercise sessions over 12 weeks. The primary outcome for the cost utility analysis was the incremental cost per quality-adjusted life year (QALY) gained, calculated using utilities measured by the EQ-5D questionnaire. The incremental cost per QALY of the intervention was £10,137 per QALY gained compared to usual care. The probability of being cost effective at a £20,000 per QALY threshold was 0.75, rising to 0.78 at a £30,000 per QALY threshold. The pragmatic exercise intervention is highly likely to be cost effective at current established thresholds, and there is scope for it to be tailored to particular sub-groups of patients or services to reduce its cost impact. © The Author(s) 2013.
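The probabilities quoted above (0.75 at £20,000 per QALY, 0.78 at £30,000) are the kind of numbers read off a cost-effectiveness acceptability curve. A hedged sketch of how such a probability is computed from bootstrap replicates of incremental costs and QALYs follows; the replicate distributions and variable names are invented for illustration, not the trial's data.

```python
import numpy as np

def prob_cost_effective(delta_cost, delta_qaly, wtp):
    """Share of bootstrap replicates with positive incremental net monetary benefit."""
    nmb = wtp * np.asarray(delta_qaly) - np.asarray(delta_cost)
    return float(np.mean(nmb > 0))

# Illustrative bootstrap replicates of incremental cost (GBP) and incremental QALYs
rng = np.random.default_rng(1)
d_cost = rng.normal(300, 400, 5000)
d_qaly = rng.normal(0.03, 0.02, 5000)
for threshold in (20000, 30000):
    print(threshold, prob_cost_effective(d_cost, d_qaly, threshold))
```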
Top down arsenic uncertainty measurement in water and sediments from Guarapiranga dam (Brazil)
NASA Astrophysics Data System (ADS)
Faustino, M. G.; Lange, C. N.; Monteiro, L. R.; Furusawa, H. A.; Marques, J. R.; Stellato, T. B.; Soares, S. M. V.; da Silva, T. B. S. C.; da Silva, D. B.; Cotrim, M. E. B.; Pires, M. A. F.
2018-03-01
Assessing total arsenic measurements against a legal threshold demands more than a mean-and-standard-deviation approach. Accordingly, an evaluation of the analytical measurement uncertainty was conducted in order to comply with legal requirements and to allow a balance of arsenic between the water and sediment compartments. A top-down approach to measurement uncertainty was applied to evaluate arsenic concentrations in water and sediments from the Guarapiranga dam (São Paulo, Brazil). Laboratory quality control data and arsenic interlaboratory test results were used in this approach to estimate the uncertainties associated with the methodology.
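A common top-down recipe combines within-laboratory reproducibility taken from QC control charts with a bias component estimated from interlaboratory or reference-material data, and reports the expanded uncertainty at k = 2. The sketch below shows that combination; all numerical inputs are assumed for illustration and are not taken from the study.

```python
import numpy as np

def top_down_uncertainty(u_rw, bias_values, u_ref, k=2.0):
    """Combine within-lab reproducibility (u_rw) with a bias component.

    bias_values : relative biases observed in proficiency/interlab rounds (%)
    u_ref       : relative uncertainty of the assigned/reference values (%)
    Returns relative combined and expanded uncertainty (%).
    """
    rms_bias = np.sqrt(np.mean(np.square(bias_values)))
    u_bias = np.sqrt(rms_bias**2 + u_ref**2)
    u_c = np.sqrt(u_rw**2 + u_bias**2)
    return u_c, k * u_c

u_c, U = top_down_uncertainty(u_rw=3.5, bias_values=[2.1, -1.4, 3.0], u_ref=1.5)
print(f"combined = {u_c:.1f}%, expanded (k=2) = {U:.1f}%")
```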
On the Design of a Fuzzy Logic-Based Control System for Freeze-Drying Processes.
Fissore, Davide
2016-12-01
This article is focused on the design of a fuzzy logic-based control system to optimize a drug freeze-drying process. The goal of the system is to keep product temperature as close as possible to the threshold value of the formulation being processed, without exceeding it, in such a way that product quality is not jeopardized and the sublimation flux is maximized. The method involves the measurement of product temperature and a set of rules that have been obtained through process simulation with the goal of obtaining a unique set of rules for products with very different characteristics. Input variables are the difference between the temperature of the product and the threshold value, the difference between the temperature of the heating fluid and that of the product, and the rate of change of product temperature. The output variables are the variation of the temperature of the heating fluid and the pressure in the drying chamber. The effect of the starting value of the input variables and of the control interval has been investigated, thus resulting in the optimal configuration of the control system. An experimental investigation in a pilot-scale freeze-dryer was carried out to validate the proposed system. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
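A deliberately simplified sketch of the fuzzy idea described above: triangular membership functions and two hand-written rules that adjust the heating-fluid temperature from the product-to-threshold margin and the product temperature trend. The membership breakpoints, rule base, and output values are invented for illustration and are not the article's rule set.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_fluid_step(margin_c, trend_c_per_min):
    """margin_c: threshold temperature minus product temperature (degC, >0 is safe).
    trend_c_per_min: rate of change of product temperature.
    Returns a suggested change of the heating-fluid temperature (degC)."""
    # Fuzzify the inputs (illustrative breakpoints).
    margin_small = tri(margin_c, -1.0, 0.5, 2.0)
    margin_large = tri(margin_c, 1.0, 4.0, 8.0)
    rising_fast = tri(trend_c_per_min, 0.05, 0.2, 0.5)
    # Rule 1: small margin OR fast temperature rise -> cool the fluid.
    cool = max(margin_small, rising_fast)
    # Rule 2: large margin AND not rising fast -> heat the fluid to speed sublimation.
    heat = min(margin_large, 1.0 - rising_fast)
    # Defuzzify with a weighted average of the two rule outputs (-2 degC vs +2 degC).
    if cool + heat == 0:
        return 0.0
    return (-2.0 * cool + 2.0 * heat) / (cool + heat)

print(fuzzy_fluid_step(margin_c=0.8, trend_c_per_min=0.3))   # close to threshold: cool
print(fuzzy_fluid_step(margin_c=5.0, trend_c_per_min=0.02))  # large margin: heat
```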
48 CFR 46.202-2 - Government reliance on inspection by contractor.
Code of Federal Regulations, 2014 CFR
2014-10-01
... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... acquired at or below the simplified acquisition threshold conform to contract quality requirements before... the contractor's internal work processes. In making the determination, the contracting officer shall...
48 CFR 46.202-2 - Government reliance on inspection by contractor.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... acquired at or below the simplified acquisition threshold conform to contract quality requirements before... the contractor's internal work processes. In making the determination, the contracting officer shall...
48 CFR 46.202-2 - Government reliance on inspection by contractor.
Code of Federal Regulations, 2012 CFR
2012-10-01
... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... acquired at or below the simplified acquisition threshold conform to contract quality requirements before... the contractor's internal work processes. In making the determination, the contracting officer shall...
Cameron, David; Ubels, Jasper; Norström, Fredrik
2018-01-01
ABSTRACT Background: The amount a government should be willing to invest in adopting new medical treatments has long been under debate. With many countries using formal cost-effectiveness (C/E) thresholds when examining potential new treatments and ever-growing medical costs, accurately setting the level of a C/E threshold can be essential for an efficient healthcare system. Objectives: The aim of this systematic review is to describe the prominent approaches to setting a C/E threshold, compile available national-level C/E threshold data and willingness-to-pay (WTP) data, and to discern whether associations exist between these values, gross domestic product (GDP) and health-adjusted life expectancy (HALE). This review further examines current obstacles faced with the presently available data. Methods: A systematic review was performed to collect articles which have studied national C/E thresholds and willingness-to-pay (WTP) per quality-adjusted life year (QALY) in the general population. Associations between GDP, HALE, WTP, and C/E thresholds were analyzed with correlations. Results: Seventeen countries were identified from nine unique sources to have formal C/E thresholds within our inclusion criteria. Thirteen countries from nine sources were identified to have WTP per QALY data within our inclusion criteria. Two possible associations were identified: C/E thresholds with HALE (quadratic correlation of 0.63), and C/E thresholds with GDP per capita (polynomial correlation of 0.84). However, these results are based on few observations and therefore firm conclusions cannot be made. Conclusions: Most national C/E thresholds identified in our review fall within the WHO’s recommended range of one-to-three times GDP per capita. However, the quality and quantity of data available regarding national average WTP per QALY, opportunity costs, and C/E thresholds is poor in comparison to the importance of adequate investment in healthcare. There exists an obvious risk that countries might either over- or underinvest in healthcare if they base their decision-making process on erroneous presumptions or non-evidence-based methodologies. The commonly referred to value of 100,000$ USD per QALY may potentially have some basis. PMID:29564962
Threshold Concepts and Student Engagement: Revisiting Pedagogical Content Knowledge
ERIC Educational Resources Information Center
Zepke, Nick
2013-01-01
This article revisits the notion that to facilitate quality learning requires teachers in higher education to have pedagogical content knowledge. It constructs pedagogical content knowledge as a teaching and learning space that brings content and pedagogy together. On the content knowledge side, it suggests that threshold concepts, akin to a…
Threshold friction velocity influenced by wetness of soils within the Columbia Plateau
USDA-ARS?s Scientific Manuscript database
Windblown dust impacts air quality in the Columbia Plateau of the U.S. Pacific Northwest. Wind erosion of agricultural lands, which is the predominate source of windblown dust in the region, occurs when the friction velocity exceeds the threshold friction velocity (TFV) of the surface. Soil moisture...
Some aspects of doping and medication control in equine sports.
Houghton, Ed; Maynard, Steve
2010-01-01
This chapter reviews drug and medication control in equestrian sports and addresses the rules of racing, the technological advances that have been made in drug detection and the importance of metabolism studies in the development of effective drug surveillance programmes. Typical approaches to screening and confirmatory analysis are discussed, as are the quality processes that underpin these procedures. The chapter also addresses four specific topics relevant to equestrian sports: substances controlled by threshold values, the approach adopted recently by European racing authorities to control some therapeutic substances, anabolic steroids in the horse and LC-MS analysis in drug testing in animal sports and metabolism studies. The purpose of discussing these specific topics is to emphasise the importance of research and development and collaboration to further global harmonisation and the development and support of international rules.
Both, Vanderlei; Thewes, Fabio Rodrigo; Brackmann, Auri; de Oliveira Anese, Rogerio; de Freitas Ferreira, Daniele; Wagner, Roger
2017-01-15
The effects of dynamic controlled atmosphere (DCA) storage based on chlorophyll fluorescence (DCA-CF) and respiratory quotient (DCA-RQ) on the quality and volatile profile of 'Royal Gala' apple were evaluated. DCA storage reduces ACC (1-aminocyclopropane-1-carboxylate) oxidase activity, ethylene production and respiration rate of apples stored for 9 months at 1.0°C plus 7 days at 20°C, resulting in higher flesh firmness, higher titratable acidity, fewer physiological disorders and a higher proportion of healthy fruit. Storage in a regular controlled atmosphere gave higher levels of key volatiles (butyl acetate, 2-methylbutyl acetate and hexyl acetate) compared to fruit stored under DCA-CF, but fruit stored under DCA-RQ 1.5 and RQ 2.0 also showed higher amounts of key volatile compounds, with an increment in ethanol and ethyl acetate that remained far below the odour threshold. Storage in DCA-CF reduces fruit ester production, especially 2-methylbutyl acetate, which is the most important component of 'Royal Gala' apple flavour. Copyright © 2016 Elsevier Ltd. All rights reserved.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 46.404 Section 46.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Government Contract Quality Assurance...
Durante, Alessandra Spada; Wieselberg, Margarita Bernal; Roque, Nayara; Carvalho, Sheila; Pucci, Beatriz; Gudayol, Nicolly; de Almeida, Kátia
The use of hearing aids by individuals with hearing loss brings a better quality of life. Access to and benefit from these devices may be compromised in patients who present difficulties or limitations in traditional behavioral audiological evaluation, such as newborns and small children, individuals with auditory neuropathy spectrum, autism, and intellectual deficits, and in adults and the elderly with dementia. These populations (or individuals) are unable to undergo a behavioral assessment, and generate a growing demand for objective methods to assess hearing. Cortical auditory evoked potentials have been used for decades to estimate hearing thresholds. Current technological advances have led to the development of equipment that allows their clinical use, with features that enable greater accuracy, sensitivity, and specificity, and the possibility of automated detection, analysis, and recording of cortical responses. To determine and correlate behavioral auditory thresholds with cortical auditory thresholds obtained from an automated response analysis technique. The study included 52 adults, divided into two groups: 21 adults with moderate to severe hearing loss (study group); and 31 adults with normal hearing (control group). An automated system of detection, analysis, and recording of cortical responses (HEARLab®) was used to record the behavioral and cortical thresholds. The subjects remained awake in an acoustically treated environment. Altogether, 150 tone bursts at 500, 1000, 2000, and 4000 Hz were presented through insert earphones in descending-ascending intensity. The lowest level at which the subject detected the sound stimulus was defined as the behavioral (hearing) threshold (BT). The lowest level at which a cortical response was observed was defined as the cortical electrophysiological threshold. These two responses were correlated using linear regression. The cortical electrophysiological threshold was, on average, 7.8 dB higher than the behavioral threshold for the group with hearing loss and, on average, 14.5 dB higher for the group without hearing loss for all studied frequencies. The cortical electrophysiological thresholds obtained with the use of an automated response detection system were highly correlated with behavioral thresholds in the group of individuals with hearing loss. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Quality of life in childhood, adolescence and adult food allergy: Patient and parent perspectives.
Stensgaard, A; Bindslev-Jensen, C; Nielsen, D; Munch, M; DunnGalvin, A
2017-04-01
Studies of children with food allergy typically only include the mother and have not investigated the relationship between the amount of allergen needed to elicit a clinical reaction (threshold) and health-related quality of life (HRQL). Our aims were (i) to compare self-reported and parent-reported HRQL in different age groups, (ii) to evaluate the impact of severity of allergic reaction and threshold on HRQL, and (iii) to investigate factors associated with patient-reported and parent-reported HRQL. Age-appropriate Food Allergy Quality of Life Questionnaires (FAQLQ) were completed by 73 children, 49 adolescents and 29 adults with peanut, hazelnut or egg allergy. Parents (197 mothers, 120 fathers) assessed their child's HRQL using the FAQLQ-Parent form. Clinical data and threshold values were obtained from a hospital database. Significant factors for HRQL were investigated using univariate and multivariate regression. Female patients reported greater impact of food allergy on HRQL than males did. Egg and hazelnut thresholds did not affect HRQL, but lower peanut threshold was associated with worse HRQL. Both parents scored their child's HRQL better than the child's own assessment, but whereas mother-reported HRQL was significantly affected by limitations in the child's social life, father-reported HRQL was affected by limitations in the family's social life. Severity of allergic reaction did not contribute significantly to HRQL. The risk of accidental allergen ingestion and limitations in social life are associated with worse HRQL. Fathers provide a unique perspective and should have a greater opportunity to contribute to food allergy research. © 2016 John Wiley & Sons Ltd.
Souza, M F; Veloso, L F A; Sampaio, M V; Davis, J A
2017-08-01
Biological features of Diaeretiella rapae (McIntosh), an aphid parasitoid, are conditioned by temperature and host. However, studies of host quality changes due to temperature adaptability have not been performed previously. Therefore, this study evaluated the adaptability of Lipaphis pseudobrassicae (Davis) and Myzus persicae (Sulzer) to high temperature, high temperature effect on their quality as hosts for D. rapae, and on parasitoid's thermal threshold. Aphid development, survivorship, fecundity, and longevity were compared at 19 °C and 28 °C. Host quality in different temperatures was determined through evaluation of parasitoid biology. Thermal threshold of D. rapae was determined using development time data. At 28 °C, development time, rate of immature survival, and total fecundity rates were greater in L. pseudobrassicae than in M. persicae. Development time of D. rapae in L. pseudobrassicae was shorter than that in M. persicae at 28 °C and 31 °C for females and at 31 °C for males. The thermal threshold of D. rapae was 6.38 °C and 3.33 °C for females and 4.45 °C and 3.63 °C for males developed on L. pseudobrassicae and M. persicae, respectively. Diaeretiella rapae size gain was greater in L. pseudobrassicae than that in M. persicae at 25 °C and 28 °C. Lipaphis pseudobrassicae showed better adaptation than M. persicae to elevated temperatures, which resulted in a better quality host for D. rapae at temperatures of 28 °C and 31 °C and a higher lower thermal threshold when the parasitoid developed within L. pseudobrassicae. The host's adaptation to high temperatures is a determinant of host quality for the parasitoid at that same climatic condition. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
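Lower thermal thresholds like those quoted above are typically obtained from a linear regression of development rate on rearing temperature, with the threshold at the x-intercept and the thermal constant as the reciprocal of the slope. The sketch below follows that standard linear degree-day model; the example data are made up and not from the study.

```python
import numpy as np

def thermal_threshold(temps_c, dev_days):
    """Estimate the lower thermal threshold (x-intercept) and thermal constant
    (degree-days) from development times at several constant temperatures."""
    rate = 1.0 / np.asarray(dev_days, dtype=float)     # development rate (1/day)
    slope, intercept = np.polyfit(temps_c, rate, 1)    # rate = intercept + slope * T
    t0 = -intercept / slope                            # lower thermal threshold (degC)
    degree_days = 1.0 / slope                          # thermal constant (degree-days)
    return t0, degree_days

# Illustrative data: development time (days) at four constant temperatures.
print(thermal_threshold([16, 19, 25, 28], [28.0, 20.5, 12.5, 10.8]))
```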
48 CFR 46.202-2 - Government reliance on inspection by contractor.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ACQUISITION REGULATION CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-2 Government... the contractor to accomplish all inspection and testing needed to ensure that supplies or services acquired at or below the simplified acquisition threshold conform to contract quality requirements before...
NASA Astrophysics Data System (ADS)
Amanda, A. R.; Widita, R.
2016-03-01
The aim of this research is to compare several lung image segmentation methods based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR)). In this study, the methods compared were connected threshold, neighborhood connected, and threshold level set segmentation applied to images of the lungs. These three methods require one important parameter, i.e., the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images was InsightToolkit-4.7.0 (ITK). Five lung images were analyzed, and the results were compared using the performance evaluation parameters computed in MATLAB. A segmentation method is considered to be of good quality if it yields the smallest MSE value and the highest PSNR. The results show that connected threshold performs best for four of the sample images, while threshold level set segmentation performs best for the remaining one. Therefore, it can be concluded that the connected threshold method is better than the other two methods for these cases.
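For reference, the two evaluation parameters used above can be computed as follows; this is the generic definition for 8-bit images, not the authors' MATLAB script.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between a reference image a and a segmented/processed image b."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return np.mean((a - b) ** 2)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means b is closer to the reference a."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak**2 / m)
```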
Using machine learning to examine medication adherence thresholds and risk of hospitalization.
Lo-Ciganic, Wei-Hsuan; Donohue, Julie M; Thorpe, Joshua M; Perera, Subashan; Thorpe, Carolyn T; Marcum, Zachary A; Gellad, Walid F
2015-08-01
Quality improvement efforts are frequently tied to patients achieving ≥80% medication adherence. However, there is little empirical evidence that this threshold optimally predicts important health outcomes. To apply machine learning to examine how adherence to oral hypoglycemic medications is associated with avoidance of hospitalizations, and to identify adherence thresholds for optimal discrimination of hospitalization risk. A retrospective cohort study of 33,130 non-dual-eligible Medicaid enrollees with type 2 diabetes. We randomly selected 90% of the cohort (training sample) to develop the prediction algorithm and used the remaining (testing sample) for validation. We applied random survival forests to identify predictors for hospitalization and fit survival trees to empirically derive adherence thresholds that best discriminate hospitalization risk, using the proportion of days covered (PDC). Time to first all-cause and diabetes-related hospitalization. The training and testing samples had similar characteristics (mean age, 48 y; 67% female; mean PDC=0.65). We identified 8 important predictors of all-cause hospitalizations (rank in order): prior hospitalizations/emergency department visit, number of prescriptions, diabetes complications, insulin use, PDC, number of prescribers, Elixhauser index, and eligibility category. The adherence thresholds most discriminating for risk of all-cause hospitalization varied from 46% to 94% according to patient health and medication complexity. PDC was not predictive of hospitalizations in the healthiest or most complex patient subgroups. Adherence thresholds most discriminating of hospitalization risk were not uniformly 80%. Machine-learning approaches may be valuable to identify appropriate patient-specific adherence thresholds for measuring quality of care and targeting nonadherent patients for intervention.
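A hedged sketch of the proportion of days covered (PDC) used as the adherence measure above: mark each day in the observation window covered by any dispensed supply and take the covered fraction. The fill-record format is an assumption for illustration; real claims-based calculations also handle overlapping fills, switching, and truncation rules.

```python
from datetime import date, timedelta

def pdc(fills, start, end):
    """fills: list of (fill_date, days_supply); start/end: observation window.
    Returns the proportion of days in [start, end] covered by at least one fill."""
    n_days = (end - start).days + 1
    covered = [False] * n_days
    for fill_date, days_supply in fills:
        for d in range(days_supply):
            day = fill_date + timedelta(days=d)
            if start <= day <= end:
                covered[(day - start).days] = True
    return sum(covered) / n_days

fills = [(date(2024, 1, 1), 30), (date(2024, 2, 15), 30), (date(2024, 4, 1), 30)]
print(round(pdc(fills, date(2024, 1, 1), date(2024, 6, 30)), 2))
```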
Reeves, Barnaby C; Pike, Katie; Rogers, Chris A; Brierley, Rachel Cm; Stokes, Elizabeth A; Wordsworth, Sarah; Nash, Rachel L; Miles, Alice; Mumford, Andrew D; Cohen, Alan; Angelini, Gianni D; Murphy, Gavin J
2016-08-01
Uncertainty about optimal red blood cell transfusion thresholds in cardiac surgery is reflected in widely varying transfusion rates between surgeons and cardiac centres. To test the hypothesis that a restrictive compared with a liberal threshold for red blood cell transfusion after cardiac surgery reduces post-operative morbidity and health-care costs. Multicentre, parallel randomised controlled trial and within-trial cost-utility analysis from a UK NHS and Personal Social Services perspective. We could not blind health-care staff but tried to blind participants. Random allocations were generated by computer and minimised by centre and operation. Seventeen specialist cardiac surgery centres in UK NHS hospitals. Patients aged > 16 years undergoing non-emergency cardiac surgery with post-operative haemoglobin < 9 g/dl. Exclusion criteria were: unwilling to have transfusion owing to beliefs; platelet, red blood cell or clotting disorder; ongoing or recurrent sepsis; and critical limb ischaemia. Participants in the liberal group were eligible for transfusion immediately after randomisation (post-operative haemoglobin < 9 g/dl); participants in the restrictive group were eligible for transfusion if their post-operative haemoglobin fell to < 7.5 g/dl during the index hospital stay. The primary outcome was a composite outcome of any serious infectious (sepsis or wound infection) or ischaemic event (permanent stroke, myocardial infarction, gut infarction or acute kidney injury) during the 3 months after randomisation. Events were verified or adjudicated by blinded personnel. Secondary outcomes included blood products transfused; infectious events; ischaemic events; quality of life (European Quality of Life-5 Dimensions); duration of intensive care or high-dependency unit stay; duration of hospital stay; significant pulmonary morbidity; all-cause mortality; resource use, costs and cost-effectiveness. We randomised 2007 participants between 15 July 2009 and 18 February 2013; four withdrew, leaving 1000 and 1003 in the restrictive and liberal groups, respectively. Transfusion rates after randomisation were 53.4% (534/1000) and 92.2% (925/1003). The primary outcome occurred in 35.1% (331/944) and 33.0% (317/962) of participants in the restrictive and liberal groups [odds ratio (OR) 1.11, 95% confidence interval (CI) 0.91 to 1.34; p = 0.30], respectively. There were no subgroup effects for the primary outcome, although some sensitivity analyses substantially altered the estimated OR. There were no differences for secondary clinical outcomes except for mortality, with more deaths in the restrictive group (4.2%, 42/1000 vs. 2.6%, 26/1003; hazard ratio 1.64, 95% CI 1.00 to 2.67; p = 0.045). Serious post-operative complications excluding primary outcome events occurred in 35.7% (354/991) and 34.2% (339/991) of participants in the restrictive and liberal groups, respectively. The total cost per participant from surgery to 3 months postoperatively differed little by group, just £182 less (standard error £488) in the restrictive group, largely owing to the difference in red blood cells cost. In the base-case cost-effectiveness results, the point estimate suggested that the restrictive threshold was cost-effective; however, this result was very uncertain partly owing to the negligible difference in quality-adjusted life-years gained. A restrictive transfusion threshold is not superior to a liberal threshold after cardiac surgery. 
This finding supports restrictive transfusion due to reduced consumption and costs of red blood cells. However, secondary findings create uncertainty about recommending restrictive transfusion and prompt a new hypothesis that liberal transfusion may be superior after cardiac surgery. Reanalyses of existing trial datasets, excluding all participants who did not breach the liberal threshold, followed by a meta-analysis of the reanalysed results are the most obvious research steps to address the new hypothesis about the possible harm of red blood cell transfusion. Current Controlled Trials ISRCTN70923932. This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 20, No. 60. See the NIHR Journals Library website for further project information.
Pereira, L J; Foureaux, R C; Pereira, C V; Alves, M C; Campos, C H; Rodrigues Garcia, R C M; Andrade, E F; Gonçalves, T M S V
2016-07-01
The relationship between type 2 diabetes oral physiology, nutritional intake and quality of life has not been fully elucidated. We assessed the impact of type 2 diabetes - exclusive or associated with hypertension with beta-blockers treatment - on oral physiology, mastication, nutrition and quality of life. This cross-sectional study was performed with 78 complete dentate subjects (15 natural teeth and six masticatory units minimum; without removable or fixed prostheses), divided into three groups: diabetics (DM) (n = 20; 45·4 ± 9·5 years), diabetics with hypertension and receiving beta-blockers treatment (DMH) (n = 19; 41·1 ± 5·1 years) and controls (n = 39; 44·5 ± 11·7 years) matched for gender, age and socioeconomic status. Blood glucose, masticatory performance, swallowing threshold, taste, food intake, stimulated and unstimulated salivary flow, pH and buffering capacity of saliva were assessed. Glycemia was higher in DM than in controls (P < 0·01). No differences were observed between DM and controls for nutrition and quality of life. Both stimulated and unstimulated salivary flow rate were lower in DMH (P < 0·01), which also presented the lowest number of teeth and masticatory units (P < 0·0001), and reduction in the number of chewing cycles (P < 0·01). Controls showed lower Decayed Missing Filled Teeth index (DMFT) scores in comparison with DMH (P = 0·021). Masticatory performance and saliva buffering capacity were similar among groups. Exclusive type 2 diabetes did not alter oral physiology, nutrition or quality of life. However, when hypertension and beta-blockers treatment were associated with diabetes, the salivary flow rate, chewing cycles and number of teeth decreased. © 2016 John Wiley & Sons Ltd.
Astaraie-Imani, Maryam; Kapelan, Zoran; Fu, Guangtao; Butler, David
2012-12-15
Climate change and urbanisation are key factors affecting the future of water quality and quantity in urbanised catchments and are associated with significant uncertainty. The work reported in this paper is an evaluation of the combined and relative impacts of climate change and urbanisation on the receiving water quality in the context of an Integrated Urban Wastewater System (IUWS) in the UK. The impacts of intervening system operational control parameters are also investigated. Impact is determined by a detailed modelling study using both local and global sensitivity analysis methods together with correlation analysis. The results obtained from the case-study analysed clearly demonstrate that climate change combined with increasing urbanisation is likely to lead to worsening river water quality in terms of both frequency and magnitude of breaching threshold dissolved oxygen and ammonium concentrations. The results obtained also reveal the key climate change and urbanisation parameters that have the largest negative impact as well as the most responsive IUWS operational control parameters including major dependencies between all these parameters. This information can be further utilised to adapt future IUWS operation and/or design which, in turn, should make these systems more resilient to future climate and urbanisation changes. Copyright © 2012 Elsevier Ltd. All rights reserved.
Use of behavioral avoidance testing in natural resource damage assessment
Lipton, J.; Little, E.E.; Marr, J.C.A.; DeLonay, A.J.; Bengston, David A.; Henshel, Diane S.
1996-01-01
Natural Resource Damage Assessment (NRDA) provisions established under federal and state statutes enable natural resource trustees to recover compensation from responsible parties to restore injured natural resources. Behavioral avoidance testing with fish has been used in NRDAs to determine injuries to natural resources and to establish restoration thresholds. In this manuscript we evaluate the use of avoidance testing to NRDA. Specifically, we discuss potential “acceptance criteria” to evaluate the applicability and relevance of avoidance testing. These acceptance criteria include: (1) regulatory relevance, (2) reproducibility of testing, (3) ecological significance, (4) quality assurance/quality control, and (5) relevance to restoration. We discuss each of these criteria with respect to avoidance testing. Overall, we conclude that avoidance testing can be an appropriate, defensible, and desirable aspect of an NRDA.
Risk management in air protection in the Republic of Croatia.
Peternel, Renata; Toth, Ivan; Hercog, Predrag
2014-03-01
In the Republic of Croatia, according to the Air Protection Act, air pollution assessment is obligatory across the whole State territory. For individual regions and populated areas, a network has been established for permanent air quality monitoring. The State network consists of stations for measuring background pollution, regional and cross-border long-range transport and measurements carried out as part of international government obligations, stations for measuring air quality in areas of cultural and natural heritage, and stations for measuring air pollution in towns and industrial zones. Exceedances of alert and information threshold levels of air pollutants are related to emissions from industrial plants and to accidents, and each exceedance represents a threat to human health in the case of short-term exposure. Monitoring of alert and information threshold levels is carried out at stations of the State and local permanent air quality monitoring networks, according to the air quality measurement programs established for those networks. The State network has a fully developed automatic system for reporting on alert and information threshold levels, whereas many local networks under the competence of regional and local self-governments still lack any fully installed systems of this type. In case of accidents, prompt action at all levels of responsibility is necessary in order to prevent a crisis, and this requires developed and coordinated competent units of the State Administration as well as of self-government units. It is also necessary to work continuously on improving the implementation of legislative regulations in the field of crises related to critical and alert levels of air pollutants, especially at the local level.
Nakazato, Takeru; Bono, Hidemasa
2017-01-01
Abstract It is important for public data repositories to promote the reuse of archived data. In the growing field of omics science, however, the increasing number of submissions of high-throughput sequencing (HTSeq) data to public repositories prevents users from choosing a suitable data set from among the large number of search results. Repository users need to be able to set a threshold to reduce the number of results to obtain a suitable subset of high-quality data for reanalysis. We calculated the quality of sequencing data archived in a public data repository, the Sequence Read Archive (SRA), by using the quality control software FastQC. We obtained quality values for 1 171 313 experiments, which can be used to evaluate the suitability of data for reuse. We also visualized the data distribution in SRA by integrating the quality information and metadata of experiments and samples. We provide quality information of all of the archived sequencing data, which enable users to obtain sufficient quality sequencing data for reanalyses. The calculated quality data are available to the public in various formats. Our data also provide an example of enhancing the reuse of public data by adding metadata to published research data by a third party. PMID:28449062
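A minimal sketch of the kind of per-run quality summary being computed: the mean Phred score of each read in a FASTQ file and the fraction of reads clearing a user-chosen threshold. This is a generic FASTQ calculation (Phred+33 encoding assumed), not FastQC itself or the authors' pipeline, and the file name is hypothetical.

```python
def mean_phred(qual_line, offset=33):
    """Mean Phred quality of one read (Phred+33 encoding assumed)."""
    scores = [ord(ch) - offset for ch in qual_line.strip()]
    return sum(scores) / len(scores)

def fraction_above(fastq_path, threshold=30.0):
    """Fraction of reads whose mean Phred quality is at least `threshold`."""
    total = passed = 0
    with open(fastq_path) as fh:
        for i, line in enumerate(fh):
            if i % 4 == 3:                     # 4th line of each record holds qualities
                total += 1
                if mean_phred(line) >= threshold:
                    passed += 1
    return passed / total if total else 0.0

# Example (hypothetical file): fraction_above("SRR000001.fastq", threshold=30)
```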
NCEP Air Quality Forecast (AQF) Verification. NOAA/NWS/NCEP/EMC
[Interactive verification page: selectable plots include Day 1 and Day 2 AOD skill scores for all thresholds, time series for AOD > 0, and diurnal plots for AOD > 0, by forecast day and statistic type.]
Iwasaki, Satoshi; Usami, Shin-Ichi; Takahashi, Haruo; Kanda, Yukihiko; Tono, Tetsuya; Doi, Katsumi; Kumakawa, Kozo; Gyo, Kiyofumi; Naito, Yasushi; Kanzaki, Sho; Yamanaka, Noboru; Kaga, Kimitaka
2017-07-01
To report on the safety and efficacy of an investigational active middle ear implant (AMEI) in Japan, and to compare results to preoperative results with a hearing aid. Prospective study conducted in Japan in which 23 Japanese-speaking adults suffering from conductive or mixed hearing loss received a VIBRANT SOUNDBRIDGE with implantation at the round window. Postoperative thresholds, speech perception results (word recognition scores, speech reception thresholds, signal-to-noise ratio [SNR]), and quality of life questionnaires at 20 weeks were compared with preoperative results with all patients receiving the same, best available hearing aid (HA). Statistically significant improvements in postoperative AMEI-aided thresholds (1, 2, 4, and 8 kHz) and on the speech reception thresholds and word recognition scores tests, compared with preoperative HA-aided results, were observed. On the SNR, the subjects' mean values showed statistically significant improvement, with -5.7 dB SNR for the AMEI-aided mean and -2.1 dB SNR for the preoperative HA-assisted mean. The APHAB quality of life questionnaire also showed statistically significant improvement with the AMEI. Results with the AMEI applied to the round window exceeded those of the best available hearing aid in speech perception as well as quality of life questionnaires. There were minimal adverse events or changes to patients' residual hearing.
Dantan, Etienne; Foucher, Yohann; Lorent, Marine; Giral, Magali; Tessier, Philippe
2018-06-01
Defining thresholds of prognostic markers is essential for stratified medicine. Such thresholds are mostly estimated from purely statistical measures, regardless of patient preferences, potentially leading to unacceptable medical decisions. Quality-Adjusted Life-Years are a widely used preference-based measure of health outcomes. We develop a time-dependent Quality-Adjusted Life-Years-based expected utility function for censored data that should be maximized to estimate an optimal threshold. We performed a simulation study to compare thresholds estimated with the proposed expected utility approach and with purely statistical estimators. Two applications illustrate the usefulness of the proposed methodology, which is implemented in the R package ROCt ( www.divat.fr ). First, by reanalysing data from a randomized clinical trial comparing the efficacy of prednisone vs. placebo in patients with chronic liver cirrhosis, we demonstrate the utility of treating patients with a prothrombin level higher than 89%. Second, we reanalyze data from an observational cohort of kidney transplant recipients and conclude that the Kidney Transplant Failure Score is not useful for adapting the frequency of clinical visits. Applying such a patient-centered methodology may improve the future transfer of novel prognostic scoring systems or markers into clinical practice.
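A simplified, uncensored sketch of the underlying idea: for each candidate marker cutoff, average a QALY-valued utility over patients classified as high or low risk and keep the cutoff with the highest mean utility. The utility payoffs, data layout, and absence of censoring are illustrative assumptions; the published method works on censored survival data and is implemented in the R package ROCt, not in the code below.

```python
import numpy as np

def expected_utility_threshold(marker, event, utilities, n_grid=100):
    """Pick the marker cutoff maximizing mean utility.

    marker    : array of prognostic marker values (higher = higher predicted risk)
    event     : boolean array, True if the adverse event occurred
    utilities : dict with QALY-valued payoffs for 'tp', 'fp', 'tn', 'fn'
    """
    cuts = np.linspace(marker.min(), marker.max(), n_grid)
    best = (None, -np.inf)
    for c in cuts:
        high_risk = marker >= c
        u = np.where(high_risk & event, utilities["tp"],
            np.where(high_risk & ~event, utilities["fp"],
            np.where(~high_risk & event, utilities["fn"], utilities["tn"])))
        if u.mean() > best[1]:
            best = (c, u.mean())
    return best

rng = np.random.default_rng(0)
marker = rng.normal(size=500)
event = rng.random(500) < 1 / (1 + np.exp(-2 * marker))   # risk increases with marker
payoffs = {"tp": 8.0, "fp": 7.0, "tn": 9.0, "fn": 5.0}     # illustrative QALY payoffs
print(expected_utility_threshold(marker, event, payoffs))
```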
NASA Technical Reports Server (NTRS)
Burns, Bradley M. (Inventor); Blalock, Norman N. (Inventor)
2011-01-01
A short circuit protection system includes an inductor, a switch, a voltage sensing circuit, and a controller. The switch and inductor are electrically coupled to be in series with one another. A voltage sensing circuit is coupled across the switch and the inductor. A controller, coupled to the voltage sensing circuit and the switch, opens the switch when a voltage at the output terminal of the inductor transitions from above a threshold voltage to below the threshold voltage. The controller closes the switch when the voltage at the output terminal of the inductor transitions from below the threshold voltage to above the threshold voltage.
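A small sketch of the switching rule described in the abstract, restated as edge detection on successive voltage samples: open the switch on a falling crossing of the threshold, close it on a rising crossing. The sampling loop, names, and example values are illustrative; the patent describes analog circuitry, not software.

```python
def run_protection(voltage_samples, v_threshold):
    """Simulate the open/close decisions of the short-circuit protection controller."""
    switch_closed = True
    previous = voltage_samples[0]
    states = []
    for v in voltage_samples[1:]:
        if previous >= v_threshold > v:      # falling crossing: likely short circuit
            switch_closed = False
        elif previous < v_threshold <= v:    # rising crossing: fault cleared
            switch_closed = True
        states.append(switch_closed)
        previous = v
    return states

print(run_protection([5.0, 4.9, 1.2, 0.8, 3.4, 5.1], v_threshold=3.0))
```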
Throughput assurance of wireless body area networks coexistence based on stochastic geometry
Wang, Yinglong; Shu, Minglei; Wu, Shangbin
2017-01-01
Wireless body area networks (WBANs) are expected to influence the traditional medical model by assisting caretakers with health telemonitoring. Within WBANs, the transmit power of the nodes should be as small as possible owing to their limited energy capacity but should be sufficiently large to guarantee the quality of the signal at the receiving nodes. When multiple WBANs coexist in a small area, the communication reliability and overall throughput can be seriously affected due to resource competition and interference. We show that the total network throughput largely depends on the WBANs distribution density (λp), transmit power of their nodes (Pt), and their carrier-sensing threshold (γ). Using stochastic geometry, a joint carrier-sensing threshold and power control strategy is proposed to meet the demand of coexisting WBANs based on the IEEE 802.15.4 standard. Given different network distributions and carrier-sensing thresholds, the proposed strategy derives a minimum transmit power according to varying surrounding environment. We obtain expressions for transmission success probability and throughput adopting this strategy. Using numerical examples, we show that joint carrier-sensing thresholds and transmit power strategy can effectively improve the overall system throughput and reduce interference. Additionally, this paper studies the effects of a guard zone on the throughput using a Matern hard-core point process (HCPP) type II model. Theoretical analysis and simulation results show that the HCPP model can increase the success probability and throughput of networks. PMID:28141841
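For orientation, the kind of closed form such analyses build on is the classical success probability of a link of length $r$ in a Poisson field of interferers of density $\lambda$, with Rayleigh fading, path-loss exponent $\alpha$ and SIR threshold $\theta$. This is the textbook interference-limited baseline, not the paper's WBAN-specific expression, which additionally accounts for carrier sensing and the hard-core guard zone:

\[
p_s \;=\; \Pr[\mathrm{SIR} > \theta] \;=\; \exp\!\Big(-\lambda\,\pi r^{2}\,\theta^{2/\alpha}\,\Gamma\!\big(1+\tfrac{2}{\alpha}\big)\,\Gamma\!\big(1-\tfrac{2}{\alpha}\big)\Big).
\]

Raising the carrier-sensing threshold or the node density increases the effective interferer density and drives this probability down, which is the tradeoff the joint sensing/power strategy is balancing.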
Mazza, Marianna; Mandelli, Laura; Zaninotto, Leonardo; Nicola, Marco Di; Martinotti, Giovanni; Harnic, Desiree; Bruschi, Angelo; Catalano, Valeria; Tedeschi, Daniela; Colombo, Roberto; Bria, Pietro; Serretti, Alessandro; Janiri, Luigi
2011-12-01
Mixed mood states, even in their sub-threshold forms, may significantly affect the course and outcome of bipolar disorder (BD). To compare two samples of BD patients presenting a major depressive episode and a sub-threshold mixed state in terms of global functioning, clinical outcome, social adjustment and quality of life during a 1-year follow-up. The sample was composed by 90 subjects (Group 1, D) clinically diagnosed with a major depressive episode and 41 patients (Group 2, Mx) for a sub-threshold mixed state. All patients were administered with a pharmacological treatment and evaluated for depressive, anxious and manic symptoms by common rating scales. Further evaluations included a global assessment of severity and functioning, social adjustment and quality of life. All evaluations were performed at baseline and after 1, 3, 6 and 12 months of treatment. The two groups were no different for baseline as well as improvement in global severity and functioning. Though clearly different for symptoms severity, the amount of change of depressive and anxiety symptoms was also no different. Manic symptoms showed instead a trend to persist over time in group 2, whereas a slight increase of manic symptoms was observed in group 1, especially after 6 months of treatment. Moreover, in group 1, some manic symptoms were also detected at the Young Mania Rating Scale (n = 24, 26.6%). Finally, improvement in quality of life and social adjustment was similar in the two groups, though a small trend toward a faster improvement in social adjustment in group 1. Sub-threshold mixed states have a substantial impact on global functioning, social adjustment and subjective well-being, similarly to that of acute phases, or at least major depression. In particular, mixed features, even in their sub-threshold forms, tend to be persistent over time. Finally, manic symptoms may be still often underestimated in depressive episodes, even in patients for BD.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
...-AR32 Implementation of the 2008 National Ambient Air Quality Standards for Ozone: Nonattainment Area Classifications Approach, Attainment Deadlines and Revocation of the 1997 Ozone Standards for Transportation... proposing thresholds for classifying nonattainment areas for the 2008 ozone National Ambient Air Quality...
The case for regime-based water quality standards
G.C. Poole; J.B. Dunham; D.M. Keenan; S.T. Sauter; D.A. McCullough; C. Mebane; J.C. Lockwood; D.A. Essig; M.P. Hicks; D.J. Sturdevant; E.J. Materna; S.A. Spalding; J. Risley; M. Deppman
2004-01-01
Conventional water quality standards have been successful in reducing the concentration of toxic substances in US waters. However, conventional standards are based on simple thresholds and are therefore poorly structured to address human-caused imbalances in dynamic, natural water quality parameters, such as nutrients, sediment, and temperature. A more applicable type...
Flores, Shahida; Sun, Jie; King, Jonathan; Budowle, Bruce
2014-05-01
The GlobalFiler™ Express PCR Amplification Kit uses 6-dye fluorescent chemistry to enable multiplexing of 21 autosomal STRs, 1 Y-STR, 1 Y-indel and the sex-determining marker amelogenin. The kit is specifically designed for processing reference DNA samples in a high throughput manner. Validation studies were conducted to assess the performance and define the limitations of this direct amplification kit for typing blood and buccal reference DNA samples on various punchable collection media. Studies included thermal cycling sensitivity, reproducibility, precision, sensitivity of detection, minimum detection threshold, system contamination, stochastic threshold and concordance. Results showed that optimal amplification and injection parameters for a 1.2 mm punch from blood and buccal samples were 27 and 28 cycles, respectively, combined with a 12 s injection on an ABI 3500xL Genetic Analyzer. Minimum detection thresholds were set at 100 and 120 RFU for 27 and 28 cycles, respectively, and the data suggested that positive amplification controls provided a better threshold representation. Stochastic thresholds were set at 250 and 400 RFU for 27 and 28 cycles, respectively, as stochastic effects increased with cycle number. The minimum amount of input DNA resulting in a full profile was 0.5 ng; however, the optimum range determined was 2.5-10 ng. Profile quality from the GlobalFiler™ Express Kit and the previously validated AmpFlSTR(®) Identifiler(®) Direct Kit was comparable. The validation data support that reliable DNA typing results from reference DNA samples can be obtained using the GlobalFiler™ Express PCR Amplification Kit. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Transfusion thresholds and other strategies for guiding allogeneic red blood cell transfusion.
Hill, S R; Carless, P A; Henry, D A; Carson, J L; Hebert, P C; McClelland, D B; Henderson, K M
2002-01-01
Most clinical practice guidelines recommend restrictive red cell transfusion practices with the goal of minimising exposure to allogeneic blood (from an unrelated donor). The purpose of this review is to compare clinical outcomes in patients randomised to restrictive versus liberal transfusion thresholds (triggers). To examine the evidence on the effect of transfusion thresholds, on the use of allogeneic and/or autologous blood, and the evidence for any effect on clinical outcomes. Trials were identified by: computer searches of OVID Medline (1966 to December 2000), Current Contents (1993 to Week 48 2000), and the Cochrane Controlled Trials Register (2000 Issue 4). References in identified trials and review articles were checked and authors contacted to identify any additional studies. Controlled trials in which patients were randomised to an intervention group or to a control group. Trials were included where the intervention groups were assigned on the basis of a clear transfusion "trigger", described as a haemoglobin (Hb) or haematocrit (Hct) level below which a RBC transfusion was to be administered. Trial quality was assessed using criteria proposed by Schulz et al. (1995). Relative risks of requiring allogeneic blood transfusion, transfused blood volumes and other clinical outcomes were pooled across trials using a random effects model. Ten trials were identified that reported outcomes for a total of 1780 patients. Restrictive transfusion strategies reduced the risk of receiving a red blood cell (RBC) transfusion by a relative 42% (RR=0.58: 95%CI=0.47,0.71). This equates to an average absolute risk reduction (ARR) of 40% (95%CI=24% to 56%). The volume of RBCs transfused was reduced on average by 0.93 units (95%CI=0.36,1.5 units). However, heterogeneity between these trials was statistically significant (p<0.00001) for these outcomes. Mortality, rates of cardiac events, morbidity, and length of hospital stay were unaffected. Trials were of poor methodological quality. The limited published evidence supports the use of restrictive transfusion triggers in patients who are free of serious cardiac disease. However, most of the data on clinical outcomes were generated by a single trial. The effects of conservative transfusion triggers on functional status, morbidity and mortality, particularly in patients with cardiac disease, need to be tested in further large clinical trials. In countries with inadequate screening of donor blood the data may constitute a stronger basis for avoiding transfusion with allogeneic red cells.
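The pooling step described above (relative risks combined across trials with a random-effects model) can be illustrated with a short DerSimonian-Laird sketch. This is a generic illustration with made-up trial counts, not the review's actual analysis or data.

```python
import numpy as np

def pool_relative_risks(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooling of relative risks.
    events_*/n_* : per-trial event counts and group sizes (treatment, control)."""
    rr = (events_t / n_t) / (events_c / n_c)
    log_rr = np.log(rr)
    # Approximate variance of log RR per trial
    var = 1/events_t - 1/n_t + 1/events_c - 1/n_c
    w = 1 / var                                   # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)         # Cochran's Q (heterogeneity)
    df = len(rr) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (var + tau2)                       # random-effects weights
    pooled = np.sum(w_re * log_rr) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci

# Hypothetical trial data: events / patients in restrictive vs liberal arms
rr, ci = pool_relative_risks(np.array([10, 22, 15]), np.array([100, 150, 120]),
                             np.array([18, 30, 25]), np.array([100, 150, 118]))
print(rr, ci)
```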
Simple measurement-based admission control for DiffServ access networks
NASA Astrophysics Data System (ADS)
Lakkakorpi, Jani
2002-07-01
In order to provide good Quality of Service (QoS) in a Differentiated Services (DiffServ) network, a dynamic admission control scheme is definitely needed as an alternative to overprovisioning. In this paper, we present a simple measurement-based admission control (MBAC) mechanism for DiffServ-based access networks. Instead of using active measurements only or doing purely static bookkeeping with parameter-based admission control (PBAC), the admission control decisions are based on bandwidth reservations and periodically measured and exponentially averaged link loads. If any link load on the path between two endpoints is over the applicable threshold, access is denied. Link loads are periodically sent to the Bandwidth Broker (BB) of the routing domain, which makes the admission control decisions. The information needed in calculating the link loads is retrieved from the router statistics. The proposed admission control mechanism is verified through simulations. Our results show that it is possible to achieve very high bottleneck link utilization levels and still maintain good QoS.
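A minimal sketch of the two ingredients described above, exponential averaging of measured link loads and a per-link threshold check at admission time, is shown below. The class names, smoothing factor and 0.9 threshold are illustrative assumptions, not values from the paper.

```python
class LinkLoadEstimator:
    """Exponentially weighted moving average of measured link load,
    as used in measurement-based admission control (MBAC)."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.avg = 0.0

    def update(self, sample):          # sample: measured utilisation in [0, 1]
        self.avg = self.alpha * sample + (1 - self.alpha) * self.avg
        return self.avg

def admit(path_links, estimators, requested_bw, capacity, threshold=0.9):
    """Admit a new flow only if every link on the path stays below the
    load threshold after adding the requested bandwidth."""
    for link in path_links:
        projected = estimators[link].avg + requested_bw / capacity[link]
        if projected > threshold:
            return False               # any overloaded link => deny
    return True
```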
Adaptive mechanism-based congestion control for networked systems
NASA Astrophysics Data System (ADS)
Liu, Zhi; Zhang, Yun; Chen, C. L. Philip
2013-03-01
In order to assure the communication quality in network systems with heavy traffic and limited bandwidth, a new ATRED (adaptive thresholds random early detection) congestion control algorithm is proposed for the congestion avoidance and resource management of network systems. Unlike traditional AQM (active queue management) algorithms, the control parameters of ATRED are not configured statically but are dynamically adjusted by an adaptive mechanism. By integrating the adaptive strategy, ATRED alleviates the tuning difficulty of RED (random early detection), shows better control of queue management, and achieves more robust performance than RED under varying network conditions. Furthermore, a dynamic transmission control protocol-AQM control system using the ATRED controller is introduced for systematic analysis. It is proved that the stability of the network system can be guaranteed when the adaptive mechanism is properly designed. Simulation studies show that the proposed ATRED algorithm achieves good performance in varying network environments, superior to the RED and Gentle-RED algorithms, and provides more reliable service under varying network conditions.
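For context, the sketch below implements the classic RED drop-probability curve with its two queue-length thresholds exposed as parameters; an adaptive scheme in the spirit of ATRED would retune those parameters online. It is a simplified illustration, not the ATRED algorithm from the paper, and the default threshold values are assumed.

```python
import random

def red_drop_probability(avg_q, min_th, max_th, max_p):
    """Classic RED: drop probability grows linearly between the two
    queue-length thresholds; an adaptive variant would retune
    min_th, max_th and max_p from observed traffic."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)

def enqueue(avg_q, min_th=20, max_th=60, max_p=0.1):
    """Return True if the arriving packet is accepted, False if dropped."""
    return random.random() >= red_drop_probability(avg_q, min_th, max_th, max_p)

print(enqueue(avg_q=45))
```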
Ahrabian, D; Davies, M J; Khunti, K; Yates, T; Gray, A M
2017-01-01
Objectives Prevention of type 2 diabetes mellitus (T2DM) is a priority for healthcare systems. We estimated the cost-effectiveness compared with standard care of a structured education programme (Let's Prevent) targeting lifestyle and behaviour change to prevent progression to T2DM in people with prediabetes. Design Cost-effectiveness analysis alongside randomised controlled trial. Setting 44 general practices in Leicestershire, England. Participants 880 participants with prediabetes randomised to receive either standard care or a 6-hour group structured education programme with follow-up sessions in a primary care setting. Main outcome measure Incremental cost utility from the UK National Health Service (NHS) perspective. Quality of life and resource use measured from baseline and during the 36 months follow-up using the EuroQoL EQ-5D and 15D instruments and an economic questionnaire. Outcomes measured using quality-adjusted life years (QALYs) and healthcare costs calculated in 2012–2013 prices. Results After accounting for clustering and missing data, the intervention group was found to have a net gain of 0.046 (95% CI −0.0171 to 0.109) QALYs over 3 years, adjusted for baseline utility, at an additional cost of £168 (95% CI −395 to 732) per patient compared with the standard care group. The incremental cost-effectiveness ratio is £3643/QALY with an 86% probability of being cost-effective at a willingness to pay threshold of £20 000/QALY. Conclusions The education programme had higher costs and higher quality of life compared with the standard care group. The Let's Prevent programme is very likely to be cost-effective at a willingness to pay threshold of £20 000/QALY gained. Trial registration number ISRCTN80605705. PMID:28069625
Leal, J; Ahrabian, D; Davies, M J; Gray, L J; Khunti, K; Yates, T; Gray, A M
2017-01-09
Prevention of type 2 diabetes mellitus (T2DM) is a priority for healthcare systems. We estimated the cost-effectiveness compared with standard care of a structured education programme (Let's Prevent) targeting lifestyle and behaviour change to prevent progression to T2DM in people with prediabetes. Cost-effectiveness analysis alongside randomised controlled trial. 44 general practices in Leicestershire, England. 880 participants with prediabetes randomised to receive either standard care or a 6-hour group structured education programme with follow-up sessions in a primary care setting. Incremental cost utility from the UK National Health Service (NHS) perspective. Quality of life and resource use measured from baseline and during the 36 months follow-up using the EuroQoL EQ-5D and 15D instruments and an economic questionnaire. Outcomes measured using quality-adjusted life years (QALYs) and healthcare costs calculated in 2012-2013 prices. After accounting for clustering and missing data, the intervention group was found to have a net gain of 0.046 (95% CI -0.0171 to 0.109) QALYs over 3 years, adjusted for baseline utility, at an additional cost of £168 (95% CI -395 to 732) per patient compared with the standard care group. The incremental cost-effectiveness ratio is £3643/QALY with an 86% probability of being cost-effective at a willingness to pay threshold of £20 000/QALY. The education programme had higher costs and higher quality of life compared with the standard care group. The Let's Prevent programme is very likely to be cost-effective at a willingness to pay threshold of £20 000/QALY gained. ISRCTN80605705. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
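The arithmetic behind the headline figures in the two records above can be reproduced approximately from the reported increments; the published ICER of £3643/QALY comes from the fitted, adjusted model, so the naive ratio below is only an approximation.

```python
delta_cost = 168.0      # incremental cost per patient (GBP)
delta_qaly = 0.046      # incremental QALYs over 3 years

icer = delta_cost / delta_qaly          # ~GBP 3,650/QALY; the paper reports
print(round(icer))                      # GBP 3,643 from the adjusted model

wtp = 20_000            # willingness-to-pay threshold (GBP per QALY)
nmb = wtp * delta_qaly - delta_cost     # net monetary benefit per patient
print(round(nmb))       # positive NMB => cost-effective at this threshold
```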
Determination of quality factors by microdosimetry
NASA Astrophysics Data System (ADS)
Al-Affan, I. A. M.; Watt, D. E.
1987-03-01
The application of microdose parameters for the specification of a revised scale of quality factors which would be applicable at low doses and dose rates is examined in terms of an original proposal by Rossi. Two important modifications are suggested to enable an absolute scale of quality factors to be constructed. Allowance should be made for the dependence of the saturation threshold of lineal energy on the type of heavy charged particle. Also, an artificial saturation threshold should be introduced for electron tracks as a means of modifying the measurements made in the microdosimeter to the more realistic site sizes of nanometer dimensions. The proposed absolute scale of quality factors nicely encompasses the high RBEs of around 3 observed at low doses for tritium β rays and is consistent with the recent recommendation of the ICRP that the quality factor for fast neutrons be increased by a factor of two, assuming that there is no biological repair for the reference radiation.
Codimension-1 Sliding Bifurcations of a Filippov Pest Growth Model with Threshold Policy
NASA Astrophysics Data System (ADS)
Tang, Sanyi; Tang, Guangyao; Qin, Wenjie
A Filippov system is proposed to describe the stage structured nonsmooth pest growth with threshold policy control (TPC). The TPC measure is represented by the total density of both juveniles and adults being chosen as an index for decisions on when to implement chemical control strategies. The proposed Filippov system can have three pieces of sliding segments and three pseudo-equilibria, which result in rich sliding mode bifurcations and local sliding bifurcations including boundary node (boundary focus, or boundary saddle) and tangency bifurcations. As the threshold density varies, the model exhibits interesting global sliding bifurcations sequentially: touching → buckling → crossing → sliding homoclinic orbit to a pseudo-saddle → crossing → touching bifurcations. In particular, bifurcation of a homoclinic orbit to a pseudo-saddle with a figure-of-eight shape, to a pseudo-saddle-node or to a standard saddle-node has been observed for some parameter sets. This implies that control outcomes are sensitive to the threshold level, and hence it is crucial to choose the threshold level at which to initiate the control strategy. One more sliding segment (or pseudo-equilibrium) is induced by the switching policy guided by the total population density, compared to the policy guided only by the juvenile density, implying that this control policy is more effective in terms of preventing multiple pest outbreaks or causing the density of pests to stabilize at a desired level such as an economic threshold.
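A threshold policy of this kind can be sketched as an ODE system whose right-hand side switches when the total density crosses an economic threshold. The toy model below is only illustrative: the equations, parameter values and the naive time-stepping (which ignores proper sliding-mode treatment on the switching surface) are assumptions, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

ET = 6.0   # economic threshold on total density J + A (assumed value)

def pest(t, y):
    J, A = y                          # juvenile and adult densities
    r, m, d1, d2 = 2.0, 0.8, 0.3, 0.4  # illustrative rates
    dJ = r * A - m * J - d1 * J
    dA = m * J - d2 * A
    if J + A > ET:                    # threshold policy: spray only above ET
        dJ -= 0.6 * J
        dA -= 0.6 * A
    return [dJ, dA]

sol = solve_ivp(pest, (0, 50), [1.0, 1.0], max_step=0.01)
print(sol.y[:, -1])                   # state after 50 time units
```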
The Threshold of Toxicologic Concern (TTC) is an approach that has been used for decades in human hazard assessment. A TTC establishes an exposure level for a chemical below which no appreciable risk to human health is expected, based upon a de minimis value for toxicity identified for many ...
ERIC Educational Resources Information Center
Shauki, Elvia R.; Benzie, Helen
2017-01-01
This study explores the development of student self-management skills through an oral presentation task. It is motivated by the challenge to maintain consistent quality in students' oral skills and to incorporate national accounting curriculum requirements for threshold learning standards into an accounting subject. The study has been conducted in…
Extended cooperative control synthesis
NASA Technical Reports Server (NTRS)
Davidson, John B.; Schmidt, David K.
1994-01-01
This paper reports on research for extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a modeling of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM into the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed by using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable, predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.
Chica-Olmo, Mario; Luque-Espinar, Juan Antonio; Rodriguez-Galiano, Victor; Pardo-Igúzquiza, Eulogio; Chica-Rivas, Lucía
2014-02-01
Groundwater nitrate pollution associated with agricultural activity is an important environmental problem in the management of this natural resource, as acknowledged by the European Water Framework Directive. Therefore, specific measures aimed to control the risk of water pollution by nitrates must be implemented to minimise its impact on the environment and potential risk to human health. The spatial probability distribution of nitrate contents exceeding a threshold or limit value, established within the quality standard, will be helpful to managers and decision-makers. A methodology based on non-parametric and non-linear methods of Indicator Kriging was used in the elaboration of a nitrate pollution categorical map for the aquifer of Vega de Granada (SE Spain). The map has been obtained from the local estimation of the probability that a nitrate content in an unsampled location belongs to one of the three categories established by the European Water Framework Directive: CL. 1 good quality [Min - 37.5 ppm], CL. 2 intermediate quality [37.5-50 ppm] and CL. 3 poor quality [50 ppm - Max]. The obtained results show that the areas exceeding nitrate concentrations of 50 ppm, poor quality waters, occupy more than 50% of the aquifer area. A great proportion of the area's municipalities are located in these poor quality water areas. The intermediate quality and good quality areas correspond to 21% and 28%, respectively, but with the highest population density. These results are coherent with the experimental data, which show an average nitrate concentration value of 72 ppm, significantly higher than the quality standard limit of 50 ppm. Consequently, the results suggest the importance of planning actions in order to control and monitor aquifer nitrate pollution. © 2013.
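The Indicator Kriging workflow described above starts by coding each nitrate measurement against the class limits before the indicators are kriged. The sketch below shows only that indicator-coding step; variogram modelling and the kriging itself are omitted, and the sample values are hypothetical.

```python
import numpy as np

thresholds = [37.5, 50.0]        # class limits used in the study (ppm nitrate)

def indicator_transform(nitrate_ppm, thresholds):
    """Indicator coding used as input to Indicator Kriging:
    I(x; z_k) = 1 if the measured value does not exceed threshold z_k."""
    v = np.asarray(nitrate_ppm, dtype=float)
    return np.column_stack([(v <= z).astype(int) for z in thresholds])

samples = [12.0, 41.0, 66.0, 80.0]   # hypothetical well measurements
print(indicator_transform(samples, thresholds))
# Kriging each indicator column then yields, at any unsampled location,
# the estimated probability of belonging to CL.1, CL.2 or CL.3.
```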
Contrast gain control in first- and second-order motion perception.
Lu, Z L; Sperling, G
1996-12-01
A novel pedestal-plus-test paradigm is used to determine the nonlinear gain-control properties of the first-order (luminance) and the second-order (texture-contrast) motion systems, that is, how these systems' responses to motion stimuli are reduced by pedestals and other masking stimuli. Motion-direction thresholds were measured for test stimuli consisting of drifting luminance and texture-contrast-modulation stimuli superimposed on pedestals of various amplitudes. (A pedestal is a static sine-wave grating of the same type and same spatial frequency as the moving test grating.) It was found that first-order motion-direction thresholds are unaffected by small pedestals, but at pedestal contrasts above 1-2% (5-10 x pedestal threshold), motion thresholds increase proportionally to pedestal amplitude (a Weber law). For first-order stimuli, pedestal masking is specific to the spatial frequency of the test. On the other hand, motion-direction thresholds for texture-contrast stimuli are independent of pedestal amplitude (no gain control whatever) throughout the accessible pedestal amplitude range (from 0 to 40%). However, when baseline carrier contrast increases (with constant pedestal modulation amplitude), motion thresholds increase, showing that gain control in second-order motion is determined not by the modulator (as in first-order motion) but by the carrier. Note that baseline contrast of the carrier is inherently independent of spatial frequency of the modulator. The drastically different gain-control properties of the two motion systems and prior observations of motion masking and motion saturation are all encompassed in a functional theory. The stimulus inputs to both first- and second-order motion process are normalized by feedforward, shunting gain control. The different properties arise because the modulator is used to control the first-order gain and the carrier is used to control the second-order gain.
Castell, Stefanie; Schwab, Frank; Geffers, Christine; Bongartz, Hannah; Brunkhorst, Frank M.; Gastmeier, Petra; Mikolajczyk, Rafael T.
2014-01-01
Early and appropriate blood culture sampling is recommended as a standard of care for patients with suspected bloodstream infections (BSI) but is rarely taken into account when quality indicators for BSI are evaluated. To date, sampling of about 100 to 200 blood culture sets per 1,000 patient-days is recommended as the target range for blood culture rates. However, the empirical basis of this recommendation is not clear. The aim of the current study was to analyze the association between blood culture rates and observed BSI rates and to derive a reference threshold for blood culture rates in intensive care units (ICUs). This study is based on data from 223 ICUs taking part in the German hospital infection surveillance system. We applied locally weighted regression and segmented Poisson regression to assess the association between blood culture rates and BSI rates. Below 80 to 90 blood culture sets per 1,000 patient-days, observed BSI rates increased with increasing blood culture rates, while there was no further increase above this threshold. Segmented Poisson regression located the threshold at 87 (95% confidence interval, 54 to 120) blood culture sets per 1,000 patient-days. Only one-third of the investigated ICUs displayed blood culture rates above this threshold. We provided empirical justification for a blood culture target threshold in ICUs. In the majority of the studied ICUs, blood culture sampling rates were below this threshold. This suggests that a substantial fraction of BSI cases might remain undetected; reporting observed BSI rates as a quality indicator without sufficiently high blood culture rates might be misleading. PMID:25520442
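The segmented Poisson regression used above to locate the threshold can be approximated by profiling a piecewise-linear Poisson GLM over candidate breakpoints. The sketch below uses statsmodels and hypothetical inputs; it is a generic illustration, not the surveillance system's actual model specification.

```python
import numpy as np
import statsmodels.api as sm

def fit_segmented_poisson(bc_rate, bsi_count, patient_days, breakpoints):
    """Profile over candidate breakpoints: for each candidate, fit a Poisson
    GLM with a slope change above the breakpoint and keep the best fit."""
    best = None
    for bp in breakpoints:
        X = np.column_stack([bc_rate, np.maximum(bc_rate - bp, 0.0)])
        X = sm.add_constant(X)
        model = sm.GLM(bsi_count, X, family=sm.families.Poisson(),
                       offset=np.log(patient_days))
        res = model.fit()
        if best is None or res.deviance < best[1]:
            best = (bp, res.deviance, res)
    return best

# Hypothetical ICU data: blood culture sets per 1,000 patient-days, BSI counts
bc = np.array([40.0, 60.0, 85.0, 95.0, 110.0, 150.0])
bsi = np.array([3, 5, 6, 7, 7, 8])
pdays = np.array([8000.0, 9000.0, 7000.0, 8500.0, 9500.0, 10000.0])
print(fit_segmented_poisson(bc, bsi, pdays, breakpoints=np.arange(50, 131, 10))[0])
```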
Four GABAergic interneurons impose feeding restraint in Drosophila
Pool, Allan-Hermann; Kvello, Pal; Mann, Kevin; Cheung, Samantha K.; Gordon, Michael D.; Wang, Liming; Scott, Kristin
2014-01-01
Summary Feeding is dynamically regulated by the palatability of the food source and the physiological needs of the animal. How consumption is controlled by external sensory cues and internal metabolic state remains under intense investigation. Here, we identify four GABAergic interneurons in the Drosophila brain that establish a central feeding threshold which is required to inhibit consumption. Inactivation of these cells results in indiscriminate and excessive intake of all compounds, independent of taste quality or nutritional state. Conversely, acute activation of these neurons suppresses consumption of water and nutrients. The output from these neurons is required to gate activity in motor neurons that control meal initiation and consumption. Thus, our study reveals a new layer of inhibitory control in feeding circuits that is required to suppress a latent state of unrestricted and non-selective consumption. PMID:24991960
Redox processes and water quality of selected principal aquifer systems
McMahon, P.B.; Chapelle, F.H.
2008-01-01
Reduction/oxidation (redox) conditions in 15 principal aquifer (PA) systems of the United States, and their impact on several water quality issues, were assessed from a large data base collected by the National Water-Quality Assessment Program of the USGS. The logic of these assessments was based on the observed ecological succession of electron acceptors such as dissolved oxygen, nitrate, and sulfate and threshold concentrations of these substrates needed to support active microbial metabolism. Similarly, the utilization of solid-phase electron acceptors such as Mn(IV) and Fe(III) is indicated by the production of dissolved manganese and iron. An internally consistent set of threshold concentration criteria was developed and applied to a large data set of 1692 water samples from the PAs to assess ambient redox conditions. The indicated redox conditions then were related to the occurrence of selected natural (arsenic) and anthropogenic (nitrate and volatile organic compounds) contaminants in ground water. For the natural and anthropogenic contaminants assessed in this study, considering redox conditions as defined by this framework of redox indicator species and threshold concentrations explained many water quality trends observed at a regional scale. An important finding of this study was that samples indicating mixed redox processes provide information on redox heterogeneity that is useful for assessing common water quality issues. Given the interpretive power of the redox framework and given that it is relatively inexpensive and easy to measure the chemical parameters included in the framework, those parameters should be included in routine water quality monitoring programs whenever possible.
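Applying a framework of this kind amounts to checking each sample's electron-acceptor and redox-product concentrations against threshold criteria. The sketch below is purely illustrative; the threshold values and category labels are placeholders, not the exact criteria used in the published assessment.

```python
def classify_redox(o2, no3_n, mn, fe, so4):
    """Assign a dominant redox category from threshold criteria (mg/L).
    Threshold values here are illustrative placeholders only."""
    if o2 >= 0.5:
        return "oxic (O2 reduction)"
    if no3_n >= 0.5 and mn < 0.05 and fe < 0.1:
        return "suboxic (NO3 reduction)"
    if fe >= 0.1 or mn >= 0.05:
        return "anoxic (Mn/Fe reduction)"
    if so4 >= 0.5:
        return "anoxic (SO4 reduction)"
    return "anoxic (methanogenic)"

# Hypothetical sample: low oxygen, nitrate present, little Mn/Fe
print(classify_redox(o2=0.2, no3_n=1.4, mn=0.01, fe=0.02, so4=20.0))
```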
Twofold processing for denoising ultrasound medical images.
Kishore, P V V; Kumar, K V V; Kumar, D Anil; Prasad, M V D; Goutham, E N D; Rahul, R; Krishna, C B S Vamsi; Sandeep, Y
2015-01-01
Ultrasound medical (US) imaging non-invasively pictures the inside of a human body for disease diagnostics. Speckle noise attacks ultrasound images, degrading their visual quality. A twofold processing algorithm is proposed in this work to reduce this multiplicative speckle noise. The first fold uses block-based thresholding, both hard (BHT) and soft (BST), on pixels in the wavelet domain with 8, 16, 32 and 64 non-overlapping block sizes. This first-fold process reduces speckle effectively but also induces blurring of the object of interest. The second-fold process then restores object boundaries and texture with adaptive wavelet fusion. Restoration of the degraded object in the block-thresholded US image is carried out through wavelet coefficient fusion of the object in the original US image and the block-thresholded US image. Fusion rules and wavelet decomposition levels are made adaptive for each block using gradient histograms with normalized differential mean (NDF) to introduce the highest level of contrast between the denoised pixels and the object pixels in the resultant image. Thus the proposed twofold methods are named adaptive NDF block fusion with hard and soft thresholding (ANBF-HT and ANBF-ST). The results indicate visual quality improvement to an interesting level with the proposed twofold processing, where the first fold removes noise and the second fold restores object properties. Peak signal to noise ratio (PSNR), normalized cross correlation coefficient (NCC), edge strength (ES), image quality index (IQI) and structural similarity index (SSIM) measure the quantitative quality of the twofold processing technique. Validation of the proposed method is done by comparing with anisotropic diffusion (AD), total variational filtering (TVF) and empirical mode decomposition (EMD) for enhancement of US images. The US images are provided by AMMA hospital radiology labs at Vijayawada, India.
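A minimal sketch of the first-fold stage, block-wise thresholding of wavelet detail coefficients, is given below using PyWavelets. The per-block universal threshold and the block handling are simplifications of the paper's scheme, and the adaptive fusion second fold is not shown.

```python
import numpy as np
import pywt

def blockwise_threshold(img, wavelet="db4", level=2, block=16, mode="soft"):
    """First-fold stage only: block-based wavelet-domain thresholding.
    Each detail subband is split into blocks and thresholded with a
    per-block universal threshold (a simplification of the published scheme)."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    out = [coeffs[0]]                          # keep approximation band as-is
    for details in coeffs[1:]:
        new = []
        for band in details:
            b = band.copy()
            for i in range(0, b.shape[0], block):
                for j in range(0, b.shape[1], block):
                    blk = b[i:i+block, j:j+block]
                    sigma = np.median(np.abs(blk)) / 0.6745 + 1e-12
                    thr = sigma * np.sqrt(2 * np.log(blk.size))
                    b[i:i+block, j:j+block] = pywt.threshold(blk, thr, mode=mode)
            new.append(b)
        out.append(tuple(new))
    return pywt.waverec2(out, wavelet)

# Toy usage on a synthetic 128x128 image
denoised = blockwise_threshold(np.random.default_rng(0).normal(size=(128, 128)))
```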
NASA Technical Reports Server (NTRS)
Alston, Erica J.; Sokolik, Irina, N.; Doddridge, Bruce G.
2011-01-01
Poor air quality episodes occur often in metropolitan Atlanta, Georgia. The primary focus of this research is to assess the capability of satellites as a tool in characterizing air quality in Atlanta. Results indicate that intra-city PM2.5 concentrations show similar patterns as other U.S. urban areas, with the highest concentrations occurring within the city. Both PM2.5 and MODIS AOD show larger increases in the summer than in spring, yet MODIS AOD doubles in the summer unlike PM2.5. A majority of OMI AI values are below 0.5. Using this value as an ambient measure of carbonaceous aerosols in the urban area, aerosol transport events can be identified. Our results indicate that MODIS AOD is well correlated with PM2.5 on a yearly and seasonal basis, with correlation coefficients as high as 0.8 for Terra and 0.7 for Aqua. A possible alternative view of the PM2.5 and AOD relationship is seen through the use of AOD thresholds. These probabilistic thresholds provide a means to describe the AQI through the use of past AOD for a specific area. We use the NAAQS to classify the AOD into different AQI codes, and probabilistically determine thresholds of AOD that represent the majority of a specific AQI category. For example, the majority (80%) of moderate AQI days have AOD values between 0.5 and 0.6. The development of thresholds could be a tool used to evaluate air quality from satellites in regions where there are sparse ground-based measurements of PM2.5.
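Once such AOD thresholds have been derived, applying them is a simple lookup. In the toy mapping below, only the 0.5-0.6 "moderate" range comes from the abstract; the other cut points and labels are hypothetical placeholders.

```python
# AOD ranges per AQI category; only the "moderate" 0.5-0.6 range is taken
# from the abstract, the others are hypothetical placeholders.
aod_thresholds = [
    (0.0, 0.5, "good"),
    (0.5, 0.6, "moderate"),
    (0.6, 0.8, "unhealthy for sensitive groups"),
    (0.8, float("inf"), "unhealthy or worse"),
]

def aqi_category_from_aod(aod):
    for lo, hi, label in aod_thresholds:
        if lo <= aod < hi:
            return label
    return "unknown"

print(aqi_category_from_aod(0.55))   # -> "moderate"
```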
Adibe, Maxwell O; Aguwa, Cletus N; Ukwe, Chinwe V
To assess the cost-effectiveness of pharmaceutical care (PC) intervention versus usual care (UC) in the management of type 2 diabetes. This study was a randomized, controlled study with a 12-month patient follow-up in two Nigerian tertiary hospitals. One hundred and ten patients were randomly assigned to each of the "intervention" (PC) and the "control" (UC) groups. Patients in the UC group received the usual/conventional care offered by the hospitals. Patients in the PC group received UC and PC in the form of structured self-care education and training for 12 months. The economic evaluation was based on the patients' perspective. Costs of management of individual complications were calculated from the activities involved in their management by using activity-based costing. The impact of the interventions on quality of life was estimated by using the HUI23S4EN.40Q (Mark index 3) questionnaire. The primary outcomes were the incremental cost-utility ratio and net monetary benefit. An intention-to-treat approach was used. Two-sample comparisons were made by using Student's t tests for normally distributed variables at baseline, 6 months, and 12 months. Comparisons of proportions were done by using the chi-square test. The PC intervention led to an incremental cost and effect of Nigerian naira (NGN) 10,623 ($69) and 0.12 quality-adjusted life-year (QALY) gained, respectively, with an associated incremental cost-utility ratio of NGN 88,525 ($571) per QALY gained. In the cost-effectiveness acceptability curve, the probability that PC was more cost-effective than UC was 95% at the NGN 250,000 ($1613) per QALY gained threshold and 52% at the NGN 88,600 ($572) per QALY gained threshold. The PC intervention was very cost-effective among patients with type 2 diabetes at the NGN 88,525 ($571.13) per QALY gained threshold, although considerable uncertainty surrounds these estimates. Copyright © 2013, International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc.
Holographic Associative Memory System Using A Thresholding Microchannel Spatial Light Modulator
NASA Astrophysics Data System (ADS)
Song, Q. W.; Yu, Francis T.
1989-05-01
Experimental implementation of a holographic optical associative memory system using a thresholding microchannel spatial light modulator (MSLM) is presented. The first part of the system is basically a joint transform correlator, in which a liquid crystal light valve is used as a square-law converter for the inner product of the addressing and input memories. The MSLM is used as an active element to recall the associated data. If the device is properly thresholded, the system is capable of improving the quality of the output image.
Chen, Yue; Ekstrom, Tor
2016-05-01
Face perception impairment in schizophrenia has been demonstrated, mostly through experimental studies. How this laboratory-defined behavioral impairment is associated with patients' perceptual experience of various faces in everyday life is however unclear. This question is important because a first-person account of face perception has direct consequences on social functioning of patients. In this study, we adapted and administered a self-reported questionnaire on narrative perceptual experience of faces along with psychophysical assessments of face perception in schizophrenia. The self-reported questionnaire includes six rating items of face-related functioning in everyday life, providing a subjective measure of face perception. The psychophysical assessment determines perceptual threshold for discriminating different facial identities, providing an objective measure of face perception. Compared to controls (n = 25), patients (n = 35) showed significantly lower scores (worse performance) in the subjective assessment and significantly higher thresholds (worse performance) in the objective assessment. The subjective and objective face perception assessments were moderately correlated in controls but not in patients. The subjective face perception assessments were significantly correlated with measurements of a social cognitive ability (Theory of Mind), again in controls but not in patients. These results suggest that in schizophrenia the quality of face-related functioning in everyday life is degraded and the role that basic face discrimination capacity plays in face-related everyday functioning is disrupted. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chen, Yue; Ekstrom, Tor
2016-01-01
Objectives Face perception impairment in schizophrenia has been demonstrated, mostly through experimental studies. How this laboratory-defined behavioral impairment is associated with patients’ perceptual experience of various faces in everyday life is however unclear. This question is important because a first-person account of face perception has direct consequences on social functioning of patients. In this study, we adapted and administered a self-reported questionnaire on narrative perceptual experience of faces along with psychophysical assessments of face perception in schizophrenia. Methods The self-reported questionnaire includes six rating items of face-related functioning in everyday life, providing a subjective measure of face perception. The psychophysical assessment determines perceptual threshold for discriminating different facial identities, providing an objective measure of face perception. Results Compared to controls (n=25), patients (n=35) showed significantly lower scores (worse performance) in the subjective assessment and significantly higher thresholds (worse performance) in the objective assessment. The subjective and objective face perception assessments were moderately correlated in controls but not in patients. The subjective face perception assessments were significantly correlated with measurements of a social cognitive ability (Theory of Mind), again in controls but not in patients. Conclusion These results suggest that in schizophrenia the quality of face-related functioning in everyday life is degraded and the role that basic face discrimination capacity plays in face-related everyday functioning is disrupted. PMID:26938027
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Government contract quality assurance for acquisitions at or below the simplified acquisition threshold. 246.404 Section 246.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT QUALITY ASSURANCE...
Helås, T; Sagafos, D; Kleggetveit, I P; Quiding, H; Jönsson, B; Segerdahl, M; Zhang, Z; Salter, H; Schmelz, M; Jørum, E
2017-09-01
Nociceptive thresholds and supra-threshold pain ratings as well as their reduction upon local injection with lidocaine were compared between healthy subjects and patients with erythromelalgia (EM). Lidocaine (0.25, 0.50, 1.0 or 10 mg/mL) or placebo (saline) was injected intradermally in non-painful areas of the lower arm, in a randomized, double-blind manner, to test the effect on dynamic and static mechanical sensitivity, mechanical pain sensitivity, thermal thresholds and supra-threshold heat pain sensitivity. Heat pain thresholds and pain ratings to supra-threshold heat stimulation did not differ between EM-patients (n = 27) and controls (n = 25), neither did the dose-response curves for lidocaine. Only the subgroup of EM-patients with mutations in sodium channel subunits NaV1.7, 1.8 or 1.9 (n = 8) had increased lidocaine sensitivity for supra-threshold heat stimuli, contrasting with lower sensitivity to strong mechanical stimuli. This pattern was particularly clear in the two patients carrying the NaV1.7 I848T mutation, in whom lidocaine's hyperalgesic effect on mechanical pain sensitivity contrasted with more effective heat analgesia. Heat pain thresholds are not sensitized in EM patients, even in those with gain-of-function mutations in NaV1.7. Differential lidocaine sensitivity was overt only for noxious stimuli in the supra-threshold range, suggesting that sensitized supra-threshold encoding is important for the clinical pain phenotype in EM in addition to a lower activation threshold. Intracutaneous lidocaine dose-dependently blocked nociceptive sensations, but we did not identify EM patients with particularly high lidocaine sensitivity that could have provided valuable therapeutic guidance. Acute pain thresholds and supra-threshold heat pain in controls and patients with erythromelalgia do not differ and have the same lidocaine sensitivity. Acute heat pain thresholds even in EM patients with the NaV1.7 I848T mutation are normal and only nociceptor sensitivity to intradermal lidocaine is changed. Only in EM patients with mutations in NaV1.7, 1.8 or 1.9 does supra-threshold heat and mechanical pain show differential lidocaine sensitivity as compared to controls. © 2017 European Pain Federation - EFIC®.
Harris, L K; Whay, H R; Murrell, J C
2018-04-01
This study investigated the effects of osteoarthritis (OA) on somatosensory processing in dogs using mechanical threshold testing. A pressure algometer was used to measure mechanical thresholds in 27 dogs with presumed hind limb osteoarthritis and 28 healthy dogs. Mechanical thresholds were measured at the stifles, radii and sternum, and were correlated with scores from an owner questionnaire and a clinical checklist, a scoring system that quantified clinical signs of osteoarthritis. The effects of age and bodyweight on mechanical thresholds were also investigated. Multiple regression models indicated that, when bodyweight was taken into account, dogs with presumed osteoarthritis had lower mechanical thresholds at the stifles than control dogs, but not at other sites. Non-parametric correlations showed that clinical checklist scores and questionnaire scores were negatively correlated with mechanical thresholds at the stifles. The results suggest that mechanical threshold testing using a pressure algometer can detect primary, and possibly secondary, hyperalgesia in dogs with presumed osteoarthritis. This suggests that the mechanical threshold testing protocol used in this study might facilitate assessment of somatosensory changes associated with disease progression or response to treatment. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Hinsby, Klaus; Markager, Stiig; Kronvang, Brian; Windolf, Jørgen; Sonnenborg, Torben; Sørensen, Lærke
2015-04-01
Nitrate, which typically makes up the major part (~>90%) of dissolved inorganic nitrogen in groundwater and surface water, is the most frequent pollutant responsible for European groundwater bodies failing to meet the good status objectives of the European Water Framework Directive, generally when comparing groundwater monitoring data with the nitrate quality standard of the Groundwater Directive (50 mg/l = the WHO drinking water standard). Still, while more than 50 % of the European surface water bodies do not meet the objective of good ecological status, "only" 25 % of groundwater bodies do not meet the objective of good chemical status according to the river basin management plans reported by the EU member states. However, based on a study on interactions between groundwater, streams and a Danish estuary we argue that nitrate threshold values for aerobic groundwater often need to be significantly below the nitrate quality standard to ensure good ecological status of associated surface water bodies, and hence that the chemical status of European groundwater is worse than indicated by the present assessments. Here we suggest a methodology for derivation of groundwater and stream threshold values for total nitrogen ("nitrate") in a coastal catchment based on assessment of maximum acceptable nitrogen loadings (thresholds) to the associated vulnerable estuary. The applied method uses existing information on agricultural practices and point source emissions in the catchment, together with groundwater and stream quantity and quality monitoring data, which all feed an integrated groundwater and surface water modelling tool, enabling us to assess total nitrogen loads and derive threshold concentrations to ensure/restore good ecological status of the investigated estuary. For the catchment to the Horsens estuary in Denmark we estimate the stream and groundwater thresholds for total nitrogen to be about 13 and 27 mg/l (~ 12 and 25 mg/l of nitrate). The shown example of deriving nitrogen threshold concentrations is for groundwater and streams in a coastal catchment discharging to a vulnerable estuary in Denmark, but the principles may be applied to large river basins with sub-catchments in several countries such as e.g. the Danube or the Rhine. In this case the relevant countries need to collaborate on derivation of nitrogen thresholds based on e.g. maximum acceptable nitrogen loadings to the Black Sea / the North Sea, and finally agree on thresholds for different parts of the river basin. Phosphorus is another nutrient which frequently results in or contributes to the eutrophication of surface waters. The transport and retention processes of total phosphorus (TP) are more complex than for nitrate (or alternatively total N), and presently we are able to establish TP thresholds for streams but not for groundwater. Derivation of TP thresholds is covered in an accompanying paper by Kronvang et al.
Effect of acute stress on taste perception: in relation with baseline anxiety level and body weight.
Ileri-Gurel, Esin; Pehlivanoglu, Bilge; Dogan, Murat
2013-01-01
We aimed to determine the effect of acute stress on taste perception and its modulation in relation to body weight and baseline anxiety in this study. The anxiety of the participants, randomly allocated to stress (n = 35) or control (n = 16) groups, was assessed by the State Trait Anxiety Inventory. Stroop color-word interference and cold pressor tests were applied as the stress protocol. Glucose and salt taste detection thresholds were evaluated before and after the stress protocol in the stress group and at corresponding times in the control group. The stress protocol increased heart rate and blood pressure as an indicator of stress system activation. Following stress, glucose and salt thresholds decreased in the stress group and remained unchanged in the control group. Prestress salt thresholds were positively correlated, and decrements in salt thresholds negatively correlated, with the trait anxiety scores of participants. The state anxiety levels of the stress group positively correlated with the decrease in glucose thresholds. Waist-to-hip ratio was negatively correlated with prestress salt thresholds of the subjects. Our results revealed that thresholds for sweet and salty tastes are modulated during stressful conditions. Our data also demonstrated a relationship between taste perception and baseline anxiety levels of healthy individuals, which may be important to understand appetite alterations in individuals under stressful conditions.
Paungmali, Aatit; Joseph, Leonard H; Sitilertpisan, Patraporn; Pirunsan, Ubon; Uthaikhup, Sureeporn
2017-11-01
Lumbopelvic stabilization training (LPST) may provide therapeutic benefits on pain modulation in chronic nonspecific low back pain conditions. This study aimed to examine the effects of LPST on pain threshold and pain intensity in comparison with the passive automated cycling intervention and control intervention among patients with chronic nonspecific low back pain. A within-subject, repeated-measures, crossover randomized controlled design was conducted among 25 participants (7 males and 18 females) with chronic nonspecific low back pain. All the participants received 3 different types of experimental interventions, which included LPST, the passive automated cycling intervention, and the control intervention randomly, with 48 hours between the sessions. The pressure pain threshold (PPT), hot-cold pain threshold, and pain intensity were estimated before and after the interventions. Repeated-measures analysis of variance showed that LPST provided therapeutic effects as it improved the PPT beyond the placebo and control interventions (P < 0.01). The pain intensity under the LPST condition was significantly better than that under the passive automated cycling intervention and controlled intervention (P < 0.001). Heat pain threshold under the LPST condition also showed a significant trend of improvement beyond the control (P < 0.05), but no significant effects on cold pain threshold were evident. Lumbopelvic stabilization training may provide therapeutic effects by inducing pain modulation through an improvement in the pain threshold and reduction in pain intensity. LPST may be considered as part of the management programs for treatment of chronic low back pain. © 2017 World Institute of Pain.
Response threshold variance as a basis of collective rationality
Yamamoto, Tatsuhiro
2017-01-01
Determining the optimal choice among multiple options is necessary in various situations, and the collective rationality of groups has recently become a major topic of interest. Social insects are thought to make such optimal choices by collecting individuals' responses relating to an option's value (=a quality-graded response). However, this behaviour cannot explain the collective rationality of brains because neurons can make only ‘yes/no’ responses on the basis of the response threshold. Here, we elucidate the basic mechanism underlying the collective rationality of such simple units and show that an ant species uses this mechanism. A larger number of units respond ‘yes’ to the best option available to a collective decision-maker using only the yes/no mechanism; thus, the best option is always selected by majority decision. Colonies of the ant Myrmica kotokui preferred the better option in a binary choice experiment. The preference of a colony was demonstrated by the workers, which exhibited variable thresholds between two options' qualities. Our results demonstrate how a collective decision-maker comprising simple yes/no judgement units achieves collective rationality without using quality-graded responses. This mechanism has broad applicability to collective decision-making in brain neurons, swarm robotics and human societies. PMID:28484636
Response threshold variance as a basis of collective rationality.
Yamamoto, Tatsuhiro; Hasegawa, Eisuke
2017-04-01
Determining the optimal choice among multiple options is necessary in various situations, and the collective rationality of groups has recently become a major topic of interest. Social insects are thought to make such optimal choices by collecting individuals' responses relating to an option's value (=a quality-graded response). However, this behaviour cannot explain the collective rationality of brains because neurons can make only 'yes/no' responses on the basis of the response threshold. Here, we elucidate the basic mechanism underlying the collective rationality of such simple units and show that an ant species uses this mechanism. A larger number of units respond 'yes' to the best option available to a collective decision-maker using only the yes/no mechanism; thus, the best option is always selected by majority decision. Colonies of the ant Myrmica kotokui preferred the better option in a binary choice experiment. The preference of a colony was demonstrated by the workers, which exhibited variable thresholds between two options' qualities. Our results demonstrate how a collective decision-maker comprising simple yes/no judgement units achieves collective rationality without using quality-graded responses. This mechanism has broad applicability to collective decision-making in brain neurons, swarm robotics and human societies.
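The yes/no mechanism described in the two records above can be reproduced with a few lines of simulation: each unit compares an option's quality to its own randomly drawn threshold, and the collective picks the option with more "yes" responses. The sketch below is an illustrative toy model with assumed quality values and threshold distribution, not the study's analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

def collective_choice(quality_a, quality_b, n_units=100, threshold_sd=1.0):
    """Each unit answers yes/no per option by comparing quality to its own
    randomly drawn response threshold; the option with more 'yes' votes wins."""
    thresholds = rng.normal(loc=(quality_a + quality_b) / 2,
                            scale=threshold_sd, size=n_units)
    yes_a = np.sum(quality_a > thresholds)
    yes_b = np.sum(quality_b > thresholds)
    return "A" if yes_a > yes_b else "B" if yes_b > yes_a else "tie"

wins = sum(collective_choice(6.0, 5.0) == "A" for _ in range(1000))
print(wins, "out of 1000 trials pick the better option A")
```

Because any unit whose threshold falls between the two qualities says "yes" only to the better option, threshold variance alone is enough for the majority to favour it.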
Ecosystem Modeling Applied to Nutrient Criteria Development in Rivers
NASA Astrophysics Data System (ADS)
Carleton, James N.; Park, Richard A.; Clough, Jonathan S.
2009-09-01
Threshold concentrations for biological impairment by nutrients are difficult to quantify in lotic systems, yet States and Tribes in the United States are charged with developing water quality criteria to protect these ecosystems from excessive enrichment. The analysis described in this article explores the use of the ecosystem model AQUATOX to investigate impairment thresholds keyed to biological indexes that can be simulated. The indexes selected for this exercise include percentage cyanobacterial biomass of sestonic algae, and benthic chlorophyll a. The calibrated model was used to analyze responses of these indexes to concurrent reductions in phosphorus, nitrogen, and suspended sediment in an enriched upper Midwestern river. Results suggest that the indexes would respond strongly to changes in phosphorus and suspended sediment, and less strongly to changes in nitrogen concentration. Using simulated concurrent reductions in all three water quality constituents, a total phosphorus concentration of 0.1 mg/l was identified as a threshold concentration, and therefore a hypothetical water quality criterion, for prevention of both excessive periphyton growth and sestonic cyanobacterial blooms. This kind of analysis is suggested as a way to evaluate multiple contrasting impacts of hypothetical nutrient and sediment reductions and to define nutrient criteria or target concentrations that balance multiple management objectives concurrently.
Truscott, James E; Werkman, Marleen; Wright, James E; Farrell, Sam H; Sarkar, Rajiv; Ásbjörnsdóttir, Kristjana; Anderson, Roy M
2017-06-30
There is an increased focus on whether mass drug administration (MDA) programmes alone can interrupt the transmission of soil-transmitted helminths (STH). Mathematical models can be used to model these interventions and are increasingly being implemented to inform investigators about expected trial outcome and the choice of optimum study design. One key factor is the choice of threshold for detecting elimination. However, there are currently no thresholds defined for STH regarding breaking transmission. We develop a simulation of an elimination study, based on the DeWorm3 project, using an individual-based stochastic disease transmission model in conjunction with models of MDA, sampling, diagnostics and the construction of study clusters. The simulation is then used to analyse the relationship between the study end-point elimination threshold and whether elimination is achieved in the long term within the model. We analyse the quality of a range of statistics in terms of the positive predictive values (PPV) and how they depend on a range of covariates, including threshold values, baseline prevalence, measurement time point and how clusters are constructed. End-point infection prevalence performs well in discriminating between villages that achieve interruption of transmission and those that do not, although the quality of the threshold is sensitive to baseline prevalence and threshold value. Optimal post-treatment prevalence threshold value for determining elimination is in the range 2% or less when the baseline prevalence range is broad. For multiple clusters of communities, both the probability of elimination and the ability of thresholds to detect it are strongly dependent on the size of the cluster and the size distribution of the constituent communities. Number of communities in a cluster is a key indicator of probability of elimination and PPV. Extending the time, post-study endpoint, at which the threshold statistic is measured improves PPV value in discriminating between eliminating clusters and those that bounce back. The probability of elimination and PPV are very sensitive to baseline prevalence for individual communities. However, most studies and programmes are constructed on the basis of clusters. Since elimination occurs within smaller population sub-units, the construction of clusters introduces new sensitivities for elimination threshold values to cluster size and the underlying population structure. Study simulation offers an opportunity to investigate key sources of sensitivity for elimination studies and programme designs in advance and to tailor interventions to prevailing local or national conditions.
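The key quantity in the study design above, the positive predictive value of an end-point prevalence threshold for true long-term elimination, is straightforward to compute once the simulation outcomes are tabulated. The function below is a hedged sketch with hypothetical cluster data.

```python
import numpy as np

def ppv_of_threshold(endpoint_prev, eliminated, threshold=0.02):
    """Positive predictive value of declaring 'transmission interrupted'
    when end-point prevalence falls below the threshold.
    endpoint_prev: prevalence per cluster at the study end point.
    eliminated:    boolean, True if the cluster truly eliminated long term."""
    endpoint_prev = np.asarray(endpoint_prev)
    eliminated = np.asarray(eliminated, dtype=bool)
    positive = endpoint_prev < threshold          # clusters declared eliminated
    if positive.sum() == 0:
        return float("nan")
    return (positive & eliminated).sum() / positive.sum()

# Hypothetical simulated clusters
prev = [0.005, 0.01, 0.03, 0.015, 0.08, 0.002]
elim = [True, True, False, False, False, True]
print(ppv_of_threshold(prev, elim))   # fraction of "positives" that truly eliminated
```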
Controlling the misuse of cobalt in horses.
Ho, Emmie N M; Chan, George H M; Wan, Terence S M; Curl, Peter; Riggs, Christopher M; Hurley, Michael J; Sykes, David
2015-01-01
Cobalt is a well-established inducer of hypoxia-like responses, which can cause gene modulation at the hypoxia inducible factor pathway to induce erythropoietin transcription. Cobalt salts are orally active, inexpensive, and easily accessible, making cobalt an attractive blood doping agent for enhancing aerobic performance. Indeed, recent intelligence and investigations have confirmed that cobalt was being abused in equine sports. In this paper, population surveys of total cobalt in raceday samples were conducted using inductively coupled plasma mass spectrometry (ICP-MS). A urinary threshold of 75 ng/mL and a plasma threshold of 2 ng/mL could be proposed for the control of cobalt misuse in raceday or in-competition samples. Results from administration trials with cobalt-containing supplements showed that common supplements could elevate urinary and plasma cobalt levels above the proposed thresholds within 24 h of administration. It would therefore be necessary to ban the use of cobalt-containing supplements on raceday as well as on the day before racing in order to implement and enforce the proposed thresholds. Since abuse with huge quantities of cobalt salts can occur during training while the use of legitimate cobalt-containing supplements is also allowed, different urinary and plasma cobalt thresholds would be required to control cobalt abuse in non-raceday or out-of-competition samples. This could be achieved by setting the thresholds above the maximum urinary and plasma cobalt concentrations observed or anticipated from the normal use of legitimate cobalt-containing supplements. A urinary threshold of 2000 ng/mL and a plasma threshold of 10 ng/mL were thus proposed for the control of cobalt abuse in non-raceday or out-of-competition samples. Copyright © 2014 John Wiley & Sons, Ltd.
Nylen, Kirk; Likhodii, Sergei; Abdelmalik, Peter A; Clarke, Jasper; Burnham, W McIntyre
2005-08-01
The pentylenetetrazol (PTZ) infusion test was used to compare seizure thresholds in adult and young rats fed either a 4:1 ketogenic diet (KD) or a 6.3:1 KD. We hypothesized that both KDs would significantly elevate seizure thresholds and that the 4:1 KD would serve as a better model of the KD used clinically. Ninety adult rats and 75 young rats were placed on one of five experimental diets: (a) a 4:1 KD, (b) a control diet balanced to the 4:1 KD, (c) a 6.3:1 KD, (d) a standard control diet, or (e) an ad libitum standard control diet. All subjects were seizure tested by using the PTZ infusion test. Blood glucose and beta-hydroxybutyrate (beta-OHB) levels were measured. Neither KD elevated absolute "latencies to seizure" in young or adult rats. Similarly, neither KD elevated "threshold doses" in adult rats. In young rats, the 6.3:1 KD, but not the 4:1 KD, significantly elevated threshold doses. The 6.3:1 KD group showed poorer weight gain than the 4:1 KD group when compared with respective controls. The most dramatic discrepancies were seen in young rats. "Threshold doses" and "latency to seizure" data provided conflicting measures of seizure threshold. This was likely due to the inflation of threshold doses calculated by using the much smaller body weights found in the 6.3:1 KD group. Ultimately, the PTZ infusion test in rats may not be a good preparation to model the anticonvulsant effects of the KD seen clinically, especially when dietary treatments lead to significantly mismatched body weights between the groups.
Pagnuco, Inti Anabela; Revuelta, María Victoria; Bondino, Hernán Gabriel; Brun, Marcel; Ten Have, Arjen
2018-01-01
Protein superfamilies can be divided into subfamilies of proteins with different functional characteristics. Their sequences can be classified hierarchically, which is part of sequence function assignation. Typically, there are no clear subfamily hallmarks that would allow pattern-based function assignation by which this task is mostly achieved based on the similarity principle. This is hampered by the lack of a score cut-off that is both sensitive and specific. HMMER Cut-off Threshold Tool (HMMERCTTER) adds a reliable cut-off threshold to the popular HMMER. Using a high quality superfamily phylogeny, it clusters a set of training sequences such that the cluster-specific HMMER profiles show cluster or subfamily member detection with 100% precision and recall (P&R), thereby generating a specific threshold as inclusion cut-off. Profiles and thresholds are then used as classifiers to screen a target dataset. Iterative inclusion of novel sequences to groups and the corresponding HMMER profiles results in high sensitivity while specificity is maintained by imposing 100% P&R self detection. In three presented case studies of protein superfamilies, classification of large datasets with 100% precision was achieved with over 95% recall. Limits and caveats are presented and explained. HMMERCTTER is a promising protein superfamily sequence classifier provided high quality training datasets are used. It provides a decision support system that aids in the difficult task of sequence function assignation in the twilight zone of sequence similarity. All relevant data and source codes are available from the Github repository at the following URL: https://github.com/BBCMdP/HMMERCTTER.
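The core idea of a 100% precision-and-recall inclusion cut-off can be sketched as follows. This is not the HMMERCTTER code; it assumes you already have HMMER bit scores of the training sequences against one cluster profile and the known cluster membership, and it simply places the threshold between the weakest true member and the strongest non-member (returning None when the two score ranges overlap and no 100% P&R cut-off exists).

```python
def inclusion_threshold(scores, in_cluster):
    """Return a bit-score cut-off giving 100% precision and recall on the
    training set, or None if the member and non-member scores overlap.

    scores     : dict mapping sequence id -> HMMER bit score against the profile
    in_cluster : set of sequence ids that belong to the cluster
    """
    member_scores = [s for sid, s in scores.items() if sid in in_cluster]
    other_scores = [s for sid, s in scores.items() if sid not in in_cluster]
    lo = min(member_scores)                      # weakest true member
    hi = max(other_scores) if other_scores else float("-inf")
    if lo <= hi:                                 # distributions overlap
        return None
    return (lo + hi) / 2 if other_scores else lo

# Toy example: threshold lands between the weakest member (295.8)
# and the strongest non-member (120.4).
scores = {"a": 310.2, "b": 295.8, "c": 120.4, "d": 88.1}
print(inclusion_threshold(scores, {"a", "b"}))
```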
Bayesian framework inspired no-reference region-of-interest quality measure for brain MRI images
Osadebey, Michael; Pedersen, Marius; Arnold, Douglas; Wendel-Mitoraj, Katrina
2017-01-01
Abstract. We describe a postacquisition, attribute-based quality assessment method for brain magnetic resonance imaging (MRI) images. It is based on the application of Bayes theory to the relationship between entropy and image quality attributes. The entropy feature image of a slice is segmented into low- and high-entropy regions. For each entropy region, there are three separate observations of contrast, standard deviation, and sharpness quality attributes. A quality index for a quality attribute is the posterior probability of an entropy region given any corresponding region in a feature image where quality attribute is observed. Prior belief in each entropy region is determined from normalized total clique potential (TCP) energy of the slice. For TCP below the predefined threshold, the prior probability for a region is determined by deviation of its percentage composition in the slice from a standard normal distribution built from 250 MRI volume data provided by Alzheimer’s Disease Neuroimaging Initiative. For TCP above the threshold, the prior is computed using a mathematical model that describes the TCP–noise level relationship in brain MRI images. Our proposed method assesses the image quality of each entropy region and the global image. Experimental results demonstrate good correlation with subjective opinions of radiologists for different types and levels of quality distortions. PMID:28630885
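The Bayes step described above reduces to a posterior over the two entropy regions given an observed quality attribute. A minimal sketch, with toy likelihoods and priors standing in for the attribute feature images and TCP-derived priors used in the paper:

```python
def quality_index(likelihoods, priors):
    """Posterior probability of each entropy region given one quality
    attribute observation (Bayes rule over the low- and high-entropy regions).

    likelihoods : dict region -> P(attribute observation | region)
    priors      : dict region -> prior belief in the region (sums to 1)
    """
    evidence = sum(likelihoods[r] * priors[r] for r in priors)
    return {r: likelihoods[r] * priors[r] / evidence for r in priors}

# Toy numbers: contrast is observed mostly in the high-entropy region,
# and the prior is assumed to come from a TCP-based rule as in the paper.
posterior = quality_index({"low": 0.2, "high": 0.7},
                          {"low": 0.4, "high": 0.6})
print(posterior)
```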
Evaluation Of Water Quality At River Bian In Merauke Papua
NASA Astrophysics Data System (ADS)
Djaja, Irba; Purwanto, P.; Sunoko, H. R.
2018-02-01
The River Bian in Merauke Regency is used by the local Marind people of Papua, who live along the river, to meet daily needs such as bathing, washing clothes and dishes, and even defecation and waste disposal (including domestic waste), as well as for ceremonial activities tied to the local traditional culture. Changes in land use and the domestic activities of the local people have put increasing pressure on the River Bian and reduced its water quality. This study aimed to determine and analyse the water quality and water quality status of the River Bian and its compliance with the water quality standards for its intended uses. Sampling points were selected by purposive sampling, and water samples were collected by the grab method. Water quality was analysed using the standard and pollution index (PI) methods. The study revealed that, for BOD, the water quality at station 3 exceeded the quality threshold, and that the COD parameter at all stations exceeded the class III quality threshold. Water quality declined with increasing PI at stations 1, 2, and 3. In other words, the River Bian had been lightly contaminated.
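The pollution index referred to above is commonly computed from the ratios of measured concentrations to the class quality thresholds. The sketch below uses a standard Nemerow-style formulation with hypothetical BOD and COD values; the exact index used in the study may differ in detail.

```python
import math

def pollution_index(concentrations, standards):
    """Nemerow-style pollution index:
    PI = sqrt((max(Ci/Li)^2 + mean(Ci/Li)^2) / 2).

    concentrations : measured values for each parameter (e.g. BOD, COD)
    standards      : corresponding class-III quality thresholds
    """
    ratios = [c / l for c, l in zip(concentrations, standards)]
    worst = max(ratios)
    mean = sum(ratios) / len(ratios)
    return math.sqrt((worst ** 2 + mean ** 2) / 2)

# Hypothetical station: BOD and COD measured against class-III limits.
pi = pollution_index([8.0, 60.0], [6.0, 50.0])
print(round(pi, 2))   # 1 < PI <= 5 is usually read as "lightly polluted"
```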
Lin, Michelle; Thoma, Brent; Trueger, N Seth; Ankel, Felix; Sherbino, Jonathan; Chan, Teresa
2015-10-01
Quality assurance concerns about social media platforms used for education have arisen within the medical education community. As more trainees and clinicians use resources such as blogs and podcasts for learning, we aimed to identify quality indicators for these resources. A previous study identified 151 potentially relevant quality indicators for these social media resources. Our objective was to identify quality markers for blogs and podcasts using an international cohort of health professions educators. A self-selected group of 44 health professions educators at the 2014 International Conference on Residency Education participated in a Social Media Summit, during which a modified Delphi consensus study was conducted to determine which of the 151 quality indicators met the a priori ≥90% inclusion threshold. Thirteen quality indicators classified into the domains of credibility (n=8), content (n=4) and design (n=1) met the inclusion threshold. The quality indicators that were identified may serve as a foundation for further research on quality indicators of social media-based medical education resources and prompt discussion of their legitimacy as a form of educational scholarship. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Postural control after a prolonged treadmill run at individual ventilatory and anaerobic threshold.
Guidetti, Laura; Franciosi, Emanuele; Gallotta, Maria Chiara; Emerenziani, Gian Pietro; Baldari, Carlo
2011-01-01
The objective of the study was to verify whether young males' balance was affected by a 30-min prolonged treadmill run (TR) at the individual ventilatory (IVT) and anaerobic (IAT) thresholds during recovery. The VO2max, IAT and IVT were determined during an incremental TR. Mean displacement amplitude (Acp) and velocity (Vcp) of the center of pressure were recorded before (pre) and after (0 min post, 5 min post, and 10 min post) prolonged TR at IAT and IVT, through posturographic trials performed with eyes open (EO) and closed (EC). Significant differences between IVT and IAT for Vcp, and between EO and EC for Acp and Vcp, were observed. The IAT induced a greater destabilizing effect when postural trials were performed with EC. The IVT intensity also produced a destabilizing effect on postural control immediately after exercise. An impairment of postural control after prolonged treadmill running at IVT and IAT intensity was shown. However, the destabilizing effect on postural control disappeared within 10 min after IAT intensity and within 5 min after IVT intensity. Key points: To verify whether young males' balance was affected by 30-min prolonged treadmill running at the individual ventilatory and anaerobic thresholds during recovery. Mean displacement amplitude and velocity of the center of foot pressure were recorded before and after prolonged treadmill running at the individual ventilatory and anaerobic thresholds, through posturographic trials performed with eyes open and closed. The destabilizing effect on postural control disappeared within 10 min post individual anaerobic threshold, and within 5 min post individual ventilatory threshold.
Sanchez-Migallon Guzman, David; KuKanich, Butch; Keuler, Nicholas S; Klauer, Julia M; Paul-Murphy, Joanne R
2011-06-01
To evaluate the antinociceptive effects and duration of action of nalbuphine HCl administered IM on thermal thresholds in Hispaniolan Amazon parrots (Amazona ventralis). 14 healthy adult Hispaniolan Amazon parrots of unknown sex. 3 doses of nalbuphine (12.5, 25, and 50 mg/kg, IM) and saline (0.9% NaCl) solution (control treatment) were evaluated in a blinded complete crossover experimental design by use of foot withdrawal threshold to a noxious thermal stimulus. Baseline data on thermal threshold were generated 1 hour before administration of nalbuphine or saline solution; thermal threshold measurements were obtained 0.5, 1.5, 3, and 6 hours after administration. Nalbuphine administered IM at 12.5 mg/kg significantly increased the thermal threshold (mean change, 2.4°C), compared with results for the control treatment, and significantly changed thermal threshold for up to 3 hours, compared with baseline results (mean change, 2.6° to 3.8°C). Higher doses of nalbuphine did not significantly change thermal thresholds, compared with results for the control treatment, but had a significant effect, compared with baseline results, for up to 3 and 1.5 hours after administration, respectively. Nalbuphine administered IM at 12.5 mg/kg significantly increased the foot withdrawal threshold to a thermal noxious stimulus in Hispaniolan Amazon parrots. Higher doses of nalbuphine did not result in significantly increased thermal thresholds or a longer duration of action and would be expected to result in less analgesic effect than lower doses. Further studies are needed to fully evaluate the analgesic effects of nalbuphine in psittacine species.
Dynamic and Tunable Threshold Voltage in Organic Electrochemical Transistors.
Doris, Sean E; Pierre, Adrien; Street, Robert A
2018-04-01
In recent years, organic electrochemical transistors (OECTs) have found applications in chemical and biological sensing and interfacing, neuromorphic computing, digital logic, and printed electronics. However, the incorporation of OECTs in practical electronic circuits is limited by the relative lack of control over their threshold voltage, which is important for controlling the power consumption and noise margin in complementary and unipolar circuits. Here, the threshold voltage of OECTs is precisely tuned over a range of more than 1 V by chemically controlling the electrochemical potential at the gate electrode. This threshold voltage tunability is exploited to prepare inverters and amplifiers with improved noise margin and gain, respectively. By coupling the gate electrode with an electrochemical oscillator, single-transistor oscillators based on OECTs with dynamic time-varying threshold voltages are prepared. This work highlights the importance of electrochemistry at the gate electrode in determining the electrical properties of OECTs, and opens a path toward the system-level design of low-power OECT-based electronics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Merluzzi, Thomas V.; Philip, Errol J.; Zhang, Zhiyong; Sullivan, Courtney
2016-01-01
In racial disparities research, perceived discrimination is a proposed risk factor for unfavorable health outcomes. In a proposed “threshold-constraint” theory, discrimination intensity may exceed a threshold and require coping strategies, but social constraint limits coping options for African Americans, who may react to perceived racial discrimination with disengagement, because active strategies are not viable under this social constraint. Caucasian Americans may experience less discrimination and lower social constraint, and thus may use more active coping strategies. A total of 213 African Americans and 121 Caucasian Americans with cancer participated by completing measures of mistreatment, coping, and quality of life. African Americans reported more mistreatment than Caucasian Americans (p < .001) and attributed mistreatment more to race/ethnicity (p < .001). In the mistreatment-quality of life relationship, disengagement was a significant mediator for Caucasians (B = −.39; CI .13–.83) and African Americans (B = −.20; CI .07–.43). Agentic coping was a significant mediator only for Caucasians (B = −.48; CI .18–.81). Discrimination may exceed threshold more often for African Americans than for Caucasians, and social constraint may exert greater limits for African Americans. Results suggest that perceived discrimination affects quality of life for African Americans with cancer because their coping options to counter mistreatment, which is racially based, are limited. This process may also affect treatment, recovery, and survivorship. PMID:25090144
Ecstasy use and self-reported disturbances in sleep.
Ogeil, Rowan P; Rajaratnam, Shantha M W; Phillips, James G; Redman, Jennifer R; Broadbear, Jillian H
2011-10-01
Ecstasy users report a number of complaints after its use including disturbed sleep. However, little is known regarding which attributes of ecstasy use are associated with sleep disturbances, which domains of sleep are affected or which factors may predict those ecstasy users likely to have poor sleep quality and/or excessive daytime sleepiness. This study examined questionnaire responses of social drug users (n = 395) to the Pittsburgh Sleep Quality Index and the Epworth Sleepiness Scale. A significant proportion of ecstasy users (69.5%) had Pittsburgh Sleep Quality Index scores above the threshold used to identify sleep disturbance. Although frequency of ecstasy use did not affect the degree of reported sleep disturbance, participants who used larger amounts of ecstasy had poorer sleep. In addition, participants who perceived harmful consequences arising from their ecstasy use or had experienced remorse following ecstasy use had poorer sleep. Clinically relevant levels of sleep disturbance were still evident after controlling for polydrug use. Risk factors for poor sleep quality were younger age, injury post-ecstasy use and having been told to cut down on ecstasy use. Many ecstasy users report poor sleep quality, which likely contributes to the negative effects reported following ecstasy use. Copyright © 2011 John Wiley & Sons, Ltd.
Ohta, Tazro; Nakazato, Takeru; Bono, Hidemasa
2017-06-01
It is important for public data repositories to promote the reuse of archived data. In the growing field of omics science, however, the increasing number of submissions of high-throughput sequencing (HTSeq) data to public repositories prevents users from choosing a suitable data set from among the large number of search results. Repository users need to be able to set a threshold to reduce the number of results to obtain a suitable subset of high-quality data for reanalysis. We calculated the quality of sequencing data archived in a public data repository, the Sequence Read Archive (SRA), by using the quality control software FastQC. We obtained quality values for 1 171 313 experiments, which can be used to evaluate the suitability of data for reuse. We also visualized the data distribution in SRA by integrating the quality information and metadata of experiments and samples. We provide quality information of all of the archived sequencing data, which enable users to obtain sufficient quality sequencing data for reanalyses. The calculated quality data are available to the public in various formats. Our data also provide an example of enhancing the reuse of public data by adding metadata to published research data by a third party. © The Authors 2017. Published by Oxford University Press.
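A user-side filter of the kind the authors enable might look like the sketch below. The CSV layout and column names (accession, mean_base_quality) are hypothetical stand-ins for whatever per-experiment summary is exported from the FastQC-derived quality data.

```python
import csv

def select_experiments(table_path, min_quality):
    """Return accession ids whose summary quality value meets the cut-off.

    Assumes a CSV with hypothetical columns 'accession' and
    'mean_base_quality' (e.g. a per-experiment summary derived from FastQC
    reports); the real export formats provided by the authors may differ.
    """
    keep = []
    with open(table_path, newline="") as fh:
        for row in csv.DictReader(fh):
            if float(row["mean_base_quality"]) >= min_quality:
                keep.append(row["accession"])
    return keep

# Example (hypothetical file name):
# keep = select_experiments("sra_quality_summary.csv", min_quality=30.0)
```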
2014-01-01
Background: Central disinhibition is a mechanism involved in the pathophysiology of fibromyalgia. Melatonin can improve sleep quality, pain and pain threshold. We hypothesized that treatment with melatonin alone or in combination with amitriptyline would be superior to amitriptyline alone in modifying the endogenous pain-modulating system (PMS) as quantified by conditional pain modulation (CPM), and that this change in CPM could be associated with serum brain-derived neurotrophic factor (BDNF). We also tested whether melatonin improves the clinical symptoms of pain, pain threshold and sleep quality. Methods: Sixty-three females, aged 18 to 65, were randomized to receive bedtime amitriptyline (25 mg) (n = 21), melatonin (10 mg) (n = 21) or melatonin (10 mg) + amitriptyline (25 mg) (n = 21) for a period of six weeks. The descending PMS was assessed with the CPM-TASK. We also assessed the pain score on the Visual Analog Scale (VAS, 0-100 mm), the Fibromyalgia Impact Questionnaire (FIQ) score, heat pain threshold (HPT), sleep quality and serum BDNF. Delta values (post- minus pre-treatment) were used to compare the treatment effect. The outcome variables were collected before, and one and six weeks after, initiating treatment. Results: Melatonin alone or in combination with amitriptyline significantly reduced pain on the VAS compared with amitriptyline alone (P < 0.01). The delta values of the VAS scores were -12.85 (19.93), -17.37 (18.69) and -20.93 (12.23) in the amitriptyline, melatonin and melatonin + amitriptyline groups, respectively. Melatonin alone and in combination increased the inhibitory PMS as assessed by the reduction on the Numerical Pain Scale [NPS(0-10)] during the CPM-TASK: -2.4 (2.04) for melatonin + amitriptyline, -2.65 (1.68) for melatonin, and -1.04 (2.06) for amitriptyline (P < 0.05). Patients treated with melatonin + amitriptyline displayed better results than those given melatonin or amitriptyline alone in terms of FIQ and PPT improvement (P < 0.05 for both). Conclusion: Melatonin increased the inhibitory endogenous pain-modulating system as assessed by the reduction on the NPS(0-10) during the CPM-TASK. Melatonin alone or associated with amitriptyline was better than amitriptyline alone in improving pain on the VAS, whereas its association with amitriptyline produced only marginal additional clinical effects on FIQ and PPT. Trial registration: This controlled trial is registered at clinicaltrials.gov under number NCT02041455. Registered January 16, 2014. PMID:25052847
Kuffner, Ilsa B.; Roberts, Kelsey E.; Flannery, Jennifer A.; Morrison, Jennifer M.; Richey, Julie
2017-01-01
Massive corals provide a useful archive of environmental variability, but careful testing of geochemical proxies in corals is necessary to validate the relationship between each proxy and environmental parameter throughout the full range of conditions experienced by the recording organisms. Here we use samples from a coral-growth study to test the hypothesis that Sr/Ca in the coral Siderastrea siderea accurately records sea-surface temperature (SST) in the subtropics (Florida, USA) along 350 km of reef tract. We test calcification rate, measured via buoyant weight, and linear extension (LE) rate, estimated with Alizarin Red-S staining, as predictors of variance in the Sr/Ca records of 39 individual S. siderea corals grown at four outer-reef locations next to in-situ temperature loggers during two, year-long periods. We found that corals with calcification rates < 1.7 mg cm−2 d−1 or < 1.7 mm yr−1 LE returned spuriously high Sr/Ca values, leading to a cold-bias in Sr/Ca-based SST estimates. The threshold-type response curves suggest that extension rate can be used as a quality-control indicator during sample and drill-path selection when using long cores for SST paleoreconstruction. For our corals that passed this quality control step, the Sr/Ca-SST proxy performed well in estimating mean annual temperature across three sites spanning 350 km of the Florida reef tract. However, there was some evidence that extreme temperature stress in 2010 (cold snap) and 2011 (SST above coral-bleaching threshold) may have caused the corals not to record the temperature extremes. Known stress events could be avoided during modern calibrations of paleoproxies.
Testing the fidelity of the Sr/Ca proxy in recording ocean temperature in a western Atlantic coral
NASA Astrophysics Data System (ADS)
Kuffner, I. B.; Roberts, K.; Flannery, J. A.; Richey, J. N.; Morrison, J. M.
2017-12-01
Massive corals provide a useful archive of environmental variability, but careful testing of geochemical proxies in corals is necessary to validate the relationship between each proxy and environmental parameter throughout the full range of conditions experienced by the recording organisms. Here we use samples from a field-based coral-growth study to test the hypothesis that Sr/Ca in the coral Siderastrea siderea accurately records sea-surface temperature (SST) in the subtropics (Florida, USA) along 350 km of reef tract. We test calcification rate, measured via buoyant weight, and linear extension (LE) rate, estimated with Alizarin Red-S staining, as predictors of variance in the Sr/Ca records of 39 individual S. siderea corals grown at four outer-reef locations next to in-situ temperature loggers during two, year-long periods. We found that corals with calcification rates less than 1.7 mg cm-2 d-1 or LE rates less than 1.7 mm yr-1 returned spuriously high Sr/Ca values, leading to a cold bias in Sr/Ca-based SST estimates. The threshold-type response curves suggest that LE rate can be used as a quality-control indicator during sample and microdrill-path selection when using long cores for SST paleoreconstruction. For our corals that passed this quality control step, the Sr/Ca-SST proxy performed well in estimating mean annual SST across three sites spanning 350 km of the Florida reef tract. However, there was some evidence that extreme temperature stress in 2010 (cold snap) and 2011 (SST above coral-bleaching threshold) may have caused the corals not to record the temperature extremes. Known stress events could be avoided during modern calibrations of paleoproxies.
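The quality-control step described in both records above amounts to a simple growth-rate filter applied before calibrating the Sr/Ca-SST relationship. A minimal sketch, using the 1.7 mg cm-2 d-1 and 1.7 mm yr-1 cut-offs reported by the authors, with toy data:

```python
import numpy as np

def qc_filter(calcification, extension, sr_ca, min_calc=1.7, min_ext=1.7):
    """Drop coral records growing below the growth-rate thresholds reported
    in the study (1.7 mg cm-2 d-1 calcification, 1.7 mm yr-1 linear
    extension), which were found to bias Sr/Ca high and SST estimates cold."""
    ok = (np.asarray(calcification) >= min_calc) & (np.asarray(extension) >= min_ext)
    return np.asarray(sr_ca)[ok], ok

# Toy records: the third coral grew too slowly and is excluded before
# calibrating the Sr/Ca-SST regression.
srca, mask = qc_filter([2.1, 1.9, 1.2], [2.5, 1.8, 1.0], [9.05, 9.08, 9.40])
print(srca, mask)
```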
Lower-upper-threshold correlation for underwater range-gated imaging self-adaptive enhancement.
Sun, Liang; Wang, Xinwei; Liu, Xiaoquan; Ren, Pengdao; Lei, Pingshun; He, Jun; Fan, Songtao; Zhou, Yan; Liu, Yuliang
2016-10-10
In underwater range-gated imaging (URGI), enhancement of low-brightness and low-contrast images is critical for human observation. Traditional histogram equalization over-enhances such images, with the result that details are lost. To suppress over-enhancement, a lower-upper-threshold correlation method is proposed for self-adaptive enhancement in underwater range-gated imaging, based on double-plateau histogram equalization. The lower threshold preserves image details and suppresses over-enhancement, and it is correlated with the upper threshold. First, the upper threshold is updated by searching for the local maximum in real time, and then the lower threshold is calculated from the upper threshold and the number of nonzero units selected from a filtered histogram. With this method, the backgrounds of underwater images are constrained while details are enhanced. Finally, proof-of-concept experiments were performed. Peak signal-to-noise ratio, variance, contrast, and human visual properties are used to evaluate the objective quality of the global images and regions of interest. The evaluation results demonstrate that the proposed method adaptively selects proper upper and lower thresholds under different conditions. The proposed method contributes to URGI with effective image enhancement for human observation.
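A generic double-plateau histogram equalization of the kind this method builds on is sketched below. The paper's specific rule correlating the lower threshold with the upper (via the local-maximum search and the count of nonzero histogram units) is not reproduced; here both thresholds are simply parameters.

```python
import numpy as np

def double_plateau_equalize(img, t_up, t_low, levels=256):
    """Double-plateau histogram equalization for an 8-bit grayscale image:
    the histogram is clipped from above by t_up (limiting background
    over-enhancement) and nonzero bins are lifted to at least t_low
    (preserving detail), then a standard CDF mapping is applied."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    clipped = hist.copy()
    clipped[clipped > t_up] = t_up
    nz = clipped > 0
    clipped[nz & (clipped < t_low)] = t_low
    cdf = np.cumsum(clipped)
    lut = np.round((levels - 1) * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]

# Toy low-contrast image
img = (np.random.default_rng(1).normal(60, 10, (64, 64))
       .clip(0, 255).astype(np.uint8))
out = double_plateau_equalize(img, t_up=200.0, t_low=5.0)
```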
Electrocardiogram signal denoising based on a new improved wavelet thresholding
NASA Astrophysics Data System (ADS)
Han, Guoqiang; Xu, Zhijun
2016-08-01
Good-quality electrocardiogram (ECG) signals are used by physicians to interpret and identify physiological and pathological phenomena. In general, ECG signals may be contaminated by various noises, such as baseline wander, power line interference, and electromagnetic interference, during the gathering and recording process. As ECG signals are non-stationary physiological signals, the wavelet transform has proven to be an effective tool for removing noise from corrupted signals. A new compromise threshold function, a sigmoid-function-based thresholding scheme, is adopted for processing ECG signals. Compared with other methods, such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages for noise reduction in ECG signals. It overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding is shown to be more efficient than existing algorithms for ECG signal denoising. The signal-to-noise ratio, mean square error, and percent root-mean-square difference are calculated as quantitative measures of denoising performance. The experimental results reveal that the P, Q, R, and S waves of the denoised ECG signals coincide with those of the original ECG signals when the proposed method is employed.
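A sketch of wavelet-domain ECG denoising with a sigmoid-style compromise threshold function is given below. The particular sigmoid form used here is one common choice that is zero below T, continuous at ±T, and approaches hard thresholding for large coefficients; it is not necessarily the exact function proposed in the paper, and the PyWavelets package (pywt) is assumed to be available.

```python
import numpy as np
import pywt  # PyWavelets

def sigmoid_threshold(w, T, k=3.0):
    """Compromise threshold function: zero below T, continuous at |w| = T,
    and with a shrinkage that decays to zero for large |w| (hard-like).
    This is a common sigmoid-based form, not the paper's exact function."""
    w = np.asarray(w, dtype=float)
    shrink = np.abs(w) - 2.0 * T / (1.0 + np.exp(k * (np.abs(w) - T)))
    return np.where(np.abs(w) > T, np.sign(w) * shrink, 0.0)

def denoise_ecg(signal, wavelet="db4", level=4, k=3.0):
    """Wavelet decomposition, thresholding of detail coefficients with the
    universal threshold, and reconstruction."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    T = sigma * np.sqrt(2.0 * np.log(len(signal)))          # universal threshold
    coeffs = [coeffs[0]] + [sigmoid_threshold(c, T, k) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Example: clean = denoise_ecg(noisy_ecg)
```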
Ultralow-threshold Raman lasing with CaF2 resonators.
Grudinin, Ivan S; Maleki, Lute
2007-01-15
We demonstrate efficient Raman lasing with CaF2 whispering-gallery-mode resonators. Continuous-wave emission threshold is shown to be possible below 1 microW with a 5 mm cavity, which is to our knowledge orders of magnitude lower than in any other Raman source. Low-threshold lasing is made possible by the ultrahigh optical quality factor of the cavity, of the order of Q = 5 x 10^10. Stokes components of up to the fifth order were observed at a pump power of 160 microW, and up to the eighth order at 1 mW. A lasing threshold of 15 microW was also observed in a 100 microm CaF2 microcavity. Potential applications are discussed.
Contribution to the development of a process control method in a pelletizing plant
NASA Astrophysics Data System (ADS)
Gosselin, Claude
This thesis, a collaborative effort between Ecole de technologie superieure and ArcelorMittal Company, presents the development of a methodology for monitoring and quality control of multivariable industrial production processes. This innovation research mandate was developed at the ArcelorMittal Exploitation Miniere (AMEM) pellet plant in Port-Cartier (Quebec, Canada). With this undertaking, ArcelorMittal is striving to maintain its world-class level of excellence and continues to pursue initiatives that can augment its competitive advantage worldwide. The plant's gravimetric classification process was retained as a prototype and development laboratory due to its effect on the company's competitiveness and its impact on subsequent steps leading to final production of iron oxide pellets. Concretely, the development of this expertise in process control and in situ monitoring will establish a firm basic knowledge in the fields of complex system physical modeling, data reconciliation, statistical observers, multivariable control and quality control using real-time monitoring of the desirability function. The hydraulic classifier is mathematically modeled. Using planned disturbances on the production line, an identification procedure was established to provide empirical estimations of the model's structural parameters. A new sampling campaign and a previously unpublished data collection and consolidation policy were implemented plant-wide. Access to these invaluable data sources has enabled the establishment of new thresholds that govern the production process and its control. Finally, as a substitute for the traditional quality control process, we have implemented a new strategy based on the use of the desirability function. Our innovation is not in using this function as an indicator of overall (economic) satisfaction in the production process, but rather in proposing it as an "observer" of the system's state. The first implementation steps have already demonstrated the method's feasibility as well as numerous other industrial impacts on production processes within the company, namely the emergence of the economic aspect as a strategic variable that assures better governance of production processes where quality variables present strategic issues.
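The desirability-function monitoring described above can be illustrated with the standard Derringer-Suich construction: each quality variable is mapped to an individual desirability in [0, 1], and the overall index is their geometric mean. The responses and limits below are hypothetical, and this is a generic sketch, not the thesis's exact observer design.

```python
import math

def desirability_larger_is_better(y, low, high, weight=1.0):
    """Individual desirability d in [0, 1]: 0 at or below `low`, 1 at or
    above `high`, with a power `weight` shaping the ramp (Derringer-Suich)."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities; any d = 0 drives the
    overall index to 0, flagging an unacceptable process state."""
    if any(d == 0.0 for d in ds):
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))

# Hypothetical quality variables monitored in real time.
d1 = desirability_larger_is_better(66.5, low=64.0, high=68.0)   # e.g. % Fe
d2 = desirability_larger_is_better(92.0, low=88.0, high=95.0)   # e.g. % passing size
print(overall_desirability([d1, d2]))
```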
Incorporating biological control into IPM decision making
USDA-ARS?s Scientific Manuscript database
Of the many ways biological control can be incorporated into Integrated Pest Management (IPM) programs, natural enemy thresholds are arguably most easily adopted by stakeholders. Integration of natural enemy thresholds into IPM programs requires ecological and cost/benefit crop production data, thr...
The salt-taste threshold in untreated hypertensive patients.
Kim, Chang-Yeon; Ye, Mi-Kyung; Lee, Young Soo
2017-01-01
The salt-taste threshold can influence salt appetite and is thought to be another marker of sodium intake. Many studies have reported a relationship between sodium intake and blood pressure (BP). The aim of this study was to evaluate the relationship between the salt-taste threshold and urinary sodium excretion in normotensive and hypertensive groups. We analyzed 199 patients (mean age 52 years, 47.3% male) who underwent 24-h ambulatory BP monitoring (ABPM). Hypertension was diagnosed as an average daytime systolic BP of ≥135 mmHg or diastolic BP of ≥85 mmHg by the ABPM. We assessed the salt-taste threshold using graded saline solutions. The salt-taste threshold, 24-h urinary sodium and potassium excretion, and echocardiographic data were compared between the control and hypertensive groups. The detection and recognition thresholds of salt taste did not differ significantly between the control and hypertensive groups. The 24-h urinary sodium excretion of hypertensive patients was significantly higher than that of the control group (140.9 ± 59.8 vs. 117.9 ± 57.2 mEq/day, respectively, p = 0.011). The urinary sodium-potassium ratio was also significantly higher in the hypertensive patients. There was no correlation between the salt-taste threshold and 24-h urinary sodium excretion. Thus, the salt-taste threshold appears to be related neither to BP status nor to 24-h urinary sodium excretion.
Pain perception in major depressive disorder: a neurophysiological case-control study.
Zambito Marsala, Sandro; Pistacchi, Michele; Tocco, Pierluigi; Gioulis, Manuela; Fabris, Federico; Brigo, Francesco; Tinazzi, Michele
2015-10-15
Depression and pain may sometimes be related conditions. Occasionally, depression may be associated with physical symptoms, such as back pain and headache. Moreover, depression may impair the subjective response to pain and is likely to influence the pain feeling. Conversely, chronic pain may represent an emotional condition as well as physical sensation, and can influence both the mood and behaviour. To better understand the relationship between pain and depression, we therefore assessed the pain threshold and the tolerance pain threshold in patients with depressive disorders. We conducted a case-control study and selected patients who had recently received a diagnosis of major depression (DSM-IV), before treatment, and without any significant pain complaints. Age- and sex-matched healthy controls were also included. Tactile and pain thresholds were assessed in all subjects through an electrical stimulation test. All results were compared between the groups. 27 patients and 27 age-matched healthy controls were included in the study. Tactile, pain and tolerance thresholds were evaluated in all subjects. The pain threshold and pain tolerance were lower in patients with major depression than controls. All differences were statistically significant (p<0.05). These results suggest the abnormal processing of pain stimuli in depressive disorders. Copyright © 2015 Elsevier B.V. All rights reserved.
Increase of olfactory threshold in plating factory workers exposed to chromium in Korea.
Kitamura, Fumihiko; Yokoyama, Kazuhito; Araki, Shunichi; Nishikitani, Mariko; Choi, Jae-Wook; Yum, Youg-Tae; Park, Hee-Chan; Park, Sang-Hwoi; Sato, Hajime
2003-07-01
To disclose the effects of chromium (Cr) on olfactory function, olfactory threshold tests were conducted on 27 male plating workers (Cr workers) with signs and symptoms of olfactory irritation but without nasal septum perforation or ulcer and on 34 male control subjects in Korean plating factories. The Cr workers had been exposed to Cr fume for 0.9 to 18.2 (mean 7.9) years; their blood Cr concentrations (0.16-3.69, mean 1.29 microg/dl) were significantly higher than those of the 34 control subjects (0.04-1.95, mean 0.55 microg/dl). Scores on recognition thresholds among the Cr workers were significantly higher than those of the control subjects (p < 0.05) and related positively and significantly to the exposure periods of the 27 Cr workers (p < 0.05). Olfactory thresholds were not significantly different between the Cr workers with and without nasal signs or symptoms, except that the scores on the recognition threshold were significantly higher in those experiencing difficulty with smell (p < 0.05). It is suggested that olfactory threshold is affected by Cr without development of nasal septum perforation or ulceration.
Bajaj, Priti; Bajaj, Prem; Madsen, Hans; Arendt-Nielsen, Lars
2002-01-01
The objective was to evaluate somatosensory thresholds to a multimodality stimulation regimen applied both within and outside areas of referred menstrual pain in dysmenorrheic women, over four phases of confirmed ovulatory cycles, and to compare them with thresholds in nondysmenorrheic women during menstruation. Twenty dysmenorrheic women with menstrual pain scoring 5.45 +/- 0.39 cm (mean +/- standard error of mean) on a visual analog scale (10 cm) participated. Fifteen nondysmenorrheic women with a menstrual pain score of 0.4 +/- 0.2 cm participated as controls. Ovulation was confirmed by an enzyme-multiplied immunoassay technique. Menstrual pain was described with the McGill Pain Questionnaire. Areas within menstrual pain referral were two abdominal sites and the midline of the low back, and the arm and thigh were the control areas. The pressure pain threshold (PPT) and pinch pain threshold were determined by a hand-held electronic pressure algometer, the heat pain threshold (HPT) by a contact thermode, and the tactile threshold with von Frey hairs. In dysmenorrheic women the McGill Pain Questionnaire showed a larger sensory and affective component of pain than the evaluative and miscellaneous groups. The HPT and PPT were lower in the menstrual phase than in the ovulatory, luteal, and premenstrual phases, both within and outside areas of referred menstrual pain (p <0.01), with a more pronounced decrease at the referral pain areas. The pinch pain threshold was lower in the menstrual phase than in the ovulatory phase (p <0.02), and the tactile threshold did not differ significantly across the menstrual phases or within any site. Dysmenorrheic women had a lower HPT at the control sites and a lower PPT at the abdomen, back, and control sites, than in those of nondysmenorrheic women in the menstrual phase. The results show reduced somatosensory pain thresholds during menstruation to heat and pressure stimulation, both within and outside areas of referred menstrual pain in dysmenorrheic women. Dysmenorrheic women showed a lower HPT at the control sites and a lower PPT at all the sites than those for nondysmenorrheic women in the menstrual phase. The altered somatosensory thresholds may be dependent on a spinal mechanism of central hyperexcitability, induced by recurrent moderate to severe menstrual pain.
Zhang, Yuqing; Montoya, Luis; Ebrahim, Shanil; Busse, Jason W; Couban, Rachel; McCabe, Randi E; Bieling, Peter; Carrasco-Labra, Alonso; Guyatt, Gordon H
2015-01-01
To conduct a systematic review and meta-analysis to evaluate the effectiveness of hypnosis/relaxation therapy compared to no/minimal treatment in patients with temporomandibular disorders (TMD). Studies reviewed included randomized controlled trials (RCTs) in which investigators randomized patients with TMD or an equivalent condition to an intervention arm receiving hypnosis, relaxation training, or hypnorelaxation therapy, and a control group receiving no/minimal treatment. The systematic search was conducted without language restrictions in Medline, EMBASE, CENTRAL, and PsycINFO, from inception to June 30, 2014. Results were pooled as weighted mean differences for continuous outcomes and pooled risk ratios (RRs) for dichotomous outcomes, with their associated 95% confidence intervals (CIs). Of 3,098 identified citations, 3 studies including 159 patients proved eligible, although none of these described their method of randomization. The results suggested limited or no benefit of hypnosis/relaxation therapy on pain (risk difference in important pain -0.06; 95% CI: -0.18 to 0.05; P = .28), or on pressure pain thresholds on the skin surface over the temporomandibular joint (TMJ) and masticatory muscles. Low-quality evidence suggested some benefit of hypnosis/relaxation therapy on maximal pain (mean difference on 100-mm scale = -28.33; 95% CI: -44.67 to -11.99; P = .007) and active maximal mouth opening (mean difference = -2.63 mm; 95% CI: -3.30 mm to -1.96 mm; P < .001) compared to no/minimal treatment. Three RCTs were eligible for the systematic review, but they had a high risk of bias and provided low-quality evidence suggesting that hypnosis/relaxation therapy may have a beneficial effect on maximal pain and active maximal mouth opening but not on pain and pressure pain threshold. Larger RCTs with low risk of bias are required to confirm or refute these findings and to inform other important patient outcomes.
Cao, Xiaofeng; Wang, Jie; Jiang, Dalin; Sun, Jinhua; Huang, Yi; Luan, Shengji
2017-12-13
The establishment of numeric nutrient criteria is essential to aid the control of nutrient pollution and to protect and restore healthy ecological conditions. However, it is necessary to determine whether regional nutrient criteria can be defined in stream ecosystems with a poor ecological status. A database of periphytic diatom samples was collected in July and August of 2011 and 2012; in total, 172 samples with matching environmental variables were included. Percentile estimates, nonparametric change-point analysis (nCPA) and Threshold Indicator Taxa ANalysis (TITAN) were applied to detect reference conditions and ecological thresholds along gradients of total nitrogen (TN), total phosphorus (TP) and ammonia nitrogen (NH3-N) for the development of nutrient criteria in the streams of the Lake Dianchi basin. The results highlighted the possibility of establishing regional criteria for nutrient concentrations, which we recommend to be no more than 1.39 mg L-1 for TN, 0.04 mg L-1 for TP and 0.17 mg L-1 for NH3-N to prevent nuisance growths of tolerant taxa, and 0.38 mg L-1 for TN, 0.02 mg L-1 for TP and 0.02 mg L-1 for NH3-N to maintain high-quality waters in streams. Additionally, the influence of excessive background nutrient enrichment on the threshold response, and the ecological interaction with other stressors (HQI, etc.) in the nutrient dynamic process, need to be considered when establishing the eventual nutrient criteria, regardless of which technique is applied.
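The change-point step behind nCPA can be sketched as a single deviance-minimizing split of the biological response along the nutrient gradient; the published method additionally uses bootstrap resampling to quantify uncertainty, which is omitted in this minimal sketch.

```python
import numpy as np

def change_point(x, y):
    """Single change-point along gradient x that maximizes the reduction in
    sum-of-squares deviance of response y (the core step of nonparametric
    change-point analysis, without the bootstrap)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    order = np.argsort(x)
    x, y = x[order], y[order]
    total = ((y - y.mean()) ** 2).sum()
    best_cut, best_drop = None, -np.inf
    for i in range(1, len(x)):
        left, right = y[:i], y[i:]
        dev = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if total - dev > best_drop:
            best_drop, best_cut = total - dev, (x[i - 1] + x[i]) / 2
    return best_cut

# Toy gradient: a diatom metric that drops once TN exceeds ~1.4 mg/L.
rng = np.random.default_rng(2)
tn = rng.uniform(0.1, 3.0, 150)
metric = np.where(tn > 1.4, 20, 60) + rng.normal(0, 5, 150)
print(change_point(tn, metric))   # recovers a threshold near 1.4
```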
SART-Type Half-Threshold Filtering Approach for CT Reconstruction
YU, HENGYONG; WANG, GE
2014-01-01
The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1∕2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define the sparsity. However, the DGT is noninvertible and it cannot be applied directly to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct pseudoinverse transforms for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the counterparts of the state-of-the-art soft-threshold filtering and hard-threshold filtering. PMID:25530928
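For reference, the component-wise half-thresholding operator can be written as below, following the closed form of Xu et al. as it is commonly stated in the literature; within a SART-type scheme it would be applied to the (transformed) image estimate after each algebraic update. The usage values are arbitrary.

```python
import numpy as np

def half_threshold(x, lam):
    """Component-wise half-thresholding operator for l_{1/2} regularization
    (closed form attributed to Xu et al., 2012, as commonly stated): entries
    with magnitude below ~0.945 * lam**(2/3) are set to zero, and larger
    entries are shrunk via the cosine expression below."""
    x = np.asarray(x, dtype=float)
    t = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(x)
    big = np.abs(x) > t
    phi = np.arccos((lam / 8.0) * (np.abs(x[big]) / 3.0) ** (-1.5))
    out[big] = (2.0 / 3.0) * x[big] * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

print(half_threshold([-2.0, 0.3, 1.5], lam=1.0))
```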
Lee, Gwan-Hyoung; Cui, Xu; Kim, Young Duck; Arefe, Ghidewon; Zhang, Xian; Lee, Chul-Ho; Ye, Fan; Watanabe, Kenji; Taniguchi, Takashi; Kim, Philip; Hone, James
2015-07-28
Emerging two-dimensional (2D) semiconductors such as molybdenum disulfide (MoS2) have been intensively studied because of their novel properties for advanced electronics and optoelectronics. However, 2D materials are by nature sensitive to environmental influences, such as temperature, humidity, adsorbates, and trapped charges in neighboring dielectrics. Therefore, it is crucial to develop device architectures that provide both high performance and long-term stability. Here we report high performance of dual-gated van der Waals (vdW) heterostructure devices in which MoS2 layers are fully encapsulated by hexagonal boron nitride (hBN) and contacts are formed using graphene. The hBN-encapsulation provides excellent protection from environmental factors, resulting in highly stable device performance, even at elevated temperatures. Our measurements also reveal high-quality electrical contacts and reduced hysteresis, leading to high two-terminal carrier mobility (33-151 cm(2) V(-1) s(-1)) and low subthreshold swing (80 mV/dec) at room temperature. Furthermore, adjustment of graphene Fermi level and use of dual gates enable us to separately control contact resistance and threshold voltage. This novel vdW heterostructure device opens up a new way toward fabrication of stable, high-performance devices based on 2D materials.
van Brunschot, Sharon L.; Bergervoet, Jan H. W.; Pagendam, Daniel E.; de Weerdt, Marjanne; Geering, Andrew D. W.; Drenth, André; van der Vlugt, René A. A.
2014-01-01
Efficient and reliable diagnostic tools for the routine indexing and certification of clean propagating material are essential for the management of pospiviroid diseases in horticultural crops. This study describes the development of a true multiplexed diagnostic method for the detection and identification of all nine currently recognized pospiviroid species in one assay using Luminex bead-based suspension array technology. In addition, a new data-driven, statistical method is presented for establishing thresholds for positivity for individual assays within multiplexed arrays. When applied to the multiplexed array data generated in this study, the new method was shown to have better control of false positives and false negative results than two other commonly used approaches for setting thresholds. The 11-plex Luminex MagPlex-TAG pospiviroid array described here has a unique hierarchical assay design, incorporating a near-universal assay in addition to nine species-specific assays, and a co-amplified plant internal control assay for quality assurance purposes. All assays of the multiplexed array were shown to be 100% specific, sensitive and reproducible. The multiplexed array described herein is robust, easy to use, displays unambiguous results and has strong potential for use in routine pospiviroid indexing to improve disease management strategies. PMID:24404188
Robust Adaptive Thresholder For Document Scanning Applications
NASA Astrophysics Data System (ADS)
Hsing, To R.
1982-12-01
In document scanning applications, thresholding is used to obtain binary data from a scanner. However, due to (1) a wide range of different color backgrounds, (2) density variations of the printed text, and (3) shading effects caused by the optical system, adaptive thresholding is highly desirable for enhancing the useful information. This paper describes a new robust adaptive thresholder for obtaining valid binary images. It is basically a memory-type algorithm which dynamically updates the black and white reference levels to optimize a local adaptive threshold function. The algorithm produces high image quality from different types of simulated test patterns. The software algorithm is described, and experimental results are presented to illustrate the procedure. Results also show that the techniques described here can be used for real-time signal processing in a variety of applications.
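A plausible reconstruction of such a memory-type thresholder is sketched below: running white (background) and black (text) reference levels are updated exponentially, and each pixel is compared against their midpoint. The update rule and constants are assumptions for illustration, not the original algorithm.

```python
def adaptive_binarize(scanline, alpha=0.05, init_white=220.0, init_black=40.0):
    """Memory-type adaptive thresholding of one scanline: the local white
    (background) and black (text) reference levels are tracked with an
    exponential update so the threshold follows shading and density changes."""
    white, black = init_white, init_black
    bits = []
    for p in scanline:
        threshold = (white + black) / 2.0
        if p >= threshold:                    # background pixel
            bits.append(0)
            white += alpha * (p - white)      # track slow shading changes
        else:                                 # text pixel
            bits.append(1)
            black += alpha * (p - black)      # track ink density changes
    return bits

print(adaptive_binarize([230, 225, 60, 55, 210, 40, 200]))
```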
Seven newly identified loci for autoimmune thyroid disease.
Cooper, Jason D; Simmonds, Matthew J; Walker, Neil M; Burren, Oliver; Brand, Oliver J; Guo, Hui; Wallace, Chris; Stevens, Helen; Coleman, Gillian; Franklyn, Jayne A; Todd, John A; Gough, Stephen C L
2012-12-01
Autoimmune thyroid disease (AITD), including Graves' disease (GD) and Hashimoto's thyroiditis (HT), is one of the most common of the immune-mediated diseases. To further investigate the genetic determinants of AITD, we conducted an association study using a custom-made single-nucleotide polymorphism (SNP) array, the ImmunoChip. The SNP array contains all known and genotype-able SNPs across 186 distinct susceptibility loci associated with one or more immune-mediated diseases. After stringent quality control, we analysed 103 875 common SNPs (minor allele frequency >0.05) in 2285 GD and 462 HT patients and 9364 controls. We found evidence for seven new AITD risk loci (P < 1.12 x 10^-6, a permutation-test-derived significance threshold): five at locations previously associated with other immune-mediated diseases and two at locations awaiting confirmation.
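Permutation-derived significance thresholds of this kind are typically obtained by permuting the case/control labels and recording the minimum p-value across SNPs in each permutation. The sketch below uses a simple allele-frequency z-test per SNP and is a generic illustration only, not the authors' analysis pipeline.

```python
import numpy as np
from math import erfc, sqrt

def snp_pvalue(case_alt, case_n, ctrl_alt, ctrl_n):
    """Two-sided p-value for a difference in allele frequency between cases
    and controls (normal approximation to the two-proportion test)."""
    p1, p2 = case_alt / case_n, ctrl_alt / ctrl_n
    p = (case_alt + ctrl_alt) / (case_n + ctrl_n)
    se = sqrt(p * (1 - p) * (1 / case_n + 1 / ctrl_n))
    z = 0.0 if se == 0 else (p1 - p2) / se
    return erfc(abs(z) / sqrt(2))

def permutation_threshold(genotypes, is_case, n_perm=1000, alpha=0.05, seed=0):
    """Study-wide significance threshold: permute case/control labels, record
    the minimum p-value across SNPs each time, and take the alpha quantile
    of those minima."""
    rng = np.random.default_rng(seed)
    geno = np.asarray(genotypes)      # shape (n_subjects, n_snps), allele counts 0/1/2
    labels = np.asarray(is_case, bool)
    mins = []
    for _ in range(n_perm):
        perm = rng.permutation(labels)
        case, ctrl = geno[perm], geno[~perm]
        pvals = [snp_pvalue(case[:, j].sum(), 2 * case.shape[0],
                            ctrl[:, j].sum(), 2 * ctrl.shape[0])
                 for j in range(geno.shape[1])]
        mins.append(min(pvals))
    return float(np.quantile(mins, alpha))
```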
Approaches to Identify Exceedances of Water Quality Thresholds Associated with Ocean Conditions
WED scientists have developed a method to help distinguish whether failures to meet water quality criteria are associated with natural coastal upwelling by using the statistical approach of logistic regression. Estuaries along the west coast of the United States periodically ha...
A Parametric Oscillator Experiment for Undergraduates
NASA Astrophysics Data System (ADS)
Huff, Alison; Thompson, Johnathon; Pate, Jacob; Kim, Hannah; Chiao, Raymond; Sharping, Jay
We describe an upper-division undergraduate-level analytic mechanics experiment or classroom demonstration of a weakly-damped pendulum driven into parametric resonance. Students can derive the equations of motion from first principles and extract key oscillator features, such as quality factor and parametric gain, from experimental data. The apparatus is compact, portable and easily constructed from inexpensive components. Motion control and data acquisition are accomplished using an Arduino micro-controller incorporating a servo motor, laser sensor, and data logger. We record the passage time of the pendulum through its equilibrium position and obtain the maximum speed per oscillation as a function of time. As examples of the interesting physics which the experiment reveals, we present contour plots depicting the energy of the system as functions of driven frequency and modulation depth. We observe the transition to steady state oscillation and compare the experimental oscillation threshold with theoretical expectations. A thorough understanding of this hands-on laboratory exercise provides a foundation for current research in quantum information and opto-mechanics, where damped harmonic motion, quality factor, and parametric amplification are central.
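One way to turn such passage-time data into the quantities mentioned above is sketched here: the maximum speed in each oscillation is estimated from the time the bob blocks the laser beam at equilibrium, and the parametric gain from the exponential growth of those speeds below saturation. The bob diameter and the data layout are assumptions about the apparatus, not specifics given in the abstract.

```python
import numpy as np

def max_speeds(block_times_s, bob_diameter_m=0.02):
    """Maximum speed for each pass through equilibrium, estimated as
    bob diameter / beam-blocking duration (speed is greatest at the
    equilibrium point, where the laser sensor sits)."""
    return bob_diameter_m / np.asarray(block_times_s, dtype=float)

def parametric_gain(pass_times_s, speeds):
    """Exponential growth rate of the oscillation amplitude below saturation:
    fit log(v_max) = log(v0) + g*t and return g (per second)."""
    g, _ = np.polyfit(np.asarray(pass_times_s, float), np.log(speeds), 1)
    return g

# Toy data: blocking durations shrink as the drive pumps energy in.
t = np.arange(0, 10, 0.5)                     # passage times (s)
blocks = 0.02 / (0.1 * np.exp(0.12 * t))      # implied blocking durations
v = max_speeds(blocks)
print(parametric_gain(t, v))                  # recovers ~0.12 per second
```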
Saco-Alvarez, Liliana; Durán, Iria; Ignacio Lorenzo, J; Beiras, Ricardo
2010-05-01
The sea-urchin embryo test (SET) has been frequently used as a rapid, sensitive, and cost-effective biological tool for marine monitoring worldwide, but the selection of a sensitive, objective, and automatically readable endpoint, a stricter quality control to guarantee optimum handling and biological material, and the identification of confounding factors that interfere with the response have hampered its widespread routine use. Size increase in a minimum of n=30 individuals per replicate, either normal larvae or earlier developmental stages, was preferred to observer-dependent, discontinuous responses as test endpoint. Control size increase after 48 h incubation at 20 degrees C must meet an acceptability criterion of 218 microm. In order to avoid false positives minimums of 32 per thousand salinity, 7 pH and 2mg/L oxygen, and a maximum of 40 microg/L NH(3) (NOEC) are required in the incubation media. For in situ testing size increase rates must be corrected on a degree-day basis using 12 degrees C as the developmental threshold. Copyright 2010 Elsevier Inc. All rights reserved.
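The degree-day correction mentioned for in situ testing can be sketched as follows, accumulating thermal time above the 12 degrees C developmental threshold and normalizing the observed size increase by it; the exact correction applied by the authors may differ in detail.

```python
def degree_days(temps_c, interval_hours, threshold_c=12.0):
    """Accumulated degree-days above the developmental threshold (12 C in the
    paper) from temperature readings taken every `interval_hours`."""
    step_days = interval_hours / 24.0
    return sum(max(t - threshold_c, 0.0) * step_days for t in temps_c)

def growth_per_degree_day(size_increase_um, temps_c, interval_hours=1.0):
    """Normalize larval size increase by thermal history so that in situ
    deployments at different temperatures can be compared."""
    dd = degree_days(temps_c, interval_hours)
    return size_increase_um / dd if dd > 0 else float("nan")

# 48 hourly readings at a constant 20 C give 16 degree-days; a 218 um
# control increase then corresponds to ~13.6 um per degree-day.
print(growth_per_degree_day(218.0, [20.0] * 48, interval_hours=1.0))
```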
Relationship of extinction to perceptual thresholds for single stimuli.
Meador, K J; Ray, P G; Day, L J; Loring, D W
2001-04-24
To demonstrate the effects of target stimulus intensity on extinction to double simultaneous stimuli. Attentional deficits contribute to extinction in patients with brain lesions, but extinction (i.e., masking) can also be produced in healthy subjects. The relationship of extinction to perceptual thresholds for single stimuli remains uncertain. Brief electrical pulses were applied simultaneously to the left and right index fingers of 16 healthy volunteers (8 young and 8 elderly adults) and 4 patients with right brain stroke (RBS). The stimulus to be perceived (i.e., target stimulus) was given at the lowest perceptual threshold to perceive any single stimulus (i.e., Minimal) and at the threshold to perceive 100% of single stimuli. The mask stimulus (i.e., stimulus given to block the target) was applied to the contralateral hand at intensities just below discomfort. Extinction was less for target stimuli at 100% than Minimal threshold for healthy subjects. Extinction of left targets was greater in patients with RBS than elderly control subjects. Left targets were extinguished less than right in healthy subjects. In contrast, the majority of left targets were extinguished in patients with RBS even when right mask intensity was reduced below right 100% threshold for single stimuli. RBS patients had less extinction for right targets despite having greater left mask - threshold difference than control subjects. In patients with RBS, right "targets" at 100% threshold extinguished left "masks" (20%) almost as frequently as left masks extinguished right targets (32%). Subtle changes in target intensity affect extinction in healthy adults. Asymmetries in mask and target intensities (relative to single-stimulus perceptual thresholds) affect extinction in RBS patients less for left targets but more for right targets as compared with control subjects.
Longobardo, G S; Evangelisti, C J; Cherniack, N S
2009-12-01
We examined the effect of arousals (shifts from sleep to wakefulness) on breathing during sleep using a mathematical model. The model consisted of a description of the fluid dynamics and mechanical properties of the upper airways and lungs, as well as a controller sensitive to arterial and brain changes in CO2, changes in arterial oxygen, and a neural input, alertness. The body was divided into multiple gas store compartments connected by the circulation. Cardiac output was constant, and cerebral blood flows were sensitive to changes in O2 and CO2 levels. Arousal was considered to occur instantaneously when afferent respiratory chemical and neural stimulation reached a threshold value, while sleep occurred when stimulation fell below that value. In the case of rigid and nearly incompressible upper airways, lowering arousal threshold decreased the stability of breathing and led to the occurrence of repeated apnoeas. In more compressible upper airways, to maintain stability, increasing arousal thresholds and decreasing elasticity were linked approximately linearly, until at low elastances arousal thresholds had no effect on stability. Increased controller gain promoted instability. The architecture of apnoeas during unstable sleep changed with the arousal threshold and decreases in elasticity. With rigid airways, apnoeas were central. With lower elastances, apnoeas were mixed even with higher arousal thresholds. With very low elastances and still higher arousal thresholds, sleep consisted totally of obstructed apnoeas. Cycle lengths shortened as the sleep architecture changed from mixed apnoeas to total obstruction. Deeper sleep also tended to promote instability by increasing plant gain. These instabilities could be countered by arousal threshold increases which were tied to deeper sleep or accumulated aroused time, or by decreased controller gains.
Tillotson, S L; Fuggle, P W; Smith, I; Ades, A E; Grant, D B
1994-08-13
To assess whether early treatment of congenital hypothyroidism fully prevents intellectual impairment. A national register of children with congenital hypothyroidism who were compared with unaffected children from the same school classes and matched for age, sex, social class, and first language. First three years (1982-4) of a neonatal screening programme in England, Wales, and Northern Ireland. 361 children with congenital hypothyroidism given early treatment and 315 control children. Intelligence quotient (IQ) measured at school entry at 5 years of age with the Wechsler preschool and primary scale of intelligence. There was a discontinuous relation between IQ and plasma thyroxine concentration at diagnosis, with a threshold at 42.8 nmol/l (95% confidence interval 35.2 to 47.1 nmol/l). Hypothyroid children with thyroxine values below 42.8 nmol/l had a mean IQ 10.3 points (6.9 to 13.7 points) lower than those with higher values and than controls. None of the measures of quality of treatment (age at start of treatment (range 1-173 days), average thyroxine dose (12-76 micrograms in the first year), average thyroxine concentration during treatment (79-234 nmol/l in the first year), and thyroxine concentration less than 103 nmol/l at least once during the first year) influenced IQ at age 5. Despite early treatment in congenital hypothyroidism the disease severity has a threshold effect on brain development, probably determined prenatally. The 55% of infants with more severe disease continue to show clinically significant intellectual impairment; infants with milder disease show no such impairment. The findings predict that 10% of early treated infants with severe hypothyroidism, compared with around 40% of those who presented with symptoms in the period before screening began, are likely to require special education.
Hemispheric Lateralization of Motor Thresholds in Relation to Stuttering
Alm, Per A.; Karlsson, Ragnhild; Sundberg, Madeleine; Axelson, Hans W.
2013-01-01
Stuttering is a complex speech disorder. Previous studies indicate a tendency towards elevated motor threshold for the left hemisphere, as measured using transcranial magnetic stimulation (TMS). This may reflect a monohemispheric motor system impairment. The purpose of the study was to investigate the relative side-to-side difference (asymmetry) and the absolute levels of motor threshold for the hand area, using TMS in adults who stutter (n = 15) and in controls (n = 15). In accordance with the hypothesis, the groups differed significantly regarding the relative side-to-side difference of finger motor threshold (p = 0.0026), with the stuttering group showing higher motor threshold of the left hemisphere in relation to the right. Also the absolute level of the finger motor threshold for the left hemisphere differed between the groups (p = 0.049). The obtained results, together with previous investigations, provide support for the hypothesis that stuttering tends to be related to left hemisphere motor impairment, and possibly to a dysfunctional state of bilateral speech motor control. PMID:24146930
Danzon, Patricia M; Drummond, Michael F; Towse, Adrian; Pauly, Mark V
2018-02-01
The fourth section of our Special Task Force report focuses on a health plan or payer's technology adoption or reimbursement decision, given the array of technologies, on the basis of their different values and costs. We discuss the role of budgets, thresholds, opportunity costs, and affordability in making decisions. First, we discuss the use of budgets and thresholds in private and public health plans, their interdependence, and connection to opportunity cost. Essentially, each payer should adopt a decision rule about what is good value for money given their budget; consistent use of a cost-per-quality-adjusted life-year threshold will ensure the maximum health gain for the budget. In the United States, different public and private insurance programs could use different thresholds, reflecting the differing generosity of their budgets and implying different levels of access to technologies. In addition, different insurance plans could consider different additional elements to the quality-adjusted life-year metric discussed elsewhere in our Special Task Force report. We then define affordability and discuss approaches to deal with it, including consideration of disinvestment and related adjustment costs, the impact of delaying new technologies, and comparative cost effectiveness of technologies. Over time, the availability of new technologies may increase the amount that populations want to spend on health care. We then discuss potential modifiers to thresholds, including uncertainty about the evidence used in the decision-making process. This article concludes by discussing the application of these concepts in the context of the pluralistic US health care system, as well as the "excess burden" of tax-financed public programs versus private programs. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Method and system for controlling a rotational speed of a rotor of a turbogenerator
Stahlhut, Ronnie Dean; Vuk, Carl Thomas
2008-12-30
A system and method controls a rotational speed of a rotor or shaft of a turbogenerator in accordance with a present voltage level on a direct current bus. A lower threshold and a higher threshold are established for a speed of a rotor or shaft of a turbogenerator. A speed sensor determines speed data or a speed signal for the rotor or shaft associated with a turbogenerator. A voltage regulator adjusts a voltage level associated with a direct current bus within a target voltage range if the speed data or speed signal indicates that the speed is above the higher threshold or below the lower threshold.
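A minimal sketch of the threshold-band logic the abstract describes. The thresholds, voltage limits, step size, and the direction of each adjustment are all illustrative assumptions, since the patent abstract does not specify them.

```python
# Hypothetical threshold-band speed regulation: when the measured rotor speed leaves
# the band defined by the lower and upper thresholds, the DC-bus voltage set point
# is nudged back toward a target range. All numbers are invented for illustration.

LOWER_RPM, UPPER_RPM = 28_000.0, 32_000.0      # assumed rotor speed thresholds
V_TARGET_MIN, V_TARGET_MAX = 650.0, 700.0      # assumed DC-bus target voltage range (V)
V_STEP = 2.0                                   # assumed regulator adjustment step (V)

def regulate_bus_voltage(rotor_speed_rpm: float, bus_voltage_v: float) -> float:
    """Return a new DC-bus voltage set point given the measured rotor speed."""
    if rotor_speed_rpm > UPPER_RPM:
        # Assumed convention: raise the bus voltage to load the rotor and slow it down.
        bus_voltage_v = min(bus_voltage_v + V_STEP, V_TARGET_MAX)
    elif rotor_speed_rpm < LOWER_RPM:
        # Assumed convention: lower the bus voltage to unload the turbogenerator.
        bus_voltage_v = max(bus_voltage_v - V_STEP, V_TARGET_MIN)
    return bus_voltage_v

print(regulate_bus_voltage(33_500.0, 660.0))   # -> 662.0
```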
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syh, J; Ding, X; Syh, J
2015-06-15
Purpose: An approved proton pencil beam scanning (PBS) treatment plan might not be deliverable because of extremely low monitor units (MU) on some beam spots. A hybrid plan combining the efficiency of higher spot MU with the efficacy of fewer energy layers was sought and optimized. The range of MU threshold settings was investigated, and plan quality was evaluated by target dose conformity. Methods: Certain limitations and requirements need to be checked and tested before a nominal proton PBS treatment plan can be delivered. The plan must meet the machine characterization and the specifications in the record-and-verify system before the beams can be delivered. A minimum MU threshold per spot, e.g., 0.02, was set to filter out the low-count spots, and the plan was recomputed. Further MU threshold increments were tested in sequence without sacrificing plan quality. The number of energy layers also changed because low-count layers were eliminated. Results: The minimum MU/spot threshold, the spot spacing in each energy layer, the total number of energy layers, and the MU weighting of the beam spots of each beam were evaluated. The plan was optimized to balance increased spot MU (efficiency) against fewer delivered energy layers (efficacy). A 5% weighting limit of the total MU per beam was feasible. Sparse spreading of beam spots was acceptable as long as target dose conformity stayed within the 3% criterion. Conclusion: Each spot size corresponds to the relative dose in the beam delivery system, and the energy layer is associated with the depth of the target tumor. This work is crucial for maintaining the best possible plan quality. Preserving the integrity of all intrinsic elements, such as spot size, spot number, layer number, and the weighting carried by the spots in each layer, is important in this study.
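A simplified sketch (not the authors' treatment-planning code) of the spot-filtering step described above: spots below the minimum MU threshold are removed, the surviving weights are renormalized to preserve the total MU, and energy layers left empty are dropped. The data structures and numbers are invented.

```python
import numpy as np

MU_THRESHOLD = 0.02   # minimum MU per spot, as in the abstract's example

# layers: {energy in MeV: array of spot MUs}; values are invented for illustration
layers = {
    120.0: np.array([0.010, 0.350, 0.420]),
    115.0: np.array([0.015, 0.008]),
    110.0: np.array([0.600, 0.250, 0.030]),
}

total_mu = sum(spots.sum() for spots in layers.values())

# Remove low-MU spots, then drop any energy layer that is left with no spots.
filtered = {e: s[s >= MU_THRESHOLD] for e, s in layers.items()}
filtered = {e: s for e, s in filtered.items() if s.size > 0}

# Renormalize the remaining spot weights so the total MU of the field is preserved.
kept_mu = sum(s.sum() for s in filtered.values())
filtered = {e: s * (total_mu / kept_mu) for e, s in filtered.items()}

print(f"Energy layers kept: {len(filtered)} of {len(layers)}")
```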
Zhang, Haihua; Wu, Yishi; Liao, Qing; Zhang, Zhaoyi; Liu, Yanping; Gao, Qinggang; Liu, Peng; Li, Meili; Yao, Jiannian; Fu, Hongbing
2018-06-25
Miniaturized nanowire nanolasers of 3D perovskites feature a high gain coefficient; however, room-temperature optical gain and nanowire lasers from 2D layered perovskites have not been reported to date. A biomimetic approach is presented to construct an artificial light-harvesting system in mixed multiple quantum wells (QWs) of 2D Ruddlesden-Popper perovskites (2D-RPPs) of (BA)2(FA)n-1PbnBr3n+1, achieving room-temperature amplified spontaneous emission (ASE) and nanowire (NW) lasing. Owing to the flexibility and deformability provided by the organic BA cation layers, high-density large-area NW laser arrays were fabricated with high photostability. Well-controlled dimensions and uniform geometries enabled the 2D-RPP NWs to function as high-quality Fabry-Perot (FP) lasers with almost identical optical modes, a high quality (Q) factor (ca. 1800), and similarly low lasing thresholds. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Audiological manifestations in HIV-positive adults.
Matas, Carla Gentile; Angrisani, Rosanna Giaffredo; Magliaro, Fernanda Cristina Leite; Segurado, Aluisio Augusto Cotrim
2014-07-01
To characterize the findings of behavioral hearing assessment in HIV-positive individuals who received and did not receive antiretroviral treatment. This research was a cross-sectional study. The participants were 45 HIV-positive individuals (18 not exposed and 27 exposed to antiretroviral treatment) and 30 control-group individuals. All subjects completed an audiological evaluation through pure-tone audiometry, speech audiometry, and high-frequency audiometry. The hearing thresholds obtained by pure-tone audiometry were different between groups. The group that had received antiretroviral treatment had higher thresholds for the frequencies ranging from 250 to 3000 Hz compared with the control group and the group not exposed to treatment. In the range of frequencies from 4000 through 8000 Hz, the HIV-positive groups presented with higher thresholds than did the control group. The hearing thresholds determined by high-frequency audiometry were different between groups, with higher thresholds in the HIV-positive groups. HIV-positive individuals presented poorer results in pure-tone and high-frequency audiometry, suggesting impairment of the peripheral auditory pathway. Individuals who received antiretroviral treatment presented poorer results on both tests compared with individuals not exposed to antiretroviral treatment.
Meads, David M; Marshall, Andrea; Hulme, Claire T; Dunn, Janet A; Ford, Hugo E R
2016-01-01
The COUGAR-02 trial recently showed survival and quality-of-life benefits of docetaxel and active symptom control (DXL + ASC) over active symptom control (ASC) alone in patients with refractory oesophagogastric adenocarcinoma. The aim of this study was to conduct an economic evaluation conforming to National Institute for Health and Care Excellence (NICE) technology appraisal guidance to evaluate the cost effectiveness of DXL + ASC versus ASC from the perspective of the English National Health Service (NHS). Cost-utility analyses were conducted using trial data. Utility values were captured using the EQ-5D completed by patients at 3- and 6-weekly intervals, while resource use was captured using nurse-completed report forms and patient reports. Incremental cost-effectiveness ratios (ICERs) were calculated and the main outcome was cost per incremental quality-adjusted life-year (QALY). Nonparametric bootstrapping was conducted to capture sampling uncertainty and to generate a cost-effectiveness acceptability curve (CEAC). The analysis horizon was the trial period (median follow-up 12 months) and no modelling or discounting of future costs and benefits was conducted. Average costs were £9352 and £6218 for DXL + ASC and ASC, respectively, and average QALYs were 0.302 and 0.186, respectively. This yielded an ICER of £27,180 for DXL + ASC. DXL + ASC had a 24 % chance of being cost effective at a £20,000 QALY threshold (lambda) and a mean net monetary benefit of -£821; this rose to 59 % and £332 when the threshold was raised to £30,000. If NICE end-of-life criteria are applied, the probability of cost effectiveness increases to 90 % (at lambda = £50,000). Results were robust to sensitivity analyses. DXL + ASC is likely to be cost effective if an end-of-life premium is applied. Further research should determine the impact of different utility measurement strategies and different chemotherapy delivery modes on estimates of cost effectiveness.
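The arithmetic behind the reported ICER and net monetary benefit can be checked directly from the rounded averages quoted above; the sketch below is purely illustrative, and small discrepancies with the published figures are expected because the inputs are rounded.

```python
# Worked check of the headline cost-effectiveness numbers using rounded inputs.
cost_dxl_asc, cost_asc = 9352.0, 6218.0       # average costs (GBP)
qaly_dxl_asc, qaly_asc = 0.302, 0.186         # average QALYs

delta_cost = cost_dxl_asc - cost_asc          # 3134
delta_qaly = qaly_dxl_asc - qaly_asc          # 0.116

icer = delta_cost / delta_qaly                # ~27,000 GBP/QALY (abstract reports 27,180)
nmb_20k = 20_000 * delta_qaly - delta_cost    # ~ -814 GBP (abstract reports -821)
nmb_30k = 30_000 * delta_qaly - delta_cost    # ~ +346 GBP (abstract reports +332)

print(f"ICER ≈ £{icer:,.0f}/QALY, NMB(£20k) ≈ £{nmb_20k:,.0f}, NMB(£30k) ≈ £{nmb_30k:,.0f}")
```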
Cartographic quality of ERTS-1 images
NASA Technical Reports Server (NTRS)
Welch, R. I.
1973-01-01
Analyses of simulated and operational ERTS images have provided initial estimates of resolution, ground resolution, detectability thresholds and other measures of image quality of interest to earth scientists and cartographers. Based on these values, including an approximate ground resolution of 250 meters for both RBV and MSS systems, the ERTS-1 images appear suited to the production and/or revision of planimetric and photo maps of 1:500,000 scale and smaller for which map accuracy standards are compatible with the imaged detail. Thematic mapping, although less constrained by map accuracy standards, will be influenced by measurement thresholds and errors which have yet to be accurately determined for ERTS images. This study also indicates the desirability of establishing a quantitative relationship between image quality values and map products which will permit both engineers and cartographers/earth scientists to contribute to the design requirements of future satellite imaging systems.
Implications of Transaction Costs for Acquisition Program Cost Breaches
2013-06-01
scope of the work, communicating the basis on which the estimate is built, identifying the quality of the data, determining the level of risk, and...projects such as bases, schools, missile storage facilities, maintenance facilities, medical/dental clinics, libraries, and military family housing...was established as a threshold for measuring cost growth. This prevents a program from rebaselining to avoid a Nunn-McCurdy cost threshold breach. In
Concurrent Transmission Based on Channel Quality in Ad Hoc Networks: A Game Theoretic Approach
NASA Astrophysics Data System (ADS)
Chen, Chen; Gao, Xinbo; Li, Xiaoji; Pei, Qingqi
In this paper, a decentralized concurrent transmission strategy for a shared channel in ad hoc networks is proposed based on game theory. First, a static concurrent transmission game is used to determine the candidates for transmission by a channel quality threshold and to maximize the overall throughput while accounting for channel quality variation. To reach the NES (Nash Equilibrium Solution), the selfish behavior of nodes attempting to improve their channel gain unilaterally is evaluated. This game therefore allows each node to decide, in a distributed manner, whether or not to transmit concurrently with others depending on the NES. Second, because there are always some nodes with channel gain lower than the NES, defined here as hunger nodes, a hunger suppression scheme is proposed that adjusts the price function with interference reservation and forward relay to give hunger nodes fair transmission opportunities. Finally, inspired by stock trading, a dynamic concurrent transmission threshold determination scheme is implemented to make the static game practical. Numerical results show that the proposed scheme increases concurrent transmission opportunities for active nodes while greatly reducing the number of hunger nodes, with only a minimal increase of the threshold due to interference reservation. The results also show that the proposed model achieves good network goodput.
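An intentionally simplified toy version of the core idea (not the authors' game formulation): each node transmits concurrently only if its channel gain clears a shared quality threshold, and the nodes left below the threshold are the "hunger" nodes that the paper's pricing scheme is designed to help. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def concurrent_transmitters(channel_gains, threshold):
    """Return the indices of nodes whose channel gain qualifies them to transmit."""
    return [i for i, g in enumerate(channel_gains) if g >= threshold]

gains = rng.rayleigh(scale=1.0, size=10)       # assumed fading channel gains for 10 nodes
threshold = 1.2                                # assumed channel-quality threshold

active = concurrent_transmitters(gains, threshold)
hunger = [i for i in range(len(gains)) if i not in active]   # nodes below the threshold
print(f"Transmitting nodes: {active}; hunger nodes: {hunger}")
```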
Receiver Operating Characteristic Curve Analysis of Beach Water Quality Indicator Variables
Morrison, Ann Michelle; Coughlin, Kelly; Shine, James P.; Coull, Brent A.; Rex, Andrea C.
2003-01-01
Receiver operating characteristic (ROC) curve analysis is a simple and effective means to compare the accuracies of indicator variables of bacterial beach water quality. The indicator variables examined in this study were previous day's Enterococcus density and antecedent rainfall at 24, 48, and 96 h. Daily Enterococcus densities and 15-min rainfall values were collected during a 5-year (1996 to 2000) study of four Boston Harbor beaches. The indicator variables were assessed for their ability to correctly classify water as suitable or unsuitable for swimming at a maximum threshold Enterococcus density of 104 CFU/100 ml. Sensitivity and specificity values were determined for each unique previous day's Enterococcus density and antecedent rainfall volume and used to construct ROC curves. The area under the ROC curve was used to compare the accuracies of the indicator variables. Twenty-four-hour antecedent rainfall classified elevated Enterococcus densities more accurately than previous day's Enterococcus density (P = 0.079). An empirically derived threshold for 48-h antecedent rainfall, corresponding to a sensitivity of 0.75, was determined from the 1996 to 2000 data and evaluated to ascertain if the threshold would produce a 0.75 sensitivity with independent water quality data collected in 2001 from the same beaches. PMID:14602593
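A minimal sketch of the ROC construction described above, using synthetic rainfall and exceedance data rather than the Boston Harbor record: each observed rainfall value is treated as a decision threshold, sensitivity and 1-specificity are computed against exceedances of the 104 CFU/100 ml standard, and the area under the curve is accumulated by the trapezoid rule.

```python
import numpy as np

rng = np.random.default_rng(1)
rainfall = rng.gamma(shape=1.5, scale=10.0, size=300)           # synthetic antecedent rain (mm)
p_exceed = 1.0 / (1.0 + np.exp(-(rainfall - 15.0) / 5.0))       # synthetic exceedance probability
exceed = (rng.uniform(size=300) < p_exceed).astype(int)         # 1 = Enterococcus > 104 CFU/100 ml

thresholds = np.sort(np.unique(rainfall))[::-1]
tpr = np.array([np.mean(rainfall[exceed == 1] >= t) for t in thresholds])   # sensitivity
fpr = np.array([np.mean(rainfall[exceed == 0] >= t) for t in thresholds])   # 1 - specificity

auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0)         # trapezoidal area under the ROC curve
print(f"AUC ≈ {auc:.2f}")
```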
A threshold model of content knowledge transfer for socioscientific argumentation
NASA Astrophysics Data System (ADS)
Sadler, Troy D.; Fowler, Samantha R.
2006-11-01
This study explores how individuals make use of scientific content knowledge for socioscientific argumentation. More specifically, this mixed-methods study investigates how learners apply genetics content knowledge as they justify claims relative to genetic engineering. Interviews are conducted with 45 participants, representing three distinct groups: high school students with variable genetics knowledge, college nonscience majors with little genetics knowledge, and college science majors with advanced genetics knowledge. During the interviews, participants advance positions concerning three scenarios dealing with gene therapy and cloning. Arguments are assessed in terms of the number of justifications offered as well as justification quality, based on a five-point rubric. Multivariate analysis of variance results indicate that college science majors outperformed the other groups in terms of justification quality and frequency. Argumentation does not differ among nonscience majors or high school students. Follow-up qualitative analyses of interview responses suggest that all three groups tend to focus on similar, sociomoral themes as they negotiate socially complex, genetic engineering issues, but that the science majors frequently reference specific science content knowledge in the justification of their claims. Results support the Threshold Model of Content Knowledge Transfer, which proposes two knowledge thresholds around which argumentation quality can reasonably be expected to increase. Research and educational implications of these findings are discussed.
Messali, Andrew; Hay, Joel W.; Villacorta, Reginald
2013-01-01
Background The objective of this work was to determine the cost-effectiveness of temozolomide compared with that of radiotherapy alone in the adjuvant treatment of newly diagnosed glioblastoma. Temozolomide is the only chemotherapeutic agent to have demonstrated a significant survival benefit in a randomized clinical trial. Our analysis builds on earlier work by incorporating caregiver time costs and generic temozolomide availability. It is also the first analysis applicable to the US context. Methods A systematic literature review was conducted to collect relevant data. Transition probabilities were calculated from randomized controlled trial data comparing temozolomide plus radiotherapy with radiotherapy alone. Direct costs were calculated from charges reported by the Mayo Clinic. Utilities were obtained from a previous cost-utility analysis. Using these data, a Markov model with a 1-month cycle length and 5-year time horizon was constructed. Results The addition of brand Temodar and generic temozolomide to the standard radiotherapy regimen was associated with base-case incremental cost-effectiveness ratios of $102,364 and $8875, respectively, per quality-adjusted life-year. The model was most sensitive to the progression-free survival associated with the use of only radiotherapy. Conclusions Both the brand and generic base-case estimates are cost-effective under a willingness-to-pay threshold of $150,000 per quality-adjusted life-year. All 1-way sensitivity analyses produced incremental cost-effectiveness ratios below this threshold. We conclude that both the brand Temodar and generic temozolomide are cost-effective treatments for newly diagnosed glioblastoma within the US context. However, assuming that the generic product produces equivalent quality of life and survival benefits, it would be significantly more cost-effective than the brand option. PMID:23935155
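As an illustration of the model structure described (a Markov cohort model with 1-month cycles and a 5-year horizon), the sketch below builds a skeletal three-state trace. The transition probabilities, costs, and utilities are invented placeholders, not the published inputs.

```python
import numpy as np

states = ["stable", "progressed", "dead"]
P = np.array([                       # monthly transition probabilities (assumed)
    [0.90, 0.08, 0.02],
    [0.00, 0.88, 0.12],
    [0.00, 0.00, 1.00],
])
cost_per_cycle = np.array([4000.0, 2500.0, 0.0])    # USD per month in each state (assumed)
utility = np.array([0.80, 0.60, 0.0])               # health-state utilities (assumed)

cohort = np.array([1.0, 0.0, 0.0])                  # everyone starts in the stable state
total_cost, total_qalys = 0.0, 0.0
for _ in range(60):                                 # 60 monthly cycles = 5-year horizon
    cohort = cohort @ P
    total_cost += float(cohort @ cost_per_cycle)
    total_qalys += float(cohort @ utility) / 12.0   # convert per-month utility to QALYs

print(f"Expected cost ≈ ${total_cost:,.0f}, expected QALYs ≈ {total_qalys:.2f}")
```

Running the trace once with treatment-arm transition probabilities and once with the comparator's, then dividing the cost difference by the QALY difference, gives an ICER in the same way as the published model.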
Cury, Rubens G; Galhardoni, Ricardo; Teixeira, Manoel J; Dos Santos Ghilardi, Maria G; Silva, Valquiria; Myczkowski, Martin L; Marcolin, Marco A; Barbosa, Egberto R; Fonoff, Erich T; Ciampi de Andrade, Daniel
2016-12-01
Subthalamic deep brain stimulation (STN-DBS) is used to treat refractory motor complications in Parkinson disease (PD), but its effects on nonmotor symptoms remain uncertain. Up to 80% of patients with PD may have pain relief after STN-DBS, but it is unknown whether its analgesic properties are related to potential effects on sensory thresholds or secondary to motor improvement. We have previously reported significant and long-lasting pain relief after DBS, which did not correlate with motor symptomatic control. Here we present secondary data exploring the effects of DBS on sensory thresholds in a controlled way and have explored the relationship between these changes and clinical pain and motor improvement after surgery. Thirty-seven patients were prospectively evaluated before STN-DBS and 12 months after the procedure compared with healthy controls. Compared with baseline, patients with PD showed lower thermal and mechanical detection and higher cold pain thresholds after surgery. There were no changes in heat and mechanical pain thresholds. Compared with baseline values in healthy controls, patients with PD had higher thermal and mechanical detection thresholds, which decreased after surgery toward normalization. These sensory changes had no correlation with motor or clinical pain improvement after surgery. These data confirm the existence of sensory abnormalities in PD and suggest that STN-DBS mainly influenced the detection thresholds rather than painful sensations. However, these changes may depend on the specific effects of DBS on somatosensory loops with no correlation to motor or clinical pain improvement.
Spatial contrast sensitivity vision loss in children with cortical visual impairment.
Good, William V; Hou, Chuan; Norcia, Anthony M
2012-11-19
Although cortical visual impairment (CVI) is the leading cause of bilateral vision impairment in children in Western countries, little is known about the effects of CVI on visual function. The aim of this study was to compare visual evoked potential measures of contrast sensitivity and grating acuity in children with CVI with those of age-matched typically developing controls. The swept parameter visual evoked potential (sVEP) was used to measure contrast sensitivity and grating acuity in 34 children with CVI at 5 months to 5 years of age and in 16 age-matched control children. Contrast thresholds and spatial frequency thresholds (grating acuities) were derived by extrapolating the tuning functions to zero amplitude. These thresholds and maximal suprathreshold response amplitudes were compared between groups. Among 34 children with CVI, 30 had measurable but reduced contrast sensitivity with a median threshold of 10.8% (range 5.0%-30.0% Michelson), and 32 had measurable but reduced grating acuity with median threshold 0.49 logMAR (9.8 c/deg, range 5-14 c/deg). These thresholds were significantly reduced, compared with age-matched control children. In addition, response amplitudes over the entire sweep range for both measures were significantly diminished in children with CVI compared with those of control children. Our results indicate that spatial contrast sensitivity and response amplitudes are strongly affected by CVI. The substantial degree of loss in contrast sensitivity suggests that contrast is a sensitive measure for evaluating vision deficits in patients with CVI.
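A simplified illustration of the threshold-estimation step mentioned above: the swept response amplitude is regressed against the stimulus value over its rising portion and the fitted line is extrapolated to zero amplitude, the intercept giving the threshold. The amplitudes below are synthetic, not sVEP data.

```python
import numpy as np

log_contrast = np.array([-1.6, -1.4, -1.2, -1.0, -0.8, -0.6])   # swept log10 contrast values
amplitude_uv = np.array([0.1, 0.4, 1.1, 1.9, 2.8, 3.9])         # synthetic response amplitudes (µV)

rising = amplitude_uv > 0.3                  # fit only the clearly rising portion of the sweep
slope, intercept = np.polyfit(log_contrast[rising], amplitude_uv[rising], 1)

log_threshold = -intercept / slope           # contrast at which the fitted line crosses zero amplitude
print(f"Estimated contrast threshold ≈ {100 * 10 ** log_threshold:.1f}% (Michelson)")
```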
Rodríguez Barrios, José Manuel; Pérez Alcántara, Ferran; Crespo Palomo, Carlos; González García, Paloma; Antón De Las Heras, Enrique; Brosa Riestra, Max
2012-12-01
The objective of this study was to evaluate the methodological characteristics of cost-effectiveness evaluations carried out in Spain since 1990 that include life-years gained (LYG) as an outcome in the incremental cost-effectiveness ratio. A systematic review of published studies was conducted, describing their characteristics and methodological quality. We analyse the cost per LYG results in relation to a commonly accepted Spanish cost-effectiveness threshold, and their possible relationship with the cost per quality-adjusted life-year (QALY) gained when both were calculated for the same economic evaluation. A total of 62 economic evaluations fulfilled the selection criteria, 24 of them also including a cost per QALY gained result. The methodological quality of the studies was good (55%) or very good (26%). A total of 124 cost per LYG results were obtained with a mean ratio of 49,529
O'Mahony, James F; Coughlan, Diarmuid
2016-01-01
Ireland is one of the few countries worldwide to have an explicit cost-effectiveness threshold. In 2012, an agreement between government and the pharmaceutical industry that provided substantial savings on existing medications set the threshold at €45,000/quality-adjusted life-year (QALY). This replaced a previously unofficial threshold of €20,000/QALY. According to the agreement, drugs within the threshold will be granted reimbursement, whereas those exceeding it may still be approved following further negotiation. A number of drugs far exceeding the threshold have been approved recently. The agreement only applies to pharmaceuticals. There are four reasons for concern regarding Ireland's threshold. The absence of an explicit threshold for non-drug interventions leaves it unclear if there is parity in willingness to pay across all interventions. As the threshold resembles a price floor rather than a ceiling, in principle it only offers a weak barrier to cost-ineffective interventions. It has no empirical basis. Finally, it is probably too high given recent estimates of a threshold for the UK based on the cost effectiveness of services forgone of approximately £13,000/QALY. An excessive threshold risks causing the Irish health system unintended harm. The lack of an empirically informed threshold means the policy recommendations of cost-effectiveness analysis cannot be considered as fully evidence-based rational rationing. Policy makers should consider these issues and recent Irish legislation that defined cost effectiveness in terms of the opportunity cost of services forgone when choosing what threshold to apply once the current industry agreement expires at the end of 2015.
Quality of warfarin control in atrial fibrillation patients in South East Queensland, Australia.
Bernaitis, N; Badrick, T; Davey, A K; Anoopkumar-Dukie, S
2016-08-01
Warfarin is widely prescribed to decrease the risk of stroke in atrial fibrillation (AF) patients. Due to patient variability in response, regular monitoring is required, and time in therapeutic range (TTR) is used to indicate the quality of warfarin control, with a TTR > 60% recommended. Recently, an Australian Government review of anticoagulants identified the need to establish current warfarin control and determine the potential place of the newer oral anticoagulants. To determine warfarin control by a pathology practice in Queensland, Australia and identify factors influencing TTR. Retrospective data were collected from Sullivan Nicolaides Pathology, a major pathology practice offering a warfarin care programme in Australia. Patients enrolled in their programme as of September 2014 were included in the study. TTR was calculated from INR test results and test dates using the Rosendaal method, and mean patient TTR was used for analysis and comparison. Exclusions were a target therapeutic range outside 2.0-3.0, fewer than two INR tests, and programme treatment time of less than 30 days. The eligible 3692 AF patients had 73.6% of INR tests within the therapeutic range. The mean TTR was 81%, with 97% of patients above a TTR of 60%. TTR was not significantly influenced by age, gender or socioeconomic factors. The observed mean TTR of over 80% is superior to the minimum recommended threshold of 60%. The TTR achieved by the Queensland pathology practice demonstrates that dedicated warfarin programmes can produce high-quality warfarin care, ensuring the full benefit of warfarin for Australian patients. © 2016 Royal Australasian College of Physicians.
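The Rosendaal method referenced above can be written in a few lines: INR values are linearly interpolated between consecutive test dates, and TTR is the percentage of interpolated days falling inside the 2.0-3.0 range. The INR data below are invented for illustration.

```python
from datetime import date

tests = [                      # (test date, INR result) - invented example data
    (date(2014, 1, 1), 1.8),
    (date(2014, 1, 15), 2.4),
    (date(2014, 2, 12), 3.4),
    (date(2014, 3, 12), 2.6),
]
LOW, HIGH = 2.0, 3.0

in_range_days, total_days = 0, 0
for (d0, inr0), (d1, inr1) in zip(tests, tests[1:]):
    span = (d1 - d0).days
    for day in range(span):
        inr = inr0 + (inr1 - inr0) * day / span      # linear interpolation for each day
        in_range_days += LOW <= inr <= HIGH
        total_days += 1

print(f"TTR = {100 * in_range_days / total_days:.1f}%")
```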
Nimdet, Khachapon; Chaiyakunapruk, Nathorn; Vichansavakul, Kittaya; Ngorsuraches, Surachat
2015-01-01
A number of studies have been conducted to estimate willingness to pay (WTP) per quality-adjusted life-year (QALY) in patients or the general population for various diseases. However, no systematic review has summarized the relationship between WTP per QALY and the cost-effectiveness (CE) threshold based on the World Health Organization (WHO) recommendation. Our objectives were to systematically review the WTP per QALY literature, to compare WTP per QALY with the CE threshold recommended by WHO, and to determine potential influencing factors. We searched MEDLINE, EMBASE, PsycINFO, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Centre for Reviews and Dissemination (CRD), and EconLit from inception through 15 July 2014. To be included, studies had to estimate WTP per QALY in health-related issues using a stated preference method. Two investigators independently reviewed each abstract, completed full-text reviews, and extracted information for included studies. We compared WTP per QALY to GDP per capita, and analyzed and summarized potential influencing factors. Out of 3,914 articles found, 14 studies were included. Most studies (92.85%) used the contingent valuation method, while only one study used discrete choice experiments. Sample sizes varied from 104 to 21,896 persons. The ratio between WTP per QALY and GDP per capita varied widely from 0.05 to 5.40, depending on scenario outcomes (e.g., whether it extended/saved life or improved quality of life), severity of the hypothetical scenario, duration of the scenario, and source of funding. The average ratio of WTP per QALY to GDP per capita for extending or saving life (2.03) was significantly higher than the average for improving quality of life (0.59), with a mean difference of 1.43 (95% CI, 1.06 to 1.81). This systematic review provides an overall summary of studies estimating WTP per QALY. The variation in the ratio of WTP per QALY to GDP per capita, which depended on several factors, may prompt discussions on CE threshold policy. Our work provides a foundation for defining the future direction of decision criteria for an evidence-informed decision-making system.
Paskiet, Diane; Jenke, Dennis; Ball, Douglas; Houston, Christopher; Norwood, Daniel L; Markovic, Ingrid
2013-01-01
The Product Quality Research Institute (PQRI) is a non-profit consortium of organizations working together to generate and share timely, relevant, and impactful information that advances drug product quality and development. The collaborative activities of PQRI participants have, in the case of orally inhaled and nasal drug products (OINDPs), resulted in comprehensive and widely-accepted recommendations for leachables assessments to help ensure patient safety with respect to this class of packaged drug products. These recommendations, which include scientifically justified safety thresholds for leachables, represent a significant milestone towards establishing standardized approaches for safety qualification of leachables in OINDP. To build on the success of the OINDP effort, PQRI's Parenteral and Ophthalmic Drug Products (PODP) Leachables and Extractables Working Group was formed to extrapolate the OINDP threshold concepts and best practice recommendations to other dosage forms with high concern for interaction with packaging/delivery systems. This article considers the general aspects of leachables and their safety assessment, introduces the PODP Work Plan and initial study Protocol, discusses the laboratory studies being conducted by the PODP Chemistry Team, outlines the strategy being developed by the PODP Toxicology Team for the safety qualification of PODP leachables, and considers the issues associated with application of the safety thresholds, particularly with respect to large-volume parenterals. Lastly, the unique leachables issues associated with biologics are described. The Product Quality Research Institute (PQRI) is a non-profit consortium involving industry organizations, academia, and regulatory agencies that together provide recommendations in support of regulatory guidance to advance drug product quality. The collaborative activities of the PQRI Orally Inhaled and Nasal Drug Products Leachables and Extractables Working Group resulted in a systematic and science-based approach to identify and qualify leachables, including the concept of safety thresholds. Concepts from this widely accepted approach, formally publicized in 2006, are being extrapolated to parenteral and ophthalmic drug products. This article provides an overview of extractables and leachables in drug products and biologics and discusses the PQRI Work Plan and Protocols developed by the PQRI Parenteral and Ophthalmic Drug Products Leachables and Extractables Working Group.
Facial arthralgia and myalgia: can they be differentiated by trigeminal sensory assessment?
Eliav, Eli; Teich, Sorin; Nitzan, Dorit; El Raziq, Daood Abid; Nahlieli, Oded; Tal, Michael; Gracely, Richard H; Benoliel, Rafael
2003-08-01
Heat and electrical detection thresholds were assessed in 72 patients suffering from painful temporomandibular disorder. Employing widely accepted criteria, 44 patients were classified as suffering from temporomandibular joint (TMJ) arthralgia (i.e. pain originating from the TMJ) and 28 from myalgia (i.e. pain originating from the muscles of mastication). Electrical stimulation was employed to assess thresholds in large myelinated nerve fibers (Abeta) and heat application to assess thresholds in unmyelinated nerve fibers (C). The sensory tests were performed bilaterally in three trigeminal nerve sites: the auriculotemporal nerve territory (AUT), buccal nerve territory (BUC) and the mental nerve territory (MNT). In addition, 22 healthy asymptomatic controls were examined. A subset of ten arthralgia patients underwent arthrocentesis and electrical detection thresholds were additionally assessed following the procedure. Electrical detection threshold ratios were calculated by dividing the affected side by the control side, thus reduced ratios indicate hypersensitivity of the affected side. In control patients, ratios obtained at all sites did not vary significantly from the expected value of 'one' (mean with 95% confidence intervals; AUT, 1:0.95-1.06; BUC, 1.01:0.93-1.11; MNT, 0.97:0.88-1.05, all areas one sample analysis P>0.05). In arthralgia patients mean ratios (+/-SEM) obtained for the AUT territory (0.63+/-0.03) were significantly lower compared to ratios for the MNT (1.02+/-0.03) and BUC (0.96+/-0.04) territories (repeated measures analysis of variance (RANOVA), P<0.0001) and compared to the AUT ratios in myalgia (1.27+/-0.09) and control subjects (1+/-0.06, ANOVA, P<0.0001). In the myalgia group the electrical detection threshold ratios in the AUT territory were significantly elevated compared to the AUT ratios in control subjects (Dunnett test, P<0.05), but only approached statistical significance compared to the MNT (1.07+/-0.04) and BUC (1.11+/-0.06) territories (RANOVA, F(2,27)=3.12, P=0.052). There were no significant differences between and within the groups for electrical detection threshold ratios in the BUC and MNT nerve territories, and for the heat detection thresholds in all tested sites. Following arthrocentesis, mean electrical detection threshold ratios in the AUT territory were significantly elevated from 0.64+/-0.06 to 0.99+/-0.04 indicating resolution of the hypersensitivity (paired t-test, P=0.001). In conclusion, large myelinated fiber hypersensitivity is found in the skin overlying TMJs with clinical pain and pathology but is not found in controls. In patients with muscle-related facial pain there was significant elevation of the electrical detection threshold in the AUT region.
NASA Astrophysics Data System (ADS)
Jiménez-Guerrero, P.; Baró, R.; Gómez-Navarro, J. J.; Lorente-Plazas, R.; García-Valero, J. A.; Hernández, Z.; Montávez, J. P.
2012-04-01
A wide number of studies show that several areas over Europe exceed some of the air quality thresholds established in the legislation. These exceedances will become more frequent under future climate change scenarios, since the policies aimed at improving air quality in the EU directives have not accounted for variations in the climate. Climate change alone will influence the future concentrations of atmospheric pollutants through modifications of gas-phase chemistry, transport, removal, and natural emissions. In this sense, chemistry transport models (CTMs) play a key role in assessing and understanding emissions abatement plans through the use of sensitivity analysis strategies. These sensitivity analyses characterize the change in model output due to variations in model input parameters. Since the management of air pollutant emissions is one of the predominant factors for controlling urban air quality, this work assesses the impact of various emission reduction scenarios on air pollution levels over Europe under two climate change scenarios. The methodology uses a climate version of the meteorological model MM5 coupled with the CHIMERE chemistry transport model. Experiments span the periods 1971-2000, as a reference, and 2071-2100, as two future enhanced greenhouse gas and aerosol scenarios (SRES A2 and B2). The atmospheric simulations have a horizontal resolution of 25 km and 23 vertical layers up to 100 hPa, and are driven by the global climate model ECHO-G. In order to represent the sensitivity of the chemistry and transport of aerosols, tropospheric ozone and other photochemical species, several hypothetical emission control scenarios have been implemented to quantify the influence of diverse emission sources in the area, such as on-road traffic, port and industrial emissions, among others. The modeling strategy relies on a sensitivity analysis to determine the emission reduction and strategy needed in the target area in order to attain the standards and thresholds set in the European Directive 2008/50/EC. Results show that the system is able to characterize the exceedances occurring in Europe, mainly related to the maximum 8-h moving average exceeding the target value of 120 μg/m3 over southern Europe. Compliance with the PM10 daily limit value (50 μg/m3) is also not achieved over wide areas of Europe. The sensitivity analysis indicates that large reductions of precursor emissions are needed in all the scenarios examined to attain the thresholds set in the European Directive. In most cases this abatement strategy is hard to put into practice (e.g., unrealistic percentages of emission reductions in on-road traffic, industry or harbor activity); however, ozone and particulate matter air pollution improve considerably in most of the scenarios considered. The results also reveal the propagation of uncertainties from the meteorological projections into future air quality, and call for future studies aimed at deepening the knowledge of the parameterized processes, improving the definition of emissions and, finally, reducing uncertainties.
Sustainability of Reef Ecosystem Services under Expanded Water Quality Standards in St. Croix, USVI
Under the U.S. Clean Water Act, States and Territories are to establish water quality criteria to protect designated uses, such as fishable or swimmable water resources. However, establishment of chemical and physical thresholds does not necessarily ensure protection of the biot...
Terhune, Devin B; Murray, Elizabeth; Near, Jamie; Stagg, Charlotte J; Cowey, Alan; Cohen Kadosh, Roi
2015-11-01
Phosphenes are illusory visual percepts produced by the application of transcranial magnetic stimulation to occipital cortex. Phosphene thresholds, the minimum stimulation intensity required to reliably produce phosphenes, are widely used as an index of cortical excitability. However, the neural basis of phosphene thresholds and their relationship to individual differences in visual cognition are poorly understood. Here, we investigated the neurochemical basis of phosphene perception by measuring basal GABA and glutamate levels in primary visual cortex using magnetic resonance spectroscopy. We further examined whether phosphene thresholds would relate to the visuospatial phenomenology of grapheme-color synesthesia, a condition characterized by atypical binding and involuntary color photisms. Phosphene thresholds negatively correlated with glutamate concentrations in visual cortex, with lower thresholds associated with elevated glutamate. This relationship was robust, present in both controls and synesthetes, and exhibited neurochemical, topographic, and threshold specificity. Projector synesthetes, who experience color photisms as spatially colocalized with inducing graphemes, displayed lower phosphene thresholds than associator synesthetes, who experience photisms as internal images, with both exhibiting lower thresholds than controls. These results suggest that phosphene perception is driven by interindividual variation in glutamatergic activity in primary visual cortex and relates to cortical processes underlying individual differences in visuospatial awareness. © The Author 2015. Published by Oxford University Press.
Sustainable thresholds for cooperative epidemiological models.
Barrios, Edwin; Gajardo, Pedro; Vasilieva, Olga
2018-05-22
In this paper, we introduce a method for computing sustainable thresholds for controlled cooperative models described by a system of ordinary differential equations, a property shared by a wide class of compartmental models in epidemiology. The set of sustainable thresholds refers to constraints (e.g., maximal "allowable" number of human infections; maximal "affordable" budget for disease prevention, diagnosis and treatments; etc.), parameterized by thresholds, that can be sustained by applying an admissible control strategy starting at the given initial state and lasting the whole period of the control intervention. This set, determined by the initial state of the dynamical system, virtually provides useful information for more efficient (or cost-effective) decision-making by exhibiting the trade-offs between different types of constraints and allowing the user to assess future outcomes of control measures on transient behavior of the dynamical system. In order to accentuate the originality of our approach and to reveal its potential significance in real-life applications, we present an example relying on the 2013 dengue outbreak in Cali, Colombia, where we compute the set of sustainable thresholds (in terms of the maximal "affordable" budget and the maximal "allowable" levels of active infections among human and vector populations) that could be sustained during the epidemic outbreak. Copyright © 2018 Elsevier Inc. All rights reserved.
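As a toy illustration of the sustainable-threshold idea (far simpler than the dengue application in the paper), the sketch below simulates a controlled SIR model and checks whether a given control effort keeps infections below a candidate threshold over the whole intervention period. The model, parameters, and control are assumptions.

```python
beta, gamma = 0.30, 0.10          # assumed transmission and recovery rates (per day)
N, horizon, dt = 1_000_000, 365, 0.1

def max_infections(control_effort):
    """Simulate SIR with a constant control reducing transmission; return peak prevalence."""
    S, I, R = N - 100.0, 100.0, 0.0
    peak = I
    for _ in range(int(horizon / dt)):
        new_inf = (1 - control_effort) * beta * S * I / N * dt
        S, I, R = S - new_inf, I + new_inf - gamma * I * dt, R + gamma * I * dt
        peak = max(peak, I)
    return peak

def is_sustainable(infection_threshold, control_effort):
    """True if this control keeps infections below the threshold at all times."""
    return max_infections(control_effort) <= infection_threshold

for thr in (5_000, 50_000, 200_000):
    print(thr, is_sustainable(thr, control_effort=0.4))
```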
A Method For Assessing Economic Thresholds of Hardwood Competition
Steven A. Knowe
2002-01-01
A procedure was developed for computing economic thresholds for hardwood competition in pine plantations. The economic threshold represents the break-even level of competition above which hardwood control is a financially attractive treatment. Sensitivity analyses were conducted to examine the relative importance of biological and economic factors in determining...
Speckle reduction in optical coherence tomography images based on wave atoms
Du, Yongzhao; Liu, Gangjun; Feng, Guoying; Chen, Zhongping
2014-01-01
Optical coherence tomography (OCT) is an emerging noninvasive imaging technique, which is based on low-coherence interferometry. OCT images suffer from speckle noise, which reduces image contrast. A shrinkage filter based on wave atoms transform is proposed for speckle reduction in OCT images. Wave atoms transform is a new multiscale geometric analysis tool that offers sparser expansion and better representation for images containing oscillatory patterns and textures than other traditional transforms, such as wavelet and curvelet transforms. Cycle spinning-based technology is introduced to avoid visual artifacts, such as Gibbs-like phenomenon, and to develop a translation invariant wave atoms denoising scheme. The speckle suppression degree in the denoised images is controlled by an adjustable parameter that determines the threshold in the wave atoms domain. The experimental results show that the proposed method can effectively remove the speckle noise and improve the OCT image quality. The signal-to-noise ratio, contrast-to-noise ratio, average equivalent number of looks, and cross-correlation (XCOR) values are obtained, and the results are also compared with the wavelet and curvelet thresholding techniques. PMID:24825507
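Wave atom transforms are not available in common Python libraries, so the sketch below uses a wavelet shrinkage stand-in (PyWavelets) purely to illustrate the cycle-spinning step mentioned above: shift, threshold, unshift, and average, which is what suppresses the Gibbs-like artifacts. It is an analogy under those assumptions, not the authors' implementation.

```python
import numpy as np
import pywt

def shrink(img, wavelet="db4", level=3, thr=0.1):
    """Soft-threshold the detail coefficients of a 2D wavelet decomposition."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    coeffs = [coeffs[0]] + [tuple(pywt.threshold(c, thr, mode="soft") for c in band)
                            for band in coeffs[1:]]
    return pywt.waverec2(coeffs, wavelet)

def cycle_spin_denoise(img, shifts=((0, 0), (2, 0), (0, 2), (2, 2)), **kw):
    """Average the denoised results over several circular shifts (translation invariance)."""
    out = np.zeros_like(img, dtype=float)
    for dy, dx in shifts:
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        den = shrink(shifted, **kw)[: img.shape[0], : img.shape[1]]
        out += np.roll(np.roll(den, -dy, axis=0), -dx, axis=1)
    return out / len(shifts)

noisy = np.random.default_rng(0).normal(0.5, 0.1, size=(64, 64))   # synthetic noisy image
print(cycle_spin_denoise(noisy).shape)
```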
Ultracompact low-threshold organic laser.
Deotare, Parag B; Mahony, Thomas S; Bulović, Vladimir
2014-11-25
We report an ultracompact low-threshold laser with an Alq3:DCM host:guest molecular organic thin film gain layer. The device uses a photonic crystal nanobeam cavity which provides a high quality factor to mode volume (Q/V) ratio and increased spontaneous emission factor along with a small footprint. Lasing is observed with a threshold of 4.2 μJ/cm(2) when pumped by femtosecond pulses of λ = 400 nm wavelength light. We also model the dynamics of the laser and show good agreement with the experimental data. The inherent waveguide geometry of the structure enables easy on-chip integration with potential applications in biochemical sensing, inertial sensors, and data communication.
Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding
Sun, Lijuan; Guo, Jian; Xu, Bin; Li, Shujing
2017-01-01
The computation of image segmentation becomes more complicated as the number of thresholds increases, and the selection and application of thresholds in image thresholding has become an NP-hard problem. This paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the optimal-solution updating mechanism of the search agents by means of weights. Taking Kapur's entropy as the objective function and exploiting the discreteness of thresholds in image segmentation, the paper first discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy that uses a weight coefficient to replace the optimal-solution search formula used in the original algorithm. The experimental results show that MDGWO can search out the optimal thresholds efficiently and precisely, yielding values very close to those found by exhaustive search. In comparison with electromagnetism optimization (EMO), differential evolution (DE), the Artificial Bee Colony (ABC) algorithm, and the classical GWO, MDGWO has advantages in terms of image segmentation quality, objective function values, and their stability. PMID:28127305
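To make the objective concrete, the sketch below evaluates Kapur's entropy for a synthetic histogram and finds two thresholds by exhaustive search rather than by the grey wolf optimizer; it is meant only to show what MDGWO is maximizing, not to reproduce the algorithm.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 10, 4000), rng.normal(130, 12, 4000),
                         rng.normal(200, 8, 2000)]).clip(0, 255).astype(int)
hist = np.bincount(pixels, minlength=256).astype(float)
p = hist / hist.sum()                       # normalized gray-level histogram

def kapur_entropy(thresholds):
    """Sum of class entropies for the classes induced by the given thresholds."""
    edges = [0, *sorted(thresholds), 256]
    total = 0.0
    for lo, hi in zip(edges, edges[1:]):
        w = p[lo:hi].sum()
        if w <= 0:
            return -np.inf                  # reject thresholds that create empty classes
        q = p[lo:hi][p[lo:hi] > 0] / w
        total += -np.sum(q * np.log(q))
    return total

best = max(combinations(range(1, 256), 2), key=kapur_entropy)
print(f"Best two thresholds by exhaustive search: {best}")
```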
Shamsi, M B; Venkatesh, S; Tanwar, M; Singh, G; Mukherjee, S; Malhotra, N; Kumar, R; Gupta, N P; Mittal, S; Dada, R
2010-05-01
The growing concern about transmission of genetic diseases in assisted reproduction techniques (ART), and the inability of conventional semen analysis to accurately predict semen quality, have led to the need for new techniques to identify the best-quality sperm for use in assisted procreation techniques. This study analyzes sperm parameters in the context of DNA damage, assessed by comet assay, in cytogenetically normal infertile men without AZF deletions. Seventy infertile men and 40 fertile controls were evaluated for semen quality by conventional semen parameters, and the sperm were also analyzed for DNA integrity by comet assay. The patients were classified into oligozoospermic (O), asthenozoospermic (A), teratozoospermic (T), and oligoasthenoteratozoospermic (OAT) categories, and infertile men with a normal semen profile. The extent of DNA damage was assessed by visual scoring of comets. Idiopathic infertile men with a normal semen profile (n=18) according to the conventional method and patients with a history of spontaneous abortions and a normal semen profile (n=10) had a high degree of DNA damage (29 and 47%, respectively) compared with fertile controls (7%). The O, A, T and OAT categories of patients had variably higher DNA damage loads than fertile controls. A normal range and threshold for DNA damage as a predictor of male fertility potential, and a technique that can reliably assess sperm DNA damage, are necessary to reduce the distress of couples experiencing recurrent spontaneous abortion or failure in ART.
Haplotypic Analysis of Wellcome Trust Case Control Consortium Data
Browning, Brian L.; Browning, Sharon R.
2008-01-01
We applied a recently developed multilocus association testing method (localized haplotype clustering) to Wellcome Trust Case Control Consortium data (14,000 cases of seven common diseases and 3,000 shared controls genotyped on the Affymetrix 500K array). After rigorous data quality filtering, we identified three disease-associated loci with strong statistical support from localized haplotype cluster tests but with only marginal significance in single marker tests. These loci are chromosomes 10p15.1 with type 1 diabetes (p = 5.1 × 10⁻⁹), 12q15 with type 2 diabetes (p = 1.9 × 10⁻⁷) and 15q26.2 with hypertension (p = 2.8 × 10⁻⁸). We also detected the association of chromosome 9p21.3 with type 2 diabetes (p = 2.8 × 10⁻⁸), although this locus did not pass our stringent genotype quality filters. The association of 10p15.1 with type 1 diabetes and 9p21.3 with type 2 diabetes have both been replicated in other studies using independent data sets. Overall, localized haplotype cluster analysis had better success detecting disease associated variants than a previous single-marker analysis of imputed HapMap SNPs. We found that stringent application of quality score thresholds to genotype data substantially reduced false-positive results arising from genotype error. In addition, we demonstrate that it is possible to simultaneously phase 16,000 individuals genotyped on genome-wide data (450K markers) using the Beagle software package. PMID:18224336
Effect of mental stress on cold pain in chronic tension-type headache sufferers.
Cathcart, Stuart; Winefield, Anthony H; Lushington, Kurt; Rolan, Paul
2009-10-01
Mental stress is a noted contributing factor in chronic tension-type headache (CTH); however, the mechanisms underlying this are not clearly understood. One proposition is that stress aggravates already increased pain sensitivity in CTH sufferers. This hypothesis can be partially tested by examining the effects of mental stress on threshold and supra-threshold experimental pain processing in CTH sufferers. Such studies have not been reported to date. The present study measured pain detection and tolerance thresholds and ratings of supra-threshold pain stimulation from the cold pressor test in CTH sufferers (CTH-S) and healthy control (CNT) subjects exposed to a 60-min stressful mental task, and in CTH sufferers exposed to a 60-min neutral condition (CTH-N). Headache sufferers had lower pain tolerance thresholds and increased pain intensity ratings compared with controls. Pain detection and tolerance thresholds decreased and pain intensity ratings increased during the stress task, with a greater reduction in pain detection threshold and increase in pain intensity ratings in the CTH-S group compared with the CNT group. The results support the hypothesis that mental stress contributes to CTH by aggravating already increased pain sensitivity in CTH sufferers.
Kosa, S Daisy; Gafni, Amiram; House, Andrew A; Lawrence, JulieAnn; Moist, Louise; Nathoo, Bharat; Tam, Paul; Sarabia, Alicia; Thabane, Lehana; Wu, George; Lok, Charmaine E
2017-03-01
We developed the Hemodialysis Infection Prevention Protocols Ontario-Shower Technique (HIPPO-ST) to permit hemodialysis (HD) patients with central venous catheters (catheters) to shower without additional infection risk. Our primary objective was to determine the feasibility of conducting a parallel randomized controlled trial (RCT) to evaluate the impact of HIPPO-ST on catheter-related bacteremia (CRB) in adult HD patients. Adult HD patients using catheters were recruited from 11 HD units. Patients were randomized to receive HIPPO-ST or standard care and were followed up for 6 months. Only CRB-outcome assessors were blinded. For the study to be considered feasible, 4 of 5 feasibility outcomes, each with its own statistical threshold for success, must have been achieved. A total of 68 patients were randomized (33 HIPPO-ST and 35 control) and were followed up to 6 months. Of 5 measures of feasibility, 4 were achieved: (1) accurate CRB rate documented (threshold: κ level >0.80); (2) 97.8% (279/285) of satellite HD patients with catheters were screened (threshold: >95%); (3) 88% (23/26) in the HIPPO-ST arm were successfully educated by 6 months (threshold: >80%); and (4) 0% (0/29) patients in the control arm were "contaminated," that is, using HIPPO-ST (threshold: <5%). However, only 44.2% (72/163) of eligible patients consented to participate (threshold: >80%). The rate of CRB was similarly low in HIPPO-ST and control groups (0.68 vs. 0.88/1000 catheter days). This HIPPO-ST pilot study demonstrated the feasibility of the larger HIPPO-ST study, especially given the high levels of education success with the HIPPO-ST arm and the low levels of contamination in the control arm.
Threshold voltage control in TmSiO/HfO2 high-k/metal gate MOSFETs
NASA Astrophysics Data System (ADS)
Dentoni Litta, E.; Hellström, P.-E.; Östling, M.
2015-06-01
High-k interfacial layers have been proposed as a way to extend the scalability of Hf-based high-k/metal gate CMOS technology, which is currently limited by strong degradations in threshold voltage control, channel mobility and device reliability when the chemical oxide (SiOx) interfacial layer is scaled below 0.4 nm. We have previously demonstrated that thulium silicate (TmSiO) is a promising candidate as a high-k interfacial layer, providing competitive advantages in terms of EOT scalability and channel mobility. In this work, the effect of the TmSiO interfacial layer on threshold voltage control is evaluated, showing that the TmSiO/HfO2 dielectric stack is compatible with threshold voltage control techniques commonly used with SiOx/HfO2 stacks. Specifically, we show that the flatband voltage can be set in the range -1 V to +0.5 V by the choice of gate metal and that the effective workfunction of the stack is properly controlled by the metal workfunction in a gate-last process flow. Compatibility with a gate-first approach is also demonstrated, showing that integration of La2O3 and Al2O3 capping layers can induce a flatband voltage shift of at least 150 mV. Finally, the effect of the annealing conditions on flatband voltage is investigated, finding that the duration of the final forming gas anneal can be used as a further process knob to tune the threshold voltage. The evaluation performed on MOS capacitors is confirmed by the fabrication of TmSiO/HfO2/TiN MOSFETs achieving near-symmetric threshold voltages at sub-nm EOT.
NASA Astrophysics Data System (ADS)
Ren, Zhong; Liu, Guodong; Xiong, Zhihua
2016-10-01
Denoising the photoacoustic signals of glucose is one of the most important steps in fruit quality identification because the real-time photoacoustic signals of glucose are easily corrupted by various kinds of noise. To remove the noise and other useless information, an improved wavelet threshold function was proposed. Compared with the traditional wavelet hard and soft threshold functions, the improved wavelet threshold function can overcome the pseudo-oscillation effect in the denoised photoacoustic signals because of its continuity, and the error between the denoised signals and the original signals is reduced. To validate the feasibility of denoising with the improved wavelet threshold function, denoising simulation experiments based on MATLAB programming were performed. In the simulation experiments, a standard test signal was used, and three other denoising methods were compared with the improved wavelet threshold function. The signal-to-noise ratio (SNR) and the root-mean-square error (RMSE) were used to evaluate the denoising performance. The experimental results demonstrate that the improved wavelet threshold function yields the largest SNR and the smallest RMSE, which verifies that denoising with the improved wavelet threshold function is feasible. Finally, the improved wavelet threshold function was used to remove the noise from the photoacoustic signals of glucose solutions, again with very good results. Therefore, the improved wavelet threshold function denoising proposed in this paper has potential value in the field of photoacoustic signal denoising.
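To make the comparison above concrete, the sketch below contrasts hard and soft wavelet thresholding with one continuous threshold rule (the non-negative garrote, used here only as a stand-in for an improved continuous function; the paper's exact formula is not reproduced) and scores each by SNR and RMSE. It is written in Python with NumPy and the PyWavelets package rather than MATLAB, and the test signal, wavelet choice, and noise level are illustrative assumptions.

import numpy as np
import pywt

def garrote(w, lam):
    # continuous rule: zero below lam, bias-reduced shrinkage above it
    return np.where(np.abs(w) > lam, w - lam**2 / np.where(w == 0, 1, w), 0.0)

def denoise(x, rule, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest band
    lam = sigma * np.sqrt(2 * np.log(len(x)))                # universal threshold
    new_coeffs = [coeffs[0]] + [rule(c, lam) for c in coeffs[1:]]
    return pywt.waverec(new_coeffs, wavelet)[: len(x)]

def snr(clean, est):
    return 10 * np.log10(np.sum(clean**2) / np.sum((clean - est) ** 2))

def rmse(clean, est):
    return np.sqrt(np.mean((clean - est) ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2048)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

rules = {
    "hard": lambda c, lam: pywt.threshold(c, lam, mode="hard"),
    "soft": lambda c, lam: pywt.threshold(c, lam, mode="soft"),
    "garrote (continuous)": garrote,
}
for name, rule in rules.items():
    est = denoise(noisy, rule)
    print(f"{name:22s} SNR = {snr(clean, est):5.2f} dB   RMSE = {rmse(clean, est):.4f}")

Because the garrote rule is continuous at the threshold, it avoids the jump that causes the pseudo-oscillation of hard thresholding, while shrinking large coefficients less aggressively than the soft rule.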
Low authority-threshold control for large flexible structures
NASA Technical Reports Server (NTRS)
Zimmerman, D. C.; Inman, D. J.; Juang, J.-N.
1988-01-01
An improved active control strategy for the vibration control of large flexible structures is presented. A minimum force, low authority-threshold controller is developed to bring a system with or without known external disturbances back into an 'allowable' state manifold over a finite time interval. The concept of a constrained, or allowable feedback form of the controller is introduced that reflects practical hardware implementation concerns. The robustness properties of the control strategy are then assessed. Finally, examples are presented which highlight the key points made within the paper.
Bajoub, Aadil; Bendini, Alessandra; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría
2018-03-24
Over the last decades, olive oil quality and authenticity control has become an issue of great importance to consumers, suppliers, retailers, and regulators in both traditional and emerging olive oil producing countries, mainly due to the increasing worldwide popularity and the trade globalization of this product. Thus, in order to ensure olive oil authentication, various national and international laws and regulations have been adopted, although some of them have generated considerable debate about the risk they may pose to the harmonization of international olive oil trade standards. Within this context, this review was designed to provide a critical overview and comparative analysis of selected regulatory frameworks for olive oil authentication, with special emphasis on the quality and purity criteria considered by these regulation systems, their thresholds, and the analytical methods employed for monitoring them. To complete the general overview, recent analytical advances to overcome drawbacks and limitations of the official methods to evaluate olive oil quality and to determine possible adulterations were reviewed. Furthermore, the latest trends in analytical approaches to assess olive oil geographical and varietal origin traceability were also examined.
Developing a more useful surface quality metric for laser optics
NASA Astrophysics Data System (ADS)
Turchette, Quentin; Turner, Trey
2011-02-01
Light scatter due to surface defects on laser resonator optics produces losses which lower system efficiency and output power. The traditional methodology for surface quality inspection involves visual comparison of a component to scratch and dig (SAD) standards under controlled lighting and viewing conditions. Unfortunately, this process is subjective and operator dependent. Also, there is no clear correlation between inspection results and the actual performance impact of the optic in a laser resonator. As a result, laser manufacturers often overspecify surface quality in order to ensure that optics will not degrade laser performance due to scatter. This can drive up component costs and lengthen lead times. Alternatively, an objective test system for measuring optical scatter from defects can be constructed with a microscope, calibrated lighting, a CCD detector and image processing software. This approach is quantitative, highly repeatable and totally operator independent. Furthermore, it is flexible, allowing the user to set threshold levels as to what will or will not constitute a defect. This paper details how this automated, quantitative type of surface quality measurement can be constructed, and shows how its results correlate against conventional loss measurement techniques such as cavity ringdown times.
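As an illustration of the objective, threshold-based measurement the paper argues for (and not the authors' actual instrument or software), the Python sketch below thresholds a synthetic dark-field image so that pixels brighter than a user-set level count as defect scatter, and reports the number and total area of connected defect regions. The image, threshold level, and minimum defect size are assumed values.

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
img = rng.normal(10.0, 2.0, size=(512, 512))   # clean surface background
img[100:103, 50:200] += 60.0                   # a scratch-like defect
img[400:406, 300:306] += 80.0                  # a dig-like defect

level = 30.0      # user-adjustable defect threshold (the flexible setting noted above)
min_area = 5      # ignore single-pixel noise spikes

mask = img > level
labels, n = ndimage.label(mask)                          # connected defect regions
sizes = ndimage.sum(mask, labels, index=range(1, n + 1)) # area of each region in pixels
defects = [s for s in sizes if s >= min_area]

defect_area_fraction = sum(defects) / img.size
print(f"defects found: {len(defects)}, defect area fraction: {defect_area_fraction:.2e}")

Because the threshold and minimum area are explicit parameters, the same measurement is repeatable and operator independent, which is the core advantage claimed for the automated approach.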
Beddoe, Rachael; Costanza, Robert; Farley, Joshua; Garza, Eric; Kent, Jennifer; Kubiszewski, Ida; Martinez, Luz; McCowen, Tracy; Murphy, Kathleen; Myers, Norman; Ogden, Zach; Stapleton, Kevin; Woodward, John
2009-01-01
A high and sustainable quality of life is a central goal for humanity. Our current socio-ecological regime and its set of interconnected worldviews, institutions, and technologies all support the goal of unlimited growth of material production and consumption as a proxy for quality of life. However, abundant evidence shows that, beyond a certain threshold, further material growth no longer significantly contributes to improvement in quality of life. Not only does further material growth not meet humanity's central goal, there is mounting evidence that it creates significant roadblocks to sustainability through increasing resource constraints (i.e., peak oil, water limitations) and sink constraints (i.e., climate disruption). Overcoming these roadblocks and creating a sustainable and desirable future will require an integrated, systems level redesign of our socio-ecological regime focused explicitly and directly on the goal of sustainable quality of life rather than the proxy of unlimited material growth. This transition, like all cultural transitions, will occur through an evolutionary process, but one that we, to a certain extent, can control and direct. We suggest an integrated set of worldviews, institutions, and technologies to stimulate and seed this evolutionary redesign of the current socio-ecological regime to achieve global sustainability. PMID:19240221
NASA Astrophysics Data System (ADS)
Kilic, M.; Akyol, S. M.
2012-08-01
Air quality and thermal comfort in an automobile cabin are strongly influenced by the coupled heat and mass transfer that take place within it. In this study, we investigate and assess the effects of air intake settings (recirculation and fresh air) on thermal comfort, air quality satisfaction, and energy usage during the cooling period of an automobile cabin. For this purpose, measurements (temperature, air velocity, CO2) were performed at various locations inside the cabin. Furthermore, whole-body and local responses of the human subjects were recorded while skin temperatures were measured. A mathematical model was developed to estimate CO2 concentration and energy usage inside the vehicle cabin and was verified against the experimental data. It is shown that the CO2 level inside the cabin can exceed the threshold value recommended for driving safety if two or more occupants are in the car. It is also shown that an advanced climate control system can satisfy the air quality and thermal comfort requirements while reducing the energy usage for cooling the vehicle cabin.
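A minimal single-zone CO2 mass balance illustrates the kind of model referred to above; it is a sketch under assumed values (cabin volume, per-occupant CO2 generation, air-exchange rates, and a 1500 ppm guideline), not the paper's calibrated model.

import numpy as np

V = 3.0                  # cabin volume, m^3 (assumed)
C_amb = 420.0            # ambient CO2, ppm
G = 5.0e-6               # CO2 generation per occupant, m^3/s (~0.3 L/min, light activity)
threshold_ppm = 1500.0   # assumed air-quality / driving-safety guideline

def simulate(n_occupants, Q_fresh, minutes=30, dt=1.0):
    """Euler integration of V*dC/dt = Q*(C_amb - C) + n*G*1e6, with C in ppm."""
    C = C_amb
    for _ in range(int(minutes * 60 / dt)):
        dCdt = (Q_fresh * (C_amb - C) + n_occupants * G * 1e6) / V
        C += dCdt * dt
    return C

for n in (1, 2, 4):
    recirc = simulate(n, Q_fresh=0.001)   # mostly recirculation: ~1 L/s leakage air
    fresh = simulate(n, Q_fresh=0.02)     # fresh-air intake: ~20 L/s
    flag = "exceeds" if recirc > threshold_ppm else "below"
    print(f"{n} occupant(s): recirculation {recirc:6.0f} ppm ({flag} {threshold_ppm:.0f} ppm), "
          f"fresh air {fresh:6.0f} ppm")

Under these assumptions, recirculation with two or more occupants pushes the 30-minute CO2 concentration well past the guideline, consistent with the qualitative conclusion stated above.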
Explaining tolerance for bitterness in chocolate ice cream using solid chocolate preferences.
Harwood, Meriel L; Loquasto, Joseph R; Roberts, Robert F; Ziegler, Gregory R; Hayes, John E
2013-08-01
Chocolate ice cream is commonly formulated with higher sugar levels than nonchocolate flavors to compensate for the inherent bitterness of cocoa. Bitterness, however, is an integral part of the complex flavor of chocolate. In light of the global obesity epidemic, many consumers and health professionals are concerned about the levels of added sugars in foods. Once a strategy for balancing undesirable bitterness and health concerns regarding added sugars has been developed, the task becomes determining whether that product will be acceptable to the consumer. Thus, the purpose of this research was to manipulate the bitterness of chocolate ice cream to examine how this influences consumer preferences. The main goal of this study was to estimate group rejection thresholds for bitterness in chocolate ice cream, and to see if solid chocolate preferences (dark vs. milk) generalized to ice cream. A food-safe bitter ingredient, sucrose octaacetate, was added to chocolate ice cream to alter bitterness without disturbing the other sensory qualities of the ice cream samples, including texture. Untrained chocolate ice cream consumers participated in a large-scale sensory test by indicating their preferences for blinded pairs of unspiked and spiked samples, where the spiked sample had increasing levels of the added bitterant. As anticipated, the group containing individuals who prefer milk chocolate had a much lower tolerance for bitterness in their chocolate ice cream compared with the group of individuals who prefer dark chocolate; indeed, the dark chocolate group tolerated almost twice as much added bitterant in the ice cream before indicating a significant preference for the unspiked (control) ice cream. This work demonstrates the successful application of the rejection threshold method to a complex dairy food. Estimating rejection thresholds could prove to be an effective tool for determining acceptable formulations or quality limits when considering attributes that become objectionable at high intensities. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved. PMID:23769376
LOGIC OF CONTROLLED THRESHOLD DEVICES.
The synthesis of threshold logic circuits from several points of view is presented. The first approach is applicable to resistor-transistor networks...in which the outputs are tied to a common collector resistor. In general, fewer threshold logic gates than NOR gates connected to a common collector...network to realize a specified function such that the failure of any but the output gate can be compensated for by a change in the threshold level (and
Owusu, Stephanie; Huynh, Alexander; Gruenthal, Eric; Prusik, Julia; Owusu-Sarpong, Stephane; Cherala, Rasan; Peng, Sophia; Pilitsis, Julie G; McCallum, Sarah E
2017-08-01
Spinal cord stimulation (SCS) is an efficacious therapy used to treat chronic pain. The type of SCS programming is important in improving patients' quality of life and overall satisfaction. In this study, 19 patients who underwent SCS with traditional devices were given between 4 and 6 programs including programs with stimulation below sensory threshold and above sensory threshold. Usage patterns and preferences were assessed. SCS patients were given 4-6 programs, some above sensory threshold and some below threshold immediately postoperatively after permanent implantation. Usage patterns of different programs were documented, including percent of time that the settings were used and preference for above threshold vs. below threshold settings during sleeping, walking, sitting, and vigorous activity. Improvements at three months in Oswestry disability index (ODI), numeric rating scale (NRS), Beck depression inventory (BDI), McGill pain questionnaire (MPQ), pain catastrophizing scale (PCS), insomnia severity index (ISI), and Epworth sleepiness scale (ESS) were evaluated. Patients were all trialed on above sensory threshold programs. Six weeks after implantation, most patients preferred above threshold stimulation (74%) vs. below threshold waveforms (21%). Patient diagnosis, type/location of lead or recharging burden played no role in patient preference. Above threshold patients had significantly better improvement in BDI scores than did below threshold patients (p < 0.05) at three-month follow-up but also had worse ESS scores (p < 0.05). Above threshold stimulation was preferred for walking and sitting (p < 0.05). Results indicate that when given the option between waveforms inducing paresthesias and those that do not, SCS patients tend to prefer waveforms that induce paresthesias. Among users of above threshold waveforms, there was preference for these settings during walking and sitting. There was a trend for below threshold preference in vigorous activity and sleeping. © 2017 International Neuromodulation Society.
Wilkinson, Dominic James
2011-01-01
When is it permissible to allow a newborn infant to die on the basis of their future quality of life? The prevailing official view is that treatment may be withdrawn only if the burdens in an infant's future life outweigh the benefits. In this paper I outline and defend an alternative view. On the Threshold View, treatment may be withdrawn from infants if their future well-being is below a threshold that is close to, but above the zero-point of well-being. I present four arguments in favor of the Threshold View, and identify and respond to several counterarguments. I conclude that it is justifiable in some circumstances for parents and doctors to decide to allow an infant to die even though the infant's life would be worth living. The Threshold View provides a justification for treatment decisions that is more consistent, more robust, and potentially more practical than the standard view. PMID:21337273
NaCl intake and preference threshold of spontaneously hypertensive rats.
Fregly, M J
1975-09-01
Both male and female spontaneously hypertensive (SH) rats have an appetite for NaCl solution. The appetite is present when a choice is offered between distilled water and either isotonic or hypertonic (0.25 M) NaCl solution to drink. Total fluid intake (water plus NaCl solution) was greater for SH rats than for controls while food intakes (g/100 g body wt/day) of SH rats were not different from controls. Mean body weight of SH rats was always less than that of controls. The appetite for NaCl solution was accompanied by a significant reduction in preference (detection) threshold. SH rats could detect the difference between distilled water and NaCl solution when the concentration of the latter was 12 mEq/liter compared to a control threshold of 30 mEq/liter. The NaCl appetite and reduced NaCl preference threshold induced by spontaneous hypertension is in marked contrast to the NaCl aversion induced by other types of experimentally induced hypertension in rats. The mechanism or mechanisms responsible for these differences remain for further study.
Threshold-based insulin-pump interruption for reduction of hypoglycemia.
Bergenstal, Richard M; Klonoff, David C; Garg, Satish K; Bode, Bruce W; Meredith, Melissa; Slover, Robert H; Ahmann, Andrew J; Welsh, John B; Lee, Scott W; Kaufman, Francine R
2013-07-18
The threshold-suspend feature of sensor-augmented insulin pumps is designed to minimize the risk of hypoglycemia by interrupting insulin delivery at a preset sensor glucose value. We evaluated sensor-augmented insulin-pump therapy with and without the threshold-suspend feature in patients with nocturnal hypoglycemia. We randomly assigned patients with type 1 diabetes and documented nocturnal hypoglycemia to receive sensor-augmented insulin-pump therapy with or without the threshold-suspend feature for 3 months. The primary safety outcome was the change in the glycated hemoglobin level. The primary efficacy outcome was the area under the curve (AUC) for nocturnal hypoglycemic events. Two-hour threshold-suspend events were analyzed with respect to subsequent sensor glucose values. A total of 247 patients were randomly assigned to receive sensor-augmented insulin-pump therapy with the threshold-suspend feature (threshold-suspend group, 121 patients) or standard sensor-augmented insulin-pump therapy (control group, 126 patients). The changes in glycated hemoglobin values were similar in the two groups. The mean AUC for nocturnal hypoglycemic events was 37.5% lower in the threshold-suspend group than in the control group (980 ± 1200 mg per deciliter [54.4 ± 66.6 mmol per liter] × minutes vs. 1568 ± 1995 mg per deciliter [87.0 ± 110.7 mmol per liter] × minutes, P<0.001). Nocturnal hypoglycemic events occurred 31.8% less frequently in the threshold-suspend group than in the control group (1.5 ± 1.0 vs. 2.2 ± 1.3 per patient-week, P<0.001). The percentages of nocturnal sensor glucose values of less than 50 mg per deciliter (2.8 mmol per liter), 50 to less than 60 mg per deciliter (3.3 mmol per liter), and 60 to less than 70 mg per deciliter (3.9 mmol per liter) were significantly reduced in the threshold-suspend group (P<0.001 for each range). After 1438 instances at night in which the pump was stopped for 2 hours, the mean sensor glucose value was 92.6 ± 40.7 mg per deciliter (5.1 ± 2.3 mmol per liter). Four patients (all in the control group) had a severe hypoglycemic event; no patients had diabetic ketoacidosis. This study showed that over a 3-month period the use of sensor-augmented insulin-pump therapy with the threshold-suspend feature reduced nocturnal hypoglycemia, without increasing glycated hemoglobin values. (Funded by Medtronic MiniMed; ASPIRE ClinicalTrials.gov number, NCT01497938.).
NASA Astrophysics Data System (ADS)
Sobczak, Grzegorz; Dąbrowska, Elżbieta; Teodorczyk, Marian; Kalbarczyk, Joanna; Maląg, Andrzej
2013-01-01
The low quality of the optical beam emitted by high-power laser diodes is the main disadvantage of these devices. The two most important reasons are the highly non-Gaussian beam profile, with relatively wide divergence in the junction plane, and the filamentation effect. Designing the laser diode as an array of narrow, closely spaced single-mode waveguides is one solution to this problem: in such devices, called phase-locked arrays (PLA), there is no room for filament formation. The consequence of optically coupling many single-mode waveguides is emission in the form of a few nearly diffraction-limited beams. Because of losses in the regions between active stripes, however, PLA devices have somewhat higher threshold currents and lower slope efficiencies than wide-stripe devices of similar geometry. In this work, the concept of a high-power laser diode resonator consisting of joined PLA and wide-stripe segments is proposed, and the resulting changes in the electro-optical characteristics of the PLA are discussed. The devices are based on an asymmetric heterostructure designed to improve the catastrophic optical damage threshold as well as the thermal and electrical resistances. Due to the reduced distance from the active layer to the surface in this heterostructure, better stability of the current (and gain) distribution with changing drive level is expected. This could lead to better stability of the optical field distribution and improved supermode control. A reduction of the beam divergence in the direction perpendicular to the junction plane has also been achieved.
Venderink, Wulphert; Govers, Tim M; de Rooij, Maarten; Fütterer, Jurgen J; Sedelaar, J P Michiel
2017-05-01
Three commonly used prostate biopsy approaches are systematic transrectal ultrasound guided, direct in-bore MRI guided, and image fusion guided. The aim of this study was to calculate which strategy is most cost-effective. A decision tree and Markov model were developed to compare cost-effectiveness. Literature review and expert opinion were used as input. A strategy was deemed cost-effective if the costs of gaining one quality-adjusted life year (incremental cost-effectiveness ratio) did not exceed the willingness-to-pay threshold of €80,000 (≈$85,000 in January 2017). A base case analysis was performed to compare systematic transrectal ultrasound- and image fusion-guided biopsies. Because of a lack of appropriate literature regarding the accuracy of direct in-bore MRI-guided biopsy, a threshold analysis was performed. The incremental cost-effectiveness ratio for fusion-guided biopsy compared with systematic transrectal ultrasound-guided biopsy was €1386 ($1470) per quality-adjusted life year gained, which was below the willingness-to-pay threshold and thus assumed cost-effective. If MRI findings are normal in a patient with clinically significant prostate cancer, the sensitivity of direct in-bore MRI-guided biopsy has to be at least 88.8%. If that is the case, the incremental cost-effectiveness ratio is €80,000 per quality-adjusted life year gained and thus cost-effective. Fusion-guided biopsy seems to be cost-effective compared with systematic transrectal ultrasound-guided biopsy. Future research is needed to determine whether direct in-bore MRI-guided biopsy is the best pathway; in this study a threshold was calculated at which it would be cost-effective.
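The decision rule applied in the study reduces to a simple calculation: the incremental cost-effectiveness ratio (ICER) is the cost difference divided by the QALY difference, and a strategy is accepted if the ratio falls below the willingness-to-pay threshold of €80,000 per QALY. The Python sketch below uses hypothetical cost and QALY figures, not the study's model inputs.

WTP_THRESHOLD = 80_000.0  # EUR per QALY gained (willingness-to-pay threshold)

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio of a new strategy vs. a reference."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# hypothetical strategy outputs (lifetime discounted cost in EUR, QALYs), for illustration only
trus   = {"cost": 10_000.0, "qaly": 7.00}
fusion = {"cost": 10_450.0, "qaly": 7.30}

ratio = icer(fusion["cost"], fusion["qaly"], trus["cost"], trus["qaly"])
verdict = "cost-effective" if ratio <= WTP_THRESHOLD else "not cost-effective"
print(f"ICER = EUR {ratio:,.0f} per QALY gained -> {verdict} at EUR {WTP_THRESHOLD:,.0f}/QALY")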
Zhao, Fei-Li; Yue, Ming; Yang, Hua; Wang, Tian; Wu, Jiu-Hong; Li, Shu-Chuen
2011-03-01
To estimate the willingness to pay (WTP) per quality-adjusted life year (QALY) ratio with stated preference data and compare the results obtained between chronic prostatitis (CP) patients and the general population (GP). WTP per QALY was calculated with the subjects' own health-related utility and the WTP value. Two widely used preference-based health-related quality of life instruments, EuroQol (EQ-5D) and Short Form 6D (SF-6D), were used to elicit utility for participants' own health. The monthly WTP values for moving from participants' current health to perfect health were elicited using a closed-ended iterative bidding contingent valuation method. A total of 268 CP patients and 364 participants from the GP completed the questionnaire. We obtained 4 WTP/QALY ratios ranging from $4700 to $7400, which is close to the lower bound of local gross domestic product per capita, a threshold proposed by the World Health Organization. Nevertheless, these values were lower than other proposed thresholds and published empirical research on diseases with mortality risk. Furthermore, the WTP/QALY ratios from the GP were significantly lower than those from the CP patients, and different determinants were associated with the within-group variation identified by multiple linear regression. Preference elicitation methods are acceptable and feasible in the socio-cultural context of an Asian environment, and the calculation of the WTP/QALY ratio produced meaningful answers. The necessity of considering the QALY type or a disease-specific QALY in estimating the WTP/QALY ratio was highlighted, and the 1 to 3 times gross domestic product per capita recommended by the World Health Organization could potentially serve as a benchmark threshold in this Asian context.
Nute, Jessica L; Jacobsen, Megan C; Stefan, Wolfgang; Wei, Wei; Cody, Dianna D
2018-04-01
A prototype QC phantom system and analysis process were developed to characterize the spectral capabilities of a fast kV-switching dual-energy computed tomography (DECT) scanner. This work addresses the current lack of quantitative oversight for this technology, with the goal of identifying relevant scan parameters and test metrics instrumental to the development of a dual-energy quality control (DEQC). A prototype elliptical phantom (effective diameter: 35 cm) was designed with multiple material inserts for DECT imaging. Inserts included tissue equivalent and material rods (including iodine and calcium at varying concentrations). The phantom was scanned on a fast kV-switching DECT system using 16 dual-energy acquisitions (CTDIvol range: 10.3-62 mGy) with varying pitch, rotation time, and tube current. The circular head phantom (22 cm diameter) was scanned using a similar protocol (12 acquisitions; CTDIvol range: 36.7-132.6 mGy). All acquisitions were reconstructed at 50, 70, 110, and 140 keV and using a water-iodine material basis pair. The images were evaluated for iodine quantification accuracy, stability of monoenergetic reconstruction CT number, noise, and positional constancy. Variance component analysis was used to identify technique parameters that drove deviations in test metrics. Variances were compared to thresholds derived from manufacturer tolerances to determine technique parameters that had a nominally significant effect on test metrics. Iodine quantification error was largely unaffected by any of the technique parameters investigated. Monoenergetic HU stability was found to be affected by mAs, with a threshold under which spectral separation was unsuccessful, diminishing the utility of DECT imaging. Noise was found to be affected by CTDIvol in the DEQC body phantom, and CTDIvol and mA in the DEQC head phantom. Positional constancy was found to be affected by mAs in the DEQC body phantom and mA in the DEQC head phantom. A streamlined scan protocol was developed to further investigate the effects of CTDIvol and rotation time while limiting data collection to the DEQC body phantom. Further data collection will be pursued to determine baseline values and statistically based failure thresholds for the validation of long-term DECT scanner performance. © 2018 American Association of Physicists in Medicine.
Holman, Benjamin W B; Coombs, Cassius E O; Morris, Stephen; Kerr, Matthew J; Hopkins, David L
2017-11-01
Beef loins (LL) stored under different chilled-then-frozen storage combinations (up to 5 and 52 weeks, respectively) and two frozen holding temperatures were evaluated for microbial load and meat quality parameters. We found holding temperature effects to be negligible, which suggests that -12°C could deliver LL of comparable quality to -18°C across these same storage periods. Meat quality parameters varied significantly, but when compared to existing consumer thresholds these differences may not be perceptible; colour was the exception, becoming unacceptable earlier in retail display when either the chilled or the subsequent frozen storage period was increased. There was insufficient detection of key spoilage microbes to allow for statistical analysis, potentially due to the hygienic and commercially representative LL source, although variations in water activity, glycogen content, pH, and other moisture parameters conducive to microbial proliferation were influenced by chilled-then-frozen storage. These outcomes could be applied to defining storage thresholds that assure beef quality within export networks, leveraging market access, and improving product management. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
Measuring Gait Quality in Parkinson’s Disease through Real-Time Gait Phase Recognition
Mileti, Ilaria; Germanotta, Marco; Di Sipio, Enrica; Imbimbo, Isabella; Pacilli, Alessandra; Erra, Carmen; Petracca, Martina; Del Prete, Zaccaria; Bentivoglio, Anna Rita; Padua, Luca
2018-01-01
Monitoring gait quality in daily activities through wearable sensors has the potential to improve medical assessment in Parkinson’s Disease (PD). In this study, four gait partitioning methods, two based on thresholds and two based on a machine learning approach, considering the four-phase model, were compared. The methods were tested on 26 PD patients, both in OFF and ON levodopa conditions, and 11 healthy subjects, during walking tasks. All subjects were equipped with inertial sensors placed on feet. Force resistive sensors were used to assess reference time sequence of gait phases. Goodness Index (G) was evaluated to assess accuracy in gait phases estimation. A novel synthetic index called Gait Phase Quality Index (GPQI) was proposed for gait quality assessment. Results revealed optimum performance (G < 0.25) for three tested methods and good performance (0.25 < G < 0.70) for one threshold method. The GPQI resulted significantly higher in PD patients than in healthy subjects, showing a moderate correlation with clinical scales score. Furthermore, in patients with severe gait impairment, GPQI was found higher in OFF than in ON state. Our results unveil the possibility of monitoring gait quality in PD through real-time gait partitioning based on wearable sensors. PMID:29558410
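As a heavily simplified illustration of threshold-based gait partitioning (reduced to a stance/swing split from foot angular velocity, rather than the four-phase methods compared in the study), the Python sketch below labels samples by comparing a gyroscope signal with a fixed threshold; the synthetic signal and the 50 deg/s value are illustrative assumptions.

import numpy as np

fs = 100.0                           # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
# synthetic sagittal angular velocity: bursts during swing, near-zero during stance
gyro = 200 * np.maximum(0, np.sin(2 * np.pi * 1.0 * t)) ** 4
gyro += 5 * np.random.default_rng(2).standard_normal(t.size)   # sensor noise

SWING_THRESHOLD = 50.0               # deg/s, assumed
phase = np.where(np.abs(gyro) > SWING_THRESHOLD, "swing", "stance")

# summarize the detected sequence as (phase, duration in seconds)
segments, start = [], 0
for i in range(1, len(phase)):
    if phase[i] != phase[i - 1]:
        segments.append((phase[start], (i - start) / fs))
        start = i
segments.append((phase[start], (len(phase) - start) / fs))
print(segments[:8])

In the study, such threshold decisions are made per phase of the four-phase model and validated against force resistive sensors; this two-phase version only shows the basic mechanism.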
NASA Technical Reports Server (NTRS)
Carr, James L.; Madani, Houria
2007-01-01
Geostationary Operational Environmental Satellite (GOES) Image Navigation and Registration (INR) performance is specified at the 3-sigma level, meaning that 99.7% of a collection of individual measurements must comply with specification thresholds. Landmarks are measured by the Replacement Product Monitor (RPM), part of the operational GOES ground system, to assess INR performance and to close the INR loop. The RPM automatically discriminates between valid and invalid measurements, enabling it to run without human supervision. In general, this screening is reliable, but a small population of invalid measurements will be falsely identified as valid. Even a small population of invalid measurements can create problems when assessing performance at the 3-sigma level. This paper describes an additional layer of quality control whereby landmarks of the highest quality ("platinum") are identified by their self-consistency. The platinum screening criteria are not simple statistical outlier tests against sigma values in populations of INR errors. In-orbit INR performance metrics for GOES-12 and GOES-13 are presented using the platinum landmark methodology.
Climatic controls on the global distribution, abundance, and species richness of mangrove forests
Osland, Michael J.; Feher, Laura C.; Griffith, Kereen; Cavanaugh, Kyle C.; Enwright, Nicholas M.; Day, Richard H.; Stagg, Camille L.; Krauss, Ken W.; Howard, Rebecca J.; Grace, James B.; Rogers, Kerrylee
2017-01-01
Mangrove forests are highly productive tidal saline wetland ecosystems found along sheltered tropical and subtropical coasts. Ecologists have long assumed that climatic drivers (i.e., temperature and rainfall regimes) govern the global distribution, structure, and function of mangrove forests. However, data constraints have hindered the quantification of direct climate-mangrove linkages in many parts of the world. Recently, the quality and availability of global-scale climate and mangrove data have been improving. Here, we used these data to better understand the influence of air temperature and rainfall regimes upon the distribution, abundance, and species richness of mangrove forests. Although our analyses identify global-scale relationships and thresholds, we show that the influence of climatic drivers is best characterized via regional range limit-specific analyses. We quantified climatic controls across targeted gradients in temperature and/or rainfall within 14 mangrove distributional range limits. Climatic thresholds for mangrove presence, abundance, and species richness differed among the 14 studied range limits. We identified minimum temperature-based thresholds for range limits in eastern North America, eastern Australia, New Zealand, eastern Asia, eastern South America, and southeast Africa. We identified rainfall-based thresholds for range limits in western North America, western Gulf of Mexico, western South America, western Australia, Middle East, northwest Africa, east central Africa, and west central Africa. Our results show that in certain range limits (e.g., eastern North America, western Gulf of Mexico, eastern Asia), winter air temperature extremes play an especially important role. We conclude that rainfall and temperature regimes are both important in western North America, western Gulf of Mexico, and western Australia. With climate change, alterations in temperature and rainfall regimes will affect the global distribution, abundance, and diversity of mangrove forests. In general, warmer winter temperatures are expected to allow mangroves to expand poleward at the expense of salt marshes. However, dispersal and habitat availability constraints may hinder expansion near certain range limits. Along arid and semi-arid coasts, decreases or increases in rainfall are expected to lead to mangrove contraction or expansion, respectively. Collectively, our analyses quantify climate-mangrove linkages and improve our understanding of the expected global- and regional-scale effects of climate change upon mangrove forests.
Han, Su-Ting; Zhou, Ye; Yang, Qing Dan; Zhou, Li; Huang, Long-Biao; Yan, Yan; Lee, Chun-Sing; Roy, Vellaisamy A L
2014-02-25
Tunable memory characteristics are used in multioperational-mode circuits where memory cells with various functionalities are needed in one combined device. It is always a challenge to obtain control over threshold voltage for multimode operation. In this regard, we use a strategy of shifting the work function of reduced graphene oxide (rGO) in a controlled manner through doping with gold chloride (AuCl3) and obtained a graded increase of the rGO work function. By inserting the doped rGO as a floating gate, a controlled threshold voltage (Vth) shift has been achieved in both p- and n-type low-voltage flexible memory devices, with a large memory window (up to 4 times larger for p-type and 8 times larger for n-type memory devices) in comparison with pristine rGO floating-gate memory devices. Through proper energy band engineering, we demonstrated a flexible floating-gate memory device with a larger memory window and controlled threshold voltage shifts.
Smith, Darren A; Saranga, Jacob; Pritchard, Andrew; Kommatas, Nikolaos A; Punnoose, Shinu Kovelal; Kale, Supriya Tukaram
2018-01-01
Mulligan's mobilisation-with-movement (MWM) techniques are proposed to achieve their clinical benefit via neurophysiological mechanisms. However, previous research has focussed on responses in the sympathetic nervous system only, and is not conclusive. An alternative measure of neurophysiological response to MWM is required to support or refute this mechanism of action. Recently, vibration threshold (VT) has been used to quantify changes in the sensory nervous system in patients experiencing musculoskeletal pain. To investigate the effect of a lateral glide MWM of the hip joint on vibration threshold compared to a placebo and control condition in asymptomatic volunteers. Fifteen asymptomatic volunteers participated in this single-blinded, randomised, within-subject, placebo, control design. Participants received each of three interventions in a randomised order; a lateral glide MWM of the hip joint into flexion, a placebo MWM, and a control intervention. Vibration threshold (VT) measures were taken at baseline and immediately after each intervention. Mean change in VT from baseline was calculated for each intervention and then analysed for between group differences using a one-way analysis of variance (ANOVA). A one-way ANOVA revealed no statistically significant differences between the three experimental conditions (P = 0.812). This small study found that a lateral glide MWM of the hip did not significantly change vibration threshold compared to a placebo and control intervention in an asymptomatic population. This study provides a method of using vibration threshold to investigate the potential neurophysiological effects of a manual therapy intervention that should be repeated in a larger, symptomatic population. Copyright © 2016 Elsevier Ltd. All rights reserved.
Effect of Age and Severity of Facial Palsy on Taste Thresholds in Bell's Palsy Patients
Park, Jung Min; Kim, Myung Gu; Jung, Junyang; Kim, Sung Su; Jung, A Ra; Kim, Sang Hoon
2017-01-01
Background and Objectives To investigate whether taste thresholds, as determined by electrogustometry (EGM) and chemical taste tests, differ by age and the severity of facial palsy in patients with Bell's palsy. Subjects and Methods This study included 29 patients diagnosed with Bell's palsy between January 2014 and May 2015 in our hospital. Patients were grouped by age and by severity of facial palsy, as determined by the House-Brackmann scale, and their taste thresholds were assessed by EGM and chemical taste tests. Results EGM showed that taste thresholds at four locations on the tongue and one location on the central soft palate, 1 cm from the palatine uvula, were significantly higher in Bell's palsy patients than in controls (p<0.05). In contrast, chemical taste tests showed no significant differences in taste thresholds between the two groups (p>0.05). The severity of facial palsy did not affect taste thresholds, as determined by both EGM and chemical taste tests (p>0.05). The overall mean electrical taste thresholds on EGM were higher in younger Bell's palsy patients than in healthy subjects, with the difference at the back-right area of the tongue reaching statistical significance (p<0.05). In older individuals, however, no significant differences in taste thresholds were observed between Bell's palsy patients and healthy subjects (p>0.05). Conclusions Electrical taste thresholds were higher in Bell's palsy patients than in controls. These differences were observed in younger, but not in older, individuals. PMID:28417103
Assumpção, Ana; Matsutani, Luciana A; Yuan, Susan L; Santo, Adriana S; Sauer, Juliana; Mango, Pamela; Marques, Amelia P
2017-11-29
Exercise therapy is an effective component of fibromyalgia (FM) treatment. However, it is important to know the effects and specificities of the different types of exercise: muscle stretching and resistance training. To verify and compare the effectiveness of muscle stretching exercise and resistance training for symptoms and quality of life in FM patients. Randomized controlled trial. Physical therapy service, FM outpatient clinic. Forty-four women with FM (79 screened). Patients were randomly allocated into a stretching group (n=14), resistance group (n=16), and control group (n=14). Pain was assessed using the visual analog scale, pain threshold using a Fischer dolorimeter, FM symptoms using the Fibromyalgia Impact Questionnaire (FIQ), and quality of life using the Medical Outcomes Study 36-item Short-Form Health Survey (SF-36). The three intervention groups continued with usual medical treatment. In addition, the stretching and resistance groups performed two different exercise programs twice a week for 12 weeks. After treatment, the stretching group showed the highest SF-36 physical functioning score (p=0.01) and the lowest bodily pain score (p=0.01). The resistance group had the lowest FIQ depression score (p=0.02). The control group had the highest score for FIQ morning tiredness and stiffness, and the lowest score for SF-36 vitality. In clinical analyses, the stretching group had significant improvement in quality of life for all SF-36 domains, and the resistance group had significant improvement in FM symptoms and in quality of life for SF-36 domains of physical functioning, vitality, social function, emotional role, and mental health. Muscle stretching exercise was the most effective modality in improving quality of life, especially with regard to physical functioning and pain, and resistance training was the most effective modality in reducing depression. The trial included a control group and two intervention groups, both of which received exercise programs created specifically for patients with FM. In clinical practice, we suggest including both of these modalities in an exercise therapy program for FM.
Kent, Tiffany L; Glybina, Inna V; Abrams, Gary W; Iezzi, Raymond
2008-01-01
To determine whether the sustained intravitreous delivery of CNTF modulates cortical response thresholds to electrical retinal stimulation in the RCS rat model of retinal degeneration. Animals were assigned to four groups: untreated, nonsurgical control and infusion groups of 10 ng/d CNTF, 1 ng/d CNTF, and PBS vehicle control. Thresholds for electrically evoked cortical potentials (EECPs) were recorded in response to transcorneal electrical stimulation of the retina at p30 and again at p60, after a three-week infusion. As the retina degenerated over time, EECP thresholds in response to electrical retinal stimulation increased. Eyes treated with 10 ng/d CNTF demonstrated significantly greater retinal sensitivity to electrical stimulation when compared with all other groups. In addition, eyes treated with 1 ng/d CNTF demonstrated significantly greater retinal sensitivity than both PBS-treated and untreated control groups. Retinal sensitivity to electrical stimulation was preserved in animals treated with chronic intravitreous infusion of CNTF. These data suggest that CNTF-mediated retinal neuroprotection may be a novel therapy that can lower stimulus thresholds in patients about to undergo retinal prosthesis implantation. Furthermore, it may maintain the long-term efficacy of these devices in patients.
David W. P. Manning; Amy D. Rosemond; Vladislav Gulis; Jonathan P. Benstead; John S. Kominoski; John C. Maerz
2016-01-01
Nutrient enrichment of detritus-based streams increases detrital resource quality for consumers and stimulates breakdown rates of particulate organic carbon (C). The relative importance of dissolved inorganic nitrogen (N) vs. phosphorus (P) for detrital quality and their effects on microbial- vs. detritivore-mediated detrital breakdown are poorly understood....
Fitness and Independence after SCI: Defining Meaningful Change and Thresholds
2016-10-01
Approved for Public Release; Distribution Unlimited. Abstract (fragment): Quality of life after SCI/D depends more on...determine if low fitness is limiting transfer ability. Subject terms: Spinal Cord Injury, Fitness, Independence, Quality of Life.
Why do adults with dyslexia have poor global motion sensitivity?
Conlon, Elizabeth G; Lilleskaret, Gry; Wright, Craig M; Stuksrud, Anne
2013-01-01
Two experiments aimed to determine why adults with dyslexia have higher global motion thresholds than typically reading controls. In Experiment 1, the dot density and number of animation frames presented in the dot stimulus were manipulated because of findings that use of a high dot density can normalize coherence thresholds in individuals with dyslexia. Dot densities were 14.15 and 3.54 dots/deg². These were presented for five (84 ms) or eight (134 ms) frames. The dyslexia group had higher coherence thresholds in all conditions than controls. However, in the high dot density, long duration condition, both reader groups had the lowest thresholds, indicating normal temporal recruitment. These results indicated that the dyslexia group could sample the additional signal dots over space and then integrate these with the same efficiency as controls. In Experiment 2, we determined whether briefly presenting a fully coherent prime moving in either the same or opposite direction of motion to a partially coherent test stimulus would systematically increase and decrease global motion thresholds in the reader groups. When the direction of motion in the prime and test was the same, global motion thresholds increased for both reader groups. The increase in coherence thresholds was significantly greater for the dyslexia group. When the motion of the prime and test were presented in opposite directions, coherence thresholds were reduced in both groups. No group threshold differences were found. We concluded that the global motion processing deficit found in adults with dyslexia can be explained by undersampling of the target motion signals. This might occur because of difficulties directing attention to the relevant motion signals in the random dot pattern, and not a specific difficulty integrating global motion signals. These effects are most likely to occur in the group with dyslexia when more complex computational processes are required to process global motion. PMID:24376414
Results of the Level-1 Water-Quality Inventory at the Pinnacles National Monument, June 2006
Borchers, James W.; Lyttge, Michael S.
2007-01-01
To help define baseline water quality of key water resources at Pinnacles National Monument, California, the U.S. Geological Survey collected and analyzed ground water from seven springs sampled during June 2006. During the dry season, seeps and springs are the primary source of water for wildlife in the monument and provide habitat for plants, amphibians, and aquatic life. Water samples were analyzed for dissolved concentrations of major ions, trace elements, nutrients, stable isotopes of hydrogen and oxygen, and tritium. In most cases, the concentrations of measured water-quality constituents in spring samples were lower than California threshold standards for drinking water and Federal threshold standards for drinking water and aquatic life. The concentrations of dissolved arsenic in three springs were above the Federal Maximum Contaminant Level for drinking water (10 µg/L). Water-quality information for samples collected from the springs will provide a reference point for comparison of samples collected from future monitoring networks and hydrologic studies in the Pinnacles National Monument, and will help National Park Service managers assess relations between water chemistry, geology, and land use.
NASA Astrophysics Data System (ADS)
Leonarduzzi, Elena; Molnar, Peter; McArdell, Brian W.
2017-08-01
A high-resolution gridded daily precipitation data set was combined with a landslide inventory containing over 2000 events in the period 1972-2012 to analyze rainfall thresholds which lead to landsliding in Switzerland. We colocated triggering rainfall to landslides, developed distributions of triggering and nontriggering rainfall event properties, and determined rainfall thresholds and intensity-duration ID curves and validated their performance. The best predictive performance was obtained by the intensity-duration ID threshold curve, followed by peak daily intensity Imax and mean event intensity Imean. Event duration by itself had very low predictive power. A single country-wide threshold of Imax = 28 mm/d was extended into space by regionalization based on surface erodibility and local climate (mean daily precipitation). It was found that wetter local climate and lower erodibility led to significantly higher rainfall thresholds required to trigger landslides. However, we showed that the improvement in model performance due to regionalization was marginal and much lower than what can be achieved by having a high-quality landslide database. Reference cases in which the landslide locations and timing were randomized and the landslide sample size was reduced showed the sensitivity of the Imax rainfall threshold model. Jack-knife and cross-validation experiments demonstrated that the model was robust. The results reported here highlight the potential of using rainfall ID threshold curves and rainfall threshold values for predicting the occurrence of landslides on a country or regional scale with possible applications in landslide warning systems, even with daily data.
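The validation of a simple Imax threshold can be expressed as a confusion-matrix calculation, sketched below in Python with an invented toy event list (the study itself uses the colocated gridded-rainfall/landslide database); the 28 mm/d value is the country-wide threshold reported above.

events = [
    # (peak daily intensity Imax in mm/d, landslide observed?) -- toy data for illustration
    (12.0, False), (35.0, True), (8.0, False), (55.0, True),
    (30.0, False), (20.0, False), (70.0, True), (27.0, True),
]

IMAX_THRESHOLD = 28.0  # mm/d, country-wide threshold reported in the abstract

tp = fp = fn = tn = 0
for imax, triggered in events:
    predicted = imax > IMAX_THRESHOLD
    if predicted and triggered:
        tp += 1
    elif predicted and not triggered:
        fp += 1
    elif not predicted and triggered:
        fn += 1
    else:
        tn += 1

hit_rate = tp / (tp + fn)             # fraction of landslides correctly warned
false_alarm_rate = fp / (fp + tn)     # fraction of quiet events wrongly warned
print(f"hit rate = {hit_rate:.2f}, false alarm rate = {false_alarm_rate:.2f}")

The same scoring applies unchanged to an intensity-duration curve: the prediction simply becomes whether the event's mean intensity exceeds the ID curve evaluated at the event's duration.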
Improved Controller Design of Grid Friendly™ Appliances for Primary Frequency Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, Jianming; Sun, Yannan; Marinovici, Laurentiu D.
2015-09-01
The Grid Friendly™ Appliance (GFA) controller, developed at Pacific Northwest National Laboratory, can autonomously switch off appliances by detecting under-frequency events. In this paper, the impacts of the curtailing frequency threshold on the performance of frequency-responsive GFAs are carefully analyzed first. The current method of selecting curtailing frequency thresholds for GFAs is found to be insufficient to guarantee the desired performance, especially when the frequency deviation is shallow. In addition, the power reduction of online GFAs can be so excessive that it impacts the system response negatively. As a remedy to the deficiency of the current controller design, a different way of selecting curtailing frequency thresholds is proposed to ensure the effectiveness of GFAs in frequency protection. Moreover, it is also proposed to introduce a supervisor at each distribution feeder to monitor the curtailing frequency thresholds of online GFAs and take corrective actions if necessary.
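A minimal sketch of the kind of under-frequency logic discussed above is given below in Python: each appliance holds its own curtailing threshold drawn from a band and sheds load when the measured frequency falls below it, so a shallow dip curtails only part of the population. The threshold band, release rule, and synthetic frequency dip are illustrative assumptions, not the PNNL controller design.

import numpy as np

rng = np.random.default_rng(3)
N = 1000
# spread curtailing thresholds over a band instead of a single value, so a shallow dip
# sheds only part of the population (the sensitivity discussed in the paper)
thresholds = rng.uniform(59.95, 59.99, size=N)   # Hz, assumed band
RELEASE = 59.995                                 # Hz, restore once frequency recovers

dt = 0.1
time = np.arange(0, 30, dt)
freq = 60.0 - 0.03 * np.exp(-((time - 10.0) ** 2) / 20.0)   # synthetic shallow dip

on = np.ones(N, dtype=bool)
for f in freq:
    on[thresholds > f] = False   # curtail appliances whose threshold is violated
    if f > RELEASE:
        on[:] = True             # simple restoration rule after recovery

shed_at_nadir = np.mean(thresholds > freq.min())
print(f"fraction of GFAs curtailed at the frequency nadir: {shed_at_nadir:.2f}")
print(f"fraction back online at the end of the event: {np.mean(on):.2f}")

Spreading the thresholds over a band is one simple way to avoid the all-or-nothing response that makes a single fixed threshold either ineffective for shallow dips or excessive for deep ones.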
A Connection Admission Control Method for Web Server Systems
NASA Astrophysics Data System (ADS)
Satake, Shinsuke; Inai, Hiroshi; Saito, Tomoya; Arai, Tsuyoshi
Most browsers establish multiple connections and download files in parallel to reduce the response time. On the other hand, a web server limits the total number of connections to prevent itself from being overloaded. That could decrease the response time but would increase the loss probability, that is, the probability that a newly arriving client is rejected. This paper proposes a connection admission control method which accepts only one connection from a newly arriving client when the number of connections exceeds a threshold, but accepts multiple new connections when the number of connections is below the threshold. Our method aims to reduce the response time by allowing as many clients as possible to establish multiple connections, while also reducing the loss probability. To reduce the time web server administrators spend determining an adequate threshold, we introduce a procedure which approximately calculates the loss probability for a given threshold. Via simulation, we validate the approximation and show the effectiveness of the admission control.
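The proposed admission rule can be sketched in a few lines of Python: a newly arriving client receives multiple parallel connections while the server is below the threshold, a single connection once the threshold is exceeded, and is rejected at the hard limit. The capacity and threshold values are illustrative.

class ThresholdAdmissionControl:
    def __init__(self, max_connections=100, threshold=80):
        self.max_connections = max_connections
        self.threshold = threshold
        self.active = 0

    def admit(self, requested):
        """Return the number of connections granted to a newly arriving client."""
        if self.active >= self.max_connections:
            return 0                                          # client is rejected (loss)
        if self.active < self.threshold:
            granted = min(requested, self.max_connections - self.active)
        else:
            granted = 1                                       # degrade to a single connection
        self.active += granted
        return granted

    def release(self, n):
        self.active = max(0, self.active - n)

server = ThresholdAdmissionControl()
grants = [server.admit(requested=4) for _ in range(45)]
print(grants)    # early clients get 4 connections, later ones 1, and the last are rejected

The threshold thus trades response time (parallel downloads for early clients) against loss probability (keeping some capacity to admit later clients with at least one connection).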
Adaptive measurements of urban runoff quality
NASA Astrophysics Data System (ADS)
Wong, Brandon P.; Kerkez, Branko
2016-11-01
An approach to adaptively measure runoff water quality dynamics is introduced, focusing specifically on characterizing the timing and magnitude of urban pollutographs. Rather than relying on a static schedule or flow-weighted sampling, which can miss important water quality dynamics if parameterized inadequately, novel Internet-enabled sensor nodes are used to autonomously adapt their measurement frequency to real-time weather forecasts and hydrologic conditions. This dynamic approach has the potential to significantly improve the use of constrained experimental resources, such as automated grab samplers, which continue to provide a strong alternative to sampling water quality dynamics when in situ sensors are not available. Compared to conventional flow-weighted or time-weighted sampling schemes, which rely on preset thresholds, a major benefit of the approach is the ability to dynamically adapt to features of an underlying hydrologic signal. A 28 km² urban watershed was studied to characterize concentrations of total suspended solids (TSS) and total phosphorus. Water quality samples were autonomously triggered in response to features in the underlying hydrograph and real-time weather forecasts. The study watershed did not exhibit a strong first flush, and intraevent concentration variability was driven by flow acceleration, wherein the largest loadings of TSS and total phosphorus corresponded with the steepest rising limbs of the storm hydrograph. The scalability of the proposed method is discussed in the context of larger sensor network deployments, as well as its potential to improve control of urban water quality.
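A minimal sketch of the adaptive triggering idea follows in Python: the sampling interval tightens when the forecast rain probability is high, and a grab sample is triggered when the hydrograph rises faster than a slope threshold, subject to a bottle budget. The flow series, forecast handling, and threshold values are illustrative assumptions, not the deployed node firmware.

def plan_sampling(flow_series, rain_prob, dt_min=15, dt_max=60,
                  rain_prob_trigger=0.5, rise_trigger=0.2, bottles=24):
    """flow_series: list of (time_min, flow_m3s); returns the times at which to pull samples."""
    samples, last_sample_time = [], float("-inf")
    # tighten the baseline sampling interval when rain is forecast
    interval = dt_min if rain_prob >= rain_prob_trigger else dt_max
    for (t0, q0), (t1, q1) in zip(flow_series, flow_series[1:]):
        rising_fast = (q1 - q0) / (t1 - t0) >= rise_trigger   # m^3/s per minute
        due = (t1 - last_sample_time) >= interval
        if len(samples) < bottles and (rising_fast or due):
            samples.append(t1)
            last_sample_time = t1
    return samples

# toy storm hydrograph: flow ramps up quickly for 10 min, then recedes
flow = [(t, 0.5 + 0.3 * t if t < 10 else max(0.0, 3.5 - 3.5 * (t - 10) / 120.0))
        for t in range(0, 180, 5)]
print(plan_sampling(flow, rain_prob=0.8))

The rise trigger concentrates samples on the steep rising limb, which is where the study found the largest TSS and total phosphorus loadings.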
Optimal control strategy for a novel computer virus propagation model on scale-free networks
NASA Astrophysics Data System (ADS)
Zhang, Chunming; Huang, Haitao
2016-06-01
This paper studies the combined impact of system reinstallation and network topology on the spread of computer viruses over the Internet. Based on a scale-free network, the paper proposes a novel computer virus propagation model, the SLBOS model. A systematic analysis of this new model shows that the virus-free equilibrium is globally asymptotically stable when the spreading threshold is less than one, whereas the viral equilibrium is proved to be permanent if the spreading threshold is greater than one. The impacts of different model parameters on the spreading threshold are then analyzed. Next, an optimally controlled SLBOS epidemic model on complex networks is also studied. We prove that an optimal control exists for the control problem. Some numerical simulations are finally given to illustrate the main results.
NASA Astrophysics Data System (ADS)
Weerasinghe, H. W. Kushan; Dadashzadeh, Neda; Thirugnanasambandam, Manasadevi P.; Debord, Benoît.; Chafer, Matthieu; Gérôme, Frédéric; Benabid, Fetah; Corwin, Kristan L.; Washburn, Brian R.
2018-02-01
The effect of gas pressure, fiber length, and optical pump power on an acetylene mid-infrared hollow-core optical fiber gas laser (HOFGLAS) is experimentally determined in order to scale the laser to higher powers. The absorbed optical power and threshold power are measured for different pressures providing an optimum pressure for a given fiber length. We observe a linear dependence of both absorbed pump energy and lasing threshold for the acetylene HOFGLAS, while maintaining a good mode quality with an M-squared of 1.15. The threshold and mode behavior are encouraging for scaling to higher pressures and pump powers.
Improving ontology matching with propagation strategy and user feedback
NASA Astrophysics Data System (ADS)
Li, Chunhua; Cui, Zhiming; Zhao, Pengpeng; Wu, Jian; Xin, Jie; He, Tianxu
2015-07-01
Markov logic networks, which unify probabilistic graphical models and first-order logic, provide an excellent framework for ontology matching. The existing approach requires a threshold to produce matching candidates and uses a small set of constraints acting as a filter to select the final alignments. We introduce a novel match propagation strategy to model the influences between potential entity mappings across ontologies, which helps to identify correct correspondences and recover missed correspondences. The estimation of an appropriate threshold is a difficult task. We propose an interactive method for threshold selection through which we obtain an additional measurable improvement. Experiments on a public dataset demonstrate the effectiveness of the proposed approach in terms of the quality of the resulting alignment.
78 FR 69177 - Ownership and Control Reports, Forms 102/102S, 40/40S, and 71
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-18
... that comprise each special account; requiring the reporting of certain omnibus account information on..., information regarding the owners and controllers of volume threshold accounts reported on Form 102B and that... introducing a new information collection for omnibus volume threshold accounts in New Form 71.\\11\\ The rules...
Threshold flux-controlled memristor model and its equivalent circuit implementation
NASA Astrophysics Data System (ADS)
Wu, Hua-Gan; Bao, Bo-Cheng; Chen, Mo
2014-11-01
Modeling a memristor is an effective way to explore memristor properties, because memristor devices are still not commercially available to most researchers. In this paper, a physical memristive device is assumed to exist whose ionic drift direction is perpendicular to the direction of the applied voltage. Based on this device, and by analogy with the HP charge-controlled memristor model, a novel threshold flux-controlled memristor model with a window function is proposed. The fingerprints of the proposed model are analyzed. In particular, a practical equivalent circuit of the proposed model is realized, from which the corresponding experimental fingerprints are captured. The equivalent circuit of the threshold memristor model is suitable for various memristor-based breadboard experiments.
Paganoni, C.A.; Chang, K.C.; Robblee, M.B.
2006-01-01
A significant data quality challenge for highly variant systems is the limited ability to quantify operationally reasonable limits on the data elements being collected and to provide reasonable threshold predictions. In many instances, the number of influences that drive a resulting value or operational range is too large to enable physical sampling for each influencer, or is too complicated to model accurately in an explicit simulation. An alternative method to determine reasonable observation thresholds is to employ an automation algorithm that emulates a human analyst visually inspecting data for limits. Using the visualization technique of self-organizing maps (SOM) on data having poorly understood relationships, a methodology for determining threshold limits was developed. To illustrate this approach, analysis of environmental influences that drive the abundance of a target indicator species (the pink shrimp, Farfantepenaeus duorarum) provided a real example of applicability. The relationship between salinity and temperature and the abundance of F. duorarum is well documented, but the effect of upstream changes in water quality on pink shrimp abundance is not well understood. The highly variant nature of catches of organisms in the wild, and the data available from upstream hydrology measures of salinity and temperature, made this an ideal candidate for the approach to provide a determination about the influence of changes in hydrology on populations of organisms.
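As a rough illustration of the general idea (train a self-organizing map on the observed variable and read operational limits off its codebook), the toy sketch below uses a hand-written one-dimensional SOM on synthetic data. It is not the authors' implementation; data, map size, and learning schedule are assumptions.

```python
import numpy as np

# Toy sketch: train a small 1-D self-organizing map on a noisy variable and use
# the spread of its codebook vectors as data-driven observation limits.
# Synthetic data and a hand-rolled SOM; not the authors' implementation.

rng = np.random.default_rng(0)
data = rng.normal(loc=25.0, scale=3.0, size=2000)   # e.g., salinity-like readings

n_nodes = 10
weights = rng.uniform(data.min(), data.max(), size=n_nodes)

for t, x in enumerate(rng.permutation(data)):
    lr = 0.5 * np.exp(-t / len(data))                        # decaying learning rate
    sigma = max(1.0, n_nodes / 2 * np.exp(-t / len(data)))   # neighborhood width
    bmu = np.argmin(np.abs(weights - x))                     # best-matching unit
    dist = np.abs(np.arange(n_nodes) - bmu)
    h = np.exp(-(dist ** 2) / (2 * sigma ** 2))              # neighborhood function
    weights += lr * h * (x - weights)

# Codebook extremes serve as "operationally reasonable" lower/upper thresholds.
low, high = weights.min(), weights.max()
print(f"suggested limits: {low:.1f} .. {high:.1f}")
```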
Modeled summer background concentration nutrients and ...
We used regression models to predict background concentrations of four water quality indicators: total nitrogen (N), total phosphorus (P), chloride, and total suspended solids (TSS), in the mid-continent (USA) great rivers, the Upper Mississippi, the Lower Missouri, and the Ohio. From best-model linear regressions of water quality indicators with land use and other stressor variables, we determined the concentration of the indicators when the land use and stressor variables were all set to zero (the y-intercept). Except for total P on the Upper Mississippi River and chloride on the Ohio River, we were able to predict background concentration from significant regression models. In every model with more than one predictor variable, the model included at least one variable representing agricultural land use and one variable representing development. Predicted background concentration of total N was the same on the Upper Mississippi and Lower Missouri rivers (350 µg l-1), which was much lower than a published eutrophication threshold and percentile-based thresholds (25th percentile of concentration at all sites in the population) but was similar to a threshold derived from the response of sestonic chlorophyll a to great river total N concentration. Background concentration of total P on the Lower Missouri (53 µg l-1) was also lower than published and percentile-based thresholds. Background TSS concentration was higher on the Lower Missouri (30 mg l-1) than the other ri
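The intercept-as-background idea can be illustrated with an ordinary least-squares sketch: regress the indicator on the land-use and stressor variables, and read the background level off the fitted intercept. The predictor names and data below are placeholders, not the study's dataset.

```python
import numpy as np

# Sketch of the intercept-as-background idea: regress a water quality indicator
# on land-use/stressor variables and take the intercept as the concentration
# with all stressors set to zero. Synthetic data; names are placeholders.

rng = np.random.default_rng(1)
n = 60
agriculture = rng.uniform(0, 1, n)     # fraction of agricultural land use
development = rng.uniform(0, 1, n)     # fraction of developed land use
total_n = 350 + 900 * agriculture + 400 * development + rng.normal(0, 50, n)  # ug/L

X = np.column_stack([np.ones(n), agriculture, development])
coef, *_ = np.linalg.lstsq(X, total_n, rcond=None)

background = coef[0]                   # predicted concentration with all stressors at zero
print(f"estimated background total N: {background:.0f} ug/L")
```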
Gauchard, Gérome C; Gangloff, Pierre; Jeandel, Claude; Perrin, Philippe P
2003-09-01
Balance disorders increase considerably with age due to a decrease in posture regulation quality, and are accompanied by a higher risk of falling. Conversely, physical activities have been shown to improve the quality of postural control in elderly individuals and decrease the number of falls. The aim of this study was to evaluate the impact of two types of exercise on the visual afferent and on the different parameters of static balance regulation. Static postural control was evaluated in 44 healthy women aged over 60 years. Among them, 15 regularly practiced proprioceptive physical activities (Group I), 12 regularly practiced bioenergetic physical activities (Group II), and 18 controls walked on a regular basis (Group III). Group I participants displayed lower sway path and area values, whereas Group III participants displayed the highest, both in eyes-open and eyes-closed conditions. Group II participants displayed intermediate values, close to those of Group I in the eyes-open condition and those of Group III in the eyes-closed condition. Visual afferent contribution was more pronounced for Group II and III participants than for Group I participants. Proprioceptive exercise appears to have the best impact on balance regulation and precision. Besides, even if bioenergetic activity improves postural control in simple postural tasks, more difficult postural tasks show that this type of activity does not develop a neurosensorial proprioceptive input threshold as well, probably on account of the higher contribution of visual afferent.
Quality index of radiological devices: results of one year of use.
Tofani, Alessandro; Imbordino, Patrizia; Lecci, Antonio; Bonannini, Claudia; Del Corona, Alberto; Pizzi, Stefano
2003-01-01
The physical quality index (QI) of radiological devices summarises in a single numerical value between 0 and 1 the results of constancy tests. The aim of this paper is to illustrate the results of using such an index on all public radiological devices in the Livorno province over one year. The quality index was calculated for 82 radiological devices of a wide range of types by implementing its algorithm in spreadsheet-based software for the automatic handling of quality control data. The distribution of quality index values was computed together with the associated statistical quantities. This distribution is strongly asymmetrical, with a sharp peak near the highest QI values. The mean quality index values for the different types of device show some inhomogeneity: in particular, mammography and panoramic dental radiography devices show far lower quality than other devices. In addition, our analysis has identified the parameters that most frequently fail the quality tests for each type of device. Finally, we sought a correlation between quality and age of the device, but the correlation was only weakly significant. The quality index proved to be a useful tool providing an overview of the physical condition of radiological devices. By selecting adequate QI threshold values, it also helps to decide whether a given device should be upgraded or replaced. The identification of critical parameters for each type of device may be used to improve the definition of the QI by attributing greater weights to critical parameters, so as to better address the maintenance of radiological devices.
Effect of image quality on calcification detection in digital mammography
Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M.; Wallis, Matthew G.; Chakraborty, Dev P.; Dance, David R.; Bosmans, Hilde; Young, Kenneth C.
2012-01-01
Purpose: This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including: different detectors, dose levels, and different image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at quarter this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC (AFROC) area decreased from 0.84 to 0.63 and the ROC area decreased from 0.91 to 0.79 (p < 0.0001). This corresponded to a 30% drop in lesion sensitivity at a NLF equal to 0.1. Detection was also sensitive to the dose used. There was no significant difference in detection between the two image processing algorithms used (p > 0.05). It was additionally found that lower threshold gold thickness from CDMAM analysis implied better cluster detection. The measured threshold gold thickness passed the acceptable limit set in the EU standards for all image qualities except half dose CR. However, calcification detection varied significantly between image qualities. This suggests that the current EU guidelines may need revising. Conclusions: Microcalcification detection was found to be sensitive to detector and dose used. Standard measurements of image quality were a good predictor of microcalcification cluster detection. PMID:22755704
Effect of image quality on calcification detection in digital mammography.
Warren, Lucy M; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M; Wallis, Matthew G; Chakraborty, Dev P; Dance, David R; Bosmans, Hilde; Young, Kenneth C
2012-06-01
This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including: different detectors, dose levels, and different image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. One hundred and sixty two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at quarter this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC (AFROC) area decreased from 0.84 to 0.63 and the ROC area decreased from 0.91 to 0.79 (p < 0.0001). This corresponded to a 30% drop in lesion sensitivity at a NLF equal to 0.1. Detection was also sensitive to the dose used. There was no significant difference in detection between the two image processing algorithms used (p > 0.05). It was additionally found that lower threshold gold thickness from CDMAM analysis implied better cluster detection. The measured threshold gold thickness passed the acceptable limit set in the EU standards for all image qualities except half dose CR. However, calcification detection varied significantly between image qualities. This suggests that the current EU guidelines may need revising. Microcalcification detection was found to be sensitive to detector and dose used. Standard measurements of image quality were a good predictor of microcalcification cluster detection. © 2012 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Finneran, James J.; Carder, Donald A.; Ridgway, Sam H.
2002-01-01
The relative contributions of acoustic pressure and particle velocity to the low-frequency, underwater hearing abilities of the bottlenose dolphin (Tursiops truncatus) and white whale (Delphinapterus leucas) were investigated by measuring (masked) hearing thresholds while manipulating the relationship between the pressure and velocity. This was accomplished by varying the distance within the near field of a single underwater sound projector (experiment I) and using two underwater sound projectors and an active sound control system (experiment II). The results of experiment I showed no significant change in pressure thresholds as the distance between the subject and the sound source was changed. In contrast, velocity thresholds tended to increase and intensity thresholds tended to decrease as the source distance decreased. These data suggest that acoustic pressure is a better indicator of threshold, compared to particle velocity or mean active intensity, in the subjects tested. Interpretation of the results of experiment II (the active sound control system) was difficult because of complex acoustic conditions and the unknown effects of the subject on the generated acoustic field; however, these data also tend to support the results of experiment I and suggest that odontocete thresholds should be reported in units of acoustic pressure, rather than intensity.
Bidirectional control system for energy flow in solar powered flywheel
NASA Technical Reports Server (NTRS)
Nola, Frank J. (Inventor)
1987-01-01
An energy storage system for a spacecraft is provided which employs a solar powered flywheel arrangement including a motor/generator which, in different operating modes, drives the flywheel and is driven thereby. A control circuit, including a threshold comparator, senses the output of a solar energy converter, and when a threshold voltage is exceeded thereby indicating the availability of solar power for the spacecraft loads, activates a speed control loop including the motor/generator so as to accelerate the flywheel to a constant speed and thereby store mechanical energy, while also supplying energy from the solar converter to the loads. Under circumstances where solar energy is not available and thus the threshold voltage is not exceeded, the control circuit deactivates the speed control loop and activates a voltage control loop that provides for operation of the motor as a generator so that mechanical energy from the flywheel is converted into electrical energy for supply to the spacecraft loads.
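The mode-switching logic described in this record reduces to a threshold comparison on the solar converter output. A minimal sketch follows; the threshold value, function names, and mode labels are illustrative assumptions, not the patented circuit.

```python
# Minimal sketch of the threshold comparator logic described above: when solar
# output exceeds the threshold, run the machine as a motor and store energy in
# the flywheel; otherwise run it as a generator to power the loads.
# The threshold value and names are illustrative assumptions.

THRESHOLD_VOLTS = 28.0   # assumed comparator threshold

def select_mode(solar_bus_volts: float) -> str:
    if solar_bus_volts > THRESHOLD_VOLTS:
        return "speed_control"    # motor mode: accelerate flywheel to constant speed
    return "voltage_control"      # generator mode: flywheel energy supplies the loads

if __name__ == "__main__":
    for v in (32.0, 25.0):
        print(v, "->", select_mode(v))
```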
Re: Request for Correction - IRIS Assessment for Trichloroethylene
Letter from Faye Graul providing supplemental information to her Request for Correction for Threshold of Trichloroethylene Contamination of Maternal Drinking Waters submitted under the Information Quality Act.
Bahouth, George; Graygo, Jill; Digges, Kennerly; Schulman, Carl; Baur, Peter
2014-01-01
The objectives of this study are to (1) characterize the population of crashes meeting the Centers for Disease Control and Prevention (CDC)-recommended 20% risk of Injury Severity Score (ISS)>15 injury and (2) explore the positive and negative effects of an advanced automatic crash notification (AACN) system whose threshold for high-risk indications is 10% versus 20%. Binary logistic regression analysis was performed to predict the occurrence of motor vehicle crash injuries at both the ISS>15 and Maximum Abbreviated Injury Scale (MAIS) 3+ level. Models were trained using crash characteristics recommended by the CDC Committee on Advanced Automatic Collision Notification and Triage of the Injured Patient. Each model was used to assign the probability of severe injury (defined as MAIS 3+ or ISS>15 injury) to a subset of NASS-CDS cases based on crash attributes. Subsequently, actual AIS and ISS levels were compared with the predicted probability of injury to determine the extent to which the seriously injured had corresponding probabilities exceeding the 10% and 20% risk thresholds. Models were developed using an 80% sample of NASS-CDS data from 2002 to 2012 and evaluations were performed using the remaining 20% of cases from the same period. Within the population of seriously injured (i.e., those having one or more AIS 3 or higher injuries), the number of occupants whose injury risk did not exceed the 10% and 20% thresholds was estimated to be 11,700 and 18,600, respectively, each year using the MAIS 3+ injury model. For the ISS>15 model, 8,100 and 11,000 occupants sustained ISS>15 injuries yet their injury probability did not reach the 10% and 20% probability for severe injury, respectively. Conversely, model predictions suggested that, at the 10% and 20% thresholds, 207,700 and 55,400 drivers respectively would be incorrectly flagged as injured when their injuries had not reached the AIS 3 level. For the ISS>15 model, 87,300 and 41,900 drivers would be incorrectly flagged as injured when injury severity had not reached the ISS>15 injury level. This article provides important information comparing the expected positive and negative effects of an AACN system with thresholds at the 10% and 20% levels using two outcome metrics. Overall, results suggest that the 20% risk threshold would not provide a useful notification to improve the quality of care for a large number of seriously injured crash victims. Alternatively, a lower threshold may increase the overtriage rate. Based on the vehicle damage observed for crashes reaching and exceeding the 10% risk threshold, we anticipate that rescue services would have been deployed based on current Public Safety Answering Point (PSAP) practices.
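The triage decision itself is a threshold test on a predicted probability. The schematic sketch below shows the structure of such a rule; the coefficients and crash attributes are invented for illustration and are not the fitted NASS-CDS models described above.

```python
import math

# Schematic sketch of an AACN-style rule: a logistic model maps crash attributes
# to a probability of severe injury, which is compared against a notification
# threshold (10% or 20%). Coefficients are invented placeholders.

COEFS = {"intercept": -4.0, "delta_v_kph": 0.08, "belted": -0.9,
         "multiple_impacts": 0.6, "any_older_occupant": 0.5}

def severe_injury_probability(crash):
    z = COEFS["intercept"]
    for name, coef in COEFS.items():
        if name != "intercept":
            z += coef * crash[name]
    return 1.0 / (1.0 + math.exp(-z))

def notify_trauma_center(crash, threshold=0.20):
    return severe_injury_probability(crash) >= threshold

if __name__ == "__main__":
    crash = {"delta_v_kph": 45, "belted": 1, "multiple_impacts": 0, "any_older_occupant": 0}
    p = severe_injury_probability(crash)
    print(f"risk={p:.2f}, notify@10%={p >= 0.10}, notify@20%={p >= 0.20}")
```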
A Climatology of dust emission in northern Africa using surface observations from 1984-2012
NASA Astrophysics Data System (ADS)
Cowie, Sophie; Knippertz, Peter; Marsham, John
2014-05-01
The huge quantity of mineral dust emitted annually from northern Africa makes this area crucial to the global dust cycle. Once in the atmosphere, dust aerosols have a significant impact on the global radiation budget, clouds, the carbon cycle and can even act as a fertilizer to rain forests in South America. Current model estimates of dust production from northern Africa are uncertain. At the heart of this problem is insufficient understanding of key dust emitting processes such as haboobs (cold pools generated through evaporation of convective precipitation), low-level jets (LLJs) and dry convection (dust devils and dust plumes). Scarce observations in this region, in particular in the Sahara, make model evaluation difficult. This work uses long-term surface observations from 70 stations situated in the Sahara and Sahel to explore the diurnal, seasonal and geographical variations in dust emission events and thresholds. Quality flags are applied to each station to indicate a day-time bias or gaps in the time period 1984-2012. The frequency of dust emission (FDE) is calculated using the present weather codes (WW) of SYNOP reports, where WW = 07,08,09,30-35 and 98. Thresholds are investigated by estimating the wind speeds for which there is a 25%, 50% and 75% probability of dust emission. The 50% threshold is used to calculate strong wind frequency (SWF) and the diagnostic parameter dust uplift potential (DUP); a thresholded cubic function of wind-speed which quantifies the dust generating power of winds. Stations are grouped into 6 areas (North Algeria, Central Sahara, Egypt, West Sahel, Central Sahel and Sudan) for more in-depth analysis of these parameters. Spatially, thresholds are highest in northern Algeria and lowest in the Sahel around the latitude band 16N-21N. Annual mean FDE is anti-correlated with the threshold, showing the importance of spatial variations in thresholds for mean dust emission. The annual cycles of FDE and SWF for the 6 grouped areas are highly correlated (0.95 to 0.99). These correlations are barely reduced when annual-mean thresholds are used, showing that seasonal variations in thresholds are not the main control on the seasonal variations in FDE. Relationships between annual cycles in FDE and DUP are more complex than between FDE and SWF, reflecting the seasonal variations in the types and intensities of dust events. FDE is highest in spring north of 23N. South of this, where stations are directly influenced by the summer monsoon, the annual cycle in FDE is much more variable. Half of the total DUP occurs at wind-speeds greater than ~ 28 ms-1, which highlights the importance of rare high-energy wind events. The likely meteorological mechanisms generating these patterns are discussed.
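The dust uplift potential used in this record is described only as a thresholded cubic function of wind speed. The sketch below implements one commonly used form of such a function; the exact formulation should be treated as an assumption for illustration rather than the paper's precise diagnostic.

```python
# One commonly used thresholded-cubic form of dust uplift potential (DUP):
# DUP ~ U^3 * (1 + Ut/U) * (1 - Ut^2/U^2) for winds U above the threshold Ut,
# and zero below it. This exact formulation is an assumption for illustration;
# the abstract only states that DUP is a thresholded cubic function of wind speed.

def dust_uplift_potential(u, u_t):
    """u: wind speed (m/s); u_t: emission threshold wind speed (m/s)."""
    if u <= u_t:
        return 0.0
    return u ** 3 * (1.0 + u_t / u) * (1.0 - (u_t ** 2) / (u ** 2))

if __name__ == "__main__":
    for u in (5.0, 10.0, 28.0):
        print(u, dust_uplift_potential(u, u_t=8.0))
```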
The Quality of Randomized Controlled Trials in Pediatric Orthopaedics: Are We Improving?
Dodwell, Emily; Dua, Shiv; Dulai, Sukhdeep K; Astone, Kristina; Mulpuri, Kishore
2015-01-01
The quality of randomized controlled trials (RCTs) in orthopaedics is a topic of considerable importance, as RCTs play a major role in guiding clinical practice. The quality of RCTs published between 1995 and 2005 has previously been documented. The purpose of the current study was to assess and describe the quality of pediatric orthopaedic RCTs published from 2005 to 2012, by identifying study characteristics associated with higher quality and outlining areas for improvement. A standardized literature search was used to identify pediatric orthopaedic RCTs published in 7 well-recognized journals between September 2005 and July 2012 inclusive. The Detsky Quality Assessment Scale and the CONSORT checklist for Non-Pharmacologic Trials were used to assess the quality of the RCTs. Scores for the Detsky and CONSORT were calculated by 2 independent blinded orthopaedic surgeon reviewers with epidemiologic training. Forty RCTs were included in this analysis. The mean percentage score on the Detsky quality scale was 67%. Sixteen (40%) of the articles satisfied the threshold for a satisfactory level of methodological quality (Detsky >75%). Twenty-five (63%) of these studies were negative studies, concluding no difference between treatment arms. In 52% of the negative studies, an a priori sample size analysis was absent, and 28% were self-described as underpowered. In multiple variable regression analysis, only working with a statistician was significantly associated with higher Detsky percentage scores (P=0.01). There is a trend for improving quality in pediatric orthopaedic RCTs. Compared with past reports, the mean Detsky score improved from 53% to 67%, and the proportion meeting an acceptable level of quality improved from 19% to 40%. One of the most concerning findings of this study was the lack of attention to sample size and power analysis, and the potential for underpowered studies. Ongoing efforts are necessary to improve the conduct and reporting of clinical trials in pediatric orthopaedics. Pediatric orthopaedic surgeons, JPO, and POSNA are working toward improving levels of quality in pediatric orthopaedic research. This paper highlights progress that has been made, and addresses some high-yield areas for future improvement.
NASA Astrophysics Data System (ADS)
Takahashi, Hajime; Hanafusa, Yuki; Kimura, Yoshinari; Kitamura, Masatoshi
2018-03-01
Oxygen plasma treatment has been carried out to control the threshold voltage in organic thin-film transistors (TFTs) having a SiO2 gate dielectric prepared by rf sputtering. The threshold voltage linearly changed in the range of -3.7 to 3.1 V with the increase in plasma treatment time. Although the amount of change is smaller than that for organic TFTs having thermally grown SiO2, the tendency of the change was similar to that for thermally grown SiO2. To realize different plasma treatment times on the same substrate, a certain region on the SiO2 surface was selected using a shadow mask, and was treated with oxygen plasma. Using the process, organic TFTs with negative threshold voltages and those with positive threshold voltages were fabricated on the same substrate. As a result, enhancement/depletion inverters consisting of the organic TFTs operated at supply voltages of 5 to 15 V.
Network-level reproduction number and extinction threshold for vector-borne diseases.
Xue, Ling; Scoglio, Caterina
2015-06-01
The basic reproduction number of deterministic models is an essential quantity for predicting whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for disease control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks agree with the analytical results without any assumptions, reinforcing that the relationships may always exist and posing a mathematical problem of proving their existence in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. These findings may improve understanding of thresholds for disease persistence in order to control vector-borne diseases.
Element concentrations on Hypogymnia physodes after three years of transplanting along Lake Michigan
Bennett, J.P.; Dibben, M.J.; Lyman, K.J.
1996-01-01
Improvements in air quality in air polluted areas are often followed by recolonization of habitats by sensitive lichens that had died out when air quality was worse. To test the hypothesis that air quality at Indiana Dunes National Lakeshore has improved such that lichens could recolonize the area, samples of a species that once grew in the park, Hypogymnia physodes, were transplanted from Door County, Wisconsin to the park and three other sites along the western shore of Lake Michigan, including one at the site of origin as a control. The lichens were sampled for 3 years and tissue concentrations of 20 chemical elements were measured. There were no significant differences between concentrations over the 3 year study duration at the control site in Door County, suggesting that transplanting itself had no impacts on tissue concentrations. All but two elements increased in concentration from north to south with the greatest increases occurring in the third year of the study. Lichens at Indiana Dunes at the end of the study had suffered severe mortality. Chromium increased the most from north to south but concentrations were not higher than maxima observed in other studies. Arsenic and sulfur, however, exceeded known toxic thresholds or maxima observed in other studies on this species. Four hypotheses are presented to explain the toxicity of elements to this species.
NASA Astrophysics Data System (ADS)
Hu, Hang; Yu, Hong; Zhang, Yongzhi
2013-03-01
Cooperative spectrum sensing, which can greatly improve the ability to discover spectrum opportunities, is regarded as an enabling mechanism for cognitive radio (CR) networks. In this paper, we employ a double-threshold detection method in the energy detector to perform spectrum sensing; only the CR users with reliable sensing information are allowed to transmit a one-bit local decision to the fusion center. Simulation results show that the proposed double-threshold detection method not only improves the sensing performance but also saves reporting-channel bandwidth compared with the conventional single-threshold detection method. By weighting the sensing performance and the consumption of system resources in a utility function that is maximized with respect to the number of CR users, it is shown that the optimal number of CR users is related to the price of these Quality-of-Service (QoS) requirements.
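A minimal sketch of the double-threshold decision rule is shown below: a user reports a hard decision only when its measured energy falls outside the uncertain region between the two thresholds. Threshold values, the k-out-of-N fusion rule, and the toy energies are illustrative assumptions.

```python
# Sketch of double-threshold energy detection for cooperative spectrum sensing:
# a CR user reports a one-bit decision only if its measured energy is clearly
# below the lower threshold or clearly above the upper threshold; otherwise it
# stays silent, saving reporting-channel bandwidth. Values are illustrative.

def local_decision(energy, lam_low, lam_high):
    """Return 1 (signal present), 0 (signal absent), or None (withhold report)."""
    if energy >= lam_high:
        return 1
    if energy <= lam_low:
        return 0
    return None   # unreliable observation: do not report to the fusion center

def fusion_center(reports, k=1):
    """Simple k-out-of-N fusion over the received one-bit reports."""
    votes = [r for r in reports if r is not None]
    return int(sum(votes) >= k)

if __name__ == "__main__":
    energies = [0.8, 2.9, 1.6, 3.4]          # measured test statistics
    reports = [local_decision(e, lam_low=1.0, lam_high=2.5) for e in energies]
    print(reports, "->", fusion_center(reports))
```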
Audio-visual temporal perception in children with restored hearing.
Gori, Monica; Chilosi, Anna; Forli, Francesca; Burr, David
2017-05-01
It is not clear how audio-visual temporal perception develops in children with restored hearing. In this study we measured temporal discrimination thresholds with an audio-visual temporal bisection task in 9 deaf children with restored audition and 22 typically hearing children. In typically hearing children, audition was more precise than vision, with no gain in multisensory conditions (as previously reported in Gori et al. (2012b)). However, deaf children with restored audition showed similar auditory and visual thresholds and some evidence of gain in audio-visual temporal multisensory conditions. Interestingly, we found a strong correlation between auditory weighting of multisensory signals and quality of language: patients who gave more weight to audition had better language skills. Similarly, auditory thresholds for the temporal bisection task were also a good predictor of language skills. This result supports the idea that temporal auditory processing is associated with language development. Copyright © 2017. Published by Elsevier Ltd.
Improved Bat Algorithm Applied to Multilevel Image Thresholding
2014-01-01
Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher-level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adapted one of the latest swarm intelligence algorithms, the bat algorithm, to the multilevel image thresholding problem. Testing on standard benchmark images shows that the bat algorithm is comparable with other state-of-the-art algorithms. We then improved the standard bat algorithm by adding elements from differential evolution and from the artificial bee colony algorithm. The proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed. PMID:25165733
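Whatever metaheuristic searches the threshold space, the underlying objective is typically a criterion such as Otsu's between-class variance evaluated for a candidate set of thresholds. The sketch below shows that objective with an exhaustive search on a toy histogram; it illustrates the problem being optimized, not the authors' improved bat algorithm.

```python
import itertools
import numpy as np

# Toy illustration of the multilevel thresholding objective: Otsu's
# between-class variance for a candidate pair of thresholds, with exhaustive
# search over a small synthetic histogram. Metaheuristics such as the improved
# bat algorithm replace the exhaustive loop.

def between_class_variance(hist, thresholds):
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    mu_total = (p * levels).sum()
    bounds = [0] + [t + 1 for t in sorted(thresholds)] + [len(hist)]
    variance = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            variance += w * (mu - mu_total) ** 2
    return variance

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pixels = np.concatenate([rng.normal(40, 8, 3000),
                             rng.normal(120, 10, 3000),
                             rng.normal(200, 12, 3000)]).clip(0, 255).astype(int)
    hist = np.bincount(pixels, minlength=256)
    best = max(itertools.combinations(range(1, 255), 2),
               key=lambda t: between_class_variance(hist, t))
    print("best two thresholds:", best)
```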
Kulongoski, Justin T.; Belitz, Kenneth
2007-01-01
Ground-water quality in the approximately 1,000-square-mile Monterey Bay and Salinas Valley study unit was investigated from July through October 2005 as part of the California Ground-Water Ambient Monitoring and Assessment (GAMA) program. The study was designed to provide a spatially unbiased assessment of raw ground-water quality, as well as a statistically consistent basis for comparing water quality throughout California. Samples were collected from 94 public-supply wells and 3 monitoring wells in Monterey, Santa Cruz, and San Luis Obispo Counties. Ninety-one of the public-supply wells sampled were selected to provide a spatially distributed, randomized monitoring network for statistical representation of the study area. Six wells were sampled to evaluate changes in water chemistry: three wells along a ground-water flow path were sampled to evaluate lateral changes, and three wells at discrete depths from land surface were sampled to evaluate changes in water chemistry with depth from land surface. The ground-water samples were analyzed for volatile organic compounds (VOCs), pesticides, pesticide degradates, nutrients, major and minor ions, trace elements, radioactivity, microbial indicators, and dissolved noble gases (the last in collaboration with Lawrence Livermore National Laboratory). Naturally occurring isotopes (tritium, carbon-14, helium-4, and the isotopic composition of oxygen and hydrogen) also were measured to help identify the source and age of the sampled ground water. In total, 270 constituents and water-quality indicators were investigated for this study. This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, water typically is treated, disinfected, and (or) blended with other waters to maintain water quality. In addition, regulatory thresholds apply to treated water that is served to the consumer, not to raw ground water. In this study, only six constituents, alpha radioactivity, N-nitrosodimethylamine, 1,2,3-trichloropropane, nitrate, radon-222, and coliform bacteria were detected at concentrations higher than health-based regulatory thresholds. Six constituents, including total dissolved solids, hexavalent chromium, iron, manganese, molybdenum, and sulfate were detected at concentrations above levels set for aesthetic concerns. One-third of the randomized wells sampled for the Monterey Bay and Salinas Valley GAMA study had at least a single detection of a VOC or gasoline additive. Twenty-eight of the 88 VOCs and gasoline additives investigated were found in ground-water samples; however, detected concentrations were one-third to one-sixty-thousandth of their respective regulatory thresholds. Compounds detected in 10 percent or more of the wells sampled include chloroform, a compound resulting from the chlorination of water, and tetrachloroethylene (PCE), a common solvent. Pesticides and pesticide degradates also were detected in one-third of the ground-water samples collected; however, detected concentrations were one-thirtieth to one-fourteen-thousandth of their respective regulatory thresholds. Ten of the 122 pesticides and pesticide degradates investigated were found in ground-water samples. Compounds detected in 10 percent or more of the wells sampled include the herbicide simazine, and the pesticide degradate deethylatrazine. 
Ground-water samples had a median total dissolved solids (TDS) concentration of 467 milligrams per liter (mg/L), and 16 of the 34 samples had TDS concentrations above the recommended secondary maximum contaminant level (SMCL-a threshold established for aesthetic qualities: taste, odor, and color) of 500 mg/L, while four samples had concentrations above the upper SMCL of 1,000 mg/L. Concentrations of nitrate plus nitrite ranged from 0.04 to 37.8 mg/L (as nitrogen), and two samples had concentrations above the health-based threshold for nitrate of 10 mg/L (as nitrogen). The median sulfate concentration
Junker, M H; Danuser, B; Monn, C; Koller, T
2001-10-01
The objective of this study was to provide a basis for effectively protecting nonsmokers from acute sensory impacts and for preventing deterioration of indoor air quality caused by environmental tobacco smoke (ETS) emissions. With an olfactory experiment we determined odor detection thresholds (OT) of sidestream ETS (sETS), and with a full-body exposure experiment we investigated sensory symptoms at very low sETS exposure concentrations. OT concentrations for sETS are three or more orders of magnitude lower than ETS concentrations measured in field settings and correspond to a fresh air dilution volume of > 19,000 m³ per cigarette, over 100 times more than had previously been suggested for acceptable indoor air conditions. Eye and nasal irritations were observed at sETS concentrations one order of magnitude lower than previously reported, corresponding to a fresh air dilution volume of > 3,000 m³ per cigarette. These findings have great practical implications for defining indoor air quality standards in indoor compartments where ETS emissions occur. Our study strongly supports the implementation and control of smoking policies such as segregating smoking areas from areas where smoking is not permitted or instituting smoking bans in public buildings.
Castelli, Joël; Depeursinge, Adrien; de Bari, Berardino; Devillers, Anne; de Crevoisier, Renaud; Bourhis, Jean; Prior, John O
2017-06-01
In the context of oropharyngeal cancer treated with definitive radiotherapy, the aim of this retrospective study was to identify the best threshold value for computing metabolic tumor volume (MTV) and/or total lesion glycolysis to predict local-regional control (LRC) and disease-free survival. One hundred twenty patients with locally advanced oropharyngeal cancer from 2 different institutions treated with definitive radiotherapy underwent FDG PET/CT before treatment. Various MTVs and total lesion glycolysis values were defined based on 2 segmentation methods: (i) an absolute threshold of SUV (0-20 g/mL) or (ii) a relative threshold of SUVmax (0%-100%). The parameters' predictive capabilities for disease-free survival and LRC were assessed using the Harrell C-index and Cox regression models. Relative thresholds between 40% and 68% and absolute thresholds between 5.5 and 7 had similar predictive value for LRC (C-index = 0.65 and 0.64, respectively). Metabolic tumor volume had a higher predictive value than gross tumor volume (C-index = 0.61) and SUVmax (C-index = 0.54). Metabolic tumor volume computed with a relative threshold of 51% of SUVmax was the best predictor of disease-free survival (hazard ratio, 1.23 [per 10 mL], P = 0.009) and LRC (hazard ratio, 1.22 [per 10 mL], P = 0.02). The use of different thresholds within a reasonable range (between 5.5 and 7 for an absolute threshold and between 40% and 68% for a relative threshold) seems to have no major impact on the predictive value of MTV. This parameter may be used to identify patients at high risk of recurrence who may benefit from treatment intensification.
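Computing an MTV from a threshold is straightforward once an SUV map is available. The sketch below shows both an absolute-SUV and a percent-of-SUVmax segmentation on a synthetic array; it is illustrative only and not the clinical software used in the study.

```python
import numpy as np

# Sketch of threshold-based metabolic tumor volume (MTV) and total lesion
# glycolysis (TLG) from an SUV map: include voxels above either an absolute SUV
# cutoff or a percentage of SUVmax, then multiply by the voxel volume.
# Synthetic data; not the clinical segmentation software used in the study.

def mtv_tlg(suv, voxel_volume_ml, absolute=None, percent_of_max=None):
    if absolute is not None:
        mask = suv >= absolute
    elif percent_of_max is not None:
        mask = suv >= (percent_of_max / 100.0) * suv.max()
    else:
        raise ValueError("provide an absolute or relative threshold")
    mtv = mask.sum() * voxel_volume_ml
    tlg = suv[mask].sum() * voxel_volume_ml   # equivalent to mean SUV times MTV
    return mtv, tlg

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    suv = rng.gamma(shape=2.0, scale=1.5, size=(40, 40, 40))   # toy SUV map
    suv[15:25, 15:25, 15:25] += 10.0                           # synthetic lesion
    print("absolute 5.5:", mtv_tlg(suv, 0.064, absolute=5.5))
    print("51% of SUVmax:", mtv_tlg(suv, 0.064, percent_of_max=51))
```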
Cool, Geneviève; Lebel, Alexandre; Sadiq, Rehan; Rodriguez, Manuel J
2015-12-01
The regional variability of the probability of occurrence of high total trihalomethane (TTHM) levels was assessed using multilevel logistic regression models that incorporate environmental and infrastructure characteristics. The models were structured in a three-level hierarchical configuration: samples (first level), drinking water utilities (DWUs, second level) and natural regions, an ecological hierarchical division from the Quebec ecological framework of reference (third level). They considered six independent variables: precipitation, temperature, source type, season, treatment type and pH. The average probability of TTHM concentrations exceeding the targeted threshold was 18.1%. The probability was influenced by season, treatment type, precipitation and temperature. The variance at all levels was significant, showing that the probability of TTHM concentrations exceeding the threshold is most likely to be similar if located within the same DWU and within the same natural region. However, most of the variance initially attributed to natural regions was explained by treatment types and clarified by spatial aggregation on treatment types. Nevertheless, even after controlling for treatment type, there was still significant regional variability in the probability of TTHM concentrations exceeding the threshold. Regional variability was particularly important for DWUs using chlorination alone, since they lack the treatment required to reduce the amount of natural organic matter (NOM) in source water prior to disinfection. Results presented herein could be of interest to authorities in identifying regions with specific needs regarding drinking water quality and for epidemiological studies identifying geographical variations in population exposure to disinfection by-products (DBPs).
Coyle, Doug; Ko, Yoo-Joung; Coyle, Kathryn; Saluja, Ronak; Shah, Keya; Lien, Kelly; Lam, Henry; Chan, Kelvin K W
2017-04-01
To assess the cost-effectiveness of gemcitabine (G), G + 5-fluorouracil, G + capecitabine, G + cisplatin, G + oxaliplatin, G + erlotinib, G + nab-paclitaxel (GnP), and FOLFIRINOX in the treatment of advanced pancreatic cancer from a Canadian public health payer's perspective, using data from a recently published Bayesian network meta-analysis. The analysis was conducted with a three-state Markov model and used data on the progression of disease with treatment from the gemcitabine arms of randomized controlled trials, combined with estimates from the network meta-analysis for the newer regimens. Estimates of health care costs were obtained from local providers, and utilities were derived from the literature. The model estimates the effect of treatment regimens on costs and quality-adjusted life-years (QALYs) discounted at 5% per annum. At a willingness-to-pay (WTP) threshold greater than $30,666 per QALY, FOLFIRINOX would be the most optimal regimen. For a WTP threshold of $50,000 per QALY, the probability that FOLFIRINOX would be optimal was 57.8%. There was no price reduction for nab-paclitaxel at which GnP became optimal. From a Canadian public health payer's perspective at present drug prices, FOLFIRINOX is the optimal regimen on the basis of the cost-effectiveness criterion. GnP is not cost-effective regardless of the WTP threshold. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
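The decision rule behind statements like these is the standard comparison against the willingness-to-pay threshold, for example by ranking regimens on net monetary benefit. A minimal sketch follows; the costs and QALYs are invented placeholders chosen only to mimic the qualitative pattern, not the study's estimates.

```python
# Minimal sketch of the cost-effectiveness decision rule used in analyses like
# the one above: choose the regimen with the highest net monetary benefit
# (NMB = QALYs * WTP - cost) at a given willingness-to-pay (WTP) threshold.
# Cost and QALY numbers are invented placeholders, not the study's results.

regimens = {
    "gemcitabine": {"cost": 20_000, "qalys": 0.45},
    "GnP":         {"cost": 45_000, "qalys": 0.55},
    "FOLFIRINOX":  {"cost": 32_000, "qalys": 0.84},
}

def optimal_regimen(wtp):
    nmb = {name: r["qalys"] * wtp - r["cost"] for name, r in regimens.items()}
    return max(nmb, key=nmb.get)

if __name__ == "__main__":
    for wtp in (20_000, 50_000, 100_000):
        print(f"WTP ${wtp:,}/QALY -> {optimal_regimen(wtp)}")
```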
Critical Power: An Important Fatigue Threshold in Exercise Physiology.
Poole, David C; Burnley, Mark; Vanhatalo, Anni; Rossiter, Harry B; Jones, Andrew M
2016-11-01
: The hyperbolic form of the power-duration relationship is rigorous and highly conserved across species, forms of exercise, and individual muscles/muscle groups. For modalities such as cycling, the relationship resolves to two parameters, the asymptote for power (critical power [CP]) and the so-called W' (work doable above CP), which together predict the tolerable duration of exercise above CP. Crucially, the CP concept integrates sentinel physiological profiles-respiratory, metabolic, and contractile-within a coherent framework that has great scientific and practical utility. Rather than calibrating equivalent exercise intensities relative to metabolically distant parameters such as the lactate threshold or V˙O2max, setting the exercise intensity relative to CP unifies the profile of systemic and intramuscular responses and, if greater than CP, predicts the tolerable duration of exercise until W' is expended, V˙O2max is attained, and intolerance is manifested. CP may be regarded as a "fatigue threshold" in the sense that it separates exercise intensity domains within which the physiological responses to exercise can (
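In its standard two-parameter form, the relationship referred to here predicts the tolerable duration of constant-power exercise above CP as t_lim = W' / (P - CP). The sketch below evaluates that prediction; the CP and W' values are illustrative, not measured data.

```python
# Sketch of the two-parameter critical power model referenced above: for a
# constant power P above CP, the tolerable duration is t_lim = W' / (P - CP).
# The CP and W' values below are illustrative, not measured data.

def tolerable_duration_s(power_w, cp_w, w_prime_j):
    if power_w <= cp_w:
        return float("inf")            # below CP, exercise is (in theory) sustainable
    return w_prime_j / (power_w - cp_w)

if __name__ == "__main__":
    cp, w_prime = 250.0, 20_000.0      # watts, joules (illustrative)
    for p in (275, 300, 350):
        print(f"{p} W -> {tolerable_duration_s(p, cp, w_prime):.0f} s")
```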
Assessing the impacts of sediments from dredging on corals.
Jones, Ross; Bessell-Browne, Pia; Fisher, Rebecca; Klonowski, Wojciech; Slivkoff, Matthew
2016-01-15
There is a need to develop water quality thresholds for dredging near coral reefs that can relate physical pressures to biological responses and define exposure conditions above which effects could occur. Water quality characteristics during dredging have, however, not been well described. Using information from several major dredging projects, we describe sediment particle sizes in the water column/seabed, suspended sediment concentrations at different temporal scales during natural and dredging-related turbidity events, and changes in light quantity/quality underneath plumes. These conditions differ considerably from those used in past laboratory studies of the effects of sediments on corals. The review also discusses other problems associated with using information from past studies for developing thresholds such as the existence of multiple different and inter-connected cause-effect pathways (which can confuse/confound interpretations), the use of sediment proxies, and the reliance on information from sediment traps to justify exposure regimes in sedimentation experiments. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Gersberg, Richard; Tiedge, Jürgen; Gottstein, Dana; Altmann, Sophie; Watanabe, Kayo; Lüderitz, Volker
2008-04-01
In early 1999, primary treatment and discharge of sewage from Tijuana, Mexico (approximately 95 million liters per day) began through South Bay Ocean Outfall (SBOO) into the ocean 4.3 km offshore. In this study, statistical comparisons were made of the bacterial water quality (total and fecal coliforms and enterococci densities) of the ocean, both before and after discharge of sewage to the SBOO began, so that the effect of this ocean discharge on nearshore ocean water quality could be quantitatively assessed. The frequency of exceedence of bacterial indicator thresholds was statistically analyzed for 11 shore (surfzone) stations throughout US and Mexico using the Fisher's exact test, for the years before (1995-1998) as compared to after the SBOO discharge began (1999-2003). Only four of the 11 shoreline stations (S2, S3, S11, and S12) showed significant improvement (decreased frequency of exceedence of bacterial indicator thresholds) after SBOO discharge began.
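The before/after comparison of exceedance frequencies at each station is a 2x2 Fisher's exact test. The sketch below shows the test for one hypothetical station, assuming SciPy is available; the counts are made up for illustration and are not the study's data.

```python
from scipy.stats import fisher_exact

# Sketch of the before/after comparison described above: for one shoreline
# station, compare the number of samples exceeding a bacterial indicator
# threshold before (1995-1998) and after (1999-2003) the SBOO discharge began.
# The counts are made up for illustration.

exceed_before, total_before = 42, 200
exceed_after, total_after = 21, 250

table = [[exceed_before, total_before - exceed_before],
         [exceed_after, total_after - exceed_after]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```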
Cost-effectiveness of dabigatran for stroke prophylaxis in atrial fibrillation.
Shah, Shimoli V; Gage, Brian F
2011-06-07
Recent studies have investigated alternatives to warfarin for stroke prophylaxis in patients with atrial fibrillation (AF), but whether these alternatives are cost-effective is unknown. On the basis of the results from Randomized Evaluation of Long Term Anticoagulation Therapy (RE-LY) and other trials, we developed a decision-analysis model to compare the cost and quality-adjusted survival of various antithrombotic therapies. We ran our Markov model in a hypothetical cohort of 70-year-old patients with AF using a cost-effectiveness threshold of $50 000/quality-adjusted life-year. We estimated the cost of dabigatran as US $9 a day. For a patient with an average risk of major hemorrhage (≈3%/y), the most cost-effective therapy depended on stroke risk. For patients with the lowest stroke rate (CHADS2 stroke score of 0), only aspirin was cost-effective. For patients with a moderate stroke rate (CHADS2 score of 1 or 2), warfarin was cost-effective unless the risk of hemorrhage was high or quality of international normalized ratio control was poor (time in the therapeutic range <57.1%). For patients with a high stroke risk (CHADS(2) stroke score ≥3), dabigatran 150 mg (twice daily) was cost-effective unless international normalized ratio control was excellent (time in the therapeutic range >72.6%). Neither dabigatran 110 mg nor dual therapy (aspirin and clopidogrel) was cost-effective. Dabigatran 150 mg (twice daily) was cost-effective in AF populations at high risk of hemorrhage or high risk of stroke unless international normalized ratio control with warfarin was excellent. Warfarin was cost-effective in moderate-risk AF populations unless international normalized ratio control was poor.
NASA Astrophysics Data System (ADS)
Chang, Jenghwa; Aronson, Raphael; Graber, Harry L.; Barbour, Randall L.
1995-05-01
We present results examining the dependence of image quality for imaging in dense scattering media as influenced by the choice of parameters pertaining to the physical measurement and by factors influencing the efficiency of the computation. The former include the density of the weight matrix as affected by the target volume, view angle, and source condition. The latter include the density of the weight matrix and the type of algorithm used. These were examined by solving a one-step linear perturbation equation derived from the transport equation using three different algorithms with constraints: POCS, CGD, and SART. The above were explored by evaluating four different 3D cylindrical phantom media: a homogeneous medium, a medium containing a single black rod on the axis, a medium containing a single black rod parallel to the axis, and a medium containing thirteen black rods arrayed in the shape of an 'X'. Solutions to the forward problem were computed using Monte Carlo methods for an impulse source, from which time-independent and time-harmonic detector responses were calculated. The influence of target volume on image quality and computational efficiency was studied by computing solutions for three types of reconstructions: 1) 3D reconstruction, which considered each voxel individually; 2) 2D reconstruction, which assumed that symmetry along the cylinder axis was known a priori; and 3) 2D limited reconstruction, which assumed that only those voxels in the plane of the detectors contribute information to the detector readings. The effect of view angle was explored by comparing computed images obtained from a single source, whose position was varied, as well as for the type of tomographic measurement scheme used (i.e., radial scan versus transaxial scan). The former condition was also examined for the dependence of the above on the choice of source condition [i.e., cw (2D reconstructions) versus time-harmonic (2D limited reconstructions) source]. The efficiency of the computational effort was explored principally by conducting a weight matrix 'threshold titration' study. This involved computing the ratio of each matrix element to the maximum element of its row and setting the element to zero if the ratio was less than a preselected threshold. The results showed that all three types of reconstructions provided good image quality. The 3D reconstruction outperformed the other two reconstructions. The time required for the 2D and 2D limited reconstructions was much less (< 10%) than that for the 3D reconstruction. The 'threshold titration' study showed that artifacts were present when the threshold was 5% or higher, and no significant differences in image quality were observed when the thresholds were less than 1%, in which case 38% (21,849 of 57,600) of the total weight elements were set to zero. Restricting the view angle produced degradation in image quality, but, in all cases, clearly recognizable images were obtained.
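The 'threshold titration' operation itself (zero out weight-matrix entries that are small relative to the maximum of their row) is easy to state. A short sketch follows, using a synthetic matrix and an illustrative 1% threshold.

```python
import numpy as np

# Sketch of the weight-matrix "threshold titration" described above: each
# element is compared with the maximum of its row, and set to zero when the
# ratio falls below a preselected threshold (e.g., 1%). Synthetic matrix for
# illustration only.

def titrate(weights, threshold=0.01):
    row_max = np.abs(weights).max(axis=1, keepdims=True)
    ratio = np.abs(weights) / np.where(row_max == 0, 1.0, row_max)
    return np.where(ratio < threshold, 0.0, weights)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    W = rng.exponential(scale=1.0, size=(240, 240))   # stand-in weight matrix
    Wp = titrate(W, threshold=0.01)
    zeroed = np.count_nonzero(Wp == 0) / W.size
    print(f"fraction of weights set to zero: {zeroed:.1%}")
```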
Channel Bank Cohesion and the Maintenance of Suspension Rivers
NASA Astrophysics Data System (ADS)
Dunne, K. B. J.; Jerolmack, D. J.
2017-12-01
Gravel-bedded rivers organize their channel geometry and grain size such that transport is close to the threshold of motion at bankfull. Sand-bedded rivers, however, typically maintain bankfull fluid shear (or Shields) stresses far in excess of threshold; there is no widely accepted explanation for these "suspension rivers". We propose that all alluvial rivers are at the threshold of motion for their erosion-limiting material, i.e., the structural component of the river cross-section that is most difficult to mobilize. The entrainment threshold of gravel is large enough that bank cohesion has little influence on gravel-bed rivers. Sand, however, is the most easily entrained material; silt and clay can raise the entrainment threshold of sand by orders of magnitude. We examine a global dataset of river channel geometry and show that the shear stress range for sand-bedded channels lies entirely within the range of entrainment thresholds for sand-mud mixtures, suggesting that rivers that suspend their sandy bed material are still threshold rivers in terms of their bank material. We then present new findings from a New Jersey coastal-plain river examining if and how river-bank toe composition controls hydraulic geometry. We consider the toe because it is the foundation of the river bank, and its erosion leads to channel widening. Along a 20-km profile of the river we measure cross-section geometry, bed slope, and bed and bank composition, and we explore multiple methods of measuring the threshold shear stress of the river-bank toe in situ. As the composition of the river bed transitions from gravel to sand, we see preliminary evidence of a shift from bed-threshold to bank-threshold control on hydraulic geometry. We also observe that sub-bankfull flows are insufficient to erode (cohesive) bank materials, even though transport of sand is active at nearly all flows. Our findings highlight the importance of focusing on the river-bank toe material, which in the studied stream is always submerged. The toe is more compacted and more resistant to erosion than the subaerially exposed upper bank. We find mounting evidence that sand-bedded rivers are much like gravel-bedded rivers: they are near-threshold channels in which the suspended load does not play a controlling role in determining equilibrium hydraulic geometry.
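The threshold-of-motion comparison underlying this argument is typically made with the Shields stress, τ* = τ_b / ((ρ_s − ρ) g D). The sketch below computes it for a gravel and a sand grain size under the same flow; the depth, slope, and grain sizes are assumed values for illustration, not the field measurements.

```python
# Sketch of the threshold-of-motion comparison implicit above: the Shields
# stress tau* = tau_b / ((rho_s - rho) * g * D) compares the bankfull bed shear
# stress with grain weight. Depth, slope, and grain sizes are illustrative
# assumptions, not the New Jersey field data.

RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81   # kg/m3, kg/m3, m/s2

def shields_stress(depth_m, slope, grain_d_m):
    tau_b = RHO_W * G * depth_m * slope                  # depth-slope product
    return tau_b / ((RHO_S - RHO_W) * G * grain_d_m)

if __name__ == "__main__":
    depth, slope = 1.5, 1e-3
    for d in (0.02, 0.0005):                             # gravel (2 cm) vs sand (0.5 mm)
        print(f"D = {d*1000:.1f} mm -> Shields stress = {shields_stress(depth, slope, d):.2f}")
```

With these assumed values the gravel case sits near the commonly cited critical Shields stress while the sand case is far above it, which is the contrast the abstract describes.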
Critical levels as applied to ozone for North American forests
Robert C. Musselman
2006-01-01
The United States and Canada have used concentration-based parameters for air quality standards for ozone effects on forests in North America. The European critical levels method for air quality standards uses an exposure-based parameter, a cumulative ozone concentration index with a threshold cutoff value. The critical levels method has not been used in North America...
Soil quality standards and guidelines for forest sustainability in northwestern North America
Deborah Page-Dumroese; Martin Jurgensen; William Elliot; Thomas Rice; John Nesser; Thomas Collins; Robert Meurisse
2000-01-01
Soil quality standards and guidelines of the USDA Forest Service were some of the first in the world to be developed to evaluate changes in forest soil productivity and sustainability after harvesting and site preparation. International and national development of criteria and indicators for maintenance of soil productivity make it imperative to have adequate threshold...
ERIC Educational Resources Information Center
Kubler, Silvia, Ed.; Portmann, Paul R., Ed.
1994-01-01
This collection of articles on Bilingualism includes: "Fremdsprachenunterricht fur Fortgeschrittene: ein Uberblick" (Foreign Language Learning for Advanced Students: An Overview) (Paul R. Portmann); "Never Mind the Width, Feel the Quality: From Quantity to Quality in Language Teaching at Advanced Levels" (Mike Makosch); "Irren ist menschlich: Ein…
Threshold and subthreshold Generalized Anxiety Disorder (GAD) and suicide ideation.
Gilmour, Heather
2016-11-16
Subthreshold Generalized Anxiety Disorder (GAD) has been reported to be at least as prevalent as threshold GAD and of comparable clinical significance. It is not clear if GAD is uniquely associated with the risk of suicide, or if psychiatric comorbidity drives the association. Data from the 2012 Canadian Community Health Survey-Mental Health were used to estimate the prevalence of threshold and subthreshold GAD in the household population aged 15 or older. As well, the relationship between GAD and suicide ideation was studied. Multivariate logistic regression was used in a sample of 24,785 people to identify significant associations, while adjusting for the confounding effects of sociodemographic factors and other mental disorders. In 2012, an estimated 722,000 Canadians aged 15 or older (2.6%) met the criteria for threshold GAD; an additional 2.3% (655,000) had subthreshold GAD. For people with threshold GAD, past 12-month suicide ideation was more prevalent among men than women (32.0% versus 21.2%, respectively). In multivariate models that controlled for sociodemographic factors, the odds of past 12-month suicide ideation among people with either past 12-month threshold or subthreshold GAD were significantly higher than the odds for those without GAD. When psychiatric comorbidity was also controlled for, associations between threshold and subthreshold GAD and suicidal ideation were attenuated, but remained significant. Threshold and subthreshold GAD affect similar percentages of the Canadian household population. This study adds to the literature that has identified an independent association between threshold GAD and suicide ideation, and demonstrates that an association is also apparent for subthreshold GAD.
The Definition, Rationale, and Effects of Thresholding in OCT Angiography.
Cole, Emily D; Moult, Eric M; Dang, Sabin; Choi, WooJhon; Ploner, Stefan B; Lee, ByungKun; Louzada, Ricardo; Novais, Eduardo; Schottenhamml, Julia; Husvogt, Lennart; Maier, Andreas; Fujimoto, James G; Waheed, Nadia K; Duker, Jay S
2017-01-01
To examine the definition, rationale, and effects of thresholding in OCT angiography (OCTA). A theoretical description of OCTA thresholding in combination with qualitative and quantitative analysis of the effects of OCTA thresholding in eyes from a retrospective case series. Four eyes were qualitatively examined: 1 from a 27-year-old control, 1 from a 78-year-old exudative age-related macular degeneration (AMD) patient, 1 from a 58-year-old myopic patient, and 1 from a 77-year-old nonexudative AMD patient with geographic atrophy (GA). One eye from a 75-year-old nonexudative AMD patient with GA was quantitatively analyzed. A theoretical thresholding model and a qualitative and quantitative description of the dependency of OCTA on thresholding level. Due to the presence of system noise, OCTA thresholding is a necessary step in forming OCTA images; however, thresholding can complicate the relationship between blood flow and OCTA signal. Thresholding in OCTA can cause significant artifacts, which should be considered when interpreting and quantifying OCTA images.
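To make the thresholding step concrete, here is a minimal sketch that suppresses OCTA decorrelation values falling below a noise-derived floor. The mean-plus-two-standard-deviations noise floor, the use of a flow-free reference patch, and all variable names are assumptions for illustration, not the authors' actual processing.

```python
import numpy as np

def threshold_octa(decorrelation, noise_region, n_sigma=2.0):
    """Suppress OCTA signal that is indistinguishable from system noise.

    `decorrelation` is a 2D en-face OCTA frame; `noise_region` is a patch
    assumed to contain no flow. Values below mean + n_sigma * std of the
    noise patch are set to zero, which is the thresholding step discussed
    in the abstract above (and the source of the artifacts it warns about).
    """
    noise = np.asarray(noise_region, dtype=float)
    floor = noise.mean() + n_sigma * noise.std()
    out = np.asarray(decorrelation, dtype=float).copy()
    out[out < floor] = 0.0
    return out, floor
```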
Chang, Mun Young; Gwon, Tae Mok; Lee, Ho Sun; Lee, Jun Ho; Oh, Seung Ha; Kim, Sung June; Park, Min-Hyun
2017-03-15
The present study aimed to evaluate the effects of systemic lipoic acid on hearing preservation after cochlear implantation. Twelve Dunkin-Hartley guinea pigs were randomly divided into two groups: the control group and the lipoic acid group. Animals in the lipoic acid group received lipoic acid intraperitoneally for 4 weeks. A sterilised silicone electrode-dummy was inserted through the round window to a depth of approximately 5 mm. The hearing level was measured using auditory brainstem responses (ABRs) prior to electrode-dummy insertion, and at 4 days and 1, 2, 3 and 4 weeks after electrode-dummy insertion. The threshold shift was defined as the difference between the pre-operative threshold and each of the post-operative thresholds. The cochleae were examined histologically 4 weeks after electrode-dummy insertion. Threshold shifts changed with frequency but not time. At 2 kHz, ABR threshold shifts were statistically significantly lower in the lipoic acid group than the control group. At 8, 16 and 32 kHz, there was no significant difference in the ABR threshold shift between the two groups. Histologic review revealed less intracochlear fibrosis along the electrode-dummy insertion site in the lipoic acid group than in the control group. The spiral ganglion cell densities of the basal, middle and apical turns were significantly higher in the lipoic acid group compared with the control group. Therefore, systemic lipoic acid administration appears to effectively preserve hearing at low frequencies in patients undergoing cochlear implantation. These effects may be attributed to the protection of spiral ganglion cells and prevention of intracochlear fibrosis. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sutula, Martha; Kudela, Raphael; Hagy, James D.; Harding, Lawrence W.; Senn, David; Cloern, James E.; Bricker, Suzanne; Berg, Gry Mine; Beck, Marcus
2017-10-01
San Francisco Bay (SFB), USA, is highly enriched in nitrogen and phosphorus, but has been resistant to the classic symptoms of eutrophication associated with over-production of phytoplankton. Observations in recent years suggest that this resistance may be weakening, shown by: significant increases of chlorophyll-a (chl-a) and decreases of dissolved oxygen (DO), common occurrences of phytoplankton taxa that can form Harmful Algal Blooms (HAB), and algal toxins in water and mussels reaching levels of concern. As a result, managers now ask: what levels of chl-a in SFB constitute tipping points of phytoplankton biomass beyond which water quality will become degraded, requiring significant nutrient reductions to avoid impairments? We analyzed data for DO, phytoplankton species composition, chl-a, and algal toxins to derive quantitative relationships between three indicators (HAB abundance, toxin concentrations, DO) and chl-a. Quantile regressions relating HAB abundance and DO to chl-a were significant, indicating SFB is at increased risk of adverse HAB and low DO levels if chl-a continues to increase. Conditional probability analysis (CPA) showed chl-a of 13 mg m-3 as a "protective" threshold below which probabilities for exceeding alert levels for HAB abundance and toxins were reduced. This threshold was similar to chl-a of 13-16 mg m-3 that would meet a SFB-wide 80% saturation Water Quality Criterion (WQC) for DO. Higher "at risk" chl-a thresholds from 25 to 40 mg m-3 corresponded to 0.5 probability of exceeding alert levels for HAB abundance, and for DO below a WQC of 5.0 mg L-1 designated for lower South Bay (LSB) and South Bay (SB). We submit these thresholds as a basis to assess eutrophication status of SFB and to inform nutrient management actions. This approach is transferrable to other estuaries to derive chl-a thresholds protective against eutrophication.
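A minimal sketch of the conditional-probability style of analysis described above: for a grid of candidate chl-a thresholds, estimate the probability that an alert level is exceeded given that chl-a stays at or below the threshold. The synthetic data, the alert level, and the variable names are illustrative assumptions, not the San Francisco Bay dataset or the authors' CPA implementation.

```python
import numpy as np

def conditional_exceedance(chla, indicator, alert_level, thresholds):
    """P(indicator > alert_level | chl-a <= threshold) for each candidate threshold."""
    chla = np.asarray(chla, dtype=float)
    indicator = np.asarray(indicator, dtype=float)
    probs = []
    for t in thresholds:
        below = chla <= t
        probs.append(np.nan if below.sum() == 0
                     else float(np.mean(indicator[below] > alert_level)))
    return np.array(probs)

# Hypothetical synthetic example (units only loosely mimic the study)
rng = np.random.default_rng(1)
chla = rng.lognormal(mean=2.2, sigma=0.6, size=500)       # mg m-3
hab = 1e3 * chla * rng.lognormal(0.0, 0.8, size=500)      # toy HAB abundance
thresholds = np.arange(5, 45, 5)
print(conditional_exceedance(chla, hab, alert_level=1e5, thresholds=thresholds))
```

A "protective" threshold in this framing is simply the largest candidate value for which the conditional exceedance probability remains acceptably low.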
[Fibromyalgia in men and women: comparison of the main clinical symptoms].
Miró, Elena; Diener, Fabián N; Martínez, Ma Pilar; Sánchez, Ana I; Valenza, Marie Carmen
2012-02-01
The prevalence of fibromyalgia (FM) in men is much lower than in women. Thus, current knowledge about the syndrome has been developed largely from research with women. The aim of the present study was to analyze whether FM manifestations differ as a function of sex. Two clinical groups with FM (21 men and 21 women) and a control group of healthy men (n = 21) participated in the study. Several aspects of pain, sleep, fatigue, psychopathology, emotional distress and functional impact of FM were evaluated with an algometer and questionnaires. The clinical groups showed significantly greater impairment than the control group in all the self-report measures. However, the FM patients only showed significant sex differences in the pain sensitivity threshold, which was lower in the women. In addition, the best predictor of the experience of pain was sleep quality in men, and pain catastrophizing in women. Our results suggest that the most effective therapeutic strategies to control pain may be different for men and women.
Quality-control of an hourly rainfall dataset and climatology of extremes for the UK.
Blenkinsop, Stephen; Lewis, Elizabeth; Chan, Steven C; Fowler, Hayley J
2017-02-01
Sub-daily rainfall extremes may be associated with flash flooding, particularly in urban areas but, compared with extremes on daily timescales, have been relatively little studied in many regions. This paper describes a new, hourly rainfall dataset for the UK based on ∼1600 rain gauges from three different data sources. This includes tipping bucket rain gauge data from the UK Environment Agency (EA), which has been collected for operational purposes, principally flood forecasting. Significant problems in the use of such data for the analysis of extreme events include the recording of accumulated totals, high frequency bucket tips, rain gauge recording errors and the non-operation of gauges. Given the prospect of an intensification of short-duration rainfall in a warming climate, the identification of such errors is essential if sub-daily datasets are to be used to better understand extreme events. We therefore first describe a series of procedures developed to quality control this new dataset. We then analyse ∼380 gauges with near-complete hourly records for 1992-2011 and map the seasonal climatology of intense rainfall based on UK hourly extremes using annual maxima, n-largest events and fixed threshold approaches. We find that the highest frequencies and intensities of hourly extreme rainfall occur during summer when the usual orographically defined pattern of extreme rainfall is replaced by a weaker, north-south pattern. A strong diurnal cycle in hourly extremes, peaking in late afternoon to early evening, is also identified in summer and, for some areas, in spring. This likely reflects the different mechanisms that generate sub-daily rainfall, with convection dominating during summer. The resulting quality-controlled hourly rainfall dataset will provide considerable value in several contexts, including the development of standard, globally applicable quality-control procedures for sub-daily data, the validation of the new generation of very high-resolution climate models and improved understanding of the drivers of extreme rainfall.
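As a rough sketch of the three extreme-value summaries mentioned above (annual maxima, n-largest events, and fixed-threshold exceedances), the following applies them to an hourly rainfall series stored as a pandas Series indexed by timestamp; the 10 mm threshold, n = 5, and the variable names are assumptions for illustration, not the values used in the paper.

```python
import pandas as pd

def extreme_summaries(hourly_mm: pd.Series, n_largest: int = 5,
                      fixed_threshold: float = 10.0):
    """Summarise hourly rainfall extremes three ways.

    Returns, per calendar year: the annual maximum, the n largest hourly
    totals, and the count of hours exceeding a fixed depth threshold (mm).
    Assumes `hourly_mm` has a DatetimeIndex and is already quality-controlled.
    """
    by_year = hourly_mm.groupby(hourly_mm.index.year)
    annual_max = by_year.max()
    n_largest_events = by_year.apply(lambda s: s.nlargest(n_largest))
    exceedance_counts = by_year.apply(lambda s: int((s > fixed_threshold).sum()))
    return annual_max, n_largest_events, exceedance_counts
```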
Improving Forecast Skill by Assimilation of AIRS Temperature Soundings
NASA Technical Reports Server (NTRS)
Susskind, Joel; Reale, Oreste
2010-01-01
AIRS was launched on EOS Aqua on May 4, 2002, together with AMSU-A and HSB, to form a next generation polar orbiting infrared and microwave atmospheric sounding system. The primary products of AIRS/AMSU-A are twice daily global fields of atmospheric temperature-humidity profiles, ozone profiles, sea/land surface skin temperature, and cloud related parameters including OLR. The AIRS Version 5 retrieval algorithm is now being used operationally at the Goddard DISC in the routine generation of geophysical parameters derived from AIRS/AMSU data. A major innovation in Version 5 is the ability to generate case-by-case, level-by-level error estimates delta T(p) for retrieved quantities and the use of these error estimates for Quality Control. We conducted a number of data assimilation experiments using the NASA GEOS-5 Data Assimilation System as a step toward finding an optimum balance of spatial coverage and sounding accuracy with regard to improving forecast skill. The model was run at a horizontal resolution of 0.5° latitude × 0.67° longitude with 72 vertical levels. These experiments were run during four different seasons, each using a different year. The AIRS temperature profiles were presented to the GEOS-5 analysis as rawinsonde profiles, and the profile error estimates delta T(p) were used as the uncertainty for each measurement in the data assimilation process. We compared forecasts generated from analyses produced by assimilating AIRS temperature profiles under three different sets of Quality Control thresholds: Standard, Medium, and Tight. Assimilation of Quality Controlled AIRS temperature profiles significantly improved 5-7 day forecast skill compared to that obtained without the benefit of AIRS data in all of the cases studied. In addition, assimilation of Quality Controlled AIRS temperature soundings performs better than assimilation of AIRS observed radiances. Based on the experiments shown, Tight Quality Control of AIRS temperature profiles performs best on average from the perspective of improving global 7 day forecast skill.
Hyperbaric Oxygen Therapy Can Diminish Fibromyalgia Syndrome – Prospective Clinical Trial
Efrati, Shai; Golan, Haim; Bechor, Yair; Faran, Yifat; Daphna-Tekoah, Shir; Sekler, Gal; Fishlev, Gregori; Ablin, Jacob N.; Bergan, Jacob; Volkov, Olga; Friedman, Mony; Ben-Jacob, Eshel; Buskila, Dan
2015-01-01
Background Fibromyalgia Syndrome (FMS) is a persistent and debilitating disorder estimated to impair the quality of life of 2–4% of the population, with 9:1 female-to-male incidence ratio. FMS is an important representative example of central nervous system sensitization and is associated with abnormal brain activity. Key symptoms include chronic widespread pain, allodynia and diffuse tenderness, along with fatigue and sleep disturbance. The syndrome is still elusive and refractory. The goal of this study was to evaluate the effect of hyperbaric oxygen therapy (HBOT) on symptoms and brain activity in FMS. Methods and Findings A prospective, active control, crossover clinical trial. Patients were randomly assigned to treated and crossover groups: The treated group patients were evaluated at baseline and after HBOT. Patients in the crossover-control group were evaluated three times: baseline, after a control period of no treatment, and after HBOT. Evaluations consisted of physical examination, including tender point count and pain threshold, extensive evaluation of quality of life, and single photon emission computed tomography (SPECT) imaging for evaluation of brain activity. The HBOT protocol comprised 40 sessions, 5 days/week, 90 minutes, 100% oxygen at 2ATA. Sixty female patients were included, aged 21–67 years and diagnosed with FMS at least 2 years earlier. HBOT in both groups led to significant amelioration of all FMS symptoms, with significant improvement in life quality. Analysis of SPECT imaging revealed rectification of the abnormal brain activity: decrease of the hyperactivity mainly in the posterior region and elevation of the reduced activity mainly in frontal areas. No improvement in any of the parameters was observed following the control period. Conclusions The study provides evidence that HBOT can improve the symptoms and life quality of FMS patients. Moreover, it shows that HBOT can induce neuroplasticity and significantly rectify abnormal brain activity in pain related areas of FMS patients. Trial Registration ClinicalTrials.gov NCT01827683 PMID:26010952
Sandsund, Catherine; Towers, Richard; Thomas, Karen; Tigue, Ruth; Lalji, Amyn; Fernandes, Andreia; Doyle, Natalie; Jordan, Jake; Gage, Heather; Shaw, Clare
2017-08-28
Holistic needs assessment (HNA) and care planning are proposed to address unmet needs of people treated for cancer. We tested whether HNA and care planning by an allied health professional improved cancer-specific quality of life for women following curative treatment for stage I-III gynaecological cancer. Consecutive women were invited to participate in a randomised controlled study (HNA and care planning vs usual care) at a UK cancer centre. Data were collected by questionnaire at baseline, 3 and 6 months. The outcomes were 6-month change in European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Questionnaire-C30 (version 3), global score (primary) and, in EORTC subscales, generic quality of life and self-efficacy (secondary). The study was blinded for data management and analysis. Differences in outcomes were compared between groups. Health service utilisation and quality-adjusted life years (QALY) (from Short Form-6) were gathered for a cost-effectiveness analysis. Thematic analysis was used to interpret data from an exit interview. 150 women consented (75 per group); 10 undertook interviews. For 124 participants (61 intervention, 63 controls) with complete data, no statistically significant differences were seen between groups in the primary endpoint. The majority of those interviewed reported important personal gains they attributed to the intervention, which reflected trends to improvement seen in EORTC functional and symptom scales. Economic analysis suggests a 62% probability of cost-effectiveness at a £30 000/QALY threshold. Care plan development with an allied health professional is cost-effective, acceptable and useful for some women treated for stage I-III gynaecological cancer. We recommend its introduction early in the pathway to support person-centred care. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
ERIC Educational Resources Information Center
McArthur, G. M.; Bishop, D. V. M.
2004-01-01
The reliability and validity of a frequency discrimination (FD) task were tested in 16 people with specific language impairment (SLI) and 16 people with normal spoken language (controls). The FD thresholds of the 2 groups indicated that FD thresholds for 25-ms and 250-ms tones were remarkably stable across 18 months. The FD thresholds were lower…
De Cremer, David
2007-02-01
The present research examined the moderating effect of the level of threshold on people's preferences for different leader types in step-level public good dilemmas. It was assumed that the primary focus of people in step-level public good dilemmas is to make sure that the group surpasses the threshold. Consequently, when the level of threshold is difficult to reach people are expected to provide more support for and cooperate with a leader that monitors and controls the contributions made toward the public good. However, if the threshold is easy to surpass people will focus more on whether the obtained public good or bonus will be distributed according to agreements, suggesting that people will provide more support to and cooperate with a leader that monitors and controls the distribution of the bonus. These predictions were confirmed across two experiments using a step-level public good paradigm with a dichotomous (Study 1) and a continuous (Study 2) contribution choice. Moreover, the results also revealed that perceptions of trust accounted, in part, for the effect of level of threshold on people's leadership preferences.
Roalf, David R.; Quarmley, Megan; Elliott, Mark A.; Satterthwaite, Theodore D.; Vandekar, Simon N.; Ruparel, Kosha; Gennatas, Efstathios D.; Calkins, Monica E.; Moore, Tyler M.; Hopson, Ryan; Prabhakaran, Karthik; Jackson, Chad T.; Verma, Ragini; Hakonarson, Hakon; Gur, Ruben C.; Gur, Raquel E.
2015-01-01
Background Diffusion tensor imaging (DTI) is applied in investigation of brain biomarkers for neurodevelopmental and neurodegenerative disorders. However, the quality of DTI measurements, like that of other neuroimaging techniques, is susceptible to several confounding factors (e.g. motion, eddy currents), which have only recently come under scrutiny. These confounds are especially relevant in adolescent samples where data quality may be compromised in ways that confound interpretation of maturation parameters. The current study aims to leverage DTI data from the Philadelphia Neurodevelopmental Cohort (PNC), a sample of 1,601 youths aged 8–21 who underwent neuroimaging, to: 1) establish quality assurance (QA) metrics for the automatic identification of poor DTI image quality; 2) examine the performance of these QA measures in an external validation sample; 3) document the influence of data quality on developmental patterns of typical DTI metrics. Methods All diffusion-weighted images were acquired on the same scanner. Visual QA was performed on all subjects completing DTI; images were manually categorized as Poor, Good, or Excellent. Four image quality metrics were automatically computed and used to predict manual QA status: Mean voxel intensity outlier count (MEANVOX), Maximum voxel intensity outlier count (MAXVOX), mean relative motion (MOTION) and temporal signal-to-noise ratio (TSNR). Classification accuracy for each metric was calculated as the area under the receiver-operating characteristic curve (AUC). A threshold was generated for each measure that best differentiated visual QA status and applied in a validation sample. The effects of data quality on sensitivity to expected age effects in this developmental sample were then investigated using the traditional MRI diffusion metrics: fractional anisotropy (FA) and mean diffusivity (MD). Finally, our method of QA is compared to DTIPrep. Results TSNR (AUC=0.94) best differentiated Poor data from Good and Excellent data. MAXVOX (AUC=0.88) best differentiated Good from Excellent DTI data. At the optimal threshold, 88% of Poor data and 91% of Good/Excellent data were correctly identified. Use of these thresholds on a validation dataset (n=374) indicated high accuracy. In the validation sample 83% of Poor data and 94% of Excellent data were identified using thresholds derived from the training sample. Both FA and MD were affected by the inclusion of poor data in an analysis of age, sex and race in a matched comparison sample. In addition, we show that the inclusion of poor data results in significant attenuation of the correlation between diffusion metrics (FA and MD) and age during a critical neurodevelopmental period. We find higher correspondence between our QA method and DTIPrep for Poor data, but we find our method to be more robust for apparently high-quality images. Conclusion Automated QA of DTI can facilitate large-scale, high-throughput quality assurance by reliably identifying both scanner and subject induced imaging artifacts. The results present a practical example of the confounding effects of artifacts on DTI analysis in a large population-based sample, and suggest that estimates of data quality should not only be reported but also accounted for in data analysis, especially in studies of development. PMID:26520775
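The QA-threshold selection described above amounts to choosing, for each automated metric, the cutoff that best separates manually labelled Poor scans from usable ones. A minimal sketch using the ROC curve and Youden's J statistic is given below; the scikit-learn calls and variable names are assumptions for illustration, not the PNC pipeline.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def best_threshold(metric_values, is_poor):
    """Pick the cutoff on a QA metric that best separates Poor from usable scans.

    `is_poor` is a boolean array of manual labels. Assumes higher metric
    values indicate poorer quality (for TSNR, where lower is worse, pass the
    negated values). Returns the threshold maximising Youden's J
    (sensitivity + specificity - 1) together with the AUC.
    """
    fpr, tpr, thresholds = roc_curve(is_poor, metric_values)
    j = tpr - fpr
    return thresholds[np.argmax(j)], roc_auc_score(is_poor, metric_values)
```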
Griffioen, Mari A; Greenspan, Joel D; Johantgen, Meg; Von Rueden, Kathryn; O'Toole, Robert V; Dorsey, Susan G; Renn, Cynthia L
2018-01-01
Chronic pain is a significant problem for patients with lower extremity injuries. While pain hypersensitivity has been identified in many chronic pain conditions, it is not known whether patients with chronic pain following lower extremity fracture report pain hypersensitivity in the injured leg. To quantify and compare peripheral somatosensory function and sensory nerve activation thresholds in persons with chronic pain following lower extremity fractures with a cohort of persons with no history of lower extremity fractures. This was a cross-sectional study where quantitative sensory testing and current perception threshold testing were conducted on the injured and noninjured legs of cases and both legs of controls. A total of 14 cases and 28 controls participated in the study. Mean time since injury at the time of testing for cases was 22.3 (standard deviation = 12.1) months. The warmth detection threshold (p = .024) and nerve activation thresholds at 2,000 Hz (p < .001) and 250 Hz (p = .002) were significantly higher in cases compared to controls. This study suggests that patients with chronic pain following lower extremity fractures may experience hypoesthesia in the injured leg, which contrasts with the finding of hyperesthesia previously observed in other chronic pain conditions but is in accord with findings in patients with nerve injuries and surgeries. This is the first study to examine peripheral sensory nerve function at the site of injury in patients with chronic pain following lower extremity fractures using quantitative sensory testing and current perception threshold testing.
Systemic immunity influences hearing preservation in cochlear implantation.
Souter, Melanie; Eastwood, Hayden; Marovic, Paul; Kel, Gordana; Wongprasartsuk, Sarin; Ryan, Allen F; O'Leary, Stephen John
2012-06-01
To determine whether a systemic immune response influences hearing thresholds and tissue response after cochlear implantation in hearing guinea pigs. Guinea pigs were inoculated with sterile antigen (keyhole limpet hemocyanin) 3 weeks before cochlear implantation. Pure-tone auditory brainstem response thresholds were measured before implantation and 1 and 4 weeks later. Dexamethasone phosphate 20% was adsorbed onto a hyaluronic acid carboxymethylcellulose sponge and was applied to the round window for 30 minutes before electrode insertion. Normal saline was used for controls. Cochlear histology was performed at 4 weeks after implantation to assess the tissue response to implantation. To control for the effect of keyhole limpet hemocyanin priming, a group of unprimed animals underwent cochlear implantation with a saline-soaked pledget applied to the round window. Keyhole limpet hemocyanin priming had no significant detrimental effect on thresholds without implantation. Thresholds were elevated after implantation across all frequencies tested (2-32 kHz) in primed animals but only at higher frequencies (4-32 kHz) in unprimed controls. In primed animals, dexamethasone treatment significantly reduced threshold shifts at 2 and 8 kHz. Keyhole limpet hemocyanin led to the more frequent observation of lymphocytes in the tissue response to the implant. Systemic immune activation at the time of cochlear implantation broadened the range of frequencies experiencing elevated thresholds after implantation. Local dexamethasone provides partial protection against this hearing loss, but the degree and extent of protection are less than in previous studies with unprimed animals.
Re: Supplement to Request for Correction - IRIS Assessment of Trichloroethylene
Letter from Faye Graul providing supplemental information to her Request for Correction for Threshold of Trichloroethylene Contamination of Maternal Drinking Waters submitted under the Information Quality Act.
The sensitivity of the sole of the foot in patients with Morbus Parkinson.
Prätorius, B; Kimmeskamp, S; Milani, T L
2003-08-07
The sensory input of the foot has an important influence on balance. In patients with Morbus Parkinson (PD-patients) balance control is often impaired. Therefore, the aim of this study was to quantify the sensitivity of the plantar foot in PD-patients. Five sites of the plantar foot were examined in 24 PD-patients and in 20 controls using Semmes-Weinstein Monofilaments for touch pressure and a vibration-exciter (30 Hz) for vibration. The results show significantly higher thresholds in PD-patients. For each tested location (except the heel) the thresholds are at least twice as high as in controls. Moreover, this study proved the correlation between motor and somatosensory systems: the stronger the motor deficiencies in PD-patients (Unified Parkinson's Disease Rating System score) the higher the sensitivity thresholds for vibration. In conclusion, reduced sensitivity of the plantar foot may contribute to impaired balance control.
Oosterhuis, H J; Bouwsma, C; van Halsema, B; Hollander, R A; Kros, C J; Tombroek, I
1992-10-03
Quantification of vibration perception and fingertip sensation in routine neurological examination. Neurological Clinic, University Hospital, Groningen, the Netherlands. Prospective, controlled investigation. Vibration perception and fingertip sensation were quantified in a large group of normal control persons of various ages and in neurological patients and compared with the usual sensory tests at routine neurological examination. The vibration perception limit was measured with a biothesiometer without accelerometer, the fingertip sensation with a device for two-point discrimination slightly modified according to Renfrew ('Renfrew meter'). Concordance of the tests was studied by calculating kappa values. The normal values of both sensory qualities had a log-normal distribution and increased with age. The values obtained with the Renfrew meter correlated well with those of the two-point discrimination and stereognosis but were systematically higher than those indicated by Renfrew. Both methods appear useful at routine neurological examination if certain measuring precautions are taken.
Simultaneously Mitigating Near-Term Climate Change and Improving Human Health and Food Security
NASA Astrophysics Data System (ADS)
Shindell, Drew; Kuylenstierna, Johan C. I.; Vignati, Elisabetta; van Dingenen, Rita; Amann, Markus; Klimont, Zbigniew; Anenberg, Susan C.; Muller, Nicholas; Janssens-Maenhout, Greet; Raes, Frank; Schwartz, Joel; Faluvegi, Greg; Pozzoli, Luca; Kupiainen, Kaarle; Höglund-Isaksson, Lena; Emberson, Lisa; Streets, David; Ramanathan, V.; Hicks, Kevin; Oanh, N. T. Kim; Milly, George; Williams, Martin; Demkine, Volodymyr; Fowler, David
2012-01-01
Tropospheric ozone and black carbon (BC) contribute to both degraded air quality and global warming. We considered ~400 emission control measures to reduce these pollutants by using current technology and experience. We identified 14 measures targeting methane and BC emissions that reduce projected global mean warming ~0.5°C by 2050. This strategy avoids 0.7 to 4.7 million annual premature deaths from outdoor air pollution and increases annual crop yields by 30 to 135 million metric tons due to ozone reductions in 2030 and beyond. Benefits of methane emissions reductions are valued at $700 to $5000 per metric ton, which is well above typical marginal abatement costs (less than $250). The selected controls target different sources and influence climate on shorter time scales than those of carbon dioxide-reduction measures. Implementing both substantially reduces the risks of crossing the 2°C threshold.
Channel MAC Protocol for Opportunistic Communication in Ad Hoc Wireless Networks
NASA Astrophysics Data System (ADS)
Ashraf, Manzur; Jayasuriya, Aruna; Perreau, Sylvie
2008-12-01
Despite significant research effort, the performance of distributed medium access control methods has failed to meet theoretical expectations. This paper proposes a protocol named "Channel MAC" that performs fully distributed medium access control based on opportunistic communication principles. In this protocol, nodes access the channel when the channel quality increases beyond a threshold, while neighbouring nodes are deemed to be silent. Once a node starts transmitting, it will keep transmitting until the channel becomes "bad." We derive an analytical throughput limit for Channel MAC in a shared multiple access environment. Furthermore, three performance metrics of Channel MAC—throughput, fairness, and delay—are analysed in single hop and multihop scenarios using NS2 simulations. The simulation results show a throughput performance improvement of up to 130% with Channel MAC over IEEE 802.11. We also show that the severe resource starvation problem (unfairness) of IEEE 802.11 in some network scenarios is reduced by the Channel MAC mechanism.
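A toy, single-node sketch of the access rule described above: the node samples its channel gain each slot, begins transmitting once the gain rises above an access threshold, and keeps transmitting until the gain drops below a "bad" threshold. The i.i.d. Rayleigh-fading model, the slotted structure, and all parameter values are illustrative assumptions, not the Channel MAC specification.

```python
import numpy as np

def opportunistic_schedule(n_slots=1000, access_threshold=1.2,
                           bad_threshold=0.8, seed=0):
    """Simulate threshold-triggered channel access over i.i.d. Rayleigh fading.

    The node is idle until the channel power gain exceeds `access_threshold`,
    then transmits every slot until the gain drops below `bad_threshold`.
    Returns the fraction of slots spent transmitting.
    """
    rng = np.random.default_rng(seed)
    gain = rng.exponential(scale=1.0, size=n_slots)   # Rayleigh fading power gain
    transmitting, busy_slots = False, 0
    for g in gain:
        if not transmitting and g > access_threshold:
            transmitting = True                       # channel turned "good": start
        elif transmitting and g < bad_threshold:
            transmitting = False                      # channel turned "bad": stop
        busy_slots += transmitting
    return busy_slots / n_slots

print(opportunistic_schedule())
```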
Do Contemporary Randomized Controlled Trials Meet ESMO Thresholds for Meaningful Clinical Benefit?
Del Paggio, J C; Azariah, B; Sullivan, R; Hopman, W M; James, F V; Roshni, S; Tannock, I F; Booth, C M
2017-01-01
The European Society for Medical Oncology (ESMO) recently released a magnitude of clinical benefit scale (ESMO-MCBS) for systemic therapies for solid cancers. Here, we evaluate contemporary randomized controlled trials (RCTs) against the proposed ESMO thresholds for meaningful clinical benefit. RCTs evaluating systemic therapy for breast cancer, non-small cell lung cancer (NSCLC), colorectal cancer (CRC), and pancreatic cancer published in 2011-2015 were reviewed. Data were abstracted regarding trial characteristics and outcomes, and these were applied to the ESMO-MCBS. We also determined whether RCTs were designed to detect an effect that would meet clinical benefit as defined by the ESMO-MCBS. A total of 277 eligible RCTs were included (40% breast, 31% NSCLC, 22% CRC, 6% pancreas). Median sample size was 532 and 83% were funded by industry. Among all 277 RCTs, the experimental therapy was statistically superior to the control arm in 138 (50%) trials: results of only 31% (43/138) of these trials met the ESMO-MCBS clinical benefit threshold. RCTs with curative intent were more likely to meet clinically meaningful thresholds than those with palliative intent [61% (19/31) versus 22% (24/107), P < 0.001]. Among the 226 RCTs for which the ESMO-MCBS could be applied, 31% (70/226) were designed to detect an effect size that could meet ESMO-MCBS thresholds. Less than one-third of contemporary RCTs with statistically significant results meet ESMO thresholds for meaningful clinical benefit, and this represents only 15% of all published trials. Investigators, funding agencies, regulatory agencies, and industry should adopt more stringent thresholds for meaningful benefit in the design of future RCTs. © The Author 2016. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Nimdet, Khachapon; Chaiyakunapruk, Nathorn; Vichansavakul, Kittaya; Ngorsuraches, Surachat
2015-01-01
Background A number of studies have been conducted to estimate willingness to pay (WTP) per quality-adjusted life year (QALY) in patients or the general population for various diseases. However, there has not been any systematic review summarizing the relationship between WTP per QALY and the cost-effectiveness (CE) threshold based on the World Health Organization (WHO) recommendation. Objective To systematically review the willingness-to-pay per quality-adjusted life year (WTP per QALY) literature, to compare WTP per QALY with the CE threshold recommended by WHO, and to determine potential influencing factors. Methods We searched MEDLINE, EMBASE, PsycINFO, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Centre for Reviews and Dissemination (CRD), and EconLit from inception through 15 July 2014. To be included, studies had to estimate WTP per QALY for health-related issues using a stated preference method. Two investigators independently reviewed each abstract, completed full-text reviews, and extracted information for included studies. We compared WTP per QALY to GDP per capita, and analyzed and summarized potential influencing factors. Results Out of 3,914 articles found, 14 studies were included. Most studies (92.85%) used the contingent valuation method, while only one study used discrete choice experiments. Sample size varied from 104 to 21,896 persons. The ratio between WTP per QALY and GDP per capita varied widely from 0.05 to 5.40, depending on scenario outcomes (e.g., whether it extended/saved life or improved quality of life), severity of hypothetical scenarios, duration of scenario, and source of funding. The average ratio of WTP per QALY to GDP per capita for extending or saving life (2.03) was significantly higher than the average for improving quality of life (0.59), with a mean difference of 1.43 (95% CI, 1.06 to 1.81). Conclusion This systematic review provides an overview of all studies estimating WTP per QALY. The variation in the ratio of WTP per QALY to GDP per capita, which depended on several factors, may prompt discussions on CE threshold policy. Our work provides a foundation for defining the future direction of decision criteria for an evidence-informed decision-making system. PMID:25855971
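Since the review's headline quantity is the ratio of WTP per QALY to GDP per capita, a small helper that classifies a hypothetical estimate against the familiar WHO-style GDP multipliers may be useful; the dollar figures in the example are invented, and the 1x/3x bands follow the WHO-CHOICE convention rather than anything stated in this review.

```python
def who_ce_band(wtp_per_qaly: float, gdp_per_capita: float) -> str:
    """Classify a WTP-per-QALY estimate against WHO-style GDP-per-capita bands.

    Under that rubric, ratios below 1x GDP per capita are 'highly
    cost-effective', 1-3x are 'cost-effective', and above 3x are not.
    """
    ratio = wtp_per_qaly / gdp_per_capita
    if ratio < 1:
        band = "highly cost-effective region (<1x GDP per capita)"
    elif ratio <= 3:
        band = "cost-effective region (1-3x GDP per capita)"
    else:
        band = "above the 3x GDP per capita threshold"
    return f"ratio = {ratio:.2f}: {band}"

# Hypothetical example: WTP of $60,000 per QALY where GDP per capita is $50,000
print(who_ce_band(60_000, 50_000))
```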
Deviney, Frank A.; Rice, Karen; Brown, Donald E.
2012-01-01
Natural resource managers require information concerning the frequency, duration, and long-term probability of occurrence of water-quality indicator (WQI) violations of defined thresholds. The timing of these threshold crossings often is hidden from the observer, who is restricted to relatively infrequent observations. Here, a model for the hidden process is linked with a model for the observations, and the parameters describing duration, return period, and long-term probability of occurrence are estimated using Bayesian methods. A simulation experiment is performed to evaluate the approach under scenarios based on the equivalent of a total monitoring period of 5-30 years and an observation frequency of 1-50 observations per year. Given a constant threshold crossing rate, accuracy and precision of parameter estimates increased with longer total monitoring period and more-frequent observations. Given a fixed monitoring period and observation frequency, accuracy and precision of parameter estimates increased with longer times between threshold crossings. For most cases where the long-term probability of being in violation is greater than 0.10, it was determined that at least 600 observations are needed to achieve precise estimates. An application of the approach is presented using 22 years of quasi-weekly observations of acid-neutralizing capacity from Deep Run, a stream in Shenandoah National Park, Virginia. The time series was also sub-sampled to simulate monthly and semi-monthly sampling protocols. Estimates of the long-term probability of violation were unbiased regardless of sampling frequency; however, the expected duration and return period were over-estimated using the sub-sampled time series with respect to the full quasi-weekly time series.
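To illustrate the hidden-process problem described above, the sketch below simulates a two-state violation/compliance process with exponentially distributed sojourn times and then estimates the long-term probability of violation from observations taken at a fixed sampling interval; the exponential model and all parameter values are illustrative assumptions, not the model fitted in the study.

```python
import numpy as np

def simulate_violation_fraction(years=20, mean_violation_d=10.0,
                                mean_compliant_d=90.0, sample_every_d=7.0, seed=0):
    """Estimate P(in violation) from regularly spaced observations of a hidden
    two-state process with exponential sojourn times (all durations in days)."""
    rng = np.random.default_rng(seed)
    horizon = years * 365.0
    t, in_violation = 0.0, False
    switch_times, states = [0.0], [False]          # state holds from each switch time
    while t < horizon:
        mean = mean_violation_d if in_violation else mean_compliant_d
        t += rng.exponential(mean)
        in_violation = not in_violation
        switch_times.append(t)
        states.append(in_violation)
    obs_times = np.arange(0.0, horizon, sample_every_d)
    idx = np.searchsorted(switch_times, obs_times, side="right") - 1
    observed_states = np.array(states)[idx]
    return observed_states.mean()

# Long-run truth here is 10 / (10 + 90) = 0.10; sparser sampling gives noisier estimates
print(simulate_violation_fraction(sample_every_d=7.0))
print(simulate_violation_fraction(sample_every_d=30.0))
```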
Effect of Heavy Consumption of Alcoholic Beverages on the Perception of Sweet and Salty Taste.
Silva, Camile S; Dias, Vaneria R; Almeida, Juliane A Regis; Brazil, Jamile M; Santos, Ramon A; Milagres, Maria P
2016-05-01
To determine the threshold index of sweet and salty tastes in alcoholics undergoing treatment. Taste threshold was assessed using the 3-Alternative Forced Choice method in a control group (92 non-alcoholic volunteers) and a test group (92 alcoholics in therapy). The test group completed a structured questionnaire on lifestyle and habits. A significant difference was found between the threshold indices in the test (3.78) and control (1.39) groups. For the salty stimulus, no significant difference was noted in threshold detection between the control (0.17) and test (0.30) groups. A significant Pearson correlation was observed between the sweet taste threshold index in the test group and reported alcohol consumption. The test group reported characteristics such as loss of appetite (93%), weight loss during consumption (62%) and weight gain after quitting drinking (72%). That the alcoholic group reported less sensitivity to sweet taste suggests that drinking habits may influence choice of foods, with a greater preference for foods with higher sucrose concentration. This may contribute to poor health, because excess sugar consumption raises the risk of several diseases. No conclusive results were found for the salty stimulus. © The Author 2015. Medical Council on Alcohol and Oxford University Press. All rights reserved.
The Evaluation of Olfactory Function in Patients With Schizophrenia.
Robabeh, Soleimani; Mohammad, Jalali Mir; Reza, Ahmadi; Mahan, Badri
2015-04-23
The aims of this study were (1) to compare olfactory threshold, smell identification, and intensity and pleasantness ratings between patients with schizophrenia and healthy controls, and (2) to evaluate correlations between ratings of olfactory probes and illness characteristics. Thirty-one patients with schizophrenia and 31 control subjects were assessed with the olfactory n-butanol threshold test, the Iran smell identification test (Ir-SIT), and the suprathreshold amyl acetate odor intensity and odor pleasantness rating test. All olfactory tasks were performed unirhinally. Patients with schizophrenia showed disrupted olfaction on all four measures. Longer duration of schizophrenia was associated with a larger impairment of olfactory threshold or microsmic range on the Ir-SIT (P=0.04, P=0.05, respectively). In patients with schizophrenia, female subjects' ratings of pleasantness followed the same trend as control subjects', whereas male patients' ratings showed an opposite trend. Patients exhibiting high positive scores on the positive and negative syndrome scale (PANSS) performed better on the olfactory threshold test (r=0.37, P=0.04). The higher odor pleasantness ratings of patients were associated with the presence of positive symptoms. The results suggest that both male and female patients with schizophrenia had difficulties on the olfactory threshold and smell identification tests, but appraisal of odor pleasantness was more disrupted in male patients.
Ozone's Threat Hits Back Mexico City
NASA Astrophysics Data System (ADS)
Velasco, E.; Retama, A.; Guzman, D.
2016-12-01
Last March, for the first time in 13 years, the Mexican authorities activated the environmental alarm when ozone (O3) reached 210 ppb. The emergency measures created confusion among the public, who had lost memory of previous air quality crises. Although Mexico City has experienced significant progress towards achieving cleaner air during the last 20 years, a recent relaxation of traffic regulations and meteorology favorable for photochemical activity triggered this new episode. All criteria pollutants of primary origin have been controlled and are in compliance with the Mexican Air Quality Standards. However, O3 and fine particles still exceed the standard threshold concentrations. For instance, 49-64% of days have exceeded the 1-hour O3 standard of 95 ppb during the last 5 years. The current control policies, which responded to the integration of air quality information by authorities and scientists, have apparently started to lose effectiveness. Although precursor gases such as alkanes and aromatics have shown an important decrease, reactive olefins have gained importance. The increase in motor vehicles in recent years seems to be fueling the atmosphere's reactivity again. This paper analyses the effectiveness of the emergency measures during the crisis based on the knowledge obtained from previous large field studies and the comprehensive data collected by the local air quality monitoring network. It has been 10 years since MILAGRO, the last interdisciplinary study that examined the air pollution of the most populous city in North America. We call for a new collaborative research initiative based on a major field measurement campaign with the target of revealing new insights into the meteorology, emission of primary pollutants and precursor gases, and photochemical production and formation of secondary particles in the atmosphere of Mexico City to improve its air quality, as well as that of similar cities in the developing world.
Weston, Victoria C.; Meurer, William J.; Frederiksen, Shirley M.; Fox, Allison K.; Scott, Phillip A.
2016-01-01
Objectives Cluster randomized trials (CRTs) are increasingly utilized to evaluate quality improvement interventions aimed at healthcare providers. In trials testing emergency department interventions, migration of emergency physicians (EPs) between hospitals is an important concern, as contamination may affect both internal and external validity. We hypothesized that geographically isolating emergency departments would prevent migratory contamination in a CRT designed to increase ED delivery of tPA in stroke (The INSTINCT Trial). Methods INSTINCT was a prospective, cluster randomized, controlled trial. 24 Michigan community hospitals were randomly selected in matched pairs for study. Contamination was defined at the cluster level, with substantial contamination defined a priori as >10% of EPs affected. Non-adherence, total crossover (contamination + non-adherence), migration distance and characteristics were determined. Results 307 emergency physicians were identified at all sites. Overall, 7 (2.3%) changed study sites. 1 moved between control sites, leaving 6 (2.0%) total crossovers. Of these, 2 (0.7%) moved from intervention to control (contamination) and 4 (1.3%) moved from control to intervention (non-adherence). Contamination was observed in 2 of 12 control sites, with 17% and 9% contamination of the total site EP workforce at follow-up, respectively. Average migration distance was 42 miles for all EPs moving in the study and 35 miles for EPs moving from intervention to control sites. Conclusion The mobile nature of emergency physicians should be considered in the design of quality improvement CRTs. Increased reporting of contamination in CRTs is encouraged to clarify thresholds and facilitate CRT design. PMID:25440230
Improved Atmospheric Soundings and Error Estimates from Analysis of AIRS/AMSU Data
NASA Technical Reports Server (NTRS)
Susskind, Joel
2007-01-01
The AIRS Science Team Version 5.0 retrieval algorithm became operational at the Goddard DAAC in July 2007, generating near real-time products from analysis of AIRS/AMSU sounding data. This algorithm contains many significant theoretical advances over the AIRS Science Team Version 4.0 retrieval algorithm used previously. Three very significant developments of Version 5 are: 1) the development and implementation of an improved Radiative Transfer Algorithm (RTA) which allows for accurate treatment of non-Local Thermodynamic Equilibrium (non-LTE) effects on shortwave sounding channels; 2) the development of methodology to obtain very accurate case-by-case product error estimates which are in turn used for quality control; and 3) the development of an accurate AIRS-only cloud clearing and retrieval system. These theoretical improvements taken together enabled a new methodology to be developed which further improves soundings in partially cloudy conditions, without the need for microwave observations in the cloud clearing step as has been done previously. In this methodology, longwave CO2 channel observations in the spectral region 700 cm⁻¹ to 750 cm⁻¹ are used exclusively for cloud clearing purposes, while shortwave CO2 channels in the spectral region 2195 cm⁻¹ to 2395 cm⁻¹ are used for temperature sounding purposes. The new methodology for improved error estimates and their use in quality control is described briefly and results are shown indicative of their accuracy. Results are also shown of forecast impact experiments assimilating AIRS Version 5.0 retrieval products in the Goddard GEOS-5 Data Assimilation System using different quality control thresholds.
On the Estimation of the Cost-Effectiveness Threshold: Why, What, How?
Vallejo-Torres, Laura; García-Lorenzo, Borja; Castilla, Iván; Valcárcel-Nazco, Cristina; García-Pérez, Lidia; Linertová, Renata; Polentinos-Castro, Elena; Serrano-Aguilar, Pedro
2016-01-01
Many health care systems claim to incorporate the cost-effectiveness criterion in their investment decisions. Information on the system's willingness to pay per effectiveness unit, normally measured as quality-adjusted life-years (QALYs), however, is not available in most countries. This is partly because of the controversy that remains around the use of a cost-effectiveness threshold, about what the threshold ought to represent, and about the appropriate methodology to arrive at a threshold value. The aim of this article was to identify and critically appraise the conceptual perspectives and methodologies used to date to estimate the cost-effectiveness threshold. We provided an in-depth discussion of different conceptual views and undertook a systematic review of empirical analyses. Identified studies were categorized into the two main conceptual perspectives that argue that the threshold should reflect 1) the value that society places on a QALY and 2) the opportunity cost of investment to the system given budget constraints. These studies showed different underpinning assumptions, strengths, and limitations, which are highlighted and discussed. Furthermore, this review allowed us to compare the cost-effectiveness threshold estimates derived from different types of studies. We found that thresholds based on society's valuation of a QALY are generally larger than thresholds resulting from estimating the opportunity cost to the health care system. This implies that some interventions with positive social net benefits, as informed by individuals' preferences, might not be an appropriate use of resources under fixed budget constraints. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buechler, Miles A.
2012-05-02
This memo discusses observations made regarding a series of monotonic and cyclic uniaxial experiments performed on PBX9501 by Darla Thompson under Enhanced Surveillance Campaign support. The observations discussed in the section 'Cyclic compression observations' strongly suggest the presence of viscoelastic, plastic, and damage phenomena in the mechanical response of the material. In the section 'Uniaxial data analysis and observations', methods are discussed for separating out the viscoelastic effects. A crude application of those methods suggests the possibility of a critical stress below which plasticity and damage may be negligible. This threshold should be explored because, if it exists, it will be an important feature of any constitutive model. Additionally, if the threshold exists, then modifications of experimental methods may be feasible which could potentially simplify future experiments or provide higher quality data from those experiments. A set of experiments to explore the threshold stress is proposed in the section 'Exploratory tests program for identifying threshold stress'.
High efficiency low threshold current 1.3 μm InAs quantum dot lasers on on-axis (001) GaP/Si
NASA Astrophysics Data System (ADS)
Jung, Daehwan; Norman, Justin; Kennedy, M. J.; Shang, Chen; Shin, Bongki; Wan, Yating; Gossard, Arthur C.; Bowers, John E.
2017-09-01
We demonstrate highly efficient, low threshold InAs quantum dot lasers epitaxially grown on on-axis (001) GaP/Si substrates using molecular beam epitaxy. Electron channeling contrast imaging measurements show a threading dislocation density of 7.3 × 10⁶ cm⁻² from an optimized GaAs template grown on GaP/Si. The high-quality GaAs templates enable as-cleaved quantum dot lasers to achieve a room-temperature continuous-wave (CW) threshold current of 9.5 mA, a threshold current density as low as 132 A/cm², a single-side output power of 175 mW, and a wall-plug efficiency of 38.4% at room temperature. As-cleaved QD lasers show ground-state CW lasing up to 80 °C. The application of a 95% high-reflectivity coating on one laser facet results in a CW threshold current of 6.7 mA, which is a record-low value for any kind of Fabry-Perot laser grown on Si.
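For a quick sanity check of the headline numbers, threshold current and threshold current density are related through the pumped area A. If, hypothetically, the quoted 9.5 mA and 132 A/cm² referred to the same device (the abstract does not say they do), the implied area would be:

```latex
J_{th} = \frac{I_{th}}{A}
\quad\Rightarrow\quad
A = \frac{I_{th}}{J_{th}}
  = \frac{9.5\times10^{-3}\,\mathrm{A}}{132\,\mathrm{A/cm^2}}
  \approx 7.2\times10^{-5}\,\mathrm{cm^2}
  \approx 7200\,\mu\mathrm{m^2},
```

which would be consistent with, for example, a ridge a few microns wide and a millimetre or two long.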
Improving fMRI reliability in presurgical mapping for brain tumours.
Stevens, M Tynan R; Clarke, David B; Stroink, Gerhard; Beyea, Steven D; D'Arcy, Ryan Cn
2016-03-01
Functional MRI (fMRI) is becoming increasingly integrated into clinical practice for presurgical mapping. Current efforts are focused on validating data quality, with reliability being a major factor. In this paper, we demonstrate the utility of a recently developed approach that uses receiver operating characteristic-reliability (ROC-r) to: (1) identify reliable versus unreliable data sets; (2) automatically select processing options to enhance data quality; and (3) automatically select individualised thresholds for activation maps. Presurgical fMRI was conducted in 16 patients undergoing surgical treatment for brain tumours. Within-session test-retest fMRI was conducted, and ROC-reliability of the patient group was compared to a previous healthy control cohort. Individually optimised preprocessing pipelines were determined to improve reliability. Spatial correspondence was assessed by comparing the fMRI results to intraoperative cortical stimulation mapping, in terms of the distance to the nearest active fMRI voxel. The average ROC-r reliability for the patients was 0.58±0.03, as compared to 0.72±0.02 in healthy controls. For the patient group, this increased significantly to 0.65±0.02 by adopting optimised preprocessing pipelines. Co-localisation of the fMRI maps with cortical stimulation was significantly better for more reliable versus less reliable data sets (8.3±0.9 vs 29±3 mm, respectively). We demonstrated ROC-r analysis for identifying reliable fMRI data sets, choosing optimal postprocessing pipelines, and selecting patient-specific thresholds. Data sets with higher reliability also showed closer spatial correspondence to cortical stimulation. ROC-r can thus identify poor fMRI data at time of scanning, allowing for repeat scans when necessary. ROC-r analysis provides optimised and automated fMRI processing for improved presurgical mapping. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J; Chan, F; Newman, B
2014-06-15
Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface which allows the scanning protocols to be analyzed alongside the actual dose estimates, and the data to be compared to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include: statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
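The dose-monitoring function described above boils down to flagging exams whose dose metrics exceed user-set thresholds and summarising doses per protocol. A minimal sketch in Python (the tool itself is Matlab-based) is shown below; the column names, threshold values, and table layout are assumptions for illustration.

```python
import pandas as pd

# Illustrative user-set thresholds; real limits would come from ACR/AAPM or
# internal reference values, per protocol rather than global.
THRESHOLDS = {"CTDIvol_mGy": 25.0, "DLP_mGycm": 1000.0, "SSDE_mGy": 30.0}

def flag_exams(exams: pd.DataFrame, thresholds: dict = THRESHOLDS) -> pd.DataFrame:
    """Return exams exceeding any dose threshold, assuming columns named as in
    THRESHOLDS plus 'protocol' and 'exam_date'."""
    over = pd.Series(False, index=exams.index)
    for column, limit in thresholds.items():
        over |= exams[column] > limit
    return exams[over].sort_values(["protocol", "exam_date"])

def protocol_summary(exams: pd.DataFrame) -> pd.DataFrame:
    """Median and 75th percentile of each dose metric, stratified by protocol."""
    return exams.groupby("protocol")[list(THRESHOLDS)].describe(percentiles=[0.5, 0.75])
```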
Geller, A M; Oshiro, W M; Haykal-Coates, N; Kodavanti, P R; Bushnell, P J
2001-02-01
Developmental exposure to polychlorinated biphenyls (PCBs) has been associated with behavioral and cognitive deficits in humans and animal models. Perinatal exposure to PCBs has also been associated with sensory deficits in animal models. These effects were hypothesized to be mediated in part by ortho-substituted PCBs, which do not bind, or bind only weakly, to the aryl hydrocarbon (Ah) receptor. The present studies were designed to determine whether perinatal exposure to Aroclor 1254, a commercial mixture of > 99% ortho-substituted PCBs, would affect cognitive and sensory function in Long-Evans rats. Adult male and female offspring of female rats fed Aroclor 1254 (Lot #124-191; doses of 0, 1, or 6 mg/kg/day; gestational day 6 through postnatal day 21; n = 8/group) were trained to perform a signal detection task capable of assessing sensory thresholds. Training included autoshaping and operant conditioning. Thresholds for detecting a 1-s light stimulus were determined under background illuminations ranging from 2 lux to complete darkness. Female rats exposed to Aroclor 1254 autoshaped more rapidly than control females, at a rate similar to that of control males. Control females had lower thresholds than control males at all levels of background illumination. These differences were abolished by Aroclor 1254, which reduced thresholds in males and increased thresholds in females. These data extend previous findings of gender-specific effects of PCBs on neurobehavioral development to measures of acquisition and sensory function.
Processing circuitry for single channel radiation detector
NASA Technical Reports Server (NTRS)
Holland, Samuel D. (Inventor); Delaune, Paul B. (Inventor); Turner, Kathryn M. (Inventor)
2009-01-01
Processing circuitry is provided for a high voltage operated radiation detector. An event detector utilizes a comparator configured to produce an event signal based on a leading edge threshold value. A preferred event detector does not produce another event signal until a trailing edge threshold value is satisfied. The event signal can be utilized for counting the number of particle hits and also for controlling data collection operation for a peak detect circuit and timer. The leading edge threshold value is programmable such that it can be reprogrammed by a remote computer. A digital high voltage control is preferably operable to monitor and adjust high voltage for the detector.
Sporadic frame dropping impact on quality perception
NASA Astrophysics Data System (ADS)
Pastrana-Vidal, Ricardo R.; Gicquel, Jean Charles; Colomes, Catherine; Cherifi, Hocine
2004-06-01
Over the past few years there has been an increasing interest in real time video services over packet networks. When considering quality, it is essential to quantify user perception of the received sequence. Severe motion discontinuities are one of the most common degradations in video streaming. The end-user perceives a jerky motion when the discontinuities are uniformly distributed over time and an instantaneous fluidity break is perceived when the motion loss is isolated or irregularly distributed. Bit rate adaptation techniques, transmission errors in the packet networks or restitution strategy could be the origin of this perceived jerkiness. In this paper we present a psychovisual experiment performed to quantify the effect of sporadically dropped pictures on the overall perceived quality. First, the perceptual detection thresholds of generated temporal discontinuities were measured. Then, the quality function was estimated in relation to a single frame dropping for different durations. Finally, a set of tests was performed to quantify the effect of several impairments distributed over time. We have found that the detection thresholds are content, duration and motion dependent. The assessment results show how quality is impaired by a single burst of dropped frames in a 10 sec sequence. The effect of several bursts of discarded frames, irregularly distributed over the time is also discussed.
Statistical corruption in Beijing's air quality data has likely ended in 2012
NASA Astrophysics Data System (ADS)
Stoerk, Thomas
2016-02-01
This research documents changes in likely misreporting in official air quality data from Beijing for the years 2008-2013. It is shown that, consistent with prior research, the official Chinese data report suspiciously few observations that exceed the politically important Blue Sky Day threshold, a particular air pollution level used to evaluate local officials, and an excess of observations just below that threshold. Similar data, measured by the US Embassy in Beijing, do not show this irregularity. To document likely misreporting, this analysis proposes a new way of comparing air quality data via Benford's Law, a statistical regularity known to fit air pollution data. Using this method to compare the official data to the US Embassy data for the first time, I find that the Chinese data fit Benford's Law poorly until a change in air quality measurements at the end of 2012. From 2013 onwards, the Chinese data fit Benford's Law closely. The US Embassy data, by contrast, exhibit no variation over time in the fit with Benford's Law, implying that the underlying pollution processes remain unchanged. These findings suggest that misreporting of air quality data for Beijing has likely ended in 2012. Additionally, I use aerosol optical density data to show the general applicability of this method of detecting likely misreporting in air pollution data.
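The Benford comparison described above can be sketched briefly. The code below (a minimal illustration with made-up readings, not the Beijing or US Embassy data) computes a chi-square distance between observed first-digit frequencies and the Benford expectation P(d) = log10(1 + 1/d); a poor fit gives a large distance:

import math
from collections import Counter

def first_digit(x):
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_chi2(values):
    """Chi-square distance between observed first-digit counts and Benford's Law."""
    digits = [first_digit(v) for v in values if v > 0]
    n = len(digits)
    counts = Counter(digits)
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)
        observed = counts.get(d, 0)
        chi2 += (observed - expected) ** 2 / expected
    return chi2

# Illustrative daily pollution readings (made up), not data from the study:
readings = [12.0, 35.5, 48.0, 150.0, 99.0, 101.0, 23.0, 18.0, 76.0, 210.0]
print(benford_chi2(readings))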
Binarization of Gray-Scaled Digital Images Via Fuzzy Reasoning
NASA Technical Reports Server (NTRS)
Dominquez, Jesus A.; Klinko, Steve; Voska, Ned (Technical Monitor)
2002-01-01
A new fast-computational technique based on a fuzzy entropy measure has been developed to find an optimal binary image threshold. In this method, the image pixel membership functions depend on the threshold value and reflect the distribution of pixel values in two classes; thus, the technique minimizes the classification error. The new method is compared with two of the best-known threshold selection techniques, Otsu and Huang-Wang. The proposed method outperforms the Huang-Wang and Otsu methods when the image consists of a textured background and poor printing quality. All three methods perform well, but yield different binarization results, when the background and foreground of the image have well-separated gray-level ranges.
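For reference, Otsu's method, one of the two comparison techniques named above (not the authors' fuzzy-entropy method), can be written compactly: it picks the threshold that maximises the between-class variance of the gray-level histogram. A minimal sketch:

import numpy as np

def otsu_threshold(image, bins=256):
    """Return the gray level that maximises between-class variance."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, bins))
    prob = hist.astype(float) / hist.sum()
    levels = np.arange(bins)
    best_t, best_var = 0, -1.0
    for t in range(1, bins):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t

img = np.random.randint(0, 256, size=(64, 64))   # stand-in for a scanned document
t = otsu_threshold(img)
binary = img >= t                                # foreground/background split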
Hassan, Hishar; Abu Bakar, Suharzelim; Halim, Khairul Najah Che A; Idris, Jaleezah; Nordin, Abdul Jalil
2016-01-01
Prostate cancer continues to be the most prevalent cancer in men in Malaysia. The prospect of the PET imaging modality in the diagnosis of prostate cancer is promising, with ongoing improvement of novel tracers. Among these tracers, 18F-Fluorocholine is reported to be a reputable tracer and a reliable diagnostic technique for prostate imaging. Nonetheless, only 18F-Fluorodeoxyglucose (18F-FDG) is available and used in most oncology cases in Malaysia. With a small-scale GMP-based radiopharmaceuticals laboratory set-up, initial efforts have been taken to put Malaysia on the 18F-Fluorocholine map. This article presents a convenient, efficient and reliable method for quality control analysis of 18F-Fluorocholine. In addition, this work aims to provide local GMP radiopharmaceutical laboratories and the local authority in Malaysia with a quality control analysis guideline for 18F-Fluorocholine. In this study, prior to synthesis, the quality control analysis method for 18F-Fluorocholine was developed and validated by adapting the equipment set-up used in routine 18F-FDG production. Quality control of the 18F-Fluorocholine was performed by means of pH, radionuclidic identity, radio-high performance liquid chromatography equipped with ultraviolet detection, radio-thin layer chromatography, gas chromatography and a filter integrity test. Post-synthesis, the pH of 18F-Fluorocholine was 6.42 ± 0.04, with a half-life of 109.5 minutes (n = 12). The radiochemical purity was consistently higher than 99%, both by radio-high performance liquid chromatography with ultraviolet detection (r-HPLC; SCX column, 0.25 M NaH2PO4: acetonitrile) and by radio-thin layer chromatography (r-TLC). The calculated relative retention time (RRT) in r-HPLC was 1.02, whereas the retention factor (Rf) in r-TLC was 0.64. Potential impurities from 18F-Fluorocholine synthesis such as ethanol, acetonitrile, dimethylethanolamine and dibromomethane were determined by gas chromatography. Using our parameters (capillary column: DB-200, 30 m x 0.53 mm x 1 µm) and an oven temperature of 35°C (isothermal), all compounds were well resolved and eluted within 3 minutes. Levels of ethanol and acetonitrile in 18F-Fluorocholine were below the threshold limits, at less than 5 mg/ml and 0.41 mg/ml respectively, while dimethylethanolamine and dibromomethane were undetectable. A convenient, efficient and reliable quality control analysis work-up procedure for 18F-Fluorocholine has been established and validated to comply with all the release criteria. This convenient method of quality control analysis may provide a guideline for local GMP radiopharmaceutical laboratories to start producing 18F-Fluorocholine as a tracer for prostate cancer imaging.
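A small worked example of the two chromatography ratios quoted above may help; the definitions are the standard ones (RRT is the analyte retention time divided by that of a reference peak; the TLC retention factor Rf is the spot migration distance divided by the solvent-front distance), and the input numbers are illustrative values chosen to reproduce the reported ratios, not measurements from the paper:

def relative_retention_time(rt_analyte_min, rt_reference_min):
    """RRT = retention time of analyte / retention time of reference peak."""
    return rt_analyte_min / rt_reference_min

def retention_factor(spot_distance_cm, solvent_front_cm):
    """Rf = distance travelled by the spot / distance travelled by the solvent front."""
    return spot_distance_cm / solvent_front_cm

print(relative_retention_time(7.65, 7.50))   # 1.02, matching the reported RRT
print(retention_factor(4.8, 7.5))            # 0.64, matching the reported Rf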
Decreases in Human Semen Quality with Age Among Healthy Men
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eskenazi, B.; Wyrobek, A.J.; Kidd, S.A.
The objective of this report is to characterize the associations between age and semen quality among healthy active men after controlling for identified covariates. Participants were ninety-seven healthy, nonsmoking men between 22 and 80 years of age, without known fertility problems, who worked for or had retired from a large research laboratory. There was a gradual decrease in all semen parameters from 22-80 years of age. After adjusting for covariates, volume decreased 0.03 ml per year (p = 0.001); sperm concentration decreased 2.5% per year (p = 0.005); total count decreased 3.6% per year of age (p < 0.001); motility decreased 0.7% per year (p < 0.001); progressive motility decreased 3.1% per year (p < 0.001); and total progressively motile sperm decreased 4.8% per year (p < 0.001). In a group of healthy active men, semen volume, sperm concentration, total sperm count, and sperm motility decrease continuously between 22-80 years of age, with no evidence of a threshold.
Threshold Switchable Particles (TSPs) To Control Internal Hemorrhage
2016-09-01
hemorrhage at local sites. Four collaborating laboratories worked together under this contract to define threshold levels of activators of blood clotting...such that the candidate clotting activators will circulate in the blood at a concentration below the threshold necessary to trigger clotting, but...accumulation of the activators at sites of internal injury/bleeding will cause the local concentration of clotting activators to exceed the clotting
Exercises for adolescent idiopathic scoliosis.
Romano, Michele; Minozzi, Silvia; Bettany-Saltikov, Josette; Zaina, Fabio; Chockalingam, Nachiappan; Kotwicki, Tomasz; Maier-Hennes, Axel; Negrini, Stefano
2012-08-15
Adolescent idiopathic scoliosis (AIS) is a three-dimensional deformity of the spine. While AIS can progress during growth and cause a surface deformity, it is usually not symptomatic. However, in adulthood, if the final spinal curvature surpasses a certain critical threshold, the risk of health problems and curve progression is increased. The use of scoliosis-specific exercises (SSE) to reduce progression of AIS and postpone or avoid other more invasive treatments is controversial. To evaluate the efficacy of SSE in adolescent patients with AIS. The following databases (up to 30 March 2011) were searched with no language limitations: CENTRAL (The Cochrane Library 2011, issue 2), MEDLINE (from January 1966), EMBASE (from January 1980), CINHAL (from January 1982), SportDiscus (from January 1975), PsycInfo (from January 1887), PEDro (from January 1929). We screened reference lists of articles and also conducted an extensive handsearch of grey literature. Randomised controlled trials and prospective cohort studies with a control group comparing exercises with no treatment, other treatment, surgery, and different types of exercises were included. Two review authors independently selected studies, assessed risk of bias and extracted data. Two studies (154 participants) were included. There is low quality evidence from one randomised controlled study that exercises used as an adjunct to other conservative treatments increase the efficacy of these treatments (thoracic curve reduced: mean difference (MD) 9.00 (95% confidence interval (CI) 5.47 to 12.53); lumbar curve reduced: MD 8.00 (95% CI 5.08 to 10.92)). There is very low quality evidence from a prospective controlled cohort study that scoliosis-specific exercises structured within an exercise programme can reduce brace prescription (risk ratio (RR) 0.24, 95% CI 0.06 to 1.04) as compared to usual physiotherapy (many different kinds of general exercises according to the preferences of the individual therapists within different facilities). There is a lack of high quality evidence to recommend the use of SSE for AIS. One very low quality study suggested that these exercises may be more effective than electrostimulation, traction and postural training to avoid scoliosis progression, but better quality research needs to be conducted before the use of SSE can be recommended in clinical practice.
Exercises for adolescent idiopathic scoliosis: a Cochrane systematic review.
Romano, Michele; Minozzi, Silvia; Zaina, Fabio; Saltikov, Josette Bettany; Chockalingam, Nachiappan; Kotwicki, Tomasz; Hennes, Axel Maier; Negrini, Stefano
2013-06-15
Systematic review of interventions. To evaluate the efficacy of scoliosis-specific exercise (SSE) in adolescent patients with adolescent idiopathic scoliosis (AIS). AIS is a 3-dimensional deformity of the spine. Although AIS can progress during growth and cause a surface deformity, it is usually not symptomatic. However, in adulthood, if the final spinal curvature surpasses a certain critical threshold, the risk of health problems and curve progression is increased. The use of SSEs to reduce progression of AIS and postpone or avoid other more invasive treatments is controversial. The following databases (up to March 30, 2011) were searched with no language limitations: CENTRAL (The Cochrane Library 2011, issue 2), MEDLINE (from January 1966), EMBASE (from January 1980), CINHAL (from January 1982), SPORTDiscus (from January 1975), PsycINFO (from January 1887), and PEDro (from January 1929). We screened reference lists of articles and conducted an extensive hand search of gray literature. Randomized controlled trials and prospective cohort studies with a control group comparing exercises with no treatment, other treatment, surgery, and different types of exercises were included. Two review authors independently selected studies, assessed risk of bias, and extracted data. Two studies (154 participants) were included. There is low-quality evidence from 1 randomized controlled study that exercises used as an adjunct to other conservative treatments increase the efficacy of these treatments (thoracic curve reduced: mean difference 9.00 [95% confidence interval, 5.47-12.53]; lumbar curve reduced: mean difference 8.00 [95% confidence interval, 5.08-10.92]). There is very low-quality evidence from a prospective controlled cohort study that SSEs structured within an exercise program can reduce brace prescription (risk ratio, 0.24 [95% confidence interval, 0.06-1.04]) as compared with "usual physiotherapy" (many different kinds of general exercises according to the preferences of the individual therapists within different facilities). There is a lack of high-quality evidence to recommend the use of SSE for AIS. One very low-quality study suggested that these exercises may be more effective than electrostimulation, traction, and postural training to avoid scoliosis progression, but better quality research needs to be conducted before the use of SSE can be recommended in clinical practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie
Purpose: This study aims to investigate whether microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including different detectors, dose levels, and image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality (threshold gold thickness measured with a CDMAM phantom) and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty-two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at quarter this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection, a power law was fitted to the data. Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC (AFROC) area decreased from 0.84 to 0.63 and the ROC area decreased from 0.91 to 0.79 (p < 0.0001). This corresponded to a 30% drop in lesion sensitivity at a NLF equal to 0.1. Detection was also sensitive to the dose used. There was no significant difference in detection between the two image processing algorithms used (p > 0.05). It was additionally found that a lower threshold gold thickness from CDMAM analysis implied better cluster detection. The measured threshold gold thickness passed the acceptable limit set in the EU standards for all image qualities except half-dose CR. However, calcification detection varied significantly between image qualities. This suggests that the current EU guidelines may need revising. Conclusions: Microcalcification detection was found to be sensitive to the detector and dose used. Standard measurements of image quality were a good predictor of microcalcification cluster detection.
Retained energy-based coding for EEG signals.
Bazán-Prieto, Carlos; Blanco-Velasco, Manuel; Cárdenas-Barrera, Julián; Cruz-Roldán, Fernando
2012-09-01
The use of long-term records in electroencephalography is becoming more frequent due to their diagnostic potential and the growth of novel signal processing methods that deal with these types of recordings. In these cases, the considerable volume of data to be managed makes compression necessary to reduce the bit rate for transmission and storage applications. In this paper, a new compression algorithm specifically designed to encode electroencephalographic (EEG) signals is proposed. Cosine modulated filter banks are used to decompose the EEG signal into a set of subbands well adapted to the characteristic frequency bands of the EEG. Given that no regular pattern can easily be extracted from the signal in the time domain, a thresholding-based method is applied to quantize samples. The retained-energy method is designed to efficiently compute the threshold in the decomposition domain, which, at the same time, allows the quality of the reconstructed EEG to be controlled. The experiments are conducted over a large set of signals taken from two public databases available at Physionet, and the results show that the compression scheme yields better compression than other reported methods. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.
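A hedged sketch of a retained-energy thresholding rule in the spirit of the abstract (not the authors' exact algorithm): keep the largest-magnitude subband coefficients until a target fraction of the signal energy is retained, and quantise everything below the resulting threshold to zero. The 0.99 energy fraction and the random coefficients are illustrative assumptions.

import numpy as np

def retained_energy_threshold(coeffs, energy_fraction=0.99):
    """Smallest magnitude kept so that retained coefficients hold the target energy."""
    mags = np.sort(np.abs(coeffs.ravel()))[::-1]     # magnitudes, descending
    energy = np.cumsum(mags ** 2)
    target = energy_fraction * energy[-1]
    k = int(np.searchsorted(energy, target)) + 1     # fewest coefficients reaching target
    return mags[min(k, len(mags)) - 1]

coeffs = np.random.randn(1024)                       # stand-in for filter-bank subband output
thr = retained_energy_threshold(coeffs, 0.99)
quantised = np.where(np.abs(coeffs) >= thr, coeffs, 0.0)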
Reduced somatosensory impairment by piezosurgery during orthognathic surgery of the mandible.
Brockmeyer, Phillipp; Hahn, Wolfram; Fenge, Stefan; Moser, Norman; Schliephake, Henning; Gruber, Rudolf Matthias
2015-09-01
This clinical trial aimed to test the hypothesis that piezosurgery causes less nerve irritation and, thus, reduced somatosensory impairment when used in orthognathic surgery of the mandible. To this end, 37 consecutive patients with Angle Class II and III malocclusion were treated using bilateral sagittal split osteotomies (BSSO) of the mandible. In a split-mouth design, one randomly assigned side of the mandible was operated on using a conventional saw, while a piezosurgery device was used on the contralateral side. In order to test the individual qualities of somatosensory function, quantitative sensory testing (QST) was performed 1 month, 6 months and 1 year after surgery. A comparison of the data using a two-way analysis of variance (ANOVA) revealed a significant reduction in postoperative impairment of the warm detection threshold (WDT) (P = 0.046), decreased dynamic mechanical allodynia (ALL) (P = 0.002) and a decreased vibration detection threshold (VDT) (P = 0.030) on the piezosurgery side of the mandible as opposed to the conventionally operated control side. In the remaining QST measures, minor deviations from the preoperative baseline conditions and a more rapid regression could be observed. Piezosurgery caused reduced somatosensory impairment and a faster recovery of somatosensory functions in the present investigation.
Drawing a baseline in aesthetic quality assessment
NASA Astrophysics Data System (ADS)
Rubio, Fernando; Flores, M. Julia; Puerta, Jose M.
2018-04-01
Aesthetic classification of images is an inherently subjective task. There is no validated collection of photographs labeled by experts as having good or bad quality. At present, the closest approximation is to use databases of photos in which a group of users rates each image. Hence, there is not a unique good/bad label but a rating distribution given by users' votes. Because of this peculiarity, it is not possible to pose binary supervised aesthetic classification as directly as other computer vision tasks. Recent literature follows an approach in which researchers use the average rating from the users for each image and establish an arbitrary threshold to determine its class or label. In this way, images above the threshold are considered of good quality, while images below the threshold are considered of bad quality. This paper analyzes the current literature and reviews the attributes able to represent an image, distinguishing three families: specific, general and deep features. Among those that have proved most competitive, we selected a representative subset, our main goal being to establish a clear experimental framework. Finally, once the features were selected, we applied them to the full AVA dataset. For validation we report not only accuracy values, which are not very informative in this case, but also metrics able to evaluate classification performance on imbalanced datasets. We conducted a series of experiments in which several well-known classifiers are learned from the data. In this way, the paper provides what we consider valuable and valid baseline results for the given problem.
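The labelling convention described above amounts to a one-line rule; a minimal sketch (the 1-10 rating scale and the cut-off of 5 are common but arbitrary assumptions, not the paper's prescription):

def label_images(mean_ratings, threshold=5.0):
    """Map {image_id: mean user rating} to a binary 'good'/'bad' label."""
    return {img: ("good" if rating >= threshold else "bad")
            for img, rating in mean_ratings.items()}

ratings = {"img_001": 6.3, "img_002": 4.1, "img_003": 5.0}
print(label_images(ratings))   # img_002 falls below the threshold and is labelled 'bad'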
Ackard, Diann M; Richter, Sara; Egan, Amber; Engel, Scott; Cronemeyer, Catherine L
2014-04-01
Compare general and disease-specific health-related quality of life (HRQoL) among female patients with an eating disorder (ED). Female patients (n = 221; 95.3% Caucasian; 94.0% never married) completed the Medical Outcome Short Form Health Survey (SF-36) and Eating Disorders Quality of Life (EDQoL) as part of a study of treatment outcomes. Multivariate regression models were used to compare HRQoL differences across initial ED diagnosis (85 AN-R, 19 AN-B/P, 27 BN, 90 EDNOS) and ED diagnostic classification at time of outcome assessment (140 no ED, 38 subthreshold ED, 43 full threshold ED). There were no significant differences across ED diagnosis at initial assessment on either of the SF-36 Component Summary scores. However, patients with AN-B/P scored poorer on the work/school EDQoL subscales than other ED diagnoses, and on the psychological EDQoL subscale compared to AN-R and EDNOS. At outcome assessment, comparisons across full threshold, subthreshold and no ED classification indicated that those with no ED reported better HRQoL than those with full threshold ED on the SF-36 Mental Components Summary and three of four EDQoL subscales. Furthermore, those with no ED reported better psychological HRQoL than those with subthreshold ED. Disease-specific HRQOL measures are important to use when comparing HRQoL in ED patients across treatment and outcome, and may have the sensitivity to detect meaningful differences by diagnosis more so than generic instruments. EDQoL scores from patients remitted from symptoms approach but do not reach scores for unaffected college females; thus, treatment should continue until quality of life is restored. Copyright © 2013 Wiley Periodicals, Inc.
Exploring the limits of frequency lowering
Souza, Pamela E.; Arehart, Kathryn H.; Kates, James M.; Croghan, Naomi B.H.; Gehani, Namita
2013-01-01
Objective This study examined how frequency lowering affected sentence intelligibility and quality, for adults with postlingually acquired, mild-to-moderate hearing loss. Method Listeners included adults aged 60–92 years with sloping sensorineural loss and a control group of similarly-aged adults with normal hearing. Sentences were presented in quiet and babble at a range of signal-to-noise ratios. Intelligibility and quality were measured with varying amounts of frequency lowering, implemented using a form of frequency compression. Results Moderate amounts of compression, particularly with high cutoff frequencies, had minimal effects on intelligibility. Listeners with the greatest high-frequency hearing loss showed the greatest benefit. Sentence intelligibility decreased with more compression. Listeners were more affected by a given set of parameters in noise. In quiet, any amount of compression resulted in lower speech quality for most listeners, with the greatest degradation for listeners with better high-frequency hearing. Quality ratings were lower with background noise, and in noise the effect of changing compression parameters was small. Conclusions The benefits of frequency lowering in adults were affected by the compression parameters as well as individual hearing thresholds. Data are consistent with the idea that frequency lowering can be viewed in terms of an improved audibility vs increased distortion tradeoff. PMID:23785188
Setting limits: Using air pollution thresholds to protect and restore U.S
Mark E Fenn; Kathleen F. Lambert; Tamara F. Blett; Douglas A. Burns; Linda H. Pardo; Gary M. Lovett; Richard A. Haeuber; David C. Evers; Charles T. Driscoll; Dean S. Jeffries
2011-01-01
More than four decades of research provide unequivocal evidence that sulfur, nitrogen, and mercury pollution have altered, and will continue to alter, our nation's lands and waters. The emission and deposition of air pollutants harm native plants and animals, degrade water quality, affect forest productivity, and are damaging to human health. Many air quality policies...
Jakobsen, Janus Christian; Katakam, Kiran Kumar; Schou, Anne; Hellmuth, Signe Gade; Stallknecht, Sandra Elkjær; Leth-Møller, Katja; Iversen, Maria; Banke, Marianne Bjørnø; Petersen, Iggiannguaq Juhl; Klingenberg, Sarah Louise; Krogh, Jesper; Ebert, Sebastian Elgaard; Timm, Anne; Lindschou, Jane; Gluud, Christian
2017-02-08
The evidence on selective serotonin reuptake inhibitors (SSRIs) for major depressive disorder is unclear. Our objective was to conduct a systematic review assessing the effects of SSRIs versus placebo, 'active' placebo, or no intervention in adult participants with major depressive disorder. We searched for eligible randomised clinical trials in The Cochrane Library's CENTRAL, PubMed, EMBASE, PsycLIT, PsycINFO, Science Citation Index Expanded, clinical trial registers of Europe and the USA, websites of pharmaceutical companies, the U.S. Food and Drug Administration (FDA), and the European Medicines Agency until January 2016. All data were extracted by at least two independent investigators. We used Cochrane systematic review methodology, Trial Sequential Analysis, and calculation of Bayes factors. An eight-step procedure was followed to assess whether thresholds for statistical and clinical significance were crossed. Primary outcomes were reduction of depressive symptoms, remission, and adverse events. Secondary outcomes were suicides, suicide attempts, suicidal ideation, and quality of life. A total of 131 randomised placebo-controlled trials enrolling a total of 27,422 participants were included. None of the trials used 'active' placebo or no intervention as the control intervention. All trials had high risk of bias. SSRIs significantly reduced the Hamilton Depression Rating Scale (HDRS) score at end of treatment (mean difference -1.94 HDRS points; 95% CI -2.50 to -1.37; P < 0.00001; 49 trials; Trial Sequential Analysis-adjusted CI -2.70 to -1.18); the Bayes factor was below the predefined threshold (2.01 × 10⁻²³). The effect estimate, however, was below our predefined threshold for clinical significance of 3 HDRS points. SSRIs significantly decreased the risk of no remission (RR 0.88; 95% CI 0.84 to 0.91; P < 0.00001; 34 trials; Trial Sequential Analysis-adjusted CI 0.83 to 0.92); the Bayes factor (1426.81) did not confirm the effect. SSRIs significantly increased the risk of serious adverse events (OR 1.37; 95% CI 1.08 to 1.75; P = 0.009; 44 trials; Trial Sequential Analysis-adjusted CI 1.03 to 1.89). This corresponds to 31/1000 SSRI participants experiencing a serious adverse event, compared with 22/1000 control participants. SSRIs also significantly increased the number of non-serious adverse events. There were almost no data on suicidal behaviour, quality of life, and long-term effects. SSRIs might have statistically significant effects on depressive symptoms, but all trials were at high risk of bias and the clinical significance seems questionable. SSRIs significantly increase the risk of both serious and non-serious adverse events. The potential small beneficial effects seem to be outweighed by harmful effects. PROSPERO CRD42013004420.
The effect of symmetrical and asymmetrical hearing impairment on music quality perception.
Cai, Yuexin; Zhao, Fei; Chen, Yuebo; Liang, Maojin; Chen, Ling; Yang, Haidi; Xiong, Hao; Zhang, Xueyuan; Zheng, Yiqing
2016-09-01
The purpose of this study was to investigate the effect of symmetrical, asymmetrical and unilateral hearing impairment on music quality perception. Six validated music pieces in the categories of classical music, folk music and pop music were used to assess music quality in terms of its 'pleasantness', 'naturalness', 'fullness', 'roughness' and 'sharpness'. 58 participants with sensorineural hearing loss [20 with unilateral hearing loss (UHL), 20 with bilateral symmetrical hearing loss (BSHL) and 18 with bilateral asymmetrical hearing loss (BAHL)] and 29 normal hearing (NH) subjects participated in the present study. Hearing impaired (HI) participants had greater difficulty in overall music quality perception than NH participants. Participants with BSHL rated music pleasantness and naturalness to be higher than participants with BAHL. Moreover, the hearing thresholds of the better ears from BSHL and BAHL participants as well as the hearing thresholds of the worse ears from BSHL participants were negatively correlated to the pleasantness and naturalness perception. HI participants rated the familiar music pieces higher than unfamiliar music pieces in the three music categories. Music quality perception in participants with hearing impairment appeared to be affected by symmetry of hearing loss, degree of hearing loss and music familiarity when they were assessed using the music quality rating test (MQRT). This indicates that binaural symmetrical hearing is important to achieve a high level of music quality perception in HI listeners. This emphasizes the importance of provision of bilateral hearing assistive devices for people with asymmetrical hearing impairment.
Olsen, Lisa D.; Fram, Miranda S.; Belitz, Kenneth
2010-01-01
Trace-element quality-control samples (for example, source-solution blanks, field blanks, and field replicates) were collected as part of a statewide investigation of groundwater quality in California, known as the Priority Basins Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basins Project is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB) to assess and monitor the quality of groundwater resources used for drinking-water supply and to improve public knowledge of groundwater quality in California. Trace-element field blanks were collected to evaluate potential bias in the corresponding environmental data. Bias in the environmental data could result from contamination in the field during sample collection, from the groundwater coming into contact with contaminants on equipment surfaces or from other sources, or from processing, shipping, or analyzing the samples. Bias affects the interpretation of environmental data, particularly if any constituents are present solely as a result of extrinsic contamination that would have otherwise been absent from the groundwater that was sampled. Field blanks were collected, analyzed, and reviewed to identify and quantify extrinsic contamination bias. Data derived from source-solution blanks and laboratory quality-control samples also were considered in evaluating potential contamination bias. Eighty-six field-blank samples collected from May 2004 to January 2008 were analyzed for the concentrations of 25 trace elements. Results from these field blanks were used to interpret the data for the 816 samples of untreated groundwater collected over the same period. Constituents analyzed were aluminum (Al), antimony (Sb), arsenic (As), barium (Ba), beryllium (Be), boron (B), cadmium (Cd), chromium (Cr), cobalt (Co), copper (Cu), iron (Fe), lead (Pb), lithium (Li), manganese (Mn), mercury (Hg), molybdenum (Mo), nickel (Ni), selenium (Se), silver (Ag), strontium (Sr), thallium (Tl), tungsten (W), uranium (U), vanadium (V), and zinc (Zn). The detection frequency and the 90th percentile concentration at greater than 90 percent confidence were determined from the field-blank data for each trace element, and these results were compared to each constituent's long-term method detection level (LT-MDL) to determine whether a study reporting level (SRL) was necessary to ensure that no more than 10 percent of the detections in groundwater samples could be attributed solely to contamination bias. Only two of the trace elements analyzed, Li and Se, had zero detections in the 86 field blanks. Ten other trace elements (Sb, As, Be, B, Cd, Co, Mo, Ag, Tl, and U) were detected in fewer than 5 percent of the field blanks. The field-blank results for these constituents did not necessitate establishing SRLs. Of the 13 constituents that were detected in more than 5 percent of the field blanks, six (Al, Ba, Cr, Mn, Hg, and V) had field-blank results that indicated a need for SRLs that were at or below the highest laboratory reporting levels (LRL) used during the sampling period; these SRLs were needed for concentrations between the LT-MDLs and LRLs. The other seven constituents with detection frequencies above 5 percent (Cu, Fe, Pb, Ni, Sr, W, and Zn) had field-blank results that necessitated SRLs greater than the highest LRLs used during the study period. 
SRLs for these seven constituents, each set at the 90th percentile of their concentrations in the field blanks, were at least an order of magnitude below the regulatory thresholds established for drinking water for health or aesthetic purposes; therefore, reporting values below the SRLs as less than or equal to (≤) the measured value would not prevent the identification of values greater than the drinking-water thresholds. The SRLs and drinking-water thresholds, respectively, for these 7 trace elements are Cu (1.7 µg/L and 1,300
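The reporting-level logic described above can be sketched in a few lines (a hedged illustration: the blank concentrations below are made up, not GAMA data, and the censoring convention follows the "less than or equal to the measured value" rule stated in the abstract):

import numpy as np

def study_reporting_level(blank_concentrations, pct=90):
    """SRL set at the given percentile of field-blank concentrations."""
    return float(np.percentile(blank_concentrations, pct))

def apply_srl(sample_values, srl):
    """Report values at or below the SRL as '<=' the measured value."""
    return [f"<= {v}" if v <= srl else f"{v}" for v in sample_values]

blanks = [0.0, 0.2, 0.3, 0.3, 0.5, 0.8, 1.1, 1.4, 1.6, 2.0]   # µg/L, illustrative
srl = study_reporting_level(blanks)
print(srl, apply_srl([0.4, 1.2, 3.5], srl))   # only 3.5 is reported uncensored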
Miranda, Leandro E.; Omer, A.R.; Killgore, K.J.
2017-01-01
The Mississippi Alluvial Valley includes hundreds of floodplain lakes that support unique fish assemblages and high biodiversity. Irrigation practices in the valley have lowered the water table, increasing the cost of pumping water, and necessitating the use of floodplain lakes as a source of water for irrigation. This development has prompted the need to regulate water withdrawals to protect aquatic resources, but it is unknown how much water can be withdrawn from lakes before ecological integrity is compromised. To estimate withdrawal limits, we examined descriptors of lake water quality (i.e., total nitrogen, total phosphorus, turbidity, Secchi visibility, chlorophyll-a) and fish assemblages (species richness, diversity, composition) relative to maximum depth in 59 floodplain lakes. Change-point regression analysis was applied to identify critical depths at which the relationships between depth and lake descriptors exhibited a rapid shift in slope, suggesting possible thresholds. All our water quality and fish assemblage descriptors showed rapid changes relative to depth near 1.2–2.0 m maximum depth. This threshold span may help inform regulatory decisions about water withdrawal limits. Alternatives to explain the triggers of the observed threshold span are considered.
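The change-point idea used above can be illustrated with a simple brute-force search (a hedged sketch, not the authors' statistical model): for each candidate breakpoint, fit two straight lines and keep the breakpoint that minimises the total squared error. The depth and chlorophyll values below are synthetic.

import numpy as np

def change_point(x, y):
    """Return the x value where a two-segment linear fit minimises total SSE."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_bp, best_sse = None, np.inf
    for i in range(2, len(x) - 2):                 # at least 2 points per segment
        sse = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            coeffs = np.polyfit(xs, ys, 1)
            sse += float(np.sum((ys - np.polyval(coeffs, xs)) ** 2))
        if sse < best_sse:
            best_sse, best_bp = sse, x[i]
    return best_bp

depth = np.linspace(0.5, 4.0, 40)                  # maximum lake depth, m (synthetic)
chla = np.where(depth < 1.6, 80 - 30 * depth, 32 - 2 * depth) + np.random.normal(0, 2, 40)
print(change_point(depth, chla))                   # typically lands near the 1.6 m break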
Consensus sediment quality guidelines for polycyclic aromatic hydrocarbon mixtures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swartz, R.C.
1999-04-01
Sediment quality guidelines (SQGs) for polycyclic aromatic hydrocarbons (PAHs) have been derived from a variety of laboratory, field, and theoretical foundations. They include the screening level concentration, effects ranges-low and -median, equilibrium partitioning concentrations, apparent effects threshold, ΣPAH model, and threshold and probable effects levels. The resolution of controversial differences among the PAH SQGs lies in an understanding of the effects of mixtures. Polycyclic aromatic hydrocarbons virtually always occur in field-collected sediment as a complex mixture of covarying compounds. When expressed as a mixture concentration, that is, total PAH (TPAH), the guidelines form three clusters that were intended in their original derivations to represent threshold (TEC = 290 µg/g organic carbon [OC]), median (MEC = 1,800 µg/g OC), and extreme (EEC = 10,000 µg/g OC) effects concentrations. The TEC/MEC/EEC consensus guidelines provide a unifying synthesis of other SQGs, reflect causal rather than correlative effects, account for mixtures, and predict sediment toxicity and benthic community perturbations at sites of PAH contamination. The TEC offers the most useful SQG because PAH mixtures are unlikely to cause adverse effects on benthic ecosystems below the TEC.
Cost-utility analysis of meaning-centered group psychotherapy for cancer survivors.
van der Spek, Nadia; Jansen, Femke; Holtmaat, Karen; Vos, Joël; Breitbart, William; van Uden-Kraan, Cornelia F; Tollenaar, Rob A E M; Cuijpers, Pim; Coupé, Veerle M H; Verdonck-de Leeuw, Irma M
2018-04-06
Meaning-centered group psychotherapy for cancer survivors (MCGP-CS) improves meaning, psychological well-being, and mental adjustment to cancer and reduces psychological distress. This randomized controlled trial was conducted to investigate the cost-utility of MCGP-CS compared with supportive group psychotherapy (SGP) and care-as-usual (CAU). In total, 170 patients were randomized to MCGP-CS, SGP, or CAU. Intervention costs, direct medical and nonmedical costs, productivity losses, and health-related quality of life were measured until 6 months of follow-up, using the TIC-P, PRODISQ, data from the hospital information system, and the EQ-5D. The cost-utility was calculated by comparing mean cumulative costs and quality-adjusted life years (QALYs). Mean total costs ranged from €4492 (MCGP-CS) to €5304 (CAU). Mean QALYs ranged from 0.507 (CAU) to 0.540 (MCGP-CS). MCGP-CS had a probability of 74% of being both less costly and more effective than CAU, and 49% compared with SGP. Sensitivity analyses showed these findings are robust. If society is willing to pay €0 for one gained QALY, MCGP-CS has a 78% probability of being cost-effective compared with CAU. This increases to 85% and 92% at willingness-to-pay thresholds of €10,000 and €30,000, which are commonly accepted thresholds. MCGP-CS is highly likely to be a cost-effective intervention, meaning that there is a positive balance between the costs and gains of MCGP-CS in comparison with SGP and CAU. Copyright © 2018 John Wiley & Sons, Ltd.
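The "probability of being cost-effective at a willingness-to-pay threshold" reported above is conventionally the share of probabilistic draws with positive net monetary benefit. A hedged sketch (the draws below are made up to roughly mimic the point estimates and will not reproduce the reported percentages exactly):

import numpy as np

def prob_cost_effective(delta_cost, delta_qaly, wtp):
    """Share of draws with net monetary benefit wtp*dQALY - dCost > 0."""
    nmb = wtp * np.asarray(delta_qaly) - np.asarray(delta_cost)
    return float(np.mean(nmb > 0))

rng = np.random.default_rng(0)
d_cost = rng.normal(-800, 1500, 5000)    # euros, intervention minus comparator (illustrative)
d_qaly = rng.normal(0.033, 0.04, 5000)   # QALY gain (illustrative)
for wtp in (0, 10_000, 30_000):
    print(wtp, prob_cost_effective(d_cost, d_qaly, wtp))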
Electronic bidirectional valve circuit prevents crossover distortion and threshold effect
NASA Technical Reports Server (NTRS)
Kernick, A.
1966-01-01
Four-terminal network forms a bidirectional valve which will switch or alternate an ac signal without crossover distortion or threshold effect. In this network, an isolated control signal is sufficient for circuit turn-on.
NASA Astrophysics Data System (ADS)
Cavanaugh, K. C.; Kellner, J.; Cook-Patton, S.; Williams, P.; Feller, I. C.; Parker, J.
2014-12-01
Due to limitations of purely correlative species distribution models, there is a need for more integration of experimental approaches when studying impacts of climate change on species distributions. Here we used controlled experiments to identify physiological thresholds that control poleward range limits of three species of mangroves found in North America. We found that all three species exhibited a threshold response to extreme cold, but freeze tolerance thresholds varied among species. From these experiments we developed a climate metric, freeze degree days (FDD), which incorporates both the intensity and frequency of freezes. When included in distribution models, FDD was a better predictor of mangrove presence/absence than other temperature-based metrics. Using 27 years of satellite imagery, we linked FDD to past changes in mangrove abundance in Florida, further supporting the relevance of FDD. We then used downscaled climate projections of FDD to project poleward migration of these range limits over the next 50 years.
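The abstract states that freeze degree days (FDD) combines the intensity and frequency of freezes but does not give the formula here, so the accumulation rule in the sketch below is an assumption: sum the degrees below freezing over all days with sub-freezing minimum temperatures.

def freeze_degree_days(daily_min_temps_c, freeze_point=0.0):
    """Assumed FDD rule: accumulate degrees below the freeze point over freeze days."""
    return sum(freeze_point - t for t in daily_min_temps_c if t < freeze_point)

# One winter of made-up daily minimum temperatures at a range-edge site:
temps = [4.2, 1.0, -1.5, -3.0, 0.5, -0.2, 2.1]
print(freeze_degree_days(temps))   # 1.5 + 3.0 + 0.2 = 4.7 freeze degree days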
Downregulation of cough by exercise and voluntary hyperpnea.
Fontana, Giovanni A
2010-01-01
No information exists on the effects of hyperpnea on the sensory and cognitive aspects of coughing evoked by inhalation of tussigenic agents. The threshold for the cough reflex induced by inhalation of increasing concentrations of ultrasonically nebulized distilled water (fog), an index of cough reflex sensitivity, was assessed in 12 healthy humans in control conditions, during exercise, and during voluntary isocapnic hyperventilation (VIH) to the same level as the exercise. The intensity of the urge-to-cough (UTC), a cognitive component of coughing, was also recorded throughout the trials. The log-log relationship between inhaled fog concentrations and the correspondingly evoked UTC values, an index of the perceptual magnitude of UTC sensitivity, was also calculated. Cough appearance was always assessed audiovisually. At an exercise level of 80% of the anaerobic threshold, the mean cough threshold increased from a control value of 1.03 +/- 0.65 to 2.25 +/- 1.14 ml/min (p < 0.01), i.e., cough sensitivity was downregulated. With VIH, the mean (+/-SD) threshold increased from 1.03 +/- 0.65 to 2.42 +/- 1.16 ml/min (p < 0.01), a similar downregulation. With exercise and VIH compared with control, mean UTC values at the cough threshold were not significantly changed: control, 3.83 +/- 1.11 cm; exercise, 3.12 +/- 0.82 cm; VIH, 4.08 +/- 1.67 cm. Since the slopes of the log fog concentration/log UTC relationship were approximately halved during exercise and VIH compared with control, the UTC sensitivity to fog was depressed (p < 0.01). The results indicate that the adjustments brought into action by exercise-induced or voluntary hyperventilation exert inhibitory influences on the sensory and cognitive components of fog-induced cough.
Desensitization of the cough reflex by exercise and voluntary isocapnic hyperpnea.
Lavorini, Federico; Fontana, Giovanni A; Chellini, Elisa; Magni, Chiara; Duranti, Roberto; Widdicombe, John
2010-05-01
Little is known about the effects of exercise on the sensory and cognitive aspects of coughing evoked by inhalation of tussigenic agents. The threshold for the cough reflex induced by inhalation of increasing nebulizer outputs of ultrasonically nebulized distilled water (fog), an index of cough reflex sensitivity, was assessed in twelve healthy humans in control conditions, during exercise and during voluntary isocapnic hyperpnea (VIH) at the same ventilatory level as the exercise. The intensity of the urge to cough (UTC), a cognitive component of coughing, was recorded throughout the trials on a linear scale. The relationships between inhaled fog nebulizer outputs and the correspondingly evoked UTC values, an index of the perceptual magnitude of the UTC sensitivity, were also calculated. Cough appearance was always assessed audiovisually. At an exercise level of 80% of anaerobic threshold, the median cough threshold was increased from a control value of 0.73 to 2.22 ml/min (P<0.01), i.e., cough sensitivity was downregulated. With VIH, the threshold increased from 0.73 to 2.22 ml/min (P<0.01), a similar downregulation. With exercise and VIH compared with control, mean UTC values at cough threshold were unchanged, i.e., control, 3.83 cm; exercise, 3.12 cm; VIH, 4.08 cm. The relationship of the fog nebulizer output/UTC value was linear in control conditions and logarithmic during both exercise and VIH. The perception of the magnitude of the UTC seems to be influenced by signals or sensations arising from exercising limb and thoracic muscles and/or by higher nervous (cortical) mechanisms. The results indicate that the adjustments brought into action by exercise-induced or voluntary hyperpnea exert inhibitory influences on the sensory and cognitive components of fog-induced cough.
48 CFR 49.504 - Termination of fixed-price contracts for default.
Code of Federal Regulations, 2010 CFR
2010-10-01
... simplified acquisition threshold, if appropriate (e.g., if the acquisition involves items with a history of unsatisfactory quality). (2) Transportation. If the contract is for transportation or transportation-related...
Carmo, João; Ferreira, Jorge; Costa, Francisco; Carmo, Pedro; Cavaco, Diogo; Carvalho, Salomé; Morgado, Francisco; Adragão, Pedro; Mendes, Miguel
2017-10-01
The efficacy and safety of warfarin for stroke prevention in atrial fibrillation (AF) depend on the time in therapeutic range (TTR) with an international normalised ratio (INR) of 2.0-3.0. This meta-analysis focused on the relative efficacy and safety of non-VKA oral anticoagulants (NOACs) compared with warfarin at different thresholds of centre TTR (cTTR). We searched PubMed, Embase, CENTRAL and websites of regulatory agencies, limiting searches to randomized phase 3 trials. Primary outcomes were stroke or systemic embolism (SSE) and major or non-major clinically relevant (NMCR) bleeding. We used a random-effects model to pool effects on outcomes according to different thresholds of cTTR. Four TTR sub-studies with a total of 71,222 patients were included. The benefit of NOACs in reducing SSE compared with warfarin was significantly higher in patients at cTTR <60% (HR 0.79, 95% CI 0.68-0.90) and at 60% to <70% (0.82, 0.71-0.95) but not at ≥70% (1.00, 0.82-1.23), with a significant interaction for cTTR <70% or ≥70% (p=0.042). The risk of major or NMCR bleeding was significantly lower with NOACs compared with warfarin in all sub-groups (0.67, 0.54-0.83 for patients at cTTR <60% and 0.75, 0.63-0.89 at 60% to <70%) except for cTTR ≥70% (HR 0.84, 0.64-1.11), but the interaction for cTTR <70% or ≥70% was not statistically significant (p=0.271). The superiority in efficacy of NOACs compared with warfarin for stroke prevention is lost above a cTTR threshold of approximately 70%, but the relative safety appears to be less modified by the centre-based quality of INR control. Copyright © 2017 Elsevier B.V. All rights reserved.
Assessing the value of mepolizumab for severe eosinophilic asthma: a cost-effectiveness analysis.
Whittington, Melanie D; McQueen, R Brett; Ollendorf, Daniel A; Tice, Jeffrey A; Chapman, Richard H; Pearson, Steven D; Campbell, Jonathan D
2017-02-01
Adding mepolizumab to standard treatment with inhaled corticosteroids and controller medications could decrease asthma exacerbations and use of long-term oral steroids in patients with severe disease and increased eosinophils; however, mepolizumab is costly and its cost effectiveness is unknown. To estimate the cost effectiveness of mepolizumab. A Markov model was used to determine the incremental cost per quality-adjusted life year (QALY) gained for mepolizumab plus standard of care (SoC) and for SoC alone. The population, adults with severe eosinophilic asthma, was modeled for a lifetime time horizon. A responder scenario analysis was conducted to determine the cost effectiveness for a cohort able to achieve and maintain asthma control. Over a lifetime treatment horizon, 23.96 exacerbations were averted per patient receiving mepolizumab plus SoC. Avoidance of exacerbations and decrease in long-term oral steroid use resulted in more than $18,000 in cost offsets among those receiving mepolizumab, but treatment costs increased by more than $600,000. Treatment with mepolizumab plus SoC vs SoC alone resulted in a cost-effectiveness estimate of $386,000 per QALY. To achieve cost effectiveness of approximately $150,000 per QALY, mepolizumab would require a more than 60% price discount. At current pricing, treating a responder cohort yielded cost-effectiveness estimates near $160,000 per QALY. The estimated cost effectiveness of mepolizumab exceeds value thresholds. Achieving these thresholds would require significant discounts from the current list price. Alternatively, treatment limited to responders improves the cost effectiveness toward, but remains still slightly above, these thresholds. Payers interested in improving the efficiency of health care resources should consider negotiations of the mepolizumab price and ways to predict and assess the response to mepolizumab. Copyright © 2016 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
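A hedged arithmetic check of the pricing claim above, using only the rounded figures quoted in the abstract (more than $600,000 in treatment costs, about $18,000 in offsets, a $386,000/QALY result, and a $150,000/QALY target); the Markov model itself is not reproduced, and the back-calculation assumes the entire treatment cost is subject to the discount:

treatment_cost = 600_000.0
offsets = 18_000.0
icer_reported = 386_000.0
target = 150_000.0

delta_qaly = (treatment_cost - offsets) / icer_reported      # QALY gain implied by the reported ICER
# Discount d on the treatment cost needed for the ICER to fall to the target:
required = 1 - (target * delta_qaly + offsets) / treatment_cost
print(f"required discount ~ {required:.0%}")   # about 59-60% with these rounded inputs, consistent with the >60% stated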
Tillotson, S. L.; Fuggle, P. W.; Smith, I.; Ades, A. E.; Grant, D. B.
1994-01-01
OBJECTIVES--To assess whether early treatment of congenital hypothyroidism fully prevents intellectual impairment. DESIGN--A national register of children with congenital hypothyroidism who were compared with unaffected children from the same school classes and matched for age, sex, social class, and first language. SETTING--First three years (1982-4) of a neonatal screening programme in England, Wales, and Northern Ireland. SUBJECTS--361 children with congenital hypothyroidism given early treatment and 315 control children. MAIN OUTCOME MEASURES--Intelligence quotient (IQ) measured at school entry at 5 years of age with the Wechsler preschool and primary scale of intelligence. RESULTS--There was a discontinuous relation between IQ and plasma thyroxine concentration at diagnosis, with a threshold at 42.8 nmol/l (95% confidence interval 35.2 to 47.1 nmol/l). Hypothyroid children with thyroxine values below 42.8 nmol/l had a mean IQ 10.3 points (6.9 to 13.7 points) lower than those with higher values and than controls. None of the measures of quality of treatment (age at start of treatment (range 1-173 days), average thyroxine dose (12-76 micrograms in the first year), average thyroxine concentration during treatment (79-234 nmol/l in the first year), and thyroxine concentration less than 103 nmol/l at least once during the first year) influenced IQ at age 5. CONCLUSIONS--Despite early treatment in congenital hypothyroidism the disease severity has a threshold effect on brain development, probably determined prenatally. The 55% of infants with more severe disease continue to show clinically significant intellectual impairment; infants with milder disease show no such impairment. The findings predict that 10% of early treated infants with severe hypothyroidism, compared with around 40% of those who presented with symptoms in the period before screening began, are likely to require special education. PMID:7920127