Sample records for quality estimation based

  1. Audiovisual quality estimation of mobile phone video cameras with interpretation-based quality approach

    NASA Astrophysics Data System (ADS)

    Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte

    2007-01-01

    We present an effective method for comparing subjective audiovisual quality and the features related to the quality changes of different video cameras. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. Twenty-six observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of the cameras' visual video quality than to features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can complement quantitative quality estimation with audiovisual material as well. The IBQ approach is especially valuable when the induced quality changes are multidimensional.

  2. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
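
    A minimal sketch of the regression-equation approach described above, assuming hypothetical event data: the event pollutant load is regressed, in log space, on rainfall intensity and runoff rate, mirroring the predictors named in the abstract. Variable names, units, and values are illustrative, not the study's data.

```python
import numpy as np

# Hypothetical storm-event data (illustrative only): mean rainfall intensity (mm/h),
# mean runoff rate (L/s), and observed event pollutant load (kg) for each event.
rain_intensity = np.array([4.2, 8.1, 12.5, 6.3, 15.0, 9.7])
runoff_rate = np.array([30.0, 55.0, 90.0, 42.0, 120.0, 70.0])
event_load = np.array([1.1, 2.4, 4.8, 1.7, 6.9, 3.2])

# Log-linear regression: log(load) = b0 + b1*log(intensity) + b2*log(runoff rate).
X = np.column_stack([np.ones_like(rain_intensity),
                     np.log(rain_intensity),
                     np.log(runoff_rate)])
coef, *_ = np.linalg.lstsq(X, np.log(event_load), rcond=None)

def predict_event_load(intensity, runoff):
    """Predict the event pollutant load (kg) for a new storm event."""
    return np.exp(coef[0] + coef[1] * np.log(intensity) + coef[2] * np.log(runoff))

print(predict_event_load(10.0, 80.0))
```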

  3. Sleep Quality Estimation based on Chaos Analysis for Heart Rate Variability

    NASA Astrophysics Data System (ADS)

    Fukuda, Toshio; Wakuda, Yuki; Hasegawa, Yasuhisa; Arai, Fumihito; Kawaguchi, Mitsuo; Noda, Akiko

    In this paper, we propose an algorithm to estimate sleep quality from heart rate variability using chaos analysis. Polysomnography (PSG) is the conventional and reliable system for diagnosing sleep disorders and evaluating their severity and therapeutic effect, estimating sleep quality from multiple channels. However, the recording process requires a lot of time and a controlled measurement environment, and analyzing PSG data is laborious because the large volume of sensed data must be evaluated manually. At the same time, people today make mistakes or cause accidents because of lost regular sleep and disturbed homeostasis. A simple home system for checking one's own sleep is therefore needed, and an estimation algorithm for such a system must be developed. We therefore propose an algorithm that estimates sleep quality based only on heart rate variability, which can be measured in an uncontrolled environment by a simple sensor such as a pressure sensor or an infrared sensor, by experimentally finding the relationship between chaos indices and sleep quality. A system including the estimation algorithm can inform a user of the patterns and quality of his or her daily sleep, so that the user can arrange his or her schedule in advance, pay more attention based on the sleep results, and consult a doctor.

  4. Parameter Search Algorithms for Microwave Radar-Based Breast Imaging: Focal Quality Metrics as Fitness Functions.

    PubMed

    O'Loughlin, Declan; Oliveira, Bárbara L; Elahi, Muhammad Adnan; Glavin, Martin; Jones, Edward; Popović, Milica; O'Halloran, Martin

    2017-12-06

    Inaccurate estimation of average dielectric properties can have a tangible impact on microwave radar-based breast images. Despite this, recent patient imaging studies have used a fixed estimate although this is known to vary from patient to patient. Parameter search algorithms are a promising technique for estimating the average dielectric properties from the reconstructed microwave images themselves without additional hardware. In this work, qualities of accurately reconstructed images are identified from point spread functions. As the qualities of accurately reconstructed microwave images are similar to the qualities of focused microscopic and photographic images, this work proposes the use of focal quality metrics for average dielectric property estimation. The robustness of the parameter search is evaluated using experimental dielectrically heterogeneous phantoms on the three-dimensional volumetric image. Based on a very broad initial estimate of the average dielectric properties, this paper shows how these metrics can be used as suitable fitness functions in parameter search algorithms to reconstruct clear and focused microwave radar images.

  5. Machine-Learning Based Channel Quality and Stability Estimation for Stream-Based Multichannel Wireless Sensor Networks.

    PubMed

    Rehan, Waqas; Fischer, Stefan; Rehan, Maaz

    2016-09-12

    Wireless sensor networks (WSNs) have become more and more diversified and are today able to also support high data rate applications, such as multimedia. In this case, per-packet channel handshaking/switching may result in inducing additional overheads, such as energy consumption, delays and, therefore, data loss. One of the solutions is to perform stream-based channel allocation where channel handshaking is performed once before transmitting the whole data stream. Deciding stream-based channel allocation is more critical in case of multichannel WSNs where channels of different quality/stability are available and the wish for high performance requires sensor nodes to switch to the best among the available channels. In this work, we will focus on devising mechanisms that perform channel quality/stability estimation in order to improve the accommodation of stream-based communication in multichannel wireless sensor networks. For performing channel quality assessment, we have formulated a composite metric, which we call channel rank measurement (CRM), that can demarcate channels into good, intermediate and bad quality on the basis of the standard deviation of the received signal strength indicator (RSSI) and the average of the link quality indicator (LQI) of the received packets. CRM is then used to generate a data set for training a supervised machine learning-based algorithm (which we call Normal Equation based Channel quality prediction (NEC) algorithm) in such a way that it may perform instantaneous channel rank estimation of any channel. Subsequently, two robust extensions of the NEC algorithm are proposed (which we call Normal Equation based Weighted Moving Average Channel quality prediction (NEWMAC) algorithm and Normal Equation based Aggregate Maturity Criteria with Beta Tracking based Channel weight prediction (NEAMCBTC) algorithm), that can perform channel quality estimation on the basis of both current and past values of channel rank estimation. In the end, simulations are made using MATLAB, and the results show that the Extended version of NEAMCBTC algorithm (Ext-NEAMCBTC) outperforms the compared techniques in terms of channel quality and stability assessment. It also minimizes channel switching overheads (in terms of switching delays and energy consumption) for accommodating stream-based communication in multichannel WSNs.
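
    The abstract specifies that CRM combines the standard deviation of the RSSI with the average LQI of received packets, but the exact weighting and class boundaries are not given here; the sketch below is therefore a hedged illustration with assumed normalization constants, equal weights, and invented thresholds.

```python
import numpy as np

def channel_rank(rssi_dbm, lqi):
    """Composite channel-rank score in the spirit of CRM: low RSSI variability and
    high average LQI indicate a good channel.  The normalization constants and
    equal weights below are illustrative assumptions, not the paper's values."""
    rssi_std = np.std(rssi_dbm)                  # stability component (dB)
    lqi_mean = np.mean(lqi)                      # quality component (0-255 scale)
    stability = 1.0 - min(rssi_std / 10.0, 1.0)  # 0 = unstable .. 1 = stable
    quality = lqi_mean / 255.0                   # 0 = poor .. 1 = good
    return 0.5 * stability + 0.5 * quality

def classify(score):
    """Map the score to the good/intermediate/bad classes (thresholds assumed)."""
    if score > 0.75:
        return "good"
    if score > 0.5:
        return "intermediate"
    return "bad"

rssi = [-71, -72, -70, -73, -71, -72]   # dBm of received packets on one channel
lqi = [105, 102, 108, 99, 104, 103]
score = channel_rank(rssi, lqi)
print(score, classify(score))
```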

  6. Machine-Learning Based Channel Quality and Stability Estimation for Stream-Based Multichannel Wireless Sensor Networks

    PubMed Central

    Rehan, Waqas; Fischer, Stefan; Rehan, Maaz

    2016-01-01

    Wireless sensor networks (WSNs) have become more and more diversified and are today able to also support high data rate applications, such as multimedia. In this case, per-packet channel handshaking/switching may result in inducing additional overheads, such as energy consumption, delays and, therefore, data loss. One of the solutions is to perform stream-based channel allocation where channel handshaking is performed once before transmitting the whole data stream. Deciding stream-based channel allocation is more critical in case of multichannel WSNs where channels of different quality/stability are available and the wish for high performance requires sensor nodes to switch to the best among the available channels. In this work, we will focus on devising mechanisms that perform channel quality/stability estimation in order to improve the accommodation of stream-based communication in multichannel wireless sensor networks. For performing channel quality assessment, we have formulated a composite metric, which we call channel rank measurement (CRM), that can demarcate channels into good, intermediate and bad quality on the basis of the standard deviation of the received signal strength indicator (RSSI) and the average of the link quality indicator (LQI) of the received packets. CRM is then used to generate a data set for training a supervised machine learning-based algorithm (which we call Normal Equation based Channel quality prediction (NEC) algorithm) in such a way that it may perform instantaneous channel rank estimation of any channel. Subsequently, two robust extensions of the NEC algorithm are proposed (which we call Normal Equation based Weighted Moving Average Channel quality prediction (NEWMAC) algorithm and Normal Equation based Aggregate Maturity Criteria with Beta Tracking based Channel weight prediction (NEAMCBTC) algorithm), that can perform channel quality estimation on the basis of both current and past values of channel rank estimation. In the end, simulations are made using MATLAB, and the results show that the Extended version of NEAMCBTC algorithm (Ext-NEAMCBTC) outperforms the compared techniques in terms of channel quality and stability assessment. It also minimizes channel switching overheads (in terms of switching delays and energy consumption) for accommodating stream-based communication in multichannel WSNs. PMID:27626429

  7. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras

    PubMed Central

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127

  8. Parameter-based estimation of CT dose index and image quality using an in-house android™-based software

    NASA Astrophysics Data System (ADS)

    Mubarok, S.; Lubis, L. E.; Pawiro, S. A.

    2016-03-01

    A compromise between radiation dose and image quality is essential in the use of CT imaging. The CT dose index (CTDI) is currently the primary dosimetric formalism in CT scanning, while low- and high-contrast resolution are aspects indicating image quality. This study aimed to estimate CTDIvol and image quality measures over a range of exposure parameter variations. CTDI measurements were performed using a PMMA (polymethyl methacrylate) phantom of 16 cm diameter, while the image quality tests were conducted using a Catphan® 600 phantom. CTDI measurements were carried out according to the IAEA TRS 457 protocol in axial scan mode, under varied tube voltage, collimation or slice thickness, and tube current. Image quality tests were conducted under the same exposure parameters as the CTDI measurements. An Android™-based software application was also a result of this study. The software was designed to estimate CTDIvol with a maximum difference from the actual CTDIvol measurement of 8.97%. Image quality can also be estimated through the CNR parameter, with a maximum difference from the actual CNR measurement of 21.65%.
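
    The abstract does not state the estimation formula implemented in the app, so the sketch below only illustrates the standard weighted and volume CTDI definitions used with a 16 cm PMMA phantom under protocols such as IAEA TRS 457; the numbers are made up and the function names are hypothetical.

```python
def ctdi_w(ctdi100_center, ctdi100_periphery):
    """Weighted CTDI (mGy) from the center and averaged peripheral CTDI100
    measurements in a PMMA phantom (standard definition)."""
    return ctdi100_center / 3.0 + 2.0 * ctdi100_periphery / 3.0

def ctdi_vol(ctdi100_center, ctdi100_periphery, pitch=1.0):
    """Volume CTDI (mGy): the weighted CTDI divided by the helical pitch
    (pitch = 1 for the axial scan mode used in the study)."""
    return ctdi_w(ctdi100_center, ctdi100_periphery) / pitch

# Illustrative values only (not measurements from the study).
print(ctdi_vol(ctdi100_center=28.0, ctdi100_periphery=32.0))
```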

  9. Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management

    EPA Science Inventory

    A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...

  10. Improved protein model quality assessments by changing the target function.

    PubMed

    Uziela, Karolis; Menéndez Hurtado, David; Shu, Nanjiang; Wallner, Björn; Elofsson, Arne

    2018-06-01

    Protein modeling quality is an important part of protein structure prediction. We have for more than a decade developed a set of methods for this problem. We have used various types of descriptions of the protein and different machine learning methodologies. However, common to all these methods has been the target function used for training. The target function in ProQ describes the local quality of a residue in a protein model. In all versions of ProQ the target function has been the S-score. However, other quality estimation functions also exist, which can be divided into superposition- and contact-based methods. The superposition-based methods, such as S-score, are based on a rigid body superposition of a protein model and the native structure, while the contact-based methods compare the local environment of each residue. Here, we examine the effects of retraining our latest predictor, ProQ3D, using identical inputs but different target functions. We find that the contact-based methods are easier to predict and that predictors trained on these measures provide some advantages when it comes to identifying the best model. One possible reason for this is that contact-based methods are better at estimating the quality of multi-domain targets. However, training on the S-score gives the best correlation with the GDT_TS score, which is commonly used in CASP to score the global model quality. To take advantage of both of these features, we provide an updated version of ProQ3D that predicts local and global model quality estimates based on different quality estimates. © 2018 Wiley Periodicals, Inc.

  11. Evaluation of quality of precipitation products: A case study using WRF and IMERG data over the central United States

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lin, L. F.; Bras, R. L.

    2017-12-01

    Hydrological applications rely on the availability and quality of precipitation products, especially model- and satellite-based products for use in areas without ground measurements. It is known that the quality of model- and satellite-based precipitation products is complementary: model-based products exhibit high quality during winter, while satellite-based products seem to be better during summer. To explore that behavior, this study uses 2-m air temperature as auxiliary information to evaluate high-resolution (0.1°×0.1° every hour) precipitation products from Weather Research and Forecasting (WRF) simulations and from version-4 Integrated Multi-satellite Retrievals for GPM (IMERG) early and final runs. The products are evaluated relative to the reference NCEP Stage IV precipitation estimates over the central United States in 2016. The results show that the WRF and IMERG final-run estimates are nearly unbiased, while the IMERG early-run estimates are positively biased. The results also show that the WRF estimates exhibit high correlations with the reference data when the temperature falls below 280 K and the IMERG estimates (i.e., both early and final runs) do so when the temperature exceeds 280 K. Moreover, the temperature threshold of 280 K, which distinguishes the quality of the WRF and the IMERG products, does not vary significantly with either season or location. This study not only adds insight into current precipitation research on the quality of precipitation products but also suggests a simple way for choosing either a model- or satellite-based product or a hybrid model/satellite product for applications.
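
    A minimal sketch of the hybrid product selection implied by the abstract: take the model (WRF) estimate where the 2-m air temperature is below the 280 K threshold and the satellite (IMERG) estimate where it is above. The function, grids, and values are illustrative assumptions, not part of the study.

```python
import numpy as np

def hybrid_precip(wrf_precip, imerg_precip, t2m_kelvin, threshold=280.0):
    """Per-pixel selection between model- and satellite-based precipitation
    using 2-m air temperature as auxiliary information.  All arrays must be
    on the same grid and time step."""
    return np.where(t2m_kelvin < threshold, wrf_precip, imerg_precip)

# Tiny illustrative fields (mm/h and K); real inputs would be hourly 0.1-degree grids.
wrf = np.array([[0.5, 1.2], [0.0, 2.3]])
imerg = np.array([[0.7, 0.9], [0.1, 2.0]])
t2m = np.array([[275.0, 284.0], [279.0, 290.0]])
print(hybrid_precip(wrf, imerg, t2m))
```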

  12. Determining the Uncertainties in Prescribed Burn Emissions Through Comparison of Satellite Estimates to Ground-based Estimates and Air Quality Model Evaluations in Southeastern US

    NASA Astrophysics Data System (ADS)

    Odman, M. T.; Hu, Y.; Russell, A. G.

    2016-12-01

    Prescribed burning is practiced throughout the US, and most widely in the Southeast, for the purpose of maintaining and improving the ecosystem, and reducing the wildfire risk. However, prescribed burn emissions contribute significantly to the trace gas and particulate matter loads in the atmosphere. In places where air quality is already stressed by other anthropogenic emissions, prescribed burns can lead to major health and environmental problems. Air quality modeling efforts are under way to assess the impacts of prescribed burn emissions. Operational forecasts of the impacts are also emerging for use in dynamic management of air quality as well as the burns. Unfortunately, large uncertainties exist in the process of estimating prescribed burn emissions and these uncertainties limit the accuracy of the burn impact predictions. Prescribed burn emissions are estimated by using either ground-based information or satellite observations. When there is sufficient local information about the burn area, the types of fuels, their consumption amounts, and the progression of the fire, ground-based estimates are more accurate. In the absence of such information, satellites remain the only reliable source for emission estimation. To determine the level of uncertainty in prescribed burn emissions, we compared estimates derived from a burn permit database and other ground-based information to the estimates by the Biomass Burning Emissions Product derived from a constellation of NOAA and NASA satellites. Using these emissions estimates we conducted simulations with the Community Multiscale Air Quality (CMAQ) model and predicted trace gas and particulate matter concentrations throughout the Southeast for two consecutive burn seasons (2015 and 2016). In this presentation, we will compare model-predicted concentrations to measurements at monitoring stations and evaluate whether the differences are commensurate with our emission uncertainty estimates. We will also investigate whether spatial and temporal patterns in the differences reveal the sources of the uncertainty in the prescribed burn emission estimates.

  13. Genomics meets applied ecology: Characterizing habitat quality for sloths in a tropical agroecosystem.

    PubMed

    Fountain, Emily D; Kang, Jung Koo; Tempel, Douglas J; Palsbøll, Per J; Pauli, Jonathan N; Zachariah Peery, M

    2018-01-01

    Understanding how habitat quality in heterogeneous landscapes governs the distribution and fitness of individuals is a fundamental aspect of ecology. While mean individual fitness is generally considered a key to assessing habitat quality, a comprehensive understanding of habitat quality in heterogeneous landscapes requires estimates of dispersal rates among habitat types. The increasing accessibility of genomic approaches, combined with field-based demographic methods, provides novel opportunities for incorporating dispersal estimation into assessments of habitat quality. In this study, we integrated genomic kinship approaches with field-based estimates of fitness components and approximate Bayesian computation (ABC) procedures to estimate habitat-specific dispersal rates and characterize habitat quality in two-toed sloths (Choloepus hoffmanni) occurring in a Costa Rican agricultural ecosystem. Field-based observations indicated that birth and survival rates were similar in a sparsely shaded cacao farm and adjacent cattle pasture-forest mosaic. Sloth density was threefold higher in pasture compared with cacao, whereas home range size and overlap were greater in cacao compared with pasture. Dispersal rates were similar between the two habitats, as estimated using ABC procedures applied to the spatial distribution of pairs of related individuals identified using 3,431 single nucleotide polymorphism and 11 microsatellite locus genotypes. Our results indicate that crops produced under a sparse overstorey can, in some cases, constitute lower-quality habitat than pasture-forest mosaics for sloths, perhaps because of differences in food resources or predator communities. Finally, our study demonstrates that integrating field-based demographic approaches with genomic methods can provide a powerful means for characterizing habitat quality for animal populations occurring in heterogeneous landscapes. © 2017 John Wiley & Sons Ltd.

  14. Intelligence in Bali--A Case Study on Estimating Mean IQ for a Population Using Various Corrections Based on Theory and Empirical Findings

    ERIC Educational Resources Information Center

    Rindermann, Heiner; te Nijenhuis, Jan

    2012-01-01

    A high-quality estimate of the mean IQ of a country requires giving a well-validated test to a nationally representative sample, which usually is not feasible in developing countries. So, we used a convenience sample and four corrections based on theory and empirical findings to arrive at a good-quality estimate of the mean IQ in Bali. Our study…

  15. A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado

    USGS Publications Warehouse

    Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.

    2007-01-01

    Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.

  16. An Evidence-Based Approach to Estimating the National and State Costs of PreK-3rd. FCD Policy Brief Advancing PK-3rd. No.10

    ERIC Educational Resources Information Center

    Picus, Lawrence O.; Odden, Allan; Goetz, Michael

    2009-01-01

    This study estimates the costs of providing a high-quality PreK-3rd education approach in all 50 states plus the District of Columbia. Relying on an Evidence-Based approach to school finance adequacy, it identifies the staffing resources needed to offer high-quality integrated PreK-3rd programs and then estimates the costs of those resources. By…

  17. Researches of fruit quality prediction model based on near infrared spectrum

    NASA Astrophysics Data System (ADS)

    Shen, Yulin; Li, Lian

    2018-04-01

    With the improvement in standards for food quality and safety, people pay more attention to the internal quality of fruit, so measuring fruit internal quality is increasingly imperative. Nondestructive analysis of soluble solid content (SSC) and total acid content (TAC) is vital and effective for quality measurement in global fresh produce markets, so in this paper we aim at establishing a novel fruit internal quality prediction model for SSC and TAC based on near-infrared spectra. First, prediction models based on PCA + BP neural network, PCA + GRNN, PCA + BP AdaBoost strong classifier, PCA + ELM, and PCA + LS_SVM classifiers are designed and implemented. Then, in the NSCT domain, a median filter and a Savitzky-Golay filter are used to preprocess the spectral signal, and the Kennard-Stone algorithm is used to automatically select the training and test samples. Third, we obtain the optimal models by comparing 15 kinds of prediction model under a multi-classifier competition mechanism; specifically, nonparametric estimation is introduced to measure the effectiveness of each model, the reliability and variance of the nonparametric evaluation of each prediction model are used to assess the prediction results, and the estimated value and confidence interval serve as a reference. The experimental results demonstrate that this scheme achieves a better evaluation of the internal quality of fruit. Finally, we employ cat swarm optimization to optimize the two best models obtained from the nonparametric estimation, and empirical testing indicates that the proposed method provides more accurate and effective results than other forecasting methods.
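
    As an illustration of one of the model variants listed above (spectral smoothing followed by PCA and a BP-style neural-network regressor), the sketch below uses synthetic spectra and standard SciPy/scikit-learn components. The NSCT-domain filtering, Kennard-Stone sample selection, nonparametric model comparison, and cat swarm optimization are omitted, and all data are fabricated placeholders.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for NIR spectra (samples x wavelengths) and SSC values (°Brix).
rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 256)).cumsum(axis=1)
ssc = 12.0 + 0.01 * spectra[:, 100] + rng.normal(scale=0.1, size=120)

# Savitzky-Golay smoothing of each spectrum, as in the preprocessing step.
smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)

# PCA for dimensionality reduction followed by a BP-style neural-network regressor.
model = make_pipeline(PCA(n_components=10),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0))

X_train, X_test, y_train, y_test = train_test_split(smoothed, ssc, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out fruit samples:", model.score(X_test, y_test))
```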

  18. Site quality relationships for shortleaf pine

    Treesearch

    David L. Graney

    1986-01-01

    Existing information about site quality relationships for shortleaf pine (Pinus echinata Mill.) in the southeastern United States is reviewed in this paper. Estimates of site quality, whether from direct tree measurements or indirect estimates based on soil and site features, are only local observations for many points on the landscape. To be of value to the land...

  19. Statistical Methods and Sampling Design for Estimating Step Trends in Surface-Water Quality

    USGS Publications Warehouse

    Hirsch, Robert M.

    1988-01-01

    This paper addresses two components of the problem of estimating the magnitude of step trends in surface water quality. The first is finding a robust estimator appropriate to the data characteristics expected in water-quality time series. The J. L. Hodges-E. L. Lehmann class of estimators is found to be robust in comparison to other nonparametric and moment-based estimators. A seasonal Hodges-Lehmann estimator is developed and shown to have desirable properties. Second, the effectiveness of various sampling strategies is examined using Monte Carlo simulation coupled with application of this estimator. The simulation is based on a large set of total phosphorus data from the Potomac River. To assure that the simulated records have realistic properties, the data are modeled in a multiplicative fashion incorporating flow, hysteresis, seasonal, and noise components. The results demonstrate the importance of balancing the length of the two sampling periods and balancing the number of data values between the two periods.
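
    A minimal sketch of the Hodges-Lehmann step-trend estimate (the median of all pairwise post-minus-pre differences), together with a simple seasonal variant that only pairs observations from matching seasons. The seasonal handling and the example concentrations are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np
from itertools import product

def hodges_lehmann_step(before, after):
    """Hodges-Lehmann estimate of a step change: the median of all pairwise
    differences between post-change and pre-change observations."""
    return float(np.median([a - b for b, a in product(before, after)]))

def seasonal_hodges_lehmann_step(before, after, seasons_before, seasons_after):
    """Seasonal variant (illustrative): pair only observations from the same
    season, then take the median of the pooled differences."""
    diffs = []
    for s in set(seasons_before) & set(seasons_after):
        b = [x for x, sb in zip(before, seasons_before) if sb == s]
        a = [x for x, sa in zip(after, seasons_after) if sa == s]
        diffs.extend(x - y for y, x in product(b, a))
    return float(np.median(diffs))

# Illustrative total-phosphorus concentrations (mg/L) before and after a change.
before = [0.12, 0.15, 0.10, 0.18, 0.14]
after = [0.09, 0.11, 0.08, 0.13, 0.10]
print(hodges_lehmann_step(before, after))
```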

  20. A quality-based cost model for new electronic systems and products

    NASA Astrophysics Data System (ADS)

    Shina, Sammy G.; Saigal, Anil

    1998-04-01

    This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.

  1. A Comparison of Turbidity-Based and Streamflow-Based Estimates of Suspended-Sediment Concentrations in Three Chesapeake Bay Tributaries

    USGS Publications Warehouse

    Jastram, John D.; Moyer, Douglas; Hyer, Kenneth

    2009-01-01

    Fluvial transport of sediment into the Chesapeake Bay estuary is a persistent water-quality issue with major implications for the overall health of the bay ecosystem. Accurately and precisely estimating the suspended-sediment concentrations (SSC) and loads that are delivered to the bay, however, remains challenging. Although manual sampling of SSC produces an accurate series of point-in-time measurements, robust extrapolation to unmeasured periods (especially highflow periods) has proven to be difficult. Sediment concentrations typically have been estimated using regression relations between individual SSC values and associated streamflow values; however, suspended-sediment transport during storm events is extremely variable, and it is often difficult to relate a unique SSC to a given streamflow. With this limitation for estimating SSC, innovative approaches for generating detailed records of suspended-sediment transport are needed. One effective method for improved suspended-sediment determination involves the continuous monitoring of turbidity as a surrogate for SSC. Turbidity measurements are theoretically well correlated to SSC because turbidity represents a measure of water clarity that is directly influenced by suspended sediments; thus, turbidity-based estimation models typically are effective tools for generating SSC data. The U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency Chesapeake Bay Program and Virginia Department of Environmental Quality, initiated continuous turbidity monitoring on three major tributaries of the bay - the James, Rappahannock, and North Fork Shenandoah Rivers - to evaluate the use of turbidity as a sediment surrogate in rivers that deliver sediment to the bay. Results of this surrogate approach were compared to the traditionally applied streamflow-based approach for estimating SSC. Additionally, evaluation and comparison of these two approaches were conducted for nutrient estimations. Results demonstrate that the application of turbidity-based estimation models provides an improved method for generating a continuous record of SSC, relative to the classical approach that uses streamflow as a surrogate for SSC. Turbidity-based estimates of SSC were found to be more accurate and precise than SSC estimates from streamflow-based approaches. The turbidity-based SSC estimation models explained 92 to 98 percent of the variability in SSC, while streamflow-based models explained 74 to 88 percent of the variability in SSC. Furthermore, the mean absolute error of turbidity-based SSC estimates was 50 to 87 percent less than the corresponding values from the streamflow-based models. Statistically significant differences were detected between the distributions of residual errors and estimates from the two approaches, indicating that the turbidity-based approach yields estimates of SSC with greater precision than the streamflow-based approach. Similar improvements were identified for turbidity-based estimates of total phosphorus, which is strongly related to turbidity because total phosphorus occurs predominantly in particulate form. Total nitrogen estimation models based on turbidity and streamflow generated estimates of similar quality, with the turbidity-based models providing slight improvements in the quality of estimations. This result is attributed to the understanding that nitrogen transport is dominated by dissolved forms that relate less directly to streamflow and turbidity. 
Improvements in concentration estimation resulted in improved estimates of load. Turbidity-based suspended-sediment loads estimated for the James River at Cartersville, VA, monitoring station exhibited tighter confidence interval bounds and a coefficient of variation of 12 percent, compared with a coefficient of variation of 38 percent for the streamflow-based load.
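
    A minimal sketch of the turbidity-surrogate idea described above: a log-log regression of SSC on turbidity, with a Duan smearing factor to correct retransformation bias. The paired values are invented for illustration, and the bias-correction choice is an assumption rather than the authors' documented procedure.

```python
import numpy as np

# Paired manual samples (illustrative): turbidity (FNU) and measured SSC (mg/L).
turbidity = np.array([5.0, 12.0, 30.0, 80.0, 150.0, 400.0])
ssc = np.array([8.0, 18.0, 45.0, 130.0, 260.0, 700.0])

# Fit log10(SSC) = b0 + b1 * log10(turbidity), the usual surrogate-model form.
b1, b0 = np.polyfit(np.log10(turbidity), np.log10(ssc), deg=1)

# Duan smearing factor to correct bias when retransforming from log space.
residuals = np.log10(ssc) - (b0 + b1 * np.log10(turbidity))
smearing = np.mean(10.0 ** residuals)

def estimate_ssc(turbidity_fnu):
    """Estimate SSC (mg/L) from a continuous turbidity record."""
    return smearing * 10.0 ** (b0 + b1 * np.log10(turbidity_fnu))

print(estimate_ssc(np.array([10.0, 100.0, 250.0])))
```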

  2. A Value-Added Estimate of Higher Education Quality of US States

    ERIC Educational Resources Information Center

    Zhang, Lei

    2009-01-01

    States differ substantially in higher education policies. Little is known about the effects of state policies on the performance of public colleges and universities, largely because no clear measures of college quality exist. In this paper, I estimate the average quality of public colleges of US states based on the value-added to individuals'…

  3. Facial motion parameter estimation and error criteria in model-based image coding

    NASA Astrophysics Data System (ADS)

    Liu, Yunhai; Yu, Lu; Yao, Qingdong

    2000-04-01

    Model-based image coding has received extensive attention due to its high subjective image quality and low bit rates. However, estimating object motion parameters is still a difficult problem, and there are no suitable error criteria for quality assessment that are consistent with visual properties. This paper presents an algorithm for facial motion parameter estimation based on feature point correspondence and gives motion parameter error criteria. The facial motion model comprises three parts: the global 3-D rigid motion of the head, non-rigid translation motion in the jaw area, and local non-rigid expression motion in the eye and mouth areas. The feature points are automatically selected by a function of edges, brightness, and end nodes outside the blocks of the eyes and mouth, and the number of feature points is adjusted adaptively. The jaw translation motion is tracked through the changes in the positions of the jaw feature points. The areas of non-rigid expression motion can be rebuilt using a block-pasting method. An approach for estimating the motion parameter error based on the quality of the reconstructed image is suggested, with an area error function and a contour transition-turn-rate error function used as quality criteria. These criteria properly reflect the image geometric distortion caused by errors in the estimated motion parameters.

  4. Practical Considerations for Optic Nerve Estimation in Telemedicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karnowski, Thomas Paul; Aykac, Deniz; Chaum, Edward

    The projected increase in diabetes in the United States and worldwide has created a need for broad-based, inexpensive screening for diabetic retinopathy (DR), an eye disease which can lead to vision impairment. A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening. In this work we report on the effect of quality estimation on an optic nerve (ON) detection method with a confidence metric. We report on an improvement of the fusion technique using a data set from an ophthalmologist's practice, then show the results of the method as a function of image quality on a set of images from an on-line telemedicine network collected in Spring 2009 and another broad-based screening program. We show that the fusion method, combined with quality estimation processing, can improve detection performance and also provide a method for utilizing a physician-in-the-loop for images that may exceed the capabilities of automated processing.

  5. Improved Satellite-based Photosynthetically Active Radiation (PAR) for Air Quality Studies

    NASA Astrophysics Data System (ADS)

    Pour Biazar, A.; McNider, R. T.; Cohan, D. S.; White, A.; Zhang, R.; Dornblaser, B.; Doty, K.; Wu, Y.; Estes, M. J.

    2015-12-01

    One of the challenges in understanding the air quality over forested regions has been the uncertainties in estimating the biogenic hydrocarbon emissions. Biogenic volatile organic compounds, BVOCs, play a critical role in atmospheric chemistry, particularly in ozone and particulate matter (PM) formation. In southeastern United States, BVOCs (mostly as isoprene) are the dominant summertime source of reactive hydrocarbon. Despite significant efforts in improving BVOC estimates, the errors in emission inventories remain a concern. Since BVOC emissions are particularly sensitive to the available photosynthetically active radiation (PAR), model errors in PAR result in large errors in emission estimates. Thus, utilization of satellite observations to estimate PAR can help in reducing emission uncertainties. Satellite-based PAR estimates rely on the technique used to derive insolation from satellite visible brightness measurements. In this study we evaluate several insolation products against surface pyranometer observations and offer a bias correction to generate a more accurate PAR product. The improved PAR product is then used in biogenic emission estimates. The improved biogenic emission estimates are compared to the emission inventories over Texas and used in air quality simulation over the period of August-September 2013 (NASA's Discover-AQ field campaign). A series of sensitivity simulations will be performed and evaluated against Discover-AQ observations to test the impact of satellite-derived PAR on air quality simulations.

  6. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    NASA Technical Reports Server (NTRS)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109

  7. Improving best-phase image quality in cardiac CT by motion correction with MAM optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohkohl, Christopher; Bruder, Herbert; Stierstorfer, Karl

    2013-03-15

    Purpose: Research in image reconstruction for cardiac CT aims at using motion correction algorithms to improve the image quality of the coronary arteries. The key to those algorithms is motion estimation, which is currently based on 3-D/3-D registration to align the structures of interest in images acquired in multiple heart phases. The need for an extended scan data range covering several heart phases is critical in terms of radiation dose to the patient and limits the clinical potential of the method. Furthermore, literature reports only slight quality improvements of the motion corrected images when compared to the most quiet phase (best-phase) that was actually used for motion estimation. In this paper a motion estimation algorithm is proposed which does not require an extended scan range but works with a short scan data interval, and which markedly improves the best-phase image quality. Methods: Motion estimation is based on the definition of motion artifact metrics (MAM) to quantify motion artifacts in a 3-D reconstructed image volume. The authors use two different MAMs: entropy and positivity. By adjusting the motion field parameters, the MAM of the resulting motion-compensated reconstruction is optimized using a gradient descent procedure. In this way motion artifacts are minimized. For a fast and practical implementation, only analytical methods are used for motion estimation and compensation. Both the MAM-optimization and a 3-D/3-D registration-based motion estimation algorithm were investigated by means of a computer-simulated vessel with a cardiac motion profile. Image quality was evaluated using normalized cross-correlation (NCC) with the ground truth template and root-mean-square deviation (RMSD). Four coronary CT angiography patient cases were reconstructed to evaluate the clinical performance of the proposed method. Results: For the MAM-approach, the best-phase image quality could be improved for all investigated heart phases, with a maximum improvement of the NCC value by 100% and of the RMSD value by 81%. The corresponding maximum improvements for the registration-based approach were 20% and 40%. In phases with very rapid motion the registration-based algorithm obtained better image quality, while the image quality of the MAM algorithm was superior in phases with less motion. The image quality improvement of the MAM optimization was visually confirmed for the different clinical cases. Conclusions: The proposed method allows a software-based best-phase image quality improvement in coronary CT angiography. A short scan data interval at the target heart phase is sufficient; no additional scan data in other cardiac phases are required. The algorithm is therefore directly applicable to any standard cardiac CT acquisition protocol.
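
    A minimal sketch of motion artifact metrics in the spirit of the MAM approach described above: histogram entropy plus a positivity penalty on a reconstructed volume, which a gradient-descent loop over the motion-field parameters would minimize. The weights, bin count, and HU bound are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def image_entropy(volume, n_bins=256):
    """Entropy of the grey-value histogram of a reconstructed volume; motion
    artifacts (streaks, blurring) spread the histogram and raise the entropy."""
    hist, _ = np.histogram(volume, bins=n_bins, density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    return float(-np.sum(p * np.log(p)))

def positivity_penalty(volume_hu, lower=-1024.0):
    """Penalize attenuation values below the physical lower bound (air), which
    typically arise from motion artifacts."""
    return float(np.sum(np.clip(lower - volume_hu, 0.0, None)))

def motion_artifact_metric(volume_hu, w_entropy=1.0, w_positivity=1e-4):
    # Weights are illustrative; a motion-estimation loop would re-run the
    # motion-compensated reconstruction and descend on this metric.
    return (w_entropy * image_entropy(volume_hu)
            + w_positivity * positivity_penalty(volume_hu))

vol = np.random.default_rng(1).normal(loc=0.0, scale=200.0, size=(32, 32, 32))
print(motion_artifact_metric(vol))
```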

  8. An index approach to performance-based payments for water quality.

    PubMed

    Maille, Peter; Collins, Alan R

    2012-05-30

    In this paper we describe elements of a field research project that presented farmers with economic incentives to control nitrate runoff. The approach used is novel in that payments are based on ambient water quality and water quantity produced by a watershed rather than proxies for water quality conservation. Also, payments are made based on water quality relative to a control watershed, and therefore, account for stochastic fluctuations in background nitrate levels. Finally, the program pays farmers as a group to elicit team behavior. We present our approach to modeling that allowed us to estimate prices for water and resulting payment levels. We then compare these preliminary estimates to the actual values recorded over 33 months of fieldwork. We find that our actual payments were 29% less than our preliminary estimates, due in part to the failure of our ecological model to estimate discharge accurately. Despite this shortfall, the program attracted the participation of 53% of the farmers in the watershed, and resulted in substantial nitrate abatement activity. Given this favorable response, we propose that research efforts focus on implementing field trials of group-level performance-based payments. Ideally these programs would be low risk and control for naturally occurring contamination. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Spatio-temporal water quality mapping from satellite images using geographically and temporally weighted regression

    NASA Astrophysics Data System (ADS)

    Chu, Hone-Jay; Kong, Shish-Jeng; Chang, Chih-Hua

    2018-03-01

    The turbidity (TB) of a water body varies with time and space. Water quality is traditionally estimated via linear regression based on satellite images. However, estimating and mapping water quality require a spatio-temporal nonstationary model, while TB mapping necessitates the use of geographically and temporally weighted regression (GTWR) and geographically weighted regression (GWR) models, both of which are more precise than linear regression. Given the temporal nonstationary models for mapping water quality, GTWR offers the best option for estimating regional water quality. Compared with GWR, GTWR provides highly reliable information for water quality mapping, boasts a relatively high goodness of fit, improves the explanation of variance from 44% to 87%, and shows a sufficient space-time explanatory power. The seasonal patterns of TB and the main spatial patterns of TB variability can be identified using the estimated TB maps from GTWR and by conducting an empirical orthogonal function (EOF) analysis.
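
    A minimal sketch of a GTWR-style local fit: observations are weighted by a Gaussian kernel on a combined spatio-temporal distance, and a weighted least-squares solution is computed at the target location and time. Bandwidths, variable names, and data are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def gtwr_coefficients(x0, y0, t0, coords, times, X, y, h_space=5.0, h_time=30.0):
    """Local regression coefficients at location (x0, y0) and time t0 for
    geographically and temporally weighted regression.  The spatial and temporal
    bandwidths here are arbitrary; in practice they are chosen by cross-validation."""
    d_space2 = (coords[:, 0] - x0) ** 2 + (coords[:, 1] - y0) ** 2
    d_time2 = (times - t0) ** 2
    w = np.exp(-(d_space2 / h_space ** 2 + d_time2 / h_time ** 2))  # Gaussian kernel
    W = np.diag(w)
    Xd = np.column_stack([np.ones(len(y)), X])            # intercept + predictors
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)   # weighted least squares

# Illustrative data: a satellite band reflectance as predictor of turbidity (NTU).
rng = np.random.default_rng(2)
coords = rng.uniform(0.0, 10.0, size=(50, 2))
times = rng.uniform(0.0, 365.0, size=50)
band = rng.uniform(0.01, 0.2, size=50)
turbidity = 100.0 * band + rng.normal(scale=1.0, size=50)
print(gtwr_coefficients(5.0, 5.0, 180.0, coords, times, band[:, None], turbidity))
```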

  10. Validation of Ocean Color Remote Sensing Reflectance Using Autonomous Floats

    NASA Technical Reports Server (NTRS)

    Gerbi, Gregory P.; Boss, Emanuel; Werdell, P. Jeremy; Proctor, Christopher W.; Haentjens, Nils; Lewis, Marlon R.; Brown, Keith; Sorrentino, Diego; Zaneveld, J. Ronald V.; Barnard, Andrew H.; hide

    2016-01-01

    The use of autonomous profiling floats for observational estimates of radiometric quantities in the ocean is explored, and the use of this platform for validation of satellite-based estimates of remote sensing reflectance in the ocean is examined. This effort includes comparing quantities estimated from float and satellite data at nominal wavelengths of 412, 443, 488, and 555 nm, and examining sources and magnitudes of uncertainty in the float estimates. This study had 65 occurrences of coincident high-quality observations from floats and MODIS Aqua and 15 occurrences of coincident high-quality observations from floats and the Visible Infrared Imaging Radiometer Suite (VIIRS). The float estimates of remote sensing reflectance are similar to the satellite estimates, with disagreement of a few percent in most wavelengths. The variability of the float-satellite comparisons is similar to the variability of in situ-satellite comparisons using a validation dataset from the Marine Optical Buoy (MOBY). This, combined with the agreement of float-based and satellite-based quantities, suggests that floats are likely a good platform for validation of satellite-based estimates of remote sensing reflectance.

  11. Use of a macroinvertebrate based biotic index to estimate critical metal concentrations for good ecological water quality.

    PubMed

    Van Ael, Evy; De Cooman, Ward; Blust, Ronny; Bervoets, Lieven

    2015-01-01

    Large datasets from total and dissolved metal concentrations in Flemish (Belgium) fresh water systems and the associated macroinvertebrate-based biotic index MMIF (Multimetric Macroinvertebrate Index Flanders) were used to estimate critical metal concentrations for good ecological water quality, as imposed by the European Water Framework Directive (2000). The contribution of different stressors (metals and water characteristics) to the MMIF were studied by constructing generalized linear mixed effect models. Comparison between estimated critical concentrations and the European and Flemish EQS, shows that the EQS for As, Cd, Cu and Zn seem to be sufficient to reach a good ecological quality status as expressed by the invertebrate-based biotic index. In contrast, the EQS for Cr, Hg and Pb are higher than the estimated critical concentrations, which suggests that when environmental concentrations are at the same level as the EQS a good quality status might not be reached. The construction of mixed models that included metal concentrations in their structure did not lead to a significant outcome. However, mixed models showed the primary importance of water characteristics (oxygen level, temperature, ammonium concentration and conductivity) for the MMIF. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Cost-effectiveness of lobectomy versus genetic testing (Afirma®) for indeterminate thyroid nodules: Considering the costs of surveillance.

    PubMed

    Balentine, Courtney J; Vanness, David J; Schneider, David F

    2018-01-01

    We evaluated whether diagnostic thyroidectomy for indeterminate thyroid nodules would be more cost-effective than genetic testing after including the costs of long-term surveillance. We used a Markov decision model to estimate the cost-effectiveness of thyroid lobectomy versus genetic testing (Afirma®) for evaluation of indeterminate (Bethesda 3-4) thyroid nodules. The base case was a 40-year-old woman with a 1-cm indeterminate nodule. Probabilities and estimates of utilities were obtained from the literature. Cost estimates were based on Medicare reimbursements with a 3% discount rate for costs and quality-adjusted life-years. During a 5-year period after the diagnosis of indeterminate thyroid nodules, lobectomy was less costly and more effective than Afirma® (lobectomy: $6,100; 4.50 quality-adjusted life-years vs Afirma®: $9,400; 4.47 quality-adjusted life-years). Only in 253 of 10,000 simulations (2.5%) did Afirma® show a net benefit at a cost-effectiveness threshold of $100,000 per quality-adjusted life-year. There was only a 0.3% probability of Afirma® being cost saving and a 14.9% probability of improving quality-adjusted life-years. Our base case estimate suggests that diagnostic lobectomy dominates genetic testing as a strategy for ruling out malignancy of indeterminate thyroid nodules. These results, however, were highly sensitive to estimates of utilities after lobectomy and living under surveillance after Afirma®. Published by Elsevier Inc.

  13. Concentrations, loads, and yields of total phosphorus, total nitrogen, and suspended sediment and bacteria concentrations in the Wister Lake Basin, Oklahoma and Arkansas, 2011-13

    USGS Publications Warehouse

    Buck, Stephanie D.

    2014-01-01

    The Poteau Valley Improvement Authority uses Wister Lake in southeastern Oklahoma as a public water supply. Total phosphorus, total nitrogen, and suspended sediments from agricultural runoff and discharges from wastewater treatment plants and other sources have degraded water quality in the lake. As lake-water quality has degraded, water-treatment cost, chemical usage, and sludge production have increased for the Poteau Valley Improvement Authority. The U.S. Geological Survey (USGS), in cooperation with the Poteau Valley Improvement Authority, investigated and summarized concentrations of total phosphorus, total nitrogen, suspended sediment, and bacteria (Escherichia coli and Enterococcus sp.) in surface water flowing to Wister Lake. Estimates of total phosphorus, total nitrogen, and suspended sediment loads, yields, and flow-weighted mean concentrations of total phosphorus and total nitrogen concentrations were made for the Wister Lake Basin for a 3-year period from October 2010 through September 2013. Data from water samples collected at fixed time increments during base-flow conditions and during runoff conditions at the Poteau River at Loving, Okla. (USGS station 07247015), the Poteau River near Heavener, Okla. (USGS station 07247350), and the Fourche Maline near Leflore, Okla. (USGS station 07247650), water-quality stations were used to evaluate water quality over the range of streamflows in the basin. These data also were collected to estimate annual constituent loads and yields by using regression models. At the Poteau River stations, total phosphorus, total nitrogen, and suspended sediment concentrations in surface-water samples were significantly larger in samples collected during runoff conditions than in samples collected during base-flow conditions. At the Fourche Maline station, in contrast, concentrations of these constituents in water samples collected during runoff conditions were not significantly larger than concentrations during base-flow conditions. Flow-weighted mean total phosphorus concentrations at all three stations from 2011 to 2013 were several times larger than the Oklahoma State Standard for Scenic Rivers (0.037 milligrams per liter [mg/L]), with the largest flow-weighted phosphorus concentrations typically being measured at the Poteau River at Loving, Okla., station. Flow-weighted mean total nitrogen concentrations did not vary substantially between the Poteau River stations and the Fourche Maline near Leflore, Okla., station. At all of the sampled water-quality stations, bacteria (Escherichia coli and Enterococcus sp.) concentrations were substantially larger in water samples collected during runoff conditions than in water samples collected during base-flow conditions from 2011 to 2013. Estimated annual loads of total phosphorus, total nitrogen, and suspended sediment in the Poteau River stations during runoff conditions ranged from 82 to 98 percent of the total annual loads of those constituents. Estimated annual loads of total phosphorus, total nitrogen, and suspended sediment in the Fourche Maline during runoff conditions ranged from 86 to nearly 100 percent of the total annual loads. Estimated seasonal total phosphorus loads generally were smallest during base-flow and runoff conditions in autumn. Estimated seasonal total phosphorus loads during base-flow conditions tended to be largest in winter and during runoff conditions tended to be largest in the spring. 
Estimated seasonal total nitrogen loads tended to be smallest in autumn during base-flow and runoff conditions and largest in winter during runoff conditions. Estimated seasonal suspended sediment loads tended to be smallest during base-flow conditions in the summer and smallest during runoff conditions in the autumn. The largest estimated seasonal suspended sediment loads during runoff conditions typically were in the spring. The estimated mean annual total phosphorus yield was largest at the Poteau River at Loving, Okla., water-quality station. The estimated mean annual total phosphorus yield was largest during base flow at the Poteau River at Loving, Okla., water-quality station and at both of the Poteau River water-quality stations during runoff conditions. The estimated mean annual total nitrogen yields were largest at the Poteau River water-quality stations. Estimated mean annual total nitrogen yields were largest during base-flow and runoff conditions at the Poteau River at Loving, Okla., water-quality station. The estimated mean annual suspended sediment yield was largest at the Poteau River near Heavener, Okla., water-quality station during base-flow and runoff conditions. Flow-weighted mean concentrations indicated that total phosphorus inputs from the Poteau River Basin in the Wister Lake Basin were larger than from the Fourche Maline Basin. Flow-weighted mean concentrations of total nitrogen did not vary spatially in a consistent manner. The Poteau River and the Fourche Maline contributed estimated annual total phosphorus loads of 137 to 278 tons per year (tons/yr) to Wister Lake. Between 89 and 95 percent of the annual total phosphorus loads were transported to Wister Lake during runoff conditions. The Poteau River and the Fourche Maline contributed estimated annual total nitrogen loads of 657 to 1,294 tons/yr, with 86 to 94 percent of the annual total nitrogen loads being transported to Wister Lake during runoff conditions. The Poteau River and the Fourche Maline contributed estimated annual total suspended sediment loads of 110,919 to 234,637 tons/yr, with 94 to 99 percent of the annual suspended sediment loads being transported to Wister Lake during runoff conditions. Most of the total phosphorus and suspended sediment were delivered to Wister Lake during runoff conditions in the spring. The majority of the total nitrogen was delivered to Wister Lake during runoff conditions in winter.

  14. Estimation and Fusion for Tracking Over Long-Haul Links Using Artificial Neural Networks

    DOE PAGES

    Liu, Qiang; Brigham, Katharine; Rao, Nageswara S. V.

    2017-02-01

    In a long-haul sensor network, sensors are remotely deployed over a large geographical area to perform certain tasks, such as tracking and/or monitoring of one or more dynamic targets. A remote fusion center fuses the information provided by these sensors so that a final estimate of certain target characteristics – such as the position – is expected to possess much improved quality. In this paper, we pursue learning-based approaches for estimation and fusion of target states in long-haul sensor networks. In particular, we consider learning based on various implementations of artificial neural networks (ANNs). Finally, the joint effect of (i) imperfect communication conditions, namely, link-level loss and delay, and (ii) computation constraints, in the form of low-quality sensor estimates, on ANN-based estimation and fusion is investigated by means of analytical and simulation studies.

  15. Estimation and Fusion for Tracking Over Long-Haul Links Using Artificial Neural Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Qiang; Brigham, Katharine; Rao, Nageswara S. V.

    In a long-haul sensor network, sensors are remotely deployed over a large geographical area to perform certain tasks, such as tracking and/or monitoring of one or more dynamic targets. A remote fusion center fuses the information provided by these sensors so that a final estimate of certain target characteristics – such as the position – is expected to possess much improved quality. In this paper, we pursue learning-based approaches for estimation and fusion of target states in long-haul sensor networks. In particular, we consider learning based on various implementations of artificial neural networks (ANNs). Finally, the joint effect of (i) imperfect communication conditions, namely, link-level loss and delay, and (ii) computation constraints, in the form of low-quality sensor estimates, on ANN-based estimation and fusion is investigated by means of analytical and simulation studies.

  16. Use of NEXRAD radar-based observations for quality control of in-situ rain gauge measurements

    NASA Astrophysics Data System (ADS)

    Nelson, B. R.; Prat, O.; Leeper, R.

    2017-12-01

    Rain gauge quality control is an often overlooked but important step in the archiving of historical precipitation estimates. We investigate the possibilities that exist for using ground-based radar networks for quality control of rain gauge measurements. This process includes point-to-pixel comparisons of the rain gauge measurements with NEXRAD observations. Two NEXRAD-based data sets are used for reference: the NCEP Stage IV and the NWS MRMS gridded data sets. The NCEP Stage IV data set is available at 4-km, hourly resolution for the period 2002 to present and includes the radar-gauge bias-adjusted precipitation estimate. The NWS MRMS data set includes several different variables such as reflectivity, radar-only estimates, a precipitation flag, and radar-gauge bias-adjusted precipitation estimates. The latter product provides much more information to support quality control, such as identification of precipitation type, identification of storm type, and the Z-R relation. In addition, some of the variables are available at the 5-minute scale. The rain gauge networks that are investigated are the Climate Reference Network (CRN), the Fischer-Porter Hourly Precipitation Database (HPD), and the Hydrometeorological Automated Data System (HADS). The CRN network is available at the 5-minute scale, the HPD network is available at the 15-minute and hourly scales, and HADS is available at the hourly scale. The varying scales present challenges for comparisons. However, given the higher-resolution radar-based products, we identify an optimal combination of rain gauge observations that can be compared to the radar-based information. The quality control process focuses on identifying faulty gauges in direct comparison, while a deeper investigation focuses on event-based differences from light rain to extreme storms.
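
    A minimal illustration of a point-to-pixel check is to compare each gauge's accumulation with the collocated radar grid cell and flag hours (or gauges) whose ratio is implausible. The thresholds and data below are hypothetical, not the criteria used in the study.

```python
import numpy as np

# Hypothetical hourly accumulations (mm) for one gauge and the collocated
# NEXRAD-based grid cell (e.g., Stage IV or MRMS), aligned to a common hourly clock.
gauge_mm = np.array([0.0, 1.2, 3.4, 0.0, 0.5, 12.0, 0.0, 2.1])
radar_mm = np.array([0.0, 1.0, 3.1, 0.2, 0.6, 10.5, 0.0, 2.4])

def gauge_radar_flags(gauge, radar, ratio_bounds=(0.5, 2.0), min_rain=1.0):
    """Flag hours where the gauge/radar ratio falls outside plausible bounds.

    Only hours with meaningful rainfall in both records are compared, so that
    trace amounts do not dominate the ratio. Thresholds are illustrative.
    """
    wet = (gauge >= min_rain) & (radar >= min_rain)
    ratio = np.full_like(gauge, np.nan, dtype=float)
    ratio[wet] = gauge[wet] / radar[wet]
    suspect = wet & ((ratio < ratio_bounds[0]) | (ratio > ratio_bounds[1]))
    return ratio, suspect

ratio, suspect = gauge_radar_flags(gauge_mm, radar_mm)
print("hourly gauge/radar ratios:", np.round(ratio, 2))
print("suspect hours:", np.where(suspect)[0])
```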

  17. Quantitative estimates of the impact of sensitivity and specificity in mammographic screening in Germany.

    PubMed Central

    Warmerdam, P G; de Koning, H J; Boer, R; Beemsterboer, P M; Dierks, M L; Swart, E; Robra, B P

    1997-01-01

    STUDY OBJECTIVE: To estimate quantitatively the impact of the quality of mammographic screening (in terms of sensitivity and specificity) on the effects and costs of nationwide breast cancer screening. DESIGN: Three plausible "quality" scenarios for a biennial breast cancer screening programme for women aged 50-69 in Germany were analysed in terms of costs and effects using the Microsimulation Screening Analysis model on breast cancer screening and the natural history of breast cancer. Firstly, sensitivity and specificity in the expected situation (or "baseline" scenario) were estimated from a model based analysis of empirical data from 35,000 screening examinations in two German pilot projects. In the second "high quality" scenario, these properties were based on the more favourable diagnostic results from breast cancer screening projects and the nationwide programme in The Netherlands. Thirdly, a worst case, "low quality" hypothetical scenario with a 25% lower sensitivity than that experienced in The Netherlands was analysed. SETTING: The epidemiological and social situation in Germany in relation to mass screening for breast cancer. RESULTS: In the "baseline" scenario, an 11% reduction in breast cancer mortality was expected in the total German female population, ie 2100 breast cancer deaths would be prevented per year. It was estimated that the "high quality" scenario, based on Dutch experience, would lead to the prevention of an additional 200 deaths per year and would also cut the number of false positive biopsy results by half. The cost per life year gained varied from Deutsche mark (DM) 15,000 on the "high quality" scenario to DM 21,000 in the "low quality" setting. CONCLUSIONS: Up to 20% of the total costs of a screening programme can be spent on quality improvement in order to achieve a substantially higher reduction in mortality and reduce undesirable side effects while retaining the same cost effectiveness ratio as that estimated from the German data. PMID:9196649

  18. Beef quality parameters estimation using ultrasound and color images

    PubMed Central

    2015-01-01

    Background Beef quality measurement is a complex task with high economic impact. There is high interest in obtaining an automatic estimation of quality parameters in live cattle or post mortem. In this paper we set out to obtain beef quality estimates from the analysis of ultrasound (in vivo) and color images (post mortem), with the measurement of various parameters related to tenderness and amount of meat: rib eye area, percentage of intramuscular fat, and backfat thickness or subcutaneous fat. Proposal An algorithm based on curve evolution is implemented to calculate the rib eye area. The backfat thickness is estimated from the profile of distances between two curves that delimit the steak and the rib eye, previously detected. A model based on Support Vector Regression (SVR) is trained to estimate the intramuscular fat percentage. A series of features extracted from a region of interest, previously detected in both ultrasound and color images, was proposed. In all cases, a complete evaluation was performed with different databases including: color and ultrasound images acquired by a beef industry expert, intramuscular fat estimation obtained by an expert using commercial software, and chemical analysis. Conclusions The proposed algorithms show good results for calculating the rib eye area and the backfat thickness measure and profile. They are also promising in predicting the percentage of intramuscular fat. PMID:25734452
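
    For the intramuscular fat step, a hedged sketch of an SVR regression on image-derived features is shown below using scikit-learn; the feature set and hyperparameters are placeholders, not the ones used by the authors.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical texture/intensity features extracted from a rib-eye region of
# interest (rows = images), paired with intramuscular fat (%) from chemical analysis.
X = np.random.rand(40, 6)          # e.g., mean gray level, contrast, entropy, ...
y = 2.0 + 6.0 * X[:, 0] + np.random.normal(scale=0.5, size=40)

# Scale features and fit an epsilon-SVR, analogous to the paper's regression step.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.2))
model.fit(X, y)

new_features = np.random.rand(1, 6)
print("Predicted intramuscular fat (%):", model.predict(new_features)[0])
```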

  19. LESS: Link Estimation with Sparse Sampling in Intertidal WSNs

    PubMed Central

    Ji, Xiaoyu; Chen, Yi-chao; Li, Xiaopeng; Xu, Wenyuan

    2018-01-01

    Deploying wireless sensor networks (WSN) in the intertidal area is an effective approach for environmental monitoring. To sustain reliable data delivery in such a dynamic environment, a link quality estimation mechanism is crucial. However, our observations in two real WSN systems deployed in the intertidal areas reveal that link update in routing protocols often suffers from energy and bandwidth waste due to the frequent link quality measurement and updates. In this paper, we carefully investigate the network dynamics using real-world sensor network data and find it feasible to achieve accurate estimation of link quality using sparse sampling. We design and implement a compressive-sensing-based link quality estimation protocol, LESS, which incorporates both spatial and temporal characteristics of the system to aid the link update in routing protocols. We evaluate LESS in both real WSN systems and a large-scale simulation, and the results show that LESS can reduce energy and bandwidth consumption by up to 50% while still achieving more than 90% link quality estimation accuracy. PMID:29494557

  20. Methodological Issues Surrounding the Use of Baseline Health-Related Quality of Life Data to Inform Trial-Based Economic Evaluations of Interventions Within Emergency and Critical Care Settings: A Systematic Literature Review.

    PubMed

    Dritsaki, Melina; Achana, Felix; Mason, James; Petrou, Stavros

    2017-05-01

    Trial-based cost-utility analyses require health-related quality of life data that generate utility values in order to express health outcomes in terms of quality-adjusted life years (QALYs). Assessments of baseline health-related quality of life are problematic where trial participants are incapacitated or critically ill at the time of randomisation. This review aims to identify and critique methods for handling non-availability of baseline health-related quality of life data in trial-based cost-utility analyses within emergency and critical illness settings. A systematic literature review was conducted, following PRISMA guidelines, to identify trial-based cost-utility analyses of interventions within emergency and critical care settings. Databases searched included the National Institute for Health Research (NIHR) Journals Library (1991-July 2016), Cochrane Library (all years); National Health Service (NHS) Economic Evaluation Database (all years) and Ovid MEDLINE/Embase (without time restriction). Strategies employed to handle non-availability of baseline health-related quality of life data in final QALY estimations were identified and critiqued. A total of 4224 published reports were screened, 19 of which met the study inclusion criteria (mean trial size 1670): 14 (74 %) from the UK, four (21%) from other European countries and one (5%) from India. Twelve studies (63%) were based in emergency departments and seven (37%) in intensive care units. Only one study was able to elicit patient-reported health-related quality of life at baseline. To overcome the lack of baseline data when estimating QALYs, eight studies (42%) assigned a fixed utility weight corresponding to either death, an unconscious health state or a country-specific norm to patients at baseline, four (21%) ignored baseline utilities, three (16%) applied values from another study, one (5%) generated utility values via retrospective recall and one (5%) elicited utilities from experts. A preliminary exploration of these methods shows that incremental QALY estimation is unlikely to be biased if balanced trial allocation is achieved and subsequent collection of health-related quality of life data occurs at the earliest possible opportunity following commencement of treatment, followed by an adequate number of follow-up assessments. Trial-based cost-utility analyses within emergency and critical illness settings have applied different methods for QALY estimation, employing disparate assumptions about the health-related quality of life of patients at baseline. Where baseline measurement is not practical, measurement at the earliest opportunity following commencement of treatment should minimise bias in QALY estimation.
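
    As background, QALYs in such analyses are typically computed as the area under the utility curve over follow-up, which is why the assumed baseline utility matters. The sketch below, with hypothetical utilities, shows how two common baseline assumptions change the QALY total.

```python
import numpy as np

def qalys(times_years, utilities):
    """QALYs as the area under the utility curve (trapezoidal rule)."""
    return np.trapz(utilities, times_years)

# Hypothetical follow-up at baseline, 3, 6 and 12 months (in years).
t = np.array([0.0, 0.25, 0.5, 1.0])

# Two ways of handling the missing baseline utility for a critically ill patient:
# assign a fixed low value, or carry back the first post-treatment measurement.
utilities_fixed_baseline = np.array([0.0, 0.55, 0.68, 0.74])
utilities_carried_back   = np.array([0.55, 0.55, 0.68, 0.74])

print("QALYs, fixed baseline of 0:", round(qalys(t, utilities_fixed_baseline), 3))
print("QALYs, baseline carried back:", round(qalys(t, utilities_carried_back), 3))
```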

  1. Non-invasive Fetal ECG Signal Quality Assessment for Multichannel Heart Rate Estimation.

    PubMed

    Andreotti, Fernando; Graser, Felix; Malberg, Hagen; Zaunseder, Sebastian

    2017-12-01

    The noninvasive fetal ECG (NI-FECG) from abdominal recordings offers novel prospects for prenatal monitoring. However, NI-FECG signals are corrupted by various nonstationary noise sources, making the processing of abdominal recordings a challenging task. In this paper, we present an online approach that dynamically assesses the quality of NI-FECG to improve fetal heart rate (FHR) estimation. Using a naive Bayes classifier, state-of-the-art and novel signal quality indices (SQIs), and an existing adaptive Kalman filter, FHR estimation was improved. For the purpose of training and validating the proposed methods, a large annotated private clinical dataset was used. The suggested classification scheme demonstrated accuracy, measured by Krippendorff's alpha, in determining the overall quality of NI-FECG signals. The proposed Kalman filter outperformed alternative methods in FHR estimation accuracy. The proposed algorithm was able to reliably reflect changes of signal quality and can be used to improve FHR estimation. NI-FECG signal quality estimation and multichannel information fusion are largely unexplored topics. Based on previous works, multichannel FHR estimation is a field that could strongly benefit from such methods. The developed SQI algorithms as well as the resulting classifier were made available under a GNU GPL open-source license and contributed to the FECGSYN toolbox.
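
    A hedged sketch of the classification step is given below: a Gaussian naive Bayes classifier trained on per-segment signal quality indices, whose posterior probability could then gate or weight channels during heart rate fusion. The SQI features and values are invented for illustration and are not those of the paper.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical per-segment signal quality indices (SQIs) for abdominal channels,
# e.g., spectral concentration around the fetal QRS band, kurtosis, baseline power,
# with a clinician-provided good/bad label per segment.
rng = np.random.default_rng(0)
sqi_good = rng.normal(loc=[0.8, 4.0, 0.1], scale=0.1, size=(50, 3))
sqi_bad = rng.normal(loc=[0.4, 2.0, 0.5], scale=0.15, size=(50, 3))
X = np.vstack([sqi_good, sqi_bad])
y = np.array([1] * 50 + [0] * 50)   # 1 = usable segment, 0 = too noisy

clf = GaussianNB().fit(X, y)

# The posterior probability of "usable" could then gate or weight each channel's
# heart-rate estimate inside a Kalman-filter fusion step.
new_segment = np.array([[0.7, 3.5, 0.2]])
print("P(usable):", clf.predict_proba(new_segment)[0, 1])
```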

  2. A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.

    PubMed

    Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa

    2016-05-17

    Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record-pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record-pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss kappa statistic was 0.601). This method is presented as a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large project linkages where the number of record pairs produced may be very large, often running into the millions.
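
    The core calculation can be illustrated as follows: scale the clerically reviewed match rate in each score bin up to the full bin counts, then combine bins above and below the cut-off into precision and recall estimates. The bin counts below are hypothetical.

```python
import numpy as np

# Hypothetical clerical review of record-pairs sampled from score bins, including
# a bin below the acceptance cut-off. For each bin we know the total number of
# pairs, how many were sampled, and how many sampled pairs were true matches.
bins = [
    # total pairs in bin, whether the bin is above the cut-off, sample size, true matches in sample
    dict(total=100_000, accepted=True,  sampled=200, true=196),
    dict(total=50_000,  accepted=True,  sampled=200, true=170),
    dict(total=80_000,  accepted=False, sampled=200, true=30),
]

tp = fp = fn = 0.0
for b in bins:
    match_rate = b["true"] / b["sampled"]          # estimated from clerical review
    est_matches = match_rate * b["total"]          # scaled up to the whole bin
    if b["accepted"]:
        tp += est_matches
        fp += b["total"] - est_matches
    else:
        fn += est_matches                          # missed matches below the cut-off

precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(f"Estimated precision: {precision:.3f}, recall: {recall:.3f}")
```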

  3. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements in real-time PCR-based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, the transgene copy number is compared with the reference gene without a standard curve, but rather directly from the fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise, with a proper statistical estimation. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays including transfection efficiency analysis and pathogen quantification.
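
    For the first (external calibration curve) design, a minimal sketch is shown below: fit a linear standard curve of Ct against log10 copy number, derive amplification efficiency from the slope, and propagate replicate variability into an approximate interval on the copy estimate. The values are illustrative, and the statistical treatment is far simpler than the models in the paper.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured on serial dilutions of a
# template with known copy number (first experimental design in the paper).
log10_copies = np.array([2, 3, 4, 5, 6], dtype=float)
ct_standards = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

# Linear regression Ct = a + b * log10(copies); a slope near -3.32 implies ~100% efficiency.
b, a = np.polyfit(log10_copies, ct_standards, 1)
efficiency = 10 ** (-1.0 / b) - 1.0

def copies_from_ct(ct):
    """Point estimate of copy number from a sample Ct via the standard curve."""
    return 10 ** ((ct - a) / b)

# Replicate Cts for a putative transgenic event; an interval on the mean Ct
# propagates into an interval on copy number, in the spirit of the paper's
# call for proper confidence intervals.
sample_ct = np.array([23.6, 23.8, 23.5])
mean_ct = sample_ct.mean()
sem = sample_ct.std(ddof=1) / np.sqrt(len(sample_ct))
print("Amplification efficiency:", round(efficiency, 3))
print("Copy estimate:", round(copies_from_ct(mean_ct)))
print("Approx. 95% interval:",
      [round(copies_from_ct(mean_ct + 2 * sem)), round(copies_from_ct(mean_ct - 2 * sem))])
```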

  4. Probability-based estimates of site-specific copper water quality criteria for the Chesapeake Bay, USA.

    PubMed

    Arnold, W Ray; Warren-Hicks, William J

    2007-01-01

    The object of this study was to estimate site- and region-specific dissolved copper criteria for a large embayment, the Chesapeake Bay, USA. The intent is to show the utility of 2 copper saltwater quality site-specific criteria estimation models and associated region-specific criteria selection methods. The criteria estimation models and selection methods are simple, efficient, and cost-effective tools for resource managers. The methods are proposed as potential substitutes for the US Environmental Protection Agency's water effect ratio methods. Dissolved organic carbon data and the copper criteria models were used to produce probability-based estimates of site-specific copper saltwater quality criteria. Site- and date-specific criteria estimations were made for 88 sites (n = 5,296) in the Chesapeake Bay. The average and range of estimated site-specific chronic dissolved copper criteria for the Chesapeake Bay were 7.5 and 5.3 to 16.9 microg Cu/L. The average and range of estimated site-specific acute dissolved copper criteria for the Chesapeake Bay were 11.7 and 8.3 to 26.4 microg Cu/L. The results suggest that applicable national and state copper criteria can increase in much of the Chesapeake Bay and remain protective. Virginia Department of Environmental Quality copper criteria near the mouth of the Chesapeake Bay, however, need to decrease to protect species of equal or greater sensitivity to that of the marine mussel, Mytilus sp.

  5. Adaptive compressed sensing of multi-view videos based on the sparsity estimation

    NASA Astrophysics Data System (ADS)

    Yang, Senlin; Li, Xilong; Chong, Xin

    2017-11-01

    Conventional compressive sensing of videos is based on non-adaptive linear projections, and the number of measurements is usually set empirically. As a result, the quality of video reconstruction is always affected. Firstly, block-based compressed sensing (BCS) with the conventional selection of compressive measurements was described. Then an estimation method for the sparsity of multi-view videos was proposed based on the two-dimensional discrete wavelet transform (2D DWT). With an energy threshold given beforehand, the DWT coefficients were processed with both energy normalization and sorting in descending order, and the sparsity of the multi-view video can be obtained from the proportion of dominant coefficients. Finally, the simulation result shows that the method can estimate the sparsity of a video frame effectively and provides a practical basis for selecting the number of compressive measurements. The result also shows that, since the number of measurements is selected based on the sparsity estimated with the given energy threshold, the proposed method can ensure the reconstruction quality of multi-view videos.

  6. Simplification of a light-based model for estimating final internode length in greenhouse cucumber canopies.

    PubMed

    Kahlen, Katrin; Stützel, Hartmut

    2011-10-01

    Light quantity and quality affect internode lengths in cucumber (Cucumis sativus), whereby leaf area and the optical properties of the leaves mainly control light quality within a cucumber plant community. This modelling study aimed at providing a simple, non-destructive method to predict final internode lengths (FILs) using light quantity and leaf area data. Several simplifications of a light quantity and quality sensitive model for estimating FILs in cucumber have been tested. The direct simplifications replace the term for the red : far-red (R : FR) ratio with a term for (a) the leaf area index (LAI, m² m⁻²) or (b) partial LAI, the cumulative leaf area per m² of ground, where leaf area per m² of ground is accumulated from the top of each plant until a number, n, of leaves per plant is reached. The indirect simplifications estimate the input R : FR ratio based on partial leaf area and plant density. In all models, simulated FILs were in line with the measured FILs over various canopy architectures and light conditions, but the prediction quality varied. The indirect simplification based on the leaf area of ten leaves revealed the best fit with measured data. Its prediction quality was even higher than that of the original model. This study showed that for vertically trained cucumber plants, leaf area data can substitute for local light quality data when estimating FIL data. In unstressed canopies, leaf area over the upper ten ranks seems to represent the feedback of the growing architecture on internode elongation with respect to light quality. This highlights the role of this domain of leaves as the primary source for the specific R : FR signal controlling the final length of an internode and could therefore guide future research on up-scaling local processes to the crop level.

  7. Single image super-resolution reconstruction algorithm based on edge selection

    NASA Astrophysics Data System (ADS)

    Zhang, Yaolan; Liu, Yijun

    2017-05-01

    Super-resolution (SR) has become more important because it can generate high-quality high-resolution (HR) images from low-resolution (LR) input images. At present, a lot of work concentrates on developing sophisticated image priors to improve image quality, while paying much less attention to estimating and incorporating the blur model, which can also affect the reconstruction results. We present a new reconstruction method based on edge selection. This method takes full account of the factors that affect blur kernel estimation and accurately estimates the blur process. Compared with state-of-the-art methods, our method has comparable performance.

  8. Development of the Quality of Australian Nursing Documentation in Aged Care (QANDAC) instrument to assess paper-based and electronic resident records.

    PubMed

    Wang, Ning; Björvell, Catrin; Hailey, David; Yu, Ping

    2014-12-01

    To develop the Quality of Australian Nursing Documentation in Aged Care (QANDAC) instrument to measure the quality of paper-based and electronic resident records. The instrument was based on the nursing process model and on three attributes of documentation quality identified in a systematic review. The development process involved five phases following approaches to designing criterion-referenced measures. The face and content validities and the inter-rater reliability of the instrument were estimated using a focus group approach and consensus model. The instrument contains 34 questions in three sections: completion of nursing history and assessment, description of the care process, and meeting the requirements of data entry. Estimates of the validity and inter-rater reliability of the instrument gave satisfactory results. The QANDAC instrument may be a useful audit tool for quality improvement and research in aged care documentation. © 2013 ACOTA.

  9. A probabilistic-based approach to monitoring tool wear state and assessing its effect on workpiece quality in nickel-based alloys

    NASA Astrophysics Data System (ADS)

    Akhavan Niaki, Farbod

    The objective of this research is first to investigate the applicability and advantage of statistical state estimation methods for predicting tool wear in machining nickel-based superalloys over deterministic methods, and second to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among those classes of materials that are known as hard-to-machine alloys. These materials exhibit a unique combination of maintaining their strength at high temperature and have high resistance to corrosion and creep. These unique characteristics make them an ideal candidate for harsh environments like combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate the cutting tool wear and increase the possibility of the in-process tool breakage. A blunt tool nominally deteriorates the surface integrity and damages quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure or leaving a poor roughness profile behind. As a consequence in this case, the expensive superalloy would have to be scrapped. The current dominant solution for industry is to sacrifice the productivity rate by replacing the tool in the early stages of its life or to choose conservative cutting conditions in order to lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims to first introduce a probabilistic-based framework for estimating tool wear in milling and turning of superalloys and second to study the detrimental effects of functional state of the cutting tool in terms of wear and wear rate on part quality. In the milling operation, the mechanisms of tool failure were first identified and, based on the rapid catastrophic failure of the tool, a Bayesian inference method (i.e., Markov Chain Monte Carlo, MCMC) was used for parameter calibration of tool wear using a power mechanistic model. The calibrated model was then used in the state space probabilistic framework of a Kalman filter to estimate the tool flank wear. Furthermore, an on-machine laser measuring system was utilized and fused into the Kalman filter to improve the estimation accuracy. In the turning operation the behavior of progressive wear was investigated as well. Due to the nonlinear nature of wear in turning, an extended Kalman filter was designed for tracking progressive wear, and the results of the probabilistic-based method were compared with a deterministic technique, where significant improvement (more than 60% increase in estimation accuracy) was achieved. To fulfill the second objective of this research in understanding the underlying effects of wear on part quality in cutting nickel-based superalloys, a comprehensive study on surface roughness, dimensional integrity and residual stress was conducted. The estimated results derived from a probabilistic filter were used for finding the proper correlations between wear, surface roughness and dimensional integrity, along with a finite element simulation for predicting the residual stress profile for sharp and worn cutting tool conditions. 
The output of this research provides the essential information on condition monitoring of the tool and its effects on product quality. The low-cost Hall effect sensor used in this work to capture spindle power in the context of the stochastic filter can effectively estimate tool wear in both milling and turning operations, while the estimated wear can be used to generate knowledge of the state of workpiece surface integrity. Therefore the true functionality and efficiency of the tool in superalloy machining can be evaluated without additional high-cost sensing.
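
    A minimal sketch of the state-estimation idea, not the authors' calibrated model, is a one-state Kalman filter that propagates flank wear with a nominal wear-rate model and corrects it with a noisy wear observation derived from sensing (for example, spindle power or an on-machine measurement). All constants below are illustrative.

```python
import numpy as np

# One-state Kalman filter tracking flank wear w_k (mm), assuming a locally
# linear wear-rate model w_k = w_{k-1} + r*dt plus process noise, and a noisy
# wear observation per time step. The values are placeholders.
dt, wear_rate = 1.0, 0.004          # time step (min), nominal wear rate (mm/min)
Q, R = 1e-6, 4e-4                   # process and measurement noise variances

def kalman_wear(measurements, w0=0.0, p0=1e-3):
    w, p, history = w0, p0, []
    for z in measurements:
        # Predict: propagate wear with the nominal rate model.
        w_pred, p_pred = w + wear_rate * dt, p + Q
        # Update: blend the prediction with the noisy wear observation z.
        k = p_pred / (p_pred + R)
        w, p = w_pred + k * (z - w_pred), (1.0 - k) * p_pred
        history.append(w)
    return np.array(history)

rng = np.random.default_rng(1)
true_wear = 0.004 * np.arange(1, 61)
noisy_obs = true_wear + rng.normal(scale=0.02, size=60)
estimates = kalman_wear(noisy_obs)
print("final true wear:", round(true_wear[-1], 3), "estimated:", round(estimates[-1], 3))
```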

  10. Use of satellite telemetry for study of a gyrfalcon in Greenland

    USGS Publications Warehouse

    Klugman, S.S.; Fuller, M.R.; Howey, P.W.; Yates, M.A.; Oar, J.J.; Seegar, J.M.; Seegar, W.S.; Mattox, G.M.; Maechtle, T.L.

    1993-01-01

    Long-term research in Greenland has yielded 18 years of incidental sightings and 2 years of surveys and observations of gyrfalcons (Falco rusticolus) around Sondrestromfjord, Greenland. Gyrfalcons nest on cliffs along fjords and near rivers and lakes throughout our 2590 sq. km study area. Nestlings are present mid-June to July. In 1990, we marked one adult female gyrfalcon with a 65 g radio-transmitter to obtain location estimates via the ARGOS polar orbiting satellite system. The unit transmitted 8 hours/day every two days. We obtained 145 locations during 5 weeks of the nestling and fledgling stage of breeding. We collected 1-9 locations/day, with a mean of 4/day. We calculated home range estimates based on the Minimum Convex Polygon (MCP) and Harmonic Mean (HM) methods and tested subsets of the data based on location quality and number of transmission hours per day. Home range estimated by MCP using higher-quality locations was approximately 589 sq. km. Home range estimates were larger when lower-quality locations were included in the estimates. Estimates based on data collected for 4 hours/day were similar to those for 8 hours/day. In the future, it might be possible to extend battery life of the transmitters by reducing the number of transmission hours/day. A longer-lived transmitter could provide information on movements and home ranges throughout the year.
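
    For reference, the Minimum Convex Polygon estimate is simply the area of the convex hull of the location fixes. The sketch below uses random planar coordinates in place of real ARGOS fixes and assumes lower-quality location classes have already been filtered out.

```python
import numpy as np
from scipy.spatial import ConvexHull

# Hypothetical satellite location estimates projected to planar coordinates (km).
# In practice, lower-quality ARGOS location classes would be filtered out first,
# which is the subsetting step described in the abstract.
rng = np.random.default_rng(7)
locations_km = rng.normal(loc=[0.0, 0.0], scale=[12.0, 10.0], size=(145, 2))

hull = ConvexHull(locations_km)
# For 2-D input, ConvexHull.volume is the enclosed area and .area is the perimeter.
print(f"Minimum Convex Polygon home range: {hull.volume:.0f} sq. km")
```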

  11. Assessment and modeling of the groundwater hydrogeochemical quality parameters via geostatistical approaches

    NASA Astrophysics Data System (ADS)

    Karami, Shawgar; Madani, Hassan; Katibeh, Homayoon; Fatehi Marj, Ahmad

    2018-03-01

    Geostatistical methods are among the advanced techniques used for interpolation of groundwater quality data. The results obtained from geostatistics will be useful for decision makers to adopt suitable remedial measures to protect the quality of groundwater sources. Data used in this study were collected from 78 wells in the Varamin plain aquifer, located southeast of Tehran, Iran, in 2013. The ordinary kriging method was used in this study to evaluate groundwater quality parameters. Seven main quality parameters (i.e., total dissolved solids (TDS), sodium adsorption ratio (SAR), electrical conductivity (EC), sodium (Na+), total hardness (TH), chloride (Cl-) and sulfate (SO4^2-)) have been analyzed and interpreted by statistical and geostatistical methods. After data normalization by the Nscore method in WinGslib software, variography was compiled as a geostatistical tool to describe spatial structure, and experimental variograms were plotted in GS+ software. Then, the best theoretical model was fitted to each variogram based on the minimum RSS. The cross-validation method was used to determine the accuracy of the estimated data. Eventually, estimation maps of groundwater quality were prepared in WinGslib software, and an estimation variance map and an estimation error map were presented to evaluate the quality of estimation at each estimated point. Results showed that the kriging method is more accurate than traditional interpolation methods.
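
    A comparable ordinary-kriging workflow can be reproduced with open-source tooling; the sketch below uses the pykrige package with a spherical variogram, which parallels, but is not, the WinGslib/GS+ workflow of the study. The coordinates and TDS values are synthetic.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging   # pykrige is one open-source alternative
                                          # to the WinGslib/GS+ workflow in the paper

# Hypothetical well coordinates (km) and one quality parameter, e.g., TDS (mg/L).
rng = np.random.default_rng(3)
x, y = rng.uniform(0, 40, 78), rng.uniform(0, 30, 78)
tds = 800 + 15 * x + 10 * y + rng.normal(scale=120, size=78)

# Ordinary kriging with a spherical variogram model, as in the study.
ok = OrdinaryKriging(x, y, tds, variogram_model="spherical")
grid_x, grid_y = np.linspace(0, 40, 81), np.linspace(0, 30, 61)
z_est, z_var = ok.execute("grid", grid_x, grid_y)   # estimates and kriging variance

print("estimated TDS at grid corner:", round(float(z_est[0, 0]), 1))
print("kriging variance there:", round(float(z_var[0, 0]), 1))
```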

  12. Full-reference quality assessment of stereoscopic images by learning binocular receptive field properties.

    PubMed

    Shao, Feng; Li, Kemeng; Lin, Weisi; Jiang, Gangyi; Yu, Mei; Dai, Qionghai

    2015-10-01

    Quality assessment of 3D images encounters more challenges than its 2D counterparts. Directly applying 2D image quality metrics is not the solution. In this paper, we propose a new full-reference quality assessment for stereoscopic images by learning binocular receptive field properties to be more in line with human visual perception. To be more specific, in the training phase, we learn a multiscale dictionary from the training database, so that the latent structure of images can be represented as a set of basis vectors. In the quality estimation phase, we compute sparse feature similarity index based on the estimated sparse coefficient vectors by considering their phase difference and amplitude difference, and compute global luminance similarity index by considering luminance changes. The final quality score is obtained by incorporating binocular combination based on sparse energy and sparse complexity. Experimental results on five public 3D image quality assessment databases demonstrate that in comparison with the most related existing methods, the devised algorithm achieves high consistency with subjective assessment.

  13. Estimation of the global burden of mesothelioma deaths from incomplete national mortality data.

    PubMed

    Odgerel, Chimed-Ochir; Takahashi, Ken; Sorahan, Tom; Driscoll, Tim; Fitzmaurice, Christina; Yoko-O, Makoto; Sawanyawisuth, Kittisak; Furuya, Sugio; Tanaka, Fumihiro; Horie, Seichi; Zandwijk, Nico van; Takala, Jukka

    2017-12-01

    Mesothelioma is increasingly recognised as a global health issue and the assessment of its global burden is warranted. To descriptively analyse national mortality data and to use reported and estimated data to calculate the global burden of mesothelioma deaths. For the study period of 1994 to 2014, we grouped 230 countries into 59 countries with quality mesothelioma mortality data suitable to be used for reference rates, 45 countries with poor quality data and 126 countries with no data, based on the availability of data in the WHO Mortality Database. To estimate global deaths, we extrapolated the gender-specific and age-specific mortality rates of the countries with quality data to all other countries. The global numbers and rates of mesothelioma deaths have increased over time. The 59 countries with quality data recorded 15 011 mesothelioma deaths per year over the 3 most recent years with available data (equivalent to 9.9 deaths per million per year). From these reference data, we extrapolated the global mesothelioma deaths to be 38 400 per year, based on extrapolations for asbestos use. Although the validity of our extrapolation method depends on the adequate identification of quality mesothelioma data and appropriate adjustment for other variables, our estimates can be updated, refined and verified because they are based on commonly accessible data and are derived using a straightforward algorithm. Our estimates are within the range of previously reported values but higher than the most recently reported values. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Estimates of utility weights in hemophilia: implications for cost-utility analysis of clotting factor prophylaxis

    PubMed Central

    Grosse, Scott D; Chaugule, Shraddha S; Hay, Joel W

    2015-01-01

    Estimates of preference-weighted health outcomes or health state utilities are needed to assess improvements in health in terms of quality-adjusted life-years. Gains in quality-adjusted life-years are used to assess the cost–effectiveness of prophylactic use of clotting factor compared with on-demand treatment among people with hemophilia, a congenital bleeding disorder. Published estimates of health utilities for people with hemophilia vary, contributing to uncertainty in the estimates of cost–effectiveness of prophylaxis. Challenges in estimating utility weights for the purpose of evaluating hemophilia treatment include selection bias in observational data, difficulty in adjusting for predictors of health-related quality of life and lack of preference-based data comparing adults with lifetime or primary prophylaxis versus no prophylaxis living within the same country and healthcare system. PMID:25585817

  15. Assessing the likely value of gravity and drawdown measurements to constrain estimates of hydraulic conductivity and specific yield during unconfined aquifer testing

    USGS Publications Warehouse

    Blainey, Joan B.; Ferré, Ty P.A.; Cordova, Jeffrey T.

    2007-01-01

    Pumping of an unconfined aquifer can cause local desaturation detectable with high‐resolution gravimetry. A previous study showed that signal‐to‐noise ratios could be predicted for gravity measurements based on a hydrologic model. We show that although changes should be detectable with gravimeters, estimations of hydraulic conductivity and specific yield based on gravity data alone are likely to be unacceptably inaccurate and imprecise. In contrast, a transect of low‐quality drawdown data alone resulted in accurate estimates of hydraulic conductivity and inaccurate and imprecise estimates of specific yield. Combined use of drawdown and gravity data, or use of high‐quality drawdown data alone, resulted in unbiased and precise estimates of both parameters. This study is an example of the value of a staged assessment regarding the likely significance of a new measurement method or monitoring scenario before collecting field data.

  16. QMEANclust: estimation of protein model quality by combining a composite scoring function with structural density information.

    PubMed

    Benkert, Pascal; Schwede, Torsten; Tosatto, Silvio Ce

    2009-05-20

    The selection of the most accurate protein model from a set of alternatives is a crucial step in protein structure prediction in both template-based and ab initio approaches. Scoring functions have been developed which can either return a quality estimate for a single model or derive a score from the information contained in the ensemble of models for a given sequence. Local structural features occurring more frequently in the ensemble have a greater probability of being correct. Within the context of the CASP experiment, these so-called consensus methods have been shown to perform considerably better in selecting good candidate models, but tend to fail if the best models are far from the dominant structural cluster. In this paper we show that model selection can be improved if both approaches are combined by pre-filtering the models used during the calculation of the structural consensus. Our recently published QMEAN composite scoring function has been improved by including an all-atom interaction potential term. The preliminary model ranking based on the new QMEAN score is used to select a subset of reliable models against which the structural consensus score is calculated. This scoring function, called QMEANclust, achieves a correlation coefficient between predicted quality score and GDT_TS of 0.9 averaged over the 98 CASP7 targets and performs significantly better in selecting good models from the ensemble of server models than any other group participating in the quality estimation category of CASP7. Both scoring functions are also benchmarked on the MOULDER test set consisting of 20 target proteins, each with 300 alternative models generated by MODELLER. QMEAN outperforms all other tested scoring functions operating on individual models, while the consensus method QMEANclust only works properly on decoy sets containing a certain fraction of near-native conformations. We also present a local version of QMEAN for the per-residue estimation of model quality (QMEANlocal) and compare it to a new local consensus-based approach. Improved model selection is obtained by using a composite scoring function operating on single models in order to enrich higher-quality models, which are subsequently used to calculate the structural consensus. The performance of consensus-based methods such as QMEANclust highly depends on the composition and quality of the model ensemble to be analysed. Therefore, performance estimates for consensus methods based on large meta-datasets (e.g. CASP) might overrate their applicability in more realistic modelling situations with smaller sets of models based on individual methods.

  17. An approach to bioassessment of water quality using diversity measures based on species accumulative curves.

    PubMed

    Xu, Guangjian; Zhang, Wei; Xu, Henglong

    2015-02-15

    Traditional community-based bioassessment is time-consuming because it relies on full species-abundance data for a community. To improve bioassessment efficiency, the feasibility of using diversity measures based on species accumulative curves to assess water quality status was studied using a dataset of microperiphyton fauna. The results showed that: (1) the species accumulative curves fitted the Michaelis-Menten equation well; (2) the β- and γ-diversity, as well as the number of samples needed to reach 50% of the maximum species number (the Michaelis-Menten constant K), can be statistically estimated from the fitted equation; (3) the rarefied α-diversity showed a significant negative correlation with changes in the nutrient NH4-N; and (4) the estimated β-diversity and the K constant were significantly positively related to the concentration of NH4-N. The results suggest that diversity measures based on species accumulative curves might be used as a potential bioindicator of water quality in marine ecosystems. Copyright © 2014 Elsevier Ltd. All rights reserved.
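
    The Michaelis-Menten fit referred to in point (1) can be illustrated with a simple nonlinear least-squares fit: the asymptote estimates gamma-diversity and the half-saturation constant is the K referred to in point (2). The richness values below are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical species accumulation data: cumulative species richness observed
# after pooling n samples of microperiphyton.
n_samples = np.arange(1, 21)
richness = np.array([12, 20, 26, 31, 35, 38, 41, 43, 45, 47,
                     48, 50, 51, 52, 53, 54, 54, 55, 56, 56], dtype=float)

def michaelis_menten(n, s_max, k):
    """S(n) = S_max * n / (K + n); K is the sample count at half of S_max."""
    return s_max * n / (k + n)

(s_max, k), _ = curve_fit(michaelis_menten, n_samples, richness, p0=[60.0, 5.0])
print(f"Estimated gamma-diversity (S_max): {s_max:.1f}")
print(f"Samples to reach 50% of S_max (K): {k:.1f}")
```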

  18. Bayesian Revision of Residual Detection Power

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2013-01-01

    This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume on quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
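
    The Bayesian revision can be illustrated with a simple application of Bayes' rule: if the residual assessment itself has imperfect sensitivity and specificity, the fraction of flagged sites overstates the fraction of truly misfit sites. All probabilities below are illustrative, not the paper's values.

```python
# A minimal Bayes-rule sketch of the idea: some residuals flagged as
# out-of-tolerance belong to sites the model actually fits, because the
# assessment itself is noisy. All numbers below are illustrative.
prior_out = 0.05          # assumed prior fraction of truly out-of-tolerance sites
p_flag_given_out = 0.90   # probability the assessment flags a truly misfit site
p_flag_given_in = 0.10    # probability it flags a site the model actually fits

p_flag = p_flag_given_out * prior_out + p_flag_given_in * (1 - prior_out)
posterior_out_given_flag = p_flag_given_out * prior_out / p_flag

print(f"Fraction of sites flagged: {p_flag:.3f}")
print(f"P(truly out of tolerance | flagged): {posterior_out_given_flag:.3f}")
# The flagged fraction (here 0.14) exceeds the assumed truly misfit fraction (0.05),
# which mirrors the paper's observation that flagged sites outnumber true misfits.
```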

  19. Estimation of 3D reconstruction errors in a stereo-vision system

    NASA Astrophysics Data System (ADS)

    Belhaoua, A.; Kohler, S.; Hirsch, E.

    2009-06-01

    The paper presents an approach for error estimation for the various steps of an automated 3D vision-based reconstruction procedure for manufactured workpieces. The process is based on a priori planning of the task and built around a cognitive intelligent sensory system using so-called Situation Graph Trees (SGT) as a planning tool. Such an automated quality control system requires the coordination of a set of complex processes sequentially performing data acquisition, its quantitative evaluation and the comparison with a reference model (e.g., a CAD object model) in order to evaluate the object quantitatively. To ensure efficient quality control, the aim is to be able to state whether reconstruction results fulfill tolerance rules or not. Thus, the goal is to evaluate independently the error for each step of the stereo-vision based 3D reconstruction (e.g., for calibration, contour segmentation, matching and reconstruction) and then to estimate the error for the whole system. In this contribution, we analyze particularly the segmentation error due to localization errors for extracted edge points supposed to belong to lines and curves composing the outline of the workpiece under evaluation. The fitting parameters describing these geometric features are used as a quality measure to determine confidence intervals and finally to estimate the segmentation errors. These errors are then propagated through the whole reconstruction procedure, enabling evaluation of their effect on the final 3D reconstruction result, specifically on position uncertainties. Lastly, analysis of these error estimates enables evaluation of the quality of the 3D reconstruction, as illustrated by the experimental results shown.

  20. Classification accuracy of claims-based methods for identifying providers failing to meet performance targets.

    PubMed

    Hubbard, Rebecca A; Benjamin-Johnson, Rhondee; Onega, Tracy; Smith-Bindman, Rebecca; Zhu, Weiwei; Fenton, Joshua J

    2015-01-15

    Quality assessment is critical for healthcare reform, but data sources are lacking for measurement of many important healthcare outcomes. With over 49 million people covered by Medicare as of 2010, Medicare claims data offer a potentially valuable source that could be used in targeted health care quality improvement efforts. However, little is known about the operating characteristics of provider profiling methods using claims-based outcome measures that may estimate provider performance with error. Motivated by the example of screening mammography performance, we compared approaches to identifying providers failing to meet guideline targets using Medicare claims data. We used data from the Breast Cancer Surveillance Consortium and linked Medicare claims to compare claims-based and clinical estimates of cancer detection rate. We then demonstrated the performance of claim-based estimates across a broad range of operating characteristics using simulation studies. We found that identification of poor performing providers was extremely sensitive to algorithm specificity, with no approach identifying more than 65% of poor performing providers when claims-based measures had specificity of 0.995 or less. We conclude that claims have the potential to contribute important information on healthcare outcomes to quality improvement efforts. However, to achieve this potential, development of highly accurate claims-based outcome measures should remain a priority. Copyright © 2014 John Wiley & Sons, Ltd.

  1. A Comparative Study of Three Spatial Interpolation Methodologies for the Analysis of Air Pollution Concentrations in Athens, Greece

    NASA Astrophysics Data System (ADS)

    Deligiorgi, Despina; Philippopoulos, Kostas; Thanou, Lelouda; Karvounis, Georgios

    2010-01-01

    Spatial interpolation in air pollution modeling is the procedure for estimating ambient air pollution concentrations at unmonitored locations based on available observations. The selection of the appropriate methodology is based on the nature and the quality of the interpolated data. In this paper, an assessment of three widely used interpolation methodologies is undertaken in order to estimate the errors involved. For this purpose, air quality data from January 2001 to December 2005, from a network of seventeen monitoring stations operating in the greater area of Athens, Greece, are used. The Nearest Neighbor and Linear schemes were applied to the mean hourly observations, while the Inverse Distance Weighted (IDW) method was applied to the mean monthly concentrations. The discrepancies between the estimated and measured values are assessed for every station and pollutant, using the correlation coefficient, scatter diagrams and the statistical residuals. The capability of the methods to estimate air quality data in an area with multiple land-use types and pollution sources, such as Athens, is discussed.
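
    Of the three schemes, IDW is the easiest to sketch: the estimate at an unmonitored location is a distance-weighted average of the station values. The station coordinates and concentrations below are hypothetical.

```python
import numpy as np

# Hypothetical monitoring stations (km coordinates) and mean monthly NO2 (ug/m3).
stations = np.array([[0.0, 0.0], [5.0, 2.0], [3.0, 8.0], [9.0, 5.0], [7.0, 9.0]])
no2 = np.array([48.0, 55.0, 41.0, 60.0, 44.0])

def idw(target, points, values, power=2.0):
    """Inverse Distance Weighted estimate at an unmonitored location."""
    d = np.linalg.norm(points - target, axis=1)
    if np.any(d < 1e-9):                 # target coincides with a station
        return values[np.argmin(d)]
    w = 1.0 / d ** power
    return np.sum(w * values) / np.sum(w)

print("IDW estimate at (4, 4):", round(idw(np.array([4.0, 4.0]), stations, no2), 1))
```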

  2. Quality of surgical randomized controlled trials for acute cholecystitis: assessment based on CONSORT and additional check items.

    PubMed

    Shikata, Satoru; Nakayama, Takeo; Yamagishi, Hisakazu

    2008-01-01

    In this study, we conducted a limited survey of reports of surgical randomized controlled trials, using the Consolidated Standards of Reporting Trials (CONSORT) statement and additional check items to clarify problems in the evaluation of surgical reports. A total of 13 randomized trials were selected from the two latest review articles on biliary surgery. Each randomized trial was evaluated according to 28 quality measures that comprised items from the CONSORT statement plus additional items. Analysis focused on relationships between the quality of each study and the estimated effect gap (the pooled estimate in the meta-analysis minus the estimated effect of each study). No definite relationships were found between individual study quality and the estimated effect gap. The following items could have been described but were not provided in almost all the surgical RCT reports: "clearly defined outcomes"; "details of randomization"; "participant flow charts"; "intention-to-treat analysis"; "ancillary analyses"; and "financial conflicts of interest". The item "participation of a trial methodologist in the study" was not found in any of the reports. Although the quality of trial reporting is not always related to a biased estimation of treatment effect, the items used as quality measures must be described to enable readers to evaluate the quality and applicability of the reporting. Further development of an assessment tool is needed for items specific to surgical randomized controlled trials.

  3. An evaluation of data-driven motion estimation in comparison to the usage of external-surrogates in cardiac SPECT imaging

    PubMed Central

    Mukherjee, Joyeeta Mitra; Hutton, Brian F; Johnson, Karen L; Pretorius, P Hendrik; King, Michael A

    2014-01-01

    Motion estimation methods in single photon emission computed tomography (SPECT) can be classified into methods which depend on just the emission data (data-driven), or those that use some other source of information such as an external surrogate. The surrogate-based methods estimate the motion exhibited externally which may not correlate exactly with the movement of organs inside the body. The accuracy of data-driven strategies on the other hand is affected by the type and timing of motion occurrence during acquisition, the source distribution, and various degrading factors such as attenuation, scatter, and system spatial resolution. The goal of this paper is to investigate the performance of two data-driven motion estimation schemes based on the rigid-body registration of projections of motion-transformed source distributions to the acquired projection data for cardiac SPECT studies. Comparison is also made of six intensity based registration metrics to an external surrogate-based method. In the data-driven schemes, a partially reconstructed heart is used as the initial source distribution. The partially-reconstructed heart has inaccuracies due to limited angle artifacts resulting from using only a part of the SPECT projections acquired while the patient maintained the same pose. The performance of different cost functions in quantifying consistency with the SPECT projection data in the data-driven schemes was compared for clinically realistic patient motion occurring as discrete pose changes, one or two times during acquisition. The six intensity-based metrics studied were mean-squared difference (MSD), mutual information (MI), normalized mutual information (NMI), pattern intensity (PI), normalized cross-correlation (NCC) and entropy of the difference (EDI). Quantitative and qualitative analysis of the performance is reported using Monte-Carlo simulations of a realistic heart phantom including degradation factors such as attenuation, scatter and system spatial resolution. Further the visual appearance of motion-corrected images using data-driven motion estimates was compared to that obtained using the external motion-tracking system in patient studies. Pattern intensity and normalized mutual information cost functions were observed to have the best performance in terms of lowest average position error and stability with degradation of image quality of the partial reconstruction in simulations. In all patients, the visual quality of PI-based estimation was either significantly better or comparable to NMI-based estimation. Best visual quality was obtained with PI-based estimation in 1 of the 5 patient studies, and with external-surrogate based correction in 3 out of 5 patients. In the remaining patient study there was little motion and all methods yielded similar visual image quality. PMID:24107647
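
    Two of the simpler intensity-based metrics, MSD and NCC, can be written in a few lines; in the data-driven schemes such a metric would be evaluated between the acquired projections and reprojections of the motion-transformed partial reconstruction while searching over rigid-body transforms. The arrays below are synthetic stand-ins, not SPECT data.

```python
import numpy as np

def msd(a, b):
    """Mean-squared difference between an acquired projection and a reprojection."""
    return np.mean((a - b) ** 2)

def ncc(a, b):
    """Normalized cross-correlation; values near 1 indicate good agreement."""
    a0, b0 = a - a.mean(), b - b.mean()
    return np.sum(a0 * b0) / (np.sqrt(np.sum(a0 ** 2) * np.sum(b0 ** 2)) + 1e-12)

# Hypothetical 64x64 "acquired" projection and a slightly shifted, noisy reprojection;
# a motion search would adjust the rigid-body transform to optimize one of these metrics.
rng = np.random.default_rng(5)
reference = rng.random((64, 64))
shifted = np.roll(reference, shift=2, axis=1) + rng.normal(scale=0.05, size=(64, 64))

print("MSD:", round(msd(reference, shifted), 4))
print("NCC:", round(ncc(reference, shifted), 4))
```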

  4. Underwater image enhancement through depth estimation based on random forest

    NASA Astrophysics Data System (ADS)

    Tai, Shen-Chuan; Tsai, Ting-Chou; Huang, Jyun-Han

    2017-11-01

    Light absorption and scattering in underwater environments can result in low-contrast images with a distinct color cast. This paper proposes a systematic framework for the enhancement of underwater images. Light transmission is estimated using the random forest algorithm. RGB values, luminance, color difference, blurriness, and the dark channel are treated as features in training and estimation. Transmission is calculated using an ensemble machine learning algorithm to deal with a variety of conditions encountered in underwater environments. A color compensation and contrast enhancement algorithm based on depth information was also developed with the aim of improving the visual quality of underwater images. Experimental results demonstrate that the proposed scheme outperforms existing methods with regard to subjective visual quality as well as objective measurements.
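
    A hedged sketch of the transmission-estimation step is shown below with scikit-learn's random forest regressor; the feature set, training data and downstream restoration formula are placeholders rather than the authors' exact pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical per-pixel (or per-patch) features for an underwater image:
# R, G, B, luminance, blurriness and dark-channel value, paired with a known
# transmission value from synthetic training data.
rng = np.random.default_rng(2)
X = rng.random((500, 6))
t = np.clip(0.9 - 0.6 * X[:, 5] + rng.normal(scale=0.05, size=500), 0.05, 1.0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, t)

# The estimated transmission t(x) could then feed a dehazing-style restoration,
# J(x) = (I(x) - A) / max(t(x), t_min) + A, followed by color compensation.
new_patch = rng.random((1, 6))
print("Estimated transmission:", round(float(rf.predict(new_patch)[0]), 3))
```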

  5. Fractional kalman filter to estimate the concentration of air pollution

    NASA Astrophysics Data System (ADS)

    Vita Oktaviana, Yessy; Apriliani, Erna; Khusnul Arif, Didik

    2018-04-01

    Air pollution has important effects on environmental quality and human quality of life. Air pollution can be caused by natural sources or human activities. One example is ozone, a harmful gas formed from NOx and volatile organic compounds (VOCs) emitted from various sources. The air pollution problem can be modeled by TAPM-CTM (The Air Pollution Model with Chemical Transport Model), which describes pollutant concentrations in the air. It is therefore important to estimate pollutant concentrations; such estimates can be used to forecast future concentrations and to help maintain air quality. In this research, an algorithm based on the Fractional Kalman Filter is developed to solve the air pollution model. The model is first discretized and then estimated with the proposed method. The results show that the Fractional Kalman Filter estimate is more accurate than the standard Kalman Filter estimate, with accuracy assessed using the root mean square error (RMSE).

  6. Adaptive compressed sensing of remote-sensing imaging based on the sparsity prediction

    NASA Astrophysics Data System (ADS)

    Yang, Senlin; Li, Xilong; Chong, Xin

    2017-10-01

    Conventional compressive sensing relies on non-adaptive linear projections, and the number of measurements is usually set empirically, which degrades the quality of image reconstruction. Firstly, block-based compressed sensing (BCS) with the conventional selection of compressive measurements is described. Then an estimation method for image sparsity is proposed based on the two-dimensional discrete cosine transform (2D DCT). Given an energy threshold, the DCT coefficients are energy-normalized and sorted in descending order, and the sparsity of the image is obtained as the proportion of dominant coefficients. Finally, simulation results show that the method estimates image sparsity effectively and provides a practical basis for selecting the number of compressive measurements. The results also show that, because the number of measurements is selected from the sparsity estimated under the given energy threshold, the proposed method can ensure the quality of image reconstruction.
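
    The sparsity-estimation step lends itself to a compact sketch: take the 2D DCT of a block, normalize and sort the coefficient energies, and report the fraction of coefficients needed to reach a preset energy threshold. The threshold value and block size below are illustrative.

```python
import numpy as np
from scipy.fft import dctn

def estimate_sparsity(block, energy_threshold=0.95):
    """Fraction of 2D-DCT coefficients needed to retain the given share of energy."""
    coeffs = dctn(block, norm='ortho')
    energy = np.sort((coeffs ** 2).ravel())[::-1]   # sort energies in descending order
    energy /= energy.sum()                          # energy normalisation
    k = np.searchsorted(np.cumsum(energy), energy_threshold) + 1
    return k / energy.size                          # proportion of dominant coefficients

rng = np.random.default_rng(0)
block = rng.random((32, 32))
s = estimate_sparsity(block)
# The number of compressive measurements per block can then be scaled with s,
# e.g. m = ceil(c * s * N) for block length N and a safety factor c.
print(f"estimated sparsity: {s:.2f}")
```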

  7. Online tools for uncovering data quality issues in satellite-based global precipitation products

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Heo, G.

    2015-12-01

    Accurate and timely available global precipitation products are important to many applications such as flood forecasting, hydrological modeling, vector-borne disease research, crop yield estimates, etc. However, data quality issues such as biases and uncertainties are common in satellite-based precipitation products and it is important to understand these issues in applications. In recent years, algorithms using multi-satellites and multi-sensors for satellite-based precipitation estimates have become popular, such as the TRMM (Tropical Rainfall Measuring Mission) Multi-satellite Precipitation Analysis (TMPA) and the latest Integrated Multi-satellitE Retrievals for GPM (IMERG). Studies show that data quality issues for multi-satellite and multi-sensor products can vary with space and time and can be difficult to summarize. Online tools can provide customized results for a given area of interest, allowing customized investigation or comparison on several precipitation products. Because downloading data and software is not required, online tools can facilitate precipitation product evaluation and comparison. In this presentation, we will present online tools to uncover data quality issues in satellite-based global precipitation products. Examples will be presented as well.

  8. Regional Demand Models for Water-Based Recreation: Combining Aggregate and Individual-Level Choice Data

    EPA Science Inventory

    Estimating the effect of changes in water quality on non-market values for recreation involves estimating a change in aggregate consumer surplus. This aggregate value typically involves estimating both a per-person, per-trip change in willingness to pay, as well as defining the m...

  9. Parallel mutual information estimation for inferring gene regulatory networks on GPUs

    PubMed Central

    2011-01-01

    Background Mutual information is a measure of similarity between two variables. It has been widely used in various application domains including computational biology, machine learning, statistics, image processing, and financial computing. Previously used simple histogram-based mutual information estimators lack precision compared to kernel-based methods. The recently introduced B-spline function based mutual information estimation method is competitive with the kernel-based methods in terms of quality but has lower computational complexity. Results We present a new approach to accelerate the B-spline function based mutual information estimation algorithm with commodity graphics hardware. To derive an efficient mapping onto this type of architecture, we have used the Compute Unified Device Architecture (CUDA) programming model to design and implement a new parallel algorithm. Our implementation, called CUDA-MI, can achieve speedups of up to 82 using double precision on a single GPU compared to a multi-threaded implementation on a quad-core CPU for large microarray datasets. We have used the results obtained by CUDA-MI to infer gene regulatory networks (GRNs) from microarray data. The comparisons to existing methods including ARACNE and TINGe show that CUDA-MI produces GRNs of higher quality in less time. Conclusions CUDA-MI is publicly available open-source software, written in CUDA and C++ programming languages. It obtains significant speedup over the multi-threaded CPU implementation by fully exploiting the compute capability of commonly used CUDA-enabled low-cost GPUs. PMID:21672264

  10. Spatial regression test for ensuring temperature data quality in southern Spain

    NASA Astrophysics Data System (ADS)

    Estévez, J.; Gavilán, P.; García-Marín, A. P.

    2018-01-01

    Quality assurance of meteorological data is crucial for ensuring the reliability of applications and models that use such data as input variables, especially in the field of environmental sciences. Spatial validation of meteorological data is based on the application of quality control procedures using data from neighbouring stations to assess the validity of data from a candidate station (the station of interest). These kinds of tests, referred to in the literature as spatial consistency tests, take data from neighbouring stations in order to estimate the corresponding measurement at the candidate station. These estimations can be made by weighting values according to the distance between the stations or to the coefficient of correlation, among other methods. The test applied in this study relies on statistical decision-making and uses a weighting based on the standard error of the estimate. This paper summarizes the results of the application of this test to maximum, minimum and mean temperature data from the Agroclimatic Information Network of Andalusia (southern Spain). This quality control procedure includes a decision based on a factor f, the fraction of potential outliers for each station across the region. Using GIS techniques, the geographic distribution of the detected errors has also been analysed. Finally, the performance of the test was assessed by evaluating its effectiveness in detecting known errors.
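
    A rough sketch of such a spatial consistency check is given below: each neighbour's historical series is regressed against the candidate's, today's value is estimated as a weighted mean of the per-neighbour predictions (weights equal to the inverse squared standard error of estimate), and the observation is flagged if it deviates by more than f error units. The pooled-error formula and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def spatial_consistency_check(cand_hist, neigh_hist, cand_today, neigh_today, f=3.0):
    """Return (passes_check, estimate) for today's candidate-station value."""
    preds, errs = [], []
    for j in range(neigh_hist.shape[1]):
        x, y = neigh_hist[:, j], cand_hist
        b, a = np.polyfit(x, y, 1)                       # linear fit: y ~ a + b * x
        resid = y - (a + b * x)
        s = np.sqrt(np.sum(resid ** 2) / (len(y) - 2))   # standard error of estimate
        preds.append(a + b * neigh_today[j])
        errs.append(s)
    preds, errs = np.array(preds), np.array(errs)
    w = 1.0 / errs ** 2
    est = np.sum(w * preds) / np.sum(w)
    s_pooled = np.sqrt(len(errs) / np.sum(w))            # combined error scale (assumed form)
    return abs(cand_today - est) <= f * s_pooled, est

rng = np.random.default_rng(0)
cand = 20 + rng.normal(0, 1, 60)                               # candidate-station history
neigh = np.column_stack([cand + rng.normal(0, 0.8, 60) for _ in range(4)])
ok, est = spatial_consistency_check(cand, neigh, cand_today=35.0, neigh_today=neigh[-1])
print(ok, round(est, 1))                                       # 35.0 is flagged as suspect
```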

  11. Shrinkage Estimators for a Composite Measure of Quality Conceptualized as a Formative Construct

    PubMed Central

    Shwartz, Michael; Peköz, Erol A; Christiansen, Cindy L; Burgess, James F; Berlowitz, Dan

    2013-01-01

    Objective To demonstrate the value of shrinkage estimators when calculating a composite quality measure as the weighted average of a set of individual quality indicators. Data Sources Rates of 28 quality indicators (QIs) calculated from the minimum dataset from residents of 112 Veterans Health Administration nursing homes in fiscal years 2005–2008. Study Design We compared composite scores calculated from the 28 QIs using both observed rates and shrunken rates derived from a Bayesian multivariate normal-binomial model. Principal Findings Shrunken-rate composite scores, because they take into account unreliability of estimates from small samples and the correlation among QIs, have more intuitive appeal than observed-rate composite scores. Facilities can be profiled based on more policy-relevant measures than point estimates of composite scores, and interval estimates can be calculated without assuming the QIs are independent. Usually, shrunken-rate composite scores in 1 year are better able to predict the observed total number of QI events or the observed-rate composite scores in the following year than the initial year observed-rate composite scores. Conclusion Shrinkage estimators can be useful when a composite measure is conceptualized as a formative construct. PMID:22716650
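
    The full model above is a Bayesian multivariate normal-binomial model; as a much simpler stand-in, the sketch below shrinks each facility's observed QI rate toward the pooled rate with a univariate empirical-Bayes weight and then forms the weighted-average composite. The QI weights and simulated counts are placeholders.

```python
import numpy as np

def shrunken_rates(events, denoms):
    """Shrink facility rates toward the pooled rate; small denominators shrink more."""
    p_pool = events.sum() / denoms.sum()
    raw = events / denoms
    within_var = p_pool * (1 - p_pool) / denoms          # binomial sampling variance
    between_var = max(raw.var() - within_var.mean(), 1e-9)
    w = between_var / (between_var + within_var)         # weight on the facility's own rate
    return w * raw + (1 - w) * p_pool

rng = np.random.default_rng(0)
n_facilities, qi_weights = 5, np.array([0.5, 0.3, 0.2])  # 3 QIs with importance weights
composite = np.zeros(n_facilities)
for weight in qi_weights:
    denoms = rng.integers(20, 200, n_facilities).astype(float)
    events = rng.binomial(denoms.astype(int), 0.15)
    composite += weight * shrunken_rates(events, denoms)
print(np.round(composite, 3))
```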

  12. Lucid dreaming incidence: A quality effects meta-analysis of 50 years of research.

    PubMed

    Saunders, David T; Roe, Chris A; Smith, Graham; Clegg, Helen

    2016-07-01

    We report a quality effects meta-analysis on studies from the period 1966-2016 measuring either (a) lucid dreaming prevalence (one or more lucid dreams in a lifetime); (b) frequent lucid dreaming (one or more lucid dreams in a month) or both. A quality effects meta-analysis allows for the minimisation of the influence of study methodological quality on overall model estimates. Following sensitivity analysis, a heterogeneous lucid dreaming prevalence data set of 34 studies yielded a mean estimate of 55%, 95% C. I. [49%, 62%] for which moderator analysis showed no systematic bias for suspected sources of variability. A heterogeneous lucid dreaming frequency data set of 25 studies yielded a mean estimate of 23%, 95% C. I. [20%, 25%], moderator analysis revealed no suspected sources of variability. These findings are consistent with earlier estimates of lucid dreaming prevalence and frequent lucid dreaming in the population but are based on more robust evidence. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Association between costs and quality of acute myocardial infarction care hospitals under the Korea National Health Insurance program.

    PubMed

    Kang, Hee-Chung; Hong, Jae-Seok

    2017-08-01

    If cost reductions produce a cost-quality trade-off, healthcare policy makers need to be more circumspect about the use of cost-effective initiatives. Additional empirical evidence about the relationship between cost and quality is needed to design a value-based payment system. We examined the association between cost and quality performance for acute myocardial infarction (AMI) care at the hospital level. In 2008, this cross-sectional study examined 69 hospitals with 6599 patients hospitalized under the Korea National Health Insurance (KNHI) program. We separately estimated hospital-specific effects on cost and quality using fixed effect models adjusting for average patient risk. The analysis examined the association between the estimated hospital effects on treatment cost and those on quality. All hospitals were distributed over the 4 cost × quality quadrants rather than concentrated in only the trade-off quadrants (i.e., above-average cost and above-average quality, below-average cost and below-average quality). We found no significant trade-off between cost and quality among hospitals providing AMI care in Korea. Our results further contribute to formulating a rationale for value-based hospital-level incentive programs by supporting the necessity of different approaches depending on the quality location of a hospital in these 4 quadrants.

  14. Testing survey-based methods for rapid monitoring of child mortality, with implications for summary birth history data.

    PubMed

    Brady, Eoghan; Hill, Kenneth

    2017-01-01

    Under-five mortality estimates are increasingly used in low and middle income countries to target interventions and measure performance against global development goals. Two new methods to rapidly estimate under-5 mortality based on Summary Birth Histories (SBH) were described in a previous paper and tested with data available. This analysis tests the methods using data appropriate to each method from 5 countries that lack vital registration systems. SBH data are collected across many countries through censuses and surveys, and indirect methods often rely upon their quality to estimate mortality rates. The Birth History Imputation method imputes data from a recent Full Birth History (FBH) onto the birth, death and age distribution of the SBH to produce estimates based on the resulting distribution of child mortality. DHS FBHs and MICS SBHs are used for all five countries. In the implementation, 43 of 70 estimates are within 20% of validation estimates (61%). Mean Absolute Relative Error is 17.7.%. 1 of 7 countries produces acceptable estimates. The Cohort Change method considers the differences in births and deaths between repeated Summary Birth Histories at 1 or 2-year intervals to estimate the mortality rate in that period. SBHs are taken from Brazil's PNAD Surveys 2004-2011 and validated against IGME estimates. 2 of 10 estimates are within 10% of validation estimates. Mean absolute relative error is greater than 100%. Appropriate testing of these new methods demonstrates that they do not produce sufficiently good estimates based on the data available. We conclude this is due to the poor quality of most SBH data included in the study. This has wider implications for the next round of censuses and future household surveys across many low- and middle- income countries.

  15. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    NASA Astrophysics Data System (ADS)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

    Cutting tool wear degrades the product quality in manufacturing processes. Monitoring tool wear online is therefore needed to prevent degradation in machining quality. Unfortunately, there is no direct way of measuring the tool wear online. Therefore one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature-space filtering, outlier removal, and estimation-space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.
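
    A toy version of the fusion step, assuming a scikit-learn multilayer perceptron in place of the authors' network and synthetic features standing in for the force, vibration, current and sound-pressure signals:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))          # 8 features extracted from 4 sensor channels
wear = 0.1 + 0.05 * X[:, 0] - 0.03 * X[:, 3] + 0.02 * X[:, 5] + rng.normal(0, 0.01, 300)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X[:250], wear[:250])                       # train on the first 250 cuts
pred = model.predict(X[250:])
print(f"test RMSE: {np.sqrt(np.mean((pred - wear[250:]) ** 2)):.4f} mm")
```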

  16. Modeling individualized coefficient alpha to measure quality of test score data.

    PubMed

    Liu, Molei; Hu, Ming; Zhou, Xiao-Hua

    2018-05-23

    Individualized coefficient alpha is defined. It is item and subject specific and is used to measure the quality of test score data with heterogeneity among the subjects and items. A regression model is developed based on 3 sets of generalized estimating equations. The first set of generalized estimating equations models the expectation of the responses, the second set models the response variance, and the third set is proposed to estimate the individualized coefficient alpha, which is defined and used to measure the individualized internal consistency of the responses. We also use different techniques to extend our method to handle missing data. Asymptotic properties of the estimators are discussed, based on which inference on the coefficient alpha is derived. The performance of our method is evaluated through a simulation study and real data analysis. The real data application is from a health literacy study in Hunan province of China. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model.

    PubMed

    Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S

    2015-03-15

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfil the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. The Novel Nonlinear Adaptive Doppler Shift Estimation Technique and the Coherent Doppler Lidar System Validation Lidar

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.

    2006-01-01

    The signal processing aspect of a 2-μm wavelength coherent Doppler lidar system under development at NASA Langley Research Center in Virginia is investigated in this paper. The lidar system is named VALIDAR (validation lidar) and its signal processing program estimates and displays various wind parameters in real time as data acquisition occurs. The goal is to improve the quality of the current estimates such as power, Doppler shift, wind speed, and wind direction, especially in the low signal-to-noise-ratio (SNR) regime. A novel Nonlinear Adaptive Doppler Shift Estimation Technique (NADSET) is developed for this purpose and its performance is analyzed using the wind data acquired over a long period of time by VALIDAR. The quality of Doppler shift and power estimations by conventional Fourier-transform-based spectrum estimation methods deteriorates rapidly as SNR decreases. NADSET compensates for such deterioration in the quality of wind parameter estimates by adaptively utilizing the statistics of the Doppler shift estimates in the strong-SNR range and identifying sporadic range bins where good Doppler shift estimates are found. The validity of NADSET is established by comparing the trend of wind parameters with and without NADSET applied to the long-period lidar return data.

  19. Cost-effectiveness of rehabilitation after an acute coronary event: a randomised controlled trial.

    PubMed

    Briffa, Tom G; Eckermann, Simon D; Griffiths, Alison D; Harris, Phillip J; Heath, M Rose; Freedman, Saul B; Donaldson, Lana T; Briffa, N Kathryn; Keech, Anthony C

    2005-11-07

    To estimate the incremental effects on cost and quality of life of cardiac rehabilitation after an acute coronary syndrome. Open randomised controlled trial with 1 year's follow-up. Analysis was on an intention-to-treat basis. Two tertiary hospitals in Sydney. 18 sessions of comprehensive exercise-based outpatient cardiac rehabilitation or conventional care as provided by the treating doctor. 113 patients aged 41-75 years who were self-caring and literate in English. Patients with uncompensated heart failure, uncontrolled arrhythmias, severe and symptomatic aortic stenosis or physical impairment were excluded. Costs (hospitalisations, medication use, outpatient visits, investigations, and personal expenses); and measures of quality of life. Incremental cost per quality-adjusted life year (QALY) saved at 1 year (this estimate combines within-study utility effects with reported 1-year risk of survival and treatment effects of rehabilitation on mortality). Sensitivity analyses around a base case estimate included alternative assumptions of no treatment effect on survival, 3 years of treatment effect on survival and variations in utility. The estimated incremental cost per QALY saved for rehabilitation relative to standard care was 42,535 US dollars when modelling included the reported treatment effect on survival. This increased to 70,580 US dollars per QALY saved if treatment effect on survival was not included. The results were sensitive to variations in utility and ranged from 19,685 US dollars per QALY saved to rehabilitation not being cost-effective. The effects on quality of life tend to reinforce treatment advantages on survival for patients having postdischarge rehabilitation after an acute coronary syndrome. The estimated base case incremental cost per QALY saved is consistent with those historically accepted by decision making authorities such as the Pharmaceutical Benefits Advisory Committee.
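
    The incremental cost-effectiveness ratio itself is simple arithmetic; the sketch below reproduces the calculation with hypothetical cost and QALY values chosen only to illustrate the order of magnitude reported above.

```python
# Incremental cost-effectiveness ratio (ICER) = (C_rehab - C_usual) / (Q_rehab - Q_usual)
cost_rehab, cost_usual = 5_500.0, 4_000.0   # hypothetical mean 1-year costs (USD)
qaly_rehab, qaly_usual = 0.7853, 0.7500     # hypothetical mean QALYs at 1 year

icer = (cost_rehab - cost_usual) / (qaly_rehab - qaly_usual)
print(f"ICER = {icer:,.0f} USD per QALY gained")   # ~42,500 USD/QALY with these toy numbers
```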

  20. Spatially-controlled illumination with rescan confocal microscopy enhances image quality, resolution and reduces photodamage

    NASA Astrophysics Data System (ADS)

    Krishnaswami, Venkataraman; De Luca, Giulia M. R.; Breedijk, Ronald M. P.; Van Noorden, Cornelis J. F.; Manders, Erik M. M.; Hoebe, Ron A.

    2017-02-01

    Fluorescence microscopy is an important tool in biomedical imaging. An inherent trade-off lies between image quality and photodamage. Recently, we have introduced rescan confocal microscopy (RCM) that improves the lateral resolution of a confocal microscope down to 170 nm. Previously, we have demonstrated that with controlled-light exposure microscopy, spatial control of illumination reduces photodamage without compromising image quality. Here, we show that the combination of these two techniques leads to high resolution imaging with reduced photodamage without compromising image quality. Implementation of spatially-controlled illumination was carried out in RCM using a line scanning-based approach. Illumination is spatially-controlled for every line during imaging with the help of a prediction algorithm that estimates the spatial profile of the fluorescent specimen. The estimation is based on the information available from previously acquired line images. As a proof-of-principle, we show images of N1E-115 neuroblastoma cells, obtained by this new setup with reduced illumination dose, improved resolution and without compromising image quality.

  1. Comparison of eating quality and physicochemical properties between Japanese and Chinese rice cultivars.

    PubMed

    Nakamura, Sumiko; Cui, Jing; Zhang, Xin; Yang, Fan; Xu, Ximing; Sheng, Hua; Ohtsubo, Ken'ichi

    2016-12-01

    In this study, we evaluated 16 Japanese and Chinese rice cultivars in terms of their main chemical components, iodine absorption curve, apparent amylose content (AAC), pasting property, resistant starch content, physical properties, sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis, and enzyme activity. Based on these quality evaluations, we concluded that Chinese rice varieties are characterized by high protein content and a cooked-grain texture with high hardness and low stickiness. In a previous study, we developed a novel formula for estimating AAC based on the iodine absorption curve. The validation test showed a determination coefficient of 0.996 for estimating AAC of Chinese rice cultivars as unknown samples. In the present study, we developed a novel formula for estimating the balance degree of the surface layer of cooked rice (A3/A1: the ratio of stickiness workload to hardness workload) based on the iodine absorption curve obtained using milled rice.

  2. Application of spectral decomposition algorithm for mapping water quality in a turbid lake (Lake Kasumigaura, Japan) from Landsat TM data

    NASA Astrophysics Data System (ADS)

    Oyama, Youichi; Matsushita, Bunkei; Fukushima, Takehiko; Matsushige, Kazuo; Imai, Akio

    The remote sensing of Case 2 water has been far less successful than that of Case 1 water, due mainly to the complex interactions among optically active substances (e.g., phytoplankton, suspended sediments, colored dissolved organic matter, and water) in the former. To address this problem, we developed a spectral decomposition algorithm (SDA), based on a spectral linear mixture modeling approach. Through a tank experiment, we found that the SDA-based models were superior to conventional empirical models (e.g. using single band, band ratio, or arithmetic calculation of band) for accurate estimates of water quality parameters. In this paper, we develop a method for applying the SDA to Landsat-5 TM data on Lake Kasumigaura, a eutrophic lake in Japan characterized by high concentrations of suspended sediment, for mapping chlorophyll-a (Chl-a) and non-phytoplankton suspended sediment (NPSS) distributions. The results show that the SDA-based estimation model can be obtained by a tank experiment. Moreover, by combining this estimation model with satellite-SRSs (standard reflectance spectra: i.e., spectral end-members) derived from bio-optical modeling, we can directly apply the model to a satellite image. The same SDA-based estimation model for Chl-a concentration was applied to two Landsat-5 TM images, one acquired in April 1994 and the other in February 2006. The average Chl-a estimation error between the two was 9.9%, a result that indicates the potential robustness of the SDA-based estimation model. The average estimation error of NPSS concentration from the 2006 Landsat-5 TM image was 15.9%. The key point for successfully applying the SDA-based estimation model to satellite data is the method used to obtain a suitable satellite-SRS for each end-member.
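
    The linear-mixture decomposition at the core of the SDA can be sketched as a non-negative least-squares fit of end-member standard reflectance spectra to an observed pixel spectrum; the spectra and band count below are synthetic placeholders, and the subsequent water-quality model (e.g. Chl-a as a function of the phytoplankton coefficient) would be calibrated against tank or in situ data as described.

```python
import numpy as np
from scipy.optimize import nnls

srs = np.array([
    [0.02, 0.03, 0.03, 0.02, 0.01, 0.01],   # clear water (synthetic end-member spectrum)
    [0.03, 0.05, 0.09, 0.06, 0.04, 0.02],   # phytoplankton-dominated water
    [0.06, 0.08, 0.10, 0.11, 0.10, 0.08],   # sediment-dominated water
]).T                                         # shape (bands, end-members)

true_frac = np.array([0.5, 0.3, 0.2])
pixel = srs @ true_frac + np.random.default_rng(0).normal(0, 0.002, srs.shape[0])

coeffs, _ = nnls(srs, pixel)                 # decomposition (mixing) coefficients
print(np.round(coeffs, 2))
```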

  3. The Effects of Baseline Estimation on the Reliability, Validity, and Precision of CBM-R Growth Estimates

    ERIC Educational Resources Information Center

    Van Norman, Ethan R.; Christ, Theodore J.; Zopluoglu, Cengiz

    2013-01-01

    This study examined the effect of baseline estimation on the quality of trend estimates derived from Curriculum Based Measurement of Oral Reading (CBM-R) progress monitoring data. The authors used a linear mixed effects regression (LMER) model to simulate progress monitoring data for schedules ranging from 6-20 weeks for datasets with high and low…

  4. Curriculum-Based Measurement of Oral Reading: Quality of Progress Monitoring Outcomes

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Zopluoglu, Cengiz; Long, Jeffery D.; Monaghen, Barbara D.

    2012-01-01

    Curriculum-based measurement of oral reading (CBM-R) is frequently used to set student goals and monitor student progress. This study examined the quality of growth estimates derived from CBM-R progress monitoring data. The authors used a linear mixed effects regression (LMER) model to simulate progress monitoring data for multiple levels of…

  5. WATGIS: A GIS-Based Lumped Parameter Water Quality Model

    Treesearch

    Glenn P. Fernandez; George M. Chescheir; R. Wayne Skaggs; Devendra M. Amatya

    2002-01-01

    A Geographic Information System (GIS)-based, lumped parameter water quality model was developed to estimate the spatial and temporal nitrogen-loading patterns for lower coastal plain watersheds in eastern North Carolina. The model uses a spatially distributed delivery ratio (DR) parameter to account for nitrogen retention or loss along a drainage network. Delivery...

  6. Dynamic Fuzzy Logic-Based Quality of Interaction within Blended-Learning: The Rare and Contemporary Dance Cases

    ERIC Educational Resources Information Center

    Dias, Sofia B.; Diniz, José A.; Hadjileontiadis, Leontios J.

    2014-01-01

    The combination of the process of pedagogical planning within the Blended (b-) learning environment with the users' quality of interaction ("QoI") with the Learning Management System (LMS) is explored here. The required "QoI" (both for professors and students) is estimated by adopting a fuzzy logic-based modeling approach,…

  7. Polarization impacts on the water-leaving radiance retrieval from above-water radiometric measurements.

    PubMed

    Harmel, Tristan; Gilerson, Alexander; Tonizzo, Alberto; Chowdhary, Jacek; Weidemann, Alan; Arnone, Robert; Ahmed, Sam

    2012-12-10

    Above-water measurements of water-leaving radiance are widely used for water-quality monitoring and ocean-color satellite data validation. Reflected skylight in above-water radiometry needs to be accurately estimated prior to derivation of water-leaving radiance. Up-to-date methods to estimate reflection of diffuse skylight on rough sea surfaces are based on radiative transfer simulations and sky radiance measurements. But these methods neglect the polarization state of the incident skylight, which is generally highly polarized. In this paper, the effects of polarization on the sea surface reflectance and the subsequent water-leaving radiance estimation are investigated. We show that knowledge of the polarization field of the diffuse skylight significantly improves above-water radiometry estimates, in particular in the blue part of the spectrum where the reflected skylight is dominant. A newly developed algorithm based on radiative transfer simulations including polarization is described. Its application to the standard Aerosol Robotic Network-Ocean Color and hyperspectral radiometric measurements of the 1.5-year dataset acquired at the Long Island Sound site demonstrates the noticeable importance of considering polarization for water-leaving radiance estimation. In particular it is shown, based on time series of collocated data acquired in coastal waters, that the azimuth range of measurements leading to good-quality data is significantly increased, and that these estimates are improved by more than 12% at 413 nm. Full consideration of polarization effects is expected to significantly improve the quality of the field data utilized for satellite data validation or potential vicarious calibration purposes.

  8. Using Ensemble-Based Methods for Directly Estimating Causal Effects: An Investigation of Tree-Based G-Computation

    ERIC Educational Resources Information Center

    Austin, Peter C.

    2012-01-01

    Researchers are increasingly using observational or nonrandomized data to estimate causal treatment effects. Essential to the production of high-quality evidence is the ability to reduce or minimize the confounding that frequently occurs in observational studies. When using the potential outcome framework to define causal treatment effects, one…

  9. Child mortality estimation 2013: an overview of updates in estimation methods by the United Nations Inter-agency Group for Child Mortality Estimation.

    PubMed

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues.

  10. Child Mortality Estimation 2013: An Overview of Updates in Estimation Methods by the United Nations Inter-Agency Group for Child Mortality Estimation

    PubMed Central

    Alkema, Leontine; New, Jin Rou; Pedersen, Jon; You, Danzhen

    2014-01-01

    Background In September 2013, the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME) published an update of the estimates of the under-five mortality rate (U5MR) and under-five deaths for all countries. Compared to the UN IGME estimates published in 2012, updated data inputs and a new method for estimating the U5MR were used. Methods We summarize the new U5MR estimation method, which is a Bayesian B-spline Bias-reduction model, and highlight differences with the previously used method. Differences in UN IGME U5MR estimates as published in 2012 and those published in 2013 are presented and decomposed into differences due to the updated database and differences due to the new estimation method to explain and motivate changes in estimates. Findings Compared to the previously used method, the new UN IGME estimation method is based on a different trend fitting method that can track (recent) changes in U5MR more closely. The new method provides U5MR estimates that account for data quality issues. Resulting differences in U5MR point estimates between the UN IGME 2012 and 2013 publications are small for the majority of countries but greater than 10 deaths per 1,000 live births for 33 countries in 2011 and 19 countries in 1990. These differences can be explained by the updated database used, the curve fitting method as well as accounting for data quality issues. Changes in the number of deaths were less than 10% on the global level and for the majority of MDG regions. Conclusions The 2013 UN IGME estimates provide the most recent assessment of levels and trends in U5MR based on all available data and an improved estimation method that allows for closer-to-real-time monitoring of changes in the U5MR and takes account of data quality issues. PMID:25013954

  11. An Auditory-Masking-Threshold-Based Noise Suppression Algorithm GMMSE-AMT[ERB] for Listeners with Sensorineural Hearing Loss

    NASA Astrophysics Data System (ADS)

    Natarajan, Ajay; Hansen, John H. L.; Arehart, Kathryn Hoberg; Rossi-Katz, Jessica

    2005-12-01

    This study describes a new noise suppression scheme for hearing aid applications based on the auditory masking threshold (AMT) in conjunction with a modified generalized minimum mean square error estimator (GMMSE) for individual subjects with hearing loss. The representation of cochlear frequency resolution is achieved in terms of auditory filter equivalent rectangular bandwidths (ERBs). Estimation of AMT and spreading functions for masking are implemented in two ways: with normal auditory thresholds and normal auditory filter bandwidths (GMMSE-AMT[ERB]-NH) and with elevated thresholds and broader auditory filters characteristic of cochlear hearing loss (GMMSE-AMT[ERB]-HI). Evaluation is performed using speech corpora with objective quality measures (segmental SNR, Itakura-Saito), along with formal listener evaluations of speech quality rating and intelligibility. While no measurable changes in intelligibility occurred, evaluations showed quality improvement with both algorithm implementations. However, the customized formulation based on individual hearing losses was similar in performance to the formulation based on the normal auditory system.

  12. Average glandular dose in paired digital mammography and digital breast tomosynthesis acquisitions in a population based screening program: effects of measuring breast density, air kerma and beam quality

    NASA Astrophysics Data System (ADS)

    Helge Østerås, Bjørn; Skaane, Per; Gullien, Randi; Catrine Trægde Martinsen, Anne

    2018-02-01

    The main purpose was to compare average glandular dose (AGD) for same-compression digital mammography (DM) and digital breast tomosynthesis (DBT) acquisitions in a population based screening program, with and without breast density stratification, as determined by automatically calculated breast density (Quantra™). A secondary purpose was to compare AGD estimates based on measured breast density, air kerma and half value layer (HVL) to DICOM metadata based estimates. AGD was estimated for 3819 women participating in the screening trial. All received craniocaudal and mediolateral oblique views of each breast with paired DM and DBT acquisitions. Exposure parameters were extracted from DICOM metadata. Air kerma and HVL were measured for all beam qualities used to acquire the mammograms. Volumetric breast density was estimated using Quantra™. AGD was estimated using the Dance model. AGD reported directly from the DICOM metadata was also assessed. Mean AGD was 1.74 and 2.10 mGy for DM and DBT, respectively. Mean DBT/DM AGD ratio was 1.24. For fatty breasts: mean AGD was 1.74 and 2.27 mGy for DM and DBT, respectively. For dense breasts: mean AGD was 1.73 and 1.79 mGy, for DM and DBT, respectively. For breasts of similar thickness, dense breasts had higher AGD for DM and similar AGD for DBT. The DBT/DM dose ratio was substantially lower for dense compared to fatty breasts (1.08 versus 1.33). The average c-factor was 1.16. Using previously published polynomials to estimate glandularity from thickness underestimated the c-factor by 5.9% on average. Mean AGD error between estimates based on measurements (air kerma and HVL) versus DICOM header data was 3.8%, but for one mammography unit as high as 7.9%. Mean error of using the AGD value reported in the DICOM header was 10.7 and 13.3%, respectively. Thus, measurement of breast density, radiation dose and beam quality can substantially affect AGD estimates.

  13. Average glandular dose in paired digital mammography and digital breast tomosynthesis acquisitions in a population based screening program: effects of measuring breast density, air kerma and beam quality.

    PubMed

    Østerås, Bjørn Helge; Skaane, Per; Gullien, Randi; Martinsen, Anne Catrine Trægde

    2018-01-25

    The main purpose was to compare average glandular dose (AGD) for same-compression digital mammography (DM) and digital breast tomosynthesis (DBT) acquisitions in a population based screening program, with and without breast density stratification, as determined by automatically calculated breast density (Quantra™). A secondary purpose was to compare AGD estimates based on measured breast density, air kerma and half value layer (HVL) to DICOM metadata based estimates. AGD was estimated for 3819 women participating in the screening trial. All received craniocaudal and mediolateral oblique views of each breast with paired DM and DBT acquisitions. Exposure parameters were extracted from DICOM metadata. Air kerma and HVL were measured for all beam qualities used to acquire the mammograms. Volumetric breast density was estimated using Quantra™. AGD was estimated using the Dance model. AGD reported directly from the DICOM metadata was also assessed. Mean AGD was 1.74 and 2.10 mGy for DM and DBT, respectively. Mean DBT/DM AGD ratio was 1.24. For fatty breasts: mean AGD was 1.74 and 2.27 mGy for DM and DBT, respectively. For dense breasts: mean AGD was 1.73 and 1.79 mGy, for DM and DBT, respectively. For breasts of similar thickness, dense breasts had higher AGD for DM and similar AGD for DBT. The DBT/DM dose ratio was substantially lower for dense compared to fatty breasts (1.08 versus 1.33). The average c-factor was 1.16. Using previously published polynomials to estimate glandularity from thickness underestimated the c-factor by 5.9% on average. Mean AGD error between estimates based on measurements (air kerma and HVL) versus DICOM header data was 3.8%, but for one mammography unit as high as 7.9%. Mean error of using the AGD value reported in the DICOM header was 10.7 and 13.3%, respectively. Thus, measurement of breast density, radiation dose and beam quality can substantially affect AGD estimates.
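
    For reference, the Dance model used in both records above computes AGD as the product of the incident air kerma and tabulated conversion factors, AGD = K * g * c * s; the sketch below shows the arithmetic with placeholder factor values (in practice g, c and s are interpolated from Dance's published tables as functions of HVL, breast thickness and glandularity, and DBT adds a further T factor).

```python
def average_glandular_dose(k_air_mgy: float, g: float, c: float, s: float) -> float:
    """Dance model: mean glandular dose (mGy) from incident air kerma and conversion factors."""
    return k_air_mgy * g * c * s

# Placeholder values, for illustration only (not taken from the study above).
agd = average_glandular_dose(k_air_mgy=6.0, g=0.25, c=1.05, s=1.0)
print(f"AGD ≈ {agd:.2f} mGy")
```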

  14. Population-based cancer survival in the United States: Data, quality control, and statistical methods.

    PubMed

    Allemani, Claudia; Harewood, Rhea; Johnson, Christopher J; Carreira, Helena; Spika, Devon; Bonaventure, Audrey; Ward, Kevin; Weir, Hannah K; Coleman, Michel P

    2017-12-15

    Robust comparisons of population-based cancer survival estimates require tight adherence to the study protocol, standardized quality control, appropriate life tables of background mortality, and centralized analysis. The CONCORD program established worldwide surveillance of population-based cancer survival in 2015, analyzing individual data on 26 million patients (including 10 million US patients) diagnosed between 1995 and 2009 with 1 of 10 common malignancies. In this Cancer supplement, we analyzed data from 37 state cancer registries that participated in the second cycle of the CONCORD program (CONCORD-2), covering approximately 80% of the US population. Data quality checks were performed in 3 consecutive phases: protocol adherence, exclusions, and editorial checks. One-, 3-, and 5-year age-standardized net survival was estimated using the Pohar Perme estimator and state- and race-specific life tables of all-cause mortality for each year. The cohort approach was adopted for patients diagnosed between 2001 and 2003, and the complete approach for patients diagnosed between 2004 and 2009. Articles in this supplement report population coverage, data quality indicators, and age-standardized 5-year net survival by state, race, and stage at diagnosis. Examples of tables, bar charts, and funnel plots are provided in this article. Population-based cancer survival is a key measure of the overall effectiveness of services in providing equitable health care. The high quality of US cancer registry data, 80% population coverage, and use of an unbiased net survival estimator ensure that the survival trends reported in this supplement are robustly comparable by race and state. The results can be used by policymakers to identify and address inequities in cancer survival in each state and for the United States nationally. Cancer 2017;123:4982-93. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  15. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different management decisions. Our research results indicate that the (often large) observed differences between MPN and CFU values for the same water body are well within the ranges predicted by our probabilistic model. Our research also indicates that the probability of violating current water quality guidelines at specified true fecal coliform concentrations depends on the laboratory procedure used. As a result, quality-based management decisions, such as opening or closing a shellfishing area, may also depend on the laboratory procedure used.
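
    To make the MPN definition concrete, the sketch below computes the maximum likelihood estimate of concentration from a serial-dilution tube test, where each tube is positive with probability 1 - exp(-lambda * v); the dilution series and tube counts are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mpn_mle(positives, tubes, volumes_ml):
    """MLE of organism concentration (per mL) from numbers of positive tubes per dilution."""
    positives, tubes, volumes_ml = map(np.asarray, (positives, tubes, volumes_ml))

    def neg_log_lik(lam):
        p_pos = np.clip(1.0 - np.exp(-lam * volumes_ml), 1e-12, 1 - 1e-12)
        return -np.sum(positives * np.log(p_pos) + (tubes - positives) * np.log(1 - p_pos))

    return minimize_scalar(neg_log_lik, bounds=(1e-6, 1e4), method='bounded').x

# Hypothetical 5-tube series with 10, 1 and 0.1 mL aliquots
lam = mpn_mle(positives=[5, 3, 1], tubes=[5, 5, 5], volumes_ml=[10.0, 1.0, 0.1])
print(f"MPN ≈ {100 * lam:.0f} per 100 mL")
```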

  16. Remote sensing analysis of water quality and the entrapment zone in the San Francisco Bay and delta

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.; Knight, A. W. (Principal Investigator); Khorram, S.

    1979-01-01

    The author has identified the following significant results. Data from an ocean color scanner flown on a U-2 aircraft at 65,000 ft and from LANDSAT MSS were enhanced to identify biologically active areas of San Francisco Bay, and to determine the salinity, turbidity, chlorophylls, and suspended solids. The best-fit regression models for mapping water quality parameters were based on bands 3, 5, 7, and 10. Comparison of the results indicates that chlorophyll concentrations can best be estimated by the OCS or MSS data. Salinity can best be estimated by OCS data; turbidity can best be estimated by LANDSAT data. Neither OCS nor MSS data provide a reliable basis for estimating suspended solids. Aerial photography of the highest quality taken with either conventional color or infrared-sensitive color films was unable to locate the biologically active areas.

  17. Soldier Quality of Life Assessment

    DTIC Science & Technology

    2016-09-01

    This report documents survey research and modeling of Soldier quality of life (QoL) on contingency base camps by the U.S. Army Natick...Science and Technology Objective Demonstration, was to develop a way to quantify QoL for camps housing fewer than 1000 personnel. A discrete choice survey ... Survey results were analyzed using hierarchical Bayesian logistic regression to develop a quantitative model for estimating QoL based on base camp

  18. Application of the quantum spin glass theory to image restoration.

    PubMed

    Inoue, J I

    2001-04-01

    Quantum fluctuation is introduced into the Markov random-field model for image restoration in the context of a Bayesian approach. Using statistical mechanics, we investigate how the quantum fluctuation affects the quality of black-and-white image restoration. We find that the maximum posterior marginal (MPM) estimate based on the quantum fluctuation gives a finer restoration than the maximum a posteriori estimate or the thermal-fluctuation-based MPM estimate.

  19. Regression estimators for generic health-related quality of life and quality-adjusted life years.

    PubMed

    Basu, Anirban; Manca, Andrea

    2012-01-01

    To develop regression models for outcomes with truncated supports, such as health-related quality of life (HRQoL) data, and account for features typical of such data such as a skewed distribution, spikes at 1 or 0, and heteroskedasticity. Regression estimators based on features of the Beta distribution. First, both a single equation and a 2-part model are presented, along with estimation algorithms based on maximum-likelihood, quasi-likelihood, and Bayesian Markov-chain Monte Carlo methods. A novel Bayesian quasi-likelihood estimator is proposed. Second, a simulation exercise is presented to assess the performance of the proposed estimators against ordinary least squares (OLS) regression for a variety of HRQoL distributions that are encountered in practice. Finally, the performance of the proposed estimators is assessed by using them to quantify the treatment effect on QALYs in the EVALUATE hysterectomy trial. Overall model fit is studied using several goodness-of-fit tests such as Pearson's correlation test, link and reset tests, and a modified Hosmer-Lemeshow test. The simulation results indicate that the proposed methods are more robust in estimating covariate effects than OLS, especially when the effects are large or the HRQoL distribution has a large spike at 1. Quasi-likelihood techniques are more robust than maximum likelihood estimators. When applied to the EVALUATE trial, all but the maximum likelihood estimators produce unbiased estimates of the treatment effect. One and 2-part Beta regression models provide flexible approaches to regress the outcomes with truncated supports, such as HRQoL, on covariates, after accounting for many idiosyncratic features of the outcomes distribution. This work will provide applied researchers with a practical set of tools to model outcomes in cost-effectiveness analysis.

  20. Application of independent component analysis for speech-music separation using an efficient score function estimation

    NASA Astrophysics Data System (ADS)

    Pishravian, Arash; Aghabozorgi Sahaf, Masoud Reza

    2012-12-01

    In this paper, speech-music separation using Blind Source Separation is discussed. The separation algorithm is based on mutual information minimization, where the natural gradient algorithm is used for the minimization. This requires estimating the score function from samples of the observed signals (mixtures of speech and music). The accuracy and speed of this estimation affect the quality of the separated signals and the processing time of the algorithm. The score function estimation in the presented algorithm is based on a Gaussian-mixture-based kernel density estimation method. Experimental results on speech-music separation, compared with a separation algorithm based on the Minimum Mean Square Error estimator, indicate that the proposed method achieves better performance with less processing time.

  1. Air quality mapping using GIS and economic evaluation of health impact for Mumbai City, India.

    PubMed

    Kumar, Awkash; Gupta, Indrani; Brandt, Jørgen; Kumar, Rakesh; Dikshit, Anil Kumar; Patil, Rashmi S

    2016-05-01

    Mumbai, a highly populated city in India, has been selected for air quality mapping and assessment of health impact using monitored air quality data. Air quality monitoring networks in Mumbai are operated by the National Environmental Engineering Research Institute (NEERI), the Maharashtra Pollution Control Board (MPCB), and the Brihanmumbai Municipal Corporation (BMC). A monitoring station represents air quality at a particular location, whereas spatial variation is needed for air quality management. Here, monitored air quality data from NEERI and BMC were spatially interpolated using various built-in interpolation techniques in ArcGIS. Inverse distance weighting (IDW), Kriging (spherical and Gaussian), and spline techniques were applied for spatial interpolation in this study. The interpolated results for the air pollutants sulfur dioxide (SO2), nitrogen dioxide (NO2) and suspended particulate matter (SPM) were compared with air quality data of MPCB in the same region. The comparison showed good agreement between values predicted using IDW and Kriging and the observed data. Subsequently, a health impact assessment of a ward was carried out based on the total population of the ward and the air quality data monitored within the ward. Finally, the health cost within a ward was estimated on the basis of the exposed population. This study helps to estimate the economic value of health damage due to air pollution. Operating more air quality monitoring stations is highly resource intensive in terms of time and cost; appropriate spatial interpolation techniques can be used to estimate concentrations where monitoring stations are not available. Further, health impact assessment for the population of the city and estimation of the economic cost of health damage due to ambient air quality can help in making rational control strategies for environmental management. The total health cost for Mumbai city for the year 2012, with a population of 12.4 million, was estimated at USD 8000 million.
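
    Of the interpolators mentioned, inverse distance weighting is the simplest to sketch; the station coordinates and NO2 values below are purely illustrative.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_grid, power=2.0):
    """Inverse distance weighting: weights fall off as 1 / d**power."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                 # avoid division by zero at station locations
    w = 1.0 / d ** power
    return (w @ z_obs) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 6.0], [7.0, 7.0]])   # km
no2 = np.array([48.0, 61.0, 35.0, 52.0])                                # ug/m3
grid = np.array([[1.0, 1.0], [4.0, 4.0], [6.5, 6.5]])
print(np.round(idw(stations, no2, grid), 1))
```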

  2. A sound quality model for objective synthesis evaluation of vehicle interior noise based on artificial neural network

    NASA Astrophysics Data System (ADS)

    Wang, Y. S.; Shen, G. Q.; Xing, Y. F.

    2014-03-01

    Based on the artificial neural network (ANN) technique, an objective sound quality evaluation (SQE) model for the synthetical annoyance of vehicle interior noises is presented in this paper. Following the GB/T18697 standard, the interior noises under different working conditions of a sample vehicle are first measured and saved in a noise database. Mathematical models for the loudness, sharpness and roughness of the measured vehicle noises are established and implemented in Matlab. Sound qualities of the vehicle interior noises are also estimated by jury tests following the anchored semantic differential (ASD) procedure. Using the objective and subjective evaluation results, an ANN-based model for synthetical annoyance evaluation of vehicle noises, the so-called ANN-SAE model, is developed. Finally, the ANN-SAE model is validated by verification tests using the leave-one-out algorithm. The results suggest that the proposed ANN-SAE model is accurate and effective and can be used directly to estimate the sound quality of vehicle interior noises, which is very helpful for vehicle acoustical design and improvement. The ANN-SAE approach may be extended to other sound-related fields for product quality evaluation in SQE engineering.

  3. A no-reference image and video visual quality metric based on machine learning

    NASA Astrophysics Data System (ADS)

    Frantc, Vladimir; Voronin, Viacheslav; Semenishchev, Evgenii; Minkin, Maxim; Delov, Aliy

    2018-04-01

    The paper presents a novel visual quality metric for lossy compressed video quality assessment. Its high degree of correlation with subjective quality estimates is achieved by using a convolutional neural network trained on a large set of video sequence-subjective quality score pairs. We demonstrate how our predicted no-reference quality metric correlates with qualitative opinion in a human observer study. Results are shown on the EVVQ dataset in comparison with existing approaches.

  4. Onboard TDI stage estimation and calibration using SNR analysis

    NASA Astrophysics Data System (ADS)

    Haghshenas, Javad

    2017-09-01

    The electro-optical design of a push-broom space camera for a Low Earth Orbit (LEO) remote sensing satellite is performed based on noise analysis of TDI sensors for very high GSD and low-light-level missions. It is well demonstrated that the CCD TDI mode of operation provides increased photosensitivity relative to a linear CCD array without sacrificing spatial resolution. However, for satellite imaging, in order to exploit the advantages that the TDI mode of operation offers, attention should be given to the parameters that affect the image quality of TDI sensors, such as jitter, vibration, and noise. A fixed, predefined number of TDI stages may not satisfy the image quality requirements of the satellite camera. Furthermore, in order to use the whole dynamic range of the sensor, the imager must be able to set the number of TDI stages for every shot based on these parameters. This paper deals with the optimal estimation and setting of the stages based on trade-offs among MTF, noise and SNR. On-board SNR estimation is simulated using atmospheric analysis based on the MODTRAN algorithm in the PcModWin software. Based on the noise models, we propose a formulation to estimate the number of TDI stages that satisfies the system SNR requirement; the MTF requirement must be satisfied in the same manner. A proper combination of both parameters guarantees full use of the dynamic range along with high SNR and image quality.
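
    Under a simple shot-noise model the idea of choosing the stage count from an SNR target can be sketched as below: signal grows linearly with the number of stages while shot and dark noise grow with its square root, and the smallest stage count meeting the target is selected. The electron counts and noise figures are illustrative, not those of any particular sensor, and a real design would also check full-well capacity and the MTF budget discussed above.

```python
import numpy as np

def required_tdi_stages(electrons_per_stage, read_noise_e=30.0, dark_e_per_stage=5.0,
                        snr_target=100.0, max_stages=128):
    """Smallest TDI stage count whose predicted SNR meets the target."""
    for n in range(1, max_stages + 1):
        signal = n * electrons_per_stage
        noise = np.sqrt(signal + n * dark_e_per_stage + read_noise_e ** 2)
        if signal / noise >= snr_target:
            return n
    return max_stages

# electrons_per_stage would come from the onboard radiance/SNR estimate
# (e.g. the MODTRAN-based simulation mentioned above) for the current scene.
print(required_tdi_stages(electrons_per_stage=1500.0, snr_target=120.0))
```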

  5. [Cost-effectiveness analysis and diet quality index applied to the WHO Global Strategy].

    PubMed

    Machado, Flávia Mori Sarti; Simões, Arlete Naresse

    2008-02-01

    To test the use of cost-effectiveness analysis as a decision-making tool in the production of meals for the inclusion of the recommendations published in the World Health Organization's Global Strategy. Five alternative options for a breakfast menu were assessed prior to their adoption in a food service at a university in the state of Sao Paulo, Southeastern Brazil, in 2006. Costs of the different options were based on market prices of food items (direct cost). Health benefits were estimated based on an adaptation of the Diet Quality Index (DQI). Cost-effectiveness ratios were estimated by dividing benefits by costs, and incremental cost-effectiveness ratios were estimated as the cost differential per unit of additional benefit. The meal choice was based on health benefit units associated with direct production cost as well as incremental effectiveness per unit of differential cost. The analysis showed the simplest option with the addition of a fruit (DQI = 64 / cost = R$ 1.58) to be the best alternative. Higher effectiveness was seen in the options with a fruit portion (DQI1=64 / DQI3=58 / DQI5=72) compared to the others (DQI2=48 / DQI4=58). The estimation of cost-effectiveness ratios allowed identification of the best breakfast option based on cost-effectiveness analysis and the Diet Quality Index. These instruments allow ease of application and objective evaluation, which are key to the process of inclusion of public or private institutions under the Global Strategy directives.
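
    The cost-effectiveness arithmetic described above reduces to two ratios; a minimal sketch follows, where effectiveness is the DQI score of a menu option and cost is its direct production cost. Only option 1's figures come from the abstract; the baseline option's cost is a made-up stand-in.

```python
# Cost-effectiveness ratio (benefit per cost) and incremental ratio (extra cost
# per extra DQI point) for two illustrative breakfast options.
options = {
    "option_1_with_fruit": {"dqi": 64, "cost": 1.58},   # cost from the abstract (R$)
    "option_2":            {"dqi": 48, "cost": 1.40},   # cost is a stand-in value
}

def cost_effectiveness(dqi, cost):
    return dqi / cost                                    # benefit units per R$

def icer(dqi_new, cost_new, dqi_base, cost_base):
    return (cost_new - cost_base) / (dqi_new - dqi_base) # extra R$ per extra DQI point

base = options["option_2"]
alt = options["option_1_with_fruit"]
print("CER baseline:", round(cost_effectiveness(base["dqi"], base["cost"]), 2))
print("CER with fruit:", round(cost_effectiveness(alt["dqi"], alt["cost"]), 2))
print("ICER (R$ per DQI point):", round(icer(alt["dqi"], alt["cost"], base["dqi"], base["cost"]), 3))
```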

  6. The Use of Satellite Remote Sensing in Epidemiological Studies

    PubMed Central

    Sorek-Hamer, Meytar; Just, Allan C.; Kloog, Itai

    2016-01-01

    Purpose of review: Particulate matter (PM) air pollution is a ubiquitous exposure linked with multiple adverse health outcomes for children and across the life course. The recent development of satellite based remote sensing models for air pollution enables the quantification of these risks and addresses many limitations of previous air pollution research strategies. We review the recent literature on the applications of satellite remote sensing in air quality research, with a focus on their use in epidemiological studies. Recent findings: Aerosol optical depth (AOD) is a focus of this review and a significant number of studies show that ground-level PM can be estimated from columnar AOD. Satellite measurements have been found to be an important source of data for PM model-based exposure estimates, and recently have been used in health studies to increase the spatial breadth and temporal resolution of these estimates. Summary: It is suggested that satellite-based models improve our understanding of the spatial characteristics of air quality. Although the adoption of satellite-based measures of air quality in health studies is in its infancy, it is rapidly growing. Nevertheless, further investigation is still needed in order to have a better understanding of the AOD contribution to these prediction models in order to use them with higher accuracy in epidemiological studies. PMID:26859287

  7. Satellite remote sensing in epidemiological studies.

    PubMed

    Sorek-Hamer, Meytar; Just, Allan C; Kloog, Itai

    2016-04-01

    Particulate matter air pollution is a ubiquitous exposure linked with multiple adverse health outcomes for children and across the life course. The recent development of satellite-based remote-sensing models for air pollution enables the quantification of these risks and addresses many limitations of previous air pollution research strategies. We review the recent literature on the applications of satellite remote sensing in air quality research, with a focus on their use in epidemiological studies. Aerosol optical depth (AOD) is a focus of this review and a significant number of studies show that ground-level particulate matter can be estimated from columnar AOD. Satellite measurements have been found to be an important source of data for particulate matter model-based exposure estimates, and recently have been used in health studies to increase the spatial breadth and temporal resolution of these estimates. It is suggested that satellite-based models improve our understanding of the spatial characteristics of air quality. Although the adoption of satellite-based measures of air quality in health studies is in its infancy, it is rapidly growing. Nevertheless, further investigation is still needed in order to have a better understanding of the AOD contribution to these prediction models in order to use them with higher accuracy in epidemiological studies.

  8. Quality control and gap-filling of PM10 daily mean concentrations with the best linear unbiased estimator.

    PubMed

    Sozzi, R; Bolignano, A; Ceradini, S; Morelli, M; Petenko, I; Argentini, S

    2017-10-15

    According to the European Directive 2008/50/CE, air quality assessment consists of measuring the concentration fields and evaluating the mean, the number of exceedances, etc. of chemical species dangerous to human health. The measurements provided by an air quality ground-based monitoring network are the main information source, but the availability of these data is often limited by several technical and operational problems. In this paper, the best linear unbiased estimator (BLUE) is proposed to validate the pollutant concentration values and to fill the gaps in the measurement time series collected by a monitoring network. The BLUE algorithm is tested using the daily mean concentrations of particulate matter with aerodynamic diameter less than 10 μm (PM10 concentrations) measured by the air quality monitoring sensors operating in the Lazio Region in Italy. The comparison between the estimated and measured data shows an error comparable with the measurement uncertainty. Due to its simplicity and reliability, the BLUE will be used in the routine quality test procedures of the Lazio air quality monitoring network measurements.
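
    A minimal sketch of a BLUE-style gap filler, under the assumptions that unbiasedness is enforced through weights summing to one and that station covariances are estimated from the historical record: a missing daily PM10 value at one station is estimated as a linear combination of the simultaneous values at the other stations. The data below are synthetic.

```python
# BLUE-style gap filling: weights minimise error variance subject to sum(w) = 1.
import numpy as np

def blue_weights(cov_neighbors, cov_target_neighbors):
    """Solve the Lagrange system for minimum-variance unbiased weights."""
    n = cov_neighbors.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = cov_neighbors
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    b = np.append(cov_target_neighbors, 1.0)
    sol = np.linalg.solve(A, b)
    return sol[:n]                           # last entry is the Lagrange multiplier

# Synthetic historical PM10 record: rows = days, columns = stations (correlated)
rng = np.random.default_rng(1)
common = rng.lognormal(3.0, 0.3, size=(365, 1))
history = common * rng.lognormal(0.0, 0.1, size=(365, 4))
cov = np.cov(history, rowvar=False)

# Station 0 has a gap today; estimate it from stations 1..3
w = blue_weights(cov[1:, 1:], cov[0, 1:])
todays_neighbors = np.array([28.0, 33.5, 30.2])   # measured values at stations 1..3
estimate = w @ todays_neighbors
print("weights:", np.round(w, 3), "filled value:", round(float(estimate), 1))
```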

  9. Adjustment of regional regression models of urban-runoff quality using data for Chattanooga, Knoxville, and Nashville, Tennessee

    USGS Publications Warehouse

    Hoos, Anne B.; Patel, Anant R.

    1996-01-01

    Model-adjustment procedures were applied to the combined databases of storm-runoff quality for Chattanooga, Knoxville, and Nashville, Tennessee, to improve predictive accuracy for storm-runoff quality for urban watersheds in these three cities and throughout Middle and East Tennessee. Data for 45 storms at 15 different sites (five sites in each city) constitute the database. Comparison of observed values of storm-runoff load and event-mean concentration to the predicted values from the regional regression models for 10 constituents shows prediction errors as large as 806,000 percent. Model-adjustment procedures, which combine the regional model predictions with local data, are applied to improve predictive accuracy. Standard error of estimate after model adjustment ranges from 67 to 322 percent. Calibration results may be biased due to sampling error in the Tennessee database. The relatively large values of standard error of estimate for some of the constituent models, although representing significant reduction (at least 50 percent) in prediction error compared to estimation with unadjusted regional models, may be unacceptable for some applications. The user may wish to collect additional local data for these constituents and repeat the analysis, or calibrate an independent local regression model.

  10. Using regression methods to estimate stream phosphorus loads at the Illinois River, Arkansas

    USGS Publications Warehouse

    Haggard, B.E.; Soerens, T.S.; Green, W.R.; Richards, R.P.

    2003-01-01

    The development of total maximum daily loads (TMDLs) requires evaluating existing constituent loads in streams. Accurate estimates of constituent loads are needed to calibrate watershed and reservoir models for TMDL development. The best approach to estimate constituent loads is high frequency sampling, particularly during storm events, and mass integration of constituents passing a point in a stream. Most often, resources are limited and discrete water quality samples are collected on fixed intervals and sometimes supplemented with directed sampling during storm events. When resources are limited, mass integration is not an accurate means to determine constituent loads and other load estimation techniques such as regression models are used. The objective of this work was to determine a minimum number of water-quality samples needed to provide constituent concentration data adequate to estimate constituent loads at a large stream. Twenty sets of water quality samples with and without supplemental storm samples were randomly selected at various fixed intervals from a database at the Illinois River, northwest Arkansas. The random sets were used to estimate total phosphorus (TP) loads using regression models. The regression-based annual TP loads were compared to the integrated annual TP load estimated using all the data. At a minimum, monthly sampling plus supplemental storm samples (six samples per year) was needed to produce a root mean square error of less than 15%. Water quality samples should be collected at least semi-monthly (every 15 days) in studies less than two years if seasonal time factors are to be used in the regression models. Annual TP loads estimated from independently collected discrete water quality samples further demonstrated the utility of using regression models to estimate annual TP loads in this stream system.
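
    A minimal sketch of the regression-model approach with synthetic data: ln(load) is fitted against ln(discharge) plus seasonal terms on the discrete samples, and daily predictions are summed to an annual TP load. The study's exact model form, sampling designs and bias correction are not reproduced here.

```python
# Rating-curve style load regression fitted on discrete samples, applied daily.
import numpy as np

rng = np.random.default_rng(0)
# Discrete samples: discharge Q (m3/s), TP concentration (mg/L), day of year
q_sample = rng.lognormal(2.0, 0.8, 24)
doy_sample = rng.integers(1, 366, 24)
tp_conc = 0.05 * q_sample ** 0.3 * np.exp(rng.normal(0, 0.2, 24))
load_sample = tp_conc * q_sample * 86.4            # kg/day (mg/L * m3/s * 86.4)

def design(q, doy):
    """Intercept, ln(Q), and seasonal sine/cosine terms."""
    ang = 2 * np.pi * doy / 365.25
    return np.column_stack([np.ones_like(q, dtype=float), np.log(q), np.sin(ang), np.cos(ang)])

beta, *_ = np.linalg.lstsq(design(q_sample, doy_sample), np.log(load_sample), rcond=None)

# Apply to a continuous daily discharge record and sum to an annual load
q_daily = rng.lognormal(2.0, 0.8, 365)
doy_daily = np.arange(1, 366)
annual_load_kg = np.exp(design(q_daily, doy_daily) @ beta).sum()
print(f"estimated annual TP load: {annual_load_kg:.0f} kg")
```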

  11. Modeling water quality in an urban river using hydrological factors--data driven approaches.

    PubMed

    Chang, Fi-John; Tsai, Yu-Hsuan; Chen, Pin-An; Coynel, Alexandra; Vachaud, Georges

    2015-03-15

    Contrasting seasonal variations occur in river flow and water quality as a result of short-duration, severe-intensity storms and typhoons in Taiwan. Sudden changes in river flow caused by impending extreme events may impose serious degradation on river water quality and have fateful impacts on ecosystems. Water quality is measured at a monthly/quarterly scale, and therefore an estimation of water quality at a daily scale would greatly help timely river pollution management. This study proposes a systematic analysis scheme (SAS) to assess the spatio-temporal interrelation of water quality in an urban river and to construct water quality estimation models using two static and one dynamic artificial neural networks (ANNs) coupled with the Gamma test (GT), based on water quality, hydrological and economic data. The Dahan River basin in Taiwan is the study area. Ammonia nitrogen (NH3-N) is considered the representative parameter, a correlative indicator for judging the contamination level over the study. Key factors most closely related to the representative parameter (NH3-N) are extracted by the Gamma test for modeling NH3-N concentration; as a result, four hydrological factors (discharge, days without discharge, water temperature and rainfall) are identified as model inputs. The modeling results demonstrate that the nonlinear autoregressive with exogenous input (NARX) network, furnished with recurrent connections, can accurately estimate NH3-N concentration with a very high coefficient of efficiency value (0.926) and a low RMSE value (0.386 mg/l). Moreover, the NARX network can suitably capture peak values that mainly occur in dry periods (September-April in the study area), which is particularly important for water pollution treatment. The proposed SAS suggests a promising approach to reliably modeling the spatio-temporal NH3-N concentration based solely on hydrological data, without using water quality sampling data. It is worth noting that such estimation can be made at a much shorter time interval of interest (from a monthly scale down to a daily scale) because hydrological data have long been collected at a daily scale. The proposed SAS makes NH3-N concentration estimation much easier (with only hydrological field sampling) and more efficient (at shorter time intervals), which can substantially help river managers interpret and estimate water quality responses to natural and/or manmade pollution in a more effective and timely way for river pollution management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Is a Quality Course a Worthy Course? Designing for Value and Worth in Online Courses

    ERIC Educational Resources Information Center

    Youger, Robin E.; Ahern, Terence C.

    2015-01-01

    There are many strategies for estimating the effectiveness of instruction. Typically, most methods are based on the student evaluation. Recently a more standardized approach, Quality Matters (QM), has been developed that uses an objectives-based strategy. QM, however, does not account for the learning process, nor for the value and worth of the…

  13. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations

    PubMed Central

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J. Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observation were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts were needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types. PMID:27472383

  14. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    PubMed

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observation were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts were needed to evaluate and improve the proposed algorithm at larger spatial scales and time periods, and over different land cover types.

  15. Total suspended solids concentrations and yields for water-quality monitoring stations in Gwinnett County, Georgia, 1996-2009

    USGS Publications Warehouse

    Landers, Mark N.

    2013-01-01

    The U.S. Geological Survey, in cooperation with the Gwinnett County Department of Water Resources, established a water-quality monitoring program during late 1996 to collect comprehensive, consistent, high-quality data for use by watershed managers. As of 2009, continuous streamflow and water-quality data as well as discrete water-quality samples were being collected for 14 watershed monitoring stations in Gwinnett County. This report provides statistical summaries of total suspended solids (TSS) concentrations for 730 stormflow and 710 base-flow water-quality samples collected between 1996 and 2009 for 14 watershed monitoring stations in Gwinnett County. Annual yields of TSS were estimated for each of the 14 watersheds using methods described in previous studies. TSS yield was estimated using linear, ordinary least-squares regression of TSS and explanatory variables of discharge, turbidity, season, date, and flow condition. The error of prediction for estimated yields ranged from 1 to 42 percent for the stations in this report; however, the actual overall uncertainty of the estimated yields cannot be less than that of the observed yields (± 15 to 20 percent). These watershed yields provide a basis for evaluation of how watershed characteristics, climate, and watershed management practices affect suspended sediment yield.

  16. A probabilistic model of gastroenteritis risks associated with consumption of street food salads in Kumasi, Ghana: evaluation of methods to estimate pathogen dose from water, produce or food quality.

    PubMed

    Barker, S Fiona; Amoah, Philip; Drechsel, Pay

    2014-07-15

    With a rapidly growing urban population in Kumasi, Ghana, the consumption of street food is increasing. Raw salads, which often accompany street food dishes, are typically composed of perishable vegetables that are grown in close proximity to the city using poor quality water for irrigation. This study assessed the risk of gastroenteritis illness (caused by rotavirus, norovirus and Ascaris lumbricoides) associated with the consumption of street food salads using Quantitative Microbial Risk Assessment (QMRA). Three different risk assessment models were constructed, based on availability of microbial concentrations: 1) Water - starting from irrigation water quality, 2) Produce - starting from the quality of produce at market, and 3) Street - using the microbial quality of street food salad. In the absence of viral concentrations, published ratios between faecal coliforms and viruses were used to estimate the quality of water, produce and salad, and annual disease burdens were determined. Rotavirus dominated the estimates of annual disease burden (~10^-3 Disability Adjusted Life Years per person per year (DALYs pppy)), although norovirus also exceeded the 10^-4 DALY threshold for both the Produce and Street models. The Water model ignored other on-farm and post-harvest sources of contamination and consistently produced lower estimates of risk; it likely underestimates disease burden and therefore is not recommended. Required log reductions of up to 5.3 (95th percentile) for rotavirus were estimated for the Street model, demonstrating that significant interventions are required to protect the health and safety of street food consumers in Kumasi. Estimates of virus concentrations were a significant source of model uncertainty, and more data on pathogen concentrations are needed to refine QMRA estimates of disease burden. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Iterated unscented Kalman filter for phase unwrapping of interferometric fringes.

    PubMed

    Xie, Xianming

    2016-08-22

    A novel phase unwrapping algorithm based on an iterated unscented Kalman filter is proposed to estimate the unambiguous unwrapped phase of interferometric fringes. The method combines an iterated unscented Kalman filter with a robust phase gradient estimator based on an amended matrix pencil model and an efficient quality-guided strategy based on heap sort. The iterated unscented Kalman filter, one of the most robust methods in the Bayesian framework for non-linear signal processing, is applied for the first time to perform noise suppression and phase unwrapping of interferometric fringes simultaneously, which simplifies the complexity and difficulty of the pre-filtering procedure usually required before phase unwrapping, and can even remove the pre-filtering procedure altogether. The robust phase gradient estimator is used to efficiently and accurately obtain the phase gradient information from interferometric fringes, which is needed by the iterated unscented Kalman filtering phase unwrapping model. The efficient quality-guided strategy ensures that the proposed method quickly unwraps pixels along a path from the high-quality area to the low-quality area of wrapped phase images, which greatly improves the efficiency of phase unwrapping. Results obtained from synthetic and real data show that the proposed method obtains better solutions with acceptable time consumption compared with some of the most widely used algorithms.

  18. A Web-Based Decision Support System for Assessing Regional Water-Quality Conditions and Management Actions

    USGS Publications Warehouse

    Booth, N.L.; Everman, E.J.; Kuo, I.-L.; Sprague, L.; Murphy, L.

    2011-01-01

    The U.S. Geological Survey National Water Quality Assessment Program has completed a number of water-quality prediction models for nitrogen and phosphorus for the conterminous United States as well as for regional areas of the nation. In addition to estimating water-quality conditions at unmonitored streams, the calibrated SPAtially Referenced Regressions On Watershed attributes (SPARROW) models can be used to produce estimates of yield, flow-weighted concentration, or load of constituents in water under various land-use condition, change, or resource management scenarios. A web-based decision support infrastructure has been developed to provide access to SPARROW simulation results on stream water-quality conditions and to offer sophisticated scenario testing capabilities for research and water-quality planning via a graphical user interface with familiar controls. The SPARROW decision support system (DSS) is delivered through a web browser over an Internet connection, making it widely accessible to the public in a format that allows users to easily display water-quality conditions and to describe, test, and share modeled scenarios of future conditions. SPARROW models currently supported by the DSS are based on the modified digital versions of the 1:500,000-scale River Reach File (RF1) and 1:100,000-scale National Hydrography Dataset (medium-resolution, NHDPlus) stream networks. © 2011 American Water Resources Association. This article is a U.S. Government work and is in the public domain in the USA.

  19. Explore the Impacts of River Flow and Water Quality on Fish Communities

    NASA Astrophysics Data System (ADS)

    Tsai, W. P.; Chang, F. J.; Lin, C. Y.; Hu, J. H.; Yu, C. J.; Chu, T. J.

    2015-12-01

    Owing to the limitations of the geographical environment in Taiwan, the uneven temporal and spatial distribution of rainfall causes significant impacts on river ecosystems. To pursue sustainable water resources development, integrity and rationality are important to water management planning. The water quality and flow regimes of rivers are closely related to each other and affect river ecosystems simultaneously. Therefore, this study collects long-term observational heterogeneous data, including water quality parameters, stream flow and fish species in the Danshui River of northern Taiwan, and aims to explore the complex impacts of water quality and flow regime on fish communities in order to comprehend the state of the eco-hydrological system in this river basin. First, this study improves the understanding of the relationship between water quality parameters, flow regime and fish species by using artificial neural networks (ANNs). The self-organizing feature map (SOM) is an unsupervised learning process used to cluster, analyze and visualize large amounts of data. The results of the SOM show that nine clusters (3x3) form the optimum map size based on the local minimum values of both quantization error (QE) and topographic error (TE). Second, the fish diversity indexes are estimated by using the adaptive network-based fuzzy inference system (ANFIS) based on key input factors determined by the Gamma Test (GT), which is a useful tool for reducing model dimension and the structural complexity of ANNs. The results reveal that the constructed models can effectively estimate fish diversity indexes and produce good estimation performance based on the 9 clusters identified by the SOM, in which RMSE is 0.18 and CE is 0.84 for the training data set while RMSE is 0.20 and CE is 0.80 for the testing data set.

  20. Medical costs and quality-adjusted life years associated with smoking: a systematic review.

    PubMed

    Feirman, Shari P; Glasser, Allison M; Teplitskaya, Lyubov; Holtgrave, David R; Abrams, David B; Niaura, Raymond S; Villanti, Andrea C

    2016-07-27

    Estimated medical costs ("T") and QALYs ("Q") associated with smoking are frequently used in cost-utility analyses of tobacco control interventions. The goal of this study was to understand how researchers have addressed the methodological challenges involved in estimating these parameters. Data were collected as part of a systematic review of tobacco modeling studies. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Studies were eligible for the current analysis if they were U.S.-based, provided an estimate for Q, and used a societal perspective and lifetime analytic horizon to estimate T. We identified common methods and frequently cited sources used to obtain these estimates. Across all 18 studies included in this review, 50 % cited a 1992 source to estimate the medical costs associated with smoking and 56 % cited a 1996 study to derive the estimate for QALYs saved by quitting or preventing smoking. Approaches for estimating T varied dramatically among the studies included in this review. T was valued as a positive number, negative number and $0; five studies did not include estimates for T in their analyses. The most commonly cited source for Q based its estimate on the Health Utilities Index (HUI). Several papers also cited sources that based their estimates for Q on the Quality of Well-Being Scale and the EuroQol five dimensions questionnaire (EQ-5D). Current estimates of the lifetime medical care costs and the QALYs associated with smoking are dated and do not reflect the latest evidence on the health effects of smoking, nor the current costs and benefits of smoking cessation and prevention. Given these limitations, we recommend that researchers conducting economic evaluations of tobacco control interventions perform extensive sensitivity analyses around these parameter estimates.

  1. A method to estimate spatiotemporal air quality in an urban traffic corridor.

    PubMed

    Singh, Nongthombam Premananda; Gokhale, Sharad

    2015-12-15

    Air quality exposure assessment using personal exposure sampling or direct measurement of spatiotemporal air pollutant concentrations has difficulties and limitations. Most statistical methods used for estimating spatiotemporal air quality do not account for source characteristics (e.g. emissions). In this study, a prediction method based on the lognormal probability distribution of hourly-average spatial concentrations of carbon monoxide (CO) obtained with the CALINE4 model has been developed and validated in an urban traffic corridor. Data on CO concentrations, traffic and meteorology were collected at three locations within the urban traffic corridor. The method was developed with the data from one location and validated at the other two locations. The method estimated the CO concentrations reasonably well (correlation coefficient, r ≥ 0.96). The method was then applied to estimate the probability of occurrence [P(C ≥ Cstd)] of the spatial CO concentrations in the corridor. The results have been promising and may therefore be useful for quantifying spatiotemporal air quality within an urban area. Copyright © 2015 Elsevier B.V. All rights reserved.
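
    A minimal sketch of the final step: fit a lognormal distribution to hourly-average spatial CO concentrations (synthetic stand-ins for CALINE4 output here) and report P(C ≥ Cstd) for an assumed 1-hour standard.

```python
# Lognormal fit and exceedance probability for hourly CO concentrations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
co_hourly_ppm = rng.lognormal(mean=0.8, sigma=0.5, size=500)   # stand-in concentrations

shape, loc, scale = stats.lognorm.fit(co_hourly_ppm, floc=0)   # fit with location fixed at zero
c_std = 9.0                                                     # assumed 1-h CO standard, ppm
p_exceed = stats.lognorm.sf(c_std, shape, loc=loc, scale=scale)
print(f"P(C >= {c_std} ppm) = {p_exceed:.4f}")
```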

  2. Standard and reduced radiation dose liver CT images: adaptive statistical iterative reconstruction versus model-based iterative reconstruction-comparison of findings and image quality.

    PubMed

    Shuman, William P; Chan, Keith T; Busey, Janet M; Mitsumori, Lee M; Choi, Eunice; Koprowicz, Kent M; Kanal, Kalpana M

    2014-12-01

    To investigate whether reduced radiation dose liver computed tomography (CT) images reconstructed with model-based iterative reconstruction (MBIR) might compromise depiction of clinically relevant findings or might have decreased image quality when compared with clinical standard radiation dose CT images reconstructed with adaptive statistical iterative reconstruction (ASIR). With institutional review board approval, informed consent, and HIPAA compliance, 50 patients (39 men, 11 women) were prospectively included who underwent liver CT. After a portal venous pass with ASIR images, a 60% reduced radiation dose pass was added with MBIR images. One reviewer scored ASIR image quality and marked findings. Two additional independent reviewers noted whether marked findings were present on MBIR images and assigned scores for relative conspicuity, spatial resolution, image noise, and image quality. Liver and aorta Hounsfield units and image noise were measured. Volume CT dose index and size-specific dose estimate (SSDE) were recorded. Qualitative reviewer scores were summarized. Formal statistical inference for signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), volume CT dose index, and SSDE was made (paired t tests), with Bonferroni adjustment. Two independent reviewers identified all 136 ASIR image findings (n = 272) on MBIR images, scoring them as equal or better for conspicuity, spatial resolution, and image noise in 94.1% (256 of 272), 96.7% (263 of 272), and 99.3% (270 of 272), respectively. In 50 image sets, two reviewers (n = 100) scored overall image quality as sufficient or good with MBIR in 99% (99 of 100). Liver SNR was significantly greater for MBIR (10.8 ± 2.5 [standard deviation] vs 7.7 ± 1.4, P < .001); there was no difference for CNR (2.5 ± 1.4 vs 2.4 ± 1.4, P = .45). For ASIR and MBIR, respectively, volume CT dose index was 15.2 mGy ± 7.6 versus 6.2 mGy ± 3.6; SSDE was 16.4 mGy ± 6.6 versus 6.7 mGy ± 3.1 (P < .001). Liver CT images reconstructed with MBIR may allow up to 59% radiation dose reduction compared with the dose with ASIR, without compromising depiction of findings or image quality. © RSNA, 2014.

  3. Neutrons in proton pencil beam scanning: parameterization of energy, quality factors and RBE

    NASA Astrophysics Data System (ADS)

    Schneider, Uwe; Hälg, Roger A.; Baiocco, Giorgio; Lomax, Tony

    2016-08-01

    The biological effectiveness of neutrons produced during proton therapy in inducing cancer is unknown, but potentially large. In particular, since neutron biological effectiveness is energy dependent, it is necessary to estimate, besides the dose, also the energy spectra, in order to obtain quantities which could be a measure of the biological effectiveness and test current models and new approaches against epidemiological studies on cancer induction after proton therapy. For patients treated with proton pencil beam scanning, this work aims to predict the spatially localized neutron energies, the effective quality factor, the weighting factor according to ICRP, and two RBE values, the first obtained from the saturation corrected dose mean lineal energy and the second from DSB cluster induction. A proton pencil beam was Monte Carlo simulated using GEANT. Based on the simulated neutron spectra for three different proton beam energies a parameterization of energy, quality factors and RBE was calculated. The pencil beam algorithm used for treatment planning at PSI has been extended using the developed parameterizations in order to calculate the spatially localized neutron energy, quality factors and RBE for each treated patient. The parameterization represents the simple quantification of neutron energy in two energy bins and the quality factors and RBE with a satisfying precision up to 85 cm away from the proton pencil beam when compared to the results based on 3D Monte Carlo simulations. The root mean square error of the energy estimate between Monte Carlo simulation based results and the parameterization is 3.9%. For the quality factors and RBE estimates it is smaller than 0.9%. The model was successfully integrated into the PSI treatment planning system. It was found that the parameterizations for neutron energy, quality factors and RBE were independent of proton energy in the investigated energy range of interest for proton therapy. The pencil beam algorithm has been extended using the developed parameterizations in order to calculate the neutron energy, quality factor and RBE.

  4. Neutrons in proton pencil beam scanning: parameterization of energy, quality factors and RBE.

    PubMed

    Schneider, Uwe; Hälg, Roger A; Baiocco, Giorgio; Lomax, Tony

    2016-08-21

    The biological effectiveness of neutrons produced during proton therapy in inducing cancer is unknown, but potentially large. In particular, since neutron biological effectiveness is energy dependent, it is necessary to estimate, besides the dose, also the energy spectra, in order to obtain quantities which could be a measure of the biological effectiveness and test current models and new approaches against epidemiological studies on cancer induction after proton therapy. For patients treated with proton pencil beam scanning, this work aims to predict the spatially localized neutron energies, the effective quality factor, the weighting factor according to ICRP, and two RBE values, the first obtained from the saturation corrected dose mean lineal energy and the second from DSB cluster induction. A proton pencil beam was Monte Carlo simulated using GEANT. Based on the simulated neutron spectra for three different proton beam energies a parameterization of energy, quality factors and RBE was calculated. The pencil beam algorithm used for treatment planning at PSI has been extended using the developed parameterizations in order to calculate the spatially localized neutron energy, quality factors and RBE for each treated patient. The parameterization represents the simple quantification of neutron energy in two energy bins and the quality factors and RBE with a satisfying precision up to 85 cm away from the proton pencil beam when compared to the results based on 3D Monte Carlo simulations. The root mean square error of the energy estimate between Monte Carlo simulation based results and the parameterization is 3.9%. For the quality factors and RBE estimates it is smaller than 0.9%. The model was successfully integrated into the PSI treatment planning system. It was found that the parameterizations for neutron energy, quality factors and RBE were independent of proton energy in the investigated energy range of interest for proton therapy. The pencil beam algorithm has been extended using the developed parameterizations in order to calculate the neutron energy, quality factor and RBE.

  5. Voluntary Compliance, Pollution Levels, and Infant Mortality in Mexico.

    PubMed

    Foster, Andrew; Gutierrez, Emilio; Kumar, Naresh

    2009-05-01

    The increasing body of evidence from high-income countries linking pollution to health outcomes (Ken Chay and Michael Greenstone 2003; Janet Currie and Matthew Neidell 2004) has raised concerns about the health impact of adverse air quality in developing countries, where, in general, environmental regulation is less stringent and health monitoring and treatment are less accessible. These concerns have, in turn, encouraged consideration of the effectiveness of alternative mechanisms for improving air quality while limiting the adverse impact on economic growth. However, the analysis of both the effects of pollution on health and the effectiveness of pollution abatement policies faces particular empirical challenges in low- and middle-income contexts, given the scarcity of reliable measures of pollution concentrations. The primary source of good quality data on air quality, ground monitoring, tends to be limited to larger metropolitan areas with monitors placed at sentinel sites that may or may not yield a representative picture of population exposure. This paper calls attention to, and makes use of, newly available procedures for extracting measures of air quality from satellite imagery. In particular, satellite-based measures of aerosol optical depth (AOD) are used to obtain estimates of air quality for the whole Mexican territory at a detailed geographic scale, and these estimates are related to measures of participation in a voluntary certification program at the level of the county. The resulting estimates are then combined with estimates of the relationship between participation in the certification program and infant mortality due to respiratory causes to obtain a rough estimate of the relationship between air quality and infant health in Mexico.

  6. fRMSDPred: Predicting Local RMSD Between Structural Fragments Using Sequence Information

    DTIC Science & Technology

    2007-04-04

    machine learning approaches for estimating the RMSD value of a pair of protein fragments. These estimated fragment-level RMSD values can be used to construct the alignment, assess the quality of an alignment, and identify high-quality alignment segments. We present algorithms to solve this fragment-level RMSD prediction problem using a supervised learning framework based on support vector regression and classification that incorporates protein profiles, predicted secondary structure, effective information encoding schemes, and novel second-order pairwise exponential kernel

  7. Reduced-Reference Quality Assessment Based on the Entropy of DWT Coefficients of Locally Weighted Gradient Magnitudes.

    PubMed

    Golestaneh, S Alireza; Karam, Lina

    2016-08-24

    Perceptual image quality assessment (IQA) attempts to use computational models to estimate image quality in accordance with subjective evaluations. Reduced-reference (RR) IQA methods make use of partial information or features extracted from the reference image for estimating the quality of distorted images. Finding a balance between the number of RR features and the accuracy of the estimated image quality is essential and important in IQA. In this paper we propose a training-free, low-cost RRIQA method that requires a very small number of RR features (6 RR features). The proposed RRIQA algorithm is based on the discrete wavelet transform (DWT) of locally weighted gradient magnitudes. We apply the human visual system's contrast sensitivity and neighborhood gradient information to weight the gradient magnitudes in a locally adaptive manner. The RR features are computed by measuring the entropy of each DWT subband, for each scale, and pooling the subband entropies along all orientations, resulting in L RR features (one average entropy per scale) for an L-level DWT. Extensive experiments performed on seven large-scale benchmark databases demonstrate that the proposed RRIQA method delivers highly competitive performance as compared to state-of-the-art RRIQA models as well as full-reference ones for both natural and texture images. The MATLAB source code of REDLOG and the evaluation results are publicly available online at http://lab.engineering.asu.edu/ivulab/software/redlog/.
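
    A minimal sketch of the feature pipeline, with the local weighting simplified to plain gradient magnitudes and PyWavelets assumed for the DWT: gradient magnitudes, a multi-level 2-D DWT, and one entropy value per scale pooled over orientations. This is an illustration of the idea, not the REDLOG implementation.

```python
# Per-scale DWT subband entropies of gradient magnitudes (simplified RR features).
import numpy as np
import pywt

def rr_features(image, levels=4, wavelet='db2', bins=64):
    gy, gx = np.gradient(image.astype(float))
    grad_mag = np.hypot(gx, gy)                      # local weighting omitted here
    coeffs = pywt.wavedec2(grad_mag, wavelet, level=levels)
    features = []
    for detail in coeffs[1:]:                        # one (H, V, D) tuple per scale
        entropies = []
        for band in detail:
            hist, _ = np.histogram(band, bins=bins)
            p = hist[hist > 0].astype(float)
            p = p / p.sum()
            entropies.append(-(p * np.log2(p)).sum())
        features.append(np.mean(entropies))          # pool over orientations
    return np.array(features)                        # L RR features, one per scale

img = np.random.default_rng(0).random((256, 256))    # stand-in image
print(rr_features(img))
```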

  8. The climate and air-quality benefits of wind and solar power in the United States

    NASA Astrophysics Data System (ADS)

    Millstein, Dev; Wiser, Ryan; Bolinger, Mark; Barbose, Galen

    2017-09-01

    Wind and solar energy reduce combustion-based electricity generation and provide air-quality and greenhouse gas emission benefits. These benefits vary dramatically by region and over time. From 2007 to 2015, solar and wind power deployment increased rapidly while regulatory changes and fossil fuel price changes led to steep cuts in overall power-sector emissions. Here we evaluate how wind and solar climate and air-quality benefits evolved during this time period. We find cumulative wind and solar air-quality benefits of 2015 US$29.7-112.8 billion mostly from 3,000 to 12,700 avoided premature mortalities, and cumulative climate benefits of 2015 US$5.3-106.8 billion. The ranges span results across a suite of air-quality and health impact models and social cost of carbon estimates. We find that binding cap-and-trade pollutant markets may reduce these cumulative benefits by up to 16%. In 2015, based on central estimates, combined marginal benefits equal 7.3 ¢ kWh⁻¹ (wind) and 4.0 ¢ kWh⁻¹ (solar).

  9. Augmenting Species Diversity in Water Quality Criteria Derivation using Interspecies Correlation Models

    EPA Science Inventory

    The specific requirements for taxa diversity of the 1985 guidelines have limited the number of ambient water quality criteria (AWQC) developed for aquatic life protection. The EPA developed the Web-based Interspecies Correlation Estimation (Web-ICE) tool to allow extrapolation of...

  10. User's guide: estimation of key PCC, base, subbase, and pavement engineering properties from routine tests and physical characteristics

    DOT National Transportation Integrated Search

    2012-08-01

    Material characterization is a critical component of modern day pavement analysis, design, construction, quality : control/quality assurance, management, and rehabilitation. At each stage during the life of a project, the influence of : several funda...

  11. Ways to estimate speeds for the purposes of air quality conformity analyses.

    DOT National Transportation Integrated Search

    2002-01-01

    A speed post-processor refers to equations or lookup tables that can determine vehicle speeds on a particular roadway link using only the limited information available in a long-range planning model. An estimated link speed is usually based on volume...

  12. VoroMQA: Assessment of protein structure quality using interatomic contact areas.

    PubMed

    Olechnovič, Kliment; Venclovas, Česlovas

    2017-06-01

    In the absence of an experimentally determined protein structure, many biological questions can be addressed using computational structural models. However, the utility of protein structural models depends on their quality. Therefore, the estimation of the quality of predicted structures is an important problem. One of the approaches to this problem is the use of knowledge-based statistical potentials. Such methods typically rely on the statistics of distances and angles of residue-residue or atom-atom interactions collected from experimentally determined structures. Here, we present VoroMQA (Voronoi tessellation-based Model Quality Assessment), a new method for the estimation of protein structure quality. Our method combines the idea of statistical potentials with the use of interatomic contact areas instead of distances. Contact areas, derived using Voronoi tessellation of protein structure, are used to describe and seamlessly integrate both explicit interactions between protein atoms and implicit interactions of protein atoms with solvent. VoroMQA produces scores at atomic, residue, and global levels, all in the fixed range from 0 to 1. The method was tested on the CASP data and compared to several other single-model quality assessment methods. VoroMQA showed strong performance in the recognition of the native structure and in the structural model selection tests, thus demonstrating the efficacy of interatomic contact areas in estimating protein structure quality. The software implementation of VoroMQA is freely available as a standalone application and as a web server at http://bioinformatics.lt/software/voromqa. Proteins 2017; 85:1131-1145. © 2017 Wiley Periodicals, Inc.

  13. Cuff-less blood pressure measurement using pulse arrival time and a Kalman filter

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Chen, Xianxiang; Fang, Zhen; Xue, Yongjiao; Zhan, Qingyuan; Yang, Ting; Xia, Shanhong

    2017-02-01

    The present study designs an algorithm to increase the accuracy of continuous blood pressure (BP) estimation. Pulse arrival time (PAT) has been widely used for continuous BP estimation. However, because of motion artifacts and physiological activities, PAT-based methods often suffer from low BP estimation accuracy. This paper uses a signal-quality-modified Kalman filter to track blood pressure changes. A Kalman filter guarantees that the BP estimate is optimal in the sense of minimizing the mean square error. We propose a joint signal quality index to adjust the measurement noise covariance, pushing the Kalman filter to weigh more heavily measurements from cleaner data. Twenty 2 h physiological data segments selected from the MIMIC II database were used to evaluate the performance. Compared with straightforward use of the PAT-based linear regression model, the proposed model achieved higher measurement accuracy. Due to its low computational complexity, the proposed algorithm can be easily transplanted into wearable sensor devices.
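
    A minimal sketch of the idea: a scalar Kalman filter tracks blood pressure, and the measurement-noise covariance is inflated when the joint signal quality index (assumed here to lie in [0, 1]) of the corresponding ECG/PPG segment is low, so cleaner beats are weighted more heavily. The PAT-to-BP calibration itself is assumed to have been done elsewhere, and all numbers are illustrative.

```python
# Scalar Kalman filter with measurement noise scaled by a signal quality index.
import numpy as np

def sqi_kalman_track(bp_measurements, sqi, q=0.5, r_base=4.0):
    x, p = bp_measurements[0], 10.0            # initial state estimate and variance
    estimates = []
    for z, s in zip(bp_measurements, sqi):
        p = p + q                              # predict (random-walk BP model)
        r = r_base / max(s, 1e-3)              # low quality -> large measurement noise
        k = p / (p + r)                        # Kalman gain
        x = x + k * (z - x)                    # update with PAT-derived BP measurement
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

pat_bp = np.array([118, 121, 150, 119, 122, 117])   # PAT-derived SBP, artifact at index 2
sqi    = np.array([0.9, 0.8, 0.1, 0.9, 0.85, 0.9])  # joint signal quality per estimate
print(np.round(sqi_kalman_track(pat_bp, sqi), 1))
```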

  14. The Effect of the MassHealth Hospital Pay-for-Performance Program on Quality

    PubMed Central

    Ryan, Andrew M; Blustein, Jan

    2011-01-01

    Objective: To test the effect of Massachusetts Medicaid's (MassHealth) hospital-based pay-for-performance (P4P) program, implemented in 2008, on quality of care for pneumonia and surgical infection prevention (SIP). Data: Hospital Compare process of care quality data from 2004 to 2009 for acute care hospitals in Massachusetts (N = 62) and other states (N = 3,676) and American Hospital Association data on hospital characteristics from 2005. Study Design: Panel data models with hospital fixed effects and hospital-specific trends are estimated to test the effect of P4P on composite quality for pneumonia and SIP. This base model is extended to control for the completeness of measure reporting. Further sensitivity checks include estimation with propensity-score matched control hospitals, excluding hospitals in other P4P programs, varying the time period during which the program was assumed to have an effect, and testing the program effect across hospital characteristics. Principal Findings: Estimates from our preferred specification, including hospital fixed effects, trends, and the control for measure completeness, indicate small and nonsignificant program effects for pneumonia (−0.67 percentage points, p>.10) and SIP (−0.12 percentage points, p>.10). Sensitivity checks indicate a similar pattern of findings across specifications. Conclusions: Despite offering substantial financial incentives, the MassHealth P4P program did not improve quality in the first years of implementation. PMID:21210796

  15. Regression model development and computational procedures to support estimation of real-time concentrations and loads of selected constituents in two tributaries to Lake Houston near Houston, Texas, 2005-9

    USGS Publications Warehouse

    Lee, Michael T.; Asquith, William H.; Oden, Timothy D.

    2012-01-01

    In December 2005, the U.S. Geological Survey (USGS), in cooperation with the City of Houston, Texas, began collecting discrete water-quality samples for nutrients, total organic carbon, bacteria (Escherichia coli and total coliform), atrazine, and suspended sediment at two USGS streamflow-gaging stations that represent watersheds contributing to Lake Houston (08068500 Spring Creek near Spring, Tex., and 08070200 East Fork San Jacinto River near New Caney, Tex.). Data from the discrete water-quality samples collected during 2005–9, in conjunction with continuously monitored real-time data that included streamflow and other physical water-quality properties (specific conductance, pH, water temperature, turbidity, and dissolved oxygen), were used to develop regression models for the estimation of concentrations of water-quality constituents in these substantial source watersheds to Lake Houston. The potential explanatory variables included discharge (streamflow), specific conductance, pH, water temperature, turbidity, dissolved oxygen, and time (to account for seasonal variations inherent in some water-quality data). The response variables (the selected constituents) at each site were nitrite plus nitrate nitrogen, total phosphorus, total organic carbon, E. coli, atrazine, and suspended sediment. The explanatory variables provide easily measured quantities to serve as potential surrogate variables to estimate concentrations of the selected constituents through statistical regression. Statistical regression also facilitates accompanying estimates of uncertainty in the form of prediction intervals. Each regression model potentially can be used to estimate concentrations of a given constituent in real time. Among other regression diagnostics, the diagnostics used as indicators of general model reliability and reported herein include the adjusted R-squared, the residual standard error, residual plots, and p-values. Adjusted R-squared values for the Spring Creek models ranged from 0.582 to 0.922 (dimensionless). The residual standard errors ranged from 0.073 to 0.447 (base-10 logarithm). Adjusted R-squared values for the East Fork San Jacinto River models ranged from 0.253 to 0.853 (dimensionless). The residual standard errors ranged from 0.076 to 0.388 (base-10 logarithm). In conjunction with estimated concentrations, constituent loads can be estimated by multiplying the estimated concentration by the corresponding streamflow and by applying the appropriate conversion factor. The regression models presented in this report are site specific, that is, they are specific to the Spring Creek and East Fork San Jacinto River streamflow-gaging stations; however, the general methods that were developed and documented could be applied to most perennial streams for the purpose of estimating real-time water-quality data.
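
    A minimal sketch of the report's approach with synthetic surrogate data: regress log10(concentration) on continuously monitored surrogates, report the adjusted R-squared and residual standard error in base-10 log units, then convert an estimated concentration to a load using the corresponding streamflow. Metric units are used here (load in kg/day = mg/L × m³/s × 86.4); the report applies its own site-specific models and conversion factor.

```python
# Surrogate regression diagnostics and real-time concentration/load estimation.
import numpy as np

rng = np.random.default_rng(3)
n = 60
turbidity = rng.lognormal(2.5, 0.6, n)               # FNU (stand-in surrogate)
spec_cond = rng.normal(300, 40, n)                   # uS/cm (stand-in surrogate)
log_conc = 0.6 * np.log10(turbidity) - 0.001 * spec_cond + 1.0 + rng.normal(0, 0.12, n)

X = np.column_stack([np.ones(n), np.log10(turbidity), spec_cond])
beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
resid = log_conc - X @ beta
k = X.shape[1]
rse = np.sqrt(resid @ resid / (n - k))               # residual standard error (log10 units)
sst = (log_conc - log_conc.mean()) @ (log_conc - log_conc.mean())
adj_r2 = 1 - (1 - (1 - resid @ resid / sst)) * (n - 1) / (n - k)
print(f"adjusted R^2 = {adj_r2:.3f}, residual standard error = {rse:.3f} log10 units")

# Real-time estimate for one new surrogate reading, then the corresponding load
x_new = np.array([1.0, np.log10(35.0), 310.0])
conc_mg_l = 10 ** (x_new @ beta)
streamflow_m3_s = 12.0
print(f"load ~ {conc_mg_l * streamflow_m3_s * 86.4:.1f} kg/day")
```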

  16. Curriculum-Based Measurement of Oral Reading: Evaluation of Growth Estimates Derived with Pre-Post Assessment Methods

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Monaghen, Barbara D.; Zopluoglu, Cengiz; Van Norman, Ethan R.

    2013-01-01

    Curriculum-based measurement of oral reading (CBM-R) is used to index the level and rate of student growth across the academic year. The method is frequently used to set student goals and monitor student progress. This study examined the diagnostic accuracy and quality of growth estimates derived from pre-post measurement using CBM-R data. A…

  17. [Flavouring estimation of quality of grape wines with use of methods of mathematical statistics].

    PubMed

    Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D

    2016-01-01

    The questions involved in forming an integral estimation of a wine's flavour during tasting are discussed, and the advantages and disadvantages of the procedures are described. The materials investigated were natural white and red wines from Russian manufacturers, made using traditional technologies from Vitis vinifera, direct hybrids, blends and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of a wine's nonvolatile matter and its tasting quality rating. The contents of organic acids, amino acids and cations in the wines were considered as the main factors influencing flavour; they essentially define the beverage's quality. These components were determined in the wine samples by the electrophoretic method («CAPEL» system). Together with the analytical checking of the wine samples' quality, a representative group of specialists simultaneously carried out a tasting estimation of the wines using a 100-point scale. The possibility of statistically modelling the correlation between the wine tasting estimation and the analytical data on amino acid and cation content, which reasonably describe the wine's flavour, was examined. Statistical modelling of the correlation between the tasting estimation and the content of the major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical evaluation within fixed limits of quality conformance, was done with Statistica. Adequate statistical models were constructed that are able to predict the tasting estimation, that is, to determine wine quality from the content of the components forming the flavour properties. It is emphasized that, along with aromatic (volatile) substances, the nonvolatile matter - mineral substances and amino acids such as proline, threonine and arginine - influences a wine's flavour properties. The nonvolatile components contribute to the organoleptic and flavour quality estimation of wines, as do the aromatic volatile substances, and they take part in forming the expert's evaluation.

  18. Methods for estimating monthly mean concentrations of selected water-quality constituents for stream sites in the Red River of the North basin, North Dakota and Minnesota

    USGS Publications Warehouse

    Guenthner, R.S.

    1991-01-01

    Future development of the Garrison Diversion Unit may divert water from the Missouri River into the Sheyenne River and the Red River of the North for municipal and industrial use. The U.S. Bureau of Reclamation's Canals, Rivers, and Reservoirs Salinity Accounting Procedures model can be used to predict the effect various operating plans could have on water quality in the Sheyenne River and the Red River of the North. The model uses, as input, monthly means of streamflow and selected water-quality constituents for a 54-year period at 28 nodes on the Sheyenne River and the Red River of the North. This report provides methods for estimating monthly mean concentrations of selected water-quality constituents that can be used for input to and calibration of the salinity model. Water-quality data for 32 gaging stations can be used to define selected water-quality characteristics at the 28 model nodes. Water-quality data were retrieved from the U.S. Geological Survey's National Water Data Storage and Retrieval System database and statistical summaries were prepared. The frequency of water-quality data collection at the gaging stations is inadequate to define monthly mean concentrations of the individual water-quality constituents for all months for the 54-year period; therefore, methods for estimating monthly mean concentrations were developed. Relations between selected water-quality constituents [dissolved solids, hardness (as CaCO3), sodium, sulfate, and chloride] and streamflow were developed as the primary method to estimate monthly mean concentrations. Relations between specific conductance and streamflow and relations between selected water-quality constituents [dissolved solids, hardness (as CaCO3), sodium, sulfate, and chloride] and specific conductance were developed so that a cascaded-regression relation could be developed as a second method of estimating monthly mean concentrations and, thus, utilize a large specific-conductance database. Information about the quantity and the quality of ground water discharging to the Sheyenne River is needed for model input for reaches of the river where ground water accounts for a substantial part of streamflow during periods of low flow. Ground-water discharge was identified for two reaches of the Sheyenne River. Ground-water discharge to the Sheyenne River in the vicinity of Warwick, N.Dak., was about 14.8 cubic feet per second and the estimated dissolved-solids concentration was about 441 milligrams per liter during October 15 and 16, 1986. Ground-water discharge to the Sheyenne River in a reach between Lisbon and Kindred, N.Dak., ranged from an average of 25.3 cubic feet per second during September 13 to November 19, 1963, to about 45.0 cubic feet per second during October 21 and 22, 1986. Dissolved-solids concentration was estimated at about 442 milligrams per liter during October 21 and 22, 1986.

  19. Ventilation potential during the emissions survey in Toluca Valley, Mexico

    NASA Astrophysics Data System (ADS)

    Ruiz Angulo, A.; Peralta, O.; Jurado, O. E.; Ortinez, A.; Grutter de la Mora, M.; Rivera, C.; Gutierrez, W.; Gonzalez, E.

    2017-12-01

    During late spring and early summer, measurements of emissions and pollutants were carried out during a survey campaign at four different locations within the Toluca Valley. The current emissions inventory typically estimates the generation of pollutants based on pre-estimated factors representing an entire sector as a function of its activities. However, those factors are not always based on direct measurements. The emissions from the Toluca Valley are rather large and could affect the air quality of the Mexico City Valley. The air-mass interchange between the two valleys is not well understood; however, based on the measurements obtained during the 3-month campaign, we looked carefully at the daily variability of the wind and found a clear mountain-valley breeze signal. The ventilation coefficient is estimated, and its correlations with the concentrations at the 4 locations and at a distant station in Mexico City are addressed in this work. Finally, we discuss the implication of the ventilation capacity for air quality in the system of valleys that includes Mexico City.
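    As a rough sketch of the ventilation-coefficient analysis mentioned above (the coefficient is conventionally the product of mixing-layer height and mean transport wind speed), the snippet below uses hypothetical hourly series; all values and variable names are illustrative only.

```python
import numpy as np

# Hypothetical hourly series for one site (illustrative values)
mixing_height = np.array([300., 500., 1200., 1800., 1500., 600.])   # m
wind_speed    = np.array([1.2, 2.0, 3.5, 4.0, 3.0, 1.5])            # m/s
pm25          = np.array([55., 48., 30., 22., 26., 50.])            # ug/m^3

# Ventilation coefficient: mixing height times mean wind speed (m^2/s)
vc = mixing_height * wind_speed

# Correlation between ventilation capacity and measured concentrations
r = np.corrcoef(vc, pm25)[0, 1]
print(f"ventilation coefficient {vc.min():.0f}-{vc.max():.0f} m^2/s, corr with PM2.5: {r:.2f}")
```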

  20. Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.

    PubMed

    Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse

    2018-05-01

    Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator in quantitative Doppler analysis using data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates of sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated from spectrograms estimated on a commercial ultrasound scanner.
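    A minimal numerical sketch of Capon (minimum-variance) spectral estimation on a short slow-time Doppler packet is shown below; the packet length, subaperture size, and diagonal loading are illustrative assumptions, and the clutter-rejection step described in the paper is omitted.

```python
import numpy as np

def capon_spectrum(x, L=8, n_freq=128, loading=1e-3):
    """Capon power spectral estimate of a slow-time packet x (complex samples)."""
    N = len(x)
    # Subaperture (spatial) averaging of the sample covariance matrix
    segs = np.array([x[i:i + L] for i in range(N - L + 1)])
    R = segs.T @ segs.conj() / segs.shape[0]
    R += loading * np.trace(R).real / L * np.eye(L)   # diagonal loading for stability
    R_inv = np.linalg.inv(R)

    freqs = np.linspace(-0.5, 0.5, n_freq, endpoint=False)
    psd = np.empty(n_freq)
    for k, f in enumerate(freqs):
        a = np.exp(2j * np.pi * f * np.arange(L))      # Doppler steering vector
        psd[k] = 1.0 / np.real(a.conj() @ R_inv @ a)
    return freqs, psd

# Example: a 16-sample packet with one Doppler frequency plus noise
rng = np.random.default_rng(0)
n = np.arange(16)
x = np.exp(2j * np.pi * 0.2 * n) + 0.1 * (rng.standard_normal(16) + 1j * rng.standard_normal(16))
f, p = capon_spectrum(x)
print(f"peak at normalized Doppler frequency {f[np.argmax(p)]:.2f}")
```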

  1. Mapping CHU9D Utility Scores from the PedsQLTM 4.0 SF-15.

    PubMed

    Mpundu-Kaambwa, Christine; Chen, Gang; Russo, Remo; Stevens, Katherine; Petersen, Karin Dam; Ratcliffe, Julie

    2017-04-01

    The Pediatric Quality of Life Inventory™ 4.0 Short Form 15 Generic Core Scales (hereafter the PedsQL) and the Child Health Utility-9 Dimensions (CHU9D) are two generic instruments designed to measure health-related quality of life in children and adolescents in the general population and paediatric patient groups living with specific health conditions. Although the PedsQL is widely used among paediatric patient populations, presently it is not possible to directly use the scores from the instrument to calculate quality-adjusted life-years (QALYs) for application in economic evaluation because it produces summary scores which are not preference-based. This paper examines different econometric mapping techniques for estimating CHU9D utility scores from the PedsQL for the purpose of calculating QALYs for cost-utility analysis. The PedsQL and the CHU9D were completed by a community sample of 755 Australian adolescents aged 15-17 years. Seven regression models were estimated: ordinary least squares estimator, generalised linear model, robust MM estimator, multivariate factorial polynomial estimator, beta-binomial estimator, finite mixture model and multinomial logistic model. The mean absolute error (MAE) and the mean squared error (MSE) were used to assess predictive ability of the models. The MM estimator with stepwise-selected PedsQL dimension scores as explanatory variables had the best predictive accuracy using MAE and the equivalent beta-binomial model had the best predictive accuracy using MSE. Our mapping algorithm facilitates the estimation of health-state utilities for use within economic evaluations where only PedsQL data is available and is suitable for use in community-based adolescents aged 15-17 years. Applicability of the algorithm in younger populations should be assessed in further research.
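    The mapping exercise described above can be illustrated with a simple regression sketch: fit a model from PedsQL dimension scores to CHU9D utilities on an estimation sample and report MAE and MSE on held-out data. The estimator shown (ordinary least squares) is only one of the seven models compared in the paper, and the data are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(42)

# Synthetic placeholders: three PedsQL dimension scores (0-100) and CHU9D utilities (0-1)
X = rng.uniform(30, 100, size=(755, 3))
y = np.clip(0.2 + 0.008 * X.mean(axis=1) + rng.normal(0, 0.05, 755), 0, 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_train, y_train)   # OLS mapping algorithm
pred = np.clip(model.predict(X_test), 0, 1)        # utilities are bounded above by 1

print(f"MAE: {mean_absolute_error(y_test, pred):.4f}")
print(f"MSE: {mean_squared_error(y_test, pred):.4f}")
```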

  2. Assessing the Impact of Fires on Air Quality in the Southeastern U.S. with a Unified Prescribed Burning Database

    NASA Astrophysics Data System (ADS)

    Garcia Menendez, F.; Afrin, S.

    2017-12-01

    Prescribed fires are used extensively across the Southeastern United States and are a major source of air pollutant emissions in the region. These land management projects can adversely impact local and regional air quality. However, the emissions and air pollution impacts of prescribed fires remain largely uncertain. Satellite data, commonly used to estimate fire emissions, is often unable to detect the low-intensity, short-lived prescribed fires characteristic of the region. Additionally, existing ground-based prescribed burn records are incomplete, inconsistent and scattered. Here we present a new unified database of prescribed fire occurrence and characteristics developed from systemized digital burn permit records collected from public and private land management organizations in the Southeast. This bottom-up fire database is used to analyze the correlation between high PM2.5 concentrations measured by monitoring networks in southern states and prescribed fire occurrence at varying spatial and temporal scales. We show significant associations between ground-based records of prescribed fire activity and the observational air quality record at numerous sites by applying regression analysis and controlling confounding effects of meteorology. Furthermore, we demonstrate that the response of measured PM2.5 concentrations to prescribed fire estimates based on burning permits is significantly stronger than their response to satellite fire observations from MODIS (moderate-resolution imaging spectroradiometer) and geostationary satellites or prescribed fire emissions data in the National Emissions Inventory. These results show the importance of bottom-up smoke emissions estimates and reflect the need for improved ground-based fire data to advance air quality impacts assessments focused on prescribed burning.

  3. Generalized watermarking attack based on watermark estimation and perceptual remodulation

    NASA Astrophysics Data System (ADS)

    Voloshynovskiy, Sviatoslav V.; Pereira, Shelby; Herrigel, Alexander; Baumgartner, Nazanin; Pun, Thierry

    2000-05-01

    Digital image watermarking has become a popular technique for authentication and copyright protection. For verifying the security and robustness of watermarking algorithms, specific attacks have to be applied to test them. In contrast to the known Stirmark attack, which degrades the quality of the image while destroying the watermark, this paper presents a new approach which is based on the estimation of a watermark and the exploitation of the properties of Human Visual System (HVS). The new attack satisfies two important requirements. First, image quality after the attack as perceived by the HVS is not worse than the quality of the stego image. Secondly, the attack uses all available prior information about the watermark and cover image statistics to perform the best watermark removal or damage. The proposed attack is based on a stochastic formulation of the watermark removal problem, considering the embedded watermark as additive noise with some probability distribution. The attack scheme consists of two main stages: (1) watermark estimation and partial removal by a filtering based on a Maximum a Posteriori (MAP) approach; (2) watermark alteration and hiding through addition of noise to the filtered image, taking into account the statistics of the embedded watermark and exploiting HVS characteristics. Experiments on a number of real world and computer generated images show the high efficiency of the proposed attack against known academic and commercial methods: the watermark is completely destroyed in all tested images without altering the image quality. The approach can be used against watermark embedding schemes that operate either in coordinate domain, or transform domains like Fourier, DCT or wavelet.

  4. Assessing the radar rainfall estimates in watershed-scale water quality model

    USDA-ARS?s Scientific Manuscript database

    Watershed-scale water quality models are effective science-based tools for interpreting change in complex environmental systems that affect hydrology cycle, soil erosion and nutrient fate and transport in watershed. Precipitation is one of the primary input data to achieve a precise rainfall-runoff ...

  5. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES (PRESENTATION)

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  6. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  7. Uncertainties in selected river water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2007-02-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. Empirical quality of river water quality data is rarely certain and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected river water quality data, i.e. suspended sediment, nitrogen fractions, phosphorus fractions, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2005). A literature review was carried out including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited for field and laboratory situations for all constituents considered. Analytical errors can contribute considerably to the overall uncertainty of river water quality data. Temporal autocorrelation of river water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso-scale river catchments (500-3000 km2) reasonable yearly dissolved load calculations can be achieved using biweekly sampling frequencies. For suspended sediments none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.

  8. Uncertainties in selected surface water quality data

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.

    2006-09-01

    Monitoring of surface waters is primarily done to detect the status and trends in water quality and to identify whether observed trends arise from natural or anthropogenic causes. Empirical quality of surface water quality data is rarely certain and knowledge of their uncertainties is essential to assess the reliability of water quality models and their predictions. The objective of this paper is to assess the uncertainties in selected surface water quality data, i.e. suspended sediment, nitrogen fractions, phosphorus fractions, heavy metals and biological compounds. The methodology used to structure the uncertainty is based on the empirical quality of data and the sources of uncertainty in data (van Loon et al., 2006). A literature review was carried out including additional experimental data of the Elbe river. All data of compounds associated with suspended particulate matter have considerably higher sampling uncertainties than soluble concentrations. This is due to high variability within the cross section of a given river. This variability is positively correlated with total suspended particulate matter concentrations. Sampling location also has a considerable effect on the representativeness of a water sample. These sampling uncertainties are highly site specific. The estimation of uncertainty in sampling can only be achieved by taking at least a proportion of samples in duplicate. Compared to sampling uncertainties, measurement and analytical uncertainties are much lower. Instrument quality can be considered well suited for field and laboratory situations for all constituents considered. Analytical errors can contribute considerably to the overall uncertainty of surface water quality data. Temporal autocorrelation of surface water quality data is present, but literature on the general behaviour of water quality compounds is rare. For meso-scale river catchments reasonable yearly dissolved load calculations can be achieved using biweekly sampling frequencies. For suspended sediments none of the methods investigated produced very reliable load estimates when weekly concentration data were used. Uncertainties associated with load estimates based on infrequent samples will decrease with increasing size of rivers.

  9. Community-LINE Source Model (C-LINE) to estimate roadway emissions

    EPA Pesticide Factsheets

    C-LINE is a web-based model that estimates emissions and dispersion of toxic air pollutants for roadways in the U.S. This reduced-form air quality model examines what-if scenarios for changes in emissions, such as traffic volume, fleet mix, and vehicle speed.

  10. A novel SURE-based criterion for parametric PSF estimation.

    PubMed

    Xue, Feng; Blu, Thierry

    2015-02-01

    We propose an unbiased estimate of a filtered version of the mean squared error--the blur-SURE (Stein's unbiased risk estimate)--as a novel criterion for estimating an unknown point spread function (PSF) from the degraded image only. The PSF is obtained by minimizing this new objective functional over a family of Wiener processings. Based on this estimated blur kernel, we then perform nonblind deconvolution using our recently developed algorithm. The SURE-based framework is exemplified with a number of parametric PSF, involving a scaling factor that controls the blur size. A typical example of such parametrization is the Gaussian kernel. The experimental results demonstrate that minimizing the blur-SURE yields highly accurate estimates of the PSF parameters, which also result in a restoration quality that is very similar to the one obtained with the exact PSF, when plugged into our recent multi-Wiener SURE-LET deconvolution algorithm. The highly competitive results obtained outline the great potential of developing more powerful blind deconvolution algorithms based on SURE-like estimates.

  11. Relationship and variation of qPCR and culturable enterococci estimates in ambient surface waters are predictable

    USGS Publications Warehouse

    Whitman, Richard L.; Ge, Zhongfu; Nevers, Meredith B.; Boehm, Alexandria B.; Chern, Eunice C.; Haugland, Richard A.; Lukasik, Ashley M.; Molina, Marirosa; Przybyla-Kelly, Kasia; Shively, Dawn A.; White, Emily M.; Zepp, Richard G.; Byappanahalli, Muruleedhara N.

    2010-01-01

    The quantitative polymerase chain reaction (qPCR) method provides rapid estimates of fecal indicator bacteria densities that have been indicated to be useful in the assessment of water quality. Primarily because this method provides faster results than standard culture-based methods, the U.S. Environmental Protection Agency is currently considering its use as a basis for revised ambient water quality criteria. In anticipation of this possibility, we sought to examine the relationship between qPCR-based and culture-based estimates of enterococci in surface waters. Using data from several research groups, we compared enterococci estimates by the two methods in water samples collected from 37 sites across the United States. A consistent linear pattern in the relationship between cell equivalents (CCE), based on the qPCR method, and colony-forming units (CFU), based on the traditional culturable method, was significant at higher densities (above about 2.0 log10 CFU/100 mL), while uncertainty increases at lower CFU values. It was further noted that the relative error in replicated qPCR estimates was generally higher than that in replicated culture counts even at relatively high target levels, suggesting a greater need for replicated analyses in the qPCR method to reduce relative error. Further studies evaluating the relationship between culture and qPCR should take into account analytical uncertainty as well as potential differences in results of these methods that may arise from sample variability, different sources of pollution, and environmental factors.

  12. Quantifying Quality of Life and Disability of Patients with Advanced Schistosomiasis Japonica

    PubMed Central

    Jia, Tie-Wu; Utzinger, Jürg; Deng, Yao; Yang, Kun; Li, Yi-Yi; Zhu, Jin-Huan; King, Charles H.; Zhou, Xiao-Nong

    2011-01-01

    Background The Chinese government lists advanced schistosomiasis as a leading healthcare priority due to its serious health and economic impacts, yet it has not been included in the estimates of schistosomiasis burden in the Global Burden of Disease (GBD) study. Therefore, the quality of life and disability weight (DW) for the advanced cases of schistosomiasis japonica have to be taken into account in the re-estimation of burden of disease due to schistosomiasis. Methodology/Principal Findings A patient-based quality-of-life evaluation was performed for advanced schistosomiasis japonica. Suspected or officially registered advanced cases in a Schistosoma japonicum-hyperendemic county of the People's Republic of China (P.R. China) were screened using a short questionnaire and physical examination. Disability and morbidity were assessed in confirmed cases, using the European quality of life questionnaire with an additional cognitive dimension (known as the “EQ-5D plus”), ultrasonography, and laboratory testing. The age-specific DW of advanced schistosomiasis japonica was estimated based on patients' self-rated health scores on the visual analogue scale of the questionnaire. The relationships between health status, morbidity and DW were explored using multivariate regression models. Of 506 candidates, 215 cases were confirmed as advanced schistosomiasis japonica and evaluated. Most of the patients reported impairments in at least one health dimension, such as pain or discomfort (90.7%), usual activities (87.9%), and anxiety or depression (80.9%). The overall DW was 0.447, and age-specific DWs ranged from 0.378 among individuals aged 30–44 years to 0.510 among the elderly aged ≥60 years. DWs are positively associated with loss of work capacity, psychological abnormality, ascites, and active hepatitis B virus, while splenectomy and high albumin were protective factors for quality of life. Conclusions/Significance These patient-preference disability estimates could provide updated data for a revision of the GBD, as well as for evidence-based decision-making in P.R. China's national schistosomiasis control program. PMID:21358814
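    A sketch of how age-specific disability weights can be derived from visual-analogue-scale (VAS) self-ratings is given below; the convention DW = 1 - VAS/100 and the sample values are assumptions for illustration, not the study's actual data or exact estimation procedure.

```python
import numpy as np

# Hypothetical self-rated VAS scores (0 = worst, 100 = full health) by age group
vas_by_age = {
    "30-44": np.array([68., 55., 60., 66., 62.]),
    "45-59": np.array([58., 50., 54., 49., 57.]),
    ">=60":  np.array([45., 52., 48., 50., 46.]),
}

# Assumed convention: the disability weight is the complement of the VAS health score
for group, vas in vas_by_age.items():
    dw = 1.0 - vas / 100.0
    print(f"age {group}: DW = {dw.mean():.3f}")

# Overall DW pooled across all respondents
all_vas = np.concatenate(list(vas_by_age.values()))
print(f"overall DW = {(1.0 - all_vas / 100.0).mean():.3f}")
```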

  13. The economics of health care quality and medical errors.

    PubMed

    Andel, Charles; Davidow, Stephen L; Hollander, Mark; Moreno, David A

    2012-01-01

    Hospitals have been looking for ways to improve quality and operational efficiency and cut costs for nearly three decades, using a variety of quality improvement strategies. However, based on recent reports, approximately 200,000 Americans die from preventable medical errors, including facility-acquired conditions, and millions may experience errors. In 2008, medical errors cost the United States $19.5 billion. About 87 percent, or $17 billion, was directly associated with additional medical costs, including ancillary services, prescription drug services, and inpatient and outpatient care, according to a study sponsored by the Society of Actuaries and conducted by Milliman in 2010. Additional costs of $1.4 billion were attributed to increased mortality rates, with $1.1 billion, or 10 million days, of lost productivity from missed work based on short-term disability claims. The authors estimate that the economic impact is much higher, perhaps nearly $1 trillion annually, when quality-adjusted life years (QALYs) are applied to those who die. Using the Institute of Medicine's (IOM) estimate of 98,000 deaths due to preventable medical errors annually in its 1998 report, To Err Is Human, and an average of ten lost years of life at $75,000 to $100,000 per year, there is a loss of $73.5 billion to $98 billion in QALYs for those deaths, conservatively. These numbers are much greater than those we cite from studies that explore the direct costs of medical errors. And if the estimate of a recent Health Affairs article is correct, with preventable deaths being ten times the IOM estimate, the cost is $735 billion to $980 billion. Quality care is less expensive care. It is better, more efficient, and by definition, less wasteful. It is the right care, at the right time, every time. It should mean that far fewer patients are harmed or injured. Obviously, quality care is not being delivered consistently throughout U.S. hospitals. Whatever the measure, poor quality is costing payers and society a great deal. However, health care leaders and professionals are focusing on quality and patient safety in ways they never have before because the economics of quality have changed substantially.
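    The QALY loss arithmetic in the passage above can be reproduced directly; the figures below simply restate the article's assumptions (98,000 deaths, ten lost years each, $75,000 to $100,000 per year, and the tenfold-higher death scenario).

```python
# Reproduce the back-of-the-envelope QALY loss estimates quoted above.
deaths_iom = 98_000          # IOM estimate of annual deaths from preventable errors
lost_years = 10              # assumed average years of life lost per death
value_low, value_high = 75_000, 100_000   # assumed value per quality-adjusted life year

loss_low = deaths_iom * lost_years * value_low
loss_high = deaths_iom * lost_years * value_high
print(f"IOM-based QALY loss: ${loss_low/1e9:.1f}B to ${loss_high/1e9:.1f}B")   # $73.5B to $98.0B

# If preventable deaths are ten times the IOM estimate (Health Affairs scenario)
print(f"10x scenario: ${10*loss_low/1e9:.0f}B to ${10*loss_high/1e9:.0f}B")    # $735B to $980B
```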

  14. Impact of landscape disturbance on the quality of terrestrial sediment carbon in temperate streams

    NASA Astrophysics Data System (ADS)

    Fox, James F.; Ford, William I.

    2016-09-01

    Recent studies have shown the supersaturation of fluvial networks with respect to carbon dioxide, and the concept that the high carbon dioxide is at least partially the result of turnover of sediment organic carbon that ranges in age from years to millennia. Currently, there is a need for more highly resolved studies at stream and river scales that enable estimates of terrestrial carbon turnover within fluvial networks. Our objective was to develop a new isotope-based metric to estimate the quality of sediment organic carbon delivered to temperate streams and to use the new metric to estimate carbon quality across landscape disturbance gradients. Carbon quality is defined to be consistent with in-stream turnover, and our metric is used to measure the labile or recalcitrant nature of the terrestrially derived carbon within streams. Our hypothesis was that intensively disturbed landscapes would tend to produce low-quality carbon, because deep, recalcitrant soil carbon would be eroded and transported to the fluvial system, while moderately disturbed or undisturbed landscapes would tend to produce higher-quality carbon from well-developed surface soils and litter. The hypothesis was tested by applying the new carbon quality metric to 15 temperate streams with a wide range of landscape disturbance levels. We find that our hypothesis, premised on an indirect relationship between the extent of landscape disturbance and the quality of sediment carbon in streams, holds true for moderate and high disturbances but not for undisturbed forests. We explain the results based on the connectivity, or disconnectivity, between terrestrial carbon sources and pathways for sediment transport. While pathways are typically unlimited for disturbed landscapes, the undisturbed forests have disconnectivity between the labile carbon of the forest floor and the stream corridor. Only in the case when trees fell into the stream corridor due to severe ice storms did the quality of sediment carbon increase in the streams. We argue that as scientists continue to estimate the in-stream turnover of terrestrially derived carbon in fluvial carbon budgets, the assumption of pathway connectivity between carbon sources and the stream should be justified.

  15. Parameter estimation methods for gene circuit modeling from time-series mRNA data: a comparative study.

    PubMed

    Fan, Ming; Kuwahara, Hiroyuki; Wang, Xiaolei; Wang, Suojin; Gao, Xin

    2015-11-01

    Parameter estimation is a challenging computational problem in the reverse engineering of biological systems. Because advances in biotechnology have facilitated wide availability of time-series gene expression data, systematic parameter estimation of gene circuit models from such time-series mRNA data has become an important method for quantitatively dissecting the regulation of gene expression. By focusing on the modeling of gene circuits, we examine here the performance of three types of state-of-the-art parameter estimation methods: population-based methods, online methods and model-decomposition-based methods. Our results show that certain population-based methods are able to generate high-quality parameter solutions. The performance of these methods, however, is heavily dependent on the size of the parameter search space, and their computational requirements substantially increase as the size of the search space increases. In comparison, online methods and model-decomposition-based methods are computationally faster alternatives and are less dependent on the size of the search space. Among other things, our results show that a hybrid approach that augments computationally fast methods with local search as a subsequent refinement procedure can substantially increase the quality of their parameter estimates, to a level on par with the best solutions obtained from the population-based methods, while maintaining high computational speed. These results suggest that such hybrid methods can be a promising alternative to the more commonly used population-based methods for parameter estimation of gene circuit models when limited prior knowledge about the underlying regulatory mechanisms makes the parameter search space vastly large.
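    As an illustration of the hybrid strategy described above (a fast stochastic global search refined by local optimization), the sketch below fits two rate parameters of a toy one-gene expression model to synthetic time-series mRNA data; the model, parameter names, and data are assumptions for demonstration, not the circuits studied in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution, minimize

# Toy gene circuit: d[mRNA]/dt = k_syn - k_deg * mRNA
def simulate(params, t):
    k_syn, k_deg = params
    sol = solve_ivp(lambda _, m: k_syn - k_deg * m, (t[0], t[-1]), [0.0], t_eval=t)
    return sol.y[0]

# Synthetic "observed" time series generated with known parameters plus noise
t = np.linspace(0, 10, 21)
rng = np.random.default_rng(1)
observed = simulate([2.0, 0.5], t) + rng.normal(0, 0.05, t.size)

def sse(params):
    return np.sum((simulate(params, t) - observed) ** 2)

# Stage 1: population-based global search over the parameter space
bounds = [(0.01, 10.0), (0.01, 5.0)]
global_fit = differential_evolution(sse, bounds, maxiter=30, seed=0, polish=False)

# Stage 2: local refinement starting from the global solution (the "hybrid" step)
local_fit = minimize(sse, global_fit.x, method="L-BFGS-B", bounds=bounds)

print("global estimate:", np.round(global_fit.x, 3))
print("refined estimate:", np.round(local_fit.x, 3))
```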

  16. Automatic vision-based grain optimization and analysis of multi-crystalline solar wafers using hierarchical region growing

    NASA Astrophysics Data System (ADS)

    Fan, Shu-Kai S.; Tsai, Du-Ming; Chuang, Wei-Che

    2017-04-01

    Solar power has become an attractive alternative source of energy. The multi-crystalline solar cell has been widely accepted in the market because it has a relatively low manufacturing cost. Multi-crystalline solar wafers with larger grain sizes and fewer grain boundaries are higher quality and convert energy more efficiently than mono-crystalline solar cells. In this article, a new image processing method is proposed for assessing the wafer quality. An adaptive segmentation algorithm based on region growing is developed to separate the closed regions of individual grains. Using the proposed method, the shape and size of each grain in the wafer image can be precisely evaluated. Two measures of average grain size are taken from the literature and modified to estimate the average grain size. The resulting average grain size estimate dictates the quality of the crystalline solar wafers and can be considered a viable quantitative indicator of conversion efficiency.
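    A simplified version of the grain-size analysis can be sketched with standard image-processing tools: label the closed grain regions (here obtained by thresholding away dark grain boundaries rather than the paper's hierarchical region growing) and compute an area-weighted average grain size. The threshold and the synthetic wafer image are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

# Synthetic wafer image: bright grains separated by dark boundaries (placeholder data)
img = np.full((200, 200), 180.0)
img[:, 95:105] = 40      # vertical grain boundary
img[95:105, :] = 40      # horizontal grain boundary
img += np.random.default_rng(0).normal(0, 5, img.shape)

# Separate grains: pixels brighter than the boundary threshold belong to grain interiors
grain_mask = img > 100
labels, n_grains = ndimage.label(grain_mask)

# Grain areas in pixels; equivalent diameters as a simple size measure
areas = np.array(ndimage.sum(grain_mask, labels, index=np.arange(1, n_grains + 1)))
eq_diam = np.sqrt(4.0 * areas / np.pi)

# Area-weighted average grain size (larger grains dominate, as in wafer quality grading)
weighted_avg = np.sum(areas * eq_diam) / np.sum(areas)
print(f"{n_grains} grains, area-weighted average grain diameter: {weighted_avg:.1f} px")
```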

  17. Neural Network based Control of SG based Standalone Generating System with Energy Storage for Power Quality Enhancement

    NASA Astrophysics Data System (ADS)

    Nayar, Priya; Singh, Bhim; Mishra, Sukumar

    2017-08-01

    An artificial intelligence based control algorithm is used to solve power quality problems of a diesel-engine-driven synchronous generator with an automatic voltage regulator and a governor in a standalone system. A voltage source converter integrated with a battery energy storage system is employed to mitigate the power quality problems. An adaptive neural network based signed-regressor control algorithm is used for the estimation of the fundamental component of load currents for control of the standalone system, with load leveling as an integral feature. The developed model of the system performs accurately under varying load conditions and provides a good dynamic response to step changes in loads. The real-time performance is achieved using MATLAB along with the Simulink/SimPowerSystems toolboxes, and the results adhere to the IEEE-519 standard for power quality enhancement.
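    A compact sketch of a signed-regressor adaptive estimator of the fundamental load-current component follows: the weights of an in-phase/quadrature basis at the fundamental frequency are updated with the sign of the regressor, as in sign-regressor LMS. The step size, frequencies, and synthetic load current are illustrative assumptions, not the paper's neural network controller.

```python
import numpy as np

fs, f0 = 5000.0, 50.0                # sampling and fundamental frequencies (Hz), assumed
t = np.arange(0, 0.2, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic distorted load current: fundamental plus 5th harmonic plus noise
i_load = (10 * np.sin(2 * np.pi * f0 * t + 0.3)
          + 2 * np.sin(2 * np.pi * 5 * f0 * t)
          + 0.2 * rng.standard_normal(t.size))

w = np.zeros(2)                      # weights of the in-phase and quadrature components
mu = 0.01                            # adaptation step size
i_fund = np.empty_like(i_load)

for n in range(t.size):
    u = np.array([np.sin(2 * np.pi * f0 * t[n]), np.cos(2 * np.pi * f0 * t[n])])
    i_fund[n] = w @ u                # current estimate of the fundamental component
    e = i_load[n] - i_fund[n]        # estimation error
    w += mu * e * np.sign(u)         # signed-regressor update

print("estimated fundamental amplitude:", round(np.hypot(*w), 2))  # approaches 10 A
```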

  18. Genetic parameters of egg defects and egg quality in layer chickens.

    PubMed

    Wolc, A; Arango, J; Settar, P; O'Sullivan, N P; Olori, V E; White, I M S; Hill, W G; Dekkers, J C M

    2012-06-01

    Genetic parameters were estimated for egg defects, egg production, and egg quality traits. Eggs from 11,738 purebred brown-egg laying hens were classified as salable or as having one of the following defects: bloody, broken, calcium deposit, dirty, double yolk, misshapen, pee-wee, shell-less, and soft shelled. Egg quality included albumen height, egg weight, yolk weight, and puncture score. Body weight, age at sexual maturity, and egg production were also recorded. Heritability estimates of liability to defects using a threshold animal model were less than 0.1 for bloody and dirty; between 0.1 and 0.2 for pee-wee, broken, misshapen, soft shelled, and shell-less; and above 0.2 for calcium deposit and double yolk. Quality and production traits were more heritable, with estimates ranging from 0.29 (puncture score) to 0.74 (egg weight). High-producing hens had a lower frequency of egg defects. High egg weight and BW were associated with an increased frequency of double yolks, and to a lesser extent, with more shell quality defects. Estimates of genetic correlations among defect traits that were related to shell quality were positive and moderate to strong (0.24-0.73), suggesting that these could be grouped into one category or selection could be based on the trait with the highest heritability or that is easiest to measure. Selection against defective eggs would be more efficient by including egg defect traits in the selection criterion, along with egg production rate of salable eggs and egg quality traits.

  19. Optimal combining of ground-based sensors for the purpose of validating satellite-based rainfall estimates

    NASA Technical Reports Server (NTRS)

    Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie

    1991-01-01

    Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.

  20. A Clinical Support System Based on Quality of Life Estimation.

    PubMed

    Faria, Brígida Mónica; Gonçalves, Joaquim; Reis, Luis Paulo; Rocha, Álvaro

    2015-10-01

    Quality of life is a concept influenced by social, economic, psychological, spiritual or medical state factors. More specifically, the perceived quality of an individual's daily life is an assessment of their well-being or lack of it. In this context, information technologies may help with the management of services for the healthcare of chronic patients, for example by estimating a patient's quality of life and helping the medical staff to take appropriate measures to increase it. This paper describes a quality of life estimation system developed using information technologies and the application of data mining algorithms to the clinical data of patients with cancer from the Otorhinolaryngology and Head and Neck services of an oncology institution. The system was evaluated with a sample of 3013 patients. The results show that some variables may be significant predictors of the patient's quality of life: years of smoking (p value 0.049) and size of the tumor (p value < 0.001). For assigning the variables to quality of life classes, the best accuracy was obtained by applying John Platt's sequential minimal optimization algorithm for training a support vector classifier. In conclusion, data mining techniques give access to additional patient information, helping physicians to assess quality of life and produce well-informed clinical decisions.

  1. County-level estimates of nitrogen and phosphorus from commercial fertilizer for the Conterminous United States, 1987–2006

    USGS Publications Warehouse

    Gronberg, Jo Ann M.; Spahr, Norman E.

    2012-01-01

    The U.S. Geological Survey’s National Water-Quality Assessment program requires nutrient input for analysis of the national and regional assessment of water quality. Detailed information on nutrient inputs to the environment are needed to understand and address the many serious problems that arise from excess nutrients in the streams and groundwater of the Nation. This report updates estimated county-level farm and nonfarm nitrogen and phosphorus input from commercial fertilizer sales for the conterminous United States for 1987 through 2006. Estimates were calculated from the Association of American Plant Food Control Officials fertilizer sales data, Census of Agriculture fertilizer expenditures, and U.S. Census Bureau county population. A previous national approach for deriving farm and nonfarm fertilizer nutrient estimates was evaluated, and a revised method for selecting representative states to calculate national farm and nonfarm proportions was developed. A national approach was used to estimate farm and nonfarm fertilizer inputs because not all states distinguish between farm and nonfarm use, and the quality of fertilizer reporting varies from year to year. For states that distinguish between farm and nonfarm use, the spatial distribution of the ratios of nonfarm-to-total fertilizer estimates for nitrogen and phosphorus calculated using the national-based farm and nonfarm proportions were similar to the spatial distribution of the ratios generated using state-based farm and nonfarm proportions. In addition, the relative highs and lows in the temporal distribution of farm and nonfarm nitrogen and phosphorus input at the state level were maintained—the periods of high and low usage coincide between national- and state-based values. With a few exceptions, nonfarm nitrogen estimates were found to be reasonable when compared to the amounts that would result if the lawn application rates recommended by state and university agricultural agencies were used. Also, states with higher nonfarm-to-total fertilizer ratios for nitrogen and phosphorus tended to have higher urban land-use percentages.

  2. Estimation of contribution ratios of pollutant sources to a specific section based on an enhanced water quality model.

    PubMed

    Cao, Bibo; Li, Chuan; Liu, Yan; Zhao, Yue; Sha, Jian; Wang, Yuqiu

    2015-05-01

    Because water quality monitoring sections or sites could reflect the water quality status of rivers, surface water quality management based on water quality monitoring sections or sites would be effective. For the purpose of improving water quality of rivers, quantifying the contribution ratios of pollutant resources to a specific section is necessary. Because physical and chemical processes of nutrient pollutants are complex in water bodies, it is difficult to quantitatively compute the contribution ratios. However, water quality models have proved to be effective tools to estimate surface water quality. In this project, an enhanced QUAL2Kw model with an added module was applied to the Xin'anjiang Watershed, to obtain water quality information along the river and to assess the contribution ratios of each pollutant source to a certain section (the Jiekou state-controlled section). Model validation indicated that the results were reliable. Then, contribution ratios were analyzed through the added module. Results show that among the pollutant sources, the Lianjiang tributary contributes the largest part of total nitrogen (50.43%), total phosphorus (45.60%), ammonia nitrogen (32.90%), nitrate (nitrite + nitrate) nitrogen (47.73%), and organic nitrogen (37.87%). Furthermore, contribution ratios in different reaches varied along the river. Compared with pollutant loads ratios of different sources in the watershed, an analysis of contribution ratios of pollutant sources for each specific section, which takes the localized chemical and physical processes into consideration, was more suitable for local-regional water quality management. In summary, this method of analyzing the contribution ratios of pollutant sources to a specific section based on the QUAL2Kw model was found to support the improvement of the local environment.

  3. Hybrid Air Quality Modeling Approach for use in the Near-road Exposures to Urban air pollutant Study (NEXUS)

    EPA Science Inventory

    The paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatial and temporally varying exposure estimates and identification of the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associa...

  4. Review of NASA approach to space radiation risk assessments for Mars exploration.

    PubMed

    Cucinotta, Francis A

    2015-02-01

    Long duration space missions present unique radiation protection challenges due to the complexity of the space radiation environment, which includes high charge and energy particles and other highly ionizing radiation such as neutrons. Based on a recommendation by the National Council on Radiation Protection and Measurements, a 3% lifetime risk of exposure-induced death for cancer has been used as a basis for risk limitation by the National Aeronautics and Space Administration (NASA) for low-Earth orbit missions. NASA has developed a risk-based approach to radiation exposure limits that accounts for individual factors (age, gender, and smoking history) and assesses the uncertainties in risk estimates. New radiation quality factors with associated probability distribution functions to represent the quality factor's uncertainty have been developed based on track structure models and recent radiobiology data for high charge and energy particles. The current radiation dose limits are reviewed for spaceflight and the various qualitative and quantitative uncertainties that impact the risk of exposure-induced death estimates using the NASA Space Cancer Risk (NSCR) model. NSCR estimates of the number of "safe days" in deep space to be within exposure limits and risk estimates for a Mars exploration mission are described.

  5. Relationship and Variation of qPCR and Culturable Enterococci Estimates in Ambient Surface Waters Are Predictable

    EPA Science Inventory

    The quantitative polymerase chain reaction (qPCR) method provides rapid estimates of fecal indicator bacteria densities that have been indicated to be useful in the assessment of water quality. Primarily because this method provides faster results than standard culture-based meth...

  6. Estimating benthic secondary production from aquatic insect emergence in streams affected by mountaintop removal coal mining, West Virginia USA

    EPA Science Inventory

    Mountaintop removal and valley fill (MTR/VF) coal mining recountours the Appalachian landscape, buries headwater stream channels, and degrades downstream water quality. The goal of this study was to compare benthic community production estimates, based on seasonal insect emergen...

  7. Influence of fossil-fuel power plant emissions on the surface fine particulate matter in the Seoul Capital Area, South Korea.

    PubMed

    Kim, Byeong-Uk; Kim, Okgil; Kim, Hyun Cheol; Kim, Soontae

    2016-09-01

    The South Korean government plans to reduce region-wide annual PM2.5 (particulate matter with an aerodynamic diameter ≤2.5 μm) concentrations in the Seoul Capital Area (SCA) from 2010 levels of 27 µg/m(3) to 20 µg/m(3) by 2024. At the same time, it is inevitable that emissions from fossil-fuel power plants will continue to increase if electricity generation expands and the generation portfolio remains the same in the future. To estimate incremental PM2.5 contributions due to projected electricity generation growth in South Korea, we utilized an ensemble forecasting member of the Integrated Multidimensional Air Quality System for Korea based on the Community Multi-scale Air Quality model. We performed sensitivity runs with across-the-board emission reductions for all fossil-fuel power plants in South Korea to estimate the contribution of PM2.5 from domestic fossil-fuel power plants. We estimated that fossil-fuel power plants are responsible for 2.4% of the annual PM2.5 national ambient air quality standard in the SCA as of 2010. Based on the electricity generation and the annual contribution of fossil-fuel power plants in 2010, we estimated that annual PM2.5 concentrations may increase by 0.2 µg/m(3) per 100 TWhr due to additional electricity generation. With currently available information on future electricity demands, we estimated that the total future contribution of fossil-fuel power plants would be 0.87 µg/m(3), which is 12.4% of the target reduction amount of the annual PM2.5 concentration by 2024. We also approximated that the number of premature deaths caused by existing fossil-fuel power plants would be 736 in 2024. Since the proximity of power plants to the SCA and the types of fuel used significantly impact this estimation, further studies are warranted on the impact of physical parameters of plants, such as location and stack height, on PM2.5 concentrations in the SCA due to each precursor. Improving air quality by reducing fine particle pollution is challenging when fossil-fuel-based electricity production is increasing. We show that an air quality forecasting system based on a photochemical model can be utilized to efficiently estimate PM2.5 contributions from and health impacts of domestic power plants. We derived PM2.5 concentrations per unit amount of electricity production from existing fossil-fuel power plants in South Korea. We assessed the health impacts of existing fossil-fuel power plants and the PM2.5 concentrations per unit electricity production to quantify the significance of existing and future fossil-fuel power plants with respect to the planned PM2.5 reduction target.
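    The scaling from electricity generation to PM2.5 contribution quoted above can be checked with simple arithmetic; the numbers below restate the abstract's figures (0.2 µg/m³ per 100 TWh, a projected 0.87 µg/m³ future contribution, and a target reduction from 27 to 20 µg/m³), so the script is only a bookkeeping check, not the underlying photochemical modeling.

```python
# Bookkeeping check of the generation-to-PM2.5 scaling quoted in the abstract.
increment_per_100_twh = 0.2          # ug/m^3 of annual PM2.5 per 100 TWh of generation
future_contribution = 0.87           # ug/m^3 projected total contribution by 2024

target_reduction = 27.0 - 20.0       # ug/m^3: 2010 level minus the 2024 target
share_of_target = future_contribution / target_reduction
print(f"fossil-fuel contribution equals {share_of_target:.1%} of the reduction target")  # ~12.4%

# Implied generation consistent with the projected contribution, assuming linear scaling
implied_twh = future_contribution / increment_per_100_twh * 100
print(f"corresponds to roughly {implied_twh:.0f} TWh of fossil-fuel generation")
```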

  8. A near-optimal low complexity sensor fusion technique for accurate indoor localization based on ultrasound time of arrival measurements from low-quality sensors

    NASA Astrophysics Data System (ADS)

    Mitilineos, Stelios A.; Argyreas, Nick D.; Thomopoulos, Stelios C. A.

    2009-05-01

    A fusion-based localization technique for location-based services in indoor environments is introduced herein, based on ultrasound time-of-arrival measurements from multiple off-the-shelf range-estimating sensors which are used in a market-available localization system. In-situ field measurement results indicated that the respective off-the-shelf system was unable to estimate position in most of the cases, while the underlying sensors are of low quality and yield highly inaccurate range and position estimates. An extensive analysis is performed and a model of the sensor performance characteristics is established. A low-complexity but accurate sensor fusion and localization technique is then developed, which consists of evaluating multiple sensor measurements and selecting the one that is considered most accurate based on the underlying sensor model. Optimality, in the sense of a genie selecting the optimum sensor, is subsequently evaluated and compared to the proposed technique. The experimental results indicate that the proposed fusion method exhibits near-optimal performance and, albeit being theoretically suboptimal, it largely overcomes most flaws of the underlying single-sensor system, resulting in a localization system of increased accuracy, robustness and availability.
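    The selection-based fusion idea above can be sketched as follows: each sensor reports a range and an expected error derived from a simple error model, and the measurement with the smallest predicted error is kept. The error model (error growing with range) and all values are illustrative assumptions rather than the paper's calibrated sensor characteristics.

```python
import numpy as np

def predicted_error(range_m):
    """Assumed sensor error model: error standard deviation grows with measured range."""
    return 0.05 + 0.04 * range_m     # metres

# Ranges reported by several low-quality ultrasound sensors for the same target
readings = {"sensor_A": 2.1, "sensor_B": 2.6, "sensor_C": 1.9, "sensor_D": 4.8}

# Selection-based fusion: keep the measurement the model predicts to be most accurate
errors = {name: predicted_error(r) for name, r in readings.items()}
best = min(errors, key=errors.get)

print(f"selected {best}: range {readings[best]:.2f} m "
      f"(predicted sigma {errors[best]:.2f} m)")
```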

  9. HOS network-based classification of power quality events via regression algorithms

    NASA Astrophysics Data System (ADS)

    Palomares Salas, José Carlos; González de la Rosa, Juan José; Sierra Fernández, José María; Pérez, Agustín Agüera

    2015-12-01

    This work compares seven regression algorithms implemented in artificial neural networks (ANNs) supported by 14 power-quality features, which are based on higher-order statistics. Combining time- and frequency-domain estimators to deal with non-stationary measurement sequences, the final goal of the system is implementation in the future smart grid to guarantee compatibility between all connected equipment. The principal results are based on spectral kurtosis measurements, which easily adapt to the impulsive nature of power quality events. These results verify that the proposed technique is capable of offering interesting results for power quality (PQ) disturbance classification. The best results are obtained using radial basis networks, generalized regression, and multilayer perceptrons, mainly due to the non-linear nature of the data.

  10. Stormwater quality modelling in combined sewers: calibration and uncertainty analysis.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

    Estimating the level of uncertainty in urban stormwater quality models is vital for their utilization. This paper presents the results of applying a Markov chain Monte Carlo method based on Bayesian theory for the calibration and uncertainty analysis of a stormwater quality model commonly used in available software. The tested model uses a hydrologic/hydrodynamic scheme to estimate the accumulation, erosion and transport of pollutants on surfaces and in sewers. It was calibrated for four different initial conditions of in-sewer deposits. Calibration results showed large variability in the model's responses as a function of the initial conditions. They demonstrated that the model's predictive capacity is very low.
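    A minimal Metropolis random-walk sketch of Bayesian calibration for a buildup/washoff-type stormwater quality model is shown below; the exponential washoff form, prior bounds, and synthetic observations are assumptions for illustration, not the model or data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic event: runoff intensity (mm/h) and "observed" TSS wash-off load (kg/ha)
runoff = np.array([2.0, 5.0, 9.0, 6.0, 3.0, 1.0])

def washoff(params, runoff, m0=50.0, dt=0.25):
    """Exponential washoff with mass depletion: load_t = M_t * (1 - exp(-k * i^w * dt))."""
    k, w = params
    mass, loads = m0, []
    for i in runoff:
        frac = 1.0 - np.exp(-k * i ** w * dt)
        loads.append(mass * frac)
        mass -= loads[-1]
    return np.array(loads)

observed = washoff([0.05, 1.3], runoff) + rng.normal(0, 0.3, runoff.size)

def log_post(params, sigma=0.3):
    k, w = params
    if not (0.0 < k < 1.0 and 0.5 < w < 3.0):         # uniform prior bounds
        return -np.inf
    resid = observed - washoff(params, runoff)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2      # Gaussian likelihood (up to a constant)

# Metropolis random-walk sampler
theta, lp, chain = np.array([0.1, 1.0]), None, []
lp = log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0, [0.01, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[1000:])                         # discard burn-in
print("posterior means (k, w):", chain.mean(axis=0).round(3))
```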

  11. Combining Nordtest method and bootstrap resampling for measurement uncertainty estimation of hematology analytes in a medical laboratory.

    PubMed

    Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong

    2017-12-01

    Measurement uncertainty (MU) is a metrological concept which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution, based on mathematical statistics, using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U [k=2]). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU.
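    As a sketch of the combined approach described above, the snippet bootstraps IQC results to estimate the within-laboratory reproducibility component, combines it with a bias component from EQA data in the Nordtest fashion (u_c = sqrt(u_Rw^2 + u_bias^2), U = 2*u_c), and reports the expanded uncertainty. The synthetic IQC/EQA values and the simplified bias formula are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical IQC results for one analyte (e.g., WBC count, 10^9/L) over 6 months
iqc = rng.normal(6.0, 0.12, 120)

# Hypothetical EQA data: percent deviations from assigned values over 12 months,
# and an assumed standard uncertainty of the assigned (reference) values (%)
eqa_bias_pct = np.array([0.8, -1.1, 0.5, 1.4, -0.6, 0.9])
u_cref_pct = 0.7

# Bootstrap the within-laboratory reproducibility (as a CV, %) from the IQC sample
boot_cv = []
for _ in range(2000):
    s = rng.choice(iqc, size=iqc.size, replace=True)
    boot_cv.append(s.std(ddof=1) / s.mean() * 100)
u_rw = float(np.mean(boot_cv))                           # within-lab component, %

# Nordtest-style bias component (root mean square bias plus reference uncertainty)
u_bias = np.sqrt(np.mean(eqa_bias_pct ** 2) + u_cref_pct ** 2)

u_combined = np.sqrt(u_rw ** 2 + u_bias ** 2)
U_expanded = 2 * u_combined                              # coverage factor k = 2
print(f"u(Rw) = {u_rw:.2f}%, u(bias) = {u_bias:.2f}%, U (k=2) = {U_expanded:.2f}%")
```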

  12. Inter-comparison of interpolated background nitrogen dioxide concentrations across Greater Manchester, UK

    NASA Astrophysics Data System (ADS)

    Lindley, S. J.; Walsh, T.

    There are many modelling methods dedicated to the estimation of spatial patterns in pollutant concentrations, each with their distinctive advantages and disadvantages. The derivation of a surface of air quality values from monitoring data alone requires the conversion of point-based data from a limited number of monitoring stations to a continuous surface using interpolation. Since interpolation techniques involve the estimation of data at unsampled points based on calculated relationships between data measured at a number of known sample points, they are subject to some uncertainty, both in terms of the values estimated and their spatial distribution. These uncertainties, which are incorporated into many empirical and semi-empirical mapping methodologies, could be recognised in any further usage of the data and also in the assessment of the extent of an exceedence of an air quality standard and the degree of exposure this may represent. There is a wide range of available interpolation techniques, and the differences in their characteristics result in variations in the output surfaces estimated from the same set of input points. The work presented in this paper provides an examination of these uncertainties through the application of a number of interpolation techniques available in standard GIS packages to a case study nitrogen dioxide data set for the Greater Manchester conurbation in northern England. The implications of the use of different techniques are discussed through application to hourly concentrations during an air quality episode and to annual average concentrations in 2001. The estimated concentration patterns show considerable differences in the spatial location of maxima, reflecting the combined effects of chemical processes, topography and meteorology. In the case of air quality episodes, the considerable spatial variability of concentrations results in large uncertainties in the surfaces produced, but these uncertainties vary widely from area to area. In view of the uncertainties associated with classical techniques, research is ongoing to develop alternative methods which should in time help improve the suite of tools available to air quality managers.

  13. Quality and rigor of the concept mapping methodology: a pooled study analysis.

    PubMed

    Rosas, Scott R; Kane, Mary

    2012-05-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters for each. In addition, variation in concept mapping data collection in relation to these characteristics and estimates was examined. Overall, the results suggest that concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference for assessing the quality and rigor of future concept mapping studies are discussed.

  14. Development of fuzzy air quality index using soft computing approach.

    PubMed

    Mandal, T; Gorai, A K; Pathak, G

    2012-10-01

    Proper assessment of air quality status in an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, a discrepancy frequently arises from the air quality criteria employed and the vagueness or fuzziness embedded in the decision-making output values. Owing to inherent imprecision, difficulties always exist in some conventional methodologies, such as the air quality index, when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been demonstrated to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) for assessing air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique and the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
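    A toy Mamdani-style fuzzy inference sketch for a single pollutant is given below: trapezoidal membership functions grade a PM10 concentration as low/moderate/high, simple rules map those grades to air quality classes, and centroid defuzzification produces a crisp index. The breakpoints, rules, and scales are illustrative assumptions, not the fuzzy air quality index developed in the paper.

```python
import numpy as np

def trap(x, a, b, c, d):
    """Trapezoidal membership function with feet at a, d and plateau between b and c."""
    return np.clip(np.minimum((x - a) / (b - a + 1e-9), (d - x) / (d - c + 1e-9)), 0.0, 1.0)

def fuzzy_aqi(pm10):
    # Fuzzify the input concentration (ug/m^3) into linguistic grades
    low = trap(pm10, -1, 0, 20, 60)
    moderate = trap(pm10, 30, 80, 80, 130)
    high = trap(pm10, 100, 160, 200, 201)

    # Output air quality classes on a 0-100 index scale
    idx = np.linspace(0, 100, 201)
    good, fair, poor = trap(idx, -1, 0, 20, 40), trap(idx, 25, 50, 50, 75), trap(idx, 60, 85, 100, 101)

    # Mamdani inference: clip each output set by its rule strength, then aggregate by max
    aggregated = np.maximum.reduce([np.minimum(low, good),
                                    np.minimum(moderate, fair),
                                    np.minimum(high, poor)])

    # Centroid defuzzification gives the crisp fuzzy air quality index
    return np.sum(idx * aggregated) / (np.sum(aggregated) + 1e-9)

for c in (20, 75, 160):
    print(f"PM10 = {c:3d} ug/m^3  ->  fuzzy AQI = {fuzzy_aqi(c):5.1f}")
```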

  15. Assessing the influence of land use land cover pattern, socio economic factors and air quality status to predict morbidity on the basis of logistic based regression model

    NASA Astrophysics Data System (ADS)

    Dixit, A.; Singh, V. K.

    2017-12-01

    Recent studies conducted by the World Health Organisation (WHO) estimated that 92% of the world's population live in places where the air quality level exceeds the WHO standard limit. This is due to changes in the Land Use Land Cover (LULC) pattern, socio-economic drivers and anthropogenic heat emission caused by man-made activity. As a result, many prevalent human respiratory diseases, such as lung cancer, chronic obstructive pulmonary disease and emphysema, have increased in recent times. In this study, a quantitative relationship is developed between land use (built-up land, water bodies, and vegetation), socio-economic drivers and air quality parameters using a logistic regression based model over 7 different cities of India for the winter seasons of 2012 to 2016. Different LULC, socio-economic, industrial emission source, meteorological condition and air quality data from the monitoring stations are taken to estimate the influence on morbidity in each city. Correlations are analyzed between land use variables and monthly concentrations of pollutants; these values range from 0.63 to 0.76. Similarly, the correlation between land use variables, socio-economic factors and morbidity ranges from 0.57 to 0.73. The performance of the model improves from 67% to 79% in estimating morbidity for the years 2015 and 2016 due to the better availability of observed data. The study highlights the growing importance of incorporating socio-economic drivers with air quality data for evaluating the morbidity rate of each city, in comparison to a purely quantitative analysis of air quality change.
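    A minimal sketch of the kind of logistic-regression-based morbidity model described above is shown: LULC fractions, a socio-economic indicator, and an air quality variable predict a binary high/low morbidity label. All feature names, the synthetic data, and the model settings are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400  # hypothetical city-month observations

# Illustrative predictors: built-up fraction, vegetation fraction, a socio-economic
# (population density) index, and mean PM10 concentration (ug/m^3)
X = np.column_stack([
    rng.uniform(0.2, 0.9, n),      # built-up land fraction
    rng.uniform(0.05, 0.5, n),     # vegetation fraction
    rng.uniform(0.1, 1.0, n),      # socio-economic index
    rng.uniform(40, 250, n),       # PM10
])

# Synthetic high-morbidity label loosely driven by built-up land and PM10
logit = -6 + 4 * X[:, 0] - 3 * X[:, 1] + 2 * X[:, 2] + 0.02 * X[:, 3]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("coefficients:", np.round(model.coef_[0], 3))
print(f"hold-out accuracy: {model.score(X_te, y_te):.2f}")
```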

  16. An Optical Flow-Based Full Reference Video Quality Assessment Algorithm.

    PubMed

    K, Manasa; Channappayya, Sumohana S

    2016-06-01

    We present a simple yet effective optical flow-based full-reference video quality assessment (FR-VQA) algorithm for assessing the perceptual quality of natural videos. Our algorithm is based on the premise that local optical flow statistics are affected by distortions and that the deviation from pristine flow statistics is proportional to the amount of distortion. We characterize the local flow statistics using the mean, the standard deviation, the coefficient of variation (CV), and the minimum eigenvalue (λ_min) of the local flow patches. Temporal distortion is estimated as the change in the CV of the distorted flow with respect to the reference flow, and the correlation between λ_min of the reference and of the distorted patches. We rely on the robust multi-scale structural similarity index for spatial quality estimation. The computed temporal and spatial distortions are then pooled using a perceptually motivated heuristic to generate a spatio-temporal quality score. The proposed method is shown to be competitive with the state-of-the-art when evaluated on the LIVE SD database, the EPFL-PoliMI SD database, and the LIVE Mobile HD database. The distortions considered in these databases include those due to compression, packet loss, wireless channel errors, and rate adaptation. Our algorithm is flexible enough to allow for any robust FR spatial distortion metric for spatial distortion estimation. In addition, the proposed method is not only parameter-free but also independent of the choice of the optical flow algorithm. Finally, we show that replacing the optical flow vectors in our proposed method with much coarser block motion vectors also results in an acceptable FR-VQA algorithm. Our algorithm is called the flow similarity index.
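
    The sketch below computes the two temporal cues named in the abstract, the coefficient of variation of flow magnitude and the minimum eigenvalue of the local flow covariance, over non-overlapping patches, and compares reference and distorted statistics. The patch size and the covariance-based definition of λ_min are assumptions for illustration; they are not taken from the paper.

```python
import numpy as np

def flow_patch_features(u, v, patch=8):
    """Per-patch optical-flow statistics used as distortion cues: coefficient of
    variation (CV) of flow magnitude and minimum eigenvalue of the 2x2 flow
    covariance. u, v are dense flow components (H x W). Illustrative sketch only."""
    h, w = u.shape
    cvs, lmins = [], []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            pu = u[i:i + patch, j:j + patch].ravel()
            pv = v[i:i + patch, j:j + patch].ravel()
            mag = np.hypot(pu, pv)
            cvs.append(mag.std() / (mag.mean() + 1e-8))
            cov = np.cov(np.vstack([pu, pv]))          # 2x2 covariance of patch flow
            lmins.append(np.linalg.eigvalsh(cov)[0])   # minimum eigenvalue
    return np.array(cvs), np.array(lmins)

def temporal_distortion(cv_ref, cv_dis, lmin_ref, lmin_dis):
    """Deviation of distorted-flow statistics from the reference statistics."""
    dcv = np.abs(cv_dis - cv_ref).mean()
    corr = np.corrcoef(lmin_ref, lmin_dis)[0, 1]
    return dcv, corr

# Tiny usage with synthetic flow fields.
rng = np.random.default_rng(0)
u_ref, v_ref = rng.normal(size=(32, 32)), rng.normal(size=(32, 32))
u_dis = u_ref + 0.3 * rng.normal(size=(32, 32))
v_dis = v_ref + 0.3 * rng.normal(size=(32, 32))
cv_r, lm_r = flow_patch_features(u_ref, v_ref)
cv_d, lm_d = flow_patch_features(u_dis, v_dis)
print(temporal_distortion(cv_r, cv_d, lm_r, lm_d))
```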

  17. Estimating the incubation period of acute Q fever, a systematic review.

    PubMed

    Todkill, D; Fowler, T; Hawker, J I

    2018-04-01

    Estimates of the incubation period for Q fever vary substantially between different reviews and expert advice documents. We systematically reviewed and quality-appraised the literature to provide an evidence-based estimate of the incubation period of Q fever acquired via the aerosolised infection route. Medline (OVIDSP) and EMBASE were searched, with the search limited to human studies and the English language. Eligible studies included persons with symptomatic, acute Q fever and a defined exposure to Coxiella burnetii. After review of 7115 titles and abstracts, 320 records were screened at full-text level. Of these, 23 studies contained potentially useful data and were quality assessed, with eight studies (with 403 individual cases where the derivation of the incubation period was possible) being of sufficient quality and providing individual-level data to produce a pooled summary. We found a median incubation period of 18 days, with 95% of cases expected to occur between 7 and 32 days after exposure.

  18. Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging

    PubMed Central

    Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline are compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895

  19. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database

    PubMed Central

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-01

    Objective To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. Design A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Setting Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Main outcome measure Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding ‘poor’ quality practices with at least 10% incident patients inaccurately coded between 2004 and 2014. Results Incidence rates and accuracy of coding varied widely between practices and the trends differed according to selected category of code. If diagnosis codes were used, the incidence of type 2 increased sharply until 2004 (when the UK Quality Outcomes Framework was introduced), then flattened off until 2009, after which it decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled ‘poor’ quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 after 2009 became less marked and incidence rates were higher. Conclusions In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. PMID:28122831

  20. Simultaneous analysis and quality assurance for diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Asman, Andrew J; Esparza, Michael L; Burns, Scott S; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W; Davis, Nicole; Cutting, Laurie E; Landman, Bennett A

    2013-01-01

    Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline are compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible.

  1. A Social Network Approach to the Estimation of Perceived Quality of Health Care

    PubMed Central

    Carletti, Giulia; Soriani, Nicola; Mattiazzi, Martina; Gregori, Dario

    2017-01-01

    Background: Measuring service quality aids health care providers in recognizing specific and unmet needs of patients. Nevertheless, the perceived quality of health care services (PQC) is often investigated with inadequate techniques, which may lead to biased results. Objective: The aim of the present study is to develop a proof-of-concept for estimating the PQC using the scale-up estimator, with reference to a concrete assessment in patients of a major Oncology Hospital in Veneto (IOV). Results were then compared with those collected by the Customer Relations Office (CRO) in the annual survey conducted with traditional questionnaire-based techniques. Material and Methods: Seven hundred and eighty-three sets consisting of two questionnaires were handed out to IOV patients between 26 and 28 November 2012. The first questionnaire was the annual CRO one, composed of 15 direct questions about the perception of quality satisfaction rated on a Likert scale. The second questionnaire was the scale-up (NSUM) one, composed of 20 indirect questions, 5 of which reproduced the main CRO targets for estimating PQC. Results: The comparisons made over 299 sets of questionnaires showed differences between the two techniques. The Network Scale-Up Method (NSUM) questionnaire seems able to produce lower estimates of PQC than the annual CRO questionnaire. In some cases, the NSUM showed dissatisfaction rates 20-fold higher than the CRO. Conclusion: NSUM could be a promising method for assessing the perceived quality of care. PMID:29238425
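
    As a hedged illustration of the scale-up idea behind the NSUM questionnaire, the sketch below implements the basic network scale-up estimator: personal network sizes are estimated from questions about groups of known size, and the target group (e.g. dissatisfied patients) is then scaled up from the reports about it. The group sizes and counts are toy numbers, not the study's data or its exact estimator.

```python
import numpy as np

def nsum_estimate(known_counts, known_sizes, hidden_counts, population):
    """Basic network scale-up estimator, sketching the indirect-question approach.
    known_counts: (n_respondents, n_known_groups) reported acquaintances in groups
    of known size; known_sizes: sizes of those groups; hidden_counts: reported
    acquaintances in the target group; population: total reference population."""
    known_counts = np.asarray(known_counts, dtype=float)
    hidden_counts = np.asarray(hidden_counts, dtype=float)
    known_sizes = np.asarray(known_sizes, dtype=float)

    # Personal network size of each respondent, scaled to the whole population.
    degrees = population * known_counts.sum(axis=1) / known_sizes.sum()
    # Pooled estimate of the hidden group's size.
    return population * hidden_counts.sum() / degrees.sum()

# Toy numbers, purely illustrative.
est = nsum_estimate(known_counts=[[2, 5, 1], [0, 3, 2], [1, 4, 0]],
                    known_sizes=[5000, 20000, 8000],
                    hidden_counts=[1, 0, 2],
                    population=500000)
print(round(est))
```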

  2. Comparative study on fractal analysis of interferometry images with application to tear film surface quality assessment.

    PubMed

    Szyperski, Piotr D

    2018-06-01

    The purpose of this research was to evaluate the applicability of fractal dimension (FD) estimators for assessing lateral shearing interferometric (LSI) measurements of tear film surface quality. Retrospective recordings of tear film measured with LSI were used: 69 from healthy subjects and 41 from patients diagnosed with dry eye syndrome. Five surface quality descriptors were considered: four based on FD and a previously reported descriptor operating in the spatial frequency domain (M2), representing the temporal kinetics of post-blink tear film. A set of 12 regression parameters was extracted and analyzed for classification purposes. The classifiers are assessed in terms of receiver operating characteristics and the areas under their curves (AUC). The computational loads are also estimated. The maximum AUC of 82.4% was achieved for M2, closely followed by the binary box-counting (BBC) FD estimator with AUC = 78.6%. For all descriptors, statistically significant differences between the subject groups were found (p < 0.05). The BBC FD estimator had the highest empirical computational efficiency, being about 30% faster than M2, while the estimator based on differential box-counting exhibited the lowest efficiency (4.5 times slower than the best one). In conclusion, FD estimators can be utilized for quantitative assessment of tear film kinetics. They provide a viable alternative to the previously used spectral parameters, while allowing higher computational efficiency.
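
    A minimal version of a binary box-counting fractal dimension estimator, of the general kind the abstract's BBC descriptor belongs to, is sketched below; the box sizes, fitting range, and binary input are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def binary_box_counting_fd(mask):
    """Binary box-counting estimate of fractal dimension for a 2-D binary image
    (e.g. a thresholded interferometric fringe pattern). Sketch only; the paper's
    BBC estimator may differ in box sizes and fitting details."""
    mask = np.asarray(mask, dtype=bool)
    size = min(mask.shape)
    sizes, counts = [], []
    s = size // 4
    while s >= 2:
        count = 0
        for i in range(0, mask.shape[0] - s + 1, s):
            for j in range(0, mask.shape[1] - s + 1, s):
                if mask[i:i + s, j:j + s].any():   # box contains part of the structure
                    count += 1
        sizes.append(s)
        counts.append(count)
        s //= 2
    # Slope of log(count) vs log(1/size) gives the fractal dimension estimate.
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

# Quick check on a filled square (true dimension 2).
img = np.zeros((128, 128), dtype=bool)
img[32:96, 32:96] = True
print(binary_box_counting_fd(img))
```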

  3. Phosphorus and nitrogen concentrations and loads at Illinois River south of Siloam Springs, Arkansas, 1997-1999

    USGS Publications Warehouse

    Green, W. Reed; Haggard, Brian E.

    2001-01-01

    Water-quality sampling consisting of every other month (bimonthly) routine sampling and storm event sampling (six storms annually) is used to estimate annual phosphorus and nitrogen loads at Illinois River south of Siloam Springs, Arkansas. Hydrograph separation allowed assessment of base-flow and surface-runoff nutrient relations and yield. Discharge and nutrient relations indicate that water quality at Illinois River south of Siloam Springs, Arkansas, is affected by both point and nonpoint sources of contamination. Base-flow phosphorus concentrations decreased with increasing base-flow discharge indicating the dilution of phosphorus in water from point sources. Nitrogen concentrations increased with increasing base-flow discharge, indicating a predominant ground-water source. Nitrogen concentrations at higher base-flow discharges often were greater than median concentrations reported for ground water (from wells and springs) in the Springfield Plateau aquifer. Total estimated phosphorus and nitrogen annual loads for calendar years 1997-1999 using the regression techniques presented in this paper (35 samples) were similar to estimated loads derived from integration techniques (1,033 samples). Flow-weighted nutrient concentrations and nutrient yields at the Illinois River site were about 10 to 100 times greater than national averages for undeveloped basins and at North Sylamore Creek and Cossatot River (considered to be undeveloped basins in Arkansas). Total phosphorus and soluble reactive phosphorus were greater than 10 times and total nitrogen and dissolved nitrite plus nitrate were greater than 10 to 100 times the national and regional averages for undeveloped basins. These results demonstrate the utility of a strategy whereby samples are collected every other month and during selected storm events annually, with use of regression models to estimate nutrient loads. Annual loads of phosphorus and nitrogen estimated using regression techniques could provide similar results to estimates using integration techniques, with much less investment.
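
    The sketch below illustrates one common form of regression-based load estimation consistent with the approach described above: a log-log regression of constituent load on discharge with a Duan smearing correction, applied to toy data. The functional form and bias correction are generic choices, not necessarily the exact regression models used in the report.

```python
import numpy as np

def fit_load_rating_curve(discharge, concentration):
    """Log-log regression of constituent load on discharge, a simplified stand-in
    for regression-based load estimation. discharge in m^3/s, concentration in
    mg/L; returns a load predictor in kg/day."""
    q = np.asarray(discharge, dtype=float)
    c = np.asarray(concentration, dtype=float)
    load = c * q * 86.4                       # mg/L * m^3/s -> kg/day
    b1, b0 = np.polyfit(np.log(q), np.log(load), 1)
    resid = np.log(load) - (b0 + b1 * np.log(q))
    smear = np.exp(resid).mean()              # Duan smearing bias correction

    def predict(q_new):
        return smear * np.exp(b0 + b1 * np.log(np.asarray(q_new, dtype=float)))
    return predict

# Toy calibration data (purely illustrative values).
predict = fit_load_rating_curve(discharge=[2, 5, 12, 30, 80],
                                concentration=[0.9, 0.6, 0.5, 0.35, 0.30])
total_load = predict([4, 6, 9, 25, 60]).sum()   # sum load over a set of daily discharges
print(round(total_load, 1))
```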

  4. Rapid assessment of antimicrobial resistance prevalence using a Lot Quality Assurance sampling approach.

    PubMed

    van Leth, Frank; den Heijer, Casper; Beerepoot, Mariëlle; Stobberingh, Ellen; Geerlings, Suzanne; Schultsz, Constance

    2017-04-01

    Increasing antimicrobial resistance (AMR) requires rapid surveillance tools, such as Lot Quality Assurance Sampling (LQAS). LQAS classifies AMR as high or low based on set parameters. We compared classifications with the underlying true AMR prevalence using data on 1335 Escherichia coli isolates from surveys of community-acquired urinary tract infection in women, by assessing operating curves, sensitivity and specificity. Sensitivity and specificity of any set of LQAS parameters were above 99% and between 79% and 90%, respectively. Operating curves showed high concordance of the LQAS classification with true AMR prevalence estimates. LQAS-based AMR surveillance is a feasible approach that provides timely and locally relevant estimates, and the necessary information to formulate and evaluate guidelines for empirical treatment.
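
    The operating characteristics mentioned in the abstract can be illustrated with a small sketch: for an LQAS rule that classifies prevalence as 'high' when at least d of n sampled isolates are resistant, sensitivity and specificity follow from the binomial distribution. The particular n, d and prevalence thresholds below are illustrative, not the parameter sets evaluated in the study.

```python
from scipy.stats import binom

def lqas_operating_point(n, d, p_low, p_high):
    """Operating characteristics of an LQAS rule that classifies AMR prevalence as
    'high' when at least d of n sampled isolates are resistant. p_low / p_high are
    the prevalence thresholds the rule is meant to separate. Illustrative sketch."""
    # Probability of correctly classifying 'high' when true prevalence is p_high.
    sensitivity = 1 - binom.cdf(d - 1, n, p_high)
    # Probability of correctly classifying 'low' when true prevalence is p_low.
    specificity = binom.cdf(d - 1, n, p_low)
    return sensitivity, specificity

# Example parameter set: 30 isolates, classify 'high' at 6+ resistant.
sens, spec = lqas_operating_point(n=30, d=6, p_low=0.10, p_high=0.30)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}")
```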

  5. Evaluation of Observation-Fused Regional Air Quality Model Results for Population Air Pollution Exposure Estimation

    PubMed Central

    Chen, Gang; Li, Jingyi; Ying, Qi; Sherman, Seth; Perkins, Neil; Rajeshwari, Sundaram; Mendola, Pauline

    2014-01-01

    In this study, the Community Multiscale Air Quality (CMAQ) model was applied to predict ambient gaseous and particulate concentrations from 2001 to 2010 in 15 hospital referral regions (HRRs) using a 36-km horizontal resolution domain. An inverse distance weighting based method was applied to produce exposure estimates from observation-fused regional pollutant concentration fields, using the differences between observations and predictions at grid cells where air quality monitors were located. Although the raw CMAQ model is capable of producing satisfactory results for O3 and PM2.5 based on EPA guidelines, using the observation data fusion technique to correct CMAQ predictions leads to significant improvement in model performance for all gaseous and particulate pollutants. Regional average concentrations were calculated using five different methods: 1) inverse distance weighting of observation data alone, 2) raw CMAQ results, 3) observation-fused CMAQ results, 4) population-averaged raw CMAQ results and 5) population-averaged fused CMAQ results. The results show that while the O3 (as well as NOx) monitoring networks in the HRRs are dense enough to provide consistent regional average exposure estimates based on monitoring data alone, PM2.5 observation sites (as well as monitors for CO, SO2, PM10 and PM2.5 components) are usually sparse, and the average concentrations estimated from the inverse-distance-interpolated observations, raw CMAQ and fused CMAQ results can differ significantly. Population-weighted averages should be used to account for spatial variation in pollutant concentration and population density. Using raw CMAQ results or observations alone might lead to significant biases in health outcome analyses. PMID:24747248
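
    A simplified sketch of the observation-fusion step described above: residuals between monitor observations and model predictions are interpolated to the model grid with inverse distance weighting and added back to the raw model field. The grid geometry, distance power and all numbers are illustrative assumptions.

```python
import numpy as np

def idw_fused_field(model_field, grid_xy, obs_xy, obs_values, model_at_obs, power=2.0):
    """Observation fusion sketch: interpolate (observation - model) residuals at the
    monitor locations onto the model grid with inverse distance weighting and add
    them back to the raw model field. Variable names are illustrative."""
    obs_xy = np.asarray(obs_xy, dtype=float)
    residuals = np.asarray(obs_values, dtype=float) - np.asarray(model_at_obs, dtype=float)
    fused = np.array(model_field, dtype=float)
    for k, (x, y) in enumerate(np.asarray(grid_xy, dtype=float)):
        d = np.hypot(obs_xy[:, 0] - x, obs_xy[:, 1] - y)
        if np.any(d < 1e-9):                       # grid cell coincides with a monitor
            fused[k] += residuals[np.argmin(d)]
            continue
        w = 1.0 / d ** power
        fused[k] += np.sum(w * residuals) / np.sum(w)
    return fused

# Tiny example: 3 grid cells, 2 monitors.
grid_xy = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]
obs_xy = [[0.0, 0.0], [2.0, 0.0]]
print(idw_fused_field(model_field=[30.0, 32.0, 35.0], grid_xy=grid_xy,
                      obs_xy=obs_xy, obs_values=[28.0, 40.0], model_at_obs=[30.0, 35.0]))
```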

  6. Urban-rural variations in air quality and health impacts in northern India

    NASA Astrophysics Data System (ADS)

    Karambelas, A. N.; Holloway, T.; Fiore, A. M.; Kinney, P.; DeFries, R. S.; Kiesewetter, G.; Heyes, C.

    2017-12-01

    Ambient air pollution in India is a severe problem, contributing to negative health impacts and early death. The ground-based monitors often used to quantify health impacts are mostly located in urban regions; however, approximately 70% of India's population resides in rural areas. We use high-resolution concentrations from the regional Community Multi-scale Air Quality (CMAQ) model over densely populated northern India to estimate air quality and health impacts due to anthropogenic emission sectors separately for urban and rural regions. Modeled concentrations inform relative risk calculations and exposure estimates as performed in the Global Burden of Disease. Anthropogenic emissions from the International Institute for Applied Systems Analysis (IIASA) Greenhouse Gas-Air Pollution Interactions and Synergies (GAINS) model, following version 5a of the Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants project gridding structure, are updated to reflect urban- and rural-specific activity information for transportation and residential combustion, and industrial and electrical generating unit location and magnitude information. We estimate that 314,000 (95% Confidence Interval: 304,000-323,000) and 58,000 (CI: 39,000-70,000) adults (25 years or older) die prematurely each year from PM2.5 and O3, respectively, in northern India, with the greatest impacts along the Indo-Gangetic Plain. Using urban and rural population distributions, we estimate that the majority of premature deaths resulting from PM2.5 and O3 occur in rural (292,000) rather than urban (79,000) regions. These findings indicate the need to design monitoring networks and ground-based health studies in rural areas of India to more accurately quantify the true health implications of ambient air pollution, in addition to supporting model evaluation. Using this urban-versus-rural emissions framework, we are assessing anthropogenic contributions to regional air quality and health impacts, and examining mitigation strategies to reduce anthropogenic emissions, improve air quality, and reduce PM2.5- and O3-attributable premature deaths in the near-term.
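
    The sketch below shows the general structure of an attributable-mortality calculation of the kind the abstract performs, using a simple log-linear relative-risk function and the population attributable fraction. The concentration-response coefficient and counterfactual concentration are placeholders; the study itself relies on the Global Burden of Disease exposure-response functions rather than this simplified form.

```python
import numpy as np

def attributable_deaths(pm25, baseline_deaths, beta=0.006, c0=5.8):
    """Simplified attributable-mortality calculation: a log-linear concentration-
    response (relative risk RR = exp(beta * (C - C0)) above a counterfactual C0)
    and the population attributable fraction PAF = (RR - 1) / RR. The coefficient
    and counterfactual here are illustrative placeholders."""
    rr = np.exp(beta * np.maximum(np.asarray(pm25, dtype=float) - c0, 0.0))
    paf = (rr - 1.0) / rr
    return paf * np.asarray(baseline_deaths, dtype=float)

# Two illustrative districts: annual-mean PM2.5 (ug/m3) and adult baseline deaths per year.
print(attributable_deaths(pm25=[95.0, 60.0], baseline_deaths=[12000, 8000]))
```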

  7. Cost-effective water quality assessment through the integration of monitoring data and modeling results

    NASA Astrophysics Data System (ADS)

    Lobuglio, Joseph N.; Characklis, Gregory W.; Serre, Marc L.

    2007-03-01

    Sparse monitoring data and error inherent in water quality models make the identification of waters not meeting regulatory standards uncertain. Additional monitoring can be implemented to reduce this uncertainty, but it is often expensive. These costs are currently a major concern, since developing total maximum daily loads, as mandated by the Clean Water Act, will require assessing tens of thousands of water bodies across the United States. This work uses the Bayesian maximum entropy (BME) method of modern geostatistics to integrate water quality monitoring data together with model predictions to provide improved estimates of water quality in a cost-effective manner. This information includes estimates of uncertainty and can be used to aid probability-based decisions concerning the status of a water body (i.e., impaired or not impaired) and the level of monitoring needed to characterize that water body for regulatory purposes. This approach is applied to the Catawba River reservoir system in western North Carolina as a means of estimating seasonal chlorophyll a concentration. Mean concentration and confidence intervals for chlorophyll a are estimated for 66 reservoir segments over an 11-year period (726 values) based on 219 measured seasonal averages and 54 model predictions. Although the model predictions had a high degree of uncertainty, integration of modeling results via BME methods reduced the uncertainty associated with chlorophyll estimates compared with estimates made solely with information from monitoring efforts. Probabilistic predictions of future chlorophyll levels on one reservoir are used to illustrate the cost savings that can be achieved by less extensive and rigorous monitoring methods within the BME framework. While BME methods have been applied in several environmental contexts, employing these methods as a means of integrating monitoring and modeling results, as well as application of this approach to the assessment of surface water monitoring networks, represent unexplored areas of research.

  8. DYNAMIC EVALUATION OF REGIONAL AIR QUALITY MODELS: ASSESSING CHANGES TO O 3 STEMMING FROM CHANGES IN EMISSIONS AND METEOROLOGY

    EPA Science Inventory

    Regional-scale air quality models are used to estimate the response of air pollutants to potential emission control strategies as part of the decision-making process. Traditionally, the model predicted pollutant concentrations are evaluated for the “base case” to assess a model’s...

  9. Guide to Selected Algorithms, Distributions, and Databases Used in Exposure Models Developed By the Office of Air Quality Planning and Standards

    EPA Pesticide Factsheets

    In the evaluation of emissions standards, OAQPS frequently uses one or more computer-based models to estimate the number of people who will be exposed to the air pollution levels that are expected to occur under various air quality scenarios.

  10. A Reduced Form Model for Ozone Based on Two Decades of CMAQ Simulations for the Continental United States

    EPA Science Inventory

    A Reduced Form Model (RFM) is a mathematical relationship between the inputs and outputs of an air quality model, permitting estimation of additional modeling without costly new regional-scale simulations. A 21-year Community Multiscale Air Quality (CMAQ) simulation for the con...

  11. 40 CFR 130.7 - Total maximum daily loads (TMDL) and individual water quality-based effluent limitations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of a balanced indigenous population of shellfish, fish and wildlife. (3) For the purposes of listing... propagation of a balanced, indigenous population of shellfish, fish and wildlife. Such estimates shall take... water quality criteria for protection and propagation of a balanced, indigenous population of shellfish...

  12. 40 CFR 130.7 - Total maximum daily loads (TMDL) and individual water quality-based effluent limitations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of a balanced indigenous population of shellfish, fish and wildlife. (3) For the purposes of listing... propagation of a balanced, indigenous population of shellfish, fish and wildlife. Such estimates shall take... water quality criteria for protection and propagation of a balanced, indigenous population of shellfish...

  13. 40 CFR 130.7 - Total maximum daily loads (TMDL) and individual water quality-based effluent limitations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of a balanced indigenous population of shellfish, fish and wildlife. (3) For the purposes of listing... propagation of a balanced, indigenous population of shellfish, fish and wildlife. Such estimates shall take... water quality criteria for protection and propagation of a balanced, indigenous population of shellfish...

  14. Effect of water quality sampling time and frequency on storm load predictions of a prominent regression-based model

    USDA-ARS?s Scientific Manuscript database

    High frequency in situ measurements of nitrate can greatly reduce the uncertainty in nitrate flux estimates. Water quality databases maintained by various federal and state agencies often consist of pollutant concentration data obtained from periodic grab samples collected from gauged reaches of a s...

  15. SEASONAL NH3 EMISSION ESTIMATES FOR THE EASTERN UNITED STATES BASED ON AMMONIUM WET CONCENTRATIONS AND AN INVERSE MODELING METHOD

    EPA Science Inventory

    Significant uncertainty exists in the magnitude and variability of ammonia (NH3) emissions. NH3 emissions are needed as input for air quality modeling of aerosols and deposition of nitrogen compounds. Approximately 85% of NH3 emissions are estimated to come from agricultural ...

  16. Region-specific S-wave attenuation for earthquakes in northwestern Iran

    NASA Astrophysics Data System (ADS)

    Heidari, Reza; Mirzaei, Noorbakhsh

    2017-11-01

    In this study, the continuous wavelet transform is applied to estimate the frequency-dependent quality factor of shear waves, Q_S, in northwestern Iran. The dataset used in this study includes velocigrams of more than 50 events with magnitudes between 4.0 and 6.5 that occurred in the study area. The CWT-based method provides a high-resolution technique for the estimation of S-wave frequency-dependent attenuation. The quality factor values are determined in the form of a power law as Q_S(f) = (147 ± 16) f^(0.71 ± 0.02) and Q_S(f) = (126 ± 12) f^(0.73 ± 0.02) for the vertical and horizontal components, respectively, where f is between 0.9 and 12 Hz. Furthermore, in order to verify the reliability of the suggested Q_S estimation method, an additional test is performed using accelerograms of the Ahar-Varzaghan dual earthquakes of August 11, 2012, of moment magnitudes 6.4 and 6.3, and their aftershocks. Results indicate that the estimated Q_S values from the CWT-based method are not very sensitive to the number and type of waveforms used (velocity or acceleration).
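
    For reference, the reported power laws can be evaluated directly; the short sketch below computes Q_S at a few frequencies inside the stated 0.9-12 Hz band using the central estimates only (the reported uncertainties are ignored).

```python
import numpy as np

# Evaluate the reported S-wave quality-factor power laws, Q_S(f) = Q0 * f**n,
# at a few frequencies within the stated 0.9-12 Hz band (central values only).
f = np.array([1.0, 3.0, 6.0, 12.0])
q_vertical = 147.0 * f ** 0.71
q_horizontal = 126.0 * f ** 0.73
for fi, qv, qh in zip(f, q_vertical, q_horizontal):
    print(f"f = {fi:4.1f} Hz   Q_S vertical = {qv:6.1f}   horizontal = {qh:6.1f}")
```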

  17. WE-G-204-07: Automated Characterization of Perceptual Quality of Clinical Chest Radiographs: Improvements in Lung, Spine, and Hardware Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, J; Zhang, L; Samei, E

    Purpose: To develop and validate more robust methods for automated lung, spine, and hardware detection in AP/PA chest images. This work is part of a continuing effort to automatically characterize the perceptual image quality of clinical radiographs. [Y. Lin et al. Med. Phys. 39, 7019–7031 (2012)] Methods: Our previous implementation of lung/spine identification was applicable to only one vendor. A more generalized routine was devised based on three primary components: lung boundary detection, fuzzy c-means (FCM) clustering, and a clinically-derived lung pixel probability map. Boundary detection was used to constrain the lung segmentations. FCM clustering produced grayscale- and neighborhood-based pixel classification probabilities which are weighted by the clinically-derived probability maps to generate a final lung segmentation. Lung centerlines were set along the left-right lung midpoints. Spine centerlines were estimated as a weighted average of body contour, lateral lung contour, and intensity-based centerline estimates. Centerline estimation was tested on 900 clinical AP/PA chest radiographs which included inpatient/outpatient, upright/bedside, men/women, and adult/pediatric images from multiple imaging systems. Our previous implementation further did not account for the presence of medical hardware (pacemakers, wires, implants, staples, stents, etc.) potentially biasing image quality analysis. A hardware detection algorithm was developed using a gradient-based thresholding method. The training and testing paradigm used a set of 48 images from which 1920 51×51 pixel² ROIs with and 1920 ROIs without hardware were manually selected. Results: Acceptable lung centerlines were generated in 98.7% of radiographs while spine centerlines were acceptable in 99.1% of radiographs. Following threshold optimization, the hardware detection software yielded average true positive and true negative rates of 92.7% and 96.9%, respectively. Conclusion: Updated segmentation and centerline estimation methods in addition to new gradient-based hardware detection software provide improved data integrity control and error-checking for automated clinical chest image quality characterization across multiple radiography systems.

  18. The reliability of the Glasgow Coma Scale: a systematic review.

    PubMed

    Reith, Florence C M; Van den Brande, Ruben; Synnot, Anneliese; Gruen, Russell; Maas, Andrew I R

    2016-01-01

    The Glasgow Coma Scale (GCS) provides a structured method for assessment of the level of consciousness. Its derived sum score is applied in research and adopted in intensive care unit scoring systems. Controversy exists on the reliability of the GCS. The aim of this systematic review was to summarize evidence on the reliability of the GCS. A literature search was undertaken in MEDLINE, EMBASE and CINAHL. Observational studies that assessed the reliability of the GCS, expressed by a statistical measure, were included. Methodological quality was evaluated with the consensus-based standards for the selection of health measurement instruments checklist and its influence on results considered. Reliability estimates were synthesized narratively. We identified 52 relevant studies that showed significant heterogeneity in the type of reliability estimates used, patients studied, setting and characteristics of observers. Methodological quality was good (n = 7), fair (n = 18) or poor (n = 27). In good quality studies, kappa values were ≥0.6 in 85%, and all intraclass correlation coefficients indicated excellent reliability. Poor quality studies showed lower reliability estimates. Reliability for the GCS components was higher than for the sum score. Factors that may influence reliability include education and training, the level of consciousness and type of stimuli used. Only 13% of studies were of good quality and inconsistency in reported reliability estimates was found. Although the reliability was adequate in good quality studies, further improvement is desirable. From a methodological perspective, the quality of reliability studies needs to be improved. From a clinical perspective, a renewed focus on training/education and standardization of assessment is required.

  19. Impact of voice- and knowledge-enabled clinical reporting--US example.

    PubMed

    Bushko, Renata G; Havlicek, Penny L; Deppert, Edward; Epner, Stephen

    2002-01-01

    This study shows qualitative and quantitative estimates of the national and clinic-level impact of utilizing voice- and knowledge-enabled clinical reporting systems. Using a common-sense estimation methodology, we show that the delivery of health care can experience a dramatic improvement in four areas as a result of the broad use of voice- and knowledge-enabled clinical reporting: (1) Process Quality as measured by cost savings, (2) Organizational Quality as measured by compliance, (3) Clinical Quality as measured by clinical outcomes and (4) Service Quality as measured by patient satisfaction. If only 15 percent of US physicians replaced transcription with modern voice-based clinical reporting methodology, about one half billion dollars could be saved. $6.7 billion could be saved annually if all medical reporting currently transcribed was handled with voice- and knowledge-enabled dictation and reporting systems.

  20. Using ZIP Code Business Patterns Data to Measure Alcohol Outlet Density

    PubMed Central

    Matthews, Stephen A.; McCarthy, John D.; Rafail, Patrick S.

    2014-01-01

    Some states maintain high-quality alcohol outlet databases but quality varies by state, making comprehensive comparative analysis across US communities difficult. This study assesses the adequacy of using ZIP Code Business Patterns (ZIP-BP) data on establishments as estimates of the number of alcohol outlets by ZIP code. Specifically we compare ZIP-BP alcohol outlet counts with high-quality data from state and local records surrounding 44 college campus communities across 10 states plus the District of Columbia. Results show that a composite measure is strongly correlated (R=0.89) with counts of alcohol outlets generated from official state records. Analyses based on Generalized Estimation Equation models show that community and contextual factors have little impact on the concordance between the two data sources. There are also minimal inter-state differences in the level of agreement. To validate the use of a convenient secondary data set (ZIP-BP) it is important to have a high correlation with the more complex, high quality and more costly data product (i.e., datasets based on the acquisition and geocoding of state and local records) and then to clearly demonstrate that the discrepancy between the two is unrelated to relevant explanatory variables. Thus our overall findings support the adequacy of using a conveniently available data set (ZIP-BP data) to estimate alcohol outlet densities in ZIP code areas in future research. PMID:21411233

  1. Using ZIP code business patterns data to measure alcohol outlet density.

    PubMed

    Matthews, Stephen A; McCarthy, John D; Rafail, Patrick S

    2011-07-01

    Some states maintain high-quality alcohol outlet databases but quality varies by state, making comprehensive comparative analysis across US communities difficult. This study assesses the adequacy of using ZIP Code Business Patterns (ZIP-BP) data on establishments as estimates of the number of alcohol outlets by ZIP code. Specifically we compare ZIP-BP alcohol outlet counts with high-quality data from state and local records surrounding 44 college campus communities across 10 states plus the District of Columbia. Results show that a composite measure is strongly correlated (R=0.89) with counts of alcohol outlets generated from official state records. Analyses based on Generalized Estimation Equation models show that community and contextual factors have little impact on the concordance between the two data sources. There are also minimal inter-state differences in the level of agreement. To validate the use of a convenient secondary data set (ZIP-BP) it is important to have a high correlation with the more complex, high quality and more costly data product (i.e., datasets based on the acquisition and geocoding of state and local records) and then to clearly demonstrate that the discrepancy between the two is unrelated to relevant explanatory variables. Thus our overall findings support the adequacy of using a conveniently available data set (ZIP-BP data) to estimate alcohol outlet densities in ZIP code areas in future research. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Perceptual quality estimation of H.264/AVC videos using reduced-reference and no-reference models

    NASA Astrophysics Data System (ADS)

    Shahid, Muhammad; Pandremmenou, Katerina; Kondi, Lisimachos P.; Rossholm, Andreas; Lövström, Benny

    2016-09-01

    Reduced-reference (RR) and no-reference (NR) models for video quality estimation, using features that account for the impact of coding artifacts, spatio-temporal complexity, and packet losses, are proposed. The purpose of this study is to analyze a number of potentially quality-relevant features in order to select the most suitable set of features for building the desired models. The proposed sets of features have not been used in the literature and some of the features are used for the first time in this study. The features are employed by the least absolute shrinkage and selection operator (LASSO), which selects only the most influential of them toward perceptual quality. For comparison, we apply feature selection in the complete feature sets and ridge regression on the reduced sets. The models are validated using a database of H.264/AVC encoded videos that were subjectively assessed for quality in an ITU-T compliant laboratory. We infer that just two features selected by RR LASSO and two bitstream-based features selected by NR LASSO are able to estimate perceptual quality with high accuracy, higher than that of ridge, which uses more features. The comparisons with competing works and two full-reference metrics also verify the superiority of our models.
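
    The feature-selection step common to the RR and NR models can be sketched as an L1-penalized regression of MOS on candidate features, keeping the features with non-zero coefficients. The feature matrix and MOS values below are synthetic, so the selected indices are illustrative only and unrelated to the paper's actual feature set.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

# Sketch of LASSO-based feature selection for quality prediction: fit an L1-penalized
# regression of MOS on candidate features and keep those with non-zero coefficients.
rng = np.random.default_rng(1)
n_videos, n_features = 120, 30
X = rng.normal(size=(n_videos, n_features))
# Synthetic MOS driven by two of the candidate features plus noise.
mos = 3.5 + 0.8 * X[:, 0] - 0.6 * X[:, 4] + 0.1 * rng.normal(size=n_videos)

Xs = StandardScaler().fit_transform(X)
lasso = LassoCV(cv=5, random_state=0).fit(Xs, mos)
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)
print("selected feature indices:", selected)
print("cross-validated alpha:", round(lasso.alpha_, 4))
```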

  3. Recreational Stream Crossing Effects on Sediment Delivery and Macroinvertebrates in Southwestern Virginia, USA

    NASA Astrophysics Data System (ADS)

    Kidd, Kathryn R.; Aust, W. Michael; Copenheaver, Carolyn A.

    2014-09-01

    Trail-based recreation has increased over recent decades, raising the environmental management issue of soil erosion that originates from unsurfaced, recreational trail systems. Trail-based soil erosion that occurs near stream crossings represents a non-point source of pollution to streams. We modeled soil erosion rates along multiple-use (hiking, mountain biking, and horseback riding) recreational trails that approach culvert and ford stream crossings as potential sources of sediment input and evaluated whether recreational stream crossings were impacting water quality based on downstream changes in macroinvertebrate-based indices within the Poverty Creek Trail System of the George Washington and Jefferson National Forest in southwestern Virginia, USA. We found modeled soil erosion rates for non-motorized recreational approaches that were lower than published estimates for an off-road vehicle approach, bare horse trails, and bare forest operational skid trail and road approaches, but were 13 times greater than estimated rates for undisturbed forests and 2.4 times greater than a 2-year old clearcut in this region. Estimated soil erosion rates were similar to rates for skid trails and horse trails where best management practices (BMPs) had been implemented. Downstream changes in macroinvertebrate-based indices indicated water quality was lower downstream from crossings than in upstream reference reaches. Our modeled soil erosion rates illustrate recreational stream crossing approaches have the potential to deliver sediment into adjacent streams, particularly where BMPs are not being implemented or where approaches are not properly managed, and as a result can negatively impact water quality below stream crossings.

  4. Winter wheat quality monitoring and forecasting system based on remote sensing and environmental factors

    NASA Astrophysics Data System (ADS)

    Haiyang, Yu; Yanmei, Liu; Guijun, Yang; Xiaodong, Yang; Dong, Ren; Chenwei, Nie

    2014-03-01

    To achieve dynamic winter wheat quality monitoring and forecasting over larger regions, the objective of this study was to design and develop a winter wheat quality monitoring and forecasting system using a remote sensing index and environmental factors. The winter wheat quality trend was forecasted before the harvest and the quality was monitored after the harvest. The traditional quality-vegetation index remote sensing monitoring and forecasting models were improved. Combined with latitude information, the vegetation index was used to estimate agronomic parameters related to winter wheat quality at early stages in order to forecast the quality trend. A combination of rainfall in May, temperature in May, illumination in late May, soil available nitrogen content and other environmental factors established the quality monitoring model. Compared with a simple quality-vegetation index, the remote sensing monitoring and forecasting model used in this system achieves greatly improved accuracy. Winter wheat quality was monitored and forecasted based on the above models, and the system was implemented using WebGIS technology. Finally, in 2010 the operation of the winter wheat quality monitoring system was demonstrated in Beijing, and the monitoring and forecasting results were output as thematic maps.

  5. Partial Deconvolution with Inaccurate Blur Kernel.

    PubMed

    Ren, Dongwei; Zuo, Wangmeng; Zhang, David; Xu, Jun; Zhang, Lei

    2017-10-17

    Most non-blind deconvolution methods are developed under the error-free kernel assumption, and are not robust to inaccurate blur kernel. Unfortunately, despite the great progress in blind deconvolution, estimation error remains inevitable during blur kernel estimation. Consequently, severe artifacts such as ringing effects and distortions are likely to be introduced in the non-blind deconvolution stage. In this paper, we tackle this issue by suggesting: (i) a partial map in the Fourier domain for modeling kernel estimation error, and (ii) a partial deconvolution model for robust deblurring with inaccurate blur kernel. The partial map is constructed by detecting the reliable Fourier entries of estimated blur kernel. And partial deconvolution is applied to wavelet-based and learning-based models to suppress the adverse effect of kernel estimation error. Furthermore, an E-M algorithm is developed for estimating the partial map and recovering the latent sharp image alternatively. Experimental results show that our partial deconvolution model is effective in relieving artifacts caused by inaccurate blur kernel, and can achieve favorable deblurring quality on synthetic and real blurry images.

  6. A user-oriented and computerized model for estimating vehicle ride quality

    NASA Technical Reports Server (NTRS)

    Leatherwood, J. D.; Barker, L. M.

    1984-01-01

    A simplified empirical model and computer program for estimating passenger ride comfort within air and surface transportation systems are described. The model is based on subjective ratings from more than 3000 persons who were exposed to controlled combinations of noise and vibration in the passenger ride quality apparatus. This model has the capability of transforming individual elements of a vehicle's noise and vibration environment into subjective discomfort units and then combining the subjective units to produce a single discomfort index typifying passenger acceptance of the environment. The computational procedures required to obtain discomfort estimates are discussed, and a user oriented ride comfort computer program is described. Examples illustrating application of the simplified model to helicopter and automobile ride environments are presented.

  7. Spatially explicit measures of production of young alewives in Lake Michigan: Linkage between essential fish habitat and recruitment

    USGS Publications Warehouse

    Hook, Tomas O.; Rutherford, Edward S.; Brines, Shannon J.; Mason, Doran M.; Schwab, David J.; McCormick, Michael; Desorcie, Timothy J.

    2003-01-01

    The identification and protection of essential habitats for early life stages of fishes are necessary to sustain fish stocks. Essential fish habitat for early life stages may be defined as areas where fish densities, growth, survival, or production rates are relatively high. To identify critical habitats for young-of-year (YOY) alewives (Alosa pseudoharengus) in Lake Michigan, we integrated bioenergetics models with GIS (Geographic Information Systems) to generate spatially explicit estimates of potential population production (an index of habitat quality). These estimates were based upon YOY alewife bioenergetic growth rate potential and their salmonine predators’ consumptive demand. We compared estimates of potential population production to YOY alewife yield (an index of habitat importance). Our analysis suggested that during 1994–1995, YOY alewife habitat quality and yield varied widely throughout Lake Michigan. Spatial patterns of alewife yield were not significantly correlated to habitat quality. Various mechanisms (e.g., predator migrations, lake circulation patterns, alternative strategies) may preclude YOY alewives from concentrating in areas of high habitat quality in Lake Michigan.

  8. Artificial arterial blood pressure artifact models and an evaluation of a robust blood pressure and heart rate estimator

    PubMed Central

    Li, Qiao; Mark, Roger G; Clifford, Gari D

    2009-01-01

    Background Within the intensive care unit (ICU), arterial blood pressure (ABP) is typically recorded at different (and sometimes uneven) sampling frequencies, and from different sensors, and is often corrupted by different artifacts and noise which are often non-Gaussian, nonlinear and nonstationary. Extracting robust parameters from such signals, and providing confidences in the estimates is therefore difficult and requires an adaptive filtering approach which accounts for artifact types. Methods Using a large ICU database, and over 6000 hours of simultaneously acquired electrocardiogram (ECG) and ABP waveforms sampled at 125 Hz from a 437 patient subset, we documented six general types of ABP artifact. We describe a new ABP signal quality index (SQI), based upon the combination of two previously reported signal quality measures weighted together. One index measures morphological normality, and the other degradation due to noise. After extracting a 6084-hour subset of clean data using our SQI, we evaluated a new robust tracking algorithm for estimating blood pressure and heart rate (HR) based upon a Kalman Filter (KF) with an update sequence modified by the KF innovation sequence and the value of the SQI. In order to do this, we have created six novel models of different categories of artifacts that we have identified in our ABP waveform data. These artifact models were then injected into clean ABP waveforms in a controlled manner. Clinical blood pressure (systolic, mean and diastolic) estimates were then made from the ABP waveforms for both clean and corrupted data. The mean absolute error for systolic, mean and diastolic blood pressure was then calculated for different levels of artifact pollution to provide estimates of expected errors given a single value of the SQI. Results Our artifact models demonstrate that artifact types have differing effects on systolic, diastolic and mean ABP estimates. We show that, for most artifact types, diastolic ABP estimates are less noise-sensitive than mean ABP estimates, which in turn are more robust than systolic ABP estimates. We also show that our SQI can provide error bounds for both HR and ABP estimates. Conclusion The KF/SQI-fusion method described in this article was shown to provide an accurate estimate of blood pressure and HR derived from the ABP waveform even in the presence of high levels of persistent noise and artifact, and during extreme bradycardia and tachycardia. Differences in error between artifact types, measurement sensors and the quality of the source signal can be factored into physiological estimation using an unbiased adaptive filter, signal innovation and signal quality measures. PMID:19586547
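
    A much-simplified sketch of the KF/SQI-fusion idea is given below: a scalar Kalman filter tracks the beat-to-beat estimate, and the measurement noise is inflated when the signal quality index is low so that corrupted beats barely move the estimate. The state model, the SQI-to-noise mapping, and all numbers are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def sqi_weighted_kalman(measurements, sqi, q=0.1, r=4.0):
    """Scalar Kalman filter tracking a blood-pressure (or heart-rate) series, with
    the measurement noise inflated when the signal quality index (SQI, in [0, 1])
    is low so that poor-quality beats barely move the estimate. Simplified sketch."""
    x, p = measurements[0], 1.0
    estimates = []
    for z, s in zip(measurements, sqi):
        p = p + q                                  # predict (random-walk state model)
        r_eff = r / max(s, 1e-3)                   # low SQI -> large effective noise
        k = p / (p + r_eff)                        # Kalman gain
        x = x + k * (z - x)                        # update with the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Systolic pressures with one artifact-corrupted beat flagged by a low SQI.
sbp = [118, 120, 119, 190, 121, 120]
sqi = [0.95, 0.95, 0.90, 0.05, 0.95, 0.95]
print(np.round(sqi_weighted_kalman(sbp, sqi), 1))
```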

  9. Grip Strength as an Indicator of Health-Related Quality of Life in Old Age-A Pilot Study.

    PubMed

    Musalek, Christina; Kirchengast, Sylvia

    2017-11-24

    Over the last century, life expectancy has increased dramatically nearly all over the world. This dramatic absolute and relative increase in the old-aged component of the population has influenced not only population structure but also has dramatic implications for individuals and public health services. The aim of the present pilot study was to examine the impact of physical well-being, assessed by hand grip strength, and social factors, estimated by social contact frequency, on health-related quality of life among 22 men and 41 women ranging in age from 60 to 94 years. Physical well-being was estimated by hand grip strength; data concerning subjective well-being and health-related quality of life were collected through personal interviews based on the WHOQOL-BREF questionnaires. The number of offspring and intergenerational contacts were not related significantly to health-related quality of life, while social contacts with non-relatives and hand grip strength, in contrast, had a significant positive impact on health-related quality of life among old-aged men and women. Physical well-being and in particular muscle strength, estimated by grip strength, may increase health-related quality of life and is therefore an important resource for well-being during old age. Grip strength may be used as an indicator of health-related quality of life.

  10. A no-reference bitstream-based perceptual model for video quality estimation of videos affected by coding artifacts and packet losses

    NASA Astrophysics Data System (ADS)

    Pandremmenou, K.; Shahid, M.; Kondi, L. P.; Lövström, B.

    2015-03-01

    In this work, we propose a No-Reference (NR) bitstream-based model for predicting the quality of H.264/AVC video sequences affected by both compression artifacts and transmission impairments. The proposed model is based on a feature extraction procedure, where a large number of features are calculated from the packet-loss-impaired bitstream. Many of the features are proposed for the first time in this work, and this specific set of features as a whole is applied for the first time to making NR video quality predictions. All feature observations are taken as input to the Least Absolute Shrinkage and Selection Operator (LASSO) regression method. LASSO indicates the most important features, and using only them, it is possible to estimate the Mean Opinion Score (MOS) with high accuracy. Indicatively, we point out that only 13 features are able to produce a Pearson Correlation Coefficient of 0.92 with the MOS. Interestingly, the performance statistics we computed in order to assess our method for predicting the Structural Similarity Index and the Video Quality Metric are equally good. Thus, the obtained experimental results verify the suitability of the features selected by LASSO as well as the ability of LASSO to make accurate predictions through sparse modeling.

  11. Noise Estimation and Quality Assessment of Gaussian Noise Corrupted Images

    NASA Astrophysics Data System (ADS)

    Kamble, V. M.; Bhurchandi, K.

    2018-03-01

    Evaluating the exact quantity of noise present in an image, and the quality of an image in the absence of a reference image, is a challenging task. We propose a near-perfect noise estimation method and a no-reference image quality assessment method for images corrupted by Gaussian noise. The proposed methods obtain an initial estimate of the noise standard deviation present in an image using the median of the wavelet transform coefficients and then obtain a near-exact estimate using curve fitting. The proposed noise estimation method provides the noise estimate within an average error of ±4%. For quality assessment, this noise estimate is mapped to fit the Differential Mean Opinion Score (DMOS) using a nonlinear function. The proposed methods require minimal training and yield the noise estimate and image quality score. Images from the Laboratory for Image and Video Processing (LIVE) database and the Computational Perception and Image Quality (CSIQ) database are used for validation of the proposed quality assessment method. Experimental results show that the performance of the proposed quality assessment method is on par with existing no-reference image quality assessment metrics for Gaussian noise corrupted images.
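
    The initial estimate described above, based on the median of wavelet detail coefficients, is closely related to the classical MAD estimator sketched below (sigma ≈ median(|HH|)/0.6745 over the finest diagonal subband). The wavelet choice and the synthetic test image are assumptions; the paper's subsequent curve-fitting refinement is not reproduced here.

```python
import numpy as np
import pywt

def estimate_noise_sigma(image):
    """Initial noise estimate from the median of the finest diagonal wavelet detail
    coefficients (classical MAD estimator, sigma ~ median(|HH|)/0.6745)."""
    _, (_, _, hh) = pywt.dwt2(np.asarray(image, dtype=float), 'db1')
    return np.median(np.abs(hh)) / 0.6745

# Sanity check on a synthetic image with known Gaussian noise.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 256), (256, 1))
noisy = clean + rng.normal(0, 12.0, clean.shape)
print(round(estimate_noise_sigma(noisy), 2))   # should be close to 12
```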

  12. Global assessment of exposure to faecal contamination through drinking water based on a systematic review.

    PubMed

    Bain, Robert; Cronk, Ryan; Hossain, Rifat; Bonjour, Sophie; Onda, Kyle; Wright, Jim; Yang, Hong; Slaymaker, Tom; Hunter, Paul; Prüss-Ustün, Annette; Bartram, Jamie

    2014-08-01

    To estimate exposure to faecal contamination through drinking water as indicated by levels of Escherichia coli (E. coli) or thermotolerant coliform (TTC) in water sources. We estimated coverage of different types of drinking water source based on household surveys and censuses using multilevel modelling. Coverage data were combined with water quality studies that assessed E. coli or TTC including those identified by a systematic review (n = 345). Predictive models for the presence and level of contamination of drinking water sources were developed using random effects logistic regression and selected covariates. We assessed sensitivity of estimated exposure to study quality, indicator bacteria and separately considered nationally randomised surveys. We estimate that 1.8 billion people globally use a source of drinking water which suffers from faecal contamination; of these, 1.1 billion drink water that is of at least 'moderate' risk (>10 E. coli or TTC per 100 ml). Data from nationally randomised studies suggest that 10% of improved sources may be 'high' risk, containing at least 100 E. coli or TTC per 100 ml. Drinking water is found to be more often contaminated in rural areas (41%, CI: 31%-51%) than in urban areas (12%, CI: 8-18%), and contamination is most prevalent in Africa (53%, CI: 42%-63%) and South-East Asia (35%, CI: 24%-45%). Estimates were not sensitive to the exclusion of low quality studies or restriction to studies reporting E. coli. Microbial contamination is widespread and affects all water source types, including piped supplies. Global burden of disease estimates may have substantially understated the disease burden associated with inadequate water services. © 2014 The Authors. Tropical Medicine and International Health published by John Wiley & Sons Ltd.

  13. Global assessment of exposure to faecal contamination through drinking water based on a systematic review

    PubMed Central

    Bain, Robert; Cronk, Ryan; Hossain, Rifat; Bonjour, Sophie; Onda, Kyle; Wright, Jim; Yang, Hong; Slaymaker, Tom; Hunter, Paul; Prüss-Ustün, Annette; Bartram, Jamie

    2014-01-01

    Objectives To estimate exposure to faecal contamination through drinking water as indicated by levels of Escherichia coli (E. coli) or thermotolerant coliform (TTC) in water sources. Methods We estimated coverage of different types of drinking water source based on household surveys and censuses using multilevel modelling. Coverage data were combined with water quality studies that assessed E. coli or TTC including those identified by a systematic review (n = 345). Predictive models for the presence and level of contamination of drinking water sources were developed using random effects logistic regression and selected covariates. We assessed sensitivity of estimated exposure to study quality, indicator bacteria and separately considered nationally randomised surveys. Results We estimate that 1.8 billion people globally use a source of drinking water which suffers from faecal contamination, of these 1.1 billion drink water that is of at least ‘moderate’ risk (>10 E. coli or TTC per 100 ml). Data from nationally randomised studies suggest that 10% of improved sources may be ‘high’ risk, containing at least 100 E. coli or TTC per 100 ml. Drinking water is found to be more often contaminated in rural areas (41%, CI: 31%–51%) than in urban areas (12%, CI: 8–18%), and contamination is most prevalent in Africa (53%, CI: 42%–63%) and South-East Asia (35%, CI: 24%–45%). Estimates were not sensitive to the exclusion of low quality studies or restriction to studies reporting E. coli. Conclusions Microbial contamination is widespread and affects all water source types, including piped supplies. Global burden of disease estimates may have substantially understated the disease burden associated with inadequate water services. PMID:24811893

  14. An improved global wind resource estimate for integrated assessment models

    DOE PAGES

    Eurek, Kelly; Sullivan, Patrick; Gleason, Michael; ...

    2017-11-25

    This study summarizes initial steps to improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind and distance to shore and water depth for offshore. We estimate a total global wind generation potential of 560 PWh for terrestrial wind, with 90% of the resource classified as low-to-mid quality, and 315 PWh for offshore wind, with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km² deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.
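
    The sketch below illustrates, under stated assumptions, how a net capacity factor of the kind used to classify resource quality can be computed from a site's wind-speed distribution and a turbine power curve. The Weibull parameters, the cut-in/rated/cut-out speeds, and the cubic ramp are illustrative choices, not the study's actual turbine model; the availability and array-efficiency factors are the values quoted in the abstract.

      import numpy as np

      def capacity_factor(k, c, rated_kw=3500.0, availability=0.95, array_eff=0.90):
          """Net capacity factor from a Weibull(k, c) wind-speed distribution and a toy power curve."""
          v = np.linspace(0.0, 30.0, 601)                       # wind speeds [m/s]
          pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)

          # Simplified power curve: cut-in 3 m/s, rated 12 m/s, cut-out 25 m/s.
          power = np.where(v < 3, 0.0,
                   np.where(v < 12, rated_kw * ((v - 3) / 9) ** 3,
                    np.where(v <= 25, rated_kw, 0.0)))

          mean_power = np.trapz(power * pdf, v)                 # expected output [kW]
          return availability * array_eff * mean_power / rated_kw

      print(f"net capacity factor: {capacity_factor(k=2.0, c=8.5):.2f}")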

  15. An improved global wind resource estimate for integrated assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eurek, Kelly; Sullivan, Patrick; Gleason, Michael

    This study summarizes initial steps to improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind and distance to shore and water depth for offshore. We estimate a total global wind generation potential of 560 PWh for terrestrial wind, with 90% of the resource classified as low-to-mid quality, and 315 PWh for offshore wind, with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km² deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.

  16. dPIRPLE: a joint estimation framework for deformable registration and penalized-likelihood CT image reconstruction using prior images

    NASA Astrophysics Data System (ADS)

    Dang, H.; Wang, A. S.; Sussman, Marc S.; Siewerdsen, J. H.; Stayman, J. W.

    2014-09-01

    Sequential imaging studies are conducted in many clinical scenarios. Prior images from previous studies contain a great deal of patient-specific anatomical information and can be used in conjunction with subsequent imaging acquisitions to maintain image quality while enabling radiation dose reduction (e.g., through sparse angular sampling, reduction in fluence, etc). However, patient motion between images in such sequences results in misregistration between the prior image and current anatomy. Existing prior-image-based approaches often include only a simple rigid registration step that can be insufficient for capturing complex anatomical motion, introducing detrimental effects in subsequent image reconstruction. In this work, we propose a joint framework that estimates the 3D deformation between an unregistered prior image and the current anatomy (based on a subsequent data acquisition) and reconstructs the current anatomical image using a model-based reconstruction approach that includes regularization based on the deformed prior image. This framework is referred to as deformable prior image registration, penalized-likelihood estimation (dPIRPLE). Central to this framework is the inclusion of a 3D B-spline-based free-form-deformation model into the joint registration-reconstruction objective function. The proposed framework is solved using a maximization strategy whereby alternating updates to the registration parameters and image estimates are applied allowing for improvements in both the registration and reconstruction throughout the optimization process. Cadaver experiments were conducted on a cone-beam CT testbench emulating a lung nodule surveillance scenario. Superior reconstruction accuracy and image quality were demonstrated using the dPIRPLE algorithm as compared to more traditional reconstruction methods including filtered backprojection, penalized-likelihood estimation (PLE), prior image penalized-likelihood estimation (PIPLE) without registration, and prior image penalized-likelihood estimation with rigid registration of a prior image (PIRPLE) over a wide range of sampling sparsity and exposure levels.

  17. A preliminary test of estimating forest site quality using species composition in a southern Appalachian watershed

    Treesearch

    W. Henry McNab; David L. Loftis

    2013-01-01

    Characteristic arborescent communities of mesophytic or xerophytic species have long been recognized as indicative of forest site quality in the Southern Appalachians, where soil moisture availability is the primary environmental variable affecting productivity. But, a workable quantitative system of site classification based on species composition is not available. We...

  18. INTEGRATION OF SATELLITE, MODELED, AND GROUND BASED AEROSOL DATA FOR USE IN AIR QUALITY AND PUBLIC HEALTH APPLICATIONS ( AGU-BALTIMORE )

    EPA Science Inventory

    Within the next several years NOAA and EPA will begin to issue PM2.5 air quality forecasts over the entire domain of the eastern United States, eventually extending to national coverage. These forecasts will provide continuous estimated values of particulate matter on ...

  19. Improving coarse woody debris measurements: a taper-based technique

    Treesearch

    Christopher W. Woodall; James A. Westfall

    2007-01-01

    Coarse woody debris (CWD) consists of dead and downed trees above a certain minimum size and is an important forest ecosystem component (e.g., wildlife habitat, carbon stocks, and fuels). Accurately measuring the dimensions of CWD is important for ensuring the quality of CWD estimates and hence for accurately assessing forest ecosystem attributes. To improve the quality of CWD...

  20. Hybrid Air Quality Modeling Approach For Use in the Near ...

    EPA Pesticide Factsheets

    The Near-road EXposures to Urban air pollutant Study (NEXUS) investigated whether children with asthma living in close proximity to major roadways in Detroit, MI (particularly near roadways with high diesel traffic) have greater health impacts associated with exposure to air pollutants than those living farther away. A major challenge in such health and exposure studies is the lack of information regarding pollutant exposure characterization. Air quality modeling can provide spatially and temporally varying exposure estimates for examining relationships between traffic-related air pollutants and adverse health outcomes. This paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatially and temporally varying exposure estimates and identification of the mobile source contribution to the total pollutant exposure. Model-based exposure metrics, associated with local variations of emissions and meteorology, were estimated using a combination of the AERMOD and R-LINE dispersion models, local emission source information from the National Emissions Inventory, detailed road network locations and traffic activity, and meteorological data from the Detroit City Airport. The regional background contribution was estimated using a combination of the Community Multiscale Air Quality (CMAQ) model and the Space/Time Ordinary Kriging (STOK) model. To capture the near-road pollutant gradients, refined “mini-grids” of model recep

  1. Study on a pattern classification method of soil quality based on simplified learning sample dataset

    USGS Publications Warehouse

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.

    2011-01-01

    Based on the massive amount of soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grade that relies on classical sampling techniques and an unordered multinomial logistic regression model. As a case study, the learning sample size required for a given confidence level and estimation accuracy was determined, and the c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated-soil quality grade evaluation database of the study area, Longchuan County in Guangdong Province; an unordered logistic classifier model was then built and the calculation and analysis steps of intelligent soil quality grade classification were given. The results indicate that the soil quality grade can be effectively learned and predicted from the extracted simplified dataset using this method, which changes the traditional approach to soil quality grade evaluation. © 2011 IEEE.
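
    A minimal sketch of the two-stage idea (compress the learning set with a clustering step, then fit a multinomial logistic classifier) is given below. It substitutes scikit-learn's k-means for the paper's c-means algorithm, and the soil indicators, grades, and cluster count are hypothetical placeholders.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 6))                                    # hypothetical soil indicators
      y = np.digitize(X[:, 0] + 0.5 * X[:, 1], [-1.0, 0.0, 1.0]) + 1    # hypothetical grades 1-4

      # Simplify the learning set: keep one representative sample per cluster centre
      # (k-means here stands in for the c-means algorithm used in the paper).
      km = KMeans(n_clusters=300, n_init=10, random_state=0).fit(X)
      reps = np.array([np.argmin(np.linalg.norm(X - c, axis=1)) for c in km.cluster_centers_])

      clf = LogisticRegression(max_iter=1000).fit(X[reps], y[reps])     # multinomial softmax classifier
      print(f"grade prediction accuracy on the full set: {clf.score(X, y):.2f}")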

  2. Water Quality Conditions in the Missouri River Mainstem System. 2010 Report

    DTIC Science & Technology

    2011-11-01

    [Report excerpt: table-of-contents fragments only, listing plates of estimated biomass, number of species, and percent composition (based on biomass) by taxonomic grouping for zooplankton tow samples collected at Missouri River mainstem sites, including Lake Oahe site OAHLK1073A and site GARLK1512DW, over the period 2006 through 2010.]

  3. Determining the accuracy of maximum likelihood parameter estimates with colored residuals

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Klein, Vladislav

    1994-01-01

    An important part of building high fidelity mathematical models based on measured data is calculating the accuracy associated with statistical estimates of the model parameters. Indeed, without some idea of the accuracy of parameter estimates, the estimates themselves have limited value. In this work, an expression based on theoretical analysis was developed to properly compute parameter accuracy measures for maximum likelihood estimates with colored residuals. This result is important because experience from the analysis of measured data reveals that the residuals from maximum likelihood estimation are almost always colored. The calculations involved can be appended to conventional maximum likelihood estimation algorithms. Simulated data runs were used to show that the parameter accuracy measures computed with this technique accurately reflect the quality of the parameter estimates from maximum likelihood estimation without the need for analysis of the output residuals in the frequency domain or heuristically determined multiplication factors. The result is general, although the application studied here is maximum likelihood estimation of aerodynamic model parameters from flight test data.

  4. Input-output model for MACCS nuclear accident impacts estimation¹

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
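
    The core of an Input-Output GDP-loss calculation is the Leontief inverse. The sketch below shows that arithmetic for a toy three-sector economy; the coefficient matrix, final demand, and the assumed 30% sectoral disruption are invented for illustration and do not correspond to REAcct's actual regional data.

      import numpy as np

      # Illustrative 3-sector technical-coefficient matrix A (inputs per unit of output).
      A = np.array([[0.10, 0.20, 0.05],
                    [0.15, 0.10, 0.10],
                    [0.05, 0.05, 0.15]])
      final_demand = np.array([100.0, 80.0, 60.0])          # baseline final demand

      L = np.linalg.inv(np.eye(3) - A)                      # Leontief inverse
      x_base = L @ final_demand                             # baseline gross output

      # Suppose an accident removes 30% of sector 0's final demand for one year.
      shocked_demand = final_demand * np.array([0.70, 1.0, 1.0])
      x_shock = L @ shocked_demand

      value_added_ratio = 1.0 - A.sum(axis=0)               # GDP share of each sector's gross output
      gdp_loss = value_added_ratio @ (x_base - x_shock)
      print(f"estimated GDP loss: {gdp_loss:.1f} (same units as final demand)")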

  5. Quantifying rainfall-derived inflow and infiltration in sanitary sewer systems based on conductivity monitoring

    NASA Astrophysics Data System (ADS)

    Zhang, Mingkai; Liu, Yanchen; Cheng, Xun; Zhu, David Z.; Shi, Hanchang; Yuan, Zhiguo

    2018-03-01

    Quantifying rainfall-derived inflow and infiltration (RDII) in a sanitary sewer is difficult when RDII and overflow occur simultaneously. This study proposes a novel conductivity-based method for estimating RDII. The method separately decomposes rainfall-derived inflow (RDI) and rainfall-induced infiltration (RII) on the basis of conductivity data. Fast Fourier transform was adopted to analyze variations in the flow and water quality during dry weather. Nonlinear curve fitting based on the least squares algorithm was used to optimize parameters in the proposed RDII model. The method was successfully applied to real-life case studies, in which inflow and infiltration were successfully estimated for three typical rainfall events with total rainfall volumes of 6.25 mm (light), 28.15 mm (medium), and 178 mm (heavy). Uncertainties of model parameters were estimated using the generalized likelihood uncertainty estimation (GLUE) method and were found to be acceptable. Compared with traditional flow-based methods, the proposed approach exhibits distinct advantages in estimating RDII and overflow, particularly when the two processes happen simultaneously.

  6. Estimating Escherichia coli loads in streams based on various physical, chemical, and biological factors

    PubMed Central

    Dwivedi, Dipankar; Mohanty, Binayak P.; Lesikar, Bruce J.

    2013-01-01

    Microbes have been identified as a major contaminant of water resources. Escherichia coli (E. coli) is a commonly used indicator organism. It is well recognized that the fate of E. coli in surface water systems is governed by multiple physical, chemical, and biological factors. The aim of this work is to provide insight into the physical, chemical, and biological factors, along with their interactions, that are critical in the estimation of E. coli loads in surface streams. There are various models to predict E. coli loads in streams, but they tend to be system or site specific or overly complex without enhancing our understanding of these factors. Hence, based on available data, a Bayesian Neural Network (BNN) is presented for estimating E. coli loads based on physical, chemical, and biological factors in streams. The BNN has the dual advantage of overcoming the absence of quality data (with regard to consistency in data) and of determining mechanistic model parameters by employing a probabilistic framework. This study evaluates whether the BNN model can be an effective alternative tool to mechanistic models for E. coli load estimation in streams. For this purpose, a comparison with a traditional model (LOADEST, USGS) is conducted. The models are compared for estimated E. coli loads based on available water quality data in Plum Creek, Texas. All the model efficiency measures suggest that overall E. coli load estimations by the BNN model are better than those by the LOADEST model on all three occasions (three-fold cross validation). Thirteen factors were considered for estimating E. coli loads; an exhaustive feature selection technique indicated that six of the thirteen factors are important. Physical factors included temperature and dissolved oxygen; chemical factors included phosphate and ammonia; biological factors included suspended solids and chlorophyll. The results highlight that the LOADEST model estimates E. coli loads better in the smaller ranges, whereas the BNN model estimates E. coli loads better in the higher ranges. Hence, the BNN model can be used to design targeted monitoring programs and implement regulatory standards through TMDL programs. PMID:24511166

  7. Concept for estimating mitochondrial DNA haplogroups using a maximum likelihood approach (EMMA)☆

    PubMed Central

    Röck, Alexander W.; Dür, Arne; van Oven, Mannis; Parson, Walther

    2013-01-01

    The assignment of haplogroups to mitochondrial DNA haplotypes contributes substantial value for quality control, not only in forensic genetics but also in population and medical genetics. The availability of Phylotree, a widely accepted phylogenetic tree of human mitochondrial DNA lineages, led to the development of several (semi-)automated software solutions for haplogrouping. However, currently existing haplogrouping tools only make use of haplogroup-defining mutations, whereas private mutations (beyond the haplogroup level) can be additionally informative allowing for enhanced haplogroup assignment. This is especially relevant in the case of (partial) control region sequences, which are mainly used in forensics. The present study makes three major contributions toward a more reliable, semi-automated estimation of mitochondrial haplogroups. First, a quality-controlled database consisting of 14,990 full mtGenomes downloaded from GenBank was compiled. Together with Phylotree, these mtGenomes serve as a reference database for haplogroup estimates. Second, the concept of fluctuation rates, i.e. a maximum likelihood estimation of the stability of mutations based on 19,171 full control region haplotypes for which raw lane data is available, is presented. Finally, an algorithm for estimating the haplogroup of an mtDNA sequence based on the combined database of full mtGenomes and Phylotree, which also incorporates the empirically determined fluctuation rates, is brought forward. On the basis of examples from the literature and EMPOP, the algorithm is not only validated, but both the strength of this approach and its utility for quality control of mitochondrial haplotypes is also demonstrated. PMID:23948335

  8. Assessing quality of life in a clinical study on heart rehabilitation patients: how well do value sets based on given or experienced health states reflect patients' valuations?

    PubMed

    Leidl, Reiner; Schweikert, Bernd; Hahmann, Harry; Steinacker, Juergen M; Reitmeir, Peter

    2016-03-22

    Quality of life as an endpoint in a clinical study may be sensitive to the value set used to derive a single score. Focusing on patients' actual valuations in a clinical study, we compare different value sets for the EQ-5D-3L and assess how well they reproduce patients' reported results. A clinical study comparing inpatient (n = 98) and outpatient (n = 47) rehabilitation of patients after an acute coronary event is re-analyzed. Value sets include: 1. Given health states and time-trade-off valuation (GHS-TTO) rendering economic utilities; 2. Experienced health states and valuation by visual analog scale (EHS-VAS). Valuations are compared with patient-reported VAS rating. Accuracy is assessed by mean absolute error (MAE) and by Pearson's correlation ρ. External validity is tested by correlation with established MacNew global scores. Drivers of differences between value sets and VAS are analyzed using repeated measures regression. EHS-VAS had smaller MAEs and higher ρ in all patients and in the inpatient group, and correlated best with MacNew global score. Quality-adjusted survival was more accurately reflected by EHS-VAS. Younger, better educated patients reported lower VAS at admission than the EHS-based value set. EHS-based estimates were mostly able to reproduce patient-reported valuation. Economic utility measurement is conceptually different, produced results less strongly related to patients' reports, and resulted in about 20 % longer quality-adjusted survival. Decision makers should take into account the impact of choosing value sets on effectiveness results. For transferring the results of heart rehabilitation patients from another country or from another valuation method, the EHS-based value set offers a promising estimation option for those decision makers who prioritize patient-reported valuation. Yet, EHS-based estimates may not fully reflect patient-reported VAS in all situations.

  9. Comparison of Kasai Autocorrelation and Maximum Likelihood Estimators for Doppler Optical Coherence Tomography

    PubMed Central

    Chan, Aaron C.; Srinivasan, Vivek J.

    2013-01-01

    In optical coherence tomography (OCT) and ultrasound, unbiased Doppler frequency estimators with low variance are desirable for blood velocity estimation. Hardware improvements in OCT mean that ever higher acquisition rates are possible, which should also, in principle, improve estimation performance. Paradoxically, however, the widely used Kasai autocorrelation estimator’s performance worsens with increasing acquisition rate. We propose that parametric estimators based on accurate models of noise statistics can offer better performance. We derive a maximum likelihood estimator (MLE) based on a simple additive white Gaussian noise model, and show that it can outperform the Kasai autocorrelation estimator. In addition, we also derive the Cramer Rao lower bound (CRLB), and show that the variance of the MLE approaches the CRLB for moderate data lengths and noise levels. We note that the MLE performance improves with longer acquisition time, and remains constant or improves with higher acquisition rates. These qualities may make it a preferred technique as OCT imaging speed continues to improve. Finally, our work motivates the development of more general parametric estimators based on statistical models of decorrelation noise. PMID:23446044
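
    For reference, the two estimators being compared are easy to state in a few lines: the Kasai estimator takes the phase angle of the lag-one autocorrelation, while the maximum likelihood estimate for a single complex exponential in additive white Gaussian noise is the periodogram peak. The sketch below is a generic illustration with an invented Doppler frequency and noise level, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)
      T = 1e-5                                    # sampling interval [s]
      f_true = 2.0e3                              # true Doppler frequency [Hz]
      n = np.arange(64)
      z = np.exp(2j * np.pi * f_true * n * T) + 0.3 * (rng.normal(size=64) + 1j * rng.normal(size=64))

      # Kasai lag-one autocorrelation estimator.
      R1 = np.sum(z[1:] * np.conj(z[:-1]))
      f_kasai = np.angle(R1) / (2 * np.pi * T)

      # ML estimate for a single tone in white Gaussian noise: periodogram peak.
      freqs = np.fft.fftfreq(4096, d=T)
      spectrum = np.abs(np.fft.fft(z, 4096)) ** 2
      f_mle = freqs[np.argmax(spectrum)]

      print(f"Kasai: {f_kasai:.1f} Hz, periodogram ML: {f_mle:.1f} Hz")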

  10. Mechanistic Sediment Quality Guidelines Based on Contaminant Bioavailability: Equilibrium Partitioning Sediment Benchmarks

    EPA Science Inventory

    Globally, billions of metric tons of contaminated sediments are present in aquatic systems representing a potentially significant ecological risk. Estimated costs to manage (i.e., remediate and monitor) these sediments are in the billions of U.S. dollars. Biologically-based app...

  11. Three-dimensional holoscopic image coding scheme using high-efficiency video coding with kernel-based minimum mean-square-error estimation

    NASA Astrophysics Data System (ADS)

    Liu, Deyang; An, Ping; Ma, Ran; Yang, Chao; Shen, Liquan; Li, Kai

    2016-07-01

    Three-dimensional (3-D) holoscopic imaging, also known as integral imaging, light field imaging, or plenoptic imaging, can provide natural and fatigue-free 3-D visualization. However, a large amount of data is required to represent the 3-D holoscopic content. Therefore, efficient coding schemes for this particular type of image are needed. A 3-D holoscopic image coding scheme with kernel-based minimum mean square error (MMSE) estimation is proposed. In the proposed scheme, the coding block is predicted by an MMSE estimator under statistical modeling. In order to obtain the signal statistical behavior, kernel density estimation (KDE) is utilized to estimate the probability density function of the statistical modeling. As bandwidth estimation (BE) is a key issue in the KDE problem, we also propose a BE method based on kernel trick. The experimental results demonstrate that the proposed scheme can achieve a better rate-distortion performance and a better visual rendering quality.

  12. Why is quality estimation judgment fast? Comparison of gaze control strategies in quality and difference estimation tasks

    NASA Astrophysics Data System (ADS)

    Radun, Jenni; Leisti, Tuomas; Virtanen, Toni; Nyman, Göte; Häkkinen, Jukka

    2014-11-01

    To understand the viewing strategies employed in a quality estimation task, we compared two visual tasks: quality estimation and difference estimation. The estimation was done for a pair of natural images having small global changes in quality. Two groups of observers estimated the same set of images, but with different instructions. One group estimated the difference in quality and the other the difference between image pairs. The results demonstrated the use of different visual strategies in the tasks. The quality estimation was found to include more visual planning during the first fixation than the difference estimation, but afterward needed only a few long fixations on the semantically important areas of the image. The difference estimation used many short fixations. Salient image areas were mainly attended to when these areas were also semantically important. The results support the hypothesis that these tasks' general characteristics (evaluation time, number of fixations, area fixated on) show differences in processing, but also suggest that examining only single fixations when comparing tasks is too narrow a view. When planning a subjective experiment, one must remember that a small change in the instructions might lead to a noticeable change in viewing strategy.

  13. Evaluation of Simulation Models that Estimate the Effect of Dietary Strategies on Nutritional Intake: A Systematic Review.

    PubMed

    Grieger, Jessica A; Johnson, Brittany J; Wycherley, Thomas P; Golley, Rebecca K

    2017-05-01

    Background: Dietary simulation modeling can predict dietary strategies that may improve nutritional or health outcomes. Objectives: The study aims were to undertake a systematic review of simulation studies that model dietary strategies aiming to improve nutritional intake, body weight, and related chronic disease, and to assess the methodologic and reporting quality of these models. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guided the search strategy with studies located through electronic searches [Cochrane Library, Ovid (MEDLINE and Embase), EBSCOhost (CINAHL), and Scopus]. Study findings were described and dietary modeling methodology and reporting quality were critiqued by using a set of quality criteria adapted for dietary modeling from general modeling guidelines. Results: Forty-five studies were included and categorized as modeling moderation, substitution, reformulation, or promotion dietary strategies. Moderation and reformulation strategies targeted individual nutrients or foods to theoretically improve one particular nutrient or health outcome, estimating small to modest improvements. Substituting unhealthy foods with healthier choices was estimated to be effective across a range of nutrients, including an estimated reduction in intake of saturated fatty acids, sodium, and added sugar. Promotion of fruits and vegetables predicted marginal changes in intake. Overall, the quality of the studies was moderate to high, with certain features of the quality criteria consistently reported. Conclusions: Based on the results of reviewed simulation dietary modeling studies, targeting a variety of foods rather than individual foods or nutrients theoretically appears most effective in estimating improvements in nutritional intake, particularly reducing intake of nutrients commonly consumed in excess. A combination of strategies could theoretically be used to deliver the best improvement in outcomes. Study quality was moderate to high. However, given the lack of dietary simulation reporting guidelines, future work could refine the quality tool to harmonize consistency in the reporting of subsequent dietary modeling studies. © 2017 American Society for Nutrition.

  14. Calibration and Limitations of the Mg II Line-based Black Hole Masses

    NASA Astrophysics Data System (ADS)

    Woo, Jong-Hak; Le, Huynh Anh N.; Karouzos, Marios; Park, Dawoo; Park, Daeseong; Malkan, Matthew A.; Treu, Tommaso; Bennert, Vardha N.

    2018-06-01

    We present single-epoch black hole mass (M_BH) calibrations based on the rest-frame ultraviolet (UV) and optical measurements of the Mg II 2798 Å and Hβ 4861 Å lines and the active galactic nucleus (AGN) continuum, using a sample of 52 moderate-luminosity AGNs at z ∼ 0.4 and z ∼ 0.6 with high-quality Keck spectra. We combine this sample with a large number of luminous AGNs from the Sloan Digital Sky Survey to increase the dynamic range for a better comparison of UV and optical velocity and luminosity measurements. With respect to the reference M_BH based on the line dispersion of Hβ and the continuum luminosity at 5100 Å, we calibrate the UV and optical mass estimators by determining the best-fit values of the coefficients in the mass equation. By investigating whether the UV estimators show a systematic trend with Eddington ratio, FWHM of Hβ, Fe II strength, or UV/optical slope, we find no significant bias except for the slope. By fitting the systematic difference between Mg II-based and Hβ-based masses with the L_3000/L_5100 ratio, we provide a correction term as a function of the spectral index, ΔC = 0.24(1 + α_λ) + 0.17, which can be added to the Mg II-based mass estimators if the spectral slope can be well determined. The derived UV mass estimators typically show ≳0.2 dex intrinsic scatter with respect to the Hβ-based M_BH, suggesting that the UV-based mass has an additional uncertainty of ∼0.2 dex, even if high-quality rest-frame UV spectra are available.

  15. Real-time video quality monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Narvekar, Niranjan; Wang, Beibei; Ding, Ran; Zou, Dekun; Cash, Glenn; Bhagavathy, Sitaram; Bloom, Jeffrey

    2011-12-01

    The ITU-T Recommendation G.1070 is a standardized opinion model for video telephony applications that uses video bitrate, frame rate, and packet-loss rate to measure video quality. However, this model was originally designed as an offline quality planning tool. It cannot be directly used for quality monitoring, since the above three input parameters are not readily available within a network or at the decoder, and there is considerable room for improving the performance of this quality metric. In this article, we present a real-time video quality monitoring solution based on this Recommendation. We first propose a scheme to efficiently estimate the three parameters from video bitstreams, so that the model can be used as a real-time video quality monitoring tool. Furthermore, an enhanced algorithm based on the G.1070 model that provides more accurate quality prediction is proposed. Finally, to use this metric in real-world applications, we present an emerging application of real-time quality measurement to the management of transmitted videos, especially those delivered to mobile devices.
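
    A minimal sketch of a G.1070-flavoured parametric mapping from the three inputs (bitrate, frame rate, packet-loss rate) to a 1-5 quality score is shown below. The functional form is only in the spirit of the Recommendation and the coefficients are invented for illustration; the standard defines its own coefficient tables per codec, video format, and display size.

      import math

      def video_quality(bitrate_kbps, frame_rate, packet_loss_pct,
                        a=3.8, b=600.0, c=2.0, d=0.12):
          """Illustrative G.1070-flavoured mapping of three parameters to a 1-5 score.

          Coefficients a-d are invented for this sketch, not taken from the Recommendation.
          """
          # Coding quality rises with bitrate and degrades away from an optimal frame rate.
          i_coding = 1.0 + a * (1.0 - math.exp(-bitrate_kbps / b))
          fr_penalty = math.exp(-((math.log(max(frame_rate, 1.0)) - math.log(30.0)) ** 2) / c)
          # Packet loss attenuates the coding quality exponentially.
          loss_factor = math.exp(-packet_loss_pct / (100.0 * d))
          return 1.0 + (i_coding - 1.0) * fr_penalty * loss_factor

      print(round(video_quality(800, 30, 0.5), 2))   # lightly impaired stream
      print(round(video_quality(800, 30, 5.0), 2))   # heavier packet loss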

  16. Impact of work-related cancers in Taiwan-Estimation with QALY (quality-adjusted life year) and healthcare costs.

    PubMed

    Lee, Lukas Jyuhn-Hsiarn; Lin, Cheng-Kuan; Hung, Mei-Chuan; Wang, Jung-Der

    2016-12-01

    This study estimates the annual numbers of eight work-related cancers, total losses of quality-adjusted life years (QALYs), and lifetime healthcare expenditures that possibly could be saved by improving occupational health in Taiwan. Three databases were interlinked: the Taiwan Cancer Registry, the National Mortality Registry, and the National Health Insurance Research Database. Annual numbers of work-related cancers were estimated based on attributable fractions (AFs) abstracted from a literature review. The survival functions for eight cancers were estimated and extrapolated to lifetime using a semi-parametric method. A convenience sample of 8846 measurements of patients' quality of life with EQ-5D was collected for utility values and multiplied by survival functions to estimate quality-adjusted life expectancies (QALEs). The loss-of-QALE was obtained by subtracting the QALE of cancer from age- and sex-matched referents simulated from national vital statistics. The lifetime healthcare expenditures were estimated by multiplying the survival probability with mean monthly costs paid by the National Health Insurance for cancer diagnosis and treatment and summing this for the expected lifetime. A total of 3010 males and 726 females with eight work-related cancers were estimated in 2010. Among them, lung cancer ranked first in terms of QALY loss, with an annual total loss-of-QALE of 28,463 QALYs and total lifetime healthcare expenditures of US$36.6 million. Successful prevention of eight work-related cancers would not only avoid the occurrence of 3736 cases of cancer, but would also save more than US$70 million in healthcare costs and 46,750 QALYs for the Taiwan society in 2010.

  17. ViVaMBC: estimating viral sequence variation in complex populations from illumina deep-sequencing data using model-based clustering.

    PubMed

    Verbist, Bie; Clement, Lieven; Reumers, Joke; Thys, Kim; Vapirev, Alexander; Talloen, Willem; Wetzels, Yves; Meys, Joris; Aerssens, Jeroen; Bijnens, Luc; Thas, Olivier

    2015-02-22

    Deep-sequencing allows for an in-depth characterization of sequence variation in complex populations. However, technology associated errors may impede a powerful assessment of low-frequency mutations. Fortunately, base calls are complemented with quality scores which are derived from a quadruplet of intensities, one channel for each nucleotide type for Illumina sequencing. The highest intensity of the four channels determines the base that is called. Mismatch bases can often be corrected by the second best base, i.e. the base with the second highest intensity in the quadruplet. A virus variant model-based clustering method, ViVaMBC, is presented that explores quality scores and second best base calls for identifying and quantifying viral variants. ViVaMBC is optimized to call variants at the codon level (nucleotide triplets) which enables immediate biological interpretation of the variants with respect to their antiviral drug responses. Using mixtures of HCV plasmids we show that our method accurately estimates frequencies down to 0.5%. The estimates are unbiased when average coverages of 25,000 are reached. A comparison with the SNP-callers V-Phaser2, ShoRAH, and LoFreq shows that ViVaMBC has a superb sensitivity and specificity for variants with frequencies above 0.4%. Unlike the competitors, ViVaMBC reports a higher number of false-positive findings with frequencies below 0.4% which might partially originate from picking up artificial variants introduced by errors in the sample and library preparation step. ViVaMBC is the first method to call viral variants directly at the codon level. The strength of the approach lies in modeling the error probabilities based on the quality scores. Although the use of second best base calls appeared very promising in our data exploration phase, their utility was limited. They provided a slight increase in sensitivity, which however does not warrant the additional computational cost of running the offline base caller. Apparently a lot of information is already contained in the quality scores enabling the model based clustering procedure to adjust the majority of the sequencing errors. Overall the sensitivity of ViVaMBC is such that technical constraints like PCR errors start to form the bottleneck for low frequency variant detection.

  18. UAV remote sensing atmospheric degradation image restoration based on multiple scattering APSF estimation

    NASA Astrophysics Data System (ADS)

    Qiu, Xiang; Dai, Ming; Yin, Chuan-li

    2017-09-01

    Unmanned aerial vehicle (UAV) remote imaging is affected by bad weather, and the resulting images suffer from low contrast, complex texture, and blurring. In this paper, we propose a blind deconvolution model based on multiple-scattering atmosphere point spread function (APSF) estimation to recover the remote sensing image. Following Narasimhan's analytical theory, a new multiple-scattering restoration model is established based on the improved dichromatic model. Then, using L0-norm sparse priors on the gradient and the dark channel to estimate the APSF blur kernel, the fast Fourier transform is used to recover the original clear image by Wiener filtering. Compared with other state-of-the-art methods, the proposed method can correctly estimate the blur kernel, effectively remove atmospheric degradation, preserve image detail, and increase the quality evaluation indexes.
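
    Once a blur kernel (APSF) estimate is available, the Wiener filtering step is a short frequency-domain computation. The sketch below illustrates it on a toy image with a known box kernel and an assumed noise-to-signal ratio; kernel estimation itself, the actual APSF model, and the UAV imagery are not reproduced.

      import numpy as np

      def wiener_deconvolve(blurred, kernel, nsr=0.01):
          """Frequency-domain Wiener filter given an estimated blur kernel."""
          H = np.fft.fft2(kernel, s=blurred.shape)          # kernel transfer function
          G = np.fft.fft2(blurred)
          W = np.conj(H) / (np.abs(H) ** 2 + nsr)           # Wiener filter, nsr = noise-to-signal ratio
          return np.real(np.fft.ifft2(W * G))

      # Toy example: blur a synthetic image with a small box kernel, then restore it.
      img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0
      kernel = np.ones((5, 5)) / 25.0
      blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel, s=img.shape)))
      restored = wiener_deconvolve(blurred, kernel)
      print("max abs restoration error:", float(np.abs(restored - img).max()))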

  19. Estimating parameters for probabilistic linkage of privacy-preserved datasets.

    PubMed

    Brown, Adrian P; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Boyd, James H

    2017-07-10

    Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20% error. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error. Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded results for F-measure that were only slightly below the highest possible for those probabilities. The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, the approach is a major step towards providing a complete operational approach for probabilistic linkage of privacy-preserved datasets.
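
    Privacy-preserved linkage of this kind typically encodes each identifier field as a Bloom filter over character bigrams and compares filters with a Dice coefficient; the EM step then estimates match probabilities from those similarity scores. The sketch below shows only the encoding and comparison, with an illustrative filter length and hash count.

      import hashlib

      def bloom_encode(value, m=256, k=8):
          """Encode a field as the set of Bloom-filter bit positions hit by k hashes of each bigram."""
          text = f"_{value.lower().strip()}_"
          bigrams = {text[i:i + 2] for i in range(len(text) - 1)}
          bits = set()
          for gram in bigrams:
              for seed in range(k):
                  digest = hashlib.sha256(f"{seed}:{gram}".encode()).hexdigest()
                  bits.add(int(digest, 16) % m)
          return bits

      def dice(a, b):
          """Dice coefficient between two Bloom filters (represented as sets of set bit positions)."""
          return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

      print(round(dice(bloom_encode("catherine"), bloom_encode("katherine")), 3))  # high similarity
      print(round(dice(bloom_encode("catherine"), bloom_encode("robert")), 3))     # low similarity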

  20. Validity test and its consistency in the construction of patient loyalty model

    NASA Astrophysics Data System (ADS)

    Yanuar, Ferra

    2016-04-01

    The main objective of the present study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data on the construction of a patient loyalty model. In the hypothesized model, service quality, patient satisfaction and patient loyalty were determined simultaneously, and each factor was measured by several indicator variables. The respondents involved in this study were patients who had received healthcare at Puskesmas (community health centres) in Padang, West Sumatera. All 394 respondents who had complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant in measuring their corresponding latent variable. Service quality was most strongly measured by tangibles, patient satisfaction by satisfaction with the service, and patient loyalty by good service quality. Meanwhile, in the structural equations, this study found that patient loyalty was affected positively and directly by patient satisfaction, while service quality affected patient loyalty indirectly, with patient satisfaction as a mediator variable between the two latent variables. Both structural equations were also valid. This study also showed that the validity values obtained here were consistent, based on a simulation study using a bootstrap approach.

  1. Quality of life and patient preferences: identification of subgroups of multiple sclerosis patients.

    PubMed

    Rosato, Rosalba; Testa, Silvia; Oggero, Alessandra; Molinengo, Giorgia; Bertolotto, Antonio

    2015-09-01

    The aim of this study was to estimate preferences related to quality of life attributes in people with multiple sclerosis, by keeping heterogeneity of patient preference in mind, using the latent class approach. A discrete choice experiment survey was developed using the following attributes: activities of daily living, instrumental activities of daily living, pain/fatigue, anxiety/depression and attention/concentration. Choice sets were presented as pairs of hypothetical health status, based upon a fractional factorial design. The latent class logit model estimated on 152 patients identified three subpopulations, which, respectively, attached more importance to: (1) the physical dimension; (2) pain/fatigue and anxiety/depression; and (3) instrumental activities of daily living impairments, anxiety/depression and attention/concentration. A posterior analysis suggests that the latent class membership may be related to an individual's age to some extent, or to diagnosis and treatment, while apart from energy dimension, no significant difference exists between latent groups, with regard to Multiple Sclerosis Quality of Life-54 scales. A quality of life preference-based utility measure for people with multiple sclerosis was developed. These utility values allow identification of a hierarchic priority among different aspects of quality of life and may allow physicians to develop a care programme tailored to patient needs.

  2. Optimal threshold estimator of a prognostic marker by maximizing a time-dependent expected utility function for a patient-centered stratified medicine.

    PubMed

    Dantan, Etienne; Foucher, Yohann; Lorent, Marine; Giral, Magali; Tessier, Philippe

    2018-06-01

    Defining thresholds of prognostic markers is essential for stratified medicine. Such thresholds are mostly estimated from purely statistical measures, regardless of patient preferences, potentially leading to unacceptable medical decisions. Quality-Adjusted Life-Years are a widely used preference-based measure of health outcomes. We develop a time-dependent Quality-Adjusted Life-Years-based expected utility function for censored data that should be maximized to estimate an optimal threshold. We performed a simulation study to compare the thresholds estimated using the proposed expected utility approach and purely statistical estimators. Two applications illustrate the usefulness of the proposed methodology, which was implemented in the R package ROCt (www.divat.fr). First, by reanalyzing data of a randomized clinical trial comparing the efficacy of prednisone vs. placebo in patients with chronic liver cirrhosis, we demonstrate the utility of treating patients with a prothrombin level higher than 89%. Second, we reanalyze the data of an observational cohort of kidney transplant recipients and conclude that the Kidney Transplant Failure Score is not useful for adapting the frequency of clinical visits. Applying such a patient-centered methodology may improve the future transfer of novel prognostic scoring systems or markers into clinical practice.

  3. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    PubMed

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
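
    The Bayesian idea behind a mapping quality can be stated compactly: convert the per-alignment likelihoods of a multi-mapping read into a posterior for the best hit and report it on the Phred scale. The sketch below is a generic illustration of that computation (flat prior, log10 likelihood inputs, a cap at 60), not AlignerBoost's actual model.

      import math

      def mapping_quality(alignment_log10_likelihoods):
          """Phred-scaled mapping quality of the best hit from per-alignment log10 likelihoods."""
          best = max(alignment_log10_likelihoods)
          total = sum(10 ** (ll - best) for ll in alignment_log10_likelihoods)
          posterior = 1.0 / total                      # posterior of the best alignment under a flat prior
          p_wrong = max(1.0 - posterior, 1e-25)        # avoid log(0) for effectively unique hits
          return min(60, int(round(-10 * math.log10(p_wrong))))

      # A read with one strong hit and two weak secondary hits:
      print(mapping_quality([-5.0, -9.0, -10.0]))      # high MAPQ
      # A read hitting two repeat copies almost equally well:
      print(mapping_quality([-5.0, -5.2]))             # low MAPQ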

  4. Using Deep Learning for Tropical Cyclone Intensity Estimation

    NASA Astrophysics Data System (ADS)

    Miller, J.; Maskey, M.; Berendes, T.

    2017-12-01

    Satellite-based techniques are the primary approach to estimating tropical cyclone (TC) intensity. Tropical cyclone warning centers worldwide still apply variants of the Dvorak technique for such estimations that include visual inspection of the satellite images. The National Hurricane Center (NHC) estimates about 10-20% uncertainty in its post analyses when only satellite-based estimates are available. The success of the Dvorak technique proves that spatial patterns in infrared (IR) imagery strongly relate to TC intensity. With the ever-increasing quality and quantity of satellite observations of TCs, deep learning techniques designed to excel at pattern recognition have become more relevant in this area of study. In our current study, we aim to provide a fully objective approach to TC intensity estimation by utilizing deep learning in the form of a convolutional neural network trained to predict TC intensity (maximum sustained wind speed) using IR satellite imagery. Large amounts of training data are needed to train a convolutional neural network, so we use GOES IR images from historical tropical storms from the Atlantic and Pacific basins spanning years 2000 to 2015. Images are labeled using a special subset of the HURDAT2 dataset restricted to time periods with airborne reconnaissance data available in order to improve the quality of the HURDAT2 data. Results and the advantages of this technique are to be discussed.

  5. Telemedicine-based system for quality management and peer review in radiology.

    PubMed

    Morozov, Sergey; Guseva, Ekaterina; Ledikhova, Natalya; Vladzymyrskyy, Anton; Safronov, Dmitry

    2018-06-01

    Quality assurance is the key component of modern radiology. A telemedicine-based quality assurance system helps to overcome the "scoring" approach and makes the quality control more accessible and objective. A concept for quality assurance in radiology is developed. Its realization is a set of strategies, actions, and tools. The latter is based on telemedicine-based peer review of 23,199 computed tomography (CT) and magnetic resonance imaging (MRI) images. The conception of the system for quality management in radiology represents a chain of actions: "discrepancies evaluation - routine support - quality improvement activity - discrepancies evaluation". It is realized by an audit methodology, telemedicine, elearning, and other technologies. After a year of systemic telemedicine-based peer reviews, the authors have estimated that clinically significant discrepancies were detected in 6% of all cases, while clinically insignificant ones were found in 19% of cases. Most often, problems appear in musculoskeletal records; 80% of the examinations have diagnostic or technical imperfections. The presence of routine telemedicine support and personalized elearning allowed improving the diagnostics quality. The level of discrepancies has decreased significantly (p < 0.05). The telemedicine-based peer review system allows improving radiology departments' network effectiveness. • "Scoring" approach to radiologists' performance assessment must be changed. • Telemedicine peer review and personalized elearning significantly decrease the number of discrepancies. • Teleradiology allows linking all primary-level hospitals to a common peer review network.

  6. Comparison of Soil Quality Index Using Three Methods

    PubMed Central

    Mukherjee, Atanu; Lal, Rattan

    2014-01-01

    Assessment of management-induced changes in soil quality is important to sustaining high crop yield. The large diversity of cultivated soils necessitates the identification and development of an appropriate soil quality index (SQI) based on relative soil properties and crop yield. Whereas numerous attempts have been made to estimate SQI for major soils across the world, no standard method has been established, and thus a strong need exists for developing a user-friendly and credible SQI through comparison of the various available methods. Therefore, the objective of this article is to compare three widely used methods of estimating SQI using data collected from 72 soil samples from three on-farm study sites in Ohio. An additional challenge lies in relating crop yield to SQI calculated either depth-wise or for combined soil layers, as a standard methodology is not yet available and this question has received little attention to date. Predominant soils of the study included one organic (Mc) and two mineral (CrB, Ko) soils. The three methods used to estimate SQI were: (i) simple additive SQI (SQI-1), (ii) weighted additive SQI (SQI-2), and (iii) statistically modeled SQI (SQI-3) based on principal component analysis (PCA). The SQI varied between treatments and soil types and ranged between 0 and 0.9 (1 being the maximum SQI). In general, SQIs did not differ significantly with depth under any method, suggesting that soil quality did not differ significantly between depths at the studied sites. Additionally, the data indicate that SQI-3 was most strongly correlated with crop yield, with correlation coefficients between 0.74 and 0.78. All three SQIs were significantly correlated with each other (r = 0.92-0.97) and with crop yield (r = 0.65-0.79). Separate analyses by crop variety revealed lower correlations, indicating that some key aspects of soil quality related to crop response are important requirements for estimating SQI. PMID:25148036
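
    A sketch of a statistically modeled, PCA-based index in the spirit of SQI-3 is given below: indicators are scored to 0-1, PCA on the standardized indicators supplies variance-weighted loadings, and the weighted sum gives one index per sample. The indicator matrix, the 85% variance cut-off, and the weighting scheme are illustrative assumptions, not the paper's exact procedure.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import MinMaxScaler, StandardScaler

      rng = np.random.default_rng(3)
      # Columns stand for hypothetical indicators, e.g. organic C, pH score, bulk density score, ...
      indicators = rng.normal(size=(72, 6))

      scores = MinMaxScaler().fit_transform(indicators)        # 0-1 scoring of each indicator
      pca = PCA().fit(StandardScaler().fit_transform(indicators))

      # Keep components explaining 85% of the variance and weight each indicator
      # by its absolute loadings, with components weighted by explained variance.
      n_keep = np.searchsorted(np.cumsum(pca.explained_variance_ratio_), 0.85) + 1
      loadings = np.abs(pca.components_[:n_keep])
      weights = pca.explained_variance_ratio_[:n_keep] @ loadings
      weights /= weights.sum()

      sqi = scores @ weights                                   # one 0-1 index per soil sample
      print("SQI range:", round(float(sqi.min()), 2), "-", round(float(sqi.max()), 2))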

  7. Estimation and control of droplet size and frequency in projected spray mode of a gas metal arc welding (GMAW) process.

    PubMed

    Anzehaee, Mohammad Mousavi; Haeri, Mohammad

    2011-07-01

    New estimators are designed based on the modified force balance model to estimate the detaching droplet size, detached droplet size, and mean value of droplet detachment frequency in a gas metal arc welding process. The proper droplet size for the process to be in the projected spray transfer mode is determined based on the modified force balance model and the designed estimators. Finally, the droplet size and the melting rate are controlled using two proportional-integral (PI) controllers to achieve high weld quality by retaining the transfer mode and generating appropriate signals as inputs of the weld geometry control loop. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Effective deep learning training for single-image super-resolution in endomicroscopy exploiting video-registration-based reconstruction.

    PubMed

    Ravì, Daniele; Szczotka, Agnieszka Barbara; Shakir, Dzhoshkun Ismail; Pereira, Stephen P; Vercauteren, Tom

    2018-06-01

    Probe-based confocal laser endomicroscopy (pCLE) is a recent imaging modality that allows performing in vivo optical biopsies. The design of pCLE hardware, and its reliance on an optical fibre bundle, fundamentally limits image quality: a few tens of thousands of fibres, each acting as the equivalent of a single-pixel detector, are assembled into a single fibre bundle. Video registration techniques can be used to estimate high-resolution (HR) images by exploiting the temporal information contained in a sequence of low-resolution (LR) images. However, the alignment of LR frames, required for the fusion, is computationally demanding and prone to artefacts. In this work, we propose a novel synthetic data generation approach to train exemplar-based Deep Neural Networks (DNNs). HR pCLE images with enhanced quality are recovered by models trained on pairs of estimated HR images (generated by the video registration algorithm) and realistic synthetic LR images. The performance of three different state-of-the-art DNN techniques was analysed on a Smart Atlas database of 8806 images from 238 pCLE video sequences. The results were validated through an extensive image quality assessment that takes into account different quality scores, including a Mean Opinion Score (MOS). Results indicate that the proposed solution produces an effective improvement in the quality of the reconstructed image. The proposed training strategy and associated DNNs allow us to perform convincing super-resolution of pCLE images.

  9. Identifying Optimal Temporal Scale for the Correlation of AOD and Ground Measurements of PM2.5 to Improve the Modeling Performance in a Real-Time Air Quality Estimation System

    NASA Technical Reports Server (NTRS)

    Li,Hui; Faruque, Fazlay; Williams, Worth; Al-Hamdan, Mohammad; Luvall, Jeffrey; Crosson, William; Rickman, Douglas; Limaye, Ashutosh

    2008-01-01

    Aerosol optical depth (AOD), derived from satellite measurements using the Moderate Resolution Imaging Spectroradiometer (MODIS), offers indirect estimates of particulate matter. Research shows a significant positive correlation between satellite-based measurements of AOD and ground-based measurements of particulate matter with aerodynamic diameter less than or equal to 2.5 micrometers (PM2.5). In addition, satellite observations have also shown great promise in improving estimates of PM2.5 air quality surfaces. Research shows that correlations between AOD and ground PM2.5 are affected by a combination of many factors, such as inherent characteristics of satellite observations, terrain, cloud cover, height of the mixing layer, and weather conditions, and thus might vary widely in different regions, different seasons, and even on different days at the same location. Analysis correlating AOD with ground-measured PM2.5 on a day-to-day basis suggests that the temporal scale (the number of most recent days preceding a given run day) used for their correlation needs to be considered to improve air quality surface estimates, especially when satellite observations are used in a real-time pollution system. A further consideration is that correlation coefficients between AOD and ground PM2.5 cannot be predetermined and need to be calculated for each day's run in a real-time system, because the coefficients can vary over space and time. Few studies have been conducted to explore the optimal way to apply AOD data to improve model accuracy of PM2.5 surface estimation in a real-time air quality system. This paper discusses the best temporal scale for calculating the correlation of AOD and ground particulate matter data to improve the results of pollution models in a real-time system.
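
    As a concrete illustration of selecting the temporal scale, the sketch below computes, for one hypothetical location, the correlation between collocated AOD and ground PM2.5 over trailing windows of several lengths and picks the window with the strongest correlation for the current run day. The synthetic series and candidate window lengths are placeholders, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n_days = 120
        pm25 = 20 + 10 * rng.standard_normal(n_days)            # synthetic ground PM2.5 series
        aod = 0.04 * pm25 + 0.2 * rng.standard_normal(n_days)   # synthetic collocated AOD series

        def trailing_corr(x, y, window):
            """Pearson correlation over the last `window` days before the current run day."""
            return np.corrcoef(x[-window:], y[-window:])[0, 1]

        candidate_windows = [7, 14, 21, 30, 60]                  # temporal scales to test (days)
        scores = {w: trailing_corr(aod, pm25, w) for w in candidate_windows}
        best = max(scores, key=scores.get)
        print(f"best trailing window: {best} days (r = {scores[best]:.2f})")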

  10. Pragmatic estimation of a spatio-temporal air quality model with irregular monitoring data

    NASA Astrophysics Data System (ADS)

    Sampson, Paul D.; Szpiro, Adam A.; Sheppard, Lianne; Lindström, Johan; Kaufman, Joel D.

    2011-11-01

    Statistical analyses of health effects of air pollution have increasingly used GIS-based covariates for prediction of ambient air quality in "land use" regression models. More recently these spatial regression models have accounted for spatial correlation structure in combining monitoring data with land use covariates. We present a flexible spatio-temporal modeling framework and pragmatic, multi-step estimation procedure that accommodates essentially arbitrary patterns of missing data with respect to an ideally complete space by time matrix of observations on a network of monitoring sites. The methodology incorporates a model for smooth temporal trends with coefficients varying in space according to Partial Least Squares regressions on a large set of geographic covariates and nonstationary modeling of spatio-temporal residuals from these regressions. This work was developed to provide spatial point predictions of PM 2.5 concentrations for the Multi-Ethnic Study of Atherosclerosis and Air Pollution (MESA Air) using irregular monitoring data derived from the AQS regulatory monitoring network and supplemental short-time scale monitoring campaigns conducted to better predict intra-urban variation in air quality. We demonstrate the interpretation and accuracy of this methodology in modeling data from 2000 through 2006 in six U.S. metropolitan areas and establish a basis for likelihood-based estimation.

  11. Estimation of pollutant loads considering dam operation in Han River Basin by BASINS/Hydrological Simulation Program-FORTRAN.

    PubMed

    Jung, Kwang-Wook; Yoon, Choon-G; Jang, Jae-Ho; Kong, Dong-Soo

    2008-01-01

    Effective watershed management often demands qualitative and quantitative predictions of the effect of future management activities as arguments for policy makers and administration. The BASINS geographic information system was developed to compute total maximum daily loads, which are helpful for establishing a hydrological process and water quality modeling system. In this paper, the BASINS toolkit HSPF model is applied to the 20,271 km² watershed of the Han River Basin to assess the applicability of HSPF and of best management practice (BMP) scenarios. For proper evaluation of watershed and stream water quality, comprehensive estimation methods are necessary to assess large amounts of point-source and nonpoint-source (NPS) pollution based on the total watershed area. In this study, the Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate watershed pollutant loads accounting for dam operation, and BMP scenarios were applied to control NPS pollution. The 8-day monitoring data (about three years) were used in the calibration and verification processes. Model performance was in the range of "very good" and "good" based on percent difference. The water-quality simulation results were encouraging for this large watershed with dam operation practice and mixed land uses; HSPF proved adequate, and its application is recommended to simulate watershed processes and for BMP evaluation. IWA Publishing 2008.

  12. Quality of recording of diabetes in the UK: how does the GP's method of coding clinical data affect incidence estimates? Cross-sectional study using the CPRD database.

    PubMed

    Tate, A Rosemary; Dungey, Sheena; Glew, Simon; Beloff, Natalia; Williams, Rachael; Williams, Tim

    2017-01-25

    To assess the effect of coding quality on estimates of the incidence of diabetes in the UK between 1995 and 2014. A cross-sectional analysis examining diabetes coding from 1995 to 2014 and how the choice of codes (diagnosis codes vs codes which suggest diagnosis) and quality of coding affect estimated incidence. Routine primary care data from 684 practices contributing to the UK Clinical Practice Research Datalink (data contributed from Vision (INPS) practices). Incidence rates of diabetes and how they are affected by (1) GP coding and (2) excluding 'poor' quality practices with at least 10% incident patients inaccurately coded between 2004 and 2014. Incidence rates and accuracy of coding varied widely between practices and the trends differed according to selected category of code. If diagnosis codes were used, the incidence of type 2 diabetes increased sharply until 2004 (when the UK Quality and Outcomes Framework was introduced), then flattened off until 2009, after which it decreased. If non-diagnosis codes were included, the numbers continued to increase until 2012. Although coding quality improved over time, 15% of the 666 practices that contributed data between 2004 and 2014 were labelled 'poor' quality. When these practices were dropped from the analyses, the downward trend in the incidence of type 2 diabetes after 2009 became less marked and incidence rates were higher. In contrast to some previous reports, diabetes incidence (based on diagnostic codes) appears not to have increased since 2004 in the UK. Choice of codes can make a significant difference to incidence estimates, as can quality of recording. Codes and data quality should be checked when assessing incidence rates using GP data. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. Quality Assessment of TPB-Based Questionnaires: A Systematic Review

    PubMed Central

    Oluka, Obiageli Crystal; Nie, Shaofa; Sun, Yi

    2014-01-01

    Objective: This review is aimed at assessing the quality of questionnaires and their development process based on the theory of planned behavior (TPB) change model. Methods: A systematic literature search for studies with the primary aim of TPB-based questionnaire development was conducted in relevant databases between 2002 and 2012 using selected search terms. Ten of 1,034 screened abstracts met the inclusion criteria and were assessed for methodological quality using two different appraisal tools: one for the overall methodological quality of each study and the other developed for the appraisal of the questionnaire content and development process. Both appraisal tools consisted of items regarding the likelihood of bias in each study and were eventually combined to give the overall quality score for each included study. Results: 8 of the 10 included studies showed low risk of bias in the overall quality assessment of each study, while 9 of the studies were of high quality based on the quality appraisal of questionnaire content and development process. Conclusion: Quality appraisal of the questionnaires in the 10 reviewed studies was successfully conducted, highlighting the top problem areas (including: sample size estimation; inclusion of direct and indirect measures; and inclusion of questions on demographics) in the development of TPB-based questionnaires and the need for researchers to provide a more detailed account of their development process. PMID:24722323

  14. Nondestructive methods of evaluating quality of wood in preservative-treated piles

    Treesearch

    Xiping Wang; Robert J. Ross; John R. Erickson; John W. Forsman; Gary D. McGinnis; Rodney C. De Groot

    2000-01-01

    Stress-wave-based nondestructive evaluation methods were used to evaluate the potential quality and modulus of elasticity (MOE) of wood in used preservative-treated Douglas-fir and southern pine piles. Stress wave measurements were conducted on each pile section. Stress wave propagation speeds in the piles were then obtained to estimate their MOE. This was followed by...

  15. Nondestructive evaluation of potential quality of creosote-treated piles removed from service

    Treesearch

    Xiping Wang; Robert J. Ross; John R. Erickson; John W. Forsman; Gary D. McGinnis; Rodney C. De Groot

    2001-01-01

    Stress-wave-based nondestructive evaluation methods were used to evaluate the potential quality and modulus of elasticity (MOE) of wood from creosote-treated Douglas-fir and southern pine piles removed from service. Stress-wave measurements were conducted on each pile section. Stress-wave propagation speeds were obtained to estimate the MOE of the wood. Tests were then...

  16. Integrated modeling approach using SELECT and SWAT models to simulate source loading and in-stream conditions of fecal indicator bacteria.

    NASA Astrophysics Data System (ADS)

    Ranatunga, T.

    2016-12-01

    Modeling of fate and transport of fecal bacteria in a watershed is generally a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria (E. coli) source loading and in-stream conditions in a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. Major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads were input to the SWAT model in order to simulate transport through the land and in-stream conditions. The calibrated SWAT model was then used to estimate indicator bacteria in-stream concentrations for future years based on H-GAC's regional land use, population, and household projections (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.

  17. Application of SELECT and SWAT models to simulate source load, fate, and transport of fecal bacteria in watersheds.

    NASA Astrophysics Data System (ADS)

    Ranatunga, T.

    2017-12-01

    Modeling of fate and transport of fecal bacteria in a watershed is a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria source loading and in-stream conditions in a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. Major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads from the SELECT model were input to the SWAT model to simulate bacteria transport through the land and in-stream. The calibrated SWAT model was then used to estimate indicator bacteria in-stream concentrations for future years based on regional land use, population, and household forecasts (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.

  18. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
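
    For reference, the sigma-metric referred to here is conventionally computed from the allowable total error (TEa), the observed bias, and the observed imprecision (CV), all expressed in percent at a medical decision concentration. The values below are illustrative, not taken from the article.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Sigma-metric = (allowable total error - |bias|) / CV, with all terms in percent."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Illustrative values for a single medical decision concentration.
        print(round(sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=2.0), 2))  # -> 4.25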

  19. Design of a practical model-observer-based image quality assessment method for x-ray computed tomography imaging systems

    PubMed Central

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.

    2016-01-01

    Abstract. The use of a channelization mechanism on model observers not only makes mimicking human visual behavior possible, but also reduces the amount of image data needed to estimate the model observer parameters. The channelized Hotelling observer (CHO) and channelized scanning linear observer (CSLO) have recently been used to assess CT image quality for detection tasks and combined detection/estimation tasks, respectively. Although the use of channels substantially reduces the amount of data required to compute image quality, the number of scans required for CT imaging is still not practical for routine use. It is our desire to further reduce the number of scans required to make CHO or CSLO an image quality tool for routine and frequent system validations and evaluations. This work explores different data-reduction schemes and designs an approach that requires only a few CT scans. Three different kinds of approaches are included in this study: a conventional CHO/CSLO technique with a large sample size, a conventional CHO/CSLO technique with fewer samples, and an approach that we will show requires fewer samples to mimic conventional performance with a large sample size. The mean value and standard deviation of areas under ROC/EROC curve were estimated using the well-validated shuffle approach. The results indicate that an 80% data reduction can be achieved without loss of accuracy. This substantial data reduction is a step toward a practical tool for routine-task-based QA/QC CT system assessment. PMID:27493982
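
    A minimal numerical sketch of a channelized Hotelling observer is given below: channel outputs are computed for signal-present and signal-absent images, the Hotelling template is formed from the mean difference and the pooled channel covariance, and detection performance is summarized by a nonparametric AUC estimate. The random channels, stand-in signal, and sample sizes are arbitrary illustrations, not the study's channels or data.

        import numpy as np

        rng = np.random.default_rng(1)
        n_pix, n_ch, n_img = 16 * 16, 10, 200
        channels = rng.standard_normal((n_ch, n_pix))   # stand-in channels (Gabor or Laguerre-Gauss in practice)
        signal = np.zeros(n_pix)
        signal[100:120] = 1.5                            # weak stand-in signal

        absent = rng.standard_normal((n_img, n_pix))               # noise-only images
        present = rng.standard_normal((n_img, n_pix)) + signal     # signal-present images

        v0 = absent @ channels.T                         # channel outputs, signal absent
        v1 = present @ channels.T                        # channel outputs, signal present
        s_bar = v1.mean(0) - v0.mean(0)                  # mean channel-output difference
        k = 0.5 * (np.cov(v0, rowvar=False) + np.cov(v1, rowvar=False))
        w = np.linalg.solve(k, s_bar)                    # channelized Hotelling template

        t0, t1 = v0 @ w, v1 @ w                          # observer test statistics
        auc = (t1[:, None] > t0[None, :]).mean()         # Wilcoxon (Mann-Whitney) estimate of AUC
        print(f"AUC = {auc:.3f}")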

  20. Near-infrared reflectance spectroscopy calibrations for assessment of oil, phenols, glucosinolates and fatty acid content in the intact seeds of oilseed Brassica species.

    PubMed

    Sen, Rahul; Sharma, Sanjula; Kaur, Gurpreet; Banga, Surinder S

    2018-01-31

    Very few near-infrared reflectance spectroscopy (NIRS) calibration models are available for non-destructive estimation of seed quality traits in Brassica juncea. Those that are available also fail to adequately discern variation in oleic acid (C18:1), linolenic acid (C18:3), meal glucosinolates and phenols. We report the development of a new NIRS calibration equation that is expected to fill the gaps in the existing NIRS equations. Calibrations were based on the reference values of important quality traits estimated from a purposely selected germplasm set comprising 240 genotypes of B. juncea and 193 of B. napus. We were able to develop optimal NIRS-based calibration models for oil, phenols, glucosinolates, oleic acid, linoleic acid and erucic acid for B. juncea and B. napus. Correlation coefficients (RSQ) of the external validations were greater than 0.7 for the majority of traits, such as oil (0.766, 0.865), phenols (0.821, 0.915), glucosinolates (0.951, 0.986), oleic acid (0.814, 0.810), linoleic acid (0.974, 0.781) and erucic acid (0.963, 0.943) for B. juncea and B. napus, respectively. The results demonstrate the robust predictive power of the developed calibration models for rapid estimation of many quality traits in intact rapeseed-mustard seeds, which will assist plant breeders in effective screening and selection of lines in quality improvement breeding programmes. © 2018 Society of Chemical Industry.

  1. Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop

    USDA-ARS?s Scientific Manuscript database

    A radio-controlled unmanned helicopter-based LARS (Low-Altitude Remote Sensing) platform was used to acquire quality images of high spatial and temporal resolution, in order to estimate yield and total biomass of a rice crop (Oryza sativa L.). Fifteen rice field plots with five N-treatments (0, 33,...

  2. Reconciling Different Estimates of Teacher Quality Gaps Based on Value Added. CEDR Policy Brief. PB #2016-9

    ERIC Educational Resources Information Center

    Goldhaber, Dan; Quince, Vanessa; Theobald, Roddy

    2016-01-01

    This policy brief reviews evidence about the extent to which disadvantaged students are taught by teachers with lower value-added estimates of performance, and seeks to reconcile differences in findings from different studies. We demonstrate that much of the inequity in teacher value added in Washington state is due to differences across different…

  3. Predicting volumes and numbers of logs by grade from hardwood cruise data

    Treesearch

    Daniel A. Yaussy; Robert L. Brisbin; Mary J. Humphreys; Mary J. Humphreys

    1988-01-01

    The equations presented allow the estimation of quality and quantity of logs produced from a hardwood stand based on cruise data. When packaged in appropriate computer software, the information will provide the mill manager with the means to estimate the value of logs that would be added to a mill yard inventory from a timber sale.

  4. Design of a practical model-observer-based image quality assessment method for CT imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

    2014-03-01

    The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations and for further system optimization. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by the shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.

  5. Estimation of Delta Wave by Mutual Information of Heartbeat During Sleep

    NASA Astrophysics Data System (ADS)

    Kurihara, Yosuke; Watanabe, Kajiro; Kobayashi, Kazuyuki; Tanaka, Hiroshi

    The quality of sleep is evaluated based on the sleep stages judged by the R-K method or the manual of the American Academy of Sleep Medicine. The brainwaves, eye movements, and chin EMG of sleeping subjects are used for the judgment. These methods, however, require electrodes to be attached to the head and the face to obtain the brainwaves, eye movements, and chin EMG, making the measurements too troublesome to perform on a daily basis. If non-invasive measurements of brainwaves, eye movements, and chin EMG were feasible, or their equivalent data could be estimated from other bio-signals, monitoring the quality of daily sleep, which influences health, would be easy. In this paper, we show that the appearance rate of delta wave occurrences, which is closely related to the depth of sleep, can be estimated based on the average amount of mutual information calculated from pulse wave signals and body movements measured non-invasively by the pneumatic method. As a result, the root mean square error between the appearance rate of delta wave occurrences measured with polysomnography and the estimated value was 14.93%.
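
    A plug-in (histogram) estimate of the average mutual information between two simultaneously recorded series, applied here to synthetic stand-ins for the pulse-wave and body-movement signals, can be sketched as follows; the bin count and the synthetic data are illustrative assumptions, not the paper's processing.

        import numpy as np

        def mutual_information(x, y, bins=16):
            """Histogram (plug-in) estimate of I(X;Y) in bits."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        rng = np.random.default_rng(0)
        pulse = rng.standard_normal(5000)                           # synthetic pulse-wave feature series
        movement = 0.6 * pulse + 0.8 * rng.standard_normal(5000)    # partially dependent movement series
        print(f"I(pulse; movement) = {mutual_information(pulse, movement):.3f} bits")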

  6. Parametric Analysis of Surveillance Quality and Level and Quality of Intent Information and Their Impact on Conflict Detection Performance

    NASA Technical Reports Server (NTRS)

    Guerreiro, Nelson M.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Lewis, Timothy A.

    2016-01-01

    A loss-of-separation (LOS) is said to occur when two aircraft are spatially too close to one another. A LOS is the fundamental unsafe event to be avoided in air traffic management, and conflict detection (CD) is the function that attempts to predict these LOS events. In general, the effectiveness of conflict detection relates to the overall safety and performance of an air traffic management concept. An abstract, parametric analysis was conducted to investigate the impact of surveillance quality, level of intent information, and quality of intent information on conflict detection performance. The data collected in this analysis can be used to estimate the conflict detection performance under alternative future scenarios or alternative allocations of the conflict detection function, based on the quality of the surveillance and intent information under those conditions. Alternatively, these data could also be used to estimate the surveillance and intent information quality required to achieve some desired CD performance as part of the design of a new separation assurance system.

  7. Health-related quality of life among adults 65 years and older in the United States, 2011-2012: a multilevel small area estimation approach.

    PubMed

    Lin, Yu-Hsiu; McLain, Alexander C; Probst, Janice C; Bennett, Kevin J; Qureshi, Zaina P; Eberth, Jan M

    2017-01-01

    The purpose of this study was to develop county-level estimates of poor health-related quality of life (HRQOL) among U.S. adults aged 65 years and older and to identify spatial clusters of poor HRQOL using a multilevel, poststratification approach. Multilevel, random-intercept models were fit to HRQOL data (two domains: physical health and mental health) from the 2011-2012 Behavioral Risk Factor Surveillance System. Using a poststratification, small area estimation approach, we generated county-level probabilities of having poor HRQOL for each domain in U.S. adults aged 65 and older, and validated our model-based estimates against state and county direct estimates. County-level estimates of poor HRQOL in the United States ranged from 18.07% to 44.81% for physical health and 14.77% to 37.86% for mental health. Correlations between model-based and direct estimates were higher for physical than mental HRQOL. Counties located in Arkansas, Kentucky, and Mississippi exhibited the worst physical HRQOL scores, but this pattern did not hold for mental HRQOL, for which the highest probabilities of mentally unhealthy days occurred in Illinois, Indiana, and Vermont. Substantial geographic variation in physical and mental HRQOL scores exists among older U.S. adults. State and local policy makers should consider these local conditions in targeting interventions and policies to counties with high levels of poor HRQOL scores. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Relations between continuous real-time turbidity data and discrete suspended-sediment concentration samples in the Neosho and Cottonwood Rivers, east-central Kansas, 2009-2012

    USGS Publications Warehouse

    Foster, Guy M.

    2014-01-01

    The Neosho River and its primary tributary, the Cottonwood River, are the primary sources of inflow to the John Redmond Reservoir in east-central Kansas. Sedimentation rate in the John Redmond Reservoir was estimated as 743 acre-feet per year for 1964–2006. This estimated sedimentation rate is more than 80 percent larger than the projected design sedimentation rate of 404 acre-feet per year, and resulted in a loss of 40 percent of the conservation pool since its construction in 1964. To reduce sediment input into the reservoir, the Kansas Water Office implemented stream bank stabilization techniques along an 8.3 mile reach of the Neosho River during 2010 through 2011. The U.S. Geological Survey, in cooperation with the Kansas Water Office and funded in part through the Kansas State Water Plan Fund, operated continuous real-time water-quality monitors upstream and downstream from stream bank stabilization efforts before, during, and after construction. Continuously measured water-quality properties include streamflow, specific conductance, water temperature, and turbidity. Discrete sediment samples were collected from June 2009 through September 2012 and analyzed for suspended-sediment concentration (SSC), percentage of sediments less than 63 micrometers (sand-fine break), and loss of material on ignition (analogous to amount of organic matter). Regression models were developed to establish relations between discretely measured SSC samples and turbidity or streamflow in order to estimate SSC continuously. Continuous water-quality monitors represented between 96 and 99 percent of the cross-sectional variability for turbidity, and had slopes between 0.91 and 0.98. Because consistent bias was not observed, values from continuous water-quality monitors were considered representative of stream conditions. On average, turbidity-based SSC models explained 96 percent of the variance in SSC. Streamflow-based regressions explained 53 to 60 percent of the variance. Mean squared prediction error for turbidity-based regression relations ranged from -32 to 48 percent, whereas mean squared prediction error for streamflow-based regressions ranged from -69 to 218 percent. These models are useful for evaluating the variability of SSC during rapidly changing conditions, for computing loads and yields to assess SSC transport through the watershed, and for providing more accurate load estimates than the streamflow-only estimation methods used in the past. These models can be used to evaluate the efficacy of streambank stabilization efforts.
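
    Turbidity-SSC relations of this kind are commonly fit as simple linear regressions in log10 space with a retransformation bias correction; the sketch below shows that general pattern on synthetic data, using the Duan smearing estimator as the bias adjustment. The coefficients and data are placeholders, not the report's published models.

        import numpy as np

        rng = np.random.default_rng(2)
        turb = 10 ** rng.uniform(0.5, 3.0, 150)                    # synthetic turbidity (FNU)
        ssc_true = 2.1 * turb ** 0.95                              # synthetic "true" SSC relation
        ssc = ssc_true * 10 ** (0.08 * rng.standard_normal(150))   # multiplicative sampling noise

        x, y = np.log10(turb), np.log10(ssc)
        slope, intercept = np.polyfit(x, y, 1)                     # ordinary least squares in log space
        resid = y - (intercept + slope * x)
        smear = np.mean(10 ** resid)                               # Duan smearing bias-correction factor

        def predict_ssc(turbidity):
            return smear * 10 ** (intercept + slope * np.log10(turbidity))

        print(f"log10(SSC) = {intercept:.2f} + {slope:.2f} * log10(turbidity), smearing = {smear:.3f}")
        print(f"predicted SSC at 500 FNU: {predict_ssc(500):.0f} mg/L")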

  9. Linking Air Quality and Human Health Effects Models: An Application to the Los Angeles Air Basin

    PubMed Central

    Stewart, Devoun R; Saunders, Emily; Perea, Roberto A; Fitzgerald, Rosa; Campbell, David E; Stockwell, William R

    2017-01-01

    Proposed emission control strategies for reducing ozone and particulate matter are evaluated better when air quality and health effects models are used together. The Community Multiscale Air Quality (CMAQ) model is the US Environmental Protection Agency’s model for determining public policy and forecasting air quality. CMAQ was used to forecast air quality changes due to several emission control strategies that could be implemented between 2008 and 2030 for the South Coast Air Basin that includes Los Angeles. The Environmental Benefits Mapping and Analysis Program—Community Edition (BenMAP-CE) was used to estimate health and economic impacts of the different emission control strategies based on CMAQ simulations. BenMAP-CE is a computer program based on epidemiologic studies that link human health and air quality. This modeling approach is better for determining optimum public policy than approaches that only examine concentration changes. PMID:29162976

  10. Linking Air Quality and Human Health Effects Models: An Application to the Los Angeles Air Basin.

    PubMed

    Stewart, Devoun R; Saunders, Emily; Perea, Roberto A; Fitzgerald, Rosa; Campbell, David E; Stockwell, William R

    2017-01-01

    Proposed emission control strategies for reducing ozone and particulate matter are evaluated better when air quality and health effects models are used together. The Community Multiscale Air Quality (CMAQ) model is the US Environmental Protection Agency's model for determining public policy and forecasting air quality. CMAQ was used to forecast air quality changes due to several emission control strategies that could be implemented between 2008 and 2030 for the South Coast Air Basin that includes Los Angeles. The Environmental Benefits Mapping and Analysis Program-Community Edition (BenMAP-CE) was used to estimate health and economic impacts of the different emission control strategies based on CMAQ simulations. BenMAP-CE is a computer program based on epidemiologic studies that link human health and air quality. This modeling approach is better for determining optimum public policy than approaches that only examine concentration changes.

  11. From phenotyping towards breeding strategies: using in vivo indicator traits and genetic markers to improve meat quality in an endangered pig breed.

    PubMed

    Biermann, A D M; Yin, T; König von Borstel, U U; Rübesam, K; Kuhn, B; König, S

    2015-06-01

    In endangered and local pig breeds of small population sizes, production has to focus on alternative niche markets with an emphasis on specific product and meat quality traits to achieve economic competitiveness. For designing breeding strategies on meat quality, an adequate performance testing scheme focussing on phenotyped selection candidates is required. For the endangered German pig breed 'Bunte Bentheimer' (BB), no breeding program has been designed until now, and no performance testing scheme has been implemented. For local breeds, mainly reared in small-scale production systems, a performance test based on in vivo indicator traits might be a promising alternative in order to increase genetic gain for meat quality traits. Hence, the main objective of this study was to design and evaluate breeding strategies for the improvement of meat quality within the BB breed using in vivo indicator traits and genetic markers. The in vivo indicator trait was backfat thickness measured by ultrasound (BFiv), and genetic markers were allele variants at the ryanodine receptor 1 (RYR1) locus. In total, 1116 records of production and meat quality traits were collected, including 613 in vivo ultrasound measurements and 713 carcass and meat quality records. Additionally, 700 pigs were genotyped at the RYR1 locus. Data were used (1) to estimate genetic (co)variance components for production and meat quality traits, (2) to estimate allele substitution effects at the RYR1 locus using a selective genotyping approach and (3) to evaluate breeding strategies on meat quality by combining results from quantitative-genetic and molecular-genetic approaches. Heritability for the production trait BFiv was 0.27, and 0.48 for backfat thickness measured on carcass. Estimated heritabilities for meat quality traits ranged from 0.14 for meat brightness to 0.78 for the intramuscular fat content (IMF). Genetic correlations between BFiv and IMF were higher than estimates based on carcass backfat measurements (0.39 v. 0.25). The presence of the unfavorable n allele was associated with increased electric conductivity, paler meat and higher drip loss. The allele substitution effect on IMF was unfavorable, indicating lower IMF when the n allele is present. A breeding strategy including the phenotype (BFiv) combined with genetic marker information at the RYR1 locus from the selection candidate resulted in a 20% increase in accuracy and selection response when compared with a breeding strategy without genetic marker information.

  12. Robust fundamental frequency estimation in sustained vowels: Detailed algorithmic comparisons and information fusion with adaptive Kalman filtering

    PubMed Central

    Tsanas, Athanasios; Zañartu, Matías; Little, Max A.; Fox, Cynthia; Ramig, Lorraine O.; Clifford, Gari D.

    2014-01-01

    There has been consistent interest among speech signal processing researchers in the accurate estimation of the fundamental frequency (F0) of speech signals. This study examines ten F0 estimation algorithms (some well-established and some proposed more recently) to determine which of these algorithms is, on average, better able to estimate F0 in the sustained vowel /a/. Moreover, a robust method for adaptively weighting the estimates of individual F0 estimation algorithms based on quality and performance measures is proposed, using an adaptive Kalman filter (KF) framework. The accuracy of the algorithms is validated using (a) a database of 117 synthetic realistic phonations obtained using a sophisticated physiological model of speech production and (b) a database of 65 recordings of human phonations where the glottal cycles are calculated from electroglottograph signals. On average, the sawtooth waveform inspired pitch estimator and the nearly defect-free algorithms provided the best individual F0 estimates, and the proposed KF approach resulted in a ∼16% improvement in accuracy over the best single F0 estimation algorithm. These findings may be useful in speech signal processing applications where sustained vowels are used to assess vocal quality, when very accurate F0 estimation is required. PMID:24815269
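
    The fusion idea can be illustrated with a scalar Kalman filter that treats each algorithm's frame-wise F0 estimate as a measurement whose noise variance is derived from a quality score. The synthetic estimates, the quality-to-variance mapping, and all tuning constants below are assumptions for illustration, not the weighting scheme used in the paper.

        import numpy as np

        rng = np.random.default_rng(3)
        n_frames = 200
        f0_true = 120 + 5 * np.sin(np.linspace(0, 4 * np.pi, n_frames))   # synthetic F0 contour (Hz)

        # Three synthetic "algorithms" with different error levels and per-frame quality scores in (0, 1].
        noise_sd = np.array([1.0, 3.0, 8.0])
        estimates = f0_true + noise_sd[:, None] * rng.standard_normal((3, n_frames))
        quality = np.clip(1.0 / noise_sd, 0.05, 1.0)[:, None] * np.ones((3, n_frames))

        x, p = estimates[:, 0].mean(), 25.0      # fused F0 state and its variance
        q = 4.0                                   # process noise: how much F0 may drift frame to frame
        fused = np.empty(n_frames)
        for t in range(n_frames):
            p += q                                # predict: F0 assumed roughly constant, variance grows
            for alg in range(3):
                r = 1.0 / quality[alg, t] ** 2    # assumed mapping: low quality -> large measurement noise
                k = p / (p + r)                   # Kalman gain
                x += k * (estimates[alg, t] - x)  # update with this algorithm's estimate
                p *= (1 - k)
            fused[t] = x

        print(f"RMSE, best single algorithm: {np.sqrt(np.mean((estimates[0] - f0_true)**2)):.2f} Hz")
        print(f"RMSE, fused estimate:        {np.sqrt(np.mean((fused - f0_true)**2)):.2f} Hz")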

  13. Index to Estimate the Efficiency of an Ophthalmic Practice.

    PubMed

    Chen, Andrew; Kim, Eun Ah; Aigner, Dennis J; Afifi, Abdelmonem; Caprioli, Joseph

    2015-08-01

    A metric of efficiency, a function of the ratio of quality to cost per patient, will allow the health care system to better measure the impact of specific reforms and compare the effectiveness of each. To develop and evaluate an efficiency index that estimates the performance of an ophthalmologist's practice as a function of cost, number of patients receiving care, and quality of care. Retrospective review of 36 ophthalmology subspecialty practices from October 2011 to September 2012 at a university-based eye institute. The efficiency index (E) was defined as a function of adjusted number of patients (Na), total practice adjusted costs (Ca), and a preliminary measure of quality (Q). Constant b limits E between 0 and 1. Constant y modifies the influence of Q on E. Relative value units and geographic cost indices determined by the Centers for Medicare and Medicaid for 2012 were used to calculate adjusted costs. The efficiency index is expressed as E = b(Na/Ca)Q^y. Independent, masked auditors reviewed 20 random patient medical records for each practice and filled out 3 questionnaires to obtain a process-based quality measure. The adjusted number of patients, adjusted costs, quality, and efficiency index were calculated for 36 ophthalmology subspecialties. The median adjusted number of patients was 5516 (interquartile range, 3450-11,863), the median adjusted cost was 1.34 (interquartile range, 0.99-1.96), the median quality was 0.89 (interquartile range, 0.79-0.91), and the median value of the efficiency index was 0.26 (interquartile range, 0.08-0.42). The described efficiency index is a metric that provides a broad overview of performance for a variety of ophthalmology specialties as estimated by resources used and a preliminary measure of quality of care provided. The results of the efficiency index could be used in future investigations to determine its sensitivity to detect the impact of interventions on a practice such as training modules or practice restructuring.
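
    Written out, the index has the form E = b(Na/Ca)Q^y, with b scaling E into the interval 0 to 1 and y controlling the influence of quality. A direct computation using the reported median inputs, with placeholder values for the unpublished constants b and y, might look like the following; it is illustrative only and will not reproduce the reported median E of 0.26.

        def efficiency_index(n_adjusted, c_adjusted, quality, b, y):
            """E = b * (Na / Ca) * Q**y, bounded to [0, 1] by the choice of b."""
            return b * (n_adjusted / c_adjusted) * quality ** y

        # Median inputs reported in the abstract; b and y are placeholders (not published here).
        print(round(efficiency_index(n_adjusted=5516, c_adjusted=1.34, quality=0.89, b=1e-4, y=1.0), 3))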

  14. Mapping of the DLQI scores to EQ-5D utility values using ordinal logistic regression.

    PubMed

    Ali, Faraz Mahmood; Kay, Richard; Finlay, Andrew Y; Piguet, Vincent; Kupfer, Joerg; Dalgard, Florence; Salek, M Sam

    2017-11-01

    The Dermatology Life Quality Index (DLQI) and the European Quality of Life-5 Dimension (EQ-5D) are separate measures that may be used to gather health-related quality of life (HRQoL) information from patients. The EQ-5D is a generic measure from which health utility estimates can be derived, whereas the DLQI is a specialty-specific measure to assess HRQoL. To reduce the burden of multiple measures being administered and to enable a more disease-specific calculation of health utility estimates, we explored an established mathematical technique known as ordinal logistic regression (OLR) to develop an appropriate model to map DLQI data to EQ-5D-based health utility estimates. Retrospective data from 4010 patients were randomly divided five times into two groups for the derivation and testing of the mapping model. Split-half cross-validation was utilized resulting in a total of ten ordinal logistic regression models for each of the five EQ-5D dimensions against age, sex, and all ten items of the DLQI. Using Monte Carlo simulation, predicted health utility estimates were derived and compared against those observed. This method was repeated for both OLR and a previously tested mapping methodology based on linear regression. The model was shown to be highly predictive and its repeated fitting demonstrated a stable model using OLR as well as linear regression. The mean differences between OLR-predicted health utility estimates and observed health utility estimates ranged from 0.0024 to 0.0239 across the ten modeling exercises, with an average overall difference of 0.0120 (a 1.6% underestimate, not of clinical importance). This modeling framework developed in this study will enable researchers to calculate EQ-5D health utility estimates from a specialty-specific study population, reducing patient and economic burden.

  15. Comparison of Methods for Estimating Prevalence of Chronic Diseases and Health Behaviors for Small Geographic Areas: Boston Validation Study, 2013

    PubMed Central

    Holt, James B.; Zhang, Xingyou; Lu, Hua; Shah, Snehal N.; Dooley, Daniel P.; Matthews, Kevin A.; Croft, Janet B.

    2017-01-01

    Introduction: Local health authorities need small-area estimates for prevalence of chronic diseases and health behaviors for multiple purposes. We generated city-level and census-tract–level prevalence estimates of 27 measures for the 500 largest US cities. Methods: To validate the methodology, we constructed multilevel logistic regressions to predict 10 selected health indicators among adults aged 18 years or older by using 2013 Behavioral Risk Factor Surveillance System (BRFSS) data; we applied their predicted probabilities to census population data to generate city-level, neighborhood-level, and zip-code–level estimates for the city of Boston, Massachusetts. Results: By comparing the predicted estimates with their corresponding direct estimates from a locally administered survey (Boston BRFSS 2010 and 2013), we found that our model-based estimates for most of the selected health indicators at the city level were close to the direct estimates from the local survey. We also found strong correlation between the model-based estimates and direct survey estimates at neighborhood and zip code levels for most indicators. Conclusion: Findings suggest that our model-based estimates are reliable and valid at the city level for certain health outcomes. Local health authorities can use the neighborhood-level estimates if high quality local health survey data are not otherwise available. PMID:29049020

  16. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    PubMed

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra- and inter-prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur and jerkiness), compared to the existing bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metrics is compared against popular full-reference objective methods as defined by VQEG.

  17. An Improved Global Wind Resource Estimate for Integrated Assessment Models: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eurek, Kelly; Sullivan, Patrick; Gleason, Michael

    This paper summarizes initial steps to improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind and distance to shore and water depth for offshore. We estimate a total global wind generation potential of 560 PWh for terrestrial wind with 90% of resource classified as low-to-mid quality, and 315 PWh for offshore wind with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km2 deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.

  18. An evaluation of methods for estimating decadal stream loads

    NASA Astrophysics Data System (ADS)

    Lee, Casey J.; Hirsch, Robert M.; Schwarz, Gregory E.; Holtschlag, David J.; Preston, Stephen D.; Crawford, Charles G.; Vecchia, Aldo V.

    2016-11-01

    Effective management of water resources requires accurate information on the mass, or load of water-quality constituents transported from upstream watersheds to downstream receiving waters. Despite this need, no single method has been shown to consistently provide accurate load estimates among different water-quality constituents, sampling sites, and sampling regimes. We evaluate the accuracy of several load estimation methods across a broad range of sampling and environmental conditions. This analysis uses random sub-samples drawn from temporally-dense data sets of total nitrogen, total phosphorus, nitrate, and suspended-sediment concentration, and includes measurements of specific conductance which was used as a surrogate for dissolved solids concentration. Methods considered include linear interpolation and ratio estimators, regression-based methods historically employed by the U.S. Geological Survey, and newer flexible techniques including Weighted Regressions on Time, Season, and Discharge (WRTDS) and a generalized non-linear additive model. No single method is identified to have the greatest accuracy across all constituents, sites, and sampling scenarios. Most methods provide accurate estimates of specific conductance (used as a surrogate for total dissolved solids or specific major ions) and total nitrogen - lower accuracy is observed for the estimation of nitrate, total phosphorus and suspended sediment loads. Methods that allow for flexibility in the relation between concentration and flow conditions, specifically Beale's ratio estimator and WRTDS, exhibit greater estimation accuracy and lower bias. Evaluation of methods across simulated sampling scenarios indicates that (1) high-flow sampling is necessary to produce accurate load estimates, (2) extrapolation of sample data through time or across more extreme flow conditions reduces load estimate accuracy, and (3) WRTDS and methods that use a Kalman filter or smoothing to correct for departures between individual modeled and observed values benefit most from more frequent water-quality sampling.
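
    For reference, Beale's ratio estimator named above scales the period-mean streamflow by the ratio of mean sampled load to mean sampled flow and applies a covariance-based bias correction. The sketch below applies it to synthetic daily data; the flow and concentration series, sampling frequency, and unit conversion are illustrative assumptions, not data or results from this study.

        import numpy as np

        rng = np.random.default_rng(4)
        n_days = 365
        flow = np.exp(rng.normal(3.0, 0.8, n_days))               # synthetic daily mean flow (m^3/s)
        conc = 5 + 0.02 * flow + rng.normal(0, 0.5, n_days)       # synthetic concentration (mg/L)
        daily_load = conc * flow * 86.4                            # kg/day (mg/L * m^3/s * 86.4)

        sampled = rng.choice(n_days, size=24, replace=False)       # e.g. roughly twice-monthly sampling
        q, l = flow[sampled], daily_load[sampled]
        n, q_bar, l_bar = len(q), q.mean(), l.mean()

        s_lq = np.cov(l, q, ddof=1)[0, 1]                          # sample covariance of load and flow
        s_qq = np.var(q, ddof=1)                                   # sample variance of flow
        bias_corr = (1 + s_lq / (n * l_bar * q_bar)) / (1 + s_qq / (n * q_bar ** 2))
        annual_load = n_days * flow.mean() * (l_bar / q_bar) * bias_corr   # Beale ratio estimate (kg)

        print(f"Beale estimate: {annual_load:,.0f} kg;  true load: {daily_load.sum():,.0f} kg")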

  19. An evaluation of methods for estimating decadal stream loads

    USGS Publications Warehouse

    Lee, Casey; Hirsch, Robert M.; Schwarz, Gregory E.; Holtschlag, David J.; Preston, Stephen D.; Crawford, Charles G.; Vecchia, Aldo V.

    2016-01-01

    Effective management of water resources requires accurate information on the mass, or load of water-quality constituents transported from upstream watersheds to downstream receiving waters. Despite this need, no single method has been shown to consistently provide accurate load estimates among different water-quality constituents, sampling sites, and sampling regimes. We evaluate the accuracy of several load estimation methods across a broad range of sampling and environmental conditions. This analysis uses random sub-samples drawn from temporally-dense data sets of total nitrogen, total phosphorus, nitrate, and suspended-sediment concentration, and includes measurements of specific conductance which was used as a surrogate for dissolved solids concentration. Methods considered include linear interpolation and ratio estimators, regression-based methods historically employed by the U.S. Geological Survey, and newer flexible techniques including Weighted Regressions on Time, Season, and Discharge (WRTDS) and a generalized non-linear additive model. No single method is identified to have the greatest accuracy across all constituents, sites, and sampling scenarios. Most methods provide accurate estimates of specific conductance (used as a surrogate for total dissolved solids or specific major ions) and total nitrogen – lower accuracy is observed for the estimation of nitrate, total phosphorus and suspended sediment loads. Methods that allow for flexibility in the relation between concentration and flow conditions, specifically Beale’s ratio estimator and WRTDS, exhibit greater estimation accuracy and lower bias. Evaluation of methods across simulated sampling scenarios indicates that (1) high-flow sampling is necessary to produce accurate load estimates, (2) extrapolation of sample data through time or across more extreme flow conditions reduces load estimate accuracy, and (3) WRTDS and methods that use a Kalman filter or smoothing to correct for departures between individual modeled and observed values benefit most from more frequent water-quality sampling.

  20. Extraction of respiratory signals from the electrocardiogram and photoplethysmogram: technical and physiological determinants.

    PubMed

    Charlton, Peter H; Bonnici, Timothy; Tarassenko, Lionel; Alastruey, Jordi; Clifton, David A; Beale, Richard; Watkinson, Peter J

    2017-05-01

    Breathing rate (BR) can be estimated by extracting respiratory signals from the electrocardiogram (ECG) or photoplethysmogram (PPG). The extracted respiratory signals may be influenced by several technical and physiological factors. In this study, our aim was to determine how technical and physiological factors influence the quality of respiratory signals. Using a variety of techniques, 15 respiratory signals were extracted from the ECG, and 11 from PPG signals collected from 57 healthy subjects. The quality of each respiratory signal was assessed by calculating its correlation with a reference oral-nasal pressure respiratory signal using Pearson's correlation coefficient. Relevant results informing device design and clinical application were obtained. The results informing device design were: (i) seven out of 11 respiratory signals were of higher quality when extracted from finger PPG compared to ear PPG; (ii) laboratory equipment did not provide higher quality of respiratory signals than a clinical monitor; (iii) the ECG provided higher quality respiratory signals than the PPG; (iv) during downsampling of the ECG and PPG, significant reductions in quality were first observed at sampling frequencies of <250 Hz and <16 Hz respectively. The results informing clinical application were: (i) frequency modulation-based respiratory signals were generally of lower quality in elderly subjects compared to young subjects; (ii) the qualities of 23 out of 26 respiratory signals were reduced at elevated BRs; (iii) there were no differences associated with gender. Recommendations based on the results are provided regarding device designs for BR estimation, and clinical applications. The dataset and code used in this study are publicly available.

  1. Ballast Water Self Monitoring

    DTIC Science & Technology

    2011-11-01

    [Only fragments of this report were extracted: table-of-contents entries ("Analytical Methods"; "Estimated Capital Cost for Vessels Needing Additional Ballast Water..."), references to narrative water-quality based effluent limits and to inspection, monitoring, recordkeeping, and reporting requirements, and a note on the decline of several pelagic fish species in the Sacramento-San Joaquin River Delta through reduction of the plankton food base of the ecosystem (California State...).]

  2. The Improvement of Spatial-Temporal PM2.5 Resolution in Taiwan by Using Data Assimilation Method

    NASA Astrophysics Data System (ADS)

    Lin, Yong-Qing; Lin, Yuan-Chien

    2017-04-01

    Forecasting air pollution concentrations, e.g., the concentration of PM2.5, is of great significance for protecting human health and the environment. Accurate prediction of PM2.5 concentrations is limited by the number and data quality of air quality monitoring stations. The spatial and temporal variations of PM2.5 concentrations in Taiwan are measured by 76 National Air Quality Monitoring Stations (built by the TW-EPA). These stations are costly and scarce because of their highly precise instruments and their size. Therefore, many places are still out of the range of the National Air Quality Monitoring Stations. Recently, an enormous number of portable air quality sensors called "AirBox", developed jointly by the Taiwan government and a private company, have been deployed. By virtue of their low price and portability, AirBox sensors can provide higher-resolution space-time PM2.5 measurements. However, the spatiotemporal distribution and data quality differ between AirBox sensors and the National Air Quality Monitoring Stations. To integrate the heterogeneous PM2.5 data, a data assimilation method should be applied before further analysis. In this study, we propose a data assimilation method based on the Ensemble Kalman Filter (EnKF), a variant of the classic Kalman filter, which can be used to combine additional heterogeneous data from different sources during modeling to improve the estimation of spatial-temporal PM2.5 concentrations. The assimilation procedure exploits the advantages of the two kinds of heterogeneous data and merges them to produce the final estimate. The results show that combining AirBox PM2.5 data as additional information in our EnKF-based model yields better estimation of spatial-temporal PM2.5 concentrations and improves their space-time resolution. Under the approach proposed in this study, higher spatial-temporal resolution could provide very useful information for better spatial-temporal data analysis and further environmental management, such as air pollution source localization and micro-scale air pollution analysis. Keywords: PM2.5, Data Assimilation, Ensemble Kalman Filter, Air Quality
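
    The analysis (update) step of a stochastic, perturbed-observation ensemble Kalman filter that merges a few precise station observations with many noisier AirBox-like observations into a one-dimensional gridded PM2.5 field can be sketched as follows. The grid, ensemble size, error levels, and synthetic fields are illustrative assumptions, not the study's configuration.

        import numpy as np

        rng = np.random.default_rng(5)
        n_grid, n_ens = 100, 40                                    # grid cells, ensemble members

        # Forecast ensemble of the PM2.5 field (stand-in for a biased background forecast).
        truth = 30 + 10 * np.sin(np.linspace(0, 3 * np.pi, n_grid))
        ensemble = truth[:, None] + 6.0 + 8 * rng.standard_normal((n_grid, n_ens))

        # Observations: a few precise "station" cells plus many noisier "AirBox" cells.
        obs_idx = np.concatenate([rng.choice(n_grid, 5, replace=False),
                                  rng.choice(n_grid, 30, replace=False)])
        obs_err = np.concatenate([np.full(5, 1.0), np.full(30, 5.0)])   # observation error std dev (ug/m^3)
        obs = truth[obs_idx] + obs_err * rng.standard_normal(obs_idx.size)

        H = np.zeros((obs_idx.size, n_grid)); H[np.arange(obs_idx.size), obs_idx] = 1.0
        R = np.diag(obs_err ** 2)

        A = ensemble - ensemble.mean(axis=1, keepdims=True)        # ensemble anomalies
        P = A @ A.T / (n_ens - 1)                                  # sample forecast covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)               # Kalman gain

        # Stochastic EnKF update: each member assimilates a perturbed copy of the observations.
        obs_pert = obs[:, None] + obs_err[:, None] * rng.standard_normal((obs_idx.size, n_ens))
        analysis = ensemble + K @ (obs_pert - H @ ensemble)

        rmse_fc = np.sqrt(np.mean((ensemble.mean(1) - truth) ** 2))
        rmse_an = np.sqrt(np.mean((analysis.mean(1) - truth) ** 2))
        print(f"forecast RMSE = {rmse_fc:.2f}, analysis RMSE = {rmse_an:.2f}")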

  3. A simulation of air pollution model parameter estimation using data from a ground-based LIDAR remote sensor

    NASA Technical Reports Server (NTRS)

    Kibler, J. F.; Suttles, J. T.

    1977-01-01

    One way to obtain estimates of the unknown parameters in a pollution dispersion model is to compare the model predictions with remotely sensed air quality data. A ground-based LIDAR sensor provides relative pollution concentration measurements as a function of space and time. The measured sensor data are compared with the dispersion model output through a numerical estimation procedure to yield parameter estimates which best fit the data. This overall process is tested in a computer simulation to study the effects of various measurement strategies. Such a simulation is useful prior to a field measurement exercise to maximize the information content in the collected data. Parametric studies of simulated data matched to a Gaussian plume dispersion model indicate the trade-offs available between estimation accuracy and data acquisition strategy.
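
    The estimation step can be illustrated by fitting a crosswind Gaussian concentration profile to simulated relative-concentration data with nonlinear least squares; the profile form, noise level, and starting guesses below are illustrative assumptions rather than the dispersion model or estimation procedure of the study.

        import numpy as np
        from scipy.optimize import curve_fit

        def crosswind_profile(y, peak, y0, sigma_y):
            """Gaussian crosswind concentration profile at a fixed downwind distance."""
            return peak * np.exp(-0.5 * ((y - y0) / sigma_y) ** 2)

        rng = np.random.default_rng(6)
        y = np.linspace(-500, 500, 81)                              # crosswind positions (m)
        true = crosswind_profile(y, peak=1.0, y0=40.0, sigma_y=120.0)
        measured = true + 0.05 * rng.standard_normal(y.size)        # simulated LIDAR relative concentrations

        params, cov = curve_fit(crosswind_profile, y, measured, p0=[0.5, 0.0, 200.0])
        peak, y0, sigma_y = params
        print(f"estimated peak={peak:.2f}, centerline offset={y0:.1f} m, sigma_y={sigma_y:.1f} m")
        print("1-sigma parameter uncertainties:", np.round(np.sqrt(np.diag(cov)), 2))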

  4. Development of National Program of Cancer Registries SAS Tool for Population-Based Cancer Relative Survival Analysis.

    PubMed

    Dong, Xing; Zhang, Kevin; Ren, Yuan; Wilson, Reda; O'Neil, Mary Elizabeth

    2016-01-01

    Studying population-based cancer survival by leveraging the high-quality cancer incidence data collected by the Centers for Disease Control and Prevention's National Program of Cancer Registries (NPCR) can offer valuable insight into the cancer burden and impact in the United States. We describe the development and validation of a SAS macro tool that calculates population-based cancer site-specific relative survival estimates comparable to those obtained through SEER*Stat. The NPCR relative survival analysis SAS tool (NPCR SAS tool) was developed based on the relative survival method and SAS macros developed by Paul Dickman. NPCR cancer incidence data from 25 states submitted in November 2012 were used, specifically cases diagnosed from 2003 to 2010 with follow-up through 2010. Decennial and annual complete life tables published by the National Center for Health Statistics (NCHS) for 2000 through 2009 were used. To assess comparability between the 2 tools, 5-year relative survival rates were calculated for 25 cancer sites by sex, race, and age group using the NPCR SAS tool and the National Cancer Institute's SEER*Stat 8.1.5 software. A module to create data files for SEER*Stat was also developed for the NPCR SAS tool. Comparison of the results produced by both SAS and SEER*Stat showed comparable and reliable relative survival estimates for NPCR data. For a majority of the sites, the net differences between the NPCR SAS tool and SEER*Stat-produced relative survival estimates ranged from -0.1% to 0.1%. The estimated standard errors were highly comparable between the 2 tools as well. The NPCR SAS tool will allow researchers to accurately produce 5-year cancer relative survival estimates that are comparable to those generated by SEER*Stat for NPCR data. Comparison of output from the NPCR SAS tool and SEER*Stat provided additional quality control capabilities for evaluating data prior to producing NPCR relative survival estimates.

  5. Correction of stream quality trends for the effects of laboratory measurement bias

    USGS Publications Warehouse

    Alexander, Richard B.; Smith, Richard A.; Schwarz, Gregory E.

    1993-01-01

    We present a statistical model relating measurements of water quality to associated errors in laboratory methods. Estimation of the model allows us to correct trends in water quality for long-term and short-term variations in laboratory measurement errors. An illustration of the bias correction method for a large national set of stream water quality and quality assurance data shows that reductions in the bias of estimates of water quality trend slopes are achieved at the expense of increases in the variance of these estimates. Slight improvements occur in the precision of estimates of trend in bias by using correlative information on bias and water quality to estimate random variations in measurement bias. The results of this investigation stress the need for reliable, long-term quality assurance data and efficient statistical methods to assess the effects of measurement errors on the detection of water quality trends.

  6. 3D conditional generative adversarial networks for high-quality PET image estimation at low dose.

    PubMed

    Wang, Yan; Yu, Biting; Wang, Lei; Zu, Chen; Lalush, David S; Lin, Weili; Wu, Xi; Zhou, Jiliu; Shen, Dinggang; Zhou, Luping

    2018-07-01

    Positron emission tomography (PET) is a widely used imaging modality, providing insight into both the biochemical and physiological processes of the human body. Usually, a full-dose radioactive tracer is required to obtain high-quality PET images for clinical needs. This inevitably raises concerns about potential health hazards. On the other hand, dose reduction increases noise in the reconstructed PET images, which degrades image quality to a certain extent. In this paper, in order to reduce the radiation exposure while maintaining the high quality of PET images, we propose a novel method based on 3D conditional generative adversarial networks (3D c-GANs) to estimate high-quality full-dose PET images from low-dose ones. Generative adversarial networks (GANs) include a generator network and a discriminator network which are trained simultaneously with the goal of one beating the other. Similar to GANs, in the proposed 3D c-GANs, we condition the model on an input low-dose PET image and generate a corresponding output full-dose PET image. Specifically, to capture the same underlying information between the low-dose and full-dose PET images, a 3D U-net-like deep architecture which can combine hierarchical features by using skip connections is designed as the generator network to synthesize the full-dose image. In order to guarantee that the synthesized PET image is close to the real one, we take into account the estimation error loss in addition to the discriminator feedback to train the generator network. Furthermore, a concatenated 3D c-GANs based progressive refinement scheme is also proposed to further improve the quality of estimated images. Validation was done on a real human brain dataset including both normal subjects and subjects diagnosed with mild cognitive impairment (MCI). Experimental results show that our proposed 3D c-GANs method outperforms the benchmark methods and achieves much better performance than the state-of-the-art methods in both qualitative and quantitative measures. Copyright © 2018 Elsevier Inc. All rights reserved.
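
    A compressed sketch of the kind of training objective described above (adversarial feedback plus an estimation-error term) is given below in PyTorch; the network definitions and the weighting factor lambda_l1 are assumptions, not the paper's architecture or values.

      # Generator objective for a conditional GAN: fool the discriminator while
      # keeping the synthesized full-dose volume close (L1) to the real one.
      import torch
      import torch.nn as nn

      adv_criterion = nn.BCEWithLogitsLoss()   # adversarial loss on discriminator logits
      l1_criterion = nn.L1Loss()               # voxel-wise estimation-error loss
      lambda_l1 = 100.0                        # assumed trade-off weight

      def generator_loss(generator, discriminator, low_dose, real_full_dose):
          fake_full_dose = generator(low_dose)                 # synthesize full-dose PET
          pred = discriminator(low_dose, fake_full_dose)       # conditioned on the input
          adv = adv_criterion(pred, torch.ones_like(pred))     # discriminator feedback
          l1 = l1_criterion(fake_full_dose, real_full_dose)    # estimation error
          return adv + lambda_l1 * l1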

  7. Production and use of estimates for monitoring progress in the health sector: the case of Bangladesh

    PubMed Central

    Ahsan, Karar Zunaid; Tahsina, Tazeen; Iqbal, Afrin; Ali, Nazia Binte; Chowdhury, Suman Kanti; Huda, Tanvir M.; Arifeen, Shams El

    2017-01-01

    Background: In order to support the progress towards the post-2015 development agenda for the health sector, the importance of high-quality and timely estimates has become evident both globally and at the country level. Objective and Methods: Based on desk review, key informant interviews and expert panel discussions, the paper critically reviews health estimates from both local sources (i.e. information generated nationally by the government and other agencies) and global sources (which are mostly modeled or interpolated estimates developed by international organizations based on different sources of information), and assesses the country capacity and monitoring strategies to meet the increasing data demand in the coming years. Primarily, this paper provides a situation analysis of Bangladesh in terms of production and use of health estimates for monitoring progress towards the post-2015 development goals for the health sector. Results: The analysis reveals that Bangladesh is data rich, particularly from household surveys and health facility assessments. Practices of data utilization also exist, with wide acceptability of survey results for informing policy, programme review and course corrections. Despite high data availability from multiple sources, the country capacity for providing regular updates of major global health estimates/indicators remains low. Major challenges also include limited human resources, capacity to generate quality data and multiplicity of data sources, where discrepancy and lack of linkages among different data sources (local sources and between local and global estimates) present emerging challenges for interpretation of the resulting estimates. Conclusion: To fulfill the increased data requirement for the post-2015 era, Bangladesh needs to invest more in electronic data capture and routine health information systems. Streamlining of data sources, integration of parallel information systems into a common platform, and capacity building for data generation and analysis are recommended as priority actions for Bangladesh in the coming years. In addition to automation of routine health information systems, establishing an Indicator Reference Group for Bangladesh to analyze data; building country capacity in data quality assessment and triangulation; and feeding into global, inter-agency estimates for better reporting would address a number of the challenges mentioned above in both the short and the long run. PMID:28532305

  8. Influence of various water quality sampling strategies on load estimates for small streams

    USGS Publications Warehouse

    Robertson, Dale M.; Roerish, Eric D.

    1999-01-01

    Extensive streamflow and water quality data from eight small streams were systematically subsampled to represent various water-quality sampling strategies. The subsampled data were then used to determine the accuracy and precision of annual load estimates generated by means of a regression approach (typically used for big rivers) and to determine the most effective sampling strategy for small streams. Estimation of annual loads by regression was imprecise regardless of the sampling strategy used; even for the most effective strategy, median absolute errors of regression estimates based on daily average streamflow were about 30% relative to loads estimated with an integration method and all available data. The most effective sampling strategy depends on the length of the study. For 1-year studies, fixed-period monthly sampling supplemented by storm chasing was the most effective strategy. For studies of 2 or more years, fixed-period semimonthly sampling resulted in not only the least biased but also the most precise loads. Additional high-flow samples, typically collected to help define the relation between high streamflow and high loads, result in imprecise, overestimated annual loads if these samples are consistently collected early in high-flow events.

  9. Quality assessment of color images based on the measure of just noticeable color difference

    NASA Astrophysics Data System (ADS)

    Chou, Chun-Hsien; Hsu, Yun-Hsiang

    2014-01-01

    Accurate assessment of the quality of color images is an important step in many image processing systems that convey visual information about the reproduced images. An accurate objective image quality assessment (IQA) method is expected to give assessment results that agree closely with subjective assessment. To assess the quality of color images, many approaches simply apply a metric designed for gray-scale images to each of the three color channels of the color image, neglecting the correlation among the channels. In this paper, a metric for assessing color image quality is proposed, in which the model of variable just-noticeable color difference (VJNCD) is employed to estimate the visibility thresholds of distortion inherent in each color pixel. With the estimated visibility thresholds, the proposed metric measures the average perceptible distortion in terms of the quantized distortion according to a perceptual error map similar to that defined by the National Bureau of Standards (NBS) for converting the color difference computed by CIEDE2000 into an objective score of perceptual quality. The perceptual error map in this case is designed for each pixel according to the visibility threshold estimated by the VJNCD model. The performance of the proposed metric is verified on the test images in the LIVE database and compared with those of many well-known IQA metrics. Experimental results indicate that the proposed metric is an effective IQA method that can accurately predict the quality of color images in terms of the correlation between objective scores and subjective evaluation.

  10. Instruments evaluating the quality of the clinical learning environment in nursing education: A systematic review of psychometric properties.

    PubMed

    Mansutti, Irene; Saiani, Luisa; Grassetti, Luca; Palese, Alvisa

    2017-03-01

    The clinical learning environment is fundamental to nursing education paths, capable of affecting learning processes and outcomes. Several instruments have been developed in nursing education to evaluate the quality of clinical learning environments; however, no systematic review of the psychometric properties and methodological quality of these studies has been performed to date. The aims of the study were: 1) to identify validated instruments evaluating the clinical learning environments in nursing education; 2) to evaluate critically the methodological quality of the psychometric property estimation used; and 3) to compare psychometric properties across the instruments available. A systematic review of the literature (using the Preferred Reporting Items for Systematic Reviews and Meta-Analysis guidelines) and an evaluation of the methodological quality of psychometric properties (using the COnsensus-based Standards for the selection of health Measurement INstruments guidelines) were performed. The Medline and CINAHL databases were searched. Eligible studies were those that satisfied the following criteria: a) validation studies of instruments evaluating the quality of clinical learning environments; b) in nursing education; c) published in English or Italian; d) before April 2016. The included studies were evaluated for the methodological quality of the psychometric properties measured and then compared in terms of both the psychometric properties and the methodological quality of the processes used. The search strategy yielded a total of 26 studies and eight clinical learning environment evaluation instruments. A variety of psychometric properties have been estimated for each instrument, with differing qualities in the methodology used. Concept and construct validity were poorly assessed in terms of their significance and rarely judged by the target population (nursing students). Some properties were rarely considered (e.g., reliability, measurement error, criterion validity), whereas others were frequently estimated but with different coefficients and statistical analyses (e.g., internal consistency, structural validity), thus rendering comparison across instruments difficult. Moreover, the methodological quality adopted in the property assessments was poor or fair in most studies, compromising the goodness of the psychometric values estimated. Clinical learning placements represent key strategies in educating the future nursing workforce: instruments evaluating the quality of the settings, as well as their capacity to promote significant learning, are strongly recommended. Studies estimating psychometric properties with higher-quality research methodologies are needed in order to support nursing educators in the process of clinical placement accreditation and quality improvement. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Grid mapping: a novel method of signal quality evaluation on a single lead electrocardiogram.

    PubMed

    Li, Yanjun; Tang, Xiaoying

    2017-12-01

    Diagnosis of long-term electrocardiogram (ECG) recordings calls for automatic and accurate methods of ECG signal quality estimation, not only to lighten the burden on physicians but also to avoid misdiagnoses. In this paper, a novel waveform-based method of phase-space reconstruction for signal quality estimation on a single-lead ECG was proposed by projecting the amplitude of the ECG and its first-order difference into grid cells. The waveform of a single-lead ECG was divided into non-overlapping episodes (Ts = 10, 20, 30 s), and the number of grids in both the width and the height of each map was in the range [20, 100] (NX = NY = 20, 30, 40, … 90, 100). The blank pane ratio (BPR) and the entropy were calculated from the distribution of ECG sampling points projected into the grid cells. The signal quality indices (SQIs) bSQI and eSQI were calculated from the BPR and the entropy, respectively. The MIT-BIH Noise Stress Test Database was used to test the performance of bSQI and eSQI for ECG signal quality estimation. The signal-to-noise ratio (SNR) during the noisy segments of the ECG records in the database is 24, 18, 12, 6, 0 and -6 dB, respectively. For the quantitative SQI analysis, the records were divided into three groups: good quality (24, 18 dB), moderate quality (12, 6 dB) and bad quality (0, -6 dB). Classification among the good, moderate and bad quality groups was performed by a linear support-vector machine using the combination of the BPR, the entropy, the bSQI and the eSQI. The classification accuracy was 82.4% and Cohen's Kappa coefficient was 0.74 with NX = 40 and Ts = 20 s. In conclusion, the novel grid mapping offers an intuitive and simple approach to signal quality estimation on a single-lead ECG.
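
    The grid-mapping features can be sketched in a few lines; the re-implementation below is illustrative only (grid size, episode length, and the synthetic signal are assumptions, and the paper's SQI thresholds are not reproduced).

      # Project (amplitude, first difference) points of an ECG episode onto an
      # NX-by-NY grid, then compute the blank pane ratio and occupancy entropy.
      import numpy as np

      def grid_quality_features(ecg, nx=40, ny=40):
          x = ecg[:-1]                                  # amplitude
          y = np.diff(ecg)                              # first-order difference
          hist, _, _ = np.histogram2d(x, y, bins=[nx, ny])
          occupied = hist > 0
          bpr = 1.0 - occupied.sum() / (nx * ny)        # fraction of empty cells
          p = hist[occupied] / hist.sum()               # occupancy distribution
          entropy = -(p * np.log2(p)).sum()             # Shannon entropy of the map
          return bpr, entropy

      fs, t = 360.0, np.arange(0, 20, 1 / 360.0)        # one 20-s episode at 360 Hz
      clean = np.sin(2 * np.pi * 1.2 * t)               # crude stand-in for an ECG
      noisy = clean + 0.5 * np.random.default_rng(0).standard_normal(t.size)
      print("clean:", grid_quality_features(clean))     # compare features for clean vs. noisy
      print("noisy:", grid_quality_features(noisy))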

  12. Application of online measures to monitor and evaluate multiplatform fusion performance

    NASA Astrophysics Data System (ADS)

    Stubberud, Stephen C.; Kowalski, Charlene; Klamer, Dale M.

    1999-07-01

    A primary concern of multiplatform data fusion is assessing the quality and utility of data shared among platforms. Constraints such as platform and sensor capability and task load necessitate development of an on-line system that computes a metric to determine which other platform can provide the best data for processing. To determine data quality, we are implementing an approach based on entropy coupled with intelligent agents. Entropy measures the quality of processed information such as localization, classification, and ambiguity in measurement-to-track association. Lower entropy scores imply less uncertainty about a particular target. When new information is provided, we compute the level of improvement a particular track obtains from one measurement to another. The measure permits us to evaluate the utility of the new information. We couple entropy with intelligent agents that provide two main data-gathering functions: estimation of another platform's performance and evaluation of the new measurement data's quality. Both functions result from the entropy metric. The intelligent agent on a platform makes an estimate of another platform's measurement and provides it to its own fusion system, which can then incorporate it for a particular target. A resulting entropy measure is then calculated and returned to its own agent. From this metric, the agent determines a perceived value of the offboard platform's measurement. If the value is satisfactory, the agent requests the measurement from the other platform, usually by interacting with the other platform's agent. Once the actual measurement is received, entropy is again computed and the agent assesses its estimation process and refines it accordingly.
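
    For a Gaussian track estimate, the entropy-based utility of a candidate measurement reduces to a covariance determinant comparison; the short sketch below illustrates that idea with invented covariances and is not drawn from the cited system.

      # Differential entropy of a Gaussian track estimate and the entropy
      # reduction ("utility") obtained by incorporating a measurement.
      import numpy as np

      def gaussian_entropy(P):
          """H = 0.5 * ln((2*pi*e)^n * det(P)), in nats."""
          n = P.shape[0]
          return 0.5 * (n * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(P)))

      P_before = np.diag([100.0, 100.0, 25.0, 25.0])   # position/velocity covariance
      P_after = np.diag([40.0, 45.0, 20.0, 22.0])      # covariance after a Kalman update
      print("entropy reduction:", gaussian_entropy(P_before) - gaussian_entropy(P_after))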

  13. Data Quality Control Tools Applied to Seismo-Acoustic Arrays in Korea

    NASA Astrophysics Data System (ADS)

    Park, J.; Hayward, C.; Stump, B. W.

    2017-12-01

    We assess data quality (data gaps, seismometer orientation, timing errors, noise levels and coherence between co-located sensors) for seismic and infrasound data in South Korea using six seismo-acoustic arrays, BRDAR, CHNAR, KSGAR, KMPAR, TJIAR, and YPDAR, cooperatively operated by Southern Methodist University and the Korea Institute for Geosciences and Mineral Resources. Timing errors associated with seismometers can be found based on estimated changes in instrument orientation calculated from RMS errors between the reference array and each array seismometer using waveforms filtered from 0.1 to 0.35 Hz. Noise levels of seismic and infrasound data are analyzed to investigate local environmental effects and seasonal noise variation. In order to examine the spectral properties of the noise, the waveforms are analyzed using Welch's method (Welch, 1967), which produces a single power spectral estimate from an average of spectra taken at regular intervals over a specific time period. This analysis quantifies the range of noise conditions found at each of the arrays over the given time period. We take advantage of the fact that infrasound sensors are co-located or closely located to one another, which allows for a direct comparison of sensors, following the method by Ringler et al. (2010). The power level differences between two sensors at the same array in the frequency band of interest are used to monitor temporal changes in data quality and instrument conditions. A data quality factor is assigned to stations based on the average values of temporal changes estimated in the frequency and time domains. These monitoring tools enable us to automatically assess technical issues related to the instruments and data quality at each seismo-acoustic array as well as to investigate local environmental effects and seasonal variations in both seismic and infrasound data.
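
    The averaged-spectrum noise check can be reproduced with SciPy's implementation of Welch's method; the sampling rate, segment length, and synthetic trace below are placeholder assumptions rather than the arrays' actual parameters.

      # Welch power spectral density estimate for a (synthetic) one-hour trace,
      # summarized over the 0.1-0.35 Hz band used for the orientation/timing checks.
      import numpy as np
      from scipy.signal import welch

      fs = 40.0                                             # assumed sample rate (Hz)
      t = np.arange(0, 3600, 1 / fs)
      x = np.random.default_rng(1).standard_normal(t.size)  # stand-in for a seismic trace

      f, pxx = welch(x, fs=fs, nperseg=4096)                # averaged PSD (Welch, 1967)
      band = (f >= 0.1) & (f <= 0.35)
      print("mean PSD in 0.1-0.35 Hz band:", pxx[band].mean())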

  14. Optimizing fish sampling for fish - mercury bioaccumulation factors

    USGS Publications Warehouse

    Scudder Eikenberry, Barbara C.; Riva-Murray, Karen; Knightes, Christopher D.; Journey, Celeste A.; Chasar, Lia C.; Brigham, Mark E.; Bradley, Paul M.

    2015-01-01

    Fish Bioaccumulation Factors (BAFs; ratios of mercury (Hg) in fish (Hgfish) and water (Hgwater)) are used to develop Total Maximum Daily Load and water quality criteria for Hg-impaired waters. Both applications require representative Hgfish estimates and, thus, are sensitive to sampling and data-treatment methods. Data collected by fixed protocol from 11 streams in 5 states distributed across the US were used to assess the effects of Hgfish normalization/standardization methods and fish sample numbers on BAF estimates. Fish length, followed by weight, was most correlated to adult top-predator Hgfish. Site-specific BAFs based on length-normalized and standardized Hgfish estimates demonstrated up to 50% less variability than those based on non-normalized Hgfish. Permutation analysis indicated that length-normalized and standardized Hgfish estimates based on at least 8 trout or 5 bass resulted in mean Hgfish coefficients of variation less than 20%. These results are intended to support regulatory mercury monitoring and load-reduction program improvements.

  15. Risk-based decision making to manage water quality failures caused by combined sewer overflows

    NASA Astrophysics Data System (ADS)

    Sriwastava, A. K.; Torres-Matallana, J. A.; Tait, S.; Schellart, A.

    2017-12-01

    Regulatory authorities set environmental permits for water utilities such that the combined sewer overflows (CSOs) managed by these companies conform to the regulations. These utility companies face the risk of paying penalties or receiving negative publicity if they breach the environmental permit. These risks can be addressed by designing appropriate solutions, such as investing in additional infrastructure that improves the system capacity and reduces the impact of CSO spills. The performance of these solutions is often estimated using urban drainage models, so any uncertainty in these models can have a significant effect on the decision-making process. This study outlines a risk-based decision-making approach to address water quality failures caused by CSO spills. A calibrated lumped urban drainage model is used to simulate CSO spill quality in the Haute-Sûre catchment in Luxembourg. Uncertainty in rainfall and model parameters is propagated through Monte Carlo simulations to quantify uncertainty in the concentration of ammonia in the CSO spill. A combination of decision alternatives, such as the construction of a storage tank at the CSO and the reduction of the flow contribution of catchment surfaces, is selected as planning measures to avoid the water quality failure. Failure is defined as exceedance, with a certain frequency, of a concentration-duration threshold based on Austrian emission standards for ammonia (De Toffol, 2006). For each decision alternative, uncertainty quantification results in a probability distribution of the number of annual CSO spill events which exceed the threshold, from which a buffered failure probability, as defined in Rockafellar & Royset (2010), is estimated. The buffered failure probability (pbf) is a conservative estimate of the failure probability (pf); unlike the failure probability, however, it includes information about the upper tail of the distribution. A Pareto-optimal set of solutions is obtained by performing mean-pbf optimization. The effectiveness of using the buffered failure probability rather than the failure probability is tested by comparing the solutions obtained from mean-pbf and mean-pf optimizations.
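
    The distinction between the two risk measures can be shown with a small Monte Carlo sketch; the sample distribution and threshold below are invented for illustration, and the estimator is a generic one rather than the study's code.

      # Failure probability vs. buffered failure probability (Rockafellar & Royset):
      # the buffered probability is the tail mass whose conditional mean equals z.
      import numpy as np

      def failure_probability(samples, z):
          return np.mean(samples > z)

      def buffered_failure_probability(samples, z):
          s = np.sort(samples)[::-1]                            # largest first
          tail_means = np.cumsum(s) / np.arange(1, s.size + 1)  # mean of top-k samples
          k = np.sum(tail_means >= z)                           # largest k with tail mean >= z
          return k / s.size

      rng = np.random.default_rng(2)
      annual_exceedances = rng.lognormal(mean=1.0, sigma=0.6, size=20000)  # spill events/yr
      z = 6.0                                                   # assumed compliance threshold
      print("p_f  =", failure_probability(annual_exceedances, z))
      print("p_bf =", buffered_failure_probability(annual_exceedances, z))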

  16. Runoff load estimation of particulate and dissolved nitrogen in Lake Inba watershed using continuous monitoring data on turbidity and electric conductivity.

    PubMed

    Kim, J; Nagano, Y; Furumai, H

    2012-01-01

    Easy-to-measure surrogate parameters for water quality indicators are needed for real-time monitoring as well as for generating data for model calibration and validation. In this study, a novel linear regression model for estimating total nitrogen (TN) from two surrogate parameters is proposed and used to evaluate pollutant loads flowing into a eutrophic lake. Based on their runoff characteristics during wet weather, turbidity and electric conductivity (EC) were selected as surrogates for particulate nitrogen (PN) and dissolved nitrogen (DN), respectively. Strong linear relationships were established between PN and turbidity and between DN and EC, and the two models were subsequently combined for estimation of TN. The model was evaluated by comparing estimated and observed TN runoff loads during rainfall events. This analysis showed that turbidity and EC are viable surrogates for PN and DN, respectively, and that the linear regression model for TN concentration successfully estimated TN runoff loads during rainfall events and also under dry weather conditions.
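
    The surrogate-regression idea can be written out in a few lines; the coefficients below come from synthetic data, not from the Lake Inba monitoring record.

      # Fit PN ~ turbidity and DN ~ EC, then estimate TN as the sum of the two.
      import numpy as np

      rng = np.random.default_rng(3)
      turbidity = rng.uniform(5, 300, 100)                              # NTU
      ec = rng.uniform(100, 400, 100)                                   # uS/cm
      pn = 0.004 * turbidity + 0.10 + 0.05 * rng.standard_normal(100)   # mg/L (synthetic)
      dn = 0.006 * ec + 0.30 + 0.05 * rng.standard_normal(100)          # mg/L (synthetic)

      b_pn = np.polyfit(turbidity, pn, 1)        # linear PN-turbidity relationship
      b_dn = np.polyfit(ec, dn, 1)               # linear DN-EC relationship

      def estimate_tn(turb_obs, ec_obs):
          return np.polyval(b_pn, turb_obs) + np.polyval(b_dn, ec_obs)

      print("TN at turbidity=120 NTU, EC=250 uS/cm:", round(estimate_tn(120.0, 250.0), 2), "mg/L")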

  17. Risk-based indicators of Canadians' exposures to environmental carcinogens.

    PubMed

    Setton, Eleanor; Hystad, Perry; Poplawski, Karla; Cheasley, Roslyn; Cervantes-Larios, Alejandro; Keller, C Peter; Demers, Paul A

    2013-02-12

    Tools for estimating population exposures to environmental carcinogens are required to support evidence-based policies to reduce chronic exposures and associated cancers. Our objective was to develop indicators of population exposure to selected environmental carcinogens that can be easily updated over time, and allow comparisons and prioritization between different carcinogens and exposure pathways. We employed a risk assessment-based approach to produce screening-level estimates of lifetime excess cancer risk for selected substances listed as known carcinogens by the International Agency for Research on Cancer. Estimates of lifetime average daily intake were calculated using population characteristics combined with concentrations (circa 2006) in outdoor air, indoor air, dust, drinking water, and food and beverages from existing monitoring databases or comprehensive literature reviews. Intake estimates were then multiplied by cancer potency factors from Health Canada, the United States Environmental Protection Agency, and the California Office of Environmental Health Hazard Assessment to estimate lifetime excess cancer risks associated with each substance and exposure pathway. Lifetime excess cancer risks in excess of 1 per million people are identified as potential priorities for further attention. Based on data representing average conditions circa 2006, a total of 18 carcinogen-exposure pathways had potential lifetime excess cancer risks greater than 1 per million, based on varying data quality. Carcinogens with moderate to high data quality and lifetime excess cancer risk greater than 1 per million included benzene, 1,3-butadiene and radon in outdoor air; benzene and radon in indoor air; and arsenic and hexavalent chromium in drinking water. Important data gaps were identified for asbestos, hexavalent chromium and diesel exhaust in outdoor and indoor air, while little data were available to assess risk for substances in dust, food and beverages. The ability to track changes in potential population exposures to environmental carcinogens over time, as well as to compare between different substances and exposure pathways, is necessary to support comprehensive, evidence-based prevention policy. We used estimates of lifetime excess cancer risk as indicators that, although based on a number of simplifying assumptions, help to identify important data gaps and prioritize more detailed data collection and exposure assessment needs.
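
    The screening-level calculation can be illustrated with one worked example; all of the numbers below are placeholders, not values from the indicator database.

      # Lifetime excess cancer risk = lifetime average daily intake x cancer potency factor.
      concentration = 1.5e-3      # mg/m^3 of a carcinogen in outdoor air (assumed)
      inhalation_rate = 16.0      # m^3/day (assumed adult inhalation rate)
      body_weight = 70.0          # kg
      exposure_fraction = 1.0     # fraction of lifetime exposed

      lifetime_avg_daily_intake = (concentration * inhalation_rate * exposure_fraction
                                   / body_weight)        # mg/kg-day
      slope_factor = 2.7e-2       # (mg/kg-day)^-1, hypothetical potency factor

      risk = lifetime_avg_daily_intake * slope_factor
      print(f"lifetime excess cancer risk: {risk:.1e} ({risk * 1e6:.1f} per million)")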

  18. Michigan's forests 1993: An analysis. Forest Service resource bulletin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, T.L.; Spencer, J.S.; Bertsch, R.

    1997-02-04

    Michigan's forests are abundant, diverse, healthy, productive, and expanding. These forests make important contributions to the quality of life by providing a wide array of benefits, including wildlife habitat, biological diversity, outdoor recreation, improved air and water quality, and economic resources such as the estimated $12 billion of value added and 200,000 jobs annually supported by forest-based industries/tourism/recreation.

  19. Quantifying variability: patterns in water quality and biota from a long-term, multi-stream dataset

    Treesearch

    Camille Flinders; Douglas McLaughlin

    2016-01-01

    Effective water resources assessment and management requires quantitative information on the variability of ambient and biological conditions in aquatic communities. Although it is understood that natural systems are variable, robust estimates of variation in water quality and biotic endpoints (e.g. community-based structure and function metrics) are rare in US waters...

  20. Comparing image quality of print-on-demand books and photobooks from web-based vendors

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan; Bajorski, Peter; Burns, Peter; Fredericks, Erin; Rosen, Mitchell

    2010-01-01

    Because of the emergence of e-commerce and developments in print engines designed for economical output of very short runs, there are increased business opportunities and consumer options for print-on-demand books and photobooks. The current state of these printing modes allows for direct uploading of book files via the web, printing on nonoffset printers, and distributing by standard parcel or mail delivery services. The goal of this research is to assess the image quality of print-on-demand books and photobooks produced by various Web-based vendors and to identify correlations between psychophysical results and objective metrics. Six vendors were identified for one-off (single-copy) print-on-demand books, and seven vendors were identified for photobooks. Participants rank ordered overall quality of a subset of individual pages from each book, where the pages included text, photographs, or a combination of the two. Observers also reported overall quality ratings and price estimates for the bound books. Objective metrics of color gamut, color accuracy, accuracy of International Color Consortium profile usage, eye-weighted root mean square L*, and cascaded modulation transfer acutance were obtained and compared to the observer responses. We introduce some new methods for normalizing data as well as for strengthening the statistical significance of the results. Our approach includes the use of latent mixed-effect models. We found statistically significant correlation with overall image quality and some of the spatial metrics, but correlations between psychophysical results and other objective metrics were weak or nonexistent. Strong correlation was found between psychophysical results of overall quality assessment and estimated price associated with quality. The photobook set of vendors reached higher image-quality ratings than the set of print-on-demand vendors. However, the photobook set had higher image-quality variability.

  1. Derivation of freshwater quality criteria for zinc using interspecies correlation estimation models to protect aquatic life in China.

    PubMed

    Feng, C L; Wu, F C; Dyer, S D; Chang, H; Zhao, X L

    2013-01-01

    Species sensitivity distributions (SSDs) are usually used in the development of water quality criteria and require a large number of toxicity values to define a hazard level that protects the majority of species. However, toxicity data for certain chemicals are limited, especially for endangered and threatened species. Thus, it is important to predict unknown species toxicity from available toxicity data. To address this need, interspecies correlation estimation (ICE) models were developed by the US EPA to predict the acute toxicity of chemicals to diverse species based on a more limited data set of surrogate species toxicity data. Use of SSDs generated from ICE models allows the prediction of protective water quality criteria, such as the HC5 (hazard concentration, 5th percentile). In the present study, we tested this concept using toxicity data collected for zinc. ICE-based SSDs were generated using three surrogate species (common carp (Cyprinus carpio), rainbow trout (Oncorhynchus mykiss), and Daphnia magna) and compared with the SSD based on measured data and its corresponding HC5. The results showed no significant differences between the ICE-based and measured-data-based SSDs and HC5s. Furthermore, the examination of species placements within the SSDs indicated that the species most sensitive to zinc were invertebrates, especially crustaceans. Given the similarity of the SSDs and HC5s for zinc, the use of ICE to derive potential water quality criteria for diverse chemicals in China is proposed. Further, a combination of measured and ICE-derived data will prove useful for assessing water quality and chemical risks in the near future. Copyright © 2012 Elsevier Ltd. All rights reserved.
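
    A stripped-down illustration of the SSD/HC5 step (with invented toxicity values, not the zinc dataset used in the paper) is shown below: fit a log-normal species sensitivity distribution and read off the 5th percentile.

      # Fit a log-normal SSD to acute toxicity values and estimate the HC5.
      import numpy as np
      from scipy import stats

      lc50_ugL = np.array([90., 150., 280., 410., 560., 900., 1300., 2100., 3500., 5200.])
      log_vals = np.log10(lc50_ugL)

      mu, sigma = log_vals.mean(), log_vals.std(ddof=1)      # log10-normal fit
      hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)  # 5th percentile of the SSD
      print(f"HC5 estimate: {hc5:.1f} ug/L")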

  2. Average effect estimates remain similar as evidence evolves from single trials to high-quality bodies of evidence: a meta-epidemiologic study.

    PubMed

    Gartlehner, Gerald; Dobrescu, Andreea; Evans, Tammeka Swinson; Thaler, Kylie; Nussbaumer, Barbara; Sommer, Isolde; Lohr, Kathleen N

    2016-01-01

    The objective of our study was to use a diverse sample of medical interventions to assess empirically whether first trials yield treatment effect estimates that differ substantially from those of reliable, high-quality bodies of evidence. We used a meta-epidemiologic study design based on 100 randomly selected bodies of evidence from Cochrane reports that had been graded as high-quality evidence. To determine the concordance of effect estimates between first and subsequent trials, we applied both quantitative and qualitative approaches. For quantitative assessment, we used Lin's concordance correlation and calculated z-scores; to determine the magnitude of differences of treatment effects, we calculated standardized mean differences (SMDs) and ratios of relative risks. We determined qualitative concordance based on a two-tiered approach incorporating changes in statistical significance and magnitude of effect. First trials both overestimated and underestimated the true treatment effects in no discernible pattern. Nevertheless, depending on the definition of concordance, effect estimates of first trials were concordant with pooled subsequent studies in at least 33% but up to 50% of comparisons. The pooled magnitude of change as bodies of evidence advanced from single trials to high-quality bodies of evidence was 0.16 SMD [95% confidence interval (CI): 0.12, 0.21]. In 80% of comparisons, the difference in effect estimates was smaller than 0.5 SMDs. In first trials with large treatment effects (>0.5 SMD), however, estimates of effect substantially changed as new evidence accrued (mean change 0.68 SMD; 95% CI: 0.50, 0.86). Results of first trials often change, but the magnitude of change, on average, is small. Exceptions are first trials that present large treatment effects, which often dissipate as new evidence accrues. Copyright © 2016 Elsevier Inc. All rights reserved.
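
    Lin's concordance correlation coefficient, the quantitative agreement measure named above, is straightforward to compute; the paired effect estimates below are invented for illustration.

      # Lin's concordance correlation coefficient for paired effect estimates
      # (first trial vs. pooled subsequent trials).
      import numpy as np

      def lins_ccc(x, y):
          x, y = np.asarray(x, float), np.asarray(y, float)
          s_xy = np.cov(x, y, ddof=1)[0, 1]
          return 2 * s_xy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

      first_trial_smd = [0.42, -0.10, 0.75, 0.05, 0.30]
      pooled_smd = [0.35, -0.02, 0.55, 0.08, 0.28]
      print("Lin's CCC:", round(lins_ccc(first_trial_smd, pooled_smd), 3))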

  3. Improving the quality of parameter estimates obtained from slug tests

    USGS Publications Warehouse

    Butler, J.J.; McElwee, C.D.; Liu, W.

    1996-01-01

    The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines have been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.

  4. Economics of Team-based Care in Controlling Blood Pressure: A Community Guide Systematic Review

    PubMed Central

    Jacob, Verughese; Chattopadhyay, Sajal K.; Thota, Anilkrishna B.; Proia, Krista K.; Njie, Gibril; Hopkins, David P.; Finnie, Ramona K.C.; Pronk, Nicolaas P.; Kottke, Thomas E.

    2015-01-01

    Context: High blood pressure is an important risk factor for cardiovascular disease (CVD) and stroke, the leading cause of death in the U.S., and a substantial national burden through lost productivity and medical care. A recent Community Guide systematic review found strong evidence of effectiveness of team-based care in improving blood pressure control. The objective of the present review was to determine from the economic literature whether team-based care for blood pressure control is cost-beneficial and/or cost-effective. Evidence acquisition: Electronic databases of papers published January 1980 – May 2012 were searched to find economic evaluations of team-based care interventions to improve blood pressure outcomes, yielding 31 studies for inclusion. Evidence synthesis: In analyses conducted in 2012, intervention cost, healthcare cost averted, benefit-to-cost ratios, and cost-effectiveness were abstracted from the studies. The quality of estimates for intervention and healthcare cost from each study was assessed using three elements: intervention focus on blood pressure control; incremental estimates in the intervention group relative to a control group; and inclusion of major cost-driving elements in estimates. Intervention cost per unit reduction in systolic blood pressure was converted to lifetime intervention cost per quality-adjusted life-year (QALY) saved using algorithms from published trials. Conclusion: Team-based care to improve blood pressure control is cost-effective based on evidence that 26 of 28 estimates of $/QALY gained from 10 studies were below a conservative threshold of $50,000. This finding is salient to recent health care reforms in the U.S. and coordinated patient-centered care through formation of Accountable Care Organizations (ACOs). PMID:26477804

  5. Performance and effects of land cover type on synthetic surface reflectance data and NDVI estimates for assessment and monitoring of semi-arid rangeland

    USGS Publications Warehouse

    Olexa, Edward M.; Lawrence, Rick L

    2014-01-01

    Federal land management agencies provide stewardship over much of the rangelands in the arid and semi-arid western United States, but they often lack data of the proper spatiotemporal resolution and extent needed to assess range conditions and monitor trends. Recent advances in the blending of complementary, remotely sensed data could provide public lands managers with the needed information. We applied the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) to five Landsat TM and concurrent Terra MODIS scenes, and used pixel-based regression and difference image analyses to evaluate the quality of synthetic reflectance and NDVI products associated with semi-arid rangeland. Predicted red reflectance data consistently demonstrated higher accuracy, less bias, and stronger correlation with observed data than did analogous near-infrared (NIR) data. The accuracy of both bands tended to decline as the lag between base and prediction dates increased; however, mean absolute errors (MAE) were typically ≤10%. The quality of area-wide NDVI estimates was less consistent than either spectral band, although the MAE of estimates predicted using early season base pairs were ≤10% throughout the growing season. Correlation between known and predicted NDVI values and agreement with the 1:1 regression line tended to decline as the prediction lag increased. Further analyses of NDVI predictions, based on a 22 June base pair and stratified by land cover/land use (LCLU), revealed accurate estimates through the growing season; however, inter-class performance varied. This work demonstrates the successful application of the STARFM algorithm to semi-arid rangeland; however, we encourage evaluation of STARFM's performance on a per-product basis, stratified by LCLU, with attention given to the influence of base pair selection and the impact of the time lag.

  6. Impact of infectious diseases on population health using incidence-based disability-adjusted life years (DALYs): results from the Burden of Communicable Diseases in Europe study, European Union and European Economic Area countries, 2009 to 2013

    PubMed Central

    Cassini, Alessandro; Colzani, Edoardo; Pini, Alessandro; Mangen, Marie-Josee J; Plass, Dietrich; McDonald, Scott A; Maringhini, Guido; van Lier, Alies; Haagsma, Juanita A; Havelaar, Arie H; Kramarz, Piotr; Kretzschmar, Mirjam E

    2018-01-01

    Background and aims The Burden of Communicable Diseases in Europe (BCoDE) study aimed to calculate disability-adjusted life years (DALYs) for 31 selected diseases in the European Union (EU) and European Economic Area (EEA). Methods: DALYs were estimated using an incidence-based and pathogen-based approach. Incidence was estimated through assessment of data availability and quality, and a correction was applied for under-estimation. Calculation of DALYs was performed with the BCoDE software toolkit without applying time discounting and age-weighting. Results: We estimated that one in 14 inhabitants experienced an infectious disease episode for a total burden of 1.38 million DALYs (95% uncertainty interval (UI): 1.25–1.5) between 2009 and 2013; 76% of which was related to the acute phase of the infection and its short-term complications. Influenza had the highest burden (30% of the total burden), followed by tuberculosis, human immunodeficiency virus (HIV) infection/AIDS and invasive pneumococcal disease (IPD). Men had the highest burden measured in DALYs (60% of the total), adults 65 years of age and over had 24% and children less than 5 years of age had 11%. Age group-specific burden showed that infants (less than 1 year of age) and elderly people (80 years of age and over) experienced the highest burden. Conclusions: These results provide baseline estimates for evaluating infectious disease prevention and control strategies. The study promotes an evidence-based approach to describing population health and assessing surveillance data availability and quality, and provides information for the planning and prioritisation of limited resources in infectious disease prevention and control. PMID:29692315

  7. Impact of infectious diseases on population health using incidence-based disability-adjusted life years (DALYs): results from the Burden of Communicable Diseases in Europe study, European Union and European Economic Area countries, 2009 to 2013.

    PubMed

    Cassini, Alessandro; Colzani, Edoardo; Pini, Alessandro; Mangen, Marie-Josee J; Plass, Dietrich; McDonald, Scott A; Maringhini, Guido; van Lier, Alies; Haagsma, Juanita A; Havelaar, Arie H; Kramarz, Piotr; Kretzschmar, Mirjam E; On Behalf Of The BCoDE Consortium

    2018-04-01

    Background and aims: The Burden of Communicable Diseases in Europe (BCoDE) study aimed to calculate disability-adjusted life years (DALYs) for 31 selected diseases in the European Union (EU) and European Economic Area (EEA). Methods: DALYs were estimated using an incidence-based and pathogen-based approach. Incidence was estimated through assessment of data availability and quality, and a correction was applied for under-estimation. Calculation of DALYs was performed with the BCoDE software toolkit without applying time discounting and age-weighting. Results: We estimated that one in 14 inhabitants experienced an infectious disease episode for a total burden of 1.38 million DALYs (95% uncertainty interval (UI): 1.25-1.5) between 2009 and 2013; 76% of which was related to the acute phase of the infection and its short-term complications. Influenza had the highest burden (30% of the total burden), followed by tuberculosis, human immunodeficiency virus (HIV) infection/AIDS and invasive pneumococcal disease (IPD). Men had the highest burden measured in DALYs (60% of the total), adults 65 years of age and over had 24% and children less than 5 years of age had 11%. Age group-specific burden showed that infants (less than 1 year of age) and elderly people (80 years of age and over) experienced the highest burden. Conclusions: These results provide baseline estimates for evaluating infectious disease prevention and control strategies. The study promotes an evidence-based approach to describing population health and assessing surveillance data availability and quality, and provides information for the planning and prioritisation of limited resources in infectious disease prevention and control.

  8. [Effective coverage to manage domestic violence against women in Mexican municipalities: limits of metrics].

    PubMed

    Viviescas-Vargas, Diana P; Idrovo, Alvaro Javier; López-López, Erika; Uicab-Pool, Gloria; Herrera-Trujillo, Mónica; Balam-Gómez, Maricela; Hidalgo-Solórzano, Elisa

    2013-08-01

    The study estimated the effective coverage of primary care health services for the management of domestic violence against women in three municipalities in Mexico. We estimated the prevalence and severity of violence using a validated scale, and estimated effective coverage following the approach proposed by Shengelia and colleagues, with some modifications. Care was considered to be of adequate quality when it included a suggestion to report the violence to the authorities. The use and quality of care were low in the three municipalities analyzed, and care was used most frequently when there was sexual or physical violence. Effective coverage was 29.41%, 16.67% and zero in Guachochi, Jojutla and Tizimín, respectively. The effective coverage indicator had difficulties in measuring events and responses that were not based on biomedical models. The findings suggest that the indicator can be improved by incorporating other dimensions of quality.

  9. Streamflow and Nutrient Fluxes of the Mississippi-Atchafalaya River Basin and Subbasins for the Period of Record Through 2005

    USGS Publications Warehouse

    Aulenbach, Brent T.; Buxton, Herbert T.; Battaglin, William A.; Coupe, Richard H.

    2007-01-01

    The U.S. Geological Survey has monitored streamflow and water quality systematically in the Mississippi-Atchafalaya River Basin (MARB) for more than five decades. This report provides streamflow and estimates of nutrient delivery (flux) to the Gulf of Mexico from both the Atchafalaya River and the main stem of the Mississippi River. This report provides streamflow and nutrient flux estimates for nine major subbasins of the Mississippi River. This report also provides streamflow and flux estimates for 21 selected subbasins of various sizes, hydrology, land use, and geographic location within the Basin. The information is provided at each station for the period for which sufficient water-quality data are available to make statistically based flux estimates (starting as early as water year 1960 and going through water year 2005). Nutrient fluxes are estimated using the adjusted maximum likelihood estimate, a type of regression-model method; nutrient fluxes to the Gulf of Mexico also are estimated using the composite method. Regression models were calibrated using a 5-year moving calibration period; the model was used to estimate the last year of the calibration period. Nutrient flux estimates are provided for six water-quality constituents: dissolved nitrite plus nitrate, total organic nitrogen plus ammonia nitrogen (total Kjeldahl nitrogen), dissolved ammonia, total phosphorus, dissolved orthophosphate, and dissolved silica. Additionally, the contribution of streamflow and net nutrient flux for five large subbasins comprising the MARB was determined from streamflow and nutrient fluxes from seven of the aforementioned major subbasins. These five large subbasins are: 1. Lower Mississippi, 2. Upper Mississippi, 3. Ohio/Tennessee, 4. Missouri, and 5. Arkansas/Red.

  10. A software sensor model based on hybrid fuzzy neural network for rapid estimation water quality in Guangzhou section of Pearl River, China.

    PubMed

    Zhou, Chunshan; Zhang, Chao; Tian, Di; Wang, Ke; Huang, Mingzhi; Liu, Yanbiao

    2018-01-02

    In order to manage water resources, a software sensor model was designed to estimate water quality using a hybrid fuzzy neural network (FNN) in the Guangzhou section of the Pearl River, China. The software sensor system was composed of a data storage module, a fuzzy decision-making module, a neural network module and a fuzzy reasoning generator module. Fuzzy subtractive clustering was employed to capture the characteristics of the model and to optimize the network architecture for enhanced network performance. The results indicate that, on the basis of available on-line measured variables, the software sensor model can accurately predict water quality according to the relationship between chemical oxygen demand (COD) and dissolved oxygen (DO), pH and NH4+-N. Owing to its ability to recognize time-series patterns and non-linear characteristics, the FNN-based software sensor is clearly superior to the traditional neural network model, and its R (correlation coefficient), MAPE (mean absolute percentage error) and RMSE (root mean square error) are 0.8931, 10.9051 and 0.4634, respectively.

  11. Pedestrian visual recommendation in Kertanegara - Semeru corridor in Malang City

    NASA Astrophysics Data System (ADS)

    Cosalia, V. B.

    2017-06-01

    The streetscape often forms the first impression of an urban area. One streetscape that deserves attention is the corridor of Jl. Kertanegara - Semeru, since this road corridor has a strong character and serves as one of the main axes of Malang city. This research aims to determine the visual quality of the corridor and to provide appropriate design recommendations for Jl. Kertanegara - Semeru based on pedestrians' visual perception. The methods used in this research are Scenic Beauty Estimation (SBE) and historical study. Several variables were used: spatial scale, visual flexibility, beauty, emphasis, balance, and dominance. Pedestrians, acting as respondents, assessed the corridor on the basis of these variables. The SBE results show that the visual quality of the Kertanegara - Semeru corridor is reasonably good: 10 photos with low visual quality were located along Jl. Semeru, while 14 photos with high visual quality were located along Jl. Kertanegara, Jl. Tugu, and Jl. Kahuripan. Based on the historical study and using the high-visual-quality scenes as references, design recommendations were made for the parts of the streetscape with low visual quality.
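
    A simplified Scenic Beauty Estimation calculation is sketched below; the ratings are invented and the study's exact SBE formulation may differ, so this is only a hedged illustration of the standardization idea.

      # Simplified SBE: standardize each respondent's ratings to z-scores,
      # average the z-scores per photo, and scale by 100.
      import numpy as np

      # rows = respondents, columns = corridor photos, ratings on a 1-10 scale
      ratings = np.array([[7, 4, 8, 3, 6],
                          [6, 5, 9, 2, 7],
                          [8, 3, 7, 4, 6]], dtype=float)

      z = (ratings - ratings.mean(axis=1, keepdims=True)) / ratings.std(axis=1, ddof=1, keepdims=True)
      sbe = 100 * z.mean(axis=0)                       # one SBE score per photo
      print("SBE scores:", np.round(sbe, 1))           # higher = higher perceived visual quality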

  12. Laser welding of polymers: phenomenological model for a quick and reliable process quality estimation considering beam shape influences

    NASA Astrophysics Data System (ADS)

    Timpe, Nathalie F.; Stuch, Julia; Scholl, Marcus; Russek, Ulrich A.

    2016-03-01

    This contribution presents a phenomenological, analytical model for laser welding of polymers which is suited for a quick process quality estimation by the practitioner. Besides material properties of the polymer and processing parameters such as welding pressure, feed rate, and laser power, the model is based on a simple few-parameter description of the size and shape of the laser power density distribution (PDD) in the processing zone. The model allows an estimation of the weld seam tensile strength. It is based on energy balance considerations within a thin sheet with the thickness of the optical penetration depth on the surface of the absorbing welding partner. The joining process itself is modelled by a phenomenological approach. The model reproduces the experimentally known process windows for the main process parameters correctly. Using the parameters describing the shape of the laser PDD, the critical dependence of the process windows on the PDD shape is predicted and compared with experiments. The adaptation of the model to other laser manufacturing processes where the PDD influence can be modelled comparably is discussed.

  13. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework

    PubMed Central

    Zheng, Qi; Grice, Elizabeth A.

    2016-01-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost’s algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost. PMID:27706155
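
    The general flavor of a Bayesian mapping-quality estimate (not AlignerBoost's exact model) can be sketched as follows: the alignment scores of all candidate hits for a read are converted into posterior probabilities, and the best hit is reported with a Phred-scaled quality.

      # Phred-scaled mapping quality from candidate alignment scores (illustrative).
      import numpy as np

      def mapping_quality(alignment_scores, max_q=250):
          s = np.asarray(alignment_scores, float)
          post = np.exp(s - s.max())                         # stabilized likelihoods
          post /= post.sum()                                 # posterior over loci (uniform prior)
          err = max(1.0 - post.max(), 10 ** (-max_q / 10))   # P(best hit is wrong), floored
          return -10 * np.log10(err)

      print(mapping_quality([60.0]))                     # unique hit -> capped high quality
      print(mapping_quality([60.0, 58.0, 40.0]))         # ambiguous hit -> lower quality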

  14. High-efficiency cell concepts on low-cost silicon sheets

    NASA Technical Reports Server (NTRS)

    Bell, R. O.; Ravi, K. V.

    1985-01-01

    The limitations of sheet-growth material in terms of defect structure and minority carrier lifetime are discussed, and the effects of various defects on performance are estimated. Given these limitations, designs for a sheet-growth cell that make the best of the material characteristics are proposed, with the aim of achieving optimum synergy between base material quality and device processing variables. A strong coupling exists between material quality, the variables during crystal growth, and device processing variables. Two objectives are outlined: (1) optimization of this coupling for maximum performance at minimal cost; and (2) decoupling of materials from processing by improving base material quality so that it is less sensitive to processing variables.

  15. Community-Based Multidisciplinary Care for Patients With Stable Chronic Obstructive Pulmonary Disease (COPD)

    PubMed Central

    Sikich, N

    2012-01-01

    Executive Summary In July 2010, the Medical Advisory Secretariat (MAS) began work on a Chronic Obstructive Pulmonary Disease (COPD) evidentiary framework, an evidence-based review of the literature surrounding treatment strategies for patients with COPD. This project emerged from a request by the Health System Strategy Division of the Ministry of Health and Long-Term Care that MAS provide them with an evidentiary platform on the effectiveness and cost-effectiveness of COPD interventions. After an initial review of health technology assessments and systematic reviews of COPD literature, and consultation with experts, MAS identified the following topics for analysis: vaccinations (influenza and pneumococcal), smoking cessation, multidisciplinary care, pulmonary rehabilitation, long-term oxygen therapy, noninvasive positive pressure ventilation for acute and chronic respiratory failure, hospital-at-home for acute exacerbations of COPD, and telehealth (including telemonitoring and telephone support). Evidence-based analyses were prepared for each of these topics. For each technology, an economic analysis was also completed where appropriate. In addition, a review of the qualitative literature on patient, caregiver, and provider perspectives on living and dying with COPD was conducted, as were reviews of the qualitative literature on each of the technologies included in these analyses. The Chronic Obstructive Pulmonary Disease Mega-Analysis series is made up of the following reports, which can be publicly accessed at the MAS website at: http://www.hqontario.ca/en/mas/mas_ohtas_mn.html. Chronic Obstructive Pulmonary Disease (COPD) Evidentiary Framework Influenza and Pneumococcal Vaccinations for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Smoking Cessation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Community-Based Multidisciplinary Care for Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Pulmonary Rehabilitation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Long-term Oxygen Therapy for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Noninvasive Positive Pressure Ventilation for Acute Respiratory Failure Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Noninvasive Positive Pressure Ventilation for Chronic Respiratory Failure Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Hospital-at-Home Programs for Patients With Acute Exacerbations of Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Home Telehealth for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis Cost-Effectiveness of Interventions for Chronic Obstructive Pulmonary Disease Using an Ontario Policy Model Experiences of Living and Dying With COPD: A Systematic Review and Synthesis of the Qualitative Empirical Literature For more information on the qualitative review, please contact Mita Giacomini at: http://fhs.mcmaster.ca/ceb/faculty_member_giacomini.htm. For more information on the economic analysis, please visit the PATH website: http://www.path-hta.ca/About-Us/Contact-Us.aspx. The Toronto Health Economics and Technology Assessment (THETA) collaborative has produced an associated report on patient preference for mechanical ventilation. 
For more information, please visit the THETA website: http://theta.utoronto.ca/static/contact. Objective The objective of this evidence-based analysis was to determine the effectiveness and cost-effectiveness of multidisciplinary care (MDC) compared with usual care (UC, single health care provider) for the treatment of stable chronic obstructive pulmonary disease (COPD). Clinical Need: Condition and Target Population Chronic obstructive pulmonary disease is a progressive disorder with episodes of acute exacerbations associated with significant morbidity and mortality. Cigarette smoking is linked causally to COPD in more than 80% of cases. Chronic obstructive pulmonary disease is among the most common chronic diseases worldwide and has an enormous impact on individuals, families, and societies through reduced quality of life and increased health resource utilization and mortality. The estimated prevalence of COPD in Ontario in 2007 was 708,743 persons. Technology Multidisciplinary care involves professionals from a range of disciplines, working together to deliver comprehensive care that addresses as many of the patient’s health care and psychosocial needs as possible. Two variables are inherent in the concept of a multidisciplinary team: i) the multidisciplinary components such as an enriched knowledge base and a range of clinical skills and experiences, and ii) the team components, which include but are not limited to, communication and support measures. However, the most effective number of team members and which disciplines should comprise the team for optimal effect is not yet known. Research Question What is the effectiveness and cost-effectiveness of MDC compared with UC (single health care provider) for the treatment of stable COPD? Research Methods Literature Search Search Strategy A literature search was performed on July 19, 2010 using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published from January 1, 1995 until July 2010. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search. Inclusion Criteria health technology assessments, systematic reviews, or randomized controlled trials studies published between January 1995 and July 2010; COPD study population studies comparing MDC (2 or more health care disciplines participating in care) compared with UC (single health care provider) Exclusion Criteria grey literature duplicate publications non-English language publications study population less than 18 years of age Outcomes of Interest hospital admissions emergency department (ED) visits mortality health-related quality of life lung function Quality of Evidence The quality of each included study was assessed, taking into consideration allocation concealment, randomization, blinding, power/sample size, withdrawals/dropouts, and intention-to-treat analyses. The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria. The following definitions of quality were used in grading the quality of the evidence: High Further research is very unlikely to change confidence in the estimate of effect. 
Moderate Further research is likely to have an important impact on confidence in the estimate of effect and may change the estimate. Low Further research is very likely to have an important impact on confidence in the estimate of effect and is likely to change the estimate. Very Low Any estimate of effect is very uncertain. Summary of Findings Six randomized controlled trials were obtained from the literature search. Four of the 6 studies were completed in the United States. The sample size of the 6 studies ranged from 40 to 743 participants, with a mean study sample age between 66 and 71 years. Only 2 studies characterized the study sample in terms of the Global Initiative for Chronic Obstructive Lung Disease (GOLD) COPD stage criteria, and in general the description of the study population in the other 4 studies was limited. The mean percent predicted forced expiratory volume in 1 second (% predicted FEV1) among study populations was between 32% and 59%. Using this criterion, 3 studies included persons with severe COPD and 2 with moderate COPD. Information was not available to classify the population in the sixth study. Four studies had MDC treatment groups which included a physician. All studies except 1 reported a respiratory specialist (i.e., respiratory therapist, specialist nurse, or physician) as part of the multidisciplinary team. The UC group was composed of a single health care practitioner who may or may not have been a respiratory specialist. A meta-analysis was completed for 5 of the 7 outcome measures of interest, including health-related quality of life, lung function, all-cause hospitalization, COPD-specific hospitalization, and mortality. There was only 1 study contributing to the outcome of all-cause and COPD-specific ED visits which precluded pooling data for these outcomes. Subgroup analyses were not completed either because heterogeneity was not significant or there were a small number of studies that were meta-analysed for the outcome. Quality of Life Three studies reported results of quality of life assessment based on the St. George’s Respiratory Questionnaire (SGRQ). A mean decrease in the SGRQ indicates an improvement in quality of life while a mean increase indicates deterioration in quality of life. In all studies the mean change score from baseline to the end time point in the MDC treatment group showed either an improvement compared with the control group or less deterioration compared with the control group. The mean difference in change scores between MDC and UC groups was statistically significant in all 3 studies. The pooled weighted mean difference in total SGRQ score was −4.05 (95% confidence interval [CI], −6.47 to −1.63; P = 0.001). The GRADE quality of evidence was assessed as low for this outcome. Lung Function Two studies reported results of the FEV1 % predicted as a measure of lung function. A negative change from baseline infers deterioration in lung function and a positive change from baseline infers an improvement in lung function. The MDC group showed a statistically significant improvement in lung function up to 12 months compared with the UC group (P = 0.01). However, this effect was not maintained at 2-year follow-up (P = 0.24). The pooled weighted mean difference in FEV1 percent predicted was 2.78 (95% CI, −1.82 to 7.37). The GRADE quality of evidence was assessed as very low for this outcome indicating that an estimate of effect is uncertain. 
Hospital Admissions All-Cause Four studies reported results of all-cause hospital admissions in terms of number of persons with at least 1 admission during the follow-up period. Estimates from these 4 studies were pooled to determine a summary estimate. There is a statistically significant 25% relative risk (RR) reduction in all-cause hospitalizations in the MDC group compared with the UC group (P < 0.001). The index of heterogeneity (I2) value is 0%, indicating no statistical heterogeneity between studies. The GRADE quality of evidence was assessed as moderate for this outcome, indicating that further research may change the estimate of effect. COPD-Specific Hospitalization Three studies reported results of COPD-specific hospital admissions in terms of number of persons with at least 1 admission during the follow-up period. Estimates from these 3 studies were pooled to determine a summary estimate. There is a statistically significant 33% RR reduction in COPD-specific hospitalizations in the MDC group compared with the UC group (P = 0.002). The I2 value is 0%, indicating no statistical heterogeneity between studies. The GRADE quality of evidence was assessed as moderate for this outcome, indicating that further research may change the estimate of effect. Emergency Department Visits All-Cause Two studies reported results of all-cause ED visits in terms of number of persons with at least 1 visit during the follow-up period. There is a statistically nonsignificant reduction in all-cause ED visits when data from these 2 studies are pooled (RR, 0.64; 95% CI, 0.31 to 1.33; P = 0.24). The GRADE quality of evidence was assessed as very low for this outcome indicating that an estimate of effect is uncertain. COPD-Specific Two studies reported results of COPD-specific ED visits in terms of number of persons with at least 1 visit during the follow-up period. There is a statistically significant 41% reduction in COPD-specific ED visits when the data from these 2 studies are pooled (RR, 0.59; 95% CI, 0.43−0.81; P < 0.001). The GRADE quality of evidence was assessed as moderate for this outcome. Mortality Three studies reported the mortality during the study follow-up period. Estimates from these 3 studies were pooled to determine a summary estimate. There is a statistically nonsignificant reduction in mortality between treatment groups (RR, 0.81; 95% CI, 0.52−1.27; P = 0.36). The I2 value is 19%, indicating low statistical heterogeneity between studies. All studies had a 12-month follow-up period. The GRADE quality of evidence was assessed as low for this outcome. Conclusions Significant effect estimates with moderate quality of evidence were found for all-cause hospitalization, COPD-specific hospitalization, and COPD-specific ED visits (Table ES1). A significant estimate with low quality evidence was found for the outcome of quality of life (Table ES2). All other outcome measures were nonsignificant and supported by low or very low quality of evidence. 
Table ES1: Summary of Dichotomous Data
Outcome | Number of Studies (n) | Relative Risk (95% CI) | GRADE
Hospitalizations, all-cause (number of persons) | 4 (1121) | 0.75 (0.64−0.87) | Moderate
Hospitalizations, COPD-specific (number of persons) | 3 (916) | 0.67 (0.52−0.87) | Moderate
Emergency department visits, all-cause (number of persons) | 2 (223) | 0.64 (0.31−1.33) | Very Low
Emergency department visits, COPD-specific (number of persons) | 2 (783) | 0.59 (0.43−0.81) | Moderate
Mortality | 3 (1033) | 0.81 (0.52−1.27) | Low
Abbreviations: CI, confidence interval; COPD, chronic obstructive pulmonary disease; n, number.
Table ES2: Summary of Continuous Data
Outcome | Number of Studies (n) | Weighted Mean Difference (95% CI) | GRADE
Quality of Life (SGRQ) | 2 (942) | −4.05 (−6.47 to −1.63) | Low
Lung Function (FEV1 % predicted) | 2 (316) | 2.78 (−1.82 to 7.37) | Very Low
Abbreviations: CI, confidence interval; FEV1, forced expiratory volume in 1 second; n, number; SGRQ, St. George’s Respiratory Questionnaire.
PMID:23074433
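
    For readers unfamiliar with the pooling used throughout the summary above, the sketch below shows a generic fixed-effect (inverse-variance) pooling of relative risks together with Cochran's Q and the I2 heterogeneity statistic. It is not the meta-analysis software used in the report, and the per-study event counts in the example call are invented.

      import math

      def pooled_relative_risk(studies):
          """Fixed-effect (inverse-variance) pooling of relative risks.

          studies: list of (events_treatment, n_treatment, events_control, n_control).
          Returns (pooled RR, CI low, CI high, I^2 in percent).
          """
          log_rrs, weights = [], []
          for a, n1, c, n2 in studies:
              rr = (a / n1) / (c / n2)
              se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)       # SE of the log relative risk
              log_rrs.append(math.log(rr))
              weights.append(1.0 / se**2)
          pooled = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
          se_pooled = math.sqrt(1.0 / sum(weights))
          q = sum(w * (l - pooled)**2 for w, l in zip(weights, log_rrs))  # Cochran's Q
          df = len(studies) - 1
          i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
          lo, hi = math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled)
          return math.exp(pooled), lo, hi, i2

      # Illustrative (invented) counts for three hypothetical trials
      print(pooled_relative_risk([(30, 150, 45, 150), (12, 60, 18, 62), (50, 300, 64, 290)]))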

  16. Optimization-Based Sensor Fusion of GNSS and IMU Using a Moving Horizon Approach

    PubMed Central

    Girrbach, Fabian; Hol, Jeroen D.; Bellusci, Giovanni; Diehl, Moritz

    2017-01-01

    The rise of autonomous systems operating close to humans imposes new challenges in terms of robustness and precision on the estimation and control algorithms. Approaches based on nonlinear optimization, such as moving horizon estimation, have been shown to improve the accuracy of the estimated solution compared to traditional filter techniques. This paper introduces an optimization-based framework for multi-sensor fusion following a moving horizon scheme. The framework is applied to the often occurring estimation problem of motion tracking by fusing measurements of a global navigation satellite system receiver and an inertial measurement unit. The resulting algorithm is used to estimate position, velocity, and orientation of a maneuvering airplane and is evaluated against an accurate reference trajectory. A detailed study of the influence of the horizon length on the quality of the solution is presented and evaluated against filter-like and batch solutions of the problem. The versatile configuration possibilities of the framework are finally used to analyze the estimated solutions at different evaluation times exposing a nearly linear behavior of the sensor fusion problem. PMID:28534857

  17. Optimization-Based Sensor Fusion of GNSS and IMU Using a Moving Horizon Approach.

    PubMed

    Girrbach, Fabian; Hol, Jeroen D; Bellusci, Giovanni; Diehl, Moritz

    2017-05-19

    The rise of autonomous systems operating close to humans imposes new challenges in terms of robustness and precision on the estimation and control algorithms. Approaches based on nonlinear optimization, such as moving horizon estimation, have been shown to improve the accuracy of the estimated solution compared to traditional filter techniques. This paper introduces an optimization-based framework for multi-sensor fusion following a moving horizon scheme. The framework is applied to the often occurring estimation problem of motion tracking by fusing measurements of a global navigation satellite system receiver and an inertial measurement unit. The resulting algorithm is used to estimate position, velocity, and orientation of a maneuvering airplane and is evaluated against an accurate reference trajectory. A detailed study of the influence of the horizon length on the quality of the solution is presented and evaluated against filter-like and batch solutions of the problem. The versatile configuration possibilities of the framework are finally used to analyze the estimated solutions at different evaluation times exposing a nearly linear behavior of the sensor fusion problem.
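
    As a rough illustration of the moving horizon idea described in the two records above, the Python sketch below solves a sliding-window least-squares problem over a 1-D toy model, fusing noisy position ("GNSS-like") and acceleration ("IMU-like") measurements. The state dimension, noise levels, and arrival-cost handling are simplifying assumptions, not the authors' formulation.

      import numpy as np
      from scipy.optimize import least_squares

      DT, HORIZON = 0.1, 15          # sample period [s], window length [samples]
      SIG_POS, SIG_ACC = 1.0, 0.05   # assumed measurement noise standard deviations

      def residuals(x, pos_meas, acc_meas, prior):
          r = [(x[0] - prior) / 5.0]                          # weak arrival-cost prior
          r += list((x - pos_meas) / SIG_POS)                 # position residuals
          acc_model = (x[2:] - 2*x[1:-1] + x[:-2]) / DT**2    # finite-difference acceleration
          r += list((acc_model - acc_meas[1:-1]) / SIG_ACC)   # acceleration residuals
          return np.asarray(r)

      rng = np.random.default_rng(0)
      t = np.arange(0, 20, DT)
      true_pos = 0.5 * 0.3 * t**2                             # constant 0.3 m/s^2 trajectory
      pos_meas = true_pos + rng.normal(0, SIG_POS, t.size)
      acc_meas = 0.3 + rng.normal(0, SIG_ACC, t.size)

      estimates, prior = [], pos_meas[0]
      for k in range(HORIZON, t.size):
          sl = slice(k - HORIZON, k)
          sol = least_squares(residuals, pos_meas[sl],
                              args=(pos_meas[sl], acc_meas[sl], prior))
          estimates.append(sol.x[-1])        # newest state in the window
          prior = sol.x[1]                   # carry the window forward
      err = np.array(estimates) - true_pos[HORIZON - 1 : t.size - 1]
      print("RMS position error [m]:", round(float(np.sqrt(np.mean(err**2))), 3))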

  18. The predictive validity of quality of evidence grades for the stability of effect estimates was low: a meta-epidemiological study.

    PubMed

    Gartlehner, Gerald; Dobrescu, Andreea; Evans, Tammeka Swinson; Bann, Carla; Robinson, Karen A; Reston, James; Thaler, Kylie; Skelly, Andrea; Glechner, Anna; Peterson, Kimberly; Kien, Christina; Lohr, Kathleen N

    2016-02-01

    To determine the predictive validity of the U.S. Evidence-based Practice Center (EPC) approach to GRADE (Grading of Recommendations Assessment, Development and Evaluation). Based on Cochrane reports with outcomes graded as high quality of evidence (QOE), we prepared 160 documents which represented different levels of QOE. Professional systematic reviewers dually graded the QOE. For each document, we determined whether estimates were concordant with the high-QOE estimates of the Cochrane reports. We compared the observed proportion of concordant estimates with the expected proportion from an international survey. To determine the predictive validity, we used the Hosmer-Lemeshow test to assess calibration and the C (concordance) index to assess discrimination. The predictive validity of the EPC approach to GRADE was limited. Estimates graded as high QOE were less likely, and estimates graded as low or insufficient QOE were more likely, to remain stable than expected. The EPC approach to GRADE could not reliably predict the likelihood that individual bodies of evidence would remain stable as new evidence becomes available. C-indices ranged between 0.56 (95% CI, 0.47 to 0.66) and 0.58 (95% CI, 0.50 to 0.67), indicating low discriminatory ability. The limited predictive validity of the EPC approach to GRADE seems to reflect a mismatch between expected and observed changes in treatment effects as bodies of evidence advance from insufficient to high QOE. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Estimates of nitrate loads and yields from groundwater to streams in the Chesapeake Bay watershed based on land use and geology

    USGS Publications Warehouse

    Terziotti, Silvia; Capel, Paul D.; Tesoriero, Anthony J.; Hopple, Jessica A.; Kronholm, Scott C.

    2018-03-07

    The water quality of the Chesapeake Bay may be adversely affected by dissolved nitrate carried in groundwater discharge to streams. To estimate the concentrations, loads, and yields of nitrate from groundwater to streams for the Chesapeake Bay watershed, a regression model was developed based on measured nitrate concentrations from 156 small streams with watersheds less than 500 square miles (mi2) at baseflow. The regression model has three predictive variables: geologic unit, percent developed land, and percent agricultural land. Estimated and measured values within geologic units were closely matched. The coefficient of determination (R2) for the model was 0.6906. The model was used to calculate baseflow nitrate concentrations at over 83,000 National Hydrography Dataset Plus Version 2 catchments, aggregated to 1,966 12-digit hydrologic units in the Chesapeake Bay watershed. The modeled output geospatial data layers provided estimated annual loads and yields of nitrate from groundwater into streams. The spatial distribution of annual nitrate yields from groundwater estimated by this method was compared to the total watershed yields from all sources estimated by a Chesapeake Bay SPAtially Referenced Regressions On Watershed attributes (SPARROW) water-quality model. The comparison showed similar spatial patterns. The regression model for the groundwater contribution gave similar but lower yields, suggesting that groundwater is an important source of nitrogen for streams in the Chesapeake Bay watershed.
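
    The regression structure described above (a categorical geologic-unit term plus two land-use percentages) can be sketched as follows. The watersheds, coefficients, and variable names are synthetic placeholders; the actual USGS model and its fitted parameters are documented in the cited report.

      import numpy as np

      # Illustrative only: synthetic watersheds and invented coefficients.
      rng = np.random.default_rng(1)
      n = 300
      units = rng.choice(["carbonate", "crystalline", "siliciclastic"], n)
      pct_dev = rng.uniform(0, 60, n)
      pct_ag = rng.uniform(0, 80, n)
      unit_effect = {"carbonate": 1.5, "crystalline": 0.4, "siliciclastic": 0.8}
      nitrate = (np.vectorize(unit_effect.get)(units)
                 + 0.02 * pct_dev + 0.04 * pct_ag + rng.normal(0, 0.5, n))

      # Design matrix: one-hot geologic unit (categorical term) plus land-use covariates
      levels = sorted(unit_effect)
      X = np.column_stack([(units == lv).astype(float) for lv in levels] + [pct_dev, pct_ag])
      beta, *_ = np.linalg.lstsq(X, nitrate, rcond=None)

      pred = X @ beta
      r2 = 1 - np.sum((nitrate - pred)**2) / np.sum((nitrate - nitrate.mean())**2)
      print("coefficients:", dict(zip(levels + ["pct_developed", "pct_agriculture"], beta.round(3))))
      print("R^2:", round(float(r2), 3))   # analogous to the R^2 reported in the study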

  20. Iraq War mortality estimates: a systematic review.

    PubMed

    Tapp, Christine; Burkle, Frederick M; Wilson, Kumanan; Takaro, Tim; Guyatt, Gordon H; Amad, Hani; Mills, Edward J

    2008-03-07

    In March 2003, the United States invaded Iraq. The subsequent number, rates, and causes of mortality in Iraq resulting from the war remain unclear, despite intense international attention. Understanding mortality estimates from modern warfare, where the majority of casualties are civilian, is of critical importance for public health and protection afforded under international humanitarian law. We aimed to review the studies, reports and counts on Iraqi deaths since the start of the war and assessed their methodological quality and results. We performed a systematic search of 15 electronic databases from inception to January 2008. In addition, we conducted a non-structured search of 3 other databases, reviewed study reference lists and contacted subject matter experts. We included studies that provided estimates of Iraqi deaths based on primary research over a reported period of time since the invasion. We excluded studies that summarized mortality estimates and combined non-fatal injuries and also studies of specific sub-populations, e.g. under-5 mortality. We calculated crude and cause-specific mortality rates attributable to violence and average deaths per day for each study, where not already provided. Thirteen studies met the eligibility criteria. The studies used a wide range of methodologies, varying from sentinel-data collection to population-based surveys. Studies assessed as the highest quality, those using population-based methods, yielded the highest estimates. Average deaths per day ranged from 48 to 759. The cause-specific mortality rates attributable to violence ranged from 0.64 to 10.25 per 1,000 per year. Our review indicates that, despite varying estimates, the mortality burden of the war and its sequelae on Iraq is large. The use of established epidemiological methods is rare. This review illustrates the pressing need to promote sound epidemiologic approaches to determining mortality estimates and to establish guidelines for policy-makers, the media and the public on how to interpret these estimates.

  1. Estimating the confidence bounds for projected ozone design values under different emissions control options

    EPA Science Inventory

    In current regulatory applications, a regional air quality model is applied for a base year and a future year with reduced emissions using the same meteorological conditions. The base year design value is multiplied by the ratio of the average of the top 10 ozone concentrations fo...

  2. 77 FR 51930 - Approval and Promulgation of Air Quality Implementation Plans; Pennsylvania; Attainment Plan for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ... for the Philadelphia Area. While the monitoring data that show the Philadelphia Area attained the 1997... sources (such as cargo handling equipment) at ports. Activity data for land-based sources collected from... emission estimates. EPA also verified that land-based sources for cargo handling equipment, such as...

  3. Impact of particulate air pollution on quality-adjusted life expectancy in Canada.

    PubMed

    Coyle, Douglas; Stieb, Dave; Burnett, Richard T; DeCivita, Paul; Krewski, Daniel; Chen, Yue; Thun, Michael J

    Air pollution and premature death are important public health concerns. Analyses have repeatedly demonstrated that airborne particles are associated with increased mortality and estimates have been used to forecast the impact on life expectancy. In this analysis, we draw upon data from the American Cancer Society (ACS) cohort and literature on utility-based measures of quality of life in relation to health status to more fully quantify the effects of air pollution on mortality in terms of quality-adjusted life expectancy. The analysis was conducted within a decision analytic model using Monte Carlo simulation techniques. Outcomes were estimated based on projections of the Canadian population. A one-unit reduction in sulfate air pollution would yield a mean annual increase in Quality-Adjusted Life Years (QALYs) of 20,960, with gains being greater for individuals with lower educational status and for males compared to females. This suggests that the impact of reductions in sulfate air pollution on quality-adjusted life expectancy is substantial. Interpretation of the results is unclear. However, the potential gains in QALYs from reduced air pollutants can be contrasted to the costs of policies to bring about such reductions. Based on a tentative threshold for the value of health benefits, analysis suggests that an investment in Canada of over 1 billion dollars per annum would be an efficient use of resources if it could be demonstrated that this would reduce sulfate concentrations in ambient air by 1 μg/m3. Further analysis can assess the efficiency of targeting such initiatives to communities that are most likely to benefit.

  4. Quantifying public health benefits of environmental strategy of PM2.5 air quality management in Beijing-Tianjin-Hebei region, China.

    PubMed

    Chen, Li; Shi, Mengshuang; Li, Suhuan; Gao, Shuang; Zhang, Hui; Sun, Yanling; Mao, Jian; Bai, Zhipeng; Wang, Zhongliang; Zhou, Jiang

    2017-07-01

    In 2013, China issued the "Air Pollution Prevention and Control Action Plan (Action Plan)" to improve air quality. To assess the benefits of this program in the Beijing-Tianjin-Hebei (BTH) region, where the density of population and emissions vary greatly, we simulated the air quality benefit of meeting the Action Plan targets using BenMAP. In this study, we estimate PM2.5 concentration using the Voronoi spatial interpolation method on a grid with a spatial resolution of 1×1 km2. Combined with the exposure-response function between PM2.5 concentration and health endpoints, health effects of PM2.5 exposure are analyzed. The economic loss is assessed by using the willingness to pay (WTP) method and the human capital (HC) method. When the PM2.5 concentration falls by 25% in BTH and reaches 60 μg/m3 in Beijing, the number of avoided deaths would be in the range of 3175 to 14051 per year, based on different functions. Of the estimated mortality attributable to all causes, 3117 annual deaths were due to lung cancer, 1924 - 6318 annual deaths were due to cardiovascular disease, and 343 - 1697 annual deaths were due to respiratory disease. Based on WTP, the estimated monetary values for the avoided cases of all-cause mortality, cardiovascular mortality, respiratory mortality and lung cancer ranged from 1110 to 29632, 673 to 13325, 120 to 3579, and 1091 to 6574 million yuan, respectively. Based on HC, the corresponding values for the avoided cases of these four mortalities were 267 to 1178, 161 to 529, 29 to 143 and 261 million yuan, respectively. Copyright © 2016. Published by Elsevier B.V.
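
    BenMAP-style health impact assessments typically combine a log-linear concentration-response function with a baseline incidence rate and an exposed population. The sketch below shows that generic calculation; the numbers in the example call are illustrative and are not the study's inputs.

      import math

      def avoided_deaths(baseline_rate, beta, delta_c, population):
          """Log-linear concentration-response function of the kind used in
          BenMAP-style health impact assessments.

          baseline_rate : baseline annual mortality rate (deaths per person per year)
          beta          : concentration-response coefficient per ug/m^3 of PM2.5
          delta_c       : reduction in annual-mean PM2.5 concentration (ug/m^3)
          population    : exposed population in the grid cell
          """
          return baseline_rate * (1.0 - math.exp(-beta * delta_c)) * population

      # Illustrative numbers only: a 20 ug/m^3 reduction over 1 million people
      # with a baseline rate of 0.006 and a coefficient of 0.004 per ug/m^3
      print(round(avoided_deaths(0.006, 0.004, 20.0, 1_000_000)), "deaths avoided per year")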

  5. Evapotranspiration-based irrigation scheduling of lettuce and broccoli

    USDA-ARS?s Scientific Manuscript database

    Estimation of crop evapotranspiration supports efficient irrigation water management, which in turn supports water conservation, mitigation of groundwater depletion/degradation, energy savings, and crop quality maintenance. Past research in California has revealed strong relationships between fract...

  6. 4D ERT-based calibration and prediction of biostimulant induced changes in fluid conductivity

    NASA Astrophysics Data System (ADS)

    Johnson, T. C.; Versteeg, R. J.; Day-Lewis, F. D.; Major, W. R.; Wright, K. E.

    2008-12-01

    In-situ bioremediation is an emerging and cost-effective method of removing organic contaminants from groundwater. The performance of bioremedial systems depends on the adequate delivery and distribution of biostimulants to contaminated zones. Monitoring the distribution of biostimulants using monitoring wells is expensive, time consuming, and provides inadequate information between sampling wells. We discuss a Hydrogeophysical Performance Monitoring System (HPMS) deployed to monitor bioremediation efforts at a TCE-contaminated Superfund site in Brandywine MD. The HPMS enables autonomous electrical geophysical data acquisition, processing, quality-assurance/quality-control, and inversion. Our objective is to demonstrate the feasibility and cost effectiveness of the HPMS to provide near real-time information on the spatiotemporal behavior of injected biostimulants. As a first step, we use time-lapse electrical resistivity tomography (ERT) to estimate changes in bulk conductivity caused by the injectate. We demonstrate how ERT-based bulk conductivity estimates can be calibrated with a small number of fluid conductivity measurements to produce ERT-based estimates of fluid conductivity. The calibration procedure addresses the spatially variable resolution of the ERT tomograms. To test the validity of these estimates, we used the ERT results to predict the fluid conductivity at tens of points prior to field sampling of fluid conductivity at the same points. The comparison of ERT-predicted vs. observed fluid conductivity displays a high degree of correlation (correlation coefficient over 0.8), and demonstrates the ability of the HPMS to estimate the four-dimensional (4D) distribution of fluid conductivity caused by the biostimulant injection.

  7. The potential health and economic benefits of preventing recurrent respiratory papillomatosis through quadrivalent human papillomavirus vaccination.

    PubMed

    Chesson, Harrell W; Forhan, Sara E; Gottlieb, Sami L; Markowitz, Lauri E

    2008-08-18

    We estimated the health and economic benefits of preventing recurrent respiratory papillomatosis (RRP) through quadrivalent human papillomavirus (HPV) vaccination. We applied a simple mathematical model to estimate the averted costs and quality-adjusted life years (QALYs) saved by preventing RRP in children whose mothers had been vaccinated at age 12 years. Under base case assumptions, the prevention of RRP would avert an estimated USD 31 (range: USD 2-178) in medical costs (2006 US dollars) and save 0.00016 QALYs (range: 0.00001-0.00152) per 12-year-old girl vaccinated. Including the benefits of RRP reduced the estimated cost per QALY gained by HPV vaccination by roughly 14-21% in the base case and by <2% to >100% in the sensitivity analyses. More precise estimates of the incidence of RRP are needed, however, to quantify this impact more reliably.

  8. The Quality of Educational Services from Students' Viewpoint in Iran: A Systematic Review and Meta-analysis.

    PubMed

    Moosavi, Ahmad; Mohseni, Mohammad; Ziaiifar, Hajar; Azami-Aghdash, Saber; Gharasi Manshadi, Mahdi; Rezapour, Aziz

    2017-04-01

    Students' view is an important factor in assessing the quality of universities. Servqual pattern is regarded as the most prominent for services quality measurement. This study aimed to review systematically studies that investigated the quality of educational services. A systematic review and meta-analysis of studies evaluating students' viewpoint about quality of educational services were conducted. Required data were collected from PubMed, Embase, Scopus, Science Direct, Google Scholar, SID, Magiran, and Iranmedex, without time restriction. Computer software CMA, ver. 2 was applied to estimate the total mean score of students' perception and expectation of services quality and the gap between them. The 18 eligible studies were entered into study. The studies were conducted between 2004 and 2014. Based on the random effect model, the total mean score of students' perception, students' expectation and the gap between them were estimated 2.92 (95% CI, 2.75 - 3.09), 4.18 (95% CI, 3.98 - 4.38), respectively and -1.30 (95% CI= -1.56, -1.04). The studied students' expectation level is higher than the current quality of educational services. There is a tangible difference between their expectations and the current quality, which requires officials' efforts to improve quality in all dimensions and effective steps can be taken towards improving the quality of educational services through appropriate training planning and training for empowering employees in colleges and universities.

  9. Targeting Environmental Quality to Improve Population Health ...

    EPA Pesticide Factsheets

    Key goals of health care reform are to stimulate innovative approaches to improve healthcare quality and clinical outcomes while holding down costs. To achieve these goals value-based payment places the needs of the patient first and encourages multi-stakeholder cooperation. Yet, the stakeholders are typically all within the healthcare system, e.g. the Accountable Care Organization or Patient-Centered Medical Home, leaving important contributors to the health of the population such as the public health and environmental health systems absent. And rarely is the quality of the environment regarded as a modifiable factor capable of imparting a health benefit. Underscoring this point, a PubMed search of the search terms “environmental quality” with “value-based payment”, “value-based healthcare” or “value-based reimbursement” returned no relevant articles, providing further evidence that the healthcare industry largely disregards the quality of the environment as a significant determinant of wellbeing and an actionable risk factor for clinical disease management and population health intervention. Yet, the quality of the environment is unequivocally related to indicators of population health including all-cause mortality. The EPA’s Environmental Quality Index (EQI) composed of five different domains (air, land use, water, built environment and social) has provided new estimates of the associations between environmental quality and health stat

  10. Constant Switching Frequency DTC for Matrix Converter Fed Speed Sensorless Induction Motor Drive

    NASA Astrophysics Data System (ADS)

    Mir, Tabish Nazir; Singh, Bhim; Bhat, Abdul Hamid

    2018-05-01

    The paper presents a constant switching frequency scheme for speed sensorless Direct Torque Control (DTC) of a Matrix Converter fed Induction Motor Drive. The use of a matrix converter facilitates improved power quality on the input as well as the motor side, along with input power factor control, besides eliminating the need for heavy passive elements. Moreover, DTC through Space Vector Modulation helps in achieving fast control over the torque and flux of the motor, with the added benefit of constant switching frequency. A constant switching frequency aids in maintaining the desired power quality of the AC mains current even at low motor speeds, and simplifies input filter design of the matrix converter, as compared to conventional hysteresis based DTC. Further, the stator voltage is estimated from the sensed input voltage, and the stator (and rotor) flux is subsequently estimated. For speed sensorless operation, a Model Reference Adaptive System is used, which emulates the speed dependent rotor flux equations of the induction motor. The error between the conventionally estimated rotor flux (reference model) and the rotor flux estimated through the adaptive observer is processed through a PI controller to generate the rotor speed estimate.
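
    A minimal toy version of the MRAS speed-adaptation mechanism mentioned above is sketched below. To stay self-contained it generates the "reference" rotor flux from the current model run at the true speed (instead of the paper's voltage-model estimate), and the motor parameters and PI gains are invented, so it only demonstrates how the cross-product error drives the speed estimate, not the drive described in the paper.

      import numpy as np

      Lm, Tr = 0.3, 0.15                  # magnetizing inductance [H], rotor time constant [s] (invented)
      DT = 1e-4                           # integration step [s]
      w_true, w_stator = 290.0, 314.0     # rotor / stator electrical speeds [rad/s]
      kp, ki = 100.0, 5000.0              # PI adaptation gains (tuned for this toy only)

      def rotor_flux_step(psi, i_s, w, dt=DT):
          """One Euler step of the rotor-flux current model in the stationary frame,
          complex notation: dpsi/dt = (Lm/Tr)*i_s - (1/Tr - 1j*w)*psi."""
          return psi + dt * ((Lm / Tr) * i_s - (1.0 / Tr - 1j * w) * psi)

      psi_ref = psi_hat = 0.0 + 0.0j
      w_hat, integral = 0.0, 0.0
      for k in range(int(5.0 / DT)):                       # simulate 5 s
          i_s = 10.0 * np.exp(1j * w_stator * k * DT)      # sinusoidal stator current phasor
          psi_ref = rotor_flux_step(psi_ref, i_s, w_true)  # stands in for the reference model
          psi_hat = rotor_flux_step(psi_hat, i_s, w_hat)   # adaptive (adjustable) model
          err = np.imag(np.conj(psi_hat) * psi_ref)        # cross product ~ sin(flux angle error)
          integral += ki * err * DT
          w_hat = kp * err + integral                      # PI speed adaptation law
      print(f"estimated speed: {w_hat:.1f} rad/s   (true: {w_true} rad/s)")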

  11. Diagnostic performance of contrast-enhanced spectral mammography: Systematic review and meta-analysis.

    PubMed

    Tagliafico, Alberto Stefano; Bignotti, Bianca; Rossi, Federica; Signori, Alessio; Sormani, Maria Pia; Valdora, Francesca; Calabrese, Massimo; Houssami, Nehmat

    2016-08-01

    To estimate the sensitivity and specificity of CESM for breast cancer diagnosis. Systematic review and meta-analysis of the accuracy of CESM in detecting breast cancer in highly selected women. We estimated summary receiver operating characteristic curves, sensitivity and specificity, with study quality assessed using QUADAS-2. Six hundred and four studies were retrieved; 8 of these, reporting on 920 patients with 994 lesions, were eligible for inclusion. Estimated sensitivity from all studies was 0.98 (95% CI: 0.96-1.00). Specificity, estimated from the six studies reporting raw data, was 0.58 (95% CI: 0.38-0.77). The majority of studies were scored as at high risk of bias due to the very selected populations. CESM has a high sensitivity but very low specificity. The source studies were based on highly selected case series and prone to selection bias. High-quality studies are required to assess the accuracy of CESM in unselected cases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Monitoring of coalbed water retention ponds in the Powder River Basin using Google Earth images and an Unmanned Aircraft System

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Zhou, Z.; Apple, M. E.; Spangler, L.

    2016-12-01

    To extract methane from unminable coal seams in the Powder River Basin of Montana and Wyoming, coalbed methane (CBM) water has to be pumped and kept in retention ponds rather than discharged to the vadose zone to mix with the ground water. The areal water coverage of these ponds changes due to evaporation and repeated refilling. The water quality also changes due to the growth of microalgae (unicellular or filamentous, including green algae and diatoms), evaporation, and refilling. Estimating water coverage changes and monitoring water quality is therefore important for managing CBM water retention ponds and for timely planning of how newly pumped CBM water is handled. Conventional methods, such as the various water indices based on multi-spectral satellite data (e.g., Landsat), do not work well here because of the small pond size (~100 m × 100 m scale) relative to the low spatial resolution (~30 m scale) of the satellite data. In this study we present new methods to estimate water coverage and water quality changes using Google Earth images and images collected from an unmanned aircraft system (UAS) (Phantom 2 plus). Because these images have only visible bands (red, green, and blue), the conventional water index methods that involve near-infrared bands do not work. We design a new method based only on the visible bands to automatically extract water pixels, using the intensity of each water pixel as a proxy for water quality, after a series of image processing steps such as georeferencing, resampling, and filtering. Differential GPS positions along the water edges were collected on the same day as the UAS images. The water area was calculated from the GPS positions and used to validate the method. Because of the very high resolution (~10-30 cm scale), the water areal coverage and water quality distribution can be accurately estimated. Since the UAS can be flown at any time, water area and quality information can be collected in a timely manner.
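
    The visible-band extraction step could, for instance, look like the sketch below. The thresholds, the blue/red ratio heuristic, and the ground sample distance are illustrative assumptions rather than the method actually designed in the study.

      import numpy as np

      def extract_water_mask(rgb, blue_red_thresh=1.15, brightness_max=180):
          """Toy visible-band water detector for a UAS or Google Earth RGB image.

          rgb: uint8 array of shape (H, W, 3).  Water pixels tend to have a blue
          (or green, for algae-rich ponds) channel that dominates red, and
          moderate brightness.  Thresholds are illustrative assumptions.
          """
          img = rgb.astype(float)
          r, b = img[..., 0], img[..., 2]
          blue_red_ratio = (b + 1.0) / (r + 1.0)
          brightness = img.mean(axis=-1)
          return (blue_red_ratio > blue_red_thresh) & (brightness < brightness_max)

      def water_area_m2(mask, ground_sample_distance_m):
          """Areal coverage from the pixel count and the image resolution."""
          return mask.sum() * ground_sample_distance_m ** 2

      # Example with a synthetic 100x100 image (top half "water", bottom half "soil")
      rgb = np.zeros((100, 100, 3), dtype=np.uint8)
      rgb[:50] = (40, 80, 120)     # dark bluish pixels
      rgb[50:] = (150, 130, 110)   # brighter reddish soil
      mask = extract_water_mask(rgb)
      print(water_area_m2(mask, ground_sample_distance_m=0.2), "m^2")   # assumed 0.2 m GSD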

  13. Using satellite observations in performance evaluation for regulatory air quality modeling: Comparison with ground-level measurements

    NASA Astrophysics Data System (ADS)

    Odman, M. T.; Hu, Y.; Russell, A.; Chai, T.; Lee, P.; Shankar, U.; Boylan, J.

    2012-12-01

    Regulatory air quality modeling, such as State Implementation Plan (SIP) modeling, requires that model performance meets recommended criteria in the base-year simulations using period-specific, estimated emissions. The goal of the performance evaluation is to assure that the base-year modeling accurately captures the observed chemical reality of the lower troposphere. Any significant deficiencies found in the performance evaluation must be corrected before any base-case (with typical emissions) and future-year modeling is conducted. Corrections are usually made to model inputs such as emission-rate estimates or meteorology and/or to the air quality model itself, in modules that describe specific processes. Use of ground-level measurements that follow approved protocols is recommended for evaluating model performance. However, ground-level monitoring networks are spatially sparse, especially for particulate matter. Satellite retrievals of atmospheric chemical properties such as aerosol optical depth (AOD) provide spatial coverage that can compensate for the sparseness of ground-level measurements. Satellite retrievals can also help diagnose potential model or data problems in the upper troposphere. It is possible to achieve good model performance near the ground, but have, for example, erroneous sources or sinks in the upper troposphere that may result in misleading and unrealistic responses to emission reductions. Despite these advantages, satellite retrievals are rarely used in model performance evaluation, especially for regulatory modeling purposes, due to the high uncertainty in retrievals associated with various contaminations, for example by clouds. In this study, 2007 was selected as the base year for SIP modeling in the southeastern U.S. Performance of the Community Multiscale Air Quality (CMAQ) model, at a 12-km horizontal resolution, for this annual simulation is evaluated using both recommended ground-level measurements and non-traditional satellite retrievals. Evaluation results are assessed against recommended criteria and peer studies in the literature. Further analysis is conducted, based upon these assessments, to discover likely errors in model inputs and potential deficiencies in the model itself. Correlations as well as differences in input errors and model deficiencies revealed by ground-level measurements versus satellite observations are discussed. Additionally, sensitivity analyses are employed to investigate errors in emission-rate estimates using either ground-level measurements or satellite retrievals, and the results are compared against each other considering observational uncertainties. Recommendations are made for how to effectively utilize satellite retrievals in regulatory air quality modeling.

  14. The economic burden of lung cancer and mesothelioma due to occupational and para-occupational asbestos exposure

    PubMed Central

    Tompa, Emile; Kalcevich, Christina; McLeod, Chris; Lebeau, Martin; Song, Chaojie; McLeod, Kim; Kim, Joanne; Demers, Paul A

    2017-01-01

    Objectives To estimate the economic burden of lung cancer and mesothelioma due to occupational and para-occupational asbestos exposure in Canada. Methods We estimate the lifetime cost of newly diagnosed lung cancer and mesothelioma cases associated with occupational and para-occupational asbestos exposure for calendar year 2011 based on the societal perspective. The key cost components considered are healthcare costs, productivity and output costs, and quality of life costs. Results There were 427 cases of newly diagnosed mesothelioma cases and 1904 lung cancer cases attributable to asbestos exposure in 2011 for a total of 2331 cases. Our estimate of the economic burden is $C831 million in direct and indirect costs for newly identified cases of mesothelioma and lung cancer and $C1.5 billion in quality of life costs based on a value of $C100 000 per quality-adjusted life year. This amounts to $C356 429 and $C652 369 per case, respectively. Conclusions The economic burden of lung cancer and mesothelioma associated with occupational and para-occupational asbestos exposure is substantial. The estimate identified is for 2331 newly diagnosed, occupational and para-occupational exposure cases in 2011, so it is only a portion of the burden of existing cases in that year. Our findings provide important information for policy decision makers for priority setting, in particular the merits of banning the mining of asbestos and use of products containing asbestos in countries where they are still allowed and also the merits of asbestos removal in older buildings with asbestos insulation. PMID:28756416

  15. Effects of seasonal and pandemic influenza on health-related quality of life, work and school absence in England: Results from the Flu Watch cohort study.

    PubMed

    Fragaszy, Ellen B; Warren-Gash, Charlotte; White, Peter J; Zambon, Maria; Edmunds, William J; Nguyen-Van-Tam, Jonathan S; Hayward, Andrew C

    2018-01-01

    Estimates of health-related quality of life (HRQoL) and work/school absences for influenza are typically based on medically attended cases or those meeting influenza-like-illness (ILI) case definitions and thus biased towards severe disease. Although community influenza cases are more common, estimates of their effects on HRQoL and absences are limited. To measure quality-adjusted life days and years (QALDs and QALYs) lost and work/school absences among community cases of acute respiratory infections (ARI), ILI and influenza A and B and to estimate community burden of QALY loss and absences from influenza. Flu Watch was a community cohort in England from 2006 to 2011. Participants were followed up weekly. During respiratory illness, they prospectively recorded daily symptoms, work/school absences and EQ-5D-3L data and submitted nasal swabs for RT-PCR influenza testing. Average QALD lost was 0.26, 0.93, 1.61 and 1.84 for ARI, ILI, H1N1pdm09 and influenza B cases, respectively. 40% of influenza A cases and 24% of influenza B cases took time off work/school with an average duration of 3.6 and 2.4 days, respectively. In England, community influenza cases lost 24 300 QALYs in 2010/11 and had an estimated 2.9 million absences per season based on data from 2006/07 to 2009/10. Our QALDs and QALYs lost and work and school absence estimates are lower than previous estimates because we focus on community cases, most of which are mild, may not meet ILI definitions and do not result in healthcare consultations. Nevertheless, they contribute a substantial loss of HRQoL on a population level. © 2017 The Authors. Influenza and Other Respiratory Viruses. Published by John Wiley & Sons Ltd.

  16. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, J; Fan, J; Hu, W

    Purpose: To develop a fast automatic algorithm based on the two dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH) which can be employed for the investigation of radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of DVH in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimation of the conditional probability of the dose given the values of the predictive features. For the new patient, the prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimation of the DVH. The 2D KDE is implemented to predict the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organs at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate, are considered as the predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: A consistent result was found between these two DVHs for each cancer site, and the average of relative point-wise differences is about 5%, within the clinically acceptable extent. Conclusion: According to the results of this study, our method can be used to predict a clinically acceptable DVH and has the ability to evaluate the quality and consistency of treatment planning.
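
    A stripped-down version of the KDE-based DVH prediction described above is sketched below. It uses a single predictive feature (distance to the target boundary) instead of two, synthetic training data, and SciPy's Gaussian KDE, so it only illustrates the conditioning and marginalization steps, not the authors' implementation.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)

      # "Training" voxels pooled from previous plans: dose falls off with distance (synthetic)
      dist_train = rng.uniform(0.0, 5.0, 2000)                              # cm
      dose_train = 60.0 * np.exp(-0.6 * dist_train) + rng.normal(0, 3, 2000)  # Gy
      joint_kde = gaussian_kde(np.vstack([dist_train, dose_train]))         # p(feature, dose)

      # New patient: only the feature (distance) distribution of OAR voxels is known
      dist_new = rng.uniform(0.5, 4.0, 200)

      dose_grid = np.linspace(0.0, 70.0, 71)
      pmf = np.zeros_like(dose_grid)
      for d in dist_new:                                    # marginalize p(dose | feature)
          pts = np.vstack([np.full_like(dose_grid, d), dose_grid])
          cond = joint_kde(pts)                             # proportional to p(dose | d)
          pmf += cond / cond.sum()                          # normalize per voxel, then accumulate
      pmf /= len(dist_new)

      # Cumulative DVH: fraction of OAR volume receiving at least each dose level
      dvh = 1.0 - np.cumsum(pmf) + pmf
      v20 = float(np.interp(20.0, dose_grid, dvh))
      print("predicted V20 (fraction of volume receiving >= 20 Gy):", round(v20, 3))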

  17. dropEst: pipeline for accurate estimation of molecular counts in droplet-based single-cell RNA-seq experiments.

    PubMed

    Petukhov, Viktor; Guo, Jimin; Baryawno, Ninib; Severe, Nicolas; Scadden, David T; Samsonova, Maria G; Kharchenko, Peter V

    2018-06-19

    Recent single-cell RNA-seq protocols based on droplet microfluidics use massively multiplexed barcoding to enable simultaneous measurements of transcriptomes for thousands of individual cells. The increasing complexity of such data creates challenges for subsequent computational processing and troubleshooting of these experiments, with few software options currently available. Here, we describe a flexible pipeline for processing droplet-based transcriptome data that implements barcode corrections, classification of cell quality, and diagnostic information about the droplet libraries. We introduce advanced methods for correcting composition bias and sequencing errors affecting cellular and molecular barcodes to provide more accurate estimates of molecular counts in individual cells.

  18. JPL IGS Analysis Center Report, 2001-2003

    NASA Technical Reports Server (NTRS)

    Heflin, M. B.; Bar-Sever, Y. E.; Jefferson, D. C.; Meyer, R. F.; Newport, B. J.; Vigue-Rodi, Y.; Webb, F. H.; Zumberge, J. F.

    2004-01-01

    Three GPS orbit and clock products are currently provided by JPL for consideration by the IGS. Each differs in its latency and quality, with later results being more accurate. Results are typically available in both IGS and GIPSY formats via anonymous ftp. Current performance based on comparisons with the IGS final products is summarized. Orbit performance was determined by computing the 3D RMS difference between each JPL product and the IGS final orbits based on 15 minute estimates from the sp3 files. Clock performance was computed as the RMS difference after subtracting a linear trend based on 15 minute estimates from the sp3 files.
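
    The 3D RMS orbit comparison described above reduces to a short calculation; a sketch follows with made-up coordinates (a real comparison would read matched 15-minute epochs from the sp3 files).

      import numpy as np

      def orbit_3d_rms(product_xyz, reference_xyz):
          """3D RMS difference between two orbit products.

          Both inputs are arrays of shape (n_epochs, 3) holding positions [m]
          at common epochs, e.g. the 15-minute epochs of the sp3 files.
          """
          diff = np.asarray(product_xyz) - np.asarray(reference_xyz)
          return np.sqrt(np.mean(np.sum(diff**2, axis=1)))

      # Illustrative call with made-up positions for four epochs (metres)
      a = np.array([[1.0e7, 2.0e7, 5.0e6], [1.1e7, 1.9e7, 5.2e6],
                    [1.2e7, 1.8e7, 5.4e6], [1.3e7, 1.7e7, 5.6e6]])
      print(orbit_3d_rms(a, a + 0.03), "m")   # ~5 cm-level difference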

  19. Estimation of Thermal Sensation Based on Wrist Skin Temperatures.

    PubMed

    Sim, Soo Young; Koh, Myung Jun; Joo, Kwang Min; Noh, Seungwoo; Park, Sangyun; Kim, Youn Ho; Park, Kwang Suk

    2016-03-23

    Thermal comfort is an essential environmental factor related to quality of life and work effectiveness. We assessed the feasibility of wrist skin temperature monitoring for estimating subjective thermal sensation. We invented a wrist band that simultaneously monitors skin temperatures from the wrist (i.e., the radial artery and ulnar artery regions, and upper wrist) and the fingertip. Skin temperatures from eight healthy subjects were acquired while thermal sensation varied. To develop a thermal sensation estimation model, the mean skin temperature, temperature gradient, time differential of the temperatures, and average power of frequency band were calculated. A thermal sensation estimation model using temperatures of the fingertip and wrist showed the highest accuracy (mean root mean square error [RMSE]: 1.26 ± 0.31). An estimation model based on the three wrist skin temperatures showed a slightly better result than the model that used a single fingertip skin temperature (mean RMSE: 1.39 ± 0.18). When a personalized thermal sensation estimation model based on three wrist skin temperatures was used, the mean RMSE was 1.06 ± 0.29, and the correlation coefficient was 0.89. Thermal sensation estimation technology based on wrist skin temperatures, combined with wearable devices, may facilitate intelligent control of one's thermal environment.
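
    A hedged sketch of the estimation-model idea follows: a linear regression of a thermal sensation vote on wrist-temperature features, evaluated by hold-out RMSE. The features, synthetic data, and linear model are illustrative assumptions and do not reproduce the paper's model or accuracy figures.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(0)
      n = 400
      wrist_mean = rng.normal(33.0, 1.5, n)            # mean wrist skin temperature [deg C]
      grad_wrist_finger = rng.normal(1.0, 1.2, n)      # wrist-to-fingertip gradient [deg C]
      d_temp_dt = rng.normal(0.0, 0.05, n)             # time differential [deg C/min]
      X = np.column_stack([wrist_mean, grad_wrist_finger, d_temp_dt])

      # Synthetic "ground truth" votes on a -3 (cold) .. +3 (hot) scale
      y = np.clip(0.8 * (wrist_mean - 33.0) - 0.6 * grad_wrist_finger + 5.0 * d_temp_dt
                  + rng.normal(0, 0.7, n), -3, 3)

      model = LinearRegression().fit(X[:300], y[:300])   # train on the first 300 samples
      pred = model.predict(X[300:])                      # evaluate on the held-out 100
      rmse = np.sqrt(mean_squared_error(y[300:], pred))
      print("hold-out RMSE on the 7-point scale:", round(float(rmse), 2))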

  20. Estimation of Thermal Sensation Based on Wrist Skin Temperatures

    PubMed Central

    Sim, Soo Young; Koh, Myung Jun; Joo, Kwang Min; Noh, Seungwoo; Park, Sangyun; Kim, Youn Ho; Park, Kwang Suk

    2016-01-01

    Thermal comfort is an essential environmental factor related to quality of life and work effectiveness. We assessed the feasibility of wrist skin temperature monitoring for estimating subjective thermal sensation. We invented a wrist band that simultaneously monitors skin temperatures from the wrist (i.e., the radial artery and ulnar artery regions, and upper wrist) and the fingertip. Skin temperatures from eight healthy subjects were acquired while thermal sensation varied. To develop a thermal sensation estimation model, the mean skin temperature, temperature gradient, time differential of the temperatures, and average power of frequency band were calculated. A thermal sensation estimation model using temperatures of the fingertip and wrist showed the highest accuracy (mean root mean square error [RMSE]: 1.26 ± 0.31). An estimation model based on the three wrist skin temperatures showed a slightly better result than the model that used a single fingertip skin temperature (mean RMSE: 1.39 ± 0.18). When a personalized thermal sensation estimation model based on three wrist skin temperatures was used, the mean RMSE was 1.06 ± 0.29, and the correlation coefficient was 0.89. Thermal sensation estimation technology based on wrist skin temperatures, combined with wearable devices, may facilitate intelligent control of one’s thermal environment. PMID:27023538

  1. The quality estimation of exterior wall’s and window filling’s construction design

    NASA Astrophysics Data System (ADS)

    Saltykov, Ivan; Bovsunovskaya, Maria

    2017-10-01

    The article introduces the term “artificial envelope” for dwelling construction. The authors propose a complex multifactorial approach to estimating the design quality of external enclosing structures, based on the impact of various parameters. These parameters are functional, operational, cost, and environmental indices. The design quality index Qk is introduced as a complex characteristic of the above parameters. The mathematical dependence of this index on the parameters is the target function for design quality estimation. As an example, the article shows the search for the optimal variant of wall and window designs in small, medium, and large dwelling premises of economy-class buildings. Graphs of the individual parameters of the target function are presented for the three types of room dimensions. As a result of the example, window opening dimensions are chosen that make the wall and window constructions properly meet the stated complex requirements. The authors compare the window filling area recommended by building standards with the area obtained by finding the optimal variant of the design quality index. The multifactorial approach to optimal design search presented in this article can be applied to various construction elements of dwelling buildings, taking into account the climatic, social, and economic features of the construction area.

  2. Influence of scanning parameters on the estimation accuracy of control points of B-spline surfaces

    NASA Astrophysics Data System (ADS)

    Aichinger, Julia; Schwieger, Volker

    2018-04-01

    This contribution deals with the influence of scanning parameters like scanning distance, incidence angle, surface quality and sampling width on the average estimated standard deviations of the position of control points from B-spline surfaces, which are used to model surfaces from terrestrial laser scanning data. The influence of the scanning parameters is analyzed by Monte Carlo based variance analysis. Samples were generated for both non-correlated and correlated data, using the Latin hypercube and replicated Latin hypercube sampling algorithms, respectively. Finally, the investigations show that the most influential scanning parameter is the distance from the laser scanner to the object. The angle of incidence shows a significant effect for distances of 50 m and longer, while the surface quality contributes only negligible effects. The sampling width has no influence. Optimal scanning parameters are obtained at the smallest possible object distance, an angle of incidence close to 0°, and the highest surface quality. The consideration of correlations improves the estimation accuracy and underlines the importance of complete stochastic models for TLS measurements.
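
    For readers unfamiliar with Latin hypercube sampling, the sketch below builds a plain LHS design and runs it through an invented response function standing in for the control-point estimation; the parameter ranges, the response model, and the correlation-based sensitivity measure are illustrative assumptions, not the study's setup.

      import numpy as np

      def latin_hypercube(n_samples, n_dims, rng):
          """Plain Latin hypercube sample on the unit hypercube: each dimension is
          divided into n_samples equal strata and exactly one point falls in each
          stratum per dimension."""
          u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
          for d in range(n_dims):
              u[:, d] = rng.permutation(u[:, d])
          return u

      rng = np.random.default_rng(42)
      u = latin_hypercube(2000, 3, rng)
      distance = 5.0 + 95.0 * u[:, 0]            # 5 .. 100 m (assumed range)
      incidence = np.radians(80.0 * u[:, 1])     # 0 .. 80 degrees
      reflectivity = 0.2 + 0.7 * u[:, 2]         # surface quality stand-in

      # Toy model of the standard deviation of an estimated control-point position [m]
      sigma = (0.5e-3 + 1e-4 * distance
               + 1.5e-3 * (1.0 / np.cos(incidence) - 1.0)
               + 1e-3 * (1.0 - reflectivity))

      # Crude sensitivity measure: correlation of each input with the output
      for name, x in [("distance", distance), ("incidence", incidence), ("reflectivity", reflectivity)]:
          print(name, round(float(np.corrcoef(x, sigma)[0, 1]), 2))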

  3. Compulsive buying and quality of life: An estimate of the monetary cost of compulsive buying among adults in early midlife.

    PubMed

    Zhang, Chenshu; Brook, Judith S; Leukefeld, Carl G; De La Rosa, Mario; Brook, David W

    2017-06-01

    The aims of this study were to examine the associations between compulsive buying and quality of life and to estimate the monetary cost of compulsive buying for a cohort of men and women at mean age 43. Participants came from a community-based random sample of residents in two New York counties (N=548). The participants were followed from adolescence to early midlife. The mean age of participants at the most recent interview was 43.0 (SD=2.8). Fifty-five percent of the participants were females. Over 90% of the participants were white. Linear regression analyses showed that compulsive buying was significantly associated with quality of life, even after controlling for relevant demographic and psychosocial factors. The estimated monetary cost of compulsive buying for this cohort was significant. The fact that the monetary cost of CB is not trivial suggests that individuals are both consciously and unconsciously plagued by their CB. The findings are important for interventionists and clinicians for cost-effective intervention and treatment programs. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  4. Compulsive Buying and Quality of Life: An Estimate of the Monetary Cost of Compulsive Buying among Adults in Early Midlife

    PubMed Central

    Zhang, Chenshu; Brook, Judith S.; Leukefeld, Carl G.; De La Rosa, Mario; Brook, David W.

    2017-01-01

    The aims of this study were to examine the associations between compulsive buying and quality of life and to estimate the monetary cost of compulsive buying for a cohort of men and women at mean age 43. Participants came from a community-based random sample of residents in two New York counties (N=548). The participants were followed from adolescence to early midlife. The mean age of participants at the most recent interview was 43.0 (SD=2.8). Fifty-five percent of the participants were females. Over 90% of the participants were white. Linear regression analyses showed that compulsive buying was significantly associated with quality of life, even after controlling for relevant demographic and psychosocial factors. The estimated monetary cost of compulsive buying for this cohort was significant. The fact that the monetary cost of CB is not trivial suggests that individuals are both consciously and unconsciously plagued by their CB. The findings are important for interventionists and clinicians for cost-effective intervention and treatment programs. PMID:28285247

  5. Economic Benefits of Improved Water Quality: Public Perceptions of Option and Preservation Values

    NASA Astrophysics Data System (ADS)

    Bouwes, Nicolaas W., Sr.

    The primary objective of this book is to report the authors' research approach to the estimation of benefits of water quality improvements in the South Platte River of northeastern Colorado. Benefits included a “consumer surplus” from enhanced enjoyment of water-based recreation, an “option value” of assured choice of future recreation use, and a “preservation value” of the ecosystem and its bequest to future generations. Concepts such as preservation and option value benefits have often been mentioned but seldom estimated in natural resources research. The authors have met their objective by providing the reader with a detailed description of their research without being tedious.

  6. Document localization algorithms based on feature points and straight lines

    NASA Astrophysics Data System (ADS)

    Skoryukina, Natalya; Shemiakina, Julia; Arlazarov, Vladimir L.; Faradjev, Igor

    2018-04-01

    An important part of a system for analyzing planar rectangular objects is localization: the estimation of the projective transform from the template image of an object to its photograph. The system also includes such subsystems as the selection and recognition of text fields, the usage of contexts, etc. In this paper, three localization algorithms are described. All algorithms use feature points, and two of them also analyze near-horizontal and near-vertical lines on the photograph. The algorithms and their combinations are tested on a dataset of real document photographs. A method of localization quality estimation is also proposed that allows the localization subsystem to be configured independently of the quality of the other subsystems.
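
    A minimal sketch of the transform-estimation step (not the paper's specific algorithms): ORB keypoints matched between a template and a photograph, with the homography recovered by RANSAC. The file names are placeholders, and the line-based refinement described above is not shown.

    ```python
    # Minimal sketch: estimate the projective transform (homography) from a
    # document template to its photograph using feature points and RANSAC.
    import cv2
    import numpy as np

    template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)   # hypothetical paths
    photo = cv2.imread("photo.jpg", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=2000)
    kp_t, des_t = orb.detectAndCompute(template, None)
    kp_p, des_p = orb.detectAndCompute(photo, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_t, des_p), key=lambda m: m.distance)[:200]

    src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_p[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Template -> photograph transform; the inlier ratio can serve as a simple
    # localization-quality score.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=5.0)
    print("inlier ratio:", inliers.mean())
    ```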

  7. ERF1_2 -- Enhanced River Reach File 2.0

    USGS Publications Warehouse

    Nolan, Jacqueline V.; Brakebill, John W.; Alexander, Richard B.; Schwarz, Gregory E.

    2003-01-01

    The digital segmented network based on watershed boundaries, ERF1_2, includes enhancements to the U.S. Environmental Protection Agency's River Reach File 1 (RF1) (USEPA, 1996; DeWald and others, 1985) to support national and regional-scale surface water-quality modeling. Alexander and others (1999) developed ERF1, which assessed the hydrologic integrity of the digital reach traces and calculated the mean water time-of-travel in river reaches and reservoirs. ERF1_2 serves as the foundation for SPARROW (Spatially Referenced Regressions (of nutrient transport) on Watershed) modeling. Within the context of a Geographic Information System, SPARROW estimates the proportion of watersheds in the conterminous U.S. with outflow concentrations of several nutrients, including total nitrogen and total phosphorus, (Smith, R.A., Schwarz, G.E., and Alexander, R.B., 1997). This version of the network expands on ERF1 (Version 1.2; Alexander, et al., 1999) and includes the incremental and total drainage area derived from 1-kilometer (km) elevation data for North America. Previous estimates of the water time-of-travel were recomputed for reaches with water-quality monitoring sites that included two reaches. The mean flow and velocity estimates for these split reaches are based on previous estimation methods (Alexander et al., 1999) and are unchanged in ERF1_2. Drainage area calculations provide data used to estimate the contribution of a given nutrient to the outflow. Data estimates depend on the accuracy of node connectivity. Reaches split at water-quality or pesticide-monitoring sites indicate the source point for estimating the contribution and transport of nutrients and their loads throughout the watersheds. The ERF1_2 coverage extends the earlier drainage area founded on the 1-kilometer data for North America (Verdin, 1996; Verdin and Jenson, 1996). A 1-kilometer raster grid of ERF1_2 projected to Lambert Azimuthal Equal Area, NAD 27 Datum (Snyder, 1987), was merged with the HYDRO1K flow direction data set (Verdin and Jenson, 1996) to generate a DEM-based watershed grid, ERF1_2WS_LG. The watershed boundaries are maintained in a raster (grid cell) format as well as a vector (polygon) format for subsequent model analysis. Both the coverage, ERF1_2, and the grid, ERF1_2WS_LG, are available at: URL:http://water.usgs.gov/lookup/getspatial?erf1_2

  8. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  9. Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods.

    PubMed

    Vizcaíno, Iván P; Carrera, Enrique V; Muñoz-Romero, Sergio; Cumbal, Luis H; Rojo-Álvarez, José Luis

    2017-10-16

    Pollution on water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure to characterize complex dynamics phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information of the quality parameter measurements through Support Vector Regression (SVR) based on Mercer's kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was achieved by the SVR with Mercer's kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem.
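
    The kernel-based interpolation can be sketched with a precomputed Mercer kernel built from a Mahalanobis-type spatio-temporal distance. The covariance matrix, SVR hyperparameters, and synthetic (x, y, t) samples below are illustrative assumptions, not the study's configuration.

    ```python
    # Minimal sketch: SVR interpolation over (x, y, t) samples with a precomputed
    # Mahalanobis-type kernel (assumed covariance and synthetic data).
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)
    X = rng.uniform(0, 1, (120, 3))                       # (x, y, t) sample locations
    y = np.sin(4 * X[:, 0]) + 0.5 * X[:, 2] + rng.normal(0, 0.05, 120)   # a water quality variable

    S_inv = np.linalg.inv(np.diag([0.05, 0.05, 0.2]))     # assumed spatio-temporal covariance

    def mahalanobis_kernel(A, B):
        d = A[:, None, :] - B[None, :, :]
        d2 = np.einsum("ijk,kl,ijl->ij", d, S_inv, d)     # squared Mahalanobis distance
        return np.exp(-0.5 * d2)

    model = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
    model.fit(mahalanobis_kernel(X, X), y)

    X_grid = rng.uniform(0, 1, (5, 3))                    # points to interpolate
    print(model.predict(mahalanobis_kernel(X_grid, X)))
    ```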

  10. Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods

    PubMed Central

    Vizcaíno, Iván P.; Muñoz-Romero, Sergio; Cumbal, Luis H.

    2017-01-01

    Pollution on water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure to characterize complex dynamics phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information of the quality parameter measurements through Support Vector Regression (SVR) based on Mercer’s kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was achieved by the SVR with Mercer’s kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem. PMID:29035333

  11. The StreamCat Dataset: Accumulated Attributes for NHDPlusV2 Catchments (Version 2.1) for the Conterminous United States: Base Flow Index

    EPA Pesticide Factsheets

    This dataset represents the base flow index values within individual, local NHDPlusV2 catchments and upstream, contributing watersheds. Attributes of the landscape layer were calculated for every local NHDPlusV2 catchment and accumulated to provide watershed-level metrics. (See Supplementary Info for Glossary of Terms) The base-flow index (BFI) grid for the conterminous United States was developed to estimate (1) BFI values for ungaged streams, and (2) ground-water recharge throughout the conterminous United States (see Source_Information). Estimates of BFI values at ungaged streams and BFI-based ground-water recharge estimates are useful for interpreting relations between land use and water quality in surface and ground water. The bfi (%) was summarized by local catchment and by watershed to produce local catchment-level and watershed-level metrics as a continuous data type (see Data Structure and Attribute Information for a description).

  12. Estimating minimal important differences for several scales assessing function and quality of life in patients with attention-deficit/hyperactivity disorder.

    PubMed

    Hodgkins, Paul; Lloyd, Andrew; Erder, M Haim; Setyawan, Juliana; Weiss, Margaret D; Sasané, Rahul; Nafees, Beenish

    2017-02-01

    Defining minimal important difference (MID) is critical to interpreting patient-reported outcomes data and treatment efficacy in clinical trials. This study estimates the MID for the Weiss Functional Impairment Rating Scale-Parent Report (WFIRS-P) and the Child Health and Illness Profile-Parent Report (CHIP-CE-PRF76) among parents of young people with attention-deficit/hyperactivity disorder (ADHD) in the UK. Parents of children (6-12 years; n=100) and adolescents (13-17 years; n=117) with ADHD completed a socio-demographic form, the CHIP-CE-PRF76, the WFIRS-P, and the Pediatric Quality of Life scale at baseline and 4 weeks later. At follow-up, a subset of parents completed anchor questions measuring change in the child/adolescent from baseline. MIDs were estimated using anchor-based and distribution-based methods, and separately for children and adolescents. The MID estimates for overall change in the WFIRS-P total score ranged from 11.31 (standard error of measurement) to 13.47 (anchor) for the total sample. The range of MID estimates for the CHIP-CE-PRF76 varied by domain: 6.80-7.41 (satisfaction), 6.18-7.34 (comfort), 5.60-6.72 (resilience), 6.06-7.57 (risk avoidance), and 4.00-5.63 (achievement) for the total sample. Overall, MID estimates for WFIRS-P MID and CHIP-CE-PRF76 were slightly higher for adolescents than for children. This study estimated MIDs for these instruments using several methods. The observed convergence of the MID estimates increases confidence in their reliability and could assist clinicians and decision makers in deriving meaningful interpretations of observed changes in the WFIRS-P and CHIP-CE in clinical trials and practice.

  13. Variability of pesticide detections and concentrations in field replicate water samples collected for the National Water-Quality Assessment Program, 1992-97

    USGS Publications Warehouse

    Martin, Jeffrey D.

    2002-01-01

    Correlation analysis indicates that for most pesticides and concentrations, pooled estimates of relative standard deviation rather than pooled estimates of standard deviation should be used to estimate variability because pooled estimates of relative standard deviation are less affected by heteroscedasticity. The median pooled relative standard deviation was calculated for all pesticides to summarize the typical variability for pesticide data collected for the NAWQA Program. The median pooled relative standard deviation was 15 percent at concentrations less than 0.01 micrograms per liter (µg/L), 13 percent at concentrations near 0.01 µg/L, 12 percent at concentrations near 0.1 µg/L, 7.9 percent at concentrations near 1 µg/L, and 2.7 percent at concentrations greater than 5 µg/L. Pooled estimates of standard deviation or relative standard deviation presented in this report are larger than estimates based on averages, medians, smooths, or regression of the individual measurements of standard deviation or relative standard deviation from field replicates. Pooled estimates, however, are the preferred method for characterizing variability because they provide unbiased estimates of the variability of the population. Assessments of variability based on standard deviation (rather than variance) underestimate the true variability of the population. Because pooled estimates of variability are larger than estimates based on other approaches, users of estimates of variability must be cognizant of the approach used to obtain the estimate and must use caution in the comparison of estimates based on different approaches.
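
    The pooling arithmetic behind these summaries can be illustrated as follows: the pooled relative standard deviation is the square root of the degrees-of-freedom-weighted mean of the squared per-replicate-set RSDs. The replicate concentrations below are made-up values, not NAWQA data.

    ```python
    # Minimal sketch of pooling replicate variability (illustrative data only).
    import numpy as np

    replicate_sets = [np.array([0.012, 0.014]),          # ug/L, one set per site visit
                      np.array([0.105, 0.118, 0.110]),
                      np.array([1.02, 0.95])]

    num, den = 0.0, 0
    for c in replicate_sets:
        rsd = c.std(ddof=1) / c.mean()                   # relative standard deviation
        dof = len(c) - 1                                 # degrees of freedom of this set
        num += dof * rsd ** 2
        den += dof

    pooled_rsd = np.sqrt(num / den)
    print(f"pooled RSD: {100 * pooled_rsd:.1f} %")
    ```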

  14. A novel, fuzzy-based air quality index (FAQI) for air quality assessment

    NASA Astrophysics Data System (ADS)

    Sowlat, Mohammad Hossein; Gharibi, Hamed; Yunesian, Masud; Tayefeh Mahmoudi, Maryam; Lotfi, Saeedeh

    2011-04-01

    The ever-increasing level of air pollution in most areas of the world has led to the development of a variety of air quality indices for estimation of the health effects of air pollution, though the indices have their own limitations such as high levels of subjectivity. The present study, therefore, aimed at developing a novel, fuzzy-based air quality index (FAQI) to handle such limitations. The index developed in the present study is based on fuzzy logic, which is considered one of the most common computational methods of artificial intelligence. In addition to criteria air pollutants (i.e., CO, SO2, PM10, O3, NO2), benzene, toluene, ethylbenzene, xylene, and 1,3-butadiene were also taken into account in the proposed index, because of their considerable health effects. Different weighting factors were then assigned to each pollutant according to its priority. Trapezoidal membership functions were employed for classification, and the final index consisted of 72 inference rules. To assess the performance of the index, a case study was carried out employing air quality data at five different sampling stations in Tehran, Iran, from January 2008 to December 2009, the results of which were then compared to the results obtained from the USEPA air quality index (AQI). According to the results of the present study, the fuzzy-based air quality index is a comprehensive tool for classification of air quality and tends to produce accurate results. Therefore, it can be considered useful, reliable, and suitable for consideration by local authorities in air quality assessment and management schemes.
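
    The trapezoidal-membership building block can be sketched as below. The breakpoints, pollutant weights, and class labels are illustrative assumptions; the actual FAQI aggregates ten pollutants through 72 inference rules.

    ```python
    # Minimal sketch (assumed breakpoints and weights): trapezoidal membership
    # functions and a weighted aggregation of per-pollutant memberships into
    # fuzzy air-quality classes.
    import numpy as np

    def trapmf(x, a, b, c, d):
        """Trapezoid: 0 below a, rising on [a, b], 1 on [b, c], falling on [c, d]."""
        return float(np.clip(min((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0))

    classes = {                       # (a, b, c, d) per pollutant, ug/m3 (assumed)
        "good":      {"PM10": (-1, 0, 40, 60),     "NO2": (-1, 0, 30, 50)},
        "moderate":  {"PM10": (40, 60, 90, 120),   "NO2": (30, 50, 80, 110)},
        "unhealthy": {"PM10": (90, 120, 500, 600), "NO2": (80, 110, 400, 500)},
    }
    weights = {"PM10": 0.6, "NO2": 0.4}             # assumed pollutant priorities
    reading = {"PM10": 75.0, "NO2": 42.0}           # one station-hour of measurements

    for label, mfs in classes.items():
        mu = sum(weights[p] * trapmf(reading[p], *mfs[p]) for p in reading)
        print(f"{label:>9s}: weighted membership {mu:.2f}")
    ```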

  15. Validation and evaluation of model-based crosstalk compensation method in simultaneous 99mTc stress and 201Tl rest myocardial perfusion SPECT

    NASA Astrophysics Data System (ADS)

    Song, X.; Frey, E. C.; Wang, W. T.; Du, Y.; Tsui, B. M. W.

    2004-02-01

    Simultaneous acquisition of 99mTc stress and 201Tl rest myocardial perfusion SPECT has several potential advantages, but the image quality is degraded by crosstalk between the Tc and Tl data. We have previously developed a crosstalk model that includes estimates of the downscatter and Pb X-ray for use in crosstalk compensation. In this work, we validated the model by comparing the crosstalk from 99mTc to the Tl window calculated using a combination of the SimSET-MCNP Monte Carlo simulation codes. We also evaluated the model-based crosstalk compensation method using both simulated data from the 3-D MCAT phantom and experimental data from a physical phantom with a myocardial defect. In these studies, the Tl distributions were reconstructed from crosstalk contaminated data without crosstalk compensation, with compensation using the model-based crosstalk estimate, and with compensation using the known true crosstalk, and were compared with the Tl distribution reconstructed from uncontaminated Tl data. Results show that the model gave good estimates of both the downscatter photons and Pb X-rays in simultaneous dual-isotope myocardial perfusion SPECT. The model-based compensation method provided image quality that was significantly improved as compared to no compensation and was very close to that from the separate acquisition.

  16. New Directions: Questions surrounding suspended particle mass used as a surrogate for air quality and for regulatory control of ambient urban air pollution

    NASA Astrophysics Data System (ADS)

    Hoare, John L.

    2014-07-01

    The original choice of particulate matter mass (PM) as a realistic surrogate for gross air pollution has gradually evolved into routine use nowadays of epidemiologically-based estimates of the monetary and other benefits expected from regulating urban air quality. Unfortunately, the statistical associations facilitating such calculations usually are based on single indices of air pollution whereas the health effects themselves are more broadly based causally. For this and other reasons the economic benefits of control tend to be exaggerated. Primarily because of their assumed inherently inferior respirability, particles ≥10 μm are generally excluded from such considerations. Where the particles themselves are chemically heterogeneous, as in an urban context, this may be inappropriate. Clearly all air-borne particles, whether coarse or fine, are susceptible to inhalation. Hence, the possibility exists for any adhering potentially harmful semi-volatile substances to be subsequently de-sorbed in vivo thereby facilitating their transport deeper into the lungs. Consequently, this alone may be a sufficient reason for including rather than rejecting during air quality monitoring the relatively coarse 10-100 μm particle fraction, ideally in conjunction with routine estimation of the gaseous co-pollutants thereby facilitating a multi-pollutant approach apropos regulation.

  17. Increasing precision of turbidity-based suspended sediment concentration and load estimates.

    PubMed

    Jastram, John D; Zipper, Carl E; Zelazny, Lucian W; Hyer, Kenneth E

    2010-01-01

    Turbidity is an effective tool for estimating and monitoring suspended sediments in aquatic systems. Turbidity can be measured in situ remotely and at fine temporal scales as a surrogate for suspended sediment concentration (SSC), providing opportunity for a more complete record of SSC than is possible with physical sampling approaches. However, there is variability in turbidity-based SSC estimates and in sediment loadings calculated from those estimates. This study investigated the potential to improve turbidity-based SSC, and by extension the resulting sediment loading estimates, by incorporating hydrologic variables that can be monitored remotely and continuously (typically 15-min intervals) into the SSC estimation procedure. On the Roanoke River in southwestern Virginia, hydrologic stage, turbidity, and other water-quality parameters were monitored with in situ instrumentation; suspended sediments were sampled manually during elevated turbidity events; samples were analyzed for SSC and physical properties including particle-size distribution and organic C content; and rainfall was quantified by geologic source area. The study identified physical properties of the suspended-sediment samples that contribute to SSC estimation variance and hydrologic variables that explained variability of those physical properties. Results indicated that the inclusion of any of the measured physical properties in turbidity-based SSC estimation models reduces unexplained variance. Further, the use of hydrologic variables to represent these physical properties, along with turbidity, resulted in a model, relying solely on data collected remotely and continuously, that estimated SSC with less variance than a conventional turbidity-based univariate model, allowing a more precise estimate of sediment loading. Modeling results are consistent with known mechanisms governing sediment transport in hydrologic systems.
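
    The move from a univariate to a multivariate rating model can be illustrated with a simple log-log regression; the coefficients, the choice of stage as the added variable, and the synthetic data below are assumptions, not the Roanoke River model.

    ```python
    # Minimal sketch: compare a turbidity-only SSC rating model with one that also
    # uses a continuously monitored hydrologic variable (stage), in log space.
    import numpy as np

    rng = np.random.default_rng(3)
    turbidity = rng.lognormal(3.0, 0.8, 200)                 # NTU
    stage = rng.lognormal(0.5, 0.3, 200)                     # m
    ssc = np.exp(0.2 + 0.9 * np.log(turbidity) + 0.4 * np.log(stage)
                 + rng.normal(0, 0.15, 200))                 # mg/L, synthetic "truth"

    X1 = np.c_[np.ones(200), np.log(turbidity)]              # univariate model
    X2 = np.c_[X1, np.log(stage)]                            # multivariate model
    for name, X in [("turbidity only", X1), ("turbidity + stage", X2)]:
        beta, *_ = np.linalg.lstsq(X, np.log(ssc), rcond=None)
        resid_var = np.var(np.log(ssc) - X @ beta)
        print(f"{name:>18s}: residual variance {resid_var:.4f}")
    ```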

  18. PROGRAM PARAMS USERS GUIDE

    EPA Science Inventory

    PARAMS is a Windows-based computer program that implements 30 methods for estimating the parameters in indoor emissions source models, which are an essential component of indoor air quality (IAQ) and exposure models. These methods fall into eight categories: (1) the properties o...

  19. Prediction of ground water quality index to assess suitability for drinking purposes using fuzzy rule-based approach

    NASA Astrophysics Data System (ADS)

    Gorai, A. K.; Hasni, S. A.; Iqbal, Jawed

    2016-11-01

    Groundwater is the most important natural source of drinking water for many people around the world, especially in rural areas where a supply of treated water is not available. Drinking water resources cannot be optimally used and sustained unless the quality of water is properly assessed. To this end, an attempt has been made to develop a suitable methodology for the assessment of drinking water quality on the basis of 11 physico-chemical parameters. The present study aims to select the fuzzy aggregation approach for estimation of the water quality index of a sample to check its suitability for drinking purposes. Based on experts' opinions and the authors' judgement, 11 water quality (pollutant) variables (Alkalinity, Dissolved Solids (DS), Hardness, pH, Ca, Mg, Fe, Fluoride, As, Sulphate, Nitrates) were selected for the quality assessment. The results of the proposed methodology are compared with the output of the widely used deterministic method (weighted arithmetic mean aggregation) to verify the suitability of the developed methodology.
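
    The deterministic baseline mentioned above, a weighted arithmetic mean aggregation, can be sketched as follows. The parameter subset, standards, weights, and sample values are illustrative assumptions only.

    ```python
    # Minimal sketch of a weighted arithmetic mean water quality index: each
    # parameter's sub-index is its measured value scaled by its drinking-water
    # standard, and weights are taken inversely proportional to the standards.
    standards = {"pH": 8.5, "TDS": 500.0, "Fluoride": 1.5, "Nitrate": 45.0}   # mg/L except pH
    weights = {p: 1.0 / s for p, s in standards.items()}                      # assumed weighting rule
    sample = {"pH": 7.8, "TDS": 640.0, "Fluoride": 1.1, "Nitrate": 30.0}      # one groundwater sample

    wqi = (sum(weights[p] * 100.0 * sample[p] / standards[p] for p in sample)
           / sum(weights.values()))
    print(f"WQI = {wqi:.1f}  (values below 100 stay within the standards under this scaling)")
    ```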

  20. Image and Video Quality Assessment Using LCD: Comparisons with CRT Conditions

    NASA Astrophysics Data System (ADS)

    Tourancheau, Sylvain; Callet, Patrick Le; Barba, Dominique

    In this paper, the impact of display on quality assessment is addressed. Subjective quality assessment experiments have been performed on both LCD and CRT displays. Two sets of still images and two sets of moving pictures have been assessed using either an ACR or a SAMVIQ protocol. Altogether, eight experiments have been conducted. Results are presented and discussed, and some differences are pointed out. Concerning moving pictures, these differences seem to be mainly due to LCD motion artefacts such as motion blur. LCD motion blur has been measured objectively and with psycho-physics experiments. A motion-blur metric based on the temporal characteristics of LCD can be defined. A prediction model has then been designed that predicts the differences in perceived quality between CRT and LCD. This motion-blur-based model enables the estimation of perceived quality on LCD with respect to the perceived quality on CRT. Technical solutions to LCD motion blur can thus be evaluated on natural contents by this means.

  1. Developing a probability-based model of aquifer vulnerability in an agricultural region

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence the regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of six parameters in the probability-based DRASTIC model were probabilistically characterized according to the parameter classification methods of selecting a maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave us an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the pollution prediction capacity of the developed probability-based DRASTIC model, medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, which indicates anthropogenic groundwater pollution. The analyzed results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.
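
    The deterministic core that the probabilistic treatment builds on is a weight-times-rating sum. The sketch below uses the conventional seven DRASTIC factors and made-up ratings for one grid cell; the study itself works with six probabilistically characterized parameters.

    ```python
    # Minimal sketch of the DRASTIC vulnerability index: a weighted sum of factor
    # ratings. Weights follow the conventional scheme; ratings are illustrative.
    weights = {"Depth to water": 5, "net Recharge": 4, "Aquifer media": 3,
               "Soil media": 2, "Topography": 1, "Impact of vadose zone": 5,
               "hydraulic Conductivity": 3}
    ratings = {"Depth to water": 7, "net Recharge": 6, "Aquifer media": 8,
               "Soil media": 6, "Topography": 10, "Impact of vadose zone": 8,
               "hydraulic Conductivity": 4}        # made-up ratings for one cell

    drastic_index = sum(weights[f] * ratings[f] for f in weights)
    print("DRASTIC vulnerability index:", drastic_index)   # higher = more vulnerable
    ```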

  2. A Portuguese value set for the SF-6D.

    PubMed

    Ferreira, Lara N; Ferreira, Pedro L; Pereira, Luis N; Brazier, John; Rowen, Donna

    2010-08-01

    The SF-6D is a preference-based measure of health derived from the SF-36 that can be used for cost-effectiveness analysis using cost-per-quality-adjusted-life-year analysis. This study seeks to estimate a set of system weights for the SF-6D for Portugal and to compare the results with the UK system weights. A sample of 55 health states defined by the SF-6D has been valued by a representative random sample of the Portuguese population, stratified by sex and age (n = 140), using the Standard Gamble (SG). Several models are estimated at both the individual and aggregate levels for predicting health-state valuations. Models with main effects, with interaction effects and with the constant forced to unity are presented. Random effects (RE) models are estimated using generalized least squares (GLS) regressions. Generalized estimation equations (GEE) are used to estimate RE models with the constant forced to unity. Estimations at the individual level were performed using 630 health-state valuations. Alternative functional forms are considered to account for the skewed distribution of health-state valuations. The models are analyzed in terms of their coefficients, overall fit, and their ability to predict the SG values. The RE models estimated using GLS and through GEE produce significant coefficients, which are robust across model specification. However, there are concerns regarding some inconsistent estimates, and so parsimonious consistent models were estimated. There is evidence of underprediction in some states assigned to poor health. The results are consistent with the UK results. The models estimated provide preference-based quality of life weights for the Portuguese population when health status data have been collected using the SF-36. Although the sample was randomly drawn, the findings should be treated with caution given the small sample size, even though the models have been estimated at the individual level.

  3. [Aquatic Ecological Index based on freshwater (ICE(RN-MAE)) for the Rio Negro watershed, Colombia].

    PubMed

    Forero, Laura Cristina; Longo, Magnolia; John Jairo, Ramirez; Guillermo, Chalar

    2014-04-01

    Aquatic Ecological Index based on freshwater macroinvertebrates (ICE(RN-MAE)) for the Rio Negro watershed, Colombia. Available indices to assess the ecological status of rivers in Colombia are mostly based on subjective hypotheses about macroinvertebrate tolerance to pollution, which have important limitations. Here we present the application of a method to establish an index of ecological quality for lotic systems in Colombia. The index, based on macroinvertebrate abundance and physicochemical variables, was developed as an alternative to the BMWP-Col index. The method consists of determining an environmental gradient from correlations between physicochemical variables and abundance. The scores obtained at each sampling point are used, after standardization, in a weighted averaging (WA) model. In the WA model, abundances are weighted to estimate the optimum and tolerance values of each taxon; using this information, we estimated the macroinvertebrate-abundance-based index of ecological quality (ICE(RN-MAE)) at each sampling site. Subsequently, we classified all sites using the index and concentrations of total phosphorus (TP) in a cluster analysis. Using the mean, maximum, minimum and standard deviation of TP and ICE(RN-MAE), we defined threshold values corresponding to three categories of ecological status: good, fair and critical.
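
    The weighted averaging step can be illustrated directly: taxon optima and tolerances are abundance-weighted means and standard deviations along the gradient, and a site score is the abundance-weighted mean of the optima of the taxa present. The abundances and gradient values below are made-up illustrations.

    ```python
    # Minimal sketch of weighted averaging (WA) regression and calibration with
    # made-up macroinvertebrate abundances and an arbitrary environmental gradient.
    import numpy as np

    gradient = np.array([2.1, 3.5, 5.0, 6.8, 8.2])          # calibration-site gradient scores
    abundance = np.array([[30, 10,  2,  0,  0],              # rows: taxa, cols: sites
                          [ 5, 20, 25, 10,  1],
                          [ 0,  2, 10, 30, 40]])

    # WA regression: abundance-weighted optimum and tolerance per taxon
    optima = abundance @ gradient / abundance.sum(axis=1)
    dev2 = (gradient[None, :] - optima[:, None]) ** 2
    tolerance = np.sqrt((abundance * dev2).sum(axis=1) / abundance.sum(axis=1))

    # WA calibration: infer a new site's index score from its taxa abundances
    new_site = np.array([1, 15, 8])
    site_score = new_site @ optima / new_site.sum()
    print("optima:", optima.round(2), "tolerances:", tolerance.round(2))
    print("inferred site score:", round(site_score, 2))
    ```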

  4. Optimal Non-Invasive Fault Classification Model for Packaged Ceramic Tile Quality Monitoring Using MMW Imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Smriti; Singh, Dharmendra

    2016-04-01

    Millimeter wave (MMW) frequency has emerged as an efficient tool for different stand-off imaging applications. In this paper, we have dealt with a novel MMW imaging application, i.e., non-invasive packaged goods quality estimation for industrial quality monitoring applications. An active MMW imaging radar operating at 60 GHz has been ingeniously designed for concealed fault estimation. Ceramic tiles covered with commonly used packaging cardboard were used as concealed targets for undercover fault classification. A comparison of computer vision-based state-of-the-art feature extraction techniques, viz, discrete Fourier transform (DFT), wavelet transform (WT), principal component analysis (PCA), gray level co-occurrence texture (GLCM), and histogram of oriented gradient (HOG) has been done with respect to their efficient and differentiable feature vector generation capability for undercover target fault classification. An extensive number of experiments were performed with different ceramic tile fault configurations, viz., vertical crack, horizontal crack, random crack, diagonal crack along with the non-faulty tiles. Further, an independent algorithm validation was done demonstrating classification accuracy: 80, 86.67, 73.33, and 93.33 % for DFT, WT, PCA, GLCM, and HOG feature-based artificial neural network (ANN) classifier models, respectively. Classification results show good capability for HOG feature extraction technique towards non-destructive quality inspection with appreciably low false alarm as compared to other techniques. Thereby, a robust and optimal image feature-based neural network classification model has been proposed for non-invasive, automatic fault monitoring for a financially and commercially competent industrial growth.
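
    A minimal sketch of the best-performing pipeline, HOG features feeding a small neural-network classifier, is shown below. The synthetic "tile" images, HOG settings, and network size are assumptions for illustration; the study uses 60 GHz radar images and reports its own architecture.

    ```python
    # Minimal sketch: HOG feature extraction plus an MLP classifier on synthetic
    # faulty/non-faulty tile images (illustrative stand-ins for MMW radar data).
    import numpy as np
    from skimage.feature import hog
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)

    def fake_tile(cracked):
        img = rng.normal(0.5, 0.05, (64, 64))
        if cracked:                                  # draw a crude vertical "crack"
            img[:, 30:33] -= 0.4
        return np.clip(img, 0, 1)

    labels = np.array([0] * 60 + [1] * 60)
    images = [fake_tile(c) for c in labels]
    X = np.array([hog(im, orientations=9, pixels_per_cell=(8, 8),
                      cells_per_block=(2, 2)) for im in images])

    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    clf.fit(X_tr, y_tr)
    print("validation accuracy:", clf.score(X_te, y_te))
    ```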

  5. Fully iterative scatter corrected digital breast tomosynthesis using GPU-based fast Monte Carlo simulation and composition ratio update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kyungsang; Ye, Jong Chul, E-mail: jong.ye@kaist.ac.kr; Lee, Taewon

    2015-09-15

    Purpose: In digital breast tomosynthesis (DBT), scatter correction is highly desirable, as it improves image quality at low doses. Because the DBT detector panel is typically stationary during the source rotation, antiscatter grids are not generally compatible with DBT; thus, a software-based scatter correction is required. This work proposes a fully iterative scatter correction method that uses a novel fast Monte Carlo simulation (MCS) with a tissue-composition ratio estimation technique for DBT imaging. Methods: To apply MCS to scatter estimation, the material composition in each voxel should be known. To overcome the lack of prior accurate knowledge of tissue composition for DBT, a tissue-composition ratio is estimated based on the observation that the breast tissues are principally composed of adipose and glandular tissues. Using this approximation, the composition ratio can be estimated from the reconstructed attenuation coefficients, and the scatter distribution can then be estimated by MCS using the composition ratio. The scatter estimation and image reconstruction procedures can be performed iteratively until an acceptable accuracy is achieved. For practical use, (i) the authors have implemented a fast MCS using a graphics processing unit (GPU), (ii) the MCS is simplified to transport only x-rays in the energy range of 10–50 keV, modeling Rayleigh and Compton scattering and the photoelectric effect using the tissue-composition ratio of adipose and glandular tissues, and (iii) downsampling is used because the scatter distribution varies rather smoothly. Results: The authors have demonstrated that the proposed method can accurately estimate the scatter distribution, and that the contrast-to-noise ratio of the final reconstructed image is significantly improved. The authors validated the performance of the MCS by changing the tissue thickness, composition ratio, and x-ray energy. The authors confirmed that the tissue-composition ratio estimation was quite accurate under a variety of conditions. Our GPU-based fast MCS implementation took approximately 3 s to generate each angular projection for a 6 cm thick breast, which is believed to make this process acceptable for clinical applications. In addition, the clinical preferences of three radiologists were evaluated; the preference for the proposed method compared to the preference for the convolution-based method was statistically meaningful (p < 0.05, McNemar test). Conclusions: The proposed fully iterative scatter correction method and the GPU-based fast MCS using tissue-composition ratio estimation successfully improved the image quality within a reasonable computational time, which may potentially increase the clinical utility of DBT.

  6. Spring cleaning: rural water impacts, valuation, and property rights institutions.

    PubMed

    Kremer, Michael; Leino, Jessica; Miguel, Edward; Zwane, Alix Peterson

    2011-01-01

    Using a randomized evaluation in Kenya, we measure health impacts of spring protection, an investment that improves source water quality. We also estimate households' valuation of spring protection and simulate the welfare impacts of alternatives to the current system of common property rights in water, which limits incentives for private investment. Spring infrastructure investments reduce fecal contamination by 66%, but household water quality improves less, due to recontamination. Child diarrhea falls by one quarter. Travel-cost based revealed preference estimates of households' valuations are much smaller than both stated preference valuations and health planners' valuations, and are consistent with models in which the demand for health is highly income elastic. We estimate that private property norms would generate little additional investment while imposing large static costs due to above-marginal-cost pricing; that private property would function better at higher income levels or under water scarcity; and that alternative institutions could yield Pareto improvements.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riihimaki, L.; McFarlane, S.; Sivaraman, C.

    The ndrop_mfrsr value-added product (VAP) provides an estimate of the cloud droplet number concentration of overcast water clouds retrieved from cloud optical depth from the multi-filter rotating shadowband radiometer (MFRSR) instrument and liquid water path (LWP) retrieved from the microwave radiometer (MWR). When cloud layer information is available from vertically pointing lidar and radars in the Active Remote Sensing of Clouds (ARSCL) product, the VAP also provides estimates of the adiabatic LWP and an adiabatic parameter (beta) that indicates how divergent the LWP is from the adiabatic case. Quality control (QC) flags (qc_drop_number_conc), an uncertainty estimate (drop_number_conc_toterr), and a cloud layer type flag (cloud_base_type) are useful indicators of the quality and accuracy of any given value of the retrieval. Examples of these major input and output variables are given in sample plots in section 6.0.

  8. An Updated TRMM Composite Climatology of Tropical Rainfall and Its Validation

    NASA Technical Reports Server (NTRS)

    Wang, Jian-Jian; Adler, Robert F.; Huffman, George; Bolvin, David

    2013-01-01

    An updated 15-yr Tropical Rainfall Measuring Mission (TRMM) composite climatology (TCC) is presented and evaluated. This climatology is based on a combination of individual rainfall estimates made with data from the primary TRMM instruments: the TRMM Microwave Imager (TMI) and the precipitation radar (PR). This combination climatology of passive microwave retrievals, radar-based retrievals, and an algorithm using both instruments simultaneously provides a consensus TRMM-based estimate of mean precipitation. The dispersion of the three estimates, as indicated by the standard deviation sigma among the estimates, is presented as a measure of confidence in the final estimate and as an estimate of the uncertainty thereof. The procedures utilized by the compositing technique, including adjustments and quality-control measures, are described. The results give a mean value of the TCC of 4.3 mm day^-1 for the deep tropical ocean belt between 10 deg N and 10 deg S, with lower values outside that band. In general, the TCC values confirm ocean estimates from the Global Precipitation Climatology Project (GPCP) analysis, which is based on passive microwave results adjusted for sampling by infrared-based estimates. The pattern of uncertainty estimates shown by sigma is seen to be useful to indicate variations in confidence. Examples include differences between the eastern and western portions of the Pacific Ocean and high values in coastal and mountainous areas. Comparison of the TCC values (and the input products) to gauge analyses over land indicates the value of the radar-based estimates (small biases) and the limitations of the passive microwave algorithm (relatively large biases). Comparison with surface gauge information from western Pacific Ocean atolls shows a negative bias (16%) for all the TRMM products, although the representativeness of the atoll gauges of open-ocean rainfall is still in question.

  9. Air quality and human health impacts of grasslands and shrublands in the United States

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Varsha; Hirabayashi, Satoshi; Ziv, Guy; Bakshi, Bhavik R.

    2018-06-01

    Vegetation including canopy, grasslands, and shrublands can directly sequester pollutants onto the plant surface, resulting in an improvement in air quality. Until now, several studies have estimated the pollution removal capacity of canopy cover at the level of a county, but no such work exists for grasslands and shrublands. This work quantifies the air pollution removal capacity of grasslands and shrublands at the county level in the United States and estimates the human health benefits associated with pollution removal using the i-Tree Eco model. Sequestration of pollutants is estimated based on the Leaf Area Index (LAI) obtained from a Moderate Resolution Imaging Spectroradiometer (MODIS)-derived dataset and the percentage land cover obtained from the National Land Cover Database (NLCD) for the year 2010. Calculation of pollution removal capacity using local environmental data indicates that grasslands and shrublands remove a total of 6.42 million tonnes of air pollutants in the United States, and the associated monetary benefits total $268 million. Human health impacts and the associated monetary value due to pollution removal were observed to be significantly higher in urban areas, indicating that grasslands and shrublands are as critical as canopy cover in improving air quality and human health in urban regions.

  10. Influence of agricultural activities, forest fires and agro-industries on air quality in Thailand.

    PubMed

    Phairuang, Worradorn; Hata, Mitsuhiko; Furuuchi, Masami

    2017-02-01

    Annual and monthly-based emission inventories in northern, central and north-eastern provinces in Thailand, where agriculture and related agro-industries are very intensive, were estimated to evaluate the contribution of agricultural activity (including crop residue burning), forest fires and related agro-industries to air quality monitored in the corresponding provinces. The monthly-based emission inventories of air pollutants, i.e., particulate matter (PM), NOx and SO2, for various agricultural crops were estimated based on information on the level of production of typical crops (rice, corn, sugarcane, cassava, soybeans and potatoes) using emission factors and other parameters related to country-specific values, taking into account crop type and the local residue burning period. The estimated monthly emission inventory was compared with air monitoring data obtained at monitoring stations operated by the Pollution Control Department, Thailand (PCD) for validating the estimated emission inventory. The agro-industry that has the greatest impact on the regions being evaluated is the sugar processing industry, which uses sugarcane as a raw material and its residue as fuel for the boiler. The backward trajectory analysis of the air mass arriving at the PCD station was calculated to confirm this influence. For the provinces being evaluated, which are located in the upper northern, lower northern and north-eastern parts of Thailand, agricultural activities and forest fires were shown to be closely correlated with the ambient PM concentration, while their contribution to the production of gaseous pollutants is much less. Copyright © 2016. Published by Elsevier B.V.

  11. Linking data sources for measurement of effective coverage in maternal and newborn health: what do we learn from individual- vs ecological-linking methods?

    PubMed

    Willey, Barbara; Waiswa, Peter; Kajjo, Darious; Munos, Melinda; Akuze, Joseph; Allen, Elizabeth; Marchant, Tanya

    2018-06-01

    Improving maternal and newborn health requires improvements in the quality of facility-based care. This is challenging to measure: routine data may be unreliable; respondents in population surveys may be unable to accurately report on quality indicators; and facility assessments lack population-level denominators. We explored methods for linking access to skilled birth attendance (SBA) from household surveys to data on provision of care from facility surveys with the aim of estimating population-level effective coverage reflecting access to quality care. We used data from Mayuge District, Uganda. Data from household surveys on access to SBA were linked to health facility assessment census data on readiness to provide basic emergency obstetric and newborn care (BEmONC) in the same district. One individual- and two ecological-linking methods were applied. All methods used household survey reports on where care at birth was accessed. The individual-linking method linked this to data about facility readiness from the specific facility where each woman delivered. The first ecological-linking approach used a district-wide mean estimate of facility readiness. The second used an estimate of facility readiness adjusted by level of health facility accessed. Absolute differences between estimates derived from the different linking methods were calculated, and agreement examined using Lin's concordance correlation coefficient. A total of 1177 women resident in Mayuge reported a birth during 2012-13. Of these births, 664 took place in facilities within Mayuge and were eligible for linking to the census of the district's 38 facilities. Fifty-five percent were assisted by an SBA in a facility. Using the individual-linking method, effective coverage of births that took place with an SBA in a facility ready to provide BEmONC was just 10% (95% confidence interval [CI] 3-17). The absolute difference between the individual- and ecological-level linking method adjusting for facility level was one percentage point (11%), and tests suggested good agreement. The ecological method using the district-wide estimate demonstrated poor agreement. The proportion of women accessing appropriately equipped facilities for care at birth is far lower than the coverage of facility delivery. To realise the life-saving potential of health services, countries need evidence to inform actions that address gaps in the provision of quality care. Linking household and facility-based information provides a simple but innovative method for estimating quality of care at the population level. These encouraging findings suggest that linking data sets can result in meaningful evidence even when the exact location of care seeking is not known.

  12. Measuring Provider Performance for Physicians Participating in the Merit-Based Incentive Payment System.

    PubMed

    Squitieri, Lee; Chung, Kevin C

    2017-07-01

    In 2017, the Centers for Medicare and Medicaid Services began requiring all eligible providers to participate in the Quality Payment Program or face financial reimbursement penalty. The Quality Payment Program outlines two paths for provider participation: the Merit-Based Incentive Payment System and Advanced Alternative Payment Models. For the first performance period beginning in January of 2017, the Centers for Medicare and Medicaid Services estimates that approximately 83 to 90 percent of eligible providers will not qualify for participation in an Advanced Alternative Payment Model and therefore must participate in the Merit-Based Incentive Payment System program. The Merit-Based Incentive Payment System path replaces existing quality-reporting programs and adds several new measures to evaluate providers using four categories of data: (1) quality, (2) cost/resource use, (3) improvement activities, and (4) advancing care information. These categories will be combined to calculate a weighted composite score for each provider or provider group. Composite Merit-Based Incentive Payment System scores based on 2017 performance data will be used to adjust reimbursed payment in 2019. In this article, the authors provide relevant background for understanding value-based provider performance measurement. The authors also discuss Merit-Based Incentive Payment System reporting requirements and scoring methodology to provide plastic surgeons with the necessary information to critically evaluate their own practice capabilities in the context of current performance metrics under the Quality Payment Program.

  13. Mapping the Paediatric Quality of Life Inventory (PedsQL™) Generic Core Scales onto the Child Health Utility Index-9 Dimension (CHU-9D) Score for Economic Evaluation in Children.

    PubMed

    Lambe, Tosin; Frew, Emma; Ives, Natalie J; Woolley, Rebecca L; Cummins, Carole; Brettell, Elizabeth A; Barsoum, Emma N; Webb, Nicholas J A

    2018-04-01

    The Paediatric Quality of Life Inventory (PedsQL™) questionnaire is a widely used, generic instrument designed for measuring health-related quality of life (HRQoL); however, it is not preference-based and therefore not suitable for cost-utility analysis. The Child Health Utility Index-9 Dimension (CHU-9D), however, is a preference-based instrument that has been primarily developed to support cost-utility analysis. This paper presents a method for estimating CHU-9D index scores from responses to the PedsQL™ using data from a randomised controlled trial of prednisolone therapy for treatment of childhood corticosteroid-sensitive nephrotic syndrome. HRQoL data were collected from children at randomisation, week 16, and months 12, 18, 24, 36 and 48. Observations on children aged 5 years and older were pooled across all data collection timepoints and were then randomised into an estimation (n = 279) and validation (n = 284) sample. A number of models were developed using the estimation data before internal validation. The best model was chosen using multi-stage selection criteria. Most of the models developed accurately predicted the CHU-9D mean index score. The best performing model was a generalised linear model (mean absolute error = 0.0408; mean square error = 0.0035). The proportion of index scores deviating from the observed scores by < 0.03 was 53%. The mapping algorithm provides an empirical tool for estimating CHU-9D index scores and for conducting cost-utility analyses within clinical studies that have only collected PedsQL™ data. It is valid for children aged 5 years or older. Caution should be exercised when using this with children younger than 5 years, older adolescents (> 13 years) or patient groups with particularly poor quality of life. 16645249.
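
    The mapping step can be sketched with a Gaussian generalized linear model regressing a utility index on questionnaire summary scores; the synthetic scores, predictors, and error thresholds below are assumptions, not the published algorithm's coefficients.

    ```python
    # Minimal sketch: map questionnaire summary scores to a utility index with a
    # Gaussian GLM, then report the fit statistics used in the abstract.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    pedsql = rng.uniform(30, 100, (300, 2))            # e.g. physical and psychosocial summaries
    chu9d = np.clip(0.3 + 0.004 * pedsql[:, 0] + 0.003 * pedsql[:, 1]
                    + rng.normal(0, 0.05, 300), 0.33, 1.0)   # synthetic utility scores

    X = sm.add_constant(pedsql)
    model = sm.GLM(chu9d, X, family=sm.families.Gaussian()).fit()
    pred = model.predict(X)

    mae = np.abs(pred - chu9d).mean()
    mse = ((pred - chu9d) ** 2).mean()
    print(f"MAE = {mae:.4f}, MSE = {mse:.4f}")
    print("share within 0.03 of observed:", np.mean(np.abs(pred - chu9d) < 0.03).round(2))
    ```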

  14. Satellite constraints on surface concentrations of particulate matter

    NASA Astrophysics Data System (ADS)

    Ford Hotmann, Bonne

    Because of the increasing evidence of the widespread adverse effects on human health from exposure to poor air quality and the recommendations of the World Health Organization to significantly reduce PM2.5 in order to reduce these risks, better estimates of surface air quality globally are required. However, surface measurements useful for monitoring particulate exposure are scarce, especially in developing countries, which often experience the worst air pollution. Therefore, other methods are necessary to augment estimates in regions with limited surface observations. The prospect of using satellite observations to infer surface air quality is attractive; however, it requires knowledge of the complicated relationship between satellite-observed aerosol optical depth (AOD) and surface concentrations. This dissertation explores how satellite observations can be used in conjunction with a chemical transport model (GEOS-Chem) to better understand this relationship. First, we investigate the seasonality in aerosols over the Southeastern United States using observations from several satellite instruments (MODIS, MISR, CALIOP) and surface network sites (IMPROVE, SEARCH, AERONET). We find that the strong summertime enhancement in satellite-observed aerosol optical depth (factor 2-3 enhancement over wintertime AOD) is not present in surface mass concentrations (25-55% summertime enhancement). Goldstein et al. [2009] previously attributed this seasonality in AOD to biogenic organic aerosol; however, surface observations show that organic aerosol only accounts for ~35% of PM2.5 mass and exhibits similar seasonality to total surface PM2.5. The GEOS-Chem model generally reproduces these surface aerosol measurements, but underrepresents the AOD seasonality observed by satellites. We show that seasonal differences in water uptake cannot sufficiently explain the magnitude of AOD increase. CALIOP profiles indicate the presence of additional aerosol in the lower troposphere (below 700 hPa) that cannot be explained by vertical mixing; we conclude that the discrepancy is due to a missing source of aerosols above the surface layer in summer. Next, we examine the usefulness of deriving premature mortality estimates from "satellite-based" PM2.5 concentrations. In particular, we examine how uncertainties in the model AOD-to-surface-PM2.5 relationship, satellite-retrieved AOD, and particulars of the concentration-response function can impact these mortality estimates. We find that the satellite-based estimates suggest premature mortality due to chronic PM2.5 exposure is 2-16% higher in the U.S. and 4-13% lower in China compared to model-based estimates. However, this difference is overshadowed by the uncertainty in the methodology, which we quantify to be on the order of 20% for the model-to-surface-PM2.5 relationship, 10% for the satellite AOD, and 30-60% or greater with regard to the application of concentration-response functions. Because there is a desire for acute exposure estimates, especially with regard to extreme events, we also examine how premature mortality due to acute exposure can be estimated from global models and satellite observations. We find similar differences between model and satellite-based mortality estimates as with chronic exposure. However, the range of uncertainty is much larger on these shorter timescales.
This work suggests that although satellites can be useful for constraining model estimates of PM2.5, national mortality estimates from the two methods are not significantly different. In order to improve the efficacy of satellite-based PM2.5 mortality estimates, future work will need to focus on improving the model representation of the regional AOD-to-surface-PM2.5 relationship, reducing biases in satellite-retrieved AOD and advancing our understanding of personal and population-level responses to PM2.5 exposure.

  15. The Estimated Likelihood of Nutrients and Pesticides in Nontidal Headwater Streams of the Maryland Coastal Plain During Base Flow

    EPA Science Inventory

    Water quality in nontidal headwater (first-order) streams of the Coastal Plain during base flow in the late winter and spring is related to land use, hydrogeology, and other natural or human influences in contributing watersheds. A random survey of 174 headwater streams of the Mi...

  16. Effective coverage of essential inpatient care for small and sick newborns in a high mortality urban setting: a cross-sectional study in Nairobi City County, Kenya.

    PubMed

    Murphy, Georgina A V; Gathara, David; Mwachiro, Jacintah; Abuya, Nancy; Aluvaala, Jalemba; English, Mike

    2018-05-22

    Effective coverage requires that those in need can access skilled care supported by adequate resources. There are, however, few studies of effective coverage of facility-based neonatal care in low-income settings, despite the recognition that improving newborn survival is a global priority. We used a detailed retrospective review of medical records for neonatal admissions to public, private not-for-profit (mission) and private-for-profit (private) sector facilities providing 24×7 inpatient neonatal care in Nairobi City County to estimate the proportion of small and sick newborns receiving nationally recommended care across six process domains. We used our findings to explore the relationship between facility measures of structure and process and to estimate effective coverage. Of 33 eligible facilities, 28 (four public, six mission and 18 private), providing an estimated 98.7% of inpatient neonatal care in the county, agreed to participate. Data from 1184 admission episodes were collected. Overall performance was lowest (weighted mean score 0.35 [95% confidence interval or CI: 0.22-0.48] out of 1) for correct prescription of fluid and feed volumes and best (0.86 [95% CI: 0.80-0.93]) for documentation of demographic characteristics. Doses of gentamicin, when prescribed, were at least 20% higher than recommended in 11.7% of cases. Larger (often public) facilities tended to have higher process and structural quality scores compared with smaller, predominantly private, facilities. We estimate effective coverage to be 25% (estimate range: 21-31%); that is, one quarter of newborns in need received high-quality inpatient care, while almost half (44.5%) needed care but did not receive it, and a further 30.4% received an inadequate service. Failure to receive services and gaps in quality of care both contribute to a shortfall in effective coverage in Nairobi City County. Three-quarters of small and sick newborns do not have access to high-quality facility-based care. Substantial improvements in effective coverage will be required to tackle high neonatal mortality in this urban setting with high levels of poverty.

  17. Essays on information disclosure and the environment

    NASA Astrophysics Data System (ADS)

    Mathai, Koshy

    The essays in this dissertation study information disclosure and environmental policy. The first chapter challenges the longstanding result that firms will, in general, voluntarily disclose information about product quality, in light of the unrealism of the assumption, common to much of the literature, that consumers are identical. When this assumption is relaxed, an efficiency-enhancing role may emerge for disclosure regulation, insofar as it can improve information provision and thus help protect consumers with "moderately atypical" preferences. The paper also endogenizes firms' choice of quality and suggests that disclosure regulation may raise welfare indirectly, by inducing firms to improve product quality. The second chapter explores the significance of policy-induced technological change (ITC) for the design of carbon-abatement policies. The paper considers both R&D-based and learning-by-doing-based knowledge accumulation, examining each specification under both a cost-effectiveness and a benefit-cost policy criterion. We show analytically that the presence of ITC generally implies a lower profile of optimal carbon taxes, a shifting of abatement effort into the future (in the R&D scenarios), and an increase in the scale of abatement (in the benefit-cost scenarios). Numerical simulations indicate that the impact of ITC on abatement timing is very slight, but the effects on costs, optimal carbon taxes, and cumulative abatement can be large. The third chapter uses a World Bank dataset on Chinese state-owned enterprises to estimate price elasticities of industrial coal demand. A simple coal-demand equation is estimated in many forms, and significant price sensitivity is almost always found: the own-price elasticity is estimated to be roughly -0.5. A cost-function/share-equation system is also estimated, and although the function is frequently ill-behaved, indicating that firms may not be minimizing costs, the elasticity estimates again are large and significant. These findings indicate that, even though China does not have a pure market economy, coal taxation can effectively be used to reduce reliance on fossil fuels and improve environmental quality. We calculate that a thirty-percent tax on industrial coal use could reduce consumption by nearly one hundred million tons annually.
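    As a rough illustration of how an own-price elasticity translates a tax into a consumption change, the constant-elasticity approximation below uses the -0.5 elasticity and 30% tax from the abstract; the baseline consumption figure is a hypothetical round number, not a value from the study.

```python
# Constant-elasticity approximation: Q1 = Q0 * (P1/P0)**elasticity
elasticity = -0.5                 # own-price elasticity reported in the abstract
price_increase = 0.30             # 30% tax, assumed fully passed through to price
baseline_consumption_mt = 700.0   # million tonnes/year (hypothetical baseline)

new_consumption = baseline_consumption_mt * (1 + price_increase) ** elasticity
reduction = baseline_consumption_mt - new_consumption
print(f"Estimated reduction: about {reduction:.0f} million tonnes per year")
```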

  18. Techniques for estimating the quantity and quality of storm runoff from urban watersheds of Jefferson County, Kentucky

    USGS Publications Warehouse

    Evaldi, R.D.; Moore, B.L.

    1994-01-01

    Linear regression models are presented for estimating storm-runoff volumes, and mean concentrations and loads of selected constituents in storm runoff from urban watersheds of Jefferson County, Kentucky. Constituents modeled include dissolved oxygen, biochemical and chemical oxygen demand, total and suspended solids, volatile residue, nitrogen, phosphorus and phosphate, calcium, magnesium, barium, copper, iron, lead, and zinc. Model estimations are a function of drainage area, percentage of impervious area, climatological data, and land uses. Estimation models are based on runoff volumes, and concentrations and loads of constituents in runoff measured at 6 stormwater outfalls and 25 streams in Jefferson County.
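    Regressions of this kind are typically fitted in log space, with storm loads modeled as a function of basin characteristics and rainfall. The sketch below shows that generic form on synthetic data; the predictors and coefficients are placeholders, not the published Jefferson County equations.

```python
import numpy as np

# Generic log-linear storm-runoff load regression fitted by ordinary least squares.
rng = np.random.default_rng(0)
n = 120
drainage_area = rng.uniform(0.1, 5.0, n)    # mi^2
impervious_pct = rng.uniform(5, 80, n)      # percent impervious
rainfall = rng.uniform(0.2, 3.0, n)         # inches per storm

# Synthetic "observed" log loads, for demonstration only
log_load = (1.2 + 0.8 * np.log(drainage_area) + 0.02 * impervious_pct
            + 0.9 * np.log(rainfall) + rng.normal(0, 0.3, n))

X = np.column_stack([np.ones(n), np.log(drainage_area), impervious_pct, np.log(rainfall)])
coef, *_ = np.linalg.lstsq(X, log_load, rcond=None)
print("fitted coefficients:", np.round(coef, 3))

# Predicted load for a new storm/watershed combination
x_new = np.array([1.0, np.log(1.5), 40.0, np.log(1.0)])
print("predicted load:", round(float(np.exp(x_new @ coef)), 2))
```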

  19. The effect of texture granularity on texture synthesis quality

    NASA Astrophysics Data System (ADS)

    Golestaneh, S. Alireza; Subedar, Mahesh M.; Karam, Lina J.

    2015-09-01

    Natural and artificial textures occur frequently in images and in video sequences. Image/video coding systems based on texture synthesis can make use of a reliable texture synthesis quality assessment method in order to improve the compression performance in terms of perceived quality and bit-rate. Existing objective visual quality assessment methods do not perform satisfactorily when predicting the synthesized texture quality. In our previous work, we showed that texture regularity can be used as an attribute for estimating the quality of synthesized textures. In this paper, we study the effect of another texture attribute, namely texture granularity, on the quality of synthesized textures. For this purpose, subjective studies are conducted to assess the quality of synthesized textures with different levels (low, medium, high) of perceived texture granularity using different types of texture synthesis methods.

  20. Twenty-Four-Hour Ambulatory Blood Pressure Monitoring in Hypertension

    PubMed Central

    2012-01-01

    Executive Summary Objective The objective of this health technology assessment was to determine the clinical effectiveness and cost-effectiveness of 24-hour ambulatory blood pressure monitoring (ABPM) for hypertension. Clinical Need: Condition and Target Population Hypertension occurs when either systolic blood pressure, the pressure in the artery when the heart contracts, or diastolic blood pressure, the pressure in the artery when the heart relaxes between beats, are consistently high. Blood pressure (BP) that is consistently more than 140/90 mmHg (systolic/diastolic) is considered high. A lower threshold, greater than 130/80 mmHg (systolic/diastolic), is set for individuals with diabetes or chronic kidney disease. In 2006 and 2007, the age-standardized incidence rate of diagnosed hypertension in Canada was 25.8 per 1,000 (450,000 individuals were newly diagnosed). During the same time period, 22.7% of adult Canadians were living with diagnosed hypertension. A smaller proportion of Canadians are unaware they have hypertension; therefore, the estimated number of Canadians affected by this disease may be higher. Diagnosis and management of hypertension are important, since elevated BP levels are related to the risk of cardiovascular disease, including stroke. In Canada in 2003, the costs to the health care system related to the diagnosis, treatment, and management of hypertension were over $2.3 billion (Cdn). Technology The 24-hour ABPM device consists of a standard inflatable cuff attached to a small computer weighing about 500 grams, which is worn over the shoulder or on a belt. The technology is noninvasive and fully automated. The device takes BP measurements every 15 to 30 minutes over a 24-to 28-hour time period, thus providing extended, continuous BP recordings even during a patient’s normal daily activities. Information on the multiple BP measurements can be downloaded to a computer. The main detection methods used by the device are auscultation and oscillometry. The device avoids some of the pitfalls of conventional office or clinic blood pressure monitoring (CBPM) using a cuff and mercury sphygmomanometer such as observer bias (the phenomenon of measurement error when the observer overemphasizes expected results) and white coat hypertension (the phenomenon of elevated BP when measured in the office or clinic but normal BP when measured outside of the medical setting). Research Questions Is there a difference in patient outcome and treatment protocol using 24-hour ABPM versus CBPM for uncomplicated hypertension? Is there a difference between the 2 technologies when white coat hypertension is taken into account? What is the cost-effectiveness and budget impact of 24-hour ABPM versus CBPM for uncomplicated hypertension? Research Methods Literature Search Search Strategy A literature search was performed on August 4, 2011 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 1997 to August 4, 2011. Abstracts were reviewed by a single reviewer. For those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search. 
Articles with unknown eligibility were reviewed with a second clinical epidemiologist and then a group of epidemiologists until consensus was established. The quality of evidence was assessed as high, moderate, low, or very low according to GRADE methodology. Inclusion Criteria English language articles; published between January 1, 1997 and August 4, 2011; adults aged 18 years of age or older; journal articles reporting on the effectiveness, cost-effectiveness, or safety for the comparison of interest; clearly described study design and methods; health technology assessments, systematic reviews, meta-analyses, or randomized controlled trials. Exclusion Criteria non-English papers; animal or in vitro studies; case reports, case series, or case-case studies; studies comparing different antihypertensive therapies and evaluating their antihypertensive effects using 24-hour ABPM; studies on home or self-monitoring of BP, and studies on automated office BP measurement; studies in high-risk subgroups (e.g. diabetes, pregnancy, kidney disease). Outcomes of Interest Patient Outcomes mortality: all cardiovascular events (e.g., myocardial infarction [MI], stroke); non-fatal: all cardiovascular events (e.g., MI, stroke); combined fatal and non-fatal: all cardiovascular events (e.g., MI, stroke); all non-cardiovascular events; control of BP (e.g. systolic and/or diastolic target level). Drug-Related Outcomes percentage of patients who show a reduction in, or stop, drug treatment; percentage of patients who begin multi-drug treatment; drug therapy use (e.g. number, intensity of drug use); drug-related adverse events. Quality of Evidence The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria. As stated by the GRADE Working Group, the following definitions of quality were used in grading the quality of the evidence: High Further research is very unlikely to change confidence in the estimate of effect. Moderate Further research is likely to have an important impact on confidence in the estimate of effect and may change the estimate. Low Further research is very likely to have an important impact on confidence in the estimate of effect and is likely to change the estimate. Very Low Any estimate of effect is very uncertain. Summary of Findings Short-Term Follow-Up Studies (Length of Follow-Up of ≤ 1 Year) Based on very low quality of evidence, there is no difference between technologies for non-fatal cardiovascular events. Based on moderate quality of evidence, ABPM resulted in improved BP control among patients with sustained hypertension compared to CBPM. Based on low quality of evidence, ABPM resulted in hypertensive patients being more likely to stop antihypertensive therapy and less likely to proceed to multi-drug therapy compared to CBPM. Based on low quality of evidence, there is a beneficial effect of ABPM on the intensity of antihypertensive drug use compared to CBPM. Based on moderate quality of evidence, there is no difference between technologies in the number of antihypertensive drugs used. Based on low to very low quality of evidence, there is no difference between technologies in the risk for a drug-related adverse event or noncardiovascular event. Long-Term Follow-Up Study (Mean Length of Follow-Up of 5 Years) Based on moderate quality of evidence, there is a beneficial effect of ABPM on total combined cardiovascular events compared to CBPM. 
Based on low quality of evidence, there is a lack of a beneficial effect of ABPM on nonfatal cardiovascular events compared to CBPM; however, the lack of a beneficial effect is based on a borderline result. Based on low quality of evidence, there is no beneficial effect of ABPM on fatal cardiovascular events compared to CBPM. Based on low quality of evidence, there is no difference between technologies for the number of patients who began multi-drug therapy. Based on low quality of evidence, there is a beneficial effect of CBPM on control of BP compared to ABPM. This result is in the opposite direction than expected. Based on moderate quality of evidence, there is no difference between technologies in the risk for a drug-related adverse event. PMID:23074425

  1. Cloud-Based Service Information System for Evaluating Quality of Life after Breast Cancer Surgery.

    PubMed

    Kao, Hao-Yun; Wu, Wen-Hsiung; Liang, Tyng-Yeu; Lee, King-The; Hou, Ming-Feng; Shi, Hon-Yi

    2015-01-01

    Although recent studies have improved understanding of quality of life (QOL) outcomes of breast conserving surgery, few have used longitudinal data for more than two time points, and few have examined predictors of QOL over two years. Additionally, the longitudinal data analyses in such studies rarely apply the appropriate statistical methodology to control for censoring and inter-correlations arising from repeated measures obtained from the same patient pool. This study evaluated an internet-based system for measuring longitudinal changes in QOL and developed a cloud-based system for managing patients after breast conserving surgery. This prospective study analyzed 657 breast cancer patients treated at three tertiary academic hospitals. Related hospital personnel such as surgeons and other healthcare professionals were also interviewed to determine the requirements for an effective cloud-based system for surveying QOL in breast cancer patients. All patients completed the SF-36, Quality of Life Questionnaire (QLQ-C30) and its supplementary breast cancer measure (QLQ-BR23) at baseline, 6 months, 1 year, and 2 years postoperatively. The 95% confidence intervals for differences in responsiveness estimates were derived by bootstrap estimation. Scores derived by these instruments were interpreted by generalized estimating equation before and after surgery. All breast cancer surgery patients had significantly improved QLQ-C30 and QLQ-BR23 subscale scores throughout the 2-year follow-up period (p<0.05). During the study period, QOL generally had a negative association with advanced age, high Charlson comorbidity index score, tumor stage III or IV, previous chemotherapy, and long post-operative LOS. Conversely, QOL was positively associated with previous radiotherapy and hormone therapy. Additionally, patients with high scores for preoperative QOL tended to have high scores for QLQ-C30, QLQ-BR23 and SF-36 subscales. Based on the results of usability testing, the five constructs were rated on a Likert scale from 1-7 as follows: system usefulness (5.6±1.8), ease of use (5.6±1.5), information quality (5.4±1.4), interface quality (5.5±1.4), and overall satisfaction (5.5±1.6). The current trend in clinical medicine is applying therapies and interventions that improve QOL. Therefore, a potentially vast amount of internet-based QOL data is available for use in defining patient populations that may benefit from therapeutic intervention. Additionally, before undergoing breast conserving surgery, patients should be advised that their postoperative QOL depends not only on the success of the surgery, but also on their preoperative functional status.
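    Generalized estimating equations of the kind mentioned above can be fitted with statsmodels using a working correlation structure to account for repeated measures on the same patient. The sketch below assumes a long-format table with hypothetical column names and synthetic data; it is not the study's actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic long-format data: one row per patient per visit (column names hypothetical)
rng = np.random.default_rng(1)
n_patients, n_visits = 50, 4
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_patients), n_visits),
    "visit": np.tile(np.arange(n_visits), n_patients),
    "age": np.repeat(rng.integers(30, 75, n_patients), n_visits),
})
df["qol_score"] = 60 + 3 * df["visit"] - 0.2 * df["age"] + rng.normal(0, 5, len(df))

# GEE with an exchangeable working correlation to handle within-patient correlation
model = smf.gee("qol_score ~ visit + age", groups="patient_id", data=df,
                family=sm.families.Gaussian(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```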

  2. Cloud-Based Service Information System for Evaluating Quality of Life after Breast Cancer Surgery

    PubMed Central

    Kao, Hao-Yun; Wu, Wen-Hsiung; Liang, Tyng-Yeu; Lee, King-The; Hou, Ming-Feng; Shi, Hon-Yi

    2015-01-01

    Objective Although recent studies have improved understanding of quality of life (QOL) outcomes of breast conserving surgery, few have used longitudinal data for more than two time points, and few have examined predictors of QOL over two years. Additionally, the longitudinal data analyses in such studies rarely apply the appropriate statistical methodology to control for censoring and inter-correlations arising from repeated measures obtained from the same patient pool. This study evaluated an internet-based system for measuring longitudinal changes in QOL and developed a cloud-based system for managing patients after breast conserving surgery. Methods This prospective study analyzed 657 breast cancer patients treated at three tertiary academic hospitals. Related hospital personnel such as surgeons and other healthcare professionals were also interviewed to determine the requirements for an effective cloud-based system for surveying QOL in breast cancer patients. All patients completed the SF-36, Quality of Life Questionnaire (QLQ-C30) and its supplementary breast cancer measure (QLQ-BR23) at baseline, 6 months, 1 year, and 2 years postoperatively. The 95% confidence intervals for differences in responsiveness estimates were derived by bootstrap estimation. Scores derived by these instruments were interpreted by generalized estimating equation before and after surgery. Results All breast cancer surgery patients had significantly improved QLQ-C30 and QLQ-BR23 subscale scores throughout the 2-year follow-up period (p<0.05). During the study period, QOL generally had a negative association with advanced age, high Charlson comorbidity index score, tumor stage III or IV, previous chemotherapy, and long post-operative LOS. Conversely, QOL was positively associated with previous radiotherapy and hormone therapy. Additionally, patients with high scores for preoperative QOL tended to have high scores for QLQ-C30, QLQ-BR23 and SF-36 subscales. Based on the results of usability testing, the five constructs were rated on a Likert scale from 1–7 as follows: system usefulness (5.6±1.8), ease of use (5.6±1.5), information quality (5.4±1.4), interface quality (5.5±1.4), and overall satisfaction (5.5±1.6). Conclusions The current trend in clinical medicine is applying therapies and interventions that improve QOL. Therefore, a potentially vast amount of internet-based QOL data is available for use in defining patient populations that may benefit from therapeutic intervention. Additionally, before undergoing breast conserving surgery, patients should be advised that their postoperative QOL depends not only on the success of the surgery, but also on their preoperative functional status. PMID:26422018

  3. Local-Scale Air Quality Modeling in Support of Human Health and Exposure Research (Invited)

    NASA Astrophysics Data System (ADS)

    Isakov, V.

    2010-12-01

    Spatially- and temporally-sparse information on air quality is a key concern for air-pollution-related environmental health studies. Monitor networks are sparse in both space and time, are costly to maintain, and are often designed purposely to avoid detecting highly localized sources. Recent studies have shown that more narrowly defining the geographic domain of the study populations and improvements in the measured/estimated ambient concentrations can lead to stronger associations between air pollution and hospital admissions and mortality records. Traditionally, ambient air quality measurements have been used as a primary input to support human health and exposure research. However, there is increasing evidence that the current ambient monitoring network is not capturing sharp gradients in exposure due to the presence of high concentration levels near, for example, major roadways. Many air pollutants exhibit large concentration gradients near large emitters such as major roadways, factories, ports, etc. To overcome these limitations, researchers are now beginning to use air quality models to support air pollution exposure and health studies. There are many advantages to using air quality models over traditional approaches based on existing ambient measurements alone. First, models can provide spatially- and temporally-resolved concentrations as direct input to exposure and health studies and thus better define the concentration levels for the population in the geographic domain. Air quality models have a long history of use in air pollution regulation and are supported by regulatory agencies and a large user community. Also, models can provide bidirectional linkages between sources of emissions and ambient concentrations, thus allowing exploration of various mitigation strategies to reduce exposure risk. In order to provide the best estimates of air concentrations to support human health and exposure studies, model estimates should consider local-scale features, regional-scale transport, and photochemical transformations. Since these needs are currently not met by a single model, hybrid air quality modeling has recently been developed to combine these capabilities. In this paper, we present the results of two studies where we applied the hybrid modeling approach to provide spatial and temporal details in air quality concentrations to support exposure and health studies: a) an urban-scale air quality accountability study involving near-source exposures to multiple ambient air pollutants, and b) an urban-scale epidemiological study involving human health data based on emergency department visits.

  4. Using water-quality profiles to characterize seasonal water quality and loading in the upper Animas River basin, southwestern Colorado

    USGS Publications Warehouse

    Leib, Kenneth J.; Mast, M. Alisa; Wright, Winfield G.

    2003-01-01

    One of the important types of information needed to characterize water quality in streams affected by historical mining is the seasonal pattern of toxic trace-metal concentrations and loads. Seasonal patterns in water quality are estimated in this report using a technique called water-quality profiling. Water-quality profiling allows land managers and scientists to assess priority areas to be targeted for characterization and(or) remediation by quantifying the timing and magnitude of contaminant occurrence. Streamflow and water-quality data collected at 15 sites in the upper Animas River Basin during water years 1991-99 were used to develop water-quality profiles. Data collected at each sampling site were used to develop ordinary least-squares regression models for streamflow and constituent concentrations. Streamflow was estimated by correlating instantaneous streamflow measured at ungaged sites with continuous streamflow records from streamflow-gaging stations in the subbasin. Water-quality regression models were developed to estimate hardness and dissolved cadmium, copper, and zinc concentrations based on streamflow and seasonal terms. Results from the regression models were used to calculate water-quality profiles for streamflow, constituent concentrations, and loads. Quantification of cadmium, copper, and zinc loads in a stream segment in Mineral Creek (sites M27 to M34) was presented as an example application of water-quality profiling. The application used a method of mass accounting to quantify the portion of metal loading in the segment derived from uncharacterized sources during different seasonal periods. During May, uncharacterized sources contributed nearly 95 percent of the cadmium load, 0 percent of the copper load (or uncharacterized sources also are attenuated), and about 85 percent of the zinc load at M34. During September, uncharacterized sources contributed about 86 percent of the cadmium load, 0 percent of the copper load (or uncharacterized sources also are attenuated), and about 52 percent of the zinc load at M34. Characterized sources accounted for more of the loading gains estimated in the example reach during September, possibly indicating the presence of diffuse inputs during snowmelt runoff. The results indicate that metal sources in the upper Animas River Basin may change substantially with season, regardless of the source.
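    The regression form described above can be sketched as log concentration modeled on log streamflow plus seasonal harmonics, with loads obtained by multiplying the predicted concentration by flow. The example below uses synthetic data and placeholder coefficients, not the Animas River models.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
day_of_year = rng.uniform(0, 365, n)
flow = rng.lognormal(mean=2.0, sigma=0.6, size=n)     # streamflow, cfs

# Synthetic dissolved-metal concentrations (ug/L) with flow and seasonal dependence
log_conc = (3.0 - 0.4 * np.log(flow)
            + 0.5 * np.sin(2 * np.pi * day_of_year / 365)
            + rng.normal(0, 0.2, n))

X = np.column_stack([np.ones(n), np.log(flow),
                     np.sin(2 * np.pi * day_of_year / 365),
                     np.cos(2 * np.pi * day_of_year / 365)])
coef, *_ = np.linalg.lstsq(X, log_conc, rcond=None)

def load_kg_per_day(q_cfs, doy):
    """Load profile point: predicted concentration times flow (0.0024466 converts cfs*ug/L to kg/day)."""
    x = np.array([1.0, np.log(q_cfs),
                  np.sin(2 * np.pi * doy / 365), np.cos(2 * np.pi * doy / 365)])
    return float(np.exp(x @ coef) * q_cfs * 0.0024466)

print(f"Estimated May load: {load_kg_per_day(60.0, 135):.2f} kg/day")
```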

  5. Multisite evaluation of APEX for water quality: II. Regional parameterization

    USDA-ARS?s Scientific Manuscript database

    Phosphorus (P) index assessment requires independent estimates of long-term average annual P loss from multiple locations, management practices, soils, and landscape positions. Because currently available measured data are insufficient, calibrated and validated process-based models have been propos...

  6. AIR MONITOR SITING BY OBJECTIVE

    EPA Science Inventory

    A method is developed whereby measured pollutant concentrations can be used in conjunction with a mathematical air quality model to estimate the full spatial and temporal concentration distributions of the pollutants over a given region. The method is based on the application of ...

  7. ESTIMATION OF CHEMICAL SPECIFIC PARAMETERS WITHIN PHYSIOLOGICALLY BASED PHARMACOKINETIC/PHARMACODYNAMIC MODELS

    EPA Science Inventory

    While relationships between chemical structure and observed properties or activities (QSAR - quantitative structure activity relationship) can be used to predict the behavior of unknown chemicals, this method is semiempirical in nature relying on high quality experimental data to...

  8. Synthesis and assessment of date palm genetic diversity studies

    USDA-ARS?s Scientific Manuscript database

    A thorough assessment of genetic diversity and population differentiation of Phoenix dactylifera are critical for its dynamic conservation and sustainable utilization of its genetic diversity. Estimates of genetic diversity based on phenotypic, biochemical and molecular markers; and fruit quality tr...

  9. The a priori SDR Estimation Techniques with Reduced Speech Distortion for Acoustic Echo and Noise Suppression

    NASA Astrophysics Data System (ADS)

    Thoonsaengngam, Rattapol; Tangsangiumvisai, Nisachon

    This paper proposes an enhanced method for estimating the a priori Signal-to-Disturbance Ratio (SDR) to be employed in the Acoustic Echo and Noise Suppression (AENS) system for full-duplex hands-free communications. The proposed a priori SDR estimation technique is modified based upon the Two-Step Noise Reduction (TSNR) algorithm to suppress the background noise while preserving speech spectral components. In addition, a practical approach to accurately determine the Echo Spectrum Variance (ESV) is presented, based upon the assumption of a linear relationship between the power spectra of the far-end speech and acoustic echo signals. The ESV estimation technique is then employed to alleviate the acoustic echo problem. The performance of the AENS system that employs these two proposed estimation techniques is evaluated through the Echo Attenuation (EA), Noise Attenuation (NA), and two speech distortion measures. Simulation results based upon real speech signals show that our improved AENS system is able to efficiently mitigate acoustic echo and background noise while preserving speech quality and intelligibility.
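    The generic building block behind such estimators is a per-bin a priori SNR (or SDR) estimate combined with a spectral gain. The sketch below shows a plain decision-directed estimate with a Wiener gain; it illustrates the idea only and is not the authors' TSNR-based SDR algorithm, and the frame data and noise estimate are placeholders.

```python
import numpy as np

def wiener_gains(noisy_power_frames, noise_power, alpha=0.98):
    """Decision-directed a priori SNR estimation with a Wiener gain, per frequency bin."""
    prev_clean_power = np.zeros_like(noise_power)
    gains = []
    for noisy_power in noisy_power_frames:
        post_snr = noisy_power / noise_power                          # a posteriori SNR
        prio_snr = (alpha * prev_clean_power / noise_power
                    + (1 - alpha) * np.maximum(post_snr - 1.0, 0.0))  # decision-directed estimate
        gain = prio_snr / (1.0 + prio_snr)                            # Wiener gain
        prev_clean_power = (gain ** 2) * noisy_power
        gains.append(gain)
    return np.array(gains)

# Toy example: 3 frames, 4 frequency bins of noisy-speech power, unit noise power
frames = np.array([[2.0, 5.0, 1.2, 8.0],
                   [1.5, 6.0, 1.0, 7.0],
                   [1.1, 4.0, 0.9, 9.0]])
print(np.round(wiener_gains(frames, np.ones(4)), 2))
```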

  10. External Quality Control for Dried Blood Spot Based C-reactive Protein Assay: Experience from the Indonesia Family Life Survey and the Longitudinal Aging Study in India

    PubMed Central

    Hu, Peifeng; Herningtyas, Elizabeth H.; Kale, Varsha; Crimmins, Eileen M.; Risbud, Arun R.; McCreath, Heather; Lee, Jinkook; Strauss, John; O’Brien, Jennifer C.; Bloom, David E.; Seeman, Teresa E.

    2015-01-01

    Measurement of C-reactive protein, a marker of inflammation, in dried blood spots has been increasingly incorporated in community-based social surveys internationally. Although the dried blood spot based CRP assay protocol has been validated in the United States, it remains unclear whether laboratories in other less developed countries can generate C-reactive protein results of similar quality. We therefore conducted external quality monitoring for dried blood spot based C-reactive protein measurement for the Indonesia Family Life Survey and the Longitudinal Aging Study in India. Our results show that dried blood spot based C-reactive protein results in these two countries have excellent and consistent correlations with serum-based values and dried blood spot based results from the reference laboratory in the United States. Even though the results from duplicate samples may have fluctuations in absolute values over time, the relative order of C-reactive protein levels remains similar and the estimates are reasonably precise for population-based studies that investigate the association between socioeconomic factors and health. PMID:25879265

  11. Using flow cytometry to estimate pollen DNA content: improved methodology and applications

    PubMed Central

    Kron, Paul; Husband, Brian C.

    2012-01-01

    Background and Aims Flow cytometry has been used to measure nuclear DNA content in pollen, mostly to understand pollen development and detect unreduced gametes. Published data have not always met the high-quality standards required for some applications, in part due to difficulties inherent in the extraction of nuclei. Here we describe a simple and relatively novel method for extracting pollen nuclei, involving the bursting of pollen through a nylon mesh, compare it with other methods and demonstrate its broad applicability and utility. Methods The method was tested across 80 species, 64 genera and 33 families, and the data were evaluated using established criteria for estimating genome size and analysing cell cycle. Filter bursting was directly compared with chopping in five species, yields were compared with published values for sonicated samples, and the method was applied by comparing genome size estimates for leaf and pollen nuclei in six species. Key Results Data quality met generally applied standards for estimating genome size in 81 % of species and the higher best practice standards for cell cycle analysis in 51 %. In 41 % of species we met the most stringent criterion of screening 10 000 pollen grains per sample. In direct comparison with two chopping techniques, our method produced better quality histograms with consistently higher nuclei yields, and yields were higher than previously published results for sonication. In three binucleate and three trinucleate species we found that pollen-based genome size estimates differed from leaf tissue estimates by 1·5 % or less when 1C pollen nuclei were used, while estimates from 2C generative nuclei differed from leaf estimates by up to 2·5 %. Conclusions The high success rate, ease of use and wide applicability of the filter bursting method show that this method can facilitate the use of pollen for estimating genome size and dramatically improve unreduced pollen production estimation with flow cytometry. PMID:22875815

  12. Development and validation of a food-based diet quality index for New Zealand adolescents

    PubMed Central

    2013-01-01

    Background As there is no population-specific, simple food-based diet index suitable for examination of diet quality in New Zealand (NZ) adolescents, there is a need to develop such a tool. Therefore, this study aimed to develop an adolescent-specific diet quality index based on dietary information sourced from a Food Questionnaire (FQ) and examine its validity relative to a four-day estimated food record (4DFR) obtained from a group of adolescents aged 14 to 18 years. Methods A diet quality index for NZ adolescents (NZDQI-A) was developed based on ‘Adequacy’ and ‘Variety’ of five food groups reflecting the New Zealand Food and Nutrition Guidelines for Healthy Adolescents. The NZDQI-A was scored from zero to 100, with a higher score reflecting a better diet quality. Forty-one adolescents (16 males, 25 females, aged 14–18 years) each completed the FQ and a 4DFR. The test-retest reliability of the FQ-derived NZDQI-A scores over a two-week period and the relative validity of the scores compared to the 4DFR were estimated using Pearson’s correlations. Construct validity was examined by comparing NZDQI-A scores against nutrient intakes obtained from the 4DFR. Results The NZDQI-A derived from the FQ showed good reliability (r = 0.65) and reasonable agreement with 4DFR in ranking participants by scores (r = 0.39). More than half of the participants were classified into the same thirds of scores while 10% were misclassified into the opposite thirds by the two methods. Higher NZDQI-A scores were also associated with lower total fat and saturated fat intakes and higher iron intakes. Conclusions Higher NZDQI-A scores were associated with more desirable fat and iron intakes. The scores derived from either FQ or 4DFR were comparable and reproducible when repeated within two weeks. The NZDQI-A is relatively valid and reliable in ranking diet quality in adolescents at a group level even in a small sample size. Further studies are required to test the predictive validity of this food-based diet index in larger samples. PMID:23759064
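    To show the shape of an adequacy-plus-variety index scored out of 100, the sketch below scores five food groups and rescales the sum; the food groups, recommended servings, and weighting are hypothetical simplifications, not the published NZDQI-A scoring rules.

```python
FOOD_GROUPS = ["vegetables_fruit", "breads_cereals", "milk_products",
               "meat_alternatives", "water_beverages"]

def diet_quality_score(servings, recommended, varieties, max_varieties=5):
    """Equal-weighted adequacy and variety over five food groups, rescaled to 0-100."""
    adequacy = sum(min(servings[g] / recommended[g], 1.0) for g in FOOD_GROUPS)
    variety = sum(min(varieties[g] / max_varieties, 1.0) for g in FOOD_GROUPS)
    return 100.0 * (adequacy + variety) / (2 * len(FOOD_GROUPS))

servings = {"vegetables_fruit": 4, "breads_cereals": 5, "milk_products": 2,
            "meat_alternatives": 1, "water_beverages": 6}
recommended = {"vegetables_fruit": 5, "breads_cereals": 6, "milk_products": 3,
               "meat_alternatives": 2, "water_beverages": 6}
varieties = {"vegetables_fruit": 4, "breads_cereals": 3, "milk_products": 2,
             "meat_alternatives": 1, "water_beverages": 2}
print(f"diet quality score: {diet_quality_score(servings, recommended, varieties):.1f} / 100")
```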

  13. Global Burden of Disease estimates of depression – how reliable is the epidemiological evidence?

    PubMed Central

    Brhlikova, Petra; Pollock, Allyson M; Manners, Rachel

    2011-01-01

    Summary Objectives To re-assess the quality of the epidemiological studies used to estimate the global burden of depression 2000, as published in the GBDep study. Design Primary and secondary data sources used in the global burden of depression estimate were identified and assigned to country of origin. Each source was assessed with respect to completeness and representativeness for national/regional estimates and against the inclusion criteria used by the scientific team estimating GBDep. Setting Not applicable. Participants Not applicable. Main outcome measures Not applicable. Results First, National estimates: The 28 scientific sources cited in the GBDep study related to 40 of the 191 WHO member countries. The EURO region had studies relating to 15 of 52 countries whereas AFRO region had studies for only three of 46 countries. Only six of the 40 countries had data drawn from a nationally representative population: the three AFRO country studies were based on a single village or town and, likewise, SEARO region had no nationally representative data; second, GBDep criteria: GBDep inclusion criteria required study sample size of more than 1000 people; 19 (45%) of the 42 studies did not meet this criterion. Sixteen (44%) of 36 studies did not meet the requirement that studies show a clear sample frame and method. GBD estimates rely on estimates of incidence; only two of the 42 country studies provided incidence data (Canada and Norway), the remaining 34 studies were prevalence studies. Duration of depression is based on three studies conducted in the USA and Holland. Conclusions Most studies exhibit significant shortcomings and limitations with respect to study design and analysis and compliance with GBDep inclusion criteria. Poor quality data limit the interpretation and validity of global burden of depression estimates. The uncritical application of these estimates to international healthcare policy-making could divert scarce resources from other public healthcare priorities. PMID:21205775

  14. No-reference video quality measurement: added value of machine learning

    NASA Astrophysics Data System (ADS)

    Mocanu, Decebal Constantin; Pokhrel, Jeevan; Garella, Juan Pablo; Seppänen, Janne; Liotou, Eirini; Narwaria, Manish

    2015-11-01

    Video quality measurement is an important component in the end-to-end video delivery chain. Video quality is, however, subjective, and thus, there will always be interobserver differences in the subjective opinion about the visual quality of the same video. Despite this, most existing works on objective quality measurement typically focus only on predicting a single score and evaluate their prediction accuracies based on how close it is to the mean opinion scores (or similar average based ratings). Clearly, such an approach ignores the underlying diversities in the subjective scoring process and, as a result, does not allow further analysis on how reliable the objective prediction is in terms of subjective variability. Consequently, the aim of this paper is to analyze this issue and present a machine-learning based solution to address it. We demonstrate the utility of our ideas by considering the practical scenario of video broadcast transmissions with focus on digital terrestrial television (DTT) and proposing a no-reference objective video quality estimator for such application. We conducted meaningful verification studies on different video content (including video clips recorded from real DTT broadcast transmissions) in order to verify the performance of the proposed solution.

  15. [The concept of the development of the state of chemical-analytical environmental monitoring].

    PubMed

    Rakhmanin, Iu A; Malysheva, A G

    2013-01-01

    Chemical-analytical monitoring of environmental quality is based on accounting for trace amounts of substances. Given the multicomponent composition of the environment and the ongoing transformation of substances within it, assessing the danger posed to population health by chemical pollution requires an evaluation that simultaneously accounts for the complex of substances actually present in the environment and entering it from different sources. Therefore, analytical monitoring of environmental quality and safety needs to shift from an orientation based on the investigation of specific target substances to the estimation of the real complex of compounds.

  16. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique characteristic of a program that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy are rapidly increasing problems involving the copying, stealing, and misuse of software without proper permission, as specified in the license agreement. Estimating a birthmark's properties can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, namely credibility and resilience. For this purpose, soft computing concepts such as probabilistic and fuzzy computing are taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule-based technique is validated through a case study, and the results show that it successfully assesses the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
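    A minimal fuzzy-rule sketch of this kind of estimation is shown below: membership functions fuzzify credibility and resilience, a small IF-THEN rule base is evaluated, and a weighted-average defuzzification yields a single quality value. The membership breakpoints, rules, and output levels are illustrative assumptions, not the paper's rule base.

```python
import numpy as np

def low(x, hi=0.6):
    """Membership in 'low': 1 at 0, ramping down to 0 at `hi`."""
    return float(np.clip((hi - x) / hi, 0.0, 1.0))

def high(x, lo=0.4):
    """Membership in 'high': 0 at `lo`, ramping up to 1 at 1."""
    return float(np.clip((x - lo) / (1.0 - lo), 0.0, 1.0))

def birthmark_quality(credibility, resilience):
    # Rule base: quality is HIGH if both properties are high, MEDIUM if exactly one is, LOW otherwise
    w_high = min(high(credibility), high(resilience))
    w_med = max(min(high(credibility), low(resilience)),
                min(low(credibility), high(resilience)))
    w_low = min(low(credibility), low(resilience))
    # Weighted-average defuzzification with representative output levels for LOW/MEDIUM/HIGH
    weights = np.array([w_low, w_med, w_high])
    outputs = np.array([0.2, 0.5, 0.9])
    return float(weights @ outputs / (weights.sum() + 1e-9))

print(f"estimated birthmark quality: {birthmark_quality(0.8, 0.55):.2f}")
```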

  17. Efficient depth intraprediction method for H.264/AVC-based three-dimensional video coding

    NASA Astrophysics Data System (ADS)

    Oh, Kwan-Jung; Oh, Byung Tae

    2015-04-01

    We present an intracoding method that is applicable to depth map coding in multiview plus depth systems. Our approach combines skip prediction and plane segmentation-based prediction. The proposed depth intraskip prediction uses the estimated direction at both the encoder and decoder, and does not need to encode residual data. Our plane segmentation-based intraprediction divides the current block into two regions and applies a different prediction scheme to each segmented region. This method avoids incorrect estimations across different regions, resulting in higher prediction accuracy. Simulation results demonstrate that the proposed scheme is superior to H.264/advanced video coding intraprediction and has the ability to improve the subjective rendering quality.

  18. IDMA-Based MAC Protocol for Satellite Networks with Consideration on Channel Quality

    PubMed Central

    2014-01-01

    In order to overcome the shortcomings of existing medium access control (MAC) protocols based on TDMA or CDMA in satellite networks, the interleave division multiple access (IDMA) technique is introduced into satellite communication networks. A novel wide-band IDMA MAC protocol based on channel quality is therefore proposed in this paper, consisting of a dynamic power allocation algorithm, a rate adaptation algorithm, and a call admission control (CAC) scheme. Firstly, a power allocation algorithm combining IDMA SINR-evolution and channel quality prediction is developed to guarantee high power efficiency even in poor channel conditions. Secondly, an effective rate adaptation algorithm is realized, based on accurate per-timeslot channel information and rate degradation. Moreover, based on channel quality prediction, a CAC scheme combining the new power allocation algorithm, rate scheduling, and buffering strategies is proposed for emerging IDMA systems; it can support a variety of traffic types and satisfy quality of service (QoS) requirements corresponding to different priority levels. Simulation results show that the new wide-band IDMA MAC protocol can accurately estimate the available resources, taking into account the effect of multiuser detection (MUD) and the QoS requirements of multimedia traffic, leading to low outage probability as well as high overall system throughput. PMID:25126592

  19. Revisiting sample size: are big trials the answer?

    PubMed

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability that the trial will detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
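    The dependence of power on sample size comes out directly in the standard formula for comparing two proportions, sketched below. The event rates are hypothetical; the point is simply that shrinking the detectable difference from five to three percentage points roughly triples the required number of patients per arm.

```python
from math import ceil
from scipy.stats import norm

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided comparison of two proportions."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return ceil(num / (p1 - p2) ** 2)

print(n_per_arm(0.10, 0.05))   # detect a drop from 10% to 5%: ~435 per arm
print(n_per_arm(0.10, 0.07))   # detect a drop from 10% to 7%: ~1356 per arm
```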

  20. Improved image decompression for reduced transform coding artifacts

    NASA Technical Reports Server (NTRS)

    Orourke, Thomas P.; Stevenson, Robert L.

    1994-01-01

    The perceived quality of images reconstructed from low bit rate compression is severely degraded by the appearance of transform coding artifacts. This paper proposes a method for producing higher quality reconstructed images based on a stochastic model for the image data. Quantization (scalar or vector) partitions the transform coefficient space and maps all points in a partition cell to a representative reconstruction point, usually taken as the centroid of the cell. The proposed image estimation technique selects the reconstruction point within the quantization partition cell which results in a reconstructed image which best fits a non-Gaussian Markov random field (MRF) image model. This approach results in a convex constrained optimization problem which can be solved iteratively. At each iteration, the gradient projection method is used to update the estimate based on the image model. In the transform domain, the resulting coefficient reconstruction points are projected to the particular quantization partition cells defined by the compressed image. Experimental results will be shown for images compressed using scalar quantization of block DCT and using vector quantization of subband wavelet transform. The proposed image decompression provides a reconstructed image with reduced visibility of transform coding artifacts and superior perceived quality.
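    The constrained estimation idea can be sketched on a single 8x8 block: keep the DCT coefficients inside their quantization cells while minimizing a smoothness penalty, alternating gradient steps with projection back onto the cells. In the sketch below a quadratic roughness penalty stands in for the paper's non-Gaussian MRF model, and the block, quantization step, and step size are arbitrary toy values.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct2(x):
    return dctn(x, norm="ortho")

def idct2(X):
    return idctn(X, norm="ortho")

rng = np.random.default_rng(0)
block = rng.normal(128, 20, (8, 8))            # toy "original" block
q = 16.0                                       # uniform quantization step
quantized = np.round(dct2(block) / q)          # transmitted indices
lo, hi = (quantized - 0.5) * q, (quantized + 0.5) * q   # quantization cell bounds

x = idct2(quantized * q)                       # standard (cell-center) reconstruction
step = 0.1
for _ in range(200):
    # Gradient of a quadratic roughness penalty (sum of squared neighbor differences)
    grad = np.zeros_like(x)
    dv = x[:-1, :] - x[1:, :]
    grad[:-1, :] += 2 * dv
    grad[1:, :] -= 2 * dv
    dh = x[:, :-1] - x[:, 1:]
    grad[:, :-1] += 2 * dh
    grad[:, 1:] -= 2 * dh
    x = x - step * grad
    # Project back onto the quantization cells in the transform domain
    x = idct2(np.clip(dct2(x), lo, hi))

print("smoothed reconstruction range:", round(float(x.min()), 1), round(float(x.max()), 1))
```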

  1. Dynamic rating curve assessment for hydrometric stations and computation of the associated uncertainties: Quality and station management indicators

    NASA Astrophysics Data System (ADS)

    Morlot, Thomas; Perret, Christian; Favre, Anne-Catherine; Jalbert, Jonathan

    2014-09-01

    A rating curve is used to indirectly estimate the discharge in rivers based on water level measurements. The discharge values obtained from a rating curve include uncertainties related to the direct stage-discharge measurements (gaugings) used to build the curves, the quality of fit of the curve to these measurements and the constant changes in the river bed morphology. Moreover, the uncertainty of discharges estimated from a rating curve increases with the “age” of the rating curve. The level of uncertainty at a given point in time is therefore particularly difficult to assess. A “dynamic” method has been developed to compute rating curves while calculating associated uncertainties, thus making it possible to regenerate streamflow data with uncertainty estimates. The method is based on historical gaugings at hydrometric stations. A rating curve is computed for each gauging and a model of the uncertainty is fitted for each of them. The model of uncertainty takes into account the uncertainties in the measurement of the water level, the quality of fit of the curve, the uncertainty of gaugings and the increase of the uncertainty of discharge estimates with the age of the rating curve computed with a variographic analysis (Jalbert et al., 2011). The presented dynamic method can answer important questions in the field of hydrometry such as “How many gaugings a year are required to produce streamflow data with an average uncertainty of X%?” and “When and in what range of water flow rates should these gaugings be carried out?”. The Rocherousse hydrometric station (France, Haute-Durance watershed, 946 km2) is used as an example throughout the paper. Other stations are used to illustrate certain points.
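    A single rating curve of the kind described above is often a power law Q = a(h - h0)^b fitted to the gaugings in log space. The sketch below fits such a curve to a handful of synthetic gaugings and crudely inflates the uncertainty with their age; the gaugings, the fixed offset h0, and the aging rate are placeholders, and the paper's dynamic, variogram-based treatment is considerably richer.

```python
import numpy as np

# Synthetic gaugings: stage (m), discharge (m^3/s), and age of each gauging (days)
gauging_stage = np.array([0.42, 0.55, 0.70, 0.95, 1.30, 1.80])
gauging_q = np.array([1.1, 2.0, 3.4, 6.8, 13.5, 27.0])
gauging_age_days = np.array([900, 700, 520, 360, 150, 30])
h0 = 0.20                                   # assumed cease-to-flow offset (m)

# Fit log Q = log a + b * log(h - h0) by least squares
A = np.column_stack([np.ones_like(gauging_stage), np.log(gauging_stage - h0)])
(log_a, b), *_ = np.linalg.lstsq(A, np.log(gauging_q), rcond=None)

def discharge(stage):
    """Rating-curve discharge estimate for a given stage."""
    return float(np.exp(log_a) * (stage - h0) ** b)

# Crude illustration of uncertainty growing with the age of the supporting gaugings
base_uncertainty = 0.07                     # 7% at the time of gauging (placeholder)
aging_rate_per_day = 5e-5                   # placeholder growth rate
uncertainty = base_uncertainty + aging_rate_per_day * gauging_age_days.mean()
print(f"Q(1.0 m) = {discharge(1.0):.1f} m^3/s, ~{100 * uncertainty:.1f}% uncertainty")
```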

  2. The economic burden of lung cancer and mesothelioma due to occupational and para-occupational asbestos exposure.

    PubMed

    Tompa, Emile; Kalcevich, Christina; McLeod, Chris; Lebeau, Martin; Song, Chaojie; McLeod, Kim; Kim, Joanne; Demers, Paul A

    2017-11-01

    To estimate the economic burden of lung cancer and mesothelioma due to occupational and para-occupational asbestos exposure in Canada. We estimate the lifetime cost of newly diagnosed lung cancer and mesothelioma cases associated with occupational and para-occupational asbestos exposure for calendar year 2011 based on the societal perspective. The key cost components considered are healthcare costs, productivity and output costs, and quality of life costs. There were 427 cases of newly diagnosed mesothelioma cases and 1904 lung cancer cases attributable to asbestos exposure in 2011 for a total of 2331 cases. Our estimate of the economic burden is $C831 million in direct and indirect costs for newly identified cases of mesothelioma and lung cancer and $C1.5 billion in quality of life costs based on a value of $C100 000 per quality-adjusted life year. This amounts to $C356 429 and $C652 369 per case, respectively. The economic burden of lung cancer and mesothelioma associated with occupational and para-occupational asbestos exposure is substantial. The estimate identified is for 2331 newly diagnosed, occupational and para-occupational exposure cases in 2011, so it is only a portion of the burden of existing cases in that year. Our findings provide important information for policy decision makers for priority setting, in particular the merits of banning the mining of asbestos and use of products containing asbestos in countries where they are still allowed and also the merits of asbestos removal in older buildings with asbestos insulation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  3. Water quality monitoring records for estimating tap water arsenic and nitrate: a validation study.

    PubMed

    Searles Nielsen, Susan; Kuehn, Carrie M; Mueller, Beth A

    2010-01-28

    Tap water may be an important source of exposure to arsenic and nitrate. Obtaining and analyzing samples in the context of large studies of health effects can be expensive. As an alternative, studies might estimate contaminant levels in individual homes by using publicly available water quality monitoring records, either alone or in combination with geographic information systems (GIS). We examined the validity of records-based methods in Washington State, where arsenic and nitrate contamination is prevalent but generally observed at modest levels. Laboratory analysis of samples from 107 homes (median 0.6 microg/L arsenic, median 0.4 mg/L nitrate as nitrogen) served as our "gold standard." Using Spearman's rho we compared these measures to estimates obtained using only the homes' street addresses and recent and/or historical measures from publicly monitored water sources within specified distances (radii) ranging from one half mile to 10 miles. Agreement improved as distance decreased, but the proportion of homes for which we could estimate summary measures also decreased. When including all homes, agreement was 0.05-0.24 for arsenic (8 miles), and 0.31-0.33 for nitrate (6 miles). Focusing on the closest source yielded little improvement. Agreement was greatest among homes with private wells. For homes on a water system, agreement improved considerably if we included only sources serving the relevant system (rho = 0.29 for arsenic, rho = 0.60 for nitrate). Historical water quality databases show some promise for categorizing epidemiologic study participants in terms of relative tap water nitrate levels. Nonetheless, such records-based methods must be used with caution, and their use for arsenic may be limited.
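    The validation described above amounts to comparing measured home values with summaries of nearby public monitoring records and computing Spearman's rho. The sketch below does this with synthetic coordinates and concentrations; the radius, grid, and distributions are placeholders rather than the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
homes = rng.uniform(0, 50, size=(40, 2))                # home locations on a 50 km grid
home_arsenic = rng.lognormal(-0.5, 0.8, size=40)        # measured tap-water arsenic, ug/L

sources = rng.uniform(0, 50, size=(300, 2))             # public monitoring locations
source_arsenic = rng.lognormal(-0.5, 0.8, size=300)     # monitoring-record arsenic, ug/L

def records_based_estimate(home, radius_km=10.0):
    """Mean of monitoring records within the given radius of a home (NaN if none)."""
    d = np.linalg.norm(sources - home, axis=1)
    nearby = source_arsenic[d <= radius_km]
    return np.nan if nearby.size == 0 else nearby.mean()

estimates = np.array([records_based_estimate(h) for h in homes])
ok = ~np.isnan(estimates)
rho, p = spearmanr(home_arsenic[ok], estimates[ok])
print(f"Spearman rho = {rho:.2f} (n = {int(ok.sum())}, p = {p:.2f})")
```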

  4. A web tool for STORET/WQX water quality data retrieval and Best Management Practice scenario suggestion.

    PubMed

    Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae

    2015-03-01

    A Total Maximum Daily Load (TMDL) is a water quality standard used to regulate the water quality of streams, rivers, and lakes. A wide range of approaches is currently used to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states to evaluate the relationship between flow and pollutant loading, along with other models and approaches. A web-based LDC Tool was developed to facilitate development of FDC and LDC as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides use of water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. via web access. Moreover, the web-based tool identifies required pollutant reductions to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop the LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at the least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
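    A load duration curve pairs each flow-exceedance percentile with the allowable load (flow times the water quality criterion times a unit conversion) and compares observed sample loads against it. The sketch below builds that comparison from synthetic flows and grab samples; the criterion and all data are placeholders, and this is not the web tool's implementation.

```python
import numpy as np

rng = np.random.default_rng(7)
daily_flow_cfs = rng.lognormal(mean=4.0, sigma=1.0, size=3650)   # ten years of daily flow
criterion_mg_l = 0.3                                             # hypothetical criterion
CFS_MGL_TO_KG_DAY = 2.4466                                       # unit conversion factor

# Flow duration curve: flow as a function of exceedance probability
exceedance = np.linspace(0.01, 0.99, 99)
fdc_flow = np.quantile(daily_flow_cfs, 1 - exceedance)
allowable_load = fdc_flow * criterion_mg_l * CFS_MGL_TO_KG_DAY   # kg/day

# Observed loads from grab samples (flow, concentration pairs)
sample_flow = rng.choice(daily_flow_cfs, 30)
sample_conc = rng.lognormal(np.log(0.25), 0.5, 30)
observed_load = sample_flow * sample_conc * CFS_MGL_TO_KG_DAY
sample_exceedance = np.array([(daily_flow_cfs > q).mean() for q in sample_flow])

exceeds = observed_load > np.interp(sample_exceedance, exceedance, allowable_load)
print(f"{int(exceeds.sum())} of {len(exceeds)} samples exceed the allowable load")
```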

  5. Nonparametric EROC analysis for observer performance evaluation on joint detection and estimation tasks

    NASA Astrophysics Data System (ADS)

    Wunderlich, Adam; Goossens, Bart

    2014-03-01

    The majority of the literature on task-based image quality assessment has focused on lesion detection tasks, using the receiver operating characteristic (ROC) curve, or related variants, to measure performance. However, since many clinical image evaluation tasks involve both detection and estimation (e.g., estimation of kidney stone composition, estimation of tumor size), there is a growing interest in performance evaluation for joint detection and estimation tasks. To evaluate observer performance on such tasks, Clarkson introduced the estimation ROC (EROC) curve, and the area under the EROC curve as a summary figure of merit. In the present work, we propose nonparametric estimators for practical EROC analysis from experimental data, including estimators for the area under the EROC curve and its variance. The estimators are illustrated with a practical example comparing MRI images reconstructed from different k-space sampling trajectories.

  6. Comparison between deterministic and statistical wavelet estimation methods through predictive deconvolution: Seismic to well tie example from the North Sea

    NASA Astrophysics Data System (ADS)

    de Macedo, Isadora A. S.; da Silva, Carolina B.; de Figueiredo, J. J. S.; Omoboya, Bode

    2017-01-01

    Wavelet estimation as well as seismic-to-well tie procedures are at the core of every seismic interpretation workflow. In this paper we perform a comparative study of wavelet estimation methods for seismic-to-well tie. Two approaches to wavelet estimation are discussed: a deterministic estimation, based on both seismic and well log data, and a statistical estimation, based on predictive deconvolution and the classical assumptions of the convolutional model, which provides a minimum-phase wavelet. Our algorithms for both wavelet estimation methods introduce a semi-automatic approach to determine the optimum parameters of the deterministic and statistical estimations and, further, to estimate the optimum seismic wavelets by searching for the highest correlation coefficient between the recorded and synthetic traces when the time-depth relationship is accurate. Tests with numerical data yield some qualitative conclusions, obtained by comparing deterministic and statistical wavelet estimation in detail, that are likely useful for seismic inversion and interpretation of field data. The feasibility of this approach is verified on real seismic and well data from the Viking Graben field, North Sea, Norway. Our results also show the influence of washout zones in the well log data on the quality of the well-to-seismic tie.
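    One textbook construction of a statistical wavelet, related to the assumptions behind predictive deconvolution, takes the wavelet amplitude spectrum from the trace power spectrum (assuming white reflectivity) and attaches a minimum phase via the cepstral method. The sketch below applies that construction to a synthetic trace; it illustrates the general idea and is not the authors' implementation.

```python
import numpy as np

def minimum_phase_wavelet(trace, wavelet_len=64, eps=1e-8):
    """Minimum-phase wavelet whose amplitude spectrum matches the trace periodogram."""
    n_fft = 1 << (len(trace) - 1).bit_length()           # next power of two (zero-pads)
    power = np.abs(np.fft.fft(trace, n_fft)) ** 2 / len(trace)
    log_amp = 0.5 * np.log(power + eps)

    # Cepstral folding: exp(FFT(folded real cepstrum)) is the minimum-phase spectrum
    cep = np.fft.ifft(log_amp).real
    folded = np.zeros_like(cep)
    folded[0] = cep[0]
    folded[1:n_fft // 2] = 2.0 * cep[1:n_fft // 2]
    folded[n_fft // 2] = cep[n_fft // 2]
    wavelet = np.fft.ifft(np.exp(np.fft.fft(folded))).real
    return wavelet[:wavelet_len]

rng = np.random.default_rng(0)
reflectivity = rng.normal(0, 1, 2000)                     # white reflectivity series
true_wavelet = np.exp(-0.5 * ((np.arange(64) - 10) / 4.0) ** 2)
trace = np.convolve(reflectivity, true_wavelet, mode="same")
est = minimum_phase_wavelet(trace)
print("fraction of wavelet energy in first 20 samples:",
      round(float((est[:20] ** 2).sum() / (est ** 2).sum()), 2))
```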

  7. Competition and quality in a physiotherapy market with fixed prices.

    PubMed

    Pekola, Piia; Linnosmaa, Ismo; Mikkola, Hennamari

    2017-01-01

    Our study focuses on competition and quality in physiotherapy organized and regulated by the Social Insurance Institution of Finland (Kela). We first derive a hypothesis with a theoretical model and then perform empirical analyses of the data. Within the physiotherapy market, prices are regulated by Kela, and after registration eligible firms are accepted into a pool of firms from which patients choose service providers based on their individual preferences. Using two-stage least squares (2SLS) estimation, we analyzed the relationship among quality, competition and the regulated price. According to the results, competition has a statistically significant (yet weak) negative effect (p = 0.019) on quality. The outcome for quality is likely caused by imperfect information: it seems that Kela has provided patients with too little information about the quality of the service.
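
    As a rough illustration of the 2SLS setup, the sketch below regresses a quality measure on competition, instrumenting competition in a first stage. The variable names, the instrument, and the simulated data are hypothetical placeholders, not the authors' actual specification.

```python
import numpy as np

def two_stage_least_squares(y, endog, instruments, exog=None):
    """Minimal 2SLS: the first stage projects the endogenous regressor on the
    instruments (plus a constant), the second stage uses the fitted values."""
    n = len(y)
    const = np.ones((n, 1))
    exog = const if exog is None else np.column_stack([const, exog])

    Z = np.column_stack([exog, instruments])            # first-stage design
    endog_hat = Z @ np.linalg.lstsq(Z, endog, rcond=None)[0]

    X = np.column_stack([exog, endog_hat])              # second-stage design
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return beta  # last coefficient = effect of the instrumented regressor

# Hypothetical data: quality score, local competition intensity, and an instrument.
rng = np.random.default_rng(1)
instrument = rng.normal(size=500)
competition = 0.8 * instrument + rng.normal(size=500)
quality = 2.0 - 0.3 * competition + rng.normal(size=500)
print(two_stage_least_squares(quality, competition, instrument.reshape(-1, 1)))
```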

  8. Analyzing compound and project progress through multi-objective-based compound quality assessment.

    PubMed

    Nissink, J Willem M; Degorce, Sébastien

    2013-05-01

    Compound-quality scoring methods designed to evaluate multiple drug properties concurrently are useful to analyze and prioritize output from drug-design efforts. However, formalized multiparameter optimization approaches are not widely used in drug design. We rank molecules synthesized in drug-discovery projects using simple, aggregated desirability functions reflecting medicinal chemistry 'rules'. Our quality score deals transparently with missing data, a key requirement in drug-hunting projects where data availability is often limited. We further estimate confidence in the interpretation of such a compound-quality measure. Scores and associated confidences provide systematic insight into the quality of emerging chemical equity. Tracking the quality of synthetic output over time yields valuable insight into the progress of drug-design teams, with potential applications in risk and resource management of a drug portfolio.
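
    A rough sketch of an aggregated desirability score of the kind described, with transparent handling of missing data. The property names, desirability cut-offs, and the confidence proxy are illustrative assumptions, not the authors' published rules.

```python
import math

def linear_desirability(value, low, high):
    """Map a property value to [0, 1]: 1 at/below `low`, 0 at/above `high`."""
    if value is None:
        return None                        # missing data stays missing
    return min(1.0, max(0.0, (high - value) / (high - low)))

def compound_quality(props):
    """Geometric mean of the available desirabilities; also report the fraction
    of properties actually measured, as a simple confidence proxy."""
    rules = {                               # (low, high) cut-offs, illustrative only
        "logP":       (3.0, 5.0),
        "mol_weight": (400.0, 550.0),
        "hERG_pIC50": (4.5, 6.0),
    }
    scores = [linear_desirability(props.get(k), *v) for k, v in rules.items()]
    available = [s for s in scores if s is not None]
    if not available:
        return 0.0, 0.0
    quality = math.exp(sum(math.log(max(s, 1e-6)) for s in available) / len(available))
    confidence = len(available) / len(rules)
    return quality, confidence

print(compound_quality({"logP": 2.5, "mol_weight": 420.0}))   # hERG value missing
```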

  9. Model improvements and validation of TerraSAR-X precise orbit determination

    NASA Astrophysics Data System (ADS)

    Hackel, S.; Montenbruck, O.; Steigenberger, P.; Balss, U.; Gisinger, C.; Eineder, M.

    2017-05-01

    The radar imaging satellite mission TerraSAR-X requires precisely determined satellite orbits for validating geodetic remote sensing techniques. Since the achieved quality of the operationally derived, reduced-dynamic (RD) orbit solutions limits the capabilities of the synthetic aperture radar (SAR) validation, an effort is made to improve the estimated orbit solutions. This paper discusses the benefits of refined dynamical models on orbit accuracy as well as on the estimated empirical accelerations, and compares different dynamic models in an RD orbit determination. Modeling aspects discussed in the paper include the use of a macro-model for drag and radiation pressure computation, the use of high-quality atmospheric density and wind models, as well as the benefit of high-fidelity gravity and ocean tide models. The Sun-synchronous dusk-dawn orbit geometry of TerraSAR-X results in a particularly high correlation between solar radiation pressure modeling and the estimated normal-direction positions. Furthermore, this mission offers a unique suite of independent sensors for orbit validation. Several parameters serve as quality indicators for the estimated satellite orbit solutions. These include the magnitude of the estimated empirical accelerations, satellite laser ranging (SLR) residuals, and SLR-based orbit corrections. Moreover, the radargrammetric distance measurements of the SAR instrument are selected for assessing the quality of the orbit solutions and compared to the SLR analysis. The use of high-fidelity satellite dynamics models in the RD approach is shown to clearly improve the orbit quality compared to simplified models and loosely constrained empirical accelerations. The estimated empirical accelerations are substantially reduced, by 30% in the tangential direction, when working with the refined dynamical models. Likewise, the SLR residuals are reduced from -3 ± 17 to 2 ± 13 mm, and the SLR-derived normal-direction position corrections are reduced from 15 to 6 mm, obtained from the 2012-2014 period. The radar range bias is reduced from -10.3 to -6.1 mm with the updated orbit solutions, which coincides with the reduced standard deviation of the SLR residuals. The improvements are mainly driven by the satellite macro-model for the purpose of solar radiation pressure modeling, improved atmospheric density models, and the use of state-of-the-art gravity field models.

  10. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks

    PubMed Central

    Dawadi, Prafulla N.; Cook, Diane J.; Schmitter-Edgecombe, Maureen

    2014-01-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments. PMID:25530925

  11. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks.

    PubMed

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2013-11-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments.

  12. Exploring quality of life as an intervention outcome among women with stress-related disorders participating in work rehabilitation.

    PubMed

    Eklund, Mona

    2015-01-01

    Findings from quality of life studies are often inconclusive for reasons such as: i) estimates may address different aspects of quality of life and thus produce different outcomes; ii) quality of life is largely determined by self-factors; and iii) people with a long-term condition rate their quality of life better than those who have had their condition for a short duration. This makes quality of life a complex phenomenon to measure. The above explanations served as hypotheses for this methodologically oriented paper, based on a longitudinal study of women with stress-related disorders receiving work rehabilitation. Eighty-four women participating in a lifestyle intervention or care as usual were compared. Self-ratings of "general quality of life" and a summarized "satisfaction with different life domains" index (according to the Manchester Short Assessment of Quality of Life) and two self-factors (self-esteem and self-mastery) were administered at admission and at a 6-month follow-up. Participant age and number of months on sick leave prior to rehabilitation were used as two proxies for duration of the condition. General quality of life distinguished between the groups, whereas satisfaction with life domains did not. Self-esteem and self-mastery were related to both quality of life aspects. Age was related to both estimates of quality of life, whereas duration of sick leave was unrelated to both. General quality of life and satisfaction with life domains produced different results. Outcome studies should apply more than one operationalization of quality of life, and self-factors should be considered as important determinants of quality of life. Duration of the condition needs to be acknowledged as well when interpreting levels of quality of life, although the current study could not present any clear-cut findings in this respect.

  13. Tetanus toxoid immunization to reduce mortality from neonatal tetanus.

    PubMed

    Blencowe, Hannah; Lawn, Joy; Vandelaer, Jos; Roper, Martha; Cousens, Simon

    2010-04-01

    Neonatal tetanus remains an important and preventable cause of neonatal mortality globally. Large reductions in neonatal tetanus deaths have been reported following major increases in the coverage of tetanus toxoid immunization, yet the level of evidence for the mortality effect of tetanus toxoid immunization is surprisingly weak, with only two trials considered in a Cochrane review. To review the evidence for and estimate the effect on neonatal tetanus mortality of immunization with tetanus toxoid of pregnant women, or women of childbearing age, we conducted a systematic review of multiple databases. Standardized abstraction forms were used. Individual study quality and the overall quality of evidence were assessed using an adaptation of the GRADE approach. Meta-analyses were performed. Only one randomised controlled trial (RCT) and one well-controlled cohort study were identified which met inclusion criteria for meta-analysis. Immunization of pregnant women or women of childbearing age with at least two doses of tetanus toxoid is estimated to reduce mortality from neonatal tetanus by 94% [95% confidence interval (CI) 80-98%]. Additionally, another RCT with a case definition based on day of death, 3 case-control studies and 1 before-and-after study gave consistent results. Based on the consistency of the mortality data, the very large effect size, and the fact that the data are all from low/middle-income countries, the overall quality of the evidence was judged to be moderate. This review uses a standard approach to provide a transparent estimate of the high impact of tetanus toxoid immunization on neonatal tetanus.

  14. Robust and transferable quantification of NMR spectral quality using IROC analysis

    NASA Astrophysics Data System (ADS)

    Zambrello, Matthew A.; Maciejewski, Mark W.; Schuyler, Adam D.; Weatherby, Gerard; Hoch, Jeffrey C.

    2017-12-01

    Non-Fourier methods are increasingly utilized in NMR spectroscopy because of their ability to handle nonuniformly-sampled data. However, non-Fourier methods present unique challenges due to their nonlinearity, which can produce nonrandom noise and render conventional metrics for spectral quality such as signal-to-noise ratio unreliable. The lack of robust and transferable metrics (i.e. applicable to methods exhibiting different nonlinearities) has hampered comparison of non-Fourier methods and nonuniform sampling schemes, preventing the identification of best practices. We describe a novel method, in situ receiver operating characteristic analysis (IROC), for characterizing spectral quality based on the Receiver Operating Characteristic curve. IROC utilizes synthetic signals added to empirical data as "ground truth", and provides several robust scalar-valued metrics for spectral quality. This approach avoids problems posed by nonlinear spectral estimates, and provides a versatile quantitative means of characterizing many aspects of spectral quality. We demonstrate applications to parameter optimization in Fourier and non-Fourier spectral estimation, critical comparison of different methods for spectrum analysis, and optimization of nonuniform sampling schemes. The approach will accelerate the discovery of optimal approaches to nonuniform sampling experiment design and non-Fourier spectrum analysis for multidimensional NMR.
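
    The core of the IROC idea, scoring candidate peak locations and computing a ROC-style figure of merit against the known positions of injected synthetic signals, can be sketched as follows. The one-dimensional "spectrum", the peak-scoring rule (raw amplitude), and the injected signal strength are simplified assumptions, not the authors' implementation.

```python
import numpy as np

def rank_auc(scores_true, scores_null):
    """Nonparametric (Mann-Whitney) estimate of the area under the ROC curve."""
    auc = 0.0
    for t in scores_true:
        for n in scores_null:
            auc += 1.0 if t > n else (0.5 if t == n else 0.0)
    return auc / (len(scores_true) * len(scores_null))

# Hypothetical 1-D "spectrum": noise plus synthetic signals at known positions.
rng = np.random.default_rng(2)
spectrum = rng.normal(0.0, 1.0, 2048)
true_positions = rng.choice(2048, size=20, replace=False)
spectrum[true_positions] += 3.0                      # injected ground-truth peaks

null_positions = np.setdiff1d(np.arange(2048), true_positions)[:200]
auc = rank_auc(spectrum[true_positions], spectrum[null_positions])
print(f"AUC against injected signals: {auc:.3f}")
```

    Because the "positives" are injected and therefore known exactly, the same scalar can be recomputed for any nonlinear spectral estimate or sampling scheme, which is what makes the metric transferable.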

  15. Producing remote sensing-based emission estimates of prescribed burning in the contiguous United States for the U.S. Environmental Protection Agency 2011 National Emissions Inventory

    NASA Astrophysics Data System (ADS)

    McCarty, J. L.; Pouliot, G. A.; Soja, A. J.; Miller, M. E.; Rao, T.

    2013-12-01

    Prescribed fires in agricultural landscapes generally produce smaller burned areas than wildland fires but are important contributors to emissions impacting air quality and human health. Currently, there are a variety of available satellite-based estimates of crop residue burning, including the NOAA/NESDIS Hazard Mapping System (HMS), the Satellite Mapping Automated Reanalysis Tool for Fire Incident Reconciliation (SMARTFIRE 2), the Moderate Resolution Imaging Spectroradiometer (MODIS) Official Burned Area Product (MCD45A1), the MODIS Direct Broadcast Burned Area Product (MCD64A1), the MODIS Active Fire Product (MCD14ML), and a regionally-tuned 8-day cropland differenced Normalized Burn Ratio product for the contiguous U.S. The purpose of this NASA-funded research was to refine the regionally-tuned product utilizing higher spatial resolution crop type data from the USDA NASS Cropland Data Layer and burned area training data from field work and high resolution commercial satellite data to improve the U.S. Environmental Protection Agency's (EPA) National Emissions Inventory (NEI). The final product delivered to the EPA included a detailed database of 25 different atmospheric emissions at the county level, emission distributions by crop type and seasonality, and GIS data. The resulting emission databases were shared with the U.S. EPA and regional offices, the National Wildfire Coordinating Group (NWCG) Smoke Committee, and all 48 states in the contiguous U.S., with detailed error estimations for Wyoming and Indiana and detailed analyses of results for Florida, Minnesota, North Dakota, Oklahoma, and Oregon. This work also provided opportunities to discover the different needs of federal and state partners, including the various geospatial capabilities and platforms across the many users, and to incorporate expert air quality, policy, and land management knowledge into quantitative earth observation-based estimations of prescribed fire emissions. Finally, this work created direct communication paths between federal and state partners and the scientists creating the remote sensing-based products, further improving the geospatial products and understanding of air quality impacts of prescribed burning at the state, regional, and national scales.

  16. Tchebichef moment based restoration of Gaussian blurred images.

    PubMed

    Kumar, Ahlad; Paramesran, Raveendran; Lim, Chern-Loon; Dass, Sarat C

    2016-11-10

    With the knowledge of how edges vary in the presence of a Gaussian blur, a method that uses low-order Tchebichef moments is proposed to estimate the blur parameters: sigma (σ) and size (w). The difference between the Tchebichef moments of the original and the reblurred images is used as feature vectors to train an extreme learning machine for estimating the blur parameters (σ,w). The effectiveness of the proposed method to estimate the blur parameters is examined using cross-database validation. The estimated blur parameters from the proposed method are used in the split Bregman-based image restoration algorithm. A comparative analysis of the proposed method with three existing methods using all the images from the LIVE database is carried out. The results show that the proposed method in most of the cases performs better than the three existing methods in terms of the visual quality evaluated using the structural similarity index.

  17. Comparing emission rates derived from a model with a plume-based approach and quantifying the contribution of vehicle classes to on-road emissions and air quality.

    PubMed

    Xu, Junshi; Wang, Jonathan; Hilker, Nathan; Fallah-Shorshani, Masoud; Saleh, Marc; Tu, Ran; Wang, An; Minet, Laura; Stogios, Christos; Evans, Greg; Hatzopoulou, Marianne

    2018-06-05

    This study presents a comparison of fleet averaged emission factors (EFs) derived from a traffic emission model with EFs estimated using plume-based measurements, including an investigation of the contribution of vehicle classes to carbon monoxide (CO), nitrogen oxides (NOx), and elemental carbon (EC) along an urban corridor. To this end, a field campaign was conducted over one week in June 2016 on an arterial road in Toronto, Canada. Traffic data were collected using a traffic camera and a radar, while air quality was characterized using two monitoring stations: one located at ground level and another at the rooftop of a four-storey building. A traffic simulation model was calibrated and validated, and second-by-second speed profiles for all vehicle trajectories were extracted to model emissions. In addition, dispersion modelling was conducted to identify the extent to which differences in emissions translate to differences in near-road concentrations. Our results indicate that modelled EFs for CO and NOx are twice as high as plume-based EFs. In addition, modelled results indicate that transit bus emissions accounted for 60% and 70% of the total emissions of NOx and EC, respectively. Transit bus emission rates in g/passenger-km for NOx and EC were up to 8 and 22 times the emission rates of passenger cars. In contrast, the Toronto streetcars, which are electrically fuelled, were found to improve near-road air quality despite their negative impact on traffic speeds. Finally, we observe that the difference in estimated concentrations derived from the two methods is not as large as the difference in estimated emissions, due to the influence of meteorology and of the urban background given that the study network is located in a busy downtown area. Implications: This study presents a comparison of fleet averaged emission factors (EFs) derived from a traffic emission model with EFs estimated using plume-based measurements, including an investigation of the contribution of vehicle classes to various pollutants. In addition, dispersion modelling was conducted to identify the extent to which differences in emissions translate to differences in near-road concentrations. We observe that the difference in estimated concentrations derived from the two methods is not as large as the difference in estimated emissions, due to the influence of meteorology and of the urban background as the study network is located in a busy downtown area.

  18. Curriculum-Based Measurement of Oral Reading: Multi-Study Evaluation of Schedule, Duration, and Dataset Quality on Progress Monitoring Outcomes

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Zopluoglu, Cengiz; Monaghen, Barbara D.; Van Norman, Ethan R.

    2013-01-01

    Curriculum-Based Measurement of Oral Reading (CBM-R) is used to collect time series data, estimate the rate of student achievement, and evaluate program effectiveness. A series of 5 studies were carried out to evaluate the validity, reliability, precision, and diagnostic accuracy of progress monitoring across a variety of progress monitoring…

  19. Loose Coupling of Wearable-Based INSs with Automatic Heading Evaluation.

    PubMed

    Bousdar Ahmed, Dina; Munoz Diaz, Estefania

    2017-11-03

    Position tracking of pedestrians by means of inertial sensors is a highly explored field of research. In fact, there are already many approaches to implement inertial navigation systems (INSs). However, most of them use a single inertial measurement unit (IMU) attached to the pedestrian's body. Since wearable devices are expected to be commonplace in the future, this work explores the implementation of an INS using two wearable-based IMUs. A loosely coupled approach is proposed to combine the outputs of wearable-based INSs, in this case one based on a pocket-mounted IMU and one based on a foot-mounted IMU. The loosely coupled fusion not only combines the outputs of the two INSs when these outputs are least erroneous, but also automatically favors the best output. This approach is named smart update. The main challenge is determining the quality of the heading estimation of each INS, which changes over time. In order to address this, a novel concept to determine the quality of the heading estimation is presented. This concept is subject to a patent application. The results show that the position error rate of the loosely coupled fusion is 10 cm/s better than either the foot INS's or the pocket INS's error rate in 95% of the cases.

  20. Evolving Improvements to TRMM Ground Validation Rainfall Estimates

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Kulie, M. S.; Marks, D. A.; Wolff, D. B.; Ferrier, B. S.; Amitai, E.; Silberstein, D. S.; Fisher, B. L.; Wang, J.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. Since the successful 1997 launch of the TRMM satellite, GV rainfall estimates have demonstrated systematic improvements directly related to improved radar and rain gauge data, modified science techniques, and software revisions. Improved rainfall estimates have resulted in higher quality GV rainfall products and subsequently, much improved evaluation products for the satellite-based precipitation estimates from TRMM. This presentation will demonstrate how TRMM GV rainfall products created in a semi-automated, operational environment have evolved and improved through successive generations. Monthly rainfall maps and rainfall accumulation statistics for each primary site will be presented for each stage of GV product development. Contributions from individual product modifications involving radar reflectivity (Ze)-rain rate (R) relationship refinements, improvements in rain gauge bulk-adjustment and data quality control processes, and improved radar and gauge data will be discussed. Finally, it will be demonstrated that as GV rainfall products have improved, rainfall estimation comparisons between GV and satellite have converged, lending confidence to the satellite-derived precipitation measurements from TRMM.
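
    For context, the Ze-R refinements mentioned above operate on power-law relationships of the form Z = aR^b. A minimal conversion from reflectivity in dBZ to rain rate is shown below; the coefficients are common illustrative values, not the site-tuned ones used for the GV products.

```python
def rain_rate_from_dbz(dbz, a=300.0, b=1.4):
    """Invert the Z = a * R**b power law.
    Z is in linear units (mm^6/m^3), R in mm/h; a and b are illustrative only."""
    z_linear = 10.0 ** (dbz / 10.0)
    return (z_linear / a) ** (1.0 / b)

for dbz in (20, 30, 40, 50):
    print(dbz, "dBZ ->", round(rain_rate_from_dbz(dbz), 2), "mm/h")
```

    Refining the Ze-R relationship then amounts to re-fitting a and b (often per site or season) against gauge data, which is one of the product modifications the presentation describes.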

  1. Inland-coastal water interaction: Remote sensing application for shallow-water quality and algal blooms modeling

    NASA Astrophysics Data System (ADS)

    Melesse, Assefa; Hajigholizadeh, Mohammad; Blakey, Tara

    2017-04-01

    In this study, Landsat 8 and Sea-Viewing Wide Field-of-View Sensor (SeaWIFS) data were used to model the spatiotemporal changes of water quality parameters: Landsat 8 for turbidity, chlorophyll-a (chl-a), total phosphate, and total nitrogen (TN), and SeaWIFS for algal blooms. The study was conducted in Florida Bay, south Florida, and model outputs were compared with in-situ observed data. The Landsat 8 based study found that the predictive models to estimate chl-a and turbidity concentrations, developed through the use of stepwise multiple linear regression (MLR), gave high coefficients of determination in the dry season (wet season): R2 = 0.86 (0.66) for chl-a and R2 = 0.84 (0.63) for turbidity. Total phosphate and TN were estimated using best-fit multiple linear regression models as a function of Landsat TM and OLI bands and ground data, and showed high coefficients of determination in the dry season (wet season): R2 = 0.74 (0.69) for total phosphate and R2 = 0.82 (0.82) for TN. Similarly, the ability of SeaWIFS to retrieve chl-a from optically shallow coastal waters by applying algorithms specific to the pixels' benthic class was evaluated. Benthic class was determined through satellite image-based classification methods. It was found that the benthic-class-based chl-a modeling algorithm performed better than the existing regionally-tuned approach. Evaluation of the residuals indicated the potential for further improvement to chl-a estimation through finer characterization of benthic environments. Key words: Landsat, SeaWIFS, water quality, Florida Bay, chl-a, turbidity
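
    A minimal sketch of the kind of band-based multiple linear regression used for the Landsat retrievals; the band combinations, the synthetic reflectances, and the in-situ chl-a values are hypothetical, not the study's fitted models.

```python
import numpy as np

def fit_mlr(predictors, observed):
    """Ordinary least squares fit of a water quality variable on band predictors."""
    X = np.column_stack([np.ones(len(observed)), predictors])
    coeffs, *_ = np.linalg.lstsq(X, observed, rcond=None)
    predicted = X @ coeffs
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot          # coefficients and R^2

# Hypothetical surface reflectances (blue, green, red, NIR) and in-situ chl-a (ug/L).
rng = np.random.default_rng(3)
bands = rng.uniform(0.01, 0.15, size=(40, 4))
chl_a = 5.0 + 80.0 * (bands[:, 1] / bands[:, 0]) + rng.normal(0, 2.0, 40)

predictors = np.column_stack([bands, bands[:, 1] / bands[:, 0]])   # bands plus a green/blue ratio
coeffs, r2 = fit_mlr(predictors, chl_a)
print("Calibration R^2 =", round(r2, 2))
```

    A stepwise variant would add or drop predictors one at a time based on a fit criterion; the fixed predictor set here keeps the sketch short.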

  2. Calibration Transfer Between a Bench Scanning and a Submersible Diode Array Spectrophotometer for In Situ Wastewater Quality Monitoring in Sewer Systems.

    PubMed

    Brito, Rita S; Pinheiro, Helena M; Ferreira, Filipa; Matos, José S; Pinheiro, Alexandre; Lourenço, Nídia D

    2016-03-01

    Online monitoring programs based on spectroscopy have a high application potential for the detection of hazardous wastewater discharges in sewer systems. Wastewater hydraulics poses a challenge for in situ spectroscopy, especially when the system includes storm water connections leading to rapid changes in water depth, velocity, and the water quality matrix. Thus, there is a need to optimize and fix the location of in situ instruments, limiting their availability for calibration. In this context, the development of calibration models on bench spectrophotometers to estimate wastewater quality parameters from spectra acquired with in situ instruments could be very useful. However, spectra contain information not only from the samples, but also from the spectrophotometer, generally invalidating this approach. The use of calibration transfer methods is a promising solution to this problem. In this study, calibration models were developed using interval partial least squares (iPLS) for the estimation of total suspended solids (TSS) and chemical oxygen demand (COD) in sewage from ultraviolet-visible spectra acquired in a bench scanning spectrophotometer. The feasibility of calibration transfer to a submersible diode-array instrument, to be subsequently operated in situ, was assessed using three procedures: slope and bias correction (SBC); single wavelength standardization (SWS) on mean spectra; and local centering (LC). The results showed that SBC was the most adequate for the available data, adding insignificant error to the base model estimates. Single wavelength standardization was a close second best, potentially more robust, and independent of the base iPLS model. Local centering was shown to be inadequate for the samples and instruments used. © The Author(s) 2016.
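
    Of the three transfer procedures compared, slope and bias correction is the simplest: estimates produced for the secondary (submersible) instrument are corrected by a line fitted between the two instruments' estimates on shared transfer samples. A minimal sketch with invented TSS numbers, not the study's data.

```python
import numpy as np

def slope_bias_correction(primary_estimates, secondary_estimates):
    """Fit secondary-instrument estimates to primary-instrument estimates on
    shared transfer samples and return a correction function for new readings."""
    slope, bias = np.polyfit(secondary_estimates, primary_estimates, deg=1)
    return lambda new_estimates: slope * np.asarray(new_estimates) + bias

# Hypothetical TSS estimates (mg/L) for the same samples on both instruments.
bench       = np.array([120., 180., 240., 310., 400.])
submersible = np.array([100., 150., 205., 270., 355.])

correct = slope_bias_correction(bench, submersible)
print(correct([130., 300.]))   # corrected estimates for new submersible readings
```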

  3. Energy efficiency in wireless communication systems

    DOEpatents

    Caffrey, Michael Paul; Palmer, Joseph McRae

    2012-12-11

    Wireless communication systems and methods utilize one or more remote terminals, one or more base terminals, and a communication channel between the remote terminal(s) and base terminal(s). The remote terminal applies a direct sequence spreading code to a data signal at a spreading factor to provide a direct sequence spread spectrum (DSSS) signal. The DSSS signal is transmitted over the communication channel to the base terminal which can be configured to despread the received DSSS signal by a spreading factor matching the spreading factor utilized to spread the data signal. The remote terminal and base terminal can dynamically vary the matching spreading factors to adjust the data rate based on an estimation of operating quality over time between the remote terminal and base terminal such that the amount of data being transmitted is substantially maximized while providing a specified quality of service.
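
    A toy illustration of the spreading and despreading described in the patent abstract, with the spreading factor as the knob that trades data rate for robustness. The code sequence, bit stream, and noise level are arbitrary examples, not values from the patent.

```python
import numpy as np

def spread(bits, code):
    """Direct-sequence spread: each data bit (+/-1) multiplies the whole code."""
    return np.concatenate([b * code for b in bits])

def despread(chips, code):
    """Correlate each code-length block with the code and take the sign."""
    blocks = chips.reshape(-1, len(code))
    return np.sign(blocks @ code)

rng = np.random.default_rng(4)
spreading_factor = 16                                   # dynamically adjustable in the system described
code = rng.choice([-1.0, 1.0], size=spreading_factor)   # pseudo-random spreading code
bits = np.array([1.0, -1.0, -1.0, 1.0])

noisy_chips = spread(bits, code) + rng.normal(0, 1.0, bits.size * spreading_factor)
print(despread(noisy_chips, code))   # recovers [ 1. -1. -1.  1.] with high probability
```

    Raising the spreading factor lowers the effective data rate but increases the processing gain of the correlation, which is the quality-versus-rate trade-off the terminals adjust dynamically.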

  4. The cost effectiveness of a quality improvement program to reduce maternal and fetal mortality in a regional referral hospital in Accra, Ghana.

    PubMed

    Goodman, David M; Ramaswamy, Rohit; Jeuland, Marc; Srofenyoh, Emmanuel K; Engmann, Cyril M; Olufolabi, Adeyemi J; Owen, Medge D

    2017-01-01

    To evaluate the cost-effectiveness of a quality improvement intervention aimed at reducing maternal and fetal mortality in Accra, Ghana. Quasi-experimental, time-sequence intervention, retrospective cost-effectiveness analysis. Data were collected on the cost and outcomes of a 5-year Kybele-Ghana Health Service Quality Improvement (QI) intervention conducted at Ridge Regional Hospital, a tertiary referral center in Accra, Ghana, focused on systems, personnel, and communication. Maternal deaths prevented were estimated comparing observed rates with counterfactual projections of maternal mortality and case-fatality rates for hypertensive disorders of pregnancy and obstetric hemorrhage. Stillbirths prevented were estimated based on counterfactual estimates of stillbirth rates. Cost-effectiveness was then calculated using estimated disability-adjusted life years averted and subjected to Monte Carlo and one-way sensitivity analyses to test the importance of assumptions inherent in the calculations. Incremental Cost-effectiveness ratio (ICER), which represents the cost per disability-adjusted life-year (DALY) averted by the intervention compared to a model counterfactual. From 2007-2011, 39,234 deliveries were affected by the QI intervention implemented at Ridge Regional Hospital. The total budget for the program was $2,363,100. Based on program estimates, 236 (±5) maternal deaths and 129 (±13) intrapartum stillbirths were averted (14,876 DALYs), implying an ICER of $158 ($129-$195) USD. This value is well below the highly cost-effective threshold of $1268 USD. Sensitivity analysis considered DALY calculation methods, and yearly prevalence of risk factors and case fatality rates. In each of these analyses, the program remained highly cost-effective with an ICER ranging from $97-$218. QI interventions to reduce maternal and fetal mortality in low resource settings can be highly cost effective. Cost-effectiveness analysis is feasible and should regularly be conducted to encourage fiscal responsibility in the pursuit of improved maternal and child health.
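
    The headline figure of merit is the incremental cost-effectiveness ratio. As a minimal worked illustration using the figures quoted in the abstract (total budget and DALYs averted); the small difference from the reported $158 reflects the counterfactual comparator details of the authors' model.

```python
def icer(program_cost_usd, dalys_averted, comparator_cost_usd=0.0):
    """Incremental cost-effectiveness ratio: extra cost per DALY averted."""
    return (program_cost_usd - comparator_cost_usd) / dalys_averted

# Figures from the abstract: $2,363,100 total budget, 14,876 DALYs averted.
print(round(icer(2_363_100, 14_876)))   # ~159 USD per DALY averted
```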

  5. Modeling crop residue burning experiments to evaluate smoke emissions and plume transport.

    PubMed

    Zhou, Luxi; Baker, Kirk R; Napelenok, Sergey L; Pouliot, George; Elleman, Robert; O'Neill, Susan M; Urbanski, Shawn P; Wong, David C

    2018-06-15

    Crop residue burning is a common land management practice that results in emissions of a variety of pollutants with negative health impacts. Modeling systems are used to estimate air quality impacts of crop residue burning to support retrospective regulatory assessments and also for forecasting purposes. Ground and airborne measurements from a recent field experiment in the Pacific Northwest focused on cropland residue burning were used to evaluate model performance in capturing surface and aloft impacts from the burning events. The Community Multiscale Air Quality (CMAQ) model was used to simulate multiple crop residue burns with 2 km grid spacing, using field-specific information and also the more general assumptions traditionally used to support National Emission Inventory based assessments. Field-study-specific information, which includes area burned, fuel consumption, and combustion completeness, resulted in increased biomass consumption of 123 tons (a 60% increase) on average compared to consumption estimated with default methods in the National Emission Inventory (NEI) process. Buoyancy heat flux, a key parameter for model-predicted fire plume rise, estimated from fuel loading obtained from field measurements, can be 30% to 200% greater than when estimated using default field information. The increased buoyancy heat flux resulted in higher plume rise by 30% to 80%. This evaluation indicates that the regulatory air quality modeling system can replicate intensity and transport (horizontal and vertical) features for crop residue burning in this region when region-specific information is used to inform emissions and plume rise calculations. Further, the previous vertical emissions allocation treatment of placing all cropland residue burning emissions in the surface layer does not compare well with the measured plume structure, and these types of burns should be modeled more like prescribed fires, with plume rise based on an estimate of buoyancy. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Toward coordinated space-based air quality, carbon cycle, and ecosystem measurements to quantify air quality-ecosystem interactions

    NASA Astrophysics Data System (ADS)

    Neu, J. L.; Schimel, D.; Lerdau, M.; Drewry, D.; Fu, D.; Payne, V.; Bowman, K. W.; Worden, J. R.

    2016-12-01

    Tropospheric ozone concentrations are increasing in many regions of the world, and this ozone can severely damage vegetation. Ozone enters plants through their stomata and oxidizes tissues, inhibiting physiology and decreasing ecosystem productivity. Ozone has been experimentally shown to reduce crop production, with important implications for global food security as concentrations rise. Ozone damage to forests also alters productivity and carbon storage and may drive changes in species distributions and biodiversity. Process-based quantitative estimates of these ozone impacts on terrestrial ecosystems at continental to global scales as well as of feedbacks to air quality via production of volatile organic compounds (VOCs) are thus crucial to sustainable development planning. We demonstrate that leveraging planned and proposed missions to measure ozone, formaldehyde, and isoprene along with solar-induced fluorescence (SiF), evapotranspiration, and plant nitrogen content can meet the requirements of an integrated observing system for air quality-ecosystem interactions while also meeting the needs of the individual Air Quality, Carbon Cycle, and Ecosystems communities.

  7. Impact of obesity and knee osteoarthritis on morbidity and mortality in older Americans.

    PubMed

    Losina, Elena; Walensky, Rochelle P; Reichmann, William M; Holt, Holly L; Gerlovin, Hanna; Solomon, Daniel H; Jordan, Joanne M; Hunter, David J; Suter, Lisa G; Weinstein, Alexander M; Paltiel, A David; Katz, Jeffrey N

    2011-02-15

    Obesity and knee osteoarthritis are among the most frequent chronic conditions affecting Americans aged 50 to 84 years. To estimate quality-adjusted life-years lost due to obesity and knee osteoarthritis and health benefits of reducing obesity prevalence to levels observed a decade ago. The U.S. Census and obesity data from national data sources were combined with estimated prevalence of symptomatic knee osteoarthritis to assign persons aged 50 to 84 years to 4 subpopulations: nonobese without knee osteoarthritis (reference group), nonobese with knee osteoarthritis, obese without knee osteoarthritis, and obese with knee osteoarthritis. The Osteoarthritis Policy Model, a computer simulation model of knee osteoarthritis and obesity, was used to estimate quality-adjusted life-year losses due to knee osteoarthritis and obesity in comparison with the reference group. United States. U.S. population aged 50 to 84 years. Quality-adjusted life-years lost owing to knee osteoarthritis and obesity. Estimated total losses of per-person quality-adjusted life-years ranged from 1.857 in nonobese persons with knee osteoarthritis to 3.501 for persons affected by both conditions, resulting in a total of 86.0 million quality-adjusted life-years lost due to obesity, knee osteoarthritis, or both. Quality-adjusted life-years lost due to knee osteoarthritis and/or obesity represent 10% to 25% of the remaining quality-adjusted survival of persons aged 50 to 84 years. Hispanic and black women had disproportionately high losses. Model findings suggested that reversing obesity prevalence to levels seen 10 years ago would avert 178,071 cases of coronary heart disease, 889,872 cases of diabetes, and 111,206 total knee replacements. Such a reduction in obesity would increase the quantity of life by 6,318,030 years and improve life expectancy by 7,812,120 quality-adjusted years in U.S. adults aged 50 to 84 years. Comorbidity incidences were derived from prevalence estimates on the basis of life expectancy of the general population, potentially resulting in conservative underestimates. Calibration analyses were conducted to ensure comparability of model-based projections and data from external sources. The number of quality-adjusted life-years lost owing to knee osteoarthritis and obesity seems to be substantial, with black and Hispanic women experiencing disproportionate losses. Reducing mean body mass index to the levels observed a decade ago in this population would yield substantial health benefits. The National Institutes of Health and the Arthritis Foundation.

  8. Simultaneous motion estimation and image reconstruction (SMEIR) for 4D cone-beam CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Gu, Xuejun

    2013-10-15

    Purpose: Image reconstruction and motion model estimation in four-dimensional cone-beam CT (4D-CBCT) are conventionally handled as two sequential steps. Due to the limited number of projections at each phase, the image quality of 4D-CBCT is degraded by view aliasing artifacts, and the accuracy of subsequent motion modeling is decreased by the inferior 4D-CBCT. The objective of this work is to enhance both the image quality of 4D-CBCT and the accuracy of motion model estimation with a novel strategy enabling simultaneous motion estimation and image reconstruction (SMEIR). Methods: The proposed SMEIR algorithm consists of two alternating steps: (1) model-based iterative image reconstruction to obtain a motion-compensated primary CBCT (m-pCBCT) and (2) motion model estimation to obtain an optimal set of deformation vector fields (DVFs) between the m-pCBCT and other 4D-CBCT phases. The motion-compensated image reconstruction is based on the simultaneous algebraic reconstruction technique (SART) coupled with total variation minimization. During the forward- and backprojection of SART, measured projections from an entire set of 4D-CBCT are used for reconstruction of the m-pCBCT by utilizing the updated DVF. The DVF is estimated by matching the forward projection of the deformed m-pCBCT and measured projections of other phases of 4D-CBCT. The performance of the SMEIR algorithm is quantitatively evaluated on a 4D NCAT phantom. The quality of reconstructed 4D images and the accuracy of tumor motion trajectory are assessed by comparing with those resulting from conventional sequential 4D-CBCT reconstructions (FDK and total variation minimization) and motion estimation (demons algorithm). The performance of the SMEIR algorithm is further evaluated by reconstructing a lung cancer patient 4D-CBCT. Results: Image quality of 4D-CBCT is greatly improved by the SMEIR algorithm in both phantom and patient studies. When all projections are used to reconstruct a 3D-CBCT by FDK, motion-blurring artifacts are present, leading to a 24.4% relative reconstruction error in the NCAT phantom. View aliasing artifacts are present in 4D-CBCT reconstructed by FDK from 20 projections, with a relative error of 32.1%. When total variation minimization is used to reconstruct 4D-CBCT, the relative error is 18.9%. Image quality of 4D-CBCT is substantially improved by using the SMEIR algorithm and the relative error is reduced to 7.6%. The maximum error (MaxE) of tumor motion determined from the DVF obtained by demons registration on a FDK-reconstructed 4D-CBCT is 3.0, 2.3, and 7.1 mm along the left–right (L-R), anterior–posterior (A-P), and superior–inferior (S-I) directions, respectively. From the DVF obtained by demons registration on 4D-CBCT reconstructed by total variation minimization, the MaxE of tumor motion is reduced to 1.5, 0.5, and 5.5 mm along the L-R, A-P, and S-I directions. From the DVF estimated by the SMEIR algorithm, the MaxE of tumor motion is further reduced to 0.8, 0.4, and 1.5 mm along the L-R, A-P, and S-I directions, respectively. Conclusions: The proposed SMEIR algorithm is able to estimate a motion model and reconstruct motion-compensated 4D-CBCT. The SMEIR algorithm improves image reconstruction accuracy of 4D-CBCT and tumor motion trajectory estimation accuracy as compared to conventional sequential 4D-CBCT reconstruction and motion estimation.

  9. Updating of states in operational hydrological models

    NASA Astrophysics Data System (ADS)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is internationally at the forefront of hydropower scheduling on both short and long terms. The inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand and at the same time minimize spill of water and maximize available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been on improving the quality of the model states upon which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally, the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperature and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and tests the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.

  10. The Use of a Cognitive Protectant to Help Maintain Quality of Life and Cognition in Premenopausal Women with Breast Cancer Undergoing Adjuvant Chemotherapy

    DTIC Science & Technology

    2005-10-01

    Decrements in quality of life and cognitive function are experienced by women with breast cancer who are receiving adjuvant chemotherapy. These decrements can be identified in some women even several years following treatment. The majority of relevant research has been based on retrospective data in women with breast cancer. Current estimates suggest that 25% of breast cancers will be diagnosed in women under age 50, yet very little data are available regarding younger women’s cognitive function and quality of life during chemotherapy.

  11. A Deep Neural Network Model for Rainfall Estimation UsingPolarimetric WSR-88DP Radar Observations

    NASA Astrophysics Data System (ADS)

    Tan, H.; Chandra, C. V.; Chen, H.

    2016-12-01

    Rainfall estimation based on radar measurements has been an important topic for a few decades. Generally, radar rainfall estimation is conducted through parametric algorithms such as the reflectivity-rainfall relation (i.e., the Z-R relation). On the other hand, neural networks have been developed for ground rainfall estimation based on radar measurements. This nonparametric approach, which takes into account both radar observations and rainfall measurements from ground rain gauges, has been demonstrated successfully for rainfall rate estimation. However, neural network-based rainfall estimation is limited in practice due to model complexity and structure, data quality, as well as differing rainfall microphysics. Recently, the deep learning approach has been introduced in pattern recognition and machine learning areas. Compared to traditional neural networks, deep learning based methodologies have a larger number of hidden layers and a more complex structure for data representation. Through a hierarchical learning process, high-level structured information and knowledge can be extracted automatically from low-level features of the data. In this paper, we introduce a novel deep neural network model for rainfall estimation based on ground polarimetric radar measurements. The model is designed to capture the complex abstractions of radar measurements at different levels using multi-layer feature identification and extraction. The abstractions at different levels can be used independently or fused with other data sources such as satellite-based rainfall products and/or topographic data to represent the rain characteristics at a certain location. In particular, the WSR-88DP radar and rain gauge data collected in the Dallas-Fort Worth Metroplex and Florida are used extensively to train the model and for demonstration purposes. A quantitative evaluation of the deep neural network based rainfall products, based on an independent rain gauge network, will also be presented.
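
    As a rough, generic illustration of a nonparametric radar-to-rain-rate mapping of the kind described (not the authors' architecture), a small multilayer network can be fit to paired polarimetric radar features and gauge rain rates. The feature names, the synthetic data, and the network size are all assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for polarimetric features: reflectivity Zh (dBZ) and
# differential reflectivity Zdr (dB), with gauge rain rate R (mm/h) as target.
rng = np.random.default_rng(5)
zh = rng.uniform(10, 55, 2000)
zdr = rng.uniform(0, 3, 2000)
rain = (10 ** (zh / 10) / 300) ** (1 / 1.4) * (1 + 0.1 * zdr) + rng.normal(0, 0.5, 2000)

X = np.column_stack([zh, zdr])
X_train, X_test, y_train, y_test = train_test_split(X, rain, random_state=0)

# A multi-layer perceptron stands in here for the "deep neural network" idea.
model = MLPRegressor(hidden_layer_sizes=(64, 64, 32), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```

    In an operational setting the held-out comparison would be made against an independent rain gauge network rather than a random split of the training data.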

  12. Using cell phone location to assess misclassification errors in air pollution exposure estimation.

    PubMed

    Yu, Haofei; Russell, Armistead; Mulholland, James; Huang, Zhijiong

    2018-02-01

    Air pollution epidemiologic and health impact studies often rely on home addresses to estimate individual subjects' pollution exposure. In this study, we used detailed cell phone location data, the call detail record (CDR), to account for the impact of spatiotemporal subject mobility on estimates of ambient air pollutant exposure. This approach was applied to a sample of 9886 unique SIM card IDs in Shenzhen, China, on one mid-week day in October 2013. Hourly ambient concentrations of six chosen pollutants were simulated by the Community Multi-scale Air Quality model fused with observational data, and matched with detailed location data for these IDs. The results were compared with exposure estimates using home addresses to assess potential exposure misclassification errors. We found that misclassification errors are likely to be substantial when home location alone is used. The CDR-based approach indicates that the home-based approach tends to over-estimate exposures for subjects with higher exposure levels and under-estimate exposures for those with lower exposure levels. Our results show that the cell phone location-based approach can be used to assess exposure misclassification error and has the potential to improve exposure estimates in air pollution epidemiology studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Use of near infrared spectroscopy for estimating meat chemical composition, quality traits and fatty acid content from cattle fed sunflower or flaxseed.

    PubMed

    Prieto, N; López-Campos, O; Aalhus, J L; Dugan, M E R; Juárez, M; Uttaro, B

    2014-10-01

    This study tested the ability of near infrared reflectance spectroscopy (NIRS) to predict meat chemical composition, quality traits and fatty acid (FA) composition from 63 steers fed sunflower or flaxseed in combination with high forage diets. NIRS calibrations, tested by cross-validation, were successful for predicting crude protein, moisture and fat content, with coefficients of determination (R2) (RMSECV, g per 100 g wet matter) of 0.85 (0.48), 0.90 (0.60) and 0.86 (1.08), respectively, but were not reliable for meat quality attributes. This technology accurately predicted saturated, monounsaturated and branched FA and conjugated linoleic acid content (R2: 0.83-0.97; RMSECV: 0.04-1.15 mg per g tissue) and might be suitable for screening purposes in meat based on the content of FAs beneficial to human health, such as rumenic and vaccenic acids. Further research applying NIRS to estimate meat quality attributes will require the on-line use of a fibre-optic probe on intact samples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Water Quality Variable Estimation using Partial Least Squares Regression and Multi-Scale Remote Sensing.

    NASA Astrophysics Data System (ADS)

    Peterson, K. T.; Wulamu, A.

    2017-12-01

    Water, essential to all living organisms, is one of the Earth's most precious resources. Remote sensing offers an ideal approach to monitoring water quality compared with traditional in-situ techniques, which are highly time- and resource-consuming. Using a multi-scale approach, data from handheld spectroscopy, UAS-based hyperspectral imagery, and satellite multispectral images were collected in coordination with in-situ water quality samples for two midwestern watersheds. The remote sensing data were modeled and correlated to the in-situ water quality variables, including chlorophyll content (Chl), turbidity, and total dissolved solids (TDS), using Normalized Difference Spectral Indices (NDSI) and Partial Least Squares Regression (PLSR). The results of the study supported the original hypothesis that correlating water quality variables with remotely sensed data benefits greatly from the use of more complex modeling and regression techniques such as PLSR. The final PLSR models produced much higher R2 values for all variables than NDSI. The combination of NDSI and PLSR analysis also identified key wavelengths that aligned with previous studies' findings. This research displays the advantages of, and the future for, complex modeling and machine learning techniques in improving water quality variable estimation from spectral data.
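
    A minimal sketch combining the two techniques named above: compute normalized difference spectral indices for all band pairs, then fit a PLSR model to a water quality variable. The wavelengths, number of components, and the synthetic spectra and turbidity values are placeholders, not the study's data.

```python
import numpy as np
from itertools import combinations
from sklearn.cross_decomposition import PLSRegression

def ndsi_features(reflectance):
    """All pairwise normalized difference spectral indices (b_i - b_j)/(b_i + b_j)."""
    n_bands = reflectance.shape[1]
    cols = []
    for i, j in combinations(range(n_bands), 2):
        cols.append((reflectance[:, i] - reflectance[:, j]) /
                    (reflectance[:, i] + reflectance[:, j] + 1e-9))
    return np.column_stack(cols)

# Synthetic reflectance spectra (20 bands) and an in-situ turbidity-like variable.
rng = np.random.default_rng(6)
spectra = rng.uniform(0.02, 0.4, size=(80, 20))
turbidity = 30 * spectra[:, 12] / spectra[:, 3] + rng.normal(0, 1.0, 80)

pls = PLSRegression(n_components=5)
pls.fit(ndsi_features(spectra), turbidity)
print("Calibration R^2:", round(pls.score(ndsi_features(spectra), turbidity), 3))
```

    In practice the model would be evaluated with held-out samples or cross-validation rather than the calibration R^2 shown here, and the PLSR loadings can be inspected to identify the most informative band pairs.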

  15. Automatic detection of retina disease: robustness to image quality and localization of anatomy structure.

    PubMed

    Karnowski, T P; Aykac, D; Giancardo, L; Li, Y; Nichols, T; Tobin, K W; Chaum, E

    2011-01-01

    The automated detection of diabetic retinopathy and other eye diseases in images of the retina has great promise as a low-cost method for broad-based screening. Many systems in the literature which perform automated detection include a quality estimation step and physiological feature detection, including the vascular tree and the optic nerve / macula location. In this work, we study the robustness of an automated disease detection method with respect to the accuracy of the optic nerve location and the quality of the images obtained as judged by a quality estimation algorithm. The detection algorithm features microaneurysm and exudate detection followed by feature extraction on the detected population to describe the overall retina image. Labeled images of retinas ground-truthed to disease states are used to train a supervised learning algorithm to identify the disease state of the retina image and exam set. Under the restrictions of high confidence optic nerve detections and good quality imagery, the system achieves a sensitivity and specificity of 94.8% and 78.7% with area-under-curve of 95.3%. Analysis of the effect of constraining quality and the distinction between mild non-proliferative diabetic retinopathy, normal retina images, and more severe disease states is included.

  16. Relationship between weight-related behavioral profiles and health outcomes by sexual orientation and gender.

    PubMed

    VanKim, Nicole A; Erickson, Darin J; Eisenberg, Marla E; Lust, Katherine; Rosser, B R Simon; Laska, Melissa N

    2016-07-01

    Examine relationships between weight-related factors and weight status, body dissatisfaction, chronic health conditions, and quality of life across sexual orientation and gender. Two- and four-year college students participated in the College Student Health Survey (n = 28,703; 2009-2013). Risk differences were calculated to estimate relationships between behavioral profiles and weight status, body satisfaction, diagnosis of a chronic condition, and quality of life, stratified by gender and sexual orientation. Four behavioral profiles, characterized as "healthier eating habits, more physically active," "healthier eating habits," "moderate eating habits," and "unhealthy weight control," were utilized based on latent class analyses, estimated from nine weight-related behavioral survey items. Sexual orientation differences in weight and quality of life were identified. For example, sexual minority groups reported significantly poorer quality of life than their heterosexual counterparts (females: 22.5%-38.6% (sexual minority) vs. 19.8% (heterosexual); males: 14.3%-26.7% (sexual minority) vs. 11.8% (heterosexual)). Compared with the "healthier eating habits, more physically active" profile, the "unhealthy weight control" profile was associated with obesity, poor body satisfaction, and poor quality of life in multiple gender/sexual orientation subgroups. Interventions are needed to address obesity, body dissatisfaction, and poor quality of life among sexual minority college students. © 2016 The Obesity Society.

  17. Towards an effective data peer review

    NASA Astrophysics Data System (ADS)

    Düsterhus, André; Hense, Andreas

    2014-05-01

    Peer review is an established procedure to ensure the quality of scientific publications and is currently used as a prerequisite for acceptance of papers in the scientific community. In recent years, the publication of raw data and its metadata has received increased attention, which led to the idea of bringing it up to the same standards that journals apply to traditional publications. One missing element to achieve this is a comparable peer review scheme. This contribution introduces the idea of a quality evaluation process designed to analyse the technical quality as well as the content of a dataset. It is based on quality tests, whose results are evaluated with the help of expert knowledge. The test results and the expert knowledge are evaluated probabilistically and statistically combined. As a result, the quality of a dataset is estimated with a single value. This approach allows the reviewer to quickly identify the potential weaknesses of a dataset and generate a transparent and comprehensible report. To demonstrate the scheme, an application to a large meteorological dataset will be shown. Furthermore, the potential and risks of such a scheme will be introduced and the practical implications of its possible introduction at data centres investigated. In particular, the effects of reducing the quality estimate of a dataset to a single number will be critically discussed.

  18. Color image lossy compression based on blind evaluation and prediction of noise characteristics

    NASA Astrophysics Data System (ADS)

    Ponomarenko, Nikolay N.; Lukin, Vladimir V.; Egiazarian, Karen O.; Lepisto, Leena

    2011-03-01

    The paper deals with JPEG adaptive lossy compression of color images formed by digital cameras. Adaptation to noise characteristics and blur estimated for each given image is carried out. The dominant factor degrading image quality is determined in a blind manner. Characteristics of this dominant factor are then estimated. Finally, a scaling factor that determines the quantization steps for the default JPEG table is adaptively selected. Within this general framework, two possible strategies are considered. The first presumes blind estimation on an image after all operations in the digital image processing chain, just before compressing a given raster image. The second strategy is based on predicting noise and blur parameters from analysis of the RAW image, under quite general assumptions concerning the parameters of the transformations the image will be subjected to at further processing stages. The advantages of both strategies are discussed. The first strategy provides more accurate estimation and a larger benefit in image compression ratio (CR) compared to super-high quality (SHQ) mode. However, it is more complicated and requires more resources. The second strategy is simpler but less beneficial. The proposed approaches are tested on a large number of real-life color images acquired by digital cameras and shown to provide more than a two-fold increase in average CR compared to SHQ mode without introducing visible distortions with respect to SHQ compressed images.

  19. Use of Electronic Documentation for Quality Improvement in Hospice

    PubMed Central

    Cagle, John G.; Rokoske, Franziska S.; Durham, Danielle; Schenck, Anna P.; Spence, Carol; Hanson, Laura C.

    2015-01-01

    Little evidence exists on the use of electronic documentation in hospice and its relationship to quality improvement practices. The purposes of this study were to: (1) estimate the prevalence of electronic documentation use in hospice; (2) identify organizational characteristics associated with use of electronic documentation; and (3) determine whether quality measurement practices differed based on documentation format (electronic vs. nonelectronic). Surveys concerning the use of electronic documentation for quality improvement practices and the monitoring of quality-related care and outcomes were collected from 653 hospices. Users of electronic documentation were able to monitor a wider range of quality-related data than users of nonelectronic documentation. Quality components such as advanced care planning, cultural needs, experience during care of the actively dying, and the number/types of care being delivered were more likely to be documented by users of electronic documentation. Use of electronic documentation may help hospices to monitor quality and compliance. PMID:22267819

  20. Web-based assessments of physical activity in youth: considerations for design and scale calibration.

    PubMed

    Saint-Maurice, Pedro F; Welk, Gregory J

    2014-12-01

    This paper describes the design and methods involved in calibrating a Web-based self-report instrument to estimate physical activity behavior. The limitations of self-report measures are well known, but calibration methods enable the reported information to be equated to estimates obtained from objective data. This paper summarizes design considerations for effective development and calibration of physical activity self-report measures. Each of the design considerations is put into context and followed by a practical application based on our ongoing calibration research with a promising online self-report tool called the Youth Activity Profile (YAP). We first describe the overall concept of calibration and how this influences the selection of appropriate self-report tools for this population. We point out the advantages and disadvantages of different monitoring devices since the choice of the criterion measure and the strategies used to minimize error in the measure can dramatically improve the quality of the data. We summarize strategies to ensure quality control in data collection and discuss analytical considerations involved in group- vs individual-level inference. For cross-validation procedures, we describe the advantages of equivalence testing procedures that directly test and quantify agreement. Lastly, we introduce the unique challenges encountered when transitioning from paper to a Web-based tool. The Web offers considerable potential for broad adoption but an iterative calibration approach focused on continued refinement is needed to ensure that estimates are generalizable across individuals, regions, seasons and countries.

  1. Hydrogeology and water quality of the Pepacton Reservoir Watershed in southeastern New York. Part 4. Quantity and quality of ground-water and tributary contributions to stream base flow in selected main-valley reaches

    USGS Publications Warehouse

    Heisig, Paul M.

    2004-01-01

    Estimates of the quantity and quality of ground-water discharge from valley-fill deposits were calculated for nine valley reaches within the Pepacton watershed in southeastern New York in July and August of 2001. Streamflow and water quality at the upstream and downstream end of each reach and at intervening tributaries were measured under base-flow conditions and used in mass-balance equations to determine quantity and quality of ground-water discharge. These measurements and estimates define the relative magnitudes of upland (tributary inflow) and valley-fill (ground-water discharge) contributions to the main-valley streams and provide a basis for understanding the effects of hydrogeologic setting on these contributions. Estimates of the water quality of ground-water discharge also provide an indication of the effects of road salt, manure, and human wastewater from villages on the water quality of streams that feed the Pepacton Reservoir. The most common contaminant in ground-water discharge was chloride from road salt; concentrations were less than 15 mg/L. Investigation of ground-water quality within a large watershed by measurement of stream base-flow quantity and quality followed by mass-balance calculations has benefits and drawbacks in comparison to direct ground-water sampling from wells. First, sampling streams is far less expensive than siting, installing, and sampling a watershed-wide network of wells. Second, base-flow samples represent composite samples of ground-water discharge from the most active part of the ground-water flow system across a drainage area, whereas a well network would only be representative of discrete points within local ground-water flow systems. Drawbacks to this method include limited reach selection because of unfavorable or unrepresentative hydrologic conditions, potential errors associated with a large number of streamflow and water-quality measurements, and limited ability to estimate concentrations of nonconservative constituents such as nutrients. The total gain in streamflow from the upper end to the lower end of each valley reach was positively correlated with the annual-runoff volume calculated for the drainage area of the reach. This correlation was not greatly affected by the proportions of ground-water and tributary contributions, except at two reaches that lost much of their tributary flow after the July survey. In these reaches, the gain in total streamflow showed a negative departure from this correlation. Calculated ground-water discharge exceeded the total tributary inflow in each valley reach in both surveys. Ground-water discharge, as a percentage of streamflow gain, was greatest among reaches in wide valleys (about 1,000-ft wide valley floors) that contain permeable valley fill because tributary flows were seasonally diminished or absent as a result of streambed infiltration. Tributary inflows, as a percentage of streamflow gain, were highest in reaches of narrow valleys (200-500-ft wide valley floors) with little valley fill and high annual runoff. Stream-water and ground-water quality were characterized by major-ion type as either (1) naturally occurring water types, relatively unaffected by road salt, or (2) road-salt-affected water types having elevated concentrations of chloride and sodium. The naturally occurring waters were typically the calcium-bicarbonate type, but some contained magnesium and (or) sulfate as secondary ions. Magnesium concentration in base flow is probably related to the amount of till and its carbonate content, or to the amount of lime used on cultivated fields within a drainage area. Sulfate was a defining ion only in dilute waters (with short or unreactive flow paths) with low concentrations of bicarbonate. Nearly all tributary waters were classified as naturally occurring water types. Ground-water discharge from nearly all valley reaches that contain State or county highways had elevated concentrations of chloride and sodium. The mean chloride concentrations of ground-water discharge, from 8 to 13 milligrams per liter, did not exceed Federal or State standards, but were about 5 times higher than naturally occurring levels. Application of road salt along a valley bottom probably affects only the shallow ground water in the area between a road and a stream. The elevated concentrations of chloride and sodium in the base-flow samples from such reaches indicate that the concentrations in the affected ground water were high enough to offset the low concentrations in all unaffected ground water entering the reach. Nutrient (nitrate and orthophosphate) concentrations in base-flow samples collected throughout the valley-reach network could not generally be used to estimate their concentrations in ground-water discharge because these constituents can be removed from water through biological uptake or transformation, or by adsorption on sediments. Base-flow samples from streams with upgradient manure sources or villages served by septic systems consistently had the highest concentrations of these nutrients.
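
    The mass-balance approach described above can be illustrated with a small sketch: given base-flow discharge and chloride concentration at the upstream and downstream ends of a reach and at intervening tributaries, the ground-water contribution and its concentration follow from conservation of water and of the (assumed conservative) constituent. The numbers below are hypothetical, not values from the report.

    ```python
    # Illustrative sketch of the base-flow mass-balance approach (hypothetical numbers,
    # not values from the Pepacton study). Assumes chloride behaves conservatively.

    def groundwater_discharge(q_down, q_up, q_tribs):
        """Ground-water discharge to the reach = downstream flow minus all inflows (cfs)."""
        return q_down - q_up - sum(q_tribs)

    def groundwater_concentration(q_down, c_down, q_up, c_up, q_tribs, c_tribs, q_gw):
        """Concentration in ground-water discharge from a constituent mass balance (mg/L)."""
        load_down = q_down * c_down
        load_in = q_up * c_up + sum(q * c for q, c in zip(q_tribs, c_tribs))
        return (load_down - load_in) / q_gw

    q_up, c_up = 4.0, 6.0                         # upstream base flow (cfs) and chloride (mg/L)
    q_down, c_down = 9.0, 10.0                    # downstream base flow and chloride
    q_tribs, c_tribs = [1.5, 0.5], [3.0, 4.0]     # tributary inflows and their chloride

    q_gw = groundwater_discharge(q_down, q_up, q_tribs)
    c_gw = groundwater_concentration(q_down, c_down, q_up, c_up, q_tribs, c_tribs, q_gw)
    print(f"ground-water discharge: {q_gw:.2f} cfs, chloride: {c_gw:.1f} mg/L")
    ```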

  2. The management of patients with T1 adenocarcinoma of the low rectum: a decision analysis.

    PubMed

    Johnston, Calvin F; Tomlinson, George; Temple, Larissa K; Baxter, Nancy N

    2013-04-01

    Decision making for patients with T1 adenocarcinoma of the low rectum, when treatment options are limited to a transanal local excision or abdominoperineal resection, is challenging. The aim of this study was to develop a contemporary decision analysis to assist patients and clinicians in balancing the goals of maximizing life expectancy and quality of life in this situation. We constructed a Markov-type microsimulation in open-source software. Recurrence rates and quality-of-life parameters were elicited by systematic literature reviews. Sensitivity analyses were performed on key model parameters. Our base case for analysis was a 65-year-old man with low-lying T1N0 rectal cancer. We determined the sensitivity of our model for sex, age up to 80, and T stage. The main outcome measured was quality-adjusted life-years. In the base case, selecting transanal local excision over abdominoperineal resection resulted in a loss of 0.53 years of life expectancy but a gain of 0.97 quality-adjusted life-years. One-way sensitivity analysis demonstrated a health state utility value threshold for permanent colostomy of 0.93. This value ranged from 0.88 to 1.0 based on tumor recurrence risk. There were no other model sensitivities. Some model parameter estimates were based on weak data. In our model, transanal local excision was found to be the preferable approach for most patients. An abdominoperineal resection has a 3.5% longer life expectancy, but this advantage is lost when the quality-of-life reduction reported by stoma patients is weighed in. The minority group in whom abdominoperineal resection is preferred are those who are unwilling to sacrifice 7% of their life expectancy to avoid a permanent stoma. This is estimated to be approximately 25% of all patients. The threshold increases to 12% of life expectancy in high-risk tumors. No other factors are found to be relevant to the decision.
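
    The trade-off driving the model can be reproduced with simple arithmetic: local excision gives up some life expectancy but avoids the utility decrement of a permanent colostomy, and the preferred strategy flips at a threshold stoma utility. The sketch below is an undiscounted, single-state simplification of my own; only the 0.53 life-year loss and the 3.5% survival difference are taken from the abstract, and the baseline life expectancy is back-calculated from them.

    ```python
    # Minimal undiscounted QALY trade-off sketch (my simplification, not the published
    # microsimulation). Life expectancy is chosen so that 0.53 years is ~3.5% of it.

    life_exp_apr = 0.53 / 0.035               # ~15.1 years with abdominoperineal resection (APR)
    life_exp_tae = life_exp_apr - 0.53        # transanal excision loses ~0.53 life-years

    def qalys(life_years, utility):
        return life_years * utility

    for stoma_utility in (0.85, 0.93, 0.97):
        apr = qalys(life_exp_apr, stoma_utility)   # APR: longer life, permanent stoma
        tae = qalys(life_exp_tae, 1.0)             # local excision: shorter life, utility ~1 assumed
        better = "local excision" if tae > apr else "APR"
        print(f"stoma utility {stoma_utility:.2f}: APR {apr:.2f} vs TAE {tae:.2f} QALYs -> {better}")
    ```

    In this flat simplification the indifference point is simply the survival ratio (about 0.96); the published threshold of 0.93 reflects the full microsimulation, which also models recurrence and discounting.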

  3. Assessment of visual landscape quality using IKONOS imagery.

    PubMed

    Ozkan, Ulas Yunus

    2014-07-01

    The assessment of visual landscape quality is of importance to the management of urban woodlands. Satellite remote sensing may be used for this purpose as a substitute for traditional survey techniques that are both labour-intensive and time-consuming. This study examines the association between the quality of the perceived visual landscape in urban woodlands and texture measures extracted from IKONOS satellite data, which features 4-m spatial resolution and four spectral bands. The study was conducted in the woodlands of Istanbul (the most important element of the urban mosaic) lying along both shores of the Bosporus Strait. The visual quality assessment applied in this study is based on the perceptual approach and was performed via a survey of expressed preferences. For this purpose, representative photographs of real scenery were used to elicit observers' preferences. A slide show comprising 33 images was presented to a group of 153 volunteers (all undergraduate students), and they were asked to rate the visual quality of each on a 10-point scale (1 for very low visual quality, 10 for very high). Average visual quality scores were calculated for each landscape. Texture measures were acquired using two methods: pixel-based and object-based. Pixel-based texture measures were extracted from the first principal component (PC1) image. Object-based texture measures were extracted by using the original four bands. The association between image texture measures and perceived visual landscape quality was tested via Pearson's correlation coefficient. The analysis found a strong linear association between image texture measures and visual quality. The highest correlation coefficient was calculated between the standard deviation of gray levels (SDGL), one of the pixel-based texture measures, and visual quality (r = 0.82, P < 0.05). The results showed that perceived visual quality of urban woodland landscapes can be estimated by using texture measures extracted from satellite data in combination with appropriate modelling techniques.
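
    A pixel-based texture measure such as the standard deviation of gray levels can be computed directly from an image band and correlated with mean preference scores. The sketch below is a generic illustration of that workflow on synthetic arrays, not the study's actual processing chain.

    ```python
    # Generic sketch: standard deviation of gray levels (SDGL) per scene and its
    # Pearson correlation with mean visual-quality ratings. All arrays are synthetic.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)

    def sdgl(image):
        """Standard deviation of gray levels over the scene (a simple texture measure)."""
        return float(np.std(image))

    # Synthetic stand-ins for the PC1 image of each scene and the survey ratings (1-10).
    scenes = [rng.normal(loc=100, scale=5 + 2 * i, size=(64, 64)) for i in range(33)]
    ratings = np.array([3 + 0.15 * i + rng.normal(0, 0.4) for i in range(33)])

    texture = np.array([sdgl(img) for img in scenes])
    r, p = pearsonr(texture, ratings)
    print(f"Pearson r = {r:.2f} (p = {p:.3g})")
    ```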

  4. Determining Level of Service for Multilane Median Opening Zone

    NASA Astrophysics Data System (ADS)

    Ali, Paydar; Johnnie, Ben-Edigbe

    2017-08-01

    The road system is a capital-intensive investment, requiring a thorough schematic framework and funding. Roads are built to provide an intrinsic quality of service which satisfies the road users. Roads that provide good services are expected to deliver operational performance that is consistent with their design specifications. Level of service and cumulative percentile speed distribution methods have been used in previous studies to estimate the quality of multilane highway service. Whilst the level of service approach relies on the speed/flow curve, the cumulative percentile speed distribution is based solely on speed. These estimation methods were used in studies carried out in Johor, Malaysia. The aim of the studies was to ascertain the extent of speed reduction caused by midblock U-turn facilities as well as to verify which estimation method is more reliable. At selected sites, road segments for both directional flows were divided into free-flow and midblock zones. Traffic volume, speed and vehicle type data for each zone were collected continuously for six weeks. Both estimation methods confirmed that speed reduction would be caused by midblock U-turn facilities. However, the level of service method suggested that the quality of service would improve from level F to E or D at the midblock zone in spite of the speed reduction. Level of service responded to the traffic volume reduction at the midblock U-turn facility, not to the travel speed reduction. The studies concluded that since level of service was more responsive to traffic volume reduction than travel speed, it cannot be solely relied upon when assessing the quality of multilane highway service.
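
    The cumulative percentile speed approach can be sketched as computing percentile speeds (for example the 85th percentile) from spot-speed observations in the free-flow and midblock zones and comparing them. This is a generic illustration on synthetic speeds, not the study's exact procedure.

    ```python
    # Generic sketch of the cumulative-percentile-speed comparison between zones.
    import numpy as np

    rng = np.random.default_rng(1)
    free_flow_speeds = rng.normal(90, 8, 500)   # km/h, synthetic spot speeds upstream
    midblock_speeds = rng.normal(78, 9, 500)    # km/h, synthetic spot speeds at the U-turn zone

    for label, speeds in [("free-flow", free_flow_speeds), ("midblock", midblock_speeds)]:
        p50, p85 = np.percentile(speeds, [50, 85])
        print(f"{label}: median {p50:.1f} km/h, 85th percentile {p85:.1f} km/h")

    drop = np.percentile(free_flow_speeds, 85) - np.percentile(midblock_speeds, 85)
    print(f"85th-percentile speed reduction at midblock: {drop:.1f} km/h")
    ```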

  5. Paying physician group practices for quality: A statewide quasi-experiment.

    PubMed

    Conrad, Douglas A; Grembowski, David; Perry, Lisa; Maynard, Charles; Rodriguez, Hector; Martin, Diane

    2013-12-01

    This article presents the results of a unique quasi-experiment of the effects of a large-scale pay-for-performance (P4P) program implemented by a leading health insurer in Washington state during 2001-2007. The authors received external funding to provide an objective impact evaluation of the program. The program was unique in several respects: (1) It was designed dynamically, with two discrete intervention periods-one in which payment incentives were based on relative performance (the "contest" period) and a second in which payment incentives were based on absolute performance compared to achievable benchmarks. (2) The program was designed in collaboration with large multispecialty group practices, with an explicit run-in period to test the quality metrics. Public reporting of the quality scorecard for all participating medical groups was introduced 1 year before the quality incentive payment program's inception, and continued throughout 2002-2007. (3) The program was implemented in stages with distinct medical groups. A control group of comparable group practices also was assembled, and difference-in-differences methodology was applied to estimate program effects. Case mix measures were included in all multivariate analyses. The regression design permitted a contrast of intervention effects between the "contest" approach in the sub-period of 2003-2004 and the absolute standard, "achievable benchmarks of care" approach in sub-period 2005-2007. Most of the statistically significant quality incentive program coefficients were small and negative (opposite to program intent). A consistent pattern of differential intervention impact in the sub-periods did not emerge. Cumulatively, the probit regression estimates indicate that neither the quality scorecard nor the quality incentive payment program had a significant positive effect on general clinical quality. Based on key informant interviews with medical leaders, practicing physicians, and administrators of the participating groups, the authors conclude that several factors likely combined to dampen program effects: (1) modest size of the incentive; (2) use of rewards only, rather than a balance of rewards and penalties; (3) targeting incentive payments to the group, thus potentially weakening incentive effects at the individual level. Copyright © 2013 Elsevier Inc. All rights reserved.
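
    The difference-in-differences contrast estimated here can be written, in simplified form, as a regression of the quality score on group, period, and their interaction. The sketch below shows that structure with statsmodels on synthetic data; the actual study used probit models with case-mix adjustment.

    ```python
    # Simplified difference-in-differences sketch (linear model on synthetic data);
    # the study itself used probit regressions with case-mix covariates.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 2000
    df = pd.DataFrame({
        "p4p_group": rng.integers(0, 2, n),   # 1 = practice exposed to the incentive program
        "post": rng.integers(0, 2, n),        # 1 = observation after program start
    })
    # Synthetic quality score with essentially no program effect, echoing the findings.
    df["quality"] = (60 + 2 * df["p4p_group"] + 3 * df["post"]
                     + 0.1 * df["p4p_group"] * df["post"] + rng.normal(0, 10, n))

    model = smf.ols("quality ~ p4p_group * post", data=df).fit()
    print(model.params["p4p_group:post"])     # the difference-in-differences estimate
    ```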

  6. Optimizing chronic disease management mega-analysis: economic evaluation.

    PubMed

    2013-01-01

    As Ontario's population ages, chronic diseases are becoming increasingly common. There is growing interest in services and care models designed to optimize the management of chronic disease. To evaluate the cost-effectiveness and expected budget impact of interventions in chronic disease cohorts evaluated as part of the Optimizing Chronic Disease Management mega-analysis. Sector-specific costs, disease incidence, and mortality were calculated for each condition using administrative databases from the Institute for Clinical Evaluative Sciences. Intervention outcomes were based on literature identified in the evidence-based analyses. Quality-of-life and disease prevalence data were obtained from the literature. Analyses were restricted to interventions that showed significant benefit for resource use or mortality from the evidence-based analyses. An Ontario cohort of patients with each chronic disease was constructed and followed over 5 years (2006-2011). A phase-based approach was used to estimate costs across all sectors of the health care system. Utility values identified in the literature and effect estimates for resource use and mortality obtained from the evidence-based analyses were applied to calculate incremental costs and quality-adjusted life-years (QALYs). Given uncertainty about how many patients would benefit from each intervention, a system-wide budget impact was not determined. Instead, the difference in lifetime cost between an individual-administered intervention and no intervention was presented. Of 70 potential cost-effectiveness analyses, 8 met our inclusion criteria. All were found to result in QALY gains and cost savings compared with usual care. The models were robust to the majority of sensitivity analyses undertaken, but due to structural limitations and time constraints, few sensitivity analyses were conducted. Incremental cost savings per patient who received an intervention ranged from $15 per diabetic patient with specialized nursing to $10,665 per patient with congestive heart failure receiving in-home care. Evidence used to inform estimates of effect was often limited to a single trial with limited generalizability across populations, interventions, and health care systems. Because of the low clinical fidelity of health administrative data sets, intermediate clinical outcomes could not be included. Cohort costs included an average of all health care costs and were not restricted to costs associated with the disease. Intervention costs were based on resource use specified in clinical trials. Applying estimates of effect from the evidence-based analyses to real-world resource use resulted in cost savings for all interventions. On the basis of quality-of-life data identified in the literature, all interventions were found to result in a greater QALY gain than usual care would. Implementation of all interventions could offer significant cost reductions. However, this analysis was subject to important limitations. Chronic diseases are the leading cause of death and disability in Ontario. They account for a third of direct health care costs across the province. This study aims to evaluate the cost-effectiveness of health care interventions that might improve the management of chronic diseases. The evaluated interventions led to lower costs and better quality of life than usual care. Offering these options could reduce costs per patient. However, the studies used in this analysis were of medium to very low quality, and the methods had many limitations.

  7. Can superior natural amenities create high-quality employment opportunities? The case of nonconsumptive river recreation in central Idaho

    USGS Publications Warehouse

    McKean, J.R.; Johnson, D.M.; Johnson, Richard L.; Taylor, R.G.

    2005-01-01

    Central Idaho has superior environmental amenities, as evidenced by exceptionally high-value tourism, such as guided whitewater rafting. The focus of our study concerns the attainment of high-quality jobs in a high-quality natural environment. We estimate cumulative wage rate effects unique to nonconsumptive river recreation in central Idaho for comparison with other sectors. The cumulative effects are based on a detailed survey of recreation spending and a modified synthesized input–output model. Cumulative wage rate effects support using the abundance of environmental amenities to expand and attract high-wage, environmentally sensitive firms, as opposed to expanded tourism to improve employment quality.

  8. Estimating Lightning NOx Emissions for Regional Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Holloway, T.; Scotty, E.; Harkey, M.

    2014-12-01

    Lightning emissions have long been recognized as an important source of nitrogen oxides (NOx) on a global scale, and an essential emission component for global atmospheric chemistry models. However, only in recent years have regional air quality models incorporated lightning NOx emissions into simulations. The growth in regional modeling of lightning emissions has been driven in part by comparisons with satellite-derived estimates of column NO2, especially from the Ozone Monitoring Instrument (OMI) aboard the Aura satellite. We present and evaluate a lightning inventory for the EPA Community Multiscale Air Quality (CMAQ) model. Our approach follows Koo et al. [2010] in spatially and temporally allocating a prescribed emission total based on cloud-top height and convective precipitation. However, we consider alternate total NOx emission values (which translate into alternate lightning emission factors) based on a review of the literature and performance evaluation against OMI NO2 for July 2007 conditions over the U.S. and parts of Canada and Mexico. The vertical distribution of lightning emissions follows a bimodal distribution from Allen et al. [2012] calculated over 27 vertical model layers. Total lightning NO emissions for July 2007 show the highest above-land emissions in Florida, southeastern Texas and southern Louisiana. Although agreement with OMI NO2 across the domain varied significantly depending on lightning NOx assumptions, agreement among the simulations at ground-based NO2 monitors from the EPA Air Quality System database showed no meaningful sensitivity to lightning NOx. Emissions are compared with prior studies, which find similar distribution patterns, but a wide range of calculated magnitudes.
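
    The allocation step can be sketched as distributing a fixed domain-wide NOx total across grid cells in proportion to a flash proxy built from cloud-top height and convective precipitation, then spreading each cell's column total over model layers with a prescribed vertical profile. The proxy form and the two-peak profile below are placeholders of my own, not the Koo et al. or Allen et al. formulations.

    ```python
    # Sketch of allocating a prescribed lightning-NOx total to grid cells and layers.
    # The proxy (cloud-top height x convective precipitation) and the two-peak vertical
    # profile are illustrative placeholders, not the published parameterizations.
    import numpy as np

    rng = np.random.default_rng(3)
    ny, nx, nz = 10, 12, 27
    total_nox = 1.0e5                               # domain-wide lightning NOx for the hour (arbitrary units)

    cloud_top = rng.uniform(2, 15, (ny, nx))        # km
    conv_precip = rng.uniform(0, 5, (ny, nx))       # mm/h

    proxy = cloud_top * conv_precip                 # simple flash-activity proxy
    weights = proxy / proxy.sum()                   # horizontal allocation weights
    column_emis = total_nox * weights               # per-cell column totals

    # Two-peak vertical profile over 27 layers (sum normalized to 1).
    z = np.arange(nz)
    profile = np.exp(-0.5 * ((z - 6) / 2.0) ** 2) + 0.7 * np.exp(-0.5 * ((z - 18) / 3.0) ** 2)
    profile /= profile.sum()

    emissions = column_emis[:, :, None] * profile[None, None, :]   # shape (ny, nx, nz)
    print(emissions.sum(), "should equal", total_nox)
    ```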

  9. MODELING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN

    EPA Science Inventory

    To help address the aggregate exposure assessment needs of the Food Quality Protection Act, a physically-based probabilistic model (SHEDS-Pesticides, version 3) has been applied to estimate aggregate chlorpyrifos exposure and dose to children. Two age groups (0-4, 5-9 years) a...

  10. School-based peer-related social competence interventions for children with autism spectrum disorder: a meta-analysis and descriptive review of single case research design studies.

    PubMed

    Whalon, Kelly J; Conroy, Maureen A; Martinez, Jose R; Werch, Brittany L

    2015-06-01

    The purpose of this review was to critically examine and summarize the impact of school-based interventions designed to facilitate the peer-related social competence of children with autism spectrum disorder (ASD). Reviewed studies employed a single-case experimental design, targeted peer-related social competence, included children 3-12 years old with an ASD, and took place in school settings. Articles were analyzed descriptively and using the evaluative method to determine study quality. Additionally, effect size estimates were calculated using nonoverlap of all pairs method and Tau-U. A total of 37 studies including 105 children were reviewed. Overall, ES estimates ranged from weak to strong, but on average, the reviewed interventions produced a moderate to strong effect, and quality ratings were generally in the acceptable to high range. Findings suggest that children with ASD can benefit from social skill interventions implemented with peers in school settings.
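
    Nonoverlap of all pairs (NAP), one of the effect size metrics used here, is the proportion of all baseline-intervention data pairs in which the intervention observation exceeds the baseline one, with ties counted as half. A small sketch on made-up single-case data:

    ```python
    # Nonoverlap of All Pairs (NAP) for one single-case A/B comparison (synthetic data).
    def nap(baseline, intervention):
        pairs = [(a, b) for a in baseline for b in intervention]
        wins = sum(1.0 for a, b in pairs if b > a)
        ties = sum(0.5 for a, b in pairs if b == a)
        return (wins + ties) / len(pairs)

    baseline = [2, 3, 1, 2, 2]            # e.g., peer social initiations per baseline session
    intervention = [4, 5, 3, 6, 5, 4]     # sessions after the peer-mediated intervention
    print(f"NAP = {nap(baseline, intervention):.2f}")   # 1.0 would indicate complete nonoverlap
    ```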

  11. Cross-validation of two liquid water path retrieval algorithms applied to ground-based microwave radiation measurements by RPG-HATPRO instrument

    NASA Astrophysics Data System (ADS)

    Kostsov, Vladimir; Ionov, Dmitry; Biryukov, Egor; Zaitsev, Nikita

    2017-04-01

    A built-in operational regression algorithm (REA) of liquid water path (LWP) retrieval supplied by the manufacturer of the RPG-HATPRO microwave radiometer has been compared to a so-called physical algorithm (PHA) based on the inversion of the radiative transfer equation. The comparison has been performed for different scenarios of microwave observations by the RPG-HATPRO instrument that has been operating at St. Petersburg University since June 2012. The data for the scenarios were collected within the period December 2012 to December 2014. Estimates of bias and random error for both REA and PHA have been obtained. Special attention has been paid to the quality of the LWP retrievals during and after rain events detected by the built-in rain sensor. The time period after a rain event during which the retrieval quality must be considered insufficient has also been estimated.
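
    The bias and random error of one retrieval relative to the other can be characterized by the mean and standard deviation of the paired LWP differences, optionally excluding a window after rain flags. The sketch below uses synthetic retrieval pairs and an assumed 10-sample drying window, not the study's data or thresholds.

    ```python
    # Sketch: bias and random error between two LWP retrievals (synthetic values, g/m^2),
    # with an assumed exclusion window after rain-flagged samples.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 1000
    lwp_pha = rng.gamma(2.0, 40.0, n)                    # "physical algorithm" retrieval
    lwp_rea = lwp_pha + 5.0 + rng.normal(0, 15.0, n)     # "regression algorithm": offset + scatter
    rain_flag = rng.random(n) < 0.05                     # built-in rain sensor flag

    def bias_and_random_error(a, b, mask):
        diff = (a - b)[mask]
        return diff.mean(), diff.std(ddof=1)

    # Exclude samples within 10 time steps after any rain flag (assumed drying period).
    after_rain = np.convolve(rain_flag, np.ones(10), mode="full")[:n] > 0
    valid = ~after_rain

    bias, rand_err = bias_and_random_error(lwp_rea, lwp_pha, valid)
    print(f"REA vs PHA: bias {bias:.1f} g/m^2, random error {rand_err:.1f} g/m^2")
    ```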

  12. Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET

    NASA Astrophysics Data System (ADS)

    Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; Contributors, JET

    2018-05-01

    Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), is introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy ‘from scratch’ has also been devised, which preserves the performance even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed, both for the graphite wall and the ITER-Like Wall. Performance significantly better than any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to the training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of the predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to integrate these classifiers into general decision support and control systems.
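
    A probabilistic SVM output of the kind described can be obtained, for example, with scikit-learn's SVC trained with probability calibration enabled. The sketch below is a generic illustration on synthetic features, not the adaptive JET predictor itself; the feature names and the alarm threshold are assumptions.

    ```python
    # Generic probabilistic-SVM sketch (scikit-learn), illustrating how a classifier can
    # return a disruption probability rather than a hard label. Synthetic data only.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n = 600
    X = rng.normal(size=(n, 4))        # stand-ins for diagnostic signals (e.g., locked-mode amplitude)
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n) > 0.8).astype(int)   # 1 = "disruptive"

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)   # Platt-style probability calibration

    proba = clf.predict_proba(X_te)[:, 1]    # probability of the disruptive class
    alarm = proba > 0.9                      # an alarm threshold a control system might use
    print(f"mean predicted disruption probability: {proba.mean():.2f}, alarms raised: {alarm.sum()}")
    ```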

  13. The Effect of Nurse Practitioner Co-Management on the Care of Geriatric Conditions

    PubMed Central

    Reuben, David B.; Ganz, David A.; Roth, Carol P.; McCreath, Heather E.; Ramirez, Karina D.; Wenger, Neil S.

    2013-01-01

    Background/Objectives The quality of care for geriatric conditions remains poor. The Assessing Care of Vulnerable Elders (ACOVE)-2 model (case finding, delegation of data collection, structured visit notes, physician and patient education, and linkage to community resources) improves the quality of care for geriatric conditions when implemented by primary care physicians (PCPs) or by nurse practitioners (NPs) co-managing care with an academic geriatrician. However, it is unclear whether community-based PCP-NP co-management can achieve similar results. Design Case study. Setting Two community-based primary care practices. Participants Patients > 75 years who screened positive for at least one condition: falls, urinary incontinence (UI), dementia, and depression. Intervention The ACOVE-2 model augmented by NP co-management of conditions. Measurements Quality of care by medical record review using ACOVE-3 quality indicators (QIs). Patients receiving co-management were compared with those who received PCP care alone in the same practices. Results Of 1084 screened patients, 658 (61%) screened positive for > 1 condition; 485 of these patients were randomly selected for chart review and triggered a mean of 7 QIs. A NP saw approximately half (49%) for co-management. Overall, patients received 57% of recommended care. Quality scores for all conditions (falls: 80% versus 34%; UI: 66% versus 19%; dementia: 59% versus 38%) except depression (63% versus 60%) were higher for patients seen by a NP. In analyses adjusted for gender, age of patient, number of conditions, site, and a NP estimate of medical management style, NP co-management remained significantly associated with receiving recommended care (p<0.001), as did the NP estimate of medical management style (p=0.02). Conclusion Compared to usual care using the ACOVE-2 model, NP co-management is associated with better quality of care for geriatric conditions in community-based primary care. PMID:23772723

  14. Heritability estimates on resting state fMRI data using ENIGMA analysis pipeline.

    PubMed

    Adhikari, Bhim M; Jahanshad, Neda; Shukla, Dinesh; Glahn, David C; Blangero, John; Reynolds, Richard C; Cox, Robert W; Fieremans, Els; Veraart, Jelle; Novikov, Dmitry S; Nichols, Thomas E; Hong, L Elliot; Thompson, Paul M; Kochunov, Peter

    2018-01-01

    Big data initiatives such as the Enhancing NeuroImaging Genetics through Meta-Analysis consortium (ENIGMA) combine data collected by independent studies worldwide to achieve more generalizable estimates of effect sizes and more reliable and reproducible outcomes. Such efforts require harmonized image analysis protocols to extract phenotypes consistently. This harmonization is particularly challenging for resting state fMRI due to the wide variability of acquisition protocols and scanner platforms; this leads to site-to-site variance in quality, resolution and temporal signal-to-noise ratio (tSNR). An effective harmonization should provide optimal measures for data of different qualities. We developed a multi-site rsfMRI analysis pipeline to allow research groups around the world to process rsfMRI scans in a harmonized way, to extract consistent and quantitative measurements of connectivity and to perform coordinated statistical tests. We used the single-modality ENIGMA rsfMRI preprocessing pipeline based on model-free Marchenko-Pastur PCA-based denoising to verify and replicate resting state network heritability estimates. We analyzed two independent cohorts, GOBS (Genetics of Brain Structure) and HCP (the Human Connectome Project), which collected data using conventional and connectomics-oriented fMRI protocols, respectively. We used seed-based connectivity and dual-regression approaches to show that the rsfMRI signal is consistently heritable across twenty major functional network measures. Heritability values of 20-40% were observed across both cohorts.

  15. Estimation of Fine and Oversize Particle Ratio in a Heterogeneous Compound with Acoustic Emissions.

    PubMed

    Nsugbe, Ejay; Ruiz-Carcel, Cristobal; Starr, Andrew; Jennions, Ian

    2018-03-13

    The final phase of powder production typically involves a mixing process where all of the particles are combined and agglomerated with a binder to form a single compound. The traditional means of inspecting the physical properties of the final product involves an inspection of the particle sizes using an offline sieving and weighing process. The main downside of this technique, in addition to being an offline-only measurement procedure, is its inability to characterise large agglomerates of powders due to sieve blockage. This work assesses the feasibility of a real-time monitoring approach using a benchtop test rig and a prototype acoustic-based measurement approach to provide information that can be correlated to product quality and provide the opportunity for future process optimisation. Acoustic emission (AE) was chosen as the sensing method due to its low cost, simple setup process, and ease of implementation. The performance of the proposed method was assessed in a series of experiments where the offline quality check results were compared to the AE-based real-time estimations using data acquired from a benchtop powder free flow rig. A designed time domain based signal processing method was used to extract particle size information from the acquired AE signal and the results show that this technique is capable of estimating the required ratio in the washing powder compound with an average absolute error of 6%.
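
    A time-domain feature extraction of the sort described can be sketched as computing summary statistics (RMS, peak, crest factor) over windows of the AE signal and calibrating them against the fines ratio known from offline sieving. Everything below, including the simulated signals and the linear calibration, is an illustrative assumption rather than the designed method.

    ```python
    # Sketch: time-domain AE features regressed against a known fine-particle ratio.
    # Signal, features, and the linear relationship are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(6)

    def time_domain_features(signal):
        rms = np.sqrt(np.mean(signal ** 2))
        peak = np.max(np.abs(signal))
        crest = peak / rms
        return np.array([rms, peak, crest])

    # Simulate AE windows whose energy loosely tracks the fines ratio.
    true_ratio = rng.uniform(0.1, 0.6, 40)                       # fraction of fine particles
    windows = [r * rng.normal(0, 1, 5000) + 0.05 * rng.normal(0, 1, 5000) for r in true_ratio]
    X = np.array([time_domain_features(w) for w in windows])
    X = np.column_stack([X, np.ones(len(X))])                    # add intercept term

    coef, *_ = np.linalg.lstsq(X, true_ratio, rcond=None)        # least-squares calibration
    pred = X @ coef
    print(f"mean absolute error: {np.mean(np.abs(pred - true_ratio)):.3f}")
    ```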

  16. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measure of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach. © 2016 John Wiley & Sons Ltd.

  17. Multivariate Meta-Analysis of Preference-Based Quality of Life Values in Coronary Heart Disease.

    PubMed

    Stevanović, Jelena; Pechlivanoglou, Petros; Kampinga, Marthe A; Krabbe, Paul F M; Postma, Maarten J

    2016-01-01

    There are numerous health-related quality of life (HRQoL) measurements used in coronary heart disease (CHD) in the literature. However, only values assessed with preference-based instruments can be directly applied in a cost-utility analysis (CUA). To summarize and synthesize instrument-specific preference-based values in CHD and the underlying disease subgroups, stable angina and post-acute coronary syndrome (post-ACS), for developed countries, while accounting for study-level characteristics and within- and between-study correlation. A systematic review was conducted to identify studies reporting preference-based values in CHD. A multivariate meta-analysis was applied to synthesize the HRQoL values. Meta-regression analyses examined the effect of the study-level covariates age, publication year, prevalence of diabetes, and gender. A total of 40 studies providing preference-based values were detected. Synthesized estimates of HRQoL in post-ACS ranged from 0.64 (Quality of Well-Being) to 0.92 (EuroQol European "tariff"), while in stable angina they ranged from 0.64 (Short Form 6D) to 0.89 (Standard Gamble). Similar findings were observed in estimates applying to general CHD. No significant improvement in model fit was found after adjusting for study-level covariates. Large between-study heterogeneity was observed in all the models investigated. The main finding of our study is the presence of large heterogeneity both within and between instrument-specific HRQoL values. Current economic models in CHD ignore this between-study heterogeneity. Multivariate meta-analysis can quantify this heterogeneity and offers the means for uncertainty around HRQoL values to be translated to uncertainty in CUAs.

  18. Physical Activity Interventions in Faith-Based Organizations: A Systematic Review.

    PubMed

    Tristão Parra, Maíra; Porfírio, Gustavo J M; Arredondo, Elva M; Atallah, Álvaro N

    2018-03-01

    To review and assess the effectiveness of physical activity interventions delivered in faith-based organizations. We searched the Cochrane Library, DoPHER, EMBASE, LILACS, MEDLINE, PsycINFO, WHO ICTRP, and Clinicaltrials.gov databases until January 2016, without restriction of language or publication date. Randomized and nonrandomized controlled trials investigating physical activity interventions for adults delivered in faith-based organizations. Two independent reviewers extracted data and assessed study methodological quality. We used relative risk and mean difference with 95% confidence interval to estimate the effect of the interventions on measures of physical activity, physical fitness, and health. The review included 18 studies. Study participants were predominantly female, and the majority of trials were conducted in the United States. Study heterogeneity did not allow us to conduct meta-analyses. Although interventions delivered in faith-based organizations increased physical activity and positively influenced measures of health and fitness in participants, the quality of the evidence was very low. Faith-based organizations are promising settings to promote physical activity, consequently addressing health disparities. However, high-quality randomized clinical trials are needed to adequately assess the effectiveness of interventions delivered in faith-based organizations.

  19. A design space exploration for control of Critical Quality Attributes of mAb.

    PubMed

    Bhatia, Hemlata; Read, Erik; Agarabi, Cyrus; Brorson, Kurt; Lute, Scott; Yoon, Seongkyu

    2016-10-15

    A unique "design space (DSp) exploration strategy," defined as a function of four key scenarios, was successfully integrated and validated to enhance the DSp building exercise by increasing the accuracy of analyses and the interpretation of processed data. The four key scenarios defining the strategy were based on cumulative analyses of individual models developed for the Critical Quality Attributes (23 Glycan Profiles) considered for the study. The analyses of the CQA estimates and model performances were interpreted as (1) Inside Specification/Significant Model, (2) Inside Specification/Non-significant Model, (3) Outside Specification/Significant Model, and (4) Outside Specification/Non-significant Model. Each scenario was defined and illustrated through individual CQA models matching that description. The R², Q², Model Validity and Model Reproducibility estimates of G2, G2FaGbGN, G0 and G2FaG2, respectively, signified the four scenarios stated above. Through further optimizations, including the estimation of Edge of Failure and Set Point Analysis, wider and more accurate DSps were created for each scenario, establishing a critical functional relationship between Critical Process Parameters (CPPs) and Critical Quality Attributes (CQAs). A DSp provides the optimal region for systematic evaluation, mechanistic understanding and refining of a QbD approach. The DSp exploration strategy will aid the critical process of consistently and reproducibly achieving predefined quality of a product throughout its lifecycle. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. [Volatile organic compounds (VOCs) emitted from furniture and electrical appliances].

    PubMed

    Tanaka-Kagawa, Toshiko; Jinno, Hideto; Furukawa, Yoko; Nishimura, Tetsuji

    2010-01-01

    Organic chemicals are widely used as ingredients in household products. Therefore, furniture and other household products, as well as building products, may influence indoor air quality. This study was performed to quantitatively estimate the influence of household products on indoor air quality. Volatile organic compound (VOC) emissions were investigated for 10 products, including furniture (chest, desk, dining table, sofa, cupboard) and electrical appliances (refrigerator, electric heater, desktop personal computer, liquid crystal display television and audio), by the large chamber test method (JIS A 1912) under the standard conditions of 28 degrees C, 50% relative humidity and 0.5 times/h ventilation. The emission rate of total VOC (TVOC) was highest for the sofa, at over 7900 microg toluene-equivalent/unit/h. Relatively high TVOC emissions were also observed from the desk and chest. Based on the emission rates, the impacts on indoor TVOC were estimated by a simple model with a room volume of 17.4 m3 and a ventilation frequency of 0.5 times/h. The estimated TVOC increment for the sofa was 911 microg/m3, accounting for almost 230% of the provisional target value of 400 microg/m3. The estimated increments of toluene emitted from the cupboard and styrene emitted from the refrigerator were 10% and 16% of the respective guideline values. These results revealed that VOC emissions from household products may significantly influence indoor air quality.
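
    The "simple model" used here is a steady-state single-zone mass balance: the concentration increment equals the emission rate divided by the product of room volume and air change rate. A worked check using the figures quoted in the abstract:

    ```python
    # Steady-state single-zone estimate of the indoor TVOC increment, using the
    # emission rate, room volume, and ventilation rate quoted in the abstract.
    def steady_state_increment(emission_ug_per_h, volume_m3, ach_per_h):
        """Concentration increment (ug/m^3) = emission rate / (volume * air change rate)."""
        return emission_ug_per_h / (volume_m3 * ach_per_h)

    sofa_tvoc = steady_state_increment(7900, 17.4, 0.5)
    print(f"sofa TVOC increment ~ {sofa_tvoc:.0f} ug/m^3 "
          f"({100 * sofa_tvoc / 400:.0f}% of the 400 ug/m^3 provisional target)")
    ```

    With the rounded 7900 microg/unit/h emission rate this gives about 908 microg/m3 (roughly 230% of the target), closely matching the 911 microg/m3 reported; the small difference reflects rounding of the emission rate.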

  1. Demand assessment and price-elasticity estimation of quality-improved primary health care in Palestine: a contribution from the contingent valuation method.

    PubMed

    Mataria, Awad; Luchini, Stéphane; Daoud, Yousef; Moatti, Jean-Paul

    2007-10-01

    This paper proposes a new methodology to assess demand and price-elasticity for health care, based on patients' stated willingness to pay (WTP) values for certain aspects of health care quality improvements. A conceptual analysis of how respondents consider contingent valuation (CV) questions allowed us to specify a probability density function of stated WTP values, and consequently, to model a demand function for quality-improved health care, using a parametric survival approach. The model was empirically estimated using a CV study intended to assess patients' values for improving the quality of primary health care (PHC) services in Palestine. A random sample of 499 individuals was interviewed following medical consultation in four PHC centers. Quality was assessed using a multi-attribute approach; and respondents valued seven specific quality improvements using a decomposed valuation scenario and a payment card elicitation technique. Our results suggest an inelastic demand at low user fees levels, and when the price-increase is accompanied with substantial quality-improvements. Nevertheless, demand becomes more and more elastic if user fees continue to rise. On the other hand, patients' reactions to price-increase turn out to depend on their level of income. Our results can be used to design successful health care financing strategies that include a consideration of patients' preferences and financial capacities. John Wiley & Sons, Ltd.
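
    The demand function derived from stated WTP can be illustrated with a simple nonparametric analogue: at each candidate fee, demand is the share of respondents whose stated WTP is at least that fee, and arc elasticity follows from adjacent points. The paper fits a parametric survival model instead; the WTP values and fee levels below are synthetic.

    ```python
    # Nonparametric sketch of a WTP-based demand curve and arc price elasticity.
    # Stated WTP values are synthetic; the paper uses a parametric survival approach.
    import numpy as np

    rng = np.random.default_rng(7)
    wtp = rng.lognormal(mean=2.0, sigma=0.6, size=499)     # stated WTP for the quality-improved visit

    fees = np.array([2.0, 5.0, 10.0, 15.0, 20.0])
    demand = np.array([(wtp >= f).mean() for f in fees])   # share still willing to attend at each fee

    for (p0, p1), (q0, q1) in zip(zip(fees[:-1], fees[1:]), zip(demand[:-1], demand[1:])):
        arc_e = ((q1 - q0) / ((q0 + q1) / 2)) / ((p1 - p0) / ((p0 + p1) / 2))
        print(f"fee {p0:.0f} -> {p1:.0f}: demand {q0:.2f} -> {q1:.2f}, arc elasticity {arc_e:.2f}")
    ```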

  2. Analysis of quality raw data of second generation sequencers with Quality Assessment Software.

    PubMed

    Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur

    2011-04-18

    Second-generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in genome construction, as it reduces the frequency of incorrect alignments caused by sequencing errors, which, given the small read size, can provoke misassemblies during the construction process. Application of quality filters to sequence data using the Quality Assessment software, along with graph analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
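
    The kind of quality filtering described, choosing a Phred cutoff from the quality-value distribution and estimating the coverage that survives, can be sketched with plain FASTQ parsing. The Q20 mean-quality cutoff, the file name, and the genome size below are illustrative assumptions, not parameters from the paper.

    ```python
    # Sketch: per-read mean Phred quality from a FASTQ file, a quality filter, and the
    # resulting estimated coverage. The cutoff (Q20) and genome size are assumptions.
    def read_fastq(path):
        with open(path) as fh:
            while True:
                header = fh.readline()
                if not header:
                    break
                seq = fh.readline().strip()
                fh.readline()                              # '+' separator line
                qual = fh.readline().strip()
                yield seq, [ord(c) - 33 for c in qual]     # Phred+33 encoding assumed

    def filtered_coverage(path, genome_size, min_mean_q=20):
        kept_bases = 0
        for seq, quals in read_fastq(path):
            if sum(quals) / len(quals) >= min_mean_q:
                kept_bases += len(seq)
        return kept_bases / genome_size

    # Example call (hypothetical file and genome size):
    # print(f"coverage after filtering: {filtered_coverage('reads.fastq', 2.3e6):.1f}x")
    ```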

  3. Accuracy Estimation and Parameter Advising for Protein Multiple Sequence Alignment

    PubMed Central

    DeBlasio, Dan

    2013-01-01

    Abstract We develop a novel and general approach to estimating the accuracy of multiple sequence alignments without knowledge of a reference alignment, and use our approach to address a new task that we call parameter advising: the problem of choosing values for alignment scoring function parameters from a given set of choices to maximize the accuracy of a computed alignment. For protein alignments, we consider twelve independent features that contribute to a quality alignment. An accuracy estimator is learned that is a polynomial function of these features; its coefficients are determined by minimizing its error with respect to true accuracy using mathematical optimization. Compared to prior approaches for estimating accuracy, our new approach (a) introduces novel feature functions that measure nonlocal properties of an alignment yet are fast to evaluate, (b) considers more general classes of estimators beyond linear combinations of features, and (c) develops new regression formulations for learning an estimator from examples; in addition, for parameter advising, we (d) determine the optimal parameter set of a given cardinality, which specifies the best parameter values from which to choose. Our estimator, which we call Facet (for “feature-based accuracy estimator”), yields a parameter advisor that on the hardest benchmarks provides more than a 27% improvement in accuracy over the best default parameter choice, and for parameter advising significantly outperforms the best prior approaches to assessing alignment quality. PMID:23489379
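
    The core idea, learning a feature-based accuracy estimator and then using it to pick the best-scoring parameter choice, can be sketched with a linear least-squares fit over alignment features. The features, training data, and candidate parameter sets below are synthetic stand-ins, not Facet's actual feature functions or regression formulation.

    ```python
    # Sketch of a feature-based accuracy estimator used for parameter advising.
    # Features, training data, and candidate parameter sets are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(8)

    # Training benchmarks: feature vectors for computed alignments plus true accuracies.
    n_train, n_feat = 200, 12
    F = rng.uniform(0, 1, (n_train, n_feat))
    true_acc = np.clip(F @ rng.uniform(0, 0.15, n_feat) + rng.normal(0, 0.03, n_train), 0, 1)

    # Fit a linear estimator whose coefficients minimize squared error against true accuracy.
    X = np.column_stack([F, np.ones(n_train)])
    coef, *_ = np.linalg.lstsq(X, true_acc, rcond=None)

    def estimate_accuracy(features):
        return float(np.append(features, 1.0) @ coef)

    # Parameter advising: align once per candidate parameter set, score each alignment
    # with the estimator, and keep the highest-scoring choice.
    candidate_features = {p: rng.uniform(0, 1, n_feat) for p in ["params_A", "params_B", "params_C"]}
    best = max(candidate_features, key=lambda p: estimate_accuracy(candidate_features[p]))
    print("advised parameter set:", best)
    ```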

  4. In situ aircraft verification of the quality of satellite cloud winds over oceanic regions

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Skillman, W. C.

    1979-01-01

    A five year aircraft experiment to verify the quality of satellite cloud winds over oceans using in situ aircraft inertial navigation system wind measurements is presented. The final results show that satellite measured cumulus cloud motions are very good estimators of the cloud base wind for trade wind and subtropical high regions. The average magnitude of the vector differences between the cloud motion and the cloud base wind is given. For cumulus clouds near frontal regions, the cloud motion agreed best with the mean cloud layer wind. For a very limited sample, cirrus cloud motions also most closely followed the mean wind in the cloud layer.

  5. Sources of Error in Substance Use Prevalence Surveys

    PubMed Central

    Johnson, Timothy P.

    2014-01-01

    Population-based estimates of substance use patterns have been regularly reported now for several decades. Concerns with the quality of the survey methodologies employed to produce those estimates date back almost as far. Those concerns have led to a considerable body of research specifically focused on understanding the nature and consequences of survey-based errors in substance use epidemiology. This paper reviews and summarizes that empirical research by organizing it within a total survey error model framework that considers multiple types of representation and measurement errors. Gaps in our knowledge of error sources in substance use surveys and areas needing future research are also identified. PMID:27437511

  6. Characterization of fiber diameter using image analysis

    NASA Astrophysics Data System (ADS)

    Baheti, S.; Tunak, M.

    2017-10-01

    Due to their high surface area and porosity, the applications of nanofibers have increased in recent years. In the production process, determination of average fiber diameter and fiber orientation is crucial for quality assessment. The objective of the present study was to compare the relative performance of different methods discussed in the literature for estimating fiber diameter. In this work, existing automated fiber diameter analysis methods from the literature were implemented and validated on simulated images of known fiber diameter. Finally, all methods were compared for the reliability and accuracy of their fiber diameter estimates in electrospun nanofiber membranes, based on the obtained means and standard deviations.
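
    One common automated approach (not necessarily one of the packages evaluated here) estimates fiber diameter from a binary fiber mask as twice the distance-transform value sampled along the fiber skeleton. A minimal sketch with SciPy and scikit-image on a synthetic test image:

    ```python
    # Minimal sketch: fiber diameter as 2 x distance-transform value along the skeleton.
    # Assumes a binary mask where fibers are True; the test image below is synthetic.
    import numpy as np
    from scipy.ndimage import distance_transform_edt
    from skimage.morphology import skeletonize

    # Synthetic "fiber": a horizontal band 7 pixels thick.
    mask = np.zeros((50, 100), dtype=bool)
    mask[20:27, :] = True

    distances = distance_transform_edt(mask)        # distance to the nearest background pixel
    skeleton = skeletonize(mask)                    # one-pixel-wide centerline
    diameters_px = 2 * distances[skeleton] - 1      # -1 roughly corrects for measuring to pixel centers

    print(f"mean diameter: {diameters_px.mean():.1f} px "
          f"(multiply by the pixel size to convert to nm)")
    ```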

  7. FPGA-based fused smart-sensor for tool-wear area quantitative estimation in CNC machine inserts.

    PubMed

    Trejo-Hernandez, Miguel; Osornio-Rios, Roque Alfredo; de Jesus Romero-Troncoso, Rene; Rodriguez-Donate, Carlos; Dominguez-Gonzalez, Aurelio; Herrera-Ruiz, Gilberto

    2010-01-01

    Manufacturing processes are of great relevance nowadays, when there is a constant claim for better productivity with high quality at low cost. The contribution of this work is the development of a fused smart-sensor, based on FPGA to improve the online quantitative estimation of flank-wear area in CNC machine inserts from the information provided by two primary sensors: the monitoring current output of a servoamplifier, and a 3-axis accelerometer. Results from experimentation show that the fusion of both parameters makes it possible to obtain three times better accuracy when compared with the accuracy obtained from current and vibration signals, individually used.

  8. Quantification of GABA, glutamate and glutamine in a single measurement at 3 T using GABA‐edited MEGA‐PRESS

    PubMed Central

    Sanaei Nezhad, Faezeh; Anton, Adriana; Michou, Emilia; Jung, JeYoung; Parkes, Laura M.

    2017-01-01

    γ‐Aminobutyric acid (GABA) and glutamate (Glu), major neurotransmitters in the brain, are recycled through glutamine (Gln). All three metabolites can be measured by magnetic resonance spectroscopy in vivo, although GABA measurement at 3 T requires an extra editing acquisition, such as Mescher–Garwood point‐resolved spectroscopy (MEGA‐PRESS). In a GABA‐edited MEGA‐PRESS spectrum, Glu and Gln co‐edit with GABA, providing the possibility to measure all three in one acquisition. In this study, we investigated the reliability of the composite Glu + Gln (Glx) peak estimation and the possibility of Glu and Gln separation in GABA‐edited MEGA‐PRESS spectra. The data acquired in vivo were used to develop a quality assessment framework which identified MEGA‐PRESS spectra in which Glu and Gln could be estimated reliably. Phantoms containing Glu, Gln, GABA and N‐acetylaspartate (NAA) at different concentrations were scanned using GABA‐edited MEGA‐PRESS at 3 T. Fifty‐six sets of spectra in five brain regions were acquired from 36 healthy volunteers. Based on the Glu/Gln ratio, data were classified as either within or outside the physiological range. A peak‐by‐peak quality assessment was performed on all data to investigate whether quality metrics can discriminate between these two classes of spectra. The quality metrics were as follows: the GABA signal‐to‐noise ratio, the NAA linewidth and the Glx Cramer–Rao lower bound (CRLB). The Glu and Gln concentrations were estimated with precision across all phantoms with a linear relationship between the measured and true concentrations: R 1 = 0.95 for Glu and R 1 = 0.91 for Gln. A quality assessment framework was set based on the criteria necessary for a good GABA‐edited MEGA‐PRESS spectrum. Simultaneous criteria of NAA linewidth <8 Hz and Glx CRLB <16% were defined as optimum features for reliable Glu and Gln quantification. Glu and Gln can be reliably quantified from GABA‐edited MEGA‐PRESS acquisitions. However, this reliability should be controlled using the quality assessment methods suggested in this work. PMID:29130590
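
    The quality assessment framework described, accepting a spectrum for Glu and Gln quantification only when the NAA linewidth and the Glx Cramer-Rao lower bound are both within limits, reduces to a simple joint filter. The thresholds below are those quoted in the abstract; the example records are invented.

    ```python
    # Sketch of the joint spectral-quality filter described above (NAA linewidth < 8 Hz
    # and Glx CRLB < 16%). The example records are invented, not study data.
    spectra = [
        {"id": "sub01_occ", "naa_linewidth_hz": 6.5, "glx_crlb_pct": 9.0},
        {"id": "sub02_occ", "naa_linewidth_hz": 8.4, "glx_crlb_pct": 7.5},
        {"id": "sub03_mfc", "naa_linewidth_hz": 7.2, "glx_crlb_pct": 18.0},
    ]

    def passes_qc(s, max_linewidth=8.0, max_crlb=16.0):
        return s["naa_linewidth_hz"] < max_linewidth and s["glx_crlb_pct"] < max_crlb

    reliable = [s["id"] for s in spectra if passes_qc(s)]
    print("spectra accepted for Glu/Gln quantification:", reliable)
    ```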

  9. Information quality-control model

    NASA Technical Reports Server (NTRS)

    Vincent, D. A.

    1971-01-01

    Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.

  10. Prevalence of autosomal dominant polycystic kidney disease in the European Union.

    PubMed

    Willey, Cynthia J; Blais, Jaime D; Hall, Anthony K; Krasa, Holly B; Makin, Andrew J; Czerwiec, Frank S

    2017-08-01

    Autosomal dominant polycystic kidney disease (ADPKD) is a leading cause of end-stage renal disease, but estimates of its prevalence vary by >10-fold. The objective of this study was to examine the public health impact of ADPKD in the European Union (EU) by estimating minimum prevalence (point prevalence of known cases) and screening prevalence (minimum prevalence plus cases expected after population-based screening). A review of the epidemiology literature from January 1980 to February 2015 identified population-based studies that met criteria for methodological quality. These examined large German and British populations, providing direct estimates of minimum prevalence and screening prevalence. In a second approach, patients from the 2012 European Renal Association‒European Dialysis and Transplant Association (ERA-EDTA) Registry and literature-based inflation factors that adjust for disease severity and screening yield were used to estimate prevalence across 19 EU countries (N = 407 million). Population-based studies yielded minimum prevalences of 2.41 and 3.89/10 000, respectively, and corresponding estimates of screening prevalences of 3.3 and 4.6/10 000. A close correspondence existed between estimates in countries where both direct and registry-derived methods were compared, which supports the validity of the registry-based approach. Using the registry-derived method, the minimum prevalence was 3.29/10 000 (95% confidence interval 3.27-3.30), and if ADPKD screening was implemented in all countries, the expected prevalence was 3.96/10 000 (3.94-3.98). ERA-EDTA-based prevalence estimates and application of a uniform definition of prevalence to population-based studies consistently indicate that the ADPKD point prevalence is <5/10 000, the threshold for rare disease in the EU. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA.

  11. Estimating the agricultural fertilizer NH3 emission in China based on the bi-directional CMAQ model and an agro-ecosystem model

    NASA Astrophysics Data System (ADS)

    Wang, S.

    2014-12-01

    Atmospheric ammonia (NH3) plays an important role in fine particle formation. Accurate estimates of ammonia can reduce uncertainties in air quality modeling. China is one of the largest ammonia-emitting countries, with the majority of NH3 emissions coming from agricultural practices such as fertilizer applications and animal operations. The current ammonia emission estimates in China are mainly based on pre-defined emission factors, so there are considerable uncertainties, especially in the temporal and spatial distribution of emissions. For example, fertilizer applications vary in timing and amount by geographical region and crop type. In this study, the NH3 emission from agricultural fertilizer use in China in 2011 was estimated online by an agricultural fertilizer modeling system coupling a regional air-quality model and an agro-ecosystem model, with three main components: 1) the Environmental Policy Integrated Climate (EPIC) model, 2) the mesoscale Weather Research and Forecasting (WRF) meteorology model, and 3) the CMAQ air quality model with bi-directional ammonia fluxes. The EPIC output on daily fertilizer application and soil characteristics serves as input to the CMAQ model. To run the EPIC model, a large amount of local Chinese information was collected and processed. For example, cropland data are computed from MODIS land use data at 500-m resolution and crop categories at the Chinese county level; fertilizer use rates for different fertilizer types, crops, and provinces are obtained from Chinese statistical materials. The system takes into consideration many factors influencing agricultural ammonia emission, including weather and the fertilizer application method, timing, amount, and rate for specific pastures and crops. The simulated fertilizer data are compared with NH3 emissions and fertilizer application data from other sources, and the CMAQ modeling results are discussed and analyzed against field measurements. The estimated agricultural fertilizer NH3 emission in this study is about 3 Tg in 2011. The regions with the highest emission rates are located in the North China Plain, and the monthly peak in ammonia emissions occurs from April to July.

  12. The contingent behavior of charter fishing participants on the Chesapeake Bay: Welfare estimates associated with water quality improvements

    USGS Publications Warehouse

    Poor, P.J.; Breece, M.

    2006-01-01

    Water quality in the Chesapeake Bay has deteriorated over recent years. Historically, fishing has contributed to the region's local economy in terms of commercial and recreational harvests. A contingent behavior model is used to estimate welfare measures for charter fishing participants with regard to a hypothetical improvement in water quality. Using a truncated Poisson count model corrected for endogenous stratification, it was found that charter fishers not only contribute to the local market economy but also place positive non-market value on preserving the Bay's water quality. Using two estimates of travel costs, individual consumer surplus is estimated at $200 and $117 per trip, and the average individual consumer surplus for an improvement in water quality is $75 and $44 for the two models estimated. © 2006 University of Newcastle upon Tyne.
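
    In count-data travel cost models of this kind, per-trip consumer surplus is commonly derived from the travel cost coefficient as CS = -1/beta_tc. The sketch below illustrates that step with hypothetical coefficients (chosen so the outputs land near the per-trip figures quoted above), not the study's estimates.

      # Per-trip consumer surplus from a travel-cost coefficient (hypothetical values).
      # In a (truncated) Poisson trip-demand model, CS per trip is often taken as -1/beta_tc.

      beta_tc_model_1 = -0.005    # hypothetical travel-cost coefficient, model 1
      beta_tc_model_2 = -0.0085   # hypothetical travel-cost coefficient, model 2

      for name, beta in [("model 1", beta_tc_model_1), ("model 2", beta_tc_model_2)]:
          cs_per_trip = -1.0 / beta
          print(f"{name}: consumer surplus ~= ${cs_per_trip:,.0f} per trip")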

  13. A bootstrap method for estimating uncertainty of water quality trends

    USGS Publications Warehouse

    Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura

    2015-01-01

    Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is available in the EGRETci R package.
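
    The WBT itself is implemented in the EGRETci R package. As a language-agnostic illustration of the general idea (resampling to put an uncertainty interval on an estimated trend), the Python sketch below block-bootstraps the residuals of a fitted linear trend in hypothetical annual mean concentrations and reports a percentile interval for the slope. This is a generic sketch, not the WRTDS/WBT algorithm.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical annual mean concentrations (mg/L) over 25 years with a weak downtrend.
      years = np.arange(1990, 2015)
      conc = 2.0 - 0.01 * (years - years[0]) + rng.normal(0, 0.15, size=years.size)

      coef = np.polyfit(years, conc, 1)            # slope and intercept of the fitted trend
      fitted = np.polyval(coef, years)
      resid = conc - fitted

      # Moving-block bootstrap of residuals to respect short-term serial correlation.
      block, n_boot = 5, 2000
      starts = np.arange(years.size - block + 1)
      boot_slopes = np.empty(n_boot)
      for b in range(n_boot):
          idx = np.concatenate([np.arange(s, s + block)
                                for s in rng.choice(starts, size=years.size // block)])
          boot_slopes[b] = np.polyfit(years, fitted + resid[idx], 1)[0]

      lo, hi = np.percentile(boot_slopes, [2.5, 97.5])
      print(f"trend = {coef[0]:+.4f} mg/L per year (95% bootstrap CI {lo:+.4f} to {hi:+.4f})")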

  14. Estimated loads and yields of suspended solids and water-quality constituents in Kentucky streams

    USGS Publications Warehouse

    Crain, Angela S.

    2001-01-01

    Loads and yields of suspended solids, nutrients, major ions, trace elements, organic carbon, fecal coliform, dissolved oxygen, and alkalinity were estimated for 22 streams in 11 major river basins in Kentucky. Mean daily discharge was estimated at ungaged stations or stations with incomplete discharge records using the drainage-area ratio, regression analysis, or a combination of the two techniques. Streamflow was partitioned into total flow and base flow and used to estimate loads and yields of suspended solids and water-quality constituents with the ESTIMATOR and FLUX computer programs. The relative magnitude of constituent transport to streams from ground- and surface-water sources was determined for the 22 stations. Nutrient and suspended-solids yields for drainage basins with relatively homogeneous land use were used to estimate the total-flow and base-flow yields of nutrients and suspended solids for forested, agricultural, and urban land. Yields of nutrients (nitrite plus nitrate, ammonia plus organic nitrogen, and total phosphorus) were generally less than 1 ton per square mile per year ((ton/mi2)/yr) in forested drainage basins and generally less than 2 (ton/mi2)/yr in agricultural drainage basins. The smallest total-flow yield of nitrogen (nitrite plus nitrate) was estimated at Levisa Fork at Paintsville, where 95 percent of the land is forested; this site also had one of the smallest total-flow yields of ammonia plus organic nitrogen. In general, nutrient yields from forested lands were lower than those from urban and agricultural land. Some of the largest estimated total-flow yields of nutrients among agricultural basins were for streams in the Licking River Basin, the North Fork Licking River near Milford and the South Fork Licking River at Cynthiana; agricultural land constitutes greater than 75 percent of the drainage area in these two basins. Possible sources of nutrients discharging into the Licking River are farm and residential fertilizers. Estimated base-flow yields of suspended solids and nutrients at several basins in the larger Green River and Lower Cumberland River Basins were about half of their estimated total-flow yields. The karst terrain in these basins makes the ground water highly susceptible to contamination, especially where a confining unit is thin or absent.
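
    The drainage-area ratio method mentioned above transfers discharge from a gaged to an ungaged site in proportion to drainage area. A minimal sketch with hypothetical areas and flows:

      # Drainage-area ratio estimate of daily discharge at an ungaged site (hypothetical numbers).
      gaged_area_mi2 = 310.0      # drainage area at the index (gaged) station
      ungaged_area_mi2 = 185.0    # drainage area at the ungaged station
      gaged_q_cfs = [820.0, 640.0, 575.0, 1320.0]   # daily mean discharge at the gage

      ratio = ungaged_area_mi2 / gaged_area_mi2
      ungaged_q_cfs = [q * ratio for q in gaged_q_cfs]
      print([round(q, 1) for q in ungaged_q_cfs])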

  15. Preliminary psychometric testing of the Fox Simple Quality-of-Life Scale.

    PubMed

    Fox, Sherry

    2004-06-01

    Although quality of life is widely defined as subjective and multidimensional, with both affective and cognitive components, few instruments capture the important dimensions of the construct, and few are both conceptually congruent and user friendly for the clinical setting. The aim of this study was to develop and test a measure that would be easy to use clinically and capture both cognitive and affective components of quality of life. Initial item sources for the Fox Simple Quality-of-Life Scale (FSQOLS) were literature-based. Thirty items were compiled for content validity assessment by a panel of expert healthcare clinicians from various disciplines, predominantly nursing. Five items were removed as a result of the review because they were negatively worded or redundant. The 25-item scale was mailed to 177 people with lung, colon, and ovarian cancer in various stages; cancer types were selected theoretically, based on similarity in prognosis, degree of symptom burden, and possible meaning and experience. All 145 participants provided complete data on the FSQOLS. Psychometric evaluation of the FSQOLS included item-total correlations, principal components analysis with varimax rotation (revealing two factors that explained 50% of the variance), reliability estimation using coefficient alpha, and item-factor correlations. The FSQOLS exhibited significant convergent validity with four widely used quality-of-life instruments: the Ferrans and Powers Quality of Life Index, the Functional Assessment of Cancer Therapy Scale, the Short-Form-36 Health Survey, and the General Well-Being Scale. Content validity of the scale was explored and supported using qualitative interviews of 14 participants with lung, colon, and ovarian cancer, a subgroup of the sample for the initial instrument testing.
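
    The internal-consistency statistics named above (item-total correlations and coefficient alpha) are standard computations. The sketch below shows them for a hypothetical item-response matrix; it is not the FSQOLS data.

      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical responses: 145 respondents x 25 items on a 1-5 scale.
      items = rng.integers(1, 6, size=(145, 25)).astype(float)

      def cronbach_alpha(x):
          k = x.shape[1]
          item_var = x.var(axis=0, ddof=1).sum()
          total_var = x.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      def corrected_item_total(x):
          # Correlation of each item with the total of the remaining items.
          return np.array([np.corrcoef(x[:, j], np.delete(x, j, axis=1).sum(axis=1))[0, 1]
                           for j in range(x.shape[1])])

      print("alpha =", round(cronbach_alpha(items), 3))
      print("lowest item-total r =", round(corrected_item_total(items).min(), 3))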

  16. Attenuation-based automatic kilovolt (kV)-selection in computed tomography of the chest: effects on radiation exposure and image quality.

    PubMed

    Eller, Achim; Wuest, Wolfgang; Scharf, Michael; Brand, Michael; Achenbach, Stephan; Uder, Michael; Lell, Michael M

    2013-12-01

    To evaluate an automated attenuation-based kV-selection in computed tomography of the chest with respect to radiation dose and image quality, compared to a standard 120 kV protocol. 104 patients were examined using a 128-slice scanner. Fifty examinations (58 ± 15 years, study group) were performed using automated adaptation of tube potential (100-140 kV) based on the attenuation profile of the scout scan; 54 examinations (62 ± 14 years, control group) were performed with a fixed 120 kV. The estimated CT dose index (CTDI) of the software-proposed setting was compared with the 120 kV protocol. After the scan, CTDI volume (CTDIvol) and dose length product (DLP) were recorded. Image quality was assessed by region of interest (ROI) measurements, and subjective image quality by two observers on a 4-point scale (3 = excellent, 0 = not diagnostic). The algorithm selected 100 kV in 78% and 120 kV in 22% of cases. Overall CTDIvol reduction was 26.6% (34% in the 100 kV subgroup) and overall DLP reduction was 22.8% (32.1% in the 100 kV subgroup) (all p < 0.001). Subjective image quality was excellent in both groups. The attenuation-based kV-selection algorithm enables a relevant dose reduction (~27%) in chest CT while keeping image quality parameters at high levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  17. Prediction of HDR quality by combining perceptually transformed display measurements with machine learning

    NASA Astrophysics Data System (ADS)

    Choudhury, Anustup; Farrell, Suzanne; Atkins, Robin; Daly, Scott

    2017-09-01

    We present an approach to predict overall HDR display quality as a function of key HDR display parameters. We first performed subjective experiments on a high quality HDR display that explored five key HDR display parameters: maximum luminance, minimum luminance, color gamut, bit-depth and local contrast. Subjects rated overall quality for different combinations of these display parameters. We explored two models: a physical model solely based on physically measured display characteristics and a perceptual model that transforms physical parameters using human vision system models. For the perceptual model, we use a family of metrics based on a recently published color volume model (ICT-CP), which consists of the PQ luminance non-linearity (ST2084) and LMS-based opponent color, as well as an estimate of the display point spread function. To predict overall visual quality, we apply linear regression and machine learning techniques such as Multilayer Perceptron, RBF and SVM networks. We use RMSE and Pearson/Spearman correlation coefficients to quantify performance. We found that the perceptual model is better at predicting subjective quality than the physical model and that SVM is better at prediction than linear regression. The significance and contribution of each display parameter was investigated. In addition, we found that combined parameters such as contrast do not improve prediction. Traditional perceptual models were also evaluated and we found that models based on the PQ non-linearity performed better.
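
    As an illustration of the regression step described above (predicting mean quality ratings from display parameters and reporting RMSE and rank correlations), the sketch below fits a scikit-learn SVR on synthetic data. The feature construction and perceptual transforms of the actual study are not reproduced here; features, targets and hyperparameters are placeholders.

      import numpy as np
      from scipy.stats import pearsonr, spearmanr
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVR

      rng = np.random.default_rng(7)
      # Synthetic "display parameter" features: max/min luminance, gamut, bit depth, contrast.
      X = rng.uniform(size=(200, 5))
      # Synthetic subjective quality scores with a mild nonlinearity plus noise.
      y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 2] + rng.normal(0, 0.2, 200)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
      model.fit(X_tr, y_tr)
      pred = model.predict(X_te)

      rmse = float(np.sqrt(np.mean((pred - y_te) ** 2)))
      print(f"RMSE={rmse:.3f}, Pearson={pearsonr(pred, y_te)[0]:.3f}, "
            f"Spearman={spearmanr(pred, y_te)[0]:.3f}")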

  18. Motion compensation for cone-beam CT using Fourier consistency conditions

    NASA Astrophysics Data System (ADS)

    Berger, M.; Xia, Y.; Aichinger, W.; Mentl, K.; Unberath, M.; Aichert, A.; Riess, C.; Hornegger, J.; Fahrig, R.; Maier, A.

    2017-09-01

    In cone-beam CT, involuntary patient motion and inaccurate or irreproducible scanner motion substantially degrade image quality. To avoid artifacts, this motion needs to be estimated and compensated during image reconstruction. In previous work we showed that Fourier consistency conditions (FCC) can be used in fan-beam CT to estimate motion in the sinogram domain. This work extends FCC to 3D cone-beam CT. We derive an efficient cost function to compensate for 3D motion using 2D detector translations. The extended FCC method has been tested with five translational motion patterns, using a challenging numerical phantom. We evaluated the root-mean-square error and the structural similarity index between motion-corrected and motion-free reconstructions. Additionally, we computed the mean absolute difference (MAD) between the estimated and the ground-truth motion. The practical applicability of the method is demonstrated by application to respiratory motion estimation in rotational angiography and to motion correction for weight-bearing imaging of knees, where the latter makes use of a specifically modified FCC version that is robust to axial truncation. The results show a great reduction of motion artifacts. Accurate estimation results were achieved, with maximum MAD values of 708 μm and 1184 μm for motion along the vertical and horizontal detector directions, respectively. The image quality of reconstructions obtained with the proposed method is close to that of motion-corrected reconstructions based on the ground-truth motion. Simulations using noise-free and noisy data demonstrate that FCC are robust to noise. Even high-frequency motion was accurately estimated, leading to a considerable reduction of streaking artifacts. The method is purely image-based and therefore independent of any auxiliary data.
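
    The evaluation metric quoted above (mean absolute difference between estimated and ground-truth detector translations) is simple to compute. The sketch below assumes hypothetical per-projection 2D translation arrays in micrometres; it only illustrates the evaluation step, not the FCC estimation itself.

      import numpy as np

      rng = np.random.default_rng(3)
      n_proj = 496                                   # hypothetical number of projections
      truth_um = np.cumsum(rng.normal(0, 30, size=(n_proj, 2)), axis=0)   # (u, v) drift
      estimate_um = truth_um + rng.normal(0, 400, size=(n_proj, 2))       # estimation error

      mad_u, mad_v = np.mean(np.abs(estimate_um - truth_um), axis=0)
      rmse = np.sqrt(np.mean((estimate_um - truth_um) ** 2))
      print(f"MAD: {mad_u:.0f} um (u), {mad_v:.0f} um (v); overall RMSE: {rmse:.0f} um")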

  19. Economic evaluation in short bowel syndrome (SBS): an algorithm to estimate utility scores for a patient-reported SBS-specific quality of life scale (SBS-QoL™).

    PubMed

    Lloyd, Andrew; Kerr, Cicely; Breheny, Katie; Brazier, John; Ortiz, Aurora; Borg, Emma

    2014-03-01

    Condition-specific preference-based measures can offer utility data where they would not otherwise be available or where generic measures may lack sensitivity, although they lack comparability across conditions. This study aimed to develop an algorithm for estimating utilities from the short bowel syndrome health-related quality of life scale (SBS-QoL™). SBS-QoL™ items were selected based on factor and item performance analysis of a European SBS-QoL™ dataset and consultation with 3 SBS clinical experts. Six-dimension health states were developed using 8 SBS-QoL™ items (2 dimensions combined 2 SBS-QoL™ items). SBS health states were valued by a UK general population sample (N = 250) using the lead-time time trade-off method. Preference weights or 'utility decrements' for each severity level of each dimension were estimated by regression models and used to develop the scoring algorithm. Mean utilities for the SBS health states ranged from -0.46 (worst health state, very much affected on all dimensions) to 0.92 (best health state, not at all affected on all dimensions). The random effects model with maximum likelihood estimation regression had the best predictive ability and lowest root mean squared error and mean absolute error, and was used to develop the scoring algorithm. The preference-weighted scoring algorithm for the SBS-QoL™ developed is able to estimate a wide range of utility values from patient-level SBS-QoL™ data. This allows estimation of SBS HRQL impact for the purpose of economic evaluation of SBS treatment benefits.
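
    A preference-weighted scoring algorithm of the kind described typically subtracts a level-specific decrement for each dimension from full health (utility 1.0). The sketch below illustrates that additive structure with hypothetical dimension names and decrements, not the published SBS-QoL™ tariff.

      # Additive utility scoring from dimension-level decrements (hypothetical tariff).
      # Levels per dimension: 1 = not at all affected ... 4 = very much affected.
      DECREMENTS = {
          "diarrhoea":   {1: 0.00, 2: 0.05, 3: 0.12, 4: 0.20},
          "fatigue":     {1: 0.00, 2: 0.06, 3: 0.14, 4: 0.24},
          "sleep":       {1: 0.00, 2: 0.03, 3: 0.08, 4: 0.15},
          "social life": {1: 0.00, 2: 0.04, 3: 0.10, 4: 0.18},
          "mood":        {1: 0.00, 2: 0.05, 3: 0.11, 4: 0.19},
          "daily tasks": {1: 0.00, 2: 0.06, 3: 0.13, 4: 0.22},
      }

      def utility(state):
          """Utility = 1 minus the sum of decrements for the reported levels."""
          return 1.0 - sum(DECREMENTS[dim][level] for dim, level in state.items())

      best = {dim: 1 for dim in DECREMENTS}
      worst = {dim: 4 for dim in DECREMENTS}
      print(utility(best), round(utility(worst), 2))   # 1.0 and a value well below full health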

  20. Automated estimation of river bathymetry using change detection based on Landsat imagery and river morphological models

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Jagers, B.; Van De Giesen, N.; Baart, F.; van Dam, A.

    2015-12-01

    Free data sets on river bathymetry at the global scale are not yet available. While one of the most widely used free elevation datasets, SRTM, provides data on the location and elevation of rivers, its quality is usually very limited. This is mainly because the water mask was derived from older satellite imagery, such as Landsat 5, and because radar instruments perform poorly near water, especially where vegetation is present in the riparian zone. Additional corrections are required before it can be used for applications such as higher-resolution surface water flow simulations. On the other hand, medium-resolution satellite imagery from the Landsat mission can be used to estimate water mask changes during the last 40 years. Water masks can be derived from Landsat imagery on a per-image basis, in some cases resulting in up to one thousand water masks. For rivers where significant water mask changes can be observed, this information can be used to improve the quality of existing digital elevation models in the range between minimum and maximum observed water levels. Furthermore, we can use this information to estimate river bathymetry using morphological models. We will evaluate how Landsat imagery can be used to estimate river bathymetry and will point to cases of significant inconsistencies between SRTM and Landsat-based water masks. We will also explore other challenges on the way to automated estimation of river bathymetry using fusion of numerical morphological models and remote sensing data, including automatic generation of the model mesh, estimation of river morphodynamic properties, and issues related to the spectral methods used to analyse optical satellite imagery.
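
    One common per-image water-mask approach for Landsat (one option among several, not necessarily the method used in this work) is to threshold a normalized difference water index computed from the green and near-infrared bands and then accumulate an occurrence frequency across the image stack. A minimal sketch with synthetic reflectance data:

      import numpy as np

      def water_mask_ndwi(green, nir, threshold=0.0):
          """Boolean water mask from NDWI = (green - nir) / (green + nir)."""
          ndwi = (green - nir) / np.maximum(green + nir, 1e-6)
          return ndwi > threshold

      # Hypothetical stack of co-registered Landsat scenes: (n_images, rows, cols) reflectances.
      rng = np.random.default_rng(11)
      green = rng.uniform(0.02, 0.3, size=(50, 200, 200))
      nir = rng.uniform(0.02, 0.4, size=(50, 200, 200))

      masks = np.stack([water_mask_ndwi(g, n) for g, n in zip(green, nir)])
      occurrence = masks.mean(axis=0)            # fraction of scenes in which a pixel is water
      changing = (occurrence > 0.05) & (occurrence < 0.95)   # pixels useful for bathymetry
      print(f"{changing.mean():.1%} of pixels show intermittent inundation")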

  1. [Water environmental capacity calculation model for the rivers in drinking water source conservation area].

    PubMed

    Chen, Ding-jiang; Lü, Jun; Shen, Ye-na; Jin, Shu-quan; Shi, Yi-ming

    2008-09-01

    Based on the one-dimensional model for water environmental capacity (WEC) in rivers, a new model for WEC estimation in a river-reservoir system was developed for a drinking water source conservation area (DWSCA). The new model introduces the concept that the water quality target for the rivers in the DWSCA is determined by the water quality requirements of the reservoir serving as the drinking water source. This implies that the WEC of the reservoir can be used as the water quality control target at the downstream end of the rivers in the DWSCA, thereby avoiding problems in WEC estimation caused by differing standards for the same water quality target in rivers and reservoirs, such as the different criteria for total phosphorus (TP) and total nitrogen (TN) in the National Surface Water Quality Standard of China (GB 3838-2002) and the different design hydrological conditions used for WEC estimation in rivers and reservoirs. The new model describes the quantitative relationship between the WEC of the drinking water source and that of the rivers, and it explicitly expresses the continuity and interaction of these connected water bodies. As a case study, the WEC of the rivers in the DWSCA of Laohutan reservoir in southeast China was estimated using the new model. Results indicated that the WEC of the rivers in the DWSCA was 65.05 t/a for TN and 5.05 t/a for TP. According to the WEC of Laohutan reservoir and the current TN and TP loads entering the rivers, about 33.86 t/a of the current TN load should be reduced in the DWSCA, while 2.23 t/a of residual WEC remains for TP in the rivers. The modeling method is also widely applicable to connected water bodies with different water quality targets, especially when the downstream water body has a more stringent water quality control target than the upstream one.
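
    For reference, a common steady-state one-dimensional formulation (a sketch under simplifying assumptions of uniform flow and first-order decay, not necessarily the exact model used in the study) computes the allowable load at the head of a reach such that the concentration decays to the control target at the reach end. All inputs below are hypothetical.

      import math

      def wec_tonnes_per_year(Q_m3s, C0_mgL, Cs_mgL, k_per_day, x_km, u_ms):
          """One-dimensional water environmental capacity with first-order decay.

          Concentration decays as C(x) = C_head * exp(-k * t) with travel time t = x / u;
          requiring the control-section concentration to equal the target Cs gives the
          maximum head concentration, and the capacity is the corresponding extra load.
          """
          travel_days = (x_km * 1000.0 / u_ms) / 86400.0
          c_head_max = Cs_mgL * math.exp(k_per_day * travel_days)
          load_g_per_s = Q_m3s * (c_head_max - C0_mgL)      # mg/L * m3/s = g/s
          return load_g_per_s * 31.536                      # g/s -> t/a

      # Hypothetical reach: 5 m3/s, upstream TN 0.8 mg/L, target 1.0 mg/L,
      # decay 0.08 per day, 12 km long, mean velocity 0.3 m/s.
      print(round(wec_tonnes_per_year(5.0, 0.8, 1.0, 0.08, 12.0, 0.3), 1), "t/a")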

  2. Applying the concept of Independent Applicability to results from the National Aquatic Resource Surveys

    EPA Science Inventory

    The assessments resulting from the National Aquatic Resource Surveys have taken the tack of basing estimates of resource condition on the biological indicators of quality. The physical habitat, chemical, hydrological, and watershed indicators are used to evaluate the relative ra...

  3. Cost-utility analysis of the housing and health intervention for homeless and unstably housed persons living with HIV.

    PubMed

    Holtgrave, David R; Wolitski, Richard J; Pals, Sherri L; Aidala, Angela; Kidder, Daniel P; Vos, David; Royal, Scott; Iruka, Nkemdiri; Briddell, Kate; Stall, Ron; Bendixen, Arturo Valdivia

    2013-06-01

    We present a cost-utility analysis based on data from the Housing and Health (H&H) Study of rental assistance for homeless and unstably housed persons living with HIV in Baltimore, Chicago and Los Angeles. As-treated analyses found favorable associations of housing with HIV viral load, emergency room use, and perceived stress (an outcome that can be quantitatively linked to quality of life). We combined these outcome data with information on intervention costs to estimate the cost per quality-adjusted life year (QALY) saved. We estimate that the cost per QALY saved by the HIV-related housing services is $62,493. These services compare favorably, in terms of cost-effectiveness, to other well-accepted medical and public health services.
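
    The headline figure in a cost-utility analysis is a ratio of incremental cost to incremental QALYs. The sketch below shows that arithmetic with hypothetical inputs (chosen so the output lands near the figure quoted above); it is not the H&H Study's data or model.

      # Cost-utility ratio from hypothetical inputs (not the H&H Study data).
      incremental_cost_per_person = 9500.0      # added programme cost vs. usual care, USD
      incremental_qalys_per_person = 0.152      # QALYs gained per person over the horizon

      cost_per_qaly = incremental_cost_per_person / incremental_qalys_per_person
      print(f"cost per QALY saved ~= ${cost_per_qaly:,.0f}")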

  4. Topogram-based tube current modulation of head computed tomography for optimizing image quality while protecting the eye lens with shielding.

    PubMed

    Lin, Ming-Fang; Chen, Chia-Yuen; Lee, Yuan-Hao; Li, Chia-Wei; Gerweck, Leo E; Wang, Hao; Chan, Wing P

    2018-01-01

    Background: Multiple rounds of head computed tomography (CT) scans increase the risk of radiation-induced lens opacification. Purpose: To investigate the effects of CT eye shielding and topogram-based tube current modulation (TCM) on the radiation dose received by the lens and the image quality of nasal and periorbital imaging. Material and Methods: An anthropomorphic phantom was CT-scanned using either automatic tube current modulation or a fixed tube current. The lens radiation dose was estimated using cropped Gafchromic films irradiated with or without a shield over the orbit. Image quality, assessed using regions of interest drawn on the bilateral extraorbital areas and the nasal bone with a water-based marker, was evaluated using both the signal-to-noise ratio (SNR) and the contrast-to-noise ratio (CNR). Two CT specialists independently assessed image artifacts using a three-point Likert scale. Results: The estimated radiation dose received by the lens was significantly lower when barium sulfate or bismuth-antimony shields were used in conjunction with a fixed tube current (22.0% and 35.6% reduction, respectively). Topogram-based TCM mitigated the beam hardening-associated artifacts of the bismuth-antimony and barium sulfate shields; it increased the SNR by 21.6% in the extraorbital region and the CNR by 7.2% between the nasal bones and extraorbital regions. The combination of topogram-based TCM and barium sulfate or bismuth-antimony shields reduced lens doses by 12.2% and 27.2%, respectively. Conclusion: Image artifacts induced by the bismuth-antimony shield at a fixed tube current for lenticular radioprotection were significantly reduced by topogram-based TCM, which increased the SNR of the anthropomorphic nasal bones and periorbital tissues.
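
    The SNR and CNR figures reported above are derived from ROI means and standard deviations. The sketch below uses one common definition (contrast divided by the reference-region noise) and hypothetical ROI statistics in Hounsfield units.

      # ROI-based image quality metrics (hypothetical HU statistics).
      nasal_bone_mean, nasal_bone_sd = 620.0, 38.0      # mean and SD inside the bone ROI
      extraorbital_mean, extraorbital_sd = 55.0, 12.0   # soft-tissue reference ROI

      snr_extraorbital = extraorbital_mean / extraorbital_sd
      cnr_bone_vs_tissue = (nasal_bone_mean - extraorbital_mean) / extraorbital_sd

      print(f"SNR (extraorbital) = {snr_extraorbital:.1f}")
      print(f"CNR (nasal bone vs. extraorbital) = {cnr_bone_vs_tissue:.1f}")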

  5. Global Top-Down Smoke-Aerosol Emissions Estimation Using Satellite Fire Radiative Power Measurements

    NASA Technical Reports Server (NTRS)

    Ichoku, C.; Ellison, L.

    2014-01-01

    Fire emissions estimates have long been based on bottom-up approaches that are not only complex but also fraught with compounding uncertainties. We present the development of a global gridded (1° × 1°) emission coefficients (Ce) product for smoke total particulate matter (TPM) based on a top-down approach using coincident measurements of fire radiative power (FRP) and aerosol optical thickness (AOT) from the Moderate-resolution Imaging Spectroradiometer (MODIS) sensors aboard the Terra and Aqua satellites. This new Fire Energetics and Emissions Research version 1.0 (FEER.v1) Ce product has now been released to the community and can be obtained from http://feer.gsfc.nasa.gov/, along with the corresponding 1-to-1 mapping of quality assurance (QA) flags that enables the Ce values to be filtered by quality for use in various applications. The regional averages of Ce values for different ecosystem types were found to be in the ranges of 16-21 g MJ-1 for savanna and grasslands, 15-32 g MJ-1 for tropical forest, 9-12 g MJ-1 for North American boreal forest, and 18-26 g MJ-1 for Russian boreal forest, croplands and natural vegetation. The FEER.v1 Ce product was multiplied by time-integrated FRP data to calculate regional smoke TPM emissions, which were compared with equivalent emissions products from three existing inventories. FEER.v1 showed higher and more reasonable smoke TPM estimates than two other emissions inventories that are based on bottom-up approaches and already reported in the literature to be too low, but portrayed an overall reasonable agreement with another top-down approach. This suggests that top-down approaches may hold better promise and need to be further developed to accelerate the reduction of uncertainty associated with fire emissions estimation in air-quality and climate research and applications. Results of the analysis of FEER.v1 data for 2004-2011 show that 65-85 Tg yr-1 of TPM is emitted globally from open biomass burning, with a generally decreasing trend over this short time period. The FEER.v1 Ce product is the first global gridded product in the family of "emission factors" that is based essentially on satellite measurements, and it requires only direct satellite FRP measurements of an actively burning fire anywhere to evaluate its emission rate in near-real time, which is essential for operational activities such as the monitoring and forecasting of smoke emission impacts on air quality.
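
    In the top-down framework described, the smoke emission rate of an actively burning fire is the product of the gridded emission coefficient and the observed fire radiative power. The sketch below shows that step with hypothetical values; the real Ce grid is distributed at the FEER site linked above.

      # Top-down smoke emission rate: TPM rate = Ce (g/MJ) * FRP (MW = MJ/s).
      ce_g_per_mj = 18.5          # hypothetical emission coefficient for this grid cell
      frp_mw = 240.0              # observed fire radiative power for the detected fire

      tpm_rate_g_per_s = ce_g_per_mj * frp_mw          # g/s of total particulate matter
      burn_duration_s = 3 * 3600.0                     # hypothetical 3-hour burn
      tpm_total_kg = tpm_rate_g_per_s * burn_duration_s / 1000.0
      print(f"{tpm_rate_g_per_s:.0f} g/s, ~{tpm_total_kg:.0f} kg TPM over the burn")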

  6. Sediment delivery estimates in water quality models altered by resolution and source of topographic data.

    PubMed

    Beeson, Peter C; Sadeghi, Ali M; Lang, Megan W; Tomer, Mark D; Daughtry, Craig S T

    2014-01-01

    Moderate-resolution (30-m) digital elevation models (DEMs) are normally used to estimate slope for the parameterization of non-point source, process-based water quality models. These models, such as the Soil and Water Assessment Tool (SWAT), use the Universal Soil Loss Equation (USLE) and Modified USLE to estimate sediment loss. The slope length and steepness (LS) factor, a critical parameter in USLE, significantly affects sediment loss estimates. Depending on slope range, a twofold difference in slope estimation can result in as little as a 50% change or as much as a 250% change in the LS factor and subsequent sediment estimation. Recently, the availability of much finer-resolution (∼3 m) DEMs derived from Light Detection and Ranging (LiDAR) data has increased. However, the use of these data may not always be appropriate because slope values derived from fine spatial resolution DEMs are usually significantly higher than slopes derived from coarser DEMs. This increased slope results in considerable variability in modeled sediment output. This paper addresses the implications of parameterizing models using slope values calculated from DEMs with different spatial resolutions (90, 30, 10, and 3 m) and sources. Overall, we observed over a 2.5-fold increase in slope when using a 3-m instead of a 90-m DEM, which increased modeled soil loss using the USLE calculation by 130%. Care should be taken when using LiDAR-derived DEMs to parameterize water quality models because doing so can result in significantly higher slopes, which considerably alter modeled sediment loss. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
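
    To see why slope differences propagate so strongly into sediment estimates, the sketch below evaluates one widely used LS formulation (a Wischmeier-and-Smith-style form; the exact expression used in SWAT's MUSLE differs in detail) for slopes representative of a coarse DEM versus a LiDAR-derived DEM. The slope values are hypothetical.

      import math

      def ls_factor(slope_percent, slope_length_m=22.13):
          """Slope length and steepness (LS) factor, one common USLE-style formulation."""
          theta = math.atan(slope_percent / 100.0)
          s = math.sin(theta)
          m = 0.5 if slope_percent >= 5 else 0.4 if slope_percent >= 3 else 0.3
          return (slope_length_m / 22.13) ** m * (65.41 * s**2 + 4.56 * s + 0.065)

      for label, slope in [("90-m DEM", 3.0), ("3-m LiDAR DEM", 7.5)]:
          print(f"{label}: slope {slope:.1f}% -> LS = {ls_factor(slope):.2f}")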

  7. Cost-effectiveness of a motivational intervention for alcohol-involved youth in a hospital emergency department.

    PubMed

    Neighbors, Charles J; Barnett, Nancy P; Rohsenow, Damaris J; Colby, Suzanne M; Monti, Peter M

    2010-05-01

    Brief interventions in the emergency department targeting risk-taking youth show promise to reduce alcohol-related injury. This study models the cost-effectiveness of a motivational interviewing-based intervention relative to brief advice to stop alcohol-related risk behaviors (standard care). Average cost-effectiveness ratios were compared between conditions. In addition, a cost-utility analysis examined the incremental cost of motivational interviewing per quality-adjusted life year gained. Microcosting methods were used to estimate marginal costs of motivational interviewing and standard care as well as two methods of patient screening: standard emergency-department staff questioning and proactive outreach by counseling staff. Average cost-effectiveness ratios were computed for drinking and driving, injuries, vehicular citations, and negative social consequences. Using estimates of the marginal effect of motivational interviewing in reducing drinking and driving, estimates of traffic fatality risk from drinking-and-driving youth, and national life tables, the societal costs per quality-adjusted life year saved by motivational interviewing relative to standard care were also estimated. Alcohol-attributable traffic fatality risks were estimated using national databases. Intervention costs per participant were $81 for standard care, $170 for motivational interviewing with standard screening, and $173 for motivational interviewing with proactive screening. The cost-effectiveness ratios for motivational interviewing were more favorable than standard care across all study outcomes and better for men than women. The societal cost per quality-adjusted life year of motivational interviewing was $8,795. Sensitivity analyses indicated that results were robust in terms of variability in parameter estimates. This brief intervention represents a good societal investment compared with other commonly adopted medical interventions.

  8. Association between progression-free survival and health-related quality of life in oncology: a systematic review protocol

    PubMed Central

    Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng

    2016-01-01

    Introduction: There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only 1 study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Methods and analysis: Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate the mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around the incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival HR, in the absence of overall survival benefit. Discussion: Identifying the direction and magnitude of the association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology trials. Systematic evidence produced from our study will contribute to improvement of patient care and the practice of evidence-based medicine in oncology. PMID:27591026
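
    The planned numerical integration of health-related quality of life over time is a trapezoidal area-under-the-curve calculation. The sketch below computes the incremental AUC between two hypothetical treatment arms; the assessment times and utility values are illustrative only.

      import numpy as np

      # Hypothetical HRQoL trajectories (utility scores at assessment times, in months).
      t_months = np.array([0, 3, 6, 9, 12])
      hrqol_experimental = np.array([0.70, 0.74, 0.75, 0.72, 0.70])
      hrqol_control = np.array([0.70, 0.71, 0.69, 0.66, 0.64])

      auc_exp = np.trapz(hrqol_experimental, t_months)   # utility-months
      auc_ctl = np.trapz(hrqol_control, t_months)
      incremental_auc = auc_exp - auc_ctl
      print(f"incremental HRQoL AUC = {incremental_auc:.2f} utility-months over 12 months")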

  9. Protocol for the evaluation of a quality-based pay for performance scheme in Liberia.

    PubMed

    Bawo, Luke; Leonard, Kenneth L; Mohammed, Rianna

    2015-01-13

    Improving the quality of care at hospitals is a key next step in rebuilding Liberia's health system. In order to improve the efficiency, effectiveness, and quality of care at the secondary hospital level, the country is developing a system to upgrade health worker skills and competencies and shifting towards improved provider accountability for results, through a Graduate Medical Residency Program (GMRP) and performance-based financing (PBF) for quality improvements at the hospital level. This document outlines the protocol for the impact evaluation of the hospital improvement program. The evaluation will provide an estimate of the impact of the project and investigate the mechanisms of success in a way that can provide general lessons about the quality of health care in low-income countries. The evaluation aims 1) to provide the best possible estimate of program impact and 2) to quantitatively describe the changes that took place within facilities as a result of the program. In particular, the impact evaluation focuses on changes in human resources within the hospitals. As such, we use a three-period intensive evaluation of treated and matched comparison hospitals to see how services change in treated hospitals, as well as a continuous data collection effort to track the activities of individual health workers within treated hospitals. We are particularly interested in understanding how facilities met quality targets. Did they bring in new health workers with higher qualifications? Did they improve the knowledge or competence of their existing staff? Did they improve the availability of medicines and equipment so that the capacities of existing health workers were improved? Did they address the motivation of health workers so that individuals with the same competence and capacity were able to provide higher quality? And, if they did improve quality, did patients notice?

  10. EAPhy: A Flexible Tool for High-throughput Quality Filtering of Exon-alignments and Data Processing for Phylogenetic Methods.

    PubMed

    Blom, Mozes P K

    2015-08-05

    Recently developed molecular methods enable geneticists to target and sequence thousands of orthologous loci and infer evolutionary relationships across the tree of life. Large numbers of genetic markers benefit species tree inference, but visual inspection of alignment quality, as traditionally conducted, is challenging with thousands of loci. Furthermore, because repeated visual inspection with alternative filtering criteria is impractical, the potential consequences of using datasets with different degrees of missing data remain only nominally explored in most empirical phylogenomic studies. In this short communication, I describe a flexible high-throughput pipeline designed to assess alignment quality and filter exonic sequence data for subsequent inference. The stringency criteria for alignment quality and missing data can be adapted based on the expected level of sequence divergence. Each alignment is automatically evaluated against the stringency criteria specified, significantly reducing the number of alignments that require visual inspection. By developing a rapid method for alignment filtering and quality assessment, the consistency of phylogenetic estimation based on exonic sequence alignments can be further explored across distinct inference methods, while accounting for different degrees of missing data.

  11. Measuring Gait Quality in Parkinson’s Disease through Real-Time Gait Phase Recognition

    PubMed Central

    Mileti, Ilaria; Germanotta, Marco; Di Sipio, Enrica; Imbimbo, Isabella; Pacilli, Alessandra; Erra, Carmen; Petracca, Martina; Del Prete, Zaccaria; Bentivoglio, Anna Rita; Padua, Luca

    2018-01-01

    Monitoring gait quality in daily activities through wearable sensors has the potential to improve medical assessment in Parkinson's Disease (PD). In this study, four gait partitioning methods based on the four-phase model, two threshold-based and two based on a machine learning approach, were compared. The methods were tested on 26 PD patients, both in OFF and ON levodopa conditions, and 11 healthy subjects during walking tasks. All subjects were equipped with inertial sensors placed on the feet. Force-resistive sensors were used to assess the reference time sequence of gait phases. The Goodness Index (G) was evaluated to assess accuracy in gait phase estimation. A novel synthetic index called the Gait Phase Quality Index (GPQI) was proposed for gait quality assessment. Results revealed optimum performance (G < 0.25) for three of the tested methods and good performance (0.25 < G < 0.70) for one threshold method. The GPQI was significantly higher in PD patients than in healthy subjects and showed a moderate correlation with clinical scale scores. Furthermore, in patients with severe gait impairment, GPQI was higher in the OFF than in the ON state. Our results unveil the possibility of monitoring gait quality in PD through real-time gait partitioning based on wearable sensors. PMID:29558410

  12. AHP-based spatial analysis of water quality impact assessment due to change in vehicular traffic caused by highway broadening in Sikkim Himalaya

    NASA Astrophysics Data System (ADS)

    Banerjee, Polash; Ghose, Mrinal Kanti; Pradhan, Ratika

    2018-05-01

    Spatial analysis of the water quality impact assessment of highway projects in mountainous areas remains largely unexplored. A methodology is presented here for Spatial Water Quality Impact Assessment (SWQIA) due to highway-broadening-induced vehicular traffic change in the East district of Sikkim. The pollution load of the highway runoff was estimated using an Average Annual Daily Traffic-based empirical model in combination with a mass balance model to predict pollution in the rivers within the study area. Spatial interpolation and overlay analysis were used for impact mapping, and an Analytic Hierarchy Process-based Water Quality Status Index was used to prepare a composite impact map. Model validation criteria, cross-validation criteria, and spatially explicit sensitivity analysis show that the SWQIA model is robust. The study shows that vehicular traffic is a significant contributor to water pollution in the study area. The model caters specifically to the impact analysis of the project concerned and can serve as an aid to a decision support system for the project stakeholders. The applicability of the SWQIA model needs to be explored and validated for a larger set of water quality parameters and project scenarios at a greater spatial scale.
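
    The AHP weighting step referred to above derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks judgement consistency. The sketch below uses a hypothetical 4x4 comparison of water quality parameters, not the weights used in the study.

      import numpy as np

      # Hypothetical pairwise comparison matrix (Saaty scale) for four water quality
      # parameters, e.g. BOD, TSS, heavy metals, oil and grease.
      A = np.array([
          [1.0, 3.0, 5.0, 4.0],
          [1/3, 1.0, 3.0, 2.0],
          [1/5, 1/3, 1.0, 1/2],
          [1/4, 1/2, 2.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)       # consistency index
      ri = 0.90                                  # commonly tabulated random index for n = 4
      print("weights:", np.round(weights, 3), " CR =", round(ci / ri, 3))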

  13. Cost effectiveness of a general practice chronic disease management plan for coronary heart disease in Australia.

    PubMed

    Chew, Derek P; Carter, Robert; Rankin, Bree; Boyden, Andrew; Egan, Helen

    2010-05-01

    The cost effectiveness of a general practice-based program for managing coronary heart disease (CHD) patients in Australia remains uncertain. We have explored this through an economic model. A secondary prevention program based on initial clinical assessment and 3-monthly review, optimisation of pharmacotherapies and lifestyle modification, supported by a disease registry and financial incentives for quality of care and outcomes achieved, was assessed in terms of its incremental cost effectiveness ratio (ICER), in Australian dollars per disability-adjusted life year (DALY) prevented. Based on 2006 estimates, 263 487 DALYs were attributable to CHD in Australia. The proposed program would add $115 650 000 to the annual national health expenditure. Using an estimated 15% reduction in death and disability and a 40% estimated program uptake, the program's ICER is $8081 per DALY prevented. With more conservative estimates of effectiveness and uptake, estimates of up to $38 316 per DALY are observed in sensitivity analysis. Although innovation in CHD management promises improved future patient outcomes, many therapies and strategies proven to reduce morbidity and mortality are available today. A general practice-based program for the optimal application of current therapies is likely to be cost-effective and to provide substantial and sustainable benefits to the Australian community.

  14. Mitigating Provider Uncertainty in Service Provision Contracts

    NASA Astrophysics Data System (ADS)

    Smith, Chris; van Moorsel, Aad

    Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in his ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.

  15. Recommended advanced techniques for waterborne pathogen detection in developing countries.

    PubMed

    Alhamlan, Fatimah S; Al-Qahtani, Ahmed A; Al-Ahdal, Mohammed N

    2015-02-19

    The effect of human activities on water resources has expanded dramatically during the past few decades, leading to the spread of waterborne microbial pathogens. The total global health impact of human infectious diseases associated with pathogenic microorganisms from land-based wastewater pollution was estimated to be approximately three million disability-adjusted life years (DALY), with an estimated economic loss of nearly 12 billion US dollars per year. Although clean water is essential for healthy living, it is not equally granted to all humans. Indeed, people who live in developing countries are challenged every day by an inadequate supply of clean water. Polluted water can lead to health crises that in turn spread waterborne pathogens. Taking measures to assess the water quality can prevent these potential risks. Thus, a pressing need has emerged in developing countries for comprehensive and accurate assessments of water quality. This review presents current and emerging advanced techniques for assessing water quality that can be adopted by authorities in developing countries.

  16. Tracking acid mine-drainage in Southeast Arizona using GIS and sediment delivery models

    USGS Publications Warehouse

    Norman, L.M.; Gray, F.; Guertin, D.P.; Wissler, C.; Bliss, J.D.

    2008-01-01

    This study investigates the application of models traditionally used to estimate erosion and sediment deposition to assess the potential risk of water quality impairment resulting from metal-bearing materials related to mining and mineralization. An integrated watershed analysis using Geographic Information Systems (GIS)-based tools was undertaken to examine erosion and sediment transport characteristics within the watersheds. Estimates of stream deposits of sediment from mine tailings were related to the chemistry of surface water to assess the effectiveness of the methodology for assessing the risk of acid mine-drainage being dispersed downstream of abandoned tailings and waste rock piles. A watershed analysis was performed in the Patagonia Mountains in southeastern Arizona, which have seen substantial mining and where recent water quality sampling has found acidic surface waters. This research demonstrates an improved ability to predict streams that are likely to have severely degraded water quality as a result of past mining activities. © Springer Science+Business Media B.V. 2007.

  17. Evaluation of fraction of absorbed photosynthetically active radiation products for different canopy radiation transfer regimes: methodology and results using Joint Research Center products derived from SeaWiFS against ground-based estimations.

    Treesearch

    Nadine Gobron; Bernard Pinty; Ophélie Aussedat; Jing M. Chen; Warren B. Cohen; Rasmus Fensholt; Valery Gond; Karl Fred Huemmrich; Thomas Lavergne; Frédéric Méline; Jeffrey L. Privette; Inge Sandholt; Malcolm Taberner; David P. Turner; Michael M. Verstraete; Jean-Luc Widlowski

    2006-01-01

    This paper discusses the quality and the accuracy of the Joint Research Center (JRC) fraction of absorbed photosynthetically active radiation (FAPAR) products generated from an analysis of Sea-viewing Wide Field-of-view Sensor (SeaWiFS) data. The FAPAR value acts as an indicator of the presence and state of the vegetation and it can be estimated from remote sensing...

  18. Backward Registration Based Aspect Ratio Similarity (ARS) for Image Retargeting Quality Assessment.

    PubMed

    Zhang, Yabin; Fang, Yuming; Lin, Weisi; Zhang, Xinfeng; Li, Leida

    2016-06-28

    During the past few years, various kinds of content-aware image retargeting operators have been proposed for image resizing. However, the lack of effective objective retargeting quality assessment metrics limits the further development of image retargeting techniques. Different from traditional Image Quality Assessment (IQA) metrics, the quality degradation during image retargeting is caused by artificial retargeting modifications, and the difficulty for Image Retargeting Quality Assessment (IRQA) lies in the alteration of the image resolution and content, which makes it impossible to evaluate the quality degradation directly as in traditional IQA. In this paper, we interpret image retargeting in a unified framework of resampling grid generation and forward resampling. We show that geometric change estimation is an efficient way to clarify the relationship between the images. We formulate the geometric change estimation as a backward registration problem with a Markov Random Field (MRF) and provide an effective solution. The estimated geometric change provides evidence of how the original image is resized into the target image. Under the guidance of the geometric change, we develop a novel Aspect Ratio Similarity (ARS) metric to evaluate the visual quality of retargeted images by exploiting the local block changes with a visual importance pooling strategy. Experimental results on the publicly available MIT RetargetMe and CUHK datasets demonstrate that the proposed ARS can predict the visual quality of retargeted images more accurately than state-of-the-art IRQA metrics.

  19. Estimation of environmental flow incorporating water quality and hypothetical climate change scenarios.

    PubMed

    Walling, Bendangtola; Chaudhary, Shushobhit; Dhanya, C T; Kumar, Arun

    2017-05-01

    Environmental flows (Eflow, hereafter) are the flows to be maintained in a river for its healthy functioning and for the sustenance and protection of aquatic ecosystems. Estimation of Eflow in any river stretch demands consideration of various factors such as flow regime, ecosystem, and river health. However, most Eflow estimation studies have neglected the water quality factor. This study argues for the need to consider a water quality criterion in the estimation of Eflow and proposes a framework for estimating Eflow incorporating water quality variations under present and hypothetical future scenarios of climate change and pollution load. The proposed framework is applied to the polluted stretch of the Yamuna River passing through Delhi, India. The Eflow required at various locations along the stretch is determined by considering possible variations in future water quantity and quality. Eflow values satisfying minimum quality requirements for different river water usage classes (classes A, B, C, and D as specified by the Central Pollution Control Board, India) are found to be between 700 and 800 m3/s. The estimated Eflow values may aid policymakers in deriving upstream storage-release policies or effluent restrictions. The generalized nature of this framework will help its implementation on other river systems.
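
    A simple complete-mix mass balance underlies this kind of calculation: given an effluent discharge and its concentration, the upstream flow required for the blended concentration to stay within a class limit can be solved for directly. The sketch below is a minimal illustration with hypothetical numbers, not the Yamuna model, and ignores decay and dispersion.

      def required_upstream_flow(q_eff_m3s, c_eff_mgL, c_up_mgL, c_limit_mgL):
          """Upstream flow needed so the fully mixed concentration meets the class limit.

          Complete-mix balance: (Q_up*C_up + Q_eff*C_eff) / (Q_up + Q_eff) <= C_limit,
          solved for Q_up (assumes C_up < C_limit < C_eff).
          """
          return q_eff_m3s * (c_eff_mgL - c_limit_mgL) / (c_limit_mgL - c_up_mgL)

      # Hypothetical values: 40 m3/s of effluent at 30 mg/L BOD, upstream BOD 2 mg/L,
      # class limit 3 mg/L.
      q_up = required_upstream_flow(40.0, 30.0, 2.0, 3.0)
      print(f"required upstream (environmental) flow ~= {q_up:.0f} m3/s")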

  20. Dietary Protein Intake in Young Children in Selected Low-Income Countries Is Generally Adequate in Relation to Estimated Requirements for Healthy Children, Except When Complementary Food Intake Is Low.

    PubMed

    Arsenault, Joanne E; Brown, Kenneth H

    2017-05-01

    Background: Previous research indicates that young children in low-income countries (LICs) generally consume greater amounts of protein than published estimates of protein requirements, but this research did not account for protein quality based on the mix of amino acids and the digestibility of ingested protein. Objective: Our objective was to estimate the prevalence of inadequate protein and amino acid intake by young children in LICs, accounting for protein quality. Methods: Seven data sets with information on dietary intake for children (6-35 mo of age) from 6 LICs (Peru, Guatemala, Ecuador, Bangladesh, Uganda, and Zambia) were reanalyzed to estimate protein and amino acid intake and assess adequacy. The protein digestibility-corrected amino acid score of each child's diet was calculated and multiplied by the original (crude) protein intake to obtain an estimate of available protein intake. Distributions of usual intake were obtained to estimate the prevalence of inadequate protein and amino acid intake for each cohort according to Estimated Average Requirements. Results: The prevalence of inadequate protein intake was highest in breastfeeding children aged 6-8 mo: 24% of Bangladeshi and 16% of Peruvian children. With the exception of Bangladesh, the prevalence of inadequate available protein intake decreased by age 9-12 mo and was very low in all sites (0-2%) after 12 mo of age. Inadequate protein intake in children <12 mo of age was due primarily to low energy intake from complementary foods, not inadequate protein density. Conclusions: Overall, most children consumed protein amounts greater than requirements, except for the younger breastfeeding children, who were consuming low amounts of complementary foods. These findings reinforce previous evidence that dietary protein is not generally limiting for children in LICs compared with estimated requirements for healthy children, even after accounting for protein quality. However, unmeasured effects of infection and intestinal dysfunction on the children's protein requirements could modify this conclusion.
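
    The quality adjustment described (multiplying crude protein intake by a protein digestibility-corrected amino acid score) can be sketched as follows, with a hypothetical amino acid profile, reference pattern, and digestibility rather than the study's data.

      # Protein digestibility-corrected amino acid score (PDCAAS) and "available" protein.
      # All numbers are hypothetical illustrations, not values from the study.
      diet_aa_mg_per_g_protein = {        # amino acid content of the diet's protein
          "lysine": 45.0, "threonine": 32.0, "tryptophan": 9.0,
          "sulfur_aa": 22.0, "leucine": 75.0,
      }
      reference_pattern_mg_per_g = {      # requirement pattern for a young child
          "lysine": 57.0, "threonine": 31.0, "tryptophan": 8.5,
          "sulfur_aa": 28.0, "leucine": 66.0,
      }
      fecal_protein_digestibility = 0.85

      aa_scores = {aa: diet_aa_mg_per_g_protein[aa] / reference_pattern_mg_per_g[aa]
                   for aa in reference_pattern_mg_per_g}
      limiting_aa = min(aa_scores, key=aa_scores.get)
      pdcaas = min(aa_scores[limiting_aa] * fecal_protein_digestibility, 1.0)

      crude_protein_g_per_day = 18.0      # hypothetical crude protein intake
      available_protein_g_per_day = crude_protein_g_per_day * pdcaas
      print(limiting_aa, round(pdcaas, 2), round(available_protein_g_per_day, 1), "g/day")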
